WorldWideScience

Sample records for hirnantian events based

  1. The Hirnantian δ13C Positive Excursion in the Nabiullino Section (South Urals)

    Science.gov (United States)

    Yakupov, R. R.; Mavrinskaya, T. M.; Smoleva, I. V.

    2018-02-01

    The upper Sandbian, Katian, and Hirnantian conodont complexes in the Upper Ordovician section on the western slope of the Southern Urals near the village of Nabiullino were studied. A positive δ13C excursion with a maximum of 3.3‰, associated with the global Hirnantian isotopic event (HICE), was recorded for the first time. This excursion marks the beginning of the Hirnantian Stage in the terrigenous-carbonate Upper Ordovician section of the Southern Urals. It coincides with the first occurrence of the Hirnantian conodont Gamachignathus ensifer and conodonts of the shallow-water Aphelognathus-Ozarkodina biofacies, reflecting the global glacio-eustatic event.

  2. Reconstruction of the mid-Hirnantian palaeotopography in the Upper Yangtze region, South China

    Directory of Open Access Journals (Sweden)

    Linna Zhang

    2014-12-01

    Full Text Available Reconstruction of the Hirnantian (Late Ordovician) palaeotopography in South China is important for understanding the distribution pattern of the Hirnantian marine depositional environment. In this study, we reconstructed the Hirnantian palaeotopography of the Upper Yangtze region based on rankings of palaeo-water depths, which were inferred from the lithofacies and biofacies characteristics of the sections. Data from 374 Hirnantian sections were collected and standardized through the online Geobiodiversity Database. The Ordinary Kriging interpolation method in the ArcGIS software was applied to create a continuous surface of palaeo-water depths, i.e. the Hirnantian palaeotopography. In addition, line transect analysis was used to further examine terrain changes along two given directions. The reconstructed palaeotopographic map shows a relatively flat and shallow epicontinental sea with three local depressions and a submarine high in the Upper Yangtze region during the Hirnantian. The water depth is mostly less than 60 m, and the Yangtze Sea gradually deepens towards the north.
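
    The study used the Ordinary Kriging tool in ArcGIS; purely as an illustration of the interpolation idea, the following is a minimal ordinary-kriging sketch with an assumed linear variogram. The section coordinates and depth values are hypothetical, not data from the paper.

```python
import numpy as np

def ordinary_kriging(xy, z, grid_xy, slope=1.0):
    """Minimal ordinary kriging with an assumed linear variogram
    gamma(h) = slope * h. Solves the kriging system (variogram matrix
    plus a Lagrange multiplier row enforcing weights that sum to 1)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = slope * d
    A[n, n] = 0.0
    est = []
    for g in grid_xy:
        h = np.linalg.norm(xy - g, axis=1)
        b = np.append(slope * h, 1.0)
        w = np.linalg.solve(A, b)       # kriging weights + Lagrange multiplier
        est.append(w[:n] @ z)
    return np.array(est)

# Hypothetical section coordinates (km) and ranked palaeo-water depths (m)
pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
depths = np.array([20.0, 40.0, 30.0, 60.0])
print(ordinary_kriging(pts, depths, np.array([[5.0, 5.0]])))  # [37.5]
```

    At the centre of a symmetric arrangement the weights are equal, so the estimate reduces to the mean of the four depths.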

  3. Hirnantian (latest Ordovician) bio- and chemostratigraphy of the Stirnas-18 core, western Latvia

    Directory of Open Access Journals (Sweden)

    Hints, Linda

    2010-03-01

    Full Text Available Integrated study of the uppermost Ordovician Porkuni Stage in the Stirnas-18 core, western Latvia, has revealed one of the most complete Hirnantian successions in the eastern Baltic region. The interval is characterized by two shallowing upwards depositional sequences that correspond to the Kuldiga and Saldus formations. The whole-rock carbon stable isotope curve indicates a long rising segment of the Hirnantian carbon isotope excursion, with the highest peak in the upper part of the Kuldiga Formation. The bioclast carbon and oxygen curves fit well with the whole-rock carbon data. Micro- and macrofossil data enabled seven combined associations to be distinguished within the Hirnantian strata. The early Porkuni fauna of the Spinachitina taugourdeaui Biozone, with pre-Hirnantian affinities, is succeeded by an interval with a Hindella–Cliftonia brachiopod association, a specific polychaete fauna, the chitinozoan Conochitina scabra, and the conodont Noixodontus girardeauensis. The middle part of the Kuldiga Formation is characterized by a low-diversity Dalmanella testudinaria brachiopod association, high diversity of scolecodonts, and the occurrence of the chitinozoan Lagenochitina prussica. From the middle part of the Kuldiga Formation the youngest occurrence yet known of the conodont Amorphognathus ordovicicus is reported. Also typical of the Kuldiga Formation is the occurrence of the trilobite Mucronaspis mucronata. The uppermost Hirnantian Saldus Formation contains no shelly fauna, but yields redeposited conodonts and at least partly indigenous chitinozoans and scolecodonts. Palaeontological criteria and stable isotope data enable correlation of the Stirnas section with other Hirnantian successions in the Baltic region and elsewhere.

  4. Mass concentration of Hirnantian cephalopods from the Siljan District, Sweden; taxonomy, palaeoecology and palaeobiogeographic relationships

    Directory of Open Access Journals (Sweden)

    B. Kröger

    2011-02-01

    Full Text Available The Hirnantian Glisstjärn Formation (Normalograptus persculptus graptolite Biozone) is a succession of limestones and shales onlapping the Katian Boda Limestone in the Siljan District, Sweden. It contains a conspicuous bed, up to several decimeters thick, densely packed with bipolarly oriented, orthoconic cephalopod conchs that can reach lengths of more than 120 cm. Conch fragmentation, bioerosion, and the generally poor preservation of the conchs indicate time averaging, and the conchs are tentatively interpreted as beached and a result of winnowing. Ten nautiloid species were collected from the Glisstjärn Formation, of which five are new: Dawsonoceras gregarium n. sp., Discoceras siljanense n. sp., Isorthoceras dalecarlense n. sp., Retizitteloceras rarum gen. et sp. n., and Transorthoceras osmundsbergense gen. et sp. n. The non-endemic taxa are in most cases known from elsewhere in Baltoscandia, except for species known from Siberia and North America, respectively. Proteocerid orthoceridans dominate the association, of which T. osmundsbergense is the predominant species. Oncocerids are diverse but, together with tarphycerids, very rare. Notable is the lack of many higher taxa that are typical of other Late Ordovician shallow-water depositional settings. Based on its taxonomic composition, the cephalopod mass occurrence is interpreted as an indicator of eutrophication of the water masses in the area. doi:10.1002/mmng.201000014

  5. Redox conditions and marine microbial community changes during the end-Ordovician mass extinction event

    Science.gov (United States)

    Smolarek, Justyna; Marynowski, Leszek; Trela, Wiesław; Kujawski, Piotr; Simoneit, Bernd R. T.

    2017-02-01

    The end-Ordovician (Hirnantian) crisis is the first globally distinct extinction of the Phanerozoic, but its causes are still not fully known. Here, we present an integrated geochemical and petrographic analysis to understand the sedimentary conditions prevailing before, during and after the Late Ordovician ice age. New data from the Zbrza (Holy Cross Mountains) and Gołdap (Baltic Depression) boreholes show that, as in other sections worldwide, the total organic carbon (TOC) content is elevated in the upper Katian and uppermost Hirnantian to Rhuddanian black shales, but depleted (below 1%) during most of the Hirnantian. Euxinic conditions occurred in the photic zone in both TOC-rich intervals, based on the maleimide distribution, the occurrence of aryl isoprenoids and isorenieratane, and a dominance of tiny pyrite framboids. Euxinic conditions were interrupted by the Hirnantian regression caused by glaciation. Sedimentation on the deep shelf became aerobic, probably due to intense thermohaline circulation. Euxinia in the water column recurred during the second pulse of the mass extinction, coinciding with the termination of the end-Ordovician glaciation and sea-level rise at the Ordovician/Silurian (O/S) boundary. In contrast, based on inorganic proxies, we suggest that bottom-water conditions were generally oxic to dysoxic due to upwelling in the Rheic Ocean. The only episode of seafloor anoxia in the Zbrza basin was found at the O/S boundary, where all inorganic indicators showed elevated values typical of anoxia (U/Th > 1.25; V/Cr > 4.25; V/(V + Ni): 0.54-0.82 and Mo > 10-25 ppm). Significant differences in hopane/sterane ratios and in the C27-C29 sterane distribution between the Katian, Rhuddanian and Hirnantian deposits indicate changes in marine microbial communities triggered by sharp climate change and Gondwana glaciation. The increase from biomarkers of cyanobacteria (2α-methylhopanes) after the O
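
    The anoxia thresholds quoted in the abstract (U/Th > 1.25; V/Cr > 4.25; V/(V + Ni) >= 0.54; Mo above roughly 10 ppm as the lower bound of the 10-25 ppm range) can be checked mechanically. A small sketch, with hypothetical element concentrations and variable names of our own choosing:

```python
def anoxia_flags(u, th, v, cr, ni, mo_ppm):
    """Flag anoxia from the inorganic proxy thresholds quoted in the
    abstract. Inputs are element concentrations; 10 ppm is used as the
    lower Mo bound of the quoted 10-25 ppm range."""
    return {
        "U/Th": u / th > 1.25,
        "V/Cr": v / cr > 4.25,
        "V/(V+Ni)": v / (v + ni) >= 0.54,
        "Mo": mo_ppm > 10.0,
    }

# Hypothetical O/S-boundary sample with every proxy above its threshold
flags = anoxia_flags(u=5.0, th=3.0, v=90.0, cr=20.0, ni=30.0, mo_ppm=28.0)
print(all(flags.values()))  # True
```

    In the paper, all four indicators exceed these values only at the O/S boundary in the Zbrza core.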

  6. Sedimentology of Hirnantian glaciomarine deposits in the Balkan Terrane, western Bulgaria: Fixing a piece of the north peri-Gondwana jigsaw puzzle

    Science.gov (United States)

    Chatalov, Athanas

    2017-04-01

    Glaciomarine deposits of late Hirnantian age in the western part of the Palaeozoic Balkan Terrane have persistent thickness (~7 m) and lateral uniformity in rock colour, bedding pattern, lithology, and sedimentary structures. Four lithofacies are distinguished from base to top: lonestone-bearing diamictites, interbedded structureless mudstones, crudely laminated diamictites, and finely laminated mudstones. The diamictites are clast-poor to clast-rich, comprising muddy to sandy varieties. Their compositional maturity is evidenced by the very high amount of detrital quartz compared to the paucity of feldspar and unstable lithic grains. Other textural components include extraclasts derived from the local Ordovician basement, mudstone intraclasts, and sediment aggregates. Turbate structures, grain lineations, and soft-sediment deformation of the matrix below larger grains are locally observed. Sedimentological analysis reveals that deposition occurred in an ice-intermediate to ice-distal, poorly agitated shelf environment by material supplied from meltwater buoyant plumes and rain-out from ice-rafted debris. Remobilization by mass-flow processes (cohesive debris flows and slumps) was an important mechanism, particularly for the formation of massive diamictites. The glaciomarine deposits represent a typical deglaciation sequence reflecting retreat of the ice front (grounded or floating ice sheet), relative sea-level rise, and a gradually reduced sedimentation rate with increasing contribution from suspension fallout. This sequence was deposited on the non-glaciated shelf of the intracratonic North Gondwana platform along the southern margin of the Rheic Ocean. The Hirnantian strata of the Balkan Terrane can be correlated with similar glaciomarine deposits known from peri-Gondwana terranes elsewhere in Europe showing clear 'Armorican affinity'. Several lines of evidence suggest that the provenance of siliciclastic material was associated mainly with sedimentary recycling of

  7. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  8. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  9. Graptolites of the Králův Dvůr Formation (mid Katian to earliest Hirnantian, Czech Republic)

    Czech Academy of Sciences Publication Activity Database

    Kraft, P.; Štorch, Petr; Mitchell, C. E.

    2015-01-01

    Roč. 90, č. 1 (2015), s. 195-225 ISSN 1214-1119 R&D Projects: GA AV ČR IAA301110908 Institutional support: RVO:67985831 Keywords : graptolite * Ordovician * Katian * Hirnantian * Prague Basin * biostratigraphy Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.700, year: 2015

  10. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL's research on this project is to demonstrate the feasibility of a host event-based network monitoring tool and its effects on host performance. Current host-based network monitoring tools rely on polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that uses event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE API within WFP. Any future work should focus on these methods.
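
    A toy sketch of why polling can miss short-lived network activity while event subscription does not. The "connection table", endpoints, and timings below are invented for illustration; a real tool would consume OS event APIs such as those named above.

```python
# Simulated timeline of network events: (time_s, kind, endpoint)
timeline = [
    (0.1, "open",  "10.0.0.5:445"),
    (0.4, "close", "10.0.0.5:445"),   # gone before the poll at t = 1.0
    (1.2, "open",  "10.0.0.9:22"),
]

def poll_snapshots(timeline, interval=1.0, horizon=2.0):
    """Endpoints ever visible at a polling instant (t = 0, 1, 2, ...)."""
    seen = set()
    t = 0.0
    while t <= horizon:
        open_now = set()
        for ts, kind, ep in timeline:
            if ts <= t:
                (open_now.add if kind == "open" else open_now.discard)(ep)
        seen |= open_now
        t += interval
    return seen

def subscribe(timeline):
    """An event subscriber receives every notification, however brief."""
    return {ep for ts, kind, ep in timeline}

print(sorted(poll_snapshots(timeline)))  # ['10.0.0.9:22'] -- misses :445
print(sorted(subscribe(timeline)))       # ['10.0.0.5:445', '10.0.0.9:22']
```

    The short-lived connection opens and closes between two polls, so the polling view never contains it, while the subscriber logs both endpoints.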

  11. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample...... the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine...... problems on accurate air/fuel ratio control of a spark ignition (SI) engine....

  12. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, current activities

  13. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  14. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
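
    The event-condition-action (ECA) pattern described above, where rules state the conditions under which actions are taken in response to events, can be sketched in a few lines. The rule contents and event fields here are illustrative, not from any of the surveyed systems.

```python
# Minimal event-condition-action (ECA) reaction-rule engine.
rules = []

def rule(event_type, condition, action):
    """Register a reaction rule: on event_type, if condition, do action."""
    rules.append((event_type, condition, action))

def dispatch(event):
    """Run every matching rule against an incoming event."""
    fired = []
    for etype, cond, act in rules:
        if event["type"] == etype and cond(event):
            fired.append(act(event))
    return fired

rule("temperature", lambda e: e["value"] > 90, lambda e: f"alert:{e['value']}")
rule("temperature", lambda e: e["value"] <= 90, lambda e: "log-only")

print(dispatch({"type": "temperature", "value": 95}))  # ['alert:95']
```

    Complex event processing systems add to this skeleton the detection of composite events over streams (sequences, windows, negation) before rules fire.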

  15. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

    The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means of providing prompt, clear and consistent information about nuclear events that occur at nuclear facilities, and of facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to the INES with the 'Scale', a consistent safety significance indicator, which runs from level 0, for events with no safety significance, to level 7, for a major accident with widespread health and environmental effects. Since the operation of INES was initiated in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and the characteristics of safety-significant events of level 2 or higher, based on the INES reports. (author)

  16. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
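
    The time-window idea above, where a command must be confirmed by a sensor event within a model-determined window or a fault is flagged, can be sketched directly. The command names, timings, and window value are illustrative assumptions, not part of the DEVS formalism itself.

```python
def check_confirmations(commands, sensor_events, window=2.0):
    """commands: list of (t_issued, cmd); sensor_events: {cmd: t_confirmed}.
    A command is 'ok' only if its confirming event arrives inside
    [t_issued, t_issued + window]; otherwise it is flagged as a fault."""
    status = {}
    for t_cmd, cmd in commands:
        t_ack = sensor_events.get(cmd)
        ok = t_ack is not None and t_cmd <= t_ack <= t_cmd + window
        status[cmd] = "ok" if ok else "fault"
    return status

print(check_confirmations(
    [(0.0, "grip"), (3.0, "lift")],
    {"grip": 1.5, "lift": 6.5},   # 'lift' confirmed outside its window
))  # {'grip': 'ok', 'lift': 'fault'}
```

    In a full DEVS controller the window boundaries themselves come from the model's time-advance function rather than a fixed constant.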

  17. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e. all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves on previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  18. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  19. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  20. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  1. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina Mura

    2013-12-01

    Full Text Available Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  2. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  3. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
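
    The feature-extraction step, removing the fundamental so that only the disturbance remains for the classifier, can be sketched as a least-squares projection onto sin/cos at the line frequency. The sampling rate, 60 Hz fundamental, and synthetic transient below are assumptions for illustration, not parameters from the paper.

```python
import numpy as np

def remove_fundamental(v, fs=15360.0, f0=60.0):
    """Subtract the best-fit fundamental (sin/cos at f0) from a voltage
    record; the residual carries the power-quality event content that a
    classifier such as an SVM would consume."""
    t = np.arange(len(v)) / fs
    basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                             np.cos(2 * np.pi * f0 * t)])
    coef, *_ = np.linalg.lstsq(basis, v, rcond=None)
    return v - basis @ coef

# Synthetic signal: clean 60 Hz sine plus a short 0.3 p.u. transient
fs, f0 = 15360.0, 60.0
t = np.arange(1024) / fs
clean = np.sin(2 * np.pi * f0 * t)
transient = np.where((t > 0.02) & (t < 0.025), 0.3, 0.0)
residual = remove_fundamental(clean + transient, fs, f0)
print(residual.max() > 0.25)   # True: transient survives, fundamental is gone
```

    The residual is near zero outside the disturbance window, so features computed from it (energy, peak, duration) characterize the event rather than the carrier.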

  4. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry by analysing existing practices and identifying and removing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes are based on a survey of the event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from the analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to reduce the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. In addition, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include a focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrading the safety culture of an organization's personnel, and especially of its senior management, appears to be an effective route to improvement. Increasing the competencies, capabilities and independence of event investigation teams, developing comprehensive software, and ensuring a positive approach, adequate support and impartiality from management could also help improve the quality of event investigations. (orig.)

  5. Human based roots of failures in nuclear events investigations

    International Nuclear Information System (INIS)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag

    2012-01-01

    This paper aims to improve the quality of event investigations in the nuclear industry by analysing existing practices and identifying and removing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes are based on a survey of the event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from the analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to reduce the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. In addition, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include a focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrading the safety culture of an organization's personnel, and especially of its senior management, appears to be an effective route to improvement. Increasing the competencies, capabilities and independence of event investigation teams, developing comprehensive software, and ensuring a positive approach, adequate support and impartiality from management could also help improve the quality of event investigations. (orig.)

  6. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier Lagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  7. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach operates in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses...... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....

  8. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer assisted surgery devices manually. For this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real-time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
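The n-Gram-based follow-up prediction described above can be sketched with a toy bigram model. The event labels and training sequences below are hypothetical stand-ins, not taken from the paper's surgical corpus:

```python
from collections import Counter, defaultdict

def train_bigrams(surgeries):
    """Count bigram transitions over labeled surgical event sequences."""
    counts = defaultdict(Counter)
    for seq in surgeries:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, current):
    """Return the most probable follow-up event and its estimated probability."""
    following = counts.get(current)
    if not following:
        return None, 0.0
    total = sum(following.values())
    event, n = following.most_common(1)[0]
    return event, n / total

# Hypothetical labeled surgeries (event names are illustrative).
surgeries = [
    ["incision", "dissection", "clipping", "cutting", "extraction"],
    ["incision", "dissection", "coagulation", "cutting", "extraction"],
    ["incision", "dissection", "clipping", "cutting", "extraction"],
]
model = train_bigrams(surgeries)
event, p = predict_next(model, "clipping")  # → ("cutting", 1.0)
```

Higher-order n-Grams would condition on longer histories in the same way, at the cost of needing more training sequences.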

  9. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can...... the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events....

  10. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

    Introductions: Medication error is one of many types of errors that could decrease the quality and safety of healthcare. Increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve knowledge and skill of nurses to prevent medication error which is characterized by the decrease of adverse events (AE). Methods: This study consisted of two sta...

  11. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  12. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  13. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor’s ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor’s asynchronous output has an intrinsically low data-rate. In addition to low-bandwidth communications requirements, the low weight, low-power and high-speed make them ideally suitable to meeting the demanding
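As a rough illustration of how such an event stream differs from frames, the sketch below accumulates individual (timestamp, x, y, polarity) events into a signed frame over a time window. The tuple layout is an assumption for illustration, not the format of any specific camera:

```python
import numpy as np

def events_to_frame(events, shape, t0, t1):
    """Integrate polarity events with timestamps in [t0, t1) into a signed frame.

    events: iterable of (t, x, y, polarity) tuples, polarity True = brighter.
    shape:  (height, width) of the sensor array.
    """
    frame = np.zeros(shape, dtype=np.int32)
    for t, x, y, polarity in events:
        if t0 <= t < t1:
            frame[y, x] += 1 if polarity else -1
    return frame

# Hypothetical events: two ON events at (x=1, y=2), one OFF event at (0, 0).
events = [(0.1, 1, 2, True), (0.2, 1, 2, True), (0.3, 0, 0, False), (0.9, 1, 2, True)]
frame = events_to_frame(events, (3, 3), 0.0, 0.5)
```

Note that binning events into frames like this discards exactly the microsecond timing that makes the sensors attractive; it is only a convenient way to visualise the sparse output.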

  14. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  15. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.

  16. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenarios Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
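The recursive enumeration of probabilistic branches that OBEST performs can be illustrated with a minimal sketch. The tree structure and event names below are hypothetical stand-ins for an OBEST object model, chosen only to show how scenario likelihoods fall out of the recursion:

```python
def enumerate_scenarios(node, prob=1.0, path=()):
    """Recursively expand probabilistic branches into (scenario, likelihood) pairs.

    node: (label, branches) where branches is a list of (probability, child_node);
    an empty branch list marks a terminal event.
    """
    label, branches = node
    path = path + (label,)
    if not branches:
        return [(path, prob)]
    out = []
    for p, child in branches:
        out.extend(enumerate_scenarios(child, prob * p, path))
    return out

# Hypothetical fire-risk scenario tree (probabilities are illustrative).
tree = ("fire_starts", [
    (0.9, ("alarm_triggers", [
        (0.95, ("suppression_works", [])),
        (0.05, ("suppression_fails", [])),
    ])),
    (0.1, ("alarm_fails", [])),
])
scenarios = enumerate_scenarios(tree)
# Three scenarios; the rare one, suppression_fails, carries likelihood 0.9 * 0.05 = 0.045.
```

Because every path is enumerated analytically, even very low-probability scenarios appear with exact likelihoods, which is the advantage the abstract claims over Monte Carlo sampling.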

  17. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  18. Rocchio-based relevance feedback in video event retrieval

    NARCIS (Netherlands)

    Pingen, G.L.J.; de Boer, M.H.T.; Aly, Robin; Amsaleg, Laurent; Guðmundsson, Gylfi Þór; Gurrin, Cathal; Jónsson, Björn Þór; Satoh, Shin’ichi

    This paper investigates methods for user and pseudo relevance feedback in video event retrieval. Existing feedback methods achieve strong performance but adjust the ranking based on few individual examples. We propose a relevance feedback algorithm (ARF) derived from the Rocchio method, which is a
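The classic Rocchio update that such feedback methods build on can be sketched as follows. The weights alpha, beta and gamma are the common textbook defaults, not the tuned values of the ARF algorithm:

```python
import numpy as np

def rocchio_update(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Move the query vector toward relevant examples and away from non-relevant ones."""
    q = alpha * np.asarray(query, dtype=float)
    if len(relevant):
        q += beta * np.mean(np.asarray(relevant, dtype=float), axis=0)
    if len(nonrelevant):
        q -= gamma * np.mean(np.asarray(nonrelevant, dtype=float), axis=0)
    return q

# Hypothetical 2-d feature vectors for an event query and judged videos.
q_new = rocchio_update([1.0, 0.0],
                       relevant=[[0.0, 2.0], [0.0, 4.0]],
                       nonrelevant=[[2.0, 0.0]])
```

In video event retrieval the vectors would be high-dimensional concept or feature scores, and the relevant/non-relevant sets would come from user clicks or pseudo-relevance assumptions on the top-ranked results.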

  19. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  20. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  1. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  2. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
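The hypothesis can be illustrated with a minimal sketch comparing historical VaR with and without returns from known news-event days. The sample returns, the event-day indices, and the quantile-based estimator are all illustrative assumptions, not the paper's method:

```python
import numpy as np

def historical_var(returns, alpha=0.95):
    """Historical VaR: the loss level not exceeded with confidence alpha."""
    return -np.quantile(np.asarray(returns, dtype=float), 1.0 - alpha)

def event_filtered_var(returns, event_days, alpha=0.95):
    """Recompute VaR with returns on known news-event days removed."""
    keep = [r for i, r in enumerate(returns) if i not in event_days]
    return historical_var(keep, alpha)

# Hypothetical daily returns; day 0 is a news-event outlier.
returns = [-0.10, 0.01, -0.01, 0.02, -0.02, 0.005, -0.005, 0.0, 0.015, -0.015]
var_all = historical_var(returns)
var_filtered = event_filtered_var(returns, {0})
# Removing the event-day outlier lowers the estimated VaR.
```

Whether such filtering improves VaR accuracy is exactly the empirical question the abstract raises; the sketch only shows the mechanical sensitivity of the estimate to event-day outliers.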

  3. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus, «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV, was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  4. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Full Text Available Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM. Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN, then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  5. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  6. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

    Full Text Available This paper investigates asymptotic stabilization for linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, which has some advantages over traditional ones with continuous feedback. Considering the effect of time-varying transmission delays, the criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.
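The event-driven feedback idea can be sketched with a scalar plant and a fixed triggering threshold. The plant, the gain, and the threshold value are illustrative assumptions; they do not reproduce the paper's networked setup with time-varying transmission delays:

```python
def simulate_event_triggered(x0, k, threshold, steps, dt=0.01):
    """Euler simulation of dx = (a*x + u) dt with u = -k * x_last.

    The control input is held between events; a new update (event) fires
    only when the state has drifted more than `threshold` from the value
    used at the last event, so feedback effort is spent sparingly.
    """
    a = 1.0  # illustrative unstable open-loop pole; a - k < 0 stabilizes
    x, x_last, events = x0, x0, 0
    for _ in range(steps):
        if abs(x - x_last) > threshold:
            x_last = x       # transmit the state: one triggering event
            events += 1
        u = -k * x_last      # control held constant between events
        x += (a * x + u) * dt
    return x, events

final_x, n_events = simulate_event_triggered(x0=1.0, k=3.0, threshold=0.05, steps=1000)
# The state settles into a small neighborhood of zero while the number of
# transmissions stays far below the number of simulation steps.
```

The point of the triggering condition is visible in the event count: compared with feedback at every step, only a small fraction of state transmissions is needed to keep the state bounded near the origin.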

  7. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms by considering a given data set of patients. The effectiveness and robustness of the method is verified in simulation by implementing a Monte Carlo method to address the intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors in interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge agreement greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements with an established criterion), intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. 
A duration of 5 s for an interval appears to be

  9. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  10. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.

  11. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and then result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like the “demarcative points” of safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft getting into the icing region

  12. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to the social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents, and exert different behavior or attitude may lead to different contribution to the specific event. Therefore, classifying the large amount of uncategorized social circles generated in Weibo media automatically from the perspective of events has been a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experiments results show that our method identifies the user categories accurately.

  13. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to the social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents, and exert different behavior or attitude may lead to different contribution to the specific event. Therefore, classifying the large amount of uncategorized social circles generated in Weibo media automatically from the perspective of events has been a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experiments results show that our method identifies the user categories accurately.

  14. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    Full Text Available A new paradigm in the field of access control systems with fuzzy authorization is proposed. Suppose there is a set of objects in a single data transmission network. The goal is to develop a dynamic authorization protocol based on the correctness of each participant's account of events (news that occurred earlier in the network. We propose a mathematical method that compactly stores the history of events, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers, and each vector is multiplied by several stochastic vectors. It is known that if the event vectors are sparse, they can be restored with high accuracy by solving the corresponding optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors. It has also been established that the components with the largest absolute values are restored first. An access control system with the proposed dynamic authorization method makes it possible to compute fuzzy confidence coefficients in networks with a frequently changing set of participants, mesh networks and multi-agent systems.

  15. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data at expected rates of 5 to 20 billion events per year. We have found that this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...
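
As a toy illustration of the event-picking service such an index enables, the sketch below builds an in-memory SQLite table keyed by run and event number. The schema, column names, and values are invented stand-ins, not the actual ATLAS EventIndex layout:

```python
import sqlite3

# In-memory stand-in for the Oracle RDBMS: one row of key quantities per event.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE event_index (
    run_number   INTEGER,
    event_number INTEGER,
    lumi_block   INTEGER,
    guid         TEXT,      -- pointer to the file holding the event
    trigger_mask TEXT,
    PRIMARY KEY (run_number, event_number))""")

events = [
    (300655, 1001, 17, "F00-1", "EF_mu24"),
    (300655, 1002, 17, "F00-1", "EF_e28"),
    (300800, 2001, 3,  "F00-7", "EF_mu24"),
]
con.executemany("INSERT INTO event_index VALUES (?,?,?,?,?)", events)

# Event picking: locate the file (GUID) containing a given event
guid, = con.execute(
    "SELECT guid FROM event_index WHERE run_number=? AND event_number=?",
    (300800, 2001)).fetchone()
print(guid)   # F00-7
```

The composite primary key mirrors the basic lookup pattern (run number plus event number) that event-wise services rely on.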

  16. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented, based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed which maintains single-cell resolution and reports the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated processing of the bright-field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided mean intermitotic times of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line with that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population, and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
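
A nonhomogeneous Poisson process with an exponentially increasing rate, as described above, can be simulated by thinning. The parameters below are illustrative, not the fitted values from the record:

```python
import math
import random

random.seed(1)
# Rate of mitotic events grows exponentially: lambda(t) = a * exp(b * t)
a, b, T = 0.05, 0.12, 48.0          # illustrative per-hour parameters, 48 h window
lam_max = a * math.exp(b * T)       # upper bound on the rate over [0, T]

def simulate_nhpp(a, b, T, lam_max):
    """Lewis-Shedler thinning: propose events at the constant rate lam_max,
    accept each proposal with probability lambda(t) / lam_max."""
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)
        if t > T:
            return events
        if random.random() < (a * math.exp(b * t)) / lam_max:
            events.append(t)

events = simulate_nhpp(a, b, T, lam_max)
print(len(events))
```

Because the rate increases with time, the simulated interevent gaps shrink on average toward the end of the window, which is the signature the record's interevent-time analysis detects.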

  17. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  18. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground-truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat events in this dataset. A training dataset for each of the eleven beat categories is developed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented in terms of precision, recall, and F-measure, and compared with an existing technique, recording significant improvements.
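
The final grouping step, in which adjacent shots carrying the same label form one beat event, can be sketched in a few lines. The label names below are invented, not the dataset's eleven beat categories:

```python
# Group adjacent shots with the same predicted label into beat events.
def shots_to_beats(shot_labels):
    beats = []
    for i, label in enumerate(shot_labels):
        if beats and beats[-1][0] == label:
            beats[-1][2] = i            # extend the current beat's last shot
        else:
            beats.append([label, i, i]) # new beat: [label, first_shot, last_shot]
    return [tuple(b) for b in beats]

labels = ["chase", "chase", "fight", "fight", "fight", "dialog"]
print(shots_to_beats(labels))
# [('chase', 0, 1), ('fight', 2, 4), ('dialog', 5, 5)]
```

Each tuple gives a beat's label and its first and last shot indices, which is enough to localize the beat event within the movie's shot sequence.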

  19. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important for improving security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the thief following the victim, or by interaction with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup in which 10 actors perform these actions. The method is also applied to all tracks generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enriching the selection of videos to be observed.
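
The rule-based event classification step can be sketched as a small decision function over per-track features. The thresholds and feature names below are illustrative assumptions, not the rules used in the paper:

```python
# Minimal rule-based classifier over track features (speeds in m/s and the
# displacement ratio threshold are invented for illustration).
def classify_track(mean_speed, displacement, duration):
    if mean_speed < 0.2:
        return "stop"
    if mean_speed > 3.0:
        return "run"
    # Moving but with low net displacement suggests loitering
    if displacement / duration < 0.3:
        return "loiter"
    return "walk"

print(classify_track(mean_speed=1.4, displacement=28.0, duration=20.0))  # walk
print(classify_track(mean_speed=1.0, displacement=3.0, duration=30.0))   # loiter
```

Interactions such as meet or split would compare features across pairs of tracks (e.g., distance between trajectories over time) in the same rule-based style.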

  20. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources, including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, and Electric Power Research Institute Fire Loss Data, and then collated into one database developed in the personal computer database management system dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire, such as: location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date, and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events; a keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned, including the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are
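
The Boolean keyword search over event descriptions can be sketched in a few lines. The records below are invented examples, not entries from FIREDATA:

```python
# Sketch of FIREDATA-style keyword search with Boolean 'and'/'or' logic.
records = [
    {"id": 1, "desc": "cable tray fire detected by smoke detector, suppressed by CO2"},
    {"id": 2, "desc": "oil fire in turbine building, manual suppression"},
    {"id": 3, "desc": "electrical cabinet fire, automatic sprinkler actuation"},
]

def search(records, keywords, mode="and"):
    """Return ids of records whose description matches all ('and') or any ('or') keywords."""
    combine = all if mode == "and" else any
    return [r["id"] for r in records
            if combine(k.lower() in r["desc"].lower() for k in keywords)]

print(search(records, ["fire", "suppress"], mode="and"))   # [1, 2]
print(search(records, ["turbine", "sprinkler"], mode="or"))  # [2, 3]
```

Substring matching also catches inflected forms ("suppressed", "suppression"), which is roughly how simple keyword searches behave.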

  1. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed at the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA, and serial links, to make the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event representation and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA.

  2. Short-Period Surface Wave Based Seismic Event Relocation

    Science.gov (United States)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

    Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground-truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations and challenges our ability to estimate locations at all. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise relative event relocations. We extend the teleseismic surface wave relocation method and apply it at near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and the EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.

  3. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, so it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management: RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location-oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
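
A temporal, location-oriented view of RFID data can be sketched as interval records and queries over them. The data, interval convention, and helper functions below are illustrative assumptions, not the paper's data model:

```python
# Temporal RFID observations: (tag, location, t_start, t_end), intervals [t_start, t_end).
stays = [
    ("tag42", "dock",       0, 10),
    ("tag42", "warehouse", 10, 55),
    ("tag42", "truck",     55, 60),
    ("tag07", "warehouse",  5, 30),
]

def location_at(stays, tag, t):
    """Where was this tag at time t?"""
    for tg, loc, t0, t1 in stays:
        if tg == tag and t0 <= t < t1:
            return loc
    return None

def tags_in(stays, location, t):
    """Which tags were at this location at time t?"""
    return sorted({tg for tg, loc, t0, t1 in stays
                   if loc == location and t0 <= t < t1})

print(location_at(stays, "tag42", 20))   # warehouse
print(tags_in(stays, "warehouse", 20))   # ['tag07', 'tag42']
```

Tracking ("where has tag42 been?") and monitoring ("what is in the warehouse now?") are both point or range queries over the same interval table, which is why a history-oriented temporal model fits RFID data well.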

  4. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)

  5. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)

  6. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  7. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on advanced control for autonomous microgrids. In order to improve performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated......

  8. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicted states is proposed as the sequential decision model, and a Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
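
A minimal sketch of tabular Q-learning in this spirit is shown below, over an invented two-state congestion model. The states, actions, rewards, and transition dynamics are all assumptions for illustration, not the paper's networked distributed MDP:

```python
import random

random.seed(0)
# Toy model: states 'free'/'congested', actions 'keep'/'switch' (signal plan).
states, actions = ["free", "congested"], ["keep", "switch"]
Q = {(s, a): 0.0 for s in states for a in actions}
alpha, gamma, eps = 0.5, 0.9, 0.1

def step(s, a):
    """Invented dynamics: switching the plan clears congestion;
    keeping it while congested is penalized."""
    if s == "congested":
        return ("free", 1.0) if a == "switch" else ("congested", -1.0)
    return ("free", 0.5) if a == "keep" else ("congested", -0.5)

s = "free"
for _ in range(500):
    # epsilon-greedy action selection
    a = random.choice(actions) if random.random() < eps else \
        max(actions, key=lambda a: Q[(s, a)])
    s2, r = step(s, a)
    # standard Q-learning update
    Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions) - Q[(s, a)])
    s = s2

print(max(actions, key=lambda a: Q[("congested", a)]))
print(max(actions, key=lambda a: Q[("free", a)]))
```

After training, the greedy policy switches plans when congested and keeps them when traffic is free, matching the intuition behind proactive control: act before the unwanted state persists.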

  9. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
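
The event-list mechanism at the heart of discrete event simulation can be sketched with a tiny kernel. The job-shop model below (job names, operation times) is invented for illustration, not a SIMMEK model:

```python
import heapq

class Sim:
    """Tiny discrete-event kernel: a time-ordered event list and a simulation loop."""
    def __init__(self):
        self.clock, self._events, self._n = 0.0, [], 0
    def schedule(self, delay, action):
        heapq.heappush(self._events, (self.clock + delay, self._n, action))
        self._n += 1                    # tie-breaker keeps event order deterministic
    def run(self):
        while self._events:
            self.clock, _, action = heapq.heappop(self._events)
            action()

sim = Sim()
finished = []

def job(name, op_times):
    """A job visits machines in sequence; each operation completion is an event."""
    if not op_times:
        finished.append((sim.clock, name))
    else:
        sim.schedule(op_times[0], lambda: job(name, op_times[1:]))

sim.schedule(0.0, lambda: job("A", [3.0, 2.0]))   # two operations: 3 h then 2 h
sim.schedule(1.0, lambda: job("B", [4.5]))        # one 4.5 h operation, starts at 1 h
sim.run()
print(finished)   # [(5.0, 'A'), (5.5, 'B')]
```

The processes of the entity objects (the jobs) are broken down into scheduled events, exactly the decomposition the abstract describes; a real tool adds resources, queues, and stochastic times on top of this loop.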

  10. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediment entering waterways and harming the biota living therein. However, construction-site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are among the most uncertain types of hydrologic models. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario, and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
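
The MLR step can be sketched as an ordinary least squares fit of per-event soil loss against USLE-style event factors. The factor names, coefficients, and data below are synthetic stand-ins, not the monitored-site datasets:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40
# Synthetic event factors, loosely echoing USLE terms (R-, LS-, C-like)
rainfall_erosivity = rng.uniform(10, 200, n)
slope_factor = rng.uniform(0.5, 3.0, n)
cover_factor = rng.uniform(0.05, 1.0, n)

true_coef = np.array([0.02, 1.5, 4.0])            # invented "true" sensitivities
X = np.column_stack([rainfall_erosivity, slope_factor, cover_factor])
y = X @ true_coef + rng.normal(0, 0.5, n)         # per-event soil loss + noise

# Fit multiple linear regression with an intercept by ordinary least squares
Xd = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(np.round(coef[1:], 2))
```

With enough monitored events, the fitted coefficients recover the underlying sensitivities, which is what makes such event-based regressions usable as design tools.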

  11. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for the estimation of the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiments and software simulations were performed at various pulse durations and spot sizes. A correlation between the single event upset threshold LET and the threshold laser energy under local irradiation was found, and a computer analysis of local laser irradiation of IC structures was developed for SEU threshold LET estimation. Two estimation techniques are suggested. The first is based on the determination of the local laser threshold dose, taking into account the ratio of the sensitive area to the locally irradiated area. The second technique uses the photocurrent peak value instead of this ratio. The agreement between the predicted and experimental results demonstrates the applicability of this approach. (authors)

  12. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM) or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining the intention active in memory; and target checking, engaged to verify the presence of the PM cue in the environment. The present study aims to provide the first evidence of event-related potentials (ERPs) associated with time-based PM, and to examine differences and commonalities in the ERPs related to strategic monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  13. VLSI-based video event triggering for image data compression

    Science.gov (United States)

    Williams, Glenn L.

    1994-02-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.
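
The pre-/post-trigger archiving idea can be sketched with a rolling buffer: frames are discarded until a trigger fires, at which point the buffered frames leading up to the event, the trigger frame, and a few following frames are archived. The frame objects and trigger test are stand-ins for the VLSI detector's logic (single-trigger version for brevity):

```python
from collections import deque

def capture(frames, is_event, pre=3, post=2):
    """Keep `pre` frames before the first trigger and `post` frames after it."""
    ring = deque(maxlen=pre)            # rolling pre-trigger storage
    saved, post_left = [], 0
    for frame in frames:
        if not saved and is_event(frame):
            saved.extend(ring)          # archive the frames leading up to the event
            saved.append(frame)
            post_left = post
        elif post_left > 0:
            saved.append(frame)
            post_left -= 1
        else:
            ring.append(frame)
    return saved

frames = list(range(10))
print(capture(frames, lambda f: f == 5))   # [2, 3, 4, 5, 6, 7]
```

Only six of the ten frames survive, which is the point: the trigger turns a continuous high-rate stream into short archived windows around significant events.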

  14. Event-based proactive interference in rhesus monkeys.

    Science.gov (United States)

    Devkar, Deepna T; Wright, Anthony A

    2016-10-01

    Three rhesus monkeys (Macaca mulatta) were tested in a same/different memory task for proactive interference (PI) from prior trials. PI occurs when a previous sample stimulus appears as a test stimulus on a later trial, does not match the current sample stimulus, and the wrong response "same" is made. Trial-unique pictures (scenes, objects, animals, etc.) were used on most trials, except on trials where the test stimulus matched a potentially interfering sample stimulus from a prior trial (1, 2, 4, 8, or 16 trials prior). Greater interference occurred when fewer trials separated interference and test, and the PI functions showed a continuum of interference. Delays between sample and test stimuli and intertrial intervals were manipulated to test how PI might vary as a function of elapsed time. Contrary to a similar study with pigeons, these time manipulations had no discernible effect on the monkeys' PI, as shown by complete overlap of the PI functions with no statistical differences or interactions. These results suggest that interference was based strictly on the number of intervening events (trials with other pictures) without regard to elapsed time. The monkeys' apparent event-based interference was further supported by retesting with a novel set of 1,024 pictures. PI from novel pictures 1 or 2 trials prior was greater than from familiar pictures (a familiar set of 1,024 pictures). Moreover, when potentially interfering novel stimuli were 16 trials prior, performance accuracy was actually greater than accuracy on baseline trials (no interference), suggesting that remembering stimuli from 16 trials prior was a cue that this stimulus was not the sample stimulus on the current trial - a somewhat surprising conclusion, particularly given monkeys.

  15. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine, having contributed most to the relief of human misery and the exciting increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating people with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on vaccination worldwide, we finally succeeded in the eradication of smallpox in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies have been developed, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  16. Event Completion: Event Based Inferences Distort Memory in a Matter of Seconds

    Science.gov (United States)

    Strickland, Brent; Keil, Frank

    2011-01-01

    We present novel evidence that implicit causal inferences distort memory for events only seconds after viewing. Adults watched videos of someone launching (or throwing) an object. However, the videos omitted the moment of contact (or release). Subjects falsely reported seeing the moment of contact when it was implied by subsequent footage but did…

  17. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow specification, distributed execution and verification of pervasive event…

  18. SPEED : a semantics-based pipeline for economic event detection

    NARCIS (Netherlands)

    Hogenboom, F.P.; Hogenboom, A.C.; Frasincar, F.; Kaymak, U.; Meer, van der O.; Schouten, K.; Vandic, D.; Parsons, J.; Motoshi, S.; Shoval, P.; Woo, C.; Wand, Y.

    2010-01-01

    Nowadays, emerging news on economic events such as acquisitions has a substantial impact on the financial markets. Therefore, it is important to be able to automatically and accurately identify events in news items in a timely manner. For this, one has to be able to process a large amount of

  19. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    textabstractAs today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  20. Logical Discrete Event Systems in a trace theory based setting

    NARCIS (Netherlands)

    Smedinga, R.

    1993-01-01

    Discrete event systems can be modelled using a triple consisting of some alphabet (representing the events that might occur), and two trace sets (sets of possible strings) denoting the possible behaviour and the completed tasks of the system. Using this definition we are able to formulate and solve
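    The triple described above can be sketched directly as a data structure. This is a minimal illustration, not code from the paper; the class and field names are our own, and the validity check (completed tasks must be possible behaviour, traces must use only alphabet symbols) is a natural reading of the definition.

```python
# Sketch of a discrete event system as a triple: alphabet, possible
# behaviour (trace set), and completed tasks (trace set). Traces are
# modelled as strings over single-character event names.
from dataclasses import dataclass


@dataclass(frozen=True)
class DiscreteEventSystem:
    alphabet: frozenset   # events that might occur
    behaviour: frozenset  # possible behaviour of the system
    tasks: frozenset      # completed tasks of the system

    def is_valid(self) -> bool:
        # Every trace must use only alphabet symbols, and every
        # completed task must also be possible behaviour.
        symbols_ok = all(set(t) <= self.alphabet
                         for t in self.behaviour | self.tasks)
        return symbols_ok and self.tasks <= self.behaviour


des = DiscreteEventSystem(
    alphabet=frozenset({"a", "b"}),
    behaviour=frozenset({"", "a", "ab"}),
    tasks=frozenset({"ab"}),
)
print(des.is_valid())  # True
```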

  1. A model-based approach to operational event groups ranking

    Energy Technology Data Exchange (ETDEWEB)

    Simic, Zdenko [European Commission Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport; Maqua, Michael [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2014-04-15

    Operational experience (OE) feedback provides improvements in all industrial activities. Identifying the most important and valuable groups of events within the accumulated experience is important in order to focus detailed investigation of events. The paper describes a new ranking method and compares it with three others. The methods are described and applied to OE events collected from nuclear power plants in France and Germany over twenty years. The results show that the different ranking methods only roughly agree on which event groups are the most important. In the new ranking method, the analytic hierarchy process is applied in order to assure consistent and comprehensive weighting determination for the ranking indexes. The proposed method allows transparent and flexible ranking of event groups and identification of the most important OE for further, more detailed investigation in order to complete the feedback. (orig.)
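    The analytic hierarchy process mentioned above derives index weights from a pairwise comparison matrix. The sketch below is a toy illustration with invented comparison values, not the paper's actual indexes; it uses the common geometric-mean approximation of the principal eigenvector.

```python
# Approximate AHP weights: take the geometric mean of each row of the
# pairwise comparison matrix, then normalize to sum to one.
def ahp_weights(matrix):
    gms = []
    for row in matrix:
        prod = 1.0
        for x in row:
            prod *= x
        gms.append(prod ** (1.0 / len(row)))
    total = sum(gms)
    return [g / total for g in gms]


# Three ranking indexes; index 0 is judged 3x as important as index 1
# and 5x as important as index 2 (illustrative values only).
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # [0.648, 0.23, 0.122]
```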

  2. Prediction problem for target events based on the inter-event waiting time

    Science.gov (United States)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we are focusing on. In the series of earthquakes, as McCann et al. show [W.R. Mc Cann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, usually there are no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound of prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets up a lower bound of efficiency for effective prediction of stock market crashes.
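    The two ingredients of the approach above, the coefficient of variation of the gap distribution and a prediction window opened after each target event, can be sketched in a few lines. This is an illustration of the idea, not the authors' algorithm; the window bounds are invented.

```python
# Coefficient of variation of inter-event gaps, and a simple alarm that
# predicts the next target event within a fixed window after the last one.
import statistics


def coefficient_of_variation(gaps):
    return statistics.pstdev(gaps) / statistics.mean(gaps)


def alarm_active(t, last_event_time, window=(2.0, 5.0)):
    """True while t lies in [last + window[0], last + window[1]]."""
    lo, hi = window
    return last_event_time + lo <= t <= last_event_time + hi


gaps = [3.0, 4.0, 3.5, 5.0, 4.5]
cv = coefficient_of_variation(gaps)
print(round(cv, 3))                            # small cv => regular gaps
print(alarm_active(6.5, last_event_time=3.0))  # True: inside the window
```

A small coefficient of variation (as with seismic gaps) makes such window-based prediction efficient; a large one (as with market crashes) bounds how well any predictor can do.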

  3. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposed dietary management for major sports events based on the "HACCP" management model, according to the characteristics of catering activities at major sports events. Major sports events are not just showcases of high-level competitive sport; they have become comprehensive and complex special events involving social, political, economic, cultural and other factors. Sporting events are expected to reach more diverse goals and objectives of economic, political, cultural, technological and other ...

  4. Autocorrel I: A Neural Network Based Network Event Correlation Approach

    National Research Council Canada - National Science Library

    Japkowicz, Nathalie; Smith, Reuben

    2005-01-01

    .... We use the autoassociator to build prototype software to cluster network alerts generated by a Snort intrusion detection system, and discuss how the results are significant, and how they can be applied to other types of network events.

  5. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  6. Integrated analyzing method for the progress event based on subjects and predicates in events

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Munesawa, Yoshiomi

    2014-01-01

    It is expected that knowledge extracted by analyzing past mistakes can be used to prevent the recurrence of accidents. Currently, the main analytic style is for experts to examine accident cases in depth, while cross-analysis has been limited to extracting the factors common to accident cases. In this study, we propose an integrated analyzing method for progress events to analyze across accidents. Our method integrates many accident cases by connecting the common keywords, called 'Subject' and 'Predicate', that are extracted from each progress event in accident cases or near-miss cases. From the integrated accident data, our method can analyze and visualize partial risk identification, the frequency of factors causing accidents, and the risk assessment. Applying our method to PEC-SAFER accident cases identified 8 hazardous factors that can again arise from tanks, visualized the most frequent factors (damage of the tank, 26%; corrosion, 21%), and visualized the highest risks (damage, 3.3 x 10^-2 [risk rank/year]; destruction, 2.5 x 10^-2 [risk rank/year]). (author)

  7. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  8. Characterising Event-Based DOM Inputs to an Urban Watershed

    Science.gov (United States)

    Croghan, D.; Bradley, C.; Hannah, D. M.; Van Loon, A.; Sadler, J. P.

    2017-12-01

    Dissolved Organic Matter (DOM) composition in urban streams is dominated by terrestrial inputs after rainfall events. Urban streams have particularly strong terrestrial-riverine connections due to direct input from terrestrial drainage systems. Event-driven DOM inputs can have substantial adverse effects on water quality. Despite this, DOM from important catchment sources such as road drains and Combined Sewage Overflows (CSOs) remains poorly characterised within urban watersheds. We studied DOM sources within an urbanised, headwater watershed in Birmingham, UK. Samples from terrestrial sources (roads, roofs and a CSO) were collected manually after the onset of rainfall events of varying magnitude, and again within 24 hrs of the event ending. Terrestrial samples were analysed for fluorescence, absorbance and Dissolved Organic Carbon (DOC) concentration. Fluorescence and absorbance indices were calculated, and Parallel Factor Analysis (PARAFAC) was undertaken to aid sample characterisation. Substantial differences in fluorescence, absorbance, and DOC were observed between source types. PARAFAC-derived components linked to organic pollutants were generally highest within road-derived samples, whilst humic-like components tended to be highest within roof samples. Samples taken from the CSO generally showed low fluorescence, although this likely represents a dilution effect. Variation within source groups was particularly high, and local land use seemed to be the driving factor for road and roof drain DOM character and DOC quantity. Furthermore, high variation in fluorescence, absorbance and DOC was apparent between all sources depending on event type. Drier antecedent conditions in particular were linked to a greater presence of terrestrially-derived components and higher DOC content. Our study indicates that high variations in DOM character occur between source types, and over small spatial scales. Road drains located on main roads appear to contain the poorest

  9. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  10. Web-based online system for recording and examining of events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    Occurrence of events in power plants can result in serious drawbacks in the generation of power, which makes online recording and examining of events highly important. In this paper an online web-based system is introduced which records and examines events in power plants. Throughout the paper, the procedures for design and implementation of this system, its features and the results gained are explained. This system provides a predefined level of online access to all event data for all its users in power plants, dispatching, regional utilities and top-level management. By implementation of an electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online event recording and examining system offers the following advantages: - Online recording of events in power plants. - Examining of events in regional utilities. - Access to event data. - Preparing managerial reports

  11. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-11-13

    that may include users, applications, operating systems, servers, hosts, routers, switches, control planes, and instrumentation planes, many of which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology... configuration errors in environment designs for several cyber range events. The rest of the paper is organized as follows. Section 2 provides an overview of

  12. Fault trees based on past accidents. Factorial analysis of events

    International Nuclear Information System (INIS)

    Vaillant, M.

    1977-01-01

    The fault tree method is already useful in the qualitative step before any reliability calculation. The construction of the tree becomes even simpler when we just want to describe how the events happened. Unlike scenarios, which introduce several possibilities by means of the conjunction OR, here we have only the conjunction AND, which is not written at all. This method is presented by INRS (1) for the study of industrial injuries; it may also be applied to material damages. (orig.) [de

  13. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event- based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  14. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  15. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  16. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger which is based on processors running programs to filter events using the full event information. Computing power on the order of a second on a Pentium II processor is required per event. The architecture design is driven by the cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as an operation system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...
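    The figures quoted in the abstract imply the system's bandwidth and farm-size requirements. The back-of-the-envelope check below uses only the numbers stated above (300 events/s, ~150 kB/event, ~1 Pentium II CPU-second per event); it is our arithmetic, not the design report's.

```python
# Aggregate event-building bandwidth and Level-3 farm size implied by
# the quoted requirements.
EVENT_RATE_HZ = 300      # events per second
EVENT_SIZE_KB = 150      # average assembled event size
CPU_SEC_PER_EVENT = 1.0  # ~1 s of Pentium II time per event

aggregate_kb_per_s = EVENT_RATE_HZ * EVENT_SIZE_KB   # through the ATM switch
n_cpus_needed = EVENT_RATE_HZ * CPU_SEC_PER_EVENT    # to keep up in real time

print(aggregate_kb_per_s)  # 45000 kB/s, i.e. ~45 MB/s sustained
print(n_cpus_needed)       # ~300 processors in the Level-3 farm
```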

  17. Knowledge based query expansion in complex multimedia event detection

    NARCIS (Netherlands)

    Boer, M. de; Schutte, K.; Kraaij, W.

    2016-01-01

    A common approach in content based video information retrieval is to perform automatic shot annotation with semantic labels using pre-trained classifiers. The visual vocabulary of state-of-the-art automatic annotation systems is limited to a few thousand concepts, which creates a semantic gap

  18. Knowledge based query expansion in complex multimedia event detection

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2015-01-01

    A common approach in content based video information retrieval is to perform automatic shot annotation with semantic labels using pre-trained classifiers. The visual vocabulary of state-of-the-art automatic annotation systems is limited to a few thousand concepts, which creates a semantic gap

  19. Tag and Neighbor based Recommender systems for Medical events

    DEFF Research Database (Denmark)

    Bayyapu, Karunakar Reddy; Dolog, Peter

    2010-01-01

    This paper presents an extension of a multifactor recommendation approach based on user tagging with term neighbours. Neighbours of words in tag vectors and documents provide for hitting a larger set of documents, not only those matching the direct tag vectors or the content of the documents. Tag… in the situations where the quality of tags is lower. We discuss the approach on examples from the existing Medworm system to indicate the usefulness of the approach…

  20. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events in Zhejiang Province were analysed, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method, so as to obtain the linear trend of PWV ascents and descents. The increment of PWV over a short time (two to six hours) and the PWV slope over a longer time (a few hours to more than ten hours) during the PWV ascending period are taken as predictive factors with which to forecast a precipitation event. The numerical results show that about 80%-90% of precipitation events, and more than 90% of heavy rain events, can be forecasted two to six hours in advance based on the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. A good result was acquired using the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and now-casting precipitation forecasting.
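    The two predictive factors, a least-squares slope and a PWV increment during an ascending period, can be sketched as follows. The thresholds here are invented for illustration; the paper derives its own from the Zhejiang case studies.

```python
# Fit a line to a PWV window by ordinary least squares and flag a
# possible precipitation event when both the slope and the total
# increment exceed (hypothetical) thresholds.
def lstsq_slope(times, values):
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den


def ascent_alert(times_h, pwv_mm, slope_threshold=1.0, increment_threshold=3.0):
    slope = lstsq_slope(times_h, pwv_mm)       # mm per hour
    increment = pwv_mm[-1] - pwv_mm[0]         # mm over the window
    return slope >= slope_threshold and increment >= increment_threshold


hours = [0, 1, 2, 3, 4]
pwv = [50.0, 51.5, 53.0, 55.0, 56.5]  # mm, rising steadily
print(ascent_alert(hours, pwv))       # True
```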

  1. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes arising from batch operation cannot be predicted by an equilibrium material flow. This study began to build a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As mid- and long-term research, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to support reviews of technical feasibility, safeguards assessment, conceptual facility design, and economic feasibility evaluation. The most fundamental element of such a simulator is the dynamic material flow framework. This study focused on modeling the operation of pyroprocessing to implement a dynamic material flow; as a case study, oxide reduction was investigated in terms of dynamic material flow. DES-based modeling was applied to build the pyroprocessing operation model, and a dynamic material flow, the basic framework for integrated pyroprocessing, was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic behavior, for example an oxide reduction process, was verified in terms of the dynamic material flow. Compared to an equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default oxide reduction scenario, the batch mass balance was verified in comparison with a one-year equilibrium mass balance. This study is still in progress toward a mid- and long-term goal: the development of a multi-purpose pyroprocessing simulator able to support safeguards assessment, economic feasibility, technical evaluation, conceptual design, and licensing of a future pyroprocessing facility
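    The batch-by-batch material balance that distinguishes a discrete-event model from an equilibrium flow can be illustrated with a toy event queue. This is not KAERI's ExtendSim model; all numbers and names are invented, and only a single reduction step is simulated.

```python
# Toy discrete-event sketch: oxide batches enter a single reduction step
# on a fixed cycle; an event queue advances the clock and a ledger gives
# the running (dynamic) material balance.
import heapq


def simulate_batches(n_batches, batch_kg, cycle_h):
    # (start time, batch index, batch mass) events in time order
    events = [(i * cycle_h, i, batch_kg) for i in range(n_batches)]
    heapq.heapify(events)
    ledger = []  # (time batch finished, cumulative kg processed)
    total = 0.0
    while events:
        start, _, mass = heapq.heappop(events)
        total += mass
        ledger.append((start + cycle_h, total))
    return ledger


ledger = simulate_batches(n_batches=3, batch_kg=50.0, cycle_h=8.0)
print(ledger[-1])  # (24.0, 150.0): 150 kg processed after 24 hours
```

Unlike an equilibrium balance, the ledger exposes the state after every batch, which is exactly the per-batch detail the abstract says must be checked.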

  2. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    Science.gov (United States)

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intention. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical in successful performance of time estimation and so in time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environment cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  3. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly in the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences
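    The two-sided power (TSP) distribution mentioned above is commonly parameterized by a lower bound a, mode m, upper bound b, and shape n; its inverse CDF has a closed form, which makes elicited consequence distributions easy to sample. The sketch below uses that standard form; the parameter values are invented, and the EE-USIM itself involves more than this sampling step.

```python
# Closed-form inverse CDF of the two-sided power distribution on [a, b]
# with mode m and shape n (n = 2 gives the triangular distribution).
def tsp_inverse_cdf(u, a, m, b, n):
    split = (m - a) / (b - a)  # CDF value at the mode
    if u <= split:
        return a + (m - a) * (u / split) ** (1.0 / n)
    return b - (b - m) * ((1.0 - u) / (1.0 - split)) ** (1.0 / n)


# Median consequence for an elicited distribution on [0, 10] with mode 2:
print(round(tsp_inverse_cdf(0.5, a=0.0, m=2.0, b=10.0, n=2.0), 3))  # 3.675
```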

  4. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    Science.gov (United States)

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems have some limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, was integrated into the design of our system. Unlike traditional event reporting systems that only collect and store reports, our system automatically annotates and analyzes the reported events and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  5. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience are focused on water reactors; there are no established methods and theories for the thorium-based molten salt reactor (TMSR). With TMSR research and development under way in China, initiating event analysis and evaluation is increasingly important. Such research can be developed from PWR analysis theories and methods. Based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of generation-II and -III PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed by logical analysis, and the analysis of TMSR initiating events is preliminarily studied and described. This research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  6. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.
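    The normalization point above, that the corrected event keeps the factor of its original detector pair while the mashing of the target bin must also be accounted for, can be illustrated with a toy weight function. The exact functional form below (dividing by the LOR count of the target bin) is our assumption for illustration, not a formula quoted from the paper.

```python
# Hypothetical per-event weight during LOR rebinning: normalization from
# the ORIGINAL LOR, scaled by the number of LORs mashed into the target
# sinogram bin so that bins with different mashing stay comparable.
def corrected_event_weight(norm_factor_original_lor, n_lors_in_target_bin):
    return norm_factor_original_lor / n_lors_in_target_bin


# Same original normalization factor, binned into an unmashed bin (1 LOR)
# versus a bin that aggregates 4 LORs:
w_unmashed = corrected_event_weight(1.2, 1)
w_mashed = corrected_event_weight(1.2, 4)
print(w_unmashed, w_mashed)
```

Ignoring the bin's LOR count is what produces the slice-to-slice count variations and stripe artifacts described in the abstract.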

  7. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    The objective of this research is to apply the DBBE model to discover constructs with which to measure a religious event as a business brand, based on devotees' perceptions. The SEM technique was applied to test the hypothesized model, with CFA used to analyze the model, and a theoretical model was built to assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information may be beneficial to event management and sponsors in building brands and operating visitor destinations. More importantly, the brands of these religious events in Pakistan can be built into strong tourism products.

  8. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.

  9. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might extend current knowledge of drug reactions or of the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; however, such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to describe this distance. The sentences with newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction compared with the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely
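The distance-based annotation transfer described above can be caricatured as a nearest-neighbour lookup over hidden-topic vectors. The topic vectors, labels, and cosine measure below are illustrative assumptions, not the paper's actual model:

```python
# Toy annotation transfer: an unlabelled sentence inherits the event label of
# its nearest annotated neighbour in topic space. All vectors/labels invented.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def transfer_annotations(annotated, unannotated):
    """annotated: list of (topic_vector, event_label) pairs;
    unannotated: list of topic vectors. Returns one inherited label each."""
    labels = []
    for u in unannotated:
        best = max(annotated, key=lambda pair: cosine(u, pair[0]))
        labels.append(best[1])
    return labels

annotated = [([0.9, 0.1], "Gene_expression"), ([0.1, 0.9], "Phosphorylation")]
labels = transfer_annotations(annotated, [[0.8, 0.2], [0.2, 0.8]])
```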

  10. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards and the assignment of the farm node destination for each event, and regulates the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...

  11. Central FPGA-based destination and load control in the LHCb MHz event readout

    Science.gov (United States)

    Jacobsson, R.

    2012-10-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz, at a bandwidth usage of up to 70 GB/s, over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards and the assignment of the farm node destination for each event, and to regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  12. Central FPGA-based destination and load control in the LHCb MHz event readout

    International Nuclear Information System (INIS)

    Jacobsson, R.

    2012-01-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz, at a bandwidth usage of up to 70 GB/s, over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards and the assignment of the farm node destination for each event, and to regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.
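The credit-based pull mechanism described in these records can be sketched as follows; the node names, credit counts, and round-robin policy are illustrative assumptions, not the LHCb implementation:

```python
# Minimal sketch of credit-based destination assignment: each farm node
# grants the master a number of event credits, and the master dispatches an
# event only to a node holding credits, so a loaded node that stops sending
# requests is automatically skipped.
from collections import deque

class ReadoutMaster:
    def __init__(self):
        self.credits = {}            # node -> outstanding event credits

    def grant(self, node, n):
        """A node's asynchronous event request: grant n credits."""
        self.credits[node] = self.credits.get(node, 0) + n

    def assign(self, n_events):
        """Assign destinations for n_events, round-robin over credited nodes."""
        destinations = []
        ring = deque(sorted(self.credits))
        while len(destinations) < n_events and ring:
            node = ring.popleft()
            if self.credits[node] > 0:
                self.credits[node] -= 1
                destinations.append(node)
                ring.append(node)    # node stays eligible while credited
        return destinations

master = ReadoutMaster()
master.grant("farm01", 2)
master.grant("farm02", 1)
dests = master.assign(4)             # only 3 credits available -> 3 assignments
```

When the farm saturates, no credits arrive and `assign` returns fewer destinations than requested, which is exactly the throttling behaviour the abstracts describe.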

  13. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
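As a toy illustration of the event-based fault models described above, a local diagnoser can retain only those faults whose qualitative signatures are consistent with the observed sequence of measurement deviations; the fault names and signatures below are invented:

```python
# Toy event-based diagnoser: each fault maps to an ordered sequence of
# qualitative measurement-deviation events, and faults whose signatures are
# inconsistent with the observed event sequence are eliminated.
SIGNATURES = {
    "leak_tank1":  ["p1-", "p2-", "q12-"],
    "block_valve": ["q12-", "p1+", "p2-"],
}

def consistent(observed, signature):
    """True if the observed events match the signature prefix, in order."""
    return observed == signature[:len(observed)]

def diagnose(observed):
    return sorted(f for f, sig in SIGNATURES.items() if consistent(observed, sig))

hyp1 = diagnose(["p1-"])     # only the leak signature starts with p1-
hyp2 = diagnose(["q12-"])    # only the blockage signature starts with q12-
```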

  14. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Science.gov (United States)

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events may continually emit signals whose strength is attenuated in inverse proportion to the distance from the source. In this context, faults occur for various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of the information that nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on event localization by decreasing their trust index, thereby improving the accuracy of event localization and the fault tolerance of multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes and determines the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the locations of all events according to the alarmed status and trust index of the nodes around the cluster nodes; finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms. PMID:22163972

  15. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    Full Text Available This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events may continually emit signals whose strength is attenuated in inverse proportion to the distance from the source. In this context, faults occur for various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of the information that nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on event localization by decreasing their trust index, thereby improving the accuracy of event localization and the fault tolerance of multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes and determines the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the locations of all events according to the alarmed status and trust index of the nodes around the cluster nodes; finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms.
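The trust-index bookkeeping at the heart of the algorithm can be sketched as a simple per-cycle update; the update rule and constants below are assumptions for illustration, not the published TISNAP formulas:

```python
# Toy trust-index update: after each reporting cycle, nodes whose binary
# reports agreed with the fused decision gain trust and disagreeing nodes
# lose it, so persistently faulty nodes are progressively discounted.
def update_trust(trust, reports, decision, step=0.1):
    """trust: node -> index in [0, 1]; reports: node -> reported binary alarm;
    decision: the fused decision for the region in this cycle."""
    new = {}
    for node, t in trust.items():
        if reports[node] == decision:
            new[node] = min(1.0, t + step)   # report was consistent: reward
        else:
            new[node] = max(0.0, t - step)   # report was inconsistent: penalize
    return new

trust = {"n1": 0.5, "n2": 0.5}
trust = update_trust(trust, {"n1": 1, "n2": 0}, decision=1)
```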

  16. Noether's Theorem and its Inverse of Birkhoffian System in Event Space Based on Herglotz Variational Problem

    Science.gov (United States)

    Tian, X.; Zhang, Y.

    2018-03-01

    Herglotz variational principle, in which the functional is defined by a differential equation, generalizes the classical principle, which defines the functional by an integral. The Herglotz principle gives a variational description of nonconservative systems even when the Lagrangian is independent of time. This paper focuses on Noether's theorem and its inverse for a Birkhoffian system in event space, based on the Herglotz variational problem. Firstly, according to the Herglotz variational principle of a Birkhoffian system, the principle of a Birkhoffian system in event space is established. Secondly, its parametric equations and two basic formulae for the variation of the Pfaff-Herglotz action of a Birkhoffian system in event space are obtained. Furthermore, the definition and criteria of Noether symmetry of the Birkhoffian system in event space based on the Herglotz variational problem are given. Then, according to the relationship between Noether symmetry and conserved quantity, Noether's theorem is derived. Under classical conditions, Noether's theorem of a Birkhoffian system in event space based on the Herglotz variational problem reduces to the classical one. In addition, Noether's inverse theorem of the Birkhoffian system in event space based on the Herglotz variational problem is also obtained. At the end of the paper, an example is given to illustrate the application of the results.
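The Herglotz construction referred to above can be stated compactly (the notation is the standard one for the Herglotz problem, not taken from this paper):

```latex
% Herglotz variational problem: the action z is defined by an ODE rather
% than by an integral. One extremizes z(t_1) subject to
\dot{z}(t) = L\bigl(t,\, q(t),\, \dot{q}(t),\, z(t)\bigr), \qquad z(t_0) = z_0 .
% When L is independent of z, this integrates to the classical action,
% z(t_1) = z_0 + \int_{t_0}^{t_1} L \,\mathrm{d}t ,
% so the Herglotz problem contains the integral formulation as a special case.
```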

  17. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, the data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace the development of these events in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis’ visit to the US) in the New York City—Washington, DC regions. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  18. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, node faults and poor deployment environments pose great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of the observed attributes is constructed based on a Bayesian network. In this model, the dependency structure of the observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept, named attribute correlation confidence, to evaluate the fitting degree between a sensor reading and an abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate; it can also improve the accuracy of event detection.
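The attribute-correlation idea can be made concrete with a two-attribute toy network; the structure and conditional probabilities below are invented for illustration:

```python
# Toy attribute-correlation score: given a learned dependency structure and
# conditional probability tables, score a discretized reading by its joint
# probability under the network. High-probability combinations are plausible;
# improbable combinations suggest a faulty node rather than a real event.
CPT = {
    # P(temp)
    "temp":  {"high": 0.2, "normal": 0.8},
    # P(smoke | temp)
    "smoke": {("high", "yes"): 0.7, ("high", "no"): 0.3,
              ("normal", "yes"): 0.05, ("normal", "no"): 0.95},
}

def correlation_confidence(reading):
    """Joint probability of a (temp, smoke) reading under the toy network."""
    p_temp = CPT["temp"][reading["temp"]]
    p_smoke = CPT["smoke"][(reading["temp"], reading["smoke"])]
    return p_temp * p_smoke

fire_like = correlation_confidence({"temp": "high", "smoke": "yes"})    # 0.2 * 0.7
odd_combo = correlation_confidence({"temp": "normal", "smoke": "yes"})  # 0.8 * 0.05
```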

  19. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  20. Evaluation of extreme temperature events in northern Spain based on process control charts

    Science.gov (United States)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
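The attribute control chart for an annual fraction of extreme days can be sketched with plain 3-sigma binomial limits; note that the published method additionally models autocorrelation with a Markov extension, which this sketch omits:

```python
# p-chart sketch: monitor the annual fraction of extreme days against
# 3-sigma binomial limits around the long-run fraction p_bar.
import math

def p_chart_limits(p_bar, n):
    """Centre line and 3-sigma control limits for a fraction based on n days."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    lower = max(0.0, p_bar - 3.0 * sigma)
    upper = min(1.0, p_bar + 3.0 * sigma)
    return lower, p_bar, upper

# Long-run 5% of days extreme, 365 days per year (illustrative numbers)
lo, centre, hi = p_chart_limits(0.05, 365)
out_of_control = 0.10 > hi   # a year with 10% extreme days signals a change
```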

  1. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.
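The inference step can be caricatured as forward chaining over event tuples; the single rule below is a simplified stand-in, not a rule from the paper:

```python
# Toy ontology-based inference: implicit events are deduced from explicit
# ones by forward-chaining domain rules until a fixed point is reached.
# The rule ("transcribes" implies "regulates_expression") is invented.
def apply_rules(events, rules):
    """Repeatedly apply (condition -> consequence) rules until no new
    event tuples are derived."""
    derived = set(events)
    changed = True
    while changed:
        changed = False
        for condition, consequence in rules:
            for ev in list(derived):
                if ev[0] == condition:
                    new = (consequence,) + ev[1:]
                    if new not in derived:
                        derived.add(new)
                        changed = True
    return derived

rules = [("transcribes", "regulates_expression")]
events = {("transcribes", "sigma32", "groEL")}
all_events = apply_rules(events, rules)
```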

  2. Fluence-based and microdosimetric event-based methods for radiation protection in space

    International Nuclear Information System (INIS)

    Curtis, S.B.

    2002-01-01

    The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report No. 137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/linear energy transfer (LET) method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology, but that because of certain drawbacks in the presently used conventional method, these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, such concepts as equivalent dose and organ dose equivalent are discussed, and various problems regarding the measurement and estimation of these quantities are presented. (author)

  3. Event-based motion correction for PET transmission measurements with a rotating point source

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Meikle, Steven R; Fulton, Roger

    2011-01-01

    Accurate attenuation correction is important for quantitative positron emission tomography (PET) studies. When performing transmission measurements using an external rotating radioactive source, object motion during the transmission scan can distort the attenuation correction factors computed as the ratio of the blank to transmission counts, and cause errors and artefacts in reconstructed PET images. In this paper we report a compensation method for rigid body motion during PET transmission measurements, in which list mode transmission data are motion corrected event-by-event, based on known motion, to ensure that all events which traverse the same path through the object are recorded on a common line of response (LOR). As a result, the motion-corrected transmission LOR may record a combination of events originally detected on different LORs. To ensure that the corresponding blank LOR records events from the same combination of contributing LORs, the list mode blank data are spatially transformed event-by-event based on the same motion information. The number of counts recorded on the resulting blank LOR is then equivalent to the number of counts that would have been recorded on the corresponding motion-corrected transmission LOR in the absence of any attenuating object. The proposed method has been verified in phantom studies with both stepwise movements and continuous motion. We found that attenuation maps derived from motion-corrected transmission and blank data agree well with those of the stationary phantom and are significantly better than uncorrected attenuation data.

  4. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features include the average of absolute amplitudes, variance, energy ratio, and polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
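The moving-window features named in the abstract can be sketched as follows; the window length, noise model, and embedded "event" are illustrative, and the energy ratio here is a simple STA/LTA-like stand-in:

```python
# Sketch of moving-window waveform features (average absolute amplitude,
# variance, and energy ratio to the preceding window) that could feed a
# detection network's input layer.
import numpy as np

def window_features(x, win=50):
    """Per-window features along a waveform x; returns an (n_windows, 3) array."""
    n = len(x) // win
    feats = []
    for i in range(n):
        w = x[i * win:(i + 1) * win]
        mean_abs = np.mean(np.abs(w))
        var = np.var(w)
        # energy in this window relative to the preceding window
        prev = x[max(0, (i - 1) * win):i * win]
        ratio = np.sum(w ** 2) / (np.sum(prev ** 2) + 1e-12) if len(prev) else 1.0
        feats.append((mean_abs, var, ratio))
    return np.array(feats)

rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(200)
trace[120:140] += 2.0                  # an embedded "event" in window 2
F = window_features(trace, win=50)
```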

  5. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    OpenAIRE

    Tramblay, Yves; Bouvier, Christophe; Martin, C.; Didon-Lescot, J. F.; Todorovik, D.; Domergue, J. M.

    2010-01-01

    Flash floods are the most destructive natural hazards that occur in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the initial moisture conditions estimation prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture,...

  6. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  7. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t0. In most cases, the resolution in current ML surveys is not accurate enough to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, provided the photometric errors are small enough. We calculate the expected fraction of ML events in which the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as short as t0 ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and the lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)
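The undistorted point-lens light curve that parallax perturbs is the standard Paczynski form; the parameter values below are illustrative:

```python
# Standard (parallax-free) point-source point-lens magnification: the Earth's
# orbit perturbs the separation u(t) and produces the parallax distortion
# discussed above.
import math

def magnification(u):
    """Paczynski magnification A(u) for separation u in Einstein radii."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def u_of_t(t, t0, u0, tE):
    """Rectilinear source trajectory: impact parameter u0 at peak time t0,
    Einstein-radius crossing time tE (all in days)."""
    tau = (t - t0) / tE
    return math.sqrt(u0 * u0 + tau * tau)

# Illustrative event: u0 = 0.3, t0 = 0, tE = 20 days
peak = magnification(u_of_t(0.0, 0.0, 0.3, 20.0))   # brightest at t = t0
```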

  8. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template matching methods, as it does not depend on records from previously detected events and is therefore not limited to events occurring in similar regions and with similar focal mechanisms to those of previously detected events. The method is not specific to any particular way of calculating the synthetic seismograms, so complicated structural models can be used. This is particularly beneficial for intermediate-size events registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
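The stacked-correlation detection function can be sketched with a synthetic template standing in for the strain Green's tensor response; all waveforms and parameters below are illustrative:

```python
# Sketch of a stacked-correlation detection function: correlate a template
# with continuous data at each station, stack the per-station correlations,
# and declare a detection where the stack peaks above the noise level.
import numpy as np

def detection_function(records, template):
    """Stack per-station cross-correlations of the template with the data."""
    stack = None
    for rec in records:
        c = np.correlate(rec, template, mode="valid")
        stack = c if stack is None else stack + c
    return stack

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0.0, 4.0 * np.pi, 40))
records = []
for _ in range(3):                       # three stations, common origin time
    trace = 0.2 * rng.standard_normal(300)
    trace[100:140] += template           # non-impulsive event at sample 100
    records.append(trace)

d = detection_function(records, template)
onset = int(np.argmax(d))                # should recover the event onset
```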

  9. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  10. Supervision in the PC based prototype for the ATLAS event filter

    CERN Document Server

    Bee, C P; Etienne, F; Fede, E; Meessen, C; Nacasch, R; Qian, Z; Touchard, F

    1999-01-01

    A prototype of the ATLAS event filter based on commodity PCs linked by a Fast Ethernet switch has been developed in Marseille. The present contribution focuses on the supervision aspects of the prototype, based on Java and Java mobile agent technology. (5 refs).

  11. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  12. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  13. Event-based scenario manager for multibody dynamics simulation of heavy load lifting operations in shipyards

    Directory of Open Access Journals (Sweden)

    Sol Ha

    2016-01-01

    Full Text Available This paper suggests an event-based scenario manager capable of creating and editing a scenario for shipbuilding process simulation based on multibody dynamics. To configure various situations in shipyards and easily connect with multibody dynamics, the proposed method has two main concepts: an Actor and an Action List. The Actor represents the atomic unit of action in the multibody dynamics and can be connected to a specific component of the dynamics kernel such as the body and joint. The user can make up a scenario by combining the actors. The Action List contains information for arranging and executing the actors. Since the shipbuilding process is a kind of event-based sequence, all simulation models were configured using the Discrete EVent System Specification (DEVS) formalism. The proposed method was applied to simulations of various operations in shipyards such as the lifting and erection of a block and a heavy-load lifting operation using multiple cranes.
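A minimal sketch of the Actor / Action List pairing described above, with invented names and a toy state dictionary; the dynamics-kernel binding and the DEVS machinery are omitted:

```python
class Actor:
    """Atomic unit of action; in the paper it binds to a dynamics-kernel
    component such as a body or joint (that binding is omitted here)."""
    def __init__(self, name, effect):
        self.name = name
        self.effect = effect              # callable applied to the shared state

    def run(self, state):
        self.effect(state)
        state.setdefault("log", []).append(self.name)

class ActionList:
    """Arranges and executes actors: an event-based scenario."""
    def __init__(self):
        self._entries = []

    def add(self, time, actor):
        self._entries.append((time, actor))

    def run(self, state):
        for _, actor in sorted(self._entries, key=lambda e: e[0]):
            actor.run(state)
        return state

# Hypothetical scenario: lift a block, then erect it.
scenario = ActionList()
scenario.add(2, Actor("erect_block", lambda s: s.update(erected=True)))
scenario.add(1, Actor("lift_block", lambda s: s.update(lifted=True)))
result = scenario.run({})
```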

  14. Limits on the efficiency of event-based algorithms for Monte Carlo neutron transport

    Directory of Open Access Journals (Sweden)

    Paul K. Romano

    2017-09-01

    Full Text Available The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, the vector speedup is also limited by differences in the execution time for events being carried out in a single event-iteration.

  15. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs
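The final weighting step — scaling each event's per-bin importance by the risk of that bin — can be written down directly; the bin structure and numbers below are invented for illustration:

```python
def overall_importance(bins):
    """Combine per-bin importance measures of accident events into an
    overall ranking by weighting each with the risk of its bin."""
    totals = {}
    for b in bins:
        for event, imp in b["importance"].items():
            totals[event] = totals.get(event, 0.0) + imp * b["risk"]
    return totals

# Two hypothetical accident bins, e.g. harbor collisions and open-sea groundings.
bins = [
    {"risk": 0.8, "importance": {"fire": 0.2, "collision": 0.7}},
    {"risk": 0.2, "importance": {"fire": 0.5, "grounding": 0.9}},
]
ranking = overall_importance(bins)
```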

  16. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
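The update-only-on-trigger idea can be illustrated with a fixed linear feedback law standing in for the learned HDP controller; the system matrices, gain, and relative threshold below are invented for this sketch:

```python
import numpy as np

def event_triggered_run(steps=200, sigma=0.1):
    """Simulate event-triggered state feedback: the controller's copy of the
    state is refreshed only when the gap between the current state and the
    state held since the last event exceeds a relative threshold."""
    A = np.array([[0.9, 0.1], [0.0, 0.9]])
    B = np.array([[0.0], [1.0]])
    K = np.array([[-0.2, -0.5]])          # assumed stabilizing gain
    x = np.array([[1.0], [0.0]])
    x_held = x.copy()                     # state at the last control update
    updates = 0
    for _ in range(steps):
        if np.linalg.norm(x - x_held) > sigma * np.linalg.norm(x):
            x_held = x.copy()             # event: transmit state, update control
            updates += 1
        x = A @ x + B @ (K @ x_held)
    return updates, float(np.linalg.norm(x))
```

The control input is refreshed only when the gap condition fires, yet the state still decays toward the origin.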

  17. Improving the Critic Learning for Event-Based Nonlinear $H_{\\infty }$ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with the event-based nonlinear H ∞ state feedback control design. First of all, the H ∞ control problem is regarded as a two-player zero-sum game and the adaptive critic mechanism is used to achieve the minimax optimization under event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. The initial stabilizing control is no longer required during the implementation process of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability issue is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis on the lower bound of the minimal intersample time. Finally, the applications to an aircraft dynamics and a robot arm plant are carried out to verify the efficient performance of the present novel design method.

  18. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    Science.gov (United States)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in event-based systems can lower system efficiency in terms of both data exchange rate and performance. In this paper, an integral-based event triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise and effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.
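The noise-robustness of integral-style triggering can be contrasted with instantaneous thresholding in a few lines; the leaky accumulator and the numbers are illustrative, not the paper's construction:

```python
def integral_trigger(errors, threshold, leak=0.95):
    """Accumulate (leaky) squared error and fire an event only when the
    accumulated value crosses the threshold, so an isolated noise spike
    does not cause an event while a sustained error does."""
    acc, events = 0.0, []
    for k, e in enumerate(errors):
        acc = leak * acc + e * e
        if acc > threshold:
            events.append(k)
            acc = 0.0                    # reset after each event
    return events
```

A single spike of amplitude 1.0 never crosses a threshold of 2.0, whereas a sustained error of 0.8 fires periodically.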

  19. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
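Under the constant-event-time assumption, the bank-size effect can be reproduced with a toy lane-occupancy model. The geometric particle lifetimes and the absorption probability are invented for the sketch; the paper's model is analytic, not a simulation:

```python
import numpy as np

def vector_efficiency(bank_size, vector_width, p_absorb=0.1, seed=0):
    """Toy model of the event-based algorithm: every particle in the bank
    needs a geometrically distributed number of events; each iteration
    processes the surviving particles in vector chunks. Efficiency is
    useful lanes divided by total lanes issued."""
    rng = np.random.default_rng(seed)
    lives = rng.geometric(p_absorb, size=bank_size)   # events per particle
    useful = lanes = 0
    for it in range(lives.max()):
        n = int((lives > it).sum())                   # particles still alive
        useful += n
        lanes += vector_width * int(np.ceil(n / vector_width))
    return useful / lanes
```

As the bank shrinks relative to the vector width, partially filled vectors waste more lanes, so efficiency rises with bank size.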

  20. Measurement of the underlying event using track-based event shapes in Z→l⁺l⁻ events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-mass energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire) located near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 fb⁻¹) were recorded in 2011 with the ATLAS experiment. Events in which a Z boson was produced in the hard sub-process and subsequently decayed into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS, except those of the leptons of the Z decay. Thus, this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called "pile-up" (multiple overlapping proton-proton collisions), a novel technique was developed and successfully applied. The data were further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called "Underlying Event" and can be compared with predictions of Monte Carlo event generators directly, i.e. without the necessity of running time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of relevant model parameters, a technique called "tuning". It became apparent, however, that the underlying Sjöstrand-van Zijl model is unable to give a good description of the measured event-shape distributions.

  1. Life review based on remembering specific positive events in active aging.

    Science.gov (United States)

    Latorre, José M; Serrano, Juan P; Ricarte, Jorge; Bonete, Beatriz; Ros, Laura; Sitges, Esther

    2015-02-01

    The aim of this study is to evaluate the effectiveness of life review (LR) based on specific positive events in non-depressed older adults taking part in an active aging program. Fifty-five older adults were randomly assigned to an experimental group or an active control (AC) group. A six-session individual training of LR based on specific positive events was carried out with the experimental group. The AC group undertook a "media workshop" of six sessions focused on learning journalistic techniques. Pre-test and post-test measures included life satisfaction, depressive symptoms, experiencing the environment as rewarding, and autobiographical memory (AM) scales. LR intervention decreased depressive symptomatology, improved life satisfaction, and increased specific memories. The findings suggest that practice in AM for specific events is an effective component of LR that could be a useful tool in enhancing emotional well-being in active aging programs, thus reducing depressive symptoms. © The Author(s) 2014.

  2. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    Psi-calculi constitute a parametric framework for nominal process calculi, where constraint-based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extend prime event structures to allow finite representations of (omega) regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research, which

  3. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Full Text Available Wireless sensor networks are highly dynamic and uncertain: the network topology changes, nodes disappear or are added, and the network faces various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region-similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, when regional data are lacking, the quantitative trend of unknown regional network security events can be calculated. This study introduces a sensor network security event forecast method named Prediction Network Security Incomplete Unmarked Data (PNSIUD to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that for unknown security event forecasts the accuracy and effects of the similarity forecast algorithm are better than those of the single-task learning method. At the same time, the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  4. Automated reasoning with dynamic event trees: a real-time, knowledge-based decision aide

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Subramanyan, N.

    1988-01-01

    The models and data contained in a probabilistic risk assessment (PRA) Event Sequence Analysis represent a wealth of information that can be used for dynamic calculation of event sequence likelihood. In this paper we report a new and unique computerization methodology which utilizes these data. This sub-system (referred to as PREDICTOR) has been developed and tested as part of a larger system. PREDICTOR performs a real-time (re)calculation of the estimated likelihood of core-melt as a function of plant status. This methodology uses object-oriented programming techniques from the artificial intelligence discipline that enable one to codify event tree and fault tree logic models and associated probabilities developed in a PRA study. Existence of off-normal conditions is reported to PREDICTOR, which then updates the relevant failure probabilities throughout the event tree and fault tree models by dynamically replacing the off-the-shelf (or prior) probabilities with new probabilities based on the current situation. The new event probabilities are immediately propagated through the models (using 'demons') and an updated core-melt probability is calculated. Along the way, the dominant non-success path of each event tree is determined and highlighted. (author)
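The update-and-propagate loop can be illustrated with a toy fault tree. The gates, event names, and probabilities are invented, and PREDICTOR's object-oriented 'demons' are reduced here to an immediate recomputation of the top event:

```python
def gate_or(ps):
    """OR gate over independent basic events."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def gate_and(ps):
    """AND gate over independent basic events."""
    q = 1.0
    for p in ps:
        q *= p
    return q

class FaultTree:
    def __init__(self, priors):
        self.p = dict(priors)             # basic-event probabilities

    def update(self, event, prob):
        """Replace a prior with a probability reflecting current plant
        status, then immediately re-propagate to the top event."""
        self.p[event] = prob
        return self.top()

    def top(self):
        # Toy logic: loss of offsite power (grid fault OR severe weather)
        # AND both emergency diesels failing.
        loop = gate_or([self.p["grid"], self.p["weather"]])
        return gate_and([loop, self.p["dg_a"], self.p["dg_b"]])

tree = FaultTree({"grid": 5e-3, "weather": 5e-3, "dg_a": 5e-2, "dg_b": 5e-2})
baseline = tree.top()
degraded = tree.update("dg_a", 1.0)       # diesel A reported unavailable
```

Reporting diesel A as unavailable replaces its prior with 1.0 and the top-event probability jumps accordingly.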

  5. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of a parallel event building system for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modeling this prototype and extended architectures with MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems, including a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behavior were investigated. The influence of realistic event size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated, as well as a two-stage event building system. (author)

  6. A browser-based event display for the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Hategan, M; McCauley, T; Nguyen, P

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers become powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications offer a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser supporting the HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to the display of events can have broader usage and impact for experts and public alike.

  7. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  8. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and the use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis increasingly became a part of the AVN process of feedback of operating experience, and in fact constituted the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions allowed AVN to identify - and in some cases also to quantify - several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding, for instance, event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  9. Location aware event driven multipath routing in Wireless Sensor Networks: Agent based approach

    Directory of Open Access Journals (Sweden)

    A.V. Sutagundar

    2013-03-01

    Full Text Available Wireless Sensor Networks (WSNs) demand reliable and energy-efficient paths for critical information delivery to the sink node from an event occurrence node. Multipath routing facilitates reliable data delivery in case of critical information. This paper proposes an event-triggered multipath routing in WSNs by employing a set of static and mobile agents. Every sensor node is assumed to know the location information of the sink node and itself. The proposed scheme works as follows: (1) The event node computes the arbitrary midpoint between itself and the sink node by using location information. (2) The event node establishes a shortest path from itself to the sink node through the reference axis by using a mobile agent with the help of location information; the mobile agent collects the connectivity information and other parameters of all the nodes on the way and provides the information to the sink node. (3) The event node finds the arbitrary locations of the special (middle) intermediate nodes (above/below the reference axis) by using the midpoint location information given in step 1. (4) Mobile agents clone from the event node; the clones carry the event type and discover the paths passing through the special intermediate nodes; the path above/below the reference axis looks like an arc. While migrating from one sensor node to another along the traversed path, each mobile agent gathers the node information (such as node id, location information, residual energy, available bandwidth, and neighbor connectivity) and delivers it to the sink node. (5) The sink node constructs a partial topology connecting the event and sink nodes by using the connectivity information delivered by the mobile agents. Using the partial topology information, the sink node finds the multipaths and a path weight factor by using link efficiency, energy ratio, and hop distance. (6) The sink node selects the number of paths among the available paths based upon the criticalness of the event, and (7) if the event is non
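A few of the geometric and ranking steps above can be sketched directly. The abstract names link efficiency, energy ratio, and hop distance as inputs to the path weight factor but not how they are combined, so the formula below is one invented, plausible combination:

```python
def midpoint(event, sink):
    """Step 1: arbitrary midpoint between the event node and the sink."""
    return ((event[0] + sink[0]) / 2.0, (event[1] + sink[1]) / 2.0)

def path_weight(link_effs, residual_energies, initial_energy, hop_count):
    """Hypothetical path weight factor: reward efficient links and
    remaining energy, penalise long paths."""
    link = sum(link_effs) / len(link_effs)
    energy = min(residual_energies) / initial_energy
    return link * energy / hop_count

def select_paths(paths, criticality):
    """Step 6: the more critical the event, the more paths are used."""
    ranked = sorted(paths, key=lambda p: p["weight"], reverse=True)
    k = max(1, round(criticality * len(ranked)))
    return ranked[:k]
```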

  10. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and

  11. Lyapunov design of event-based controllers for the rendez-vous of coupled systems

    NARCIS (Netherlands)

    De Persis, Claudio; Postoyan, Romain

    2014-01-01

    The objective is to present a new type of triggering conditions together with new proof concepts for the event-based coordination of multi-agents. As a first step, we focus on the rendez-vous of two identical systems modeled as double integrators with additional damping in the velocity dynamics. The

  12. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet based on a multi-agent system approach, with which renewable energy resources can be fully utilized to meet load demand with high security and good dynamic quality. In the design of the control, a multi-agent system

  13. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network-based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset
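A toy version of such a detector — one hidden layer, trained here by plain gradient descent on two invented waveform features; the paper's actual inputs, network size, and training scheme are not reproduced:

```python
import numpy as np

def train_detector(X, t, n_hidden=8, lr=0.5, iters=500, seed=1):
    """Train a tiny feed-forward net (single tanh hidden layer, sigmoid
    output) with full-batch gradient descent on binary cross-entropy.
    Returns a predictor mapping feature rows to boolean detections."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1));          b2 = np.zeros(1)
    for _ in range(iters):
        h = np.tanh(X @ W1 + b1)
        y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
        g2 = (y - t) / len(X)                  # d(BCE)/d(output logit)
        gh = (g2 @ W2.T) * (1.0 - h ** 2)      # backprop through tanh
        W2 -= lr * (h.T @ g2); b2 -= lr * g2.sum(0)
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(0)
    return lambda Z: (1.0 / (1.0 + np.exp(-(np.tanh(Z @ W1 + b1) @ W2 + b2)))) > 0.5
```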

  14. Mind the gap: modelling event-based and millennial-scale landscape dynamics

    NARCIS (Netherlands)

    Baartman, J.E.M.

    2012-01-01

    This research looks at landscape dynamics – erosion and deposition – from two different perspectives: long-term landscape evolution over millennial timescales on the one hand and short-term event-based erosion and deposition at the other hand. For the first, landscape evolution models (LEMs) are

  15. Component-Based Data-Driven Predictive Maintenance to Reduce Unscheduled Maintenance Events

    NARCIS (Netherlands)

    Verhagen, W.J.C.; Curran, R.; de Boer, L.W.M.; Chen, C.H.; Trappey, A.C.; Peruzzini, M.; Stjepandić, J.; Wognum, N.

    2017-01-01

    Costs associated with unscheduled and preventive maintenance can contribute significantly to an airline's expenditure. Reliability analysis can help to identify and plan for maintenance events. Reliability analysis in industry is often limited to statistically based

  16. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing. … 0.79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for automatic detection

  17. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence Baer

    2013-05-01

    Full Text Available Musical performance is thought to rely predominantly on event-based timing, which involves a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.
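
    As an aside on the measure used above: timing variability in such tasks is typically quantified as the standard deviation of the inter-response intervals (IRIs) in the Unpaced phase. A minimal sketch — the function name and the tap timestamps below are illustrative assumptions, not data from the study:

```python
import statistics

def iri_variability(response_times):
    """Standard deviation of the inter-response intervals (IRIs),
    where `response_times` are cumulative timestamps in seconds."""
    iris = [b - a for a, b in zip(response_times, response_times[1:])]
    return statistics.stdev(iris)

# Hypothetical Unpaced-phase tap times drifting around a 500 ms target.
taps = [0.00, 0.51, 1.00, 1.52, 1.99, 2.50]
print(round(iri_variability(taps), 3))  # → 0.02
```

    Lower values indicate steadier timing; the study compares this quantity across tasks (tapping vs. drawing) and groups (musicians vs. non-musicians).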

  18. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    OpenAIRE

    Katherine M. Boydell

    2011-01-01

    A symbolic interaction (Blumer, 1969; Mead, 1934; Prus, 1996; Prus & Grills, 2003) approach was taken to study the collective event (Prus, 1997) of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed the authors to study the process of the creation of the dance and its capacity to convert abstract research into concre...

  19. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  20. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr… We exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally, we provide a mapping from DCR Graphs to Büchi automata.

  1. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.

  2. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best-written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  3. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, intended to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM), and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm to join toponym pieces into a full address. AEMEI analyzes the results of the emergency event extraction and summarizes the advantages and disadvantages of the event framework. Experiments show that the event-frame technique can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that occurred in the past and make defensive and disaster-reduction arrangements ahead of schedule. The technology decreases the number of casualties and the amount of property damage in the country and the world, which is of great significance to the state and society.

  4. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists, many causes contribute to flood events; the two most dominant factors are meteorology (i.e., climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model of the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated because the study only tries to quantify rainfall changes between these two events to simulate the discharge and runoff values. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.

  5. Event-based rainfall-runoff modelling of the Kelantan River Basin

    International Nuclear Information System (INIS)

    Basarudin, Z; Adnan, N A; Latif, A R A; Syafiqah, N; Tahir, W

    2014-01-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists, many causes contribute to flood events; the two most dominant factors are meteorology (i.e., climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model of the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated because the study only tries to quantify rainfall changes between these two events to simulate the discharge and runoff values. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.

  6. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, the Event-based Text-mining of Health Electronic Records system achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated the system's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
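
    The reported figures follow the usual term-level definitions: precision is the fraction of extracted terms that also appear in the reference set, and recall is the fraction of reference terms that were extracted. A minimal sketch, with hypothetical term sets (not taken from the paper's dataset):

```python
def precision_recall(extracted, reference):
    """Term-level precision and recall of an extraction run against
    a manually annotated reference set of terms."""
    extracted, reference = set(extracted), set(reference)
    tp = len(extracted & reference)                   # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(reference) if reference else 0.0
    return precision, recall

# Hypothetical MedDRA Preferred Terms for one label (not from the study).
system_terms = {"Nausea", "Headache", "Dizziness", "Rash"}
gold_terms = {"Nausea", "Headache", "Dizziness", "Vomiting"}
print(precision_recall(system_terms, gold_terms))  # → (0.75, 0.75)
```

    Aggregating these counts over all 100 labels before dividing (micro-averaging) is one common way to arrive at corpus-level figures like the 81/92 percent reported above.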

  7. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement with an energy-saving factor considered. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains have to accelerate or brake frequently to control the headway distance, leading to more energy consumption. (general)
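
    The kind of discrete-event simulation described can be sketched with an event queue: each train's next movement is an event, and a train whose headway to its leader is too short re-schedules itself after a wait, modelling the brake/retry behaviour that drives up energy use. All names, parameters, and the simplified block/headway rules below are illustrative assumptions, not the paper's model:

```python
import heapq

def simulate(n_trains, n_blocks, run_time=60.0, wait=10.0, headway=2, gap=90.0):
    """Event-queue simulation of trains on a single line of `n_blocks`
    blocks: a train advances one block per movement event, but only if
    the train ahead is at least `headway` blocks away; otherwise it
    waits `wait` seconds and retries (a brake/wait cycle)."""
    pos = [0] * n_trains                     # current block of each train
    finish = [None] * n_trains               # arrival time at the last block
    events = [(i * gap, i) for i in range(n_trains)]  # departure events
    heapq.heapify(events)
    while events:
        t, i = heapq.heappop(events)
        clear = (i == 0 or finish[i - 1] is not None
                 or pos[i - 1] - pos[i] >= headway)
        if clear:
            pos[i] += 1
            if pos[i] == n_blocks:
                finish[i] = t                # train leaves the line
            else:
                heapq.heappush(events, (t + run_time, i))
        else:
            heapq.heappush(events, (t + wait, i))     # brake and retry
    return finish

print(simulate(2, 3))  # → [120.0, 210.0]
```

    With a shorter departure gap (e.g. `gap=30.0`) the second train is forced into wait/retry cycles by the headway rule: its total travel time grows from 120 s to 150 s even though it departs earlier, mirroring the cumulative-influence effect noted in the abstract.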

  8. LCP method for a planar passive dynamic walker based on an event-driven scheme

    Science.gov (United States)

    Zheng, Xu-Dong; Wang, Qi

    2018-06-01

    The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.

  9. Pull-Based Distributed Event-Triggered Consensus for Multiagent Systems With Directed Topologies.

    Science.gov (United States)

    Yi, Xinlei; Lu, Wenlian; Chen, Tianping

    2017-01-01

    This paper mainly investigates the consensus problem with a pull-based event-triggered feedback control. For each agent, the diffusion coupling feedbacks are based on the states of its in-neighbors at its latest triggering time, and the next triggering time of this agent is determined by its in-neighbors' information. General directed topologies, including irreducible and reducible cases, are investigated. The scenario of distributed continuous communication is considered first. It is proved that if the network topology has a spanning tree, then the event-triggered coupling algorithm can realize consensus for the multiagent system. Then, the results are extended to discontinuous communication, i.e., self-triggered control, where each agent computes its next triggering time in advance without having to observe the system's states continuously. The effectiveness of the theoretical results is finally illustrated by a numerical example.
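
    The triggering idea — each agent's feedback uses the last broadcast states of its in-neighbors, and an agent re-broadcasts only when its local measurement gap exceeds a threshold — can be sketched in discrete time. This is a simplified illustration under assumed dynamics, graph, and parameters, not the paper's algorithm:

```python
import numpy as np

def event_triggered_consensus(x0, in_neighbors, steps=400, dt=0.05, delta=0.05):
    """Each agent integrates feedback computed from the *last broadcast*
    states x_hat of its in-neighbors, and re-broadcasts its own state
    only when |x_i - x_hat_i| exceeds the threshold `delta`."""
    x = np.array(x0, dtype=float)
    x_hat = x.copy()                          # last broadcast values
    broadcasts = 0
    for _ in range(steps):
        u = np.array([sum(x_hat[j] - x_hat[i] for j in in_neighbors[i])
                      for i in range(len(x))])
        x += dt * u
        for i in range(len(x)):
            if abs(x[i] - x_hat[i]) > delta:  # event: broadcast new state
                x_hat[i] = x[i]
                broadcasts += 1
    return x, broadcasts

# Directed ring (contains a spanning tree): agent i listens to agent i+1.
x, n = event_triggered_consensus([0.0, 1.0, 2.0, 3.0],
                                 {0: [1], 1: [2], 2: [3], 3: [0]})
print(x.round(2), n)
```

    The states settle within the triggering band of a common value while far fewer broadcasts occur than the steps × agents of continuous communication — the bandwidth saving that event-triggered schemes target.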

  10. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and computing resource requirements needed to provide an uninterrupted flow of processed data for real-time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.

  11. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  12. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, such as wall following and obstacle avoidance, in scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.

  13. Nest-crowdcontrol: Advanced video-based crowd monitoring for large public events

    OpenAIRE

    Monari, Eduardo; Fischer, Yvonne; Anneken, Mathias

    2015-01-01

    Current video surveillance systems still lack intelligent video and data analysis modules for supporting the situation awareness of decision makers. Especially in mass gatherings like large public events, the decision maker would benefit from different views of the area, especially from crowd density estimations. This article describes a multi-camera system called NEST and its application to crowd density analysis. First, the overall system design is presented. Based on this, the crowd densit...

  14. Spatio-Temporal Story Mapping Animation Based On Structured Causal Relationships Of Historical Events

    Science.gov (United States)

    Inoue, Y.; Tsuruoka, K.; Arikawa, M.

    2014-04-01

    In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories by representing causal relationships among events as time series. We have been developing an experimental software system for the spatio-temporal visualization of historical stories on tablet computers. Our proposed system helps people learn historical stories effectively, using visual animations based on hierarchical structures of maps and timelines at different scales.

  15. Network based on statistical multiplexing for event selection and event builder systems in high energy physics experiments

    International Nuclear Information System (INIS)

    Calvet, D.

    2000-03-01

    Systems for on-line event selection in future high energy physics experiments will use advanced distributed computing techniques and will need high speed networks. After a brief description of projects at the Large Hadron Collider, the architectures initially proposed for the Trigger and Data AcQuisition (T/DAQ) systems of the ATLAS and CMS experiments are presented and analyzed. A new architecture for the ATLAS T/DAQ is introduced. Candidate network technologies for this system are described. This thesis focuses on ATM. A variety of network structures and topologies suited to partial and full event building are investigated. The need for efficient networking is shown. Optimization techniques for high speed messaging and their implementation on ATM components are described. Small scale demonstrator systems consisting of up to 48 computers (∼1:20 of the final level 2 trigger) connected via ATM are described. Performance results are presented. Extrapolation of measurements and evaluation of needs lead to a proposed implementation for the main network of the ATLAS T/DAQ system. (author)

  16. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    Science.gov (United States)

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and the learning objectives) and performance assessment. The EBAT process involves: … Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure, as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately.
    It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) …

  17. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available Agent-based simulation has become a prominent approach in computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex-system problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using agent-based simulation. By employing a computational experimentation methodology, we construct group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two synthetic scenarios and a real case study. We show how various strategies can impact group emotion evolution in terms of complex emergence and the accumulated influence of emotion in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution in extreme incident, emergency, and security study domains.

  18. A Cluster-Based Fuzzy Fusion Algorithm for Event Detection in Heterogeneous Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    ZiQi Hao

    2015-01-01

    Full Text Available As limited energy is one of the tough challenges in wireless sensor networks (WSN), energy saving is important for increasing the lifecycle of the network. Data fusion enables information from several sources to be combined into a unified view, which can significantly save sensor energy and enhance the accuracy of the sensed data. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which can significantly reduce the energy consumption of intracluster communication. The distances between the cluster heads and the event, as well as the energy of the clusters, are fuzzified, and fuzzy logic is used to select the clusters that will participate in data uploading and fusion. The fuzzy logic method is also used by cluster heads for local decisions, and the local decision results are then sent to the base station. Decision-level fusion for the final decision on the event is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters, calculated by the fuzzy logic method. The effectiveness of this algorithm is demonstrated by simulation results.
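
    A rough sketch of the pipeline described above: k-means groups the nodes, and a simple fuzzy rule (minimum t-norm of "close to the event" and "energy-rich" memberships) selects which clusters participate. The membership functions, thresholds, and data below are illustrative assumptions, not the paper's fuzzy system:

```python
import numpy as np

def kmeans(points, k, iters=25, seed=0):
    """Plain k-means, used here to group sensor nodes into clusters."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

def fuzzy_score(dist, energy, d_max=10.0):
    """Fuzzy AND (minimum t-norm) of 'close to event' and 'energy-rich'."""
    near = max(0.0, 1.0 - dist / d_max)       # triangular membership
    return min(near, energy)                  # energy already in [0, 1]

rng = np.random.default_rng(1)
nodes = rng.uniform(0.0, 10.0, size=(60, 2))  # sensor field, 10 x 10 units
event = np.array([2.0, 2.0])
centers, labels = kmeans(nodes, k=4)          # cluster heads ~ centers
energy = rng.uniform(0.3, 1.0, size=4)        # residual energy per cluster
scores = [fuzzy_score(np.linalg.norm(c - event), e)
          for c, e in zip(centers, energy)]
selected = [j for j, s in enumerate(scores) if s > 0.4]
print(selected)
```

    Only the selected clusters would upload local decisions; the base station then performs decision-level fusion weighted by the same fuzzy support degrees.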

  19. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    Science.gov (United States)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on a feedback formulation and has become one of the most widely used multiple-suppression methods. However, differences are apparent between the predicted multiples and those in the source seismic records, which may leave conventional adaptive multiple subtraction methods barely able to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and short-time-window FK filtering. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method and the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples, with the advantage of minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.

  20. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    Full Text Available This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node only if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in reducing network bandwidth usage. However, it cannot detect packet dropouts, because data transmission and reception do not use the periodic time-stamp mechanism found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme, called modified SOD, in which sensor data are sent when either the change in sensor output exceeds a given threshold or more than a given interval has elapsed since the last transmission. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.
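
    The modified SOD rule is easy to state in code: transmit when the value change exceeds the threshold, or when a maximum silence interval has elapsed, which gives the estimator a heartbeat for detecting dropouts. A minimal sketch with made-up numbers (the function and parameter names are illustrative):

```python
def modified_sod(samples, dt, delta, t_max):
    """Send a sample when the value has changed by more than `delta`
    since the last transmission OR when `t_max` has elapsed since the
    last transmission (a heartbeat that exposes packet dropouts)."""
    sent = []
    last_value = last_time = None
    for k, x in enumerate(samples):
        t = k * dt
        if (last_value is None or abs(x - last_value) > delta
                or t - last_time >= t_max):
            sent.append((t, x))
            last_value, last_time = x, t
    return sent

# A slowly drifting signal sampled every 0.1 s: pure SOD would stay
# silent during the drift, but t_max forces a transmission every 0.4 s.
signal = [0.0, 0.02, 0.03, 0.05, 0.06, 0.30, 0.31, 0.33, 0.34, 0.35]
print(modified_sod(signal, dt=0.1, delta=0.1, t_max=0.4))
```

    Four transmissions result here: the initial one, heartbeats at 0.4 s and 0.9 s, and one delta-triggered send at 0.5 s. With `t_max` effectively disabled only two transmissions occur, so a dropout could go unnoticed indefinitely — the situation the modified scheme is designed to avoid.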

  1. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available Rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. This paper investigates event-based sampling according to the integral criterion. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined, and related work in adaptive sampling is summarized. Analytical closed-form formulas for the mean rate of event-based traffic and the asymptotic effectiveness of integral sampling are derived, and simulation results verifying the analytical formulas are reported. The effectiveness of integral sampling is compared with the related linear send-on-delta/level-crossing scheme, and the calculation of the asymptotic effectiveness is exemplified for common signals that model the evolution of dynamic system states in time.
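
    The integral criterion can be illustrated as follows: instead of comparing the instantaneous deviation |x(t) − x_last| against a threshold (send-on-delta), the sensor accumulates the time integral of that deviation and triggers when it exceeds a threshold ξ; for a ramp of slope a this gives inter-event times of roughly sqrt(2ξ/a). A minimal discretized sketch (parameter names are illustrative):

```python
def integral_sampling(samples, dt, xi):
    """Trigger an event when the time integral of |x(t) - x_last|
    accumulated since the last event exceeds the threshold `xi`
    (pure send-on-delta instead compares |x(t) - x_last| directly)."""
    events = [(0.0, samples[0])]
    last, area = samples[0], 0.0
    for k in range(1, len(samples)):
        area += abs(samples[k] - last) * dt   # rectangle-rule integral
        if area >= xi:
            last, area = samples[k], 0.0
            events.append((k * dt, last))
    return events

# Unit-slope ramp sampled every 0.1 s: events land ~sqrt(2 * xi) apart.
ramp = [0.1 * k for k in range(20)]
print(len(integral_sampling(ramp, dt=0.1, xi=0.05)))  # → 7
```

    On this ramp the events are spaced about 0.3 s apart, close to the analytic sqrt(2 · 0.05 / 1) ≈ 0.32 s, which is the kind of closed-form mean event rate the paper derives.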

  2. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising a physical frame, a logical frame, and a cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual non-ABWR digital I and C software failure events reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. 
However, a well

  3. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems across different branches of large organizations: they unify the commonalities shared by all branches while describing their differences at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since the choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive, from a configurable process model, a process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.

  4. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.

    Science.gov (United States)

    Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos

    2018-03-23

    Timely and reliable inter-vehicle communication is a critical requirement for supporting traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
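The data-age token rule can be illustrated with a toy selection function (the vehicle names and timestamps are hypothetical; the real protocol operates at the MAC layer with many more concerns than this):

```python
def next_token_holder(last_beacon_time, now):
    """Grant the token to the platoon member whose last transmitted status
    update is oldest (largest data age), instead of using a fixed
    round-robin order -- so stale data gets refreshed first."""
    return max(last_beacon_time, key=lambda v: now - last_beacon_time[v])

# Hypothetical platoon of four vehicles; values are last beacon times (ms).
beacons = {"v1": 120, "v2": 80, "v3": 150, "v4": 95}
holder = next_token_holder(beacons, now=200)
```

Here v2's beacon is the stalest (data age 120 ms), so it transmits next; a vehicle whose earlier beacon was lost naturally accumulates age and gets another transmission opportunity soon.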

  5. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies

    Directory of Open Access Journals (Sweden)

    Ali Balador

    2018-03-01

    Full Text Available Timely and reliable inter-vehicle communication is a critical requirement for supporting traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.

  6. Mining web-based data to assess public response to environmental events

    International Nuclear Information System (INIS)

    Cha, YoonKyung; Stow, Craig A.

    2015-01-01

    We explore how the analysis of web-based data, such as Twitter and Google Trends, can be used to assess the social relevance of an environmental accident. The concept and methods are applied to the shutdown of the drinking water supply in the city of Toledo, Ohio, USA. Toledo's notice, which persisted from August 1 to 4, 2014, was a high-profile event that directly affected approximately half a million people and received wide recognition. The notice was given when excessive levels of microcystin, a byproduct of cyanobacteria blooms, were discovered at the drinking water treatment plant on Lake Erie. Twitter mining results illustrated the instant response to the Toledo incident, the associated collective knowledge, and public perception. The results from Google Trends, on the other hand, revealed how the Toledo event raised public attention to the associated environmental issue, harmful algal blooms, in a long-term context. Thus, when jointly applied, Twitter and Google Trends analyses offer complementary perspectives. Web content aggregated through mining approaches provides a social standpoint, such as public perception and interest, and offers context for establishing and evaluating environmental management policies. The joint application of Twitter and Google Trends analysis to an environmental event offered both short- and long-term patterns of public perception of, and interest in, the event.

  7. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable in propagating information during disasters thanks to its timeliness and availability, and assists in decision making when tagged with locations. Considering the ambiguity and inaccuracy of some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of spatial location, not from the perspective of the disaster, making it difficult to analyze the disaster dynamically: all of the disaster-related data around the affected locations need to be retrieved. To overcome these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework that proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and their attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed within the GEGIS and shows that the system would be effective when typhoons occur.
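Step (2), matching geospatial resources against terms extracted from the web, can be caricatured with a plain keyword-overlap (Jaccard) score; the real system uses ontology-driven semantic similarity, and every name below is invented:

```python
def jaccard(a, b):
    """Keyword-overlap score between two term sets (0 = disjoint, 1 = equal)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Terms extracted from web reports of a typhoon geo-event (all invented):
event_terms = {"typhoon", "wind", "flood", "shelter"}
resources = {
    "wind_field_model":   {"typhoon", "wind", "forecast"},
    "rain_gauge_network": {"rain", "flood", "gauge"},
    "road_map":           {"road", "traffic"},
}
# Rank candidate geospatial resources by overlap with the event terms.
ranked = sorted(resources, key=lambda r: jaccard(event_terms, resources[r]),
                reverse=True)
```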

  8. The taxable events for the Value-Added Tax (VAT based on a Comparative Law approach

    Directory of Open Access Journals (Sweden)

    Walker Villanueva Gutiérrez

    2014-07-01

    Full Text Available This article analyzes the definitions of the main taxable events for the Value-Added Tax (VAT) based on a comparative approach to the legislation of different countries (Spain, Mexico, Chile, Colombia, Argentina and Peru). In this regard, it analyzes which legislations offer definitions consistent with the principles of generality, fiscal neutrality and legal certainty for VAT. Moreover, it points out that the VAT systems of those countries do not require, as a condition for the configuration of the taxable events, that the transactions involve a «value added» or a final consumption. In the specific case of «supplies of goods», the VAT systems have a similar definition of the taxable event, although there are a few differences. However, in the case of «supplies of services», which is the most important taxable event for VAT, there are important differences in how each country defines it. This is not a desirable effect for the international trade in services, since the lack of harmonization produces double taxation or double non-taxation.

  9. An adverse events potential costs analysis based on Drug Programs in Poland. Dermatology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-09-01

    Full Text Available The aim of the project, carried out within the Polish Society for Pharmacoeconomics (PTFE), was to estimate the potential costs of treating the side effects which may (theoretically) occur as a result of treatments for selected diseases. This paper deals solely with dermatology-related events. Several Drug Programs financed by the National Health Fund in Poland in 2012 were analyzed. The adverse events were selected based on the Summary of Product Characteristics of the chosen products. We focused the project on those potential adverse events which were defined in the SPC as common and very common. The results are presented according to their therapeutic areas, and this paper focuses on those related to dermatology. Events described as ‘very common’ had an incidence of ≥ 1/10, and those described as ‘common’ an incidence of ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. In our work, we employed only the total direct costs incurred by the public payer, based on valid individual cost data from February 2014. Moreover, we calculated the total spending from the public payer’s perspective as well as the patient’s perspective, and the percentage of each component of the total cost in detail. The paper thus informs the reader of the estimated costs of treating side effects related to dermatologic symptoms and reactions. Based on our work, we can state that the treatment of skin adverse drug reactions generates a significant cost - one incurred by both the public payer and the patient.

  10. ADEpedia: a scalable and standardized knowledge base of Adverse Drug Events using semantic web technology.

    Science.gov (United States)

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-01-01

    A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype was successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation was performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query services.

  11. Results from a data acquisition system prototype project using a switch-based event builder

    International Nuclear Information System (INIS)

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D.; Bowden, M.; Booth, A.; Cancelo, G.

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs

  12. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    OpenAIRE

    Boydell, Katherine M.

    2011-01-01

    A symbolic interaction (BLUMER, 1969; MEAD, 1934; PRUS, 1996; PRUS & GRILLS, 2003) approach was taken to study the collective event (PRUS, 1997) of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed us to study the process of the creation of the dance and its capacity to convert abstract research into concrete form a...

  13. Results from a data acquisition system prototype project using a switch-based event builder

    Energy Technology Data Exchange (ETDEWEB)

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D. (Fermi National Accelerator Lab., Batavia, IL (United States)); Bowden, M.; Booth, A. (Superconducting Super Collider Lab., Dallas, TX (United States)); Cancelo, G. (La Plata Univ. Nacional (Argentina))

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs.

  14. Event-triggered hybrid control based on multi-Agent systems for Microgrids

    DEFF Research Database (Denmark)

    Dou, Chun-xia; Liu, Bin; Guerrero, Josep M.

    2014-01-01

    This paper is focused on a multi-agent-system-based event-triggered hybrid control for intelligently restructuring the operating mode of a microgrid (MG) to ensure the energy supply with high security, stability and cost effectiveness. Because the microgrid is composed of different types of distributed energy resources, it is a typical hybrid dynamic network. Considering the complex hybrid behaviors, a hierarchical decentralized coordinated control scheme is first constructed based on a multi-agent system; then, the hybrid model of the microgrid is built by using differential hybrid Petri nets.

  15. Building a knowledge base of severe adverse drug events based on AERS reporting data using semantic web technologies.

    Science.gov (United States)

    Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G

    2013-01-01

    A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However, it remains challenging to measure and identify the severity information of ADEs. The objective of this study is to develop and evaluate a semantic-web-based approach for building a knowledge base of severe ADEs based on the FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from the Side Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Events (CTCAE) grading system and classified the ADEs into CTCAE grades in the Web Ontology Language (OWL). We identified and validated 2,444 unique drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4 and 2,196 pairs in Grade 3.

  16. Identifying Typhoon Tracks based on Event Synchronization derived Spatially Embedded Climate Networks

    Science.gov (United States)

    Ozturk, Ugur; Marwan, Norbert; Kurths, Jürgen

    2017-04-01

    Complex networks are commonly used for investigating the spatiotemporal dynamics of complex systems, e.g. extreme rainfall. Directed networks in particular are very effective tools for identifying climatic patterns on spatially embedded networks: they can capture the network flux, and thus the principal dynamics of how significant phenomena spread. Network measures, such as network divergence, reveal the source-receptor relations of directed networks. However, it is still a challenge to capture fast-evolving atmospheric events, i.e. typhoons. In this study, we propose a new technique, namely Radial Ranks, to detect the general pattern of a typhoon's forward direction based on the strength parameter of event synchronization over Japan. We suggest subsetting a circular zone of high correlation around the selected grid point based on the strength parameter. Radial sums of the strength parameter along vectors within this zone, the radial ranks, are measured for potential directions, which allows us to trace the network flux over long distances. We also employed the delay parameter of event synchronization to identify and separate the individual behaviors of frontal storms and typhoons.
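The strength parameter of event synchronization counts quasi-simultaneous event pairs between two time series. A minimal fixed-window version can be sketched as follows (the full measure of Quiroga et al. uses an adaptive time lag; the event series and window are invented for illustration):

```python
def event_sync_strength(t_a, t_b, tau):
    """Fraction of events in either series that have a partner event in
    the other series within +/- tau time units (symmetric, in [0, 1])."""
    def matched(src, dst):
        return sum(any(abs(s - d) <= tau for d in dst) for s in src)
    return (matched(t_a, t_b) + matched(t_b, t_a)) / (len(t_a) + len(t_b))

# Two invented extreme-rainfall event series (days since record start):
a = [3, 10, 22, 40]
b = [4, 11, 35, 41]
q = event_sync_strength(a, b, tau=2)   # 6 of the 8 events have partners
```

Computing this strength for every pair of grid points yields the weighted, spatially embedded network on which measures such as the radial sums are then evaluated.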

  17. GIS-based rare events logistic regression for mineral prospectivity mapping

    Science.gov (United States)

    Xiong, Yihui; Zuo, Renguang

    2018-02-01

    Mineralization is a special type of singularity event and can be considered a rare event, because within a specific study area the number of prospective locations (1s) is considerably smaller than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map mineral prospectivity in southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and aeromagnetic anomalies, are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with original logistic regression (OLR) demonstrates that GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
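The odds ratio mentioned above can be computed from a 2x2 cross-tabulation of a binary evidence layer against known deposit locations; the counts below are hypothetical, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
       a: prospective cells where the evidence layer is present
       b: prospective cells where it is absent
       c: non-prospective cells where it is present
       d: non-prospective cells where it is absent
    OR > 1 means the layer is positively associated with mineralization."""
    return (a * d) / (b * c)

# Hypothetical counts for a fault-proximity evidence layer:
ratio = odds_ratio(a=18, b=7, c=300, d=2400)
```

An evidence layer with a larger odds ratio contributes more strongly to the prospectivity model, which is how the relative importance of formations, intrusions and faults can be compared.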

  18. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high-luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order of 10-100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high-performance workstation ATM interface, and (3) a VMEbus ATM interface. The requirement for traffic shaping in ATM-based event builders is discussed and an analysis of the performance of several such schemes is presented.

  19. A Hospital Nursing Adverse Events Reporting System Project: An Approach Based on the Systems Development Life Cycle.

    Science.gov (United States)

    Cao, Yingjuan; Ball, Marion

    2017-01-01

    Based on the Systems Development Life Cycle, a hospital-based nursing adverse event reporting system was developed and implemented, integrated with the current Hospital Information System (HIS). Besides the positive outcomes in terms of timeliness and efficiency, this approach has brought an enormous change in how nurses report, analyze and respond to adverse events.

  20. Discrete event dynamic system (DES)-based modeling for dynamic material flow in the pyroprocess

    International Nuclear Information System (INIS)

    Lee, Hyo Jik; Kim, Kiho; Kim, Ho Dong; Lee, Han Soo

    2011-01-01

    A modeling and simulation methodology was proposed in order to implement the dynamic material flow of the pyroprocess. Since the static mass balance provides only limited information on the material flow, it is hard to predict dynamic behavior event by event. Therefore, a discrete event system (DES)-based model named PyroFlow was developed at the Korea Atomic Energy Research Institute (KAERI). PyroFlow is able to calculate the dynamic mass balance and also show various dynamic operational results in real time. Using PyroFlow, it is easy to rapidly predict results that are otherwise hard to foresee, such as the throughput of a unit process, product accumulation in buffers, and operation status. As preliminary simulations, bottleneck analyses of the pyroprocess were carried out, showing that the operation strategy influences the productivity of the pyroprocess.
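The kind of dynamic quantity a DES model adds over a static mass balance can be shown with a minimal single-process sketch (this is not PyroFlow itself; the arrival schedule and service time are invented):

```python
def simulate_unit_process(arrivals, service_time):
    """Single unit process behind a FIFO buffer: each batch starts when it
    has arrived and the process is free. Returns per-batch finish times
    and the peak number of batches in the system (waiting or in process)."""
    finish_times = []
    process_free = 0
    for t in sorted(arrivals):
        start = max(t, process_free)
        process_free = start + service_time
        finish_times.append(process_free)
    # Occupancy only increases at arrival instants, so checking those suffices.
    peak = max(
        sum(1 for a in arrivals if a <= t) - sum(1 for f in finish_times if f <= t)
        for t in sorted(arrivals)
    )
    return finish_times, peak

# Batches arrive every 2 h from upstream but the process needs 3 h each,
# so material accumulates -- the buffer bottleneck a static balance hides.
finishes, peak = simulate_unit_process([2, 4, 6, 8, 10], service_time=3)
```

A static mass balance would only say that five batches go in and five come out; the event-driven view additionally gives the completion times and the peak buffer occupancy, which is exactly the bottleneck information described above.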

  1. Triggerless Readout with Time and Amplitude Reconstruction of Event Based on Deconvolution Algorithm

    International Nuclear Information System (INIS)

    Kulis, S.; Idzik, M.

    2011-01-01

    In future linear colliders like CLIC, where the period between bunch crossings is in the sub-nanosecond range (≈500 ps), an appropriate detection technique with triggerless signal processing is needed. In this work we discuss a technique, based on a deconvolution algorithm, suitable for the time and amplitude reconstruction of an event. In the implemented method, the output of a relatively slow shaper (spanning many bunch crossing periods) is sampled and digitized in an ADC, and the deconvolution procedure is then applied to the digital data. The time of an event can be found with a precision of a few percent of the sampling time. The signal-to-noise ratio is only slightly decreased after passing through the deconvolution filter. The theoretical and Monte Carlo studies performed are confirmed by the results of preliminary measurements obtained with a dedicated system comprising a radiation source, silicon sensor, front-end electronics, an ADC and further digital processing implemented on a PC. (author)
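A simple recursive form of such a deconvolution, assuming the discrete shaper impulse response h is known and h[0] != 0, can be sketched as follows (an illustrative sketch, not the authors' filter; the pulse shape and samples are invented):

```python
def deconvolve(y, h):
    """Recover the input sequence x from shaper output samples y, assuming
    the discrete convolution y = h * x with known impulse response h:
        x[n] = (y[n] - sum_{k>=1} h[k] * x[n-k]) / h[0]."""
    x = []
    for n in range(len(y)):
        acc = y[n]
        for k in range(1, min(n + 1, len(h))):
            acc -= h[k] * x[n - k]
        x.append(acc / h[0])
    return x

# A single event of amplitude 5.0 at sample 3, seen through a shaper whose
# tail spans several sampling periods; deconvolution collapses the tail
# back into one sample, giving both the event time (index) and amplitude.
h = [1.0, 0.8, 0.5, 0.2]
y = [0.0, 0.0, 0.0, 5.0, 4.0, 2.5, 1.0, 0.0, 0.0]
x = deconvolve(y, h)
```

Sub-sample timing, as reported in the paper, additionally requires interpolating between samples; the sketch above only recovers the event at sample resolution.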

  2. A data-based model to locate mass movements triggered by seismic events in Sichuan, China.

    Science.gov (United States)

    de Souza, Fabio Teodoro

    2014-01-01

    Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the main and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and some models to predict the location of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.

  3. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    Energy Technology Data Exchange (ETDEWEB)

    Khristova, R; Kaltchev, B; Dimitrov, B [Energoproekt, Sofia (Bulgaria); Nedyalkova, D; Sonev, A [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1996-12-31

    An overview of equipment reliability based on operational data for selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on the reliability of the service water system, feed water system, emergency power supply (category 2), emergency high-pressure injection system and spray system. For units 1-4, all recorded accident protocols from the period 1974-1993 have been processed and the main initiators identified. A list of the 39 most frequent initiators of accidents/incidents is compiled. Human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive releases. 14 refs.

  4. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disasters because it has high rainfall events throughout the year. Flood is categorized as the most important hazard because it causes social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets in order to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas with monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  5. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    International Nuclear Information System (INIS)

    Khristova, R.; Kaltchev, B.; Dimitrov, B.; Nedyalkova, D.; Sonev, A.

    1995-01-01

    An overview of equipment reliability based on operational data for selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on the reliability of the service water system, feed water system, emergency power supply (category 2), emergency high-pressure injection system and spray system. For units 1-4, all recorded accident protocols from the period 1974-1993 have been processed and the main initiators identified. A list of the 39 most frequent initiators of accidents/incidents is compiled. Human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive releases. 14 refs

  6. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to
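The quantitative significance measure used in PSA-based precursor analysis is commonly expressed as a conditional core damage probability (CCDP). As a purely arithmetic illustration, with all probabilities invented and the barriers assumed independent:

```python
def ccdp(mitigation_failure_probs):
    """Conditional core damage probability for an observed precursor: the
    initiating event has already happened (its probability is set to 1),
    so the event's significance is the joint failure probability of the
    remaining, assumed-independent mitigation barriers. Illustrative
    only -- real PSA models combine thousands of accident sequences via
    event trees and fault trees."""
    p = 1.0
    for q in mitigation_failure_probs:
        p *= q
    return p

# An observed precursor with two remaining barriers (values invented):
significance = ccdp([1e-2, 5e-3])
```

Events whose CCDP exceeds a screening threshold (often 1e-6 in precursor programmes) are then selected for the in-depth deterministic investigation described above.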

  7. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low-frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with direct shear or double direct shear experimental setups, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gouge layer at 45 degrees is added into the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed in the stress-strain relationship. Currently, the stick slip and slow stick slip events are identified strictly from the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by
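Although the simulations in this abstract use the DEM code PFC, the stick slip cycle it describes can be illustrated with the much simpler classical spring-slider analogue. The sketch below is a hypothetical minimal model, not the study's DEM setup; stiffness, friction levels and loading rate are invented values, and friction thresholds are expressed directly as forces (unit normal load).

```python
# Minimal spring-slider sketch of stick slip (illustrative values only):
# a block loaded through a spring sticks until the spring force exceeds
# the static friction threshold, then slips until the force drops to the
# kinetic level.

def spring_slider(v_load=1.0, k=50.0, mu_s=1.0, mu_k=0.6,
                  dt=1e-3, steps=20000):
    """Return the time series of spring force acting on the block."""
    x_load = 0.0           # position of the loading point
    x_block = 0.0          # position of the block
    forces = []
    for _ in range(steps):
        x_load += v_load * dt
        f_spring = k * (x_load - x_block)
        if f_spring > mu_s:                    # static threshold exceeded
            # slip instantaneously until force relaxes to kinetic level
            x_block += (f_spring - mu_k) / k
        forces.append(k * (x_load - x_block))
    return forces

forces = spring_slider()
# The force history is a saw-tooth: slow loading (stick) punctuated by
# sudden stress drops (slip events).
```

Slower loading rates or smaller stress drops (mu_s closer to mu_k) produce the more gradual, "slow" slip events the abstract refers to.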

  8. Social importance enhances prospective memory: evidence from an event-based task.

    Science.gov (United States)

    Walter, Stefan; Meier, Beat

    2017-07-01

    Prospective memory performance can be enhanced by task importance, for example by promising a reward. Typically, this comes at costs in the ongoing task. However, previous research has suggested that social importance (e.g., providing a social motive) can enhance prospective memory performance without additional monitoring costs in activity-based and time-based tasks. The aim of the present study was to investigate the influence of social importance in an event-based task. We compared four conditions: social importance, promising a reward, both social importance and promising a reward, and standard prospective memory instructions (control condition). The results showed enhanced prospective memory performance for all importance conditions compared to the control condition. Although ongoing task performance was slowed in all conditions with a prospective memory task when compared to a baseline condition with no prospective memory task, additional costs occurred only when both the social importance and reward were present simultaneously. Alone, neither social importance nor promising a reward produced an additional slowing when compared to the cost in the standard (control) condition. Thus, social importance and reward can enhance event-based prospective memory at no additional cost.

  9. Ant colony optimization and event-based dynamic task scheduling and staffing for software projects

    Science.gov (United States)

    Ellappan, Vijayan; Ashwini, J.

    2017-11-01

    In software development organizations, project planning for medium- to large-scale projects is an extremely complex and challenging undertaking, especially when treated as a manual process. Software project planning must address both task scheduling and human resource allocation (also called staffing), because most of the resources in software projects are people. We propose a machine learning approach that finds solutions for scheduling by learning from existing planning solutions, combined with an event-based scheduler that updates the task schedule produced by the learning algorithm in response to events such as the start of the project, the times at which resources are released from completed tasks, and the times at which employees join or leave the project during development. This event-driven updating of the schedule makes the planning process dynamic. The approach uses system dynamics components to model the interrelated flows of tasks, errors and personnel throughout the different development phases and is calibrated to industrial data. It extends previous software project management research by examining a simulation-based process with a dedicated model, integrating it with knowledge-based techniques for risk assessment and cost estimation, and employing a decision-modelling platform.
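As a hypothetical illustration of the event-based rescheduling idea (not the ant colony optimization algorithm of the paper), the sketch below re-plans task assignments whenever a project event occurs; tasks, durations and staff names are all invented.

```python
# Event-driven rescheduling sketch: every project event (task finished,
# employee joins or leaves) triggers a fresh greedy assignment of
# pending tasks to available staff.

def reschedule(pending_tasks, free_staff):
    """Greedily assign longest tasks first; returns {task: employee}."""
    assignment = {}
    staff = list(free_staff)
    for task in sorted(pending_tasks, key=lambda t: -pending_tasks[t]):
        if staff:
            assignment[task] = staff.pop(0)
    return assignment

# invented project history: (event kind, argument)
events = [("start", None), ("task_done", "design"),
          ("staff_join", "carol"), ("staff_leave", "bob")]
tasks = {"design": 5, "coding": 8, "testing": 3}   # task -> duration
staff = ["alice", "bob"]

for kind, arg in events:
    if kind == "task_done":
        tasks.pop(arg, None)
    elif kind == "staff_join":
        staff.append(arg)
    elif kind == "staff_leave":
        staff.remove(arg)
    plan = reschedule(tasks, staff)   # re-plan after every event

print(plan)   # → {'coding': 'alice', 'testing': 'carol'}
```

The greedy assignment stands in for the learned scheduling policy; the point of the sketch is only the event loop that keeps the plan current.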

  10. A Markovian event-based framework for stochastic spiking neural networks.

    Science.gov (United States)

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e. one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived for such classical neural network models as the linear integrate-and-fire neuron with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
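A minimal sketch of what an event-based description means in practice, using deterministic perfect integrate-and-fire neurons rather than the stochastic model of the article: instead of stepping the membrane potential on a time grid, the simulation jumps directly from one spike time to the next. Drives, weights and the threshold below are illustrative values.

```python
# Event-based simulation of perfect integrate-and-fire neurons under
# constant drive with instantaneous excitatory synapses (a sketch, not
# the paper's noisy model).

def simulate(drives, weights, threshold=1.0, n_events=20):
    """Return the list of (spike_time, neuron_index) events."""
    v = [0.0] * len(drives)          # membrane potentials
    t = 0.0
    events = []
    for _ in range(n_events):
        # time until each neuron would reach threshold under its drive
        waits = [(threshold - v[i]) / drives[i] for i in range(len(v))]
        k = waits.index(min(waits))  # the neuron that fires next
        dt = max(waits[k], 0.0)      # fire immediately if already above
        t += dt
        # integrate every neuron up to the event, then apply the spike
        v = [v[i] + drives[i] * dt for i in range(len(v))]
        v[k] = 0.0                   # reset the firing neuron
        for i in range(len(v)):
            if i != k:
                v[i] += weights[k][i]   # instantaneous synaptic kick
        events.append((t, k))
    return events

events = simulate(drives=[1.0, 0.8], weights=[[0, 0.2], [0.1, 0]])
```

Each event depends only on the state left by the previous event, which is the Markov-chain structure of the spike-time sequence the abstract describes.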

  11. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications.

    Science.gov (United States)

    Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just

    2018-04-03

    Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.

  12. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications

    Directory of Open Access Journals (Sweden)

    Daniel G. Costa

    2018-04-01

    Full Text Available Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.
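The core idea of the records above can be sketched with a trivial keyword matcher that classifies posts about urban events and maps each detected event class to a sensing priority for nearby sensor nodes. The keywords, event classes and priority levels below are invented for illustration and are not the paper's classifier.

```python
# Toy event classifier for social-media posts and priority assignment
# (invented keyword lists and priorities; a stand-in for the paper's
# mining pipeline).

EVENT_KEYWORDS = {
    "fire":     ("fire", "smoke", "burning"),
    "flood":    ("flood", "overflow", "submerged"),
    "accident": ("crash", "collision", "accident"),
}
PRIORITY = {"fire": 3, "flood": 2, "accident": 2}   # 3 = highest

def classify(post):
    """Return the event class of a post, or None if no keyword matches."""
    text = post.lower()
    for event, words in EVENT_KEYWORDS.items():
        if any(w in text for w in words):
            return event
    return None

def sensing_priority(posts):
    """Aggregate posts into a sensing priority per detected event class."""
    priorities = {}
    for post in posts:
        event = classify(post)
        if event:
            priorities[event] = PRIORITY[event]
    return priorities

print(sensing_priority(["Huge smoke near the market!",
                        "Streets submerged after the rain"]))
# → {'fire': 3, 'flood': 2}
```

In a deployment, the resulting priorities would drive the sensing and transmission configurations of the source nodes covering the affected areas.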

  13. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2017-09-01

    Full Text Available Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.

  14. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation.

    Science.gov (United States)

    Huang, Dongmei; Xu, Chenyixuan; Zhao, Danfeng; Song, Wei; He, Qi

    2017-09-21

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.
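The partitioning objective described above can be sketched with a greedy union-find pass that merges the most event-correlated sensor pairs first, subject to a region-size cap that keeps the partitions balanced. This is a simplified stand-in for the paper's multi-objective method; the toy network and correlation weights are invented.

```python
# Greedy balanced partition of a sensor graph by event correlation
# (illustrative sketch, not the paper's algorithm).

def partition(n_sensors, edges, max_size):
    """edges: (correlation, u, v) tuples; returns a region label per sensor."""
    parent = list(range(n_sensors))
    size = [1] * n_sensors

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for corr, u, v in sorted(edges, reverse=True):   # strongest edges first
        ru, rv = find(u), find(v)
        # merge only if distinct and the merged region stays within the cap
        if ru != rv and size[ru] + size[rv] <= max_size:
            parent[rv] = ru
            size[ru] += size[rv]
    return [find(i) for i in range(n_sensors)]

labels = partition(4, [(0.9, 0, 1), (0.8, 2, 3), (0.3, 1, 2)], max_size=2)
# Sensors 0-1 and 2-3 form two regions; the weakly correlated 1-2 edge
# is cut because merging would violate the size cap.
```

Each resulting region would then be stored together on the cloud platform, so a query about one marine event mostly touches a single region.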

  15. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-06-01

    Full Text Available The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treatment of side effects, which theoretically may occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012 and for the first analysis we selected those Programs where the same medicinal products were used. We based the adverse events selection on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper is related to the results in the pulmonology area. The events described as very common had an incidence of ≥ 1/10, and the common ones ≥ 1/100, <1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treatment of side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and by the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost incurred by both the public payer and the patient.

  16. Leading indicators of community-based violent events among adults with mental illness.

    Science.gov (United States)

    Van Dorn, R A; Grimm, K J; Desmarais, S L; Tueller, S J; Johnson, K L; Swartz, M S

    2017-05-01

    The public health, public safety and clinical implications of violent events among adults with mental illness are significant; however, the causes and consequences of violence and victimization among adults with mental illness are complex and not well understood, which limits the effectiveness of clinical interventions and risk management strategies. This study examined interrelationships between violence, victimization, psychiatric symptoms, substance use, homelessness and in-patient treatment over time. Available data were integrated from four longitudinal studies of adults with mental illness. Assessments took place at baseline, and at 1, 3, 6, 9, 12, 15, 18, 24, 30 and 36 months, depending on the parent studies' protocol. Data were analysed with the autoregressive cross-lag model. Violence and victimization were leading indicators of each other and affective symptoms were a leading indicator of both. Drug and alcohol use were leading indicators of violence and victimization, respectively. All psychiatric symptom clusters - affective, positive, negative, disorganized cognitive processing - increased the likelihood of experiencing at least one subsequent symptom cluster. Sensitivity analyses identified few group-based differences in the magnitude of effects in this heterogeneous sample. Violent events demonstrated unique and shared indicators and consequences over time. Findings indicate mechanisms for reducing violent events, including trauma-informed therapy, targeting internalizing and externalizing affective symptoms with cognitive-behavioral and psychopharmacological interventions, and integrating substance use and psychiatric care. Finally, mental illness and violence and victimization research should move beyond demonstrating concomitant relationships and instead focus on lagged effects with improved spatio-temporal contiguity.

  17. Event recognition in personal photo collections via multiple instance learning-based classification of multiple images

    Science.gov (United States)

    Ahmad, Kashif; Conci, Nicola; Boato, Giulia; De Natale, Francesco G. B.

    2017-11-01

    Over the last few years, a rapid growth has been witnessed in the number of digital photos produced per year. This rapid growth poses challenges in the organization and management of multimedia collections, and one viable solution consists of arranging the media on the basis of the underlying events. However, album-level annotation and the presence of irrelevant pictures in photo collections make event-based organization of personal photo albums a more challenging task. To tackle these challenges, in contrast to conventional approaches relying on supervised learning, we propose a pipeline for event recognition in personal photo collections relying on a multiple-instance learning (MIL) strategy. MIL is a modified form of supervised learning that fits well for such applications with weakly labeled data. The experimental evaluation of the proposed approach is carried out on two large-scale datasets including a self-collected and a benchmark dataset. On both, our approach significantly outperforms the existing state-of-the-art.
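The multiple-instance framing can be illustrated with the simplest MIL aggregation rule: an album (bag) inherits the label of its most confident image (instance), so irrelevant pictures in the bag never need individual labels. The per-image scores below stand in for classifier outputs and are invented.

```python
# Toy MIL aggregation for album-level event recognition (max rule over
# per-image scores; invented scores, not the paper's learned model).

def album_event_score(instance_scores):
    """Aggregate per-image event scores with the MIL max rule."""
    return max(instance_scores)

def classify_album(instance_scores, threshold=0.5):
    """An album is positive if any image confidently shows the event."""
    return album_event_score(instance_scores) >= threshold

# an album with many irrelevant shots but one confident hit:
print(classify_album([0.1, 0.05, 0.9, 0.2]))   # → True
# an album with no confident instance:
print(classify_album([0.1, 0.2, 0.3]))         # → False
```

This weak, bag-level supervision is exactly what album-level annotation provides, which is why MIL fits the problem better than instance-level supervised learning.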

  18. An energy estimation framework for event-based methods in Non-Intrusive Load Monitoring

    International Nuclear Information System (INIS)

    Giri, Suman; Bergés, Mario

    2015-01-01

    Highlights: • Energy estimation in NILM has not yet accounted for the complexity of appliance models. • We present a data-driven framework for appliance modeling in supervised NILM. • We test the framework on 3 houses and report average estimation errors of 5.9–22.4%. • Appliance models facilitate the estimation of energy consumed by the appliance. - Abstract: Non-Intrusive Load Monitoring (NILM) is a set of techniques used to estimate the electricity consumed by individual appliances in a building from measurements of the total electrical consumption. Most commonly, NILM works by first attributing any significant change in the total power consumption (also known as an event) to a specific load and subsequently using these attributions (i.e. the labels for the events) to estimate energy for each load. For this last step, most published work in the field makes simplifying assumptions to make the problem more tractable. In this paper, we present a framework for creating appliance models based on classification labels and aggregate power measurements that can help to relax many of these assumptions. Our framework automatically builds models for appliances to perform energy estimation. The model relies on feature extraction, clustering via affinity propagation, perturbation of extracted states to ensure that they mimic appliance behavior, creation of finite state models, correction of any errors in classification that might violate the model, and estimation of energy based on corrected labels. We evaluate our framework on 3 houses from standard datasets in the field and show that the framework can learn data-driven models based on event labels and use that to estimate energy with lower error margins (e.g., 1.1–42.3%) than when using the heuristic models used by others
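The basic event-based NILM pipeline the abstract builds on can be sketched in a few lines: detect significant changes in the aggregate power trace, use (given) event labels to track each appliance's running power, and integrate per-appliance energy between events. The aggregate trace and labels below are toy data, and this sketch omits the paper's appliance-model learning entirely.

```python
# Minimal event-based NILM energy estimation (toy data; the event
# labels are assumed given, as in supervised NILM).

def estimate_energy(power, labels, dt=1.0, min_delta=50.0):
    """power: aggregate watts per sample; labels: {sample index: appliance}."""
    state = {}     # appliance -> current power draw (watts)
    energy = {}    # appliance -> accumulated energy (watt-seconds)
    for i in range(1, len(power)):
        delta = power[i] - power[i - 1]
        if abs(delta) >= min_delta and i in labels:    # a labelled event
            app = labels[i]
            state[app] = state.get(app, 0.0) + delta   # on (+) / off (-)
        for app, watts in state.items():
            energy[app] = energy.get(app, 0.0) + max(watts, 0.0) * dt
    return energy

power = [0, 0, 100, 100, 100, 0, 0]          # appliance on for 3 samples
labels = {2: "fridge", 5: "fridge"}          # on-event and off-event
print(estimate_energy(power, labels))        # → {'fridge': 300.0}
```

The framework in the paper replaces the naive "delta equals appliance power" bookkeeping here with learned finite-state appliance models, which is what relaxes the simplifying assumptions.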

  19. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors also need to be considered in bioconcentration studies. - Abstract: As a consequence of global climate change, we are likely to face an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.
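The role of effective respiratory volume can be illustrated with a deliberately reduced one-compartment toxicokinetic sketch (the actual PBTK model is multi-organ): uptake proportional to respiratory volume and water concentration, first-order elimination. All rate constants below are illustrative, not fitted trout parameters.

```python
# One-compartment toxicokinetic sketch (illustrative constants):
#   dC/dt = q_resp * c_water - k_elim * C

def simulate_body_burden(c_water, q_resp, k_elim, dt=0.01, t_end=10.0):
    """Euler integration of the one-compartment uptake/elimination ODE."""
    c_body, t, series = 0.0, 0.0, []
    while t < t_end:
        c_body += (q_resp * c_water - k_elim * c_body) * dt
        t += dt
        series.append(c_body)
    return series

rest     = simulate_body_burden(c_water=1.0, q_resp=0.5, k_elim=0.3)
exercise = simulate_body_burden(c_water=1.0, q_resp=1.0, k_elim=0.3)
# Doubling the effective respiratory volume (exhaustive exercise during
# a flood event) doubles the body burden in this linear model, which is
# consistent with the abstract's identification of respiratory volume
# as a key uptake factor.
```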

  20. Automatic Classification of volcano-seismic events based on Deep Neural Networks.

    Science.gov (United States)

    Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Seismic monitoring of active volcanoes is a popular remote sensing technique to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers, and can discover intrinsic patterns from the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to an efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoders and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy and encode the signal, regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increases robustness against noisy inputs, and provides better generalization. These results demonstrate that deep neural networks are robust classifiers, and can be deployed in real environments to monitor the seismicity of restless volcanoes.
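The feature-extraction step mentioned in the abstract (LPC coefficients per segment) can be sketched with the textbook autocorrelation method and Levinson-Durbin recursion. This is a generic illustration of LPC, not the authors' exact pipeline, and the input here is a synthetic sine rather than a real volcano-seismic record.

```python
# LPC feature extraction via the autocorrelation method and
# Levinson-Durbin recursion (textbook sketch, synthetic input).

import math

def lpc(signal, order=5):
    """Return `order` linear-prediction coefficients a[1..order]."""
    n = len(signal)
    r = [sum(signal[i] * signal[i + k] for i in range(n - k))
         for k in range(order + 1)]             # biased autocorrelation
    a = [0.0] * (order + 1)
    a[0], err = 1.0, r[0]
    for m in range(1, order + 1):               # Levinson-Durbin recursion
        acc = r[m] + sum(a[j] * r[m - j] for j in range(1, m))
        k = -acc / err                          # reflection coefficient
        a = [a[j] + k * a[m - j] for j in range(m + 1)] + a[m + 1:]
        err *= (1 - k * k)
    return a[1:]

segment = [math.sin(0.3 * i) for i in range(200)]   # synthetic "segment"
coeffs = lpc(segment)
print(len(coeffs))    # → 5
```

Computing such coefficients over a fixed number of non-overlapping segments yields a fixed-length feature vector regardless of the event's duration, which is the property the abstract exploits.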

  1. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors also need to be considered in bioconcentration studies. - Abstract: As a consequence of global climate change, we are likely to face an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.

  2. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    Directory of Open Access Journals (Sweden)

    Ninna Reitzel Jensen

    2015-06-01

    Full Text Available Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, hereby, facilitating event risk.
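The scenario analysis mentioned in the abstract can be sketched with a toy Monte Carlo valuation in the same spirit: simulate fund returns, pay the larger of the fund value and a guaranteed amount, and discount survival-weighted payments. All parameters below (returns, mortality, guarantee level) are invented and the sketch collapses the two-account structure into a single fund account.

```python
# Toy Monte Carlo scenario valuation of a guaranteed life insurance
# payment (invented parameters; a sketch, not the paper's model).

import random

def value_contract(n_paths=10000, years=10, premium=100.0,
                   guarantee=120.0, mu=0.04, sigma=0.1,
                   q=0.01, r=0.02, seed=1):
    """Average discounted, survival-weighted terminal payout."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_paths):
        fund = premium
        for _ in range(years):
            fund *= 1.0 + random.gauss(mu, sigma)   # one scenario year
        survival = (1.0 - q) ** years               # event risk: death
        payout = max(fund, guarantee)               # guaranteed floor
        total += survival * payout / (1.0 + r) ** years
    return total / n_paths

print(round(value_contract(), 1))
```

The same loop applies unchanged to worst-case or best-estimate scenarios by replacing the random draws with prescribed return paths, which is the flexibility the abstract emphasizes.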

  3. Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition

    Science.gov (United States)

    Daigle, Matthew; Roychoudhury, Indranil

    2010-01-01

    We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced, and lessons learned, from implementing the approach
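The qualitative event-based isolation scheme can be sketched as signature matching: each fault predicts a sequence of qualitative measurement deviations (e.g. '+' for above nominal, '-' for below), and the observed deviation sequence prunes the fault candidates. The signature table below is invented for illustration.

```python
# Qualitative fault isolation by deviation-sequence matching
# (invented fault signatures; a sketch of the general technique).

FAULT_SIGNATURES = {
    "pump_blocked": [("flow", "-"), ("pressure", "+")],
    "pipe_leak":    [("flow", "-"), ("pressure", "-")],
    "sensor_bias":  [("pressure", "+")],
}

def isolate(observed):
    """Return faults whose predicted deviation sequence is consistent
    with the observed sequence (the observation may be a prefix)."""
    return [f for f, sig in FAULT_SIGNATURES.items()
            if sig[:len(observed)] == list(observed)]

print(isolate([("flow", "-")]))
# → ['pump_blocked', 'pipe_leak']
print(isolate([("flow", "-"), ("pressure", "+")]))
# → ['pump_blocked']
```

Each new deviation event shrinks the candidate set; the fault-identification step described in the abstract would then estimate fault parameters to discriminate among any remaining candidates.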

  4. A data base approach for prediction of deforestation-induced mass wasting events

    Science.gov (United States)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The objective of the present investigation is to examine the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes which are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
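The raster overlay operation at the heart of this approach is simple to sketch: weight and sum co-registered hazard layers cell by cell to produce a slide-hazard score grid. The layers and weights below are invented examples, not the study's parameters.

```python
# Weighted raster overlay sketch (invented layers and weights).

def overlay(layers, weights):
    """layers: dict name -> 2-D grid of 0-1 scores; returns a hazard grid."""
    rows = len(next(iter(layers.values())))
    cols = len(next(iter(layers.values()))[0])
    return [[sum(weights[name] * layers[name][r][c] for name in layers)
             for c in range(cols)] for r in range(rows)]

slope     = [[0.2, 0.9], [0.5, 1.0]]   # normalized slope-angle layer
clear_cut = [[0.0, 1.0], [1.0, 1.0]]   # deforested cells = 1

hazard = overlay({"slope": slope, "clear_cut": clear_cut},
                 {"slope": 0.6, "clear_cut": 0.4})
# The steep, clear-cut cell (row 1, col 1) gets the highest score.
print(hazard[1][1])
```

In a real GIS the same cell-wise model runs over georeferenced raster layers (vegetation, roads, precipitation, soil type, slope angle and aspect) rather than toy grids.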

  5. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS).

    Science.gov (United States)

    Rigi, Amin; Baghaei Naeini, Fariborz; Makris, Dimitrios; Zweiri, Yahya

    2018-01-24

    In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The proposed approach is validated by using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that affected the results significantly. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential for the sensor to be used in manipulation applications, especially in dynamic environments.

  6. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on the Annual Maximum and Partial Duration methods, derived from 102 rain-gauge stations in Peninsular Malaysia for 1982-2012. To determine the optimal threshold for each station, several requirements must be satisfied, and an Adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold, and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five, and the resulting data are de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are the Maximum Likelihood and the L-moment methods. Two goodness-of-fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation of extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
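The construction of the two series can be sketched as follows; the rainfall values and the threshold are invented for illustration, whereas in the study the threshold is chosen by the bootstrap-MSE procedure described above:

```python
# Sketch of building the two extreme-rainfall series from daily records:
# Annual Maximum (AM) keeps the largest value per year; Partial Duration
# (PDS, peaks-over-threshold) keeps every exceedance of a threshold.

def annual_maximum(daily):            # daily: {year: [rain values in mm]}
    return {yr: max(vals) for yr, vals in daily.items()}

def partial_duration(daily, threshold):
    return sorted(v for vals in daily.values() for v in vals if v > threshold)

daily = {1982: [5, 80, 12], 1983: [60, 3, 95], 1984: [40, 70, 8]}
print(annual_maximum(daily))        # {1982: 80, 1983: 95, 1984: 70}
print(partial_duration(daily, 50))  # [60, 70, 80, 95]
```

In practice the AM series would then be fitted with a GEV distribution (e.g. `scipy.stats.genextreme`) and the PDS with a GPD (`scipy.stats.genpareto`), after de-clustering the exceedances.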

  7. Non-Cooperative Regulation Coordination Based on Game Theory for Wind Farm Clusters during Ramping Events

    DEFF Research Database (Denmark)

    Qi, Yongzhi; Liu, Yutian; Wu, Qiuwei

    2017-01-01

    With increasing penetration of wind power in power systems, it is important to track scheduled wind power output as much as possible during ramping events to ensure security of the system. In this paper, a non‐cooperative coordination strategy based on the game theory is proposed for the regulation...... of the regulation revenue function according to the derived Nash equilibrium condition, the ER strategy is the Nash equilibrium of the regulation competition. Case studies were conducted with the power output data of wind farms from State Grid Jibei Electric Power Company Limited of China to demonstrate...

  8. Arachne - A web-based event viewer for MINERνA

    Energy Technology Data Exchange (ETDEWEB)

    Tagg, N., E-mail: ntagg@otterbein.edu [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Brangham, J. [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Chvojka, J. [Rochester, NY 14610 (United States); Clairemont, M. [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Day, M. [Rochester, NY 14610 (United States); Eberly, B. [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Felix, J. [Lascurain de Retana No. 5, Col. Centro. Guanajuato, Guanajuato 36000 (Mexico); Fields, L. [Northwestern University, Evanston, IL 60208 (United States); Gago, A.M. [Seccion Fisica, Departamento de Ciencias, Pontificia Universidad Catolica del Peru, Apartado 1761, Lima (Peru); Gran, R. [Department of Physics, University of Minnesota - Duluth, Duluth, MN 55812 (United States); Harris, D.A. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Kordosky, M. [Department of Physics, College of William and Mary, Williamsburg, VA 23187 (United States); Lee, H. [Rochester, NY 14610 (United States); Maggi, G. [Departamento de Fisica, Universidad Tecnica Federico Santa Maria, Avda. Espana 1680 Casilla 110-V Valparaiso (Chile); Maher, E. [Massachusetts College of Liberal Arts, 375 Church Street, North Adams, MA 01247 (United States); Mann, W.A. [Physics Department, Tufts University, Medford, MA 02155 (United States); Marshall, C.M.; McFarland, K.S.; McGowan, A.M.; Mislivec, A. [Rochester, NY 14610 (United States); and others

    2012-06-01

    Neutrino interaction events in the MINERνA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERνA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  9. Arachne - A web-based event viewer for MINERvA

    International Nuclear Information System (INIS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.

    2011-01-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  10. Ptaquiloside from bracken in stream water at base flow and during storm events

    DEFF Research Database (Denmark)

    Clauson-Kaas, Frederik; Ramwell, Carmel; Hansen, Hans Chr. Bruun

    2016-01-01

    not decrease over the course of the event. In the stream, the throughfall contribution to PTA cannot be separated from a possible below-ground input from litter, rhizomes and soil. Catchment-specific factors such as the soil pH, topography, hydrology, and bracken coverage will evidently affect the level of PTA...... rainfall and PTA concentration in the stream, with a reproducible time lag of approx. 1 h from onset of rain to elevated concentrations, and returning rather quickly (about 2 h) to base flow concentration levels. The concentration of PTA behaved similar to an inert tracer (Cl(-)) in the pulse experiment...

  11. Modeling crowd behavior based on the discrete-event multiagent approach

    OpenAIRE

    Лановой, Алексей Феликсович; Лановой, Артем Алексеевич

    2014-01-01

    The crowd is a temporary, relatively unorganized group of people who are in close physical contact with each other. The individual behavior of a human outside the crowd is determined by many factors associated with his intellectual activities, but inside the crowd a person loses his identity and begins to obey simpler laws of behavior. One of the approaches to the construction of a multi-level model of the crowd using the discrete-event multiagent approach was described in the paper. Based on this analysi...

  12. Arachne—A web-based event viewer for MINERνA

    International Nuclear Information System (INIS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.; Kordosky, M.; Lee, H.; Maggi, G.; Maher, E.; Mann, W.A.; Marshall, C.M.; McFarland, K.S.; McGowan, A.M.; Mislivec, A.

    2012-01-01

    Neutrino interaction events in the MINERνA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERνA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  13. Real-time identification of residential appliance events based on power monitoring

    Science.gov (United States)

    Yang, Zhao; Zhu, Zhicheng; Wei, Zhiqiang; Yin, Bo; Wang, Xiuwei

    2018-03-01

    Energy monitoring for specific home appliances has been regarded as a pre-requisite for reducing residential energy consumption. To enhance the accuracy of identifying the operation status of household appliances and to keep pace with the development of the smart power grid, this paper puts forward the integration of electric current and power data on the basis of an existing algorithm. If the average power difference of several adjacent cycles deviates from the baseline by more than a pre-assigned threshold value, an event is flagged. Based on the MATLAB platform and simulations of domestic appliances, the results on the tested data and the verified algorithm indicate that the power method accomplishes the desired appliance identification.
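The flagging rule described above can be sketched as a comparison of average power over adjacent cycle windows; the window size, threshold, and power trace are illustrative, not values from the paper:

```python
# Flag an appliance event when the average power of the cycles after a
# point differs from the average of the cycles before it by more than a
# preset threshold. cycle_power holds watts per mains cycle.

def flag_events(cycle_power, window=3, threshold=50.0):
    events = []
    for i in range(window, len(cycle_power) - window + 1):
        before = sum(cycle_power[i - window:i]) / window
        after = sum(cycle_power[i:i + window]) / window
        if abs(after - before) > threshold:
            events.append((i, after - before))  # index and power step
    return events

# A ~100 W appliance switching on at cycle 5:
trace = [10, 10, 11, 10, 10, 110, 111, 110, 110, 110]
print([i for i, _ in flag_events(trace)])  # [4, 5, 6] -> step around cycle 5
```

Adjacent windows straddling the same step all fire, so a real detector would typically keep only the strongest flag in each cluster before matching the step against appliance signatures.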

  14. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan; Emilio Molina; Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings, developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows the user to define multiple ontologies to adapt to multiple tasks, and offers the possibility to cross-annotate audio data. Moreover, it is easy to install and deploy...

  15. Arachne - A web-based event viewer for MINERvA

    Energy Technology Data Exchange (ETDEWEB)

    Tagg, N.; /Otterbein Coll.; Brangham, J.; /Otterbein Coll.; Chvojka, J.; /Rochester U.; Clairemont, M.; /Otterbein Coll.; Day, M.; /Rochester U.; Eberly, B.; /Pittsburgh U.; Felix, J.; /Guanajuato U.; Fields, L.; /Northwestern U.; Gago, A.M.; /Lima, Pont. U. Catolica; Gran, R.; /Maryland U.; Harris, D.A.; /Fermilab /William-Mary Coll.

    2011-11-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  16. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the expected values from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed the signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulics simulation models, can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
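A minimal sketch of the multi-site idea: rather than thresholding each station separately, combine the discrepancies between observed readings and model-simulated values into one joint score. The station names, readings, noise level, and threshold below are invented for illustration:

```python
# Model-based, multi-site detection sketch: one joint alarm from the
# combined residuals between observed and simulated station readings.

def joint_residual_alarm(observed, simulated, sigma=0.05, score_thresh=4.0):
    """observed/simulated: {station: reading}; sigma: sensor noise level.
    Alarms when the mean squared standardized residual is large."""
    resid = [((observed[s] - simulated[s]) / sigma) ** 2 for s in observed]
    return sum(resid) / len(resid) > score_thresh

# Modest deviations at every station, each unremarkable on its own,
# jointly trigger the multi-site alarm:
sim = {"s1": 0.50, "s2": 0.48, "s3": 0.52}
obs = {"s1": 0.39, "s2": 0.37, "s3": 0.41}
print(joint_residual_alarm(obs, sim))  # True
```

A signal-processing EDS looking at each station with a per-station threshold could miss exactly this pattern, which is the advantage the abstract attributes to the joint analysis.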

  17. Location-based technologies for supporting elderly pedestrian in "getting lost" events.

    Science.gov (United States)

    Pulido Herrera, Edith

    2017-05-01

    Localization-based technologies promise to keep older adults with dementia safe and to support them and their caregivers during getting-lost events. This paper summarizes mainly technological contributions to support the target group in these events. Moreover, important aspects of the getting-lost phenomenon, such as its concept and ethical issues, are also briefly addressed. Papers were selected from scientific databases and gray literature. Since the topic is still in its infancy, other terms, e.g. wandering, were used to find contributions associated with getting lost. Trends in applying localization systems were identified as personal locators, perimeter systems and assistance systems. The first barely considers the older adult's opinion, while assistance systems may involve context awareness to improve the support for both the elderly and the caregiver. Since few studies report multidisciplinary work with a special focus on getting lost, there is no strong evidence of the real efficiency of localization systems, nor guidelines to design systems for the target group. Further research on getting lost is required to obtain insights for developing customizable systems. Moreover, considering the conditions of the older adult might increase the impact of developments that combine localization technologies and artificial intelligence techniques. Implications for Rehabilitation: Whilst there is no cure for dementia such as Alzheimer's, it is feasible to take advantage of technological developments to somewhat diminish its negative impact. For instance, location-based systems may provide information for early diagnosis of Alzheimer's disease by assessing navigational impairments of older adults. Assessing the latest supportive technologies and methodologies may provide insights for adopting strategies to properly manage getting-lost events. More user-centered designs will provide appropriate assistance to older adults; namely, customizable systems could assist older adults

  18. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for calibration of near-real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings and casualty estimates. Such calibration compensates for some of the factors that influence the reliability of expected damage and loss assessment in "emergency" mode. The knowledge base contains descriptions of past earthquakes' consequences for the area under study. It also includes the distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows determination of sets of regional calibration coefficients, including the rating of seismological surveys, peculiarities of shaking intensity attenuation, and changes in building stock and population distribution, in order to minimize the error of loss estimates for damaging earthquakes in "emergency" mode. References 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS2010 Conference, Beijing, China, 2010. 3. Frolova, N.I., Larionov, V.I., Bonnin, J., Sushchev, S.P., Ugarov, A.N., Kozlov, M.A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653

  19. Tsunami Source Identification on the 1867 Tsunami Event Based on the Impact Intensity

    Science.gov (United States)

    Wu, T. R.

    2014-12-01

    The 1867 Keelung tsunami event has drawn significant attention from people in Taiwan, not only because the location was very close to 3 nuclear power plants, only about 20 km away from Taipei city, but also because of the ambiguity of the tsunami sources. This event is unique in many aspects. First, it was documented in many literatures, in many languages, with similar descriptions. Second, the tsunami deposit was discovered recently. Based on the literature, an earthquake, a 7-meter tsunami height, volcanic smoke, and oceanic smoke were observed. Previous studies concluded that this tsunami was generated by an earthquake with a magnitude around Mw 7.0 along the Shanchiao Fault. However, numerical results showed that even a Mw 8.0 earthquake was not able to generate a 7-meter tsunami. Considering the steep bathymetry and intense volcanic activities along the Keelung coast, one reasonable hypothesis is that different types of tsunami sources existed, such as a submarine landslide or volcanic eruption. To confirm this scenario, last year we proposed the Tsunami Reverse Tracing Method (TRTM) to find the possible locations of the tsunami sources. This method helped us rule out impossible far-field tsunami sources. However, the near-field sources remained unclear. This year, we further developed a new method named 'Impact Intensity Analysis' (IIA). In the IIA method, the study area is divided into a sequence of tsunami sources, and numerical simulations of each source are conducted with the COMCOT (Cornell Multi-grid Coupled Tsunami Model) tsunami model. After that, the resulting wave height from each source to the study site is collected and plotted. This method successfully helped us identify the impact factor from the near-field potential sources. The IIA result (Fig. 1) shows that the 1867 tsunami event was a multi-source event. A mild tsunami was triggered by a Mw 7.0 earthquake, and then followed by the submarine

  20. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    Full Text Available The automatic discrimination of rock fracture and blast events is complex and challenging due to their similar waveform characteristics. To solve this problem, a new method based on signal complexity analysis and machine learning is proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect the complexity of the signals and are assembled into a feature vector set. Second, based on the feature vector set, a back-propagation neural network (BPNN) is applied as a means of machine learning to establish a discriminator for rock fracture and blast events. Then, to evaluate the classification performance of the new method, the classification accuracies of the support vector machine (SVM), the naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC) curves are also analyzed. The results show that the new method obtains the best classification performance. In addition, the influence of the scale factor q and the number of training samples n on the discrimination results is discussed. It is found that the classification accuracy of the new method reaches its highest value when q = 8–15 or 8–20 and n = 140.
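The feature-extraction step, permutation entropy of a signal at a given scale factor, can be sketched in a few lines; the embedding order, coarse-graining scheme, and test signal are illustrative assumptions:

```python
import math

def coarse_grain(signal, scale):
    """Multiscale step: average non-overlapping windows of length `scale`."""
    return [sum(signal[i:i + scale]) / scale
            for i in range(0, len(signal) - scale + 1, scale)]

def permutation_entropy(signal, order=3):
    """Normalized permutation entropy in [0, 1]; 0 = fully ordered signal."""
    counts = {}
    for i in range(len(signal) - order + 1):
        # Ordinal pattern: ranking of the `order` consecutive samples.
        pattern = tuple(sorted(range(order), key=lambda k: signal[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))

ramp = list(range(100))   # fully ordered signal
feats = [permutation_entropy(coarse_grain(ramp, q)) for q in (1, 2, 3)]
print(feats)  # near-zero entropy at every scale for an ordered signal
```

A waveform's feature vector would collect these values over the paper's range of scale factors and feed the vector to the BPNN discriminator.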

  1. Event-based prospective memory in mildly and severely autistic children.

    Science.gov (United States)

    Sheppard, Daniel P; Kvavilashvili, Lia; Ryder, Nuala

    2016-01-01

    There is a growing body of research into the development of prospective memory (PM) in typically developing children, but research is limited in autistic children and rarely includes children with more severe symptoms. This study is the first to specifically compare event-based PM in severely autistic children to that in mildly autistic and typically developing children. Fourteen mildly autistic children and 14 severely autistic children, aged 5-13 years, were matched for educational attainment with 26 typically developing children aged 5-6 years. Three PM tasks and a retrospective memory task were administered. Results showed that severely autistic children performed less well than typically developing children on two PM tasks, but mildly autistic children did not differ from either group. No group differences were found on the most motivating (a toy reward) task. The findings suggest naturalistic tasks and motivation are important factors in PM success in severely autistic children and highlight the need to consider the heterogeneity of autism and symptom severity in relation to performance on event-based PM tasks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Valenced cues and contexts have different effects on event-based prospective memory.

    Science.gov (United States)

    Graf, Peter; Yu, Martin

    2015-01-01

    This study examined the separate influence and joint influences on event-based prospective memory task performance due to the valence of cues and the valence of contexts. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral compared to valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influence on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  3. Valenced cues and contexts have different effects on event-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Peter Graf

    Full Text Available This study examined the separate influence and joint influences on event-based prospective memory task performance due to the valence of cues and the valence of contexts. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral compared to valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influence on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  4. Event-Based Prospective Memory Is Resistant but Not Immune to Proactive Interference.

    Science.gov (United States)

    Oates, Joyce M; Peynircioglu, Zehra F

    2016-01-01

    Recent evidence suggests that proactive interference (PI) does not hurt event-based prospective memory (ProM) the way it does retrospective memory (RetroM) (Oates, Peynircioglu, & Bates, 2015). We investigated this apparent resistance further. Introducing a distractor task to ensure we were testing ProM rather than vigilance in Experiment 1 and tripling the number of lists to provide more opportunity for PI buildup in Experiment 2 still did not produce performance decrements. However, when the ProM task was combined with a RetroM task in Experiment 3, a comparable buildup and release was observed in the ProM task as well. It appears that event-based ProM is indeed somewhat resistant to PI, but this resistance can break down when the ProM task comprises the same stimuli as an embedded RetroM task. We discuss the results using the ideas of cue overload and distinctiveness as well as shared attentional and working memory resources.

  5. Simulating the influence of life trajectory events on transport mode behavior in an agent-based system

    NARCIS (Netherlands)

    Verhoeven, M.; Arentze, T.A.; Timmermans, H.J.P.; Waerden, van der P.J.H.J.

    2007-01-01

    This paper describes the results of a study on the impact of lifecycle or life trajectory events on activity-travel decisions. The lifecycle trajectory of individual agents can be easily incorporated in an agent-based simulation system. This paper focuses on two lifecycle events, change in

  6. Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network

    Directory of Open Access Journals (Sweden)

    Hasmaini Mohamad

    2016-06-01

    Full Text Available One of the biggest challenges for an islanding operation is to sustain frequency stability. A large power imbalance following islanding would cause under-frequency; hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanding system. The technique is event-based, covering both the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini hydro generation are carried out to evaluate the UFLS model under different load conditions: peak and base load. Results show that the load shedding technique successfully sheds the required amount of load and stabilizes the system frequency.
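One common way to compute a disturbance magnitude of this kind (a hedged sketch, not necessarily the authors' formulation) is from the initial rate of change of frequency via the swing equation; the inertia constant, system rating, nominal frequency, and feeder list below are assumptions:

```python
# Estimate the power deficit from the initial frequency gradient using
# the swing equation, df/dt = f_n * dP / (2 * H * S), then shed feeders
# (least-critical first) until the deficit is covered. All constants
# and the feeder list are illustrative.

def disturbance_magnitude(dfdt, h=4.0, s_mva=20.0, f_n=50.0):
    """Power deficit (MW) implied by the measured initial df/dt (Hz/s)."""
    return 2.0 * h * s_mva * abs(dfdt) / f_n

def loads_to_shed(deficit_mw, feeders):
    """Shed feeders, ordered least- to most-critical, until covered."""
    shed, covered = [], 0.0
    for name, mw in feeders:
        if covered >= deficit_mw:
            break
        shed.append(name)
        covered += mw
    return shed, covered

deficit = disturbance_magnitude(dfdt=-1.25)  # df/dt at the islanding instant
print(deficit)  # 4.0 (MW)
print(loads_to_shed(deficit, [("F3", 2.5), ("F2", 2.0), ("F1", 3.0)]))
```

Tying the calculation to the islanding or DG-trip event, rather than waiting for fixed frequency thresholds, is what makes such a scheme event-based and adaptive.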

  7. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    Directory of Open Access Journals (Sweden)

    Katherine M. Boydell

    2011-01-01

    Full Text Available A symbolic interaction (BLUMER, 1969; MEAD, 1934; PRUS, 1996; PRUS & GRILLS, 2003 approach was taken to study the collective event (PRUS, 1997 of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed us to study the process of the creation of the dance and its capacity to convert abstract research into concrete form and to produce generalizable abstract knowledge from the empirical research findings. Thus, through the techniques of movement, metaphor, voice-over, and music, the characterization of experience through dance was personal and generic, individual and collective, particular and trans-situational. The dance performance allowed us to address the visceral, emotional, and visual aspects of our research which are frequently invisible in traditional academia. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs110155

  8. Emergency Load Shedding Strategy Based on Sensitivity Analysis of Relay Operation Margin against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun Sun

    2012-01-01

    In order to prevent long term voltage instability and induced cascading events, a load shedding strategy based on the sensitivity of relay operation margin to load powers is discussed and proposed in this paper. The operation margin of critical impedance backup relay is defined to identify... the runtime emergent states of related system component. Based on sensitivity analysis between the relay operation margin and power system state variables, an optimal load shedding strategy is applied to adjust the emergent states timely before the unwanted relay operation. Load dynamics is also taken... into account to compensate load shedding amount calculation. And the multi-agent technology is applied for the whole strategy implementation. A test system is built in real time digital simulator (RTDS) and has demonstrated the effectiveness of the proposed strategy.

  9. Acquisition and classification of static single-event upset cross section for SRAM-based FPGAs

    International Nuclear Information System (INIS)

    Yao Zhibin; Fan Ruyu; Guo Hongxia; Wang Zhongming; He Baoping; Zhang Fengqi; Zhang Keying

    2011-01-01

    In order to evaluate single event upsets (SEUs) in SRAM-based FPGAs and to find the sensitive resources in configuration memory, a heavy-ion irradiation experiment was carried out on a Xilinx FPGA device, the XCV300PQ240. The experiment was conducted to obtain the static SEU cross section and to classify the SEUs in configuration memory according to different resource uses. The results demonstrate that the internal memory of SRAM-based FPGAs is extremely sensitive to heavy-ion-induced SEUs. The LUT and routing resources are the main source of SEUs in the configuration memory, covering more than 97.46% of the total upsets. The SEU sensitivity of the various resources differs: the IOB control bits and LUT elements are more sensitive, and in radiation hardening more attention should be paid to the LUT elements, which account for quite a large proportion of the configuration memory. (authors)

  10. Event-Based Color Segmentation With a High Dynamic Range Sensor

    Directory of Open Access Journals (Sweden)

    Alexandre Marcireau

    2018-04-01

    Full Text Available This paper introduces a color asynchronous neuromorphic event-based camera and a methodology to process color output from the device to perform color segmentation and tracking at the native temporal resolution of the sensor (down to one microsecond). Our color vision sensor prototype is a combination of three Asynchronous Time-based Image Sensors, sensitive to absolute color information. We devise a color processing algorithm leveraging this information. It is designed to be computationally cheap, thus showing how low-level processing benefits from asynchronous acquisition and high-temporal-resolution data. The resulting color segmentation and tracking performance is assessed both with an indoor controlled scene and two outdoor uncontrolled scenes. The mean tracking error relative to the ground truth for the objects of the outdoor scenes ranges from two to twenty pixels.

  11. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    Science.gov (United States)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROVs) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplant the traditional approach of assessing the kinds and numbers of animals in the oceanic water column by towing collection nets behind ships. Tow nets are limited in spatial resolution and often destroy abundant gelatinous animals, resulting in undersampling of species. Camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50 m to 4000 m, and provide high-resolution data at the scale of individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation on the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial-intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames (frames that do not contain any "interesting" events) is developed by detecting whether an interesting candidate object for an animal is present in a particular sequence of underwater video. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are

  12. A Community-Based Event Delivery Protocol in Publish/Subscribe Systems for Delay Tolerant Sensor Networks

    Directory of Open Access Journals (Sweden)

    Haigang Gong

    2009-09-01

Full Text Available The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the frequent-partitioning characteristic unique to DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions to it in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
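The queue-management trade-off described above (keep an event while it can still arrive in time, drop it once its survival time is spent) can be sketched as follows. The class, its parameters, and the decision rule are illustrative assumptions, not the CED implementation:

```python
import time
from collections import deque

class EventQueue:
    """Sketch of a CED-style queue: an event is kept only while its
    remaining survival time exceeds the expected time needed to
    deliver it successfully (all names are illustrative)."""

    def __init__(self, expected_delivery_time):
        self.expected_delivery_time = expected_delivery_time  # seconds
        self.queue = deque()  # entries: (event, created_at, survival_time)

    def publish(self, event, survival_time, now=None):
        now = time.time() if now is None else now
        self.queue.append((event, now, survival_time))

    def deliverable(self, now=None):
        """Return events still worth delivering; drop expired ones."""
        now = time.time() if now is None else now
        kept = []
        for event, created, survival in self.queue:
            remaining = survival - (now - created)
            if remaining > self.expected_delivery_time:
                kept.append((event, created, survival))
        self.queue = deque(kept)      # expired events are dropped
        return [e for e, _, _ in kept]

q = EventQueue(expected_delivery_time=5.0)
q.publish("temp=21C", survival_time=60.0, now=0.0)
q.publish("door=open", survival_time=10.0, now=0.0)
print(q.deliverable(now=8.0))  # only "temp=21C" survives
```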

  13. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    Science.gov (United States)

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the frequent-partitioning characteristic unique to DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions to it in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.

  14. wayGoo recommender system: personalized recommendations for events scheduling, based on static and real-time information

    Science.gov (United States)

    Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.

    2016-05-01

wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events does not reach users' attention, either because of the size of this information or because some of it comes from real-time data sources. The purpose of this work is to facilitate event management operations by prioritizing the presented events based on users' interests, using both static and real-time data. Through the wayGoo interface, users select conceptual topics that are interesting to them. These topics constitute a browsing-behavior vector which is used for learning users' interests implicitly, without being intrusive. The system then estimates user preferences and returns an event list sorted from the most preferred to the least. User preferences are modeled via a Naïve Bayesian Network which consists of: a) the `decision' random variable, corresponding to users' decision on attending an event; b) the `distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging; c) the `seat availability' random variable, modeled by a linear regression, which estimates the probability that the seat availability is encouraging; and d) the `relevance' random variable, modeled by clustering-based collaborative filtering, which determines the relevance of each event to users' interests. Finally, experimental results show that the proposed system contributes essentially to assisting users in browsing and selecting events to attend.

  15. Parachuting from fixed objects: descriptive study of 106 fatal events in BASE jumping 1981-2006.

    Science.gov (United States)

    Westman, A; Rosén, M; Berggren, P; Björnstig, U

    2008-06-01

    To analyse the characteristics of fatal incidents in fixed object sport parachuting (building, antenna, span, earth (BASE) jumping) and create a basis for prevention. Descriptive epidemiological study. Data on reported fatal injury events (n = 106) worldwide in 1981-2006 retrieved from the BASE fatality list. Human, equipment and environmental factors. Identification of typical fatal incident and injury mechanisms for each of the four fixed object types of BASE jumping (building, antenna, span, earth). Human factors included parachutist free fall instability (loss of body control before parachute deployment), free fall acrobatics and deployment failure by the parachutist. Equipment factors included pilot chute malfunction and parachute malfunction. In cliff jumping (BASE object type E), parachute opening towards the object jumped was the most frequent equipment factor. Environmental factors included poor visibility, strong or turbulent winds, cold and water. The overall annual fatality risk for all object types during the year 2002 was estimated at about one fatality per 60 participants. Participants in BASE jumping should target risk factors with training and technical interventions. The mechanisms described in this study should be used by rescue units to improve the management of incidents.

  16. Analysis of adverse events of renal impairment related to platinum-based compounds using the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Naganuma, Misa; Motooka, Yumi; Sasaoka, Sayaka; Hatahira, Haruna; Hasegawa, Shiori; Fukuda, Akiho; Nakao, Satoshi; Shimada, Kazuyo; Hirade, Koseki; Mori, Takayuki; Yoshimura, Tomoaki; Kato, Takeshi; Nakamura, Mitsuhiro

    2018-01-01

Platinum compounds cause several adverse events, such as nephrotoxicity, gastrointestinal toxicity, myelosuppression, ototoxicity, and neurotoxicity. We evaluated the incidence of renal impairment as an adverse event related to the administration of platinum compounds using the Japanese Adverse Drug Event Report database. We analyzed adverse events associated with the use of platinum compounds reported from April 2004 to November 2016. The reporting odds ratio with a 95% confidence interval was used to detect the signal for each renal impairment incidence. We evaluated the time-to-onset profile of renal impairment, assessed the hazard type using the Weibull shape parameter, and applied association rule mining to discover undetected relationships such as possible risk factors. In total, 430,587 reports in the Japanese Adverse Drug Event Report database were analyzed. The reporting odds ratios (95% confidence intervals) for renal impairment resulting from the use of cisplatin, oxaliplatin, carboplatin, and nedaplatin were 2.7 (2.5-3.0), 0.6 (0.5-0.7), 0.8 (0.7-1.0), and 1.3 (0.8-2.1), respectively. The lower limit of the reporting odds ratio (95% confidence interval) for cisplatin was >1. The median (lower-upper quartile) onset time of renal impairment following the use of platinum-based compounds was 6.0-8.0 days. The Weibull shape parameter β and 95% confidence interval upper limit of oxaliplatin were impairment during cisplatin use in a real-world setting. The present findings demonstrate that the incidence of renal impairment following cisplatin use should be closely monitored when patients are hypertensive or diabetic, or when they are co-administered furosemide, loxoprofen, or pemetrexed. In addition, healthcare professionals should closely assess a patient's background prior to treatment.
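The reporting odds ratio used in this kind of pharmacovigilance signal detection compares event reporting between the drug of interest and all other drugs in a 2x2 contingency table, with the usual log-normal 95% confidence interval. A minimal sketch follows; the counts are hypothetical, chosen only so the result lands near the cisplatin value of 2.7 reported above:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Reporting odds ratio with a 95% CI for a 2x2 table:
    a: target drug & target event    b: target drug & other events
    c: other drugs & target event    d: other drugs & other events
    A common signal criterion is a lower CI bound > 1."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, (lo, hi)

# Hypothetical counts, not taken from the JADER analysis:
ror, (lo, hi) = reporting_odds_ratio(a=300, b=5000, c=4000, d=180000)
print(f"ROR = {ror:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```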

  17. Toward zero waste: Composting and recycling for sustainable venue based events

    International Nuclear Information System (INIS)

    Hottle, Troy A.; Bilec, Melissa M.; Brown, Nicholas R.; Landis, Amy E.

    2015-01-01

Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycle and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO2-equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO2 eq. emissions and energy use because of the retention of high-value materials, but are compounded by the difficulty of managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO2 eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates; the first game served as a baseline, the second game employed staffed bins, and the third game had non-staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series. Contamination rates were reduced from 34% in the first game to 11% on the second night (with the

  18. Toward zero waste: Composting and recycling for sustainable venue based events

    Energy Technology Data Exchange (ETDEWEB)

    Hottle, Troy A., E-mail: troy.hottle@asu.edu [Arizona State University, School of Sustainable Engineering and the Built Environment, 370 Interdisciplinary Science and Technology Building 4 (ISTB4), 781 East Terrace Road, Tempe, AZ 85287-6004 (United States); Bilec, Melissa M., E-mail: mbilec@pitt.edu [University of Pittsburgh, Civil and Environmental Engineering, 153 Benedum Hall, 3700 O’Hara Street, Pittsburgh, PA 15261-3949 (United States); Brown, Nicholas R., E-mail: nick.brown@asu.edu [Arizona State University, University Sustainability Practices, 1130 East University Drive, Suite 206, Tempe, AZ 85287 (United States); Landis, Amy E., E-mail: amy.landis@asu.edu [Arizona State University, School of Sustainable Engineering and the Built Environment, 375 Interdisciplinary Science and Technology Building 4 (ISTB4), 781 East Terrace Road, Tempe, AZ 85287-6004 (United States)

    2015-04-15

Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycle and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO2-equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO2 eq. emissions and energy use because of the retention of high-value materials, but are compounded by the difficulty of managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO2 eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates; the first game served as a baseline, the second game employed staffed bins, and the third game had non-staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series. Contamination rates were reduced from 34% in the first game to 11% on the second night

  19. Various sizes of sliding event bursts in the plastic flow of metallic glasses based on a spatiotemporal dynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Jingli, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn; Chen, Cun [School of Mathematics and Statistics, Zhengzhou University, Zhengzhou 450001 (China); Wang, Gang, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn [Laboratory for Microstructures, Shanghai University, Shanghai 200444 (China); Cheung, Wing-Sum [Department of Mathematics, The University of HongKong, HongKong (China); Sun, Baoan; Mattern, Norbert [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Siegmund, Stefan [Department of Mathematics, TU Dresden, D-01062 Dresden (Germany); Eckert, Jürgen [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Institute of Materials Science, TU Dresden, D-01062 Dresden (Germany)

    2014-07-21

This paper presents a spatiotemporal dynamic model based on the interaction between multiple shear bands in the plastic flow of metallic glasses during compressive deformation. Sliding events of various sizes burst in the plastic deformation as shear branches of different scales were generated; microscopic creep events and delocalized sliding events were analyzed based on the established model. This paper discusses the spatially uniform solutions and the traveling wave solution. The phase space of the spatially uniform system applied in this study reflected the chaotic state of the system at a lower strain rate. Moreover, numerical simulation showed that the microscopic creep events were manifested at a lower strain rate, whereas the delocalized sliding events were manifested at a higher strain rate.

  20. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

Full Text Available Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, individual sensors are usually located only at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, so it is desirable, from security and economic points of view, to reduce the number of commutations of the control signals. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results.
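One common way to realize such event-based control is the send-on-delta idea: recompute the actuator command only when the control error leaves a deadband, which directly reduces commutations. The sketch below illustrates this with a clipped proportional action; the controller, gains, and data are illustrative assumptions, not the paper's design:

```python
def event_based_control(measurements, setpoint, deadband):
    """Send-on-delta sketch: the actuator command is updated only
    when the control error exceeds the deadband; otherwise the
    previous command is held, reducing commutations."""
    commands = []
    last_cmd = None
    for y in measurements:
        error = setpoint - y
        if last_cmd is None or abs(error) > deadband:
            # Event: recompute a clipped proportional action in [0, 1]
            last_cmd = max(0.0, min(1.0, 0.1 * error))
            commands.append(last_cmd)
        else:
            # No event: hold the previous command (actuator untouched)
            commands.append(last_cmd)
    return commands

# Hypothetical greenhouse temperatures converging to a 20 degC setpoint
temps = [18.0, 18.5, 19.6, 19.8, 20.1, 22.5]
cmds = event_based_control(temps, setpoint=20.0, deadband=0.5)
print(cmds)
```

Only three of the six samples trigger an actuator update; the middle four readings sit inside the deadband and reuse the held command.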

  1. Excessive Heat Events and National Security: Building Resilience based on Early Warning Systems

    Science.gov (United States)

    Vintzileos, A.

    2017-12-01

Excessive heat events (EHE) affect the security of nations in multiple direct and indirect ways. EHE are the top cause of morbidity/mortality associated with any atmospheric extreme. Higher energy consumption used for cooling can lead to blackouts and social disorder. EHE affect the food supply chain, reducing crop yield and increasing the probability of food contamination during delivery and storage. Distribution of goods during EHE can be severely disrupted due to mechanical failure of transportation equipment. EHE during athletic events, e.g., marathons, may result in a high number of casualties. Finally, EHE may also affect military planning by, e.g., reducing hours of exercise and altering combat gear. Early warning systems for EHE allow for building resilience. In this paper we first define EHE as at least two consecutive heat days; a heat day is defined as a day with a maximum heat index whose probability of occurrence exceeds a certain threshold. We then use retrospective forecasts performed with a multitude of operational models and show that it is feasible to forecast EHE at forecast leads of week 2 and week 3 over the contiguous United States. We finally introduce an improved definition of EHE based on an intensity index and investigate the forecast skill of the predictive system in the tropics and subtropics.
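The definition above (an EHE is at least two consecutive heat days) translates directly into a run-length scan over a daily series. A sketch with a hypothetical heat-index series and a fixed threshold standing in for the probability-based one:

```python
def find_ehe(heat_index_max, threshold, min_days=2):
    """Flag excessive heat events: runs of >= min_days consecutive
    days whose maximum heat index exceeds the threshold. Returns
    (start_day, end_day) index pairs, inclusive."""
    events, start = [], None
    for day, hi in enumerate(heat_index_max):
        if hi > threshold:
            if start is None:
                start = day                      # a run of heat days begins
        else:
            if start is not None and day - start >= min_days:
                events.append((start, day - 1))  # run long enough: an EHE
            start = None
    # Close out a run that extends to the end of the series
    if start is not None and len(heat_index_max) - start >= min_days:
        events.append((start, len(heat_index_max) - 1))
    return events

# Hypothetical daily maximum heat index series (degrees F):
series = [92, 104, 106, 101, 95, 105, 90, 108, 109, 110]
print(find_ehe(series, threshold=100))  # -> [(1, 3), (7, 9)]
```

Note that day 5 exceeds the threshold but is an isolated heat day, so it does not qualify as an event under the two-day rule.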

  2. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. In power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
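A naive baseline for the edge-detection step (not the paper's algorithm, which analyzes full turn-on and turn-off waveforms) is to flag samples where the step change in aggregate power exceeds a threshold; the sign of the step separates switch-on from switch-off events. The trace below is hypothetical:

```python
def detect_power_edges(power, threshold):
    """Naive transient edge detector: flag samples where the step
    change in aggregate power exceeds the threshold. A positive
    step is a turn-on edge, a negative step a turn-off edge."""
    edges = []
    for i in range(1, len(power)):
        delta = power[i] - power[i - 1]
        if abs(delta) >= threshold:
            edges.append((i, "on" if delta > 0 else "off", delta))
    return edges

# Hypothetical aggregate power trace (watts): a kettle switching on, then off
trace = [120, 121, 119, 1920, 1921, 1919, 122, 121]
print(detect_power_edges(trace, threshold=500))
```

Real NILM front ends add filtering and steady-state checks so that noise and overlapping appliance transients are not misread as edges.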

  3. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Science.gov (United States)

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, individual sensors are usually located only at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, so it is desirable, from security and economic points of view, to reduce the number of commutations of the control signals. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597

  4. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors of the initiation of basal-insulin-supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by a trade-off between data availability, clinical plausibility, and statistical feasibility. The Cox proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
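A minimal example of the time-to-event machinery the authors advocate: a Kaplan-Meier estimator that handles right-censoring, the feature a simple case-control count ignores. The follow-up data are hypothetical, not from the register:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator with right-censoring
    (event=1 means the event occurred, 0 means censored).
    Returns (time, survival probability) pairs at event times."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)    # events at time t
        n_t = sum(1 for tt, _ in data if tt == t)  # subjects leaving at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk            # KM product-limit step
            curve.append((t, surv))
        n_at_risk -= n_t
        i += n_t
    return curve

# Hypothetical follow-up times in months (1 = event, 0 = censored):
times  = [2, 3, 3, 5, 8, 8, 12]
events = [1, 1, 0, 1, 0, 1, 0]
print(kaplan_meier(times, events))
```

Dropping the three censored patients (or counting them as event-free) would bias the estimate; the product-limit form keeps them in the risk set until their censoring times.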

  5. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    Directory of Open Access Journals (Sweden)

    Xiaoqing Hao

Full Text Available To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiment: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed a sentiment mode complex network of online public opinions (SMCOP), with modes as nodes and the conversion relations in chronological order between different types of modes as edges. We calculated the strength, k-plex cliques, clustering coefficients, and betweenness centrality of the SMCOP. The results show that the strength distribution obeys a power law. Most posts' sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core modes, respectively. Few modes have larger betweenness centrality values, and most modes convert to each other with these higher-betweenness-centrality modes as mediums. Therefore, relevant people or institutions can take measures to guide public sentiment regarding online hot events according to the sentiment diffusion mechanism.
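The coarse-graining step can be sketched as follows: sentiment scores are mapped to the five symbols, a sliding window forms modes such as ppppp, and consecutive modes become edges of the SMCOP. The score thresholds and window length here are illustrative assumptions, not those of the paper:

```python
from collections import Counter

def coarse_grain(values, window=5):
    """Map sentiment scores to symbols {P, p, o, n, N}, slide a
    window to form modes, and count mode-to-mode transitions
    (the edges of an SMCOP-style network)."""
    def symbol(v):
        if v >= 2:   return "P"   # strongly positive
        if v > 0:    return "p"   # weakly positive
        if v == 0:   return "o"   # neutral
        if v > -2:   return "n"   # weakly negative
        return "N"                # strongly negative
    symbols = [symbol(v) for v in values]
    modes = ["".join(symbols[i:i + window])
             for i in range(len(symbols) - window + 1)]
    edges = Counter(zip(modes, modes[1:]))  # chronological transitions
    return modes, edges

# Hypothetical chronological sentiment scores of posts:
scores = [1, 1, 0, 1, 1, 1, 0, 0]
modes, edges = coarse_grain(scores)
print(modes)  # -> ['ppopp', 'poppp', 'opppo', 'pppoo']
```

The `edges` counter is exactly the weighted edge list one would feed to a network library to compute strength, clustering, and betweenness centrality.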

  6. Management of investment-construction projects basing on the matrix of key events

    Directory of Open Access Journals (Sweden)

    Morozenko Andrey Aleksandrovich

    2016-11-01

Full Text Available The article considers current problematic issues in the management of investment-construction projects and examines ways to increase the efficiency of construction operations on the basis of the formation of a reflex-adaptive organizational structure. The authors analyzed the necessity of forming a matrix of key events in an investment-construction project (ICP), which will create the optimal structure of the project based on the work program for its implementation. For convenience in representing project implementation schedules over time, the authors recommend consolidating the works into separate, economically independent functional blocks. It is proposed to use an algorithm for forming the matrix of an investment-construction project that considers the economic independence of the functional blocks and the stages of ICP implementation. The use of an extended network model is justified, supplemented by organizational and structural constraints at different stages of the project and highlighting the key events that fundamentally influence the further course of ICP implementation.

  7. Opportunities for Web-based Drug Repositioning: Searching for Potential Antihypertensive Agents with Hypotension Adverse Events.

    Science.gov (United States)

    Wang, Kejian; Wan, Mei; Wang, Rui-Sheng; Weng, Zuquan

    2016-04-01

Drug repositioning refers to the process of developing new indications for existing drugs. As a phenotypic indicator of drug response in humans, clinical side effects may provide straightforward signals and unique opportunities for drug repositioning. We aimed to identify drugs frequently associated with hypotension adverse reactions (ie, the opposite condition of hypertension), which could be potential candidates as antihypertensive agents. We systematically searched the electronic records of the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) through the openFDA platform to assess the association between hypotension incidence and antihypertensive therapeutic effect for a list of 683 drugs. Statistical analysis of FAERS data demonstrated that drugs frequently co-occurring with hypotension events were more likely to have antihypertensive activity. Ranked by the statistical significance of frequent hypotension reporting, the well-known antihypertensive drugs were effectively distinguished from the others (with an area under the receiver operating characteristic curve > 0.80 and a normalized discounted cumulative gain of 0.77). In addition, we found a series of antihypertensive agents (particularly drugs originally developed for treating nervous system diseases) among the drugs with the most significant reporting, suggesting the good potential of Web-based and data-driven drug repositioning. We found several candidate agents among the hypotension-related drugs on our list that may be redirected for lowering blood pressure. More important, we showed that a pharmacovigilance system could alternatively be used to identify antihypertensive agents and sustainably create opportunities for drug repositioning.

  8. Impacts of European drought events: insights from an international database of text-based reports

    Science.gov (United States)

    Stahl, Kerstin; Kohn, Irene; Blauhut, Veit; Urquijo, Julia; De Stefano, Lucia; Acácio, Vanda; Dias, Susana; Stagge, James H.; Tallaksen, Lena M.; Kampragou, Eleni; Van Loon, Anne F.; Barker, Lucy J.; Melsen, Lieke A.; Bifulco, Carlo; Musolino, Dario; de Carli, Alessandro; Massarutto, Antonio; Assimacopoulos, Dionysis; Van Lanen, Henny A. J.

    2016-03-01

    Drought is a natural hazard that can cause a wide range of impacts affecting the environment, society, and the economy. Providing an impact assessment and reducing vulnerability to these impacts for regions beyond the local scale, spanning political and sectoral boundaries, requires systematic and detailed data regarding impacts. This study presents an assessment of the diversity of drought impacts across Europe based on the European Drought Impact report Inventory (EDII), a unique research database that has collected close to 5000 impact reports from 33 European countries. The reported drought impacts were classified into major impact categories, each of which had a number of subtypes. The distribution of these categories and types was then analyzed over time, by country, across Europe and for particular drought events. The results show that impacts on agriculture and public water supply dominate the collection of drought impact reports for most countries and for all major drought events since the 1970s, while the number and relative fractions of reported impacts in other sectors can vary regionally and from event to event. The analysis also shows that reported impacts have increased over time as more media and website information has become available and environmental awareness has increased. Even though the distribution of impact categories is relatively consistent across Europe, the details of the reports show some differences. They confirm severe impacts in southern regions (particularly on agriculture and public water supply) and sector-specific impacts in central and northern regions (e.g., on forestry or energy production). The protocol developed thus enabled a new and more comprehensive view on drought impacts across Europe. Related studies have already developed statistical techniques to evaluate the link between drought indices and the categorized impacts using EDII data. The EDII is a living database and is a promising source for further research on

  9. Allowing Brief Delays in Responding Improves Event-Based Prospective Memory for Young Adults Living with HIV Disease

    OpenAIRE

    Loft, Shayne; Doyle, Katie L.; Naar-King, Sylvie; Outlaw, Angulique Y.; Nichols, Sharon L.; Weber, Erica; Blackstone, Kaitlin; Woods, Steven Paul

    2014-01-01

    Event-based prospective memory (PM) tasks require individuals to remember to perform an action when they encounter a specific cue in the environment, and have clear relevance for daily functioning for individuals with HIV. In many everyday tasks, the individual must not only maintain the intent to perform the PM task, but the PM task response also competes with the alternative and more habitual task response. The current study examined whether event-based PM can be improved by slowing down th...

  10. Eruptive event generator based on the Gibson-Low magnetic configuration

    Science.gov (United States)

    Borovikov, D.; Sokolov, I. V.; Manchester, W. B.; Jin, M.; Gombosi, T. I.

    2017-08-01

    Coronal mass ejections (CMEs), a type of energetic solar eruption, are an integral subject of space weather research. Numerical magnetohydrodynamic (MHD) modeling, which requires powerful computational resources, is one of the primary means of studying the phenomenon. As such resources become more accessible, the demand grows for user-friendly tools that facilitate the process of simulating CMEs for scientific and operational purposes. The Eruptive Event Generator based on Gibson-Low flux rope (EEGGL), a new publicly available computational model presented in this paper, is an effort to meet this demand. EEGGL allows one to compute the parameters of a model flux rope driving a CME via an intuitive graphical user interface. We provide a brief overview of the physical principles behind EEGGL and its functionality. Ways toward future improvements of the tool are outlined.

  11. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and temporal resolution data obtained by the telescopes aboard Hinode have revealed new and interesting dynamics in the solar atmosphere. In order to detect such events and estimate their velocities automatically, we examined optical flow estimation methods based on OpenCV, the computer vision library. We applied the methods to a prominence eruption observed by NoRH and a polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for them. This indicates that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
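
    The abstract names OpenCV's optical-flow estimators but gives no code. As a hedged, numpy-only sketch of the same apparent-motion idea, phase correlation recovers the frame-to-frame displacement of a moving feature; the function name and synthetic test pattern below are ours, not from the paper:

    ```python
    import numpy as np

    def estimate_shift(frame_a, frame_b):
        """Estimate the integer (dy, dx) displacement between two frames via
        phase correlation, a frequency-domain relative of the dense
        optical-flow estimators available in OpenCV."""
        fa = np.fft.fft2(frame_a)
        fb = np.fft.fft2(frame_b)
        cross = fb * np.conj(fa)
        cross /= np.abs(cross) + 1e-12           # keep only the phase ramp
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        return int(dy), int(dx)

    # Synthetic "prominence": a Gaussian blob that moves 2 px down, 3 px right.
    y, x = np.mgrid[0:64, 0:64]
    blob = np.exp(-((y - 20) ** 2 + (x - 20) ** 2) / 30.0)
    moved = np.roll(blob, (2, 3), axis=(0, 1))

    print(estimate_shift(blob, moved))  # → (2, 3)
    ```

    Dividing the estimated pixel shift by the frame cadence and multiplying by the plate scale would give an apparent velocity, which is the quantity the study extracts from the solar image sequences.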

  12. Ontology-based knowledge management for personalized adverse drug events detection.

    Science.gov (United States)

    Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue

    2011-01-01

    Since Adverse Drug Events (ADEs) have become a leading cause of death around the world, there is high demand for tools that help clinicians or patients identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with a focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define an ADE ontology to uniformly manage ADE knowledge from multiple sources. We take advantage of the rich semantics of the terminology SNOMED-CT and apply it to ADE detection via semantic query and reasoning.

  13. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
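
    The factual/counterfactual comparison described above reduces, in the simplest case, to estimating an event probability in each ensemble and forming their ratio. A minimal sketch with synthetic Gaussian "ensembles"; all values (means, ensemble size, threshold) are illustrative, not the study's:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical ensembles of a summer drought index (negative = dry).
    factual = rng.normal(loc=-0.2, scale=1.0, size=10_000)        # with human forcing
    counterfactual = rng.normal(loc=0.0, scale=1.0, size=10_000)  # without it

    threshold = -1.5  # defines a "2015-like" drought event

    p1 = np.mean(factual < threshold)         # event probability, factual world
    p0 = np.mean(counterfactual < threshold)  # event probability, counterfactual world

    risk_ratio = p1 / p0   # > 1: forcing made the event more likely
    far = 1.0 - p0 / p1    # fraction of attributable risk
    print(f"RR = {risk_ratio:.2f}, FAR = {far:.2f}")
    ```

    The methodological choices the abstract lists (event definition, model configuration, counterfactual design) all change `threshold`, the ensembles, or both, which is why single-model, single-approach attribution statements can disagree.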

  14. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach

    Directory of Open Access Journals (Sweden)

    Hongxia Zhang

    2017-11-01

    Full Text Available Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task that was embedded within an ongoing computer-based color-matching task. For this experiment, we separated the overall task’s duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. The filler task duration and ongoing task duration were further divided into three levels: 3, 6, and 9 min. Two factors were then orthogonally manipulated between subjects, using a multinomial processing tree model to separate the effects of different task durations on the two EBPM components. A mediation model was then created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the duration of ongoing tasks had a negative effect on EBPM performance, while lengthening the duration of the filler task had no significant effect on it. (2) As the filler task was lengthened, both the prospective and retrospective components showed a decreasing and then increasing trend. Also, when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between task duration and EBPM performance was significant. We concluded that different task durations influence EBPM performance through different components, with discrimination being the mediator between task duration and EBPM performance.

  15. Historical Chronology of ENSO Events Based Upon Documentary Data From South America: Strengths and Limitations

    Science.gov (United States)

    Luc, O.

    2007-05-01

    The first reconstructions of past El Niño occurrences were proposed by W. Quinn twenty years ago. They were based on documentary evidence of anomalous rainfall episodes, destructive floods, and other possible impacts of El Niño conditions in Peru and other South American countries. It was later shown that the El Niño chronological sequence covering the last four and a half centuries produced by Quinn needed a thorough revision, since many so-called EN events had not occurred while some others had been overlooked. Besides the classical methodological problems encountered in historical climatology studies (reliability of data, confidence in the sources, primary and secondary information), the reconstruction of former EN events faces specific difficulties concerning the significance of the indicators and their spatial location. For instance, strong precipitation anomalies during summer in southern Ecuador and northern Peru, and precipitation excess recorded in the preceding winter in central Chile, constitute quite reliable proxies of El Niño conditions in modern times. However, this observed teleconnection pattern, which is useful to reinforce the interpretation of past EN occurrences, seems to have been inoperative before the early nineteenth century. It is interpreted that atmospheric circulation features during the Little Ice Age interfered with the teleconnection system linking the EN impacts in northern Peru and central Chile. As a consequence, how should the significance of documented winter precipitation excess in central Chile be evaluated for years in which there is evidence of drought in northern Peru, during the sixteenth to eighteenth centuries? And vice versa, is earlier evidence of precipitation excess in northern Peru (prior to the nineteenth century) a reliable indicator of EN conditions, even if the preceding winter was dry in the Valparaiso-Santiago region? Other specific problems met in the building-up of a consolidated EN chronological

  16. Swarm-Aurora: A web-based tool for quickly identifying multi-instrument auroral events

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Knudsen, D. J.; Frey, H. U.; Kauristie, K.; Partamies, N.; Jackel, B. J.; Gillies, M.; Holmdahl Olsen, P. E.

    2016-12-01

    In recent years there has been a dramatic increase in ground-based auroral imaging systems. These include the continent-wide THEMIS-ASI network, and imagers operated by other programs including GO-Canada, MIRACLE, AGO, OMTI, and more. In the near future, a new Canadian program called TREx will see the deployment of new narrow-band ASIs that will provide multi-wavelength imaging across Western Canada. At the same time, there is an unprecedented fleet of international spacecraft probing geospace at low and high altitudes. We are now in a position to simultaneously observe the magnetospheric drivers of aurora, observe in situ the waves, currents, and particles associated with MI coupling, and the conjugate aurora. Whereas a decade ago a single magnetic conjunction between one ASI and a low-altitude satellite was a relatively rare event, we now have a plethora of triple conjunctions between imagers, low-altitude spacecraft, and near-equatorial magnetospheric probes. But with these riches comes a new level of complexity. It is often difficult to identify the many useful conjunctions for a specific line of inquiry from the multitude of conjunctions where the geospace conditions are often not relevant and/or the imaging is compromised by clouds, moon, or other factors. Swarm-Aurora was designed to facilitate and drive the use of Swarm in situ measurements in auroral science. The project seeks to build a bridge between the Swarm science community, Swarm data, and the complementary auroral data and community. Swarm-Aurora (http://swarm-aurora.phys.ucalgary.ca) incorporates a web-based tool which provides access to quick-look summary data for a large array of instruments, with Swarm in situ and ground-based ASI data as the primary focus. This web interface allows researchers to quickly and efficiently browse Swarm and ASI data to identify auroral events of interest to them. This allows researchers to be able to easily and quickly identify Swarm overflights of ASIs that

  17. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. 
The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their

  19. Acute disseminated encephalomyelitis onset: evaluation based on vaccine adverse events reporting systems.

    Directory of Open Access Journals (Sweden)

    Paolo Pellegrino

    Full Text Available OBJECTIVE: To evaluate epidemiological features of post-vaccine acute disseminated encephalomyelitis (ADEM) by considering data from different pharmacovigilance surveillance systems. METHODS: The Vaccine Adverse Event Reporting System (VAERS) database and the EudraVigilance post-authorisation module (EVPM) were searched to identify post-vaccine ADEM cases. Epidemiological features including sex and related vaccines were analysed. RESULTS: We retrieved 205 and 236 ADEM cases from the EVPM and VAERS databases, respectively, of which 404 were considered for epidemiological analysis following verification and causality assessment. Half of the patients were younger than 18 years, with a slight male predominance. The time interval from vaccination to ADEM onset was 2-30 days in 61% of the cases. Vaccines against seasonal flu and human papilloma virus were those most frequently associated with ADEM, accounting for almost 30% of the total cases. The mean number of reports per year between 2005 and 2012 in the VAERS database was 40±21.7, decreasing after 2010, mainly because of a reduction in reports associated with the human papilloma virus and Diphtheria, Pertussis, Tetanus, Polio and Haemophilus influenzae type B vaccines. CONCLUSIONS: This study has high epidemiological power, as it is based on information on adverse events having occurred in over one billion people. It suffers from a lack of rigorous case verification due to the weaknesses intrinsic to the surveillance databases used. At variance with previous reports of a prevalence of ADEM in childhood, we demonstrate that it may occur at any age following vaccination. This study also shows that the diminishing trend in post-vaccine ADEM reporting related to the Diphtheria, Pertussis, Tetanus, Polio and Haemophilus influenzae type B and human papilloma virus vaccine groups is most likely not due to a decline in vaccine coverage or indicative of reduced attention to this adverse drug reaction.

  20. Cardiovascular events in patients with atherothrombotic disease: a population-based longitudinal study in Taiwan.

    Directory of Open Access Journals (Sweden)

    Wen-Hsien Lee

    Full Text Available BACKGROUND: Atherothrombotic diseases, including cerebrovascular disease (CVD), coronary artery disease (CAD), and peripheral arterial disease (PAD), are among the major causes of death in the world. Although several studies showed an association between polyvascular disease and poor cardiovascular (CV) outcomes in Asian populations, there was no large-scale study to validate this relationship in this population. METHODS AND RESULTS: This retrospective cohort study included patients with a diagnosis of CVD, CAD, or PAD from the database of the Taiwan National Health Insurance Bureau during 2001-2004. A total of 19954 patients were enrolled in this study. The atherothrombotic disease score was defined according to the number of atherothrombotic diseases. The study endpoints included acute coronary syndrome (ACS), all strokes, vascular procedures, in-hospital mortality, and so on. The event rate of ischemic stroke (18.2%) was higher than that of acute myocardial infarction (5.7%) in our patients (P = 0.0006). In the multivariate Cox regression analyses, the adjusted hazard ratios (HRs) of each increment of the atherothrombotic disease score in predicting ACS, all strokes, vascular procedures, and in-hospital mortality were 1.41, 1.66, 1.30, and 1.14, respectively (P ≤ 0.0169). CONCLUSIONS: This large population-based longitudinal study in patients with atherothrombotic disease demonstrated that the risk of subsequent ischemic stroke was higher than that of subsequent AMI. In addition, the subsequent adverse CV events, including ACS, all strokes, vascular procedures, and in-hospital mortality, increased progressively with the atherothrombotic disease score.
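
    Under a proportional-hazards model, a per-point hazard ratio multiplies with each score increment, so the relative hazard between two score levels is the HR raised to the score difference. A small sketch of that arithmetic (the HR values are the adjusted estimates quoted in the abstract; the score-3-vs-score-1 comparison itself is our illustration):

    ```python
    # Adjusted hazard ratios per one-point increase in the atherothrombotic
    # disease score, as reported in the abstract.
    hr_per_point = {
        "ACS": 1.41,
        "all strokes": 1.66,
        "vascular procedures": 1.30,
        "in-hospital mortality": 1.14,
    }

    # Relative hazard for a patient with all three conditions (score 3)
    # versus one condition (score 1): two increments, so HR squared.
    hr_score3_vs_1 = {endpoint: hr ** 2 for endpoint, hr in hr_per_point.items()}
    ```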

  1. Event-based model diagnosis of rainfall-runoff model structures

    International Nuclear Information System (INIS)

    Stanzel, P.

    2012-01-01

    The objective of this research is a comparative evaluation of different rainfall-runoff model structures. Comparative model diagnostics facilitate the assessment of strengths and weaknesses of each model. The application of multiple models allows an analysis of simulation uncertainties arising from the selection of model structure, as compared with effects of uncertain parameters and precipitation input. Four different model structures, including conceptual and physically based approaches, are compared. In addition to runoff simulations, results for soil moisture and the runoff components of overland flow, interflow and base flow are analysed. Catchment runoff is simulated satisfactorily by all four model structures and shows only minor differences. Systematic deviations from runoff observations provide insight into model structural deficiencies. While physically based model structures capture some single runoff events better, they do not generally outperform conceptual model structures. Contributions to uncertainty in runoff simulations stemming from the choice of model structure show similar dimensions to those arising from parameter selection and the representation of precipitation input. Variations in precipitation mainly affect the general level and peaks of runoff, while different model structures lead to different simulated runoff dynamics. Large differences between the four analysed models are detected for simulations of soil moisture and, even more pronounced, runoff components. Soil moisture changes are more dynamical in the physically based model structures, which is in better agreement with observations. Streamflow contributions of overland flow are considerably lower in these models than in the more conceptual approaches. Observations of runoff components are rarely made and are not available in this study, but are shown to have high potential for an effective selection of appropriate model structures (author)

  2. Age Differences in the Experience of Daily Life Events: A Study Based on the Social Goals Perspective.

    Science.gov (United States)

    Ji, Lingling; Peng, Huamao; Xue, Xiaotong

    2017-01-01

    This study examined age differences in daily life events related to different types of social goals based on the socioemotional selectivity theory (SST), and determined whether the positivity effect existed in the context of social goals in older adults' daily lives. Over a course of 14 days, 49 older adults and 36 younger adults wrote about up to three life events daily and rated the valence of each event. The findings indicated that (1) although both older and younger adults recorded events related to both emotional and knowledge-acquisition goals, the odds ratio for reporting a higher number of events related to emotional goals compared to the number of events related to knowledge-acquisition goals was 2.12 times higher in older adults than that observed in younger adults. (2) Considering the number of events, there was an age-related positivity effect only for knowledge-related goals, and (3) older adults' ratings for events related to emotional and knowledge-acquisition goals were significantly more positive compared to those observed in younger adults. These findings supported the SST, and to some extent, the positivity effect was demonstrated in the context of social goals.

  3. Age Differences in the Experience of Daily Life Events: A Study Based on the Social Goals Perspective

    Directory of Open Access Journals (Sweden)

    Lingling Ji

    2017-09-01

    Full Text Available This study examined age differences in daily life events related to different types of social goals based on the socioemotional selectivity theory (SST), and determined whether the positivity effect existed in the context of social goals in older adults’ daily lives. Over a course of 14 days, 49 older adults and 36 younger adults wrote about up to three life events daily and rated the valence of each event. The findings indicated that (1) although both older and younger adults recorded events related to both emotional and knowledge-acquisition goals, the odds ratio for reporting a higher number of events related to emotional goals compared to the number of events related to knowledge-acquisition goals was 2.12 times higher in older adults than that observed in younger adults; (2) considering the number of events, there was an age-related positivity effect only for knowledge-related goals; and (3) older adults’ ratings for events related to emotional and knowledge-acquisition goals were significantly more positive compared to those observed in younger adults. These findings supported the SST, and to some extent, the positivity effect was demonstrated in the context of social goals.

  4. Vision-based Detection of Acoustic Timed Events: a Case Study on Clarinet Note Onsets

    Science.gov (United States)

    Bazzica, A.; van Gemert, J. C.; Liem, C. C. S.; Hanjalic, A.

    2017-05-01

    Acoustic events often have a visual counterpart. Knowledge of visual information can aid the understanding of complex auditory scenes, even when only a stereo mixdown is available in the audio domain, e.g., identifying which musicians are playing in large musical ensembles. In this paper, we consider a vision-based approach to note onset detection. As a case study we focus on challenging, real-world clarinetist videos and carry out preliminary experiments on a 3D convolutional neural network based on multiple streams and purposely avoiding temporal pooling. We release an audiovisual dataset with 4.5 hours of clarinetist videos together with cleaned annotations which include about 36,000 onsets and the coordinates for a number of salient points and regions of interest. By performing several training trials on our dataset, we learned that the problem is challenging. We found that the CNN model is highly sensitive to the optimization algorithm and hyper-parameters, and that treating the problem as binary classification may prevent the joint optimization of precision and recall. To encourage further research, we publicly share our dataset, annotations and all models and detail which issues we came across during our preliminary experiments.

  5. An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.

    Science.gov (United States)

    Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein

    2017-12-22

    The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
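
    The core of an event-triggered segmentation like the one described above is locating the impact and cutting stage-aligned windows around it, instead of sliding a fixed window blindly. A minimal sketch under assumed thresholds and window lengths (the 2.5 g trigger, sampling rate, and window sizes are illustrative, not the paper's parameters):

    ```python
    import numpy as np

    def segment_by_impact(acc, fs=50, pre_s=1.0, impact_s=0.5, post_s=1.0,
                          g_thresh=2.5):
        """Locate the impact as the first sample whose acceleration magnitude
        (in g) exceeds g_thresh, then cut pre-impact / impact / post-impact
        windows around it. Returns None if no fall-like event is found."""
        mag = np.linalg.norm(acc, axis=1)
        above = np.flatnonzero(mag > g_thresh)
        if above.size == 0:
            return None
        i = int(above[0])
        pre = mag[max(0, i - int(pre_s * fs)):i]
        impact = mag[i:i + int(impact_s * fs)]
        post = mag[i + int(impact_s * fs):i + int((impact_s + post_s) * fs)]
        return pre, impact, post

    # Synthetic trace: quiet standing (~1 g), a 3.5 g spike, then lying still.
    fs = 50
    acc = np.tile([0.0, 0.0, 1.0], (4 * fs, 1))
    acc[100] = [0.0, 0.0, 3.5]
    segments = segment_by_impact(acc, fs=fs)
    ```

    Features computed per stage (rather than per arbitrary window) are what let the classifiers distinguish falls from other high-acceleration activities.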

  6. Toward brain-computer interface based wheelchair control utilizing tactually-evoked event-related potentials

    Science.gov (United States)

    2014-01-01

    Background: People with severe disabilities, e.g. due to neurodegenerative disease, depend on technology that allows for accurate wheelchair control. For those who cannot operate a wheelchair with a joystick, brain-computer interfaces (BCI) may offer a valuable option. Technology depending on visual or auditory input may not be feasible as these modalities are dedicated to processing of environmental stimuli (e.g. recognition of obstacles, ambient noise). Herein we thus validated the feasibility of a BCI based on tactually-evoked event-related potentials (ERP) for wheelchair control. Furthermore, we investigated use of a dynamic stopping method to improve the speed of the tactile BCI system. Methods: Positions of four tactile stimulators represented navigation directions (left thigh: move left; right thigh: move right; abdomen: move forward; lower neck: move backward) and N = 15 participants delivered navigation commands by focusing their attention on the desired tactile stimulus in an oddball paradigm. Results: Participants navigated a virtual wheelchair through a building, and eleven participants successfully completed the task of reaching 4 checkpoints in the building. The virtual wheelchair was equipped with simulated shared-control sensors (collision avoidance), yet these sensors were rarely needed. Conclusion: We conclude that most participants achieved tactile ERP-BCI control sufficient to reliably operate a wheelchair and that dynamic stopping was of high value for tactile ERP classification. Finally, this paper discusses the feasibility of tactile ERPs for BCI-based wheelchair control. PMID:24428900

  7. An Address Event Representation-Based Processing System for a Biped Robot

    Directory of Open Access Journals (Sweden)

    Uziel Jaramillo-Avila

    2016-02-01

    Full Text Available In recent years, several important advances have been made in the fields of both biologically inspired sensorial processing and locomotion systems, such as Address Event Representation-based cameras (or Dynamic Vision Sensors) and human-like robot locomotion, e.g., the walking of a biped robot. However, making these fields merge properly is not an easy task. In this regard, Neuromorphic Engineering is a fast-growing research field, the main goal of which is the biologically inspired design of hybrid hardware systems in order to mimic neural architectures and to process information in the manner of the brain. However, few robotic applications exist to illustrate them. The main goal of this work is to demonstrate, by creating a closed-loop system using only bio-inspired techniques, how such applications can work properly. We present an algorithm using Spiking Neural Networks (SNN) for a biped robot equipped with a Dynamic Vision Sensor, which is designed to follow a line drawn on the floor. This is a commonly used method for demonstrating control techniques. Most of them are fairly simple to implement without very sophisticated components; however, it can still serve as a good test in more elaborate circumstances. In addition, the locomotion system proposed is able to coordinately control the six DOFs of a biped robot in switching between basic forms of movement. The latter has been implemented as an FPGA-based neuromorphic system. Numerical tests and hardware validation are presented.

  8. Dust events in Beijing, China (2004–2006): comparison of ground-based measurements with columnar integrated observations

    Directory of Open Access Journals (Sweden)

    Z. J. Wu

    2009-09-01

    Full Text Available Ambient particle number size distributions spanning three years were used to characterize the frequency and intensity of atmospheric dust events in the urban areas of Beijing, China in combination with AERONET sun/sky radiometer data. Dust events were classified into two types based on the differences in particle number and volume size distributions and local weather conditions. This categorization was confirmed by aerosol index images, columnar aerosol optical properties, and vertical potential temperature profiles. During the type-1 events, dust particles dominated the total particle volume concentration (<10 μm), with a relative share over 70%. Anthropogenic particles in the Aitken and accumulation mode played a subordinate role here because of high wind speeds (>4 m s−1). The type-2 events occurred in rather stagnant air masses and were characterized by a lower volume fraction of coarse mode particles (on average, 55%). Columnar optical properties showed that the superposition of dust and anthropogenic aerosols in type-2 events resulted in a much higher AOD (average: 1.51) than for the rather pure dust aerosols in type-1 events (average AOD: 0.36). A discrepancy was found between the ground-based and column integrated particle volume size distributions, especially for the coarse mode particles. This discrepancy likely originates from both the limited comparability of particle volume size distributions derived from Sun photometer and in situ number size distributions, and the inhomogeneous vertical distribution of particles during dust events.

  9. Creating personalized memories from social events: Community-based support for multi-camera recordings of school concerts

    OpenAIRE

    Guimaraes R.L.; Cesar P.; Bulterman D.C.A.; Zsombori V.; Kegel I.

    2011-01-01

    The wide availability of relatively high-quality cameras makes it easy for many users to capture video fragments of social events such as concerts, sports events or community gatherings. The wide availability of simple sharing tools makes it nearly as easy to upload individual fragments to on-line video sites. Current work on video mashups focuses on the creation of a video summary based on the characteristics of individual media fragments, but it fails to address the interpersona...

  10. Transcription-based model for the induction of chromosomal exchange events by ionising radiation

    International Nuclear Information System (INIS)

    Radford, I.A.

    2003-01-01

    The mechanistic basis for chromosomal aberration formation, following exposure of mammalian cells to ionising radiation, has long been debated. Although chromosomal aberrations are probably initiated by DNA double-strand breaks (DSB), little is understood about the mechanisms that generate and modulate DNA rearrangement. Based on results from our laboratory and data from the literature, a novel model of chromosomal aberration formation has been suggested (Radford 2002). The basic postulates of this model are that: (1) DSB, primarily those involving multiple individual damage sites (i.e. complex DSB), are the critical initiating lesion; (2) only those DSB occurring in transcription units that are associated with transcription 'factories' (complexes containing multiple transcription units) induce chromosomal exchange events; (3) such DSB are brought into contact with a DNA topoisomerase I molecule through RNA polymerase II catalysed transcription and give rise to trapped DNA-topo I cleavage complexes; and (4) trapped complexes interact with another topo I molecule on a temporarily inactive transcription unit at the same transcription factory, leading to DNA cleavage and subsequent strand exchange between the cleavage complexes. We have developed a method using inverse PCR that allows the detection and sequencing of putative ionising radiation-induced DNA rearrangements involving different regions of the human genome (Forrester and Radford 1998). The sequences detected by inverse PCR can provide a test of the prediction of the transcription-based model that ionising radiation-induced DNA rearrangements occur between sequences in active transcription units. Accordingly, reverse transcriptase PCR was used to determine if sequences involved in rearrangements were transcribed in the test cells. Consistent with the transcription-based model, nearly all of the sequences examined gave a positive result to reverse transcriptase PCR (Forrester and Radford, unpublished).

  11. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

    Full Text Available Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals. 
We study the impact of our techniques on the
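    The kind of discrete-event engine that GroudSim builds on can be sketched in a few lines: a time-ordered queue of (time, action) pairs, where each action may schedule further events. This is a generic illustration of the technique, not GroudSim's actual API; the lease/release example is hypothetical.

```python
import heapq

class DiscreteEventEngine:
    """Minimal discrete-event simulation core: a time-ordered queue of
    scheduled actions, processed in order; an action may schedule more."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker for events scheduled at the same time

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

log = []

def lease(engine):  # lease a simulated resource, release it 3 time units later
    log.append(("lease", engine.now))
    engine.schedule(3.0, release)

def release(engine):
    log.append(("release", engine.now))

engine = DiscreteEventEngine()
engine.schedule(1.0, lease)
engine.run()
# log == [("lease", 1.0), ("release", 4.0)]
```

    Because only scheduled instants are visited, such an engine scales with the number of events rather than with simulated time, which is the scalability advantage the abstract contrasts against process-based simulation.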

  12. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and high photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars, because KMTNet, with its constant exposure time, easily saturates around the peak of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be significantly detected in the near future.

  13. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Full Text Available Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes for two or three consecutive daytime nucleation events, followed by subsequent particle growth during the same day. We analysed 108 and 31 such days observed in a polluted industrial environment and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  14. New developments in file-based infrastructure for ATLAS event selection

    Energy Technology Data Exchange (ETDEWEB)

    Gemmeren, P van; Malon, D M [Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nowak, M, E-mail: gemmeren@anl.go [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2010-04-01

    In ATLAS software, TAGs are event metadata records that can be stored in various technologies, including ROOT files and relational databases. TAGs are used to identify and extract events that satisfy certain selection predicates, which can be coded as SQL-style queries. TAG collection files support in-file metadata to store information describing all events in the collection. Event Selector functionality has been augmented to provide such collection-level metadata to subsequent algorithms. The ATLAS I/O framework has been extended to allow computational processing of TAG attributes to select or reject events without reading the event data. This capability enables physicists to use more detailed selection criteria than are feasible in an SQL query. For example, the TAGs contain enough information not only to check the number of electrons, but also to calculate their distance to the closest jet, a calculation that would be difficult to express in SQL. Another new development allows ATLAS to write TAGs directly into event data files. This feature can improve performance by supporting advanced event selection capabilities, including computational processing of TAG information, without the need for external TAG file or database access.
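    The electron-to-jet example above can be sketched as a computational predicate over metadata records. The attribute names, the simplified ΔR = sqrt(Δη² + Δφ²) distance (without φ wrap-around), and the cut values here are all illustrative assumptions, not the actual ATLAS TAG schema.

```python
import math

def min_electron_jet_dr(tag):
    """Smallest ΔR between any electron and any jet, computed from
    TAG-style metadata only (no event data is read)."""
    pairs = [(e, j) for e in tag["electrons"] for j in tag["jets"]]
    if not pairs:
        return float("inf")
    return min(math.hypot(e["eta"] - j["eta"], e["phi"] - j["phi"])
               for e, j in pairs)

def select(tags, n_electrons_min=1, dr_min=0.4):
    """Keep events with enough electrons, all well separated from jets --
    the kind of predicate that is awkward in an SQL-style TAG query."""
    return [t for t in tags
            if len(t["electrons"]) >= n_electrons_min
            and min_electron_jet_dr(t) >= dr_min]
```

    The point of the design is that `select` runs over compact metadata, so the expensive event payload is only read for the surviving events.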

  15. Selection of events at Ukrainian NPPs using the algorithm based on accident precursor method

    International Nuclear Information System (INIS)

    Vorontsov, D.V.; Lyigots'kij, O.Yi.; Serafin, R.Yi.; Tkachova, L.M.

    2012-01-01

    The paper describes a general approach to the first stage of research and development on analysis of Ukrainian NPP operation events from 1 January 2000 to 31 December 2010 using the accident precursor approach. Groups of potentially important events, formed after selection and classification, are provided.

  16. Some implications of an event-based definition of exposure to the risk of road accident.

    Science.gov (United States)

    Elvik, Rune

    2015-03-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space or by requiring a road user to take action to avoid leaving the roadway. A typology of events representing a potential for an accident is proposed. Each event can be interpreted as a trial as defined in probability theory. Risk is the proportion of events that result in an accident. Defining exposure as events demanding the attention of road users implies that road users will learn from repeated exposure to these events, which in turn implies that there will normally be a negative relationship between exposure and risk. Four hypotheses regarding the relationship between exposure and risk are proposed. Preliminary tests support these hypotheses. Advantages and disadvantages of defining exposure as specific events are discussed. It is argued that developments in vehicle technology are likely to make events both observable and countable, thus ensuring that exposure is an operational concept. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Search for gamma-ray events in the BATSE data base

    Science.gov (United States)

    Lewin, Walter

    1994-01-01

    We find large location errors and error radii in the locations of channel 1 Cygnus X-1 events. These errors and their associated uncertainties are a result of low signal-to-noise ratios (a few sigma) in the two brightest detectors for each event. The untriggered events suffer from similarly low signal-to-noise ratios, and their location errors are expected to be at least as large as those found for Cygnus X-1 with a given signal-to-noise ratio. The statistical error radii are consistent with those found for Cygnus X-1 and with the published estimates. We therefore expect approximately 20-30 deg location errors for the untriggered events. Hence, many of the untriggered events occurring within a few months of the triggered activity from SGR 1900+14 are indeed consistent with the SGR source location, although Cygnus X-1 is also a good candidate.

  18. Cooking and disgust sensitivity influence preference for attending insect-based food events.

    Science.gov (United States)

    Hamerman, Eric J

    2016-01-01

    Insects are energy-efficient and sustainable sources of animal protein in a world with insufficient food resources to feed an ever-increasing population. However, much of the western world refuses to eat insects because they perceive them as disgusting. This research finds that both animal reminder disgust and core disgust reduced people's willingness to attend a program called "Bug Appétit" in which insects were served as food. Additionally, people who were low in sensitivity to animal reminder disgust were more willing to attend this program after having been primed to think about cooking. Cooking is a process by which raw ingredients are transformed into finished products, reducing the "animalness" of meat products that renders them disgusting. Sensitivity to core disgust did not interact with cooking to influence willingness to attend the program. While prior research has emphasized that direct education campaigns about the benefits of entomophagy (the consumption of insects) can increase willingness to attend events at which insect-based food is served, this is the first demonstration that indirect priming can have a similar effect among a subset of the population. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Crime event 3D reconstruction based on incomplete or fragmentary evidence material--case report.

    Science.gov (United States)

    Maksymowicz, Krzysztof; Tunikowski, Wojciech; Kościuk, Jacek

    2014-09-01

    Using our own experience in 3D analysis, the authors will demonstrate the possibilities of 3D crime scene and event reconstruction in cases where originally collected material evidence is largely insufficient. The necessity to repeat forensic evaluation is often due to the emergence of new facts in the course of case proceedings. Even in cases when a crime scene and its surroundings have undergone partial or complete transformation, with regard to elements significant to the course of the case, or when the scene was not satisfactorily secured, it is still possible to reconstruct it in a 3D environment based on the originally-collected, even incomplete, material evidence. In particular cases when no image of the crime scene is available, its partial or even full reconstruction is still potentially feasible. Credibility of evidence for such reconstruction can still satisfy the evidence requirements in court. Reconstruction of the missing elements of the crime scene is still possible with the use of information obtained from current publicly available databases. In the study, we demonstrate that these can include Google Maps®, Google Street View® and available construction and architecture archives. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Live event reconstruction in an optically read out GEM-based TPC

    Science.gov (United States)

    Brunbauer, F. M.; Galgóczi, G.; Gonzalez Diaz, D.; Oliveri, E.; Resnati, F.; Ropelewski, L.; Streli, C.; Thuiner, P.; van Stenis, M.

    2018-04-01

    Combining strong signal amplification made possible by Gaseous Electron Multipliers (GEMs) with the high spatial resolution provided by optical readout, highly performing radiation detectors can be realized. An optically read out GEM-based Time Projection Chamber (TPC) is presented. The device permits 3D track reconstruction by combining the 2D projections obtained with a CCD camera with timing information from a photomultiplier tube. Owing to the intuitive 2D representation of the tracks in the images and to automated control, data acquisition and event reconstruction algorithms, the optically read out TPC permits live display of reconstructed tracks in three dimensions. An Ar/CF4 (80/20%) gas mixture was used to maximize scintillation yield in the visible wavelength region matching the quantum efficiency of the camera. The device is integrated in a UHV-grade vessel allowing for precise control of the gas composition and purity. Long term studies in sealed mode operation revealed a minor decrease in the scintillation light intensity.

  1. Leveraging KVM Events to Detect Cache-Based Side Channel Attacks in a Virtualization Environment

    Directory of Open Access Journals (Sweden)

    Ady Wahyudi Paundu

    2018-01-01

    Full Text Available Cache-based side channel attack (CSCa) techniques in virtualization systems are becoming more advanced, while defense methods against them are still perceived as nonpractical. The most recent CSCa variant, called Flush + Flush, has shown that the current detection methods can be easily bypassed. Within this work, we introduce a novel monitoring approach to detect CSCa operations inside a virtualization environment. We utilize the Kernel Virtual Machine (KVM) event data in the kernel and process this data using a machine learning technique to identify any CSCa operation in the guest Virtual Machine (VM). We evaluate our approach using Receiver Operating Characteristic (ROC) diagrams of multiple attack and benign operation scenarios. Our method successfully separates the CSCa datasets from the non-CSCa datasets, on both trained and nontrained data scenarios. The successful classification also includes the Flush + Flush attack scenario. We are also able to explain the classification results by extracting the set of most important features that separate both classes using their Fisher scores, and show that our monitoring approach can work to detect CSCa in general. Finally, we evaluate the overhead impact of our CSCa monitoring method and show that it has a negligible computation overhead on the host and the guest VM.
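    The ROC evaluation mentioned above can be sketched as a threshold sweep over detector scores. The scores and labels below are toy data, not the paper's KVM feature set; the sketch only illustrates how (FPR, TPR) points are obtained.

```python
def roc_points(scores, labels):
    """ROC curve points (FPR, TPR) from detector scores and binary labels
    (1 = attack, 0 = benign), sweeping the decision threshold from high
    to low. Assumes both classes are present."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= thr and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= thr and y == 0)
        pts.append((fp / neg, tp / pos))
    return pts
```

    A detector that separates the classes perfectly produces a point at (0, 1), which is what "successfully separates the CSCa datasets" corresponds to on a ROC diagram.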

  2. Model predictive control-based scheduler for repetitive discrete event systems with capacity constraints

    Directory of Open Access Journals (Sweden)

    Hiroyuki Goto

    2013-07-01

    Full Text Available A model predictive control-based scheduler for a class of discrete event systems is designed and developed. We focus on repetitive, multiple-input, multiple-output, and directed acyclic graph structured systems on which capacity constraints can be imposed. The target system’s behaviour is described by linear equations in max-plus algebra, referred to as state-space representation. Assuming that the system’s performance can be improved by paying additional cost, we adjust the system parameters and determine control inputs for which the reference output signals can be observed. The main contribution of this research is twofold: (1) for systems with capacity constraints, we derived an output prediction equation as a function of adjustable variables in recursive form; (2) regarding the construct for the system’s representation, we improved the structure to accomplish general operations that are essential for adjusting the system parameters. The result of numerical simulation in a later section demonstrates the effectiveness of the developed controller.
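    The max-plus state-space representation mentioned above uses ⊕ = max and ⊗ = +, so a linear recursion like x(k) = A ⊗ x(k−1) ⊕ B ⊗ u(k) propagates event times. Below is a minimal sketch of these standard max-plus operations; the matrices in the test are made up and the sketch omits the paper's capacity constraints and prediction equation.

```python
EPS = float("-inf")  # the max-plus "zero" element

def mp_mul(A, x):
    """Max-plus matrix-vector product: (A ⊗ x)_i = max_j (A_ij + x_j)."""
    return [max(a + b for a, b in zip(row, x)) for row in A]

def mp_add(x, y):
    """Max-plus vector sum: element-wise maximum."""
    return [max(a, b) for a, b in zip(x, y)]

def step(A, B, x, u):
    """One step of the recursion x(k) = A ⊗ x(k-1) ⊕ B ⊗ u(k)."""
    return mp_add(mp_mul(A, x), mp_mul(B, u))
```

    In this reading, entry A_ij is the processing/transport delay from event j to event i, and the max captures the fact that an event fires only once all its predecessors have completed.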

  3. Position sensitive regions in a generic radiation sensor based on single event upsets in dynamic RAMs

    International Nuclear Information System (INIS)

    Darambara, D.G.; Spyrou, N.M.

    1997-01-01

    Modern integrated circuits are highly complex systems and, as such, are susceptible to occasional failures. Semiconductor memory devices, particularly dynamic random access memories (dRAMs), are subject to random, transient single event upsets (SEUs) created by energetic ionizing radiation. These radiation-induced soft failures in the stored data of silicon based memory chips provide the foundation for a new, highly efficient, low cost generic radiation sensor. The susceptibility and the detection efficiency of a given dRAM device to SEUs is a complicated function of the circuit design and geometry, the operating conditions and the physics of the charge collection mechanisms involved. Typically, soft error rates measure the cumulative response of all sensitive regions of the memory by broad area chip exposure in ionizing radiation environments. However, this study shows that many regions of a dynamic memory are competing charge collection centres having different upset thresholds. The contribution to soft fails from discrete regions or individual circuit elements of the memory device is unambiguously separated. Hence the use of the dRAM as a position sensitive radiation detector, with high spatial resolution, is assessed and demonstrated. (orig.)

  4. Comparative Effectiveness of Tacrolimus-Based Steroid Sparing versus Steroid Maintenance Regimens in Kidney Transplantation: Results from Discrete Event Simulation.

    Science.gov (United States)

    Desai, Vibha C A; Ferrand, Yann; Cavanaugh, Teresa M; Kelton, Christina M L; Caro, J Jaime; Goebel, Jens; Heaton, Pamela C

    2017-10-01

    Corticosteroids used as immunosuppressants to prevent acute rejection (AR) and graft loss (GL) following kidney transplantation are associated with serious cardiovascular and other adverse events. Evidence from short-term randomized controlled trials suggests that many patients on a tacrolimus-based immunosuppressant regimen can withdraw from steroids without increased AR or GL risk. To measure the long-term tradeoff between GL and adverse events for a heterogeneous-risk population and determine the optimal timing of steroid withdrawal. A discrete event simulation was developed including, as events, AR, GL, myocardial infarction (MI), stroke, cytomegalovirus, and new onset diabetes mellitus (NODM), among others. Data from the United States Renal Data System were used to estimate event-specific parametric regressions, which accounted for steroid-sparing regimen (avoidance, early 7-d withdrawal, 6-mo withdrawal, 12-mo withdrawal, and maintenance) as well as patients' demographics, immunologic risks, and comorbidities. Regression-equation results were used to derive individual time-to-event Weibull distributions, used, in turn, to simulate the course of patients over 20 y. Patients on steroid avoidance or an early-withdrawal regimen were more likely to experience AR (45.9% to 55.0% v. 33.6%, P events and other outcomes with no worsening of AR or GL rates compared with steroid maintenance.

  5. Establishment of nuclear knowledge and information infrastructure; establishment of web-based database system for nuclear events

    Energy Technology Data Exchange (ETDEWEB)

    Park, W. J.; Kim, K. J. [Korea Atomic Energy Research Institute , Taejeon (Korea); Lee, S. H. [Korea Institute of Nuclear Safety, Taejeon (Korea)

    2001-05-01

    Nuclear event data reported by nuclear power plants are useful for preventing nuclear accidents at the plants, by examining the causes of initiating events and removing weak points in operational safety, and for improving nuclear safety at the design and operation stages by backfitting operational experiences and practices. The 'Nuclear Event Evaluation Database' (NEED) system, previously distributed on CD-ROM media, has been upgraded to the NEED-Web (Web-based Nuclear Event Evaluation Database) version, which manages event data in a network-based database system; the event data and statistics are provided to authorized users through the Nuclear Portal Site and to the public through Internet Web services. The effort to establish the NEED-Web system will improve the integrity of event data from Korean nuclear power plants and the usability of data services, and will enhance confidence building and transparency to the public in nuclear safety. 11 refs., 27 figs. (Author)

  6. A climate-based multivariate extreme emulator of met-ocean-hydrological events for coastal flooding

    Science.gov (United States)

    Camus, Paula; Rueda, Ana; Mendez, Fernando J.; Tomas, Antonio; Del Jesus, Manuel; Losada, Iñigo J.

    2015-04-01

    Atmosphere-ocean general circulation models (AOGCMs) are useful to analyze large-scale climate variability (long-term historical periods, future climate projections). However, applications such as coastal flood modeling require climate information at finer scale. Besides, flooding events depend on multiple climate conditions: waves, surge levels from the open-ocean and river discharge caused by precipitation. Therefore, a multivariate statistical downscaling approach is adopted to reproduce relationships between variables and due to its low computational cost. The proposed method can be considered as a hybrid approach which combines a probabilistic weather type downscaling model with a stochastic weather generator component. Predictand distributions are reproduced modeling the relationship with AOGCM predictors based on a physical division in weather types (Camus et al., 2012). The multivariate dependence structure of the predictand (extreme events) is introduced linking the independent marginal distributions of the variables by a probabilistic copula regression (Ben Ayala et al., 2014). This hybrid approach is applied for the downscaling of AOGCM data to daily precipitation and maximum significant wave height and storm-surge in different locations along the Spanish coast. Reanalysis data is used to assess the proposed method. A common predictor for the three variables involved is classified using a regression-guided clustering algorithm. The most appropriate statistical model (generalized extreme value distribution, Pareto distribution) for daily conditions is fitted. Stochastic simulation of the present climate is performed, obtaining the set of hydraulic boundary conditions needed for high resolution coastal flood modeling. References: Camus, P., Menéndez, M., Méndez, F.J., Izaguirre, C., Espejo, A., Cánovas, V., Pérez, J., Rueda, A., Losada, I.J., Medina, R. (2014b). A weather-type statistical downscaling framework for ocean wave climate. Journal of

  7. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory perfor...

  8. Event-Based Impulsive Control of Continuous-Time Dynamic Systems and Its Application to Synchronization of Memristive Neural Networks.

    Science.gov (United States)

    Zhu, Wei; Wang, Dandan; Liu, Lu; Feng, Gang

    2017-08-18

    This paper investigates exponential stabilization of continuous-time dynamic systems (CDSs) via event-based impulsive control (EIC) approaches, where the impulsive instants are determined by certain state-dependent triggering condition. The global exponential stability criteria via EIC are derived for nonlinear and linear CDSs, respectively. It is also shown that there is no Zeno-behavior for the concerned closed loop control system. In addition, the developed event-based impulsive scheme is applied to the synchronization problem of master and slave memristive neural networks. Furthermore, a self-triggered impulsive control scheme is developed to avoid continuous communication between the master system and slave system. Finally, two numerical simulation examples are presented to illustrate the effectiveness of the proposed event-based impulsive controllers.
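    The event-based impulsive control idea above can be illustrated on a scalar toy system: unstable dynamics dx/dt = a·x, a state-dependent trigger |x| ≥ threshold, and an impulse x → c·x with |c| < 1 applied only at triggering instants. All parameters and the Euler discretization below are assumptions for illustration, not the paper's system or stability criteria.

```python
def simulate_eic(a=0.5, c=0.2, trigger=1.0, x0=0.8, dt=0.001, T=10.0):
    """Euler simulation of dx/dt = a*x with event-based impulsive control:
    whenever |x| reaches `trigger`, the impulse x -> c*x is applied.
    Returns (peak magnitude of the trajectory, number of impulses)."""
    x, peak, impulses = x0, abs(x0), 0
    for _ in range(int(T / dt)):
        x += dt * a * x           # unstable continuous-time dynamics
        if abs(x) >= trigger:     # state-dependent triggering condition
            x *= c                # impulsive control action
            impulses += 1
        peak = max(peak, abs(x))
    return peak, impulses
```

    Because each impulse resets the state well inside the trigger set, consecutive triggering instants are separated by the time the state needs to grow back, which is the intuition behind the absence of Zeno behavior.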

  9. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes. Published by the
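    Cohen's κ, used above for interobserver agreement, is κ = (p_o − p_e) / (1 − p_e): observed agreement corrected for the agreement expected by chance from each coder's label frequencies. A minimal sketch (the coded categories in the test are invented, not the study's codes):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' parallel label sequences."""
    n = len(coder_a)
    # observed proportion of agreement
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # chance agreement from each coder's marginal label frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[k] * freq_b[k] for k in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)
```

    Values around 0.7-1.0, as reported in the abstract, indicate good to excellent agreement beyond chance.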

  10. Microseismic Event Grouping Based on PageRank Linkage at the Newberry Volcano Geothermal Site

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2016-12-01

    The Newberry Volcano DOE FORGE site in Central Oregon has been stimulated two times using high-pressure fluid injection to study the Enhanced Geothermal Systems (EGS) technology. Several hundred microseismic events were generated during the first stimulation in the fall of 2012. Initial locations of this microseismicity do not show well defined subsurface structure in part because event location uncertainties are large (Foulger and Julian, 2013). We focus on this stimulation to explore the spatial and temporal development of microseismicity, which is key to understanding how subsurface stimulation modifies stress, fractures rock, and increases permeability. We use PageRank, Google's initial search algorithm, to determine connectivity within the events (Aguiar and Beroza, 2014) and assess signal-correlation topology for the micro-earthquakes. We then use this information to create signal families and compare these to the spatial and temporal proximity of associated earthquakes. We relocate events within families (identified by PageRank linkage) using the Bayesloc approach (Myers et al., 2007). Preliminary relocations show tight spatial clustering of event families as well as evidence of events relocating to a different cluster than originally reported. We also find that signal similarity (linkage) at several stations, not just one or two, is needed in order to determine that events are in close proximity to one another. We show that indirect linkage of signals using PageRank is a reliable way to increase the number of events that are confidently determined to be similar to one another, which may lead to efficient and effective grouping of earthquakes with similar physical characteristics, such as focal mechanisms and stress drop. 
Our ultimate goal is to determine whether changes in the state of stress and/or changes in the generation of subsurface fracture networks can be detected using PageRank topology as well as aid in the event relocation to obtain more accurate
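
The PageRank-on-correlation idea described above can be sketched in a few lines. The correlation matrix, similarity threshold, and damping factor below are invented for illustration, not values from the study:

```python
import numpy as np

def pagerank_linkage(corr, threshold=0.7, damping=0.85, iters=100):
    """Rank events by connectivity in a signal-correlation graph.

    corr: symmetric matrix of waveform cross-correlation coefficients.
    Edges link event pairs whose correlation exceeds `threshold`;
    PageRank then scores how strongly each event is directly or
    indirectly linked to the rest of the catalog.
    """
    n = corr.shape[0]
    adj = (corr >= threshold).astype(float)
    np.fill_diagonal(adj, 0.0)
    out_deg = adj.sum(axis=1)
    out_deg[out_deg == 0] = 1.0          # avoid division by zero for isolated events
    transition = adj / out_deg[:, None]  # row-stochastic transition matrix
    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        rank = (1 - damping) / n + damping * transition.T @ rank
    return rank

# Toy catalog: events 0-2 are mutually similar, event 3 is isolated.
corr = np.array([[1.0, 0.9, 0.8, 0.1],
                 [0.9, 1.0, 0.85, 0.2],
                 [0.8, 0.85, 1.0, 0.15],
                 [0.1, 0.2, 0.15, 1.0]])
rank = pagerank_linkage(corr)
# The three correlated events receive higher rank than the isolated one.
```

Events with high rank would then be grouped into signal families and passed to a relocation step, in the spirit of the workflow above.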

  11. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    Science.gov (United States)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and are missing ready-to-use analytical expressions that are analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions called VICx, PDMx, and TOPMODELx also are extended with a spatial description of the runoff concept of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
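
The SCS-CN runoff curve that anchors this framework is simple enough to state directly. A minimal sketch of the standard formulation, with the common initial-abstraction ratio of 0.2 (the rainfall and curve number below are example values):

```python
def scs_cn_runoff(P, CN, lam=0.2):
    """Event runoff depth (inches) from the SCS-CN rainfall-runoff curve.

    P  : cumulative storm rainfall (inches)
    CN : curve number encoding antecedent wetness and land condition
    lam: initial-abstraction ratio (standard value 0.2)
    """
    S = 1000.0 / CN - 10.0      # potential maximum retention
    Ia = lam * S                # initial abstraction before runoff begins
    if P <= Ia:
        return 0.0              # "prethreshold": no runoff yet
    return (P - Ia) ** 2 / (P - Ia + S)

# 4 inches of rain on a watershed with CN = 80:
q = scs_cn_runoff(4.0, 80)   # about 2.04 inches of runoff
```

The semidistributed variants (VICx, PDMx, TOPMODELx) replace the single storage capacity implied by CN with a distribution of capacities over the watershed, but reduce to the same kind of rainfall-runoff curve.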

  12. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, M.; Gallagher, D. L.; Whitt, A.; Six, N. Frank (Technical Monitor)

    2002-01-01

    For the past four years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of science communication through web resources on the Internet. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. We give here examples of events, problems, and lessons learned from these activities.

  13. Distinct and shared cognitive functions mediate event- and time-based prospective memory impairment in normal ageing

    Science.gov (United States)

    Gonneaud, Julie; Kalpouzos, Grégoria; Bon, Laetitia; Viader, Fausto; Eustache, Francis; Desgranges, Béatrice

    2011-01-01

    Prospective memory (PM) is the ability to remember to perform an action at a specific point in the future. Regarded as multidimensional, PM involves several cognitive functions that are known to be impaired in normal aging. In the present study, we set out to investigate the cognitive correlates of PM impairment in normal aging. Manipulating cognitive load, we assessed event- and time-based PM, as well as several cognitive functions, including executive functions, working memory and retrospective episodic memory, in healthy subjects spanning the entire adult lifespan. We found that normal aging was characterized by PM decline in all conditions and that event-based PM was more sensitive to the effects of aging than time-based PM. Whatever the conditions, PM was linked to inhibition and processing speed. However, while event-based PM was mainly mediated by binding and retrospective memory processes, time-based PM was mainly related to inhibition. The only distinction between the cognitive correlates of high- and low-load PM lies in an additional, but marginal, correlation between updating and the high-load PM condition. The association of distinct cognitive functions, as well as shared mechanisms, with event- and time-based PM confirms that each type of PM relies on a different set of processes. PMID:21678154

  14. A new method to detect event-related potentials based on Pearson's correlation.

    Science.gov (United States)

    Giroldini, William; Pederzoli, Luciano; Bilucaglia, Marco; Melloni, Simone; Tressoldi, Patrizio

    2016-12-01

    Event-related potentials (ERPs) are widely used in brain-computer interface applications and in neuroscience. Normal EEG activity is rich in background noise, and therefore, in order to detect ERPs, it is usually necessary to average multiple trials to reduce the effects of this noise. The noise produced by EEG activity itself is not correlated with the ERP waveform, so by averaging, the noise is decreased by a factor inversely proportional to the square root of N, where N is the number of averaged epochs. This is the simplest strategy currently used to detect ERPs: averaging all the ERP waveforms, these waveforms being time- and phase-locked. In this paper, a new method called GW6 is proposed, which calculates the ERP using a mathematical method based only on Pearson's correlation. The result is a graph with the same time resolution as the classical ERP, showing only positive peaks that represent the increase (in consonance with the stimuli) in EEG signal correlation over all channels. This new method is also useful for selectively identifying and highlighting some hidden components of the ERP response that are not phase-locked and that are usually masked by the standard method of averaging all the epochs. These hidden components seem to be caused by variations (between each successive stimulus) of the ERP's inherent phase latency period (jitter), although the same stimulus across all EEG channels produces a reasonably constant phase. For this reason, this new method could be very helpful for investigating these hidden components of the ERP response and for developing applications for scientific and medical purposes. Moreover, this new method is more resistant to EEG artifacts than the standard averaging calculation and could be very useful in research and neurology. The method we are proposing can be directly used in the form of a process written in the well
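
The 1/√N noise-reduction argument behind classical epoch averaging is easy to check numerically. In this toy sketch the ERP shape, noise level, and epoch counts are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 250)
erp = 2.0 * np.exp(-((t - 0.3) ** 2) / 0.002)   # idealized phase-locked ERP peak

def residual_noise(n_epochs, noise_sd=5.0):
    """Std. dev. of the residual background noise after averaging n_epochs trials."""
    epochs = erp + rng.normal(0, noise_sd, size=(n_epochs, t.size))
    avg = epochs.mean(axis=0)
    return (avg - erp).std()

# Averaging 4x as many epochs roughly halves the residual noise (1/sqrt(N)):
r100 = residual_noise(100)
r400 = residual_noise(400)
```

Any non-phase-locked ERP component would be attenuated by exactly this averaging, which is the motivation for the correlation-based GW6 alternative described above.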

  15. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
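
A minimal sketch of edge-event-triggered sampled-data consensus for single integrators, assuming a simple drift-threshold trigger rather than the paper's exact event-detecting rules (the topology, threshold, and step size are invented):

```python
import numpy as np

def edge_event_consensus(x0, edges, eps=0.02, dt=0.01, steps=2000):
    """Sampled-data consensus of single integrators with edge-triggered sampling.

    Each edge (i, j) stores the states sampled at its last edge event and
    re-samples only when either endpoint has drifted more than `eps` from
    its stored value -- a stand-in for the paper's edge-event rules.
    Returns the final state vector and the total number of edge events.
    """
    x = np.asarray(x0, dtype=float)
    sampled = {e: (x[e[0]], x[e[1]]) for e in edges}
    events = len(edges)  # initial sampling on every edge
    for _ in range(steps):
        u = np.zeros_like(x)
        for (i, j) in edges:
            si, sj = sampled[(i, j)]
            if abs(x[i] - si) > eps or abs(x[j] - sj) > eps:
                sampled[(i, j)] = (x[i], x[j])   # edge event: mutual re-sampling
                si, sj = sampled[(i, j)]
                events += 1
            u[i] += sj - si
            u[j] += si - sj
        x = x + dt * u
    return x, events

x, n_events = edge_event_consensus([1.0, 0.0, -1.0], edges=[(0, 1), (1, 2)])
# States converge near the average (0.0) while triggering far fewer
# edge events than the 2 edges * 2000 steps a per-step scheme would use.
```

The trigger is local to each edge, which is the sense in which such rules can be implemented in a distributed fashion.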

  16. Data Prediction for Public Events in Professional Domains Based on Improved RNN-LSTM

    Science.gov (United States)

    Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan

    2018-02-01

    The traditional data services for predicting emergency or non-periodic events usually cannot generate satisfying results or fulfill the correct prediction purpose. However, these events are influenced by external causes, which means that certain a priori information about them generally can be collected through the Internet. This paper studied the above problems and proposed an improved model, an LSTM (Long Short-term Memory) dynamic prediction and a priori information sequence generation model, by combining RNN-LSTM with a priori information about public events. In prediction tasks, the model is qualified for determining trends, and its accuracy is also validated. This model generates better performance and prediction results than the previous one. Using a priori information can increase the accuracy of prediction; LSTM can better adapt to changes in a time sequence; and LSTM can be widely applied to the same type of prediction task as well as other prediction tasks related to time sequences.

  17. Sustainable Cultural Events Based on Marketing Segmentation: The Case of Faro Capital of Culture

    Directory of Open Access Journals (Sweden)

    Patricia Oom do Valle

    2010-04-01

    Full Text Available The city of Faro was designated by the Portuguese government as the 2005 National Capital of Culture. The Faro 2005 National Capital of Culture took place between May and December in several cities of the Algarve region, with most events occurring in Faro. The programme consisted of 185 different performances spanning music, cinema, theatre, ballet and plastic arts. The paper analyses segments of the population that participated in the Faro 2005 event and discusses the relation between the event's success and the degree of satisfaction of the participants. The contribution of the paper lies in pointing out the importance of an adequate marketing approach for large-scale events, such as cultural events, in order to achieve greater audience appeal and impact and thereby ensure sustainability.

  18. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\infty}$ Control.

    Science.gov (United States)

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data and event driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data driven learning identifier is combined with the event driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event driven optimal control law and the time driven worst case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue of integrating data-based control and event-triggering mechanism into establishing advanced adaptive critic systems.

  19. Single event monitoring system based on Java 3D and XML data binding

    International Nuclear Information System (INIS)

    Wang Liang; Chinese Academy of Sciences, Beijing; Zhu Kejun; Zhao Jingwei

    2007-01-01

    Online single event monitoring is important to the BESIII DAQ System. Java3D is an extension of the Java language for 3D graphics, and XML data binding handles XML documents more efficiently than SAX and DOM. This paper mainly introduces the implementation of the BESIII single event monitoring system with Java3D and XML data binding, and the interface to the track-fitting software via JNI technology. (authors)

  20. Microseismic Event Relocation and Focal Mechanism Estimation Based on PageRank Linkage

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2017-12-01

    Microseismicity associated with enhanced geothermal systems (EGS) is key in understanding how subsurface stimulation can modify stress, fracture rock, and increase permeability. Large numbers of microseismic events are commonly associated with hydroshearing an EGS, making data mining methods useful in their analysis. We focus on PageRank, originally developed as Google's search engine, and subsequently adapted for use in seismology to detect low-frequency earthquakes by linking events directly and indirectly through cross-correlation (Aguiar and Beroza, 2014). We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes from the Newberry Volcano EGS in Central Oregon, which has been stimulated two times using high-pressure fluid injection. We create PageRank signal families from both data sets and compare these to the spatial and temporal proximity of associated earthquakes. PageRank families are relocated using differential travel times measured by waveform cross-correlation (CC) and the Bayesloc approach (Myers et al., 2007). Prior to relocation, events are loosely clustered, with some events at a distance from the cluster. After relocation, event families are found to be tightly clustered. Indirect linkage of signals using PageRank is a reliable way to increase the number of events confidently determined to be similar, suggesting an efficient and effective grouping of earthquakes with similar physical characteristics (i.e., location, focal mechanism, stress drop). We further explore the possibility of using PageRank families to identify events with similar relative phase polarities and estimate focal mechanisms following the method of Shelly et al. (2016), where CC measurements are used to determine individual polarities within event clusters. Given a positive result, PageRank might be a useful tool in adaptive approaches to enhance production at well-instrumented geothermal sites. 
Prepared by LLNL under Contract DE-AC52-07NA27344

  1. How to model mutually exclusive events based on independent causal pathways in Bayesian network models

    OpenAIRE

    Fenton, N.; Neil, M.; Lagnado, D.; Marsh, W.; Yet, B.; Constantinou, A.

    2016-01-01

    We show that existing Bayesian network (BN) modelling techniques cannot capture the correct intuitive reasoning in the important case when a set of mutually exclusive events need to be modelled as separate nodes instead of states of a single node. A previously proposed ‘solution’, which introduces a simple constraint node that enforces mutual exclusivity, fails to preserve the prior probabilities of the events, while other proposed solutions involve major changes to the original model. We pro...
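
The failure mode described above, that a simple mutual-exclusivity constraint node distorts the prior marginals, can be reproduced by direct enumeration; the priors 0.7 and 0.2 below are arbitrary example values:

```python
from itertools import product

# Priors for two events modelled as separate (independent) BN nodes.
p_a, p_b = 0.7, 0.2

# Enumerate joint states and condition on a constraint node
# "not (A and B)" that is meant to enforce mutual exclusivity.
posterior_a = 0.0
z = 0.0
for a, b in product([True, False], repeat=2):
    p = (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
    if not (a and b):          # constraint observed to be satisfied
        z += p
        if a:
            posterior_a += p
posterior_a /= z

# posterior_a is about 0.651, not the intended prior 0.7: conditioning on
# the constraint has shifted the marginal, which is the failure the
# abstract describes.
```

Any solution that preserves the priors must therefore do more than bolt a constraint node onto the original model.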

  2. OGLE-2016-BLG-0168 Binary Microlensing Event: Prediction and Confirmation of the Microlens Parallax Effect from Space-based Observations

    Energy Technology Data Exchange (ETDEWEB)

    Shin, I.-G.; Yee, J. C.; Jung, Y. K. [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Udalski, A.; Skowron, J.; Mróz, P.; Soszyński, I.; Poleski, R.; Szymański, M. K.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M. [Warsaw University Observatory, Al. Ujazdowskie 4,00-478 Warszawa (Poland); Novati, S. Calchi [IPAC, Mail Code 100-22, California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Han, C. [Department of Physics, Chungbuk National University, Cheongju 371-763 (Korea, Republic of); Albrow, M. D. [University of Canterbury, Department of Physics and Astronomy, Private Bag 4800, Christchurch 8020 (New Zealand); Gould, A. [Department of Astronomy, Ohio State University, 140 W. 18th Avenue, Columbus, OH 43210 (United States); Chung, S.-J.; Hwang, K.-H.; Ryu, Y.-H. [Korea Astronomy and Space Science Institute, 776 Daedeokdae-ro, Yuseong-Gu, Daejeon 34055 (Korea, Republic of); Collaboration: OGLE Collaboration; KMTNet Group; Spitzer Team; and others

    2017-11-01

    The microlens parallax is a crucial observable for conclusively identifying the nature of lens systems in microlensing events containing or composed of faint (even dark) astronomical objects such as planets, neutron stars, brown dwarfs, and black holes. With the commencement of a new era of microlensing in collaboration with space-based observations, the microlens parallax can be routinely measured. In addition, space-based observations can provide opportunities to verify the microlens parallax measured from ground-only observations and to find a unique solution to the lensing light-curve analysis. Furthermore, since most space-based observations cannot cover the full light curves of lensing events, it is also necessary to verify the reliability of the information extracted from fragmentary space-based light curves. We conduct a test based on the microlensing event OGLE-2016-BLG-0168, created by a binary lens system consisting of almost equal mass M-dwarf stars, to demonstrate that it is possible to verify the microlens parallax and to resolve degeneracies using the space-based light curve even though the observations are fragmentary. Since space-based observatories will frequently produce fragmentary light curves due to their short observing windows, the methodology of this test will be useful for next-generation microlensing experiments that combine space-based and ground-based collaboration.

  3. Adverse life events increase risk for postpartum psychiatric episodes: A population-based epidemiologic study.

    Science.gov (United States)

    Meltzer-Brody, S; Larsen, J T; Petersen, L; Guintivano, J; Florio, A Di; Miller, W C; Sullivan, P F; Munk-Olsen, T

    2018-02-01

    Trauma histories may increase risk of perinatal psychiatric episodes. We designed an epidemiological population-based cohort study to explore if adverse childhood experiences (ACE) in girls increases risk of later postpartum psychiatric episodes. Using Danish registers, we identified women born in Denmark between January 1980 and December 1998 (129,439 childbirths). Exposure variables were ACE between ages 0 and 15 including: (1) family disruption, (2) parental somatic illness, (3) parental labor market exclusion, (4) parental criminality, (5) parental death, (6) placement in out-of-home care, (7) parental psychopathology excluding substance use, and (8) parental substance use disorder. Primary outcome was first occurrence of in- or outpatient contact 0-6 months postpartum at a psychiatric treatment facility with any psychiatric diagnoses, ICD-10, F00-F99 (N = 651). We conducted survival analyses using Cox proportional hazard regressions of postpartum psychiatric episodes. Approximately 52% of the sample experienced ACE, significantly increasing risk of any postpartum psychiatric diagnosis. Highest risks were observed among women who experienced out-of-home placement, hazard ratio (HR) 2.57 (95% CI: 1.90-3.48). Women experiencing two adverse life events had higher risks of postpartum psychiatric diagnosis HR: 1.88 (95% CI: 1.51-2.36), compared to those with one ACE, HR: 1.24 (95% CI: 1.03-1.49) and no ACE, HR: 1.00 (reference group). ACE primarily due to parental psychopathology and disability contributes to increased risk of postpartum psychiatric episodes; and greater numbers of ACE increases risk for postpartum psychiatric illness with an observed dose-response effect. Future work should explore genetic and environmental factors that increase risk and/or confer resilience. © 2017 Wiley Periodicals, Inc.

  4. Ground-based solar radio observations of the August 1972 events

    International Nuclear Information System (INIS)

    Bhonsle, R.V.; Degaonkar, S.S.; Alurkar, S.K.

    1976-01-01

    Ground-based observations of the variable solar radio emission ranging from a few millimetres to decametres have been used here as a diagnostic tool to gain a coherent phenomenological understanding of the great 2, 4 and 7 August 1972 solar events in terms of dominant physical processes like the generation and propagation of shock waves in the solar atmosphere, particle acceleration and trapping. Four major flares are selected for detailed analysis on the basis of their ability to produce energetic protons, shock waves, polar cap absorptions (PCA) and sudden commencement (SC) geomagnetic storms. A comparative study of their radio characteristics is made. Evidence is seen for pulsations during microwave bursts by a mechanism similar to that proposed by McLean et al. (1971) to explain the pulsations in the metre-wavelength continuum radiation. It is suggested that the multiple peaks observed in some microwave bursts may be attributable to individual flares occurring sequentially due to a single initiating flare. Attempts have been made to establish the identification of Type II bursts with interplanetary shock waves and SC geomagnetic storms. Furthermore, it is suggested that it is the mass behind the shock front which is the deciding factor for the detection of shock waves in interplanetary space. It appears that more work is necessary in order to identify which of the three moving Type IV bursts (Wild and Smerd, 1972), namely, advancing shock front, expanding magnetic arch and ejected plasma blob, serves as the piston-driver behind the interplanetary shocks. The existing criteria for proton flare prediction have been summarized and two new criteria have been proposed. (Auth.)

  5. Optically-based Sensor System for Critical Nuclear Facilities Post-Event Seismic Structural Assessment

    Energy Technology Data Exchange (ETDEWEB)

    McCallen, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Petrone, Floriana [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Buckle, Ian [Univ. of Nevada, Reno, NV (United States); Wu, Suiwen [Univ. of Nevada, Reno, NV (United States); Coates, Jason [California State Univ., Chico, CA (United States)

    2017-09-30

    The U.S. Department of Energy (DOE) has ownership and operational responsibility for a large enterprise of nuclear facilities that provide essential functions to DOE missions ranging from national security to discovery science and energy research. These facilities support a number of DOE programs and offices including the National Nuclear Security Administration, Office of Science, and Office of Environmental Management. With many unique and “one of a kind” functions, these facilities represent a tremendous national investment, and assuring their safety and integrity is fundamental to the success of a breadth of DOE programs. Many DOE critical facilities are located in regions with significant natural phenomenon hazards including major earthquakes and DOE has been a leader in developing standards for the seismic analysis of nuclear facilities. Attaining and sustaining excellence in nuclear facility design and management must be a core competency of the DOE. An important part of nuclear facility management is the ability to monitor facilities and rapidly assess the response and integrity of the facilities after any major upset event. Experience in the western U.S. has shown that understanding facility integrity after a major earthquake is a significant challenge which, lacking key data, can require extensive effort and significant time. In the work described in the attached report, a transformational approach to earthquake monitoring of facilities is described and demonstrated. An entirely new type of optically-based sensor that can directly and accurately measure the earthquake-induced deformations of a critical facility has been developed and tested. This report summarizes large-scale shake table testing of the sensor concept on a representative steel frame building structure, and provides quantitative data on the accuracy of the sensor measurements.

  6. Identify alternative splicing events based on position-specific evolutionary conservation.

    Directory of Open Access Journals (Sweden)

    Liang Chen

    Full Text Available The evolution of eukaryotes is accompanied by the increased complexity of alternative splicing, which greatly expands genome information. One of the greatest challenges in the post-genome era is a complete revelation of the human transcriptome with consideration of alternative splicing. Here, we introduce a comparative genomics approach to systematically identify alternative splicing events based on the differential evolutionary conservation between exons and introns and the high-quality annotation of the ENCODE regions. Specifically, we focus on exons that are included in some transcripts but are completely spliced out for others, and we call them conditional exons. First, we characterize distinguishing features among conditional exons, constitutive exons and introns. One of the most important features is the position-specific conservation score. There are dramatic differences in conservation scores between conditional exons and constitutive exons. More importantly, the differences are position-specific. For flanking intronic regions, the differences between conditional exons and constitutive exons are also position-specific. Using the Random Forests algorithm, we can classify conditional exons with high specificities (97% for the identification of conditional exons from intron regions and 95% for the classification of known exons) and fair sensitivities (64% and 32%, respectively). We applied the method to the human genome and identified 39,640 introns that actually contain conditional exons and classified 8,813 conditional exons from the current RefSeq exon list. Among those, 31,673 introns containing conditional exons and 5,294 conditional exons classified from known exons cannot be inferred from RefSeq, UCSC or Ensembl annotations. Some of these de novo predictions were experimentally verified.
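
A hedged sketch of the classification step with scikit-learn, using synthetic conservation scores in place of the study's real position-specific features (the class means, spreads, and feature count are invented):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 600

# Synthetic stand-in for position-specific conservation scores: conditional
# exons are modelled as less conserved than constitutive exons at each of
# 10 positions (the real study uses many more, position-indexed features).
constitutive = rng.normal(0.9, 0.05, size=(n, 10))
conditional = rng.normal(0.7, 0.10, size=(n, 10))
X = np.vstack([constitutive, conditional])
y = np.array([0] * n + [1] * n)   # 1 = conditional exon

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)   # near-perfect on this idealized separation
```

Real conservation scores overlap far more between classes, which is why the study reports high specificity but only fair sensitivity.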

  7. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    Science.gov (United States)

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
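
The paper's solution is SAS-based; the core idea can be sketched in Python as a toy discrete event simulation in which enrollment capacity (3 vs. 6 concurrent patients under observation) drives study duration. The arrival rate, observation window, and enrollment target below are invented:

```python
import random

def simulate_trial(max_concurrent, n_patients=24, mean_gap=5.0,
                   obs_window=28.0, seed=1):
    """Toy discrete event simulation of phase I accrual.

    Patients arrive with exponential inter-arrival times (`mean_gap` days)
    and must be observed for `obs_window` days. The classic 3+3 design
    suspends enrollment while a cohort of 3 is under observation
    (max_concurrent=3); the "rolling 6" allows up to 6 concurrent patients.
    Returns the day the last observation completes.
    """
    rng = random.Random(seed)
    t = 0.0
    in_obs = []            # completion times of patients under observation
    enrolled = 0
    while enrolled < n_patients:
        t += rng.expovariate(1.0 / mean_gap)      # next patient arrives
        in_obs = [c for c in in_obs if c > t]     # drop completed observations
        if len(in_obs) < max_concurrent:
            in_obs.append(t + obs_window)         # enroll and start observing
            enrolled += 1
        # otherwise the patient is turned away (not queued, for simplicity)
    return max(in_obs)

d33 = simulate_trial(max_concurrent=3)
d_r6 = simulate_trial(max_concurrent=6)
# Allowing 6 concurrent patients shortens the study in this toy model.
```

A fuller version would layer probability-based DLT and inevaluability assignment on top of these time-based events, as the abstract describes, and repeat the simulation many times to compare design metrics.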

  8. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of the codified structural information of molecules. Finally, a comparison of the statistics for QSPR models of two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes shows that the proposed IFIs have similar or better predictive ability than DRAGON's molecular descriptors.
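
The first computational step, obtaining a Shannon entropy from a row of the relations frequency matrix F, is easy to sketch; the substructure frequencies below are hypothetical:

```python
import math

def shannon_entropy(freqs):
    """Shannon entropy (bits) of a frequency vector, as used when deriving
    information indices from a relations frequency matrix F."""
    total = sum(freqs)
    return -sum((f / total) * math.log2(f / total) for f in freqs if f > 0)

# Hypothetical frequencies of substructure "events" found in a molecule,
# e.g. counts of connected subgraphs of each size:
F_row = [8, 4, 2, 2]
h = shannon_entropy(F_row)   # 1.75 bits
```

The mutual, conditional, and joint entropy-based indices are built analogously from joint frequency tables rather than a single vector.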

  9. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    Science.gov (United States)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japanese tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool which can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand and structural capacity of the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology. Monte Carlo methodology is a probabilistic computational tool where the governing equations might be well known, but the independent variables of the input (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach uses a distribution of each variable, and then uses that random variable within the described parameters, to generate a single computation. The process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporated the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once
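
A minimal sketch of the Monte Carlo demand-versus-capacity comparison; every distribution and parameter value below is invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Demand: drag force on a mooring component, F = 0.5 * rho * Cd * A * U^2,
# with uncertain current speed, drag coefficient, and projected area.
rho = 1025.0                                   # seawater density, kg/m^3
U = rng.normal(2.0, 0.5, n).clip(min=0.0)      # tsunami current speed, m/s
Cd = rng.uniform(0.8, 1.2, n)                  # drag coefficient
A = rng.normal(8.0, 1.0, n)                    # projected vessel area, m^2
demand = 0.5 * rho * Cd * A * U**2             # N

# Capacity: as-new component strength degraded by an age-dependent
# reduction factor, as estimated from harbor inspections.
nominal = rng.normal(40_000.0, 5_000.0, n)     # as-new capacity, N
reduction = rng.uniform(0.5, 1.0, n)           # condition reduction factor
capacity = nominal * reduction

p_fail = np.mean(demand > capacity)            # probability demand exceeds capacity
```

In the actual methodology the current speeds would come from MOST hydrodynamic output and the capacities from the component tables built during the inspection program.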

  10. Positive predictive value of a register-based algorithm using the Danish National Registries to identify suicidal events.

    Science.gov (United States)

    Gasse, Christiane; Danielsen, Andreas Aalkjaer; Pedersen, Marianne Giørtz; Pedersen, Carsten Bøcker; Mors, Ole; Christensen, Jakob

    2018-04-17

    It is not possible to fully assess intention of self-harm and suicidal events using information from administrative databases. We conducted a validation study of intention of suicide attempts/self-harm contacts identified by a commonly applied Danish register-based algorithm (DK-algorithm) based on hospital discharge diagnosis and emergency room contacts. Of all 101 530 people identified with an incident suicide attempt/self-harm contact at Danish hospitals between 1995 and 2012 using the DK-algorithm, we selected a random sample of 475 people. We validated the DK-algorithm against medical records applying the definitions and terminology of the Columbia Classification Algorithm of Suicide Assessment of suicidal events, nonsuicidal events, and indeterminate or potentially suicidal events. We calculated positive predictive values (PPVs) of the DK-algorithm to identify suicidal events overall, by gender, age groups, and calendar time. We retrieved medical records for 357 (75%) people. The PPV of the DK-algorithm to identify suicidal events was 51.5% (95% CI: 46.4-56.7) overall, 42.7% (95% CI: 35.2-50.5) in males, and 58.5% (95% CI: 51.6-65.1) in females. The PPV varied further across age groups and calendar time. After excluding cases identified via the DK-algorithm by unspecific codes of intoxications and injury, the PPV improved slightly (56.8% [95% CI: 50.0-63.4]). The DK-algorithm can reliably identify self-harm with suicidal intention in 52% of the identified cases of suicide attempts/self-harm. The PPVs could be used for quantitative bias analysis and implemented as weights in future studies to estimate the proportion of suicidal events among cases identified via the DK-algorithm. Copyright © 2018 John Wiley & Sons, Ltd.
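    The PPV itself is a simple proportion, and the reported confidence limits can be reproduced with a Wilson score interval. The count of 184 confirmed suicidal events below is inferred from the reported 51.5% of 357 reviewed records, and the paper's exact interval method is not stated, so this is a plausible reconstruction rather than the authors' computation:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a proportion such as a PPV.

    successes/n is the point estimate; the interval is the Wilson score
    interval at confidence level implied by z (1.96 -> 95%).
    """
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, centre - half, centre + half

# 184 of 357 reviewed records confirmed as suicidal events (inferred counts).
ppv, lo, hi = wilson_ci(184, 357)
```

    With these counts the interval comes out at roughly 46.4-56.7%, matching the abstract's overall figure.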

  11. Atherosclerosis profile and incidence of cardiovascular events: a population-based survey

    Directory of Open Access Journals (Sweden)

    Bullano Michael F

    2009-09-01

    Full Text Available Abstract Background Atherosclerosis is a chronic progressive disease often presenting as clinical cardiovascular disease (CVD) events. This study evaluated the characteristics of individuals with a diagnosis of atherosclerosis and estimated the incidence of CVD events to assist in the early identification of high-risk individuals. Methods Respondents to the US SHIELD baseline survey were followed for 2 years to observe incident self-reported CVD. Respondents had subclinical atherosclerosis if they reported a diagnosis of narrow or blocked arteries/carotid artery disease without a past clinical CVD event (heart attack, stroke or revascularization). Characteristics of those with atherosclerosis and incident CVD were compared with those who did not report atherosclerosis at baseline but had CVD in the following 2 years using chi-square tests. A logistic regression model identified characteristics associated with atherosclerosis and incident events. Results Of 17,640 respondents, 488 (2.8%) reported having subclinical atherosclerosis at baseline. Subclinical atherosclerosis was associated with age, male gender, dyslipidemia, circulation problems, hypertension, past smoking, and a cholesterol test in the past year (OR = 2.2 [all p Conclusion Self-report of subclinical atherosclerosis identified an extremely high-risk group with a >25% risk of a CVD event in the next 2 years. These characteristics may be useful for identifying individuals for more aggressive diagnostic and therapeutic efforts.

  12. Prognostic table for predicting major cardiac events based on J-ACCESS investigation

    International Nuclear Information System (INIS)

    Nakajima, Kenichi; Nishimura, Tsunehiko

    2008-01-01

    The event risk of patients with coronary heart disease may be estimated from a large-scale prognostic database in a Japanese population. The aim of this study was to create a heart risk table for predicting the major cardiac event rate. Using the Japanese Assessment of Cardiac Events and Survival Study (J-ACCESS) database, created by a prognostic investigation involving 117 hospitals and >4000 patients in Japan, multivariate logistic regression analysis was performed. The major event rate over a 3-year period, comprising cardiac death, non-fatal myocardial infarction, and severe heart failure requiring hospitalization, was predicted by the logistic regression equation. The algorithm for calculating the event rate was simplified so that it could be presented as tables. Two tables were created to calculate cardiac risk by age, perfusion score category, and ejection fraction, with and without the presence of diabetes. A relative risk table comparing age-matched control subjects was also made. When the simplified tables were compared with the results from the original logistic regression analysis, both risk values and relative risks agreed well (P<0.0001 for both). The Heart Risk Table was created for patients suspected of having ischemic heart disease who underwent myocardial perfusion gated single-photon emission computed tomography. The validity of risk assessment using the J-ACCESS database should be confirmed in a future study. (author)
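    The structure of such a risk table follows directly from the logistic equation. The sketch below uses hypothetical coefficients (the actual J-ACCESS regression coefficients are not given in the abstract) to show how a 3-year major event rate would be computed from age, perfusion score, ejection fraction, and diabetes status:

```python
from math import exp

def major_event_rate_3yr(age, summed_stress_score, ef, diabetes):
    """Sketch of the kind of logistic equation behind a Heart Risk Table.

    The coefficients below are hypothetical placeholders chosen only to
    illustrate the form rate = 1 / (1 + exp(-linear predictor)); they are
    not the fitted J-ACCESS values.
    """
    x = (-6.0
         + 0.04 * age                      # risk rises with age
         + 0.08 * summed_stress_score      # worse perfusion -> higher risk
         - 0.03 * ef                       # higher ejection fraction -> lower risk
         + 0.7 * (1 if diabetes else 0))   # diabetes adds risk
    return 1.0 / (1.0 + exp(-x))
```

    Tabulating this function over age bands, perfusion score categories, and ejection fraction bins, with and without diabetes, reproduces the two-table layout described in the abstract.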

  13. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
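    The K-nearest-neighbor hyperedge construction with probabilistic vertex assignments can be sketched as follows. The Gaussian soft assignment and parameter values are common choices assumed here, and the paper's hyperedge-weight optimization (a second-order cone program) is omitted:

```python
import numpy as np

def knn_hyperedges(features, k=3, sigma=1.0):
    """Build a probabilistic hypergraph incidence matrix: one hyperedge per
    media item, containing that item and its k nearest neighbours, with
    soft (probabilistic) vertex-to-hyperedge assignments.

    Simplified sketch; the learned hyperedge weights are not modeled.
    """
    X = np.asarray(features, dtype=float)
    n = X.shape[0]
    # Pairwise squared Euclidean distances between media feature vectors.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    H = np.zeros((n, n))                    # rows: vertices, cols: hyperedges
    for e in range(n):
        nbrs = np.argsort(d2[e])[:k + 1]    # the centroid item plus its k NNs
        # Probabilistic assignment: closer vertices get weight nearer 1.
        H[nbrs, e] = np.exp(-d2[e, nbrs] / (sigma ** 2))
    return H
```

    Ranking then proceeds by transductive inference on this incidence structure, seeded with a few known-relevant media items.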

  14. Science-based risk assessments for rare events in a changing climate

    Science.gov (United States)

    Sobel, A. H.; Tippett, M. K.; Camargo, S. J.; Lee, C. Y.; Allen, J. T.

    2014-12-01

    History shows that substantial investments in protection against any specific type of natural disaster usually occur only after (usually shortly after) that specific type of disaster has happened in a given place. This is true even when it was well known before the event that there was a significant risk that it could occur. Presumably what psychologists Kahneman and Tversky have called "availability bias" is responsible, at least in part, for these failures to act on known but out-of-sample risks. While understandable, this human tendency prepares us poorly for events which are very rare (on the time scales of human lives) and even more poorly for a changing climate, as historical records become a poorer guide. A more forward-thinking and rational approach would require scientific risk assessments that can place meaningful probabilities on events that are rare enough to be absent from the historical record, and that can account for the influences of both anthropogenic climate change and low-frequency natural climate variability. The set of tools available for doing such risk assessments is still quite limited, particularly for some of the most extreme events such as tropical cyclones and tornadoes. We will briefly assess the state of the art for these events in particular, and describe some of our ongoing research to develop new tools for quantitative risk assessment using hybrids of statistical methods and physical understanding of the hazards.

  15. Tracking the evolution of stream DOM source during storm events using end member mixing analysis based on DOM quality

    Science.gov (United States)

    Yang, Liyang; Chang, Soon-Woong; Shin, Hyun-Sang; Hur, Jin

    2015-04-01

    The source of river dissolved organic matter (DOM) during storm events has not been well constrained, which is critical in determining the quality and reactivity of DOM. This study assessed temporal changes in the contributions of four end members (weeds, leaf litter, soil, and groundwater), which exist in a small forested watershed (the Ehwa Brook, South Korea), to the stream DOM during two storm events, using end member mixing analysis (EMMA) based on spectroscopic properties of DOM. The instantaneous export fluxes of dissolved organic carbon (DOC), chromophoric DOM (CDOM), and fluorescent components were all enhanced during peak flows. The DOC concentration increased with the flow rate, while CDOM and humic-like fluorescent components were diluted around the peak flows. Leaf litter was dominant for the DOM source in event 2 with a higher rainfall, although there were temporal variations in the contributions of the four end members to the stream DOM for both events. The contribution of leaf litter peaked while that of deeper soils decreased to minima at peak flows. Our results demonstrated that EMMA based on DOM properties could be used to trace the DOM source, which is of fundamental importance for understanding the factors responsible for river DOM dynamics during storm events.
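    At its core, EMMA solves a small mixing problem: find end-member fractions whose combined tracer signature best matches the stream sample, with the fractions summing to one. A minimal sketch, with the mass-balance constraint imposed as a heavily weighted extra least-squares equation; the tracer values are placeholders for the spectroscopic DOM properties used in the paper:

```python
import numpy as np

def emma_fractions(tracers_endmembers, tracers_stream, weight=100.0):
    """End member mixing analysis (EMMA) sketch.

    tracers_endmembers: (n_tracers, n_members) matrix, one column of tracer
    values per end member (weeds, leaf litter, soil, groundwater, ...).
    tracers_stream: (n_tracers,) tracer values of the stream sample.
    Returns the mixing fractions f solving A f ~= b with sum(f) ~= 1,
    the latter enforced as a heavily weighted extra row.
    """
    A = np.asarray(tracers_endmembers, dtype=float)
    b = np.asarray(tracers_stream, dtype=float)
    A_aug = np.vstack([A, weight * np.ones(A.shape[1])])
    b_aug = np.append(b, weight * 1.0)
    f, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    return f
```

    Solving this at each sampling time during a storm yields the time series of end-member contributions described in the abstract.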

  16. Cardiovascular Events in Cancer Patients Treated with Highly or Moderately Emetogenic Chemotherapy: Results from a Population-Based Study

    International Nuclear Information System (INIS)

    Vo, T. T.; Nelson, J. J.

    2012-01-01

    Studies on cardiovascular safety in cancer patients treated with highly or moderately emetogenic chemotherapy (HEC or MEC), who may have taken the antiemetic, aprepitant, have been limited to clinical trials and postmarketing spontaneous reports. Our study explored background rates of cardiovascular disease (CVD) events among HEC- or MEC-treated cancer patients in a population-based setting to contextualize events seen in a new drug development program and to determine at a high level whether rates differed by aprepitant usage. Medical and pharmacy claims data from the 2005-2007 IMPACT National Benchmark Database were classified into emetogenic chemotherapy categories and CVD outcomes. Among 5827 HEC/MEC-treated patients, frequencies were highest for hypertension (16-21%) and composites of venous (7-12%) and arterial thromboembolic events (4-7%). Aprepitant users generally did not experience higher frequencies of events compared to nonusers. Our study serves as a useful benchmark of background CVD event rates in a population-based setting of cancer patients.

  17. Tracking Real-Time Changes in Working Memory Updating and Gating with the Event-Based Eye-Blink Rate

    NARCIS (Netherlands)

    Rac-Lubashevsky, R.; Slagter, H.A.; Kessler, Y.

    2017-01-01

    Effective working memory (WM) functioning depends on the gating process that regulates the balance between maintenance and updating of WM. The present study used the event-based eye-blink rate (ebEBR), which presumably reflects phasic striatal dopamine activity, to examine how the cognitive

  18. Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models

    NARCIS (Netherlands)

    van Elburg, R.A.J.; van Ooyen, A.

    2009-01-01

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on

  19. Generalization of the Event-Based Carnevale-Hines Integration Scheme for Integrate-and-Fire Models

    NARCIS (Netherlands)

    van Elburg, Ronald A. J.; van Ooyen, Arjen

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on

  20. Effect of a ward-based pharmacy team on preventable adverse drug events in surgical patients (SUREPILL study)

    NARCIS (Netherlands)

    de Boer, M.; Boeker, E. B.; Ramrattan, M. A.; Kiewiet, J. J. S.; Ram, K.; Gombert-Handoko, K. B.; van Lent-Evers, N. A. E. M.; Kuks, P. F. M.; Mulder, W. M. C.; Breslau, P. J.; Oostenbroek, R. J.; Dijkgraaf, M. G. W.; Lie-A-Huen, L.; Boermeester, M. A.

    2015-01-01

    Surgical patients are at risk of adverse drug events (ADEs) causing morbidity and mortality. Much harm is preventable. Ward-based pharmacy interventions to reduce medication-related harm have not been evaluated in surgical patients. This multicentre prospective clinical trial evaluated a

  1. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    DEFF Research Database (Denmark)

    Jensen, Ninna Reitzel; Schomacker, Kristian Juul

    2015-01-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death …, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance … product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account…

  2. Rocket and ground-based study of an auroral breakup event

    International Nuclear Information System (INIS)

    Marklund, G.

    1982-02-01

    On 27 January 1979 the substorm-GEOS rocket S23H was launched from ESRANGE, Kiruna, shortly after the onset of an intense magnetospheric substorm over northern Scandinavia. Rocket electric field and particle observations have been used to calculate ionospheric currents and heating rates. These results are generally consistent with the ground magnetic and optical observations. An important finding emerging from a comparison of this event with a pre-breakup event earlier on the same day is that the ionospheric substorm-related electric field could be split into two parts, namely: (1) an ambient LT-dependent field, probably of magnetospheric origin; and (2) superimposed on this, a small-scale electric field associated with the bright auroral structures, which was southward for both events. This is shown to have important consequences for the location of the ionospheric currents and the Joule energy dissipation relative to the auroral forms. (Author)

  3. Early prediction of adverse events in enhanced recovery based upon the host systemic inflammatory response.

    Science.gov (United States)

    Lane, J C; Wright, S; Burch, J; Kennedy, R H; Jenkins, J T

    2013-02-01

    Early identification of patients experiencing postoperative complications is imperative for successful management. C-reactive protein (CRP) is a nonspecific marker of inflammation used in many specialties to monitor patient condition. The role of CRP measurement early in the elective postoperative colorectal patient is unclear, particularly in the context of enhanced recovery (ERAS). Five hundred and thirty-three consecutive patients who underwent elective colorectal surgery between October 2008 and October 2010 within an established ERAS programme were studied. Patients were separated into a development group of 265 patients and a validation group of 268 patients by chronological order. CRP and white cell count were added to a prospectively maintained ERAS database. The primary outcome of the study was all adverse events (including infective complications, postoperative organ dysfunction and prolonged length of stay) during the initial hospital admission. Significant predictors of adverse events on univariate analysis were submitted to multivariate regression analysis and the resulting model applied to the validation group. The validity and predictive accuracy of the regression model were assessed using receiver operating characteristic curve/area under the curve (AUC) analysis. CRP levels >150 mg/l on postoperative day 2 and a rising CRP on day 3 were independently associated with all adverse events during the hospital admission. A weighted model applied to the validation group yielded an AUC of 0.65 (95% CI 0.58-0.73), indicating, at best, modest discrimination and predictive accuracy for adverse events. Measurement of CRP in patients in the first few days after elective colorectal surgery within ERAS can assist in identifying those at risk of adverse events and a prolonged hospital stay. A CRP value of >150 mg/l on day 2 and a rising CRP on day 3 should alert the surgeon to an increased likelihood of such events. © 2012 The Authors
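    The two reported decision rules amount to a simple screen; the sketch below is a direct transcription of the thresholds quoted above, not the full weighted regression model:

```python
def crp_alert(crp_day2, crp_day3):
    """Flag a post-colorectal-surgery patient as at increased risk of an
    adverse event, per the two rules reported above:
      - CRP > 150 mg/l on postoperative day 2, or
      - CRP still rising from day 2 to day 3.
    CRP values are in mg/l.
    """
    return crp_day2 > 150 or crp_day3 > crp_day2
```

    A flag here indicates modest discrimination only (AUC 0.65 in the validation group), so it prompts closer review rather than a diagnosis.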

  4. Some implications of an event-based definition of exposure to the risk of road accident

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the road…

  5. Recognition of power quality events by using multiwavelet-based neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Kaewarsa, Suriya; Attakitmongcol, Kitti; Kulworawanichpong, Thanatchai [School of Electrical Engineering, Suranaree University of Technology, 111 University Avenue, Muang District, Nakhon Ratchasima 30000 (Thailand)

    2008-05-15

    Recognition of power quality events by analyzing voltage and current waveform disturbances is a very important task for power system monitoring. This paper presents a novel approach to the recognition of power quality disturbances using the multiwavelet transform and neural networks. The proposed method employs the multiwavelet transform, using multiresolution signal decomposition techniques, working together with multiple neural networks that use a learning vector quantization network as a powerful classifier. Tests on various transient events, such as voltage sag, swell, interruption, notching, impulsive transient, and harmonic distortion, show that the classifier can detect and classify the different power quality signal types efficiently. (author)

  6. Event-based aquifer-to-atmosphere modeling over the European CORDEX domain

    Science.gov (United States)

    Keune, J.; Goergen, K.; Sulis, M.; Shrestha, P.; Springer, A.; Kusche, J.; Ohlwein, C.; Kollet, S. J.

    2014-12-01

    Although recent studies focus on the impact of soil moisture on climate, and especially on land-energy feedbacks, groundwater dynamics are often neglected or conceptual groundwater flow models are used. In particular, in the context of climate change and the occurrence of droughts and floods, a better understanding and an improved simulation of the physical processes involving groundwater on continental scales are necessary. This requires the implementation of a physically consistent terrestrial modeling system which explicitly incorporates groundwater dynamics and the connection with shallow soil moisture. Such a physics-based system enables simulations and monitoring of groundwater storage and enhanced representations of the terrestrial energy and hydrologic cycles over long time periods. On shorter timescales, the prediction of groundwater-related extremes, such as floods and droughts, is expected to improve because of the improved simulation of components of the hydrological cycle. In this study, we present a fully coupled aquifer-to-atmosphere modeling system over the European CORDEX domain. The integrated Terrestrial Systems Modeling Platform, TerrSysMP, consisting of the three-dimensional subsurface model ParFlow, the Community Land Model CLM3.5 and the numerical weather prediction model COSMO of the German Weather Service, is used. The system is set up with a spatial resolution of 0.11° (12.5 km) and closes the terrestrial water and energy cycles from aquifers into the atmosphere. Here, simulations of the fully coupled system are performed over events, such as the 2013 flood in Central Europe and the 2003 European heat wave, and over extended time periods on the order of 10 years. State and flux variables of the terrestrial hydrologic and energy cycle are analyzed and compared to both in situ (e.g. stream and water level gauge networks, FLUXNET) and remotely sensed observations (e.g. GRACE, ESA CCI ECV soil moisture and SMOS). Additionally, the

  7. Technical report on design base events related to the safety assessment of a Low-level Waste Storage Facility (LWSF)

    International Nuclear Information System (INIS)

    Karino, Motonobu; Uryu, Mitsuru; Miyata, Kazutoshi; Matsui, Norio; Imamoto, Nobuo; Kawamata, Tatsuo; Saito, Yasuo; Nagayama, Mineo; Wakui, Yasuyuki

    1999-07-01

    The construction of a new Low-level Waste Storage Facility (LWSF) is planned for the storage of concentrated liquid waste from the existing Low-level Radioactive Waste Treatment Facility in the Tokai Reprocessing Plant of JNC. An essential basis for the safety design of the facility is the correct adoption of the defence-in-depth principle. This report summarizes the criteria for judgement, the selection of postulated events, and the major analytical conditions for anticipated operational occurrences and accidents in the safety assessment, and presents the evaluation of each event. (Itami, H.)

  8. The role of curiosity‐triggering events in game‐based learning for mathematics

    NARCIS (Netherlands)

    Wouters, P.J.M.; van Oostendorp, H.; terVrugte, Judith; Vandercruysse, Sylke; de Jong, Ton; Elen, Jan

    2015-01-01

    In this study, we investigate whether cognitive conflicts induced by curiosity-triggering events have a positive impact on learning and motivation. In two experiments, we tested a game about proportional reasoning for secondary prevocational students. Experiment 1 used a curiosity-triggering vs.

  9. The Role of Curiosity-Triggering Events in Game-Based Learning for Mathematics

    NARCIS (Netherlands)

    Wouters, Pieter; van Oostendorp, Herre; ter Vrugte, Judith; Vandercruysse, Sylke; de Jong, Anthonius J.M.; Elen, Jan; Torbeyns, Joke; Lehtinen, Erno; Elen, Jan

    2015-01-01

    In this study, we investigate whether cognitive conflicts induced by curiosity-triggering events have a positive impact on learning and motivation. In two experiments, we tested a game about proportional reasoning for secondary prevocational students. Experiment 1 used a curiosity-triggering vs.

  10. Automatic identification of web-based risk markers for health events

    DEFF Research Database (Denmark)

    Yom-Tov, Elad; Borsa, Diana; Hayward, Andrew C.

    2015-01-01

    but these are often limited in size and cost and can fail to take full account of diseases where there are social stigmas or to identify transient acute risk factors. Objective: Here we report that Web search engine queries coupled with information on Wikipedia access patterns can be used to infer health events...

  11. Evidence-Based Psychosocial Treatments for Children and Adolescents Exposed to Traumatic Events

    Science.gov (United States)

    Silverman, Wendy K.; Ortiz, Claudio D.; Viswesvaran, Chockalingham; Burns, Barbara J.; Kolko, David J.; Putnam, Frank W.; Amaya-Jackson, Lisa

    2008-01-01

    The article reviews the current status (1993-2007) of psychosocial treatments for children and adolescents who have been exposed to traumatic events. Twenty-one treatment studies are evaluated using criteria from Nathan and Gorman (2002) along a continuum of methodological rigor ranging from Type 1 to Type 6. All studies were, at a minimum, robust…

  12. Seismology-based early identification of dam-formation landquake events.

    Science.gov (United States)

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events, which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed the Dam-forming phase, or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed masses slide into the river and perhaps even impact the topographic barrier on the opposite bank. Consequently, the approach to analyzing the LP and HF waveforms developed in this study, which identified five dam-forming landquake events (DFLEs), has a high potential for near-real-time DFLE identification using broadband seismic records, which can provide timely warnings of the impending floods to downstream residents.

  13. A Short-term ESPERTA-based Forecast Tool for Moderate-to-extreme Solar Proton Events

    Science.gov (United States)

    Laurenza, M.; Alberti, T.; Cliver, E. W.

    2018-04-01

    The ESPERTA (Empirical model for Solar Proton Event Real Time Alert) forecast tool has a Probability of Detection (POD) of 63% for all >10 MeV events with proton peak intensity ≥10 pfu (i.e., ≥S1 events, S1 referring to minor storms on the NOAA Solar Radiation Storms scale), from 1995 to 2014 with a false alarm rate (FAR) of 38% and a median (minimum) warning time (WT) of ∼4.8 (0.4) hr. The NOAA space weather scale includes four additional categories: moderate (S2), strong (S3), severe (S4), and extreme (S5). As S1 events have only minor impacts on HF radio propagation in the polar regions, the effective threshold for significant space radiation effects appears to be the S2 level (100 pfu), above which both biological and space operation impacts are observed along with increased effects on HF propagation in the polar regions. We modified the ESPERTA model to predict ≥S2 events and obtained a POD of 75% (41/55) and an FAR of 24% (13/54) for the 1995–2014 interval with a median (minimum) WT of ∼1.7 (0.2) hr based on predictions made at the time of the S1 threshold crossing. The improved performance of ESPERTA for ≥S2 events is a reflection of the big flare syndrome, which postulates that the measures of the various manifestations of eruptive solar flares increase as one considers increasingly larger events.
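    POD and FAR are standard contingency-table scores; the ≥S2 figures quoted above (41 of 55 events detected; 13 of 54 alerts false) reproduce the reported 75% and 24%:

```python
def pod_far(hits, misses, false_alarms):
    """Standard forecast-verification scores used to evaluate ESPERTA:
      POD = hits / (hits + misses)           -- fraction of events forecast
      FAR = false_alarms / (hits + false_alarms)  -- fraction of alerts wrong
    """
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far
```

    For the ≥S2 case, 41 hits with 14 misses (55 events) and 13 false alarms among 54 alerts give POD ≈ 0.75 and FAR ≈ 0.24.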

  14. Pre-event trajectories of mental health and health-related disabilities, and post-event traumatic stress symptoms and health : A 7-wave population-based study

    NARCIS (Netherlands)

    van der Velden, Peter; Bosmans, Mark; van der Meulen, Erik; Vermunt, J.K.

    2016-01-01

    It is unknown to what extent classes of trajectories of pre-event mental health problems (MHP) and health-related disabilities (HRD) predict post-event traumatic stress symptoms (PTSS), MHP and HRD. The aim of the present 7-wave study was to assess these predictive values using a representative sample of

  15. Evaluation of the Health Protection Event-Based Surveillance for the London 2012 Olympic and Paralympic Games.

    Science.gov (United States)

    Severi, E; Kitching, A; Crook, P

    2014-06-19

    The Health Protection Agency (HPA) (currently Public Health England) implemented the Health Protection Event-Based Surveillance (EBS) to provide additional national epidemic intelligence for the 2012 London Olympic and Paralympic Games (the Games). We describe EBS and evaluate the system attributes. EBS aimed at identifying, assessing and reporting to the HPA Olympic Coordination Centre (OCC) possible national infectious disease threats that may significantly impact the Games. EBS reported events in England from 2 July to 12 September 2012. EBS sourced events from reports from local health protection units and from screening an electronic application 'HPZone Dashboard' (DB). During this period, 147 new events were reported to EBS, mostly food-borne and vaccine-preventable diseases: 79 from regional units, 144 from DB (76 from both). EBS reported 61 events to the OCC: 21 of these were reported onwards. EBS sensitivity was 95.2%; positive predictive value was 32.8%; reports were timely (median one day; 10th percentile: 0 days - same day; 90th percentile: 3.6 days); completeness was 99.7%; stability was 100%; EBS simplicity was assessed as good; the daily time per regional or national unit dedicated to EBS was approximately 4 hours (weekdays) and 3 hours (weekends). OCC directors judged EBS as efficient, fast and responsive. EBS provided reliable, reassuring, timely, simple and stable national epidemic intelligence for the Games.

  16. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenbing [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Wang, Zidong [Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH (United Kingdom); Liu, Yurong, E-mail: yrliu@yzu.edu.cn [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ding, Derui [Shanghai Key Lab of Modern Optical System, Department of Control Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093 (China); Alsaadi, Fuad E. [Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2017-01-05

    The paper is concerned with the state estimation problem for a class of time-delayed complex networks with an event-triggering communication protocol. A novel event generator function, which depends not only on the measurement output but also on a predefined positive constant, is proposed in the hope of reducing the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and the estimator gains are then obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof shows that the designed triggering condition is free of Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator. - Highlights: • An event-triggered estimator is designed for complex networks with time-varying delays. • A novel event generator function is proposed to reduce the communication burden. • The comparison principle is utilized to derive the sufficient conditions. • The designed triggering condition is shown to be free of the Zeno behavior.
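    A minimal sketch of such an event generator: a measurement is transmitted only when its deviation from the last transmitted value exceeds a measurement-dependent threshold plus a predefined positive constant (the constant is what guarantees a positive inter-event time and thus rules out Zeno behavior). The quadratic form used here is a common choice in this literature and is assumed, not taken from the paper:

```python
def event_triggered_samples(measurements, sigma=0.5, delta=0.01):
    """Return the (index, value) pairs a scalar sensor would transmit under
    an event-triggering rule of the form
        (y_k - y_last)^2 > sigma * y_k^2 + delta,
    where y_last is the most recently transmitted measurement, sigma >= 0
    scales the measurement-dependent part, and delta > 0 is the predefined
    positive constant. Assumed illustrative rule, not the paper's exact one.
    """
    sent = []
    last = None
    for k, y in enumerate(measurements):
        if last is None or (y - last) ** 2 > sigma * y * y + delta:
            sent.append((k, y))   # transmit and update the held value
            last = y
    return sent
```

    Small fluctuations around the held value are suppressed, so only informative changes reach the estimator.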

  17. Development of knowledge-based operator support system for steam generator water leak events in FBR plants

    International Nuclear Information System (INIS)

    Arikawa, Hiroshi; Ida, Toshio; Matsumoto, Hiroyuki; Kishida, Masako

    1991-01-01

    A knowledge engineering approach to operator support systems would be useful in maintaining safe and steady operation of nuclear plants. This paper describes a knowledge-based operation support system which assists the operators during steam generator water leak events in FBR plants. We have developed a real-time expert system. The expert system adopts a hierarchical knowledge representation corresponding to the 'plant abnormality model'. A technique of signal validation which uses knowledge of symptom propagation is applied to diagnosis. In order to verify the knowledge base concerning steam generator water leak events in FBR plants, a simulator is linked to the expert system. It is revealed that diagnosis based on the 'plant abnormality model' and signal validation using knowledge of symptom propagation work successfully. It is also suggested that the expert system could be useful in supporting FBR plant operations. (author)

  18. Intensity changes in future extreme precipitation: A statistical event-based approach.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    Short-lived precipitation extremes are often responsible for hazards in urban and rural environments with economic and environmental consequences. The precipitation intensity is expected to increase about 7% per degree of warming, according to the Clausius-Clapeyron (CC) relation. However, the observations often show a much stronger increase in the sub-daily values. In particular, the behavior of the hourly summer precipitation from radar observations with the dew point temperature (the Pi-Td relation) for the Netherlands suggests that for moderate to warm days the intensification of the precipitation can be even higher than 21% per degree of warming, that is 3 times higher than the expected CC relation. The rate of change depends on the initial precipitation intensity, as low percentiles increase with a rate below CC, the medium percentiles with 2CC and the moderate-high and high percentiles with 3CC. This non-linear statistical Pi-Td relation is suggested to be used as a delta-transformation to project how a historic extreme precipitation event would intensify under future, warmer conditions. Here, the Pi-Td relation is applied over a selected historic extreme precipitation event to 'up-scale' its intensity to warmer conditions. Additionally, the selected historic event is simulated in the high-resolution, convective-permitting weather model Harmonie. The initial and boundary conditions are alternated to represent future conditions. The comparison between the statistical and the numerical method of projecting the historic event to future conditions showed comparable intensity changes, which depending on the initial percentile intensity, range from below CC to a 3CC rate of change per degree of warming. The model tends to overestimate the future intensities for the low- and the very high percentiles and the clouds are somewhat displaced, due to small wind and convection changes. The total spatial cloud coverage in the model remains, as also in the statistical
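The percentile-dependent delta-transformation described above can be sketched numerically. The ~7 %/K Clausius-Clapeyron rate and its 2CC/3CC multiples come from the abstract; the percentile thresholds and the example intensity are illustrative assumptions, not the paper's fitted Pi-Td relation.

```python
def cc_rate(percentile):
    """Percentile-dependent intensification rate per degree of warming.
    The threshold percentiles chosen here are illustrative."""
    if percentile < 50:
        return 0.05      # below the ~7 %/K Clausius-Clapeyron rate
    elif percentile < 90:
        return 0.14      # roughly 2CC for the medium percentiles
    return 0.21          # roughly 3CC for the moderate-high and high percentiles

def delta_transform(intensity_mm_h, percentile, warming_k):
    """Project a historic precipitation intensity to warmer conditions by
    compounding the percentile-dependent rate over warming_k degrees."""
    return intensity_mm_h * (1.0 + cc_rate(percentile)) ** warming_k

# A 95th-percentile intensity of 30 mm/h projected under 2 K of warming:
projected = delta_transform(30.0, 95, 2.0)   # 30 * 1.21**2 ≈ 43.9 mm/h
```

Applying such a transformation cell by cell to a historic radar field is the statistical counterpart to re-simulating the event with warmer boundary conditions.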

  19. Satellite Collision Modeling with Physics-Based Hydrocodes: Debris Generation Predictions of the Iridium-Cosmos Collision Event and Other Impact Events

    International Nuclear Information System (INIS)

    Springer, H.K.; Miller, W.O.; Levatin, J.L.; Pertica, A.J.; Olivier, S.S.

    2010-01-01

    disposition. Based on comparing our results to observations, it is unlikely that the Iridium 33-Cosmos 2251 collision event was a large mass-overlap collision. We also performed separate simulations studying the debris generated by the collision of 5 and 10 cm spherical projectiles on the Iridium 33 satellite at closing velocities of 5, 10, and 15 km/s. It is important to understand the vulnerability of satellites to small debris threats, given their pervasiveness in orbit. These studies can also be merged with probabilistic conjunction analysis to better understand the risk to space assets. In these computational studies, we found that momentum transfer, kinetic energy losses due to dissipative mechanisms (e.g., fracture), fragment number, and fragment velocity increase with increasing velocity for a fixed projectile size. For a fixed velocity, we found that the smaller projectile size more efficiently transfers momentum to the satellite. This latter point has an important implication: eight (spaced) 5 cm debris objects can impart more momentum to the satellite, and likely cause more damage, than a single 10 cm debris object at the same velocity. Further studies are required to assess the satellite damage induced by 1-5 cm sized debris objects, as well as multiple debris objects, in this velocity range.

  20. THE ROLE AND IMPLICATIONS OF THE EVENT BASED COMMUNICATION IN THE ELECTORAL CAMPAIGN

    Directory of Open Access Journals (Sweden)

    Tatu Cristian Ionut

    2011-12-01

    Full Text Available The electoral campaigns are considered to be among the most delicate challenges for a marketer due to the limited time available, the slim margin for error, the high impact of each statement and the concentration of a quite large amount of resources in a 30-day period. While the ultimate goal for the campaign staff is to bring the global electoral package closer to the electorate and earn their votes, most of the time the various competitors use disappointingly similar tactics that create confusion among the electorate. The campaign-related events turned out to be one of the tactics that allow for pin-point targeting of the electorate and better control over the receivers of the message. This paper focuses on the types of events that can be used in an electoral campaign, together with their particularities and the effects registered in previous campaigns.

  1. Carcinogenic ptaquiloside in stream water at base flow and during storm events

    DEFF Research Database (Denmark)

    Strobel, Bjarne W.; Clauson-Kaas, Frederik; Hansen, Hans Chr. Bruun

    2017-01-01

    identified, of which the compound ptaquiloside (PTA) is the most abundant. Ptaquiloside has been shown to be highly water soluble, leachable from bracken fronds and litter, and present in the soil below bracken stands. During storm events, throughfall from the bracken canopy was collected as well. Stream...... water samples were taken as grab samples, while throughfall accumulated in glass jars set out below the canopy. Field blanks and fortified lab controls were included to ensure reliability of the analysis. Ptaquiloside concentrations were determined using LC-MS/MS after a clean-up using solid phase...... extraction. Results showed that PTA levels in the stream were highly dependent on precipitation, and rose considerably during rain events, peaking at 2.28 μg/L, before quickly (conservation...

  2. A novel CUSUM-based approach for event detection in smart metering

    Science.gov (United States)

    Zhu, Zhicheng; Zhang, Shuai; Wei, Zhiqiang; Yin, Bo; Huang, Xianqing

    2018-03-01

    Non-intrusive load monitoring (NILM) plays a significant role in raising consumer awareness of household electricity use and thereby reducing overall energy consumption in society. For monitoring low-power loads, many researchers have introduced CUSUM into NILM systems, since the traditional event detection method is not as effective as expected. Because the original CUSUM is limited when a small shift stays below the threshold, we improve the test statistic so that the permissible deviation rises gradually as the data size increases. This paper proposes a novel event detection method and a corresponding criterion that can be used in NILM systems to recognize transient states and to help the labelling task. Its performance has been tested in a real scenario where eight different appliances are connected to the main line of electric power.
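A one-sided CUSUM detector with a growing allowance, in the spirit of the modification described above, can be sketched as follows. The linear adaptation rule, the parameter values, and the synthetic power trace are illustrative assumptions, not the paper's exact statistic.

```python
def cusum_events(samples, target, k0=0.5, growth=0.01, h=5.0):
    """One-sided CUSUM change detector whose allowance k grows with the
    sample count, so small persistent drifts below the fixed threshold do
    not accumulate into spurious events."""
    s, events = 0.0, []
    for n, x in enumerate(samples, start=1):
        k = k0 + growth * n              # permissible deviation rises with n
        s = max(0.0, s + (x - target) - k)
        if s > h:                        # decision threshold: report an event
            events.append(n)
            s = 0.0                      # restart the statistic after a detection
    return events

# A step change of +3 after sample 20 (e.g., an appliance switching on)
# is flagged shortly after its onset.
events = cusum_events([0.0] * 20 + [3.0] * 20, target=0.0)
```

In an NILM pipeline the detected indices would mark candidate transient states handed to the labelling stage.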

  3. Research on a Hierarchical Dynamic Automatic Voltage Control System Based on the Discrete Event-Driven Method

    Directory of Open Access Journals (Sweden)

    Yong Min

    2013-06-01

    Full Text Available In this paper, concepts and methods of hybrid control systems are adopted to establish a hierarchical dynamic automatic voltage control (HD-AVC) system, realizing the dynamic voltage stability of power grids. An HD-AVC system model consisting of three layers is built based on the hybrid control method and discrete event-driven mechanism. In the Top Layer, discrete events are designed to drive the corresponding control block so as to avoid solving complex multiple objective functions, the power system’s characteristic matrix is formed and the minimum amplitude eigenvalue (MAE) is calculated through linearized differential-algebraic equations. MAE is applied to judge the system’s voltage stability and security and construct discrete events. The Middle Layer is responsible for management and operation, which is also driven by discrete events. Control values of the control buses are calculated based on the characteristics of power systems and the sensitivity method. Then control values generate control strategies through the interface block. In the Bottom Layer, various control devices receive and implement the control commands from the Middle Layer. In this way, a closed-loop power system voltage control is achieved. Computer simulations verify the validity and accuracy of the HD-AVC system, and verify that the proposed HD-AVC system is more effective than normal voltage control methods.

  4. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation

    OpenAIRE

    Dongmei Huang; Chenyixuan Xu; Danfeng Zhao; Wei Song; Qi He

    2017-01-01

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertice...

  5. The Adverse Events and Hemodynamic Effects of Adenosine-Based Cardiac MRI

    International Nuclear Information System (INIS)

    Voigtlander, Thomas; Magedanz, Annett; Schmermund, Axel; Bramlage, Peter; Elsaesser, Amelie; Kauczor, Hans-Ulrich; Mohrs, Oliver K.

    2011-01-01

    We wanted to prospectively assess the adverse events and hemodynamic effects associated with an intravenous adenosine infusion in patients with suspected or known coronary artery disease and who were undergoing cardiac MRI. One hundred and sixty-eight patients (64 ± 9 years) received adenosine (140 μg/kg/min) during cardiac MRI. Before and during the administration, the heart rate, systemic blood pressure, and oxygen saturation were monitored using a MRI-compatible system. We documented any signs and symptoms of potential adverse events. In total, 47 out of 168 patients (28%) experienced adverse effects, which were mostly mild or moderate. In 13 patients (8%), the adenosine infusion was discontinued due to intolerable dyspnea or chest pain. No high grade atrioventricular block, bronchospasm or other life-threatening adverse events occurred. The hemodynamic measurements showed a significant increase in the heart rate during adenosine infusion (69.3 ± 11.7 versus 82.4 ± 13.0 beats/min, respectively; p < 0.001). A significant but clinically irrelevant increase in oxygen saturation occurred during adenosine infusion (96 ± 1.9% versus 97 ± 1.3%, respectively; p < 0.001). The blood pressure did not significantly change during adenosine infusion (systolic: 142.8 ± 24.0 versus 140.9 ± 25.7 mmHg; diastolic: 80.2 ± 12.5 mmHg versus 78.9 ± 15.6, respectively). This study confirms the safety of adenosine infusion during cardiac MRI. A considerable proportion of all patients will experience minor adverse effects and some patients will not tolerate adenosine infusion. However, all adverse events can be successfully managed by a radiologist. The increased heart rate during adenosine infusion highlights the need to individually adjust the settings according to the patient, e.g., the number of slices of myocardial perfusion imaging.

  6. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
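The verification scores named above (probability of detection, probability of false detection, Heidke skill score, RMSE) have standard forecast-verification definitions and can be computed from a 2×2 contingency table and a model-data time-series pair. The counts and series below are made-up examples, not CCMC data.

```python
import math

def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """POD, POFD and Heidke skill score from a 2x2 contingency table,
    using the standard forecast-verification definitions."""
    h, m, f, c = hits, misses, false_alarms, correct_negatives
    n = h + m + f + c
    pod = h / (h + m)                                   # probability of detection
    pofd = f / (f + c)                                  # probability of false detection
    expected = ((h + m) * (h + f) + (m + c) * (f + c)) / n   # chance agreement
    hss = (h + c - expected) / (n - expected)           # Heidke skill score
    return pod, pofd, hss

def rmse(observed, modeled):
    """Root mean square error between observed and modeled time series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, modeled)) / len(observed))

pod, pofd, hss = contingency_scores(40, 10, 20, 30)     # (0.8, 0.4, 0.4)
```

Aggregating such scores per event and over all events, per model and run setting, is the tabulation the CAMEL framework automates.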

  7. StoRMon: an event log analyzer for Grid Storage Element based on StoRM

    International Nuclear Information System (INIS)

    Zappi, Riccardo; Dal Pra, Stefano; Dibenedetto, Michele; Ronchieri, Elisabetta

    2011-01-01

    Managing a collaborative production Grid infrastructure requires identifying and handling every issue that might arise in a timely manner. Currently, the most complex problem of the data Grid infrastructure relates to data management because of its distributed nature. To ensure that problems are quickly addressed and solved, each site should contribute to the solution by providing any useful information about the services that run in its administrative domain. To be effective, Grid site administrators must often collect, organize and examine the scattered log events that are produced by every service and component of the Storage Element. This paper focuses on the problem of gathering the event logs on a Grid Storage Element and describes the design of a new service, called StoRMon. StoRMon will be able to collect, archive, analyze and report on event logs produced by each service of the Storage Element during the execution of its tasks. The data and the processed information will be available to site administrators through a single contact point, mainly to identify security incidents, fraudulent activity, and operational issues. The new service is applied to a Grid Storage Element characterized by StoRM, GridFTP and YAMSS, and collects the usage data of the StoRM, transfer and hierarchical storage services.

  8. Vision-based Event Detection of the Sit-to-Stand Transition

    Directory of Open Access Journals (Sweden)

    Victor Shia

    2015-12-01

    Full Text Available Sit-to-stand (STS) motions are one of the most important activities of daily living as they serve as a precursor to mobility and walking. However, there exists no standard method of segmenting STS motions. This is partially due to the variety of sensors and modalities used to study the STS motion, such as force plates, vision, and accelerometers, each providing different types of data, and to the variability of the STS motion in video data. In this work, we present a method using motion capture to detect events in the STS motion by estimating ground reaction forces, thereby eliminating the variability in joint angles from visual data. We illustrate the accuracy of this method with 10 subjects, with an average difference of 16.5 ms in event times obtained via motion capture versus force plate. This method serves as a proof of concept for detecting events in the STS motion via video which are comparable to those obtained via a force plate.
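One plausible way to turn an estimated ground reaction force into an event detector, in the spirit of the method above, is to flag the first sample where the estimated vertical force exceeds body weight. The force model F = m(g + a_z), the mass, and the sample values below are illustrative assumptions, not the paper's exact algorithm.

```python
def detect_seat_off(com_accel_z, mass_kg=70.0, g=9.81):
    """Estimate vertical ground reaction force from motion-capture
    centre-of-mass acceleration, F = m * (g + a_z), and return the index
    of the first sample where it exceeds body weight."""
    weight = mass_kg * g
    for i, a_z in enumerate(com_accel_z):
        if mass_kg * (g + a_z) > weight:     # equivalent to a_z > 0
            return i
    return None                              # no event found in the window

# Centre-of-mass acceleration ramps upward as the subject pushes off the chair:
idx = detect_seat_off([-0.2, 0.0, 0.4, 1.1])   # -> 2
```

Because the criterion is a force threshold rather than a joint-angle pattern, it sidesteps the inter-subject variability of the motion seen in video data.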

  9. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    OpenAIRE

    Wooyeon Sunwoo; Minha Choi

    2017-01-01

    Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall even...

  10. Response of Muddy Sediments and Benthic Diatom-based Biofilms to Repeated Erosion Events

    Science.gov (United States)

    Valentine, K.; Mariotti, G.; Fagherazzi, S.

    2016-02-01

    Benthic biofilms, microbes aggregated within a matrix of Extracellular Polymeric Substances (EPS), are commonly found in shallow coastal areas and intertidal environments. Biofilms have the potential to stabilize sediments, hence reducing erosion and possibly mitigating land loss. The purpose of this study is to determine how repeated flow events that rework the bed affect biofilm growth and its ability to stabilize cohesive sediments. Natural mud devoid of grazers was used to create placed beds in four annular flumes; biofilms were allowed to grow on the sediment surface. Each flume was eroded at different time intervals (1 or 12 days) to allow for varied levels of biofilm growth and adjustment following erosion. In addition, experiments with abiotic mud were performed by adding bleach to the tank. Each erosion test consisted of step-wise increases in flow that were used to measure erodibility. In the experiments where the bed was eroded every day, both the abiotic and biotic flumes exhibited a decrease in erodibility with time, likely due to consolidation, but the decrease in erodibility was greater in the flume with a biofilm. Specifically, the presence of biofilm reduced bed erosion at low shear stresses (~0.1 Pa). We attribute this progressive decrease in erodibility to the accumulation of EPS over time: even though the biofilm was eroded during each erosion event, the EPS was retained within the flume, mixed with the eroded sediment and eventually settled. Less frequent erosion allowed the growth of a stronger biofilm that decreased bed erosion at higher shear stresses (~0.4 Pa). We conclude that the time between destructive flow events influences the ability of biofilms to stabilize sediments. This influence will likely be affected by biofilm growth conditions such as light, temperature, nutrients, salinity, and the microbial community.

  11. Fast recognition of single molecules based on single-event photon statistics

    International Nuclear Information System (INIS)

    Dong Shuangli; Huang Tao; Liu Yuan; Wang Jun; Zhang Guofeng; Xiao Liantuan; Jia Suotang

    2007-01-01

    Mandel's Q parameter, which is determined from single-event photon statistics, provides an alternative way to recognize single molecules with fluorescence detection, other than the second-order correlation function. It is shown that the Q parameter of an assumed ideal double-molecule fluorescence with the same average photon number as that of the sample fluorescence can act as the criterion for single-molecule recognition. The influence of signal-to-background ratio and the error estimates for photon statistics are also presented. We have applied this method to ascertain single Cy5 dye molecules within hundreds of milliseconds
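Mandel's Q parameter has the standard definition Q = (⟨(Δn)²⟩ − ⟨n⟩)/⟨n⟩, computed here from per-event photon counts; the sample data are illustrative, and real use would fold in the signal-to-background correction the abstract mentions.

```python
def mandel_q(photon_counts):
    """Mandel Q parameter, Q = (Var(n) - <n>) / <n>, from per-event photon
    counts. Q = 0 for Poissonian light; Q < 0 indicates sub-Poissonian
    statistics, the signature of a single quantum emitter."""
    n = len(photon_counts)
    mean = sum(photon_counts) / n
    var = sum((c - mean) ** 2 for c in photon_counts) / n
    return (var - mean) / mean

# A perfectly regular stream (exactly one photon per event) gives the
# deepest sub-Poissonian value, Q = -1.
q = mandel_q([1, 1, 1, 1])
```

The recognition criterion in the paper compares the measured Q against that of an assumed ideal two-molecule source with the same mean photon number.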

  12. Tuning and Test of Fragmentation Models Based on Identified Particles and Precision Event Shape Data

    CERN Document Server

    Abreu, P; Adye, T; Ajinenko, I; Alekseev, G D; Alemany, R; Allport, P P; Almehed, S; Amaldi, Ugo; Amato, S; Andreazza, A; Andrieux, M L; Antilogus, P; Apel, W D; Åsman, B; Augustin, J E; Augustinus, A; Baillon, Paul; Bambade, P; Barão, F; Barate, R; Barbi, M S; Bardin, Dimitri Yuri; Baroncelli, A; Bärring, O; Barrio, J A; Bartl, Walter; Bates, M J; Battaglia, Marco; Baubillier, M; Baudot, J; Becks, K H; Begalli, M; Beillière, P; Belokopytov, Yu A; Belous, K S; Benvenuti, Alberto C; Berggren, M; Bertini, D; Bertrand, D; Besançon, M; Bianchi, F; Bigi, M; Bilenky, S M; Billoir, P; Bloch, D; Blume, M; Bolognese, T; Bonesini, M; Bonivento, W; Booth, P S L; Bosio, C; Botner, O; Boudinov, E; Bouquet, B; Bourdarios, C; Bowcock, T J V; Bozzo, M; Branchini, P; Brand, K D; Brenke, T; Brenner, R A; Bricman, C; Brown, R C A; Brückman, P; Brunet, J M; Bugge, L; Buran, T; Burgsmüller, T; Buschmann, P; Buys, A; Cabrera, S; Caccia, M; Calvi, M; Camacho-Rozas, A J; Camporesi, T; Canale, V; Canepa, M; Cankocak, K; Cao, F; Carena, F; Carroll, L; Caso, Carlo; Castillo-Gimenez, M V; Cattai, A; Cavallo, F R; Chabaud, V; Charpentier, P; Chaussard, L; Checchia, P; Chelkov, G A; Chen, M; Chierici, R; Chliapnikov, P V; Chochula, P; Chorowicz, V; Chudoba, J; Cindro, V; Collins, P; Contreras, J L; Contri, R; Cortina, E; Cosme, G; Cossutti, F; Cowell, J H; Crawley, H B; Crennell, D J; Crosetti, G; Cuevas-Maestro, J; Czellar, S; Dahl-Jensen, Erik; Dahm, J; D'Almagne, B; Dam, M; Damgaard, G; Dauncey, P D; Davenport, Martyn; Da Silva, W; Defoix, C; Deghorain, A; Della Ricca, G; Delpierre, P A; Demaria, N; De Angelis, A; de Boer, Wim; De Brabandere, S; De Clercq, C; La Vaissière, C de; De Lotto, B; De Min, A; De Paula, L S; De Saint-Jean, C; Dijkstra, H; Di Ciaccio, Lucia; Di Diodato, A; Djama, F; Dolbeau, J; Dönszelmann, M; Doroba, K; Dracos, M; Drees, J; Drees, K A; Dris, M; Durand, J D; Edsall, D M; Ehret, R; Eigen, G; Ekelöf, T J C; Ekspong, Gösta; Elsing, M; Engel, J P; Erzen, B; 
Espirito-Santo, M C; Falk, E; Fassouliotis, D; Feindt, Michael; Ferrer, A; Fichet, S; Filippas-Tassos, A; Firestone, A; Fischer, P A; Föth, H; Fokitis, E; Fontanelli, F; Formenti, F; Franek, B J; Frenkiel, P; Fries, D E C; Frodesen, A G; Frühwirth, R; Fulda-Quenzer, F; Fuster, J A; Galloni, A; Gamba, D; Gandelman, M; García, C; García, J; Gaspar, C; Gasparini, U; Gavillet, P; Gazis, E N; Gelé, D; Gerber, J P; Gokieli, R; Golob, B; Gopal, Gian P; Gorn, L; Górski, M; Guz, Yu; Gracco, Valerio; Graziani, E; Green, C; Grefrath, A; Gris, P; Grosdidier, G; Grzelak, K; Gumenyuk, S A; Gunnarsson, P; Günther, M; Guy, J; Hahn, F; Hahn, S; Hajduk, Z; Hallgren, A; Hamacher, K; Harris, F J; Hedberg, V; Henriques, R P; Hernández, J J; Herquet, P; Herr, H; Hessing, T L; Higón, E; Hilke, Hans Jürgen; Hill, T S; Holmgren, S O; Holt, P J; Holthuizen, D J; Hoorelbeke, S; Houlden, M A; Hrubec, Josef; Huet, K; Hultqvist, K; Jackson, J N; Jacobsson, R; Jalocha, P; Janik, R; Jarlskog, C; Jarlskog, G; Jarry, P; Jean-Marie, B; Johansson, E K; Jönsson, L B; Jönsson, P E; Joram, Christian; Juillot, P; Kaiser, M; Kapusta, F; Karafasoulis, K; Karlsson, M; Karvelas, E; Katsanevas, S; Katsoufis, E C; Keränen, R; Khokhlov, Yu A; Khomenko, B A; Khovanskii, N N; King, B J; Kjaer, N J; Klapp, O; Klein, H; Klovning, A; Kluit, P M; Köne, B; Kokkinias, P; Koratzinos, M; Korcyl, K; Kostyukhin, V; Kourkoumelis, C; Kuznetsov, O; Kreuter, C; Kronkvist, I J; Krumshtein, Z; Krupinski, W; Kubinec, P; Kucewicz, W; Kurvinen, K L; Lacasta, C; Laktineh, I; Lamsa, J; Lanceri, L; Lane, D W; Langefeld, P; Lapin, V; Laugier, J P; Lauhakangas, R; Leder, Gerhard; Ledroit, F; Lefébure, V; Legan, C K; Leitner, R; Lemonne, J; Lenzen, Georg; Lepeltier, V; Lesiak, T; Libby, J; Liko, D; Lindner, R; Lipniacka, A; Lippi, I; Lörstad, B; Loken, J G; López, J M; Loukas, D; Lutz, P; Lyons, L; Naughton, J M; Maehlum, G; Mahon, J R; Maio, A; Malmgren, T G M; Malychev, V; Mandl, F; Marco, J; Marco, R P; Maréchal, B; Margoni, M; Marin, 
J C; Mariotti, C; Markou, A; Martínez-Rivero, C; Martínez-Vidal, F; Martí i García, S; Masik, J; Matorras, F; Matteuzzi, C; Matthiae, Giorgio; Mazzucato, M; McCubbin, M L; McKay, R; McNulty, R; Medbo, J; Merk, M; Meroni, C; Meyer, S; Meyer, W T; Myagkov, A; Michelotto, M; Migliore, E; Mirabito, L; Mitaroff, Winfried A; Mjörnmark, U; Moa, T; Møller, R; Mönig, K; Monge, M R; Morettini, P; Müller, H; Mulders, M; Mundim, L M; Murray, W J; Muryn, B; Myatt, Gerald; Naraghi, F; Navarria, Francesco Luigi; Navas, S; Nawrocki, K; Negri, P; Neumann, W; Neumeister, N; Nicolaidou, R; Nielsen, B S; Nieuwenhuizen, M; Nikolaenko, V; Niss, P; Nomerotski, A; Normand, Ainsley; Oberschulte-Beckmann, W; Obraztsov, V F; Olshevskii, A G; Onofre, A; Orava, Risto; Österberg, K; Ouraou, A; Paganini, P; Paganoni, M; Pagès, P; Pain, R; Palka, H; Papadopoulou, T D; Papageorgiou, K; Pape, L; Parkes, C; Parodi, F; Passeri, A; Pegoraro, M; Peralta, L; Pernegger, H; Pernicka, Manfred; Perrotta, A; Petridou, C; Petrolini, A; Petrovykh, M; Phillips, H T; Piana, G; Pierre, F; Plaszczynski, S; Podobrin, O; Pol, M E; Polok, G; Poropat, P; Pozdnyakov, V; Privitera, P; Pukhaeva, N; Pullia, Antonio; Radojicic, D; Ragazzi, S; Rahmani, H; Rames, J; Ratoff, P N; Read, A L; Reale, M; Rebecchi, P; Redaelli, N G; Regler, Meinhard; Reid, D; Renton, P B; Resvanis, L K; Richard, F; Richardson, J; Rídky, J; Rinaudo, G; Ripp, I; Romero, A; Roncagliolo, I; Ronchese, P; Roos, L; Rosenberg, E I; Rosso, E; Roudeau, Patrick; Rovelli, T; Rückstuhl, W; Ruhlmann-Kleider, V; Ruiz, A; Rybicki, K; Saarikko, H; Sacquin, Yu; Sadovskii, A; Sahr, O; Sajot, G; Salt, J; Sánchez, J; Sannino, M; Schimmelpfennig, M; Schneider, H; Schwickerath, U; Schyns, M A E; Sciolla, G; Scuri, F; Seager, P; Sedykh, Yu; Segar, A M; Seitz, A; Sekulin, R L; Serbelloni, L; Shellard, R C; Siegrist, P; Silvestre, R; Simonetti, S; Simonetto, F; Sissakian, A N; Sitár, B; Skaali, T B; Smadja, G; Smirnov, N; Smirnova, O G; Smith, G R; Sokolov, A; Sosnowski, 
R; Souza-Santos, D; Spassoff, Tz; Spiriti, E; Sponholz, P; Squarcia, S; Stanescu, C; Stapnes, Steinar; Stavitski, I; Stevenson, K; Stichelbaut, F; Stocchi, A; Strauss, J; Strub, R; Stugu, B; Szczekowski, M; Szeptycka, M; Tabarelli de Fatis, T; Tavernet, J P; Chikilev, O G; Thomas, J; Tilquin, A; Timmermans, J; Tkatchev, L G; Todorov, T; Todorova, S; Toet, D Z; Tomaradze, A G; Tomé, B; Tonazzo, A; Tortora, L; Tranströmer, G; Treille, D; Trischuk, W; Tristram, G; Trombini, A; Troncon, C; Tsirou, A L; Turluer, M L; Tyapkin, I A; Tyndel, M; Tzamarias, S; Überschär, B; Ullaland, O; Uvarov, V; Valenti, G; Vallazza, E; van Apeldoorn, G W; van Dam, P; Van Eldik, J; Vassilopoulos, N; Vegni, G; Ventura, L; Venus, W A; Verbeure, F; Verlato, M; Vertogradov, L S; Vilanova, D; Vincent, P; Vitale, L; Vlasov, E; Vodopyanov, A S; Vrba, V; Wahlen, H; Walck, C; Waldner, F; Weierstall, M; Weilhammer, Peter; Weiser, C; Wetherell, Alan M; Wicke, D; Wickens, J H; Wielers, M; Wilkinson, G R; Williams, W S C; Winter, M; Witek, M; Woschnagg, K; Yip, K; Yushchenko, O P; Zach, F; Zaitsev, A; Zalewska-Bak, A; Zalewski, Piotr; Zavrtanik, D; Zevgolatakos, E; Zimin, N I; Zito, M; Zontar, D; Zucchelli, G C; Zumerle, G

    1996-01-01

    Event shape and charged particle inclusive distributions are measured using 750000 decays of the $Z$ to hadrons from the DELPHI detector at LEP. These precise data allow a decisive confrontation with models of the hadronization process. Improved tunings of the JETSET, ARIADNE and HERWIG parton shower models and the JETSET matrix element model are obtained by fitting the models to these DELPHI data as well as to identified particle distributions from all LEP experiments. The description of the data distributions by the models is critically reviewed with special importance attributed to identified particles.

  13. On-line event reconstruction using a parallel in-memory data base

    OpenAIRE

    Argante, E; Van der Stok, P D V; Willers, Ian Malcolm

    1995-01-01

    PORS is a system designed for on-line event reconstruction in high energy physics (HEP) experiments. It uses the CPREAD reconstruction program. Central to the system is a parallel in-memory database which is used as communication medium between parallel workers. A farming control structure is implemented with PORS in a natural way. The database provides structured storage of data with a short life time. PORS serves as a case study for the construction of a methodology on how to apply parallel...

  14. Breaking The Millisecond Barrier On SpiNNaker: Implementing Asynchronous Event-Based Plastic Models With Microsecond Resolution

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-06-01

    Full Text Available Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms towards those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution in the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 microseconds. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution in the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented.
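The contrast between fixed-tick and event-driven processing can be illustrated with a toy scheduler that handles timestamped events strictly in order at microsecond resolution. This sketch illustrates the scheme in spirit only; it is not the SpiNNaker API, and the handler and event list are made up.

```python
import heapq

def run_events(events, handler):
    """Toy asynchronous event-driven scheduler: (timestamp_us, payload)
    pairs are processed strictly by increasing microsecond timestamp,
    instead of being binned onto a fixed millisecond tick."""
    heap = list(events)
    heapq.heapify(heap)                  # min-heap ordered by timestamp
    processed = []
    while heap:
        t_us, payload = heapq.heappop(heap)
        processed.append((t_us, handler(payload)))
    return processed

# Events arrive out of order but are handled by increasing microsecond stamp,
# so a 1.5 ms spike and a 5 us spike keep their true temporal separation.
out = run_events([(1500, "b"), (5, "a"), (999999, "c")], str.upper)
```

A millisecond-tick simulator would collapse the first two events into the same or adjacent ticks, losing exactly the sub-millisecond lags that tasks like sound localization depend on.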

  15. Severity Classification of a Seismic Event based on the Magnitude-Distance Ratio Using Only One Seismological Station

    Directory of Open Access Journals (Sweden)

    Luis Hernán Ochoa Gutiérrez

    2014-07-01

    Full Text Available Seismic event characterization is often accomplished using algorithms based only on information received at seismological stations located closest to the particular event, while ignoring historical data received at those stations. These historical data are stored and unseen at this stage. This characterization process can delay the emergency response, costing valuable time in the mitigation of the adverse effects on the affected population. Seismological stations have recorded data during many events that have been characterized by classical methods, and these data can be used as previous "knowledge" to train such stations to recognize patterns. This knowledge can be used to make faster characterizations using only one three-component broadband station by applying bio-inspired algorithms or recently developed stochastic methods, such as kernel methods. We trained a Support Vector Machine (SVM algorithm with seismograph data recorded by INGEOMINAS's National Seismological Network at a three-component station located near Bogota, Colombia. As input model descriptors, we used the following: (1 the integral of the Fourier transform/power spectrum for each component, divided into 7 windows of 2 seconds and beginning at the P onset time, and (2 the ratio between the calculated logarithm of magnitude (Mb and epicentral distance. We used 986 events with magnitudes greater than 3 recorded from late 2003 to 2008. The algorithm classifies events with magnitude-distance ratios (a measure of the severity of possible damage caused by an earthquake greater than a background value. This value can be used to estimate the magnitude based on a known epicentral distance, which is calculated from the difference between P and S onset times. This rapid (< 20 seconds magnitude estimate can be used for rapid response strategies. 
The results obtained in this work confirm that many hypocentral parameters and a rapid location of a seismic event can be obtained using a few
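The severity criterion described above can be sketched in a few lines. The P- and S-wave velocities and the background ratio below are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of the magnitude-distance severity criterion.
VP, VS = 6.0, 3.5  # assumed crustal P- and S-wave velocities (km/s)

def epicentral_distance_km(sp_seconds):
    """Epicentral distance from the S-P onset time difference:
    d = dt / (1/Vs - 1/Vp)."""
    return sp_seconds / (1.0 / VS - 1.0 / VP)

def is_severe(mb, sp_seconds, background_ratio=0.05):
    """Flag an event whose magnitude-distance ratio exceeds a
    background value (background_ratio is an arbitrary placeholder)."""
    return mb / epicentral_distance_km(sp_seconds) > background_ratio
```

With an S-P time of 10 s, the assumed velocities give a distance of 84 km, so a magnitude 5 event would be flagged while a magnitude 3 event would not.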

  16. Pattern recognition based on time-frequency analysis and convolutional neural networks for vibrational events in φ-OTDR

    Science.gov (United States)

    Xu, Chengjin; Guan, Junjun; Bao, Ming; Lu, Jiangang; Ye, Wei

    2018-01-01

Based on vibration signals detected by a phase-sensitive optical time-domain reflectometer (φ-OTDR) distributed optical fiber sensing system, this paper presents an implementation of time-frequency analysis and a convolutional neural network (CNN) to classify different types of vibrational events. First, spectral subtraction and the short-time Fourier transform are used to enhance the time-frequency features of the vibration signals and transform the different types of vibration signals into spectrograms, which are input to the CNN for automatic feature extraction and classification. Finally, by replacing the soft-max layer in the CNN with a multiclass support vector machine, the performance of the classifier is enhanced. Experiments show that after using this method to process 4000 vibration signal samples generated by four different vibration events, namely digging, walking, vehicles passing, and damaging, the recognition rates of vibration events are over 90%. The experimental results prove that this method can automatically make an effective feature selection and greatly improve the classification accuracy of vibrational events in distributed optical fiber sensing systems.
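The spectrogram front end can be illustrated with a bare-bones short-time Fourier transform. This is a sketch of the preprocessing idea only; the paper additionally applies spectral subtraction, and the window sizes here are arbitrary:

```python
import cmath

def stft_magnitude(signal, win=8, hop=4):
    """Magnitude spectrogram: one DFT per overlapping window.
    Rows are time frames, columns are frequency bins 0..win/2."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        spec = []
        for k in range(win // 2 + 1):  # non-negative frequencies only
            s = sum(x * cmath.exp(-2j * cmath.pi * k * n / win)
                    for n, x in enumerate(seg))
            spec.append(abs(s))
        frames.append(spec)
    return frames
```

A pure tone concentrated in one bin produces a spectrogram whose every frame peaks at that bin; real vibration signals yield the time-frequency textures the CNN learns from.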

  17. Assessing Nature-Based Coastal Protection against Disasters Derived from Extreme Hydrometeorological Events in Mexico

    Directory of Open Access Journals (Sweden)

    Octavio Pérez-Maqueo

    2018-04-01

Full Text Available Natural ecosystems are expected to reduce the damaging effects of extreme hydrometeorological events. We tested this prediction for Mexico by performing regression models with two dependent variables, the occurrence of deaths and economic damages, at the state and municipality levels. For each location, the explanatory variables were the Mexican social vulnerability index (which includes socioeconomic aspects, local capacity to prevent and respond to an emergency, and the perception of risk) and land use cover considering different vegetation types. We used the hydrometeorological events that affected Mexico from 1970 to 2011. Our findings reveal that: (a) hydrometeorological events affect both coastal and inland states, although damages are greater on the coast; (b) the protective role of natural ecosystems was only clear at the municipality level: the presence of mangroves, tropical dry forest and tropical rainforest was related to a significant reduction in the occurrence of casualties. Social vulnerability was positively correlated with the occurrence of deaths. Natural ecosystems, both typically coastal (mangroves) and terrestrial (tropical forests located on the mountain ranges close to the coast), function as storm protection. Thus, their conservation and restoration are effective and sustainable strategies that will help protect and develop the increasingly urbanized coasts.

  18. Safety management as a foundation for evidence-based aeromedical standards and reporting of medical events.

    Science.gov (United States)

    Evans, Anthony D; Watson, Dougal B; Evans, Sally A; Hastings, John; Singh, Jarnail; Thibeault, Claude

    2009-06-01

The different interpretations by States (countries) of the aeromedical standards established by the International Civil Aviation Organization have resulted in a variety of approaches to the development of national aeromedical policy, and consequently a relative lack of harmonization. However, in many areas of aviation, safety management systems have recently been introduced and may represent a way forward. A safety management system can be defined as "A systematic approach to managing safety, including the necessary organizational structures, accountabilities, policies, and procedures" (1). There are four main areas where, by applying safety management principles, it may be possible to better use aeromedical data to enhance flight safety. These are: 1) adjustment of the periodicity and content of routine medical examinations to more accurately reflect aeromedical risk; 2) improvement in reporting and analysis of routine medical examination data; 3) improvement in reporting and analysis of in-flight medical events; and 4) support for improved reporting of relevant aeromedical events through the promotion of an appropriate culture by companies and regulatory authorities. This paper explores how the principles of safety management may be applied to aeromedical systems to improve their contribution to safety.

  19. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2013-12-01

Full Text Available The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, the online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model with the remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by a covariance matrix descriptor encoding the motion information, then is classified as a normal or an abnormal frame. Experiments are conducted on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset to demonstrate the promising results of the proposed online LS-OC-SVM method.
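The online learn-then-update pattern can be conveyed with a toy stand-in: a detector that learns a normal model from an initial batch and then refines it incrementally. This is illustrative only and is not the regularized least-squares SVM formulation of the paper:

```python
class OnlineNoveltyDetector:
    """Learn a running mean of a per-frame 'normality' score from
    training frames, then flag frames deviating by more than a fixed
    threshold. (Toy stand-in for the online one-class idea; the paper
    uses a least squares one-class SVM on covariance descriptors.)"""

    def __init__(self, threshold):
        self.threshold = threshold
        self.n = 0
        self.mean = 0.0

    def update(self, score):
        # incremental (online) mean update
        self.n += 1
        self.mean += (score - self.mean) / self.n

    def fit(self, scores):
        for s in scores:
            self.update(s)

    def is_abnormal(self, score):
        return abs(score - self.mean) > self.threshold
```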

  20. An Event-driven, Value-based, Pull Systems Engineering Scheduling Approach

    Science.gov (United States)

    2012-03-01

combining a services approach to systems engineering with a kanban-based scheduling system. It provides the basis for validating the approach with...agent-based simulations. Keywords—systems engineering; systems engineering process; lean; kanban; process simulation. I. INTRODUCTION AND BACKGROUND...approaches [8], [9], we are investigating the use of flow-based pull scheduling techniques (kanban systems) in a rapid response development

  1. Qualitative Characteristics of Memories for Real, Imagined, and Media-Based Events

    Science.gov (United States)

    Gordon, Ruthanna; Gerrig, Richard J.; Franklin, Nancy

    2009-01-01

    People's memories must be able to represent experiences with multiple types of origins--including the real world and our own imaginations, but also printed texts (prose-based media), movies, and television (screen-based media). This study was intended to identify cues that distinguish prose- and screen-based media memories from each other, as well…

  2. Pre-event trajectories of mental health and health-related disabilities, and post-event traumatic stress symptoms and health: A 7-wave population-based study.

    Science.gov (United States)

    van der Velden, Peter G; Bosmans, Mark W G; van der Meulen, Erik; Vermunt, Jeroen K

    2016-12-30

It is unknown to what extent classes of trajectories of pre-event mental health problems (MHP) and health-related disabilities (HRD) predict post-event traumatic stress symptoms (PTSS), MHP and HRD. The aim of the present 7-wave study was to assess these predictive values using a representative sample of Dutch adults (N=4052) participating in three health surveys in November-December 2009 (T1), 2010 (T2), and 2011 (T3). In total, 2988 of the 4052 also participated in trauma surveys in April (T4), August (T5) and December (T6) 2012, and in a fourth health survey in November-December 2012 (T7). About 10% (N=314) were confronted with potentially traumatic events (PTE) in the 4 months before T4 or T5. Latent class analyses among the 4052 respondents identified four classes of pre-event MHP and HRD. Series of multivariate logistic regression analyses with class membership, peri-traumatic stress, type of event, gender, age and education as predictors showed that classes with high levels of MHP or HRD were more at risk for high levels of PTSS at baseline and at the 4- and 8-month follow-ups than classes with low levels of MHP or HRD. These classes were very strong predictors of high levels of post-event MHP and HRD: no differences were found between non-affected and affected respondents with different levels of peri-traumatic stress. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Detection of water-quality contamination events based on multi-sensor fusion using an extended Dempster–Shafer method

    International Nuclear Information System (INIS)

    Hou, Dibo; He, Huimei; Huang, Pingjie; Zhang, Guangxin; Loaiciga, Hugo

    2013-01-01

    This study presents a method for detecting contamination events of sources of drinking water based on the Dempster–Shafer (D-S) evidence theory. The detection method has the purpose of protecting water supply systems against accidental and intentional contamination events. This purpose is achieved by first predicting future water-quality parameters using an autoregressive (AR) model. The AR model predicts future water-quality parameters using recent measurements of these parameters made with automated (on-line) water-quality sensors. Next, a probabilistic method assigns probabilities to the time series of residuals formed by comparing predicted water-quality parameters with threshold values. Finally, the D-S fusion method searches for anomalous probabilities of the residuals and uses the result of that search to determine whether the current water quality is normal (that is, free of pollution) or contaminated. The D-S fusion method is extended and improved in this paper by weighted averaging of water-contamination evidence and by the analysis of the persistence of anomalous probabilities of water-quality parameters. The extended D-S fusion method makes determinations that have a high probability of being correct concerning whether or not a source of drinking water has been contaminated. This paper's method for detecting water-contamination events was tested with water-quality time series from automated (on-line) water quality sensors. In addition, a small-scale, experimental, water-pipe network was tested to detect water-contamination events. The two tests demonstrated that the extended D-S fusion method achieves a low false alarm rate and high probabilities of detecting water contamination events. (paper)
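The fusion step rests on Dempster's rule of combination. A minimal sketch over the frame {normal, contaminated} follows, with hypothetical mass values; the paper's extension additionally weights the evidence and analyzes the persistence of anomalous probabilities:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments over the hypotheses
    'N' (normal), 'C' (contaminated) and 'NC' (uncertain, N-or-C)
    by Dempster's rule, renormalizing away conflicting mass."""
    intersect = {
        ('N', 'N'): 'N', ('N', 'NC'): 'N', ('NC', 'N'): 'N',
        ('C', 'C'): 'C', ('C', 'NC'): 'C', ('NC', 'C'): 'C',
        ('NC', 'NC'): 'NC',
    }
    combined = {'N': 0.0, 'C': 0.0, 'NC': 0.0}
    conflict = 0.0  # mass assigned to contradictory pairs (N vs C)
    for a, wa in m1.items():
        for b, wb in m2.items():
            key = intersect.get((a, b))
            if key is None:
                conflict += wa * wb
            else:
                combined[key] += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```

Combining, say, {'N': 0.6, 'C': 0.3, 'NC': 0.1} from one sensor with {'N': 0.5, 'C': 0.4, 'NC': 0.1} from another renormalizes away 0.39 of conflicting mass and yields a fused belief dominated by 'N'.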

  4. Improving patient safety via automated laboratory-based adverse event grading.

    Science.gov (United States)

    Niland, Joyce C; Stiller, Tracey; Neat, Jennifer; Londrc, Adina; Johnson, Dina; Pannoni, Susan

    2012-01-01

    The identification and grading of adverse events (AEs) during the conduct of clinical trials is a labor-intensive and error-prone process. This paper describes and evaluates a software tool developed by City of Hope to automate complex algorithms to assess laboratory results and identify and grade AEs. We compared AEs identified by the automated system with those previously assessed manually, to evaluate missed/misgraded AEs. We also conducted a prospective paired time assessment of automated versus manual AE assessment. We found a substantial improvement in accuracy/completeness with the automated grading tool, which identified an additional 17% of severe grade 3-4 AEs that had been missed/misgraded manually. The automated system also provided an average time saving of 5.5 min per treatment course. With 400 ongoing treatment trials at City of Hope and an average of 1800 laboratory results requiring assessment per study, the implications of these findings for patient safety are enormous.
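The kind of algorithm such a tool automates can be sketched for a single analyte. The cut-offs below are illustrative CTCAE-style bands for neutropenia, not City of Hope's actual grading rules:

```python
def grade_anc(anc):
    """Map an absolute neutrophil count (10^9/L) to an adverse-event
    grade using illustrative CTCAE-style bands (hypothetical cut-offs)."""
    if anc >= 1.5:
        return 0  # within the normal band: no adverse event
    if anc >= 1.0:
        return 2
    if anc >= 0.5:
        return 3
    return 4

def flag_severe(lab_results):
    """Return the grade 3-4 results from (analyte, value) pairs --
    the class of AEs the automated tool caught more completely."""
    return [(analyte, value) for analyte, value in lab_results
            if analyte == 'ANC' and grade_anc(value) >= 3]
```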

  5. A data base for on-line event analysis on a distributed memory machine

    International Nuclear Information System (INIS)

    Argante, E.; Meesters, M.R.J.; Willers, I.; Stok, P. van der

    1996-01-01

Parallel in-memory databases can enhance the structuring and parallelization of programs used in High Energy Physics (HEP). Efficient database access routines are used as communication primitives which hide the communication topology, in contrast to more explicit communication libraries such as PVM or MPI. A parallel in-memory database, called SPIDER, has been implemented on a 32-node Meiko CS-2 distributed memory machine. The SPIDER primitives generate lower overhead than PVM or MPI. The event reconstruction program, CPREAD, of the CPLEAR experiment has been used as a test case. Performance measurements showed that CPREAD interfaced to SPIDER can easily cope with the event rate generated by CPLEAR. (author)

  6. A power filter for the detection of burst events based on time-frequency spectrum estimation

    International Nuclear Information System (INIS)

    Guidi, G M; Cuoco, E; Vicere, A

    2004-01-01

We propose as a statistic for the detection of bursts in a gravitational wave interferometer the 'energy' of the events, estimated with a time-dependent calculation of the spectrum. This statistic has an asymptotic Gaussian distribution with known statistical moments, which makes it possible to perform a uniformly most powerful test (McDonough R N and Whalen A D 1995 Detection of Signals in Noise (New York: Academic)) on the energy mean. We estimate the receiver operating characteristic (ROC, from the same book) of this statistic for different levels of the signal-to-noise ratio in the specific case of a simulated noise having the spectral density expected for Virgo, using test signals taken from a library of possible waveforms emitted during the collapse of the core of type II supernovae.

  7. Analysis of factors associated with hiccups based on the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Hosoya, Ryuichiro; Uesawa, Yoshihiro; Ishii-Nozawa, Reiko; Kagaya, Hajime

    2017-01-01

    Hiccups are occasionally experienced by most individuals. Although hiccups are not life-threatening, they may lead to a decline in quality of life. Previous studies showed that hiccups may occur as an adverse effect of certain medicines during chemotherapy. Furthermore, a male dominance in hiccups has been reported. However, due to the limited number of studies conducted on this phenomenon, debate still surrounds the few factors influencing hiccups. The present study aimed to investigate the influence of medicines and patient characteristics on hiccups using a large-sized adverse drug event report database and, specifically, the Japanese Adverse Drug Event Report (JADER) database. Cases of adverse effects associated with medications were extracted from JADER, and Fisher's exact test was performed to assess the presence or absence of hiccups for each medication. In a multivariate analysis, we conducted a multiple logistic regression analysis using medication and patient characteristic variables exhibiting significance. We also examined the role of dexamethasone in inducing hiccups during chemotherapy. Medicines associated with hiccups included dexamethasone, levofolinate, fluorouracil, oxaliplatin, carboplatin, and irinotecan. Patient characteristics associated with hiccups included a male gender and greater height. The combination of anti-cancer agent and dexamethasone use was noted in more than 95% of patients in the dexamethasone-use group. Hiccups also occurred in patients in the anti-cancer agent-use group who did not use dexamethasone. Most of the medications that induce hiccups are used in chemotherapy. The results of the present study suggest that it is possible to predict a high risk of hiccups using patient characteristics. We confirmed that dexamethasone was the drug that has the strongest influence on the induction of hiccups. However, the influence of anti-cancer agents on the induction of hiccups cannot be denied. 
We consider the results of the present

  8. Trail-Based Search for Efficient Event Report to Mobile Actors in Wireless Sensor and Actor Networks.

    Science.gov (United States)

    Xu, Zhezhuang; Liu, Guanglun; Yan, Haotian; Cheng, Bin; Lin, Feilong

    2017-10-27

    In wireless sensor and actor networks, when an event is detected, the sensor node needs to transmit an event report to inform the actor. Since the actor moves in the network to execute missions, its location is always unavailable to the sensor nodes. A popular solution is the search strategy that can forward the data to a node without its location information. However, most existing works have not considered the mobility of the node, and thus generate significant energy consumption or transmission delay. In this paper, we propose the trail-based search (TS) strategy that takes advantage of actor's mobility to improve the search efficiency. The main idea of TS is that, when the actor moves in the network, it can leave its trail composed of continuous footprints. The search packet with the event report is transmitted in the network to search the actor or its footprints. Once an effective footprint is discovered, the packet will be forwarded along the trail until it is received by the actor. Moreover, we derive the condition to guarantee the trail connectivity, and propose the redundancy reduction scheme based on TS (TS-R) to reduce nontrivial transmission redundancy that is generated by the trail. The theoretical and numerical analysis is provided to prove the efficiency of TS. Compared with the well-known expanding ring search (ERS), TS significantly reduces the energy consumption and search delay.

  10. Reactor protection system software test-case selection based on input-profile considering concurrent events and uncertainties

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Lee, Seung Jun; Cho, Jaehyun; Jung, Wondea

    2016-01-01

Recently, input-profile-based testing for safety-critical software has been proposed for determining the number of test cases and quantifying the failure probability of the software. The input-profile of reactor protection system (RPS) software is the input which causes activation of the system for emergency shutdown of a reactor. This paper presents a method to determine the input-profile of RPS software that considers concurrent events/transients. A deviation of a process parameter value begins with an event and increases owing to concurrent multi-events, depending on the correlation of process parameters and the severity of incidents. A case of reactor trip caused by feedwater loss and a main steam line break is simulated and analyzed to determine the RPS software input-profile and estimate the number of test cases. Different sizes of main steam line breaks (e.g., small, medium, and large breaks) with total loss of feedwater supply are considered in constructing the input-profile. The uncertainties of the simulation related to input-profile-based software testing are also included. Our study is expected to provide an option for determining test cases and quantifying RPS software failure probability. (author)

  11. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    Science.gov (United States)

    Bech, Joan; Berenguer, Marc

    2014-05-01

Operational quantitative precipitation forecasts (QPF) are provided routinely by weather services or hydrological authorities, particularly those responsible for densely populated regions of small catchments, such as those typically found in Mediterranean areas prone to flash floods. Specific rainfall values are used as thresholds for issuing warning levels considering different time frameworks (mid-range, short-range, 24 h, 1 h, etc.), for example 100 mm in 24 h or 60 mm in 1 h. There is a clear need to determine how feasible a specific rainfall value is for a given lead-time, in particular for very short range forecasts or nowcasts typically obtained from weather radar observations (Pierce et al. 2012). In this study we assess which specific nowcast lead-times can be provided for a number of heavy precipitation events (HPE) that affected Catalonia (NE Spain). The nowcasting system we employed generates QPFs through the extrapolation of rainfall fields observed with weather radar, following a Lagrangian approach developed and tested successfully in previous studies (Berenguer et al. 2005, 2011). QPFs up to 3 h are then compared with two quality-controlled observational data sets: weather radar quantitative precipitation estimates (QPE) and raingauge data. Several high-impact HPE were selected, including the 7 September 2005 Llobregat Delta river tornado outbreak (Bech et al. 2007) and the 2 November 2008 supercell tornadic thunderstorms (Bech et al. 2011), both producing, among other effects, local flash floods. In these two events there were torrential rainfall rates (30' amounts exceeding 38.2 and 12.3 mm respectively) and 24 h accumulation values above 100 mm. A number of verification scores are used to characterize the evolution of precipitation forecast quality with time, which typically presents a decreasing trend but shows a strong dependence on the selected rainfall threshold and integration period. For example considering correlation factors, 30
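The Lagrangian extrapolation behind such nowcasts amounts to advecting the latest radar field along an estimated motion field. A toy version with a single constant motion vector follows (the operational system estimates motion from successive radar scans; the grid and vector here are assumptions):

```python
def extrapolate(field, u, v, steps):
    """Advect a 2-D rain field by (u, v) grid cells per step, keeping
    intensities constant (Lagrangian persistence). Cells advected in
    from outside the domain are filled with zero."""
    rows, cols = len(field), len(field[0])
    out = field
    for _ in range(steps):
        nxt = [[0.0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                r0, c0 = r - v, c - u  # upstream source cell
                if 0 <= r0 < rows and 0 <= c0 < cols:
                    nxt[r][c] = out[r0][c0]
        out = nxt
    return out
```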

  12. Developing Clinical Competency in Crisis Event Management: An Integrated Simulation Problem-Based Learning Activity

    Science.gov (United States)

    Liaw, S. Y.; Chen, F. G.; Klainin, P.; Brammer, J.; O'Brien, A.; Samarasekera, D. D.

    2010-01-01

    This study aimed to evaluate the integration of a simulation based learning activity on nursing students' clinical crisis management performance in a problem-based learning (PBL) curriculum. It was hypothesized that the clinical performance of first year nursing students who participated in a simulated learning activity during the PBL session…

  13. Multiagent System-Based Wide-Area Protection and Control Scheme against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2015-01-01

In this paper, a multi-agent system (MAS) based wide-area protection and control scheme is proposed to deal with cascading trips induced by long-term voltage instability. Based on sensitivity analysis between the relay operation margin and power system state variables, an optimal emergency control strategy is defined to adjust the emergency states in a timely manner and prevent unexpected relay trips. In order to supervise the control process and further minimize the load loss, an agent-based process control is adopted to monitor the states of distributed controllers and adjust the emergency control strategy. A hybrid simulation platform based on LabVIEW and a real-time digital simulator (RTDS) is set up to simulate a blackout case in the power system of Eastern Denmark and to demonstrate the effectiveness of the proposed MAS-based protection strategy.

  14. Identification of Tropical-Extratropical Interactions and Extreme Precipitation Events in the Middle East based on Potential Vorticity and Moisture Transport

    KAUST Repository

    de Vries, A. J.; Ouwersloot, H. G.; Feldstein, S. B.; Riemer, M.; El Kenawy, A. M.; McCabe, Matthew; Lelieveld, J.

    2017-01-01

    ) intrusion reaches deep into the subtropics and forces an incursion of high poleward vertically integrated water vapor transport (IVT) into the Middle East. This study presents an object-based identification method for extreme precipitation events based

  15. Remote Sensing of Clouds And Precipitation: Event-Based Characterization, Life Cycle Evolution, and Aerosol Influences

    Science.gov (United States)

    Esmaili, Rebekah Bradley

    contiguous United States. There was agreement on seasonal totals, but closer examination shows that the average intensity and duration of events is too high, and too infrequent compared to events detected on the ground. Awareness of the strengths and limitations, particularly in context of high-resolution cloud development, can enhance SPPs and can complement climate model simulations.

  16. Flood modelling with a distributed event-based parsimonious rainfall-runoff model: case of the karstic Lez river catchment

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2012-04-01

Full Text Available Rainfall-runoff models are crucial tools for the statistical prediction of flash floods and real-time forecasting. This paper focuses on a karstic basin in the South of France and proposes a distributed parsimonious event-based rainfall-runoff model, coherent with the poor knowledge of both evaporative and underground fluxes. The model combines a SCS runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The efficiency of the model is discussed not only in its ability to satisfactorily simulate floods but also to provide powerful relationships between the initial condition of the model and various predictors of the initial wetness state of the basin, such as the base flow, the Hu2 index from the Meteo-France SIM model and the piezometric levels of the aquifer. The advantage of using meteorological radar rainfall in flood modelling is also assessed. Model calibration proved to be satisfactory using an hourly time step, with Nash criterion values ranging between 0.66 and 0.94 for eighteen of the twenty-one selected events. The radar rainfall inputs significantly improved the simulations or the assessment of the initial condition of the model for 5 events at the beginning of autumn, mostly in September–October (mean improvement of Nash is 0.09; correction in the initial condition ranges from −205 to 124 mm), but were less efficient for the events at the end of autumn. In this period, the weak vertical extension of the precipitation system and the low altitude of the 0 °C isotherm could affect the efficiency of radar measurements due to the distance between the basin and the radar (~60 km). The model initial condition S is correlated with the three tested predictors (R² > 0.6). The interpretation of the model suggests that groundwater does not affect the first peaks of the flood, but can strongly impact subsequent peaks in the case of a multi-storm event. Because this kind of model is based on a limited
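The SCS runoff component follows the textbook curve-number relation. The curve number below is an arbitrary illustration; the paper's model applies this per grid cell and then routes each cell's runoff with a Lag and Route scheme, not shown here:

```python
def scs_runoff_mm(p_mm, cn, ia_ratio=0.2):
    """SCS curve-number direct runoff for an event rainfall P (mm):
    S = 25400/CN - 254 (mm), Ia = 0.2*S,
    Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = ia_ratio * s          # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For example, 100 mm of event rainfall on a cell with CN = 80 yields about 50.5 mm of direct runoff, while 10 mm produces none because it does not exceed the initial abstraction.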

  17. Potentially traumatic events have negative and positive effects on loneliness, depending on PTSD-symptom levels: evidence from a population-based prospective comparative study.

    Science.gov (United States)

    van der Velden, Peter G; Pijnappel, Bas; van der Meulen, Erik

    2018-02-01

We examined to what extent adults affected by recent potentially traumatic events (PTE) with different PTSD-symptom levels are more at risk for post-event loneliness than non-affected adults in the same study period. We extracted data from the Dutch longitudinal LISS panel to measure pre-event loneliness (2011) and post-event loneliness (2013 and 2014), pre-event mental health problems (2011), and PTE and PTSD symptoms (2012). This panel is based on a traditional random sample drawn from the population register by Statistics Netherlands. Results of the multinomial logistic regression analyses showed that affected adults with high levels of PTSD symptoms were more at risk for high levels of post-event loneliness than affected adults with very low PTSD-symptom levels and non-affected adults, while controlling for pre-event loneliness, pre-event mental health problems and demographics. However, affected adults with very low levels of PTSD symptoms were less at risk for medium and high levels of post-event loneliness than non-affected adults, while controlling for the same variables. Yet, pre-event loneliness appeared to be the strongest independent predictor of loneliness at later stages: more than 80% with high pre-event levels had high post-event levels at both follow-ups. Remarkably, potentially traumatic events have, depending on PTSD-symptom levels, both negative and positive effects on post-event loneliness, in favor of affected adults with very low PTSD-symptom levels. However, post-event levels at later stages are predominantly determined by pre-event loneliness levels.

  18. Event based neutron activation spectroscopy and analysis algorithm using MLE and meta-heuristics

    International Nuclear Information System (INIS)

    Wallace, B.

    2014-01-01

Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high-efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis was needed. The use of multiple detectors prompts the need for flexible storage of acquisition data to enable sophisticated post-processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes involved was used to create a statistical model. Maximum likelihood estimation was combined with meta-heuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code, yielding positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research, such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to translate well into other fields of research. (author)
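The MLE-plus-meta-heuristic fit can be caricatured with a random search over the decay constant of an exponential decay-time model. This is a deliberately simple stand-in under assumed data; the paper's statistical model and search heuristics are more sophisticated:

```python
import math
import random

def neg_log_likelihood(lam, times):
    """Negative log-likelihood of i.i.d. exponential decay times."""
    return -sum(math.log(lam) - lam * t for t in times)

def fit_half_life(times, iters=500, lam_max=2.0, seed=7):
    """Random-search 'meta-heuristic' over the decay constant,
    scored by the likelihood; returns the fitted half-life."""
    rng = random.Random(seed)
    best = min((rng.uniform(1e-3, lam_max) for _ in range(iters)),
               key=lambda lam: neg_log_likelihood(lam, times))
    return math.log(2.0) / best
```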

  19. A PC-based discrete event simulation model of the Civilian Radioactive Waste Management System

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1991-01-01

    A System Simulation Model has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. This model can be used to quantify the impacts of different operating schedules, operational rules, system configurations, and equipment reliability and availability considerations on the performance of processes comprising the CRWMS and how these factors combine to determine overall system performance for the purpose of making system design decisions. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste.
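    GPSS/PC itself is a proprietary simulation language, but the next-event scheduling at the heart of such a model can be sketched in a few lines. The single-crane "package emplacement" toy below is an invented illustration of the pattern (time-stamped events on a priority queue), not the CRWMS model:

```python
import heapq
import itertools

class DES:
    """A minimal discrete event simulator: time-stamped actions on a priority queue."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._ids = itertools.count()  # tie-breaker so actions are never compared
    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, next(self._ids), action))
    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action()

def simulate(arrivals, service_time):
    """Toy flow: packages arrive, queue for a single handling crane, get emplaced."""
    sim, emplaced, crane_free = DES(), [], [0.0]  # crane_free[0] = time crane is next idle
    def arrive(pkg):
        def handle():
            start = max(sim.now, crane_free[0])   # wait if the crane is busy
            crane_free[0] = start + service_time
            sim.schedule(crane_free[0] - sim.now,
                         lambda: emplaced.append((pkg, sim.now)))
        return handle
    for i, t in enumerate(arrivals):
        sim.schedule(t, arrive(i))
    sim.run()
    return emplaced
```

    With arrivals at t = 0, 1, 2 and a 5-unit service time, the packages finish at t = 5, 10, 15, showing how a shared-resource constraint propagates delay through the system.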

  20. Single-Trial Event-Related Potential Based Rapid Image Triage System

    Directory of Open Access Journals (Sweden)

    Ke Yu

    2011-06-01

    Searching for points of interest (POI) in large-volume imagery is a challenging problem with few good solutions. In this work, a neural engineering approach called rapid image triage (RIT), which can offer roughly a ten-fold speed-up in POI searching, is developed. It is essentially a cortically-coupled computer vision technique, whereby the user is presented with bursts of images at a rate of 6–15 images per second, and neural signals called event-related potentials (ERPs) are used as the 'cue' that the user has seen an image of high relevance likelihood. Compared with past efforts, the implemented system has several unique features: (1) it applies overlapping frames in image chip preparation to ensure rapid image triage performance; (2) a novel common spatial-temporal pattern (CSTP) algorithm that makes use of both the spatial and temporal patterns of ERP topography is proposed for high-accuracy single-trial ERP detection; (3) a weighted version of a probabilistic support vector machine (SVM) is used to address the inherently unbalanced nature of single-trial ERP detection for RIT. The high accuracy, fast learning, and real-time capability of the developed system, shown on 20 subjects, demonstrate the feasibility of a brain-machine integrated rapid image triage system for fast detection of POI from large-volume imagery.

  1. VA Suicide Prevention Applications Network: A National Health Care System-Based Suicide Event Tracking System.

    Science.gov (United States)

    Hoffmire, Claire; Stephens, Brady; Morley, Sybil; Thompson, Caitlin; Kemp, Janet; Bossarte, Robert M

    2016-11-01

    The US Department of Veterans Affairs' Suicide Prevention Applications Network (SPAN) is a national system for suicide event tracking and case management. The objective of this study was to assess data on suicide attempts among people using Veterans Health Administration (VHA) services. We assessed the degree of data overlap on suicide attempters reported in SPAN and in the VHA's medical records from October 1, 2010, to September 30, 2014, overall, by year, and by region. Data on suicide attempters in the VHA's medical records consisted of diagnoses documented with E95 codes from the International Classification of Diseases, Ninth Revision. Of 50,518 VHA patients who attempted suicide during the 4-year study period, fewer than half (41%) were reported in both SPAN and the medical records; nearly 65% of patients whose suicide attempt was recorded in SPAN had no data on attempted suicide in the VHA's medical records. Evaluation of administrative data suggests that use of SPAN substantially increases the collection of data on suicide attempters as compared with the use of medical records alone, but neither SPAN nor the VHA's medical records identify all suicide attempters. Further research is needed to better understand the strengths and limitations of both systems and how best to combine information across systems.

  2. Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach

    Directory of Open Access Journals (Sweden)

    Joeri Hofmans

    2017-11-01

    A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories (in the form of the dynamic model of the psychological contract) and research methods (in the form of daily diary research and experience sampling research) are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models, the Zero-Inflated model and the Hurdle model, that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue.

  3. Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach.

    Science.gov (United States)

    Hofmans, Joeri

    2017-01-01

    A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories (in the form of the dynamic model of the psychological contract) and research methods (in the form of daily diary research and experience sampling research) are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models, the Zero-Inflated model and the Hurdle model, that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue.
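    The two model components map directly onto a log-likelihood. A minimal sketch of the hurdle variant (a logit hurdle plus zero-truncated Poisson severity; the paper's exact link functions and distributions are not specified in this record):

```python
import math

def hurdle_loglik(y, p_violate, lam):
    """Log-likelihood of one observation under a hurdle model.
    y == 0 : no violation occurred, with probability 1 - p_violate.
    y >= 1 : violation occurred (p_violate), with severity y drawn from a
             Poisson truncated at zero, so the two parts are cleanly separated."""
    if y == 0:
        return math.log(1.0 - p_violate)
    trunc = 1.0 - math.exp(-lam)  # P(Poisson(lam) >= 1)
    log_pois = -lam + y * math.log(lam) - math.lgamma(y + 1)
    return math.log(p_violate) + log_pois - math.log(trunc)
```

    In the Zero-Inflated variant, zeros can also arise from the count component, so the two parts are not as cleanly separated as here; regression versions attach covariates to `p_violate` and `lam` separately, which is what yields the component-specific antecedents described above.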

  4. Topology and signatures of a model for flux transfer events based on vortex-induced reconnection

    International Nuclear Information System (INIS)

    Liu, Z.X.; Zhu, Z.W.; Li, F.; Pu, Z.Y.

    1992-01-01

    A model of the disturbed magnetic field and disturbed velocity of flux transfer events (FTEs) is deduced on the basis of the vortex-induced reconnection theory. The topology and signatures of FTEs are calculated and discussed. The authors propose that the observed forms of FTE signatures depend on the direction of motion of the FTE tube, the position of the spacecraft relative to the passing FTE tube, and which part of the FTE tube (the magnetosphere part, the magnetopause part, or the magnetosheath part) the spacecraft is passing through. It is found that when an FTE tube moves from south to north along a straight line in the northern hemisphere, positive FTEs appear for most passages; however, reverse FTEs are also observed occasionally while the signatures of B_Z (B_L) appear as a single peak, and irregular FTEs always correspond to oblique line motions of the FTE tube. The velocity signatures are similar to those of the magnetic field, but in the northern hemisphere their directions are all just opposite to those of the magnetic field. The calculated results for the magnetic field are compared with 61 observed FTEs. The observed signatures (B_N and B_L) of 52 FTEs are consistent with the calculations. The results indicate that a majority of observed FTEs correspond to passages of spacecraft through the edges of FTE tubes.

  5. Neurophysiological Effects of Meditation Based on Evoked and Event Related Potential Recordings.

    Science.gov (United States)

    Singh, Nilkamal; Telles, Shirley

    2015-01-01

    Evoked potentials (EPs) are a relatively noninvasive method to assess the integrity of sensory pathways. As the neural generators for most of the components are relatively well worked out, EPs have been used to understand the changes occurring during meditation. Event-related potentials (ERPs) yield useful information about the response to tasks, usually assessing attention. A brief review of the literature yielded eleven studies on EPs and seventeen on ERPs from 1978 to 2014. The EP studies covered short-, mid-, and long-latency EPs, using both auditory and visual modalities. ERP studies reported the effects of meditation on tasks such as the auditory oddball paradigm, the attentional blink task, mismatch negativity, and affective picture viewing, among others. Both EPs and ERPs were recorded in the several meditation practices detailed in the review. Maximum changes occurred in mid-latency (auditory) EPs, suggesting that maximum changes occur in the corresponding neural generators in the thalamus, thalamic radiations, and primary auditory cortical areas. ERP studies showed meditation can increase attention and enhance the efficiency of brain resource allocation, with greater emotional control.

  6. A PC-based discrete event simulation model of the civilian radioactive waste management system

    International Nuclear Information System (INIS)

    Airth, G.L.; Joy, D.S.; Nehls, J.W.

    1992-01-01

    This paper discusses a System Simulation Model which has been developed for the Department of Energy to simulate the movement of individual waste packages (spent fuel assemblies and fuel containers) through the Civilian Radioactive Waste Management System (CRWMS). A discrete event simulation language, GPSS/PC, which runs on an IBM/PC and operates under DOS 5.0, mathematically represents the movement and processing of radioactive waste packages through the CRWMS and the interaction of these packages with the equipment in the various facilities. The major features of the System Simulation Model are: the ability to reference characteristics of the different types of radioactive waste (age, burnup, etc.) in order to make operational and/or system design decisions, the ability to place stochastic variations on operational parameters such as processing time and equipment outages, and the ability to include a rigorous simulation of the transportation system. Output from the model includes the numbers, types, and characteristics of waste packages at selected points in the CRWMS and the extent to which various resources will be utilized in order to transport, process, and emplace the waste.

  7. Neurophysiological Effects of Meditation Based on Evoked and Event Related Potential Recordings

    Science.gov (United States)

    Singh, Nilkamal; Telles, Shirley

    2015-01-01

    Evoked potentials (EPs) are a relatively noninvasive method to assess the integrity of sensory pathways. As the neural generators for most of the components are relatively well worked out, EPs have been used to understand the changes occurring during meditation. Event-related potentials (ERPs) yield useful information about the response to tasks, usually assessing attention. A brief review of the literature yielded eleven studies on EPs and seventeen on ERPs from 1978 to 2014. The EP studies covered short-, mid-, and long-latency EPs, using both auditory and visual modalities. ERP studies reported the effects of meditation on tasks such as the auditory oddball paradigm, the attentional blink task, mismatch negativity, and affective picture viewing, among others. Both EPs and ERPs were recorded in the several meditation practices detailed in the review. Maximum changes occurred in mid-latency (auditory) EPs, suggesting that maximum changes occur in the corresponding neural generators in the thalamus, thalamic radiations, and primary auditory cortical areas. ERP studies showed meditation can increase attention and enhance the efficiency of brain resource allocation, with greater emotional control. PMID:26137479

  8. Optimized Data Transfers Based on the OpenCL Event Management Mechanism

    Directory of Open Access Journals (Sweden)

    Hiroyuki Takizawa

    2015-01-01

    In standard OpenCL programming, hosts are supposed to control their compute devices. Since compute devices are dedicated to kernel computation, only hosts can execute several kinds of data transfers such as internode communication and file access. These data transfers require one host to simultaneously play two or more roles due to the need for collaboration between the host and devices. The codes for such data transfers are likely to be system-specific, resulting in low portability. This paper proposes an OpenCL extension that incorporates such data transfers into the OpenCL event management mechanism. Unlike the current OpenCL standard, the main thread running on the host is not blocked to serialize dependent operations. Hence, an application can easily use the opportunities to overlap parallel activities of hosts and compute devices. In addition, the implementation details of data transfers are hidden behind the extension, and application programmers can use the optimized data transfers without any tricky programming techniques. The evaluation results show that the proposed extension can use the optimized data transfer implementation and thereby increase the sustained data transfer performance by about 18% for a real application accessing a big data file.
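    The proposed extension is not reproduced here, but the host-side pattern it builds on, commands chained through event wait lists rather than through host blocking, can be mimicked in plain Python. `CommandQueue` and `Event` below are toy stand-ins for OpenCL's command queues and `cl_event` objects, not the paper's API:

```python
class Event:
    """Stand-in for a cl_event: a completion marker that later commands can wait on."""
    def __init__(self, name):
        self.name = name
        self.done = False

class CommandQueue:
    """Toy command queue: enqueue() returns an Event immediately (the host thread
    is never blocked); flush() runs commands once their wait lists are satisfied."""
    def __init__(self):
        self.pending = []
    def enqueue(self, name, action, wait_list=()):
        ev = Event(name)
        self.pending.append((ev, action, tuple(wait_list)))
        return ev
    def flush(self):
        executed = []
        while self.pending:
            runnable = [it for it in self.pending if all(w.done for w in it[2])]
            if not runnable:
                raise RuntimeError("cyclic wait list")
            for ev, action, _ in runnable:
                action()
                ev.done = True
                executed.append(ev.name)
            finished = {it[0] for it in runnable}
            self.pending = [it for it in self.pending if it[0] not in finished]
        return executed
```

    Real OpenCL expresses the same dependency graph with `clEnqueue*` calls carrying event wait lists; the point of the sketch is only that the dependencies, not host-side blocking, serialize the operations.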

  9. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.

  10. Discrete Event Simulation-Based Resource Modelling in Health Technology Assessment.

    Science.gov (United States)

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Dixon, Simon

    2017-10-01

    The objective of this article was to conduct a systematic review of published research on the use of discrete event simulation (DES) for resource modelling (RM) in health technology assessment (HTA). RM is broadly defined as incorporating and measuring the effects of constraints on physical resources (e.g. beds, doctors, nurses) in HTA models. Systematic literature searches were conducted in academic databases (JSTOR, SAGE, SPRINGER, SCOPUS, IEEE, Science Direct, PubMed, EMBASE) and grey literature (Google Scholar, NHS journal library), enhanced by manual searches (i.e. reference list checking, citation searching and hand-searching techniques). The search strategy yielded 4117 potentially relevant citations. Following the screening and manual searches, ten articles were included. Reviewing these articles provided insights into the applications of RM: firstly, different types of economic analyses, model settings, RM and cost-effectiveness analysis (CEA) outcomes were identified. Secondly, variation in the characteristics of the constraints, such as the types and nature of constraints and the sources of data for the constraints, was identified. Thirdly, it was found that including the effects of constraints caused the CEA results to change in these articles. The review found that DES proved to be an effective technique for RM, but only a small number of studies had applied it in HTA. However, these studies showed the important consequences of modelling physical constraints and point to the need for a framework to guide future applications of this approach.

  11. Spironolactone and risk of upper gastrointestinal events: population based case-control study

    NARCIS (Netherlands)

    K.M.C. Verhamme (Katia); G. Mosis (Georgio); B.H.Ch. Stricker (Bruno); M.C.J.M. Sturkenboom (Miriam); J.P. Dieleman (Jeanne)

    2006-01-01

    OBJECTIVE: To confirm and quantify any association between spironolactone and upper gastrointestinal bleeding and ulcers. DESIGN: Population based case-control study. SETTING: A primary care information database in the Netherlands. PARTICIPANTS: All people on the

  12. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Runoff prediction in data-limited areas is vital for hydrological applications such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulating runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated from remotely sensed soil moisture by using the Soil Conservation Service Curve Number (SCS-CN) method, one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and the soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using the four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI) and ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI was optimal among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5, by 60% and 66%, respectively. Moreover, the proposed IWC framework with
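    The SCS-CN step at the centre of this framework is a closed-form calculation. A minimal sketch (depths in mm; the 0.2 initial-abstraction ratio is the conventional default, not necessarily the study's choice):

```python
def scs_cn_runoff(rainfall_mm, cn, ia_ratio=0.2):
    """Direct runoff depth Q (mm) from event rainfall P via the SCS-CN method:
    S = 25400/CN - 254 (mm), Ia = ia_ratio * S, and
    Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0.
    The initial wetness condition enters through CN: a wetter antecedent state
    maps to a higher CN and hence more runoff."""
    s = 25400.0 / cn - 254.0        # maximum potential retention
    ia = ia_ratio * s               # initial abstraction
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)
```

    For example, 50 mm of rain on CN = 75 yields roughly 9 mm of runoff, while the same storm on CN = 85 (wetter antecedent conditions) yields about twice that, which is why getting the IWC right matters.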

  13. QoS Assurance for Service-Based Applications Using Discrete-Event Simulation

    OpenAIRE

    Jamoussi , Yassine; Driss , Maha; Jézéquel , Jean-Marc; Hajjami Ben Ghézala , Henda

    2010-01-01

    The new paradigm for distributed computing over the Internet is that of Web services. The goal of Web services is to achieve universal interoperability between applications by using standardized protocols and languages. One of the key ideas of the Web service paradigm is the ability of building complex and value-added service-based applications by composing preexisting services. For a service-based application, in addition to its functional requirements, Quality of ser...

  14. Architecture design of the multi-functional wavelet-based ECG microprocessor for realtime detection of abnormal cardiac events.

    Science.gov (United States)

    Cheng, Li-Fang; Chen, Tung-Chien; Chen, Liang-Gee

    2012-01-01

    Most abnormal cardiac events, such as myocardial ischemia, acute myocardial infarction (AMI) and fatal arrhythmia, can be diagnosed through continuous electrocardiogram (ECG) analysis. According to recent clinical research, early detection and alarming of such cardiac events can reduce the time delay to the hospital, and the clinical outcomes of these individuals can be greatly improved. Therefore, it would be helpful to have a long-term ECG monitoring system with the ability to identify abnormal cardiac events and provide real-time warning for the users. The combination of a wireless body area sensor network (BASN) and an on-sensor ECG processor is a possible solution for this application. In this paper, we aim to design and implement a digital signal processor that is suitable for continuous ECG monitoring and alarming based on the continuous wavelet transform (CWT) through the proposed architectures, using both a programmable RISC processor and application-specific integrated circuits (ASIC) for performance optimization. According to the implementation results, the power consumption of the proposed processor integrated with an ASIC for CWT computation is only 79.4 mW. Compared with the single-RISC processor, a power reduction of about 91.6% is achieved.
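    As a rough illustration of CWT-based event detection (a floating-point sketch, not the paper's fixed-point RISC/ASIC implementation), one scale of a Ricker-wavelet CWT plus peak picking:

```python
import math

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet: a standard CWT kernel for spiky, QRS-like shapes."""
    norm = 2.0 / (math.sqrt(3.0 * a) * math.pi ** 0.25)
    centre = (points - 1) / 2.0
    return [norm * (1.0 - ((i - centre) / a) ** 2)
            * math.exp(-0.5 * ((i - centre) / a) ** 2)
            for i in range(points)]

def cwt_row(signal, a):
    """One CWT scale: correlate the signal with a Ricker wavelet of width a."""
    w = ricker(10 * int(a) + 1, a)  # odd length so the wavelet peak sits on a sample
    half = len(w) // 2
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, wk in enumerate(w):
            j = n + k - half
            if 0 <= j < len(signal):
                acc += signal[j] * wk
        out.append(acc)
    return out

def detect_events(signal, a=4.0, ratio=0.5):
    """Flag strict local maxima of the CWT response above a fraction of its peak."""
    row = cwt_row(signal, a)
    top = max(row)
    return [n for n in range(1, len(row) - 1)
            if row[n] > ratio * top and row[n] > row[n - 1] and row[n] > row[n + 1]]
```

    On a synthetic trace with two impulses the detector returns exactly their positions; a real QRS detector would combine several scales and an adaptive threshold.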

  15. Infectious diseases prioritisation for event-based surveillance at the European Union level for the 2012 Olympic and Paralympic Games.

    Science.gov (United States)

    Economopoulou, A; Kinross, P; Domanovic, D; Coulombier, D

    2014-04-17

    In 2012, London hosted the Olympic and Paralympic Games (the Games), with events occurring throughout the United Kingdom (UK) between 27 July and 9 September 2012. Public health surveillance was performed by the Health Protection Agency (HPA). Collaboration between the HPA and the European Centre for Disease Prevention and Control (ECDC) was established for the detection and assessment of significant infectious disease events (SIDEs) occurring outside the UK during the time of the Games. Additionally, ECDC undertook an internal prioritisation exercise to facilitate ECDC’s decisions on which SIDEs should have preferentially enhanced monitoring through epidemic intelligence activities for detection and reporting in daily surveillance in the European Union (EU). A team of ECDC experts evaluated potential public health risks to the Games, selecting and prioritising SIDEs for event-based surveillance with regard to their potential for importation to the Games, occurrence during the Games or export to the EU/European Economic Area from the Games. The team opted for a multilevel approach including comprehensive disease selection, development and use of a qualitative matrix scoring system and a Delphi method for disease prioritisation. The experts selected 71 infectious diseases to enter the prioritisation exercise of which 27 were considered as priority for epidemic intelligence activities by ECDC for the EU for the Games.
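    The qualitative matrix scoring and ranking step can be sketched as a weighted sum over per-criterion ordinal scores. The criteria, weights, and disease scores below are invented placeholders, not the ECDC's actual matrix:

```python
def prioritise(scores, weights):
    """Qualitative matrix scoring: each disease gets ordinal scores per criterion
    (e.g. 0-3), combined as a weighted sum and ranked, highest priority first."""
    def total(disease):
        return sum(weights[c] * s for c, s in scores[disease].items())
    return sorted(scores, key=total, reverse=True)
```

    The three criteria mirror the ones named in the abstract: potential for importation to the Games, occurrence during the Games, and export from the Games; a Delphi round would then reconcile expert disagreement on the scores themselves.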

  16. Positive predictive value of a register-based algorithm using the Danish National Registries to identify suicidal events

    DEFF Research Database (Denmark)

    Gasse, Christiane; Danielsen, Andreas Aalkjaer; Pedersen, Marianne Giørtz

    2018-01-01

    PURPOSE: It is not possible to fully assess intention of self-harm and suicidal events using information from administrative databases. We conducted a validation study of the intention of suicide attempts/self-harm contacts identified by a commonly applied Danish register-based algorithm (DK-algorithm), estimating the algorithm's positive predictive value (PPV) for suicidal events overall, by gender, age groups, and calendar time. RESULTS: We retrieved medical records for 357 (75%) people. The PPV of the DK-algorithm to identify suicidal events was 51.5% (95% CI: 46.4-56.7) overall, 42.7% (95% CI: 35.2-50.5) in males, and 58.5% (95% CI: 51.6-65.1) in females. The PPV varied further across age groups and calendar time. After excluding cases identified via the DK-algorithm by unspecific codes of intoxications and injury, the PPV improved slightly (56.8% [95% CI: 50.0-63.4]). CONCLUSIONS: The DK-algorithm can reliably identify self-harm with suicidal intention in 52% of cases.
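    The headline figure can be reproduced from the counts in the record. Assuming 184 of the 357 reviewed cases were confirmed suicidal events (184/357 ≈ 51.5%; the 184 is inferred, not stated), a Wilson score interval, a common choice for proportions whose use here is an assumption since the study's exact CI method is not given in this record, reproduces the reported 46.4-56.7% bounds:

```python
import math

def ppv_wilson(tp, fp, z=1.96):
    """PPV = TP / (TP + FP) with a Wilson score confidence interval."""
    n = tp + fp
    p = tp / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2.0 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1.0 - p) / n + z * z / (4.0 * n * n))
    return p, centre - half, centre + half
```

    `ppv_wilson(184, 173)` gives a point estimate of 0.515 with interval (0.464, 0.567), matching the abstract to three decimals.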

  17. Stress reaction process-based hierarchical recognition algorithm for continuous intrusion events in optical fiber prewarning system

    Science.gov (United States)

    Qu, Hongquan; Yuan, Shijiao; Wang, Yanping; Yang, Dan

    2018-04-01

    To improve the recognition performance of optical fiber prewarning system (OFPS), this study proposed a hierarchical recognition algorithm (HRA). Compared with traditional methods, which employ only a complex algorithm that includes multiple extracted features and complex classifiers to increase the recognition rate with a considerable decrease in recognition speed, HRA takes advantage of the continuity of intrusion events, thereby creating a staged recognition flow inspired by stress reaction. HRA is expected to achieve high-level recognition accuracy with less time consumption. First, this work analyzed the continuity of intrusion events and then presented the algorithm based on the mechanism of stress reaction. Finally, it verified the time consumption through theoretical analysis and experiments, and the recognition accuracy was obtained through experiments. Experiment results show that the processing speed of HRA is 3.3 times faster than that of a traditional complicated algorithm and has a similar recognition rate of 98%. The study is of great significance to fast intrusion event recognition in OFPS.
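    The staged flow can be sketched as a two-stage cascade: a cheap screening stage rejects quiet windows immediately, and the expensive multi-feature classifier runs only on the survivors. The threshold and class labels below are invented placeholders, not the paper's features:

```python
def hierarchical_recognise(frames, screen, classify):
    """Staged ('stress reaction') recognition sketch: screen() is a fast, cheap
    test applied to every frame; classify() is the expensive classifier, invoked
    only when screening flags a possible intrusion."""
    labels, expensive_calls = [], 0
    for frame in frames:
        if not screen(frame):
            labels.append("quiet")
        else:
            expensive_calls += 1
            labels.append(classify(frame))
    return labels, expensive_calls
```

    Because most windows in a long fiber record are quiet, the expensive stage runs on only a small fraction of frames, which is the source of the speedup the paper reports.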

  18. Lessons learned from events declared to the ASN related to interventional radiology and having occurred during radiation-based acts

    International Nuclear Information System (INIS)

    Lachaume, Jean-Luc

    2014-01-01

    Based on an analysis of events declared to the ASN and on inspection observations in the field of interventional radiology, this report outlines that the majority of these events could have been avoided and that they result from a lack of radiation protection culture, notably an unawareness of the doses delivered to patients or received by practitioners and of the risks related to exposure to ionizing radiation. The report notably outlines that events are related to a lack of staff and means for patient and personnel radiation protection, underdeveloped risk management and radiation protection implementation, shortcomings in the management of delivered or received doses, the absence of professional practice assessment approaches, insufficient operator training, and weaknesses in the management of subcontracted operations. Recommendations are made concerning needs in medical radiophysics, the identification of acts and patients at risk and the definition of patient follow-up modalities, the implementation of a professional practice assessment approach, the storage of dosimetric data, the improvement of operator technical training, the control of subcontracted operations, and the anticipation of technical and organisational changes.

  19. Simple procedure for evaluating earthquake response spectra of large-event motions based on site amplification factors derived from smaller-event records

    International Nuclear Information System (INIS)

    Dan, Kazuo; Miyakoshi, Jun-ichi; Yashiro, Kazuhiko.

    1996-01-01

    A simple procedure was proposed for evaluating earthquake response spectra of large-event motions by making use of records from smaller events. In the proposed procedure, the result of a regression analysis of the response spectra is used to obtain the site amplification factors, and the formulation of the seismic-source term in the regression analysis was examined. A linear form of the moment magnitude, Mw, is adequate for scaling the source term of moderate earthquakes with Mw of 5.5 to 7.0, while a quadratic form of Mw and the ω-square source-spectrum model are appropriate for scaling the source term of smaller and greater earthquakes, respectively. (author). 52 refs
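    The magnitude-scaling idea can be sketched as follows. The coefficients are invented for illustration and are not the paper's fitted regression values; the point is only the structure: a log source term linear in Mw in the moderate range, with an extra quadratic correction outside it, while the small-event record's embedded site amplification carries over unchanged:

```python
import math

def log_source_term(mw, moderate=(5.5, 7.0)):
    """Illustrative source-term scaling: log S is linear in Mw inside the moderate
    range and picks up a quadratic correction outside it, echoing the omega-square
    source-spectrum model (coefficients are placeholders)."""
    lo, hi = moderate
    base = 0.5 * mw
    if lo <= mw <= hi:
        return base
    edge = lo if mw < lo else hi
    return base - 0.03 * (mw - edge) ** 2   # saturation outside the linear range

def scale_spectrum(sa_small, mw_small, mw_large):
    """Scale a small-event response-spectral ordinate to a larger magnitude; the
    site amplification embedded in the small-event record is preserved by the ratio."""
    return sa_small * 10.0 ** (log_source_term(mw_large) - log_source_term(mw_small))
```

    Within the moderate range a unit magnitude step multiplies the ordinate by a fixed factor, while beyond it the quadratic term makes the growth saturate, which is the behaviour the two source-term forms are meant to capture.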

  20. Hydrodynamic Based Decision Making Framework for Impact Assessment of Extreme Storm Events on Coastal Communities

    Science.gov (United States)

    Nazari, R.; Miller, K.; Hurler, C.

    2015-12-01

    Coastal and inland flooding has been a problematic occurrence, particularly over the past century. Global warming has caused an 8-inch sea level rise since 1990, which has made the coastal flood zone wider, deeper, and more damaging. Riverine flooding is likewise extremely damaging to the country's infrastructure and economy, causing river banks to overflow and inundating low-lying areas. New Jersey and New York, the main study areas of this work, are at severe risk of flood hazard, sea level rise, land depletion, and economic loss. A decision-making framework is being built to help mitigate the environmental and economic dangers of storm surges, sea level rise, flash floods, and inland flooding. With rigorous research and innovative hydrologic modeling software, this tool can be built and used to foster resiliency in coastal communities. It will allow individuals living in a coastal community to understand the details of the climatic hazards in their area and the risks to their communities, and it will suggest the best solution for the problem each community faces. Atlantic City and New York City have been modeled and compared using potential storm events, and the outcomes have been analyzed. The tool offers all possible solutions for the type of flooding that occurs. Green infrastructure such as rain gardens, detention basins, and green roofs can be used as small-scale solutions. Larger-scale solutions such as removable flood barriers, concrete walls, and height-adjustable walls will also be displayed where they constitute the best solution. The results and benefits from the simulation and modeling techniques will allow coastal communities to choose the most appropriate method for building a long-lasting and sustainable resilience plan for the future.

  1. Event-Based Analysis of Rainfall-Runoff Response to Assess Wetland-Stream Interaction in the Prairie Pothole Region

    Science.gov (United States)

    Haque, M. A.; Ross, C.; Schmall, A.; Bansah, S.; Ali, G.

    2016-12-01

    Process-based understanding of wetland response to precipitation is needed to quantify the extent to which non-floodplain wetlands - such as Prairie potholes - generate flow and transmit that flow to nearby streams. While measuring wetland-stream (W-S) interaction is difficult, it is possible to infer it by examining hysteresis characteristics between wetland and stream stage during individual precipitation events. Hence, to evaluate W-S interaction, 10 intact and 10 altered/lost potholes were selected for study; they are located in Broughton's Creek Watershed (Manitoba, Canada) on both sides of a 5 km creek reach. Stilling wells (i.e., above ground wells) were deployed in the intact and altered wetlands to monitor surface water level fluctuations while water table wells were drilled below drainage ditches to a depth of 1 m to monitor shallow groundwater fluctuations. All stilling wells and water table wells were equipped with capacitance water level loggers to monitor fluctuations in surface water and shallow groundwater every 15 minutes. In 2013 (normal year) and 2014 (wet year), 15+ precipitation events were identified and scatter plots of wetland (x-axis) versus stream (y-axis) stage were built to identify W-S hysteretic dynamics. Initial data analysis reveals that in dry antecedent conditions, intact and altered wetlands show clockwise W-S relations, while drained wetlands show anticlockwise W-S hysteresis. However, in wetter antecedent conditions, all wetland types show anticlockwise hysteresis. Future analysis will target the identification of thresholds in antecedent moisture conditions that determine significant changes in event wetland response characteristics (e.g., the delay between the start of rainfall and stream stage, the maximum water level rise in each wetland during each event, the delay between the start of rainfall and peak wetland stage) as well as hysteresis properties (e.g., gradient and area of the hysteresis loop).
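The clockwise/anticlockwise classification described above can be sketched numerically: the signed (shoelace) area of the wetland-stage versus stream-stage loop gives its rotation sense. This is an illustrative sketch under idealized square loops, not the authors' code; the function name is an assumption.

```python
import numpy as np

def hysteresis_metrics(wetland_stage, stream_stage):
    """Signed area of the wetland-stage (x) vs stream-stage (y) loop.

    Shoelace formula on the closed polygon traced over one event:
    positive signed area -> anticlockwise loop, negative -> clockwise.
    """
    x = np.asarray(wetland_stage, dtype=float)
    y = np.asarray(stream_stage, dtype=float)
    # np.roll closes the loop by pairing the last point with the first
    area = 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)
    direction = "anticlockwise" if area > 0 else "clockwise"
    return area, direction
```

The loop area (one of the hysteresis properties mentioned above) falls out of the same computation as its absolute value.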

  2. Risk of affective disorders following prenatal exposure to severe life events: a Danish population-based cohort study.

    LENUS (Irish Health Repository)

    Khashan, Ali S

    2012-01-31

    OBJECTIVE: To examine the effect of prenatal exposure to severe life events on risk of affective disorders in the offspring. METHODS: In a cohort of 1.1 million Danish births from May 1978 until December 1997, mothers were considered exposed if one (or more) of their close relatives died or was diagnosed with serious illness up to 6 months before conception or during pregnancy. Offspring were followed up from their 10th birthday until their death, migration, onset of affective disorder or 31 December 2007; hospital admissions were identified by linkage to the Central Psychiatric Register. Log-linear Poisson regression was used for data analysis. RESULTS: The risk of affective disorders was increased in male offspring whose mothers were exposed to severe life events during the second trimester (adjusted RR 1.55 [95% CI 1.05-2.28]). There was an increased risk of male offspring affective disorders in relation to maternal exposure to death of a relative in the second trimester (adjusted RR 1.74 [95% CI 1.06-2.84]) or serious illness in a relative before pregnancy (adjusted RR 1.44 [95% CI 1.02-2.05]). There was no evidence for an association between prenatal exposure to severe life events and risk of female offspring affective disorders. CONCLUSIONS: Our population-based study suggests that prenatal maternal exposure to severe life events may increase the risk of affective disorders in male offspring. These findings are consistent with studies of populations exposed to famine and earthquake disasters which indicate that prenatal environment may influence the neurodevelopment of the unborn child.

  3. The Response of Different Audiences to Place-based Communication about the Role of Climate Change in Extreme Weather Events

    Science.gov (United States)

    Halperin, A.; Walton, P.

    2015-12-01

As the science of extreme event attribution grows, there is an increasing need to understand how the public responds to this type of climate change communication. Extreme event attribution has the unprecedented potential to locate the effects of climate change in the here and now, but there is little information about how different facets of the public might respond to these local framings of climate change. Drawing on theories of place attachment and psychological distance, this paper explores how people with different beliefs and values shift their willingness to mitigate and adapt to climate change in response to local or global communication of climate change impacts. Results will be presented from a recent survey of over 600 Californians who were each presented with one of three experimental conditions: 1) a local framing of the role of climate change in the California drought, 2) a global framing of climate change and droughts worldwide, or 3) a control condition with no text. Participants were categorized into groups based on their prior beliefs about climate change according to the Six Americas classification scheme (Leiserowitz et al., 2011). The results from the survey, in conjunction with qualitative results from follow-up interviews, provide insight into the importance of place in communicating climate change for people in each of the Six Americas. Additional results examine the role of gender and political affiliation in mediating responses to climate change communication. Despite research that advocates unequivocally for local framing of climate change, this study offers a more nuanced view of the circumstances under which extreme event attribution might be an effective tool for changing behaviors. These results could be useful for scientists who wish to gain a better understanding of how their event attribution research is perceived, or for educators who want to target their message to audiences where it could have the most impact.

  4. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    Science.gov (United States)

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

With increasingly large numbers of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diverse phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions, using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
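A minimal version of the covariation idea, ignoring the phylogenetic weighting and relational-database machinery the paper adds, is the mutual information between two alignment columns: positions that co-vary, as base-paired positions tend to do, score high. The function below is an illustrative sketch, not the authors' implementation.

```python
from collections import Counter
from math import log2

def column_mi(col_i, col_j):
    """Mutual information (bits) between two alignment columns.

    col_i, col_j: equal-length strings of nucleotides, one character
    per sequence in the alignment. MI = sum p(a,b) log2(p(a,b)/(p(a)p(b))).
    """
    n = len(col_i)
    pi = Counter(col_i)          # marginal counts, column i
    pj = Counter(col_j)          # marginal counts, column j
    pij = Counter(zip(col_i, col_j))  # joint counts
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi
```

Perfectly covarying columns (e.g. A↔U swapped for G↔C across sequences) reach the entropy of the column; independent columns score near zero.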

  5. VLSI implementation of a 2.8 Gevent/s packet based AER interface with routing and event sorting functionality

    Directory of Open Access Journals (Sweden)

Stefan Scholze

    2011-10-01

State-of-the-art large scale neuromorphic systems require sophisticated spike event communication between units of the neural network. We present a high-speed communication infrastructure for a waferscale neuromorphic system, based on application-specific neuromorphic communication ICs in an FPGA-maintained environment. The ICs implement configurable axonal delays, as required for certain types of dynamic processing or for emulating spike based learning among distant cortical areas. Measurements are presented which show the efficacy of these delays in influencing the behaviour of neuromorphic benchmarks. The specialized, dedicated AER communication in most current systems requires separate, low-bandwidth configuration channels. In contrast, the configuration of the waferscale neuromorphic system is also handled by the digital packet-based pulse channel, which transmits configuration data at the full bandwidth otherwise used for pulse transmission. The overall so-called pulse communication subgroup (ICs and FPGA) delivers a factor of 25-50 higher event transmission rate than other current neuromorphic communication infrastructures.
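Packet-based AER transports of this kind typically encode each spike as an address field plus a timestamp field packed into a fixed-width word. The sketch below illustrates the principle only; the field widths and layout are assumptions, not the format of the actual ICs.

```python
def pack_event(address, timestamp, addr_bits=14, time_bits=18):
    """Pack one spike event into a 32-bit word: [address | timestamp].

    Field widths are illustrative assumptions; real AER links choose
    them to match neuron count and timer resolution.
    """
    assert 0 <= address < (1 << addr_bits)
    assert 0 <= timestamp < (1 << time_bits)
    return (address << time_bits) | timestamp

def unpack_event(word, addr_bits=14, time_bits=18):
    """Recover (address, timestamp) from a packed event word."""
    return word >> time_bits, word & ((1 << time_bits) - 1)
```

A multi-Gevent/s link streams such words back-to-back; routing and event sorting then operate on the decoded address and timestamp fields.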

  6. Multi-Agent System based Event-Triggered Hybrid Controls for High-Security Hybrid Energy Generation Systems

    DEFF Research Database (Denmark)

    Dou, Chun-Xia; Yue, Dong; Guerrero, Josep M.

    2017-01-01

This paper proposes multi-agent system based event-triggered hybrid controls for guaranteeing the energy supply of a hybrid energy generation system with high security. First, a multi-agent system is constituted by an upper-level central coordinated control agent combined with several lower-level unit agents. Each lower-level unit agent is responsible for dealing with internal switching control and distributed dynamic regulation for its unit system. The upper-level agent implements coordinated switching control to guarantee the power supply of the overall system with high security. The internal...

  7. Low-Cost National Media-Based Surveillance System for Public Health Events, Bangladesh

    Science.gov (United States)

    Ao, Trong T.; Rahman, Mahmudur; Haque, Farhana; Chakraborty, Apurba; Hossain, M. Jahangir; Haider, Sabbir; Alamgir, A.S.M.; Sobel, Jeremy; Luby, Stephen P.

    2016-01-01

    We assessed a media-based public health surveillance system in Bangladesh during 2010–2011. The system is a highly effective, low-cost, locally appropriate, and sustainable outbreak detection tool that could be used in other low-income, resource-poor settings to meet the capacity for surveillance outlined in the International Health Regulations 2005. PMID:26981877

  8. Collaborative DDoS Defense using Flow-based Security Event Information

    NARCIS (Netherlands)

    Steinberger, Jessica; Kuhnert, Benjamin; Sperotto, Anna; Baier, Harald; Pras, Aiko

Over recent years, network-based attacks have evolved into one of the top concerns responsible for network infrastructure and service outages. To counteract such attacks, an approach is to move mitigation from the target network to the networks of Internet Service Providers (ISP). In addition, exchanging threat

  9. Multi Agent System Based Process Control in Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2013-01-01

...emergent states, but also those unusual control process variations when an unexpected situation is experienced. A hybrid simulation platform based on MATLAB/LabVIEW and a real time digital simulator (RTDS) is set up to simulate a voltage collapse case in the power system of Eastern Denmark and demonstrate...

  10. Complex event processing for content-based text, image, and video retrieval

    NARCIS (Netherlands)

    Bowman, E.K.; Broome, B.D.; Holland, V.M.; Summers-Stay, D.; Rao, R.M.; Duselis, J.; Howe, J.; Madahar, B.K.; Boury-Brisset, A.C.; Forrester, B.; Kwantes, P.; Burghouts, G.; Huis, J. van; Mulayim, A.Y.

    2016-01-01

    This report summarizes the findings of an exploratory team of the North Atlantic Treaty Organization (NATO) Information Systems Technology panel into Content-Based Analytics (CBA). The team carried out a technical review into the current status of theoretical and practical developments of methods,

  11. Wide Area Protection Scheme Preventing Cascading Events based on Improved Impedance relay

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2013-01-01

    Load flow transferring after an initial contingency is regarded as one of the main reasons of causing unexpected cascading trips. A multi agent system (MAS) based wide area protection strategy is proposed in this paper to predict the load flow transferring from the point of view of impedance relays...

  12. GIS-based soil liquefaction susceptibility map of Mumbai city for earthquake events

    Science.gov (United States)

    Mhaske, Sumedh Yamaji; Choudhury, Deepankar

    2010-03-01

The liquefaction of soil during a seismic event is one of the important topics in the field of Geotechnical Earthquake Engineering. Liquefaction generally occurs in loose, cohesionless, saturated soil when pore water pressure increases suddenly due to induced ground motion; the shear strength of the soil drops to zero, leading structures situated above to undergo large settlement or failure. Failures due to liquefaction-induced soil movement can spread continuously over areas of a few square km. This is therefore a problem involving spatial variation, and a Geographic Information System (GIS) is very useful for representing that variation and supporting decisions about the areas subject to liquefaction. In this paper, the GIS software GRAM++ is used to prepare a soil liquefaction susceptibility map for the entire city of Mumbai, India, marking three zones: critically liquefiable soil, moderately liquefiable soil and non-liquefiable soil. Extensive field borehole test data for groundwater depth, standard penetration test (SPT) blow counts, dry density, wet density, specific gravity, etc. have been collected from different parts of Mumbai. The simplified procedure of Youd et al. (2001) is used to calculate the factor of safety against soil liquefaction. Mumbai city and its suburban area have been formed by reclaiming land around seven islands since 1865, and reclamation is still progressing in areas such as Navi Mumbai and beyond Borivali to the Mira Road suburban area. The factors of safety against soil liquefaction were determined for earthquake moment magnitudes ranging from Mw = 5.0 to 7.5. It is found that areas such as Borivali, Malad, Dahisar and Bhandup may be prone to liquefaction over this magnitude range. The liquefaction susceptibility maps were created using GRAM++ by showing the areas where the factor of safety against soil liquefaction is less than one. Proposed liquefaction...
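The simplified procedure cited above computes a cyclic stress ratio (CSR) from peak ground acceleration and overburden stresses, then a factor of safety against a cyclic resistance ratio (CRR) derived from SPT correlations. A minimal sketch, assuming a shallow depth (z ≤ 9.15 m) so the linear depth-reduction coefficient applies, and treating the CRR as already given rather than computing it from blow counts:

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Seed-Idriss simplified CSR as summarized by Youd et al. (2001):

        CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd

    a_max_g: peak ground acceleration as a fraction of g
    sigma_v, sigma_v_eff: total and effective vertical stress (same units)
    depth_m: depth of the layer in metres (<= 9.15 m for this rd form)
    """
    rd = 1.0 - 0.00765 * depth_m  # depth reduction coefficient, z <= 9.15 m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def factor_of_safety(crr, csr):
    """FS < 1 flags a potentially liquefiable layer; CRR would come
    from SPT-based correlations in the full procedure."""
    return crr / csr
```

Mapping FS over the borehole grid and shading cells with FS < 1 reproduces the susceptibility-zoning idea described in the abstract.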

  13. Evidence Base Update for Psychosocial Treatments for Children and Adolescents Exposed to Traumatic Events

    Science.gov (United States)

    Dorsey, Shannon; McLaughlin, Katie A.; Kerns, Suzanne E. U.; Harrison, Julie P.; Lambert, Hilary K.; Briggs, Ernestine C.; Cox, Julia Revillion; Amaya-Jackson, Lisa

    2016-01-01

Child and adolescent trauma exposure is prevalent, with trauma exposure-related symptoms, including posttraumatic stress, depressive, and anxiety symptoms, often causing substantial impairment. This article updates the evidence base on psychosocial treatments for child and adolescent trauma exposure completed for this journal by Silverman et al. (2008). For this review, we focus on 37 studies conducted during the seven years since the last review. Treatments are grouped by overall treatment family (e.g., cognitive behavioral therapy), treatment modality (e.g., individual vs. group), and treatment participants (e.g., child only vs. child and parent). All studies were evaluated for methodological rigor according to Journal of Clinical Child & Adolescent Psychology evidence-based treatment evaluation criteria (Southam-Gerow & Prinstein, 2014), with cumulative designations for level of support for each treatment family. Individual CBT with parent involvement, individual CBT, and group CBT were deemed well-established; group CBT with parent involvement and eye movement desensitization and reprocessing (EMDR) were deemed probably efficacious; individual integrated therapy for complex trauma and group mind–body skills were deemed possibly efficacious; individual client-centered play therapy, individual mind–body skills, and individual psychoanalysis were deemed experimental; and group creative expressive + CBT was deemed of questionable efficacy. Advances in the evidence base, with comparisons to the state of the science at the time of the Silverman et al. (2008) review, are discussed. Finally, we present dissemination and implementation challenges and areas for future research. PMID:27759442

  14. Acute cardiovascular events and all-cause mortality in patients with hyperthyroidism: a population-based cohort study.

    Science.gov (United States)

    Dekkers, Olaf M; Horváth-Puhó, Erzsébet; Cannegieter, Suzanne C; Vandenbroucke, Jan P; Sørensen, Henrik Toft; Jørgensen, Jens Otto L

    2017-01-01

Several studies have shown an increased risk for cardiovascular disease (CVD) in hyperthyroidism, but most studies have been too small to address the effect of hyperthyroidism on individual cardiovascular endpoints. Our main aim was to assess the association between hyperthyroidism and acute cardiovascular events and mortality. This was a nationwide population-based cohort study. Data were obtained from the Danish Civil Registration System and the Danish National Patient Registry, which covers all Danish hospitals. We compared the rates of all-cause mortality as well as venous thromboembolism (VTE), acute myocardial infarction (AMI), ischemic and non-ischemic stroke, arterial embolism, atrial fibrillation (AF) and percutaneous coronary intervention (PCI) in the two cohorts. Hazard ratios (HR) with 95% confidence intervals (95% CI) were estimated. The study included 85 856 hyperthyroid patients and 847 057 matched population-based controls. Mean follow-up time was 9.2 years. The HR for mortality was highest in the first 3 months after diagnosis of hyperthyroidism: 4.62, 95% CI: 4.40-4.85, and remained elevated during long-term follow-up (>3 years) (HR: 1.35, 95% CI: 1.33-1.37). The risk for all examined cardiovascular events was increased, with the highest risk in the first 3 months after hyperthyroidism diagnosis. The 3-month post-diagnosis risk was highest for atrial fibrillation (HR: 7.32, 95% CI: 6.58-8.14) and arterial embolism (HR: 6.08, 95% CI: 4.30-8.61), but the risks of VTE, AMI, ischemic and non-ischemic stroke and PCI were also increased 2- to 3-fold. We found an increased risk for all-cause mortality and acute cardiovascular events in patients with hyperthyroidism. © 2017 European Society of Endocrinology.

  15. Extreme events in total ozone over the Northern mid-latitudes: an analysis based on long-term data sets from five European ground-based stations

    Energy Technology Data Exchange (ETDEWEB)

    Rieder, Harald E. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland)), e-mail: hr2302@columbia.edu; Jancso, Leonhardt M. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Inst. for Meteorology and Geophysics, Univ. of Innsbruck, Innsbruck (Austria)); Di Rocco, Stefania (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Dept. of Geography, Univ. of Zurich, Zurich (Switzerland)) (and others)

    2011-11-15

We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear 'fingerprints' of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODSs, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
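A hedged sketch of the counting step behind such an analysis: before any extreme value distribution is fitted, extreme-high (EHO) and extreme-low (ELO) observations can be flagged against empirical quantile thresholds. The quantile levels below are assumptions for illustration, not the paper's fitted thresholds.

```python
import numpy as np

def flag_ozone_extremes(series, q_low=0.05, q_high=0.95):
    """Flag extreme-low (ELO) and extreme-high (EHO) values.

    series: 1-D array of total ozone observations.
    Returns two boolean arrays marking values below the q_low
    quantile and above the q_high quantile, respectively.
    """
    lo, hi = np.quantile(series, [q_low, q_high])
    s = np.asarray(series)
    return s < lo, s > hi
```

Counting flags per decade then reveals frequency shifts of the kind the abstract reports (fewer EHOs, more ELOs).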

  16. Stream/Bounce Event Perception Reveals a Temporal Limit of Motion Correspondence Based on Surface Feature over Space and Time

    Directory of Open Access Journals (Sweden)

    Yousuke Kawachi

    2011-06-01

We examined how stream/bounce event perception is affected by motion correspondence based on the surface features of moving objects passing behind an occlusion. In the stream/bounce display, two identical objects moving across each other in a two-dimensional display can be perceived as either streaming through or bouncing off each other at coincidence. Here, surface features such as colour (Experiments 1 and 2) or luminance (Experiment 3) were switched between the two objects at coincidence. The moment of coincidence was invisible to observers due to an occluder. Additionally, the presentation of the moving objects was manipulated in duration after the feature switch at coincidence. The results revealed that a postcoincidence duration of approximately 200 ms was required for the visual system to stabilize judgments of stream/bounce events by determining motion correspondence between the objects across the occlusion on the basis of the surface feature. The critical duration was similar across motion speeds of objects and types of surface features. Moreover, controls (Experiments 4a–4c) showed that cognitive bias based on feature (colour/luminance) congruency across the occlusion could not fully account for the effects of surface features on the stream/bounce judgments. We discuss the roles of motion correspondence, visual feature processing, and attentive tracking in the stream/bounce judgments.

  17. A High-Efficiency Uneven Cluster Deployment Algorithm Based on Network Layered for Event Coverage in UWSNs

    Directory of Open Access Journals (Sweden)

    Shanen Yu

    2016-12-01

Most existing deployment algorithms for event coverage in underwater wireless sensor networks (UWSNs) do not consider that network communication has non-uniform characteristics in three-dimensional underwater environments. Such deployment algorithms ignore that the nodes are distributed at different depths and have different probabilities of data acquisition, thereby leading to imbalances in the overall network energy consumption, decreasing the network performance, and resulting in poor, unreliable network operation in later stages. Therefore, in this study, we proposed an uneven cluster deployment algorithm based on network layering for event coverage. First, according to the energy consumption requirement of the communication load at different depths of the underwater network, we obtained the expected number of deployment nodes and the distribution density of each network layer after theoretical analysis and deduction. Afterward, the network is divided into multiple layers based on uneven clusters, and the heterogeneous communication radius of nodes can improve the network connectivity rate. The recovery strategy is used to balance the energy consumption of nodes in the cluster and can efficiently reconstruct the network topology, which ensures that the network has a high coverage and connectivity rate over a long period of data acquisition. Simulation results show that the proposed algorithm improves network reliability and prolongs network lifetime by significantly reducing the blind movement of network nodes while maintaining a high network coverage and connectivity rate.

  18. Distributed Event-Based Set-Membership Filtering for a Class of Nonlinear Systems With Sensor Saturations Over Sensor Networks.

    Science.gov (United States)

    Ma, Lifeng; Wang, Zidong; Lam, Hak-Keung; Kyriakoulis, Nikos

    2017-11-01

In this paper, the distributed set-membership filtering problem is investigated for a class of discrete time-varying systems with an event-based communication mechanism over sensor networks. The system under consideration is subject to sector-bounded nonlinearity, unknown but bounded noises, and sensor saturations. Each intelligent sensing node transmits data to its neighbors only when a certain triggering condition is violated. By means of a set of recursive matrix inequalities, sufficient conditions are derived for the existence of the desired distributed event-based filter, which is capable of confining the system state to certain ellipsoidal regions centered at the estimates. Within the established theoretical framework, two additional optimization problems are formulated: one is to seek the minimal ellipsoids (in the sense of matrix trace) for the best filtering performance, and the other is to maximize the triggering threshold so as to reduce the triggering frequency while retaining satisfactory filtering performance. A numerically attractive chaos algorithm is employed to solve the optimization problems. Finally, an illustrative example is presented to demonstrate the effectiveness and applicability of the proposed algorithm.

  19. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    Science.gov (United States)

Medina, Neiler; Sanchez, Arlex; Nikolic, Igor; Vojinovic, Zoran

    2016-04-01

This research explores the use of Agent Based Models (ABM) and their potential for testing large-scale evacuation strategies in coastal cities at risk from flood events due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing human exposure to the hazard. The first part of the paper covers the theory used to build the models, namely complex adaptive systems (CAS) and the principles and uses of ABM in this field, and outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scale. The second part focuses on the central theory used to build the ABM, specifically the psychological and behavioural model and the framework used in this research, the PECS reference model; the last part of this section covers the main attributes of human beings used to describe the agents. The third part shows the methodology used to build and implement the ABM using Repast Simphony, an open source agent-based modelling and simulation platform. Preliminary results from a first implementation in a region of Sint Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.

  20. Mapping of crop calendar events by object-based analysis of MODIS and ASTER images

    Directory of Open Access Journals (Sweden)

    A.I. De Castro

    2014-06-01

A method is shown to generate crop calendar and phenology-related maps at the parcel level for four major irrigated crops (rice, maize, sunflower and tomato). The method combines images from the ASTER and MODIS sensors in an object-based image analysis framework and tests three different fitting curves using the TIMESAT software. The average accuracy of estimated calendar dates was 85%, ranging from 92% for emergence and harvest dates in rice to 69% for harvest date in tomato.

  1. A microprocessor-based single board computer for high energy physics event pattern recognition

    International Nuclear Information System (INIS)

    Bernstein, H.; Gould, J.J.; Imossi, R.; Kopp, J.K.; Love, W.A.; Ozaki, S.; Platner, E.D.; Kramer, M.A.

    1981-01-01

A single board MC 68000-based computer has been assembled and benchmarked against the CDC 7600 running portions of the pattern recognition code used at the MPS. This computer has a floating-point coprocessor to achieve throughputs equivalent to several percent that of the 7600. A major part of this work was the construction of a FORTRAN compiler including assembler, linker and library. The intention of this work is to assemble a large number of these single board computers in a parallel FASTBUS environment to act as an on-line and off-line filter for the raw data from MPS II and ISABELLE experiments. (orig.)

  2. A Discrete-Events Simulation Approach for Evaluation of Service-Based Applications

    OpenAIRE

    Driss , Maha; Jamoussi , Yassine; Jézéquel , Jean-Marc; Ben Ghézala , Henda Hajjami

    2008-01-01

One of the promises of the service-oriented architecture (SOA) is to build complex added-value services that enhance and extend existing ones. Service-based applications (SBAs) are asked not only to perform required functionalities, but also to deliver an expected level of Quality of Service (QoS). Dealing with QoS management of such distributed applications, which are executed in dynamic environments, raises the need to consider context characteristics. This paper prop...

  4. Time response of protection in event of vacuum failure based on Nude ionization gauge controller

    Science.gov (United States)

    Gao, Hui; Wang, Qiuping; Wang, Weibin; Wu, Qinglin; Chen, Wentong; Sheng, Liusi; Zhang, Yunwu

    2001-10-01

This article describes the design and application of a fast-response vacuum protection sensor module based on a Nude ionization gauge and a homemade controller named GH07X. A simulative test indicated that the controller's response time was less than 200 μs when 1 atm of air rushed into the vacuum system through a pulsed valve with a 0.8 mm orifice nozzle and the emission current of the Nude gauge was 4 mA. The experimental results showed that the response time mainly depended on the gas density as well as the electron emission current of the Nude gauge filament. Compared with vacuum protection sensors based on a sputter ion pump and a cold-cathode gauge, GH07X is faster and more reliable. Besides, GH07X can be used as an ultrahigh-vacuum slow valve interlock controller with a response time of 100 ms, which is faster than other gauge controllers. The widely used field-bus interface CAN and the common serial interfaces RS232/RS485 are embedded in the GH07X controller system.

  5. Event-based nonpoint source pollution prediction in a scarce data catchment

    Science.gov (United States)

    Chen, Lei; Sun, Cheng; Wang, Guobo; Xie, Hui; Shen, Zhenyao

    2017-09-01

    Quantifying the rainfall-runoff-pollutant (R-R-P) process is key to regulating non-point source (NPS) pollution; however, the impacts of scarce measured data on R-R-P simulations have not yet been reported. In this study, we conducted a comprehensive study of scarce data that addressed both rainfall-runoff and runoff-pollutant processes, whereby the impacts of data scarcity on two commonly used methods, the Unit Hydrograph (UH) and the Loads Estimator (LOADEST), were quantified. A case study was performed in a typical small catchment of the Three Gorges Reservoir Region (TGRR) of China. Based on our results, the classification of rainfall patterns should be carried out first when analyzing modeling results. Compared to the missing rate and the missing location of the data, missing key information has a greater impact on the simulated flow and NPS loads. When the scarcity rate exceeds a certain threshold (20% in this study), the level of measured-data scarcity has clear impacts on the model's accuracy. As the model of total nitrogen (TN) always performs better under different data scarcity conditions, researchers are encouraged to pay more attention to the continuous monitoring of total phosphorus (TP) for better NPS-TP predictions. The results of this study serve as baseline information for hydrologic forecasting and for the further control of NPS pollutants.

  6. Evaluating TCMS Train-to-Ground communication performances based on the LTE technology and discrete event simulations

    DEFF Research Database (Denmark)

    Bouaziz, Maha; Yan, Ying; Kassab, Mohamed

    2018-01-01

    …LTE (Long Term Evolution) network as an alternative communication technology, instead of GSM-R (Global System for Mobile communications-Railway), because of some capacity and capability limits. In a first step, a pure simulation is used to evaluate the network load for a high-speed scenario, when the LTE network is shared between the train and different passengers. The simulation is based on the discrete-events network simulator Riverbed Modeler. A second step focusses on a co-simulation testbed, to evaluate performances with real traffic based on Hardware-In-The-Loop and OpenAirInterface modules. Preliminary simulation and co-simulation results show that LTE provides good performance for the TCMS traffic exchange in terms of packet delay and data integrity…

  7. A calculation of baryon diffusion constant in hot and dense hadronic matter based on an event generator URASiMA

    International Nuclear Information System (INIS)

    Sasaki, N.; Miyamura, O.; Nonaka, C.; Muroya, S.

    2000-01-01

    We evaluate thermodynamical quantities and transport coefficients of dense and hot hadronic matter based on an event generator, URASiMA (Ultra-Relativistic AA collision Simulator based on Multiple Scattering Algorithm). Statistical ensembles in equilibrium with fixed temperature and chemical potential are generated by imposing periodic boundary conditions on the URASiMA simulation, in which energy density and baryon number density are conserved. Achievement of thermal and chemical equilibrium is confirmed by the common value of the slope parameter in the energy distributions and by the saturation of the numbers of contained particles, respectively. Using the generated ensembles, we investigate the temperature dependence and the chemical potential dependence of the baryon diffusion constant of dense and hot hadronic matter. (author)
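
    The abstract does not spell out the estimator, but a diffusion constant is conventionally extracted from ensemble trajectories via the Einstein relation D = ⟨Δx²⟩ / (2t) per spatial dimension. A minimal sketch of that measurement, with a toy 1-D random walk standing in for the URASiMA baryon trajectories (all names and parameters here are illustrative, not from the paper):

```python
import random

def estimate_diffusion(n_walkers=2000, n_steps=400, step=1.0, seed=1):
    """Estimate a 1-D diffusion constant from an ensemble of trajectories
    using the Einstein relation D = <x(t)^2> / (2 t).  Time is measured
    in steps; the true value for this walk is step**2 / 2 = 0.5."""
    rng = random.Random(seed)
    disp2 = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += step if rng.random() < 0.5 else -step
        disp2 += x * x
    msd = disp2 / n_walkers          # mean squared displacement <x(t)^2>
    return msd / (2.0 * n_steps)     # Einstein relation in one dimension

D = estimate_diffusion()
```

In the paper's setting the trajectories would come from the generated equilibrium ensembles, and the temperature and chemical-potential dependence of D would be mapped out by repeating the measurement per ensemble.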

  8. RFID-based information visibility for hospital operations: exploring its positive effects using discrete event simulation.

    Science.gov (United States)

    Asamoah, Daniel A; Sharda, Ramesh; Rude, Howard N; Doran, Derek

    2016-10-12

    Long queues and wait times often occur at hospitals and affect the smooth delivery of health services. To improve hospital operations, prior studies have developed scheduling techniques to minimize patient wait times. However, these studies fall short of demonstrating how such techniques respond to the real-time information needs of hospitals and efficiently manage wait times. This article presents a multi-method study on the positive impact of providing real-time scheduling information to patients using RFID technology. Using a simulation methodology, we present a generic scenario, which can be mapped to real-life situations, where patients can select the order of laboratory services. The study shows that the information visibility offered by RFID technology results in decreased wait times and improved resource utilization. We also discuss the applicability of the results based on field interviews granted by hospital clinicians and administrators on the perceived barriers and benefits of an RFID system.
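
    The study's methodology rests on discrete event simulation, whose core is a clock plus a time-ordered event queue. A deliberately minimal sketch of such a core, applied to a toy single-server lab queue (the class and the patient data are hypothetical, not the authors' model):

```python
import heapq

class DES:
    """Minimal discrete-event simulation core: a time-ordered event queue."""
    def __init__(self):
        self._queue = []   # heap of (time, seq, callback)
        self._seq = 0      # tie-breaker so callbacks are never compared
        self.now = 0.0

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, cb = heapq.heappop(self._queue)
            cb()

# Toy lab: one server; wait time = service start minus arrival time.
log = []
sim = DES()
free_at = [0.0]                    # time at which the lab becomes free

def arrival(pid, service):
    start = max(sim.now, free_at[0])
    log.append((pid, start - sim.now))   # this patient's wait time
    free_at[0] = start + service

for i, (t, s) in enumerate([(0, 5), (2, 3), (4, 4)]):
    sim.schedule(t, (lambda p=i, sv=s: arrival(p, sv)))

sim.run()
```

Real-time information visibility, as studied in the article, would be modeled by letting the arrival callback inspect queue state (e.g. `free_at` for several labs) and pick the shortest queue.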

  9. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Science.gov (United States)

    Ajayakumar, J.; Shook, E.; Turner, V. K.

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by variations of population density from place to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and the "Hurricane Matthew" in October 2016. Analysis is conducted on geo-tagged social media data from Twitter and on warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering is necessary. These results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources, and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with social media data and will be useful…

  10. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data.

    Science.gov (United States)

    Dexter, Franklin; Wachtel, Ruth E; Epstein, Richard H

    2011-01-07

    No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. 
    Our technical advance is the…

  11. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    Directory of Open Access Journals (Sweden)

    Epstein Richard H

    2011-01-01

    Full Text Available Abstract Background No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of…

  12. Molecular aspects in inflammatory events of temporomandibular joint: Microarray-based identification of mediators

    Directory of Open Access Journals (Sweden)

    Naomi Ogura

    2015-02-01

    Full Text Available Synovial inflammation (synovitis) frequently accompanies intracapsular pathologic conditions of the temporomandibular joint (TMJ), such as internal derangement (ID) and/or osteoarthritis (OA), and is suggested to be associated with symptom severity. To identify the putative factors associated with synovitis, we investigated interleukin (IL)-1β- and/or tumor necrosis factor (TNF)-α-responsive genes of fibroblast-like synoviocytes (FLS) from patients with ID and/or OA of the TMJ using microarray analysis. In this review, we first summarize the FLS of the TMJ and the signaling pathways of IL-1β and TNF-α. Next, we show the genes up-regulated in FLS after stimulation with IL-1β or TNF-α, and summarize the gene functions based on recent studies. Among the top 10 up-regulated factors, molecules such as IL-6 and cyclooxygenase-2 have been well characterized and investigated in the inflammatory responses and tissue destruction associated with joint diseases such as RA and OA, but the roles of some molecules remain unclear. The FLS reaction can lead to the synthesis and release of a wide variety of inflammatory mediators. Some of these mediators are detected in joint tissues and synovial fluids under intracapsular pathologic conditions, and may represent potential targets for therapeutic interventions in ID and/or OA of the TMJ.

  13. Characterization of aerosol pollution events in France using ground-based and POLDER-2 satellite data

    Directory of Open Access Journals (Sweden)

    M. Kacenelenbogen

    2006-01-01

    Full Text Available We analyze the relationship between daily fine particle mass concentration (PM2.5) and columnar aerosol optical thickness derived from the Polarization and Directionality of Earth's Reflectances (POLDER) satellite sensor. The study is focused over France during the POLDER-2 lifetime between April and October 2003. We first compared the POLDER-derived aerosol optical thickness (AOT) with the integrated volume size distribution derived from ground-based Sun photometer observations. The good correlation (R=0.72) with the sub-micron volume fraction indicates that POLDER-derived AOT is sensitive to the fine aerosol mass concentration. Considering 1974 match-up data points over 28 fine particle monitoring sites, the POLDER-2 derived AOT is fairly well correlated with collocated PM2.5 measurements, with a correlation coefficient of 0.55. The correlation coefficient reaches a maximum of 0.80 for particular sites. We have analyzed the probability of finding the appropriate air quality category (AQC), as defined by the U.S. Environmental Protection Agency (EPA), from POLDER-2 AOT measurements. The probability can be up to 88.8% (±3.7%) for the "Good" AQC and 89.1% (±3.6%) for the "Moderate" AQC.

  14. Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform

    Directory of Open Access Journals (Sweden)

    Massimiliano Giulioni

    2016-02-01

    Full Text Available We demonstrate robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission and follows the basic theoretical principles presented in Benosman et al. (2014): the extraction of the optical flow is based on the time lag in the activation of nearby retinal neurons. The same basic principle is embedded in the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina. Mimicking those cells, our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. We built a 3x3 test grid of independent detectors, each observing a different portion of the scene, so that our final output is a spike train encoding a 3x3 optical flow vector field. In this work we focus on the architectural aspects, and we demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene.

  15. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Paul-Eric DOSSOU

    2013-07-01

    Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) in which an agent approach is used to solve planning and management problems. Simulation is now widely used and is practically the only tool that enables verification of complex systems. Many companies face the problem of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining the DES and ABS approaches are described. The authors give some recommendations on the selection process.

  16. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Pawel PAWLEWSKI

    2012-07-01

    Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) in which an agent approach is used to solve planning and management problems. Simulation is now widely used and is practically the only tool that enables verification of complex systems. Many companies face the problem of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining the DES and ABS approaches are described. The authors give some recommendations on the selection process.

  17. Video event classification and image segmentation based on noncausal multidimensional hidden Markov models.

    Science.gov (United States)

    Ma, Xiang; Schonfeld, Dan; Khokhar, Ashfaq A

    2009-06-01

    In this paper, we propose a novel solution to an arbitrary noncausal, multidimensional hidden Markov model (HMM) for image and video classification. First, we show that the noncausal model can be solved by splitting it into multiple causal HMMs and simultaneously solving each causal HMM using a fully synchronous distributed computing framework, therefore referred to as distributed HMMs. Next we present an approximate solution to the multiple causal HMMs that is based on an alternating updating scheme and assumes a realistic sequential computing framework. The parameters of the distributed causal HMMs are estimated by extending the classical 1-D training and classification algorithms to multiple dimensions. The proposed extension to arbitrary causal, multidimensional HMMs allows state transitions that are dependent on all causal neighbors. We, thus, extend three fundamental algorithms to multidimensional causal systems, i.e., 1) expectation-maximization (EM), 2) general forward-backward (GFB), and 3) Viterbi algorithms. In the simulations, we choose to limit ourselves to a noncausal 2-D model whose noncausality is along a single dimension, in order to significantly reduce the computational complexity. Simulation results demonstrate the superior performance, higher accuracy rate, and applicability of the proposed noncausal HMM framework to image and video classification.
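
    The multidimensional training and classification algorithms in the paper generalize the classical 1-D forward recursion, which is worth stating concretely. A sketch of that 1-D forward pass on a toy two-state model (the distributed multidimensional extension itself is not reproduced here):

```python
def forward(obs, pi, A, B):
    """Classical 1-D HMM forward algorithm: P(observation sequence) by
    dynamic programming.  pi[i] = initial state probability,
    A[i][j] = transition probability, B[i][o] = emission probability."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]     # initialization
    for o in obs[1:]:                                    # induction step
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)                                    # termination

# Toy two-state, two-symbol model (illustrative numbers only)
pi = [0.6, 0.4]
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.9, 0.1], [0.2, 0.8]]
p = forward([0, 1, 0], pi, A, B)   # likelihood of observing 0, 1, 0
```

The paper's GFB and EM extensions replace the single left-to-right induction above with passes over causal neighborhoods in each dimension.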

  18. A browser-based event display for the CMS Experiment at the LHC using WebGL

    Science.gov (United States)

    McCauley, T.

    2017-10-01

    Modern web browsers are powerful and sophisticated applications that support an ever-wider range of uses. One such use is rendering high-quality, GPU-accelerated, interactive 2D and 3D graphics in an HTML canvas. This can be done via WebGL, a JavaScript API based on OpenGL ES. Applications delivered via the browser have several distinct benefits for the developer and user. For example, they can be implemented using well-known and well-developed technologies, while distribution and use via a browser allows for rapid prototyping and deployment and ease of installation. In addition, delivery of applications via the browser allows for easy use on mobile, touch-enabled devices such as phones and tablets. iSpy WebGL is an application for visualization of events detected and reconstructed by the CMS Experiment at the Large Hadron Collider at CERN. The first event display developed for an LHC experiment to use WebGL, iSpy WebGL is a client-side application written in JavaScript, HTML, and CSS that uses the WebGL library three.js. iSpy WebGL is used for monitoring of CMS detector performance, for production of images and animations of CMS collision events for the public, as a virtual reality application using Google Cardboard, and as a tool available for public education and outreach, such as in the CERN Open Data Portal and the CMS masterclasses. We describe here its design, development, and usage as well as future plans.

  19. Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform.

    Science.gov (United States)

    Giulioni, Massimiliano; Lagorce, Xavier; Galluppi, Francesco; Benosman, Ryad B

    2016-01-01

    Estimating the speed and direction of moving objects is a crucial ability for agents behaving in a dynamic world. Biological organisms perform this task by means of the neural connections originating from their retinal ganglion cells. In artificial systems the optic flow is usually extracted by comparing the activity of two or more frames captured with a vision sensor. Designing artificial motion flow detectors which are as fast, robust, and efficient as the ones found in biological systems is, however, a challenging task. Inspired by the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina, we introduce an architecture for robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission, and the extraction of the optical flow is based on the time lag in the activation of nearby retinal neurons. Mimicking ganglion cells, our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. Here we describe the architectural aspects, discuss its latency, scalability, and robustness properties, and demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene. This work shows how the precise time of spike emission used as a computational basis, biological inspiration, and neuromorphic systems can be used together for solving specific tasks.
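
    The time-lag principle behind the Barlow-Levick scheme can be stated in a few lines of code: with two detectors a known distance apart, the sign and magnitude of the spike-time difference give the direction and speed of the passing edge. A minimal sketch (the function name and numbers are illustrative, not taken from the chip):

```python
def lag_motion_estimate(t_a, t_b, spacing):
    """Barlow-Levick-style motion estimate from the spike times of two
    neighboring contrast detectors A and B separated by `spacing`.
    A stimulus moving from A to B fires A first, so the signed speed
    along the A->B axis is spacing / (t_b - t_a)."""
    dt = t_b - t_a
    if dt == 0:
        return None          # no measurable lag -> no flow estimate
    return spacing / dt

# An edge crosses detector A at 10 ms and B at 14 ms; detectors 2 mm apart.
v_fwd = lag_motion_estimate(0.010, 0.014, 0.002)   # motion A -> B
v_rev = lag_motion_estimate(0.014, 0.010, 0.002)   # motion B -> A
```

A 3x3 grid of such pairwise estimates, one per scene patch, is what the multi-chip system encodes in its output spike trains.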

  20. Comparison of two recent storm surge events based on results of field surveys

    Science.gov (United States)

    Nakamura, Ryota; Shibayama, Tomoya; Mikami, Takahito; Esteban, Miguel; Takagi, Hiroshi; Maell, Martin; Iwamoto, Takumu

    2017-10-01

    This paper compares two different types of storm surge disaster based on field surveys. Two cases are taken as examples: a severe storm surge flood with a height of over 5 m due to Typhoon Haiyan (2013) in the Philippines, and inundation by a storm surge with a maximum height of 2.8 m around Nemuro city in Hokkaido, Japan, caused by an extra-tropical cyclone. In the case of Typhoon Haiyan, buildings located in the coastal region were severely affected by a rapid increase in the ocean surface. Non-engineered buildings were partially or completely destroyed, with their debris transported into an inner bay region. Several previous reports indicated two unique features: a bore-like wave and remarkably high-speed currents. These characteristics of the storm surge may have contributed to the widespread destruction of buildings around the affected region. Furthermore, in the region where the surge height was nearly 3 m, wooden houses were completely or partially destroyed. In Nemuro city, on the other hand, the degree of harm to people and facilities caused by the storm surge was minor. There were almost no partially destroyed residential houses, even though the height of the storm surge reached nearly 2.8 m. Observations at the tide station in Nemuro indicated that this was a usual type of storm surge, showing a gradual increase of sea level over several hours without the unique characteristics seen for Typhoon Haiyan. As a result, not only the height of the storm surge but also the robustness of the buildings and the characteristics of the surge, such as bore-like waves and strong currents, determined the extent of devastation in the coastal regions.

  1. Experiences of citizen-based reporting of rainfall events using lab-generated videos

    Science.gov (United States)

    Alfonso, Leonardo; Chacon, Juan

    2016-04-01

    Hydrologic studies rely on the availability of good-quality precipitation estimates. However, in remote areas of the world and particularly in developing countries, ground-based measurement networks are either sparse or nonexistent. This creates difficulties in the estimation of precipitation, which limits the development of hydrologic forecasting and early warning systems for these regions. The EC-FP7 WeSenseIt project aims at exploring the involvement of citizens in the observation of the water cycle with innovative sensor technologies, including mobile telephony. In particular, the project explores the use of a smartphone application to facilitate the reporting of water-related situations. Apart from the challenge of using such information for scientific purposes, citizen engagement is one of the most important issues to address. To this end, effortless methods for reporting need to be developed in order to involve as many people as possible in these experiments. As a potential solution to overcome these drawbacks, lab-controlled rainfall videos have been produced to help map the extent and distribution of rainfall fields with minimum effort [1]. In addition, the quality of the collected rainfall information has also been studied [2] by means of different experiments with students. The present research shows the latest results of the application of this method and evaluates the experiences in several cases. [1] Alfonso, L., J. Chacón, and G. Peña-Castellanos (2015), Allowing Citizens to Effortlessly Become Rainfall Sensors, in 36th IAHR World Congress, The Hague, the Netherlands. [2] Cortes-Arevalo, J., J. Chacón, L. Alfonso, and T. Bogaard (2015), Evaluating data quality collected by using a video rating scale to estimate and report rainfall intensity, in 36th IAHR World Congress, The Hague, the Netherlands.

  2. Evaluation of event-based algorithms for optical flow with ground-truth from inertial measurement sensor

    Directory of Open Access Journals (Sweden)

    Bodo Rückauer

    2016-04-01

    Full Text Available In this study we compare nine optical flow algorithms that locally measure the flow normal to edges according to accuracy and computation cost. In contrast to conventional, frame-based motion flow algorithms, our open-source implementations compute optical flow based on address-events from a neuromorphic Dynamic Vision Sensor (DVS. For this benchmarking we created a dataset of two synthesized and three real samples recorded from a 240x180 pixel Dynamic and Active-pixel Vision Sensor (DAVIS. This dataset contains events from the DVS as well as conventional frames to support testing state-of-the-art frame-based methods. We introduce a new source for the ground truth: In the special case that the perceived motion stems solely from a rotation of the vision sensor around its three camera axes, the true optical flow can be estimated using gyro data from the inertial measurement unit integrated with the DAVIS camera. This provides a ground-truth to which we can compare algorithms that measure optical flow by means of motion cues. An analysis of error sources led to the use of a refractory period, more accurate numerical derivatives and a Savitzky-Golay filter to achieve significant improvements in accuracy. Our pure Java implementations of two recently published algorithms reduce computational cost by up to 29% compared to the original implementations. Two of the algorithms introduced in this paper further speed up processing by a factor of 10 compared with the original implementations, at equal or better accuracy. On a desktop PC, they run in real-time on dense natural input recorded by a DAVIS camera.

  3. The costs associated with adverse event procedures for an international HIV clinical trial determined by activity-based costing.

    Science.gov (United States)

    Chou, Victoria B; Omer, Saad B; Hussain, Hamidah; Mugasha, Christine; Musisi, Maria; Mmiro, Francis; Musoke, Philippa; Jackson, J Brooks; Guay, Laura A

    2007-12-01

    To determine costs for adverse event (AE) procedures for a large HIV perinatal trial by analyzing actual resource consumption using activity-based costing (ABC) in an international research setting. The AE system for an ongoing clinical trial in Uganda was evaluated using ABC techniques to determine costs from the perspective of the study. Resources were organized into cost categories (e.g., personnel, patient care expenses, laboratory testing, equipment). Cost drivers were quantified, and the unit cost per AE was calculated. A subset of time-and-motion studies was performed prospectively to observe the clinic personnel time required for AE identification. In 18 months, there were 9028 AEs, with 970 (11%) reported as serious adverse events. The unit cost per AE was $101.97. Overall, AE-related costs represented 32% ($920,581 of $2,834,692) of all study expenses. Personnel ($79.30) and patient care ($11.96) contributed the greatest proportion of component costs. Reported AEs were predominantly nonserious (mild or moderate severity) and unrelated to delivery of the study drug(s). Intensive identification and management of AEs to conduct clinical trials ethically and protect human subjects require the expenditure of substantial human and financial resources. Better understanding of these resource requirements should improve the planning and funding of international HIV-related clinical trials.
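
    The headline figures in the abstract are consistent with a simple activity-based roll-up, which can be checked directly from the quoted totals:

```python
# Figures quoted in the abstract
total_ae_cost = 920_581      # USD attributed to AE procedures over 18 months
total_study   = 2_834_692    # USD, all study expenses
n_events      = 9_028        # adverse events reported

unit_cost = total_ae_cost / n_events     # cost per adverse event
ae_share  = total_ae_cost / total_study  # AE fraction of the study budget

print(round(unit_cost, 2), round(ae_share * 100))  # 101.97 and 32 (%)
```

Both derived values match the abstract's reported $101.97 per AE and 32% budget share.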

  4. Monetary Incentive Effects on Event-Based Prospective Memory Three Months after Traumatic Brain Injury in Children

    Science.gov (United States)

    Pedroza, Claudia; Chapman, Sandra B.; Cook, Lori G.; Vásquez, Ana C.; Levin, Harvey S.

    2011-01-01

    Information regarding the remediation of event-based prospective memory (EB-PM) impairments following pediatric traumatic brain injury (TBI) is scarce. Addressing this, two levels of monetary incentives were used to improve EB-PM in children ages 7 to 16 years with orthopedic injuries (OI, n = 51), moderate TBI (n = 25), or severe TBI (n = 39) at approximately three months postinjury. The EB-PM task consisted of the child giving a specific verbal response to a verbal cue from the examiner while performing a battery of neuropsychological measures (ongoing task). Significant effects were found for Age-at-Test, Motivation Condition, Period, and Group. Within-group analyses indicated the OI and moderate TBI groups performed significantly better under the high- versus low-incentive condition, but the severe TBI group demonstrated no significant improvement. These results indicate EB-PM can be significantly improved at three months postinjury in children with moderate, but not severe, TBI. PMID:21347945

  5. Fostering Organizational Innovation based on modeling the Marketing Research Process through Event-driven Process Chain (EPC)

    Directory of Open Access Journals (Sweden)

    Elena Fleacă

    2016-11-01

    Full Text Available Enterprises competing in today's business environment must win and maintain their competitiveness through flexibility, fast reaction, and adaptation to changing customer needs, based on innovation in products, services, and internal processes. The paper addresses these challenges, which become more complex under high pressure for innovation. The methodology commences with a literature review of current knowledge on innovation through business process management. Secondly, the Event-driven Process Chain tool from the scientific literature is applied to model the variables of the marketing research process. The findings highlight the benefits of a marketing research workflow that enhances the value of market information while reducing the cost of obtaining it in a coherent way.

  6. On-board event processing algorithms for a CCD-based space borne X-ray spectrometer

    International Nuclear Information System (INIS)

    Chun, H.J.; Bowles, J.A.; Branduardi-Raymont, G.; Gowen, R.A.

    1996-01-01

    This paper describes two alternative algorithms which reduce the telemetry requirements for a Charge Coupled Device (CCD) based, space-borne X-ray spectrometer by on-board reconstruction of X-ray events split over two or more adjacent pixels. The algorithms have been developed for the Reflection Grating Spectrometer (RGS) on the X-ray Multi-Mirror (XMM) mission, the second cornerstone project in the European Space Agency's Horizon 2000 programme. The overall instrument and the criteria which guided the development of the algorithms, implemented in Tartan Ada on an MA31750 microprocessor, are described. The on-board processing constraints and requirements are discussed, and the performances of the algorithms are compared. Test results are presented which show that the recursive implementation is faster and has a smaller executable file, although it uses more memory because of its stack requirements. (orig.)

  7. A Step towards a Sharable Community Knowledge Base for WRF Settings -Developing a WRF Setting Methodology based on a case study in a Torrential Rainfall Event

    Science.gov (United States)

    CHU, Q.; Xu, Z.; Zhuo, L.; Han, D.

    2016-12-01

    Increased requirements for interaction between different disciplines, and ready access to numerical weather forecasting systems featuring portability and extensibility, have contributed to the growing number of downstream WRF model users in recent years. For these users, a knowledge base classified by representative events would be very helpful, because the determination of model settings is regarded as one of the most important steps in WRF. However, this process is generally time-consuming, even on a high-performance computing platform. We therefore propose a sharable lookup table of WRF domain settings and the corresponding procedures, based on a representative torrential rainfall event in Beijing, China. It has been found that the drift of WRF simulations away from the input lateral boundary conditions can be significantly reduced by adjusting the domain settings. Among all the impact factors, the placement of the nested domain affects not only the moving speed and angle of the storm center, but also the location and amount of the heavy-rain belt, which can only be detected with adjusted spatial resolutions. Spin-up time is also considered in the model settings and is demonstrated to have the most obvious influence on the accuracy of the simulations; this conclusion is based on the large diversity of spatial distributions of precipitation, with the amount of heavy rain varying from -30% to 58% across the experiments. After following all the procedures, variations in domain settings have minimal effect on the modeling, and the results show the best correlation (larger than 0.65) with fusion observations. The model settings, including a domain size covering the greater Beijing area, a 1:5:5 downscaling ratio, 57 vertical levels with a top at 50 hPa, and a 60 h spin-up time, are thus found suitable for predicting similar convective torrential rainfall events in the Beijing area. We hope that the procedure for building the community WRF knowledge

  8. Allowing Brief Delays in Responding Improves Event-Based Prospective Memory for Young Adults Living with HIV Disease

    Science.gov (United States)

    Loft, Shayne; Doyle, Katie L.; Naar-King, Sylvie; Outlaw, Angulique Y.; Nichols, Sharon L.; Weber, Erica; Blackstone, Kaitlin; Woods, Steven Paul

    2014-01-01

    Event-based prospective memory (PM) tasks require individuals to remember to perform an action when they encounter a specific cue in the environment, and have clear relevance for daily functioning for individuals with HIV. In many everyday tasks, the individual must not only maintain the intent to perform the PM task, but the PM task response also competes with an alternative and more habitual task response. The current study examined whether event-based PM can be improved by slowing down the pace of the task environment. Fifty-seven young adults living with HIV performed an ongoing lexical decision task while simultaneously performing a PM task of monitoring for a specific word (which was focal to the ongoing task of making lexical decisions) or a syllable contained in a word (which was nonfocal). Participants were instructed to refrain from making task responses until after a tone was presented, which occurred at varying onsets (0–1600 ms) after each stimulus appeared. Improvements in focal and nonfocal PM accuracy were observed with response delays of 600 ms. Furthermore, the difference in PM accuracy between the low-demand focal PM task and the resource-demanding nonfocal PM task was reduced by half across increasingly longer delays, falling from 31% at 0 ms delay to only 14% at 1600 ms delay. The degree of ongoing task response slowing for the PM conditions, relative to a control condition that did not have a PM task and made lexical decisions only, also decreased with increased delay. Overall, the evidence indicates that delaying the task responses of younger HIV-infected adults increased the probability that the PM-relevant features of task stimuli were adequately assessed prior to the ongoing task response, and by implication that younger HIV-infected adults can more adequately achieve PM goals when the pace of the task environment is slowed down. PMID:25116075

  9. A joint Cluster and ground-based instruments study of two magnetospheric substorm events on 1 September 2002

    Directory of Open Access Journals (Sweden)

    N. C. Draper

    2004-12-01

    Full Text Available We present a coordinated ground- and space-based multi-instrument study of two magnetospheric substorm events that occurred on 1 September 2002, during the interval from 18:00 UT to 24:00 UT. Data from the Cluster and Polar spacecraft are considered in combination with ground-based magnetometer and HF radar data. During the first substorm event the Cluster spacecraft, which were in the Northern Hemisphere lobe, are to the west of the main region affected by the expansion phase. Nevertheless, substorm signatures are seen by Cluster at 18:25 UT (just after the expansion phase onset as seen on the ground at 18:23 UT), despite the ~5 RE distance of the spacecraft from the plasma sheet. The Cluster spacecraft then encounter an earthward-moving diamagnetic cavity at 19:10 UT, having just entered the plasma sheet boundary layer. The second substorm expansion phase is preceded by pseudobreakups at 22:40 and 22:56 UT, at which time thinning of the near-Earth, L=6.6, plasma sheet occurs. The expansion phase onset at 23:05 UT is seen simultaneously in the ground magnetic field, in the magnetotail and at Polar's near-Earth position. The response in the ionospheric flows occurs one minute later. The second substorm better fits the near-Earth neutral line model for substorm onset than the cross-field current instability model. Key words. Magnetospheric physics (magnetosphere-ionosphere interactions; magnetic reconnection; auroral phenomena)

  10. Event Investigation

    International Nuclear Information System (INIS)

    Korosec, D.

    2000-01-01

    Events in the nuclear industry are investigated both from the licensee's point of view and from the regulatory side. The importance of event investigation is well known: one of its main goals is to prevent recurrence of the circumstances leading to the event and of its consequences. Protection of nuclear workers against nuclear hazards, and protection of the general public against the dangerous effects of an event, can be achieved through a systematic approach to event investigation. Both the nuclear safety regulatory body and the licensee shall ensure that operationally significant events are investigated in a systematic and technically sound manner to gather information pertaining to the probable causes of the event. One of the results should be appropriate feedback of the lessons learned to the regulatory body, the nuclear industry and the general public. In the present paper a general description of a systematic approach to event investigation is given. Such an approach works best where cooperation exists among the different divisions of the nuclear facility or regulatory body; by involving management and supervisors, the safety office can usually improve the whole process. The end result shall be a program which serves to prevent events and reduces the time and effort spent resolving the root cause which initiated each event. Selection of the proper investigation method and an adequate review of the findings and conclusions lead to a higher level of overall nuclear safety. (author)

  11. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the safety significance of events. However, for events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not fully capture the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analyzed are selected from the Organisation for Economic Co-operation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  12. An efficient routing algorithm for event based monitoring in a plant using virtual sink nodes in a wireless sensor network

    International Nuclear Information System (INIS)

    Jain, Sanjay Kumar; Vietla, Srinivas; Roy, D.A.; Biswas, B.B.; Pithawa, C.K.

    2010-01-01

    A wireless sensor network is a collection of wireless sensor nodes arranged in a self-forming network without the aid of any infrastructure or administration. The individual nodes have limited resources, so efficient communication mechanisms between the nodes must be devised for continued operation of the network in a plant environment. In wireless sensor networks, a sink node or base station at one end acts as the recipient of the information gathered by all other sensor nodes, and this information arrives at the sink through multiple hops across the nodes of the network. A routing algorithm has been developed in which a virtual sink node is generated whenever the hop count of an ordinary node crosses a specified value. The virtual sink node acts as a recipient node for the data of all neighboring nodes. This virtual sink helps reduce routing overhead, especially when the sensor network is scaled up to a larger network. The advantages of this scheme are lower energy consumption, reduced congestion in the network and longevity of the network. The algorithm is suitable for event-based or interval-based monitoring systems in nuclear plants. This paper describes the working of the proposed algorithm and provides its implementation details. (author)
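The core of the scheme described above, promoting a node to a virtual sink once hop counts to the base station grow too large, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation; the topology, the threshold value, and the election rule are our assumptions:

```python
from collections import deque

def hop_counts(adjacency, sink):
    """BFS hop count from every node back to the sink (base station)."""
    hops = {sink: 0}
    queue = deque([sink])
    while queue:
        node = queue.popleft()
        for nbr in adjacency[node]:
            if nbr not in hops:
                hops[nbr] = hops[node] + 1
                queue.append(nbr)
    return hops

def elect_virtual_sinks(adjacency, sink, max_hops=2):
    """Nodes sitting exactly at the hop-count threshold that have deeper
    neighbors become virtual sinks, aggregating their region's traffic."""
    hops = hop_counts(adjacency, sink)
    return {n for n, h in hops.items()
            if h == max_hops and any(hops[m] > max_hops for m in adjacency[n])}

# A small line-plus-branch topology: node 0 is the real sink.
topo = {0: [1], 1: [0, 2], 2: [1, 3, 4], 3: [2, 5], 4: [2], 5: [3]}
print(elect_virtual_sinks(topo, sink=0))  # {2}: gateway for nodes 3, 4, 5
```

Nodes 3, 4 and 5 lie beyond the threshold, so node 2 is elected to collect their data, which shortens the multi-hop routes the deep nodes must maintain.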

  13. Event generators for address event representation transmitters

    Science.gov (United States)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate 'events' according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. Two main approaches have been published for implementing such "AER Generator" circuits; they differ in how they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were
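The row-then-column arbitration described above can be illustrated in miniature. The sketch below serializes a set of pending spike events using a fixed-priority software stand-in for the hardware arbiter tree; all names and the priority rule are our assumptions, not Boahen's circuit:

```python
def aer_readout(pending):
    """Serialize a set of pending (row, col) spike events into an address
    stream: a row arbiter grants one row, a column arbiter then grants one
    column within that row, and the address word goes on the bus."""
    stream = []
    pending = set(pending)
    while pending:
        row = min(r for r, _ in pending)              # row arbiter grant
        col = min(c for r, c in pending if r == row)  # column arbiter grant
        stream.append((row, col))                     # address word on the bus
        pending.discard((row, col))                   # Ack: event consumed
    return stream

spikes = {(2, 1), (0, 3), (2, 0), (1, 2)}
print(aer_readout(spikes))  # [(0, 3), (1, 2), (2, 0), (2, 1)]
```

Colliding events are never dropped here; they simply queue until the arbiters grant them, which is why the arbiter-based approach sustains higher event traffic than collision-discarding schemes.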

  14. Irrigated Agriculture in Morocco: An Agent-Based Model of Adaptation and Decision Making Amid Increasingly Frequent Drought Events

    Science.gov (United States)

    Norton, M.

    2015-12-01

    In the past 100 years, Morocco has undertaken a heavy investment in developing water infrastructure that has led to a dramatic expansion of irrigated agriculture. Irrigated agriculture is the primary user of water in many arid countries, often accounting for 80-90% of total water usage. Irrigation is adopted by farmers not only because it leads to increased production, but also because it improves resilience to an uncertain climate. However, the Mediterranean region as a whole has also seen an increase in the frequency and severity of drought events. These droughts have had a dramatic impact on farmer livelihoods and have led to a number of coping strategies, including the adoption or disadoption of irrigation. In this study, we use a record of the annual extent of irrigated agriculture in Morocco to model the effect of drought on the extent of irrigated agriculture. Using an agent-based socioeconomic model, we seek to answer the following questions: 1) Do farmers expand irrigated agriculture in response to droughts? 2) Do drought events entail the removal of perennial crops like orchards? 3) Can we detect the retreat of irrigated agriculture in the more fragile watersheds of Morocco? Understanding the determinants of irrigated crop expansion and contractions will help us understand how agro-ecological systems transition from 20th century paradigms of expansion of water supply to a 21st century paradigm of water use efficiency. The answers will become important as countries learn how to manage water in new climate regimes characterized by less reliable and available precipitation.

  15. The Agency of Event

    DEFF Research Database (Denmark)

    Nicholas, Paul; Tamke, Martin; Riiber, Jacob

    2014-01-01

    This paper explores the notion of agency within event-based models. We present an event-based modeling approach that links interdependent generative, analytic and decision making sub-models within a system of exchange. Two case study projects demonstrate the underlying modeling concepts and metho...

  16. A comparative evaluation of emerging methods for errors of commission based on applications to the Davis-Besse (1985) event

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B.; Dang, V.N.; Hirschberg, S. [Paul Scherrer Inst., Nuclear Energy and Safety Research Dept., CH-5232 Villigen PSI (Switzerland); Straeter, O. [Gesellschaft fur Anlagen- und Reaktorsicherheit (Germany)

    1999-12-01

    In considering the human role in accidents, the classical PSA methodology applied today focuses primarily on the omissions of actions required of the operators at specific points in the scenario models. A practical, proven methodology is not available for systematically identifying and analyzing the scenario contexts in which the operators might perform inappropriate actions that aggravate the scenario. As a result, typical PSAs do not comprehensively treat these actions, referred to as errors of commission (EOCs). This report presents the results of a joint project of the Paul Scherrer Institut (PSI, Villigen, Switzerland) and the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, Garching, Germany) that examined some methods recently proposed for addressing the EOC issue. Five methods were investigated: 1) ATHEANA, 2) the Borssele screening methodology, 3) CREAM, 4) CAHR, and 5) CODA. In addition to a comparison of their scope, basic assumptions, and analytical approach, the methods were each applied in the analysis of PWR Loss of Feedwater scenarios based on the 1985 Davis-Besse event, in which the operator response included actions that can be categorized as EOCs. The aim was to compare how the methods consider a concrete scenario in which EOCs have in fact been observed. These case applications show how the methods are used in practical terms and constitute a common basis for comparing the methods and the insights that they provide. The identification of the potentially significant EOCs to be analysed in the PSA is currently the central problem for their treatment. The identification or search scheme has to consider an extensive set of potential actions that the operators may take. These actions may take place instead of required actions, for example, because the operators fail to assess the plant state correctly, or they may occur even when no action is required. As a result of this broad search space, most methodologies apply multiple schemes to

  17. A comparative evaluation of emerging methods for errors of commission based on applications to the Davis-Besse (1985) event

    International Nuclear Information System (INIS)

    Reer, B.; Dang, V.N.; Hirschberg, S.; Straeter, O.

    1999-12-01

    In considering the human role in accidents, the classical PSA methodology applied today focuses primarily on the omissions of actions required of the operators at specific points in the scenario models. A practical, proven methodology is not available for systematically identifying and analyzing the scenario contexts in which the operators might perform inappropriate actions that aggravate the scenario. As a result, typical PSAs do not comprehensively treat these actions, referred to as errors of commission (EOCs). This report presents the results of a joint project of the Paul Scherrer Institut (PSI, Villigen, Switzerland) and the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, Garching, Germany) that examined some methods recently proposed for addressing the EOC issue. Five methods were investigated: 1) ATHEANA, 2) the Borssele screening methodology, 3) CREAM, 4) CAHR, and 5) CODA. In addition to a comparison of their scope, basic assumptions, and analytical approach, the methods were each applied in the analysis of PWR Loss of Feedwater scenarios based on the 1985 Davis-Besse event, in which the operator response included actions that can be categorized as EOCs. The aim was to compare how the methods consider a concrete scenario in which EOCs have in fact been observed. These case applications show how the methods are used in practical terms and constitute a common basis for comparing the methods and the insights that they provide. The identification of the potentially significant EOCs to be analysed in the PSA is currently the central problem for their treatment. The identification or search scheme has to consider an extensive set of potential actions that the operators may take. These actions may take place instead of required actions, for example, because the operators fail to assess the plant state correctly, or they may occur even when no action is required. As a result of this broad search space, most methodologies apply multiple schemes to

  18. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department

    Directory of Open Access Journals (Sweden)

    Cholada Kittipittayakorn

    2016-01-01

    Full Text Available Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries’ healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES and agent-based simulation (ABS to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.
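The discrete-event half of such a DES/ABS hybrid can be sketched with a priority queue of arrival events. This toy model (one examiner, invented arrival and service times, no agent behaviors) illustrates only the queueing mechanics, not the study's calibrated model:

```python
import heapq

def simulate(arrivals, service_time):
    """Single-server clinic queue. arrivals: list of (arrival_minute, patient_id)
    tuples, mixing walk-in and scheduled patients. Returns total minutes waited."""
    events = list(arrivals)
    heapq.heapify(events)                # event list ordered by arrival time
    clock = total_wait = 0
    while events:
        arrive, pid = heapq.heappop(events)
        start = max(clock, arrive)       # wait if the examiner is still busy
        total_wait += start - arrive
        clock = start + service_time     # examiner occupied until finish
    return total_wait

walk_ins = [(0, "w1"), (5, "w2"), (7, "w3")]
scheduled = [(10, "s1"), (12, "s2")]
print(simulate(walk_ins + scheduled, service_time=8))  # 46 minutes of waiting
```

In the full hybrid approach, each patient would additionally be an agent whose behavior (early arrival, no-shows, rescheduling) feeds back into the event queue, which is what makes the combined DES/ABS model more detailed than either technique alone.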

  19. The Cognitive Aging of Episodic Memory: A View Based on the Event-Related Brain Potential (ERP)

    Directory of Open Access Journals (Sweden)

    David eFriedman

    2013-08-01

    Full Text Available A cardinal feature of older-adult cognition is a decline, relative to the young, in the encoding and retrieval of personally-relevant events, i.e. episodic memory (EM. A consensus holds that familiarity, a relatively automatic feeling of knowing that can support recognition-memory judgments, is preserved with aging. By contrast, recollection, which requires the effortful, strategic recovery of contextual detail, declines as we age. Over the last decade, ERPs have become increasingly important tools in the study of the aging of EM, because a few, well-researched EM effects have been associated with the cognitive processes thought to underlie successful EM performance. EM effects are operationalized by subtracting the ERPs elicited by correctly-rejected, new items from those to correctly recognized, old items. Although highly controversial, the mid-frontal effect (a positive component between ~300 and 500 ms, maximal at fronto-central scalp sites is thought to reflect familiarity-based recognition. A positivity between ~500 and 800 ms, maximal at left-parietal scalp, has been labeled the left-parietal EM effect. A wealth of evidence suggests that this brain activity reflects recollection-based retrieval. Here, I review the ERP evidence in support of the hypothesis that familiarity is maintained while recollection is compromised in older relative to young adults. I consider the possibility that the inconsistency in findings may be due to individual differences in performance, executive function and quality of life indices, such as socio-economic status.

  20. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    Science.gov (United States)

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.

  1. SENTINEL EVENTS

    Directory of Open Access Journals (Sweden)

    Andrej Robida

    2004-09-01

    Full Text Available Background. The objective of the article is two-year statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitudes of hospital leaders and staff are also included, together with some recommendations regarding patient safety and the handling of sentinel events. Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system for sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To present the results of the first year, a conference for hospital directors and medical directors was organized. A survey was conducted among the participants to gather information about their views on sentinel events; one hundred questionnaires were distributed. Results. Sentinel events: there were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths, of which 5 were in-hospital suicides, 6 were due to an adverse event, and 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization with absolute alcohol into a femoral artery, and a physical attack on a physician by a patient. Analysis of the root causes of the sentinel events showed that in most cases the processes were inadequate. Survey: one quarter of those surveyed did not know about the sentinel event reporting system, 16% had actual problems when reporting events, and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media. The majority of

  2. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Directory of Open Access Journals (Sweden)

    J. Ajayakumar

    2017-10-01

    Full Text Available With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by the variation of population density from place to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analyses of four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and "Hurricane Matthew" in October 2016. The analyses use geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, when developing software solutions to support the analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with
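The population-based normalization the authors call for can be as simple as converting raw counts to per-capita rates. A hedged sketch with invented counts and populations (not data from the study):

```python
def tweets_per_10k(counts, populations):
    """Scale raw tweet counts to a rate per 10,000 residents, so that
    densely populated regions do not dominate the spatial signal."""
    return {region: 10_000 * counts[region] / populations[region]
            for region in counts}

raw = {"urban_core": 5_000, "suburb": 600, "rural": 90}
pop = {"urban_core": 2_000_000, "suburb": 300_000, "rural": 30_000}
print(tweets_per_10k(raw, pop))
# urban_core: 25.0, suburb: 20.0, rural: 30.0 per 10k residents
```

Note how the ranking flips: the rural region has by far the fewest raw tweets but the strongest per-capita response, which is exactly the bias that unnormalized maps would hide.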

  3. Event Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2001-01-01

    The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse...... are dynamic and we present a modeling approach that can be used to model such dynamics.We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects events are phenomena that can be observed and described. For example, borrow events in a library can...

  4. A Novel Idea for Optimizing Condition-Based Maintenance Using Genetic Algorithms and Continuous Event Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-01-01

    Full Text Available Effective maintenance strategies are of utmost significance in systems engineering because of their direct link with the financial aspects and the safety of plant operation. Where the state of a system, for instance its level of deterioration, can be continuously observed, a condition-based maintenance (CBM) strategy may be adopted, wherein upkeep of the system is performed progressively on the basis of its monitored state. In this article, a multicomponent system kept under continuous observation is considered. To determine the optimal deterioration stage at which preventive maintenance should be carried out, a Genetic Algorithm (GA) technique is utilized. The system is formulated as a multiobjective problem aimed at optimizing the two desired objectives, profitability and availability. For realism, a prognostic model describing the evolution of the deteriorating system is employed, based on continuous event simulation techniques. Monte Carlo (MC) simulation was selected for this purpose, as it can take into account a wide range of probable outcomes and thereby help reduce uncertainty. The inherent benefits of this simulation technique are fully utilized to represent the various elements of a deteriorating system operating under a stressed environment. The proposed synergic model (GA and MC) is considered more effective owing to its "drop-by-drop approach", which successfully drives the search process toward the best optimal solutions.
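A toy version of the GA-plus-Monte-Carlo idea: the GA searches for a preventive-maintenance deterioration threshold, and each candidate's fitness is the expected cost estimated by Monte Carlo simulation of a randomly degrading component. Every cost, rate, and parameter below is invented for illustration; this is not the paper's model:

```python
import random

def mc_cost(threshold, runs=100, horizon=60, rng=random.Random(42)):
    """Expected maintenance cost per run, estimated by Monte Carlo.
    The shared default rng makes repeated estimates noisy, which is
    typical of simulation-based fitness evaluation."""
    total = 0.0
    for _ in range(runs):
        wear = 0.0
        for _ in range(horizon):
            wear += rng.uniform(0.0, 0.2)  # stochastic deterioration step
            if wear >= 1.0:
                total += 50.0              # failure: costly corrective repair
                wear = 0.0
            elif wear >= threshold:
                total += 10.0              # cheap preventive maintenance
                wear = 0.0
    return total / runs

def ga_optimize(generations=20, pop_size=10, rng=random.Random(0)):
    """Evolve the PM threshold: keep the fitter half, refill by mutation."""
    pop = [rng.uniform(0.1, 0.99) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mc_cost)              # lower expected cost = fitter
        parents = pop[: pop_size // 2]
        children = [min(0.99, max(0.1, rng.choice(parents) + rng.gauss(0, 0.05)))
                    for _ in parents]      # mutated offspring
        pop = parents + children
    return min(pop, key=mc_cost)

best = ga_optimize()
print(f"best PM threshold ~ {best:.2f}")
```

The single-objective cost here stands in for the paper's two objectives (profitability and availability), which a real implementation would handle with a multiobjective GA such as NSGA-II.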

  5. A Scenario-based Construction Method of Emergency Event Chain

    Institute of Scientific and Technical Information of China (English)

    马骁霏; 仲秋雁; 曲毅; 王宁; 王延章

    2013-01-01

    Emergency events often lack precursors and cause great harm; the event chain constituted by their secondary and derived events has severe negative effects on society. The chain structure of an emergency event is complex and varies frequently, and the evolution path of emergency events is difficult to judge. Drawing on a common knowledge model and using knowledge elements to express scenarios and events, this paper analyzes the interaction between scenarios and events, and the evolution of events, based on the relationships among knowledge elements. Driven by data and rules, a scenario-based construction method of emergency event chains is proposed. The method supports "scenario-response" policy decision-making. Finally, a case study is presented.
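A data- and rule-driven chain construction of the kind described can be sketched as a breadth-first expansion over trigger rules. The rule base and scenario attributes below are hypothetical illustrations, not the paper's knowledge-element model:

```python
# Hypothetical rule base: (triggering event, scenario condition, secondary event).
RULES = [
    ("earthquake", lambda s: s["building_damage"] > 0.5, "building_collapse"),
    ("earthquake", lambda s: s["gas_lines"], "fire"),
    ("fire", lambda s: s["chemical_plant"], "toxic_release"),
]

def build_event_chain(initial_event, scenario):
    """Expand an emergency event chain: each event fires every rule whose
    scenario condition holds, enqueueing the secondary events it derives."""
    chain, frontier, seen = [], [initial_event], set()
    while frontier:
        event = frontier.pop(0)
        if event in seen:          # avoid cycles in the rule graph
            continue
        seen.add(event)
        chain.append(event)
        for trigger, condition, secondary in RULES:
            if trigger == event and condition(scenario):
                frontier.append(secondary)
    return chain

scenario = {"building_damage": 0.7, "gas_lines": True, "chemical_plant": True}
print(build_event_chain("earthquake", scenario))
# -> ['earthquake', 'building_collapse', 'fire', 'toxic_release']
```

Changing the scenario attributes changes which branches of the chain materialize, which is the essence of "scenario-response" decision support.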

  6. Web-Based versus High-Fidelity Simulation Training for Certified Registered Nurse Anesthetists in the Management of High Risk/Low Occurrence Anesthesia Events

    Science.gov (United States)

    Kimemia, Judy

    2017-01-01

    Purpose: The purpose of this project was to compare web-based to high-fidelity simulation training in the management of high risk/low occurrence anesthesia related events, to enhance knowledge acquisition for Certified Registered Nurse Anesthetists (CRNAs). This project was designed to answer the question: Is web-based training as effective as…

  7. Multi-day activity scheduling reactions to planned activities and future events in a dynamic agent-based model of activity-travel behavior

    NARCIS (Netherlands)

    Nijland, E.W.L.; Arentze, T.A.; Timmermans, H.J.P.

    2009-01-01

    Modeling multi-day planning has received scarce attention to date in activity-based transport demand modeling. Elaborating and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that…

  8. Creating personalized memories from social events: community-based support for multi-camera recordings of school concerts

    NARCIS (Netherlands)

    R.L. Guimarães (Rodrigo); P.S. Cesar Garcia (Pablo Santiago); D.C.A. Bulterman (Dick); V. Zsombori; I. Kegel

    2011-01-01

    The wide availability of relatively high-quality cameras makes it easy for many users to capture video fragments of social events such as concerts, sports events or community gatherings. The wide availability of simple sharing tools makes it nearly as easy to upload individual fragments…

  9. Aerosol events in the broader Mediterranean basin based on 7-year (2000–2007) MODIS C005 data

    Directory of Open Access Journals (Sweden)

    A. Gkikas

    2009-09-01

    Full Text Available Aerosol events (their frequency and intensity) in the broader Mediterranean basin were studied using 7-year (2000–2007) aerosol optical depth (AOD) data at 550 nm from the MODerate Resolution Imaging Spectroradiometer (MODIS) aboard Terra. The complete spatial coverage of the data revealed a significant spatial variability of aerosol events that also depends on their intensity. Strong events occur more often in the western and central Mediterranean basin (up to 14 events/year), whereas extreme events (AOD up to 5.0) are systematically observed in the eastern Mediterranean basin throughout the year. There is also significant seasonal variability, with strong aerosol events occurring most frequently in the western part of the basin in summer and extreme episodes in the eastern part during spring. The events were also analyzed separately over land and sea, revealing differences due to different natural and anthropogenic processes, such as dust transport (producing maximum frequencies of extreme episodes in spring over both land and sea) or forest fires (producing maximum frequencies of strong episodes in summer over land). The inter-annual variability shows a gradual decrease in the frequency of all aerosol episodes over land and sea areas of the Mediterranean during 2000–2007, associated with an increase in their intensity (increased AOD values). The strong spatiotemporal variability of aerosol events indicates the need to monitor them at the highest spatial and temporal coverage and resolution.
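The abstract does not state the event-detection criterion. One common approach, offered here purely as an illustrative assumption rather than the authors' method, flags days whose AOD exceeds the climatological mean by some number of standard deviations:

```python
import statistics

def classify_aerosol_events(aod_series, strong_sigma=2.0, extreme_sigma=4.0):
    """Flag entries whose AOD exceeds the series mean by n standard
    deviations; the 'strong' and 'extreme' multipliers are illustrative."""
    mean = statistics.fmean(aod_series)
    sd = statistics.pstdev(aod_series)
    events = []
    for day, aod in enumerate(aod_series):
        if aod > mean + extreme_sigma * sd:
            events.append((day, aod, "extreme"))
        elif aod > mean + strong_sigma * sd:
            events.append((day, aod, "strong"))
    return events

# Synthetic daily AOD at one grid cell: a background near 0.2 with one spike
series = [0.2, 0.25, 0.22, 0.21, 0.24, 1.8, 0.23, 0.26, 0.9, 0.22]
for day, aod, kind in classify_aerosol_events(series):
    print(day, aod, kind)  # only the day-5 spike clears the sigma threshold
```

Applied per grid cell over the full record, such a scheme yields exactly the per-pixel event-frequency maps the study discusses.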

  10. Relevancies of multiple-interaction events and signal-to-noise ratio for Anger-logic based PET detector designs

    Science.gov (United States)

    Peng, Hao

    2015-10-01

    A fundamental challenge for PET block detector designs is to deploy finer crystal elements while limiting the number of readout channels. The standard Anger-logic scheme including light sharing (an 8 by 8 crystal array coupled to a 2×2 photodetector array with an optical diffuser, multiplexing ratio: 16:1) has been widely used to address such a challenge. Our work proposes a generalized model to study the impacts of two critical parameters on spatial resolution performance of a PET block detector: multiple interaction events and signal-to-noise ratio (SNR). The study consists of the following three parts: (1) studying light output profile and multiple interactions of 511 keV photons within crystal arrays of different crystal widths (from 4 mm down to 1 mm, constant height: 20 mm); (2) applying the Anger-logic positioning algorithm to investigate positioning/decoding uncertainties (i.e., "block effect") in terms of peak-to-valley ratio (PVR), with light sharing, multiple interactions and photodetector SNR taken into account; and (3) studying the dependency of spatial resolution on SNR in the context of modulation transfer function (MTF). The proposed model can be used to guide the development and evaluation of a standard Anger-logic based PET block detector including: (1) selecting/optimizing the configuration of crystal elements for a given photodetector SNR; and (2) predicting to what extent additional electronic multiplexing may be implemented to further reduce the number of readout channels.
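The Anger-logic positioning algorithm referred to above is the classical signal-weighted centroid: the event coordinate is estimated as the light-output-weighted mean of the photodetector positions. A minimal sketch, with an illustrative 2×2 geometry and signal values that are not from the paper:

```python
def anger_position(signals, positions):
    """Anger-logic position estimate: the event coordinate is the
    signal-weighted centroid of the photodetector positions."""
    total = sum(signals)
    x = sum(s * px for s, (px, _) in zip(signals, positions)) / total
    y = sum(s * py for s, (_, py) in zip(signals, positions)) / total
    return x, y

# 2x2 photodetector array at unit coordinates (illustrative geometry)
dets = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
# Light shared unevenly: the event sits closer to the right-hand detectors
signals = [10.0, 30.0, 10.0, 30.0]
print(anger_position(signals, dets))  # -> (0.75, 0.5)
```

Multiple-interaction events degrade this estimate because the light output is then a mixture of two deposition sites, and photodetector noise blurs the weights, which is precisely why the model in the abstract treats both effects together.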

  11. Feasibility of a neutron detector-dosemeter based on single-event upsets in dynamic random-access memories

    International Nuclear Information System (INIS)

    Phillips, G.W.; August, R.A.; Campbell, A.B.; Nelson, M.E.; Guardala, N.A.; Price, J.L.; Moscovitch, M.

    2002-01-01

    The feasibility was investigated of a solid-state neutron detector/dosemeter based on single-event upset (SEU) effects in dynamic random-access memories (DRAMs), commonly used in computer memories. Such a device, which uses a neutron converter material to produce a charged particle capable of causing an upset, would be lightweight and low-power, and could be read simply by polling the memory for bit flips. It would have significant advantages over standard solid-state neutron dosemeters, which require off-line processing for track etching and analysis. Previous efforts at developing an SEU neutron detector/dosemeter have suffered from poor response, which can be greatly enhanced by selecting a modern high-density DRAM chip for SEU sensitivity and by using a thin ¹⁰B film as a converter. Past attempts to use ¹⁰B were not successful because the average alpha-particle energy was insufficient to penetrate to the sensitive region of the memory. This can be overcome by removing the surface passivation layer before depositing the ¹⁰B film or by implanting ¹⁰B directly into the chip. Previous experimental data show a 10³ increase in neutron sensitivity for chips containing borosilicate glass, which could be used in an SEU detector. The results are presented of simulations showing that the absolute efficiency of an SEU neutron dosemeter can be increased by at least a factor of 1000 over earlier designs. (author)
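Reading such a dosemeter "simply by polling the memory for bit flips" amounts to XOR-ing each readback word against the known test pattern and counting set bits; the dose then scales with the flip count. A minimal sketch with hypothetical word values:

```python
def count_bit_flips(reference, readback):
    """Poll a memory region for single-event upsets: XOR each readback
    word against the written test pattern and count the set bits."""
    flips = 0
    for ref, now in zip(reference, readback):
        flips += bin(ref ^ now).count("1")  # each set bit is one upset
    return flips

pattern = [0x55] * 8               # test pattern written to the memory
readback = list(pattern)
readback[3] ^= 0x04                # one upset: a single bit flipped
readback[6] ^= 0x81                # two upsets in the same word
print(count_bit_flips(pattern, readback))  # -> 3
```

In an actual device the flip count per polling interval would be converted to dose via a calibration factor; that factor is exactly what the converter-film and chip-selection improvements in the abstract aim to raise.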

  12. Community event-based outreach screening for syphilis and other sexually transmissible infections among gay men in Sydney, Australia.

    Science.gov (United States)

    Read, Phillip J; Knight, Vickie; Bourne, Christopher; Guy, Rebecca; Donovan, Basil; Allan, Warwick; McNulty, Anna M

    2013-08-01

    Objectives: Increased testing frequency is a key strategy in syphilis control, but achieving regular testing is difficult. The objective of this study is to describe a sexually transmissible infection (STI) testing outreach program (the Testing Tent) at a gay community event. Gay men attending the Testing Tent in 2010-11 completed a computer-assisted self-interview and were screened for STIs. Clinical, demographic, behavioural and diagnostic data were compared with those of gay men attending a clinic-based service during 2009. The Testing Tent was marketed on social media sites, and data were extracted on the number of times the advertisements were viewed. Staffing, laboratory, marketing and venue-hire expenses were calculated to estimate the cost of delivering the service. Ninety-eight men attended the Testing Tent. They were older (median age: 42 years v. 30 years; P…) … facilities are an acceptable option and are accessed by gay men requiring regular testing, and may be an important addition to traditional testing environments.

  13. Deficits in cue detection underlie event-based prospective memory impairment in major depression: an eye tracking study.

    Science.gov (United States)

    Chen, Siyi; Zhou, Renlai; Cui, Hong; Chen, Xinyin

    2013-10-30

    This study examined cue detection in the non-focal event-based prospective memory (PM) of individuals with and without a major depressive disorder, using behavioural and eye-tracking assessments. The participants were instructed to search on each trial for a different target stimulus that could be present or absent and to make prospective responses to the cue object. PM tasks included cue only and target plus cue, whereas ongoing tasks included target only and distracter only. The results showed that (a) participants with depression performed more poorly than those without depression on PM; (b) participants with depression showed more fixations and longer total and average fixation durations in both ongoing and PM conditions; and (c) participants with depression had lower accuracy in target-plus-cue trials than in cue-only trials, and a higher gaze rate on targets for hits and misses in target-plus-cue trials, than did those without depression. The results indicate that the state of depression may impair top-down cognitive control, which in turn results in particular deficits in monitoring for PM cues. Copyright © 2013. Published by Elsevier Ireland Ltd.

  14. A Simple and Sensitive Plant-Based Western Corn Rootworm Bioassay Method for Resistance Determination and Event Selection.

    Science.gov (United States)

    Wen, Zhimou; Chen, Jeng Shong

    2018-05-26

    We report here a simple and sensitive plant-based western corn rootworm, Diabrotica virgifera virgifera LeConte (Coleoptera: Chrysomelidae), bioassay method that allows for examination of multiple parameters for both plants and insects in a single experimental setup within a short duration. For plants, injury to roots can be visually examined, fresh root weight can be measured, and expression of trait protein in plant roots can be analyzed. For insects, in addition to survival, larval growth and development can be evaluated in several aspects including body weight gain, body length, and head capsule width. We demonstrated using the method that eCry3.1Ab-expressing 5307 corn was very effective against western corn rootworm by eliciting high mortality and significantly inhibiting larval growth and development. We also validated that the method allowed determination of resistance in an eCry3.1Ab-resistant western corn rootworm strain. While data presented in this paper demonstrate the usefulness of the method for selection of events of protein traits and for determination of resistance in laboratory populations, we envision that the method can be applied in much broader applications.

  15. Three Dimensional Numerical Simulation of Rocket-based Combined-cycle Engine Response During Mode Transition Events

    Science.gov (United States)

    Edwards, Jack R.; McRae, D. Scott; Bond, Ryan B.; Steffan, Christopher (Technical Monitor)

    2003-01-01

    The GTX program at NASA Glenn Research Center is designed to develop a launch vehicle concept based on rocket-based combined-cycle (RBCC) propulsion. Experimental testing, cycle analysis, and computational fluid dynamics modeling have all demonstrated the viability of the GTX concept, yet significant technical issues and challenges still remain. Our research effort develops a unique capability for dynamic CFD simulation of complete high-speed propulsion devices and focuses this technology toward analysis of the GTX response during critical mode transition events. Our principal attention is focused on Mode 1/Mode 2 operation, in which initial rocket propulsion is transitioned into thermal-throat ramjet propulsion. A critical element of the GTX concept is the use of an Independent Ramjet Stream (IRS) cycle to provide propulsion at Mach numbers less than 3. In the IRS cycle, rocket thrust is initially used for primary power, and the hot rocket plume is used as a flame-holding mechanism for hydrogen fuel injected into the secondary air stream. A critical aspect is the establishment of a