WorldWideScience

Sample records for initiation events based

  1. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of both. Existing initiating event analysis methods and experience are focused on water reactors; no methods or theories are available for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important, and it can be developed from PWR analysis theories and methods. The initiating event lists and analysis methods of Generation II and III PWRs, the high-temperature gas-cooled reactor, and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are then identified and developed through logical analysis, and a preliminary analysis of TMSR initiating events is described. The work is important for clarifying the rules of initiating event analysis and is useful for TMSR design and nuclear safety analysis. (authors)

  2. Initiating events frequency determination

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2004-01-01

    The paper describes work performed for the Nuclear Power Station (NPS). The work relates to the periodic update of initiating event frequencies for the Probabilistic Safety Assessment (PSA). Data for all relevant NPS initiating events (IE) were reviewed. The main focus was on events occurring during the most recent operating history (i.e., the last four years). The final IE frequencies were estimated by incorporating both NPS experience and nuclear industry experience. Each event was categorized according to the NPS individual plant examination (IPE) initiating event grouping approach. For the majority of the IE groups, few or no events have occurred at the NPS. For those IE groups with few or no NPS events, the final estimate was made by means of a Bayesian update with general nuclear industry values. Exceptions are rare loss-of-coolant-accident (LOCA) events, where evaluation of engineering aspects is used to determine the frequency. (author)
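
    As a rough illustration of the Bayesian update described above, the sketch below combines a generic industry prior with plant-specific event counts using the conjugate gamma-Poisson model commonly applied to initiating event frequencies. All numbers and the helper function are illustrative assumptions, not values or code from the paper.

        # Minimal sketch of a conjugate gamma-Poisson Bayesian update for an
        # initiating-event frequency (events per reactor-year). The prior and the
        # plant data below are illustrative placeholders only.

        def bayesian_ie_frequency(prior_mean, prior_variance, plant_events, plant_years):
            """Return the posterior mean frequency after a Bayesian update."""
            # Express the generic industry estimate as a gamma(alpha, beta) prior:
            # mean = alpha / beta, variance = alpha / beta**2.
            beta = prior_mean / prior_variance
            alpha = prior_mean * beta
            # Conjugate update with a Poisson-distributed plant event count.
            return (alpha + plant_events) / (beta + plant_years)

        # Example: generic frequency 0.1/yr (variance 0.01), 1 plant event in 4 years.
        print(bayesian_ie_frequency(0.1, 0.01, plant_events=1, plant_years=4.0))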

  3. RAS Initiative - Events

    Science.gov (United States)

    The NCI RAS Initiative has organized multiple events with outside experts to discuss how the latest scientific and technological breakthroughs can be applied to discover vulnerabilities in RAS-driven cancers.

  4. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    OpenAIRE

    Tramblay, Yves; Bouvier, Christophe; Martin, C.; Didon-Lescot, J. F.; Todorovik, D.; Domergue, J. M.

    2010-01-01

    Flash floods are the most destructive natural hazards that occur in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the initial moisture conditions estimation prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture,...

  5. Integrated Initiating Event Performance Indicators

    International Nuclear Information System (INIS)

    S. A. Eide; Dale M. Rasmuson; Corwin L. Atwood

    2005-01-01

    The U.S. Nuclear Regulatory Commission Industry Trends Program (ITP) collects and analyzes industry-wide data, assesses the safety significance of results, and communicates results to Congress and other stakeholders. This paper outlines potential enhancements in the ITP to comprehensively cover the Initiating Events Cornerstone of Safety. Future work will address other cornerstones of safety. The proposed Tier 1 activity involves collecting data on ten categories of risk-significant initiating events, trending the results, and comparing early performance with prediction limits (allowable numbers of events, above which NRC action may occur). Tier 1 results would be used to monitor industry performance at the level of individual categories of initiating events. The proposed Tier 2 activity involves integrating the information for individual categories of initiating events into a single risk-based indicator, termed the Baseline Risk Index for Initiating Events (BRIIE). The BRIIE would be evaluated yearly and compared against a threshold. BRIIE results would be reported to Congress on a yearly basis.

  6. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    OpenAIRE

    Wooyeon Sunwoo; Minha Choi

    2017-01-01

    Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall even...

  7. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service-Curve Number (SCS-CN) method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from the measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and the soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using the four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between the remotely sensed soil moisture (SSM and SWI) and the ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5, by 60% and 66%, respectively. Moreover, the proposed IWC framework with
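
    As a concrete reference for the SCS-CN step mentioned above, the following sketch evaluates the curve-number runoff equation with the maximum retention scaled by a soil-wetness proxy in [0, 1]. The scaling rule and all numbers are simple assumptions for illustration, not the calibration used in the study.

        # Illustrative SCS Curve Number (SCS-CN) direct-runoff calculation, with the
        # maximum potential retention S adjusted by a wetness proxy such as a soil
        # water index. The adjustment rule is an assumption, not the paper's method.

        def scs_cn_runoff(rainfall_mm, curve_number, wetness=0.5, lambda_ia=0.2):
            """Return direct runoff depth Q (mm) for a storm of depth P (mm)."""
            s_dry = 25400.0 / curve_number - 254.0   # maximum retention for dry soil (mm)
            s = s_dry * (1.0 - wetness)              # assumed: wetter soil retains less
            ia = lambda_ia * s                       # initial abstraction
            if rainfall_mm <= ia:
                return 0.0
            return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

        # Example: 80 mm storm, CN = 70, soil water index of 0.6.
        print(scs_cn_runoff(80.0, 70.0, wetness=0.6))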

  8. Identification of Initiating Events for PGSFR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jintae; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The Sodium-cooled Fast Reactor (SFR) is by far the most advanced reactor of the six Generation IV reactors. The SFR uses liquid sodium as the reactor coolant, which has superior heat transport characteristics. It also allows high power density with low coolant volume fraction and operation at low pressure. In Korea, KAERI has been developing Prototype Generation-IV Sodium-cooled Fast Reactor (PGSFR) that employs passive safety systems and inherent reactivity feedback effects. In order to prepare for the licensing, it is necessary to assess the safety of the reactor. Thus, the objective of this study is to conduct accident sequence analysis that can contribute to risk assessment. The analysis embraces identification of initiating events and accident sequences development. PGSFR is to test and demonstrate the performance of transuranic (TRU)-containing metal fuel required for a commercial SFR, and to demonstrate the TRU transmutation capability of a burner reactor as a part of an advanced fuel cycle system. Initiating events that can happen in PGSFR were identified through the MLD method. This method presents a model of a plant in terms of individual events and their combinations in a systematic and logical way. The 11 identified initiating events in this study include the events considered in the past analysis that was conducted for PRISM-150.

  9. Identification of Initiating Events for PGSFR

    International Nuclear Information System (INIS)

    Kim, Jintae; Jae, Moosung

    2016-01-01

    The Sodium-cooled Fast Reactor (SFR) is by far the most advanced reactor of the six Generation IV reactors. The SFR uses liquid sodium as the reactor coolant, which has superior heat transport characteristics. It also allows high power density with low coolant volume fraction and operation at low pressure. In Korea, KAERI has been developing Prototype Generation-IV Sodium-cooled Fast Reactor (PGSFR) that employs passive safety systems and inherent reactivity feedback effects. In order to prepare for the licensing, it is necessary to assess the safety of the reactor. Thus, the objective of this study is to conduct accident sequence analysis that can contribute to risk assessment. The analysis embraces identification of initiating events and accident sequences development. PGSFR is to test and demonstrate the performance of transuranic (TRU)-containing metal fuel required for a commercial SFR, and to demonstrate the TRU transmutation capability of a burner reactor as a part of an advanced fuel cycle system. Initiating events that can happen in PGSFR were identified through the MLD method. This method presents a model of a plant in terms of individual events and their combinations in a systematic and logical way. The 11 identified initiating events in this study include the events considered in the past analysis that was conducted for PRISM-150

  10. Early Glycemic Control and Magnitude of HbA1c Reduction Predict Cardiovascular Events and Mortality: Population-Based Cohort Study of 24,752 Metformin Initiators.

    Science.gov (United States)

    Svensson, Elisabeth; Baggesen, Lisbeth M; Johnsen, Søren P; Pedersen, Lars; Nørrelund, Helene; Buhl, Esben S; Haase, Christiane L; Thomsen, Reimar W

    2017-06-01

    We investigated the association of early achieved HbA1c level and magnitude of HbA1c reduction with subsequent risk of cardiovascular events or death in patients with type 2 diabetes who initiate metformin. This was a population-based cohort study including all metformin initiators with HbA1c tests in Northern Denmark, 2000-2012. Six months after metformin initiation, we classified patients by HbA1c achieved (<6.5% or higher) and by magnitude of HbA1c change from the pretreatment baseline. We used Cox regression to examine subsequent rates of acute myocardial infarction, stroke, or death, controlling for baseline HbA1c and other confounding factors. We included 24,752 metformin initiators (median age 62.5 years, 55% males) with a median follow-up of 2.6 years. The risk of a combined outcome event gradually increased with rising levels of HbA1c achieved compared with a target HbA1c of <6.5%: adjusted hazard ratio (HR) 1.18 (95% CI 1.07-1.30) for 6.5-6.99%, HR 1.23 (1.09-1.40) for 7.0-7.49%, HR 1.34 (1.14-1.57) for 7.5-7.99%, and HR 1.59 (1.37-1.84) for ≥8%. Results were consistent for individual outcome events and robust by age-group and other patient characteristics. A large absolute HbA1c reduction from baseline also predicted outcome: adjusted HR 0.80 (0.65-0.97) for Δ = -4, HR 0.98 (0.80-1.20) for Δ = -3, HR 0.92 (0.78-1.08) for Δ = -2, and HR 0.99 (0.89-1.10) for Δ = -1 compared with no HbA1c change (Δ = 0). A large initial HbA1c reduction and achievement of low HbA1c levels within 6 months after metformin initiation are associated with a lower risk of cardiovascular events and death in patients with type 2 diabetes. © 2017 by the American Diabetes Association.
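
    For readers unfamiliar with the survival model used here, the sketch below fits a Cox proportional-hazards model with the lifelines package. The column names and the toy data are assumptions for illustration; the study used registry data and adjusted for additional confounders.

        # Minimal Cox proportional-hazards sketch (lifelines); toy data only.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "followup_years": [2.1, 3.5, 0.8, 2.6, 4.0, 1.2, 3.1, 2.2],
            "event":          [0,   1,   1,   0,   0,   1,   0,   1  ],  # MI, stroke, or death
            "hba1c_achieved": [6.2, 7.8, 8.4, 7.1, 6.9, 6.4, 7.4, 7.0],  # % at 6 months
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="followup_years", event_col="event")
        cph.print_summary()  # hazard ratio per 1-point increase in achieved HbA1c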

  11. Selection of initial events of accelerator driven subcritical system

    International Nuclear Information System (INIS)

    Wang Qianglong; Hu Liqin; Wang Jiaqun; Li Yazhou; Yang Zhiyi

    2013-01-01

    The Probabilistic Safety Assessment (PSA) is an important tool in reactor safety analysis and a significant reference for reactor design and operation. Selection of the initial events is the origin and foundation of the PSA for a reactor. The Accelerator Driven Subcritical System (ADS) has advanced design characteristics, complicated subsystems, and little engineering and operating experience, which makes it much more difficult to identify the initial events of the ADS. Based on the current ADS design project, the system's safety characteristics and special issues were analyzed in this article. After a series of deductions with the Master Logic Diagram (MLD), and considering the related experience of other advanced research reactors, a preliminary list of initial events was compiled, which provides the foundation for the subsequent safety assessment. (authors)

  12. Forecasting of integral parameters of solar cosmic ray events according to initial characteristics of an event

    International Nuclear Information System (INIS)

    Belovskij, M.N.; Ochelkov, Yu.P.

    1981-01-01

    A forecasting method for the integral proton flux of solar cosmic rays (SCR), based on the initial characteristics of the phenomenon, is proposed. The efficiency of the method is substantiated, the accuracy of the forecast is estimated, and retrospective forecasting of real events is carried out. The parameters of the universal function describing the time evolution of SCR events are presented. The proposed method is suitable for forecasting practically all SCR events. The timeliness of the forecast is not worse than that of forecasts based on SCR propagation models. [ru

  13. Event-By-Event Initial Conditions for Heavy Ion Collisions

    Science.gov (United States)

    Rose, S.; Fries, R. J.

    2017-04-01

    The early time dynamics of heavy ion collisions can be described by classical fields in an approximation of Quantum ChromoDynamics (QCD) called the Color Glass Condensate (CGC). Monte-Carlo sampling of the color charges of the incoming nuclei is used to calculate their classical gluon fields. Following the recent work by Chen et al., we calculate the energy-momentum tensor of those fields at early times in the collision, event by event. This can then be used for subsequent hydrodynamic evolution of the single events.

  14. Event-By-Event Initial Conditions for Heavy Ion Collisions

    International Nuclear Information System (INIS)

    Rose, S; Fries, R J

    2017-01-01

    The early time dynamics of heavy ion collisions can be described by classical fields in an approximation of Quantum ChromoDynamics (QCD) called the Color Glass Condensate (CGC). Monte-Carlo sampling of the color charges of the incoming nuclei is used to calculate their classical gluon fields. Following the recent work by Chen et al., we calculate the energy-momentum tensor of those fields at early times in the collision, event by event. This can then be used for subsequent hydrodynamic evolution of the single events. (paper)

  15. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study of the 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated Event Tree Analysis for a typical Japanese BWR was carried out solely based on open documents such as the 'Safety Analysis Report'. The objectives of this analysis are as follows: to delineate core-melt accident sequences initiated by LOSP; to evaluate the importance of core-melt accident sequences in terms of occurrence frequency; and to develop a foundation of plant information and analytical procedures for efficiently performing the further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated Event Tree Analysis. In this analysis, two types of event trees, a Functional Event Tree and a Systemic Event Tree, were developed to delineate core-melt accident sequences and to quantify their frequencies. A Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for the accident progression analysis of the Level 2 PSA, which will follow in the future. Applying U.S. operational experience data such as component failure rates and a LOSP frequency, we obtained the following results: the total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10⁻⁴ per reactor-year; the dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a LOSP frequency of 0.13/R·Y, higher than Japanese experience, was used, and no recovery actions were considered. In fact, however, there has been no LOSP event at Japanese nuclear power plants so far, and it is also expected that offsite power and/or the PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors would reduce the total core-melt frequency to less than 10⁻⁶ per reactor-year. (J.P.N.)
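
    To make the quantification step concrete, the sketch below shows how systemic event-tree sequence frequencies are typically obtained: the initiator frequency is multiplied by the branch probabilities along each path, and the core-melt sequences are summed. The branch probabilities are placeholders, not values from the study.

        # Toy quantification of a LOSP-initiated event tree. Sequence frequency =
        # initiator frequency x product of branch probabilities; core-melt sequences
        # are summed. All branch probabilities below are illustrative placeholders.

        LOSP_FREQ = 0.13  # per reactor-year, the value assumed in the analysis

        # (sequence description, branch probabilities along the path, core melt?)
        sequences = [
            ("emergency power success / DHR success",   [0.99, 0.999], False),
            ("loss of emergency electric power supply", [0.01],        True),
            ("emergency power success / loss of DHR",   [0.99, 0.001], True),
        ]

        core_melt_freq = 0.0
        for name, branch_probs, core_melt in sequences:
            freq = LOSP_FREQ
            for p in branch_probs:
                freq *= p
            if core_melt:
                core_melt_freq += freq
                print(f"{name}: {freq:.2e} per reactor-year")

        print(f"total LOSP core-melt frequency: {core_melt_freq:.2e} per reactor-year")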

  16. A review for identification of initiating events in event tree development process on nuclear power plants

    International Nuclear Information System (INIS)

    Riyadi, Eko H.

    2014-01-01

    An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, e.g., through a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and the master logic diagram (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events, categorized into LOCA, transients, and external events, could be obtained.

  17. A review for identification of initiating events in event tree development process on nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Riyadi, Eko H., E-mail: e.riyadi@bapeten.go.id [Center for Regulatory Assessment of Nuclear Installation and Materials, Nuclear Energy Regulatory Agency (BAPETEN), Jl. Gajah Mada 8 Jakarta 10120 (Indonesia)

    2014-09-30

    An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, e.g., through a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and the master logic diagram (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events, categorized into LOCA, transients, and external events, could be obtained.

  18. Initiating Events Modeling for On-Line Risk Monitoring Application

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.

    1998-01-01

    In order to make the on-line risk monitoring application of Probabilistic Risk Assessment more complete and realistic, special attention needs to be dedicated to initiating event modeling. Two different issues are of special importance: one is how to model initiating event frequencies according to the current plant configuration (equipment alignment and out-of-service status) and operating conditions (weather and various activities); the second is how to preserve dependencies between the initiating events model and the rest of the PRA model. First, the paper discusses how initiating events can be treated in an on-line risk monitoring application. Second, a practical example of initiating event modeling in EPRI's Equipment Out of Service on-line monitoring tool is presented. Gains from the application and possible improvements are discussed in the conclusion. (author)

  19. Seeking for toroidal event horizons from initially stationary BH configurations

    International Nuclear Information System (INIS)

    Ponce, Marcelo; Lousto, Carlos; Zlochower, Yosef

    2011-01-01

    We construct and evolve non-rotating vacuum initial data with a ring singularity, based on a simple extension of the standard Brill-Lindquist multiple BH initial data, and search for event horizons with spatial slices that are toroidal when the ring radius is sufficiently large. While evolutions of the ring singularity are not numerically feasible for large radii, we find some evidence, based on configurations of multiple BHs arranged in a ring, that this configuration leads to a singular limit in which the horizon width goes to zero, possibly indicating the presence of a naked singularity, when the radius of the ring is sufficiently large. This is in agreement with previous studies that have found that there is no apparent horizon surrounding the ring singularity when the ring's radius is larger than about twice its mass.

  20. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  1. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  2. Viability of Event Management Business in Batangas City, Philippine: Basis for Business Operation Initiatives

    OpenAIRE

    Jeninah Christia D. Borbon

    2016-01-01

    The research study on Viability of Event Management Business in Batangas City: Basis for Business Operation Initiatives aimed to assess the viability of this type of business using Thompson’s (2005) Dimension of Business Viability as its tool in order to create business operation initiatives. It provided a good framework for defining success factors in entrepreneurial operation initiatives in a specific business type – event management. This study utilized event organizers based i...

  3. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Science.gov (United States)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to

  4. Estimation of initiating event frequency for external flood events by extreme value theorem

    International Nuclear Information System (INIS)

    Chowdhury, Sourajyoti; Ganguly, Rimpi; Hari, Vibha

    2017-01-01

    External flood is an important common cause initiating event in nuclear power plants (NPPs). It may potentially lead to severe core damage (SCD) by first causing the failure of the systems required for maintaining the heat sinks and then by contributing to failures of engineered systems designed to mitigate such failures. The sample NPP taken here is a twin-unit 220 MWe Indian standard pressurized heavy water reactor (PHWR) plant situated inland. A comprehensive in-house Level-1 internal events PSA for full power had already been performed. An external flood assessment was further conducted in the area of external hazard risk assessment, in response to post-Fukushima measures taken in the nuclear industry. The present paper describes the methodology to calculate the initiating event (IE) frequency for external flood events for the sample inland Indian NPP. Generalized extreme value (GEV) theory, based on the maximum likelihood method (MLM) and the order statistics approach (OSA), is used to analyse the rainfall data for the site. The thousand-year return level and the necessary return periods for extreme rainfall are evaluated. These results, along with plant-specific topographical calculations, quantitatively establish that external flooding resulting from upstream dam break, river flooding and heavy rainfall (flash flood) would be unlikely for the sample NPP in consideration.
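
    The return-level calculation described here can be sketched with a maximum-likelihood GEV fit to annual-maximum rainfall, as below. The synthetic data and the 1000-year return period are placeholders; the site record and the order-statistics variant are not reproduced.

        # GEV fit to annual-maximum rainfall and the resulting N-year return level
        # (maximum-likelihood estimation). Synthetic data stand in for the site record.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(0)
        annual_max_rain_mm = 150 + 40 * rng.gumbel(size=50)   # 50 years of annual maxima

        shape, loc, scale = genextreme.fit(annual_max_rain_mm)

        # The 1000-year return level is the quantile exceeded with probability 1/1000
        # in any given year.
        return_period = 1000.0
        level = genextreme.ppf(1.0 - 1.0 / return_period, shape, loc=loc, scale=scale)
        print(f"estimated 1000-year rainfall: {level:.0f} mm")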

  5. Initial-state parton shower kinematics for NLO event generators

    International Nuclear Information System (INIS)

    Odaka, Shigeru; Kurihara, Yoshimasa

    2007-01-01

    We are developing a consistent method to combine tree-level event generators for hadron collision interactions with those including one additional QCD radiation from the initial-state partons, based on the limited leading-log (LLL) subtraction method, aiming at an application to NLO event generators. In this method, a boundary between non-radiative and radiative processes necessarily appears at the factorization scale (μF). The radiation effects are simulated using a parton shower (PS) in non-radiative processes. It is therefore crucial in our method to apply a PS which reproduces well the radiation activities evaluated from the matrix-element (ME) calculations for radiative processes. The PS activity depends on the applied kinematics model. In this paper we introduce two models for our simple initial-state leading-log PS: a model similar to the 'old' PYTHIA-PS and a pT-prefixed model motivated by ME calculations. PS simulations employing these models are tested using W-boson production at the LHC as an example. Both simulations show a smooth matching to the LLL-subtracted W+1 jet simulation in the pT distribution of W bosons, and the summed pT spectra are stable against a variation of μF, even though the pT-prefixed PS results in an apparently harder pT spectrum. (orig.)

  6. Sensitivity studies on the approaches for addressing multiple initiating events in fire events PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Lim, Ho Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    A single fire event within a fire compartment or a fire scenario can cause multiple initiating events (IEs). As an example, a fire in a turbine building fire area can cause both a loss of main feed-water (LOMF) and a loss of off-site power (LOOP) IE. Previous domestic fire events PSA had considered only the most severe initiating event among multiple initiating events. NUREG/CR-6850 and the ANS/ASME PRA Standard require that multiple IEs be addressed in fire events PSA. In this paper, sensitivity studies on the approaches for addressing multiple IEs in the fire events PSA for Hanul Unit 3 were performed and their results are presented. From the sensitivity analysis results, we find that incorporating multiple IEs into the fire events PSA model increases the core damage frequency (CDF) and may lead to the generation of duplicate cutsets. Multiple IEs can also occur in internal flooding events or other external events such as seismic events. They should be considered in the construction of PSA models in order to realistically estimate the risk due to flooding or seismic events.

  7. Gastrointestinal events and association with initiation of treatment for osteoporosis

    Directory of Open Access Journals (Sweden)

    Modi A

    2015-11-01

    Ankita Modi,1 Ethel S Siris,2 Jackson Tang,3 Shiva Sajjan,1 Shuvayu S Sen1 1Center for Observational and Real-World Evidence, Merck & Co., Inc, Kenilworth, NJ, 2Toni Stabile Osteoporosis Center, Columbia University Medical Center, NY Presbyterian Hospital, New York, NY, 3Asclepius Analytics Ltd, Brooklyn, NY, USA Background: Preexisting gastrointestinal (GI) events may deter the use of pharmacologic treatment in patients diagnosed with osteoporosis (OP). The objective of this study was to examine the association between preexisting GI events and OP pharmacotherapy initiation among women diagnosed with OP. Methods: The study utilized claims data from a large US managed care database to identify women aged ≥55 years with a diagnosis code for OP (index date) during 2002–2009. Patients with a claim for pharmacologic OP treatment in the 12-month pre-index period (baseline) were excluded. OP treatment initiation in the post-index period was defined as a claim for bisphosphonates (alendronate, ibandronate, risedronate, zoledronic acid), calcitonin, raloxifene, or teriparatide. During the post-index period (up to 12 months), GI events were identified before treatment initiation. A time-dependent Cox regression model was used to investigate the likelihood of initiating any OP treatment. Among patients initiating OP treatment, a discrete choice model was utilized to assess the relationship between post-index GI events and likelihood of initiating with a bisphosphonate versus a non-bisphosphonate. Results: In total, 65,344 patients (mean age 66 years) were included; 23.7% had a GI event post diagnosis and before treatment initiation. Post-index GI events were associated with a 75% lower likelihood of any treatment initiation (hazard ratio 0.25; 95% confidence interval 0.24–0.26). Among treated patients (n=23,311), those with post-index GI events were 39% less likely to receive a bisphosphonate versus a non-bisphosphonate (odds ratio 0.61; 95% confidence

  8. The initiating events in the Loviisa nuclear power plant history

    International Nuclear Information System (INIS)

    Sjoblom, K.

    1987-01-01

    During the 16 reactor years of Loviisa nuclear power plant operation no serious incident has endangered the high level of safety. The initiating events of plant incidents have been analyzed in order to get a view of plant operational safety experience. The initiating events have been placed in categories similar to those that EPRI uses. However, because of the very small number of scrams the study was extended to also cover transients with a relatively low safety importance in order to get more comprehensive statistics. Human errors, which contributed to 15% of the transients, were a special subject in this study. The conditions under which human failures occurred, and the nature and root causes of the human failures that caused the initiating events were analyzed. For future analyses it was noticed that it would be beneficial to analyze incidents immediately, to consult with the persons directly involved and to develop an international standard format for incident analyses

  9. Analysis of the Steam Generator Tubes Rupture Initiating Event

    International Nuclear Information System (INIS)

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

    In PSA studies, Event Tree-Fault Tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the allowed time for operator actions, and does not play a central role in ET validation. The simulation of the sequence evolution can instead be performed by using standard tools, helping the analyst obtain a more realistic ET. Long-existing methods and tools can be used to automate the construction of the event tree associated with a given initiator. These methods automatically construct the ET by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then, the sequences with and without the failure are followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. Similarly to the static Event Tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter into the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)
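
    The branching rule described in this record (fork into success and failure every time a system is demanded, then follow both branches) can be sketched as below. The list of demands and their probabilities are placeholders, and the simulator is reduced to an ordered list; this is not the DYLAM code.

        # Toy dynamic-event-tree branching: at every system demand the sequence forks
        # into a success branch and a failure branch, and both are followed further.
        # Demands and probabilities are illustrative placeholders.

        demands = [("auxiliary feedwater", 0.98), ("high-pressure injection", 0.95),
                   ("long-term cooling", 0.97)]

        def branch(idx=0, prob=1.0, path=()):
            if idx == len(demands):
                outcome = "success" if all(ok for _, ok in path) else "degraded sequence"
                trace = " -> ".join(f"{name}:{'ok' if ok else 'fail'}" for name, ok in path)
                print(f"{trace} | p = {prob:.4f} | {outcome}")
                return
            name, p_ok = demands[idx]
            branch(idx + 1, prob * p_ok, path + ((name, True),))           # success branch
            branch(idx + 1, prob * (1.0 - p_ok), path + ((name, False),))  # failure branch

        branch()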

  10. Defining initiating events for purposes of probabilistic safety assessment

    International Nuclear Information System (INIS)

    1993-09-01

    This document is primarily directed towards technical staff involved in the performance or review of plant-specific Probabilistic Safety Assessment (PSA). It highlights different approaches and provides typical examples useful for defining the Initiating Events (IE). The document also includes the generic initiating event database, containing about 300 records taken from about 30 plant-specific PSAs. In addition to its usefulness during the actual performance of a PSA, the generic IE database is of the utmost importance for peer reviews of PSAs, such as the IAEA's International Peer Review Service (IPERS) where reference to studies on similar NPPs is needed. 60 refs, figs and tabs

  11. Adverse cardiac events in out-patients initiating clozapine treatment

    DEFF Research Database (Denmark)

    Rohde, C; Polcwiartek, C; Kragholm, K

    2018-01-01

    OBJECTIVE: Using national Danish registers, we estimated rates of clozapine-associated cardiac adverse events. Rates of undiagnosed myocarditis were estimated by exploring causes of death after clozapine initiation. METHOD: Through nationwide health registers, we identified all out-patients initiating clozapine treatment... The maximum rate of clozapine-associated fatal myocarditis was estimated to be 0.28%. CONCLUSION: Cardiac adverse effects in Danish out-patients initiating clozapine treatment are extremely rare and these rates appear to be comparable to those observed for other antipsychotic drugs.

  12. Initiating Event Rates at U.S. Nuclear Power Plants. 1988 - 2013

    International Nuclear Information System (INIS)

    Schroeder, John A.; Bower, Gordon R.

    2014-01-01

    Analyzing initiating event rates is important because it indicates performance among plants and also provides inputs to several U.S. Nuclear Regulatory Commission (NRC) risk-informed regulatory activities. This report presents an analysis of initiating event frequencies at U.S. commercial nuclear power plants since each plant's low-power license date. The evaluation is based on the operating experience from fiscal year 1988 through 2013 as reported in licensee event reports. Engineers with nuclear power plant experience reviewed each event report since the last update to this report for the presence of valid scrams or reactor trips at power. To be included in the study, an event had to meet all of the following criteria: it includes an unplanned reactor trip (not a scheduled reactor trip on the daily operations schedule), the sequence of events starts when the reactor is critical and at or above the point of adding heat, it occurs at a U.S. commercial nuclear power plant (excluding Fort St. Vrain and LaCrosse), and it is reported by a licensee event report. This report displays occurrence rates (baseline frequencies) for the categories of initiating events that contribute to the NRC's Industry Trends Program. Sixteen initiating event groupings are trended and displayed. Initiators are plotted separately for boiling water reactors and pressurized water reactors where the occurrence rates differ. p-values are given for the possible presence of a trend over the most recent 10 years.
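
    A minimal version of the trend test mentioned at the end of this record is sketched below as a Poisson regression of yearly event counts with reactor-year exposure as an offset. The counts and exposures are made-up placeholders, not data from the report.

        # Poisson-regression trend test for an initiating-event occurrence rate over
        # ten years (events per reactor-year of exposure). Placeholder data only.
        import numpy as np
        import statsmodels.api as sm

        years = np.arange(2004, 2014)
        event_counts = np.array([14, 12, 11, 13, 9, 10, 8, 9, 7, 8])  # events per year
        reactor_years = np.full(10, 90.0)                             # exposure per year

        X = sm.add_constant(years - years[0])                         # intercept + trend term
        fit = sm.GLM(event_counts, X, family=sm.families.Poisson(),
                     offset=np.log(reactor_years)).fit()
        print(fit.params)    # slope is the yearly change on a log-rate scale
        print(fit.pvalues)   # small p-value for the slope indicates a trend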

  13. Multi-Unit Initiating Event Analysis for a Single-Unit Internal Events Level 1 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong San; Park, Jin Hee; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Fukushima nuclear accident in 2011 highlighted the importance of considering the risks from multi-unit accidents at a site. The ASME/ANS probabilistic risk assessment (PRA) standard also includes some requirements related to multi-unit aspects, one of which (IE-B5) is as follows: 'For multi-unit sites with shared systems, DO NOT SUBSUME multi-unit initiating events if they impact mitigation capability [1].' However, the existing single-unit PSA models do not explicitly consider multi-unit initiating events, and hence systems shared by multiple units (e.g., the alternate AC diesel generator) are fully credited for the single unit, ignoring the need for the shared systems by other units at the same site [2]. This paper describes the results of the multi-unit initiating event (IE) analysis performed as a part of the at-power internal events Level 1 probabilistic safety assessment (PSA) for an OPR1000 single unit ('reference unit'). In this study, a multi-unit initiating event analysis for a single-unit PSA was performed, and using the results, a dual-unit LOOP initiating event was added to the existing PSA model for the reference unit (OPR1000 type). Event trees were developed for dual-unit LOOP and dual-unit SBO, which can be transferred from dual-unit LOOP. Moreover, CCF basic events for 5 diesel generators were modelled. In case of simultaneous SBO occurrences in both units, this study compared two different assumptions on the availability of the AAC D/G. As a result, when the dual-unit LOOP initiating event was added to the existing single-unit PSA model, the total CDF increased by 1∼2%, depending on the probability that the AAC D/G is available to a specific unit in case of simultaneous SBO in both units.

  14. Initiating Events for Multi-Reactor Plant Sites

    Energy Technology Data Exchange (ETDEWEB)

    Muhlheim, Michael David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Flanagan, George F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Poore, III, Willis P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    Inherent in the design of modular reactors is the increased likelihood of events that initiate at a single reactor affecting another reactor. Because of the increased level of interactions between reactors, it is apparent that the Probabilistic Risk Assessments (PRAs) for modular reactor designs need to specifically address the increased interactions and dependencies.

  15. Feature extraction and sensor selection for NPP initiating event identification

    International Nuclear Information System (INIS)

    Lin, Ting-Han; Wu, Shun-Chi; Chen, Kuang-You; Chou, Hwai-Pwu

    2017-01-01

    Highlights: • A two-stage feature extraction scheme for NPP initiating event identification. • With stBP, interrelations among the sensors can be retained for identification. • With dSFS, sensors that are crucial for identification can be efficiently selected. • Efficacy of the scheme is illustrated with data from the Maanshan NPP simulator. - Abstract: Initiating event identification is essential in managing nuclear power plant (NPP) severe accidents. In this paper, a novel two-stage feature extraction scheme that incorporates the proposed sensor type-wise block projection (stBP) and deflatable sequential forward selection (dSFS) is used to elicit the discriminant information in the data obtained from various NPP sensors to facilitate event identification. With the stBP, the primal features can be extracted without eliminating the interrelations among the sensors of the same type. The extracted features are then subjected to a further dimensionality reduction by selecting the sensors that are most relevant to the events under consideration. This selection is not easy, and a combinatorial optimization technique is normally required. With the dSFS, an optimal sensor set can be found with less computational load. Moreover, its sensor deflation stage allows sensors in the preselected set to be iteratively refined to avoid being trapped in a local optimum. Results from detailed experiments containing data of 12 event categories and a total of 112 events generated with a simulator of Taiwan’s Maanshan NPP are presented to illustrate the efficacy of the proposed scheme.
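
    As a point of reference for the selection stage, the sketch below runs plain sequential forward selection (the baseline that the proposed deflatable variant, dSFS, refines) with a k-NN classifier and cross-validation as the criterion. The synthetic data stand in for NPP sensor features; this is not the authors' implementation.

        # Plain sequential forward selection (SFS) of features/sensors using a k-NN
        # classifier and 5-fold cross-validation. Synthetic data only.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        X, y = make_classification(n_samples=200, n_features=12, n_informative=5,
                                   random_state=0)

        clf = KNeighborsClassifier(n_neighbors=5)
        selected, remaining = [], list(range(X.shape[1]))

        for _ in range(5):  # greedily pick 5 features
            scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=5).mean()
                      for f in remaining}
            best = max(scores, key=scores.get)
            selected.append(best)
            remaining.remove(best)
            print(f"added feature {best}, CV accuracy {scores[best]:.3f}")

        print("selected feature indices:", selected)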

  16. Sensitivity of a Simulated Derecho Event to Model Initial Conditions

    Science.gov (United States)

    Wang, Wei

    2014-05-01

    Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to the model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for twice-a-day forecasts at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we will examine forecast sensitivity to different model initial conditions, and try to understand the important features that may contribute to the success of the forecast.

  17. Identification and selection of initiating events for experimental fusion facilities

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1989-01-01

    This paper describes the current approaches used in probabilistic risk assessment (PRA) to identify and select accident initiating events for study in either probabilistic safety analysis or PRA. Current methods directly apply to fusion facilities as well as other types of industries, such as chemical processing and nuclear fission. These identification and selection methods include the Master Logic Diagram, historical document review, system level Failure Modes and Effects Analysis, and others. A combination of the historical document review, such as Safety Analysis Reports and fusion safety studies, and the Master Logic Diagram with appropriate quality assurance reviews, is suggested for standardizing US fusion PRA efforts. A preliminary set of generalized initiating events applicable to fusion facilities derived from safety document review is presented as a framework to start from for the historical document review and Master Logic Diagram approach. Fusion designers should find this list useful for their design reviews. 29 refs., 2 tabs

  18. Identification and selection of initiating events for experimental fusion facilities

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1989-01-01

    This paper describes the current approaches used in probabilistic risk assessment (PRA) to identify and select accident initiating events for study in either probabilistic safety analysis or PRA. Current methods directly apply to fusion facilities as well as other types of industries, such as chemical processing and nuclear fission. These identification and selection methods include the Master Logic Diagram, historical document review, system level Failure Modes and Effects Analysis, and others. A combination of the historical document review, such as Safety Analysis Reports and fusion safety studies, and the Master Logic Diagram with appropriate quality assurance reviews, is suggested for standardizing U.S. fusion PRA efforts. A preliminary set of generalized initiating events applicable to fusion facilities derived from safety document review is presented as a framework to start from for the historical document review and Master Logic Diagram approach. Fusion designers should find this list useful for their design reviews. 29 refs., 1 tab

  19. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event-based network monitoring tool and the effects on host performance. Current host-based network monitoring tools work on polling, which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and, second, Windows 7 offers the ALE (Application Layer Enforcement) API within the Windows Filtering Platform (WFP). Any future work should focus on these methods.

  20. Multivariate algorithms for initiating event detection and identification in nuclear power plants

    International Nuclear Information System (INIS)

    Wu, Shun-Chi; Chen, Kuang-You; Lin, Ting-Han; Chou, Hwai-Pwu

    2018-01-01

    Highlights: • Multivariate algorithms for NPP initiating event detection and identification. • Recordings from multiple sensors are simultaneously considered for detection. • Both spatial and temporal information is used for event identification. • Untrained event isolation avoids falsely relating an untrained event. • Efficacy of the algorithms is verified with data from the Maanshan NPP simulator. -- Abstract: To prevent escalation of an initiating event into a severe accident, promptly detecting its occurrence and precisely identifying its type are essential. In this study, several multivariate algorithms for initiating event detection and identification are proposed to help maintain safe operations of nuclear power plants (NPPs). By monitoring changes in the NPP sensing variables, an event is detected when the preset thresholds are exceeded. Unlike existing approaches, recordings from sensors of the same type are simultaneously considered for detection, and no subjective reasoning is involved in setting these thresholds. To facilitate efficient event identification, a spatiotemporal feature extractor is proposed. The extracted features consist of the temporal traits used by existing techniques and the spatial signature of an event. Through an F-score-based feature ranking, only those that are most discriminant in classifying the events under consideration will be retained for identification. Moreover, an untrained event isolation scheme is introduced to avoid relating an untrained event to those in the event dataset so that improper recovery actions can be prevented. Results from experiments containing data of 12 event classes and a total of 125 events generated using a simulator of Taiwan’s Maanshan NPP are provided to illustrate the efficacy of the proposed algorithms.

  1. Viability of Event Management Business in Batangas City, Philippine: Basis for Business Operation Initiatives

    Directory of Open Access Journals (Sweden)

    Jeninah Christia D. Borbon

    2016-11-01

    The research study on Viability of Event Management Business in Batangas City: Basis for Business Operation Initiatives aimed to assess the viability of this type of business using Thompson’s (2005) Dimension of Business Viability as its tool in order to create business operation initiatives. It provided a good framework for defining success factors in entrepreneurial operation initiatives in a specific business type – event management. This study utilized event organizers based in Batangas, a popular southern province that is also a popular destination for many types of events. Findings showed that the event management business in Batangas City is generally a personal-event type of business whose years of operation range from one to three, mostly linked to church or reception venues, and usually offering on-the-day coordination. In the assessment of its perceived viability, it was found that this type of business is moderately viable in terms of market, technical, business model, management model, economic and financial, and exit strategy. Among all the dimensions tested, only market, management model, economic and financial, and exit strategy showed a significant relationship with the profile variables of the event management business. From the enumerated problems encountered, those rated highest were demanding clients, overbooking of reservations/exceeding the number of guests, and failure to meet spectators’ and/or competitors’ expectations. The recommended business operation initiatives were based on the weaknesses discovered using Thompson’s Dimension of Business Viability Model.

  2. Development of transient initiating event frequencies for use in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approx. 2400 events being added to EPRI's approx. 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.

  3. Development of transient initiating event frequencies for use in probabilistic risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approx.2400 events being added to EPRI's approx.3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.

  4. Analysis of early initiating event(s) in radiation-induced thymic lymphomagenesis

    International Nuclear Information System (INIS)

    Muto, Masahiro; Ying Chen; Kubo, Eiko; Mita, Kazuei

    1996-01-01

    Since the T cell receptor rearrangement is a sequential process and unique to the progeny of each clone, we investigated the early initiating events in radiation-induced thymic lymphomagenesis by comparing the oncogenic alterations with the pattern of γ T cell receptor (TCR) rearrangements. We reported previously that after leukemogenic irradiation, preneoplastic cells developed, albeit infrequently, from thymic leukemia antigen-2+ (TL-2+) thymocytes. Limited numbers of TL-2+ cells from individual irradiated B10.Thy-1.1 mice were injected into B10.Thy-1.2 mice intrathymically, and the common genetic changes among the donor-type T cell lymphomas were investigated with regard to p53 gene and chromosome aberrations. The results indicated that some mutations in the p53 gene had taken place in these lymphomas, but there was no common mutation among the donor-type lymphomas from individual irradiated mice, suggesting that these mutations were late-occurring events in the process of oncogenesis. On the other hand, there were common chromosome aberrations or translocations such as trisomy 15, t(7F; 10C), t(1A; 13D) or t(6A; XB) among the donor-type lymphomas derived from half of the individual irradiated mice. This indicated that the aberrations/translocations, which occurred in single progenitor cells at the early T cell differentiation either just before or after γ T cell receptor rearrangements, might be important candidates for initiating events. In the donor-type lymphomas from the other half of the individual irradiated mice, microgenetic changes were suggested to be initial events and also might take place in single progenitor cells just before or right after γ TCR rearrangements. (author)

  5. Signaling events during initiation of arbuscular mycorrhizal symbiosis.

    Science.gov (United States)

    Schmitz, Alexa M; Harrison, Maria J

    2014-03-01

    Under nutrient-limiting conditions, plants will enter into symbiosis with arbuscular mycorrhizal (AM) fungi for the enhancement of mineral nutrient acquisition from the surrounding soil. AM fungi live in close, intracellular association with plant roots where they transfer phosphate and nitrogen to the plant in exchange for carbon. They are obligate fungi, relying on their host as their only carbon source. Much has been discovered in the last decade concerning the signaling events during initiation of the AM symbiosis, including the identification of signaling molecules generated by both partners. This signaling occurs through symbiosis-specific gene products in the host plant, which are indispensable for normal AM development. At the same time, plants have adapted complex mechanisms for avoiding infection by pathogenic fungi, including an innate immune response to general microbial molecules, such as chitin present in fungal cell walls. How it is that AM fungal colonization is maintained without eliciting a defensive response from the host is still uncertain. In this review, we present a summary of the molecular signals and their elicited responses during initiation of the AM symbiosis, including plant immune responses and their suppression. © 2014 Institute of Botany, Chinese Academy of Sciences.

  6. Factors controlling the initiation of Snowball Earth events

    Science.gov (United States)

    Voigt, A.

    2012-12-01

    During the Neoproterozoic glaciations tropical continents were covered by active glaciers that extended down to sea level. To explain these glaciers, the Snowball Earth hypothesis assumes that oceans were completely sea-ice covered during these glaciation, but there is an ongoing debate whether or not some regions of the tropical oceans remained open. In this talk, I will describe past and ongoing climate modelling activities with the comprehensive coupled climate model ECHAM5/MPI-OM that identify and compare factors that control the initiation of Snowball Earth events. I first show that shifting the continents from their present-day location to their Marinoan (635 My BP) low-latitude location increases the planetary albedo, cools the climate, and thereby allows Snowball Earth initiation at higher levels of total solar irradiance and atmospheric CO2. I then present simulations with successively lowered bare sea-ice albedo, disabled sea-ice dynamics, and switched-off ocean heat transport. These simulations show that both lowering the bare sea-ice albedo and disabling sea-ice dynamics increase the critical sea-ice cover in ECHAM5/MPI-OM, but sea-ice dynamics due to strong equatorward sea-ice transport have a much larger influence on the critical CO2. Disabling sea-ice transport allows a state with sea-ice margin at 10 deg latitude by virtue of the Jormungand mechanism. The accumulation of snow on land, in combination with tropical land temperatures below or close to freezing, suggests that tropical land glaciers could easily form in such a state. However, in contrast to aquaplanet simulations without ocean heat transport, there is no sign of a Jormungand hysteresis in the coupled simulations. Ocean heat transport is not responsible for the lack of a Jormungand hysteresis in the coupled simulations. By relating the above findings to previous studies, I will outline promising future avenues of research on the initiation of Snowball Earth events. In particular, an

  7. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

    The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means designed for providing prompt, clear and consistent information related to nuclear events that occur at nuclear facilities, and for facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to the INES with the 'Scale', a consistent safety significance indicator, which runs from level 0, for events with no safety significance, to level 7 for a major accident with widespread health and environmental effects. Since the operation of INES was initiated in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and characteristics of safety significant events with level 2 or higher, based on the INES reports. (author)

  8. Developing Public Health Initiatives through Understanding Motivations of the Audience at Mass-Gathering Events.

    Science.gov (United States)

    Hutton, Alison; Ranse, Jamie; Munn, Matthew Brendan

    2018-04-01

    This report identifies what is known about audience motivations at three different mass-gathering events: outdoor music festivals, religious events, and sporting events. In light of these motivations, the paper discusses how these can be harnessed by the event organizer and Emergency Medical Services. Lastly, motivations tell what kinds of interventions can be used to achieve an understanding of audience characteristics and the opportunity to develop tailor-made programs to maximize safety and make long-lasting public health interventions to a particular "cohort" or event population. A lot of these will depend on what the risks/hazards are with the particular populations in order to "target" them with public health interventions. Audience motivations tell the event organizer and Emergency Medical Services about the types of behaviors they should expect from the audience and how this may affect their health while at the event. Through these understandings, health promotion and event safety messages can be developed for a particular type of mass-gathering event based on the likely composition of the audience in attendance. Health promotion and providing public information should be at the core of any mass-gathering event to minimize public health risk and to provide opportunities for the promotion of healthy behaviors in the local population. Audience motivations are a key element to identify and agree on what public health information is needed for the event audience. A more developed understanding of audience behavior provides critical information for event planners, event risk managers, and Emergency Medical Services personnel to better predict and plan to minimize risk and reduce patient presentations at events. Mass-gathering event organizers and designers intend their events to be positive experiences and to have meaning for those who attend. Therefore, continual vigilance to improve public health effectiveness and efficiency can become best practice at events

  9. Identification of Common Cause Initiating Events Using the NEA IRS Database. Rev 0

    International Nuclear Information System (INIS)

    Kulig, Maciej; Tomic, Bojan; Nyman, Ralph

    2007-02-01

    The study presented in this report is a continuation of work conducted for SKI in 1998 on the identification of Common Cause Initiators (CCIs) based on operational events documented in the NEA Incident Reporting System (IRS). Based on the new operational experience accumulated in IRS in the period 1995-2006, the project focused on the identification of new CCI events. An attempt was also made to compare the observations made in the earlier study with the results of the current work. The earlier study and the current project cover the events reported in the IRS database with the incident date in the period from 01.01.1980 to 15.11.2006. The review of the NEA IRS database conducted within this project generated a sample of events that provides insights regarding the Common Cause Initiators (CCIs). This list includes certain number of 'real' CCIs but also potential CCIs and other events that provide insights on potential dependency mechanisms. Relevant characteristics of the events were analysed in the context of CCIs. This evaluation was intended to investigate the importance of the CCI issue and also to provide technical insights that could help in the modelling the CCIs in PSAs. The analysis of operational events provided useful engineering insights regarding the potential dependencies that may originate CCIs. Some indications were also obtained on the plant SSCs/areas that are susceptible to common cause failures. Direct interrelations between the accident mitigation systems through common support systems, which can originate a CCI, represent a dominant dependency mechanism involved in the CCI events. The most important contributors of this type are electrical power supply systems and I-and-C systems. Area-related events (fire, flood, water spray), external hazards (lightning, high wind or cold weather) and transients (water hammer, electrical transients both internal and external) have also been found to be important sources of dependency that may originate CCIs

  10. Identification of Common Cause Initiating Events Using the NEA IRS Database. Rev 0

    Energy Technology Data Exchange (ETDEWEB)

    Kulig, Maciej; Tomic, Bojan (Enconet Consulting, Vienna (Austria)); Nyman, Ralph (Swedish Nuclear Power Inspectorate, Stockholm (Sweden))

    2007-02-15

    The study presented in this report is a continuation of work conducted for SKI in 1998 on the identification of Common Cause Initiators (CCIs) based on operational events documented in the NEA Incident Reporting System (IRS). Based on the new operational experience accumulated in IRS in the period 1995-2006, the project focused on the identification of new CCI events. An attempt was also made to compare the observations made in the earlier study with the results of the current work. The earlier study and the current project cover the events reported in the IRS database with the incident date in the period from 01.01.1980 to 15.11.2006. The review of the NEA IRS database conducted within this project generated a sample of events that provides insights regarding the Common Cause Initiators (CCIs). This list includes certain number of 'real' CCIs but also potential CCIs and other events that provide insights on potential dependency mechanisms. Relevant characteristics of the events were analysed in the context of CCIs. This evaluation was intended to investigate the importance of the CCI issue and also to provide technical insights that could help in the modelling the CCIs in PSAs. The analysis of operational events provided useful engineering insights regarding the potential dependencies that may originate CCIs. Some indications were also obtained on the plant SSCs/areas that are susceptible to common cause failures. Direct interrelations between the accident mitigation systems through common support systems, which can originate a CCI, represent a dominant dependency mechanism involved in the CCI events. The most important contributors of this type are electrical power supply systems and I-and-C systems. Area-related events (fire, flood, water spray), external hazards (lightning, high wind or cold weather) and transients (water hammer, electrical transients both internal and external) have also been found to be important sources of dependency that may

  11. Study of Updating Initiating Event Frequency using Prognostics

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung

    2014-01-01

    The Probabilistic Safety Assessment (PSA) model enables finding the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. Thus, PSA requires realistic calculation for precise and confident results. However, the PSA model still has 'conservative' aspects in the procedures of developing a PSA model. One of the sources of this conservatism is the assumptions of the safety analysis and the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in space and aviation systems in particular. Furthermore, a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, utilizing massive databases and information technology among such enabling techniques, is worth highlighting for its capability of alleviating the conservatism in the conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated methodology for initiating events (IEs) using prognostics. In more detail, we focus on the IE of Steam Generator Tube Rupture (SGTR) considering tube degradation. This paper is a continuation of our previously suggested research. In this paper, the concept of integrating PSA and SDP is suggested. Prognostics algorithms in SDP are applied to IEs and BEs (basic events) in the Level 1 PSA. As an example, updating the SGTR IE and its ageing were considered. Tube ageing was analyzed by using PASTA and the Monte Carlo method. After analyzing the tube ageing, the conventional SGTR IE was updated by using a Bayesian approach. The studied method can help to address the static nature and conservatism in PSA.
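
    A minimal sketch of the conjugate gamma-Poisson Bayesian update referred to above; the prior parameters and the plant evidence are hypothetical placeholders, and the degradation-informed input from PASTA and the Monte Carlo analysis is not represented here.

```python
def gamma_poisson_update(alpha0, beta0, n_events, exposure_yr):
    """Conjugate Bayesian update of an initiating-event frequency.

    Prior:      Gamma(alpha0, beta0)       [beta0 in reactor-years]
    Evidence:   n_events in exposure_yr reactor-years (Poisson likelihood)
    Posterior:  Gamma(alpha0 + n_events, beta0 + exposure_yr)
    """
    alpha1 = alpha0 + n_events
    beta1 = beta0 + exposure_yr
    posterior_mean = alpha1 / beta1
    return alpha1, beta1, posterior_mean

# Hypothetical numbers only: a generic SGTR prior updated with plant evidence.
a, b, mean = gamma_poisson_update(alpha0=0.5, beta0=100.0, n_events=0, exposure_yr=20.0)
print(f"posterior mean SGTR frequency ~ {mean:.2e} /yr")
```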

  12. Study of Updating Initiating Event Frequency using Prognostics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of)

    2014-10-15

    The Probabilistic Safety Assessment (PSA) model enables finding the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. Thus, PSA requires realistic calculation for precise and confident results. However, the PSA model still has 'conservative' aspects in the procedures of developing a PSA model. One of the sources of this conservatism is the assumptions of the safety analysis and the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in space and aviation systems in particular. Furthermore, a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, utilizing massive databases and information technology among such enabling techniques, is worth highlighting for its capability of alleviating the conservatism in the conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated methodology for initiating events (IEs) using prognostics. In more detail, we focus on the IE of Steam Generator Tube Rupture (SGTR) considering tube degradation. This paper is a continuation of our previously suggested research. In this paper, the concept of integrating PSA and SDP is suggested. Prognostics algorithms in SDP are applied to IEs and BEs (basic events) in the Level 1 PSA. As an example, updating the SGTR IE and its ageing were considered. Tube ageing was analyzed by using PASTA and the Monte Carlo method. After analyzing the tube ageing, the conventional SGTR IE was updated by using a Bayesian approach. The studied method can help to address the static nature and conservatism in PSA.

  13. Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs

    Science.gov (United States)

    Purba, J. H.

    2018-02-01

    Initiating events of a nuclear power plant being evaluated need to be firstly identified prior to applying probabilistic safety assessment on that plant. Various types of master logic diagrams (MLDs) have been proposed for searching initiating events of the next generation of nuclear power plants, which have limited data and operating experiences. Those MLDs are different in the number of steps or levels and different in the basis for developing them. This study proposed another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps starting from the top event representing the final objective of the safety functions to the basic event representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirmed that the proposed MLD is feasible for finding HTGR initiating events.
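
    The deductive decomposition from a top event down to leaf-level initiating events can be pictured as a simple tree walk; the sketch below uses a plain dictionary with hypothetical HTGR node names and is not the five-level structure proposed in the paper.

```python
# Hypothetical MLD: each node maps to the lower-level functions it decomposes into;
# leaves (no children) are treated as candidate initiating events.
mld = {
    "excessive offsite release": ["loss of heat removal", "loss of heat generation control"],
    "loss of heat removal": ["power turbine generator trip", "loss of offsite power"],
    "loss of heat generation control": ["control rod withdrawal"],
    "power turbine generator trip": [],
    "loss of offsite power": [],
    "control rod withdrawal": [],
}

def initiating_events(node, diagram):
    """Depth-first walk from the top event down to the leaf-level initiating events."""
    children = diagram.get(node, [])
    if not children:
        return [node]
    events = []
    for child in children:
        events.extend(initiating_events(child, diagram))
    return events

print(initiating_events("excessive offsite release", mld))
```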

  14. Initiating events in the safety probabilistic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Stasiulevicius, R.

    1989-01-01

    The importance of initiating events in the probabilistic safety analysis of nuclear power plants is discussed, and the basic procedures necessary for preparing reports, quantification and grouping of the events are described. Examples of initiating events with their mean frequencies of occurrence, including those calculated for the Oconee and Angra-1 reactors, are presented. (E.G.)

  15. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample...... the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine...... problems on accurate air/fuel ratio control of a spark ignition (SI) engine....
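
    As a rough illustration of event-based sampling, the sketch below converts an engine speed into the time between successive 180° crank-angle events at which an event-based controller would sample; the speed value is hypothetical and this is not the controller discussed in the paper.

```python
def event_period_s(rpm: float, event_deg: float = 180.0) -> float:
    """Time between successive crank-angle events (e.g. 180 deg per engine event
    for a four-cycle engine)."""
    deg_per_s = rpm * 360.0 / 60.0
    return event_deg / deg_per_s

# At a hypothetical 3000 rpm, an event-based controller samples every ~10 ms:
print(f"{event_period_s(3000.0) * 1e3:.1f} ms between events")
```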

  16. A study on the determination of threshold values for the initiating event performance indicators of domestic nuclear power plants

    International Nuclear Information System (INIS)

    Kang, D. I.; Park, J. H.; Kim, K. Y.; Whang, M. J.; Yang, J. E.; Sung, G. Y.

    2003-01-01

    In this paper, we determine the threshold values of unplanned reactor scram, a domestic initiating event performance indicator, using domestic unplanned reactor scram data and the probabilistic safety assessment model of the Korea Standard Nuclear Power Plant (KSNP). We also perform a pilot study of the initiating event Risk Based Performance Indicator (RBPI) for the KSNP. Study results for unplanned reactor scram show that the threshold value between green and blue is 3, that between blue and yellow is 6, and that between yellow and orange is 30. Pilot study results show that loss of feedwater, transient, and loss of component cooling water events are selected as initiating event RBPIs for the KSNP.
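
    The green/blue/yellow/orange thresholds quoted above (3, 6 and 30 unplanned scrams) can be expressed as a simple banding function; this is a minimal sketch of one plausible reading of the band boundaries, not the KSNP indicator logic itself.

```python
def scram_indicator_color(unplanned_scrams: int) -> str:
    """Map an unplanned-reactor-scram count to a performance-indicator color band,
    using the threshold values quoted in the abstract (3, 6, 30).
    Boundary handling (inclusive vs exclusive) is an assumption."""
    if unplanned_scrams < 3:
        return "green"
    if unplanned_scrams < 6:
        return "blue"
    if unplanned_scrams < 30:
        return "yellow"
    return "orange"

print([scram_indicator_color(n) for n in (1, 4, 10, 35)])
```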

  17. Initiating events and accidental sequences taken into account in the CAREM reactor design

    International Nuclear Information System (INIS)

    Kay, J.M.; Felizia, E.R.; Navarro, N.R.; Caruso, G.J.

    1990-01-01

    The progress made in the nuclear safety evaluation of the CAREM reactor is presented. It was carried out using Probabilistic Safety Analysis (PSA). The latter takes into account the different phases of identification and resolution of initiating events and the qualitative development of event trees. The method used to identify initiating events is the Master Logic Diagram (MLD), whose deductive basis makes it appropriate for a new design like the one described. The qualitative development of the event trees associated with the identified initiating events allows identification of the accidental sequences that must be covered by the safety systems of the reactor. (Author) [es

  18. Methodology for Selecting Initiating Events and Hazards for Consideration in an Extended PSA

    International Nuclear Information System (INIS)

    Wielenberg, A.; Hage, M.; Loeffler, H.; Alzbutas, R.; Apostol, M.; Bareith, A.; Siklossy, T.; Brac, P.; Burgazzi, L.; Cazzoli, E.; Vitazkova, J.; Cizelj, L.; Prosek, A.; Volkanovski, A.; Hashimoto, K.; Godefroy, F.; Gonzalez, M.; Groudev, P.; Kolar, L.; Kumar, M.; Nitoi, M.; Raimond, E.

    2016-01-01

    An extended PSA applies to a site of one or several Nuclear Power Plant unit(s) and its environment. It intends to calculate the risk induced by the main sources of radioactivity (reactor core and spent fuel storages) on the site, taking into account all operating states for each main source and all possible relevant accident initiating events (both internal and external) affecting one unit or the whole site. The combination between hazards or initiating events and their impact on a unit or the whole site is a crucial issue for an extended PSA. The report tries to discuss relevant methodologies for this purpose. The report proposes a methodology to select initiating events and hazards for the development of an extended PSA. The proposed methodology for initiating events identification, screening and bounding analysis for an extended PSA consists of four major steps: 1. A comprehensive identification of events and hazards and their respective combinations applicable to the plant and site. Qualitative screening criteria will be applied, 2. The calculation of initial (possibly conservative) frequency claims for events and hazards and their respective combinations applicable to the plant and the site. Quantitative screening criteria will be applied, 3. An impact analysis and bounding assessment for all applicable events and scenarios. Events are either screened out from further more detailed analysis, or are assigned to a bounding event (group), or are retained for detailed analysis, 4. The probabilistic analysis of all retained (bounding) events at the appropriate level of detail. (authors)
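
    Steps 2 and 3 of the proposed methodology amount to comparing an initial, possibly conservative, frequency claim against a quantitative screening criterion and then either screening the event out, assigning it to a bounding group, or retaining it for detailed analysis. Below is a minimal sketch of that decision with hypothetical events and a hypothetical screening frequency; it is not the report's criteria.

```python
SCREENING_FREQ = 1.0e-7   # hypothetical quantitative screening criterion, per year

# Hypothetical (event, claimed frequency per year, bounding group or None) tuples
candidates = [
    ("loss of offsite power + seismic", 2.0e-5, None),
    ("aircraft crash on spent fuel storage", 5.0e-8, None),
    ("external flooding", 3.0e-6, "site flooding (bounding)"),
]

def screen(freq, bounding_group):
    """Classify an initiating event / hazard combination for an extended PSA."""
    if freq < SCREENING_FREQ:
        return "screened out"
    if bounding_group is not None:
        return f"bounded by: {bounding_group}"
    return "retained for detailed analysis"

for event, freq, group in candidates:
    print(f"{event:45s} {screen(freq, group)}")
```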

  19. Development and verification of an efficient spatial neutron kinetics method for reactivity-initiated event analyses

    International Nuclear Information System (INIS)

    Ikeda, Hideaki; Takeda, Toshikazu

    2001-01-01

    A space/time nodal diffusion code based on the nodal expansion method (NEM), EPISODE, was developed in order to evaluate transient neutron behavior in light water reactor cores. The present code employs the improved quasistatic (IQS) method for spatial neutron kinetics, and neutron flux distribution is numerically obtained by solving the neutron diffusion equation with the nonlinear iteration scheme to achieve fast computation. A predictor-corrector (PC) method developed in the present study enabled to apply a coarse time mesh to the transient spatial neutron calculation than that applicable in the conventional IQS model, which improved computational efficiency further. Its computational advantage was demonstrated by applying to the numerical benchmark problems that simulate reactivity-initiated events, showing reduction of computational times up to a factor of three than the conventional IQS. The thermohydraulics model was also incorporated in EPISODE, and the capability of realistic reactivity event analyses was verified using the SPERT-III/E-Core experimental data. (author)

  20. Blast experiments for the derivation of initial cloud dimensions after a "Dirty Bomb" event

    International Nuclear Information System (INIS)

    Thielen, H.; Schroedl, E.

    2004-01-01

    The basis for the assessment of potential consequences of a "dirty bomb" event is the calculation of the atmospheric dispersion of airborne particles. The empirical derivation of parameters for estimating the initial pollutant cloud dimensions was the principal purpose of blast experiments performed in the Munster training area in the summer of 2003 with the participation of several highly engaged German organisations and institutions. The experiments were performed under variation of parameters such as the mass and kind of explosive, subsurface characteristics and meteorological conditions, and were documented by digital video recording. The blast experiments supplied significant results under reproducible conditions. The initial cloud dimension was primarily influenced by the explosive mass. The influence of other parameters was relatively small and within the range of the experimental uncertainties. Based on these experimental results, a new correlation was determined for the empirical estimation of the initial cloud dimensions as a function of explosive mass. The observed initial cloud volumes were more than an order of magnitude smaller than those calculated with other widely used formulas (e.g. HOTSPOT). As a smaller initial cloud volume leads to higher near-ground concentration maxima, our results support an appropriate adjustment of currently employed calculation methods. (orig.)
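
    The abstract describes a correlation giving initial cloud dimensions as a function of explosive mass but does not quote it; the power-law form, coefficient and exponent below are purely illustrative placeholders, not the fitted values from the Munster experiments.

```python
def initial_cloud_radius_m(explosive_mass_kg: float,
                           k: float = 5.0, exponent: float = 1.0 / 3.0) -> float:
    """Illustrative power-law R = k * m**exponent for an initial cloud radius.

    k and exponent are hypothetical placeholders; the report derives its own
    correlation from the blast experiments, which is not reproduced here.
    """
    return k * explosive_mass_kg ** exponent

for m in (1.0, 5.0, 20.0):
    print(f"{m:5.1f} kg -> ~{initial_cloud_radius_m(m):.1f} m radius")
```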

  1. A Study on the Frequency of Initiating Event of OPR-1000 during Outage Periods

    Energy Technology Data Exchange (ETDEWEB)

    Hong Jae Beol; Jae, Moo Sung [Hanyang Univ., Seoul (Korea, Republic of)

    2013-10-15

    These data sources did not reflect the latest event data that have occurred during PWR outages in the frequencies of initiating events. The Electric Power Research Institute (EPRI) in the USA collected data on loss of decay heat removal during outage from 1989 to 2009 and published a technical report. Domestic operating experience for LOOP is gathered in the Operational Performance Information System for Nuclear Power Plant (OPIS). To reduce conservatism and obtain completeness for LPSD PSA, those data should be collected and used to update the frequencies. The frequencies of LOSDC and LOOP are reevaluated using the EPRI and OPIS data in this paper. Quantification is conducted to recalculate the core damage frequency (CDF), since the rates have changed. The results are discussed below. To make an accurate estimate of the initiating events of the LPSD PSA, the event data were collected and the frequencies of initiating events were updated using a Bayesian approach. The CDF was evaluated through quantification. ΔCDF is -40% and the dominant contributor is the pressurizer PSV stuck-open event. Most of the event data in the EPRI TR were collected from the US nuclear power plant industry. Those data are not enough to evaluate outage risk precisely. Therefore, to reduce conservatism and obtain completeness for LPSD PSA, licensee event reports and domestic data should be collected and reflected in the frequencies of the initiating events during outage.
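
    The quantification step can be pictured coarsely as multiplying each updated initiating-event frequency by a conditional core damage probability and comparing the summed CDF against the previous value; all numbers below are hypothetical, and the actual quantification is performed on the full LPSD PSA model, not with this shortcut.

```python
def cdf(frequencies, ccdp):
    """Very coarse CDF estimate: sum of IE frequency times conditional core damage probability."""
    return sum(frequencies[ie] * ccdp[ie] for ie in frequencies)

# Hypothetical per-year frequencies before/after a Bayesian update, and hypothetical CCDPs
old_freq = {"LOSDC": 2.0e-1, "LOOP": 5.0e-2, "PSV stuck open": 1.0e-2}
new_freq = {"LOSDC": 1.2e-1, "LOOP": 4.0e-2, "PSV stuck open": 3.0e-3}
ccdp     = {"LOSDC": 1.0e-5, "LOOP": 2.0e-5, "PSV stuck open": 5.0e-4}

old_cdf, new_cdf = cdf(old_freq, ccdp), cdf(new_freq, ccdp)
print(f"delta CDF = {100.0 * (new_cdf - old_cdf) / old_cdf:+.0f}%")
```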

  2. Containment performance evaluation for the GESSAR-II plant for seismic initiating events

    International Nuclear Information System (INIS)

    Shiu, K.K.; Chu, T.; Ludewig, H.; Pratt, W.T.

    1986-01-01

    As a part of the overall effort undertaken by Brookhaven National Laboratory (BNL) to review the GESSAR-II probabilistic risk assessment, an independent containment performance evaluation was performed using the containment event tree approach. This evaluation focused principally on those accident sequences which are initiated by seismic events. This paper reports the findings of this study. 1 ref

  3. Defining molecular initiating events in the adverse outcome pathway framework for risk assessment.

    Science.gov (United States)

    Allen, Timothy E H; Goodman, Jonathan M; Gutsell, Steve; Russell, Paul J

    2014-12-15

    Consumer and environmental safety decisions are based on exposure and hazard data, interpreted using risk assessment approaches. The adverse outcome pathway (AOP) conceptual framework has been presented as a logical sequence of events or processes within biological systems which can be used to understand adverse effects and refine current risk assessment practices in ecotoxicology. This framework can also be applied to human toxicology and is explored on the basis of investigating the molecular initiating events (MIEs) of compounds. The precise definition of the MIE has yet to reach general acceptance. In this work we present a unified MIE definition: an MIE is the initial interaction between a molecule and a biomolecule or biosystem that can be causally linked to an outcome via a pathway. Case studies are presented, and issues with current definitions are addressed. With the development of a unified MIE definition, the field can look toward defining, classifying, and characterizing more MIEs and using knowledge of the chemistry of these processes to aid AOP research and toxicity risk assessment. We also present the role of MIE research in the development of in vitro and in silico toxicology and suggest how, by using a combination of biological and chemical approaches, MIEs can be identified and characterized despite a lack of detailed reports, even for some of the most studied molecules in toxicology.

  4. Establishing precursor events for stress corrosion cracking initiation in type 304L stainless steel

    International Nuclear Information System (INIS)

    Khan, M.U.F.; Raja, V.S.; Roychowdhury, S.; Kain, V.

    2015-01-01

    The present study attempts to establish slip band emergence, due to localized deformation, as a precursor event for SCC initiation in type 304L SS. Unidirectional tensile loading was used to strain flat tensile specimens, to less than 10% strain, in air, in 0.5 M NaCl + 0.5 M H2SO4 and in a boiling water reactor (BWR) simulated environment (288 °C, 10 MPa). The surface features were characterized using optical microscopy, scanning electron microscopy (including electron backscattered diffraction, EBSD) and atomic force microscopy. The study shows that, with increasing strain level during the unidirectional slow strain rate test (SSRT), the average slip band height increases in air and attack on slip lines occurs in the acidified chloride environment. In the BWR simulated environment, preferential oxidation of slip lines and initiation of a few cracks on some of the slip lines are observed. Based on these observations, the study suggests that slip bands, formed due to localized deformation, act as a precursor for SCC initiation. (authors)

  5. Calculation of noninformative prior of reliability parameter and initiating event frequency with Jeffreys method

    International Nuclear Information System (INIS)

    He Jie; Zhang Binbin

    2013-01-01

    In the probabilistic safety assessment (PSA) of nuclear power plants, there are few historical records on some initiating event frequencies or component failures in industry. In order to determine the noninformative priors of such reliability parameters and initiating event frequencies, the Jeffreys method in Bayesian statistics was employed. The mathematical mechanism of the Jeffreys prior and the simplified constrained noninformative distribution (SCNID) were elaborated in this paper. The Jeffreys noninformative formulas and the credible intervals of the Gamma-Poisson and Beta-Binomial models were introduced. As an example, the small break loss-of-coolant accident (SLOCA) was employed to show the application of the Jeffreys prior in determining an initiating event frequency. The result shows that the Jeffreys method is an effective method for noninformative prior calculation. (authors)
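
    For the Gamma-Poisson case, the Jeffreys prior leads to a Gamma(n + 0.5, T) posterior after observing n events in exposure T; the sketch below computes the posterior mean and an equal-tailed credible interval with hypothetical inputs, and does not reproduce the paper's SLOCA numbers.

```python
from scipy.stats import gamma

def jeffreys_poisson_posterior(n_events: int, exposure: float, cred: float = 0.90):
    """Jeffreys posterior Gamma(n + 0.5, exposure) for a Poisson rate,
    returning the posterior mean and an equal-tailed credible interval."""
    shape, scale = n_events + 0.5, 1.0 / exposure
    lo = gamma.ppf((1.0 - cred) / 2.0, shape, scale=scale)
    hi = gamma.ppf(1.0 - (1.0 - cred) / 2.0, shape, scale=scale)
    mean = shape * scale
    return mean, (lo, hi)

# Hypothetical: 2 events observed in 500 reactor-years of experience
print(jeffreys_poisson_posterior(n_events=2, exposure=500.0))
```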

  6. Overview of results and perspectives from the Shoreham major common-cause initiating events study

    International Nuclear Information System (INIS)

    Joksimovich, V.; Orvis, D.D.; Paccione, R.J.

    1986-01-01

    This study represents the continuation of a large effort by LILCO to fully understand the potential hazards posed by future operation of the Shoreham Nuclear Power Stations (SNPS). The Shoreham Probabilistic Risk Assessment, a level 3 PRA without external events, provided a characterization of the accident sequences that could leave the core in a condition in which it would be vulnerable to severe damage if further mitigating actions were not taken. It estimated the frequency and magnitude of the potential radioactivity releases associated with such sequences. The study was limited to accident sequences initiated by so called internal events to the plant including a loss of offsite power. It also characterized the public risk associated with those accident sequences. The ''Major Common-Cause Initiating Events Study'' (MCCI) for the Shoreham plant was performed to obtain insights into the plant's susceptibility to, and inherent defenses against, certain MCCIs. Major common-cause initiating events are occurrences which have the potential to initiate a plant transient or LOCA and, also, damage one or more plant systems needed to mitigate the effects of a transient or LOCA. The scope of the MCCI study included detailed analyses of seismic events and fires through the severe core damage and bounding analyses of aircraft crashes, windstorms, turbine missiles and release of hazardous materials near the plant

  7. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, and current activities.

  8. The development on the methodology of the initiating event frequencies for liquid metal reactor KALIMER

    International Nuclear Information System (INIS)

    Jeong, K. S.; Yang, Z. A.; Ah, Y. B.; Jang, W. P.; Jeong, H. Y.; Ha, K. S.; Han, D. H.

    2002-01-01

    In this paper, the PSA methodologies of PRISM, Light Water Reactors and Pressurized Heavy Water Reactors are analyzed, and a methodology of Initiating Events for KALIMER is suggested. Also, the reliability assessment of assumptions for pipe corrosion frequency is set up. The reliability assessment of the Passive Safety System, one of the main safety systems of KALIMER, is discussed and analyzed.

  9. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  10. Initiating events study of the first extraction cycle process in a model reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Renze; Zhang, Jian Gang; Zhuang, Dajie; Feng, Zong Yang [China Institute for Radiation Protection, Taiyuan (China)

    2016-06-15

    Definition and grouping of initiating events (IEs) are important basics for probabilistic safety assessment (PSA). An IE in a spent fuel reprocessing plant (SFRP) is an event that may lead to the release of dangerous material that jeopardizes workers, the public and the environment. The main difference between SFRPs and nuclear power plants (NPPs) is that hazardous materials spread diffusely in an SFRP, and radioactive material is just one kind of hazardous material. Since research on IEs for NPPs is well developed around the world, there are several general methods to identify IEs: reference to existing lists, review of experience feedback, qualitative analysis, and deductive analysis. While failure mode and effect analysis (FMEA) is an important qualitative analysis method, the master logic diagram (MLD) method is the deductive analysis method. IE identification in SFRPs should draw on the experience of NPPs; however, the differences between SFRPs and NPPs should be considered seriously. The plutonium uranium reduction extraction (Purex) process is adopted in a model reprocessing plant. The first extraction cycle (FEC) is the pivotal process in the Purex process. Whether the FEC can function safely and steadily directly influences the production process and product quality of the whole plant. Important facilities of the FEC are installed in equipment cells (ECs). In this work, IEs in the FEC process were identified and categorized by the two methods, FMEA and MLD, based on the fact that ECs are containments in the plant. The results show that only two ECs in the FEC do not need particular attention to safety problems, and that criticality, fire and red oil explosion are IEs that should be emphatically analyzed. The results are in accordance with the references.

  11. Estimation of initiating event distribution at nuclear power plants by Bayesian procedure

    International Nuclear Information System (INIS)

    Chen Guangming

    1995-01-01

    Initiating events at nuclear power plants, such as human errors or component failures, may lead to a nuclear accident. The study of the frequency of these events or the distribution of the failure rate is necessary in probabilistic risk assessment for nuclear power plants. This paper presents Bayesian modelling methods for the analysis of the distribution of the failure rate. The method can also be utilized in other related fields, especially where data are sparse. An application of the Bayesian modelling in the analysis of the distribution of the time to recover Loss of Off-Site Power (LOSP) is discussed in the paper.

  12. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2015-05-15

    One of the applications of PSA is the risk monitor. Risk monitoring is a real-time analysis tool used to determine the real-time risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnosis. The prognostic method requires condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. For reducing the conservatism and uncertainties, we previously suggested the concept of updating the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. From previous research, the possibility of updating PSA more correctly by using data was found. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, and wearout failure). In this paper, in order to investigate the applicability of prognostic methods for updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.
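
    The bathtub curve mentioned above is often approximated piecewise; the sketch below uses hypothetical rates and breakpoints simply to make the three regions concrete, and is not taken from the survey.

```python
def bathtub_hazard(t_years: float) -> float:
    """Piecewise toy hazard rate (per year) with hypothetical breakpoints:
    infant failures, constant/random failures, then wear-out."""
    if t_years < 1.0:          # infant-failure region
        return 0.05
    if t_years < 30.0:         # useful-life / random-failure region
        return 0.01
    return 0.05 + 0.01 * (t_years - 30.0)   # wear-out region, rising with age

print([round(bathtub_hazard(t), 3) for t in (0.5, 10.0, 40.0)])
```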

  13. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Heo, Gyunyoung

    2015-01-01

    One of the applications of PSA is the risk monitor. Risk monitoring is a real-time analysis tool used to determine the real-time risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnosis. The prognostic method requires condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. For reducing the conservatism and uncertainties, we previously suggested the concept of updating the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. From previous research, the possibility of updating PSA more correctly by using data was found. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, and wearout failure). In this paper, in order to investigate the applicability of prognostic methods for updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.

  14. An Initiating-Event Analysis for PSA of Hanul Units 3 and 4: Results and Insights

    International Nuclear Information System (INIS)

    Kim, Dong-San; Park, Jin Hee

    2015-01-01

    As a part of the PSA, an initiating-event (IE) analysis was newly performed by considering the current state of knowledge and the requirements of the ASME/ANS probabilistic risk assessment (PRA) standard related to IE analysis. This paper describes the methods of, results and some insights from the IE analysis for the PSA of the Hanul units 3 and 4. In this study, as a part of the PSA for the Hanul units 3 and 4, an initiating-event (IE) analysis was newly performed by considering the current state of knowledge and the requirements of the ASME/ANS probabilistic risk assessment (PRA) standard. In comparison with the previous IE analysis, this study performed a more systematic and detailed analysis to identify potential initiating events, and calculated the IE frequencies by using the state-of-the-art methods and the latest data. As a result, not a few IE frequencies are quite different from the previous frequencies, which can change the major accident sequences obtained from the quantification of the PSA model

  15. Physical mechanism of initial breakdown pulses and narrow bipolar events in lightning discharges

    Science.gov (United States)

    Silva, Caitano L.; Pasko, Victor P.

    2015-05-01

    To date the true nature of initial breakdown pulses (IBPs) and narrow bipolar events (NBEs) in lightning discharges remains a mystery. Recent experimental evidence has correlated IBPs to the initial development of lightning leaders inside the thundercloud. NBE wideband waveforms resemble classic IBPs in both amplitude and duration. Most NBEs are quite peculiar in the sense that very frequently they occur in isolation from other lightning processes. The remaining fraction, 16% of positive polarity NBEs, according to Wu et al. (2014), happens as the first event in an otherwise regular intracloud lightning discharge. These authors point out that the initiator type of NBEs has no difference with other NBEs that did not start lightning, except for the fact that they occur deeper inside the thunderstorm (i.e., at lower altitudes). In this paper, we propose a new physical mechanism to explain the source of both IBPs and NBEs. We propose that IBPs and NBEs are the electromagnetic transients associated with the sudden (i.e., stepwise) elongation of the initial negative leader extremity in the thunderstorm electric field. To demonstrate our hypothesis a novel computational/numerical model of the bidirectional lightning leader tree is developed, consisting of a generalization of electrostatic and transmission line approximations found in the literature. Finally, we show how the IBP and NBE waveform characteristics directly reflect the properties of the bidirectional lightning leader (such as step length, for example) and amplitude of the thunderstorm electric field.

  16. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
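
    A reaction rule couples event detection with a condition and an action; the minimal event-condition-action sketch below uses hypothetical business events and has none of the expressiveness of the rule languages surveyed in the paper.

```python
# Minimal event-condition-action (ECA) reaction rule sketch with hypothetical events.
rules = [
    {
        "event": "order_received",
        "condition": lambda e: e["amount"] > 10_000,
        "action": lambda e: print(f"escalate order {e['id']} for manual approval"),
    },
]

def on_event(event):
    """Fire every rule whose event type matches and whose condition holds."""
    for rule in rules:
        if rule["event"] == event["type"] and rule["condition"](event):
            rule["action"](event)

on_event({"type": "order_received", "id": 42, "amount": 25_000})
```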

  17. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
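
    In the event-based control paradigm described above, the controller issues a command and expects a confirming sensor event within a time window derived from its model of the system; the sketch below mimics that check with a hypothetical command/sensor pair and is not Zeigler's DEVS formalism itself.

```python
import time

def command_and_confirm(send_command, wait_for_sensor, window_s: float) -> bool:
    """Issue a control command and require a confirming sensor response
    within the time window predicted by the controller's model."""
    send_command()
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        if wait_for_sensor():          # poll the (hypothetical) sensor
            return True                # confirmed within the window
        time.sleep(0.01)
    return False                       # window expired: flag a fault / replan

# Hypothetical usage: a gripper that reports closure via a flag.
state = {"closed": False}
ok = command_and_confirm(
    send_command=lambda: state.update(closed=True),
    wait_for_sensor=lambda: state["closed"],
    window_s=0.5,
)
print("confirmed" if ok else "fault")
```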

  18. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, Electric Power Research Institute Fire Loss Data and then collated into one database developed in the personal computer database management system, dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire such as: location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name of calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are
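
    The single- and multiple-criteria searches with Boolean 'and'/'or' described above can be pictured as a filter over event records; the records and field names below are hypothetical, and the real tool is a dBASE III application rather than Python.

```python
# Hypothetical fire-event records mimicking a few FIREDATA-style fields.
events = [
    {"location": "cable spreading room", "mode": "power operation",
     "suppression": "manual", "description": "transient combustible ignited by welding"},
    {"location": "diesel generator room", "mode": "shutdown",
     "suppression": "automatic CO2", "description": "fuel oil spray fire on hot surface"},
]

def search(records, criteria, mode="and"):
    """Filter records on field substrings, combined with Boolean 'and' or 'or'."""
    combine = all if mode == "and" else any
    return [r for r in records
            if combine(value.lower() in r.get(field, "").lower()
                       for field, value in criteria.items())]

print(search(events, {"location": "diesel", "description": "oil"}, mode="and"))
```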

  19. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e. all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.
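
    To make the notion of reachable events concrete, the sketch below computes the least fixpoint of a plain enabling relation over hypothetical events; the paper's new relation for circular dependencies and its correspondence with PCL provability are beyond this sketch.

```python
# Hypothetical enabling relation: each event lists the alternative sets of
# events that must already have happened before it can occur.
enabling = {
    "pay": [set()],                      # initially enabled
    "ship": [{"pay"}],                   # enabled once 'pay' has occurred
    "refund": [{"pay", "complaint"}],    # needs both
    "complaint": [{"ship"}],
}

def reachable(enabling):
    """Least fixpoint: keep adding events for which some enabling set is already reachable."""
    done = set()
    changed = True
    while changed:
        changed = False
        for event, alternatives in enabling.items():
            if event not in done and any(req <= done for req in alternatives):
                done.add(event)
                changed = True
    return done

print(reachable(enabling))
```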

  20. The "Big Bang" in obese fat: Events initiating obesity-induced adipose tissue inflammation.

    Science.gov (United States)

    Wensveen, Felix M; Valentić, Sonja; Šestan, Marko; Turk Wensveen, Tamara; Polić, Bojan

    2015-09-01

    Obesity is associated with the accumulation of pro-inflammatory cells in visceral adipose tissue (VAT), which is an important underlying cause of insulin resistance and progression to diabetes mellitus type 2 (DM2). Although the role of pro-inflammatory cytokines in disease development is established, the initiating events leading to immune cell activation remain elusive. Lean adipose tissue is predominantly populated with regulatory cells, such as eosinophils and type 2 innate lymphocytes. These cells maintain tissue homeostasis through the excretion of type 2 cytokines, such as IL-4, IL-5, and IL-13, which keep adipose tissue macrophages (ATMs) in an anti-inflammatory, M2-like state. Diet-induced obesity is associated with the loss of tissue homeostasis and development of type 1 inflammatory responses in VAT, characterized by IFN-γ. A key event is a shift of ATMs toward an M1 phenotype. Recent studies show that obesity-induced adipocyte hypertrophy results in upregulated surface expression of stress markers. Adipose stress is detected by local sentinels, such as NK cells and CD8(+) T cells, which produce IFN-γ, driving M1 ATM polarization. A rapid accumulation of pro-inflammatory cells in VAT follows, leading to inflammation. In this review, we provide an overview of events leading to adipose tissue inflammation, with a special focus on adipose homeostasis and the obesity-induced loss of homeostasis which marks the initiation of VAT inflammation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Study of MHD events initiated by pellet injection into T-10 plasmas

    International Nuclear Information System (INIS)

    Kuteev, B.; Khimchenko, L.; Krylov, S.; Pavlov, Y.; Pustovitov, V.; Sarychev, D.; Sergeev, V.; Skokov, V.; Timokhin, V.

    2005-01-01

    There are several events which might be responsible for ultra-fast transport of heat and particles during the pellet ablation stage in a tokamak. Those are jumps of transport coefficients, plasma drifts in the pellet vicinity and MHD events with a time scale significantly shorter than the pellet ablation time. The role of the latter is still not very well understood due to a lack of studies. This paper is devoted to a detailed study of the effects during the pellet ablation phase (∼ one millisecond), with the main objective to determine the relation between pellet parameters (material: Li, C, KCl; size and velocity) and plasma parameters (q-value at the pellet position, plasma density and temperature) which initiate microsecond MHD events in the plasma. The pellets were injected both into Ohmic and ECE heated plasmas (up to 3 MW) in the T-10 tokamak at various stages of the plasma discharge, in a wide range from the very beginning up to the post-disruption stage. It is observed that under some conditions a pellet ablates in the plasma without accompanying MHD events. This occurs at the highest plasma densities, even if a pellet penetrates through the q=1 magnetic surface. The ablation rate corresponds to the NGSM in this case. Small-scale events may occur near rational magnetic surfaces, and the ablation rate fluctuations may be explained by reconnection. Both an increase of the longitudinal heat flow due to plasma convection from the higher-temperature region and the growth of the electric field generating supra-thermal electrons may be responsible for the enhanced ablation. Large-scale MHD events envelop a region inside q<3. It is observed that the MHD-cooled area is not poloidally symmetric. Mechanisms of the phenomena observed and their consequences for tokamak operation are discussed. (Author)

  2. Risk factors for adverse events after vaccinations performed during the initial hospitalization of infants born prematurely.

    Science.gov (United States)

    Wilińska, Maria; Warakomska, Małgorzata; Głuszczak-Idziakowska, Ewa; Jackowska, Teresa

    There are significant delays in implementing vaccination among preterm infants. The aims were to describe the frequency and kinds of adverse events following immunization in preterm infants and to identify the group of preterms who are particularly susceptible to adverse events. Demographic and clinical data and the occurrence of adverse events after DTaP, Hib and pneumococcal vaccination among preterms during their initial hospitalization were prospectively collected with the use of an electronic data form between 1st June 2011 and 31st May 2015. The analysis was conducted on 138 patients. The patients were divided into groups according to maturity (I: GA ≤ 28 weeks, n=73; II: GA 29-36 weeks, n=65). There were no statistically significant differences between the groups in the occurrence of adverse events. In the total group, apnoea developed after vaccination in 6 newborns (4%) and activity dysfunctions were observed in 13 newborns (10%). The occurrence of apnoea after vaccination positively correlated with the duration of non-invasive ventilation and the occurrence of late infection. There were no statistically significant demographic or clinical risk factors for the development of activity dysfunctions following vaccination. Timely vaccination in clinically stable preterm infants is a safe medical procedure. However, long-term non-invasive respiratory support and late infections are risk factors for apnoea following vaccinations. In these patients vaccinations should be considered during hospitalization.

  3. Initial concepts on energetics and mass releases during nonnuclear explosive events in fuel cycle facilities

    International Nuclear Information System (INIS)

    Halverson, M.A.; Mishima, J.

    1986-09-01

    Non-nuclear explosions are one of the initiating events (accidents) considered in the US Nuclear Regulatory Commission study of formal methods for estimating the airborne release of radionuclides from fuel cycle facilities. Methods currently available to estimate the energetics and mass airborne release from the four types of non-nuclear explosive events (fast and slow physical explosions and fast and slow chemical explosions) are reviewed. The likelihood that fast physical explosions will occur in fuel cycle facilities appears to be remote, and this type of explosion is not considered. Methods to estimate the consequences of slow physical and fast chemical explosions are available. Methods to estimate the consequences of slow chemical explosions are less well defined.

  4. Event Highlight: Nigeria Evidence-based Health System Initiative

    International Development Research Centre (IDRC) Digital Library (Canada)

    2012-06-01

    Jun 1, 2012 ... skills about relevant statistical and epidemiological methods. In this third module, they analyzed the data from the social audit surveys in their two states. Working ... Distance Learning Master of Science in Epidemiology that is.

  5. Initiating events identification of the IS process using the master logic diagram

    International Nuclear Information System (INIS)

    Cho, Nam Chul; Jae, Moo Sung; Yang, Joon Eon

    2005-01-01

    Hydrogen is very attractive as a future secondary energy carrier considering environmental problems. It is important to produce hydrogen from water by use of a carbon-free primary energy source. The thermochemical water decomposition cycle is one of the methods for producing hydrogen from water. The Japan Atomic Energy Research Institute (JAERI) has been carrying out R and D on the IS (iodine-sulfur) process, first proposed by GA (General Atomic Co.), focusing on demonstrating 'closed-cycle' continuous hydrogen production, on developing a feasible and efficient scheme for the HI processing, and on screening and/or developing materials of construction to be used in the corrosive process environment. The successful continuous operation of the IS process was demonstrated, and this process is one of the thermochemical processes that is closest to being industrialized. Currently, Korea has also started research on the IS process, and the construction of an IS process system is planned. In this study, for risk analysis of the IS process, initiating events of the IS process are identified by using the Master Logic Diagram (MLD), a method for initiating event identification.
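
    As an aside for readers unfamiliar with the technique, the sketch below illustrates the top-down spirit of a Master Logic Diagram in plain Python: the top undesired event is decomposed level by level, and the leaves are collected as candidate initiating events. The node names are hypothetical examples and are not taken from the JAERI/GA study.

```python
# Illustrative sketch of a Master Logic Diagram (MLD) represented as a nested
# tree; leaf nodes (None) are treated as candidate initiating events (IEs).
# All node names below are hypothetical.
from typing import Dict, List, Union

MLDNode = Union[Dict[str, "MLDNode"], None]

# Top-down decomposition: top event -> safety functions -> candidate IEs
mld: MLDNode = {
    "Release of hazardous material from IS process": {
        "Loss of process heat removal": {
            "Loss of secondary helium flow": None,          # leaf -> candidate IE
            "Heat exchanger tube rupture": None,
        },
        "Loss of process confinement": {
            "Corrosion-induced pipe break (HI section)": None,
            "Vessel overpressure": None,
        },
    }
}

def collect_initiating_events(node: MLDNode, path: List[str] = None) -> List[List[str]]:
    """Recursively walk the MLD and return the path to every leaf (candidate IE)."""
    path = path or []
    if node is None:                      # leaf reached: a candidate initiating event
        return [path]
    events = []
    for name, child in node.items():
        events.extend(collect_initiating_events(child, path + [name]))
    return events

for chain in collect_initiating_events(mld):
    print(" -> ".join(chain))
```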

  6. Selection of important initiating events for Level 1 probabilistic safety assessment study at Puspati TRIGA Reactor

    International Nuclear Information System (INIS)

    Maskin, M.; Charlie, F.; Hassan, A.; Prak Tom, P.; Ramli, Z.; Mohamed, F.

    2016-01-01

    Highlights: • Identifying possible important initiating events (IEs) for a Level 1 probabilistic safety assessment performed on a research nuclear reactor. • Methods for screening and grouping IEs are addressed. • Focus only on internal IEs due to random failures of components. - Abstract: This paper attempts to present the results of identifying possible important initiating events (IEs), as comprehensively as possible, to be applied in the development of a Level-1 probabilistic safety assessment (PSA) study. This involves the approaches to listing and the methods for screening and grouping IEs, focusing only on the internal IEs due to random failures of components and human errors, with full-power operational conditions and the reactor core as the radioactivity source. Five approaches were applied in listing the IEs, and each step of the methodology was described and commented on. The criteria for screening and grouping the IEs were also presented. The results provide information on how the Malaysian PSA team applied the approaches in selecting the most probable IEs as completely as possible, in order to ensure that the set of IEs was identified systematically and as representatively as possible, hence providing confidence in the completeness of the PSA study. This study is perhaps one of the first to address the classic comprehensive steps in identifying important IEs to be used in a Level-1 PSA study.

  7. A Method to Quantify Plant Availability and Initiating Event Frequency Using a Large Event Tree, Small Fault Tree Model

    International Nuclear Information System (INIS)

    Kee, Ernest J.; Sun, Alice; Rodgers, Shawn; Popova, ElmiraV; Nelson, Paul; Moiseytseva, Vera; Wang, Eric

    2006-01-01

    South Texas Project uses a large fault tree to produce scenarios (minimal cut sets) used in quantification of plant availability and event frequency predictions. On the other hand, the South Texas Project probabilistic risk assessment model uses a large event tree, small fault tree for quantifying core damage and radioactive release frequency predictions. The South Texas Project is converting its availability and event frequency model to use a large event tree, small fault tree approach in an effort to streamline application support and to provide additional detail in results. The availability and event frequency model as well as the applications it supports (maintenance and operational risk management, system engineering health assessment, preventive maintenance optimization, and RIAM) are briefly described. A methodology to perform availability modeling in a large event tree, small fault tree framework is described in detail. How the methodology can be used to support South Texas Project maintenance and operations risk management is described in detail. Differences from other fault tree methods and other recently proposed methods are discussed in detail. While the methods described are novel to the South Texas Project Risk Management program and to large event tree, small fault tree models, concepts in the area of application support and availability modeling have wider applicability to the industry. (authors)
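
    To make the large event tree, small fault tree idea concrete, here is a minimal illustrative sketch (not the South Texas Project model): branch-point failure probabilities are obtained from small fault trees via a rare-event approximation, and sequence frequencies follow by multiplying the initiating event frequency along each event tree path. All numbers and names are hypothetical.

```python
# Illustrative sketch: quantify a small event tree whose branch probabilities
# come from simple fault trees (rare-event approximation for OR gates).

def or_gate(probs):
    """Rare-event approximation for an OR of independent basic events."""
    return min(1.0, sum(probs))

def and_gate(probs):
    p = 1.0
    for x in probs:
        p *= x
    return p

# Hypothetical top-event failure probabilities from small fault trees
P_FAIL = {
    "cooling_train_A": or_gate([1e-3, 5e-4]),   # pump fails OR valve fails
    "cooling_train_B": or_gate([1e-3, 5e-4]),
    "operator_recovery": 1e-2,
}

INITIATING_EVENT_FREQ = 0.5   # per year (hypothetical)

# Event-tree sequences: list of (top event, failed?) branches along the path
sequences = {
    "S1_loss_of_function": [("cooling_train_A", True), ("cooling_train_B", True),
                            ("operator_recovery", True)],
    "S2_recovered":        [("cooling_train_A", True), ("cooling_train_B", True),
                            ("operator_recovery", False)],
}

for name, branches in sequences.items():
    p = INITIATING_EVENT_FREQ
    for top_event, failed in branches:
        q = P_FAIL[top_event]
        p *= q if failed else (1.0 - q)
    print(f"{name}: {p:.2e} /yr")
```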

  8. Initialization Errors in Quantum Data Base Recall

    OpenAIRE

    Natu, Kalyani

    2016-01-01

    This paper analyzes the relationship between initialization error and recall of a specific memory in the Grover algorithm for quantum database search. It is shown that the correct memory is obtained with high probability even when the initial state is far removed from the correct one. The analysis is done by relating the variance of error in the initial state to the recovery of the correct memory and the surprising result is obtained that the relationship between the two is essentially linear.
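
    A rough numerical sketch of the question studied here can be written with NumPy: perturb the uniform initial state with Gaussian noise of a given standard deviation, run the standard Grover iterations, and read off the probability of recovering the target item. This is an independent illustration, not the paper's analysis.

```python
# Illustrative sketch: Grover search with a noisy initial state, relating
# initialization error to the probability of recalling the target item.
import numpy as np

def grover_success_prob(n_qubits, target, noise_std, iterations=None, rng=None):
    rng = rng or np.random.default_rng(0)
    N = 2 ** n_qubits
    ideal = np.full(N, 1.0 / np.sqrt(N))            # uniform superposition
    state = ideal + rng.normal(0.0, noise_std, N)   # perturbed initialization
    state /= np.linalg.norm(state)

    if iterations is None:
        iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))

    for _ in range(iterations):
        state[target] *= -1.0                       # oracle: phase flip on target
        mean = state.mean()
        state = 2.0 * mean - state                  # inversion about the mean (diffusion)

    return abs(state[target]) ** 2                  # probability of recalling the target

for sigma in [0.0, 0.05, 0.1, 0.2]:
    print(sigma, round(grover_success_prob(8, target=3, noise_std=sigma), 3))
```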

  9. Treatment of the loss of ultimate heat sink initiating events in the IRSN level 1 PSA

    International Nuclear Information System (INIS)

    Dupuy, Patricia; Georgescu, Gabriel; Corenwinder, Francois

    2014-01-01

    The total loss of the ultimate heat sink is an initiating event which, even if it is mainly of external origin, has been considered in the frame of internal events Level 1 PSA by IRSN. The on-going actions on the development of external hazards PSA and the recent incident of loss of the heat sink induced by the ingress of vegetable matter that occurred in France in 2009 have pointed out the need to improve the modeling of the loss of the heat sink initiating event and sequences to better take into account the fact that this loss may be induced by external hazards and thus affect all the site units. The paper presents the historical steps of the modeling of the total loss of the heat sink, the safety stakes of this modeling, the main assumptions used by IRSN in the associated PSA for the 900 MWe reactors and the results obtained. The total loss of the heat sink was not initially addressed in the safety demonstration of French NPPs. On the basis of the insights of the first probabilistic assessments performed in the 1980s, the risks associated with this 'multiple failure situation' turned out to be very significant, and design and organisational improvements were implemented on the plants. Reviews of the characterization of external hazards and of their consequences on the installations and French operating feedback have revealed that extreme hazards may induce a total loss of the heat sink. Moreover, the accident that occurred at Fukushima in 2011 has pointed out the risk of such a loss of long duration at all site units in case of extreme hazards. In this context, it seems relevant to further improve the modelling of the total loss of the heat sink by considering the external hazards that may cause this loss. In a first step, IRSN has improved the assumptions and data used in the loss of the heat sink PSA model, in particular by considering that such a loss may affect all the site units. The next challenge will be the deeper analysis of the impact of external hazards on

  10. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  11. When the Sky Falls: Performing Initial Assessments of Bright Atmospheric Events

    Science.gov (United States)

    Cooke, William J.; Brown, Peter; Blaauw, Rhiannon; Kingery, Aaron; Moser, Danielle

    2015-01-01

    The 2013 Chelyabinsk super bolide was the first "significant" impact event to occur in the age of social media and 24 hour news. Scientists, used to taking many days or weeks to analyze fireball events, were hard pressed to meet the immediate demands (within hours) for answers from the media, general public, and government officials. Fulfilling these requests forced many researchers to exploit information available from various Internet sources - videos were downloaded from sites like Youtube, geolocated via Google Street View, and quickly analyzed with improvised software; Twitter and Facebook were scoured for eyewitness accounts of the fireball and reports of meteorites. These data, combined with infrasound analyses, enabled a fairly accurate description of the Chelyabinsk event to be formed within a few hours; in particular, any relationship to 2012 DA14 (which passed near Earth later that same day) was eliminated. Results of these analyses were quickly disseminated to members of the NEO community for press conferences and media interviews. Despite a few minor glitches, the rapid initial assessment of Chelyabinsk was a triumph, permitting the timely conveyance of accurate information to the public and the incorporation of social media into fireball analyses. Beginning in 2008, the NASA Meteoroid Environments Office, working in cooperation with Western's Meteor Physics Group, developed processes and software that permit quick characterization - mass, trajectory, and orbital properties - of fireball events. These tools include automated monitoring of Twitter to establish the time of events (the first tweet is usually no more than a few seconds after the fireball), mining of Youtube and all sky camera web archives to locate videos suitable for analyses, use of Google Earth and Street View to geolocate the video locations, and software to determine the fireball trajectory and object orbital parameters, including generation of animations suitable for popular media

  12. Initiating events of accidents in the practice of oil well logging in Cuba

    International Nuclear Information System (INIS)

    Alles Leal, A.; Perez Reyes, Y.; Dumenigo Gonzalez, C.

    2013-01-01

    Oil well logging is an extremely important activity within the oil industry but, in turn, brings risks that occasionally result in damage to health, harm to the environment and economic losses. In this context, risk analysis has become an important tool to control these risks through their prediction and the study of the factors that determine them, enabling substantiated decisions, first, to foresee accidents and, second, to minimize their consequences. This paper proposes the elaboration of a list of initiating events of accidents in the practice of oil well logging, which is one of the most important aspects for the further evaluation of the radiation safety of this practice. For its determination, the risk identification technique employed was Failure Modes and Effects Analysis (FMEA), applied to the different stages and processes of the practice. (Author)
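
    For illustration only, the following sketch shows the bookkeeping behind an FMEA-style screening: each stage/failure-mode pair is scored for severity, occurrence and detectability, and candidate initiating events are ranked by Risk Priority Number. The entries are hypothetical and are not the study's actual list.

```python
# Illustrative FMEA sketch: rank candidate initiating events of a well-logging
# operation by Risk Priority Number (RPN = severity x occurrence x detectability).
# All entries and scores below are hypothetical.

failure_modes = [
    # (stage, failure mode, severity 1-10, occurrence 1-10, detection 1-10)
    ("source transport", "shielding container dropped", 7, 3, 2),
    ("source loading",   "source stuck in tool",        8, 4, 5),
    ("logging run",      "tool lost downhole",          9, 2, 6),
    ("source return",    "contamination of work area",  6, 3, 4),
]

# Sort by RPN, highest risk first
ranked = sorted(failure_modes, key=lambda fm: fm[2] * fm[3] * fm[4], reverse=True)

for stage, mode, s, o, d in ranked:
    print(f"RPN={s * o * d:3d}  [{stage}] {mode}")
```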

  13. Probabilistic safety analysis on an SBWR 72 hours after the initiating event

    International Nuclear Information System (INIS)

    Dominguez Bautista, M.T.; Peinador Veira, M.

    1996-01-01

    Passive plants, including SBWRs, are designed to carry out safety functions with passive systems during the first 72 hours after the initiating event with no need for manual actions or external support. After this period, some recovery actions are required to enable the passive systems to continue performing their safety functions. The study was carried out by the INITEC-Empresarios Agrupados Joint Venture within the framework of the international group collaborating with GE on this project. Its purpose has been to assess, by means of probabilistic criteria, the importance to safety of each of these support actions, in order to define possible requirements to be considered in the design in respect of said recovery actions. In brief, the methodology developed for this objective consists of (1) quantifying success event trees from the PSA up to 72 hours, (2) determining the actions required in each sequence to maintain steady state after 72 hours, (3) identifying available alternative core cooling methods in each sequence, (4) establishing the approximate (order of magnitude) realizability of each alternative method, (5) calculating the frequency of core damage as a function of the failure probability of post-72-hour actions, and (6) analysing the importance of post-72-hour actions. The results of this analysis permit the establishment, right from the conceptual design phase, of the requirements that will arise to ensure these actions in the long term, enhancing their reliability and preventing the accident from continuing beyond this period. (Author)
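
    Step (5) of the methodology can be illustrated with a toy calculation (hypothetical sequence frequencies and action counts, not the SBWR PSA): the core damage frequency contribution is the sum over success sequences of the sequence frequency times the probability that at least one required post-72-hour action fails.

```python
# Illustrative sketch: core damage frequency as a function of the failure
# probability of post-72-hour recovery actions, summed over the success
# sequences that rely on them. All numbers are hypothetical.

# (sequence frequency per year, number of independent post-72 h actions required)
success_sequences = [
    (2.0e-6, 1),   # e.g. refill of passive cooling pools
    (5.0e-7, 2),   # e.g. refill plus DC power recovery
    (1.0e-7, 1),
]

def core_damage_frequency(p_fail_action):
    """CDF contribution = sequence frequency x probability that at least
    one required post-72 h action fails (independence assumed)."""
    cdf = 0.0
    for freq, n_actions in success_sequences:
        p_any_fail = 1.0 - (1.0 - p_fail_action) ** n_actions
        cdf += freq * p_any_fail
    return cdf

for p in [1e-3, 1e-2, 1e-1]:
    print(f"p_fail={p:.0e}  ->  CDF contribution = {core_damage_frequency(p):.2e} /yr")
```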

  14. Fragile X founder chromosomes in Italy: A few initial events and possible explanation for their heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Chiurazzi, P.; Genuardi, M.; Kozak, L.; Neri, G. [Universita Cattolica and Centro Ricerche per la Disabilita Mentale e Motoria, Roma (Italy)] [and others]

    1996-07-12

    A total of 137 fragile X and 235 control chromosomes from various regions of Italy were haplotyped by analyzing two neighbouring marker microsatellites, FRAXAC1 and DXS548. The number of CGG repeats at the 5′ end of the FMR1 gene was also assessed in 141 control chromosomes and correlated with their haplotypes. Significant linkage disequilibrium between some "major" haplotypes and fragile X was observed, while other "minor" haplotypes may have originated by subsequent mutation at the marker microsatellite loci and/or recombination between them. Recent evidence suggests that the initial mechanism leading to CGG instability might consist of rare (10⁻⁶-10⁻⁷) CGG repeat slippage events and/or loss of a stabilizing AGG via A-to-C transversion. Also, the apparently high variety of fragile X chromosomes may be partly due to the relatively high mutation rate (10⁻⁴-10⁻⁵) of the microsatellite markers used in haplotyping. Our fragile X sample also showed a higher than expected heterozygosity when compared to the control sample and we suggest that this might be explained by the chance occurrence of the few founding events on different chromosomes, irrespective of their actual frequency in the population. Alternatively, a local mechanism could enhance the microsatellite mutation rate only on fragile X chromosomes, or fragile X mutations might occur more frequently on certain background haplotypes. 59 refs., 4 figs.

  15. Charged particle multiplicities in heavy and light quark initiated events above the $Z^0$ peak

    CERN Document Server

    Abbiendi, G.; Akesson, P.F.; Alexander, G.; Allison, John; Amaral, P.; Anagnostou, G.; Anderson, K.J.; Arcelli, S.; Asai, S.; Axen, D.; Azuelos, G.; Bailey, I.; Barberio, E.; Barlow, R.J.; Batley, R.J.; Bechtle, P.; Behnke, T.; Bell, Kenneth Watson; Bell, P.J.; Bella, G.; Bellerive, A.; Benelli, G.; Bethke, S.; Biebel, O.; Bloodworth, I.J.; Boeriu, O.; Bock, P.; Bonacorsi, D.; Boutemeur, M.; Braibant, S.; Brigliadori, L.; Brown, Robert M.; Buesser, K.; Burckhart, H.J.; Campana, S.; Carnegie, R.K.; Caron, B.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, David G.; Csilling, A.; Cuffiani, M.; Dado, S.; Dallison, S.; De Roeck, A.; De Wolf, E.A.; Desch, K.; Dienes, B.; Donkers, M.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Elfgren, E.; Etzion, E.; Fabbri, F.; Feld, L.; Ferrari, P.; Fiedler, F.; Fleck, I.; Ford, M.; Frey, A.; Furtjes, A.; Gagnon, P.; Gary, John William; Gaycken, G.; Geich-Gimbel, C.; Giacomelli, G.; Giacomelli, P.; Giunta, Marina; Goldberg, J.; Gross, E.; Grunhaus, J.; Gruwe, M.; Gunther, P.O.; Gupta, A.; Hajdu, C.; Hamann, M.; Hanson, G.G.; Harder, K.; Harel, A.; Harin-Dirac, M.; Hauschild, M.; Hauschildt, J.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Hensel, C.; Herten, G.; Heuer, R.D.; Hill, J.C.; Hoffman, Kara Dion; Homer, R.J.; Horvath, D.; Howard, R.; Igo-Kemenes, P.; Ishii, K.; Jeremie, H.; Jovanovic, P.; Junk, T.R.; Kanaya, N.; Kanzaki, J.; Karapetian, G.; Karlen, D.; Kartvelishvili, V.; Kawagoe, K.; Kawamoto, T.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kim, D.H.; Klein, K.; Klier, A.; Kluth, S.; Kobayashi, T.; Kobel, M.; Komamiya, S.; Kormos, Laura L.; Kramer, T.; Kress, T.; Krieger, P.; von Krogh, J.; Krop, D.; Kruger, K.; Kuhl, T.; Kupper, M.; Lafferty, G.D.; Landsman, H.; Lanske, D.; Layter, J.G.; Leins, A.; Lellouch, D.; Lettso, J.; Levinson, L.; Lillich, J.; Lloyd, S.L.; Loebinger, F.K.; Lu, J.; Ludwig, J.; Macpherson, A.; Mader, W.; Marcellini, S.; Marchant, T.E.; Martin, A.J.; Martin, J.P.; Masetti, G.; Mashimo, T.; Mattig, Peter; McDonald, W.J.; McKenna, J.; McMahon, T.J.; McPherson, R.A.; Meijers, F.; Mendez-Lorenzo, P.; Menges, W.; Merritt, F.S.; Mes, H.; Michelini, A.; Mihara, S.; Mikenberg, G.; Miller, D.J.; Moed, S.; Mohr, W.; Mori, T.; Mutter, A.; Nagai, K.; Nakamura, I.; Neal, H.A.; Nisius, R.; O'Neale, S.W.; Oh, A.; Okpara, A.; Oreglia, M.J.; Orito, S.; Pahl, C.; Pasztor, G.; Pater, J.R.; Patrick, G.N.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Poli, B.; Polok, J.; Pooth, O.; Przybycien, M.; Quadt, A.; Rabbertz, K.; Rembser, C.; Renkel, P.; Rick, H.; Roney, J.M.; Rosati, S.; Rozen, Y.; Runge, K.; Sachs, K.; Saeki, T.; Sahr, O.; Sarkisyan, E.K.G.; Schaile, A.D.; Schaile, O.; Scharff-Hansen, P.; Schieck, J.; Schoerner-Sadenius, Thomas; Schroder, Matthias; Schumacher, M.; Schwick, C.; Scott, W.G.; Seuster, R.; Shears, T.G.; Shen, B.C.; Sherwood, P.; Siroli, G.; Skuja, A.; Smith, A.M.; Sobie, R.; Soldner-Rembold, S.; Spano, F.; Stahl, A.; Stephens, K.; Strom, David M.; Strohmer, R.; Tarem, S.; Tasevsky, M.; Taylor, R.J.; Teuscher, R.; Thomson, M.A.; Torrence, E.; Toya, D.; Tran, P.; Trefzger, T.; Tricoli, A.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turner-Watson, M.F.; Ueda, I.; Ujvari, B.; Vachon, B.; Vollmer, C.F.; Vannerem, P.; Verzocchi, M.; Voss, H.; Vossebeld, J.; Waller, D.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wengler, T.; Wermes, N.; Wetterling, D.; Wilson, G.W.; Wilson, J.A.; Wolf, G.; Wyatt, T.R.; Yamashita, S.; Zer-Zion, D.; Zivkovic, Lidija

    2002-01-01

    We have measured the mean charged particle multiplicities separately for bbbar, ccbar and light quark (uubar, ddbar, ssbar) initiated events produced in e+e- annihilations at LEP. The data were recorded with the OPAL detector at eleven different energies above the Z0 peak, corresponding to the full statistics collected at LEP1.5 and LEP2. The difference in mean charged particle multiplicities for bbbar and light quark events, delta_bl, measured over this energy range is consistent with an energy-independent behaviour, as predicted by QCD, but is inconsistent with the prediction of a more phenomenological approach which assumes that the multiplicity accompanying the decay of a heavy quark is independent of the quark mass itself. Our results, which can be combined into the single measurement delta_bl = 3.44+-0.40(stat)+-0.89(syst) at a luminosity-weighted average centre-of-mass energy of 195 GeV, are also consistent with an energy-independent behaviour as extrapolated from lower energy data.

  16. Ultrafast hydrogen exchange reveals specific structural events during the initial stages of folding of cytochrome c.

    Science.gov (United States)

    Fazelinia, Hossein; Xu, Ming; Cheng, Hong; Roder, Heinrich

    2014-01-15

    Many proteins undergo a sharp decrease in chain dimensions during early stages of folding, prior to the rate-limiting step in folding. However, it remains unclear whether compact states are the result of specific folding events or a general hydrophobic collapse of the polypeptide chain driven by the change in solvent conditions. To address this fundamental question, we extended the temporal resolution of NMR-detected H/D exchange labeling experiments into the microsecond regime by adopting a microfluidics approach. By observing the competition between H/D exchange and folding as a function of labeling pH, coupled with direct measurement of exchange rates in the unfolded state, we were able to monitor hydrogen-bond formation for over 50 individual backbone NH groups within the initial 140 microseconds of folding of horse cytochrome c. Clusters of solvent-shielded amide protons were observed in two α-helical segments in the C-terminal half of the protein, while the N-terminal helix remained largely unstructured, suggesting that proximity in the primary structure is a major factor in promoting helix formation and association at early stages of folding, while the entropically more costly long-range contacts between the N- and C-terminal helices are established only during later stages. Our findings clearly indicate that the initial chain condensation in cytochrome c is driven by specific interactions among a subset of α-helical segments rather than a general hydrophobic collapse.

  17. Analysis of the initiating events in HIV-1 particle assembly and genome packaging.

    Directory of Open Access Journals (Sweden)

    Sebla B Kutluay

    2010-11-01

    Full Text Available HIV-1 Gag drives a number of events during the genesis of virions and is the only viral protein required for the assembly of virus-like particles in vitro and in cells. Although a reasonable understanding of the processes that accompany the later stages of HIV-1 assembly has accrued, events that occur at the initiation of assembly are less well defined. In this regard, important uncertainties include where in the cell Gag first multimerizes and interacts with the viral RNA, and whether Gag-RNA interaction requires or induces Gag multimerization in a living cell. To address these questions, we developed assays in which protein crosslinking and RNA/protein co-immunoprecipitation were coupled with membrane flotation analyses in transfected or infected cells. We found that interaction between Gag and viral RNA occurred in the cytoplasm and was independent of the ability of Gag to localize to the plasma membrane. However, Gag:RNA binding was stabilized by the C-terminal domain (CTD of capsid (CA, which participates in Gag-Gag interactions. We also found that Gag was present as monomers and low-order multimers (e.g. dimers but did not form higher-order multimers in the cytoplasm. Rather, high-order multimers formed only at the plasma membrane and required the presence of a membrane-binding signal, but not a Gag domain (the CA-CTD that is essential for complete particle assembly. Finally, sequential RNA-immunoprecipitation assays indicated that at least a fraction of Gag molecules can form multimers on viral genomes in the cytoplasm. Taken together, our results suggest that HIV-1 particle assembly is initiated by the interaction between Gag and viral RNA in the cytoplasm and that this initial Gag-RNA encounter involves Gag monomers or low order multimers. These interactions per se do not induce or require high-order Gag multimerization in the cytoplasm. Instead, membrane interactions are necessary for higher order Gag multimerization and subsequent

  18. Adequate engineering for lowering the frequency of initiating events at Siemens/KWU

    International Nuclear Information System (INIS)

    Gremm, O.

    1988-01-01

    The analysis of TMI and Chernobyl events shows weak points and deficits in the field of preventive safety features. This should not be forgotten during the ongoing discussion on severe accidents. Therefore the paper explains special preventive safety features which were the results of the development of Siemens/KWU reactor technology. With respect to the present discussion on new reactor concepts special attention is given to the inherent and passive safety features and the engineering which results in low core melt frequency. Such an analysis leads to knowledge modules which are based on experience during licensing procedures and plant operation and should be the starting points for reactor technology of the future

  19. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  20. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  1. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina eMura

    2013-12-01

    Full Text Available Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  2. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  3. A hypothesis generation model of initiating events for nuclear power plant operators

    International Nuclear Information System (INIS)

    Sawhney, R.S.; Dodds, H.L.; Schryver, J.C.; Knee, H.E.

    1989-01-01

    The goal of existing alarm-filtering models is to provide the operator with the most accurate assessment of patterns of annunciated alarms. Some models are based on event-tree analysis, such as DuPont's Diagnosis of Multiple Alarms. Other models focus on improving hypothesis generation by deemphasizing alarms not relevant to the current plant scenario. Many such models utilize the alarm filtering system as a basis of dynamic prioritization. The Lisp-based alarm analysis model presented in this paper was developed for the Advanced Controls Program at Oak Ridge National Laboratory to dynamically prioritize hypotheses via an alarm filtering system (AFS) by incorporating an unannunciated alarm analysis with other plant-based concepts. The objective of this effort is to develop an alarm analysis model that would allow greater flexibility and more accurate hypothesis generation than the prototype fault diagnosis model utilized in the Integrated Reactor Operator/System (INTEROPS) model. INTEROPS is a time-based predictive model of the nuclear power plant operator, which utilizes alarm information in a manner similar to the human operator. This is achieved by recoding the knowledge base from the personal computer-based expert system shell to a Common Lisp structure, providing the ability to easily modify both the manner in which the knowledge is structured as well as the logic by which the program performs fault diagnosis.

  4. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
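
    A minimal sketch of the described pipeline, using synthetic waveforms and scikit-learn rather than the paper's data and settings: the 60 Hz fundamental is removed by a least-squares fit, simple statistics of the residual serve as features, and an SVM classifies the event type. Event classes, feature choices and parameters are illustrative assumptions.

```python
# Illustrative sketch: subtract the fundamental component from a voltage
# waveform and classify the residual with an SVM (synthetic data only).
import numpy as np
from sklearn.svm import SVC

FS, F0, N = 3200, 60.0, 533            # sampling rate, fundamental, samples (~10 cycles)
t = np.arange(N) / FS

def remove_fundamental(x):
    """Least-squares fit of the 60 Hz component and return the residual."""
    A = np.column_stack([np.sin(2 * np.pi * F0 * t), np.cos(2 * np.pi * F0 * t)])
    coeffs, *_ = np.linalg.lstsq(A, x, rcond=None)
    return x - A @ coeffs

def make_signal(kind, rng):
    """Synthetic 'normal', 'sag' or 'harmonic' voltage waveform."""
    x = np.sin(2 * np.pi * F0 * t + rng.uniform(0, 2 * np.pi))
    if kind == "sag":
        x[N // 3: 2 * N // 3] *= 0.6
    elif kind == "harmonic":
        x += 0.15 * np.sin(2 * np.pi * 5 * F0 * t)
    return x + 0.01 * rng.standard_normal(N)

def features(x):
    r = remove_fundamental(x)
    return [np.std(r), np.max(np.abs(r)), np.mean(r ** 2)]

rng = np.random.default_rng(1)
X, y = [], []
for label, kind in enumerate(["normal", "sag", "harmonic"]):
    for _ in range(100):
        X.append(features(make_signal(kind, rng)))
        y.append(label)

clf = SVC(kernel="rbf", C=10.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```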

  5. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of the existing practices, identifying and removing the existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical for the nuclear industry of 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events due to existing methodological, HOF-related and/or knowledge management related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  6. Human based roots of failures in nuclear events investigations

    International Nuclear Information System (INIS)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag

    2012-01-01

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of the existing practices, identifying and removing the existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical for the nuclear industry of 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events due to existing methodological, HOF-related and/or knowledge management related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  7. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  8. Identification of initiating events using a master logic diagram in low-power and shutdown PSA for nuclear power plant

    International Nuclear Information System (INIS)

    Han, S. J.; Park, J. H.; Kim, T. W.; Ha, J. J.

    2003-01-01

    It is necessary to apply a formal technique instead of an empirical technique in the identification of initiating events for Low Power and Shutdown (LPSD) Probabilistic Safety Assessment (PSA) of a Nuclear Power Plant (NPP). The present study focuses on the examination of the Master Logic Diagram (MLD) technique as a formal technique for the identification of initiating events. The MLD technique is a deductive tool using a top-down approach for the formal and logical identification of initiating events. The present study modified the MLD used in the full-power PSA considering the characteristics of LPSD operation. The modified MLD introduced a systematic formulation in the decomposition process, which the MLD for the full-power PSA lacked. The modified MLD was able to identify initiating events in a systematic and logical manner. However, formal techniques, including the MLD, have limitations in precisely identifying all of the initiating events. In order to overcome this limitation, it is necessary to combine them with an empirical technique. We expect that the modified MLD can be used in an upgrade of the current LPSD PSAs.

  9. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    Science.gov (United States)

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intention. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical in successful performance of time estimation and so in time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environment cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  10. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
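
    For readers unfamiliar with the SAX style, a minimal Python example of event-based XML processing is given below (using the standard library xml.sax module; the element names are made up). It illustrates the streaming model the analysis targets, not the paper's static analysis itself.

```python
# Minimal sketch of event-based (SAX-style) XML processing: parse events are
# handled one by one instead of building a DOM tree in memory.
import xml.sax

class ItemCounter(xml.sax.ContentHandler):
    """Count <item> elements and accumulate character data as events arrive."""
    def __init__(self):
        super().__init__()
        self.count = 0
        self.text = []

    def startElement(self, name, attrs):
        if name == "item":
            self.count += 1

    def characters(self, content):
        self.text.append(content)

handler = ItemCounter()
xml.sax.parseString(b"<root><item>a</item><item>b</item></root>", handler)
print("items seen:", handler.count)
```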

  11. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer-assisted surgery devices manually. For this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-grams to compute probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
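
    As a toy illustration of the n-gram idea (hypothetical event labels, not the authors' ontology or data), a bigram model over observed event sequences can already produce next-event predictions:

```python
# Illustrative sketch: predict the next surgical event from the current one
# using bigram counts. Event labels and training sequences are hypothetical.
from collections import Counter, defaultdict

training_surgeries = [
    ["incision", "dissection", "clipping", "cutting", "retrieval", "closure"],
    ["incision", "dissection", "coagulation", "clipping", "cutting", "closure"],
    ["incision", "dissection", "clipping", "cutting", "coagulation", "closure"],
]

bigrams = defaultdict(Counter)
for surgery in training_surgeries:
    for current, following in zip(surgery, surgery[1:]):
        bigrams[current][following] += 1

def predict_next(current_event):
    """Return the most probable next event and its estimated probability."""
    counts = bigrams[current_event]
    if not counts:
        return None, 0.0
    event, n = counts.most_common(1)[0]
    return event, n / sum(counts.values())

print(predict_next("clipping"))   # -> ('cutting', 1.0)
print(predict_next("cutting"))
```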

  12. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can...... the effectiveness of proposed protection strategy. The simulation results indicate that the proposed multi agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events....

  13. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

    Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses in preventing medication errors, which would be reflected in a decrease in adverse events (AE). Methods: This study consisted of two sta...

  14. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  15. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  16. The initial impact of EU ETS verification events on stock prices

    International Nuclear Information System (INIS)

    Brouwers, Roel; Schoubben, Frederiek; Van Hulle, Cynthia; Van Uytbergen, Steve

    2016-01-01

    This paper studies the impact of verified emissions publications in the European Emissions Trading Scheme (EU ETS) on the market value of participating companies. Using event study methodology on a unique sample of 368 listed companies, we show that verified emissions only resulted in statistically significant market responses when the carbon price was high and allowance scarcity was anticipated. The cross-section analysis of abnormal returns surrounding the publication of verified emissions shows that share prices decrease when actual emissions relative to allocated emissions increase. This negative relationship between allocation shortfalls and firm value is only significant for firms that are either carbon-intensive, compared to sector peers, or are less likely to pass through carbon-related costs in their product prices. The results suggest that although the EU ETS has been deemed unsuccessful so far due to over-allocation and low carbon price, shareholders initially perceived allowance holdings as value relevant. Our results highlight that a significant carbon market price and addressing pass-through costing are essential for successful future reforms of the EU ETS and other analogous carbon cap-and-trade systems implemented or planned worldwide. - Highlights: •We study the impact of EU ETS verified emissions disclosure on firms' market value. •Disclosure is relevant if carbon price is high and permit scarcity is anticipated. •We find a negative relationship between allocation shortfalls and firm value. •Stronger relationship for carbon-intensive and no cost pass-through firms. •High carbon price and addressing cost pass-through are crucial for EU ETS reforms.
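
    For readers unfamiliar with event study methodology, the following sketch on synthetic data shows the generic abnormal-return computation referred to above: a market model is estimated over a pre-event window, and abnormal returns and their cumulative sum are computed over the event window. It is not the authors' dataset or specification; all numbers are made up.

```python
# Illustrative event-study sketch on synthetic returns: estimate a market model
# over an estimation window, then compute (cumulative) abnormal returns around
# the event date.
import numpy as np

rng = np.random.default_rng(7)
T_est, T_evt = 200, 11                       # estimation window, event window (+/- 5 days)
market = rng.normal(0.0, 0.01, T_est + T_evt)
firm = 0.0002 + 1.2 * market + rng.normal(0.0, 0.008, T_est + T_evt)
firm[T_est + 5] -= 0.03                      # hypothetical negative reaction on day 0

# OLS market model on the estimation window: R_firm = alpha + beta * R_mkt
X = np.column_stack([np.ones(T_est), market[:T_est]])
alpha, beta = np.linalg.lstsq(X, firm[:T_est], rcond=None)[0]

expected = alpha + beta * market[T_est:]
abnormal = firm[T_est:] - expected
car = abnormal.sum()                         # cumulative abnormal return over the window
print(f"alpha={alpha:.5f} beta={beta:.3f} CAR={car:.4f}")
```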

  17. The dynamic relationship between current and previous severe hypoglycemic events: a lagged dependent variable analysis among patients with type 2 diabetes who have initiated basal insulin.

    Science.gov (United States)

    Ganz, Michael L; Li, Qian; Wintfeld, Neil S; Lee, Yuan-Chi; Sorli, Christopher; Huang, Joanna C

    2015-01-01

    Past studies have found episodes of severe hypoglycemia (SH) to be serially dependent. Those studies, however, only considered the impact of a single (index) event on future risk; few have analyzed SH risk as it evolves over time in the presence (or absence) of continuing events. The objective of this study was to determine the dynamic risks of SH events conditional on preceding SH events among patients with type 2 diabetes (T2D) who have initiated basal insulin. We used an electronic health records database from the United States that included encounter and laboratory data and clinical notes on T2D patients who initiated basal insulin therapy between 2008 and 2011 and to identify SH events. We used a repeated-measures lagged dependent variable logistic regression model to estimate the impact of SH in one quarter on the risk of SH in the next quarter. We identified 7235 patients with T2D who initiated basal insulin. Patients who experienced ≥1 SH event during any quarter were more likely to have ≥1 SH event during the subsequent quarter than those who did not (predicted probabilities of 7.4% and 1.0%, respectively; p history of SH before starting basal insulin (predicted probabilities of 1.0% and 3.2%, respectively; p history of SH during the titration period (predicted probabilities of 1.1% and 2.8%, respectively; p history of SH events and therefore the value of preventing one SH event may be substantial. These results can inform patient care by providing clinicians with dynamic data on a patient's risk of SH, which in turn can facilitate appropriate adjustment of the risk-benefit ratio for individualized patient care. These results should, however, be interpreted in light of the key limitations of our study: not all SH events may have been captured or coded in the database, data on filled prescriptions were not available, we were unable to adjust for basal insulin dose, and the post-titration follow-up period could have divided into time units other

  18. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and nighttime (terminator) conditions without modification to the camera or optics. The event based sensor’s ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An eventbased sensor’s asynchronous output has an intrinsically low data-rate. In addition to low-bandwidth communications requirements, the low weight, low-power and high-speed make them ideally suitable to meeting the demanding

  19. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    ), have been developed with a specific aim to standardize AE categorization. However, these controlled terminologies have many drawbacks, such as lack of textual definitions, poorly defined hierarchies, and lack of semantic axioms that provide logical relations among terms. A biomedical ontology is a set of consensus-based and computer and human interpretable terms and relations that represent entities in a specific biomedical domain and how they relate each other. To represent and analyze vaccine adverse events (VAEs), our research group has initiated and led the development of a community-based ontology: the Ontology of Adverse Events (OAE) (He et al., J Biomed Semant 5:29, 2014). The OAE has been found to have advantages to overcome the drawbacks of those controlled terminologies (He et al., Curr Pharmacol Rep :1-16. doi:10.1007/s40495-016-0055-0, 2014). By expanding the OAE and the community-based Vaccine Ontology (VO) (He et al., VO: vaccine ontology. In The 1st International Conference on Biomedical Ontology (ICBO-2009). Nature Precedings, Buffalo. http://precedings.nature.com/documents/3552/version/1 ; J Biomed Semant 2(Suppl 2):S8; J Biomed Semant 3(1):17, 2009; Ozgur et al., J Biomed Semant 2(2):S8, 2011; Lin Y, He Y, J Biomed Semant 3(1):17, 2012), we have also developed the Ontology of Vaccine Adverse Events (OVAE) to represent known VAEs associated with licensed vaccines (Marcos E, Zhao B, He Y, J Biomed Semant 4:40, 2013).In this book chapter, we will first introduce the basic information of VAEs, VAE safety surveillance systems, and how to specifically query and analyze VAEs using the US VAE database VAERS (Chen et al., Vaccine 12(10):960-960, 1994). In the second half of the chapter, we will introduce the development and applications of the OAE and OVAE. Throughout this chapter, we will use the influenza vaccine Flublok as the vaccine example to launch the corresponding elaboration (Huber VC, McCullers JA, Curr Opin Mol Ther 10(1):75-85, 2008). Flublok is a

  20. A Novel Flood Forecasting Method Based on Initial State Variable Correction

    Directory of Open Access Journals (Sweden)

    Kuang Li

    2017-12-01

    Full Text Available The influence of initial state variables on flood forecasting accuracy by using conceptual hydrological models is analyzed in this paper and a novel flood forecasting method based on correction of initial state variables is proposed. The new method is abbreviated as ISVC (Initial State Variable Correction. The ISVC takes the residual between the measured and forecasted flows during the initial period of the flood event as the objective function, and it uses a particle swarm optimization algorithm to correct the initial state variables, which are then used to drive the flood forecasting model. The historical flood events of 11 watersheds in south China are forecasted and verified, and important issues concerning the ISVC application are then discussed. The study results show that the ISVC is effective and applicable in flood forecasting tasks. It can significantly improve the flood forecasting accuracy in most cases.
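
    As a companion to the description above, here is a deliberately simplified sketch of the ISVC idea: search over the initial state with a particle swarm so that simulated flows match observations during the initial period. The one-variable linear-reservoir model and the PSO settings are illustrative placeholders, not the hydrological model or tuning used in the paper.

```python
import numpy as np

def simulate(initial_storage, rain, k=0.3):
    """Toy linear-reservoir model: flow is proportional to the current storage."""
    s, flows = initial_storage, []
    for r in rain:
        s = s + r - k * s
        flows.append(k * s)
    return np.array(flows)

def pso_correct_initial_state(obs, rain, bounds=(0.0, 100.0),
                              n_particles=20, n_iter=60, seed=0):
    """Particle swarm search for the initial storage minimising the residual
    between observed and simulated flows over the initial period."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    cost = lambda s0: np.sum((simulate(s0, rain) - obs) ** 2)
    x = rng.uniform(lo, hi, n_particles)              # particle positions
    v = np.zeros(n_particles)                         # particle velocities
    pbest, pbest_cost = x.copy(), np.array([cost(s) for s in x])
    gbest = pbest[np.argmin(pbest_cost)]
    for _ in range(n_iter):
        r1, r2 = rng.uniform(size=n_particles), rng.uniform(size=n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(s) for s in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        gbest = pbest[np.argmin(pbest_cost)]
    return gbest

rain = np.array([5.0, 12.0, 20.0, 8.0, 3.0, 1.0])
observed = simulate(40.0, rain)                       # pretend 40.0 is unknown
print(pso_correct_initial_state(observed, rain))      # converges near 40.0
```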

  1. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  2. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.

  3. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenarios Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
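
    The OBEST software itself is not reproduced here; purely as an illustration of the recursive idea described above, the toy sketch below expands the probabilistic branches of a small object/event model so that every complete scenario, including very low-probability ones, is enumerated together with its likelihood.

```python
def enumerate_scenarios(model, state="start", prob=1.0, path=()):
    """Recursively expand probabilistic branches. `model` maps a state to a list
    of (next_state, branch_probability) pairs; states with no entry are terminal.
    Returns every complete path with its likelihood."""
    branches = model.get(state)
    if not branches:
        return [(path + (state,), prob)]
    scenarios = []
    for nxt, p in branches:
        scenarios += enumerate_scenarios(model, nxt, prob * p, path + (state,))
    return scenarios

# Hypothetical model: a fault is either isolated or propagates and may escalate.
model = {
    "fault":      [("isolated", 0.9), ("propagates", 0.1)],
    "propagates": [("mitigated", 0.7), ("escalates", 0.3)],
}

for scenario, likelihood in enumerate_scenarios(model, state="fault"):
    print(" -> ".join(scenario), f"p = {likelihood:.3f}")
# fault -> isolated                 p = 0.900
# fault -> propagates -> mitigated  p = 0.070
# fault -> propagates -> escalates  p = 0.030
```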

  4. 7 CFR 1467.20 - Market-based conservation initiatives.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Market-based conservation initiatives. 1467.20....20 Market-based conservation initiatives. (a) Acceptance and use of contributions. Section 1241(e) of... for Conservation Improvements. (1) USDA recognizes that environmental benefits will be achieved by...

  5. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.
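
    The production system described above runs on Hadoop and Oracle; the snippet below is only a toy relational sketch (using Python's built-in sqlite3 in place of Oracle, with invented column names) of how event-wise keys can be indexed and how a simple GROUP BY query flags duplicated events at a given processing step.

```python
import sqlite3

con = sqlite3.connect(":memory:")                 # stand-in for the real RDBMS
con.execute("""CREATE TABLE event_index (
                   run_number INTEGER, event_number INTEGER,
                   processing_step TEXT, guid TEXT)""")
con.executemany("INSERT INTO event_index VALUES (?, ?, ?, ?)", [
    (300000, 1, "RAW", "file-a"),
    (300000, 2, "RAW", "file-a"),
    (300000, 2, "AOD", "file-b"),
    (300000, 2, "AOD", "file-c"),                 # same event indexed twice at AOD
])

# Data-integrity check: events that appear more than once within a processing step.
dups = con.execute("""SELECT run_number, event_number, processing_step, COUNT(*)
                      FROM event_index
                      GROUP BY run_number, event_number, processing_step
                      HAVING COUNT(*) > 1""").fetchall()
print(dups)    # [(300000, 2, 'AOD', 2)]
```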

  6. Summary of significant solar-initiated events during STIP interval XII

    International Nuclear Information System (INIS)

    Gergely, T.E.

    1982-01-01

    A summary of the significant solar-terrestrial events of STIP Interval XII (April 10-July 1, 1981) is presented. It is shown that the first half of the interval was extremely active, with several of the largest X-ray flares, particle events, and shocks of this solar cycle taking place during April and the first half of May. However, the second half of the interval was characterized by relatively quiet conditions. A detailed examination is presented of several large events which occurred on 10, 24, and 27 April and on 8 and 16 May. It is suggested that the comparison and statistical analysis of the numerous events for which excellent observations are available could provide information on what causes a type II burst to propagate in the interplanetary medium

  7. Rocchio-based relevance feedback in video event retrieval

    NARCIS (Netherlands)

    Pingen, G.L.J.; de Boer, M.H.T.; Aly, Robin; Amsaleg, Laurent; Guðmundsson, Gylfi Þór; Gurrin, Cathal; Jónsson, Björn Þór; Satoh, Shin’ichi

    This paper investigates methods for user and pseudo relevance feedback in video event retrieval. Existing feedback methods achieve strong performance but adjust the ranking based on few individual examples. We propose a relevance feedback algorithm (ARF) derived from the Rocchio method, which is a
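
    The record above is truncated, so the exact ARF variant is not shown here; for orientation only, the snippet below implements the classical Rocchio update it is derived from, with illustrative weights: the query vector is moved toward the centroid of relevant examples and away from the centroid of non-relevant ones.

```python
import numpy as np

def rocchio_update(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classical Rocchio relevance-feedback update of a query vector."""
    q = alpha * np.asarray(query, dtype=float)
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return q

query = np.array([0.2, 0.0, 0.5])
relevant = np.array([[0.9, 0.1, 0.4], [0.8, 0.0, 0.6]])
nonrelevant = np.array([[0.0, 0.9, 0.1]])
print(rocchio_update(query, relevant, nonrelevant))
```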

  8. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  9. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  10. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.
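
    The viewgraphs referenced above do not fix a concrete tag set, so the following is a purely hypothetical event message, built with Python's standard xml.etree library, meant only to illustrate the idea of serializing producer/consumer event notifications as XML.

```python
import xml.etree.ElementTree as ET

def build_event_message(producer, consumer, event_type, payload):
    """Serialize a grid event notification as XML (all tag names are invented)."""
    event = ET.Element("event", attrib={"type": event_type})
    ET.SubElement(event, "producer").text = producer
    ET.SubElement(event, "consumer").text = consumer
    body = ET.SubElement(event, "payload")
    for key, value in payload.items():
        ET.SubElement(body, key).text = str(value)
    return ET.tostring(event, encoding="unicode")

print(build_event_message("scheduler.grid.example", "job-monitor",
                          "job.completed", {"job_id": 42, "exit_code": 0}))
# <event type="job.completed"><producer>scheduler.grid.example</producer>...
```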

  11. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
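
    The abstract is cut short, but the baseline it builds on is standard historical-simulation VaR; as a point of reference only, the sketch below computes it from a return history before any event-based outlier treatment is applied.

```python
import numpy as np

def historical_var(returns, confidence=0.95):
    """Historical-simulation Value-at-Risk: the loss exceeded with
    probability (1 - confidence) in the sampled return history."""
    return -np.quantile(returns, 1.0 - confidence)

rng = np.random.default_rng(1)
daily_returns = rng.normal(0.0005, 0.01, 500)     # synthetic return history
daily_returns[100] = -0.12                        # a news-event outlier
print(f"95% one-day VaR: {historical_var(daily_returns):.4f}")
```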

  12. Evaluating MJO Event Initiation and Decay in the Skeleton Model using an RMM-like Index

    Science.gov (United States)

    2015-11-25

    univariate zonal wind EOF analysis, the mean number of continuing events exceeds ... observations, though the observed number falls within the 95... year simulation period using the truncated, observed SSTs. Approximately two-thirds of the observed events fall within 20-100 days with a ... Advances in simulating atmospheric variability with the ECMWF model: From synoptic to decadal time-scales, Q. J. Roy. Meteor. Soc., 134, 1337

  13. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  14. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Full Text Available Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM. Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN, then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  15. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  16. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

    Full Text Available This paper investigates asymptotic stabilization for linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, which has some advantages over traditional ones with continuous feedback. Considering the effect of time-varying transmission delays, the criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.
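
    The paper's delay-aware design is not reproduced here; the sketch below only illustrates the basic event-driven feedback idea on a scalar unstable plant (no transmission delay, invented gains and threshold): the state is transmitted to the controller only when the holding error exceeds a relative threshold, so far fewer transmissions are needed than with periodic feedback.

```python
# Scalar unstable plant x' = a*x + b*u with stabilising gain k and
# a relative event-triggering threshold sigma (all values illustrative).
a, b, k, sigma, dt = 1.0, 1.0, -3.0, 0.2, 0.001

x, x_held, events = 1.0, 1.0, 0      # true state, last transmitted state, #events
steps = int(5.0 / dt)
for _ in range(steps):
    # Event rule: transmit only when the holding error grows too large.
    if abs(x - x_held) >= sigma * abs(x):
        x_held, events = x, events + 1
    u = k * x_held                   # feedback uses the held measurement
    x = x + dt * (a * x + b * u)     # Euler integration of the plant
print(f"final state {x:.5f}, transmissions {events} of {steps} steps")
```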

  17. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    Energy Technology Data Exchange (ETDEWEB)

    Khristova, R; Kaltchev, B; Dimitrov, B [Energoproekt, Sofia (Bulgaria); Nedyalkova, D; Sonev, A [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1996-12-31

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs.

  18. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    International Nuclear Information System (INIS)

    Khristova, R.; Kaltchev, B.; Dimitrov, B.; Nedyalkova, D.; Sonev, A.

    1995-01-01

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs

  19. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms by considering a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address the intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
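
    The clinical PIDPlus tuning and the noise-filtering event generator of the paper are not reproduced here; the sketch below, with invented gains and a toy plant, only illustrates the generic event-based idea of recomputing a PID output when the measurement has moved by more than a deadband since the last update, which keeps the manipulated variable quiet between events.

```python
class EventBasedPID:
    """PID controller that recomputes its output only when the measurement has
    changed by more than `deadband` since the last update (send-on-delta event)."""
    def __init__(self, kp, ki, kd, deadband, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.deadband, self.dt = deadband, dt
        self.integral, self.prev_error = 0.0, 0.0
        self.last_measurement, self.output = None, 0.0

    def update(self, setpoint, measurement):
        if (self.last_measurement is not None and
                abs(measurement - self.last_measurement) < self.deadband):
            return self.output                  # no event: hold the previous output
        self.last_measurement = measurement     # event: recompute the control law
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        self.output = (self.kp * error + self.ki * self.integral
                       + self.kd * derivative)
        return self.output

pid = EventBasedPID(kp=0.8, ki=0.4, kd=0.05, deadband=0.5, dt=0.1)
y = 0.0                                # toy first-order plant output
for _ in range(200):
    u = pid.update(50.0, y)            # drive the plant toward a setpoint of 50
    y += 0.1 * (u - 0.05 * y)
print(round(y, 2))                     # settles close to the setpoint
```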

  20. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors in interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements with an established criterion), intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be

  1. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  2. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.

  3. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes the fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and then result in the system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time, herein, the quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like the “demarcative points” of safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful to ensure the system to operate safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft getting into the icing region
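
    The aircraft-icing scenario from the paper is only gestured at below; the sketch is a toy Monte Carlo over a two-branch event sequence whose branch probabilities depend on a fluctuating system state variable, which is enough to show how risk evolution can be traced by repeated sampling as the SSV drifts (all probabilities are invented).

```python
import numpy as np

def simulate_sequence(rng, icing_severity):
    """Toy event sequence: detection -> recovery -> outcome. Branch probabilities
    depend on a system state variable (icing severity)."""
    p_detect = max(0.0, 0.95 - 0.3 * icing_severity)
    if rng.random() > p_detect:
        return "accident"                       # undetected icing
    p_recover = max(0.0, 0.99 - 0.2 * icing_severity)
    return "safe" if rng.random() < p_recover else "accident"

rng = np.random.default_rng(42)
for severity in (0.1, 0.5, 0.9):                # SSV drifting during operation
    runs = [simulate_sequence(rng, severity) for _ in range(20000)]
    print(f"icing severity {severity:.1f}: "
          f"P(accident) ≈ {runs.count('accident') / len(runs):.3f}")
```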

  4. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to the social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents, and exert different behavior or attitude may lead to different contribution to the specific event. Therefore, classifying the large amount of uncategorized social circles generated in Weibo media automatically from the perspective of events has been a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experiments results show that our method identifies the user categories accurately.

  5. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to the social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents, and exert different behavior or attitude may lead to different contribution to the specific event. Therefore, classifying the large amount of uncategorized social circles generated in Weibo media automatically from the perspective of events has been a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experiments results show that our method identifies the user categories accurately.

  6. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    Full Text Available A new paradigm in the field of access control systems with fuzzy authorization is proposed. Suppose there is a set of objects in a single data transmission network. The goal is to develop a dynamic authorization protocol based on the correctness of presentation of events (news) that occurred earlier in the network. We propose a mathematical method that compactly keeps the history of events, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers, and each vector is multiplied by several stochastic vectors. It is known that if the event vectors are sparse, they can be restored with high accuracy by solving an ℓ1-optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors. It has been established that the largest absolute components are restored earliest. An access control system with the proposed dynamic authorization method makes it possible to compute fuzzy confidence coefficients in networks with a frequently changing set of participants, mesh networks, and multi-agent systems.

  7. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  8. Development and Initial Validation of a Patient-Reported Adverse Drug Event Questionnaire

    NARCIS (Netherlands)

    de Vries, Sieta T.; Mol, Peter G. M.; de Zeeuw, Dick; Haaijer-Ruskamp, Flora M.; Denig, Petra

    2013-01-01

    Background Direct patient reporting of adverse drug events (ADEs) is relevant for the evaluation of drug safety. To collect such data in clinical trials and postmarketing studies, a valid questionnaire is needed that can measure all possible ADEs experienced by patients. Objective Our aim was to

  9. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
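
    The imaging and tracking pipeline itself is not reproduced here; as a purely illustrative companion to the statistics described above, the sketch below simulates a non-homogeneous Poisson event series with an exponentially increasing rate (by thinning) and reports the mean inter-event time, the same quantities the protocol extracts from mitotic events.

```python
import numpy as np

def sample_nhpp(rate_fn, t_max, rate_max, rng):
    """Sample event times of a non-homogeneous Poisson process on [0, t_max]
    by thinning: propose at rate_max, accept with probability rate(t)/rate_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_max:
            return np.array(times)
        if rng.random() < rate_fn(t) / rate_max:
            times.append(t)

rng = np.random.default_rng(7)
rate = lambda t: 0.05 * np.exp(t / 20.0)        # events/hour, rising with time
events = sample_nhpp(rate, t_max=48.0, rate_max=rate(48.0), rng=rng)
print(f"{len(events)} events, mean inter-event time {np.diff(events).mean():.1f} h")
```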

  10. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become the key part of Internet of Things (IoT). Proactive CEP can predict future system states and execute some actions to avoid unwanted states which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  11. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and the use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis has increasingly become a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions made it possible to identify - and in some cases also to quantify - several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. During 2000, PSAEA has increasingly become a part of the AVN process of feedback of operating experience. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding for instance event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  12. Hypertension control after an initial cardiac event among Medicare patients with diabetes mellitus: A multidisciplinary group practice observational study.

    Science.gov (United States)

    Chaddha, Ashish; Smith, Maureen A; Palta, Mari; Johnson, Heather M

    2018-04-23

    Patients with diabetes mellitus and cardiovascular disease have a high risk of mortality and/or recurrent cardiovascular events. Hypertension control is critical for secondary prevention of cardiovascular events. The objective was to determine rates and predictors of achieving hypertension control among Medicare patients with diabetes and uncontrolled hypertension after hospital discharge for an initial cardiac event. A retrospective analysis of linked electronic health record and Medicare data was performed. The primary outcome was hypertension control within 1 year after hospital discharge for an initial cardiac event. Cox proportional hazard models assessed sociodemographics, medications, utilization, and comorbidities as predictors of control. Medicare patients with diabetes were more likely to achieve hypertension control when prescribed beta-blockers at discharge or with a history of more specialty visits. Adults ≥ 80 were more likely to achieve control with diuretics. These findings demonstrate the importance of implementing guideline-directed multidisciplinary care in this complex and high-risk population. ©2018 Wiley Periodicals, Inc.

  13. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises with ground truth annotations at shot and beat level of each movie. In this dataset, the annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. The training dataset for each of the eleven beat categories is developed and then a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and then three classification labels are assigned to each key frame. The classification labels for each of the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented based on criteria of precision, recall, and F-measure. The results are compared with the existing technique and significant improvements are recorded.
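
    The CNN training and key-frame classification stages are not repeated here; the snippet below only illustrates the final, simple step described above, with invented labels: adjacent shots that received the same label are grouped into a single beat event.

```python
from itertools import groupby

def shots_to_beat_events(shot_labels):
    """Group maximal runs of adjacent shots sharing the same predicted label into
    beat events, returned as (label, first_shot_index, last_shot_index)."""
    events, idx = [], 0
    for label, run in groupby(shot_labels):
        n = len(list(run))
        events.append((label, idx, idx + n - 1))
        idx += n
    return events

labels = ["dialogue", "dialogue", "chase", "chase", "chase", "fight", "fight"]
print(shots_to_beat_events(labels))
# [('dialogue', 0, 1), ('chase', 2, 4), ('fight', 5, 6)]
```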

  14. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the thief following the victim, or by interactions with an accomplice before and after the incident (longer time scale). This paper focusses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enriching the selection of videos that are to be observed.
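
    The detector, tracker and the actual rule set of the paper are not reproduced here; the fragment below is only a toy rule-based classifier over a track of (x, y) positions, with invented speed and spread thresholds, to make the idea of track-based single-person actions concrete.

```python
import numpy as np

def classify_track(positions, fps=25.0, run_speed=2.5, walk_speed=0.3,
                   stop_radius=0.5):
    """Toy rule-based action classifier for a track of (x, y) positions in metres.
    Thresholds are illustrative, not those used in the paper."""
    positions = np.asarray(positions, dtype=float)
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    mean_speed = steps.mean() * fps                       # metres per second
    spread = np.linalg.norm(positions - positions.mean(axis=0), axis=1).max()
    if mean_speed >= run_speed:
        return "run"
    if mean_speed >= walk_speed:
        return "walk"
    return "stop" if spread <= stop_radius else "loiter"

walker = [(0.04 * i, 0.0) for i in range(100)]            # ~1 m/s straight line
print(classify_track(walker))                             # -> walk
```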

  15. Radiologically isolated syndrome: 5-year risk for an initial clinical event.

    Directory of Open Access Journals (Sweden)

    Darin T Okuda

    Full Text Available OBJECTIVE: To report the 5-year risk and to identify risk factors for the development of a seminal acute or progressive clinical event in a multi-national cohort of asymptomatic subjects meeting 2009 RIS Criteria. METHODS: Retrospectively identified RIS subjects from 22 databases within 5 countries were evaluated. Time to the first clinical event related to demyelination (acute or 12-month progression of neurological deficits) was compared across different groups by univariate and multivariate analyses utilizing a Cox regression model. RESULTS: Data were available in 451 RIS subjects (F: 354, 78.5%). The mean age at the time of the first brain MRI revealing anomalies suggestive of MS was 37.2 years (y) (median: 37.1 y, range: 11-74 y), with a mean clinical follow-up time of 4.4 y (median: 2.8 y, range: 0.01-21.1 y). Clinical events were identified in 34% (standard error = 3%) of individuals within a 5-year period from the first brain MRI study. Of those who developed symptoms, 9.6% fulfilled criteria for primary progressive MS. In the multivariate model, age [hazard ratio (HR): 0.98 (95% CI: 0.96-0.99); p=0.03], sex (male) [HR: 1.93 (1.24-2.99); p=0.004], and lesions within the cervical or thoracic spinal cord [HR: 3.08 (2.06-4.62); p<0.001] were identified as significant predictors for the development of a first clinical event. INTERPRETATION: These data provide supportive evidence that a meaningful number of RIS subjects evolve to a first clinical symptom. An age <37 y, male sex, and spinal cord involvement appear to be the most important independent predictors of symptom onset.

  16. Increased non-AIDS mortality among persons with AIDS-defining events after antiretroviral therapy initiation

    DEFF Research Database (Denmark)

    Pettit, April C; Giganti, Mark J; Ingle, Suzanne M

    2018-01-01

    ) initiation. METHODS: We included HIV treatment-naïve adults from the Antiretroviral Therapy Cohort Collaboration (ART-CC) who initiated ART from 1996 to 2014. Causes of death were assigned using the Coding Causes of Death in HIV (CoDe) protocol. The adjusted hazard ratio (aHR) for overall and cause-specific non-AIDS mortality among those with an ADE (all ADEs, tuberculosis (TB), Pneumocystis jiroveci pneumonia (PJP), and non-Hodgkin's lymphoma (NHL)) compared to those without an ADE was estimated using a marginal structural model. RESULTS: The adjusted hazard of overall non-AIDS mortality was higher

  17. Ultraviolet-resonance femtosecond stimulated Raman study of the initial events in photoreceptor chromophore

    Directory of Open Access Journals (Sweden)

    Tahara T.

    2013-03-01

    Full Text Available Newly-developed ultraviolet-resonance femtosecond stimulated-Raman spectroscopy was utilized to study the initial structural evolution of photoactive yellow protein chromophore in solution. The obtained spectra changed drastically within 1 ps, demonstrating rapid in-plane deformations of the chromophore.

  18. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging of complex AER systems. On the other hand, the use of a commercial personal computer implies a dependence on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is basically based on an embedded computer, a powerful FPGA and serial links, to make the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the Address-Event-based network communication and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400MHz Gumstix Connex computer) is also connected to the FPGA

  19. Improving the Critic Learning for Event-Based Nonlinear $H_\infty$ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with the event-based nonlinear H∞ state feedback control design. First of all, the H∞ control problem is regarded as a two-player zero-sum game and the adaptive critic mechanism is used to achieve the minimax optimization under an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. The initial stabilizing control is no longer required during the implementation process of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability issue is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis on the lower bound of the minimal intersample time. Finally, the applications to aircraft dynamics and a robot arm plant are carried out to verify the efficient performance of the present novel design method.

  20. Short-Period Surface Wave Based Seismic Event Relocation

    Science.gov (United States)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

    Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations that challenges our ability to estimate any locations. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise, relative event relocations. We extend the teleseismic surface wave relocation method and apply it to near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and the EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts are to explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.

  1. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
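
    The data model and rule language developed in the work above are not reproduced here; as a minimal illustration of declarative, rule-based complex-event processing over temporal RFID readings, the sketch below (with invented reader names) raises an alert when a tag is read at the exit without an earlier checkout read.

```python
from datetime import datetime

# Timestamped RFID observations: (epc, reader_location, time). Illustrative only.
readings = [
    ("EPC-001", "shelf",    datetime(2024, 1, 1, 10, 0)),
    ("EPC-001", "checkout", datetime(2024, 1, 1, 10, 5)),
    ("EPC-001", "exit",     datetime(2024, 1, 1, 10, 7)),
    ("EPC-002", "shelf",    datetime(2024, 1, 1, 10, 2)),
    ("EPC-002", "exit",     datetime(2024, 1, 1, 10, 9)),
]

def detect_unpaid_exit(readings):
    """Temporal rule: raise a complex event when a tag reaches 'exit' with no
    prior 'checkout' reading for the same tag."""
    alerts, seen_checkout = [], set()
    for epc, location, ts in sorted(readings, key=lambda r: r[2]):
        if location == "checkout":
            seen_checkout.add(epc)
        elif location == "exit" and epc not in seen_checkout:
            alerts.append((epc, ts))
    return alerts

print(detect_unpaid_exit(readings))   # EPC-002 flagged at 10:09
```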

  2. Citizens' initiative “# Noen3caínes". Discourse analysis of an event

    Directory of Open Access Journals (Sweden)

    Luis Eduardo Ospina Raigosa

    2016-07-01

    Full Text Available The article performs a Critical Study of the Discourse of a fragment of the video “Narconovelas- Movimiento ciudadano #noen3caines” (Garcia & Cartagena, 2013), which gives an account of the citizens’ initiative Noen3caínes. In this initiative, citizens question the media and what it constitutes as a social response. The general objective is to interpret the social response called Noen3Caínes from the meanings that are proposed in the video analysis. Noen3Caínes had concrete effects on the advertising of the RCN Colombia TV series “Tres Caínes”. At least 13 brands withdrew their advertising thanks to public pressure from the Internet, because they considered that their image was not akin to what was proposed on television, a matter which has no precedent in the history of the media in Colombia.

  3. Accident analyses in nuclear power plants following external initiating events and in the shutdown state. Final report

    International Nuclear Information System (INIS)

    Loeffler, Horst; Kowalik, Michael; Mildenberger, Oliver; Hage, Michael

    2016-06-01

    The work which is documented here provides the methodological basis for improvement of the state of knowledge for accident sequences after plant external initiating events and for accident sequences which begin in the shutdown state. The analyses have been done for a PWR and for a BWR reference plant. The work has been supported by the German federal ministry BMUB under the label 3612R01361. Top objectives of the work are: - Identify relevant event sequences in order to define characteristic initial and boundary conditions - Perform accident analysis of selected sequences - Evaluate the relevance of accident sequences in a qualitative way The accident analysis is performed with the code MELCOR 1.8.6. The applied input data set has been significantly improved compared to previous analyses. The event tree method which is established in PSA level 2 has been applied for creating a structure for a unified summarization and evaluation of the results from the accident analyses. The computer code EVNTRE has been applied for this purpose. In contrast to a PSA level 2, the branching probabilities of the event tree have not been determined with the usual accuracy, but they are given in an approximate way only. For the PWR, the analyses show a considerable protective effect of the containment also in the case of beyond design events. For the BWR, there is a rather high probability for containment failure under core melt impact, but nevertheless the release of radionuclides into the environment is very limited because of plant internal retention mechanisms. This report concludes with remarks about existing knowledge gaps and with regard to core melt sequences, and about possible improvements of the plant safety.

  4. Robust facial landmark detection based on initializing multiple poses

    Directory of Open Access Journals (Sweden)

    Xin Chai

    2016-10-01

    Full Text Available For robot systems, robust facial landmark detection is the first and critical step for face-based human identification and facial expression recognition. In recent years, the cascaded-regression-based method has achieved excellent performance in facial landmark detection. Nevertheless, it still has certain weaknesses, such as high sensitivity to the initialization. To address this problem, regression based on multiple initializations is established in a unified model; face shapes are then estimated independently according to these initializations. With a ranking strategy, the best estimate is selected as the final output. Moreover, a face shape model based on restricted Boltzmann machines is built as a constraint to improve the robustness of ranking. Experiments on three challenging datasets demonstrate the effectiveness of the proposed facial landmark detection method against state-of-the-art methods.

  5. Human performance in an operational event - how to improve it? An initiative in a French NPP

    International Nuclear Information System (INIS)

    Meslin, M.

    1998-01-01

    In the case of the Saint-Laurent-des-Eaux French nuclear power station, the author comments on the elements and principles of the human factors policy which have been implemented and on the organizational implications of this implementation (building up of an internal human factors network), and briefly evokes studies and initiatives aimed at improving the quality of operation from a general point of view and through projects aimed at analysing and valorising human reliability in activities dealing with reactor operation. He also comments on the perception and appropriation of quality in the different departments

  6. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil prices variation. (author)
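
    As a rough illustration of the decomposition step described above, the sketch below splits a synthetic price series into intrinsic mode functions and looks for the mode whose level shifts most around a known event date. It assumes the third-party PyEMD package (installed as EMD-signal); the data, the event, and the simple mode-selection rule are placeholders rather than the paper's actual procedure.

        # Hedged sketch of an EMD-based event analysis on a synthetic series.
        import numpy as np
        from PyEMD import EMD  # assumption: pip install EMD-signal

        rng = np.random.default_rng(0)
        n, event_idx = 500, 250
        t = np.arange(n)
        # Synthetic "price": slow trend + cycle + noise + a jump at the event.
        price = 20 + 0.01 * t + 2 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 0.3, n)
        price[event_idx:] += 5.0

        imfs = EMD().emd(price)   # rows: fine-to-coarse modes, the last one is trend-like

        # Crude impact measure: change in each mode's mean level across the event.
        impact = [abs(imf[event_idx:].mean() - imf[:event_idx].mean()) for imf in imfs]
        dominant = int(np.argmax(impact))
        print(f"{len(imfs)} modes extracted; mode {dominant} shows the largest "
              f"level shift ({impact[dominant]:.2f}) around the event.")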

  7. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil prices variation. (author)

  8. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  9. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on an advanced control for autonomous microgrids. In order to improve the performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security and stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated......

  10. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicted states is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
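
    The decision layer described above can be illustrated with a minimal tabular Q-learning loop on a toy congestion model. The states, actions and dynamics below are invented for illustration and are much simpler than the networked distributed Markov decision process with predicted states used in the paper.

        # Minimal tabular Q-learning sketch for a toy signal-control problem.
        import random

        states = range(5)            # 0 = free flow ... 4 = heavy congestion
        actions = [0, 1]             # 0 = keep current phase, 1 = switch phase
        Q = {(s, a): 0.0 for s in states for a in actions}
        alpha, gamma, epsilon = 0.1, 0.9, 0.1

        def step(state, action):
            """Toy dynamics: switching tends to relieve congestion, keeping may worsen it."""
            drift = -1 if action == 1 else random.choice([0, 1])
            next_state = min(max(state + drift, 0), 4)
            reward = -next_state     # penalise congestion
            return next_state, reward

        state = 2
        for _ in range(20000):
            if random.random() < epsilon:
                action = random.choice(actions)
            else:
                action = max(actions, key=lambda a: Q[(state, a)])
            next_state, reward = step(state, action)
            best_next = max(Q[(next_state, a)] for a in actions)
            Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
            state = next_state

        # Greedy policy learned per congestion level.
        print({s: max(actions, key=lambda a: Q[(s, a)]) for s in states})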

  11. Risk assessment of K basin twelve-inch drain valve failure from a postulated seismic initiating event

    International Nuclear Information System (INIS)

    MORGAN, R.G.

    1999-01-01

    The Spent Nuclear Fuel (SNF) Project will transfer metallic SNF from the Hanford 105 K-East and 105 K-West Basins to safe interim storage in the Canister Storage Building in the 200 Area. The initial basis for design, fabrication, installation, and operation of the fuel removal systems was that the basin leak rates which could result from a postulated accident condition would not be excessive relative to reasonable recovery operations. However, an additional potential K Basin water leak path is through the K Basin drain valves. Three twelve-inch drain valves are located in the main basin bays along the north wall. The sumps containing the valves are filled with concrete which covers the drain valve body. Visual observations suggest that only the valve's bonnet and stem are exposed above the basin concrete floor. It was recognized, however, that damage of the drain valve bonnet or stem during a seismic initiating event could provide a potential K Basin water leak path. The objectives of this activity are to: (1) evaluate the risk of damaging the three twelve-inch drain valves located along the north wall of the main basin from a seismic initiating event, and (2) determine the associated potential leak rate from a damaged valve

  12. Risk assessment of K basin twelve-inch drain valve failure from a postulated seismic initiating event

    Energy Technology Data Exchange (ETDEWEB)

    MORGAN, R.G.

    1999-04-06

    The Spent Nuclear Fuel (SNF) Project will transfer metallic SNF from the Hanford 105 K-East and 105 K-West Basins to safe interim storage in the Canister Storage Building in the 200 Area. The initial basis for design, fabrication, installation, and operation of the fuel removal systems was that the basin leak rates which could result from a postulated accident condition would not be excessive relative to reasonable recovery operations. However, an additional potential K Basin water leak path is through the K Basin drain valves. Three twelve-inch drain valves are located in the main basin bays along the north wall. The sumps containing the valves are filled with concrete which covers the drain valve body. Visual observations suggest that only the valve's bonnet and stem are exposed above the basin concrete floor. It was recognized, however, that damage of the drain valve bonnet or stem during a seismic initiating event could provide a potential K Basin water leak path. The objectives of this activity are to: (1) evaluate the risk of damaging the three twelve-inch drain valves located along the north wall of the main basin from a seismic initiating event, and (2) determine the associated potential leak rate from a damaged valve.

  13. Initial events in the cellular effects of ionizing radiations: clustered damage in DNA

    International Nuclear Information System (INIS)

    Goodhead, D.T.

    1994-01-01

    Ionizing radiations produce many hundreds of different simple chemical products in DNA and also multitudes of possible clustered combinations. The simple products, including single-strand breaks, tend to correlate poorly with biological effectiveness. Even for initial double-strand breaks, as a broad class, there is apparently little or no increase in yield with increasing ionization density, in contrast with the large rise in relative biological effectiveness for cellular effects. Track structure analysis has revealed that clustered DNA damage of severity greater than simple double-strand breaks is likely to occur at biologically relevant frequencies with all ionizing radiations. Studies are in progress to describe in more detail the chemical nature of these clustered lesions and to consider the implications for cellular repair. (author)

  14. Central Italy magnetotelluric investigation. Structures and relations to seismic events: analysis of initial data

    Directory of Open Access Journals (Sweden)

    J. Marianiuk

    1996-06-01

    Full Text Available A scientific collaboration between the Warsaw Academy of Science (Poland) and the National Institute of Geophysics (Italy) gave rise to the installation of a few stations for the long-term measurement of magnetotelluric fields in central Italy. The selection of investigation sites was determined by the individual seismic interest of each location. The project began in the summer of 1991, with the installation of 2 magnetotelluric stations in the province of Isernia (Collemeluccio and Montedimezzo). In 1992, 2 more stations became operative, one in the province of Rieti (Fassinoro), the other in the province of L'Aquila (S. Vittoria). For the purpose of this project, the magnetic observatory in L'Aquila was also equipped with electric lines for the measurement of the telluric field. The aim of the analysis presented here is to show that it is possible to follow the temporal evolution of characteristic magnetotelluric parameters. At Collemeluccio this evolution was compared with the seismic energy released by events recorded within the study area.

  15. Tendencies in human factor influence on initiating events occurrence in NPP Kozloduy

    International Nuclear Information System (INIS)

    Hristova, R.

    2001-01-01

    An overview of the methods and documents concerning the human factor in nuclear safety is presented, together with the selection of the most appropriate methods and concepts for human factor assessment in the events reported at Kozloduy NPP. A list of human error types and statistical data (the mean time between similar errors, the human error rate λ, the number of occurrences, etc.) is given. Some general results from the investigation of human error behaviour for all units of Kozloduy NPP, related to four personnel categories (management personnel, designers, operating personnel and maintenance personnel), are also shown. Finally, the following conclusions are made: operating personnel errors amount to 18% (for comparison, for the same personnel category in similar NPPs abroad this value is between 10% and 30%); human errors at Kozloduy NPP tend to increase after 1990; only for the operating personnel was a maximum observed near 1997, after which the error values decreased; at the beginning of 2000 the reliability characteristics for all units have similar values; and the observed tendencies should be taken into account so that measures can be taken to reduce the most important error types for Kozloduy NPP personnel.

  16. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
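
    The core mechanism of such a tool, an event list driving the simulation clock, can be sketched in a few lines. The single-machine queue below is a deliberately tiny stand-in for SIMMEK's job-shop models; arrival and service parameters are illustrative only.

        # Minimal discrete event simulation with an event list: jobs arrive at
        # random, queue for a single machine, and we track throughput and queue length.
        import heapq
        import random

        random.seed(1)
        SIM_TIME = 10_000.0
        events = []                       # (time, kind) event list
        heapq.heappush(events, (random.expovariate(1 / 10), "arrival"))

        queue = 0
        busy = False
        completed = 0
        area_queue = 0.0                  # time-integral of queue length
        last_t = 0.0

        while events:
            t, kind = heapq.heappop(events)
            if t > SIM_TIME:
                break
            area_queue += queue * (t - last_t)
            last_t = t
            if kind == "arrival":
                heapq.heappush(events, (t + random.expovariate(1 / 10), "arrival"))
                if busy:
                    queue += 1
                else:
                    busy = True
                    heapq.heappush(events, (t + random.uniform(6, 10), "departure"))
            else:                         # departure
                completed += 1
                if queue:
                    queue -= 1
                    heapq.heappush(events, (t + random.uniform(6, 10), "departure"))
                else:
                    busy = False

        print(f"throughput = {completed / last_t:.3f} jobs per time unit, "
              f"average queue length = {area_queue / last_t:.2f}")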

  17. Aldosterone Does Not Predict Cardiovascular Events Following Acute Coronary Syndrome in Patients Initially Without Heart Failure.

    Science.gov (United States)

    Pitts, Reynaria; Gunzburger, Elise; Ballantyne, Christie M; Barter, Philip J; Kallend, David; Leiter, Lawrence A; Leitersdorf, Eran; Nicholls, Stephen J; Shah, Prediman K; Tardif, Jean-Claude; Olsson, Anders G; McMurray, John J V; Kittelson, John; Schwartz, Gregory G

    2017-01-10

    Aldosterone may have adverse effects in the myocardium and vasculature. Treatment with an aldosterone antagonist reduces cardiovascular risk in patients with acute myocardial infarction complicated by heart failure (HF) and left ventricular systolic dysfunction. However, most patients with acute coronary syndrome do not have advanced HF. Among such patients, it is unknown whether aldosterone predicts cardiovascular risk. To address this question, we examined data from the dal-OUTCOMES trial that compared the cholesteryl ester transfer protein inhibitor dalcetrapib with placebo, beginning 4 to 12 weeks after an index acute coronary syndrome. Patients with New York Heart Association class II (with reduced left ventricular ejection fraction) or higher heart failure were excluded. The primary outcome was a composite of coronary heart disease death, nonfatal myocardial infarction, stroke, hospitalization for unstable angina, or resuscitated cardiac arrest. Hospitalization for HF was a secondary endpoint. Over a median follow-up of 37 months, the primary outcome occurred in 366 patients (9.0%), and hospitalization for HF occurred in 72 patients (1.8%). There was no association between aldosterone and either the time to first occurrence of a primary outcome (hazard ratio for doubling of aldosterone 0.92, 95% confidence interval 0.78-1.09, P=0.34) or hospitalization for HF (hazard ratio 1.38, 95% CI 0.96-1.99, P=0.08) in Cox regression models adjusted for covariates. In patients with recent acute coronary syndrome but without advanced HF, aldosterone does not predict major cardiovascular events. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00658515. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  18. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development, can result in excessive amounts of eroded sediments entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa, Coshocton, Ohio and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
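
    A minimal sketch of the regression idea is given below: an event-based multiple linear regression on USLE-style predictors fitted by ordinary least squares. The predictors, coefficients and data are synthetic; the paper's fitted MLR and ANN models and its monitoring datasets are not reproduced here.

        # Hedged sketch of an event-based MLR on USLE-style predictors
        # (rainfall erosivity EI30, slope-length factor LS, cover factor C).
        import numpy as np

        rng = np.random.default_rng(42)
        n_events = 60
        EI30 = rng.uniform(5, 200, n_events)       # event rainfall erosivity
        LS   = rng.uniform(0.5, 3.0, n_events)     # slope length-steepness factor
        C    = rng.uniform(0.05, 1.0, n_events)    # cover-management factor

        # "True" event soil loss, used only to generate example observations.
        soil_loss = 0.02 * EI30 * LS * C + rng.normal(0, 0.5, n_events)

        # MLR on log-transformed multiplicative predictors (a common USLE-type form).
        X = np.column_stack([np.ones(n_events), np.log(EI30), np.log(LS), np.log(C)])
        y = np.log(np.clip(soil_loss, 1e-3, None))
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("intercept and exponents on EI30, LS, C:", np.round(coef, 3))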

  19. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for the estimation of the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiments and software simulation research were performed at various pulse durations and spot sizes. A correlation of the single event threshold LET with the upset threshold laser energy under local irradiation was found. A computer analysis of local laser irradiation of IC structures was developed for SEU threshold LET estimation, and the correlation of the local laser threshold energy with the SEU threshold LET was shown. Two estimation techniques were suggested. The first one is based on the determination of the local laser threshold dose, taking into account the ratio of the sensitive area to the locally irradiated area. The second technique uses the photocurrent peak value instead of this ratio. The agreement between the predicted and experimental results demonstrates the applicability of this approach. (authors)

  20. Characterization of initial events in bacterial surface colonization by two Pseudomonas species using image analysis.

    Science.gov (United States)

    Mueller, R F; Characklis, W G; Jones, W L; Sears, J T

    1992-05-01

    The processes leading to bacterial colonization on solid-water interfaces are adsorption, desorption, growth, and erosion. These processes have been measured individually in situ in a flowing system in real time using image analysis. Four different substrata (copper, silicon, 316 stainless steel and glass) and 2 different bacterial species (Pseudomonas aeruginosa and Pseudomonas fluorescens) were used in the experiments. The flow was laminar (Re = 1.4) and the shear stress was kept constant during all experiments at 0.75 N m(-2). The surface roughness varied among the substrata from 0.002 microm (for silicon) to 0.015 microm (for copper). Surface free energies varied from 25.1 dynes cm(-1) for silicon to 31.2 dynes cm(-1) for copper. Cell surface hydrophobicity, reported as hydrocarbon partitioning values, ranged from 0.67 for Ps. fluorescens to 0.97 for Ps. aeruginosa. The adsorption rate coefficient varied by as much as a factor of 10 among the combinations of bacterial strain and substratum material, and was positively correlated with surface free energy, the surface roughness of the substratum, and the hydrophobicity of the cells. The probability of desorption decreased with increasing surface free energy and surface roughness of the substratum. Cell growth was inhibited on copper, but replication of cells overlying an initial cell layer was observed with increased exposure time to the cell-containing bulk water. A mathematical model describing cell accumulation on a substratum is presented.
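
    The kind of accumulation balance referred to in the last sentence can be sketched as a simple rate equation combining adsorption, desorption, growth and erosion; the rate constants and units below are arbitrary placeholders rather than the study's fitted values.

        # Illustrative cell-accumulation balance (placeholder rates, arbitrary units).
        import numpy as np

        k_ads, k_des, mu, k_ero = 0.5, 0.1, 0.3, 0.05   # h^-1 (placeholder rates)
        bulk_cells = 1.0e6                               # bulk concentration (placeholder)

        dt, hours = 0.01, 24.0
        t = np.arange(0.0, hours, dt)
        X = np.zeros_like(t)                             # attached cell density

        for i in range(1, len(t)):
            dX = (k_ads * bulk_cells / 1e6      # adsorption from the bulk liquid
                  - k_des * X[i - 1]            # desorption back to the bulk
                  + mu * X[i - 1]               # growth of attached cells
                  - k_ero * X[i - 1])           # erosion by fluid shear
            X[i] = X[i - 1] + dt * dX

        print(f"surface density after {hours:.0f} h: {X[-1]:.2f} (arbitrary units)")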

  1. Impact of soil moisture initialization on boreal summer subseasonal forecasts: mid-latitude surface air temperature and heat wave events

    Science.gov (United States)

    Seo, Eunkyo; Lee, Myong-In; Jeong, Jee-Hoon; Koster, Randal D.; Schubert, Siegfried D.; Kim, Hye-Mi; Kim, Daehyun; Kang, Hyun-Suk; Kim, Hyun-Kyung; MacLachlan, Craig; Scaife, Adam A.

    2018-05-01

    This study uses a global land-atmosphere coupled model, the land-atmosphere component of the Global Seasonal Forecast System version 5, to quantify the degree to which soil moisture initialization could potentially enhance boreal summer surface air temperature forecast skill. Two sets of hindcast experiments are performed by prescribing the observed sea surface temperature as the boundary condition for a 15-year period (1996-2010). In one set of the hindcast experiments (noINIT), the initial soil moisture conditions are randomly taken from a long-term simulation. In the other set (INIT), the initial soil moisture conditions are taken from an observation-driven offline Land Surface Model (LSM) simulation. The soil moisture conditions from the offline LSM simulation are calibrated using the forecast model statistics to minimize the inconsistency between the LSM and the land-atmosphere coupled model in their mean and variability. Results show a higher boreal summer surface air temperature prediction skill in INIT than in noINIT, demonstrating the potential benefit from an accurate soil moisture initialization. The forecast skill enhancement appears especially in the areas in which the evaporative fraction (the ratio of surface latent heat flux to net surface incoming radiation) is sensitive to soil moisture amount. These areas lie in the transitional regime between humid and arid climates. Examination of the extreme 2003 European and 2010 Russian heat wave events reveals that the regionally anomalous soil moisture conditions during the events played an important role in maintaining the stationary circulation anomalies, especially those near the surface.

  2. Unique base-initiated depolymerization of limonene-derived polycarbonates

    NARCIS (Netherlands)

    Li, C.; Sablong, R.J.; van Benthem, R.A.T.M.; Koning, C.E.

    2017-01-01

    The depolymerization of poly(limonene carbonate) (PLC) initiated by 1,5,7-triazabicyclo[4.4.0]dec-5-ene (TBD) was investigated. The strong organic base TBD was capable of deprotonating the OH-terminated PLC, leading to fast degradation via backbiting reactions at high temperature. An interesting

  3. Software-Based Student Response Systems: An Interdisciplinary Initiative

    Science.gov (United States)

    Fischer, Carol M.; Hoffman, Michael S.; Casey, Nancy C.; Cox, Maureen P.

    2015-01-01

    Colleagues from information technology and three academic departments collaborated on an instructional technology initiative to employ student response systems in classes in mathematics, accounting and education. The instructors assessed the viability of using software-based systems to enable students to use their own devices (cell phones,…

  4. Biometrics based authentication scheme for session initiation protocol

    OpenAIRE

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to smart card stolen attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when co...

  5. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM), or at a specific time (i.e., time-based PM), while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining the intention active in memory; and target checking, engaged for verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific for the event-based PM; that is, the "readiness mode".

  6. Coupled prediction of flood response and debris flow initiation during warm and cold season events in the Southern Appalachians, USA

    Science.gov (United States)

    Tao, J.; Barros, A. P.

    2013-07-01

    Debris flows associated with rainstorms are a frequent and devastating hazard in the Southern Appalachians in the United States. Whereas warm season events are clearly associated with heavy rainfall intensity, the same cannot be said for the cold season events. Instead, there is a relationship between large (cumulative) rainfall events independently of season, and thus hydrometeorological regime, and debris flows. This suggests that the dynamics of subsurface hydrologic processes play an important role as a trigger mechanism, specifically through soil moisture redistribution by interflow. The first objective of this study is to investigate this hypothesis. The second objective is to assess the physical basis for a regional coupled flood prediction and debris flow warning system. For this purpose, uncalibrated model simulations of well-documented debris flows in headwater catchments of the Southern Appalachians using a 3-D surface-groundwater hydrologic model coupled with slope stability models are examined in detail. Specifically, we focus on two vulnerable headwater catchments that experience frequent debris flows, the Big Creek and the Jonathan Creek in the Upper Pigeon River Basin, North Carolina, and three distinct weather systems: an extremely heavy summertime convective storm in 2011; a persistent winter storm lasting several days; and a severe winter storm in 2009. These events were selected due to the optimal availability of rainfall observations, availability of detailed field surveys of the landslides shortly after they occurred, which can be used to evaluate model predictions, and because they are representative of events that cause major economic losses in the region. The model results substantiate that interflow is a useful prognostic of conditions necessary for the initiation of slope instability, and should therefore be considered explicitly in landslide hazard assessments. Moreover, the relationships between slope stability and interflow are

  7. Identification of human-induced initiating events in the low power and shutdown operation using the commission error search and assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Chan; Kim, Jong Hyun [KEPCO International Nuclear Graduate School (KINGS), Ulsan (Korea, Republic of)

    2015-03-15

    Human-induced initiating events, also called Category B actions in human reliability analysis, are operator actions that may lead directly to initiating events. Most conventional probabilistic safety analyses typically assume that the frequency of initiating events also includes the probability of human-induced initiating events. However, some regulatory documents require Category B actions to be specifically analyzed and quantified in probabilistic safety analysis. An explicit modeling of Category B actions could also potentially lead to important insights into human performance in terms of safety. However, there is no standard procedure to identify Category B actions. This paper describes a systematic procedure to identify Category B actions for low power and shutdown conditions. The procedure includes several steps to determine operator actions that may lead to initiating events in the low power and shutdown stages. These steps are the selection of initiating events, the selection of systems or components, the screening of unlikely operating actions, and the quantification of initiating events. The procedure also provides detailed instructions for each step, such as the operator actions, the information required, the screening rules, and the outputs. Finally, the applicability of the suggested approach is also investigated by application to a plant example.

  8. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  9. 76 FR 25328 - New Mexico Green Initiatives, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2011-05-04

    ... Mexico Green Initiatives, LLC's application for market-based rate authority, with an accompanying rate... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-3431-000] New Mexico Green Initiatives, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

  10. VLSI-based video event triggering for image data compression

    Science.gov (United States)

    Williams, Glenn L.

    1994-02-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.
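
    A software caricature of the triggering idea is sketched below: a frame is flagged when its mean absolute difference from a slowly updated reference exceeds a threshold. This is only an illustration of the concept; the paper's VLSI state machine and fuzzy logic devices are not modelled here, and the video data and threshold are synthetic.

        # Toy video event trigger based on change against a running reference frame.
        import numpy as np

        rng = np.random.default_rng(0)
        frames = rng.normal(100, 2, size=(200, 64, 64))   # synthetic video, noise only
        frames[120:] += 15                                 # an "event" appears at frame 120

        reference = frames[0].copy()
        alpha, threshold = 0.05, 5.0
        for i, frame in enumerate(frames):
            if np.mean(np.abs(frame - reference)) > threshold:
                print(f"trigger at frame {i}")
                break
            reference = (1 - alpha) * reference + alpha * frame   # slow reference update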

  11. Event-based proactive interference in rhesus monkeys.

    Science.gov (United States)

    Devkar, Deepna T; Wright, Anthony A

    2016-10-01

    Three rhesus monkeys (Macaca mulatta) were tested in a same/different memory task for proactive interference (PI) from prior trials. PI occurs when a previous sample stimulus appears as a test stimulus on a later trial, does not match the current sample stimulus, and the wrong response "same" is made. Trial-unique pictures (scenes, objects, animals, etc.) were used on most trials, except on trials where the test stimulus matched a potentially interfering sample stimulus from a prior trial (1, 2, 4, 8, or 16 trials prior). Greater interference occurred when fewer trials separated interference and test. PI functions showed a continuum of interference. Delays between sample and test stimuli and intertrial intervals were manipulated to test how PI might vary as a function of elapsed time. Contrary to a similar study with pigeons, these time manipulations had no discernible effect on the monkeys' PI, as shown by complete overlap of PI functions with no statistical differences or interactions. These results suggested that interference was strictly based upon the number of intervening events (trials with other pictures) without regard to elapsed time. The monkeys' apparent event-based interference was further supported by retesting with a novel set of 1,024 pictures. PI from novel pictures 1 or 2 trials prior was greater than PI from familiar pictures (a familiar set of 1,024 pictures). Moreover, when potentially interfering novel stimuli were 16 trials prior, performance accuracy was actually greater than accuracy on baseline trials (no interference), suggesting that remembering stimuli from 16 trials prior was a cue that this stimulus was not the sample stimulus on the current trial, a somewhat surprising conclusion particularly given monkeys.

  12. The effectiveness of the cardiovascular disease prevention programme 'KardioPro' initiated by a German sickness fund: a time-to-event analysis of routine data.

    Directory of Open Access Journals (Sweden)

    Sabine Witt

    Full Text Available Cardiovascular disease is the leading cause of morbidity and mortality in the developed world. To reduce this burden of disease, a German sickness fund ('Siemens-Betriebskrankenkasse', SBK) initiated the prevention programme 'KardioPro', including primary (risk factor reduction) and secondary (screening) prevention and guideline-based treatment. The aim of this study was to assess the effectiveness of 'KardioPro' as it is implemented in the real world. The study is based on sickness fund routine data. The control group was selected from non-participants via propensity score matching. The study analysis was based on time-to-event analysis via Cox proportional hazards regression with the endpoints 'all-cause mortality, acute myocardial infarction (MI) and ischemic stroke' (1), 'all-cause mortality' (2) and 'non-fatal acute MI and ischemic stroke' (3). A total of 26,202 insurants were included, 13,101 participants and 13,101 control subjects. 'KardioPro' enrollment was associated with risk reductions of 23.5% (95% confidence interval (CI) 13.0-32.7%) (1), 41.7% (95% CI 30.2-51.2%) (2) and 3.5% (hazard ratio 0.965, 95% CI 0.811-1.148) (3). This corresponds to an absolute risk reduction of 0.29% (1), 0.31% (2) and 0.03% (3) per year. The prevention programme initiated by a German statutory sickness fund appears to be effective with regard to all-cause mortality. The non-significant reduction in non-fatal events might result from a shift from fatal to non-fatal events.

  13. Event Completion: Event Based Inferences Distort Memory in a Matter of Seconds

    Science.gov (United States)

    Strickland, Brent; Keil, Frank

    2011-01-01

    We present novel evidence that implicit causal inferences distort memory for events only seconds after viewing. Adults watched videos of someone launching (or throwing) an object. However, the videos omitted the moment of contact (or release). Subjects falsely reported seeing the moment of contact when it was implied by subsequent footage but did…

  14. A Network of AOPs for reduced thyroid hormone synthesis derived from inhibition of Thyroperoxidase - A common Molecular Initiating Event Leading to Species-Specific Indices of Adversity.

    Science.gov (United States)

    This collection of 3 AOPs describes varying outcomes of adversity dependent upon species in response to inhibition of thyroperoxidase (TPO) during development. Chemical inhibition of TPO, the molecular-initiating event (MIE), results in decreased thyroid hormone (TH) synthesis, a...

  15. Modeling time to recovery and initiating event frequency for loss of off-site power incidents at nuclear power plants

    International Nuclear Information System (INIS)

    Iman, R.L.; Hora, S.C.

    1988-01-01

    Industry data representing the time to recovery of loss of off-site power at nuclear power plants for 63 incidents caused by plant-centered losses, grid losses, or severe weather losses are fit with exponential, lognormal, gamma and Weibull probability models. A Bayesian analysis is used to compare the adequacy of each of these models and to provide uncertainty bounds on each of the fitted models. A composite model that combines the probability models fitted to each of the three sources of data is presented as a method for predicting the time to recovery of loss of off-site power. The composite model is very general and can be made site specific by making adjustments on the models used, such as might occur due to the type of switchyard configuration or type of grid, and by adjusting the weights on the individual models, such as might occur with weather conditions existing at a particular plant. Adjustments in the composite model are shown for different models used for switchyard configuration and for different weights due to weather. Bayesian approaches are also presented for modeling the frequency of initiating events leading to loss of off-site power. One Bayesian model assumes that all plants share a common incidence rate for loss of off-site power, while the other Bayesian approach models the incidence rate for each plant relative to the incidence rates of all other plants. Combining the Bayesian models for the frequency of the initiating events with the composite Bayesian model for recovery provides the necessary vehicle for a complete model that incorporates uncertainty into a probabilistic risk assessment
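
    The distribution-fitting and composite-model ideas can be sketched as follows: candidate recovery-time distributions are fitted to (here synthetic) data and combined with weights into a single probability that off-site power has not yet been recovered by time t. The weights, which in the paper reflect plant-centered, grid and weather sub-models and site-specific adjustments, are placeholders.

        # Hedged sketch: fit candidate recovery-time distributions and combine them
        # into a weighted composite survival model (synthetic data, placeholder weights).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        recovery_hours = rng.lognormal(mean=0.0, sigma=1.0, size=63)  # synthetic data

        candidates = {
            "exponential": stats.expon,
            "lognormal":   stats.lognorm,
            "gamma":       stats.gamma,
            "weibull":     stats.weibull_min,
        }
        fits = {name: dist.fit(recovery_hours, floc=0) for name, dist in candidates.items()}

        # Illustrative composite weights over three of the fitted forms.
        weights = {"lognormal": 0.6, "gamma": 0.3, "weibull": 0.1}

        def prob_not_recovered(t_hours):
            """Weighted survival function P(recovery time > t)."""
            return sum(w * candidates[name].sf(t_hours, *fits[name])
                       for name, w in weights.items())

        for t in (1, 4, 8, 24):
            print(f"P(not recovered after {t:>2} h) = {prob_not_recovered(t):.3f}")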

  16. Biometrics based authentication scheme for session initiation protocol.

    Science.gov (United States)

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to smart card stolen attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.

  17. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow the specification, distributed execution and verification of pervasive event...

  18. SPEED : a semantics-based pipeline for economic event detection

    NARCIS (Netherlands)

    Hogenboom, F.P.; Hogenboom, A.C.; Frasincar, F.; Kaymak, U.; Meer, van der O.; Schouten, K.; Vandic, D.; Parsons, J.; Motoshi, S.; Shoval, P.; Woo, C.; Wand, Y.

    2010-01-01

    Nowadays, emerging news on economic events such as acquisitions has a substantial impact on the financial markets. Therefore, it is important to be able to automatically and accurately identify events in news items in a timely manner. For this, one has to be able to process a large amount of

  19. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    textabstractAs today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  20. Logical Discrete Event Systems in a trace theory based setting

    NARCIS (Netherlands)

    Smedinga, R.

    1993-01-01

    Discrete event systems can be modelled using a triple consisting of some alphabet (representing the events that might occur), and two trace sets (sets of possible strings) denoting the possible behaviour and the completed tasks of the system. Using this definition we are able to formulate and solve

  1. Management initiatives in a community-based health insurance scheme.

    Science.gov (United States)

    Sinha, Tara; Ranson, M Kent; Chatterjee, Mirai; Mills, Anne

    2007-01-01

    Community-based health insurance (CBHI) schemes have developed in response to inadequacies of alternate systems for protecting the poor against health care expenditures. Some of these schemes have arisen within community-based organizations (CBOs), which have strong links with poor communities, and are therefore well situated to offer CBHI. However, the managerial capacities of many such CBOs are limited. This paper describes management initiatives undertaken in a CBHI scheme in India, in the course of an action-research project. The existing structures and systems at the CBHI had several strengths, but fell short on some counts, which became apparent in the course of planning for two interventions under the research project. Management initiatives were introduced that addressed four features of the CBHI, viz. human resources, organizational structure, implementation systems, and data management. Trained personnel were hired and given clear roles and responsibilities. Lines of reporting and accountability were spelt out, and supportive supervision was provided to team members. The data resources of the organization were strengthened for greater utilization of this information. While the changes that were introduced took some time to be accepted by team members, the commitment of the CBHI's leadership to these initiatives was critical to their success. Copyright (c) 2007 John Wiley & Sons, Ltd.

  2. A model-based approach to operational event groups ranking

    Energy Technology Data Exchange (ETDEWEB)

    Simic, Zdenko [European Commission Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport; Maqua, Michael [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2014-04-15

    The operational experience (OE) feedback provides improvements in all industrial activities. Identification of the most important and valuable groups of events within accumulated experience is important in order to focus on a detailed investigation of events. The paper describes the new ranking method and compares it with three others. Methods have been described and applied to OE events utilised by nuclear power plants in France and Germany for twenty years. The results show that different ranking methods only roughly agree on which of the event groups are the most important ones. In the new ranking method the analytical hierarchy process is applied in order to assure consistent and comprehensive weighting determination for ranking indexes. The proposed method allows a transparent and flexible event groups ranking and identification of the most important OE for further more detailed investigation in order to complete the feedback. (orig.)
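
    The analytic hierarchy process step mentioned above can be illustrated with a small sketch: ranking-index weights are taken as the normalized principal eigenvector of a pairwise comparison matrix. The indexes and the comparison judgements below are hypothetical and not those used in the paper.

        # Minimal AHP weighting sketch for event-group ranking indexes.
        import numpy as np

        indexes = ["safety significance", "frequency", "trend", "cost of fix"]
        # Saaty-style pairwise comparisons: A[i, j] = importance of i relative to j.
        A = np.array([
            [1.0, 3.0, 5.0, 7.0],
            [1/3, 1.0, 3.0, 5.0],
            [1/5, 1/3, 1.0, 3.0],
            [1/7, 1/5, 1/3, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, principal].real)
        weights /= weights.sum()          # normalized weights for the ranking indexes

        for name, w in zip(indexes, weights):
            print(f"{name:22s} {w:.3f}")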

  3. Prediction problem for target events based on the inter-event waiting time

    Science.gov (United States)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we are focusing on. In the series of earthquakes, as McCann et al. show [W.R. Mc Cann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, usually there are no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound of prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets up a lower bound of efficiency for effective prediction of stock market crashes.
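
    The flavour of such gap-based prediction can be illustrated with a simple empirical-window rule: after each target event an alarm is declared over a window chosen from the observed gap distribution, and its hit rate is compared with the fraction of time spent in alarm. The gap data and window choice below are synthetic and much cruder than the algorithm and bounds derived in the paper.

        # Toy waiting-time-based alarm rule on synthetic inter-event gaps.
        import numpy as np

        rng = np.random.default_rng(1)
        gaps = rng.gamma(shape=4.0, scale=2.0, size=2000)    # synthetic inter-event gaps

        lo, hi = np.percentile(gaps, [10, 60])               # alarm window after each event
        hits = np.mean((gaps >= lo) & (gaps <= hi))          # fraction of next events caught
        alarm_fraction = (hi - lo) / gaps.mean()             # rough cost: alarm time per gap

        print(f"window = [{lo:.2f}, {hi:.2f}], hit rate = {hits:.2f}, "
              f"alarm duration / mean gap = {alarm_fraction:.2f}")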

  4. Automatic Classification of volcano-seismic events based on Deep Neural Networks.

    Science.gov (United States)

    Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Seismic monitoring of active volcanoes is a popular remote sensing technique to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers, and can discover intrinsic patterns from the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to an efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoder and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy and encode the signal, regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increase robustness against noisy inputs, and provide better generalization. These results demonstrate that deep neural networks are robust classifiers, and can be deployed in real environments to monitor the seismicity of restless volcanoes.
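
    A much simplified stand-in for the classification stage is sketched below: a small multilayer perceptron trained on synthetic "LPC-style" feature vectors (5 coefficients over 3 segments). It uses scikit-learn rather than the stacked denoising autoencoders and deep belief networks evaluated in the paper, and the data and seven classes are artificial.

        # Simplified stand-in classifier for synthetic volcano-seismic feature vectors.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        n_per_class, n_classes, n_features = 200, 7, 15   # 5 LPC coeffs x 3 segments

        # Synthetic class-dependent feature clusters.
        X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
                       for c in range(n_classes)])
        y = np.repeat(np.arange(n_classes), n_per_class)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")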

  5. Does short-term virologic failure translate to clinical events in antiretroviral-naïve patients initiating antiretroviral therapy in clinical practice?

    NARCIS (Netherlands)

    Mugavero, Michael J; May, Margaret; Harris, Ross; Saag, Michael S; Costagliola, Dominique; Egger, Matthias; Phillips, Andrew; Günthard, Huldrych F; Dabis, Francois; Hogg, Robert; de Wolf, Frank; Fatkenheuer, Gerd; Gill, M John; Justice, Amy; D'Arminio Monforte, Antonella; Lampe, Fiona; Miró, Jose M; Staszewski, Schlomo; Sterne, Jonathan A C; Niesters, Bert

    2008-01-01

    OBJECTIVE: To determine whether differences in short-term virologic failure among commonly used antiretroviral therapy (ART) regimens translate to differences in clinical events in antiretroviral-naïve patients initiating ART. DESIGN: Observational cohort study of patients initiating ART between

  6. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposes a dietary management approach for major sports events based on the "HACCP" management model, in line with the characteristics of catering activities at such events. Major sports events are no longer just showcases of high-level competitive sport; they have become comprehensive and complex special events involving social, political, economic, cultural and other factors, and they are expected to reach increasingly diverse economic, political, cultural, technological and other ...

  7. Autocorrel I: A Neural Network Based Network Event Correlation Approach

    National Research Council Canada - National Science Library

    Japkowicz, Nathalie; Smith, Reuben

    2005-01-01

    .... We use the autoassociator to build prototype software to cluster network alerts generated by a Snort intrusion detection system, and discuss how the results are significant, and how they can be applied to other types of network events.

  8. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  9. Evidence and Perspectives on the 24-hour Management of Hypertension: Hemodynamic Biomarker-Initiated 'Anticipation Medicine' for Zero Cardiovascular Event.

    Science.gov (United States)

    Kario, Kazuomi

    There are notable differences between Asians and Westerners regarding hypertension (HTN) and the relationship between HTN and cardiovascular disease (CVD). Asians show greater morning surges in blood pressure (BP) and a steeper slope illustrating the link between higher BP and the risk of CVD events. It is thus particularly important for Asian hypertensives to achieve 24-h BP control, including morning and night-time control. There are three components of 'perfect 24-h BP control:' the 24-h BP level, nocturnal BP dipping, and BP variability (BPV), such as the morning BP surge that can be assessed by ambulatory BP monitoring. The morning BP-guided approach using home BP monitoring (HBPM) is the first step toward perfect 24-h BP control, followed by the control of nocturnal HTN. We have been developing new HBPM devices that can measure nocturnal BP. BPV includes different time-phase variability from the shortest beat-by-beat, positional, diurnal, day-by-day, visit-to-visit, seasonal, and yearly changes. The synergistic resonance of each type of BPV would produce a great dynamic BP surge (resonance hypothesis), which triggers a CVD event, especially in the high-risk patients with systemic hemodynamic atherothrombotic syndrome (SHATS). In the future, the innovative management of HTN based on the simultaneous assessment of the resonance of all of the BPV phenotypes using a beat by beat wearable 'surge' BP monitoring device (WSP) and an information and communication technology (ICT)-based data analysis system will produce a paradigm shift from 'dots' BP management to 'seamless' ultimate individualized 'anticipation medication' for reaching a zero CVD event rate. Copyright © 2016 The Author. Published by Elsevier Inc. All rights reserved.

  10. Integrated analyzing method for the progress event based on subjects and predicates in events

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Munesawa, Yoshiomi

    2014-01-01

    Knowledge extracted by analyzing past mistakes is expected to be used to prevent the recurrence of accidents. Currently, the main analytic style is one in which experts examine individual accident cases in depth, while cross-analysis has so far stopped at extracting the common factors among accident cases. In this study we propose an integrated analyzing method for progress events that enables analysis across accidents. Our method integrates many accident cases by connecting the common keywords, called 'Subjects' and 'Predicates', that are extracted from each progress event in accident cases or near-miss cases. From the integrated accident case data, the method can identify partial risks and analyze and visualize the frequency with which factors cause accidents and the associated risk assessment. Applying the method to PEC-SAFER accident cases identified eight hazardous factors that can again arise from tanks, and visualized the most frequent factors (damage of the tank, 26%, followed by corrosion, 21%) and the highest risks (damage, 3.3 x 10^-2 [risk rank/year], followed by destruction, 2.5 x 10^-2 [risk rank/year]). (author)

  11. Does short-term virologic failure translate to clinical events in antiretroviral-naïve patients initiating antiretroviral therapy in clinical practice?

    DEFF Research Database (Denmark)

    NN, NN; Mugavero, Michael J; May, Margaret

    2008-01-01

    , nevirapine, lopinavir/ritonavir, nelfinavir, or abacavir as third drugs in combination with a zidovudine and lamivudine nucleoside reverse transcriptase inhibitor backbone. MAIN OUTCOME MEASURES: Short-term (24-week) virologic failure (>500 copies/ml) and clinical events within 2 years of ART initiation.......58-2.22), lopinavir/ritonavir (1.32, 95% CI = 1.12-1.57), nelfinavir (3.20, 95% CI = 2.74-3.74), and abacavir (2.13, 95% CI = 1.82-2.50). However, the rate of clinical events within 2 years of ART initiation appeared higher only with nevirapine (adjusted hazard ratio for composite outcome measure 1.27, 95% CI = 1......OBJECTIVE: To determine whether differences in short-term virologic failure among commonly used antiretroviral therapy (ART) regimens translate to differences in clinical events in antiretroviral-naïve patients initiating ART. DESIGN: Observational cohort study of patients initiating ART between...

  12. Area-based initiatives - Engines of planning and policy innovation?

    DEFF Research Database (Denmark)

    Agger, Annika; Norvig Larsen, Jacob

    The paper analyses all available official evaluation studies of Danish place-based urban policy initiatives from mid-1990s through 2010. In addition to this, recent ... studies of local planning culture change are discussed. Main findings are that during the past two decades a general change in planning culture has developed gradually, triggered by urban regeneration full scale experimentation with place-based approaches. Second, planners as well as public administrators ... attitude towards the involvement of local citizens and stakeholders is significantly transformed. While earlier, public participation in planning was mostly restricted to what was lawfully mandatory, the new turn in planning culture demonstrates a practice that goes much further in involving citizens ... and development in planning culture turns out to be a more substantial result than the reduction of social exclusion and economic deprivation.

  13. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365
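
    As an illustration of what a 'categorised, structured representation' of an event and a structured query over extracted events might look like, the following sketch uses invented event instances and field names; it is not tied to any particular extraction system or shared-task format.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class BioEvent:
    """Hypothetical structured representation of one extracted event."""
    event_type: str             # e.g. "Phosphorylation", "Gene_expression"
    trigger: str                # text span that signalled the event
    theme: str                  # entity the event acts on
    cause: Optional[str] = None # optional causing entity

extracted: List[BioEvent] = [
    BioEvent("Phosphorylation", "phosphorylates", theme="STAT3", cause="JAK2"),
    BioEvent("Gene_expression", "expressed", theme="IL-6"),
    BioEvent("Phosphorylation", "phosphorylation", theme="STAT3"),
]

# A structured semantic query: "all phosphorylation events whose theme is STAT3".
hits = [e for e in extracted
        if e.event_type == "Phosphorylation" and e.theme == "STAT3"]
for event in hits:
    print(event)
```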

  14. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin supported therapy in patients with type 2 diabetes initially prescribed to glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied with the trade-off between data availability, clinical plausibility, and statistical feasibility. Cox' proportional hazards model allows for the evaluation of the outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
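
    A minimal sketch of the kind of analysis advocated here, using the lifelines package: a cause-specific Cox model for the hazard of insulin initiation with delayed (left-truncated) study entry. The data frame and covariates are invented, and the example assumes a recent lifelines release that supports the entry_col and formula arguments.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical register extract: one row per patient, with delayed study entry
# (left truncation), follow-up time, and an indicator for the event of interest
# (initiation of basal insulin); death before initiation would be a competing event.
df = pd.DataFrame({
    "entry_time":   [0.0, 0.5, 1.0, 0.2, 0.0, 0.8],   # years from baseline to register entry
    "time":         [2.0, 3.5, 1.8, 4.0, 0.9, 2.7],   # years from baseline to event/censoring
    "insulin_init": [1,   0,   1,   0,   1,   0],     # 1 = basal insulin initiated
    "age":          [64,  58,  71,  55,  68,  60],
    "hba1c":        [8.1, 7.2, 9.0, 6.8, 8.6, 7.5],
})

# Cause-specific Cox model for the hazard of insulin initiation,
# accounting for delayed entry via entry_col (left truncation).
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="insulin_init",
        entry_col="entry_time", formula="age + hba1c")
cph.print_summary()
```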

  15. Interleukin-1 beta gene deregulation associated with chromosomal rearrangement: A candidate initiating event for murine radiation-myeloid leukemogenesis

    International Nuclear Information System (INIS)

    Silver, A.; Boultwood, J.; Breckon, G.; Masson, W.; Adam, J.; Shaw, A.R.; Cox, R.

    1989-01-01

    The incidence of acute myeloid leukemia (AML) in CBA/H mice following exposure to single acute doses of ionizing radiation has previously been determined. A high proportion of these AMLs are characterized by rearrangement of murine chromosome 2 in the C2 and/or E5-F regions, and there is evidence that these events are a direct consequence of radiation damage to multipotential hemopoietic cells. Using a combination of in situ chromosome hybridization and mRNA analyses, we show that the cytokine gene interleukin-1 beta (IL-1 beta) is encoded in the chromosome 2 F region and is translocated in a chromosome 2---2 rearrangement in an x-ray-induced AML (N36). Also, IL-1 beta is specifically deregulated in N36 and in two other chromosome 2-rearranged AMLs but not in a fourth, which has two cytogenetically normal chromosome 2 copies. We suggest that radiation-induced specific chromosome 2 rearrangement associated with IL-1 beta deregulation may initiate murine leukemogenesis through the uncoupling of normal proliferative control mechanisms in multipotential hemopoietic cells

  16. Guidelines for time-to-event end point definitions in breast cancer trials: results of the DATECAN initiative (Definition for the Assessment of Time-to-event Endpoints in CANcer trials)†.

    Science.gov (United States)

    Gourgou-Bourgade, S; Cameron, D; Poortmans, P; Asselain, B; Azria, D; Cardoso, F; A'Hern, R; Bliss, J; Bogaerts, J; Bonnefoi, H; Brain, E; Cardoso, M J; Chibaudel, B; Coleman, R; Cufer, T; Dal Lago, L; Dalenc, F; De Azambuja, E; Debled, M; Delaloge, S; Filleron, T; Gligorov, J; Gutowski, M; Jacot, W; Kirkove, C; MacGrogan, G; Michiels, S; Negreiros, I; Offersen, B V; Penault Llorca, F; Pruneri, G; Roche, H; Russell, N S; Schmitt, F; Servent, V; Thürlimann, B; Untch, M; van der Hage, J A; van Tienhoven, G; Wildiers, H; Yarnold, J; Bonnetain, F; Mathoulin-Pélissier, S; Bellera, C; Dabakuyo-Yonli, T S

    2015-05-01

    Using surrogate end points for overall survival, such as disease-free survival, is increasingly common in randomized controlled trials. However, the definitions of several of these time-to-event (TTE) end points are imprecisely defined, which limits interpretation and cross-trial comparisons. The estimation of treatment effects may be directly affected by the definitions of end points. The DATECAN initiative (Definition for the Assessment of Time-to-event Endpoints in CANcer trials) aims to provide recommendations for definitions of TTE end points. We report guidelines for randomized cancer clinical trials (RCTs) in breast cancer. A literature review was carried out to identify TTE end points (primary or secondary) reported in publications of randomized trials or guidelines. An international multidisciplinary panel of experts proposed recommendations for the definitions of these end points based on a validated consensus method that formalizes the degree of agreement among experts. Recommended guidelines for the definitions of TTE end points commonly used in RCTs for breast cancer are provided for non-metastatic and metastatic settings. The use of standardized definitions should facilitate comparisons of trial results and improve the quality of trial design and reporting. These guidelines could be of particular interest to those involved in the design, conduct, reporting, or assessment of RCTs. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. Risk assessment of K Basin twelve-inch and four-inch drain valve failure from a postulated seismic initiating event

    Energy Technology Data Exchange (ETDEWEB)

    MORGAN, R.G.

    1999-06-23

    The Spent Nuclear Fuel (SNF) Project will transfer metallic SNF from the Hanford 105 K-East and 105 K-West Basins to safe interim storage in the Canister Storage Building in the 200 Area. The initial basis for design, fabrication, installation, and operation of the fuel removal systems was that the basin leak rate which could result from a postulated accident condition would not be excessive relative to reasonable recovery operations. However, an additional potential K Basin water leak path is through the K Basin drain valves. Three twelve-inch drain valves are located in the main basin bays along the north wall. Five four-inch drain valves are located in the north and south loadout pits (NLOP and SLOP), the weasel pit, the technical viewing pit, and the discharge chute pit. The sumps containing the valves are filled with concrete which covers the drain valve body. Visual observations indicate that only the valve's bonnet and stem are exposed above the basin concrete floor for the twelve-inch drain valve and that much less of the valve's bonnet and stem are exposed above the basin concrete floor for the five four-inch drain valves. It was recognized, however, that damage of the drain valve bonnet or stem during a seismic initiating event could provide a potential K Basin water leak path. The objectives of this analysis are to: (1) evaluate the likelihood of damaging the three twelve-inch drain valves located along the north wall of the main basin and the five four-inch drain valves located in the pits from a seismic initiating event, and (2) determine the likelihood of exceeding a specific consequence (initial leak rate) from a damaged valve. The analysis process is a risk-based uncertainty analysis where each variable is modeled using available information and engineering judgement. The uncertainty associated with each variable is represented by a probability distribution (probability density function). Uncertainty exists because of the inherent
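
    The risk-based uncertainty analysis described above can be sketched as a simple Monte Carlo propagation: each uncertain variable is drawn from a probability distribution and the samples are combined into the quantity of interest. All distributions, parameters and the 50 gpm threshold below are illustrative assumptions, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples

# Each uncertain variable is represented by a probability distribution
# (all distributions and parameters below are illustrative, not from the report).
p_seismic = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)  # annual initiating-event frequency
p_damage_given_seismic = rng.beta(2, 8, size=n)                  # chance a valve bonnet/stem is damaged
leak_rate_given_damage = rng.triangular(1, 10, 100, size=n)      # gpm, if a valve is damaged

# Propagate: annual frequency of exceeding a 50 gpm initial leak rate.
exceeds = leak_rate_given_damage > 50
freq_exceed = p_seismic * p_damage_given_seismic * exceeds

print("mean annual frequency of >50 gpm leak:", freq_exceed.mean())
print("95th percentile:", np.quantile(freq_exceed, 0.95))
```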

  18. Individual Subjective Initiative Merge Model Based on Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Yin-Jie Xu

    2013-01-01

    The merge control models proposed for work zones are classified into two types, the Hard Control Merge (HCM) model and the Soft Control Merge (SCM) model, according to their control intensity, and are compared with a new model, called the Individual Subjective Initiative Merge (ISIM) model, which is based on the linear lane-changing probability strategy in the merging area. The attention of this paper is paid to the positive impact of individual subjective initiative on the whole traffic system. Three models (ISIM, HCM, and SCM) are established and compared with each other by two order parameters, that is, system output and average vehicle travel time. Finally, numerical results show that both ISIM and SCM perform better than HCM. Compared with SCM, the output of ISIM is 20 vehicles per hour higher under the symmetric input condition and is more stable under the asymmetric input condition. Meanwhile, the average travel time of ISIM is 2000 time steps less under the oversaturated input condition.
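
    A toy sketch of the ISIM idea under stated assumptions (not the published model): on a two-lane cellular automaton, vehicles on the closing lane change to the open lane with a probability that grows linearly across the merging area, and system output is counted at the downstream boundary. All parameters are invented for illustration.

```python
import random

L = 100           # cells per lane
MERGE_START = 60  # merging area begins here; lane 1 is closed at its last cell
P_MAX = 0.9       # lane-change probability at the downstream end of the merging area

def lane_change_probability(x):
    """Linear lane-changing probability inside the merging area (toy ISIM-style rule)."""
    if x < MERGE_START:
        return 0.0
    return P_MAX * (x - MERGE_START) / (L - 1 - MERGE_START)

def step(lanes):
    """One update: attempt lane changes, then advance every vehicle one cell if free."""
    # Lane changes from the closing lane (index 1) to the through lane (index 0).
    for x in range(L):
        if lanes[1][x] and not lanes[0][x] and random.random() < lane_change_probability(x):
            lanes[0][x], lanes[1][x] = True, False
    # Parallel advance with unit speed where the next cell was empty.
    new = [[False] * L for _ in range(2)]
    served = 0
    for lane in (0, 1):
        for x in range(L - 1, -1, -1):
            if not lanes[lane][x]:
                continue
            if x == L - 1:
                if lane == 0:
                    served += 1          # vehicle leaves the work zone on the open lane
                else:
                    new[lane][x] = True  # end of the closed lane: vehicle must wait
            elif not lanes[lane][x + 1] and not new[lane][x + 1]:
                new[lane][x + 1] = True
            else:
                new[lane][x] = True
    return new, served

random.seed(0)
lanes = [[random.random() < 0.3 for _ in range(L)] for _ in range(2)]
throughput = 0
for _ in range(1000):
    lanes, served = step(lanes)
    for lane in (0, 1):                  # inject new vehicles at the upstream boundary
        if not lanes[lane][0] and random.random() < 0.5:
            lanes[lane][0] = True
    throughput += served
print("system output:", throughput, "vehicles in 1000 steps")
```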

  19. Studer Group®'s evidence-based leadership initiatives.

    Science.gov (United States)

    Schuller, Kristin A; Kash, Bita A; Gamm, Larry D

    2015-01-01

    The purpose of this paper is to analyze the implementation of an organizational change initiative--Studer Group®'s Evidence-Based Leadership (EBL)--in two large, US health systems by comparing and contrasting the factors associated with successful implementation and sustainability of the EBL initiative. This comparative case study assesses the responses to two pairs of open-ended questions during in-depth qualitative interviews of leaders and managers at both health systems. Qualitative content analysis was employed to identify major themes. Three themes associated with success and sustainability of EBL emerged at both health systems: leadership; culture; and organizational processes. The theme most frequently identified for both success and sustainability of EBL was culture. In contrast, there was a significant decline in salience of the leadership theme as attention shifts from success in implementation of EBL to sustaining EBL long term. Within the culture theme, accountability, and buy-in were most often cited by interviewees as success factors, while sense of accountability, buy-in, and communication were the most reported factors for sustainability. Cultural factors, such as accountability, staff support, and communication are driving forces of success and sustainability of EBL across both health systems. Leadership, a critical factor in several stages of implementation, appears to be less salient as among factors identified as important to longer term sustainability of EBL.

  20. Characterising Event-Based DOM Inputs to an Urban Watershed

    Science.gov (United States)

    Croghan, D.; Bradley, C.; Hannah, D. M.; Van Loon, A.; Sadler, J. P.

    2017-12-01

    Dissolved Organic Matter (DOM) composition in urban streams is dominated by terrestrial inputs after rainfall events. Urban streams have particularly strong terrestrial-riverine connections due to direct input from terrestrial drainage systems. Event driven DOM inputs can have substantial adverse effects on water quality. Despite this, DOM from important catchment sources such as road drains and Combined Sewage Overflows (CSO's) remains poorly characterised within urban watersheds. We studied DOM sources within an urbanised, headwater watershed in Birmingham, UK. Samples from terrestrial sources (roads, roofs and a CSO), were collected manually after the onset of rainfall events of varying magnitude, and again within 24-hrs of the event ending. Terrestrial samples were analysed for fluorescence, absorbance and Dissolved Organic Carbon (DOC) concentration. Fluorescence and absorbance indices were calculated, and Parallel Factor Analysis (PARAFAC) was undertaken to aid sample characterization. Substantial differences in fluorescence, absorbance, and DOC were observed between source types. PARAFAC-derived components linked to organic pollutants were generally highest within road derived samples, whilst humic-like components tended to be highest within roof samples. Samples taken from the CSO generally contained low fluorescence, however this likely represents a dilution effect. Variation within source groups was particularly high, and local land use seemed to be the driving factor for road and roof drain DOM character and DOC quantity. Furthermore, high variation in fluorescence, absorbance and DOC was apparent between all sources depending on event type. Drier antecedent conditions in particular were linked to greater presence of terrestrially-derived components and higher DOC content. Our study indicates that high variations in DOM character occur between source types, and over small spatial scales. Road drains located on main roads appear to contain the poorest

  1. Faith-based initiatives and the challenges of governance.

    Science.gov (United States)

    Biebricher, Thomas

    2011-01-01

    The task of this paper is to offer an analysis of the Faith-Based and Community Initiative (FBCI) established by George W. Bush and continued under the Obama administration based on a critical and decentred approach to governance (networks). The paper starts out by placing FBCI in the context of the welfare reform of 1996 arguing that both share certain basic assumptions, for example, regarding the nature of poverty, and that FBCI can be interpreted as a response to the relative failure of some aspects of the reform of 1996. In what follows, FBCI is analysed as a typical case of (welfare) state restructuring from government to governance. Emphasis is given to the way discourses and traditions such as communitarianism and public choice have shaped the formation of this new governance arrangement in the field of social service delivery in order to strive for a ‘decentring’ of FBCI by drawing attention to actors' beliefs and worldviews. Finally, I argue that it is not least because of a divergence of such views between policy-makers and faith-based organizations that the effect of FBCI remains for the time being limited.

  2. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  3. (When and where) Do extreme climate events trigger extreme ecosystem responses? - Development and initial results of a holistic analysis framework

    Science.gov (United States)

    Hauber, Eva K.; Donner, Reik V.

    2015-04-01

    a seasonal cycle for each quantile of the distribution, which can be used for a fully data-adaptive definition of extremes as exceedances above this time-dependent quantile function. (2) Having thus identified the extreme events, their distribution is analyzed in both space and time. Following a procedure recently proposed by Lloyd-Hughes (2012) and further exploited by Zscheischler et al. (2013), extremes observed at neighboring points in space and time are considered to form connected sets. Investigating the size distribution of these sets provides novel insights into the development and dynamical characteristics of spatio-temporally extended climate and ecosystem extremes. (3) Finally, the timing of such spatio-temporally extended extremes in different climatic as well as ecological variables is tested pairwise to rule out that co-occurrences of extremes have emerged solely due to chance. For this purpose, the recently developed framework of coincidence analysis (Donges et al., 2011; Rammig et al. 2014) is applied. The corresponding analysis allows identifying potential causal linkages between climatic extremes and extreme ecosystem responses and, thus, to study their mechanisms and spatial as well as seasonal distribution in great detail. In this work, the described method is exemplified by using different climate data from the ERA-Interim reanalysis as well as remote sensing-based land surface temperature data. References: Donges et al., PNAS, 108, 20422, 2011 Lloyd-Hughes, Int. J. Climatol., 32, 406, 2012 Rammig et al., Biogeosc. Disc., 11, 2537, 2014 Zscheischler et al., Ecol. Inform., 15, 66, 2013
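
    Step (1) above, defining extremes as exceedances above a time-dependent quantile function, can be sketched as follows with synthetic data; the window width and the 95th percentile are illustrative choices, not those of the study.

```python
import numpy as np

# Synthetic daily series with a seasonal cycle (the data here are simulated;
# in the study, reanalysis or remote-sensing variables would be used instead).
days = np.arange(10 * 365)
doy = days % 365
values = 10 * np.sin(2 * np.pi * doy / 365) + np.random.default_rng(0).normal(0, 2, days.size)

# Step (1): estimate a seasonal cycle for a chosen quantile (here the 95th)
# by pooling all years for each day of year within a +/- 15 day window.
q = 0.95
threshold = np.empty(365)
for d in range(365):
    window = np.abs((doy - d + 182) % 365 - 182) <= 15   # circular day-of-year distance
    threshold[d] = np.quantile(values[window], q)

# Extremes are exceedances above this time-dependent quantile function.
extreme = values > threshold[doy]
print("fraction flagged as extreme:", extreme.mean())    # ~0.05 by construction
```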

  4. Web-based online system for recording and examining events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    The occurrence of events in power plants can result in serious drawbacks for power generation, which makes online recording and examination of events highly important. In this paper an online web-based system is introduced, which records and examines events in power plants. Throughout the paper, the procedures for the design and implementation of this system, its features and the results gained are explained. This system provides a predefined level of online access to all event data for all its users in power plants, dispatching, regional utilities and top-level management. By implementation on the electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online system for recording and examining events offers the following advantages: - Online recording of events in power plants. - Examination of events in regional utilities. - Access to events' data. - Preparation of managerial reports

  5. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-11-13

    that may include users, applications, operating systems, servers, hosts, routers, switches, control planes, and instrumentation planes, many of ... which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology ... configuration errors in environment designs for several cyber range events. The rest of the paper is organized as follows. Section 2 provides an overview of

  6. Advances in in vivo EPR Tooth Biodosimetry: Meeting the targets for initial triage following a large-scale radiation event

    International Nuclear Information System (INIS)

    Flood, Ann Barry; Schreiber, Wilson; Du, Gaixin; Wood, Victoria A.; Kmiec, Maciej M.; Petryakov, Sergey V.; Williams, Benjamin B.; Swartz, Harold M.; Demidenko, Eugene; Boyle, Holly K.; Dong, Ruhong; Geimer, Shireen; Jarvis, Lesley A.; Kobayashi, Kyo; Nicolalde; Roberto, J.; Crist, Jason; Gupta, Ankit; Raynolds, Timothy; Brugger, Spencer; Budzioh, Pawel; Carr, Brandon; Feldman, Matthew; Gimi, Barjor; Grinberg, Oleg; Krymov, Vladimir; Lesniewski, Piotr; Mariani, Michael; Meaney, Paul M.; Rychert, Kevin M.; Salikhov, Ildar; Tipikin, Dmitriy S.; Tseytlin, Mark; Edwards, Brian R.; Herring, Christopher D.; Lindsay, Catherine; Rosenbaum, Traci; Ali, Arif; Carlson, David; Froncisz, Wojciech; Hirata, Hiroshi; Sidabras, Jason; Swarts, Steven G.

    2016-01-01

    Several important recent advances in the development and evolution of in vivo Tooth Biodosimetry using Electron Paramagnetic Resonance (EPR) allow its performance to meet or exceed the U.S. targeted requirements for accuracy and ease of operation and throughput in a large-scale radiation event. Ergonomically based changes to the magnet, coupled with the development of rotation of the magnet and advanced software to automate collection of data, have made it easier and faster to make a measurement. From start to finish, measurements require a total elapsed time of 5 min, with data acquisition taking place in less than 3 min. At the same time, the accuracy of the data for triage of large populations has improved, as indicated using the metrics of sensitivity, specificity and area under the ROC curve. Applying these standards to the intended population, EPR in vivo Tooth Biodosimetry has approximately the same diagnostic accuracy as the purported 'gold standard' (dicentric chromosome assay). Other improvements include miniaturisation of the spectrometer, leading to the creation of a significantly lighter and more compact prototype that is suitable for transporting for Point of Care (POC) operation and that can be operated off a single standard power outlet. Additional advancements in the resonator, including use of a disposable sensing loop attached to the incisor tooth, have resulted in a biodosimetry method where measurements can be made quickly with a simple 5-step workflow and by people needing only a few minutes of training (which can be built into the instrument as a training video). In sum, recent advancements allow this prototype to meet or exceed the US Federal Government's recommended targets for POC biodosimetry in large-scale events. (authors)

  7. Sensor Fusion-based Event Detection in Wireless Sensor Networks

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, Nirvana; Havinga, Paul J.M.

    2009-01-01

    Recently, the Wireless Sensor Networks (WSN) community has witnessed a shift in application focus. Although monitoring was the initial application of wireless sensor networks, in-network data processing and (near) real-time actuation capability have made wireless sensor networks a suitable candidate for

  8. Establishment of nuclear knowledge and information infrastructure; establishment of web-based database system for nuclear events

    Energy Technology Data Exchange (ETDEWEB)

    Park, W. J.; Kim, K. J. [Korea Atomic Energy Research Institute , Taejeon (Korea); Lee, S. H. [Korea Institute of Nuclear Safety, Taejeon (Korea)

    2001-05-01

    Nuclear event data reported by nuclear power plants are useful for preventing nuclear accidents at the plant, by examining the causes of initiating events and removing weak points in operational safety, and for improving nuclear safety in the design and operation stages by backfitting operational experiences and practices. The 'Nuclear Event Evaluation Database: NEED' system, previously distributed on CD-ROM media, has been upgraded to the NEED-Web (Web-based Nuclear Event Evaluation Database) version to manage event data with a database system on a network basis, and the event data and statistics are provided to authorized users through the Nuclear Portal Site and to the public through Internet Web services. The efforts to establish the NEED-Web system will improve the integrity of event data from Korean nuclear power plants and the usability of data services, and enhance confidence building and transparency to the public in nuclear safety. 11 refs., 27 figs. (Author)

  9. Fault trees based on past accidents. Factorial analysis of events

    International Nuclear Information System (INIS)

    Vaillant, M.

    1977-01-01

    The method of the fault tree is already useful in the qualitative step before any reliability calculation. The construction of the tree becomes even simpler when we just want to describe how the events happened. Unlike scenarios, which introduce several possibilities by means of the conjunction OR, here only the conjunction AND occurs, and it need not be written at all. This method, presented by INRS (1) for the study of industrial injuries, may also be applied to material damage. (orig.) [de]

  10. European evidence-based recommendations for diagnosis and treatment of paediatric antiphospholipid syndrome: the SHARE initiative.

    Science.gov (United States)

    Groot, Noortje; de Graeff, Nienke; Avcin, Tadej; Bader-Meunier, Brigitte; Dolezalova, Pavla; Feldman, Brian; Kenet, Gili; Koné-Paut, Isabelle; Lahdenne, Pekka; Marks, Stephen D; McCann, Liza; Pilkington, Clarissa A; Ravelli, Angelo; van Royen-Kerkhof, Annet; Uziel, Yosef; Vastert, Sebastiaan J; Wulffraat, Nico M; Ozen, Seza; Brogan, Paul; Kamphuis, Sylvia; Beresford, Michael W

    2017-10-01

    Antiphospholipid syndrome (APS) is rare in children, and evidence-based guidelines are sparse. Consequently, management is mostly based on observational studies and physician's experience, and treatment regimens differ widely. The Single Hub and Access point for paediatric Rheumatology in Europe (SHARE) initiative was launched to develop diagnostic and management regimens for children and young adults with rheumatic diseases. Here, we developed evidence-based recommendations for diagnosis and treatment of paediatric APS. Evidence-based recommendations were developed using the European League Against Rheumatism standard operating procedure. Following a detailed systematic review of the literature, a committee of paediatric rheumatologists and representation of paediatric haematology with expertise in paediatric APS developed recommendations. The literature review yielded 1473 articles, of which 15 were valid and relevant. In total, four recommendations for diagnosis and eight for treatment of paediatric APS (including paediatric Catastrophic Antiphospholipid Syndrome) were accepted. Additionally, two recommendations for children born to mothers with APS were accepted. It was agreed that new classification criteria for paediatric APS are necessary, and APS in association with childhood-onset systemic lupus erythematosus should be identified by performing antiphospholipid antibody screening. Treatment recommendations included prevention of thrombotic events, and treatment recommendations for venous and/or arterial thrombotic events. Notably, due to the paucity of studies on paediatric APS, level of evidence and strength of the recommendations is relatively low. The SHARE initiative provides international, evidence-based recommendations for diagnosis and treatment for paediatric APS, facilitating improvement and uniformity of care. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use

  11. SAMIRA - SAtellite based Monitoring Initiative for Regional Air quality

    Science.gov (United States)

    Schneider, Philipp; Stebel, Kerstin; Ajtai, Nicolae; Diamandi, Andrei; Horalek, Jan; Nicolae, Doina; Stachlewska, Iwona; Zehner, Claus

    2016-04-01

    Here, we present a new ESA-funded project entitled Satellite based Monitoring Initiative for Regional Air quality (SAMIRA), which aims at improving regional and local air quality monitoring through synergetic use of data from present and upcoming satellites, traditionally used in situ air quality monitoring networks and output from chemical transport models. Through collaborative efforts in four countries, namely Romania, Poland, the Czech Republic and Norway, all with existing air quality problems, SAMIRA intends to support the involved institutions and associated users in their national monitoring and reporting mandates as well as to generate novel research in this area. Despite considerable improvements in the past decades, Europe is still far from achieving levels of air quality that do not pose unacceptable hazards to humans and the environment. Main concerns in Europe are exceedances of particulate matter (PM), ground-level ozone, benzo(a)pyrene (BaP) and nitrogen dioxide (NO2). While overall sulfur dioxide (SO2) emissions have decreased in recent years, regional concentrations can still be high in some areas. The objectives of SAMIRA are to improve algorithms for the retrieval of hourly aerosol optical depth (AOD) maps from SEVIRI, and to develop robust methods for deriving column- and near-surface PM maps for the study area by combining satellite AOD with information from regional models. The benefit to existing monitoring networks (in situ, models, satellite) by combining these datasets using data fusion methods will be tested for satellite-based NO2, SO2, and PM/AOD. Furthermore, SAMIRA will test and apply techniques for downscaling air quality-related EO products to a spatial resolution that is more in line with what is generally required for studying urban and regional scale air quality. This will be demonstrated for a set of study sites that include the capitals of the four countries and the highly polluted areas along the border of Poland and the

  12. Network based on statistical multiplexing for event selection and event builder systems in high energy physics experiments

    International Nuclear Information System (INIS)

    Calvet, D.

    2000-03-01

    Systems for on-line event selection in future high energy physics experiments will use advanced distributed computing techniques and will need high speed networks. After a brief description of projects at the Large Hadron Collider, the architectures initially proposed for the Trigger and Data AcQuisition (TD/DAQ) systems of ATLAS and CMS experiments are presented and analyzed. A new architecture for the ATLAS T/DAQ is introduced. Candidate network technologies for this system are described. This thesis focuses on ATM. A variety of network structures and topologies suited to partial and full event building are investigated. The need for efficient networking is shown. Optimization techniques for high speed messaging and their implementation on ATM components are described. Small scale demonstrator systems consisting of up to 48 computers (∼1:20 of the final level 2 trigger) connected via ATM are described. Performance results are presented. Extrapolation of measurements and evaluation of needs lead to a proposal of implementation for the main network of the ATLAS T/DAQ system. (author)

  13. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  14. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  15. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  16. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger which is based on processors running programs to filter events using the full event information. Computing power on the order of a second on a Pentium II processor is required per event. The architecture design is driven by the cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as an operation system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...
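
    A quick back-of-envelope check of the figures quoted in this abstract (event rate, event size, number of readout locations and per-event CPU time) gives the bandwidth and farm size the design has to sustain:

```python
# Back-of-envelope check of the event-building and Level-3 numbers quoted above.
event_rate = 300        # assembled events per second
event_size_kB = 150     # average event size
n_sources = 16          # readout locations providing fragments
cpu_per_event_s = 1.0   # order of one second of (Pentium II) CPU per event

aggregate_MBps = event_rate * event_size_kB / 1000.0   # ~45 MB/s through the switch
per_source_MBps = aggregate_MBps / n_sources           # ~2.8 MB/s per readout location
processing_nodes = event_rate * cpu_per_event_s        # order of 300 CPUs for Level-3

print(f"event-building bandwidth : {aggregate_MBps:.0f} MB/s")
print(f"per readout location     : {per_source_MBps:.1f} MB/s")
print(f"Level-3 CPUs (order of)  : {processing_nodes:.0f}")
```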

  17. Initiation of MMA polymerization by iniferters based on dithiocarbamates

    Directory of Open Access Journals (Sweden)

    Jovanović Slobodan M.

    2005-01-01

    Twelve modified dithiocarbamates and a thiuramdisulfide used for the initiation of methyl methacrylate (MMA) polymerization were synthesized in this study. The polymerization of MMA was followed by determining the yield and molar mass of the obtained PMMA as a function of polymerization time. Four of the synthesized dithiocarbamates (S-benzyl-N,N-dibenzyldithiocarbamate, S-allyl-N,N-dibenzyldithiocarbamate, S-benzyl-N,N-diisobutyldithiocarbamate and S-benzoyl-N,N-diisobutyldithiocarbamate), as well as N,N,N',N'-tetrabenzylthiuramdisulfide, acted as iniferters. They were active as initiators of the photo- and/or thermally initiated radical polymerization of MMA in bulk and in inert solvents (benzene and toluene). S-Benzyl-N,N-dibenzyldithiocarbamate can be successfully used for the initiation of MMA polymerization in a polar solvent such as dimethylacetamide.

  18. A prospective study of low fasting glucose with cardiovascular disease events and all-cause mortality: The Women's Health Initiative.

    Science.gov (United States)

    Mongraw-Chaffin, Morgana; LaCroix, Andrea Z; Sears, Dorothy D; Garcia, Lorena; Phillips, Lawrence S; Salmoirago-Blotcher, Elena; Zaslavsky, Oleg; Anderson, Cheryl A M

    2017-05-01

    While there is increasing recognition of the risks associated with hypoglycemia in patients with diabetes, few studies have investigated incident cause-specific cardiovascular outcomes with regard to low fasting glucose in the general population. We hypothesized that low fasting glucose would be associated with cardiovascular disease risk and all-cause mortality in postmenopausal women. To test our hypothesis, we used both continuous incidence rates and Cox proportional hazards models in 17,287 participants from the Women's Health Initiative with fasting glucose measured at baseline. Participants were separated into groups based on fasting glucose level, ranging from low through normal and impaired to diabetic. The fasting glucose distribution exhibited evidence of a weak J-shaped association with heart failure and mortality that was predominantly due to participants with treated diabetes. Impaired and diabetic fasting glucose were positively associated with all outcomes. Associations for low fasting glucose differed, with coronary heart disease (HR=0.64 (0.42, 0.98)) significantly inverse; stroke (0.73 (0.48, 1.13)), combined cardiovascular disease (0.91 (0.73, 1.14)), and all-cause mortality (0.97 (0.79, 1.20)) null or inverse and not significant; and heart failure (1.27 (0.80, 2.02)) positive and not significant. Fasting glucose at the upper range, but not the lower range, was significantly associated with incident cardiovascular disease and all-cause mortality. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Distribution and Variability of Satellite-Derived Signals of Isolated Convection Initiation Events Over Central Eastern China

    Science.gov (United States)

    Huang, Yipeng; Meng, Zhiyong; Li, Jing; Li, Wanbiao; Bai, Lanqiang; Zhang, Murong; Wang, Xi

    2017-11-01

    This study combined measurements from the Chinese operational geostationary satellite Fengyun-2E (FY-2E) and ground-based weather radars to conduct a statistical survey of isolated convection initiation (CI) over central eastern China (CEC). The convective environment in CEC is modulated by the complex topography and monsoon climate. From May to August 2010, a total of 1,630 isolated CI signals were derived from FY-2E using a semiautomated method. The formation of these satellite-derived CI signals peaks in the early afternoon and occurs with high frequency in areas with remarkable terrain inhomogeneity (e.g., mountain, water, and mountain-water areas). The high signal frequency areas shift from northwest CEC (dry, high altitude) in early summer to southeast CEC (humid, low altitude) in midsummer along with an increasing monthly mean frequency. The satellite-derived CI signals tend to have longer lead times (the time difference between satellite-derived signal formation and radar-based CI) in the late morning and afternoon than in the early morning and night. During the early morning and night, the distinction between cloud top signatures and background terrestrial radiation becomes less apparent, resulting in delayed identification of the signals and thus short and even negative lead times. A decline in the lead time is observed from May to August, likely due to the increasing cloud growth rate and warm-rain processes. Results show increasing lead times with increasing landscape elevation, likely due to more warm-rain processes over the coastal sea and plain, along with a decreasing cloud growth rate from hill and mountain to the plateau.

  20. Knowledge based query expansion in complex multimedia event detection

    NARCIS (Netherlands)

    Boer, M. de; Schutte, K.; Kraaij, W.

    2016-01-01

    A common approach in content based video information retrieval is to perform automatic shot annotation with semantic labels using pre-trained classifiers. The visual vocabulary of state-of-the-art automatic annotation systems is limited to a few thousand concepts, which creates a semantic gap

  1. Knowledge based query expansion in complex multimedia event detection

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2015-01-01

    A common approach in content based video information retrieval is to perform automatic shot annotation with semantic labels using pre-trained classifiers. The visual vocabulary of state-of-the-art automatic annotation systems is limited to a few thousand concepts, which creates a semantic gap

  2. Transcription-based model for the induction of chromosomal exchange events by ionising radiation

    International Nuclear Information System (INIS)

    Radford, I.A.

    2003-01-01

    The mechanistic basis for chromosomal aberration formation, following exposure of mammalian cells to ionising radiation, has long been debated. Although chromosomal aberrations are probably initiated by DNA double-strand breaks (DSB), little is understood about the mechanisms that generate and modulate DNA rearrangement. Based on results from our laboratory and data from the literature, a novel model of chromosomal aberration formation has been suggested (Radford 2002). The basic postulates of this model are that: (1) DSB, primarily those involving multiple individual damage sites (i.e. complex DSB), are the critical initiating lesion; (2) only those DSB occurring in transcription units that are associated with transcription 'factories' (complexes containing multiple transcription units) induce chromosomal exchange events; (3) such DSB are brought into contact with a DNA topoisomerase I molecule through RNA polymerase II catalysed transcription and give rise to trapped DNA-topo I cleavage complexes; and (4) trapped complexes interact with another topo I molecule on a temporarily inactive transcription unit at the same transcription factory leading to DNA cleavage and subsequent strand exchange between the cleavage complexes. We have developed a method using inverse PCR that allows the detection and sequencing of putative ionising radiation-induced DNA rearrangements involving different regions of the human genome (Forrester and Radford 1998). The sequences detected by inverse PCR can provide a test of the prediction of the transcription-based model that ionising radiation-induced DNA rearrangements occur between sequences in active transcription units. Accordingly, reverse transcriptase PCR was used to determine if sequences involved in rearrangements were transcribed in the test cells. Consistent with the transcription-based model, nearly all of the sequences examined gave a positive result to reverse transcriptase PCR (Forrester and Radford unpublished)

  3. Tag and Neighbor based Recommender systems for Medical events

    DEFF Research Database (Denmark)

    Bayyapu, Karunakar Reddy; Dolog, Peter

    2010-01-01

    This paper presents an extension of a multifactor recommendation approach based on user tagging with term neighbours. Neighbours of the words in tag vectors and documents make it possible to hit a larger set of documents, not only those matching the direct tag vectors or the content of the documents. Tag ... in situations where the quality of tags is lower. We discuss the approach on examples from the existing Medworm system to indicate its usefulness....

  4. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events have been analysed in Zhejiang Province, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the linear trend of the ascending and descending phases of PWV. The increment of PWV over a short time (two to six hours) and the PWV slope over a longer time (a few hours to more than ten hours) during the PWV ascending period are taken as predictive factors with which to forecast the precipitation event. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecasted two to six hours in advance of the precipitation event based on the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July, 2015. A good result was acquired using the proposed method, and about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and nowcasting precipitation forecasting.
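
    The two predictors described here, a short-term PWV increment and a longer-term least-squares slope during the ascending period, can be sketched as below. The window lengths and thresholds are illustrative assumptions, not the calibrated values from the paper, and the PWV series is synthetic.

```python
import numpy as np

def pwv_predictors(times_h, pwv_mm, short_win=4.0, long_win=12.0):
    """Return (short-term PWV increment, longer-term PWV slope) at the series end.

    times_h: sample times in hours; pwv_mm: PWV in mm. Window lengths are
    illustrative, not the values calibrated in the paper.
    """
    t_end = times_h[-1]
    short = times_h >= t_end - short_win
    long_ = times_h >= t_end - long_win
    increment = pwv_mm[short][-1] - pwv_mm[short][0]
    slope = np.polyfit(times_h[long_], pwv_mm[long_], 1)[0]   # least-squares trend, mm/h
    return increment, slope

# Synthetic 5-minute PWV series with a steady rise (as before many rain events).
t = np.arange(0, 24, 5 / 60.0)
pwv = 30 + 0.8 * t + np.random.default_rng(1).normal(0, 0.5, t.size)

inc, slope = pwv_predictors(t, pwv)
# Flag a possible precipitation event when both predictors exceed (illustrative) thresholds.
if inc > 3.0 and slope > 0.5:
    print(f"precipitation likely: PWV rose {inc:.1f} mm in 4 h, slope {slope:.2f} mm/h")
```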

  5. Flood modelling with a distributed event-based parsimonious rainfall-runoff model: case of the karstic Lez river catchment

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2012-04-01

    Rainfall-runoff models are crucial tools for the statistical prediction of flash floods and real-time forecasting. This paper focuses on a karstic basin in the South of France and proposes a distributed parsimonious event-based rainfall-runoff model, coherent with the poor knowledge of both evaporative and underground fluxes. The model combines a SCS runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The model is assessed not only on its ability to satisfactorily simulate floods but also on the strength of the relationships between the initial condition of the model and various predictors of the initial wetness state of the basin, such as the base flow, the Hu2 index from the Meteo-France SIM model and the piezometric levels of the aquifer. The advantage of using meteorological radar rainfall in flood modelling is also assessed. Model calibration proved satisfactory using an hourly time step, with Nash criterion values ranging between 0.66 and 0.94 for eighteen of the twenty-one selected events. The radar rainfall inputs significantly improved the simulations or the assessment of the initial condition of the model for 5 events at the beginning of autumn, mostly in September–October (mean improvement of Nash is 0.09; correction in the initial condition ranges from −205 to 124 mm), but were less efficient for the events at the end of autumn. In this period, the weak vertical extension of the precipitation system and the low altitude of the 0 °C isotherm could affect the efficiency of radar measurements due to the distance between the basin and the radar (~60 km). The model initial condition S is correlated with the three tested predictors (R2 > 0.6). The interpretation of the model suggests that groundwater does not affect the first peaks of the flood, but can strongly impact subsequent peaks in the case of a multi-storm event. Because this kind of model is based on a limited
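
    The SCS runoff component mentioned above can be illustrated with the standard curve-number relation, in which the event's initial condition S plays the role of the maximum retention; a wetter basin corresponds to a smaller S. The values below are illustrative only:

```python
def scs_runoff(cumulative_rain_mm, S_mm, initial_abstraction_ratio=0.2):
    """Cumulative runoff from the SCS relation Q = (P - Ia)^2 / (P - Ia + S), with Ia = 0.2*S."""
    Ia = initial_abstraction_ratio * S_mm
    if cumulative_rain_mm <= Ia:
        return 0.0
    return (cumulative_rain_mm - Ia) ** 2 / (cumulative_rain_mm - Ia + S_mm)

# S is the event's initial condition (maximum retention); a wetter basin -> smaller S.
for S in (50.0, 150.0, 300.0):   # mm, illustrative values
    print(S, [round(scs_runoff(P, S), 1) for P in (20, 60, 120, 200)])
```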

  6. Microseismic Event Grouping Based on PageRank Linkage at the Newberry Volcano Geothermal Site

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2016-12-01

    The Newberry Volcano DOE FORGE site in Central Oregon has been stimulated two times using high-pressure fluid injection to study the Enhanced Geothermal Systems (EGS) technology. Several hundred microseismic events were generated during the first stimulation in the fall of 2012. Initial locations of this microseismicity do not show well defined subsurface structure in part because event location uncertainties are large (Foulger and Julian, 2013). We focus on this stimulation to explore the spatial and temporal development of microseismicity, which is key to understanding how subsurface stimulation modifies stress, fractures rock, and increases permeability. We use PageRank, Google's initial search algorithm, to determine connectivity within the events (Aguiar and Beroza, 2014) and assess signal-correlation topology for the micro-earthquakes. We then use this information to create signal families and compare these to the spatial and temporal proximity of associated earthquakes. We relocate events within families (identified by PageRank linkage) using the Bayesloc approach (Myers et al., 2007). Preliminary relocations show tight spatial clustering of event families as well as evidence of events relocating to a different cluster than originally reported. We also find that signal similarity (linkage) at several stations, not just one or two, is needed in order to determine that events are in close proximity to one another. We show that indirect linkage of signals using PageRank is a reliable way to increase the number of events that are confidently determined to be similar to one another, which may lead to efficient and effective grouping of earthquakes with similar physical characteristics, such as focal mechanisms and stress drop. Our ultimate goal is to determine whether changes in the state of stress and/or changes in the generation of subsurface fracture networks can be detected using PageRank topology as well as aid in the event relocation to obtain more accurate
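
    The PageRank linkage idea can be sketched with networkx: events become nodes, pairs whose waveforms correlate above a threshold become weighted edges, and PageRank scores plus connected components yield candidate signal families. The correlation values and the 0.7 threshold below are invented; in the study, similarity at several stations is required before events are linked.

```python
import networkx as nx

# Hypothetical waveform cross-correlation coefficients between event pairs
# (in practice these would come from correlating signals at several stations).
correlations = {
    ("ev1", "ev2"): 0.85, ("ev2", "ev3"): 0.80, ("ev1", "ev3"): 0.40,
    ("ev4", "ev5"): 0.90, ("ev3", "ev4"): 0.10,
}

# Build a graph whose edges link events with correlation above a threshold;
# PageRank then scores how strongly each event is connected within the network.
G = nx.Graph()
for (a, b), cc in correlations.items():
    if cc >= 0.7:
        G.add_edge(a, b, weight=cc)

scores = nx.pagerank(G, weight="weight")
families = list(nx.connected_components(G))   # candidate event families
print("PageRank scores:", scores)
print("families (linked event groups):", families)
```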

  7. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes arising from batch operation cannot be predicted with an equilibrium material flow. This study began to build a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As part of mid- and long-term research, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to support reviews of technical feasibility, safeguards assessment, conceptual facility design, and economic feasibility evaluation. The most fundamental element of such a simulator is the dynamic material flow framework. This study focused on operation modeling of pyroprocessing to implement a dynamic material flow. As a case study, oxide reduction was investigated in terms of dynamic material flow. DES-based modeling was applied to build a pyroprocessing operation model. A dynamic material flow, as the basic framework for an integrated pyroprocessing simulator, was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic was verified, for example for an oxide reduction process, in terms of dynamic material flow. Compared to an equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default oxide reduction scenario, the batch mass balance was verified against a one-year equilibrium mass balance. This study is still in progress toward a mid- and long-term goal: the development of a multi-purpose pyroprocessing simulator able to support safeguards assessment, economic feasibility, technical evaluation, conceptual design, and licensing of a future pyroprocessing facility
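
    Not the authors' ExtendSim model, but a minimal discrete-event sketch (using simpy) of why a batch-wise dynamic material balance differs from an equilibrium one: batches queue for a single oxide-reduction unit, and a ledger records the mass completed over time. Batch size, processing time and arrival rate are assumed for illustration.

```python
import simpy

BATCH_KG = 20.0          # hypothetical batch size of oxide feed
REDUCTION_HOURS = 12.0   # hypothetical processing time per batch

def oxide_reduction(env, name, reducer, ledger):
    with reducer.request() as req:          # wait for the oxide-reduction equipment
        yield req
        yield env.timeout(REDUCTION_HOURS)  # batch processing time
        ledger.append((env.now, name, BATCH_KG))

def feed_generator(env, reducer, ledger, n_batches=10, interarrival=8.0):
    for i in range(n_batches):
        env.process(oxide_reduction(env, f"batch-{i}", reducer, ledger))
        yield env.timeout(interarrival)     # batches arrive faster than they are processed

env = simpy.Environment()
reducer = simpy.Resource(env, capacity=1)
ledger = []                                  # dynamic material balance: (time, batch, mass)
env.process(feed_generator(env, reducer, ledger))
env.run(until=24 * 7)

total = sum(mass for _, _, mass in ledger)
print(f"batches completed: {len(ledger)}, reduced mass: {total:.0f} kg")
for t, name, mass in ledger[:3]:
    print(f"t={t:5.1f} h  {name}  {mass} kg")
```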

  8. Support for At-Risk Girls: A School-Based Mental Health Nursing Initiative.

    Science.gov (United States)

    Adamshick, Pamela

    2015-09-01

    Mental health problems often go undiagnosed or unaddressed until a crisis or extreme event brings the problem to the forefront. Youth are particularly at risk for lack of identification and treatment in regard to mental health issues. This article describes an advanced nursing practice mental health initiative for at-risk teenage girls based on Hildegard Peplau's nursing theory, group process, and healing through holistic health approaches. A support group, RICHES, was developed with focus on core components of relationships, identity, communication, health, esteem, and support. The acronym RICHES was chosen as the name of the support group. Selected themes and issues addressed in this school-based support group are illustrated in case vignettes. Through a collaborative approach with the community and school, this practice initiative presents a unique healing process that extends knowledge in the realm of intervention with at-risk teenage girls. Further research is needed on the efficacy of support groups to modify risk factors and to address goals for primary prevention in at-risk teenage girls. © The Author(s) 2014.

  9. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly in the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences

  10. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    Science.gov (United States)

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems still have some limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, was integrated during the design of our system. Different from traditional event reporting systems that only collect and store the reports, our system automatically annotates and analyzes the reported events and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  11. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.

  12. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply the DBBE model to discover the constructs that measure a religious event as a business brand on the basis of devotees' perceptions. SEM was applied to test the hypothesized model, with CFA used to analyze the measurement model and assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information might be beneficial to event management and sponsors in building the brand and operating visitor destinations. More importantly, the brands of these religious events in Pakistan can be built as strong tourism products.

  13. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.
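
    PyWEED is described above as being built on the ObsPy toolkit. A minimal ObsPy-based sketch of the same kind of event-oriented request (select an event, pick a station, window waveforms around the event) might look like the following; the network, station, and channel codes and the window length are arbitrary examples, not defaults of either tool.

    ```python
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")

    # 1. Find candidate events in a catalog (example criteria).
    events = client.get_events(starttime=UTCDateTime("2017-09-01"),
                               endtime=UTCDateTime("2017-09-30"),
                               minmagnitude=7.0)
    origin = events[0].preferred_origin() or events[0].origins[0]

    # 2. Request waveforms for a chosen station, windowed around the origin time.
    #    (A full tool would window relative to predicted phase arrival times.)
    st = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ",
                              starttime=origin.time,
                              endtime=origin.time + 1800)
    st.plot()
    ```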

  14. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might enrich current knowledge on drug reactions or the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has attracted community-wide efforts recently. Most approaches are based on statistical models and require large-scale annotated corpora to precisely estimate the models' parameters, which are usually difficult to obtain in practice. Therefore, employing un-annotated data through semi-supervised learning is a feasible solution for biomedical event extraction and has attracted growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to the sentences in the annotated corpus. More specifically, not only the structures of the sentences, but also the hidden topics embedded in the sentences are used for describing this distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that more than 2.2% improvement in F-score on biomedical event extraction is achieved by the proposed framework when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely

  15. Young adults' recreational social environment as a predictor of ecstasy use initiation: findings of a population-based prospective study.

    Science.gov (United States)

    Smirnov, Andrew; Najman, Jake M; Hayatbakhsh, Reza; Wells, Helene; Legosz, Margot; Kemp, Robert

    2013-10-01

    To examine prospectively the contribution of the recreational social environment to ecstasy initiation. Population-based retrospective/prospective cohort study. Data from screening an Australian young adult population to obtain samples of users and non-users of ecstasy. A sample of 204 ecstasy-naive participants aged 19-23 years was obtained, and a 6-month follow-up identified those who initiated ecstasy use. We assessed a range of predictors of ecstasy initiation, including elements of participants' social environment, such as ecstasy-using social contacts and involvement in recreational settings. More than 40% of ecstasy-naive young adults reported ever receiving ecstasy offers. Ecstasy initiation after 6 months was predicted independently by having, at recruitment, many ecstasy-using social contacts [adjusted relative risk (ARR) 3.15, 95% confidence interval (CI): 1.57, 6.34], attending electronic/dance music events (ARR 6.97, 95% CI: 1.99, 24.37), receiving an ecstasy offer (ARR 4.02, 95% CI: 1.23, 13.10), early cannabis use (ARR 4.04, 95% CI: 1.78, 9.17) and psychological distress (ARR 5.34, 95% CI: 2.31, 12.33). Adjusted population-attributable fractions were highest for ecstasy-using social contacts (17.7%) and event attendance (15.1%). In Australia, ecstasy initiation in early adulthood is associated predominantly with social environmental factors, including ecstasy-using social contacts and attendance at dance music events, and is associated less commonly with psychological distress and early cannabis use, respectively. A combination of universal and targeted education programmes may be appropriate for reducing rates of ecstasy initiation and associated harms. © 2013 Society for the Study of Addiction.

  16. Estimative of core damage frequency in IPEN'S IEA-R1 research reactor due to the initiating event of loss of coolant caused by large rupture in the pipe of the primary circuit

    International Nuclear Information System (INIS)

    Hirata, Daniel Massami; Sabundjian, Gaiane; Cabral, Eduardo Lobo Lustosa

    2009-01-01

    The National Commission of Nuclear Energy (CNEN), which is the Brazilian nuclear regulatory commission, imposes safety and licensing standards in order to ensure that nuclear power plants operate in a safe way. For licensing a nuclear reactor, one of the demands of CNEN is the simulation of the accidents and thermal-hydraulic transients considered as the design basis, to verify the integrity of the plant when submitted to adverse conditions. The accidents that must be simulated are those that present a large probability of occurrence or those that can cause more serious consequences. According to the FSAR (Final Safety Analysis Report), the initiating event that can cause the largest damage to the core of the IEA-R1 research reactor at IPEN-CNEN/SP is the LOCA (Loss of Coolant Accident). The objective of this paper is to estimate the frequency of IEA-R1 core damage caused by this initiating event. We analyze the accident evolution and the performance of the systems that should mitigate this event, the Emergency Core Cooling System (ECCS) and the isolated pool system, by means of an event tree. The reliability of these systems is also quantified using fault trees. (author)
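
    As a rough numerical illustration of the event tree / fault tree quantification described above, the sketch below combines an initiating event frequency with system failure probabilities to obtain a core damage frequency. The frequencies, probabilities, and sequence structure are invented placeholders for illustration and do not correspond to the IEA-R1 analysis.

    ```python
    # Hypothetical large-break LOCA event-tree quantification (illustrative only).
    f_loca = 1.0e-4                  # initiating event frequency [1/yr], placeholder

    # Fault-tree style system failure probabilities on demand (placeholders).
    p_eccs_fail = 5.0e-3             # emergency core cooling system fails
    p_pool_isolation_fail = 2.0e-2   # isolated pool system fails

    # Simplified event-tree sequence leading to core damage:
    #   LOCA -> ECCS fails -> pool isolation fails  => core damage
    #   LOCA -> ECCS fails -> pool isolation works  => assumed safe here
    cdf = f_loca * p_eccs_fail * p_pool_isolation_fail
    print(f"Core damage frequency ~ {cdf:.2e} per reactor-year")
    ```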

  17. Predicting honey bee sensitivity based on the conservation of the pesticide molecular initiating event

    Science.gov (United States)

    Concern surrounding the potential adverse impacts of pesticides to honey bee colonies has led to the need for rapid/cost efficient methods for aiding decision making relative to the protection of this important pollinator species. Neonicotinoids represent a class of pesticides th...

  18. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...

  19. Central FPGA-based destination and load control in the LHCb MHz event readout

    Science.gov (United States)

    Jacobsson, R.

    2012-10-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at total rate of 24.6 MHz at a bandwidth usage of up to 70 GB/s over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at a double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic recovery in-flight of failing nodes while maintaining dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  20. Central FPGA-based destination and load control in the LHCb MHz event readout

    International Nuclear Information System (INIS)

    Jacobsson, R.

    2012-01-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at total rate of 24.6 MHz at a bandwidth usage of up to 70 GB/s over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at a double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic recovery in-flight of failing nodes while maintaining dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  1. The Effectiveness of Web-Based Instruction: An Initial Inquiry

    Directory of Open Access Journals (Sweden)

    Tatana M. Olson

    2002-10-01

    Full Text Available As the use of Web-based instruction increases in the educational and training domains, many people have recognized the importance of evaluating its effects on student outcomes such as learning, performance, and satisfaction. Often, these results are compared to those of conventional classroom instruction in order to determine which method is “better.” However, major differences in technology and presentation rather than instructional content can obscure the true relationship between Web-based instruction and these outcomes. Computer-based instruction (CBI), with more features similar to Web-based instruction, may be a more appropriate benchmark than conventional classroom instruction. Furthermore, there is little consensus as to what variables should be examined or what measures of learning are the most appropriate, making comparisons between studies difficult and inconclusive. In this article, we review the historical findings of CBI as an appropriate benchmark to Web-based instruction. In addition, we review 47 reports of evaluations of Web-based courses in higher education published between 1996 and 2002. A tabulation of the documented findings into eight characteristics is offered, along with our assessments of the experimental designs, effect sizes, and the degree to which the evaluations incorporated features unique to Web-based instruction.

  2. Effect of initiation-inhibition and handedness on the patterns of the P50 event-related potential component: a low resolution electromagnetic tomography study

    Directory of Open Access Journals (Sweden)

    Capsalis Christos N

    2009-12-01

    Full Text Available Abstract Background Recent research recognizes the association between handedness, linguistic processes and cerebral networks subserving executive functioning, but the nature of this association remains unclear. Since the P50 event-related potential (ERP) is considered to reflect thalamocortical processes in association with working memory (WM) operation, the present study focuses on P50 patterns elicited during the performance of a linguistic-related executive functioning test in right- and left-handers. Methods In 64 young adults with a high educational level (33 left-handed) the P50 event-related potential was recorded while performing the initiation and inhibition conditions of a modified version of the Hayling Sentence Completion test adjusted to induce WM. The manual preference of the participants was evaluated with the use of the Edinburgh Handedness Inventory (EHI). Results P50 showed greater amplitudes in left- than in right-handers, mainly in frontal leads, in the initiation condition. Reduced amplitudes in the inhibition compared to the initiation condition were observed in left-handers. Low Resolution Electromagnetic Tomography (LORETA) analysis showed lower frontal lobe activation in the inhibition than in the initiation condition in both right- and left-handers. Also, LORETA showed that right-handers exhibited greater activation in the inhibition condition than left-handers. Additionally, LORETA showed asymmetrical hemispheric activation patterns in right-handers, in contrast to the symmetrical patterns observed in left-handers. Higher P50 amplitudes were recorded in the right hemisphere of right-handers in the initiation condition. Conclusion Brain activation, especially that closely related to thalamocortical function, elicited during WM operation involving initiation and inhibition processes appears to be related to handedness.

  3. Accident analyses in nuclear power plants following external initiating events and in the shutdown state. Final report; Unfallanalysen in Kernkraftwerken nach anlagenexternen ausloesenden Ereignissen und im Nichtleistungsbetrieb. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Loeffler, Horst; Kowalik, Michael; Mildenberger, Oliver; Hage, Michael

    2016-06-15

    The work which is documented here provides the methodological basis for improvement of the state of knowledge for accident sequences after plant external initiating events and for accident sequences which begin in the shutdown state. The analyses have been done for a PWR and for a BWR reference plant. The work has been supported by the German federal ministry BMUB under the label 3612R01361. Top objectives of the work are: - Identify relevant event sequences in order to define characteristic initial and boundary conditions - Perform accident analysis of selected sequences - Evaluate the relevance of accident sequences in a qualitative way The accident analysis is performed with the code MELCOR 1.8.6. The applied input data set has been significantly improved compared to previous analyses. The event tree method which is established in PSA level 2 has been applied for creating a structure for a unified summarization and evaluation of the results from the accident analyses. The computer code EVNTRE has been applied for this purpose. In contrast to a PSA level 2, the branching probabilities of the event tree have not been determined with the usual accuracy, but they are given in an approximate way only. For the PWR, the analyses show a considerable protective effect of the containment also in the case of beyond design events. For the BWR, there is a rather high probability for containment failure under core melt impact, but nevertheless the release of radionuclides into the environment is very limited because of plant internal retention mechanisms. This report concludes with remarks about existing knowledge gaps and with regard to core melt sequences, and about possible improvements of the plant safety.

  4. Ensemble-based flash-flood modelling: Taking into account hydrodynamic parameters and initial soil moisture uncertainties

    Science.gov (United States)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2018-05-01

    Intense precipitation events in the Mediterranean often lead to devastating flash floods (FF). FF modelling is affected by several kinds of uncertainties, and Hydrological Ensemble Prediction Systems (HEPS) are designed to take those uncertainties into account. The major source of uncertainty comes from rainfall forcing, and convective-scale meteorological ensemble prediction systems can manage it for forecasting purposes. But other sources are related to the hydrological modelling part of the HEPS. This study focuses on the uncertainties arising from the hydrological model parameters and initial soil moisture, with the aim of designing an ensemble-based version of a hydrological model dedicated to simulating Mediterranean fast-responding rivers, the ISBA-TOP coupled system. The first step consists of identifying the parameters that have the strongest influence on FF simulations by assuming perfect precipitation. A sensitivity study is carried out first using a synthetic framework and then for several real events and several catchments. Perturbation methods varying the most sensitive parameters, as well as the initial soil moisture, make it possible to design an ensemble-based version of ISBA-TOP. The first results of this system on some real events are presented. The direct perspective of this work will be to drive this ensemble-based version with the members of a convective-scale meteorological ensemble prediction system to design a complete HEPS for FF forecasting.

  5. Telemedicine-Based Burn Research Initiative: Longitudinal Outcomes of Patients

    National Research Council Canada - National Science Library

    Montalvo, Alfredo

    2003-01-01

    .... All instruments were professionally printed. The consultant for the project was hired and telemedicine equipment was evaluated by the consultant based on clinical requirements defined by the research team...

  6. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  7. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Science.gov (United States)

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experiment results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of the events and have better accuracy of localization compared with other algorithms. PMID:22163972

  8. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    Full Text Available This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experiment results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of the events and have better accuracy of localization compared with other algorithms.
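
    A minimal sketch of the trust-index bookkeeping described in the two records above (raise the index of nodes whose reports matched the final event estimate, lower it otherwise) is given below. The update rule, its step sizes, and the data layout are assumptions for illustration and are not taken from the TISNAP paper.

    ```python
    # Hypothetical trust-index update after one reporting cycle (illustrative).
    trust = {"n01": 1.0, "n02": 1.0, "n03": 1.0}   # initial trust index per node
    reports = {"n01": 1, "n02": 0, "n03": 1}       # binary alarms reported
    expected = {"n01": 1, "n02": 1, "n03": 1}      # what the final event estimate implies

    STEP_UP, STEP_DOWN = 0.05, 0.20                # asymmetric step sizes (assumption)

    def update_trust(trust, reports, expected):
        """Increase trust for consistent reports, decrease it for inconsistent ones,
        clamped to [0, 1]; low-trust nodes then weigh less in the next localization."""
        for node, reported in reports.items():
            if reported == expected[node]:
                trust[node] = min(1.0, trust[node] + STEP_UP)
            else:
                trust[node] = max(0.0, trust[node] - STEP_DOWN)
        return trust

    print(update_trust(trust, reports, expected))
    ```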

  9. Noether's Theorem and its Inverse of Birkhoffian System in Event Space Based on Herglotz Variational Problem

    Science.gov (United States)

    Tian, X.; Zhang, Y.

    2018-03-01

    Herglotz variational principle, in which the functional is defined by a differential equation, generalizes the classical ones defining the functional by an integral. The principle gives a variational principle description of nonconservative systems even when the Lagrangian is independent of time. This paper focuses on studying the Noether's theorem and its inverse of a Birkhoffian system in event space based on the Herglotz variational problem. Firstly, according to the Herglotz variational principle of a Birkhoffian system, the principle of a Birkhoffian system in event space is established. Secondly, its parametric equations and two basic formulae for the variation of Pfaff-Herglotz action of a Birkhoffian system in event space are obtained. Furthermore, the definition and criteria of Noether symmetry of the Birkhoffian system in event space based on the Herglotz variational problem are given. Then, according to the relationship between the Noether symmetry and conserved quantity, the Noether's theorem is derived. Under classical conditions, Noether's theorem of a Birkhoffian system in event space based on the Herglotz variational problem reduces to the classical ones. In addition, Noether's inverse theorem of the Birkhoffian system in event space based on the Herglotz variational problem is also obtained. In the end of the paper, an example is given to illustrate the application of the results.
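
    For readers unfamiliar with the Herglotz construction referred to above, a compact statement of the generalized variational problem in its standard (non-Birkhoffian) form is sketched below; the notation is generic and is not taken from the paper.

    ```latex
    % Herglotz variational problem: the functional z is defined by a
    % differential equation rather than by an integral.
    \begin{aligned}
      \dot z(t) &= L\bigl(t,\, q(t),\, \dot q(t),\, z(t)\bigr), \qquad z(t_0)=z_0,\\
      &\text{extremize } z(t_1) \text{ over curves } q(t) \text{ with fixed endpoints.}
    \end{aligned}
    % The associated generalized Euler-Lagrange equations read
    \frac{\partial L}{\partial q}
      \;-\; \frac{d}{dt}\frac{\partial L}{\partial \dot q}
      \;+\; \frac{\partial L}{\partial z}\,\frac{\partial L}{\partial \dot q} \;=\; 0 .
    ```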

  10. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, the data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace the development of these events in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis’ visit to the US) in the New York City-Washington, DC regions. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  11. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, the faults of nodes and the poor deployment environment have brought great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noises and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of observed attributes is constructed based on Bayesian network. In this model, the dependency structure of observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept named attribute correlation confidence to evaluate the fitting degree between the sensor reading and the abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can reduce the impact of interference factors and the rate of the false alarm effectively; it can also improve the accuracy of event detection.

  12. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  13. Evaluation of extreme temperature events in northern Spain based on process control charts

    Science.gov (United States)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
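
    As a rough illustration of the attribute control chart idea described above (monitoring the annual fraction of extreme days against control limits), the following sketch builds a simple p-chart from a binary series of daily extreme-anomaly flags. It ignores the autocorrelation that the binomial Markov extended process in the study is designed to handle, and all data shown are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: for each year, 365 binary flags marking "extreme anomaly" days.
    years = np.arange(1931, 2010)
    n_days = 365
    flags = rng.random((years.size, n_days)) < 0.03   # ~3% extreme days (synthetic)
    p_year = flags.mean(axis=1)                       # annual fraction of extreme days

    # Classical p-chart limits (no autocorrelation correction).
    p_bar = p_year.mean()
    sigma = np.sqrt(p_bar * (1 - p_bar) / n_days)
    ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)

    out_of_control = years[(p_year > ucl) | (p_year < lcl)]
    print(f"center={p_bar:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}")
    print("years signalling a change in extreme-day frequency:", out_of_control)
    ```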

  14. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  15. Fluence-based and microdosimetric event-based methods for radiation protection in space

    International Nuclear Information System (INIS)

    Curtis, S.B.

    2002-01-01

    The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report no.137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/linear energy transfer (LET) method. It was concluded that for the present, any reason to switch to a new methodology is not compelling. It is suggested that because of certain drawbacks in the presently-used conventional method, these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited and that in the future, significant improvement might be realized. In addition, such concepts as equivalent dose and organ dose equivalent are discussed and various problems regarding the measurement/estimation of these quantities are presented. (author)

  16. Event-based motion correction for PET transmission measurements with a rotating point source

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Meikle, Steven R; Fulton, Roger

    2011-01-01

    Accurate attenuation correction is important for quantitative positron emission tomography (PET) studies. When performing transmission measurements using an external rotating radioactive source, object motion during the transmission scan can distort the attenuation correction factors computed as the ratio of the blank to transmission counts, and cause errors and artefacts in reconstructed PET images. In this paper we report a compensation method for rigid body motion during PET transmission measurements, in which list mode transmission data are motion corrected event-by-event, based on known motion, to ensure that all events which traverse the same path through the object are recorded on a common line of response (LOR). As a result, the motion-corrected transmission LOR may record a combination of events originally detected on different LORs. To ensure that the corresponding blank LOR records events from the same combination of contributing LORs, the list mode blank data are spatially transformed event-by-event based on the same motion information. The number of counts recorded on the resulting blank LOR is then equivalent to the number of counts that would have been recorded on the corresponding motion-corrected transmission LOR in the absence of any attenuating object. The proposed method has been verified in phantom studies with both stepwise movements and continuous motion. We found that attenuation maps derived from motion-corrected transmission and blank data agree well with those of the stationary phantom and are significantly better than uncorrected attenuation data.

  17. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network-based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features include the average of absolute amplitudes, the variance, the energy ratio, and the polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
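
    A minimal sketch of this kind of detector (moving-window features feeding a small feed-forward network that outputs a relative event probability) is given below, using scikit-learn's MLPRegressor. The window length, the feature set (rectilinearity is omitted because it needs three-component data), the network size, and the training labels are all illustrative assumptions rather than the authors' configuration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def window_features(trace, win=100):
        """Per-window features: mean |amplitude|, variance, and energy ratio
        (window energy over total energy). Rectilinearity omitted (needs 3C data)."""
        total_energy = np.sum(trace ** 2) + 1e-12
        feats = []
        for i in range(0, len(trace) - win, win):
            w = trace[i:i + win]
            feats.append([np.mean(np.abs(w)), np.var(w), np.sum(w ** 2) / total_energy])
        return np.asarray(feats)

    # Synthetic example: noise with one embedded weak "event".
    rng = np.random.default_rng(1)
    trace = rng.normal(0, 1, 20000)
    trace[9000:9400] += 6 * np.sin(np.linspace(0, 40 * np.pi, 400))

    X = window_features(trace)
    # Training target: a crude relative probability curve (1 for windows covering
    # the known event location in the synthetic data, 0 elsewhere).
    y = np.zeros(len(X))
    y[90:94] = 1.0

    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    net.fit(X, y)
    prob = net.predict(X)                 # relative event probability per window
    print("most likely event window:", int(np.argmax(prob)))
    ```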

  18. Lifelong Learning for All in Asian Communities: ICT Based Initiatives

    Science.gov (United States)

    Misra, Pradeep Kumar

    2011-01-01

    The necessity to adjust to the prerequisites of the knowledge based society and economy brought about the need for lifelong learning for all in Asian communities. The concept of lifelong learning stresses that learning and education are related to life as a whole - not just to work - and that learning throughout life is a continuum that should run…

  19. Three Initiatives for Community-Based Art Education Practices

    Science.gov (United States)

    Lim, Maria; Chang, EunJung; Song, Borim

    2013-01-01

    Art educators should be concerned with teaching their students to make critical connections between the classroom and the outside world. One effective way to make these critical connections is to provide students with the opportunity to engage in community-based art endeavors. In this article, three university art educators discuss engaging…

  20. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  1. Initial Investigation of Software-Based Bone-Suppressed Imaging

    International Nuclear Information System (INIS)

    Park, Eunpyeong; Youn, Hanbean; Kim, Ho Kyung

    2015-01-01

    Chest radiography is the most widely used imaging modality in medicine. However, the diagnostic performance of chest radiography is degraded by the anatomical background of the patient. Dual energy imaging (DEI) has therefore recently emerged and has demonstrated improved diagnostic performance. However, typical DEI requires two or more projections, hence causing additional patient dose. Motion artifact is another concern in DEI. In this study, we investigate DEI-like bone-suppressed imaging based on the post-processing of a single radiograph. To obtain bone-only images, we use the artificial neural network (ANN) method with an error backpropagation-based machine learning approach. The computational load of the learning process of the ANN is too heavy for a practical implementation because we use the gradient descent method for the error backpropagation. We will use a more advanced error propagation method for the learning process.

  2. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t0. In most cases, the resolution in current ML surveys is not accurate enough to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, providing the photometric errors are small enough. We calculate the expected fraction of ML events where the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as low as t0 ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)
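
    For context on what the parallax effect distorts, the standard (non-parallax) point-source point-lens light curve can be written as below. This is the textbook form, with u_min the impact parameter, t_max the time of peak magnification, and t_E the Einstein-radius crossing time (the event timescale, denoted t0 in the abstract); the notation here is generic, not the paper's.

    ```latex
    % Standard point-source point-lens microlensing light curve (textbook form).
    u(t) = \sqrt{\,u_{\min}^{2} + \left(\frac{t - t_{\max}}{t_E}\right)^{2}}\,,
    \qquad
    A\bigl(u(t)\bigr) = \frac{u^{2} + 2}{u\,\sqrt{u^{2} + 4}}\,.
    % Parallax distorts u(t) by adding the Earth's orbital displacement,
    % projected onto the lens plane, to the rectilinear source-lens trajectory.
    ```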

  3. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, in cases where the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and therefore is not limited to events occurring in similar regions and with similar focal mechanisms to those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
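
    A toy version of the matched-filter detection described above (correlate station-specific synthetic templates with continuous data, stack the correlations, threshold relative to the noise level) might look like the sketch below. The templates are random stand-ins for strain Green's tensor seismograms, and the threshold rule is an assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_stations, n_samples, n_template = 5, 5000, 200

    # Stand-in "templates" (would be synthetic seismograms from the strain Green's
    # tensor for a trial source) and continuous data containing a hidden event.
    templates = rng.normal(0, 1, (n_stations, n_template))
    data = rng.normal(0, 0.5, (n_stations, n_samples))
    t_event = 3000
    data[:, t_event:t_event + n_template] += templates      # bury the event

    # Detection function: stack of per-station cross-correlations.
    detection = np.zeros(n_samples - n_template + 1)
    for s in range(n_stations):
        detection += np.correlate(data[s], templates[s], mode="valid")

    # Declare detections where the stack exceeds k times its robust noise level.
    noise = np.median(np.abs(detection - np.median(detection)))
    k = 10.0                                                 # threshold factor (assumption)
    picks = np.flatnonzero(detection > np.median(detection) + k * noise)
    print("detected onset near sample:", picks.min() if picks.size else None)
    ```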

  4. Vegetation response to the 2016-2017 extreme Sierra Nevada snowfall event using multitemporal terrestrial laser scanning: initial results

    Science.gov (United States)

    Greenberg, J. A.; Hou, Z.; Ramirez, C.; Hart, R.; Marchi, N.; Parra, A. S.; Gutierrez, B.; Tompkins, R.; Harpold, A.; Sullivan, B. W.; Weisberg, P.

    2017-12-01

    The Sierra Nevada Mountains experienced record-breaking snowfall during the 2016-2017 winter after a prolonged period of drought. We hypothesized that at lower elevations, the increased snowmelt would result in a significant increase in biomass across vegetation strata, but at higher elevations, the snowpack would result in a diminished growing season, and yield a suppression of growth rates particularly in the understory vegetation. To test these hypotheses, we sampled sites across the Plumas National Forest and Lake Tahoe Basin using a terrestrial laser scanner (TLS) in the early growing season, and then rescanned these sites in the late growing season. Herein, we present initial, early results from this analysis, focusing on the biomass and height changes in trees.

  5. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  6. Supervision in the PC based prototype for the ATLAS event filter

    CERN Document Server

    Bee, C P; Etienne, F; Fede, E; Meessen, C; Nacasch, R; Qian, Z; Touchard, F

    1999-01-01

    A prototype of the ATLAS event filter based on commodity PCs linked by a Fast Ethernet switch has been developed in Marseille. The present contribution focuses on the supervision aspects of the prototype, which are based on Java and Java mobile agent technology. (5 refs).

  7. Skeletal-related events among breast and prostate cancer patients: towards new treatment initiation in Malaysia's hospital setting.

    Science.gov (United States)

    Ezat, Sharifa Wan Puteh; Syed Junid, Syed Mohamed Aljunid; Noraziani, Khamis; Zafar, Ahmed; Saperi, Sulong; Nur, Amrizal Muhammad; Aizuddin, Azimatun Noor; Ismail, Fuad; Abdullah, Norlia; Zainuddin, Zulkifli Md; Mohd Kassim, Abdul Yazid; Haflah, Nor Hazla Mohamed

    2013-01-01

    The human skeleton is the most common organ to be affected by metastatic cancer, and bone metastases are a major cause of cancer morbidity. In Malaysia, prostate cancer is among the five most frequent cancers in males and breast cancer among those in females, and both are associated with skeletal lesions. Bone metastases weaken bone structure, causing a range of symptoms and complications known as skeletal-related events (SRE). Patients with SRE may require palliative radiotherapy or surgery to bone for pain, and may develop hypercalcaemia, pathologic fractures, and spinal cord compression. These complications contribute to a decline in patient health-related quality of life. Multidimensional assessment of health-related quality of life in these patients is important, beyond considering a beneficial treatment impact on patient survival, since the side effects of treatment and disease symptoms can significantly affect health-related quality of life. Cancer treatment can also have significant financial implications for the healthcare system. Therefore, it is essential to assess health-related quality of life and treatment cost among prostate and breast cancer patients in countries such as Malaysia, in order to rationalize budget allocation and the utilization of health care resources in a cost-effective way and thereby help provide more personalized treatment for cancer patients.

  8. Nitrogen Trifluoride-Based Fluoride- Volatility Separations Process: Initial Studies

    Energy Technology Data Exchange (ETDEWEB)

    McNamara, Bruce K.; Scheele, Randall D.; Casella, Andrew M.; Kozelisky, Anne E.

    2011-09-28

    This document describes the results of our investigations on the potential use of nitrogen trifluoride as the fluorinating and oxidizing agent in fluoride volatility-based used nuclear fuel reprocessing. The conceptual process exploits differences in the temperatures at which nitrogen trifluoride reacts with fuel constituents to produce volatile fluorides, in order to achieve separations and recover valuable constituents. We provide results from our thermodynamic evaluations, thermo-analytical experiments, and kinetic models, and present a preliminary process flowsheet. The evaluations found that nitrogen trifluoride can effectively produce volatile fluorides at different temperatures depending on the fuel constituent.

  9. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
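
    A rough sketch of the probabilistic hypergraph construction described above (each media item spawns a hyperedge over its K nearest neighbours, with soft vertex-to-hyperedge assignments from a similarity kernel) is given below. The features, the Gaussian kernel, the value of K, and the one-step relevance propagation are illustrative assumptions; the paper's weight optimization via a second-order cone program and its transductive inference are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(50, 16))          # toy feature vectors for 50 media items
    n, K = X.shape[0], 5                   # K neighbours per hyperedge (assumption)

    # Pairwise squared distances and a Gaussian similarity kernel.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    S = np.exp(-d2 / np.median(d2))

    # Probabilistic incidence matrix H: H[v, e] is the probability that vertex v
    # belongs to the hyperedge built around item e (its K nearest neighbours).
    H = np.zeros((n, n))
    for e in range(n):
        nbrs = np.argsort(d2[e])[:K + 1]   # item e itself plus its K nearest neighbours
        H[nbrs, e] = S[nbrs, e] / S[nbrs, e].sum()

    # Crude stand-in for ranking: one step of relevance propagation from a seed
    # media document through shared hyperedges (uniform hyperedge weights).
    y = np.zeros(n)
    y[0] = 1.0                             # seed media document
    scores = H @ (H.T @ y)
    print("top candidate items to enrich the seed event:", np.argsort(-scores)[:5])
    ```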

  10. Coupled prediction of flood response and debris flow initiation during warm- and cold-season events in the Southern Appalachians, USA

    Science.gov (United States)

    Tao, J.; Barros, A. P.

    2014-01-01

    Debris flows associated with rainstorms are a frequent and devastating hazard in the Southern Appalachians in the United States. Whereas warm-season events are clearly associated with heavy rainfall intensity, the same cannot be said for the cold-season events. Instead, there is a relationship between large (cumulative) rainfall events independently of season, and thus hydrometeorological regime, and debris flows. This suggests that the dynamics of subsurface hydrologic processes play an important role as a trigger mechanism, specifically through soil moisture redistribution by interflow. We further hypothesize that the transient mass fluxes associated with the temporal-spatial dynamics of interflow govern the timing of shallow landslide initiation, and subsequent debris flow mobilization. The first objective of this study is to investigate this relationship. The second objective is to assess the physical basis for a regional coupled flood prediction and debris flow warning system. For this purpose, uncalibrated model simulations of well-documented debris flows in headwater catchments of the Southern Appalachians using a 3-D surface-groundwater hydrologic model coupled with slope stability models are examined in detail. Specifically, we focus on two vulnerable headwater catchments that experience frequent debris flows, the Big Creek and the Jonathan Creek in the Upper Pigeon River Basin, North Carolina, and three distinct weather systems: an extremely heavy summertime convective storm in 2011; a persistent winter storm lasting several days; and a severe winter storm in 2009. These events were selected due to the optimal availability of rainfall observations; availability of detailed field surveys of the landslides shortly after they occurred, which can be used to evaluate model predictions; and because they are representative of events that cause major economic losses in the region. The model results substantiate that interflow is a useful prognostic of conditions

  11. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT, followed by a LDT with an embedded prospective memory (PM component. Event-based cues were constituted by color and lexicality (red words. Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  12. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  13. Event-based scenario manager for multibody dynamics simulation of heavy load lifting operations in shipyards

    Directory of Open Access Journals (Sweden)

    Sol Ha

    2016-01-01

    Full Text Available This paper suggests an event-based scenario manager capable of creating and editing a scenario for shipbuilding process simulation based on multibody dynamics. To configure various situations in shipyards and connect easily with multibody dynamics, the proposed method has two main concepts: an Actor and an Action List. The Actor represents the atomic unit of action in the multibody dynamics and can be connected to a specific component of the dynamics kernel, such as a body or joint. The user can make up a scenario by combining actors. The Action List contains information for arranging and executing the actors. Since the shipbuilding process is a kind of event-based sequence, all simulation models were configured using Discrete EVent System Specification (DEVS) formalism. The proposed method was applied to simulations of various operations in shipyards such as lifting and erection of a block and heavy load lifting operation using multiple cranes.
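
    The Actor / Action List split can be illustrated with a small, self-contained sketch. The class and method names below are hypothetical stand-ins; the real scenario manager binds actors to a multibody dynamics kernel and implements full DEVS semantics, both of which are omitted here.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Actor:
    """Smallest unit of action; in the real system it would be bound to a
    body or joint of the dynamics kernel (hypothetical interface)."""
    name: str
    start_time: float
    behaviour: Callable[[float], None]   # called while the actor is active
    duration: float = 1.0

    def active(self, t: float) -> bool:
        return self.start_time <= t < self.start_time + self.duration


@dataclass
class ActionList:
    """Orders actors and executes them as an event-based sequence."""
    actors: List[Actor] = field(default_factory=list)

    def add(self, actor: Actor) -> None:
        self.actors.append(actor)
        self.actors.sort(key=lambda a: a.start_time)

    def run(self, t_end: float, dt: float = 0.5) -> None:
        t = 0.0
        while t <= t_end:
            for actor in self.actors:
                if actor.active(t):
                    actor.behaviour(t)
            t += dt


if __name__ == "__main__":
    scenario = ActionList()
    scenario.add(Actor("lift block", 0.0, lambda t: print(f"t={t}: crane lifting")))
    scenario.add(Actor("erect block", 1.0, lambda t: print(f"t={t}: block erected")))
    scenario.run(t_end=1.5)
```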

  14. Limits on the efficiency of event-based algorithms for Monte Carlo neutron transport

    Directory of Open Access Journals (Sweden)

    Paul K. Romano

    2017-09-01

    Full Text Available The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, the vector speedup is also limited by differences in the execution time for events being carried out in a single event-iteration.

  15. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs
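
    The weighting step described above reduces to a sum over accident bins of each event's bin-level importance multiplied by the bin risk. The sketch below uses invented bins, events, and numbers purely to show the arithmetic; none of the values come from the report.

```python
# Overall importance of an event = sum over accident bins of
# (importance of the event within the bin) x (risk of the bin).
# All names and numbers below are illustrative, not from the report.

bin_risk = {"harbor_fire": 2.0e-4, "waterway_collision": 5.0e-4,
            "open_sea_grounding": 1.0e-4}

# importance measure of each basic event within each accident bin
bin_importance = {
    "navigation_error": {"harbor_fire": 0.1, "waterway_collision": 0.6,
                         "open_sea_grounding": 0.5},
    "cargo_containment_failure": {"harbor_fire": 0.7, "waterway_collision": 0.3,
                                  "open_sea_grounding": 0.2},
}

overall = {
    event: sum(imp * bin_risk[b] for b, imp in per_bin.items())
    for event, per_bin in bin_importance.items()
}

for event, score in sorted(overall.items(), key=lambda kv: -kv[1]):
    print(f"{event:28s} risk-weighted importance = {score:.2e}")
```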

  16. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
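
    The key mechanism, holding the last control input until an event-triggered condition is violated, can be shown without the HDP learning machinery. The sketch below uses a fixed stabilizing gain and a simple norm-based trigger; both are placeholders for the learned actor-critic and the paper's trigger threshold, and the linear plant is an assumption made only to keep the example short.

```python
import numpy as np

# Discrete-time linear system x_{k+1} = A x_k + B u_k stands in for the
# unknown nonlinear plant; the gain K is a placeholder for the HDP actor.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
K = np.array([[-10.0, -4.0]])          # assumed stabilizing state feedback

def event_triggered_run(x0, steps=50, threshold=0.05):
    x = np.array(x0, dtype=float).reshape(-1, 1)
    x_event = x.copy()                 # state at the last control update
    u = K @ x_event
    updates = 0
    for _ in range(steps):
        # trigger: recompute the control only when the gap between the
        # current state and the last sampled state grows beyond the threshold
        if np.linalg.norm(x - x_event) > threshold * np.linalg.norm(x):
            x_event = x.copy()
            u = K @ x_event
            updates += 1
        x = A @ x + B @ u
    return x.ravel(), updates

final_state, n_updates = event_triggered_run([1.0, 0.0])
print("final state:", final_state, "control updates:", n_updates)
```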

  17. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    Science.gov (United States)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in the event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise and effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.

  18. Sensitivity of the Reaction Mechanism of the Ozone Depletion Events during the Arctic Spring on the Initial Atmospheric Composition of the Troposphere

    Directory of Open Access Journals (Sweden)

    Le Cao

    2016-09-01

    Full Text Available Ozone depletion events (ODEs) during the Arctic spring have been investigated since the 1980s. It was found that the depletion of ozone is highly associated with the release of halogens, especially bromine-containing compounds. These compounds originate from various substrates such as the ice/snow-covered surfaces in the Arctic. In the present study, the dependence of the mixing ratios of ozone and principal bromine species during ODEs on the initial composition of the Arctic atmospheric boundary layer was investigated by using a concentration sensitivity analysis. This analysis was performed by implementing a reaction mechanism representing the ozone depletion and halogen release in the box model KINAL (KInetic aNALysis of reaction mechanics). The ratios between the relative change of the mixing ratios of particular species such as ozone and the variation in the initial concentration of each atmospheric component were calculated, which indicate the relative importance of each initial species in the chemical kinetic system. The results of the computations show that the impact of various chemical species is different for ozone and bromine-containing compounds during the depletion of ozone. It was found that CH3CHO critically controls the time scale of the complete removal of ozone. However, the rate of the ozone loss and the maximum values of bromine species are only slightly influenced by the initial value of CH3CHO. In addition, according to the concentration sensitivity analysis, the reduction of initial Br2 was found to cause a significant retardation of the ODE, while the initial mixing ratio of HBr exerts minor influence on both ozone and bromine species. It is also interesting to note that an increase of C2H2 would significantly raise the amount of HOBr and Br in the atmosphere while the ozone depletion is hardly changed.
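
    The concentration sensitivity used here is essentially a normalised finite-difference ratio: the relative change in a target mixing ratio (for example, ozone at the end of the run) divided by the relative perturbation of one initial species. The snippet below shows that bookkeeping on a toy two-species surrogate; the toy kinetics, rate constant, and mixing ratios are assumptions standing in for the full KINAL mechanism.

```python
import numpy as np

def toy_model(init, t_end=10.0, dt=0.01):
    """Toy surrogate for the box-model run: ozone destroyed at a rate
    proportional to [O3][Br2]. Purely illustrative chemistry."""
    o3, br2 = init["O3"], init["Br2"]
    k = 0.05
    for _ in range(int(t_end / dt)):
        o3 -= k * o3 * br2 * dt
    return {"O3": o3}

def concentration_sensitivity(init, species, target="O3", rel_perturb=0.01):
    """S = (relative change in target output) / (relative change in the
    initial mixing ratio of `species`)."""
    base = toy_model(init)[target]
    perturbed = dict(init)
    perturbed[species] = init[species] * (1.0 + rel_perturb)
    pert = toy_model(perturbed)[target]
    return ((pert - base) / base) / rel_perturb

init = {"O3": 40.0, "Br2": 1.0}   # arbitrary initial mixing ratios
for sp in init:
    print(f"sensitivity of final O3 to initial {sp}: "
          f"{concentration_sensitivity(init, sp):+.3f}")
```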

  19. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
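
    A rough feel for why vector efficiency is governed by the ratio of bank size to vector width can be obtained from a toy simulation. The sketch below is not the paper's analytical model: it simply shrinks a particle bank by a fixed survival probability per event iteration and counts how many vector lanes are wasted on partially filled passes; the survival probability and the padding assumption are inventions for illustration only.

```python
import math
import random

def vector_efficiency(bank_size, vector_width, survival=0.9, seed=0):
    """Toy estimate of vector efficiency for an event-based sweep.

    Assumptions (not from the paper): every live particle undergoes one event
    per iteration, each survives with a fixed probability, and lanes in a
    partially filled vector pass are wasted.
    """
    rng = random.Random(seed)
    alive = bank_size
    useful_lanes = 0     # particle-events actually processed
    total_lanes = 0      # lanes occupied by vector passes, including padding
    while alive > 0:
        passes = math.ceil(alive / vector_width)
        useful_lanes += alive
        total_lanes += passes * vector_width
        # each particle independently survives to the next event iteration
        alive = sum(1 for _ in range(alive) if rng.random() < survival)
    return useful_lanes / total_lanes

for bank in (8 * 64, 20 * 64, 100 * 64):
    print(f"bank/vector = {bank // 64:4d}x  ->  "
          f"efficiency ~ {vector_efficiency(bank, 64):.2f}")
```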

  20. Intensity changes in future extreme precipitation: A statistical event-based approach.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    Short-lived precipitation extremes are often responsible for hazards in urban and rural environments with economic and environmental consequences. The precipitation intensity is expected to increase about 7% per degree of warming, according to the Clausius-Clapeyron (CC) relation. However, the observations often show a much stronger increase in the sub-daily values. In particular, the behavior of the hourly summer precipitation from radar observations with the dew point temperature (the Pi-Td relation) for the Netherlands suggests that for moderate to warm days the intensification of the precipitation can be even higher than 21% per degree of warming, that is 3 times higher than the expected CC relation. The rate of change depends on the initial precipitation intensity, as low percentiles increase with a rate below CC, the medium percentiles with 2CC and the moderate-high and high percentiles with 3CC. This non-linear statistical Pi-Td relation is suggested to be used as a delta-transformation to project how a historic extreme precipitation event would intensify under future, warmer conditions. Here, the Pi-Td relation is applied over a selected historic extreme precipitation event to 'up-scale' its intensity to warmer conditions. Additionally, the selected historic event is simulated in the high-resolution, convective-permitting weather model Harmonie. The initial and boundary conditions are alternated to represent future conditions. The comparison between the statistical and the numerical method of projecting the historic event to future conditions showed comparable intensity changes, which depending on the initial percentile intensity, range from below CC to a 3CC rate of change per degree of warming. The model tends to overestimate the future intensities for the low- and the very high percentiles and the clouds are somewhat displaced, due to small wind and convection changes. The total spatial cloud coverage in the model remains, as also in the statistical
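
    The statistical delta-transformation amounts to scaling each precipitation value of a historic event by a warming-dependent factor whose rate depends on where the value sits in the event's intensity distribution. The rates below (below CC for low percentiles, 2CC for medium, 3CC for high) follow the abstract, but the percentile breakpoints and the example intensities are assumptions.

```python
import numpy as np

CC = 0.07  # Clausius-Clapeyron rate: ~7% intensity increase per degree warming

def delta_transform(intensities, delta_T):
    """Scale intensities of a historic event to a warmer climate.

    Rates follow the abstract (sub-CC for low percentiles, 2CC for medium,
    3CC for moderate-high and high); the percentile breakpoints (50/80) used
    here are assumed for illustration.
    """
    p = np.array(intensities, dtype=float)
    ranks = np.argsort(np.argsort(p)) / max(len(p) - 1, 1) * 100  # percentile rank
    rate = np.where(ranks < 50, 0.5 * CC,
                    np.where(ranks < 80, 2 * CC, 3 * CC))
    return p * (1.0 + rate) ** delta_T

event = [0.2, 0.5, 1.0, 2.5, 6.0, 14.0]   # mm/h, made-up historic event
print(np.round(delta_transform(event, delta_T=2.0), 2))
```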

  1. Ground-based solar radio observations of the August 1972 events

    International Nuclear Information System (INIS)

    Bhonsle, R.V.; Degaonkar, S.S.; Alurkar, S.K.

    1976-01-01

    Ground-based observations of the variable solar radio emission ranging from few millimetres to decametres have been used here as a diagnostic tool to gain coherent phenomenological understanding of the great 2, 4 and 7 August, 1972 solar events in terms of dominant physical processes like generation and propagation of shock waves in the solar atmosphere, particle acceleration and trapping. Four major flares are selected for detailed analysis on the basis of their ability to produce energetic protons, shock waves, polar cap absorptions (PCA) and sudden commencement (SC) geomagnetic storms. A comparative study of their radio characteristics is made. Evidence is seen for the pulsations during microwave bursts by the mechanism similar to that proposed by McLean et al. (1971), to explain the pulsations in the metre wavelength continuum radiation. It is suggested that the multiple peaks observed in some microwave bursts may be attributable to individual flares occurring sequentially due to a single initiating flare. Attempts have been made to establish identification of Type II bursts with the interplanetary shock waves and SC geomagnetic storms. Furthermore, it is suggested that it is the mass behind the shock front which is the deciding factor for the detection of shock waves in interplanetary space. It appears that more work is necessary in order to identify which of the three moving Type IV bursts (Wild and Smerd, 1972), namely, advancing shock front, expanding magnetic arch and ejected plasma blob serves as the piston-driver behind the interplanetary shocks. The existing criteria for proton flare prediction have been summarized and two new criteria have been proposed. (Auth.)

  2. Moving State Marine SINS Initial Alignment Based on High Degree CKF

    Directory of Open Access Journals (Sweden)

    Yong-Gang Zhang

    2014-01-01

    Full Text Available A new moving state marine initial alignment method of strap-down inertial navigation system (SINS) is proposed based on high-degree cubature Kalman filter (CKF), which can capture higher order Taylor expansion terms of nonlinear alignment model than the existing third-degree CKF, unscented Kalman filter and central difference Kalman filter, and improve the accuracy of initial alignment under large heading misalignment angle condition. Simulation results show the efficiency and advantage of the proposed initial alignment method as compared with existing initial alignment methods for the moving state SINS initial alignment with large heading misalignment angle.

  3. Early prediction of adverse events in enhanced recovery based upon the host systemic inflammatory response.

    Science.gov (United States)

    Lane, J C; Wright, S; Burch, J; Kennedy, R H; Jenkins, J T

    2013-02-01

    Early identification of patients experiencing postoperative complications is imperative for successful management. C-reactive protein (CRP) is a nonspecific marker of inflammation used in many specialties to monitor patient condition. The role of CRP measurement early in the elective postoperative colorectal patient is unclear, particularly in the context of enhanced recovery (ERAS). Five hundred and thirty-three consecutive patients who underwent elective colorectal surgery between October 2008 and October 2010 within an established ERAS programme were studied. Patients were separated into a development group of 265 patients and a validation group of 268 patients by chronological order. CRP and white cell count were added to a prospectively maintained ERAS database. The primary outcome of the study was all adverse events (including infective complications, postoperative organ dysfunction and prolonged length of stay) during the initial hospital admission. Significant predictors for adverse events on univariate analysis were submitted to multivariate regression analysis and the resulting model applied to the validation group. The validity and predictive accuracy of the regression model were assessed using receiver operating characteristic curve/area under the curve (AUC) analysis. CRP levels >150 mg/l on postoperative day 2 and a rising CRP on day 3 were independently associated with all adverse events during the hospital admission. A weighted model was applied to the validation group yielding an AUC of 0.65 (95% CI 0.58-0.73) indicating, at best, modest discrimination and predictive accuracy for adverse events. Measurement of CRP in patients after elective colorectal surgery in the first few days after surgery within ERAS can assist in identifying those at risk of adverse events and a prolonged hospital stay. A CRP value of >150 mg/l on day 2 and a rising CRP on day 3 should alert the surgeon to an increased likelihood of such events. © 2012 The Authors
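
    The warning rule reported above is simple enough to express directly: flag a patient when CRP exceeds 150 mg/l on postoperative day 2 or is still rising on day 3. The helper below is only an illustration of that rule, with made-up patient values; it is not the study's validated weighted model.

```python
def flag_at_risk(crp_day2, crp_day3, threshold=150.0):
    """Return True if the CRP pattern matches the study's warning signs:
    CRP > 150 mg/l on day 2, or a rising CRP from day 2 to day 3."""
    return crp_day2 > threshold or crp_day3 > crp_day2

# day-2 / day-3 CRP values in mg/l (made-up examples)
patients = {"A": (120, 90), "B": (180, 140), "C": (110, 160)}
for pid, (d2, d3) in patients.items():
    print(pid, "at risk" if flag_at_risk(d2, d3) else "low risk")
```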

  4. Measurement of the underlying event using track-based event shapes in Z→l{sup +}l{sup -} events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre of momentum energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire) located near Geneva (Switzerland). The analysed data (integrated luminosity: 1.1 fb{sup -1}) was recorded in 2011 with the ATLAS experiment. Events where a Z-boson was produced in the hard sub-process, which subsequently decays into an electron-positron or muon-antimuon pair, were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS except those of the leptons of the Z-decay. Thus, this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called "pile-up" (multiple overlapping proton-proton collisions) a novel technique was developed and successfully applied. The data was further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called "Underlying Event" and can be compared with predictions of Monte-Carlo event generators directly, i.e. without the necessity of running time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of relevant model parameters in a technique called "tuning". It became apparent, however, that the underlying Sjoestrand-Zijl model is unable to give a good description of the measured event-shape distributions.

  5. Life review based on remembering specific positive events in active aging.

    Science.gov (United States)

    Latorre, José M; Serrano, Juan P; Ricarte, Jorge; Bonete, Beatriz; Ros, Laura; Sitges, Esther

    2015-02-01

    The aim of this study is to evaluate the effectiveness of life review (LR) based on specific positive events in non-depressed older adults taking part in an active aging program. Fifty-five older adults were randomly assigned to an experimental group or an active control (AC) group. A six-session individual training of LR based on specific positive events was carried out with the experimental group. The AC group undertook a "media workshop" of six sessions focused on learning journalistic techniques. Pre-test and post-test measures included life satisfaction, depressive symptoms, experiencing the environment as rewarding, and autobiographical memory (AM) scales. LR intervention decreased depressive symptomatology, improved life satisfaction, and increased specific memories. The findings suggest that practice in AM for specific events is an effective component of LR that could be a useful tool in enhancing emotional well-being in active aging programs, thus reducing depressive symptoms. © The Author(s) 2014.

  6. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    Psi-calculi constitute a parametric framework for nominal process calculi, where constraint-based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extend prime event structures to allow finite representations of (omega) regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research.

  7. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Full Text Available Wireless sensor networks have strong dynamics and uncertainty, including network topological changes, node disappearance or addition, and exposure to various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, when regional data are lacking, the quantitative trend of unknown regional network security events can be calculated. This study introduces a sensor network security event forecast method named Prediction Network Security Incomplete Unmarked Data (PNSIUD) to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that, for forecasting unknown security events, the accuracy and effectiveness of the similarity-based forecast algorithm are better than those of the single-task learning method. At the same time, the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  8. Automated reasoning with dynamic event trees: a real-time, knowledge-based decision aide

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Subramanyan, N.

    1988-01-01

    The models and data contained in a probabilistic risk assessment (PRA) Event Sequence Analysis represent a wealth of information that can be used for dynamic calculation of event sequence likelihood. In this paper we report a new and unique computerization methodology which utilizes these data. This sub-system (referred to as PREDICTOR) has been developed and tested as part of a larger system. PREDICTOR performs a real-time (re)calculation of the estimated likelihood of core-melt as a function of plant status. This methodology uses object-oriented programming techniques from the artificial intelligence discipline that enable one to codify event tree and fault tree logic models and associated probabilities developed in a PRA study. Existence of off-normal conditions is reported to PREDICTOR, which then updates the relevant failure probabilities throughout the event tree and fault tree models by dynamically replacing the off-the-shelf (or prior) probabilities with new probabilities based on the current situation. The new event probabilities are immediately propagated through the models (using 'demons') and an updated core-melt probability is calculated. Along the way, the dominant non-success path of each event tree is determined and highlighted. (author)
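
    The core of what PREDICTOR is described as doing, replacing a basic-event probability when an off-normal condition is reported and immediately re-propagating it through the sequence logic, can be sketched with ordinary dictionaries. The event-tree structure and numbers below are invented for illustration; the real system codifies full PRA event and fault tree models with object-oriented "demons".

```python
# Each core-melt sequence is a list of top events that must all fail;
# sequences are treated as independent cut sets for this toy illustration.
prior_prob = {"loss_of_offsite_power": 1e-2,
              "diesel_generators_fail": 5e-3,
              "feedwater_fails": 2e-3,
              "bleed_and_feed_fails": 1e-1}

core_melt_sequences = [
    ["loss_of_offsite_power", "diesel_generators_fail"],
    ["feedwater_fails", "bleed_and_feed_fails"],
]

def core_melt_probability(prob):
    # rare-event approximation: sum of the sequence (cut-set) probabilities
    total = 0.0
    for seq in core_melt_sequences:
        p = 1.0
        for event in seq:
            p *= prob[event]
        total += p
    return total

current = dict(prior_prob)
print(f"baseline core-melt estimate: {core_melt_probability(current):.2e}")

# an off-normal condition is reported: offsite power has actually been lost,
# so its probability is set to 1 and the estimate is re-propagated
current["loss_of_offsite_power"] = 1.0
print(f"after loss of offsite power: {core_melt_probability(current):.2e}")
```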

  9. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of a parallel event building system for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modeling this prototype and extended architectures with MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems. This includes simulations of a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behavior were investigated. The influence of realistic event size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated as well as a two-stage event building system. (author)

  10. Seismology-based early identification of dam-formation landquake events.

    Science.gov (United States)

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed Dam-forming phase or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed masses slide into the river and perhaps even impact the topographic barrier on the opposite bank. Consequently, our approach to analyzing the LP and HF waveforms developed in this study has a high potential for identifying five dam-forming landquake events (DFLEs) in near real-time using broadband seismic records, which can provide timely warnings of impending floods to downstream residents.

  11. A browser-based event display for the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Hategan, M; McCauley, T; Nguyen, P

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser using HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to display of events can have broader usage and impact for experts and public alike.

  12. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  13. Location aware event driven multipath routing in Wireless Sensor Networks: Agent based approach

    Directory of Open Access Journals (Sweden)

    A.V. Sutagundar

    2013-03-01

    Full Text Available Wireless Sensor Networks (WSNs) demand reliable and energy efficient paths for critical information delivery to the sink node from an event occurrence node. Multipath routing facilitates reliable data delivery in case of critical information. This paper proposes an event-triggered multipath routing scheme in WSNs by employing a set of static and mobile agents. Every sensor node is assumed to know the location information of the sink node and itself. The proposed scheme works as follows: (1) The event node computes the arbitrary midpoint between itself and the sink node by using location information. (2) The event node establishes a shortest path from itself to the sink node through the reference axis by using a mobile agent with the help of location information; the mobile agent collects the connectivity information and other parameters of all the nodes on the way and provides the information to the sink node. (3) The event node finds the arbitrary location of the special (middle) intermediate nodes (above/below the reference axis) by using the midpoint location information given in step 1. (4) Mobile agents clone from the event node, carry the event type, and discover the paths passing through the special intermediate nodes; each path above/below the reference axis looks like an arc. While migrating from one sensor node to another along the traversed path, each mobile agent gathers the node information (such as node id, location information, residual energy, available bandwidth, and neighbor connectivity) and delivers it to the sink node. (5) The sink node constructs a partial topology, connecting the event and sink node, by using the connectivity information delivered by the mobile agents. Using the partial topology information, the sink node finds the multipath and the path weight factor by using link efficiency, energy ratio, and hop distance. (6) The sink node selects the number of paths among the available paths based upon the criticalness of an event, and (7) if the event is non
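
    Step (5) combines link efficiency, energy ratio, and hop distance into a single path weight. The exact combination is not given in the abstract; the sketch below assumes a simple weighted sum with a normalised hop-count penalty, purely to illustrate how the sink node could rank candidate paths. The coefficients and path values are made up.

```python
def path_weight(link_efficiency, energy_ratio, hops, max_hops=20,
                alpha=0.4, beta=0.4, gamma=0.2):
    """Hypothetical path weight factor: higher is better.
    link_efficiency and energy_ratio are assumed to lie in [0, 1];
    hop distance is penalised after normalisation. The coefficients
    are illustrative, not from the paper."""
    hop_score = 1.0 - min(hops, max_hops) / max_hops
    return alpha * link_efficiency + beta * energy_ratio + gamma * hop_score

# candidate multipaths discovered by the mobile agents (made-up values)
paths = {
    "shortest (reference axis)": (0.9, 0.5, 6),
    "upper arc":                 (0.7, 0.8, 9),
    "lower arc":                 (0.6, 0.9, 11),
}
ranked = sorted(paths.items(), key=lambda kv: -path_weight(*kv[1]))
for name, params in ranked:
    print(f"{name:28s} weight = {path_weight(*params):.2f}")
```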

  14. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and

  15. Lyapunov design of event-based controllers for the rendez-vous of coupled systems

    NARCIS (Netherlands)

    De Persis, Claudio; Postoyan, Romain

    2014-01-01

    The objective is to present a new type of triggering conditions together with new proof concepts for the event-based coordination of multi-agents. As a first step, we focus on the rendez-vous of two identical systems modeled as double integrators with additional damping in the velocity dynamics. The

  16. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet based on a multi-agent system approach, with which renewable energy resources can be fully utilized to meet load demand with high security and good dynamic quality. In the design of the control, a multi-agent system...

  17. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied on the entire example dataset
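
    A single-hidden-layer feed-forward detector of the kind described can be prototyped in a few lines with scikit-learn. The synthetic waveform windows and the hand-crafted features below (window energy, peak amplitude, and an STA/LTA-like ratio) are assumptions made only to keep the sketch self-contained; they do not reproduce the authors' training data or network configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

def make_window(has_event):
    """Synthetic 200-sample window: Gaussian noise, plus a damped burst if a
    microseismic event is present (stand-in for real low-S/N waveforms)."""
    w = rng.normal(0, 1.0, 200)
    if has_event:
        t = np.arange(60)
        w[80:140] += 3.0 * np.exp(-t / 20.0) * np.sin(0.6 * t)
    return w

def features(w):
    # simple hand-crafted features: energy, peak amplitude, STA/LTA-like ratio
    sta = np.mean(np.abs(w[80:120]))
    lta = np.mean(np.abs(w)) + 1e-9
    return [np.log(np.sum(w ** 2)), np.max(np.abs(w)), sta / lta]

labels = rng.integers(0, 2, 600)
X = np.array([features(make_window(y)) for y in labels])

# single hidden layer, tuned on a training split and applied to the rest
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X[:500], labels[:500])
print("held-out accuracy:", clf.score(X[500:], labels[500:]))
```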

  18. Mind the gap: modelling event-based and millennial-scale landscape dynamics

    NARCIS (Netherlands)

    Baartman, J.E.M.

    2012-01-01

    This research looks at landscape dynamics – erosion and deposition – from two different perspectives: long-term landscape evolution over millennial timescales on the one hand and short-term event-based erosion and deposition on the other hand. For the first, landscape evolution models (LEMs) are

  19. Component-Based Data-Driven Predictive Maintenance to Reduce Unscheduled Maintenance Events

    NARCIS (Netherlands)

    Verhagen, W.J.C.; Curran, R.; de Boer, L.W.M.; Chen, C.H.; Trappey, A.C.; Peruzzini, M.; Stjepandić, J.; Wognum, N.

    2017-01-01

    Costs associated with unscheduled and preventive maintenance can contribute significantly to an airline's expenditure. Reliability analysis can help to identify and plan for maintenance events. Reliability analysis in industry is often limited to statistically based

  20. Risk Stratification for the Development of Respiratory Adverse Events Following Vascular Surgery Using the Society of Vascular Surgery’s Vascular Quality Initiative

    Science.gov (United States)

    Genovese, Elizabeth A; Fish, Larry; Chaer, Rabih A; Makaroun, Michel S; Baril, Donald T

    2017-01-01

    Objective Post-operative respiratory adverse events (RAEs) are associated with high rates of morbidity and mortality in general surgery, however little is known about these complications in the vascular surgery population, a frail subset with multiple comorbidities. The objective of this study was to describe the contemporary incidence of RAEs in vascular surgery patients, the risk factors for this complication and the overall impact of RAEs on patient outcomes. Methods The Vascular Quality Initiative was queried (2003–2014) for patients who underwent endovascular abdominal aortic repair, open abdominal aortic aneurysm (AAA) repair, thoracic endovascular aortic repair (TEVAR), suprainguinal bypass or infrainguinal bypass. A mixed-effects logistic regression model determined the independent risk factors for RAEs. Using a random 85% of the cohort, a risk prediction score for RAEs was created and the score was validated using the remaining 15% of the cohort, comparing the predicted to the actual incidence of RAE and determining the area under the receiver operating characteristic curve. The independent risk of in-hospital mortality and discharge to a nursing facility associated with RAEs was determined using a mixed-effects logistic regression to control for baseline patient characteristics, operative variables and other post-operative adverse events. Results The cohort consisted of 52,562 patients, with a 5.4% incidence of RAEs. The highest rates of RAEs were seen in current smokers (6.1%), recent acute myocardial infarction (10.1%), symptomatic congestive heart failure (CHF) (9.9%), chronic obstructive pulmonary disease (COPD) requiring oxygen therapy (11.0%), urgent and emergent procedures (6.4% and 25.9%, respectively), open AAA repairs (17.6%), in-situ suprainguinal bypasses (9.68%) and TEVARs (9.6%). The variables included in the risk prediction score were age, body mass index, smoking status, CHF severity, COPD severity, degree of renal insufficiency

  1. Risk stratification for the development of respiratory adverse events following vascular surgery using the Society of Vascular Surgery's Vascular Quality Initiative.

    Science.gov (United States)

    Genovese, Elizabeth A; Fish, Larry; Chaer, Rabih A; Makaroun, Michel S; Baril, Donald T

    2017-02-01

    Postoperative respiratory adverse events (RAEs) are associated with high rates of morbidity and mortality in general surgery, however, little is known about these complications in the vascular surgery population, a frail subset with multiple comorbidities. The objective of this study was to describe the contemporary incidence of RAEs in vascular surgery patients, the risk factors for this complication, and the overall impact of RAEs on patient outcomes. The Vascular Quality Initiative was queried (2003-2014) for patients who underwent endovascular abdominal aortic repair, open abdominal aortic aneurysm repair, thoracic endovascular aortic repair, suprainguinal bypass, or infrainguinal bypass. A mixed-effects logistic regression model determined the independent risk factors for RAEs. Using a random 85% of the cohort, a risk prediction score for RAEs was created, and the score was validated using the remaining 15% of the cohort, comparing the predicted to the actual incidence of RAE and determining the area under the receiver operating characteristic curve. The independent risk of in-hospital mortality and discharge to a nursing facility associated with RAEs was determined using a mixed-effects logistic regression to control for baseline patient characteristics, operative variables, and other postoperative adverse events. The cohort consisted of 52,562 patients, with a 5.4% incidence of RAEs. The highest rates of RAEs were seen in current smokers (6.1%), recent acute myocardial infarction (10.1%), symptomatic congestive heart failure (9.9%), chronic obstructive pulmonary disease requiring oxygen therapy (11.0%), urgent and emergent procedures (6.4% and 25.9%, respectively), open abdominal aortic aneurysm repairs (17.6%), in situ suprainguinal bypasses (9.68%), and thoracic endovascular aortic repairs (9.6%). The variables included in the risk prediction score were age, body mass index, smoking status, congestive heart failure severity, chronic obstructive pulmonary
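
    A point-based risk score of the kind described assigns weights to each predictor and maps the total to a predicted RAE probability. The variables, point values, and risk bands below are entirely hypothetical placeholders (the published score and its coefficients are not reproduced in the abstract); the sketch only shows the mechanics of scoring a patient record.

```python
# Hypothetical point assignments; the real Vascular Quality Initiative score
# uses different variables, categories, and weights.
POINTS = {
    "age_over_75": 2,
    "bmi_under_20": 1,
    "current_smoker": 1,
    "chf_symptomatic": 3,
    "copd_on_oxygen": 3,
    "renal_insufficiency": 2,
    "open_aaa_repair": 4,
    "emergent_procedure": 5,
}

def rae_score(patient):
    """Sum the points for every risk factor present in the patient record."""
    return sum(pts for factor, pts in POINTS.items() if patient.get(factor))

def predicted_risk(score):
    # illustrative mapping from score bands to predicted RAE incidence
    if score <= 2:
        return 0.02
    if score <= 6:
        return 0.08
    return 0.25

patient = {"age_over_75": True, "copd_on_oxygen": True, "open_aaa_repair": True}
s = rae_score(patient)
print(f"score = {s}, predicted RAE risk ~ {predicted_risk(s):.0%}")
```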

  2. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system based on the method of neurocomputing. ... 0.79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for automatic detection

  3. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Full Text Available Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  4. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    OpenAIRE

    Katherine M. Boydell

    2011-01-01

    A symbolic interaction (Blumer, 1969; Mead, 1934; Prus, 1996; Prus & Grills, 2003) approach was taken to study the collective event (Prus, 1997) of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed the authors to study the process of the creation of the dance and its capacity to convert abstract research into concre...

  5. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  6. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes representing events. ... We exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally, we provide a mapping from DCR Graphs to Büchi automata.

  7. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.

  8. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not, All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  9. Research on Crowdsourcing Emergency Information Extraction of Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot extract structured emergency event information accurately; general information retrieval tools cannot completely identify emergency geographic information; and these approaches do not provide an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, intended to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest path algorithm to join toponym pieces into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of time for defense and disaster reduction. The technology decreases the number of casualties and the property damage in the country and the world, which is of great significance to the state and society.

  10. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists there are many causes that contribute to flood events. The two most dominant factors are the meteorology factor (i.e climate change) and change in land use. These two factors contributed to floods in recent decade especially in the monsoonal catchment such as Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model in the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study because the study only tries to quantify rainfall changes during these two events to simulate the discharge and runoff value. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact to determine the peak discharge and runoff depth for the study area.

  11. Event-based rainfall-runoff modelling of the Kelantan River Basin

    International Nuclear Information System (INIS)

    Basarudin, Z; Adnan, N A; Latif, A R A; Syafiqah, N; Tahir, W

    2014-01-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists there are many causes that contribute to flood events. The two most dominant factors are the meteorology factor (i.e climate change) and change in land use. These two factors contributed to floods in recent decade especially in the monsoonal catchment such as Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model in the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study because the study only tries to quantify rainfall changes during these two events to simulate the discharge and runoff value. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact to determine the peak discharge and runoff depth for the study area

  12. Calculation of intercepted runoff depth based on stormwater quality and environmental capacity of receiving waters for initial stormwater pollution management.

    Science.gov (United States)

    Peng, Hai-Qin; Liu, Yan; Gao, Xue-Long; Wang, Hong-Wu; Chen, Yi; Cai, Hui-Yi

    2017-11-01

    While point source pollution has gradually been controlled in recent years, the non-point source pollution problem has become increasingly prominent. The receiving waters are frequently polluted by initial stormwater from the separate stormwater system and by wastewater that enters the stormwater pipes from sewage pipes. Consequently, calculating the intercepted runoff depth has become a problem that must be resolved immediately for initial stormwater pollution management. The accurate calculation of intercepted runoff depth provides a solid foundation for selecting the appropriate size of intercepting facilities in drainage and interception projects. This study establishes a separate stormwater system for the Yishan Building watershed of Fuzhou City using the InfoWorks Integrated Catchment Management (InfoWorks ICM), which can predict the stormwater flow velocity and the flow at the discharge outlet after each rainfall. The intercepted runoff depth is calculated from the stormwater quality and the environmental capacity of the receiving waters. The average intercepted runoff depth from six rainfall events is calculated as 4.1 mm based on stormwater quality. The average intercepted runoff depth from six rainfall events is calculated as 4.4 mm based on the environmental capacity of the receiving waters. The intercepted runoff depth differs when calculated from various aspects. The selection of the intercepted runoff depth depends on the goal of water quality control, the self-purification capacity of the water bodies, and other factors of the region.

  13. Event Investigation

    International Nuclear Information System (INIS)

    Korosec, D.

    2000-01-01

    Events in the nuclear industry are investigated both from the licensee's point of view and from the regulatory side. The importance of event investigation is well known. One of the main goals of such an investigation is to prevent recurrence of the circumstances leading to the event and of its consequences. Protection of nuclear workers against nuclear hazards, and of the general public against the harmful effects of an event, can be achieved by a systematic approach to event investigation. Both the nuclear safety regulatory body and the licensee shall ensure that operationally significant events are investigated in a systematic and technically sound manner to gather information pertaining to the probable causes of the event. One of the results should be appropriate feedback of the lessons learned to the regulatory body, the nuclear industry and the general public. In the present paper a general description of the systematic approach to event investigation is presented. The systematic approach to event investigation works best where cooperation is present among the different divisions of the nuclear facility or the regulatory body. By involving management and supervisors, the safety office can usually improve its efforts in the whole process. The end result shall be a program which serves to prevent events and to reduce the time and effort spent resolving the root cause which initiated each event. Selection of the proper method for the investigation and an adequate review of the findings and conclusions lead to a higher level of overall nuclear safety. (author)

  14. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract Adverse Event terms and encode them to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. On these 100 labels, ETHER achieved a precision of 81 percent and a recall of 92 percent. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
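
    The precision and recall figures quoted above follow the usual set-based definitions over extracted versus reference terms; the sketch below uses hypothetical Preferred Terms, not data from the study.

      # Precision/recall of extracted Adverse Event terms against a reference set.
      # The term lists are hypothetical placeholders.
      def precision_recall(extracted: set, reference: set) -> tuple:
          true_pos = len(extracted & reference)
          precision = true_pos / len(extracted) if extracted else 0.0
          recall = true_pos / len(reference) if reference else 0.0
          return precision, recall

      extracted = {"nausea", "headache", "rash", "dizziness"}
      reference = {"nausea", "headache", "rash", "vomiting"}
      p, r = precision_recall(extracted, reference)
      print(f"precision={p:.2f} recall={r:.2f}")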

  15. Acoustic Emission based on sentry function to monitor the initiation of delamination in composite materials

    International Nuclear Information System (INIS)

    Bakhtiary Davijani, A.A.; Hajikhani, M.; Ahmadi, M.

    2011-01-01

    Research highlights: → Constant load does not imply constant damage in composite materials. → Different damage mechanisms produce different AE events. → The sentry function is a useful tool to monitor the initiation of damage in delamination. → The lower the sentry function value, the more damage the material has endured. -- Abstract: Delamination is the most common failure mode in composite materials, since it reduces stiffness and can grow through other layers. Delamination consists of two main stages: initiation and propagation. Understanding the behavior of the material in these stages is very important, and it has therefore been thoroughly studied by different methods such as numerical analysis, Acoustic Emission (AE), and modeling. Of the two stages, initiation is the more critical in the delamination of the material. Once initiation, which normally requires a greater amount of force, has occurred, cracks can easily propagate through the structure under little force and cause its failure. A better knowledge of initiation can lead to better design and the production of stronger materials. Additionally, more knowledge about crack initiation and its internal micro-events would help improve other parameters and result in higher strength against crack initiation. AE is a suitable method for in situ monitoring of damage in composite materials. In this study, AE was applied to test different glass/epoxy specimens loaded under mode I delamination. A function that combines AE and mechanical information is employed to investigate the initiation of delamination. Scanning electron microscopy (SEM) was used to verify the results of this function. It is shown that this method is an appropriate technique to monitor the behavior of delamination initiation.
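
    The abstract does not spell out the combined function; in the acoustic-emission literature the sentry function is commonly defined as the logarithm of the ratio of mechanical strain energy to cumulative acoustic energy. The sketch below follows that common definition on synthetic data and may differ from the paper's exact formulation.

      import numpy as np

      # Sentry function f(x) = ln(Es(x) / Ea(x)): mechanical strain energy over
      # cumulative acoustic-emission energy versus displacement (common literature
      # definition; the paper's exact formulation may differ). Data are synthetic.
      def sentry_function(displacement, load, ae_energy):
          es = np.cumsum(load * np.gradient(displacement))   # ~ integral of F dx
          ea = np.cumsum(ae_energy)                          # cumulative AE energy
          es = np.where(es > 0, es, np.nan)                  # undefined where either
          ea = np.where(ea > 0, ea, np.nan)                  # energy is still zero
          return np.log(es / ea)

      x = np.linspace(0.0, 5.0, 200)                           # displacement (mm)
      f_load = 100.0 * x * np.exp(-0.3 * x)                    # load (N), synthetic
      ae = np.random.default_rng(0).exponential(0.05, x.size)  # AE energy per step
      print(sentry_function(x, f_load, ae)[-5:])               # a drop marks damage growth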

  16. A student-facilitated community-based support group initiative for ...

    African Journals Online (AJOL)

    A student-facilitated community-based support group initiative for Mental Health ... was a collaborative partnership between a local University Psychology Department ... users, Rehabilitation, Primary Health Care, Social support, Stigmatisation ...

  17. Initial Remedial Action Plan for Expanded Bioventing System BX Service Station, Patrick Air Force Base, Florida

    National Research Council Canada - National Science Library

    1995-01-01

    This initial remedial action plan presents the scope for an expanded bioventing system for in situ treatment of fuel-contaminated soils at the BX Service Station at Patrick Air Force Base (AFB), Florida...

  18. Combination of graph heuristics in producing initial solution of curriculum based course timetabling problem

    Science.gov (United States)

    Wahid, Juliana; Hussin, Naimah Mohd

    2016-08-01

    The construction of a population of initial solutions is a crucial task in population-based metaheuristic approaches for solving the curriculum-based university course timetabling problem, because it can affect the convergence speed and also the quality of the final solution. This paper explores combinations of graph heuristics in the construction approach to the curriculum-based course timetabling problem to produce a population of initial solutions. The graph heuristics were applied singly and in combinations of two. In addition, several ways of assigning courses to rooms and timeslots were implemented. All heuristic settings were then tested on the same curriculum-based course timetabling instances and compared with each other in terms of the number of initial solutions produced. The results show that the combination of the saturation degree heuristic followed by the largest degree heuristic produces the largest population of initial solutions. The results from this study can be used in the improvement phase of algorithms that operate on a population of initial solutions.
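
    As a rough illustration of the winning combination reported above (saturation degree first, ties broken by largest degree), the sketch below ranks unscheduled courses over a hypothetical conflict graph; the actual instances, room-assignment rules and data structures of the study are not reproduced here.

      # Pick the next course by saturation degree (number of distinct timeslots
      # already used by conflicting neighbours), breaking ties by largest degree
      # (most conflicts). Graph and timeslot data are illustrative.
      def next_course(unscheduled, conflicts, assigned_slot):
          def key(course):
              sat = len({assigned_slot[n] for n in conflicts[course] if n in assigned_slot})
              deg = len(conflicts[course])
              return (sat, deg)
          return max(unscheduled, key=key)

      conflicts = {"A": {"B", "C"}, "B": {"A"}, "C": {"A", "D"}, "D": {"C"}}
      assigned_slot = {"A": 0}              # course A already placed in timeslot 0
      unscheduled = {"B", "C", "D"}
      print(next_course(unscheduled, conflicts, assigned_slot))   # -> 'C'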

  19. Estimation of core-damage frequency to evolutionary ALWR [advanced light water reactor] due to seismic initiating events: Task 4.3.3

    International Nuclear Information System (INIS)

    Brooks, R.D.; Harrison, D.G.; Summitt, R.L.

    1990-04-01

    The Electric Power Research Institute (EPRI) is presently developing a requirements document for the design of advanced light water reactors (ALWRs). One of the basic goals of the EPRI ALWR Requirements Document is that the core-damage frequency for an ALWR shall be less than 1.0E-5. To aid in this effort, the Department of Energy's Advanced Reactor Severe Accident Program (ARSAP) initiated a functional probabilistic risk assessment (PRA) to determine how effectively the evolutionary plant requirements contained in the existing EPRI Requirements Document assure that this safety goal will be met. This report develops an approximation of the core-damage frequency due to seismic events for both evolutionary plant designs (pressurized-water reactor (PWR) and boiling-water reactor (BWR)) as modeled in the corresponding functional PRAs. Component fragility values were taken directly from information which has been submitted for inclusion in Appendix A to Volume 1 of the EPRI Requirements Document. The results show a seismic core-damage frequency of 5.2E-6 for PWRs and 5.0E-6 for BWRs. Combined with the internal initiators from the functional PRAs, the overall core-damage frequency is 6.0E-6 for both the PWR and the BWR, which satisfies the 1.0E-5 EPRI goal. In addition, site-specific considerations, such as more rigid components and less conservative fragility data and seismic hazard curves, may further reduce these frequencies. The effects of seismic events on structures are not addressed in this generic evaluation and should be addressed separately on a design-specific basis. 7 refs., 6 figs., 3 tabs
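
    The quoted totals are simple frequency sums of the seismic and internal-event contributions; the short check below uses the figures from the abstract, with internal-event values inferred so that the totals match, since they are not stated directly.

      # Overall core-damage frequency = seismic + internal-event contribution,
      # compared against the 1.0E-5 per reactor-year EPRI goal. The internal-event
      # values are inferred from the quoted totals, not given in the abstract.
      goal = 1.0e-5
      seismic = {"PWR": 5.2e-6, "BWR": 5.0e-6}
      internal = {"PWR": 0.8e-6, "BWR": 1.0e-6}   # inferred so totals equal 6.0E-6

      for plant in ("PWR", "BWR"):
          total = seismic[plant] + internal[plant]
          print(f"{plant}: total CDF = {total:.1e} per year, meets goal: {total < goal}")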

  20. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
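
    A compact sketch of the kinds of model-data comparison metrics named above (RMSE, prediction efficiency, and the contingency-table scores POD, POFD and Heidke Skill Score), computed here for one synthetic model-observation pair; this illustrates the standard formulas, not CCMC's implementation.

      import numpy as np

      def rmse(model, obs):
          return float(np.sqrt(np.mean((model - obs) ** 2)))

      def prediction_efficiency(model, obs):
          # 1 - SSE / variance of the observations (perfect model -> 1.0)
          return 1.0 - np.sum((model - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def skill_scores(model, obs, threshold):
          hit  = np.sum((model >= threshold) & (obs >= threshold))
          fa   = np.sum((model >= threshold) & (obs <  threshold))
          miss = np.sum((model <  threshold) & (obs >= threshold))
          cn   = np.sum((model <  threshold) & (obs <  threshold))
          pod  = hit / (hit + miss)            # probability of detection
          pofd = fa / (fa + cn)                # probability of false detection
          hss  = 2.0 * (hit * cn - fa * miss) / (
              (hit + miss) * (miss + cn) + (hit + fa) * (fa + cn))
          return pod, pofd, hss

      obs = np.random.default_rng(1).normal(0.0, 1.0, 500)          # synthetic observations
      model = obs + np.random.default_rng(2).normal(0.0, 0.5, 500)  # synthetic model output
      print(rmse(model, obs), prediction_efficiency(model, obs), skill_scores(model, obs, 1.0))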

  1. Whole-genome sequencing of multiple myeloma from diagnosis to plasma cell leukemia reveals genomic initiating events, evolution, and clonal tides.

    Science.gov (United States)

    Egan, Jan B; Shi, Chang-Xin; Tembe, Waibhav; Christoforides, Alexis; Kurdoglu, Ahmet; Sinari, Shripad; Middha, Sumit; Asmann, Yan; Schmidt, Jessica; Braggio, Esteban; Keats, Jonathan J; Fonseca, Rafael; Bergsagel, P Leif; Craig, David W; Carpten, John D; Stewart, A Keith

    2012-08-02

    The longitudinal evolution of a myeloma genome from diagnosis to plasma cell leukemia has not previously been reported. We used whole-genome sequencing (WGS) on 4 purified tumor samples and patient germline DNA drawn over a 5-year period in a t(4;14) multiple myeloma patient. Tumor samples were acquired at diagnosis, first relapse, second relapse, and end-stage secondary plasma cell leukemia (sPCL). In addition to the t(4;14), all tumor time points also shared 10 common single-nucleotide variants (SNVs) on WGS comprising shared initiating events. Interestingly, we observed genomic sequence variants that waxed and waned with time in progressive tumors, suggesting the presence of multiple independent, yet related, clones at diagnosis that rose and fell in dominance. Five newly acquired SNVs, including truncating mutations of RB1 and ZKSCAN3, were observed only in the final sPCL sample suggesting leukemic transformation events. This longitudinal WGS characterization of the natural history of a high-risk myeloma patient demonstrated tumor heterogeneity at diagnosis with shifting dominance of tumor clones over time and has also identified potential mutations contributing to myelomagenesis as well as transformation from myeloma to overt extramedullary disease such as sPCL.

  2. Dynamics of initial ionization events in biological molecules: Formation and fate of free radicals. Final technical report, May 1, 1994--December 31, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Castleman, A.W. Jr.

    1997-08-01

    Study of early time events following the absorption of electromagnetic radiation in biological systems has potentially significant impact on several areas of importance. In this context, the studies being conducted under this program provided insight into the conformational changes as well as the reactions leading to a variety of transformations that culminate from hydrogen atom and proton transfer events. These studies enabled an investigation of molecular details of structure-function relationships. In a second aspect of the program, investigations were conducted to provide basic underpinning research that contributed to a quantification of the behavior of radionuclides and pollutants associated with advanced energy activities after these materials emanate from their source and become transferred through the environment to the biota and human receptor. The approach to elucidating factors governing the difference between reactions in the gas and condensed phase was to study the initiating steps at progressively higher degrees of cluster aggregation. The author employed ultrafast laser techniques, in combination with selected molecules carefully prepared in tailored compositions, to investigate the primary mechanisms involved in various molecular functional groups following the absorption of electromagnetic radiation. He also studied various molecules representing chromophores in such biologically important molecules as tyrosine and amines.

  3. Event-Based Analysis of Rainfall-Runoff Response to Assess Wetland-Stream Interaction in the Prairie Pothole Region

    Science.gov (United States)

    Haque, M. A.; Ross, C.; Schmall, A.; Bansah, S.; Ali, G.

    2016-12-01

    Process-based understanding of wetland response to precipitation is needed to quantify the extent to which non-floodplain wetlands - such as Prairie potholes - generate flow and transmit that flow to nearby streams. While measuring wetland-stream (W-S) interaction is difficult, it is possible to infer it by examining hysteresis characteristics between wetland and stream stage during individual precipitation events. Hence, to evaluate W-S interaction, 10 intact and 10 altered/lost potholes were selected for study; they are located in Broughton's Creek Watershed (Manitoba, Canada) on both sides of a 5 km creek reach. Stilling wells (i.e., above ground wells) were deployed in the intact and altered wetlands to monitor surface water level fluctuations while water table wells were drilled below drainage ditches to a depth of 1 m to monitor shallow groundwater fluctuations. All stilling wells and water table wells were equipped with capacitance water level loggers to monitor fluctuations in surface water and shallow groundwater every 15 minutes. In 2013 (normal year) and 2014 (wet year), 15+ precipitation events were identified and scatter plots of wetland (x-axis) versus stream (y-axis) stage were built to identify W-S hysteretic dynamics. Initial data analysis reveals that in dry antecedent conditions, intact and altered wetlands show clockwise W-S relations, while drained wetlands show anticlockwise W-S hysteresis. However, in wetter antecedent conditions, all wetland types show anticlockwise hysteresis. Future analysis will target the identification of thresholds in antecedent moisture conditions that determine significant changes in event wetland response characteristics (e.g., the delay between the start of rainfall and stream stage, the maximum water level rise in each wetland during each event, the delay between the start of rainfall and peak wetland stage) as well as hysteresis properties (e.g., gradient and area of the hysteresis loop).
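
    One common way to quantify the hysteresis properties mentioned above is the signed area of the wetland-stage versus stream-stage loop, whose sign distinguishes clockwise from anticlockwise rotation. The sketch below applies the shoelace formula to synthetic stage series; the sign convention and data are illustrative assumptions, not the study's analysis.

      import numpy as np

      # Classify the rotation of a wetland-stage (x) vs. stream-stage (y) event loop
      # via the signed (shoelace) area: positive -> anticlockwise, negative -> clockwise.
      def loop_direction(wetland_stage, stream_stage):
          x, y = np.asarray(wetland_stage), np.asarray(stream_stage)
          signed_area = 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)
          return ("anticlockwise" if signed_area > 0 else "clockwise"), signed_area

      t = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
      wetland = 1.0 + 0.3 * np.cos(t)          # synthetic wetland stage (m)
      stream = 0.5 + 0.2 * np.sin(t)           # synthetic stream stage lagging the wetland
      print(loop_direction(wetland, stream))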

  4. The role of local initiatives in community-based disaster risk management in Kemijen, Semarang City

    Science.gov (United States)

    Fauzie, W. Z.; Sariffudin, S.

    2017-06-01

    Community-based disaster risk reduction is a homegrown, community-empowerment-oriented approach to disaster management. This approach is very important because no one understands the conditions in a region better than the local community. Therefore, the implementation of CBDRM always emphasizes local initiatives in decision making. Local initiative is especially necessary to anticipate the impact of climate change, which is increasingly affecting towns in coastal areas, including settlements in Semarang. Kemijen Urban Village is one of the informal settlements in Semarang with the highest flood frequency: 12 floods over 5 years (2011-2015). The research question is: what is the level of local initiative in flood disaster management in Kemijen, Semarang? This study aims to assess the level of local initiative in Kemijen as the community's adaptive capacity for flood management in the pre-disaster, emergency response, and post-disaster phases. Local initiatives were assessed for water supply, sanitation, food, shelter, health, drainage maintenance and waste management. The study shows that the level of local initiative in the pre-disaster and post-disaster phases is almost the same and higher than in the emergency response phase. Scoring results were 35.002 for pre-disaster, 27.9577 for emergency response, and 34.9862 for post-disaster, corresponding to the categories independent, empowered, and independent, respectively. The study also shows that local initiatives in Kemijen were largely formed by individual initiative, and only a few by collective initiative.

  5. Consultancy on 'IAEA initiative to establish a fast reactor knowledge base'. Working material

    International Nuclear Information System (INIS)

    2005-01-01

    At the outset of the meeting, Member States' interest in establishing a Fast Reactor Knowledge Base was acknowledged by the participants. While the broader objective of the initiative was to develop a Knowledge Base into which existing Knowledge Preservation Systems will fit, the specific objectives of the meeting were to: make recommendations on FRKP methodology and guidance; review the proposed structure of the Agency's FRKP Initiative; make recommendations on the roles of the Agency and the Member States in implementing the Agency's FRKP Initiative; and develop an approach for the implementation of the structure of the Agency's FRKP Initiative. The meeting concluded after covering many aspects of the initiative, namely systematic methods of data capture and the structure and functions of the FRKP System, and placed strong emphasis on the continuing role of IAEA support and coordination in data retrieval and knowledge preservation efforts.

  6. Event Shape Sorting: selecting events with similar evolution

    Directory of Open Access Journals (Sweden)

    Tomášik Boris

    2017-01-01

    Full Text Available We present a novel method for the organisation of events. The method is based on comparing event-by-event histograms of a chosen quantity Q that is measured for each particle in every event. The events are organised in such a way that those with a similar shape of the Q-histogram end up placed close to each other. We apply the method to histograms of the azimuthal angle of the produced hadrons in ultrarelativistic nuclear collisions. By selecting events with a similar azimuthal shape of their hadron distribution, one chooses events which likely underwent similar evolution from the initial state to freeze-out. Such events can more easily be compared to theoretical simulations where all conditions can be controlled. We illustrate the method on data simulated by the AMPT model.

  7. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement with the energy-saving factor taken into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains have to be accelerated or braked frequently to control the headway distance, leading to more energy consumption. (general)

  8. LCP method for a planar passive dynamic walker based on an event-driven scheme

    Science.gov (United States)

    Zheng, Xu-Dong; Wang, Qi

    2018-06-01

    The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.
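
    The stick-slip transition above is posed as a linear complementarity problem; purely as a generic illustration of how such a problem can be solved numerically (not the walker's actual contact matrices), a projected Gauss-Seidel sketch for the standard LCP (find z >= 0 with w = Mz + q >= 0 and z'w = 0) follows.

      import numpy as np

      # Projected Gauss-Seidel iteration for a standard LCP. M and q are illustrative.
      def solve_lcp_pgs(M, q, iters=200):
          z = np.zeros_like(q)
          for _ in range(iters):
              for i in range(len(q)):
                  r = q[i] + M[i] @ z - M[i, i] * z[i]   # residual excluding z[i]
                  z[i] = max(0.0, -r / M[i, i])
          return z

      M = np.array([[2.0, 1.0], [1.0, 2.0]])
      q = np.array([-1.0, 1.0])
      z = solve_lcp_pgs(M, q)
      print("z =", z, " w =", M @ z + q)       # z >= 0, w >= 0, and z*w = 0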

  9. Pull-Based Distributed Event-Triggered Consensus for Multiagent Systems With Directed Topologies.

    Science.gov (United States)

    Yi, Xinlei; Lu, Wenlian; Chen, Tianping

    2017-01-01

    This paper mainly investigates the consensus problem under pull-based event-triggered feedback control. For each agent, the diffusion coupling feedback is based on the states of its in-neighbors at its latest triggering time, and the next triggering time of this agent is determined by its in-neighbors' information. General directed topologies, including irreducible and reducible cases, are investigated. The scenario of distributed continuous communication is considered first. It is proved that if the network topology has a spanning tree, then the event-triggered coupling algorithm can achieve consensus for the multiagent system. The results are then extended to discontinuous communication, i.e., self-triggered control, where each agent computes its next triggering time in advance without having to observe the system's states continuously. Finally, the effectiveness of the theoretical results is illustrated by a numerical example.
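
    The sketch below shows a generic discrete-time event-triggered consensus loop over a directed ring (which has a spanning tree): each agent updates using the last broadcast states of its in-neighbors and re-broadcasts when its own state drifts beyond a threshold. This illustrates the general idea only and is not the paper's exact pull-based triggering rule.

      import numpy as np

      A = np.array([[0, 1, 0],     # row i lists agent i's in-neighbours (directed ring)
                    [0, 0, 1],
                    [1, 0, 0]], dtype=float)
      x = np.array([1.0, -2.0, 4.0])           # true agent states
      x_hat = x.copy()                         # last broadcast states
      eps, delta = 0.2, 0.05                   # step size and triggering threshold

      for _ in range(200):
          u = A @ x_hat - A.sum(axis=1) * x_hat    # diffusion coupling on broadcast states
          x = x + eps * u
          trigger = np.abs(x - x_hat) > delta      # event condition per agent
          x_hat[trigger] = x[trigger]              # re-broadcast only when triggered

      print(x)   # states end up close to a common value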

  10. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and computing resource requirements needed to provide an uninterrupted flow of processed data for real time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.

  11. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  12. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to exchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution makes more efficient use of communication resources than the classical discrete-time strategy with the same accuracy.

  13. Nest-crowdcontrol: Advanced video-based crowd monitoring for large public events

    OpenAIRE

    Monari, Eduardo; Fischer, Yvonne; Anneken, Mathias

    2015-01-01

    Current video surveillance systems still lack intelligent video and data analysis modules for supporting the situation awareness of decision makers. Especially in mass gatherings like large public events, the decision maker would benefit from different views of the area, in particular from crowd density estimations. This article describes a multi-camera system called NEST and its application to crowd density analysis. First, the overall system design is presented. Based on this, the crowd densit...

  14. Spatio-Temporal Story Mapping Animation Based On Structured Causal Relationships Of Historical Events

    Science.gov (United States)

    Inoue, Y.; Tsuruoka, K.; Arikawa, M.

    2014-04-01

    In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories by representing causal relationships among events over time. We have been developing an experimental software system for the spatio-temporal visualization of historical stories on tablet computers. Our proposed system helps people learn historical stories effectively through visual animations based on hierarchical structures of timelines and maps at different scales.

  15. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    Science.gov (United States)

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called event-based approach to training (EBAT), to guide the design of simulation for teamwork training and discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., competencies being trained and learning objectives) and performance assessment. The EBAT process involves: Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research Quality and Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on review of published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on existing literature and gathering input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4

  16. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available Agent-based simulation has become a prominent approach in computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex system problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using an agent-based simulation method. By employing a computational experimentation methodology, we model group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two constructed scenarios and a real case study. We show how various strategies can influence group emotion evolution in terms of complex emergence and the cumulative effect of emotions in extreme events. The paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution in the extreme incident, emergency, and security domains.

  17. A Cluster-Based Fuzzy Fusion Algorithm for Event Detection in Heterogeneous Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    ZiQi Hao

    2015-01-01

    Full Text Available As limited energy is one of the tough challenges in wireless sensor networks (WSN), energy saving is important for increasing the lifetime of the network. Data fusion enables information from several sources to be combined into a unified picture, which can significantly save sensor energy and enhance sensing accuracy. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to group the nodes into clusters, which significantly reduces the energy consumption of intra-cluster communication. The distances between cluster heads and the event, and the energy of the clusters, are fuzzified, and fuzzy logic is used to select the clusters that will participate in data uploading and fusion. Fuzzy logic is also used by the cluster heads for local decisions, and the local decision results are then sent to the base station. Decision-level fusion for the final event decision is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters calculated by fuzzy logic. The effectiveness of this algorithm is demonstrated by simulation results.
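
    The sketch below illustrates the cluster-then-select idea described above: nodes are grouped with plain k-means, and each cluster is then scored by a simple fuzzy-style membership combining closeness to the event and mean residual energy. The membership shapes, threshold and data are illustrative assumptions, not the paper's fuzzy rule base.

      import numpy as np

      rng = np.random.default_rng(0)
      nodes = rng.uniform(0.0, 100.0, size=(60, 2))     # sensor node positions
      energy = rng.uniform(0.2, 1.0, size=60)           # normalised residual energy
      event = np.array([30.0, 70.0])                    # event location
      k = 4

      # Plain k-means (Lloyd's algorithm) on node positions.
      centers = nodes[rng.choice(len(nodes), k, replace=False)]
      for _ in range(20):
          labels = np.argmin(np.linalg.norm(nodes[:, None] - centers, axis=2), axis=1)
          centers = np.array([nodes[labels == j].mean(axis=0) if np.any(labels == j)
                              else centers[j] for j in range(k)])

      # Fuzzy-style scoring: clusters close to the event and energy-rich score high.
      dist = np.linalg.norm(centers - event, axis=1)
      closeness = 1.0 - dist / dist.max()
      cluster_energy = np.array([energy[labels == j].mean() if np.any(labels == j)
                                 else 0.0 for j in range(k)])
      score = np.minimum(closeness, cluster_energy)     # simple t-norm (min)
      print("clusters selected for fusion:", np.where(score > 0.5)[0])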

  18. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    Science.gov (United States)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on a feedback formulation and has become one of the most widely used multiple suppression methods. However, there are differences between the predicted multiples and those in the source seismic records, which may leave conventional adaptive multiple subtraction methods barely able to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses the multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and a short-time-window FK filtering method. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method and the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples, with the advantage of minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.

  19. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    Full Text Available This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD, sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than the time-triggered one in some situations, especially in network bandwidth improvement. However, it cannot detect packet dropout situations because data transmission and reception do not use a periodical time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme called modified SOD in which sensor data are sent when either the change of sensor output exceeds a given threshold or the time elapses more than a given interval. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.

  20. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available Rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. An investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. The related work on adaptive sampling is summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic and of the asymptotic integral sampling effectiveness are derived. Simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness is exemplified for common signals that model the state evolution of dynamic systems in time.
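
    For contrast with the pure send-on-delta rule, the integral criterion triggers when the accumulated deviation over time, rather than the instantaneous deviation, exceeds a threshold. A minimal sketch with illustrative values:

      # Event-based integral sampling: send a new sample when the time integral of
      # the absolute deviation from the last transmitted value reaches a threshold.
      def integral_sampler(signal, dt, threshold):
          events, last_sent, accumulated = [], signal[0], 0.0
          for k, y in enumerate(signal):
              accumulated += abs(y - last_sent) * dt
              if accumulated >= threshold:
                  events.append(k)
                  last_sent, accumulated = y, 0.0
          return events

      signal = [0.0, 0.1, 0.3, 0.6, 1.0, 1.2, 1.3, 1.3, 1.2, 1.0]
      print(integral_sampler(signal, dt=1.0, threshold=1.0))   # -> [3, 5]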

  1. Key terms for the assessment of the safety of vaccines in pregnancy: Results of a global consultative process to initiate harmonization of adverse event definitions.

    Science.gov (United States)

    Munoz, Flor M; Eckert, Linda O; Katz, Mark A; Lambach, Philipp; Ortiz, Justin R; Bauwens, Jorgen; Bonhoeffer, Jan

    2015-11-25

    The variability of terms and definitions of Adverse Events Following Immunization (AEFI) represents a missed opportunity for optimal monitoring of safety of immunization in pregnancy. In 2014, the Brighton Collaboration Foundation and the World Health Organization (WHO) collaborated to address this gap. Two Brighton Collaboration interdisciplinary taskforces were formed. A landscape analysis included: (1) a systematic literature review of adverse event definitions used in vaccine studies during pregnancy; (2) a worldwide stakeholder survey of available terms and definitions; and (3) a series of taskforce meetings. Based on available evidence, taskforces proposed key terms and concept definitions to be refined, prioritized, and endorsed by a global expert consultation convened by WHO in Geneva, Switzerland in July 2014. Using pre-specified criteria, 45 maternal and 62 fetal/neonatal events were prioritized, and key terms and concept definitions were endorsed. In addition, recommendations to further improve safety monitoring of immunization in pregnancy programs were specified. This includes elaboration of disease concepts into standardized case definitions with sufficient applicability and positive predictive value to be of use for monitoring the safety of immunization in pregnancy globally, as well as the development of guidance, tools, and datasets in support of a globally concerted approach. There is a need to improve the safety monitoring of immunization in pregnancy programs. A consensus list of terms and concept definitions of key events for monitoring immunization in pregnancy is available. Immediate actions to further strengthen monitoring of immunization in pregnancy programs are identified and recommended. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Guidelines for time-to-event end point definitions in sarcomas and gastrointestinal stromal tumors (GIST) trials: results of the DATECAN initiative (Definition for the Assessment of Time-to-event Endpoints in CANcer trials)†.

    Science.gov (United States)

    Bellera, C A; Penel, N; Ouali, M; Bonvalot, S; Casali, P G; Nielsen, O S; Delannes, M; Litière, S; Bonnetain, F; Dabakuyo, T S; Benjamin, R S; Blay, J-Y; Bui, B N; Collin, F; Delaney, T F; Duffaud, F; Filleron, T; Fiore, M; Gelderblom, H; George, S; Grimer, R; Grosclaude, P; Gronchi, A; Haas, R; Hohenberger, P; Issels, R; Italiano, A; Jooste, V; Krarup-Hansen, A; Le Péchoux, C; Mussi, C; Oberlin, O; Patel, S; Piperno-Neumann, S; Raut, C; Ray-Coquard, I; Rutkowski, P; Schuetze, S; Sleijfer, S; Stoeckle, E; Van Glabbeke, M; Woll, P; Gourgou-Bourgade, S; Mathoulin-Pélissier, S

    2015-05-01

    The use of potential surrogate end points for overall survival, such as disease-free survival (DFS) or time-to-treatment failure (TTF) is increasingly common in randomized controlled trials (RCTs) in cancer. However, the definition of time-to-event (TTE) end points is rarely precise and lacks uniformity across trials. End point definition can impact trial results by affecting estimation of treatment effect and statistical power. The DATECAN initiative (Definition for the Assessment of Time-to-event End points in CANcer trials) aims to provide recommendations for definitions of TTE end points. We report guidelines for RCT in sarcomas and gastrointestinal stromal tumors (GIST). We first carried out a literature review to identify TTE end points (primary or secondary) reported in publications of RCT. An international multidisciplinary panel of experts proposed recommendations for the definitions of these end points. Recommendations were developed through a validated consensus method formalizing the degree of agreement among experts. Recommended guidelines for the definition of TTE end points commonly used in RCT for sarcomas and GIST are provided for adjuvant and metastatic settings, including DFS, TTF, time to progression and others. Use of standardized definitions should facilitate comparison of trials' results, and improve the quality of trial design and reporting. These guidelines could be of particular interest to research scientists involved in the design, conduct, reporting or assessment of RCT such as investigators, statisticians, reviewers, editors or regulatory authorities. © The Author 2014. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  3. Wide Area Protection Scheme Preventing Cascading Events based on Improved Impedance relay

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2013-01-01

    Load flow transfer after an initial contingency is regarded as one of the main causes of unexpected cascading trips. A multi-agent system (MAS) based wide area protection strategy is proposed in this paper to predict the load flow transfer from the point of view of impedance relays

  4. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising a physical frame, a logical frame, and a cognitive frame, was adopted to derive and analyze digital I and C failure events for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I and C software failure events from actual non-ABWR digital I and C software failure events reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  5. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems across different branches of large organizations: they unify the commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.

  6. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.

    Science.gov (United States)

    Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos

    2018-03-23

    Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
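
    The token-passing rule described above replaces round-robin order with beacon data age; a minimal illustration of that selection rule follows, with hypothetical vehicle identifiers and timestamps.

      # Next token holder = platoon member whose latest received beacon is oldest
      # (largest data age), instead of the next member in round-robin order.
      def next_token_holder(last_beacon_time, now):
          return max(last_beacon_time, key=lambda veh: now - last_beacon_time[veh])

      last_beacon_time = {"veh1": 10.2, "veh2": 10.8, "veh3": 9.9, "veh4": 10.5}
      print(next_token_holder(last_beacon_time, now=11.0))   # -> 'veh3'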

  7. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies

    Directory of Open Access Journals (Sweden)

    Ali Balador

    2018-03-01

    Full Text Available Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.

  8. Mining web-based data to assess public response to environmental events

    International Nuclear Information System (INIS)

    Cha, YoonKyung; Stow, Craig A.

    2015-01-01

    We explore how the analysis of web-based data, such as Twitter and Google Trends, can be used to assess the social relevance of an environmental accident. The concept and methods are applied to the shutdown of the drinking water supply in the city of Toledo, Ohio, USA. Toledo's notice, which persisted from August 1 to 4, 2014, was a high-profile event that directly affected approximately half a million people and received wide recognition. The notice was given when excessive levels of microcystin, a byproduct of cyanobacteria blooms, were discovered at the drinking water treatment plant on Lake Erie. Twitter mining results illustrated the instant response to the Toledo incident, the associated collective knowledge, and public perception. The results from Google Trends, on the other hand, revealed how the Toledo event raised public attention to the associated environmental issue, harmful algal blooms, in a long-term context. Thus, when jointly applied, Twitter and Google Trends analyses offer complementary perspectives. Web content aggregated through mining approaches provides a social standpoint, such as public perception and interest, and offers context for establishing and evaluating environmental management policies. - The joint application of Twitter and Google Trends analysis to an environmental event offered both short- and long-term patterns of public perception and interest in the event

  9. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable for propagating information during disasters because of its timeliness and availability, and it assists in decision making when tagged with locations. Considering the ambiguity and inaccuracy of some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, not from the perspective of the disaster, making it difficult to analyze the disaster dynamically. All of the disaster-related data around the affected locations need to be retrieved. To address these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework that proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and their attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed within the GEGIS and shows that the system would be effective when typhoons occur.

  10. The taxable events for the Value-Added Tax (VAT) based on a Comparative Law approach

    Directory of Open Access Journals (Sweden)

    Walker Villanueva Gutiérrez

    2014-07-01

    Full Text Available This article analyzes the definitions of the main taxable events for the Value-Added Tax (VAT) based on a comparative approach to the legislation of different countries (Spain, Mexico, Chile, Colombia, Argentina and Peru). In this regard, it analyzes which legislations offer definitions consistent with the principles of generality, fiscal neutrality and legal certainty for VAT. Moreover, it points out that the VAT systems of those countries do not require, as a condition for the configuration of the taxable events, that the transactions involve a «value added» or a final consumption. In the specific case of «supplies of goods», the VAT systems have a similar definition of the taxable event, although there are a few differences. However, in the case of «supplies of services», which is the most important taxable event for VAT, there are important differences in how each country defines it. This is not desirable for the international trade in services, since the lack of harmonization produces double taxation or double non-taxation.

  11. An adverse events potential costs analysis based on Drug Programs in Poland. Dermatology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-09-01

    Full Text Available The aim of the project, carried out within the Polish Society for Pharmacoeconomics (PTFE), was to estimate the potential costs of treatment of the side effects which (theoretically) may occur as a result of treatments for the selected diseases. This paper deals solely with dermatology-related events. Herein, several Drug Programs financed by the National Health Fund in Poland in 2012 were analyzed. The adverse events were selected based on the Summary of Product Characteristics of the chosen products. We focused the project on those potential adverse events which were defined in the SPC as frequent and very frequent. The results are presented according to their therapeutic areas, and in this paper the focus is upon those related to dermatology. The events described as 'very common' had an incidence of ≥ 1/10, and those described as 'common' an incidence of ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. In our work, we employed only the total direct costs incurred by the public payer, based on valid individual cost data from February 2014. Moreover, we calculated the total spending from the public payer's perspective as well as the patient's perspective, and the percentage share of each component of the total cost. The paper thus informs the reader of the estimated costs of treatment of side effects related to dermatologic symptoms and reactions. Based on our work, we can state that the treatment of skin adverse drug reactions generates a significant cost, one incurred by both the public payer and the patient.

  12. ADEpedia: a scalable and standardized knowledge base of Adverse Drug Events using semantic web technology.

    Science.gov (United States)

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-01-01

    A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype is successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation is performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query service.
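
    The following minimal sketch suggests what the XML2RDF step of such a framework might look like using rdflib; the namespace, predicates and example records are invented for illustration and are not ADEpedia's actual schema.

```python
# Minimal sketch (not the ADEpedia code) of converting drug/adverse-event
# records into RDF triples with rdflib; identifiers and predicates are invented.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

ADE = Namespace("http://example.org/ade#")

def ade_to_rdf(records):
    """Turn (drug, adverse_event, source) tuples into an RDF graph."""
    g = Graph()
    g.bind("ade", ADE)
    for i, (drug, event, source) in enumerate(records):
        node = ADE[f"report_{i}"]
        g.add((node, RDF.type, ADE.AdverseDrugEvent))
        g.add((node, ADE.drug, Literal(drug)))
        g.add((node, ADE.event, Literal(event)))
        g.add((node, ADE.source, Literal(source)))
    return g

if __name__ == "__main__":
    g = ade_to_rdf([("drug_x", "rash", "label"), ("drug_y", "nausea", "report")])
    print(g.serialize(format="turtle"))
```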

  13. Results from a data acquisition system prototype project using a switch-based event builder

    International Nuclear Information System (INIS)

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D.; Bowden, M.; Booth, A.; Cancelo, G.

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs

  14. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    OpenAIRE

    Boydell, Katherine M.

    2011-01-01

    A symbolic interaction (BLUMER, 1969; MEAD, 1934; PRUS, 1996; PRUS & GRILLS, 2003) approach was taken to study the collective event (PRUS, 1997) of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed us to study the process of the creation of the dance and its capacity to convert abstract research into concrete form a...

  15. Results from a data acquisition system prototype project using a switch-based event builder

    Energy Technology Data Exchange (ETDEWEB)

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D. (Fermi National Accelerator Lab., Batavia, IL (United States)); Bowden, M.; Booth, A. (Superconducting Super Collider Lab., Dallas, TX (United States)); Cancelo, G. (La Plata Univ. Nacional (Argentina))

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs.

  16. Event-triggered hybrid control based on multi-Agent systems for Microgrids

    DEFF Research Database (Denmark)

    Dou, Chun-xia; Liu, Bin; Guerrero, Josep M.

    2014-01-01

    This paper is focused on a multi-agent-system-based event-triggered hybrid control for intelligently restructuring the operating mode of a microgrid (MG) to ensure the energy supply with high security, stability and cost effectiveness. Because the microgrid is composed of different types of distributed energy resources, it is a typical hybrid dynamic network. Considering the complex hybrid behaviors, a hierarchical decentralized coordinated control scheme is first constructed based on a multi-agent system; then, the hybrid model of the microgrid is built by using differential hybrid Petri...

  17. Building a knowledge base of severe adverse drug events based on AERS reporting data using semantic web technologies.

    Science.gov (United States)

    Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G

    2013-01-01

    A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However, it remains challenging to measure and identify the severity information of ADEs. The objective of the study is to develop and evaluate a semantic-web-based approach for building a knowledge base of severe ADEs based on the FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from the Side Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Events (CTCAE) grading system and classified the ADEs into CTCAE grades in the Web Ontology Language (OWL). We identified and validated 2,444 unique drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4 and 2,196 pairs in Grade 3.

  18. Identifying Typhoon Tracks based on Event Synchronization derived Spatially Embedded Climate Networks

    Science.gov (United States)

    Ozturk, Ugur; Marwan, Norbert; Kurths, Jürgen

    2017-04-01

    Complex networks are commonly used for investigating the spatiotemporal dynamics of complex systems, e.g. extreme rainfall. Directed networks in particular are very effective tools for identifying climatic patterns on spatially embedded networks. They can capture the network flux, and thus the principal dynamics of how significant phenomena spread. Network measures, such as network divergence, reveal the source-receptor relation of directed networks. However, it is still a challenge to capture fast-evolving atmospheric events, i.e. typhoons. In this study, we propose a new technique, namely Radial Ranks, to detect the general pattern of a typhoon's forward direction based on the strength parameter of event synchronization over Japan. We suggest subsetting a circular zone of high correlation around the selected grid point based on the strength parameter. Radial sums of the strength parameter along vectors within this zone, the radial ranks, are measured for potential directions, which allows us to trace the network flux over long distances. We also employed the delay parameter of event synchronization to identify and separate the individual behaviors of frontal storms and typhoons.
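
    As background for the strength and delay parameters mentioned above, the sketch below computes a plain event-synchronization measure for two event-time series with a fixed coincidence window; it is a simplified stand-in (fixed window, no half-weighting of simultaneous events), not the authors' implementation.

```python
# Sketch of event synchronization (strength Q and delay asymmetry q) for two
# event-time series with a fixed coincidence window tau; illustrative only.
import numpy as np

def event_sync(tx, ty, tau):
    """Return (Q, q): synchronization strength and delay asymmetry."""
    tx, ty = np.sort(np.asarray(tx, float)), np.sort(np.asarray(ty, float))

    def count(a, b):
        # events in `a` that follow an event in `b` within (0, tau]
        c = 0.0
        for t in a:
            if np.any((b < t) & (b >= t - tau)):
                c += 1.0
        return c

    cxy, cyx = count(tx, ty), count(ty, tx)
    norm = np.sqrt(len(tx) * len(ty))
    return (cxy + cyx) / norm, (cxy - cyx) / norm

if __name__ == "__main__":
    rain_a = [1.0, 4.2, 7.9, 12.5]          # hypothetical event times (days)
    rain_b = [1.3, 4.5, 9.0, 12.8]
    Q, q = event_sync(rain_a, rain_b, tau=0.5)
    print(f"strength Q={Q:.2f}, delay q={q:.2f}")
```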

  19. GIS-based rare events logistic regression for mineral prospectivity mapping

    Science.gov (United States)

    Xiong, Yihui; Zuo, Renguang

    2018-02-01

    Mineralization is a special type of singularity event and can be considered a rare event because, within a specific study area, the number of prospective locations (1s) is considerably smaller than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map mineral prospectivity in southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and the aeromagnetic anomaly, are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with original logistic regression (OLR) demonstrates that the GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
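
    A hedged sketch of the general idea follows: a logistic regression on synthetic evidence layers with reweighting for the rarity of prospective cells, reporting coefficients as odds ratios. The full RELR method additionally corrects the estimates for rare-event bias, which is not reproduced here; the feature names and data are invented.

```python
# Hedged sketch of a rare-events-aware logistic regression on gridded evidence
# layers; synthetic data, not the study's dataset or exact RELR correction.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.normal(size=n),      # e.g. distance to granite contact (standardised)
    rng.normal(size=n),      # e.g. aeromagnetic anomaly
    rng.normal(size=n),      # e.g. distance to fault
])
# rare positives (roughly 2% prospective cells), linked to the first two layers
p = 1 / (1 + np.exp(-(-4.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1])))
y = rng.binomial(1, p)

# class_weight="balanced" counteracts the 1s/0s imbalance; a full RELR would
# additionally apply a rare-event (prior) correction to the intercept.
model = LogisticRegression(class_weight="balanced").fit(X, y)
odds_ratios = np.exp(model.coef_[0])
print("odds ratios per evidence layer:", np.round(odds_ratios, 2))
print("probabilities for 3 cells:", np.round(model.predict_proba(X[:3])[:, 1], 3))
```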

  20. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order of 10-100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high performance workstation ATM-interface, and (3) a VMEbus ATM-interface. The requirement for traffic shaping in ATM-based event-builders is discussed and an analysis of the performance of several such schemes is presented.

  1. 77 FR 48148 - Energy Alternatives Wholesale, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2012-08-13

    ... Energy Alternatives Wholesale, LLC's application for market-based rate authority, with an accompanying... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-2413-000] Energy Alternatives Wholesale, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

  2. 77 FR 9226 - Physical Systems Integration, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2012-02-16

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-1013-000] Physical Systems Integration, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... Physical Systems Integration, LLC's application for market-based rate authority, with an accompanying rate...

  3. 78 FR 49508 - Tesoro Refining & Marketing Company LLC; Supplemental Notice That Initial Market-Based Rate...

    Science.gov (United States)

    2013-08-14

    ... Refining & Marketing Company LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes... proceeding of Tesoro Refining & Marketing Company LLC's application for market-based rate authority, with an... of protests and interventions in lieu of paper, using the FERC Online links at http://www.ferc.gov...

  4. 75 FR 35017 - Brookfield Energy Marketing LP; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2010-06-21

    ... Energy Marketing LP; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... proceeding of Brookfield Energy Marketing LP's application for market-based rate authority, with an... protests and interventions in lieu of paper, using the FERC Online links at http://www.ferc.gov . To...

  5. 78 FR 16262 - Tesoro Refining & Marketing Company LLC; Supplemental Notice That Initial Market-Based Rate...

    Science.gov (United States)

    2013-03-14

    ... Refining & Marketing Company LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes... proceeding, of Tesoro Refining & Marketing Company LLC's application for market- based rate authority, with... submission of protests and interventions in lieu of paper, using the FERC Online links at http://www.ferc.gov...

  6. 75 FR 74711 - Planet Energy (Pennsylvania) Corp.; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-2167-000] Planet Energy (Pennsylvania) Corp.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... proceeding, of Planet Energy (Pennsylvania) Corp.'s application for market-based rate authority, with an...

  7. 75 FR 74712 - Planet Energy (Maryland) Corp.; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-2168-000] Planet Energy (Maryland) Corp.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... proceeding, of Planet Energy (Maryland) Corp.'s application for market-based rate authority, with an...

  8. 76 FR 6128 - Energy Exchange International, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2011-02-03

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-2730-000] Energy Exchange International, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... proceeding Energy Exchange International, LLC's application for market-based rate authority, with an...

  9. 75 FR 41855 - Stream Energy Pennsylvania, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2010-07-19

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER10-1750-000] Stream Energy Pennsylvania, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... of Stream Energy Pennsylvania, LLC's application for market-based rate authority, with an...

  10. 78 FR 28214 - Gainesville Renewable Energy Center, LLC; Supplemental Notice That Initial Market-Based Rate...

    Science.gov (United States)

    2013-05-14

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER13-1348-000] Gainesville Renewable Energy Center, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... Gainesville Renewable Energy Center, LLC's application for market- based rate authority, with an accompanying...

  11. 78 FR 40473 - Plainfield Renewable Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2013-07-05

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER13-1734-000] Plainfield Renewable Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... Plainfield Renewable Energy, LLC's application for market-based rate authority, with an accompanying rate...

  12. 77 FR 64980 - Chesapeake Renewable Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2012-10-24

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER13-28-000] Chesapeake Renewable Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... proceeding of Chesapeake Renewable Energy LLC's application for market-based rate authority, with an...

  13. 77 FR 52016 - Brookfield Smoky Mountain Hydropower LLC; Supplemental Notice That Initial Market-Based Rate...

    Science.gov (United States)

    2012-08-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-2447-001] Brookfield Smoky Mountain Hydropower LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes... proceeding, of Brookfield Smoky Mountain Hydropower LLC's application for market- based rate authority, with...

  14. 75 FR 61747 - Union Leader Corporation; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Science.gov (United States)

    2010-10-06

    ... of Union Leader Corporation's application for market-based rate authority, with an accompanying rate... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER10-2780-000] Union Leader Corporation; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section...

  15. 76 FR 12726 - Tropicana Manufacturing Company, Inc.; Supplemental Notice That Initial Market-Based Rate Filing...

    Science.gov (United States)

    2011-03-08

    ... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be... Manufacturing Company, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... Tropicana Manufacturing Company, Inc.'s application for market-based rate authority, with an accompanying...

  16. 77 FR 42722 - Berry Petroleum Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Science.gov (United States)

    2012-07-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-2233-000] Berry Petroleum Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... Petroleum Company's application for market-based rate authority, with an accompanying rate schedule, noting...

  17. Towards Evidence-Based Initial Teacher Education in Singapore: A Review of Current Literature

    Science.gov (United States)

    Low, Ee-Ling; Hui, Chenri; Taylor, Peter G.; Ng, Pak Tee

    2012-01-01

    Initial teacher education (ITE) in Singapore is shifting towards evidence-based practice. Despite a clear policy orientation, ITE in Singapore has not yet produced the evidence base that it is anticipating. This paper presents an analytical review of previous research into ITE in Singapore and makes comparisons to the larger international context.…

  18. Assessing the impact of area-based initiatives in deprived neighborhoods

    DEFF Research Database (Denmark)

    Alves, Sonia

    2017-01-01

    Whilst there have been many area-based initiatives to regenerate rundown areas in numerous cities around the world, many of them involving the demolition of stigmatized housing estates, far fewer attempts have been made to assess the effects of these initiatives upon the fortunes of displaced households and those who remain in these areas. By presenting the results of an empirical in-depth case study on the effects of an area-based initiative targeted at one of the most deprived neighborhoods in Porto, this paper raises several epistemological concerns related to the goals, ideological assumptions, and social and spatial effects of these initiatives. Among other inter-related issues, the paper discusses the impact of conflicting ideologies upon processes of radical strategy shift and of social and territorial marginalization, and appeals to the need for more pluralistic approaches.

  19. From Family Based to Industrial Based Production: Local Economic Development Initiatives and the HELIX Model

    Directory of Open Access Journals (Sweden)

    Bartjan W Pennink

    2013-01-01

    Full Text Available To build a strong local economy, good practice tells us that each community should undertake a collaborative, strategically planned process to understand and then act upon its own strengths, weaknesses, opportunities and threats. From this perspective we start with the local communities, but how is this related to the perspective of the Helix model, in which three actors are explicitly introduced: the Government, the Industry and the Universities? The purpose of local economic development (LED) is to build up the economic capacity of a local area to improve its economic future and the quality of life for all. To support local economic development in remote areas, a program has been developed based on the LED framework of the World Bank. This approach and the experiences with this program over the past years are described in the first part. In the second part of the paper, we analyse the work done with that program with the help of the social capital concept and the triple helix model. In all cases it is important to pay attention to who takes the initiative after the first move (and it is not always the government as actor), and for the triple helix we suggest that the concepts of (national) Government, Industry and University need a translation to Local Governance Agency, Cooperation or other ways of cooperation of local communities, and Local Universities. Although a push from outside might help a local region in its development, endogenous factors are also needed. Keywords: Triple Helix model, Local Economic Development, Local Actors, Double Triangle within the Helix Model

  20. A Hospital Nursing Adverse Events Reporting System Project: An Approach Based on the Systems Development Life Cycle.

    Science.gov (United States)

    Cao, Yingjuan; Ball, Marion

    2017-01-01

    Based on the Systems Development Life Cycle, a hospital-based nursing adverse event reporting system was developed and implemented, integrated with the current Hospital Information System (HIS). Besides the positive outcomes in terms of timeliness and efficiency, this approach has brought an enormous change in how nurses report, analyze and respond to adverse events.

  1. Discrete event dynamic system (DES)-based modeling for dynamic material flow in the pyroprocess

    International Nuclear Information System (INIS)

    Lee, Hyo Jik; Kim, Kiho; Kim, Ho Dong; Lee, Han Soo

    2011-01-01

    A modeling and simulation methodology was proposed in order to implement the dynamic material flow of the pyroprocess. Since the static mass balance provides only limited information on the material flow, it is hard to predict dynamic behavior in response to events. Therefore, a discrete event system (DES)-based model, named PyroFlow, was developed at the Korea Atomic Energy Research Institute (KAERI). PyroFlow is able to calculate the dynamic mass balance and also show various dynamic operational results in real time. By using PyroFlow, it is easy to rapidly predict otherwise unforeseeable results, such as throughput in each unit process, accumulated product in buffers and operation status. As preliminary simulations, bottleneck analyses in the pyroprocess were carried out, and the results showed that the operation strategy has an influence on the productivity of the pyroprocess.
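
    The sketch below illustrates the discrete-event idea with SimPy: two unit processes exchanging batches through a buffer, so that throughput and buffer accumulation emerge from the event sequence. The process names, rates and capacities are invented and do not correspond to PyroFlow's actual flowsheet.

```python
# Illustrative discrete-event sketch (SimPy) of material batches flowing through
# two unit processes with an intermediate buffer; parameters are invented.
import simpy

def unit_a(env, out_buffer, batch_kg=10, cycle_h=4):
    """First unit process: produces a batch every `cycle_h` hours."""
    while True:
        yield env.timeout(cycle_h)
        yield out_buffer.put(batch_kg)
        print(f"t={env.now:5.1f} h  unit A -> buffer ({out_buffer.level} kg held)")

def unit_b(env, in_buffer, batch_kg=10, cycle_h=6):
    """Second unit process: consumes a batch; slower, so the buffer accumulates."""
    while True:
        yield in_buffer.get(batch_kg)
        yield env.timeout(cycle_h)
        print(f"t={env.now:5.1f} h  unit B processed {batch_kg} kg")

env = simpy.Environment()
buffer = simpy.Container(env, capacity=100, init=0)
env.process(unit_a(env, buffer))
env.process(unit_b(env, buffer))
env.run(until=48)  # simulate two days and watch the bottleneck build up
```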

  2. Triggerless Readout with Time and Amplitude Reconstruction of Event Based on Deconvolution Algorithm

    International Nuclear Information System (INIS)

    Kulis, S.; Idzik, M.

    2011-01-01

    In future linear colliders like CLIC, where the period between bunch crossings is in the sub-nanosecond range (~500 ps), an appropriate detection technique with triggerless signal processing is needed. In this work we discuss a technique, based on a deconvolution algorithm, suitable for time and amplitude reconstruction of an event. In the implemented method, the output of a relatively slow shaper (spanning many bunch-crossing periods) is sampled and digitised in an ADC, and the deconvolution procedure is then applied to the digital data. The time of an event can be found with a precision of a few percent of the sampling time. The signal-to-noise ratio is only slightly decreased after passing through the deconvolution filter. The theoretical and Monte Carlo studies performed are confirmed by the results of preliminary measurements obtained with a dedicated system comprising a radiation source, silicon sensor, front-end electronics, ADC and further digital processing implemented on a PC. (author)
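
    A minimal numpy sketch of the underlying idea follows: samples of a slow exponential shaper are passed through a short deconvolution filter that restores the time and amplitude of the original impulses. The shaper model, constants and noise level are illustrative assumptions, not the system described in the paper.

```python
# Minimal sketch: sample a slow exponential shaper, then apply a two-tap
# deconvolution to recover impulse times and amplitudes; constants are invented.
import numpy as np

tau = 8.0                      # shaper decay in sampling periods (assumed)
d = np.exp(-1.0 / tau)

# true events: amplitude 5.0 at sample 20 and 2.0 at sample 53
x = np.zeros(100)
x[20], x[53] = 5.0, 2.0

# shaper output sampled by the ADC (discrete exponential impulse response)
h = d ** np.arange(100)
y = np.convolve(x, h)[:100] + np.random.default_rng(1).normal(0, 0.02, 100)

# two-tap deconvolution: x_hat[n] = y[n] - d * y[n-1] exactly inverts the shaper
x_hat = y - d * np.concatenate(([0.0], y[:-1]))

for n in np.flatnonzero(x_hat > 1.0):
    print(f"event at sample {n}, amplitude ~ {x_hat[n]:.2f}")
```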

  3. A data-based model to locate mass movements triggered by seismic events in Sichuan, China.

    Science.gov (United States)

    de Souza, Fabio Teodoro

    2014-01-01

    Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the main and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and some models to predict the location of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.

  4. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disasters because it experiences high rainfall events throughout the year. Flooding is categorized as the most important hazard because it causes social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas with monsoonal, equatorial and local rainfall types. Recharge capacity will be analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  5. Three Dimensional Numerical Simulation of Rocket-based Combined-cycle Engine Response During Mode Transition Events

    Science.gov (United States)

    Edwards, Jack R.; McRae, D. Scott; Bond, Ryan B.; Steffan, Christopher (Technical Monitor)

    2003-01-01

    The GTX program at NASA Glenn Research Center is designed to develop a launch vehicle concept based on rocket-based combined-cycle (RBCC) propulsion. Experimental testing, cycle analysis, and computational fluid dynamics modeling have all demonstrated the viability of the GTX concept, yet significant technical issues and challenges still remain. Our research effort develops a unique capability for dynamic CFD simulation of complete high-speed propulsion devices and focuses this technology toward analysis of the GTX response during critical mode transition events. Our principal attention is focused on Mode 1/Mode 2 operation, in which initial rocket propulsion is transitioned into thermal-throat ramjet propulsion. A critical element of the GTX concept is the use of an Independent Ramjet Stream (IRS) cycle to provide propulsion at Mach numbers less than 3. In the IRS cycle, rocket thrust is initially used for primary power, and the hot rocket plume is used as a flame-holding mechanism for hydrogen fuel injected into the secondary air stream. A critical aspect is the establishment of a thermal throat in the secondary stream through the combination of area reduction effects and combustion-induced heat release. This is a necessity to enable the power-down of the rocket and the eventual shift to ramjet mode. Our focus in this first year of the grant has been in three areas, each progressing directly toward the key initial goal of simulating thermal throat formation during the IRS cycle: CFD algorithm development; simulation of Mode 1 experiments conducted at Glenn's Rig 1 facility; and IRS cycle simulations. The remainder of this report discusses each of these efforts in detail and presents a plan of work for the next year.

  6. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with direct shear or double direct shear experimental setups, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gauge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gauge layer at 45 degrees is added into the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviors, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are identified strictly from the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gauge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by
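
    As a small illustration of how slip events can be picked out of such simulations, the sketch below flags sudden stress drops in a synthetic stress-time series; the loading rate, drop sizes and threshold are invented and are not taken from the PFC runs.

```python
# Hypothetical sketch: identify stick-slip events as sudden stress drops in a
# synthetic loading curve; all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 2000)
stress = np.cumsum(rng.normal(0.05, 0.01, t.size))      # slow elastic loading
for drop_at in (400, 900, 1500):                         # imposed slip events
    stress[drop_at:] -= rng.uniform(5, 15)

dstress = np.diff(stress)
drop_threshold = -3.0                                    # stress units per step (assumed)
events = np.flatnonzero(dstress < drop_threshold)

for i in events:
    print(f"slip event near step {i}: stress drop {abs(dstress[i]):.1f}")
```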

  7. Social importance enhances prospective memory: evidence from an event-based task.

    Science.gov (United States)

    Walter, Stefan; Meier, Beat

    2017-07-01

    Prospective memory performance can be enhanced by task importance, for example by promising a reward. Typically, this comes at costs in the ongoing task. However, previous research has suggested that social importance (e.g., providing a social motive) can enhance prospective memory performance without additional monitoring costs in activity-based and time-based tasks. The aim of the present study was to investigate the influence of social importance in an event-based task. We compared four conditions: social importance, promising a reward, both social importance and promising a reward, and standard prospective memory instructions (control condition). The results showed enhanced prospective memory performance for all importance conditions compared to the control condition. Although ongoing task performance was slowed in all conditions with a prospective memory task when compared to a baseline condition with no prospective memory task, additional costs occurred only when both the social importance and reward were present simultaneously. Alone, neither social importance nor promising a reward produced an additional slowing when compared to the cost in the standard (control) condition. Thus, social importance and reward can enhance event-based prospective memory at no additional cost.

  8. Ant colony optimization and event-based dynamic task scheduling and staffing for software projects

    Science.gov (United States)

    Ellappan, Vijayan; Ashwini, J.

    2017-11-01

    In software development organizations running medium to very large scale projects, project planning is an extremely complex and challenging task, even more so when treated as a manual process. Software project planning needs to deal with the problem of task scheduling as well as the problem of human resource allocation (also called staffing), because most of the resources in software projects are people. We propose a machine learning approach that finds scheduling solutions by learning from existing planning arrangements, together with an event-based scheduler that updates the task schedule produced by the learning algorithm in response to events such as the start of the project, the instants at which resources become free after completing tasks, and the times when employees join or leave the project during development. This process of refreshing the schedule through the event-based scheduler makes the planning method dynamic. The approach uses system components to represent the interrelated flows of tasks, defects and personnel throughout the different development stages and is calibrated to industrial data. It extends past software project management research by examining a survey-based process with a dedicated model, integrating it with a knowledge-based system for risk assessment and cost estimation, and using a decision modeling platform.

  9. A Markovian event-based framework for stochastic spiking neural networks.

    Science.gov (United States)

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, the information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e. one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived in such classical cases of neural networks as the linear integrate-and-fire neuron models with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
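
    The sketch below simulates the basic ingredient such an event-based description builds on: the interspike-interval distribution of a single noisy leaky integrate-and-fire neuron, here estimated by Euler-Maruyama simulation with illustrative parameters rather than derived analytically as in the paper.

```python
# Sketch: estimate the interspike-interval (ISI) distribution of a noisy LIF
# neuron by direct simulation; all parameter values are illustrative.
import numpy as np

def lif_spike_intervals(n_spikes=2000, mu=1.2, sigma=0.4, tau=1.0,
                        v_thresh=1.0, v_reset=0.0, dt=1e-3, seed=0):
    """Return ISIs of dV = (-V/tau + mu) dt + sigma dW with threshold and reset."""
    rng = np.random.default_rng(seed)
    isis, v, t_last, t = [], v_reset, 0.0, 0.0
    while len(isis) < n_spikes:
        v += (-v / tau + mu) * dt + sigma * np.sqrt(dt) * rng.normal()
        t += dt
        if v >= v_thresh:                 # spike event: record the ISI and reset
            isis.append(t - t_last)
            t_last, v = t, v_reset
    return np.array(isis)

isis = lif_spike_intervals()
print(f"mean ISI {isis.mean():.3f}, CV {isis.std() / isis.mean():.2f}")
```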

  10. I Will Write a Letter and Change the World: The Knowledge Base Kick-Starting Norway’s Rainforest Initiative

    Directory of Open Access Journals (Sweden)

    Erlend Andre Tveiten Hermansen

    2015-12-01

    Full Text Available In September 2007 two Norwegian NGOs wrote a letter to leading Norwegian politicians urging them to establish a climate initiative for protecting rainforests. Two months later, at the United Nations climate summit in Bali, Norway committed to donate three billion NOK annually to prevent tropical deforestation, making Norway the leading global donor in what has become the REDD+ mechanism (Reducing Emissions from Deforestation and Forest Degradation. This article provides a detailed analysis of the making of the rainforest initiative, placing particular emphasis on the knowledge base of the initiative, most notably a decisive letter. Close contact with policy makers in the process ensured legitimacy and credibility for the proposal. Important for the initiative’s rapid progression was that it came in the middle of the run-up to the negotiations of a cross-political climate settlement in the Norwegian Parliament. The rainforest initiative became one of the hottest proposals in the climate policy ‘bidding war’ between the government and the opposition. All these events must be seen against the background of 2007 being a year when public concern and media coverage about climate issues peaked. Politicians were under pressure to act, and the rainforest proposal’s perfect fit with the Norwegian climate mitigation main approach of pursuing large-scale cost-effective emission cutbacks abroad made it pass swiftly through the governmental machinery. In conclusion, the article suggests the metaphor of the perfect storm to explain how the NGOs exploited a situation which made the rainforest initiative an indispensable part of Norway’s climate policy.

  11. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications.

    Science.gov (United States)

    Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just

    2018-04-03

    Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.
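
    A simplified sketch of the two coupled stages follows: keyword-based classification of posts into event types, then assignment of sensing priorities to nodes near the detected event. The event classes, keyword lists and priority levels are invented placeholders, not the paper's classifier.

```python
# Toy sketch: classify posts by keyword overlap, then raise the sensing
# priority of nodes near the most severe detected event type.
import re

EVENT_KEYWORDS = {
    "flood": {"flood", "flooded", "overflow", "rain"},
    "fire": {"fire", "smoke", "burning"},
    "traffic": {"jam", "accident", "collision"},
}
EVENT_PRIORITY = {"fire": 3, "flood": 2, "traffic": 1}   # 3 = highest priority

def classify_post(text):
    """Return the event type whose keyword set best overlaps the post, or None."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    scores = {event: len(tokens & kws) for event, kws in EVENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def update_priorities(posts, nodes_near_event):
    """Give nodes near the detected event the priority of the most severe type."""
    detected = [ev for ev in (classify_post(p) for p in posts) if ev]
    if not detected:
        return {node: 0 for node in nodes_near_event}
    top = max(detected, key=lambda ev: EVENT_PRIORITY[ev])
    return {node: EVENT_PRIORITY[top] for node in nodes_near_event}

posts = ["Heavy rain, the street is flooded again", "Smoke near the market"]
print(update_priorities(posts, ["node_12", "node_17"]))
```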

  12. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications

    Directory of Open Access Journals (Sweden)

    Daniel G. Costa

    2018-04-01

    Full Text Available Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.

  13. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2017-09-01

    Full Text Available Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.
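
    As a rough illustration of the partitioning idea, the sketch below abstracts sensors as vertices of a weighted graph (weights standing in for the degree of event correlation) and splits it into two balanced regions with Kernighan-Lin bisection from networkx; the paper's multi-objective partitioner and incremental optimization are not reproduced here, and the buoy names and weights are invented.

```python
# Illustrative sketch: balanced bisection of a sensor graph whose edge weights
# stand in for event correlation; a simple stand-in for the paper's method.
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

G = nx.Graph()
correlated_pairs = [
    ("buoy_1", "buoy_2", 0.9), ("buoy_2", "buoy_3", 0.8), ("buoy_1", "buoy_3", 0.7),
    ("buoy_4", "buoy_5", 0.9), ("buoy_5", "buoy_6", 0.8),
    ("buoy_3", "buoy_4", 0.1),   # weak cross-area correlation
]
G.add_weighted_edges_from(correlated_pairs)

region_a, region_b = kernighan_lin_bisection(G, weight="weight", seed=0)
print("partition A:", sorted(region_a))
print("partition B:", sorted(region_b))
```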

  14. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation.

    Science.gov (United States)

    Huang, Dongmei; Xu, Chenyixuan; Zhao, Danfeng; Song, Wei; He, Qi

    2017-09-21

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.

  15. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-06-01

    Full Text Available The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treatment of side effects which may theoretically occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012, and for the first analysis we selected those Programs in which the same medicinal products were used. We based the adverse events selection on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper is related to the results in the pulmonology area. The events described as very common had an incidence of ≥ 1/10, and the common ones an incidence of ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated the direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treatment of side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost incurred by both the public payer and the patient.

  16. Leading indicators of community-based violent events among adults with mental illness.

    Science.gov (United States)

    Van Dorn, R A; Grimm, K J; Desmarais, S L; Tueller, S J; Johnson, K L; Swartz, M S

    2017-05-01

    The public health, public safety and clinical implications of violent events among adults with mental illness are significant; however, the causes and consequences of violence and victimization among adults with mental illness are complex and not well understood, which limits the effectiveness of clinical interventions and risk management strategies. This study examined interrelationships between violence, victimization, psychiatric symptoms, substance use, homelessness and in-patient treatment over time. Available data were integrated from four longitudinal studies of adults with mental illness. Assessments took place at baseline, and at 1, 3, 6, 9, 12, 15, 18, 24, 30 and 36 months, depending on the parent studies' protocol. Data were analysed with the autoregressive cross-lag model. Violence and victimization were leading indicators of each other and affective symptoms were a leading indicator of both. Drug and alcohol use were leading indicators of violence and victimization, respectively. All psychiatric symptom clusters - affective, positive, negative, disorganized cognitive processing - increased the likelihood of experiencing at least one subsequent symptom cluster. Sensitivity analyses identified few group-based differences in the magnitude of effects in this heterogeneous sample. Violent events demonstrated unique and shared indicators and consequences over time. Findings indicate mechanisms for reducing violent events, including trauma-informed therapy, targeting internalizing and externalizing affective symptoms with cognitive-behavioral and psychopharmacological interventions, and integrating substance use and psychiatric care. Finally, mental illness and violence and victimization research should move beyond demonstrating concomitant relationships and instead focus on lagged effects with improved spatio-temporal contiguity.

  17. Event recognition in personal photo collections via multiple instance learning-based classification of multiple images

    Science.gov (United States)

    Ahmad, Kashif; Conci, Nicola; Boato, Giulia; De Natale, Francesco G. B.

    2017-11-01

    Over the last few years, a rapid growth has been witnessed in the number of digital photos produced per year. This rapid growth poses challenges in the organization and management of multimedia collections, and one viable solution consists of arranging the media on the basis of the underlying events. However, album-level annotation and the presence of irrelevant pictures in photo collections make event-based organization of personal photo albums a more challenging task. To tackle these challenges, in contrast to conventional approaches relying on supervised learning, we propose a pipeline for event recognition in personal photo collections relying on a multiple-instance learning (MIL) strategy. MIL is a modified form of supervised learning and is well suited to applications with weakly labeled data. The experimental evaluation of the proposed approach is carried out on two large-scale datasets, including a self-collected and a benchmark dataset. On both, our approach significantly outperforms the existing state-of-the-art.
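
    The following minimal sketch conveys the multiple-instance flavor of such a pipeline: an instance-level classifier is trained under the naive assumption that photos inherit their album's label, and album-level scores are obtained by max-pooling over photo scores. The features, labels and pooling rule are illustrative assumptions, not the authors' model.

```python
# Minimal MIL sketch: bags = albums, instances = photos; bag score = max over
# instance scores from a classifier trained on inherited bag labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_bag(is_event, n=12, dim=8):
    X = rng.normal(size=(n, dim))
    if is_event:                       # only a few photos actually show the event
        X[:3, 0] += 3.0
    return X

bags = [make_bag(i % 2 == 0) for i in range(40)]
bag_labels = np.array([1 if i % 2 == 0 else 0 for i in range(40)])

# naive MIL assumption: every instance inherits its bag's label for training
X_inst = np.vstack(bags)
y_inst = np.repeat(bag_labels, [len(b) for b in bags])
clf = LogisticRegression(max_iter=1000).fit(X_inst, y_inst)

# bag-level prediction: max over instance probabilities
bag_scores = np.array([clf.predict_proba(b)[:, 1].max() for b in bags])
accuracy = ((bag_scores > 0.5).astype(int) == bag_labels).mean()
print(f"bag-level accuracy on training bags: {accuracy:.2f}")
```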

  18. Initiating Service Encounter-based Innovation by Word-of-Business

    DEFF Research Database (Denmark)

    Mattsson, Jan

    2015-01-01

    Purpose – This paper aims to set up a natural experiment as action research and to develop a framework of cognitive distance of informants to improve the initiation of service encounter-based innovation. Design/methodology/approach – Natural experiment as action research in one Scandinavian case … transcriptions of interviews and transcriptions. Research limitations/implications – Only one Scandinavian company and a limited number of informants were activated. Also, the time period only included the initiation phase of service encounter-based innovation. Practical implications – Three different strategies … in an emerging innovation field, open/user-driven innovation. Theory from business marketing, service encounter and innovation is also used...

  19. An energy estimation framework for event-based methods in Non-Intrusive Load Monitoring

    International Nuclear Information System (INIS)

    Giri, Suman; Bergés, Mario

    2015-01-01

    Highlights: • Energy estimation in NILM has not yet accounted for the complexity of appliance models. • We present a data-driven framework for appliance modeling in supervised NILM. • We test the framework on 3 houses and report average accuracies of 5.9–22.4%. • Appliance models facilitate the estimation of energy consumed by the appliance. - Abstract: Non-Intrusive Load Monitoring (NILM) is a set of techniques used to estimate the electricity consumed by individual appliances in a building from measurements of the total electrical consumption. Most commonly, NILM works by first attributing any significant change in the total power consumption (also known as an event) to a specific load and subsequently using these attributions (i.e. the labels for the events) to estimate energy for each load. For this last step, most published work in the field makes simplifying assumptions to make the problem more tractable. In this paper, we present a framework for creating appliance models based on classification labels and aggregate power measurements that can help to relax many of these assumptions. Our framework automatically builds models for appliances to perform energy estimation. The model relies on feature extraction, clustering via affinity propagation, perturbation of extracted states to ensure that they mimic appliance behavior, creation of finite state models, correction of any errors in classification that might violate the model, and estimation of energy based on corrected labels. We evaluate our framework on 3 houses from standard datasets in the field and show that the framework can learn data-driven models based on event labels and use them to estimate energy with lower error margins (e.g., 1.1–42.3%) than when using the heuristic models used by others.
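
    A hedged sketch of the event-based pipeline such a framework automates is given below: large steps in the aggregate power signal are treated as events, event magnitudes are clustered with affinity propagation, and a crude per-cluster energy estimate is derived from the labelled intervals. The synthetic power trace, thresholds and pairing heuristic are assumptions for illustration, not the paper's models.

```python
# Hedged NILM sketch: detect power-step events, cluster their magnitudes with
# affinity propagation, and roughly estimate energy per cluster; data invented.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
power = np.full(600, 50.0) + rng.normal(0, 2, 600)   # 1 Hz aggregate signal (W)
power[100:250] += 1500.0                             # e.g. a heater cycle
power[300:320] += 700.0                              # e.g. a kettle

steps = np.diff(power)
event_idx = np.flatnonzero(np.abs(steps) > 200.0)    # event = large power change
magnitudes = steps[event_idx].reshape(-1, 1)

labels = AffinityPropagation(random_state=0).fit_predict(np.abs(magnitudes))

# crude energy estimate: span between the first and last event of each cluster
for lab in np.unique(labels):
    idx = event_idx[labels == lab]
    if len(idx) >= 2:
        duration_s = idx[-1] - idx[0]
        level_w = abs(magnitudes[labels == lab].mean())
        print(f"cluster {lab}: ~{level_w:.0f} W for {duration_s} s "
              f"=> {level_w * duration_s / 3.6e6:.3f} kWh")
```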

  20. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.
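
    To make the mechanism concrete, the toy sketch below integrates a one-compartment toxicokinetic model in which uptake scales with an effective respiratory volume that is raised during simulated exercise; it is far simpler than the published PBTK model and all parameter values are invented.

```python
# Toy one-compartment toxicokinetic sketch (not the published PBTK model):
# uptake from water scales with effective respiratory volume, which is higher
# during simulated exercise; all parameter values are invented.
import numpy as np
from scipy.integrate import odeint

def tk(c_fish, t, c_water, q_resp, k_elim, k_met):
    """dC/dt = respiratory uptake - (elimination + metabolism)."""
    uptake = q_resp * c_water
    return uptake - (k_elim + k_met) * c_fish

t = np.linspace(0, 48, 200)                     # hours
c_rest = odeint(tk, 0.0, t, args=(1.0, 0.05, 0.02, 0.03))[:, 0]
c_exercise = odeint(tk, 0.0, t, args=(1.0, 0.15, 0.02, 0.03))[:, 0]  # 3x ventilation

print(f"internal concentration after 48 h: rest {c_rest[-1]:.2f}, "
      f"exercise {c_exercise[-1]:.2f} (arbitrary units)")
```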

  1. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.

  2. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    Directory of Open Access Journals (Sweden)

    Ninna Reitzel Jensen

    2015-06-01

    Full Text Available Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, hereby, facilitating event risk.

  3. Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition

    Science.gov (United States)

    Daigle, Matthew; Roychoudhury, Indranil

    2010-01-01

    We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced, and lessons learned, from implementing the approach

  4. A data base approach for prediction of deforestation-induced mass wasting events

    Science.gov (United States)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The present investigation examines the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes which are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.

  5. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS).

    Science.gov (United States)

    Rigi, Amin; Baghaei Naeini, Fariborz; Makris, Dimitrios; Zweiri, Yahya

    2018-01-24

    In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The proposed approach is validated by using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, therefore adding high noise levels that affected results significantly. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential of the sensor being used for manipulation applications, especially in dynamic environments.

  6. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on the Annual Maximum and Partial Duration Methods, derived from 102 rain-gauge stations in Peninsular Malaysia from 1982 to 2012. To determine the optimal threshold for each station, several requirements must be satisfied, and the Adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold, and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five, and the resulting data are also de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are the Maximum Likelihood and the L-moment methods. Two goodness-of-fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation for extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
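    As a hedged illustration of the two fitting routes described above (Annual Maximum with a GEV distribution, Partial Duration with a Generalized Pareto distribution over a threshold), the sketch below fits both models to synthetic daily rainfall with scipy; the bootstrap-based optimal threshold selection and the L-moment estimators are not reproduced, and the data are not the Peninsular Malaysia records.

```python
# Illustrative GEV / GPD fits on synthetic data (maximum likelihood only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
daily_rain = rng.gamma(shape=0.4, scale=20.0, size=(31, 365))   # 31 "years" of daily rainfall

# Annual Maximum series -> Generalized Extreme Value, maximum likelihood
annual_max = daily_rain.max(axis=1)
gev_shape, gev_loc, gev_scale = stats.genextreme.fit(annual_max)

# Partial Duration series -> Generalized Pareto over a fixed threshold
threshold = np.quantile(daily_rain, 0.99)        # stand-in for the MSE-based optimal threshold
exceedances = daily_rain[daily_rain > threshold] - threshold
gpd_shape, gpd_loc, gpd_scale = stats.genpareto.fit(exceedances, floc=0.0)

# Example return level: 100-year event from the GEV fit
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, gev_shape, loc=gev_loc, scale=gev_scale)
print(gev_shape, gpd_shape, return_level_100)
```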

  7. Non-Cooperative Regulation Coordination Based on Game Theory for Wind Farm Clusters during Ramping Events

    DEFF Research Database (Denmark)

    Qi, Yongzhi; Liu, Yutian; Wu, Qiuwei

    2017-01-01

    With increasing penetration of wind power in power systems, it is important to track scheduled wind power output as much as possible during ramping events to ensure security of the system. In this paper, a non‐cooperative coordination strategy based on the game theory is proposed for the regulation...... of the regulation revenue function according to the derived Nash equilibrium condition, the ER strategy is the Nash equilibrium of the regulation competition. Case studies were conducted with the power output data of wind farms from State Grid Jibei Electric Power Company Limited of China to demonstrate...

  8. Arachne-A web-based event viewer for MINER{nu}A

    Energy Technology Data Exchange (ETDEWEB)

    Tagg, N., E-mail: ntagg@otterbein.edu [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Brangham, J. [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Chvojka, J. [Rochester, NY 14610 (United States); Clairemont, M. [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Day, M. [Rochester, NY 14610 (United States); Eberly, B. [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Felix, J. [Lascurain de Retana No. 5, Col. Centro. Guanajuato, Guanajuato 36000 (Mexico); Fields, L. [Northwestern University, Evanston, IL 60208 (United States); Gago, A.M. [Seccion Fisica, Departamento de Ciencias, Pontificia Universidad Catolica del Peru, Apartado 1761, Lima (Peru); Gran, R. [Department of Physics, University of Minnesota - Duluth, Duluth, MN 55812 (United States); Harris, D.A. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Kordosky, M. [Department of Physics, College of William and Mary, Williamsburg, VA 23187 (United States); Lee, H. [Rochester, NY 14610 (United States); Maggi, G. [Departamento de Fisica, Universidad Tecnica Federico Santa Maria, Avda. Espana 1680 Casilla 110-V Valparaiso (Chile); Maher, E. [Massachusetts College of Liberal Arts, 375 Church Street, North Adams, MA 01247 (United States); Mann, W.A. [Physics Department, Tufts University, Medford, MA 02155 (United States); Marshall, C.M.; McFarland, K.S.; McGowan, A.M.; Mislivec, A. [Rochester, NY 14610 (United States); and others

    2012-06-01

    Neutrino interaction events in the MINER{nu}A detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINER{nu}A to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  9. Arachne - A web-based event viewer for MINERvA

    International Nuclear Information System (INIS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.

    2011-01-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  10. Ptaquiloside from bracken in stream water at base flow and during storm events

    DEFF Research Database (Denmark)

    Clauson-Kaas, Frederik; Ramwell, Carmel; Hansen, Hans Chr. Bruun

    2016-01-01

    not decrease over the course of the event. In the stream, the throughfall contribution to PTA cannot be separated from a possible below-ground input from litter, rhizomes and soil. Catchment-specific factors such as the soil pH, topography, hydrology, and bracken coverage will evidently affect the level of PTA...... rainfall and PTA concentration in the stream, with a reproducible time lag of approx. 1 h from onset of rain to elevated concentrations, and returning rather quickly (about 2 h) to base flow concentration levels. The concentration of PTA behaved similar to an inert tracer (Cl(-)) in the pulse experiment...

  11. Modeling crowd behavior based on the discrete-event multiagent approach

    OpenAIRE

    Лановой, Алексей Феликсович; Лановой, Артем Алексеевич

    2014-01-01

    A crowd is a temporary, relatively unorganized group of people who are in close physical contact with each other. The behavior of an individual outside the crowd is determined by many factors associated with his intellectual activities, but inside the crowd a person loses his identity and begins to obey simpler laws of behavior. One approach to the construction of a multi-level model of the crowd using the discrete-event multiagent approach is described in the paper. Based on this analysi...

  12. Arachne—A web-based event viewer for MINERνA

    International Nuclear Information System (INIS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.; Kordosky, M.; Lee, H.; Maggi, G.; Maher, E.; Mann, W.A.; Marshall, C.M.; McFarland, K.S.; McGowan, A.M.; Mislivec, A.

    2012-01-01

    Neutrino interaction events in the MINERνA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERνA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  13. Real-time identification of residential appliance events based on power monitoring

    Science.gov (United States)

    Yang, Zhao; Zhu, Zhicheng; Wei, Zhiqiang; Yin, Bo; Wang, Xiuwei

    2018-03-01

    Energy monitoring for specific home appliances has been regarded as a prerequisite for reducing residential energy consumption. To enhance the accuracy of identifying the operation status of household appliances and to keep pace with the development of the smart power grid, this paper puts forward the integration of electric current and power data on the basis of an existing algorithm. If the average power difference of several adjacent cycles deviates from the baseline and exceeds a pre-assigned threshold value, an event is flagged. Based on the MATLAB platform and simulations of domestic appliances, the test results verify the algorithm and indicate that the power-based method achieves the desired appliance identification.
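    The flagging rule described above can be sketched as follows; the window length and threshold are hypothetical, and a real detector would also fuse the current-signature features mentioned in the abstract.

```python
# Sketch of the power-difference flagging rule (illustrative parameters only).
import numpy as np

def flag_events(power, window=5, threshold=30.0):
    """power: 1-D array of per-cycle average power readings (W)."""
    events = []
    i = window
    while i <= len(power) - window:
        before = np.mean(power[i - window:i])      # baseline from the preceding cycles
        after = np.mean(power[i:i + window])       # average of the adjacent cycles
        if abs(after - before) > threshold:
            events.append((i, float(after - before)))
            i += window                            # skip past the transition
        else:
            i += 1
    return events

readings = np.concatenate([np.full(50, 100.0), np.full(50, 160.0), np.full(50, 95.0)])
print(flag_events(readings))                       # one switch-on step and one switch-off step
```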

  14. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan, Emilio Molina, Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows users to define multiple ontologies to adapt to multiple tasks and offers the possibility of cross-annotating audio data. Moreover, it is easy to install and deploy...

  15. Arachne - A web-based event viewer for MINERvA

    Energy Technology Data Exchange (ETDEWEB)

    Tagg, N.; /Otterbein Coll.; Brangham, J.; /Otterbein Coll.; Chvojka, J.; /Rochester U.; Clairemont, M.; /Otterbein Coll.; Day, M.; /Rochester U.; Eberly, B.; /Pittsburgh U.; Felix, J.; /Guanajuato U.; Fields, L.; /Northwestern U.; Gago, A.M.; /Lima, Pont. U. Catolica; Gran, R.; /Maryland U.; Harris, D.A.; /Fermilab /William-Mary Coll.

    2011-11-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  16. Analytical expression for initial magnetization curve of Fe-based soft magnetic composite material

    Energy Technology Data Exchange (ETDEWEB)

    Birčáková, Zuzana, E-mail: zuzana.bircakova@upjs.sk [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Kollár, Peter; Füzer, Ján [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Bureš, Radovan; Fáberová, Mária [Institute of Materials Research, Slovak Academy of Sciences, Watsonova 47, 04001 Košice (Slovakia)

    2017-02-01

    The analytical expression for the initial magnetization curve of an Fe-phenol-formaldehyde resin composite material was derived based on the previously proposed ideas of the magnetization vector deviation function and the domain wall annihilation function, characterizing the reversible magnetization processes through the extent of deviation of the magnetization vectors from the magnetic field direction and the irreversible processes through the effective numbers of movable domain walls, respectively. Because composite materials exhibit specific dependences of these functions, the ideas were extended to accommodate the composites' special features, principally the much higher inner demagnetizing fields produced by magnetic poles on the surfaces of ferromagnetic particles. The proposed analytical expression enables us to find the relative extent of each type of magnetization process when magnetizing a specimen along the initial curve. - Highlights: • Analytical expression of the initial curve derived for SMC. • Initial curve described by elementary magnetization processes. • Influence of inner demagnetizing fields on magnetization process in SMC.

  17. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal-processing-based FD and model-based FD. The former concerns developing algorithms to directly infer faults from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and expected values from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulic simulation models can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (higher true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
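    A minimal sketch of the model-based idea, assuming a stand-in for the physically based simulator (a real system would use hydraulic and water-quality simulation instead of the synthetic baseline below): observations from all monitoring stations are compared against model-predicted values, and a joint discrepancy score over the whole network raises the alarm.

```python
# Sketch of a model-based, multi-site event detection rule. The "simulation model"
# below is a synthetic stand-in, not a hydraulic/water-quality simulator.
import numpy as np

def expected_concentration(t, n_stations):
    """Hypothetical baseline playing the role of the physically based simulation."""
    return 0.5 + 0.05 * np.sin(t / 24.0 + np.arange(n_stations))

def multi_site_alarm(observed, t, sigma=0.03, limit=9.0):
    """Joint discrepancy over all stations instead of independent per-station rules."""
    residuals = (observed - expected_concentration(t, observed.size)) / sigma
    score = float(np.mean(residuals ** 2))
    return score > limit, score

rng = np.random.default_rng(1)
clean = expected_concentration(10.0, 6) + rng.normal(0.0, 0.03, 6)
event = clean.copy()
event[2] += 0.4                                   # contaminant reaches station 2
print(multi_site_alarm(clean, 10.0))              # expected: (False, ~1)
print(multi_site_alarm(event, 10.0))              # expected: (True, large score)
```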

  18. An improved initialization center k-means clustering algorithm based on distance and density

    Science.gov (United States)

    Duan, Yanling; Liu, Qun; Xia, Shuyin

    2018-04-01

    To address the problem that the random initial cluster centers of the k-means algorithm make the clustering results sensitive to outlier samples and unstable across repeated runs, a center initialization method based on larger distance and higher density is proposed. The reciprocal of the weighted average distance is used to represent the sample density, and the data samples with larger distance and higher density are selected as the initial cluster centers to improve the clustering results. Then, a clustering evaluation method based on distance and density is designed to verify the feasibility and practicality of the algorithm; the experimental results on UCI data sets show that the algorithm has a certain stability and practicality.
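    The initialization idea can be sketched as follows, with density taken as the reciprocal of the mean pairwise distance and centers chosen to be both dense and far from the centers already picked; this is an illustrative reading, not the authors' exact weighting.

```python
# Sketch of distance-and-density based center initialization (illustrative scoring).
import numpy as np

def init_centers(X, k):
    """Pick k initial centers with high density and large mutual distance."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)    # pairwise distances
    density = 1.0 / (dist.mean(axis=1) + 1e-12)                     # reciprocal mean distance
    centers = [int(np.argmax(density))]                             # densest point first
    for _ in range(k - 1):
        d_to_centers = dist[:, centers].min(axis=1)                 # distance to nearest chosen center
        score = d_to_centers * density                              # favour far AND dense points
        score[centers] = -np.inf
        centers.append(int(np.argmax(score)))
    return X[centers]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc, 0.3, size=(50, 2)) for loc in ((0, 0), (4, 4), (0, 4))])
print(init_centers(X, 3))    # roughly one center per cluster
```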

  19. Location-based technologies for supporting elderly pedestrian in "getting lost" events.

    Science.gov (United States)

    Pulido Herrera, Edith

    2017-05-01

    Localization-based technologies promise to keep older adults with dementia safe and support them and their caregivers during getting lost events. This paper summarizes mainly technological contributions to support the target group in these events. Moreover, important aspects of the getting lost phenomenon such as its concept and ethical issues are also briefly addressed. Papers were selected from scientific databases and gray literature. Since the topic is still in its infancy, other terms were used to find contributions associated with getting lost e.g. wandering. Trends of applying localization systems were identified as personal locators, perimeter systems and assistance systems. The first system barely considered the older adult's opinion, while assistance systems may involve context awareness to improve the support for both the elderly and the caregiver. Since few studies report multidisciplinary work with a special focus on getting lost, there is not a strong evidence of the real efficiency of localization systems or guidelines to design systems for the target group. Further research about getting lost is required to obtain insights for developing customizable systems. Moreover, considering conditions of the older adult might increase the impact of developments that combine localization technologies and artificial intelligence techniques. Implications for Rehabilitation Whilst there is no cure for dementia such as Alzheimer's, it is feasible to take advantage of technological developments to somewhat diminish its negative impact. For instance, location-based systems may provide information to early diagnose the Alzheimer's disease by assessing navigational impairments of older adults. Assessing the latest supportive technologies and methodologies may provide insights to adopt strategies to properly manage getting lost events. More user-centered designs will provide appropriate assistance to older adults. Namely, customizable systems could assist older adults

  20. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of the knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for calibration of near real-time loss assessment systems based on simulation models for shaking intensity, building damage and casualty estimates. Such calibration allows compensation for some factors which influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains the description of past earthquakes' consequences for the area under study. It also includes the current distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows determination of the sets of regional calibration coefficients, including the rating of seismological surveys, peculiarities of shaking intensity attenuation, and changes in building stock and population distribution, in order to minimize the error of damaging-earthquake loss estimations in "emergency" mode. References 1. Larionov, V., Frolova, N: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used In Worlwide Systems For Earthquake Loss Estimation In Emergency Mode: Wenchuan Earthquake. In Proc. TIEMS2010 Conference, Beijing, China, 2010. 3. Frolova N. I., Larionov V. I., Bonnin J., Sushchev S. P., Ugarov A. N., Kozlov M. A. Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards Journal of the International Society for the Prevention and Mitigation of Natural Hazards, vol.84, ISSN 0921-030, Nat Hazards DOI 10.1007/s11069-016-2653

  1. Tsunami Source Identification on the 1867 Tsunami Event Based on the Impact Intensity

    Science.gov (United States)

    Wu, T. R.

    2014-12-01

    The 1867 Keelung tsunami event has drawn significant attention from people in Taiwan, not only because the location was very close to the three nuclear power plants, which are only about 20 km away from Taipei city, but also because of the ambiguity of the tsunami sources. This event is unique in many aspects. First, it was documented in many literatures, in several languages, with similar descriptions. Second, the tsunami deposit was discovered recently. Based on the literature, an earthquake, a 7-meter tsunami height, volcanic smoke, and oceanic smoke were observed. Previous studies concluded that this tsunami was generated by an earthquake with a magnitude around Mw 7.0 along the Shanchiao Fault. However, numerical results showed that even a Mw 8.0 earthquake was not able to generate a 7-meter tsunami. Considering the steep bathymetry and intense volcanic activity along the Keelung coast, one reasonable hypothesis is that different types of tsunami sources existed, such as a submarine landslide or a volcanic eruption. In order to confirm this scenario, last year we proposed the Tsunami Reverse Tracing Method (TRTM) to find the possible locations of the tsunami sources. This method helped us rule out impossible far-field tsunami sources. However, the near-field sources still remain unclear. This year, we further developed a new method named 'Impact Intensity Analysis' (IIA). In the IIA method, the study area is divided into a sequence of tsunami sources, and numerical simulations of each source are conducted with the COMCOT (Cornell Multi-grid Coupled Tsunami Model) tsunami model. After that, the resulting wave height from each source to the study site is collected and plotted. This method successfully helped us to identify the impact factor of the near-field potential sources. The IIA result (Fig. 1) shows that the 1867 tsunami event was a multi-source event. A mild tsunami was triggered by a Mw 7.0 earthquake, and then followed by the submarine
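    The bookkeeping behind the IIA method can be sketched as below. The simulate_wave_height function is a hypothetical stand-in for a COMCOT run, and the coordinates and decay law are purely synthetic; only the looping and ranking logic reflects the method described above.

```python
# Sketch of the Impact Intensity Analysis bookkeeping (synthetic stand-in for COMCOT).
import numpy as np

def simulate_wave_height(source_lon, source_lat, site=(121.74, 25.15)):
    """Toy decay of the wave height produced at the study site by one candidate source."""
    d = np.hypot(source_lon - site[0], source_lat - site[1])
    return 7.0 * np.exp(-d / 0.5)                  # metres, purely synthetic

def impact_intensity(lon_grid, lat_grid):
    """Impact factor of every candidate source cell on the study site."""
    impact = np.zeros((len(lat_grid), len(lon_grid)))
    for i, lat in enumerate(lat_grid):
        for j, lon in enumerate(lon_grid):
            impact[i, j] = simulate_wave_height(lon, lat)
    return impact

lons = np.linspace(121.0, 122.5, 16)
lats = np.linspace(24.8, 25.8, 11)
impact = impact_intensity(lons, lats)
i, j = np.unravel_index(np.argmax(impact), impact.shape)
print("strongest candidate source near", lons[j], lats[i], "->", round(float(impact[i, j]), 2), "m")
```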

  2. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    Full Text Available The automatic discrimination of rock fracture and blast events is complex and challenging due to the similar waveform characteristics. To solve this problem, a new method based on the signal complexity analysis and machine learning has been proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect complexity of signals and constructed into a feature vector set. Secondly, based on the feature vector set, back-propagation neural network (BPNN as a means of machine learning is applied to establish a discriminator for rock fracture and blast events. Then to evaluate the classification performances of the new method, the classifying accuracies of support vector machine (SVM, naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC curves are also analyzed. The results show the new method obtains the best classification performances. In addition, the influence of different scale factor q and number of training samples n on discrimination results is discussed. It is found that the classifying accuracy of the new method reaches the highest value when q = 8–15 or 8–20 and n=140.
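    The feature-extraction step described above (permutation entropy at several scale factors) can be sketched as follows; the embedding order and scale range are illustrative, and the back-propagation neural network trained on these features is not reproduced.

```python
# Sketch of multiscale permutation-entropy features for signal discrimination.
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, order=4, delay=1):
    """Normalised permutation entropy of a 1-D signal."""
    n = len(x) - (order - 1) * delay
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1
    probs = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    return float(-(probs * np.log(probs)).sum() / log(factorial(order)))

def coarse_grain(x, scale):
    """Non-overlapping averages used for the multiscale analysis."""
    m = len(x) // scale
    return x[:m * scale].reshape(m, scale).mean(axis=1)

def feature_vector(signal, scales=range(1, 16)):
    return np.array([permutation_entropy(coarse_grain(signal, s)) for s in scales])

rng = np.random.default_rng(0)
blast_like = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.1 * rng.standard_normal(4000)
fracture_like = rng.standard_normal(4000)
print(feature_vector(blast_like).round(2))         # lower entropy for the structured signal
print(feature_vector(fracture_like).round(2))      # features like these would feed the BPNN
```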

  3. Event-based prospective memory in mildly and severely autistic children.

    Science.gov (United States)

    Sheppard, Daniel P; Kvavilashvili, Lia; Ryder, Nuala

    2016-01-01

    There is a growing body of research into the development of prospective memory (PM) in typically developing children but research is limited in autistic children (Aut) and rarely includes children with more severe symptoms. This study is the first to specifically compare event-based PM in severely autistic children to mildly autistic and typically developing children. Fourteen mildly autistic children and 14 severely autistic children, aged 5-13 years, were matched for educational attainment with 26 typically developing children aged 5-6 years. Three PM tasks and a retrospective memory task were administered. Results showed that severely autistic children performed less well than typically developing children on two PM tasks but mildly autistic children did not differ from either group. No group differences were found on the most motivating (a toy reward) task. The findings suggest naturalistic tasks and motivation are important factors in PM success in severely autistic children and highlights the need to consider the heterogeneity of autism and symptom severity in relation to performance on event-based PM tasks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Valenced cues and contexts have different effects on event-based prospective memory.

    Science.gov (United States)

    Graf, Peter; Yu, Martin

    2015-01-01

    This study examined the separate and joint influences on event-based prospective memory task performance due to the valence of cues and the valence of contexts. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral compared to valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influence on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  5. Valenced cues and contexts have different effects on event-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Peter Graf

    Full Text Available This study examined the separate and joint influences on event-based prospective memory task performance due to the valence of cues and the valence of contexts. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral compared to valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influence on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  6. Event-Based Prospective Memory Is Resistant but Not Immune to Proactive Interference.

    Science.gov (United States)

    Oates, Joyce M; Peynircioglu, Zehra F

    2016-01-01

    Recent evidence suggests that proactive interference (PI) does not hurt event-based prospective memory (ProM) the way it does retrospective memory (RetroM) (Oates, Peynircioglu, & Bates, 2015). We investigated this apparent resistance further. Introduction of a distractor task to ensure we were testing ProM rather than vigilance in Experiment 1 and tripling the number of lists to provide more opportunity for PI buildup in Experiment 2 still did not produce performance decrements. However, when the ProM task was combined with a RetroM task in Experiment 3, a comparable buildup and release was observed also in the ProM task. It appears that event based ProM is indeed somewhat resistant to PI, but this resistance can break down when the ProM task comprises the same stimuli as in an embedded RetroM task. We discuss the results using the ideas of cue overload and distinctiveness as well as shared attentional and working memory resources.

  7. 78 FR 75560 - Biofuels Washington LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Science.gov (United States)

    2013-12-12

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER14-506-000] Biofuels Washington LLC; Supplemental Notice That Initial Market- Based Rate Filing Includes Request for Blanket Section 204 Authorization This is a supplemental notice in the above-referenced proceeding, of Biofuels...

  8. A Service-Learning Initiative within a Community-Based Small Business

    Science.gov (United States)

    Simola, Sheldene

    2009-01-01

    Purpose: The purpose of this paper is to extend previous scholarly writing on community service-learning (SL) initiatives by looking beyond their use in the not-for-profit sector to their potential use in community-based small businesses. Design/methodology/approach: A rationale for the appropriateness of using SL projects in small businesses is…

  9. 77 FR 60984 - World Digital Innovations; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Science.gov (United States)

    2012-10-05

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-2654-001] World Digital Innovations; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization This is a supplemental notice in the above-referenced proceeding, of World Digital...

  10. The Cultural Adaptation of a Community-Based Child Maltreatment Prevention Initiative.

    Science.gov (United States)

    McLeigh, Jill D; Katz, Carmit; Davidson-Arad, Bilha; Ben-Arieh, Asher

    2017-06-01

    A unique primary prevention effort, Strong Communities for Children (Strong Communities), focuses on changing attitudes and expectations regarding communities' collective responsibilities for the safety of children. Findings from a 6-year pilot of the initiative in South Carolina have shown promise in reducing child maltreatment, but efforts to adapt the initiative to different cultural contexts have been lacking. No models exist for adapting an initiative that takes a community-level approach to ensuring children's safety. Thus, this article addresses the gap by providing an overview of the original initiative, how the initiative was adapted to the Israeli context, and lessons learned from the experience. Building on conceptualizations of cultural adaptation by Castro et al. (Prevention Science, 5, 2004, 41) and Resnicow et al. (Ethnicity and Disease, 9, 1999, 11), sources of nonfit (i.e., sociodemographic traits, political conflict, government services, and the presence and role of community organizations) were identified and deep and surface structure modifications were made to the content and delivery. Ultimately, this article describes the adaption and dissemination of a community-based child maltreatment prevention initiative in Tel Aviv, Israel, and addresses researchers' calls for more publications describing the adaptation of interventions and the procedures that need to be implemented to achieve cultural relevance. © 2015 Family Process Institute.

  11. Simulating the influence of life trajectory events on transport mode behavior in an agent-based system

    NARCIS (Netherlands)

    Verhoeven, M.; Arentze, T.A.; Timmermans, H.J.P.; Waerden, van der P.J.H.J.

    2007-01-01

    This paper describes the results of a study on the impact of lifecycle or life trajectory events on activity-travel decisions. The lifecycle trajectory of individual agents can be easily incorporated in an agent-based simulation system. This paper focuses on two lifecycle events, change in

  12. Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network

    Directory of Open Access Journals (Sweden)

    Hasmaini Mohamad

    2016-06-01

    Full Text Available One of the biggest challenges for an islanding operation is to sustain frequency stability. A large power imbalance following islanding would cause under-frequency; hence, an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is designed to be event-based, considering both the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. A simulation study on a distribution network with mini-hydro generation is carried out to evaluate the UFLS model. It is performed under different load conditions: peak and base load. Results show that the load shedding technique successfully shed a certain amount of load and stabilized the system frequency.
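    One common way to estimate the disturbance magnitude from the initial rate of change of frequency is the swing-equation estimate sketched below, followed by a staged shedding decision; the inertia constant, feeder loads and staging rule are hypothetical and not necessarily the paper's exact formulation.

```python
# Illustrative UFLS logic based on the swing equation: delta_P ≈ -(2H/fn) * df/dt (per unit).
def disturbance_magnitude(df_dt, inertia_h=4.0, f_nominal=50.0):
    """Estimate the power deficit (p.u. on the system base) right after islanding."""
    return -2.0 * inertia_h / f_nominal * df_dt     # df/dt is negative for a generation deficit

def shed_plan(deficit_pu, feeder_loads_pu):
    """Shed the smallest feeders first until the combined load covers the deficit."""
    shed, total = [], 0.0
    for name, load in sorted(feeder_loads_pu.items(), key=lambda kv: kv[1]):
        if total >= deficit_pu:
            break
        shed.append(name)
        total += load
    return shed, total

deficit = disturbance_magnitude(df_dt=-1.2)          # Hz/s measured at the islanding instant
print(deficit)                                       # ~0.192 p.u. power deficit
print(shed_plan(deficit, {"F1": 0.05, "F2": 0.08, "F3": 0.12, "F4": 0.20}))
```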

  13. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    Directory of Open Access Journals (Sweden)

    Katherine M. Boydell

    2011-01-01

    Full Text Available A symbolic interaction (BLUMER, 1969; MEAD, 1934; PRUS, 1996; PRUS & GRILLS, 2003 approach was taken to study the collective event (PRUS, 1997 of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed us to study the process of the creation of the dance and its capacity to convert abstract research into concrete form and to produce generalizable abstract knowledge from the empirical research findings. Thus, through the techniques of movement, metaphor, voice-over, and music, the characterization of experience through dance was personal and generic, individual and collective, particular and trans-situational. The dance performance allowed us to address the visceral, emotional, and visual aspects of our research which are frequently invisible in traditional academia. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs110155

  14. Emergency Load Shedding Strategy Based on Sensitivity Analysis of Relay Operation Margin against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun Sun

    2012-01-01

    the runtime emergent states of related system component. Based on sensitivity analysis between the relay operation margin and power system state variables, an optimal load shedding strategy is applied to adjust the emergent states timely before the unwanted relay operation. Load dynamics is also taken...... into account to compensate load shedding amount calculation. And the multi-agent technology is applied for the whole strategy implementation. A test system is built in real time digital simulator (RTDS) and has demonstrated the effectiveness of the proposed strategy.......In order to prevent long term voltage instability and induced cascading events, a load shedding strategy based on the sensitivity of relay operation margin to load powers is discussed and proposed in this paper. The operation margin of critical impedance backup relay is defined to identify...

  15. Acquisition and classification of static single-event upset cross section for SRAM-based FPGAs

    International Nuclear Information System (INIS)

    Yao Zhibin; Fan Ruyu; Guo Hongxia; Wang Zhongming; He Baoping; Zhang Fengqi; Zhang Keying

    2011-01-01

    In order to evaluate single event upsets (SEUs) in SRAM-based FPGAs and to find the sensitive resources in configuration memory, a heavy-ion irradiation experiment was carried out on a Xilinx FPGA device, the XCV300PQ240. The experiment was conducted to obtain the static SEU cross section and classify the SEUs in configuration memory according to different resource uses. The results demonstrate that the internal memory of SRAM-based FPGAs is extremely sensitive to heavy-ion-induced SEUs. The LUT and routing resources are the main source of SEUs in the configuration memory, covering more than 97.46% of the total upsets. The SEU sensitivity of various resources differs. The IOB control bits and LUT elements are more sensitive, and more attention should be paid to the LUT elements in radiation hardening, which account for a quite large proportion of the configuration memory. (authors)

  16. Event-Based Color Segmentation With a High Dynamic Range Sensor

    Directory of Open Access Journals (Sweden)

    Alexandre Marcireau

    2018-04-01

    Full Text Available This paper introduces a color asynchronous neuromorphic event-based camera and a methodology to process color output from the device to perform color segmentation and tracking at the native temporal resolution of the sensor (down to one microsecond. Our color vision sensor prototype is a combination of three Asynchronous Time-based Image Sensors, sensitive to absolute color information. We devise a color processing algorithm leveraging this information. It is designed to be computationally cheap, thus showing how low level processing benefits from asynchronous acquisition and high temporal resolution data. The resulting color segmentation and tracking performance is assessed both with an indoor controlled scene and two outdoor uncontrolled scenes. The tracking's mean error to the ground truth for the objects of the outdoor scenes ranges from two to twenty pixels.

  17. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    Science.gov (United States)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROV) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplants the traditional approach of assessing the kinds and numbers of animals in the oceanic water column through towing collection nets behind ships. Tow nets are limited in spatial resolution, and often destroy abundant gelatinous animals resulting in species undersampling. Video camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50m to 4000m, and provide high-resolution data at the scale of the individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames is developed by detecting whether or not there is an interesting candidate object for an animal present in a particular sequence of underwater video -- video frames that do not contain any "interesting" events. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are
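    The tracking step mentioned above (linear Kalman filters over candidate locations) can be sketched with a constant-velocity filter as below; the saliency-based attention selection itself is not reproduced, and all noise parameters are illustrative.

```python
# Minimal constant-velocity Kalman filter for tracking one candidate detection across frames.
import numpy as np

class ConstantVelocityKalman:
    """Track one candidate location (x, y) with a constant-velocity motion model."""
    def __init__(self, x, y, dt=1.0, q=1e-2, r=2.0):
        self.state = np.array([x, y, 0.0, 0.0])                  # position and velocity
        self.P = np.eye(4) * 10.0                                # state covariance
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * q                                   # process noise
        self.R = np.eye(2) * r                                   # measurement noise

    def predict(self):
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]

    def update(self, zx, zy):
        z = np.array([zx, zy])
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.state = self.state + K @ (z - self.H @ self.state)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.state[:2]

trk = ConstantVelocityKalman(100.0, 50.0)
for frame in range(1, 6):                     # candidate drifting right by ~3 px per frame
    trk.predict()
    print(trk.update(100.0 + 3 * frame, 50.0).round(1))
```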

  18. Criminal charges prior to and after initiation of office-based buprenorphine treatment

    Directory of Open Access Journals (Sweden)

    Harris Elizabeth E

    2012-03-01

    Full Text Available Abstract Background There is little data on the impact of office-based buprenorphine therapy on criminal activity. The goal of this study was to determine the impact of primary care clinic-based buprenorphine maintenance therapy on rates of criminal charges and the factors associated with criminal charges in the 2 years after initiation of treatment. Methods We collected demographic and outcome data on 252 patients who were given at least one prescription for buprenorphine. We searched a public database of criminal charges and recorded criminal charges prior to and after enrollment. We compared the total number of criminal cases and drug cases 2 years before versus 2 years after initiation of treatment. Results There was at least one criminal charge made against 38% of the subjects in the 2 years after initiation of treatment; these subjects were more likely to have used heroin, to have injected drugs, to have had any prior criminal charges, and recent criminal charges. There was no significant difference in the number of subjects with any criminal charge or a drug charge before and after initiation of treatment. Likewise, the mean number of all cases and drug cases was not significantly different between the two periods. However, among those who were opioid-negative for 6 or more months in the first year of treatment, there was a significant decline in criminal cases. On multivariable analysis, having recent criminal charges was significantly associated with criminal charges after initiation of treatment (adjusted odds ratio 3.92; subjects who were on opioid maintenance treatment prior to enrollment were significantly less likely to have subsequent criminal charges (adjusted odds ratio 0.52. Conclusions Among subjects with prior criminal charges, initiation of office-based buprenorphine treatment did not appear to have a significant impact on subsequent criminal charges.

  19. A Community-Based Event Delivery Protocol in Publish/Subscribe Systems for Delay Tolerant Sensor Networks

    Directory of Open Access Journals (Sweden)

    Haigang Gong

    2009-09-01

    Full Text Available The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, extension of Pub/Sub systems in DTSNs has become a promising research topic. However, due to the unique frequent partitioning characteristic of DTSNs, extension of a Pub/Sub system in a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, for improving the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped for minimizing the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.

  20. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    Science.gov (United States)

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

    The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, extension of Pub/Sub systems in DTSNs has become a promising research topic. However, due to the unique frequent partitioning characteristic of DTSNs, extension of a Pub/Sub system in a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, for improving the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped for minimizing the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
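    One plausible reading of the queue-management rule (keep an event only if it can still reach a subscriber within its survival time, otherwise drop it) is sketched below; the field names and the delivery-time estimate are hypothetical, not the exact CED formulation.

```python
# Sketch of a survival-time-based keep/drop decision for queued events.
import time

def manage_queue(queue, now, expected_delivery_time):
    """queue: list of dicts with 'id', 'created', 'survival_time' (seconds).
    expected_delivery_time: estimated time until a subscriber enters the community."""
    keep, drop = [], []
    for event in queue:
        remaining_lifetime = event["survival_time"] - (now - event["created"])
        if remaining_lifetime > expected_delivery_time:
            keep.append(event)                      # still deliverable within its lifetime
        else:
            drop.append(event)                      # would expire before delivery
    return keep, drop

now = time.time()
queue = [{"id": 1, "created": now - 300, "survival_time": 600},
         {"id": 2, "created": now - 550, "survival_time": 600}]
keep, drop = manage_queue(queue, now, expected_delivery_time=120)
print([e["id"] for e in keep], [e["id"] for e in drop])   # [1] [2]
```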

  1. wayGoo recommender system: personalized recommendations for events scheduling, based on static and real-time information

    Science.gov (United States)

    Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.

    2016-05-01

    wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events does not reach users' attention, either because of the size of this information or because some information comes from real-time data sources. The purpose of this work is to facilitate event management operations by prioritizing the presented events based on users' interests, using both static and real-time data. Through the wayGoo interface, users select conceptual topics that are interesting to them. These topics constitute a browsing behavior vector which is used for learning users' interests implicitly, without being intrusive. Then, the system estimates user preferences and returns an event list sorted from the most preferred to the least preferred. User preferences are modeled via a Naïve Bayesian Network which consists of: a) the 'decision' random variable, corresponding to users' decision on attending an event; b) the 'distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging; c) the 'seat availability' random variable, modeled by a linear regression, which estimates the probability that the seat availability is encouraging; and d) the 'relevance' random variable, modeled by clustering-based collaborative filtering, which determines the relevance of each event to users' interests. Finally, experimental results show that the proposed system contributes substantially to assisting users in browsing and selecting events to attend.
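    A minimal sketch of the preference combination described above, assuming naive independence of the factors: the linear-regression terms for distance and seat availability and the collaborative-filtering relevance are replaced by simple stand-ins, so the functions and values below are illustrative only.

```python
# Sketch of a naive-Bayes-style ranking of events from per-factor probabilities.
import math

def p_distance_ok(distance_km, scale=2.0):
    """Stand-in for the distance term: closer events score higher."""
    return math.exp(-distance_km / scale)

def p_seats_ok(free_seats, capacity):
    """Stand-in for the seat-availability term."""
    return free_seats / capacity if capacity else 0.0

def score(event, interest_relevance):
    """Naive-Bayes-style combination: product of the factor probabilities."""
    return (p_distance_ok(event["distance_km"])
            * p_seats_ok(event["free_seats"], event["capacity"])
            * interest_relevance)

events = [
    {"name": "jazz concert", "distance_km": 0.8, "free_seats": 40, "capacity": 200},
    {"name": "tech meetup", "distance_km": 3.5, "free_seats": 90, "capacity": 100},
]
relevance = {"jazz concert": 0.3, "tech meetup": 0.9}   # stand-in for collaborative filtering
ranked = sorted(events, key=lambda e: score(e, relevance[e["name"]]), reverse=True)
print([e["name"] for e in ranked])
```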

  2. Parachuting from fixed objects: descriptive study of 106 fatal events in BASE jumping 1981-2006.

    Science.gov (United States)

    Westman, A; Rosén, M; Berggren, P; Björnstig, U

    2008-06-01

    To analyse the characteristics of fatal incidents in fixed object sport parachuting (building, antenna, span, earth (BASE) jumping) and create a basis for prevention. Descriptive epidemiological study. Data on reported fatal injury events (n = 106) worldwide in 1981-2006 retrieved from the BASE fatality list. Human, equipment and environmental factors. Identification of typical fatal incident and injury mechanisms for each of the four fixed object types of BASE jumping (building, antenna, span, earth). Human factors included parachutist free fall instability (loss of body control before parachute deployment), free fall acrobatics and deployment failure by the parachutist. Equipment factors included pilot chute malfunction and parachute malfunction. In cliff jumping (BASE object type E), parachute opening towards the object jumped was the most frequent equipment factor. Environmental factors included poor visibility, strong or turbulent winds, cold and water. The overall annual fatality risk for all object types during the year 2002 was estimated at about one fatality per 60 participants. Participants in BASE jumping should target risk factors with training and technical interventions. The mechanisms described in this study should be used by rescue units to improve the management of incidents.

  3. Analysis of adverse events of renal impairment related to platinum-based compounds using the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Naganuma, Misa; Motooka, Yumi; Sasaoka, Sayaka; Hatahira, Haruna; Hasegawa, Shiori; Fukuda, Akiho; Nakao, Satoshi; Shimada, Kazuyo; Hirade, Koseki; Mori, Takayuki; Yoshimura, Tomoaki; Kato, Takeshi; Nakamura, Mitsuhiro

    2018-01-01

    Platinum compounds cause several adverse events, such as nephrotoxicity, gastrointestinal toxicity, myelosuppression, ototoxicity, and neurotoxicity. We evaluated the incidence of renal impairment as an adverse event related to the administration of platinum compounds using the Japanese Adverse Drug Event Report database. We analyzed adverse events associated with the use of platinum compounds reported from April 2004 to November 2016. The reporting odds ratio with its 95% confidence interval was used to detect the signal for renal impairment. We evaluated the time-to-onset profile of renal impairment, assessed the hazard type using the Weibull shape parameter, and applied the association rule mining technique to discover undetected relationships such as possible risk factors. In total, 430,587 reports in the Japanese Adverse Drug Event Report database were analyzed. The reporting odds ratios (95% confidence interval) for renal impairment resulting from the use of cisplatin, oxaliplatin, carboplatin, and nedaplatin were 2.7 (2.5-3.0), 0.6 (0.5-0.7), 0.8 (0.7-1.0), and 1.3 (0.8-2.1), respectively. The lower limit of the reporting odds ratio (95% confidence interval) for cisplatin was >1. The median (lower-upper quartile) onset time of renal impairment following the use of platinum-based compounds was 6.0-8.0 days. The Weibull shape parameter β and 95% confidence interval upper limit of oxaliplatin were impairment during cisplatin use in real-world setting. The present findings demonstrate that the incidence of renal impairment following cisplatin use should be closely monitored when patients are hypertensive or diabetic, or when they are co-administered furosemide, loxoprofen, or pemetrexed. In addition, healthcare professionals should closely assess a patient's background prior to treatment.
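    The reporting odds ratio and its 95% confidence interval follow the standard disproportionality formulas, sketched below with illustrative counts (not the JADER figures): ROR = (a/b)/(c/d), with the confidence interval computed on the log scale.

```python
# Reporting odds ratio with 95% CI for a 2x2 disproportionality table (illustrative counts).
import math

def reporting_odds_ratio(a, b, c, d):
    """a: target event with the drug, b: other events with the drug,
       c: target event with all other drugs, d: other events with all other drugs."""
    ror = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - 1.96 * se_log)
    upper = math.exp(math.log(ror) + 1.96 * se_log)
    return ror, lower, upper

# A signal is usually flagged when the lower bound of the 95% CI exceeds 1.
print(reporting_odds_ratio(a=250, b=4750, c=8000, d=417000))
```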

  4. Toward zero waste: Composting and recycling for sustainable venue based events

    International Nuclear Information System (INIS)

    Hottle, Troy A.; Bilec, Melissa M.; Brown, Nicholas R.; Landis, Amy E.

    2015-01-01

    Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycle and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three game series. The findings demonstrate a tradeoff among CO₂ equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO₂ eq. emissions and energy use because of the retention of high value materials but are compounded by the difficulty in managing a two or three bin collection system. The compost only scenario achieves complete landfill diversion but does not perform as well with respect to CO₂ eq. emissions or energy. The three game series was used to test the impact of staffed bins on contamination rates; the first game served as a baseline, the second game employed staffed bins, and the third game had non staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series. Contamination rates were reduced from 34% in the first game to 11% on the second night (with the

  5. Toward zero waste: Composting and recycling for sustainable venue based events

    Energy Technology Data Exchange (ETDEWEB)

    Hottle, Troy A., E-mail: troy.hottle@asu.edu [Arizona State University, School of Sustainable Engineering and the Built Environment, 370 Interdisciplinary Science and Technology Building 4 (ISTB4), 781 East Terrace Road, Tempe, AZ 85287-6004 (United States); Bilec, Melissa M., E-mail: mbilec@pitt.edu [University of Pittsburgh, Civil and Environmental Engineering, 153 Benedum Hall, 3700 O’Hara Street, Pittsburgh, PA 15261-3949 (United States); Brown, Nicholas R., E-mail: nick.brown@asu.edu [Arizona State University, University Sustainability Practices, 1130 East University Drive, Suite 206, Tempe, AZ 85287 (United States); Landis, Amy E., E-mail: amy.landis@asu.edu [Arizona State University, School of Sustainable Engineering and the Built Environment, 375 Interdisciplinary Science and Technology Building 4 (ISTB4), 781 East Terrace Road, Tempe, AZ 85287-6004 (United States)

    2015-04-15

    Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycle and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three game series. The findings demonstrate a tradeoff among CO₂ equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO₂ eq. emissions and energy use because of the retention of high value materials but are compounded by the difficulty in managing a two or three bin collection system. The compost only scenario achieves complete landfill diversion but does not perform as well with respect to CO₂ eq. emissions or energy. The three game series was used to test the impact of staffed bins on contamination rates; the first game served as a baseline, the second game employed staffed bins, and the third game had non staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series. Contamination rates were reduced from 34% in the first game to 11% on the second night

  6. Area-based initiatives – engines of innovation in planning and policy?

    DEFF Research Database (Denmark)

    Larsen, Jacob Norvig; Agger, Annika

    Nevertheless, there is still considerable uncertainty as to the most important outcomes of place-based initiatives. Evaluations have mostly focussed on direct quantitative socio-economic indicators. These have often been quite insignificant, while other effects have been largely neglected. This paper proposes … and development in planning culture turns out to be a more substantial result than the reduction of social exclusion and economic deprivation. The paper analyses all available official evaluation studies of Danish place-based urban policy initiatives from mid-1990s through 2010. In addition to this, recent … studies of local planning culture change are discussed. Main findings are that during the past two decades a general change in planning culture has developed gradually, triggered by urban regeneration full scale experimentation with place-based approaches. Second, planners as well as public administrators …

  7. Using evidence-based leadership initiatives to create a healthy nursing work environment.

    Science.gov (United States)

    Nayback-Beebe, Ann M; Forsythe, Tanya; Funari, Tamara; Mayfield, Marie; Thoms, William; Smith, Kimberly K; Bradstreet, Harry; Scott, Pamela

    2013-01-01

    In an effort to create a healthy nursing work environment in a military hospital Intermediate Care Unit (IMCU), a facility-level Evidence Based Practice working group composed of nursing stakeholders brainstormed and piloted several unit-level evidence-based leadership initiatives to improve the IMCU nursing work environment. These initiatives were guided by the American Association of Critical Care Nurses Standards for Establishing and Sustaining Healthy Work Environments which encompass: (1) skilled communication, (2) true collaboration, (3) effective decision making, (4) appropriate staffing, (5) meaningful recognition, and (6) authentic leadership. Interim findings suggest implementation of these six evidence-based, relationship-centered principles, when combined with IMCU nurses' clinical expertise, management experience, and personal values and preferences, improved staff morale, decreased staff absenteeism, promoted a healthy nursing work environment, and improved patient care.

  8. Pharmacogenetics-based area-under-curve model can predict efficacy and adverse events from axitinib in individual patients with advanced renal cell carcinoma.

    Science.gov (United States)

    Yamamoto, Yoshiaki; Tsunedomi, Ryouichi; Fujita, Yusuke; Otori, Toru; Ohba, Mitsuyoshi; Kawai, Yoshihisa; Hirata, Hiroshi; Matsumoto, Hiroaki; Haginaka, Jun; Suzuki, Shigeo; Dahiya, Rajvir; Hamamoto, Yoshihiko; Matsuyama, Kenji; Hazama, Shoichi; Nagano, Hiroaki; Matsuyama, Hideyasu

    2018-03-30

    We investigated the relationship between axitinib pharmacogenetics and clinical efficacy/adverse events in advanced renal cell carcinoma (RCC) and established a model to predict clinical efficacy and adverse events using pharmacokinetics and gene polymorphisms related to drug metabolism and efflux in a phase II trial. We prospectively evaluated the area under the plasma concentration-time curve (AUC) of axitinib, objective response rate, and adverse events in 44 consecutive advanced RCC patients treated with axitinib. To establish a model for predicting clinical efficacy and adverse events, polymorphisms in genes including ABC transporters (ABCB1 and ABCG2), UGT1A, and OR2B11 were analyzed by whole-exome sequencing, Sanger sequencing, and DNA microarray. To validate this prediction model, AUC calculated from 6 gene polymorphisms was compared prospectively with actual AUC in 16 additional consecutive patients. Actual AUC significantly correlated with the objective response rate (P = 0.0002) and adverse events (hand-foot syndrome, P = 0.0055; and hypothyroidism, P = 0.0381). Calculated AUC significantly correlated with actual AUC, and calculated AUC before treatment precisely predicted actual AUC after axitinib treatment (P = 0.0066). Our pharmacogenetics-based AUC prediction model may determine the optimal initial dose of axitinib, and thus facilitate better treatment of patients with advanced RCC.

  9. Various sizes of sliding event bursts in the plastic flow of metallic glasses based on a spatiotemporal dynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Jingli, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn; Chen, Cun [School of Mathematics and Statistics, Zhengzhou University, Zhengzhou 450001 (China); Wang, Gang, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn [Laboratory for Microstructures, Shanghai University, Shanghai 200444 (China); Cheung, Wing-Sum [Department of Mathematics, The University of HongKong, HongKong (China); Sun, Baoan; Mattern, Norbert [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Siegmund, Stefan [Department of Mathematics, TU Dresden, D-01062 Dresden (Germany); Eckert, Jürgen [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Institute of Materials Science, TU Dresden, D-01062 Dresden (Germany)

    2014-07-21

    This paper presents a spatiotemporal dynamic model based on the interaction between multiple shear bands in the plastic flow of metallic glasses during compressive deformation. Various sizes of sliding events burst in the plastic deformation as the generation of different scales of shear branches occurred; microscopic creep events and delocalized sliding events were analyzed based on the established model. This paper discusses the spatially uniform solutions and traveling wave solution. The phase space of the spatially uniform system applied in this study reflected the chaotic state of the system at a lower strain rate. Moreover, numerical simulation showed that the microscopic creep events were manifested at a lower strain rate, whereas the delocalized sliding events were manifested at a higher strain rate.

  10. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

    Full Text Available Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as unavailability of distributed actuators, only individual sensors are usually located in a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed by mechanical devices controlled by relays, being desirable to reduce the number of commutations of the control signals from security and economical point of views. Therefore, and in order to face these drawbacks, this paper describes how the greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamics variables have to be controlled and control actions are mainly calculated against events produced by external disturbances. The proposed control system allows saving costs related with wear minimization and prolonging the actuator life, but keeping promising performance results. Analysis and conclusions are given by means of simulation results.
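
    The event-based strategy described above, where control actions are computed only when a monitored variable deviates sufficiently from its last reported value, is often realized as a send-on-delta rule. The sketch below is a minimal illustration of that triggering logic; the variable names and the 0.5-degree threshold are assumptions, not values from the paper.

```python
def send_on_delta(samples, delta=0.5):
    """Yield (index, value) only when the value has moved more than `delta`
    away from the last transmitted value (send-on-delta rule).

    This mimics an event-based sensor node: between events the controller
    keeps using the last transmitted measurement."""
    last_sent = None
    for i, value in enumerate(samples):
        if last_sent is None or abs(value - last_sent) > delta:
            last_sent = value
            yield i, value

# Simulated greenhouse temperature trace (illustrative values only):
temperatures = [21.0, 21.1, 21.2, 21.9, 22.6, 22.7, 22.6, 23.4]
events = list(send_on_delta(temperatures, delta=0.5))
print(events)   # only the samples that would trigger a transmission / control update
```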

  11. Excessive Heat Events and National Security: Building Resilience based on Early Warning Systems

    Science.gov (United States)

    Vintzileos, A.

    2017-12-01

    Excessive heat events (EHE) affect the security of nations in multiple direct and indirect ways. EHE are the leading cause of morbidity/mortality associated with atmospheric extremes. Higher energy consumption used for cooling can lead to blackouts and social disorder. EHE affect the food supply chain, reducing crop yield and increasing the probability of food contamination during delivery and storage. Distribution of goods during EHE can be severely disrupted due to mechanical failure of transportation equipment. EHE during athletic events, e.g., marathons, may result in a high number of casualties. Finally, EHE may also affect military planning by, e.g., reducing hours of exercise and altering combat gear. Early warning systems for EHE allow for building resilience. In this paper we first define an EHE as at least two consecutive heat days; a heat day is defined as a day with a maximum heat index whose probability of occurrence exceeds a certain threshold. We then use retrospective forecasts performed with a multitude of operational models and show that it is feasible to forecast EHE at forecast leads of week-2 and week-3 over the contiguous United States. We finally introduce an improved definition of EHE based on an intensity index and investigate the forecast skill of the predictive system in the tropics and subtropics.
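
    As a simple illustration of the event definition used here (an excessive heat event being at least two consecutive heat days), the sketch below scans a daily sequence of heat-day flags; the thresholding of the maximum heat index is assumed to have been done beforehand.

```python
def excessive_heat_events(heat_day_flags):
    """Return (start_index, length) for every run of >= 2 consecutive heat days.

    `heat_day_flags` is a boolean sequence, one entry per day, that is True
    when the day's maximum heat index exceeded the chosen probability threshold."""
    events, run_start = [], None
    for i, is_heat_day in enumerate(heat_day_flags):
        if is_heat_day and run_start is None:
            run_start = i
        elif not is_heat_day and run_start is not None:
            if i - run_start >= 2:
                events.append((run_start, i - run_start))
            run_start = None
    if run_start is not None and len(heat_day_flags) - run_start >= 2:
        events.append((run_start, len(heat_day_flags) - run_start))
    return events

# Illustrative week: days 2-4 form one excessive heat event.
print(excessive_heat_events([False, False, True, True, True, False, True]))
# -> [(2, 3)]
```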

  12. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), in which detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or switched off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on and turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
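
    The two-stage approach summarized above (edge detection followed by classification of frequency-domain features with a support vector machine) can be sketched with scikit-learn as follows; the FFT-magnitude features and the names training_windows, training_labels and new_event_window are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def frequency_features(current_window, n_harmonics=8):
    """Generic spectral features: magnitudes of the first few FFT bins of a
    current waveform captured around a detected switch-on/switch-off edge."""
    spectrum = np.abs(np.fft.rfft(current_window))
    return spectrum[1:n_harmonics + 1]

# X: one feature vector per detected switching event, y: appliance label.
# training_windows, training_labels and new_event_window are assumed to exist.
X = np.array([frequency_features(w) for w in training_windows])
y = np.array(training_labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X, y)
predicted_appliance = clf.predict([frequency_features(new_event_window)])
```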

  13. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Science.gov (United States)

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as unavailability of distributed actuators, only individual sensors are usually located in a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed by mechanical devices controlled by relays, being desirable to reduce the number of commutations of the control signals from security and economical point of views. Therefore, and in order to face these drawbacks, this paper describes how the greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamics variables have to be controlled and control actions are mainly calculated against events produced by external disturbances. The proposed control system allows saving costs related with wear minimization and prolonging the actuator life, but keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597

  14. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    Directory of Open Access Journals (Sweden)

    Xiaoqing Hao

    Full Text Available To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five sentiment orientations: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed a sentiment mode complex network of online public opinions (SMCOP) with modes as nodes and the chronological conversion relations between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys a power law. Most posts' sentiments are weakly positive or neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core modes, respectively. Few modes have large betweenness centrality values, and most modes convert to each other with these higher betweenness centrality modes as mediums. Therefore, relevant persons or institutions can take measures to guide people's sentiments regarding online hot events according to the sentiment diffusion mechanism.
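
    To make the construction concrete, the sketch below coarse-grains a chronological sequence of post-level sentiment labels into five-symbol modes and links consecutive modes as a directed, weighted network with networkx; the window length and the example sequence are illustrative assumptions.

```python
import networkx as nx

def build_mode_network(sentiment_labels, window=5):
    """Coarse-grain a chronological sequence of post sentiments
    (e.g. 'P', 'p', 'o', 'n', 'N') into modes of `window` consecutive
    labels, then connect each mode to the next one in time."""
    modes = ["".join(sentiment_labels[i:i + window])
             for i in range(0, len(sentiment_labels) - window + 1, window)]
    g = nx.DiGraph()
    for src, dst in zip(modes, modes[1:]):
        # accumulate edge weight for repeated mode transitions
        if g.has_edge(src, dst):
            g[src][dst]["weight"] += 1
        else:
            g.add_edge(src, dst, weight=1)
    return g

g = build_mode_network(list("pppppooooopppppnnnnnooooo"))
print(g.edges(data=True))
print(nx.betweenness_centrality(g))   # identifies mediator modes, as analysed in the paper
```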

  15. Management of investment-construction projects basing on the matrix of key events

    Directory of Open Access Journals (Sweden)

    Morozenko Andrey Aleksandrovich

    2016-11-01

    Full Text Available The article considers current problematic issues in the management of investment-construction projects and examines how the efficiency of construction operations can be increased by forming a reflex-adaptive organizational structure. The authors analyze the need to form a matrix of key events for the investment-construction project (ICP), which creates the optimal structure of the project based on the work program for its implementation. For convenience of representing project implementation programs over time, the authors recommend consolidating the works into separate, economically independent functional blocks. An algorithm for forming the matrix of an investment-construction project is proposed that takes into account the economic independence of the functional blocks and the stages of ICP implementation. The use of an extended network model is justified; it is supplemented by organizational and structural constraints at different stages of the project, highlighting key events that fundamentally influence the further course of ICP implementation.

  16. Opportunities for Web-based Drug Repositioning: Searching for Potential Antihypertensive Agents with Hypotension Adverse Events.

    Science.gov (United States)

    Wang, Kejian; Wan, Mei; Wang, Rui-Sheng; Weng, Zuquan

    2016-04-01

    Drug repositioning refers to the process of developing new indications for existing drugs. As a phenotypic indicator of drug response in humans, clinical side effects may provide straightforward signals and unique opportunities for drug repositioning. We aimed to identify drugs frequently associated with hypotension adverse reactions (ie, the opposite condition of hypertension), which could be potential candidates as antihypertensive agents. We systematically searched the electronic records of the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) through the openFDA platform to assess the association between hypotension incidence and antihypertensive therapeutic effect regarding a list of 683 drugs. Statistical analysis of FAERS data demonstrated that those drugs frequently co-occurring with hypotension events were more likely to have antihypertensive activity. Ranked by the statistical significance of frequent hypotension reporting, the well-known antihypertensive drugs were effectively distinguished from others (with an area under the receiver operating characteristic curve > 0.80 and a normalized discounted cumulative gain of 0.77). In addition, we found a series of antihypertensive agents (particularly drugs originally developed for treating nervous system diseases) among the drugs with top significant reporting, suggesting the good potential of Web-based and data-driven drug repositioning. We found several candidate agents among the hypotension-related drugs on our list that may be redirected for lowering blood pressure. More important, we showed that a pharmacovigilance system could alternatively be used to identify antihypertensive agents and sustainably create opportunities for drug repositioning.
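
    For illustration, report counts of the kind analyzed here can be retrieved from the public openFDA drug adverse event endpoint; the sketch below counts FAERS reports that mention a given drug together with a hypotension reaction. The endpoint and field names follow the openFDA documentation as generally published, and the example drug is an arbitrary placeholder rather than one of the study's candidates.

```python
import requests

OPENFDA_EVENTS = "https://api.fda.gov/drug/event.json"

def hypotension_report_count(drug_name):
    """Number of FAERS reports (via openFDA) that list `drug_name`
    together with a reaction term of 'HYPOTENSION'."""
    query = (f'patient.drug.medicinalproduct:"{drug_name}" '
             f'AND patient.reaction.reactionmeddrapt:"HYPOTENSION"')
    resp = requests.get(OPENFDA_EVENTS, params={"search": query, "limit": 1})
    resp.raise_for_status()
    return resp.json()["meta"]["results"]["total"]

print(hypotension_report_count("PRAZOSIN"))   # illustrative drug only
```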

  17. Remineralization of initial enamel caries in vitro using a novel peptide based on amelogenin

    Science.gov (United States)

    Li, Danxue; Lv, Xueping; Tu, Huanxin; Zhou, Xuedong; Yu, Haiyang; Zhang, Linglin

    2015-09-01

    Dental caries is the most common oral disease; it has a high incidence, is widespread, and can seriously affect the health of the oral cavity and the whole body. Current caries prevention measures, such as fluoride treatment, antimicrobial agents, and traditional Chinese herbal medicine, have limitations to some extent. Here we design and synthesize a novel peptide based on amelogenin and assess its ability to promote the remineralization of initial enamel caries lesions. We used enamel blocks to form initial lesions, which were then subjected to 12-day pH cycling in the presence of peptide, NaF, or HEPES buffer. Enamel treated with the peptide or NaF had shallower, narrower lesions, thicker remineralized surfaces and less mineral loss than enamel treated with HEPES. This peptide can promote the remineralization of initial enamel caries and inhibit the progression of caries. It is a promising anti-caries agent with various research prospects and practical application value.

  18. Least Squares Estimate of the Initial Phases in STFT based Speech Enhancement

    DEFF Research Database (Denmark)

    Nørholm, Sidsel Marie; Krawczyk-Becker, Martin; Gerkmann, Timo

    2015-01-01

    In this paper, we consider single-channel speech enhancement in the short time Fourier transform (STFT) domain. We suggest to improve an STFT phase estimate by estimating the initial phases. The method is based on the harmonic model and a model for the phase evolution over time. The initial phases are estimated by setting up a least squares problem between the noisy phase and the model for phase evolution. Simulations on synthetic and speech signals show a decreased error on the phase when an estimate of the initial phase is included compared to using the noisy phase as an initialisation. The error on the phase is decreased at input SNRs from -10 to 10 dB. Reconstructing the signal using the clean amplitude, the mean squared error is decreased and the PESQ score is increased.
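
    As a rough mathematical sketch of this idea (not the paper's exact formulation), assume the phase of the l-th harmonic advances linearly across STFT frames at a rate set by the fundamental frequency; the initial phase is then the least squares fit to the observed noisy phases:

```latex
% Assumed phase-evolution model for harmonic l at frame n (hop size H, fundamental \omega_0):
%   \phi_l(n) \approx \phi_l(0) + l\,\omega_0 H n
% Least squares estimate of the initial phase from the noisy STFT phases \psi_l(n):
\hat{\phi}_l(0) = \arg\min_{\phi_l(0)} \sum_n \left| \psi_l(n) - \phi_l(0) - l\,\omega_0 H n \right|^2
```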

  19. Bone metastasis pattern in initial metastatic breast cancer: a population-based study

    Directory of Open Access Journals (Sweden)

    Xiong Z

    2018-02-01

    Full Text Available Zhenchong Xiong,* Guangzheng Deng,* Xinjian Huang,* Xing Li, Xinhua Xie, Jin Wang, Zeyu Shuang, Xi Wang (Department of Breast Surgery, Sun Yat-sen University Cancer Center, Guangzhou, China; State Key Laboratory of Oncology in Southern China, Guangzhou, China; Collaborative Innovation Center for Cancer Medicine, Guangzhou, China; *these authors contributed equally to this work). Purpose: Bone is one of the most common sites of breast cancer metastasis, and population-based studies of patients with bone metastasis in initial metastatic breast cancer (MBC) are lacking. Materials and methods: From 2010 to 2013, 245,707 breast cancer patients and 8901 patients diagnosed with initial bone metastasis were identified from the Surveillance, Epidemiology, and End Results database of the National Cancer Institute. Multivariate logistic and Cox regression were used to identify predictive factors for the presence of bone metastasis and prognostic factors. The Kaplan–Meier method and log-rank test were used for survival analysis. Results: Eight thousand nine hundred one patients with initial MBC had bone involvement, accounting for 3.6% of the entire cohort and 62.5% of the patients with initial MBC. Also, 70.5% of patients with bone metastasis were hormone receptor (HR) positive (HR+/human epidermal growth factor receptor 2 [HER2]−: 57.6%; HR+/HER2+: 12.9%). Patients with initial bone metastasis had a better 5-year survival rate compared to those with initial brain, liver, or lung metastasis. HR+/HER2− and HR+/HER2+ breast cancer had a propensity for bone metastasis in the entire cohort and were correlated with better prognosis in patients with initial bone metastasis. Local surgery significantly improved overall survival in initial MBC patients with bone metastasis. Conclusion: Our study has provided population-based estimates of epidemiologic characteristics and prognosis in patients with bone metastasis at the time of

  20. Impacts of European drought events: insights from an international database of text-based reports

    Science.gov (United States)

    Stahl, Kerstin; Kohn, Irene; Blauhut, Veit; Urquijo, Julia; De Stefano, Lucia; Acácio, Vanda; Dias, Susana; Stagge, James H.; Tallaksen, Lena M.; Kampragou, Eleni; Van Loon, Anne F.; Barker, Lucy J.; Melsen, Lieke A.; Bifulco, Carlo; Musolino, Dario; de Carli, Alessandro; Massarutto, Antonio; Assimacopoulos, Dionysis; Van Lanen, Henny A. J.

    2016-03-01

    Drought is a natural hazard that can cause a wide range of impacts affecting the environment, society, and the economy. Providing an impact assessment and reducing vulnerability to these impacts for regions beyond the local scale, spanning political and sectoral boundaries, requires systematic and detailed data regarding impacts. This study presents an assessment of the diversity of drought impacts across Europe based on the European Drought Impact report Inventory (EDII), a unique research database that has collected close to 5000 impact reports from 33 European countries. The reported drought impacts were classified into major impact categories, each of which had a number of subtypes. The distribution of these categories and types was then analyzed over time, by country, across Europe and for particular drought events. The results show that impacts on agriculture and public water supply dominate the collection of drought impact reports for most countries and for all major drought events since the 1970s, while the number and relative fractions of reported impacts in other sectors can vary regionally and from event to event. The analysis also shows that reported impacts have increased over time as more media and website information has become available and environmental awareness has increased. Even though the distribution of impact categories is relatively consistent across Europe, the details of the reports show some differences. They confirm severe impacts in southern regions (particularly on agriculture and public water supply) and sector-specific impacts in central and northern regions (e.g., on forestry or energy production). The protocol developed thus enabled a new and more comprehensive view on drought impacts across Europe. Related studies have already developed statistical techniques to evaluate the link between drought indices and the categorized impacts using EDII data. The EDII is a living database and is a promising source for further research on

  1. Allowing Brief Delays in Responding Improves Event-Based Prospective Memory for Young Adults Living with HIV Disease

    OpenAIRE

    Loft, Shayne; Doyle, Katie L.; Naar-King, Sylvie; Outlaw, Angulique Y.; Nichols, Sharon L.; Weber, Erica; Blackstone, Kaitlin; Woods, Steven Paul

    2014-01-01

    Event-based prospective memory (PM) tasks require individuals to remember to perform an action when they encounter a specific cue in the environment, and have clear relevance for daily functioning for individuals with HIV. In many everyday tasks, the individual must not only maintain the intent to perform the PM task, but the PM task response also competes with the alternative and more habitual task response. The current study examined whether event-based PM can be improved by slowing down th...

  2. Eruptive event generator based on the Gibson-Low magnetic configuration

    Science.gov (United States)

    Borovikov, D.; Sokolov, I. V.; Manchester, W. B.; Jin, M.; Gombosi, T. I.

    2017-08-01

    Coronal mass ejections (CMEs), a kind of energetic solar eruption, are an integral subject of space weather research. Numerical magnetohydrodynamic (MHD) modeling, which requires powerful computational resources, is one of the primary means of studying the phenomenon. As such resources become increasingly accessible, the demand grows for user-friendly tools that facilitate the process of simulating CMEs for scientific and operational purposes. The Eruptive Event Generator based on the Gibson-Low flux rope (EEGGL), a new publicly available computational model presented in this paper, is an effort to meet this demand. EEGGL allows one to compute the parameters of a model flux rope driving a CME via an intuitive graphical user interface. We provide a brief overview of the physical principles behind EEGGL and its functionality. Ways toward future improvements of the tool are outlined.

  3. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and time resolution data obtained by the telescopes aboard Hinode revealed new, interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical flow estimation methods based on OpenCV, the computer vision library. We applied the methods to a prominence eruption observed by NoRH and a polar X-ray jet observed by XRT. As a result, the methods work well for solar images provided the images are optimized for them, indicating that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
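
    As an illustration of the kind of OpenCV routine involved, dense optical flow between two consecutive frames can be estimated with Farnebäck's method as below; the image file names are placeholders, and this is not necessarily the specific estimator used in the study.

```python
import cv2
import numpy as np

# Placeholder file names; any two consecutive co-aligned solar images would do.
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Dense optical flow (Farneback): one (dx, dy) displacement vector per pixel.
# Positional arguments: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)

speed = np.linalg.norm(flow, axis=2)   # apparent displacement in pixels per frame
print("maximum apparent speed (px/frame):", speed.max())
# Multiplying by the plate scale and dividing by the frame cadence gives a physical velocity.
```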

  4. Ontology-based knowledge management for personalized adverse drug events detection.

    Science.gov (United States)

    Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue

    2011-01-01

    Since adverse drug events (ADEs) have become a leading cause of death around the world, there is a high demand for helping clinicians and patients identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with a focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define an ADE ontology to uniformly manage ADE knowledge from multiple sources. We take advantage of the rich semantics of the SNOMED-CT terminology and apply it to ADE detection via semantic query and reasoning.

  5. An Initialization Method Based on Hybrid Distance for k-Means Algorithm.

    Science.gov (United States)

    Yang, Jie; Ma, Yan; Zhang, Xiangfen; Li, Shunbao; Zhang, Yuping

    2017-11-01

    The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the performance of this algorithm is highly dependent on the selection of initial cluster centers. Therefore, the method adopted for choosing initial cluster centers is extremely important. In this letter, we redefine the density of points according to the number of their neighbors, as well as the distance between points and their neighbors. In addition, we define a new distance measure that considers both Euclidean distance and density. Based on that, we propose an algorithm for selecting initial cluster centers that can dynamically adjust the weighting parameter. Furthermore, we propose a new internal clustering validation measure, the clustering validation index based on the neighbors (CVN), which can be exploited to select the optimal result among multiple clustering results. Experimental results show that the proposed algorithm outperforms existing initialization methods on real-world data sets and demonstrates the adaptability of the proposed algorithm to data sets with various characteristics.
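
    The following is a generic sketch of density- and distance-aware seeding in the spirit described above; it is not the authors' exact algorithm (their density definition and dynamically adjusted weighting differ), only an illustration of choosing seeds that are both locally dense and far from one another.

```python
import numpy as np

def density_distance_seeds(X, k, radius=1.0):
    """Pick k initial centers that are locally dense and mutually distant.

    Density of a point = number of points within `radius`.
    The first seed is the densest point; each further seed maximizes
    density * (distance to the nearest already-chosen seed)."""
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    density = (dists < radius).sum(axis=1)
    seeds = [int(np.argmax(density))]
    for _ in range(k - 1):
        d_to_seeds = dists[:, seeds].min(axis=1)
        score = density * d_to_seeds       # favors dense points far from current seeds
        seeds.append(int(np.argmax(score)))
    return X[seeds]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ([0, 0], [3, 3], [0, 3])])
print(density_distance_seeds(X, k=3))
```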

  6. A Simple Density with Distance Based Initial Seed Selection Technique for K Means Algorithm

    Directory of Open Access Journals (Sweden)

    Sajidha Syed Azimuddin

    2017-01-01

    Full Text Available Open issues with respect to K means algorithm are identifying the number of clusters, initial seed concept selection, clustering tendency, handling empty clusters, identifying outliers etc. In this paper we propose a novel and a simple technique considering both density and distance of the concepts in a dataset to identify initial seed concepts for clustering. Many authors have proposed different techniques to identify initial seed concepts; but our method ensures that the initial seed concepts are chosen from different clusters that are to be generated by the clustering solution. The hallmark of our algorithm is that it is a single pass algorithm that does not require any extra parameters to be estimated. Further, our seed concepts are one among the actual concepts and not the mean of representative concepts as is the case in many other algorithms. We have implemented our proposed algorithm and compared the results with the interval based technique of Fouad Khan. We see that our method outperforms the interval based method. We have also compared our method with the original random K means and K Means++ algorithms.

  7. A method of recovering the initial vectors of globally coupled map lattices based on symbolic dynamics

    International Nuclear Information System (INIS)

    Sun Li-Sha; Kang Xiao-Yun; Zhang Qiong; Lin Lan-Xin

    2011-01-01

    Based on symbolic dynamics, a novel computationally efficient algorithm is proposed to estimate the unknown initial vectors of globally coupled map lattices (CMLs). It is proved that not all inverse chaotic mapping functions are satisfied for contraction mapping. It is found that the values in phase space do not always converge on their initial values with respect to sufficient backward iteration of the symbolic vectors in terms of global convergence or divergence (CD). Both CD property and the coupling strength are directly related to the mapping function of the existing CML. Furthermore, the CD properties of Logistic, Bernoulli, and Tent chaotic mapping functions are investigated and compared. Various simulation results and the performances of the initial vector estimation with different signal-to-noise ratios (SNRs) are also provided to confirm the proposed algorithm. Finally, based on the spatiotemporal chaotic characteristics of the CML, the conditions of estimating the initial vectors using symbolic dynamics are discussed. The presented method provides both theoretical and experimental results for better understanding and characterizing the behaviours of spatiotemporal chaotic systems. (general)
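
    For context, a globally coupled map lattice evolves every site with a local chaotic map plus a coupling to the lattice mean field. The forward iteration below is a minimal textbook-style sketch (the logistic map, the coupling form and the parameter values are generic choices, not necessarily those of the paper); an estimation scheme such as the one described would try to recover the initial vector x0 from later states.

```python
import numpy as np

def logistic(x, mu=4.0):
    return mu * x * (1.0 - x)

def iterate_gcml(x0, steps, eps=0.1):
    """Globally coupled map lattice:
    x_{n+1}(i) = (1 - eps) * f(x_n(i)) + (eps / N) * sum_j f(x_n(j))."""
    x = np.asarray(x0, dtype=float)
    history = [x.copy()]
    for _ in range(steps):
        fx = logistic(x)
        x = (1.0 - eps) * fx + eps * fx.mean()
        history.append(x.copy())
    return np.array(history)

# An unknown initial vector that an estimation scheme would try to recover:
x0 = np.random.default_rng(1).uniform(0.1, 0.9, size=8)
trajectory = iterate_gcml(x0, steps=50)
print(trajectory[-1])
```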

  8. A method of recovering the initial vectors of globally coupled map lattices based on symbolic dynamics

    Science.gov (United States)

    Sun, Li-Sha; Kang, Xiao-Yun; Zhang, Qiong; Lin, Lan-Xin

    2011-12-01

    Based on symbolic dynamics, a novel computationally efficient algorithm is proposed to estimate the unknown initial vectors of globally coupled map lattices (CMLs). It is proved that not all inverse chaotic mapping functions are satisfied for contraction mapping. It is found that the values in phase space do not always converge on their initial values with respect to sufficient backward iteration of the symbolic vectors in terms of global convergence or divergence (CD). Both CD property and the coupling strength are directly related to the mapping function of the existing CML. Furthermore, the CD properties of Logistic, Bernoulli, and Tent chaotic mapping functions are investigated and compared. Various simulation results and the performances of the initial vector estimation with different signal-to-noise ratios (SNRs) are also provided to confirm the proposed algorithm. Finally, based on the spatiotemporal chaotic characteristics of the CML, the conditions of estimating the initial vectors using symbolic dynamics are discussed. The presented method provides both theoretical and experimental results for better understanding and characterizing the behaviours of spatiotemporal chaotic systems.

  9. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
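
    The factual versus counterfactual comparison described here is commonly summarized by the probability ratio and the fraction of attributable risk; as a brief reminder of these standard event attribution quantities:

```latex
% p_1: occurrence probability of the event in the factual world (with anthropogenic forcing)
% p_0: occurrence probability in the counterfactual world (without it)
\mathrm{PR} = \frac{p_1}{p_0}, \qquad \mathrm{FAR} = 1 - \frac{p_0}{p_1}
```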

  10. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach

    Directory of Open Access Journals (Sweden)

    Hongxia Zhang

    2017-11-01

    Full Text Available Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task that was embedded within an ongoing computer-based color-matching task. For this experiment, we separated the overall task’s duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. The filler task duration and ongoing task duration were further divided into three levels: 3, 6, and 9 min. Two factors were then orthogonally manipulated between-subjects using a multinomial processing tree model to separate the effects of different task durations on the two EBPM components. A mediation model was then created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the duration of ongoing tasks had a negative effect on EBPM performance while lengthening the duration of the filler task had no significant effect on it. (2) As the filler task was lengthened, both the prospective and retrospective components show a decreasing and then increasing trend. Also, when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between the task duration and EBPM performance was significant. We concluded that different task durations influence EBPM performance through different components with discrimination being the mediator between task duration and EBPM performance.

  11. Historical Chronology of ENSO Events Based Upon Documentary Data From South America: Strengths and Limitations

    Science.gov (United States)

    Luc, O.

    2007-05-01

    The first reconstructions of past El Niño occurrences were proposed by W. Quinn twenty years ago. They were based on documentary evidence of anomalous rainfall episodes, destructive floods and other possible impacts of El Niño conditions in Peru and other South-American countries. It has been shown, later, that the El Niño chronological sequence covering the last four and a half centuries produced by Quinn needed a thorough revision since many so-called EN events had not occurred while some others had been overlooked. Beside the classical methodological problems met in historical climatology studies (reliability of data, confidence in the sources, primary and secondary information), the reconstruction of former EN events faces specific difficulties dealing with the significance of the indicators and their spatial location. For instance, strong precipitation anomalies during summer in Southern Ecuador and northern Peru and precipitation excess recorded in the preceding winter in central Chile constitute quite reliable proxies of El Niño conditions, in modern times. However this observed teleconnection pattern, which is useful to reinforce the interpretation of past EN occurrences, seems to have been inoperative before the early nineteenth century. It is interpreted that atmospheric circulation features during the Little Ice Age interfered with the teleconnection system linking the EN impacts in northern Peru and central Chile. As a consequence, how should be evaluated the significance of documented winter precipitation excess in central Chile in years during which there is drought evidence in northern Peru, during the sixteenth to eighteenth century? And vice versa, are former evidences for precipitation excess in northern Peru (prior to the nineteenth century) quite reliable indicators for EN conditions, even if the preceding winter was dry in the Valparaiso-Santiago region? Other specific problems met in the building-up of a consolidated EN chronological

  12. Design of Bus Protocol Intelligent Initiation System Based On RS485

    Directory of Open Access Journals (Sweden)

    Li Liming

    2017-01-01

    Full Text Available In order to design an effective and reliable communication protocol based on the RS485 bus, this paper introduces the structure and transmission mode of the command frame and the response frame, and also introduces four control measures intended to ensure the communication quality of this system. The communication protocol is open, tolerant, reliable and fast, and can make ignition in the intelligent initiation system more reliable and accurate.
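
    The record does not specify the frame layout, so the following is a purely hypothetical sketch of how a command frame for a master-slave RS485 initiation protocol might be packed; the field names, sizes and XOR checksum are assumptions for illustration, not the paper's design.

```python
import struct

def build_command_frame(address, command, payload=b""):
    """Hypothetical command frame: 1-byte sync, 1-byte slave address,
    1-byte command code, 1-byte payload length, payload, 1-byte XOR checksum."""
    header = struct.pack("BBBB", 0xAA, address, command, len(payload))
    body = header + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

frame = build_command_frame(address=0x03, command=0x10, payload=b"\x01")
print(frame.hex(" "))
```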

  13. De-Virtualizing Social Events: Understanding the Gap between Online and Offline Participation for Event Invitations

    OpenAIRE

    Huang, Ai-Ju; Wang, Hao-Chuan; Yuan, Chien Wen

    2013-01-01

    One growing use of computer-based communication media is for gathering people to initiate or sustain social events. Although the use of computer-mediated communication and social network sites such as Facebook for event promotion is becoming popular, online participation in an event does not always translate to offline attendance. In this paper, we report on an interview study of 31 participants that examines how people handle online event invitations and what influences their online and offl...

  14. Swarm-Aurora: A web-based tool for quickly identifying multi-instrument auroral events

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Knudsen, D. J.; Frey, H. U.; Kauristie, K.; Partamies, N.; Jackel, B. J.; Gillies, M.; Holmdahl Olsen, P. E.

    2016-12-01

    In recent years there has been a dramatic increase in ground-based auroral imaging systems. These include the continent-wide THEMIS-ASI network, and imagers operated by other programs including GO-Canada, MIRACLE, AGO, OMTI, and more. In the near future, a new Canadian program called TREx will see the deployment of new narrow-band ASIs that will provide multi-wavelength imaging across Western Canada. At the same time, there is an unprecedented fleet of international spacecraft probing geospace at low and high altitudes. We are now in the position to simultaneously observe the magnetospheric drivers of aurora, observe in situ the waves, currents, and particles associated with MI coupling, and the conjugate aurora. Whereas a decade ago, a single magnetic conjunction between one ASI and a low altitude satellite was a relatively rare event, we now have a plethora of triple conjunctions between imagers, low-altitude spacecraft, and near-equatorial magnetospheric probes. But with these riches comes a new level of complexity. It is often difficult to identify the many useful conjunctions for a specific line of inquiry from the multitude of conjunctions where the geospace conditions are often not relevant and/or the imaging is compromised by clouds, moon, or other factors. Swarm-Aurora was designed to facilitate and drive the use of Swarm in situ measurements in auroral science. The project seeks to build a bridge between the Swarm science community, Swarm data, and the complimentary auroral data and community. Swarm-Aurora (http://swarm-aurora.phys.ucalgary.ca) incorporates a web-based tool which provides access to quick-look summary data for a large array of instruments, with Swarm in situ and ground-based ASI data as the primary focus. This web interface allows researchers to quickly and efficiently browse Swarm and ASI data to identify auroral events of interest to them. This allows researchers to be able to easily and quickly identify Swarm overflights of ASIs that

  15. iTRAQ-Based Quantitative Proteomic Analysis of the Initiation of Head Regeneration in Planarians.

    Directory of Open Access Journals (Sweden)

    Xiaofang Geng

    Full Text Available The planarian Dugesia japonica has an amazing ability to regenerate a head from the anterior end of the amputated stump while maintaining the original anterior-posterior polarity. Although planarians present an attractive system for molecular investigation of regeneration, and research has focused on clarifying the molecular mechanism of regeneration initiation in planarians at the transcriptional level, the initiation mechanism of planarian head regeneration (PHR) remains unclear at the protein level. Here, a global analysis of proteome dynamics during the early stage of PHR was performed using an isobaric tags for relative and absolute quantitation (iTRAQ)-based quantitative proteomics strategy, and our data are available via ProteomeXchange with identifier PXD002100. The results showed that 162 proteins were differentially expressed at 2 h and 6 h following amputation. Furthermore, the analysis of expression patterns and functional enrichment of the differentially expressed proteins showed that proteins involved in muscle contraction, oxidation reduction and protein synthesis were up-regulated in the initiation of PHR. Moreover, ingenuity pathway analysis showed that predominant signaling pathways such as ILK, calcium, EIF2 and mTOR signaling, which are associated with cell migration, cell proliferation and protein synthesis, were likely to be involved in the initiation of PHR. The results for the first time demonstrated that muscle contraction and ILK signaling might play important roles in the initiation of PHR at the global protein level. The findings of this research provide a molecular basis for further unraveling the mechanism of head regeneration initiation in planarians.

  16. Identification of the high pt jet events produced by a resolved photon at HERA and reconstruction of the initial state parton kinematics

    International Nuclear Information System (INIS)

    D'Agostini, G.; Monaldi, D.

    1992-01-01

    We have studied the possibility offered by the HERA detectors to identify the events where a proton interacts with a parton of the (quasi) real photon. We find that the presence of hadronic fragments of the photon outside of the beam pipe allows the identification of the two jet events produced by a resolved photon, with good efficiency and low background from the direct photon events. We show that it is also possible to reconstruct the fractional momenta of the two incoming partons. (orig.)

  17. Guidelines for time-to-event end-point definitions in trials for pancreatic cancer. Results of the DATECAN initiative (Definition for the Assessment of Time-to-event End-points in CANcer trials).

    Science.gov (United States)

    Bonnetain, Franck; Bonsing, Bert; Conroy, Thierry; Dousseau, Adelaide; Glimelius, Bengt; Haustermans, Karin; Lacaine, François; Van Laethem, Jean Luc; Aparicio, Thomas; Aust, Daniela; Bassi, Claudio; Berger, Virginie; Chamorey, Emmanuel; Chibaudel, Benoist; Dahan, Laeticia; De Gramont, Aimery; Delpero, Jean Robert; Dervenis, Christos; Ducreux, Michel; Gal, Jocelyn; Gerber, Erich; Ghaneh, Paula; Hammel, Pascal; Hendlisz, Alain; Jooste, Valérie; Labianca, Roberto; Latouche, Aurelien; Lutz, Manfred; Macarulla, Teresa; Malka, David; Mauer, Muriel; Mitry, Emmanuel; Neoptolemos, John; Pessaux, Patrick; Sauvanet, Alain; Tabernero, Josep; Taieb, Julien; van Tienhoven, Geertjan; Gourgou-Bourgade, Sophie; Bellera, Carine; Mathoulin-Pélissier, Simone; Collette, Laurence

    2014-11-01

    Using potential surrogate end-points for overall survival (OS) such as Disease-Free- (DFS) or Progression-Free Survival (PFS) is increasingly common in randomised controlled trials (RCTs). However, end-points are too often imprecisely defined which largely contributes to a lack of homogeneity across trials, hampering comparison between them. The aim of the DATECAN (Definition for the Assessment of Time-to-event End-points in CANcer trials)-Pancreas project is to provide guidelines for standardised definition of time-to-event end-points in RCTs for pancreatic cancer. Time-to-event end-points currently used were identified from a literature review of pancreatic RCT trials (2006-2009). Academic research groups were contacted for participation in order to select clinicians and methodologists to participate in the pilot and scoring groups (>30 experts). A consensus was built after 2 rounds of the modified Delphi formal consensus approach with the Rand scoring methodology (range: 1-9). For pancreatic cancer, 14 time to event end-points and 25 distinct event types applied to two settings (detectable disease and/or no detectable disease) were considered relevant and included in the questionnaire sent to 52 selected experts. Thirty experts answered both scoring rounds. A total of 204 events distributed over the 14 end-points were scored. After the first round, consensus was reached for 25 items; after the second consensus was reached for 156 items; and after the face-to-face meeting for 203 items. The formal consensus approach reached the elaboration of guidelines for standardised definitions of time-to-event end-points allowing cross-comparison of RCTs in pancreatic cancer. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experiment
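
    One of the biophysical quantities listed above, the Poisson distribution of ion hits over a specified cellular area, follows directly from the product of particle fluence and area; a minimal sketch with made-up numbers:

```python
import math

def poisson_hit_probabilities(fluence_per_um2, area_um2, k_max=5):
    """P(k hits) for a cell nucleus of cross-sectional area `area_um2`
    exposed to a particle fluence `fluence_per_um2` (particles / um^2),
    assuming hits are Poisson distributed with mean = fluence * area."""
    mean_hits = fluence_per_um2 * area_um2
    return [math.exp(-mean_hits) * mean_hits**k / math.factorial(k)
            for k in range(k_max + 1)]

# Illustrative values only: fluence 0.02 particles/um^2, nucleus area 100 um^2.
for k, p in enumerate(poisson_hit_probabilities(0.02, 100.0)):
    print(f"P({k} hits) = {p:.3f}")
```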