WorldWideScience

Sample records for cumulative trigger proposal

  1. SQL Triggers Reacting on Time Events: An Extension Proposal

    Science.gov (United States)

    Behrend, Andreas; Dorau, Christian; Manthey, Rainer

    Being able to activate triggers when a point in time is reached or after a time interval has elapsed has been acknowledged by many authors as a valuable functionality of a DBMS. Recently, interest in time-based triggers has been renewed in the context of data stream monitoring. However, up to now SQL triggers react to data changes only, even though research proposals and prototypes have supported several other event types, in particular time-based ones, for a long time. We therefore propose a seamless extension of the SQL trigger concept by time-based triggers, focussing on semantic issues arising from such an extension.
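
    A minimal sketch of the intended semantics, assuming nothing about the authors' proposed SQL syntax (all names below are illustrative): a trigger that fires once a timepoint is reached or a time interval has elapsed, emulated here with a small Python polling scheduler.

      # Hypothetical sketch, not the paper's SQL extension: time-based trigger
      # semantics (fire at a timepoint / after an elapsed interval) emulated
      # with a minimal polling scheduler.
      import time
      from dataclasses import dataclass, field
      from typing import Callable, List

      @dataclass
      class TimeTrigger:
          fire_at: float                 # absolute timepoint (epoch seconds)
          action: Callable[[], None]     # what the trigger executes
          fired: bool = False

      @dataclass
      class TriggerScheduler:
          triggers: List[TimeTrigger] = field(default_factory=list)

          def after_interval(self, seconds: float, action: Callable[[], None]) -> None:
              """Register a trigger that fires once the given interval has elapsed."""
              self.triggers.append(TimeTrigger(time.time() + seconds, action))

          def poll(self) -> None:
              """Evaluate all time conditions; fire triggers whose timepoint was reached."""
              now = time.time()
              for trig in self.triggers:
                  if not trig.fired and now >= trig.fire_at:
                      trig.action()
                      trig.fired = True

      if __name__ == "__main__":
          sched = TriggerScheduler()
          sched.after_interval(0.1, lambda: print("archive stale monitoring rows"))
          time.sleep(0.2)
          sched.poll()   # the interval has elapsed, so the action fires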

  2. Proposal for a level 0 calorimeter trigger system for LHCb

    CERN Document Server

    Bertin, A; Capponi, M; D'Antone, I; De Castro, S; Donà, R; Galli, D; Giacobbe, B; Marconi, U; Massa, I; Piccinini, M; Poli, M; Semprini-Cesari, N; Spighi, R; Vecchi, S; Villa, M; Vitale, A; Zoccoli, A; Zoccoli, Antonio

    1999-01-01

    In this note we present a complete system for the Level-0 LHCb calorimeter triggers. The system is derived from the electromagnetic calorimeter pre-trigger developed for the HERA-B experiment. The proposed system follows closely the Level-0 trigger algorithms presented in the LHCb Technical Proposal, based on an analysis of electromagnetic and hadronic showers performed on 3x3 calorimeter matrices. The general architecture presented is completely synchronous and flexible enough to allow adaptation to further improvements of the Level-0 trigger algorithms.

  3. A method proposal for cumulative environmental impact assessment based on the landscape vulnerability evaluation

    International Nuclear Information System (INIS)

    Pavlickova, Katarina; Vyskupova, Monika

    2015-01-01

    Cumulative environmental impact assessment is only occasionally used in practical environmental impact assessment (EIA) processes. The main reasons are the difficulty of cumulative impact identification caused by lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a method proposal to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, qualitative assessment of landscape ecological stability is conducted and major vulnerability indicators of environmental and socio-economic receptors are specified and valuated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrices while considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete EIA process conducted in Slovakia. The results obtained in this case study show that the methodology is simple to apply, valid for all types of impacts and projects, inexpensive and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and grading of impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process

  4. Proposal of a trigger tool to assess adverse events in dental care.

    Science.gov (United States)

    Corrêa, Claudia Dolores Trierweiler Sampaio de Oliveira; Mendes, Walter

    2017-11-21

    The aim of this study was to propose a trigger tool for research on adverse events in outpatient dentistry in Brazil. The tool was elaborated in two stages: (i) to build a preliminary set of triggers, a literature review was conducted to identify the composition of trigger tools used in other areas of health care and the principal adverse events found in dentistry; (ii) to validate the preliminarily constructed triggers, a panel of experts was organized using the modified Delphi method. Fourteen triggers were elaborated in a tool with explicit criteria to identify potential adverse events in dental care, essential for retrospective patient chart reviews. Studies on patient safety in dental care are still incipient when compared to other areas of health care. This study intended to contribute to research in this field. The contributions from the literature and the guidance of the expert panel allowed the elaboration of a set of triggers to detect adverse events in dental care, but additional studies are needed to test the instrument's validity.

  5. Proposed FPGA based tracking for a Level-1 track trigger at CMS for the HL-LHC

    CERN Document Server

    Pozzobon, Nicola

    2014-01-01

    The High Luminosity LHC (HL-LHC) is expected to deliver a luminosity in excess of $5\times10^{34}$ cm$^{-2}$/s. The high event rate places stringent requirements on the trigger system. A key component of the CMS upgrade for the HL-LHC is a track trigger system which will identify tracks with transverse momenta above 2 GeV already at the first-level trigger, within 5 $\mu$s. This presentation discusses a proposed track finding and fitting procedure based on the tracklet approach implemented on FPGAs. Tracklets are formed from pairs of hits in nearby layers of the detector and used in a road search. Summary: Fast pattern recognition in silicon trackers for triggering has often made use of Associative Memories for the pattern recognition step. We propose an alternative approach to solving the pattern recognition and track fitting problem for the upgraded CMS tracker for HL-LHC operation. We make use of the trigger primitives, stubs, from the tracker. The stubs are formed from pairs of hits in sensors separated r...

  6. Online Scheduling in Manufacturing A Cumulative Delay Approach

    CERN Document Server

    Suwa, Haruhiko

    2013-01-01

    Online scheduling is recognized as a crucial decision-making process of production control in the "being in production" phase, that is, after the shop floor schedule has been released. Online scheduling can also be considered one of the key enablers of prompt capable-to-promise and available-to-promise commitments to customers, along with reducing production lead times in today's globalized, competitive markets. Online Scheduling in Manufacturing introduces new approaches to online scheduling based on a concept of cumulative delay. The cumulative delay is regarded as consolidated information about uncertainties in a dynamic manufacturing environment and can be collected constantly, without much effort, at any point in time during schedule execution. In this approach, the cumulative delay of the schedule serves as the criterion for deciding whether or not a schedule revision is carried out. The cumulative delay approach to triggering schedule revisions has the following capabilities for the ...
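
    The decision rule described above can be sketched as follows; the job data and the revision threshold are invented for illustration and this is not the book's formulation.

      # Illustrative sketch: cumulative delay, collected as jobs complete, used
      # as the criterion for triggering a schedule revision.

      def cumulative_delay_monitor(planned_end, actual_end, revision_threshold):
          """Yield (job_index, cumulative_delay, revise?) after each completed job."""
          cum_delay = 0.0
          for i, (planned, actual) in enumerate(zip(planned_end, actual_end)):
              cum_delay += max(0.0, actual - planned)   # only lateness accumulates
              yield i, cum_delay, cum_delay > revision_threshold

      if __name__ == "__main__":
          planned = [10, 20, 30, 40]      # planned completion times
          actual = [11, 23, 36, 41]       # observed completion times
          for job, delay, revise in cumulative_delay_monitor(planned, actual, 8.0):
              print(f"job {job}: cumulative delay = {delay:.1f}, revise = {revise}")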

  7. The challenge of cumulative impacts

    Energy Technology Data Exchange (ETDEWEB)

    Masden, Elisabeth

    2011-07-01

    Full text: As governments pledge to combat climate change, wind turbines are becoming a common feature of terrestrial and marine environments. Although wind power is a renewable energy source and a means of reducing carbon emissions, there is a need to ensure that the wind farms themselves do not damage the environment. There is particular concern over the impacts of wind farms on bird populations, and with increasing numbers of wind farm proposals, the concern focuses on cumulative impacts. Individually, a wind farm, or indeed any activity/action, may have minor effects on the environment, but collectively these may be significant, potentially greater than the sum of the individual parts acting alone. Cumulative impact assessment is a legislative requirement of environmental impact assessment, but such assessments are rarely adequate, restricting the acquisition of basic knowledge about the cumulative impacts of wind farms on bird populations. Reasons for this are numerous, but a recurring theme is the lack of clear definitions and guidance on how to perform cumulative assessments. Here we present a conceptual framework and include illustrative examples to demonstrate how the framework can be used to improve the planning and execution of cumulative impact assessments. The core concept is that explicit definitions of impacts, actions and scales of assessment are required to reduce uncertainty in the process of assessment and improve communication between stakeholders. Only when it is clear what has been included within a cumulative assessment is it possible to make comparisons between developments. Our framework requires improved legislative guidance on the actions to include in assessments, and advice on the appropriate baselines against which to assess impacts. Cumulative impacts are currently considered on restricted scales (spatial and temporal) relating to individual development assessments. We propose that benefits would be gained from elevating cumulative

  8. TRIGGER

    CERN Multimedia

    W. Smith

    2012-01-01

      Level-1 Trigger The Level-1 Trigger group is ready to deploy improvements to the L1 Trigger algorithms for 2012. These include new high-PT patterns for the RPC endcap, an improved CSC PT assignment, a new PT-matching algorithm for the Global Muon Trigger, and new calibrations for ECAL, HCAL, and the Regional Calorimeter Trigger. These should improve the efficiency, rate, and stability of the L1 Trigger. The L1 Trigger group also is migrating the online systems to SLC5. To make the data transfer from the Global Calorimeter Trigger to the Global Trigger more reliable and also to allow checking the data integrity online, a new optical link system has been developed by the GCT and GT groups and successfully tested at the CMS electronics integration facility in building 904. This new system is now undergoing further tests at Point 5 before being deployed for data-taking this year. New L1 trigger menus have recently been studied and proposed by Emmanuelle Perez and the L1 Detector Performance Group...

  9. The LHCb trigger

    International Nuclear Information System (INIS)

    Korolko, I.

    1998-01-01

    This paper describes progress in the development of the LHCb trigger system since the letter of intent. The trigger philosophy has significantly changed, resulting in an increase of trigger efficiency for signal B events. It is proposed to implement a level-1 vertex topology trigger in specialised hardware. (orig.)

  10. Proposal of upgrade of the ATLAS muon trigger in the barrel-endcap transition region with RPCs

    CERN Document Server

    Massa, L; The ATLAS collaboration

    2014-01-01

    This report presents a project for the upgrade of the Level-1 muon trigger in the barrel-endcap transition region (1.0 < |$\eta$| < 1.3), where a large fraction of the trigger rate is due to fake triggers caused by charged particles originating from secondary interactions downstream of the interaction point. After the LHC upgrade foreseen for 2018, the Level-1 muon trigger rate would saturate the allocated bandwidth unless new measures are adopted to improve the rejection of fake triggers. ATLAS is going to improve the trigger selectivity in the region |$\eta$|>1.3 with the New Small Wheel detector upgrade. To obtain a similar trigger selectivity in the barrel-endcap transition region, it is proposed to add new RPC chambers at the edge of the inner layer of the barrel muon spectrometer. These chambers will be based on a three-layer structure with thinner gas gaps and electrodes with respect to the ATLAS standard and a new low-profile, light-weight mechanical structure that will allow installation in the limited available space. New front-end electronics, integrating fast TDC capabilities w...

  11. Probabilistic clustering of rainfall condition for landslide triggering

    Science.gov (United States)

    Rossi, Mauro; Luciani, Silvia; Cesare Mondini, Alessandro; Kirschbaum, Dalia; Valigi, Daniela; Guzzetti, Fausto

    2013-04-01

    Landslides are widespread natural and man-made phenomena. They are triggered by earthquakes, rapid snow melting and human activities, but mostly by typhoons and intense or prolonged rainfall. In Italy they are mostly triggered by intense precipitation. The prediction of landslides triggered by rainfall over large areas is commonly based on empirical models. Empirical landslide rainfall thresholds are used to identify rainfall conditions for possible landslide initiation. It is common practice to define rainfall thresholds by assuming a power-law lower boundary in the rainfall intensity-duration or cumulative rainfall-duration space above which landslides can occur. The boundary is defined considering rainfall conditions associated with landslide phenomena using heuristic approaches, and does not consider rainfall events that did not cause landslides. Here we present a new, fully automatic method to identify the probability of landslide occurrence associated with rainfall conditions characterized by measures of intensity or cumulative rainfall and rainfall duration. The method splits the past rainfall events into two groups, a group of events causing landslides and its complement, and then estimates their probability distributions. Next, the probabilistic membership of a new event in one of the two clusters is estimated. The method does not assume any threshold model a priori, but simply exploits the empirical distribution of rainfall events. The approach was applied in the Umbria region, Central Italy, where a catalogue of landslide timings was obtained through a search of chronicles, blogs and other sources of information for the period 2002-2012. The approach was tested using rain gauge measurements and satellite rainfall estimates (NASA TRMM-v6), allowing in both cases the identification of the rainfall conditions triggering landslides in the region. Compared to other existing threshold definition methods, the proposed
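
    The two-cluster idea can be sketched as follows, assuming Gaussian class models in (log duration, log cumulative rainfall) space and synthetic events in place of the Umbria catalogue.

      # Minimal sketch (synthetic data, Gaussian class models assumed): estimate
      # the distributions of rainfall events that did and did not trigger
      # landslides, then compute the probabilistic membership of a new event.
      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(0)

      # Synthetic past events: columns are log10(duration [h]), log10(cumulated rain [mm]).
      no_slide = rng.multivariate_normal([0.8, 1.2], [[0.10, 0.04], [0.04, 0.08]], 300)
      slide = rng.multivariate_normal([1.4, 2.0], [[0.08, 0.03], [0.03, 0.06]], 60)

      # Fit one multivariate normal per group and take empirical priors.
      pdf_no = multivariate_normal(no_slide.mean(0), np.cov(no_slide.T))
      pdf_yes = multivariate_normal(slide.mean(0), np.cov(slide.T))
      prior_yes = len(slide) / (len(slide) + len(no_slide))

      def landslide_probability(duration_h, cum_rain_mm):
          """Posterior probability that an event belongs to the triggering cluster."""
          x = np.log10([duration_h, cum_rain_mm])
          num = prior_yes * pdf_yes.pdf(x)
          den = num + (1 - prior_yes) * pdf_no.pdf(x)
          return num / den

      print(f"P(landslide | 36 h, 120 mm) = {landslide_probability(36, 120):.2f}")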

  12. About the cumulants of periodic signals

    Science.gov (United States)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies cumulants of time series. Although these functions originate from probability theory, they are commonly used as features of deterministic signals, and their classical properties are examined in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
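
    A small numerical illustration of the additivity point, with sample cumulants estimated from central moments (this is not the authors' code): for two harmonically related sinusoids the third cumulant of the sum is nonzero even though each signal's third cumulant vanishes, so cumulants do not simply add.

      # Sketch: first four cumulants of a sampled signal, written in terms of
      # central moments, and a check of (non-)additivity for periodic signals.
      import numpy as np

      def cumulants(x):
          """Return (k1, k2, k3, k4) estimated from the samples of x."""
          x = np.asarray(x, dtype=float)
          m = x.mean()
          c = x - m
          mu2, mu3, mu4 = (c**2).mean(), (c**3).mean(), (c**4).mean()
          return m, mu2, mu3, mu4 - 3.0 * mu2**2

      t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
      s1 = np.sin(2 * np.pi * 5 * t)
      s2 = np.sin(2 * np.pi * 10 * t + 0.3)   # second harmonic of s1, phase-shifted

      k_sum = np.array(cumulants(s1 + s2))
      k_add = np.array(cumulants(s1)) + np.array(cumulants(s2))
      print("cumulants of the sum:", np.round(k_sum, 4))
      print("sum of the cumulants:", np.round(k_add, 4))   # third entries differ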

  13. Cumulative cultural learning: Development and diversity

    Science.gov (United States)

    2017-01-01

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children’s learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission—the cornerstone of human cultural diversity. PMID:28739945

  14. Cumulative cultural learning: Development and diversity.

    Science.gov (United States)

    Legare, Cristine H

    2017-07-24

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children's learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission-the cornerstone of human cultural diversity.

  15. Evolutionary neural network modeling for software cumulative failure time prediction

    International Nuclear Information System (INIS)

    Tian Liang; Noore, Afzel

    2005-01-01

    An evolutionary neural network modeling approach for software cumulative failure time prediction based on a multiple-delayed-input single-output architecture is proposed. A genetic algorithm is used to globally optimize the number of delayed input neurons and the number of neurons in the hidden layer of the neural network architecture. A modification of the Levenberg-Marquardt algorithm with Bayesian regularization is used to improve the ability to predict software cumulative failure time. The performance of our proposed approach has been compared using real-time control and flight dynamic application data sets. Numerical results show that both the goodness-of-fit and the next-step-predictability of our proposed approach have greater accuracy in predicting software cumulative failure time compared to existing approaches.
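
    A rough sketch of the search loop, with synthetic failure data, a toy genetic algorithm, and scikit-learn's MLP standing in for the Levenberg-Marquardt training with Bayesian regularization used by the authors.

      # Rough sketch only: genetic search over (number of delayed inputs,
      # hidden neurons) for a multiple-delayed-input, single-output predictor
      # of cumulative failure time, on fake data.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)
      cum_fail_time = np.cumsum(rng.exponential(5.0, size=120))   # fake cumulative failure times

      def lagged_matrix(series, n_delays):
          """Rows: [t_{i-n_delays}, ..., t_{i-1}] -> target t_i."""
          X = np.array([series[i - n_delays:i] for i in range(n_delays, len(series))])
          return X, series[n_delays:]

      def fitness(n_delays, n_hidden, split=90):
          X, y = lagged_matrix(cum_fail_time, n_delays)
          cut = split - n_delays
          model = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=3000, random_state=0)
          model.fit(X[:cut], y[:cut])
          return -np.mean((model.predict(X[cut:]) - y[cut:]) ** 2)   # higher is better

      # Minimal GA: truncation selection plus mutation of the two integer genes.
      pop = [(int(rng.integers(1, 8)), int(rng.integers(2, 20))) for _ in range(8)]
      for gen in range(5):
          parents = sorted(pop, key=lambda g: fitness(*g), reverse=True)[:4]
          pop = parents + [(max(1, d + int(rng.integers(-2, 3))),
                            max(2, h + int(rng.integers(-4, 5)))) for d, h in parents]

      best = max(pop, key=lambda g: fitness(*g))
      print("selected (delayed inputs, hidden neurons):", best)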

  16. A Framework for Treating Cumulative Trauma with Art Therapy

    Science.gov (United States)

    Naff, Kristina

    2014-01-01

    Cumulative trauma is relatively undocumented in art therapy practice, although there is growing evidence that art therapy provides distinct benefits for resolving various traumas. This qualitative study proposes an art therapy treatment framework for cumulative trauma derived from semi-structured interviews with three art therapists and artistic…

  17. An Integrated Cumulative Transformation and Feature Fusion Approach for Bearing Degradation Prognostics

    Directory of Open Access Journals (Sweden)

    Lixiang Duan

    2018-01-01

    Full Text Available Aimed at the degradation prognostics of a rolling bearing, this paper proposes a novel cumulative transformation algorithm for data processing and a feature fusion technique for bearing degradation assessment. First, a cumulative transformation is presented to map the original features extracted from a vibration signal to their respective cumulative forms. The technique not only makes the extracted features show a monotonic trend but also reduces their fluctuation; such properties better reflect the bearing degradation trend. Then, a new degradation index system is constructed, which fuses multidimensional cumulative features by kernel principal component analysis (KPCA). Finally, an extreme learning machine model based on phase space reconstruction is proposed to predict the degradation trend. The model performance is experimentally validated with a whole-life experiment of a rolling bearing. The results prove that the proposed method reflects the bearing degradation process clearly and achieves a good balance between model accuracy and complexity.
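
    A schematic version of the pipeline, with random stand-in features; the paper's exact cumulative transformation and its extreme learning machine predictor are not reproduced.

      # Schematic sketch: map raw degradation features to running cumulative
      # sums to enforce a monotonic trend, then fuse them into a single health
      # index with kernel PCA.
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(2)
      n = 500                                            # inspection points over bearing life
      trend = np.linspace(0, 1, n) ** 2
      features = np.column_stack([trend + 0.3 * rng.standard_normal(n),   # e.g. RMS
                                  trend + 0.5 * rng.standard_normal(n),   # e.g. kurtosis
                                  trend + 0.4 * rng.standard_normal(n)])  # e.g. band energy

      # Cumulative form: running sum of the baseline-subtracted features.
      cum_features = np.cumsum(features - features[:50].mean(axis=0), axis=0)

      scaled = StandardScaler().fit_transform(cum_features)
      health_index = KernelPCA(n_components=1, kernel="rbf", gamma=0.1).fit_transform(scaled)

      print("fused degradation index, first and last values:",
            health_index[0, 0], health_index[-1, 0])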

  18. Event-triggered attitude control of spacecraft

    Science.gov (United States)

    Wu, Baolin; Shen, Qiang; Cao, Xibin

    2018-02-01

    The problem of spacecraft attitude stabilization with limited communication and external disturbances is investigated based on an event-triggered control scheme. In the proposed scheme, information on attitude and control torque only needs to be transmitted at discrete triggering times, when a defined measurement error exceeds a state-dependent threshold. The proposed control scheme not only guarantees that spacecraft attitude control errors converge toward a small invariant set containing the origin, but also ensures that there is no accumulation of triggering instants. The performance of the proposed control scheme is demonstrated through numerical simulation.
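
    The triggering rule can be illustrated on a scalar stand-in plant rather than the attitude dynamics of the paper; the small constant added to the threshold is what enforces a positive minimum time between events.

      # Minimal sketch of event-triggered feedback: the controller is only
      # updated when the measurement error since the last triggering instant
      # exceeds a state-dependent threshold.
      a, b, k = 0.5, 1.0, 2.0          # unstable plant x' = a*x + b*u, feedback u = -k*x
      sigma, eps = 0.3, 1e-3           # threshold: sigma*|x| + eps
      dt, steps = 0.001, 5000

      x, x_last = 1.0, 1.0             # state, and state at the last triggering instant
      events = 0
      for _ in range(steps):
          if abs(x - x_last) > sigma * abs(x) + eps:   # event condition
              x_last = x                               # transmit / update the controller
              events += 1
          u = -k * x_last                              # control held between events
          x += dt * (a * x + b * u)                    # Euler step of the plant

      print(f"final |x| = {abs(x):.4f}, control updates = {events} of {steps} steps")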

  19. A proposed DT-seeded Muon Track Trigger for the HL-LHC

    CERN Document Server

    CMS Collaboration

    2015-01-01

    The LHC program after the observation of the candidate SM Higgs boson will continue with collisions at 13 and 14 TeV, which will help clarify future subjects of study and shape the tools needed to carry them out. Any upgrade of the LHC experiments for unprecedented luminosities, such as those of the HL-LHC, must then maintain the acceptance for electroweak processes that can lead to a detailed study of the properties of the candidate Higgs boson. The acceptance of the key leptonic, photonic and hadronic trigger objects should be kept such that the overall physics acceptance, in particular for low-mass scale processes, remains the same as what the experiments achieved in 2012. In such a scenario, a new approach to early trigger implementation is needed. One of the major steps to be taken is the exploitation of high-granularity tracking sub-detectors, such as the CMS Silicon Tracker, in taking the early trigger decision. Their inclusion in the trigger chain can be crucial in several tasks, including the confirmat...

  20. Symptom-triggered benzodiazepine therapy for alcohol withdrawal syndrome in the emergency department: a comparison with the standard fixed dose benzodiazepine regimen.

    LENUS (Irish Health Repository)

    Cassidy, Eugene M

    2012-10-01

    The aim of the study was to compare symptom-triggered and standard benzodiazepine regimens for the treatment of alcohol withdrawal syndrome in an emergency department clinical decision unit. The authors found that the symptom-triggered approach reduced cumulative benzodiazepine dose and length of stay.

  1. The Trigger Processor and Trigger Processor Algorithms for the ATLAS New Small Wheel Upgrade

    CERN Document Server

    Lazovich, Tomo; The ATLAS collaboration

    2015-01-01

    The ATLAS New Small Wheel (NSW) is an upgrade to the ATLAS muon endcap detectors that will be installed during the next long shutdown of the LHC. Comprising both MicroMegas (MMs) and small-strip Thin Gap Chambers (sTGCs), this system will drastically improve the performance of the muon system in a high cavern background environment. The NSW trigger, in particular, will significantly reduce the rate of fake triggers coming from track segments in the endcap not originating from the interaction point. We will present an overview of the trigger, the proposed sTGC and MM trigger algorithms, and the hardware implementation of the trigger. In particular, we will discuss both the heart of the trigger, an ATCA system with FPGA-based trigger processors (using the same hardware platform for both MM and sTGC triggers), as well as the full trigger electronics chain, including dedicated cards for transmission of data via GBT optical links. Finally, we will detail the challenges of ensuring that the trigger electronics can ...

  2. Rockfall triggering by cyclic thermal stressing of exfoliation fractures

    Science.gov (United States)

    Collins, Brian D.; Stock, Greg M.

    2016-01-01

    Exfoliation of rock deteriorates cliffs through the formation and subsequent opening of fractures, which in turn can lead to potentially hazardous rockfalls. Although a number of mechanisms are known to trigger rockfalls, many rockfalls occur during periods when likely triggers such as precipitation, seismic activity and freezing conditions are absent. It has been suggested that these enigmatic rockfalls may occur due to solar heating of rock surfaces, which can cause outward expansion. Here we use data from 3.5 years of field monitoring of an exfoliating granite cliff in Yosemite National Park in California, USA, to assess the magnitude and temporal pattern of thermally induced rock deformation. From a thermodynamic analysis, we find that daily, seasonal and annual temperature variations are sufficient to drive cyclic and cumulative opening of fractures. Application of fracture theory suggests that these changes can lead to further fracture propagation and the consequent detachment of rock. Our data indicate that the warmest times of the day and year are particularly conducive to triggering rockfalls, and that cyclic thermal forcing may enhance the efficacy of other, more typical rockfall triggers.

  3. Evaluation of potential meteorological triggers of large landslides in sensitive glaciomarine clay, eastern Canada

    Directory of Open Access Journals (Sweden)

    D. Gauthier

    2012-11-01

    Full Text Available Heavy rains spread over some interval preceding large landslides in sensitive glaciomarine clay in eastern Canada are often noted as a triggering or causative factor in case studies or research reports for individual landslides, although the quantity or duration of the triggering rain event has never been characterized adequately. We selected five large landslide events that occurred in the glaciomarine clay in eastern Canada, and calculated cumulative antecedent precipitation for intervals ranging between one and 365 days preceding each event. We also calculated the antecedent precipitation values for every other day in the record, and computed the relative rank of the landslide day within the complete record. Our results show that several intervals for each landslide event are highly ranked – including those preceding a presumably earthquake-triggered landslide – but overall the rankings were highly variable, ranging between 99% and 6%. The set of highest-ranking intervals is unique for each event, including both short- and long-term cumulative precipitation. All of the landslides occurred in the spring months, and the release of sequestered surface and ground water during the spring ground thaw may be related to the timing of the large landslides, so that the evolution of ground frost in the early winter may be of interest for landslide prediction. We found no simple precipitation threshold for triggering large landslides in sensitive glaciomarine clay in eastern Canada, suggesting that some complex temporal and spatial combination of pre-conditions, external energy inputs (e.g. earthquakes), precipitation triggers and other factors such as ground frost formation and thaw is required to trigger a landslide.
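
    The antecedent-precipitation ranking can be sketched as follows, with synthetic daily rainfall in place of the station records used in the study.

      # Sketch: for a given interval length, compute the cumulative precipitation
      # ending on every day of the record and report the percentile rank of the
      # landslide day.
      import numpy as np

      rng = np.random.default_rng(3)
      daily_rain = rng.gamma(shape=0.4, scale=8.0, size=3650)   # ~10 years of daily totals [mm]
      landslide_day = 2500                                      # index of the event day

      def antecedent_rank(rain, event_day, interval):
          """Antecedent total for the event day and its percentile rank in the record."""
          csum = np.concatenate([[0.0], np.cumsum(rain)])
          totals = csum[interval:] - csum[:-interval]           # rolling sums over the interval
          event_total = totals[event_day - interval + 1]        # window ending on the event day
          return event_total, 100.0 * np.mean(totals <= event_total)

      for days in (1, 7, 30, 90, 365):
          total, rank = antecedent_rank(daily_rain, landslide_day, days)
          print(f"{days:>3}-day antecedent total = {total:7.1f} mm, rank = {rank:5.1f}%")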

  4. Cumulative impact assessments and bird/wind farm interactions: Developing a conceptual framework

    International Nuclear Information System (INIS)

    Masden, Elizabeth A.; Fox, Anthony D.; Furness, Robert W.; Bullman, Rhys; Haydon, Daniel T.

    2010-01-01

    The wind power industry has grown rapidly in the UK to meet EU targets of sourcing 20% of energy from renewable sources by 2020. Although wind power is a renewable energy source, there are environmental concerns over increasing numbers of wind farm proposals and associated cumulative impacts. Individually, a wind farm, or indeed any action, may have minor effects on the environment, but collectively these may be significant, potentially greater than the sum of the individual parts acting alone. EU and UK legislation requires a cumulative impact assessment (CIA) as part of Environmental Impact Assessments (EIA). However, in the absence of detailed guidance and definitions, such assessments within EIA are rarely adequate, restricting the acquisition of basic knowledge about the cumulative impacts of wind farms on bird populations. Here we propose a conceptual framework to promote transparency in CIA through the explicit definition of impacts, actions and scales within an assessment. Our framework requires improved legislative guidance on the actions to include in assessments, and advice on the appropriate baselines against which to assess impacts. Cumulative impacts are currently considered on restricted scales (spatial and temporal) relating to individual development EIAs. We propose that benefits would be gained from elevating CIA to a strategic level, as a component of spatially explicit planning.

  5. Turning stumbling blocks into stepping stones in the analysis of cumulative impacts

    Science.gov (United States)

    Leslie M. Reid

    2004-01-01

    Federal and state legislation, such as the National Environmental Policy Act and the California Environmental Quality Act, require that responsible agency staff consider the cumulative impacts of proposed activities before permits are issued for certain kinds of public or private projects. The Council on Environmental Quality (CEQ 1997) defined a cumulative impact as...

  6. LHC-B trigger and data acquisition progress report

    CERN Document Server

    Dijkstra, H; Harris, Frank

    1997-01-01

    97-05 This report describes the progress since the Letter of Intent [1] in the development of the trigger and data acquisition system for LHC-B. The basic philosophy has changed significantly, with the proposal to implement tracking and vertex topology triggers in specialised hardware. This will be at an additional trigger level, giving 4 levels in total. We present details of the new proposal, together with preliminary requirements estimates, and some simulation results.

  7. Proposal for a model to assess the effect of seismic activity on the triggering of debris flows

    Science.gov (United States)

    Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Luna, Byron Quan; Nadim, Farrokh

    2013-04-01

    Landslides triggered by earthquakes are a serious threat to many communities around the world, and in some cases are known to have caused 25-50% of earthquake fatalities. Seismic shaking can contribute to the triggering of debris flows either during the seismic event or indirectly by increasing the susceptibility of the slope to debris flow during intense rainfall in a period after the seismic event. The paper proposes a model to quantify both these effects. The model is based on an infinite slope formulation where precipitation and earthquakes influence the slope stability as follows: (1) During the shaking, the factor of safety is reduced due to cyclic pore pressure build-up, where the cyclic pore pressure is modelled as a function of earthquake duration and intensity (measured as number of equivalent shear stress cycles and cyclic shear stress magnitude) and in-situ soil conditions (measured as average normalised shear stress). The model is calibrated using cyclic triaxial and direct simple shear (DSS) test data on clay and sand. (2) After the shaking, the factor of safety is modified using a combined empirical and analytical model that links observed earthquake-induced changes in rainfall thresholds for triggering of debris flow to an equivalent reduction in soil shear strength. The empirical part uses data from past earthquakes to propose a conceptual model linking a site-specific reduction factor for the rainfall intensity threshold (needed to trigger debris flows) to earthquake magnitude, distance from the epicentre and time period after the earthquake. The analytical part is a hydrological model for transient rainfall infiltration into an infinite slope in order to translate the change in rainfall intensity threshold into an equivalent reduction in soil shear strength. This is generalised into a functional form giving a site-specific shear strength reduction factor as a function of earthquake history and soil conditions. The model is suitable for hazard and risk

  8. The STAR Level-3 trigger system

    International Nuclear Information System (INIS)

    Adler, C.; Berger, J.; Demello, M.; Dietel, T.; Flierl, D.; Landgraf, J.; Lange, J.S.; LeVine, M.J.; Ljubicic, A.; Nelson, J.; Roehrich, D.; Stock, R.; Struck, C.; Yepes, P.

    2003-01-01

    The STAR Level-3 trigger issues a trigger decision upon a complete online reconstruction of Au+Au collisions at Relativistic Heavy Ion Collider energies. Central interactions are processed up to a rate of 50 s⁻¹, including a simple analysis of physics observables. The setup of the processor farm and the event reconstruction, as well as experiences and the proposed trigger algorithms, are described.

  9. Genetic algorithm-based improved DOA estimation using fourth-order cumulants

    Science.gov (United States)

    Ahmed, Ammar; Tufail, Muhammad

    2017-05-01

    Genetic algorithm (GA)-based direction of arrival (DOA) estimation is proposed using fourth-order cumulants (FOC) and the ESPRIT principle, which results in the Multiple Invariance Cumulant ESPRIT algorithm. In the existing FOC ESPRIT formulations, only one invariance is utilised to estimate DOAs. The unused multiple invariances (MIs) must be exploited simultaneously in order to improve the estimation accuracy. In this paper, a fitness function based on a carefully designed cumulant matrix is developed which incorporates the MIs present in the sensor array. Better DOA estimation can be achieved by minimising this fitness function. Moreover, the effectiveness of Newton's method as well as GA for this optimisation problem has been illustrated. Simulation results show that the proposed algorithm provides improved estimation accuracy compared to existing algorithms, especially in the case of low SNR, a small number of snapshots, closely spaced sources and high signal and noise correlation. Moreover, it is observed that optimisation using Newton's method is more likely to converge to false local optima, resulting in erroneous results. However, GA-based optimisation has been found attractive due to its global optimisation capability.

  10. Structure functions and particle production in the cumulative region: two different exponentials

    International Nuclear Information System (INIS)

    Braun, M.; Vechernin, V.

    1997-01-01

    In the framework of the recently proposed QCD-based parton model for cumulative phenomena in interactions with nuclei, two mechanisms of particle production, the direct and the spectator one, are analyzed. It is shown that due to final-state interactions the leading terms of the direct-mechanism contribution are cancelled and the spectator mechanism is the dominant one. It leads to a smaller slope of the cumulative particle production rates compared to the slope of the nuclear structure function in the cumulative region x ≥ 1, in agreement with recent experimental data.

  11. Lessons from (triggered) tremor

    Science.gov (United States)

    Gomberg, Joan

    2010-01-01

    I test a “clock-advance” model that implies triggered tremor is ambient tremor that occurs at a sped-up rate as a result of loading from passing seismic waves. This proposed model predicts that triggering probability is proportional to the product of the ambient tremor rate and a function describing the efficacy of the triggering wave to initiate a tremor event. Using data mostly from Cascadia, I have compared qualitatively a suite of teleseismic waves that did and did not trigger tremor with ambient tremor rates. Many of the observations are consistent with the model if the efficacy of the triggering wave depends on wave amplitude. One triggered tremor observation clearly violates the clock-advance model. The model prediction that larger triggering waves result in larger triggered tremor signals also appears inconsistent with the measurements. I conclude that the tremor source process is a more complex system than that described by the clock-advance model predictions tested. Results of this and previous studies also demonstrate that (1) conditions suitable for tremor generation exist in many tectonic environments, but, within each, only occur at particular spots whose locations change with time; (2) any fluid flow must be restricted to less than a meter; (3) the degree to which delayed failure and secondary triggering occurs is likely insignificant; and 4) both shear and dilatational deformations may trigger tremor. Triggered and ambient tremor rates correlate more strongly with stress than stressing rate, suggesting tremor sources result from time-dependent weakening processes rather than simple Coulomb failure.

  12. Cumulative Poisson Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for χ² distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
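
    A Python sketch of the same calculation (CUMPOIS itself, written in C, is not reproduced here), including the chi-squared identity the abstract alludes to.

      # Cumulative Poisson distribution with terms built recursively to limit
      # under/overflow, checked against library references.
      import math
      from scipy.stats import chi2, poisson

      def cumulative_poisson(k, mu):
          """P(N <= k) for N ~ Poisson(mu)."""
          log_term = -mu                       # log of P(N = 0)
          total = math.exp(log_term)
          for n in range(1, k + 1):
              log_term += math.log(mu) - math.log(n)   # P(N = n) from P(N = n - 1)
              total += math.exp(log_term)
          return min(total, 1.0)

      k, mu = 12, 7.5
      print(cumulative_poisson(k, mu))                 # direct summation
      print(poisson.cdf(k, mu))                        # library reference
      print(chi2.sf(2 * mu, df=2 * (k + 1)))           # same value via the chi-squared cdf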

  13. Multiparty correlation measure based on the cumulant

    International Nuclear Information System (INIS)

    Zhou, D. L.; Zeng, B.; Xu, Z.; You, L.

    2006-01-01

    We propose a genuine multiparty correlation measure for a multiparty quantum system as the trace norm of the cumulant of the state. The legitimacy of our multiparty correlation measure is explicitly demonstrated by proving it satisfies the five basic conditions required for a correlation measure. As an application we construct an efficient algorithm for the calculation of our measures for all stabilizer states

  14. Analysis of sensory ratings data with cumulative link models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Brockhoff, Per B.

    2013-01-01

    Examples of categorical rating scales include discrete preference, liking and hedonic rating scales. Data obtained on these scales are often analyzed with normal linear regression methods or with omnibus Pearson χ² tests. In this paper we propose to use cumulative link models that allow for reg...

  15. Adaptive strategies for cumulative cultural learning.

    Science.gov (United States)

    Ehn, Micael; Laland, Kevin

    2012-05-21

    The demographic and ecological success of our species is frequently attributed to our capacity for cumulative culture. However, it is not yet known how humans combine social and asocial learning to generate effective strategies for learning in a cumulative cultural context. Here we explore how cumulative culture influences the relative merits of various pure and conditional learning strategies, including pure asocial and social learning, critical social learning, conditional social learning and individual refiner strategies. We replicate the Rogers' paradox in the cumulative setting. However, our analysis suggests that strategies that resolved Rogers' paradox in a non-cumulative setting may not necessarily evolve in a cumulative setting, thus different strategies will optimize cumulative and non-cumulative cultural learning. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Trigger tracking for the LHCb upgrade

    CERN Multimedia

    Dungs, K

    2014-01-01

    This poster presents a trigger system for the upgraded LHCb detector, scheduled to begin operation in 2020. The proposed trigger system is implemented entirely in software. We show that track reconstruction of a similar quality to that available in the offline algorithms can be performed on the full inelastic pp-collision rate. A track finding efficiency of 98.8% relative to offline can be achieved for good trigger tracks. The CPU time required for this reconstruction is less than 60% of the available budget.

  17. Review Document: Full Software Trigger

    CERN Document Server

    Albrecht, J; Raven, G

    2014-01-01

    This document presents a trigger system for the upgraded LHCb detector, scheduled to begin operation in 2020. This document serves as input for the internal review towards the "DAQ, online and trigger TDR". The proposed trigger system is implemented entirely in software. In this document we show that track reconstruction of a similar quality to that available in the offline algorithms can be performed on the full inelastic $pp$-collision rate, without prior event selections implemented in custom hardware and without relying upon a partial event reconstruction. A track finding efficiency of 98.8 % relative to offline can be achieved for tracks with $p_T >$ 500 MeV/$c$. The CPU time required for this reconstruction is about 40 % of the available budget. Proof-of-principle selections are presented which demonstrate that excellent performance is achievable using an inclusive beauty trigger, in addition to exclusive beauty and charm triggers. Finally, it is shown that exclusive beauty and charm selections that do not intr...

  18. Elaboration of a concept for the cumulative environmental exposure assessment of biocides

    Energy Technology Data Exchange (ETDEWEB)

    Gross, Rita; Bunke, Dirk; Moch, Katja [Oeko-Institut e.V. - Institut fuer Angewandte Oekologie e.V., Freiburg im Breisgau (Germany); Gartiser, Stefan [Hydrotox GmbH, Freiburg im Breisgau (Germany)

    2011-12-15

    Article 10(1) of the EU Biocidal Products Directive 98/8/EC (BPD) requires that, for the inclusion of an active substance in Annex I, Annex IA or IB, cumulation effects from the use of biocidal products containing the same active substance shall be taken into account, where relevant. The study proves the feasibility of a technical realisation of Article 10(1) of the BPD and elaborates a first concept for the cumulative environmental exposure assessment of biocides. Existing requirements concerning cumulative assessments in other regulatory frameworks have been evaluated and their applicability to biocides has been examined. Technical terms and definitions used in this context were documented with the aim of harmonising terminology with other frameworks and setting up a precise definition within the BPD. Furthermore, the application conditions of biocidal products have been analysed to find out for which of them cumulative exposure assessments may be relevant. Different parameters were identified which might serve as indicators of the relevance of cumulative exposure assessments. These indicators were then integrated in a flow chart by means of which the relevance of cumulative exposure assessments can be checked. Finally, proposals for the technical performance of cumulative exposure assessments within the Review Programme have been elaborated with the aim of bringing the results of the project into the upcoming development and harmonisation processes at EU level. (orig.)

  19. Triggering soft bombs at the LHC

    Science.gov (United States)

    Knapen, Simon; Griso, Simone Pagan; Papucci, Michele; Robinson, Dean J.

    2017-08-01

    Very high multiplicity, spherically-symmetric distributions of soft particles, with p_T ∼ few × 100 MeV, may be a signature of strongly-coupled hidden valleys that exhibit long, efficient showering windows. With traditional triggers, such `soft bomb' events closely resemble pile-up and are therefore only recorded with minimum bias triggers at a very low efficiency. We demonstrate a proof-of-concept for a high-level triggering strategy that efficiently separates soft bombs from pile-up by searching for a `belt of fire': a high density band of hits on the innermost layer of the tracker. Seeding our proposed high-level trigger with existing jet, missing transverse energy or lepton hardware-level triggers, we show that net trigger efficiencies of order 10% are possible for bombs of mass several × 100 GeV. We also consider the special case that soft bombs are the result of an exotic decay of the 125 GeV Higgs. The fiducial rate for `Higgs bombs' triggered in this manner is marginally higher than the rate achievable by triggering directly on a hard muon from associated Higgs production.

  20. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The hardware of the trigger components has been mostly finished. The ECAL Endcap Trigger Concentrator Cards (TCC) are in production while Barrel TCC firmware has been upgraded, and the Trigger Primitives can now be stored by the Data Concentrator Card for readout by the DAQ. The Regional Calorimeter Trigger (RCT) system is complete, and the timing is being finalized. All 502 HCAL trigger links to RCT run without error. The HCAL muon trigger timing has been equalized with DT, RPC, CSC and ECAL. The hardware and firmware for the Global Calorimeter Trigger (GCT) jet triggers are being commissioned and data from these triggers is available for readout. The GCT energy sums from rings of trigger towers around the beam pipe have been changed to include two rings from both sides. The firmware for Drift Tube Track Finder, Barrel Sorter and Wedge Sorter has been upgraded, and the synchronization of the DT trigger is satisfactory. The CSC local trigger has operated flawlessly u...

  1. Complete cumulative index (1963-1983)

    International Nuclear Information System (INIS)

    1983-01-01

    This complete cumulative index covers all regular and special issues and supplements published by Atomic Energy Review (AER) during its lifetime (1963-1983). The complete cumulative index consists of six Indexes: the Index of Abstracts, the Subject Index, the Title Index, the Author Index, the Country Index and the Table of Elements Index. The complete cumulative index supersedes the Cumulative Indexes for Volumes 1-7: 1963-1969 (1970), and for Volumes 1-10: 1963-1972 (1972); this Index also finalizes Atomic Energy Review, the publication of which has recently been terminated by the IAEA

  2. L1 track finding for a time multiplexed trigger

    Energy Technology Data Exchange (ETDEWEB)

    Cieri, D., E-mail: davide.cieri@bristol.ac.uk [University of Bristol, Bristol (United Kingdom); Rutherford Appleton Laboratory, Didcot (United Kingdom); Brooke, J.; Grimes, M. [University of Bristol, Bristol (United Kingdom); Newbold, D. [University of Bristol, Bristol (United Kingdom); Rutherford Appleton Laboratory, Didcot (United Kingdom); Harder, K.; Shepherd-Themistocleous, C.; Tomalin, I. [Rutherford Appleton Laboratory, Didcot (United Kingdom); Vichoudis, P. [CERN, Geneva (Switzerland); Reid, I. [Brunel University, London (United Kingdom); Iles, G.; Hall, G.; James, T.; Pesaresi, M.; Rose, A.; Tapper, A.; Uchida, K. [Imperial College, London (United Kingdom)

    2016-07-11

    At the HL-LHC, proton bunches will cross each other every 25 ns, producing an average of 140 pp-collisions per bunch crossing. To operate in such an environment, the CMS experiment will need an L1 hardware trigger able to identify interesting events within a latency of 12.5 μs. The future L1 trigger will also make use of data coming from the silicon tracker to control the trigger rate. The architecture that will be used in future to process tracker data is still under discussion. One interesting proposal makes use of the Time Multiplexed Trigger concept, already implemented in the CMS calorimeter trigger for the Phase I trigger upgrade. The proposed track finding algorithm is based on the Hough Transform method. The algorithm has been tested using simulated pp-collision data. Results show a very good tracking efficiency. The algorithm will be demonstrated in hardware in the coming months using the MP7, which is a μTCA board with a powerful FPGA capable of handling data rates approaching 1 Tb/s.
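
    A toy version of the Hough-transform step, with synthetic stubs and a small-angle bend approximation; this is an illustration of the technique, not the firmware algorithm demonstrated on the MP7.

      # Toy Hough-transform track finding: each stub (r, phi) votes, for every
      # candidate q/pT bin, for the phi0 that would make it track-consistent;
      # peaks in the (phi0, q/pT) accumulator are track candidates.
      import numpy as np

      B = 3.8                                   # tesla, CMS-like field strength
      def phi_at(r, phi0, q_over_pt):           # small-angle bend of the trajectory
          return phi0 - 0.5 * 0.003 * B * q_over_pt * r   # r in cm, pT in GeV

      layers = np.array([25., 35., 50., 68., 88., 108.])   # barrel layer radii [cm]
      tracks = [(0.10, +1 / 3.0), (0.50, -1 / 8.0)]        # (phi0 [rad], q/pT [1/GeV])
      stubs = [(r, phi_at(r, phi0, k)) for phi0, k in tracks for r in layers]

      k_bins = np.linspace(-0.5, 0.5, 32)        # covers |q/pT| <= 0.5, i.e. pT >= 2 GeV
      phi_bins = np.linspace(0.0, 0.7, 64)
      acc = np.zeros((len(phi_bins), len(k_bins)), dtype=int)
      for r, phi in stubs:
          for j, k in enumerate(k_bins):
              phi0 = phi + 0.5 * 0.003 * B * k * r        # invert the bend relation
              i = np.argmin(np.abs(phi_bins - phi0))
              acc[i, j] += 1

      i, j = np.unravel_index(np.argmax(acc), acc.shape)
      print(f"strongest candidate: phi0 ~ {phi_bins[i]:.2f} rad, "
            f"q/pT ~ {k_bins[j]:.3f} /GeV, {acc[i, j]} stubs")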

  3. L1 Track Finding for a Time Multiplexed Trigger

    CERN Document Server

    Cieri, D.; Grimes, M.; Newbold, D.; Harder, K.; Shepherd-Themistocleous, C.; Tomalin, I.; Vichoudis, P.; Reid, I.; Iles, G.; Hall, G.; James, T.; Pesaresi, M.; Rose, A.; Tapper, A.; Uchida, K.

    2016-01-01

    At the HL-LHC, proton bunches will cross each other every 25 ns, producing an average of 140 pp-collisions per bunch crossing. To operate in such an environment, the CMS experiment will need an L1 hardware trigger able to identify interesting events within a latency of 12.5 μs. The future L1 trigger will also make use of data coming from the silicon tracker to control the trigger rate. The architecture that will be used in future to process tracker data is still under discussion. One interesting proposal makes use of the Time Multiplexed Trigger concept, already implemented in the CMS calorimeter trigger for the Phase I trigger upgrade. The proposed track finding algorithm is based on the Hough Transform method. The algorithm has been tested using simulated pp-collision data. Results show a very good tracking efficiency. The algorithm will be demonstrated in hardware in the coming months using the MP7, which is a μTCA board with a powerful FPGA capable of handling data rates approaching 1 Tb/s.

  4. Trigger release mechanism for release of mine water to Magela Creek

    International Nuclear Information System (INIS)

    McQuade, C.V.; McGill, R.A.

    1988-01-01

    The Ranger Uranium Mine is surrounded by a World Heritage National Park. The strict environmental controls under which the mine operates are based on scientific and social requirements. Release of non-process storm runoff water to the Magela Creek during flood discharge and under controlled conditions has been identified as best practicable technology for the operation of the water management system. Social and political factors have limited this release to a wet season with an annual exceedance probability of one in ten. The first-generation trigger mechanism was based on a percentile analysis of monthly rainfall. The second-generation trigger is based on cumulative monthly volume increase in the retention ponds and is considered to be more applicable to the operation of the mine water management system. 6 figs., 2 tabs

  5. On the mechanism of hadron cumulative production on nucleus

    International Nuclear Information System (INIS)

    Efremov, A.V.

    1976-01-01

    A mechanism of cumulative production of hadrons on nuclei is proposed which is similar to that of high transverse-momentum hadron production. The cross section obtained describes the main qualitative features of such processes, e.g., the initial energy dependence, the atomic number behaviour, and the dependence on the rest mass of the produced particle and its production angle.

  6. Robust self-triggered MPC for constrained linear systems

    NARCIS (Netherlands)

    Brunner, F.D.; Heemels, W.P.M.H.; Allgöwer, F.

    2014-01-01

    In this paper we propose a robust self-triggered model predictive control algorithm for linear systems with additive bounded disturbances and hard constraints on the inputs and state. In self-triggered control, at every sampling instant the time until the next sampling instant is computed online

  7. Event-triggered decentralized adaptive fault-tolerant control of uncertain interconnected nonlinear systems with actuator failures.

    Science.gov (United States)

    Choi, Yun Ho; Yoo, Sung Jin

    2018-06-01

    This paper investigates the event-triggered decentralized adaptive tracking problem of a class of uncertain interconnected nonlinear systems with unexpected actuator failures. It is assumed that local control signals are transmitted to local actuators with time-varying faults whenever predefined conditions for triggering events are satisfied. Compared with the existing control-input-based event-triggering strategy for adaptive control of uncertain nonlinear systems, the aim of this paper is to propose a tracking-error-based event-triggering strategy in the decentralized adaptive fault-tolerant tracking framework. The proposed approach can relax drastic changes in control inputs caused by actuator faults in the existing triggering strategy. The stability of the proposed event-triggering control system is analyzed in the Lyapunov sense. Finally, simulation comparisons of the proposed and existing approaches are provided to show the effectiveness of the proposed theoretical result in the presence of actuator faults. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Cumulative carbon as a policy framework for achieving climate stabilization

    Science.gov (United States)

    Matthews, H. Damon; Solomon, Susan; Pierrehumbert, Raymond

    2012-01-01

    The primary objective of the United Nations Framework Convention on Climate Change is to stabilize greenhouse gas concentrations at a level that will avoid dangerous climate impacts. However, greenhouse gas concentration stabilization is an awkward framework within which to assess dangerous climate change on account of the significant lag between a given concentration level and the eventual equilibrium temperature change. By contrast, recent research has shown that global temperature change can be well described by a given cumulative carbon emissions budget. Here, we propose that cumulative carbon emissions represent an alternative framework that is applicable both as a tool for climate mitigation as well as for the assessment of potential climate impacts. We show first that both atmospheric CO2 concentration at a given year and the associated temperature change are generally associated with a unique cumulative carbon emissions budget that is largely independent of the emissions scenario. The rate of global temperature change can therefore be related to first order to the rate of increase of cumulative carbon emissions. However, transient warming over the next century will also be strongly affected by emissions of shorter lived forcing agents such as aerosols and methane. Non-CO2 emissions therefore contribute to uncertainty in the cumulative carbon budget associated with near-term temperature targets, and may suggest the need for a mitigation approach that considers separately short- and long-lived gas emissions. By contrast, long-term temperature change remains primarily associated with total cumulative carbon emissions owing to the much longer atmospheric residence time of CO2 relative to other major climate forcing agents. PMID:22869803
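
    A back-of-envelope illustration of the cumulative-emissions framing; the transient response value below is an assumed round number within the commonly quoted range, not a figure from the paper.

      # Illustration: to first order, warming tracks cumulative carbon emitted,
      # largely independently of the emission pathway (assumed TCRE value).
      TCRE = 1.6e-3   # assumed transient climate response, degC per GtC emitted

      def warming_from_emissions(annual_emissions_gtc):
          """Approximate temperature change implied by a list of annual emissions (GtC)."""
          return TCRE * sum(annual_emissions_gtc)

      fast_then_zero = [10.0] * 50 + [0.0] * 50    # 500 GtC emitted quickly
      slow_and_steady = [5.0] * 100                # the same 500 GtC, emitted slowly
      print(warming_from_emissions(fast_then_zero))    # ~0.8 degC
      print(warming_from_emissions(slow_and_steady))   # the same ~0.8 degC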

  9. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The trigger synchronization procedures for running with cosmic muons and operating with the LHC were reviewed during the May electronics week. Firmware maintenance issues were also reviewed. Link tests between the new ECAL endcap trigger concentrator cards (TCC48) and the Regional Calorimeter Trigger have been performed. Firmware for the energy sum triggers and an upgraded tau trigger of the Global Calorimeter Trigger has been developed and is under test. The optical fiber receiver boards for the Track-Finder trigger theta links of the DT chambers are now all installed. The RPC trigger is being made more robust by additional chamber and cable shielding and also by firmware upgrades. For the CSCs, the front-end and trigger motherboard firmware have been updated. New RPC patterns and DT/CSC lookup tables taking into account phi asymmetries in the magnetic field configuration are under study. The motherboard for the new pipeline synchronizer of the Global Trigg...

  10. Finite-volume cumulant expansion in QCD-colorless plasma

    Energy Technology Data Exchange (ETDEWEB)

    Ladrem, M. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); Physics Department, Algiers (Algeria); ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Ahmed, M.A.A. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Taiz University in Turba, Physics Department, Taiz (Yemen); Alfull, Z.Z. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); Cherif, S. [ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Ghardaia University, Sciences and Technologies Department, Ghardaia (Algeria)

    2015-09-15

    Due to finite-size effects, the localization of the phase transition in finite systems and the determination of its order become extremely difficult tasks, even in the simplest known cases. In order to identify and locate the finite-volume transition point T{sub 0}(V) of the QCD deconfinement phase transition to a colorless QGP, we have developed a new approach using the finite-size cumulant expansion of the order parameter and the L{sub mn}-method. The first six cumulants C{sub 1,2,3,4,5,6} with the corresponding under-normalized ratios (skewness Σ, kurtosis κ, pentosis Π{sub ±}, and hexosis H{sub 1,2,3}) and three unnormalized combinations of them, (O = σ{sup 2}κΣ{sup -1}, U = σ{sup -2}Σ{sup -1}, N = σ{sup 2}κ), are calculated and studied as functions of (T, V). A new approach, unifying the definitions of the cumulant ratios in a clear and consistent way, is proposed. A numerical FSS analysis of the obtained results has allowed us to locate the finite-volume transition point accurately. The extracted transition temperature T{sub 0}(V) agrees, to within about 2%, with the value T{sub 0}{sup N}(V) expected from the order parameter and the thermal susceptibility χ{sub T}(T, V) according to the standard localization procedure. In addition, a very good correlation factor is obtained, supporting the validity of our cumulant method. The agreement of our results with those obtained by means of other models is remarkable. (orig.)

  11. Cumulative effects of wind turbines. A guide to assessing the cumulative effects of wind energy development

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This guidance provides advice on how to assess the cumulative effects of wind energy developments in an area and is aimed at developers, planners, and stakeholders interested in the development of wind energy in the UK. The principles of cumulative assessment, wind energy development in the UK, cumulative assessment of wind energy development, and best-practice conclusions are discussed. The identification and assessment of the cumulative effects is examined in terms of global environmental sustainability, local environmental quality and socio-economic activity. Supplementary guidance for assessing the principal cumulative effects on the landscape, on birds, and on the visual effect is provided. The consensus-building approach behind the preparation of this guidance is outlined in the annexes of the report.

  12. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The production of the trigger hardware is now basically finished, and in time for the turn-on of the LHC. The last boards produced are the Trigger Concentrator Cards for the ECAL Endcaps (TCC-EE). After the recent installation of the four EE Dees, the TCC-EE prototypes were used for their commissioning. Production boards are arriving and are being tested continuously, with the last ones expected in November. The Regional Calorimeter Trigger hardware is fully integrated after installation of the last EE cables. Pattern tests from the HCAL up to the GCT have been performed successfully. The HCAL triggers are fully operational, including the connection of the HCAL-outer and forward-HCAL (HO/HF) technical triggers to the Global Trigger. The HCAL Trigger and Readout (HTR) board firmware has been updated to permit recording of the tower “feature bit” in the data. The Global Calorimeter Trigger hardware is installed, but some firmware developments are still n...

  13. Empirical rainfall thresholds for the triggering of landslides in Asturias (NW Spain)

    Science.gov (United States)

    Valenzuela, Pablo; Luís Zêzere, José; José Domínguez-Cuesta, María; Mora García, Manuel Antonio

    2017-04-01

    Rainfall-triggered landslides are common and widespread phenomena in Asturias, a mountainous region in the NW of Spain where the climate is characterized by average annual precipitation and temperature values of 960 mm and 13.3°C, respectively. Different types of landslides (slides, flows and rockfalls) frequently occur during intense rainfall events, causing great economic losses, and sometimes human injuries or fatalities, every year. For this reason, their temporal forecast is of great interest. The main goal of the present research is the calculation of empirical rainfall thresholds for the triggering of landslides in the Asturian region, following the methodology described by Zêzere et al., 2015. For this purpose, data from 559 individual landslides collected from press archives during a period of eight hydrological years (October 2008-September 2016) and gathered within the BAPA landslide database (http://geol.uniovi.es/BAPA) were used. Precipitation data series of 37 years came from 6 weather stations representative of the main geographical and climatic conditions within the study area. The applied methodology includes: (i) the definition of landslide events, (ii) the reconstruction of the cumulative antecedent rainfall for each event from 1 to 90 consecutive days, (iii) the estimation of the return period for each cumulated rainfall-duration condition using the Gumbel probability distribution, (iv) the definition of the critical cumulated rainfall-duration conditions taking into account the highest return period, and (v) the calculation of the thresholds considering both the conditions for the occurrence and non-occurrence of landslides. References: Zêzere, J.L., Vaz, T., Pereira, S., Oliveira, S.C., Marqués, R., García, R.A.C. 2015. Rainfall thresholds for landslide activity in Portugal: a state of the art. Environmental Earth Sciences, 73, 2917-2936. doi: 10.1007/s12665-014-3672-0
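
    Steps (ii) and (iii) of the methodology above can be sketched numerically. The snippet below accumulates antecedent rainfall over a chosen duration and estimates a Gumbel return period by a method-of-moments fit; the synthetic data, variable names and the method-of-moments fit are illustrative assumptions, not the authors' exact procedure.

        import numpy as np

        def antecedent_rainfall(daily_mm: np.ndarray, event_index: int, duration: int) -> float:
            """Cumulative rainfall (mm) over `duration` days ending on the event day."""
            start = max(0, event_index - duration + 1)
            return float(daily_mm[start:event_index + 1].sum())

        def gumbel_return_period(value: float, annual_maxima: np.ndarray) -> float:
            """Return period (years) of `value` from a method-of-moments Gumbel fit."""
            mean, std = annual_maxima.mean(), annual_maxima.std(ddof=1)
            beta = std * np.sqrt(6.0) / np.pi      # Gumbel scale parameter
            mu = mean - 0.5772 * beta              # Gumbel location (Euler-Mascheroni constant)
            cdf = np.exp(-np.exp(-(value - mu) / beta))
            return 1.0 / (1.0 - cdf)

        # Toy usage with synthetic daily rainfall (mm) and synthetic 30-day annual maxima
        rng = np.random.default_rng(0)
        daily = rng.gamma(shape=0.4, scale=10.0, size=365)
        maxima_30d = rng.gumbel(loc=120.0, scale=35.0, size=37)   # 37 years of data
        c30 = antecedent_rainfall(daily, event_index=200, duration=30)
        print(f"30-day antecedent rainfall: {c30:.1f} mm, "
              f"return period ~{gumbel_return_period(c30, maxima_30d):.1f} yr")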

  14. Determining rainfall thresholds that trigger landslides in Colombia

    International Nuclear Information System (INIS)

    Mayorga Marquez, Ruth

    2003-01-01

    Considering that rainfall is the natural event that most often triggers landslides, it is important to study the relationship between this phenomenon and the occurrence of earth mass movements by determining the rainfall thresholds that trigger landslides in different zones of Colombia. The research presents a methodology that makes it possible to propose rainfall thresholds for triggering landslides in Colombia, by means of a relationship between the rain accumulated in the soil (antecedent rainfall) and the rain that falls on the day of the landslide occurrence (event rainfall)

  15. Preference, resistance to change, and the cumulative decision model.

    Science.gov (United States)

    Grace, Randolph C

    2018-01-01

    According to behavioral momentum theory (Nevin & Grace, 2000a), preference in concurrent chains and resistance to change in multiple schedules are independent measures of a common construct representing reinforcement history. Here I review the original studies on preference and resistance to change in which reinforcement variables were manipulated parametrically, conducted by Nevin, Grace and colleagues between 1997 and 2002, as well as more recent research. The cumulative decision model proposed by Grace and colleagues for concurrent chains is shown to provide a good account of both preference and resistance to change, and is able to predict the increased sensitivity to reinforcer rate and magnitude observed with constant-duration components. Residuals from fits of the cumulative decision model to preference and resistance to change data were positively correlated, supporting the prediction of behavioral momentum theory. Although some questions remain, the learning process assumed by the cumulative decision model, in which outcomes are compared against a criterion that represents the average outcome value in the current context, may provide a plausible model for the acquisition of differential resistance to change. © 2018 Society for the Experimental Analysis of Behavior.

  16. TRIGGER

    CERN Multimedia

    by Wesley Smith

    2010-01-01

    Level-1 Trigger Hardware and Software The overall status of the L1 trigger has been excellent and the running efficiency has been high during physics fills. The timing is good to about 1%. The fine-tuning of the time synchronization of muon triggers is ongoing and will be completed after more than 10 nb-1 of data have been recorded. The CSC trigger primitive and RPC trigger timing have been refined. A new configuration for the CSC Track Finder featured modified beam halo cuts and improved ghost cancellation logic. More direct control was provided for the DT opto-receivers. New RPC Cosmic Trigger (RBC/TTU) trigger algorithms were enabled for collision runs. There is further work planned during the next technical stop to investigate a few of the links from the ECAL to the Regional Calorimeter Trigger (RCT). New firmware and a new configuration to handle trigger rate spikes in the ECAL barrel are also being tested. A board newly developed by the tracker group (ReTRI) has been installed and activated to block re...

  17. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviations in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
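
    As a schematic reading of the second (SIGMA-style) approach, the sketch below flags a cumulative rainfall value as extraordinary when it exceeds a chosen multiple of the standard deviation of the historical series for the same duration. Variable names, the mean-plus-k-sigma rule and the toy data are assumptions for illustration, not the calibrated SIGMA model.

        import numpy as np

        def sigma_threshold(historical_cumulative_mm: np.ndarray, k: float) -> float:
            """Threshold = mean + k * standard deviation of the historical series."""
            return float(historical_cumulative_mm.mean() + k * historical_cumulative_mm.std(ddof=1))

        def is_extraordinary(observed_mm: float, historical_cumulative_mm: np.ndarray, k: float = 2.0) -> bool:
            """True if the observed cumulative rainfall exceeds the k-sigma threshold."""
            return observed_mm > sigma_threshold(historical_cumulative_mm, k)

        # Toy usage: historical 3-day cumulative rainfall (mm) vs. a new observation
        history_3d = np.random.default_rng(1).gamma(shape=2.0, scale=15.0, size=500)
        print(is_extraordinary(observed_mm=140.0, historical_cumulative_mm=history_3d, k=2.0))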

  18. An experimental investigation of triggered film boiling destabilisation

    International Nuclear Information System (INIS)

    Naylor, P.

    1985-03-01

    Film boiling was established on a polished brass rod in water, with collapse initiated by either a pressure pulse or a transient bulk water flow. This work is relevant to the triggering stage of a molten fuel-coolant interaction, and a criterion is proposed for film boiling collapse triggered by a pressure pulse. (U.K.)

  19. Near-Field Source Localization Using a Special Cumulant Matrix

    Science.gov (United States)

    Cui, Han; Wei, Gang

    A new near-field source localization algorithm based on a uniform linear array is proposed. The algorithm estimates each parameter separately but does not require parameter pairing. It consists of two main steps. The first step estimates the bearing-related electric angle with the ESPRIT algorithm by constructing a special cumulant matrix. The second step estimates the other electric angle from the 1-D MUSIC spectrum. The method offers much lower computational complexity than the traditional near-field 2-D MUSIC algorithm and performs better than the high-order ESPRIT algorithm. Simulation results demonstrate that the performance of the proposed algorithm is close to the Cramer-Rao Bound (CRB).

  20. TRIGGER

    CERN Multimedia

    Roberta Arcidiacono

    2013-01-01

    Trigger Studies Group (TSG) The Trigger Studies Group has just concluded its third 2013 workshop, where all POGs presented the improvements to the physics object reconstruction, and all PAGs have shown their plans for Trigger development aimed at the 2015 High Level Trigger (HLT) menu. The Strategy for Trigger Evolution And Monitoring (STEAM) group is responsible for Trigger menu development, path timing, Trigger performance studies coordination, HLT offline DQM as well as HLT release, menu and conditions validation – this last task in collaboration with PdmV (Physics Data and Monte Carlo Validation group). In the last months the group has delivered several HLT rate estimates and comparisons, using the available data and Monte Carlo samples. The studies were presented at the Trigger workshops in September and December, and STEAM has contacted POGs and PAGs to understand the origin of the discrepancies observed between 8 TeV data and Monte Carlo simulations. The most recent results show what the...

  1. 32 CFR 651.16 - Cumulative impacts.

    Science.gov (United States)

    2010-07-01

    § 651.16 Cumulative impacts. (a) NEPA analyses must assess cumulative effects, which are the impact on the environment resulting from the incremental impact of the action when added to other past, present...

  2. Polarization in high P{sub trans} and cumulative hadron production

    International Nuclear Information System (INIS)

    Efremov, A.V.

    1978-01-01

    The final hadron polarization in high-P{sub trans} processes is analyzed in the parton hard-scattering picture. A scaling assumption allows a correct qualitative description to be given of the P{sub trans} behaviour of the polarization, and of the escape-angle behaviour in cumulative production. Energy scaling and a weak dependence on the beam and target type are predicted. A method is proposed for measuring the polarization of hadron jets

  3. Cumulative Clearness Index Frequency Distributions on the Territory of the Russian Federation

    Science.gov (United States)

    Frid, S. E.; Lisitskaya, N. V.; Popel, O. S.

    2018-02-01

    Cumulative distributions of clearness index values are constructed for the territory of Russia based on ground observation results and NASA POWER data. The obtained distributions lie close to each other, which means that the NASA POWER data can be used in simulations of solar power installations at temperate and high latitudes. An approximation of the obtained distributions is carried out. The values of the equation coefficients for the cumulative clearness index distributions constructed for a wide range of climatic conditions are determined. Equations proposed for a tropical climate are used in the calculations, so they can be regarded as universal ones.
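
    The cumulative distributions referred to above can be built directly from a series of clearness index values. The sketch below computes an empirical cumulative frequency distribution on a fixed grid; the synthetic data and the grid resolution are placeholders, not the actual processing of the NASA POWER or ground-station records.

        import numpy as np

        def cumulative_clearness_distribution(kt: np.ndarray, n_points: int = 50):
            """Empirical cumulative frequency F(kt <= x) evaluated on a grid of clearness-index values."""
            grid = np.linspace(0.0, 1.0, n_points)
            freq = np.array([(kt <= x).mean() for x in grid])
            return grid, freq

        # Toy usage with synthetic clearness indices in [0, 1]
        kt_values = np.clip(np.random.default_rng(2).beta(a=3.0, b=2.5, size=2000), 0.0, 1.0)
        x, f = cumulative_clearness_distribution(kt_values)
        print(f"F(kt <= 0.5) ~ {f[np.searchsorted(x, 0.5)]:.2f}")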

  4. Study of a Level-3 Tau Trigger with the Pixel Detector

    CERN Document Server

    Kotlinski, Danek; Nikitenko, Alexander

    2001-01-01

    We present a Monte Carlo study of the performance of a Level-3 Tau trigger based on the Pixel Detector data. The trigger is designed to select Higgs bosons decaying into two tau leptons with tau jet(s) in the final state. The proposed trigger is particularly useful as it operates at an early stage of the CMS High Level Trigger system. The performance of the trigger is studied for the most difficult case, the high-luminosity LHC scenario.

  5. The BTeV trigger system

    International Nuclear Information System (INIS)

    Kaplan, D.M.

    2000-01-01

    BTeV is a dedicated beauty and charm experiment proposed for the Fermilab Tevatron. The broad physics program envisaged for BTeV requires a trigger that is efficient for a wide variety of heavy-quark decays, including those to all-hadronic final states. To achieve this, we plan to trigger on evidence of detached vertices at the very first trigger level, taking advantage of fast-readout pixel detectors to facilitate fast pattern recognition. Simulations show that 100-to-1 rejection of light-quark background events can be achieved at Level 1 using specialized trackfinding hardware, and that an additional factor of 10-100 in data reduction can be achieved using general purpose processor farms at Levels 2 and 3. This is adequate to allow data taking at luminosities in excess of 2 × 10^32 cm^-2 s^-1

  6. TRIGGER

    CERN Multimedia

    W. Smith

    2010-01-01

    Level-1 Trigger Hardware and Software The Level-1 Trigger hardware has performed well during both the recent proton-proton and heavy ion running. Efforts were made to improve the visibility and handling of alarms and warnings. The tracker ReTRI boards that prevent fixed frequencies of Level-1 Triggers are now configured through the Trigger Supervisor. The Global Calorimeter Trigger (GCT) team has introduced a buffer cleanup procedure at stops and a reset of the QPLL during configuring to ensure recalibration in case of a switch from the LHC clock to the local clock. A device to test the cables between the Regional Calorimeter Trigger and the GCT has been manufactured. A wrong charge bit was fixed in the CSC Trigger. The ECAL group is improving crystal masking and spike suppression in the trigger primitives. New firmware for the Drift Tube Track Finder (DTTF) sorters was developed to improve fake track tagging and sorting. Zero suppression was implemented in the DT Sector Collector readout. The track finder b...

  7. TRIGGER

    CERN Multimedia

    Wesley Smith

    Trigger Hardware The status of the trigger components was presented during the September CMS Week and Annual Review and at the monthly trigger meetings in October and November. Procedures for cold and warm starts (e.g. refreshing of trigger parameters stored in registers) of the trigger subsystems have been studied. Reviews of parts of the Global Calorimeter Trigger (GCT) and the Global Trigger (GT) have taken place in October and November. The CERN group summarized the status of the Trigger Timing and Control (TTC) system. All TTC crates and boards are installed in the underground counting room, USC55. The central clock system will be upgraded in December (after the Global Run at the End of November, GREN) to the new RF2TTC LHC machine interface timing module. Migration of the subsystems' TTC PCs to SLC4/XDAQ 3.12 is being prepared. Work is ongoing to unify the access to Local Timing Control (LTC) and TTC CMS interface module (TTCci) via SOAP (Simple Object Access Protocol, a lightweight XML-based messaging ...

  8. Cluster observations and simulations of He+ EMIC triggered emissions

    Science.gov (United States)

    Grison, B.; Shoji, M.; Santolik, O.; Omura, Y.

    2012-12-01

    EMIC triggered emissions have been reported in the inner magnetosphere at the nightside edge of the plasmapause [Pickett et al., 2010]. The generation mechanism proposed by Omura et al. [2010] is very similar to that of whistler chorus emissions, and simulation results agree with observations and theory [Shoji and Omura, 2011]. The main characteristics of these emissions, generated in the magnetic equatorial plane region, are a frequency with time dispersion and a high level of coherence. The start frequency of the previously mentioned observations is above half of the proton gyrofrequency, which means that the emissions are generated on the proton branch. On the He+ branch, generation of triggered emissions in the same region requires more energetic protons, and the triggering process starts below the He+ gyrofrequency. This makes their identification in Cluster data rather difficult. Recent simulation results confirm the possibility of EMIC triggered emissions on the He+ branch. In the present contribution we compare a Cluster event to simulation results in order to investigate whether the observations can be identified as a He+ triggered emission. The impact of the observed waves on particle precipitation is also investigated.

  9. A study of the relationship between peak skin dose and cumulative air kerma in interventional neuroradiology and cardiology

    International Nuclear Information System (INIS)

    Neil, S; Padgham, C; Martin, C J

    2010-01-01

    A study of peak skin doses (PSDs) during neuroradiology and cardiology interventional procedures has been carried out using Gafchromic XR-RV2 film. Use of mosaics made from squares held in cling film has allowed doses to the head to be mapped successfully. The displayed cumulative air kerma (CAK) has been calibrated in terms of cumulative entrance surface dose (CESD) and results indicate that this can provide a reliable indicator of the PSD in neuroradiology. Results linking PSD to CESD for interventional cardiology were variable, but CAK is still considered to provide the best option for use as an indicator of potential radiation-induced effects. A CESD exceeding 3 Gy is considered a suitable action level for triggering follow-up of patients in neuroradiology and cardiology for possible skin effects. Application of dose action levels defined in this way would affect 8% of neurological embolisation procedures and 5% of cardiology ablation and multiple stent procedures at the hospitals where the investigations were carried out. A close relationship was observed between CESD and dose-area product (DAP) for particular types of procedure, and DAPs of 200-300 Gy cm^2 could be used as trigger levels where CAK readings were not available. The DAP value would depend on the mean field size and would need to be determined for each application.

  10. Performance of the ALICE PHOS trigger and improvements for RUN 2

    International Nuclear Information System (INIS)

    Zhao, C; Røed, K; Skaali, T B; Liu, L; Rohrich, D; Kharlov, Y; Bratrud, L; Alme, J

    2013-01-01

    This paper will discuss the performance of the PHOS level-0 trigger and planned improvements for RUN 2. Due to hardware constraints the Trigger Region Unit boards are limited to an operating frequency of 20 MHz. This has led to some ambiguity and biases of the trigger inputs. The trigger input generation scheme was therefore optimized to improve the performance. The PHOS level-0 trigger system has been working with an acceptable efficiency and purity. Proposed actions to further improve the performance and possibly eliminate the impact of the biased trigger inputs will also be presented

  11. Divergent Cumulative Cultural Evolution

    OpenAIRE

    Marriott, Chris; Chebib, Jobran

    2016-01-01

    Divergent cumulative cultural evolution occurs when the cultural evolutionary trajectory diverges from the biological evolutionary trajectory. We consider the conditions under which divergent cumulative cultural evolution can occur. We hypothesize that two conditions are necessary. First, genetic and cultural information must be stored separately in the agent. Second, cultural information must be transferred horizontally between agents of different generations. We implement a model with these ...

  12. Feasibility studies of a Level-1 Tracking Trigger for ATLAS

    CERN Document Server

    Warren, M; Brenner, R; Konstantinidis, N; Sutton, M

    2009-01-01

    The existing ATLAS Level-1 trigger system is seriously challenged at the SLHC's higher luminosity. A hardware tracking trigger might be needed, but requires a detailed understanding of the detector. Simulation of high pile-up events, with various data-reduction techniques applied, will be described. Two scenarios are envisaged: (a) regional readout - calorimeter and muon triggers are used to identify portions of the tracker; and (b) track-stub finding using special trigger layers. A proposed hardware system, including data reduction on the front-end ASICs, readout within a super-module and integrating regional triggering into all levels of the readout system, will be discussed.

  13. Cumulative organic anion transporter-mediated drug-drug interaction potential of multiple components in salvia miltiorrhiza (danshen) preparations.

    Science.gov (United States)

    Wang, Li; Venitz, Jürgen; Sweet, Douglas H

    2014-12-01

    To evaluate organic anion transporter-mediated drug-drug interaction (DDI) potential for individual active components of Danshen (Salvia miltiorrhiza) vs. combinations using in vitro and in silico approaches. Inhibition profiles for single Danshen components and combinations were generated in stably-expressing human (h)OAT1 and hOAT3 cells. Plasma concentration-time profiles for compounds were estimated from in vivo human data using an i.v. two-compartment model (with first-order elimination). The cumulative DDI index was proposed as an indicator of DDI potential for combination products. This index was used to evaluate the DDI potential for Danshen injectables from 16 different manufacturers and 14 different lots from a single manufacturer. The cumulative DDI index predicted in vivo inhibition potentials, 82% (hOAT1) and 74% (hOAT3), comparable with those observed in vitro, 72 ± 7% (hOAT1) and 81 ± 10% (hOAT3), for Danshen component combinations. Using simulated unbound Cmax values, a wide range in cumulative DDI index between manufacturers, and between lots, was predicted. Many products exhibited a cumulative DDI index > 1 (50% inhibition). Danshen injectables will likely exhibit strong potential to inhibit hOAT1 and hOAT3 function in vivo. The proposed cumulative DDI index might improve prediction of DDI potential of herbal medicines or pharmaceutical preparations containing multiple components.
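
    The abstract does not spell out the formula behind the cumulative DDI index, but transporter DDI potential is commonly expressed as the ratio of unbound Cmax to the in vitro IC50, which for a multi-component preparation could be summed over components. The sketch below uses that assumed definition purely for illustration; the component names, concentrations and IC50 values are invented.

        # Hypothetical cumulative DDI index as the sum of unbound Cmax / IC50 ratios
        # over components (an assumed definition, not necessarily the paper's).
        components = {
            # name: (unbound Cmax in uM, IC50 against the transporter in uM) -- invented values
            "component_A": (1.2, 4.0),
            "component_B": (0.6, 1.5),
            "component_C": (0.3, 9.0),
        }

        def cumulative_ddi_index(profile: dict) -> float:
            return sum(cmax_u / ic50 for cmax_u, ic50 in profile.values())

        index = cumulative_ddi_index(components)
        # Under a simple competitive-inhibition picture, an index of 1 corresponds
        # to roughly 50% inhibition, matching the abstract's reading of "> 1".
        print(f"cumulative DDI index = {index:.2f}")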

  14. Cumulative emission budgets and their implications: the case for SAFE carbon

    Science.gov (United States)

    Allen, Myles; Bowerman, Niel; Frame, David; Mason, Charles

    2010-05-01

    The risk of dangerous long-term climate change due to anthropogenic carbon dioxide emissions is predominantly determined by cumulative emissions over all time, not the rate of emission in any given year or commitment period. This has profound implications for climate mitigation policy: emission targets for specific years such as 2020 or 2050 provide no guarantee of meeting any overall cumulative emission budget. By focusing attention on short-term measures to reduce the flow of emissions, they may even exacerbate the overall long-term stock. Here we consider how climate policies might be designed explicitly to limit cumulative emissions to, for example, one trillion tonnes of carbon, a figure that has been estimated to give a most likely warming of two degrees above pre-industrial, with a likely range of 1.6-2.6 degrees. Three approaches are considered: tradable emission permits with the possibility of indefinite emission banking, carbon taxes explicitly linked to cumulative emissions and mandatory carbon sequestration. Framing mitigation policy around cumulative targets alleviates the apparent tension between climate protection and short-term consumption that bedevils any attempt to forge global agreement. We argue that the simplest and hence potentially the most effective approach might be a mandatory requirement on the fossil fuel industry to ensure that a steadily increasing fraction of fossil carbon extracted from the ground is artificially removed from the active carbon cycle through some form of sequestration. We define Sequestered Adequate Fraction of Extracted (SAFE) carbon as a source in which this sequestered fraction is anchored to cumulative emissions, increasing smoothly to reach 100% before we release the trillionth tonne. While adopting the use of SAFE carbon would increase the cost of fossil energy much as a system of emission permits or carbon taxes would, it could do so with much less explicit government intervention. We contrast this proposal
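
    A minimal numerical sketch of the SAFE-carbon idea: the fraction of extracted fossil carbon that must be sequestered rises with cumulative emissions and reaches 100% before a chosen budget (here the trillionth tonne) is released. The linear ramp and its starting point are assumptions for illustration, not the specific schedule proposed by the authors.

        BUDGET_GTC = 1000.0      # cumulative emission budget (one trillion tonnes of carbon)
        RAMP_START_GTC = 550.0   # assumed cumulative emissions at which sequestration begins

        def sequestered_fraction(cumulative_emitted_gtc: float) -> float:
            """Required sequestered fraction of extracted carbon (assumed linear ramp)."""
            if cumulative_emitted_gtc <= RAMP_START_GTC:
                return 0.0
            frac = (cumulative_emitted_gtc - RAMP_START_GTC) / (BUDGET_GTC - RAMP_START_GTC)
            return min(1.0, frac)

        for emitted in (500.0, 700.0, 900.0, 1000.0):
            print(f"after {emitted:6.0f} GtC emitted: sequester {sequestered_fraction(emitted):4.0%} of extraction")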

  15. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The final parts of the Level-1 trigger hardware are now being put in place. For the ECAL endcaps, more than half of the Trigger Concentrator Cards for the ECAL Endcap (TCC-EE) are now available at CERN, such that one complete endcap can be covered. The Global Trigger now correctly handles ECAL calibration sequences, without being influenced by backpressure. The Regional Calorimeter Trigger (RCT) hardware is complete and working in USC55. Intra-crate tests of all 18 RCT crates and the Global Calorimeter Trigger (GCT) are regularly taking place. Pattern tests have successfully captured data from HCAL through RCT to the GCT Source Cards. HB/HE trigger data are being compared with emulator results to track down the very few remaining hardware problems. The treatment of hot and dead cells, including their recording in the database, has been defined. For the GCT, excellent agreement between the emulator and data has been achieved for jets and HF ET sums. There is still som...

  16. TRIGGER

    CERN Multimedia

    W. Smith

    Level-1 Trigger Hardware and Software The trigger system has been constantly in use in cosmic and commissioning data taking periods. During CRAFT running it delivered 300 million muon and calorimeter triggers to CMS. It has performed stably and reliably. During the abort gaps it has also provided laser and other calibration triggers. Timing issues, namely synchronization and latency issues, have been solved. About half of the Trigger Concentrator Cards for the ECAL Endcap (TCC-EE) are installed, and the firmware is being worked on. The production of the other half has started. The HCAL Trigger and Readout (HTR) card firmware has been updated, and new features such as fast parallel zero-suppression have been included. Repairs of drift tube (DT) trigger mini-crates, optical links and receivers of sector collectors are under way and have been completed on YB0. New firmware for the optical receivers of the theta links to the drift tube track finder is being installed. In parallel, tests with new eta track finde...

  17. TRIGGER

    CERN Multimedia

    R. Carlin with contributions from D. Acosta

    2012-01-01

    Level-1 Trigger Data-taking continues at cruising speed, with high availability of all components of the Level-1 trigger. We have operated the trigger up to a luminosity of 7.6E33, where we approached 100 kHz using the 7E33 prescale column.  Recently, the pause without triggers in case of an automatic "RESYNC" signal (the "settle" and "recover" time) was reduced in order to minimise the overall dead-time. This may become very important when the LHC comes back with higher energy and luminosity after LS1. We are also preparing for data-taking in the proton-lead run in early 2013. The CASTOR detector will make its comeback into CMS and triggering capabilities are being prepared for this. Steps to be taken include improved cooperation with the TOTEM trigger system and using the LHC clock during the injection and ramp phases of LHC. Studies are being finalised that will have a bearing on the Trigger Technical Design Report (TDR), which is to be rea...

  18. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

    In recent years there has been a series of advances in the field of dynamic prediction. Among those is the development of methods for dynamic prediction of the cumulative incidence function in a competing risk setting. These models enable the predictions to be updated as time progresses and more information becomes available; for example, when a patient comes back for a follow-up visit after completing a year of treatment, the risk of death and of adverse events may have changed since treatment initiation. One approach to model the cumulative incidence function in competing risks is by direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and the performance is investigated in a simulation study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Cumulative risk (CR) models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that number of risk factors would predict number of adverse outcomes (cumulative outcome; CO). We also predicted that early CR (assessed at age 5/6) explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20). The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted aforementioned outcomes at age 25/26 in every instance. Furthermore, number of risk factors was positively associated with number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  20. Secant cumulants and toric geometry

    NARCIS (Netherlands)

    Michalek, M.; Oeding, L.; Zwiernik, P.W.

    2012-01-01

    We study the secant line variety of the Segre product of projective spaces using special cumulant coordinates adapted for secant varieties. We show that the secant variety is covered by open normal toric varieties. We prove that in cumulant coordinates its ideal is generated by binomial quadrics. We

  1. Study on the plasma generation characteristics of an induction-triggered coaxial pulsed plasma thruster

    Science.gov (United States)

    Weisheng, CUI; Wenzheng, LIU; Jia, TIAN; Xiuyang, CHEN

    2018-02-01

    At present, spark plugs are used to trigger discharge in pulsed plasma thrusters (PPT), which are known to be life-limiting components due to plasma corrosion and carbon deposition. A strong electric field could be formed in a cathode triple junction (CTJ) to achieve a trigger function under vacuum conditions. We propose an induction-triggered electrode structure on the basis of the CTJ trigger principle. The induction-triggered electrode structure could increase the electric field strength of the CTJ without changing the voltage between electrodes, contributing to a reduction in the electrode breakdown voltage. Additionally, it can maintain the plasma generation effect when the breakdown voltage is reduced in the discharge experiments. The induction-triggered electrode structure could ensure an effective trigger when the ablation distance of Teflon increases, and the magnetic field produced by the discharge current could further improve the plasma density and propagation velocity. The induction-triggered coaxial PPT we propose has a simplified trigger structure, and it is an effective attempt to optimize the micro-satellite thruster.

  2. Cumulative Environmental Management Association : Wood Buffalo Region

    International Nuclear Information System (INIS)

    Friesen, B.

    2001-01-01

    The recently announced oil sands development of the Wood Buffalo Region in Alberta was the focus of this PowerPoint presentation. Mining and in situ development together are expected to total $26 billion and 2.6 million barrels per day of bitumen production. This paper described the economic, social and environmental challenges facing the resource development of this region. In addition to the proposed oil sands projects, this region will accommodate the needs of conventional oil and gas production, forestry, building of pipelines and power lines, municipal development, recreation, tourism, mining exploration and open cast mining. The Cumulative Environmental Management Association (CEMA) was inaugurated as a non-profit association in April 2000, and includes 41 members from all sectors. Its major role is to ensure a sustainable ecosystem and to avoid any cumulative impacts on wildlife. Other work underway includes the study of soil and plant species diversity, and the effects of air emissions on human health, wildlife and vegetation. The bioaccumulation of heavy metals and their impacts on surface water and fish are also under consideration, to ensure the quality and quantity of surface water and ground water. 3 figs

  3. Analysis of cumulative exergy losses in the chains of technological processes

    International Nuclear Information System (INIS)

    Szargut, J.

    1989-01-01

    This paper reports on cumulative exergy consumption (CExC), which characterizes the chain of technological processes leading from natural resources to the final product under consideration. The difference between the CExC and the exergy of the material or energy carrier expresses the cumulative exergy loss (CExL) of that technological chain. Two methods for apportioning the CExL have been proposed. Partial exergy losses appear in particular links of the technological chain and characterize the influence of the irreversibility of these links. Constituent exergy losses express the influence of the thermodynamic imperfection of the constituent technological chains leading to the final link of the total chain. Analysis of the partial and constituent exergy losses indicates where the technological chains can be improved.
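
    A small worked example of the bookkeeping implied above, under invented numbers: cumulative exergy consumption is summed over the links of a technological chain, and the cumulative exergy loss is its difference from the exergy of the final product. The link names and values are placeholders for illustration only.

        # Hypothetical chain of technological links, each with the exergy of natural
        # resources consumed in that link (MJ per unit of final product). Values invented.
        chain = {
            "ore extraction": 3.5,
            "transport":      0.8,
            "smelting":      12.0,
            "rolling":        2.2,
        }
        exergy_of_product = 7.5   # MJ per unit of final product (assumed)

        cexc = sum(chain.values())         # cumulative exergy consumption (CExC)
        cexl = cexc - exergy_of_product    # cumulative exergy loss (CExL)

        print(f"CExC = {cexc:.1f} MJ, CExL = {cexl:.1f} MJ")
        for link, value in chain.items():
            # each link's share of the cumulative exergy consumption
            print(f"  {link:15s} share of CExC: {value / cexc:5.1%}")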

  4. A self seeded first level track trigger for ATLAS

    International Nuclear Information System (INIS)

    Schöning, A

    2012-01-01

    For the planned high-luminosity upgrade of the Large Hadron Collider, aiming to increase the instantaneous luminosity to 5 × 10^34 cm^-2 s^-1, the implementation of a first-level track trigger has been proposed. This trigger could be installed around 2021 along with the complete renewal of the ATLAS inner detector. The fast readout of the hit information from the Inner Detector is considered the main challenge of such a track trigger. Different concepts for the implementation of a first-level trigger are currently being studied within the ATLAS collaboration. The so-called 'Self Seeded' track trigger concept exploits fast front-end filtering algorithms based on cluster-size reconstruction and fast vector tracking to select hits associated with high-momentum tracks. Simulation studies have been performed and results on efficiencies, purities and trigger rates are presented for different layouts.

  5. Cumulative Culture and Future Thinking: Is Mental Time Travel a Prerequisite to Cumulative Cultural Evolution?

    Science.gov (United States)

    Vale, G. L.; Flynn, E. G.; Kendal, R. L.

    2012-01-01

    Cumulative culture denotes the, arguably, human capacity to build on the cultural behaviors of one's predecessors, allowing increases in cultural complexity to occur such that many of our cultural artifacts, products and technologies have progressed beyond what a single individual could invent alone. This process of cumulative cultural evolution…

  6. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
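
    The core mechanism, holding the last control value and recomputing it only when an event-triggering condition is violated, can be sketched independently of the HDP learning machinery. The snippet below uses a toy linear system and a fixed state-feedback gain purely to show the triggering logic; the constant threshold and the gain are assumptions, not the adaptive threshold or actor-critic networks of the paper.

        import numpy as np

        # Toy discrete-time system x(k+1) = A x(k) + B u(k); the gain K is assumed, not learned.
        A = np.array([[1.0, 0.1], [0.0, 1.0]])
        B = np.array([[0.0], [0.1]])
        K = np.array([[-10.0, -5.0]])   # stabilizing state feedback (assumed)
        threshold = 0.05                # event-trigger threshold on the state gap (assumed)

        x = np.array([[1.0], [0.0]])
        x_trigger = x.copy()            # state sampled at the last triggering instant
        u = K @ x_trigger
        updates = 0

        for k in range(200):
            # Event-triggered condition: recompute the control only when the gap between
            # the current state and the last-sampled state exceeds the threshold.
            if np.linalg.norm(x - x_trigger) > threshold:
                x_trigger = x.copy()
                u = K @ x_trigger
                updates += 1
            x = A @ x + B @ u

        print(f"control updated {updates} times in 200 steps; final |x| = {np.linalg.norm(x):.3f}")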

  7. TRIGGER

    CERN Multimedia

    W. Smith

    At the March meeting, the CMS trigger group reported on progress in production, tests in the Electronics Integration Center (EIC) in Prevessin 904, progress on trigger installation in the underground counting room at point 5, USC55, the program of trigger pattern tests and vertical slice tests and planning for the Global Runs starting this summer. The trigger group is engaged in the final stages of production testing, systems integration, and software and firmware development. Most systems are delivering final tested electronics to CERN. The installation in USC55 is underway and integration testing is in full swing. A program of orderly connection and checkout with subsystems and central systems has been developed. This program includes a series of vertical subsystem slice tests providing validation of a portion of each subsystem from front-end electronics through the trigger and DAQ to data captured and stored. After full checkout, trigger subsystems will be then operated in the CMS Global Runs. Continuous...

  8. TRIGGER

    CERN Multimedia

    Wesley Smith

    2011-01-01

    Level-1 Trigger Hardware and Software New Forward Scintillating Counters (FSC) for rapidity gap measurements have been installed and integrated into the Trigger recently. For the Global Muon Trigger, tuning of quality criteria has led to improvements in muon trigger efficiencies. Several subsystems have started campaigns to increase spares by recovering boards or producing new ones. The barrel muon sector collector test system has been reactivated, new η track finder boards are in production, and φ track finder boards are under revision. In the CSC track finder, an η asymmetry problem has been corrected. New pT look-up tables have also improved efficiency. RPC patterns were changed from four out of six coincident layers to three out of six in the barrel, which led to a significant increase in efficiency. A new PAC firmware to trigger on heavy stable charged particles allows looking for chamber hit coincidences in two consecutive bunch-crossings. The redesign of the L1 Trigger Emulator...

  9. TRIGGER

    CERN Multimedia

    W. Smith from contributions of C. Leonidopoulos, I. Mikulec, J. Varela and C. Wulz.

    Level-1 Trigger Hardware and Software Over the past few months, the Level-1 trigger has successfully recorded data with cosmic rays over long continuous stretches as well as LHC splash events, beam halo, and collision events. The L1 trigger hardware, firmware, synchronization, performance and readiness for beam operation were reviewed in October. All L1 trigger hardware is now installed at Point 5, and most of it is completely commissioned. While the barrel ECAL Trigger Concentrator Cards are fully operational, the recently delivered endcap ECAL TCC system is still being commissioned. For most systems there is a sufficient number of spares available, but for a few systems additional reserve modules are needed. It was decided to increase the overall L1 latency by three bunch crossings to increase the safety margin for trigger timing adjustments. In order for CMS to continue data taking during LHC frequency ramps, the clock distribution tree needs to be reset. The procedures for this have been tested. A repl...

  10. Flexible trigger menu implementation on the Global Trigger for the CMS Level-1 trigger upgrade

    Science.gov (United States)

    MATSUSHITA, Takashi; CMS Collaboration

    2017-10-01

    The CMS experiment at the Large Hadron Collider (LHC) has continued to explore physics at the high-energy frontier in 2016. The integrated luminosity delivered by the LHC in 2016 was 41 fb^-1 with a peak luminosity of 1.5 × 10^34 cm^-2 s^-1 and peak mean pile-up of about 50, all exceeding the initial estimations for 2016. The CMS experiment has upgraded its hardware-based Level-1 trigger system to maintain its performance for new physics searches and precision measurements at high luminosities. The Global Trigger is the final step of the CMS Level-1 trigger and implements a trigger menu, a set of selection requirements applied to the final list of objects from calorimeter and muon triggers, for reducing the 40 MHz collision rate to 100 kHz. The Global Trigger has been upgraded with state-of-the-art FPGA processors on Advanced Mezzanine Cards with optical links running at 10 GHz in a MicroTCA crate. The powerful processing resources of the upgraded system enable implementation of more algorithms at a time than previously possible, allowing CMS to be more flexible in how it handles the available trigger bandwidth. Algorithms for a trigger menu, including topological requirements on multi-objects, can be realised in the Global Trigger using the newly developed trigger menu specification grammar. Analysis-like trigger algorithms can be represented in an intuitive manner and the algorithms are translated to corresponding VHDL code blocks to build a firmware. The grammar can be extended in the future as needs arise. The experience of implementing trigger menus on the upgraded Global Trigger system will be presented.

  11. A new functional and structural generation of JK edge-triggered flip-flops

    International Nuclear Information System (INIS)

    Stefanescu, I.

    1977-01-01

    A new type of logical structure for a JK edge-triggered flip-flop is proposed. The structure facilitates flip-flop realizations, named 'jk-JK edge-triggered flip-flops', that satisfy more functional requirements and offer increased flexibility in logical design with respect to conventional JK edge-triggered flip-flops. The function of the new flip-flops covers that of the JK edge-triggered flip-flops available as integrated circuits. (author)

  12. Self-triggered coordination with ternary controllers

    NARCIS (Netherlands)

    De Persis, Claudio; Frasca, Paolo

    2012-01-01

    This paper concerns the coordination of networked systems with ternary controllers. We develop a hybrid coordination system which implements a self-triggered communication policy, based on polling the neighbors upon need. We prove that the proposed scheme ensures finite-time convergence to a neighborhood

  13. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557) are used independently of one another. The program finds the probability required to yield a given system reliability. It is used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program is written in C.
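
    The functionality described, evaluating a cumulative binomial probability and inverting it to find the per-trial probability that yields a required system reliability, can be sketched as follows. This is an illustrative re-implementation in Python under an assumed convention (reliability read as P(at least k successes in n trials)), not the NEWTONP source.

        from math import comb

        def cumulative_binomial_at_least(n: int, k: int, p: float) -> float:
            """P(X >= k) for X ~ Binomial(n, p)."""
            return sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k, n + 1))

        def probability_for_reliability(n: int, k: int, target: float) -> float:
            """Per-trial success probability p such that P(X >= k) = target, by bisection."""
            lo, hi = 0.0, 1.0
            for _ in range(100):                   # P(X >= k) is increasing in p
                mid = 0.5 * (lo + hi)
                if cumulative_binomial_at_least(n, k, mid) < target:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        # Example: 10 redundant units, at least 8 must work, required system reliability 0.999
        p = probability_for_reliability(n=10, k=8, target=0.999)
        print(f"required per-unit reliability ~ {p:.4f}")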

  14. Cumulative human impacts on marine predators.

    Science.gov (United States)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J; Halpern, Benjamin S; Breed, Greg A; Nickel, Barry; Teutschel, Nicole M; Crowder, Larry B; Benson, Scott; Dutton, Peter H; Bailey, Helen; Kappes, Michelle A; Kuhn, Carey E; Weise, Michael J; Mate, Bruce; Shaffer, Scott A; Hassrick, Jason L; Henry, Robert W; Irvine, Ladd; McDonald, Birgitte I; Robinson, Patrick W; Block, Barbara A; Costa, Daniel P

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact (CUI) on marine predators by combining electronic tracking data of eight protected predator species (n=685 individuals) in the California Current Ecosystem with data on 24 anthropogenic stressors. We show significant variation in CUI with some of the highest impacts within US National Marine Sanctuaries. High variation in underlying species and cumulative impact distributions means that neither alone is sufficient for effective spatial management. Instead, comprehensive management approaches accounting for both cumulative human impacts and trade-offs among multiple stressors must be applied in planning the use of marine resources.

  15. An analytical model for cumulative infiltration into a dual-permeability media

    Science.gov (United States)

    Peyrard, Xavier; Lassabatere, Laurent; Angulo-Jaramillo, Rafael; Simunek, Jiri

    2010-05-01

    Modeling of water infiltration into the vadose zone is important for a better understanding of the movement of water-transported contaminants. There is a great need to take into account soil heterogeneity and, in particular, the presence of macropores or cracks that could generate preferential flow. Several mathematical models have been proposed to describe unsaturated flow through heterogeneous soils. The dual-permeability model assumes that flow is governed by the Richards equation in both porous regions (matrix and fractures). Water can be exchanged between the two regions following a first-order rate law. A previous study showed that the hydraulic conductivity of the matrix/macropore interface has little influence on cumulative infiltration at the soil surface. As a result, one can consider surface infiltration for the specific case of no water exchange between the fracture and matrix regions (a case of zero interfacial hydraulic conductivity). In such a case, water infiltration can be considered to be the sum of the cumulative infiltrations into the matrix and the fractures. On the basis of analytical models for each subdomain (matrix and fractures), an analytical model is proposed for the entire dual-porosity system. A sensitivity analysis is performed to characterize the influence of several factors, such as the saturated hydraulic conductivity ratio, the water pressure scale parameter ratio, and the saturated volumetric water content scale ratio, on the total cumulative infiltration. Such an analysis greatly helps in quantifying the impact of macroporosity and fractures on water infiltration, which can be of great interest for hydrological models.
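
    In the zero-exchange limit described above, the total cumulative infiltration is simply the weighted sum of the infiltrations into the matrix and fracture regions. The sketch below uses Philip's two-term approximation for each region as a stand-in analytical model; the parameter values and the choice of Philip's equation are illustrative assumptions, not the authors' specific formulation.

        import numpy as np

        def philip_cumulative(t, sorptivity, k_sat):
            """Philip two-term cumulative infiltration I(t) = S*sqrt(t) + K*t (per unit area)."""
            return sorptivity * np.sqrt(t) + k_sat * t

        def dual_permeability_infiltration(t, w_fracture=0.1,
                                           s_matrix=5e-5, k_matrix=1e-6,      # m s^-0.5, m/s (assumed)
                                           s_fracture=5e-4, k_fracture=1e-4): # m s^-0.5, m/s (assumed)
            """Total cumulative infiltration as the volume-weighted sum of the two regions."""
            w_matrix = 1.0 - w_fracture
            return (w_matrix * philip_cumulative(t, s_matrix, k_matrix)
                    + w_fracture * philip_cumulative(t, s_fracture, k_fracture))

        t = np.linspace(0.0, 3600.0, 5)           # one hour, in seconds
        print(dual_permeability_infiltration(t))  # cumulative infiltration in metres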

  16. A balanced solution to the cumulative threat of industrialized wind farm development on cinereous vultures (Aegypius monachus) in south-eastern Europe

    Science.gov (United States)

    Whitfield, D. Philip; Kati, Vassiliki

    2017-01-01

    Wind farm development can combat climate change but may also threaten bird populations’ persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. Our ‘win-win’ approach is appropriate to other potential conflicts where wind farms may cumulatively threaten wildlife
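
    A schematic of how per-farm collision estimates accumulate across a development scenario: the collisions predicted for each wind farm are scaled by (1 - avoidance rate) and summed, then compared with population size. The farm names and numbers are placeholders, and the simple multiplication stands in for the full Collision Risk Model.

        # Hypothetical scenario: CRM-predicted collisions per year before applying avoidance.
        scenario = {
            "farm_A": 3.0,
            "farm_B": 1.5,
            "farm_C": 6.0,
        }
        population_size = 103     # current population, as quoted in the abstract
        avoidance_rate = 0.98     # one of several rates that could be tested (assumed)

        cumulative_mortality = sum(c * (1.0 - avoidance_rate) for c in scenario.values())
        print(f"expected deaths/year = {cumulative_mortality:.2f} "
              f"({cumulative_mortality / population_size:.1%} of the current population)")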

  17. A balanced solution to the cumulative threat of industrialized wind farm development on cinereous vultures (Aegypius monachus) in south-eastern Europe.

    Directory of Open Access Journals (Sweden)

    Dimitris P Vasilakis

    Wind farm development can combat climate change but may also threaten bird populations' persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. Our 'win-win' approach is appropriate to other potential conflicts where wind farms may cumulatively threaten

  18. TRIGGER

    CERN Multimedia

    W. Smith, from contributions of D. Acosta

    2012-01-01

      The L1 Trigger group deployed several major improvements this year. Compared to 2011, the single-muon trigger rate has been reduced by a factor of 2 and the η coverage has been restored to 2.4, with high efficiency. During the current technical stop, a higher jet seed threshold will be applied in the Global Calorimeter Trigger in order to significantly reduce the strong pile-up dependence of the HT and multi-jet triggers. The currently deployed L1 menu, with the “6E33” prescales, has a total rate of less than 100 kHz and operates with detector readout dead time of less than 3% for luminosities up to 6.5 × 1033 cm–2s–1. Further prescale sets have been created for 7 and 8 × 1033 cm–2s–1 luminosities. The L1 DPG is evaluating the performance of the Trigger for upcoming conferences and publication. Progress on the Trigger upgrade was reviewed during the May Upgrade Week. We are investigating scenarios for stagin...

  19. TRIGGER

    CERN Multimedia

    R. Arcidiacono

    2013-01-01

      In 2013 the Trigger Studies Group (TSG) has been restructured in three sub-groups: STEAM, for the development of new HLT menus and monitoring their performance; STORM, for the development of HLT tools, code and actual configurations; and FOG, responsible for the online operations of the High Level Trigger. The Strategy for Trigger Evolution And Monitoring (STEAM) group is responsible for Trigger Menu development, path timing, trigger performance studies coordination, HLT offline DQM as well as HLT release, menu and conditions validation – in collaboration and with the technical support of the PdmV group. Since the end of proton-proton data taking, the group has started preparing for 2015 data taking, with collisions at 13 TeV and 25 ns bunch spacing. The reliability of the extrapolation to higher energy is being evaluated comparing the trigger rates on 7 and 8 TeV Monte Carlo samples with the data taken in the past two years. The effect of 25 ns bunch spacing is being studied on the d...

  20. TRIGGER

    CERN Multimedia

    W. Smith

    Level-1 Trigger Hardware and Software The road map for the final commissioning of the level-1 trigger system has been set. The software for the trigger subsystems is being upgraded to run under CERN Scientific Linux 4 (SLC4). There is also a new release for the Trigger Supervisor (TS 1.4), which implies upgrade work by the subsystems. As reported by the CERN group, a campaign to tidy the Trigger Timing and Control (TTC) racks has begun. The machine interface was upgraded by installing the new RF2TTC module, which receives RF signals from LHC Point 4. Two Beam Synchronous Timing (BST) signals, one for each beam, can now be received in CMS. The machine group will define the exact format of the information content shortly. The margin on the locking range of the CMS QPLL is planned for study for different subsystems in the next Global Runs, using a function generator. The TTC software has been successfully tested on SLC4. Some TTC subsystems have already been upgraded to SLC4. The TTCci Trigger Supervisor ...

  1. BTeV trigger/DAQ innovations

    International Nuclear Information System (INIS)

    Votava, Margaret

    2005-01-01

    The BTeV experiment was a collider-based high energy physics (HEP) B-physics experiment proposed at Fermilab. It included a large-scale, high-speed trigger/data acquisition (DAQ) system, reading data off the detector at 500 Gbytes/sec and writing to mass storage at 200 Mbytes/sec. The online design was considered to be highly credible in terms of technical feasibility, schedule and cost. This paper will give an overview of the overall trigger/DAQ architecture, highlight some of the challenges, and describe the BTeV approach to solving some of the technical challenges. At the time of termination in early 2005, the experiment had just passed its baseline review. Although not fully implemented, many of the architecture choices, design, and prototype work for the online system (both trigger and DAQ) were well on their way to completion. Other large, high-speed online systems may have interest in some of the design choices and directions of BTeV, including (a) a commodity-based tracking trigger running asynchronously at full rate, (b) the hierarchical control and fault tolerance in a large real-time environment, (c) a partitioning model that supports offline processing on the online farms during idle periods with plans for dynamic load balancing, and (d) an independent parallel highway architecture

  2. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    Science.gov (United States)

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
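
    As a rough illustration of the cumulant-plus-SVM idea described above (not the Letter's exact feature set, window length, or training protocol), the sketch below estimates the second- to fifth-order cumulants of an acceleration-magnitude window from its central moments and feeds them to an SVM; the use of scikit-learn and the random stand-in data are assumptions.

      import numpy as np
      from sklearn.svm import SVC   # assumed library; the Letter does not name one

      def cumulant_features(window):
          """2nd- to 5th-order cumulants of one acceleration-magnitude window."""
          x = np.asarray(window, dtype=float)
          xc = x - x.mean()
          mu = {k: np.mean(xc ** k) for k in range(2, 6)}   # central moments
          k2 = mu[2]
          k3 = mu[3]
          k4 = mu[4] - 3.0 * mu[2] ** 2
          k5 = mu[5] - 10.0 * mu[3] * mu[2]
          return np.array([k2, k3, k4, k5])

      # stand-in data: 40 windows of 128 samples with fall (1) / non-fall (0) labels
      rng = np.random.default_rng(0)
      windows = rng.normal(size=(40, 128))
      labels = rng.integers(0, 2, size=40)
      features = np.vstack([cumulant_features(w) for w in windows])
      clf = SVC(kernel="rbf").fit(features, labels)
      print(clf.predict(cumulant_features(windows[0]).reshape(1, -1)))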

  3. 78 FR 26308 - Endangered and Threatened Wildlife and Plants; Proposed Threatened Status for Coral Pink Sand...

    Science.gov (United States)

    2013-05-06

    ... effects of climate change and drought; and (4) cumulative interaction of individual factors such as off..., we considered the types of activities that might trigger regulatory impacts under the rule, as well... work programs; Child Nutrition; Food Stamps; Social Services Block Grants; Vocational Rehabilitation...

  4. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
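
    The crossover described above, the point at which the reliability of a k-out-of-n system equals the common reliability of its components, can be illustrated in a few lines. The sketch below is not the CROSSER program (which is written in C); it is a hedged re-statement of the idea using the cumulative binomial distribution and bisection.

      from math import comb

      def system_reliability(p, k, n):
          """Reliability of a k-out-of-n system whose components all have reliability p."""
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

      def crossover(k, n, tol=1e-12):
          """Bisect for the p in (0, 1) where system reliability equals p."""
          lo, hi = 1e-9, 1.0 - 1e-9
          while hi - lo > tol:
              mid = 0.5 * (lo + hi)
              if system_reliability(mid, k, n) > mid:
                  hi = mid   # system beats a single component here; crossover is lower
              else:
                  lo = mid
          return 0.5 * (lo + hi)

      print(crossover(k=2, n=3))   # 2-out-of-3 majority system: crossover at p = 0.5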

  5. Cumulative effects assessment: Does scale matter?

    International Nuclear Information System (INIS)

    Therivel, Riki; Ross, Bill

    2007-01-01

    Cumulative effects assessment (CEA) is (or should be) an integral part of environmental assessment at both the project and the more strategic level. CEA helps to link the different scales of environmental assessment in that it focuses on how a given receptor is affected by the totality of plans, projects and activities, rather than on the effects of a particular plan or project. This article reviews how CEAs consider, and could consider, scale issues: spatial extent, level of detail, and temporal issues. It is based on an analysis of Canadian project-level CEAs and UK strategic-level CEAs. Based on a review of literature and, especially, case studies with which the authors are familiar, it concludes that scale issues are poorly considered at both levels, with particular problems being unclear or non-existing cumulative effects scoping methodologies; poor consideration of past or likely future human activities beyond the plan or project in question; attempts to apportion 'blame' for cumulative effects; and, at the plan level, limited management of cumulative effects caused particularly by the absence of consent regimes. Scale issues are important in most of these problems. However both strategic-level and project-level CEA have much potential for managing cumulative effects through better siting and phasing of development, demand reduction and other behavioural changes, and particularly through setting development consent rules for projects. The lack of strategic resource-based thresholds constrains the robust management of strategic-level cumulative effects

  6. Humanoid infers Archimedes' principle: understanding physical relations and object affordances through cumulative learning experiences

    Science.gov (United States)

    2016-01-01

    Emerging studies indicate that several species such as corvids, apes and children solve ‘The Crow and the Pitcher’ task (from Aesop's Fables) in diverse conditions. Hidden beneath this fascinating paradigm is a fundamental question: by cumulatively interacting with different objects, how can an agent abstract the underlying cause–effect relations to predict and creatively exploit potential affordances of novel objects in the context of sought goals? Re-enacting this Aesop's Fable task on a humanoid within an open-ended ‘learning–prediction–abstraction’ loop, we address this problem and (i) present a brain-guided neural framework that emulates rapid one-shot encoding of ongoing experiences into a long-term memory and (ii) propose four task-agnostic learning rules (elimination, growth, uncertainty and status quo) that correlate predictions from remembered past experiences with the unfolding present situation to gradually abstract the underlying causal relations. Driven by the proposed architecture, the ensuing robot behaviours illustrated causal learning and anticipation similar to natural agents. Results further demonstrate that by cumulatively interacting with few objects, the predictions of the robot in case of novel objects converge close to the physical law, i.e. the Archimedes principle: this being independent of both the objects explored during learning and the order of their cumulative exploration. PMID:27466440

  7. Humanoid infers Archimedes' principle: understanding physical relations and object affordances through cumulative learning experiences.

    Science.gov (United States)

    Bhat, Ajaz Ahmad; Mohan, Vishwanathan; Sandini, Giulio; Morasso, Pietro

    2016-07-01

    Emerging studies indicate that several species such as corvids, apes and children solve 'The Crow and the Pitcher' task (from Aesop's Fables) in diverse conditions. Hidden beneath this fascinating paradigm is a fundamental question: by cumulatively interacting with different objects, how can an agent abstract the underlying cause-effect relations to predict and creatively exploit potential affordances of novel objects in the context of sought goals? Re-enacting this Aesop's Fable task on a humanoid within an open-ended 'learning-prediction-abstraction' loop, we address this problem and (i) present a brain-guided neural framework that emulates rapid one-shot encoding of ongoing experiences into a long-term memory and (ii) propose four task-agnostic learning rules (elimination, growth, uncertainty and status quo) that correlate predictions from remembered past experiences with the unfolding present situation to gradually abstract the underlying causal relations. Driven by the proposed architecture, the ensuing robot behaviours illustrated causal learning and anticipation similar to natural agents. Results further demonstrate that by cumulatively interacting with few objects, the predictions of the robot in case of novel objects converge close to the physical law, i.e. the Archimedes principle: this being independent of both the objects explored during learning and the order of their cumulative exploration. © 2016 The Author(s).

  8. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.

  9. Cumulative human impacts on marine predators

    DEFF Research Database (Denmark)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact...

  10. Semi-intelligent trigger-generation scheme for Cherenkov light imaging cameras

    International Nuclear Information System (INIS)

    Bhat, C.L.; Tickoo, A.K.; Koul, R.; Kaul, I.K.

    1994-01-01

    We propose here an improved trigger-generation scheme for TeV gamma-ray imaging telescopes. Based on a memory-based Majority Coincidence Circuit, this scheme involves deriving two-or three-pixel nearest-neighbour coincidences as against the conventional approach of generating prompt coincidences using any two photomultiplier detector pixels of an imaging-camera. As such, the new method can discriminate better against shot-noise-generated triggers and, to a significant extent, also against cosmic-ray and local-muon-generated background events, without compromising on the telescope response to events of γ-ray origin. An optional feature of the proposed scheme is that a suitably scaled-up value of the chance-trigger rate can be independently derived, thereby making it possible to use this parameter reliably for keeping a log of the ''health'' of the experimental system. (orig.)

  11. Semi-intelligent trigger-generation scheme for Cherenkov light imaging cameras

    Science.gov (United States)

    Bhat, C. L.; Tickoo, A. K.; Koul, R.; Kaul, I. K.

    1994-02-01

    We propose here an improved trigger-generation scheme for TeV gamma-ray imaging telescopes. Based on a memory-based Majority Coincidence Circuit, this scheme involves deriving two- or three-pixel nearest-neighbour coincidences as against the conventional approach of generating prompt coincidences using any two photomultiplier detector pixels of an imaging-camera. As such, the new method can discriminate better against shot-noise-generated triggers and, to a significant extent, also against cosmic-ray and local-muon-generated background events, without compromising on the telescope response to events of γ-ray origin. An optional feature of the proposed scheme is that a suitably scaled-up value of the chance-trigger rate can be independently derived, thereby making it possible to use this parameter reliably for keeping a log of the ``health'' of the experimental system.

  12. Muon Trigger for Mobile Phones

    Science.gov (United States)

    Borisyak, M.; Usvyatsov, M.; Mulhearn, M.; Shimmin, C.; Ustyuzhanin, A.

    2017-10-01

    The CRAYFIS experiment proposes to use privately owned mobile phones as a ground detector array for Ultra High Energy Cosmic Rays. Upon interacting with Earth’s atmosphere, these cosmic rays produce extensive particle showers which can be detected by cameras on mobile phones. A typical shower contains minimally-ionizing particles such as muons. As these particles interact with CMOS image sensors, they may leave tracks of faintly-activated pixels that are sometimes hard to distinguish from random detector noise. Triggers that rely on the presence of very bright pixels within an image frame are not efficient in this case. We present a trigger algorithm based on Convolutional Neural Networks which selects images containing such tracks and is evaluated in a lazy manner: the response of each successive layer is computed only if the activation of the current layer satisfies a continuation criterion. Using neural networks increases the sensitivity considerably compared with image thresholding, while the lazy evaluation allows the trigger to run within the limited computational power of mobile phones.

  13. Event-Triggered Fault Detection of Nonlinear Networked Systems.

    Science.gov (United States)

    Li, Hongyi; Chen, Ziran; Wu, Ligang; Lam, Hak-Keung; Du, Haiping

    2017-04-01

    This paper investigates the problem of fault detection for nonlinear discrete-time networked systems under an event-triggered scheme. A polynomial fuzzy fault detection filter is designed to generate a residual signal and detect faults in the system. A novel polynomial event-triggered scheme is proposed to determine the transmission of the signal. A fault detection filter is designed to guarantee that the residual system is asymptotically stable and satisfies the desired performance. Polynomial approximated membership functions obtained by Taylor series are employed for filtering analysis. Furthermore, sufficient conditions are represented in terms of sum of squares (SOSs) and can be solved by SOS tools in MATLAB environment. A numerical example is provided to demonstrate the effectiveness of the proposed results.
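
    The polynomial fuzzy event-triggered scheme of the paper is not reproducible from this record, but the generic idea behind event-triggered transmission can be sketched: a new measurement is sent to the fault detection filter only when it has drifted sufficiently far from the last transmitted one. The relative threshold sigma below is an illustrative assumption.

      import numpy as np

      def should_transmit(x_current, x_last_sent, sigma=0.1):
          """Generic relative-threshold event trigger (illustrative, not the paper's
          polynomial scheme): transmit when the deviation from the last sent sample
          exceeds a fraction of the current signal norm."""
          return np.linalg.norm(x_current - x_last_sent) > sigma * np.linalg.norm(x_current)

      x_sent = np.array([1.0, 0.0])
      for x in (np.array([1.02, 0.01]), np.array([1.30, -0.20]), np.array([1.31, -0.22])):
          if should_transmit(x, x_sent):
              x_sent = x                      # the filter receives a fresh sample
              print("transmit", x)
          else:
              print("skip", x)                # the network stays quiet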

  14. TRIGGER

    CERN Multimedia

    by Wesley Smith

    2011-01-01

    Level-1 Trigger Hardware and Software After the winter shutdown minor hardware problems in several subsystems appeared and were corrected. A reassessment of the overall latency has been made. In the TTC system shorter cables between TTCci and TTCex have been installed, which saved one bunch crossing, but which may have required an adjustment of the RPC timing. In order to tackle Pixel out-of-syncs without influencing other subsystems, a special hardware/firmware re-sync protocol has been introduced in the Global Trigger. The link between the Global Calorimeter Trigger and the Global Trigger with the new optical Global Trigger Interface and optical receiver daughterboards has been successfully tested in the Electronics Integration Centre in building 904. New firmware in the GCT now allows a setting to remove the HF towers from energy sums. The HF sleeves have been replaced, which should lead to reduced rates of anomalous signals, which may allow their inclusion after this is validated. For ECAL, improvements i...

  15. TRIGGER

    CERN Multimedia

    W. Smith from contributions of C. Leonidopoulos

    2010-01-01

    Level-1 Trigger Hardware and Software Since nearly all of the Level-1 (L1) Trigger hardware at Point 5 has been commissioned, activities during the past months focused on the fine-tuning of synchronization, particularly for the ECAL and the CSC systems, on firmware upgrades and on improving trigger operation and monitoring. Periodic resynchronizations or hard resets and a shortened luminosity section interval of 23 seconds were implemented. For the DT sector collectors, an automatic power-off was installed in case of high temperatures, and the monitoring capabilities of the opto-receivers and the mini-crates were enhanced. The DTTF and the CSCTF now have improved memory lookup tables. The HCAL trigger primitive logic implemented a new algorithm providing better stability of the energy measurement in the presence of any phase misalignment. For the Global Calorimeter Trigger, additional Source Cards have been manufactured and tested. Testing of the new tau, missing ET and missing HT algorithms is underw...

  16. A new method to cluster genomes based on cumulative Fourier power spectrum.

    Science.gov (United States)

    Dong, Rui; Zhu, Ziyue; Yin, Changchuan; He, Rong L; Yau, Stephen S-T

    2018-06-20

    Analyzing phylogenetic relationships using mathematical methods has always been of importance in bioinformatics. Quantitative research may interpret the raw biological data in a precise way. Multiple Sequence Alignment (MSA) is used frequently to analyze biological evolution, but is very time-consuming. When the scale of data is large, alignment methods cannot finish the calculation in a reasonable time. Therefore, we present a new method using moments of the cumulative Fourier power spectrum to cluster DNA sequences. Each sequence is translated into a vector in Euclidean space. Distances between the vectors can reflect the relationships between sequences. The mapping between the spectra and moment vector is one-to-one, which means that no information is lost in the power spectra during the calculation. We cluster and classify several datasets including Influenza A, primates, and human rhinovirus (HRV) datasets to build up the phylogenetic trees. Results show that the newly proposed cumulative Fourier power spectrum method is much faster and more accurate than MSA and another alignment-free method known as k-mer. The research provides new insights into the study of phylogeny, evolution, and efficient DNA comparison algorithms for large genomes. The computer programs of the cumulative Fourier power spectrum are available at GitHub (https://github.com/YaulabTsinghua/cumulative-Fourier-power-spectrum). Copyright © 2018. Published by Elsevier B.V.
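
    A minimal sketch of an alignment-free pipeline in the spirit of the method described above (the paper's exact moment definitions and normalizations are not reproduced here): each base is turned into an indicator sequence, the Fourier power spectra are summed and cumulated, and a short vector of moments of that cumulative spectrum is compared between sequences by Euclidean distance.

      import numpy as np

      def cumulative_power_moments(seq, n_moments=3):
          """Moments of the cumulative Fourier power spectrum of a DNA sequence
          (illustrative; not the paper's exact definition)."""
          half = len(seq) // 2
          power = np.zeros(half)
          for base in "ACGT":
              x = np.array([1.0 if c == base else 0.0 for c in seq.upper()])
              spectrum = np.abs(np.fft.fft(x)) ** 2
              power += spectrum[1:half + 1]          # skip the zero-frequency term
          cum = np.cumsum(power)
          cum /= cum[-1]                             # normalize to end at 1
          grid = np.arange(1, half + 1) / half
          return np.array([np.mean(grid**m * cum) for m in range(1, n_moments + 1)])

      seqs = {"s1": "ACGTACGTACGGTTACGATC",
              "s2": "ACGTACGAACGGTTACGATC",
              "s3": "TTTTGGGGCCCCAAAATTGG"}
      vec = {k: cumulative_power_moments(v) for k, v in seqs.items()}
      print(np.linalg.norm(vec["s1"] - vec["s2"]), np.linalg.norm(vec["s1"] - vec["s3"]))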

  17. An analysis of cumulative risks based on biomonitoring data for six phthalates using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single chemical drives the cumulative risk of an individual exposed to multiple chemicals. Phthalates are a class of chemicals with ubiquitous exposures in the general population that have the potential to cause ...

  18. Insight into multiple-triggering effect in DTSCRs for ESD protection

    Science.gov (United States)

    Zhang, Lizhong; Wang, Yuan; Wang, Yize; He, Yandong

    2017-07-01

    The diode-triggered silicon-controlled rectifier (DTSCR) is widely used for electrostatic discharge (ESD) protection in advanced CMOS process owing to its advantages, such as design simplification, adjustable trigger/holding voltage, low parasitic capacitance. However, the multiple-triggering effect in the typical DTSCR device may cause undesirable larger overall trigger voltage, which results in a reduced ESD safe margin. In previous research, the major cause is attributed to the higher current level required in the intrinsic SCR. The related discussions indicate that it seems to result from the current division rule between the intrinsic and parasitic SCR formed in the triggering process. In this letter, inserting a large space into the trigger diodes is proposed to get a deeper insight into this issue. The triggering current is observed to be regularly reduced along with the increased space, which confirms that the current division is determined by the parasitic resistance distributed between the intrinsic and parasitic SCR paths. The theoretical analysis is well confirmed by device simulation and transmission line pulse (TLP) test results. The reduced overall trigger voltage is achieved in the modified DTSCR structures due to the comprehensive result of the parasitic resistance vs triggering current, which indicates a minimized multiple-triggering effect. Project supported by the Beijing Natural Science Foundation, China (No. 4162030).

  19. Constraints and triggers: situational mechanics of gender in negotiation.

    Science.gov (United States)

    Bowles, Hannah Riley; Babcock, Linda; McGinn, Kathleen L

    2005-12-01

    The authors propose 2 categories of situational moderators of gender in negotiation: situational ambiguity and gender triggers. Reducing the degree of situational ambiguity constrains the influence of gender on negotiation. Gender triggers prompt divergent behavioral responses as a function of gender. Field and lab studies (1 and 2) demonstrated that decreased ambiguity in the economic structure of a negotiation (structural ambiguity) reduces gender effects on negotiation performance. Study 3 showed that representation role (negotiating for self or other) functions as a gender trigger by producing a greater effect on female than male negotiation performance. Study 4 showed that decreased structural ambiguity constrains gender effects of representation role, suggesting that situational ambiguity and gender triggers work in interaction to moderate gender effects on negotiation performance. Copyright 2006 APA, all rights reserved.

  20. A proposed Drift Tubes-seeded muon track trigger for the CMS experiment at the High Luminosity-LHC

    CERN Document Server

    AUTHOR|(CDS)2070813; Lazzizzera, Ignazio; Vanini, Sara; Zotto, Pierluigi

    2016-01-01

    The LHC program at 13 and 14 TeV, after the observation of the candidate SM Higgs boson, will help clarify future subjects of study and shape the needed tools. Any upgrade of the LHC experiments for unprecedented luminosities, such as the High Luminosity-LHC ones, must then maintain the acceptance on electroweak processes that can lead to a detailed study of the properties of the candidate Higgs boson. The acceptance of the key lepton, photon and hadron triggers should be kept such that the overall physics acceptance, in particular for low-mass scale processes, can be the same as the one the experiments featured in 2012. In such a scenario, a new approach to early trigger implementation is needed. One of the major steps will be the inclusion of high-granularity tracking sub-detectors, such as the CMS Silicon Tracker, in taking the early trigger decision. This contribution can be crucial in several tasks, including the confirmation of triggers in other subsystems, and the improvement of the on-line momentum mea...

  1. Chapter 19. Cumulative watershed effects and watershed analysis

    Science.gov (United States)

    Leslie M. Reid

    1998-01-01

    Cumulative watershed effects are environmental changes that are affected by more than one land-use activity and that are influenced by processes involving the generation or transport of water. Almost all environmental changes are cumulative effects, and almost all land-use activities contribute to cumulative effects.

  2. Retrospective respiratory triggering renal perfusion MRI

    Energy Technology Data Exchange (ETDEWEB)

    Attenberger, Ulrike I.; Michaely, Henrik J.; Schoenberg, Stefan O. (Dept. of Clinical Radiology and Nuclear Medicine, Univ. Hospital Mannheim, Univ. of Heidelberg, Mannheim (Germany)), e-mail: ulrike.attenberger@medma.uni-heidelberg.de; Sourbron, Steven P. (Div. of Medical Physics, Univ. of Leeds, Leeds (United Kingdom)); Reiser, Maximilian F. (Dept. of Clinical Radiology, Univ. Hospitals Munich, Grosshadern, Ludwig-Maximilians-Univ., Munich (Germany))

    2010-12-15

    Background: Artifacts of respiratory motion are one of the well-known limitations of dynamic contrast-enhanced MRI (DCE-MRI) of the kidney. Purpose: To propose and evaluate a retrospective triggering approach to minimize the effect of respiratory motion in DCE-MRI of the kidney. Material and Methods: Nine consecutive patients underwent renal perfusion measurements. Data were acquired with a 2D saturation-recovery TurboFLASH sequence. In order to test the dependence of the results on size and location of the manually drawn triggering regions of interest (ROIs), three widely differing triggering regions were defined by one observer. Mean value, standard deviation, and variability of the renal function parameters plasma flow (FP), plasma volume (VP), plasma transit time (TP), tubular flow (FT), tubular volume (VT), and tubular transit time (TT) were calculated on a per-patient basis. Results: The results show that triggered data have adequate temporal resolution to measure blood flow. The overall average values of the function parameters were: 152.77 (FP), 15.18 (VP), 6.73 (TP), 18.50 (FT), 35.36 (VT), and 117.67 (TT). The variability (calculated in % SD from the mean value) for three different respiratory triggering regions defined on a per-patient basis was between 0.81% and 9.87% for FP, 1.45% and 8.19% for VP, 0% and 9.63% for TP, 2.15% and 12.23% for FT, 0.8% and 17.28% for VT, and 1.97% and 12.87% for TT. Conclusion: Triggering reduces the oscillations in the signal curves and produces sharper parametric maps. In contrast to numerically challenging approaches like registration and segmentation, it can be applied in clinical routine, but a (semi)-automatic approach to select the triggering ROI is desirable to reduce user dependence.

  3. An Analysis of Cumulative Risks Indicated by Biomonitoring Data of Six Phthalates Using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor.1 This study used the MCR, the Hazard Index (HI) and Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...
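
    The MCR referred to in the two records above is commonly defined as the Hazard Index (the sum of the individual Hazard Quotients) divided by the largest single Hazard Quotient, so it ranges from 1 (one chemical drives all of the risk) to the number of chemicals (all contribute equally). The sketch below uses that definition with purely hypothetical intakes and reference doses for six phthalates.

      def hazard_quotients(intakes, reference_doses):
          """HQ_i = intake_i / reference_dose_i (units must match; values are made up)."""
          return [i / rfd for i, rfd in zip(intakes, reference_doses)]

      def maximum_cumulative_ratio(hqs):
          """MCR = HI / max(HQ): how much the mixture risk exceeds the worst single chemical."""
          return sum(hqs) / max(hqs)

      intakes = [2.0, 5.5, 0.8, 1.2, 3.1, 0.4]        # hypothetical daily intakes
      rfds    = [20.0, 100.0, 10.0, 35.0, 50.0, 8.0]  # hypothetical reference doses
      hqs = hazard_quotients(intakes, rfds)
      print("HI =", round(sum(hqs), 3), "MCR =", round(maximum_cumulative_ratio(hqs), 2))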

  4. Ecosystem assessment methods for cumulative effects at the regional scale

    International Nuclear Information System (INIS)

    Hunsaker, C.T.

    1989-01-01

    Environmental issues such as nonpoint-source pollution, acid rain, reduced biodiversity, land use change, and climate change have widespread ecological impacts and require an integrated assessment approach. Since 1978, the implementing regulations for the National Environmental Policy Act (NEPA) have required assessment of potential cumulative environmental impacts. Current environmental issues have encouraged ecologists to improve their understanding of ecosystem process and function at several spatial scales. However, management activities usually occur at the local scale, and there is little consideration of the potential impacts to the environmental quality of a region. This paper proposes that regional ecological risk assessment provides a useful approach for assisting scientists in accomplishing the task of assessing cumulative impacts. Critical issues such as spatial heterogeneity, boundary definition, and data aggregation are discussed. Examples from an assessment of acidic deposition effects on fish in Adirondack lakes illustrate the importance of integrated data bases, associated modeling efforts, and boundary definition at the regional scale

  5. Event-Triggered Output-Feedback Control for Disturbed Linear Systems

    Directory of Open Access Journals (Sweden)

    Hao Jiang

    2018-01-01

    Full Text Available In the last few decades, event-triggered control has received considerable attention because of its advantages in reducing resource utilization, such as communication load and processor usage. In this paper, we propose an event-triggered output-feedback controller for disturbed linear systems, in order to achieve both better resource utilization and disturbance attenuation properties at the same time. Based on our prior work on state-feedback H∞ control for disturbed systems, we propose an approach to design an output-feedback H∞ controller for the system whose states are not completely observable, and a sufficient condition guaranteeing the asymptotic stability and robustness of the system is given in the form of LMIs (Linear Matrix Inequalities).

  6. TRIGGER

    CERN Multimedia

    W. Smith

    2011-01-01

    Level-1 Trigger Hardware and Software Overall the L1 trigger hardware has been running very smoothly during the last months of proton running. Modifications for the heavy-ion run have been made where necessary. The maximal design rate of 100 kHz can be sustained without problems. All L1 latencies have been rechecked. The recently installed Forward Scintillating Counters (FSC) are being used in the heavy ion run. The ZDC scintillators have been dismantled, but the calorimeter itself remains. We now send the L1 accept signal and other control signals to TOTEM. Trigger cables from TOTEM to CMS will be installed during the Christmas shutdown, so that the TOTEM data can be fully integrated within the CMS readout. New beam gas triggers have been developed, since the BSC-based trigger is no longer usable at high luminosities. In particular, a special BPTX signal is used after a quiet period with no collisions. There is an ongoing campaign to provide enough spare modules for the different subsystems. For example...

  7. TRIGGER

    CERN Multimedia

    J. Alimena

    2013-01-01

    Trigger Strategy Group The Strategy for Trigger Evolution And Monitoring (STEAM) group is responsible for the development of future High-Level Trigger menus, as well as of its DQM and validation, in collaboration and with the technical support of the PdmV group. Taking into account the beam energy and luminosity expected in 2015, a rough estimate of the trigger rates indicates a factor four increase with respect to 2012 conditions. Assuming that a factor two can be tolerated thanks to the increase in offline storage and processing capabilities, a toy menu has been developed using the new OpenHLT workflow to estimate the transverse energy/momentum thresholds that would halve the current trigger rates. The CPU time needed to run the HLT has been compared between data taken with 25 ns and 50 ns bunch spacing, for equivalent pile-up: no significant difference was observed on the global time per event distribution at the only available data point, corresponding to a pile-up of about 10 interactions. Using th...

  8. Aftershocks and triggering processes in rock fracture

    Science.gov (United States)

    Davidsen, J.; Kwiatek, G.; Goebel, T.; Stanchits, S. A.; Dresen, G.

    2017-12-01

    One of the hallmarks of our understanding of seismicity in nature is the importance of triggering processes, which makes the forecasting of seismic activity feasible. These triggering processes, by which one earthquake induces (dynamic or static) stress changes leading to potentially multiple other earthquakes, are, at their core, relaxation processes. A specific example of triggering is aftershocks following a large earthquake, which have been observed to follow certain empirical relationships such as the Omori-Utsu relation. Such an empirical relation should arise from the underlying microscopic dynamics of the involved physical processes but the exact connection remains to be established. Simple explanations have been proposed but their general applicability is unclear. Many explanations involve the picture of an earthquake as a purely frictional sliding event. Here, we present experimental evidence that these empirical relationships are not limited to frictional processes but also arise in fracture zone formation and are mostly related to compaction-type events. Our analysis is based on tri-axial compression experiments under constant displacement rate on sandstone and granite samples using spatially located acoustic emission events and their focal mechanisms. More importantly, we show that event-event triggering plays an important role in the presence of large-scale or macroscopic imperfections while such triggering is basically absent if no significant imperfections are present. We also show that spatial localization and an increase in activity rates close to failure do not necessarily imply triggering behavior associated with aftershocks. Only if a macroscopic crack is formed and its propagation remains subcritical do we observe significant triggering.

  9. Distributed event-triggered consensus tracking of second-order multi-agent systems with a virtual leader

    International Nuclear Information System (INIS)

    Cao Jie; Wu Zhi-Hai; Peng Li

    2016-01-01

    This paper investigates the consensus tracking problems of second-order multi-agent systems with a virtual leader via event-triggered control. A novel distributed event-triggered transmission scheme is proposed, which is intermittently examined at constant sampling instants. Only partial neighbor information and local measurements are required for event detection. Then the corresponding event-triggered consensus tracking protocol is presented to guarantee second-order multi-agent systems to achieve consensus tracking. Numerical simulations are given to illustrate the effectiveness of the proposed strategy. (paper)

  10. Study on the cumulative impact of reclamation activities on ecosystem health in coastal waters.

    Science.gov (United States)

    Shen, Chengcheng; Shi, Honghua; Zheng, Wei; Li, Fen; Peng, Shitao; Ding, Dewen

    2016-02-15

    The purpose of this study is to develop feasible tools to investigate the cumulative impact of reclamations on coastal ecosystem health, so that the strategies of ecosystem-based management can be applied in the coastal zone. An indicator system and model were proposed to assess the cumulative impact synthetically. Two coastal water bodies, namely Laizhou Bay (LZB) and Tianjin coastal waters (TCW), in the Bohai Sea of China were studied and compared, each in a different phase of reclamations. Case studies showed that the indicator scores of coastal ecosystem health in LZB and TCW were 0.75 and 0.68 out of 1.0, respectively. It can be concluded that coastal reclamations have a historically cumulative effect on benthic environment, whose degree is larger than that on aquatic environment. The ecosystem-based management of coastal reclamations should emphasize the spatially and industrially intensive layout. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Managing cumulative impacts: A key to sustainability?

    Energy Technology Data Exchange (ETDEWEB)

    Hunsaker, C.T.

    1994-12-31

    This paper addresses how science can be more effectively used in creating policy to manage cumulative effects on ecosystems. The paper focuses on the scientific techniques that we have to identify and to assess cumulative impacts on ecosystems. The term "sustainable development" was brought into common use by the World Commission on Environment and Development (The Brundtland Commission) in 1987. The Brundtland Commission report highlighted the need to address developmental and environmental imperatives simultaneously by calling for development that "meets the needs of the present generation without compromising the needs of future generations." We cannot claim to be working toward sustainable development until we can quantitatively assess cumulative impacts on the environment: the two concepts are inextricably linked in that the elusiveness of cumulative effects likely has the greatest potential of keeping us from achieving sustainability. In this paper, assessment and management frameworks relevant to cumulative impacts are discussed along with recent literature on how to improve such assessments. When possible, examples are given for marine ecosystems.

  12. Landslides triggered by the 1946 Ancash earthquake, Peru

    Science.gov (United States)

    Kampherm, T. S.; Evans, S. G.; Valderrama Murillo, P.

    2009-04-01

    The 1946 M7.3 Ancash Earthquake triggered a large number of landslides in an epicentral area that straddled the Continental Divide of South America in the Andes of Peru. A small number of landslides were described in reconnaissance reports by E. Silgado and Arnold Heim published shortly after the earthquake, but further details of the landslides triggered by the earthquake have not been reported since. Utilising field traverses, aerial photograph interpretation and GIS, our study mapped 45 landslides inferred to have been triggered by the event. 83% were rock avalanches involving Cretaceous limestones interbedded with shales. The five largest rock/debris avalanches occurred at Rio Llama (est. vol. 37 M m3), Suytucocha (est. vol., 13.5 Mm3), Quiches (est. vol. 10.5 Mm3 ), Pelagatos (est. vol. 8 Mm3), and Shundoy (est. vol. 8 Mm3). The Suytucocha, Quiches, and Pelagatos landslides were reported by Silgado and Heim. Rock slope failure was most common on slopes with a southwest aspect, an orientation corresponding to the regional dip direction of major planar structures in the Andean foreland belt (bedding planes and thrust faults). In valleys oriented transverse to the NW-SE structural grain of the epicentral area, south-westerly dipping bedding planes combined with orthogonal joint sets to form numerous wedge failures. Many initial rock slope failures were transformed into rock/debris avalanches by the entrainment of colluvium in their path. At Acobamba, a rock avalanche that transformed into a debris avalanche (est. vol. 4.3 Mm3) overwhelmed a village resulting in the deaths of 217 people. The cumulative volume-frequency plot shows a strong power law relation below a marked rollover, similar in form to that derived for landslides triggered by the 1994 Northridge Earthquake. The total volume of the 45 landslides is approximately 93 Mm3. The data point for the Ancash Earthquake plots near the regression line calculated by Keefer (1994), and modified by Malamud et al

  13. A Framework to Assess the Cumulative Hydrological Impacts of Dams on flow Regime

    Science.gov (United States)

    Wang, Y.; Wang, D.

    2016-12-01

    In this study we proposed a framework to assess the cumulative impact of dams on hydrological regime, and the impacts of the Three Gorges Dam on flow regime in Yangtze River were investigated with the framework. We reconstructed the unregulated flow series to compare with the regulated flow series in the same period. Eco-surplus and eco-deficit and the Indicators of Hydrologic Alteration parameters were used to examine the hydrological regime change. Among IHA parameters, Wilcoxon signed-rank test and Principal Components Analysis identified the representative indicators of hydrological alterations. Eco-surplus and eco-deficit showed that the reservoir also changed the seasonal regime of the flows in autumn and winter. Annual extreme flows and October flows changes lead to negative ecological implications downstream from the Three Gorges Dam. Ecological operation for the Three Gorges Dam is necessary to mitigate the negative effects on the river ecosystem in the middle reach of Yangtze River. The framework proposed here could be a robust method to assess the cumulative impacts of reservoir operation.

  14. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication delays, proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments.

  15. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    1999-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication delays, proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments.

  16. El Carreto o Cumulá - Aspidosperma Dugandii Standl

    Directory of Open Access Journals (Sweden)

    Dugand Armando

    1944-03-01

    Full Text Available Common names: Carreto (Atlántico, Bolívar, Magdalena); Cumulá (Cundinamarca, Tolima). According to Dr. Emilio Robledo (Lecciones de Bot. ed. 3, 2: 544. 1939), the name Carreto is also used in Puerto Berrío (Antioquia). The same author (loc. cit.) gives the name Comulá for an undetermined species of Viburnum in Mariquita (Tolima), and J. M. Duque, referring to the same plant and locality (in Bot. Gen. Colomb. 340, 356. 1943), attributes this common name to Aspidosperma ellipticum Rusby. However, the wood samples of Cumulá or Comulá that I have examined, from the Mariquita region - one of which was recently sent to me by the distinguished ichthyologist Mr. Cecil Miles - undoubtedly belong to A. Dugandii Standl. On the other hand, Santiago Cortés (Fl. Colomb. 206. 1898; ed. 2: 239. 1912) cites the Cumulá "of Anapoima and other places on the Magdalena (river)", saying that it belongs to the Leguminosae, but the very brief description this author gives of the wood, "orange-coloured and notable for its density, hardness and resistance to moisture", leads me to believe that it is the same Cumulá recently collected in Tocaima, since that town lies only a few kilometres from Anapoima.

  17. Flexible trigger menu implementation on the Global Trigger for the CMS Level-1 trigger upgrade

    CERN Document Server

    Matsushita, Takashi

    2017-01-01

    The CMS experiment at the Large Hadron Collider (LHC) has continued to explore physics at the high-energy frontier in 2016. The integrated luminosity delivered by the LHC in 2016 was 41 fb^-1 with a peak luminosity of 1.5 × 10^34 cm^-2 s^-1 and peak mean pile-up of about 50, all exceeding the initial estimations for 2016. The CMS experiment has upgraded its hardware-based Level-1 trigger system to maintain its performance for new physics searches and precision measurements at high luminosities. The Global Trigger is the final step of the CMS Level-1 trigger and implements a trigger menu, a set of selection requirements applied to the final list of objects from calorimeter and muon triggers, for reducing the 40 MHz collision rate to 100 kHz. The Global Trigger has been upgraded with state-of-the-art FPGA processors on Advanced Mezzanine Cards with optical links running at 10 GHz in a MicroTCA crate. The powerful processing resources of the upgraded system enable implemen...

  18. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric...

  19. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  20. 7 CFR 42.132 - Determining cumulative sum values.

    Science.gov (United States)

    2010-01-01

    7 CFR 42.132 (Title 7 - Agriculture): Determining cumulative sum values. (a) The parameters for the on-line cumulative sum sampling plans for AQLs... (b) At the beginning of the basic inspection period, the CuSum value is set equal to...
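
    The excerpt above truncates the actual parameters of 42.132, so the sketch below shows only the generic shape of an on-line cumulative-sum (CuSum) criterion for attribute sampling: defects in each sample are accumulated against a per-sample tolerance, and the running total is compared with an acceptance limit. The starting value, tolerance, limit, and reset rule here are illustrative assumptions, not the regulation's values.

      def update_cusum(cusum, defects, subgroup_tolerance, acceptance_limit):
          """Generic one-sided CuSum update (illustrative only; see 7 CFR 42.132
          for the actual parameters and decision rules)."""
          cusum = max(0, cusum + defects - subgroup_tolerance)
          return cusum, cusum > acceptance_limit   # True => running criterion exceeded

      cusum = 0                                    # hypothetical starting value
      for defects in (1, 0, 3, 2, 0, 4):           # defects found in successive samples
          cusum, exceeded = update_cusum(cusum, defects,
                                         subgroup_tolerance=2, acceptance_limit=3)
          print(cusum, exceeded)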

  1. The Algebra of the Cumulative Percent Operation.

    Science.gov (United States)

    Berry, Andrew J.

    2002-01-01

    Discusses how to help students avoid some pervasive reasoning errors in solving cumulative percent problems. Discusses the meaning of "a% + b%," the additive inverse of "a%," and other useful applications. Emphasizes the operational aspect of the cumulative percent concept. (KHR)
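
    Reading the "cumulative percent operation" as the composition of successive percent changes (an interpretation of the article, whose notation is not reproduced in this record), applying a% and then b% is equivalent to a single change of a + b + ab/100, and the inverse of a% is the change that restores the original value:

      def cumulate(a, b):
          """Apply a% and then b%: the equivalent single percent change."""
          return a + b + a * b / 100.0       # +10% then +20% gives +32%, not +30%

      def inverse(a):
          """Percent change that undoes a% under the cumulative operation."""
          return -100.0 * a / (100.0 + a)    # the inverse of +25% is -20%

      print(cumulate(10, 20))                # 32.0
      print(cumulate(25, inverse(25)))       # 0.0, back to the starting value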

  2. How variable is the number of triggered aftershocks?

    Science.gov (United States)

    Marsan, D.; Helmstetter, A.

    2017-07-01

    Aftershock activity depends at first order on the main shock magnitude but also shows important fluctuations between shocks of equal magnitude. We here investigate these fluctuations by quantifying them and relating them to the main shock stress drop and other variables for southern California earthquakes. A method is proposed to count only directly triggered aftershocks, rather than secondary aftershocks (i.e., those triggered by previous aftershocks), and to quantify only the fluctuations going beyond the natural Poisson variability. Testing the method against various model errors allows us to quantify its robustness. It is found that these fluctuations follow a distribution that is well fitted by a lognormal distribution, with a coefficient of variation of about 1.0 to 1.1. A simple model is proposed to relate this observed dependence to main shock stress drop variability.

  3. Event-triggered Kalman-consensus filter for two-target tracking sensor networks.

    Science.gov (United States)

    Su, Housheng; Li, Zhenghao; Ye, Yanyan

    2017-11-01

    This paper is concerned with the problem of event-triggered Kalman-consensus filtering for two-target tracking sensor networks. According to the event-triggered protocol and the mean-square analysis, a suboptimal Kalman gain matrix is derived and a suboptimal event-triggered distributed filter is obtained. Based on the Kalman-consensus filter protocol, all sensors, which depend only on their neighbors' information, can track their corresponding targets. Furthermore, utilizing the Lyapunov method and matrix theory, some sufficient conditions are presented for ensuring the stability of the system. Finally, a simulation example is presented to verify the effectiveness of the proposed event-triggered protocol. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    DEFF Research Database (Denmark)

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496–509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. The proposed goodness-of-fit test procedures are based on the cumulative sums of residuals.

  5. The sign problem in real-time path integral simulations: Using the cumulant action to implement multilevel blocking

    International Nuclear Information System (INIS)

    Mak, C. H.

    2009-01-01

    A practical method to tackle the sign problem in real-time path integral simulations is proposed based on the multilevel blocking idea. The formulation is made possible by using a cumulant expansion of the action, which in addition to addressing the sign problem, provides an unbiased estimator for the action from a statistically noisy sample of real-time paths. The cumulant formulation also allows the analytical gradients of the action to be computed with little extra computational effort, and it can easily be implemented in a massively parallel environment.
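
    For orientation, the cumulant expansion alluded to above rewrites the average of the oscillatory phase factor over sampled real-time paths in terms of cumulants of the action; a schematic form (the paper's truncation order and implementation details are not reproduced here) is

      \ln \langle e^{iS} \rangle \;=\; \sum_{n \ge 1} \frac{i^{\,n}}{n!}\,\kappa_n[S]
        \;=\; i\,\langle S \rangle \;-\; \tfrac{1}{2}\big(\langle S^2 \rangle - \langle S \rangle^2\big)
        \;-\; \tfrac{i}{6}\,\kappa_3[S] \;+\; \cdots

    where \kappa_n[S] is the n-th cumulant of the action over the sampled paths, estimated from the same statistically noisy sample.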

  6. Architecture of a Level 1 Track Trigger for the CMS Experiment

    CERN Document Server

    Heintz, Ulrich

    2010-01-01

    The luminosity goal for the Super-LHC is 10^35 cm^-2 s^-1. At this luminosity the number of proton-proton interactions in each beam crossing will be in the hundreds. This will stress many components of the CMS detector. One system that has to be upgraded is the trigger system. To keep the rate at which the level 1 trigger fires manageable, information from the tracker has to be integrated into the level 1 trigger. Current design proposals foresee tracking detectors that perform on-detector filtering to reject hits from low-momentum particles. In order to build a trigger system, the filtered hit data from different layers and sectors of the tracker will have to be transmitted off the detector and brought together in a logic processor that generates trigger tracks within the time window allowed by the level 1 trigger latency. I will describe a possible architecture for the off-detector logic that accomplishes this goal.

  7. Cumulative t-link threshold models for the genetic analysis of calving ease scores

    Directory of Open Access Journals (Sweden)

    Tempelman Robert J

    2003-09-01

    Full Text Available In this study, a hierarchical threshold mixed model based on a cumulative t-link specification for the analysis of ordinal data or, more specifically, calving ease scores, was developed. The validation of this model and the Markov chain Monte Carlo (MCMC) algorithm was carried out on simulated data from normally and t4 (i.e., a t-distribution with four degrees of freedom) distributed populations using the deviance information criterion (DIC) and a pseudo Bayes factor (PBF) measure to validate recently proposed model choice criteria. The simulation study indicated that although inference on the degrees of freedom parameter is possible, MCMC mixing was problematic. Nevertheless, the DIC and PBF were validated to be satisfactory measures of model fit to data. A sire and maternal grandsire cumulative t-link model was applied to a calving ease dataset from 8847 Italian Piemontese first parity dams. The cumulative t-link model was shown to lead to posterior means of direct and maternal heritabilities (0.40 ± 0.06, 0.11 ± 0.04) and a direct-maternal genetic correlation (-0.58 ± 0.15) that were not different from the corresponding posterior means of the heritabilities (0.42 ± 0.07, 0.14 ± 0.04) and the genetic correlation (-0.55 ± 0.14) inferred under the conventional cumulative probit link threshold model. Furthermore, the correlation (> 0.99) between posterior means of sire progeny merit from the two models suggested no meaningful rerankings. Nevertheless, the cumulative t-link model was decisively chosen as the better fitting model for this calving ease data using DIC and PBF.
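
    In generic form (a sketch consistent with, but not copied from, the model above), a cumulative t-link threshold model for an ordinal calving ease score Y_i with categories 1, ..., J specifies

      \Pr(Y_i \le j \mid \eta_i) \;=\; F_{\nu}\!\left(\tau_j - \eta_i\right), \qquad j = 1, \ldots, J-1,

    where F_{\nu} is the cumulative distribution function of a Student-t variable with \nu degrees of freedom, \tau_1 < \cdots < \tau_{J-1} are ordered thresholds, and \eta_i collects the fixed effects and the sire and maternal grandsire effects for record i. Letting \nu \to \infty recovers the conventional cumulative probit link as a limiting case.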

  8. EXAFS cumulants of CdSe

    International Nuclear Information System (INIS)

    Diop, D.

    1997-04-01

    EXAFS functions were extracted from measurements on the K edge of Se at different temperatures between 20 and 300 K. The analysis of the EXAFS of the filtered first two shells was done in the wavevector range lying between 2 and 15.5 Å^-1 in terms of the cumulants of the effective distribution of distances. The cumulants C3 and C4, obtained from the phase-difference and amplitude-ratio methods, show the anharmonicity of the vibrations of the atoms around their equilibrium positions. (author). 13 refs, 3 figs
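
    The phase-difference and amplitude-ratio analysis mentioned above rests on the standard cumulant expansion of the EXAFS signal relative to a reference shell; in the usual convention (stated here only as a reminder, and possibly differing in sign or normalization from the paper's),

      \Phi(k) - \Phi_{\mathrm{ref}}(k) \;\approx\; 2k\,\Delta C_1 \;-\; \tfrac{4}{3}\,k^{3}\,\Delta C_3,
      \qquad
      \ln\!\frac{A(k)}{A_{\mathrm{ref}}(k)} \;\approx\; \Delta C_0 \;-\; 2k^{2}\,\Delta C_2 \;+\; \tfrac{2}{3}\,k^{4}\,\Delta C_4,

    so the odd cumulants (C_1, C_3) follow from the phase difference and the even cumulants (C_2, C_4) from the logarithm of the amplitude ratio.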

  9. A programmable systolic trigger processor for FERA bus data

    International Nuclear Information System (INIS)

    Appelquist, G.; Hovander, B.; Sellden, B.; Bohm, C.

    1992-09-01

    A generic CAMAC based trigger processor module for fast processing of large amounts of ADC data, has been designed. This module has been realised using complex programmable gate arrays (LCAs from XILINX). The gate arrays have been connected to memories and multipliers in such a way that different gate array configurations can cover a wide range of module applications. Using this module, it is possible to construct complex trigger processors. The module uses both the fast ECL FERA bus and the CAMAC bus for inputs and outputs. The latter, however, is primarily used for set-up and control but may also be used for data output. Large numbers of ADCs can be served by a hierarchical arrangement of trigger processor modules, processing ADC data with pipe-line arithmetics producing the final result at the apex of the pyramid. The trigger decision will be transmitted to the data acquisition system via a logic signal while numeric results may be extracted by the CAMAC controller. The trigger processor was originally developed for the proposed neutral particle search experiment at CERN, NUMASS. There it was designed to serve as a second level trigger processor. It was required to correct all ADC raw data for efficiency and pedestal, calculate the total calorimeter energy, obtain the optimal time of flight data and calculate the particle mass. A suitable mass cut would then deliver the trigger decision. More complex triggers were also considered. (au)

  10. TRIGGER

    CERN Multimedia

    W. Smith

    At the December meeting, the CMS trigger group reported on progress in production, tests in the Electronics Integration Center (EIC) in Prevessin 904, progress on trigger installation in the underground counting room at point 5, USC55, and results from the Magnet Test and Cosmic Challenge (MTCC) phase II. The trigger group is engaged in the final stages of production testing, systems integration, and software and firmware development. Most systems are delivering final tested electronics to CERN. The installation in USC55 is underway and moving towards integration testing. A program of orderly connection and checkout with subsystems and central systems has been developed. This program includes a series of vertical subsystem slice tests providing validation of a portion of each subsystem from front-end electronics through the trigger and DAQ to data captured and stored. This is combined with operations and testing without beam that will continue until startup. The plans for start-up, pilot and early running tri...

  11. Study of cumulative fatigue damage detection for used parts with nonlinear output frequency response functions based on NARMAX modelling

    Science.gov (United States)

    Huang, Honglan; Mao, Hanying; Mao, Hanling; Zheng, Weixue; Huang, Zhenfeng; Li, Xinxin; Wang, Xianghong

    2017-12-01

    Cumulative fatigue damage detection for used parts plays a key role in the process of remanufacturing engineering and is related to the service safety of the remanufactured parts. In light of the nonlinear properties of used parts caused by cumulative fatigue damage, the nonlinear output frequency response functions (NOFRFs) based detection approach offers a breakthrough to solve this key problem. First, a modified PSO-adaptive lasso algorithm is introduced to improve the accuracy of the NARMAX model under impulse hammer excitation; then an effective new algorithm is derived to estimate the nonlinear output frequency response functions under rectangular pulse excitation, and an NOFRFs-based index is introduced to detect the cumulative fatigue damage in used parts. A novel damage detection approach that integrates the NARMAX model and the rectangular pulse is thus proposed for NOFRFs identification and cumulative fatigue damage detection of used parts. Finally, experimental studies of fatigued plate specimens and used connecting rod parts are conducted to verify the validity of the novel approach. The obtained results reveal that the new approach can detect cumulative fatigue damage of used parts effectively and efficiently, and that the various values of the NOFRFs-based index can be used to distinguish different fatigue damages or working times. Since the proposed new approach can extract the nonlinear properties of a system from only a single excitation of the inspected system, it shows great promise for use in remanufacturing engineering applications.

  12. New tests of cumulative prospect theory and the priority heuristic

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-04-01

    Full Text Available Previous tests of cumulative prospect theory (CPT) and of the priority heuristic (PH) found evidence contradicting these two models of risky decision making. However, those tests were criticized because they had characteristics that might "trigger" use of other heuristics. This paper presents new tests that avoid those characteristics. Expected values of the gambles are nearly equal in each choice. In addition, if a person followed expected value (EV), expected utility (EU), CPT, or PH in these tests, she would shift her preferences in the same direction as shifts in EV or EU. In contrast, the transfer of attention exchange model (TAX) and a similarity model predict that people will reverse preferences in the opposite direction. Results contradict the PH, even when PH is modified to include a preliminary similarity evaluation using the PH parameters. New tests of probability-consequence interaction were also conducted. Strong interactions were observed, contrary to PH. These results add to the growing bodies of evidence showing that neither CPT nor PH is an accurate description of risky decision making.

  13. Synchronization of Switched Neural Networks With Communication Delays via the Event-Triggered Control.

    Science.gov (United States)

    Wen, Shiping; Zeng, Zhigang; Chen, Michael Z Q; Huang, Tingwen

    2017-10-01

    This paper addresses the issue of synchronization of switched delayed neural networks with communication delays via event-triggered control. For synchronizing coupled switched neural networks, we propose a novel event-triggered control law which can greatly reduce the number of control updates for synchronization tasks of coupled switched neural networks involving embedded microprocessors with limited on-board resources. The control signals are driven by properly defined events, which depend on the measurement errors and the currently sampled states. By using a delay system method, a novel model of the synchronization error system with delays is proposed that treats the communication delays and the event-triggered control in a unified framework for coupled switched neural networks. Criteria are derived for the event-triggered synchronization analysis and control synthesis of switched neural networks via the Lyapunov-Krasovskii functional method and the free weighting matrix approach. A numerical example is elaborated on to illustrate the effectiveness of the derived results.
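    A minimal, hedged sketch of the event-triggering mechanism described above (a generic single-agent state-feedback loop rather than the paper's coupled switched neural networks; the plant, gain and threshold values are assumptions for illustration): the control input is only recomputed when the error accumulated since the last event exceeds a state-dependent threshold.

        # Illustrative event-triggered state-feedback loop: recompute the control
        # only when ||x_event - x|| > sigma * ||x||, otherwise hold the last value.
        import numpy as np

        A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # assumed plant dynamics  x' = Ax + Bu
        B = np.array([[0.0], [1.0]])
        K = np.array([[-1.0, -1.5]])               # assumed stabilizing feedback gain
        sigma, dt, T = 0.2, 1e-3, 10.0             # trigger parameter, step, horizon

        x = np.array([1.0, -0.5])
        x_event = x.copy()                         # state sampled at the last event
        events = 0
        for step in range(int(T / dt)):
            e = x_event - x                        # measurement error since last event
            if np.linalg.norm(e) > sigma * np.linalg.norm(x):   # event-trigger condition
                x_event = x.copy()                 # broadcast / sample the state
                events += 1
            u = K @ x_event                        # control held constant between events
            x = x + dt * (A @ x + B @ u).ravel()   # Euler step of the closed loop
        print(f"control updates: {events} out of {int(T / dt)} steps")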

  14. Cumulative effects in strategic environmental assessment: The influence of plan boundaries

    Energy Technology Data Exchange (ETDEWEB)

    Bidstrup, Morten, E-mail: bidstrup@plan.aau.dk [Aalborg University (Denmark); Kørnøv, Lone, E-mail: lonek@plan.aau.dk [Aalborg University (Denmark); Partidário, Maria Rosário, E-mail: mariapartidario@tecnico.ulisboa.pt [CEG-IST, Instituto Superior Técnico, Universidade de Lisboa (Portugal)

    2016-02-15

    Cumulative effects (CE) assessment is lacking quality in impact assessment (IA) worldwide. It has been argued that the strategic environmental assessment (SEA) provides a suitable IA framework for addressing CE because it is applied to developments with broad boundaries, but few have tested this claim. Through a case study on the Danish mining sector, this article explores how plan boundaries influence the analytical boundaries applied for assessing CE in SEA. The case was studied through document analysis in combination with semi-structured group interviews of the responsible planners, who also serve as SEA practitioners. It was found that CE are to some extent assessed and managed implicitly throughout the planning process. However, this is through a focus on lowering the cumulative stress of mining rather than the cumulative stress on and capacity of the receiving environment. Plan boundaries do influence CE assessment, though all boundaries are not equally influential. The geographical and time boundaries of the Danish mining plans are broad or flexible enough to accommodate a meaningful assessment of CE, but the topical boundary is restrictive. The study indicates that collaboration among planning authorities and legally appointed CE leadership may facilitate better practice on CE assessment in sector-specific SEA contexts. However, most pressing is the need for relating assessment to the receiving environment as opposed to solely the stress of a proposed plan.

  15. Cumulative effects in strategic environmental assessment: The influence of plan boundaries

    International Nuclear Information System (INIS)

    Bidstrup, Morten; Kørnøv, Lone; Partidário, Maria Rosário

    2016-01-01

    Cumulative effects (CE) assessment is lacking quality in impact assessment (IA) worldwide. It has been argued that the strategic environmental assessment (SEA) provides a suitable IA framework for addressing CE because it is applied to developments with broad boundaries, but few have tested this claim. Through a case study on the Danish mining sector, this article explores how plan boundaries influence the analytical boundaries applied for assessing CE in SEA. The case was studied through document analysis in combination with semi-structured group interviews of the responsible planners, who also serve as SEA practitioners. It was found that CE are to some extent assessed and managed implicitly throughout the planning process. However, this is through a focus on lowering the cumulative stress of mining rather than the cumulative stress on and capacity of the receiving environment. Plan boundaries do influence CE assessment, though all boundaries are not equally influential. The geographical and time boundaries of the Danish mining plans are broad or flexible enough to accommodate a meaningful assessment of CE, but the topical boundary is restrictive. The study indicates that collaboration among planning authorities and legally appointed CE leadership may facilitate better practice on CE assessment in sector-specific SEA contexts. However, most pressing is the need for relating assessment to the receiving environment as opposed to solely the stress of a proposed plan.

  16. 76 FR 81490 - Agency Information Collection Activities; Proposed Collection; Comment Request; Contractor...

    Science.gov (United States)

    2011-12-28

    ... Activities; Proposed Collection; Comment Request; Contractor Cumulative Claim and Reconciliation (Renewal... identify the Docket ID Number EPA-HQ-OARM-2011-0997, Contractor Cumulative Claim and Reconciliation. Hand... information collection activity or ICR does this apply to? Affected entities: All contractors who have...

  17. BAT Triggering Performance

    Science.gov (United States)

    McLean, Kassandra M.; Fenimore, E. E.; Palmer, D. M.; BAT Team

    2006-09-01

    The Burst Alert Telescope (BAT) onboard Swift has detected and located about 160 gamma-ray bursts (GRBs) in its first twenty months of operation. BAT employs two triggering systems to find GRBs: image triggering, which looks for a new point source in the field of view, and rate triggering, which looks for a significant increase in the observed counts. The image triggering system looks at 1 minute, 5 minute, and full-pointing accumulations of counts in the detector plane in the energy range of 15-50 keV, with about 50 evaluations per pointing (about 40 minutes). The rate triggering system looks through 13 different time scales (from 4 ms to 32 s), 4 overlapping energy bins (covering 15-350 keV), 9 regions of the detector plane (from the full plane to individual quarters), and two background sampling models to search for GRBs; it performs about 27000 trigger-criterion evaluations per second over close to 1000 distinct criteria. Both triggering systems have been working very well with the settings chosen before launch and applied when BAT was turned on. However, we now have more than a year and a half of data, as well as lessons learned from these triggering systems, with which to evaluate them and tweak them for optimal performance.
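    The rate-trigger logic can be caricatured as follows; the window length, background level and significance threshold below are placeholders, not BAT's flight settings.

        # Toy rate trigger: flag a time window whose counts significantly exceed the
        # background expectation (simple Gaussian significance; BAT's real criteria
        # span many timescales, energy bands, detector regions and background models).
        import numpy as np

        def rate_trigger(counts, background, threshold_sigma=6.0):
            """counts: observed counts in a window; background: expected counts."""
            sigma = (counts - background) / np.sqrt(background)
            return sigma, sigma > threshold_sigma

        rng = np.random.default_rng(1)
        bkg_rate = 8000.0                      # counts per window (assumed)
        windows = rng.poisson(bkg_rate, size=1000).astype(float)
        windows[500] += 1200.0                 # inject a burst-like excess
        for i, c in enumerate(windows):
            s, fired = rate_trigger(c, bkg_rate)
            if fired:
                print(f"window {i}: {s:.1f} sigma excess -> trigger")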

  18. CEAMF study, volume 2 : cumulative effects indicators, thresholds, and case studies : final

    International Nuclear Information System (INIS)

    2003-03-01

    The four types of cumulative effects on the environment are: alteration, loss, and fragmentation of habitat; disturbance; barriers to movement; and direct and indirect mortality. Defining where and how human activities can be continued without irreversible net harm to the environment is part of cumulative effects management. Various land-use and habitat indicators were tested in the Blueberry and Sukunka study areas of British Columbia, to address the environmental effects associated with oil and gas development. As recommended, a tiered threshold approach was used to allow for flexibility in different land management regimes and ecological settings. Success will depend on defining acceptable change, threshold values, standard public database, standard processes to calculate indicator values using the database, and project-specific and cooperative management actions. A pilot study was suggested to test the candidate thresholds and implementation process. The two areas proposed for consideration were the Jedney Enhanced Resource Development Resource Management Zone in the Fort St. John Forest District, and the Etsho Enhanced Resource Development Resource Management Zone in the Fort Nelson Forest District. Both are of interest to the petroleum and forest sectors, and support the woodland caribou, a species which is extremely sensitive to cumulative effects of habitat fragmentation and disturbance. 117 refs., 11 tabs., 39 figs.

  19. Detecting spatial patterns with the cumulant function – Part 1: The theory

    Directory of Open Access Journals (Sweden)

    P. Naveau

    2008-02-01

    Full Text Available In climate studies, detecting spatial patterns that largely deviate from the sample mean still remains a statistical challenge. Although a Principal Component Analysis (PCA), or equivalently an Empirical Orthogonal Functions (EOF) decomposition, is often applied for this purpose, it provides meaningful results only if the underlying multivariate distribution is Gaussian. Indeed, PCA is based on optimizing second-order moments, and the covariance matrix captures the full dependence structure of multivariate Gaussian vectors. Whenever the application at hand cannot satisfy this normality hypothesis (e.g. precipitation data), alternatives and/or improvements to PCA have to be developed and studied. To go beyond this second-order statistics constraint, which limits the applicability of PCA, we take advantage of the cumulant function, which carries higher-order moment information. The cumulant function, well known in the statistical literature, allows us to propose a new, simple and fast procedure to identify spatial patterns for non-Gaussian data. Our algorithm consists of maximizing the cumulant function. Three families of multivariate random vectors, for which explicit computations are obtained, are implemented to illustrate our approach. In addition, we show that our algorithm corresponds to selecting the directions along which the projected data display the largest spread over the marginal probability density tails.
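    A small numerical sketch of the idea (a random search over unit directions and a single fixed t, which is a simplification of the authors' procedure): evaluate the empirical cumulant generating function of the projected data and keep the direction that maximizes it, so heavy-tailed directions are favoured over what plain PCA would select.

        # Sketch: choose the projection direction that maximizes the empirical cumulant
        # generating function K(t) = log E[exp(t * <u, X>)] at a fixed t, instead of the
        # variance used by PCA. Random search over unit vectors keeps the example short.
        import numpy as np

        def empirical_cgf(proj, t):
            return np.log(np.mean(np.exp(t * proj)))

        rng = np.random.default_rng(2)
        n, d = 5000, 5
        X = rng.normal(size=(n, d))
        X[:, 2] += rng.gamma(shape=1.0, scale=2.0, size=n)   # one heavy-tailed, skewed coordinate
        X -= X.mean(axis=0)

        t = 0.5
        best_u, best_val = None, -np.inf
        for _ in range(2000):
            u = rng.normal(size=d)
            u /= np.linalg.norm(u)
            val = empirical_cgf(X @ u, t)
            if val > best_val:
                best_u, best_val = u, val
        print("direction maximizing the cumulant function:", np.round(best_u, 2))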

  20. A Fast Hardware Tracker for the ATLAS Trigger System

    CERN Document Server

    Neubauer, Mark S

    2011-01-01

    In hadron collider experiments, triggering the detector to store interesting events for offline analysis is a challenge due to the high rates and multiplicities of particles produced. Maintaining high trigger efficiency for the physics we are most interested in while at the same time suppressing high rate physics from inclusive QCD processes is a difficult but important problem. It is essential that the trigger system be flexible and robust, with sufficient redundancy and operating margin. Providing high quality track reconstruction over the full ATLAS detector by the start of processing at LVL2 is an important element to achieve these needs. As the instantaneous luminosity increases, the computational load on the LVL2 system will significantly increase due to the need for more sophisticated algorithms to suppress backgrounds. The Fast Tracker (FTK) is a proposed upgrade to the ATLAS trigger system. It is designed to enable early rejection of background events and thus leave more LVL2 execution time by moving...

  1. TRIGGER

    CERN Multimedia

    W. Smith

    Level-1 Trigger Hardware The CERN group is working on the TTC system. Seven out of nine sub-detector TTC VME crates with all fibers cabled are installed in USC55. 17 Local Trigger Controller (LTC) boards have been received from production and are in the process of being tested. The RF2TTC module replacing the TTCmi machine interface has been delivered and will replace the TTCci module used to mimic the LHC clock. 11 out of 12 crates housing the barrel ECAL off-detector electronics have been installed in USC55 after commissioning at the Electronics Integration Centre in building 904. The cabling to the Regional Calorimeter Trigger (RCT) is terminated. The Lisbon group has completed the Synchronization and Link mezzanine board (SLB) production. The Palaiseau group has fully tested and installed 33 out of 40 Trigger Concentrator Cards (TCC). The seven remaining boards are being remade. The barrel TCC boards have been tested at the H4 test beam, and good agreement with emulator predictions were found. The cons...

  2. A Cross-Layer User Centric Vertical Handover Decision Approach Based on MIH Local Triggers

    Science.gov (United States)

    Rehan, Maaz; Yousaf, Muhammad; Qayyum, Amir; Malik, Shahzad

    Vertical handover decision algorithms that are based on user preferences and coupled with Media Independent Handover (MIH) local triggers have not been explored much in the literature. We have developed a comprehensive cross-layer solution, called the Vertical Handover Decision (VHOD) approach, which consists of three parts, viz. a mechanism for collecting and storing user preferences, the Vertical Handover Decision (VHOD) algorithm, and the MIH Function (MIHF). The MIHF triggers the VHOD algorithm, which operates on user preferences to issue handover commands to the mobility management protocol. The VHOD algorithm is an MIH User and therefore needs to subscribe to events and configure thresholds for receiving triggers from the MIHF. In this regard, we have performed experiments in WLAN to suggest thresholds for the Link Going Down trigger. We have also critically evaluated the handover decision process, proposed a just-in-time interface activation technique, compared our proposed approach with prominent user-centric approaches, and analyzed our approach from different aspects.

  3. The ALICE Central Trigger Processor (CTP) upgrade

    International Nuclear Information System (INIS)

    Krivda, M.; Alexandre, D.; Barnby, L.S.; Evans, D.; Jones, P.G.; Jusko, A.; Lietava, R.; Baillie, O. Villalobos; Pospíšil, J.

    2016-01-01

    The ALICE Central Trigger Processor (CTP) at the CERN LHC has been upgraded for LHC Run 2, to improve the Transition Radiation Detector (TRD) data-taking efficiency and to improve the physics performance of ALICE. The upgrade adds a new CTP interaction record sent over a new, second Detector Data Link (DDL), a 2 GB DDR3 memory, and an extension of the functionality of the trigger classes. The CTP switch has been incorporated directly onto the new LM0 board. A design proposal for an ALICE CTP upgrade for LHC Run 3 is also presented. Part of the development is a low-latency, high-bandwidth interface whose purpose is to minimize the overall trigger latency

  4. Cumulative Student Loan Debt in Minnesota, 2015

    Science.gov (United States)

    Williams-Wyche, Shaun

    2016-01-01

    To better understand student debt in Minnesota, the Minnesota Office of Higher Education (the Office) gathers information on cumulative student loan debt from Minnesota degree-granting institutions. These data detail the number of students with loans by institution, the cumulative student loan debt incurred at that institution, and the percentage…

  5. New high-energy phenomena in aircraft triggered lightning

    NARCIS (Netherlands)

    van Deursen, A.P.J.; Kochkin, P.; de Boer, A.; Bardet, M.; Boissin, J.F.

    2016-01-01

    High-energy phenomena associated with lightning were proposed in the twenties, observed for the first time in the sixties, and further investigated more recently by e.g. rocket-triggered lightning. Similarly, x-rays have been detected in meter-long discharges in air at standard atmospheric pressure.

  6. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
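    A compact illustration of the statistics question raised here, using toy Poisson "events" rather than conserved-charge data: compute the first four cumulants of an event-by-event sample and estimate their statistical uncertainties with a bootstrap. The distribution and sample sizes are arbitrary choices for the sketch.

        # Cumulants C1..C4 of an event-by-event distribution and bootstrap errors,
        # illustrating how the estimated uncertainty of a high cumulant fluctuates
        # with limited statistics (for Poisson data all four cumulants equal lambda).
        import numpy as np

        def cumulants(x):
            m = x.mean()
            d = x - m
            c2 = np.mean(d**2)
            c3 = np.mean(d**3)
            c4 = np.mean(d**4) - 3.0 * c2**2
            return np.array([m, c2, c3, c4])

        def bootstrap_errors(x, n_boot=200, rng=None):
            rng = rng or np.random.default_rng()
            reps = np.array([cumulants(rng.choice(x, size=x.size, replace=True))
                             for _ in range(n_boot)])
            return reps.std(axis=0)

        rng = np.random.default_rng(3)
        sample = rng.poisson(10.0, size=100_000).astype(float)
        print("C1..C4        :", np.round(cumulants(sample), 3))
        print("bootstrap err :", np.round(bootstrap_errors(sample, rng=rng), 3))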

  7. NOMAD Trigger Studies

    International Nuclear Information System (INIS)

    Varvell, K.

    1995-01-01

    The author reports on the status of an offline study of the NOMAD triggers, which has several motivations. Of primary importance is to demonstrate, using offline information recorded by the individual subdetectors comprising NOMAD, that the online trigger system is functioning as expected. Such an investigation serves to complement the extensive monitoring which is already carried out online. More specific to the needs of the offline software and analysis, the reconstruction of tracks and vertices in the detector requires some knowledge of the time at which the trigger has occurred, in order to locate relevant hits in the drift chambers and muon chambers in particular. The fact that the different triggers allowed by the MIOTRINO board take varying times to form complicates this task. An offline trigger algorithm may serve as a tool to shed light on situations where the online trigger status bits have not been recorded correctly, as happens in a small number of cases, or as an aid to studies with the aim of further refinement of the online triggers themselves

  8. Towards a Level-1 tracking trigger for the ATLAS experiment at the High Luminosity LHC

    CERN Document Server

    Martin, T A D; The ATLAS collaboration

    2014-01-01

    At the high luminosity HL-LHC, upwards of 160 individual proton-proton interactions (pileup) are expected per bunch-crossing at luminosities of around $5\times10^{34}$ cm$^{-2}$s$^{-1}$. A proposal by the ATLAS collaboration to split the ATLAS first-level trigger into two stages is briefly detailed. The use of fast track finding in the new first-level trigger is explored as a method to provide the discrimination required to reduce the event rate to acceptable levels for the read-out system while maintaining high efficiency for the selection of the decay products of electroweak bosons at HL-LHC luminosities. It is shown that the available bandwidth in the proposed new strip tracker is sufficient for a region-of-interest based track trigger given certain optimisations; further methods for improving upon the proposal are discussed.

  9. Cumulative stress and autonomic dysregulation in a community sample.

    Science.gov (United States)

    Lampert, Rachel; Tuit, Keri; Hong, Kwang-Ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita

    2016-05-01

    Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status which predicts poor cardiovascular outcomes, is unknown. Healthy community dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-h ambulatory ECG monitoring. HRV was analyzed in the frequency domain and the standard deviation of NN intervals (SDNN) calculated. Initial simple regression analyses revealed that the total cumulative stress score, chronic stressors and cumulative adverse life events (CALE) were all inversely associated with ultra low-frequency (ULF), very low-frequency (VLF) and low-frequency (LF) power and SDNN (all p accounting for additional appreciable variance. For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies including stress management may prove beneficial.
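    For reference, the HRV measures named in this abstract can be computed from a series of NN intervals roughly as sketched below; the resampling rate, band edges and Welch settings are conventional choices rather than those of the study, and ULF is omitted because it requires 24-h records.

        # Rough sketch of time- and frequency-domain HRV measures from NN intervals:
        # SDNN is the standard deviation of NN intervals; VLF/LF/HF band powers come
        # from a Welch spectrum of the evenly resampled NN series (conventional bands).
        import numpy as np
        from scipy.signal import welch

        def hrv_measures(nn_ms, fs=4.0):
            t = np.cumsum(nn_ms) / 1000.0                    # beat times in seconds
            grid = np.arange(t[0], t[-1], 1.0 / fs)          # evenly spaced time grid
            nn_interp = np.interp(grid, t, nn_ms)
            f, pxx = welch(nn_interp - nn_interp.mean(), fs=fs, nperseg=min(4096, grid.size))
            def band(lo, hi):
                sel = (f >= lo) & (f < hi)
                return np.trapz(pxx[sel], f[sel])
            return {"SDNN": nn_ms.std(),
                    "VLF": band(0.0033, 0.04), "LF": band(0.04, 0.15), "HF": band(0.15, 0.40)}

        rng = np.random.default_rng(4)
        nn = 800 + 50 * rng.standard_normal(20_000)          # synthetic NN intervals (ms)
        print(hrv_measures(nn))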

  10. Cumulative processes and quark distribution in nuclei

    International Nuclear Information System (INIS)

    Kondratyuk, L.; Shmatikov, M.

    1984-01-01

    Assuming the existence of multiquark (mainly 12q) bags in nuclei, the spectra of cumulative nucleons and mesons produced in high-energy particle-nucleus collisions are discussed. The exponential form of the quark momentum distribution in the 12q bag (agreeing well with the experimental data on lepton-nucleus interactions at large q²) is shown to result in a quasi-exponential distribution of cumulative particles over the light-cone variable α_B. The dependence of f(α_B; p_⊥) (where p_⊥ is the transverse momentum of the bag) upon p_⊥ is considered. The yields of cumulative resonances, as well as effects related to the u- and d-quark distributions in N > Z nuclei being different, are discussed

  11. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    Cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling.

  12. Decision analysis with cumulative prospect theory.

    Science.gov (United States)

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
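    To make the idea of decision weights concrete, here is a small sketch of a cumulative-prospect-theory valuation for gains using the Tversky-Kahneman weighting function; the parameter values and the two-outcome gamble are illustrative and unrelated to the article's Markov model.

        # Cumulative prospect theory value of a simple gamble over gains:
        # outcomes are ranked best-first, decision weights are differences of a
        # transformed cumulative probability w(p), and the value function is x**alpha.
        import numpy as np

        def w(p, gamma=0.61):
            """Tversky-Kahneman probability weighting for gains."""
            return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

        def cpt_value(outcomes, probs, alpha=0.88, gamma=0.61):
            """CPT value of a gamble with non-negative outcomes (gains only)."""
            order = np.argsort(outcomes)[::-1]               # best outcome first
            x = np.asarray(outcomes, float)[order]
            p = np.asarray(probs, float)[order]
            cum = np.cumsum(p)
            weights = w(cum, gamma) - w(np.concatenate(([0.0], cum[:-1])), gamma)
            return np.sum(weights * x**alpha)

        # expected value vs CPT value of a 10% chance of 100, else 0
        print("EV :", 0.1 * 100)
        print("CPT:", round(cpt_value([100.0, 0.0], [0.1, 0.9]), 2))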

  13. Study on current limiting characteristics of SFCL with two trigger current levels

    International Nuclear Information System (INIS)

    Lim, S.H.

    2010-01-01

    In this paper, a superconducting fault current limiter (SFCL) with two trigger current levels is suggested and its effectiveness is described through an analysis of its current limiting characteristics. The proposed SFCL, which consists of a triggering and a limiting component, can limit the fault current by generating the limiting impedance in two steps according to the amplitude of the initial fault current. When a fault happens, a lower initial fault current causes only the superconducting element of the triggering component to be quenched. On the other hand, a higher initial fault current causes the superconducting elements of both the triggering and the limiting components of the SFCL to be quenched, which contributes a higher SFCL impedance. Therefore, effective fault current limiting can be performed by generating the SFCL's impedance in proportion to the amplitude of the initial fault current. To confirm the current limiting operation of the proposed SFCL, short-circuit tests of the SFCL for different fault angles were carried out and its effective fault current limiting operations are discussed.

  14. Triggering at high luminosity: fake triggers from pile-up

    International Nuclear Information System (INIS)

    Johnson, R.

    1983-01-01

    Triggers based on a cut in transverse momentum (p_t) have proved to be useful in high energy physics, both because they indicate that a hard constituent scattering has occurred and because they can be made quickly enough to gate electronics. These triggers will continue to be useful at high luminosities if overlapping events do not cause an excessive number of fake triggers. In this paper, I determine whether this is indeed a problem at high luminosity machines

  15. Original and cumulative prospect theory: a discussion of empirical differences

    NARCIS (Netherlands)

    Wakker, P.P.; Fennema, H.

    1997-01-01

    This note discusses differences between prospect theory and cumulative prospect theory. It shows that cumulative prospect theory is not merely a formal correction of some theoretical problems in prospect theory, but it also gives different predictions. Experiments are described that favor cumulative prospect theory.

  16. Cumulative radiation dose of multiple trauma patients during their hospitalization

    International Nuclear Information System (INIS)

    Wang Zhikang; Sun Jianzhong; Zhao Zudan

    2012-01-01

    Objective: To study the cumulative radiation dose of multiple trauma patients during their hospitalization and to analyze the factors influencing dose. Methods: The DLP for CT and the dose data for DR were retrospectively collected from patients treated between June 2009 and April 2011 at a university-affiliated hospital. The cumulative radiation doses were calculated by summing typical effective doses of the anatomic regions scanned. Results: The cumulative radiation doses of 113 patients were collected. The maximum, minimum, and mean cumulative effective doses were 153.3 mSv, 16.48 mSv, and (52.3 ± 26.6) mSv, respectively. Conclusions: Multiple trauma patients have high cumulative radiation exposure. Therefore, the management of cumulative radiation doses should be enhanced. Establishing individualized radiation exposure archives will help clinicians and technicians decide whether to image again and how to select the imaging parameters. (authors)
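    The dose bookkeeping described in the Methods can be sketched as below; the DLP-to-effective-dose conversion factors and radiograph doses are representative adult values assumed for illustration, not the coefficients used in this study.

        # Sketch of cumulative effective dose bookkeeping: each CT study contributes
        # DLP (mGy*cm) times a region-specific conversion factor k (mSv per mGy*cm),
        # each radiograph contributes a typical effective dose. All coefficients below
        # are illustrative assumptions, not the values applied in the cited study.
        K_FACTOR = {"head": 0.0021, "chest": 0.014, "abdomen": 0.015, "pelvis": 0.015}
        RADIOGRAPH_DOSE = {"chest_dr": 0.02, "abdomen_dr": 0.7}   # mSv, assumed typical values

        def cumulative_effective_dose(ct_studies, radiographs):
            """ct_studies: list of (region, DLP); radiographs: list of exam names."""
            ct = sum(K_FACTOR[region] * dlp for region, dlp in ct_studies)
            dr = sum(RADIOGRAPH_DOSE[name] for name in radiographs)
            return ct + dr

        exams_ct = [("head", 950.0), ("chest", 400.0), ("abdomen", 600.0)]
        exams_dr = ["chest_dr", "chest_dr", "abdomen_dr"]
        print(f"cumulative effective dose: {cumulative_effective_dose(exams_ct, exams_dr):.1f} mSv")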

  17. Perspectives on cumulative risks and impacts.

    Science.gov (United States)

    Faust, John B

    2010-01-01

    Cumulative risks and impacts have taken on different meanings in different regulatory and programmatic contexts at federal and state government levels. Traditional risk assessment methodologies, with considerable limitations, can provide a framework for the evaluation of cumulative risks from chemicals. Under an environmental justice program in California, cumulative impacts are defined to include exposures, public health effects, or environmental effects in a geographic area from the emission or discharge of environmental pollution from all sources, through all media. Furthermore, the evaluation of these effects should take into account sensitive populations and socioeconomic factors where possible and to the extent data are available. Key aspects to this potential approach include the consideration of exposures (versus risk), socioeconomic factors, the geographic or community-level assessment scale, and the inclusion of not only health effects but also environmental effects as contributors to impact. Assessments of this type extend the boundaries of the types of information that toxicologists generally provide for risk management decisions.

  18. Cumulative particle production in the quark recombination model

    International Nuclear Information System (INIS)

    Gavrilov, V.B.; Leksin, G.A.

    1987-01-01

    Production of cumulative particles in hadron-nuclear interactions at high energies is considered within the framework of the recombination quark model. Predictions are made for the inclusive cross sections for the production of cumulative particles and of different resonances containing quarks in the s state

  19. ELM mitigation with pellet ELM triggering and implications for PFCs and plasma performance in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Baylor, Larry R. [ORNL; Lang, P. [EURATOM / UKAEA, Abingdon, UK; Allen, S. L. [Lawrence Livermore National Laboratory (LLNL); Lasnier, C. J. [Lawrence Livermore National Laboratory (LLNL); Meitner, Steven J. [ORNL; Combs, Stephen Kirk [ORNL; Commaux, Nicolas JC [ORNL; Loarte, A. [ITER Organization, Cadarache, France; Jernigan, Thomas C. [ORNL

    2015-08-01

    The triggering of rapid small edge localized modes (ELMs) by high frequency pellet injection has been proposed as a method to prevent large naturally occurring ELMs that can erode the ITER plasma facing components (PFCs). Deuterium pellet injection has been used to successfully demonstrate the on-demand triggering of edge localized modes (ELMs) at much higher rates and with much smaller intensity than natural ELMs. The proposed hypothesis for the triggering mechanism of ELMs by pellets is the local pressure perturbation resulting from reheating of the pellet cloud that can exceed the local high-n ballooning mode threshold where the pellet is injected. Nonlinear MHD simulations of the pellet ELM triggering show destabilization of high-n ballooning modes by such a local pressure perturbation. A review of the recent pellet ELM triggering results from ASDEX Upgrade (AUG), DIII-D, and JET reveals that a number of uncertainties about this ELM mitigation technique still remain. These include the heat flux impact pattern on the divertor and wall from pellet triggered and natural ELMs, the necessary pellet size and injection location to reliably trigger ELMs, and the level of fueling to be expected from ELM triggering pellets and synergy with larger fueling pellets. The implications of these issues for pellet ELM mitigation in ITER and its impact on the PFCs are presented along with the design features of the pellet injection system for ITER.

  20. A Fast Hardware Tracker for the ATLAS Trigger System

    CERN Document Server

    Neubauer, M; The ATLAS collaboration

    2009-01-01

    As the LHC luminosity is ramped up to the design level of 10^{34} cm^{-2} s^{-1} and beyond, the high rates, multiplicities, and energies of particles seen by the detectors will pose a unique challenge. Only a tiny fraction of the produced collisions can be stored on tape and immense real-time data reduction is needed. An effective trigger system must maintain high trigger efficiencies for the physics we are most interested in, and at the same time suppress the enormous QCD backgrounds. This requires massive computing power to minimize the online execution time of complex algorithms. A multi-level trigger is an effective solution for an otherwise impossible problem. The Fast Tracker (FTK) is a proposed upgrade to the ATLAS trigger system that will operate at full Level-1 output rates and provide high quality tracks reconstructed over the entire detector by the start of processing in Level-2. FTK solves the combinatorial challenge inherent to tracking by exploiting the massive parallelism of Associative Memori...

  1. Trigger finger

    Science.gov (United States)

    ... digit; Trigger finger release; Locked finger; Digital flexor tenosynovitis ... cut or hand; yellow or green drainage from the cut; hand pain or discomfort; fever. If your trigger finger returns, call your surgeon. You may need another surgery.

  2. Droop-Free Distributed Control with Event-Triggered Communication in DC Micro-Grid

    DEFF Research Database (Denmark)

    Han, Renke; Aldana, Nelson Leonardo Diaz; Meng, Lexuan

    2017-01-01

    A novel nonlinear droop-free distributed controller is proposed to achieve accurate current sharing and eliminate voltage drops in a dc Micro-Grid (MG). Then, by introducing a sample-and-hold scheme, the proposed controller is extended to an event-triggered controller, which is designed...

  3. Design and Test Space Exploration of Transport-Triggered Architectures

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper describes a new approach in the high level design and test of transport-triggered architectures (TTA), a special type of application specific instruction processors (ASIP). The proposed method introduces the test as an additional constraint, besides throughput and circuit area. The

  4. Cumulative radiation exposure in children with cystic fibrosis.

    LENUS (Irish Health Repository)

    O'Reilly, R

    2010-02-01

    This retrospective study calculated the cumulative radiation dose for children with cystic fibrosis (CF) attending a tertiary CF centre. Information on 77 children with a mean age of 9.5 years, a follow up time of 658 person years and 1757 studies including 1485 chest radiographs, 215 abdominal radiographs and 57 computed tomography (CT) scans, of which 51 were thoracic CT scans, were analysed. The average cumulative radiation dose was 6.2 (0.04-25) mSv per CF patient. Cumulative radiation dose increased with increasing age and number of CT scans and was greater in children who presented with meconium ileus. No correlation was identified between cumulative radiation dose and either lung function or patient microbiology cultures. Radiation carries a risk of malignancy and children are particularly susceptible. Every effort must be made to avoid unnecessary radiation exposure in these patients whose life expectancy is increasing.

  5. Measurement of four-particle cumulants and symmetric cumulants with subevent methods in small collision systems with the ATLAS detector

    CERN Document Server

    Derendarz, Dominik; The ATLAS collaboration

    2018-01-01

    Measurements of symmetric cumulants SC(n,m) = ⟨v_n² v_m²⟩ − ⟨v_n²⟩⟨v_m²⟩ for (n,m) = (2,3) and (2,4) and asymmetric cumulants AC(n) are presented in pp, p+Pb and peripheral Pb+Pb collisions at various collision energies, aiming to probe the long-range collective nature of multi-particle production in small systems. Results are obtained using the standard cumulant method, as well as the two-subevent and three-subevent cumulant methods. Results from the standard method are found to be strongly biased by non-flow correlations as indicated by strong sensitivity to the chosen event class definition. A systematic reduction of non-flow effects is observed when using the two-subevent method and the results become independent of event class definition when the three-subevent method is used. The measured SC(n,m) shows an anti-correlation between v_2 and v_3, and a positive correlation between v_2 and v_4. The magnitude of SC(n,m) is constant with N_ch in pp collisions, but increases with N_ch in p+Pb and Pb+Pb collisions. ...
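    A toy calculation of SC(n,m) from event-by-event flow coefficients (ignoring the multi-particle, subevent and non-flow machinery the measurement actually relies on; the correlated v_2/v_3 values are generated purely for illustration):

        # Toy SC(n,m) = <v_n^2 v_m^2> - <v_n^2><v_m^2> from event-by-event flow
        # coefficients. Real measurements build this from multi-particle correlators
        # (standard or subevent cumulant methods); here we just use generated v_n values.
        import numpy as np

        def symmetric_cumulant(vn, vm):
            return np.mean(vn**2 * vm**2) - np.mean(vn**2) * np.mean(vm**2)

        rng = np.random.default_rng(5)
        n_events = 200_000
        v2 = np.abs(rng.normal(0.06, 0.02, n_events))
        v3 = np.abs(0.03 + 0.01 * rng.standard_normal(n_events) - 0.2 * (v2 - 0.06))  # anti-correlated with v2
        print(f"SC(2,3) = {symmetric_cumulant(v2, v3):.2e}")   # negative -> v2-v3 anti-correlation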

  6. Numerical evaluation of a robust self-triggered MPC algorithm

    NARCIS (Netherlands)

    Brunner, F.D.; Heemels, W.P.M.H.; Allgöwer, F.

    2016-01-01

    We present numerical examples demonstrating the efficacy of a recently proposed self-triggered model predictive control scheme for disturbed linear discrete-time systems with hard constraints on the input and state. In order to reduce the amount of communication between the controller and the

  7. Topological Trigger Developments

    CERN Multimedia

    Likhomanenko, Tatiana

    2015-01-01

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In the LHC Run 1, this trigger utilized a custom boosted decision tree algorithm, selected an almost 100% pure sample of b-hadrons with a typical efficiency of 60-70%, and its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and uBoost. The topological trigger algorithm is designed to select all "interesting" decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. These inclu...

  8. Event-Triggered Faults Tolerant Control for Stochastic Systems with Time Delays

    Directory of Open Access Journals (Sweden)

    Ling Huang

    2016-01-01

    Full Text Available This paper is concerned with state-feedback controller design for stochastic networked control systems (NCSs) with random actuator failures and transmission delays. Firstly, an event-triggered scheme is introduced to optimize the performance of the stochastic NCSs. Secondly, stochastic NCSs under the event-triggered scheme are modeled as stochastic time-delay systems. Thirdly, some less conservative delay-dependent stability criteria, in terms of linear matrix inequalities, for the co-design of both the controller gain and the trigger parameters are obtained by using a delay-decomposition technique and a convex combination approach. Finally, a numerical example is provided to show the reduced sampled-data transmission and reduced conservatism of the proposed approach.

  9. Efficient trigger signal generation from wasted backward amplified stimulated emission at optical amplifiers for optical coherence tomography

    Directory of Open Access Journals (Sweden)

    Kim Seung Taek

    2015-01-01

    Full Text Available This paper proposes an optical structure to generate trigger signals for optical coherence tomography (OCT) using backward light which is usually discarded. The backward light is the backward amplified stimulated emission generated from the semiconductor optical amplifier (SOA) when using a swept wavelength tunable laser (SWTL). A circulator is applied to block undesirable light in the SWTL instead of the isolator used in a common SWTL. The circulator also diverts the backward amplified spontaneous light, which finally produces the trigger signals for a high-speed digitizer. The spectra of the forward light at the SOA and the waveform of the backward light were measured to check the procedure of trigger formation in the experiment. The results showed that the trigger signals from the proposed SWTL with the circulator were quite usable in OCT.

  10. Cumulant-Based Coherent Signal Subspace Method for Bearing and Range Estimation

    Directory of Open Access Journals (Sweden)

    Bourennane Salah

    2007-01-01

    Full Text Available A new method for simultaneous range and bearing estimation for buried objects in the presence of an unknown Gaussian noise is proposed. This method uses the MUSIC algorithm with noise subspace estimated by using the slice fourth-order cumulant matrix of the received data. The higher-order statistics aim at the removal of the additive unknown Gaussian noise. The bilinear focusing operator is used to decorrelate the received signals and to estimate the coherent signal subspace. A new source steering vector is proposed including the acoustic scattering model at each sensor. Range and bearing of the objects at each sensor are expressed as a function of those at the first sensor. This leads to the improvement of object localization anywhere, in the near-field or in the far-field zone of the sensor array. Finally, the performances of the proposed method are validated on data recorded during experiments in a water tank.
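    A brief sketch of the "slice" fourth-order cumulant matrix that can replace the covariance matrix before MUSIC, using the standard complex fourth-order cumulant definition; the uniform linear array, BPSK sources and noise level are simulated toys, and the paper's focusing operator and near-field steering model are omitted.

        # Slice fourth-order cumulant matrix C[i, j] = cum(x_i, conj(x_j), x_r, conj(x_r))
        # for a fixed reference sensor r; Gaussian noise has zero fourth-order cumulants,
        # so its contribution is removed from this matrix (the property exploited here).
        import numpy as np

        def cumulant_slice(X, r=0):
            """X: (n_sensors, n_snapshots) zero-mean complex data."""
            M, N = X.shape
            xr, xrc = X[r], np.conj(X[r])
            C = np.empty((M, M), dtype=complex)
            for i in range(M):
                for j in range(M):
                    a, bc = X[i], np.conj(X[j])
                    C[i, j] = (np.mean(a * bc * xr * xrc)
                               - np.mean(a * bc) * np.mean(xr * xrc)
                               - np.mean(a * xr) * np.mean(bc * xrc)
                               - np.mean(a * xrc) * np.mean(bc * xr))
            return C

        # toy narrowband data: 2 non-Gaussian (BPSK) sources on a 6-element ULA + Gaussian noise
        rng = np.random.default_rng(6)
        M, N = 6, 20_000
        angles = np.deg2rad([10.0, -25.0])
        A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))   # half-wavelength spacing
        S = rng.choice([-1.0, 1.0], size=(2, N))                          # BPSK amplitudes
        X = A @ S + 0.5 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
        C4 = cumulant_slice(X)
        # MUSIC would now take the noise subspace from the eigen-decomposition of C4.
        eigvals = np.linalg.eigvalsh((C4 + C4.conj().T) / 2)
        print(np.round(np.sort(np.abs(eigvals))[::-1], 2))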

  11. The Central Trigger Processor (CTP)

    CERN Multimedia

    Franchini, Matteo

    2016-01-01

    The Central Trigger Processor (CTP) receives trigger information from the calorimeter and muon trigger processors, as well as from other sources of trigger. It makes the Level-1 decision (L1A) based on a trigger menu.

  12. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  13. Development of Overflow-Prevention Valve with Trigger Mechanism.

    Science.gov (United States)

    Ishino, Yuji; Mizuno, Takeshi; Takasaki, Masaya

    2016-09-01

    A new overflow-prevention valve for combustible fluid is developed which uses a trigger mechanism. Loading arms for combustible fluid are used for transferring oil from a tanker to tanks and vice versa. The loading arm has a valve for preventing overflow. Overflow-prevention valves cannot use any electric component, to avoid combustion. Therefore, the valve must be constructed only from mechanical parts. The conventional overflow-prevention valve uses fluid and pneumatic forces. It consists of a sensor probe, a cylinder, a main valve for shutting off the fluid and a locking mechanism for holding the main valve open. The proposed overflow-prevention valve uses the pressure due to the height difference between the fluid level of the tank and the sensor probe. However, the force of the cylinder produced by this pressure is too small to release the locking mechanism. Therefore, a trigger mechanism is introduced between the cylinder and the locking mechanism. The trigger mechanism produces sufficient force to release the locking mechanism and close the main valve when the height of the fluid exceeds a threshold value. A trigger mechanism is designed and fabricated. The operation necessary for closing the main valve is confirmed experimentally.

  14. Decentralized event-triggered consensus control strategy for leader-follower networked systems

    Science.gov (United States)

    Zhang, Shouxu; Xie, Duosi; Yan, Weisheng

    2017-08-01

    In this paper, the consensus problem of leader-follower networked systems is addressed. At first, a centralized and a decentralized event-triggered control strategy are proposed, which make the control actuators of the followers update at aperiodic event intervals. In particular, the latter requires each follower to use only local information. After that, an improved triggering function that uses only the follower's own information and the neighbors' states at their latest event instants is developed, relaxing the requirement for the neighbors' continuous states. In addition, the strategy requires neither knowledge of the topology nor the eigenvalues of the Laplacian matrix, and if a follower has no direct connection to the leader, the leader's information is not required either. It is analytically shown that, by using the proposed strategy, the leader-follower networked system is able to reach consensus without continuous communication among followers. Simulation examples are given to show the effectiveness of the proposed control strategy.

  15. KATANA – A charge-sensitive triggering system for the SπRIT experiment

    Energy Technology Data Exchange (ETDEWEB)

    Lasko, P. [Institute of Nuclear Physics, Polish Academy of Sciences,Kraków (Poland); Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Adamczyk, M.; Brzychczyk, J. [Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Hirnyk, P.; Łukasik, J. [Institute of Nuclear Physics, Polish Academy of Sciences,Kraków (Poland); Pawłowski, P., E-mail: piotr.pawlowski@ifj.edu.pl [Institute of Nuclear Physics, Polish Academy of Sciences,Kraków (Poland); Pelczar, K. [Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Snoch, A. [University of Wroclaw, Wrocław (Poland); Sochocka, A.; Sosin, Z. [Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Barney, J. [Department of Physics and Astronomy, Michigan State University, East Lansing (United States); National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing (United States); Cerizza, G. [National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing (United States); Estee, J. [Department of Physics and Astronomy, Michigan State University, East Lansing (United States); National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing (United States); Isobe, T. [RIKEN Nishina Center, Wako, Saitama (Japan); Jhang, G. [Department of Physics, Korea University, Seoul (Korea, Republic of); Kaneko, M. [Department of Physics, Kyoto University, Kita-shirakawa, Kyoto (Japan); Kurata-Nishimura, M. [RIKEN Nishina Center, Wako, Saitama (Japan); and others

    2017-06-01

    KATANA - the Krakow Array for Triggering with Amplitude discrimiNAtion - has been built and used as a trigger and veto detector for the SπRIT TPC at RIKEN. Its construction allows operation in a magnetic field and provides a fast response to ionizing particles, giving approximate forward multiplicity and charge information. Depending on this information, trigger and veto signals are generated. The article presents the performance of the detector and the details of its construction. A simple phenomenological parametrization of the number of scintillation photons emitted in the plastic scintillator is proposed. The effect of the deterioration of the light output of the plastic scintillator due to in-beam irradiation is discussed.

  16. Improving cumulative effects assessment in Alberta: Regional strategic assessment

    International Nuclear Information System (INIS)

    Johnson, Dallas; Lalonde, Kim; McEachern, Menzie; Kenney, John; Mendoza, Gustavo; Buffin, Andrew; Rich, Kate

    2011-01-01

    The Government of Alberta, Canada is developing a regulatory framework to better manage cumulative environmental effects from development in the province. A key component of this effort is regional planning, which will lay the primary foundation for cumulative effects management into the future. Alberta Environment has considered the information needs of regional planning and has concluded that Regional Strategic Assessment may offer significant advantages if integrated into the planning process, including the overall improvement of cumulative environmental effects assessment in the province.

  17. A bivariate optimal replacement policy with cumulative repair cost ...

    Indian Academy of Sciences (India)

    Min-Tsai Lai

    Shock model; cumulative damage model; cumulative repair cost limit; preventive maintenance model. ... with two types of shocks: one type is failure shock, and the other type is damage shock. ...

  18. Cooperative Control of Mobile Sensor Networks for Environmental Monitoring: An Event-Triggered Finite-Time Control Scheme.

    Science.gov (United States)

    Lu, Qiang; Han, Qing-Long; Zhang, Botao; Liu, Dongliang; Liu, Shirong

    2017-12-01

    This paper deals with the problem of environmental monitoring by developing an event-triggered finite-time control scheme for mobile sensor networks. The proposed control scheme can be executed by each sensor node independently and consists of two parts: one part is a finite-time consensus algorithm while the other part is an event-triggered rule. The consensus algorithm is employed to enable the positions and velocities of sensor nodes to quickly track the position and velocity of a virtual leader in finite time. The event-triggered rule is used to reduce the updating frequency of controllers in order to save the computational resources of sensor nodes. Some stability conditions are derived for mobile sensor networks with the proposed control scheme under both a fixed communication topology and a switching communication topology. Finally, simulation results illustrate the effectiveness of the proposed control scheme for the problem of environmental monitoring.

  19. Numeraire-invariant option pricing and american, bermudan, trigger stream rollover

    NARCIS (Netherlands)

    Jamshidian, F.

    2004-01-01

    Part I proposes a numeraire-invariant option pricing framework. It defines an option, its price process, and such notions as option indistinguishability and equivalence, domination, payoff process, trigger option, and semipositive option. It develops some of their basic properties, including price

  20. A solar tornado triggered by flares?

    OpenAIRE

    Panesar, N. K.; Innes, D. E.; Tiwari, S. K.; Low, B. C.

    2013-01-01

    Context. Solar tornados are dynamical, conspicuously helical magnetic structures that are mainly observed as a prominence activity. Aims. We investigate and propose a triggering mechanism for the solar tornado observed in a prominence cavity by SDO/AIA on September 25, 2011. Methods. High-cadence EUV images from the SDO/AIA and the Ahead spacecraft of STEREO/EUVI are used to correlate three flares in the neighbouring active-region (NOAA 11303) and their EUV waves with the dynamical de...

  1. Event-triggered output feedback control for distributed networked systems.

    Science.gov (United States)

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2016-01-01

    This paper addresses the problem of output-feedback communication and control with event-triggered framework in the context of distributed networked control systems. The design problem of the event-triggered output-feedback control is proposed as a linear matrix inequality (LMI) feasibility problem. The scheme is developed for the distributed system where only partial states are available. In this scheme, a subsystem uses local observers and share its information to its neighbors only when the subsystem's local error exceeds a specified threshold. The developed method is illustrated by using a coupled cart example from the literature. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Evaluating Cumulative Ecosystem Response to Restoration Projects in the Columbia River Estuary, Annual Report 2004

    Energy Technology Data Exchange (ETDEWEB)

    Diefenderfer, Heida L.; Roegner, Curtis; Thom, Ronald M.; Dawley, Earl M.; Whiting, Allan H.; Johnson, Gary E.; Sobocinski, Kathryn L.; Anderson, Michael G.; Ebberts, Blaine

    2005-12-15

    The restoration of wetland salmon habitat in the tidal portion of the Columbia River is occurring at an accelerating pace and is anticipated to improve habitat quality and effect hydrological reconnection between existing and restored habitats. Currently multiple groups are applying a variety of restoration strategies in an attempt to emulate historic estuarine processes. However, the region lacks both a standardized means of evaluating the effectiveness of individual projects as well as methods for determining the cumulative effects of all restoration projects on a regional scale. This project is working to establish a framework to evaluate individual and cumulative ecosystem responses to restoration activities in order to validate the effectiveness of habitat restoration activities designed to benefit salmon through improvements to habitat quality and habitat opportunity (i.e. access) in the Columbia River from Bonneville Dam to the ocean. The review and synthesis of approaches to measure the cumulative effects of multiple restoration projects focused on defining methods and metrics of relevance to the CRE, and, in particular, juvenile salmon use of this system. An extensive literature review found no previous study assessing the cumulative effects of multiple restoration projects on the fundamental processes and functions of a large estuarine system, although studies are underway in other large land-margin ecosystems including the Florida Everglades and the Louisiana coastal wetlands. Literature from a variety of scientific disciplines was consulted to identify the ways that effects can accumulate (e.g., delayed effects, cross-boundary effects, compounding effects, indirect effects, triggers and thresholds) as well as standard and innovative tools and methods utilized in cumulative effects analyses: conceptual models, matrices, checklists, modeling, trends analysis, geographic information systems, carrying capacity analysis, and ecosystem analysis. Potential

  3. Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation

    Energy Technology Data Exchange (ETDEWEB)

    Barajas-Solano, David A.; Tartakovsky, Alexandre M.

    2018-01-01

    We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d+1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.

  4. An Event-Triggered Online Energy Management Algorithm of Smart Home: Lyapunov Optimization Approach

    Directory of Open Access Journals (Sweden)

    Wei Fan

    2016-05-01

    As an important component of the smart grid on the user side, a home energy management system is the core of optimal operation for a smart home. In this paper, the energy scheduling problem for a household equipped with photovoltaic devices was investigated. An online energy management algorithm based on event triggering was proposed. The Lyapunov optimization method was adopted to schedule controllable loads in the household. Without forecasting the related variables, real-time decisions were made based only on the current information. Energy could be rapidly regulated under fluctuations of distributed generation, electricity demand, and market price. The event-triggering mechanism was adopted to trigger the execution of the online algorithm, so as to cut down the execution frequency and unnecessary computation. Comprehensive simulation results show that the proposed algorithm can effectively decrease the electricity bills of users. Moreover, the required computational resources are small, which contributes to the low-cost energy management of a smart home.
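
    A minimal sketch of the event-triggered execution layer described here, with the Lyapunov-based scheduling step replaced by a placeholder function; the monitored quantities, tolerances, and toy data are assumptions for illustration only.

    ```python
    def state_changed(prev, curr, tol):
        """Return True when any monitored quantity moves by more than its tolerance."""
        return any(abs(curr[k] - prev[k]) > tol[k] for k in curr)

    def run_energy_manager(stream, tolerances, schedule_loads):
        """Re-run the online scheduling step only on triggering events.

        `stream` yields dicts like {"price": ..., "pv": ..., "demand": ...};
        `schedule_loads` stands in for the Lyapunov-based scheduling decision.
        """
        last_state, decisions = None, []
        for state in stream:
            if last_state is None or state_changed(last_state, state, tolerances):
                decisions.append(schedule_loads(state))  # event-triggered execution
                last_state = state
        return decisions

    # toy usage with a dummy scheduler
    data = [{"price": 0.20, "pv": 1.0, "demand": 2.0},
            {"price": 0.21, "pv": 1.0, "demand": 2.0},   # small change: no trigger
            {"price": 0.35, "pv": 0.4, "demand": 2.5}]   # large change: trigger
    tol = {"price": 0.05, "pv": 0.3, "demand": 0.4}
    print(run_energy_manager(data, tol, schedule_loads=lambda s: s["price"]))
    ```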

  5. How to Choose? Using the Delphi Method to Develop Consensus Triggers and Indicators for Disaster Response.

    Science.gov (United States)

    Lis, Rebecca; Sakata, Vicki; Lien, Onora

    2017-08-01

    To identify key decisions along the continuum of care (conventional, contingency, and crisis) and the critical triggers and data elements used to inform those decisions concerning public health and health care response during an emergency. A classic Delphi method, a consensus-building survey technique, was used with clinicians around Washington State to identify regional triggers and indicators. Additionally, using a modified Delphi method, we combined a workshop and single-round survey with panelists from public health (state and local) and health care coalitions to identify consensus state-level triggers and indicators. In the clinical survey, 122 of 223 proposed triggers or indicators (43.7%) reached consensus and were deemed important in regional decision-making during a disaster. In the state-level survey, 110 of 140 proposed triggers or indicators (78.6%) reached consensus and were deemed important in state-level decision-making during a disaster. The identification of consensus triggers and indicators for health care emergency response is crucial in supporting a comprehensive health care situational awareness process. This can inform the creation of standardized questions to ask health care, public health, and other partners to support decision-making during a response. (Disaster Med Public Health Preparedness. 2017;11:467-472).

  6. An evaluation paradigm for cumulative impact analysis

    Science.gov (United States)

    Stakhiv, Eugene Z.

    1988-09-01

    Cumulative impact analysis is examined from a conceptual decision-making perspective, focusing on its implicit and explicit purposes as suggested within the policy and procedures for environmental impact analysis of the National Environmental Policy Act of 1969 (NEPA) and its implementing regulations. In this article it is also linked to different evaluation and decision-making conventions, contrasting a regulatory context with a comprehensive planning framework. The specific problems that make the application of cumulative impact analysis a virtually intractable evaluation requirement are discussed in connection with the federal regulation of wetlands uses. The relatively familiar US Army Corps of Engineers' (the Corps) permit program, in conjunction with the Environmental Protection Agency's (EPA) responsibilities in managing its share of the Section 404 regulatory program requirements, is used throughout as the realistic context for highlighting certain pragmatic evaluation aspects of cumulative impact assessment. To understand the purposes of cumulative impact analysis (CIA), a key distinction must be made between the implied comprehensive and multiobjective evaluation purposes of CIA, promoted through the principles and policies contained in NEPA, and the more commonly conducted and limited assessment of cumulative effects (ACE), which focuses largely on the ecological effects of human actions. Based on current evaluation practices within the Corps' and EPA's permit programs, it is shown that the commonly used screening approach to regulating wetlands uses is not compatible with the purposes of CIA, nor is the environmental impact statement (EIS) an appropriate vehicle for evaluating the variety of objectives and trade-offs needed as part of CIA. A heuristic model that incorporates the basic elements of CIA is developed, including the idea of trade-offs among social, economic, and environmental protection goals carried out within the context of environmental

  7. Smart Trigger Pre-Processor Custom Electronics for the PHENIX Experiment

    International Nuclear Information System (INIS)

    Nagle, James L.

    2003-01-01

    OAK-B135 The document provides a final technical report on activities and accomplishments of the experimental relativistic heavy ion physics group at the University of Colorado at Boulder as supported by the Outstanding Junior Investigator Program, Division of Nuclear Physics at the Department of Energy. All of the goals of the grant proposal were achieved during this last year of the Outstanding Junior Investigator funding period. The development of a Smart Trigger Pre-Processor module for fast trigger primitive calculations in the PHENIX experiment has been completed. We finalized the board design, constructed and tested two prototype modules, and, with additional funding from the PHENIX project, fabricated a full set of 15 modules for the Muon Tracking system. During Run-4 at RHIC, we began the process of integrating these modules into the PHENIX data acquisition system. Additionally, we put a large effort into developing new trigger and fast-track analysis methods for J/ψ data filtering and reconstruction. These algorithms make use of the trigger primitives generated via the new electronics

  8. Effect-based trigger values for in vitro bioassays: Reading across from existing water quality guideline values.

    Science.gov (United States)

    Escher, Beate I; Neale, Peta A; Leusch, Frederic D L

    2015-09-15

    Cell-based bioassays are becoming increasingly popular in water quality assessment. The new generations of reporter-gene assays are very sensitive and effects are often detected in very clean water types such as drinking water and recycled water. For monitoring applications it is therefore imperative to derive trigger values that differentiate between acceptable and unacceptable effect levels. In this proof-of-concept paper, we propose a statistical method to read directly across from chemical guideline values to trigger values without the need to perform in vitro to in vivo extrapolations. The derivation is based on matching effect concentrations with existing chemical guideline values and filtering out appropriate chemicals that are responsive in the given bioassays at concentrations in the range of the guideline values. To account for the mixture effects of many chemicals acting together in a complex water sample, we propose bioanalytical equivalents that integrate the effects of groups of chemicals with the same mode of action that act in a concentration-additive manner. Statistical distribution methods are proposed to derive a specific effect-based trigger bioanalytical equivalent concentration (EBT-BEQ) for each bioassay of environmental interest that targets receptor-mediated toxicity. Even bioassays that are indicative of the same mode of action have slightly different numeric trigger values due to differences in their inherent sensitivity. The algorithm was applied to 18 cell-based bioassays and 11 provisional effect-based trigger bioanalytical equivalents were derived as an illustrative example using the 349 chemical guideline values protective for human health of the Australian Guidelines for Water Recycling. We illustrate the applicability using the example of a diverse set of water samples including recycled water. Most recycled water samples were compliant with the proposed triggers while wastewater effluent would not have been compliant with a few

  9. LHCb: The LHCb Trigger Architecture beyond LS1

    CERN Multimedia

    Albrecht, J; Neubert, S; Raven, G; Sokoloff, M D; Williams, M

    2013-01-01

    The LHCb experiment is a spectrometer dedicated to the study of heavy flavor at the LHC. The rate of proton-proton collisions at the LHC is 15 MHz, but resource limitations mean that only 5 kHz can be written to storage for offline analysis. For this reason the LHCb data acquisition system -- the trigger -- plays a key role in selecting signal events and rejecting background. In contrast to previous experiments at hadron colliders, such as CDF or D0, the bulk of the LHCb trigger is implemented in software and deployed on a farm of 20k parallel processing nodes. This system, called the High Level Trigger (HLT), is responsible for reducing the rate from the maximum at which the detector can be read out, 1.1 MHz, to the 5 kHz which can be processed offline, and has 20 ms in which to process and accept/reject each event. In order to minimize systematic uncertainties, the HLT was designed from the outset to reuse the offline reconstruction and selection code. During the long shutdown it is proposed to extend th...

  10. Cumulants in perturbation expansions for non-equilibrium field theory

    International Nuclear Information System (INIS)

    Fauser, R.

    1995-11-01

    The formulation of perturbation expansions for a quantum field theory of strongly interacting systems in a general non-equilibrium state is discussed. Non-vanishing initial correlations are included in the formulation of the perturbation expansion in terms of cumulants. The cumulants are shown to be the suitable candidate for summing up the perturbation expansion. Also a linked-cluster theorem for the perturbation series with cumulants is presented. Finally a generating functional of the perturbation series with initial correlations is studied. We apply the methods to a simple model of a fermion-boson system. (orig.)

  11. Reliability model analysis and primary experimental evaluation of laser triggered pulse trigger

    International Nuclear Information System (INIS)

    Chen Debiao; Yang Xinglin; Li Yuan; Li Jin

    2012-01-01

    A high-performance pulse trigger can enhance the performance and stability of the PPS. It is necessary to evaluate the reliability of the LTGS pulse trigger, so we establish a reliability analysis model of this pulse trigger based on the CARMES software; the reliability evaluation is in accord with the statistical results. (authors)

  12. Proposed Hall D Detector Electronics

    International Nuclear Information System (INIS)

    Paul Smith

    1998-01-01

    With nearly 10^5 channels, the signal processing and data acquisition electronics system will present a significant challenge. We envisage much of the electronics being physically located on or near the detectors to avoid the long and expensive low-level signal cables otherwise required. CERN detectors such as COMPASS and ATLAS provide a good model, and we should build on their experience as much as possible. Radiation hardness and minimal power dissipation are additional constraints. The high beam rate will necessitate good time resolution, integrated low-level triggering capability and sufficient pipelining of the data to accommodate the trigger decision time. A proposed architecture is shown in the figure. Detector channels are either "pixels", e.g. PWCs, drift chambers, and ring Cerenkovs, or charge detectors, e.g. CsI or lead glass. Pixel detectors are discriminated, while charge detectors are digitized by flash ADCs (FADCs). The digitized information is pipelined in shift registers which provide a time window for the first level of triggering to consider. After passing through the shift registers, the data are further pipelined in RAM to provide time for the level 1 trigger decision. In the event of a level 1 trigger, the RAM contents are transferred to a level 2 processor farm where more detailed trigger decisions take place
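
    The pipelined data path described above can be mimicked in a few lines; the sketch below is a toy software model under assumed parameters (pipeline depth, a placeholder level-1 decision on summed FADC samples), not the proposed hardware design.

    ```python
    from collections import deque

    class TriggerPipeline:
        """Toy model of the described data path: digitized samples are pipelined
        long enough for a level-1 decision; accepted crossings are passed on."""
        def __init__(self, depth=16, level1=lambda s: sum(s) > 100):
            self.pipe = deque(maxlen=depth)  # shift register / RAM stage
            self.level1 = level1             # placeholder level-1 trigger decision

        def clock(self, samples):
            """Push one beam crossing of FADC samples; emit the crossing leaving
            the pipeline if the level-1 decision accepts it, else None."""
            leaving = self.pipe[0] if len(self.pipe) == self.pipe.maxlen else None
            self.pipe.append(samples)
            if leaving is not None and self.level1(leaving):
                return leaving   # would be shipped to the level-2 processor farm
            return None

    pipe = TriggerPipeline(depth=2)
    for crossing in ([10, 20, 30], [50, 60, 70], [1, 2, 3], [40, 50, 60], [5, 5, 5]):
        out = pipe.clock(crossing)
        if out is not None:
            print("accepted:", out)
    ```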

  13. Asynchronous sampled-data approach for event-triggered systems

    Science.gov (United States)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between control and event-condition design, optimisation of the available resources (control, communication and computation power), and time-delays due to computation and the communication network. With this motivation, the paper presents separate designs of the control and event-triggering mechanisms, thus simplifying the overall analysis; an asynchronous linear quadratic Gaussian controller that tackles delays and the aperiodic nature of transmissions; and a novel event mechanism that compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control and the results show significant improvement over the periodic counterpart.

  14. Maintenance hemodialysis patients have high cumulative radiation exposure.

    LENUS (Irish Health Repository)

    Kinsella, Sinead M

    2010-10-01

    Hemodialysis is associated with an increased risk of neoplasms which may result, at least in part, from exposure to ionizing radiation associated with frequent radiographic procedures. In order to estimate the average radiation exposure of those on hemodialysis, we conducted a retrospective study of 100 patients in a university-based dialysis unit followed for a median of 3.4 years. The number and type of radiological procedures were obtained from a central radiology database, and the cumulative effective radiation dose was calculated using standardized, procedure-specific radiation levels. The median annual radiation dose was 6.9 millisieverts (mSv) per patient-year. However, 14 patients had an annual cumulative effective radiation dose over 20 mSv, the upper averaged annual limit for occupational exposure. The median total cumulative effective radiation dose per patient over the study period was 21.7 mSv, in which 13 patients had a total cumulative effective radiation dose over 75 mSv, a value reported to be associated with a 7% increased risk of cancer-related mortality. Two-thirds of the total cumulative effective radiation dose was due to CT scanning. The average radiation exposure was significantly associated with the cause of end-stage renal disease, history of ischemic heart disease, transplant waitlist status, number of in-patient hospital days over follow-up, and death during the study period. These results highlight the substantial exposure to ionizing radiation in hemodialysis patients.

  15. Stay away from asthma triggers

    Science.gov (United States)

    Asthma triggers - stay away from; Asthma triggers - avoiding; Reactive airway disease - triggers; Bronchial asthma - triggers ... clothes. They should leave the coat outside or away from your child. Ask people who work at ...

  16. Event-triggered cooperative target tracking in wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Lu Kelin

    2016-10-01

    Since the issues of low communication bandwidth and limited battery capacity are crucial for wireless sensor networks, this paper focuses on the problem of event-triggered cooperative target tracking based on set-membership information filtering. We study some fundamental properties of the set-membership information filter with multiple sensor measurements. First, a sufficient condition is derived for the set-membership information filter, under which the boundedness of the outer ellipsoidal approximation set of the estimation means is guaranteed. Second, the equivalence property between the parallel and sequential versions of the set-membership information filter is presented. Finally, the results are applied to a 1D event-triggered target tracking scenario in which the negative information is exploited, in the sense that measurements that do not satisfy the triggering conditions are modelled as set-membership measurements. The tracking performance of the proposed method is validated with extensive Monte Carlo simulations.

  17. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
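
    CUMBIN itself is a C program; purely as an illustration of the quantity it computes, here is a short re-statement in Python of the k-out-of-n reliability calculation (the function name and example values are mine, not part of the package).

    ```python
    from math import comb

    def k_out_of_n_reliability(n, k, p):
        """Probability that at least k of n independent components operate,
        each with reliability p -- the quantity CUMBIN computes."""
        if not (0 < k <= n and 0.0 <= p <= 1.0):
            raise ValueError("require 0 < k <= n and 0 <= p <= 1")
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # e.g. a 2-out-of-3 system of components with reliability 0.9
    print(k_out_of_n_reliability(3, 2, 0.9))   # 0.972
    ```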

  18. Cumulative effect in multiple production processes on nuclei

    International Nuclear Information System (INIS)

    Golubyatnikova, E.S.; Shmonin, V.L.; Kalinkin, B.N.

    1989-01-01

    It is shown that the cumulative effect is a natural result of the process of hadron multiple production in nuclear reactions. Interpretation is made of the universality of slopes of inclusive spectra and other characteristics of cumulative hadrons. The character of information from such reactions is discussed, which could be helpful in studying the mechanism of multiparticle production. 27 refs.; 4 figs

  19. A configurable tracking algorithm to detect cosmic muon tracks for the CMS-RPC based technical trigger

    CERN Document Server

    Rajan, R T; Loddo, F; Maggi, M; Ranieri, A; Abbrescia, M; Guida, R; Iaselli, G; Nuzzo, S; Pugliese, G; Roselli, G; Trentadue, R; Tupputi, S; Benussi, L; Bertani, M; Bianco, S; Fabbri, F; Cavallo, N; Cimmino, A; Lomidze, D; Noli, P; Paolucci, P; Piccolo, D; Polese, G; Sciacca, C; Baesso, P; Belli, G; Necchi, M; Ratti, S P; Pagano, D; Vitulo, P; Viviani, C; Dimitrov, A; Litov, L; Pavlov, B; Petkov, P; Genchev, V; Iaydjiev, P; Bunkowski, K; Kierzkowski, K; Konecki, M; Kudla, I; Pietrusinski, M; Pozniak, K

    2009-01-01

    In the CERN CMS experiment at LHC Collider special trigger signals called Technical Triggers will be used for the purpose of test and calibration. The Resistive Plate Chambers (RPC) based Technical Trigger system is a part of the CMS muon trigger system and is designed to detect cosmic muon tracks. It is based on two boards, namely RBC (RPC Balcony Collector) and TTU (Technical Trigger Unit). The proposed tracking algorithm (TA) written in VHDL and implemented in the TTU board detects single or multiple cosmic muon tracks at every bunch crossing along with their track lengths and corresponding chamber coordinates. The TA implementation in VHDL and its preliminary simulation results are presented.

  20. Triggered creep as a possible mechanism for delayed dynamic triggering of tremor and earthquakes

    Science.gov (United States)

    Shelly, David R.; Peng, Zhigang; Hill, David P.; Aiken, Chastity

    2011-01-01

    The passage of radiating seismic waves generates transient stresses in the Earth's crust that can trigger slip on faults far away from the original earthquake source. The triggered fault slip is detectable in the form of earthquakes and seismic tremor. However, the significance of these triggered events remains controversial, in part because they often occur with some delay, long after the triggering stress has passed. Here we scrutinize the location and timing of tremor on the San Andreas fault between 2001 and 2010 in relation to distant earthquakes. We observe tremor on the San Andreas fault that is initiated by passing seismic waves, yet migrates along the fault at a much slower velocity than the radiating seismic waves. We suggest that the migrating tremor records triggered slow slip of the San Andreas fault as a propagating creep event. We find that the triggered tremor and fault creep can be initiated by distant earthquakes as small as magnitude 5.4 and can persist for several days after the seismic waves have passed. Our observations of prolonged tremor activity provide a clear example of the delayed dynamic triggering of seismic events. Fault creep has been shown to trigger earthquakes, and we therefore suggest that the dynamic triggering of prolonged fault creep could provide a mechanism for the delayed triggering of earthquakes. © 2011 Macmillan Publishers Limited. All rights reserved.

  1. CMS Trigger Performance

    CERN Document Server

    Donato, Silvio

    2017-01-01

    During its second run of operation (Run 2), which started in 2015, the LHC will deliver a peak instantaneous luminosity that may reach $2 \cdot 10^{34}$ cm$^{-2}$s$^{-1}$ with an average pile-up of about 55, far larger than the design value. Under these conditions, the online event selection is a very challenging task. In CMS, it is realized by a two-level trigger system: the Level-1 (L1) Trigger, implemented in custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the offline reconstruction software running on a computer farm. In order to face this challenge, the L1 trigger has been through a major upgrade compared to Run 1, whereby all electronic boards of the system have been replaced, allowing more sophisticated algorithms to be run online. Its last stage, the global trigger, is now able to perform complex selections and to compute high-level quantities, like invariant masses. Likewise, the algorithms that run in the HLT have gone through big improvements; in particular, new appr...

  2. Natural experimentation is a challenging method for identifying headache triggers.

    Science.gov (United States)

    Houle, Timothy T; Turner, Dana P

    2013-04-01

    In this study, we set out to determine whether individual headache sufferers can learn about the potency of their headache triggers (causes) using only natural experimentation. Headache patients naturally use the covariation of the presence-absence of triggers with headache attacks to assess the potency of triggers. The validity of this natural experimentation has never been investigated. A companion study has proposed 3 assumptions that are important for assigning causal status to triggers. This manuscript examines one of these assumptions, constancy in trigger presentation, using real-world conditions. The similarity of day-to-day weather conditions over 4 years, as well as the similarity of ovarian hormones and perceived stress over a median of 89 days in 9 regularly cycling headache sufferers, was examined using several available time series. An arbitrary threshold of 90% similarity using Gower's index identified similar days for comparison. The day-to-day variability in just these 3 headache triggers is substantial enough that finding 2 naturally similar days for which to contrast the effect of a fourth trigger (eg, drinking wine vs not drinking wine) will only infrequently occur. Fluctuations in weather patterns resulted in a median of 2.3 days each year that were similar (range 0-27.4). Considering fluctuations in stress patterns and ovarian hormones, only 1.5 days/month (95% confidence interval 1.2-2.9) and 2.0 days/month (95% confidence interval 1.9-2.2), respectively, met our threshold for similarity. Although assessing the personal causes of headache is an age-old endeavor, the great many candidate triggers exhibit variability that may prevent sound conclusions without assistance from formal experimentation or statistical balancing. © 2013 American Headache Society.
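
    To make the similarity criterion concrete, the sketch below computes Gower's similarity over a handful of numeric trigger variables and applies the 90% threshold mentioned above; the variable names, ranges, and daily values are invented for illustration.

    ```python
    def gower_similarity(day_a, day_b, ranges):
        """Gower's similarity for numeric trigger variables: per-variable
        similarity 1 - |a - b| / range, averaged over variables."""
        scores = [1.0 - abs(day_a[k] - day_b[k]) / ranges[k] for k in ranges]
        return sum(scores) / len(scores)

    # illustrative daily records of three candidate triggers
    monday  = {"temperature_c": 21.0, "stress_score": 4.0, "estradiol_pg_ml": 60.0}
    tuesday = {"temperature_c": 23.0, "stress_score": 5.0, "estradiol_pg_ml": 95.0}
    ranges  = {"temperature_c": 30.0, "stress_score": 10.0, "estradiol_pg_ml": 300.0}

    s = gower_similarity(monday, tuesday, ranges)
    print(f"similarity = {s:.3f}, comparable day: {s >= 0.90}")
    ```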

  3. The NA27 trigger

    International Nuclear Information System (INIS)

    Bizzarri, R.; Di Capua, E.; Falciano, S.; Iori, M.; Marel, G.; Piredda, G.; Zanello, L.; Haupt, L.; Hellman, S.; Holmgren, S.O.; Johansson, K.E.

    1985-05-01

    We have designed and implemented a minimum bias trigger together with a fiducial volume trigger for the experiment NA27, performed at the CERN SPS. A total of more than 3 million bubble chamber pictures have been taken with a triggered cross section smaller than 75% of the total inelastic cross section. Events containing charm particles were triggered with an efficiency of 98 (+2/-3)%. With the fiducial volume trigger, the probability for a picture to contain an interaction in the visible hydrogen increased from 47.3% to 59.5%, reducing film cost and processing effort by about 20%. The improvement in data taking rate is shown to be negligible. (author)

  4. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date. It is common practice to apply the Kaplan-Meier or Aalen-Johansen estimator to the total sample and report either the estimated cumulative incidence curve or just a single point on the curve as a description of the disease risk. METHODS: We argue that, whenever the disease or disorder of interest is influenced...

  5. Triggers of oral lichen planus flares and the potential role of trigger avoidance in disease management.

    Science.gov (United States)

    Chen, Hannah X; Blasiak, Rachel; Kim, Edwin; Padilla, Ricardo; Culton, Donna A

    2017-09-01

    Many patients with oral lichen planus (OLP) report triggers of flares, some of which overlap with triggers of other oral diseases, including oral allergy syndrome and oral contact dermatitis. The purpose of this study was to evaluate the prevalence of commonly reported triggers of OLP flares, their overlap with triggers of other oral diseases, and the potential role of trigger avoidance as a management strategy. Questionnaire-based survey of 51 patients with biopsy-proven lichen planus with oral involvement seen in an academic dermatology specialty clinic and/or oral pathology clinic between June 2014 and June 2015. Of the participants, 94% identified at least one trigger of their OLP flares. Approximately half of the participants (51%) reported at least one trigger that overlapped with known triggers of oral allergy syndrome, and 63% identified at least one trigger that overlapped with known triggers of oral contact dermatitis. Emotional stress was the most commonly reported trigger (77%). Regarding avoidance, 79% of the study participants reported avoiding their known triggers in daily life. Of those who actively avoided triggers, 89% reported an improvement in symptoms and 70% reported a decrease in the frequency of flares. Trigger identification and avoidance can play a potentially effective role in the management of OLP. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Image-guided automatic triggering of a fractional CO2 laser in aesthetic procedures.

    Science.gov (United States)

    Wilczyński, Sławomir; Koprowski, Robert; Wiernek, Barbara K; Błońska-Fajfrowska, Barbara

    2016-09-01

    Laser procedures in dermatology and aesthetic medicine are associated with the need for manual laser triggering. This leads to pulse overlapping and side effects. Automatic laser triggering based on image analysis can provide a secure fit of each successive dose of radiation. A fractional CO2 laser was used in the study. 500 images of the human skin of healthy subjects were acquired. Automatic triggering was initiated by an application together with a camera which tracks and analyses the skin in visible light. The tracking algorithm uses methods of image analysis to overlap images. After locating the characteristic points in the analysed adjacent areas, the correspondence of graphs is found. The point coordinates derived from the images are the vertices of graphs with respect to which isomorphism is sought. When the correspondence of graphs is found, it is possible to overlap the neighbouring parts of the image. Owing to the automatic image-fitting method, the proposed method of laser triggering allows for 100% repeatability. To meet this requirement, there must be at least 13 graph vertices obtained from the image. For this number of vertices, the time of analysis of a single image is less than 0.5 s. The proposed method, applied in practice, may help reduce the number of side effects during dermatological laser procedures resulting from laser pulse overlapping. In addition, it reduces treatment time and enables new treatment techniques through controlled, precise laser pulse overlapping. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  8. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  9. A Two-Step Approach for Analytical Optimal Hedging with Two Triggers

    Directory of Open Access Journals (Sweden)

    Tiesong Hu

    2016-02-01

    Hedging is widely used to mitigate severe water shortages in the operation of reservoirs during droughts. Rationing is usually instituted with one hedging policy, which is based on only one trigger, i.e., initial storage level or current water availability. It may perform poorly in balancing the benefits of a release during the current period against those of carryover storage during future droughts. This study proposes a novel hedging rule to improve the efficiency of a reservoir operated to supply water, in which, based on two triggers, hedging is initiated with three different hedging sub-rules through a two-step approach. In the first step, the sub-rule is triggered based on the relationship between the initial reservoir storage level and the level of the target rule curve or the firm rule curve at the end of the current period. This step is mainly concerned with whether or not to increase the water level in the current period. Hedging is then triggered under the sub-rule based on current water availability in the second step, in which the trigger implicitly considers both the initial and ending reservoir storage levels in the current period. Moreover, the amount of hedging is analytically derived based on the Karush-Kuhn-Tucker (KKT) conditions. In addition, the hedging parameters are optimized using the improved particle swarm optimization (IPSO) algorithm coupled with a rule-based simulation. A single water-supply reservoir located in Hubei Province in central China is selected as a case study. The operation results show that the proposed rule is reasonable and significantly improves reservoir operation performance for both long-term and critical periods relative to other operation policies, such as the standard operating policy (SOP) and the most commonly used hedging rules.
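
    A highly simplified sketch of the two-step decision logic is given below; the rule-curve levels, hedging factors, and thresholds are placeholder assumptions, and the analytical KKT-based hedging amounts of the paper are not reproduced.

    ```python
    def hedged_release(initial_storage, inflow, demand,
                       target_curve_end, firm_curve_end, hedge_factor=0.7):
        """Two-step sketch: step 1 picks a sub-rule from the initial storage
        relative to the end-of-period rule curves; step 2 hedges (or not) based
        on current water availability. All numbers are illustrative."""
        availability = initial_storage + inflow          # trigger 2

        if initial_storage >= target_curve_end:           # trigger 1: comfortable zone
            return min(demand, availability)               # full-supply sub-rule
        if initial_storage >= firm_curve_end:              # intermediate zone
            if availability >= demand + firm_curve_end:    # can meet demand and keep firm storage
                return demand
            return hedge_factor * demand                   # moderate hedging sub-rule
        return hedge_factor**2 * demand                    # deep hedging sub-rule

    print(hedged_release(initial_storage=40, inflow=10, demand=20,
                         target_curve_end=60, firm_curve_end=30))
    ```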

  10. Cumulative Environmental Impacts: Science and Policy to Protect Communities.

    Science.gov (United States)

    Solomon, Gina M; Morello-Frosch, Rachel; Zeise, Lauren; Faust, John B

    2016-01-01

    Many communities are located near multiple sources of pollution, including current and former industrial sites, major roadways, and agricultural operations. Populations in such locations are predominantly low-income, with a large percentage of minorities and non-English speakers. These communities face challenges that can affect the health of their residents, including limited access to health care, a shortage of grocery stores, poor housing quality, and a lack of parks and open spaces. Environmental exposures may interact with social stressors, thereby worsening health outcomes. Age, genetic characteristics, and preexisting health conditions increase the risk of adverse health effects from exposure to pollutants. There are existing approaches for characterizing cumulative exposures, cumulative risks, and cumulative health impacts. Although such approaches have merit, they also have significant constraints. New developments in exposure monitoring, mapping, toxicology, and epidemiology, especially when informed by community participation, have the potential to advance the science on cumulative impacts and to improve decision making.

  11. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on an advanced control for autonomous microgrids. In order to improve the performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated...

  12. Baltic Sea biodiversity status vs. cumulative human pressures

    DEFF Research Database (Denmark)

    Andersen, Jesper H.; Halpern, Benjamin S.; Korpinen, Samuli

    2015-01-01

    Many studies have tried to explain spatial and temporal variations in biodiversity status of marine areas from a single-issue perspective, such as fishing pressure or coastal pollution, yet most continental seas experience a wide range of human pressures. Cumulative impact assessments have been developed to capture the consequences of multiple stressors for biodiversity, but the ability of these assessments to accurately predict biodiversity status has never been tested or ground-truthed. This relationship has similarly been assumed for the Baltic Sea, especially in areas with impaired status, but has also never been documented. Here we provide a first tentative indication that cumulative human impacts relate to ecosystem condition, i.e. biodiversity status, in the Baltic Sea. Thus, cumulative impact assessments offer a promising tool for informed marine spatial planning, designation...

  13. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  14. Observing earthquakes triggered in the near field by dynamic deformations

    Science.gov (United States)

    Gomberg, J.; Bodin, P.; Reasenberg, P.A.

    2003-01-01

    We examine the hypothesis that dynamic deformations associated with seismic waves trigger earthquakes in many tectonic environments. Our analysis focuses on seismicity at close range (within the aftershock zone), complementing published studies of long-range triggering. Our results suggest that dynamic triggering is not confined to remote distances or to geothermal and volcanic regions. Long unilaterally propagating ruptures may focus radiated dynamic deformations in the propagation direction. Therefore, we expect seismicity triggered dynamically by a directive rupture to occur asymmetrically, with a majority of triggered earthquakes in the direction of rupture propagation. Bilaterally propagating ruptures also may be directive, and we propose simple criteria for assessing their directivity. We compare the inferred rupture direction and observed seismicity rate change following 15 earthquakes (M 5.7 to M 8.1) that occurred in California and Idaho in the United States, the Gulf of Aqaba, Syria, Guatemala, China, New Guinea, Turkey, Japan, Mexico, and Antarctica. Nine of these mainshocks had clearly directive, unilateral ruptures. Of these nine, seven apparently induced an asymmetric increase in seismicity rate that correlates with the rupture direction. The two exceptions include an earthquake preceded by a comparable-magnitude event on a conjugate fault and another for which data limitations prohibited conclusive results. Similar (but weaker) correlations were found for the bilaterally rupturing earthquakes we studied. Although the static stress change also may trigger seismicity, it and the seismicity it triggers are expected to be similarly asymmetric only if the final slip is skewed toward the rupture terminus. For several of the directive earthquakes, we suggest that the seismicity rate change correlates better with the dynamic stress field than the static stress change.

  15. A parallel non-neural trigger tracker for the SSC

    International Nuclear Information System (INIS)

    Farber, R.M.; Kennison, W.; Lapedes, A.S.

    1991-01-01

    The Superconducting Super Collider (SSC) is a major project promising to open the vistas of very high energy particle physics. When the SSC is in operation, data will be produced at a staggering rate. Current estimates place the raw data coming out of the proposed silicon detector system at 2.5 x 10^16 bits/second. Clearly, storing all events for later off-line processing is totally impracticable. A hierarchy of triggers, firing only on events meeting increasingly specific criteria, is planned to cull interesting events from the flood of information. Each event consists of a sequence of isolated "hits", caused by particles hitting various parts of the detector. Collating these hits into the tracks of the approximately 500 particles/event, and then quickly deciding which events meet the criteria for later processing, is essential if the SSC is to produce usable information. This paper addresses the need for real-time triggering and track reconstruction. A benchmarked and buildable algorithm, operable at the required data rates, is described. The use of neural nets, suggested by other researchers, is specifically avoided as unnecessary and impractical. Instead, a parallel algorithm, and an associated hardware architecture using only conventional technology, is presented. The algorithm has been tested on fully scaled-up, extensively detailed, simulated SSC events, with extremely encouraging results. Preliminary hardware analyses indicate that the trigger/tracker may be built within proposed SSC budget guidelines. 7 refs., 4 figs

  16. Minimum Bias Trigger in ATLAS

    International Nuclear Information System (INIS)

    Kwee, Regina

    2010-01-01

    Since the restart of the LHC in November 2009, ATLAS has collected inelastic pp collisions to perform first measurements of charged particle densities. These measurements will help to constrain various models that phenomenologically describe soft parton interactions. Understanding the trigger efficiencies for different event types is therefore crucial to minimize any possible bias in the event selection. ATLAS uses two main minimum bias triggers, featuring complementary detector components and trigger levels. While a hardware-based first trigger level situated in the forward regions with 2.2 < |η| < 3.8 has proven to select pp collisions very efficiently, the Inner Detector based minimum bias trigger uses a random seed on filled bunches and central tracking detectors for the event selection. Both triggers were essential for the analysis of kinematic spectra of charged particles. Their performance and trigger efficiency measurements as well as studies on possible bias sources will be presented. We also highlight the advantage of these triggers for particle correlation analyses. (author)

  17. Causality and headache triggers

    Science.gov (United States)

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  18. Exponential Synchronization of Networked Chaotic Delayed Neural Network by a Hybrid Event Trigger Scheme.

    Science.gov (United States)

    Fei, Zhongyang; Guan, Chaoxu; Gao, Huijun

    2018-06-01

    This paper is concerned with the exponential synchronization of a master-slave chaotic delayed neural network under an event-triggered control scheme. The model is established on a networked control framework, where both external disturbance and network-induced delay are taken into consideration. The desired aim is to synchronize the master and slave systems with limited communication capacity and network bandwidth. In order to save network resources, we adopt a hybrid event-trigger approach, which not only reduces the number of data packages sent out, but also rules out the Zeno phenomenon. By using an appropriate Lyapunov functional, a sufficient criterion for stability is proposed for the error system with an extended dissipativity performance index. Moreover, the hybrid event-trigger scheme and controller are co-designed for the network-based delayed neural network to guarantee exponential synchronization between the master and slave systems. The effectiveness and potential of the proposed results are demonstrated through a numerical example.
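
    As a generic illustration of a hybrid event-trigger of this kind (a relative-error threshold combined with an enforced waiting time that rules out Zeno behaviour), the following sketch uses made-up parameter values and is not the paper's co-design.

    ```python
    import numpy as np

    def make_hybrid_trigger(sigma=0.1, min_dwell=0.01):
        """Return an event check combining a relative error threshold with a
        minimum inter-event time, as in hybrid event-triggered schemes."""
        last_event = {"t": -np.inf}
        def should_send(t, error, state):
            if t - last_event["t"] < min_dwell:   # enforced waiting interval (no Zeno)
                return False
            if np.dot(error, error) > sigma * np.dot(state, state):
                last_event["t"] = t
                return True
            return False
        return should_send

    trigger = make_hybrid_trigger()
    print(trigger(0.000, np.array([0.5, 0.0]), np.array([1.0, 1.0])))  # True: error is large
    print(trigger(0.005, np.array([0.9, 0.0]), np.array([1.0, 1.0])))  # False: within dwell time
    ```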

  19. The D0 calorimeter trigger

    International Nuclear Information System (INIS)

    Guida, J.

    1992-12-01

    The D0 calorimeter trigger system consists of multiple levels that make physics-motivated trigger decisions. The Level-1 trigger uses hardware techniques to reduce the trigger rate from ∼100 kHz to 200 Hz. It forms sums of electromagnetic and hadronic energy, globally and in towers, along with finding the missing transverse energy. A minimum energy is set on these energy sums to pass the event. The Level-2 trigger is a set of software filters, operating in a parallel-processing MicroVAX farm, which further reduces the trigger rate to a few hertz. These filters reject events which lack electron candidates, jet candidates, or missing transverse energy. The performance of these triggers during the early running of the D0 detector will also be discussed

  20. A branch-and-cut-and-price algorithm for the cumulative capacitated vehicle routing problem

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Lysgaard, Jens

    2014-01-01

    The paper considers the Cumulative Capacitated Vehicle Routing Problem (CCVRP), which is a variation of the well-known Capacitated Vehicle Routing Problem (CVRP). In this problem, the traditional objective of minimizing total distance or time traveled by the vehicles is replaced by minimizing the sum of arrival times at the customers. A branch-and-cut-and-price algorithm for obtaining optimal solutions to the problem is proposed. Computational results based on a set of standard CVRP benchmarks are presented.
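
    The CCVRP objective itself is easy to state in code; the sketch below evaluates the sum of arrival times along one route for a made-up travel-time matrix (the branch-and-cut-and-price machinery is of course not shown).

    ```python
    def cumulative_arrival_time(route, travel_time, depot=0):
        """Sum of arrival times at the customers of a single route, the CCVRP
        objective (the vehicle leaves the depot at time 0)."""
        total, clock, prev = 0.0, 0.0, depot
        for customer in route:
            clock += travel_time[prev][customer]
            total += clock           # each customer's arrival time contributes
            prev = customer
        return total

    # toy 4-node instance: node 0 is the depot
    t = [[0, 4, 6, 5],
         [4, 0, 3, 7],
         [6, 3, 0, 2],
         [5, 7, 2, 0]]
    print(cumulative_arrival_time([1, 2, 3], t))   # 4 + (4+3) + (4+3+2) = 20
    ```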

  1. Handoff Triggering and Network Selection Algorithms for Load-Balancing Handoff in CDMA-WLAN Integrated Networks

    Directory of Open Access Journals (Sweden)

    Khalid Qaraqe

    2008-10-01

    This paper proposes a novel vertical handoff algorithm between WLAN and CDMA networks to enable the integration of these networks. The proposed vertical handoff algorithm assumes a handoff decision process (handoff triggering and network selection). The handoff trigger is decided based on the received signal strength (RSS). To reduce the likelihood of unnecessary false handoffs, the distance criterion is also considered. As a network selection mechanism, based on the wireless channel assignment algorithm, this paper proposes a context-based network selection algorithm and the corresponding communication algorithms between WLAN and CDMA networks. This paper focuses on a handoff triggering criterion which uses both the RSS and distance information, and a network selection method which uses context information such as the dropping probability, blocking probability, GoS (grade of service), and number of handoff attempts. As a decision-making criterion, the velocity threshold is determined to optimize system performance. The optimal velocity threshold is adjusted to assign the available channels to the mobile stations using four handoff strategies. The four handoff strategies are evaluated and compared with each other in terms of GoS. Finally, the proposed scheme is validated by computer simulations.

  2. Handoff Triggering and Network Selection Algorithms for Load-Balancing Handoff in CDMA-WLAN Integrated Networks

    Directory of Open Access Journals (Sweden)

    Kim Jang-Sub

    2008-01-01

    This paper proposes a novel vertical handoff algorithm between WLAN and CDMA networks to enable the integration of these networks. The proposed vertical handoff algorithm assumes a handoff decision process (handoff triggering and network selection). The handoff trigger is decided based on the received signal strength (RSS). To reduce the likelihood of unnecessary false handoffs, the distance criterion is also considered. As a network selection mechanism, based on the wireless channel assignment algorithm, this paper proposes a context-based network selection algorithm and the corresponding communication algorithms between WLAN and CDMA networks. This paper focuses on a handoff triggering criterion which uses both the RSS and distance information, and a network selection method which uses context information such as the dropping probability, blocking probability, GoS (grade of service), and number of handoff attempts. As a decision-making criterion, the velocity threshold is determined to optimize system performance. The optimal velocity threshold is adjusted to assign the available channels to the mobile stations using four handoff strategies. The four handoff strategies are evaluated and compared with each other in terms of GoS. Finally, the proposed scheme is validated by computer simulations.
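
    A toy version of the RSS-plus-distance triggering criterion (not the context-based network selection) might look like the following; the thresholds and the velocity rule are illustrative assumptions only.

    ```python
    def wlan_to_cdma_handoff(rss_dbm, distance_m, velocity_mps,
                             rss_threshold=-80.0, radius_m=90.0, v_threshold=8.0):
        """Trigger a vertical handoff out of the WLAN when the received signal
        strength drops below threshold AND the mobile is near the coverage edge;
        fast-moving users are handed off early to avoid ping-pong handoffs.
        All thresholds are illustrative placeholders."""
        weak_signal = rss_dbm < rss_threshold
        near_edge = distance_m > radius_m
        fast_user = velocity_mps > v_threshold
        return (weak_signal and near_edge) or (weak_signal and fast_user)

    print(wlan_to_cdma_handoff(rss_dbm=-85.0, distance_m=95.0, velocity_mps=2.0))  # True
    print(wlan_to_cdma_handoff(rss_dbm=-75.0, distance_m=95.0, velocity_mps=2.0))  # False
    ```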

  3. A general-purpose trigger processor system and its application to fast vertex trigger

    International Nuclear Information System (INIS)

    Hazumi, M.; Banas, E.; Natkaniec, Z.; Ostrowicz, W.

    1997-12-01

    A general-purpose hardware trigger system has been developed. The system comprises programmable trigger processors and pattern generator/samplers. The hardware design of the system is described. An application as a prototype of the very fast vertex trigger in an asymmetric B-factory at KEK is also explained. (author)

  4. BTeV Trigger

    International Nuclear Information System (INIS)

    Gottschalk, Erik E.

    2006-01-01

    BTeV was designed to conduct precision studies of CP violation in BB-bar events using a forward-geometry detector in a hadron collider. The detector was optimized for high-rate detection of beauty and charm particles produced in collisions between protons and antiprotons. The trigger was designed to take advantage of the main difference between events with beauty and charm particles and more typical hadronic events: the presence of detached beauty and charm decay vertices. The first stage of the BTeV trigger was to receive data from a pixel vertex detector, reconstruct tracks and vertices for every beam crossing, reject at least 98% of beam crossings in which neither beauty nor charm particles were produced, and trigger on beauty events with high efficiency. An overview of the trigger design and its evolution to include commodity networking and computing components is presented

  5. Cumulative query method for influenza surveillance using search engine data.

    Science.gov (United States)

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source for syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine Daum (approximately 25% market share) and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development sets. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created cumulative query methods, where n represents the number of combined queries accumulated in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, but only 4 of 13 combined queries had an r value of ≥.7. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, but only 6 of 15 combined queries had an r value of ≥.7. The cumulative query method showed relatively higher correlation with national influenza surveillance data than combined queries in the development and validation sets.
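
    A compact sketch of the cumulative query construction follows: rank candidate query series by Pearson correlation with ILI over a development period, keep those with r >= .7, and form cumulative signals from the top-n queries; the toy data and helper names are mine, not from the study.

    ```python
    from math import sqrt

    def pearson(x, y):
        """Plain Pearson correlation coefficient."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def cumulative_query_methods(query_series, ili, r_min=0.7):
        """Rank query series by correlation with ILI, keep r >= r_min, and build
        cumulative signals by summing the top-n series for n = 1, 2, ..."""
        ranked = sorted(((pearson(series, ili), name)
                         for name, series in query_series.items()), reverse=True)
        kept = [name for r, name in ranked if r >= r_min]
        methods = {}
        for n in range(1, len(kept) + 1):
            cols = zip(*(query_series[name] for name in kept[:n]))
            methods[n] = (kept[:n], [sum(c) for c in cols])
        return methods

    # toy weekly data: two query series and an ILI series
    queries = {"flu symptoms": [3, 5, 9, 12, 8], "fever": [2, 4, 7, 11, 6]}
    ili = [1.0, 1.4, 2.2, 3.1, 1.9]
    for n, (names, signal) in cumulative_query_methods(queries, ili).items():
        print(n, names, signal)
    ```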

  6. Whistler-triggered chorus emissions observed during daytime at low latitude ground station Jammu

    Science.gov (United States)

    Pratap Patel, Ravindra; Singh, K. K.; Singh, A. K.; Singh, R. P.

    In this paper, we present whistler-triggered chorus emissions recorded during daytime at the low latitude ground station Jammu (geomagnetic latitude 22° 26' N; L = 1.17) during the period from 1996 to 2003. Analysis of the eight years of collected data yielded 29 events that are definitively identified as chorus emissions triggered by whistlers. Magnetic activity was high during the observation period. The analysis shows that the whistlers propagated along geomagnetic field lines with L-values between 1.9 and 4.4. These waves could have propagated along the geomagnetic field lines either in the ducted mode or in the pro-longitudinal mode. The measured intensities of the triggered emissions and the triggering whistlers are comparable, although they vary from event to event. It is proposed that these waves are generated through wave-particle and wave-wave interactions. The relevant parameters of this interaction are computed for different L-values and wave amplitudes, and the proposed mechanisms are discussed with the help of the dynamic spectra of these emissions.

  7. Application of approximations for joint cumulative k-distributions for mixtures to FSK radiation heat transfer in multi-component high temperature non-LTE plasmas

    International Nuclear Information System (INIS)

    Maurente, André; França, Francis H.R.; Miki, Kenji; Howell, John R.

    2012-01-01

    Approximations for joint cumulative k-distributions for mixtures are efficient for full spectrum k-distribution (FSK) computations. These approximations reduce the database needed to perform FSK computations compared to the direct approach, which uses cumulative k-distributions computed from the spectrum of the mixture, and are also less computationally expensive than techniques in which the RTE must be solved for each component of the mixture. The aim of the present paper is to extend the approximations for joint cumulative k-distributions to non-LTE media. To do so, an FSK formulation for non-LTE media that is well suited for use with approximations for joint cumulative k-distributions is presented. The application of the proposed methodology is demonstrated by solving the radiation heat transfer in non-LTE high temperature plasmas composed of N, O, N2, NO, N2+ and mixtures of these species. The two most efficient approximations, namely superposition and multiplication, are employed and analyzed.

  8. Triggering trigeminal neuralgia

    DEFF Research Database (Denmark)

    Di Stefano, Giulia; Maarbjerg, Stine; Nurmikko, Turo

    2018-01-01

    Introduction: Although it is widely accepted that facial pain paroxysms triggered by innocuous stimuli constitute a hallmark sign of trigeminal neuralgia, very few studies to date have systematically investigated the role of the triggers involved. In the recently published diagnostic classification...

  9. The challenges and opportunities in cumulative effects assessment

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Melissa M., E-mail: mfoley@usgs.gov [U.S. Geological Survey, Pacific Coastal and Marine Science Center, 400 Natural Bridges, Dr., Santa Cruz, CA 95060 (United States); Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940 (United States); Mease, Lindley A., E-mail: lamease@stanford.edu [Center for Ocean Solutions, Stanford University, 473 Via Ortega, Stanford, CA 94305 (United States); Martone, Rebecca G., E-mail: rmartone@stanford.edu [Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940 (United States); Prahler, Erin E. [Center for Ocean Solutions, Stanford University, 473 Via Ortega, Stanford, CA 94305 (United States); Morrison, Tiffany H., E-mail: tiffany.morrison@jcu.edu.au [ARC Centre of Excellence for Coral Reef Studies, James Cook University, Townsville, QLD, 4811 (Australia); Murray, Cathryn Clarke, E-mail: cmurray@pices.int [WWF-Canada, 409 Granville Street, Suite 1588, Vancouver, BC V6C 1T2 (Canada); Wojcik, Deborah, E-mail: deb.wojcik@duke.edu [Nicholas School for the Environment, Duke University, 9 Circuit Dr., Durham, NC 27708 (United States)

    2017-01-15

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  10. The challenges and opportunities in cumulative effects assessment

    International Nuclear Information System (INIS)

    Foley, Melissa M.; Mease, Lindley A.; Martone, Rebecca G.; Prahler, Erin E.; Morrison, Tiffany H.; Murray, Cathryn Clarke; Wojcik, Deborah

    2017-01-01

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  11. The challenges and opportunities in cumulative effects assessment

    Science.gov (United States)

    Foley, Melissa M.; Mease, Lindley A; Martone, Rebecca G; Prahler, Erin E; Morrison, Tiffany H; Clarke Murray, Cathryn; Wojcik, Deborah

    2016-01-01

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  12. LHCb Topological Trigger Reoptimization

    CERN Document Server

    INSPIRE-00400931; Ilten, Philip; Khairullin, Egor; Rogozhnikov, Alex; Ustyuzhanin, Andrey; Williams, Michael

    2015-12-23

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In the LHC Run 1, this trigger, which utilized a custom boosted decision tree algorithm, selected a nearly 100% pure sample of b-hadrons with a typical efficiency of 60-70%; its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and neural networks. The topological trigger algorithm is designed to select all "interesting" decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. ...

  13. Upgrade readout and trigger electronics for the ATLAS liquid argon calorimeters for future LHC running

    CERN Document Server

    Yamanaka, T; The ATLAS collaboration

    2014-01-01

    The ATLAS Liquid Argon (LAr) calorimeters produce almost 200K signals that must be digitized and processed by the front-end and back-end electronics at every triggered event. Additionally, the front-end electronics sums analog signals to provide coarse-grained energy sums to the first-level (L1) trigger system. The current design was optimized for the nominal LHC luminosity of 10^34 cm^-2s^-1. However, in future higher-luminosity phases of LHC operation, the luminosity (and associated pile-up noise) will be 3-7 times higher. An improved spatial granularity of the trigger primitives is therefore proposed, in order to improve the trigger performance at high background rejection rates. For the first upgrade phase in 2018, new LAr Trigger Digitizer Boards are being designed to receive the higher granularity signals, digitize them on-detector and send them via fast optical links to a new digital processing system (DPS). This applies digital filtering and identifies significant energy depositions in each trigger ch...

  14. Upgraded readout and trigger electronics for the ATLAS liquid argon calorimeters for future LHC running

    CERN Document Server

    Yamanaka, T; The ATLAS collaboration

    2014-01-01

    The ATLAS Liquid Argon (LAr) calorimeters produce almost 200K signals that must be digitized and processed by the front-end and back-end electronics at every triggered event. Additionally, the front-end electronics sums analog signals to provide coarse-grained energy sums to the first-level (L1) trigger system. The current design was optimized for the nominal LHC luminosity of 10^34 cm^-2s^-1. However, in future higher-luminosity phases of LHC operation, the luminosity (and associated pile-up noise) will be 3-7 times higher. An improved spatial granularity of the trigger primitives is therefore proposed, in order to improve the trigger performance at high background rejection rates. For the first upgrade phase in 2018, new LAr Trigger Digitizer Boards are being designed to receive the higher granularity signals, digitize them on-detector and send them via fast optical links to a new digital processing system (DPS). This applies digital filtering and identifies significant energy depositions in each trigger ch...

  15. Instrumentation of the upgraded ATLAS tracker with a double buffer front-end architecture for track triggering

    International Nuclear Information System (INIS)

    Wardrope, D

    2012-01-01

    The Large Hadron Collider will be upgraded to provide an instantaneous luminosity of L = 5 × 10^34 cm^-2 s^-1, leading to excessive rates from the ATLAS Level-1 trigger. A double buffer front-end architecture for the ATLAS tracker replacement is proposed that will enable the use of track information in trigger decisions within 20 μs in order to reduce the high trigger rates. Analysis of ATLAS simulations has found that using track information will enable the use of single lepton triggers with transverse momentum thresholds of pT ∼ 25 GeV, which will be of great benefit to the future physics programme of ATLAS.

  16. The TOTEM modular trigger system

    Energy Technology Data Exchange (ETDEWEB)

    Bagliesi, M.G., E-mail: mg.bagliesi@pi.infn.i [University of Siena and INFN Pisa (Italy); Berretti, M.; Cecchi, R.; Greco, V.; Lami, S.; Latino, G.; Oliveri, E.; Pedreschi, E.; Scribano, A.; Spinella, F.; Turini, N. [University of Siena and INFN Pisa (Italy)

    2010-05-21

    The TOTEM experiment will measure the total cross-section with the luminosity independent method and study elastic and diffractive scattering at the LHC. We are developing a modular trigger system, based on programmable logic, that will select meaningful events within 2.5 μs. The trigger algorithm is based on a tree structure in order to obtain information compression. The trigger primitive is generated directly on the readout chip, VFAT, which has a dedicated fast output providing low-resolution hit information. In two of the TOTEM detectors, Roman Pots and T2, a coincidence chip will perform track recognition directly on the detector readout boards, while for T1 the hits are transferred from the VFATs to the trigger hardware. Starting from more than 2000 bits delivered by the detector electronics, we extract, in a first step, six trigger patterns of 32 LVDS signals each; we then build, on a dedicated board, a 1-bit (L1) trigger signal for the TOTEM experiment and 16 trigger bits for the CMS experiment global trigger system for future common data taking.

  17. The TOTEM modular trigger system

    International Nuclear Information System (INIS)

    Bagliesi, M.G.; Berretti, M.; Cecchi, R.; Greco, V.; Lami, S.; Latino, G.; Oliveri, E.; Pedreschi, E.; Scribano, A.; Spinella, F.; Turini, N.

    2010-01-01

    The TOTEM experiment will measure the total cross-section with the luminosity independent method and study elastic and diffractive scattering at the LHC. We are developing a modular trigger system, based on programmable logic, that will select meaningful events within 2.5 μs. The trigger algorithm is based on a tree structure in order to obtain information compression. The trigger primitive is generated directly on the readout chip, VFAT, which has a dedicated fast output providing low-resolution hit information. In two of the TOTEM detectors, Roman Pots and T2, a coincidence chip will perform track recognition directly on the detector readout boards, while for T1 the hits are transferred from the VFATs to the trigger hardware. Starting from more than 2000 bits delivered by the detector electronics, we extract, in a first step, six trigger patterns of 32 LVDS signals each; we then build, on a dedicated board, a 1-bit (L1) trigger signal for the TOTEM experiment and 16 trigger bits for the CMS experiment global trigger system for future common data taking.

  18. Storytelling as a trigger for sharing conversations

    OpenAIRE

    Emma Louise Parfitt

    2014-01-01

    This article explores whether traditional oral storytelling can be used to provide insights into the way in which young people of 12-14 years identify and understand the language of emotion and behaviour. Following the preliminary analysis, I propose that storytelling may trigger sharing conversations. My research attempts to extend the social and historical perspectives of Jack Zipes, on fairy tales, into a sociological analysis of young people’s lives today. I seek to investigate the extent...

  19. Managing regional cumulative effects of oil sands development in Alberta, Canada

    International Nuclear Information System (INIS)

    Spaling, H.; Zwier, J.

    2000-01-01

    This paper demonstrates an approach to regional cumulative effects management using the case of oil sands development in Alberta, Canada. The 17 existing, approved, or planned projects, all concentrated in a relatively small region, pose significant challenges for conducting and reviewing cumulative effects assessment (CEA) on a project-by-project basis. In response, stakeholders have initiated a regional cumulative effects management system that is among the first such initiatives anywhere. Advantages of this system include (1) more efficient gathering and sharing of information, including a common regional database, (2) setting acceptable regional environmental thresholds for all projects, (3) collaborative assessment of similar cumulative effects from related projects, (4) co-ordinated regulatory review and approval process for overlapping CEAs, and (5) institutional empowerment from a Regional Sustainable Development Strategy administered by a public authority. This case provides a model for integrating project-based CEA with regional management of cumulative effects. (author)

  20. Cumulative effects of planned industrial development and climate change on marine ecosystems

    Directory of Open Access Journals (Sweden)

    Cathryn Clarke Murray

    2015-07-01

    With increasing human population, large scale climate changes, and the interaction of multiple stressors, understanding cumulative effects on marine ecosystems is increasingly important. Two major drivers of change in coastal and marine ecosystems are industrial developments with acute impacts on local ecosystems, and global climate change stressors with widespread impacts. We conducted a cumulative effects mapping analysis of the marine waters of British Columbia, Canada, under different scenarios: climate change and planned developments. At the coast-wide scale, climate change drove the largest change in cumulative effects with both widespread impacts and high vulnerability scores. Where the impacts of planned developments occur, planned industrial and pipeline activities had high cumulative effects, but the footprint of these effects was comparatively localized. Nearshore habitats were at greatest risk from planned industrial and pipeline activities; in particular, the impacts of planned pipelines on rocky intertidal habitats were predicted to cause the highest change in cumulative effects. This method of incorporating planned industrial development in cumulative effects mapping allows explicit comparison of different scenarios with the potential to be used in environmental impact assessments at various scales. Its use allows resource managers to consider cumulative effect hotspots when making decisions regarding industrial developments and avoid unacceptable cumulative effects. Management needs to consider both global and local stressors in managing marine ecosystems for the protection of biodiversity and the provisioning of ecosystem services.
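
    The scenario comparison above rests on the widely used additive cumulative-effects mapping model (stressor intensity times habitat vulnerability, summed per map cell). A minimal sketch of that scoring step, with invented stressor grids and vulnerability weights purely for illustration:

        import numpy as np

        # Hypothetical 2-D intensity grids (0..1) for three stressors on a map of
        # (rows, cols) cells: a widespread climate signal and two localized footprints.
        rng = np.random.default_rng(1)
        stressors = {
            "climate_change": rng.random((50, 50)),
            "industrial":     np.where(rng.random((50, 50)) > 0.90, 1.0, 0.0),
            "pipeline":       np.where(rng.random((50, 50)) > 0.95, 1.0, 0.0),
        }

        # Hypothetical vulnerability weights of one habitat (e.g. rocky intertidal) to
        # each stressor; in practice these come from expert elicitation.
        vulnerability = {"climate_change": 1.2, "industrial": 2.0, "pipeline": 2.5}

        # Additive cumulative-effects score per map cell: sum_i intensity_i * weight_i.
        cumulative = sum(stressors[s] * vulnerability[s] for s in stressors)

        print("max cell score:", round(float(cumulative.max()), 2))
        print("hotspot cells (score > 3):", int((cumulative > 3).sum()))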

  1. LHCb Topological Trigger Reoptimization

    International Nuclear Information System (INIS)

    Likhomanenko, Tatiana; Khairullin, Egor; Rogozhnikov, Alex; Ustyuzhanin, Andrey; Ilten, Philip; Williams, Michael

    2015-01-01

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In the LHC Run 1, this trigger, which utilized a custom boosted decision tree algorithm, selected a nearly 100% pure sample of b-hadrons with a typical efficiency of 60-70%; its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and neural networks. The topological trigger algorithm is designed to select all "interesting" decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. Methods studied include cascading, ensembling and blending techniques. Furthermore, novel boosting techniques have been implemented that will help reduce systematic uncertainties in Run 2 measurements. We demonstrate that the reoptimized topological trigger is expected to significantly improve on the Run 1 performance for a wide range of b-hadron decays. (paper)
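
    As an illustration of the classifier-comparison workflow described above (not the LHCb code: the three candidate features and the signal/background samples below are synthetic placeholders, and scikit-learn's gradient boosting stands in for the custom BDT, MatrixNet and neural-network implementations):

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        n = 5000
        # Hypothetical per-candidate features: vertex detachment (flight-distance
        # significance), corrected mass [GeV] and transverse momentum [GeV].
        signal = np.column_stack([rng.normal(8, 3, n), rng.normal(5.0, 0.8, n), rng.exponential(6, n)])
        background = np.column_stack([rng.normal(2, 2, n), rng.normal(3.5, 1.5, n), rng.exponential(3, n)])
        X = np.vstack([signal, background])
        y = np.concatenate([np.ones(n), np.zeros(n)])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X_tr, y_tr)
        print("ROC AUC on held-out candidates:",
              round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))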

  2. The Neuro-Z-Vertex Trigger of the Belle II Experiment

    Directory of Open Access Journals (Sweden)

    Skambraks Sebastian

    2016-01-01

    This contribution presents the foreseen neural network trigger setup and the preceding 2D track finder. Special focus is put on the proposal and evaluation of a possible 3D upgrade of the 2D track finder. Additionally, details are given on a dedicated setup for the upcoming cosmic ray test.

  3. The effects of cumulative practice on mathematics problem solving.

    Science.gov (United States)

    Mayfield, Kristin H; Chase, Philip N

    2002-01-01

    This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving.

  4. Super-Resolution Algorithm in Cumulative Virtual Blanking

    Science.gov (United States)

    Montillet, J. P.; Meng, X.; Roberts, G. W.; Woolfson, M. S.

    2008-11-01

    The proliferation of mobile devices and the emergence of wireless location-based services have generated consumer demand for precise location. In this paper, the MUSIC super-resolution algorithm is applied to time delay estimation for positioning purposes in cellular networks. The goal is to position a Mobile Station using UMTS technology. The problem of Base-Station hearability is addressed using Cumulative Virtual Blanking. A simple simulator using DS-SS signals is presented. The results show that the MUSIC algorithm improves the time delay estimation both with and without Cumulative Virtual Blanking.
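
    A minimal frequency-domain MUSIC sketch for super-resolution delay estimation, using a synthetic two-path channel; the bin spacing, delays and noise level are illustrative assumptions, and Cumulative Virtual Blanking and the UMTS chip-level details are not modelled:

        import numpy as np
        from scipy.signal import find_peaks

        K, M = 64, 200                                  # frequency bins, snapshots
        f = np.arange(K) * 15e3                         # bin frequencies [Hz] (assumed spacing)
        true_delays = np.array([1.00e-6, 1.35e-6])      # two closely spaced path delays [s]
        rng = np.random.default_rng(3)

        A = np.exp(-2j * np.pi * np.outer(f, true_delays))            # K x P steering matrix
        S = rng.normal(size=(2, M)) + 1j * rng.normal(size=(2, M))    # random path amplitudes
        X = A @ S + 0.05 * (rng.normal(size=(K, M)) + 1j * rng.normal(size=(K, M)))

        R = X @ X.conj().T / M                          # sample covariance
        eigval, eigvec = np.linalg.eigh(R)              # eigenvalues in ascending order
        En = eigvec[:, :-2]                             # noise subspace (all but 2 eigenvectors)

        taus = np.linspace(0.5e-6, 2.0e-6, 1500)
        steer = np.exp(-2j * np.pi * np.outer(f, taus))
        pseudo = 1.0 / np.sum(np.abs(En.conj().T @ steer) ** 2, axis=0)   # MUSIC pseudo-spectrum

        peaks, _ = find_peaks(pseudo)
        top = peaks[np.argsort(pseudo[peaks])[-2:]]     # two strongest peaks
        print("estimated delays [us]:", np.round(np.sort(taus[top]) * 1e6, 3))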

  5. Study On Aftershock Triggering In Consideration Of Tectonic Stress Field

    Science.gov (United States)

    Hu, C.; Cai, Y.

    2007-12-01

    The occurrence of earthquakes is related to the strength of rock and to the tectonic stress field. In this paper the seismic risk factor (SRF), D = |τ_n| / (μ σ_n), is proposed to describe how prone a region is to aftershock triggering. The regional tectonic stress field is constrained by earthquakes, the velocity field from GPS, and geological surveys; as a first-order approximation, its magnitude can be estimated from the Coulomb failure criterion. The finite element method (FEM) and the factor D are used to study aftershock triggering by the 1976 Tangshan Ms = 7.8 earthquake. The results show that: (1) most of the aftershocks triggered by the Tangshan earthquake occurred in two leaf-shaped regions with D ≥ 1 near the north-east end of the main-shock fault, the largest leaf being about 100 km long and 40 km wide; (2) near the fault, the areas of aftershock triggering predicted by the seismic risk factor D and by ΔCFS (the change in Coulomb failure stress) are almost the same, but the area predicted by ΔCFS ≥ 0 is too large while that predicted by D ≥ 1 is bounded, and the areas predicted by ΔCFS ≥ 0.04 MPa nearly coincide with those of D ≥ 1; (3) ΔCFS = 0.01 MPa is sometimes taken as a lower threshold for aftershock triggering, but ΔCFS ≥ 0 only indicates an increased probability of triggering, not that an earthquake will occur; earthquake occurrence depends not only on ΔCFS but also on the tectonic stress field before the main shock.
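
    A toy computation of the seismic risk factor for a single candidate plane, using the formula above; the stress tensor, plane orientation and friction coefficient are invented for illustration, and D >= 1 corresponds to the triggering-prone regions described in the abstract:

        import numpy as np

        def seismic_risk_factor(stress, normal, mu=0.6):
            """D = |tau_n| / (mu * sigma_n) for a plane with unit normal `normal`.

            stress : 3x3 stress tensor (compression positive) [MPa]
            normal : normal vector of the candidate fault plane
            """
            normal = np.asarray(normal, dtype=float)
            normal /= np.linalg.norm(normal)
            traction = stress @ normal
            sigma_n = traction @ normal                          # normal stress on the plane
            tau_n = np.linalg.norm(traction - sigma_n * normal)  # resolved shear stress
            return tau_n / (mu * sigma_n)

        # Hypothetical regional stress tensor [MPa] and a NE-striking vertical plane.
        stress = np.diag([60.0, 40.0, 25.0])
        print("D =", round(seismic_risk_factor(stress, normal=[1, 1, 0]), 2))  # D < 1: not triggering-prone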

  6. Analysis of LDPE-ZnO-clay nanocomposites using novel cumulative rheological parameters

    Science.gov (United States)

    Kracalik, Milan

    2017-05-01

    Polymer nanocomposites exhibit complex rheological behaviour due to physical and possibly also chemical interactions between the individual phases. Up to now, the rheology of dispersed polymer systems has usually been described by evaluating the viscosity curve (shear thinning), the storage modulus curve (formation of a secondary plateau) or the damping behaviour (e.g. the van Gurp-Palmen plot or a comparison of the loss factor tan δ). In contrast to the evaluation of damping behaviour, values of cot δ were calculated here and termed the "storage factor", by analogy with the loss factor. The values of the storage factor were then integrated over a specific frequency range and termed the "cumulative storage factor". In this contribution, LDPE-ZnO-clay nanocomposites with different dispersion grades (physical networks) have been prepared and characterized by both the conventional and the novel analysis approach. In addition to the cumulative storage factor, further cumulative rheological parameters such as the cumulative complex viscosity, cumulative complex modulus and cumulative storage modulus have been introduced.
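
    As I read it, the cumulative storage factor is the storage factor cot δ = G'/G'' integrated over a chosen frequency window; a short numerical sketch with synthetic moduli standing in for the measured LDPE nanocomposite data (the power-law forms and prefactors are illustrative assumptions):

        import numpy as np
        from scipy.integrate import trapezoid

        def cumulative_storage_factor(omega, g_storage, g_loss):
            """Integrate the storage factor cot(delta) = G'/G'' over the frequency range."""
            return trapezoid(np.asarray(g_storage) / np.asarray(g_loss), omega)

        # Synthetic small-amplitude oscillatory shear data (illustrative, not measured).
        omega = np.logspace(-1, 2, 50)           # angular frequency [rad/s]
        g_storage = 1.0e3 * omega ** 0.4         # G' [Pa], weak low-frequency plateau
        g_loss = 2.0e3 * omega ** 0.7            # G'' [Pa]

        csf = cumulative_storage_factor(omega, g_storage, g_loss)
        print("cumulative storage factor:", round(csf, 1))

    On the same reading, the other cumulative parameters mentioned above would be the corresponding moduli or the complex viscosity integrated over the same frequency window.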

  7. Workshop on data acquisition and trigger system simulations for high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-31

    This report discusses the following topics: DAQSIM: A data acquisition system simulation tool; Front end and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSEM II & SIMOBJECT -- an Overview -- R.; DAGAR -- A synthesis system; Proposed Silicon Compiler for Physics Applications; Timed -- LOTOS in a PROLOG Environment: an Algebraic language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  8. Workshop on data acquisition and trigger system simulations for high energy physics

    International Nuclear Information System (INIS)

    1992-01-01

    This report discusses the following topics: DAQSIM: A data acquisition system simulation tool; Front end and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSEM II & SIMOBJECT -- an Overview -- R.; DAGAR -- A synthesis system; Proposed Silicon Compiler for Physics Applications; Timed -- LOTOS in a PROLOG Environment: an Algebraic language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies

  9. The Phase-I Trigger Readout Electronics Upgrade of the ATLAS Liquid Argon Calorimeters

    CERN Document Server

    Yang, Yi-lin; The ATLAS collaboration

    2018-01-01

    Super Cells have been proposed in the Phase-I LAr upgrade to replace the existing "Trigger Tower" scheme, in view of the higher-luminosity environment of Run 3 at the LHC. The higher granularity of the Super Cell trigger system requires higher data transmission and processing rates, and the new system must remain compatible with the existing trigger system. To fulfill these requirements, new front-end and back-end electronics are being developed. On the front-end side, the new LSB sums the LAr cell signals into Super Cell signals. The new baseplane distributes analog signals among the FEBs, the LTDB and the TBB. The LTDB sums Super Cell signals into Trigger Tower signals and forwards them to the TBB; the analog signals are also digitized on the LTDB and then sent to the back-end electronics. On the back-end side, the architecture is based on ATCA. The LAr carrier is used for monitoring and control, and the LATOMEs inserted into the LAr carrier compute energies from the digitized signals. So far, the demon...

  10. Limited preemptive scheduling of mixed time-triggered and event-triggered tasks

    NARCIS (Netherlands)

    Heuvel, van den M.M.H.P.; Bril, R.J.; Zhang, X.; Abdullah, S.M.J.; Isovic, D.

    2013-01-01

    Many embedded systems have complex timing constraints and, at the same time, have flexibility requirements which prohibit offline planning of the entire system. To support a mixture of time-triggered and event-triggered tasks, some industrial systems deploy a table-driven dispatcher for

  11. Expansion formulae for characteristics of cumulative cost in finite horizon production models

    NARCIS (Netherlands)

    Ayhan, H.; Schlegel, S.

    2001-01-01

    We consider the expected value and the tail probability of cumulative shortage and holding cost (i.e. the probability that cumulative cost is more than a certain value) in finite horizon production models. An exact expression is provided for the expected value of the cumulative cost for general

  12. Cumulative Trauma Among Mayas Living in Southeast Florida.

    Science.gov (United States)

    Millender, Eugenia I; Lowe, John

    2017-06-01

    Mayas, having experienced genocide, exile, and severe poverty, are at high risk for the consequences of cumulative trauma that continually resurfaces through current fear of an uncertain future. Little is known about the mental health and alcohol use status of this population. This correlational study explored how cumulative trauma relates to social determinants of health (years in the United States, education, health insurance status, marital status, and employment), psychological health (depression symptoms), and health behaviors (alcohol use) of 102 Guatemalan Mayas living in Southeast Florida. The results of this study indicated that, as specific social determinants of health and cumulative trauma increased, depression symptoms (particularly among women) and the risk for harmful alcohol use (particularly among men) increased. Identifying risk factors at an early stage before serious disease or problems are manifest provides room for early screening leading to early identification, early treatment, and better outcomes.

  13. Headache triggers in the US military.

    Science.gov (United States)

    Theeler, Brett J; Kenney, Kimbra; Prokhorenko, Olga A; Fideli, Ulgen S; Campbell, William; Erickson, Jay C

    2010-05-01

    Headaches can be triggered by a variety of factors. Military service members have a high prevalence of headache but the factors triggering headaches in military troops have not been identified. The objective of this study is to determine headache triggers in soldiers and military beneficiaries seeking specialty care for headaches. A total of 172 consecutive US Army soldiers and military dependents (civilians) evaluated at the headache clinics of 2 US Army Medical Centers completed a standardized questionnaire about their headache triggers. A total of 150 (87%) patients were active-duty military members and 22 (13%) patients were civilians. In total, 77% of subjects had migraine; 89% of patients reported at least one headache trigger with a mean of 8.3 triggers per patient. A wide variety of headache triggers was seen with the most common categories being environmental factors (74%), stress (67%), consumption-related factors (60%), and fatigue-related factors (57%). The types of headache triggers identified in active-duty service members were similar to those seen in civilians. Stress-related triggers were significantly more common in soldiers. There were no significant differences in trigger types between soldiers with and without a history of head trauma. Headaches in military service members are triggered mostly by the same factors as in civilians with stress being the most common trigger. Knowledge of headache triggers may be useful for developing strategies that reduce headache occurrence in the military.

  14. Origin of path independence between cumulative CO2 emissions and global warming

    Science.gov (United States)

    Seshadri, Ashwin K.

    2017-11-01

    Observations and GCMs exhibit approximate proportionality between cumulative carbon dioxide (CO2) emissions and global warming. Here we identify sufficient conditions for the relationship between cumulative CO2 emissions and global warming to be independent of the path of CO2 emissions, referred to as "path independence". Our starting point is a closed form expression for global warming in a two-box energy balance model (EBM), which depends explicitly on cumulative emissions, airborne fraction and time. Path independence requires that this function can be approximated as depending on cumulative emissions alone. We show that path independence arises from weak constraints, occurring if the timescale for changes in cumulative emissions (equal to the ratio between cumulative emissions and the emissions rate) is small compared to the timescale for changes in airborne fraction (which depends on CO2 uptake), and also small relative to a derived climate model parameter called the damping timescale, which is related to the rate at which deep-ocean warming affects global warming. Effects of uncertainties in the climate model and carbon cycle are examined. Large deep-ocean heat capacity in the Earth system is not necessary for path independence, which appears resilient to climate modeling uncertainties. However, long time constants in the Earth system carbon cycle are essential, ensuring that the airborne fraction changes slowly, with a timescale much longer than the timescale for changes in cumulative emissions. Therefore path independence between cumulative emissions and warming cannot arise for short-lived greenhouse gases.
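
    A quick numerical check of the stated condition, using an illustrative exponentially growing emissions path and assumed round numbers for the airborne-fraction and damping timescales (none of these values are taken from the paper, and the factor-of-two margin is arbitrary):

        import numpy as np

        years = np.arange(1850, 2021)
        emissions = 0.2 * np.exp((years - 1850) / 45.0)   # GtC/yr, ~45-yr e-folding growth
        cumulative = np.cumsum(emissions)                 # GtC (1-yr steps)

        tau_cum = cumulative[-1] / emissions[-1]          # timescale for changes in cumulative emissions [yr]
        tau_airborne, tau_damping = 300.0, 200.0          # assumed timescales [yr]

        print(f"tau_cum = {tau_cum:.0f} yr")
        print("path independence plausible:",
              tau_cum < 0.5 * tau_airborne and tau_cum < 0.5 * tau_damping)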

  15. Instrumentation of a Level-1 Track Trigger at ATLAS with Double Buffer Front-End Architecture

    CERN Document Server

    Cooper, B; The ATLAS collaboration

    2012-01-01

    The increased collision rate and pile-up produced at the HL-LHC require a substantial upgrade of the ATLAS level-1 trigger in order to maintain a broad physics reach. We show that tracking information can be used to control trigger rates, and describe a proposal for how this information can be extracted within a two-stage level-1 trigger design that has become the baseline for the HL-LHC upgrade. We demonstrate that, in terms of the communication between the external processing and the tracking detector front-ends, a hardware solution is possible that fits within the latency constraints of level-1.

  16. A Conceptual Framework for the Assessment of Cumulative Exposure to Air Pollution at a Fine Spatial Scale

    Directory of Open Access Journals (Sweden)

    Kihal-Talantikite Wahida

    2016-03-01

    Many epidemiological studies examining long-term health effects of exposure to air pollutants have characterized exposure by the outdoor air concentrations at sites that may be distant to subjects’ residences at different points in time. The temporal and spatial mobility of subjects and the spatial scale of exposure assessment could thus lead to misclassification in the cumulative exposure estimation. This paper attempts to fill the gap regarding cumulative exposure assessment to air pollution at a fine spatial scale in epidemiological studies investigating long-term health effects. We propose a conceptual framework showing how major difficulties in cumulative long-term exposure assessment could be surmounted. We then illustrate this conceptual model on the case of exposure to NO2 following two steps: (i) retrospective reconstitution of NO2 concentrations at a fine spatial scale; and (ii) a novel approach to assigning the time-relevant exposure estimates at the census block level, using all available data on residential mobility throughout a 10- to 20-year period prior to that for which the health events are to be detected. Our conceptual framework is both flexible and convenient for the needs of different epidemiological study designs.
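
    A minimal sketch of step (ii), assigning a time-weighted exposure across a residential history; the census-block identifiers, NO2 values and dates below are invented, and step (i), the retrospective NO2 reconstruction, is represented simply by a lookup table:

        from datetime import date

        # Hypothetical annual NO2 concentrations [ug/m3] per (census block, year).
        no2 = {("block_A", y): 42.0 - 0.5 * (y - 2000) for y in range(2000, 2020)}
        no2.update({("block_B", y): 31.0 - 0.3 * (y - 2000) for y in range(2000, 2020)})

        residential_history = [  # (census block, move-in date, move-out date)
            ("block_A", date(2000, 1, 1), date(2011, 6, 30)),
            ("block_B", date(2011, 7, 1), date(2019, 12, 31)),
        ]

        def cumulative_exposure(history, concentrations, start_year, end_year):
            """Time-weighted mean NO2 over the window, following the subject's moves."""
            weighted_sum = total_days = 0.0
            for block, move_in, move_out in history:
                for year in range(start_year, end_year + 1):
                    y0, y1 = date(year, 1, 1), date(year, 12, 31)
                    overlap = (min(move_out, y1) - max(move_in, y0)).days + 1
                    if overlap > 0:
                        weighted_sum += overlap * concentrations[(block, year)]
                        total_days += overlap
            return weighted_sum / total_days

        exposure = cumulative_exposure(residential_history, no2, 2000, 2019)
        print("time-weighted NO2 exposure 2000-2019:", round(exposure, 1), "ug/m3")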

  17. A Conceptual Framework for the Assessment of Cumulative Exposure to Air Pollution at a Fine Spatial Scale

    Science.gov (United States)

    Wahida, Kihal-Talantikite; Padilla, Cindy M.; Denis, Zmirou-Navier; Olivier, Blanchard; Géraldine, Le Nir; Philippe, Quenel; Séverine, Deguen

    2016-01-01

    Many epidemiological studies examining long-term health effects of exposure to air pollutants have characterized exposure by the outdoor air concentrations at sites that may be distant to subjects’ residences at different points in time. The temporal and spatial mobility of subjects and the spatial scale of exposure assessment could thus lead to misclassification in the cumulative exposure estimation. This paper attempts to fill the gap regarding cumulative exposure assessment to air pollution at a fine spatial scale in epidemiological studies investigating long-term health effects. We propose a conceptual framework showing how major difficulties in cumulative long-term exposure assessment could be surmounted. We then illustrate this conceptual model on the case of exposure to NO2 following two steps: (i) retrospective reconstitution of NO2 concentrations at a fine spatial scale; and (ii) a novel approach to assigning the time-relevant exposure estimates at the census block level, using all available data on residential mobility throughout a 10- to 20-year period prior to that for which the health events are to be detected. Our conceptual framework is both flexible and convenient for the needs of different epidemiological study designs. PMID:26999170

  18. The LHCb trigger

    CERN Document Server

    Hernando Morata, Jose Angel

    2006-01-01

    The LHCb experiment relies on an efficient trigger to select a rate up to 2 kHz of events useful for physics analysis from an initial rate of 10 MHz of visible collisions. In this contribution, we describe the different LHCb trigger algorithms and present their expected performance.

  19. Nonuniform Sparse Data Clustering Cascade Algorithm Based on Dynamic Cumulative Entropy

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

    A small amount of prior knowledge and randomly chosen initial cluster centers have a direct impact on the accuracy of iterative clustering algorithms. In this paper we propose a new algorithm that computes initial cluster centers and the best number of clusters for k-means clustering with little prior knowledge, and optimizes the clustering result. It constructs a Euclidean distance control factor based on the aggregation density sparsity degree to select the initial cluster centers of nonuniform sparse data, and obtains initial data clusters from a multidimensional diffusion density distribution. A multiobjective clustering approach based on dynamic cumulative entropy is then adopted to optimize the initial data clusters and the number of clusters. The experimental results show that the newly proposed algorithm performs well in obtaining initial cluster centers for the k-means algorithm and effectively improves the clustering accuracy on nonuniform sparse data by about 5%.
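
    The control-factor and entropy constructions themselves are not reproduced here; the sketch below only illustrates the general idea of density- and distance-aware seeding for k-means on nonuniform sparse data, with a simplified seeding rule and invented parameters:

        import numpy as np
        from scipy.spatial.distance import cdist
        from sklearn.cluster import KMeans
        from sklearn.datasets import make_blobs

        def density_distance_seeds(X, k, bandwidth=1.5):
            """Pick k seeds with high local density that are mutually far apart."""
            d = cdist(X, X)
            density = np.exp(-(d / bandwidth) ** 2).sum(axis=1)   # local density estimate
            seeds = [int(np.argmax(density))]                     # densest point first
            for _ in range(1, k):
                # Score = density * distance to the nearest already-chosen seed.
                score = density * d[:, seeds].min(axis=1)
                seeds.append(int(np.argmax(score)))
            return X[seeds]

        # Nonuniform synthetic data: one dense, one medium and one sparse cluster.
        X, _ = make_blobs(n_samples=[400, 150, 30], centers=None,
                          cluster_std=[0.5, 1.0, 2.0], random_state=4)
        init = density_distance_seeds(X, k=3)
        labels = KMeans(n_clusters=3, init=init, n_init=1).fit_predict(X)
        print("cluster sizes:", np.bincount(labels))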

  20. Crane Safety Assessment Method Based on Entropy and Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Aihua Li

    2017-01-01

    Assessing the safety status of cranes is an important problem. To overcome inaccuracies and misjudgments in such assessments, this work describes a safety assessment method for cranes that combines entropy and cumulative prospect theory. Firstly, the proposed method transforms the set of evaluation indices into an evaluation vector. Secondly, a decision matrix is constructed from the evaluation vectors and evaluation standards, and an entropy-based technique is applied to calculate the index weights. Thirdly, positive and negative prospect value matrices are established from reference points based on the positive and negative ideal solutions. The crane safety grade is then determined according to the ranked comprehensive prospect values. Finally, the safety status of four general overhead traveling crane samples is evaluated to verify the rationality and feasibility of the proposed method. The results demonstrate that the method described in this paper can precisely and reasonably reflect the safety status of a crane.
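
    A condensed sketch of the entropy-weighting and prospect-value steps; the decision matrix, the larger-is-safer convention and the value-function parameters below are illustrative, and the probability-weighting step of full cumulative prospect theory as well as the grade cut-offs are omitted:

        import numpy as np

        # Hypothetical decision matrix: 4 cranes x 5 safety indices (larger = safer).
        X = np.array([[0.82, 0.70, 0.90, 0.65, 0.75],
                      [0.60, 0.55, 0.72, 0.80, 0.58],
                      [0.95, 0.88, 0.85, 0.90, 0.92],
                      [0.40, 0.62, 0.50, 0.45, 0.66]])

        # Entropy weights: indices with more dispersion carry more weight.
        P = X / X.sum(axis=0)
        E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
        w = (1 - E) / (1 - E).sum()

        # Prospect values against the ideal solutions (alpha, beta, lambda are
        # commonly quoted prospect-theory parameters, used here for illustration).
        alpha, beta, lam = 0.88, 0.88, 2.25
        pos_ideal, neg_ideal = X.max(axis=0), X.min(axis=0)
        v_plus = (X - neg_ideal) ** alpha            # gains relative to the worst case
        v_minus = -lam * (pos_ideal - X) ** beta     # losses relative to the best case

        comprehensive = (w * (v_plus + v_minus)).sum(axis=1)
        print("index weights:", np.round(w, 3))
        print("comprehensive prospect values:", np.round(comprehensive, 3))
        print("safest crane: #", int(np.argmax(comprehensive)) + 1)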

  1. The ATLAS High Level Trigger Steering Framework and the Trigger Configuration System

    CERN Document Server

    Pérez Cavalcanti, Tiago; The ATLAS collaboration

    2011-01-01

    The ATLAS detector system installed in the Large Hadron Collider (LHC) at CERN is designed to study proton-proton and nucleus-nucleus collisions with a maximum center of mass energy of 14 TeV at a bunch collision rate of 40 MHz. In March 2010 the four LHC experiments saw the first proton-proton collisions at 7 TeV. Still within the year a collision rate of nearly 10 MHz is expected. At ATLAS, events of potential interest for ATLAS physics are selected by a three-level trigger system, with a final recording rate of about 200 Hz. The first level (L1) is implemented in custom hardware; the two levels of the high level trigger (HLT) are software triggers, running on large farms of standard computers and network devices.

    Within the ATLAS physics program more than 500 trigger signatures are defined. The HLT tests each signature on each L1-accepted event; the test outcome is recor...

  2. The ATLAS trigger: high-level trigger commissioning and operation during early data taking

    International Nuclear Information System (INIS)

    Goncalo, R

    2008-01-01

    The ATLAS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre of mass energy of 14 TeV, with a bunch-crossing rate of 40 MHz. The ATLAS three-level trigger will reduce this input rate to match the foreseen offline storage capability of 100-200 Hz. This paper gives an overview of the ATLAS High Level Trigger focusing on the system design and its innovative features. We then present the ATLAS trigger strategy for the initial phase of LHC exploitation. Finally, we report on the valuable experience acquired through in-situ commissioning of the system where simulated events were used to exercise the trigger chain. In particular we show critical quantities such as event processing times, measured in a large-scale HLT farm using a complex trigger menu

  3. The methodological proposal of photography as a discharger of the trigger of memory: application to Telêmaco Borba's history (1950-1969)

    Directory of Open Access Journals (Sweden)

    Juliana de Oliveira Teixeira

    2014-07-01

    This work tests and systematizes the methodological proposal of photography as a discharger of the trigger of memory, a technique that combines photographic images with oral history. The method, developed by the Communication and History group of Universidade Estadual de Londrina, was formalized in the dissertation of Maria Luisa Hoffmann (2010) and has since been applied to cities with recent histories. To provide a relevant test in this dissertation, Telêmaco Borba (PR) was chosen as the field of study, and the precepts of empiricism in communication were respected, following the ideas of Maria Immacolata Vassallo Lopes (2010) and Luiz Claudio Martino (2010). The application of the method was also theoretically grounded in the works of Jacques Le Goff (2003), Ecléa Bosi (2009) and Boris Kossoy (2009). During the empirical process, nine pioneers of Telêmaco Borba were interviewed following the methodological proposal, using 17 old photographs of the city. Broadly, the results of the test show that the technique, when applied with the epistemological criteria of science, becomes an efficient empirical tool, capable of bringing new data and information to studies of memory and to the history of the cities studied.

  4. The trigger supervisor: Managing triggering conditions in a high energy physics experiment

    International Nuclear Information System (INIS)

    Wadsworth, B.; Lanza, R.; LeVine, M.J.; Scheetz, R.A.; Videbaek, F.

    1987-01-01

    A trigger supervisor, implemented in VME-bus hardware, is described that enables the host computer to dynamically control and monitor the trigger configuration for acquiring data from multiple detector partitions in a complex experiment.

  5. Correlation between thermal gradient and flexure-type deformation as a potential trigger for exfoliation-related rock falls (Invited)

    Science.gov (United States)

    Collins, B. D.; Stock, G. M.

    2010-12-01

    Stress-induced exfoliation of granitic rocks is an important means by which cliffs deform and subsequently erode. During exfoliation, fractures are formed, and when exposed in cliff faces, are susceptible to subsequent rock falls. This is the case in Yosemite National Park, California, where exfoliation continues to play a primary role in cliff evolution. In Yosemite, numerous mechanisms are inferred to trigger rock falls; nevertheless, many rock falls have no recognized triggers. As a result, several potential, but as yet unquantified, triggering mechanisms have been proposed. One of these, thermally induced flexure, wherein solar radiation and temperature variation drives cumulative deformation of partially detached rock flakes, has the potential to explain several recent rock falls in Yosemite. We explore this potential mechanism by quantifying the deformation, temperature, and solar radiation exposure of a near-vertical rock flake in Yosemite Valley. The flake, 14 m tall, 4 m wide and 12 cm thick, receives direct sunlight during most of the day. Whereas the flake is attached to the cliff face at its bottom and top, the sides are detached from the cliff by a 10 cm wide crack on one side, tapering to a 1 cm wide crack on the opposite side. Instrumentation consists of three custom-designed crackmeters placed between the flake and the adjacent cliff face, three air temperature sensors located behind the flake, and three dual air temperature-light sensors located on the outside surface of the flake. Nearby relative humidity and barometric pressure sensors complete the instrumentation. Five-minute interval data from spring - fall 2010 indicate the flake undergoes maximum deformation at mid-span between attachment points and that it deforms from both diurnal and climatic temperature fluctuations. Recorded maximum deformations, measured perpendicular to crack orientation, are 1 cm diurnally and nearly 1.5 cm (including diurnal effect) over a 5-day period of cooler

  6. Cumulative effects of wind turbines. Volume 3: Report on results of consultations on cumulative effects of wind turbines on birds

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This report gives details of the consultations held in developing the consensus approach taken in assessing the cumulative effects of wind turbines. Contributions on bird issues, and views of stakeholders, the Countryside Council for Wales, electric utilities, Scottish Natural Heritage, and the National Wind Power Association are reported. The scoping of key species groups, where cumulative effects might be expected, consideration of other developments, the significance of any adverse effects, mitigation, regional capacity assessments, and predictive models are discussed. Topics considered at two stakeholder workshops are outlined in the appendices.

  7. Higher order cumulants in colorless partonic plasma

    Energy Technology Data Exchange (ETDEWEB)

    Cherif, S. [Sciences and Technologies Department, University of Ghardaia, Ghardaia, Algiers (Algeria); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria); Ahmed, M. A. A. [Department of Physics, College of Science, Taibah University Al-Madinah Al-Mounawwarah KSA (Saudi Arabia); Department of Physics, Taiz University in Turba, Taiz (Yemen); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria); Ladrem, M., E-mail: mladrem@yahoo.fr [Department of Physics, College of Science, Taibah University Al-Madinah Al-Mounawwarah KSA (Saudi Arabia); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria)

    2016-06-10

    Any physical system considered in the study of the QCD deconfinement phase transition certainly has a finite volume, so finite size effects are inevitably present. This renders the location of the phase transition and the determination of its order an extremely difficult task, even in the simplest known cases. In order to identify and locate the colorless QCD deconfinement transition point in a finite volume, T_0(V), a new approach based on the finite-size cumulant expansion of the order parameter and the ℒ_{m,n}-method is used. We have shown that higher-order cumulants and their ratios, associated with the thermodynamic fluctuations of the order parameter, behave in a distinctive way across the QCD deconfinement phase transition, revealing pronounced oscillations in the transition region. The sign structure and the oscillatory behavior of these quantities in the vicinity of the deconfinement phase transition point might be a sensitive probe and may allow one to elucidate their relation to the QCD phase transition point. In the context of our model, we have shown that the finite volume transition point is always associated with the appearance of a particular point in all of the higher-order cumulants under consideration.

  8. Cumulative effects of forest management activities: how might they occur?

    Science.gov (United States)

    R. M. Rice; R. B. Thomas

    1985-01-01

    Concerns are often voiced about possible environmental damage as the result of the cumulative sedimentation effects of logging and forest road construction. In response to these concerns, National Forests are developing procedures to reduce the possibility that their activities may lead to unacceptable cumulative effects.

  9. MRI of ventilated neonates and infants: respiratory pressure as trigger signal

    International Nuclear Information System (INIS)

    Lotz, J.; Reiffen, H.P.

    2004-01-01

    Introduction: motivated by the difficulties often encountered in setting up respiratory triggering for MR imaging of mechanically ventilated pediatric patients, a simpler and more reliable approach was sought. Method: with the help of a male-to-male Luer-Lock adapter in combination with a 3-way adapter, the tube of the respiratory compensation bellows was connected to the capnography output channel of the airway filter. Ten patients (age 4 months to 6 years) were examined with spin echo imaging and either respiratory compensation (T1-weighted imaging) or respiratory triggering (T2-weighted imaging). Results: a clear trigger signal was achieved in all cases. No negative influence on the quality or safety of the patients' mechanical ventilation was observed. Summary: the proposed adapter is safe, efficient and fast to install in patients undergoing MR imaging under general anaesthesia. (orig.)

  10. Self-triggered image intensifier tube for high-resolution UHECR imaging detector

    CERN Document Server

    Sasaki, M; Jobashi, M

    2003-01-01

    The authors have developed a self-triggered image intensifier tube with high-resolution imaging capability. An image detected by a first image intensifier tube, an electrostatic lens with a photocathode diameter of 100 mm, is split by a half-mirror into a path for CCD readout (768x494 pixels) and a path for fast control electronics that recognize and trigger on the image. The proposed system provides both a high signal-to-noise ratio, improving single-photoelectron detection, and excellent spatial resolution of between 207 and 240 μm, rendering this device a potentially essential tool for high-energy physics and astrophysics experiments, as well as high-speed photography. When combined with a 1-arcmin resolution optical system with a 50 deg. field of view proposed by the present authors, this device is expected to enable the observation of ultra-high-energy cosmic rays and high-energy neutrinos, leading to revolutionary progress in particle astrophysics as a complementary technique to traditional astronomical observations at multiple wave...

  11. Application of Higher-Order Cumulant in Fault Diagnosis of Rolling Bearing

    International Nuclear Information System (INIS)

    Shen, Yongjun; Yang, Shaopu; Wang, Junfeng

    2013-01-01

    In this paper a new method of pattern recognition based on higher-order cumulants and envelope analysis is presented. The core of the method is to construct analytic signals from the given signals and obtain their envelopes first, and then to compute and compare the higher-order cumulants of the envelope signals. The higher-order cumulants can be used as a characteristic quantity to distinguish the given signals. As an example, the method is applied to fault diagnosis of the 197726 rolling bearing of a freight locomotive. Comparisons of the second-, third- and fourth-order cumulants of the envelope signals from different vibration signals of the rolling bearing show that the new method can discriminate the normal signal and two fault signals distinctly.
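
    As a rough illustration of the envelope-plus-cumulant idea described above (not the authors' exact implementation), the sketch below builds the analytic signal with a Hilbert transform and computes the second- to fourth-order cumulants of the mean-removed envelope:

        # Sketch: envelope extraction and 2nd-4th order cumulants of the envelope.
        # Illustrative only; signals from healthy and faulty bearings would be
        # compared by these cumulant values.
        import numpy as np
        from scipy.signal import hilbert

        def envelope(x):
            """Envelope of a real vibration signal via its analytic signal."""
            return np.abs(hilbert(x))

        def envelope_cumulants(x):
            """2nd, 3rd and 4th order cumulants of the mean-removed envelope."""
            e = envelope(x)
            e = e - e.mean()
            m2, m3, m4 = (np.mean(e ** k) for k in (2, 3, 4))
            return {"c2": m2, "c3": m3, "c4": m4 - 3.0 * m2 ** 2}

        # Usage: envelope_cumulants(vibration_signal) for each recorded signal.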

  12. The ATLAS Muon and Tau Trigger

    CERN Document Server

    Dell'Asta, L; The ATLAS collaboration

    2013-01-01

    [Muon] The ATLAS experiment at CERN's Large Hadron Collider (LHC) deploys a three-level processing scheme for its trigger system. The level-1 muon trigger gets its input from fast muon trigger detectors. Fast sector-logic boards select muon candidates, which are passed via an interface board to the central trigger processor and then to the High Level Trigger (HLT). The muon HLT is purely software based and encompasses a level-2 (L2) trigger followed by an event filter (EF) in a staged trigger approach. It has access to the data of the precision muon detectors and other detector elements to refine the muon hypothesis. Trigger-specific algorithms were developed and are used at L2 to increase processing speed, for instance by making use of look-up tables and simpler algorithms, while the EF muon triggers mostly benefit from offline reconstruction software to obtain the most precise determination of the track parameters. There are two algorithms with different approaches, namely inside-out and outside-in...

  13. Evaluation of a combination tumor treatment using thermo-triggered liposomal drug delivery and carbon ion irradiation.

    Science.gov (United States)

    Kokuryo, Daisuke; Aoki, Ichio; Yuba, Eiji; Kono, Kenji; Aoshima, Sadahito; Kershaw, Jeff; Saga, Tsuneo

    2017-07-01

    The combination of radiotherapy with chemotherapy is one of the most promising strategies for cancer treatment. Here, a novel combination strategy utilizing carbon ion irradiation as a high-linear energy transfer (LET) radiotherapy together with a thermo-triggered nanodevice is proposed, and drug accumulation in the tumor and treatment effects are evaluated using magnetic resonance imaging relaxometry and immunohistology (Ki-67, n = 15). The thermo-triggered liposomal anticancer nanodevice was administered into colon-26 tumor-grafted mice, and drug accumulation and efficacy were compared for 6 groups (n = 32) that did or did not receive the radiotherapy and the thermo-trigger. In vivo quantitative R1 maps visually demonstrated that the multimodal thermosensitive polymer-modified liposomes (MTPLs) can accumulate in tumor tissue regardless of whether the region was irradiated by carbon ions. The tumor volume after combination treatment with carbon ion irradiation and thermo-triggered MTPLs was significantly smaller than in all control groups at 8 days after treatment. The proposed strategy of combining high-LET irradiation and the nanodevice provides an effective approach for minimally invasive cancer treatment.

  14. Performance of the ATLAS Muon Trigger and Phase-1 Upgrade of Level-1 Endcap Muon Trigger

    CERN Document Server

    Mizukami, Atsushi; The ATLAS collaboration

    2017-01-01

    The ATLAS experiment utilises a trigger system to efficiently record interesting events. It consists of a first-level trigger and high-level triggers. The first-level trigger is implemented in custom-built hardware to reduce the event rate from 40 MHz to 100 kHz. The software-based high-level triggers then refine the trigger decisions, reducing the output rate down to about 1 kHz. Events with muons in the final state are an important signature for many physics topics at the LHC, so an efficient muon trigger and a detailed understanding of its performance are required. Trigger efficiencies are, for example, obtained from muon decays of the Z boson with a Tag & Probe method, using proton-proton collision data collected in 2016 at a centre-of-mass energy of 13 TeV. The LHC is expected to increase its instantaneous luminosity to $3\times10^{34} \rm{cm^{-2}s^{-1}}$ after the Phase-1 upgrade between 2018-2020. An upgrade of the ATLAS trigger system is mandatory to cope with this high luminosity. In the Phase-1 upgrade, new det...

  15. Implementation of a custom time-domain firmware trigger for RADAR-based cosmic ray detection

    Science.gov (United States)

    Prohira, S.; Besson, D.; Kunwar, S.; Ratzlaff, K.; Young, R.

    2018-05-01

    Interest in radio-based detection schemes for ultra-high energy cosmic rays (UHECR) has surged in recent years, owing to the potentially very low cost per detection. The method of radio-frequency (RF) scatter has been proposed as potentially the most economical detection technology. Though the first dedicated experiment to employ this method, the Telescope Array RADAR experiment (TARA), reported no signal, efforts to develop more robust and sensitive trigger techniques continue. This paper details the development of a time-domain firmware trigger that exploits characteristics of the expected scattered signal from a UHECR extensive air shower (EAS). The improved sensitivity of this trigger is discussed, as well as its implementation in two separate field deployments from 2016 to 2017.

  16. Reliability analysis of multi-trigger binary systems subject to competing failures

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Levitin, Gregory

    2013-01-01

    This paper suggests two combinatorial algorithms for the reliability analysis of multi-trigger binary systems subject to competing failure propagation and failure isolation effects. A propagated failure with global effect (PFGE) is a failure that not only causes an outage of the component in which it originates, but also propagates through all other system components, causing failure of the entire system. The propagation effect of a PFGE can, however, be isolated in systems with functional dependence (FDEP) behavior. This paper studies two distinct consequences of PFGE resulting from a competition in the time domain between the failure isolation and failure propagation effects. Compared to existing works on competing failures, which are limited to systems with a single FDEP group, this paper considers more complicated cases in which the systems have multiple dependent FDEP groups. Analysis of such systems is more challenging because both the occurrence order between the trigger failure event and the PFGE from the dependent components and the occurrence order among the multiple trigger failure events have to be considered. Two combinatorial and analytical algorithms are proposed. Neither places any limitation on the type of time-to-failure distributions for the system components. Their correctness is verified using a Markov-based method. An example of memory systems is analyzed to demonstrate and compare the applications and advantages of the two proposed algorithms. - Highlights: ► Reliability of binary systems with multiple dependent functional dependence groups is analyzed. ► Competing failure propagation and failure isolation effects are considered. ► The proposed algorithms are combinatorial and applicable to any arbitrary type of time-to-failure distributions for system components.
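
    The paper's algorithms are combinatorial and analytical; purely as an illustration of the time-domain competition between failure propagation and failure isolation for a single FDEP group, here is a Monte Carlo sketch with assumed exponential lifetimes (the rates and mission time are made up, and this is not one of the proposed algorithms):

        # Monte Carlo illustration of the propagation-vs-isolation competition:
        # a dependent component's PFGE propagates globally only if it occurs
        # before its trigger fails (which would isolate it). Exponential
        # lifetimes and all parameter values are assumptions for illustration.
        import numpy as np

        def propagation_probability(rate_trigger, rate_pfge, mission_time,
                                    n=100_000, seed=1):
            rng = np.random.default_rng(seed)
            t_trigger = rng.exponential(1.0 / rate_trigger, n)
            t_pfge = rng.exponential(1.0 / rate_pfge, n)
            propagated = (t_pfge < t_trigger) & (t_pfge < mission_time)
            return propagated.mean()

        print(propagation_probability(rate_trigger=1e-3, rate_pfge=5e-4,
                                      mission_time=1000.0))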

  17. Cumulative keyboard strokes: a possible risk factor for carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Eleftheriou Andreas

    2012-08-01

    Background: Contradictory reports have been published regarding the association between Carpal Tunnel Syndrome (CTS) and the use of computer keyboards. Previous studies did not take into account the cumulative exposure to keyboard strokes among computer workers. The aim of the present study was to investigate the association between cumulative keyboard use (keyboard strokes) and CTS. Methods: Employees (461) from a governmental data entry and processing unit agreed to participate (response rate: 84.1%) in a cross-sectional study. A questionnaire was distributed to the participants to obtain information on socio-demographics and risk factors for CTS. The participants were examined for signs and symptoms related to CTS and were asked whether they had a previous history of, or surgery for, CTS. The cumulative number of keyboard strokes per worker per year was calculated from the payroll registry. Two case definitions for CTS were used. The first included subjects with a personal history of, or surgery for, CTS, while the second included subjects belonging to the first case definition plus those identified through clinical examination. Results: Multivariate analysis for both case definitions indicated that employees with high cumulative exposure to keyboard strokes were at increased risk of CTS (case definition A: OR = 2.23, 95% CI = 1.09-4.52; case definition B: OR = 2.41, 95% CI = 1.36-4.25). A dose-response pattern between cumulative exposure to keyboard strokes and CTS was revealed. Conclusions: The present study indicated a possible association between cumulative exposure to keyboard strokes and the development of CTS. Cumulative exposure to keyboard strokes should be taken into account as an exposure indicator in the exposure assessment of computer workers. Further research is needed to test the results of the current study and assess causality between cumulative keyboard strokes and CTS.
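
    The reported odds ratios come from multivariate logistic regression; as a simplified illustration only, the sketch below computes a crude odds ratio and 95% confidence interval from a 2x2 exposure-by-CTS table. The counts are placeholders, not the study's data:

        # Crude odds ratio and 95% CI from a 2x2 table (placeholder counts,
        # NOT the study's data; the study used multivariate logistic regression).
        import numpy as np

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """a, b = CTS cases among exposed/unexposed; c, d = non-cases."""
            or_ = (a * d) / (b * c)
            se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se_log_or)
            return or_, (lo, hi)

        print(odds_ratio_ci(a=30, b=20, c=170, d=240))  # placeholder counts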

  18. DUMAND data acquisition with triggering

    International Nuclear Information System (INIS)

    Brenner, A.E.; Theriot, D.; March, R.H.

    1980-01-01

    A data acquisition scheme for the standard DUMAND array that includes a simple triggering scheme as a fundamental part of the system is presented. Although a number of parameters are not yet fully understood, it is assumed that thresholds can be set such that the trigger signal is not dominated by random coincidences, so that the triggered system achieves a substantial decrease in data acquisition rate compared with an untriggered system. It is also assumed that the triggering logic is relatively simple and does not need major computational capabilities to reach a trigger decision. With these assumptions, it is possible to generate the trigger at the array and restrict the data transfer to shore. However, with a not unreasonable delay of 200 microseconds, it is even possible to transmit the information needed for the trigger to shore and perform all of the trigger logic there. The critical point is to send the minimum amount of information necessary to construct the trigger, so that one need not continuously send all of the possible information from all detectors of the array to shore. 1 figure

  19. The ATLAS hadronic tau trigger

    CERN Document Server

    Black, C; The ATLAS collaboration

    2012-01-01

    With the high luminosities of proton-proton collisions achieved at the LHC, the strategies for triggering have become more important than ever for physics analysis. The naive inclusive single tau lepton triggers now suffer from severe rate limitations. To allow for a large program of physics analyses with taus, the development of topological triggers that combine tau signatures with other measured quantities in the event is required. These combined triggers open many opportunities to study new physics beyond the Standard Model and to search for the Standard Model Higgs. We present the status and performance of the hadronic tau trigger in ATLAS. We demonstrate that the ATLAS tau trigger ran remarkably well over 2011, and how the lessons learned from 2011 led to numerous improvements in the preparation of the 2012 run. These improvements include the introduction of tau selection criteria that are robust against varying pileup scenarios, and the implementation of multivariate selection techniques in the tau trig...

  1. Cumulative effective dose associated with radiography and CT of adolescents with spinal injuries.

    Science.gov (United States)

    Lemburg, Stefan P; Peters, Soeren A; Roggenland, Daniela; Nicolas, Volkmar; Heyer, Christoph M

    2010-12-01

    The purpose of this study was to analyze the quantity and distribution of cumulative effective doses in diagnostic imaging of adolescents with spinal injuries. At a level 1 trauma center, from July 2003 through June 2009, imaging procedures performed during initial evaluation and hospitalization and after discharge of all patients 10-20 years old with spinal fractures were retrospectively analyzed. The cumulative effective doses for all imaging studies were calculated, and the doses to patients with spinal injuries who had multiple traumatic injuries were compared with the doses to patients with spinal injuries but without multiple injuries. The significance level was set at 5%. Imaging studies of 72 patients (32 with multiple injuries; average age, 17.5 years) entailed a median cumulative effective dose of 18.89 mSv. Patients with multiple injuries had a significantly higher total cumulative effective dose (29.70 versus 10.86 mSv), as well as a significantly higher cumulative effective dose during the initial evaluation (18.39 versus 2.83 mSv). Adolescents with spinal injuries receive a cumulative effective dose equal to that of adult trauma patients and nearly three times that of pediatric trauma patients. Areas of focus in lowering the cumulative effective dose should be appropriate initial estimation of trauma severity and careful selection of CT scan parameters.

  2. Data analysis at the CMS level-1 trigger: migrating complex selection algorithms from offline analysis and high-level trigger to the trigger electronics

    CERN Document Server

    Wulz, Claudia

    2017-01-01

    With ever increasing luminosity at the LHC, optimum online data selection is becoming more and more important. While in the case of some experiments (LHCb and ALICE) this task is being completely transferred to computer farms, the others -- ATLAS and CMS -- will not be able to do this in the medium-term future for technological, detector-related reasons. Therefore, these experiments pursue the complementary approach of migrating more and more of the offline and high-level trigger intelligence into the trigger electronics. The presentation illustrates how the level-1 trigger of the CMS experiment, and in particular its concluding stage, the so-called "Global Trigger", takes up this challenge.

  3. Nonlinear dynamical triggering of slow slip

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Paul A [Los Alamos National Laboratory; Knuth, Matthew W [WISCONSIN; Kaproth, Bryan M [PENN STATE; Carpenter, Brett [PENN STATE; Guyer, Robert A [Los Alamos National Laboratory; Le Bas, Pierre - Yves [Los Alamos National Laboratory; Daub, Eric G [Los Alamos National Laboratory; Marone, Chris [PENN STATE

    2010-12-10

    Among the most fascinating recent discoveries in seismology have been the phenomena of triggered slip, including triggered earthquakes and triggered tremor, as well as triggered slow, silent slip during which no seismic energy is radiated. Because fault nucleation depths cannot be probed directly, the physical regimes in which these phenomena occur are poorly understood. Thus determining the physical properties that control diverse types of triggered fault sliding, and what frictional constitutive laws govern triggered faulting variability, is challenging. We are characterizing the physical controls of triggered faulting with the goal of developing constitutive relations by conducting laboratory and numerical modeling experiments in sheared granular media at varying load conditions. In order to simulate granular fault zone gouge in the laboratory, glass beads are sheared in a double-direct configuration under constant normal stress, while subject to transient perturbation by acoustic waves. We find that triggered, slow, silent slip occurs at very small confining loads (≈1-3 MPa) that are smaller than those where dynamic earthquake triggering takes place (4-7 MPa), and that triggered slow slip is associated with bursts of LFE-like acoustic emission. Experimental evidence suggests that the nonlinear dynamical response of the gouge material induced by dynamic waves may be responsible for the triggered slip behavior: the slip duration, stress drop and along-strike slip displacement are proportional to the triggering wave amplitude. Further, we observe a shear-modulus decrease corresponding to dynamic-wave triggering relative to the shear modulus of stick-slips. Modulus decrease in response to dynamical wave amplitudes of roughly a microstrain and above is a hallmark of elastic nonlinear behavior. We believe that the dynamical waves increase the material non-affine elastic deformation during shearing, simultaneously leading to instability and slow slip. The inferred

  4. ATLAS: triggers for B-physics

    International Nuclear Information System (INIS)

    George, Simon

    2000-01-01

    The LHC will produce bb-bar events at an unprecedented rate. The number of events recorded by ATLAS will be limited by the rate at which they can be stored offline and subsequently analysed. Despite the huge number of events, the small branching ratios mean that analyses of many of the most interesting channels for CP violation and other measurements will be limited by statistics. The challenge for the Trigger and Data Acquisition (DAQ) system is therefore to maximise the fraction of interesting B decays in the B-physics data stream. The ATLAS Trigger/DAQ system is split into three levels. The initial B-physics selection is made in the first-level trigger by an inclusive low-p_T muon trigger (∼6 GeV). The second-level trigger strategy is based on identifying classes of final states by their partial reconstruction. The muon trigger is confirmed before proceeding to a track search. Electron/hadron separation is provided by the transition radiation tracking detector and the electromagnetic calorimeter. Muon identification is possible using the muon detectors and the hadronic calorimeter. Using the silicon strips, pixels and straw tracking, precise track reconstruction is used to make selections based on invariant mass, momentum and impact parameter. The ATLAS trigger group is currently engaged in algorithm development and performance optimisation for the B-physics trigger. This is closely coupled to the R and D programme for the higher-level triggers. Together the two programmes of work will optimise the hardware, architecture and algorithms to meet the challenging requirements. This paper describes the current status and progress of this work.

  5. Evolution of costly explicit memory and cumulative culture.

    Science.gov (United States)

    Nakamaru, Mayuko

    2016-06-21

    Humans can acquire new information and modify it (cumulative culture) based on their learning and memory abilities, especially explicit memory, through the processes of encoding, consolidation, storage, and retrieval. Explicit memory is categorized into semantic and episodic memories. Animals have semantic memory, while episodic memory is unique to humans and essential for innovation and the evolution of culture. As both episodic and semantic memory are needed for innovation, the evolution of explicit memory influences the evolution of culture. However, previous theoretical studies have shown that environmental fluctuations influence the evolution of imitation (social learning) and innovation (individual learning) and assume that memory is not an evolutionary trait. If individuals can store and retrieve acquired information properly, they can modify it and innovate new information. Therefore, being able to store and retrieve information is essential from the perspective of cultural evolution. However, if both storage and retrieval were too costly, forgetting and relearning would have an advantage over storing and retrieving acquired information. In this study, using mathematical analysis and individual-based simulations, we investigate whether cumulative culture can promote the coevolution of costly memory and social and individual learning, assuming that cumulative culture improves the fitness of each individual. The conclusions are: (1) without cumulative culture, a social learning cost is essential for the evolution of storage-retrieval. Costly storage-retrieval can evolve with individual learning but costly social learning does not evolve. When low-cost social learning evolves, the repetition of forgetting and learning is favored more than the evolution of costly storage-retrieval, even though a cultural trait improves the fitness. (2) When cumulative culture exists and improves fitness, storage-retrieval can evolve with social and/or individual learning, which

  6. Designing signal-enriched triggers for boosted jets.

    CERN Document Server

    Toumazou, Marina

    2017-01-01

    Triggers designed to favour the selection of hadronically decaying massive particles have been studied: both triggers using only ET and mass cuts (similar to the new 2017 triggers) and triggers exploiting polarization information. The mass-cut triggers show substantial gains in rate reduction, while the benefits of the polarization triggers are less obvious. The final conclusion is that it is more useful to identify and trigger on generic boosted decays, irrespective of the polarization of the decaying particle.

  7. Galactic interaction as the trigger for the young radio galaxy MRC B1221-423

    OpenAIRE

    Anderson, Craig; Johnston, Helen; Hunstead, Richard

    2013-01-01

    Mergers between a massive galaxy and a small gas-rich companion (minor mergers) have been proposed as a viable mechanism for triggering radio emission in an active galaxy. Until now the problem has been catching this sequence of events as it occurs. With MRC B1221-423 we have an active radio galaxy that has only recently been triggered, and a companion galaxy that provides the "smoking gun". Using spectroscopic data taken with the VIMOS Integral Field Unit detector on the European Southern...

  8. CUMULATE ROCKS ASSOCIATED WITH CARBONATE ASSIMILATION, HORTAVÆR COMPLEX, NORTH-CENTRAL NORWAY

    Science.gov (United States)

    Barnes, C. G.; Prestvik, T.; Li, Y.

    2009-12-01

    The Hortavær igneous complex intruded high-grade metamorphic rocks of the Caledonian Helgeland Nappe Complex at ca. 466 Ma. The complex is an unusual mafic-silicic layered intrusion (MASLI) because the principal felsic rock type is syenite and because the syenite formed in situ rather than by deep-seated partial melting of crustal rocks. Magma differentiation in the complex was by assimilation, primarily of calc-silicate rocks and melts with contributions from marble and semi-pelites, plus fractional crystallization. The effect of assimilation of calcite-rich rocks was to enhance stability of fassaitic clinopyroxene at the expense of olivine, which resulted in alkali-rich residual melts and lowering of silica activity. This combination of MASLI-style emplacement and carbonate assimilation produced three types of cumulate rocks: (1) Syenitic cumulates formed by liquid-crystal separation. As sheets of mafic magma were loaded on crystal-rich syenitic magma, residual liquid was expelled, penetrating the overlying mafic sheets in flame structures, and leaving a cumulate syenite. (2) Reaction cumulates. Carbonate assimilation, illustrated by a simple assimilation reaction: olivine + calcite + melt = clinopyroxene + CO2 resulted in cpx-rich cumulates such as clinopyroxenite, gabbro, and mela-monzodiorite, many of which contain igneous calcite. (3) Magmatic skarns. Calc-silicate host rocks underwent partial melting during assimilation, yielding a Ca-rich melt as the principal assimilated material and permitting extensive reaction with surrounding magma to form Kspar + cpx + garnet-rich ‘cumulate’ rocks. Cumulate types (2) and (3) do not reflect traditional views of cumulate rocks but instead result from a series of melt-present discontinuous (peritectic) reactions and partial melting of calc-silicate xenoliths. In the Hortavær complex, such cumulates are evident because of the distinctive peritectic cumulate assemblages. It is unclear whether assimilation of

  9. EPA Workshop on Epigenetics and Cumulative Risk ...

    Science.gov (United States)

    The workshop included presentations and discussions by scientific experts pertaining to three topics (i.e., epigenetic changes associated with diverse stressors, key science considerations in understanding epigenetic changes, and practical application of epigenetic tools to address cumulative risks from environmental stressors), addressed several questions under each topic, and included an opportunity for attendees to participate in break-out groups, provide comments and ask questions. Workshop Goals: The workshop seeks to examine the opportunity to use aggregate epigenetic change as an indicator in cumulative risk assessment for populations exposed to multiple stressors that affect epigenetic status. Epigenetic changes are specific molecular changes around DNA that alter the expression of genes. Epigenetic changes include DNA methylation, formation of histone adducts, and changes in micro RNAs. Research today indicates that epigenetic changes are involved in many chronic diseases (cancer, cardiovascular disease, obesity, diabetes, mental health disorders, and asthma). Research has also linked a wide range of stressors, including pollution and social factors, with the occurrence of epigenetic alterations. Epigenetic changes have the potential to reflect impacts of risk factors across multiple stages of life. Only recently receiving attention is the nexus between the factors of cumulative exposure to environmental

  10. Robust event-triggered MPC with guaranteed asymptotic bound and average sampling rate

    NARCIS (Netherlands)

    Brunner, F.D.; Heemels, W.P.M.H.; Allgower, F.

    2017-01-01

    We propose a robust event-triggered model predictive control (MPC) scheme for linear time-invariant discrete-time systems subject to bounded additive stochastic disturbances and hard constraints on the input and state. For given probability distributions of the disturbances acting on the system, we

  11. Hyperscaling breakdown and Ising spin glasses: The Binder cumulant

    Science.gov (United States)

    Lundow, P. H.; Campbell, I. A.

    2018-02-01

    Among the Renormalization Group Theory scaling rules relating critical exponents, there are hyperscaling rules involving the dimension of the system. It is well known that in Ising models hyperscaling breaks down above the upper critical dimension. It was shown by Schwartz (1991) that the standard Josephson hyperscaling rule can also break down in Ising systems with quenched random interactions. A related Renormalization Group Theory hyperscaling rule links the critical exponents for the normalized Binder cumulant and the correlation length in the thermodynamic limit. An appropriate scaling approach for analyzing measurements from criticality to infinite temperature is first outlined. Numerical data on the scaling of the normalized correlation length and the normalized Binder cumulant are shown for the canonical Ising ferromagnet model in dimension three where hyperscaling holds, for the Ising ferromagnet in dimension five (so above the upper critical dimension) where hyperscaling breaks down, and then for Ising spin glass models in dimension three where the quenched interactions are random. For the Ising spin glasses there is a breakdown of the normalized Binder cumulant hyperscaling relation in the thermodynamic limit regime, with a return to size independent Binder cumulant values in the finite-size scaling regime around the critical region.
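
    The fourth-order Binder cumulant referred to here is conventionally defined from the second and fourth moments of the order parameter, U = 1 - <m^4>/(3<m^2>^2). A minimal sketch of its computation from sampled configurations (synthetic data only, not the paper's simulations):

        # Standard fourth-order Binder cumulant U = 1 - <m^4>/(3<m^2>^2),
        # computed from order-parameter samples (e.g. per-spin magnetization
        # of an ensemble at one temperature and system size).
        import numpy as np

        def binder_cumulant(m):
            m = np.asarray(m, dtype=float)
            return 1.0 - np.mean(m ** 4) / (3.0 * np.mean(m ** 2) ** 2)

        # Limiting values useful as checks: ~2/3 deep in the ordered phase,
        # 0 for Gaussian-distributed m in the disordered phase.
        rng = np.random.default_rng(0)
        print(binder_cumulant(rng.normal(size=100_000)))  # ~ 0
        print(binder_cumulant(np.ones(1000)))             # = 2/3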

  12. Distributed modelling of shallow landslides triggered by intense rainfall

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Hazard assessment of shallow landslides represents an important aspect of land management in mountainous areas. Among all the methods proposed in the literature, physically based methods are the only ones that explicitly include the dynamic factors that control landslide triggering (rainfall pattern, land use). For this reason, they allow forecasting of both the temporal and the spatial distribution of shallow landslides. Physically based methods for shallow landslides are based on the coupling of the infinite slope stability analysis with hydrological models. Three different grid-based distributed hydrological models are presented in this paper: a steady-state model, a transient "piston-flow" wetting front model, and a transient diffusive model. A comparative test of these models was performed by simulating the landslides that occurred during a rainfall event (27–28 June 1997) that triggered hundreds of shallow landslides within Lecco province (central Southern Alps, Italy). In order to test the potential of a completely distributed model for rainfall-triggered landslides, radar-detected rainfall intensity was used. A new procedure for quantitative evaluation of distributed model performance is presented and used in this paper. The diffusive model proves to be the best model for simulating shallow landslide triggering after a rainfall event like the one analysed here. Finally, the radar data available for the June 1997 event greatly improved the simulation. In particular, the radar data made it possible to explain the non-uniform distribution of landslides within the study area.
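
    The stability half of such physically based models is the infinite-slope factor of safety, into which the hydrological model supplies a wetness ratio. A minimal sketch with generic symbols and made-up parameter values (not those of the paper's three models):

        # Infinite-slope factor of safety; the wetness ratio m = h_w / z would
        # come from the hydrological model (steady-state, piston-flow wetting
        # front, or diffusive). All symbols and values are generic illustrations.
        import numpy as np

        def factor_of_safety(c_eff, phi_deg, gamma_soil, z, beta_deg, m,
                             gamma_w=9.81):
            """c_eff [kPa], gamma_soil [kN/m^3], soil depth z [m],
            slope angle beta [deg], wetness ratio m in [0, 1]."""
            beta, phi = np.radians(beta_deg), np.radians(phi_deg)
            resisting = c_eff + ((gamma_soil - m * gamma_w) * z
                                 * np.cos(beta) ** 2 * np.tan(phi))
            driving = gamma_soil * z * np.sin(beta) * np.cos(beta)
            return resisting / driving

        # FS < 1 flags a cell as unstable for the given rainfall-driven wetness.
        print(factor_of_safety(c_eff=2.0, phi_deg=33, gamma_soil=19.0,
                               z=1.2, beta_deg=35, m=0.8))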

  13. Summary report of a workshop on establishing cumulative effects thresholds: a suggested approach for establishing cumulative effects thresholds in a Yukon context

    International Nuclear Information System (INIS)

    2003-01-01

    Increasingly, thresholds are being used as a land and cumulative effects assessment and management tool. To assist in the management of wildlife species such as woodland caribou, the Department of Indian and Northern Affairs (DIAND) Environment Directorate, Yukon sponsored a workshop to develop and use cumulative thresholds in the Yukon. The approximately 30 participants reviewed recent initiatives in the Yukon and other jurisdictions. The workshop is expected to help formulate a strategic vision for implementing cumulative effects thresholds in the Yukon. The key to success resides in building relationships with Umbrella Final Agreement (UFA) Boards, the Development Assessment Process (DAP), and the Yukon Environmental and Socio-Economic Assessment Act (YESAA). Broad support is required within an integrated resource management framework. The workshop featured discussions on current science and theory of cumulative effects thresholds. Potential data and implementation issues were also discussed. It was concluded that thresholds are useful and scientifically defensible. The threshold research results obtained in Alberta, British Columbia and the Northwest Territories are applicable to the Yukon. One of the best tools for establishing and tracking thresholds is habitat effectiveness. Effects must be monitored and tracked. Biologists must share their information with decision makers. Interagency coordination and assistance should be facilitated through the establishment of working groups. Regional land use plans should include thresholds. 7 refs.

  14. Spatiotemporal patterns, triggers and anatomies of seismically detected rockfalls

    Directory of Open Access Journals (Sweden)

    M. Dietze

    2017-11-01

    Rockfalls are a ubiquitous geomorphic process and a natural hazard in steep landscapes across the globe. Seismic monitoring can provide precise information on the timing, location and event anatomy of rockfalls, parameters that are otherwise hard to constrain. By pairing data from 49 seismically detected rockfalls in the Lauterbrunnen Valley in the Swiss Alps with auxiliary meteorologic and seismic data on potential triggers during autumn 2014 and spring 2015, we are able to (i) analyse the evolution of single rockfalls and their common properties, (ii) identify spatial changes in activity hotspots and (iii) explore temporal activity patterns on different scales, ranging from months to minutes, to quantify the relevant trigger mechanisms. Seismic data allow for the classification of rockfall activity into two distinct phenomenological types. The signals can be used to discern multiple rock mass releases from the same spot, identify rockfalls that trigger further rockfalls and resolve modes of subsequent talus slope activity. In contrast to findings based on discontinuous methods with integration times of several months, rockfall in the monitored limestone cliff is not spatially uniform but shows a systematic downward shift of a rock mass release zone following an exponential law, most likely driven by a continuously lowering water table. Freeze–thaw transitions, approximated at first order from air temperature time series, account for only 5 out of the 49 rockfalls, whereas 19 rockfalls were triggered by rainfall events with a peak lag time of 1 h. Another 17 rockfalls were triggered by diurnal temperature changes and occurred during the coldest hours of the day and during the highest temperature change rates. This study is thus the first to show direct links between proposed rockfall triggers and the spatiotemporal distribution of rockfalls under natural conditions; it extends existing models by providing seismic observations of the

  15. Towards RTOS support for mixed time-triggered and event-triggered task sets

    NARCIS (Netherlands)

    Heuvel, van den M.M.H.P.; Bril, R.J.; Lukkien, J.J.; Isovic, D.; Sankar Ramachandran, G.

    2012-01-01

    Many embedded systems have complex timing constraints and, at the same time, have flexibility requirements which prohibit offline planning of the entire system. To support a mixture of time-triggered and event-triggered tasks, some industrial systems deploy a real-time operating system (RTOS) with a

  16. The Run-2 ATLAS Trigger System

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00222798; The ATLAS collaboration

    2016-01-01

    The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009-2013 at centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 trigger and a software-based high-level trigger (HLT) that reduces the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV and higher luminosity, resulting in roughly five times higher trigger rates. A brief review of the ATLAS trigger system upgrades that were implemented between Run-1 and Run-2, allowing the system to cope with the increased trigger rates while maintaining or even improving the efficiency to select physics processes of interest, will be given. This includes changes to the Level-1 calorimeter and muon trigger systems, the introduction of a new Level-1 topological trigger module and the merging of the previously two-level HLT system into a single event filter farm. A ...

  17. The LVL2 trigger goes online

    CERN Multimedia

    David Berge

    On Friday, the 9th of February, the ATLAS TDAQ community reached an important milestone. In a successful integration test, cosmic-ray muons were recorded with parts of the muon spectrometer, the central-trigger system and a second-level trigger algorithm. This was actually the first time that a full trigger slice, all the way from the first-level trigger muon chambers up to event building after event selection by the second-level trigger, ran online with cosmic rays. The ATLAS trigger and data acquisition system has a three-tier structure that is designed to cope with the enormous demands of proton-proton collisions at a bunch-crossing frequency of 40 MHz, with a typical event size of 1-2 MB. The online event selection has to reduce the incoming rate by a factor of roughly 200,000, down to 200 Hz, a rate digestible by the archival-storage and offline-processing facilities. ATLAS has a mixed system: the first-level trigger (LVL1) is in hardware, while the other two consecutive levels, the second-level trigger (LVL2)...

  18. The Database Driven ATLAS Trigger Configuration System

    CERN Document Server

    Martyniuk, Alex; The ATLAS collaboration

    2015-01-01

    This contribution describes the trigger selection configuration system of the ATLAS low-level and high-level trigger (HLT) and the upgrades it received in preparation for LHC Run 2. The ATLAS trigger configuration system is responsible for applying the physics selection parameters for online data taking at both trigger levels and for the proper connection of the trigger lines across those levels. Here the low-level trigger consists of the already existing central trigger (CT) and the new Level-1 topological trigger (L1Topo), which has been added for Run 2. In detail, the tasks of the configuration system during online data taking are: application of the selection criteria (e.g. energy cuts, minimum multiplicities, trigger object correlation) at the three trigger components L1Topo, CT and HLT; and on-the-fly (e.g. rate-dependent) generation and application of prescale factors to the CT and HLT to adjust the trigger rates to the data taking conditions, such as falling luminosity or rate spikes in the detector readout ...

  19. Mismatch or cumulative stress : Toward an integrated hypothesis of programming effects

    NARCIS (Netherlands)

    Nederhof, Esther; Schmidt, Mathias V.

    2012-01-01

    This paper integrates the cumulative stress hypothesis with the mismatch hypothesis, taking into account individual differences in sensitivity to programming. According to the cumulative stress hypothesis, individuals are more likely to suffer from disease as adversity accumulates. According to the

  20. DT Local Trigger performance in 2015

    CERN Document Server

    CMS Collaboration

    2015-01-01

    The Local Trigger system of the CMS Drift Tube chambers (DT) was checked applying similar methods as in the LHC Run 1 (2012). The main variables shown in this note are the trigger efficiency, the trigger quality and the fraction of trigger ghosts. The performance was found to be comparable or better than in Run 1.

  1. Complexity and demographic explanations of cumulative culture.

    Science.gov (United States)

    Querbes, Adrien; Vaesen, Krist; Houkes, Wybo

    2014-01-01

    Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing--while favoured by increasing--population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture in case formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence doesn't afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.

  2. Sharing a quota on cumulative carbon emissions

    International Nuclear Information System (INIS)

    Raupach, Michael R.; Davis, Steven J.; Peters, Glen P.; Andrew, Robbie M.; Canadell, Josep G.; Ciais, Philippe

    2014-01-01

    Any limit on future global warming is associated with a quota on cumulative global CO2 emissions. We translate this global carbon quota to regional and national scales, on a spectrum of sharing principles that extends from continuation of the present distribution of emissions to an equal per-capita distribution of cumulative emissions. A blend of these endpoints emerges as the most viable option. For a carbon quota consistent with a 2 °C warming limit (relative to pre-industrial levels), the necessary long-term mitigation rates are very challenging (typically over 5% per year), both because of the strong limits on future emissions from the global carbon quota and because of the likely short-term persistence in emissions growth in many regions. (authors)
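
    The sharing spectrum described above can be written as a weighted blend of current emission shares ("inertia") and population shares ("equity") applied to the remaining cumulative quota. A minimal sketch with illustrative numbers, not the paper's regional data:

        # Blended allocation of a cumulative CO2 quota: w_inertia = 1 reproduces
        # the present emission distribution, w_inertia = 0 gives an equal
        # per-capita split. Quota and shares below are illustrative only.
        def regional_quota(global_quota_gtco2, emission_share,
                           population_share, w_inertia):
            blended_share = (w_inertia * emission_share
                             + (1.0 - w_inertia) * population_share)
            return global_quota_gtco2 * blended_share

        # e.g. a region with 15% of current emissions but 5% of population,
        # under a 1000 GtCO2 global quota and an even blend of the endpoints:
        print(regional_quota(1000.0, emission_share=0.15,
                             population_share=0.05, w_inertia=0.5))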

  3. Childhood Cumulative Risk and Later Allostatic Load

    DEFF Research Database (Denmark)

    Doan, Stacey N; Dich, Nadya; Evans, Gary W

    2014-01-01

    Objective: The present study investigated the long-term impact of exposure to poverty-related stressors during childhood on allostatic load, an index of physiological dysregulation, and the potential mediating role of substance use. Method: Participants (n = 162) were rural children from New York State, followed for 8 years (between the ages 9 and 17). Poverty-related stress was computed using the cumulative risk approach, assessing stressors across 9 domains, including environmental, psychosocial, and demographic factors. Allostatic load captured a range of physiological responses, including cardiovascular, hypothalamic pituitary adrenal axis, sympathetic adrenal medullary system, and metabolic activity. Smoking and alcohol/drug use were tested as mediators of the hypothesized childhood risk-adolescent allostatic load relationship. Results: Cumulative risk exposure at age 9 predicted increases...

  4. Cumulative impoundment evaporation in water resource management within the mid-Atlantic: A case study in Virginia

    Science.gov (United States)

    Scott, D.; Burgholzer, R.; Kleiner, J.; Brogan, C. O.; Julson, C.; Withers, E.

    2017-12-01

    Across the eastern United States, successful management of water resources to satisfy the competing demands of human consumption, industry, agriculture, and ecosystems requires both water quality and water quantity considerations. Over the last two decades, low streamflows during dry summers have increased scrutiny of water supply withdrawals. Within Virginia, a statewide hydrologic model provides quantitative assessments of the impacts of proposed water withdrawals on downstream river flow. Currently, evaporative losses are only accounted for from the large reservoirs. In this study, we sought to provide a baseline estimate of the cumulative evaporation from impoundments across all of the major river basins in Virginia. Virginia provides an ideal case study for the competing water demands in the mid-Atlantic region given its unique tracking of water withdrawals throughout the river corridor. Across the more than 73,000 Virginia impoundments, the cumulative annual impoundment evaporation was 706 MGD, or 49% of the permitted water withdrawal; the largest reservoirs (>100 acres) accounted for over 400 MGD, and the smaller impoundments for 136 MGD. Comparing total water loss (evaporation plus demand) across watersheds, there were some areas where impoundment evaporation was greater than human water demand. Seasonally, our results suggest that cumulative impoundment evaporation in some watersheds greatly impacts streamflow during low-flow periods. Our results demonstrate that future water supply planning will require understanding evaporation not only within large reservoirs, but also within the thousands of small impoundments across the landscape.

  5. Mapping cumulative environmental risks: examples from the EU NoMiracle project

    NARCIS (Netherlands)

    Pistocchi, A.; Groenwold, J.; Lahr, J.; Loos, M.; Mujica, M.; Ragas, A.M.J.; Rallo, R.; Sala, S.; Schlink, U.; Strebel, K.; Vighi, M.; Vizcaino, P.

    2011-01-01

    We present examples of cumulative chemical risk mapping methods developed within the NoMiracle project. The different examples illustrate the application of the concentration addition (CA) approach to pesticides at different scale, the integration in space of cumulative risks to individual organisms

  6. Mapping Cumulative Impacts of Human Activities on Marine Ecosystems

    OpenAIRE

    Seaplan

    2018-01-01

    Given the diversity of human uses and natural resources that converge in coastal waters, the potential independent and cumulative impacts of those uses on marine ecosystems are important to consider during ocean planning. This study was designed to support the development and implementation of the 2009 Massachusetts Ocean Management Plan. Its goal was to estimate and visualize the cumulative impacts of human activities on coastal and marine ecosystems in the state and federal waters off of Massachusetts.

  7. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date. When the disease risk is affected by calendar time trends, the total sample Kaplan-Meier and Aalen-Johansen estimators do not provide useful estimates of the general risk in the target population. We present some alternatives to this type of analysis. RESULTS: We show how a proportional hazards model may be used to extrapolate disease risk estimates if proportionality is a reasonable assumption. If this is not reasonable, we instead advocate that a more useful description of the disease risk lies in the age-specific cumulative incidence curves across strata given by time of entry, or perhaps just in the end-of-follow-up estimates across all strata...
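
    For reference, the total-sample estimator whose limitations are discussed above is, in the single-event case, one minus the Kaplan-Meier survival estimate. A minimal sketch of that baseline (the stratified and proportional-hazards alternatives advocated in the paper are not shown):

        # Plain Kaplan-Meier estimator; 1 - S(t) gives the cumulative incidence
        # when there are no competing risks. The paper's point is that this
        # total-sample estimate can mislead under calendar-time trends.
        import numpy as np

        def kaplan_meier(time, event):
            """time: follow-up times; event: 1 = diagnosis, 0 = censored."""
            time, event = np.asarray(time, float), np.asarray(event, int)
            order = np.argsort(time)
            time, event = time[order], event[order]
            n_at_risk = len(time)
            surv, out_t, out_ci = 1.0, [], []
            for t in np.unique(time):
                at_t = time == t
                d = event[at_t].sum()
                if d > 0:
                    surv *= 1.0 - d / n_at_risk
                    out_t.append(t)
                    out_ci.append(1.0 - surv)
                n_at_risk -= at_t.sum()
            return np.array(out_t), np.array(out_ci)

        t, ci = kaplan_meier([2, 3, 3, 5, 8, 8], [1, 0, 1, 1, 0, 0])
        print(dict(zip(t, ci)))  # age -> cumulative incidence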

  8. The Relationship between Gender, Cumulative Adversities and ...

    African Journals Online (AJOL)

    The Relationship between Gender, Cumulative Adversities and Mental Health of Employees in ... CAs were measured in three forms (family adversities (CAFam), personal adversities ... Age of employees ranged between 18-65 years.

  9. The Run-2 ATLAS Trigger System

    CERN Document Server

    Ruiz-Martinez, Aranzazu; The ATLAS collaboration

    2016-01-01

    The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009-2013 at centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 (L1) trigger and a software-based high-level trigger (HLT) that reduces the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV, resulting in roughly five times higher trigger rates. We will briefly review the ATLAS trigger system upgrades that were implemented during the shutdown, allowing us to cope with the increased trigger rates while maintaining or even improving our efficiency to select relevant physics processes. This includes changes to the L1 calorimeter and muon trigger systems, the introduction of a new L1 topological trigger module and the merging of the previously two-level HLT system into a single event filter farm. Using a few examples, we will show the ...

  10. Calorimetry triggering in ATLAS

    CERN Document Server

    Igonkina, O; Adragna, P; Aharrouche, M; Alexandre, G; Andrei, V; Anduaga, X; Aracena, I; Backlund, S; Baines, J; Barnett, B M; Bauss, B; Bee, C; Behera, P; Bell, P; Bendel, M; Benslama, K; Berry, T; Bogaerts, A; Bohm, C; Bold, T; Booth, J R A; Bosman, M; Boyd, J; Bracinik, J; Brawn, I, P; Brelier, B; Brooks, W; Brunet, S; Bucci, F; Casadei, D; Casado, P; Cerri, A; Charlton, D G; Childers, J T; Collins, N J; Conde Muino, P; Coura Torres, R; Cranmer, K; Curtis, C J; Czyczula, Z; Dam, M; Damazio, D; Davis, A O; De Santo, A; Degenhardt, J; Delsart, P A; Demers, S; Demirkoz, B; Di Mattia, A; Diaz, M; Djilkibaev, R; Dobson, E; Dova, M, T; Dufour, M A; Eckweiler, S; Ehrenfeld, W; Eifert, T; Eisenhandler, E; Ellis, N; Emeliyanov, D; Enoque Ferreira de Lima, D; Faulkner, P J W; Ferland, J; Flacher, H; Fleckner, J E; Flowerdew, M; Fonseca-Martin, T; Fratina, S; Fhlisch, F; Gadomski, S; Gallacher, M P; Garitaonandia Elejabarrieta, H; Gee, C N P; George, S; Gillman, A R; Goncalo, R; Grabowska-Bold, I; Groll, M; Gringer, C; Hadley, D R; Haller, J; Hamilton, A; Hanke, P; Hauser, R; Hellman, S; Hidvgi, A; Hillier, S J; Hryn'ova, T; Idarraga, J; Johansen, M; Johns, K; Kalinowski, A; Khoriauli, G; Kirk, J; Klous, S; Kluge, E-E; Koeneke, K; Konoplich, R; Konstantinidis, N; Kwee, R; Landon, M; LeCompte, T; Ledroit, F; Lei, X; Lendermann, V; Lilley, J N; Losada, M; Maettig, S; Mahboubi, K; Mahout, G; Maltrana, D; Marino, C; Masik, J; Meier, K; Middleton, R P; Mincer, A; Moa, T; Monticelli, F; Moreno, D; Morris, J D; Mller, F; Navarro, G A; Negri, A; Nemethy, P; Neusiedl, A; Oltmann, B; Olvito, D; Osuna, C; Padilla, C; Panes, B; Parodi, F; Perera, V J O; Perez, E; Perez Reale, V; Petersen, B; Pinzon, G; Potter, C; Prieur, D P F; Prokishin, F; Qian, W; Quinonez, F; Rajagopalan, S; Reinsch, A; Rieke, S; Riu, I; Robertson, S; Rodriguez, D; Rogriquez, Y; Rhr, F; Saavedra, A; Sankey, D P C; Santamarina, C; Santamarina Rios, C; Scannicchio, D; Schiavi, C; Schmitt, K; Schultz-Coulon, H C; Schfer, U; Segura, E; Silverstein, D; Silverstein, S; Sivoklokov, S; Sjlin, J; Staley, R J; Stamen, R; Stelzer, J; Stockton, M C; Straessner, A; Strom, D; Sushkov, S; Sutton, M; Tamsett, M; Tan, C L A; Tapprogge, S; Thomas, J P; Thompson, P D; Torrence, E; Tripiana, M; Urquijo, P; Urrejola, P; Vachon, B; Vercesi, V; Vorwerk, V; Wang, M; Watkins, P M; Watson, A; Weber, P; Weidberg, T; Werner, P; Wessels, M; Wheeler-Ellis, S; Whiteson, D; Wiedenmann, W; Wielers, M; Wildt, M; Winklmeier, F; Wu, X; Xella, S; Zhao, L; Zobernig, H; de Seixas, J M; dos Anjos, A; Asman, B; Özcan, E

    2009-01-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10^5 to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  11. Cumulation of light nuclei

    International Nuclear Information System (INIS)

    Baldin, A.M.; Bondarev, V.K.; Golovanov, L.B.

    1977-01-01

    Limiting fragmentation of light nuclei (deuterium, helium) bombarded with 8.6 GeV/c protons was investigated. Fragments (pions, protons and deuterons) were detected within the emission angle range of 50-150 deg with respect to the primary protons and within the momentum range of 150-180 MeV/c. By the kinematics of the collision of a primary proton with a target at rest, the fragments observed correspond to a target mass of up to 3 GeV. Thus, the data obtained correspond to cumulation up to the third order

  12. Cumulative Mass and NIOSH Variable Lifting Index Method for Risk Assessment: Possible Relations.

    Science.gov (United States)

    Stucchi, Giulia; Battevi, Natale; Pandolfi, Monica; Galinotti, Luca; Iodice, Simona; Favero, Chiara

    2018-02-01

    Objective: The aim of this study was to explore whether the Variable Lifting Index (VLI) can be corrected for cumulative mass, and thus to test its efficacy in predicting the risk of low-back pain (LBP). Background: A validation study of the VLI method was published in this journal, reporting promising results. Although several studies have highlighted a positive correlation between cumulative load and LBP, cumulative mass has never been considered in any of the studies investigating the relationship between manual material handling and LBP. Method: Both the VLI and the cumulative mass were calculated for 2,374 exposed subjects using a systematic approach. Due to the high variability of cumulative mass values, a stratification within VLI categories was employed. Dummy variables (1-4) were assigned to each class and used as a multiplier factor for the VLI, resulting in a new index (VLI_CMM). Data on LBP were collected by occupational physicians at the study sites. Logistic regression was used to estimate the risk of acute LBP within levels of risk exposure, compared with a control group of 1,028 unexposed subjects. Results: The data showed greatly variable values of cumulative mass across all VLI classes. The potential effect of cumulative mass on damage emerged as not significant (p value = .6526). Conclusion: When comparing VLI_CMM with the raw VLI, the former failed to prove itself a better predictor of LBP risk. Application: To recognize cumulative mass as a modifier, especially for lumbar degenerative spine diseases, authors of future studies should investigate the potential association between the VLI and other damage variables.
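
    The VLI_CMM correction described above is simply the VLI scaled by a cumulative-mass class multiplier of 1-4. A minimal sketch in which the class boundaries are placeholders (the study derived its classes by stratifying cumulative mass within VLI categories):

        # VLI corrected by a cumulative-mass class multiplier in {1, 2, 3, 4}.
        # The tonnage boundaries below are placeholders, not the study's cut-offs.
        def vli_cmm(vli, cumulative_mass_kg,
                    bounds=(50_000, 150_000, 400_000)):
            multiplier = 1 + sum(cumulative_mass_kg > b for b in bounds)
            return vli * multiplier

        print(vli_cmm(vli=1.8, cumulative_mass_kg=220_000))  # -> 1.8 * 3 = 5.4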

  13. Session: What do we know about cumulative or population impacts

    Energy Technology Data Exchange (ETDEWEB)

    Kerlinger, Paul; Manville, Al; Kendall, Bill

    2004-09-01

    This session at the Wind Energy and Birds/Bats workshop consisted of a panel discussion followed by a discussion/question and answer period. The panelists were Paul Kerlinger, Curry and Kerlinger, LLC; Al Manville, U.S. Fish and Wildlife Service; and Bill Kendall, U.S. Geological Survey. The panel addressed the potential cumulative impacts of wind turbines on bird and bat populations over time. Panel members gave brief presentations that touched on what is currently known, what laws apply, and the usefulness of population modeling. Topics addressed included which sources of modeling should be included in cumulative impacts, comparison of impacts from different modes of energy generation, as well as what research is still needed regarding cumulative impacts of wind energy development on bird and bat populations.

  14. Wired and Wireless Camera Triggering with Arduino

    Science.gov (United States)

    Kauhanen, H.; Rönnholm, P.

    2017-10-01

    Synchronous triggering is an important task that allows simultaneous data capture from multiple cameras. Accurate synchronization enables 3D measurements of moving objects or from a moving platform. In this paper, we describe one wired and four wireless variations of Arduino-based low-cost remote trigger systems designed to provide a synchronous trigger signal for industrial cameras. Our wireless systems utilize 315 MHz or 434 MHz frequencies with noise filtering capacitors. In order to validate the synchronization accuracy, we developed a prototype of a rotating trigger detection system (named RoTriDeS). This system is suitable to detect the triggering accuracy of global shutter cameras. As a result, the wired system indicated an 8.91 μs mean triggering time difference between two cameras. Corresponding mean values for the four wireless triggering systems varied between 7.92 and 9.42 μs. Presented values include both camera-based and trigger-based desynchronization. Arduino-based triggering systems appeared to be feasible, and they have the potential to be extended to more complicated triggering systems.
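
    As a minimal companion to the synchronization figures quoted above (and not the authors' RoTriDeS software), the sketch below computes the mean and maximum triggering time difference from two lists of per-frame trigger timestamps; the timestamp values are invented for illustration.

      # Hypothetical trigger timestamps (microseconds) recorded for two cameras.
      cam_a = [0.0, 100000.2, 200000.1, 300000.4]
      cam_b = [8.9, 100009.0, 200008.6, 300009.5]

      diffs = [abs(b - a) for a, b in zip(cam_a, cam_b)]
      print(f"mean triggering time difference: {sum(diffs) / len(diffs):.2f} us")
      print(f"max triggering time difference:  {max(diffs):.2f} us")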

  15. Cumulative irritation potential of topical retinoid formulations.

    Science.gov (United States)

    Leyden, James J; Grossman, Rachel; Nighland, Marge

    2008-08-01

    Localized irritation can limit treatment success with topical retinoids such as tretinoin and adapalene. The factors that influence irritant reactions have been shown to include individual skin sensitivity, the particular retinoid and concentration used, and the vehicle formulation. To compare the cutaneous tolerability of tretinoin 0.04% microsphere gel (TMG) with that of adapalene 0.3% gel and a standard tretinoin 0.025% cream. The results of 2 randomized, investigator-blinded studies of 2 to 3 weeks' duration, which utilized a split-face method to compare cumulative irritation scores induced by topical retinoids in subjects with healthy skin, were combined. Study 1 compared TMG 0.04% with adapalene 0.3% gel over 2 weeks, while study 2 compared TMG 0.04% with tretinoin 0.025% cream over 3 weeks. In study 1, TMG 0.04% was associated with significantly lower cumulative scores for erythema, dryness, and burning/stinging than adapalene 0.3% gel. However, in study 2, there were no significant differences in cumulative irritation scores between TMG 0.04% and tretinoin 0.025% cream. Measurements of erythema by a chromameter showed no significant differences between the test formulations in either study. Cutaneous tolerance of TMG 0.04% on the face was superior to that of adapalene 0.3% gel and similar to that of a standard tretinoin cream containing a lower concentration of the drug (0.025%).

  16. Triggering for charm, beauty, and truth

    International Nuclear Information System (INIS)

    Appel, J.A.

    1982-02-01

    As the search for more and more rare processes accelerates, the need for more and more effective event triggers also accelerates. In the earliest experiments, a simple coincidence often sufficed not only as the event trigger, but as the complete record of an event of interest. In today's experiments, not only has the fast trigger become more sophisticated, but one or more additional levels of trigger processing precede writing event data to magnetic tape for later analysis. Further search experiments will certainly require further expansion in the number of trigger levels required to filter those rare events of particular interest.

  17. AIDS radio triggers.

    Science.gov (United States)

    Elias, A M

    1991-07-01

    In April 1991, the Ethnic Communities' Council of NSW was granted funding under the Community AIDS Prevention and Education Program through the Department of Community Services and Health, to produce a series of 6x50 second AIDS radio triggers with a 10-second tag line for further information. The triggers are designed to disseminate culturally-sensitive information about HIV/AIDS in English, Italian, Greek, Spanish, Khmer, Turkish, Macedonian, Serbo-Croatian, Arabic, Cantonese, and Vietnamese, with the goal of increasing awareness and decreasing the degree of misinformation about HIV/AIDS among people of non-English-speaking backgrounds through radio and sound. The 6 triggers cover the denial that AIDS exists in the community, beliefs that words and feelings do not protect one from catching HIV, encouraging friends to be compassionate, compassion within the family, AIDS information for a young audience, and the provision of accurate and honest information on HIV/AIDS. The triggers are slated to be completed by the end of July 1991 and will be broadcast on all possible community, ethnic, and commercial radio networks across Australia. They will be available upon request in composite form with an information kit for use by health care professionals and community workers.

  18. Self-similarity of hard cumulative processes in fixed target experiment for BES-II at STAR

    International Nuclear Information System (INIS)

    Tokarev, M.V.; Aparin, A.A.; Zborovsky, I.

    2014-01-01

    The search for signatures of a phase transition in Au + Au collisions is at the heart of the heavy ion program at RHIC. Systematic study of particle production over a wide range of collision energy revealed new phenomena such as the nuclear suppression effect expressed by the nuclear modification factor, the constituent quark number scaling for elliptic flow, the 'ridge effect' in - fluctuations, etc. To determine the phase boundaries and the location of the critical point of nuclear matter, the Beam Energy Scan (BES-I) program at RHIC has been suggested and performed by the STAR and PHENIX Collaborations. The obtained results have shown that the program (BES-II) should be continued. In this paper a proposal to use hard cumulative processes in the BES Phase-II program is outlined. Selection of cumulative events is assumed to enrich the data sample with a new type of collisions characterized by higher energy density and more compressed matter. This would allow finding clearer signatures of the phase transition, locating the critical point and studying extreme conditions in heavy ion collisions.

  19. Measurement of multi-particle azimuthal correlations with the subevent cumulant method with the ATLAS detector

    CERN Document Server

    Zhou, Mingliang; The ATLAS collaboration

    2017-01-01

    The measurements of the four-particle cumulant and the anisotropic elliptic flow coefficient for the second harmonic, $c_{2}\{4\}$ and $v_{2}\{4\}$, are presented using $pp$ data at $\sqrt{s}=5.02$ and $\sqrt{s}=13$ TeV, and $p$+Pb data at $\sqrt{s_{\text{NN}}}=5.02$ TeV. These measurements aim to assess the collective nature of multi-particle production. While collectivity is well established in $p$+Pb and Pb+Pb collisions, because of larger non-flow contributions its evidence in $pp$ collisions is contested. The values of $c_{2}\{4\}$ are calculated using the standard cumulant method and the recently proposed two- and three-subevent methods, which can further suppress the non-flow contributions in small systems. In these collision systems, the three-subevent method gives a negative $c_{2}\{4\}$, and thus a well-defined $v_{2}\{4\}$. The magnitude of $c_{2}\{4\}$ is found to be nearly independent of $\langle N_{\text{ch}} \rangle$ and the third harmonic $c_{3}\{4\}$ is consistent with 0. $v_{2}\{4\}$ is found to be smaller than...
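
    For orientation, the generic Q-cumulant construction behind these observables can be sketched as follows: per event one builds flow vectors Q_n, forms the two- and four-particle correlators, averages over events, and combines them into c_2{4} = <<4>> - 2<<2>>^2 and v_2{4} = (-c_2{4})^(1/4). The code below uses unit weights, no subevents and a toy event sample; it is a simplified illustration, not the ATLAS analysis code.

      import numpy as np

      def qvec(phis, n):
          """Flow vector Q_n = sum_j exp(i * n * phi_j)."""
          return np.exp(1j * n * np.asarray(phis)).sum()

      def corr2(phis, n=2):
          """Single-event two-particle correlator <2>."""
          M = len(phis)
          return (abs(qvec(phis, n)) ** 2 - M) / (M * (M - 1))

      def corr4(phis, n=2):
          """Single-event four-particle correlator <4> (generic Q-cumulant formula)."""
          M = len(phis)
          Qn, Q2n = qvec(phis, n), qvec(phis, 2 * n)
          num = (abs(Qn) ** 4 + abs(Q2n) ** 2
                 - 2.0 * (Q2n * np.conj(Qn) * np.conj(Qn)).real
                 - 2.0 * (2.0 * (M - 2) * abs(Qn) ** 2 - M * (M - 3)))
          return num / (M * (M - 1) * (M - 2) * (M - 3))

      # Toy events with an approximate v2 = 0.10 imprinted to first order.
      rng = np.random.default_rng(1)
      events = []
      for _ in range(5000):
          phi = rng.uniform(0.0, 2.0 * np.pi, rng.integers(100, 200))
          events.append(phi - 0.10 * np.sin(2.0 * phi))

      avg2 = np.mean([corr2(e) for e in events])
      avg4 = np.mean([corr4(e) for e in events])
      c24 = avg4 - 2.0 * avg2 ** 2
      v24 = (-c24) ** 0.25 if c24 < 0 else float("nan")
      print(f"c2{{4}} = {c24:.3e}, v2{{4}} = {v24:.3f}")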

  20. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which should only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition...

  1. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    1997-01-01

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which should only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition...
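
    A minimal sketch of scalar-triggered Random Decrement estimation (level-crossing triggering on a single simulated channel) illustrates the segment-averaging idea described above; it is not the vector-triggering formulation of the paper, and all signal parameters are invented.

      import numpy as np

      def random_decrement(x, level, seg_len):
          """Average the segments of x that start at upward crossings of `level`."""
          idx = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
          idx = idx[idx + seg_len <= len(x)]
          segments = np.stack([x[i:i + seg_len] for i in idx])
          return segments.mean(axis=0), len(idx)

      # Toy data: lightly damped SDOF oscillator driven by white noise
      # (semi-implicit Euler integration).
      rng = np.random.default_rng(0)
      dt, wn, zeta = 0.01, 2.0 * np.pi * 2.0, 0.02
      x = np.zeros(60000)
      v = 0.0
      for k in range(1, len(x)):
          a = -2.0 * zeta * wn * v - wn ** 2 * x[k - 1] + rng.normal(0.0, 50.0)
          v += a * dt
          x[k] = x[k - 1] + v * dt

      rd, n_seg = random_decrement(x, level=x.std(), seg_len=500)
      print(f"{n_seg} segments averaged into a Random Decrement function of length {len(rd)}")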

  2. Geometrical Acceptance Analysis for RPC PAC Trigger

    CERN Document Server

    Seo, Eunsung

    2010-01-01

    The CMS (Compact Muon Solenoid) is one of the four experiments that will analyze the collision results of the protons accelerated by the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire). In the case of the CMS experiment, the trigger system is divided into two stages: the Level-1 Trigger and the High Level Trigger. The RPC (Resistive Plate Chamber) PAC (PAttern Comparator) Trigger system, which is the subject of this thesis, is a part of the Level-1 Muon Trigger System. The main task of the PAC Trigger is to identify muons, measure transverse momenta and select the best muon candidates for each proton bunch collision occurring every 25 ns. To calculate the value of the PAC Trigger efficiency for triggerable muons, two different efficiencies are needed: acceptance efficiency and chamber efficiency. The main goal of the work described in this thesis is obtaining the acceptance efficiency of the PAC Trigger in each logical cone. Acceptance efficiency is a convolution of the chambers geometry an...

  3. Event-triggered decentralized robust model predictive control for constrained large-scale interconnected systems

    Directory of Open Access Journals (Sweden)

    Ling Lu

    2016-12-01

    Full Text Available This paper considers the problem of event-triggered decentralized model predictive control (MPC for constrained large-scale linear systems subject to additive bounded disturbances. The constraint tightening method is utilized to formulate the MPC optimization problem. The local predictive control law for each subsystem is determined aperiodically by relevant triggering rule which allows a considerable reduction of the computational load. And then, the robust feasibility and closed-loop stability are proved and it is shown that every subsystem state will be driven into a robust invariant set. Finally, the effectiveness of the proposed approach is illustrated via numerical simulations.
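
    The event-triggered idea can be illustrated with a toy loop that re-computes the local control law only when the measured state drifts from its prediction by more than a threshold. A fixed linear feedback gain stands in for the MPC optimization, so this is only a sketch of the triggering rule, not the constraint-tightening formulation of the paper; all matrices and bounds are illustrative.

      import numpy as np

      # Double-integrator subsystem x_{k+1} = A x_k + B u_k + w_k with bounded disturbance w.
      A = np.array([[1.0, 0.1], [0.0, 1.0]])
      B = np.array([0.005, 0.1])
      K = np.array([-1.2, -1.6])      # stand-in for the locally "solved" control law
      eps = 0.05                      # triggering threshold on the prediction error

      rng = np.random.default_rng(0)
      x = np.array([1.0, 0.0])
      x_pred = x.copy()
      u = float(K @ x)
      solves = 0

      for k in range(200):
          # Event-triggering rule: recompute the control law only when the measured
          # state has drifted from the predicted one by more than eps.
          if np.linalg.norm(x - x_pred) > eps:
              u = float(K @ x)        # placeholder for re-solving the MPC problem
              x_pred = x.copy()
              solves += 1
          w = rng.uniform(-0.01, 0.01, size=2)
          x = A @ x + B * u + w
          x_pred = A @ x_pred + B * u

      print(f"control law recomputed {solves} times in 200 steps, final |x| = {np.linalg.norm(x):.3f}")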

  4. The ATLAS High Level Trigger Steering Framework and the Trigger Configuration System.

    CERN Document Server

    Perez Cavalcanti, Tiago; The ATLAS collaboration

    2011-01-01

    The ATLAS detector system installed in the Large Hadron Collider (LHC) at CERN is designed to study proton-proton and nucleus-nucleus collisions with a maximum centre of mass energy of 14 TeV at a bunch collision rate of 40 MHz. In March 2010 the four LHC experiments saw the first proton-proton collisions at 7 TeV. Still within the year, a collision rate of nearly 10 MHz is expected. At ATLAS, events of potential interest for ATLAS physics are selected by a three-level trigger system, with a final recording rate of about 200 Hz. The first level (L1) is implemented in custom hardware; the two levels of the high level trigger (HLT) are software triggers, running on large farms of standard computers and network devices. Within the ATLAS physics program more than 500 trigger signatures are defined. The HLT tests each signature on each L1-accepted event; the test outcome is recorded for later analysis. The HLT-Steering is responsible for this. It foremost ensures the independent test of each signature, guaranteeing u...

  5. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Full Text Available Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them and combination (the principle that people combine branches leading to the same consequence by adding their probabilities. This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of 'stripped' prospect theory (subjectively weighted additive utility without the editing rules.

  6. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
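
    A rough functional sketch of the crossing-point computation described above (not the original C source): find the component reliability p at which the reliability of a k-out-of-n system equals p. Bisection is used here for brevity where CROSSER itself applies Newton's method.

      from math import comb

      def system_reliability(p, k, n):
          """Reliability of a k-out-of-n system with component reliability p."""
          return sum(comb(n, i) * p ** i * (1.0 - p) ** (n - i) for i in range(k, n + 1))

      def crossing_point(k, n, tol=1e-12):
          """Find p in (0, 1) with R_system(p) = p (meaningful for 1 < k < n)."""
          f = lambda p: system_reliability(p, k, n) - p
          lo, hi = 1e-9, 1.0 - 1e-9   # f(lo) < 0 and f(hi) > 0 when 1 < k < n
          while hi - lo > tol:
              mid = 0.5 * (lo + hi)
              if f(mid) < 0.0:
                  lo = mid
              else:
                  hi = mid
          return 0.5 * (lo + hi)

      p_star = crossing_point(k=3, n=5)
      print(f"3-out-of-5 system: R_system(p) = p at p = {p_star:.6f}")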

  7. On the observations of unique low latitude whistler-triggered VLF/ELF emissions

    Science.gov (United States)

    Altaf, M.; Singh, K. K.; Singh, A. K.; Lalmani

    A detailed analysis of the VLF/ELF wave data obtained during a whistler campaign under the All India Coordinated Program of Ionosphere Thermosphere Studies (AICPITS) at our low latitude Indian ground station Jammu (geomag. lat. = 22° 26′ N, L = 1.17) has yielded two types of unusual and unique whistler-triggered VLF/ELF emissions. These include (1) whistler-triggered hook emissions and (2) whistler-triggered long enduring discrete chorus riser emissions in the VLF/ELF frequency range during night time. Such types of whistler-triggered emissions have not been reported earlier from any of the ground observations at low latitudes. In the present study, the observed characteristics of these emissions are described and interpreted. Dispersion analysis of these emissions shows that the whistlers as well as the emissions have propagated along a higher geomagnetic field line path with L-values lying near L = 4, suggesting that these triggered emissions are to be regarded as mid-latitude emissions. These waves could have propagated along the geomagnetic field lines either in a ducted mode or in a pro-longitudinal (PL) mode. The measured intensity of the triggered emissions is almost equal to that of the source waves and does not vary throughout the period of observation on that day. It is speculated that these emissions may have been generated through a process of resonant interaction of the whistler waves with energetic electrons. Parameters related to this interaction are computed for different values of L and wave amplitude. The proposed mechanism explains some aspects of the dynamic spectra.

  8. Triggering the GRANDE array

    International Nuclear Information System (INIS)

    Wilson, C.L.; Bratton, C.B.; Gurr, J.; Kropp, W.; Nelson, M.; Sobel, H.; Svoboda, R.; Yodh, G.; Burnett, T.; Chaloupka, V.; Wilkes, R.J.; Cherry, M.; Ellison, S.B.; Guzik, T.G.; Wefel, J.; Gaidos, J.; Loeffler, F.; Sembroski, G.; Goodman, J.; Haines, T.J.; Kielczewska, D.; Lane, C.; Steinberg, R.; Lieber, M.; Nagle, D.; Potter, M.; Tripp, R.

    1990-01-01

    A brief description of the Gamma Ray And Neutrino Detector Experiment (GRANDE) is presented. The detector elements and electronics are described. The trigger logic for the array is then examined. The triggers for the Gamma Ray and the Neutrino portions of the array are treated separately. (orig.)

  9. Calorimetry triggering in ATLAS

    International Nuclear Information System (INIS)

    Igonkina, O; Achenbach, R; Andrei, V; Adragna, P; Aharrouche, M; Bauss, B; Bendel, M; Alexandre, G; Anduaga, X; Aracena, I; Backlund, S; Bogaerts, A; Baines, J; Barnett, B M; Bee, C; P, Behera; Bell, P; Benslama, K; Berry, T; Bohm, C

    2009-01-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10⁵ to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  10. Calorimetry Triggering in ATLAS

    International Nuclear Information System (INIS)

    Igonkina, O.; Achenbach, R.; Adragna, P.; Aharrouche, M.; Alexandre, G.; Andrei, V.; Anduaga, X.; Aracena, I.; Backlund, S.; Baines, J.; Barnett, B.M.; Bauss, B.; Bee, C.; Behera, P.; Bell, P.; Bendel, M.; Benslama, K.; Berry, T.; Bogaerts, A.; Bohm, C.; Bold, T.; Booth, J.R.A.; Bosman, M.; Boyd, J.; Bracinik, J.; Brawn, I.P.; Brelier, B.; Brooks, W.; Brunet, S.; Bucci, F.; Casadei, D.; Casado, P.; Cerri, A.; Charlton, D.G.; Childers, J.T.; Collins, N.J.; Conde Muino, P.; Coura Torres, R.; Cranmer, K.; Curtis, C.J.; Czyczula, Z.; Dam, M.; Damazio, D.; Davis, A.O.; De Santo, A.; Degenhardt, J.

    2011-01-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10⁵ to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  11. Calorimetry triggering in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Igonkina, O [Nikhef National Institute for Subatomic Physics, Amsterdam (Netherlands); Achenbach, R; Andrei, V [Kirchhoff Institut fuer Physik, Universitaet Heidelberg, Heidelberg (Germany); Adragna, P [Physics Department, Queen Mary, University of London, London (United Kingdom); Aharrouche, M; Bauss, B; Bendel, M [Institut für Physik, Universität Mainz, Mainz (Germany); Alexandre, G [Section de Physique, Universite de Geneve, Geneva (Switzerland); Anduaga, X [Universidad Nacional de La Plata, La Plata (Argentina); Aracena, I [Stanford Linear Accelerator Center (SLAC), Stanford (United States); Backlund, S; Bogaerts, A [European Laboratory for Particle Physics (CERN), Geneva (Switzerland); Baines, J; Barnett, B M [STFC Rutherford Appleton Laboratory, Harwell Science and Innovation Campus, Didcot, Oxon (United Kingdom); Bee, C [Centre de Physique des Particules de Marseille, IN2P3-CNRS, Marseille (France); Behera, P [Iowa State University, Ames, Iowa (United States); Bell, P [School of Physics and Astronomy, University of Manchester, Manchester (United Kingdom); Benslama, K [University of Regina, Regina (Canada); Berry, T [Department of Physics, Royal Holloway and Bedford New College, Egham (United Kingdom); Bohm, C [Fysikum, Stockholm University, Stockholm (Sweden)

    2009-04-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10⁵ to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  12. Complexity and demographic explanations of cumulative culture.

    Directory of Open Access Journals (Sweden)

    Adrien Querbes

    Full Text Available Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing--while favoured by increasing--population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture in case formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence doesn't afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.

  13. The Run-2 ATLAS Trigger System

    International Nuclear Information System (INIS)

    Martínez, A Ruiz

    2016-01-01

    The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009 and 2013 at different centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 and a software-based high level trigger (HLT) that reduces the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV and higher luminosity, resulting in up to five times higher rates of processes of interest. A brief review will be given of the ATLAS trigger system upgrades that were implemented between Run-1 and Run-2, which make it possible to cope with the increased trigger rates while maintaining or even improving the efficiency to select physics processes of interest. This includes changes to the Level-1 calorimeter and muon trigger systems, the introduction of a new Level-1 topological trigger module and the merging of the previously two-level HLT system into a single event processing farm. A few examples will be shown, such as the impressive performance improvements in the HLT trigger algorithms used to identify leptons, hadrons and global event quantities like missing transverse energy. Finally, the status of the commissioning of the trigger system and its performance during the 2015 run will be presented. (paper)

  14. The ATLAS Electron and Photon Trigger

    CERN Document Server

    Jones, Samuel David; The ATLAS collaboration

    2017-01-01

    Electron and photon triggers covering transverse energies from 5 GeV to several TeV are essential for signal selection in a wide variety of ATLAS physics analyses to study Standard Model processes and to search for new phenomena. Final states including leptons and photons had, for example, an important role in the discovery and measurement of the Higgs boson. Dedicated triggers are also used to collect data for calibration, efficiency and fake rate measurements. The ATLAS trigger system is divided into a hardware-based Level-1 trigger and a software-based high-level trigger, both of which were upgraded during the LHC shutdown in preparation for Run-2 operation. To cope with the increasing luminosity and more challenging pile-up conditions at a center-of-mass energy of 13 TeV, the trigger selections at each level are optimized to control the rates and keep efficiencies high. To achieve this goal multivariate analysis techniques are used. The ATLAS electron and photon triggers and their performance with Run 2 dat...

  15. The ATLAS Electron and Photon Trigger

    CERN Document Server

    Jones, Samuel David; The ATLAS collaboration

    2018-01-01

    Electron and photon triggers covering transverse energies from 5 GeV to several TeV are essential for signal selection in a wide variety of ATLAS physics analyses to study Standard Model processes and to search for new phenomena. Final states including leptons and photons had, for example, an important role in the discovery and measurement of the Higgs boson. Dedicated triggers are also used to collect data for calibration, efficiency and fake rate measurements. The ATLAS trigger system is divided into a hardware-based Level-1 trigger and a software-based high-level trigger, both of which were upgraded during the LHC shutdown in preparation for Run-2 operation. To cope with the increasing luminosity and more challenging pile-up conditions at a center-of-mass energy of 13 TeV, the trigger selections at each level are optimized to control the rates and keep efficiencies high. To achieve this goal multivariate analysis techniques are used. The ATLAS electron and photon triggers and their performance with Run 2 dat...

  16. The DØ Silicon Track Trigger

    International Nuclear Information System (INIS)

    Steinbrueck, Georg

    2003-01-01

    We describe a trigger preprocessor to be used by the DØ experiment for selecting events with tracks from the decay of long-lived particles. This Level 2 impact parameter trigger utilizes information from the Silicon Microstrip Tracker to reconstruct tracks with improved spatial and momentum resolutions compared to those obtained by the Level 1 tracking trigger. It is constructed of VME boards with much of the logic existing in programmable processors. A common motherboard provides the I/O infrastructure and three different daughter boards perform the tasks of identifying the roads from the tracking trigger data, finding the clusters in the roads in the silicon detector, and fitting tracks to the clusters. This approach provides flexibility for the design, testing and maintenance phases of the project. The track parameters are provided to the trigger framework in 25 μs. The effective impact parameter resolution for high-momentum tracks is 35 μm, dominated by the size of the Tevatron beam

  17. A spot-matching method using cumulative frequency matrix in 2D gel images

    Science.gov (United States)

    Han, Chan-Myeong; Park, Joon-Ho; Chang, Chu-Seok; Ryoo, Myung-Chun

    2014-01-01

    A new method for spot matching in two-dimensional gel electrophoresis images using a cumulative frequency matrix is proposed. The method improves on the weak points of the previous method called 'spot matching by topological patterns of neighbour spots'. It accumulates the frequencies of neighbour spot pairs produced through the entire matching process and determines spot pairs one by one in order of decreasing frequency. Spot matching by frequencies of neighbour spot pairs shows noticeably better performance. In addition, it gives researchers a hint as to whether the matching results can be trusted, which can save a lot of effort in verifying the results. PMID:26019609
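
    A highly simplified sketch of the selection stage only: given a cumulative frequency matrix whose entry (i, j) counts how often spot i of one gel and spot j of the other were supported as a pair, matches are fixed greedily in order of decreasing frequency. The matrix values below are invented, and the neighbour-topology scoring that actually fills the matrix in the published method is omitted.

      import numpy as np

      def match_by_cumulative_frequency(freq):
          """Greedily pair rows (gel A spots) with columns (gel B spots) by frequency."""
          freq = freq.astype(float).copy()
          matches = []
          while freq.max() > 0:
              i, j = np.unravel_index(freq.argmax(), freq.shape)
              matches.append((i, j, freq[i, j]))
              freq[i, :] = 0.0        # each spot may enter at most one match
              freq[:, j] = 0.0
          return matches

      # Hypothetical 4 x 4 cumulative frequency matrix.
      F = np.array([[9, 1, 0, 0],
                    [2, 7, 1, 0],
                    [0, 3, 8, 1],
                    [0, 0, 2, 6]])
      for i, j, f in match_by_cumulative_frequency(F):
          print(f"spot A{i} <-> spot B{j}  (frequency {f:.0f})")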

  18. Family Resources and Effects on Child Behavior Problem Interventions: A Cumulative Risk Approach.

    Science.gov (United States)

    Tømmerås, Truls; Kjøbli, John

    2017-01-01

    Family resources have been associated with health care inequality in general and with social gradients in treatment outcomes for children with behavior problems. However, there is limited evidence concerning cumulative risk (the accumulation of social and economic disadvantages in a family) and whether cumulative risk moderates the outcomes of evidence-based parent training interventions. We used data from two randomized controlled trials evaluating high-intensity (n = 137) and low-intensity (n = 216) versions of Parent Management Training-Oregon (PMTO) with a 50:50 allocation between participants receiving PMTO interventions or regular care. A nine-item family cumulative risk index tapping socioeconomic resources and parental health was constructed to assess the family's exposure to risk. Autoregressive structural equation models (SEM) were run to investigate whether cumulative risk moderated child behaviors at post-treatment and follow-up (6 months). Our results showed opposite social gradients for the treatment conditions: the children exposed to cumulative risk in a pooled sample of both PMTO groups displayed lower levels of behavior problems, whereas children with identical risk exposures who received regular care experienced more problems. Furthermore, our results indicated that the social gradients differed between PMTO interventions: children exposed to cumulative risk in the low-intensity (five sessions) Brief Parent Training fared as well as their high-resource counterparts, whereas children exposed to cumulative risk in the high-intensity PMTO (12 sessions) experienced vastly better treatment effects. Providing evidence-based parent training seems to be an effective way to counteract health care inequality, and the more intensive PMTO treatment seemed to be a particularly effective way to help families with cumulative risk.

  19. The ATLAS hadronic tau trigger

    International Nuclear Information System (INIS)

    Shamim, Mansoora

    2012-01-01

    The extensive tau physics program of the ATLAS experiment relies heavily on the trigger to select hadronic decays of the tau lepton. Such a trigger is implemented in ATLAS to efficiently collect signal events, while keeping the rate of the multi-jet background within the allowed bandwidth. This contribution summarizes the performance of the ATLAS hadronic tau trigger system during the 2011 data-taking period and the improvements implemented for the 2012 data collection.

  20. A high-speed DAQ framework for future high-level trigger and event building clusters

    International Nuclear Information System (INIS)

    Caselle, M.; Perez, L.E. Ardila; Balzer, M.; Dritschler, T.; Kopmann, A.; Mohr, H.; Rota, L.; Vogelgesang, M.; Weber, M.

    2017-01-01

    Modern data acquisition and trigger systems require a throughput of several GB/s and latencies of the order of microseconds. To satisfy such requirements, a heterogeneous readout system based on FPGA readout cards and GPU-based computing nodes coupled by InfiniBand has been developed. The incoming data from the back-end electronics is delivered directly into the internal memory of GPUs through a dedicated peer-to-peer PCIe communication. High performance DMA engines have been developed for direct communication between FPGAs and GPUs using 'DirectGMA (AMD)' and 'GPUDirect (NVIDIA)' technologies. The proposed infrastructure is a candidate for future generations of event building clusters, high-level trigger filter farms and low-level trigger systems. In this paper the heterogeneous FPGA-GPU architecture will be presented and its performance discussed.

  1. Correlated stopping, proton clusters and higher order proton cumulants

    Energy Technology Data Exchange (ETDEWEB)

    Bzdak, Adam [AGH University of Science and Technology, Faculty of Physics and Applied Computer Science, Krakow (Poland); Koch, Volker [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Skokov, Vladimir [RIKEN/BNL, Brookhaven National Laboratory, Upton, NY (United States)

    2017-05-15

    We investigate possible effects of correlations between stopped nucleons on higher order proton cumulants at low energy heavy-ion collisions. We find that fluctuations of the number of wounded nucleons N_part lead to a rather nontrivial dependence of the correlations on the centrality; however, this effect is too small to explain the large and positive four-proton correlations found in the preliminary data collected by the STAR collaboration at √(s) = 7.7 GeV. We further demonstrate that, by taking into account additional proton clustering, we are able to qualitatively reproduce the preliminary experimental data. We speculate that this clustering may originate either from collective/multi-collision stopping which is expected to be effective at lower energies or from a possible first-order phase transition, or from (attractive) final state interactions. To test these ideas we propose to measure a mixed multi-particle correlation between stopped protons and a produced particle (e.g. pion, antiproton). (orig.)

  2. Global search of triggered non-volcanic tremor

    Science.gov (United States)

    Chao, Tzu-Kai Kevin

    Deep non-volcanic tremor is a newly discovered seismic phenomenon with low amplitude, long duration, and no clear P- and S-waves as compared with regular earthquakes. Tremor has been observed at many major plate-boundary faults, providing new information about fault slip behaviors below the seismogenic zone. While tremor mostly occurs spontaneously (ambient tremor) or during episodic slow-slip events (SSEs), sometimes tremor can also be triggered by the teleseismic waves of distant earthquakes, which is known as "triggered tremor". The primary focus of my Ph.D. work is to understand the physical mechanisms and necessary conditions of triggered tremor by systematic investigations in different tectonic regions. In the first chapter of my dissertation, I conduct a systematic survey of triggered tremor beneath the Central Range (CR) in Taiwan for 45 teleseismic earthquakes from 1998 to 2009 with Mw ≥ 7.5. Triggered tremors are visually identified as bursts of high-frequency (2-8 Hz), non-impulsive, and long-duration seismic energy that are coherent among many seismic stations and modulated by the teleseismic surface waves. A total of 9 teleseismic earthquakes have triggered clear tremor in Taiwan. The peak ground velocity (PGV) of teleseismic surface waves is the most important factor in determining tremor triggering potential, with an apparent threshold of ˜0.1 cm/s, or 7-8 kPa. However, this threshold is partially controlled by the background noise level, preventing triggered tremor with weaker amplitude from being observed. In addition, I find a positive correlation between the PGV and the triggered tremor amplitude, which is consistent with the prediction of the 'clock-advance' model. This suggests that triggered tremor can be considered as a sped-up occurrence of ambient tremor under fast loading from the passing surface waves. Finally, the incident angles of the surface waves also play an important role in controlling the tremor triggering potential. The next

  3. Uncertainty analysis technique of dynamic response and cumulative damage properties of piping system

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hara, Fumio; Hanaoka, Masaaki; Yamashita, Tadashi.

    1982-01-01

    Establishing a method of uncertainty analysis that statistically examines the variation of the earthquake response and damage properties of equipment and piping systems due to changes in the input load and the parameters of the structural system is a technologically important subject for evaluating the aseismic capability and dynamic structural reliability of these systems. The uncertainty in the response and damage properties when equipment and piping systems are subjected to excessive vibration load depends mainly on the irregularity of the acting input load, such as the unsteady vibration of earthquakes, and on structural uncertainty in forms and dimensions. This study is a basic one intended to establish a method for evaluating the uncertainty in the cumulative damage property at resonant vibration of a piping system due to the dispersion of structural parameters, using a simple model. First, piping models of simple form were broken by resonant vibration, and the uncertainty in the cumulative damage property was evaluated. Next, a response analysis using an elasto-plastic mechanics model was performed by numerical simulation. Finally, a method of uncertainty analysis for response and damage properties based on the perturbation method with equivalent linearization was proposed, and its validity was demonstrated. (Kako, I.)

  4. Implications of applying cumulative risk assessment to the workplace.

    Science.gov (United States)

    Fox, Mary A; Spicer, Kristen; Chosewood, L Casey; Susi, Pam; Johns, Douglas O; Dotson, G Scott

    2018-06-01

    Multiple changes are influencing work, workplaces and workers in the US including shifts in the main types of work and the rise of the 'gig' economy. Work and workplace changes have coincided with a decline in unions and associated advocacy for improved safety and health conditions. Risk assessment has been the primary method to inform occupational and environmental health policy and management for many types of hazards. Although often focused on one hazard at a time, risk assessment frameworks and methods have advanced toward cumulative risk assessment recognizing that exposure to a single chemical or non-chemical stressor rarely occurs in isolation. We explore how applying cumulative risk approaches may change the roles of workers and employers as they pursue improved health and safety and elucidate some of the challenges and opportunities that might arise. Application of cumulative risk assessment should result in better understanding of complex exposures and health risks with the potential to inform more effective controls and improved safety and health risk management overall. Roles and responsibilities of both employers and workers are anticipated to change with potential for a greater burden of responsibility on workers to address risk factors both inside and outside the workplace that affect health at work. A range of policies, guidance and training have helped develop cumulative risk assessment for the environmental health field and similar approaches are available to foster the practice in occupational safety and health. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. The CLEO-III Trigger: Decision and gating

    International Nuclear Information System (INIS)

    Bergfeld, T.J.; Gollin, G.D.; Haney, M.J.

    1996-01-01

    The CLEO-III Trigger provides a trigger decision every 42 ns, with a latency of approximately 2.5 μs. This paper describes the free-running, pipelined trigger decision logic, the throttling mechanism whereby the data acquisition system can modulate the trigger rate to maximize throughput without buffer overrun, and the subsequent signal distribution mechanism for delivering the trigger decision to the front-end electronics. This paper also describes the multilevel simulation methods employed to allow detailed low-level models of trigger components to be co-simulated with more abstract system models, thus allowing full system modeling without incurring prohibitive computational overheads.

  6. Triggers in UA2 and UA1

    International Nuclear Information System (INIS)

    Dorenbosch, J.

    1985-01-01

    The UA2 and UA1 trigger systems are described as they will be used after the upgrade of the CERN SppS. The luminosity of the collider will increase to 3 × 10³⁰. The bunch spacing is 4 microseconds, comparable to the time available for a second level trigger at the SSC. The first level triggers are very powerful and deliver trigger rates of about 100 Hz. The UA1 second level trigger operates on the final digitizings with a combination of special and general purpose processors. At the highest trigger levels a small farm of processors performs the final reduction. (orig.)

  7. Simulation of the High Performance Time to Digital Converter for the ATLAS Muon Spectrometer trigger upgrade

    International Nuclear Information System (INIS)

    Meng, X.T.; Levin, D.S.; Chapman, J.W.; Zhou, B.

    2016-01-01

    The ATLAS Muon Spectrometer endcap thin-Resistive Plate Chamber trigger project complements the New Small Wheel endcap Phase-1 upgrade for higher luminosity LHC operation. These new trigger chambers, located in a high rate region of ATLAS, will improve overall trigger acceptance and reduce the fake muon trigger incidence. These chambers must generate a low level muon trigger to be delivered to a remote high level processor within a stringent latency requirement of 43 bunch crossings (1075 ns). To help meet this requirement the High Performance Time to Digital Converter (HPTDC), a multi-channel ASIC designed by the CERN Microelectronics group, has been proposed for the digitization of the fast front end detector signals. This paper investigates the HPTDC performance in the context of the overall muon trigger latency, employing detailed behavioral Verilog simulations in which the latency in triggerless mode is measured for a range of configurations and under realistic hit rate conditions. The simulation results show that various HPTDC operational configurations, including leading edge and pair measurement modes, can provide high efficiency (>98%) to capture and digitize hits within a time interval satisfying the Phase-1 latency tolerance.

  8. Concept of the CMS Trigger Supervisor

    CERN Document Server

    Magrans de Abril, Ildefons; Varela, Joao

    2006-01-01

    The Trigger Supervisor is an online software system designed for the CMS experiment at CERN. Its purpose is to provide a framework to set up, test, operate and monitor the trigger components on one hand and to manage their interplay and the information exchange with the run control part of the data acquisition system on the other. The Trigger Supervisor is conceived to provide a simple and homogeneous client interface to the online software infrastructure of the trigger subsystems. This document specifies the functional and non-functional requirements, design and operational details, and the components that will be delivered in order to facilitate a smooth integration of the trigger software in the context of CMS.

  9. Error Recovery in the Time-Triggered Paradigm with FTT-CAN.

    Science.gov (United States)

    Marques, Luis; Vasconcelos, Verónica; Pedreiras, Paulo; Almeida, Luís

    2018-01-11

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots.
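
    A generic sizing calculation in the spirit of the approach (not the paper's exact analysis): under a Poisson fault-arrival model, choose the smallest number of retransmission slots reserved per window such that the probability of more than that many corrupted messages stays below the allowed failure probability. All rates and goals below are illustrative.

      from math import exp, factorial

      def poisson_cdf(k, mu):
          """P(N <= k) for N ~ Poisson(mu)."""
          return sum(exp(-mu) * mu ** i / factorial(i) for i in range(k + 1))

      def slots_needed(fault_rate, window, max_failure_prob):
          """Smallest m with P(more than m faults in `window`) <= max_failure_prob."""
          mu = fault_rate * window
          m = 0
          while 1.0 - poisson_cdf(m, mu) > max_failure_prob:
              m += 1
          return m

      # Illustrative numbers: 30 corrupted messages per hour, 100 ms window,
      # per-window failure probability goal of 1e-9.
      m = slots_needed(fault_rate=30.0 / 3600.0, window=0.1, max_failure_prob=1e-9)
      print(f"reserve server capacity for {m} retransmissions per 100 ms window")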

  10. Transmission fidelity is the key to the build-up of cumulative culture.

    Science.gov (United States)

    Lewis, Hannah M; Laland, Kevin N

    2012-08-05

    Many animals have socially transmitted behavioural traditions, but human culture appears unique in that it is cumulative, i.e. human cultural traits increase in diversity and complexity over time. It is often suggested that high-fidelity cultural transmission is necessary for cumulative culture to occur through refinement, a process known as 'ratcheting', but this hypothesis has never been formally evaluated. We discuss processes of information transmission and loss of traits from a cognitive viewpoint alongside other cultural processes of novel invention (generation of entirely new traits), modification (refinement of existing traits) and combination (bringing together two established traits to generate a new trait). We develop a simple cultural transmission model that does not assume major evolutionary changes (e.g. in brain architecture) and show that small changes in the fidelity with which information is passed between individuals can lead to cumulative culture. In comparison, modification and combination have a lesser influence on, and novel invention appears unimportant to, the ratcheting process. Our findings support the idea that high-fidelity transmission is the key driver of human cumulative culture, and that progress in cumulative culture depends more on trait combination than novel invention or trait modification.
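
    Purely as an illustrative toy (not the authors' model), the simulation below tracks a trait repertoire when each trait survives transmission to the next generation with probability f (the fidelity) and a fixed number of new traits is invented per generation; the repertoire only becomes large when f approaches 1, echoing the fidelity argument above.

      import random

      def simulate(fidelity, new_per_gen=1, generations=2000, seed=0):
          """Trait count under per-generation loss (1 - fidelity) and steady invention.

          Under these assumptions the count settles near new_per_gen / (1 - fidelity),
          so small gains in fidelity yield disproportionately large repertoires.
          """
          rng = random.Random(seed)
          traits = 0
          for _ in range(generations):
              traits = sum(1 for _ in range(traits) if rng.random() < fidelity) + new_per_gen
          return traits

      for f in (0.50, 0.90, 0.99, 0.999):
          print(f"fidelity {f:5.3f} -> about {simulate(f)} traits after 2000 generations")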

  11. A paradox of cumulative culture.

    Science.gov (United States)

    Kobayashi, Yutaka; Wakano, Joe Yuichiro; Ohtsuki, Hisashi

    2015-08-21

    Culture can grow cumulatively if socially learnt behaviors are improved by individual learning before being passed on to the next generation. Previous authors showed that this kind of learning strategy is unlikely to be evolutionarily stable in the presence of a trade-off between learning and reproduction. This is because culture is a public good that is freely exploited by any member of the population in their model (cultural social dilemma). In this paper, we investigate the effect of vertical transmission (transmission from parents to offspring), which decreases the publicness of culture, on the evolution of cumulative culture in both infinite and finite population models. In the infinite population model, we confirm that culture accumulates largely as long as transmission is purely vertical. It turns out, however, that introduction of even slight oblique transmission drastically reduces the equilibrium level of culture. Even more surprisingly, if the population size is finite, culture hardly accumulates even under purely vertical transmission. This occurs because stochastic extinction due to random genetic drift prevents a learning strategy from accumulating enough culture. Overall, our theoretical results suggest that introducing vertical transmission alone does not really help solve the cultural social dilemma problem. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Tools for Trigger Aware Analyses in ATLAS

    CERN Document Server

    Krasznahorkay, A; The ATLAS collaboration; Stelzer, J

    2010-01-01

    In order to search for rare processes, all four LHC experiments have to use advanced triggering methods for selecting and recording the events of interest. At the expected nominal LHC operating conditions only about 0.0005% of the collision events can be kept for physics analysis in ATLAS. Therefore the understanding and evaluation of the trigger performance is one of the most crucial parts of any physics analysis. ATLAS’s first level trigger is composed of custom-built hardware, while the second and third levels are implemented using regular PCs running reconstruction and selection algorithms. Because of this split, accessing the results of the trigger execution for the two stages is different. The complexity of the software trigger presents further difficulties in accessing the trigger data. To make the job of the physicists easier when evaluating the trigger performance, multiple general-use tools are provided by the ATLAS Trigger Analysis Tools group. The TrigDecisionTool, a general tool, is provided to...

  13. The ZEUS calorimeter first level trigger

    International Nuclear Information System (INIS)

    Smith, W.H.; Ali, I.; Behrens, B.; Fordham, C.; Foudas, C.; Goussiou, A.; Jaworski, M.; Kinnel, T.; Lackey, J.; Robl, P.; Silverstein, S.; Dawson, J.W.; Krakauer, D.A.; Talaga, R.L.; Schlereth, J.L.

    1994-10-01

    The design of the ZEUS Calorimeter First Level Trigger (CFLT) is presented. The CFLT utilizes a pipelined architecture to provide trigger data for a global first level trigger decision 5 μsec after each beam crossing, occurring every 96 nsec. The charges from 13K phototubes are summed into 1792 trigger tower pulseheights which are digitized by flash ADC's. The digital values are linearized, stored and used for sums and pattern tests. Summary data is forwarded to the Global First Level Trigger for each crossing 2 μsec after the crossing occurred. The CFLT determines the total energy, the total transverse energy, the missing energy, and the energy and number of isolated electrons and muons. It also provides information on the electromagnetic and hadronic energy deposited in various regions of the calorimeter. The CFLT has kept the experimental trigger rate below ∼200 Hz at the highest luminosity experienced at HERA. Performance studies suggest that the CFLT will keep the trigger rate below 1 kHz against a rate of proton-beam gas interactions on the order of the 100 kHz expected at design luminosity. (orig.)

  14. Effect of tidal triggering on seismicity in Taiwan revealed by the empirical mode decomposition method

    Directory of Open Access Journals (Sweden)

    H.-J. Chen

    2012-07-01

    Full Text Available The effect of tidal triggering on earthquake occurrence has been controversial for many years. This study considered earthquakes that occurred near Taiwan between 1973 and 2008. Because earthquake data are nonlinear and non-stationary, we applied the empirical mode decomposition (EMD method to analyze the temporal variations in the number of daily earthquakes to investigate the effect of tidal triggering. We compared the results obtained from the non-declustered catalog with those from two kinds of declustered catalogs and discuss the aftershock effect on the EMD-based analysis. We also investigated stacking the data based on in-phase phenomena of theoretical Earth tides with statistical significance tests. Our results show that the effects of tidal triggering, particularly the lunar tidal effect, can be extracted from the raw seismicity data using the approach proposed here. Our results suggest that the lunar tidal force is likely a factor in the triggering of earthquakes.
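
    A minimal sketch of the decomposition step, assuming the third-party PyEMD package (installable as EMD-signal); the daily-count series here is synthetic, whereas the published analysis uses the Taiwan catalogue, declustering and statistical significance tests.

      import numpy as np
      from PyEMD import EMD   # assumption: the PyEMD package ("pip install EMD-signal")

      # Synthetic stand-in for daily earthquake counts: Poisson background with a weak
      # ~14.8-day modulation mimicking a fortnightly tidal constituent.
      rng = np.random.default_rng(0)
      days = np.arange(3650, dtype=float)
      rate = 5.0 * (1.0 + 0.15 * np.sin(2.0 * np.pi * days / 14.77))
      counts = rng.poisson(rate).astype(float)

      imfs = EMD().emd(counts, days)
      print(f"{len(imfs)} intrinsic mode functions extracted")
      for k, imf in enumerate(imfs):
          zc = np.count_nonzero(np.diff(np.signbit(imf)))   # crude period estimate
          period = 2.0 * len(imf) / zc if zc else float("inf")
          print(f"IMF {k}: approximate period {period:.1f} days")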

  15. An improved trigger-generation scheme for Cerenkov imaging cameras [Paper No.: I5

    International Nuclear Information System (INIS)

    Bhat, C.L.; Tickoo, A.K.; Kaul, I.K.; Koul, R.

    1993-01-01

    An improved trigger-generation scheme for TeV gamma-ray imaging telescopes is proposed. Based on a memory-based majority coincidence circuit, this scheme involves deriving 3-pixel nearest-neighbor coincidences as against the conventional approach of generating prompt coincidence from any 2 pixel of the imaging-camera. As such the new method discriminates against shot-noise-generated triggers, and perhaps, to some extent against background cosmic-ray events also, without compromising on the telescope response to events of γ-ray origin. The net effect is that a Whipple-like imaging system can be operated with a comparatively higher sensitivity than what is possible at present. In addition, a suitably scaled-up value of the chance-trigger rate can be independently derived, thereby making it possible to use this parameter reliably for keeping a log of the 'health' of the experimental system. (author). 9 refs., 5 figs
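
    The gain from requiring 3-pixel nearest-neighbour coincidences rather than any 2-pixel coincidence can be illustrated with the textbook accidental-coincidence estimate R_m ≈ N_combinations · m · r^m · τ^(m-1); the pixel count, noise rate and gate width below are invented, not the authors' values.

      from math import comb

      def accidental_rate(m, single_rate, gate, combinations):
          """Chance m-fold coincidence rate: N_comb * m * r^m * tau^(m-1)."""
          return combinations * m * single_rate ** m * gate ** (m - 1)

      npix = 91         # pixels in a hypothetical imaging camera
      r = 1.0e4         # single-pixel noise rate above threshold (Hz)
      tau = 10e-9       # coincidence gate width (s)

      any2 = accidental_rate(2, r, tau, comb(npix, 2))   # any 2 of N pixels
      nn3 = accidental_rate(3, r, tau, 6 * npix)         # rough count of nearest-neighbour triplets
      print(f"any-2-pixel chance rate : {any2:.3e} Hz")
      print(f"3-NN-pixel chance rate  : {nn3:.3e} Hz")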

  16. A trigger simulation framework for the ALICE experiment

    International Nuclear Information System (INIS)

    Antinori, F; Carminati, F; Gheata, A; Gheata, M

    2011-01-01

    A realistic simulation of the trigger system in a complex HEP experiment is essential for performing detailed trigger efficiency studies. The ALICE trigger simulation is evolving towards a framework capable of replaying the full trigger chain starting from the input to the individual trigger processors and ending with the decision mechanisms of the ALICE central trigger processor. This paper describes the new ALICE trigger simulation framework that is being tested and deployed. The framework handles details like trigger levels, signal delays and busy signals, implementing the trigger logic via customizable trigger device objects managed by a robust scheduling mechanism. A big advantage is the high flexibility of the framework, which is able to mix together components described with very different levels of detail. The framework is being gradually integrated within the ALICE simulation and reconstruction frameworks.

  17. Trigger Menu in 2017

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    This document summarises the trigger menu deployed by the ATLAS experiment during 2017 data taking at proton-proton collision centre-of-mass energies of $\sqrt{s}=13$ TeV and $\sqrt{s}=5$ TeV at the LHC and describes the improvements with respect to the trigger system and menu used in 2016 data taking.

  18. Cooperative Robot Localization Using Event-Triggered Estimation

    Science.gov (United States)

    Iglesias Echevarria, David I.

    It is known that multiple robot systems that need to cooperate to perform certain activities or tasks incur high energy costs that hinder their autonomous functioning and limit the benefits provided to humans by these kinds of platforms. This work presents a communications-based method for cooperative robot localization. Implementing concepts from event-triggered estimation, used with success in the field of wireless sensor networks but rarely for robot localization, agents send measurements to their neighbors only when the expected novelty of the information is high. Since all agents know the condition that triggers whether a measurement is sent, the lack of a measurement is itself informative and is fused into the state estimates. In case agents receive neither direct nor indirect measurements of all others, they employ a covariance intersection fusion rule in order to keep the local covariance error metric bounded. A comprehensive analysis of the proposed algorithm and its estimation performance in a variety of scenarios is performed, and the algorithm is compared to similar cooperative localization approaches. Extensive simulations illustrate the effectiveness of this method.
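
    The covariance intersection fusion rule mentioned above can be written down in a few lines. The sketch below is a generic illustration of the rule (fusing two estimates whose cross-correlation is unknown), with made-up numbers rather than anything taken from the thesis.

    ```python
    # Sketch: covariance intersection (CI) fusion of two state estimates with
    # unknown cross-correlation, the rule used to keep local covariances
    # consistent when not all measurements are available. Illustrative values only.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def covariance_intersection(xa, Pa, xb, Pb):
        """Fuse estimates (xa, Pa) and (xb, Pb) without knowing their correlation."""
        Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)

        # Choose the weight that minimises the trace of the fused covariance.
        def fused_trace(w):
            return np.trace(np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv))

        w = minimize_scalar(fused_trace, bounds=(0.0, 1.0), method="bounded").x
        P = np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv)
        x = P @ (w * Pa_inv @ xa + (1 - w) * Pb_inv @ xb)
        return x, P

    xa, Pa = np.array([1.0, 2.0]), np.diag([0.5, 1.0])
    xb, Pb = np.array([1.2, 1.8]), np.diag([1.0, 0.4])
    x, P = covariance_intersection(xa, Pa, xb, Pb)
    print("fused state:", x, "\nfused covariance:\n", P)
    ```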

  19. Complexity and demographic explanations of cumulative culture

    NARCIS (Netherlands)

    Querbes, A.; Vaesen, K.; Houkes, W.N.

    2014-01-01

    Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological

  20. Fragmentation of tensor polarized deuterons into cumulative pions

    International Nuclear Information System (INIS)

    Afanas'ev, S.; Arkhipov, V.; Bondarev, V.

    1998-01-01

    The tensor analyzing power $T_{20}$ of the reaction $\vec{d} + A \to \pi^{-}(0^{\circ}) + X$ has been measured in the fragmentation of 9 GeV tensor polarized deuterons into pions with momenta from 3.5 to 5.3 GeV/c on hydrogen, beryllium and carbon targets. This kinematic range corresponds to the region of cumulative hadron production with the cumulative variable $x_{c}$ from 1.08 to 1.76. The values of $T_{20}$ have been found to be small and consistent with positive values. This contradicts the predictions based on a direct mechanism assuming an NN collision between a high-momentum nucleon in the deuteron and a target nucleon ($NN \to NN\pi$)

  1. Cumulants of heat transfer across nonlinear quantum systems

    Science.gov (United States)

    Li, Huanan; Agarwalla, Bijay Kumar; Li, Baowen; Wang, Jian-Sheng

    2013-12-01

    We consider thermal conduction across a general nonlinear phononic junction. Based on a two-time observation protocol and the nonequilibrium Green's function method, heat transfer in the steady-state regime is studied, and practical formulas for the calculation of the cumulant generating function are obtained. As an application, the general formalism is used to study anharmonic effects on the fluctuations of steady-state heat transfer across a single-site junction with a quartic nonlinear on-site pinning potential. An explicit nonlinear modification to the cumulant generating function, exact up to first order, is given, in which the Gallavotti-Cohen fluctuation symmetry is found to remain valid. Numerically, a self-consistent procedure is introduced, which works well for strong nonlinearity.
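
    For context, the Gallavotti-Cohen symmetry referred to above is commonly stated for steady-state heat transfer as below; the notation and sign conventions here are generic assumptions (they depend on how the transferred heat and the counting field are defined) and are not taken from the paper.

    ```latex
    % Generic steady-state fluctuation symmetry for the heat Q_tau transferred in
    % time tau between baths at inverse temperatures beta_L and beta_R; notation
    % and sign conventions are illustrative, not the paper's.
    \[
      \mathcal{G}(\xi) \;=\; \mathcal{G}\bigl(-\xi + i(\beta_R - \beta_L)\bigr)
      \qquad\Longleftrightarrow\qquad
      \frac{P(Q_\tau = Q)}{P(Q_\tau = -Q)} \;\simeq\; e^{(\beta_R - \beta_L)\,Q}
      \quad (\tau \to \infty),
    \]
    ```

    where $\mathcal{G}$ is the cumulant generating function and $\xi$ the counting field conjugate to $Q_\tau$.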

  2. Shallow geological structures triggered during the Mw 6.4 Meinong earthquake, southwestern Taiwan

    Directory of Open Access Journals (Sweden)

    Maryline Le Béon

    2017-01-01

    Full Text Available The Meinong earthquake generated up to ~10 cm surface displacement located 10 - 35 km west of the epicenter and monitored by InSAR and GPS. In addition to coseismic deformation related to the deep earthquake source, InSAR revealed three sharp surface displacement gradients. One of them is extensional and is inconsistent with the westward interseismic shortening of ~45 mm yr⁻¹ in this region. The gradient sharpness suggests slip triggering on shallow structures, some of which were not well documented before. To characterize these shallow structures, we investigated potential surface ruptures in the field. Sets of ~NS tension cracks distributed over 25 - 300 m width, with cumulative extension in the same order as InSAR observations, were found along 5.5 km distance along the extensional gradient and are interpreted as surface rupture. We build two E-W regional balanced cross-sections, based on surface geology, subsurface data, and coseismic and interseismic geodetic data. From the Coastal Plain to the east edge of the coseismic deformation area, we propose a series of three active west-dipping backthrusts: the Houchiali fault, the Napalin-Pitou backthrust, and the Lungchuan backthrust. They all root on the 3.5 - 4.0 km deep Tainan detachment located near the base of the 3-km-thick Gutingkeng mudstone. Further east, the detachment would ramp down to ~7 km depth. Coseismic surface deformation measurements suggest that, in addition to the deeper (15 - 20 km) main rupture plane, mostly the ramp, the Lungchuan backthrust, and the Tainan detachment were activated during or right after the earthquake. Local extension is considered as transient deformation at the west edge of the shallow main slip zone.

  3. Constraining the trigger for an ancient warming episode

    Science.gov (United States)

    Schultz, Colin

    2011-08-01

    The Paleocene epoch (~66-56 million years ago) was sandwiched between sudden climate shifts and mass extinctions. The boundary between the end of the Paleocene and the beginning of the Eocene (the P-E boundary) saw the global average temperature soar by 5°C over a few thousand years, leading to a pronounced reorganization of both terrestrial and oceanic plant and animal communities. The P-E boundary warming was triggered by an influx of atmospheric carbon dioxide, but the influx's ultimate trigger is still being debated. Other prominent warming events within the Paleogene (~66-23 million years ago), the broad time span that encompasses the Paleocene and Eocene, have been linked to regularly recurring changes in the eccentricity of the Earth's orbit that take place on 100,000- and 405,000-year cycles. Proponents of this view suggest that an alignment of the two cycles could lead to the warming of deep ocean waters, melting frozen methane and triggering an increase in atmospheric carbon dioxide. However, some studies have suggested that the P-E boundary warming was instead the product of geological processes, where carbon-rich rocks were baked by injected magma, which eventually liberated the carbon to the atmosphere. Deciding between proposed explanations for the cause of the P-E warming, whether they are astronomical or geological, depends on accurately pinning the event in time. (Geochemistry, Geophysics, Geosystems, doi:10.1029/2010GC003426, 2011)

  4. Trigger system study of the dimuon spectrometer in the ALICE experiment at CERN-LHC

    International Nuclear Information System (INIS)

    Roig, O.

    1999-12-01

    This work is a contribution to the study of nucleus-nucleus collisions at the LHC with ALICE. The aim of this experiment is to search for a new phase of matter, the quark-gluon plasma (QGP). The dimuon forward spectrometer should measure one of the most promising probes of the QGP, the production of heavy quark vector mesons (J/ψ, Υ, Υ', Υ'') through their muonic decays. The dimuon trigger selects the interesting events performing a cut on the transverse momentum of the tracks. The trigger decision is taken by a dedicated electronics using RPC ("Resistive Plate Chambers") detector information. We have made our own R and D program on the RPC detector with various beam tests. We show the performances obtained during these tests of a low resistivity RPC operating in streamer mode. The ALICE requirements concerning the rate capability, the cluster size and the time resolution are fulfilled. We have optimised the trigger with simulations which include a complete description of the read-out planes and the trigger logic (algorithm). In particular, a technique of clustering is proposed and validated. A method called "Ds reduction" is introduced in order to limit the effects of combinatorial background on the trigger rates. The efficiencies and the trigger rates are calculated for Pb-Pb, Ca-Ca, p-p collisions at the LHC. Other more sophisticated cuts, on the invariant mass for example, using again the RPC information have been simulated but have not shown significant improvements of the trigger rates. (author)

  5. Upgraded Readout and Trigger Electronics for the ATLAS Liquid Argon Calorimeter at the LHC at the Horizons 2018-2022

    CERN Document Server

    Oliveira Damazio, Denis; The ATLAS collaboration

    2013-01-01

    The ATLAS Liquid Argon (LAr) calorimeters produce a total of 182,486 signals which are digitized and processed by the front-end and back-end electronics at every triggered event. In addition, the front-end electronics is summing analog signals to provide coarsely grained energy sums, called trigger towers, to the first-level trigger system, which is optimized for nominal LHC luminosities. However, the pile-up noise expected during the High Luminosity phases of LHC will be increased by factors of 3 to 7. An improved spatial granularity of the trigger primitives is therefore proposed in order to improve the identification performance for trigger signatures, like electrons, photons, tau leptons, jets, total and missing energy, at high background rejection rates. For the first upgrade phase in 2018, new LAr Trigger Digitizer Boards (LTDB) are being designed to receive higher granularity signals, digitize them on detector and send them via fast optical links to a new digital processing system (DPS). The DPS applies...

  6. Upgraded Readout and Trigger Electronics for the ATLAS Liquid-Argon Calorimeters at the LHC at the Horizons 2018-2022

    CERN Document Server

    Damazio, D O; The ATLAS collaboration

    2013-01-01

    The ATLAS Liquid Argon (LAr) calorimeters produce a total of 182,486 signals which are digitized and processed by the front-end and back-end electronics at every triggered event. In addition, the front-end electronics is summing analog signals to provide coarsely grained energy sums, called trigger towers, to the first-level trigger system, which is optimized for nominal LHC luminosities. However, the pile-up noise expected during the High Luminosity phases of LHC will be increased by factors of 3 to 7. An improved spatial granularity of the trigger primitives is therefore proposed in order to improve the identification performance for trigger signatures, like electrons, photons, tau leptons, jets, total and missing energy, at high background rejection rates. For the first upgrade phase in 2018, new LAr Trigger Digitizer Boards (LTDB) are being designed to receive higher granularity signals, digitize them on detector and send them via fast optical links to a new digital processing system (DPS). The DPS applies...

  7. Cumulative watershed effects: a research perspective

    Science.gov (United States)

    Leslie M. Reid; Robert R. Ziemer

    1989-01-01

    A cumulative watershed effect (CWE) is any response to multiple land-use activities that is caused by, or results in, altered watershed function. The CWE issue is politically defined, as is the significance of particular impacts. But the processes generating CWEs are the traditional focus of geomorphology and ecology, and have thus been studied for decades. The CWE...

  8. Intelligent trigger processor for the crystal box

    International Nuclear Information System (INIS)

    Sanders, G.H.; Butler, H.S.; Cooper, M.D.

    1981-01-01

    A large solid angle modular NaI(Tl) detector with 432 phototubes and 88 trigger scintillators is being used to search simultaneously for three lepton flavor changing decays of the muon. A beam of up to 10^6 muons stopping per second with a 6% duty factor would yield up to 1000 triggers per second from random triple coincidences. A reduction of the trigger rate to 10 Hz is required from a hardwired primary trigger processor described in this paper. Further reduction to < 1 Hz is achieved by a microprocessor-based secondary trigger processor. The primary trigger hardware imposes voter coincidence logic, stringent timing requirements, and a non-adjacency requirement in the trigger scintillators defined by hardwired circuits. Sophisticated geometric requirements are imposed by a PROM-based matrix logic, and energy and vector-momentum cuts are imposed by a hardwired processor using LSI flash ADCs and digital arithmetic logic. The secondary trigger employs four satellite microprocessors to do a sparse data scan, multiplex the data acquisition channels and apply additional event filtering

  9. The ATLAS Level-1 Topological Trigger Performance

    CERN Document Server

    The ATLAS collaboration

    2016-01-01

    The LHC will collide protons in the ATLAS detector with increasing luminosity through 2016, placing stringent operational and physical requirements on the ATLAS trigger system in order to reduce the 40 MHz collision rate to a manageable event storage rate of 1 kHz, while not rejecting interesting physics events. The Level-1 trigger is the first rate-reducing step in the ATLAS trigger system, with an output rate of 100 kHz and a decision latency smaller than 2.5 μs. It consists of a calorimeter trigger, a muon trigger and a central trigger processor. During the LHC shutdown after Run 1 finished in 2013, the Level-1 trigger system was upgraded, including hardware, firmware and software updates. In particular, new electronics modules were introduced in the real-time data processing path: the Topological Processor System (L1Topo). It consists of a single AdvancedTCA shelf equipped with two Level-1 topological processor blades. They receive real-time information from the Level-1 calorimeter and muon triggers, which...

  10. The ATLAS trigger high-level trigger commissioning and operation during early data taking

    CERN Document Server

    Goncalo, R

    2008-01-01

    The ATLAS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre of mass energy of 14 TeV, with a bunch-crossing rate of 40 MHz. The ATLAS three-level trigger will reduce this input rate to match the foreseen offline storage capability of 100-200 Hz. After the Level 1 trigger, which is implemented in custom hardware, the High-Level Trigger (HLT) further reduces the rate from up to 100 kHz to the offline storage rate while retaining the most interesting physics. The HLT is implemented in software running in commercially available computer farms and consists of Level 2 and Event Filter. To reduce the network data traffic and the processing time to manageable levels, the HLT uses seeded, step-wise reconstruction, aiming at the earliest possible rejection. Data produced during LHC commissioning will be vital for calibrating and aligning sub-detectors, as well as for testing the ATLAS trigger and setting up t...

  11. Do Holocaust survivors show increased vulnerability or resilience to post-Holocaust cumulative adversity?

    Science.gov (United States)

    Shrira, Amit; Palgi, Yuval; Ben-Ezra, Menachem; Shmotkin, Dov

    2010-06-01

    Prior trauma can hinder coping with additional adversity or inoculate against the effect of recurrent adversity. The present study further addressed this issue by examining whether a subsample of Holocaust survivors and comparison groups, drawn from the Israeli component of the Survey of Health, Ageing, and Retirement in Europe, were differentially affected by post-Holocaust cumulative adversity. Post-Holocaust cumulative adversity had a stronger effect on the lifetime depression of Holocaust survivors than on that of comparisons. However, comparisons were more negatively affected by post-Holocaust cumulative adversity when examining markers of physical and cognitive functioning. Our findings suggest that previous trauma can both sensitize and immunize, as Holocaust survivors show general resilience intertwined with specific vulnerability when confronted with additional cumulative adversity.

  12. Investigation of index finger triggering force using a cadaver experiment: Effects of trigger grip span, contact location, and internal tendon force.

    Science.gov (United States)

    Chang, Joonho; Freivalds, Andris; Sharkey, Neil A; Kong, Yong-Ku; Mike Kim, H; Sung, Kiseok; Kim, Dae-Min; Jung, Kihyo

    2017-11-01

    A cadaver study was conducted to investigate the effects of triggering conditions (trigger grip span, contact location, and internal tendon force) on index finger triggering force and the force efficiency of involved tendons. Eight right human cadaveric hands were employed, and a motion simulator was built to secure and control the specimens. Index finger triggering forces were investigated as a function of different internal tendon forces (flexor digitorum profundus + flexor digitorum superficialis = 40, 70, and 100 N), trigger grip spans (40, 50, and 60 mm), and contact locations between the index finger and a trigger. Triggering forces significantly increased when internal tendon forces increased from 40 to 100 N. Also, trigger grip spans and contact locations had significant effects on triggering forces; maximum triggering forces were found at a 50 mm span and the most proximal contact location. The results revealed that only 10-30% of internal tendon forces were converted to their external triggering forces.

  13. Cumulative second-harmonic generation of Lamb waves propagating in a two-layered solid plate

    International Nuclear Information System (INIS)

    Xiang Yanxun; Deng Mingxi

    2008-01-01

    The physical process of cumulative second-harmonic generation of Lamb waves propagating in a two-layered solid plate is presented by using the second-order perturbation and the technique of nonlinear reflection of acoustic waves at an interface. In general, cumulative second-harmonic generation does not occur for a dispersive guided wave. However, the present paper shows that the second harmonic of a propagating Lamb wave, arising from the nonlinear interaction of the partial bulk acoustic waves and the restriction imposed by the three boundaries of the solid plates, does have a cumulative growth effect if some conditions are satisfied. Through the boundary condition and the initial condition of excitation, the analytical expression of the cumulative second harmonic of Lamb wave propagation is determined. Numerical results show the cumulative effect of Lamb waves on second-harmonic field patterns. (classical areas of phenomenology)

  14. Steps and pips in the history of the cumulative recorder.

    OpenAIRE

    Lattal, Kennon A

    2004-01-01

    From its inception in the 1930s until very recent times, the cumulative recorder was the most widely used measurement instrument in the experimental analysis of behavior. It was an essential instrument in the discovery and analysis of schedules of reinforcement, providing the first real-time analysis of operant response rates and patterns. This review traces the evolution of the cumulative recorder from Skinner's early modified kymographs through various models developed by Skinner and his co...

  15. Neighborhood-targeted and case-triggered use of a single dose of oral cholera vaccine in an urban setting: Feasibility and vaccine coverage.

    Science.gov (United States)

    Parker, Lucy A; Rumunu, John; Jamet, Christine; Kenyi, Yona; Lino, Richard Laku; Wamala, Joseph F; Mpairwe, Allan M; Muller, Vincent; Llosa, Augusto E; Uzzeni, Florent; Luquero, Francisco J; Ciglenecki, Iza; Azman, Andrew S

    2017-06-01

    In June 2015, a cholera outbreak was declared in Juba, South Sudan. In addition to standard outbreak control measures, oral cholera vaccine (OCV) was proposed. As sufficient doses to cover the at-risk population were unavailable, a campaign using half the standard dosing regimen (one-dose) targeted high-risk neighborhoods and groups including neighbors of suspected cases. Here we report the operational details of this first public health use of a single-dose regimen of OCV and illustrate the feasibility of conducting highly targeted vaccination campaigns in an urban area. Neighborhoods of the city were prioritized for vaccination based on cumulative attack rates, active transmission and local knowledge of known cholera risk factors. OCV was offered to all persons older than 12 months at 20 fixed sites and to select groups, including neighbors of cholera cases after the main campaign ('case-triggered' interventions), through mobile teams. Vaccination coverage was estimated by multi-stage surveys using spatial sampling techniques. 162,377 individuals received a single-dose of OCV in the targeted neighborhoods. In these neighborhoods vaccine coverage was 68.8% (95% Confidence Interval (CI), 64.0-73.7) and was highest among children ages 5-14 years (90.0%, 95% CI 85.7-94.3), with adult men being less likely to be vaccinated than adult women (Relative Risk 0.81, 95% CI: 0.68-0.96). In the case-triggered interventions, each lasting 1-2 days, coverage varied (range: 30-87%) with an average of 51.0% (95% CI 41.7-60.3). Vaccine supply constraints and the complex realities where cholera outbreaks occur may warrant the use of flexible alternative vaccination strategies, including highly-targeted vaccination campaigns and single-dose regimens. We showed that such campaigns are feasible. Additional work is needed to understand how and when to use different strategies to best protect populations against epidemic cholera.

  16. Hierarchical trigger of the ALICE calorimeters

    CERN Document Server

    Muller, Hans; Novitzky, Norbert; Kral, Jiri; Rak, Jan; Schambach, Joachim; Wang, Ya-Ping; Wang, Dong; Zhou, Daicui

    2010-01-01

    The trigger of the ALICE electromagnetic calorimeters is implemented in 2 hierarchically connected layers of electronics. In the lower layer, level-0 algorithms search for shower energy above threshold in locally confined Trigger Region Units (TRU). The top layer is implemented as a single, global trigger unit that receives the trigger data from all TRUs as input to the level-1 algorithm. This architecture was first developed for the PHOS high-pT photon trigger before it was adopted by EMCal also for the jet trigger. TRU units digitize up to 112 analogue input signals from the Front End Electronics (FEE) and concentrate their digital stream in a single FPGA. A charge and time summing algorithm is combined with a peak finder that suppresses spurious noise and is precise to single LHC bunches. With a peak-to-peak noise level of 150 MeV, the linear dynamic range above threshold spans from MIP energies at 215 MeV up to 50 GeV. Local level-0 decisions take less than 600 ns after LHC collisions, upon which all TRUs transfer ...

  17. Level-1 Calorimeter Trigger starts firing

    CERN Multimedia

    Stephen Hillier

    2007-01-01

    L1Calo is one of the major components of ATLAS First Level trigger, along with the Muon Trigger and Central Trigger Processor. It forms all of the first-level calorimeter-based triggers, including electron, jet, tau and missing ET. The final system consists of over 250 custom designed 9U VME boards, most containing a dense array of FPGAs or ASICs. It is subdivided into a PreProcessor, which digitises the incoming trigger signals from the Liquid Argon and Tile calorimeters, and two separate processor systems, which perform the physics algorithms. All of these are highly flexible, allowing the possibility to adapt to beam conditions and luminosity. All parts of the system are read out through Read-Out Drivers, which provide monitoring data and Region of Interest (RoI) information for the Level-2 trigger. Production of the modules is now essentially complete, and enough modules exist to populate the full scale system in USA15. Installation is proceeding rapidly - approximately 90% of the final modules are insta...

  18. Shale Gas Development and Brook Trout: Scaling Best Management Practices to Anticipate Cumulative Effects

    Science.gov (United States)

    Smith, David; Snyder, Craig D.; Hitt, Nathaniel P.; Young, John A.; Faulkner, Stephen P.

    2012-01-01

    Shale gas development may involve trade-offs between energy development and benefits provided by natural ecosystems. However, current best management practices (BMPs) focus on mitigating localized ecological degradation. We review evidence for cumulative effects of natural gas development on brook trout (Salvelinus fontinalis) and conclude that BMPs should account for potential watershed-scale effects in addition to localized influences. The challenge is to develop BMPs in the face of uncertainty in the predicted response of brook trout to landscape-scale disturbance caused by gas extraction. We propose a decision-analysis approach to formulating BMPs in the specific case of relatively undisturbed watersheds where there is consensus to maintain brook trout populations during gas development. The decision analysis was informed by existing empirical models that describe brook trout occupancy responses to landscape disturbance and set bounds on the uncertainty in the predicted responses to shale gas development. The decision analysis showed that a high efficiency of gas development (e.g., 1 well pad per square mile and 7 acres per pad) was critical to achieving a win-win solution characterized by maintaining brook trout and maximizing extraction of available gas. This finding was invariant to uncertainty in predicted response of brook trout to watershed-level disturbance. However, as the efficiency of gas development decreased, the optimal BMP depended on the predicted response, and there was considerable potential value in discriminating among predictive models through adaptive management or research. The proposed decision-analysis framework provides an opportunity to anticipate the cumulative effects of shale gas development, account for uncertainty, and inform management decisions at the appropriate spatial scales.

  19. Schmitt-Trigger-based Recycling Sensor and Robust and High-Quality PUFs for Counterfeit IC Detection

    OpenAIRE

    Lin, Cheng-Wei; Jang, Jae-Won; Ghosh, Swaroop

    2015-01-01

    We propose a Schmitt-Trigger (ST) based recycling sensor that is tailored to amplify the aging mechanisms and detect fine-grained recycling (minutes to seconds). We exploit the susceptibility of the ST to process variations to realize a high-quality arbiter PUF. Conventional SRAM PUFs suffer from environmental fluctuation-induced bit flipping. We propose an 8T SRAM PUF with a back-to-back PMOS latch to improve robustness by 4X. We also propose a low-power 7T SRAM with embedded Magnetic Tunnel Junction (...

  20. 76 FR 16813 - Proposed Information Collection; OMB Control Number 1024-0038

    Science.gov (United States)

    2011-03-25

    ... 1993 Government Performance and Results Act, as amended. This request for OMB approval includes local... Report'' that includes the Cumulative Products Table (which compares actual to proposed performance), a.... This set of information collections has an impact on State, tribal, and local governments that wish to...

  1. The CDF level-3 trigger

    International Nuclear Information System (INIS)

    Devlin, T.

    1993-01-01

    The Collider Detector at Fermilab (CDF) has been operating at the Tevatron and collecting data on proton-antiproton interactions with collision rates above 250,000 Hz. Three levels of filtering select events for data logging at a rate of about 4 Hz. The Level 3 trigger provides most of the capabilities of the offline production programs for event reconstruction and physics analysis. The type of physics triggers, application of cuts, and combinations of logical requirements for event selection are controlled at run time by a trigger table using a syntax fully integrated with the Level 1 and Level 2 hardware triggers. The level 3 software operates in 48 RISC/UNIX processors (over 1000 mips) served by four 20-MByte/sec data buses for input, output and control. The system architecture, debugging, code validation, error reporting, analysis capabilities and performance will be described

  2. MR imaging findings of trigger thumb

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Eric Y.; Chen, Karen C.; Chung, Christine B. [VA San Diego Healthcare System, Radiology Service, San Diego, CA (United States); University of California, San Diego Medical Center, Department of Radiology, San Diego, CA (United States)

    2015-08-15

    Trigger finger (or trigger thumb), also known as sclerosing tenosynovitis, is a common clinical diagnosis that rarely presents for imaging. Because of this selection bias, many radiologists may not be familiar with the process. Furthermore, patients who do present for imaging frequently have misleading examination indications. To our knowledge, magnetic resonance (MR) imaging findings of trigger thumb have not been previously reported in the literature. In this article, we review the entity of trigger thumb, the anatomy involved, and associated imaging findings, which include flexor pollicis longus tendinosis with a distinct nodule, A1 pulley thickening, and tenosynovitis. In addition, in some cases, an abnormal Av pulley is apparent. In the rare cases of trigger finger that present for MR imaging, accurate diagnosis by the radiologist can allow initiation of treatment and avoid further unnecessary workup. (orig.)

  3. MR imaging findings of trigger thumb

    International Nuclear Information System (INIS)

    Chang, Eric Y.; Chen, Karen C.; Chung, Christine B.

    2015-01-01

    Trigger finger (or trigger thumb), also known as sclerosing tenosynovitis, is a common clinical diagnosis that rarely presents for imaging. Because of this selection bias, many radiologists may not be familiar with the process. Furthermore, patients who do present for imaging frequently have misleading examination indications. To our knowledge, magnetic resonance (MR) imaging findings of trigger thumb have not been previously reported in the literature. In this article, we review the entity of trigger thumb, the anatomy involved, and associated imaging findings, which include flexor pollicis longus tendinosis with a distinct nodule, A1 pulley thickening, and tenosynovitis. In addition, in some cases, an abnormal Av pulley is apparent. In the rare cases of trigger finger that present for MR imaging, accurate diagnosis by the radiologist can allow initiation of treatment and avoid further unnecessary workup. (orig.)

  4. Extending the relationship between global warming and cumulative carbon emissions to multi-millennial timescales

    International Nuclear Information System (INIS)

    Frölicher, Thomas L; Paynter, David J

    2015-01-01

    The transient climate response to cumulative carbon emissions (TCRE) is a highly policy-relevant quantity in climate science. The TCRE suggests that peak warming is linearly proportional to cumulative carbon emissions and nearly independent of the emissions scenario. Here, we use simulations of the Earth System Model (ESM) from the Geophysical Fluid Dynamics Laboratory (GFDL) to show that global mean surface temperature may increase by 0.5 °C after carbon emissions are stopped at 2 °C global warming, implying an increase in the coefficient relating global warming to cumulative carbon emissions on multi-centennial timescales. The simulations also suggest a 20% lower quota on cumulative carbon emissions allowed to achieve a policy-driven limit on global warming. ESM estimates from the Coupled Model Intercomparison Project Phase 5 (CMIP5–ESMs) qualitatively agree on this result, whereas Earth System Models of Intermediate Complexity (EMICs) simulations, used in the IPCC 5th assessment report to assess the robustness of TCRE on multi-centennial timescales, suggest a post-emissions decrease in temperature. The reason for this discrepancy lies in the smaller simulated realized warming fraction in CMIP5–ESMs, including GFDL ESM2M, than in EMICs when carbon emissions increase. The temperature response to cumulative carbon emissions can be characterized by three different phases and the linear TCRE framework is only valid during the first phase when carbon emissions increase. For longer timescales, when emissions tape off, two new metrics are introduced that better characterize the time-dependent temperature response to cumulative carbon emissions: the equilibrium climate response to cumulative carbon emissions and the multi-millennial climate response to cumulative carbon emissions. (letter)
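
    As a compact reminder of the linear regime described above (valid while emissions are still rising), the TCRE relation can be written as below; the symbols are chosen here for illustration and are not the letter's own notation.

    ```latex
    % Linear TCRE relation during the emissions phase; symbols are illustrative.
    \[
      \Delta T(t) \;\approx\; \Lambda_{\mathrm{TCRE}}\, E_{\mathrm{cum}}(t),
      \qquad
      E_{\mathrm{cum}}(t) \;=\; \int_{t_0}^{t} E(t')\,\mathrm{d}t',
    \]
    ```

    where $E(t)$ is the carbon emission rate and $\Lambda_{\mathrm{TCRE}}$ is approximately scenario-independent; the two additional metrics introduced in the letter address the later phases, after emissions stop, where this proportionality no longer holds.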

  5. Trigger processing using reconfigurable logic in the CMS calorimeter trigger

    Energy Technology Data Exchange (ETDEWEB)

    Brooke, J J; Cussans, D G; Heath, G P; Maddox, A J; Newbold, D M; Rabbetts, P D

    2001-04-01

    We present the design of the Global Calorimeter Trigger processor for the CMS detector at LHC. This is a fully pipelined processor system which collects data from all the CMS calorimeters and produces summary information used in forming the Level-1 trigger decision for each event. The design is based on the use of state-of-the-art reconfigurable logic devices (FPGAs) and fast data links. We present the results of device testing using a low-latency pipelined sort algorithm, which demonstrate that an FPGA can be used to perform processing previously foreseen to require custom ASICs. Our design approach results in a powerful, flexible and compact processor system.

  6. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
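
    A minimal numerical sketch of the idea is given below, using a simple wild-multiplier approximation of the null distribution; this is a generic simplification for illustration, not the exact resampling construction of the paper.

    ```python
    # Sketch: cumulative residuals over a covariate, compared with realizations
    # obtained by perturbing the residuals with standard normal multipliers.
    # Generic illustration only, not the exact construction of Lin, Wei, and Ying.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500
    x = rng.uniform(0, 1, n)
    y = 1.0 + 2.0 * x**2 + rng.normal(scale=0.3, size=n)   # true model is quadratic

    # Fit a deliberately misspecified linear model and form residuals.
    X = np.column_stack([np.ones(n), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta

    order = np.argsort(x)
    observed = np.cumsum(resid[order]) / np.sqrt(n)

    # Null realizations: multiply residuals by i.i.d. N(0,1) variables and re-cumulate.
    realizations = np.array([
        np.cumsum((resid * rng.normal(size=n))[order]) / np.sqrt(n)
        for _ in range(1000)
    ])

    sup_obs = np.abs(observed).max()
    p_value = (np.abs(realizations).max(axis=1) >= sup_obs).mean()
    print(f"sup|W| = {sup_obs:.3f}, approximate p-value = {p_value:.3f}")
    ```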

  7. Trigger Finger

    Science.gov (United States)

    ... in a bent position. People whose work or hobbies require repetitive gripping actions are at higher risk ... developing trigger finger include: Repeated gripping. Occupations and hobbies that involve repetitive hand use and prolonged gripping ...

  8. Distributed Nonlinear Control with Event-Triggered Communication to Achieve Current-Sharing and Voltage Regulation in DC Microgrids

    DEFF Research Database (Denmark)

    Han, Renke; Meng, Lexuan; Guerrero, Josep M.

    2018-01-01

    A distributed nonlinear controller is presented to achieve both accurate current-sharing and voltage regulation simultaneously in dc microgrids, considering different line impedances' effects among converters. Then, an improved event-triggered principle for the controller is introduced through combining the state-dependent tolerance with a nonnegative offset. In order to design the event-triggered principle and guarantee the global stability, a generalized dc microgrid model is proposed and proven to be positive definite, based on which a Lyapunov-based approach is applied. Furthermore, considering ... for precise real-time information transmission, without sacrificing system performance. Experimental results obtained from a dc microgrid setup show the robustness of the new proposal under normal, communication failure, communication delay and plug-and-play operation conditions. Finally, communication...

  9. Secondary Restoration Control of Islanded Microgrids With Decentralized Event-triggered Strategy

    DEFF Research Database (Denmark)

    Guerrero, Josep M.; Chen, Meng; Xiao, Xiangning

    2018-01-01

    Distributed cooperative control methods attract more and more attention in microgrid secondary control because they are more reliable and flexible. However, the traditional methods rely on periodic communication, which is neither economic nor efficient due to its large communication burden. ... In the feedback control laws, the proposed control strategies just require the communication between distributed secondary controllers at some particular instants, while having the frequency and voltage restoration function and accurate active power sharing. The stability and inter-event interval are also analyzed in this paper. An islanded microgrid test system is built in PSCAD/EMTDC to validate the proposed control strategies. It shows that the proposed secondary control strategies based on the event-triggered approach can highly reduce the inter-agent communication.

  10. The Scarring Effects of Bankruptcy: Cumulative Disadvantage across Credit and Labor Markets

    Science.gov (United States)

    Maroto, Michelle

    2012-01-01

    As the recent economic crisis has demonstrated, inequality often spans credit and labor markets, supporting a system of cumulative disadvantage. Using data from the National Longitudinal Survey of Youth, this research draws on stigma, cumulative disadvantage and status characteristics theories to examine whether credit and labor markets intersect…

  11. A damage cumulation method for crack initiation prediction under non proportional loading and overloading

    International Nuclear Information System (INIS)

    Taheri, S.

    1992-04-01

    For a sequence of constant-amplitude cyclic loadings containing overloads, we propose a method for damage cumulation under non-proportional loading. This method uses as data the cyclically stabilized states under non-proportional loading and the initiation (fatigue) curve in the uniaxial case. For that, we take into account the dependence of the Cyclic Strain Stress Curves (C.S.S.C.) and of the mean cell size on prehardening, and we define a stabilized uniaxial state cyclically equivalent to a non-proportional stabilized state through a family of C.S.S.C. Although simple assumptions such as a linear damage function and linear cumulation are used, we obtain a sequence effect for difficult-cross-slip materials such as 316 stainless steel, but the Miner rule for easy-cross-slip materials. We then show differences between a load-controlled test and a strain-controlled test: for 316 stainless steel in a load-controlled test, the non-proportional loading at each cycle is less damaging than the uniaxial one for the same equivalent stress, while the result is the opposite in a strain-controlled test. We also show that an overload retards initiation in a load-controlled test while it accelerates initiation in a strain-controlled test. (author). 26 refs., 8 figs
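
    For contrast with the sequence-sensitive method above, the linear (Miner) cumulation baseline can be sketched in a few lines; the Basquin-type S-N constants below are hypothetical placeholders, not values from the report.

    ```python
    # Minimal linear (Miner) damage-cumulation sketch for a block-loading sequence.
    # The S-N curve constants are hypothetical, only to make the example run; the
    # report's own method adds sequence effects that plain Miner summation misses.
    def cycles_to_failure(stress_amplitude, C=1e12, m=3.0):
        """Hypothetical Basquin-type S-N curve: N_f = C / S^m."""
        return C / stress_amplitude**m

    def miner_damage(blocks):
        """blocks: iterable of (stress_amplitude, applied_cycles) pairs."""
        return sum(n / cycles_to_failure(s) for s, n in blocks)

    # Example: two constant-amplitude blocks with an overload block in between.
    sequence = [(120.0, 50_000), (200.0, 1_000), (120.0, 50_000)]
    D = miner_damage(sequence)
    print(f"cumulative damage D = {D:.3f} (initiation predicted when D reaches 1)")
    ```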

  12. Proposal of the penalty factor equations considering weld strength over-match

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Jeong, Jae Wook; Lee, Kang Yong

    2017-01-01

    This paper proposes penalty factor equations that take into consideration the weld strength over-match given in the classified form similar to the revised equations presented in the Code Case N-779 via cyclic elastic-plastic finite element analysis. It was found that the Ke analysis data reflecting elastic follow-up can be consolidated by normalizing the primary-plus-secondary stress intensity ranges excluding the nonlinear thermal stress intensity component, Sn, to the over-match degree of yield strength, MF. For the effect of over-match on Kn × Kν, dispersion of the Kn × Kν analysis data can be sharply reduced by dividing the total stress intensity range, excluding local thermal stresses, Sp-lt, by MF. Finally, the proposed equations were applied to the weld between the safe end and the piping of a pressurizer surge nozzle in pressurized water reactors in order to calculate a cumulative usage factor. The cumulative usage factor was then compared with those derived by the previous Ke factor equations. The result shows that application of the proposed equations can significantly reduce conservatism of fatigue assessment using the previous Ke factor equations.

  13. Proposal of the Penalty Factor Equations Considering Weld Strength Over-Match

    Directory of Open Access Journals (Sweden)

    Jong-Sung Kim

    2017-06-01

    Full Text Available This paper proposes penalty factor equations that take into consideration the weld strength over-match given in the classified form similar to the revised equations presented in the Code Case N-779 via cyclic elastic-plastic finite element analysis. It was found that the Ke analysis data reflecting elastic follow-up can be consolidated by normalizing the primary-plus-secondary stress intensity ranges excluding the nonlinear thermal stress intensity component, Sn to over-match degree of yield strength, MF. For the effect of over-match on Kn × Κν, dispersion of the Kn × Κν analysis data can be sharply reduced by dividing total stress intensity range, excluding local thermal stresses, Sp-lt by MF. Finally, the proposed equations were applied to the weld between the safe end and the piping of a pressurizer surge nozzle in pressurized water reactors in order to calculate a cumulative usage factor. The cumulative usage factor was then compared with those derived by the previous Ke factor equations. The result shows that application of the proposed equations can significantly reduce conservatism of fatigue assessment using the previous Ke factor equations.

  14. Problems of describing the cumulative effect in relativistic nuclear physics

    International Nuclear Information System (INIS)

    Baldin, A.M.

    1979-01-01

    The problem of describing the cumulative effect, i.e., particle production on nuclei in the range kinematically forbidden for one-nucleon collisions, is studied. Discrimination of events containing cumulative particles fixes configurations in the wave function of a nucleus in which several nucleons are closely spaced and their quark-parton components are collectivized. For the cumulative processes under consideration, large distances between quarks are very important. The fundamental facts and the theoretical interpretation from quantum field theory and the theory of condensed media relevant to relativistic nuclear physics are presented in brief. Collisions of relativistic nuclei with low momentum transfers are considered in a fast-moving coordinate system. The basic parameter determining this type of collision is the energy of nucleon binding in nuclei. It has been shown that the short-range correlation model provides a good description of many characteristics of multiple particle production and may be regarded as an approximate universal property of hadron interactions.

  15. Stability Analysis and Trigger Control of LLC Resonant Converter for a Wide Operational Range

    Directory of Open Access Journals (Sweden)

    Zhijian Fang

    2017-09-01

    Full Text Available The gain of an LLC resonant converter can vary with the load, which can be used to improve the efficiency and power density in some special applications where the maximum gain is not required at the heaviest loads. However, the nonlinear gain characteristics can make the converter unstable during a major disturbance. In this paper, the stability of an LLC resonant converter during a major disturbance is studied and a trigger control scheme is proposed to improve the converter's stability by extending the converter's operational range. Through an in-depth analysis of the gain curve of the LLC resonant converter, we find that the switching frequency range is one of the key factors determining the system's stability performance. The same result is also obtained from a mathematical point of view by utilizing the mixed potential function method. Then a trigger control method is proposed to make the LLC resonant converter stable even during a major disturbance, which can be used to extend the converter's operational range. Finally, experimental results are given to verify the analysis and the proposed control scheme.

  16. Triggers for a high sensitivity charm experiment

    International Nuclear Information System (INIS)

    Christian, D.C.

    1994-07-01

    Any future charm experiment clearly should implement an E_T trigger and a μ trigger. In order to reach the 10^8 reconstructed charm level for hadronic final states, a high quality vertex trigger will almost certainly also be necessary. The best hope for the development of an offline quality vertex trigger lies in further development of the ideas of data-driven processing pioneered by the Nevis/U. Mass. group

  17. Commissioning the ATLAS Level-1 Central Trigger System

    CERN Document Server

    Sherman, Daniel

    2010-01-01

    The ATLAS Level-1 central trigger is a critical part of ATLAS operation. It receives the 40 MHz bunch clock from the LHC and distributes it to all sub-detectors. It initiates their read-out by forming the Level-1 Accept decision, which is based on information from the calorimeter and muon trigger processors and a variety of additional trigger inputs from detectors in the forward region. It also provides trigger summary information to the data acquisition system and the Level-2 trigger system. In this paper, we present the completion of the installed central trigger system, its performance during cosmic-ray data taking and the experience gained with triggering on the first LHC beams.

  18. D0 triggering and data acquisition

    International Nuclear Information System (INIS)

    Gibbard, B.

    1992-10-01

    The trigger for D0 is a multi-tier system. Within the 3.5 μsec bunch crossing interval, custom electronics select interesting event candidates based on electromagnetic and hadronic energy deposits in the calorimeter and on indications of tracks in the muon system. Subsequent hardware decisions use refined calculations of electron and muon characteristics. The highest level trigger occurs in one element of a farm of microprocessors, where fully developed algorithms for electrons, muons, jets, or missing E_T are executed. This highest level trigger also provides the assembly of the event into its final data structure. Performance of this trigger and data acquisition system in collider operation is described

  19. Reliability of physical examination for diagnosis of myofascial trigger points: a systematic review of the literature.

    Science.gov (United States)

    Lucas, Nicholas; Macaskill, Petra; Irwig, Les; Moran, Robert; Bogduk, Nikolai

    2009-01-01

    Trigger points are promoted as an important cause of musculoskeletal pain. There is no accepted reference standard for the diagnosis of trigger points, and data on the reliability of physical examination for trigger points are conflicting. To systematically review the literature on the reliability of physical examination for the diagnosis of trigger points. MEDLINE, EMBASE, and other sources were searched for articles reporting the reliability of physical examination for trigger points. Included studies were evaluated for their quality and applicability, and reliability estimates were extracted and reported. Nine studies were eligible for inclusion. None satisfied all quality and applicability criteria. No study specifically reported reliability for the identification of the location of active trigger points in the muscles of symptomatic participants. Reliability estimates varied widely for each diagnostic sign, for each muscle, and across each study. Reliability estimates were generally higher for subjective signs such as tenderness (kappa range, 0.22-1.0) and pain reproduction (kappa range, 0.57-1.00), and lower for objective signs such as the taut band (kappa range, -0.08-0.75) and local twitch response (kappa range, -0.05-0.57). No study to date has reported the reliability of trigger point diagnosis according to the currently proposed criteria. On the basis of the limited number of studies available, and significant problems with their design, reporting, statistical integrity, and clinical applicability, physical examination cannot currently be recommended as a reliable test for the diagnosis of trigger points. The reliability of trigger point diagnosis needs to be further investigated with studies of high quality that use current diagnostic criteria in clinically relevant patients.

  20. Energy Current Cumulants in One-Dimensional Systems in Equilibrium

    Science.gov (United States)

    Dhar, Abhishek; Saito, Keiji; Roy, Anjan

    2018-06-01

    A recent theory based on fluctuating hydrodynamics predicts that one-dimensional interacting systems with particle, momentum, and energy conservation exhibit anomalous transport that falls into two main universality classes. The classification is based on behavior of equilibrium dynamical correlations of the conserved quantities. One class is characterized by sound modes with Kardar-Parisi-Zhang scaling, while the second class has diffusive sound modes. The heat mode follows Lévy statistics, with different exponents for the two classes. Here we consider heat current fluctuations in two specific systems, which are expected to be in the above two universality classes, namely, a hard particle gas with Hamiltonian dynamics and a harmonic chain with momentum conserving stochastic dynamics. Numerical simulations show completely different system-size dependence of current cumulants in these two systems. We explain this numerical observation using a phenomenological model of Lévy walkers with inputs from fluctuating hydrodynamics. This consistently explains the system-size dependence of heat current fluctuations. For the latter system, we derive the cumulant-generating function from a more microscopic theory, which also gives the same system-size dependence of cumulants.

  1. Triggering ovulation with gonadotropin-releasing hormone agonist versus human chorionic gonadotropin in polycystic ovarian syndrome. A randomized trial

    Directory of Open Access Journals (Sweden)

    Amr Hassaan Farag

    2015-12-01

    Full Text Available Objectives: To compare GnRH agonist to hCG for triggering ovulation in polycystic ovarian syndrome treated with clomiphene citrate. Study design: Prospective randomized study. Materials & methods: Eighty-five infertile women with PCOS participated in a randomized allocation-concealed prospective trial and had induction of ovulation with clomiphene citrate. GnRH agonist 0.2 mg subcutaneously (group 1) or hCG 10,000 IU intramuscularly (group 2) was given to trigger ovulation. Primary outcome was mid-luteal serum progesterone, while secondary outcomes were ovulation rates and clinical pregnancy rates along 3 cycles. Results: No difference was found between group 1 and group 2 regarding mean serum progesterone and clinical pregnancy rates in each cycle. Cumulative pregnancy rates were similar (17.14% versus 20%, respectively; P = 0.332). Ovulation rates were 80% versus 68.6% (P = 0.413), 94.3% versus 90.9% (P = 0.669), and 97.1% versus 93.7% (P = 0.603) in the two groups, respectively. However, a significant rise in the number of patients with mid-luteal serum progesterone >10 ng/mL was noted in the 3rd cycle in both groups (P < 0.0001 for group 1, P = 0.007 for group 2). Conclusion: Triggering ovulation with GnRH-a after treatment with clomiphene citrate in PCOS, in view of its known protective effect against OHSS, may be an effective physiological alternative to conventional hCG without compromising luteal function and pregnancy rates after repeated cycles of treatment.

  2. A Novel in situ Trigger Combination Method

    International Nuclear Information System (INIS)

    Buzatu, Adrian; Warburton, Andreas; Krumnack, Nils; Yao, Wei-Ming

    2012-01-01

    Searches for rare physics processes using particle detectors in high-luminosity colliding hadronic beam environments require the use of multi-level trigger systems to reject colossal background rates in real time. In analyses like the search for the Higgs boson, there is a need to maximize the signal acceptance by combining multiple different trigger chains when forming the offline data sample. In such statistically limited searches, datasets are often amassed over periods of several years, during which the trigger characteristics evolve and their performance can vary significantly. Reliable production cross-section measurements and upper limits must take into account a detailed understanding of the effective trigger inefficiency for every selected candidate event. We present as an example the complex situation of three trigger chains, based on missing energy and jet energy, to be combined in the context of the search for the Higgs (H) boson produced in association with a W boson at the Collider Detector at Fermilab (CDF). We briefly review the existing techniques for combining triggers, namely the inclusion, division, and exclusion methods. We introduce and describe a novel fourth in situ method whereby, for each candidate event, only the trigger chain with the highest a priori probability of selecting the event is considered. The in situ combination method has advantages of scalability to large numbers of differing trigger chains and of insensitivity to correlations between triggers. We compare the inclusion and in situ methods for signal event yields in the CDF WH search.
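
    The in situ rule itself is easy to express in code. The sketch below is schematic: the two trigger chains and their turn-on parametrisations are hypothetical placeholders, not the CDF chains or measured efficiencies.

    ```python
    # Schematic of the in situ combination rule: for each event, only the trigger
    # chain with the highest a priori efficiency is considered, and that chain's
    # efficiency becomes the per-event weight. The chains and turn-on curves here
    # are hypothetical placeholders, not the CDF trigger parametrisations.
    import numpy as np

    def met_trigger_eff(met):            # hypothetical turn-on curve vs missing ET
        return 1.0 / (1.0 + np.exp(-(met - 35.0) / 5.0))

    def metjet_trigger_eff(met, jet_et): # hypothetical 2D parametrisation
        return (1.0 / (1.0 + np.exp(-(met - 25.0) / 5.0))) * (jet_et > 20.0)

    def in_situ_weight(met, jet_et):
        """Per-event trigger choice and weight under the in situ method."""
        effs = {
            "MET": met_trigger_eff(met),
            "MET+JET": metjet_trigger_eff(met, jet_et),
        }
        best_chain = max(effs, key=effs.get)   # highest a priori probability
        return best_chain, effs[best_chain]

    rng = np.random.default_rng(2)
    events = zip(rng.uniform(20, 80, 5), rng.uniform(15, 60, 5))
    for met, jet_et in events:
        chain, w = in_situ_weight(met, jet_et)
        print(f"MET={met:5.1f} GeV, jet ET={jet_et:5.1f} GeV -> use {chain}, eff={w:.2f}")
    ```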

  3. Discrete element modeling of triggered slip in faults with granular gouge: application to dynamic earthquake triggering

    International Nuclear Information System (INIS)

    Ferdowsi, B.

    2014-01-01

    Recent seismological observations based on new, more sensitive instrumentation show that seismic waves radiated from large earthquakes can trigger other earthquakes globally. This phenomenon is called dynamic earthquake triggering and is well-documented for over 30 of the largest earthquakes worldwide. Granular materials are at the core of mature earthquake faults and play a key role in fault triggering by exhibiting a rich nonlinear response to external perturbations. The stick-slip dynamics in sheared granular layers is analogous to the seismic cycle for earthquake fault systems. In this research effort, we characterize the macroscopic scale statistics and the grain-scale mechanisms of triggered slip in sheared granular layers. We model the granular fault gouge using three dimensional discrete element method simulations. The modeled granular system is put into stick-slip dynamics by applying a confining pressure and a shear load. The dynamic triggering is simulated by perturbing the spontaneous stick-slip dynamics using an external vibration applied to the boundary of the layer. The influences of the triggering consist of a frictional weakening during the vibration interval, a clock advance of the next expected large slip event and long term effects in the form of suppression and recovery of the energy released from the granular layer. Our study suggests that above a critical amplitude, vibration causes a significant clock advance of large slip events. We link this clock advance to a major decline in the slipping contact ratio as well as a decrease in shear modulus and weakening of the granular gouge layer. We also observe that shear vibration is less effective in perturbing the stick-slip dynamics of the granular layer. Our study suggests that in order to have an effective triggering, the input vibration must also explore the granular layer at length scales about or less than the average grain size. The energy suppression and the subsequent recovery and increased

  4. Upgrade of the CMS Global Muon Trigger

    CERN Document Server

    Jeitler, Manfred; Rabady, Dinyar; Sakulin, Hannes; Stahl, Achim

    2015-01-01

    The increase in center-of-mass energy and luminosity for Run-II of the Large Hadron Collider poses new challenges for the trigger systems of the experiments. To keep triggering with a similar performance as in Run-I, the CMS muon trigger is currently being upgraded. The new algorithms will provide higher resolution, especially for the muon transverse momentum and will make use of isolation criteria that combine calorimeter with muon information already in the level-1 trigger. The demands of the new algorithms can only be met by upgrading the level-1 trigger system to new powerful FPGAs with high bandwidth I/O. The processing boards will be based on the new μTCA standard. We report on the planned algorithms for the upgraded Global Muon Trigger (μGMT) which sorts and removes duplicates from boundaries of the muon trigger sub-systems. Furthermore, it determines how isolated the muon candidates are based on calorimetric energy deposits. The μGMT will be implemented using a processing board that features a larg...

  5. Upgrade of the CMS Global Muon Trigger

    CERN Document Server

    Lingemann, Joschka; Sakulin, Hannes; Jeitler, Manfred; Stahl, Achim

    2015-01-01

    The increase in center-of-mass energy and luminosity for Run 2 of the Large Hadron Collider poses new challenges for the trigger systems of the experiments. To keep triggering with a similar performance as in Run 1, the CMS muon trigger is currently being upgraded. The new algorithms will provide higher resolution, especially for the muon transverse momentum and will make use of isolation criteria that combine calorimeter with muon information already in the level-1 trigger. The demands of the new algorithms can only be met by upgrading the level-1 trigger system to new powerful FPGAs with high bandwidth I/O. The processing boards will be based on the new microTCA standard. We report on the planned algorithms for the upgraded Global Muon Trigger (GMT) which combines information from the muon trigger sub-systems and assigns the isolation variable. The upgraded GMT will be implemented using a Master Processor 7 card, built by Imperial College, that features a large Xilinx Virtex 7 FPGA. Up to 72 optical links at...

  6. Assessing environmental impacts on stream water quality: the use of cumulative flux and cumulative flux difference approaches to deforestation of the Hafren Forest, mid-Wales

    Directory of Open Access Journals (Sweden)

    C. Neal

    2002-01-01

    Full Text Available A method for examining the impacts of disturbance on stream water quality based on paired catchment "control" and "response" water quality time series is described in relation to diagrams of cumulative flux and cumulative flux difference. The paper describes the equations used and illustrates the patterns expected for idealised flux changes followed by an application to stream water quality data for a spruce forested catchment, the Hore, subjected to clear fell. The water quality determinands examined are sodium, chloride, nitrate, calcium and acid neutralisation capacity. The anticipated effects of felling are shown in relation to reduction in mist capture and nitrate release with felling as well as to the influence of weathering and cation exchange mechanisms, but in a much clearer way than observed previously using other approaches. Keywords: Plynlimon, stream, Hore, acid neutralisation capacity, calcium, chloride, nitrate, sodium, cumulative flux, flux
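
    A minimal sketch of the cumulative flux and cumulative flux difference calculation for a paired control/response record is given below. It assumes time-aligned series of concentration and flow; the variable names and toy numbers are illustrative and are not taken from the Plynlimon data.

        # Sketch: cumulative flux and cumulative flux difference for paired catchments.
        # flux_i = concentration_i * flow_i; the two catchments' series are time-aligned.
        import numpy as np

        def cumulative_flux(conc, flow):
            return np.cumsum(np.asarray(conc) * np.asarray(flow))

        def cumulative_flux_difference(conc_resp, flow_resp, conc_ctrl, flow_ctrl):
            """Departure of the response (e.g. felled) catchment from the control."""
            return cumulative_flux(conc_resp, flow_resp) - cumulative_flux(conc_ctrl, flow_ctrl)

        # Toy example: catchments behave identically until a disturbance doubles export.
        flow = np.ones(100)
        conc_ctrl = np.full(100, 1.0)
        conc_resp = np.concatenate([np.full(50, 1.0), np.full(50, 2.0)])
        print(cumulative_flux_difference(conc_resp, flow, conc_ctrl, flow)[-1])   # 50.0

    Broadly, a flat cumulative flux difference indicates no disturbance effect, while a break in slope marks the onset and size of the change, which is the diagnostic pattern such diagrams are designed to expose.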

  7. The role of factorial cumulants in reactor neutron noise theory

    International Nuclear Information System (INIS)

    Colombino, A.; Pacilio, N.; Sena, G.

    1979-01-01

    The physical meaning and the combinatorial implications of the factorial cumulant of a state variable such as the number of neutrons or the number of neutron counts are specified. Features of the presentation are: (a) the fission process is treated in its entirety without the customary binary emission restriction, (b) the introduction of the factorial cumulants helps in reducing the complexity of the mathematical problems, (c) all the solutions can be obtained analytically. Only the ergodic hypothesis for the neutron population evolution is dealt with. (author)
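
    For readers unfamiliar with the quantity, the factorial moments and factorial cumulants of a discrete random variable N (such as a neutron count) can be defined through the probability generating function; this is the standard textbook definition, not a formula quoted from the paper:

        G(z) = \langle z^{N} \rangle = \sum_{n \ge 0} P(N = n)\, z^{n},
        \qquad
        F_{k} = \left. \frac{d^{k} G(z)}{dz^{k}} \right|_{z=1}
              = \langle N (N-1) \cdots (N-k+1) \rangle,
        \qquad
        \kappa_{[k]} = \left. \frac{d^{k} \ln G(z)}{dz^{k}} \right|_{z=1}.

    In particular κ_[1] = ⟨N⟩ and κ_[2] = F_2 − F_1^2; for a Poisson distribution every factorial cumulant of order k ≥ 2 vanishes, which is what makes these quantities convenient for isolating the correlated, fission-chain part of the counting statistics.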

  8. Upgrade trigger: Biannual performance update

    CERN Document Server

    Aaij, Roel; Couturier, Ben; Esen, Sevda; De Cian, Michel; De Vries, Jacco Andreas; Dziurda, Agnieszka; Fitzpatrick, Conor; Fontana, Marianna; Grillo, Lucia; Hasse, Christoph; Jones, Christopher Rob; Le Gac, Renaud; Matev, Rosen; Neufeld, Niko; Nikodem, Thomas; Polci, Francesco; Del Buono, Luigi; Quagliani, Renato; Schwemmer, Rainer; Seyfert, Paul; Stahl, Sascha; Szumlak, Tomasz; Vesterinen, Mika Anton; Wanczyk, Joanna; Williams, Mark Richard James; Yin, Hang; Zacharjasz, Emilia Anna

    2017-01-01

    This document presents the performance of the LHCb Upgrade trigger reconstruction sequence, incorporating changes to the underlying reconstruction algorithms and detector description since the Trigger and Online Upgrade TDR. An updated extrapolation is presented using the most recent example of an Event Filter Farm node.

  9. Acute and Cumulative Effects of Unmodified 50-nm Nano-ZnO on Mice.

    Science.gov (United States)

    Kong, Tao; Zhang, Shu-Hui; Zhang, Ji-Liang; Hao, Xue-Qin; Yang, Fan; Zhang, Cai; Yang, Zi-Jun; Zhang, Meng-Yu; Wang, Jie

    2018-01-02

    Nanometer zinc oxide (nano-ZnO) is widely used in diverse industrial and agricultural fields. Due to the extensive contact humans have with these particles, it is crucial to understand the potential effects that nano-ZnO has on human health. Currently, information related to the toxicity and mechanisms of nano-ZnO is limited. The aim of the present study was to investigate acute and cumulative toxic effects of 50-nm unmodified ZnO in mice. This investigation sought to establish the median lethal dose (LD50), a cumulative coefficient, and target organs. The acute and cumulative toxicity were investigated by Karber's method and a dose-increasing method, respectively. During the experiment, clinical signs, mortality, body weights, hematology, serum biochemistry, gross pathology, organ weight, and histopathology were examined. The LD50 was 5177 mg/kg·bw; the 95% confidence limits for the LD50 were 5116-5238 mg/kg·bw. It could be concluded that the liver, kidney, lung, and gastrointestinal tract were target organs for the 50-nm nano-ZnO acute oral treatment. The cumulative coefficient (K) was 1.9, which indicated that the cumulative toxicity was apparent. The results also indicated that the liver, kidney, lung, and pancreas were target organs for 50-nm nano-ZnO cumulative oral exposure and might be target organs for subchronic and chronic toxicity of orally administered 50-nm ZnO.
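
    The arithmetic (Karber-style) LD50 estimate mentioned above can be sketched as follows; the dose groups, mortalities and group size are invented for the illustration and are not the study's data, and the formula shown is the commonly cited Karber expression rather than a transcription of the paper's calculation.

        # Karber-style arithmetic LD50 estimate (illustrative numbers, not the study's data).
        # LD50 = LD100 - sum(dose_interval_i * mean_deaths_i) / animals_per_group
        doses  = [3000, 4000, 5000, 6000, 7000]   # mg/kg bw, hypothetical dose groups
        deaths = [0, 2, 5, 8, 10]                 # deaths out of n animals per group
        n = 10                                    # animals per group

        correction = sum((d2 - d1) * (k1 + k2) / 2.0
                         for d1, d2, k1, k2 in zip(doses, doses[1:], deaths, deaths[1:]))
        ld50 = doses[-1] - correction / n         # doses[-1] is the dose killing all animals
        print(round(ld50))                        # 5000 mg/kg bw in this toy example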

  10. Retooling CalEnviroScreen: Cumulative Pollution Burden and Race-Based Environmental Health Vulnerabilities in California

    Science.gov (United States)

    2018-01-01

    The California Community Environmental Health Screening Tool (CalEnviroScreen) advances research and policy pertaining to environmental health vulnerability. However, CalEnviroScreen departs from its historical foundations and comparable screening tools by no longer considering racial status as an indicator of environmental health vulnerability and predictor of cumulative pollution burden. This study used conceptual frameworks and analytical techniques from environmental health and inequality literature to address the limitations of CalEnviroScreen, especially its inattention to race-based environmental health vulnerabilities. It developed an adjusted measure of cumulative pollution burden from the CalEnviroScreen 2.0 data that facilitates multivariate analyses of the effect of neighborhood racial composition on cumulative pollution burden, net of other indicators of population vulnerability, traffic density, industrial zoning, and local and regional clustering of pollution burden. Principal component analyses produced three new measures of population vulnerability, including Latina/o cumulative disadvantage that represents the spatial concentration of Latinas/os, economic disadvantage, limited English-speaking ability, and health vulnerability. Spatial error regression analyses demonstrated that concentrations of Latinas/os, followed by Latina/o cumulative disadvantage, are the strongest demographic determinants of adjusted cumulative pollution burden. Findings have implications for research and policy pertaining to cumulative impacts and race-based environmental health vulnerabilities within and beyond California. PMID:29659481

  11. Retooling CalEnviroScreen: Cumulative Pollution Burden and Race-Based Environmental Health Vulnerabilities in California

    Directory of Open Access Journals (Sweden)

    Raoul S. Liévanos

    2018-04-01

    Full Text Available The California Community Environmental Health Screening Tool (CalEnviroScreen) advances research and policy pertaining to environmental health vulnerability. However, CalEnviroScreen departs from its historical foundations and comparable screening tools by no longer considering racial status as an indicator of environmental health vulnerability and predictor of cumulative pollution burden. This study used conceptual frameworks and analytical techniques from environmental health and inequality literature to address the limitations of CalEnviroScreen, especially its inattention to race-based environmental health vulnerabilities. It developed an adjusted measure of cumulative pollution burden from the CalEnviroScreen 2.0 data that facilitates multivariate analyses of the effect of neighborhood racial composition on cumulative pollution burden, net of other indicators of population vulnerability, traffic density, industrial zoning, and local and regional clustering of pollution burden. Principal component analyses produced three new measures of population vulnerability, including Latina/o cumulative disadvantage that represents the spatial concentration of Latinas/os, economic disadvantage, limited English-speaking ability, and health vulnerability. Spatial error regression analyses demonstrated that concentrations of Latinas/os, followed by Latina/o cumulative disadvantage, are the strongest demographic determinants of adjusted cumulative pollution burden. Findings have implications for research and policy pertaining to cumulative impacts and race-based environmental health vulnerabilities within and beyond California.

  12. Retooling CalEnviroScreen: Cumulative Pollution Burden and Race-Based Environmental Health Vulnerabilities in California.

    Science.gov (United States)

    Liévanos, Raoul S

    2018-04-16

    The California Community Environmental Health Screening Tool (CalEnviroScreen) advances research and policy pertaining to environmental health vulnerability. However, CalEnviroScreen departs from its historical foundations and comparable screening tools by no longer considering racial status as an indicator of environmental health vulnerability and predictor of cumulative pollution burden. This study used conceptual frameworks and analytical techniques from environmental health and inequality literature to address the limitations of CalEnviroScreen, especially its inattention to race-based environmental health vulnerabilities. It developed an adjusted measure of cumulative pollution burden from the CalEnviroScreen 2.0 data that facilitates multivariate analyses of the effect of neighborhood racial composition on cumulative pollution burden, net of other indicators of population vulnerability, traffic density, industrial zoning, and local and regional clustering of pollution burden. Principal component analyses produced three new measures of population vulnerability, including Latina/o cumulative disadvantage that represents the spatial concentration of Latinas/os, economic disadvantage, limited English-speaking ability, and health vulnerability. Spatial error regression analyses demonstrated that concentrations of Latinas/os, followed by Latina/o cumulative disadvantage, are the strongest demographic determinants of adjusted cumulative pollution burden. Findings have implications for research and policy pertaining to cumulative impacts and race-based environmental health vulnerabilities within and beyond California.
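
    A minimal sketch of how a composite population-vulnerability index of the kind described above can be built with principal component analysis is shown below; the indicator columns and values are placeholders and do not correspond to the CalEnviroScreen 2.0 variables or to the paper's spatial error regressions.

        # Sketch: first principal component as a composite vulnerability index.
        # Column names and values are placeholders, not CalEnviroScreen 2.0 data.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # rows = census tracts; columns = hypothetical vulnerability indicators
        X = np.array([[0.45, 0.30, 0.12, 0.20],
                      [0.80, 0.55, 0.35, 0.50],
                      [0.10, 0.05, 0.02, 0.08],
                      [0.60, 0.40, 0.25, 0.33]])

        Xz = StandardScaler().fit_transform(X)      # standardise the indicators
        pca = PCA(n_components=1)
        index = pca.fit_transform(Xz).ravel()       # tract-level composite index
        print(pca.explained_variance_ratio_, index)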

  13. Triggered Release from Polymer Capsules

    Energy Technology Data Exchange (ETDEWEB)

    Esser-Kahn, Aaron P. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry; Odom, Susan A. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry; Sottos, Nancy R. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Materials Science and Engineering; White, Scott R. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Aerospace Engineering; Moore, Jeffrey S. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry

    2011-07-06

    Stimuli-responsive capsules are of interest in drug delivery, fragrance release, food preservation, and self-healing materials. Many methods are used to trigger the release of encapsulated contents. Here we highlight mechanisms for the controlled release of encapsulated cargo that utilize chemical reactions occurring in solid polymeric shell walls. Triggering mechanisms responsible for covalent bond cleavage that result in the release of capsule contents include chemical, biological, light, thermal, magnetic, and electrical stimuli. We present methods for encapsulation and release, triggering methods, and mechanisms and conclude with our opinions on interesting obstacles for chemically induced activation with relevance for controlled release.

  14. The ATLAS Trigger System: Ready for Run-2

    CERN Document Server

    Maeda, Junpei; The ATLAS collaboration

    2015-01-01

    The ATLAS trigger has been successfully collecting collision data during the first run of the LHC between 2009-2013 at a centre-of-mass energy between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 and a software based high-level trigger that reduces the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. During the data-taking period of Run-2 the LHC will operate at a centre-of-mass energy of about 13 TeV resulting in roughly five times higher trigger rates. In these proceedings, we briefly review the ATLAS trigger system upgrades that were implemented during the shutdown, allowing us to cope with the increased trigger rates while maintaining or even improving our efficiency to select relevant physics processes. This includes changes to the Level-1 calorimeter and muon trigger system, the introduction of a new Level-1 topological trigger module and the merging of the previously two-level higher-level trigger system into a single even...

  15. Dynamic Aftershock Triggering Correlated with Cyclic Loading in the Slip Direction

    Science.gov (United States)

    Hardebeck, J.

    2014-12-01

    Dynamic stress changes have been shown to contribute to aftershock triggering, but the physical triggering mechanisms are not fully understood. Some proposed mechanisms are based on dynamic stress loading of the target fault in a direction that encourages earthquake slip (e.g. dynamic Coulomb stress triggering), while other mechanisms are based on fault weakening due to shaking. If dynamic stress loading in the fault slip direction plays a role in aftershock triggering, we would expect to see a relationship between the dynamic stress orientations and the aftershock focal mechanisms. Alternatively, if dynamic stress change triggering functions only through a fault weakening mechanism that is independent of the slip direction of the target fault, no such relationship is expected. I study aftershock sequences of 4 M≥6.7 mainshocks in southern California, and find a small but significant relationship between modeled dynamic stress direction and aftershock focal mechanisms. The mainshock dynamic stress changes have two observed impacts: changing the focal mechanisms in a given location to favor those aligned with the dynamic stress change, and changing the spatial distribution of seismicity to favor locations where the dynamic stress change aligns with the background stress. The aftershock focal mechanisms are significantly more aligned with the dynamic stress changes than the preshock mechanisms for only the first 0.5-1 year following most mainshocks, although for at least 10 years following Hector Mine. Dynamic stress effects on focal mechanisms are best observed at long periods (30-60 sec). Dynamic stress effects are only observed when using metrics based on repeated stress cycling in the same direction, for example considering the dominant stress orientation over the full time series, and not for the peak dynamic stress. These results imply that dynamic aftershock triggering operates at least in part through cyclic loading in the direction of fault slip, although

  16. Experience of cumulative effects assessment in the UK

    Directory of Open Access Journals (Sweden)

    Piper Jake

    2004-01-01

    Full Text Available Cumulative effects assessment (CEA) is a development of environmental impact assessment which attempts to take into account the wider picture of what impacts may affect the environment as a result of either multiple or linear projects, or development plans. CEA is seen as a further valuable tool in promoting sustainable development. The broader canvas upon which the assessment is made leads to a suite of issues such as complexity in methods and assessment of significance, the desirability of co-operation between developers and other parties, new ways of addressing mitigation and monitoring. After outlining the legislative position and the process of CEA, this paper looks at three case studies in the UK where cumulative assessment has been carried out - the cases concern wind farms, major infrastructure and off-shore developments.

  17. Clock and trigger distribution for CBM-TOF quality evaluation of RPC super module detector assemblies

    Science.gov (United States)

    Li, C.; Huang, X.; Cao, P.; Wang, J.; An, Q.

    2018-03-01

    RPC Super module (SM) detector assemblies are used for charged hadron identification in the Time-of-Flight (TOF) spectrometer at the Compressed Baryonic Matter (CBM) experiment. Each SM contains several multi-gap Resistive Plate Chambers (MRPCs) and provides up to 320 electronic channels in total for high-precision time measurements. Time resolution of the Time-to-Digital Converter (TDC) is required to be better than 20 ps. During mass production, the quality of each SM needs to be evaluated. In order to meet the requirements, the system clock signal as well as the trigger signal should be distributed precisely and synchronously to all electronics modules within the evaluation readout system. In this paper, a hierarchical clock and trigger distribution method is proposed for the quality evaluation of CBM-TOF SM detectors. In a first stage, the master clock and trigger module (CTM) allocated in a 6U PXI chassis distributes the clock and trigger signals to the slave CTM in the same chassis. In a second stage, the slave CTM transmits the clock and trigger signals to the TDC readout module (TRM) through one optical link. In a third stage, the TRM distributes the clock and trigger signals synchronously to 10 individual TDC boards. Laboratory test results show that the clock jitter at the third stage is less than 4 ps (RMS) and the trigger transmission latency from the master CTM to the TDC is about 272 ns with 11 ps (RMS) jitter. The overall performance complies well with the required specifications.

  18. First level trigger of the DIRAC experiment

    International Nuclear Information System (INIS)

    Afanas'ev, L.G.; Karpukhin, V.V.; Kulikov, A.V.; Gallas, M.

    2001-01-01

    The logic of the first level trigger of the DIRAC experiment at CERN is described. A parallel running of different trigger modes with tagging of events and optional independent prescaling is realized. A CAMAC-based trigger system is completely computer controlled

  19. A Review of Non-Chemical Stressors and Their Importance in Cumulative Risk Assessment

    Science.gov (United States)

    Cumulative exposure/risk assessments need to include non-chemical stressors as well as human activities and chemical data. Multiple stressor research can offer information on the interactions between chemical and non-chemical stressors needed for cumulative risk assessment resea...

  20. The ATLAS Level-1 Central Trigger Processor (CTP)

    CERN Document Server

    Spiwoks, Ralf; Ellis, Nick; Farthouat, P; Gällnö, P; Haller, J; Krasznahorkay, A; Maeno, T; Pauly, T; Pessoa-Lima, H; Resurreccion-Arcas, I; Schuler, G; De Seixas, J M; Torga-Teixeira, R; Wengler, T

    2005-01-01

    The ATLAS Level-1 Central Trigger Processor (CTP) combines information from calorimeter and muon trigger processors and makes the final Level-1 Accept (L1A) decision on the basis of lists of selection criteria (trigger menus). In addition to the event-selection decision, the CTP also provides trigger summary information to the Level-2 trigger and the data acquisition system. It further provides accumulated and bunch-by-bunch scaler data for monitoring of the trigger, detector and beam conditions. The CTP is presented and results are shown from tests with the calorimeter and muon trigger processors connected to detectors in a particle beam, as well as from stand-alone full-system tests in the laboratory which were used to validate the CTP.

  1. Simultaneous Event-Triggered Fault Detection and Estimation for Stochastic Systems Subject to Deception Attacks.

    Science.gov (United States)

    Li, Yunji; Wu, QingE; Peng, Li

    2018-01-23

    In this paper, a synthesized design of fault-detection filter and fault estimator is considered for a class of discrete-time stochastic systems in the framework of an event-triggered transmission scheme subject to unknown disturbances and deception attacks. A random variable obeying the Bernoulli distribution is employed to characterize the phenomena of the randomly occurring deception attacks. To make the fault-detection residual sensitive only to faults while remaining robust to disturbances, a coordinate transformation approach is exploited. This approach can transform the considered system into two subsystems, and the unknown disturbances are removed from one of the subsystems. The gain of the fault-detection filter is derived by minimizing an upper bound of the filter error covariance. Meanwhile, system faults can be reconstructed by the remote fault estimator. A recursive approach is developed to obtain the fault estimator gains as well as guarantee the fault estimator performance. Furthermore, the corresponding event-triggered sensor data transmission scheme is also presented for improving the working life of the wireless sensor node when measurement information is transmitted aperiodically. Finally, a scaled version of an industrial system consisting of a local PC, remote estimator and wireless sensor node is used to experimentally evaluate the proposed theoretical results. In particular, a novel fault-alarming strategy is proposed so that the real-time capacity of fault-detection is guaranteed when the event condition is triggered.
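
    The event-triggered transmission idea can be sketched as below. The relative-threshold rule and the parameter sigma are generic choices from the event-triggered literature and are not necessarily the exact condition used in the paper.

        # Sketch of an event-triggered sensor transmission rule (generic, not the
        # paper's exact condition): send a new measurement only when it deviates
        # enough from the last transmitted one.
        import numpy as np

        def make_event_trigger(sigma=0.1):
            last_sent = {"y": None}
            def should_send(y):
                y = np.asarray(y, dtype=float)
                if last_sent["y"] is None or \
                   np.linalg.norm(y - last_sent["y"]) ** 2 > sigma * np.linalg.norm(y) ** 2:
                    last_sent["y"] = y
                    return True          # transmit: the remote filter is updated with y
                return False             # hold: save wireless-node energy
            return should_send

        trigger = make_event_trigger(sigma=0.05)
        measurements = [[1.0, 0.0], [1.01, 0.02], [1.4, -0.3], [1.41, -0.29]]
        print([trigger(y) for y in measurements])    # [True, False, True, False]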

  2. CMS Triggers for the LHC Startup

    CERN Document Server

    Nhan Nguyen, Chi

    2009-01-01

    The LHC will collide proton beams at a bunch-crossing rate of 40 MHz. At the design luminosity of $10^{34}$ cm$^{-2}$s$^{-1}$ each crossing results in an average of about 20 inelastic pp events. The CMS trigger system is designed to reduce the input rate to about 100 Hz. This task is carried out in two steps, namely the Level-1 (L1) and the High-Level trigger (HLT). The L1 trigger is built of customized fast electronics and is designed to reduce the rate to 100 kHz. The HLT is implemented in a filter farm running on hundreds of CPUs and is designed to reduce the rate by another factor of ~1000. It combines the traditional L2 and L3 trigger components in a novel way and allows the coherent tuning of the HLT algorithms to accommodate multiple physics channels. We will discuss the strategies for optimizing triggers covering the experiment's early physics program.

  3. The ATLAS Level-1 Calorimeter Trigger

    International Nuclear Information System (INIS)

    Achenbach, R; Andrei, V; Adragna, P; Apostologlou, P; Barnett, B M; Brawn, I P; Davis, A O; Edwards, J P; Asman, B; Bohm, C; Ay, C; Bauss, B; Bendel, M; Dahlhoff, A; Eckweiler, S; Booth, J R A; Thomas, P Bright; Charlton, D G; Collins, N J; Curtis, C J

    2008-01-01

    The ATLAS Level-1 Calorimeter Trigger uses reduced-granularity information from all the ATLAS calorimeters to search for high transverse-energy electrons, photons, τ leptons and jets, as well as high missing and total transverse energy. The calorimeter trigger electronics has a fixed latency of about 1 μs, using programmable custom-built digital electronics. This paper describes the Calorimeter Trigger hardware, as installed in the ATLAS electronics cavern

  4. The ATLAS Level-1 Calorimeter Trigger

    Energy Technology Data Exchange (ETDEWEB)

    Achenbach, R; Andrei, V [Kirchhoff-Institut fuer Physik, University of Heidelberg, D-69120 Heidelberg (Germany); Adragna, P [Physics Department, Queen Mary, University of London, London E1 4NS (United Kingdom); Apostologlou, P; Barnett, B M; Brawn, I P; Davis, A O; Edwards, J P [STFC Rutherford Appleton Laboratory, Harwell Science and Innovation Campus, Didcot, Oxon OX11 0QX (United Kingdom); Asman, B; Bohm, C [Fysikum, Stockholm University, SE-106 91 Stockholm (Sweden); Ay, C; Bauss, B; Bendel, M; Dahlhoff, A; Eckweiler, S [Institut fuer Physik, University of Mainz, D-55099 Mainz (Germany); Booth, J R A; Thomas, P Bright; Charlton, D G; Collins, N J; Curtis, C J [School of Physics and Astronomy, University of Birmingham, Birmingham B15 2TT (United Kingdom)], E-mail: e.eisenhandler@qmul.ac.uk (and others)

    2008-03-15

    The ATLAS Level-1 Calorimeter Trigger uses reduced-granularity information from all the ATLAS calorimeters to search for high transverse-energy electrons, photons, τ leptons and jets, as well as high missing and total transverse energy. The calorimeter trigger electronics has a fixed latency of about 1 μs, using programmable custom-built digital electronics. This paper describes the Calorimeter Trigger hardware, as installed in the ATLAS electronics cavern.

  5. Software trigger for the TOPAZ detector at TRISTAN

    International Nuclear Information System (INIS)

    Tsukamoto, T.; Yamauchi, M.; Enomoto, R.

    1990-01-01

    A new software trigger system was developed and installed in the trigger system of the TOPAZ detector at the TRISTAN e+e− collider to take data efficiently in the scheduled high-luminosity experiment. This software trigger requires two or more charged tracks originating at the interaction point, identified by examining the timing of signals from the time projection chamber. To execute the vertex finding very quickly, four microprocessors are used in parallel. This new trigger reduced the track trigger rate to 30-40% with very small inefficiency. The additional dead time introduced by this trigger is negligible. (orig.)

  6. The Trigger for Early Running

    CERN Document Server

    The ATLAS Collaboration

    2009-01-01

    The ATLAS trigger and data acquisition system is based on three levels of event selection designed to capture the physics of interest with high efficiency from an initial bunch crossing rate of 40 MHz. The selections in the three trigger levels must provide sufficient rejection to reduce the rate to 200 Hz, compatible with offline computing power and storage capacity. The LHC is expected to begin its operation with a peak luminosity of 10^31 with a relatively small number of bunches, but quickly ramp up to higher luminosities by increasing the number of bunches, and thus the overall interaction rate. Decisions must be taken every 25 ns during normal LHC operations at the design luminosity of 10^34, where the average bunch crossing will contain more than 20 interactions. Hence, trigger selections must be deployed that can adapt to the changing beam conditions while preserving the interesting physics and satisfying varying detector requirements. In this paper, we provide a menu of trigger selections that can be...

  7. BTeV detached vertex trigger

    International Nuclear Information System (INIS)

    Gottschalk, E.E.

    2001-01-01

    BTeV is a collider experiment that has been approved to run in the Tevatron at Fermilab. The experiment will conduct precision studies of CP violation using a forward-geometry detector. The detector will be optimized for high-rate detection of beauty and charm particles produced in collisions between protons and anti-protons. BTeV will trigger on beauty and charm events by taking advantage of the main difference between these heavy quark events and more typical hadronic events - the presence of detached beauty and charm decay vertices. The first stage of the BTeV trigger will receive data from a pixel vertex detector at a rate of 100 GB/s, reconstruct tracks and vertices for every beam crossing, reject 99% of beam crossings that do not produce beauty or charm particles, and trigger on beauty events with high efficiency. An overview of the trigger design and its influence on the design of the pixel vertex detector is presented

  8. Rate Predictions and Trigger/DAQ Resource Monitoring in ATLAS

    CERN Document Server

    Schaefer, D M; The ATLAS collaboration

    2012-01-01

    Since starting in 2010, the Large Hadron Collider (LHC) has produced collisions at an ever-increasing rate. The ATLAS experiment successfully records the collision data with high efficiency and excellent data quality. Events are selected using a three-level trigger system, where each level makes a more refined selection. The level-1 trigger (L1) consists of a custom-designed hardware trigger which seeds two higher software-based trigger levels. Over 300 triggers compose a trigger menu which selects physics signatures such as electrons, muons, particle jets, etc. Each trigger consumes computing resources of the ATLAS trigger system and offline storage. The LHC instantaneous luminosity conditions, desired physics goals of the collaboration, and the limits of the trigger infrastructure determine the composition of the ATLAS trigger menu. We describe a trigger monitoring framework for computing the costs of individual trigger algorithms such as data request rates and CPU consumption. This framework has been used...
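
    A toy sketch of the kind of bookkeeping such a monitoring framework performs, turning per-algorithm call counts and CPU time into rates and cost estimates, is given below. The chain names, numbers and the cost definition are illustrative assumptions and do not reproduce the ATLAS framework.

        # Toy aggregation of trigger-algorithm costs (illustrative, not the ATLAS code).
        # cost ~ rate at which the algorithm runs * mean CPU time per call
        samples = [  # (algorithm, calls, total_cpu_ms, data_requests) over a fixed interval
            ("L2_e24_medium", 120000, 3.6e6, 240000),
            ("EF_mu18",        80000, 6.4e6, 160000),
            ("L2_j30",        200000, 2.0e6, 100000),
        ]
        interval_seconds = 60.0

        for name, calls, cpu_ms, reqs in samples:
            rate = calls / interval_seconds                  # Hz at which the algorithm ran
            mean_cpu = cpu_ms / calls                        # ms per call
            print(f"{name:15s} rate={rate:8.1f} Hz  cpu/call={mean_cpu:5.1f} ms  "
                  f"cpu share={rate * mean_cpu / 1000.0:6.1f} core-s/s  "
                  f"req rate={reqs / interval_seconds:8.1f} Hz")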

  9. The EPA's human exposure research program for assessing cumulative risk in communities.

    Science.gov (United States)

    Zartarian, Valerie G; Schultz, Bradley D

    2010-06-01

    Communities are faced with challenges in identifying and prioritizing environmental issues, taking actions to reduce their exposures, and determining their effectiveness for reducing human health risks. Additional challenges include determining what scientific tools are available and most relevant, and understanding how to use those tools; given these barriers, community groups tend to rely more on risk perception than science. The U.S. Environmental Protection Agency's Office of Research and Development, National Exposure Research Laboratory (NERL) and collaborators are developing and applying tools (models, data, methods) for enhancing cumulative risk assessments. The NERL's "Cumulative Communities Research Program" focuses on key science questions: (1) How to systematically identify and prioritize key chemical stressors within a given community?; (2) How to develop estimates of exposure to multiple stressors for individuals in epidemiologic studies?; and (3) What tools can be used to assess community-level distributions of exposures for the development and evaluation of the effectiveness of risk reduction strategies? This paper provides community partners and scientific researchers with an understanding of the NERL research program and other efforts to address cumulative community risks; and key research needs and opportunities. Some initial findings include the following: (1) Many useful tools exist for components of risk assessment, but need to be developed collaboratively with end users and made more comprehensive and user-friendly for practical application; (2) Tools for quantifying cumulative risks and impact of community risk reduction activities are also needed; (3) More data are needed to assess community- and individual-level exposures, and to link exposure-related information with health effects; and (4) Additional research is needed to incorporate risk-modifying factors ("non-chemical stressors") into cumulative risk assessments. The products of this

  10. Susceptibility and triggering scenarios at a regional scale for shallow landslides

    Science.gov (United States)

    Gullà, G.; Antronico, L.; Iaquinta, P.; Terranova, O.

    2008-07-01

    The work aims at identifying susceptible areas and pluviometric triggering scenarios at a regional scale in Calabria (Italy), with reference to shallow landsliding events. The proposed methodology follows a statistical approach and uses a database linked to a GIS that has been created to support the various steps of spatial data management and manipulation. The shallow landslide predisposing factors taken into account are derived from (i) the 40-m digital terrain model of the region, which covers an area of ~15,075 km²; (ii) outcropping lithology; (iii) soils; and (iv) land use. More precisely, a map of the slopes has been drawn from the digital terrain model. Two kinds of covers [prevalently coarse-grained (CG cover) or fine-grained (FG cover)] were identified, referring to the geotechnical characteristics of geomaterial covers and to the lithology map; soilscapes were drawn from soil maps; and finally, the land use map was employed without any prior processing. Subsequently, the inventory maps of some shallow landsliding events, totaling more than 30,000 past instabilities detected by field surveys and aerial photo restitution, were employed to calibrate the relative importance of these predisposing factors. The use of single factors (first-level analysis) therefore provides three different susceptibility maps. Second-level analysis, however, enables better location of areas susceptible to shallow landsliding events by crossing the single susceptibility maps. On the basis of the susceptibility map obtained by the second-level analysis, five different classes of susceptibility to shallow landsliding events have been outlined over the regional territory: 8.9% of the regional territory shows very high susceptibility, 14.3% high susceptibility, 15% moderate susceptibility, 3.6% low susceptibility, and finally, about 58% very low susceptibility. Finally, the maps of two significant shallow landsliding events of the past and their related rainfalls have been
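
    A schematic sketch of the second-level "crossing" of single-factor susceptibility maps into combined classes is given below. It assumes each factor map has already been reclassified to integer susceptibility scores; the scores, the additive combination and the class breaks are placeholders, not the calibration of the study.

        # Sketch: crossing single-factor susceptibility rasters into combined classes.
        # Scores and class breaks are placeholders, not the calibrated values of the study.
        import numpy as np

        # Hypothetical 1-5 scores for slope, cover type and land use on a small grid
        slope_score = np.array([[1, 3, 5], [2, 4, 5], [1, 2, 3]])
        cover_score = np.array([[2, 3, 4], [1, 5, 5], [1, 1, 2]])
        use_score   = np.array([[1, 2, 5], [1, 4, 4], [2, 1, 1]])

        combined = slope_score + cover_score + use_score          # simple additive crossing
        classes  = np.digitize(combined, bins=[5, 8, 11, 14])     # 0..4 -> very low..very high
        labels   = np.array(["very low", "low", "moderate", "high", "very high"])
        print(labels[classes])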

  11. Convergent and Divergent Signaling in PAMP-Triggered Immunity and Effector-Triggered Immunity.

    Science.gov (United States)

    Peng, Yujun; van Wersch, Rowan; Zhang, Yuelin

    2018-04-01

    Plants use diverse immune receptors to sense pathogen attacks. Recognition of pathogen-associated molecular patterns (PAMPs) by pattern recognition receptors localized on the plasma membrane leads to PAMP-triggered immunity (PTI). Detection of pathogen effectors by intracellular or plasma membrane-localized immune receptors results in effector-triggered immunity (ETI). Despite the large variations in the magnitude and duration of immune responses triggered by different PAMPs or pathogen effectors during PTI and ETI, plasma membrane-localized immune receptors activate similar downstream molecular events such as mitogen-activated protein kinase activation, oxidative burst, ion influx, and increased biosynthesis of plant defense hormones, indicating that defense signals initiated at the plasma membrane converge at later points. On the other hand, activation of ETI by immune receptors localized to the nucleus appears to be more directly associated with transcriptional regulation of defense gene expression. Here, we review recent progress in signal transductions downstream of different groups of plant immune receptors, highlighting the converging and diverging molecular events.

  12. The Trigger System of the CMS Experiment

    OpenAIRE

    Felcini, Marta

    2008-01-01

    We give an overview of the main features of the CMS trigger and data acquisition (DAQ) system. Then, we illustrate the strategies and trigger configurations (trigger tables) developed for the detector calibration and physics program of the CMS experiment, at start-up of LHC operations, as well as their possible evolution with increasing luminosity. Finally, we discuss the expected CPU time performance of the trigger algorithms and the CPU requirements for the event filter farm at start-up.

  13. Smart trigger logic for focal plane arrays

    Science.gov (United States)

    Levy, James E; Campbell, David V; Holmes, Michael L; Lovejoy, Robert; Wojciechowski, Kenneth; Kay, Randolph R; Cavanaugh, William S; Gurrieri, Thomas M

    2014-03-25

    An electronic device includes a memory configured to receive data representing light intensity values from pixels in a focal plane array and a processor that analyzes the received data to determine which light values correspond to triggered pixels, where the triggered pixels are those pixels that meet a predefined set of criteria, and determines, for each triggered pixel, a set of neighbor pixels for which light intensity values are to be stored. The electronic device also includes a buffer that temporarily stores light intensity values for at least one previously processed row of pixels, so that when a triggered pixel is identified in a current row, light intensity values for the neighbor pixels in the previously processed row and for the triggered pixel are persistently stored, as well as a data transmitter that transmits the persistently stored light intensity values for the triggered and neighbor pixels to a data receiver.
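
    A software sketch of the row-buffered trigger-and-neighbor logic described above is given below, with a simple intensity threshold standing in for the "predefined set of criteria". The neighborhood taken from the buffered row and the toy frame are illustrative assumptions, not the device's actual readout pattern.

        # Sketch of the triggered-pixel / neighbour-readout idea with a one-row buffer.
        # Threshold criterion and neighbourhood choice are illustrative assumptions.
        def smart_trigger(frame, threshold):
            """Persist triggered pixels plus their neighbours in the previously
            processed (buffered) row, mimicking the row-buffer scheme above."""
            stored = {}
            prev_row = None                                # buffer: last processed row
            for r, row in enumerate(frame):
                for c, value in enumerate(row):
                    if value >= threshold:                 # pixel meets the trigger criteria
                        stored[(r, c)] = value             # the triggered pixel itself
                        if prev_row is not None:
                            for dc in (-1, 0, 1):          # neighbours in the buffered row
                                cc = c + dc
                                if 0 <= cc < len(prev_row):
                                    stored[(r - 1, cc)] = prev_row[cc]
                prev_row = list(row)                       # update the one-row buffer
            return stored

        frame = [[3, 7, 2, 9], [1, 98, 4, 5], [6, 2, 97, 3]]
        hits = smart_trigger(frame, threshold=90)
        print(sorted(hits))   # triggered pixels (1,1) and (2,2) plus neighbours above them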

  14. Cumulative Author Index for Soviet Laser Bibliographies Nos. 67-93, September 1983-February 1989

    Science.gov (United States)

    1990-02-01

    Cumulative author index covering Soviet laser bibliographies Nos. 67-93, September 1983 - February 1989. A Defense S&T Intelligence Special Purpose Document, report number DST-2700Z-001-90.

  15. The Role of Cumulative Trauma, Betrayal, and Appraisals in Understanding Trauma Symptomatology.

    Science.gov (United States)

    Martin, Christina Gamache; Cromer, Lisa Demarni; Deprince, Anne P; Freyd, Jennifer J

    2013-03-01

    Poor psychological outcomes are common among trauma survivors, yet not all survivors experience adverse sequelae. The current study examined links between cumulative trauma exposure as a function of the level of betrayal (measured by the relational closeness of the survivor and the perpetrator), trauma appraisals, gender, and trauma symptoms. Participants were 273 college students who reported experiencing at least one traumatic event on a trauma checklist. Three cumulative indices were constructed to assess the number of different types of traumas experienced that were low (LBTs), moderate (MBTs), or high in betrayal (HBTs). Greater trauma exposure was related to more symptoms of depression, dissociation, and PTSD, with exposure to HBTs contributing the most. Women were more likely to experience HBTs than men, but there were no gender differences in trauma-related symptoms. Appraisals of trauma were predictive of trauma-related symptoms over and above the effects explained by cumulative trauma at each level of betrayal. The survivor's relationship with the perpetrator, the effect of cumulative trauma, and their combined impact on trauma symptomatology are discussed.

  16. UA1 upgrade first-level calorimeter trigger processor

    International Nuclear Information System (INIS)

    Bains, N.; Charlton, D.; Ellis, N.; Garvey, J.; Gregory, J.; Jimack, M.P.; Jovanovic, P.; Kenyon, I.R.; Baird, S.A.; Campbell, D.; Cawthraw, M.; Coughlan, J.; Flynn, P.; Galagedera, S.; Grayer, G.; Halsall, R.; Shah, T.P.; Stephens, R.; Eisenhandler, E.; Fensome, I.; Landon, M.

    1989-01-01

    A new first-level trigger processor has been built for the UA1 experiment on the CERN SppS Collider. The processor exploits the fine granularity of the new UA1 uranium-TMP calorimeter to improve the selectivity of the trigger. The new electron trigger has improved hadron jet rejection, achieved by requiring low energy deposition around the electromagnetic cluster. A missing transverse energy trigger and a total energy trigger have also been implemented. (orig.)

  17. The ATLAS Trigger: Recent Experience and Future Plans

    CERN Document Server

    The ATLAS collaboration

    2009-01-01

    This paper will give an overview of the ATLAS trigger design and its innovative features. It will describe the valuable experience gained in running the trigger reconstruction and event selection in the fast-changing environment of the detector commissioning during 2008. It will also include a description of the trigger selection menu and its 2009 deployment plan from first collisions to the nominal luminosity. ATLAS is one of the two general-purpose detectors at the Large Hadron Collider (LHC). The trigger system needs to efficiently reject a large rate of background events and still select potentially interesting ones with high efficiency. After a first level trigger implemented in custom electronics, the trigger event selection is made by the High Level Trigger (HLT) system, implemented in software. To reduce the processing time to manageable levels, the HLT uses seeded, step-wise and fast selection algorithms, aiming at the earliest possible rejection of background events. The ATLAS trigger event selection...

  18. 14 CFR Section 18 - Objective Classification-Cumulative Effect of Changes in Accounting Principles

    Science.gov (United States)

    2010-01-01

    Objective Classification—Cumulative Effect of Changes in Accounting Principles (14 CFR Section 18, Aeronautics and Space; account 98). Record here the difference between the amount of retained earnings at...

  19. System design and verification process for LHC programmable trigger electronics

    CERN Document Server

    Crosetto, D

    1999-01-01

    The rapid evolution of electronics has made it essential to design systems in a technology-independent form that will permit their realization in any future technology. This article describes two practical projects that have been developed for fast, programmable, scalable, modular electronics for the first-level trigger of Large Hadron Collider (LHC) experiments at CERN, Geneva. In both projects, one for the front-end electronics and the second for executing first-level trigger algorithms, the whole system requirements were constrained to two types of replicated components. The overall problem is described, the 3D-Flow design is introduced as a novel solution, and current solutions to the problem are described and compared with the 3D-Flow solution. The design/verification methodology proposed allows the user's real-time system algorithm to be verified down to the gate-level simulation on a technology-independent platform, thus yielding the design for a system that can be implemented with any technology at ...

  20. Pesticide Cumulative Risk Assessment: Framework for Screening Analysis

    Science.gov (United States)

    This document provides guidance on how to screen groups of pesticides for cumulative evaluation using a two-step approach: begin with evaluation of available toxicological information and, if necessary, follow up with a risk-based screening approach.

  1. Cumulative-Phase-Alteration of Galactic-Light Passing Through the Cosmic-Microwave-Background: A New Mechanism for Some Observed Spectral-Shifts

    Directory of Open Access Journals (Sweden)

    Tank H. K.

    2012-07-01

    Full Text Available Currently, the whole of the measured "cosmological red-shift" is interpreted as due to the "metric expansion of space"; so for the required "closure density" of the universe, we need twenty times more mass-energy than the visible baryonic matter contained in the universe. This paper proposes a new mechanism which can account for a good percentage of the red-shift in the extra-galactic light, greatly reducing the requirement of dark matter-energy. Also, this mechanism can cause a new kind of blue-shift reported here, and observational evidence for it is presented. These spectral shifts are proposed to result from cumulative phase-alteration of extra-galactic light because of vector-addition of: (i) the electric field of extra-galactic light and (ii) that of the cosmic microwave background (CMB). Since the center frequency of the CMB is much lower than that of extra-galactic light, the cumulative phase-alteration results in a red-shift, observed as an additional contributor to the measured "cosmological red-shift"; and since the center frequency of the CMB is higher than that of the radio-frequency signals used to measure the velocity of space-probes like Pioneer-10, Pioneer-11, Galileo and Ulysses, the cumulative phase-alteration results in a blue-shift, leading to the interpretation of deceleration of these space-probes. While the galactic light experiences the red-shift and the ranging signals of the space-probes experience a blue-shift, they are comparable in magnitude, providing supportive evidence for the new mechanism proposed here. More confirmative experiments for this new mechanism are also proposed.

  2. Adjacent Vehicle Number-Triggered Adaptive Transmission for V2V Communications.

    Science.gov (United States)

    Wei, Yiqiao; Chen, Jingjun; Hwang, Seung-Hoon

    2018-03-02

    For vehicle-to-vehicle (V2V) communication, such issues as continuity and reliability still have to be solved. Specifically, it is necessary to consider a more scalable physical layer due to the high-speed mobility of vehicles and the complex channel environment. Adaptive transmission has been adopted in channel-dependent scheduling. However, it has been neglected with regard to physical topology changes in the vehicle network. In this paper, we propose a physical topology-triggered adaptive transmission scheme which adjusts the data rate between vehicles according to the number of connectable vehicles nearby. Also, we investigate the performance of the proposed method using computer simulations and compare it with the conventional methods. The numerical results show that the proposed method can provide more continuous and reliable data transmission for V2V communications.
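
    A minimal sketch of a neighbor-count-triggered rate adaptation rule of this kind is shown below; the thresholds and data rates are arbitrary placeholders, not the values proposed in the paper.

        # Sketch: pick a V2V data rate from the number of connectable neighbouring
        # vehicles. Thresholds and rates are placeholders, not the paper's values.
        def select_data_rate(neighbour_count):
            table = [           # (max neighbours, data rate in Mbit/s)
                (5,   27.0),    # sparse topology: push the rate up
                (15,  12.0),
                (30,   6.0),
                (10**9, 3.0),   # dense topology: fall back to the most robust rate
            ]
            for max_neighbours, rate in table:
                if neighbour_count <= max_neighbours:
                    return rate

        for n in (2, 12, 40):
            print(n, "neighbours ->", select_data_rate(n), "Mbit/s")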

  3. Chimpanzees (Pan troglodytes) and the question of cumulative culture: an experimental approach.

    Science.gov (United States)

    Marshall-Pescini, Sarah; Whiten, Andrew

    2008-07-01

    There is increasing evidence for cultural variations in behaviour among non-human species, but human societies additionally display elaborate cumulative cultural evolution, with successive generations building on earlier achievements. Evidence for cumulative culture in non-human species remains minimal and controversial. Relevant experiments are also lacking. Here we present a first experiment designed to examine chimpanzees' capacity for cumulative social learning. Eleven young chimpanzees were presented with a foraging device, which afforded both a relatively simple and a more complex tool-use technique for extracting honey. The more complex 'probing' technique incorporated the core actions of the simpler 'dipping' one and was also much more productive. In a baseline, exploration condition only two subjects discovered the dipping technique and a solitary instance of probing occurred. Demonstrations of dipping by a familiar human were followed by acquisition of this technique by the five subjects aged three years or above, whilst younger subjects showed a significant increase only in the elements of the dipping technique. By contrast, subsequent demonstrations of the probing task were not followed by acquisition of this more productive technique. Subjects stuck to their habitual dipping method despite an escalating series of demonstrations eventually exceeding 200. Supplementary tests showed this technique is within the capability of chimpanzees of this age. We therefore tentatively conclude that young chimpanzees exhibit a tendency to become 'stuck' on a technique they initially learn, inhibiting cumulative social learning and possibly constraining the species' capacity for cumulative cultural evolution.

  4. Cumulative release to the accessible environment

    International Nuclear Information System (INIS)

    Kanehiro, B.

    1985-01-01

    The Containment and Isolation Working Group considered issues related to the postclosure behavior of repositories in crystalline rock. This working group was further divided into subgroups to consider the progress since the 1978 GAIN Symposium and identify research needs in the individual areas of regional ground-water flow, ground-water travel time, fractional release, and cumulative release. The analysis and findings of the Fractional Release Subgroup are presented

  5. Anti-irritants II: Efficacy against cumulative irritation

    DEFF Research Database (Denmark)

    Andersen, Flemming; Hedegaard, Kathryn; Petersen, Thomas Kongstad

    2006-01-01

    window of opportunity in which to demonstrate efficacy. Therefore, the effect of AI was studied in a cumulative irritation model by inducing irritant dermatitis with 10 min daily exposures for 5+4 days (no irritation on weekend) to 1% sodium lauryl sulfate on the right and 20% nonanoic acid on the left...

  6. The ATLAS Tau Trigger

    CERN Document Server

    Dam, M; The ATLAS collaboration

    2009-01-01

    The ATLAS experiment at CERN’s LHC has implemented a dedicated tau trigger system to select hadronically decaying tau leptons from the enormous background of QCD jets. This promises a significant increase in the discovery potential for the Higgs boson and in searches for physics beyond the Standard Model. The three-level trigger system has been optimised for efficiency and good background rejection. The first level uses information from the calorimeters only, while the two higher levels also include information from the tracking detectors. Shower shape variables and the track multiplicity are important variables to distinguish taus from QCD jets. At the initial luminosity of 10^31 cm^−2 s^−1, single tau triggers with a transverse energy threshold of 50 GeV or higher can be run standalone. Below this level, the tau signatures will be combined with other event signature

  7. The ATLAS Tau Trigger

    CERN Document Server

    Rados, PK; The ATLAS collaboration

    2014-01-01

    Physics processes involving tau leptons play a crucial role in understanding particle physics at the high energy frontier. The ability to efficiently trigger on events containing hadronic tau decays is therefore of particular importance to the ATLAS experiment. During the 2012 run, the Large Hadron Collider (LHC) reached instantaneous luminosities of nearly $10^{34} cm^{-2}s^{-1}$ with bunch crossings occurring every $50 ns$. This resulted in a huge event rate and a high probability of overlapping interactions per bunch crossing (pile-up). With this in mind it was necessary to design an ATLAS tau trigger system that could reduce the event rate to a manageable level, while efficiently extracting the most interesting physics events in a pile-up robust manner. In this poster the ATLAS tau trigger is described, its performance during 2012 is presented, and the outlook for the LHC Run II is briefly summarized.

  8. The Aurora accelerator's triggered oil switch

    International Nuclear Information System (INIS)

    Weidenheimer, D.M.; Pereira, N.R.; Judy, D.C.; Stricklett, K.L.

    1993-01-01

    Achieving a radiation pulse with 15 ns risetime using all four of the Aurora accelerator's Blumlein pulse-forming lines demands synchronization of the Blumleins to within 10 ns (in addition to a 15 ns risetime for a single line). Timing of each Blumlein is controlled by a triggered 12 MV oil switch. A smaller-than-customary trigger electrode makes the switching time more reproducible. Time-resolved photography of the oil arcs suggests that triggering occurs simultaneously around the sharp edge of the trigger electrode, perhaps with small deviations that grow into the most prominent arcs characteristically seen in open-shutter photographs. However, many smaller arcs that are usually overlooked in open-shutter pictures may contribute to current conduction in a closed switch

  9. The ATLAS Trigger System Commissioning and Performance

    CERN Document Server

    Hamilton, A

    2010-01-01

    The ATLAS trigger has been used very successfully to collect collision data during 2009 and 2010 LHC running at centre of mass energies of 900 GeV, 2.36 TeV, and 7 TeV. This paper presents the ongoing work to commission the ATLAS trigger with proton collisions, including an overview of the performance of the trigger based on extensive online running. We describe how the trigger has evolved with increasing LHC luminosity and give a brief overview of plans for forthcoming LHC running.

  10. The ATLAS Level-1 Trigger Timing Setup

    CERN Document Server

    Spiwoks, R; Ellis, Nick; Farthouat, P; Gällnö, P; Haller, J; Krasznahorkay, A; Maeno, T; Pauly, T; Pessoa-Lima, H; Resurreccion-Arcas, I; Schuler, G; De Seixas, J M; Torga-Teixeira, R; Wengler, T

    2005-01-01

    The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a bunch-crossing rate of 40 MHz. In order to reduce the data rate, a three-level trigger system selects potentially interesting physics. The first trigger level is implemented in electronics and firmware. It aims at reducing the output rate to less than 100 kHz. The Central Trigger Processor combines information from the calorimeter and muon trigger processors and makes the final Level-1-Accept decision. It is a central element in the timing setup of the experiment. Three aspects are considered in this article: the timing setup with respect to the Level-1 trigger, with respect to the experiment, and with respect to the world.

  11. Sparse Method for Direction of Arrival Estimation Using Denoised Fourth-Order Cumulants Vector.

    Science.gov (United States)

    Fan, Yangyu; Wang, Jianshu; Du, Rui; Lv, Guoyun

    2018-06-04

    Fourth-order cumulants (FOCs) vector-based direction of arrival (DOA) estimation methods of non-Gaussian sources may suffer from poor performance for limited snapshots or difficulty in setting parameters. In this paper, a novel FOCs vector-based sparse DOA estimation method is proposed. Firstly, by utilizing the concept of a fourth-order difference co-array (FODCA), an advanced FOCs vector denoising or dimension reduction procedure is presented for arbitrary array geometries. Then, a novel single measurement vector (SMV) model is established by the denoised FOCs vector, and efficiently solved by an off-grid sparse Bayesian inference (OGSBI) method. The estimation errors of the FOCs are integrated in the SMV model, and are approximately estimated in a simple way. A necessary condition regarding the number of identifiable sources of our method is presented: in order to uniquely identify all sources, the number of sources K must fulfill K ≤ (M^4 - 2M^3 + 7M^2 - 6M)/8. The proposed method suits any geometry, does not need prior knowledge of the number of sources, is insensitive to the associated parameters, and has maximum identifiability O(M^4), where M is the number of sensors in the array. Numerical simulations illustrate the superior performance of the proposed method.
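
    For reference, the fourth-order cumulant from which such FOC vectors are built can be written for zero-mean array outputs x_1, ..., x_4 as follows; this is the standard textbook definition, not a formula transcribed from the paper:

        \operatorname{cum}(x_1, x_2, x_3, x_4)
            = \mathrm{E}[x_1 x_2 x_3 x_4]
            - \mathrm{E}[x_1 x_2]\,\mathrm{E}[x_3 x_4]
            - \mathrm{E}[x_1 x_3]\,\mathrm{E}[x_2 x_4]
            - \mathrm{E}[x_1 x_4]\,\mathrm{E}[x_2 x_3].

    The cumulant vanishes when the x_i are jointly Gaussian, which is why FOC-based DOA estimators suppress Gaussian noise irrespective of its spatial color; for complex baseband data, conjugated combinations such as cum(x_k, x_l*, x_m, x_n*) of the sensor outputs are normally used.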

  12. Upgrade trigger & reconstruction strategy: 2017 milestone

    CERN Document Server

    Albrecht, Johannes; Campora Perez, Daniel Hugo; Cattaneo, Marco; Marco, Clemencic; Couturier, Ben; Dziurda, Agnieszka; Fitzpatrick, Conor; Fontana, Marianna; Grillo, Lucia; Hasse, Christoph; Hill, Donal; Jones, Christopher Rob; Lemaitre, Florian; Lupton, Olli; Matev, Rosen; Pearce, Alex; Polci, Francesco; Promberger, Laura; Ponce, Sebastien; Quagliani, Renato; Raven, Gerhard; Sciascia, Barbara; Schiller, Manuel Tobias; Stahl, Sascha; Szymanski, Maciej Pawel; Chefdeville, Maximilien

    2018-01-01

    The LHCb collaboration is currently preparing an update of the experiment to take data in Run 3 of the LHC. The dominant feature of this upgrade is a trigger-less readout of the full detector followed by a full software trigger. To make optimal use of the collected data, the events are reconstructed at the inelastic collision rate of 30 MHz. This document presents the baseline trigger and reconstruction strategy as of the end of 2017.

  13. The ATLAS Trigger System : Ready for Run-2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00211007; The ATLAS collaboration

    2016-01-01

    The ATLAS trigger has been successfully collecting collision data during the first run of the LHC between 2009-2013 at a centre-of-mass energy between 900 GeV and 8 TeV. The trigger system consists of a hardware based Level-1 (L1) and a software based high-level trigger (HLT) that reduces the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. During the course of the ongoing Run-2 data-taking campaign at 13 TeV centre-of-mass energy the trigger rates will be approximately 5 times higher compared to Run-1. In these proceedings we briefly review the ATLAS trigger system upgrades that were implemented during the shutdown, allowing us to cope with the increased trigger rates while maintaining or even improving our efficiency to select relevant physics processes. This includes changes to the L1 calorimeter and muon trigger system, the introduction of a new L1 topological trigger subsystem and the merging of the previously two-level HLT system into a single ev...

  14. ATLAS LAr Calorimeter Trigger Electronics Phase-1 Upgrade

    CERN Document Server

    Aad, Georges; The ATLAS collaboration

    2017-01-01

    The upgrade of the Large Hadron Collider (LHC) scheduled for a shut-down period of 2019-2020, referred to as the Phase-I upgrade, will increase the instantaneous luminosity to about three times the design value. Since the current ATLAS trigger system does not allow sufficient increase of the trigger rate, an improvement of the trigger system is required. The Liquid Argon (LAr) Calorimeter read-out will therefore be modified to use digital trigger signals with a higher spatial granularity in order to improve the identification efficiencies of electrons, photons, taus, jets and missing energy, at high background rejection rates at the Level-1 trigger. The new trigger signals will be arranged in 34000 so-called Super Cells, which achieve 5-10 times better granularity than the trigger towers currently used and allow an improved background rejection. The readout of the trigger signals will process the signal of the Super Cells at every LHC bunch-crossing at 12-bit precision and a frequency of 40 MHz. The data will...

  15. DZERO Level 3 DAQ/Trigger Closeout

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Tevatron Collider, located at the Fermi National Accelerator Laboratory, delivered its last 1.96 TeV proton-antiproton collisions on September 30th, 2011. The DZERO experiment continues to take cosmic data for final alignment for several more months. Since Run 2 started in March 2001, all DZERO data has been collected by the DZERO Level 3 Trigger/DAQ System. The system is a modern, networked, commodity hardware trigger and data acquisition system based around a large central switch with about 60 front ends and 200 trigger computers. DZERO front end crates are VME based. A Single Board Computer interfaces between the detector data on VME and the network transport of the DAQ system. Event flow is controlled by the Routing Master, which can steer events to clusters of farm nodes based on the low level trigger bits that fired. The farm nodes are multi-core commodity computer boxes, without special hardware, that run isolated software to make the final Level 3 trigger decision. Passed events are transferred to th...

  16. The D-Zero Run II Trigger

    International Nuclear Information System (INIS)

    Blazey, G. C.

    1997-01-01

    The general purpose D0 collider detector, located at Fermi National Accelerator Laboratory, requires significantly enhanced data acquisition and triggering to operate in the high luminosity (L = 2 × 10^32 cm^-2 s^-1), high rate environment (7 MHz or 132 ns beam crossings) of the upgraded TeVatron proton anti-proton accelerator. This article describes the three major levels and frameworks of the new trigger. Information from the first trigger stage (L1) which includes scintillating, tracking and calorimeter detectors will provide a deadtimeless, 4.2 μs trigger decision with an accept rate of 10 kHz. The second stage (L2), comprised of hardware engines associated with specific detectors and a single global processor will test for correlations between L1 triggers. L2 will have an accept rate of 1 kHz at a maximum deadtime of 5% and require a 100 μs decision time. The third and final stage (L3) will reconstruct events in a farm of processors for a final instantaneous accept rate of 50 Hz.
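
    As a back-of-the-envelope check of the quoted rates, the sketch below computes the rejection factor each level must supply. The rates are taken from the abstract; the stage labels are just names used for this illustration.

```python
# Illustrative arithmetic only: per-level rejection factors implied by the
# rates quoted in the abstract (7 MHz crossings, 10 kHz L1, 1 kHz L2, 50 Hz L3).
rates_hz = {
    "beam crossings": 7.0e6,
    "Level 1": 10.0e3,
    "Level 2": 1.0e3,
    "Level 3": 50.0,
}

stages = list(rates_hz.items())
for (prev_name, prev_rate), (name, rate) in zip(stages, stages[1:]):
    print(f"{name}: {rate:10.0f} Hz  (rejection x{prev_rate / rate:.0f} vs {prev_name})")
print(f"Overall reduction: x{stages[0][1] / stages[-1][1]:.0f}")
```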

  17. Graphical processors for HEP trigger systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R. [INFN Sezione di Roma Tor Vergata, Via della Ricerca Scientifica, 1, 00133 Roma (Italy); Biagioni, A. [INFN Sezione di Roma, P.le Aldo Moro, 2, 00185 Roma (Italy); Chiozzi, S.; Cotta Ramusino, A. [INFN Sezione di Ferrara, Via Saragat, 1, 44122 Ferrara (Italy); Di Lorenzo, S. [INFN Sezione di Pisa, L. Bruno Pontecorvo, 3, 56127 Pisa (Italy); Università di Pisa, Lungarno Pacinotti 43, 56126 Pisa (Italy); Fantechi, R. [INFN Sezione di Pisa, L. Bruno Pontecorvo, 3, 56127 Pisa (Italy); Fiorini, M. [INFN Sezione di Ferrara, Via Saragat, 1, 44122 Ferrara (Italy); Università di Ferrara, Via Ludovico Ariosto 35, 44121 Ferrara (Italy); Frezza, O. [INFN Sezione di Roma, P.le Aldo Moro, 2, 00185 Roma (Italy); Lamanna, G. [INFN, Laboratori Nazionali di Frascati (Italy); Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P.S.; Pastorelli, E. [INFN Sezione di Roma, P.le Aldo Moro, 2, 00185 Roma (Italy); Piandani, R. [INFN Sezione di Pisa, L. Bruno Pontecorvo, 3, 56127 Pisa (Italy); Pontisso, L., E-mail: luca.pontisso@cern.ch [INFN Sezione di Pisa, L. Bruno Pontecorvo, 3, 56127 Pisa (Italy); Rossetti, D. [NVIDIA Corp., Santa Clara, CA (United States); Simula, F. [INFN Sezione di Roma, P.le Aldo Moro, 2, 00185 Roma (Italy); Sozzi, M. [INFN Sezione di Pisa, L. Bruno Pontecorvo, 3, 56127 Pisa (Italy); Università di Pisa, Lungarno Pacinotti 43, 56126 Pisa (Italy); and others

    2017-02-11

    General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to employ GPUs as accelerators in offline computations. With the steady decrease of GPU latencies and the increase in link and memory throughputs, time is ripe for real-time applications using GPUs in high-energy physics data acquisition and trigger systems. We will discuss the use of online parallel computing on GPUs for synchronous low level trigger systems, focusing on tests performed on the trigger of the CERN NA62 experiment. Latencies of all components need analysing, networking being the most critical. To keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling GPUDirect connection. Moreover, we discuss how specific trigger algorithms can be parallelised and thus benefit from a GPU implementation, in terms of increased execution speed. Such improvements are particularly relevant for the foreseen LHC luminosity upgrade where highly selective algorithms will be crucial to maintain sustainable trigger rates with very high pileup.
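
    The kind of low-level trigger algorithm that maps well onto a GPU is a per-event reduction followed by a threshold decision. The sketch below is a hedged illustration only: NumPy stands in for a GPU kernel, and the array shapes and the threshold are invented rather than taken from NA62.

```python
# Illustrative sketch of a data-parallel low-level trigger primitive.
# NumPy stands in for a GPU kernel: the same per-event reduction and threshold
# cut map naturally onto one GPU thread (or block) per event.
# Array shapes and the threshold value are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_events, n_channels = 10_000, 64
adc = rng.poisson(lam=3.0, size=(n_events, n_channels)).astype(np.float32)

energy_sum = adc.sum(axis=1)           # one reduction per event, fully parallel
accepted = energy_sum > 230.0          # vectorised threshold decision
print(f"accepted {accepted.sum()} / {n_events} events "
      f"({100.0 * accepted.mean():.1f}% pass fraction)")
```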

  18. Upgrades of the ATLAS trigger system

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00221618; The ATLAS collaboration

    2018-01-01

    In coming years the LHC is expected to undergo upgrades to increase both the energy of proton-proton collisions and the instantaneous luminosity. In order to cope with these more challenging LHC conditions, upgrades of the ATLAS trigger system will be required. This talk will focus on some of the key aspects of these upgrades. Firstly, the upgrade period between 2019-2021 will see an increase in instantaneous luminosity to $3\\times10^{34} \\rm{cm^{-2}s^{-1}}$. Upgrades to the Level 1 trigger system during this time will include improvements for both the muon and calorimeter triggers. These include the upgrade of the first-level Endcap Muon trigger, the calorimeter trigger electronics and the addition of new calorimeter feature extractor hardware, such as the Global Feature Extractor (gFEX). An overview will be given on the design and development status of the aforementioned systems, along with the latest testing and validation results. By 2026, the High Luminosity LHC will be able to deliver 14 TeV collisions ...

  19. Graphical processors for HEP trigger systems

    International Nuclear Information System (INIS)

    Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P.S.; Pastorelli, E.; Piandani, R.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.

    2017-01-01

    General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to employ GPUs as accelerators in offline computations. With the steady decrease of GPU latencies and the increase in link and memory throughputs, time is ripe for real-time applications using GPUs in high-energy physics data acquisition and trigger systems. We will discuss the use of online parallel computing on GPUs for synchronous low level trigger systems, focusing on tests performed on the trigger of the CERN NA62 experiment. Latencies of all components need analysing, networking being the most critical. To keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling GPUDirect connection. Moreover, we discuss how specific trigger algorithms can be parallelised and thus benefit from a GPU implementation, in terms of increased execution speed. Such improvements are particularly relevant for the foreseen LHC luminosity upgrade where highly selective algorithms will be crucial to maintain sustainable trigger rates with very high pileup.

  20. Online software trigger at PANDA/FAIR

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Donghee; Kliemt, Ralf; Nerling, Frank [Helmholtz-Institut Mainz (Germany); Denig, Achim [Institut fuer Kernphysik, Universitaet Mainz (Germany); Goetzen, Klaus; Peters, Klaus [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH (Germany); Collaboration: PANDA-Collaboration

    2014-07-01

    The PANDA experiment at FAIR will employ a novel trigger-less read-out system. Since a conventional hardware trigger concept is not suitable for PANDA, a high level online event filter will be applied to perform fast event selection based on physics properties of the reconstructed events. A trigger-less data stream implies an event selection with track reconstruction and pattern recognition to be performed online, and thus analysing data under real time conditions at event rates of up to 40 MHz. The projected data rate reduction of about three orders of magnitude requires an effective background rejection, while retaining interesting signal events. Real time event selection in the environment of hadronic reactions is rather challenging and relies on sophisticated algorithms for the software trigger. The implementation and the performance of physics trigger algorithms presently studied with realistic Monte Carlo simulations are discussed. The impact of parameters such as momentum or mass resolution, PID probability, vertex reconstruction and a multivariate analysis using the TMVA package for event filtering is presented.
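
    A software trigger of this kind selects events on reconstructed physics quantities rather than on raw detector signals. The sketch below is a hedged illustration of that idea: it keeps an event if a two-track candidate falls inside an invariant-mass window. The four-vectors, the window, and the function names are invented for the example and are not PANDA's actual selection.

```python
# Illustrative sketch of a physics-driven software filter: build the invariant
# mass of a two-track candidate and keep events inside a mass window.
# The window and the four-vectors below are invented for illustration only.
import math

def invariant_mass(p4_a, p4_b):
    """Invariant mass of two four-momenta given as (E, px, py, pz) in GeV."""
    e = p4_a[0] + p4_b[0]
    px, py, pz = (p4_a[i] + p4_b[i] for i in range(1, 4))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def keep_event(candidates, m_low=3.0, m_high=3.2):
    """Accept the event if any candidate pair falls inside the mass window."""
    return any(m_low < invariant_mass(a, b) < m_high for a, b in candidates)

# Example: a pair roughly compatible with a 3.1 GeV resonance -> prints True.
pair = ((1.6, 1.2, 0.8, 0.6), (1.55, -1.2, -0.8, -0.55))
print(keep_event([pair]))
```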

  1. The CMS trigger in Run 2

    CERN Document Server

    Tosi, Mia

    2018-01-01

    During its second period of operation (Run 2) which started in 2015, the LHC will reach a peak instantaneous luminosity of approximately 2$\\times 10^{34}$~cm$^{-2}s^{-1}$ with an average pile-up of about 55, far larger than the design value. Under these conditions, the online event selection is a very challenging task. In CMS, it is realised by a two-level trigger system: the Level-1 (L1) Trigger, implemented in custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the offline reconstruction software running on a computer farm. In order to face this challenge, the L1 trigger has undergone a major upgrade compared to Run 1, whereby all electronic boards of the system have been replaced, allowing more sophisticated algorithms to be run online. Its last stage, the global trigger, is now able to perform complex selections and to compute high-level quantities, like invariant masses. Likewise, the algorithms that run in the HLT went through big improvements; in particular, new ap...

  2. The ZEUS calorimeter first level trigger

    Science.gov (United States)

    Silverstein, S.; Ali, I.; Behrens, B.; Foudas, C.; Fordham, C.; Goussiou, A.; Jaworski, M.; Lackey, J.; Reeder, D.; Robl, P.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Dawson, J.; Krakauer, D.; Talaga, R.; Schlereth, J.; Zhang, H.

    1995-02-01

    An overview of the ZEUS calorimeter first level trigger is presented. The CFLT uses a pipelined architecture to accept and analyze calorimeter data for every 96 ns beam crossing interval. PMT signals are combined by analog electronics into electromagnetic and hadronic sums for 896 trigger towers. The analog sums are then digitized and analyzed. The CFLT determines the total, transverse, and missing transverse energy, identifies isolated electrons and muons, and sums energies in programmable subregions. Calculations are performed in 96 ns steps, and new data are accepted for every beam crossing. Trigger data are forwarded to the global first level trigger (GFLT) after 2 μs, allowing a GFLT accept to be issued 5 μs after the beam crossing which produced the event. Important features of the CFLT include a 12-bit effective dynamic range, extensive use of memory lookup tables for trigger calculations, fast pattern searches for isolated leptons, and low electronics noise. During the 1993 HERA run, the CFLT reduced a 50 kHz background rate to around 100 Hz.
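
    The memory-lookup approach mentioned above replaces per-crossing arithmetic with a single indexed read from a precomputed table. The sketch below is a hedged illustration only: the pedestal, the calibration slope, and the 8-bit width are invented, not the CFLT's actual constants.

```python
# Illustrative sketch of lookup-table trigger arithmetic: a precomputed table
# maps a digitised tower value directly to calibrated transverse energy, so the
# per-crossing "calculation" is just an indexed read.
# The ADC width and calibration constants below are invented for illustration.
import numpy as np

ADC_BITS = 8
GEV_PER_COUNT = 0.25      # assumed calibration slope
PEDESTAL = 4              # assumed pedestal in ADC counts

# Built once at configuration time, addressed every beam crossing thereafter.
et_lut = np.array(
    [max(code - PEDESTAL, 0) * GEV_PER_COUNT for code in range(2**ADC_BITS)],
    dtype=np.float32,
)

tower_codes = np.array([3, 12, 40, 255])      # digitised tower values, one crossing
tower_et = et_lut[tower_codes]                # one lookup per tower
print(tower_et, "-> total ET =", tower_et.sum(), "GeV")
```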

  3. ELM mitigation with pellet ELM triggering and implications for PFCs and plasma performance in ITER

    NARCIS (Netherlands)

    Baylor, L.R.; Lang, P.T.; Allen, S.L.; Combs, S.K.; Commaux, N.; Evans, T.E.; Fenstermacher, M.E.; Huijsmans, G.T.A.; Jernigan, T.C.; Lasnier, C.J.; Leonard, A.W.; Loarte, A.; Maingi, R.; Maruyama, S.; Meitner, S.J.; Moyer, R.A.; Osborne, T.H.

    2015-01-01

    PLASMA-SURFACE INTERACTIONS 21: Proceedings of the 21st International Conference on Plasma-Surface Interactions in Controlled Fusion Devices, Kanazawa, Japan, May 26-30, 2014. The triggering of rapid small edge localized modes (ELMs) by high frequency pellet injection has been proposed as a method to

  4. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    Science.gov (United States)

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consist of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections between each other. In addition to the theoretical findings including rigorous system stability and the boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is further provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.
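
    The core idea of event-triggered control is that the control signal is recomputed and sent over the network only when the measured state has drifted from its last transmitted value by more than a threshold. The sketch below is a minimal illustration under assumed parameters, not the adaptive architecture of the paper: the scalar plant, the fixed gain, and the threshold are all invented.

```python
# Minimal sketch of event-triggered feedback, assuming a scalar plant
# x_dot = a*x + b*u and a fixed stabilising gain. The plant, gain, and
# threshold are invented and do not reproduce the paper's adaptive design.
a, b, k = 1.0, 1.0, 3.0          # unstable plant, stabilising gain
dt, steps, threshold = 0.01, 1000, 0.05

x, x_last_sent = 1.0, 1.0        # true state and last state seen by the controller
u = -k * x_last_sent
transmissions = 0

for _ in range(steps):
    # Trigger: update the control only when the measurement error is large.
    if abs(x - x_last_sent) > threshold:
        x_last_sent = x
        u = -k * x_last_sent
        transmissions += 1
    x += dt * (a * x + b * u)    # Euler step of the closed-loop plant

print(f"final state {x:.4f} after {transmissions} transmissions "
      f"(vs {steps} periodic updates)")
```

    In this toy example the state stays bounded near the origin while the controller transmits only a few dozen updates instead of one per time step, which is the network-utilization saving the abstract refers to.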

  5. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    Directory of Open Access Journals (Sweden)

    Ali Albattat

    2016-08-01

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consist of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections between each other. In addition to the theoretical findings including rigorous system stability and the boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is further provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  6. Triggering Artefacts

    DEFF Research Database (Denmark)

    Mogensen, Preben Holst; Robinson, Mike

    1995-01-01

    and adapting them to specific situations need not be ad hoc. Triggering artefacts are a way of systematically challenging both designers' preunderstandings and the conservatism of work practice. Experiences from the Great Belt tunnel and bridge project are used to illustrate how triggering artefacts change...

  7. Accounting for the social triggers of sexual compulsivity.

    Science.gov (United States)

    Parsons, Jeffrey T; Kelly, Brian C; Bimbi, David S; Muench, Frederick; Morgenstern, Jon

    2007-01-01

    To examine the social triggers of sexual compulsivity amongst a diverse sample of gay and bisexual men. Qualitative interviews were conducted with 180 gay and bisexual men in the United States who self-identified that their sex lives were spinning out of control. The data were analyzed using a grounded theory approach to explore the range of social triggers that were driving sexual compulsions. An open-ended interview and a structured clinical interview were conducted with each participant. The interviews examined their experiences with sexual compulsivity over time and the impact of their problematic sexual behaviors on their lives. Two types of social triggers emerged from the data: event-centered triggers and contextual triggers. Event-centered triggers arise from sudden, unforeseen events. Two major event-centered triggers were identified: relationship turmoil and catastrophes. Contextual triggers, on the other hand, have a certain element of predictability, and included such things as location, people, the use of drugs, and pornography. This framework of triggers has clinical implications for the prevention and treatment of sexual compulsivity. Clinicians can utilize the framework of social triggers in the therapeutic process to provide insight into ways to effectively work through symptoms of sexual compulsivity. Awareness of the contextual aspects of sexual compulsivity may be critical to understanding the behaviors of sexually compulsive clients. Thus, therapeutic assessments should focus upon the social context in addition to the psychological components of the disorder.

  8. The D0 run II trigger system

    International Nuclear Information System (INIS)

    Schwienhorst, Reinhard; Michigan State U.

    2004-01-01

    The D0 detector at the Fermilab Tevatron was upgraded for Run II. This upgrade included improvements to the trigger system in order to be able to handle the increased Tevatron luminosity and higher bunch crossing rates compared to Run I. The D0 Run II trigger is a highly flexible system to select events to be written to tape from an initial interaction rate of about 2.5 MHz. This is done in a three-tier pipelined, buffered system. The first tier (level 1) processes fast detector pick-off signals in a hardware/firmware based system to reduce the event rate to about 1.5 kHz. The second tier (level 2) uses information from level 1 and forms simple physics objects to reduce the rate to about 850 Hz. The third tier (level 3) uses full detector readout and event reconstruction on a filter farm to reduce the rate to 20-30 Hz. The D0 trigger menu contains a wide variety of triggers. While the emphasis is on triggering on generic lepton and jet final states, there are also trigger terms for specific final state signatures. In this document we describe the D0 trigger system as it was implemented and is currently operating in Run II.

  9. Stochastic evaluation of the dynamic response and the cumulative damage of nuclear power plant piping

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hanaoka, Masaaki

    1981-01-01

    This report deals with a fundamental study concerning the evaluation of uncertainties in the nuclear piping response and cumulative damage under excess-earthquake loadings. The main purposes of this study cover the following problems: (1) experimental estimation of the uncertainties concerning the dynamic response and the cumulative failure by using a piping test model; (2) numerical simulation analysis by the Monte Carlo method under the assumption that the relation between restoring force and deformation is perfectly elasto-plastic (checking the mathematical model); (3) development of a conventional uncertainty estimating method by introducing a perturbation technique based on an appropriate equivalently linearized approach (checking the estimation technique); (4) an application of this method to more realistic cases. Through the above-mentioned procedures some important results are obtained. First, fundamental statistical properties of the natural frequencies and the number of cycles to failure crack initiation are evaluated. Second, the effects of the frequency fluctuation and the yielding fluctuation are estimated and examined through a Monte Carlo simulation technique. It has become clear that the yielding fluctuation has a significant effect on the piping power response up to its failure initiation. Finally, some results obtained through the proposed perturbation technique are discussed: the estimated statistical properties coincide fairly well with those obtained through numerical simulation. (author)
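
    The kind of uncertainty propagation described above can be illustrated with a toy Monte Carlo. The sketch below is not the paper's elasto-plastic piping model: it samples a lognormal cycles-to-failure capacity and accumulates a Miner's-rule damage index over an assumed load history, with every distribution and constant invented for illustration.

```python
# Illustrative Monte Carlo only: propagates uncertainty in a strength-like
# parameter into a Miner's-rule cumulative damage index. The distributions,
# the S-N exponent, and the load history are invented; this is not the
# elasto-plastic piping model used in the paper.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 20_000

# Random capacity: cycles-to-failure at the reference stress level, lognormal.
n_ref = rng.lognormal(mean=np.log(5_000), sigma=0.3, size=n_samples)
sn_exponent = 3.0                                  # assumed S-N slope

# Deterministic load history: (stress ratio s/s_ref, number of cycles).
load_blocks = [(1.0, 800), (1.4, 200), (0.7, 2_000)]

# Miner's rule: D = sum_i n_i / N_i with N_i = N_ref * (s_ref / s_i)^m.
damage = np.zeros(n_samples)
for stress_ratio, cycles in load_blocks:
    damage += cycles / (n_ref * stress_ratio ** (-sn_exponent))

print(f"mean damage {damage.mean():.3f}, std {damage.std():.3f}, "
      f"P(D >= 1) = {(damage >= 1.0).mean():.3f}")
```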

  10. Exploring the Role of Soil Moisture Conditions for Rainfall Triggered Landslides on Catchment Scale: the case of the Ialomita Sub Carpathians, Romania

    Science.gov (United States)

    Chitu, Zenaida; Bogaard, Thom; Adler, Mary-Jeanne; Steele-Dunne, Susan; Hrachowitz, Markus; Busuioc, Aristita; Sandric, Ionut; Istrate, Alexandru

    2014-05-01

    Like in many parts of the world, landslides are recurrent phenomena in Romania that cause substantial damage to infrastructure every few years. The high frequency of landslide events around the world has led to the development of many early warning systems based on the definition of rainfall thresholds triggering landslides. In Romania in particular, recent studies exploring the temporal occurrence of landslides have revealed that rainfall is the most important triggering factor for landslides. Because of the low-permeability soils and gentle slopes in the Ialomita Subcarpathians of Romania, precipitation cumulated over variable time intervals and the hydraulic response of the soil play a key role in landslide triggering. In order to identify the slope responses to rainfall events in this particular area we investigate the variability of soil moisture and its relationship to landslide events in three Subcarpathian catchments (Cricovul Dulce, Bizididel and Vulcana) by combining in situ measurements, satellite-based radiometry and hydrological modelling. For the current study, hourly soil moisture measurements from six soil moisture monitoring stations fitted with volumetric soil moisture sensors, soil temperature sensors and rain gauges are used. Pedotransfer functions will be applied in order to infer hydraulic soil properties from soil texture sampled from 50 soil profiles. The information about the spatial and temporal variability of soil moisture content will be complemented with the Level 2 soil moisture products from the Soil Moisture and Ocean Salinity (SMOS) mission. A time series analysis of soil moisture will be integrated with landslide and rainfall time series in order to determine a preliminary rainfall threshold triggering landslides in the Ialomita Subcarpathians.
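
    A common way to turn such observations into an operational trigger is to cumulate rainfall over a moving window and compare the mean intensity against an intensity-duration threshold of the power-law form I = a*D^b. The sketch below is a hedged illustration only: the coefficients, the window lengths, and the synthetic rainfall record (including an injected storm) are invented and are not the thresholds derived for the Ialomita Subcarpathians.

```python
# Illustrative sketch of a rainfall-threshold check: cumulate hourly rainfall
# over a moving window and flag exceedance of an intensity-duration threshold
# of the common power-law form I = a * D**b. The coefficients a, b and the
# synthetic rainfall record are invented for illustration only.
import numpy as np

def exceeds_threshold(rain_mm_per_h, window_h, a=10.0, b=-0.6):
    """Boolean array: mean intensity over each window_h-hour window above a*D**b."""
    rain = np.asarray(rain_mm_per_h, dtype=float)
    cumulated = np.convolve(rain, np.ones(window_h), mode="valid")  # mm per window
    mean_intensity = cumulated / window_h                           # mm/h
    return mean_intensity > a * window_h ** b

rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.3, scale=2.0, size=240)   # ten days of hourly rainfall
rain[100:110] += 8.0                               # inject a ten-hour storm

for duration in (6, 12, 24):
    hits = exceeds_threshold(rain, duration)
    print(f"D = {duration:2d} h: threshold exceeded in {int(hits.sum())} windows")
```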

  11. A P-N Sequence Generator Using LFSR with Dual Edge Trigger Technique

    Directory of Open Access Journals (Sweden)

    Naghwal Nitin Kumar

    2016-01-01

    This paper presents the design and implementation of a low-power 4-bit LFSR using dual-edge-triggered flip-flops. A linear feedback shift register (LFSR) is assembled from N flip-flops connected in series and combinational logic, generally an XOR gate. An LFSR can generate a pseudo-random number sequence which acts as a cipher in cryptography. A known text is encrypted over a long PN sequence; to improve security the sequence is made longer, e.g., 128 bits, which requires a long chain of flip-flops and leads to higher power consumption. In this paper a novel random sequence generator circuit using dual-edge-triggered flip-flops is proposed. Data are generated on both clock edges instead of a single edge, so a DETFF-based LFSR can produce the required random sequence in fewer clock cycles, which results in power saving. The paper concentrates on the design of a power-efficient Test Pattern Generator (TPG) using four dual-edge-triggered flip-flops as the basic building block; overall, a power reduction of around 25% is obtained with these techniques.
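
    The PN-sequence logic itself is easy to model in software; the dual-edge clocking is a circuit-level optimisation that a behavioural model does not capture. The sketch below is a plain single-edge model of a 4-bit Fibonacci LFSR with taps chosen so that it cycles through all 15 non-zero states; the seed and the function name are arbitrary choices for this example.

```python
# Behavioural model of a 4-bit Fibonacci LFSR (maximal-length taps), showing
# only the PN-sequence logic; the dual-edge-triggered clocking described in
# the paper is a circuit-level optimisation not modelled here.
def lfsr_sequence(seed=0b1001, taps=(0, 1), n_bits=4, length=15):
    """Generate `length` output bits from a Fibonacci LFSR."""
    state = seed & ((1 << n_bits) - 1)
    assert state != 0, "an all-zero state would lock the LFSR"
    bits = []
    for _ in range(length):
        bits.append(state & 1)                     # serial output bit
        feedback = 0
        for t in taps:                             # XOR of the tapped bits
            feedback ^= (state >> t) & 1
        state = (state >> 1) | (feedback << (n_bits - 1))
    return bits

print("PN sequence:", "".join(map(str, lfsr_sequence())))   # repeats with period 15
```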

  12. Cumulative Effects of Micro-Hydro Development on the Fisheries of the Swan River Drainage, Montana, First Annual Progress Report (Covering Field Season July-November 1982).

    Energy Technology Data Exchange (ETDEWEB)

    Leathe, Stephen A.; Graham, Patrick J.

    1984-03-01

    The purpose of this fisheries study is to determine the potential cumulative biological and economic effects of 20 small or micro-hydro-electric facilities (less than 5 megawatts) proposed to be constructed on tributaries to the Swan River, a 1738 square kilometer (671 square mile) drainage located in northwestern Montana. The study addresses portions of measure 1204 (b) (2) of the Northwest Power Planning Council's Columbia River Basin Fish and Wildlife Program. Aerial pre-surveys conducted during 1982 identified 102 stream reaches that may support fish populations in the Swan drainage between Swan and Lindbergh lakes. These reaches were located in 49 tributary streams and constituted 416 kilometers (258 miles) of potential fish habitat. Construction of all proposed small hydro projects would divert water from 54 kilometers (34 miles) or about 13 percent of the tributary system. Only two of the 20 proposed hydro sites did not support trout populations and most were populated by migratory bull trout and westslope cutthroat trout. Potential cumulative habitat losses that could result from dewatering of all proposed project areas were predicted using a stream reach classification scheme involving stream gradient, drainage area, and fish population data. Preliminary results of this worst case analysis indicate that 23, 19 and 6 percent of the high quality rearing habitat for cutthroat, bull, and brook trout, respectively, would be lost.

  13. Upgrade of the Level-1 muon trigger of the ATLAS detector in the barrel-endcap transition region with RPC chambers

    CERN Document Server

    Massa, L; The ATLAS collaboration

    2014-01-01

    This report presents a project for the upgrade of the Level-1 muon trigger in the barrel-endcap transition region (1.0<|$\\eta$|<1.3) of the ATLAS detector. A large fraction of the Level-1 muon trigger rate in this region is caused by charged particles originating from secondary interactions downstream of the interaction point. After the LHC phase-1 upgrade, foreseen for 2018, the Level-1 muon trigger rate would saturate the allocated bandwidth unless new measures are adopted to improve the rejection of fake triggers. ATLAS is going to improve the trigger selectivity in the region |$\\eta$|>1.3 with the addition of the New Small Wheel detector as an inner trigger plane. To obtain a similar trigger selectivity in the barrel-endcap transition region 1.0<|$\\eta$|<1.3, it is proposed to add new RPC chambers at the edge of the inner layer of the barrel muon spectrometer. These chambers will be based on a three-layer structure with thinner gas gaps and electrodes with respect to the ATLAS standard and a new low-profile, light-weight mechanical structure that will allow the installation in the limited available spa...

  14. DESIGN AND ANALYSIS OF STATIC RANDOM ACCESS MEMORY BY SCHMITT TRIGGER TOPOLOGY FOR LOW VOLTAGE APPLICATIONS

    Directory of Open Access Journals (Sweden)

    RUKKUMANI V.

    2016-12-01

    Aggressive scaling of transistor dimensions with each technology generation has resulted in increased integration density and improved device performance at the expense of increased leakage current. Supply voltage scaling is an effective way of reducing dynamic as well as leakage power consumption. However, the sensitivity of the circuit parameters increases with reduction of the supply voltage, and SRAM bit-cells utilizing minimum-sized transistors are susceptible to various random process variations. Schmitt-trigger-based operation gives better read stability as well as superior write-ability compared to the standard bitcell configurations. The proposed Schmitt-trigger-based bitcells integrate a built-in feedback mechanism, making them highly tolerant to process variations. In this paper a design of differential-sensing Static Random Access Memory (SRAM) bit cells for ultralow-power and ultralow-area Schmitt trigger operation is introduced. The ST bit cells incorporate a built-in feedback mechanism, provided by a separate control signal or by the internal nodes, achieving the process-variation tolerance that is required for future nano-scaled technology nodes. A 32 nm technology is proposed for designing the 10T SRAM cell using Microwind. Total power is reduced by about 30% with the 32 nm technology as compared to the 65 nm technology.
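
    The benefit of a Schmitt-trigger stage comes from hysteresis: the switching threshold depends on the present output state, so noise around a single trip point cannot flip the stored node. The behavioural sketch below illustrates only that hysteresis; the trip voltages are invented and are not parameters of the 32 nm cell described here.

```python
# Behavioural sketch of Schmitt-trigger hysteresis: the switching threshold
# depends on the current output state, so noise inside the hysteresis band
# cannot toggle the node. The trip voltages are invented for illustration.
V_HIGH_TRIP = 0.7   # input must rise above this to pull the output low
V_LOW_TRIP = 0.3    # input must fall below this to pull the output high

def schmitt_inverter(v_in, out_state):
    """Return the new logical output of an inverting Schmitt stage."""
    if out_state == 1 and v_in > V_HIGH_TRIP:
        return 0
    if out_state == 0 and v_in < V_LOW_TRIP:
        return 1
    return out_state                      # inside the hysteresis band: hold

out = 1
for v in (0.1, 0.45, 0.55, 0.65, 0.75, 0.55, 0.45, 0.25):
    out = schmitt_inverter(v, out)
    print(f"v_in = {v:.2f} V -> out = {out}")
```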

  15. Design studies for the Double Chooz trigger

    International Nuclear Information System (INIS)

    Cucoanes, Andi Sebastian

    2009-01-01

    The main characteristic of the neutrino mixing effect is assumed to be the coupling between the flavor and the mass eigenstates. Three mixing angles (θ12, θ23, θ13) describe the magnitude of this effect. Still unknown, θ13 is considered very small, based on the measurement done by the CHOOZ experiment. A leading experiment will be Double Chooz, placed in the Ardennes region, on the same site as used by CHOOZ. The Double Chooz goal is the exploration of about 80% of the currently allowed θ13 region, by searching for the disappearance of reactor antineutrinos. Double Chooz will use two similar detectors, located at different distances from the reactor cores: a near one at about 150 m, where no oscillations are expected, and a far one at 1.05 km distance, close to the first minimum of the survival probability function. The measurement foresees a precise comparison of neutrino rates and spectra between both detectors. The detection mechanism is based on the inverse β-decay. The Double Chooz detectors have been designed to minimize the rate of random background. In a simplified view, two optically separated regions are considered. The target, filled with Gd-doped liquid scintillator, is the main antineutrino interaction volume. Surrounding the target, the inner veto region aims to tag the cosmogenic muon background which hits the detector. Both regions are viewed by photomultipliers. The Double Chooz trigger system has to be highly efficient for antineutrino events as well as for several types of background. The trigger analyzes discriminated signals from the central region and the inner veto photomultipliers. The trigger logic is fully programmable and can combine the input signals. The trigger conditions are based on the total energy released in the event and on the PMT group multiplicity. For redundancy, two independent trigger boards will be used for the central region, each of them receiving signals from half of the photomultipliers. A third trigger board
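
    A trigger condition of the kind described, combining a total-energy threshold with the multiplicity of PMT groups above a per-group threshold, can be written down in a few lines. The sketch below is illustrative only: the group size, all thresholds, and the OR combination are invented and are not the actual Double Chooz settings.

```python
# Illustrative sketch of a trigger condition of the kind described above:
# total deposited charge combined (here with an OR, for illustration) with the
# multiplicity of PMT groups above a per-group threshold. All thresholds and
# the group size are invented for this example.
def level0_trigger(pmt_charges_pe, group_size=16,
                   energy_threshold_pe=200.0,
                   group_threshold_pe=10.0,
                   min_groups=4):
    """Return True if the event satisfies the energy OR multiplicity condition."""
    total = sum(pmt_charges_pe)
    groups = [pmt_charges_pe[i:i + group_size]
              for i in range(0, len(pmt_charges_pe), group_size)]
    n_groups_fired = sum(1 for g in groups if sum(g) > group_threshold_pe)
    return total > energy_threshold_pe or n_groups_fired >= min_groups

# Example: 64 channels with a few hit PMTs; the multiplicity condition fires
# even though the total charge stays below the energy threshold -> prints True.
charges = [0.0] * 64
for ch, q in ((3, 40.0), (4, 35.0), (20, 30.0), (45, 25.0), (50, 20.0)):
    charges[ch] = q
print(level0_trigger(charges))
```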

  16. SSC physics signatures and trigger requirements

    International Nuclear Information System (INIS)

    1985-01-01

    Strategies are considered for triggering on new physics processes in the environment of the SSC, where interaction rates will be very high and most new physics processes quite rare. The quantities available for use in the trigger at various levels are related to the signatures of possible new physics. Two examples were investigated in some detail using the ISAJET Monte Carlo program: Higgs decays to W pairs and a missing energy trigger applied to gluino pair production. In both of the examples studied in detail, it was found that workable strategies for reducing the trigger rate were obtainable which also produced acceptable efficiency for the processes of interest. In future work, it will be necessary to carry out such a program for the full spectrum of suggested new physics.

  17. Cumulative effects assessment in Canada: an agenda for action and research

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, E.B.; Chan, Y.-H.; Peterson, N.M.; Constable, G.A.; Caton, R.B.; Davis, C.S.; Wallace, R.R.; Yarranton, G.A.

    1987-01-01

    This review of cumulative environmental effects assessment in Canada identified 13 sets of issues that are considered to be of particular significance to Canadians over the next decade or two. They are: long-range transport of air pollutants; urban air quality and airshed saturation, mobilization of persistent or bioaccumulated substances, climatic modification, land occupation by man-made features, habitat alienation and fragmentation, soil losses, effects of agricultural chemicals, groundwater supply reduction and contamination, increased sediment, chemical and thermal loading of freshwater and marine habitats, accelerating rates of renewable resource harvesting, and long-term containment and disposal of toxic wastes. There is a diverse set of examples in which cumulative effects have been recognized and brought under control and management, and the scientific and institutional factors that promoted a successful approach are summarized. It was confirmed that there are well-defined limitations in the degree to which project referrals and project-specific environmental impact assessments can be adapted to manage cumulative effects successfully. In general, this review confirmed the hypothesis that current approaches for both scientific analyses and institutional arrangements to manage cumulative effects remain inadequately developed in Canada. To address this weakness, action is required on improving links between ecosystems, research, and management. Recommendations are made and a research agenda is presented. 171 refs., 5 figs., 2 tabs.

  18. Simulation of the ATLAS New Small Wheel trigger

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00399900; The ATLAS collaboration

    2018-01-01

    The instantaneous luminosity of the LHC will increase up to a factor of seven with respect to the original design value to explore physics at higher energy scale. The inner station of the ATLAS muon end-cap system (Small Wheel) will be replaced by the New Small Wheel (NSW) to benefit from the high luminosity. The NSW will provide precise track-segment information to the Level-1 trigger system in order to suppress the trigger rate from fake muon tracks. This article summarizes the NSW trigger decision system and track-segment finding algorithm implemented in the trigger processor, and discusses results of performance studies on the trigger system. The results demonstrate that the NSW trigger system is capable of working with good performance satisfying the requirements.

  19. The first-level muon trigger system advances

    CERN Multimedia

    Ellis, N.

    2006-01-01

    Important advances have been made in the last few months in the first-level muon trigger, both for the barrel system and for the endcap system, in a close collaboration between the detector and trigger-electronics groups for the RPCs (Resistive-Plate Chambers) and TGCs (Thin-Gap Chambers). These trigger systems are crucial for the success of the muon-related physics programme of the experiment; events that are not triggered will be lost forever, and the trigger chambers also provide the second coordinate for the reconstruction of muons that are only measured in the bending plane by the MDT detectors. Integration and installation of the barrel muon trigger electronics on the RPC detectors is in full swing. The on-detector electronics consists of more than 800 units each of "Splitter" and "Pad" boxes which have been tested and integrated by a team of physicists, engineers and technicians from Italy and Romania. This work will continue for a further few months until the complete system has been installed and so...

  20. Progress on the Level-1 Calorimeter Trigger

    CERN Multimedia

    Eric Eisenhandler

    The Level-1 Calorimeter Trigger (L1Calo) has recently passed a number of major hurdles. The various electronic modules that make up the trigger are either in full production or are about to be, and preparations in the ATLAS pit are well advanced. L1Calo has three main subsystems. The PreProcessor converts analogue calorimeter signals to digital, associates the rather broad trigger pulses with the correct proton-proton bunch crossing, and does a final calibration in transverse energy before sending digital data streams to the two algorithmic trigger processors. The Cluster Processor identifies and counts electrons, photons and taus, and the Jet/Energy-sum Processor looks for jets and also sums missing and total transverse energy. Readout drivers allow the performance of the trigger to be monitored online and offline, and also send region-of-interest information to the Level-2 Trigger. The PreProcessor (Heidelberg) is the L1Calo subsystem with the largest number of electronic modules (124), and most of its fu...