WorldWideScience

Sample records for cumulative trigger proposal

  1. SQL Triggers Reacting on Time Events: An Extension Proposal

    Science.gov (United States)

    Behrend, Andreas; Dorau, Christian; Manthey, Rainer

    Being able to activate triggers at timepoints reached or after time intervals elapsed has been acknowledged by many authors as a valuable functionality of a DBMS. Recently, the interest in time-based triggers has been renewed in the context of data stream monitoring. However, until now SQL triggers have reacted to data changes only, even though research proposals and prototypes have long supported several other event types, in particular time-based ones. We therefore propose a seamless extension of the SQL trigger concept by time-based triggers, focusing on semantic issues arising from such an extension.
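
    As a language-neutral sketch of the two event types the abstract distinguishes (absolute timepoints reached and time intervals elapsed), the following Python fragment models a hypothetical time-trigger scheduler; the class and method names are illustrative assumptions, not the SQL syntax proposed by the authors.

```python
import heapq
import itertools

class TimeTriggerScheduler:
    """Minimal sketch of time-based triggers: fire an action either when an
    absolute timepoint is reached or after a time interval has elapsed.
    Hypothetical API for illustration, not the paper's SQL extension."""

    def __init__(self):
        self._queue = []               # heap of (fire_time, seq, action)
        self._seq = itertools.count()  # tie-breaker for equal fire times

    def at(self, timepoint, action):
        """Register a trigger that fires when `timepoint` is reached."""
        heapq.heappush(self._queue, (timepoint, next(self._seq), action))

    def after(self, now, interval, action):
        """Register a trigger that fires once `interval` has elapsed."""
        heapq.heappush(self._queue, (now + interval, next(self._seq), action))

    def advance(self, now):
        """Advance the clock to `now` and fire every due trigger in order."""
        fired = []
        while self._queue and self._queue[0][0] <= now:
            _, _, action = heapq.heappop(self._queue)
            fired.append(action())
        return fired
```

    A DBMS supporting such triggers would face exactly the semantic questions the paper raises, e.g. how firing interacts with transactions and what happens to timepoints that pass while the system is down.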

  2. Proposal for a level 0 calorimeter trigger system for LHCb

    CERN Document Server

    Bertin, A; Capponi, M; D'Antone, I; De Castro, S; Donà, R; Galli, D; Giacobbe, B; Marconi, U; Massa, I; Piccinini, M; Poli, M; Semprini-Cesari, N; Spighi, R; Vecchi, S; Villa, M; Vitale, A; Zoccoli, A; Zoccoli, Antonio

    1999-01-01

    In this note we present a complete system for the Level-0 LHCb calorimeter triggers. The system is derived from the electromagnetic calorimeter pre-trigger developed for the HERA-B experiment. The proposed system follows closely the Level-0 trigger algorithms presented in the LHCb Technical Proposal, based on an analysis of electromagnetic and hadronic showers performed on a 3x3 calorimeter matrix. The general architecture presented is completely synchronous and flexible enough to allow adaptation to further improvements of the Level-0 trigger algorithms.

  3. TRIGGER

    CERN Multimedia

    W. Smith

    2012-01-01

      Level-1 Trigger The Level-1 Trigger group is ready to deploy improvements to the L1 Trigger algorithms for 2012. These include new high-PT patterns for the RPC endcap, an improved CSC PT assignment, a new PT-matching algorithm for the Global Muon Trigger, and new calibrations for ECAL, HCAL, and the Regional Calorimeter Trigger. These should improve the efficiency, rate, and stability of the L1 Trigger. The L1 Trigger group is also migrating the online systems to SLC5. To make the data transfer from the Global Calorimeter Trigger to the Global Trigger more reliable and also to allow checking the data integrity online, a new optical link system has been developed by the GCT and GT groups and successfully tested at the CMS electronics integration facility in building 904. This new system is now undergoing further tests at Point 5 before being deployed for data-taking this year. New L1 trigger menus have recently been studied and proposed by Emmanuelle Perez and the L1 Detector Performance Group...

  4. A method proposal for cumulative environmental impact assessment based on the landscape vulnerability evaluation

    International Nuclear Information System (INIS)

    Pavlickova, Katarina; Vyskupova, Monika

    2015-01-01

    Cumulative environmental impact assessment is only occasionally applied in practical environmental impact assessment processes. The main reasons are the difficulty of identifying cumulative impacts caused by a lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a method proposal to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, a qualitative assessment of landscape ecological stability is conducted, and major vulnerability indicators of environmental and socio-economic receptors are specified and evaluated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrices while considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete environmental impact assessment process conducted in Slovakia. The results obtained in this case study show that the methodology is simple to apply, valid for all types of impacts and projects, inexpensive, and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, by the assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and by the grading of impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process

  5. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The hardware of the trigger components has been mostly finished. The ECAL Endcap Trigger Concentrator Cards (TCC) are in production, while Barrel TCC firmware has been upgraded, and the Trigger Primitives can now be stored by the Data Concentrator Card for readout by the DAQ. The Regional Calorimeter Trigger (RCT) system is complete, and the timing is being finalized. All 502 HCAL trigger links to RCT run without error. The HCAL muon trigger timing has been equalized with DT, RPC, CSC and ECAL. The hardware and firmware for the Global Calorimeter Trigger (GCT) jet triggers are being commissioned, and data from these triggers are available for readout. The GCT energy sums from rings of trigger towers around the beam pipe have been changed to include two rings from each side. The firmware for the Drift Tube Track Finder, Barrel Sorter and Wedge Sorter has been upgraded, and the synchronization of the DT trigger is satisfactory. The CSC local trigger has operated flawlessly u...

  6. TRIGGER

    CERN Multimedia

    Roberta Arcidiacono

    2013-01-01

    Trigger Studies Group (TSG) The Trigger Studies Group has just concluded its third 2013 workshop, where all POGs presented the improvements to the physics object reconstruction, and all PAGs have shown their plans for Trigger development aimed at the 2015 High Level Trigger (HLT) menu. The Strategy for Trigger Evolution And Monitoring (STEAM) group is responsible for Trigger menu development, path timing, Trigger performance studies coordination, HLT offline DQM as well as HLT release, menu and conditions validation – this last task in collaboration with PdmV (Physics Data and Monte Carlo Validation group). In the last months the group has delivered several HLT rate estimates and comparisons, using the available data and Monte Carlo samples. The studies were presented at the Trigger workshops in September and December, and STEAM has contacted POGs and PAGs to understand the origin of the discrepancies observed between 8 TeV data and Monte Carlo simulations. The most recent results show what the...

  7. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The trigger synchronization procedures for running with cosmic muons and operating with the LHC were reviewed during the May electronics week. Firmware maintenance issues were also reviewed. Link tests between the new ECAL endcap trigger concentrator cards (TCC48) and the Regional Calorimeter Trigger have been performed. Firmware for the energy sum triggers and an upgraded tau trigger of the Global Calorimeter Triggers has been developed and is under test. The optical fiber receiver boards for the Track-Finder trigger theta links of the DT chambers are now all installed. The RPC trigger is being made more robust by additional chamber and cable shielding and also by firmware upgrades. For the CSCs, the front-end and trigger motherboard firmware have been updated. New RPC patterns and DT/CSC lookup tables taking into account phi asymmetries in the magnetic field configuration are under study. The motherboard for the new pipeline synchronizer of the Global Trigg...

  8. TRIGGER

    CERN Multimedia

    W. Smith

    At the March meeting, the CMS trigger group reported on progress in production, tests in the Electronics Integration Center (EIC) in Prevessin 904, progress on trigger installation in the underground counting room at point 5, USC55, the program of trigger pattern tests and vertical slice tests, and planning for the Global Runs starting this summer. The trigger group is engaged in the final stages of production testing, systems integration, and software and firmware development. Most systems are delivering final tested electronics to CERN. The installation in USC55 is underway and integration testing is in full swing. A program of orderly connection and checkout with subsystems and central systems has been developed. This program includes a series of vertical subsystem slice tests providing validation of a portion of each subsystem from front-end electronics through the trigger and DAQ to data captured and stored. After full checkout, trigger subsystems will then be operated in the CMS Global Runs. Continuous...

  9. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The production of the trigger hardware is now basically finished, and in time for the turn-on of the LHC. The last boards produced are the Trigger Concentrator Cards for the ECAL Endcaps (TCC-EE). After the recent installation of the four EE Dees, the TCC-EE prototypes were used for their commissioning. Production boards are arriving and are being tested continuously, with the last ones expected in November. The Regional Calorimeter Trigger hardware is fully integrated after installation of the last EE cables. Pattern tests from the HCAL up to the GCT have been performed successfully. The HCAL triggers are fully operational, including the connection of the HCAL-outer and forward-HCAL (HO/HF) technical triggers to the Global Trigger. The HCAL Trigger and Readout (HTR) board firmware has been updated to permit recording of the tower “feature bit” in the data. The Global Calorimeter Trigger hardware is installed, but some firmware developments are still n...

  10. TRIGGER

    CERN Multimedia

    Wesley Smith

    2010-01-01

    Level-1 Trigger Hardware and Software The overall status of the L1 trigger has been excellent and the running efficiency has been high during physics fills. The timing is good to about 1%. The fine-tuning of the time synchronization of muon triggers is ongoing and will be completed after more than 10 nb-1 of data have been recorded. The CSC trigger primitive and RPC trigger timing have been refined. A new configuration for the CSC Track Finder featured modified beam halo cuts and improved ghost cancellation logic. More direct control was provided for the DT opto-receivers. New RPC Cosmic Trigger (RBC/TTU) trigger algorithms were enabled for collision runs. There is further work planned during the next technical stop to investigate a few of the links from the ECAL to the Regional Calorimeter Trigger (RCT). New firmware and a new configuration to handle trigger rate spikes in the ECAL barrel are also being tested. A board newly developed by the tracker group (ReTRI) has been installed and activated to block re...

  11. TRIGGER

    CERN Multimedia

    W. Smith

    2010-01-01

    Level-1 Trigger Hardware and Software The Level-1 Trigger hardware has performed well during both the recent proton-proton and heavy ion running. Efforts were made to improve the visibility and handling of alarms and warnings. The tracker ReTRI boards that prevent fixed frequencies of Level-1 Triggers are now configured through the Trigger Supervisor. The Global Calorimeter Trigger (GCT) team has introduced a buffer cleanup procedure at stops and a reset of the QPLL during configuring to ensure recalibration in case of a switch from the LHC clock to the local clock. A device to test the cables between the Regional Calorimeter Trigger and the GCT has been manufactured. A wrong charge bit was fixed in the CSC Trigger. The ECAL group is improving crystal masking and spike suppression in the trigger primitives. New firmware for the Drift Tube Track Finder (DTTF) sorters was developed to improve fake track tagging and sorting. Zero suppression was implemented in the DT Sector Collector readout. The track finder b...

  12. TRIGGER

    CERN Multimedia

    Wesley Smith

    Trigger Hardware The status of the trigger components was presented during the September CMS Week and Annual Review and at the monthly trigger meetings in October and November. Procedures for cold and warm starts (e.g. refreshing of trigger parameters stored in registers) of the trigger subsystems have been studied. Reviews of parts of the Global Calorimeter Trigger (GCT) and the Global Trigger (GT) have taken place in October and November. The CERN group summarized the status of the Trigger Timing and Control (TTC) system. All TTC crates and boards are installed in the underground counting room, USC55. The central clock system will be upgraded in December (after the Global Run at the end of November GREN) to the new RF2TTC LHC machine interface timing module. Migration of subsystems' TTC PCs to SLC4/XDAQ 3.12 is being prepared. Work is ongoing to unify the access to Local Timing Control (LTC) and TTC CMS interface module (TTCci) via SOAP (Simple Object Access Protocol, a lightweight XML-based messaging ...

  13. TRIGGER

    CERN Multimedia

    W. Smith, from contributions of C. Leonidopoulos

    2010-01-01

    Level-1 Trigger Hardware and Software Since nearly all of the Level-1 (L1) Trigger hardware at Point 5 has been commissioned, activities during the past months focused on the fine-tuning of synchronization, particularly for the ECAL and the CSC systems, on firmware upgrades and on improving trigger operation and monitoring. Periodic resynchronizations or hard resets and a shortened luminosity section interval of 23 seconds were implemented. For the DT sector collectors, an automatic power-off was installed in case of high temperatures, and the monitoring capabilities of the opto-receivers and the mini-crates were enhanced. The DTTF and the CSCTF now have improved memory lookup tables. The HCAL trigger primitive logic implemented a new algorithm providing better stability of the energy measurement in the presence of any phase misalignment. For the Global Calorimeter Trigger, additional Source Cards have been manufactured and tested. Testing of the new tau, missing ET and missing HT algorithms is underw...

  14. TRIGGER

    CERN Multimedia

    Wesley Smith

    Level-1 Trigger Hardware and Software The final parts of the Level-1 trigger hardware are now being put in place. For the ECAL endcaps, more than half of the Trigger Concentrator Cards for the ECAL Endcap (TCC-EE) are now available at CERN, such that one complete endcap can be covered. The Global Trigger now correctly handles ECAL calibration sequences, without being influenced by backpressure. The Regional Calorimeter Trigger (RCT) hardware is complete and working in USC55. Intra-crate tests of all 18 RCT crates and the Global Calorimeter Trigger (GCT) are regularly taking place. Pattern tests have successfully captured data from HCAL through RCT to the GCT Source Cards. HB/HE trigger data are being compared with emulator results to track down the very few remaining hardware problems. The treatment of hot and dead cells, including their recording in the database, has been defined. For the GCT, excellent agreement between the emulator and data has been achieved for jets and HF ET sums. There is still som...

  15. TRIGGER

    CERN Multimedia

    W. Smith

    Level-1 Trigger Hardware and Software The trigger system has been constantly in use in cosmic and commissioning data taking periods. During CRAFT running it delivered 300 million muon and calorimeter triggers to CMS. It has performed stably and reliably. During the abort gaps it has also provided laser and other calibration triggers. Timing issues, namely synchronization and latency issues, have been solved. About half of the Trigger Concentrator Cards for the ECAL Endcap (TCC-EE) are installed, and the firmware is being worked on. The production of the other half has started. The HCAL Trigger and Readout (HTR) card firmware has been updated, and new features such as fast parallel zero-suppression have been included. Repairs of drift tube (DT) trigger mini-crates, optical links and receivers of sector collectors are under way and have been completed on YB0. New firmware for the optical receivers of the theta links to the drift tube track finder is being installed. In parallel, tests with new eta track finde...

  16. TRIGGER

    CERN Multimedia

    R. Carlin with contributions from D. Acosta

    2012-01-01

    Level-1 Trigger Data-taking continues at cruising speed, with high availability of all components of the Level-1 trigger. We have operated the trigger up to a luminosity of 7.6E33, where we approached 100 kHz using the 7E33 prescale column.  Recently, the pause without triggers in case of an automatic "RESYNC" signal (the "settle" and "recover" time) was reduced in order to minimise the overall dead-time. This may become very important when the LHC comes back with higher energy and luminosity after LS1. We are also preparing for data-taking in the proton-lead run in early 2013. The CASTOR detector will make its comeback into CMS and triggering capabilities are being prepared for this. Steps to be taken include improved cooperation with the TOTEM trigger system and using the LHC clock during the injection and ramp phases of LHC. Studies are being finalised that will have a bearing on the Trigger Technical Design Report (TDR), which is to be rea...

  17. TRIGGER

    CERN Multimedia

    W. Smith

    At the December meeting, the CMS trigger group reported on progress in production, tests in the Electronics Integration Center (EIC) in Prevessin 904, progress on trigger installation in the underground counting room at point 5, USC55, and results from the Magnet Test and Cosmic Challenge (MTCC) phase II. The trigger group is engaged in the final stages of production testing, systems integration, and software and firmware development. Most systems are delivering final tested electronics to CERN. The installation in USC55 is underway and moving towards integration testing. A program of orderly connection and checkout with subsystems and central systems has been developed. This program includes a series of vertical subsystem slice tests providing validation of a portion of each subsystem from front-end electronics through the trigger and DAQ to data captured and stored. This is combined with operations and testing without beam that will continue until startup. The plans for start-up, pilot and early running tri...

  18. TRIGGER

    CERN Multimedia

    Wesley Smith

    2011-01-01

    Level-1 Trigger Hardware and Software New Forward Scintillating Counters (FSC) for rapidity gap measurements have been installed and integrated into the Trigger recently. For the Global Muon Trigger, tuning of quality criteria has led to improvements in muon trigger efficiencies. Several subsystems have started campaigns to increase spares by recovering boards or producing new ones. The barrel muon sector collector test system has been reactivated, new η track finder boards are in production, and φ track finder boards are under revision. In the CSC track finder, an η asymmetry problem has been corrected. New pT look-up tables have also improved efficiency. RPC patterns were changed from four out of six coincident layers to three out of six in the barrel, which led to a significant increase in efficiency. A new PAC firmware to trigger on heavy stable charged particles allows looking for chamber hit coincidences in two consecutive bunch-crossings. The redesign of the L1 Trigger Emulator...

  19. TRIGGER

    CERN Multimedia

    W. Smith, from contributions of D. Acosta

    2012-01-01

      The L1 Trigger group deployed several major improvements this year. Compared to 2011, the single-muon trigger rate has been reduced by a factor of 2 and the η coverage has been restored to 2.4, with high efficiency. During the current technical stop, a higher jet seed threshold will be applied in the Global Calorimeter Trigger in order to significantly reduce the strong pile-up dependence of the HT and multi-jet triggers. The currently deployed L1 menu, with the “6E33” prescales, has a total rate of less than 100 kHz and operates with detector readout dead time of less than 3% for luminosities up to 6.5 × 1033 cm–2s–1. Further prescale sets have been created for 7 and 8 × 1033 cm–2s–1 luminosities. The L1 DPG is evaluating the performance of the Trigger for upcoming conferences and publication. Progress on the Trigger upgrade was reviewed during the May Upgrade Week. We are investigating scenarios for stagin...

  20. TRIGGER

    CERN Multimedia

    W. Smith, from contributions of C. Leonidopoulos, I. Mikulec, J. Varela and C. Wulz.

    Level-1 Trigger Hardware and Software Over the past few months, the Level-1 trigger has successfully recorded data with cosmic rays over long continuous stretches as well as LHC splash events, beam halo, and collision events. The L1 trigger hardware, firmware, synchronization, performance and readiness for beam operation were reviewed in October. All L1 trigger hardware is now installed at Point 5, and most of it is completely commissioned. While the barrel ECAL Trigger Concentrator Cards are fully operational, the recently delivered endcap ECAL TCC system is still being commissioned. For most systems there is a sufficient number of spares available, but for a few systems additional reserve modules are needed. It was decided to increase the overall L1 latency by three bunch crossings to increase the safety margin for trigger timing adjustments. In order for CMS to continue data taking during LHC frequency ramps, the clock distribution tree needs to be reset. The procedures for this have been tested. A repl...

  21. TRIGGER

    CERN Multimedia

    R. Arcidiacono

    2013-01-01

      In 2013 the Trigger Studies Group (TSG) has been restructured into three sub-groups: STEAM, for the development of new HLT menus and monitoring their performance; STORM, for the development of HLT tools, code and actual configurations; and FOG, responsible for the online operations of the High Level Trigger. The Strategy for Trigger Evolution And Monitoring (STEAM) group is responsible for Trigger Menu development, path timing, trigger performance studies coordination, HLT offline DQM as well as HLT release, menu and conditions validation – in collaboration and with the technical support of the PdmV group. Since the end of proton-proton data taking, the group has started preparing for 2015 data taking, with collisions at 13 TeV and 25 ns bunch spacing. The reliability of the extrapolation to higher energy is being evaluated comparing the trigger rates on 7 and 8 TeV Monte Carlo samples with the data taken in the past two years. The effect of 25 ns bunch spacing is being studied on the d...

  22. TRIGGER

    CERN Multimedia

    W. Smith

    Level-1 Trigger Hardware and Software The road map for the final commissioning of the level-1 trigger system has been set. The software for the trigger subsystems is being upgraded to run under CERN Scientific Linux 4 (SLC4). There is also a new release for the Trigger Supervisor (TS 1.4), which implies upgrade work by the subsystems. As reported by the CERN group, a campaign to tidy the Trigger Timing and Control (TTC) racks has begun. The machine interface was upgraded by installing the new RF2TTC module, which receives RF signals from LHC Point 4. Two Beam Synchronous Timing (BST) signals, one for each beam, can now be received in CMS. The machine group will define the exact format of the information content shortly. The margin on the locking range of the CMS QPLL is planned to be studied for different subsystems in the next Global Runs, using a function generator. The TTC software has been successfully tested on SLC4. Some TTC subsystems have already been upgraded to SLC4. The TTCci Trigger Supervisor ...

  23. TRIGGER

    CERN Multimedia

    Wesley Smith

    2011-01-01

    Level-1 Trigger Hardware and Software After the winter shutdown minor hardware problems in several subsystems appeared and were corrected. A reassessment of the overall latency has been made. In the TTC system shorter cables between TTCci and TTCex have been installed, which saved one bunch crossing, but which may have required an adjustment of the RPC timing. In order to tackle Pixel out-of-syncs without influencing other subsystems, a special hardware/firmware re-sync protocol has been introduced in the Global Trigger. The link between the Global Calorimeter Trigger and the Global Trigger with the new optical Global Trigger Interface and optical receiver daughterboards has been successfully tested in the Electronics Integration Centre in building 904. New firmware in the GCT now allows a setting to remove the HF towers from energy sums. The HF sleeves have been replaced, which should lead to reduced rates of anomalous signals, which may allow their inclusion after this is validated. For ECAL, improvements i...

  24. TRIGGER

    CERN Multimedia

    W. Smith

    2011-01-01

    Level-1 Trigger Hardware and Software Overall the L1 trigger hardware has been running very smoothly during the last months of proton running. Modifications for the heavy-ion run have been made where necessary. The maximal design rate of 100 kHz can be sustained without problems. All L1 latencies have been rechecked. The recently installed Forward Scintillating Counters (FSC) are being used in the heavy ion run. The ZDC scintillators have been dismantled, but the calorimeter itself remains. We now send the L1 accept signal and other control signals to TOTEM. Trigger cables from TOTEM to CMS will be installed during the Christmas shutdown, so that the TOTEM data can be fully integrated within the CMS readout. New beam gas triggers have been developed, since the BSC-based trigger is no longer usable at high luminosities. In particular, a special BPTX signal is used after a quiet period with no collisions. There is an ongoing campaign to provide enough spare modules for the different subsystems. For example...

  25. TRIGGER

    CERN Multimedia

    J. Alimena

    2013-01-01

    Trigger Strategy Group The Strategy for Trigger Evolution And Monitoring (STEAM) group is responsible for the development of future High-Level Trigger menus, as well as of its DQM and validation, in collaboration and with the technical support of the PdmV group. Taking into account the beam energy and luminosity expected in 2015, a rough estimate of the trigger rates indicates a factor four increase with respect to 2012 conditions. Assuming that a factor two can be tolerated thanks to the increase in offline storage and processing capabilities, a toy menu has been developed using the new OpenHLT workflow to estimate the transverse energy/momentum thresholds that would halve the current trigger rates. The CPU time needed to run the HLT has been compared between data taken with 25 ns and 50 ns bunch spacing, for equivalent pile-up: no significant difference was observed on the global time per event distribution at the only available data point, corresponding to a pile-up of about 10 interactions. Using th...

  26. TRIGGER

    CERN Multimedia

    W. Smith

    Level-1 Trigger Hardware The CERN group is working on the TTC system. Seven out of nine sub-detector TTC VME crates with all fibers cabled are installed in USC55. 17 Local Trigger Controller (LTC) boards have been received from production and are in the process of being tested. The RF2TTC module replacing the TTCmi machine interface has been delivered and will replace the TTCci module used to mimic the LHC clock. 11 out of 12 crates housing the barrel ECAL off-detector electronics have been installed in USC55 after commissioning at the Electronics Integration Centre in building 904. The cabling to the Regional Calorimeter Trigger (RCT) is terminated. The Lisbon group has completed the Synchronization and Link mezzanine board (SLB) production. The Palaiseau group has fully tested and installed 33 out of 40 Trigger Concentrator Cards (TCC). The seven remaining boards are being remade. The barrel TCC boards have been tested at the H4 test beam, and good agreement with emulator predictions was found. The cons...

  27. Proposal of a trigger tool to assess adverse events in dental care.

    Science.gov (United States)

    Corrêa, Claudia Dolores Trierweiler Sampaio de Oliveira; Mendes, Walter

    2017-11-21

    The aim of this study was to propose a trigger tool for research of adverse events in outpatient dentistry in Brazil. The tool was elaborated in two stages: (i) to build a preliminary set of triggers, a literature review was conducted to identify the composition of trigger tools used in other areas of health and the principal adverse events found in dentistry; (ii) to validate the preliminarily constructed triggers, a panel of experts was organized using the modified Delphi method. Fourteen triggers were elaborated in a tool with explicit criteria to identify potential adverse events in dental care, essential for retrospective patient chart reviews. Studies on patient safety in dental care are still incipient when compared to other areas of health care. This study intended to contribute to the research in this field. The contribution by the literature and guidance from the expert panel allowed elaborating a set of triggers to detect adverse events in dental care, but additional studies are needed to test the instrument's validity.

  28. Proposed FPGA based tracking for a Level-1 track trigger at CMS for the HL-LHC

    CERN Document Server

    Pozzobon, Nicola

    2014-01-01

    The High Luminosity LHC (HL-LHC) is expected to deliver a luminosity in excess of $5\times10^{34}$ cm$^{-2}$/s. The high event rate places stringent requirements on the trigger system. A key component of the CMS upgrade for the HL-LHC is a track trigger system which will identify tracks with transverse momenta above 2 GeV already at the first-level trigger within 5 $\mu$s. This presentation will discuss proposed track finding and fitting based on the tracklet approach, implemented on FPGAs. Tracklets are formed from pairs of hits in nearby layers in the detector and used in a road search. Summary Fast pattern recognition in Silicon trackers for triggering has often made use of Associative Memories for the pattern recognition step. We propose an alternative approach to solving the pattern recognition and track fitting problem for the upgraded CMS tracker for the HL-LHC operation. We make use of the trigger primitives, stubs, from the tracker. The stubs are formed from pairs of hits in sensors separated r...
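
    The tracklet seeding and road search described above can be illustrated with a toy sketch. This is a simplified 1-D stand-in (a hypothetical `find_tracks` helper with linear extrapolation and an arbitrary road width), not the FPGA implementation proposed for CMS.

```python
def find_tracks(hits_by_layer, radii, road=0.5):
    """Toy tracklet-based road search: seed tracklets from hit pairs in the
    two innermost layers, extrapolate linearly in radius, and collect hits
    falling inside the road window in each outer layer. Illustrative only."""
    tracks = []
    for h0 in hits_by_layer[0]:
        for h1 in hits_by_layer[1]:
            # Tracklet from a pair of hits in nearby layers.
            slope = (h1 - h0) / (radii[1] - radii[0])
            track = [h0, h1]
            for layer in range(2, len(radii)):
                expected = h0 + slope * (radii[layer] - radii[0])
                matches = [h for h in hits_by_layer[layer]
                           if abs(h - expected) < road]
                if not matches:
                    break  # road empty: discard this tracklet
                # Keep the hit closest to the extrapolated position.
                track.append(min(matches, key=lambda h: abs(h - expected)))
            else:
                tracks.append(track)
    return tracks
```

    A real implementation works in the full detector geometry and fits the candidate track after hit collection; the sketch only shows the seed-and-road structure.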

  29. Proposal of upgrade of the ATLAS muon trigger in the barrel-endcap transition region with RPCs

    CERN Document Server

    Massa, L; The ATLAS collaboration

    2014-01-01

    This report presents a project for the upgrade of the Level-1 muon trigger in the barrel-endcap transition region (1.0 < |$\eta$| < 1.3), where a large fraction of the trigger rate is due to fake triggers caused by charged particles originating from secondary interactions downstream of the interaction point. After the LHC upgrade foreseen for 2018, the Level-1 muon trigger rate would saturate the allocated bandwidth unless new measures are adopted to improve the rejection of fake triggers. ATLAS is going to improve the trigger selectivity in the region |$\eta$|>1.3 with the New Small Wheel detector upgrade. To obtain a similar trigger selectivity in the barrel-endcap transition region, it is proposed to add new RPC chambers at the edge of the inner layer of the barrel muon spectrometer. These chambers will be based on a three-layer structure with thinner gas gaps and electrodes with respect to the ATLAS standard, and a new low-profile, lightweight mechanical structure that will allow installation in the limited available space. New front-end electronics, integrating fast TDC capabilities w...

  10. A proposed DT-seeded Muon Track Trigger for the HL-LHC

    CERN Document Server

    CMS Collaboration

    2015-01-01

    The LHC program after the observation of the candidate SM Higgs boson will continue with collisions at 13 and 14 TeV, which will help clarify future subjects of study and shape the tools needed to carry them out. Any upgrade of the LHC experiments for unprecedented luminosities, such as those of the HL-LHC, must then maintain the acceptance on electroweak processes that can lead to a detailed study of the properties of the candidate Higgs boson. The acceptance of the key leptonic, photonic and hadronic trigger objects should be kept such that the overall physics acceptance, in particular for low-mass scale processes, can be the same as the one the experiments featured in 2012. In such a scenario, a new approach to early trigger implementation is needed. One of the major steps to be taken is the exploitation of high-granularity tracking sub-detectors, such as the CMS Silicon Tracker, in taking the early trigger decision. Their inclusion into the trigger chain can be crucial in several tasks, including the confirmat...

  11. Proposal for a model to assess the effect of seismic activity on the triggering of debris flows

    Science.gov (United States)

    Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Luna, Byron Quan; Nadim, Farrokh

    2013-04-01

    Landslides triggered by earthquakes are a serious threat to many communities around the world, and in some cases are known to have caused 25-50% of earthquake fatalities. Seismic shaking can contribute to the triggering of debris flows either during the seismic event or indirectly, by increasing the susceptibility of the slope to debris flow during intense rainfall in the period after the seismic event. The paper proposes a model to quantify both effects. The model is based on an infinite slope formulation in which precipitation and earthquakes influence slope stability as follows: (1) During the shaking, the factor of safety is reduced by cyclic pore pressure build-up, where the cyclic pore pressure is modelled as a function of earthquake duration and intensity (measured as the number of equivalent shear stress cycles and the cyclic shear stress magnitude) and in-situ soil conditions (measured as the average normalised shear stress). The model is calibrated using cyclic triaxial and direct simple shear (DSS) test data on clay and sand. (2) After the shaking, the factor of safety is modified using a combined empirical and analytical model that links observed earthquake-induced changes in rainfall thresholds for the triggering of debris flows to an equivalent reduction in soil shear strength. The empirical part uses data from past earthquakes to propose a conceptual model linking a site-specific reduction factor for the rainfall intensity threshold (needed to trigger debris flows) to earthquake magnitude, distance from the epicentre and the time elapsed since the earthquake. The analytical part is a hydrological model of transient rainfall infiltration into an infinite slope, used to translate the change in rainfall intensity threshold into an equivalent reduction in soil shear strength. This is generalised into a functional form giving a site-specific shear strength reduction factor as a function of earthquake history and soil conditions. The model is suitable for hazard and risk
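The infinite-slope formulation in step (1) can be illustrated with the standard textbook factor-of-safety expression, in which seismically generated excess pore pressure u reduces the effective normal stress on the slip surface. The formula and all parameter values below are illustrative assumptions, not the paper's calibrated model:

```python
import math

def factor_of_safety(c_kpa, phi_deg, gamma_kn_m3, z_m, beta_deg, u_kpa):
    """Infinite-slope factor of safety with pore pressure u at the slip
    surface (classical form, for illustration):
      FS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi'))
           / (gamma*z*sin(beta)*cos(beta))
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma_kn_m3 * z_m * math.cos(beta) ** 2       # normal stress, kPa
    tau = gamma_kn_m3 * z_m * math.sin(beta) * math.cos(beta)  # shear stress, kPa
    return (c_kpa + (sigma_n - u_kpa) * math.tan(phi)) / tau

# hypothetical slope: stable before shaking, unstable once cyclic
# pore pressure builds up at the slip surface
fs_dry = factor_of_safety(5.0, 30.0, 18.0, 2.0, 35.0, u_kpa=0.0)
fs_shaken = factor_of_safety(5.0, 30.0, 18.0, 2.0, 35.0, u_kpa=10.0)
```

The drop of FS below 1.0 with increasing u mirrors the mechanism the model quantifies during shaking.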

  12. The LHCb trigger

    International Nuclear Information System (INIS)

    Korolko, I.

    1998-01-01

    This paper describes progress in the development of the LHCb trigger system since the letter of intent. The trigger philosophy has significantly changed, resulting in an increase of trigger efficiency for signal B events. It is proposed to implement a level-1 vertex topology trigger in specialised hardware. (orig.)

  13. A proposed Drift Tubes-seeded muon track trigger for the CMS experiment at the High Luminosity-LHC

    CERN Document Server

    AUTHOR|(CDS)2070813; Lazzizzera, Ignazio; Vanini, Sara; Zotto, Pierluigi

    2016-01-01

    The LHC program at 13 and 14 TeV, after the observation of the candidate SM Higgs boson, will help clarify future subjects of study and shape the needed tools. Any upgrade of the LHC experiments for unprecedented luminosities, such as those of the High Luminosity-LHC, must then maintain the acceptance on electroweak processes that can lead to a detailed study of the properties of the candidate Higgs boson. The acceptance of the key lepton, photon and hadron triggers should be kept such that the overall physics acceptance, in particular for low-mass scale processes, can be the same as the one the experiments featured in 2012. In such a scenario, a new approach to early trigger implementation is needed. One of the major steps will be the inclusion of high-granularity tracking sub-detectors, such as the CMS Silicon Tracker, in taking the early trigger decision. This contribution can be crucial in several tasks, including the confirmation of triggers in other subsystems, and the improvement of the on-line momentum mea...

  14. The challenge of cumulative impacts

    Energy Technology Data Exchange (ETDEWEB)

    Masden, Elisabeth

    2011-07-01

    Full text: As governments pledge to combat climate change, wind turbines are becoming a common feature of terrestrial and marine environments. Although wind power is a renewable energy source and a means of reducing carbon emissions, there is a need to ensure that the wind farms themselves do not damage the environment. There is particular concern over the impacts of wind farms on bird populations, and with increasing numbers of wind farm proposals, the concern focuses on cumulative impacts. Individually, a wind farm, or indeed any activity/action, may have minor effects on the environment, but collectively these may be significant, potentially greater than the sum of the individual parts acting alone. Cumulative impact assessment is a legislative requirement of environmental impact assessment, but such assessments are rarely adequate, restricting the acquisition of basic knowledge about the cumulative impacts of wind farms on bird populations. Reasons for this are numerous, but a recurring theme is the lack of clear definitions and guidance on how to perform cumulative assessments. Here we present a conceptual framework, with illustrative examples, to demonstrate how the framework can be used to improve the planning and execution of cumulative impact assessments. The core concept is that explicit definitions of impacts, actions and scales of assessment are required to reduce uncertainty in the process of assessment and to improve communication between stakeholders. Only when it is clear what has been included within a cumulative assessment is it possible to make comparisons between developments. Our framework requires improved legislative guidance on the actions to include in assessments, and advice on the appropriate baselines against which to assess impacts. Cumulative impacts are currently considered on restricted scales (spatial and temporal) relating to individual development assessments. We propose that benefits would be gained from elevating cumulative

  15. Cumulative Poisson Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines the cumulative Poisson distribution, used to evaluate the cumulative distribution function (cdf) for gamma distributions with integer shape parameters and the cdf for chi-squared ($\chi^2$) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
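A minimal sketch of what CUMPOIS computes — the cumulative Poisson sum, with successive terms built multiplicatively rather than via explicit factorials, which is one simple way to limit overflow/underflow. This is an assumption-laden reconstruction for illustration, not NASA's C source:

```python
import math

def cumpois(k, lam):
    """P(X <= k) for X ~ Poisson(lam).  Each term lam^i * exp(-lam) / i!
    is obtained from the previous one by multiplying by lam/i, so no
    factorial or large power is ever formed."""
    term = math.exp(-lam)   # i = 0 term
    total = term
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

p = cumpois(3, 2.0)  # P(X <= 3) for a Poisson with mean 2
```

The gamma-cdf identity mentioned in the abstract follows because, for integer shape a, P(Gamma(a) <= lam) = 1 - P(Poisson(lam) <= a-1).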

  16. Activation of shared identities on the Uruguayan and Brazilian frontier triggered by the construction of Uruguay as a consumption proposal

    Directory of Open Access Journals (Sweden)

    Roberta Brandalise

    2016-12-01

    Full Text Available (article in Portuguese, with English and Spanish abstracts; English abstract follows.) We studied the participation of Brazilian journalism in the articulation of social representations and cultural identities on the Brazilian-Uruguayan border, based on the news narratives about Uruguay that frontier Brazilians and Uruguayans appropriated, made use of, and considered relevant in their everyday lives. We conducted the case study from the perspective of British and Latin American cultural studies, using semi-structured interviews with a sample of 12 frontier residents. Uruguay was characterized in journalistic discourse as a consumption proposal: as a tourist destination, or in terms of its products and natural resources. The news narratives helped to reinforce the mutual identification of Uruguayans and Brazilians with respect to regional belonging and lifestyle. KEYWORDS: Communication; Consumption; Frontiers; Uruguay; Brazil.

  17. Trigger finger

    Science.gov (United States)

    Alternative names: Trigger digit; Trigger finger release; Locked finger; Digital flexor tenosynovitis. After surgery, call your provider if you have redness or swelling around the cut or hand, yellow or green drainage from the cut, hand pain or discomfort, or fever. If your trigger finger returns, call your surgeon. You may need another surgery.

  18. The methodological proposal of photography as a discharger of the trigger of memory: application to Telêmaco Borba's history (1950-1969)

    Directory of Open Access Journals (Sweden)

    Juliana de Oliveira Teixeira

    2014-07-01

    Full Text Available This work has the objectives of testing and systematizing the methodological proposal of photography as a discharger of the trigger of memory, a technique that combines photographic images with oral history. The method, developed by the group Communication and History of Universidade Estadual de Londrina, was formalized in the dissertation of Maria Luisa Hoffmann (2010) and, since then, has been applied to cities with recent histories. To carry out a relevant test in this dissertation, Telêmaco Borba (PR) was chosen as the field of study, and the precepts of empiricism in communication were respected, following the ideas of Maria Immacolata Vassallo Lopes (2010) and Luiz Claudio Martino (2010). The application of the method was also theoretically grounded in the works of Jacques Le Goff (2003), Ecléa Bosi (2009) and Boris Kossoy (2009). During the empirical process, nine pioneers of Telêmaco Borba were interviewed under the methodological proposal, using 17 old photographs of the city. Roughly, the results of the test show that the technique, when applied with the epistemological criteria of science, becomes an efficient empirical tool, capable of bringing new data and information to studies of memory and to the history of the studied cities.

  19. Divergent Cumulative Cultural Evolution

    OpenAIRE

    Marriott, Chris; Chebib, Jobran

    2016-01-01

    Divergent cumulative cultural evolution occurs when the cultural evolutionary trajectory diverges from the biological evolutionary trajectory. We consider the conditions under which divergent cumulative cultural evolution can occur. We hypothesize that two conditions are necessary: first, that genetic and cultural information are stored separately in the agent; second, that cultural information must be transferred horizontally between agents of different generations. We implement a model with these ...

  20. About the cumulants of periodic signals

    Science.gov (United States)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies cumulants of time series. These functions originate in probability theory but are commonly used as features of deterministic signals; their classical properties are examined in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
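The additivity point can be illustrated numerically. Treating time samples of a periodic signal as an empirical distribution, additivity of the second cumulant (variance) holds for phase-orthogonal components but fails for dependent ones. This is a hand-rolled illustration of the phenomenon, not the hypothesis formulated in the paper:

```python
import math

def kappa2(xs):
    """Second cumulant (variance) of a signal treated as an empirical
    distribution over its time samples: kappa2 = m2 - m1^2."""
    m1 = sum(xs) / len(xs)
    m2 = sum(x * x for x in xs) / len(xs)
    return m2 - m1 * m1

N = 1000
t = [2 * math.pi * i / N for i in range(N)]
x = [math.sin(u) for u in t]
y_inphase = [math.sin(u) for u in t]   # fully dependent on x
y_quad = [math.cos(u) for u in t]      # orthogonal (quadrature) to x

lhs_dep = kappa2([a + b for a, b in zip(x, y_inphase)])   # var(2 sin) = 2.0
lhs_orth = kappa2([a + b for a, b in zip(x, y_quad)])     # var(sin + cos) = 1.0
rhs = kappa2(x) + kappa2(y_quad)                          # 0.5 + 0.5 = 1.0
```

For the orthogonal pair the cumulant is additive (lhs_orth == rhs) even though the signals are deterministic, not independent; for the in-phase pair additivity fails by a factor of two.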


  2. Cumulative cultural learning: Development and diversity.

    Science.gov (United States)

    Legare, Cristine H

    2017-07-24

    The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children's learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission-the cornerstone of human cultural diversity.

  3. Lessons from (triggered) tremor

    Science.gov (United States)

    Gomberg, Joan

    2010-01-01

    I test a “clock-advance” model that implies triggered tremor is ambient tremor that occurs at a sped-up rate as a result of loading from passing seismic waves. This proposed model predicts that triggering probability is proportional to the product of the ambient tremor rate and a function describing the efficacy of the triggering wave to initiate a tremor event. Using data mostly from Cascadia, I have compared qualitatively a suite of teleseismic waves that did and did not trigger tremor with ambient tremor rates. Many of the observations are consistent with the model if the efficacy of the triggering wave depends on wave amplitude. One triggered tremor observation clearly violates the clock-advance model. The model prediction that larger triggering waves result in larger triggered tremor signals also appears inconsistent with the measurements. I conclude that the tremor source process is a more complex system than that described by the clock-advance model predictions tested. Results of this and previous studies also demonstrate that (1) conditions suitable for tremor generation exist in many tectonic environments, but, within each, only occur at particular spots whose locations change with time; (2) any fluid flow must be restricted to less than a meter; (3) the degree to which delayed failure and secondary triggering occurs is likely insignificant; and (4) both shear and dilatational deformations may trigger tremor. Triggered and ambient tremor rates correlate more strongly with stress than stressing rate, suggesting tremor sources result from time-dependent weakening processes rather than simple Coulomb failure.
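The clock-advance prediction — triggering probability proportional to the ambient tremor rate times an efficacy function of the passing wave — can be written schematically. The saturating efficacy form and the constants below are illustrative assumptions, not Gomberg's parameterization:

```python
def trigger_probability(ambient_rate, wave_amplitude, k=1.0, a0=0.5):
    """Clock-advance sketch: P ~ k * (ambient tremor rate) * f(amplitude),
    with an assumed saturating efficacy f(A) = A / (A + a0) in [0, 1),
    capped at 1 so the result stays a probability."""
    efficacy = wave_amplitude / (wave_amplitude + a0)
    return min(1.0, k * ambient_rate * efficacy)

p_big = trigger_probability(0.2, 1.0)    # large-amplitude teleseism
p_small = trigger_probability(0.2, 0.1)  # small-amplitude teleseism
```

Under this form, larger waves and higher ambient rates both raise the triggering probability, which is the qualitative behavior tested against the Cascadia observations.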

  4. Trigger Finger

    Science.gov (United States)

    A trigger finger can become stuck in a bent position. People whose work or hobbies require repetitive gripping actions are at higher risk. Factors that increase the risk of developing trigger finger include repeated gripping: occupations and hobbies that involve repetitive hand use and prolonged gripping.

  5. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating, if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
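The core quantity CUMBIN computes — the reliability of a k-out-of-n system of independent components with common reliability p — is a short cumulative binomial sum. A Python sketch of the calculation, not the original C program:

```python
from math import comb

def kofn_reliability(k, n, p):
    """P(at least k of n independent components operate), each with
    reliability p: the upper tail of a Binomial(n, p) distribution."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# classic 2-out-of-3 majority system with component reliability 0.9:
# R = 3 * 0.9^2 * 0.1 + 0.9^3 = 0.972
r = kofn_reliability(2, 3, 0.9)
```

The same sum evaluates the incomplete beta function I_p(k, n-k+1) for integer arguments, which is the identity the abstract alludes to.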

  6. Cumulation of light nuclei

    International Nuclear Information System (INIS)

    Baldin, A.M.; Bondarev, V.K.; Golovanov, L.B.

    1977-01-01

    Limit fragmentation of light nuclei (deuterium, helium) bombarded with 8.6 GeV/c protons was investigated. Fragments (pions, protons and deuterons) were detected within the emission angle range 50-150 deg with respect to the primary protons and within the momentum range 150-180 MeV/c. By the kinematics of a collision of a primary proton with a target at rest, the observed fragments correspond to a target mass of up to 3 GeV. Thus, the data obtained correspond to cumulation up to the third order

  7. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
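The crossing point CROSSER solves for — the component reliability p at which the k-out-of-n system reliability equals p itself — can be found by simple bisection. CROSSER itself uses Newton's method; this sketch trades speed for brevity and assumes 1 < k < n, so that a single interior crossing exists:

```python
from math import comb

def system_reliability(k, n, p):
    """Reliability of a k-out-of-n system with common component reliability p."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def crossing_point(k, n, tol=1e-12):
    """Bisect g(p) = R_system(p) - p on (0, 1).  For 1 < k < n, g < 0
    near 0 and g > 0 near 1, so the sign change brackets the crossing."""
    lo, hi = 1e-9, 1.0 - 1e-9
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if system_reliability(k, n, mid) - mid < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p_star = crossing_point(2, 3)  # 2-out-of-3 majority system crosses at p = 0.5
```

Above the crossing the redundant system is more reliable than a single component; below it, less — which is why the crossing point is of design interest.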

  8. Multiparty correlation measure based on the cumulant

    International Nuclear Information System (INIS)

    Zhou, D. L.; Zeng, B.; Xu, Z.; You, L.

    2006-01-01

    We propose a genuine multiparty correlation measure for a multiparty quantum system as the trace norm of the cumulant of the state. The legitimacy of our multiparty correlation measure is explicitly demonstrated by proving that it satisfies the five basic conditions required of a correlation measure. As an application, we construct an efficient algorithm for the calculation of our measure for all stabilizer states
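For two parties, a cumulant-based correlation measure of this kind reduces to the trace norm of rho_AB - rho_A ⊗ rho_B. A small numpy sketch, with dimensions and states chosen for illustration (this is the simplest two-party instance, not the general multiparty construction of the paper):

```python
import numpy as np

def two_party_cumulant_norm(rho_ab, d_a, d_b):
    """Trace norm of the two-party cumulant rho_AB - rho_A (x) rho_B."""
    r = rho_ab.reshape(d_a, d_b, d_a, d_b)
    rho_a = np.einsum('ijkj->ik', r)   # partial trace over B
    rho_b = np.einsum('ijil->jl', r)   # partial trace over A
    cum = rho_ab - np.kron(rho_a, rho_b)
    # trace norm = sum of singular values
    return float(np.sum(np.linalg.svd(cum, compute_uv=False)))

# maximally entangled Bell state (|00> + |11>)/sqrt(2): strongly correlated
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
bell = np.outer(psi, psi)
corr = two_party_cumulant_norm(bell, 2, 2)

# product state: the cumulant vanishes, so the measure is zero
prod = np.kron(np.diag([1.0, 0.0]), np.diag([0.5, 0.5]))
zero = two_party_cumulant_norm(prod, 2, 2)
```

The measure is zero exactly when the state factorizes, which is the defining property a correlation measure must satisfy.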

  9. Triggering Artefacts

    DEFF Research Database (Denmark)

    Mogensen, Preben Holst; Robinson, Mike

    1995-01-01

    and adapting them to specific situations need not be ad hoc. Triggering artefacts are a way of systematically challenging both designers' preunderstandings and the conservatism of work practice. Experiences from the Great Belt tunnel and bridge project are used to illustrate how triggering artefacts change

  10. Cumulative environmental effects. Summary

    International Nuclear Information System (INIS)

    2012-01-01

    This report presents a compilation of knowledge about the state of the environment and human activity in the Norwegian part of the North Sea and Skagerrak. The report gives an overview of pressures and impacts on the environment from normal activity and in the event of accidents. This is used to assess the cumulative environmental effects, which factors have most impact and where the impacts are greatest, and to indicate which problems are expected to be most serious in the future. The report is intended to provide relevant information that can be used in the management of the marine area in the future. It also provides input for the identification of environmental targets and management measures for the North Sea and Skagerrak.(Author)


  12. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
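NEWTONP's central task — inverting the k-out-of-n reliability for the component probability p — can be sketched with Newton's method, using the closed-form derivative of the cumulative binomial tail. This is a reconstruction under stated assumptions, not the original C code:

```python
from math import comb

def kofn_reliability(k, n, p):
    """P(at least k of n independent components operate), reliability p each."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def solve_p(k, n, v, p0=0.5, tol=1e-12, max_iter=100):
    """Newton iteration for p with R(p) = v.  The telescoping sum gives
    the closed form dR/dp = k * C(n, k) * p^(k-1) * (1-p)^(n-k)."""
    p = p0
    for _ in range(max_iter):
        f = kofn_reliability(k, n, p) - v
        df = k * comb(n, k) * p**(k - 1) * (1 - p)**(n - k)
        step = f / df
        p -= step
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # keep the iterate inside (0, 1)
        if abs(step) < tol:
            break
    return p

# invert the 2-out-of-3 example: R(0.9) = 0.972, so solve_p recovers 0.9
p_req = solve_p(2, 3, 0.972)
```

Because R(p) is the incomplete beta function I_p(k, n-k+1), the same loop inverts incomplete beta percentiles for positive integer parameters, as the abstract describes.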

  13. A Framework for Treating Cumulative Trauma with Art Therapy

    Science.gov (United States)

    Naff, Kristina

    2014-01-01

    Cumulative trauma is relatively undocumented in art therapy practice, although there is growing evidence that art therapy provides distinct benefits for resolving various traumas. This qualitative study proposes an art therapy treatment framework for cumulative trauma derived from semi-structured interviews with three art therapists and artistic…

  14. Online Scheduling in Manufacturing A Cumulative Delay Approach

    CERN Document Server

    Suwa, Haruhiko

    2013-01-01

    Online scheduling is recognized as the crucial decision-making process of production control in the phase of "being in production" according to the released shop floor schedule. Online scheduling can also be considered one of the key enablers of prompt capable-to-promise and available-to-promise to customers, along with reduced production lead times, in today's globalized competitive markets. Online Scheduling in Manufacturing introduces new approaches to online scheduling based on the concept of cumulative delay. The cumulative delay is regarded as consolidated information about the uncertainties of a dynamic manufacturing environment and can be collected constantly, without much effort, at any point in time during schedule execution. In this approach, the cumulative delay of the schedule plays the important role of a criterion for deciding whether or not a schedule revision is carried out. The cumulative delay approach to triggering schedule revisions has the following capabilities for the ...
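The book's central idea — accumulate observed delays during execution and revise the schedule only when the total crosses a threshold — can be sketched as follows. The function names and the simple linear delay accounting are assumptions for illustration, not the criteria developed in the book:

```python
def should_revise(observed_completions, planned_completions, threshold):
    """Cumulative-delay policy sketch: sum the lateness of finished
    operations (early finishes contribute nothing) and trigger a
    schedule revision only when the total exceeds the threshold."""
    cumulative_delay = sum(
        max(0.0, obs - plan)
        for obs, plan in zip(observed_completions, planned_completions)
    )
    return cumulative_delay > threshold, cumulative_delay

# three finished operations: on time, 2 units late, 4 units late
revise, delay = should_revise([10, 22, 37], [10, 20, 33], threshold=5.0)
```

Compared with revising after every disturbance, thresholding on the cumulative delay avoids nervous rescheduling while still reacting once disruptions consolidate.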

  15. Cumulative radiation effect

    International Nuclear Information System (INIS)

    Kirk, J.; Gray, W.M.; Watson, E.R.

    1977-01-01

    In five previous papers, the concept of Cumulative Radiation Effect (CRE) has been presented as a scale of accumulative sub-tolerance radiation damage, with a unique value of the CRE describing a specific level of radiation effect. Simple nomographic and tabular methods for the solution of practical problems in radiotherapy are now described. An essential feature of solving a CRE problem is firstly to present it in a concise and readily appreciated form, and, to do this, nomenclature has been introduced to describe schedules and regimes as compactly as possible. Simple algebraic equations have been derived to describe the CRE achieved by multi-schedule regimes. In these equations, the equivalence conditions existing at the junctions between schedules are not explicit and the equations are based on the CREs of the constituent schedules assessed individually without reference to their context in the regime as a whole. This independent evaluation of CREs for each schedule has resulted in a considerable simplification in the calculation of complex problems. The calculations are further simplified by the use of suitable tables and nomograms, so that the mathematics involved is reduced to simple arithmetical operations which require at the most the use of a slide rule but can be done by hand. The order of procedure in the presentation and calculation of CRE problems can be summarised in an evaluation procedure sheet. The resulting simple methods for solving practical problems of any complexity on the CRE-system are demonstrated by a number of examples. (author)

  16. Cumulative radiation effect

    International Nuclear Information System (INIS)

    Kirk, J.; Cain, O.; Gray, W.M.

    1977-01-01

    Cumulative Radiation Effect (CRE) represents a scale of accumulative sub-tolerance radiation damage, with a unique value of the CRE describing a specific level of radiation effect. Computer calculations have been used to simplify the evaluation of problems associated with the applications of the CRE-system in radiotherapy. In a general appraisal of the applications of computers to the CRE-system, the various problems encountered in clinical radiotherapy have been categorised into those involving the evaluation of a CRE at a point in tissue and those involving the calculation of CRE distributions. As a general guide, the computer techniques adopted at the Glasgow Institute of Radiotherapeutics for the solution of CRE problems are presented, and consist basically of a package of three interactive programs for point CRE calculations and a Fortran program which calculates CRE distributions for iso-effect treatment planning. Many examples are given to demonstrate the applications of these programs, and special emphasis has been laid on the problem of treating a point in tissue with different doses per fraction on alternate treatment days. The wide range of possible clinical applications of the CRE-system has been outlined and described under the categories of routine clinical applications, retrospective and prospective surveys of patient treatment, and experimental and theoretical research. Some of these applications such as the results of surveys and studies of time optimisation of treatment schedules could have far-reaching consequences and lead to significant improvements in treatment and cure rates with the minimum damage to normal tissue. (author)

  17. Secant cumulants and toric geometry

    NARCIS (Netherlands)

    Michalek, M.; Oeding, L.; Zwiernik, P.W.

    2012-01-01

    We study the secant line variety of the Segre product of projective spaces using special cumulant coordinates adapted for secant varieties. We show that the secant variety is covered by open normal toric varieties. We prove that in cumulant coordinates its ideal is generated by binomial quadrics. We

  18. Trigger circuit

    International Nuclear Information System (INIS)

    Verity, P.R.; Chaplain, M.D.; Turner, G.D.J.

    1984-01-01

    A monostable trigger circuit comprises transistors TR2 and TR3 arranged with their collectors and bases interconnected. The collector of transistor TR2 is connected to the base of transistor TR3 via a capacitor C2, the main current path of a grounded-base transistor TR1, and resistive means R2, R3. The collector of transistor TR3 is connected to the base of transistor TR2 via resistive means R6, R7. In the stable state all the transistors are OFF, the capacitor C2 is charged, and the output is LOW. A positive pulse input to the base of TR2 switches it ON, which in turn lowers the voltage at points A and B and so switches TR1 ON so that C2 can discharge via R2, R3, which in turn switches TR3 ON, making the output HIGH. Thus all three transistors are latched ON. When C2 has discharged sufficiently, TR1 switches OFF, followed by TR3 (making the output LOW again) and TR2. The components C1, C3 and R4 serve to reduce noise, and the diode D1 is optional. (author)

  19. Multi-objective optimization of MOSFETs channel widths and supply voltage in the proposed dual edge-triggered static D flip-flop with minimum average power and delay by using fuzzy non-dominated sorting genetic algorithm-II.

    Science.gov (United States)

    Keivanian, Farshid; Mehrshad, Nasser; Bijari, Abolfazl

    2016-01-01

    The D flip-flop, as a digital circuit, is used as a timing element in many sophisticated circuits, so optimum performance with the lowest power consumption and an acceptable delay time is a critical issue in electronic design. The layout of the newly proposed Dual-Edge Triggered Static D Flip-Flop circuit is defined as a multi-objective optimization problem. For this, an optimum fuzzy inference system with fuzzy rules is proposed to enhance the performance and convergence of the non-dominated sorting Genetic Algorithm-II (NSGA-II) by adaptive control of its exploration and exploitation parameters. Using the proposed Fuzzy NSGA-II algorithm, more optimal values for the MOSFET channel widths and the supply voltage are discovered in the search space than with ordinary NSGA variants. Furthermore, the design parameters (NMOS and PMOS channel widths and supply voltage) and the performance parameters (average power consumption and propagation delay time) are linked; the required mathematical background is presented in this study. The optimum values for the MOSFET channel widths and supply voltage are discovered, and based on them the power-delay product (PDP) is 6.32 pJ at a 125 MHz clock frequency, L = 0.18 µm, and T = 27 °C.
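The NSGA-II machinery referenced in this abstract rests on fast non-dominated sorting of candidate designs by their objective vectors (here, average power and propagation delay, both minimized). A minimal sketch of that sorting step, independent of the fuzzy-control extension the paper proposes:

```python
def non_dominated_fronts(points):
    """Fast non-dominated sorting for minimisation problems.

    `points` is a list of objective tuples, e.g. (avg_power, delay).
    Returns a list of fronts, each front a list of indices into `points`;
    front 0 is the Pareto-optimal set.
    """
    n = len(points)
    dominated_by = [set() for _ in range(n)]  # indices that i dominates
    dom_count = [0] * n                       # how many points dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            le = all(a <= b for a, b in zip(points[i], points[j]))
            lt = any(a < b for a, b in zip(points[i], points[j]))
            if le and lt:                     # i dominates j
                dominated_by[i].add(j)
            ge = all(b <= a for a, b in zip(points[i], points[j]))
            gt = any(b < a for a, b in zip(points[i], points[j]))
            if ge and gt:                     # j dominates i
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        k += 1
        fronts.append(nxt)
    fronts.pop()  # drop the trailing empty front
    return fronts
```

In an NSGA-II loop, selection pressure then favours lower-numbered fronts, with a crowding-distance tie-break inside each front; the paper's contribution is a fuzzy controller on top of this, not a change to the sorting itself.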

  20. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Full Text Available Cumulative risk (CR models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that number of risk factors would predict number of adverse outcomes (cumulative outcome; CO. We also predicted that early CR (assessed at age 5/6 explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20. The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted aforementioned outcomes at age 25/26 in every instance. Furthermore, number of risk factors was positively associated with number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  1. Evolutionary neural network modeling for software cumulative failure time prediction

    International Nuclear Information System (INIS)

    Tian Liang; Noore, Afzel

    2005-01-01

    An evolutionary neural network modeling approach for software cumulative failure time prediction, based on a multiple-delayed-input single-output architecture, is proposed. A genetic algorithm is used to globally optimize the number of delayed input neurons and the number of neurons in the hidden layer of the neural network architecture. A modification of the Levenberg-Marquardt algorithm with Bayesian regularization is used to improve the ability to predict software cumulative failure time. The performance of the proposed approach has been compared using real-time control and flight dynamic application data sets. Numerical results show that both the goodness-of-fit and the next-step predictability of the proposed approach are more accurate in predicting software cumulative failure time than existing approaches
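The multiple-delayed-input single-output architecture mentioned above means the network is trained to map a window of past cumulative failure times to the next one. A minimal sketch of how such training pairs are built (the window length of 3 is an arbitrary illustration; the paper optimizes it with a genetic algorithm):

```python
def delayed_input_samples(failure_times, n_delays=3):
    """Build (t[i-n], ..., t[i-1]) -> t[i] training pairs from a
    cumulative failure time series for a delayed-input network."""
    inputs, targets = [], []
    for i in range(n_delays, len(failure_times)):
        inputs.append(list(failure_times[i - n_delays:i]))
        targets.append(failure_times[i])
    return inputs, targets
```

Each input row then feeds the network's delayed input neurons, and the single output neuron is trained against the corresponding target.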

  2. The STAR Level-3 trigger system

    International Nuclear Information System (INIS)

    Adler, C.; Berger, J.; Demello, M.; Dietel, T.; Flierl, D.; Landgraf, J.; Lange, J.S.; LeVine, M.J.; Ljubicic, A.; Nelson, J.; Roehrich, D.; Stock, R.; Struck, C.; Yepes, P.

    2003-01-01

    The STAR Level-3 trigger issues a trigger decision upon a complete online reconstruction of Au+Au collisions at relativistic heavy ion collider energies. Central interactions are processed at rates of up to 50 s⁻¹, including a simple analysis of physics observables. The setup of the processor farm and the event reconstruction, as well as operational experience and the proposed trigger algorithms, are described

  3. The Algebra of the Cumulative Percent Operation.

    Science.gov (United States)

    Berry, Andrew J.

    2002-01-01

    Discusses how to help students avoid some pervasive reasoning errors in solving cumulative percent problems. Discusses the meaning of "a% + b%," the additive inverse of "a%," and other useful applications. Emphasizes the operational aspect of the cumulative percent concept. (KHR)

  4. Adaptive strategies for cumulative cultural learning.

    Science.gov (United States)

    Ehn, Micael; Laland, Kevin

    2012-05-21

    The demographic and ecological success of our species is frequently attributed to our capacity for cumulative culture. However, it is not yet known how humans combine social and asocial learning to generate effective strategies for learning in a cumulative cultural context. Here we explore how cumulative culture influences the relative merits of various pure and conditional learning strategies, including pure asocial and social learning, critical social learning, conditional social learning and individual refiner strategies. We replicate Rogers' paradox in the cumulative setting. However, our analysis suggests that strategies that resolved Rogers' paradox in a non-cumulative setting may not necessarily evolve in a cumulative setting; thus different strategies will optimize cumulative and non-cumulative cultural learning. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Cumulative Environmental Management Association : Wood Buffalo Region

    International Nuclear Information System (INIS)

    Friesen, B.

    2001-01-01

    The recently announced oil sands development of the Wood Buffalo Region in Alberta was the focus of this power point presentation. Both mining and in situ development is expected to total $26 billion and 2.6 million barrels per day of bitumen production. This paper described the economic, social and environmental challenges facing the resource development of this region. In addition to the proposed oil sands projects, this region will accommodate the needs of conventional oil and gas production, forestry, building of pipelines and power lines, municipal development, recreation, tourism, mining exploration and open cast mining. The Cumulative Environmental Management Association (CEMA) was inaugurated as a non-profit association in April 2000, and includes 41 members from all sectors. Its major role is to ensure a sustainable ecosystem and to avoid any cumulative impacts on wildlife. Other work underway includes the study of soil and plant species diversity, and the effects of air emissions on human health, wildlife and vegetation. The bioaccumulation of heavy metals and their impacts on surface water and fish is also under consideration to ensure the quality and quantity of surface water and ground water. 3 figs

  6. The Central Trigger Processor (CTP)

    CERN Multimedia

    Franchini, Matteo

    2016-01-01

    The Central Trigger Processor (CTP) receives trigger information from the calorimeter and muon trigger processors, as well as from other sources of trigger. It makes the Level-1 decision (L1A) based on a trigger menu.

  7. 32 CFR 651.16 - Cumulative impacts.

    Science.gov (United States)

    2010-07-01

    § 651.16 Cumulative impacts. (a) NEPA analyses must assess cumulative effects, which are the impact on the environment resulting from the incremental impact of the action when added to other past, present...

  8. Probabilistic clustering of rainfall condition for landslide triggering

    Science.gov (United States)

    Rossi, Mauro; Luciani, Silvia; Cesare Mondini, Alessandro; Kirschbaum, Dalia; Valigi, Daniela; Guzzetti, Fausto

    2013-04-01

    Landslides are widespread natural and man-made phenomena. They are triggered by earthquakes, rapid snow melting and human activities, but mostly by typhoons and intense or prolonged rainfall; in Italy they are mostly triggered by intense precipitation. The prediction of rainfall-triggered landslides over large areas is commonly based on empirical models. Empirical landslide rainfall thresholds are used to identify rainfall conditions for possible landslide initiation. It is common practice to define rainfall thresholds by assuming a power-law lower boundary in the rainfall intensity-duration or cumulative rainfall-duration space, above which landslides can occur. The boundary is defined by considering the rainfall conditions associated with landslide phenomena using heuristic approaches, and does not consider rainfall events that did not cause landslides. Here we present a new, fully automatic method to estimate the probability of landslide occurrence associated with rainfall conditions characterized by measures of intensity or cumulative rainfall and rainfall duration. The method splits the rainfall events of the past into two groups, one of events causing landslides and its complement, and estimates their probabilistic distributions. The probabilistic membership of a new event in one of the two clusters is then estimated. The method does not assume any threshold model a priori, but simply exploits the empirical distribution of rainfall events. The approach was applied in the Umbria region, Central Italy, where a catalogue of landslide timings was obtained through a search of chronicles, blogs and other sources of information for the period 2002-2012. The approach was tested using rain gauge measures and satellite rainfall estimates (NASA TRMM-v6), in both cases allowing the identification of the rainfall conditions triggering landslides in the region.
Compared to the other existing threshold definition methods, the proposed
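A one-dimensional sketch of the two-population idea described in this abstract: fit one distribution to a rainfall measure (say, log cumulative rainfall) for events that caused landslides and one for events that did not, then report the posterior membership probability of a new event. The Gaussian choice, the parameters and the equal prior below are illustrative assumptions, not the paper's actual fit.

```python
import math


def gaussian_pdf(x, mean, std):
    """Density of a 1-D normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))


def landslide_probability(x, trig, non_trig, prior_trig=0.5):
    """Posterior probability that rainfall value x belongs to the
    landslide-triggering population, given (mean, std) of the fitted
    triggering and non-triggering distributions and a prior."""
    mu_t, sd_t = trig
    mu_n, sd_n = non_trig
    p_t = prior_trig * gaussian_pdf(x, mu_t, sd_t)
    p_n = (1 - prior_trig) * gaussian_pdf(x, mu_n, sd_n)
    return p_t / (p_t + p_n)
```

Unlike a hard power-law threshold, the output is a probability that varies smoothly with the rainfall measure, which is the point of the probabilistic-clustering approach.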

  9. Event-triggered attitude control of spacecraft

    Science.gov (United States)

    Wu, Baolin; Shen, Qiang; Cao, Xibin

    2018-02-01

    The problem of spacecraft attitude stabilization with limited communication and external disturbances is investigated based on an event-triggered control scheme. In the proposed scheme, information on attitude and control torque needs to be transmitted only at discrete triggering times, when a defined measurement error exceeds a state-dependent threshold. The proposed control scheme not only guarantees that spacecraft attitude control errors converge toward a small invariant set containing the origin, but also ensures that there is no accumulation of triggering instants. The performance of the proposed control scheme is demonstrated through numerical simulation.
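The scheme above transmits only when a measurement error crosses a state-dependent threshold. A scalar toy sketch of that mechanism (the dynamics, gain and threshold form below are invented for illustration; the paper treats full spacecraft attitude dynamics) shows both convergence and a transmission count far below the number of simulation steps:

```python
def simulate_event_triggered(x0=1.0, k=2.0, sigma=0.1, dt=0.001, t_end=5.0):
    """Scalar sketch of event-triggered feedback.

    Plant: dx/dt = -k * u, with u held at the last transmitted state.
    Trigger rule: retransmit when |x - x_held| > sigma*|x| + eps, where
    the constant eps > 0 keeps triggering instants from accumulating
    (no Zeno behaviour). All values here are illustrative.
    """
    eps = 1e-3
    x, x_held = x0, x0
    events = 0
    t = 0.0
    while t < t_end:
        if abs(x - x_held) > sigma * abs(x) + eps:
            x_held = x          # triggered: transmit the current state
            events += 1
        x += -k * x_held * dt   # plant update with the held control
        t += dt
    return x, events
```

The state settles into a small band around the origin while only a few dozen transmissions occur over thousands of integration steps, which is the communication saving the abstract claims.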

  10. A paradox of cumulative culture.

    Science.gov (United States)

    Kobayashi, Yutaka; Wakano, Joe Yuichiro; Ohtsuki, Hisashi

    2015-08-21

    Culture can grow cumulatively if socially learnt behaviors are improved by individual learning before being passed on to the next generation. Previous authors showed that this kind of learning strategy is unlikely to be evolutionarily stable in the presence of a trade-off between learning and reproduction. This is because culture is a public good that is freely exploited by any member of the population in their model (cultural social dilemma). In this paper, we investigate the effect of vertical transmission (transmission from parents to offspring), which decreases the publicness of culture, on the evolution of cumulative culture in both infinite and finite population models. In the infinite population model, we confirm that culture accumulates largely as long as transmission is purely vertical. It turns out, however, that introduction of even slight oblique transmission drastically reduces the equilibrium level of culture. Even more surprisingly, if the population size is finite, culture hardly accumulates even under purely vertical transmission. This occurs because stochastic extinction due to random genetic drift prevents a learning strategy from accumulating enough culture. Overall, our theoretical results suggest that introducing vertical transmission alone does not really help solve the cultural social dilemma problem. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. On the mechanism of hadron cumulative production on nucleus

    International Nuclear Information System (INIS)

    Efremov, A.V.

    1976-01-01

    A mechanism of cumulative production of hadrons on nuclei is proposed which is similar to that of high-transverse-momentum hadron production. The cross section obtained describes the main qualitative features of such processes, e.g., the initial energy dependence, the atomic number behaviour, and the dependence on the rest mass of the produced particle and its production angle

  12. Analysis of sensory ratings data with cumulative link models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Brockhoff, Per B.

    2013-01-01

    Examples of categorical rating scales include discrete preference, liking and hedonic rating scales. Data obtained on these scales are often analyzed with normal linear regression methods or with omnibus Pearson chi2 tests. In this paper we propose to use cumulative link models that allow for reg...

  13. Review Document: Full Software Trigger

    CERN Document Server

    Albrecht, J; Raven, G

    2014-01-01

    This document presents a trigger system for the upgraded LHCb detector, scheduled to begin operation in 2020. This document serves as input for the internal review towards the "DAQ, online and trigger TDR". The proposed trigger system is implemented entirely in software. In this document we show that track reconstruction of a similar quality to that available in the offline algorithms can be performed on the full inelastic $pp$-collision rate, without prior event selections implemented in custom hardware and without relying upon a partial event reconstruction. A track finding efficiency of 98.8% relative to offline can be achieved for tracks with $p_T >$ 500 MeV/$c$. The CPU time required for this reconstruction is about 40% of the available budget. Proof-of-principle selections are presented which demonstrate that excellent performance is achievable using an inclusive beauty trigger, in addition to exclusive beauty and charm triggers. Finally, it is shown that exclusive beauty and charm selections that do not intr...

  14. BAT Triggering Performance

    Science.gov (United States)

    McLean, Kassandra M.; Fenimore, E. E.; Palmer, D. M.; BAT Team

    2006-09-01

    The Burst Alert Telescope (BAT) onboard Swift has detected and located about 160 gamma-ray bursts (GRBs) in its first twenty months of operation. BAT employs two triggering systems to find GRBs: image triggering, which looks for a new point source in the field of view, and rate triggering, which looks for a significant increase in the observed counts. The image triggering system looks at 1 minute, 5 minute, and full-pointing accumulations of counts in the detector plane in the energy range of 15-50 keV, with about 50 evaluations per pointing (about 40 minutes). The rate triggering system looks through 13 different time scales (from 4 ms to 32 s), 4 overlapping energy bins (covering 15-350 keV), 9 regions of the detector plane (from the full plane to individual quarters), and two background sampling models to search for GRBs, performing about 27000 evaluations per second of close to 1000 trigger criteria. Both triggering systems are working very well with the settings chosen before launch and used since BAT was turned on. However, we now have more than a year and a half of data with which to evaluate these triggering systems, apply the lessons learned, and tune them for optimal performance.
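The core of a rate-trigger criterion like those described above is a significance test of the foreground counts against a background estimate. A deliberately simplified sketch (the flight software fits and extrapolates backgrounds per criterion; the Poisson approximation and the threshold value here are invented placeholders):

```python
import math


def rate_trigger(counts, background, n_sigma=6.5):
    """Fire if the foreground counts exceed the background estimate by
    more than n_sigma Poisson standard deviations (sqrt(background)).

    `n_sigma` is an illustrative placeholder, not BAT's actual setting.
    """
    if background <= 0:
        return False  # no usable background estimate for this criterion
    significance = (counts - background) / math.sqrt(background)
    return significance >= n_sigma
```

In the real system a check of this shape is evaluated over every combination of time scale, energy bin, detector region and background model, which is what drives the thousands of evaluations per second.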

  15. Turning stumbling blocks into stepping stones in the analysis of cumulative impacts

    Science.gov (United States)

    Leslie M. Reid

    2004-01-01

    Federal and state legislation, such as the National Environmental Policy Act and the California Environmental Quality Act, require that responsible agency staff consider the cumulative impacts of proposed activities before permits are issued for certain kinds of public or private projects. The Council on Environmental Quality (CEQ 1997) defined a cumulative impact as...

  16. Stay away from asthma triggers

    Science.gov (United States)

    Asthma triggers - stay away from; Asthma triggers - avoiding; Reactive airway disease - triggers; Bronchial asthma - triggers ... clothes. They should leave the coat outside or away from your child. Ask people who work at ...

  17. Cumulative trauma disorders: A review.

    Science.gov (United States)

    Iqbal, Zaheen A; Alghadir, Ahmad H

    2017-08-03

    Cumulative trauma disorder (CTD) is a term for various injuries of the musculoskeletal and nervous systems that are caused by repetitive tasks, forceful exertions, vibrations, mechanical compression or sustained postures. Although there are many studies citing the incidence of CTDs, there are fewer articles about their etiology, pathology and management. The aim of our study was to discuss the etiology, pathogenesis, prevention and management of CTDs. A literature search was performed using various electronic databases. The search was limited to English-language articles on randomized clinical trials, cohort studies and systematic reviews of CTDs. A total of 180 relevant papers published since 1959 were identified. Of these, 125 papers reported on incidence and 50 on conservative treatment. Workplace environment, task repetition with little variability, decreased rest time, and increased expectations are major factors in developing CTDs. Addressing the causes and diagnosing early are the best ways to decrease incidence and severity. For effective management of CTDs, treatment should be divided into Primordial, Primary, Secondary and Tertiary prevention.

  18. Complete cumulative index (1963-1983)

    International Nuclear Information System (INIS)

    1983-01-01

    This complete cumulative index covers all regular and special issues and supplements published by Atomic Energy Review (AER) during its lifetime (1963-1983). The complete cumulative index consists of six Indexes: the Index of Abstracts, the Subject Index, the Title Index, the Author Index, the Country Index and the Table of Elements Index. The complete cumulative index supersedes the Cumulative Indexes for Volumes 1-7: 1963-1969 (1970), and for Volumes 1-10: 1963-1972 (1972); this Index also finalizes Atomic Energy Review, the publication of which has recently been terminated by the IAEA

  19. Cumulative carbon as a policy framework for achieving climate stabilization

    Science.gov (United States)

    Matthews, H. Damon; Solomon, Susan; Pierrehumbert, Raymond

    2012-01-01

    The primary objective of the United Nations Framework Convention on Climate Change is to stabilize greenhouse gas concentrations at a level that will avoid dangerous climate impacts. However, greenhouse gas concentration stabilization is an awkward framework within which to assess dangerous climate change on account of the significant lag between a given concentration level and the eventual equilibrium temperature change. By contrast, recent research has shown that global temperature change can be well described by a given cumulative carbon emissions budget. Here, we propose that cumulative carbon emissions represent an alternative framework that is applicable both as a tool for climate mitigation as well as for the assessment of potential climate impacts. We show first that both atmospheric CO2 concentration at a given year and the associated temperature change are generally associated with a unique cumulative carbon emissions budget that is largely independent of the emissions scenario. The rate of global temperature change can therefore be related to first order to the rate of increase of cumulative carbon emissions. However, transient warming over the next century will also be strongly affected by emissions of shorter lived forcing agents such as aerosols and methane. Non-CO2 emissions therefore contribute to uncertainty in the cumulative carbon budget associated with near-term temperature targets, and may suggest the need for a mitigation approach that considers separately short- and long-lived gas emissions. By contrast, long-term temperature change remains primarily associated with total cumulative carbon emissions owing to the much longer atmospheric residence time of CO2 relative to other major climate forcing agents. PMID:22869803
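The proportionality described above between long-term warming and cumulative carbon emissions (the transient climate response to cumulative emissions, TCRE) makes budget arithmetic one-line work. A sketch with an assumed TCRE of 1.6 °C per 1000 PgC, a value within the commonly quoted range but not taken from this paper, and ignoring the non-CO2 forcings the abstract flags as the main source of near-term uncertainty:

```python
def warming_from_cumulative_emissions(cumulative_pgc, tcre=1.6):
    """CO2-induced warming (deg C) as TCRE * cumulative emissions,
    with TCRE in deg C per 1000 PgC (assumed value)."""
    return tcre * cumulative_pgc / 1000.0


def allowable_budget(target_deg_c, tcre=1.6):
    """Invert the linear relation to get a cumulative carbon budget (PgC)
    compatible with a temperature target."""
    return 1000.0 * target_deg_c / tcre
```

This linearity is exactly why the authors argue cumulative carbon is a more usable policy framework than a concentration target: the budget for a temperature goal is scenario-independent to first order.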

  20. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
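NEWTONP's task, finding the common component probability that yields a required system reliability in a k-out-of-n model, amounts to root-finding on the binomial upper tail. The abstract says Newton's method is used; the sketch below substitutes bisection for a shorter, robustly convergent illustration of the same computation:

```python
from math import comb


def binomial_sf(n, k, p):
    """P(at least k successes out of n trials with success probability p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))


def component_reliability(n, k, system_rel, tol=1e-10):
    """Find component reliability p such that a k-out-of-n system has
    reliability system_rel, i.e. binomial_sf(n, k, p) == system_rel.

    binomial_sf is monotonically increasing in p, so bisection on [0, 1]
    converges; NEWTONP itself is described as using Newton's method.
    """
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binomial_sf(n, k, mid) < system_rel:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, a 2-out-of-3 system built from components of reliability 0.9 has system reliability 3(0.9)²(0.1) + (0.9)³ = 0.972, and inverting 0.972 recovers 0.9.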

  1. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest, M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.

  2. Cumulative human impacts on marine predators

    DEFF Research Database (Denmark)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact...

  3. Cumulative Student Loan Debt in Minnesota, 2015

    Science.gov (United States)

    Williams-Wyche, Shaun

    2016-01-01

    To better understand student debt in Minnesota, the Minnesota Office of Higher Education (the Office) gathers information on cumulative student loan debt from Minnesota degree-granting institutions. These data detail the number of students with loans by institution, the cumulative student loan debt incurred at that institution, and the percentage…

  4. Triggering trigeminal neuralgia

    DEFF Research Database (Denmark)

    Di Stefano, Giulia; Maarbjerg, Stine; Nurmikko, Turo

    2018-01-01

    Introduction Although it is widely accepted that facial pain paroxysms triggered by innocuous stimuli constitute a hallmark sign of trigeminal neuralgia, very few studies to date have systematically investigated the role of the triggers involved. In the recently published diagnostic classification...

  5. Triggering the GRANDE array

    International Nuclear Information System (INIS)

    Wilson, C.L.; Bratton, C.B.; Gurr, J.; Kropp, W.; Nelson, M.; Sobel, H.; Svoboda, R.; Yodh, G.; Burnett, T.; Chaloupka, V.; Wilkes, R.J.; Cherry, M.; Ellison, S.B.; Guzik, T.G.; Wefel, J.; Gaidos, J.; Loeffler, F.; Sembroski, G.; Goodman, J.; Haines, T.J.; Kielczewska, D.; Lane, C.; Steinberg, R.; Lieber, M.; Nagle, D.; Potter, M.; Tripp, R.

    1990-01-01

    A brief description of the Gamma Ray And Neutrino Detector Experiment (GRANDE) is presented. The detector elements and electronics are described. The trigger logic for the array is then examined. The triggers for the Gamma Ray and the Neutrino portions of the array are treated separately. (orig.)

  6. Trigger Menu in 2017

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    This document summarises the trigger menu deployed by the ATLAS experiment during 2017 data taking at proton-proton collision centre-of-mass energies of $\\sqrt{s}=13$ TeV and $\\sqrt{s}=5$ TeV at the LHC and describes the improvements with respect to the trigger system and menu used in 2016 data taking.

  7. Causality and headache triggers

    Science.gov (United States)

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  8. The LHCb trigger

    CERN Document Server

    Hernando Morata, Jose Angel

    2006-01-01

    The LHCb experiment relies on an efficient trigger to select a rate up to 2 kHz of events useful for physics analysis from an initial rate of 10 MHz of visible collisions. In this contribution, we describe the different LHCb trigger algorithms and present their expected performance.

  9. The NA27 trigger

    International Nuclear Information System (INIS)

    Bizzarri, R.; Di Capua, E.; Falciano, S.; Iori, M.; Marel, G.; Piredda, G.; Zanello, L.; Haupt, L.; Hellman, S.; Holmgren, S.O.; Johansson, K.E.

    1985-05-01

    We have designed and implemented a minimum bias trigger together with a fiducial volume trigger for the experiment NA27, performed at the CERN SPS. A total of more than 3 million bubble chamber pictures have been taken with a triggered cross section smaller than 75% of the total inelastic cross section. Events containing charm particles were triggered with an efficiency of 98 (+2/−3)%. With the fiducial volume trigger, the probability for a picture to contain an interaction in the visible hydrogen increased from 47.3% to 59.5%, reducing film cost and processing effort by about 20%. The improvement in data taking rate is shown to be negligible. (author)

  10. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in a 116 km² area of northern Tuscany. The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in the 116 km² area, where a database of 1200 landslides was available for the period 2000-2012, and the results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
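A minimal one-variable sketch of the SIGMA idea above: treat rainfall values beyond a chosen multiple of the standard deviation of the historical series as extraordinary. The real model applies this per duration to cumulative rainfall series; the sample data and the n_sigma value below are invented for illustration.

```python
from statistics import mean, stdev


def sigma_threshold(rain_series, n_sigma=2.0):
    """SIGMA-style threshold: mean + n_sigma * standard deviation of the
    historical rainfall series (n_sigma is an illustrative choice)."""
    return mean(rain_series) + n_sigma * stdev(rain_series)


def is_extraordinary(value, rain_series, n_sigma=2.0):
    """True if a rainfall value exceeds the sigma-based threshold."""
    return value > sigma_threshold(rain_series, n_sigma)
```

Note the asymmetry the abstract highlights: this construction needs only the rainfall record itself, whereas an intensity-duration threshold also needs a dated landslide inventory.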

  11. Polarization in high p_T and cumulative hadron production

    International Nuclear Information System (INIS)

    Efremov, A.V.

    1978-01-01

    The polarization of final-state hadrons in high-p_T processes is analyzed in the parton hard-scattering picture. A scaling assumption allows a qualitatively correct description of the p_T behaviour of the polarization, and of its escape-angle behaviour in cumulative production. Energy scaling and a weak dependence on the beam and target type are predicted. A method is proposed for measuring the polarization of hadron jets.

  12. The Trigger Processor and Trigger Processor Algorithms for the ATLAS New Small Wheel Upgrade

    CERN Document Server

    Lazovich, Tomo; The ATLAS collaboration

    2015-01-01

    The ATLAS New Small Wheel (NSW) is an upgrade to the ATLAS muon endcap detectors that will be installed during the next long shutdown of the LHC. Comprising both MicroMegas (MMs) and small-strip Thin Gap Chambers (sTGCs), this system will drastically improve the performance of the muon system in a high cavern background environment. The NSW trigger, in particular, will significantly reduce the rate of fake triggers coming from track segments in the endcap not originating from the interaction point. We will present an overview of the trigger, the proposed sTGC and MM trigger algorithms, and the hardware implementation of the trigger. In particular, we will discuss both the heart of the trigger, an ATCA system with FPGA-based trigger processors (using the same hardware platform for both MM and sTGC triggers), as well as the full trigger electronics chain, including dedicated cards for transmission of data via GBT optical links. Finally, we will detail the challenges of ensuring that the trigger electronics can ...

  13. Triggering soft bombs at the LHC

    Science.gov (United States)

    Knapen, Simon; Griso, Simone Pagan; Papucci, Michele; Robinson, Dean J.

    2017-08-01

    Very high multiplicity, spherically-symmetric distributions of soft particles, with p_T ∼ a few × 100 MeV, may be a signature of strongly-coupled hidden valleys that exhibit long, efficient showering windows. With traditional triggers, such 'soft bomb' events closely resemble pile-up and are therefore only recorded with minimum bias triggers at a very low efficiency. We demonstrate a proof-of-concept for a high-level triggering strategy that efficiently separates soft bombs from pile-up by searching for a 'belt of fire': a high-density band of hits on the innermost layer of the tracker. Seeding our proposed high-level trigger with existing jet, missing transverse energy or lepton hardware-level triggers, we show that net trigger efficiencies of order 10% are possible for bombs of mass several × 100 GeV. We also consider the special case that soft bombs are the result of an exotic decay of the 125 GeV Higgs. The fiducial rate for 'Higgs bombs' triggered in this manner is marginally higher than the rate achievable by triggering directly on a hard muon from associated Higgs production.

  14. Trigger tracking for the LHCb upgrade

    CERN Multimedia

    Dungs, K

    2014-01-01

    This poster presents a trigger system for the upgraded LHCb detector, scheduled to begin operation in 2020. The proposed trigger system is implemented entirely in software. We show that track reconstruction of a similar quality to that available in the offline algorithms can be performed on the full inelastic pp-collision rate. A track finding efficiency of 98.8% relative to offline can be achieved for good trigger tracks. The CPU time required for this reconstruction is less than 60% of the available budget.

  15. LHCb Topological Trigger Reoptimization

    International Nuclear Information System (INIS)

    Likhomanenko, Tatiana; Khairullin, Egor; Rogozhnikov, Alex; Ustyuzhanin, Andrey; Ilten, Philip; Williams, Michael

    2015-01-01

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In the LHC Run 1, this trigger, which utilized a custom boosted decision tree algorithm, selected a nearly 100% pure sample of b-hadrons with a typical efficiency of 60-70%; its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and neural networks. The topological trigger algorithm is designed to select all "interesting" decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. Methods studied include cascading, ensembling and blending techniques. Furthermore, novel boosting techniques have been implemented that will help reduce systematic uncertainties in Run 2 measurements. We demonstrate that the reoptimized topological trigger is expected to significantly improve on the Run 1 performance for a wide range of b-hadron decays. (paper)
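    The two selection criteria named in the abstract (detachment from the primary vertex, compatibility with a b-hadron decay) can be sketched as a toy cut-based rule. This is only a stand-in for the actual boosted decision tree: the feature names, units, and cut values below are invented for illustration.

```python
def topo_candidate(vertex, min_fd_chi2=25.0, min_mass=2.0, max_mass=7.0):
    """Toy stand-in for the topological-trigger selection: keep an
    n-body vertex that is (a) significantly detached from the primary
    vertex (flight-distance chi^2) and (b) kinematically compatible
    with a b-hadron decay (mass window in GeV). The real trigger feeds
    such features to a boosted decision tree instead of cutting."""
    detached = vertex["fd_chi2"] > min_fd_chi2           # criterion (a)
    b_like = min_mass < vertex["mass_gev"] < max_mass    # criterion (b)
    return detached and b_like

print(topo_candidate({"fd_chi2": 120.0, "mass_gev": 5.3}))  # True
print(topo_candidate({"fd_chi2": 4.0, "mass_gev": 5.3}))    # False
```

Replacing the hard cuts with a multivariate classifier over the same features is precisely the reoptimization the record describes.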

  16. NOMAD Trigger Studies

    International Nuclear Information System (INIS)

    Varvell, K.

    1995-01-01

    The author reports on the status of an offline study of the NOMAD triggers, which has several motivations. Of primary importance is to demonstrate, using offline information recorded by the individual subdetectors comprising NOMAD, that the online trigger system is functioning as expected. Such an investigation serves to complement the extensive monitoring which is already carried out online. More specific to the needs of the offline software and analysis, the reconstruction of tracks and vertices in the detector requires some knowledge of the time at which the trigger has occurred, in order to locate relevant hits in the drift chambers and muon chambers in particular. The fact that the different triggers allowed by the MIOTRINO board take varying times to form complicates this task. An offline trigger algorithm may serve as a tool to shed light on situations where the online trigger status bits have not been recorded correctly, as happens in a small number of cases, or as an aid to studies with the aim of further refinement of the online triggers themselves

  17. Calo trigger acquisition system

    CERN Multimedia

    Franchini, Matteo

    2016-01-01

    Calo trigger acquisition system - Evolution of the acquisition system from a multiple boards system (upper, orange cables) to a single board one (below, light blue cables) where all the channels are collected in a single board.

  18. Calorimetry triggering in ATLAS

    CERN Document Server

    Igonkina, O; Adragna, P; Aharrouche, M; Alexandre, G; Andrei, V; Anduaga, X; Aracena, I; Backlund, S; Baines, J; Barnett, B M; Bauss, B; Bee, C; Behera, P; Bell, P; Bendel, M; Benslama, K; Berry, T; Bogaerts, A; Bohm, C; Bold, T; Booth, J R A; Bosman, M; Boyd, J; Bracinik, J; Brawn, I P; Brelier, B; Brooks, W; Brunet, S; Bucci, F; Casadei, D; Casado, P; Cerri, A; Charlton, D G; Childers, J T; Collins, N J; Conde Muino, P; Coura Torres, R; Cranmer, K; Curtis, C J; Czyczula, Z; Dam, M; Damazio, D; Davis, A O; De Santo, A; Degenhardt, J; Delsart, P A; Demers, S; Demirkoz, B; Di Mattia, A; Diaz, M; Djilkibaev, R; Dobson, E; Dova, M T; Dufour, M A; Eckweiler, S; Ehrenfeld, W; Eifert, T; Eisenhandler, E; Ellis, N; Emeliyanov, D; Enoque Ferreira de Lima, D; Faulkner, P J W; Ferland, J; Flacher, H; Fleckner, J E; Flowerdew, M; Fonseca-Martin, T; Fratina, S; Föhlisch, F; Gadomski, S; Gallacher, M P; Garitaonandia Elejabarrieta, H; Gee, C N P; George, S; Gillman, A R; Goncalo, R; Grabowska-Bold, I; Groll, M; Gringer, C; Hadley, D R; Haller, J; Hamilton, A; Hanke, P; Hauser, R; Hellman, S; Hidvégi, A; Hillier, S J; Hryn'ova, T; Idarraga, J; Johansen, M; Johns, K; Kalinowski, A; Khoriauli, G; Kirk, J; Klous, S; Kluge, E-E; Koeneke, K; Konoplich, R; Konstantinidis, N; Kwee, R; Landon, M; LeCompte, T; Ledroit, F; Lei, X; Lendermann, V; Lilley, J N; Losada, M; Maettig, S; Mahboubi, K; Mahout, G; Maltrana, D; Marino, C; Masik, J; Meier, K; Middleton, R P; Mincer, A; Moa, T; Monticelli, F; Moreno, D; Morris, J D; Müller, F; Navarro, G A; Negri, A; Nemethy, P; Neusiedl, A; Oltmann, B; Olvito, D; Osuna, C; Padilla, C; Panes, B; Parodi, F; Perera, V J O; Perez, E; Perez Reale, V; Petersen, B; Pinzon, G; Potter, C; Prieur, D P F; Prokishin, F; Qian, W; Quinonez, F; Rajagopalan, S; Reinsch, A; Rieke, S; Riu, I; Robertson, S; Rodriguez, D; Rogriquez, Y; Röhr, F; Saavedra, A; Sankey, D P C; Santamarina, C; Santamarina Rios, C; Scannicchio, D; Schiavi, C; Schmitt, K; Schultz-Coulon, H C; 
Schäfer, U; Segura, E; Silverstein, D; Silverstein, S; Sivoklokov, S; Sjölin, J; Staley, R J; Stamen, R; Stelzer, J; Stockton, M C; Straessner, A; Strom, D; Sushkov, S; Sutton, M; Tamsett, M; Tan, C L A; Tapprogge, S; Thomas, J P; Thompson, P D; Torrence, E; Tripiana, M; Urquijo, P; Urrejola, P; Vachon, B; Vercesi, V; Vorwerk, V; Wang, M; Watkins, P M; Watson, A; Weber, P; Weidberg, T; Werner, P; Wessels, M; Wheeler-Ellis, S; Whiteson, D; Wiedenmann, W; Wielers, M; Wildt, M; Winklmeier, F; Wu, X; Xella, S; Zhao, L; Zobernig, H; de Seixas, J M; dos Anjos, A; Åsman, B; Özcan, E

    2009-01-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10^5 to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  19. BTeV Trigger

    International Nuclear Information System (INIS)

    Gottschalk, Erik E.

    2006-01-01

    BTeV was designed to conduct precision studies of CP violation in BB-bar events using a forward-geometry detector in a hadron collider. The detector was optimized for high-rate detection of beauty and charm particles produced in collisions between protons and antiprotons. The trigger was designed to take advantage of the main difference between events with beauty and charm particles and more typical hadronic events-the presence of detached beauty and charm decay vertices. The first stage of the BTeV trigger was to receive data from a pixel vertex detector, reconstruct tracks and vertices for every beam crossing, reject at least 98% of beam crossings in which neither beauty nor charm particles were produced, and trigger on beauty events with high efficiency. An overview of the trigger design and its evolution to include commodity networking and computing components is presented

  20. Calorimetry triggering in ATLAS

    International Nuclear Information System (INIS)

    Igonkina, O; Achenbach, R; Andrei, V; Adragna, P; Aharrouche, M; Bauss, B; Bendel, M; Alexandre, G; Anduaga, X; Aracena, I; Backlund, S; Bogaerts, A; Baines, J; Barnett, B M; Bee, C; Behera, P; Bell, P; Benslama, K; Berry, T; Bohm, C

    2009-01-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10^5 to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  1. Calorimetry Triggering in ATLAS

    International Nuclear Information System (INIS)

    Igonkina, O.; Achenbach, R.; Adragna, P.; Aharrouche, M.; Alexandre, G.; Andrei, V.; Anduaga, X.; Aracena, I.; Backlund, S.; Baines, J.; Barnett, B.M.; Bauss, B.; Bee, C.; Behera, P.; Bell, P.; Bendel, M.; Benslama, K.; Berry, T.; Bogaerts, A.; Bohm, C.; Bold, T.; Booth, J.R.A.; Bosman, M.; Boyd, J.; Bracinik, J.; Brawn, I.P.; Brelier, B.; Brooks, W.; Brunet, S.; Bucci, F.; Casadei, D.; Casado, P.; Cerri, A.; Charlton, D.G.; Childers, J.T.; Collins, N.J.; Conde Muino, P.; Coura Torres, R.; Cranmer, K.; Curtis, C.J.; Czyczula, Z.; Dam, M.; Damazio, D.; Davis, A.O.; De Santo, A.; Degenhardt, J.

    2011-01-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10^5 to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  2. Calorimetry triggering in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Igonkina, O [Nikhef National Institute for Subatomic Physics, Amsterdam (Netherlands); Achenbach, R; Andrei, V [Kirchhoff Institut fuer Physik, Universitaet Heidelberg, Heidelberg (Germany); Adragna, P [Physics Department, Queen Mary, University of London, London (United Kingdom); Aharrouche, M; Bauss, B; Bendel, M [Institut für Physik, Universität Mainz, Mainz (Germany); Alexandre, G [Section de Physique, Universite de Geneve, Geneva (Switzerland); Anduaga, X [Universidad Nacional de La Plata, La Plata (Argentina); Aracena, I [Stanford Linear Accelerator Center (SLAC), Stanford (United States); Backlund, S; Bogaerts, A [European Laboratory for Particle Physics (CERN), Geneva (Switzerland); Baines, J; Barnett, B M [STFC Rutherford Appleton Laboratory, Harwell Science and Innovation Campus, Didcot, Oxon (United Kingdom); Bee, C [Centre de Physique des Particules de Marseille, IN2P3-CNRS, Marseille (France); Behera, P [Iowa State University, Ames, Iowa (United States); Bell, P [School of Physics and Astronomy, University of Manchester, Manchester (United Kingdom); Benslama, K [University of Regina, Regina (Canada); Berry, T [Department of Physics, Royal Holloway and Bedford New College, Egham (United Kingdom); Bohm, C [Fysikum, Stockholm University, Stockholm (Sweden)

    2009-04-01

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10^5 to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid Argon electro-magnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  3. An Integrated Cumulative Transformation and Feature Fusion Approach for Bearing Degradation Prognostics

    Directory of Open Access Journals (Sweden)

    Lixiang Duan

    2018-01-01

    Aimed at degradation prognostics of a rolling bearing, this paper proposes a novel cumulative transformation algorithm for data processing and a feature fusion technique for bearing degradation assessment. First, a cumulative transformation is presented to map the original features extracted from a vibration signal to their respective cumulative forms. The technique not only makes the extracted features show a monotonic trend but also reduces the fluctuation; such properties better reflect the bearing degradation trend. Then, a new degradation index system is constructed, which fuses multidimensional cumulative features by kernel principal component analysis (KPCA). Finally, an extreme learning machine model based on phase space reconstruction is proposed to predict the degradation trend. The model performance is experimentally validated with a whole-life experiment of a rolling bearing. The results prove that the proposed method reflects the bearing degradation process clearly and achieves a good balance between model accuracy and complexity.
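    The cumulative transformation step can be sketched as a running sum over a raw feature series: a minimal illustration, under the assumption that the raw features are non-negative so that the cumulative series is monotonic. The paper's full method additionally fuses several such cumulative features with KPCA, which is not shown here.

```python
from itertools import accumulate

def cumulative_transform(feature_series):
    """Map a raw degradation feature to its cumulative form (running
    sum). For a non-negative feature the result is monotonically
    non-decreasing, and summing damps point-to-point fluctuation."""
    return list(accumulate(feature_series))

# Toy RMS-like vibration feature: noise superimposed on an upward trend.
raw = [1.0, 1.2, 0.9, 1.4, 1.1, 1.8, 1.6, 2.2]
cum = cumulative_transform(raw)
assert all(b >= a for a, b in zip(cum, cum[1:]))  # monotonic by construction
```

The monotonicity is what makes the transformed feature a usable health index: a threshold crossed once stays crossed.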

  4. LHC-B trigger and data acquisition progress report

    CERN Document Server

    Dijkstra, H; Harris, Frank

    1997-01-01

    97-05 This report describes the progress since the Letter of Intent [1] in the development of the trigger and data acquisition system for LHC-B. The basic philosophy has changed significantly, with the proposal to implement tracking and vertex topology triggers in specialised hardware. This will be at an additional trigger level, giving 4 levels in total. We present details of the new proposal, together with preliminary requirements estimates, and some simulation results.

  5. LHCb Topological Trigger Reoptimization

    CERN Document Server

    INSPIRE-00400931; Ilten, Philip; Khairullin, Egor; Rogozhnikov, Alex; Ustyuzhanin, Andrey; Williams, Michael

    2015-12-23

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In the LHC Run 1, this trigger, which utilized a custom boosted decision tree algorithm, selected a nearly 100% pure sample of b-hadrons with a typical efficiency of 60-70%; its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and neural networks. The topological trigger algorithm is designed to select all "interesting" decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. ...

  6. Topological Trigger Developments

    CERN Multimedia

    Likhomanenko, Tatiana

    2015-01-01

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In the LHC Run 1, this trigger, which utilized a custom boosted decision tree algorithm, selected an almost 100% pure sample of b-hadrons with a typical efficiency of 60-70%, and its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and uBoost. The topological trigger algorithm is designed to select all "interesting" decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. These inclu...

  7. The Relationship between Gender, Cumulative Adversities and ...

    African Journals Online (AJOL)

    The Relationship between Gender, Cumulative Adversities and Mental Health of Employees in ... CAs were measured in three forms (family adversities (CAFam), personal adversities ... Age of employees ranged between 18-65 years.

  8. Complexity and demographic explanations of cumulative culture

    NARCIS (Netherlands)

    Querbes, A.; Vaesen, K.; Houkes, W.N.

    2014-01-01

    Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological

  9. Cumulative human impacts on marine predators.

    Science.gov (United States)

    Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J; Halpern, Benjamin S; Breed, Greg A; Nickel, Barry; Teutschel, Nicole M; Crowder, Larry B; Benson, Scott; Dutton, Peter H; Bailey, Helen; Kappes, Michelle A; Kuhn, Carey E; Weise, Michael J; Mate, Bruce; Shaffer, Scott A; Hassrick, Jason L; Henry, Robert W; Irvine, Ladd; McDonald, Birgitte I; Robinson, Patrick W; Block, Barbara A; Costa, Daniel P

    2013-01-01

    Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact (CUI) on marine predators by combining electronic tracking data of eight protected predator species (n=685 individuals) in the California Current Ecosystem with data on 24 anthropogenic stressors. We show significant variation in CUI with some of the highest impacts within US National Marine Sanctuaries. High variation in underlying species and cumulative impact distributions means that neither alone is sufficient for effective spatial management. Instead, comprehensive management approaches accounting for both cumulative human impacts and trade-offs among multiple stressors must be applied in planning the use of marine resources.

  10. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557) are used independently of one another. Reliabilities and availabilities of k-out-of-n systems are analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts for calculations of reliability and availability. Program written in C.
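    The quantity CUMBIN computes is the cumulative binomial distribution. A minimal re-implementation in Python (the original program is in C, and the function names here are invented) also shows the k-out-of-n reliability use case the record mentions.

```python
from math import comb

def cumulative_binomial(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), summed term by term."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def k_out_of_n_reliability(k, n, r):
    """Reliability of a k-out-of-n system with component reliability r:
    the probability that at least k of the n components work, i.e.
    1 - P(at most k-1 successes)."""
    return 1.0 - cumulative_binomial(k - 1, n, r)

print(round(cumulative_binomial(2, 5, 0.5), 4))  # 0.5 (symmetric case)
```

Direct summation is fine for small n; a production routine (as in the NASA programs) must take care with numerical stability for large n and extreme p.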

  11. Muon Trigger for Mobile Phones

    Science.gov (United States)

    Borisyak, M.; Usvyatsov, M.; Mulhearn, M.; Shimmin, C.; Ustyuzhanin, A.

    2017-10-01

    The CRAYFIS experiment proposes to use privately owned mobile phones as a ground detector array for Ultra High Energy Cosmic Rays. Upon interacting with Earth's atmosphere, these cosmic rays produce extensive particle showers which can be detected by cameras on mobile phones. A typical shower contains minimally-ionizing particles such as muons. As these particles interact with CMOS image sensors, they may leave tracks of faintly-activated pixels that are sometimes hard to distinguish from random detector noise. Triggers that rely on the presence of very bright pixels within an image frame are not efficient in this case. We present a trigger algorithm based on Convolutional Neural Networks which selects images containing such tracks and is evaluated in a lazy manner: the response of each successive layer is computed only if the activation of the current layer satisfies a continuation criterion. The use of neural networks increases the sensitivity considerably compared with simple image thresholding, while the lazy evaluation allows the trigger to execute within the limited computational power of mobile phones.
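    The lazy-evaluation idea generalizes to any cascade of classifier stages: run the next (more expensive) stage only if the current one's activation passes a continuation threshold, so cheap stages reject most noise frames early. The sketch below is a toy illustration of that control flow, not the CRAYFIS network; the stages, data, and thresholds are invented.

```python
def lazy_trigger(frame, stages, thresholds):
    """Evaluate successive stages lazily: compute stage i+1 only if
    stage i's activation exceeds its continuation threshold."""
    activation = frame
    for stage, threshold in zip(stages, thresholds):
        activation = stage(activation)
        if max(activation) < threshold:  # crude summary of the response
            return False  # early exit: frame looks like noise
    return True  # all stages fired: keep the frame

# Hypothetical two-stage cascade on a 1-D "pixel row".
stage1 = lambda xs: [max(0.0, x - 1.0) for x in xs]  # ReLU-like, cheap
stage2 = lambda xs: [sum(xs) / len(xs)]              # pooled score, "deep"
noise = [0.2, 0.9, 0.4, 0.1]
track = [0.2, 3.1, 2.8, 0.3]
print(lazy_trigger(noise, [stage1, stage2], [0.5, 0.5]))  # False
print(lazy_trigger(track, [stage1, stage2], [0.5, 0.5]))  # True
```

The noise frame is rejected after the first stage alone, which is exactly where the battery and CPU savings on a phone come from.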

  12. Cumulative effects assessment: Does scale matter?

    International Nuclear Information System (INIS)

    Therivel, Riki; Ross, Bill

    2007-01-01

    Cumulative effects assessment (CEA) is (or should be) an integral part of environmental assessment at both the project and the more strategic level. CEA helps to link the different scales of environmental assessment in that it focuses on how a given receptor is affected by the totality of plans, projects and activities, rather than on the effects of a particular plan or project. This article reviews how CEAs consider, and could consider, scale issues: spatial extent, level of detail, and temporal issues. It is based on an analysis of Canadian project-level CEAs and UK strategic-level CEAs. Based on a review of literature and, especially, case studies with which the authors are familiar, it concludes that scale issues are poorly considered at both levels, with particular problems being unclear or non-existent cumulative effects scoping methodologies; poor consideration of past or likely future human activities beyond the plan or project in question; attempts to apportion 'blame' for cumulative effects; and, at the plan level, limited management of cumulative effects caused particularly by the absence of consent regimes. Scale issues are important in most of these problems. However, both strategic-level and project-level CEA have much potential for managing cumulative effects through better siting and phasing of development, demand reduction and other behavioural changes, and particularly through setting development consent rules for projects. The lack of strategic resource-based thresholds constrains the robust management of strategic-level cumulative effects.

  13. Ecosystem assessment methods for cumulative effects at the regional scale

    International Nuclear Information System (INIS)

    Hunsaker, C.T.

    1989-01-01

    Environmental issues such as nonpoint-source pollution, acid rain, reduced biodiversity, land use change, and climate change have widespread ecological impacts and require an integrated assessment approach. Since 1978, the implementing regulations for the National Environmental Policy Act (NEPA) have required assessment of potential cumulative environmental impacts. Current environmental issues have encouraged ecologists to improve their understanding of ecosystem process and function at several spatial scales. However, management activities usually occur at the local scale, and there is little consideration of the potential impacts to the environmental quality of a region. This paper proposes that regional ecological risk assessment provides a useful approach for assisting scientists in accomplishing the task of assessing cumulative impacts. Critical issues such as spatial heterogeneity, boundary definition, and data aggregation are discussed. Examples from an assessment of acidic deposition effects on fish in Adirondack lakes illustrate the importance of integrated data bases, associated modeling efforts, and boundary definition at the regional scale

  14. Feasibility studies of a Level-1 Tracking Trigger for ATLAS

    CERN Document Server

    Warren, M; Brenner, R; Konstantinidis, N; Sutton, M

    2009-01-01

    The existing ATLAS Level-1 trigger system is seriously challenged at the SLHC's higher luminosity. A hardware tracking trigger might be needed, but requires a detailed understanding of the detector. Simulation of high pile-up events, with various data-reduction techniques applied, will be described. Two scenarios are envisaged: (a) regional readout, where calorimeter and muon triggers are used to identify portions of the tracker; and (b) track-stub finding using special trigger layers. A proposed hardware system, including data reduction on the front-end ASICs, readout within a super-module, and integration of regional triggering into all levels of the readout system, will be discussed.

  15. CMS Trigger Performance

    CERN Document Server

    Donato, Silvio

    2017-01-01

    During its second run of operation (Run 2), which started in 2015, the LHC will deliver a peak instantaneous luminosity that may reach $2 \cdot 10^{34}$ cm$^{-2}$s$^{-1}$ with an average pile-up of about 55, far larger than the design value. Under these conditions, the online event selection is a very challenging task. In CMS, it is realized by a two-level trigger system: the Level-1 (L1) Trigger, implemented in custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the offline reconstruction software running on a computer farm. In order to face this challenge, the L1 trigger has been through a major upgrade compared to Run 1, whereby all electronic boards of the system have been replaced, allowing more sophisticated algorithms to be run online. Its last stage, the global trigger, is now able to perform complex selections and to compute high-level quantities, like invariant masses. Likewise, the algorithms that run in the HLT have gone through major improvements; in particular, new appr...

  16. The ATLAS Tau Trigger

    CERN Document Server

    Dam, M; The ATLAS collaboration

    2009-01-01

    The ATLAS experiment at CERN's LHC has implemented a dedicated tau trigger system to select hadronically decaying tau leptons from the enormous background of QCD jets. This promises a significant increase in the discovery potential for the Higgs boson and in searches for physics beyond the Standard Model. The three-level trigger system has been optimised for efficiency and good background rejection. The first level uses information from the calorimeters only, while the two higher levels also include information from the tracking detectors. Shower shape variables and the track multiplicity are important variables to distinguish taus from QCD jets. At the initial luminosity of 10^31 cm^−2 s^−1, single tau triggers with a transverse energy threshold of 50 GeV or higher can be run standalone. Below this level, the tau signatures will be combined with other event signatures.

  17. The ATLAS Tau Trigger

    CERN Document Server

    Rados, PK; The ATLAS collaboration

    2014-01-01

    Physics processes involving tau leptons play a crucial role in understanding particle physics at the high energy frontier. The ability to efficiently trigger on events containing hadronic tau decays is therefore of particular importance to the ATLAS experiment. During the 2012 run, the Large Hadron Collider (LHC) reached instantaneous luminosities of nearly $10^{34} cm^{-2}s^{-1}$ with bunch crossings occurring every 50 ns. This resulted in a huge event rate and a high probability of overlapping interactions per bunch crossing (pile-up). With this in mind, it was necessary to design an ATLAS tau trigger system that could reduce the event rate to a manageable level while efficiently extracting the most interesting physics events in a pile-up-robust manner. In this poster the ATLAS tau trigger is described, its performance during 2012 is presented, and the outlook for LHC Run II is briefly summarized.

  18. Structure functions and particle production in the cumulative region: two different exponentials

    International Nuclear Information System (INIS)

    Braun, M.; Vechernin, V.

    1997-01-01

In the framework of the recently proposed QCD-based parton model for cumulative phenomena in interactions with nuclei, two mechanisms for particle production, the direct and the spectator one, are analyzed. It is shown that due to final-state interactions the leading terms of the direct-mechanism contribution cancel, and the spectator mechanism is the dominant one. This leads to a smaller slope of the cumulative particle production rates compared to the slope of the nuclear structure function in the cumulative region x ≥ 1, in agreement with recent experimental data.

  19. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

Cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling.

  20. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  1. Managing cumulative impacts: A key to sustainability?

    Energy Technology Data Exchange (ETDEWEB)

    Hunsaker, C.T.

    1994-12-31

This paper addresses how science can be used more effectively in creating policy to manage cumulative effects on ecosystems, focusing on the scientific techniques available to identify and assess cumulative impacts. The term "sustainable development" was brought into common use by the World Commission on Environment and Development (the Brundtland Commission) in 1987. The Brundtland Commission report highlighted the need to address developmental and environmental imperatives simultaneously by calling for development that "meets the needs of the present generation without compromising the needs of future generations." We cannot claim to be working toward sustainable development until we can quantitatively assess cumulative impacts on the environment: the two concepts are inextricably linked, in that the elusiveness of cumulative effects likely has the greatest potential to keep us from achieving sustainability. In this paper, assessment and management frameworks relevant to cumulative impacts are discussed along with recent literature on how to improve such assessments. When possible, examples are given for marine ecosystems.

  2. ALICE High Level Trigger

    CERN Multimedia

    Alt, T

    2013-01-01

The ALICE High Level Trigger (HLT) is a computing farm designed and built for the real-time, online processing of the raw data produced by the ALICE detectors. Events are fully reconstructed from the raw data, analyzed and compressed. The analysis summary, together with the compressed data and a trigger decision, is sent to the DAQ. In addition, the reconstruction of the events allows for online monitoring of physical observables, and this information is provided to the Data Quality Monitor (DQM). The HLT can process event rates of up to 2 kHz for proton-proton and 200 Hz for central Pb-Pb collisions.

  3. Trigger and decision processors

    International Nuclear Information System (INIS)

    Franke, G.

    1980-11-01

In recent years there have been many attempts in high energy physics to make trigger and decision processes faster and more sophisticated. This became necessary due to a permanent increase in the number of sensitive detector elements in wire chambers and calorimeters, and it was made possible by the fast developments in integrated circuit technology. In this paper the present situation is reviewed. The discussion focuses mainly on event filtering by pure software methods and, on the hardware side, on microprogrammable processors as well as random access memory triggers. (orig.)

  4. Perspectives on cumulative risks and impacts.

    Science.gov (United States)

    Faust, John B

    2010-01-01

    Cumulative risks and impacts have taken on different meanings in different regulatory and programmatic contexts at federal and state government levels. Traditional risk assessment methodologies, with considerable limitations, can provide a framework for the evaluation of cumulative risks from chemicals. Under an environmental justice program in California, cumulative impacts are defined to include exposures, public health effects, or environmental effects in a geographic area from the emission or discharge of environmental pollution from all sources, through all media. Furthermore, the evaluation of these effects should take into account sensitive populations and socioeconomic factors where possible and to the extent data are available. Key aspects to this potential approach include the consideration of exposures (versus risk), socioeconomic factors, the geographic or community-level assessment scale, and the inclusion of not only health effects but also environmental effects as contributors to impact. Assessments of this type extend the boundaries of the types of information that toxicologists generally provide for risk management decisions.
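One traditional technique for aggregating chemical risks of the kind this abstract alludes to is the hazard index: the sum of each chemical's exposure divided by its reference dose. The sketch below is a generic illustration of that convention, not the article's method; the chemical names and numbers are hypothetical.

```python
def hazard_index(exposures, reference_doses):
    """Sum of hazard quotients (exposure / reference dose) across chemicals.

    A hazard index above 1.0 is conventionally read as a potential
    cumulative concern. All names and numbers here are illustrative.
    """
    return sum(exposures[chem] / reference_doses[chem] for chem in exposures)

# Hypothetical exposures and reference doses (mg/kg-day):
exposures = {"chem_a": 0.02, "chem_b": 0.015}
rfds = {"chem_a": 0.05, "chem_b": 0.03}
hi = hazard_index(exposures, rfds)  # 0.4 + 0.5 = 0.9, below the 1.0 flag
```

Because each quotient is dimensionless, quotients from different media and exposure routes can be summed, which is what makes the index a simple cumulative screen.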

  5. Cumulative processes and quark distribution in nuclei

    International Nuclear Information System (INIS)

    Kondratyuk, L.; Shmatikov, M.

    1984-01-01

Assuming the existence of multiquark (mainly 12q) bags in nuclei, the spectra of cumulative nucleons and mesons produced in high-energy particle-nucleus collisions are discussed. The exponential form of the quark momentum distribution in the 12q bag (agreeing well with the experimental data on lepton-nucleus interactions at large q²) is shown to result in a quasi-exponential distribution of cumulative particles over the light-cone variable α_B. The dependence of f(α_B; p_⊥) (where p_⊥ is the transverse momentum of the bag) upon p_⊥ is considered. The yields of cumulative resonances, as well as effects related to the different u- and d-quark distributions in N > Z nuclei, are discussed.

  6. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

In recent years there have been a series of advances in the field of dynamic prediction. Among those is the development of methods for dynamic prediction of the cumulative incidence function in a competing risk setting. These models enable the predictions to be updated as time progresses and more information becomes available; for example, when a patient comes back for a follow-up visit after completing a year of treatment, the risks of death and adverse events may have changed since treatment initiation. One approach to model the cumulative incidence function in competing risks is by direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and the performance is investigated in a simulation study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
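The object being predicted here, the cumulative incidence function for one cause among competing risks, can be illustrated with a toy empirical estimate. The sketch below ignores censoring (which the paper handles via inverse probability of censoring weights) and uses made-up event times.

```python
def cumulative_incidence(times, causes, cause, t):
    """Empirical cumulative incidence of `cause` by time t: the fraction
    of subjects who experienced that cause at or before t.

    Censoring is ignored for simplicity; with right censoring, the
    paper's approach reweights observations by IPCW instead.
    """
    n = len(times)
    return sum(1 for ti, ci in zip(times, causes) if ti <= t and ci == cause) / n

times = [2.0, 3.5, 1.2, 4.8, 0.7]  # event times (made up)
causes = [1, 2, 1, 1, 2]           # 1 = event of interest, 2 = competing event
p = cumulative_incidence(times, causes, cause=1, t=3.0)  # 2 of 5 subjects by t = 3
```

Note that the incidences of all causes by a given time sum to the probability of any event by that time, which is why cause-specific curves must be modelled jointly rather than as one-minus-Kaplan-Meier.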

  7. Preference, resistance to change, and the cumulative decision model.

    Science.gov (United States)

    Grace, Randolph C

    2018-01-01

    According to behavioral momentum theory (Nevin & Grace, 2000a), preference in concurrent chains and resistance to change in multiple schedules are independent measures of a common construct representing reinforcement history. Here I review the original studies on preference and resistance to change in which reinforcement variables were manipulated parametrically, conducted by Nevin, Grace and colleagues between 1997 and 2002, as well as more recent research. The cumulative decision model proposed by Grace and colleagues for concurrent chains is shown to provide a good account of both preference and resistance to change, and is able to predict the increased sensitivity to reinforcer rate and magnitude observed with constant-duration components. Residuals from fits of the cumulative decision model to preference and resistance to change data were positively correlated, supporting the prediction of behavioral momentum theory. Although some questions remain, the learning process assumed by the cumulative decision model, in which outcomes are compared against a criterion that represents the average outcome value in the current context, may provide a plausible model for the acquisition of differential resistance to change. © 2018 Society for the Experimental Analysis of Behavior.

  8. Cumulative Culture and Future Thinking: Is Mental Time Travel a Prerequisite to Cumulative Cultural Evolution?

    Science.gov (United States)

    Vale, G. L.; Flynn, E. G.; Kendal, R. L.

    2012-01-01

    Cumulative culture denotes the, arguably, human capacity to build on the cultural behaviors of one's predecessors, allowing increases in cultural complexity to occur such that many of our cultural artifacts, products and technologies have progressed beyond what a single individual could invent alone. This process of cumulative cultural evolution…

  9. The STAR trigger

    International Nuclear Information System (INIS)

    Bieser, F.S.; Crawford, H.J.; Engelage, J.; Eppley, G.; Greiner, L.C.; Judd, E.G.; Klein, S.R.; Meissner, F.; Minor, R.; Milosevich, Z.; Mutchler, G.; Nelson, J.M.; Schambach, J.; VanderMolen, A.S.; Ward, H.; Yepes, P.

    2003-01-01

    We describe the trigger system that we designed and implemented for the STAR detector at RHIC. This is a 10 MHz pipelined system based on fast detector output that controls the event selection for the much slower tracking detectors. Results from the first run are presented and new detectors for the 2001 run are discussed

  10. Asthma Triggers: Gain Control

    Science.gov (United States)

    ... harm people too. Try to use pest management methods that pose less of a risk. Keep counters, sinks, tables and floors clean and ... with pest challenges in your home and other environments. [EPA ... pests while reducing pesticide risks; roaches are often asthma triggers and shouldn’t ...

  11. Physics issues on triggering

    Indian Academy of Sciences (India)

    The detectors at the ILC are planned to run without hardware trigger. The ... as not coming from the interaction point and not matching to the silicon detectors ... electrons so that additional dE/dx cuts can help, making also here a factor 10 or.

  12. AIDS radio triggers.

    Science.gov (United States)

    Elias, A M

    1991-07-01

In April 1991, the Ethnic Communities' Council of NSW was granted funding under the Community AIDS Prevention and Education Program, through the Department of Community Services and Health, to produce a series of six 50-second AIDS radio triggers with a 10-second tag line for further information. The triggers are designed to disseminate culturally sensitive information about HIV/AIDS in English, Italian, Greek, Spanish, Khmer, Turkish, Macedonian, Serbo-Croatian, Arabic, Cantonese, and Vietnamese, with the goal of increasing awareness and decreasing the degree of misinformation about HIV/AIDS among people of non-English-speaking backgrounds through radio and sound. The six triggers cover the denial that AIDS exists in the community, beliefs that words and feelings do not protect one from catching HIV, encouraging friends to be compassionate, compassion within the family, AIDS information for a young audience, and the provision of accurate and honest information on HIV/AIDS. The triggers are slated to be completed by the end of July 1991 and will be broadcast on all possible community, ethnic, and commercial radio networks across Australia. They will be available upon request in composite form with an information kit for use by health care professionals and community workers.

  13. Dealing with Asthma Triggers

    Science.gov (United States)

    ... one trigger that you shouldn't avoid because exercise is important for your health. Your doctor will want you to be active, so talk with him or her about what to do before playing ... or 15 minutes before you exercise or play sports. And, of course, you'll ...

  14. Trigger Finger (Stenosing Tenosynovitis)

    Science.gov (United States)


  15. EXAFS cumulants of CdSe

    International Nuclear Information System (INIS)

    Diop, D.

    1997-04-01

EXAFS functions were extracted from measurements on the K edge of Se at different temperatures between 20 and 300 K. The analysis of the EXAFS of the filtered first two shells has been done in the wavevector range lying between 2 and 15.5 Å⁻¹ in terms of the cumulants of the effective distribution of distances. The cumulants C₃ and C₄ obtained from the phase-difference and amplitude-ratio methods show the anharmonicity in the vibrations of atoms around their equilibrium positions. (author). 13 refs, 3 figs

  16. Finite-volume cumulant expansion in QCD-colorless plasma

    Energy Technology Data Exchange (ETDEWEB)

    Ladrem, M. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); Physics Department, Algiers (Algeria); ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Ahmed, M.A.A. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Taiz University in Turba, Physics Department, Taiz (Yemen); Alfull, Z.Z. [Taibah University, Physics Department, Faculty of Science, Al-Madinah, Al-Munawwarah (Saudi Arabia); Cherif, S. [ENS-Vieux Kouba (Bachir El-Ibrahimi), Laboratoire de Physique et de Mathematiques Appliquees (LPMA), Algiers (Algeria); Ghardaia University, Sciences and Technologies Department, Ghardaia (Algeria)

    2015-09-15

Due to finite-size effects, the localization of the phase transition in finite systems and the determination of its order become an extremely difficult task, even in the simplest known cases. In order to identify and locate the finite-volume transition point T₀(V) of the QCD deconfinement phase transition to a colorless QGP, we have developed a new approach using the finite-size cumulant expansion of the order parameter and the L_mn-method. The first six cumulants C₁,...,C₆ with the corresponding under-normalized ratios (skewness Σ, kurtosis κ, pentosis Π±, and hexosis H₁,₂,₃) and three unnormalized combinations of them, (O = σ²κΣ⁻¹, U = σ⁻²Σ⁻¹, N = σ²κ), are calculated and studied as functions of (T, V). A new approach, unifying in a clear and consistent way the definitions of cumulant ratios, is proposed. A numerical FSS analysis of the obtained results has allowed us to locate accurately the finite-volume transition point. The extracted transition temperature value T₀(V) agrees with that expected, T₀ᴺ(V), from the order parameter and the thermal susceptibility χ_T(T, V), according to the standard procedure of localization, to within about 2%. In addition, a very good correlation factor is obtained, proving the validity of our cumulants method. The agreement of our results with those obtained by means of other models is remarkable. (orig.)
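As a concrete illustration of the cumulant ratios named above, the sketch below estimates the first four cumulants of a sample from its central moments and forms the skewness and excess-kurtosis ratios. This is generic statistics of cumulants, not the paper's L_mn machinery or its finite-size scaling analysis.

```python
def cumulants4(xs):
    """First four cumulants from biased central-moment estimates:
    C1 = mean, C2 = m2, C3 = m3, C4 = m4 - 3*m2^2."""
    n = len(xs)
    c1 = sum(xs) / n
    d = [x - c1 for x in xs]
    m2 = sum(v ** 2 for v in d) / n
    m3 = sum(v ** 3 for v in d) / n
    m4 = sum(v ** 4 for v in d) / n
    return c1, m2, m3, m4 - 3 * m2 ** 2

def skewness_kurtosis(xs):
    """Skewness Sigma = C3 / C2^(3/2) and excess kurtosis kappa = C4 / C2^2,
    the normalized ratios used to probe the shape of the distribution."""
    _, c2, c3, c4 = cumulants4(xs)
    return c3 / c2 ** 1.5, c4 / c2 ** 2
```

For a symmetric two-point sample such as [-1, 1, -1, 1] the skewness is exactly 0 and the excess kurtosis is -2, the flattest possible value, which is the kind of shape signature the cumulant ratios exploit near a transition.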

  17. Cumulative effects of wind turbines. A guide to assessing the cumulative effects of wind energy development

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

This guidance provides advice on how to assess the cumulative effects of wind energy developments in an area and is aimed at developers, planners, and stakeholders interested in the development of wind energy in the UK. The principles of cumulative assessment, wind energy development in the UK, cumulative assessment of wind energy development, and best-practice conclusions are discussed. The identification and assessment of cumulative effects is examined in terms of global environmental sustainability, local environmental quality, and socio-economic activity. Supplementary guidance for assessing the principal cumulative effects on the landscape, on birds, and on visual amenity is provided. The consensus-building approach behind the preparation of this guidance is outlined in the annexes of the report.

  18. Self-triggered coordination with ternary controllers

    NARCIS (Netherlands)

    De Persis, Claudio; Frasca, Paolo

    2012-01-01

    This paper regards coordination of networked systems with ternary controllers. We develop a hybrid coordination system which implements a self-triggered communication policy, based on polling the neighbors upon need. We prove that the proposed scheme ensures finite-time convergence to a neighborhood

  19. Near-Field Source Localization Using a Special Cumulant Matrix

    Science.gov (United States)

    Cui, Han; Wei, Gang

A new near-field source localization algorithm based on a uniform linear array is proposed. The algorithm estimates each parameter separately but does not require parameter pairing. It can be divided into two steps: the first is a bearing-related electric angle estimation based on the ESPRIT algorithm, obtained by constructing a special cumulant matrix; the second is the estimation of the other electric angle based on the 1-D MUSIC spectrum. The method offers much lower computational complexity than the traditional near-field 2-D MUSIC algorithm and performs better than the high-order ESPRIT algorithm. Simulation results demonstrate that the performance of the proposed algorithm is close to the Cramer-Rao Bound (CRB).
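The building block of such cumulant-matrix methods is the fourth-order cumulant of the sensor outputs. For a zero-mean real signal the auto-cumulant reduces to cum4 = E[x⁴] - 3(E[x²])². The scalar estimator below is a generic illustration of that quantity, not the paper's specific matrix construction (which uses cross-cumulants between array elements).

```python
def cum4_zero_mean(x):
    """Sample fourth-order cumulant of a zero-mean real sequence:
    cum4 = m4 - 3 * m2^2.

    Gaussian data gives a value near zero, which is why fourth-order
    cumulant methods suppress additive Gaussian noise.
    """
    n = len(x)
    m2 = sum(v * v for v in x) / n
    m4 = sum(v ** 4 for v in x) / n
    return m4 - 3.0 * m2 * m2
```

Arranging such (cross-)cumulants into a matrix preserves the rotational structure that ESPRIT-type angle estimation relies on, while the Gaussian noise contribution ideally vanishes.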

  20. Determining rainfall thresholds that trigger landslides in Colombia

    International Nuclear Information System (INIS)

    Mayorga Marquez, Ruth

    2003-01-01

Considering that rainfall is the natural event that most often triggers landslides, it is important to study the relationship between this phenomenon and the occurrence of earth mass movements by determining rainfall thresholds that trigger landslides in different zones of Colombia. The research presents a methodology for proposing rainfall thresholds that trigger landslides in Colombia, based on the relationship between the rain accumulated in the soil (antecedent rainfall) and the rain that falls on the day of the landslide occurrence (event rainfall).
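An antecedent/event rainfall threshold of the kind described is often expressed as a line in the (antecedent, event) plane, with alerts issued for points above it. The sketch below is a hypothetical decision rule; the slope and intercept are placeholders, not the thresholds calibrated for Colombian zones in the study.

```python
def exceeds_threshold(antecedent_mm, event_mm, intercept=80.0, slope=-0.5):
    """True if the (antecedent, event) rainfall pair lies above a linear
    threshold line: event >= intercept + slope * antecedent.

    The negative slope encodes the physical intuition that wetter soil
    needs less event rainfall to fail. Coefficients are illustrative only.
    """
    return event_mm >= intercept + slope * antecedent_mm

# The wetter the soil already is, the less event rainfall is needed:
wet_soil = exceeds_threshold(antecedent_mm=100.0, event_mm=40.0)  # above the line
dry_soil = exceeds_threshold(antecedent_mm=10.0, event_mm=40.0)   # below the line
```

In practice such lines are fitted per zone from historical landslide dates and rain-gauge records, so each region gets its own intercept and slope.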

  1. Decision analysis with cumulative prospect theory.

    Science.gov (United States)

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
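The probability transformations referred to are typically of the Tversky-Kahneman (1992) weighting form, which overweights small probabilities and underweights large ones. A minimal sketch follows; the γ = 0.61 default is Tversky and Kahneman's estimate for gains, and the abstract does not specify which weighting family these authors actually used.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function:
    w(p) = p^g / (p^g + (1 - p)^g)^(1 / g).

    Overweights small p and underweights large p; w(0) = 0, w(1) = 1.
    """
    if p in (0.0, 1.0):
        return p
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

w_small = tk_weight(0.01)  # larger than 0.01: small probabilities overweighted
w_large = tk_weight(0.90)  # smaller than 0.90: large probabilities underweighted
```

In a decision tree, replacing branch probabilities (or the probabilities inside standard-gamble elicitations) with such weights is exactly the kind of transformation whose effect on quality-adjusted survival the study measures.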

  2. Cumulative watershed effects: a research perspective

    Science.gov (United States)

    Leslie M. Reid; Robert R. Ziemer

    1989-01-01

    A cumulative watershed effect (CWE) is any response to multiple land-use activities that is caused by, or results in, altered watershed function. The CWE issue is politically defined, as is the significance of particular impacts. But the processes generating CWEs are the traditional focus of geomorphology and ecology, and have thus been studied for decades. The CWE...

  3. Elaboration of a concept for the cumulative environmental exposure assessment of biocides

    Energy Technology Data Exchange (ETDEWEB)

    Gross, Rita; Bunke, Dirk; Moch, Katja [Oeko-Institut e.V. - Institut fuer Angewandte Oekologie e.V., Freiburg im Breisgau (Germany); Gartiser, Stefan [Hydrotox GmbH, Freiburg im Breisgau (Germany)

    2011-12-15

Article 10(1) of the EU Biocidal Products Directive 98/8/EC (BPD) requires that, for the inclusion of an active substance in Annex I, Annex IA or IB, cumulation effects from the use of biocidal products containing the same active substance shall be taken into account, where relevant. The study proves the feasibility of a technical realisation of Article 10(1) of the BPD and elaborates a first concept for the cumulative environmental exposure assessment of biocides. Existing requirements concerning cumulative assessments in other regulatory frameworks have been evaluated and their applicability to biocides has been examined. Technical terms and definitions used in this context were documented with the aim of harmonising terminology with other frameworks and setting up a precise definition within the BPD. Furthermore, the application conditions of biocidal products have been analysed to determine for which of them cumulative exposure assessments may be relevant. Different parameters were identified which might serve as indicators for the relevance of cumulative exposure assessments. These indicators were then integrated in a flow chart by means of which the relevance of cumulative exposure assessments can be checked. Finally, proposals for the technical performance of cumulative exposure assessments within the Review Programme have been elaborated, with the aim of bringing the results of the project into the upcoming development and harmonisation processes at EU level. (orig.)

  4. An evaluation paradigm for cumulative impact analysis

    Science.gov (United States)

    Stakhiv, Eugene Z.

    1988-09-01

Cumulative impact analysis is examined from a conceptual decision-making perspective, focusing on its implicit and explicit purposes as suggested within the policy and procedures for environmental impact analysis of the National Environmental Policy Act of 1969 (NEPA) and its implementing regulations. In this article it is also linked to different evaluation and decision-making conventions, contrasting a regulatory context with a comprehensive planning framework. The specific problems that make the application of cumulative impact analysis a virtually intractable evaluation requirement are discussed in connection with the federal regulation of wetlands uses. The relatively familiar US Army Corps of Engineers' (the Corps) permit program, in conjunction with the Environmental Protection Agency's (EPA) responsibilities in managing its share of the Section 404 regulatory program requirements, is used throughout as the realistic context for highlighting certain pragmatic evaluation aspects of cumulative impact assessment. To understand the purposes of cumulative impact analysis (CIA), a key distinction must be made between the implied comprehensive and multiobjective evaluation purposes of CIA, promoted through the principles and policies contained in NEPA, and the more commonly conducted and limited assessment of cumulative effects (ACE), which focuses largely on the ecological effects of human actions. Based on current evaluation practices within the Corps' and EPA's permit programs, it is shown that the commonly used screening approach to regulating wetlands uses is not compatible with the purposes of CIA, nor is the environmental impact statement (EIS) an appropriate vehicle for evaluating the variety of objectives and trade-offs needed as part of CIA. A heuristic model that incorporates the basic elements of CIA is developed, including the idea of trade-offs among social, economic, and environmental protection goals carried out within the context of environmental

  5. The ATLAS Tau Trigger

    International Nuclear Information System (INIS)

    Rados, Petar Kevin

    2013-06-01

    The tau lepton plays a crucial role in understanding particle physics at the Tera scale. One of the most promising probes of the Higgs boson coupling to fermions is with detector signatures involving taus. In addition, many theories beyond the Standard Model, such as supersymmetry and exotic particles (W' and Z'), predict new physics with large couplings to taus. The ability to trigger on hadronic tau decays is therefore critical to achieving the physics goals of the ATLAS experiment. The higher instantaneous luminosities of proton-proton collisions achieved by the Large Hadron Collider (LHC) in 2012 resulted in a larger probability of overlap (pile-up) between bunch crossings, and so it was critical for ATLAS to have an effective tau trigger strategy. The details of this strategy are summarized in this paper, and the results of the latest performance measurements are presented. (authors)

  6. The LPS trigger system

    International Nuclear Information System (INIS)

    Benotto, F.; Costa, M.; Staiano, A.; Zampieri, A.; Bollito, M.; Isoardi, P.; Pernigotti, E.; Sacchi, R.; Trapani, P.P.; Larsen, H.; Massam, T.; Nemoz, C.

    1996-03-01

The Leading Proton Spectrometer (LPS) has been equipped with microstrip silicon detectors specially designed to trigger on events with high values of x_L = |p̄'_p| / |p̄_p| ≥ 0.95, where p̄'_p and p̄_p are respectively the momenta of the outgoing and incoming protons. The LPS First Level Trigger can provide a clear tag for very high momentum protons in a kinematical region never explored before. In the following we discuss the physics motivation for tagging very forward protons and present a detailed description of the detector design, the front-end electronics, the readout electronics, the Monte Carlo simulation and some preliminary results from the 1995 data taking. (orig.)
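The selection variable itself is simple to compute from the proton three-momenta; the sketch below applies the x_L ≥ 0.95 cut from the abstract. The momentum values are made up for illustration (a proton keeping nearly all of its beam momentum).

```python
import math

def x_L(p_out, p_in):
    """Fractional momentum x_L = |p'_p| / |p_p| from the three-momenta
    of the outgoing and incoming protons."""
    mag = lambda v: math.sqrt(sum(c * c for c in v))
    return mag(p_out) / mag(p_in)

def lps_tag(p_out, p_in, threshold=0.95):
    """Emulate the LPS first-level selection: tag events with x_L >= 0.95."""
    return x_L(p_out, p_in) >= threshold

# An outgoing proton retaining ~99% of the incoming momentum is tagged:
tagged = lps_tag((0.0, 0.0, 912.0), (0.0, 0.0, 920.0))
```

In the real trigger this selection is performed by the silicon-detector hit patterns rather than by computing x_L numerically, but the cut the hardware implements corresponds to this condition.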

  7. Minimum risk trigger indices

    International Nuclear Information System (INIS)

    Tingey, F.H.

    1979-01-01

A viable safeguards system includes, among other things, the development and use of indices which trigger various courses of action. The usual limit-of-error calculation provides such an index. The classical approach is one of constructing tests which, under certain assumptions, make the likelihood of a false alarm small. Of concern also is the test's failure to indicate a loss (diversion) when in fact one has occurred. Since false alarms are usually costly, and losses both costly and of extreme strategic significance, there remains the task of balancing the probability of a false alarm and its consequences against the probability of an undetected loss and its consequences. The application of other-than-classical hypothesis-testing procedures is considered in this paper. Using various consequence models, trigger indices are derived which have certain optimum properties. Application of the techniques would enhance the material control function.

  8. Neural networks for triggering

    International Nuclear Information System (INIS)

    Denby, B.; Campbell, M.; Bedeschi, F.; Chriss, N.; Bowers, C.; Nesti, F.

    1990-01-01

    Two types of neural network beauty trigger architectures, based on identification of electrons in jets and recognition of secondary vertices, have been simulated in the environment of the Fermilab CDF experiment. The efficiencies for B's and rejection of background obtained are encouraging. If hardware tests are successful, the electron identification architecture will be tested in the 1991 run of CDF. 10 refs., 5 figs., 1 tab

  9. The ARGUS vertex trigger

    International Nuclear Information System (INIS)

    Koch, N.; Kolander, M.; Kolanoski, H.; Siegmund, T.; Bergter, J.; Eckstein, P.; Schubert, K.R.; Waldi, R.; Imhof, M.; Ressing, D.; Weiss, U.; Weseler, S.

    1995-09-01

    A fast second level trigger has been developed for the ARGUS experiment which recognizes tracks originating from the interaction region. The processor compares the hits in the ARGUS Micro Vertex Drift Chamber to 245760 masks stored in random access memories. The masks which are fully defined in three dimensions are able to reject tracks originating in the wall of the narrow beampipe of 10.5 mm radius. (orig.)

  10. Study of a Level-3 Tau Trigger with the Pixel Detector

    CERN Document Server

    Kotlinski, Danek; Nikitenko, Alexander

    2001-01-01

We present a Monte Carlo study of the performance of a Level-3 Tau trigger based on the Pixel Detector data. The trigger is designed to select Higgs bosons decaying into two tau leptons with tau jet(s) in the final state. The proposed trigger is particularly useful as it operates at an early stage of the CMS High Level Trigger system. The performance of the trigger is studied for the most difficult case, the high-luminosity LHC scenario.

  11. An experimental investigation of triggered film boiling destabilisation

    International Nuclear Information System (INIS)

    Naylor, P.

    1985-03-01

Film boiling was established on a polished brass rod in water, with collapse initiated by either a pressure pulse or a transient bulk water flow. This work is relevant to the triggering stage of a molten fuel-coolant interaction, and a criterion is proposed for pressure-pulse-triggered film boiling collapse. (U.K.)

  12. Robust self-triggered MPC for constrained linear systems

    NARCIS (Netherlands)

    Brunner, F.D.; Heemels, W.P.M.H.; Allgöwer, F.

    2014-01-01

    In this paper we propose a robust self-triggered model predictive control algorithm for linear systems with additive bounded disturbances and hard constraints on the inputs and state. In self-triggered control, at every sampling instant the time until the next sampling instant is computed online

  13. The BTeV trigger system

    International Nuclear Information System (INIS)

    Kaplan, D.M.

    2000-01-01

    BTeV is a dedicated beauty and charm experiment proposed for the Fermilab Tevatron. The broad physics program envisaged for BTeV requires a trigger that is efficient for a wide variety of heavy-quark decays, including those to all-hadronic final states. To achieve this, we plan to trigger on evidence of detached vertices at the very first trigger level, taking advantage of fast-readout pixel detectors to facilitate fast pattern recognition. Simulations show that 100-to-1 rejection of light-quark background events can be achieved at Level 1 using specialized track-finding hardware, and that an additional factor of 10-100 in data reduction can be achieved using general-purpose processor farms at Levels 2 and 3. This is adequate to allow data taking at luminosities in excess of 2×10^32 cm^-2 s^-1

  14. Retrospective respiratory triggering renal perfusion MRI

    Energy Technology Data Exchange (ETDEWEB)

    Attenberger, Ulrike I.; Michaely, Henrik J.; Schoenberg, Stefan O. (Dept. of Clinical Radiology and Nuclear Medicine, Univ. Hospital Mannheim, Univ. of Heidelberg, Mannheim (Germany)), e-mail: ulrike.attenberger@medma.uni-heidelberg.de; Sourbron, Steven P. (Div. of Medical Physics, Univ. of Leeds, Leeds (United Kingdom)); Reiser, Maximilian F. (Dept. of Clinical Radiology, Univ. Hospitals Munich, Grosshadern, Ludwig-Maximilians-Univ., Munich (Germany))

    2010-12-15

    Background: Artifacts of respiratory motion are one of the well-known limitations of dynamic contrast-enhanced MRI (DCE-MRI) of the kidney. Purpose: To propose and evaluate a retrospective triggering approach to minimize the effect of respiratory motion in DCE-MRI of the kidney. Material and Methods: Nine consecutive patients underwent renal perfusion measurements. Data were acquired with a 2D saturation-recovery TurboFLASH sequence. In order to test the dependence of the results on size and location of the manually drawn triggering regions of interest (ROIs), three widely differing triggering regions were defined by one observer. Mean value, standard deviation, and variability of the renal function parameters plasma flow (FP), plasma volume (VP), plasma transit time (TP), tubular flow (FT), tubular volume (VT), and tubular transit time (TT) were calculated on a per-patient basis. Results: The results show that triggered data have adequate temporal resolution to measure blood flow. The overall average values of the function parameters were: 152.77 (FP), 15.18 (VP), 6.73 (TP), 18.50 (FT), 35.36 (VT), and 117.67 (TT). The variability (calculated in % SD from the mean value) for three different respiratory triggering regions defined on a per-patient basis was between 0.81% and 9.87% for FP, 1.45% and 8.19% for VP, 0% and 9.63% for TP, 2.15% and 12.23% for TF, 0.8% and 17.28% for VT, and 1.97% and 12.87% for TT. Conclusion: Triggering reduces the oscillations in the signal curves and produces sharper parametric maps. In contrast to numerically challenging approaches like registration and segmentation, it can be applied in clinical routine, but a (semi)-automatic approach to select the triggering ROI is desirable to reduce user dependence.
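
The variability figures quoted above are percent SD from the mean of a parameter across the three triggering ROIs. A minimal sketch of that computation follows; the abstract does not state whether the sample or population SD was used (the sample SD is assumed here), and the measurement values are invented.

```python
import statistics

def variability_pct(values):
    """Variability as % SD from the mean across ROI measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# plasma flow (FP) measured with 3 different triggering ROIs (illustrative)
fp_rois = [150.1, 152.8, 155.4]
print(round(variability_pct(fp_rois), 2))
```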

  15. Aftershocks and triggering processes in rock fracture

    Science.gov (United States)

    Davidsen, J.; Kwiatek, G.; Goebel, T.; Stanchits, S. A.; Dresen, G.

    2017-12-01

    One of the hallmarks of our understanding of seismicity in nature is the importance of triggering processes, which makes the forecasting of seismic activity feasible. These triggering processes, by which one earthquake induces (dynamic or static) stress changes leading to potentially multiple other earthquakes, are at their core relaxation processes. A specific example of triggering is aftershocks following a large earthquake, which have been observed to follow certain empirical relationships such as the Omori-Utsu relation. Such an empirical relation should arise from the underlying microscopic dynamics of the involved physical processes, but the exact connection remains to be established. Simple explanations have been proposed but their general applicability is unclear. Many explanations involve the picture of an earthquake as a purely frictional sliding event. Here, we present experimental evidence that these empirical relationships are not limited to frictional processes but also arise in fracture zone formation and are mostly related to compaction-type events. Our analysis is based on tri-axial compression experiments under constant displacement rate on sandstone and granite samples, using spatially located acoustic emission events and their focal mechanisms. More importantly, we show that event-event triggering plays an important role in the presence of large-scale or macroscopic imperfections, while such triggering is basically absent if no significant imperfections are present. We also show that spatial localization and an increase in activity rates close to failure do not necessarily imply triggering behavior associated with aftershocks. Only if a macroscopic crack is formed and its propagation remains subcritical do we observe significant triggering.

  16. Sharing a quota on cumulative carbon emissions

    International Nuclear Information System (INIS)

    Raupach, Michael R.; Davis, Steven J.; Peters, Glen P.; Andrew, Robbie M.; Canadell, Josep G.; Ciais, Philippe

    2014-01-01

    Any limit on future global warming is associated with a quota on cumulative global CO2 emissions. We translate this global carbon quota to regional and national scales, on a spectrum of sharing principles that extends from continuation of the present distribution of emissions to an equal per-capita distribution of cumulative emissions. A blend of these endpoints emerges as the most viable option. For a carbon quota consistent with a 2 °C warming limit (relative to pre-industrial levels), the necessary long-term mitigation rates are very challenging (typically over 5% per year), both because of strong limits on future emissions from the global carbon quota and because of the likely short-term persistence of emissions growth in many regions. (authors)
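
The "spectrum of sharing principles" can be made concrete with a small sketch: a region's share of the global cumulative quota is interpolated between its current share of emissions ("inertia") and its share of population ("equity"). This is an illustration of the blending idea only; the regions, shares, and quota below are invented, not the paper's data.

```python
regions = {
    #      (share of current emissions, share of global population)
    "A": (0.40, 0.15),
    "B": (0.35, 0.25),
    "C": (0.25, 0.60),
}
global_quota = 1000.0  # cumulative GtCO2 allowed, illustrative

def blended_quota(w_equity):
    """w_equity = 0 -> pure inertia (grandfathering), 1 -> equal per capita."""
    return {
        r: global_quota * ((1 - w_equity) * e + w_equity * p)
        for r, (e, p) in regions.items()
    }

print(blended_quota(0.0))   # continuation of the present distribution
print(blended_quota(1.0))   # equal per-capita distribution
print(blended_quota(0.5))   # a blend of the two endpoints
```

Since the emission and population shares each sum to one, any blend allocates exactly the global quota; what varies is how strongly it redistributes toward populous, low-emitting regions.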

  17. Complexity and demographic explanations of cumulative culture.

    Science.gov (United States)

    Querbes, Adrien; Vaesen, Krist; Houkes, Wybo

    2014-01-01

    Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing, and favoured by increasing, population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture in case formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence doesn't afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.

  18. Complexity and demographic explanations of cumulative culture.

    Directory of Open Access Journals (Sweden)

    Adrien Querbes

    Full Text Available Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing, and favoured by increasing, population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture in case formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence doesn't afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.

  19. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  20. Childhood Cumulative Risk and Later Allostatic Load

    DEFF Research Database (Denmark)

    Doan, Stacey N; Dich, Nadya; Evans, Gary W

    2014-01-01

    Objective: The present study investigated the long-term impact of exposure to poverty-related stressors during childhood on allostatic load, an index of physiological dysregulation, and the potential mediating role of substance use. Method: Participants (n = 162) were rural children from New York State, followed for 8 years (between the ages 9 and 17). Poverty-related stress was computed using the cumulative risk approach, assessing stressors across 9 domains, including environmental, psychosocial, and demographic factors. Allostatic load captured a range of physiological responses, including cardiovascular, hypothalamic pituitary adrenal axis, sympathetic adrenal medullary system, and metabolic activity. Smoking and alcohol/drug use were tested as mediators of the hypothesized childhood risk-adolescent allostatic load relationship. Results: Cumulative risk exposure at age 9 predicted increases ...

  1. Fuzzy set theory for cumulative trauma prediction

    OpenAIRE

    Fonseca, Daniel J.; Merritt, Thomas W.; Moynihan, Gary P.

    2001-01-01

    A widely used fuzzy reasoning algorithm was modified and implemented via an expert system to assess the potential risk of employee repetitive strain injury in the workplace. This fuzzy relational model, known as the Priority First Cover Algorithm (PFC), was adapted to describe the relationship between 12 cumulative trauma disorders (CTDs) of the upper extremity, and 29 identified risk factors. The algorithm, which finds a suboptimal subset from a group of variables based on the criterion of...

  2. Sikap Kerja Duduk Terhadap Cumulative Trauma Disorder

    OpenAIRE

    Rahmawati, Yulita; Sugiharto, -

    2011-01-01

    The question investigated was whether there is a relationship between seated working posture and the incidence of Cumulative Trauma Disorder (CTD) among workers in the sanding section at PT. Geromar Jepara. The aim was to determine the relationship between seated working posture and the incidence of CTD among sanding-section workers. This was an explanatory study using a cross-sectional approach. The population in this study comprised the 30 workers of the sanding section. The technique ...

  3. Power Reactor Docket Information. Annual cumulation (citations)

    International Nuclear Information System (INIS)

    1977-12-01

    An annual cumulation of the citations to the documentation associated with civilian nuclear power plants is presented. This material is that which is submitted to the U.S. Nuclear Regulatory Commission in support of applications for construction and operating licenses. Citations are listed by Docket number in accession number sequence. The Table of Contents is arranged both by Docket number and by nuclear power plant name

  4. Cumulative Effect of Depression on Dementia Risk

    OpenAIRE

    Olazarán, J.; Trincado, R.; Bermejo-Pareja, F.

    2013-01-01

    Objective. To analyze a potential cumulative effect of life-time depression on dementia and Alzheimer's disease (AD), with control of vascular factors (VFs). Methods. This study was a subanalysis of the Neurological Disorders in Central Spain (NEDICES) study. Past and present depression, VFs, dementia status, and dementia due to AD were documented at study inception. Dementia status was also documented after three years. Four groups were created according to baseline data: never depression (n...

  5. Cumulative release to the accessible environment

    International Nuclear Information System (INIS)

    Kanehiro, B.

    1985-01-01

    The Containment and Isolation Working Group considered issues related to the postclosure behavior of repositories in crystalline rock. This working group was further divided into subgroups to consider the progress since the 1978 GAIN Symposium and identify research needs in the individual areas of regional ground-water flow, ground-water travel time, fractional release, and cumulative release. The analysis and findings of the Fractional Release Subgroup are presented

  6. Zinc triggers microglial activation.

    Science.gov (United States)

    Kauppinen, Tiina M; Higashi, Youichirou; Suh, Sang Won; Escartin, Carole; Nagasawa, Kazuki; Swanson, Raymond A

    2008-05-28

    Microglia are resident immune cells of the CNS. When stimulated by infection, tissue injury, or other signals, microglia assume an activated, "ameboid" morphology and release matrix metalloproteinases, reactive oxygen species, and other proinflammatory factors. This innate immune response augments host defenses, but it can also contribute to neuronal death. Zinc is released by neurons under several conditions in which microglial activation occurs, and zinc chelators can reduce neuronal death in animal models of cerebral ischemia and neurodegenerative disorders. Here, we show that zinc directly triggers microglial activation. Microglia transfected with a nuclear factor-kappaB (NF-kappaB) reporter gene showed a severalfold increase in NF-kappaB activity in response to 30 μM zinc. Cultured mouse microglia exposed to 15-30 μM zinc increased nitric oxide production, increased F4/80 expression, altered cytokine expression, and assumed the activated morphology. Zinc-induced microglial activation was blocked by inhibiting NADPH oxidase, poly(ADP-ribose) polymerase-1 (PARP-1), or NF-kappaB activation. Zinc injected directly into mouse brain induced microglial activation in wild-type mice, but not in mice genetically lacking PARP-1 or NADPH oxidase activity. Endogenous zinc release, induced by cerebral ischemia-reperfusion, likewise induced a robust microglial reaction, and this reaction was suppressed by the zinc chelator CaEDTA. Together, these results suggest that extracellular zinc triggers microglial activation through the sequential activation of NADPH oxidase, PARP-1, and NF-kappaB. These findings identify a novel trigger for microglial activation and a previously unrecognized mechanism by which zinc may contribute to neurological disorders.

  7. EPA Workshop on Epigenetics and Cumulative Risk ...

    Science.gov (United States)

    The workshop included presentations and discussions by scientific experts pertaining to three topics (i.e., epigenetic changes associated with diverse stressors, key science considerations in understanding epigenetic changes, and practical application of epigenetic tools to address cumulative risks from environmental stressors), addressed several questions under each topic, and included an opportunity for attendees to participate in break-out groups, provide comments and ask questions. Workshop goals: the workshop seeks to examine the opportunity for use of aggregate epigenetic change as an indicator in cumulative risk assessment for populations exposed to multiple stressors that affect epigenetic status. Epigenetic changes are specific molecular changes around DNA that alter expression of genes. Epigenetic changes include DNA methylation, formation of histone adducts, and changes in micro RNAs. Research today indicates that epigenetic changes are involved in many chronic diseases (cancer, cardiovascular disease, obesity, diabetes, mental health disorders, and asthma). Research has also linked a wide range of stressors including pollution and social factors with occurrence of epigenetic alterations. Epigenetic changes have the potential to reflect impacts of risk factors across multiple stages of life. Only recently receiving attention is the nexus between the factors of cumulative exposure to environmental

  8. Higher order cumulants in colorless partonic plasma

    Energy Technology Data Exchange (ETDEWEB)

    Cherif, S. [Sciences and Technologies Department, University of Ghardaia, Ghardaia, Algiers (Algeria); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria); Ahmed, M. A. A. [Department of Physics, College of Science, Taibah University Al-Madinah Al-Mounawwarah KSA (Saudi Arabia); Department of Physics, Taiz University in Turba, Taiz (Yemen); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria); Ladrem, M., E-mail: mladrem@yahoo.fr [Department of Physics, College of Science, Taibah University Al-Madinah Al-Mounawwarah KSA (Saudi Arabia); Laboratoire de Physique et de Mathématiques Appliquées (LPMA), ENS-Kouba (Bachir El-Ibrahimi), Algiers (Algeria)

    2016-06-10

    Any physical system considered to study the QCD deconfinement phase transition certainly has a finite volume, so finite-size effects are inevitably present. This renders the location of the phase transition and the determination of its order an extremely difficult task, even in the simplest known cases. In order to identify and locate the colorless QCD deconfinement transition point in finite volume, T_0(V), a new approach based on the finite-size cumulant expansion of the order parameter and the L_{m,n}-method is used. We have shown that both the higher-order cumulants and their ratios, associated with the thermodynamical fluctuations of the order parameter, behave in a distinctive way in the QCD deconfinement phase transition, revealing pronounced oscillations in the transition region. The sign structure and the oscillatory behavior of these quantities in the vicinity of the deconfinement phase transition point might be a sensitive probe and may allow one to elucidate their relation to the QCD phase transition point. In the context of our model, we have shown that the finite-volume transition point is always associated with the appearance of a particular point in all higher-order cumulants under consideration.
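
As background for the cumulants and ratios discussed above, the first few cumulants of a sampled order parameter can be computed from central moments (k2 = m2, k3 = m3, k4 = m4 - 3 m2^2). The sketch below is generic textbook material, not the paper's model; the Gaussian sample is used only because its higher cumulants are known to vanish.

```python
import numpy as np

def cumulants(x):
    """Second, third, and fourth cumulants of a sample."""
    x = np.asarray(x, dtype=float)
    c = x - x.mean()
    m2, m3, m4 = (c**2).mean(), (c**3).mean(), (c**4).mean()
    return m2, m3, m4 - 3.0 * m2**2   # k2, k3, k4

rng = np.random.default_rng(0)
k2, k3, k4 = cumulants(rng.normal(size=100_000))
# For a Gaussian, k3 and k4 (and ratios such as k4/k2) vanish;
# near a phase transition they instead oscillate and change sign.
print(k2, k3, k4)
```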

  9. Cumulative irritation potential of topical retinoid formulations.

    Science.gov (United States)

    Leyden, James J; Grossman, Rachel; Nighland, Marge

    2008-08-01

    Localized irritation can limit treatment success with topical retinoids such as tretinoin and adapalene. The factors that influence irritant reactions have been shown to include individual skin sensitivity, the particular retinoid and concentration used, and the vehicle formulation. To compare the cutaneous tolerability of tretinoin 0.04% microsphere gel (TMG) with that of adapalene 0.3% gel and a standard tretinoin 0.025% cream. The results of 2 randomized, investigator-blinded studies of 2 to 3 weeks' duration, which utilized a split-face method to compare cumulative irritation scores induced by topical retinoids in subjects with healthy skin, were combined. Study 1 compared TMG 0.04% with adapalene 0.3% gel over 2 weeks, while study 2 compared TMG 0.04% with tretinoin 0.025% cream over 3 weeks. In study 1, TMG 0.04% was associated with significantly lower cumulative scores for erythema, dryness, and burning/stinging than adapalene 0.3% gel. However, in study 2, there were no significant differences in cumulative irritation scores between TMG 0.04% and tretinoin 0.025% cream. Measurements of erythema by a chromameter showed no significant differences between the test formulations in either study. Cutaneous tolerance of TMG 0.04% on the face was superior to that of adapalene 0.3% gel and similar to that of a standard tretinoin cream containing a lower concentration of the drug (0.025%).

  10. A self seeded first level track trigger for ATLAS

    International Nuclear Information System (INIS)

    Schöning, A

    2012-01-01

    For the planned high-luminosity upgrade of the Large Hadron Collider, aiming to increase the instantaneous luminosity to 5×10^34 cm^-2 s^-1, the implementation of a first-level track trigger has been proposed. This trigger could be installed around the year 2021, along with the complete renewal of the ATLAS inner detector. The fast readout of the hit information from the Inner Detector is considered the main challenge of such a track trigger. Different concepts for the implementation of a first-level trigger are currently being studied within the ATLAS collaboration. The so-called 'self-seeded' track trigger concept exploits fast front-end filtering algorithms based on cluster-size reconstruction and fast vector tracking to select hits associated with high-momentum tracks. Simulation studies have been performed and results on efficiencies, purities and trigger rates are presented for different layouts.

  11. ATLAS Tau Trigger

    CERN Document Server

    Belanger-Champagne, C; Bosman, M; Brenner, R; Casado, MP; Czyczula, Z; Dam, M; Demers, S; Farrington, S; Igonkina, O; Kalinowski, A; Kanaya, N; Osuna, C; Pérez, E; Ptacek, E; Reinsch, A; Saavedra, A; Sopczak, A; Strom, D; Torrence, E; Tsuno, S; Vorwerk, V; Watson, A; Xella, S

    2008-01-01

    Moving to the high energy scale of the LHC, the identification of tau leptons will become a necessary and very powerful tool, allowing discovery of physics beyond the Standard Model. Many models, among them the light SM Higgs and various SUSY models, predict abundant production of taus relative to other leptons. The reconstruction of hadronic tau decays, although a very challenging task in hadronic environments, increases the signal efficiency by at least a factor of 2 and provides an independent control sample to disentangle leptonic tau decays from prompt electrons and muons. Thanks to its advanced calorimetry and tracking, the ATLAS experiment has developed tools to efficiently identify hadronic taus at the trigger level. In this presentation we review the characteristics of taus and the methods to suppress low-multiplicity, low-energy jet contributions, and we address the tau trigger chain, which provides a rejection rate of 10^5. We further present plans for commissioning the ATLA...

  12. The D0 calorimeter trigger

    International Nuclear Information System (INIS)

    Guida, J.

    1992-12-01

    The D0 calorimeter trigger system consists of many levels to make physics-motivated trigger decisions. The Level-1 trigger uses hardware techniques to reduce the trigger rate from ~100 kHz to 200 Hz. It forms sums of electromagnetic and hadronic energy, globally and in towers, along with finding the missing transverse energy. A minimum energy is required on these energy sums to pass the event. The Level-2 trigger is a set of software filters, operating in a parallel-processing MicroVAX farm, which further reduces the trigger rate to a few hertz. These filters reject events which lack electron candidates, jet candidates, or missing transverse energy. The performance of these triggers during the early running of the D0 detector will also be discussed.
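
The Level-1 decision described above (energy sums over towers, missing transverse energy, minimum-energy thresholds) can be sketched in a few lines. This is a schematic illustration of the logic, not the D0 hardware algorithm; the thresholds and tower data are invented.

```python
import math

def level1_accept(towers, em_cut=10.0, tot_cut=30.0, met_cut=20.0):
    """towers: list of (phi, em_et, had_et); ET in arbitrary units."""
    em_sum = sum(em for _, em, _ in towers)                 # EM energy sum
    tot_sum = sum(em + had for _, em, had in towers)        # EM + hadronic sum
    px = sum((em + had) * math.cos(phi) for phi, em, had in towers)
    py = sum((em + had) * math.sin(phi) for phi, em, had in towers)
    met = math.hypot(px, py)   # missing ET balances the visible vector sum
    # pass the event if any sum exceeds its minimum-energy threshold
    return em_sum > em_cut or tot_sum > tot_cut or met > met_cut

event = [(0.0, 12.0, 3.0), (3.1, 2.0, 5.0)]
print(level1_accept(event))   # EM sum = 14 exceeds em_cut = 10
```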

  13. New tests of cumulative prospect theory and the priority heuristic

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-04-01

    Full Text Available Previous tests of cumulative prospect theory (CPT) and of the priority heuristic (PH) found evidence contradicting these two models of risky decision making. However, those tests were criticized because they had characteristics that might "trigger" use of other heuristics. This paper presents new tests that avoid those characteristics. Expected values of the gambles are nearly equal in each choice. In addition, if a person followed expected value (EV), expected utility (EU), CPT, or PH in these tests, she would shift her preferences in the same direction as shifts in EV or EU. In contrast, the transfer of attention exchange model (TAX) and a similarity model predict that people will reverse preferences in the opposite direction. Results contradict the PH, even when PH is modified to include a preliminary similarity evaluation using the PH parameters. New tests of probability-consequence interaction were also conducted. Strong interactions were observed, contrary to PH. These results add to the growing bodies of evidence showing that neither CPT nor PH is an accurate description of risky decision making.
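
To make the "nearly equal expected values" design concrete, the sketch below computes EV and a CPT valuation for two-branch gambles over gains, using the standard Tversky-Kahneman (1992) functional forms with conventional parameter estimates. The gambles and parameters are illustrative, not taken from this study.

```python
def w(p, gamma=0.61):
    """TK-1992 probability weighting function for gains."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def ev(gamble):
    """Expected value of [(probability, outcome), ...]."""
    return sum(p * x for p, x in gamble)

def cpt(gamble, alpha=0.88):
    """CPT value for nonnegative outcomes: rank-dependent weighting."""
    g = sorted(gamble, key=lambda px: -px[1])   # best outcome first
    total, cum = 0.0, 0.0
    for p, x in g:
        total += (w(cum + p) - w(cum)) * x**alpha
        cum += p
    return total

A = [(0.5, 100), (0.5, 0)]
B = [(0.05, 1000), (0.95, 0)]
print(ev(A), ev(B))    # equal EVs, as in the new tests described above
print(cpt(A), cpt(B))  # CPT nevertheless values the two gambles differently
```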

  14. Surgery for trigger finger.

    Science.gov (United States)

    Fiorini, Haroldo Junior; Tamaoki, Marcel Jun; Lenza, Mário; Gomes Dos Santos, Joao Baptista; Faloppa, Flávio; Belloti, Joao Carlos

    2018-02-20

    Trigger finger is a common clinical disorder, characterised by pain and catching as the patient flexes and extends digits, because of disproportion between the diameter of the flexor tendons and the A1 pulley. The treatment approach may include non-surgical or surgical treatments. Currently there is no consensus about the best surgical treatment approach (open, percutaneous or endoscopic approaches). To evaluate the effectiveness and safety of different methods of surgical treatment for trigger finger (open, percutaneous or endoscopic approaches) in adults at any stage of the disease. We searched CENTRAL, MEDLINE, Embase and LILACS up to August 2017. We included randomised or quasi-randomised controlled trials that assessed adults with trigger finger and compared any type of surgical treatment with each other or with any other non-surgical intervention. The major outcomes were the resolution of trigger finger, pain, hand function, participant-reported treatment success or satisfaction, recurrence of triggering, adverse events and neurovascular injury. Two review authors independently selected the trial reports, extracted the data and assessed the risk of bias. Treatment effects were calculated as risk ratios (RRs) for dichotomous outcomes and as mean differences (MDs) or standardised mean differences (SMDs) for continuous outcomes, with 95% confidence intervals (CIs). When possible, the data were pooled into meta-analysis using the random-effects model. GRADE was used to assess the quality of evidence for each outcome. Fourteen trials were included, totalling 1260 participants, with 1361 trigger fingers. The age of participants included in the studies ranged from 16 to 88 years; and the majority of participants were women (approximately 70%). 
The average duration of symptoms ranged from three to 15 months, and the follow-up after the procedure ranged from eight weeks to 23 months. The studies reported nine types of comparisons: open surgery versus steroid injections (two

  15. A bivariate optimal replacement policy with cumulative repair cost ...

    Indian Academy of Sciences (India)

    Min-Tsai Lai

    Keywords: shock model; cumulative damage model; cumulative repair cost limit; preventive maintenance model.

  16. Triggering at high luminosity: fake triggers from pile-up

    International Nuclear Information System (INIS)

    Johnson, R.

    1983-01-01

    Triggers based on a cut in transverse momentum (p/sub t/) have proved to be useful in high energy physics, both because they indicate that a hard constituent scattering has occurred and because they can be made quickly enough to gate electronics. These triggers will continue to be useful at high luminosities if overlapping events do not cause an excessive number of fake triggers. In this paper, I determine whether this is indeed a problem at high-luminosity machines
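
The pile-up concern can be quantified with a back-of-envelope Poisson estimate: the mean number of interactions per bunch crossing grows linearly with luminosity, so the rate of crossings in which several soft overlapping events could jointly fake a large summed-ET trigger grows rapidly. All numbers below (cross-section, crossing rate, the choice of "3 or more overlaps" as a fake proxy) are illustrative assumptions, not from the paper.

```python
import math

def p_at_least(n, mu):
    """Poisson probability of >= n interactions in one crossing."""
    return 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n))

sigma = 60e-27       # inelastic cross-section in cm^2 (illustrative)
bunch_rate = 4e7     # bunch crossings per second (illustrative)

for lumi in (1e32, 1e33, 1e34):           # luminosity in cm^-2 s^-1
    mu = lumi * sigma / bunch_rate        # mean interactions per crossing
    # rate of crossings with >= 3 overlapping events: a crude proxy for
    # crossings able to fake a high-p_t / high-sum-ET trigger
    print(f"L={lumi:.0e}  mu={mu:.3f}  fake-candidate rate={bunch_rate * p_at_least(3, mu):.3e} Hz")
```

The point of the exercise: a tenfold luminosity increase raises mu tenfold but raises the multi-overlap probability much faster than tenfold while mu is small, which is why fake triggers from pile-up dominate the high-luminosity discussion.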

  17. Nostalgia: content, triggers, functions.

    Science.gov (United States)

    Wildschut, Tim; Sedikides, Constantine; Arndt, Jamie; Routledge, Clay

    2006-11-01

    Seven methodologically diverse studies addressed 3 fundamental questions about nostalgia. Studies 1 and 2 examined the content of nostalgic experiences. Descriptions of nostalgic experiences typically featured the self as a protagonist in interactions with close others (e.g., friends) or in momentous events (e.g., weddings). Also, the descriptions contained more expressions of positive than negative affect and often depicted the redemption of negative life scenes by subsequent triumphs. Studies 3 and 4 examined triggers of nostalgia and revealed that nostalgia occurs in response to negative mood and the discrete affective state of loneliness. Studies 5, 6, and 7 investigated the functional utility of nostalgia and established that nostalgia bolsters social bonds, increases positive self-regard, and generates positive affect. These findings demarcate key landmarks in the hitherto uncharted research domain of nostalgia.

  18. On interference of cumulative proton production mechanisms

    International Nuclear Information System (INIS)

    Braun, M.A.; Vechernin, V.V.

    1993-01-01

    The dynamical picture of cumulative proton production in hA-collisions is considered by means of a diagram analysis, with the NN interaction described by a non-relativistic NN potential. The contributions of the various mechanisms (spectator, direct and rescattering) to backward-hemisphere proton production are calculated within the framework of this common approach. The emphasis is on the comparison of the relative contributions of these mechanisms for various angles, taking into account their interference. Comparison with experimental data is also presented. (author)

  19. Preserved cumulative semantic interference despite amnesia

    Directory of Open Access Journals (Sweden)

    Gary Michael Oppenheim

    2015-05-01

    As predicted by Oppenheim et al.'s (2010) implicit incremental learning account, WRP's BCN RTs demonstrated strong (and significant) repetition priming and semantic blocking effects (Figure 1). Similar to typical results from neurally intact undergraduates, WRP took longer to name pictures presented in semantically homogeneous blocks than in heterogeneous blocks, an effect that increased with each cycle. This result challenges accounts that ascribe cumulative semantic interference in this task to explicit memory mechanisms, instead suggesting that the effect has the sort of implicit learning bases that are typically spared in hippocampal amnesia.

  20. Is cumulated pyrethroid exposure associated with prediabetes?

    DEFF Research Database (Denmark)

    Hansen, Martin Rune; Jørs, Erik; Lander, Flemming

    2014-01-01

    The aim was to investigate an association between exposure to pyrethroids and abnormal glucose regulation (prediabetes or diabetes). A cross-sectional study was performed among 116 pesticide sprayers from public vector control programs in Bolivia and 92 nonexposed controls. Pesticide exposure (duration, intensity ...) ... pyrethroids, a significant positive trend was observed between cumulative pesticide exposure (total number of hours sprayed) and the adjusted OR of abnormal glucose regulation, with OR 14.7 [0.9-235] in the third exposure quintile. The study found a severely increased prevalence of prediabetes among Bolivian ...

  1. A solar tornado triggered by flares?

    OpenAIRE

    Panesar, N. K.; Innes, D. E.; Tiwari, S. K.; Low, B. C.

    2013-01-01

    Context. Solar tornados are dynamical, conspicuously helical magnetic structures that are mainly observed as a prominence activity. Aims. We investigate and propose a triggering mechanism for the solar tornado observed in a prominence cavity by SDO/AIA on September 25, 2011. Methods. High-cadence EUV images from the SDO/AIA and the Ahead spacecraft of STEREO/EUVI are used to correlate three flares in the neighbouring active-region (NOAA 11303) and their EUV waves with the dynamical de...

  2. Storytelling as a trigger for sharing conversations

    OpenAIRE

    Emma Louise Parfitt

    2014-01-01

    This article explores whether traditional oral storytelling can be used to provide insights into the way in which young people of 12-14 years identify and understand the language of emotion and behaviour. Following the preliminary analysis, I propose that storytelling may trigger sharing conversations. My research attempts to extend the social and historical perspectives of Jack Zipes, on fairy tales, into a sociological analysis of young people’s lives today. I seek to investigate the extent...

  3. Chapter 19. Cumulative watershed effects and watershed analysis

    Science.gov (United States)

    Leslie M. Reid

    1998-01-01

    Cumulative watershed effects are environmental changes that are affected by more than one land-use activity and that are influenced by processes involving the generation or transport of water. Almost all environmental changes are cumulative effects, and almost all land-use activities contribute to cumulative effects.

  4. Original and cumulative prospect theory: a discussion of empirical differences

    NARCIS (Netherlands)

    Wakker, P.P.; Fennema, H.

    1997-01-01

    This note discusses differences between prospect theory and cumulative prospect theory. It shows that cumulative prospect theory is not merely a formal correction of some theoretical problems in prospect theory, but it also gives different predictions. Experiments are described that favor cumulative

  5. The ATLAS hadronic tau trigger

    CERN Document Server

    Black, C; The ATLAS collaboration

    2012-01-01

    With the high luminosities of proton-proton collisions achieved at the LHC, the strategies for triggering have become more important than ever for physics analysis. The naive inclusive single tau lepton triggers now suffer from severe rate limitations. To allow for a large program of physics analyses with taus, the development of topological triggers that combine tau signatures with other measured quantities in the event is required. These combined triggers open many opportunities to study new physics beyond the Standard Model and to search for the Standard Model Higgs. We present the status and performance of the hadronic tau trigger in ATLAS. We demonstrate that the ATLAS tau trigger ran remarkably well over 2011, and how the lessons learned from 2011 led to numerous improvements in the preparation of the 2012 run. These improvements include the introduction of tau selection criteria that are robust against varying pileup scenarios, and the implementation of multivariate selection techniques in the tau trig...

  6. The ATLAS hadronic tau trigger

    CERN Document Server

    Black, C; The ATLAS collaboration

    2012-01-01

    With the high luminosities of proton-proton collisions achieved at the LHC, the strategies for triggering have become more important than ever for physics analysis. The naïve inclusive single tau lepton triggers now suffer from severe rate limitations. To allow for a large program of physics analyses with taus, the development of topological triggers that combine tau signatures with other measured quantities in the event is required. These combined triggers open many opportunities to study new physics beyond the Standard Model and to search for the Standard Model Higgs. We present the status and performance of the hadronic tau trigger in ATLAS. We demonstrate that the ATLAS tau trigger ran remarkably well over 2011, and how the lessons learned from 2011 led to numerous improvements in the preparation of the 2012 run. These improvements include the introduction of tau selection criteria that are robust against varying pileup scenarios, and the implementation of multivariate selection techniques in the tau tri...

  7. The ATLAS hadronic tau trigger

    International Nuclear Information System (INIS)

    Shamim, Mansoora

    2012-01-01

    The extensive tau physics program of the ATLAS experiment relies heavily on the trigger to select hadronic decays of the tau lepton. Such a trigger is implemented in ATLAS to efficiently collect signal events while keeping the rate of multi-jet background within the allowed bandwidth. This contribution summarizes the performance of the ATLAS hadronic tau trigger system during the 2011 data-taking period and the improvements implemented for the 2012 data collection.

  8. Flexible trigger menu implementation on the Global Trigger for the CMS Level-1 trigger upgrade

    Science.gov (United States)

    MATSUSHITA, Takashi; CMS Collaboration

    2017-10-01

    The CMS experiment at the Large Hadron Collider (LHC) has continued to explore physics at the high-energy frontier in 2016. The integrated luminosity delivered by the LHC in 2016 was 41 fb-1 with a peak luminosity of 1.5 × 1034 cm-2s-1 and peak mean pile-up of about 50, all exceeding the initial estimations for 2016. The CMS experiment has upgraded its hardware-based Level-1 trigger system to maintain its performance for new physics searches and precision measurements at high luminosities. The Global Trigger is the final step of the CMS Level-1 trigger and implements a trigger menu, a set of selection requirements applied to the final list of objects from calorimeter and muon triggers, for reducing the 40 MHz collision rate to 100 kHz. The Global Trigger has been upgraded with state-of-the-art FPGA processors on Advanced Mezzanine Cards with optical links running at 10 GHz in a MicroTCA crate. The powerful processing resources of the upgraded system enable implementation of more algorithms at a time than previously possible, allowing CMS to be more flexible in how it handles the available trigger bandwidth. Algorithms for a trigger menu, including topological requirements on multi-objects, can be realised in the Global Trigger using the newly developed trigger menu specification grammar. Analysis-like trigger algorithms can be represented in an intuitive manner and the algorithms are translated to corresponding VHDL code blocks to build a firmware. The grammar can be extended in future as the needs arise. The experience of implementing trigger menus on the upgraded Global Trigger system will be presented.
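
The idea of an "analysis-like" trigger algorithm with topological requirements on multiple objects can be illustrated with a toy example. The sketch below is not the actual trigger menu grammar or its VHDL translation; the object fields, thresholds, and function names are invented for illustration only.

```python
# Toy sketch of an analysis-like trigger condition of the kind a trigger-menu
# grammar might express: two muons above a pt threshold with a small |delta eta|.
# All names and cut values here are hypothetical.

def dimuon_trigger(muons, pt_cut=5.0, deta_cut=1.8):
    """Fire if any muon pair passes the pt and |delta eta| requirements."""
    good = [m for m in muons if m["pt"] >= pt_cut]
    for i in range(len(good)):
        for j in range(i + 1, len(good)):
            if abs(good[i]["eta"] - good[j]["eta"]) < deta_cut:
                return True
    return False

event = [{"pt": 7.2, "eta": 0.4}, {"pt": 6.1, "eta": 1.1}, {"pt": 3.0, "eta": 2.5}]
print(dimuon_trigger(event))  # True: two muons pass pt, |delta eta| = 0.7 < 1.8
```

In the upgraded Global Trigger, conditions of this shape are written in the menu specification language and compiled to VHDL blocks rather than evaluated in software.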

  9. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric...

  10. BTeV trigger/DAQ innovations

    International Nuclear Information System (INIS)

    Votava, Margaret

    2005-01-01

    The BTeV experiment was a collider based high energy physics (HEP) B-physics experiment proposed at Fermilab. It included a large-scale, high speed trigger/data acquisition (DAQ) system, reading data off the detector at 500 Gbytes/sec and writing to mass storage at 200 Mbytes/sec. The online design was considered to be highly credible in terms of technical feasibility, schedule and cost. This paper will give an overview of the overall trigger/DAQ architecture, highlight some of the challenges, and describe the BTeV approach to solving some of the technical challenges. At the time of termination in early 2005, the experiment had just passed its baseline review. Although not fully implemented, many of the architecture choices, design, and prototype work for the online system (both trigger and DAQ) were well on their way to completion. Other large, high-speed online systems may have interest in some of the design choices and directions of BTeV, including (a) a commodity-based tracking trigger running asynchronously at full rate, (b) the hierarchical control and fault tolerance in a large real time environment, (c) a partitioning model that supports offline processing on the online farms during idle periods with plans for dynamic load balancing, and (d) an independent parallel highway architecture.

  11. Trigger and data acquisition

    CERN Multimedia

    CERN. Geneva; Gaspar, C

    2001-01-01

    Past LEP experiments generated data at 0.5 MByte/s from particle detectors with over a quarter of a million readout channels. The process of reading out the electronic channels, treating them, and storing the data produced by each collision for further analysis by the physicists is called "Data Acquisition". Not all beam crossings produce interesting physics "events"; picking the interesting ones is the task of the "Trigger" system. In order to make sure that the data is collected in good conditions, the experiment's operation has to be constantly verified. In all, at LEP experiments over 100 000 parameters were monitored, controlled, and synchronized by the "Monitoring and control" system. In the future, LHC experiments will produce as much data in a single day as a LEP detector did in a full year's running, with a raw data rate of 10 - 100 MBytes/s, and will have to cope with some 800 million proton-proton collisions a second; of these collisions, only one in 100 million million is interesting for new particle se...

  12. Performance of the ALICE PHOS trigger and improvements for RUN 2

    International Nuclear Information System (INIS)

    Zhao, C; Røed, K; Skaali, T B; Liu, L; Rohrich, D; Kharlov, Y; Bratrud, L; Alme, J

    2013-01-01

    This paper discusses the performance of the PHOS level-0 trigger and planned improvements for RUN 2. Due to hardware constraints, the Trigger Region Unit boards are limited to an operating frequency of 20 MHz, which has led to some ambiguities and biases in the trigger inputs. The trigger input generation scheme was therefore optimized to improve the performance. The PHOS level-0 trigger system has been working with acceptable efficiency and purity. Proposed actions to further improve the performance and possibly eliminate the impact of the biased trigger inputs are also presented.

  13. Symptom-triggered benzodiazepine therapy for alcohol withdrawal syndrome in the emergency department: a comparison with the standard fixed dose benzodiazepine regimen.

    LENUS (Irish Health Repository)

    Cassidy, Eugene M

    2012-10-01

    The aim of the study was to compare symptom-triggered and standard benzodiazepine regimens for the treatment of alcohol withdrawal syndrome in an emergency department clinical decision unit. The authors found that the symptom-triggered approach reduced cumulative benzodiazepine dose and length of stay.
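
The dose-sparing effect of symptom-triggered dosing can be sketched numerically. The scores, threshold, and dose values below are hypothetical placeholders, not the regimens or scales used in the study above; the point is only that dosing conditioned on a severity score accumulates less drug than a fixed schedule.

```python
# Hedged sketch: cumulative benzodiazepine dose under a fixed schedule versus
# symptom-triggered dosing. All numbers are invented for illustration.

def fixed_dose_total(n_doses=8, dose_mg=10.0):
    """Fixed regimen: every scheduled dose is given."""
    return n_doses * dose_mg

def symptom_triggered_total(scores, threshold=10, dose_mg=10.0):
    """Dose given only when the withdrawal severity score exceeds the threshold."""
    return sum(dose_mg for s in scores if s > threshold)

hourly_scores = [14, 12, 9, 11, 8, 7, 6, 5]  # hypothetical severity scores
print(fixed_dose_total())                      # 80.0
print(symptom_triggered_total(hourly_scores))  # 30.0
```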

  14. Cumulative Clearness Index Frequency Distributions on the Territory of the Russian Federation

    Science.gov (United States)

    Frid, S. E.; Lisitskaya, N. V.; Popel, O. S.

    2018-02-01

    Cumulative distributions of clearness index values are constructed for the territory of Russia based on ground observation results and NASA POWER data. The two sets of distributions lie close to each other, which means that the NASA POWER data can be used in simulations of solar power installations at temperate and high latitudes. An approximation of the obtained distributions is carried out, and the coefficients of the equations for the cumulative clearness index distributions are determined for a wide range of climatic conditions. Equations originally proposed for a tropical climate are used in the calculations, so they can be regarded as universal.
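
The cumulative distributions discussed above can be sketched as an empirical CDF built from a sample of clearness-index values. The sample values below are invented for illustration; the record's actual data come from ground stations and NASA POWER.

```python
# Minimal sketch: empirical cumulative distribution of clearness-index (Kt) values.

def cumulative_distribution(values):
    """Return (x, F(x)) pairs: the fraction of samples <= each sorted value."""
    xs = sorted(values)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

kt = [0.21, 0.55, 0.47, 0.62, 0.35, 0.50]  # hypothetical daily clearness indices
for x, f in cumulative_distribution(kt):
    print(f"Kt <= {x:.2f}: F = {f:.2f}")
```

A parametric approximation, as in the record above, would then fit an equation to these (x, F) pairs.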

  15. Evolution model with a cumulative feedback coupling

    Science.gov (United States)

    Trimper, Steffen; Zabrocki, Knud; Schulz, Michael

    2002-05-01

    The paper is concerned with a toy model that generalizes the standard Lotka-Volterra equation for a certain population by introducing a competition between an instantaneous and an accumulative, history-dependent nonlinear feedback, the origin of which could be a contribution from any kind of mismanagement in the past. The results depend on the sign of that additional cumulative loss or gain term of strength λ. In the case of a positive coupling, the system offers a maximum gain achieved after a finite time, but the population dies out in the long-time limit. In this case the instantaneous loss term of strength u is irrelevant and the model exhibits an exact solution. In the opposite case, λ<0, the time evolution of the system is terminated in a crash after a finite time ts, provided u=0. This singularity at a finite time can be avoided if u≠0. The approach may well be of relevance for the qualitative understanding of more realistic descriptions.
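
The positive-coupling behaviour described above (a transient maximum followed by extinction) can be reproduced numerically. The exact equation form below is an assumption for illustration, not the paper's equation: a logistic-type growth term plus a cumulative feedback of strength lam acting through the integrated population C(t).

```python
# Assumed illustrative form:  dx/dt = x * (1 - u*x - lam*C),  dC/dt = x,
# integrated with a simple Euler scheme.

def integrate(x0=0.1, u=0.1, lam=0.5, dt=0.01, steps=5000):
    x, c, xs = x0, 0.0, []
    for _ in range(steps):
        dx = x * (1.0 - u * x - lam * c)
        c += x * dt      # cumulative (history-dependent) feedback grows with x
        x += dx * dt
        xs.append(x)
    return xs

xs = integrate()
peak = max(xs)
print(peak > xs[0], xs[-1] < peak)  # population peaks, then declines toward zero
```

With lam > 0 the accumulated term eventually dominates, so the population reaches a maximum and then decays, matching the qualitative picture in the abstract.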

  16. Psychometric properties of the Cumulated Ambulation Score

    DEFF Research Database (Denmark)

    Ferriero, Giorgio; Kristensen, Morten T; Invernizzi, Marco

    2018-01-01

    INTRODUCTION: In the geriatric population, independent mobility is a key factor in determining readiness for discharge following acute hospitalization. The Cumulated Ambulation Score (CAS) is a potentially valuable score that allows day-to-day measurements of basic mobility. The CAS was developed...... and validated in older patients with hip fracture as an early postoperative predictor of short-term outcome, but it is also used to assess geriatric in-patients with acute medical illness. Despite the fast- accumulating literature on the CAS, to date no systematic review synthesizing its psychometric properties....... Of 49 studies identified, 17 examined the psychometric properties of the CAS. EVIDENCE SYNTHESIS: Most papers dealt with patients after hip fracture surgery, and only 4 studies assessed the CAS psychometric characteristics also in geriatric in-patients with acute medical illness. Two versions of CAS...

  17. The TOTEM modular trigger system

    Energy Technology Data Exchange (ETDEWEB)

    Bagliesi, M.G., E-mail: mg.bagliesi@pi.infn.i [University of Siena and INFN Pisa (Italy); Berretti, M.; Cecchi, R.; Greco, V.; Lami, S.; Latino, G.; Oliveri, E.; Pedreschi, E.; Scribano, A.; Spinella, F.; Turini, N. [University of Siena and INFN Pisa (Italy)

    2010-05-21

    The TOTEM experiment will measure the total cross-section with the luminosity independent method and study elastic and diffractive scattering at the LHC. We are developing a modular trigger system, based on programmable logic, that will select meaningful events within 2.5 μs. The trigger algorithm is based on a tree structure in order to obtain information compression. The trigger primitive is generated directly on the readout chip, VFAT, which has a specific fast output providing low-resolution hit information. In two of the TOTEM detectors, Roman Pots and T2, a coincidence chip will perform track recognition directly on the detector readout boards, while for T1 the hits are transferred from the VFATs to the trigger hardware. Starting from more than 2000 bits delivered by the detector electronics, we extract, in a first step, six trigger patterns of 32 LVDS signals each; we then build, on a dedicated board, a 1-bit (L1) trigger signal for the TOTEM experiment and 16 trigger bits for the CMS experiment global trigger system for future common data taking.

  18. The TOTEM modular trigger system

    International Nuclear Information System (INIS)

    Bagliesi, M.G.; Berretti, M.; Cecchi, R.; Greco, V.; Lami, S.; Latino, G.; Oliveri, E.; Pedreschi, E.; Scribano, A.; Spinella, F.; Turini, N.

    2010-01-01

    The TOTEM experiment will measure the total cross-section with the luminosity independent method and study elastic and diffractive scattering at the LHC. We are developing a modular trigger system, based on programmable logic, that will select meaningful events within 2.5 μs. The trigger algorithm is based on a tree structure in order to obtain information compression. The trigger primitive is generated directly on the readout chip, VFAT, which has a specific fast output providing low-resolution hit information. In two of the TOTEM detectors, Roman Pots and T2, a coincidence chip will perform track recognition directly on the detector readout boards, while for T1 the hits are transferred from the VFATs to the trigger hardware. Starting from more than 2000 bits delivered by the detector electronics, we extract, in a first step, six trigger patterns of 32 LVDS signals each; we then build, on a dedicated board, a 1-bit (L1) trigger signal for the TOTEM experiment and 16 trigger bits for the CMS experiment global trigger system for future common data taking.
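
The tree-structured information compression mentioned in the TOTEM records can be sketched as repeated OR-reduction of groups of hit bits: each level replaces a group of channels by a single pattern bit. The bit counts and group size below are arbitrary, chosen only to show the compression mechanism.

```python
# Illustrative sketch of tree-structured bit compression: OR fixed-size groups
# of hit bits, level by level. Numbers are not the real TOTEM channel counts.

def compress_level(bits, group=8):
    """Replace each group of `group` bits by their logical OR (0 or 1)."""
    return [int(any(bits[i:i + group])) for i in range(0, len(bits), group)]

hits = [0] * 2048
hits[100] = hits[1500] = 1            # two hit channels
level1 = compress_level(hits)         # 2048 -> 256 pattern bits
level2 = compress_level(level1)       # 256 -> 32 bits
print(sum(hits), sum(level1), sum(level2))  # 2 2 2: hit count preserved, width reduced
```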

  19. Upgrade trigger: Biannual performance update

    CERN Document Server

    Aaij, Roel; Couturier, Ben; Esen, Sevda; De Cian, Michel; De Vries, Jacco Andreas; Dziurda, Agnieszka; Fitzpatrick, Conor; Fontana, Marianna; Grillo, Lucia; Hasse, Christoph; Jones, Christopher Rob; Le Gac, Renaud; Matev, Rosen; Neufeld, Niko; Nikodem, Thomas; Polci, Francesco; Del Buono, Luigi; Quagliani, Renato; Schwemmer, Rainer; Seyfert, Paul; Stahl, Sascha; Szumlak, Tomasz; Vesterinen, Mika Anton; Wanczyk, Joanna; Williams, Mark Richard James; Yin, Hang; Zacharjasz, Emilia Anna

    2017-01-01

    This document presents the performance of the LHCb Upgrade trigger reconstruction sequence, incorporating changes to the underlying reconstruction algorithms and detector description since the Trigger and Online Upgrade TDR. An updated extrapolation is presented using the most recent example of an Event Filter Farm node.

  20. Minimum Bias Trigger in ATLAS

    International Nuclear Information System (INIS)

    Kwee, Regina

    2010-01-01

    Since the restart of the LHC in November 2009, ATLAS has collected inelastic pp collisions to perform first measurements of charged-particle densities. These measurements will help to constrain various models that phenomenologically describe soft parton interactions. Understanding the trigger efficiencies for different event types is therefore crucial to minimize any possible bias in the event selection. ATLAS uses two main minimum bias triggers, featuring complementary detector components and trigger levels. While the hardware-based first trigger level, situated in the forward regions with 2.2 < |η| < 3.8, has proven to select pp collisions very efficiently, the Inner Detector based minimum bias trigger uses a random seed on filled bunches and the central tracking detectors for the event selection. Both triggers were essential for the analysis of kinematic spectra of charged particles. Their performance and trigger efficiency measurements, as well as studies of possible bias sources, will be presented. We also highlight the advantage of these triggers for particle correlation analyses. (author)

  1. Cumulative impact assessments and bird/wind farm interactions: Developing a conceptual framework

    International Nuclear Information System (INIS)

    Masden, Elizabeth A.; Fox, Anthony D.; Furness, Robert W.; Bullman, Rhys; Haydon, Daniel T.

    2010-01-01

    The wind power industry has grown rapidly in the UK to meet EU targets of sourcing 20% of energy from renewable sources by 2020. Although wind power is a renewable energy source, there are environmental concerns over increasing numbers of wind farm proposals and associated cumulative impacts. Individually, a wind farm, or indeed any action, may have minor effects on the environment, but collectively these may be significant, potentially greater than the sum of the individual parts acting alone. EU and UK legislation requires a cumulative impact assessment (CIA) as part of Environmental Impact Assessments (EIA). However, in the absence of detailed guidance and definitions, such assessments within EIA are rarely adequate, restricting the acquisition of basic knowledge about the cumulative impacts of wind farms on bird populations. Here we propose a conceptual framework to promote transparency in CIA through the explicit definition of impacts, actions and scales within an assessment. Our framework requires improved legislative guidance on the actions to include in assessments, and advice on the appropriate baselines against which to assess impacts. Cumulative impacts are currently considered on restricted scales (spatial and temporal) relating to individual development EIAs. We propose that benefits would be gained from elevating CIA to a strategic level, as a component of spatially explicit planning.

  2. Study on the cumulative impact of reclamation activities on ecosystem health in coastal waters.

    Science.gov (United States)

    Shen, Chengcheng; Shi, Honghua; Zheng, Wei; Li, Fen; Peng, Shitao; Ding, Dewen

    2016-02-15

    The purpose of this study is to develop feasible tools to investigate the cumulative impact of reclamations on coastal ecosystem health, so that the strategies of ecosystem-based management can be applied in the coastal zone. An indicator system and model were proposed to assess the cumulative impact synthetically. Two coastal water bodies, namely Laizhou Bay (LZB) and Tianjin coastal waters (TCW), in the Bohai Sea of China were studied and compared, each in a different phase of reclamations. Case studies showed that the indicator scores of coastal ecosystem health in LZB and TCW were 0.75 and 0.68 out of 1.0, respectively. It can be concluded that coastal reclamations have a historically cumulative effect on benthic environment, whose degree is larger than that on aquatic environment. The ecosystem-based management of coastal reclamations should emphasize the spatially and industrially intensive layout. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Downstream cumulative effects of land use on freshwater communities

    Science.gov (United States)

    Kuglerová, L.; Kielstra, B. W.; Moore, D.; Richardson, J. S.

    2015-12-01

    Many streams and rivers are subject to disturbance from intense land use such as urbanization and agriculture, and this is especially obvious for small headwaters. Streams are spatially organized into networks where headwaters represent the tributaries and provide water, nutrients, and organic material to the main stems. Therefore perturbations within the headwaters might be cumulatively carried on downstream. Although we know that the disturbance of headwaters in urban and agricultural landscapes poses threats to downstream river reaches, the magnitude and severity of these changes for ecological communities is less known. We studied stream networks along a gradient of disturbance connected to land use intensity, from urbanized watersheds to watersheds placed in agricultural settings in the Greater Toronto Area. Further, we compared the patterns and processes found in the modified watershed to a control watershed, situated in a forested, less impacted landscape. Preliminary results suggest that hydrological modifications (flash floods), habitat loss (drainage and sewer systems), and water quality issues of small streams in urbanized and agricultural watersheds represent major disturbances and threats for aquatic and riparian biota on local as well as larger spatial scales. For example, communities of riparian plants are dominated by species typical of the land use on adjacent uplands as well as the dominant land use on the upstream contributing area, instead of riparian obligates commonly found in forested watersheds. Further, riparian communities in disturbed environments are dominated by invasive species. The changes in riparian communities are vital for various functions of riparian vegetation. Bank erosion control is suppressed, leading to severe channel transformations and sediment loadings in urbanized watersheds. Food sources for instream biota and thermal regimes are also changed, which further triggers alterations of in-stream biological communities

  4. The ALICE Central Trigger Processor (CTP) upgrade

    International Nuclear Information System (INIS)

    Krivda, M.; Alexandre, D.; Barnby, L.S.; Evans, D.; Jones, P.G.; Jusko, A.; Lietava, R.; Baillie, O. Villalobos; Pospíšil, J.

    2016-01-01

    The ALICE Central Trigger Processor (CTP) at the CERN LHC has been upgraded for LHC Run 2 to improve the Transition Radiation Detector (TRD) data-taking efficiency and the physics performance of ALICE. The upgrade adds a new CTP interaction record sent over a new second Detector Data Link (DDL), a 2 GB DDR3 memory, and extended functionality for trigger classes. The CTP switch has been incorporated directly onto the new LM0 board. A design proposal for an ALICE CTP upgrade for LHC Run 3 is also presented. Part of the development is a low-latency, high-bandwidth interface whose purpose is to minimize the overall trigger latency.

  5. A programmable systolic trigger processor for FERA bus data

    International Nuclear Information System (INIS)

    Appelquist, G.; Hovander, B.; Sellden, B.; Bohm, C.

    1992-09-01

    A generic CAMAC-based trigger processor module for fast processing of large amounts of ADC data has been designed. The module has been realised using complex programmable gate arrays (LCAs from XILINX). The gate arrays have been connected to memories and multipliers in such a way that different gate-array configurations can cover a wide range of module applications. Using this module, it is possible to construct complex trigger processors. The module uses both the fast ECL FERA bus and the CAMAC bus for inputs and outputs. The latter, however, is primarily used for set-up and control but may also be used for data output. Large numbers of ADCs can be served by a hierarchical arrangement of trigger processor modules, processing ADC data with pipelined arithmetic and producing the final result at the apex of the pyramid. The trigger decision will be transmitted to the data acquisition system via a logic signal, while numeric results may be extracted by the CAMAC controller. The trigger processor was originally developed for NUMASS, the proposed neutral-particle search experiment at CERN, where it was designed to serve as a second-level trigger processor. It was required to correct all ADC raw data for efficiency and pedestal, calculate the total calorimeter energy, obtain the optimal time-of-flight data and calculate the particle mass. A suitable mass cut would then deliver the trigger decision. More complex triggers were also considered. (au)

  6. DUMAND data acquisition with triggering

    International Nuclear Information System (INIS)

    Brenner, A.E.; Theriot, D.; March, R.H.

    1980-01-01

    A data acquisition scheme for the standard DUMAND array that includes a simple triggering scheme as a fundamental part of the system is presented. Although a number of parameters are not yet fully understood, it is assumed that thresholds can be set such that the resulting trigger signal is not dominated by random coincidences and therefore yields a substantial decrease in the data acquisition rate compared with an untriggered system. It is also assumed that the triggering logic is relatively simple and does not need major computational capabilities for a trigger decision. With these assumptions, it is possible to generate the trigger at the array and restrict the data transfer to shore. However, with a not unreasonable delay of 200 microseconds, it is even possible to transmit the information for the trigger to shore and perform all of that logic on shore. The critical point is to send the minimum amount of information necessary to construct the trigger, so that one need not send all the possible information from all detectors of the array continuously to shore. 1 figure

  7. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    1999-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication...... delays proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments....
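
One ingredient of the communication-delay analysis sketched in the record above is the wait a message incurs until its statically assigned slot in a time-triggered (TDMA) round comes around. The slot layout and timings below are invented for illustration; they are not the policies from the paper.

```python
# Hedged sketch: worst-case wait until the next occurrence of a message's
# statically assigned slot in a time-triggered round. All timings are made up.

def wait_for_slot(ready_time, slot_start, round_length):
    """Time from message readiness to the next start of its assigned slot."""
    offset = (ready_time - slot_start) % round_length
    return (round_length - offset) % round_length

# slot at offset 2 ms within a 10 ms round
print(wait_for_slot(ready_time=3.0, slot_start=2.0, round_length=10.0))  # 9.0
print(wait_for_slot(ready_time=1.5, slot_start=2.0, round_length=10.0))  # 0.5
```

A message that just misses its slot waits almost a full round, which is why slot assignment strongly affects schedulability in such systems.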

  8. Schedulability-Driven Communication Synthesis for Time Triggered Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    We present an approach to static priority preemptive process scheduling for the synthesis of hard real-time distributed embedded systems where communication plays an important role. The communication model is based on a time-triggered protocol. We have developed an analysis for the communication...... delays proposing four different message scheduling policies over a time-triggered communication channel. Optimization strategies for the synthesis of communication are developed, and the four approaches to message scheduling are compared using extensive experiments...

  9. Rockfall triggering by cyclic thermal stressing of exfoliation fractures

    Science.gov (United States)

    Collins, Brian D.; Stock, Greg M.

    2016-01-01

    Exfoliation of rock deteriorates cliffs through the formation and subsequent opening of fractures, which in turn can lead to potentially hazardous rockfalls. Although a number of mechanisms are known to trigger rockfalls, many rockfalls occur during periods when likely triggers such as precipitation, seismic activity and freezing conditions are absent. It has been suggested that these enigmatic rockfalls may occur due to solar heating of rock surfaces, which can cause outward expansion. Here we use data from 3.5 years of field monitoring of an exfoliating granite cliff in Yosemite National Park in California, USA, to assess the magnitude and temporal pattern of thermally induced rock deformation. From a thermodynamic analysis, we find that daily, seasonal and annual temperature variations are sufficient to drive cyclic and cumulative opening of fractures. Application of fracture theory suggests that these changes can lead to further fracture propagation and the consequent detachment of rock. Our data indicate that the warmest times of the day and year are particularly conducive to triggering rockfalls, and that cyclic thermal forcing may enhance the efficacy of other, more typical rockfall triggers.

  10. Triggered Release from Polymer Capsules

    Energy Technology Data Exchange (ETDEWEB)

    Esser-Kahn, Aaron P. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry; Odom, Susan A. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry; Sottos, Nancy R. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Materials Science and Engineering; White, Scott R. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Aerospace Engineering; Moore, Jeffrey S. [Univ. of Illinois, Urbana, IL (United States). Beckman Inst. for Advanced Science and Technology and Dept. of Chemistry

    2011-07-06

    Stimuli-responsive capsules are of interest in drug delivery, fragrance release, food preservation, and self-healing materials. Many methods are used to trigger the release of encapsulated contents. Here we highlight mechanisms for the controlled release of encapsulated cargo that utilize chemical reactions occurring in solid polymeric shell walls. Triggering mechanisms responsible for covalent bond cleavage that result in the release of capsule contents include chemical, biological, light, thermal, magnetic, and electrical stimuli. We present methods for encapsulation and release, triggering methods, and mechanisms and conclude with our opinions on interesting obstacles for chemically induced activation with relevance for controlled release.

  11. Cumulative radiation dose of multiple trauma patients during their hospitalization

    International Nuclear Information System (INIS)

    Wang Zhikang; Sun Jianzhong; Zhao Zudan

    2012-01-01

    Objective: To study the cumulative radiation dose of multiple trauma patients during their hospitalization and to analyze the factors influencing the dose. Methods: The DLP for CT and DR examinations were retrospectively collected for patients treated between June 2009 and April 2011 at a university-affiliated hospital. The cumulative radiation doses were calculated by summing the typical effective doses of the anatomic regions scanned. Results: The cumulative radiation doses of 113 patients were collected. The maximum, minimum, and mean cumulative effective doses were 153.3 mSv, 16.48 mSv, and (52.3 ± 26.6) mSv, respectively. Conclusions: Multiple trauma patients receive high cumulative radiation exposure, so the management of cumulative radiation doses should be enhanced. Establishing individualized radiation exposure archives will help clinicians and technicians decide whether to image again and how to select imaging parameters. (authors)
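    The dose bookkeeping described above can be sketched as follows; the region names and per-examination effective doses below are illustrative assumptions, not values from the study:

```python
# Sketch of cumulative-dose bookkeeping: sum a typical effective dose (mSv)
# for each anatomic region scanned. The table below is hypothetical.
TYPICAL_EFFECTIVE_DOSE_MSV = {
    "head_ct": 2.0,
    "chest_ct": 7.0,
    "abdomen_ct": 8.0,
    "pelvis_ct": 6.0,
    "chest_dr": 0.1,
}

def cumulative_dose(examinations):
    """Sum the typical effective dose over all examinations of one patient."""
    return sum(TYPICAL_EFFECTIVE_DOSE_MSV[exam] for exam in examinations)

patient_exams = ["head_ct", "chest_ct", "abdomen_ct", "abdomen_ct"]
print(cumulative_dose(patient_exams))  # 2.0 + 7.0 + 8.0 + 8.0 = 25.0
```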

  12. 7 CFR 42.132 - Determining cumulative sum values.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Determining cumulative sum values. 42.132 Section 42... Determining cumulative sum values. (a) The parameters for the on-line cumulative sum sampling plans for AQL's... (b) At the beginning of the basic inspection period, the CuSum value is set equal to...
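    As an illustration of the general idea behind cumulative sum sampling, a generic one-sided CuSum statistic can be sketched as follows; the allowance value k and the defect counts are hypothetical, and the specific 7 CFR 42.132 plan parameters are not reproduced here:

```python
def cusum(values, k, start=0.0):
    """One-sided cumulative-sum statistic: S_n = max(0, S_{n-1} + x_n - k).

    k is the reference (allowance) value; in a sampling plan the process is
    flagged when S_n exceeds a decision limit chosen from the plan's tables.
    """
    s = start
    trace = []
    for x in values:
        s = max(0.0, s + x - k)
        trace.append(s)
    return trace

# Number of defects found in successive samples, allowance k = 2.
print(cusum([1, 3, 4, 0, 5], k=2))  # [0.0, 1.0, 3.0, 1.0, 4.0]
```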

  13. Improving cumulative effects assessment in Alberta: Regional strategic assessment

    International Nuclear Information System (INIS)

    Johnson, Dallas; Lalonde, Kim; McEachern, Menzie; Kenney, John; Mendoza, Gustavo; Buffin, Andrew; Rich, Kate

    2011-01-01

    The Government of Alberta, Canada is developing a regulatory framework to better manage cumulative environmental effects from development in the province. A key component of this effort is regional planning, which will lay the primary foundation for cumulative effects management into the future. Alberta Environment has considered the information needs of regional planning and has concluded that Regional Strategic Assessment may offer significant advantages if integrated into the planning process, including the overall improvement of cumulative environmental effects assessment in the province.

  14. Children neglected: Where cumulative risk theory fails.

    Science.gov (United States)

    O'Hara, Mandy; Legano, Lori; Homel, Peter; Walker-Descartes, Ingrid; Rojas, Mary; Laraque, Danielle

    2015-07-01

    Neglected children, by far the majority of children maltreated, experience an environment most deficient in cognitive stimulation and language exchange. When physical abuse co-occurs with neglect, there is more stimulation through negative parent-child interaction, which may lead to better cognitive outcomes, contrary to Cumulative Risk Theory. The purpose of the current study was to assess whether children only neglected perform worse on cognitive tasks than children neglected and physically abused. Utilizing LONGSCAN archived data, 271 children only neglected and 101 children neglected and physically abused in the first four years of life were compared. The two groups were assessed at age 6 on the WPPSI-R vocabulary and block design subtests, correlates of cognitive intelligence. Regression analyses were performed, controlling for additional predictors of poor cognitive outcome, including socioeconomic variables and caregiver depression. Children only neglected scored significantly worse than children neglected and abused on the WPPSI-R vocabulary subtest (p=0.03). The groups did not differ on the block design subtest (p=0.4). This study shows that for neglected children, additional abuse may not additively accumulate risk when considering intelligence outcomes. Children experiencing only neglect may need to be referred for services that address cognitive development, with emphasis on the linguistic environment, in order to best support the developmental challenges of neglected children. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Standardization of the cumulative absolute velocity

    International Nuclear Information System (INIS)

    O'Hara, T.F.; Jacobson, J.P.

    1991-12-01

    EPRI NP-5930, ''A Criterion for Determining Exceedance of the Operating Basis Earthquake,'' was published in July 1988. As defined in that report, the Operating Basis Earthquake (OBE) is exceeded when both a response spectrum parameter and a second damage parameter, referred to as the Cumulative Absolute Velocity (CAV), are exceeded. During review of that report, it was noted that the calculation of CAV could be confounded by long-duration time history records containing low (nondamaging) accelerations. It is therefore necessary to standardize the method of calculating CAV to account for record length. The standardized methodology allows consistent comparison between future CAV calculations and the adjusted CAV threshold value obtained by applying the standardized methodology to the data set presented in EPRI NP-5930. The recommended standardization is to window the CAV calculation on a second-by-second basis for a given time history, counting each one-second interval only if the absolute acceleration exceeds 0.025g at some time during that interval. The earthquake records used in EPRI NP-5930 have been reanalyzed on this basis, and the adjusted threshold of damage for CAV was found to be 0.16 g-sec.
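    The windowed calculation described above can be sketched as follows; the synthetic record, sampling interval, and rectangle-rule integration are illustrative assumptions:

```python
import numpy as np

def standardized_cav(accel_g, dt, threshold_g=0.025):
    """Standardized CAV in g-sec: integrate |a(t)| over each one-second
    window, but count only windows in which |a| exceeds the threshold at
    least once (rectangle-rule integration for simplicity)."""
    samples_per_sec = int(round(1.0 / dt))
    accel_g = np.asarray(accel_g, dtype=float)
    cav = 0.0
    for start in range(0, len(accel_g), samples_per_sec):
        window = accel_g[start:start + samples_per_sec]
        if np.max(np.abs(window)) > threshold_g:
            cav += np.sum(np.abs(window)) * dt
    return cav

# 2-second synthetic record at 100 Hz: the first second stays below the
# 0.025g threshold and is excluded; the second contributes 0.05g * 1 s.
record = [0.01] * 100 + [0.05] * 100
print(standardized_cav(record, dt=0.01))  # ≈ 0.05 g-sec
```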

  16. Analysis of Memory Codes and Cumulative Rehearsal in Observational Learning

    Science.gov (United States)

    Bandura, Albert; And Others

    1974-01-01

    The present study examined the influence of memory codes varying in meaningfulness and retrievability and cumulative rehearsal on retention of observationally learned responses over increasing temporal intervals. (Editor)

  17. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments have confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. These observations led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
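    The article's generalized w(p) is not reproduced here, but two classic parametric probability weighting functions from this literature, due to Tversky and Kahneman (1992) and Prelec (1998), can be sketched as follows:

```python
import math

def w_tk(p, gamma):
    """Tversky & Kahneman (1992): w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

def w_prelec(p, alpha):
    """Prelec (1998): w(p) = exp(-(-ln p)^a)."""
    return math.exp(-((-math.log(p)) ** alpha))

# Characteristic inverse-S shape: small probabilities are overweighted,
# large ones underweighted; the identity is recovered at gamma = 1.
print(w_tk(0.5, 1.0))    # 0.5
print(w_tk(0.01, 0.61))  # > 0.01 (overweighting)
print(w_tk(0.99, 0.61))  # < 0.99 (underweighting)
```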

  18. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    Science.gov (United States)

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using cumulants extracted from acceleration (ACC) signals acquired with a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find representative cumulants and classifiers that effectively detect and classify different types of fall and non-fall events. In the proposed hierarchical decision tree algorithm, the first level implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features, including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
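    The cumulant features named above can be computed from central moments; a minimal sketch for the second- through fifth-order cumulants of a one-dimensional signal (the Letter's feature-extraction details, such as windowing, are omitted here):

```python
import numpy as np

def cumulants(x):
    """Second- through fifth-order cumulants of a 1-D signal, expressed in
    terms of central moments:
        k2 = m2, k3 = m3, k4 = m4 - 3*m2^2, k5 = m5 - 10*m3*m2."""
    xc = np.asarray(x, dtype=float)
    xc = xc - xc.mean()
    m = {k: float(np.mean(xc**k)) for k in range(2, 6)}
    return (m[2], m[3], m[4] - 3 * m[2] ** 2, m[5] - 10 * m[3] * m[2])

# A +/-1 square wave: k2 = 1, the odd cumulants vanish, k4 = 1 - 3 = -2.
print(cumulants([-1, 1, -1, 1]))  # (1.0, 0.0, -2.0, 0.0)
```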

  19. The CDF level-3 trigger

    International Nuclear Information System (INIS)

    Devlin, T.

    1993-01-01

    The Collider Detector at Fermilab (CDF) has been operating at the Tevatron and collecting data on proton-antiproton interactions with collision rates above 250,000 Hz. Three levels of filtering select events for data logging at a rate of about 4 Hz. The Level 3 trigger provides most of the capabilities of the offline production programs for event reconstruction and physics analysis. The type of physics triggers, application of cuts, and combinations of logical requirements for event selection are controlled at run time by a trigger table using a syntax fully integrated with the Level 1 and Level 2 hardware triggers. The level 3 software operates in 48 RISC/UNIX processors (over 1000 mips) served by four 20-MByte/sec data buses for input, output and control. The system architecture, debugging, code validation, error reporting, analysis capabilities and performance will be described

  20. Cumulative Effect of Depression on Dementia Risk

    Directory of Open Access Journals (Sweden)

    J. Olazarán

    2013-01-01

    Objective. To analyze a potential cumulative effect of life-time depression on dementia and Alzheimer's disease (AD), with control of vascular factors (VFs). Methods. This study was a subanalysis of the Neurological Disorders in Central Spain (NEDICES) study. Past and present depression, VFs, dementia status, and dementia due to AD were documented at study inception. Dementia status was also documented after three years. Four groups were created according to baseline data: never depression (nD), past depression (pD), present depression (prD), and present and past depression (prpD). Logistic regression was used. Results. Data of 1,807 subjects were investigated at baseline (mean age 74.3, 59.3% women), and 1,376 (81.6%) subjects were evaluated after three years. The prevalence of dementia at baseline was 6.7%, and dementia incidence was 6.3%. An effect of depression was observed on dementia prevalence (OR [95% CI] 1.84 [1.01–3.35] for prD and 2.73 [1.08–6.87] for prpD) and on dementia due to AD (OR 1.98 [0.98–3.99] for prD and OR 3.98 [1.48–10.71] for prpD) (fully adjusted models, nD as reference). Depression did not influence dementia incidence. Conclusions. Present depression and, particularly, present and past depression are associated with dementia at old age. Multiple mechanisms, including a toxic effect of depression on hippocampal neurons, plausibly explain these associations.

  1. Quantitative cumulative biodistribution of antibodies in mice

    Science.gov (United States)

    Yip, Victor; Palma, Enzo; Tesar, Devin B; Mundo, Eduardo E; Bumbaca, Daniela; Torres, Elizabeth K; Reyes, Noe A; Shen, Ben Q; Fielder, Paul J; Prabhu, Saileta; Khawli, Leslie A; Boswell, C Andrew

    2014-01-01

    The neonatal Fc receptor (FcRn) plays an important and well-known role in antibody recycling in endothelial and hematopoietic cells, and thus it influences the systemic pharmacokinetics (PK) of immunoglobulin G (IgG). However, considerably less is known about FcRn's role in the metabolism of IgG within individual tissues after intravenous administration. To elucidate the organ distribution and gain insight into the metabolism of humanized IgG1 antibodies with different binding affinities for FcRn, comparative biodistribution studies in normal CD-1 mice were conducted. Here, we generated variants of a herpes simplex virus glycoprotein D-specific antibody (humanized anti-gD) with increased and decreased FcRn binding affinity by genetic engineering without affecting antigen specificity. These antibodies were expressed in Chinese hamster ovary cell lines, purified, and radiolabeled with either iodine-125 or indium-111. Equal amounts of I-125-labeled and In-111-labeled antibodies were mixed and intravenously administered into mice at 5 mg/kg. This approach allowed us to measure both the real-time IgG uptake (I-125) and cumulative uptake of IgG and catabolites (In-111) in individual tissues up to 1 week post-injection. The PK and distribution of the wild-type IgG and the variant with enhanced binding for FcRn were largely similar to each other, but vastly different for the rapidly cleared low-FcRn-binding variant. Uptake in individual tissues varied across time, FcRn binding affinity, and radiolabeling method. The liver and spleen emerged as the most concentrated sites of IgG catabolism in the absence of FcRn protection. These data provide an increased understanding of FcRn's role in antibody PK and catabolism at the tissue level. PMID:24572100

  2. Flexible trigger menu implementation on the Global Trigger for the CMS Level-1 trigger upgrade

    CERN Document Server

    Matsushita, Takashi

    2017-01-01

    The CMS experiment at the Large Hadron Collider (LHC) has continued to explore physics at the high-energy frontier in 2016. The integrated luminosity delivered by the LHC in 2016 was 41 fb^-1, with a peak luminosity of 1.5 × 10^34 cm^-2 s^-1 and a peak mean pile-up of about 50, all exceeding the initial estimations for 2016. The CMS experiment has upgraded its hardware-based Level-1 trigger system to maintain its performance for new physics searches and precision measurements at high luminosities. The Global Trigger is the final step of the CMS Level-1 trigger and implements a trigger menu, a set of selection requirements applied to the final list of objects from calorimeter and muon triggers, for reducing the 40 MHz collision rate to 100 kHz. The Global Trigger has been upgraded with state-of-the-art FPGA processors on Advanced Mezzanine Cards with optical links running at 10 GHz in a MicroTCA crate. The powerful processing resources of the upgraded system enable implemen...

  3. Cumulative effects of forest management activities: how might they occur?

    Science.gov (United States)

    R. M. Rice; R. B. Thomas

    1985-01-01

    Concerns are often voiced about possible environmental damage as the result of the cumulative sedimentation effects of logging and forest road construction. In response to these concerns, National Forests are developing procedures to reduce the possibility that their activities may lead to unacceptable cumulative effects

  4. Cumulative effect in multiple production processes on nuclei

    International Nuclear Information System (INIS)

    Golubyatnikova, E.S.; Shmonin, V.L.; Kalinkin, B.N.

    1989-01-01

    It is shown that the cumulative effect is a natural result of the process of hadron multiple production in nuclear reactions. The universality of the slopes of inclusive spectra and other characteristics of cumulative hadrons is interpreted. The character of the information obtainable from such reactions, which could be helpful in studying the mechanism of multiparticle production, is discussed. 27 refs.; 4 figs

  5. Cumulative particle production in the quark recombination model

    International Nuclear Information System (INIS)

    Gavrilov, V.B.; Leksin, G.A.

    1987-01-01

    Production of cumulative particles in hadron-nucleus interactions at high energies is considered within the framework of the quark recombination model. Predictions are made for the inclusive production cross sections of cumulative particles and of various resonances containing quarks in the s state.

  6. L1 track finding for a time multiplexed trigger

    Energy Technology Data Exchange (ETDEWEB)

    Cieri, D., E-mail: davide.cieri@bristol.ac.uk [University of Bristol, Bristol (United Kingdom); Rutherford Appleton Laboratory, Didcot (United Kingdom); Brooke, J.; Grimes, M. [University of Bristol, Bristol (United Kingdom); Newbold, D. [University of Bristol, Bristol (United Kingdom); Rutherford Appleton Laboratory, Didcot (United Kingdom); Harder, K.; Shepherd-Themistocleous, C.; Tomalin, I. [Rutherford Appleton Laboratory, Didcot (United Kingdom); Vichoudis, P. [CERN, Geneva (Switzerland); Reid, I. [Brunel University, London (United Kingdom); Iles, G.; Hall, G.; James, T.; Pesaresi, M.; Rose, A.; Tapper, A.; Uchida, K. [Imperial College, London (United Kingdom)

    2016-07-11

    At the HL-LHC, proton bunches will cross each other every 25 ns, producing an average of 140 pp-collisions per bunch crossing. To operate in such an environment, the CMS experiment will need a L1 hardware trigger able to identify interesting events within a latency of 12.5 μs. The future L1 trigger will also make use of data from the silicon tracker to control the trigger rate. The architecture that will be used to process tracker data is still under discussion. One interesting proposal makes use of the Time Multiplexed Trigger concept, already implemented in the CMS calorimeter trigger for the Phase I trigger upgrade. The proposed track finding algorithm is based on the Hough Transform method. The algorithm has been tested using simulated pp-collision data. Results show a very good tracking efficiency. The algorithm will be demonstrated in hardware in the coming months using the MP7, a μTCA board with a powerful FPGA capable of handling data rates approaching 1 Tb/s.
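    The Hough Transform idea behind the proposed track finder can be illustrated with a toy straight-line version; this simplified sketch in Cartesian slope-intercept space is an assumption for illustration, not the CMS firmware algorithm:

```python
import numpy as np

def hough_lines(points, m_bins, c_bins):
    """Toy Hough transform: each hit (x, y) votes for every (slope,
    intercept) cell consistent with y = m*x + c; peaks in the accumulator
    correspond to candidate tracks."""
    acc = np.zeros((len(m_bins), len(c_bins)), dtype=int)
    c_step = c_bins[1] - c_bins[0]
    for x, y in points:
        for i, m in enumerate(m_bins):
            c = y - m * x
            j = int(round((c - c_bins[0]) / c_step))
            if 0 <= j < len(c_bins):
                acc[i, j] += 1
    return acc

m_bins = np.linspace(-2, 2, 41)
c_bins = np.linspace(-5, 5, 101)
# Five collinear hits on y = 0.5*x + 1 produce a five-vote peak.
hits = [(x, 0.5 * x + 1.0) for x in range(5)]
acc = hough_lines(hits, m_bins, c_bins)
i, j = np.unravel_index(acc.argmax(), acc.shape)
print(m_bins[i], c_bins[j], acc[i, j])  # peak near slope 0.5, intercept 1.0
```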

  7. L1 Track Finding for a Time Multiplexed Trigger

    CERN Document Server

    AUTHOR|(CDS)2090481; Grimes, M.; Newbold, D.; Harder, K.; Shepherd-Themistocleous, C.; Tomalin, I.; Vichoudis, P.; Reid, I.; Iles, G.; Hall, G.; James, T.; Pesaresi, M.; Rose, A.; Tapper, A.; Uchida, K.

    2016-01-01

    At the HL-LHC, proton bunches will cross each other every 25 ns, producing an average of 140 pp-collisions per bunch crossing. To operate in such an environment, the CMS experiment will need a L1 hardware trigger able to identify interesting events within a latency of 12.5 μs. The future L1 trigger will also make use of data from the silicon tracker to control the trigger rate. The architecture that will be used to process tracker data is still under discussion. One interesting proposal makes use of the Time Multiplexed Trigger concept, already implemented in the CMS calorimeter trigger for the Phase I trigger upgrade. The proposed track finding algorithm is based on the Hough Transform method. The algorithm has been tested using simulated pp-collision data. Results show a very good tracking efficiency. The algorithm will be demonstrated in hardware in the coming months using the MP7, a μTCA board with a powerful FPGA capable of handling data rates approaching 1 Tb/s.

  8. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
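    One common way to estimate such statistical uncertainties of measured cumulants is the bootstrap; the sketch below uses a toy Gaussian event sample and a fourth-order cumulant, which are illustrative assumptions rather than the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def c4(sample):
    """Fourth-order cumulant, kappa4 = m4 - 3*m2^2, from central moments."""
    xc = np.asarray(sample, dtype=float)
    xc = xc - xc.mean()
    return np.mean(xc**4) - 3 * np.mean(xc**2) ** 2

def bootstrap_error(sample, stat, n_boot=200):
    """Bootstrap estimate of the statistical uncertainty of `stat`:
    resample the events with replacement and take the spread of the
    recomputed statistic."""
    n = len(sample)
    reps = [stat(rng.choice(sample, size=n, replace=True)) for _ in range(n_boot)]
    return float(np.std(reps, ddof=1))

events = rng.normal(size=10_000)  # toy event-by-event observable
print(c4(events), bootstrap_error(events, c4))
```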

  9. Trigger processing using reconfigurable logic in the CMS calorimeter trigger

    Energy Technology Data Exchange (ETDEWEB)

    Brooke, J J; Cussans, D G; Heath, G P; Maddox, A J; Newbold, D M; Rabbetts, P D

    2001-04-01

    We present the design of the Global Calorimeter Trigger processor for the CMS detector at the LHC. This is a fully pipelined processor system which collects data from all the CMS calorimeters and produces summary information used in forming the Level-1 trigger decision for each event. The design is based on the use of state-of-the-art reconfigurable logic devices (FPGAs) and fast data links. We present the results of device testing using a low-latency pipelined sort algorithm, which demonstrate that an FPGA can be used to perform processing previously foreseen to require custom ASICs. Our design approach results in a powerful, flexible and compact processor system.

  10. Constraints and triggers: situational mechanics of gender in negotiation.

    Science.gov (United States)

    Bowles, Hannah Riley; Babcock, Linda; McGinn, Kathleen L

    2005-12-01

    The authors propose 2 categories of situational moderators of gender in negotiation: situational ambiguity and gender triggers. Reducing the degree of situational ambiguity constrains the influence of gender on negotiation. Gender triggers prompt divergent behavioral responses as a function of gender. Field and lab studies (1 and 2) demonstrated that decreased ambiguity in the economic structure of a negotiation (structural ambiguity) reduces gender effects on negotiation performance. Study 3 showed that representation role (negotiating for self or other) functions as a gender trigger by producing a greater effect on female than male negotiation performance. Study 4 showed that decreased structural ambiguity constrains gender effects of representation role, suggesting that situational ambiguity and gender triggers work in interaction to moderate gender effects on negotiation performance. Copyright 2006 APA, all rights reserved.

  11. A Fast Hardware Tracker for the ATLAS Trigger System

    CERN Document Server

    Neubauer, Mark S

    2011-01-01

    In hadron collider experiments, triggering the detector to store interesting events for offline analysis is a challenge due to the high rates and multiplicities of particles produced. Maintaining high trigger efficiency for the physics we are most interested in while at the same time suppressing high rate physics from inclusive QCD processes is a difficult but important problem. It is essential that the trigger system be flexible and robust, with sufficient redundancy and operating margin. Providing high quality track reconstruction over the full ATLAS detector by the start of processing at LVL2 is an important element to achieve these needs. As the instantaneous luminosity increases, the computational load on the LVL2 system will significantly increase due to the need for more sophisticated algorithms to suppress backgrounds. The Fast Tracker (FTK) is a proposed upgrade to the ATLAS trigger system. It is designed to enable early rejection of background events and thus leave more LVL2 execution time by moving...

  12. Cumulative emission budgets and their implications: the case for SAFE carbon

    Science.gov (United States)

    Allen, Myles; Bowerman, Niel; Frame, David; Mason, Charles

    2010-05-01

    The risk of dangerous long-term climate change due to anthropogenic carbon dioxide emissions is predominantly determined by cumulative emissions over all time, not the rate of emission in any given year or commitment period. This has profound implications for climate mitigation policy: emission targets for specific years such as 2020 or 2050 provide no guarantee of meeting any overall cumulative emission budget. By focusing attention on short-term measures to reduce the flow of emissions, they may even exacerbate the overall long-term stock. Here we consider how climate policies might be designed explicitly to limit cumulative emissions to, for example, one trillion tonnes of carbon, a figure that has been estimated to give a most likely warming of two degrees above pre-industrial, with a likely range of 1.6-2.6 degrees. Three approaches are considered: tradable emission permits with the possibility of indefinite emission banking, carbon taxes explicitly linked to cumulative emissions and mandatory carbon sequestration. Framing mitigation policy around cumulative targets alleviates the apparent tension between climate protection and short-term consumption that bedevils any attempt to forge global agreement. We argue that the simplest and hence potentially the most effective approach might be a mandatory requirement on the fossil fuel industry to ensure that a steadily increasing fraction of fossil carbon extracted from the ground is artificially removed from the active carbon cycle through some form of sequestration. We define Sequestered Adequate Fraction of Extracted (SAFE) carbon as a source in which this sequestered fraction is anchored to cumulative emissions, increasing smoothly to reach 100% before we release the trillionth tonne. While adopting the use of SAFE carbon would increase the cost of fossil energy much as a system of emission permits or carbon taxes would, it could do so with much less explicit government intervention. We contrast this proposal

  13. The D0 Silicon Track Trigger

    International Nuclear Information System (INIS)

    Steinbrueck, Georg

    2003-01-01

    We describe a trigger preprocessor to be used by the D0 experiment for selecting events with tracks from the decay of long-lived particles. This Level 2 impact parameter trigger utilizes information from the Silicon Microstrip Tracker to reconstruct tracks with improved spatial and momentum resolutions compared to those obtained by the Level 1 tracking trigger. It is constructed of VME boards with much of the logic implemented in programmable processors. A common motherboard provides the I/O infrastructure, and three different daughter boards perform the tasks of identifying the roads from the tracking trigger data, finding the clusters in the roads in the silicon detector, and fitting tracks to the clusters. This approach provides flexibility for the design, testing and maintenance phases of the project. The track parameters are provided to the trigger framework in 25 μs. The effective impact parameter resolution for high-momentum tracks is 35 μm, dominated by the size of the Tevatron beam.

  14. Towards Greenland Glaciation: cumulative or abrupt transition?

    Science.gov (United States)

    Ramstein, Gilles; Tan, Ning; Ladant, Jean-baptiste; Dumas, Christophe; Contoux, Camille

    2017-04-01

    During the mid-Pliocene warm period (3-3.3 Ma BP), global annual mean temperatures inferred from data and model studies were 2-3° warmer than pre-industrial values. Accordingly, the Greenland ice sheet is supposed to have reached, at most, only half of its present-day volume [Haywood et al. 2010]. Around 2.7-2.6 Ma BP, just ~500 kyr after the warming peak of the mid-Pliocene, the Greenland ice sheet had reached its full size [Lunt et al. 2008]. A crucial question concerns the evolution of the Greenland ice sheet from half to full size during the 3-2.5 Ma period. Data show a decreasing trend of atmospheric CO2 concentration from 3 Ma to 2.5 Ma [Seki et al. 2010; Bartoli et al. 2011; Martinez et al. 2015]. However, a recent study [Contoux et al. 2015] suggests that a lowering of CO2 is not sufficient to initiate a perennial glaciation on Greenland and must be combined with low summer insolation to preserve the ice sheet during insolation maxima. This suggests a cumulative process rather than an abrupt event. In order to diagnose the evolution of the ice-sheet build-up, we carry out, for the first time, a transient simulation of climate and ice sheet evolution from 3 Ma to 2.5 Ma. This strategy enables us to investigate the waxing and waning of the ice sheet over several orbital cycles. We use a three-dimensional interpolation method designed by Ladant et al. (2014), which allows the evolution of CO2 concentration, of orbital parameters, and of the Greenland ice sheet size to be taken into account. By interpolating climatic snapshot simulations run with various possible combinations of CO2, orbits and ice sheet sizes, we can build a continuous climatic forcing that is then used to drive 500 kyr-long ice sheet simulations. With such a tool, we may offer a physically based answer to different CO2 reconstruction scenarios and analyse which one is most consistent with Greenland ice sheet buildup.

  15. A branch-and-cut-and-price algorithm for the cumulative capacitated vehicle routing problem

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Lysgaard, Jens

    2014-01-01

    The paper considers the Cumulative Capacitated Vehicle Routing Problem (CCVRP), which is a variation of the well-known Capacitated Vehicle Routing Problem (CVRP). In this problem, the traditional objective of minimizing the total distance or time traveled by the vehicles is replaced by minimizing the sum of arrival times at the customers. A branch-and-cut-and-price algorithm for obtaining optimal solutions to the problem is proposed. Computational results based on a set of standard CVRP benchmarks are presented.
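    The CCVRP objective can be illustrated with a small example; the distance matrix and route below are hypothetical:

```python
def sum_of_arrival_times(route, dist):
    """CCVRP objective for one route: sum over the visited customers of the
    time at which the vehicle arrives, starting from depot node 0."""
    t, total, prev = 0.0, 0.0, 0
    for customer in route:
        t += dist[prev][customer]  # travel time to the next customer
        total += t                 # accumulate this customer's arrival time
        prev = customer
    return total

# Symmetric toy distance matrix; node 0 is the depot.
dist = [
    [0, 2, 4, 3],
    [2, 0, 1, 5],
    [4, 1, 0, 2],
    [3, 5, 2, 0],
]
print(sum_of_arrival_times([1, 2, 3], dist))  # arrivals 2, 3, 5 -> 10.0
```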

  16. An Analysis of Cumulative Risks Indicated by Biomonitoring Data of Six Phthalates Using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor.1 This study used the MCR, the Hazard Index (HI) and Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...
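    The MCR arithmetic can be sketched as follows: each hazard quotient is an exposure divided by its reference dose, the Hazard Index is their sum, and MCR = HI / max(HQ). The chemical names and values below are hypothetical, not the study's biomonitoring data:

```python
def hazard_quotients(exposure, reference_dose):
    """HQ_i = exposure_i / RfD_i for each chemical (illustrative values)."""
    return {c: exposure[c] / reference_dose[c] for c in exposure}

def mcr(hq):
    """Maximum Cumulative Ratio: the Hazard Index (sum of HQs) divided by
    the largest single HQ; values near 1 mean one chemical dominates."""
    hi = sum(hq.values())
    return hi / max(hq.values())

# Hypothetical phthalate co-exposure (mg/kg-day), not measured data.
hq = hazard_quotients(
    exposure={"DEHP": 0.02, "DBP": 0.01, "DiBP": 0.01},
    reference_dose={"DEHP": 0.02, "DBP": 0.01, "DiBP": 0.02},
)
print(mcr(hq))  # HI = 1 + 1 + 0.5 = 2.5; max HQ = 1 -> MCR = 2.5
```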

  17. An analysis of cumulative risks based on biomonitoring data for six phthalates using the Maximum Cumulative Ratio

    Science.gov (United States)

    The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single chemical drives the cumulative risk of an individual exposed to multiple chemicals. Phthalates are a class of chemicals with ubiquitous exposures in the general population that have the potential to cause ...

  18. Correlated stopping, proton clusters and higher order proton cumulants

    Energy Technology Data Exchange (ETDEWEB)

    Bzdak, Adam [AGH University of Science and Technology, Faculty of Physics and Applied Computer Science, Krakow (Poland); Koch, Volker [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Skokov, Vladimir [RIKEN/BNL, Brookhaven National Laboratory, Upton, NY (United States)

    2017-05-15

    We investigate possible effects of correlations between stopped nucleons on higher order proton cumulants in low-energy heavy-ion collisions. We find that fluctuations of the number of wounded nucleons N_part lead to a rather nontrivial dependence of the correlations on the centrality; however, this effect is too small to explain the large and positive four-proton correlations found in the preliminary data collected by the STAR collaboration at √(s) = 7.7 GeV. We further demonstrate that, by taking into account additional proton clustering, we are able to qualitatively reproduce the preliminary experimental data. We speculate that this clustering may originate from collective/multi-collision stopping, which is expected to be effective at lower energies, from a possible first-order phase transition, or from (attractive) final state interactions. To test these ideas we propose to measure a mixed multi-particle correlation between stopped protons and a produced particle (e.g. pion, antiproton). (orig.)

  19. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
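
    A minimal sketch of the cumulative-residual idea follows. This is my own illustration, not the authors' code: residuals of a fitted linear model are cumulated along a covariate, and the observed process is compared with null realizations obtained by perturbing the residuals with independent standard normal multipliers, which approximates the zero-mean Gaussian limit under the assumed model.

```python
# Sketch of a cumulative-sum-of-residuals check for a linear model,
# with a multiplier-resampling approximation of the null distribution.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)   # data from the assumed (linear) model

# Fit the working model y = a + b*x by least squares.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Observed process: W(x) = sum_{x_i <= x} r_i / sqrt(n), along sorted x.
order = np.argsort(x)
W_obs = np.cumsum(resid[order]) / np.sqrt(n)

# Null realizations: same partial sums with N(0,1) multipliers on residuals.
sup_null = np.array([
    np.abs(np.cumsum(resid[order] * rng.normal(size=n)) / np.sqrt(n)).max()
    for _ in range(500)
])
p_value = float(np.mean(sup_null >= np.abs(W_obs).max()))
# A large p-value would indicate no evidence of misspecification;
# a small one would suggest a wrong functional form for x.
```

    The supremum of |W| is one natural test statistic; the paper also discusses moving sums and moving averages as related aggregates.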

  20. A new method to cluster genomes based on cumulative Fourier power spectrum.

    Science.gov (United States)

    Dong, Rui; Zhu, Ziyue; Yin, Changchuan; He, Rong L; Yau, Stephen S-T

    2018-06-20

    Analyzing phylogenetic relationships using mathematical methods has always been of importance in bioinformatics. Quantitative research may interpret the raw biological data in a precise way. Multiple Sequence Alignment (MSA) is used frequently to analyze biological evolution, but is very time-consuming. When the scale of the data is large, alignment methods cannot finish the calculation in reasonable time. Therefore, we present a new method using moments of the cumulative Fourier power spectrum to cluster DNA sequences. Each sequence is translated into a vector in Euclidean space. Distances between the vectors can reflect the relationships between sequences. The mapping between the spectra and moment vectors is one-to-one, which means that no information is lost from the power spectra during the calculation. We cluster and classify several datasets, including Influenza A, primates, and human rhinovirus (HRV) datasets, to build up the phylogenetic trees. Results show that the newly proposed cumulative Fourier power spectrum method is much faster and more accurate than MSA and another alignment-free method known as k-mer. The research provides new insights into the study of phylogeny, evolution, and efficient DNA comparison algorithms for large genomes. The computer programs of the cumulative Fourier power spectrum are available at GitHub (https://github.com/YaulabTsinghua/cumulative-Fourier-power-spectrum). Copyright © 2018. Published by Elsevier B.V.
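
    The pipeline can be sketched as follows. This is my reading of the idea, not the authors' exact algorithm: the numeric encoding of bases and the particular moment definitions below are assumptions made for illustration.

```python
# Alignment-free comparison sketch: encode a DNA sequence numerically,
# take the cumulative sum of its Fourier power spectrum, summarize it
# by a few moments, and compare sequences by Euclidean distance
# between moment vectors.
import numpy as np

def moment_vector(seq, n_moments=3):
    # Simple numeric encoding (purines +1, pyrimidines -1) -- illustrative.
    signal = np.array([+1.0 if b in "AG" else -1.0 for b in seq])
    power = np.abs(np.fft.fft(signal)) ** 2
    cum = np.cumsum(power)
    cum = cum / cum[-1]                       # normalize to [0, 1]
    k = np.arange(1, len(cum) + 1) / len(cum)
    # Weighted moments of the normalized cumulative spectrum.
    return np.array([np.mean(cum * k**j) for j in range(n_moments)])

def distance(seq_a, seq_b):
    return float(np.linalg.norm(moment_vector(seq_a) - moment_vector(seq_b)))

# Similar sequences map to nearby vectors; a compositionally different
# sequence generally lands farther away.
d_close = distance("ACGTACGTACGTACGT", "ACGTACGTACGTACGA")
d_far   = distance("ACGTACGTACGTACGT", "AAAAAAAAGGGGGGGG")
```

    Because each sequence is reduced to a fixed-length vector, pairwise distances cost O(1) per pair after the transform, which is what makes the approach scale to large genome sets.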

  1. ATLAS FTK Fast Track Trigger

    CERN Document Server

    Iizawa, T; The ATLAS collaboration

    2014-01-01

    The Fast TracKer (FTK) will perform global track reconstruction after each Level-1 trigger accept signal to enable the software-based higher level trigger to have early access to tracking information. FTK is a dedicated processor based on a mixture of advanced technologies. Modern, powerful Field Programmable Gate Arrays (FPGAs) form an important part of the system architecture, and the large amount of computing power required for pattern recognition is provided by incorporating standard-cell ASICs named Associative Memories (AM). The motivation for and the architecture of the FTK system will be presented, along with the status of the hardware and simulation.

  2. RPC Trigger Robustness: Status Report

    CERN Document Server

    Di Mattia, A; Nisati, A; Pastore, F; Vari, R; Veneziano, Stefano; Aielli, G; Camarri, P; Cardarelli, R; Di Ciaccio, A; Di Simone, A; Liberti, B; Santonico, R

    2002-01-01

    The present paper describes the Level-1 Barrel Muon Trigger performance as expected with the current configuration of the RPC detectors designed for the Barrel Muon Spectrometer of ATLAS. Results of a beam test performed at the X5-GIF facility at CERN are presented in order to show the trigger efficiency with different conditions of RPC detection efficiency and several background rates. Small (50$\\times$50 cm$^2$) RPC chambers with final Front-end electronics and splitter boards are used in the test, while the coincidence logic is applied off-line using a detailed simulation of the coincidence matrix.

  3. Fast processor for dilepton triggers

    International Nuclear Information System (INIS)

    Katsanevas, S.; Kostarakis, P.; Baltrusaitis, R.

    1983-01-01

    We describe a fast trigger processor, developed for and used in Fermilab experiment E-537, for selecting high-mass dimuon events produced by negative pions and anti-protons. The processor finds candidate tracks by matching hit information received from drift chambers and scintillation counters, and determines their momenta. Invariant masses are calculated for all possible pairs of tracks and an event is accepted if any invariant mass is greater than some preselectable minimum mass. The whole process, accomplished within 5 to 10 microseconds, achieves up to a ten-fold reduction in trigger rate

  4. DT Local Trigger performance in 2015

    CERN Document Server

    CMS Collaboration

    2015-01-01

    The Local Trigger system of the CMS Drift Tube chambers (DT) was checked applying similar methods as in the LHC Run 1 (2012). The main variables shown in this note are the trigger efficiency, the trigger quality and the fraction of trigger ghosts. The performance was found to be comparable or better than in Run 1.

  5. Cumulative stress and autonomic dysregulation in a community sample.

    Science.gov (United States)

    Lampert, Rachel; Tuit, Keri; Hong, Kwang-Ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita

    2016-05-01

    Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status which predicts poor cardiovascular outcomes, is unknown. Healthy community-dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-h ambulatory ECG monitoring. HRV was analyzed in the frequency domain and the standard deviation of NN intervals (SDNN) was calculated. Initial simple regression analyses revealed that the total cumulative stress score, chronic stressors and cumulative adverse life events (CALE) were all inversely associated with ultra low-frequency (ULF), very low-frequency (VLF) and low-frequency (LF) power and SDNN (all statistically significant). For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies including stress management may prove beneficial.
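
    The two HRV summaries used above can be illustrated on a synthetic NN-interval series (not patient data). In this sketch, written by me for illustration, SDNN is the sample standard deviation of the NN intervals, and band power is integrated from the spectrum of the evenly resampled interval series; the band edges follow common HRV conventions (VLF 0.0033-0.04 Hz, LF 0.04-0.15 Hz, HF 0.15-0.4 Hz).

```python
# Toy HRV metrics: SDNN and spectral band power of an NN-interval series.
import numpy as np

def sdnn(nn_ms):
    """Standard deviation of NN intervals (ms)."""
    return float(np.std(nn_ms, ddof=1))

def band_power(nn_ms, f_lo, f_hi, fs=4.0):
    """Power of the (evenly resampled) NN series in [f_lo, f_hi) Hz."""
    x = np.asarray(nn_ms, dtype=float)
    x = x - x.mean()
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (len(x) * fs)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return float(psd[mask].sum() * (freqs[1] - freqs[0]))

# Synthetic NN series: 800 ms baseline with a slow 0.1 Hz oscillation,
# sampled at 4 Hz for 5 minutes.
t = np.arange(0, 300, 0.25)
nn = 800 + 30 * np.sin(2 * np.pi * 0.1 * t)   # 0.1 Hz lies in the LF band

lf = band_power(nn, 0.04, 0.15)
hf = band_power(nn, 0.15, 0.40)
# The injected oscillation sits in the LF band, so lf should exceed hf.
```

    Real analyses add detrending, artifact rejection, and interpolation of ectopic beats before the spectral step; the sketch only shows the arithmetic of the two endpoints reported in the study.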

  6. Describing phytotoxic effects on cumulative germination

    OpenAIRE

    Dias, L.S.

    2001-01-01

    Phytotoxic studies strongly depend on evaluation of germination responses, which implies the need for adequate procedures to account for distinct aspects of the germinative process. For this, indices, comparisons among treatments at various times, and model fitting have been proposed. The objective of this work is to compare the three approaches and select the one providing greater insight and precision. Speed of germination, speed of accumulated germination, the coefficient of the rate of ge...

  7. Cumulants in perturbation expansions for non-equilibrium field theory

    International Nuclear Information System (INIS)

    Fauser, R.

    1995-11-01

    The formulation of perturbation expansions for a quantum field theory of strongly interacting systems in a general non-equilibrium state is discussed. Non-vanishing initial correlations are included in the formulation of the perturbation expansion in terms of cumulants. The cumulants are shown to be the suitable candidate for summing up the perturbation expansion. Also a linked-cluster theorem for the perturbation series with cumulants is presented. Finally a generating functional of the perturbation series with initial correlations is studied. We apply the methods to a simple model of a fermion-boson system. (orig.)

  8. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date. It is common practice to apply the Kaplan-Meier or Aalen-Johansen estimator to the total sample and report either the estimated cumulative incidence curve or just a single point on the curve as a description of the disease risk. METHODS: We argue that, whenever the disease or disorder of interest is influenced...

  9. The Trigger for Early Running

    CERN Document Server

    The ATLAS Collaboration

    2009-01-01

    The ATLAS trigger and data acquisition system is based on three levels of event selection designed to capture the physics of interest with high efficiency from an initial bunch crossing rate of 40 MHz. The selections in the three trigger levels must provide sufficient rejection to reduce the rate to 200 Hz, compatible with offline computing power and storage capacity. The LHC is expected to begin its operation with a peak luminosity of 10^31 cm^-2 s^-1 with a relatively small number of bunches, but quickly ramp up to higher luminosities by increasing the number of bunches, and thus the overall interaction rate. Decisions must be taken every 25 ns during normal LHC operations at the design luminosity of 10^34 cm^-2 s^-1, where the average bunch crossing will contain more than 20 interactions. Hence, trigger selections must be deployed that can adapt to the changing beam conditions while preserving the interesting physics and satisfying varying detector requirements. In this paper, we provide a menu of trigger selections that can be...

  10. The CDF Silicon Vertex Trigger

    International Nuclear Information System (INIS)

    Dell'Orso, Mauro

    2006-01-01

    Motivations, design, performance and ongoing upgrade of the CDF Silicon Vertex Trigger are presented. The system provides CDF with a powerful tool for online tracking with offline quality in order to enhance the reach on B physics and large-P_t physics coupled to b quarks.

  11. Cumulative t-link threshold models for the genetic analysis of calving ease scores

    Directory of Open Access Journals (Sweden)

    Tempelman Robert J

    2003-09-01

    In this study, a hierarchical threshold mixed model based on a cumulative t-link specification for the analysis of ordinal data, or more specifically calving ease scores, was developed. The validation of this model and of the Markov chain Monte Carlo (MCMC) algorithm was carried out on data simulated from normally and t4 (i.e. a t-distribution with four degrees of freedom) distributed populations, using the deviance information criterion (DIC) and a pseudo Bayes factor (PBF) measure to validate recently proposed model choice criteria. The simulation study indicated that although inference on the degrees of freedom parameter is possible, MCMC mixing was problematic. Nevertheless, the DIC and PBF were validated to be satisfactory measures of model fit to the data. A sire and maternal grandsire cumulative t-link model was applied to a calving ease dataset from 8847 Italian Piemontese first-parity dams. The cumulative t-link model was shown to lead to posterior means of direct and maternal heritabilities (0.40 ± 0.06, 0.11 ± 0.04) and a direct-maternal genetic correlation (-0.58 ± 0.15) that were not different from the corresponding posterior means of the heritabilities (0.42 ± 0.07, 0.14 ± 0.04) and the genetic correlation (-0.55 ± 0.14) inferred under the conventional cumulative probit link threshold model. Furthermore, the correlation (> 0.99) between posterior means of sire progeny merit from the two models suggested no meaningful rerankings. Nevertheless, the cumulative t-link model was decisively chosen as the better fitting model for this calving ease data using DIC and PBF.
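
    The likelihood behind a cumulative t-link model is easy to state: the probability of ordinal category k is the t-distribution CDF evaluated between successive thresholds, shifted by the linear predictor. The sketch below is my own illustration; the threshold and predictor values are hypothetical, not estimates from the paper.

```python
# Category probabilities under a cumulative t-link (here df = 4, as in
# the t4 populations discussed above). Larger eta shifts probability
# mass toward higher (more difficult) calving-ease categories.
import numpy as np
from scipy.stats import t as student_t

def category_probs(eta, thresholds, df=4.0):
    """P(Y = k) for k = 1..K under a cumulative t-link model."""
    cuts = np.concatenate(([-np.inf], thresholds, [np.inf]))
    cdf = student_t.cdf(cuts - eta, df)
    return np.diff(cdf)

thresholds = np.array([-0.5, 0.8, 2.0])   # 4 ordered categories (hypothetical cuts)
probs_easy = category_probs(eta=0.0, thresholds=thresholds)  # low linear predictor
probs_hard = category_probs(eta=1.5, thresholds=thresholds)  # high linear predictor
# Probabilities sum to 1 in each case; the heavier t4 tails make extreme
# categories more probable than under the probit (normal CDF) link.
```

    Replacing `student_t.cdf` with the normal CDF recovers the conventional cumulative probit threshold model the paper compares against.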

  12. Cumulative Environmental Impacts: Science and Policy to Protect Communities.

    Science.gov (United States)

    Solomon, Gina M; Morello-Frosch, Rachel; Zeise, Lauren; Faust, John B

    2016-01-01

    Many communities are located near multiple sources of pollution, including current and former industrial sites, major roadways, and agricultural operations. Populations in such locations are predominantly low-income, with a large percentage of minorities and non-English speakers. These communities face challenges that can affect the health of their residents, including limited access to health care, a shortage of grocery stores, poor housing quality, and a lack of parks and open spaces. Environmental exposures may interact with social stressors, thereby worsening health outcomes. Age, genetic characteristics, and preexisting health conditions increase the risk of adverse health effects from exposure to pollutants. There are existing approaches for characterizing cumulative exposures, cumulative risks, and cumulative health impacts. Although such approaches have merit, they also have significant constraints. New developments in exposure monitoring, mapping, toxicology, and epidemiology, especially when informed by community participation, have the potential to advance the science on cumulative impacts and to improve decision making.

  13. Pesticide Cumulative Risk Assessment: Framework for Screening Analysis

    Science.gov (United States)

    This document provides guidance on how to screen groups of pesticides for cumulative evaluation using a two-step approach: begin with evaluation of available toxicological information and, if necessary, follow up with a risk-based screening approach.

  14. Considering Environmental and Occupational Stressors in Cumulative Risk Assessments

    Science.gov (United States)

    While definitions vary across the global scientific community, cumulative risk assessments (CRAs) typically are described as exhibiting a population focus and analyzing the combined risks posed by multiple stressors. CRAs also may consider risk management alternatives as an anal...

  15. Peer tutors as learning and teaching partners: a cumulative ...

    African Journals Online (AJOL)

    ... paper explores the kinds of development in tutors' thinking and action that are possible when training and development is theoretically informed, coherent, and oriented towards improving practice. Keywords: academic development, academic literacies, cumulative learning, higher education, peer tutoring, writing centres.

  16. CTD Information Guide. Preventing Cumulative Trauma Disorders in the Workplace

    National Research Council Canada - National Science Library

    1992-01-01

    The purpose of this report is to provide Army occupational safety and health (OSH) professionals with a primer that explains the basic principles of ergonomic-hazard recognition for common cumulative trauma disorders...

  17. Cumulative radiation exposure in children with cystic fibrosis.

    LENUS (Irish Health Repository)

    O'Reilly, R

    2010-02-01

    This retrospective study calculated the cumulative radiation dose for children with cystic fibrosis (CF) attending a tertiary CF centre. Information on 77 children with a mean age of 9.5 years, a follow up time of 658 person years and 1757 studies including 1485 chest radiographs, 215 abdominal radiographs and 57 computed tomography (CT) scans, of which 51 were thoracic CT scans, were analysed. The average cumulative radiation dose was 6.2 (0.04-25) mSv per CF patient. Cumulative radiation dose increased with increasing age and number of CT scans and was greater in children who presented with meconium ileus. No correlation was identified between cumulative radiation dose and either lung function or patient microbiology cultures. Radiation carries a risk of malignancy and children are particularly susceptible. Every effort must be made to avoid unnecessary radiation exposure in these patients whose life expectancy is increasing.

  18. New high-energy phenomena in aircraft triggered lightning

    NARCIS (Netherlands)

    van Deursen, A.P.J.; Kochkin, P.; de Boer, A.; Bardet, M.; Boissin, J.F.

    2016-01-01

    High-energy phenomena associated with lightning were proposed in the 1920s, observed for the first time in the 1960s, and further investigated more recently by e.g. rocket-triggered lightning. Similarly, x-rays have been detected in meter-long discharges in air at standard atmospheric pressure.

  19. Design and Test Space Exploration of Transport-Triggered Architectures

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper describes a new approach to the high-level design and test of transport-triggered architectures (TTA), a special type of application-specific instruction processor (ASIP). The proposed method introduces test as an additional constraint, besides throughput and circuit area.

  20. Numerical evaluation of a robust self-triggered MPC algorithm

    NARCIS (Netherlands)

    Brunner, F.D.; Heemels, W.P.M.H.; Allgöwer, F.

    2016-01-01

    We present numerical examples demonstrating the efficacy of a recently proposed self-triggered model predictive control scheme for disturbed linear discrete-time systems with hard constraints on the input and state. In order to reduce the amount of communication between the controller and the

  1. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    DEFF Research Database (Denmark)

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496–509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray's model. The proposed goodness-of-fit test procedures are based on the cumulative sums...

  2. Natural experimentation is a challenging method for identifying headache triggers.

    Science.gov (United States)

    Houle, Timothy T; Turner, Dana P

    2013-04-01

    In this study, we set out to determine whether individual headache sufferers can learn about the potency of their headache triggers (causes) using only natural experimentation. Headache patients naturally use the covariation of the presence-absence of triggers with headache attacks to assess the potency of triggers. The validity of this natural experimentation has never been investigated. A companion study has proposed 3 assumptions that are important for assigning causal status to triggers. This manuscript examines one of these assumptions, constancy in trigger presentation, using real-world conditions. The similarity of day-to-day weather conditions over 4 years, as well as the similarity of ovarian hormones and perceived stress over a median of 89 days in 9 regularly cycling headache sufferers, was examined using several available time series. An arbitrary threshold of 90% similarity using Gower's index identified similar days for comparison. The day-to-day variability in just these 3 headache triggers is substantial enough that finding 2 naturally similar days for which to contrast the effect of a fourth trigger (eg, drinking wine vs not drinking wine) will only infrequently occur. Fluctuations in weather patterns resulted in a median of 2.3 days each year that were similar (range 0-27.4). Considering fluctuations in stress patterns and ovarian hormones, only 1.5 days/month (95% confidence interval 1.2-2.9) and 2.0 days/month (95% confidence interval 1.9-2.2), respectively, met our threshold for similarity. Although assessing the personal causes of headache is an age-old endeavor, the great many candidate triggers exhibit variability that may prevent sound conclusions without assistance from formal experimentation or statistical balancing. © 2013 American Headache Society.
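
    The 90% screening step rests on Gower's index, which handles mixed variable types by scaling numeric differences by their range and matching categorical values exactly, then averaging. The sketch below is my own illustration; the variable names, ranges, and values are made up, not the study's data.

```python
# Gower similarity between two "days" described by mixed
# weather/stress variables.

def gower_similarity(day_a, day_b, ranges):
    """ranges: {feature: (min, max)} for numeric features; features
    not listed in ranges are treated as categorical (exact match)."""
    scores = []
    for key in day_a:
        if key in ranges:
            lo, hi = ranges[key]
            scores.append(1.0 - abs(day_a[key] - day_b[key]) / (hi - lo))
        else:
            scores.append(1.0 if day_a[key] == day_b[key] else 0.0)
    return sum(scores) / len(scores)

ranges = {"temp_c": (-10.0, 35.0), "humidity": (0.0, 100.0), "stress": (0.0, 10.0)}
monday  = {"temp_c": 20.0, "humidity": 60.0, "stress": 3.0, "precip": "none"}
tuesday = {"temp_c": 22.0, "humidity": 65.0, "stress": 3.5, "precip": "none"}

s = gower_similarity(monday, tuesday, ranges)
# temp: 1 - 2/45, humidity: 1 - 5/100, stress: 1 - 0.5/10, precip: 1
# -> s ~ 0.964, which would clear a 0.90 similarity cutoff.
```

    The study's point is that even with such a generous cutoff, days matching on weather, hormones, and stress simultaneously are rare, so natural "wine vs. no wine" contrasts on otherwise-similar days almost never occur.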

  3. Cumulative query method for influenza surveillance using search engine data.

    Science.gov (United States)

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created a cumulative query method, with n representing the number of cumulative combined queries in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, but 4 of 13 combined queries had an r value of ≥.7. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, but 6 of 15 combined queries had an r value of ≥.7. The cumulative query method showed relatively higher correlation with national influenza surveillance data than the combined queries in both the development and validation sets.
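
    The construction can be sketched as follows. This is my paraphrase of the method on synthetic data, not the study's code or queries: correlate each query's weekly volume with ILI, keep queries with r ≥ .7 sorted by r, and form cumulative method n by aggregating the top-n series before re-correlating with ILI.

```python
# Cumulative query method sketch on synthetic weekly data.
import numpy as np

rng = np.random.default_rng(1)
weeks = 52
ili = 10 + 5 * np.sin(np.linspace(0, 2 * np.pi, weeks))   # synthetic ILI rate

# Synthetic query volumes: two informative queries plus one noise query.
queries = {
    "flu symptoms": ili * 2.0 + rng.normal(0, 1.0, weeks),
    "fever":        ili * 1.5 + rng.normal(0, 2.0, weeks),
    "cat videos":   rng.normal(50, 5, weeks),
}

def pearson(a, b):
    return float(np.corrcoef(a, b)[0, 1])

# Rank queries by correlation with ILI and keep those with r >= .7.
ranked = sorted(((pearson(v, ili), k) for k, v in queries.items()), reverse=True)
kept = [(r, k) for r, k in ranked if r >= 0.7]

# Cumulative method n: correlation of the sum of the top-n kept series.
cumulative_r = []
acc = np.zeros(weeks)
for r, k in kept:
    acc = acc + queries[k]
    cumulative_r.append(pearson(acc, ili))
```

    Aggregating several correlated queries averages out query-specific noise, which is one plausible reason the cumulative methods outperformed single combined queries in the validation sets.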

  4. Steps and pips in the history of the cumulative recorder.

    OpenAIRE

    Lattal, Kennon A

    2004-01-01

    From its inception in the 1930s until very recent times, the cumulative recorder was the most widely used measurement instrument in the experimental analysis of behavior. It was an essential instrument in the discovery and analysis of schedules of reinforcement, providing the first real-time analysis of operant response rates and patterns. This review traces the evolution of the cumulative recorder from Skinner's early modified kymographs through various models developed by Skinner and his co...

  5. Mapping Cumulative Impacts of Human Activities on Marine Ecosystems

    OpenAIRE

    Seaplan

    2018-01-01

    Given the diversity of human uses and natural resources that converge in coastal waters, the potential independent and cumulative impacts of those uses on marine ecosystems are important to consider during ocean planning. This study was designed to support the development and implementation of the 2009 Massachusetts Ocean Management Plan. Its goal was to estimate and visualize the cumulative impacts of human activities on coastal and marine ecosystems in the state and federal waters off of Ma...

  6. Cluster observations and simulations of He+ EMIC triggered emissions

    Science.gov (United States)

    Grison, B.; Shoji, M.; Santolik, O.; Omura, Y.

    2012-12-01

    EMIC triggered emissions have been reported in the inner magnetosphere at the nightside edge of the plasmapause [Pickett et al., 2010]. The generation mechanism proposed by Omura et al. [2010] is very similar to that of whistler chorus emissions, and simulation results agree with observations and theory [Shoji and Omura, 2011]. The main characteristics of these emissions, generated in the magnetic equatorial region, are a frequency dispersion with time and a high level of coherence. The start frequency of the previously mentioned observations is above half of the proton gyrofrequency, which means that the emissions are generated on the proton branch. On the He+ branch, generation of triggered emissions in the same region requires more energetic protons, and the triggering process starts below the He+ gyrofrequency, which makes their identification in Cluster data rather difficult. Recent simulation results confirm the possibility of EMIC triggered emissions on the He+ branch. In the present contribution we compare a Cluster event with simulation results in order to investigate whether the observations can be identified as a He+ triggered emission. The impact of the observed waves on particle precipitation is also investigated.

  7. Study of cumulative fatigue damage detection for used parts with nonlinear output frequency response functions based on NARMAX modelling

    Science.gov (United States)

    Huang, Honglan; Mao, Hanying; Mao, Hanling; Zheng, Weixue; Huang, Zhenfeng; Li, Xinxin; Wang, Xianghong

    2017-12-01

    Cumulative fatigue damage detection for used parts plays a key role in remanufacturing engineering and is related to the service safety of the remanufactured parts. In light of the nonlinear properties of used parts caused by cumulative fatigue damage, a detection approach based on nonlinear output frequency response functions (NOFRFs) offers a breakthrough for solving this key problem. First, a modified PSO-adaptive lasso algorithm is introduced to improve the accuracy of the NARMAX model under impulse hammer excitation; then, an effective new algorithm is derived to estimate the nonlinear output frequency response functions under rectangular pulse excitation, and a NOFRF-based index is introduced to detect the cumulative fatigue damage in used parts. On this basis, a novel damage detection approach that integrates the NARMAX model and the rectangular pulse is proposed for NOFRF identification and cumulative fatigue damage detection of used parts. Finally, experimental studies of fatigued plate specimens and used connecting rod parts are conducted to verify the validity of the novel approach. The results reveal that the new approach can detect cumulative fatigue damage in used parts effectively and efficiently, and that the values of the NOFRF-based index can be used to distinguish different degrees of fatigue damage or working time. Since the proposed approach can extract the nonlinear properties of a system from only a single excitation of the inspected system, it shows great promise for remanufacturing engineering applications.

  8. Cumulative organic anion transporter-mediated drug-drug interaction potential of multiple components in salvia miltiorrhiza (danshen) preparations.

    Science.gov (United States)

    Wang, Li; Venitz, Jürgen; Sweet, Douglas H

    2014-12-01

    To evaluate organic anion transporter-mediated drug-drug interaction (DDI) potential for individual active components of Danshen (Salvia miltiorrhiza) vs. combinations using in vitro and in silico approaches. Inhibition profiles for single Danshen components and combinations were generated in stably expressing human (h)OAT1 and hOAT3 cells. Plasma concentration-time profiles for the compounds were estimated from in vivo human data using an i.v. two-compartment model (with first-order elimination). The cumulative DDI index was proposed as an indicator of DDI potential for combination products. This index was used to evaluate the DDI potential for Danshen injectables from 16 different manufacturers and 14 different lots from a single manufacturer. The cumulative DDI index predicted in vivo inhibition potentials, 82% (hOAT1) and 74% (hOAT3), comparable with those observed in vitro, 72 ± 7% (hOAT1) and 81 ± 10% (hOAT3), for Danshen component combinations. Using simulated unbound Cmax values, a wide range in the cumulative DDI index between manufacturers, and between lots, was predicted. Many products exhibited a cumulative DDI index > 1 (50% inhibition). Danshen injectables will likely exhibit strong potential to inhibit hOAT1 and hOAT3 function in vivo. The proposed cumulative DDI index might improve prediction of DDI potential of herbal medicines or pharmaceutical preparations containing multiple components.
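
    One plausible formulation of such a cumulative index, sketched by me and not necessarily the authors' exact formula, sums each component's unbound Cmax over its inhibitory potency and converts the total to a predicted fractional inhibition I/(1+I). This is consistent with the abstract's statement that an index of 1 corresponds to 50% inhibition. All component names' values below are hypothetical.

```python
# Hypothetical cumulative DDI index for a multi-component preparation:
# index = sum_i Cmax_u,i / IC50_i, predicted inhibition = index/(1+index).

def cumulative_ddi_index(cmax_u, ic50):
    return sum(cmax_u[k] / ic50[k] for k in cmax_u)

def predicted_inhibition(index):
    return index / (1.0 + index)

# Hypothetical unbound Cmax and IC50 values (uM) for three components.
cmax_u = {"danshensu": 4.0, "salvianolic acid B": 2.0, "protocatechuic aldehyde": 0.5}
ic50   = {"danshensu": 10.0, "salvianolic acid B": 5.0, "protocatechuic aldehyde": 2.5}

idx = cumulative_ddi_index(cmax_u, ic50)   # 0.4 + 0.4 + 0.2 = 1.0
inhib = predicted_inhibition(idx)          # 1.0 / (1 + 1.0) = 0.5, i.e. 50%
```

    Summing the component ratios is what makes the index "cumulative": no single component here would exceed the conventional single-inhibitor concern threshold, yet the mixture reaches 50% predicted inhibition.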

  9. Genetic algorithm-based improved DOA estimation using fourth-order cumulants

    Science.gov (United States)

    Ahmed, Ammar; Tufail, Muhammad

    2017-05-01

    Genetic algorithm (GA)-based direction of arrival (DOA) estimation is proposed using fourth-order cumulants (FOC) and the ESPRIT principle, resulting in the Multiple Invariance Cumulant ESPRIT algorithm. In the existing FOC ESPRIT formulations, only one invariance is utilised to estimate DOAs. The unused multiple invariances (MIs) must be exploited simultaneously in order to improve the estimation accuracy. In this paper, a fitness function based on a carefully designed cumulant matrix is developed which incorporates the MIs present in the sensor array. Better DOA estimation can be achieved by minimising this fitness function. Moreover, the effectiveness of Newton's method as well as of the GA for this optimisation problem is illustrated. Simulation results show that the proposed algorithm provides improved estimation accuracy compared to existing algorithms, especially in the case of low SNR, a small number of snapshots, closely spaced sources and high signal and noise correlation. Moreover, it is observed that optimisation using Newton's method is more likely to converge to false local optima, resulting in erroneous results, whereas GA-based optimisation is attractive due to its global optimisation capability.
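A minimal real-coded genetic algorithm of the kind used for such fitness minimisation can be sketched as follows. The toy quadratic fitness stands in for the paper's cumulant-matrix fitness, and all GA parameters (population size, mutation scale, tournament size) are arbitrary choices, not the authors' settings:

```python
import random

def ga_minimize(fitness, bounds, pop_size=40, gens=60, seed=1):
    """Minimal real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness)
        nxt = scored[:2]                                  # elitism: keep the two best
        while len(nxt) < pop_size:
            a, b = (min(rng.sample(pop, 3), key=fitness)  # tournament of 3, twice
                    for _ in range(2))
            child = a + rng.random() * (b - a)            # blend crossover
            child += rng.gauss(0.0, 0.05 * (hi - lo))     # Gaussian mutation
            nxt.append(min(max(child, lo), hi))           # clamp to the search bounds
        pop = nxt
    return min(pop, key=fitness)

# Toy fitness with a known minimum at 30 degrees, standing in for the
# cumulant-matrix fitness of the DOA problem.
true_doa = 30.0
fitness = lambda theta: (theta - true_doa) ** 2

print(ga_minimize(fitness, bounds=(-90.0, 90.0)))  # close to 30.0
```

Unlike Newton's method, the GA needs no gradient of the fitness and keeps a population spread over the angle domain, which is what gives it a chance to escape the false local optima mentioned in the abstract.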

  10. Humanoid infers Archimedes' principle: understanding physical relations and object affordances through cumulative learning experiences

    Science.gov (United States)

    2016-01-01

    Emerging studies indicate that several species such as corvids, apes and children solve ‘The Crow and the Pitcher’ task (from Aesop's Fables) in diverse conditions. Hidden beneath this fascinating paradigm is a fundamental question: by cumulatively interacting with different objects, how can an agent abstract the underlying cause–effect relations to predict and creatively exploit potential affordances of novel objects in the context of sought goals? Re-enacting this Aesop's Fable task on a humanoid within an open-ended ‘learning–prediction–abstraction’ loop, we address this problem and (i) present a brain-guided neural framework that emulates rapid one-shot encoding of ongoing experiences into a long-term memory and (ii) propose four task-agnostic learning rules (elimination, growth, uncertainty and status quo) that correlate predictions from remembered past experiences with the unfolding present situation to gradually abstract the underlying causal relations. Driven by the proposed architecture, the ensuing robot behaviours illustrated causal learning and anticipation similar to natural agents. Results further demonstrate that by cumulatively interacting with few objects, the predictions of the robot in case of novel objects converge close to the physical law, i.e. the Archimedes principle: this being independent of both the objects explored during learning and the order of their cumulative exploration. PMID:27466440

  11. Humanoid infers Archimedes' principle: understanding physical relations and object affordances through cumulative learning experiences.

    Science.gov (United States)

    Bhat, Ajaz Ahmad; Mohan, Vishwanathan; Sandini, Giulio; Morasso, Pietro

    2016-07-01

    Emerging studies indicate that several species such as corvids, apes and children solve 'The Crow and the Pitcher' task (from Aesop's Fables) in diverse conditions. Hidden beneath this fascinating paradigm is a fundamental question: by cumulatively interacting with different objects, how can an agent abstract the underlying cause-effect relations to predict and creatively exploit potential affordances of novel objects in the context of sought goals? Re-enacting this Aesop's Fable task on a humanoid within an open-ended 'learning-prediction-abstraction' loop, we address this problem and (i) present a brain-guided neural framework that emulates rapid one-shot encoding of ongoing experiences into a long-term memory and (ii) propose four task-agnostic learning rules (elimination, growth, uncertainty and status quo) that correlate predictions from remembered past experiences with the unfolding present situation to gradually abstract the underlying causal relations. Driven by the proposed architecture, the ensuing robot behaviours illustrated causal learning and anticipation similar to natural agents. Results further demonstrate that by cumulatively interacting with few objects, the predictions of the robot in case of novel objects converge close to the physical law, i.e. the Archimedes principle: this being independent of both the objects explored during learning and the order of their cumulative exploration. © 2016 The Author(s).

  12. Analysis of cumulative exergy losses in the chains of technological processes

    International Nuclear Information System (INIS)

    Szargut, J.

    1989-01-01

    This paper reports on cumulative exergy consumption (CExC), which characterizes the chain of technological processes leading from natural resources to the final product under consideration. The difference between the CExC and the exergy of the material or energy carrier expresses the cumulative exergy loss (CExL) in the mentioned technological chain. Two apportionment methods for the CExL have been proposed. Partial exergy losses appear in particular links of the technological chain and characterize the influence of the irreversibility of these links. Constituent exergy losses express the influence of the thermodynamic imperfection of the constituent technological chains leading to the final link of the total technological chain. Analysis of the partial and constituent exergy losses indicates possibilities for improvement of the technological chains.
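The two central quantities can be illustrated with a toy technological chain; all exergy values below are invented, and the proportional apportionment shown is a simple illustrative rule, not necessarily one of the paper's two methods:

```python
# Hypothetical chain leading from ore to a rolled product; exergy inputs are in
# MJ per unit of final product (illustrative numbers only).
chain = [
    {"step": "mining",   "exergy_in": 50.0},
    {"step": "smelting", "exergy_in": 30.0},
    {"step": "rolling",  "exergy_in": 12.0},
]
product_exergy = 25.0   # exergy of the final product, MJ/unit

cexc = sum(s["exergy_in"] for s in chain)   # cumulative exergy consumption (CExC)
cexl = cexc - product_exergy                # cumulative exergy loss (CExL)

# Apportion the loss to the links in proportion to their exergy input.
share = {s["step"]: cexl * s["exergy_in"] / cexc for s in chain}
print(cexc, cexl, round(share["mining"], 2))  # 92.0 67.0 36.41
```

Per-link (partial) losses localize the irreversibility of each link, so a breakdown like `share` points to which stage of the chain offers the most room for improvement.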

  13. Triggers in UA2 and UA1

    International Nuclear Information System (INIS)

    Dorenbosch, J.

    1985-01-01

    The UA2 and UA1 trigger systems are described as they will be used after the upgrade of the CERN SPPS. The luminosity of the collider will increase to 3×10^30. The bunch spacing is 4 microseconds, comparable to the time available for a second level trigger at the SSC. The first level triggers are very powerful and deliver trigger rates of about 100 Hz. The UA1 second level trigger operates on the final digitizings with a combination of special and general purpose processors. At the highest trigger levels a small farm of processors performs the final reduction. (orig.)

  14. Measuring a fair and ambitious climate agreement using cumulative emissions

    International Nuclear Information System (INIS)

    Peters, Glen P; Andrew, Robbie M; Solomon, Susan; Friedlingstein, Pierre

    2015-01-01

    Policy makers have called for a ‘fair and ambitious’ global climate agreement. Scientific constraints, such as the allowable carbon emissions to avoid exceeding a 2 °C global warming limit with 66% probability, can help define ambitious approaches to climate targets. However, fairly sharing the mitigation challenge to meet a global target involves human values rather than just scientific facts. We develop a framework based on cumulative emissions of carbon dioxide to compare the consistency of countries’ current emission pledges to the ambition of keeping global temperatures below 2 °C, and, further, compare two alternative methods of sharing the remaining emission allowance. We focus on the recent pledges and other official statements of the EU, USA, and China. The EU and US pledges are close to a 2 °C level of ambition only if the remaining emission allowance is distributed based on current emission shares, which is unlikely to be viewed as ‘fair and ambitious’ by others who presently emit less. China’s stated emissions target also differs from measures of global fairness, owing to emissions that continue to grow into the 2020s. We find that, combined, the EU, US, and Chinese pledges leave little room for other countries to emit CO2 if a 2 °C limit is the objective, essentially requiring all other countries to move towards per capita emissions 7 to 14 times lower than the EU, USA, or China by 2030. We argue that a fair and ambitious agreement for a 2 °C limit that would be globally inclusive and effective in the long term will require stronger mitigation than the goals currently proposed. Given such necessary and unprecedented mitigation and the current lack of availability of some key technologies, we suggest a new diplomatic effort directed at ensuring that the necessary technologies become available in the near future. (letter)
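The two sharing rules being compared (grandfathering by current emission shares versus an equal per-capita allocation) reduce to splitting one remaining budget in proportion to different share vectors. A sketch with placeholder numbers (the budget and the shares below are invented, not the paper's figures):

```python
# Illustrative remaining global CO2 allowance for a 2 C limit, GtCO2 (assumed).
budget = 1000.0

# Hypothetical current-emission shares and population shares for three blocs
# plus the rest of the world (each vector sums to 1).
current_share = {"EU": 0.09, "USA": 0.14, "China": 0.29, "Rest": 0.48}
pop_share     = {"EU": 0.06, "USA": 0.04, "China": 0.18, "Rest": 0.72}

def allocate(shares, budget):
    """Split the remaining allowance in proportion to the given shares."""
    return {k: v * budget for k, v in shares.items()}

inertia = allocate(current_share, budget)   # grandfathering by current emissions
equity  = allocate(pop_share, budget)       # equal per-capita allocation

print(round(inertia["USA"]), round(equity["USA"]))  # 140 40
```

The gap between the two allocations for any one bloc is exactly the "fairness" disagreement the letter quantifies: a high-emitting bloc receives several times more allowance under grandfathering than under a per-capita rule.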

  15. Stimulus conflict triggers behavioral avoidance.

    Science.gov (United States)

    Dignath, David; Eder, Andreas B

    2015-12-01

    According to a recent extension of the conflict-monitoring theory, conflict between two competing response tendencies is registered as an aversive event and triggers a motivation to avoid the source of conflict. In the present study, we tested this assumption. Over five experiments, we examined whether conflict is associated with an avoidance motivation and whether stimulus conflict or response conflict triggers an avoidance tendency. Participants first performed a color Stroop task. In a subsequent motivation test, participants responded to Stroop stimuli with approach- and avoidance-related lever movements. These results showed that Stroop-conflict stimuli increased the frequency of avoidance responses in a free-choice motivation test, and also increased the speed of avoidance relative to approach responses in a forced-choice test. High and low proportions of response conflict in the Stroop task had no effect on avoidance in the motivation test. Avoidance of conflict was, however, obtained even with new conflict stimuli that had not been presented before in a Stroop task, and when the Stroop task was replaced with an unrelated filler task. Taken together, these results suggest that stimulus conflict is sufficient to trigger avoidance.

  16. Industrial accidents triggered by lightning.

    Science.gov (United States)

    Renni, Elisabetta; Krausmann, Elisabeth; Cozzani, Valerio

    2010-12-15

    Natural disasters can cause major accidents in chemical facilities where they can lead to the release of hazardous materials which in turn can result in fires, explosions or toxic dispersion. Lightning strikes are the most frequent cause of major accidents triggered by natural events. In order to contribute towards the development of a quantitative approach for assessing lightning risk at industrial facilities, lightning-triggered accident case histories were retrieved from the major industrial accident databases and analysed to extract information on types of vulnerable equipment, failure dynamics and damage states, as well as on the final consequences of the event. The most vulnerable category of equipment is storage tanks. Lightning damage is incurred by immediate ignition, electrical and electronic systems failure or structural damage with subsequent release. Toxic releases and tank fires tend to be the most common scenarios associated with lightning strikes. Oil, diesel and gasoline are the substances most frequently released during lightning-triggered Natech accidents. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. The UA1 trigger processor

    International Nuclear Information System (INIS)

    Grayer, G.H.

    1981-01-01

    Experiment UA1 is a large multi-purpose spectrometer at the CERN proton-antiproton collider, scheduled for late 1981. The principal trigger is formed on the basis of the energy deposition in calorimeters. A trigger decision taken in under 2.4 microseconds can avoid dead time losses due to the bunched nature of the beam. To achieve this we have built fast 8-bit charge to digital converters followed by two identical digital processors tailored to the experiment. The outputs of groups of the 2440 photomultipliers in the calorimeters are summed to form a total of 288 input channels to the ADCs. A look-up table in RAM is used to convert the digitised photomultiplier signals to energy. One processor forms energy sums over combinations of input channels; the other counts the number of clusters with electromagnetic or hadronic energy above pre-determined levels. Up to twelve combinations of these conditions, together with external information, may be combined in coincidence or in veto to form the final trigger. Provision has been made for testing using simulated data in an off-line mode, and sampling real data when on-line. (orig.)
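The look-up-table stage can be sketched as follows. The channel count matches the text, but the calibration constant, thresholds and the cluster definition (a run of adjacent channels above threshold) are invented for illustration:

```python
# Sketch of the LUT stage: 8-bit ADC codes are mapped to energies through a
# per-channel RAM table, then clusters above a threshold are counted.
N_CHANNELS = 288
GAIN = 0.5  # GeV per ADC count, hypothetical calibration

# One 256-entry table per channel; here all channels share the same calibration.
lut = [code * GAIN for code in range(256)]

def energies(adc_codes):
    return [lut[c] for c in adc_codes]

def count_clusters(e, threshold):
    """Count runs of adjacent channels whose energy exceeds the threshold."""
    count, inside = 0, False
    for v in e:
        if v > threshold and not inside:
            count, inside = count + 1, True
        elif v <= threshold:
            inside = False
    return count

adc = [0] * N_CHANNELS
adc[10:13] = [200, 220, 180]   # one three-channel cluster
adc[100] = 150                 # a second, single-channel cluster
e = energies(adc)
print(count_clusters(e, threshold=20.0))  # 2
```

Replacing the multiply by a table lookup is what makes the conversion fast enough for a sub-2.4-microsecond decision: the RAM contents can encode an arbitrary per-channel calibration at no extra latency.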

  18. ATLAS Level-1 Topological Trigger

    CERN Document Server

    Zheng, Daniel; The ATLAS collaboration

    2018-01-01

    The ATLAS experiment has introduced and recently commissioned a completely new hardware sub-system of its first-level trigger: the topological processor (L1Topo). L1Topo consists of two AdvancedTCA blades mounting state-of-the-art FPGA processors, providing high input bandwidth (up to 4 Gb/s) and low latency data processing (200 ns). L1Topo is able to select collision events by applying kinematic and topological requirements on candidate objects (energy clusters, jets, and muons) measured by calorimeters and muon sub-detectors. Results from data recorded using the L1Topo trigger will be presented. These results demonstrate a significantly improved background event rejection, thus allowing for a rate reduction without efficiency loss. This improvement has been shown for several physics processes leading to low-pT leptons, including H->tau tau and J/Psi->mu mu. In addition to describing the L1Topo trigger system, we will discuss the use of an accurate L1Topo simulation as a powerful tool to validate and optimize...

  19. ATLAS FTK: Fast Track Trigger

    CERN Document Server

    Volpi, Guido; The ATLAS collaboration

    2015-01-01

    An overview of the ATLAS Fast Tracker processor is presented, reporting the design of the system, its expected performance, and the integration status. The next LHC runs, with a significant increase in instantaneous luminosity, will provide a big challenge to the trigger and data acquisition systems of all the experiments. An intensive use of the tracking information at the trigger level will be important to keep high efficiency in interesting events, despite the increase in multiple p-p collisions per bunch crossing (pile-up). In order to increase the use of tracks within the High Level Trigger (HLT), the ATLAS experiment planned the installation of a hardware processor dedicated to tracking: the Fast TracKer (FTK) processor. The FTK is designed to perform full scan track reconstruction at every Level-1 accept. To achieve this goal, the FTK uses a fully parallel architecture, with algorithms designed to exploit the computing power of custom VLSI chips, the Associative Memory, as well as modern FPGAs. The FT...

  20. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
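The core event-triggered idea (recompute the control law only when the state has drifted sufficiently far from its value at the last update) can be sketched on a toy linear system. The dynamics, gain and threshold below are hand-picked assumptions for illustration, not the paper's HDP design:

```python
import numpy as np

# Toy discrete-time system x[k+1] = A x[k] + B u[k] with a hand-chosen
# stabilizing state-feedback gain K (not learned via HDP).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
K = np.array([[-10.0, -5.0]])

x = np.array([[1.0], [0.0]])
x_event = x.copy()              # state at the most recent control update
u = K @ x_event
updates, threshold = 0, 0.01    # event threshold, an assumed constant

for _ in range(200):
    if np.linalg.norm(x - x_event) > threshold:   # event-triggered condition
        x_event = x.copy()
        u = K @ x_event                            # update control only on events
        updates += 1
    x = A @ x + B @ u                              # plant evolves every step

print(round(float(np.linalg.norm(x)), 3), updates)
```

The state still converges to a small neighborhood of the origin, while the control law is recomputed far fewer times than the 200 plant steps, which is exactly the computation/transmission saving the abstract claims over periodic updates.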

  1. Event-triggered output feedback control for distributed networked systems.

    Science.gov (United States)

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2016-01-01

    This paper addresses the problem of output-feedback communication and control with an event-triggered framework in the context of distributed networked control systems. The design problem of the event-triggered output-feedback control is posed as a linear matrix inequality (LMI) feasibility problem. The scheme is developed for the distributed system where only partial states are available. In this scheme, a subsystem uses local observers and shares its information with its neighbors only when the subsystem's local error exceeds a specified threshold. The developed method is illustrated by using a coupled cart example from the literature. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Evaluating Cumulative Ecosystem Response to Restoration Projects in the Columbia River Estuary, Annual Report 2004

    Energy Technology Data Exchange (ETDEWEB)

    Diefenderfer, Heida L.; Roegner, Curtis; Thom, Ronald M.; Dawley, Earl M.; Whiting, Allan H.; Johnson, Gary E.; Sobocinski, Kathryn L.; Anderson, Michael G.; Ebberts, Blaine

    2005-12-15

    The restoration of wetland salmon habitat in the tidal portion of the Columbia River is occurring at an accelerating pace and is anticipated to improve habitat quality and effect hydrological reconnection between existing and restored habitats. Currently multiple groups are applying a variety of restoration strategies in an attempt to emulate historic estuarine processes. However, the region lacks both a standardized means of evaluating the effectiveness of individual projects as well as methods for determining the cumulative effects of all restoration projects on a regional scale. This project is working to establish a framework to evaluate individual and cumulative ecosystem responses to restoration activities in order to validate the effectiveness of habitat restoration activities designed to benefit salmon through improvements to habitat quality and habitat opportunity (i.e. access) in the Columbia River from Bonneville Dam to the ocean. The review and synthesis of approaches to measure the cumulative effects of multiple restoration projects focused on defining methods and metrics of relevance to the CRE, and, in particular, juvenile salmon use of this system. An extensive literature review found no previous study assessing the cumulative effects of multiple restoration projects on the fundamental processes and functions of a large estuarine system, although studies are underway in other large land-margin ecosystems including the Florida Everglades and the Louisiana coastal wetlands. Literature from a variety of scientific disciplines was consulted to identify the ways that effects can accumulate (e.g., delayed effects, cross-boundary effects, compounding effects, indirect effects, triggers and thresholds) as well as standard and innovative tools and methods utilized in cumulative effects analyses: conceptual models, matrices, checklists, modeling, trends analysis, geographic information systems, carrying capacity analysis, and ecosystem analysis. Potential

  3. Headache triggers in the US military.

    Science.gov (United States)

    Theeler, Brett J; Kenney, Kimbra; Prokhorenko, Olga A; Fideli, Ulgen S; Campbell, William; Erickson, Jay C

    2010-05-01

    Headaches can be triggered by a variety of factors. Military service members have a high prevalence of headache but the factors triggering headaches in military troops have not been identified. The objective of this study is to determine headache triggers in soldiers and military beneficiaries seeking specialty care for headaches. A total of 172 consecutive US Army soldiers and military dependents (civilians) evaluated at the headache clinics of 2 US Army Medical Centers completed a standardized questionnaire about their headache triggers. A total of 150 (87%) patients were active-duty military members and 22 (13%) patients were civilians. In total, 77% of subjects had migraine; 89% of patients reported at least one headache trigger with a mean of 8.3 triggers per patient. A wide variety of headache triggers was seen with the most common categories being environmental factors (74%), stress (67%), consumption-related factors (60%), and fatigue-related factors (57%). The types of headache triggers identified in active-duty service members were similar to those seen in civilians. Stress-related triggers were significantly more common in soldiers. There were no significant differences in trigger types between soldiers with and without a history of head trauma. Headaches in military service members are triggered mostly by the same factors as in civilians with stress being the most common trigger. Knowledge of headache triggers may be useful for developing strategies that reduce headache occurrence in the military.

  4. The sign problem in real-time path integral simulations: Using the cumulant action to implement multilevel blocking

    International Nuclear Information System (INIS)

    Mak, C. H.

    2009-01-01

    A practical method to tackle the sign problem in real-time path integral simulations is proposed based on the multilevel blocking idea. The formulation is made possible by using a cumulant expansion of the action, which in addition to addressing the sign problem, provides an unbiased estimator for the action from a statistically noisy sample of real-time paths. The cumulant formulation also allows the analytical gradients of the action to be computed with little extra computational effort, and it can easily be implemented in a massively parallel environment.
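The mechanism can be illustrated in the simplest setting: for a Gaussian-distributed action S, the identity ⟨e^{iS}⟩ = exp(iκ₁ − κ₂/2) is exact, so the first two cumulants of a noisy sample reproduce the wildly oscillating average without summing phases directly. A sketch with a toy Gaussian sample (not a path integral, and not the multilevel blocking scheme itself):

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.normal(loc=1.0, scale=2.0, size=500_000)   # noisy sample of "actions"

direct = np.mean(np.exp(1j * S))          # direct average over oscillating phases
k1, k2 = S.mean(), S.var()                # first two sample cumulants
cumulant = np.exp(1j * k1 - k2 / 2)       # second-order cumulant estimate

exact = np.exp(1j * 1.0 - 2.0 ** 2 / 2)   # closed form for a Gaussian action
print(abs(direct - exact) < 0.01, abs(cumulant - exact) < 0.01)  # True True
```

For non-Gaussian actions higher cumulants enter the expansion, but the payoff is the same: smooth moments of S are estimated from the sample and the oscillatory exponential is evaluated analytically, which is what makes the expansion compatible with the multilevel blocking recursion.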

  5. Proposed Hall D Detector Electronics

    International Nuclear Information System (INIS)

    Paul Smith

    1998-01-01

    With nearly 10^5 channels, the signal processing and data acquisition electronics system will present a significant challenge. We envisage much of the electronics being physically located on or near the detectors to avoid the long and expensive low-level signal cables otherwise required. CERN detectors such as COMPASS and ATLAS provide a good model, and we should build on their experience as much as possible. Radiation hardness and minimal power dissipation are additional constraints. The high beam rate will necessitate good time resolution, integrated low-level triggering capability and sufficient pipelining of the data to accommodate the trigger decision time. A proposed architecture is shown in the figure. Detector channels are either "pixels", e.g. PWCs, drift chambers, and ring Cherenkov counters, or charge detectors, e.g. CsI or lead glass. Pixel detectors are discriminated, while charge detectors are digitized by Flash ADCs (FADC). The digitized information is pipelined in shift registers which provide a time window for the first level of triggering to consider. After passing through the shift registers, the data are further pipelined in RAM to provide time for the level 1 trigger decision. In the event of a level 1 trigger, the RAM contents are transferred to a level 2 processor farm where more detailed trigger decisions take place.

  6. The Jefferson Lab Trigger Supervisor System

    International Nuclear Information System (INIS)

    Ed Jastrzembski; David Abbott; Graham Heyes; R.W. MacLeod; Carl Timmer; Elliott Wolin

    2000-01-01

    We discuss the design and performance of a Trigger Supervisor System for use in nuclear physics experiments at Jefferson Lab. We also discuss the enhanced features of a new Trigger Supervisor Module now under construction

  7. The Jefferson Lab Trigger Supervisor System

    International Nuclear Information System (INIS)

    Jastrzembski, E.; Abbott, D.J.; Heyes, W.G.; MacLeod, R.W.; Timmer, C.; Wolin, E.

    1999-01-01

    The authors discuss the design and performance of a Trigger Supervisor System for use in nuclear physics experiments at Jefferson Lab. They also discuss the enhanced features of a new Trigger Supervisor Module now under construction

  8. The Trigger System of the CMS Experiment

    OpenAIRE

    Felcini, Marta

    2008-01-01

    We give an overview of the main features of the CMS trigger and data acquisition (DAQ) system. Then, we illustrate the strategies and trigger configurations (trigger tables) developed for the detector calibration and physics program of the CMS experiment, at start-up of LHC operations, as well as their possible evolution with increasing luminosity. Finally, we discuss the expected CPU time performance of the trigger algorithms and the CPU requirements for the event filter farm at start-up.

  9. Triggers for a high sensitivity charm experiment

    International Nuclear Information System (INIS)

    Christian, D.C.

    1994-07-01

    Any future charm experiment clearly should implement an E_T trigger and a μ trigger. In order to reach the 10^8 reconstructed charm level for hadronic final states, a high quality vertex trigger will almost certainly also be necessary. The best hope for the development of an offline quality vertex trigger lies in further development of the ideas of data-driven processing pioneered by the Nevis/U. Mass. group.

  10. Maintenance hemodialysis patients have high cumulative radiation exposure.

    LENUS (Irish Health Repository)

    Kinsella, Sinead M

    2010-10-01

    Hemodialysis is associated with an increased risk of neoplasms which may result, at least in part, from exposure to ionizing radiation associated with frequent radiographic procedures. In order to estimate the average radiation exposure of those on hemodialysis, we conducted a retrospective study of 100 patients in a university-based dialysis unit followed for a median of 3.4 years. The number and type of radiological procedures were obtained from a central radiology database, and the cumulative effective radiation dose was calculated using standardized, procedure-specific radiation levels. The median annual radiation dose was 6.9 millisieverts (mSv) per patient-year. However, 14 patients had an annual cumulative effective radiation dose over 20 mSv, the upper averaged annual limit for occupational exposure. The median total cumulative effective radiation dose per patient over the study period was 21.7 mSv, in which 13 patients had a total cumulative effective radiation dose over 75 mSv, a value reported to be associated with a 7% increased risk of cancer-related mortality. Two-thirds of the total cumulative effective radiation dose was due to CT scanning. The average radiation exposure was significantly associated with the cause of end-stage renal disease, history of ischemic heart disease, transplant waitlist status, number of in-patient hospital days over follow-up, and death during the study period. These results highlight the substantial exposure to ionizing radiation in hemodialysis patients.
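The dose-reconstruction arithmetic behind such estimates is straightforward: multiply procedure counts by procedure-specific effective doses and sum. The per-procedure doses below are typical published values (rounded), the fistulogram figure and the procedure counts are hypothetical, and none of the numbers are taken from this study:

```python
# Procedure-specific effective doses in mSv (typical published values, rounded;
# the fistulogram figure is an assumption).
DOSE_MSV = {"chest_xray": 0.02, "abdominal_ct": 8.0, "chest_ct": 7.0, "fistulogram": 1.0}

# Hypothetical procedure counts for one patient over 3.4 years of follow-up.
procedures = {"chest_xray": 25, "abdominal_ct": 3, "chest_ct": 1, "fistulogram": 4}

total = sum(DOSE_MSV[p] * n for p, n in procedures.items())
annual = total / 3.4
print(round(total, 1), round(annual, 1))  # 35.5 10.4
```

Note how the CT studies dominate the total even at low counts, mirroring the study's finding that two-thirds of the cumulative effective dose came from CT scanning.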

  11. Event-triggered cooperative target tracking in wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Lu Kelin

    2016-10-01

    Since the issues of low communication bandwidth supply and limited battery capacity are very crucial for wireless sensor networks, this paper focuses on the problem of event-triggered cooperative target tracking based on set-membership information filtering. We study some fundamental properties of the set-membership information filter with multiple sensor measurements. First, a sufficient condition is derived for the set-membership information filter, under which the boundedness of the outer ellipsoidal approximation set of the estimation means is guaranteed. Second, the equivalence property between the parallel and sequential versions of the set-membership information filter is presented. Finally, the results are applied to a 1D event-triggered target tracking scenario in which the negative information is exploited in the sense that the measurements that do not satisfy the triggering conditions are modelled as set-membership measurements. The tracking performance of the proposed method is validated with extensive Monte Carlo simulations.

  12. Evaluation of potential meteorological triggers of large landslides in sensitive glaciomarine clay, eastern Canada

    Directory of Open Access Journals (Sweden)

    D. Gauthier

    2012-11-01

    Heavy rains spread over some interval preceding large landslides in sensitive glaciomarine clay in eastern Canada are often noted as a triggering or causative factor in case studies or research reports for individual landslides, although the quantity or duration of the triggering rain event has never been characterized adequately. We selected five large landslide events that occurred in the glaciomarine clay in eastern Canada, and calculated cumulative antecedent precipitation for intervals ranging between one and 365 days preceding each event. We also calculated the antecedent precipitation values for every other day in the record, and computed the relative rank of the landslide day within the complete record. Our results show that several intervals for each landslide event are highly ranked – including those preceding a presumably earthquake-triggered landslide – but overall the rankings were highly variable, ranging between 99% and 6%. The set of highest-ranking intervals are unique for each event, including both short and long-term cumulative precipitation. All of the landslides occurred in the spring months, and the release of sequestered surface and ground water during the spring ground thaw may be related to the timing of the large landslides, so that the evolution of ground frost in the early winter may be of interest for landslide prediction. We found no simple precipitation threshold for triggering large landslides in sensitive glaciomarine clay in eastern Canada, suggesting that some complex temporal and spatial combination of pre-conditions, external energy (e.g. earthquakes), precipitation triggers and other factors such as ground frost formation and thaw are required to trigger a landslide.

  13. Observing earthquakes triggered in the near field by dynamic deformations

    Science.gov (United States)

    Gomberg, J.; Bodin, P.; Reasenberg, P.A.

    2003-01-01

    We examine the hypothesis that dynamic deformations associated with seismic waves trigger earthquakes in many tectonic environments. Our analysis focuses on seismicity at close range (within the aftershock zone), complementing published studies of long-range triggering. Our results suggest that dynamic triggering is not confined to remote distances or to geothermal and volcanic regions. Long unilaterally propagating ruptures may focus radiated dynamic deformations in the propagation direction. Therefore, we expect seismicity triggered dynamically by a directive rupture to occur asymmetrically, with a majority of triggered earthquakes in the direction of rupture propagation. Bilaterally propagating ruptures also may be directive, and we propose simple criteria for assessing their directivity. We compare the inferred rupture direction and observed seismicity rate change following 15 earthquakes (M 5.7 to M 8.1) that occurred in California and Idaho in the United States, the Gulf of Aqaba, Syria, Guatemala, China, New Guinea, Turkey, Japan, Mexico, and Antarctica. Nine of these mainshocks had clearly directive, unilateral ruptures. Of these nine, seven apparently induced an asymmetric increase in seismicity rate that correlates with the rupture direction. The two exceptions include an earthquake preceded by a comparable-magnitude event on a conjugate fault and another for which data limitations prohibited conclusive results. Similar (but weaker) correlations were found for the bilaterally rupturing earthquakes we studied. Although the static stress change also may trigger seismicity, it and the seismicity it triggers are expected to be similarly asymmetric only if the final slip is skewed toward the rupture terminus. For several of the directive earthquakes, we suggest that the seismicity rate change correlates better with the dynamic stress field than the static stress change.

  14. Study On Aftershock Triggering In Consideration Of Tectonic Stress Field

    Science.gov (United States)

    Hu, C.; Cai, Y.

    2007-12-01

The occurrence of an earthquake is related to the strength of rock and to the tectonic stress field. The seismic risk factor (SRF), D = |τn|/(μσn), is proposed in this paper to describe the dangerous status of aftershock triggering. The tectonic stress field can be constrained by earthquakes, the velocity field from GPS, and geological surveys. As a first-order approximation, the magnitudes of the regional tectonic stress field can be estimated by the Coulomb failure criterion. The finite element method (FEM) and the factor D are used to study aftershock triggering by the 1976 Tangshan Ms = 7.8 earthquake. The results show that: (1) Most of the aftershocks triggered by the Tangshan earthquake occurred in the two leaf-shaped regions of D ≥ 1 near the north-east end of the main-shock fault; the largest leaf is about 100 km long and 40 km wide. (2) The areas of aftershock triggering predicted by the seismic risk factor D and by ΔCFS (the change in the Coulomb failure stress) are almost the same near the fault. The difference between them is that the aftershock area predicted by ΔCFS ≥ 0 is too large, while the area predicted by D ≥ 1 is limited; the areas of aftershock triggering predicted by ΔCFS ≥ 0.04 MPa are nearly the same as those of D ≥ 1 obtained in this study. (3) ΔCFS = 0.01 MPa is sometimes taken as a low threshold of aftershock triggering. However, ΔCFS ≥ 0 only means that the probability of earthquake triggering increases, not that an earthquake will occur; earthquake occurrence is related not only to ΔCFS but also to the tectonic stress field before the main shock.
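The seismic risk factor defined in this abstract reduces to a one-line computation. The sketch below is illustrative only, with an assumed friction coefficient μ = 0.6 and made-up stress values:

```python
# Sketch of the seismic risk factor D = |tau_n| / (mu * sigma_n) described
# above; the function name, friction coefficient, and stresses are assumptions.

def seismic_risk_factor(tau_n, sigma_n, mu=0.6):
    """Return D; regions with D >= 1 are flagged as prone to aftershock triggering."""
    if sigma_n <= 0 or mu <= 0:
        raise ValueError("normal stress and friction coefficient must be positive")
    return abs(tau_n) / (mu * sigma_n)

# Example: 12 MPa shear stress on a plane under 18 MPa normal stress.
d = seismic_risk_factor(12.0, 18.0, mu=0.6)
print(f"D = {d:.2f}, triggering risk: {d >= 1.0}")
```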

  15. A new functional and structural generation of JK edge-triggered flip-flops

    International Nuclear Information System (INIS)

    Stefanescu, I.

    1977-01-01

A new type of logical structure for a JK edge-triggered flip-flop is proposed by the author. The structure facilitates flip-flop realizations, named ''jk-JK edge-triggered flip-flops'', that satisfy more functional requirements and offer increased flexibility in logical design with respect to conventional JK edge-triggered flip-flops. The function of the new flip-flops covers that of the JK edge-triggered flip-flops known as integrated circuits. (author)

  16. First level trigger of the DIRAC experiment

    International Nuclear Information System (INIS)

    Afanas'ev, L.G.; Karpukhin, V.V.; Kulikov, A.V.; Gallas, M.

    2001-01-01

The logic of the first level trigger of the DIRAC experiment at CERN is described. Different trigger modes run in parallel, with event tagging and optional independent prescaling. The CAMAC-based trigger system is completely computer controlled.

  17. Cumulative Trauma Among Mayas Living in Southeast Florida.

    Science.gov (United States)

    Millender, Eugenia I; Lowe, John

    2017-06-01

Mayas, having experienced genocide, exile, and severe poverty, are at high risk for the consequences of cumulative trauma that continually resurfaces through current fear of an uncertain future. Little is known about the mental health and alcohol use status of this population. This correlational study explored the relationship of cumulative trauma as it relates to social determinants of health (years in the United States, education, health insurance status, marital status, and employment), psychological health (depression symptoms), and health behaviors (alcohol use) of 102 Guatemalan Mayas living in Southeast Florida. The results of this study indicated that, as specific social determinants of health and cumulative trauma increased, depression symptoms (particularly among women) and the risk for harmful alcohol use (particularly among men) increased. Identifying risk factors at an early stage before serious disease or problems are manifest provides room for early screening leading to early identification, early treatment, and better outcomes.

  18. Session: What do we know about cumulative or population impacts

    Energy Technology Data Exchange (ETDEWEB)

    Kerlinger, Paul; Manville, Al; Kendall, Bill

    2004-09-01

This session at the Wind Energy and Birds/Bats workshop consisted of a panel discussion followed by a discussion/question and answer period. The panelists were Paul Kerlinger, Curry and Kerlinger, LLC, Al Manville, U.S. Fish and Wildlife Service, and Bill Kendall, U.S. Geological Survey. The panel addressed the potential cumulative impacts of wind turbines on bird and bat populations over time. Panel members gave brief presentations that touched on what is currently known, what laws apply, and the usefulness of population modeling. Topics addressed included which sources of modeling should be included in cumulative impacts, comparison of impacts from different modes of energy generation, as well as what research is still needed regarding cumulative impacts of wind energy development on bird and bat populations.

  19. Estimating a population cumulative incidence under calendar time trends

    DEFF Research Database (Denmark)

    Hansen, Stefan N; Overgaard, Morten; Andersen, Per K

    2017-01-01

    BACKGROUND: The risk of a disease or psychiatric disorder is frequently measured by the age-specific cumulative incidence. Cumulative incidence estimates are often derived in cohort studies with individuals recruited over calendar time and with the end of follow-up governed by a specific date...... by calendar time trends, the total sample Kaplan-Meier and Aalen-Johansen estimators do not provide useful estimates of the general risk in the target population. We present some alternatives to this type of analysis. RESULTS: We show how a proportional hazards model may be used to extrapolate disease risk...... estimates if proportionality is a reasonable assumption. If not reasonable, we instead advocate that a more useful description of the disease risk lies in the age-specific cumulative incidence curves across strata given by time of entry or perhaps just the end of follow-up estimates across all strata...

  20. Baltic Sea biodiversity status vs. cumulative human pressures

    DEFF Research Database (Denmark)

    Andersen, Jesper H.; Halpern, Benjamin S.; Korpinen, Samuli

    2015-01-01

    Abstract Many studies have tried to explain spatial and temporal variations in biodiversity status of marine areas from a single-issue perspective, such as fishing pressure or coastal pollution, yet most continental seas experience a wide range of human pressures. Cumulative impact assessments have...... been developed to capture the consequences of multiple stressors for biodiversity, but the ability of these assessments to accurately predict biodiversity status has never been tested or ground-truthed. This relationship has similarly been assumed for the Baltic Sea, especially in areas with impaired...... status, but has also never been documented. Here we provide a first tentative indication that cumulative human impacts relate to ecosystem condition, i.e. biodiversity status, in the Baltic Sea. Thus, cumulative impact assessments offer a promising tool for informed marine spatial planning, designation...

  1. Detecting spatial patterns with the cumulant function – Part 1: The theory

    Directory of Open Access Journals (Sweden)

    P. Naveau

    2008-02-01

Full Text Available In climate studies, detecting spatial patterns that deviate largely from the sample mean remains a statistical challenge. Although a Principal Component Analysis (PCA), or equivalently an Empirical Orthogonal Functions (EOF) decomposition, is often applied for this purpose, it provides meaningful results only if the underlying multivariate distribution is Gaussian. Indeed, PCA is based on optimizing second-order moments, and the covariance matrix captures the full dependence structure of multivariate Gaussian vectors. Whenever the application at hand cannot satisfy this normality hypothesis (e.g. precipitation data), alternatives and/or improvements to PCA have to be developed and studied. To go beyond this second-order statistics constraint, which limits the applicability of PCA, we take advantage of the cumulant function, which can provide higher-order moment information. The cumulant function, well known in the statistical literature, allows us to propose a new, simple and fast procedure to identify spatial patterns for non-Gaussian data. Our algorithm consists of maximizing the cumulant function. Three families of multivariate random vectors, for which explicit computations are obtained, are implemented to illustrate our approach. In addition, we show that our algorithm corresponds to selecting the directions along which projected data display the largest spread over the marginal probability density tails.
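The maximization step described in this abstract can be sketched on toy data: scan unit directions u and keep the one with the largest empirical cumulant function log E[exp(t⟨X, u⟩)]. The grid search, sample sizes, and toy distributions below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

# Toy sketch of maximizing the empirical cumulant function over directions.
# Heavy-tailed variation is placed along the x-axis, Gaussian along y, so the
# selected direction should tend toward the heavy-tailed axis.

rng = np.random.default_rng(0)
X = np.column_stack([rng.exponential(1.0, 2000) - 1.0,   # centred, heavy right tail
                     rng.normal(0.0, 1.0, 2000)])        # symmetric Gaussian

def empirical_cumulant(X, u, t=1.0):
    """Empirical cumulant function log E[exp(t <X, u>)] along direction u."""
    return np.log(np.mean(np.exp(t * (X @ u))))

# Grid of unit directions over the half-circle.
angles = np.linspace(0.0, np.pi, 180, endpoint=False)
dirs = np.column_stack([np.cos(angles), np.sin(angles)])
values = [empirical_cumulant(X, u) for u in dirs]
best = dirs[int(np.argmax(values))]
print("direction of largest cumulant:", np.round(best, 3))
```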

  2. CEAMF study, volume 2 : cumulative effects indicators, thresholds, and case studies : final

    International Nuclear Information System (INIS)

    2003-03-01

    The four types of cumulative effects on the environment are: alteration, loss, and fragmentation of habitat; disturbance; barriers to movement; and direct and indirect mortality. Defining where and how human activities can be continued without irreversible net harm to the environment is part of cumulative effects management. Various land-use and habitat indicators were tested in the Blueberry and Sukunka study areas of British Columbia, to address the environmental effects associated with oil and gas development. As recommended, a tiered threshold approach was used to allow for flexibility in different land management regimes and ecological settings. Success will depend on defining acceptable change, threshold values, standard public database, standard processes to calculate indicator values using the database, and project-specific and cooperative management actions. A pilot study was suggested to test the candidate thresholds and implementation process. The two areas proposed for consideration were the Jedney Enhanced Resource Development Resource Management Zone in the Fort St. John Forest District, and the Etsho Enhanced Resource Development Resource Management Zone in the Fort Nelson Forest District. Both are of interest to the petroleum and forest sectors, and support the woodland caribou, a species which is extremely sensitive to cumulative effects of habitat fragmentation and disturbance. 117 refs., 11 tabs., 39 figs.

  3. Cumulative effects in strategic environmental assessment: The influence of plan boundaries

    Energy Technology Data Exchange (ETDEWEB)

    Bidstrup, Morten, E-mail: bidstrup@plan.aau.dk [Aalborg University (Denmark); Kørnøv, Lone, E-mail: lonek@plan.aau.dk [Aalborg University (Denmark); Partidário, Maria Rosário, E-mail: mariapartidario@tecnico.ulisboa.pt [CEG-IST, Instituto Superior Técnico, Universidade de Lisboa (Portugal)

    2016-02-15

    Cumulative effects (CE) assessment is lacking quality in impact assessment (IA) worldwide. It has been argued that the strategic environmental assessment (SEA) provides a suitable IA framework for addressing CE because it is applied to developments with broad boundaries, but few have tested this claim. Through a case study on the Danish mining sector, this article explores how plan boundaries influence the analytical boundaries applied for assessing CE in SEA. The case was studied through document analysis in combination with semi-structured group interviews of the responsible planners, who also serve as SEA practitioners. It was found that CE are to some extent assessed and managed implicitly throughout the planning process. However, this is through a focus on lowering the cumulative stress of mining rather than the cumulative stress on and capacity of the receiving environment. Plan boundaries do influence CE assessment, though all boundaries are not equally influential. The geographical and time boundaries of the Danish mining plans are broad or flexible enough to accommodate a meaningful assessment of CE, but the topical boundary is restrictive. The study indicates that collaboration among planning authorities and legally appointed CE leadership may facilitate better practice on CE assessment in sector-specific SEA contexts. However, most pressing is the need for relating assessment to the receiving environment as opposed to solely the stress of a proposed plan.

  4. Cumulative effects in strategic environmental assessment: The influence of plan boundaries

    International Nuclear Information System (INIS)

    Bidstrup, Morten; Kørnøv, Lone; Partidário, Maria Rosário

    2016-01-01

    Cumulative effects (CE) assessment is lacking quality in impact assessment (IA) worldwide. It has been argued that the strategic environmental assessment (SEA) provides a suitable IA framework for addressing CE because it is applied to developments with broad boundaries, but few have tested this claim. Through a case study on the Danish mining sector, this article explores how plan boundaries influence the analytical boundaries applied for assessing CE in SEA. The case was studied through document analysis in combination with semi-structured group interviews of the responsible planners, who also serve as SEA practitioners. It was found that CE are to some extent assessed and managed implicitly throughout the planning process. However, this is through a focus on lowering the cumulative stress of mining rather than the cumulative stress on and capacity of the receiving environment. Plan boundaries do influence CE assessment, though all boundaries are not equally influential. The geographical and time boundaries of the Danish mining plans are broad or flexible enough to accommodate a meaningful assessment of CE, but the topical boundary is restrictive. The study indicates that collaboration among planning authorities and legally appointed CE leadership may facilitate better practice on CE assessment in sector-specific SEA contexts. However, most pressing is the need for relating assessment to the receiving environment as opposed to solely the stress of a proposed plan.

5. The DØ software trigger

    International Nuclear Information System (INIS)

    Linnemann, J.T.; Michigan State Univ., East Lansing, MI

    1992-10-01

In the DØ experiment, the software filter operates in a processor farm with each node processing a single event. Processing is data-driven: the filter does local processing to verify the candidates from the hardware trigger. The filter code consists of independent pieces called ''tools''; the processing for a given hardware bit is a ''script'' invoking one or more ''tools'' sequentially. An offline simulator drives the same code with the same configuration files, running on real or simulated data. Online tests use farm nodes running parasitically on the data stream. We discuss the performance of the system and how we attempt to verify its correctness.
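The tool/script organization described here can be mimicked in a few lines: each hardware bit maps to an ordered list of independent tools, and the event passes that bit only if every tool confirms it. The trigger-bit name and tool cuts below are invented for illustration and are not DØ code:

```python
# Illustrative sketch (not DØ code) of the "script of tools" filter structure.

def muon_candidate(event):        # tool: assumed example cut
    return event.get("muon_pt", 0.0) > 5.0

def track_match(event):           # tool: assumed example cut
    return event.get("n_tracks", 0) >= 1

# A "script" per hardware trigger bit: tools invoked sequentially.
SCRIPTS = {
    "HW_BIT_MUON": [muon_candidate, track_match],
}

def run_filter(hardware_bits, event):
    """Return the subset of fired hardware bits whose script accepts the event."""
    passed = set()
    for bit in hardware_bits:
        tools = SCRIPTS.get(bit)
        if tools and all(tool(event) for tool in tools):
            passed.add(bit)
    return passed

print(run_filter({"HW_BIT_MUON"}, {"muon_pt": 7.2, "n_tracks": 3}))
```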

  6. Calorimeter triggers for hard collisions

    International Nuclear Information System (INIS)

    Landshoff, P.V.; Polkinghorne, J.C.

    1978-01-01

We discuss the use of a forward calorimeter to trigger on hard hadron-hadron collisions. We give a derivation in the covariant parton model of the Ochs-Stodolsky scaling law for single-hard-scattering processes, and investigate the conditions under which a multiple-scattering mechanism might instead dominate. With a proton beam, this mechanism results in six transverse jets, with a total average multiplicity about twice that seen in ordinary events. We estimate that its cross section is likely to be experimentally accessible at values of the beam energy in the region of 100 GeV/c

  7. Study on the plasma generation characteristics of an induction-triggered coaxial pulsed plasma thruster

    Science.gov (United States)

    Weisheng, CUI; Wenzheng, LIU; Jia, TIAN; Xiuyang, CHEN

    2018-02-01

    At present, spark plugs are used to trigger discharge in pulsed plasma thrusters (PPT), which are known to be life-limiting components due to plasma corrosion and carbon deposition. A strong electric field could be formed in a cathode triple junction (CTJ) to achieve a trigger function under vacuum conditions. We propose an induction-triggered electrode structure on the basis of the CTJ trigger principle. The induction-triggered electrode structure could increase the electric field strength of the CTJ without changing the voltage between electrodes, contributing to a reduction in the electrode breakdown voltage. Additionally, it can maintain the plasma generation effect when the breakdown voltage is reduced in the discharge experiments. The induction-triggered electrode structure could ensure an effective trigger when the ablation distance of Teflon increases, and the magnetic field produced by the discharge current could further improve the plasma density and propagation velocity. The induction-triggered coaxial PPT we propose has a simplified trigger structure, and it is an effective attempt to optimize the micro-satellite thruster.

  8. A study of the relationship between peak skin dose and cumulative air kerma in interventional neuroradiology and cardiology

    International Nuclear Information System (INIS)

    Neil, S; Padgham, C; Martin, C J

    2010-01-01

A study of peak skin doses (PSDs) during neuroradiology and cardiology interventional procedures has been carried out using Gafchromic XR-RV2 film. Use of mosaics made from squares held in cling film has allowed doses to the head to be mapped successfully. The displayed cumulative air kerma (CAK) has been calibrated in terms of cumulative entrance surface dose (CESD) and results indicate that this can provide a reliable indicator of the PSD in neuroradiology. Results linking PSD to CESD for interventional cardiology were variable, but CAK is still considered to provide the best option for use as an indicator of potential radiation-induced effects. A CESD exceeding 3 Gy is considered a suitable action level for triggering follow-up of patients in neuroradiology and cardiology for possible skin effects. Application of dose action levels defined in this way would affect 8% of neurological embolisation procedures and 5% of cardiology ablation and multiple stent procedures at the hospitals where the investigations were carried out. A close relationship was observed between CESD and dose-area product (DAP) for particular types of procedure, and DAPs of 200-300 Gy cm² could be used as trigger levels where CAK readings were not available. The DAP value would depend on the mean field size and would need to be determined for each application.
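The action levels quoted in this abstract amount to a simple threshold check. The function below is a hedged sketch of that logic, with an assumed default DAP trigger of 250 Gy cm² inside the quoted 200-300 range; the interface is illustrative, not part of the study:

```python
# Sketch of the follow-up action levels discussed above: a CESD above 3 Gy
# triggers patient follow-up; when no CAK/CESD reading exists, a procedure-
# dependent DAP threshold (200-300 Gy cm^2) is used instead. The function
# signature and default trigger value are illustrative assumptions.

def needs_followup(cesd_gy=None, dap_gycm2=None, dap_trigger=250.0):
    if cesd_gy is not None:
        return cesd_gy > 3.0          # primary action level on CESD
    if dap_gycm2 is not None:
        return dap_gycm2 > dap_trigger  # fallback when CAK is unavailable
    return False                      # no dose information recorded

print(needs_followup(cesd_gy=3.4))      # above the 3 Gy action level
print(needs_followup(dap_gycm2=180.0))  # below the assumed DAP trigger
```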

  9. Architecture of a Level 1 Track Trigger for the CMS Experiment

    CERN Document Server

    Heintz, Ulrich

    2010-01-01

The luminosity goal for the Super-LHC is 10^35 cm^-2 s^-1. At this luminosity the number of proton-proton interactions in each beam crossing will be in the hundreds. This will stress many components of the CMS detector. One system that has to be upgraded is the trigger system. To keep the rate at which the level 1 trigger fires manageable, information from the tracker has to be integrated into the level 1 trigger. Current design proposals foresee tracking detectors that perform on-detector filtering to reject hits from low-momentum particles. In order to build a trigger system, the filtered hit data from different layers and sectors of the tracker will have to be transmitted off the detector and brought together in a logic processor that generates trigger tracks within the time window allowed by the level 1 trigger latency. I will describe a possible architecture for the off-detector logic that accomplishes this goal.

  10. Insight into multiple-triggering effect in DTSCRs for ESD protection

    Science.gov (United States)

    Zhang, Lizhong; Wang, Yuan; Wang, Yize; He, Yandong

    2017-07-01

The diode-triggered silicon-controlled rectifier (DTSCR) is widely used for electrostatic discharge (ESD) protection in advanced CMOS processes owing to its advantages, such as design simplicity, an adjustable trigger/holding voltage, and low parasitic capacitance. However, the multiple-triggering effect in the typical DTSCR device may cause an undesirably large overall trigger voltage, which results in a reduced ESD safe margin. In previous research, the major cause is attributed to the higher current level required in the intrinsic SCR. The related discussions indicate that it results from the current division between the intrinsic and parasitic SCR formed in the triggering process. In this letter, inserting a large space into the trigger diodes is proposed to gain a deeper insight into this issue. The triggering current is observed to be regularly reduced as the space increases, which confirms that the current division is determined by the parasitic resistance distributed between the intrinsic and parasitic SCR paths. The theoretical analysis is well confirmed by device simulation and transmission line pulse (TLP) test results. The reduced overall trigger voltage achieved in the modified DTSCR structures is the combined result of the parasitic resistance and the triggering current, which indicates a minimized multiple-triggering effect. Project supported by the Beijing Natural Science Foundation, China (No. 4162030).

  11. The role of factorial cumulants in reactor neutron noise theory

    International Nuclear Information System (INIS)

    Colombino, A.; Pacilio, N.; Sena, G.

    1979-01-01

The physical meaning and the combinatorial implications of the factorial cumulant of a state variable such as the number of neutrons or the number of neutron counts are specified. Features of the presentation are: (a) the fission process is treated in its entirety without the customary binary emission restriction, (b) the introduction of the factorial cumulants helps in reducing the complexity of the mathematical problems, (c) all the solutions can be obtained analytically. Only the ergodic hypothesis for the neutron population evolution is dealt with. (author)
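The first two factorial cumulants mentioned in this abstract are easy to estimate numerically: κ₁ = ⟨n⟩ and κ₂ = ⟨n(n-1)⟩ - ⟨n⟩². For a Poisson-distributed count κ₂ vanishes, which illustrates one way factorial cumulants simplify the algebra. The sample size and rate below are arbitrary choices:

```python
import numpy as np

# Numerical illustration of the first two factorial cumulants:
#   k1 = <n>,  k2 = <n(n-1)> - <n>^2.
# For a Poisson count with mean 4, k1 is near 4 and k2 is near 0.

rng = np.random.default_rng(1)
n = rng.poisson(4.0, 200000)

k1 = n.mean()
k2 = (n * (n - 1)).mean() - n.mean() ** 2
print(f"k1 = {k1:.3f}, k2 = {k2:.3f}")
```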

  12. Super-Resolution Algorithm in Cumulative Virtual Blanking

    Science.gov (United States)

    Montillet, J. P.; Meng, X.; Roberts, G. W.; Woolfson, M. S.

    2008-11-01

The proliferation of mobile devices and the emergence of wireless location-based services have generated consumer demand for precise location. In this paper, the MUSIC super-resolution algorithm is applied to time delay estimation for positioning purposes in cellular networks. The goal is to position a Mobile Station with UMTS technology. The Base-Station hearability problem is solved using Cumulative Virtual Blanking. A simple simulator is presented using DS-SS signals. The results show that the MUSIC algorithm improves the time delay estimation in both cases, whether or not Cumulative Virtual Blanking was carried out.

  13. Triggering for charm, beauty, and truth

    International Nuclear Information System (INIS)

    Appel, J.A.

    1982-02-01

As the search for more and more rare processes accelerates, the need for more and more effective event triggers also accelerates. In the earliest experiments, a simple coincidence often sufficed not only as the event trigger, but as the complete record of an event of interest. In today's experiments, not only has the fast trigger become more sophisticated, but one or more additional levels of trigger processing precede writing event data to magnetic tape for later analysis. Further search experiments will certainly require further expansion in the number of trigger levels required to filter those rare events of particular interest

  14. The Database Driven ATLAS Trigger Configuration System

    CERN Document Server

    Martyniuk, Alex; The ATLAS collaboration

    2015-01-01

This contribution describes the trigger selection configuration system of the ATLAS low- and high-level trigger (HLT) and the upgrades it received in preparation for LHC Run 2. The ATLAS trigger configuration system is responsible for applying the physics selection parameters for the online data taking at both trigger levels and the proper connection of the trigger lines across those levels. Here the low-level trigger consists of the already existing central trigger (CT) and the new Level-1 Topological trigger (L1Topo), which has been added for Run 2. In detail, the tasks of the configuration system during online data taking are: application of the selection criteria (e.g. energy cuts, minimum multiplicities, trigger-object correlations) at the three trigger components L1Topo, CT, and HLT; and on-the-fly (e.g. rate-dependent) generation and application of prescale factors to the CT and HLT, to adjust the trigger rates to the data-taking conditions, such as falling luminosity or rate spikes in the detector readout ...
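The on-the-fly prescaling mentioned above can be sketched minimally: a prescale factor N keeps every N-th event that reaches it, so the recorded rate is the incoming rate divided by N. The class below is an illustration of that idea only, not the ATLAS implementation:

```python
# Minimal sketch of a prescale counter: with factor N, every N-th call to
# accept() returns True. The class and method names are assumptions.

class Prescaler:
    def __init__(self, factor):
        self.factor = factor
        self.counter = 0

    def accept(self):
        """Return True for every factor-th event seen."""
        self.counter += 1
        return self.counter % self.factor == 0

ps = Prescaler(3)
decisions = [ps.accept() for _ in range(6)]
print(decisions)  # every third event is kept
```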

  15. Hadronic Triggers and trigger-object level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program, and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors to more deeply probing for new physics, such as storage and computing requirements f...

  16. Hadronic triggers and trigger object-level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program at the Large Hadron Collider (LHC), and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous event rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors, such as storage and computing requirements...

  17. A spot-matching method using cumulative frequency matrix in 2D gel images

    Science.gov (United States)

    Han, Chan-Myeong; Park, Joon-Ho; Chang, Chu-Seok; Ryoo, Myung-Chun

    2014-01-01

A new method for spot matching in two-dimensional gel electrophoresis images using a cumulative frequency matrix is proposed. The method improves on the weak points of the previous method called ‘spot matching by topological patterns of neighbour spots’. It accumulates the frequencies of neighbour spot pairs produced through the entire matching process and determines spot pairs one by one in order of decreasing frequency. Spot matching by frequencies of neighbour spot pairs shows fairly better performance. It can also give researchers a hint as to whether the matching results are trustworthy, which can save considerable verification effort. PMID:26019609
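The selection step described in this abstract can be sketched as a greedy one-to-one assignment over accumulated pair frequencies: fix the most frequent pair first, then skip any later pair that reuses an already-matched spot. The spot labels and counts below are invented for illustration:

```python
from collections import Counter

# Toy sketch of pair selection by cumulative frequency: pair frequencies are
# accumulated over the whole matching process, then spot pairs are fixed one
# by one in order of decreasing frequency, keeping the match one-to-one.
# The frequencies below are illustrative assumptions.

freq = Counter({("a1", "b1"): 9, ("a2", "b2"): 7, ("a1", "b2"): 5, ("a3", "b3"): 4})

def match_by_frequency(freq):
    used_left, used_right, matches = set(), set(), []
    for (left, right), _ in freq.most_common():
        if left not in used_left and right not in used_right:
            matches.append((left, right))
            used_left.add(left)
            used_right.add(right)
    return matches

print(match_by_frequency(freq))
```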

  18. Cumulative effects of wind turbines. Volume 3: Report on results of consultations on cumulative effects of wind turbines on birds

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This report gives details of the consultations held in developing the consensus approach taken in assessing the cumulative effects of wind turbines. Contributions on bird issues, and views of stakeholders, the Countryside Council for Wales, electric utilities, Scottish Natural Heritage, and the National Wind Power Association are reported. The scoping of key species groups, where cumulative effects might be expected, consideration of other developments, the significance of any adverse effects, mitigation, regional capacity assessments, and predictive models are discussed. Topics considered at two stakeholder workshops are outlined in the appendices.

  19. Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation

    Energy Technology Data Exchange (ETDEWEB)

    Barajas-Solano, David A.; Tartakovsky, Alexandre M.

    2018-01-01

We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d+1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.

  20. Cumulant-Based Coherent Signal Subspace Method for Bearing and Range Estimation

    Directory of Open Access Journals (Sweden)

    Bourennane Salah

    2007-01-01

    Full Text Available A new method for simultaneous range and bearing estimation for buried objects in the presence of an unknown Gaussian noise is proposed. This method uses the MUSIC algorithm with noise subspace estimated by using the slice fourth-order cumulant matrix of the received data. The higher-order statistics aim at the removal of the additive unknown Gaussian noise. The bilinear focusing operator is used to decorrelate the received signals and to estimate the coherent signal subspace. A new source steering vector is proposed including the acoustic scattering model at each sensor. Range and bearing of the objects at each sensor are expressed as a function of those at the first sensor. This leads to the improvement of object localization anywhere, in the near-field or in the far-field zone of the sensor array. Finally, the performances of the proposed method are validated on data recorded during experiments in a water tank.

  1. A configurable tracking algorithm to detect cosmic muon tracks for the CMS-RPC based technical trigger

    CERN Document Server

Rajan, R T; Loddo, F; Maggi, M; Ranieri, A; Abbrescia, M; Guida, R; Iaselli, G; Nuzzo, S; Pugliese, G; Roselli, G; Trentadue, R; Tupputi, S; Benussi, L; Bertani, M; Bianco, S; Fabbri, F; Cavallo, N; Cimmino, A; Lomidze, D; Noli, P; Paolucci, P; Piccolo, D; Polese, G; Sciacca, C; Baesso, P; Belli, G; Necchi, M; Ratti, S P; Pagano, D; Vitulo, P; Viviani, C; Dimitrov, A; Litov, L; Pavlov, B; Petkov, P; Genchev, V; Iaydjiev, P; Bunkowski, K; Kierzkowski, K; Konecki, M; Kudla, I; Pietrusinski, M; Pozniak, K

    2009-01-01

    In the CERN CMS experiment at the LHC collider, special trigger signals called Technical Triggers will be used for test and calibration purposes. The Resistive Plate Chamber (RPC) based Technical Trigger system is a part of the CMS muon trigger system and is designed to detect cosmic muon tracks. It is based on two boards, namely the RBC (RPC Balcony Collector) and the TTU (Technical Trigger Unit). The proposed tracking algorithm (TA), written in VHDL and implemented in the TTU board, detects single or multiple cosmic muon tracks at every bunch crossing, along with their track lengths and corresponding chamber coordinates. The TA implementation in VHDL and its preliminary simulation results are presented.

  2. Cumulative impacts: current research and current opinions at PSW

    Science.gov (United States)

    R. M. Rice

    1987-01-01

    Consideration of cumulative watershed effects (CWEs) has both political and physical aspects. Regardless of the practical usefulness of present methods of dealing with CWEs, the legal requirement to address them remains. Management of federal land is regulated by the National Environmental Policy Act (NEPA) and the Federal Water Pollution Control Act of 1972. The...

  3. Cumulative Risks of Foster Care Placement for Danish Children

    DEFF Research Database (Denmark)

    Fallesen, Peter; Emanuel, Natalia; Wildeman, Christopher

    2014-01-01

    children. Our results also show some variations by parental ethnicity and sex, but these differences are small. Indeed, they appear quite muted relative to racial/ethnic differences in these risks in the United States. Last, though cumulative risks are similar between Danish and American children...

  4. Disintegration of a profiled shock wave at the cumulation point

    International Nuclear Information System (INIS)

    Kaliski, S.

    1978-01-01

    The disintegration of a shock wave generated with the aid of a profiled pressure is analyzed at the cumulation point. The quantitative relations for the disintegration waves are analyzed for typical compression parameters in systems of thermonuclear microfusion. Quantitative conclusions are drawn for the application of simplifying approximate calculations in problems of microfusion. (author)

  5. Cumulative Prospect Theory, Option Returns, and the Variance Premium

    NARCIS (Netherlands)

    Baele, Lieven; Driessen, Joost; Ebert, Sebastian; Londono Yarce, J.M.; Spalt, Oliver

    The variance premium and the pricing of out-of-the-money (OTM) equity index options are major challenges to standard asset pricing models. We develop a tractable equilibrium model with Cumulative Prospect Theory (CPT) preferences that can overcome both challenges. The key insight is that the

  6. Steps and Pips in the History of the Cumulative Recorder

    Science.gov (United States)

    Lattal, Kennon A.

    2004-01-01

    From its inception in the 1930s until very recent times, the cumulative recorder was the most widely used measurement instrument in the experimental analysis of behavior. It was an essential instrument in the discovery and analysis of schedules of reinforcement, providing the first real-time analysis of operant response rates and patterns. This…

  7. The effects of cumulative practice on mathematics problem solving.

    Science.gov (United States)

    Mayfield, Kristin H; Chase, Philip N

    2002-01-01

    This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving.

  8. Anti-irritants II: Efficacy against cumulative irritation

    DEFF Research Database (Denmark)

    Andersen, Flemming; Hedegaard, Kathryn; Petersen, Thomas Kongstad

    2006-01-01

    window of opportunity in which to demonstrate efficacy. Therefore, the effect of AI was studied in a cumulative irritation model by inducing irritant dermatitis with 10 min daily exposures for 5+4 days (no irritation on weekend) to 1% sodium lauryl sulfate on the right and 20% nonanoic acid on the left...

  9. Cumulative Beam Breakup with Time-Dependent Parameters

    CERN Document Server

    Delayen, J R

    2004-01-01

    A general analytical formalism developed recently for cumulative beam breakup (BBU) in linear accelerators with arbitrary beam current profile and misalignments [1] is extended to include time-dependent parameters such as energy chirp or rf focusing in order to reduce BBU-induced instabilities and emittance growth. Analytical results are presented and applied to practical accelerator configurations.

  10. Hyperscaling breakdown and Ising spin glasses: The Binder cumulant

    Science.gov (United States)

    Lundow, P. H.; Campbell, I. A.

    2018-02-01

    Among the Renormalization Group Theory scaling rules relating critical exponents, there are hyperscaling rules involving the dimension of the system. It is well known that in Ising models hyperscaling breaks down above the upper critical dimension. It was shown by Schwartz (1991) that the standard Josephson hyperscaling rule can also break down in Ising systems with quenched random interactions. A related Renormalization Group Theory hyperscaling rule links the critical exponents for the normalized Binder cumulant and the correlation length in the thermodynamic limit. An appropriate scaling approach for analyzing measurements from criticality to infinite temperature is first outlined. Numerical data on the scaling of the normalized correlation length and the normalized Binder cumulant are shown for the canonical Ising ferromagnet model in dimension three where hyperscaling holds, for the Ising ferromagnet in dimension five (so above the upper critical dimension) where hyperscaling breaks down, and then for Ising spin glass models in dimension three where the quenched interactions are random. For the Ising spin glasses there is a breakdown of the normalized Binder cumulant hyperscaling relation in the thermodynamic limit regime, with a return to size independent Binder cumulant values in the finite-size scaling regime around the critical region.
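    The Binder cumulant referred to above is, for a magnetization-like observable m, the standard combination U = 1 - <m^4> / (3 <m^2>^2), which interpolates between 0 in the disordered (Gaussian) limit and 2/3 in the fully ordered limit. A minimal numerical sketch of these two limits (synthetic samples, not Ising data):

```python
import numpy as np

rng = np.random.default_rng(2)

def binder_cumulant(m):
    """Binder cumulant U = 1 - <m^4> / (3 <m^2>^2) of an observable m."""
    m = np.asarray(m, dtype=float)
    return 1.0 - np.mean(m**4) / (3.0 * np.mean(m**2) ** 2)

# high-temperature limit: m Gaussian around 0, so <m^4> -> 3<m^2>^2 and U -> 0
m_hot = rng.standard_normal(200_000)

# ordered limit: m = +/- m0 with equal weight, so <m^4> = <m^2>^2 and U -> 2/3
m_cold = rng.choice([-1.0, 1.0], size=200_000)
```

    Finite-size scaling studies of the kind described in the abstract track how U crosses between these limits as system size and temperature vary.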

  11. How to manage the cumulative flood safety of catchment dams ...

    African Journals Online (AJOL)

    Dam safety is a significant issue being taken seriously worldwide. However, in Australia, although much attention is being devoted to the medium- to large-scale dams, minimal attention is being paid to the serious potential problems associated with smaller dams, particularly the potential cumulative safety threats they pose ...

  12. Cumulative Beam Breakup due to Resistive-Wall Wake

    International Nuclear Information System (INIS)

    Wang, J.-M.

    2004-01-01

    The cumulative beam breakup problem excited by the resistive-wall wake is formulated. An approximate analytic method for finding the asymptotic behavior of the transverse bunch displacement is developed. Comparison between the asymptotic analytical expression and the direct numerical solution is presented, and good agreement is found. The criterion for using the asymptotic analytical expression is discussed

  13. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Full Text Available Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of ``stripped'' prospect theory (subjectively weighted additive utility without the editing rules).

  14. Implications of applying cumulative risk assessment to the workplace.

    Science.gov (United States)

    Fox, Mary A; Spicer, Kristen; Chosewood, L Casey; Susi, Pam; Johns, Douglas O; Dotson, G Scott

    2018-06-01

    Multiple changes are influencing work, workplaces and workers in the US including shifts in the main types of work and the rise of the 'gig' economy. Work and workplace changes have coincided with a decline in unions and associated advocacy for improved safety and health conditions. Risk assessment has been the primary method to inform occupational and environmental health policy and management for many types of hazards. Although often focused on one hazard at a time, risk assessment frameworks and methods have advanced toward cumulative risk assessment recognizing that exposure to a single chemical or non-chemical stressor rarely occurs in isolation. We explore how applying cumulative risk approaches may change the roles of workers and employers as they pursue improved health and safety and elucidate some of the challenges and opportunities that might arise. Application of cumulative risk assessment should result in better understanding of complex exposures and health risks with the potential to inform more effective controls and improved safety and health risk management overall. Roles and responsibilities of both employers and workers are anticipated to change with potential for a greater burden of responsibility on workers to address risk factors both inside and outside the workplace that affect health at work. A range of policies, guidance and training have helped develop cumulative risk assessment for the environmental health field and similar approaches are available to foster the practice in occupational safety and health. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Hierarchical Bayesian parameter estimation for cumulative prospect theory

    NARCIS (Netherlands)

    Nilsson, H.; Rieskamp, J.; Wagenmakers, E.-J.

    2011-01-01

    Cumulative prospect theory (CPT Tversky & Kahneman, 1992) has provided one of the most influential accounts of how people make decisions under risk. CPT is a formal model with parameters that quantify psychological processes such as loss aversion, subjective values of gains and losses, and

  16. An Axiomatization of Cumulative Prospect Theory for Decision under Risk

    NARCIS (Netherlands)

    Wakker, P.P.; Chateauneuf, A.

    1999-01-01

    Cumulative prospect theory was introduced by Tversky and Kahneman so as to combine the empirical realism of their original prospect theory with the theoretical advantages of Quiggin's rank-dependent utility. Preference axiomatizations were provided in several papers. All those axiomatizations,

  17. Cumulative assessment: does it improve students’ knowledge acquisition and retention?

    NARCIS (Netherlands)

    Cecilio Fernandes, Dario; Nagtegaal, Manouk; Noordzij, Gera; Tio, Rene

    2017-01-01

    Introduction Assessment for learning means changing students’ behaviour regarding their learning. Cumulative assessment has been shown to increase students’ self-study time and spread their study time throughout a course. However, there was no difference regarding students’ knowledge at the end of

  18. Wired and Wireless Camera Triggering with Arduino

    Science.gov (United States)

    Kauhanen, H.; Rönnholm, P.

    2017-10-01

    Synchronous triggering is an important task that allows simultaneous data capture from multiple cameras. Accurate synchronization enables 3D measurements of moving objects or from a moving platform. In this paper, we describe one wired and four wireless variations of Arduino-based low-cost remote trigger systems designed to provide a synchronous trigger signal for industrial cameras. Our wireless systems utilize 315 MHz or 434 MHz frequencies with noise filtering capacitors. In order to validate the synchronization accuracy, we developed a prototype of a rotating trigger detection system (named RoTriDeS). This system is suitable to detect the triggering accuracy of global shutter cameras. As a result, the wired system indicated an 8.91 μs mean triggering time difference between two cameras. Corresponding mean values for the four wireless triggering systems varied between 7.92 and 9.42 μs. Presented values include both camera-based and trigger-based desynchronization. Arduino-based triggering systems appeared to be feasible, and they have the potential to be extended to more complicated triggering systems.

  19. The challenges and opportunities in cumulative effects assessment

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Melissa M., E-mail: mfoley@usgs.gov [U.S. Geological Survey, Pacific Coastal and Marine Science Center, 400 Natural Bridges, Dr., Santa Cruz, CA 95060 (United States); Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940 (United States); Mease, Lindley A., E-mail: lamease@stanford.edu [Center for Ocean Solutions, Stanford University, 473 Via Ortega, Stanford, CA 94305 (United States); Martone, Rebecca G., E-mail: rmartone@stanford.edu [Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940 (United States); Prahler, Erin E. [Center for Ocean Solutions, Stanford University, 473 Via Ortega, Stanford, CA 94305 (United States); Morrison, Tiffany H., E-mail: tiffany.morrison@jcu.edu.au [ARC Centre of Excellence for Coral Reef Studies, James Cook University, Townsville, QLD, 4811 (Australia); Murray, Cathryn Clarke, E-mail: cmurray@pices.int [WWF-Canada, 409 Granville Street, Suite 1588, Vancouver, BC V6C 1T2 (Canada); Wojcik, Deborah, E-mail: deb.wojcik@duke.edu [Nicholas School for the Environment, Duke University, 9 Circuit Dr., Durham, NC 27708 (United States)

    2017-01-15

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  20. The challenges and opportunities in cumulative effects assessment

    International Nuclear Information System (INIS)

    Foley, Melissa M.; Mease, Lindley A.; Martone, Rebecca G.; Prahler, Erin E.; Morrison, Tiffany H.; Murray, Cathryn Clarke; Wojcik, Deborah

    2017-01-01

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  1. The challenges and opportunities in cumulative effects assessment

    Science.gov (United States)

    Foley, Melissa M.; Mease, Lindley A; Martone, Rebecca G; Prahler, Erin E; Morrison, Tiffany H; Clarke Murray, Cathryn; Wojcik, Deborah

    2016-01-01

    The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.

  2. Event-Triggered Fault Detection of Nonlinear Networked Systems.

    Science.gov (United States)

    Li, Hongyi; Chen, Ziran; Wu, Ligang; Lam, Hak-Keung; Du, Haiping

    2017-04-01

    This paper investigates the problem of fault detection for nonlinear discrete-time networked systems under an event-triggered scheme. A polynomial fuzzy fault detection filter is designed to generate a residual signal and detect faults in the system. A novel polynomial event-triggered scheme is proposed to determine the transmission of the signal. The fault detection filter is designed to guarantee that the residual system is asymptotically stable and satisfies the desired performance. Polynomial approximated membership functions obtained by Taylor series are employed for the filtering analysis. Furthermore, sufficient conditions are represented in terms of sums of squares (SOS) and can be solved by SOS tools in the MATLAB environment. A numerical example is provided to demonstrate the effectiveness of the proposed results.
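    An event-triggered scheme decides, sample by sample, whether the current measurement differs enough from the last transmitted one to be worth sending. The sketch below uses a generic relative-threshold rule common in the event-triggered literature, not the paper's polynomial scheme; the system matrix, threshold, and horizon are illustrative assumptions.

```python
import numpy as np

def event_triggered_transmissions(xs, sigma=0.2):
    """Count transmissions under a relative event-triggering rule:
    send the current sample only when it deviates from the last
    transmitted one by more than sigma times its own norm."""
    x_hat = xs[0]
    sent = 1                       # the initial sample is always transmitted
    for x in xs[1:]:
        if np.linalg.norm(x - x_hat) > sigma * np.linalg.norm(x):
            x_hat = x              # transmit and latch the new value
            sent += 1
    return sent

# a slowly rotating, lightly damped trajectory (illustrative system)
A = np.array([[0.98, 0.05], [-0.05, 0.98]])
xs = [np.array([1.0, 0.0])]
for _ in range(199):
    xs.append(A @ xs[-1])

sent = event_triggered_transmissions(xs, sigma=0.2)
```

    With sigma = 0 every sample is sent (time-triggered behavior); a positive threshold trades a bounded state-reconstruction error for far fewer network transmissions, which is the resource the event-triggered filter design exploits.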

  3. How variable is the number of triggered aftershocks?

    Science.gov (United States)

    Marsan, D.; Helmstetter, A.

    2017-07-01

    Aftershock activity depends at first order on the main shock magnitude but also shows important fluctuations between shocks of equal magnitude. We here investigate these fluctuations, by quantifying them and by relating them to the main shock stress drop and other variables, for southern California earthquakes. A method is proposed to count only directly triggered aftershocks, rather than secondary aftershocks (i.e., those triggered by previous aftershocks), and to quantify only fluctuations going beyond the natural Poisson variability. Testing the method against various model errors allows its robustness to be quantified. It is found that these fluctuations follow a distribution that is well fitted by a lognormal distribution, with a coefficient of variation of about 1.0 to 1.1. A simple model is proposed to relate this observed variability to main shock stress drop variability.
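    The separation between Poisson counting noise and genuine productivity fluctuations can be illustrated with a doubly stochastic toy model: each main shock draws a lognormal productivity (here with a coefficient of variation of 1.0, matching the reported range) and the observed count is Poisson around it. The mean count and sample size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

n_mainshocks = 100_000
mu_counts = 20.0    # hypothetical mean number of directly triggered aftershocks
cv = 1.0            # coefficient of variation of the lognormal productivity

# lognormal parameters giving mean mu_counts and coefficient of variation cv
sigma2 = np.log(1.0 + cv**2)
mu = np.log(mu_counts) - 0.5 * sigma2
rate = rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=n_mainshocks)

# observed counts: Poisson scatter on top of the productivity fluctuations
counts = rng.poisson(rate)
```

    The total count variance is the Poisson part (equal to the mean) plus the productivity part (cv^2 times the squared mean), so strong overdispersion of observed counts is the signature of real productivity variability.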

  4. Shallow geological structures triggered during the Mw 6.4 Meinong earthquake, southwestern Taiwan

    Directory of Open Access Journals (Sweden)

    Maryline Le Béon

    2017-01-01

    Full Text Available The Meinong earthquake generated up to ~10 cm of surface displacement located 10 - 35 km west of the epicenter, monitored by InSAR and GPS. In addition to coseismic deformation related to the deep earthquake source, InSAR revealed three sharp surface displacement gradients. One of them is extensional and is inconsistent with the westward interseismic shortening of ~45 mm/yr in this region. The gradient sharpness suggests slip triggering on shallow structures, some of which were not well documented before. To characterize these shallow structures, we investigated potential surface ruptures in the field. Sets of ~NS tension cracks distributed over 25 - 300 m width, with cumulative extension of the same order as the InSAR observations, were found over a 5.5 km distance along the extensional gradient and are interpreted as surface rupture. We build two E-W regional balanced cross-sections, based on surface geology, subsurface data, and coseismic and interseismic geodetic data. From the Coastal Plain to the east edge of the coseismic deformation area, we propose a series of three active west-dipping backthrusts: the Houchiali fault, the Napalin-Pitou backthrust, and the Lungchuan backthrust. They all root on the 3.5 - 4.0 km deep Tainan detachment located near the base of the 3-km-thick Gutingkeng mudstone. Further east, the detachment would ramp down to ~7 km depth. Coseismic surface deformation measurements suggest that, in addition to the deeper (15 - 20 km) main rupture plane, mostly the ramp, the Lungchuan backthrust, and the Tainan detachment were activated during or right after the earthquake. Local extension is considered as transient deformation at the west edge of the shallow main slip zone.

  5. Spatiotemporal patterns, triggers and anatomies of seismically detected rockfalls

    Directory of Open Access Journals (Sweden)

    M. Dietze

    2017-11-01

    Full Text Available Rockfalls are a ubiquitous geomorphic process and a natural hazard in steep landscapes across the globe. Seismic monitoring can provide precise information on the timing, location and event anatomy of rockfalls, which are parameters that are otherwise hard to constrain. By pairing data from 49 seismically detected rockfalls in the Lauterbrunnen Valley in the Swiss Alps with auxiliary meteorologic and seismic data of potential triggers during autumn 2014 and spring 2015, we are able to (i) analyse the evolution of single rockfalls and their common properties, (ii) identify spatial changes in activity hotspots and (iii) explore temporal activity patterns on different scales ranging from months to minutes to quantify relevant trigger mechanisms. Seismic data allow for the classification of rockfall activity into two distinct phenomenological types. The signals can be used to discern multiple rock mass releases from the same spot, identify rockfalls that trigger further rockfalls and resolve modes of subsequent talus slope activity. In contrast to findings based on discontinuous methods with integration times of several months, rockfall in the monitored limestone cliff is not spatially uniform but shows a systematic downward shift of a rock mass release zone following an exponential law, most likely driven by a continuously lowering water table. Freeze–thaw transitions, approximated at first order from air temperature time series, account for only 5 out of the 49 rockfalls, whereas 19 rockfalls were triggered by rainfall events with a peak lag time of 1 h. Another 17 rockfalls were triggered by diurnal temperature changes and occurred during the coldest hours of the day and during the highest temperature change rates. This study is thus the first to show direct links between proposed rockfall triggers and the spatiotemporal distribution of rockfalls under natural conditions; it extends existing models by providing seismic observations of the

  6. Measurement of four-particle cumulants and symmetric cumulants with subevent methods in small collision systems with the ATLAS detector

    CERN Document Server

    Derendarz, Dominik; The ATLAS collaboration

    2018-01-01

    Measurements of symmetric cumulants SC(n,m) = ⟨v_n^2 v_m^2⟩ − ⟨v_n^2⟩⟨v_m^2⟩ for (n,m) = (2,3) and (2,4) and of asymmetric cumulants AC(n) are presented in pp, p+Pb and peripheral Pb+Pb collisions at various collision energies, aiming to probe the long-range collective nature of multi-particle production in small systems. Results are obtained using the standard cumulant method, as well as the two-subevent and three-subevent cumulant methods. Results from the standard method are found to be strongly biased by non-flow correlations, as indicated by a strong sensitivity to the chosen event class definition. A systematic reduction of non-flow effects is observed when using the two-subevent method, and the results become independent of the event class definition when the three-subevent method is used. The measured SC(n,m) shows an anti-correlation between v_2 and v_3, and a positive correlation between v_2 and v_4. The magnitude of SC(n,m) is constant with N_ch in pp collisions, but increases with N_ch in p+Pb and Pb+Pb collisions. ...
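    The observable itself is simple to state: SC(n,m) correlates the event-by-event fluctuations of two flow harmonics. The sketch below illustrates the definition from per-event flow coefficients in a toy event model; it is not the multi-particle (subevent) cumulant estimator used in the measurement, and the event model with its built-in v_2-v_3 anti-correlation is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

n_events = 200_000
# toy per-event flow coefficients with a built-in v2-v3 anti-correlation
base = rng.uniform(0.0, 0.1, n_events)
v2 = base + 0.02 * rng.random(n_events)
v3 = 0.12 - base + 0.02 * rng.random(n_events)

def sc(vn, vm):
    """Symmetric cumulant SC(n,m) = <v_n^2 v_m^2> - <v_n^2><v_m^2>."""
    return np.mean(vn**2 * vm**2) - np.mean(vn**2) * np.mean(vm**2)
```

    A negative SC(2,3) then signals that events with larger v_2 tend to have smaller v_3, which is the pattern the abstract reports; the experimental challenge addressed by the subevent methods is estimating this from particle pairs and quadruplets without non-flow contamination.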

  7. An analytical model for cumulative infiltration into a dual-permeability media

    Science.gov (United States)

    Peyrard, Xavier; Lassabatere, Laurent; Angulo-Jaramillo, Rafael; Simunek, Jiri

    2010-05-01

    Modeling of water infiltration into the vadose zone is important for better understanding of the movement of water-transported contaminants. There is a great need to take into account soil heterogeneity and, in particular, the presence of macropores or cracks that can generate preferential flow. Several mathematical models have been proposed to describe unsaturated flow through heterogeneous soils. The dual-permeability model assumes that flow is governed by the Richards equation in both porous regions (matrix and fractures). Water can be exchanged between the two regions following a first-order rate law. A previous study showed that the hydraulic conductivity of the matrix/macropore interface had little influence on cumulative infiltration at the soil surface. As a result, one can consider surface infiltration for the specific case of no water exchange between the fracture and matrix regions (a case of zero interfacial hydraulic conductivity). In such a case, water infiltration can be considered to be the sum of the cumulative infiltrations into the matrix and the fractures. On the basis of analytical models for each subdomain (matrix and fractures), an analytical model is proposed for the entire dual-porosity system. A sensitivity analysis is performed to characterize the influence of several factors, such as the saturated hydraulic conductivity ratio, the water pressure scale parameter ratio, and the saturated volumetric water content scale ratio, on the total cumulative infiltration. Such an analysis greatly helps in quantifying the impact of macroporosity and fractures on water infiltration, which can be of great interest for hydrological models.
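    Under the zero-interface-conductivity assumption described above, total infiltration is a volume-weighted sum of the two domains. The sketch below uses the two-term Philip approximation for each domain; this is a common analytical choice but not necessarily the model of the paper, and all sorptivity, conductivity, and fracture-fraction values are illustrative assumptions.

```python
import numpy as np

def philip_infiltration(t, sorptivity, k_sat):
    """Two-term Philip approximation of cumulative infiltration per unit area:
    I(t) = S * sqrt(t) + K * t."""
    return sorptivity * np.sqrt(t) + k_sat * t

def dual_permeability_infiltration(t, w_fracture=0.1,
                                   s_m=0.5, k_m=0.01,   # matrix (hypothetical)
                                   s_f=2.0, k_f=1.0):   # fractures (hypothetical)
    """With zero interfacial conductivity (no matrix-fracture exchange),
    total infiltration is the volume-weighted sum over the two domains."""
    w_matrix = 1.0 - w_fracture
    return (w_matrix * philip_infiltration(t, s_m, k_m)
            + w_fracture * philip_infiltration(t, s_f, k_f))
```

    Even a small fracture volume fraction dominates late-time infiltration here, because the late-time slope is the weighted saturated conductivity, which is the kind of sensitivity the abstract's analysis quantifies.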

  8. Triggers of oral lichen planus flares and the potential role of trigger avoidance in disease management.

    Science.gov (United States)

    Chen, Hannah X; Blasiak, Rachel; Kim, Edwin; Padilla, Ricardo; Culton, Donna A

    2017-09-01

    Many patients with oral lichen planus (OLP) report triggers of flares, some of which overlap with triggers of other oral diseases, including oral allergy syndrome and oral contact dermatitis. The purpose of this study was to evaluate the prevalence of commonly reported triggers of OLP flares, their overlap with triggers of other oral diseases, and the potential role of trigger avoidance as a management strategy. Questionnaire-based survey of 51 patients with biopsy-proven lichen planus with oral involvement seen in an academic dermatology specialty clinic and/or oral pathology clinic between June 2014 and June 2015. Of the participants, 94% identified at least one trigger of their OLP flares. Approximately half of the participants (51%) reported at least one trigger that overlapped with known triggers of oral allergy syndrome, and 63% identified at least one trigger that overlapped with known triggers of oral contact dermatitis. Emotional stress was the most commonly reported trigger (77%). Regarding avoidance, 79% of the study participants reported avoiding their known triggers in daily life. Of those who actively avoided triggers, 89% reported an improvement in symptoms and 70% reported a decrease in the frequency of flares. Trigger identification and avoidance can play a potentially effective role in the management of OLP. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    1997-01-01

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled, a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which should only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition...

  11. Smart trigger logic for focal plane arrays

    Science.gov (United States)

    Levy, James E; Campbell, David V; Holmes, Michael L; Lovejoy, Robert; Wojciechowski, Kenneth; Kay, Randolph R; Cavanaugh, William S; Gurrieri, Thomas M

    2014-03-25

    An electronic device includes a memory configured to receive data representing light intensity values from pixels in a focal plane array and a processor that analyzes the received data to determine which light values correspond to triggered pixels, where the triggered pixels are those pixels that meet a predefined set of criteria, and determines, for each triggered pixel, a set of neighbor pixels for which light intensity values are to be stored. The electronic device also includes a buffer that temporarily stores light intensity values for at least one previously processed row of pixels, so that when a triggered pixel is identified in a current row, light intensity values for the neighbor pixels in the previously processed row and for the triggered pixel are persistently stored, as well as a data transmitter that transmits the persistently stored light intensity values for the triggered and neighbor pixels to a data receiver.
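
A software sketch of the trigger logic described above (a simplified reading of the patent abstract: a plain intensity threshold stands in for the unspecified "predefined set of criteria", and only three consecutive rows are held at a time, mimicking the row buffer):

```python
def smart_trigger(rows, threshold):
    """Persist every pixel at or above `threshold` plus its 3x3
    neighbourhood, holding only the previous, current, and next row."""
    stored = {}
    n = len(rows)
    for r in range(n):
        # Three-row window, mimicking the device's row buffering.
        window = {r - 1: rows[r - 1] if r > 0 else None,
                  r: rows[r],
                  r + 1: rows[r + 1] if r + 1 < n else None}
        for c, value in enumerate(rows[r]):
            if value >= threshold:                 # assumed trigger criterion
                for rr, buf in window.items():
                    if buf is None:
                        continue
                    for cc in (c - 1, c, c + 1):
                        if 0 <= cc < len(buf):
                            stored[(rr, cc)] = buf[cc]
    return stored

frame = [[1, 2, 1, 0],
         [0, 9, 1, 1],   # the 9 triggers; its 3x3 neighbourhood is kept
         [2, 1, 0, 3]]
kept = smart_trigger(frame, threshold=5)
print(sorted(kept))  # the nine pixels centred on (1, 1)
```

Only the stored dictionary would be handed to the data transmitter, so untriggered regions of the focal plane never leave the device.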

  12. ELM mitigation with pellet ELM triggering and implications for PFCs and plasma performance in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Baylor, Larry R. [ORNL; Lang, P. [EURATOM / UKAEA, Abingdon, UK; Allen, S. L. [Lawrence Livermore National Laboratory (LLNL); Lasnier, C. J. [Lawrence Livermore National Laboratory (LLNL); Meitner, Steven J. [ORNL; Combs, Stephen Kirk [ORNL; Commaux, Nicolas JC [ORNL; Loarte, A. [ITER Organization, Cadarache, France; Jernigan, Thomas C. [ORNL

    2015-08-01

    The triggering of rapid small edge localized modes (ELMs) by high frequency pellet injection has been proposed as a method to prevent large naturally occurring ELMs that can erode the ITER plasma facing components (PFCs). Deuterium pellet injection has been used to successfully demonstrate the on-demand triggering of edge localized modes (ELMs) at much higher rates and with much smaller intensity than natural ELMs. The proposed hypothesis for the triggering mechanism of ELMs by pellets is the local pressure perturbation resulting from reheating of the pellet cloud that can exceed the local high-n ballooning mode threshold where the pellet is injected. Nonlinear MHD simulations of the pellet ELM triggering show destabilization of high-n ballooning modes by such a local pressure perturbation. A review of the recent pellet ELM triggering results from ASDEX Upgrade (AUG), DIII-D, and JET reveals that a number of uncertainties about this ELM mitigation technique still remain. These include the heat flux impact pattern on the divertor and wall from pellet triggered and natural ELMs, the necessary pellet size and injection location to reliably trigger ELMs, and the level of fueling to be expected from ELM triggering pellets and synergy with larger fueling pellets. The implications of these issues for pellet ELM mitigation in ITER and its impact on the PFCs are presented along with the design features of the pellet injection system for ITER.

  13. Upgrade trigger & reconstruction strategy: 2017 milestone

    CERN Document Server

    Albrecht, Johannes; Campora Perez, Daniel Hugo; Cattaneo, Marco; Clemencic, Marco; Couturier, Ben; Dziurda, Agnieszka; Fitzpatrick, Conor; Fontana, Marianna; Grillo, Lucia; Hasse, Christoph; Hill, Donal; Jones, Christopher Rob; Lemaitre, Florian; Lupton, Olli; Matev, Rosen; Pearce, Alex; Polci, Francesco; Promberger, Laura; Ponce, Sebastien; Quagliani, Renato; Raven, Gerhard; Sciascia, Barbara; Schiller, Manuel Tobias; Stahl, Sascha; Szymanski, Maciej Pawel; Chefdeville, Maximilien

    2018-01-01

    The LHCb collaboration is currently preparing an update of the experiment to take data in Run 3 of the LHC. The dominant feature of this upgrade is a trigger-less readout of the full detector followed by a full software trigger. To make optimal use of the collected data, the events are reconstructed at the inelastic collision rate of 30 MHz. This document presents the baseline trigger and reconstruction strategy as of the end of 2017.

  14. A muon trigger for the MACRO apparatus

    International Nuclear Information System (INIS)

    Barbarito, E.; Bellotti, R.; Calicchio, M.; Castellano, M.; DeCataldo, G.; DeMarzo, C.; Erriquez, O.; Favuzzi, C.; Giglietto, N.; Liuzzi, R.; Spinelli, P.

    1991-01-01

    A trigger circuit based on EPROM components, able to manage up to 30 lines from independent counters, is described. The circuit has been designed and used in the MACRO apparatus at the Gran Sasso Laboratory for triggering on fast particles. The circuit works with standard TTL positive logic and is assembled in a double standard CAMAC module. It has a high triggering capacity and a high flexibility. (orig.)
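
The EPROM in such a circuit acts as a lookup table addressed by the bit pattern of the input lines. A hedged software analogue, assuming a simple k-fold coincidence criterion (the actual MACRO trigger patterns are not given in the abstract, and a full 2^30-entry table for all 30 lines would in practice be split across grouped devices):

```python
# The EPROM is modelled as a lookup table addressed by the bit pattern of
# the input lines; here the stored decision is a k-fold coincidence.
def build_trigger_lut(n_lines, k):
    return [bin(addr).count("1") >= k for addr in range(1 << n_lines)]

lut = build_trigger_lut(8, 2)   # 8 counter lines, 2-fold coincidence

def trigger(lines):
    """lines: iterable of TTL levels (0/1), one per counter."""
    addr = 0
    for i, level in enumerate(lines):
        addr |= (level & 1) << i
    return lut[addr]

print(trigger([0, 1, 0, 0, 0, 1, 0, 0]))  # two counters fired -> True
```

Precomputing the decision for every input pattern is what gives this scheme its flexibility: any Boolean function of the lines can be burned into the table without rewiring.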

  15. The ATLAS Level-1 Calorimeter Trigger

    International Nuclear Information System (INIS)

    Achenbach, R; Andrei, V; Adragna, P; Apostologlou, P; Barnett, B M; Brawn, I P; Davis, A O; Edwards, J P; Asman, B; Bohm, C; Ay, C; Bauss, B; Bendel, M; Dahlhoff, A; Eckweiler, S; Booth, J R A; Thomas, P Bright; Charlton, D G; Collins, N J; Curtis, C J

    2008-01-01

    The ATLAS Level-1 Calorimeter Trigger uses reduced-granularity information from all the ATLAS calorimeters to search for high transverse-energy electrons, photons, τ leptons and jets, as well as high missing and total transverse energy. The calorimeter trigger electronics has a fixed latency of about 1 μs, using programmable custom-built digital electronics. This paper describes the Calorimeter Trigger hardware, as installed in the ATLAS electronics cavern

  16. The ATLAS Level-1 Calorimeter Trigger

    Energy Technology Data Exchange (ETDEWEB)

    Achenbach, R; Andrei, V [Kirchhoff-Institut fuer Physik, University of Heidelberg, D-69120 Heidelberg (Germany); Adragna, P [Physics Department, Queen Mary, University of London, London E1 4NS (United Kingdom); Apostologlou, P; Barnett, B M; Brawn, I P; Davis, A O; Edwards, J P [STFC Rutherford Appleton Laboratory, Harwell Science and Innovation Campus, Didcot, Oxon OX11 0QX (United Kingdom); Asman, B; Bohm, C [Fysikum, Stockholm University, SE-106 91 Stockholm (Sweden); Ay, C; Bauss, B; Bendel, M; Dahlhoff, A; Eckweiler, S [Institut fuer Physik, University of Mainz, D-55099 Mainz (Germany); Booth, J R A; Thomas, P Bright; Charlton, D G; Collins, N J; Curtis, C J [School of Physics and Astronomy, University of Birmingham, Birmingham B15 2TT (United Kingdom)], E-mail: e.eisenhandler@qmul.ac.uk (and others)

    2008-03-15

    The ATLAS Level-1 Calorimeter Trigger uses reduced-granularity information from all the ATLAS calorimeters to search for high transverse-energy electrons, photons, τ leptons and jets, as well as high missing and total transverse energy. The calorimeter trigger electronics has a fixed latency of about 1 μs, using programmable custom-built digital electronics. This paper describes the Calorimeter Trigger hardware, as installed in the ATLAS electronics cavern.

  17. The ATLAS Trigger System Commissioning and Performance

    CERN Document Server

    Hamilton, A

    2010-01-01

    The ATLAS trigger has been used very successfully to collect collision data during 2009 and 2010 LHC running at centre-of-mass energies of 900 GeV, 2.36 TeV, and 7 TeV. This paper presents the ongoing work to commission the ATLAS trigger with proton collisions, including an overview of the performance of the trigger based on extensive online running. We describe how the trigger has evolved with increasing LHC luminosity and give a brief overview of plans for forthcoming LHC running.

  18. A Novel in situ Trigger Combination Method

    International Nuclear Information System (INIS)

    Buzatu, Adrian; Warburton, Andreas; Krumnack, Nils; Yao, Wei-Ming

    2012-01-01

    Searches for rare physics processes using particle detectors in high-luminosity colliding hadronic beam environments require the use of multi-level trigger systems to reject colossal background rates in real time. In analyses like the search for the Higgs boson, there is a need to maximize the signal acceptance by combining multiple different trigger chains when forming the offline data sample. In such statistically limited searches, datasets are often amassed over periods of several years, during which the trigger characteristics evolve and their performance can vary significantly. Reliable production cross-section measurements and upper limits must take into account a detailed understanding of the effective trigger inefficiency for every selected candidate event. We present as an example the complex situation of three trigger chains, based on missing energy and jet energy, to be combined in the context of the search for the Higgs (H) boson produced in association with a W boson at the Collider Detector at Fermilab (CDF). We briefly review the existing techniques for combining triggers, namely the inclusion, division, and exclusion methods. We introduce and describe a novel fourth in situ method whereby, for each candidate event, only the trigger chain with the highest a priori probability of selecting the event is considered. The in situ combination method has advantages of scalability to large numbers of differing trigger chains and of insensitivity to correlations between triggers. We compare the inclusion and in situ methods for signal event yields in the CDF WH search.
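
The in situ method described above can be sketched as follows. The event structure and chain names (MET, JET) are illustrative assumptions, not CDF data formats:

```python
def in_situ_combination(events):
    """For each candidate event, consider only the trigger chain with the
    highest a-priori probability of selecting it; the event enters the
    sample iff that chain actually fired, weighted by its efficiency."""
    sample = []
    for ev in events:
        best = max(ev["eff"], key=ev["eff"].get)  # chain most likely to select
        if best in ev["fired"]:
            sample.append((ev["id"], ev["eff"][best]))
    return sample

# Hypothetical events: per-chain a-priori efficiencies for this event's
# kinematics, and the set of chains that actually accepted it.
events = [
    {"id": 1, "eff": {"MET": 0.9, "JET": 0.6}, "fired": {"MET", "JET"}},
    {"id": 2, "eff": {"MET": 0.4, "JET": 0.8}, "fired": {"MET"}},
]
print(in_situ_combination(events))  # event 2 dropped: its best chain did not fire
```

Because exactly one chain is considered per event, no correlation between chains enters the efficiency bookkeeping, which is the scalability advantage the abstract claims.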

  19. The ATLAS Muon and Tau Trigger

    CERN Document Server

    Dell'Asta, L; The ATLAS collaboration

    2013-01-01

    [Muon] The ATLAS experiment at CERN's Large Hadron Collider (LHC) deploys a three-level processing scheme for the trigger system. The level-1 muon trigger system gets its input from fast muon trigger detectors. Fast sector logic boards select muon candidates, which are passed via an interface board to the central trigger processor and then to the High Level Trigger (HLT). The muon HLT is purely software based and encompasses a level-2 (L2) trigger followed by an event filter (EF) for a staged trigger approach. It has access to the data of the precision muon detectors and other detector elements to refine the muon hypothesis. Trigger-specific algorithms were developed and are used for the L2 to increase processing speed, for instance by making use of look-up tables and simpler algorithms, while the EF muon triggers mostly benefit from offline reconstruction software to obtain the most precise determination of the track parameters. There are two algorithms with different approaches, namely inside-out and outside-in...

  20. Data analysis at Level-1 Trigger level

    CERN Document Server

    Wittmann, Johannes; Aradi, Gregor; Bergauer, Herbert; Jeitler, Manfred; Wulz, Claudia; Apanasevich, Leonard; Winer, Brian; Puigh, Darren Michael

    2017-01-01

    With ever increasing luminosity at the LHC, optimum online data selection is getting more and more important. While in the case of some experiments (LHCb and ALICE) this task is being completely transferred to computer farms, the others - ATLAS and CMS - will not be able to do this in the medium-term future for technological, detector-related reasons. Therefore, these experiments pursue the complementary approach of migrating more and more of the offline and High-Level Trigger intelligence into the trigger electronics. This paper illustrates how the Level-1 Trigger of the CMS experiment and in particular its concluding stage, the Global Trigger, take up this challenge.

  1. The Run-2 ATLAS Trigger System

    International Nuclear Information System (INIS)

    Martínez, A Ruiz

    2016-01-01

    The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009 and 2013 at different centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 and a software-based high level trigger (HLT) that reduces the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV and higher luminosity, resulting in up to five times higher rates of processes of interest. A brief review of the ATLAS trigger system upgrades that were implemented between Run-1 and Run-2, allowing it to cope with the increased trigger rates while maintaining or even improving the efficiency to select physics processes of interest, will be given. This includes changes to the Level-1 calorimeter and muon trigger systems, the introduction of a new Level-1 topological trigger module and the merging of the previously two-level HLT system into a single event processing farm. A few examples will be shown, such as the impressive performance improvements in the HLT trigger algorithms used to identify leptons, hadrons and global event quantities like missing transverse energy. Finally, the status of the commissioning of the trigger system and its performance during the 2015 run will be presented. (paper)

  2. Geometrical Acceptance Analysis for RPC PAC Trigger

    CERN Document Server

    Seo, Eunsung

    2010-01-01

    The CMS (Compact Muon Solenoid) is one of the four experiments that will analyze the collision results of the protons accelerated by the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire). In the case of the CMS experiment, the trigger system is divided into two stages: the Level-1 Trigger and the High Level Trigger. The RPC (Resistive Plate Chamber) PAC (PAttern Comparator) Trigger system, which is the subject of this thesis, is a part of the Level-1 Muon Trigger System. The main task of the PAC Trigger is to identify muons, measure their transverse momenta and select the best muon candidates for each proton bunch collision occurring every 25 ns. To calculate the PAC Trigger efficiency for triggerable muons, two different efficiencies are needed: acceptance efficiency and chamber efficiency. The main goal of the work described in this thesis is obtaining the acceptance efficiency of the PAC Trigger in each logical cone. Acceptance efficiency is a convolution of the chambers geometry an...

  3. A Framework to Assess the Cumulative Hydrological Impacts of Dams on flow Regime

    Science.gov (United States)

    Wang, Y.; Wang, D.

    2016-12-01

    In this study we propose a framework to assess the cumulative impact of dams on the hydrological regime, and the impacts of the Three Gorges Dam on the flow regime of the Yangtze River were investigated with the framework. We reconstructed the unregulated flow series to compare with the regulated flow series over the same period. Eco-surplus, eco-deficit, and the Indicators of Hydrologic Alteration (IHA) parameters were used to examine the hydrological regime change. Among the IHA parameters, the Wilcoxon signed-rank test and Principal Component Analysis identified the representative indicators of hydrological alterations. Eco-surplus and eco-deficit showed that the reservoir also changed the seasonal regime of the flows in autumn and winter. Changes in annual extreme flows and October flows led to negative ecological implications downstream from the Three Gorges Dam. Ecological operation of the Three Gorges Dam is necessary to mitigate the negative effects on the river ecosystem in the middle reach of the Yangtze River. The framework proposed here could be a robust method to assess the cumulative impacts of reservoir operation.
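
Eco-surplus and eco-deficit can be sketched as cumulative exceedances of the unregulated-flow percentile band. This is a simplified daily-sum illustration with made-up numbers, not the authors' implementation:

```python
def eco_surplus_deficit(regulated, q25, q75):
    """Eco-surplus: cumulative amount by which regulated flow exceeds the
    75th-percentile unregulated flow; eco-deficit: cumulative shortfall
    below the 25th percentile. All series are day-aligned."""
    surplus = sum(max(q - hi, 0.0) for q, hi in zip(regulated, q75))
    deficit = sum(max(lo - q, 0.0) for q, lo in zip(regulated, q25))
    return surplus, deficit

surplus, deficit = eco_surplus_deficit(
    regulated=[10.0, 5.0, 20.0],   # dam-regulated daily flows (toy values)
    q25=[8.0, 8.0, 8.0],           # unregulated 25th-percentile flows
    q75=[15.0, 15.0, 15.0])        # unregulated 75th-percentile flows
print(surplus, deficit)  # 5.0 3.0
```

A large eco-deficit in autumn and winter is the kind of seasonal signal the study reports downstream of the Three Gorges Dam.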

  4. Landslide triggering by rain infiltration

    Science.gov (United States)

    Iverson, Richard M.

    2000-01-01

    Landsliding in response to rainfall involves physical processes that operate on disparate timescales. Relationships between these timescales guide development of a mathematical model that uses reduced forms of Richards equation to evaluate effects of rainfall infiltration on landslide occurrence, timing, depth, and acceleration in diverse situations. The longest pertinent timescale is A/D0, where D0 is the maximum hydraulic diffusivity of the soil and A is the catchment area that potentially affects groundwater pressures at a prospective landslide slip surface location with areal coordinates x, y and depth H. Times greater than A/D0 are necessary for establishment of steady background water pressures that develop at (x, y, H) in response to rainfall averaged over periods that commonly range from days to many decades. These steady groundwater pressures influence the propensity for landsliding at (x, y, H), but they do not trigger slope failure. Failure results from rainfall over a typically shorter timescale H2/D0 associated with transient pore pressure transmission during and following storms. Commonly, this timescale ranges from minutes to months. The shortest timescale affecting landslide responses to rainfall is √(H/g), where g is the magnitude of gravitational acceleration. Postfailure landslide motion occurs on this timescale, which indicates that the thinnest landslides accelerate most quickly if all other factors are constant. Effects of hydrologic processes on landslide processes across these diverse timescales are encapsulated by a response function, R(t*) = √(t*/π) exp (-1/t*) - erfc (1/√t*), which depends only on normalized time, t*. Use of R(t*) in conjunction with topographic data, rainfall intensity and duration information, an infinite-slope failure criterion, and Newton's second law predicts the timing, depth, and acceleration of rainfall-triggered landslides. Data from contrasting landslides that exhibit rapid, shallow
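
The quoted response function can be evaluated directly with the standard library; a short sketch:

```python
import math

def response_fn(t_star):
    """R(t*) = sqrt(t*/pi) * exp(-1/t*) - erfc(1/sqrt(t*)) (Iverson, 2000)."""
    return (math.sqrt(t_star / math.pi) * math.exp(-1.0 / t_star)
            - math.erfc(1.0 / math.sqrt(t_star)))

# The pressure response grows with normalized time t* = t / (H**2 / D0):
for t_star in (0.1, 1.0, 10.0):
    print(f"R({t_star}) = {response_fn(t_star):.4f}")
```

Because time is normalized by H²/D0, the same curve describes both a thin landslide responding in minutes and a deep one responding over months.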

  5. Development of Overflow-Prevention Valve with Trigger Mechanism.

    Science.gov (United States)

    Ishino, Yuji; Mizuno, Takeshi; Takasaki, Masaya

    2016-09-01

    A new overflow-prevention valve for combustible fluid is developed which uses a trigger mechanism. Loading arms for combustible fluid are used for transferring oil from a tanker to tanks and vice versa. The loading arm has a valve for preventing overflow. Overflow-prevention valves cannot use any electric components, to avoid combustion. Therefore, the valve must be constructed only from mechanical parts. The conventional overflow-prevention valve uses fluid and pneumatic forces. It consists of a sensor probe, a cylinder, a main valve for shutting off the fluid and a locking mechanism for holding the main valve open. The proposed overflow-prevention valve uses the pressure due to the height difference between the fluid level of the tank and the sensor probe. However, the force of the cylinder produced by the pressure is too small to release the locking mechanism. Therefore, a trigger mechanism is introduced between the cylinder and the locking mechanism. The trigger mechanism produces sufficient force to release the locking mechanism and close the main valve when the height of the fluid exceeds a threshold value. A trigger mechanism is designed and fabricated, and the operation necessary for closing the main valve is confirmed experimentally.

  6. Distributed modelling of shallow landslides triggered by intense rainfall

    Directory of Open Access Journals (Sweden)

    G. B. Crosta

    2003-01-01

    Hazard assessment of shallow landslides represents an important aspect of land management in mountainous areas. Among all the methods proposed in the literature, physically based methods are the only ones that explicitly include the dynamic factors that control landslide triggering (rainfall pattern, land-use). For this reason, they allow forecasting both the temporal and the spatial distribution of shallow landslides. Physically based methods for shallow landslides are based on the coupling of the infinite slope stability analysis with hydrological models. Three different grid-based distributed hydrological models are presented in this paper: a steady state model, a transient "piston-flow" wetting front model, and a transient diffusive model. A comparative test of these models was performed to simulate landslides that occurred during a rainfall event (27–28 June 1997) that triggered hundreds of shallow landslides within Lecco province (central Southern Alps, Italy). In order to test the potential of a completely distributed model for rainfall-triggered landslides, radar-detected rainfall intensity has been used. A new procedure for quantitative evaluation of distributed model performance is presented and used in this paper. The diffusive model proves to be the best model for the simulation of shallow landslide triggering after a rainfall event like the one analysed here. Finally, the radar data available for the June 1997 event permitted a great improvement of the simulation. In particular, the radar data made it possible to explain the non-uniform distribution of landslides within the study area.
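
The infinite-slope criterion that these hydrological models are coupled with can be sketched as follows. The soil parameters are illustrative assumptions; the pore-pressure head psi would be supplied by whichever hydrological model (steady state, piston-flow, or diffusive) is used:

```python
import math

def factor_of_safety(z, beta_deg, gamma=19e3, gamma_w=9.81e3,
                     c_eff=2e3, phi_deg=32.0, psi=0.0):
    """Infinite-slope factor of safety at depth z (m) on a slope of angle
    beta, with pore-pressure head psi (m):
    FS = [c' + (gamma*z*cos^2(beta) - gamma_w*psi) * tan(phi')]
         / (gamma*z*sin(beta)*cos(beta))."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma * z * math.cos(beta) ** 2   # total normal stress (Pa)
    u = gamma_w * psi                            # pore pressure (Pa)
    tau = gamma * z * math.sin(beta) * math.cos(beta)  # driving shear stress
    return (c_eff + (sigma_n - u) * math.tan(phi)) / tau

dry = factor_of_safety(z=2.0, beta_deg=35.0, psi=0.0)
wet = factor_of_safety(z=2.0, beta_deg=35.0, psi=1.5)  # after infiltration
print(f"FS dry = {dry:.2f}, FS wet = {wet:.2f}")  # FS < 1 means failure
```

Run cell by cell over a terrain grid, this is exactly the kind of distributed stability map the paper's models produce: the marginally stable dry slope fails once infiltration raises the pressure head.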

  7. A Fast Hardware Tracker for the ATLAS Trigger System

    CERN Document Server

    Neubauer, M; The ATLAS collaboration

    2009-01-01

    As the LHC luminosity is ramped up to the design level of 10³⁴ cm⁻² s⁻¹ and beyond, the high rates, multiplicities, and energies of particles seen by the detectors will pose a unique challenge. Only a tiny fraction of the produced collisions can be stored on tape and immense real-time data reduction is needed. An effective trigger system must maintain high trigger efficiencies for the physics we are most interested in, and at the same time suppress the enormous QCD backgrounds. This requires massive computing power to minimize the online execution time of complex algorithms. A multi-level trigger is an effective solution for an otherwise impossible problem. The Fast Tracker (FTK) is a proposed upgrade to the ATLAS trigger system that will operate at full Level-1 output rates and provide high quality tracks reconstructed over the entire detector by the start of processing in Level-2. FTK solves the combinatorial challenge inherent to tracking by exploiting the massive parallelism of Associative Memori...

  8. Cumulants of heat transfer across nonlinear quantum systems

    Science.gov (United States)

    Li, Huanan; Agarwalla, Bijay Kumar; Li, Baowen; Wang, Jian-Sheng

    2013-12-01

    We consider thermal conduction across a general nonlinear phononic junction. Based on two-time observation protocol and the nonequilibrium Green's function method, heat transfer in steady-state regimes is studied, and practical formulas for the calculation of the cumulant generating function are obtained. As an application, the general formalism is used to study anharmonic effects on fluctuation of steady-state heat transfer across a single-site junction with a quartic nonlinear on-site pinning potential. An explicit nonlinear modification to the cumulant generating function exact up to the first order is given, in which the Gallavotti-Cohen fluctuation symmetry is found still valid. Numerically a self-consistent procedure is introduced, which works well for strong nonlinearity.
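
For intuition only (not the paper's nonequilibrium Green's function formalism): the cumulants of the transferred heat Q are the derivatives of the cumulant generating function K(λ) = ln⟨e^{λQ}⟩ at λ = 0, and the first three can be estimated from sample moments:

```python
def first_three_cumulants(samples):
    """k1 = mean, k2 = variance, k3 = third central moment -- the first
    three derivatives of K(lam) = ln<exp(lam*Q)> at lam = 0."""
    n = len(samples)
    k1 = sum(samples) / n
    k2 = sum((q - k1) ** 2 for q in samples) / n
    k3 = sum((q - k1) ** 3 for q in samples) / n
    return k1, k2, k3

# Toy record of heat transferred per observation interval (arbitrary units):
k1, k2, k3 = first_three_cumulants([1.0, 2.0, 2.0, 3.0])
print(k1, k2, k3)  # 2.0 0.5 0.0
```

A vanishing k3, as in this symmetric toy sample, means no skewness in the heat-transfer fluctuations; the anharmonic corrections studied in the paper modify precisely such higher cumulants.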

  9. A cumulant functional for static and dynamic correlation

    International Nuclear Information System (INIS)

    Hollett, Joshua W.; Hosseini, Hessam; Menzies, Cameron

    2016-01-01

    A functional for the cumulant energy is introduced. The functional is composed of a pair-correction and static and dynamic correlation energy components. The pair-correction and static correlation energies are functionals of the natural orbitals and the occupancy transferred between near-degenerate orbital pairs, rather than the orbital occupancies themselves. The dynamic correlation energy is a functional of the statically correlated on-top two-electron density. The on-top density functional used in this study is the well-known Colle-Salvetti functional. Using the cc-pVTZ basis set, the functional effectively models the bond dissociation of H₂, LiH, and N₂ with equilibrium bond lengths and dissociation energies comparable to those provided by multireference second-order perturbation theory. The performance of the cumulant functional is less impressive for HF and F₂, mainly due to an underestimation of the dynamic correlation energy by the Colle-Salvetti functional.

  10. Fragmentation of tensor polarized deuterons into cumulative pions

    International Nuclear Information System (INIS)

    Afanas'ev, S.; Arkhipov, V.; Bondarev, V.

    1998-01-01

    The tensor analyzing power T₂₀ of the reaction d↑ + A → π⁻(0°) + X has been measured in the fragmentation of 9 GeV tensor polarized deuterons into pions with momenta from 3.5 to 5.3 GeV/c on hydrogen, beryllium and carbon targets. This kinematic range corresponds to the region of cumulative hadron production with the cumulative variable x_c from 1.08 to 1.76. The values of T₂₀ have been found to be small and consistent with positive values. This contradicts the predictions based on a direct mechanism assuming an NN collision between a high-momentum nucleon in the deuteron and a target nucleon (NN → NNπ)

  11. Experience of cumulative effects assessment in the UK

    Directory of Open Access Journals (Sweden)

    Piper Jake

    2004-01-01

    Cumulative effects assessment (CEA) is a development of environmental impact assessment which attempts to take into account the wider picture of what impacts may affect the environment as a result of either multiple or linear projects, or development plans. CEA is seen as a further valuable tool in promoting sustainable development. The broader canvas upon which the assessment is made leads to a suite of issues such as complexity in methods and assessment of significance, the desirability of co-operation between developers and other parties, and new ways of addressing mitigation and monitoring. After outlining the legislative position and the process of CEA, this paper looks at three case studies in the UK where cumulative assessment has been carried out. The cases concern wind farms, major infrastructure and off-shore developments.

  12. Landslides triggered by the 1946 Ancash earthquake, Peru

    Science.gov (United States)

    Kampherm, T. S.; Evans, S. G.; Valderrama Murillo, P.

    2009-04-01

    The 1946 M7.3 Ancash Earthquake triggered a large number of landslides in an epicentral area that straddled the Continental Divide of South America in the Andes of Peru. A small number of landslides were described in reconnaissance reports by E. Silgado and Arnold Heim published shortly after the earthquake, but further details of the landslides triggered by the earthquake have not been reported since. Utilising field traverses, aerial photograph interpretation and GIS, our study mapped 45 landslides inferred to have been triggered by the event. 83% were rock avalanches involving Cretaceous limestones interbedded with shales. The five largest rock/debris avalanches occurred at Rio Llama (est. vol. 37 Mm3), Suytucocha (est. vol. 13.5 Mm3), Quiches (est. vol. 10.5 Mm3), Pelagatos (est. vol. 8 Mm3), and Shundoy (est. vol. 8 Mm3). The Suytucocha, Quiches, and Pelagatos landslides were reported by Silgado and Heim. Rock slope failure was most common on slopes with a southwest aspect, an orientation corresponding to the regional dip direction of major planar structures in the Andean foreland belt (bedding planes and thrust faults). In valleys oriented transverse to the NW-SE structural grain of the epicentral area, south-westerly dipping bedding planes combined with orthogonal joint sets to form numerous wedge failures. Many initial rock slope failures were transformed into rock/debris avalanches by the entrainment of colluvium in their path. At Acobamba, a rock avalanche that transformed into a debris avalanche (est. vol. 4.3 Mm3) overwhelmed a village, resulting in the deaths of 217 people. The cumulative volume-frequency plot shows a strong power-law relation below a marked rollover, similar in form to that derived for landslides triggered by the 1994 Northridge Earthquake. The total volume of the 45 landslides is approximately 93 Mm3. The data point for the Ancash Earthquake plots near the regression line calculated by Keefer (1994), and modified by Malamud et al.

  13. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on an advanced control for autonomous microgrids. In order to improve the performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated...

  14. Seasonal climate change patterns due to cumulative CO2 emissions

    Science.gov (United States)

    Partanen, Antti-Ilari; Leduc, Martin; Damon Matthews, H.

    2017-07-01

    Cumulative CO2 emissions are near linearly related to both global and regional changes in annual-mean surface temperature. These relationships are known as the transient climate response to cumulative CO2 emissions (TCRE) and the regional TCRE (RTCRE), and have been shown to remain approximately constant over a wide range of cumulative emissions. Here, we assessed how well this relationship holds for seasonal patterns of temperature change, as well as for annual-mean and seasonal precipitation patterns. We analyzed an idealized scenario with CO2 concentration growing at an annual rate of 1% using data from 12 Earth system models from the Coupled Model Intercomparison Project Phase 5 (CMIP5). Seasonal RTCRE values for temperature varied considerably, with the highest seasonal variation evident in the Arctic, where RTCRE was about 5.5 °C per Tt C for boreal winter and about 2.0 °C per Tt C for boreal summer. The precipitation response in the Arctic was also stronger during boreal winter than during other seasons. We found that emission-normalized seasonal patterns of temperature change were relatively robust with respect to time, though they were sub-linear with respect to emissions, particularly near the Arctic. Moreover, RTCRE patterns for precipitation could not be quantified robustly due to the large internal variability of precipitation. Our results suggest that cumulative CO2 emissions are a useful metric for predicting regional and seasonal changes in precipitation and temperature. This extension of the TCRE framework to seasonal and regional climate change is helpful for communicating the link between emissions and climate change to policy-makers and the general public, and is well suited for impact studies that could make use of estimated regional-scale climate changes consistent with the carbon budgets associated with global temperature targets.
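
The (R)TCRE proportionality quoted above amounts to a linear scaling of regional warming with cumulative emissions. A trivial sketch using the Arctic values from the abstract (keeping in mind that the abstract reports sub-linearity at high emissions, so this is an approximation):

```python
def regional_warming(cumulative_emissions_ttc, rtcre_c_per_ttc):
    """RTCRE scaling: regional temperature change is approximately
    proportional to cumulative CO2 emissions (in Tt C)."""
    return rtcre_c_per_ttc * cumulative_emissions_ttc

# Arctic RTCRE values quoted in the abstract (deg C per Tt C):
winter = regional_warming(1.0, 5.5)   # boreal winter
summer = regional_warming(1.0, 2.0)   # boreal summer
print(winter, summer)  # 5.5 2.0
```

The same 1 Tt C thus implies nearly three times more Arctic warming in winter than in summer, which is the seasonal asymmetry the study emphasizes.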

  15. Firm heterogeneity, Rules of Origin and Rules of Cumulation

    OpenAIRE

    Bombarda , Pamela; Gamberoni , Elisa

    2013-01-01

    We analyse the impact of relaxing rules of origin (ROOs) in a simple setting with heterogeneous firms that buy intermediate inputs from domestic and foreign sources. In particular, we consider the impact of switching from bilateral to diagonal cumulation when using preferences (instead of paying the MFN tariff) involving the respect of rules of origin. We find that relaxing the restrictiveness of the ROOs leads the least productive exporters to stop exporting. The empirical part confirms thes...

  16. Cumulant approach to dynamical correlation functions at finite temperatures

    International Nuclear Information System (INIS)

    Tran Minhtien.

    1993-11-01

    A new theoretical approach, based on the introduction of cumulants, is developed to calculate thermodynamic averages and dynamical correlation functions at finite temperatures. The method is formulated in Liouville space instead of Hilbert space and can be applied to operators that are not required to satisfy fermion or boson commutation relations. The application of the partitioning and projection methods to the dynamical correlation functions is discussed. The present method can be applied to weakly as well as strongly correlated systems. (author). 9 refs

  17. Severe occupational hand eczema, job stress and cumulative sickness absence.

    Science.gov (United States)

    Böhm, D; Stock Gissendanner, S; Finkeldey, F; John, S M; Werfel, T; Diepgen, T L; Breuer, K

    2014-10-01

    Stress is known to activate or exacerbate dermatoses, but the relationships between chronic stress, job-related stress and sickness absence among occupational hand eczema (OHE) patients are inadequately understood. To see whether chronic stress or burnout symptoms were associated with cumulative sickness absence in patients with OHE and to determine which factors predicted sickness absence in a model including measures of job-related and chronic stress. We investigated correlations of these factors in employed adult inpatients with a history of sickness absence due to OHE in a retrospective cross-sectional explorative study, which assessed chronic stress (Trier Inventory for the Assessment of Chronic Stress), burnout (Shirom Melamed Burnout Measure), clinical symptom severity (Osnabrück Hand Eczema Severity Index), perceived symptom severity, demographic characteristics and cumulative days of sickness absence. The study group consisted of 122 patients. OHE symptoms were not more severe among patients experiencing greater stress and burnout. Women reported higher levels of chronic stress on some measures. Cumulative days of sickness absence correlated with individual dimensions of job-related stress and, in multiple regression analysis, with an overall measure of chronic stress. Chronic stress is an additional factor predicting cumulative sickness absence among severely affected OHE patients. Other relevant factors for this study sample included the 'cognitive weariness' subscale of the Shirom Melamed Burnout Measure and the physical component summary score of the SF-36, a measure of health-related life quality. Prevention and rehabilitation should take job stress into consideration in multidisciplinary treatment strategies for severely affected OHE patients. © The Author 2014. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Science and Societal Partnerships to Address Cumulative Impacts

    OpenAIRE

    Lundquist, Carolyn J.; Fisher, Karen T.; Le Heron, Richard; Lewis, Nick I.; Ellis, Joanne I.; Hewitt, Judi E.; Greenaway, Alison J.; Cartner, Katie J.; Burgess-Jones, Tracey C.; Schiel, David R.; Thrush, Simon F.

    2016-01-01

    Funding and priorities for ocean research are not separate from the underlying sociological, economic, and political landscapes that determine values attributed to ecological systems. Here we present a variation on science prioritization exercises, focussing on inter-disciplinary research questions with the objective of shifting broad scale management practices to better address cumulative impacts and multiple users. Marine scientists in New Zealand from a broad range of scientific and social...

  19. Cumulative prospect theory and mean variance analysis. A rigorous comparison

    OpenAIRE

    Hens, Thorsten; Mayer, Janos

    2012-01-01

    We compare asset allocations derived for cumulative prospect theory(CPT) based on two different methods: Maximizing CPT along the mean–variance efficient frontier and maximizing it without that restriction. We find that with normally distributed returns the difference is negligible. However, using standard asset allocation data of pension funds the difference is considerable. Moreover, with derivatives like call options the restriction to the mean-variance efficient frontier results in a siza...

  20. Signal anomaly detection using modified CUSUM [cumulative sum] method

    International Nuclear Information System (INIS)

    Morgenstern, V.; Upadhyaya, B.R.; Benedetti, M.

    1988-01-01

    An important aspect of detection of anomalies in signals is the identification of changes in signal behavior caused by noise, jumps, changes in band-width, sudden pulses and signal bias. A methodology is developed to identify, isolate and characterize these anomalies using a modification of the cumulative sum (CUSUM) approach. The new algorithm performs anomaly detection at three levels and is implemented on a general purpose computer. 7 refs., 4 figs
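    The paper's three-level modification is not spelled out in the abstract, but the classical tabular CUSUM it builds on is simple to sketch. A minimal one-sided pair of CUSUM statistics for detecting mean shifts; the slack `k` and threshold `h` are hypothetical tuning parameters:

```python
# Minimal two-sided tabular CUSUM detector. Accumulates deviations from a
# target mean, clipped at zero, and flags the first sample where either
# the upward or downward statistic exceeds the threshold h.

def cusum_detect(signal, target_mean, k=0.5, h=5.0):
    """Return the first index where a CUSUM statistic exceeds h, or -1."""
    s_hi = s_lo = 0.0
    for i, x in enumerate(signal):
        s_hi = max(0.0, s_hi + (x - target_mean) - k)  # upward shifts
        s_lo = max(0.0, s_lo + (target_mean - x) - k)  # downward shifts
        if s_hi > h or s_lo > h:
            return i
    return -1

# A bias jump of +2 starting at sample 50 is flagged shortly after onset:
data = [0.0] * 50 + [2.0] * 50
print(cusum_detect(data, target_mean=0.0))  # -> 53
```

    Characterizing *which* anomaly occurred (noise change, jump, pulse, bias), as the modified method above does, requires additional logic on top of this basic detector.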

  1. Nonlinear dynamical triggering of slow slip

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Paul A [Los Alamos National Laboratory; Knuth, Matthew W [WISCONSIN; Kaproth, Bryan M [PENN STATE; Carpenter, Brett [PENN STATE; Guyer, Robert A [Los Alamos National Laboratory; Le Bas, Pierre - Yves [Los Alamos National Laboratory; Daub, Eric G [Los Alamos National Laboratory; Marone, Chris [PENN STATE

    2010-12-10

    Among the most fascinating recent discoveries in seismology have been the phenomena of triggered slip, including triggered earthquakes and triggered tremor, as well as triggered slow, silent slip during which no seismic energy is radiated. Because fault nucleation depths cannot be probed directly, the physical regimes in which these phenomena occur are poorly understood. Thus, determining the physical properties that control diverse types of triggered fault sliding, and which frictional constitutive laws govern triggered faulting variability, is challenging. We are characterizing the physical controls of triggered faulting, with the goal of developing constitutive relations, by conducting laboratory and numerical modeling experiments in sheared granular media at varying load conditions. In order to simulate granular fault zone gouge in the laboratory, glass beads are sheared in a double-direct configuration under constant normal stress, while subject to transient perturbation by acoustic waves. We find that triggered slow, silent slip occurs at very small confining loads (≈1-3 MPa) that are smaller than those where dynamic earthquake triggering takes place (4-7 MPa), and that triggered slow slip is associated with bursts of LFE-like acoustic emission. Experimental evidence suggests that the nonlinear dynamical response of the gouge material induced by dynamic waves may be responsible for the triggered slip behavior: the slip duration, stress drop and along-strike slip displacement are proportional to the triggering wave amplitude. Further, we observe a shear-modulus decrease corresponding to dynamic-wave triggering relative to the shear modulus of stick-slips. Modulus decrease in response to dynamical wave amplitudes of roughly a microstrain and above is a hallmark of elastic nonlinear behavior. We believe that the dynamical waves increase the material's non-affine elastic deformation during shearing, simultaneously leading to instability and slow slip. The inferred ...

  2. Crane Safety Assessment Method Based on Entropy and Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Aihua Li

    2017-01-01

    Full Text Available Assessing the safety status of cranes is an important problem. To overcome the inaccuracies and misjudgments in such assessments, this work describes a safety assessment method for cranes that combines entropy and cumulative prospect theory. Firstly, the proposed method transforms the set of evaluation indices into an evaluation vector. Secondly, a decision matrix is constructed from the evaluation vectors and evaluation standards, and an entropy-based technique is applied to calculate the index weights. Thirdly, positive and negative prospect value matrices are established from reference points based on the positive and negative ideal solutions, which enables the crane safety grade to be determined according to the ranked comprehensive prospect values. Finally, the safety status of four general overhead traveling crane samples is evaluated to verify the rationality and feasibility of the proposed method. The results demonstrate that the method described in this paper can precisely and reasonably reflect the safety status of a crane.
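    The entropy weighting step above is a standard technique: columns of the decision matrix whose values are spread evenly (high entropy) carry little discriminating information and get small weights. A minimal sketch; the example matrix is made-up illustrative data, not the paper's crane indices:

```python
# Entropy-based index weighting for a decision matrix
# (rows = evaluated samples, columns = evaluation indices).
import math

def entropy_weights(matrix):
    m, n = len(matrix), len(matrix[0])
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]  # normalized proportions
        # Shannon entropy of column j, normalized to [0, 1] by log(m)
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergences.append(1.0 - e)  # higher divergence = more informative
    s = sum(divergences)
    return [d / s for d in divergences]

matrix = [[0.9, 0.2], [0.8, 0.9], [0.1, 0.5]]
print([round(w, 3) for w in entropy_weights(matrix)])  # weights sum to 1
```

    In the full method these weights would then feed the cumulative-prospect-theory scoring; that step is not reproduced here.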

  3. Nonuniform Sparse Data Clustering Cascade Algorithm Based on Dynamic Cumulative Entropy

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

    Full Text Available A small amount of prior knowledge and randomly chosen initial cluster centers have a direct impact on the accuracy of iterative clustering algorithms. In this paper, we propose a new algorithm that computes initial cluster centers for k-means clustering and the optimal number of clusters with little prior knowledge, and optimizes the clustering result. It constructs a Euclidean distance control factor based on aggregation density sparse degree to select the initial cluster centers of nonuniform sparse data, and obtains initial data clusters by multidimensional diffusion density distribution. A multiobjective clustering approach based on dynamic cumulative entropy is adopted to optimize the initial data clusters and the number of clusters. The experimental results show that the newly proposed algorithm performs well in obtaining initial cluster centers for the k-means algorithm, and that it improves the clustering accuracy of nonuniform sparse data by about 5%.
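    The abstract does not specify the exact form of the distance control factor or the diffusion density, so the following is only a generic sketch of density-aware seeding in the same spirit: pick the densest point first, then repeatedly pick the candidate that balances local density against distance from the centers already chosen. The density radius and scoring rule are assumptions:

```python
# Hypothetical density-aware seeding for k-means: density times distance
# to the nearest already-chosen center. Not the paper's exact algorithm.
import math

def density_seeds(points, k, radius=1.0):
    # local density = number of neighbours within `radius`
    density = [sum(1 for q in points if math.dist(p, q) <= radius)
               for p in points]
    centers = [points[max(range(len(points)), key=lambda i: density[i])]]
    while len(centers) < k:
        def score(i):
            d = min(math.dist(points[i], c) for c in centers)
            return density[i] * d  # dense AND far from chosen centers
        centers.append(points[max(range(len(points)), key=score)])
    return centers

pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
print(density_seeds(pts, k=2))  # one seed per well-separated clump
```

    Seeding like this avoids the pathological random initializations that motivate the paper, at the cost of an O(n²) density pass.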

  4. Problems of describing the cumulative effect in relativistic nuclear physics

    International Nuclear Information System (INIS)

    Baldin, A.M.

    1979-01-01

    The problem of describing the cumulative effect, i.e. particle production on nuclei in the range kinematically forbidden for one-nucleon collisions, is studied. Discrimination of events containing cumulative particles fixes configurations in the wave function of a nucleus in which several nucleons are closely spaced and their quark-parton components are collectivized. For the cumulative processes under consideration, large distances between quarks are very important. The fundamental facts and theoretical interpretation of quantum field theory and of condensed-media theory in relativistic nuclear physics are presented in brief. Collisions of relativistic nuclei at low momentum transfers are considered in a fast-moving coordinate system. The basic parameter determining this type of collision is the binding energy of nucleons in nuclei. It has been shown that the short-range correlation model provides a good description of many characteristics of multiple particle production, and it may be regarded as an approximate universal property of hadron interactions

  5. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  6. Energy Current Cumulants in One-Dimensional Systems in Equilibrium

    Science.gov (United States)

    Dhar, Abhishek; Saito, Keiji; Roy, Anjan

    2018-06-01

    A recent theory based on fluctuating hydrodynamics predicts that one-dimensional interacting systems with particle, momentum, and energy conservation exhibit anomalous transport that falls into two main universality classes. The classification is based on behavior of equilibrium dynamical correlations of the conserved quantities. One class is characterized by sound modes with Kardar-Parisi-Zhang scaling, while the second class has diffusive sound modes. The heat mode follows Lévy statistics, with different exponents for the two classes. Here we consider heat current fluctuations in two specific systems, which are expected to be in the above two universality classes, namely, a hard particle gas with Hamiltonian dynamics and a harmonic chain with momentum conserving stochastic dynamics. Numerical simulations show completely different system-size dependence of current cumulants in these two systems. We explain this numerical observation using a phenomenological model of Lévy walkers with inputs from fluctuating hydrodynamics. This consistently explains the system-size dependence of heat current fluctuations. For the latter system, we derive the cumulant-generating function from a more microscopic theory, which also gives the same system-size dependence of cumulants.
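    The system-size analysis above is phrased in terms of cumulants of the heat current. As a small self-contained illustration (not the paper's derivation), the first four cumulants of a sample can be estimated from its central moments: kappa1 = mean, kappa2 = mu2, kappa3 = mu3, kappa4 = mu4 - 3*mu2^2:

```python
# Estimate the first four cumulants of a sample from its central moments.
# These are the (biased) plug-in estimators, fine for illustration.

def sample_cumulants(xs):
    n = len(xs)
    mean = sum(xs) / n
    mu = [sum((x - mean) ** r for x in xs) / n for r in (2, 3, 4)]
    return mean, mu[0], mu[1], mu[2] - 3 * mu[0] ** 2

# For a symmetric two-point sample the odd cumulants vanish and the
# fourth cumulant is negative (sub-Gaussian tails):
k1, k2, k3, k4 = sample_cumulants([-1.0, 1.0, -1.0, 1.0])
print(k1, k2, k3, k4)  # -> 0.0 1.0 0.0 -2.0
```

    In the paper's setting one would compute such cumulants of the time-integrated heat current for increasing system sizes and compare their scaling against the Lévy-walker prediction.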

  7. Stakeholder attitudes towards cumulative and aggregate exposure assessment of pesticides.

    Science.gov (United States)

    Verbeke, Wim; Van Loo, Ellen J; Vanhonacker, Filiep; Delcour, Ilse; Spanoghe, Pieter; van Klaveren, Jacob D

    2015-05-01

    This study evaluates the attitudes and perspectives of different stakeholder groups (agricultural producers, pesticide manufacturers, trading companies, retailers, regulators, food safety authorities, scientists and NGOs) towards the concepts of cumulative and aggregate exposure assessment of pesticides by means of qualitative in-depth interviews (n = 15) and a quantitative stakeholder survey (n = 65). The stakeholders involved generally agreed that the use of chemical pesticides is needed, primarily for meeting the need of feeding the growing world population, while clearly acknowledging the problematic nature of human exposure to pesticide residues. Current monitoring was generally perceived to be adequate, but the timeliness and consistency of monitoring practices across countries were questioned. The concept of cumulative exposure assessment was better understood by stakeholders than the concept of aggregate exposure assessment. Identified pitfalls were data availability, data limitations, sources and ways of dealing with uncertainties, as well as information and training needs. Regulators and food safety authorities were perceived as the stakeholder groups for whom cumulative and aggregate pesticide exposure assessment methods and tools would be most useful and acceptable. Insights obtained from this exploratory study have been integrated in the development of targeted and stakeholder-tailored dissemination and training programmes that were implemented within the EU-FP7 project ACROPOLIS. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. The trigger supervisor: Managing triggering conditions in a high energy physics experiment

    International Nuclear Information System (INIS)

    Wadsworth, B.; Lanza, R.; LeVine, M.J.; Scheetz, R.A.; Videbaek, F.

    1987-01-01

    A trigger supervisor, implemented in VME-bus hardware, is described, which enables the host computer to dynamically control and monitor the trigger configuration for acquiring data from multiple detector partitions in a complex experiment

  9. Storytelling as a trigger for sharing conversations

    Directory of Open Access Journals (Sweden)

    Emma Louise Parfitt

    2014-04-01

    Full Text Available This article explores whether traditional oral storytelling can be used to provide insights into the way in which young people aged 12-14 years identify and understand the language of emotion and behaviour. Following the preliminary analysis, I propose that storytelling may trigger sharing conversations. My research attempts to extend the social and historical perspectives of Jack Zipes on fairy tales into a sociological analysis of young people’s lives today. I seek to investigate the extent to which the storytelling space offers potential benefits as a safe place for young people to share emotions and experiences, and learn from one another. My research analysis involved NVivo coding of one-hour storytelling and focus group sessions, held over five weeks. In total, there were six groups of four children, of mixed ethnicity, gender, ability, and socio-economic background, from three schools within Warwickshire. The results confirmed that the beneficial effects of the storytelling space include a safe area for sharing emotions and experiences, and in general for supporting young people outside formal learning settings.

  10. Vertex trigger implementation using shared memory technology

    CERN Document Server

    Müller, H

    1998-01-01

    The implementation of a 1st-level vertex trigger for LHC-B is particularly difficult due to the high (1 MHz) input data rate. With ca. 350 silicon hits per event, both the R strips and Phi strips of the detectors produce a total of ca. 2 Gbyte/s of zero-suppressed data. This note follows up on the ideas of using R-Phi coordinates for fast integer linefinding in programmable hardware, as described in LHB note 97-006. For an implementation we propose an FPGA preprocessing stage operating at 1 MHz, with the benefit of substantially reducing the amount of data to be transmitted to the CPUs and of liberating a large fraction of CPU time. Interconnected via 4 Gbit/s SCI technology, a shared memory system can be built which allows data-driven eventbuilding to be performed with or without preprocessing. A fully data-driven architecture between source modules and destination memories provides a highly reliable memory-to-memory transfer mechanism of very low latency. The eventbuilding is performed via associating events at the sourc...

  11. Tools for Trigger Aware Analyses in ATLAS

    CERN Document Server

    Krasznahorkay, A; The ATLAS collaboration; Stelzer, J

    2010-01-01

    In order to search for rare processes, all four LHC experiments have to use advanced triggering methods for selecting and recording the events of interest. At the expected nominal LHC operating conditions only about 0.0005% of the collision events can be kept for physics analysis in ATLAS. Therefore the understanding and evaluation of the trigger performance is one of the most crucial parts of any physics analysis. ATLAS’s first level trigger is composed of custom-built hardware, while the second and third levels are implemented using regular PCs running reconstruction and selection algorithms. Because of this split, accessing the results of the trigger execution for the two stages is different. The complexity of the software trigger presents further difficulties in accessing the trigger data. To make the job of the physicists easier when evaluating the trigger performance, multiple general-use tools are provided by the ATLAS Trigger Analysis Tools group. The TrigDecisionTool, a general tool, is provided to...

  12. The Run-2 ATLAS Trigger System

    CERN Document Server

    Ruiz-Martinez, Aranzazu; The ATLAS collaboration

    2016-01-01

    The ATLAS trigger has been successfully collecting collision data during the first run of the LHC between 2009-2013 at centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 (L1) trigger and a software-based high-level trigger (HLT) that reduce the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV, resulting in roughly five times higher trigger rates. We will briefly review the ATLAS trigger system upgrades that were implemented during the shutdown, allowing us to cope with the increased trigger rates while maintaining or even improving our efficiency to select relevant physics processes. This includes changes to the L1 calorimeter and muon trigger systems, the introduction of a new L1 topological trigger module, and the merging of the previously two-level HLT system into a single event filter farm. Using a few examples, we will show the ...

  13. The Run-2 ATLAS Trigger System

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00222798; The ATLAS collaboration

    2016-01-01

    The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009-2013 at different centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 trigger and a software-based high-level trigger (HLT) that reduce the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV and at higher luminosity, resulting in roughly five times higher trigger rates. A brief review will be given of the ATLAS trigger system upgrades implemented between Run-1 and Run-2, which allow the system to cope with the increased trigger rates while maintaining or even improving the efficiency to select physics processes of interest. This includes changes to the Level-1 calorimeter and muon trigger systems, the introduction of a new Level-1 topological trigger module, and the merging of the previously two-level HLT system into a single event filter farm. A ...

  14. Intelligent trigger processor for the crystal box

    International Nuclear Information System (INIS)

    Sanders, G.H.; Butler, H.S.; Cooper, M.D.

    1981-01-01

    A large solid-angle modular NaI(Tl) detector with 432 phototubes and 88 trigger scintillators is being used to search simultaneously for three lepton-flavor-changing decays of the muon. A beam of up to 10^6 muons stopping per second with a 6% duty factor would yield up to 1000 triggers per second from random triple coincidences. A reduction of the trigger rate to 10 Hz is required from a hardwired primary trigger processor described in this paper. Further reduction to < 1 Hz is achieved by a microprocessor-based secondary trigger processor. The primary trigger hardware imposes voter coincidence logic, stringent timing requirements, and a non-adjacency requirement in the trigger scintillators defined by hardwired circuits. Sophisticated geometric requirements are imposed by a PROM-based matrix logic, and energy and vector-momentum cuts are imposed by a hardwired processor using LSI flash ADCs and digital arithmetic logic. The secondary trigger employs four satellite microprocessors to do a sparse data scan, multiplex the data acquisition channels and apply additional event filtering

  15. Trigger factors for familial hemiplegic migraine

    DEFF Research Database (Denmark)

    Hansen, Jakob Møller; Hauge, Anne Werner; Ashina, Messoud

    2011-01-01

    The aim was to identify and describe migraine trigger factors in patients with familial hemiplegic migraine (FHM) from a population-based sample.

  16. The ATLAS Level-1 Topological Trigger Performance

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00371751; The ATLAS collaboration

    2016-01-01

    The LHC will collide protons in the ATLAS detector with increasing luminosity through 2016, placing stringent operational and physical requirements on the ATLAS trigger system in order to reduce the 40 MHz collision rate to a manageable event storage rate of 1 kHz, while not rejecting interesting physics events. The Level-1 trigger is the first rate-reducing step in the ATLAS trigger system, with an output rate of 100 kHz and a decision latency smaller than 2.5 μs. It consists of a calorimeter trigger, a muon trigger and a central trigger processor. During the LHC shutdown after Run 1 finished in 2013, the Level-1 trigger system was upgraded, including hardware, firmware and software updates. In particular, new electronics modules were introduced in the real-time data processing path: the Topological Processor System (L1Topo). It consists of a single AdvancedTCA shelf equipped with two Level-1 topological processor blades. They receive real-time information from the Level-1 calorimeter and muon triggers, which...

  17. Do episodes of anger trigger myocardial infarction?

    DEFF Research Database (Denmark)

    Möller, J; Hallqvist, J; Diderichsen, Finn

    1999-01-01

    Our objectives were to study anger as a trigger of acute myocardial infarction (MI) and to explore potential effect modification by usual behavioral patterns related to hostility.

  18. The LVL2 trigger goes online

    CERN Multimedia

    David Berge

    On Friday, the 9th of February, the ATLAS TDAQ community reached an important milestone. In a successful integration test, cosmic-ray muons were recorded with parts of the muon spectrometer, the central-trigger system and a second-level trigger algorithm. This was actually the first time that a full trigger slice all the way from the first-level trigger muon chambers up to event building after event selection by the second-level trigger ran online with cosmic rays. The ATLAS trigger and data acquisition system has a three-tier structure that is designed to cope with the enormous demands of proton-proton collisions at a bunch-crossing frequency of 40 MHz, with a typical event size of 1-2 MB. The online event selection has to reduce the incoming rate by a factor of roughly 200,000 to 200 Hz, a rate digestible by the archival-storage and offline-processing facilities. ATLAS has a mixed system: the first-level trigger (LVL1) is in hardware, while the other two consecutive levels, the second-level trigger (LVL2)...

  19. Evolution of costly explicit memory and cumulative culture.

    Science.gov (United States)

    Nakamaru, Mayuko

    2016-06-21

    Humans can acquire new information and modify it (cumulative culture) based on their learning and memory abilities, especially explicit memory, through the processes of encoding, consolidation, storage, and retrieval. Explicit memory is categorized into semantic and episodic memories. Animals have semantic memory, while episodic memory is unique to humans and essential for innovation and the evolution of culture. As both episodic and semantic memory are needed for innovation, the evolution of explicit memory influences the evolution of culture. However, previous theoretical studies have shown that environmental fluctuations influence the evolution of imitation (social learning) and innovation (individual learning) and assume that memory is not an evolutionary trait. If individuals can store and retrieve acquired information properly, they can modify it and innovate new information. Therefore, being able to store and retrieve information is essential from the perspective of cultural evolution. However, if both storage and retrieval were too costly, forgetting and relearning would have an advantage over storing and retrieving acquired information. In this study, using mathematical analysis and individual-based simulations, we investigate whether cumulative culture can promote the coevolution of costly memory and social and individual learning, assuming that cumulative culture improves the fitness of each individual. The conclusions are: (1) without cumulative culture, a social learning cost is essential for the evolution of storage-retrieval. Costly storage-retrieval can evolve with individual learning but costly social learning does not evolve. When low-cost social learning evolves, the repetition of forgetting and learning is favored more than the evolution of costly storage-retrieval, even though a cultural trait improves the fitness. (2) When cumulative culture exists and improves fitness, storage-retrieval can evolve with social and/or individual learning, which ...

  20. Reliability model analysis and primary experimental evaluation of laser triggered pulse trigger

    International Nuclear Information System (INIS)

    Chen Debiao; Yang Xinglin; Li Yuan; Li Jin

    2012-01-01

    A high-performance pulse trigger can enhance the performance and stability of the PPS. It is necessary to evaluate the reliability of the LTGS pulse trigger, so we establish a reliability analysis model of this pulse trigger based on the CARMES software; the reliability evaluation accords with the statistical results. (authors)

  1. A general-purpose trigger processor system and its application to fast vertex trigger

    International Nuclear Information System (INIS)

    Hazumi, M.; Banas, E.; Natkaniec, Z.; Ostrowicz, W.

    1997-12-01

    A general-purpose hardware trigger system has been developed. The system comprises programmable trigger processors and pattern generator/samplers. The hardware design of the system is described. An application as a prototype of the very fast vertex trigger in an asymmetric B-factory at KEK is also explained. (author)

  2. El Carreto o Cumulá - Aspidosperma Dugandii Standl

    Directory of Open Access Journals (Sweden)

    Dugand Armando

    1944-03-01

    Full Text Available Common names: Carreto (Atlántico, Bolívar, Magdalena); Cumulá (Cundinamarca, Tolima). According to Dr. Emilio Robledo (Lecciones de Bot. ed. 3, 2: 544. 1939), the name Carreto is also used in Puerto Berrío (Antioquia). The same author (loc. cit.) gives the name Comulá for an undetermined species of Viburnum in Mariquita (Tolima), and J. M. Duque, referring to the same plant and locality (in Bot. Gen. Colomb. 340, 356. 1943), attributes this common name to Aspidosperma ellipticum Rusby. However, the wood samples of Cumulá or Comulá that I have examined, which come from the Mariquita region (one of them was recently sent to me by the distinguished ichthyologist Mr. Cecil Miles), belong beyond any doubt to A. Dugandii Standl. On the other hand, Santiago Cortés (Fl. Colomb. 206. 1898; ed. 2: 239. 1912) cites the Cumulá "of Anapoima and other places along the río Magdalena", saying that it belongs to the Leguminosae; but the very brief description this author gives of the wood, "orange-coloured and notable for its density, hardness and resistance to moisture", leads me to believe that it is the same Cumulá recently collected in Tocaima, since that town lies only a few kilometres from Anapoima.

  3. The ATLAS Level-1 Trigger Timing Setup

    CERN Document Server

    Spiwoks, R; Ellis, Nick; Farthouat, P; Gällnö, P; Haller, J; Krasznahorkay, A; Maeno, T; Pauly, T; Pessoa-Lima, H; Resurreccion-Arcas, I; Schuler, G; De Seixas, J M; Torga-Teixeira, R; Wengler, T

    2005-01-01

    The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a bunch-crossing rate of 40 MHz. In order to reduce the data rate, a three-level trigger system selects potentially interesting physics. The first trigger level is implemented in electronics and firmware. It aims at reducing the output rate to less than 100 kHz. The Central Trigger Processor combines information from the calorimeter and muon trigger processors and makes the final Level-1-Accept decision. It is a central element in the timing setup of the experiment. Three aspects are considered in this article: the timing setup with respect to the Level-1 trigger, with respect to the experiment, and with respect to the world.

  4. MR imaging findings of trigger thumb

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Eric Y.; Chen, Karen C.; Chung, Christine B. [VA San Diego Healthcare System, Radiology Service, San Diego, CA (United States); University of California, San Diego Medical Center, Department of Radiology, San Diego, CA (United States)

    2015-08-15

    Trigger finger (or trigger thumb), also known as stenosing tenosynovitis, is a common clinical diagnosis that rarely presents for imaging. Because of this selection bias, many radiologists may not be familiar with the process. Furthermore, patients who do present for imaging frequently have misleading examination indications. To our knowledge, magnetic resonance (MR) imaging findings of trigger thumb have not been previously reported in the literature. In this article, we review the entity of trigger thumb, the anatomy involved, and associated imaging findings, which include flexor pollicis longus tendinosis with a distinct nodule, A1 pulley thickening, and tenosynovitis. In addition, in some cases, an abnormal Av pulley is apparent. In the rare cases of trigger finger that present for MR imaging, accurate diagnosis by the radiologist can allow initiation of treatment and avoid further unnecessary workup. (orig.)

  5. MR imaging findings of trigger thumb

    International Nuclear Information System (INIS)

    Chang, Eric Y.; Chen, Karen C.; Chung, Christine B.

    2015-01-01

    Trigger finger (or trigger thumb), also known as stenosing tenosynovitis, is a common clinical diagnosis that rarely presents for imaging. Because of this selection bias, many radiologists may not be familiar with the process. Furthermore, patients who do present for imaging frequently have misleading examination indications. To our knowledge, magnetic resonance (MR) imaging findings of trigger thumb have not been previously reported in the literature. In this article, we review the entity of trigger thumb, the anatomy involved, and associated imaging findings, which include flexor pollicis longus tendinosis with a distinct nodule, A1 pulley thickening, and tenosynovitis. In addition, in some cases, an abnormal Av pulley is apparent. In the rare cases of trigger finger that present for MR imaging, accurate diagnosis by the radiologist can allow initiation of treatment and avoid further unnecessary workup. (orig.)

  6. Pulse triggering mechanism of air proportional counters

    International Nuclear Information System (INIS)

    Aoyama, T.; Mori, T.; Watanabe, T.

    1983-01-01

    This paper describes the pulse triggering mechanism of a cylindrical proportional counter filled with air at atmospheric pressure for the incidence of β-rays. Experimental results indicate that primary electrons created distantly from the anode wire by a β-ray are transformed into negative ions, which then detach electrons close to the anode wire and generate electron avalanches thus triggering pulses, while electrons created near the anode wire by a β-ray directly trigger a pulse. Since a negative ion pulse is triggered by a single electron detached from a negative ion, multiple pulses are generated by a large number of ions produced by the incidence of a single β-ray. It is therefore necessary not to count pulses triggered by negative ions but to count those by primary electrons alone when use is made of air proportional counters for the detection of β-rays. (orig.)

  7. Concept of the CMS Trigger Supervisor

    CERN Document Server

    Magrans de Abril, Ildefons; Varela, Joao

    2006-01-01

    The Trigger Supervisor is an online software system designed for the CMS experiment at CERN. Its purpose is to provide a framework to set up, test, operate and monitor the trigger components on one hand and to manage their interplay and the information exchange with the run control part of the data acquisition system on the other. The Trigger Supervisor is conceived to provide a simple and homogeneous client interface to the online software infrastructure of the trigger subsystems. This document specifies the functional and non-functional requirements, design and operational details, and the components that will be delivered in order to facilitate a smooth integration of the trigger software in the context of CMS.

  8. A parallel non-neural trigger tracker for the SSC

    International Nuclear Information System (INIS)

    Farber, R.M.; Kennison, W.; Lapedes, A.S.

    1991-01-01

    The Superconducting Super Collider (SSC) is a major project promising to open the vistas of very high energy particle physics. When the SSC is in operation, data will be produced at a staggering rate. Current estimates place the raw data coming out of the proposed silicon detector system at 2.5 × 10^16 bits/second. Clearly, storing all events for later off-line processing is totally impracticable. A hierarchy of triggers, firing only on events meeting increasingly specific criteria, is planned to cull interesting events from the flood of information. Each event consists of a sequence of isolated ''hits'', caused by particles hitting various parts of the detector. Collating these hits into the tracks of the approximately 500 particles/event, and then quickly deciding which events meet the criteria for later processing, is essential if the SSC is to produce usable information. This paper addresses the need for real-time triggering and track reconstruction. A benchmarked and buildable algorithm, operable at the required data rates, is described. The use of neural nets, suggested by other researchers, is specifically avoided as unnecessary and impractical. Instead, a parallel algorithm, and an associated hardware architecture using only conventional technology, is presented. The algorithm has been tested on fully scaled-up, extensively detailed, simulated SSC events, with extremely encouraging results. Preliminary hardware analysis indicates that the trigger/tracker may be built within proposed SSC budget guidelines. 7 refs., 4 figs

  9. Trigger release mechanism for release of mine water to Magela Creek

    International Nuclear Information System (INIS)

    McQuade, C.V.; McGill, R.A.

    1988-01-01

    The Ranger Uranium Mine is surrounded by a World Heritage National Park. The strict environmental controls under which the mine operates are based on scientific and social requirements. Release of non-process storm runoff water to the Magela Creek during flood discharge and under controlled conditions has been identified as best practicable technology for the operation of the water management system. Social and political factors have limited this release to a wet season with an annual exceedance probability of one in ten. The first-generation trigger mechanism was based on a percentile analysis of monthly rainfall. The second-generation trigger is based on cumulative monthly volume increase in the retention ponds and is considered to be more applicable to the operation of the mine water management system. 6 figs., 2 tabs
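The second-generation trigger described above amounts to a threshold test of the cumulative wet-season volume in the retention ponds against a monthly trigger curve. A minimal sketch of that decision rule; the function name and all curve values are hypothetical illustrations, not Ranger's actual criteria:

```python
def release_triggered(cumulative_volume_ml, month, trigger_curve):
    """Illustrative second-generation trigger (hypothetical values):
    a release to Magela Creek is triggered when the cumulative pond
    volume (megalitres) exceeds the trigger level for the month."""
    return cumulative_volume_ml > trigger_curve[month]

# hypothetical trigger curve for a Nov-Mar wet season (megalitres)
curve = {"Nov": 200, "Dec": 450, "Jan": 750, "Feb": 1000, "Mar": 1200}
print(release_triggered(820, "Jan", curve))  # → True (820 ML > 750 ML)
```

A volume-based curve of this kind reacts to what has actually accumulated in the ponds, which is why the abstract argues it suits the mine water system better than a rainfall-percentile trigger.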

  10. Designing signal-enriched triggers for boosted jets.

    CERN Document Server

    Toumazou, Marina

    2017-01-01

    Triggers designed to favour the selection of hadronically decaying massive particles have been studied: both triggers using solely ET and mass cuts (similar to the new 2017 triggers) and triggers exploiting polarization information. The mass-cut triggers show substantial gains in rate reduction, while the benefits of polarization triggers are less obvious. The final conclusion is that it is more useful to identify and trigger on generic boosted decays, irrespective of the polarization of the decaying particle.

  11. 78 FR 26308 - Endangered and Threatened Wildlife and Plants; Proposed Threatened Status for Coral Pink Sand...

    Science.gov (United States)

    2013-05-06

    ... effects of climate change and drought; and (4) cumulative interaction of individual factors such as off..., we considered the types of activities that might trigger regulatory impacts under the rule, as well... work programs; Child Nutrition; Food Stamps; Social Services Block Grants; Vocational Rehabilitation...

  12. Asynchronous sampled-data approach for event-triggered systems

    Science.gov (United States)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between control and event-condition design, optimisation of the available resources such as control, communication and computation power, and time-delays due to computation and communication network. With this motivation, the paper presents separate designs of control and event-triggering mechanism, thus simplifying the overall analysis, asynchronous linear quadratic Gaussian controller which tackles delays and aperiodic nature of transmissions, and a novel event mechanism which compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control and the results show significant improvement against the periodic counterpart.
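The event-triggering idea above can be illustrated with a toy scalar control loop. This is a minimal sketch of a plain threshold-based event condition, not the paper's LQG/cost-comparison design; the plant model, gain, and threshold are made-up values:

```python
def simulate_event_triggered(a=0.95, k=0.4, sigma=0.05, steps=200):
    """Toy event-triggered loop (illustrative only): the sensor
    transmits the plant state to the controller only when the gap
    between the current state and the last transmitted value exceeds
    the threshold sigma, saving communication bandwidth."""
    x, x_sent, transmissions = 1.0, 1.0, 0
    for _ in range(steps):
        # event condition: transmit only when the state has drifted enough
        if abs(x - x_sent) > sigma:
            x_sent = x
            transmissions += 1
        u = -k * x_sent          # controller acts on the last received state
        x = a * x + u            # scalar plant: x_{k+1} = a*x_k + u_k
    return transmissions

# far fewer transmissions than the 200 periodic control updates
print(simulate_event_triggered())
```

A periodic implementation would transmit at every one of the 200 steps; the event condition keeps the loop stable while communicating only when the state has drifted, which is the trade-off the paper's cost-based mechanism optimises.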

  13. A Cross-Layer User Centric Vertical Handover Decision Approach Based on MIH Local Triggers

    Science.gov (United States)

    Rehan, Maaz; Yousaf, Muhammad; Qayyum, Amir; Malik, Shahzad

    Vertical handover decision algorithms that are based on user preferences and coupled with Media Independent Handover (MIH) local triggers have not been explored much in the literature. We have developed a comprehensive cross-layer solution, called the Vertical Handover Decision (VHOD) approach, which consists of three parts: a mechanism for collecting and storing user preferences, the VHOD algorithm, and the MIH Function (MIHF). The MIHF triggers the VHOD algorithm, which operates on user preferences to issue handover commands to the mobility management protocol. The VHOD algorithm is an MIH User and therefore needs to subscribe to events and configure thresholds for receiving triggers from the MIHF. In this regard, we have performed experiments in WLAN to suggest thresholds for the Link Going Down trigger. We have also critically evaluated the handover decision process, proposed a Just-in-time interface activation technique, compared our proposed approach with prominent user-centric approaches, and analyzed our approach from different aspects.

  14. Synchronization of Switched Neural Networks With Communication Delays via the Event-Triggered Control.

    Science.gov (United States)

    Wen, Shiping; Zeng, Zhigang; Chen, Michael Z Q; Huang, Tingwen

    2017-10-01

    This paper addresses the issue of synchronization of switched delayed neural networks with communication delays via event-triggered control. For synchronizing coupled switched neural networks, we propose a novel event-triggered control law which could greatly reduce the number of control updates for synchronization tasks of coupled switched neural networks involving embedded microprocessors with limited on-board resources. The control signals are driven by properly defined events, which depend on the measurement errors and current-sampled states. By using a delay system method, a novel model of synchronization error system with delays is proposed with the communication delays and event-triggered control in the unified framework for coupled switched neural networks. The criteria are derived for the event-triggered synchronization analysis and control synthesis of switched neural networks via the Lyapunov-Krasovskii functional method and free weighting matrix approach. A numerical example is elaborated on to illustrate the effectiveness of the derived results.

  15. KATANA – A charge-sensitive triggering system for the SπRIT experiment

    Energy Technology Data Exchange (ETDEWEB)

    Lasko, P. [Institute of Nuclear Physics, Polish Academy of Sciences,Kraków (Poland); Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Adamczyk, M.; Brzychczyk, J. [Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Hirnyk, P.; Łukasik, J. [Institute of Nuclear Physics, Polish Academy of Sciences,Kraków (Poland); Pawłowski, P., E-mail: piotr.pawlowski@ifj.edu.pl [Institute of Nuclear Physics, Polish Academy of Sciences,Kraków (Poland); Pelczar, K. [Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Snoch, A. [University of Wroclaw, Wrocław (Poland); Sochocka, A.; Sosin, Z. [Faculty of Physics, Astronomy and Applied Computer Science, Jagiellonian University,Kraków (Poland); Barney, J. [Department of Physics and Astronomy, Michigan State University, East Lansing (United States); National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing (United States); Cerizza, G. [National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing (United States); Estee, J. [Department of Physics and Astronomy, Michigan State University, East Lansing (United States); National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing (United States); Isobe, T. [RIKEN Nishina Center, Wako, Saitama (Japan); Jhang, G. [Department of Physics, Korea University, Seoul (Korea, Republic of); Kaneko, M. [Department of Physics, Kyoto University, Kita-shirakawa, Kyoto (Japan); Kurata-Nishimura, M. [RIKEN Nishina Center, Wako, Saitama (Japan); and others

    2017-06-01

    KATANA - the Krakow Array for Triggering with Amplitude discrimiNAtion - has been built and used as a trigger and veto detector for the SπRIT TPC at RIKEN. Its construction allows operation in a magnetic field and provides a fast response to ionizing particles, giving approximate forward multiplicity and charge information. Depending on this information, trigger and veto signals are generated. The article presents the performance of the detector and details of its construction. A simple phenomenological parametrization of the number of emitted scintillation photons in plastic scintillator is proposed. The effect of the light output deterioration in the plastic scintillator due to the in-beam irradiation is discussed.

  16. Instrumentation of a Level-1 Track Trigger at ATLAS with Double Buffer Front-End Architecture

    CERN Document Server

    Cooper, B; The ATLAS collaboration

    2012-01-01

    The increased collision rate and pile-up produced at the HL-LHC requires a substantial upgrade of the ATLAS level-1 trigger in order to maintain a broad physics reach. We show that tracking information can be used to control trigger rates, and describe a proposal for how this information can be extracted within a two-stage level-1 trigger design that has become the baseline for the HL-LHC upgrade. We demonstrate that, in terms of the communication between the external processing and the tracking detector front-ends, a hardware solution is possible that fits within the latency constraints of level-1.

  17. The ATLAS Electron and Photon Trigger

    CERN Document Server

    Jones, Samuel David; The ATLAS collaboration

    2017-01-01

    Electron and photon triggers covering transverse energies from 5 GeV to several TeV are essential for signal selection in a wide variety of ATLAS physics analyses to study Standard Model processes and to search for new phenomena. Final states including leptons and photons had, for example, an important role in the discovery and measurement of the Higgs boson. Dedicated triggers are also used to collect data for calibration, efficiency and fake rate measurements. The ATLAS trigger system is divided in a hardware-based Level-1 trigger and a software-based high-level trigger, both of which were upgraded during the LHC shutdown in preparation for Run-2 operation. To cope with the increasing luminosity and more challenging pile-up conditions at a center-of-mass energy of 13 TeV, the trigger selections at each level are optimized to control the rates and keep efficiencies high. To achieve this goal multivariate analysis techniques are used. The ATLAS electron and photon triggers and their performance with Run 2 dat...

  18. The ATLAS Electron and Photon Trigger

    CERN Document Server

    Jones, Samuel David; The ATLAS collaboration

    2018-01-01

    Electron and photon triggers covering transverse energies from 5 GeV to several TeV are essential for signal selection in a wide variety of ATLAS physics analyses to study Standard Model processes and to search for new phenomena. Final states including leptons and photons had, for example, an important role in the discovery and measurement of the Higgs boson. Dedicated triggers are also used to collect data for calibration, efficiency and fake rate measurements. The ATLAS trigger system is divided in a hardware-based Level-1 trigger and a software-based high-level trigger, both of which were upgraded during the LHC shutdown in preparation for Run-2 operation. To cope with the increasing luminosity and more challenging pile-up conditions at a center-of-mass energy of 13 TeV, the trigger selections at each level are optimized to control the rates and keep efficiencies high. To achieve this goal multivariate analysis techniques are used. The ATLAS electron and photon triggers and their performance with Run 2 dat...

  19. Upgrade of the CMS Global Muon Trigger

    CERN Document Server

    Jeitler, Manfred; Rabady, Dinyar; Sakulin, Hannes; Stahl, Achim

    2015-01-01

    The increase in center-of-mass energy and luminosity for Run-II of the Large Hadron Collider poses new challenges for the trigger systems of the experiments. To keep triggering with a similar performance as in Run-I, the CMS muon trigger is currently being upgraded. The new algorithms will provide higher resolution, especially for the muon transverse momentum and will make use of isolation criteria that combine calorimeter with muon information already in the level-1 trigger. The demands of the new algorithms can only be met by upgrading the level-1 trigger system to new powerful FPGAs with high bandwidth I/O. The processing boards will be based on the new μTCA standard. We report on the planned algorithms for the upgraded Global Muon Trigger (μGMT) which sorts and removes duplicates from boundaries of the muon trigger sub-systems. Furthermore, it determines how isolated the muon candidates are based on calorimetric energy deposits. The μGMT will be implemented using a processing board that features a larg...

  20. Upgrade of the CMS Global Muon Trigger

    CERN Document Server

    Lingemann, Joschka; Sakulin, Hannes; Jeitler, Manfred; Stahl, Achim

    2015-01-01

    The increase in center-of-mass energy and luminosity for Run 2 of the Large Hadron Collider pose new challenges for the trigger systems of the experiments. To keep triggering with a similar performance as in Run 1, the CMS muon trigger is currently being upgraded. The new algorithms will provide higher resolution, especially for the muon transverse momentum and will make use of isolation criteria that combine calorimeter with muon information already in the level-1 trigger. The demands of the new algorithms can only be met by upgrading the level-1 trigger system to new powerful FPGAs with high bandwidth I/O. The processing boards will be based on the new microTCA standard. We report on the planned algorithms for the upgraded Global Muon Trigger (GMT) which combines information from the muon trigger sub-systems and assigns the isolation variable. The upgraded GMT will be implemented using a Master Processor 7 card, built by Imperial College, that features a large Xilinx Virtex 7 FPGA. Up to 72 optical links at...

  1. The ZEUS calorimeter first level trigger

    International Nuclear Information System (INIS)

    Smith, W.H.; Ali, I.; Behrens, B.; Fordham, C.; Foudas, C.; Goussiou, A.; Jaworski, M.; Kinnel, T.; Lackey, J.; Robl, P.; Silverstein, S.; Dawson, J.W.; Krakauer, D.A.; Talaga, R.L.; Schlereth, J.L.

    1994-10-01

    The design of the ZEUS Calorimeter First Level Trigger (CFLT) is presented. The CFLT utilizes a pipelined architecture to provide trigger data for a global first level trigger decision 5 μsec after each beam crossing, occurring every 96 nsec. The charges from 13K phototubes are summed into 1792 trigger tower pulse heights which are digitized by flash ADCs. The digital values are linearized, stored, and used for sums and pattern tests. Summary data is forwarded to the Global First Level Trigger for each crossing 2 μsec after the crossing occurred. The CFLT determines the total energy, the total transverse energy, the missing energy, and the energy and number of isolated electrons and muons. It also provides information on the electromagnetic and hadronic energy deposited in various regions of the calorimeter. The CFLT has kept the experimental trigger rate below ∼200 Hz at the highest luminosity experienced at HERA. Performance studies suggest that the CFLT will keep the trigger rate below 1 kHz against a rate of proton-beam gas interactions on the order of the 100 kHz expected at design luminosity. (orig.)
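The global sums a calorimeter first-level trigger forms from tower deposits (total energy, total transverse energy, missing transverse energy) can be sketched as follows. This is an illustrative reconstruction of the arithmetic only, not the actual pipelined CFLT firmware; the function name and the tower format are assumptions:

```python
import math

def level1_sums(towers):
    """Global calorimeter sums from trigger towers (illustrative).
    Each tower is a tuple (E, theta, phi): energy in GeV and the
    tower's polar/azimuthal angles in radians."""
    e_tot = sum(e for e, _, _ in towers)
    # transverse energy components of each tower
    ex = sum(e * math.sin(th) * math.cos(ph) for e, th, ph in towers)
    ey = sum(e * math.sin(th) * math.sin(ph) for e, th, ph in towers)
    et_tot = sum(e * math.sin(th) for e, th, _ in towers)
    e_miss = math.hypot(ex, ey)  # missing transverse energy (vector imbalance)
    return e_tot, et_tot, e_miss

# two back-to-back 10 GeV transverse deposits: balanced, no missing energy
towers = [(10.0, math.pi / 2, 0.0), (10.0, math.pi / 2, math.pi)]
print(level1_sums(towers))  # → (20.0, 20.0, ~0)
```

The pattern tests (isolated electrons and muons) act on local tower neighbourhoods rather than these global sums, so they are not sketched here.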

  2. The D0 run II trigger system

    International Nuclear Information System (INIS)

    Schwienhorst, Reinhard; Michigan State U.

    2004-01-01

    The D0 detector at the Fermilab Tevatron was upgraded for Run II. This upgrade included improvements to the trigger system in order to handle the increased Tevatron luminosity and higher bunch crossing rates compared to Run I. The D0 Run II trigger is a highly flexible system to select events to be written to tape from an initial interaction rate of about 2.5 MHz. This is done in a three-tier pipelined, buffered system. The first tier (level 1) processes fast detector pick-off signals in a hardware/firmware based system to reduce the event rate to about 1.5 kHz. The second tier (level 2) uses information from level 1 and forms simple physics objects to reduce the rate to about 850 Hz. The third tier (level 3) uses full detector readout and event reconstruction on a filter farm to reduce the rate to 20-30 Hz. The D0 trigger menu contains a wide variety of triggers. While the emphasis is on triggering on generic lepton and jet final states, there are also trigger terms for specific final-state signatures. In this document we describe the D0 trigger system as it was implemented and is currently operating in Run II.

  3. Mismatch or cumulative stress : Toward an integrated hypothesis of programming effects

    NARCIS (Netherlands)

    Nederhof, Esther; Schmidt, Mathias V.

    2012-01-01

    This paper integrates the cumulative stress hypothesis with the mismatch hypothesis, taking into account individual differences in sensitivity to programming. According to the cumulative stress hypothesis, individuals are more likely to suffer from disease as adversity accumulates. According to the

  4. Stochastic evaluation of the dynamic response and the cumulative damage of nuclear power plant piping

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hanaoka, Masaaki

    1981-01-01

    This report deals with a fundamental study evaluating the uncertainties of the nuclear piping response and cumulative damage under excess-earthquake loadings. The main purposes of this study cover the following problems: (1) experimental estimation of the uncertainties in the dynamic response and cumulative failure using a piping test model; (2) numerical simulation by the Monte Carlo method under the assumption that the relation between restoring force and deformation is perfectly elasto-plastic (checking the mathematical model); (3) development of a conventional uncertainty estimation method by introducing a perturbation technique based on an appropriate equivalently linearized approach (checking the estimation technique); (4) application of this method to more realistic cases. Through the above procedures some important results are obtained. First, fundamental statistical properties of the natural frequencies and of the number of cycles to failure crack initiation are evaluated. Second, the effects of the frequency fluctuation and the yielding fluctuation are estimated and examined through the Monte Carlo simulation technique. It has become clear that the yielding fluctuation has a significant effect on the piping response up to its failure initiation. Finally, some results obtained with the proposed perturbation technique are discussed. The estimated statistical properties agree fairly well with those from numerical simulation. (author)

  5. Uncertainty analysis technique of dynamic response and cumulative damage properties of piping system

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hara, Fumio; Hanaoka, Masaaki; Yamashita, Tadashi.

    1982-01-01

    It is a technologically important subject to establish a method of uncertainty analysis that statistically examines the variation of the earthquake response and damage properties of equipment and piping systems due to changes in input load and structural parameters, for evaluating the aseismatic capability and dynamic structural reliability of these systems. The uncertainty in the response and damage properties when equipment and piping systems are subjected to excessive vibration load depends mainly on the irregularity of the acting input load, such as the unsteady vibration of earthquakes, and on structural uncertainty in forms and dimensions. This study is a basic one aimed at establishing, with a simple model, a method for evaluating the uncertainty in the cumulative damage property at resonant vibration of a piping system due to the dispersion of structural parameters. First, piping models of simple form were broken by resonant vibration, and the uncertainty in the cumulative damage property was evaluated. Next, response analysis using an elasto-plastic mechanics model was performed by numerical simulation. Finally, a method of uncertainty analysis for response and damage properties by the perturbation method utilizing equivalent linearization was proposed, and its validity was demonstrated. (Kako, I.)

  6. Highway travel time information system based on cumulative count curves and new tracking technologies

    Energy Technology Data Exchange (ETDEWEB)

    Soriguera Marti, F.; Martinez-Diaz, M.; Perez Perez, I.

    2016-07-01

    Travel time is probably the most important indicator of the level of service of a highway, and it is also the most appreciated information for its users. Administrations and private companies make increasing efforts to improve its real-time estimation. The appearance of new technologies makes the precise measurement of travel times easier than ever before. However, direct measurements of travel time are, by nature, outdated in real time, and lack the desired forecasting capabilities. This paper introduces a new methodology to improve the real-time estimation of travel times by using the equipment usually present on most highways, i.e., loop detectors, in combination with Automatic Vehicle Identification or Tracking Technologies. One of the most important features of the method is the use of cumulative counts at detectors as an input, avoiding the drawbacks of common spot-speed methodologies. Cumulative count curves have great potential for freeway travel time information systems, as they provide spatial measurements and thus allow the calculation of instantaneous travel times. In addition, they exhibit predictive capabilities. Nevertheless, they have not been used extensively, mainly because of the error introduced by the accumulation of the detector drift. The proposed methodology solves this problem by correcting the deviations using direct travel time measurements. The method is highly beneficial in terms of both its accuracy and its low implementation cost. (Author)
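The core idea of cumulative count curves can be sketched as follows: if N(t) is the cumulative vehicle count at a detector, the travel time of the n-th vehicle is the horizontal distance between the upstream and downstream curves. The sketch assumes FIFO traffic and zero detector drift (the drift correction is the paper's contribution and is not reproduced here); the function name and toy data are illustrative:

```python
import numpy as np

def travel_time_from_counts(t_up, n_up, t_down, n_down, n):
    """Travel time of the n-th vehicle as the horizontal distance
    between the upstream and downstream cumulative count curves
    (FIFO assumed, no detector drift)."""
    # invert each N(t) curve by linear interpolation: when was count n reached?
    t_enter = np.interp(n, n_up, t_up)
    t_exit = np.interp(n, n_down, t_down)
    return t_exit - t_enter

# toy example: constant 10 veh/min flow, section traversed in 3 minutes
t = np.arange(0.0, 30.0, 1.0)                          # minutes
n_upstream = 10.0 * t                                  # N(t) at section entry
n_downstream = np.clip(10.0 * (t - 3.0), 0.0, None)    # same counts, 3 min later
print(travel_time_from_counts(t, n_upstream, t, n_downstream, 50.0))  # → 3.0
```

Because the downstream curve is already known up to the present, the same construction extrapolates naturally a short distance into the future, which is the predictive capability the abstract mentions.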

  7. A damage cumulation method for crack initiation prediction under non proportional loading and overloading

    International Nuclear Information System (INIS)

    Taheri, S.

    1992-04-01

    For a sequence of constant-amplitude cyclic loadings containing overloads, we propose a method for damage cumulation under non-proportional loading. The method uses as data the cyclic stabilized states under non-proportional loading and the initiation (fatigue) curve in the uniaxial case. For that, we take into account the dependence of the Cyclic Strain-Stress Curves (C.S.S.C.) and of the mean cell size on prehardening, and we define a stabilized uniaxial state cyclically equivalent to a non-proportional stabilized state through a family of C.S.S.C. Although simple assumptions such as a linear damage function and linear cumulation are used, we obtain a sequence effect for difficult cross-slip materials such as 316 stainless steel, but recover the Miner rule for easy cross-slip materials. We then show differences between a load-controlled test and a strain-controlled test: for 316 stainless steel in a load-controlled test, the non-proportional loading at each cycle is less damaging than the uniaxial one for the same equivalent stress, while the result is the opposite in a strain-controlled test. We also show that an overload retards initiation in a load-controlled test while it accelerates initiation in a strain-controlled test. (author). 26 refs., 8 figs

  8. Measurement of multi-particle azimuthal correlations with the subevent cumulant method with the ATLAS detector

    CERN Document Server

    Zhou, Mingliang; The ATLAS collaboration

    2017-01-01

    The measurements of the four-particle cumulant and the anisotropic elliptic flow coefficient for the second harmonic, $c_2\{4\}$ and $v_2\{4\}$, are presented using $pp$ data at $\sqrt{s}=5.02$ and $\sqrt{s}=13$ TeV, and $p$+Pb data at $\sqrt{s_{\text{NN}}}=5.02$ TeV. These measurements aim to assess the collective nature of multi-particle production. While collectivity is well established in $p$+Pb and Pb+Pb collisions, its evidence in $pp$ collisions is contested because of larger non-flow contributions. The values of $c_2\{4\}$ are calculated using the standard cumulant method and the recently proposed two- and three-subevent methods, which can further suppress the non-flow contributions in small systems. In these collision systems, the three-subevent method gives a negative $c_2\{4\}$, and thus a well-defined $v_2\{4\}$. The magnitude of $c_2\{4\}$ is found to be nearly independent of $\langle N_{\text{ch}} \rangle$, and the third harmonic $c_3\{4\}$ is consistent with 0. $v_2\{4\}$ is found to be smaller than...
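For reference, the cumulants used above follow the standard definitions from the flow literature (a standard relation, not restated in this record). With single-event azimuthal correlators averaged over particles, and double brackets denoting the further average over events:

```latex
\begin{align*}
\langle 2 \rangle &= \left\langle e^{\,in(\varphi_1-\varphi_2)} \right\rangle, &
\langle 4 \rangle &= \left\langle e^{\,in(\varphi_1+\varphi_2-\varphi_3-\varphi_4)} \right\rangle,\\
c_n\{4\} &= \langle\langle 4 \rangle\rangle - 2\,\langle\langle 2 \rangle\rangle^{2}, &
v_n\{4\} &= \sqrt[4]{-\,c_n\{4\}}.
\end{align*}
```

A negative $c_2\{4\}$ is thus exactly what makes $v_2\{4\}$ real-valued, as the abstract notes; the subevent variants compute the same quantities with the four particles drawn from separated pseudorapidity intervals to suppress non-flow correlations.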

  9. ATLAS: triggers for B-physics

    International Nuclear Information System (INIS)

    George, Simon

    2000-01-01

    The LHC will produce bb-bar events at an unprecedented rate. The number of events recorded by ATLAS will be limited by the rate at which they can be stored offline and subsequently analysed. Despite the huge number of events, the small branching ratios mean that analysis of many of the most interesting channels for CP violation and other measurements will be limited by statistics. The challenge for the Trigger and Data Acquisition (DAQ) system is therefore to maximise the fraction of interesting B decays in the B-physics data stream. The ATLAS Trigger/DAQ system is split into three levels. The initial B-physics selection is made in the first-level trigger by an inclusive low-p_T muon trigger (∼6 GeV). The second-level trigger strategy is based on identifying classes of final states by their partial reconstruction. The muon trigger is confirmed before proceeding to a track search. Electron/hadron separation is given by the transition radiation tracking detector and the electromagnetic calorimeter. Muon identification is possible using the muon detectors and the hadronic calorimeter. From silicon strips, pixels and straw tracking, precise track reconstruction is used to make selections based on invariant mass, momentum and impact parameter. The ATLAS trigger group is currently engaged in algorithm development and performance optimisation for the B-physics trigger. This is closely coupled to the R&D programme for the higher-level triggers. Together the two programmes of work will optimise the hardware, architecture and algorithms to meet the challenging requirements. This paper describes the current status and progress of this work.

  10. Science and societal partnerships to address cumulative impacts

    Directory of Open Access Journals (Sweden)

    Carolyn J Lundquist

    2016-02-01

    Funding and priorities for ocean research are not separate from the underlying sociological, economic, and political landscapes that determine values attributed to ecological systems. Here we present a variation on science prioritisation exercises, focussing on inter-disciplinary research questions with the objective of shifting broad-scale management practices to better address cumulative impacts and multiple users. Marine scientists in New Zealand from a broad range of scientific and social-scientific backgrounds ranked 48 statements of research priorities. At a follow-up workshop, participants discussed five over-arching themes based on survey results. These themes were used to develop mechanisms to increase the relevance and efficiency of scientific research while acknowledging socio-economic and political drivers of research agendas in New Zealand’s ocean ecosystems. Overarching messages included the need to: (1) determine the conditions under which ‘surprises’ (sudden and substantive undesirable changes) are likely to occur and the socio-ecological implications of such changes; (2) develop methodologies to reveal the complex and cumulative effects of change in marine systems, and their implications for resource use, stewardship, and restoration; (3) assess potential solutions to management issues that balance long-term and short-term benefits and encompass societal engagement in decision-making; (4) establish effective and appropriately resourced institutional networks to foster collaborative, solution-focused marine science; and (5) establish cross-disciplinary dialogues to translate diverse scientific and social-scientific knowledge into innovative regulatory, social and economic practice. In the face of multiple uses and cumulative stressors, ocean management frameworks must be adapted to build a collaborative framework across science, governance and society that can help stakeholders navigate uncertainties and socio-ecological surprises.

  11. Cumulative risk hypothesis: Predicting and preventing child maltreatment recidivism.

    Science.gov (United States)

    Solomon, David; Åsberg, Kia; Peer, Samuel; Prince, Gwendolyn

    2016-08-01

    Although Child Protective Services (CPS) and other child welfare agencies aim to prevent further maltreatment in cases of child abuse and neglect, recidivism is common. Having a better understanding of recidivism predictors could aid in preventing additional instances of maltreatment. A previous study identified two CPS interventions that predicted recidivism: psychotherapy for the parent, which was related to a reduced risk of recidivism, and temporary removal of the child from the parent's custody, which was related to an increased recidivism risk. However, counter to expectations, this previous study did not identify any other specific risk factors related to maltreatment recidivism. For the current study, it was hypothesized that (a) cumulative risk (i.e., the total number of risk factors) would significantly predict maltreatment recidivism above and beyond intervention variables in a sample of CPS case files and that (b) therapy for the parent would be related to a reduced likelihood of recidivism. Because it was believed that the relation between temporary removal of a child from the parent's custody and maltreatment recidivism is explained by cumulative risk, the study also hypothesized that the relation between temporary removal of the child from the parent's custody and recidivism would be mediated by cumulative risk. After performing a hierarchical logistic regression analysis, the first two hypotheses were supported, and an additional predictor, psychotherapy for the child, also was related to reduced chances of recidivism. However, Hypothesis 3 was not supported, as risk did not significantly mediate the relation between temporary removal and recidivism. Copyright © 2016 Elsevier Ltd. All rights reserved.
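
    The kind of logistic model described, in which the count of risk factors (cumulative risk) and intervention variables jointly predict recidivism, can be sketched as follows. Every coefficient value here is invented for illustration; none is taken from the study.

```python
import math

def recidivism_prob(cumulative_risk, parent_therapy,
                    b0=-1.0, b_risk=0.35, b_therapy=-0.8):
    """Predicted probability from a logistic model of the form:
    log-odds(recidivism) = b0 + b_risk * (count of risk factors)
                              + b_therapy * (1 if parent received therapy).
    Coefficients are hypothetical, chosen only to show the mechanics."""
    z = b0 + b_risk * cumulative_risk + b_therapy * parent_therapy
    return 1.0 / (1.0 + math.exp(-z))

# Few risk factors plus parental therapy vs. many risk factors, no therapy.
p_low_risk = recidivism_prob(cumulative_risk=1, parent_therapy=1)
p_high_risk = recidivism_prob(cumulative_risk=6, parent_therapy=0)
```

    In a hierarchical fit, the intervention variables would enter in a first block and cumulative risk in a second, so that hypothesis (a) is tested as the improvement in fit from the added block.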

  12. Cumulative or delayed nephrotoxicity after cisplatin (DDP) treatment.

    Science.gov (United States)

    Pinnarò, P; Ruggeri, E M; Carlini, P; Giovannelli, M; Cognetti, F

    1986-04-30

    The present retrospective study reports data regarding renal toxicity in 115 patients (63 males, 52 females; median age, 56 years) who received cumulative doses of cisplatin (DDP) greater than or equal to 200 mg/m2. DDP was administered alone or in combination at a dose of 50-70 mg/m2 in 91 patients, and at a dose of 100 mg/m2 in 22 patients. Two patients, after progression of ovarian carcinoma treated with conventional doses of DDP, received 4 and 2 courses, respectively, of high-dose DDP (40 mg/m2 for 5 days) in hypertonic saline. The median number of DDP courses was 6 (range 2-14), and the median cumulative dose was 350 mg/m2 (range, 200-1200). Serum creatinine and urea nitrogen were determined before initiating the treatment and again 13-16 days after each administration. The incidence of azotemia (creatinine levels that exceeded 1.5 mg/dl) was similar before (7.8%) and after (6.1%) DDP doses of 200 mg/m2. Azotemia appears to be related to the association of DDP with other potentially nephrotoxic antineoplastic drugs (methotrexate) more than to the dose per course of DDP. Of 59 patients followed for 2 months or more after discontinuing the DDP treatment, 3 (5.1%) presented creatinine values higher than 1.5 mg/dl. The data deny that the incidence of nephrotoxicity is higher in patients receiving higher cumulative doses of DDP and confirm that increases in serum creatinine levels may occur some time after discontinuation of the drug.

  13. Thesis Proposal

    DEFF Research Database (Denmark)

    Sloth, Erik

    2010-01-01

    The structure of the thesis proposal is as follows: first, I present my concrete empirical research projects, which are to result in the dissertation's articles. I then present the theoretical considerations on the concept of experience and on consumer culture theory that form the background for how I arrived at...

  14. Droop-Free Distributed Control with Event-Triggered Communication in DC Micro-Grid

    DEFF Research Database (Denmark)

    Han, Renke; Aldana, Nelson Leonardo Diaz; Meng, Lexuan

    2017-01-01

    A novel nonlinear droop-free distributed controller is proposed to achieve accurate current sharing and eliminate voltage drops in a dc Micro-Grid (MG). Then, by introducing a sample-and-hold scheme, the proposed controller is extended to an event-triggered controller, which is designed...

  15. The proportional odds cumulative incidence model for competing risks

    DEFF Research Database (Denmark)

    Eriksson, Frank; Li, Jianing; Scheike, Thomas

    2015-01-01

    We suggest an estimator for the proportional odds cumulative incidence model for competing risks data. The key advantage of this model is that the regression parameters have the simple and useful odds ratio interpretation. The model has been considered by many authors, but it is rarely used...... in practice due to the lack of reliable estimation procedures. We suggest such procedures and show that their performance improves considerably on existing methods. We also suggest a goodness-of-fit test for the proportional odds assumption. We derive the large sample properties and provide estimators...
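
    The odds-ratio interpretation can be made concrete with a small sketch: under the proportional odds model, a covariate multiplies the odds F/(1−F) of the baseline cumulative incidence by exp(x'β). The baseline value and the coefficient below are illustrative numbers, not estimates from the paper.

```python
import math

def cum_incidence(F0, x_beta):
    """Cumulative incidence under the proportional odds model:
    odds(F(t|x)) = exp(x'beta) * odds(F0(t)), where odds(F) = F / (1 - F)."""
    odds = math.exp(x_beta) * F0 / (1.0 - F0)
    return odds / (1.0 + odds)

# Baseline cumulative incidence 0.2 (odds 0.25); a covariate with
# odds ratio 2 (beta = ln 2) doubles the odds to 0.5, giving F = 1/3.
F = cum_incidence(0.2, math.log(2.0))
```

    Because β acts multiplicatively on the odds at every time point, a fitted coefficient reads directly as "this covariate doubles (or halves) the odds of the event having occurred by time t".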

  16. Cumulative exposure to phthalates from phthalate-containing drug products

    DEFF Research Database (Denmark)

    Ennis, Zandra Nymand; Broe, Anne; Pottegård, Anton

    2018-01-01

    European regulatory limit of exposure ranging between 380-1710 mg/year throughout the study period. Lithium-products constituted the majority of dibutyl phthalate exposure. Diethyl phthalate exposure, mainly caused by erythromycin, theophylline and diclofenac products, did not exceed the EMA regulatory...... to quantify annual cumulated phthalate exposure from drug products among users of phthalate-containing oral medications in Denmark throughout the period of 2004-2016. METHODS: We conducted a Danish nationwide cohort study using The Danish National Prescription Registry and an internal database held...

  17. Lyapunov exponent of the random frequency oscillator: cumulant expansion approach

    International Nuclear Information System (INIS)

    Anteneodo, C; Vallejos, R O

    2010-01-01

    We consider a one-dimensional harmonic oscillator with a random frequency, focusing on both the standard and the generalized Lyapunov exponents, λ and λ* respectively. We discuss the difficulties that arise in the numerical calculation of λ* in the case of strong intermittency. When the frequency corresponds to an Ornstein-Uhlenbeck process, we compute λ* analytically by using a cumulant expansion including up to the fourth order. Connections with the problem of finding an analytical estimate for the largest Lyapunov exponent of a many-body system with smooth interactions are discussed.
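
    A numerical baseline for the standard exponent λ can be obtained by evolving a tangent vector and renormalizing it at every step (Benettin-style). This is only a sketch under assumed parameters: white frequency noise rather than the Ornstein-Uhlenbeck process of the paper, and a plain Euler-Maruyama step; the expected qualitative behaviour is that multiplicative frequency noise makes λ positive.

```python
import math, random

def lyapunov_exponent(sigma=1.0, dt=1e-3, steps=500_000, seed=42):
    """Estimate the standard Lyapunov exponent of x'' = -(1 + sigma*xi(t)) x
    with white-noise xi, via an Euler-Maruyama step and per-step
    renormalization of the state (x, v)."""
    random.seed(seed)
    x, v = 1.0, 0.0
    log_growth = 0.0
    sqdt = math.sqrt(dt)
    for _ in range(steps):
        dW = random.gauss(0.0, 1.0) * sqdt
        # dx = v dt ; dv = -x dt - sigma * x dW
        x, v = x + v * dt, v - x * dt - sigma * x * dW
        norm = math.hypot(x, v)
        log_growth += math.log(norm)
        x, v = x / norm, v / norm  # renormalize to avoid overflow
    return log_growth / (steps * dt)

lam = lyapunov_exponent()
```

    The generalized exponent λ* is harder precisely because it weights the rare large excursions that such a direct average suppresses, which is the intermittency problem the abstract refers to.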

  18. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  19. Numerical simulation of explosive magnetic cumulative generator EMG-720

    Energy Technology Data Exchange (ETDEWEB)

    Deryugin, Yu N; Zelenskij, D K; Kazakova, I F; Kargin, V I; Mironychev, P V; Pikar, A S; Popkov, N F; Ryaslov, E A; Ryzhatskova, E G [All-Russian Research Inst. of Experimental Physics, Sarov (Russian Federation)

    1997-12-31

    The paper discusses the methods and results of numerical simulations used in the development of a helical-coaxial explosive magnetic cumulative generator (EMG) with the stator up to 720 mm in diameter. In the process of designing, separate units were numerically modeled, as was the generator operation with a constant inductive-ohmic load. The 2-D processes of the armature acceleration by the explosion products were modeled as well as those of the formation of the sliding high-current contact between the armature and stator's insulated turns. The problem of the armature integrity in the region of the detonation waves collision was numerically analyzed. 8 figs., 2 refs.

  20. Cumulative exergy losses associated with the production of lead metal

    Energy Technology Data Exchange (ETDEWEB)

    Szargut, J [Technical Univ. of Silesia, Gliwice (PL). Inst. of Thermal-Engineering; Morris, D R [New Brunswick Univ., Fredericton, NB (Canada). Dept. of Chemical Engineering

    1990-08-01

    Cumulative exergy losses result from the irreversibility of the links of a technological network leading from raw materials and fuels extracted from nature to the product under consideration. The sum of these losses can be apportioned into partial exergy losses (associated with particular links of the technological network) or into constituent exergy losses (associated with constituent subprocesses of the network). The methods of calculation of the partial and constituent exergy losses are presented, taking into account the useful byproducts substituting the major products of other processes. Analyses of partial and constituent exergy losses are made for the technological network of lead metal production. (author).

  1. The CMS Barrel Muon trigger upgrade

    International Nuclear Information System (INIS)

    Triossi, A.; Sphicas, P.; Bellato, M.; Montecassiano, F.; Ventura, S.; Ruiz, J.M. Cela; Bedoya, C. Fernandez; Tobar, A. Navarro; Fernandez, I. Redondo; Ferrero, D. Redondo; Sastre, J.; Ero, J.; Wulz, C.; Flouris, G.; Foudas, C.; Loukas, N.; Mallios, S.; Paradas, E.; Guiducci, L.; Masetti, G.

    2017-01-01

    The increase of luminosity expected at the LHC during Phase 1 will impose tighter constraints for rate reduction in order to maintain high efficiency in the CMS Level-1 trigger system. The TwinMux system is the early layer of the muon barrel region that concentrates the information from different subdetectors: Drift Tubes, Resistive Plate Chambers and the Outer Hadron Calorimeter. It arranges the slow optical trigger links from the detector chambers into faster links (10 Gbps) that are sent in multiple copies to the track finders. Results from collision runs, which confirm the satisfactory operation of the trigger system up to the output of the barrel track finder, will be shown.

  2. Electronic trigger for the ASP experiment

    International Nuclear Information System (INIS)

    Wilson, R.J.

    1985-11-01

    The Anomalous Single Photon (ASP) electronic trigger is described. The experiment is based on an electromagnetic calorimeter composed of arrays of lead glass blocks, read out with photomultiplier tubes, surrounding the interaction point at the PEP storage ring. The primary requirement of the trigger system is to be sensitive to low energy (approx. 0.5 GeV and above) photons whilst discriminating against high backgrounds at PEP. Analogue summing of the PMT signals and a sequence of programmable digital look-up tables produces a "dead-timeless" trigger for the beam collision rate of 408 kHz. 6 refs., 6 figs

  3. The LHCb trigger in Run II

    CERN Document Server

    Michielin, Emanuele

    2016-01-01

    The LHCb trigger system has been upgraded to allow alignment, calibration and physics analysis to be performed in real time. An increased CPU capacity and improvements in the software have allowed lifetime unbiased selections of beauty and charm decays in the high level trigger. Thanks to offline quality event reconstruction already available online, physics analyses can be performed directly on this information and for the majority of charm physics selections a reduced event format can be written out. Beauty hadron decays are more efficiently triggered by re-optimised inclusive selections, and the HLT2 output event rate is increased by a factor of three.

  4. Expansion formulae for characteristics of cumulative cost in finite horizon production models

    NARCIS (Netherlands)

    Ayhan, H.; Schlegel, S.

    2001-01-01

    We consider the expected value and the tail probability of cumulative shortage and holding cost (i.e. the probability that cumulative cost is more than a certain value) in finite horizon production models. An exact expression is provided for the expected value of the cumulative cost for general

  5. Correlation between thermal gradient and flexure-type deformation as a potential trigger for exfoliation-related rock falls (Invited)

    Science.gov (United States)

    Collins, B. D.; Stock, G. M.

    2010-12-01

    Stress-induced exfoliation of granitic rocks is an important means by which cliffs deform and subsequently erode. During exfoliation, fractures are formed, and when exposed in cliff faces, are susceptible to subsequent rock falls. This is the case in Yosemite National Park, California, where exfoliation continues to play a primary role in cliff evolution. In Yosemite, numerous mechanisms are inferred to trigger rock falls; nevertheless, many rock falls have no recognized triggers. As a result, several potential, but as yet unquantified, triggering mechanisms have been proposed. One of these, thermally induced flexure, wherein solar radiation and temperature variation drives cumulative deformation of partially detached rock flakes, has the potential to explain several recent rock falls in Yosemite. We explore this potential mechanism by quantifying the deformation, temperature, and solar radiation exposure of a near-vertical rock flake in Yosemite Valley. The flake, 14 m tall, 4 m wide and 12 cm thick, receives direct sunlight during most of the day. Whereas the flake is attached to the cliff face at its bottom and top, the sides are detached from the cliff by a 10 cm wide crack on one side, tapering to a 1 cm wide crack on the opposite side. Instrumentation consists of three custom-designed crackmeters placed between the flake and the adjacent cliff face, three air temperature sensors located behind the flake, and three dual air temperature-light sensors located on the outside surface of the flake. Nearby relative humidity and barometric pressure sensors complete the instrumentation. Five-minute interval data from spring - fall 2010 indicate the flake undergoes maximum deformation at mid-span between attachment points and that it deforms from both diurnal and climatic temperature fluctuations. Recorded maximum deformations, measured perpendicular to crack orientation, are 1 cm diurnally and nearly 1.5 cm (including diurnal effect) over a 5-day period of cooler

  6. Instrumentation of the upgraded ATLAS tracker with a double buffer front-end architecture for track triggering

    International Nuclear Information System (INIS)

    Wardrope, D

    2012-01-01

    The Large Hadron Collider will be upgraded to provide instantaneous luminosity L = 5 × 10^34 cm^−2 s^−1, leading to excessive rates from the ATLAS Level-1 trigger. A double buffer front-end architecture for the ATLAS tracker replacement is proposed that will enable the use of track information in trigger decisions within 20 μs in order to reduce the high trigger rates. Analysis of ATLAS simulations has found that using track information will enable the use of single lepton triggers with transverse momentum thresholds of p_T ∼ 25 GeV, which will be of great benefit to the future physics programme of ATLAS.

  7. Triggered creep as a possible mechanism for delayed dynamic triggering of tremor and earthquakes

    Science.gov (United States)

    Shelly, David R.; Peng, Zhigang; Hill, David P.; Aiken, Chastity

    2011-01-01

    The passage of radiating seismic waves generates transient stresses in the Earth's crust that can trigger slip on faults far away from the original earthquake source. The triggered fault slip is detectable in the form of earthquakes and seismic tremor. However, the significance of these triggered events remains controversial, in part because they often occur with some delay, long after the triggering stress has passed. Here we scrutinize the location and timing of tremor on the San Andreas fault between 2001 and 2010 in relation to distant earthquakes. We observe tremor on the San Andreas fault that is initiated by passing seismic waves, yet migrates along the fault at a much slower velocity than the radiating seismic waves. We suggest that the migrating tremor records triggered slow slip of the San Andreas fault as a propagating creep event. We find that the triggered tremor and fault creep can be initiated by distant earthquakes as small as magnitude 5.4 and can persist for several days after the seismic waves have passed. Our observations of prolonged tremor activity provide a clear example of the delayed dynamic triggering of seismic events. Fault creep has been shown to trigger earthquakes, and we therefore suggest that the dynamic triggering of prolonged fault creep could provide a mechanism for the delayed triggering of earthquakes. © 2011 Macmillan Publishers Limited. All rights reserved.

  8. Does Twitter trigger bursts in signature collections?

    Directory of Open Access Journals (Sweden)

    Rui Yamaguchi

    INTRODUCTION: The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo," increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. METHODS AND FINDINGS: In order to estimate the contributions of social media, we developed a statistical model with state-space modeling framework that distinguishes the contributions of multiple social media in time-series of collected public opinions. We applied the model to the time-series of signature counts of the campaign and quantified contributions of two social media, i.e., Twitter and an Internet forum, by the estimation. We found that a considerable portion (78%) of the signatures was affected by either of the social media throughout the campaign and the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of both effects suggested distinctions between the social media in terms of sustainable impact of messages or tweets. Twitter shows messages on various topics on a time-line; newer messages push out older ones. Twitter may diminish the impact of messages that are tweeted intermittently. CONCLUSIONS: The quantification of social media impacts is beneficial to better understand people's tendency and may promote developing strategies to engage public opinions effectively. Our proposed method is a promising tool to explore

  9. Does Twitter trigger bursts in signature collections?

    Science.gov (United States)

    Yamaguchi, Rui; Imoto, Seiya; Kami, Masahiro; Watanabe, Kenji; Miyano, Satoru; Yuji, Koichiro

    2013-01-01

    The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo," increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. In order to estimate the contributions of social media, we developed a statistical model with state-space modeling framework that distinguishes the contributions of multiple social media in time-series of collected public opinions. We applied the model to the time-series of signature counts of the campaign and quantified contributions of two social media, i.e., Twitter and an Internet forum, by the estimation. We found that a considerable portion (78%) of the signatures was affected by either of the social media throughout the campaign and the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of both effects suggested distinctions between the social media in terms of sustainable impact of messages or tweets. Twitter shows messages on various topics on a time-line; newer messages push out older ones. Twitter may diminish the impact of messages that are tweeted intermittently. The quantification of social media impacts is beneficial to better understand people's tendency and may promote developing strategies to engage public opinions effectively. Our proposed method is a promising tool to explore information hidden in social phenomena.

  10. The second level trigger system of FAST

    CERN Document Server

    Martínez, G; Berdugo, J; Casaus, J; Casella, V; De Laere, D; Deiters, K; Dick, P; Kirkby, J; Malgeri, L; Mañá, C; Marín, J; Pohl, M; Petitjean, C; Sánchez, E; Willmott, C

    2009-01-01

    The Fibre Active Scintillator Target (FAST) experiment is a novel imaging particle detector currently operating in a high-intensity π+ beam at the Paul Scherrer Institute (PSI), Villigen, Switzerland. The detector is designed to perform a high precision measurement of the μ+ lifetime, in order to determine the Fermi constant, G_F, to 1 ppm precision. A dedicated second level (LV2) hardware trigger system has been developed for the experiment. It performs an online analysis of the π/μ decay chain by identifying the stopping position of each beam particle and detecting the subsequent appearance of the muon. The LV2 trigger then records the muon stop pixel and selectively triggers the Time-to-Digital Converters (TDCs) in the vicinity. A detailed description of the trigger system is presented in this paper.

  11. The second level trigger system of FAST

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, G. [CIEMAT, Avenida Complutense 22, 28040 Madrid (Spain)], E-mail: gustavo.martinez@ciemat.es; Barcyzk, A. [CERN, CH-1211 Geneva 23 (Switzerland); Berdugo, J.; Casaus, J. [CIEMAT, Avenida Complutense 22, 28040 Madrid (Spain); Casella, C.; De Laere, S. [Universite de Geneve, 30 quai Ernest-Anserment, CH-1211 Geneva 4 (Switzerland); Deiters, K.; Dick, P. [Paul Scherrer Institut, 5232 Villigen PSI (Switzerland); Kirkby, J.; Malgeri, L. [CERN, CH-1211 Geneva 23 (Switzerland); Mana, C.; Marin, J. [CIEMAT, Avenida Complutense 22, 28040 Madrid (Spain); Pohl, M. [Universite de Geneve, 30 quai Ernest-Anserment, CH-1211 Geneva 4 (Switzerland); Petitjean, C. [Paul Scherrer Institut, 5232 Villigen PSI (Switzerland); Sanchez, E.; Willmott, C. [CIEMAT, Avenida Complutense 22, 28040 Madrid (Spain)

    2009-10-11

    The Fibre Active Scintillator Target (FAST) experiment is a novel imaging particle detector currently operating in a high-intensity π+ beam at the Paul Scherrer Institute (PSI), Villigen, Switzerland. The detector is designed to perform a high precision measurement of the μ+ lifetime, in order to determine the Fermi constant, G_F, to 1 ppm precision. A dedicated second level (LV2) hardware trigger system has been developed for the experiment. It performs an online analysis of the π/μ decay chain by identifying the stopping position of each beam particle and detecting the subsequent appearance of the muon. The LV2 trigger then records the muon stop pixel and selectively triggers the Time-to-Digital Converters (TDCs) in the vicinity. A detailed description of the trigger system is presented in this paper.

  12. SSC physics signatures and trigger requirements

    International Nuclear Information System (INIS)

    1985-01-01

    Strategies are considered for triggering on new physics processes in the environment of the SSC, where interaction rates will be very high and most new physics processes quite rare. The quantities available for use in the trigger at various levels are related to the signatures of possible new physics. Two examples were investigated in some detail using the ISAJET Monte Carlo program: Higgs decays to W pairs, and a missing energy trigger applied to gluino pair production. In both of the examples studied in detail, it was found that workable strategies for reducing the trigger rate were obtainable which also produced acceptable efficiency for the processes of interest. In future work, it will be necessary to carry out such a program for the full spectrum of suggested new physics

  13. Graphics Processing Units for HEP trigger systems

    International Nuclear Information System (INIS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.

    2016-01-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPUs for synchronous low-level triggers, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  14. Pulling the trigger on LHC electronics

    CERN Document Server

    CERN. Geneva

    2001-01-01

    The conditions at CERN's Large Hadron Collider pose severe challenges for the designers and builders of front-end, trigger and data acquisition electronics. A recent workshop reviewed the encouraging progress so far and discussed what remains to be done. The LHC experiments have addressed level one trigger systems with a variety of high-speed hardware. The CMS Calorimeter Level One Regional Trigger uses 160 MHz logic boards plugged into the front and back of a custom backplane, which provides point-to-point links between the cards. Much of the processing in this system is performed by five types of 160 MHz digital applications-specific integrated circuits designed using Vitesse submicron high-integration gallium arsenide gate array technology. The LHC experiments make extensive use of field programmable gate arrays (FPGAs). These offer programmable reconfigurable logic, which has the flexibility that trigger designers need to be able to alter algorithms so that they can follow the physics and detector perform...

  15. Boredom and Passion: Triggers of Habitual Entrepreneurship

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle

    Being case-based, the study identifies eight factors which contribute to consecutive venture creation. The findings suggest that boredom and passion are necessary conditions triggering habitual entrepreneurship. Other important mechanisms included the joy of discovering and exploiting an opportunity...

  16. D0 triggering and data acquisition

    International Nuclear Information System (INIS)

    Gibbard, B.

    1992-10-01

    The trigger for D0 is a multi-tier system. Within the 3.5 μs bunch-crossing interval, custom electronics select interesting event candidates based on electromagnetic and hadronic energy deposits in the calorimeter and on indications of tracks in the muon system. Subsequent hardware decisions use refined calculations of electron and muon characteristics. The highest-level trigger occurs in one element of a farm of microprocessors, where fully developed algorithms for electrons, muons, jets, or missing E_T are executed. This highest-level trigger also provides the assembly of the event into its final data structure. Performance of this trigger and data acquisition system in collider operation is described.

  17. Triggering and data acquisition general considerations

    International Nuclear Information System (INIS)

    Butler, Joel N.

    2003-01-01

    We provide a general introduction to trigger and data acquisition systems in high-energy physics. We emphasize the new possibilities and approaches opened up by developments in computer technology and networking.

  18. Session summary: Electronics, triggering and data acquisition

    International Nuclear Information System (INIS)

    Rescia, S.

    1991-12-01

    The session focused on the requirements for calorimetry at the SSC/LHC. Results on new readout techniques, calibration, radiation-hard electronics and semiconductor devices, analog and digital front-end electronics, and trigger strategies are presented.

  19. Trigger factors in migraine with aura

    DEFF Research Database (Denmark)

    Hauge, A W; Kirchmann, M; Olesen, J

    2010-01-01

    The aim of the present study was to identify trigger factors in migraine with aura (MA). A total of 629 MA patients representative of the Danish population were sent a questionnaire listing 16 trigger factors thought to be relevant, as well as space for free text. Distinction was made between ... attacks with or without aura within each patient. The questionnaire was returned by 522 patients, of whom 347 had current MA attacks. In total 80% with current attacks (278/347) indicated that at least one factor triggered their migraine, and 67% (187/278) in this group indicated that they were aware ... of at least one factor often or always giving rise to an attack of MA. Forty-one per cent (113/278) had co-occurring attacks of migraine without aura (MO). Stress (following stress), bright light, intense emotional influences, stress (during stress) and sleeping too much or too little were the trigger factors ...

  20. The Aurora accelerator's triggered oil switch

    International Nuclear Information System (INIS)

    Weidenheimer, D.M.; Pereira, N.R.; Judy, D.C.; Stricklett, K.L.

    1993-01-01

    Achieving a radiation pulse with 15 ns risetime using all four of the Aurora accelerator's Blumlein pulse-forming lines demands synchronization of the Blumleins to within 10 ns (in addition to a 15 ns risetime for a single line). Timing of each Blumlein is controlled by a triggered 12 MV oil switch. A smaller-than-customary trigger electrode makes the switching time more reproducible. Time-resolved photography of the oil arcs suggests that triggering occurs simultaneously around the sharp edge of the trigger electrode, perhaps with small deviations that grow into the most prominent arcs characteristically seen in open-shutter photographs. However, many smaller arcs that are usually overlooked in open-shutter pictures may contribute to current conduction in a closed switch

  1. Graphics Processing Units for HEP trigger systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R. [INFN Sezione di Roma “Tor Vergata”, Via della Ricerca Scientifica 1, 00133 Roma (Italy); Bauce, M. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Biagioni, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Chiozzi, S.; Cotta Ramusino, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Fantechi, R. [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); CERN, Geneve (Switzerland); Fiorini, M. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Giagu, S. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Gianoli, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Lamanna, G., E-mail: gianluca.lamanna@cern.ch [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); INFN Laboratori Nazionali di Frascati, Via Enrico Fermi 40, 00044 Frascati (Roma) (Italy); Lonardo, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Messina, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); and others

    2016-07-11

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming feasible. We discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  2. Trigger circuits for the PHENIX electromagnetic calorimeter

    International Nuclear Information System (INIS)

    Frank, S.S.; Britton, C.L. Jr.; Winterberg, A.L.; Young, G.R.

    1997-11-01

    Monolithic and discrete circuits have been developed to provide trigger signals for the PHENIX electromagnetic calorimeter detector. These trigger circuits are deadtimeless and create overlapping 4 by 4 energy sums, a cosmic-muon trigger, and a 144-channel energy sum. The front-end electronics of the PHENIX system sample the energy and timing channels at each bunch crossing (BC), but it is not known immediately whether these data are of interest. The information from the trigger circuits is used to determine whether the collected data are of interest and should be digitized and stored, or discarded. This paper presents details of the design, issues affecting circuit performance, characterization of prototypes fabricated in 1.2 μm Orbit CMOS, and integration of the circuits into the EMCal electronics system.
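    The overlapping 4 by 4 energy sums can be modelled offline in a few lines; this minimal Python sketch shows the sliding-window summation only (grid shape and window size here are illustrative, not the actual PHENIX tile segmentation):

```python
def overlapping_sums(towers, w=4):
    """All w-by-w window sums over a 2-D grid of tower energies.

    Each (row, col) offset yields one sum, so adjacent windows
    overlap and every tower contributes to up to w*w sums.
    """
    rows, cols = len(towers), len(towers[0])
    return [
        [sum(towers[r + i][c + j] for i in range(w) for j in range(w))
         for c in range(cols - w + 1)]
        for r in range(rows - w + 1)
    ]
```

    On a 5 by 5 grid of unit energies this yields a 2 by 2 array of window sums, each equal to 16.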

  3. New Fast Interaction Trigger for ALICE

    Energy Technology Data Exchange (ETDEWEB)

    Trzaska, Wladyslaw Henryk

    2017-02-11

    The LHC heavy-ion luminosity and collision rate from 2021 onwards will considerably exceed the design parameters of the present ALICE forward trigger detectors and the introduction of the Muon Forward Tracker (MFT) will significantly reduce the space available for the new trigger detectors. To comply with these conditions a new Fast Interaction Trigger (FIT) will be built. FIT will be the main forward trigger, luminometer, and interaction-time detector. It will also determine multiplicity, centrality, and reaction plane of heavy-ion collisions. FIT will consist of two arrays of Cherenkov quartz radiators with MCP-PMT sensors and of a plastic scintillator ring. By increasing the overall acceptance of FIT, the scintillator will improve centrality and event plane resolution. It will also add sensitivity for the detection of beam-gas events and provide some degree of redundancy. FIT is currently undergoing an intense R&D and prototyping period. It is scheduled for installation in ALICE during 2020.

  4. Cumulative hierarchies and computability over universes of sets

    Directory of Open Access Journals (Sweden)

    Domenico Cantone

    2008-05-01

    Various metamathematical investigations, beginning with Fraenkel's historical proof of the independence of the axiom of choice, called for suitable definitions of hierarchical universes of sets. This led to the discovery of such important cumulative structures as the one singled out by von Neumann (generally taken as the universe of all sets) and Gödel's universe of the so-called constructibles. Variants of those are exploited occasionally in studies concerning the foundations of analysis (according to Abraham Robinson's approach), or concerning non-well-founded sets. We hence offer a systematic presentation of these many structures, partly motivated by their relevance and pervasiveness in mathematics. As we report, numerous properties of hierarchy-related notions, such as rank, have been verified with the assistance of the ÆtnaNova proof-checker. Through SETL and Maple implementations of procedures which effectively handle Ackermann's hereditarily finite sets, we illustrate a particularly significant case among those in which the entities which form a universe of sets can be algorithmically constructed and manipulated; thereby, the fruitful bearing on pure mathematics of cumulative set hierarchies ramifies into the realms of theoretical computer science and algorithmics.
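    As a concrete illustration of how such a universe can be constructed and manipulated algorithmically, Ackermann's classical bijection between natural numbers and hereditarily finite sets fits in a few lines. This Python sketch is only an analogue of the SETL and Maple procedures mentioned above, not a transcription of them:

```python
def ack(s):
    """Ackermann code of a hereditarily finite set (nested frozensets):
    the sum of 2 raised to the codes of its members."""
    return sum(2 ** ack(x) for x in s)

def unack(n):
    """Inverse map: the set bits of n name the members of the set."""
    members, i = set(), 0
    while n:
        if n & 1:
            members.add(unack(i))
        n >>= 1
        i += 1
    return frozenset(members)
```

    For example, `ack(frozenset())` is 0 and `unack(3)` is {∅, {∅}}; the two functions are mutually inverse on every natural number.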

  5. Cumulative Effects Assessment: Linking Social, Ecological, and Governance Dimensions

    Directory of Open Access Journals (Sweden)

    Marian Weber

    2012-06-01

    Setting social, economic, and ecological objectives is ultimately a process of social choice informed by science. In this special feature we provide a multidisciplinary framework for the use of cumulative effects assessment in land use planning. Forest ecosystems are facing considerable challenges driven by population growth and increasing demands for resources. In a suite of case studies that span the boreal forest of Western Canada to the interior Atlantic forest of Paraguay, we show how transparent and defensible methods for scenario analysis can be applied in data-limited regions and how social dimensions of land use change can be incorporated in these methods, particularly in aboriginal communities that have lived in these ecosystems for generations. The case studies explore how scenario analysis can be used to evaluate various land use options and highlight specific challenges with identifying social and ecological responses, determining thresholds and targets for land use, and integrating local and traditional knowledge in land use planning. Given that land use planning is ultimately a value-laden and often politically charged process, we also provide some perspective on various collective and expert-based processes for identifying cumulative impacts and thresholds. The need for good science to inform, and be informed by, culturally appropriate democratic processes calls for well-planned and multifaceted approaches, both to give residents and governments an informed understanding of the interactive and additive changes caused by development, and to design action agendas to influence such change at the ecological and social level.

  6. Maternal distress and parenting in the context of cumulative disadvantage.

    Science.gov (United States)

    Arditti, Joyce; Burton, Linda; Neeves-Botelho, Sara

    2010-06-01

    This article presents an emergent conceptual model of the features and links between cumulative disadvantage, maternal distress, and parenting practices in low-income families in which parental incarceration has occurred. The model emerged from the integration of extant conceptual and empirical research with grounded theory analysis of longitudinal ethnographic data from Welfare, Children, and Families: A Three-City Study. Fourteen exemplar family cases were used in the analysis. Results indicated that mothers in these families experienced life in the context of cumulative disadvantage, reporting a cascade of difficulties characterized by neighborhood worries, provider concerns, bureaucratic difficulties, violent intimate relationships, and the inability to meet children's needs. Mothers, however, also had an intense desire to protect their children, and to make up for past mistakes. Although, in response to high levels of maternal distress and disadvantage, most mothers exhibited harsh discipline of their children, some mothers transformed their distress by advocating for their children under difficult circumstances. Women's use of harsh discipline and advocacy was not necessarily an "either/or" phenomenon as half of the mothers included in our analysis exhibited both harsh discipline and care/advocacy behaviors. Maternal distress characterized by substance use, while connected to harsh disciplinary behavior, did not preclude mothers engaging in positive parenting behaviors.

  7. Cumulative phase delay imaging for contrast-enhanced ultrasound tomography

    International Nuclear Information System (INIS)

    Demi, Libertario; Van Sloun, Ruud J G; Wijkstra, Hessel; Mischi, Massimo

    2015-01-01

    Standard dynamic-contrast-enhanced ultrasound (DCE-US) imaging detects and estimates ultrasound-contrast-agent (UCA) concentration based on the amplitude of the nonlinear (harmonic) components generated during ultrasound (US) propagation through UCAs. However, harmonic component generation is not specific to UCAs, as it also occurs for US propagating through tissue. Moreover, nonlinear artifacts affect standard DCE-US imaging, causing a reduction in the contrast-to-tissue ratio and resulting in possible misclassification of tissue and misinterpretation of UCA concentration. Furthermore, no contrast-specific modality exists for DCE-US tomography; in particular, speed-of-sound changes due to UCAs are well within those caused by different tissue types. Recently, a new marker for UCAs has been introduced: a cumulative phase delay (CPD) between the second harmonic and fundamental component is in fact observable for US propagating through UCAs, and is absent in tissue. In this paper, tomographic US images based on CPD are presented for the first time and compared to speed-of-sound US tomography. Results show the applicability of this marker for contrast-specific US imaging, with cumulative phase delay imaging (CPDI) showing superior capabilities in detecting and localizing UCA compared to speed-of-sound US tomography. Cavities filled with UCA down to 1 mm in diameter were clearly detectable. Moreover, CPDI is free of the above-mentioned nonlinear artifacts. These results open important possibilities for DCE-US tomography, with potential applications to breast imaging for cancer localization. (fast track communication)

  8. Cumulant expansions for measuring water exchange using diffusion MRI

    Science.gov (United States)

    Ning, Lipeng; Nilsson, Markus; Lasič, Samo; Westin, Carl-Fredrik; Rathi, Yogesh

    2018-02-01

    The rate of water exchange across cell membranes is a parameter of biological interest and can be measured by diffusion magnetic resonance imaging (dMRI). In this work, we investigate a stochastic model for the diffusion and exchange of water molecules. This model provides a general solution for the temporal evolution of the dMRI signal using any type of gradient waveform, thereby generalizing the signal expressions for the Kärger model. Moreover, we also derive a general nth-order cumulant expansion of the dMRI signal accounting for water exchange, which has not been explored in earlier studies. Based on this analytical expression, we compute the cumulant expansion for dMRI signals for the special cases of single diffusion encoding (SDE) and double diffusion encoding (DDE) sequences. Our results provide a theoretical guideline on optimizing experimental parameters for SDE and DDE sequences, respectively. Moreover, we show that DDE signals are more sensitive to water exchange at short time scales but provide less attenuation at long time scales than SDE signals. Our theoretical analysis is also validated using Monte Carlo simulations on synthetic structures.
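    The identity underlying any such expansion relates the logarithm of the signal, a characteristic function of the accumulated spin phase φ, to the phase cumulants κ_n. In generic textbook form (not the paper's exchange-specific expression):

```latex
\ln \left\langle e^{-i\phi} \right\rangle
  = \sum_{n=1}^{\infty} \frac{(-i)^n}{n!}\,\kappa_n,
\qquad
\kappa_1 = \langle\phi\rangle, \quad
\kappa_2 = \langle\phi^2\rangle - \langle\phi\rangle^2 .
```

    Truncating at n = 2 for a zero-mean phase recovers the familiar Gaussian-phase attenuation S ≈ exp(-κ_2/2); it is the higher-order terms that carry the sensitivity to exchange discussed in the abstract.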

  9. A Cumulant-based Analysis of Nonlinear Magnetospheric Dynamics

    International Nuclear Information System (INIS)

    Johnson, Jay R.; Wing, Simon

    2004-01-01

    Understanding magnetospheric dynamics and predicting future behavior of the magnetosphere is of great practical interest because it could potentially help to avert catastrophic loss of power and communications. In order to build good predictive models it is necessary to understand the most critical nonlinear dependencies among observed plasma and electromagnetic field variables in the coupled solar wind/magnetosphere system. In this work, we apply a cumulant-based information dynamical measure to characterize the nonlinear dynamics underlying the time evolution of the Dst and Kp geomagnetic indices, given solar wind magnetic field and plasma input. We examine the underlying dynamics of the system, the temporal statistical dependencies, the degree of nonlinearity, and the rate of information loss. We find a significant solar cycle dependence in the underlying dynamics of the system with greater nonlinearity for solar minimum. The cumulant-based approach also has the advantage that it is reliable even in the case of small data sets and therefore it is possible to avoid the assumption of stationarity, which allows for a measure of predictability even when the underlying system dynamics may change character. Evaluations of several leading Kp prediction models indicate that their performances are sub-optimal during active times. We discuss possible improvements of these models based on this nonparametric approach

  10. Strategy for an assessment of cumulative ecological impacts

    International Nuclear Information System (INIS)

    Boucher, P.; Collins, J.; Nelsen, J.

    1995-01-01

    The US Department of Energy (DOE) has developed a strategy to conduct an assessment of the cumulative ecological impact of operations at the 300-square-mile Savannah River Site. This facility has over 400 identified waste units and contains several large watersheds. In addition to individual waste units, residual contamination must be evaluated in terms of its contribution to ecological risks at zonal and site-wide levels. DOE must be able to generate sufficient information to facilitate cleanup in the immediate future within the context of a site-wide ecological risk assessment that may not be completed for many years. The strategy superimposes a more global perspective on ecological assessments of individual waste units and provides strategic underpinnings for conducting individual screening-level and baseline risk assessments at the operable unit and zonal or watershed levels. It identifies ecological endpoints and risk assessment tools appropriate for each level of the risk assessment. In addition, it provides a clear mechanism for identifying clean sites through screening-level risk assessments and for elevating sites with residual contamination to the next level of assessment. Whereas screening-level and operable-unit-level risk assessments relate directly to cleanup, zonal and site-wide assessments verify or confirm the overall effectiveness of remediation. The latter assessments must show, for example, whether multiple small areas with residual pesticide contamination that have minimal individual impact would pose a cumulative risk from bioaccumulation because they are within the habitat range of an ecological receptor.

  11. The new UA1 calorimeter trigger

    International Nuclear Information System (INIS)

    Eisenhandler, E.

    1988-01-01

    The new UA1 first-level calorimeter trigger processor is described, with emphasis on the fast two-dimensional electromagnetic cluster-finding that is its most novel feature. This processor is about five times more powerful than its predecessor, and makes extensive use of pipelining techniques. It allows multiple combinations of triggers on electromagnetic showers, hadronic jets and energy sums, including a total-energy veto of multiple interactions and a full vector sum of missing transverse energy. (author)

  12. The upgrade of the LHCb trigger system

    CERN Document Server

    INSPIRE-00259834; Fitzpatrick, C.; Gligorov, V.; Raven, G.

    2014-10-20

    The LHCb experiment will operate at a luminosity of $2\times10^{33}$ cm$^{-2}$s$^{-1}$ during LHC Run 3. At this rate the present readout and hardware Level-0 trigger become a limitation, especially for fully hadronic final states. In order to maintain a high signal efficiency, the upgraded LHCb detector will deploy two novel concepts: a triggerless readout and a full software trigger.

  13. 76 FR 81490 - Agency Information Collection Activities; Proposed Collection; Comment Request; Contractor...

    Science.gov (United States)

    2011-12-28

    ... Activities; Proposed Collection; Comment Request; Contractor Cumulative Claim and Reconciliation (Renewal... identify the Docket ID Number EPA-HQ-OARM-2011-0997, Contractor Cumulative Claim and Reconciliation. Hand... information collection activity or ICR does this apply to? Affected entities: All contractors who have...

  14. EMIC triggered chorus emissions in Cluster data

    Science.gov (United States)

    Grison, B.; SantolíK, O.; Cornilleau-Wehrlin, N.; Masson, A.; Engebretson, M. J.; Pickett, J. S.; Omura, Y.; Robert, P.; Nomura, R.

    2013-03-01

    Electromagnetic ion cyclotron (EMIC) triggered chorus emissions have recently been a subject of several experimental, theoretical and simulation case studies, noting their similarities with whistler-mode chorus. We perform a survey of 8 years of Cluster data in order to increase the database of EMIC triggered emissions. The result of this survey is that EMIC triggered emissions have been unambiguously observed on only three different days. These three events are studied in detail. All cases have been observed at the plasmapause between 22 and 24 magnetic local time (MLT) and between −15° and 15° magnetic latitude (λm). Triggered emissions are also observed for the first time below the local He+ gyrofrequency (fHe+). The number of events is too low to produce statistical results; nevertheless, we point out a variety of common properties of those waves. The rising tones have a high level of coherence and the waves propagate away from the equatorial region. The propagation angle and degree of polarization are related to the distance from the equator, whereas the slope and the frequency extent vary from one event to the other. From the various spacecraft separations, we determine that the triggering process is a localized phenomenon in space and time. However, we are unable to determine the occurrence rates of these waves. Small frequency extent rising tones are more common than large ones. The newly reported EMIC triggered events are generally observed during periods of large AE index values and in time periods close to solar maximum.

  15. Progress on the Level-1 Calorimeter Trigger

    CERN Multimedia

    Eric Eisenhandler

    The Level-1 Calorimeter Trigger (L1Calo) has recently passed a number of major hurdles. The various electronic modules that make up the trigger are either in full production or are about to be, and preparations in the ATLAS pit are well advanced. L1Calo has three main subsystems. The PreProcessor converts analogue calorimeter signals to digital, associates the rather broad trigger pulses with the correct proton-proton bunch crossing, and does a final calibration in transverse energy before sending digital data streams to the two algorithmic trigger processors. The Cluster Processor identifies and counts electrons, photons and taus, and the Jet/Energy-sum Processor looks for jets and also sums missing and total transverse energy. Readout drivers allow the performance of the trigger to be monitored online and offline, and also send region-of-interest information to the Level-2 Trigger. The PreProcessor (Heidelberg) is the L1Calo subsystem with the largest number of electronic modules (124), and most of its fu...

  16. Triggered tremor sweet spots in Alaska

    Science.gov (United States)

    Gomberg, Joan; Prejean, Stephanie

    2013-01-01

    To better understand what controls fault slip along plate boundaries, we have exploited the abundance of seismic and geodetic data available from the richly varied tectonic environments composing Alaska. A search for tremor triggered by 11 large earthquakes throughout all of seismically monitored Alaska reveals two tremor "sweet spots": regions where large-amplitude seismic waves repeatedly triggered tremor between 2006 and 2012. The two sweet spots lie in very different tectonic environments: one just trenchward of and between the Aleutian islands of Unalaska and Akutan, and the other in central mainland Alaska. The Unalaska/Akutan spot corroborates previous evidence that the region is ripe for tremor, perhaps because it is located where plate-interface frictional properties transition between stick-slip and stably sliding, both in the dip direction and laterally. The mainland sweet spot coincides with a region of complex and uncertain plate interactions, where no slow slip events or major crustal faults have been noted previously. Analyses showed that larger triggering wave amplitudes, and perhaps lower frequencies, favor the triggering of tremor. However, neither the maximum amplitude in the time domain or in a particular frequency band, nor the geometric relationship of the wavefield to the tremor source faults, alone ensures a high probability of triggering. Triggered tremor at the two sweet spots also does not occur during slow slip events visually detectable in GPS data, although slow slip below the detection threshold may have facilitated tremor triggering.

  17. Hierarchical trigger of the ALICE calorimeters

    CERN Document Server

    Muller, Hans; Novitzky, Norbert; Kral, Jiri; Rak, Jan; Schambach, Joachim; Wang, Ya-Ping; Wang, Dong; Zhou, Daicui

    2010-01-01

    The trigger of the ALICE electromagnetic calorimeters is implemented in two hierarchically connected layers of electronics. In the lower layer, level-0 algorithms search for shower energy above threshold in locally confined Trigger Region Units (TRU). The top layer is implemented as a single, global trigger unit that receives the trigger data from all TRUs as input to the level-1 algorithm. This architecture was first developed for the PHOS high-pT photon trigger before it was adopted by EMCal for the jet trigger as well. TRU units digitize up to 112 analogue input signals from the Front End Electronics (FEE) and concentrate their digital stream in a single FPGA. A charge- and time-summing algorithm is combined with a peakfinder that suppresses spurious noise and is precise to single LHC bunches. With a peak-to-peak noise level of 150 MeV, the linear dynamic range above threshold spans from MIP energies at 215 MeV up to 50 GeV. Local level-0 decisions take less than 600 ns after LHC collisions, upon which all TRUs transfer ...
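    The charge summing and peak finding described above can be mimicked in software; a minimal Python sketch follows (sample window length and threshold are illustrative values, not the TRU firmware parameters):

```python
def running_sums(samples, window=4):
    """Sliding sum over `window` consecutive ADC samples."""
    return [sum(samples[i:i + window])
            for i in range(len(samples) - window + 1)]

def peak_triggers(samples, window=4, threshold=10.0):
    """Fire where the running sum exceeds threshold AND is a local
    maximum, so one pulse yields one trigger and noise is suppressed."""
    s = running_sums(samples, window)
    return [i for i in range(1, len(s) - 1)
            if s[i] > threshold and s[i - 1] < s[i] >= s[i + 1]]
```

    A single pulse produces exactly one trigger position even though several overlapping windows exceed threshold, which is the point of combining the sum with a peakfinder.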

  18. Level-1 Calorimeter Trigger starts firing

    CERN Multimedia

    Stephen Hillier

    2007-01-01

    L1Calo is one of the major components of ATLAS First Level trigger, along with the Muon Trigger and Central Trigger Processor. It forms all of the first-level calorimeter-based triggers, including electron, jet, tau and missing ET. The final system consists of over 250 custom designed 9U VME boards, most containing a dense array of FPGAs or ASICs. It is subdivided into a PreProcessor, which digitises the incoming trigger signals from the Liquid Argon and Tile calorimeters, and two separate processor systems, which perform the physics algorithms. All of these are highly flexible, allowing the possibility to adapt to beam conditions and luminosity. All parts of the system are read out through Read-Out Drivers, which provide monitoring data and Region of Interest (RoI) information for the Level-2 trigger. Production of the modules is now essentially complete, and enough modules exist to populate the full scale system in USA15. Installation is proceeding rapidly - approximately 90% of the final modules are insta...

  19. The LHCb trigger and data acquisition system

    CERN Document Server

    Dufey, J P; Harris, F; Harvey, J; Jost, B; Mato, P; Müller, E

    2000-01-01

    The LHCb experiment is the most recently approved of the 4 experiments under construction at CERN's LHC accelerator. It is a special-purpose experiment designed to precisely measure the CP violation parameters in the B-B̄ system. Triggering poses special problems since the interesting events containing B-mesons are immersed in a large background of inelastic p-p reactions. We therefore decided to implement a 4-level triggering scheme. The LHCb Data Acquisition (DAQ) system will have to cope with an average trigger rate of ~40 kHz, after two levels of hardware triggers, and an average event size of ~100 kB. Thus an event-building network which can sustain an average bandwidth of 4 GB/s is required. A powerful software trigger farm will have to be installed to reduce the rate from 40 kHz to ~100 Hz of events written to permanent storage. In this paper we outline the general architecture of the Trigger and DAQ system and the readout protocols we plan to implement. First results of simulations of the behavior o...

  20. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at time t under the condition that it was down at time t' (t' ≤ t). The limitations on the applicability of the method are also discussed. It has been concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)
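    The standard building blocks behind such a minimal-cut-set analysis are easy to state in code. This Python sketch implements only the textbook two-state Markov unavailability of a repairable component and the rare-event cut-set bound, not the paper's integral-equation method:

```python
import math

def component_unavailability(lam, mu, t):
    """Unavailability at time t of a repairable component with constant
    failure rate lam and repair rate mu (two-state Markov model)."""
    return lam / (lam + mu) * (1.0 - math.exp(-(lam + mu) * t))

def cut_set_unavailability(components, t):
    """A minimal cut set is down only when all its components are down
    (components assumed independent)."""
    p = 1.0
    for lam, mu in components:
        p *= component_unavailability(lam, mu, t)
    return p

def system_unavailability(cut_sets, t):
    """Rare-event approximation: sum over minimal cut sets, an upper
    bound that is tight when component unavailabilities are small."""
    return sum(cut_set_unavailability(cs, t) for cs in cut_sets)
```

    For large t each component unavailability approaches the steady-state value lam / (lam + mu), so the system-level bound can be read off directly from the cut-set structure.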

  1. Cumulative impacts of hydroelectric development on the fresh water balance in Hudson Bay

    International Nuclear Information System (INIS)

    Anctil, F.; Couture, R.

    1994-01-01

    A study is presented of the impacts of hydroelectric development on the surface water layer of Hudson Bay, including James Bay and the Foxe Basin. These impacts are directly related to the modifications in the fresh water balance of Hudson Bay and originate from the management of hydroelectric complexes. The fresh water balance is determined by identifying, at different scales, the modifications caused by each complex. The main inputs are the freezing and thawing of the ice cover, runoff water, and mass exchange at the air-water interface. Three spatial scales were used to obtain the resolution required to document the cumulative effects of fresh water balance modifications on the water surface layer, one each for Hudson Bay, Hudson Strait, and the Labrador Sea. Finally, the addition of the proposed Great Whale hydroelectric complex is examined from the available information and forecasts. 18 refs., 6 figs., 1 tab.

  2. Gradual and Cumulative Improvements to the Classical Differential Evolution Scheme through Experiments

    Directory of Open Access Journals (Sweden)

    Anescu George

    2016-12-01

    The paper presents the experimental results of tests conducted with the purpose of gradually and cumulatively improving the classical DE scheme in both efficiency and success rate. The modifications consisted of the randomization of the scaling factor (a simple jitter scheme), a more efficient Random Greedy Selection scheme, an adaptive scheme for the crossover probability, and a resetting mechanism for the agents. After each modification step, experiments were conducted on a set of 11 scalable, multimodal, continuous optimization functions in order to analyze the improvements and decide the next improvement direction. Finally, the initial classical scheme and the constructed Fast Self-Adaptive DE (FSA-DE) variant were compared with the purpose of testing their performance degradation with the increase of the search space dimension. The experimental results demonstrated the superiority of the proposed FSA-DE variant.
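    A minimal sketch of the classical DE scheme (rand/1/bin) with one of the modifications described, a randomized (jittered) scaling factor, plus the usual greedy selection. The jitter range, population settings, and test objective below are illustrative choices, not the paper's.

```python
import random

def de_optimize(f, dim, bounds, pop_size=20, gens=200, cr=0.9, seed=1):
    """Classical DE (rand/1/bin) with a jittered scale factor F."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # three mutually distinct agents, all different from i
            a, b, c = rng.sample([k for k in range(pop_size) if k != i], 3)
            F = 0.5 + 0.3 * (rng.random() - 0.5)   # jitter: F ~ U[0.35, 0.65]
            j_rand = rng.randrange(dim)            # guarantee one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(hi, max(lo, v)))  # clamp to the box
                else:
                    trial.append(pop[i][j])
            ft = f(trial)
            if ft <= fit[i]:                       # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)           # simple unimodal test function
x_best, f_best = de_optimize(sphere, dim=5, bounds=(-5.0, 5.0))
```

The adaptive crossover and agent-resetting steps mentioned in the abstract would plug in where `cr` and the population update are fixed here.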

  3. Design studies for the Double Chooz trigger

    International Nuclear Information System (INIS)

    Cucoanes, Andi Sebastian

    2009-01-01

    The main characteristic of the neutrino mixing effect is assumed to be the coupling between the flavor and the mass eigenstates. Three mixing angles (θ12, θ23, θ13) describe the magnitude of this effect. Still unknown, θ13 is considered very small, based on the measurement done by the CHOOZ experiment. A leading experiment will be Double Chooz, placed in the Ardennes region, on the same site as used by CHOOZ. The Double Chooz goal is the exploration of ≈80% of the currently allowed θ13 region, by searching for the disappearance of reactor antineutrinos. Double Chooz will use two similar detectors, located at different distances from the reactor cores: a near one at ≈150 m, where no oscillations are expected, and a far one at 1.05 km distance, close to the first minimum of the survival probability function. The measurement foresees a precise comparison of neutrino rates and spectra between both detectors. The detection mechanism is based on the inverse β-decay. The Double Chooz detectors have been designed to minimize the rate of random background. In a simplified view, two optically separated regions are considered. The target, filled with Gd-doped liquid scintillator, is the main antineutrino interaction volume. Surrounding the target, the inner veto region aims to tag the cosmogenic muon background which hits the detector. Both regions are viewed by photomultipliers. The Double Chooz trigger system has to be highly efficient for antineutrino events as well as for several types of background. The trigger analyzes discriminated signals from the central region and the inner veto photomultipliers. The trigger logic is fully programmable and can combine the input signals. The trigger conditions are based on the total energy released in the event and on the PMT group multiplicity. For redundancy, two independent trigger boards will be used for the central region, each of them receiving signals from half of the photomultipliers. A third trigger board

  4. Design studies for the Double Chooz trigger

    Energy Technology Data Exchange (ETDEWEB)

    Cucoanes, Andi Sebastian

    2009-07-24

    The main characteristic of the neutrino mixing effect is assumed to be the coupling between the flavor and the mass eigenstates. Three mixing angles (θ12, θ23, θ13) describe the magnitude of this effect. Still unknown, θ13 is considered very small, based on the measurement done by the CHOOZ experiment. A leading experiment will be Double Chooz, placed in the Ardennes region, on the same site as used by CHOOZ. The Double Chooz goal is the exploration of ≈80% of the currently allowed θ13 region, by searching for the disappearance of reactor antineutrinos. Double Chooz will use two similar detectors, located at different distances from the reactor cores: a near one at ≈150 m, where no oscillations are expected, and a far one at 1.05 km distance, close to the first minimum of the survival probability function. The measurement foresees a precise comparison of neutrino rates and spectra between both detectors. The detection mechanism is based on the inverse β-decay. The Double Chooz detectors have been designed to minimize the rate of random background. In a simplified view, two optically separated regions are considered. The target, filled with Gd-doped liquid scintillator, is the main antineutrino interaction volume. Surrounding the target, the inner veto region aims to tag the cosmogenic muon background which hits the detector. Both regions are viewed by photomultipliers. The Double Chooz trigger system has to be highly efficient for antineutrino events as well as for several types of background. The trigger analyzes discriminated signals from the central region and the inner veto photomultipliers. The trigger logic is fully programmable and can combine the input signals. The trigger conditions are based on the total energy released in the event and on the PMT group multiplicity. For redundancy, two independent trigger boards will be used for the central region, each of

  5. LHCb: The LHCb Trigger Architecture beyond LS1

    CERN Multimedia

    Albrecht, J; Neubert, S; Raven, G; Sokoloff, M D; Williams, M

    2013-01-01

    The LHCb experiment is a spectrometer dedicated to the study of heavy flavor at the LHC. The rate of proton-proton collisions at the LHC is 15 MHz, but resource limitations mean that only 5 kHz can be written to storage for offline analysis. For this reason the LHCb data acquisition system -- the trigger -- plays a key role in selecting signal events and rejecting background. In contrast to previous experiments at hadron colliders, such as CDF or D0, the bulk of the LHCb trigger is implemented in software and deployed on a farm of 20k parallel processing nodes. This system, called the High Level Trigger (HLT), is responsible for reducing the rate from the maximum at which the detector can be read out, 1.1 MHz, to the 5 kHz which can be processed offline, and has 20 ms in which to process and accept/reject each event. In order to minimize systematic uncertainties, the HLT was designed from the outset to reuse the offline reconstruction and selection code. During the long shutdown it is proposed to extend th...

  6. Constraining the trigger for an ancient warming episode

    Science.gov (United States)

    Schultz, Colin

    2011-08-01

    The Paleocene epoch (~66-56 million years ago) was sandwiched between sudden climate shifts and mass extinctions. The boundary between the end of the Paleocene and the beginning of the Eocene (the P-E boundary) saw the global average temperature soar by 5°C over a few thousand years, leading to a pronounced reorganization of both terrestrial and oceanic plant and animal communities. The P-E boundary warming was triggered by an influx of atmospheric carbon dioxide, but the influx's ultimate trigger is still being debated. Other prominent warming events within the Paleogene (~66-23 million years ago), the broad time span that encompasses the Paleocene and Eocene, have been linked to regularly recurring changes in the eccentricity of the Earth's orbit that take place on 100,000- and 405,000-year cycles. Proponents of this view suggest that an alignment of the two cycles could lead to the warming of deep ocean waters, melting frozen methane and triggering an increase in atmospheric carbon dioxide. However, some studies have suggested that the P-E boundary warming was instead the product of geological processes, where carbon-rich rocks were baked by injected magma, which eventually liberated the carbon to the atmosphere. Deciding between proposed explanations for the cause of the P-E warming, whether they are astronomical or geological, depends on accurately pinning the event in time. (Geochemistry, Geophysics, Geosystems, doi:10.1029/2010GC003426, 2011)

  7. Shale Gas Development and Brook Trout: Scaling Best Management Practices to Anticipate Cumulative Effects

    Science.gov (United States)

    Smith, David; Snyder, Craig D.; Hitt, Nathaniel P.; Young, John A.; Faulkner, Stephen P.

    2012-01-01

    Shale gas development may involve trade-offs between energy development and benefits provided by natural ecosystems. However, current best management practices (BMPs) focus on mitigating localized ecological degradation. We review evidence for cumulative effects of natural gas development on brook trout (Salvelinus fontinalis) and conclude that BMPs should account for potential watershed-scale effects in addition to localized influences. The challenge is to develop BMPs in the face of uncertainty in the predicted response of brook trout to landscape-scale disturbance caused by gas extraction. We propose a decision-analysis approach to formulating BMPs in the specific case of relatively undisturbed watersheds where there is consensus to maintain brook trout populations during gas development. The decision analysis was informed by existing empirical models that describe brook trout occupancy responses to landscape disturbance and set bounds on the uncertainty in the predicted responses to shale gas development. The decision analysis showed that a high efficiency of gas development (e.g., 1 well pad per square mile and 7 acres per pad) was critical to achieving a win-win solution characterized by maintaining brook trout and maximizing extraction of available gas. This finding was invariant to uncertainty in predicted response of brook trout to watershed-level disturbance. However, as the efficiency of gas development decreased, the optimal BMP depended on the predicted response, and there was considerable potential value in discriminating among predictive models through adaptive management or research. The proposed decision-analysis framework provides an opportunity to anticipate the cumulative effects of shale gas development, account for uncertainty, and inform management decisions at the appropriate spatial scales.

  8. The NA62 Liquid Krypton Electromagnetic Calorimeter Level 0 Trigger

    CERN Document Server

    INSPIRE-00293812; Paoluzzi, Giovanni; Salamon, Andrea; Salina, Gaetano; Santovetti, Emanuele; Scarfi, Francesco M.; Bonaiuto, Vincenzo; Sargeni, Fausto

    2012-01-01

    The NA62 experiment at the CERN SPS aims to measure the branching ratio of the very rare kaon decay K+ -> pi+ nu nubar, collecting O(100) events with a 10% background, to make a stringent test of the Standard Model. One of the main backgrounds to the proposed measurement is the K+ -> pi+ pi0 decay. To suppress this background an efficient photon veto system is foreseen. In the 1-10 mrad angular region the NA48 high-performance liquid krypton electromagnetic calorimeter is used. The design, implementation and current status of the Liquid Krypton Electromagnetic Calorimeter Level 0 Trigger are presented.

  9. CMS Level-1 Upgrade Calorimeter Trigger Prototype Development

    CERN Document Server

    Klabbers, Pamela Renee

    2013-01-01

    As the LHC increases luminosity and energy, it will become increasingly difficult to select interesting physics events and remain within the readout bandwidth limitations. An upgrade to the CMS Calorimeter Trigger implementing more complex algorithms is proposed. It utilizes AMC cards with Xilinx FPGAs running in a micro-TCA crate, with card interconnections via crate backplanes and optical links operating at up to 10 Gbps. Prototype cards with Virtex-6 and Virtex-7 FPGAs have been built, and software frameworks for operation and monitoring have been developed. The physics goals, hardware architectures, and software will be described in this talk. More details can be found in a separate poster at this conference.

  10. The NA62 Liquid Krypton Electromagnetic Calorimeter Level 0 Trigger

    CERN Document Server

    INSPIRE-00646848; Fucci, Adolfo; Paoluzzi, Giovanni; Salamon, Andrea; Salina, Gaetano; Santovetti, Emanuele; Scarfi, Francesco M.; Sargeni, Fausto

    2011-01-01

    The NA62 experiment at the CERN SPS aims to measure the branching ratio of the very rare kaon decay K+ -> pi+ nu nubar, collecting O(100) events with a 10% background, to make a stringent test of the Standard Model. One of the main backgrounds to the proposed measurement is the K+ -> pi+ pi0 decay. To suppress this background an efficient photon veto system is foreseen. In the 1-10 mrad angular region the NA48 high-performance liquid krypton electromagnetic calorimeter is used. The design, implementation and current status of the Liquid Krypton Electromagnetic Calorimeter Level 0 Trigger are presented.

  11. Towards a Level-1 tracking trigger for the ATLAS experiment at the High Luminosity LHC

    CERN Document Server

    Martin, T A D; The ATLAS collaboration

    2014-01-01

    At the high-luminosity HL-LHC, upwards of 160 individual proton-proton interactions (pileup) are expected per bunch-crossing at luminosities of around $5\times10^{34}$ cm$^{-2}$s$^{-1}$. A proposal by the ATLAS collaboration to split the ATLAS first-level trigger into two stages is briefly detailed. The use of fast track finding in the new first-level trigger is explored as a method to provide the discrimination required to reduce the event rate to acceptable levels for the readout system while maintaining high efficiency in selecting the decay products of electroweak bosons at HL-LHC luminosities. It is shown that the available bandwidth in the proposed new strip tracker is sufficient for a region-of-interest-based track trigger given certain optimisations; further methods for improving upon the proposal are discussed.

  12. Empirical rainfall thresholds for the triggering of landslides in Asturias (NW Spain)

    Science.gov (United States)

    Valenzuela, Pablo; Luís Zêzere, José; José Domínguez-Cuesta, María; Mora García, Manuel Antonio

    2017-04-01

    Rainfall-triggered landslides are common and widespread phenomena in Asturias, a mountainous region in the NW of Spain where the climate is characterized by average annual precipitation and temperature values of 960 mm and 13.3°C, respectively. Different types of landslides (slides, flows and rockfalls) frequently occur during intense rainfall events, causing great economic losses every year and sometimes human injuries or fatalities. For this reason, their temporal forecasting is of great interest. The main goal of the present research is the calculation of empirical rainfall thresholds for the triggering of landslides in the Asturian region, following the methodology described by Zêzere et al., 2015. For this purpose, data from 559 individual landslides, collected from press archives during a period of eight hydrological years (October 2008-September 2016) and gathered within the BAPA landslide database (http://geol.uniovi.es/BAPA), were used. Precipitation data series of 37 years came from 6 weather stations representative of the main geographical and climatic conditions within the study area. The applied methodology includes: (i) the definition of landslide events, (ii) the reconstruction of the cumulative antecedent rainfall for each event from 1 to 90 consecutive days, (iii) the estimation of the return period for each cumulated rainfall-duration condition using the Gumbel probability distribution, (iv) the definition of the critical cumulated rainfall-duration conditions taking into account the highest return period, and (v) the calculation of the thresholds considering both the conditions for the occurrence and non-occurrence of landslides. References: Zêzere, J.L., Vaz, T., Pereira, S., Oliveira, S.C., Marqués, R., García, R.A.C. 2015. Rainfall thresholds for landslide activity in Portugal: a state of the art. Environmental Earth Sciences, 73, 2917-2936. doi: 10.1007/s12665-014-3672-0
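    Steps (ii) and (iii) of the methodology above can be sketched as follows: a cumulative antecedent-rainfall total over a window, and the return period of that total under a Gumbel distribution fitted by the method of moments. The rainfall values below are invented for illustration; the actual study uses 37-year station series.

```python
import math
from statistics import mean, stdev

def antecedent_rainfall(daily, end_index, days):
    """Cumulative rainfall over the `days` days ending at end_index (inclusive)."""
    start = max(0, end_index - days + 1)
    return sum(daily[start:end_index + 1])

def gumbel_return_period(sample, x):
    """Return period (in the sample's time units) of total x under a Gumbel fit."""
    m, s = mean(sample), stdev(sample)
    beta = s * math.sqrt(6) / math.pi        # scale, method of moments
    mu = m - 0.5772156649 * beta             # location (Euler-Mascheroni constant)
    cdf = math.exp(-math.exp(-(x - mu) / beta))
    return 1.0 / (1.0 - cdf)

# Invented annual maxima of 30-day cumulative rainfall, in mm.
annual_max_30day = [120, 95, 140, 110, 160, 100, 130, 150, 90, 125]
t = gumbel_return_period(annual_max_30day, 170.0)  # return period of a 170 mm total
```

In the study's framework, the duration whose total gives the highest return period defines the critical rainfall-duration condition for an event.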

  13. Cumulative effects in Swedish EIA practice - difficulties and obstacles

    International Nuclear Information System (INIS)

    Waernbaeck, Antoienette; Hilding-Rydevik, Tuija

    2009-01-01

    The importance of considering cumulative effects (CE) in the context of environmental assessment is manifested in the EU regulations. The demands on the contents of Environmental Impact Assessment (EIA) and Strategic Environmental Assessment (SEA) documents explicitly ask for CE to be described. In Swedish environmental assessment documents, however, CE are rarely described or included. The aim of this paper is to look into the reasons behind this fact in the Swedish context. The paper describes and analyses how actors implementing the EIA and SEA legislation in Sweden perceive the current situation in relation to the legislative demands and the inclusion of cumulative effects. Through semi-structured interviews the following questions have been explored: Is the phenomenon of CE discussed and included in the EIA/SEA process? What do the actors include in, and what is their knowledge of, the term and concept of CE? Which difficulties and obstacles do these actors experience, and what possibilities for inclusion of CE do they see in the EIA/SEA process? A large number of obstacles and hindrances emerged from the interviews. It can be concluded from the analysis that the will to act does seem to exist. A lack of knowledge of how to include cumulative effects, and a lack of clear regulations concerning how this should be done, seem to be perceived as the main obstacles. Knowledge of the term and the phenomenon is furthermore quite narrow and not all-encompassing. The interviewees report a lack of established procedures, and also seem to lack knowledge of methods for how to actually work, in practice, with CE and how to include CE in the EIA/SEA process.
    It can be stated that this poor picture of practice concerning CE in the context of impact assessment mirrors the existing and so far rather vague demands in respect of the inclusion and assessment of CE in Swedish EIA and SEA legislation, regulations, guidelines and

  14. Technical Note: SCUDA: A software platform for cumulative dose assessment

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seyoun; McNutt, Todd; Quon, Harry; Wong, John; Lee, Junghoon, E-mail: junghoon@jhu.edu [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Plishker, William [IGI Technologies, Inc., College Park, Maryland 20742 (United States)]; Shekhar, Raj, E-mail: rshekhar@childrensnational.org [IGI Technologies, Inc., College Park, Maryland 20742 and Sheikh Zayed Institute for Pediatric Surgical Innovation, Children’s National Health System, Washington, DC 20010 (United States)]

    2016-10-15

    Purpose: Accurate tracking of anatomical changes and computation of the dose actually delivered to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for adoption in the clinic, as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, and dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases, from which it automatically queries/retrieves patient images, the radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both the DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean ± STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. Conclusions: The authors developed a unified software platform that provides
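    The dose-accumulation step described above (mapping each fraction's dose back to the planning anatomy through a deformation field before summing) can be sketched in one dimension. The grids, doses, and voxel shifts below are invented for illustration, and a nearest-neighbour pull-back stands in for the full 3-D DIR-based warp used by SCUDA.

```python
def accumulate(planning_grid_size, daily_doses, shifts):
    """Accumulate fraction doses on the planning grid.

    daily_doses[k][i]: dose at voxel i on that day's (deformed) anatomy.
    shifts[k]: voxel offset mapping planning voxel i to daily voxel i + shifts[k],
    a 1-D stand-in for the deformation field.
    """
    total = [0.0] * planning_grid_size
    for dose, s in zip(daily_doses, shifts):
        for i in range(planning_grid_size):
            j = min(max(i + s, 0), planning_grid_size - 1)  # clamp at grid edges
            total[i] += dose[j]                             # pull dose back to voxel i
    return total

# Two identical fractions; on day 2 the anatomy has shifted by one voxel.
fractions = [[0.0, 2.0, 2.0, 0.0], [0.0, 2.0, 2.0, 0.0]]
cumulative = accumulate(4, fractions, shifts=[0, 1])
```

A rigid accumulation (both shifts zero) would report a hot spot of 4.0 Gy at two voxels; the warped accumulation shows how motion redistributes the true cumulative dose.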

  15. A balanced solution to the cumulative threat of industrialized wind farm development on cinereous vultures (Aegypius monachus) in south-eastern Europe.

    Directory of Open Access Journals (Sweden)

    Dimitris P Vasilakis

    Wind farm development can combat climate change but may also threaten bird populations' persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss, providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. Our 'win-win' approach is appropriate to other potential conflicts where wind farms may cumulatively threaten

  16. A balanced solution to the cumulative threat of industrialized wind farm development on cinereous vultures (Aegypius monachus) in south-eastern Europe

    Science.gov (United States)

    Whitfield, D. Philip; Kati, Vassiliki

    2017-01-01

    Wind farm development can combat climate change but may also threaten bird populations’ persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. Our ‘win-win’ approach is appropriate to other potential conflicts where wind farms may cumulatively threaten wildlife
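    The cumulative-mortality arithmetic in a Band-style collision risk model, as used in studies like the one above, can be sketched as expected annual fatalities summed over wind farms and scaled by an avoidance rate. All transit counts, collision probabilities, and the avoidance rate below are invented for illustration and are not the study's figures.

```python
def expected_collisions(transits, p_collision, avoidance):
    """Expected annual fatalities at one farm:
    rotor transits x geometric collision probability x (1 - avoidance rate)."""
    return transits * p_collision * (1.0 - avoidance)

def cumulative_mortality(farms, avoidance):
    """Sum expected fatalities over a list of (transits, p_collision) farms."""
    return sum(expected_collisions(t, p, avoidance) for t, p in farms)

# (rotor transits per year, collision probability per transit) -- invented values
operating = [(400, 0.08), (250, 0.10)]
proposed = [(900, 0.09), (600, 0.07), (700, 0.08)]

current = cumulative_mortality(operating, avoidance=0.98)
future = cumulative_mortality(operating + proposed, avoidance=0.98)
```

Because mortality is linear in (1 - avoidance), the choice of avoidance rate rescales all scenarios equally, which is why the study's ranking of scenarios is robust to the four rates it considered.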

  17. Neighborhood-targeted and case-triggered use of a single dose of oral cholera vaccine in an urban setting: Feasibility and vaccine coverage.

    Science.gov (United States)

    Parker, Lucy A; Rumunu, John; Jamet, Christine; Kenyi, Yona; Lino, Richard Laku; Wamala, Joseph F; Mpairwe, Allan M; Muller, Vincent; Llosa, Augusto E; Uzzeni, Florent; Luquero, Francisco J; Ciglenecki, Iza; Azman, Andrew S

    2017-06-01

    In June 2015, a cholera outbreak was declared in Juba, South Sudan. In addition to standard outbreak control measures, oral cholera vaccine (OCV) was proposed. As sufficient doses to cover the at-risk population were unavailable, a campaign using half the standard dosing regimen (one-dose) targeted high-risk neighborhoods and groups including neighbors of suspected cases. Here we report the operational details of this first public health use of a single-dose regimen of OCV and illustrate the feasibility of conducting highly targeted vaccination campaigns in an urban area. Neighborhoods of the city were prioritized for vaccination based on cumulative attack rates, active transmission and local knowledge of known cholera risk factors. OCV was offered to all persons older than 12 months at 20 fixed sites and to select groups, including neighbors of cholera cases after the main campaign ('case-triggered' interventions), through mobile teams. Vaccination coverage was estimated by multi-stage surveys using spatial sampling techniques. 162,377 individuals received a single-dose of OCV in the targeted neighborhoods. In these neighborhoods vaccine coverage was 68.8% (95% Confidence Interval (CI), 64.0-73.7) and was highest among children ages 5-14 years (90.0%, 95% CI 85.7-94.3), with adult men being less likely to be vaccinated than adult women (Relative Risk 0.81, 95% CI: 0.68-0.96). In the case-triggered interventions, each lasting 1-2 days, coverage varied (range: 30-87%) with an average of 51.0% (95% CI 41.7-60.3). Vaccine supply constraints and the complex realities where cholera outbreaks occur may warrant the use of flexible alternative vaccination strategies, including highly-targeted vaccination campaigns and single-dose regimens. We showed that such campaigns are feasible. Additional work is needed to understand how and when to use different strategies to best protect populations against epidemic cholera.

  18. How to Choose? Using the Delphi Method to Develop Consensus Triggers and Indicators for Disaster Response.

    Science.gov (United States)

    Lis, Rebecca; Sakata, Vicki; Lien, Onora

    2017-08-01

    To identify key decisions along the continuum of care (conventional, contingency, and crisis) and the critical triggers and data elements used to inform those decisions concerning public health and health care response during an emergency. A classic Delphi method, a consensus-building survey technique, was used with clinicians around Washington State to identify regional triggers and indicators. Additionally, using a modified Delphi method, we combined a workshop and single-round survey with panelists from public health (state and local) and health care coalitions to identify consensus state-level triggers and indicators. In the clinical survey, 122 of 223 proposed triggers or indicators (43.7%) reached consensus and were deemed important in regional decision-making during a disaster. In the state-level survey, 110 of 140 proposed triggers or indicators (78.6%) reached consensus and were deemed important in state-level decision-making during a disaster. The identification of consensus triggers and indicators for health care emergency response is crucial in supporting a comprehensive health care situational awareness process. This can inform the creation of standardized questions to ask health care, public health, and other partners to support decision-making during a response. (Disaster Med Public Health Preparedness. 2017;11:467-472).

  19. Cumulative trauma and symptom complexity in children: a path analysis.

    Science.gov (United States)

    Hodges, Monica; Godbout, Natacha; Briere, John; Lanktree, Cheryl; Gilbert, Alicia; Kletzka, Nicole Taylor

    2013-11-01

    Multiple trauma exposures during childhood are associated with a range of psychological symptoms later in life. In this study, we examined whether the total number of different types of trauma experienced by children (cumulative trauma) is associated with the complexity of their subsequent symptomatology, where complexity is defined as the number of different symptom clusters simultaneously elevated into the clinical range. Children's symptoms in six different trauma-related areas (e.g., depression, anger, posttraumatic stress) were reported both by child clients and their caretakers in a clinical sample of 318 children. Path analysis revealed that accumulated exposure to multiple different trauma types predicts symptom complexity as reported by both children and their caretakers. Copyright © 2013 Elsevier Ltd. All rights reserved.
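    The core association tested above, the count of distinct trauma types predicting symptom complexity (the number of symptom clusters elevated into the clinical range), can be illustrated with a simple least-squares fit. The data below are invented, and ordinary regression here stands in for the study's path analysis.

```python
import math

def fit_line(x, y):
    """Ordinary least-squares slope/intercept and Pearson correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

# Invented sample: number of distinct trauma types per child, and how many of
# six trauma-related symptom scales are elevated into the clinical range.
trauma_types    = [0, 1, 1, 2, 3, 3, 4, 5, 5, 6]
elevated_scales = [0, 0, 1, 1, 2, 3, 3, 4, 5, 5]

slope, intercept, r = fit_line(trauma_types, elevated_scales)
```

A positive slope corresponds to the paper's finding that accumulated exposure predicts greater symptom complexity; the path analysis additionally models the child- and caretaker-reported outcomes jointly.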

  20. Cumulative growth of minor hysteresis loops in the Kolmogorov model

    International Nuclear Information System (INIS)

    Meilikhov, E. Z.; Farzetdinova, R. M.

    2013-01-01

    The phenomenon of nonrepeatability of successive remagnetization cycles in Co/M (M = Pt, Pd, Au) multilayer film structures is explained in the framework of the Kolmogorov crystallization model. It is shown that this model of phase transitions can be adapted so as to adequately describe the process of magnetic relaxation in the indicated systems with “memory.” For this purpose, it is necessary to introduce some additional elements into the model, in particular, (i) to take into account the fact that every cycle starts from a state “inherited” from the preceding cycle and (ii) to assume that the rate of growth of a new magnetic phase depends on the cycle number. This modified model provides a quite satisfactory qualitative and quantitative description of all features of successive magnetic relaxation cycles in the system under consideration, including the surprising phenomenon of cumulative growth of minor hysteresis loops.