WorldWideScience

Sample records for lhc experiments refinements

  1. LHC Experiments: refinements for the restart

    CERN Multimedia

    2009-01-01

    As the LHC restart draws closer, the Bulletin will be taking a look at how the six LHC experiments are preparing and what they have been up to since last September. In this issue we start with a roundup of the past 10 months of activity at CMS and ATLAS, both technical work and outreach activities.

  2. Towards LHC experiments

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    As plans for the LHC proton collider to be built in CERN's 27-kilometre LEP tunnel take shape, interest widens to bring in the experiments exploiting the big machine. The first public presentations of 'expressions of interest' for LHC experiments were featured from 5-8 March at Evian-les-Bains on the shore of Lake Geneva, some 50 kilometres from CERN, at the special 'Towards the LHC Experimental Programme' meeting.

  3. Dashboard for the LHC experiments

    International Nuclear Information System (INIS)

    Andreeva, J; Cirstoiu, C; Miguel, M D F D; Ivanchenko, A; Gaidioz, B; Herrala, J; Janulis, M; Maier, G; Maguire, E J; Rivera, R P; Rocha, R; Saiz, P; Sidorova, I; Belov, S; Berejnoj, A; Kodolova, O; Chen, Y; Chen, T; Chiu, S; Munro, C

    2008-01-01

    In this paper we present the Experiment Dashboard monitoring system, which is currently in use by four Large Hadron Collider (LHC) experiments. The goal of the Experiment Dashboard is to monitor the activities of the LHC experiments on the distributed infrastructure, providing monitoring data from the virtual organization (VO) and user perspectives. The LHC experiments use various Grid infrastructures (LCG/EGEE, OSG, NDGF) with correspondingly various middleware flavours and job submission methods. Providing a uniform and complete view of various activities, such as job processing, data movement and publishing, and access to distributed databases, regardless of the underlying Grid flavour, is a challenging task. In this paper we describe the Experiment Dashboard concept, its framework and its main monitoring applications.

  4. Support for the LHC experiments

    CERN Document Server

    Butin, François; Gastal, M; Lacarrère, D; Macina, D; Perrot, A L; Tsesmelis, E; Wilhelmsson, M; CERN. Geneva. TS Department

    2008-01-01

    Experimental Area Teams have been put in place and charged with the general co-ordination and management of the LHC experimental areas and of the zones in the LHC tunnel hosting near-beam detectors of the experiments. This organization is responsible for the in situ co-ordination of work with the aim of providing a structure that enables the experiment collaborations and accelerator groups to carry out their work effectively and safely. This presentation will review some key elements in the support given to the LHC experimental areas and, given the track record and successful implementation during the LHC installation and commissioning phase, will argue that such an organization structure will be required also for the period of LHC exploitation for physics.

  5. LHC experiments

    CERN Document Server

    Tanaka, Shuji

    2001-01-01

    The Large Hadron Collider (LHC) is under construction at the CERN Laboratory in Switzerland. Four experiments (ATLAS, CMS, LHCb, ALICE) will study new physics at the LHC from 2006. Their goal is to explore the fundamental nature of matter and the basic forces. The PDF file of the transparencies is located at http://www-atlas.kek.jp/sub/documents/lepsymp-stanaka.pdf.

  6. Electronics for LHC experiments

    International Nuclear Information System (INIS)

    Bourgeois, Francois

    1995-01-01

    Full text: A major effort is being mounted to prepare the way for handling the high interaction rates expected from CERN's new LHC proton-proton collider (see, for example, November, page 6). September saw the First Workshop on Electronics for LHC Experiments, organized by Lisbon's Particle Physics Instrumentation Laboratory (LIP) on behalf of CERN's LHC Electronics Review Board (LERB - March, page 2). Its purpose was not only for the LERB to conduct a thorough review of ongoing activities, but also to promote cross-fertilization in the engineering community involved in electronics design for LHC experiments. The Workshop gathered 187 physicists and engineers from 20 countries, including the USA and Japan. The meeting comprised six sessions and 82 talks, with special focus on radiation-hard microelectronic processes, electronics for tracking, calorimetry and muon detectors, optoelectronics, and trigger and data acquisition systems. Each topic was introduced by an invited speaker who reviewed the requirements set by the particular detector technology at the LHC. At the end of each session, panel discussions were chaired by the invited speakers. Representatives from four major integrated circuit manufacturers covered advanced radiation-hard processes. Two talks highlighted the importance of obsolescence and quality systems in the long-lived and demanding environment of the LHC. The Workshop identified areas for, and encouraged efforts towards, rationalization and common developments within and between the different detector groups. As a result, it will also help ensure the reliability and long-term maintainability of installed equipment. The proceedings of the Workshop are available from LIP Lisbon*. The LERB Workshop on Electronics for LHC Experiments will become a regular event, with the second taking place in Hungary, by Lake Balaton, from 23-27 September 1996. The Hungarian institutes KFKI-RMKI have taken up the challenge of being as successful as LIP Lisbon in the organization.

  7. LHC-B: a dedicated LHC collider beauty experiment

    International Nuclear Information System (INIS)

    Erhan, S.

    1995-01-01

    LHC-B is a forward detector optimized for the study of CP violation and other rare phenomena in the decays of beauty particles at the LHC. An open-geometry forward detector design, with good mass and vertex resolution and particle identification, will facilitate the collection of large event samples in diverse B decay channels and allow for a thorough understanding of the systematic uncertainties. With the expected large event statistics, LHC-B will be able to test the closure of the unitarity triangle and make sensitive tests of the Standard Model description of CP violation. Here we describe the experiment and summarize its anticipated performance. (orig.)

  8. The LHC Tier1 at PIC: Experience from first LHC run

    International Nuclear Information System (INIS)

    Flix, J.; Perez-Calero Yzquierdo, A.; Accion, E.; Acin, V.; Acosta, C.; Bernabeu, G.; Bria, A.; Casals, J.; Caubet, M.; Cruz, R.; Delfino, M.; Espinal, X.; Lanciotti, E.; Lopez, F.; Martinez, F.; Mendez, V.; Merino, G.; Pacheco, A.; Planas, E.; Porto, M. C.; Rodriguez, B.; Sedov, A.

    2013-01-01

    This paper summarizes the operational experience of the Tier1 computer center at Port d'Informacio Cientifica (PIC) supporting the commissioning and first run (Run1) of the Large Hadron Collider (LHC). The evolution of the experiment computing models resulting from the higher amounts of data expected after the restart of the LHC is also described. (authors)

  9. The LHC machine-experiment interface

    CERN Multimedia

    CERN. Geneva; Tsesmelis, Emmanuel; Brüning, Oliver Sim

    2002-01-01

    This series of three lectures will provide an overview of issues arising at the interface between the LHC machine and the experiments, which are required for guiding the interaction between the collider and the experiments when operation of the LHC commences. A basic description of the LHC Collider and its operating parameters, such as its energy, currents, bunch structure and luminosity, as well as variations on these parameters, will be given. Furthermore, the optics foreseen for the experimental insertions, the sources and intensities of beam losses and the running-in scenarios for the various phases of operation will be discussed. A second module will cover the specific requirements and expectations of each experiment in terms of the layout of experimental areas, the matters related to radiation monitoring and shielding, the design of the beam pipe and the vacuum system, alignment issues and the measurement of the total cross-section and absolute luminosity by the experiments. Finally an analysis of infor...

  10. 6. workshop on electronics for LHC experiments. Proceedings

    International Nuclear Information System (INIS)

    2000-01-01

    The purpose of the workshop was to review the electronics for LHC experiments and to identify areas and encourage common efforts for the development of electronics within and between the different LHC experiments and to promote collaboration in the engineering and physics communities involved in the LHC activities. (orig.)

  11. 6. workshop on electronics for LHC experiments. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-10-25

    The purpose of the workshop was to review the electronics for LHC experiments and to identify areas and encourage common efforts for the development of electronics within and between the different LHC experiments and to promote collaboration in the engineering and physics communities involved in the LHC activities. (orig.)

  12. Highlights from LHC experiments and future perspectives

    International Nuclear Information System (INIS)

    Campana, P.

    2016-01-01

    The experiments at the LHC are collecting a large amount of data in a kinematic region of the (x, Q²) variables never accessed before. Boosted by LHC analyses, Quantum Chromodynamics (QCD) has experienced impressive progress in the last few years, and even brighter perspectives can be foreseen for future data taking. A subset of the most recent results from the LHC experiments in the area of QCD (both perturbative and soft) is reviewed.

  13. Cryogenics for LHC experiments

    CERN Multimedia

    2001-01-01

    Cryogenic systems will be used by LHC experiments to maximize their performance. Institutes around the world are collaborating with CERN in the construction of these very low temperature systems. The cryogenic test facility in hall 180 for ATLAS magnets. High Energy Physics experiments have frequently adopted cryogenic versions of their apparatus to achieve optimal performance, and those for the LHC will be no exception. The two largest experiments for CERN's new flagship accelerator, ATLAS and CMS, will both use large superconducting magnets operated at 4.5 Kelvin - almost 270 degrees below the freezing point of water. ATLAS also includes calorimeters filled with liquid argon at 87 Kelvin. For the magnets, the choice of a cryogenic version was dictated by a combination of economy and transparency to emerging particles. For the calorimeters, liquid argon was selected as the fluid best suited to the experiment's physics requirements. High Energy Physics experiments are the result of worldwide collaborations and...

  14. Academic Training: The LHC machine /experiment interface

    CERN Multimedia

    Françoise Benz

    2005-01-01

    2004-2005 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 18, 19, 20, 21 & 22 April from 11.00 to 12.00 hrs - Main Auditorium, bldg. 500 The LHC machine /experiment interface S. TAPPROGGE, Univ. of Mainz, D, R. ASSMANN, CERN-AB E. TSESMELIS and D. MACINA, CERN-TS This series of lectures will cover some of the major issues at the boundary between the LHC machine and the experiments: 1) The physics motivation and expectations of the experiments regarding the machine operation. This will include an overview of the LHC physics programme (in pp and PbPb collisions), of the experimental signatures (from high pT objects to leading nucleons) and of the expected trigger rates as well as the data sets needed for specific measurements. Furthermore, issues related to various modes of operation of the machine (e.g. bunch spacings of 25 ns. vs. 75 ns.) and special requirements of the detectors for their commissioning will be described. 2) The LHC machine aspects: introduction of the main LHC parameters and discu...

  16. LHC Accelerator Fault Tracker - First Experience

    CERN Document Server

    Apollonio, Andrea; Roderick, Chris; Schmidt, Ruediger; Todd, Benjamin; Wollmann, Daniel

    2016-01-01

    Availability is one of the key performance indicators of LHC operation, being directly correlated with integrated luminosity production. An effective tool for availability tracking is a necessity to ensure a coherent capture of fault information and relevant dependencies on operational modes and beam parameters. At the beginning of LHC Run 2 in 2015, the Accelerator Fault Tracking (AFT) tool was deployed at CERN to track faults or events affecting LHC operation. Information derived from the AFT is crucial for the identification of areas to improve LHC availability, and hence LHC physics production. For the 2015 run, the AFT has been used by members of the CERN Availability Working Group, LHC Machine coordinators and equipment owners to identify the main contributors to downtime and to understand the evolution of LHC availability throughout the year. In this paper the 2015 experience with the AFT for availability tracking is summarised and an overview of the first results as well as an outlook to future develo...
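    The availability bookkeeping described above can be sketched in a few lines. The fault records and interval representation below are hypothetical, not the real AFT data model; the key point is that overlapping faults must be merged so concurrent failures are not double-counted:

```python
def availability(faults, total_hours):
    """Fraction of scheduled time not lost to faults.

    faults: list of (start_h, end_h) downtime intervals, in hours from
    the start of the period. Overlapping intervals are merged so that
    simultaneous faults are counted only once.
    """
    merged = []
    for start, end in sorted(faults):
        if merged and start <= merged[-1][1]:
            # fault overlaps the previous one: extend the merged interval
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    downtime = sum(end - start for start, end in merged)
    return 1.0 - downtime / total_hours

# Two overlapping faults (2-6 h after merging) plus one isolated fault
# (8-9 h) over a 20-hour period: 5 h of downtime, 75% availability.
print(availability([(2, 5), (4, 6), (8, 9)], 20))  # → 0.75
```

    Integrated-luminosity loss attribution would additionally weight each interval by the beam conditions at the time, which this sketch ignores.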

  17. Optical data transmission ASICs for the high-luminosity LHC (HL-LHC) experiments

    International Nuclear Information System (INIS)

    Li, X; Huang, G; Sun, X; Liu, G; Deng, B; Gong, D; Guo, D; Liu, C; Liu, T; Xiang, A C; Ye, J; Zhao, X; Chen, J; You, Y; He, M; Hou, S; Teng, P-K; Jin, G; Liang, H; Liang, F

    2014-01-01

    We present the design and test results of two optical data transmission ASICs for the High-Luminosity LHC (HL-LHC) experiments. These ASICs comprise a two-channel serializer (LOCs2) and a single-channel Vertical Cavity Surface Emitting Laser (VCSEL) driver (LOCld1V2). Both ASICs are fabricated in a commercial 0.25-μm Silicon-on-Sapphire (SoS) CMOS technology and operate at a data rate of up to 8 Gbps per channel. The power consumption of LOCs2 and LOCld1V2 is 1.25 W and 0.27 W at the 8-Gbps data rate, respectively. LOCld1V2 has been verified to meet the radiation-tolerance requirements for HL-LHC experiments.

  18. Experiment Dashboard for Monitoring of the LHC Distributed Computing Systems

    International Nuclear Information System (INIS)

    Andreeva, J; Campos, M Devesas; Cros, J Tarragon; Gaidioz, B; Karavakis, E; Kokoszkiewicz, L; Lanciotti, E; Maier, G; Ollivier, W; Nowotka, M; Rocha, R; Sadykov, T; Saiz, P; Sargsyan, L; Sidorova, I; Tuckett, D

    2011-01-01

    The LHC experiments are currently taking collision data. The distributed computing model chosen by the four main LHC experiments allows physicists to benefit from resources spread all over the world. The distributed model and the scale of LHC computing activities increase the level of complexity of the middleware, and also the chances of failures or inefficiencies in the components involved. In order to ensure the required performance and functionality of the LHC computing system, monitoring the status of the distributed sites and services, as well as monitoring LHC computing activities, is among the key factors. Over the last years, the Experiment Dashboard team has been working on a number of applications that facilitate the monitoring of different activities, including the follow-up of jobs and transfers as well as site and service availabilities. This presentation describes the Experiment Dashboard applications used by the LHC experiments and the experience gained during the first months of data taking.

  19. Physics perspectives with AFTER@LHC (A Fixed Target ExpeRiment at LHC)

    Directory of Open Access Journals (Sweden)

    Massacrier L.

    2018-01-01

    AFTER@LHC is an ambitious fixed-target project aiming to address open questions in the domain of proton and neutron spins, the Quark-Gluon Plasma and high-x physics, at the highest energy ever reached in fixed-target mode. Indeed, thanks to the highly energetic 7 TeV proton and 2.76 A TeV lead LHC beams, centre-of-mass energies as large as √s_NN = 115 GeV in pp/pA and √s_NN = 72 GeV in AA can be reached, corresponding to an uncharted energy domain between SPS and RHIC. We report two main ways of performing fixed-target collisions at the LHC, both allowing for the usage of one of the existing LHC experiments. In these proceedings, after discussing the projected luminosities considered for one year of data taking at the LHC, we present a selection of projections for light- and heavy-flavour production.
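    The quoted centre-of-mass energies follow from the standard fixed-target relation √s_NN ≈ √(2 E_beam m_N), valid for beam energies far above the nucleon mass. A quick check, using the beam energies from the abstract and the nucleon mass rounded to 0.94 GeV:

```python
import math

M_N = 0.94  # nucleon mass in GeV/c^2 (rounded)

def sqrt_s_nn(e_beam_gev):
    """Nucleon-nucleon centre-of-mass energy (GeV) for a fixed-target
    collision, in the high-energy limit E_beam >> m_N."""
    return math.sqrt(2.0 * e_beam_gev * M_N)

print(round(sqrt_s_nn(7000)))  # 7 TeV protons on a fixed target → ~115 GeV
print(round(sqrt_s_nn(2760)))  # 2.76 TeV/nucleon lead beam → ~72 GeV
```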

  20. Photodetection in the LHC experiments

    International Nuclear Information System (INIS)

    Joram, C.

    2012-01-01

    The challenging requirements on photodetection in the LHC experiments have motivated large-scale R and D efforts on various detector technologies, which started already in the 1990s. The state of the art of the LEP era would not have allowed satisfying the demanding needs, particularly for calorimetry and particle identification. After almost two decades of intense development, construction and integration efforts, the LHC and its four major experiments are performing in an exemplary manner, in many respects exceeding expectations. Hundreds of thousands of photodetectors with millions of readout channels contribute to this success story. This article aims at reviewing the main activities in photodetection, the initially achieved performance as well as some consolidation and first upgrade efforts.

  1. Detector technologies for LHC experiments

    CERN Document Server

    Hansl-Kozanecka, Traudl

    1999-01-01

    The Large Hadron Collider (LHC) at CERN will provide proton-proton collisions at a centre-of-mass energy of 14 TeV with a design luminosity of 10^34 cm^-2 s^-1. The exploitation of the rich physics potential is illustrated using the expected performance of the two general-purpose detectors, ATLAS and CMS. The lecture introduces the physics motivation for experiments at the LHC energy. The design parameters and expected performance of the LHC machine are then discussed, followed by the design objectives for the detectors. The technical solutions are presented for each detector system (calorimetry, muon system, inner tracker, trigger). For each system the requirements, the technology choices and the achieved and expected performance are discussed. Lectures given at the Herbstschule für Hochenergiephysik, Maria Laach, 1999. Copies of the transparencies are available in reduced format (black-and-white) from the secretariats of ATLAS and CMS (1999-093 Talk). A full-size colour version is available for consultation.e...
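    The design luminosity quoted above fixes the raw interaction rate via rate = L·σ. A back-of-the-envelope check, assuming an inelastic pp cross-section of roughly 80 mb at 14 TeV and the nominal 25 ns bunch spacing (both round-number assumptions, not figures from the record):

```python
L = 1e34             # design luminosity, cm^-2 s^-1
sigma_inel = 80e-27  # assumed inelastic pp cross-section, cm^2 (~80 mb)
f_bunch = 40e6       # bunch-crossing frequency for 25 ns spacing, Hz

rate = L * sigma_inel    # inelastic interactions per second
pileup = rate / f_bunch  # mean interactions per bunch crossing

print(f"{rate:.0e} interactions/s, ~{pileup:.0f} per bunch crossing")
```

    The ~20 interactions per crossing is the pileup that drives many of the detector design choices discussed in the lecture (the true average is somewhat higher because abort gaps leave fewer than 40 million filled crossings per second).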

  2. Strategy for the procurement of electronics for the LHC experiments

    CERN Document Server

    2002-01-01

    At its meeting on 14 March 2001 the Finance Committee requested the preparation of a document outlining the strategy for future procurement of electronics for the LHC experiments. The bulk of the electronics for the LHC experiments is based on custom-developed designs, the manufacturing of which will be contracted out to industry using the CERN purchasing procedures to ensure competitive prices. Analysis of on-going procurement activities for the electronics for the LHC experiments shows that in almost all cases the application of the CERN purchasing procedures has resulted in bids from a sufficient number of qualified companies to ensure competitive prices and a reasonable distribution of returns between CERN Member States. There is no reason to expect that this pattern will change significantly for the electronics that still remains to be purchased to complete the construction of the LHC experiments.

  3. Electronics for LHC Experiments

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This document gathers the abstracts of most presentations made at this workshop on electronics for the large hadron collider (LHC) experiments. The presentations were arranged into 6 sessions: 1) electronics for tracker, 2) trigger electronics, 3) detector control systems, 4) data acquisition, 5) electronics for calorimeters and electronics for muons, and 6) links, power systems, grounding and shielding, testing and quality assurance.

  4. Electronics for LHC Experiments

    International Nuclear Information System (INIS)

    2004-01-01

    This document gathers the abstracts of most presentations made at this workshop on electronics for the large hadron collider (LHC) experiments. The presentations were arranged into 6 sessions: 1) electronics for tracker, 2) trigger electronics, 3) detector control systems, 4) data acquisition, 5) electronics for calorimeters and electronics for muons, and 6) links, power systems, grounding and shielding, testing and quality assurance

  5. LHC experiences close encounters with UFOs

    CERN Multimedia

    Mike Lamont for the LHC Team

    2011-01-01

    On 29 May, yet another record was set as 1092 bunches per beam were injected into the LHC, hitting a peak luminosity of 1.26×10^33 cm^-2 s^-1. While running at 3.5 TeV, each beam now packs a total energy of over 70 MJ - equivalent to a TGV travelling at 70 kph.   Operators in the LHC Control Centre happily show off their display screens after successfully injecting 1092 bunches into the machine for the first time.  As the total beam intensity has been pushed up, the LHC has encountered a number of related problems, such as the so-called UFOs (Unidentified Falling Objects). These are thought to be dust particles falling through the beam, causing localized beam loss. The losses can push nearby beam loss monitors over the threshold and dump the beam. This is more of an annoyance than a danger for the LHC, but UFOs do reduce the operational efficiency of the machine. Despite this, the luminosity delivered to the experiments has steadily increased. On three occasions there ha...
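    The 70 MJ comparison can be reproduced from the numbers in the text plus two assumed round figures (the ~1.2×10^11 protons per bunch and the ~400 t TGV mass below are assumptions, not from the article):

```python
E_PROTON_J = 3.5e12 * 1.602e-19  # 3.5 TeV per proton, converted to joules

bunches = 1092
protons_per_bunch = 1.2e11       # assumed nominal bunch intensity
beam_energy = bunches * protons_per_bunch * E_PROTON_J

tgv_mass = 4.0e5                 # assumed TGV mass, kg (~400 t)
v = 70 / 3.6                     # 70 kph in m/s
tgv_energy = 0.5 * tgv_mass * v**2

print(f"beam: {beam_energy/1e6:.0f} MJ, TGV at 70 kph: {tgv_energy/1e6:.0f} MJ")
```

    Both come out in the mid-70 MJ range, consistent with the "over 70 MJ" quoted above.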

  6. Tracking considerations for fixed target B experiments at SSC and LHC

    International Nuclear Information System (INIS)

    McManus, A.P.; Conetti, S.; Corti, G.; Cox, B.; Dukes, E.C.; Lawry, T.; Nelson, K.; Tzamouranis, I.

    1993-01-01

    Fixed target beauty (B) experiments proposed at the SSC or LHC come in two basic types. Extracted-beam experiments use a bent silicon crystal or some other method to extract a beam of protons parasitically from the circulating beam while the collider experiments are taking data. The two chief extracted-beam experiments are the LHB collaboration at the LHC and the SFT collaboration at the SSC. The second type of fixed target experiment places the detector around the circulating beam, using a gas jet or thin wire(s) as a target. The GAJET experiment proposed at CERN for the LHC and the HERA-B experiment at DESY are of this type.

  7. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented, with a focus on experience with distributed analysis. After many years of development, preparation, exercises and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first six months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  8. Experience Supporting the Integration of LHC Experiments Software Framework with the LCG Middleware

    CERN Document Server

    Santinelli, Roberto

    2006-01-01

    The LHC experiments are currently preparing for data acquisition in 2007 and, because of the large amount of required computing and storage resources, they have decided to embrace the grid paradigm. The LHC Computing Grid (LCG) project provides and operates a computing infrastructure suitable for data handling, Monte Carlo production and analysis. While LCG offers a set of high-level services, intended to be generic enough to accommodate the needs of different Virtual Organizations, the LHC experiments' software frameworks and applications are very specific and focused on their computing and data models. The LCG Experiment Integration Support (EIS) team works in close contact with the experiments, the middleware developers and the LCG certification and operations teams to integrate the underlying grid middleware with the experiment-specific components. This strategic position between the experiments and the middleware suppliers allows the EIS team to play a key role at the communications level between the customers and the service provi...

  9. Review of 2011 LHC run from the experiments perspective

    Energy Technology Data Exchange (ETDEWEB)

    Ferro-Luzzi, M [European Organization for Nuclear Research, Geneva (Switzerland)

    2012-07-01

    The 2011 LHC run is reviewed from the experiments' perspective. The LHC achievements directly related to physics production are summarized. This includes high luminosity p-p and Pb-Pb running, special activities (such as intermediate energy p-p physics, 90 m optics, luminosity calibrations) and other experiments (for example satellite-main bunch collisions in IP2, 25 ns stable beams tests, etc.). (author)

  10. Elastic scattering of protons at the TOTEM experiment at the LHC

    CERN Document Server

    Csanád, Máté; Niewiadomski, Hubert

    The TOTEM experiment at the LHC at CERN is optimized to measure elastic and diffractive scattering at the LHC and measures the total proton-proton cross-section with the luminosity-independent method. The TOTEM experiment uses the special technique of movable beam pipe insertions -- called Roman Pots -- to detect very forward protons. The reconstruction of the forward proton kinematics requires a precise understanding of the LHC beam optics. A new method of LHC optics determination is reported, which exploits the kinematical distributions of elastically scattered proton-proton data measured by the Roman Pots of the TOTEM experiment. The method has been successfully applied to data samples recorded since 2010. The interpretation of the proton-proton elastic differential cross-section is a challenging task. The geometrical model of proton-proton elastic scattering of Bialas and Bzdak is fitted to ISR data and to data measured by the TOTEM experiment at the LHC energy of $\sqrt{s}=7$~TeV. The Bialas-Bzdak model is g...
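    For reference, the luminosity-independent method mentioned above combines the optical theorem with the simultaneously measured elastic and inelastic rates. In the standard notation, with ρ the ratio of the real to the imaginary part of the forward elastic amplitude, the total cross-section and the luminosity itself follow from the same measured quantities:

```latex
\sigma_{\mathrm{tot}} = \frac{16\pi\,(\hbar c)^2}{1+\rho^2}\,
  \frac{\left.\mathrm{d}N_{\mathrm{el}}/\mathrm{d}t\right|_{t=0}}
       {N_{\mathrm{el}} + N_{\mathrm{inel}}},
\qquad
\mathcal{L} = \frac{1+\rho^2}{16\pi\,(\hbar c)^2}\,
  \frac{(N_{\mathrm{el}} + N_{\mathrm{inel}})^2}
       {\left.\mathrm{d}N_{\mathrm{el}}/\mathrm{d}t\right|_{t=0}}.
```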

  11. TOTEM, a different LHC experiment

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    TOTEM will pursue a physics program (complementary to that of the other LHC detectors) spanning a wide range from total cross-section and elastic scattering measurements to the study of diffractive and forward phenomena. The TOTEM program will lead to a better understanding of the fundamental aspects of strong interactions. For the first time at hadron colliders, the very forward rapidity range, containing 90% of the energy flow and explored in high-energy cosmic ray experiments, is covered, allowing the search for unusual phenomena hinted at by cosmic ray experiments. The technical implementation of all TOTEM detectors is described. Silicon sensors housed in so-called Roman pots allow measurements of elastic and diffractive protons at distances as small as 1 mm from the beam centre. A scheme to tag events from Double-Pomeron-Exchange by diffractive protons on both sides transforms the LHC into an almost clean “gluon” collider, where the centre-of-mass energy is determined by the momentum losses of the ...

  12. Machine Protection for the Experiments of the LHC

    CERN Document Server

    Appleby, R B

    2010-01-01

    The LHC stored beam contains 362 MJ of energy at the top beam energy of 7 TeV, presenting a significant risk to the components of the machine and the detectors. In response to this threat, a sophisticated machine protection system has been developed to minimize the danger and detect potentially dangerous situations. In this paper, the protection of the experiments in the LHC from the machine is considered, focusing on pilot-beam strikes on the experiments during injection and on the dynamics of hardware failure with a circulating beam, with detailed time-domain calculations performed for LHC ring power converter failures and magnet quenches. The prospects for further integration of the machine protection and experimental protection systems are considered, along with the risk to near-beam detectors from closed local bumps.

  13. The LHC test string first operational experience

    CERN Document Server

    Bézaguet, Alain-Arthur; Casas-Cubillos, J; Coull, L; Cruikshank, P; Dahlerup-Petersen, K; Faugeras, Paul E; Flemsæter, B; Guinaudeau, H; Hagedorn, Dietrich; Hilbert, B; Krainz, G; Kos, N; Lavielle, D; Lebrun, P; Leo, G; Mathewson, A G; Missiaen, D; Momal, F; Parma, Vittorio; Quesnel, Jean Pierre; Richter, D; Riddone, G; Rijllart, A; Rodríguez-Mateos, F; Rohmig, P; Saban, R I; Schmidt, R; Serio, L; Skiadelli, M; Suraci, A; Tavian, L; Walckiers, L; Wallén, E; Van Weelderen, R; Williams, L; McInturff, A

    1996-01-01

    CERN operates the first version of the LHC Test String, which consists of one quadrupole and three 10-m twin-aperture dipole magnets. An experimental programme aiming at the validation of the LHC systems started in February 1995. During this programme the string has been powered 100 times, 35 of which were at 12.4 kA or above. The experiments have yielded a number of results, some of which, like quench recovery for cryogenics, have modified the design of subsystems of the LHC. Others, like controlled helium leaks in the cold bore and quench propagation between magnets, have given a better understanding of the evolution of the phenomena inside a string of superconducting magnets cooled to superfluid helium temperatures. Following the experimental programme, the string will be powered up and down in one-hour cycles as a fatigue test of the structure, thus simulating 20 years of operation of the LHC.

  14. Detector techniques and data acquisition for LHC experiments

    CERN Document Server

    Cittolin, Sergio; CERN. Geneva

    1996-01-01

    An overview of the technologies for LHC tracking detectors, particle identification and calorimeters will be given. In addition, the requirements on the front-end readout electronics for each type of detector will be addressed. The latest results from the R&D studies in each of the technologies will be presented. The data handling techniques needed to read out the LHC detectors and the multi-level trigger systems used to select the events of interest will be described. An overview of the LHC experiments' data acquisition architectures and their current state of development will be presented.

  15. Data storage accounting and verification at LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Huang, C. H. [Fermilab; Lanciotti, E. [CERN; Magini, N. [CERN; Ratnikova, N. [Moscow, ITEP; Sanchez-Hernandez, A. [CINVESTAV, IPN; Serfon, C. [Munich U.; Wildish, T. [Princeton U.; Zhang, X. [Beijing, Inst. High Energy Phys.

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The use of common solutions helps to reduce the maintenance costs, both at the large Tier1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements and solutions to the common tasks of data storage accounting and verification, and present experiment-specific strategies and implementations used within the LHC experiments according to their computing models.
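The catalog-versus-storage-dump consistency check described above can be sketched in a few lines. This is an illustrative example, not code from any experiment's actual accounting tools; the file names and the dict-based representation of catalog and dump entries (logical file name mapped to size and checksum) are assumptions made for the sketch.

```python
# Hypothetical sketch: cross-check a site storage dump against a central
# catalog using file size and checksum, as described in the abstract.
# Both inputs map logical file name -> (size_bytes, checksum).

def verify_storage_dump(catalog, dump):
    """Return files missing on storage, dark data, and mismatched entries."""
    missing = sorted(set(catalog) - set(dump))      # cataloged but not on disk
    dark = sorted(set(dump) - set(catalog))         # on disk but not cataloged
    corrupted = sorted(
        lfn for lfn in set(catalog) & set(dump)
        if catalog[lfn] != dump[lfn]                # size or checksum mismatch
    )
    return missing, dark, corrupted

catalog = {"/store/a.root": (100, "ad1f"), "/store/b.root": (200, "9c2e")}
dump    = {"/store/a.root": (100, "ad1f"), "/store/c.root": (50, "0b77")}
missing, dark, corrupted = verify_storage_dump(catalog, dump)
# missing == ["/store/b.root"], dark == ["/store/c.root"], corrupted == []
```

In practice the "dark data" and "missing file" lists would feed site cleanup and data-recovery operations respectively, which is why regular dumps matter.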

  16. Access safety systems - New concepts from the LHC experience

    International Nuclear Information System (INIS)

    Ladzinski, T.; Delamare, C.; Luca, S. di; Hakulinen, T.; Hammouti, L.; Havart, F.; Juget, J.F.; Ninin, P.; Nunes, R.; Riesco, T.; Sanchez-Corral Mena, E.; Valentini, F.

    2012-01-01

    The LHC Access Safety System has introduced a number of new concepts into the domain of personnel protection at CERN. These can be grouped into several categories: organisational, architectural and concerning the end-user experience. By anchoring the project on the solid foundations of the IEC 61508/61511 methodology, the CERN team and its contractors managed to design, develop, test and commission on time a SIL3 safety system. The system uses a successful combination of the latest Siemens redundant safety programmable logic controllers with a traditional relay logic hard-wired loop. The external envelope barriers used in the LHC include personnel and material access devices, which are interlocked door-booths introducing increased automation of individual access control, thus removing the strain from the operators. These devices protect the controlled zones against entry by users not holding the required credentials. To this end they are equipped with personnel presence detectors, and the access control includes a state-of-the-art biometric check. Building on the LHC experience, new projects targeting the refurbishment of the existing access safety infrastructure in the injector chain have started. This paper summarises the new concepts introduced in the LHC access control and safety systems, discusses the operational experience gained and outlines the main guiding principles for the renewal stage of the personnel protection systems in the LHC injector chain in a homogeneous manner. (authors)

  17. CMS experiment at the LHC Commissioning and early physics

    CERN Document Server

    Safonov, A

    2010-01-01

    The CMS collaboration used the past year to greatly improve the level of detector readiness for the first collision data. The operational experience acquired over this year, large gains in understanding the detector, and improved preparedness for early physics will be instrumental in minimizing the time from the first collisions to first LHC physics. The following describes the status of the CMS experiment and outlines early physics plans with the first LHC data.

  18. LHC experiments present new results at Quark Matter 2011 Conference

    CERN Multimedia

    CERN Press Office

    2011-01-01

    The three LHC experiments that study lead ion collisions all presented their latest results today at the annual Quark Matter conference, held this year in Annecy, France. The results are based on analysis of data collected during the last two weeks of the 2010 LHC run, when the LHC switched from protons to lead ions. All experiments report highly subtle measurements, bringing heavy-ion physics into a new era of high precision studies. Events recorded by the ALICE experiment from the first lead ion collisions (Nov-Dec 2010). “These results from the LHC lead ion programme are already starting to bring new understanding of the primordial universe,” said CERN Director General Rolf Heuer. “The subtleties they are already seeing are very impressive.” In its infancy, just microseconds after the Big Bang, the universe consisted of a plasma of quarks and gluons (QGP), the fundamental building blocks of matter. By colliding heavy ions, physicists can turn back time an...

  19. The ALICE experiment at the CERN LHC

    Energy Technology Data Exchange (ETDEWEB)

    Aamodt, K [Department of Physics, University of Oslo, Oslo (Norway); Abrahantes Quintana, A [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear (CEADEN), Madrid/Havana, Spain (Cuba); Achenbach, R [Kirchhoff-Institut fuer Physik, Ruprecht-Karls-Universitaet Heidelberg, Heidelberg, Germany BMBF (Germany); Acounis, S [SUBATECH, Ecole des Mines de Nantes, Universite de Nantes, CNRS/IN2P3, Nantes (France); Adamova, D [Academy of Sciences of the Czech Republic, Nuclear Physics Institute, Rez/Prague (Czech Republic); Adler, C [Physikalisches Institut, Ruprecht-Karls-Universitaet Heidelberg, Heidelberg, Germany BMBF (Germany); Aggarwal, M [Physics Department, Panjab University, Chandigarh (India); Agnese, F [IPHC, Universite Louis Pasteur, CNRS/IN2P3, Strasbourg (France); Rinella, G Aglieri [CERN, European Organization for Nuclear Research, Geneva (Switzerland); Ahammed, Z [Variable Energy Cyclotron Centre, Kolkata (India); Ahmad, A; Ahmad, N; Ahmad, S [Department of Physics, Aligarh Muslim University, Aligarh (India); Akindinov, A [Institute for Theoretical and Experimental Physics, Moscow (Russian Federation); Akishin, P [JINR, Joint Institute for Nuclear Research, Dubna (Russian Federation); Aleksandrov, D [Russian Research Center Kurchatov Institute, Moscow (Russian Federation); Alessandro, B; Alfarone, G [Sezione INFN, Torino (Italy); Alfaro, R [Instituto de Fisica, Universidad Nacional Autonoma de Mexico, Mexico City (Mexico); Alici, A [Dipartimento di Fisica dell' Universita and Sezione INFN, Bologna (Italy)], E-mail: Hans-Ake.Gustafsson@hep.lu.se (and others)

    2008-08-15

    ALICE (A Large Ion Collider Experiment) is a general-purpose, heavy-ion detector at the CERN LHC which focuses on QCD, the strong-interaction sector of the Standard Model. It is designed to address the physics of strongly interacting matter and the quark-gluon plasma at extreme values of energy density and temperature in nucleus-nucleus collisions. Besides running with Pb ions, the physics programme includes collisions with lighter ions, lower energy running and dedicated proton-nucleus runs. ALICE will also take data with proton beams at the top LHC energy to collect reference data for the heavy-ion programme and to address several QCD topics for which ALICE is complementary to the other LHC detectors. The ALICE detector has been built by a collaboration including currently over 1000 physicists and engineers from 105 Institutes in 30 countries. Its overall dimensions are 16 x 16 x 26 m³ with a total weight of approximately 10 000 t. The experiment consists of 18 different detector systems each with its own specific technology choice and design constraints, driven both by the physics requirements and the experimental conditions expected at LHC. The most stringent design constraint is to cope with the extreme particle multiplicity anticipated in central Pb-Pb collisions. The different subsystems were optimized to provide high-momentum resolution as well as excellent Particle Identification (PID) over a broad range in momentum, up to the highest multiplicities predicted for LHC. This will allow for comprehensive studies of hadrons, electrons, muons, and photons produced in the collision of heavy nuclei. Most detector systems are scheduled to be installed and ready for data taking by mid-2008 when the LHC is scheduled to start operation, with the exception of parts of the Photon Spectrometer (PHOS), Transition Radiation Detector (TRD) and Electro Magnetic Calorimeter (EMCal). These detectors will be completed for the high-luminosity ion run expected in 2010.

  20. Academic Training: Technological challenges for LHC experiments, the CMS example

    CERN Multimedia

    Françoise Benz

    2005-01-01

    2004-2005 ACADEMIC TRAINING PROGRAMME LECTURE SERIES
    Technological challenges for LHC experiments, the CMS example
    28 February and 1, 2, 3 & 4 March, from 11.00 to 12.00 hrs - Main Auditorium, bldg. 500
    by P. Sphicas/CERN-PH, G. Dissertori/ETH Zürich (CH), M. Mannelli/CERN-PH, G. Hall/Imperial College London (GB) and P. Fabbricatore/INFN Genova (I)
    Monday 28 February: Design principles and performances of CMS (P. Sphicas/CERN-PH)
    Tuesday 1 March: Crystal calorimetry in LHC environment (G. Dissertori/ETH Zürich, CH)
    Wednesday 2 March: Silicon tracking in LHC environment (M. Mannelli/CERN-PH)
    Thursday 3 March: Radhard fast electronics for LHC experiments (G. Hall/Imperial College London, GB)
    Friday 4 March: Design principles of thin high field superconducting solenoids (P. Fabbricatore/INFN Genova, I)
    ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING
    Françoise Benz 73127 academic.training@cern.ch

  1. ATLAS. LHC experiments

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    In Greek mythology, Atlas was a Titan who had to hold up the heavens with his hands as a punishment for having taken part in a revolt against the Olympians. For LHC, the ATLAS detector will also have an onerous physics burden to bear, but this is seen as a golden opportunity rather than a punishment. The major physics goal of CERN's LHC proton-proton collider is the quest for the long-awaited 'Higgs' mechanism which drives the spontaneous symmetry breaking of the electroweak Standard Model picture. The large ATLAS collaboration proposes a large general-purpose detector to exploit the full discovery potential of LHC's proton collisions. LHC will provide proton-proton collision luminosities at the awe-inspiring level of 10³⁴ cm⁻² s⁻¹, with initial running-in at 10³³. The ATLAS philosophy is to handle as many signatures as possible at all luminosity levels, with the initial running providing more complex possibilities. The ATLAS concept was first presented as a Letter of Intent to the LHC Committee in November 1992. Following initial presentations at the Evian meeting ('Towards the LHC Experimental Programme') in March of that year, two ideas for general-purpose detectors, the ASCOT and EAGLE schemes, merged, with Friedrich Dydak (MPI Munich) and Peter Jenni (CERN) as ATLAS co-spokesmen. Since the initial Letter of Intent presentation, the ATLAS design has been optimized and developed, guided by physics performance studies and the LHC-oriented detector R&D programme (April/May, page 3). The overall detector concept is characterized by an inner superconducting solenoid (for inner tracking) and large superconducting air-core toroids outside the calorimetry. This solution avoids constraining the calorimetry while providing a high resolution, large acceptance and robust detector. The outer magnet will extend over a length of 26 metres, with an outer diameter of almost 20 metres. The total weight of the detector is 7,000 tonnes. Fitted with its end

  2. New strategies of the LHC experiments to meet the computing requirements of the HL-LHC era

    CERN Document Server

    Adamova, Dagmar

    2017-01-01

    The performance of the Large Hadron Collider (LHC) during the ongoing Run 2 is above expectations both concerning the delivered luminosity and the LHC live time. This has resulted in a volume of data much larger than originally anticipated. Based on the current data production levels and the structure of the LHC experiment computing models, the estimates of the data production rates and resource needs were re-evaluated for the era leading into the High Luminosity LHC (HL-LHC), the Run 3 and Run 4 phases of LHC operation. It turns out that the raw data volume will grow 10 times by the HL-LHC era and the processing capacity needs will grow more than 60 times. While the growth of storage requirements might in principle be satisfied with a 20 per cent budget increase and technology advancements, there is a gap of a factor 6 to 10 between the needed and available computing resources. The threat of a lack of computing and storage resources was already present at the beginning of Run 2, but could still be mitigated, e.g....
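The resource-gap arithmetic sketched in the abstract can be made concrete with a toy model. The flat-budget assumption and the ~20% yearly price/performance gain below are illustrative assumptions (one common way such projections are framed), not figures taken from the paper beyond the 60x processing growth the abstract quotes.

```python
# Toy model of the computing resource gap: a constant budget with a yearly
# price/performance gain g buys (1 + g)**n more capacity after n years.
# Numbers are illustrative; only the 60x need is quoted from the abstract.

def capacity_growth(yearly_gain: float, years: int) -> float:
    """Capacity multiplier achievable at flat budget after `years` years."""
    return (1.0 + yearly_gain) ** years

needed = 60.0                           # processing needs by the HL-LHC era
affordable = capacity_growth(0.20, 10)  # flat budget, ~20%/yr gain, ~10 years
gap = needed / affordable               # falls in the quoted factor 6-10 range
```

Under these assumptions a flat budget delivers only about a 6x capacity increase over a decade, which is how a 60x need turns into a gap of roughly an order of magnitude.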

  3. Spin and diffractive physics with a fixed-target experiment at the LHC (AFTER-LHC)

    Energy Technology Data Exchange (ETDEWEB)

    Lorce, C.; Chambert, V.; Didelez, J. P.; Genolini, B.; Hadjidakis, C.; Lansberg, J. P.; Rosier, P. [IPNO, Universite Paris-Sud, CNRS/IN2P3, F-91406, Orsay (France); Anselmino, M.; Arnaldi, R.; Scomparin, E. [INFN Sez. Torino, Via P. Giuria 1, I-10125, Torino (Italy); Brodsky, S. J. [SLAC National Accelerator Laboratory, Stanford U, Stanford, CA 94309 (United States); Ferreiro, E. G. [Departamento de Fisica de Particulas, Univ. de Santiago de Compostela, 15782 Santiago de Compostela (Spain); Fleuret, F. [Laboratoire Leprince Ringuet, Ecole Polytechnique, CNRS/IN2P3, 91128 Palaiseau (France); Rakotozafindrabe, A. [IRFU/SPhN, CEA Saclay, 91191 Gif-sur-Yvette Cedex (France); Schienbein, I. [LPSC, Universite Joseph Fourier, CNRS/IN2P3/INPG, F-38026 Grenoble (France); Uggerhoj, U. I. [Department of Physics and Astronomy, University of Aarhus (Denmark)

    2013-04-15

    We report on the spin and diffractive physics at a future multi-purpose fixed-target experiment with proton and lead LHC beams extracted by a bent crystal. The LHC multi-TeV beams allow for the most energetic fixed-target experiments ever performed, opening new domains of particle and nuclear physics and complementing that of collider physics, in particular that of RHIC and the EIC projects. The luminosity achievable with AFTER using typical targets would surpass that of RHIC by more than 3 orders of magnitude. The fixed-target mode has the advantage of allowing measurements of single-spin asymmetries with a polarized target, as well as of single-diffractive processes in the target region.

  4. Spin and diffractive physics with a fixed-target experiment at the LHC (AFTER-LHC)

    International Nuclear Information System (INIS)

    Lorcé, C.; Chambert, V.; Didelez, J. P.; Genolini, B.; Hadjidakis, C.; Lansberg, J. P.; Rosier, P.; Anselmino, M.; Arnaldi, R.; Scomparin, E.; Brodsky, S. J.; Ferreiro, E. G.; Fleuret, F.; Rakotozafindrabe, A.; Schienbein, I.; Uggerhøj, U. I.

    2013-01-01

    We report on the spin and diffractive physics at a future multi-purpose fixed-target experiment with proton and lead LHC beams extracted by a bent crystal. The LHC multi-TeV beams allow for the most energetic fixed-target experiments ever performed, opening new domains of particle and nuclear physics and complementing that of collider physics, in particular that of RHIC and the EIC projects. The luminosity achievable with AFTER using typical targets would surpass that of RHIC by more than 3 orders of magnitude. The fixed-target mode has the advantage of allowing measurements of single-spin asymmetries with a polarized target, as well as of single-diffractive processes in the target region.

  5. The Physics Programme Of The MoEDAL Experiment At The LHC

    CERN Document Server

    Acharya, B.; Bernabeu, J.; Campbell, M.; Cecchini, S.; Chwastowski, J.; De Montigny, M.; Derendarz, D.; De Roeck, A.; Ellis, J.R.; Fairbairn, M.; Felea, D.; Frank, M.; Frekers, D.; Garcia, C.; Giacomelli, G.; Giorgini, M.; Hasegan, D.; Hott, T.; Jakůbek, J.; Katre, A.; Kim, D-W.; King, M.G.L.; Kinoshita, K.; Lacarrere, D.; Lee, S.C.; Leroy, C.; Margiotta, A.; Mauri, N.; Mavromatos, N.E.; Mermod, P.; Mitsou, V.A.; Orava, R.; Pasqualini, L.; Patrizii, L.; Pavalas, G.E.; Pinfold, J.L.; Platkevic, M.; Popa, V.; Pozzato, M.; Pospisil, S.; Rajantie, A.; Sahnoun, Z.; Sakellariadou, M.; Sarkar, S.; Semenoff, G.; Sirri, G.; Sliwa, K.; Soluk, R.; Spurio, M.; Srivastava, Y.N.; Staszewski, R.; Swain, J.; Tenti, M.; Togo, V.; Trzebinski, M.; Tuszynski, J.A.; Vento, V.; Vives, O.; Vykydal, Z.; Widom, A.; Yoon, J.H.

    2014-01-01

    The MoEDAL experiment at Point 8 of the LHC ring is the seventh and newest LHC experiment. It is dedicated to the search for highly ionizing particle avatars of physics beyond the Standard Model, extending significantly the discovery horizon of the LHC. A MoEDAL discovery would have revolutionary implications for our fundamental understanding of the Microcosm. MoEDAL is an unconventional and largely passive LHC detector comprising the largest array of Nuclear Track Detector stacks ever deployed at an accelerator, surrounding the intersection region at Point 8 on the LHC ring. Another novel feature is the use of paramagnetic trapping volumes to capture both electrically and magnetically charged highly-ionizing particles predicted in new physics scenarios. It includes an array of TimePix pixel devices for monitoring highly-ionizing particle backgrounds. The main passive elements of the MoEDAL detector do not require a trigger system, electronic readout, or online computerized data acquisition. The aim of this...

  6. Handbook of LHC Higgs Cross Sections: 3. Higgs Properties Report of the LHC Higgs Cross Section Working Group

    CERN Document Server

    Heinemeyer, S; Passarino, G; Tanaka, R; Andersen, J R; Artoisenet, P; Bagnaschi, E A; Banfi, A; Becher, T; Bernlochner, F U; Bolognesi, S; Bolzoni, P; Boughezal, R; Buarque, D; Campbell, J; Caola, F; Carena, M; Cascioli, F; Chanon, N; Cheng, T; Choi, S Y; David, A; de Aquino, P; Degrassi, G; Del Re, D; Denner, A; van Deurzen, H; Diglio, S; Di Micco, B; Di Nardo, R; Dittmaier, S; Dührssen, M; Ellis, R K; Ferrera, G; Fidanza, N; Flechl, M; de Florian, D; Forte, S; Frederix, R; Frixione, S; Gangal, S; Gao, Y; Garzelli, M V; Gillberg, D; Govoni, P; Grazzini, M; Greiner, N; Griffiths, J; Gritsan, A V; Grojean, C; Hall, D C; Hays, C; Harlander, R; Hernandez-Pinto, R; Höche, S; Huston, J; Jubb, T; Kadastik, M; Kallweit, S; Kardos, A; Kashif, L; Kauer, N; Kim, H; Klees, R; Krämer, M; Krauss, F; Laureys, A; Laurila, S; Lehti, S; Li, Q; Liebler, S; Liu, X; Logan, E; Luisoni, G; Malberti, M; Maltoni, F; Mawatari, K; Maierhoefer, F; Mantler, H; Martin, S; Mastrolia, P; Mattelaer, O; Mazzitelli, J; Mellado, B; Melnikov, K; Meridiani, P; Miller, D J; Mirabella, E; Moch, S O; Monni, P; Moretti, N; Mück, A; Mühlleitner, M; Musella, P; Nason, P; Neu, C; Neubert, M; Oleari, C; Olsen, J; Ossola, G; Peraro, T; Peters, K; Petriello, F; Piacquadio, G; Potter, C T; Pozzorini, S; Prokofiev, K; Puljak, I; Rauch, M; Rebuzzi, D; Reina, L; Rietkerk, R; Rizzi, A; Rotstein-Habarnau, Y; Salam, G P; Sborlini, G; Schissler, F; Schönherr, M; Schulze, M; Schumacher, M; Siegert, F; Slavich, P; Smillie, J M; Stål, O; von Soden-Fraunhofen, J F; Spira, M; Stewart, I W; Tackmann, F J; Taylor, P T E; Tommasini, D; Thompson, J; Thorne, R S; Torrielli, P; Tramontano, F; Tran, N V; Trócsányi, Z; Ubiali, M; Vazquez Acosta, M; Vickey, T; Vicini, A; Waalewijn, W J; Wackeroth, D; Wagner, C; Walsh, J R; Wang, J; Weiglein, G; Whitbeck, A; Williams, C; Yu, J; Zanderighi, G; Zanetti, M; Zaro, M; Zerwas, P M; Zhang, C; Zirke, T J E; Zuberi, S

    2013-01-01

    This Report summarizes the results of the activities in 2012 and the first half of 2013 of the LHC Higgs Cross Section Working Group. The main goal of the working group was to present the state of the art of Higgs Physics at the LHC, integrating all new results that have appeared in the last few years. This report follows the first working group report Handbook of LHC Higgs Cross Sections: 1. Inclusive Observables (CERN-2011-002) and the second working group report Handbook of LHC Higgs Cross Sections: 2. Differential Distributions (CERN-2012-002). After the discovery of a Higgs boson at the LHC in mid-2012, this report focuses on refined predictions of Standard Model (SM) Higgs phenomenology around the experimentally observed value of 125-126 GeV, refined predictions for heavy SM-like Higgs bosons as well as predictions in the Minimal Supersymmetric Standard Model and first steps to go beyond these models. The other main focus is on the extraction of the characteristics and properties of the newly discovered p...

  7. Characterization and performance optimization of radiation monitoring sensors for high energy physics experiments at the CERN LHC and Super-LHC

    CERN Document Server

    Mekki, Julien

    2009-01-01

    In order to study the fundamental constituents of matter, a new particle accelerator named the Large Hadron Collider (LHC) has been built at CERN. The radiation environment generated by the hadron collisions in the high energy physics experiments of the LHC will be complex and locally very intense. To monitor this complex radiation field, dosimeters have been installed in the LHC experiments. In a previous study, RadFET dosimeters and PIN diodes were characterized for use in the particle accelerator. However, even though the RadFET sensors have already been extensively characterized, their radiation response can be affected by their package. Depending on the material and the geometry, the package can induce errors in the dose measurement. In this thesis, a complete study has been carried out in order to evaluate its influence. Concerning the PIN diodes, the readout protocol used for the LHC is no longer suitable for the Super-LHC. Therefore, a complete study on their radiation response has been p...

  8. COOL, LCG Conditions Database for the LHC Experiments Development and Deployment Status

    CERN Document Server

    Valassi, A; Clemencic, M; Pucciani, G; Schmidt, S A; Wache, M; CERN. Geneva. IT Department, DM

    2009-01-01

    The COOL project provides common software components and tools for the handling of the conditions data of the LHC experiments. It is part of the LCG Persistency Framework (PF), a broader project set up within the context of the LCG Application Area (AA) to devise common persistency solutions for the LHC experiments. COOL software development is the result of the collaboration between the CERN IT Department and ATLAS and LHCb, the two experiments that have chosen it as the basis of their conditions database infrastructure. COOL supports conditions data persistency using several relational technologies (Oracle, MySQL, SQLite and FroNTier), based on the CORAL Common Relational Abstraction Layer. For both experiments, Oracle is the backend used for the deployment of COOL database services at Tier0 and Tier1 sites of the LHC Computing Grid. While the development of new software functionalities is being frozen as LHC operations are ramping up, the main focus for the project in 2008 has shifted to performance optimi...

  9. The miniature optical transmitter and transceiver for the High-Luminosity LHC (HL-LHC) experiments

    International Nuclear Information System (INIS)

    Liu, C; Zhao, X; Deng, B; Gong, D; Guo, D; Li, X; Liang, F; Liu, G; Liu, T; Xiang, A C; Ye, J; Chen, J; Huang, D; Hou, S; Teng, P-K

    2013-01-01

    We present the design and test results of the Miniature optical Transmitter (MTx) and Transceiver (MTRx) for the high luminosity LHC (HL-LHC) experiments. MTx and MTRx are based on Transmitter Optical Subassemblies (TOSAs) and Receiver Optical Subassemblies (ROSAs). There are two major developments: the Vertical Cavity Surface Emitting Laser (VCSEL) driver ASIC LOCld and the mechanical latch that provides the connection to fibers. In this paper, we concentrate on the justification of this work, the design of the latch and the test results of these two modules with a Commercial Off-The-Shelf (COTS) VCSEL driver

  10. The LHC experiments' joint controls project (JCOP)

    International Nuclear Information System (INIS)

    Wayne Salter

    2001-01-01

    The development and maintenance of the control systems of the four Large Hadron Collider (LHC) experiments will require a non-negligible amount of resources and effort. In order to minimise the overall effort required, the Joint Controls Project (JCOP) was set up as a collaboration between CERN and the four LHC experiments to find and implement common solutions for the control of the LHC experiments. It is one of the few examples of such a wide collaboration, and the existence of the JCOP project is therefore extremely significant. The author will give a brief overview of the project, its structure and its history. The paper will go on to summarise the various sub-projects that have been initiated under the auspices of JCOP, together with their current status. It will highlight that the JCOP general principle is to promote the use of industrial solutions wherever possible. However, this does not rule out the provision of custom solutions when non-standard devices or very large numbers of devices have to be controlled. The author will also discuss the architecture foreseen by JCOP and where in this architecture the various types of solutions are expected to be used. Finally, although the selection of common industrial and custom solutions is a necessary condition for JCOP to succeed, the use of these solutions in itself would not necessarily lead to the production of homogeneous control systems. Therefore, the author will finish with a description of the JCOP Framework, which is being developed to promote the use of these common solutions, to reduce the development effort required by the various experiment development teams and to help to build and integrate control systems which can be more easily maintained.

  11. Double-quarkonium production at a fixed-target experiment at the LHC (AFTER@LHC)

    CERN Document Server

    Lansberg, Jean-Philippe

    2015-01-01

    We present predictions for double-quarkonium production in the kinematical region relevant for the proposed fixed-target experiment using the LHC beams (dubbed AFTER@LHC). These include all spin-triplet S-wave charmonium and bottomonium pairs, i.e. Psi(n_1S) + Psi(n_2S), Psi(n_1S) + Upsilon(m_1S) and Upsilon(m_1S) + Upsilon(m_2S) with n_1,n_2 = 1,2 and m_1,m_2 = 1,2,3. We calculate the contributions from double-parton scatterings and single-parton scatterings. With an integrated luminosity of 20 fb⁻¹ to be collected at AFTER@LHC, we find that the yields for double-charmonium production are large enough for differential distribution measurements. We discuss some differential distributions for J/Psi + J/Psi production, which can help to study the physics of double-parton and single-parton scatterings in a new energy range and which might also be sensitive to double intrinsic c c̄ coalescence at large negative Feynman x.

  12. Data storage accounting and verification in LHC experiments

    CERN Document Server

    Ratnikova ,Natalia

    2012-01-01

    All major experiments at the Large Hadron Collider (LHC) need to measure real storage usage at the Grid sites. This information is equally important for resource management, planning, and operations. To verify the consistency of the central catalogs, experiments are asking sites to provide a full list of the files they have on storage, including size, checksum, and other file attributes. Such storage dumps, provided at regular intervals, give a realistic view of the storage resource usage by the experiments. Regular monitoring of the space usage and data verification serve as additional internal checks of the system integrity and performance. Both the importance and the complexity of these tasks increase with the constant growth of the total data volumes during the active data taking period at the LHC. The common solutions developed help to reduce the maintenance costs, both at the large Tier-1 facilities supporting multiple virtual organizations and at the small sites that often lack manpower. We discuss requirements...

  13. Experimental modal analysis of components of the LHC experiments

    CERN Document Server

    Guinchard, M; Catinaccio, A; Kershaw, K; Onnela, A

    2007-01-01

    Experimental modal analysis of components of the LHC experiments is performed with the purpose of determining their fundamental frequencies, their damping, and the mode shapes of light and fragile detector components. This process makes it possible to confirm or replace Finite Element analysis in the case of complex structures (with cables and substructure coupling). It helps solve structural mechanical problems to improve the operational stability and determine the acceleration specifications for transport operations. This paper describes the hardware and software equipment used to perform a modal analysis on particular structures such as a particle detector, and the method of curve fitting used to extract the results of the measurements. This paper also presents the main results obtained for the LHC experiments.
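As a rough illustration of the kind of parameter extraction involved, the sketch below estimates a damped natural frequency and damping ratio from a synthetic free-decay signal using the logarithmic-decrement method. This is a conceptual example only; the measurements described in the paper used dedicated modal-analysis hardware and curve-fitting software, and the 12 Hz / 2% figures below are invented for the demonstration.

```python
import numpy as np

# Hypothetical sketch: extract damped frequency and damping ratio from a
# free-decay response via the logarithmic decrement between successive peaks.

def log_decrement(t, x):
    """Estimate damped frequency [Hz] and damping ratio from peak decay."""
    peaks = [i for i in range(1, len(x) - 1) if x[i] > x[i - 1] and x[i] > x[i + 1]]
    t_pk, x_pk = t[peaks], x[peaks]
    delta = np.mean(np.log(x_pk[:-1] / x_pk[1:]))    # logarithmic decrement
    zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)  # damping ratio
    f_d = 1.0 / np.mean(np.diff(t_pk))               # frequency from peak spacing
    return f_d, zeta

# Synthetic free decay: a 12 Hz mode with 2% damping (illustrative values).
fn, zeta_true = 12.0, 0.02
t = np.linspace(0.0, 2.0, 20000)
wd = 2 * np.pi * fn * np.sqrt(1 - zeta_true**2)
x = np.exp(-zeta_true * 2 * np.pi * fn * t) * np.cos(wd * t)
f_est, zeta_est = log_decrement(t, x)   # recovers ~12 Hz and ~0.02
```

Real modal tests fit multiple modes at once from measured frequency-response functions, but the single-mode decay above captures the basic idea of extracting frequency and damping from a response signal.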

  14. The ALICE experiment at the CERN LHC

    NARCIS (Netherlands)

    Aamodt, K.; de Haas, A.P.; Grebenyuk, O.; Ivan, C.G.; Kamermans, R.; Mischke, A.; Nooren, G.J.L.; Oskamp, C.J.; Peitzmann, T.; Simili, E.; van den Brink, A.; van Eijndhoven, N.J.A.M.; Yuting, B.

    2008-01-01

    ALICE (A Large Ion Collider Experiment) is a general-purpose, heavy-ion detector at the CERN LHC which focuses on QCD, the strong-interaction sector of the Standard Model. It is designed to address the physics of strongly interacting matter and the quark-gluon plasma at extreme values of energy

  15. A water-cooling solution for PC-racks of the LHC experiments

    CERN Document Server

    Vannerem, P

    2004-01-01

    With the ever increasing power consumption and heat dissipation of today's CPUs, cooling of rack-mounted PCs is an issue for the future online farms of the LHC experiments. In order to investigate the viability of a water-cooling solution, a prototype PC-farm rack has been equipped with a commercially available retrofitted heat exchanger. The project has been carried out as a collaboration of the four LHC experiments and the PH-ESS group. This note reports on the results of a series of cooling and power measurements of the prototype rack with configurations of 30 to 48 PCs. The cooling performance of the rack cooler is found to be adequate; it extracts the heat dissipated by the CPUs efficiently into the cooling water. Hence, the closed PC rack transfers almost no heat into the room. The measurements and the failure tests show that the rack-cooler concept is a viable solution for the future PC farms of the LHC experiments.
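The heat such a rack cooler must extract follows from the basic water-cooling relation Q = ṁ·c_p·ΔT for the cooling circuit. The flow rate, temperature rise, and per-PC power below are illustrative assumptions for a back-of-the-envelope check, not the measured values from the note.

```python
# Back-of-the-envelope sketch of water-cooled rack heat extraction:
# Q = m_dot * c_p * dT. All numbers are illustrative assumptions.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def extracted_power_kw(flow_l_per_min: float, delta_t_k: float) -> float:
    """Heat removed by the water circuit in kW (1 L of water ~ 1 kg)."""
    m_dot = flow_l_per_min / 60.0               # mass flow in kg/s
    return m_dot * CP_WATER * delta_t_k / 1000.0

# A rack of 40 PCs at ~250 W each dissipates ~10 kW; at a hypothetical
# 20 L/min flow, removing that heat warms the water by roughly 7 K.
q = extracted_power_kw(20.0, 7.2)   # ~10 kW
```

The interesting design consequence, as the note concludes, is that a circuit sized this way lets the closed rack dump essentially all of its heat into the water rather than the room.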

  16. Decentralized Data Storage and Processing in the Context of the LHC Experiments at CERN

    CERN Document Server

    Blomer, Jakob; Fuhrmann, Thomas

    The computing facilities used to process data for the experiments at the Large Hadron Collider (LHC) at CERN are scattered around the world. The embarrassingly parallel workload allows for use of various computing resources, such as computer centers comprising the Worldwide LHC Computing Grid, commercial and institutional cloud resources, as well as individual home PCs in “volunteer clouds”. Unlike data, the experiment software and its operating system dependencies cannot be easily split into small chunks. Deployment of experiment software on distributed grid sites is challenging since it consists of millions of small files and changes frequently. This thesis develops a systematic approach to distribute a homogeneous runtime environment to a heterogeneous and geographically distributed computing infrastructure. A uniform bootstrap environment is provided by a minimal virtual machine tailored to LHC applications. Based on a study of the characteristics of LHC experiment software, the thesis argues for the ...

  17. Experiments and Cycling at the LHC Prototype Half-Cell

    Science.gov (United States)

    Saban, R.; Casas-Cubillos, J.; Coull, L.; Cruikshank, P.; Dahlerup-Petersen, K.; Hilbert, B.; Krainz, G.; Kos, N.; Lebrun, P.; Momal, F.; Misiaen, D.; Parma, V.; Poncet, A.; Riddone, G.; Rijllart, A.; Rodriguez-Mateos, F.; Schmidt, R.; Serio, L.; Wallen, E.; van Weelderen, R.; Williams, L. R.

    1997-05-01

    The first version of the LHC prototype half-cell has been in operation since February 1995. It consists of one quadrupole and three 10-m twin-aperture dipole magnets which operate at 1.8 K. This experimental set-up has been used to observe and study phenomena which appear when the systems are assembled in one unit and influence one another. The 18-month experimental program has validated the cryogenic system and yielded a number of results on cryogenic instrumentation, magnet protection and vacuum, in particular under non-standard operating conditions. The program was recently complemented by the cycling experiment: it consisted of powering the magnets following the ramp rates which the magnets will experience during an LHC injection. In order to simulate 10 years of routine LHC operation, more than 2000 one-hour cycles were performed, interleaved with provoked quenches. The objective of this experiment was to reveal potential flaws in the design of components. The prototype half-cell performed to expectations, showing no sign of failure or fatigue of components for more than 2000 cycles, until one of the dipoles started exhibiting an erratic quench behavior.

  18. NEEDS for LHC experiment planning from results of very high energy cosmic ray investigations (NEEDS-2)

    Directory of Open Access Journals (Sweden)

    Petrukhin A.A.

    2015-01-01

    12 years ago, at the 12th ISVHECRI, a special NEEDS workshop was held to discuss the future LHC data required for the interpretation of cosmic ray experiments. Now, when the main task of the LHC is solved – the Higgs boson has been discovered – the question "What will be next?" is highly relevant. In this paper the results of cosmic ray experiments at LHC energies are considered, and their possible explanation in the framework of a new model of production of quark-gluon matter blobs is discussed. The necessity of passing, in LHC experiments, from investigations of pp interactions to investigations of nucleus-nucleus interactions is underlined, since cosmic rays consist mainly of nuclei (≈ 60%) which interact with nuclei of air; it is precisely in these nucleus-nucleus interactions that many unusual results were obtained in cosmic ray investigations. Corresponding tasks for future LHC experiments are proposed.

  19. The LHC experiment control system: on the path to full automation

    International Nuclear Information System (INIS)

    Gaspar, C.; Alessio, F.; Cardoso, L.; Frank, M.; Garnier, J.C.; Herwijnen, E.V.; Jacobsson, R.; Jost, B.; Neufeld, N.; Schwemmer, R.; Callot, O.; Franek, B.

    2012-01-01

    LHCb is a large experiment at the LHC accelerator. The experiment control system is in charge of the configuration, control and monitoring of the different sub-detectors and of all areas of the online system. The building blocks of the control system are based on the PVSS SCADA system, complemented by a control framework developed in common for the four LHC experiments. This framework includes an 'expert system'-like tool called SMI++, which is used for the system automation. The experiment's operations are now almost completely automated, driven by a top-level object called Big-Brother, which pilots all the experiment's standard procedures and the most common error-recovery procedures. The architecture, tools and mechanisms used for the implementation, as well as some operational examples, will be described. (authors)

  20. The LHCf experiment modelling cosmic rays at LHC

    CERN Document Server

    Tricomi, A; Bonechi, L; Bongi, M; Castellini, G; D'Alessandro, R; Faus, A; Fukui, K; Haguenauer, M; Itow, Y; Kasahara, K; Macina, D; Mase, T; Masuda, K; Matsubara, Y; Mizuishi, M; Menjo, H; Muraki, Y; Papini, P; Perrot, A L; Ricciarini, S B; Sako, T; Shimizu, Y; Tamura, T; Taki, K; Torii, S; Tricomi, A; Turner, W C; Velasco, J; Watanabe, H; Yoshida, K

    2008-01-01

    The LHCf experiment at LHC has been designed to provide a calibration of nuclear interaction models used in cosmic ray physics up to energies relevant to test the region between the knee and the GZK cut-off. Details of the detector and its performances are discussed.

  1. Jet calibration in the ATLAS experiment at LHC

    CERN Document Server

    Francavilla, P

    2009-01-01

    Jets produced in the hadronisation of quarks and gluons play a central role in the rich physics program that will be covered by the ATLAS experiment at the LHC, and are central elements of the signature for many physics channels. A well understood energy scale, which for some processes demands an uncertainty in the energy scale of order 1%, is a prerequisite. Moreover, in early data we face the challenge of dealing with the unexpected issues of a brand new detector in an unexplored energy domain. The ATLAS collaboration is carrying out a program to revisit the jet calibration strategies used in earlier hadron-collider experiments and develop a strategy which takes into account the new experimental problems introduced from higher measurement precision and from the LHC environment. The ATLAS calorimeter is intrinsically non-compensating and we will discuss the use of different offline approaches based on cell energy density and jet topology to correct the linearity response while improving the resolution. In ad...

  2. Benchmarking the Particle Background in the LHC Experiments

    CERN Document Server

    Gschwendtner, E

    2000-01-01

    The experiments for the Large Hadron Collider LHC at CERN have to work for 15 years in the presence of a very high particle background of photons in the energy range from 100 keV to 10 MeV and neutrons in the range from thermal energies (≈ 0.025 eV) to 20 MeV. The background is so high that it becomes a major design criterion for the ATLAS experiment, a general-purpose experiment at the LHC that will be operational in the year 2005. The exact level of this background is poorly known. At present an uncertainty factor of five has to be assumed, to which the limited knowledge of the shower processes in the absorber material and the ensuing neutron and photon production is estimated to contribute a factor of 2.5. So far, the background has been assessed only through extensive Monte Carlo evaluation with the particle transport code FLUKA. The lack of relevant measurements, which were not done up to now, is to a large extent responsible for this uncertainty. Hence it is essential to benchmark t...

  3. Depleted CMOS pixels for LHC proton–proton experiments

    International Nuclear Information System (INIS)

    Wermes, N.

    2016-01-01

    While so far monolithic pixel detectors have remained in the realm of comparatively low rate and radiation applications outside LHC, new developments exploiting high resistivity substrates with three or four well CMOS process options allow reasonably large depletion depths and full CMOS circuitry in a monolithic structure. This opens up the possibility to target CMOS pixel detectors also for high radiation pp-experiments at the LHC upgrade, either in a hybrid-type fashion or even fully monolithic. Several pixel matrices have been prototyped with high ohmic substrates, high voltage options, and full CMOS electronics. They were characterized in the lab and in test beams. An overview of the necessary development steps and different approaches as well as prototype results are presented in this paper.

  4. Handbook of LHC Higgs Cross Sections: 3. Higgs Properties

    Energy Technology Data Exchange (ETDEWEB)

    Heinemeyer, S; et al.

    2013-01-01

    This Report summarizes the results of the activities in 2012 and the first half of 2013 of the LHC Higgs Cross Section Working Group. The main goal of the working group was to present the state of the art of Higgs Physics at the LHC, integrating all new results that have appeared in the last few years. This report follows the first working group report Handbook of LHC Higgs Cross Sections: 1. Inclusive Observables (CERN-2011-002) and the second working group report Handbook of LHC Higgs Cross Sections: 2. Differential Distributions (CERN-2012-002). After the discovery of a Higgs boson at the LHC in mid-2012 this report focuses on refined prediction of Standard Model (SM) Higgs phenomenology around the experimentally observed value of 125-126 GeV, refined predictions for heavy SM-like Higgs bosons as well as predictions in the Minimal Supersymmetric Standard Model and first steps to go beyond these models. The other main focus is on the extraction of the characteristics and properties of the newly discovered particle such as couplings to SM particles, spin and CP-quantum numbers etc.

  5. Higgs boson results from the ATLAS experiment at LHC

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00216944; The ATLAS collaboration

    2017-01-01

    This report highlights some of the latest results on the Higgs boson studies with the ATLAS experiment at the LHC, using proton-proton collision data at a centre-of-mass energy of √s = 13 TeV collected during 2015 and 2016.

  6. Expected performance of the upgrade ATLAS experiment for HL-LHC

    CERN Document Server

    Liu, Peilian; The ATLAS collaboration

    2018-01-01

    The Large Hadron Collider (LHC) has been successfully delivering proton-proton collision data at the unprecedented center of mass energy of 13 TeV. An upgrade is planned to increase the instantaneous luminosity delivered by the LHC in what is called the HL-LHC, aiming to deliver a total of up to 3000-4000 fb⁻¹ of data per experiment. To cope with the expected data-taking conditions, ATLAS is planning major upgrades of the detector. It is now a critical time for these upgrade projects, and during the last year and a half six Technical Design Reports (TDRs) were produced by the ATLAS Collaboration. In these TDRs the physics motivation and benefits of such upgrades are discussed, together with details of the upgrade projects themselves. In this contribution we review the expected performance of the upgraded ATLAS detector and the expected reach for physics measurements, as well as the discovery potential for new physics expected by the end of HL-LHC data-taking. The performance of object reconstruction under...

  7. Proposal to negotiate an amendment to an existing blanket purchase contract for the supply of Burndy connectors for the LHC project and LHC experiments

    CERN Document Server

    2006-01-01

    This document concerns the proposal to negotiate an amendment to an existing blanket purchase contract for the supply of Burndy connectors for the LHC project and LHC experiments. For the reasons explained in this document, the Finance Committee is invited to agree to the negotiation of an amendment to the blanket purchase contract for the supply of Burndy connectors for the LHC project and LHC experiments with the company SOURIAU (CH), for the next three years for up to 600 000 euros (954 600 Swiss francs), subject to revision for inflation from January 2007, bringing the total amount of the blanket purchase contract to a maximum amount of 1 200 000 euros (1 909 200 Swiss francs), subject to revision for inflation from January 2007. The amounts in Swiss francs have been calculated using the present rate of exchange. 2006/60/5/e

  8. Measurements of very forward particles production spectra at LHC: the LHCf experiment

    CERN Document Server

    Berti, Eugenio; Bonechi, Lorenzo; Bongi, Massimo; Castellini, Guido; D'Alessandro, Raffaello; Haguenauer, Maurice; Itow, Yoshitaka; Iwata, Taiki; Kasahara, Katsuaki; Makino, Yuya; Masuda, Kimiaki; Matsubayashi, Eri; Menjo, Hiroaki; Muraki, Yasushi; Papini, Paolo; Ricciarini, Sergio; Sako, Takashi; Suzuki, Takuya; Tamura, Tadahisa; Tiberio, Alessio; Torii, Shoji; Tricomi, Alessia; Turner, W C; Ueno, Mana; Zhou, Qi Dong

    2017-01-01

    Thanks to two small sampling calorimeters installed in the LHC tunnel at ±140 m from IP1, the LHC forward (LHCf) experiment is able to detect neutral particles produced by high-energy proton-ion collisions in the very forward region (pseudo-rapidity η > 8.4). The main aim of LHCf is to provide precise measurements of the production spectra of these particles, in order to tune the hadronic interaction models used by ground-based cosmic ray experiments. In this paper we present the current status of the LHCf experiment, in particular the collected data and analysis results, as well as future prospects.

  9. LHC-GCS Process Tuning selection and use of PID and Smith predictor for the regulations of the LHC experiments' gas systems

    CERN Document Server

    Cabaret, S; Rachid, A; Coppier, H

    2005-01-01

    The LHC experiments' Gas Control System (LHC GCS) has to provide the LHC experiments with homogeneous control systems (supervision and process-control layers) for their 23 gas systems. The LHC GCS process-control layer is based on Programmable Logic Controllers (PLCs), fieldbuses and on a library, UNICOS (UNified Industrial COntrol System). Its supervision layer is based on a commercial SCADA system and on the JCOP and UNICOS PVSS frameworks. A typical LHC experiment's gas system is composed of up to ten modules dedicated to specific functions (e.g. mixing, purification, circulation). Most of the modules require control loops for the regulation of pressures, temperatures and flows or ratios of gases. The control loops of the 23 gas systems can be implemented using the same tools, but need specific tuning according to their respective size, volume, pipe lengths and required accuracy. Most of the control loops can be implemented by means of a standard PID (Proportional, Integral and Derivative) controller. When this...
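    For illustration, the core of such a PID loop can be sketched as below. The gains, sample time and the first-order "gas volume" plant in the usage example are made-up values, not tunings from any LHC GCS module; a Smith predictor would additionally wrap such a controller with a plant model to compensate for dead time:

    ```python
    class PID:
        """Discrete PID controller sketch (illustrative, not a GCS tuning)."""
        def __init__(self, kp, ki, kd, dt, out_min=0.0, out_max=100.0):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.out_min, self.out_max = out_min, out_max
            self.integral = 0.0
            self.prev_error = 0.0

        def step(self, setpoint, measurement):
            """Advance one sample period and return the actuator command."""
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            out = (self.kp * error
                   + self.ki * self.integral
                   + self.kd * derivative)
            # Clamp to the actuator range (e.g. a valve opening in percent).
            return max(self.out_min, min(self.out_max, out))

    # Usage: regulate a toy first-order pressure process towards 1.0 bar.
    pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.1)
    pressure = 0.0
    for _ in range(2000):
        valve = pid.step(setpoint=1.0, measurement=pressure)
        # Toy plant: inflow proportional to valve opening, plus a leak.
        pressure += (0.05 * valve - 0.1 * pressure) * pid.dt
    ```

    The integral term drives the steady-state error to zero, which is why a plain PID suffices for most of these slow pressure and flow loops; only loops with significant transport delay need the Smith-predictor variant.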

  10. Jet calibration in the ATLAS experiment at LHC

    CERN Document Server

    The ATLAS collaboration

    2009-01-01

    Jets produced in the hadronisation of quarks and gluons play a central role in the rich physics program that will be covered by the ATLAS experiment at the LHC, and are central elements of the signature for many physics channels. A well understood energy scale, which for some processes demands an uncertainty in the energy scale of order 1%, is a prerequisite. Moreover, in early data we face the challenge of dealing with the unexpected issues of a brand new detector in an unexplored energy domain. The ATLAS collaboration is carrying out a program to revisit the jet calibration strategies used in earlier hadron-collider experiments and develop a strategy which takes account of the new experimental problems and the demand for greater measurement precision that will be faced at the LHC. The ATLAS calorimeter is intrinsically non-compensating, and we will present the use of different offline approaches based on cell energy density and jet topology to correct for this effect on jet energy resolution and scale. In additio...

  11. Use of fluorocarbons in the cooling of LHC experiments

    CERN Document Server

    Pimenta dos Santos, M

    2003-01-01

    Perfluorochemicals sold by 3M under the trade name 3M Fluorinert Electronic Liquids have been used for many years as heat-transfer media in a variety of industries. The suitability of these liquids for the cooling of LHC experiments originates from their high dielectric strength as well as from their chemical stability under ionizing radiation. The Fluorinerts are clear, colorless and non-flammable, with low toxicity and low corrosiveness. Additionally, they offer low global warming potential (GWP) and zero ozone-depletion potential (ODP). Some examples of Fluorinert applications in the cooling of LHC experiments will be presented: (a) the ATLAS Inner Detector C3F8 evaporative cooling system, (b) the ATLAS TRT C6F14 monophase cooling system and (c) the ALICE SPD “active heat pipe” C4F10 evaporative cooling system. A brief comparison of evaporative and monophase cooling systems will be outlined.

  12. Highlights from the ATLAS experiment at CERN LHC

    CERN Document Server

    Tsukerman, Ilya; The ATLAS collaboration

    2018-01-01

    Highlights from the ATLAS experiment at the LHC are presented. The results shown are mostly based on the analysis of the 2015-2016 dataset, which corresponds to an integrated luminosity of 36 fb⁻¹. Mainly recent measurements of Higgs boson production and decay are discussed, while only a brief summary is given for the SM processes, top production, SUSY and Exotics.

  13. LHC 2011 operation - as viewed from the experiments

    International Nuclear Information System (INIS)

    Ferro-Luzzi, M.

    2012-01-01

    The 2011 LHC run is reviewed from the experiments' perspective. Cooperation between machine and experiments was again very constructive. The LHC produced pp physics collisions at √s = 7 TeV with a peak luminosity reaching about 3.6×10³³ cm⁻²s⁻¹ and delivered more integrated luminosity than expected (≈ 5.7 fb⁻¹ in IP1 and IP5, 1.2 fb⁻¹ in IP8). Thanks to this excellent LHC performance, ATLAS and CMS came close to discovering or excluding the SM Higgs boson, while further increasing the pressure on supersymmetric models. LHCb was able to set unprecedented limits on the branching ratio of Bs → μμ and to observe an intriguing CP-violating asymmetry in charm decays. An intermediate-energy pp run was successfully carried out to complement the heavy-ion physics data. Furthermore, an impressive 16-fold increase in peak and integrated luminosity with Pb beams was realized (4.5×10²⁶ cm⁻²s⁻¹ and about 160 μb⁻¹ per IP). ALICE, ATLAS and CMS produced a wealth of new heavy-ion physics results that will sharpen the understanding of the quark-gluon plasma. A first TOTEM measurement of the total pp cross section was realized, and more diffractive physics results were obtained with Roman Pots (TOTEM and ALFA/ATLAS). Luminosity calibration measurements with van der Meer scans and beam-gas imaging were further improved, to the point that cross sections could be measured with about 2% accuracy.

  14. Elastic extension of a local analysis facility on external clouds for the LHC experiments

    Science.gov (United States)

    Ciaschini, V.; Codispoti, G.; Rinaldi, L.; Aiftimiei, D. C.; Bonacorsi, D.; Calligola, P.; Dal Pra, S.; De Girolamo, D.; Di Maria, R.; Grandi, C.; Michelotto, D.; Panella, M.; Taneja, S.; Semeria, F.

    2017-10-01

    The computing infrastructures serving the LHC experiments have been designed to cope, at most, with the average amount of data recorded. Usage peaks, as already observed in Run I, may however generate large backlogs, delaying the completion of data reconstruction and ultimately the availability of the data for physics analysis. In order to cope with production peaks, the LHC experiments are exploring the opportunity to access Cloud resources provided by external partners or commercial providers. In this work we present a proof of concept of the elastic extension of a local analysis facility, specifically the Bologna Tier-3 Grid site, for the LHC experiments hosted at the site, onto an external OpenStack infrastructure. We focus on Cloud bursting of the Grid site using DynFarm, a newly designed tool that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on an OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage.

  15. Resistive wall instability for the LHC: intermediate review

    CERN Document Server

    Brandt, D

    2001-01-01

    As the design of some basic components of the LHC becomes available, it is possible to refine the evaluation of the expected contribution of these elements to the total impedance budget of the machine. The LHC beam screen being expected to be the main contributor to the resistive-wall effect, it appeared justified to review the impedance budget, taking into account the latest available data. This note first recalls the original estimations presented in the LHC Conceptual Design [1], then presents an updated review of the instability rise times and finally discusses a possible reduction of this rather large contribution. Note: updated values for the LHC impedance budget are now available from CERN LHC Project Report 585 (Coupled Bunch Instabilities in the LHC, D. Angal-Kalinin and L. Vos, EPAC, July 2002).

  16. CMS distributed analysis infrastructure and operations: experience with the first LHC data

    International Nuclear Information System (INIS)

    Vaandering, E W

    2011-01-01

    The CMS distributed analysis infrastructure represents a heterogeneous pool of resources distributed across several continents. The resources are harnessed using gLite and glidein-based workload management systems (WMS). We provide the operational experience of the analysis workflows using CRAB-based servers interfaced with the underlying WMS. The automated interaction of the server with the WMS provides a successful analysis workflow. We present the operational experience as well as the methods used in CMS to analyze the LHC data. The interaction with the CMS Run Registry for run and luminosity-block selection via CRAB is discussed. The variations of different workflows during the LHC data-taking period and the lessons drawn from this experience are also outlined.

  17. The ATLAS muon trigger: Experience and performance in the first 3 years of LHC pp runs

    International Nuclear Information System (INIS)

    Ventura, Andrea

    2013-01-01

    The ATLAS experiment at CERN's Large Hadron Collider (LHC) deploys a three-level processing scheme for the trigger system. The Level-1 muon trigger system gets its input from fast muon trigger detectors. Sector logic boards select muon candidates, which are passed via an interface board to the central trigger processor and then to the High Level Trigger (HLT). The muon HLT is purely software based and encompasses a Level-2 trigger followed by an event filter, for a staged trigger approach. It has access to the data of the precision muon detectors and other detector elements to refine the muon hypothesis. The ATLAS experiment has taken data with high efficiency continuously over entire running periods from 2010 to 2012, for which sophisticated triggers to safeguard the highest physics output while effectively reducing the event rate were mandatory. The ATLAS muon trigger has successfully adapted to this challenging environment. The selection strategy has been optimized for the various physics analyses involving muons in the final state. This work briefly summarizes these three years of experience with the ATLAS muon trigger and reports on the efficiency, resolution, and general performance of the muon trigger.

  18. Decentralized data storage and processing in the context of the LHC experiments at CERN

    International Nuclear Information System (INIS)

    Blomer, Jakob Johannes

    2012-01-01

    The computing facilities used to process data for the experiments at the Large Hadron Collider (LHC) at CERN are scattered around the world. The embarrassingly parallel workload allows for use of various computing resources, such as computer centers comprising the Worldwide LHC Computing Grid, commercial and institutional cloud resources, as well as individual home PCs in “volunteer clouds”. Unlike data, the experiment software and its operating system dependencies cannot be easily split into small chunks. Deployment of experiment software on distributed grid sites is challenging since it consists of millions of small files and changes frequently. This thesis develops a systematic approach to distribute a homogeneous runtime environment to a heterogeneous and geographically distributed computing infrastructure. A uniform bootstrap environment is provided by a minimal virtual machine tailored to LHC applications. Based on a study of the characteristics of LHC experiment software, the thesis argues for the use of content-addressable storage and decentralized caching in order to distribute the experiment software. In order to utilize the technology at the required scale, new methods of pre-processing data into content-addressable storage are developed. A co-operative, decentralized memory cache is designed that is optimized for the high peer churn expected in future virtualized computing clusters. This is achieved using a combination of consistent hashing with global knowledge about the worker nodes' state. The methods have been implemented in the form of a file system for software and Conditions Data delivery. The file system has been widely adopted by the LHC community and the benefits of the presented methods have been demonstrated in practice.
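    The "consistent hashing" used by such a churn-tolerant cache can be illustrated with a toy hash ring; the class name, SHA-1 choice and virtual-node count below are purely illustrative assumptions, not the cache design from the thesis:

    ```python
    import hashlib
    from bisect import bisect

    class HashRing:
        """Toy consistent-hash ring: every node owns several virtual
        points, and a key maps to the first node point clockwise from
        the key's hash."""
        def __init__(self, nodes, vnodes=64):
            self.ring = sorted(
                (self._hash(f"{node}#{i}"), node)
                for node in nodes for i in range(vnodes)
            )
            self.points = [h for h, _ in self.ring]

        @staticmethod
        def _hash(s):
            return int(hashlib.sha1(s.encode()).hexdigest(), 16)

        def node_for(self, key):
            idx = bisect(self.points, self._hash(key)) % len(self.points)
            return self.ring[idx][1]

    # When a node churns out, only keys on its arcs move: keys that
    # mapped to a surviving node keep their assignment.
    before = HashRing(["node-a", "node-b", "node-c"])
    after = HashRing(["node-a", "node-b"])  # node-c has left
    ```

    This bounded remapping on node departure is precisely what makes the scheme attractive under the high peer churn mentioned above.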

  19. Decentralized data storage and processing in the context of the LHC experiments at CERN

    Energy Technology Data Exchange (ETDEWEB)

    Blomer, Jakob Johannes

    2012-06-01

    The computing facilities used to process data for the experiments at the Large Hadron Collider (LHC) at CERN are scattered around the world. The embarrassingly parallel workload allows for use of various computing resources, such as computer centers comprising the Worldwide LHC Computing Grid, commercial and institutional cloud resources, as well as individual home PCs in “volunteer clouds”. Unlike data, the experiment software and its operating system dependencies cannot be easily split into small chunks. Deployment of experiment software on distributed grid sites is challenging since it consists of millions of small files and changes frequently. This thesis develops a systematic approach to distribute a homogeneous runtime environment to a heterogeneous and geographically distributed computing infrastructure. A uniform bootstrap environment is provided by a minimal virtual machine tailored to LHC applications. Based on a study of the characteristics of LHC experiment software, the thesis argues for the use of content-addressable storage and decentralized caching in order to distribute the experiment software. In order to utilize the technology at the required scale, new methods of pre-processing data into content-addressable storage are developed. A co-operative, decentralized memory cache is designed that is optimized for the high peer churn expected in future virtualized computing clusters. This is achieved using a combination of consistent hashing with global knowledge about the worker nodes' state. The methods have been implemented in the form of a file system for software and Conditions Data delivery. The file system has been widely adopted by the LHC community and the benefits of the presented methods have been demonstrated in practice.

  20. Experience with the Quality Assurance of the Superconducting Electrical Circuits of the LHC Machine

    CERN Document Server

    Bozzini, D; Kotarba, A; Mess, Karl Hubert; Olek, S; Russenschuck, Stephan

    2006-01-01

    The coherence between the powering reference database for the LHC and the Electrical Quality Assurance (ELQA) is guaranteed at the procedural level. A challenge, however, remains the coherence between the database, the magnet test and assembly procedures, and the connection of all superconducting circuits in the LHC machine. In this paper, the methods, tooling and procedures for the ELQA during the assembly phase of the LHC will be presented in view of the practical experience gained in the LHC tunnel. Some examples of detected polarity errors and electrical non-conformities will be presented. The parameters measured at ambient temperature, such as the dielectric insulation of circuits, will be discussed.

  1. Parallel Plate Chambers and their possible use in LHC experiments

    International Nuclear Information System (INIS)

    Arefiev, A.; Bencze, Gy.L.; Bizzeti, A.; Choumilov, E.; Civinini, C.; D'Alessandro, R.; Dajko, G.; Fenyvesi, A.; Ferrando, A.; Fouz, M.C.; Iglesias, A.; Ivochkin, V.; Maggi, F.; Malinin, A.; Martinez-Laso, L.; Meschini, M.; Molnar, J.; Pojidaev, V.; Szoncso, F.; Wulz, C.E.

    1995-01-01

    Present status of Parallel Plate Chambers (PPC) is reviewed. After a description of this detector, results from tests concerning PPC efficiency uniformity, radiation hardness, and behaviour in electromagnetic calorimetry are presented. Some possible utilizations in LHC experiments are mentioned. (orig.)

  2. The "Common Solutions" Strategy of the Experiment Support group at CERN for the LHC Experiments

    CERN Document Server

    Girone, M; Barreiro Megino, F H; Campana, S; Cinquilli, M; Di Girolamo, A; Dimou, M; Giordano, D; Karavakis, E; Kenyon, M J; Kokozkiewicz, L; Lanciotti, E; Litmaath, M; Magini, N; Negri, G; Roiser, S; Saiz, P; Saiz Santos, M D; Schovancova, J; Sciabà, A; Spiga, D; Trentadue, R; Tuckett, D; Valassi, A; Van der Ster, D C; Shiers, J D

    2012-01-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments' computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management m...

  3. The LHC cryogenic system and operational experience from the first three years run

    International Nuclear Information System (INIS)

    Delikaris, Dimitri; Tavian, Laurent

    2014-01-01

    The LHC (Large Hadron Collider) accelerator helium cryogenic system consists of eight cryogenically independent sectors, each 3.3 km long, all cooled and operated at 1.9 K. The overall, entropy-equivalent, installed cryogenic capacity totals 144 kW at 4.5 K, including 19.2 kW at 1.8 K, with an associated helium inventory of 130 tonnes. The LHC cryogenic system is considered among the most complex and powerful in the world, allowing the cooling down to the superfluid helium temperature of 1.9 K of the accelerator's high-field superconducting magnets distributed over the 26.7 km underground ring. The present article describes the LHC cryogenic system and its associated cryogen infrastructure. Operational experience, including cryogen management, acquired from the first three years of LHC operation is finally presented. (author)

  4. Final Technical Report for ``Paths to Discovery at the LHC : Dark Matter and Track Triggering"

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Kristian [Northwestern Univ., Evanston, IL (United States)

    2016-10-24

    Particle Dark Matter (DM) is perhaps the most compelling and experimentally well-motivated new physics scenario anticipated at the Large Hadron Collider (LHC). The DE-SC0014073 award allowed the PI to define and pursue a path to the discovery of Dark Matter in Run 2 of the LHC with the Compact Muon Solenoid (CMS) experiment. CMS can probe regions of Dark Matter phase space that direct and indirect detection experiments are unable to constrain. The PI's team initiated the exploration of these regions, searching specifically for the associated production of Dark Matter with top quarks. The effort focuses on the high-yield, hadronic decays of W bosons produced in top decay, which provide the highest sensitivity to DM produced through low-mass spin-0 mediators. The group developed identification algorithms that achieve high efficiency and purity in the selection of hadronic top decays, and analysis techniques that provide powerful signal discrimination in Run 2. The ultimate reach of new physics searches with CMS will be established at the high-luminosity LHC (HL-LHC). To fully realize the sensitivity the HL-LHC promises, CMS must minimize the impact of soft, inelastic (“pileup”) interactions on the real-time “trigger” system the experiment uses for data refinement. Charged-particle trajectory information (“tracking”) will be essential for pileup mitigation at the HL-LHC. The award allowed the PI's team to develop firmware-based data-delivery and track-fitting algorithms for an unprecedented, real-time tracking trigger to sustain the experiment's sensitivity to new physics in the next decade.

  5. Large Cryogenic Infrastructure for LHC Superconducting Magnet and Cryogenic Component Tests: Layout, Commissioning and Operational Experience

    International Nuclear Information System (INIS)

    Calzas, C.; Chanat, D.; Knoops, S.; Sanmarti, M.; Serio, L.

    2004-01-01

    The largest cryogenic test facility at CERN, located at Zone 18, is used to validate and test all main components working at cryogenic temperature in the LHC (Large Hadron Collider) before their final installation in the machine tunnel. In total, about 1300 main dipoles, 400 main quadrupoles, 5 RF modules and eight 1.8 K refrigeration units will be tested in the coming years. The test facility has been improved and upgraded over the last few years, and the first 18 kW refrigerator for the LHC machine has been added to boost the cryogenic capacity of the area via a 25,000-litre liquid helium dewar. The existing 6 kW refrigerator, used for the LHC Test String experiments, will also be employed to commission LHC cryogenic components. We report on the design and layout of the test facility as well as the commissioning and the first 10,000 hours of operational experience of the test facility and the 18 kW LHC refrigerator

  6. PanDA: Exascale Federation of Resources for the ATLAS Experiment at the LHC

    CERN Document Server

    AUTHOR|(SzGeCERN)643806; The ATLAS collaboration; Caballero-Bejar, Jose; De, Kaushik; Hover, John; Klimentov, Alexei; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Petrosyan, Artem; Wenaus, Torre

    2016-01-01

    After a scheduled maintenance and upgrade period, the world’s largest and most powerful machine - the Large Hadron Collider (LHC) - is about to enter its second run at unprecedented energies. In order to exploit the scientific potential of the machine, the experiments at the LHC face computational challenges with enormous data volumes that need to be analysed by thousands of physics users and compared to simulated data. Given diverse funding constraints, the computational resources for the LHC have been deployed in a worldwide mesh of data centres, connected to each other through Grid technologies. The PanDA (Production and Distributed Analysis) system was developed in 2005 for the ATLAS experiment on top of this heterogeneous infrastructure to seamlessly integrate the computational resources and give the users the feeling of a unique system. Since its origins, PanDA has evolved together with upcoming computing paradigms in and outside HEP, such as changes in the networking model, Cloud Computing and HPC. It ...

  7. LHC Injection Beam Quality During LHC Run I

    CERN Document Server

    AUTHOR|(CDS)2079186; Kain, Verena; Stapnes, Steinar

    The LHC at CERN was designed to accelerate proton beams from 450 GeV to 7 TeV and collide them in four large experiments. The 450 GeV beam is extracted from the last pre-accelerator, the SPS, and injected into the LHC via two 3 km long transfer lines, TI 2 and TI 8. The injection process is critical in terms of preservation of beam quality and machine protection. During LHC Run I (2009-2013) the LHC was filled with twelve high-intensity injections per ring, in batches of up to 144 bunches of 1.7*10^11 protons per bunch. The stored beam energy of such a batch is already an order of magnitude above the damage level of accelerator equipment. Strict quality and machine protection requirements at injection have a significant impact on operational efficiency. During the first years of LHC operation, the injection phase was identified as one of the limiting factors for a fast LHC turnaround time. The LHC Injection Quality Check (IQC) software framework was developed as part of this thesis to monitor the beam quality...
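    The batch parameters quoted in the abstract translate directly into stored energy by simple arithmetic; a minimal sketch (the eV-to-joule constant is the standard SI value, all other numbers are taken from the abstract):

```python
# Energy stored in one SPS-to-LHC injection batch at 450 GeV,
# using the batch parameters quoted in the abstract above.
EV_TO_J = 1.602176634e-19   # J per eV (exact SI value)
BUNCHES = 144               # bunches per injected batch
PROTONS_PER_BUNCH = 1.7e11
BEAM_ENERGY_EV = 450e9      # injection energy, 450 GeV per proton

batch_energy_J = BUNCHES * PROTONS_PER_BUNCH * BEAM_ENERGY_EV * EV_TO_J
print(f"{batch_energy_J / 1e6:.1f} MJ")  # → 1.8 MJ
```

    Roughly 1.8 MJ per batch, consistent with the abstract's statement that a single batch already exceeds the equipment damage level (of order a few hundred kJ) by an order of magnitude.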

  8. Low voltage powering of on-detector electronics for HL-LHC experiments upgrades

    CERN Document Server

    Bobillier, Vincent; Vasey, Francois; Karmakar, Sabyasachi; Maity, Manas; Roy, Subhasish; Kundu, Tapas Kumar

    2018-01-01

    All LHC experiments will be upgraded during the next LHC long shutdowns (LS2 and LS3). The increase in resolution and luminosity and the use of more advanced CMOS technology nodes typically imply higher current consumption of the on-detector electronics. In this context, and in view of limiting the cable voltage drop, point-of-load DC-DC converters will be used on detector. This will have a direct impact on the existing powering scheme, implying new AC-DC and/or DC-DC stages as well as changes in the power cabling infrastructure. This paper presents the first results obtained while evaluating different LV powering schemes and distribution layouts for HL-LHC trackers. The precise low voltage power source requirements are being assessed and understood using the CMS tracker upgrade as a use case.

  9. Fully transparent LHC

    CERN Multimedia

    2008-01-01

    Thanks to the first real signals received from the LHC while in operation before the incident, the experiments are now set to make the best use of the data they have collected. Report from the LHCC open session.The September open session of the LHCC (LHC Experiments Committee) came just a few days after the incident that occurred at the LHC. The packed auditorium was a testament to the huge interest raised by Lyn Evans’ talk about the status of the machine and the plans for the future. After being told that the actual consequences of the incident will be clear only once Sector 3-4 has been warmed up, the audience focussed on the reports from the experiments. For the first time, the reports showed performance results of the various detectors with particles coming from the machine and not just from cosmic rays or tests and simulations. "The first days of LHC beam exceeded all expectations and the experiments made extensive and rapid use of the data they collected", says ...

  10. Refining Grasp Affordance Models by Experience

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Buch, Anders Glent

    2010-01-01

    We present a method for learning object grasp affordance models in 3D from experience, and demonstrate its applicability through extensive testing and evaluation on a realistic and largely autonomous platform. Grasp affordance refers here to relative object-gripper configurations that yield stable...... with a visual model of the object they characterize. We explore a batch-oriented, experience-based learning paradigm where grasps sampled randomly from a density are performed, and an importance-sampling algorithm learns a refined density from the outcomes of these experiences. The first such learning cycle...... is bootstrapped with a grasp density formed from visual cues. We show that the robot effectively applies its experience by downweighting poor grasp solutions, which results in increased success rates at subsequent learning cycles. We also present success rates in a practical scenario where a robot needs...

  11. LHC status report

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Following the great success of the first 3.5 TeV collisions in all four LHC experiments on 30 March, the focus of the LHC commissioning teams has turned to consolidating the beam injection and acceleration procedures.   During the last two weeks, the operators have adopted a cycle of beam commissioning studies by day and the preparation and delivery of collisions during the night shifts. The injection and acceleration processes for the beams are by now well established and almost all feedback systems, which are an essential ingredient for establishing reliable and safe machine operation, have been commissioned. Thanks to special current settings for the quadrupoles that are situated near the collision points, the LHC luminosity at high energy has been increased by a factor of 5 in three of the four experiments. Similar improvements are under way for the fourth experiment. The next steps include adjustments of the LHC machine protection and collimation devices, which will ensure 'stable beam' co...

  12. State of the Short Dipole Model Program for the LHC

    CERN Document Server

    Andreyev, N I; Kurtyka, T; Oberli, L R; Perini, D; Russenschuck, Stephan; Siegel, N; Siemko, A; Tommasini, D; Vanenkov, I; Walckiers, L

    1998-01-01

    Superconducting single and twin aperture 1-m long dipole magnets are currently being fabricated at CERN at a rate of about one per month in the framework of the short dipole model program for the LHC. The program makes it possible to study performance improvements coming from refinements in design, components and assembly options, and to accumulate statistics based on a small-scale production. The experience thus gained in turn provides feedback to the long magnet program in industry. In recent models, initial quenching fields above 9 T have been obtained, and after a short training the conductor limit at 2 K is reached, resulting in a central bore field exceeding 10 T. The paper describes the features of recent single aperture models, the results obtained during cold tests and the plans to ensure the continuation of a vigorous model program providing input for the fabrication of the main LHC dipoles.

  13. A full acceptance experiment at the CERN large hadron collider (LHC)

    International Nuclear Information System (INIS)

    Eggert, K.; Morsch, A.; Taylor, C.

    1996-01-01

    The physics of a full-acceptance detector at the LHC is reviewed. A possible experimental layout situated at IP4 is presented. The interface between the experiment and the machine lattice is described, with particular attention given to the measurement of elastic and diffractive protons. (author)

  14. A brief review of measurements of electroweak bosons at the LHCb experiment in LHC Run 1

    CERN Document Server

    INSPIRE-00340962

    2016-09-15

    The LHCb experiment is one of four major experiments at the LHC. Despite being designed for the study of beauty and charm particles, it has made important contributions in other areas, such as the production and decay of $W$ and $Z$ bosons. Such measurements can be used to study and constrain parton distribution functions, as well as to test perturbative quantum chromodynamics in hard scattering processes. The angular structure of $Z$ boson decays to leptons can also be studied and used to measure the weak mixing angle. The phase space probed by LHCb is particularly sensitive to this quantity, and the LHCb measurement using the dimuon final state is currently the most precise determination of $\\sin^2\\theta^\\text{lept.}_\\text{eff.}$ at the LHC. LHCb measurements made using data collected during the first period of LHC operations (LHC Run 1) are discussed in this review. The article also considers the potential impact of related future measurements.

  15. LHC magnets

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Preparations for the LHC proton collider to be built in CERN's LEP tunnel continue to make good progress. In particular, development work on the high-field superconducting magnets that will guide the almost 8 TeV proton beams around the 'tight' curve of the 27-kilometre ring is proceeding well, while the magnet designs and lattice configuration are evolving in the light of ongoing experience. At the Evian LHC Experiments meeting, this progress was covered by Giorgio Brianti

  16. LHC preparations change gear

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    After the formal approval by CERN Council in December (January, page 1) of the LHC proton-proton collider for CERN's 27-kilometre LEP tunnel, preparations for the new machine change gear. Lyndon Evans becomes LHC Project Leader, and CERN's internal structure will soon be reorganized to take account of the project becoming a definite commitment. On the experimental side, the full Technical Proposals for the big general purpose ATLAS and CMS detectors were aired at a major meeting of the LHC Committee at CERN in January. These Technical Proposals are impressive documents, each of some several hundred pages. (Summaries of the detector designs will appear in forthcoming issues of the CERN Courier.) The ALICE heavy ion experiment is not far behind, and plans for other LHC experiments are being developed. Playing an important role in this groundwork has been the Detector Research and Development Committee (DRDC), founded in 1990 to foster detector development for the LHC experimental programme and structured along the lines of a traditional CERN Experiments Committee. Established under the Director Generalship of Carlo Rubbia and initially steered by Research Director Walter Hoogland, the DRDC has done sterling work in blazing a trail for LHC experiments. Acknowledging that the challenge of LHC experimentation needs technological breakthroughs as well as specific detector subsystems, DRDC proposals have covered a wide front, spanning readout electronics and computing as well as detector technology. Its first Chairman was Enzo Iarocci, succeeded in 1993 by Michal Turala. DRDC's role was to evaluate proposals and make recommendations to CERN's Research Board for approval and resource allocation, not an easy task when the LHC project itself had yet to be formally approved. Over the years, a comprehensive portfolio of detector development has been built up, much of which has led to specific LHC detector subsystems for traditional detector tasks

  17. A philosophical experiment: empirical study of knowledge production at the LHC

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    How is new knowledge produced in the natural sciences? This question has long been an issue of central relevance for philosophers, historians and sociologists of science, who have fiercely debated whether and how the emergence of new scientific knowledge can be described as following regular patterns, for example as far as the interplay of theory and experiment is concerned. To this aim, more or less recent historical examples have been used as empirical case studies, and widely diverging conclusions have at times been drawn from the same material. The interdisciplinary, DFG-funded project-cluster "Epistemology of the LHC" (University of Wuppertal, Germany) has in the past three years attempted to investigate knowledge production "in real time" by following the interplay of theory and experiment unfold during the first phase of LHC activity and how the knowledge landscape of high energy physics accordingly did (or did not) change. To try and reconstruct some aspects of this epistemic dynamics, the project ...

  18. HL-LHC and HE-LHC Upgrade Plans and Opportunities for US Participation

    Science.gov (United States)

    Apollinari, Giorgio

    2017-01-01

    The US HEP community has identified the exploitation of physics opportunities at the High Luminosity-LHC (HL-LHC) as the highest near-term priority. Thanks to multi-year R&D programs, US National Laboratories and Universities have taken the leadership in the development of technical solutions to increase the LHC luminosity, enabling the HL-LHC Project and uniquely positioning this country to make critical contributions to the LHC luminosity upgrade. This talk will describe the shaping of the US Program to contribute in the next decade to HL-LHC through newly developed technologies such as Nb3Sn focusing magnets or superconducting crab cavities. The experience gained through the execution of the HL-LHC Project in the US will constitute a pool of knowledge and capabilities allowing further developments in the future. Opportunities for US participations in proposed hadron colliders, such as a possible High Energy-LHC (HE-LHC), will be described as well.

  19. The CERN Detector Safety System for the LHC Experiments

    CERN Document Server

    Lüders, S; Morpurgo, G; Schmeling, S

    2003-01-01

    The Detector Safety System (DSS), currently being developed at CERN under the auspices of the Joint Controls Project (JCOP), will be responsible for assuring the protection of equipment for the four LHC experiments. Thus, the DSS will require a high degree of both availability and reliability. After evaluation of various possible solutions, a prototype is being built based on a redundant Siemens PLC front-end, to which the safety-critical part of the DSS task is delegated. This is then supervised by a PVSS SCADA system via an OPC server. The PLC front-end is capable of running autonomously and of automatically taking predefined protective actions whenever required. The supervisory layer provides the operator with a status display and with limited online reconfiguration capabilities. Configuration of the code running in the PLCs will be completely data driven via the contents of a "Configuration Database". Thus, the DSS can easily adapt to the different and constantly evolving requirements of the LHC experimen...

  20. Initial Experience with the Machine Protection System for LHC

    CERN Document Server

    Schmidt, Ruediger; Dehning, Bernd; Ferro-Luzzi, Massimiliano; Goddard, Brennan; Lamont, Mike; Siemko, Andrzej; Uythoven, Jan; Wenninger, Jorg; Zerlauth, Markus

    2010-01-01

    For nominal beam parameters at 7 TeV/c each proton beam with a stored energy of 362 MJ threatens to damage accelerator equipment in case of uncontrolled beam loss. These parameters will only be reached after some years of operation, however, a small fraction of this energy is already sufficient to damage accelerator equipment or experiments. The correct functioning of the machine protection systems is vital during the different operational phases already for initial operation. When operating the complex magnet system, with and without beam, safe operation relies on the protection and interlock systems for the superconducting circuits. For safe injection and transfer of the beams from SPS to LHC, transfer line parameters are monitored, beam absorbers must be in the correct position and the LHC must be ready to accept beam. At the end of a fill and in case of failures beams must be properly extracted onto the dump blocks, for some types of failure within less than few hundred microseconds. Safe operation requir...
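    The 362 MJ per beam quoted above can be reproduced from the nominal LHC beam parameters; the bunch count and bunch population used below (2808 bunches of 1.15×10^11 protons) are assumed nominal design values, not figures taken from this record:

```python
# Back-of-envelope check of the nominal stored beam energy at 7 TeV/c.
# Bunch parameters are assumed nominal LHC design values.
EV_TO_J = 1.602176634e-19   # J per eV (exact SI value)
BUNCHES = 2808              # assumed nominal bunches per beam
PROTONS_PER_BUNCH = 1.15e11 # assumed nominal bunch population
BEAM_ENERGY_EV = 7e12       # 7 TeV per proton

stored_energy_J = BUNCHES * PROTONS_PER_BUNCH * BEAM_ENERGY_EV * EV_TO_J
print(f"{stored_energy_J / 1e6:.0f} MJ")  # → 362 MJ
```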

  1. The “Common Solutions" Strategy of the Experiment Support group at CERN for the LHC Experiments

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing as well as WLCG deployment and operations need to evolve. As part of the activities of the Experiment Support group in CERN’s IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management...

  2. Strategies for reducing the environmental impact of gaseous detector operation at the CERN LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Capeans, M.; Guida, R.; Mandelli, B., E-mail: beatrice.mandelli@cern.ch

    2017-02-11

    A wide range of gas mixtures is used for the operation of different gaseous detectors at the Large Hadron Collider (LHC) experiments. Nowadays some of these gases, such as C2H2F4, CF4 and SF6, are classified as greenhouse gases (GHG) and dominate the overall GHG emission from particle detectors at the LHC experiments. The release of GHG is an important subject for the design of future particle detectors as well as for the operation of the current experiments. Different strategies have been adopted at CERN for reducing GHG emissions. The standard approach is the recirculation of the gas mixture with complex gas systems, where system stability and the possible accumulation of impurities need to be carefully evaluated for the good operation and safety of the detectors. A second approach is based on the recuperation of the gas mixture exiting the detectors and the separation of its gas components for re-use. In the long term, the use of less invasive gases is being investigated, especially for the Resistive Plate Chamber (RPC) systems. Operation of RPCs with environmentally friendly gas mixtures has been demonstrated in streamer mode, while avalanche mode operation requires more complex gas mixtures. - Highlights: • Greenhouse gas (GHG) emissions in the LHC experiments and detectors. • Strategies to reduce GHG emissions: gas recirculation and recuperation systems. • GHG emissions: achievements from LHC Run1 to Run2. • Resistive Plate Chamber operation with new environmentally friendly gases.
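    Why these three gases dominate the GHG footprint can be illustrated by weighting leak masses with 100-year global warming potentials; the GWP values in this sketch are IPCC AR4 estimates assumed for illustration, not figures taken from this record:

```python
# Rough CO2-equivalent of leaked detector gases (illustrative sketch;
# the GWP_100 values are assumed IPCC AR4 estimates, not from this record).
GWP_100 = {
    "C2H2F4": 1430,   # tetrafluoroethane (R-134a), main RPC gas component
    "CF4": 7390,
    "SF6": 22800,     # small admixture, but by far the highest GWP
}

def co2_equivalent_kg(leaked_kg):
    """Convert per-gas leak masses (kg) into kg of CO2-equivalent."""
    return sum(GWP_100[gas] * kg for gas, kg in leaked_kg.items())

# Example: a 1 kg SF6 leak weighs in at ~22.8 tonnes of CO2-equivalent.
print(co2_equivalent_kg({"SF6": 1.0}))  # → 22800.0
```

    This is why even modest reductions in leak rate via recirculation or recuperation translate into large CO2-equivalent savings.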

  3. Stress-testing the Standard Model at the LHC

    CERN Document Server

    2016-01-01

    With the high-energy run of the LHC now underway, and clear manifestations of beyond-Standard-Model physics not yet seen in data from the previous run, the search for new physics at the LHC may be a quest for small deviations with big consequences. If clear signals are present, precise predictions and measurements will again be crucial for extracting the maximum information from the data, as in the case of the Higgs boson. Precision will therefore remain a key theme for particle physics research in the coming years. The conference will provide a forum for experimentalists and theorists to identify the challenges and refine the tools for high-precision tests of the Standard Model and searches for signals of new physics at Run II of the LHC. Topics to be discussed include: pinning down Standard Model corrections to key LHC processes; combining fixed-order QCD calculations with all-order resummations and parton showers; new developments in jet physics concerning jet substructure, associated jets and boosted je...

  4. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there was a growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  5. Electronic system of the RPC Muon Trigger in CMS experiment at LHC accelerator (Elektroniczny system trygera mionowego RPC w eksperymencie CMS akceleratora LHC)

    CERN Document Server

    Bialkowska, H

    2009-01-01

    This paper presents the implementation of a distributed, multichannel electronic measurement system for the RPC-based Muon Trigger in the CMS experiment at the LHC. The introduction briefly describes the research aims of the LHC and the metrological requirements for CMS - good spatial and time resolution, and the ability to estimate multiple physical parameters from registered particle collisions. The paper then describes the RPC Muon Trigger, consisting of 200 000 independent channels for position measurement. The first part of the paper presents the functional structure of the system in the context of the requirements set by the CMS experiment, such as the global triggering system and data acquisition. The second part describes the hardware solutions used in particular parts of the RPC detector measurement system and shows some test results. The paper is a digest-style overview.

  6. Beam-gas Background Observations at LHC

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00214737; The ATLAS collaboration; Alici, Andrea; Lazic, Dragoslav-Laza; Alemany Fernandez, Reyes; Alessio, Federico; Bregliozzi, Giuseppe; Burkhardt, Helmut; Corti, Gloria; Guthoff, Moritz; Manousos, Athanasios; Sjoebaek, Kyrre; D'Auria, Saverio

    2017-01-01

    Observations of beam-induced background at LHC during 2015 and 2016 are presented in this paper. The four LHC experiments use the non-colliding bunches present in the physics-filling pattern of the accelerator to trigger on beam-gas interactions. During luminosity production the LHC experiments record the beam-gas interactions using dedicated background monitors. These data are sent to the LHC control system and are used to monitor the background levels at the experiments during accelerator operation. This is a very important measurement, since poor beam-induced background conditions can seriously affect the performance of the detectors. A summary of the evolution of the background levels during 2015 and 2016 is given in these proceedings.

  7. Federated data storage system prototype for LHC experiments and data intensive science

    Science.gov (United States)

    Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.

    2017-10-01

    The rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim to unite their resources for future productive work, while at the same time providing an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications, and to provide access to data from heterogeneous computing facilities. Studies include the development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations, such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and a reformation of computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.

  8. Experiment Dashboard - a generic, scalable solution for monitoring of the LHC computing activities, distributed sites and services

    International Nuclear Information System (INIS)

    Andreeva, J; Cinquilli, M; Dieguez, D; Dzhunov, I; Karavakis, E; Karhula, P; Kenyon, M; Kokoszkiewicz, L; Nowotka, M; Ro, G; Saiz, P; Tuckett, D; Sargsyan, L; Schovancova, J

    2012-01-01

    The Experiment Dashboard system provides common solutions for monitoring job processing, data transfers and site/service usability. Over the last seven years, it proved to play a crucial role in the monitoring of the LHC computing activities, distributed sites and services. It has been one of the key elements during the commissioning of the distributed computing systems of the LHC experiments. The first years of data taking represented a serious test for Experiment Dashboard in terms of functionality, scalability and performance. And given that the usage of the Experiment Dashboard applications has been steadily increasing over time, it can be asserted that all the objectives were fully accomplished.

  9. VHMPID: a new detector for the ALICE experiment at LHC

    Directory of Open Access Journals (Sweden)

    Perini D.

    2011-04-01

    This article presents the basic idea of VHMPID, an upgrade detector for the ALICE experiment at the LHC, CERN. The main goal of this detector is to extend the particle identification capabilities of ALICE to give more insight into the evolution of the hot and dense matter created in Pb-Pb collisions. Starting from the physics motivations and working principles, the challenges and current status of development are detailed.

  10. VHMPID: a new detector for the ALICE experiment at LHC

    CERN Document Server

    Agócs, A Gu; Barnaföldi, G G; Bellwied, R; Bencze, Gy; Berényi, D; Boldizsár, L; Cuautle, E; De Cataldo, G; Di Bari, D; Di Mauro, A; Dominguez, I; Futó, E; García, E; Hamar, G; Harris, J; Harton, A; Kovács, L; Lévai, P; Lipusz, Cs; Markert, C; Martinengo, P; Martinez, M I; Mastromarco, M; Mayani, D; Molnár, L; Nappi, E; Ortiz, A; Paić, G; Pastore, C; Patino, M E; Perini, D; Perrino, D; Peskov, V; Pinsky, L; Piuz, F; Pochybová, S; Smirnov, N; Song, J; Timmins, A; Varga, D; Vargas, A; Vergara, S; Volpe, G; Yi, J; Yoo, I K

    2011-01-01

    This article presents the basic idea of VHMPID, an upgrade detector for the ALICE experiment at the LHC, CERN. The main goal of this detector is to extend the particle identification capabilities of ALICE to give more insight into the evolution of the hot and dense matter created in Pb-Pb collisions. Starting from the physics motivations and working principles, the challenges and current status of development are detailed.

  11. PanDA: Exascale Federation of Resources for the ATLAS Experiment at the LHC

    Science.gov (United States)

    Barreiro Megino, Fernando; Caballero Bejar, Jose; De, Kaushik; Hover, John; Klimentov, Alexei; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Petrosyan, Artem; Wenaus, Torre

    2016-02-01

    After a scheduled maintenance and upgrade period, the world's largest and most powerful machine - the Large Hadron Collider (LHC) - is about to enter its second run at unprecedented energies. In order to exploit the scientific potential of the machine, the experiments at the LHC face computational challenges with enormous data volumes that need to be analysed by thousands of physics users and compared to simulated data. Given diverse funding constraints, the computational resources for the LHC have been deployed in a worldwide mesh of data centres, connected to each other through Grid technologies. The PanDA (Production and Distributed Analysis) system was developed in 2005 for the ATLAS experiment on top of this heterogeneous infrastructure to seamlessly integrate the computational resources and give the users the feeling of a unique system. Since its origins, PanDA has evolved together with upcoming computing paradigms in and outside HEP, such as changes in the networking model, Cloud Computing and HPC. It is currently running steadily up to 200 thousand simultaneous cores (limited by the available resources for ATLAS), up to two million aggregated jobs per day and processes over an exabyte of data per year. The success of PanDA in ATLAS is triggering the widespread adoption and testing by other experiments. In this contribution we will give an overview of the PanDA components and focus on the new features and upcoming challenges that are relevant to the next decade of distributed computing workload management using PanDA.

  12. Physics Validation of the LHC Software

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The LHC software will be confronted with unprecedented challenges as soon as the LHC turns on. We summarize the main software requirements coming from the LHC detectors, triggers and physics, and we discuss several examples of software components developed by the experiments and the LCG project (simulation, reconstruction, etc.), their validation, and their adequacy for LHC physics.

  13. Modelling of local/global architectures for second level trigger at the LHC experiment

    International Nuclear Information System (INIS)

    Hajduk, Z.; Iwanski, W.; Korecyl, K.; Strong, J.

    1994-01-01

    Different architectures of the second-level triggering system for experiments at the LHC have been simulated. The basic scheme was a local/global system with distributed computing power. As a simulation tool the authors have used the object-oriented MODSIM II language

  14. CMOS pixel development for the ATLAS experiment at HL-LHC

    CERN Document Server

    Risti{c}, Branislav; The ATLAS collaboration

    2017-01-01

    To cope with the rate and radiation environment expected at the HL-LHC, new approaches are being developed for CMOS pixel detectors, providing charge collection in a depleted layer. They are based on: HV-enabling technologies that allow the use of high depletion voltages (HV-MAPS); high-resistivity wafers for large depletion depths (HR-MAPS); radiation-hard processes with multiple nested wells that allow CMOS electronics to be embedded with sufficient shielding in the sensor substrate; and backside processing and thinning for material minimization and backside voltage application. Since 2014, members of more than 20 groups in the ATLAS experiment have been actively pursuing CMOS pixel R&D in an ATLAS Demonstrator program covering sensor design and characterization. The goal of this program is to demonstrate that depleted CMOS pixels, with monolithic or hybrid designs, are suited to high-rate, fast-timing and high-radiation operation at the LHC. For this, a number of technologies have been explored and characterized. In this pr...

  15. Access Safety Systems – New Concepts from the LHC Experience

    CERN Document Server

    Ladzinski, T; di Luca, S; Hakulinen, T; Hammouti, L; Riesco, T; Nunes, R; Ninin, P; Juget, J-F; Havart, F; Valentini, F; Sanchez-Corral Mena, E

    2011-01-01

    The LHC Access Safety System has introduced a number of new concepts into the domain of personnel protection at CERN. These can be grouped into several categories: organisational, architectural and concerning the end-user experience. By anchoring the project on the solid foundations of the IEC 61508/61511 methodology, the CERN team and its contractors managed to design, develop, test and commission on time a SIL3 safety system. The system uses a successful combination of the latest Siemens redundant safety programmable logic controllers with a traditional relay logic hardwired loop. The external envelope barriers used in the LHC include personnel and material access devices, which are interlocked door-booths introducing increased automation of individual access control, thus removing the strain from the operators. These devices ensure the inviolability of the controlled zones by users not holding the required credentials. To this end they are equipped with personnel presence detectors and th...

  16. CMOS pixel development for the ATLAS experiment at HL-LHC

    CERN Document Server

    Rimoldi, Marco; The ATLAS collaboration

    2017-01-01

    To cope with the rate and radiation environment expected at the HL-LHC, new approaches are being developed for CMOS pixel detectors, providing charge collection in a depleted layer. They are based on: HV-enabling technologies that allow the use of high depletion voltages; high-resistivity wafers for large depletion depths; radiation-hard processes with multiple nested wells that allow CMOS electronics to be embedded with sufficient shielding in the sensor substrate; and backside processing and thinning for material minimization and backside voltage application. Since 2014, members of more than 20 groups in the ATLAS experiment have been actively pursuing CMOS pixel R&D in an ATLAS Demonstrator program covering sensor design and characterization. The goal of this program is to demonstrate that depleted CMOS pixels are suited to high-rate, fast-timing and high-radiation operation at the LHC. For this, a number of technologies have been explored and characterized. In this presentation the challenges for the usage of CMOS pixel...

  17. CMOS Pixel Development for the ATLAS Experiment at HL-LHC

    CERN Document Server

    Gaudiello, Andrea; The ATLAS collaboration

    2017-01-01

    To cope with the rate and radiation environment expected at the HL-LHC, new approaches are being developed for CMOS pixel detectors, providing charge collection in a depleted layer. They are based on: HV-enabling technologies that allow the use of high depletion voltages (HV-MAPS); high-resistivity wafers for large depletion depths (HR-MAPS); radiation-hard processes with multiple nested wells that allow CMOS electronics to be embedded with sufficient shielding in the sensor substrate; and backside processing and thinning for material minimization and backside voltage application. Since 2014, members of more than 20 groups in the ATLAS experiment have been actively pursuing CMOS pixel R&D in an ATLAS Demonstrator program covering sensor design and characterization. The goal of this program is to demonstrate that depleted CMOS pixels, with monolithic or hybrid designs, are suited to high-rate, fast-timing and high-radiation operation at the LHC. For this, a number of technologies have been explored and characterized. In this pr...

  18. LHC Report: a record start for LHC ion operation

    CERN Multimedia

    Jan Uythoven for the LHC Team

    2011-01-01

    After the technical stop, the LHC switched over to ion operation, colliding lead-ions on lead-ions. The recovery from the technical stop was very smooth, and records for ion luminosity were set during the first days of ion operation.   The LHC technical stop ended on the evening of Friday, 11 November. The recovery from the technical stop was extremely smooth, and already that same evening ion beams were circulating in the LHC. ‘Stable beams’ were declared the same night, with 2 x 2 bunches of ions circulating in the LHC, allowing the experiments to have their first look at ion collisions this year. However, the next step-up in intensity – colliding 170 x 170 bunches – was postponed due to a vacuum problem in the PS accelerator, so the collisions on Sunday, 13 November were confined to 9 x 9 bunches. The vacuum problem was solved, and on the night of Monday, 14 November, trains of 24 lead bunches were injected into the LHC and 170 x 170 bunches were brough...

  19. Silicon drift detectors in the ALICE experiment at the LHC: performance tests and simulations

    International Nuclear Information System (INIS)

    ALICE collaboration

    2001-01-01

    A brief introduction to the silicon drift detector (SDD) in the ALICE experiment at the LHC, CERN, is given. Excellent agreement is found between the results of the simulation code (AliRoot) and the test beam data for the SDDs. A study of SDD performance and double-track separation capability is shown

  20. Physics at LHC and beyond

    CERN Document Server

    2014-01-01

    The topics addressed during this Conference are as follows. ---An overview of the legacy results of the LHC experiments with 7 and 8 TeV data on Standard Model physics, the scalar sector and searches for New Physics. ---A discussion of the readiness of the CMS, ATLAS, and LHCb experiments for the forthcoming high-energy run and the status of the detector upgrades. ---A review of the most up-to-date theoretical results on cross-sections and uncertainties, the phenomenology of the scalar sector, and constraints and portals for new physics. ---The presentation of the improvements and of the expected sensitivities for Run 2 of the LHC at 13 TeV and beyond. ---A comparison of the relative scientific merits of the future projects for hadron and e+e- colliders (HL-LHC, HE-LHC, ILC, CLIC, TLEP, VHE-LHC) towards precision measurements of the scalar boson properties and of the Electroweak-Symmetry-Breaking parameters, and towards direct searches for New Physics.

  1. Electron-cloud simulation results for the SPS and recent results for the LHC

    International Nuclear Information System (INIS)

    Furman, M.A.; Pivi, M.T.F.

    2002-01-01

    We present an update of computer simulation results for some features of the electron cloud at the Large Hadron Collider (LHC) and recent simulation results for the Super Proton Synchrotron (SPS). We focus on the sensitivity of the power deposition on the LHC beam screen to the emitted electron spectrum, which we study by means of a refined secondary electron (SE) emission model recently included in our simulation code
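    The build-up mechanism these simulations study can be illustrated with a deliberately simplified model: if each wall impact multiplies the electron population by an effective secondary emission yield (SEY), the cloud grows only above the multipacting threshold SEY_eff > 1. The function name and parameter values below are illustrative, not taken from the authors' code:

    ```python
    def cloud_population(sey_eff, n0=1.0, wall_hits=30):
        """Toy electron-cloud build-up: after each wall hit the
        population is multiplied by the effective secondary emission
        yield (a crude stand-in for the refined SE emission model)."""
        n = n0
        for _ in range(wall_hits):
            n *= sey_eff
        return n

    # Above threshold (SEY_eff > 1) the population grows exponentially,
    # which is what drives power deposition on the beam screen; below
    # threshold the cloud dies away.
    print(cloud_population(1.3))   # grows
    print(cloud_population(0.7))   # decays
    ```

    Real simulations track emitted-electron energy spectra and beam kicks; this sketch only shows why the effective yield acts as a threshold parameter.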

  2. LHC Report: Summer temperatures in the LHC

    CERN Multimedia

    Jan Uythoven for the LHC Team

    2012-01-01

    The LHC experiments have finished their data-taking period before the summer conferences. The machine has already delivered substantially more collisions to the experiments this year than in the whole of 2011. The LHC has now started a six-day Machine Development period, which will be followed by the second Technical Stop of the year.   The number of collisions delivered to the experiments is expressed in integrated luminosity. In 2011, the integrated luminosity delivered to both ATLAS and CMS was around 5.6 fb-1. On Monday 18 June, experiments finished taking data before the summer conferences and the integrated luminosity for 2012 so far is about 6.6 fb-1, well above the unofficial target of 5 fb-1. The LHC’s performance over the last week of running was so efficient that the injection kicker magnets – which heat up due to the circulating beam – did not have time to cool down between the subsequent fills. As the time constants for warming up and cooli...

  3. Detector control system for an LHC experiment

    CERN Document Server

    Mato, P

    1998-01-01

    The purpose of this document is to provide the user requirements for a detector control system kernel for the LHC experiments following the ESA standard PSS-05 [1]. The first issue will be used to provide the basis for an evaluation of possible development philosophies for a kernel DCS. As such it will cover all the major functionality, but only to a level of detail sufficient for such an evaluation to be performed. Many of the requirements are therefore intentionally high-level and generic, and are meant to outline the functionality that would be required of the kernel DCS, but not yet to the level of detail required for implementation. The document is also written in a generic fashion in order not to rule out any implementation technology.

  4. Magnetic monopole searches with the MoEDAL experiment at the LHC

    CERN Document Server

    Pinfold, J; Lacarrère, D; Mermod, P; Katre, A

    2014-01-01

    The magnetic monopole appears in theories of spontaneous gauge symmetry breaking and its existence would explain the quantisation of electric charge. MoEDAL is the latest approved LHC experiment, designed to search directly for monopoles. It has now taken data for the first time. The MoEDAL detectors are based on two complementary techniques: nuclear-track detectors are sensitive to the high-ionisation signature expected from a monopole, and the new magnetic monopole trapper (MMT) relies on the stopping and trapping of monopoles inside an aluminium array which is then analysed with a superconducting magnetometer. Preliminary results obtained with a subset of the MoEDAL MMT test array deployed in 2012 are presented, where monopoles with charge above the fundamental unit magnetic charge or mass above 1.5 TeV are probed for the first time at the LHC

  5. Integrating LWDAQ into the Detector Control Systems of the LHC Experiments at CERN

    CERN Multimedia

    Holme, Oliver; Golonka, Piotr

    2009-01-01

    The LWDAQ (Long-Wire Data Acquisition) software and hardware from Brandeis University, Mass., USA provides access to a powerful suite of measurement instruments. Two high-precision monitors used to measure the relative alignment between a source and a sensor are included. The BCAM (Brandeis CCD Angle Monitor) cameras take images of point light sources and the Rasnik (Red Alignment System of NIKHEF) cameras take images of the NIKHEF-developed Rasnik mask. Both systems are used in the LHC experiments at CERN. Brandeis University provides a tool called Acquisifier to script the data acquisition process and to analyse the images to determine the alignment data. In order to incorporate the resulting data from the alignment system into the Detector Control System (DCS) of the LHC experiments, a new software component of the Joint COntrols Project (JCOP) Framework was developed. It provides a TCP/IP interface between LWDAQ and the SCADA tool PVSS so that the results of the data acquisition process can easily be retur...

  6. LHC Report: astounding availability

    CERN Multimedia

    Andrea Apollonio for the LHC team

    2016-01-01

    The LHC is off to an excellent start in 2016, having already produced triple the luminosity of 2015. An important factor in the impressive performance so far this year is the unprecedented machine availability.   LHC integrated luminosity in 2011, 2012, 2015 and 2016 and the prediction of the 2016 performance foreseen at the start of the year. Following the 2015-2016 end of year shutdown, the LHC restarted beam operation in March 2016. Between the restart and the first technical stop (TS1) in June, the LHC's beam intensity was successively increased, achieving operation with 2040 bunches per beam. The technical stop on 7-8 June was shortened to maximise the time available for luminosity production for the LHC experiments before the summer conferences. Following the technical stop, operation resumed and quickly returned to the performance levels previously achieved. Since then, the LHC has been running steadily with up to 2076 bunches per beam. Since the technical stop, a...

  7. Handling Worldwide LHC Computing Grid Critical Service Incidents : The infrastructure and experience behind nearly 5 years of GGUS ALARMs

    CERN Multimedia

    Dimou, M; Dulov, O; Grein, G

    2013-01-01

    In the Worldwide LHC Computing Grid (WLCG) project the Tier centres are of paramount importance for storing and accessing experiment data and for running the batch jobs necessary for experiment production activities. Although Tier2 sites provide a significant fraction of the resources, non-availability of resources at the Tier0 or the Tier1s can seriously harm not only WLCG Operations but also the experiments' workflow and the storage of LHC data, which are very expensive to reproduce. This is why availability requirements for these sites are high and committed to in the WLCG Memorandum of Understanding (MoU). In this talk we describe the workflow of GGUS ALARMs, the only 24/7 mechanism available to LHC experiment experts for reporting problems with their Critical Services to the Tier0 or the Tier1s. Conclusions and experience gained from the detailed drills performed for each such ALARM over the last 4 years are explained, along with the shift over time in the types of problems encountered. The physical infrastructure put in place to ...

  8. Preliminary accelerator plans for maximizing the integrated LHC luminosity

    CERN Document Server

    Benedikt, Michael; Ruggiero, F; Ostojic, R; Scandale, Walter; Shaposhnikova, Elena; Wenninger, J

    2006-01-01

    A working group on "Proton Accelerators for the Future" (PAF) was created in May 2005 by the CERN management to elaborate a baseline scenario for the possible development and upgrade of the present proton accelerator complex. This report is the result of the investigation conducted until the end of 2005, in close connection with the working group on "Physics Opportunities with Future Proton Accelerators" (POFPA), and is consistent with its recommendations. Focused on the goal of maximizing the integrated luminosity for the LHC experiments, a scenario of evolution is proposed, subject to further refinement using future experience of commissioning and running-in the collider and its injector complex. The actions to be taken in terms of consolidation, R&D and improvement are outlined. The benefits for other types of physics are mentioned and will be investigated in more detail in the future.

  9. Experiments on the margin of beam-induced quenches of a superconducting quadrupole magnet in the LHC

    CERN Document Server

    Bracco, C; Bednarek, M J; Nebot Del Busto, E; Goddard, B; Holzer, E B; Nordt, A; Sapinski, M; Schmidt, R; Solfaroli Camillocci, M; Zerlauth, M

    2012-01-01

    Protection of LHC equipment relies on a complex system of collimators to capture injected and circulating beam in case of LHC kicker magnet failures. However, for specific failures of the injection kickers, the beam can graze the injection protection collimators and induce quenches of downstream superconducting magnets. This occurred twice during 2011 operation and cannot be excluded during future operation. Tests were performed during Machine Development periods of the LHC to assess the quench margin of the quadrupole located just downstream of the last injection protection collimator in point 8. In addition to the existing Quench Protection System, a special monitoring instrumentation was installed at this magnet to detect any resistance increase below the quench limit. The correlation between the magnet and Beam Loss Monitor signals was analysed for different beam intensities and magnet currents. The results of the experiments are presented.

  10. Development of Aluminium Vacuum Chambers for the LHC Experiments at CERN

    CERN Document Server

    Gallilee, M; Costa-Pinto, P; Lepeule, P; Perez-Espinos, J; Marques Antunes Ferreira, L; Prever-Loiri, L; Sapountzis, A

    2014-01-01

    Beam losses may cause activation of vacuum chamber walls, in particular those of the Large Hadron Collider (LHC) experiments. For the High Luminosity (HL-LHC), the activation of such vacuum chambers will increase. It is therefore necessary to use a vacuum chamber material which interacts less with the circulating beam. While beryllium is reserved for the collision point, a good compromise between cost, availability and transparency is obtained with aluminium alloys; such materials are a preferred choice with respect to austenitic stainless steel. Manufacturing a thin-wall aluminium vacuum chamber presents several challenges as the material grade needs to be machinable, weldable, leak-tight for small thicknesses, and able to withstand heating to 250°C for extended periods of time. This paper presents some of the technical challenges during the manufacture of these vacuum chambers and the methods for overcoming production difficulties, including surface treatments and Non-Evaporable Getter (NEG) thin-film coat...

  11. First Operational Experience with the LHC Beam Dump Trigger Synchronisation Unit

    CERN Document Server

    Antoine, A; Magnin, N; Juteau, P; Voumard, N

    2011-01-01

    Two LHC Beam Dumping Systems (LBDS) remove the counter-rotating beams safely from the collider during setting up of the accelerator, at the end of a physics run and in case of emergencies. Dump requests can come from 3 different sources: the machine protection system in emergency cases, the machine timing system for scheduled dumps or the LBDS itself in case of internal failures. These dump requests are synchronized with the 3 μs beam abort gap in a fail-safe redundant Trigger Synchronization Unit (TSU) based on a Digital Phase Locked Loop (DPLL), locked onto the LHC beam revolution frequency with a maximum phase error of 40 ns. The synchronized trigger pulses coming out of the TSU are then distributed to the high voltage generators of the beam dump kickers through a redundant fault-tolerant trigger distribution system. This paper describes the operational experience gained with the TSU since its commissioning with beam in 2009, and highlights the improvements, which have been implemented f...
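    The synchronisation idea above, a loop steering a local oscillator towards the beam revolution frequency so triggers can be phased onto the abort gap, can be sketched as a toy first-order tracking loop. The gain, iteration count and loop structure are illustrative assumptions, not the TSU's DPLL design:

    ```python
    # Toy first-order tracking loop in the spirit of a DPLL locking onto
    # a reference frequency (illustrative only, not the TSU hardware).
    f_ref = 11245.0   # approximate LHC beam revolution frequency in Hz
    f_est = 11000.0   # initial local estimate, deliberately off
    gain = 0.5        # assumed loop gain

    for _ in range(50):
        error = f_ref - f_est    # stand-in for the measured phase error
        f_est += gain * error    # proportional correction each turn

    print(round(f_est, 3))       # converges to ~11245.0
    ```

    In the real system the correction is applied to phase rather than a frequency variable directly, and the residual error must stay within 40 ns, but the feedback principle is the same.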

  12. LHC Report: LHC smashes collision records

    CERN Multimedia

    Sarah Charley

    2016-01-01

    The Large Hadron Collider is now producing more than a billion proton-proton collisions per second.   The LHC is colliding protons at a faster rate than ever before: approximately 1 billion times per second. Since April 2016, the LHC has delivered more than 30 inverse femtobarns (fb-1) to both ATLAS and CMS. This means that around 2.4 quadrillion (2.4 million billion) collisions have been seen by each of the experiments this year. The inverse femtobarn is the unit of measurement for integrated luminosity, indicating the cumulative number of potential collisions. This compares with the total of 33.2 fb-1 produced between 2010 and 2015. The unprecedented performance this year is the result of both the incremental increases in collision rate and the sheer amount of time the LHC has been up and running. This comes after a slow start-up in 2015, when scientists and engineers still needed to learn how to operate the machine at a much higher energy. “With more energy, the machine is much more sen...
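    The quoted collision count follows from the integrated luminosity: the expected number of interactions is N = L_int x sigma. Assuming an inelastic proton-proton cross-section of roughly 80 mb at 13 TeV (an assumed value, not stated in the report), 30 fb-1 indeed corresponds to about 2.4 quadrillion collisions:

    ```python
    # Expected number of collisions: N = L_int * sigma.
    FB_INV_TO_B_INV = 1e15      # 1 fb^-1 = 1e15 b^-1
    lumi_fb = 30.0              # integrated luminosity quoted in the report
    sigma_inelastic_b = 80e-3   # assumed inelastic pp cross-section, ~80 mb

    n_collisions = lumi_fb * FB_INV_TO_B_INV * sigma_inelastic_b
    print(f"{n_collisions:.1e}")   # ~2.4e15, i.e. about 2.4 quadrillion
    ```

    The same arithmetic explains why the inverse femtobarn is described as a count of potential collisions: it becomes an event count only once multiplied by a cross-section.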

  13. Electronics at LHC

    CERN Document Server

    Hall, Geoffrey

    1998-01-01

    An overview of the electronic readout systems planned for use in the CMS and ATLAS experiments at the LHC will be given, with an emphasis on the motivations for the designs adopted and the major technologies to be employed, especially those which are specific to the LHC. At its design luminosity, the LHC will deliver hundreds of millions of proton-proton interactions per second. Storage and computing limitations restrict the number of physics events that can be recorded to about 100 per second. The selection will be carried out by the trigger and data acquisition systems of the experiments. This lecture will review the requirements, architectures and various designs currently considered.

  14. Conference: STANDARD MODEL @ LHC

    CERN Multimedia

    2012-01-01

    HCØ institute Universitetsparken 5 DK-2100 Copenhagen Ø Denmark Room: Auditorium 2 STANDARD MODEL @ LHC Niels Bohr International Academy and Discovery Center 10-13 April 2012 This four day meeting will bring together both experimental and theoretical aspects of Standard Model phenomenology at the LHC. The very latest results from the LHC experiments will be under discussion. Topics covered will be split into the following categories:     * QCD (Hard,Soft & PDFs)     * Vector Boson production     * Higgs searches     * Top Quark Physics     * Flavour physics

  15. The high-performance database archiver for the LHC experiments

    CERN Document Server

    González-Berges, M

    2007-01-01

    Each of the Large Hadron Collider (LHC) experiments will be controlled by a large distributed system built with the Supervisory Control and Data Acquisition (SCADA) tool Prozessvisualisierungs- und Steuerungssystem (PVSS). There will be on the order of 150 computers and one million input/output parameters per experiment. The values read from the hardware, the alarms generated and the user actions will be archived for later physics analysis, operation and debugging of the control system itself. Although the original PVSS implementation of a database archiver was appropriate for standard industrial use, its performance was not sufficient for the experiments. A collaboration was set up between CERN and ETM, the company that develops PVSS. Changes in the architecture and several optimizations were made and tested in a system of a size comparable to the final ones. As a result, we have been able to improve the performance by more than one order of magnitude, and what is more important, we now have a scal...

  16. Review of the ATLAS experiment at the LHC (CERN)

    International Nuclear Information System (INIS)

    Taylor, G.

    1998-01-01

    Full text: This talk gives an overview of the physics program for the next generation of high energy physics experiments at CERN's Large Hadron Collider (LHC). Emphasis will be on the ATLAS experiment and in particular on the Australian participation in that experiment. Australian physicists from Melbourne, Sydney and Wollongong are playing a significant role in the development, production, installation and operation of the ambitious Semiconductor Tracker (SCT) in the ATLAS Inner Detector. The SCT, particularly important for the detection and measurement of high-energy electrons, will be essential in the search for the Higgs Boson through electron decay channels (amongst other reactions). The design calls for a total detector surface area an order of magnitude larger than in current silicon detectors, in a harsh radiation environment. Prodigious data rates and high-speed electronics add to the complications of this detector. The talk will review progress and describe the schedule for the completion of the SCT and ATLAS

  17. LHC luminosity upgrade detector challenges

    CERN Multimedia

    CERN. Geneva; de Roeck, Albert; Bortoletto, Daniela; Wigmans, Richard; Riegler, Werner; Smith, Wesley H

    2006-01-01

    LHC luminosity upgrade: detector challenges The upgrade of the LHC machine towards higher luminosity (10^35 cm^-2 s^-1) has been studied over the last few years. These studies have investigated scenarios to achieve the increase in peak luminosity by an order of magnitude, as well as the physics potential of such an upgrade and the impact of a machine upgrade on the LHC detectors. This series of lectures will cover the following topics: • Physics motivation and machine scenarios for an order of magnitude increase in the LHC peak luminosity (lecture 1) • Detector challenges, including an overview of ideas for R&D programs by the LHC experiments: tracking and calorimetry, and other new detector developments (lectures 2-4) • Electronics, trigger and data acquisition challenges (lecture 5) Note: the much more ambitious LHC energy upgrade will not be covered

  18. Optoelectronics in TESLA, LHC and pi-of-the-sky experiments

    CERN Document Server

    Romaniuk, Ryszard; Simrock, Stefan; Wrochna, Grzegorz

    2004-01-01

    Optical and optoelectronic technologies are more and more widely used in the biggest world experiments of high-energy and nuclear physics, as well as in astronomy. The paper is a broad digest describing the usage of optoelectronics in such experiments and giving information about some of the involved teams. The described experiments include: the TESLA linear accelerator and FEL, the Compact Muon Solenoid at the LHC, and the recently started pi-of-the-sky global observation experiment for gamma-ray bursts (with associated optical flashes). Optoelectronics and photonics offer several key features which either extend the technical parameters of existing solutions or add quite new practical application possibilities. Some of these favourable features of photonic systems are: high selectivity of optical sensors, immunity to some kinds of noise processes, extremely broad bandwidth exchangeable for either terabit-rate transmission or ultrashort pulse generation, parallel image processing capability, etc. The following g...

  19. The physics behind LHC

    CERN Multimedia

    CERN. Geneva

    2006-01-01

    What do physicists want to discover with experiments at the LHC? What is the Higgs boson? What are the new phenomena that could be observed at the LHC? I will try to answer these questions using language accessible also to non-experts. Organiser(s): L. Alvarez-Gaume / PH-TH. Note: Tea & coffee will be served at 16:00.

  20. First experiences with the LHC BLM sanity checks

    OpenAIRE

    Emery, J; Dehning, B; Effinger, E; Nordt, A; Sapinski, M G; Zamantzas, C

    2010-01-01

    Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early stage of the studies up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new ways of automatic checking have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that after each test the system status is "as good as ne...

  1. LHC Report: The beam is back at the LHC

    CERN Multimedia

    Reyes Alemany

    2015-01-01

    A series of sector beam tests paved the way for the start-up of the LHC in 2008 and 2009. These tests and the follow-up of the issues that arose were part of the process that led to a smooth start-up with beam.   Given this experience, sector tests were scheduled to take place several weeks before the 2015 start-up. On the weekend of 6-9 March, beam from the SPS was injected into both LHC injection regions, followed by a first pass through the downstream LHC sectors. For the clockwise LHC beam (called “beam 1”) this meant passing through ALICE and into Sector 2-3, while the anticlockwise beam (called “beam 2”) was threaded through LHCb and all the way from Point 8 to Point 6, where it was extracted by the beam dump kickers onto the beam dump block. The dry runs in the previous weeks were mainly targeted at preparation for the sector tests. The systems tested included: injection, timing, synchronisation and beam instrumentation. The beam interlock ...

  2. New U.S. LHC Web site launched

    CERN Multimedia

    Katie Yurkewicz

    2007-01-01

    On September 12, the U.S. Department of Energy's Office of Science launched a new Web site, www.uslhc.us, to tell the story of the U.S. role in the LHC. The site provides general information for the public about the LHC and its six experiments, as well as detailed information about the participation of physicists, engineers and students from the United States. The U.S. site joins the UK's LHC site in providing information for a national audience, with sites from several more countries expected to launch within the next year. The US LHC site features news and information about the LHC, along with high-resolution images and resources for students and educators. The site also features blogs by four particle physicists, including ATLAS collaborators Monica Dunford from the University of Chicago and Peter Steinberg from Brookhaven National Laboratory. More than 1,300 scientists from over 90 U.S. institutions participate in the LHC and its experiments, representing universities and national laboratories from...

  3. LHC Luminosity Performance

    CERN Document Server

    AUTHOR|(CDS)2091107; Fuchsberger, Kajetan; Papotti, Giulia

    This thesis addresses several approaches with the common goal of assessing, understanding and improving the luminosity of the Large Hadron Collider (LHC). To better exploit existing margins for maximum luminosity while fulfilling the requirements of the LHC experiments, new techniques for luminosity levelling are studied and developed to an operational state, such as changing the crossing angle or $\beta^*$ (beam size) at the interaction points with the beams in collision. In 2017 LHC operation, the crossing angle reduction in collisions improved the integrated luminosity by ~2 fb-1 (~4% of the yearly production). For additional diagnostics, a new method for measuring beam sizes and orbits for each circulating bunch using the luminosity measurement during beam separation scans is shown. The results of these Emittance Scans improved the understanding of the LHC luminosity reach and of the orbit offsets introduced by beam-beam long-range effects.
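    Crossing-angle levelling of the kind studied in the thesis exploits the geometric reduction factor of head-on luminosity, F = 1 / sqrt(1 + (theta_c * sigma_z / (2 * sigma_x))^2): a smaller full crossing angle theta_c brings F closer to 1 and so recovers luminosity. The beam parameters below are rough, LHC-like illustrative values, not numbers taken from the thesis:

    ```python
    import math

    def reduction_factor(theta_c, sigma_z, sigma_x):
        """Geometric luminosity reduction factor for a full crossing angle
        theta_c (rad), bunch length sigma_z and transverse beam size
        sigma_x at the interaction point (same length units)."""
        piwinski = (theta_c * sigma_z) / (2.0 * sigma_x)
        return 1.0 / math.sqrt(1.0 + piwinski ** 2)

    # Illustrative, roughly LHC-like numbers (assumed, not quoted values):
    sigma_z = 7.5e-2   # 7.5 cm bunch length
    sigma_x = 12e-6    # 12 um transverse beam size at the IP
    for theta in (300e-6, 260e-6, 220e-6):   # shrinking crossing angle
        print(theta, round(reduction_factor(theta, sigma_z, sigma_x), 3))
    ```

    Stepping the crossing angle down during a fill, as the beams shrink and burn off, is what turns this geometric factor into a levelling knob.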

  4. Federated software defined network operations for LHC experiments

    Science.gov (United States)

    Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon

    2013-09-01

    The Large Hadron Collider (LHC), the most well-known high-energy physics collaboration and an e-Science endeavour, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being resolved by adopting an advanced Internet technology called software-defined networking (SDN). Stable SDN operation and management is required to keep the federated LHC data-distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may achieve enhanced data-delivery performance based on data-traffic offloading with delay variation. The evaluation results indicate that the overall end-to-end data-delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.

  5. LHC computing (WLCG) past, present, and future

    CERN Document Server

    Bird, I G

    2016-01-01

    The LCG project, and the WLCG Collaboration, represent a more than 10-year investment in building and operating the LHC computing environment. This article gives some of the history of how the WLCG was constructed and the preparations for the accelerator start-up. It discusses the experiences and lessons learned during the first three-year run of the LHC, and concludes with a look forward to the planned upgrades of the LHC and the experiments, discussing the implications for computing.

  6. Operational experience of the upgraded LHC injection kicker magnets during Run 2 and future plans

    Science.gov (United States)

    Barnes, M. J.; Adraktas, A.; Bregliozzi, G.; Goddard, B.; Ducimetière, L.; Salvant, B.; Sestak, J.; Vega Cid, L.; Weterings, W.; Vallgren, C. Yin

    2017-07-01

    During Run 1 of the LHC, one of the injection kicker magnets caused occasional operational delays due to beam-induced heating with high bunch intensity and short bunch lengths. In addition, there were also sporadic issues with vacuum activity and electrical flashover of the injection kickers. An extensive program of studies was launched and significant upgrades were carried out during Long Shutdown 1 (LS1). These upgrades included a new design of beam screen to reduce both the beam coupling impedance of the kicker magnet and the electric field associated with the screen conductors, hence decreasing the probability of electrical breakdown in this region. This paper presents operational experience of the injection kicker magnets during the first years of Run 2 of the LHC, including a discussion of faults and kicker magnet issues that limited LHC operation. In addition, in light of these issues, plans for further upgrades are briefly discussed.

  7. Thermal Runaways in LHC Interconnections: Experiments

    CERN Document Server

    Willering, G P; Bottura, L; Scheuerlein, C; Verweij, A P

    2011-01-01

    The incident in the LHC in September 2008 occurred in an interconnection between two magnets of the 13 kA dipole circuit. This event was traced to a defect in one of the soldered joints between two superconducting cables stabilized by a copper busbar. Further investigation revealed defective joints of other types. A combination of (1) a poor contact between the superconducting cable and the copper stabilizer and (2) an electrical discontinuity in the stabilizer at the level of the connection can lead to an unprotected quench of the busbar. Once the heating power in the unprotected superconducting cable exceeds the heat removal capacity, a thermal run-away occurs, resulting in a fast melt-down of the non-stabilized cable. We have performed a thorough investigation of the conditions under which a thermal run-away in the defect can occur. To this aim, we have prepared heavily instrumented samples with well-defined and controlled defects. In this paper we describe the experiment, and the analysis of the data, and w...

  8. High-level trigger system for the LHC ALICE experiment

    CERN Document Server

    Bramm, R; Lien, J A; Lindenstruth, V; Loizides, C; Röhrich, D; Skaali, B; Steinbeck, T M; Stock, Reinhard; Ullaland, K; Vestbø, A S; Wiebalck, A

    2003-01-01

    The central detectors of the ALICE experiment at the LHC will produce a data size of up to 75 MB/event at an event rate of less than ≈200 Hz, resulting in a data rate of ∼15 GB/s. Online processing of the data is necessary in order to select interesting (sub)events ("High Level Trigger"), or to compress data efficiently by modeling techniques. Processing this data requires a massively parallel computing system (High Level Trigger System). The system will consist of a farm of clustered SMP nodes based on off-the-shelf PCs connected with a high-bandwidth, low-latency network.
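The quoted figures are mutually consistent; a back-of-envelope check (numbers taken from the abstract above):

```python
# 75 MB/event at about 200 Hz gives the ~15 GB/s that the
# High Level Trigger farm has to absorb and reduce online.
event_size_mb = 75.0     # maximum central-detector event size, MB
event_rate_hz = 200.0    # approximate maximum event rate, Hz

throughput_gb_s = event_size_mb * event_rate_hz / 1000.0
print(throughput_gb_s)   # -> 15.0
```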

  9. Critical services in the LHC computing

    International Nuclear Information System (INIS)

    Sciaba, A

    2010-01-01

    The LHC experiments (ALICE, ATLAS, CMS and LHCb) rely on complex computing systems for data acquisition, processing, distribution, analysis and simulation, running a variety of services provided by the experiments, the Worldwide LHC Computing Grid and the different computing centres. These services range from the most basic (network, batch systems, file systems) to the mass-storage services or the Grid information system, up to the different workload management systems, data catalogues and data transfer tools, often developed internally in the collaborations. In this contribution we review the status of the services most critical to the experiments by quantitatively measuring their readiness with respect to the start of LHC operations. Shortcomings are identified and common recommendations are offered.

  10. Performance and Operational Aspects of HL-LHC Scenarios

    CERN Document Server

    Medina Medrano, Luis

    2016-01-01

    Several alternatives to the present HL-LHC baseline configuration have been proposed, aiming either to improve the potential performance, to reduce its risks, or to provide options for addressing possible limitations or changes in its parameters. In this paper we review and compare the performance of the HL-LHC baseline and the main alternatives with the latest parameter set. The results are obtained using refined simulations of the evolution of the luminosity with $\beta^*$-levelling, for which new criteria have been introduced, such as an improved calculation of intrabeam scattering and the addition of penalty steps to account for the time needed to move between consecutive optics during the process. The features of the set of optics are discussed for the nominal baseline.
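The β*-levelling being simulated can be illustrated with a toy model (all numbers are rough HL-LHC-like assumptions for the example, and the model omits the paper's refinements such as intrabeam scattering and the optics-change penalty steps): β* is recomputed at each step to hold the luminosity at the target while protons burn off, until the minimum β* is reached and the luminosity decays.

```python
def levelled_fill(hours=10.0, dt_s=60.0):
    """Toy beta*-levelled fill: returns the luminosity history (cm^-2 s^-1)."""
    N0 = N = 2.2e11          # protons per bunch (assumed)
    n_b = 2748               # bunches per beam (assumed)
    L_virtual = 2.0e35       # virtual peak luminosity at beta*_min (assumed)
    beta_min = 0.15          # minimum reachable beta*, m (assumed)
    L_target = 5.0e34        # levelling target (assumed)
    sigma_cm2 = 1.0e-25      # ~100 mb inelastic cross-section
    lumi = []
    for _ in range(int(hours * 3600.0 / dt_s)):
        # Largest beta* that still delivers the target; clip at beta*_min.
        beta = max(beta_min, beta_min * L_virtual * (N / N0) ** 2 / L_target)
        L = L_virtual * (N / N0) ** 2 * (beta_min / beta)
        N -= 2.0 * sigma_cm2 * L * dt_s / n_b   # burn-off at two IPs
        lumi.append(L)
    return lumi

fill = levelled_fill()
# The fill holds 5.0e34 for several hours, then decays once beta*_min is hit.
```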

  11. LHC Results Highlights (CLASHEP 2013)

    CERN Document Server

    Gonzalez, O.

    2015-05-22

    The good performance of the LHC provided enough data at 7 TeV and 8 TeV to allow the experiments to perform very competitive measurements and to expand the knowledge of the fundamental interactions far beyond that from previous colliders. This report summarizes the highlights of the results obtained with these data samples by the four large experiments, covering all the topics of the physics program and focusing on those exploiting the possibilities of the LHC.

  12. A proposed Drift Tubes-seeded muon track trigger for the CMS experiment at the High Luminosity-LHC

    CERN Document Server

    AUTHOR|(CDS)2070813; Lazzizzera, Ignazio; Vanini, Sara; Zotto, Pierluigi

    2016-01-01

    The LHC program at 13 and 14 TeV, after the observation of the candidate SM Higgs boson, will help clarify future subjects of study and shape the needed tools. Any upgrade of the LHC experiments for unprecedented luminosities, such as the High Luminosity-LHC ones, must then maintain the acceptance on electroweak processes that can lead to a detailed study of the properties of the candidate Higgs boson. The acceptance of the key lepton, photon and hadron triggers should be kept such that the overall physics acceptance, in particular for low-mass-scale processes, can be the same as the one the experiments featured in 2012. In such a scenario, a new approach to early trigger implementation is needed. One of the major steps will be the inclusion of high-granularity tracking sub-detectors, such as the CMS Silicon Tracker, in taking the early trigger decision. This contribution can be crucial in several tasks, including the confirmation of triggers in other subsystems, and the improvement of the on-line momentum mea...

  13. First experiences with the LHC BLM sanity checks

    Science.gov (United States)

    Emery, J.; Dehning, B.; Effinger, E.; Nordt, A.; Sapinski, M. G.; Zamantzas, C.

    2010-12-01

    Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early stage of the studies up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new ways of automatic checking have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that the system status is after each test "as good as new". The sanity checks are part of this strategy. They test the electrical part of the detectors (ionisation chamber or secondary emission detector), their cable connections to the front-end electronics, further connections to the back-end electronics and their ability to request a beam abort. During installation and in the early commissioning phase, these checks have shown their ability to also find non-conformities caused by unexpected failure scenarios. In everyday operation, a non-conformity discovered by this check inhibits any further injections into the LHC until the check confirms the absence of non-conformities.

  14. First experiences with the LHC BLM sanity checks

    International Nuclear Information System (INIS)

    Emery, J; Dehning, B; Effinger, E; Nordt, A; Sapinski, M G; Zamantzas, C

    2010-01-01

    Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early stage of the studies up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new ways of automatic checking have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that the system status is after each test "as good as new". The sanity checks are part of this strategy. They test the electrical part of the detectors (ionisation chamber or secondary emission detector), their cable connections to the front-end electronics, further connections to the back-end electronics and their ability to request a beam abort. During installation and in the early commissioning phase, these checks have shown their ability to also find non-conformities caused by unexpected failure scenarios. In everyday operation, a non-conformity discovered by this check inhibits any further injections into the LHC until the check confirms the absence of non-conformities.

  15. First experiences with the LHC BLM sanity checks

    Energy Technology Data Exchange (ETDEWEB)

    Emery, J; Dehning, B; Effinger, E; Nordt, A; Sapinski, M G; Zamantzas, C, E-mail: Jonathan.emery@cern.ch [CERN, CH-1211 Geneve 23 (Switzerland)

    2010-12-15

    Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early stage of the studies up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new ways of automatic checking have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that the system status is after each test "as good as new". The sanity checks are part of this strategy. They test the electrical part of the detectors (ionisation chamber or secondary emission detector), their cable connections to the front-end electronics, further connections to the back-end electronics and their ability to request a beam abort. During installation and in the early commissioning phase, these checks have shown their ability to also find non-conformities caused by unexpected failure scenarios. In everyday operation, a non-conformity discovered by this check inhibits any further injections into the LHC until the check confirms the absence of non-conformities.

  16. Review of LHC dark matter searches

    International Nuclear Information System (INIS)

    Kahlhoefer, Felix

    2017-02-01

    This review discusses both experimental and theoretical aspects of searches for dark matter at the LHC. An overview of the various experimental search channels is given, followed by a summary of the different theoretical approaches for predicting dark matter signals. A special emphasis is placed on the interplay between LHC dark matter searches and other kinds of dark matter experiments, as well as among different types of LHC searches.

  17. Review of LHC dark matter searches

    Energy Technology Data Exchange (ETDEWEB)

    Kahlhoefer, Felix

    2017-02-15

    This review discusses both experimental and theoretical aspects of searches for dark matter at the LHC. An overview of the various experimental search channels is given, followed by a summary of the different theoretical approaches for predicting dark matter signals. A special emphasis is placed on the interplay between LHC dark matter searches and other kinds of dark matter experiments, as well as among different types of LHC searches.

  18. Enabling the ATLAS Experiment at the LHC for High Performance Computing

    CERN Document Server

    AUTHOR|(CDS)2091107; Ereditato, Antonio

    In this thesis, I studied the feasibility of running computer data analysis programs from the Worldwide LHC Computing Grid, in particular large-scale simulations of the ATLAS experiment at the CERN LHC, on current general-purpose High Performance Computing (HPC) systems. An approach for integrating HPC systems into the Grid is proposed, which has been implemented and tested on the "Todi" HPC machine at the Swiss National Supercomputing Centre (CSCS). Over the course of the test, more than 500,000 CPU-hours of processing time were provided to ATLAS, roughly equivalent to the combined computing power of the two ATLAS clusters at the University of Bern. This showed that current HPC systems can be used to efficiently run large-scale simulations of the ATLAS detector and of the detected physics processes. As a first conclusion of my work, one can argue that, in perspective, running large-scale tasks on a few large machines might be more cost-effective than running on relatively small dedicated com...

  19. A dedicated LHC collider Beauty experiment for precision measurements of CP-violation. LHC-B letter of intent

    International Nuclear Information System (INIS)

    Crosetto, Dario B.

    1996-01-01

    The LHC-B Collaboration proposes to build a forward collider detector dedicated to the study of CP violation and other rare phenomena in the decays of Beauty particles. The forward geometry results in an average 80 GeV momentum of reconstructed B-mesons and, with multiple, efficient and redundant triggers, yields large event samples. B-hadron decay products are efficiently identified by Ring-Imaging Cerenkov Counters, rendering a wide range of multi-particle final states accessible and providing precise measurements of all angles, α, β and γ, of the unitarity triangle. The LHC-B microvertex detector capabilities facilitate multi-vertex event reconstruction and proper-time measurements with an expected few-percent uncertainty, permitting measurements of B_s mixing well beyond the largest conceivable values of x_s. LHC-B would be fully operational at the startup of LHC and requires only a modest luminosity to reveal its full performance potential

  20. The CERN Detector Safety System for LHC Experiments

    CERN Document Server

    Lüders, S; Morpurgo, G; Schmeling, S M

    2003-01-01

    The Detector Safety System (DSS), developed at CERN in common for the four LHC experiments under the auspices of the Joint Controls Project (JCOP), will be responsible for assuring the equipment protection for these experiments. Therefore, the DSS requires a high degree of both availability and reliability. It is composed of a Front-end and a Back-end part. The Front-end is based on a redundant Siemens PLC, to which the safety-critical part of the DSS task is delegated. The PLC Front-end is capable of running autonomously and of automatically taking predefined protective actions whenever required. It is supervised and configured by the CERN-chosen PVSS SCADA system via a Siemens OPC server. The supervisory layer provides the operator with a status display and with limited online reconfiguration capabilities. Configuration of the code running in the PLCs is completely data driven via the contents of a 'Configuration Database'. Thus, the DSS can easily adapt to the different and constantly evolving require...

  1. High Luminosity LHC Project Description

    CERN Document Server

    Apollinari, Giorgio; Rossi, Lucio

    2014-01-01

    The High Luminosity LHC (HL-LHC) is a novel configuration of the Large Hadron Collider, aiming to increase the luminosity by a factor of five or more above the nominal LHC design, raising the integrated luminosity in the high-luminosity experiments ATLAS and CMS from the 300 fb⁻¹ of the original LHC design up to 3000 fb⁻¹ or more. This paper contains a short description of the main machine parameters and of the main equipment that needs to be developed and installed. The preliminary cost evaluation and the time plan are also presented. Finally, the international collaboration supporting the project, the governance and the project structure are discussed.

  2. High Intensity Beam Test of Low Z Materials for the Upgrade of SPS-to-LHC Transfer Line Collimators and LHC Injection Absorbers

    CERN Document Server

    Maciariello, Fausto; Butcher, Mark; Calviani, Marco; Folch, Ramon; Kain, Verena; Karagiannis, Konstantinos; Lamas Garcia, Inigo; Lechner, Anton; Nuiry, Francois-Xavier; Steele, Genevieve; Uythoven, Jan

    2016-01-01

    In the framework of the LHC Injector Upgrade (LIU) and High-Luminosity LHC (HL-LHC) projects, the collimators in the SPS-to-LHC transfer lines will undergo important modifications. The changes will allow these collimators to cope with beam brightness and intensity levels much higher than their original design parameters (nominal and ultimate LHC). The need to replace the current materials will be confirmed by a test in the High Radiation to Materials (HRM) facility at CERN. This test will involve low-Z materials (such as graphite and 3-D carbon/carbon composite), and will recreate the worst-case scenario those materials could see when directly impacted by High-Luminosity LHC (HL-LHC) or Batch Compression Merging and Splitting (BCMS) beams. Thermo-structural simulations used for the material studies and research, the experiment preparation phase, the experiment itself, pre-irradiation analysis (including ultrasound and metrology tests on the target materials), the resul...

  3. Computation for LHC experiments: a worldwide computing grid; Le calcul scientifique des experiences LHC: une grille de production mondiale

    Energy Technology Data Exchange (ETDEWEB)

    Fairouz, Malek [Universite Joseph-Fourier, LPSC, CNRS-IN2P3, Grenoble I, 38 (France)

    2010-08-15

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. Processing all the resulting experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 bytes per second and a recording capacity of a few tens of 10^15 bytes each year. To meet this challenge, a computing network implying the dispatch and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier 1 sites. A Tier 1 is typically a national centre, responsible for keeping a copy of the raw data and for processing it to recover relevant data with a physical meaning, transferring the results to the 150 Tier 2 sites. A Tier 2 operates at the level of an institute or laboratory and is in charge of the final analysis of the data and of the production of simulations. Tier 3 sites, at the level of the laboratories, provide a complementary local resource to Tier 2 for data analysis. (A.C.)
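The four-tier structure can be summarised as data, with a purely illustrative dispatch routine (site counts taken from the abstract; the round-robin routing is an assumption for the example, not the actual W-LCG placement policy):

```python
# Tier roles and counts as described above (Tier 3 count is site-dependent).
TIERS = {
    "Tier 0": {"count": 1,    "role": "CERN: record raw LHC data, dispatch to Tier 1"},
    "Tier 1": {"count": 11,   "role": "national centres: custodial copy, reconstruction"},
    "Tier 2": {"count": 150,  "role": "institutes: final analysis, simulation production"},
    "Tier 3": {"count": None, "role": "laboratories: local complement to Tier 2"},
}

def dispatch_raw(n_files, n_sites=11):
    """Toy round-robin spread of raw files from Tier 0 over Tier 1 sites."""
    shares = {i: [] for i in range(n_sites)}
    for f in range(n_files):
        shares[f % n_sites].append(f)
    return shares

shares = dispatch_raw(33)
print(len(shares[0]))   # -> 3 (33 files spread evenly over 11 sites)
```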

  4. The LHC beam commissioning

    International Nuclear Information System (INIS)

    Redaelli, S.; Bailey, R.

    2008-01-01

    The plans for the LHC proton beam commissioning are presented. A staged commissioning approach is proposed to satisfy the requests of the LHC experiments while minimizing the machine complexity in the early commissioning phases. Machine protection and collimation aspects will be tackled progressively as the performance is pushed to higher beam intensities. The key parameters are the number of bunches, k_b, the proton intensity per bunch, N, and the β* at the various interaction points. Together these parameters determine the total beam power and the complexity of the machine. We present the proposed trade-off between the evolution of these parameters and the LHC luminosity performance.

  5. The detector safety system for LHC experiments

    CERN Document Server

    Schmeling, Sascha; Lüders, S; Morpurgo, Giulio

    2004-01-01

    The Detector Safety System (DSS), currently being developed at CERN under the auspices of the Joint Controls Project (JCOP), will be responsible for assuring the protection of equipment for the four Large Hadron Collider (LHC) experiments. Thus, the DSS will require a high degree of both availability and reliability. After evaluation of various possible solutions, a prototype is being built based on a redundant Siemens PLC front-end, to which the safety-critical part of the DSS task is delegated. This is then supervised by a PVSS SCADA system via an OPC server. The PLC front-end is capable of running autonomously and of automatically taking predefined protective actions whenever required. The supervisory layer provides the operator with a status display and with limited online reconfiguration capabilities. Configuration of the code running in the PLCs will be completely data driven via the contents of a "configuration database." Thus, the DSS can easily adapt to the different and constantly ev...

  6. Using widgets to monitor the LHC experiments

    CERN Document Server

    Gonzalez Caballero, Isidro

    2011-01-01

    The complexity of the LHC experiments requires monitoring systems to verify the correct functioning of the different sub-systems and to allow operators to quickly spot problems and issues that may cause loss of information and data. Due to the distributed nature of the collaborations and the different technologies involved, the information that needs to be correlated is usually spread over several databases, web pages and monitoring systems. On the other hand, although the complete set of monitorable aspects is known and fixed, the subset that each person needs to monitor is often different for each individual. Therefore, building a unique monitoring tool that suits every single collaborator becomes close to impossible. A modular approach with a set of customizable widgets, small autonomous portions of HTML and JavaScript that can be aggregated to form private or public monitoring web pages, can be a scalable and robust solution, where the information can be provided by a simple and thin set of web services. ...

  7. CKM angle measurements and New Physics at LHCb

    International Nuclear Information System (INIS)

    Musy, M.

    2008-01-01

    This paper reviews the main characteristics of the future measurements of the unitarity triangle by the LHCb experiment at the Large Hadron Collider (LHC), and the expected achievable precision. The LHCb experiment will be able to exploit a wide range of B-meson decays, allowing for the possibility of early indications of New Physics.

  8. LHCf experiment: forward physics at LHC for cosmic rays study

    Directory of Open Access Journals (Sweden)

    Del Prete M.

    2016-01-01

    The LHCf experiment, optimized for the study of forward physics at the LHC, completes its main physics program in 2015 with proton-proton collisions at an energy of 13 TeV. LHCf gives important results on the study of neutral particles at extreme pseudo-rapidity, both for proton-proton and for proton-ion interactions. These results are an important reference for tuning the hadronic-interaction models currently used to simulate the atmospheric showers induced by very-high-energy cosmic rays. The results of this analysis and the future perspectives are presented in this paper.

  9. Discovery of the Higgs boson by the ATLAS and CMS experiments at the LHC

    CERN Document Server

    Wang, HaiChen

    2014-01-01

    The Standard Model (SM) Higgs boson was predicted by theorists in the 1960s during the development of the electroweak theory. Prior to the startup of the CERN Large Hadron Collider (LHC), experimental searches found no evidence of the Higgs boson. In July 2012, the ATLAS and CMS experiments at the LHC reported the discovery of a new boson in their searches for the SM Higgs boson. Subsequent experimental studies have revealed the spin-0 nature of this new boson and found its couplings to SM particles consistent with those of a Higgs boson. These measurements confirmed that the newly discovered boson is indeed a Higgs boson. More measurements will be performed to compare the properties of the Higgs boson with the SM predictions.

  10. Status and prospects from the LHC

    International Nuclear Information System (INIS)

    Hawkings, Richard

    2010-01-01

    This article reviews the status of the CERN Large Hadron Collider and associated experiments as of July 2010. After a brief discussion of the progress in accelerator and experiment commissioning, the LHC physics landscape is presented, together with a selection of the experimental results achieved so far. Finally the prospects for the 2010-11 LHC physics run are reviewed, with an emphasis on possible discoveries in the Higgs and supersymmetry sectors.

  11. Chasseurs de Higgs au LHC - À la Recherche des Origines

    CERN Multimedia

    Yves Sirois

    To increase understanding of the LHC, why scientists collaborate on this experiment and what they hope to achieve with the LHC. A 51 slide presentation in French for a general audience. Delivered at the "Cité des Sciences" in Paris, "Rencontres du Ciel et de l'Espace," November, 2010 This presentation covers the following topics: - The LHC --what it is --what it looks like --where it is located --the international nature of CERN & experiment collaborations --the experiments - Accelerators --a brief history on accelerators --what accelerators can do - The scientific goals of the LHC - Particle Physics in General --history & the basics - Impact on Technology and Society - First LHC Results - Concluding remarks

  12. Physics Capabilities of the ATLAS Experiment in Pb+Pb Collisions at the LHC

    CERN Document Server

    Derendarz, D; The ATLAS collaboration

    2010-01-01

    Relativistic heavy-ion collisions at the LHC will uncover the properties of the hot and dense medium formed at collision energies thirty times larger than those presently available at RHIC. ATLAS is one of three experiments participating in the heavy-ion program at the LHC. A brief overview is presented of the variety of observables which will be measured by ATLAS to study soft and hard QCD phenomena in the heavy-ion environment. In particular, the detector will measure global observables like charged-particle multiplicity, azimuthal anisotropy and energy flow. The detector also provides an excellent capability to probe the quark-gluon plasma through measurements of high-energy jets and photons as well as quarkonia states. The performance of the high-granularity calorimeter, silicon tracking detector and muon spectrometer in heavy-ion collisions is reported. The unique ATLAS potential to study Pb+Pb interactions is discussed.

  13. Designing a future Conditions Database based on LHC experience

    CERN Document Server

    Formica, Andrea; The ATLAS collaboration; Gallas, Elizabeth; Govi, Giacomo; Lehmann Miotto, Giovanna; Pfeiffer, Andreas

    2015-01-01

    The ATLAS and CMS Conditions Database infrastructures have served each of the respective experiments well through LHC Run 1, providing efficient access to a wide variety of conditions information needed in online data taking and offline processing and analysis. During the long shutdown between Run 1 and Run 2, we have taken various measures to improve our systems for Run 2. In some cases, a drastic change was not possible because of the relatively short time scale to prepare for Run 2. In this process, and in the process of comparing to the systems used by other experiments, we realized that for Run 3, we should consider more fundamental changes and possibilities. We seek changes which would streamline conditions data management, improve monitoring tools, better integrate the use of metadata, incorporate analytics to better understand conditions usage, as well as investigate fundamental changes in the storage technology, which might be more efficient while minimizing maintenance of the data as well as simplif...

  14. First Experience with the LHC Cryogenic Instrumentation

    CERN Document Server

    Vauthier, N; Balle, Ch; Casas-Cubillos, J; Ciechanowski, M; Fernandez-Penacoba, G; Fortescue-Beck, E; Gomes, P; Jeanmonod, N; Lopez-Lorente, A; Suraci, A

    2008-01-01

    The LHC, under commissioning at CERN, will be the world's largest superconducting accelerator and therefore makes extensive use of cryogenic instruments. These instruments are installed in the tunnel and have to withstand the LHC environment, which imposes radiation-tolerant design and construction. Most of the instruments require individual calibration; some of them exhibit several variants as concerns measuring span; all relevant data are therefore stored in an Oracle® database. Those data are used for the various quality assurance procedures defined for installation and commissioning, as well as for generating tables used by the control system to automatically configure the input/output channels. This paper describes the commissioning of the sensors and the corresponding electronics, and the first measurement results during the cool-down of one machine sector; it discusses the different problems encountered and their corresponding solutions.

  15. Anatomy of the inert two-Higgs-doublet model in the light of the LHC and non-LHC dark matter searches

    Science.gov (United States)

    Belyaev, Alexander; Cacciapaglia, Giacomo; Ivanov, Igor P.; Rojas-Abatte, Felipe; Thomas, Marc

    2018-02-01

    The inert two-Higgs-doublet model (i2HDM) is a theoretically well-motivated example of a minimal consistent dark matter (DM) model which provides monojet, mono-Z , mono-Higgs, and vector-boson-fusion +ETmiss signatures at the LHC, complemented by signals in direct and indirect DM search experiments. In this paper we have performed a detailed analysis of the constraints in the full five-dimensional parameter space of the i2HDM, coming from perturbativity, unitarity, electroweak precision data, Higgs data from the LHC, DM relic density, direct/indirect DM detection, and LHC monojet analysis, as well as implications of experimental LHC studies on disappearing charged tracks relevant to the high-DM-mass region. We demonstrate the complementarity of the above constraints and present projections for future LHC data and direct DM detection experiments to probe further i2HDM parameter space. The model is implemented into the CalcHEP and micrOMEGAs packages, which are publicly available at the HEPMDB database, and it is ready for further exploration in the context of the LHC, relic density, and DM direct detection.
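The complementarity of constraints described above amounts to intersecting independent cuts in the model's parameter space. A toy sketch of such a combined scan, in which the predicates and numbers are placeholders rather than the real i2HDM constraints:

```python
# Toy illustration of combining independent constraints in a parameter scan.
# The predicates below are placeholders, not the actual i2HDM constraints.
def survives(point, constraints):
    return all(c(point) for c in constraints)

perturbativity = lambda p: abs(p["lam345"]) < 4 * 3.14159   # |coupling| < 4*pi
relic_density  = lambda p: p["omega_h2"] <= 0.12            # upper bound only
direct_detect  = lambda p: p["sigma_SI"] < 1e-45            # toy cross-section cut (cm^2)

points = [
    {"lam345": 0.1,  "omega_h2": 0.11, "sigma_SI": 5e-46},
    {"lam345": 20.0, "omega_h2": 0.11, "sigma_SI": 5e-46},  # fails perturbativity
    {"lam345": 0.1,  "omega_h2": 0.30, "sigma_SI": 5e-46},  # fails relic density
]
allowed = [p for p in points
           if survives(p, [perturbativity, relic_density, direct_detect])]
# only the first point survives all cuts
```

Each added constraint carves away a different region, which is why the combination is far more restrictive than any single one.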

  16. Gravitino LSP scenario at the LHC

    International Nuclear Information System (INIS)

    Heisig, Jan

    2010-05-01

    In this thesis we discuss the phenomenology of the gravitino LSP scenario at the Large Hadron Collider (LHC) experiment. We concentrate on a long-lived stau NLSP, which gives rise to a prominent signature in the LHC detector as a 'slow muon'. We discuss the production channels and compute the cross sections for direct production via the Drell-Yan process. On this basis we give a conservative estimate of the discovery potential for this scenario at the LHC. (orig.)

  17. First beam splashes at the LHC

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    After a two-year shutdown, the first beams of Run 2 circulated in the LHC last Sunday. On Tuesday, the LHC operators performed dedicated runs to allow some of the experiments to record their first signals coming from particles splashed out when the circulating beams hit the collimators. Powerful reconstruction software then transforms the electronic signals into colourful images.     “Splash” events are used by the experiments to test their numerous subdetectors and to synchronise them with the LHC clock. These events are recorded when the path of particles travelling in the LHC vacuum pipe is intentionally obstructed using collimators – one-metre-long graphite or tungsten jaws that are also used to catch particles that wander too far from the beam centre and to protect the accelerator against unavoidable regular and irregular beam losses. The particles sprayed out of the collision between the beam and the collimators are mostly muons. ATLAS and CMS&...

  18. CVD diamond pixel detectors for LHC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wedenig, R.; Adam, W.; Bauer, C.; Berdermann, E.; Bergonzo, P.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; Dabrowski, W.; Delpierre, P.; Deneuville, A.; Dulinski, W.; Eijk, B. van; Fallou, A.; Fizzotti, F.; Foulon, F.; Friedl, M.; Gan, K.K.; Gheeraert, E.; Grigoriev, E.; Hallewell, G.; Hall-Wilton, R.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kania, D.; Kaplon, J.; Karl, C.; Kass, R.; Knoepfle, K.T.; Krammer, M.; Logiudice, A.; Lu, R.; Manfredi, P.F.; Manfredotti, C.; Marshall, R.D.; Meier, D.; Mishina, M.; Oh, A.; Pan, L.S.; Palmieri, V.G.; Pernicka, M.; Peitz, A.; Pirollo, S.; Polesello, P.; Pretzl, K.; Procario, M.; Re, V.; Riester, J.L.; Roe, S.; Roff, D.; Rudge, A.; Runolfsson, O.; Russ, J.; Schnetzer, S.; Sciortino, S.; Speziali, V.; Stelzer, H.; Stone, R.; Suter, B.; Tapper, R.J.; Tesarek, R.; Trawick, M.; Trischuk, W.; Vittone, E.; Wagner, A.; Walsh, A.M.; Weilhammer, P.; White, C.; Zeuner, W.; Ziock, H.; Zoeller, M.; Blanquart, L.; Breugnion, P.; Charles, E.; Ciocio, A.; Clemens, J.C.; Dao, K.; Einsweiler, K.; Fasching, D.; Fischer, P.; Joshi, A.; Keil, M.; Klasen, V.; Kleinfelder, S.; Laugier, D.; Meuser, S.; Milgrome, O.; Mouthuy, T.; Richardson, J.; Sinervo, P.; Treis, J.; Wermes, N

    1999-08-01

    This paper reviews the development of CVD diamond pixel detectors. The preparation of the diamond pixel sensors for bump-bonding to the pixel readout electronics for the LHC and the results from beam tests carried out at CERN are described.

  19. CVD diamond pixel detectors for LHC experiments

    International Nuclear Information System (INIS)

    Wedenig, R.; Adam, W.; Bauer, C.; Berdermann, E.; Bergonzo, P.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; Dabrowski, W.; Delpierre, P.; Deneuville, A.; Dulinski, W.; Eijk, B. van; Fallou, A.; Fizzotti, F.; Foulon, F.; Friedl, M.; Gan, K.K.; Gheeraert, E.; Grigoriev, E.; Hallewell, G.; Hall-Wilton, R.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kania, D.; Kaplon, J.; Karl, C.; Kass, R.; Knoepfle, K.T.; Krammer, M.; Logiudice, A.; Lu, R.; Manfredi, P.F.; Manfredotti, C.; Marshall, R.D.; Meier, D.; Mishina, M.; Oh, A.; Pan, L.S.; Palmieri, V.G.; Pernicka, M.; Peitz, A.; Pirollo, S.; Polesello, P.; Pretzl, K.; Procario, M.; Re, V.; Riester, J.L.; Roe, S.; Roff, D.; Rudge, A.; Runolfsson, O.; Russ, J.; Schnetzer, S.; Sciortino, S.; Speziali, V.; Stelzer, H.; Stone, R.; Suter, B.; Tapper, R.J.; Tesarek, R.; Trawick, M.; Trischuk, W.; Vittone, E.; Wagner, A.; Walsh, A.M.; Weilhammer, P.; White, C.; Zeuner, W.; Ziock, H.; Zoeller, M.; Blanquart, L.; Breugnion, P.; Charles, E.; Ciocio, A.; Clemens, J.C.; Dao, K.; Einsweiler, K.; Fasching, D.; Fischer, P.; Joshi, A.; Keil, M.; Klasen, V.; Kleinfelder, S.; Laugier, D.; Meuser, S.; Milgrome, O.; Mouthuy, T.; Richardson, J.; Sinervo, P.; Treis, J.; Wermes, N.

    1999-01-01

    This paper reviews the development of CVD diamond pixel detectors. The preparation of the diamond pixel sensors for bump-bonding to the pixel readout electronics for the LHC and the results from beam tests carried out at CERN are described

  20. Quality assurance for CORAL and COOL within the LCG software stack for the LHC experiments

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    CORAL and COOL are software packages used by the LHC experiments for managing different categories of physics data using a variety of relational database technologies. The core components are written in C++, but Python bindings are also provided. CORAL is a generic relational access layer, while COOL includes the implementation of a specific relational data model and optimization of SQL queries for "conditions data". The software is the result of more than 10 years of development in collaboration between the IT department and the LHC experiments. The packages are built and released within the LCG software stack, for which automatic nightly builds and release installations are provided by PH-SFT (cmake, jenkins, cdash) for many different platforms, compilers and software version configurations. Test-driven development and functional tests of both C++ and Python components (CppUnit, unittest) have been key elements in the success of the projects. Dedicated test suites have also been prepared to commission and ma...
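COOL's "conditions data" model associates payloads with intervals of validity (IOVs): a lookup at a given time returns the most recent payload valid at that time. A minimal, illustrative IOV lookup with a unittest-style check in the spirit of the test-driven development mentioned above; this sketch does not use the real COOL API:

```python
# Illustrative IOV (interval-of-validity) lookup with a unit test,
# in the style of the test-driven development described in the abstract.
import bisect
import unittest

class ConditionsFolder:
    def __init__(self):
        self._since = []    # sorted IOV start times
        self._payload = []  # payload valid from the matching start time onwards

    def store(self, since, payload):
        i = bisect.bisect_left(self._since, since)
        self._since.insert(i, since)
        self._payload.insert(i, payload)

    def retrieve(self, time):
        # latest interval whose start time is <= the requested time
        i = bisect.bisect_right(self._since, time) - 1
        if i < 0:
            raise KeyError("no condition valid at %r" % time)
        return self._payload[i]

class TestConditionsFolder(unittest.TestCase):
    def test_lookup_picks_latest_valid_interval(self):
        f = ConditionsFolder()
        f.store(0, "calib-v1")
        f.store(100, "calib-v2")
        self.assertEqual(f.retrieve(50), "calib-v1")
        self.assertEqual(f.retrieve(100), "calib-v2")
```

The test class can be run with `python -m unittest` against the module containing it.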

  1. Integrated monitoring of multi-domain backbone connections Operational experience in the LHC optical private network

    CERN Document Server

    Marcu, Patricia; Fritz, Wolfgang; Yampolskiy, Mark; Hommel, Wolfgang

    2011-01-01

    Novel large-scale research projects often require cooperation between many different project partners spread across the entire world. They need not only huge computing resources, but also a reliable network to operate on. The Large Hadron Collider (LHC) at CERN is a representative example of such a project. Its experiments produce a vast amount of data, which is of interest to researchers around the world. For transporting the data from CERN to 11 data processing and storage sites, an optical private network (OPN) has been constructed. As the experiment data is highly valuable, the LHC places very high requirements on the underlying network infrastructure. In order to fulfil those requirements, the connections have to be managed and monitored permanently. In this paper, we present the integrated monitoring solution developed for the LHCOPN. We first outline the requirements and show how they are met on the single network layers. After that, we describe, how those single measurements can be comb...

  2. Future wishes and constraints from the experiments at the LHC for the proton-proton programme

    CERN Document Server

    Jacobsson, R.

    2014-01-01

    Hosting six different experiments at four different interaction points, with widely different requirements for the running conditions, the LHC machine has been faced with a long list of challenges in its first three years of luminosity production, 2010 – 2012 (Run 1), many of which were potentially capable of limiting the performance due to instabilities resulting from the extremely high bunch brightness. Nonetheless, the LHC met the challenges and performed extremely well: at high efficiency, routinely with beam brightness at twice the design value, well over 1/3 of the time in collision for physics, average luminosity lifetimes in excess of 10 h, and extremely good background conditions in the experiments. While the experimental running configurations remain largely the same for the future high-luminosity proton-proton operational mode, the energy and the luminosity should increase significantly, making a prior assessment of related beam-beam effects extremely important to guarantee high performance. Of particular in...

  3. 25th anniversary of the Large Hadron Collider (LHC) experimental programme

    CERN Multimedia

    AUTHOR|(CDS)2094367

    2017-01-01

    On Friday 15 December 2017, CERN celebrated the 25th anniversary of the Large Hadron Collider (LHC) experimental programme. The occasion was marked with a special scientific symposium looking at the LHC’s history, the physics landscape into which the LHC experiments were born, and the challenging path that led to the very successful LHC programme we know today. The anniversary was linked to a meeting that took place in 1992, in Evian, entitled "Towards the LHC Experimental Programme", marking a crucial milestone in the design and development of the LHC experiments.

  4. "Big Science: the LHC in Pictures" in the Globe

    CERN Multimedia

    2008-01-01

    An exhibition of spectacular photographs of the LHC and its experiments is about to open in the Globe. The LHC and its four experiments are not only huge in size but also uniquely beautiful, as the exhibition "Big Science: the LHC in Pictures" in the Globe of Science and Innovation will show. The exhibition features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. These giant pictures reflecting the immense scale of the LHC and the mysteries of the Universe it is designed to uncover fill the Globe with shape and colour. The exhibition, which will open on 4 March, is divided into six different themes: CERN, the LHC and the four experiments ATLAS, LHCb, CMS and ALICE. Facts about all these subjects will be available at information points and in an explanatory booklet accompanying the exhibition (which visitors will be able to buy if they wish to take it home with them). Globe of Science and Innovatio...

  5. CVD diamond pixel detectors for LHC experiments

    CERN Document Server

    Wedenig, R; Bauer, C; Berdermann, E; Bergonzo, P; Bogani, F; Borchi, E; Brambilla, A; Bruzzi, Mara; Colledani, C; Conway, J; Dabrowski, W; Delpierre, P A; Deneuville, A; Dulinski, W; van Eijk, B; Fallou, A; Fizzotti, F; Foulon, F; Friedl, M; Gan, K K; Gheeraert, E; Grigoriev, E; Hallewell, G D; Hall-Wilton, R; Han, S; Hartjes, F G; Hrubec, Josef; Husson, D; Kagan, H; Kania, D R; Kaplon, J; Karl, C; Kass, R; Knöpfle, K T; Krammer, Manfred; Lo Giudice, A; Lü, R; Manfredi, P F; Manfredotti, C; Marshall, R D; Meier, D; Mishina, M; Oh, A; Pan, L S; Palmieri, V G; Pernicka, Manfred; Peitz, A; Pirollo, S; Polesello, P; Pretzl, Klaus P; Procario, M; Re, V; Riester, J L; Roe, S; Roff, D G; Rudge, A; Runólfsson, O; Russ, J; Schnetzer, S R; Sciortino, S; Speziali, V; Stelzer, H; Stone, R; Suter, B; Tapper, R J; Tesarek, R J; Trawick, M L; Trischuk, W; Vittone, E; Wagner, A; Walsh, A M; Weilhammer, Peter; White, C; Zeuner, W; Ziock, H J; Zöller, M

    1999-01-01

    This paper reviews the development of CVD diamond pixel detectors. The preparation of the diamond pixel sensors for bump-bonding to the pixel readout electronics for the LHC and the results from beam tests carried out at CERN are described. (9 refs).

  6. Scenarios for the LHC Upgrade

    CERN Document Server

    Scandale, Walter

    2008-01-01

    The projected lifetime of the LHC low-beta quadrupoles, the evolution of the statistical error halving time, and the physics potential all call for an LHC luminosity upgrade by the middle of the coming decade. In the framework of the CARE-HHH network three principal scenarios have been developed for increasing the LHC peak luminosity by more than a factor of 10, to values above 10³⁵ cm⁻²s⁻¹. All scenarios imply a rebuilding of the high-luminosity interaction regions (IRs) in combination with a consistent change of beam parameters. However, their respective features, bunch structures, IR layouts, merits and challenges, and luminosity variation with β∗ differ substantially. In all scenarios luminosity leveling during a store would be advantageous for the physics experiments. An injector upgrade must complement the upgrade measures in the LHC proper in order to provide the beam intensity and brightness needed as well as to reduce the LHC turnaround time for higher integrated luminosity.

  7. Gravitino LSP scenario at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Heisig, Jan

    2010-06-15

    In this thesis we discuss the phenomenology of the gravitino LSP scenario at the Large Hadron Collider (LHC) experiment. We concentrate on a long-lived stau NLSP, which gives rise to a prominent signature in the LHC detector as a 'slow muon'. We discuss the production channels and compute the cross sections for direct production via the Drell-Yan process. On this basis we give a conservative estimate of the discovery potential for this scenario at the LHC. (orig.)

  8. Fire and Gas Detection in the LHC Experiments The Sniffer Project

    CERN Document Server

    Nunes, R W

    2001-01-01

    The LHC experiments, due to their complexity and size, present many safety challenges. Cryogenic gases are used in large quantities as well as certain flammable mixtures. The electrical power involved calls for analysis of the fire risks. Access is restricted to the minimum and environmental conditions are extremely harsh, due to strong magnetic fields and ionising radiation. This paper will describe the Combined Fire/Gas/Oxygen deficiency Detection systems proposed for inside the ATLAS and CMS Experiments and possibly for the two others, if they deem it necessary. The requirements of the experiments and the development and implementation of such a system will be discussed. In parallel, commercial procedures to implement these systems by industry shall be described, taking into consideration that a previous development has already been undertaken by CERN for the LEP experiments. The stage is set for inter-divisional collaboration in a project of utmost importance for the safety of people and protection of the...

  9. LHC detectors trigger/DAQ at LHC

    CERN Document Server

    Sphicas, Paris

    1998-01-01

    At its design luminosity, the LHC will deliver hundreds of millions of proton-proton interactions per second. Storage and computing constraints limit the number of physics events that can be recorded to about 100 per second. The selection will be carried out by the trigger and data acquisition systems of the experiments. This lecture will review the requirements, architectures and various designs currently considered.
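The rate reduction described above is achieved by successive selection stages: a fast first-level decision followed by a slower, more refined software selection. A toy two-stage event filter (the event fields and thresholds are invented for the example, not any experiment's actual trigger menu):

```python
# Toy two-stage trigger chain illustrating the rate reduction described above.
# Event fields and thresholds are invented for illustration.
def level1(event):
    """Fast, hardware-like cut on coarse quantities."""
    return event["et_max"] > 20.0

def high_level(event):
    """Slower software selection with full event information."""
    return event["n_tracks"] >= 2 and event["et_max"] > 30.0

def select(events):
    # Only events passing every stage are written to storage.
    return [e for e in events if level1(e) and high_level(e)]

events = [
    {"et_max": 35.0, "n_tracks": 4},   # passes both stages
    {"et_max": 25.0, "n_tracks": 5},   # fails the high-level ET cut
    {"et_max": 10.0, "n_tracks": 1},   # rejected already at level 1
]
kept = select(events)
```

In reality the first stage runs in custom hardware within microseconds and the later stages on a computing farm; the cascade shape, however, is the same.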

  10. Xrootd data access for LHC experiments at the INFN-CNAF Tier-1

    International Nuclear Information System (INIS)

    Gregori, Daniele; Prosperini, Andrea; Ricci, Pier Paolo; Sapunenko, Vladimir; Boccali, Tommaso; Noferini, Francesco; Vagnoni, Vincenzo

    2014-01-01

    The Mass Storage System installed at the INFN-CNAF Tier-1 is one of the biggest hierarchical storage facilities in Europe. It currently provides storage resources for about 12% of all LHC data, as well as for other experiments. The Grid Enabled Mass Storage System (GEMSS) is the current solution implemented at CNAF; it is based on a custom integration between a high-performance parallel file system (General Parallel File System, GPFS) and a tape management system for long-term storage on magnetic media (Tivoli Storage Manager, TSM). Data access for Grid users has been provided for several years by the Storage Resource Manager (StoRM), an implementation of the standard SRM interface widely adopted within the WLCG community. The evolving requirements from the LHC experiments and other users are leading to the adoption of more flexible methods for accessing the storage. These include the implementation of so-called storage federations, i.e. geographically distributed federations allowing direct file access to the federated storage between sites. A specific integration between GEMSS and Xrootd has been developed at CNAF to match the requirements of the CMS experiment. This had already been implemented for the ALICE use case, using ad-hoc Xrootd modifications. The new developments for CMS have been validated and are already available in the official Xrootd builds. This integration is currently in production and appropriate large-scale tests have been carried out. In this paper we present the Xrootd solutions adopted for ALICE, CMS, ATLAS and LHCb to increase the availability and optimize the overall performance.
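In an Xrootd storage federation, a client addresses a logical file name through a redirector, which resolves it to whichever federated site holds the file. A minimal sketch of the URL construction (the redirector host name is made up for illustration; `root://` with a double slash before the absolute path is the standard Xrootd URL form):

```python
# Sketch: building an Xrootd URL for federated access to a logical file name
# (LFN). The redirector host below is a made-up example.
def xrootd_url(redirector: str, lfn: str) -> str:
    # Xrootd URLs use a double slash between the host and the absolute path.
    return "root://%s//%s" % (redirector, lfn.lstrip("/"))

url = xrootd_url("xrootd.example.org:1094", "/store/data/run123/file.root")
# A client opening this URL is redirected to a site that actually holds the
# file, which is what allows direct file access "between sites".
```

Port 1094 is the conventional Xrootd service port.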

  11. Search for new neutral gauge bosons with the CMS Experiment at the LHC

    Science.gov (United States)

    Lanyov, Alexander; Shmatov, Sergei; Zhizhin, Ilia

    2018-04-01

    A search for narrow resonances in dimuon invariant mass spectra has been performed using 13 fb⁻¹ of data obtained in 2016 from proton-proton collisions at √s = 13 TeV with the CMS experiment at the LHC. No evidence for physics beyond the standard model is found. Limits have been set on the production cross section and on the masses of hypothetical particles that could appear in new-physics scenarios.

  12. TOTEM and LHCf: refinements for the restart

    CERN Multimedia

    2009-01-01

    Following the previous two issues, the Bulletin continues its series to find out what the six LHC experiments have been up to since last September, and how they are preparing for the restart. We covered CMS, ATLAS, LHCb and ALICE in previous issues. In this issue we will round up the past 10 months of activity at TOTEM and LHCf. Roman Pots of the TOTEM experiment. TOTEM: The past 10 months at TOTEM have been amongst the busiest since the project’s inception. The delay in the LHC startup has certainly had a silver lining for the TOTEM collaboration - not only has it given them a much-needed opportunity to test and install many crucial new detector parts, but also the lower energy range that the LHC will initially operate at in 2009 is perfect for TOTEM physics. "In fact, the LHC almost seems to be following the schedule of TOTEM!" jokes Karsten Eggert, TOTEM spokesperson. TOTEM is made up of three different detectors spread out...

  13. Commissioning of the LHC

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    The LHC construction is now approaching the end, and it is time to prepare for commissioning with beam. The behavior of a proton storage ring is very different from that of LEP, which profited from strong radiation damping to keep the beam stable. Our last experience with a hadron collider at CERN goes back more than 15 years, to when the proton-antiproton collider last operated. The ppbar collider taught us a lot about the machine physics of bunched-beam proton storage rings and was essential input for the design of the LHC. After a short presentation of where we stand today with machine installation and hardware commissioning, I will discuss the main machine physics issues that will have to be dealt with in the LHC.

  14. LHC 2008 lectures
    The LHC: an accelerator of science

    CERN Multimedia

    2008-01-01

    In 2008, CERN will be switching on the greatest physics experiment ever undertaken. The Large Hadron Collider, or LHC, is a particle accelerator that will provide many answers to our questions about the Universe - What is the reason for mass? Where is the invisible matter in the Universe hiding? What is the relationship between matter and antimatter? Will we have to use a theory claiming more than four dimensions? … and what about "time" ? To understand better the raison d’être of the LHC, this gigantic, peerless scientific instrument and all the knowledge it can bring to us, members of the general public are invited to a series of lectures at the Globe of Science and Innovation. Thursday 8 May 2008 at 8.00 p.m. « Comment fonctionne l’Univers ? Ce que le LHC peut nous apprendre » Alvaro de Rujula, CERN physicist Thursday 15 May 2008 at 8.00 p.m. – « Une nouvelle vision du monde » Jean-Pierre Luminet, Director of...

  15. Quarkonium Physics at a Fixed-Target Experiment Using the LHC Beams

    Energy Technology Data Exchange (ETDEWEB)

    Lansberg, J.P.; /Orsay, IPN; Brodsky, S.J.; /SLAC; Fleuret, F.; /Ecole Polytechnique; Hadjidakis, C.; /Orsay, IPN

    2012-04-09

    We outline the many quarkonium-physics opportunities offered by a multi-purpose fixed-target experiment using the p and Pb LHC beams extracted by a bent crystal. This provides an integrated luminosity of 0.5 fb⁻¹ per year on a typical 1 cm-long target. Such an extraction mode does not alter the performance of the collider experiments at the LHC. With such a high luminosity, one can analyse quarkonium production in great detail in pp, pd and pA collisions at √s_NN ≈ 115 GeV and at √s_NN ≈ 72 GeV in PbA collisions. In a typical pp (pA) run, the obtained quarkonium yields per unit of rapidity are 2-3 orders of magnitude larger than those expected at RHIC and about 10 (70) times larger, respectively, than for ALICE. In PbA, they are comparable. By instrumenting the target-rapidity region, the large negative-x_F domain can be accessed for the first time, greatly extending previous measurements by HERA-B and E866. Such analyses should help resolve the quarkonium-production controversies and clear the way for gluon PDF extraction via quarkonium studies. The nuclear target-species versatility provides a unique opportunity to study nuclear matter and the features of the hot and dense matter formed in PbA collisions. A polarised proton target allows the study of transverse-spin asymmetries in J/ψ and Υ production, providing access to the gluon and charm Sivers functions.

  16. Japanese contributions to CERN-LHC

    International Nuclear Information System (INIS)

    Kondo, Takahiko; Shintomi, Takakazu; Kimura, Yoshitaka

    2001-01-01

    The Large Hadron Collider (LHC) is now under construction at CERN, Geneva, to carry out frontier research in particle physics. The LHC is the biggest superconducting accelerator, using the most advanced cryogenics and applied superconductivity. The accelerator and the large-scale detectors for particle physics experiments are being constructed in collaboration with European countries and also with the participation of non-CERN countries worldwide. In 1995, the Japanese government decided to take a share in the LHC project with funding and technological contributions. KEK contributes to the development of low-beta insertion superconducting quadrupole magnets and of components of the ATLAS detector in collaboration with university groups. Some Japanese companies have received contracts for technically key elements such as superconducting cable, cold compressors, nonmagnetic steel, polyimide film, and so on. An outline of the LHC project and the Japanese contributions are described. (author)

  17. Object Oriented Approach to Software Development for LHC Experiments

    CERN Multimedia

    Tummers, B J; Day, C; Innocente, V; Mount, R; Visser, E; Burnett, T H; Balke, C

    2002-01-01

    RD41: We propose to study the viability of the Object Oriented (OO) approach for developing the code for LHC experiments. The authors of this proposal will learn the key issues of this approach: OO analysis and design. Several methodologies will be studied to select the most appropriate one for the High Energy Physics case. Some Computer Aided Software Engineering tools and implementation languages will be evaluated. These studies will be carried out with various well-defined prototypes, some of which have been defined in a preceding study and some of which will be defined in the course of this R&D project. We also propose to study in this project how the OO approach leads to a different, and hopefully better, project management. Management tools will be tried and professional training will be organized.

  18. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

    In normal operating conditions the LHC detectors are expected to record about 10¹⁰ collisions each year. Processing all the resulting experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10⁹ bytes per second and a recording capacity of a few tens of 10¹⁵ bytes each year. In order to meet this challenge, a computing network for distributing and sharing the tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of four tiers. Tier 0 is the computing centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier-1 centres. A Tier 1 is typically a national centre; it is responsible for keeping a copy of the raw data and for processing it in order to extract physically meaningful data, and for transferring the results to the 150 Tier-2 centres. A Tier 2, at the level of an institute or laboratory, is in charge of the final analysis of the data and of the production of simulations. Tier-3 centres, at the level of the laboratories, provide a complementary local resource to Tier 2 for data analysis. (A.C.)
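The tiered division of labour described above can be captured in a small lookup structure; the tier roles and counts come from the abstract, while the task-routing function is purely illustrative:

```python
# Toy model of the W-LCG tier structure described above.
# Tier roles and counts are from the abstract; routing logic is illustrative.
TIERS = {
    0: {"count": 1,    "role": "record raw data at CERN, dispatch to Tier-1s"},
    1: {"count": 11,   "role": "archive a raw-data copy, first-pass processing"},
    2: {"count": 150,  "role": "final analysis and simulation production"},
    3: {"count": None, "role": "local, complementary analysis resources"},
}

def responsible_tier(task: str) -> int:
    """Map a task category to the tier responsible for it (illustrative)."""
    routing = {
        "raw-recording": 0,
        "first-pass-processing": 1,
        "simulation": 2,
        "local-analysis": 3,
    }
    return routing[task]
```

The point of the hierarchy is that data volume shrinks while the number of sites grows at each step away from the detector.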

  19. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    International Nuclear Information System (INIS)

    Varela Rodriguez, F

    2011-01-01

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise such a large system, identify errors and troubleshoot them. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, only recently has the software package been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and it has already proven to be very efficient in optimizing the running systems and in detecting misbehaving processes or nodes.

  20. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    Science.gov (United States)

    Varela Rodriguez, F.

    2011-12-01

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise such a large system, identify errors and troubleshoot them. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, only recently has the software package been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and it has already proven to be very efficient in optimizing the running systems and in detecting misbehaving processes or nodes.
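At its core, flagging misbehaving nodes reduces to comparing collected metrics against thresholds. An illustrative sketch of that check (not the actual WMI client; the metric names and limits are invented):

```python
# Illustrative sketch (not the actual monitoring tool): flag per-node metrics
# that exceed their configured limits, as a centralized monitor might do.
def check_node(metrics: dict, limits: dict) -> list:
    """Return the sorted list of metric names exceeding their limit."""
    return sorted(name for name, value in metrics.items()
                  if name in limits and value > limits[name])

limits = {"cpu_percent": 90.0, "mem_percent": 85.0, "zombie_processes": 0}
node = {"cpu_percent": 97.5, "mem_percent": 40.0, "zombie_processes": 2}
alarms = check_node(node, limits)
```

A real deployment would gather the metrics via WMI on Windows (or /proc on Linux) and forward alarms to the SCADA layer.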

  1. HL-LHC kicker magnet (MKI)

    CERN Multimedia

    Brice, Maximilien

    2018-01-01

    HL-LHC kicker magnet (MKI): last vacuum test, preparation for transport to LHC transfer line in underground tunnel.The LHC injection kicker systems (MKIs) generate fast field pulses to inject the clockwise rotating beam at Point 2 and the anti-clockwise rotating beam at Point 8: there are eight MKI magnets installed in total. Each MKI magnet contains a high purity alumina tube: if an MKI magnet is replaced this tube requires conditioning with LHC beam: until it is properly conditioned, there can be high vacuum pressure due to the beam. This high pressure can also cause electrical breakdowns in the MKI magnets. A special coating (Cr2O3) has been applied to the inside of the alumina tube of an upgraded MKI magnet – this is expected to greatly reduce the pressure rise with beam. In addition, HL-LHC beam would result in excessive heating of the MKI magnets: the upgraded design includes modifications that will reduce heating, and move the power deposition to parts that will be easier to cool. Experience during 2...

  2. Half way round the LHC

    CERN Multimedia

    CERN Bulletin

    The LHC operations teams are preparing the machine for circulating beams and things are going very smoothly. ALICE and LHCb are getting used to observing particle tracks coming from the LHC beams. During the weekend of 7-8 November, CMS also  saw its first signals from beams dumped just upstream of  the experiment cavern.   Operators in the CMS control room observe the good performance of their detector. Particles are smoothly making their way around the 27 km circumference of the LHC. Last weekend (7-8 November), the first bunches of injection energy protons completed their journey (anti-clockwise) through three octants of the LHC’s circumference and were dumped in a collimator just before entering the CMS cavern. The particles produced by the impact of the protons on the tertiary collimators (used to stop the beam) left their tracks in the calorimeters and the muon chambers of the experiment. The more delicate inner detectors were switched off for protection reasons....

  3. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    Science.gov (United States)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  4. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    International Nuclear Information System (INIS)

    Girone, M; Andreeva, J; Barreiro Megino, F H; Campana, S; Cinquilli, M; Di Girolamo, A; Dimou, M; Giordano, D; Karavakis, E; Kenyon, M J; Kokozkiewicz, L; Lanciotti, E; Litmaath, M; Magini, N; Negri, G; Roiser, S; Saiz, P; Saiz Santos, M D; Schovancova, J; Sciabà, A

    2012-01-01

    After two years of LHC data taking, processing and analysis and with numerous changes in computing technology, a number of aspects of the experiments’ computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also offer lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring and the LCG Persistency Framework. Specific tools have been developed including the HammerCloud framework, automated services for data placement, data cleaning and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS and tools for catalogue/storage consistency), the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring) and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions and how this relates to the overall goals of long-term sustainability and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process, and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  5. Radiation-hard Optoelectronics for LHC detector upgrades.

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00375195; Newbold, Dave

    A series of upgrades foreseen for the LHC over the next decade will allow the proton-proton collisions to reach the design center-of-mass energy of 14 TeV and increase the luminosity to five times the design luminosity (High Luminosity-LHC) by 2027. Radiation-tolerant high-speed optical data transmission links will continue to play an important role in the infrastructure of particle physics experiments over the next decade. A new generation of optoelectronics that meets the increased performance and radiation-tolerance limits imposed by the increase in the intensity of the collisions at the interaction points is currently being developed. This thesis focuses on the development of a general-purpose bi-directional 5 Gb/s radiation-tolerant optical transceiver, the Versatile Transceiver (VTRx), for use by the LHC experiments over the next five years, and on exploring the radiation tolerance of state-of-the-art silicon photonics modulators for HL-LHC data transmission applications. The compliance of the VTRx ...

  6. Abort Gap Cleaning for LHC Run 2

    Energy Technology Data Exchange (ETDEWEB)

    Uythoven, Jan [CERN; Boccardi, Andrea [CERN; Bravin, Enrico [CERN; Goddard, Brennan [CERN; Hemelsoet, Georges-Henry [CERN; Höfle, Wolfgang [CERN; Jacquet, Delphine [CERN; Kain, Verena [CERN; Mazzoni, Stefano [CERN; Meddahi, Malika [CERN; Valuch, Daniel [CERN; Gianfelice-Wendt, Eliana [Fermilab

    2014-07-01

    To minimize the beam losses at the moment of an LHC beam dump the 3 μs long abort gap should contain as few particles as possible. Its population can be minimised by abort gap cleaning using the LHC transverse damper system. The LHC Run 1 experience is briefly recalled; changes foreseen for the LHC Run 2 are presented. They include improvements in the observation of the abort gap population and the mechanism to decide if cleaning is required, changes to the hardware of the transverse dampers to reduce the detrimental effect on the luminosity lifetime and proposed changes to the applied cleaning algorithms.

  7. Abort Gap Cleaning for LHC Run 2

    CERN Document Server

    Uythoven, J; Bravin, E; Goddard, B; Hemelsoet, GH; Höfle, W; Jacquet, D; Kain, V; Mazzoni, S; Meddahi, M; Valuch, D

    2015-01-01

    To minimise the beam losses at the moment of an LHC beam dump the 3 μs long abort gap should contain as few particles as possible. Its population can be minimised by abort gap cleaning using the LHC transverse damper system. The LHC Run 1 experience is briefly recalled; changes foreseen for the LHC Run 2 are presented. They include improvements in the observation of the abort gap population and the mechanism to decide if cleaning is required, changes to the hardware of the transverse dampers to reduce the detrimental effect on the luminosity lifetime and proposed changes to the applied cleaning algorithms.

  8. Silicon Strip Detectors for the ATLAS sLHC Upgrade

    CERN Document Server

    Miñano, M; The ATLAS collaboration

    2011-01-01

    While the Large Hadron Collider (LHC) at CERN is continuing to deliver an ever-increasing luminosity to the experiments, plans for an upgraded machine called Super-LHC (sLHC) are progressing. The upgrade is foreseen to increase the LHC design luminosity by a factor of ten. The ATLAS experiment will need to build a new tracker for sLHC operation, which must be suited to the harsh sLHC conditions in terms of particle rates. In order to cope with the increase in pile-up backgrounds at the higher luminosity, an all-silicon detector is being designed. To successfully face the increased radiation dose, a new generation of extremely radiation-hard silicon detectors is being designed. The left part of figure 1 shows the simulated layout for the ATLAS tracker upgrade to be installed in the volume taken up by the current ATLAS pixel, strip and transition radiation detectors. Silicon sensors with sufficient radiation hardness are the subject of an international R&D programme, working on pixel and strip sensors. The...

  9. A Global Computing Grid for LHC; Una red global de computacion para LHC

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Calama, J. M.; Colino Arriero, N.

    2013-06-01

    An innovative computing infrastructure has played an instrumental role in the recent discovery of the Higgs boson in the LHC and has enabled scientists all over the world to store, process and analyze enormous amounts of data in record time. The Grid computing technology has made it possible to integrate computing center resources spread around the planet, including the CIEMAT, into a distributed system where these resources can be shared and accessed via Internet on a transparent, uniform basis. A global supercomputer for the LHC experiments. (Author)

  10. RooStats: Statistical Tools for the LHC

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    RooStats provides statistical tools for the analysis of LHC data, with emphasis on discoveries, confidence intervals, and combined measurements in both the Bayesian and frequentist approaches. The tools are built on top of the RooFit data modeling language and core ROOT mathematics libraries and persistence technology. These tools have been developed in collaboration with the LHC experiments and used by them to produce numerous physics results, such as the combination of ATLAS and CMS Higgs searches that resulted in a model with more than 200 parameters. We will review new developments included in RooStats and the performance optimizations required to cope with such complex models used by the LHC experiments. We will also show the parallelization capability of these statistical tools using multiple processors via PROOF.
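    The record above describes the tools only in prose. As a toy illustration of the frequentist machinery that RooStats automates, the sketch below scans a profile-likelihood ratio for a single-bin Poisson counting experiment in plain Python; the function names, the grid scan, and the threshold of 1.0 on -2*DeltaLnL (the usual approximate 68% CL cut for one parameter) are illustrative assumptions, not RooStats API.

    ```python
    import math

    def nll(s, n, b):
        # Negative log-likelihood (up to a constant) of observing n events
        # when the expected yield is signal s plus known background b (Poisson).
        mu = s + b
        return mu - n * math.log(mu)

    def pl_interval(n, b, s_max=50.0, step=0.001):
        # Keep every signal strength s whose -2*DeltaLnL stays below 1.0,
        # the approximate 68% CL cut for a single parameter of interest.
        s_hat = max(0.0, n - b)                  # maximum-likelihood estimate
        nll_min = nll(s_hat, n, b)
        grid = [i * step for i in range(int(s_max / step) + 1)]
        inside = [s for s in grid if 2.0 * (nll(s, n, b) - nll_min) < 1.0]
        return min(inside), max(inside)

    # Toy example: 10 events observed over an expected background of 3.
    lo, hi = pl_interval(n=10, b=3.0)
    print(round(lo, 2), round(hi, 2))
    ```

    In RooStats itself the same logic is handled by likelihood calculators operating on a RooFit model, with proper treatment of nuisance parameters; this sketch only shows the interval-by-scan idea.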

  11. LHC Highlights, from dream to reality

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    The idea of the Large Hadron Collider (LHC) was born in the early 1980s. Although LEP (CERN’s previous large accelerator) was still under construction at that time, scientists were already starting to think about re-using the 27-kilometre ring for an even more powerful machine. Turning this ambitious scientific plan into reality proved to be an immensely complex task. Civil engineering work, state-of-the-art technologies, a new approach to data storage and analysis: many people worked hard for many years to accomplish all this.   Here are some of the highlights: 1984. A symposium organized in Lausanne, Switzerland, is the official starting point for the LHC. LHC prototype of the two beam pipes (1992). 1989. The first embryonic collaborations begin. 1992. A meeting in Evian, France, marks the beginning of the LHC experiments. 1994. The CERN Council approves the construction of the LHC accelerator. 1995. Japan becomes an Observer of CERN and announces a financial contribution to ...

  12. LHC goes global

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1995-09-15

    As CERN's major project for the future, the LHC sets a new scale in world-wide scientific collaboration. As well as researchers and engineers from CERN's 19 European Member States, preparations for the LHC now include scientists from several continents. Some 50 per cent of the researchers involved in one way or another with preparations for the LHC experimental programme now come from countries which are not CERN Member States. Underlining this enlarged international involvement is the recent decision by the Japanese Ministry of Education, Science and Culture ('Monbusho') to accord CERN a generous contribution of five billion yen (about 65 million Swiss francs) to help finance the construction of the LHC. This money will be held in a special fund earmarked for construction of specific LHC components and related activities. To take account of the new situation, CERN is proposing to set up a totally new 'Associate State' status. This is foreseen as a flexible bilateral framework which will be set up on a case-by-case basis to adapt to different circumstances. This proposal was introduced to CERN Council in June, and will be further discussed later this year. These developments reflect CERN's new role as a focus of world science, constituting a first step towards a wider level of international collaboration. At the June Council session, as a first step, Japan was unanimously elected as a CERN Observer State, giving them the right to attend Council meetings. Introducing the topic at the Council session, Director General Chris Llewellyn Smith sketched the history of Japanese involvement in CERN research. This began in 1957 and has gone on to include an important experiment at the LEAR low energy antiproton ring using laser spectroscopy of antiprotonic helium atoms, the new Chorus neutrino experiment using an emulsion target, and a major contribution to the Opal experiment at the LEP electron-positron collider.
In welcoming the development, many Council delegates looked

  13. A word from the CSO: The LHC experiments are going on-line

    CERN Multimedia

    2008-01-01

    In the early discussions about experiments for a future high-energy proton collider the statement, "but we don’t know how to build detectors that can work in such an environment" was made more than once. On one hand I found this somewhat disconcerting, on the other hand it was encouraging in that it did not stop pioneers from starting to design experiments for the Large Hadron Collider. Initially, some ‘back of the envelope’ ideas were quite conservative, looking for very specific event signatures only, but today ATLAS and CMS are fully fledged ‘multi-purpose’ detectors featuring powerful tracking systems, high-resolution calorimetry and high-precision muon spectrometers. The dedicated experiments ALICE and LHCb and the smaller TOTEM and LHCf detectors are also state of the art. I have actively followed the creation of the experiments as a member (and later the Chair) of the LHC Committee, as a director of NIKHEF, and for ...

  14. Optics Measurements and Correction Challenges for the HL-LHC

    CERN Document Server

    Carlier, Felix Simon; Fartoukh, Stephane; Fol, Elena; Gamba, Davide; Garcia-Tabares Valdivieso, Ana; Giovannozzi, Massimo; Hofer, Michael; Langner, Andy Sven; Maclean, Ewen Hamish; Malina, Lukas; Medina Medrano, Luis Eduardo; Persson, Tobias Hakan Bjorn; Skowronski, Piotr Krzysztof; Tomas Garcia, Rogelio; Van Der Veken, Frederik; Wegscheider, Andreas

    2017-01-01

    Optics control in the HL-LHC will be challenged by a very small β* of 15 cm in the two main experiments. HL-LHC physics fills will keep a constant luminosity during several hours via β* leveling. This will require the commissioning of a large number of optical configurations, further challenging the efficiency of the optics measurements and correction tools. We report on the achieved level of optics control in the LHC with simulations and extrapolations for the HL-LHC.

  15. CERN database services for the LHC computing grid

    International Nuclear Information System (INIS)

    Girone, M

    2008-01-01

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed

  16. CERN database services for the LHC computing grid

    Energy Technology Data Exchange (ETDEWEB)

    Girone, M [CERN IT Department, CH-1211 Geneva 23 (Switzerland)], E-mail: maria.girone@cern.ch

    2008-07-15

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed.

  17. Studying Radiation Tolerant ICs for LHC

    CERN Multimedia

    Faccio, F; Snoeys, W; Campbell, M; Casas-cubillos, J; Gomes, P

    2002-01-01

    In recent years, intensive work has been carried out on the development of custom ICs for the readout electronics of the LHC experiments. As far as radiation hardness is concerned, attention has been focussed on high-total-dose applications, mainly for the tracker systems. The dose foreseen in this inner region is estimated to be higher than 1 Mrad/year. In the framework of R&D projects (RD-9 and RD-20) and in the ATLAS and CMS experiments, the study of different radiation-hard processes has been pursued and good contacts with the manufacturers have been established. The results of these studies have been discussed during the Microelectronics User Group (MUG) rad-hard meetings, and now some HEP groups are working to develop radiation-hard ICs for the LHC experiments on some of the available rad-hard processes. In addition, many of the standard commercial electronic components and ASICs which are planned to be installed near the LHC machine and in the detectors will receive total doses in ...

  18. LHC Create

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    LHC Create is an upcoming 2-day workshop held at IdeaSquare in November. Participants from CERN and IPAC school of design will compete to design an exhibit that explains why CERN does what it does. The winner will have their exhibit fully realised and made available to experiments, institutes, and tourism agencies around the world.

  19. UFOs in the LHC after LS1

    International Nuclear Information System (INIS)

    Baer, T.; Barnes, M.J.; Carlier, E.; Cerutti, F.; Dehning, B.; Ducimetiere, L.; Ferrari, A.; Garrel, N.; Gerardin, A.; Goddard, B.; Holzer, E.B.; Jackson, S.; Jimenez, J.M.; Kain, V.; Lechner, A.; Mertens, V.; Misiowiec, M.; Moron Ballester, R.; Nebot del Busto, E.; Norderhaug Drosdal, L.; Nordt, A.; Uythoven, J.; Velghe, B.; Vlachoudis, V.; Wenninger, J.; Zamantzas, C.; Zimmermann, F.; Fuster Martinez, N.

    2012-01-01

    UFOs (Unidentified Falling Objects) are potentially a major luminosity limitation for nominal LHC operation. With large-scale increases of the BLM thresholds, their impact on LHC availability was mitigated in the second half of 2011. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. Therefore, in 2011, the diagnostics for UFO events were significantly improved, dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. In this paper, the state of knowledge is summarized and extrapolations for LHC operation after LS1 are presented. Mitigation strategies are proposed and related tests and measures for 2012 are specified. (authors)

  20. UFOs in the LHC after LS1

    CERN Document Server

    Baer, T; Carlier, E; Cerutti, F; Dehning, B; Ducimetière, L; Ferrari, A; Garrel, N; Gérardin, A; Goddard, B; Holzer, E B; Jackson, S; Jimenez, J M; Kain, V; Lechner, A; Mertens, V; Misiowiec, M; Morón Ballester, R; Nebot del Busto, E; Norderhaug Drosdal, L; Nordt, A; Uythoven, J; Velghe, B; Vlachoudis, V; Wenninger, J; Zamantzas, C; Zimmermann, F; Fuster Martinez, N

    2012-01-01

    UFOs (Unidentified Falling Objects) are potentially a major luminosity limitation for nominal LHC operation. With large-scale increases of the BLM thresholds, their impact on LHC availability was mitigated in the second half of 2011. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. Therefore, in 2011, the diagnostics for UFO events were significantly improved, dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. In this paper, the state of knowledge is summarized and extrapolations for LHC operation after LS1 are presented. Mitigation strategies are proposed and related tests and measures for 2012 are specified.

  1. Fast crab cavity failures in HL-LHC

    CERN Document Server

    Yee-Rendon, B; Calaga, R; Tomas, R; Zimmermann, F; Barranco, J

    2014-01-01

    Crab cavities (CCs) are a key ingredient of the High-Luminosity Large Hadron Collider (HL-LHC) to ensure head-on collisions at the main experiments (ATLAS and CMS) and to profit fully from the smaller β* provided by the ATS optics [1]. At KEKB, CCs have exhibited abrupt changes of phase and voltage within a period of a few LHC turns, and considering the large energy stored in the HL-LHC beam, CC failures represent a serious risk to LHC machine protection. In this paper, we discuss the effect of CC voltage or phase changes on a time interval similar to, or longer than, the one needed to dump the beam. The simulations assume a realistic steady-state distribution to assess the beam losses for the HL-LHC. Additionally, some strategies are studied in order to reduce the damage caused by the CC failures.

  2. Ultra-relativistic heavy-ion physics with AFTER@LHC

    DEFF Research Database (Denmark)

    Rakotozafindrabe, A.; Arnaldi, R.; Brodsky, Stanley

    2013-01-01

    We outline the opportunities for ultra-relativistic heavy-ion physics which are offered by a next-generation, multi-purpose fixed-target experiment exploiting the proton and ion LHC beams extracted by a bent crystal.

  3. Highlights of LHC experiments – Part I

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00072301; The ATLAS collaboration

    2017-01-01

    The superb performance of the LHC accelerator in 2016, in both live time and peak luminosity, has provided a large data sample of collisions at 13 TeV. Excellent performance of the ATLAS and LHCb detectors, together with highly performant offline and analysis systems, means that a wealth of results is already available from 13 TeV data. Selected highlights are reported here.

  4. Validation of the new filters configuration for the RPC gas systems at LHC experiments

    CERN Document Server

    Mandelli, Beatrice; Guida, Roberto; Hahn, Ferdinand; Haider, Stefan

    2012-01-01

    Resistive Plate Chambers (RPCs) are widely employed as muon trigger systems at the Large Hadron Collider (LHC) experiments. Their large detector volume and the use of a relatively expensive gas mixture make a closed-loop gas circulation unavoidable. The return gas of RPCs operated in conditions similar to the experimental background foreseen at LHC contains large amounts of impurities potentially dangerous for long-term operation. Several gas-cleaning agents, characterized during the past years, are currently in use. New tests allowed an understanding of the properties and performance of a large number of purifiers. On that basis, an optimal combination of different filters consisting of Molecular Sieve (MS) 5Å and 4Å, and a Cu catalyst R11 has been chosen and validated by irradiating a set of RPCs at the CERN Gamma Irradiation Facility (GIF) for several years. A very important feature of this new configuration is the increase of the cycle duration for each purifier, which results in better system stabilit...

  5. JACoW Configuring and automating an LHC experiment for faster and better physics output

    CERN Document Server

    Gaspar, Clara; Alessio, Federico; Barbosa, Joao; Cardoso, Luis; Frank, Markus; Jost, Beat; Neufeld, Niko; Schwemmer, Rainer

    2018-01-01

    LHCb has introduced a novel online detector alignment and calibration for LHC Run II. This strategy allows for better trigger efficiency, better data quality and direct physics analysis at the trigger output. This implies: running a first High Level Trigger (HLT) pass synchronously with data taking and buffering its output locally; using the data collected at the beginning of the fill, or on a run-by-run basis, to determine the new alignment and calibration constants; running a second HLT pass on the buffered data using the new constants. Operationally, it represented a challenge: it required running different activities concurrently in the farm, starting at different times and load-balanced depending on the LHC state. However, these activities are now an integral part of LHCb's dataflow, seamlessly integrated in the Experiment Control System and completely automated under the supervision of LHCb's 'Big Brother'. In total, for all activities, there are usually around 60000 tasks running in the ~1600 nodes of the fa...

  6. Restart of the LHC. New physics. The particle physics behind the world machine illustratively explained; Neustart des LHC. Neue Physik. Die Teilchenphysik hinter der Weltmaschine anschaulich erklaert

    Energy Technology Data Exchange (ETDEWEB)

    Knochel, Alexander

    2016-07-01

    The following topics are dealt with: the exploration of new scientific territory by means of the LHC at CERN; the study of current questions of cosmology and astrophysics, such as dark matter and dark energy, by means of the LHC; the anomalies presently seen in the data with regard to new phenomena, together with statistical methods for the correct assessment of such observations; experiments complementary to the LHC experiments; the Higgs boson; supersymmetry; extra dimensions; and the study of quantum gravity in accelerator experiments in the context of string theory. (HSI)

  7. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands that future high-energy physics experiments place on software and computing have led the experiments to plan the related activities as full-fledged projects and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understanding are broadly outlined.

  8. Commissioning of the Absolute Luminosity For ATLAS detector at the LHC

    CERN Document Server

    Jakobsen, Sune; Hansen, Peter; Hansen, Jørgen Beck

    The startup of the LHC (Large Hadron Collider) has initiated a new era in particle physics. For the last 40 years the standard model of particle physics has, with tremendous success, described all measurements with phenomenal precision. The experiments at the LHC are testing the standard model in a new energy regime. To normalize the measurements and understand the potential discoveries of the LHC experiments, it is often crucial to know the interaction rate - the absolute luminosity. The ATLAS (A Toroidal LHC ApparatuS) detector will measure luminosity by numerous methods, but for most of the methods only the relative luminosity is measured with good precision; the absolute scale has to be provided from elsewhere. ATLAS, like the other LHC experiments, mainly relies on absolute luminosity calibration from van der Meer scans (beam-separation scans). To cross-check and maybe even improve the precision, ATLAS has built a sub-detector to measure the flux of protons scattered under very small angles, as this flux...
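    For orientation, the van der Meer method mentioned above determines the absolute luminosity from machine parameters alone. A minimal sketch of the standard formula follows; the beam values in the example are illustrative assumptions, not ATLAS measurements.

    ```python
    import math

    def vdm_luminosity(f_rev, n_bunches, n1, n2, sigma_x, sigma_y):
        # Absolute luminosity from a van der Meer (beam-separation) scan:
        #   L = f_rev * n_b * N1 * N2 / (2*pi * Sigma_x * Sigma_y)
        # where Sigma_x, Sigma_y (in cm) are the effective overlap widths
        # fitted from the scan curves and N1, N2 the bunch populations.
        return f_rev * n_bunches * n1 * n2 / (2.0 * math.pi * sigma_x * sigma_y)

    # Illustrative numbers: LHC revolution frequency of 11245 Hz, one
    # colliding bunch of ~1e11 protons, 120 um effective overlap widths
    # (assumed values typical of low-luminosity vdM-scan conditions).
    L = vdm_luminosity(f_rev=11245.0, n_bunches=1, n1=1.0e11, n2=1.0e11,
                       sigma_x=0.012, sigma_y=0.012)
    print(f"{L:.3g} cm^-2 s^-1")
    ```

    The dominant uncertainties in practice come from the bunch-population measurement and the scan-curve fits, which is why cross-checks such as the small-angle proton flux measurement described in this record are valuable.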

  9. Tevatron-for-LHC Report: Preparations for Discoveries

    CERN Document Server

    Abdullin, Salavat; Asai, Shoji; Atramentov, Oleksiy Vladimirovich; Baer, Howard; Balazs, Csaba; Bartalini, Paolo; Belyaev, Alexander; Bernhard, Ralf Patrick; Birkedal, Andreas; Buescher, Volker; Cavanaugh, Richard; Chen, Mu-Chun; Clement, Christophe; Datta, AseshKrishna; de Boer, Ytsen R.; De Roeck, Albert; Dobrescu, Bogdan A.; Drozdetskiy, Alexey; Gershtein, Yuri S.; Glenzinski, Douglas A.; Group, Robert Craig; Heinemeyer, Sven; Heldmann, Michael; Hubisz, Jay; Karlsson, Martin; Kong, Kyoungchul; Korytov, Andrey; Kraml, Sabine; Krupovnickas, Tadas; Lafaye, Remi; Lane, Kenneth; Ledroit, Fabienne; Lehner, Frank; Lin, Cheng-Ju; Macesanu, Cosmin; Matchev, Konstantin T.; Menon, Arjun; Milstead, David; Mitselmakher, Guenakh; Morel, Julien; Morrissey, David; Mrenna, Steve; O'Farrill, Jorge; Pakhotin, Yu.; Perelstein, Maxim; Plehn, Tilman; Rainwater, David; Raklev, Are; Schmitt, Michael; Scurlock, Bobby; Sherstnev, Alexander; Skands, Peter Z.; Sullivan, Zack; Tait, Timothy M.P.; Tata, Xerxes; Torchiani, Ingo; Trocme, Benjamin; Wagner, Carlos; Weiglein, Georg; Zerwas, Dirk

    2006-01-01

    This is the "TeV4LHC" report of the "Physics Landscapes" Working Group, focused on facilitating the start-up of physics explorations at the LHC by using the experience gained at the Tevatron. We present experimental and theoretical results that can be employed to probe various scenarios for physics beyond the Standard Model.

  10. Cryogenic Silicon Microstrip Detector Modules for LHC

    CERN Document Server

    Perea-Solano, B

    2004-01-01

    CERN is presently constructing the LHC, which will produce collisions of 7 TeV protons in 4 interaction points at a design luminosity of 10³⁴ cm⁻² s⁻¹. The radiation dose resulting from the operation at high luminosity will cause a serious deterioration of the silicon tracker performance. State-of-the-art silicon microstrip detectors can tolerate a fluence of about 3×10¹⁴ cm⁻² of hadrons or charged leptons. This is insufficient, however, for long-term operation in the central parts of the LHC trackers, in particular after the possible luminosity upgrade of the LHC. By operating the detectors at cryogenic temperatures the radiation hardness can be improved by a factor of 10. This work proposes a cryogenic microstrip detector module concept which has the features required for the microstrip trackers of the upgraded LHC experiments at CERN. The module can hold an edgeless sensor, making it a good candidate for improved luminosity and total cross-section measurements in the ATLAS, CMS and TOTEM experiments. The design o...

  11. QCD processes and search for supersymmetry at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Schum, Torben

    2012-07-15

    In this thesis, a data-driven method to estimate the number of QCD background events in a multijet search for supersymmetry at the LHC was developed. The method makes use of two models which predict the correlation of two key search variables, the missing transverse momentum and an angular variable, in order to extrapolate from a QCD dominated control region to the signal region. A good performance of the method was demonstrated by its application to 36 pb{sup -1} data, taken by the CMS experiment in 2010, and by the comparison with an alternative method. Comparing the number of data events to a combined background expectation of QCD and data-driven estimates of the electroweak and top background, no statistically significant excess was observed for three pre-defined search regions. Limits were calculated for the (m{sub 0},m{sub 1/2}) parameter space of the cMSSM, exceeding previous measurements. The expected sensitivity for further refined search regions was investigated.
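
    The control-to-signal extrapolation described above can be illustrated with the generic ABCD method, a common data-driven background technique. This is only a hedged sketch of the general idea, not the specific correlation model used in the thesis, and all event counts below are hypothetical:

```python
# Generic ABCD data-driven background estimate (illustrative only).
# Events are partitioned by two weakly correlated variables, e.g.
# missing transverse momentum (MET) and an angular variable (dphi).

def abcd_estimate(n_a, n_b, n_c):
    """Predict the QCD yield in signal region D from three
    background-dominated control regions:
      A: low MET,  low dphi       B: high MET, low dphi
      C: low MET,  high dphi      D: high MET, high dphi (signal region)
    If the two variables are uncorrelated for the background,
    N_D ~ N_B * N_C / N_A.
    """
    return n_b * n_c / n_a

# Hypothetical control-region event counts:
n_a, n_b, n_c = 4000, 200, 800
qcd_in_signal_region = abcd_estimate(n_a, n_b, n_c)
print(qcd_in_signal_region)  # 40.0
```

The prediction is then compared to the observed count in region D; any excess beyond the combined background expectation would hint at a signal.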

  12. QCD processes and search for supersymmetry at the LHC

    International Nuclear Information System (INIS)

    Schum, Torben

    2012-07-01

    In this thesis, a data-driven method to estimate the number of QCD background events in a multijet search for supersymmetry at the LHC was developed. The method makes use of two models which predict the correlation of two key search variables, the missing transverse momentum and an angular variable, in order to extrapolate from a QCD dominated control region to the signal region. A good performance of the method was demonstrated by its application to 36 pb^-1 of data, taken by the CMS experiment in 2010, and by the comparison with an alternative method. Comparing the number of data events to a combined background expectation of QCD and data-driven estimates of the electroweak and top background, no statistically significant excess was observed for three pre-defined search regions. Limits were calculated for the (m_0, m_1/2) parameter space of the cMSSM, exceeding previous measurements. The expected sensitivity for further refined search regions was investigated.

  13. Search for extra dimensions in the di-photon channel with the ATLAS experiment at LHC

    International Nuclear Information System (INIS)

    Le Bao, T.

    2013-01-01

    This thesis summarizes a search for manifestations of Large Extra Dimensions (LED) using 4.91 fb^-1 of data collected in 2011 by the ATLAS detector at the Large Hadron Collider (LHC) at CERN. In 2011, the LHC provided proton-proton collisions at a center-of-mass energy of √(s)=7 TeV. LED can potentially solve the so-called hierarchy problem, i.e. the large apparent difference between two fundamental scales of the Standard Model (SM), the electroweak and the Planck scales. In the context of the ADD model (named after the authors N. Arkani-Hamed, S. Dimopoulos and G. Dvali) of LED, the effects of quantum gravity become much stronger than in the SM, possibly large enough to be observed at the LHC. There are two possibilities of graviton production in proton-proton collisions: direct graviton production and virtual graviton exchange. In this thesis, we present a search for the manifestation of extra dimensions via the effect of virtual graviton exchange on the di-photon final state. The di-photon invariant mass spectrum is studied and found to be in good agreement with the SM background expectation. We set limits on the fundamental Planck scale of the ADD model using two different methods: a counting experiment and an analysis of the shape of the di-photon mass spectrum. The counting experiment yields limits between 2.62 and 3.92 TeV at 95% C.L., depending on the number of extra dimensions and the theoretical formalism used. The shape analysis yields slightly more stringent limits: the lower limits on the fundamental Planck scale improve by a factor of 1.04. (author)
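
    The counting experiment mentioned above can be sketched in its simplest textbook form: a Poisson upper limit on a signal yield given an observed count and an expected background. This is a generic illustration (a simple CL_{s+b} criterion solved by bisection), not the actual statistical machinery of the analysis, and the input numbers are hypothetical:

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, bkg, cl=0.95, tol=1e-6):
    """Upper limit on signal s at confidence level cl, via the simple
    CL_{s+b} criterion P(N <= n_obs | s + bkg) = 1 - cl, by bisection."""
    lo, hi = 0.0, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid + bkg) > 1 - cl:
            lo = mid   # CDF still too large: signal can be bigger
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical counts: 5 observed events on an expected background of 4.2
print(round(upper_limit(5, 4.2), 2))
```

A quick sanity check: with zero observed events and zero background the limit is ln(20) ≈ 3.0, the familiar "3 events" rule for a 95% CL Poisson bound. A limit on the event yield translates into a limit on a model scale (here, the fundamental Planck scale) through the predicted cross section.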

  14. LHC progress report

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Last weekend saw a record physics fill with a tenfold increase in instantaneous luminosity (event rate from collisions), marking an important milestone for the LHC. This physics fill not only established luminosities above 1.1 x 10^28 cm^-2 s^-1 in all four experiments but was also kept in "stable beam" mode for a new record length of 30 hours. The particle physics experiments were able to more than double the total number of events so far recorded at 3.5 TeV.   The LHC screen indicating that squeezed stable beams have been achieved for the first time. The very successful weekend had been preceded by hard work on the accelerator side. A factor of 5 improvement in luminosity was achieved by "squeezing" (reducing) the beam sizes at all four interaction points. This process, one of the most complex stages in the operation of the accelerator, was finalised the week before. Once the machine is "squeezed", the experimental insertions become aperture bot...
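
    The gain from squeezing can be illustrated with the textbook round-beam luminosity formula L = f_rev n_b N^2 / (4π σ_x σ_y), with beam size σ = sqrt(ε β*): reducing β* at the interaction point shrinks the beam spot and raises the luminosity. This is a rough sketch with illustrative numbers, not the actual 2010 LHC machine settings:

```python
import math

# Head-on, round-beam luminosity sketch: L = f_rev * n_b * N^2 / (4*pi*sigma^2),
# sigma = sqrt(emittance * beta_star). All parameter values are illustrative.

def luminosity(f_rev, n_bunches, n_per_bunch, emittance, beta_star):
    sigma = math.sqrt(emittance * beta_star)   # transverse beam size [m]
    return f_rev * n_bunches * n_per_bunch**2 / (4 * math.pi * sigma**2)

f_rev = 11245.0          # LHC revolution frequency [Hz]
n_bunches = 2            # illustrative
n_per_bunch = 1.0e11     # protons per bunch (illustrative)
emittance = 5.0e-9       # geometric emittance [m·rad] (illustrative)

before = luminosity(f_rev, n_bunches, n_per_bunch, emittance, beta_star=10.0)
after = luminosity(f_rev, n_bunches, n_per_bunch, emittance, beta_star=2.0)
print(after / before)    # squeezing beta* from 10 m to 2 m gains a factor ~5
```

Since L scales as 1/β*, squeezing β* by a factor of 5 yields the factor-5 luminosity improvement quoted in the report.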

  15. Study of the production of nuclei and anti-nuclei at the LHC with the ALICE experiment

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00508690; Bufalino, Stefania

    In the ultra-relativistic lead-lead collisions at the CERN Large Hadron Collider (LHC), a state of matter called Quark Gluon Plasma (QGP) is created. A typical signature of a heavy ion collision (HIC) correlated to the production of the QGP is the large number of particles produced ($\mathrm{d}N_{\mathrm{ch}}/\mathrm{d}\eta$ up to 2000 in Pb-Pb collisions at $\sqrt{s_{\mathrm{NN}}}=5.02$ TeV). This high-multiplicity environment poses a tremendous experimental challenge on the experiments, which have to cope with the high density of signals in their sensitive volume. A Large Ion Collider Experiment (ALICE) has been designed to deal with the harsh environment of a HIC and to study in detail the characteristics of the QGP. Among the particles produced in a HIC, light nuclei and their anti-matter companions are of special interest since the production mechanism of such loosely bound states is not clear in high energy collisions. The production rate at the LHC for the lightest of these objects, the deuteron, is a...

  16. MSSM A-funnel and the galactic center excess: prospects for the LHC and direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Freese, Katherine [Nordita (Nordic Institute for Theoretical Physics),KTH Royal Institute of Technology and Stockholm University,Roslagstullsbacken 23, SE-106 91 Stockholm (Sweden); The Oskar Klein Center for Cosmoparticle Physics, AlbaNova University Center,University of Stockholm,10691 Stockholm (Sweden); Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States); López, Alejandro [Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States); Shah, Nausheen R. [Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States); Department of Physics and Astronomy, Wayne State University,Detroit, Michigan 48201 (United States); Shakya, Bibhushan [Michigan Center for Theoretical Physics, Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States)

    2016-04-11

    The pseudoscalar resonance or “A-funnel” in the Minimal Supersymmetric Standard Model (MSSM) is a widely studied framework for explaining dark matter that can yield interesting indirect detection and collider signals. The well-known Galactic Center excess (GCE) at GeV energies in the gamma ray spectrum, consistent with annihilation of a ≲40 GeV dark matter particle, has more recently been shown to be compatible with significantly heavier masses following reanalysis of the background. In this paper, we explore the LHC and direct detection implications of interpreting the GCE in this extended mass window within the MSSM A-funnel framework. We find that compatibility with relic density, signal strength, collider constraints, and Higgs data can be simultaneously achieved with appropriate parameter choices. The compatible regions give very sharp predictions of 200–600 GeV CP-odd/even Higgs bosons at low tan β at the LHC and spin-independent cross sections ≈10{sup −11} pb at direct detection experiments. Regardless of consistency with the GCE, this study serves as a useful template of the strong correlations between indirect, direct, and LHC signatures of the MSSM A-funnel region.

  17. Partitioning, Automation and Error Recovery in the Control and Monitoring System of an LHC Experiment

    Institute of Scientific and Technical Information of China (English)

    C. Gaspar

    2001-01-01

    The Joint Controls Project (JCOP) is a collaboration between CERN and the four LHC experiments to find and implement common solutions for their control and monitoring systems. As part of this project, an Architecture Working Group was set up in order to study the requirements and devise an architectural model that would suit the four experiments. Many issues were studied by this working group: alarm handling, access control, hierarchical control, etc. This paper will report on the specific issue of hierarchical control and in particular partitioning, automation and error recovery.

  18. The LHC detector challenge

    CERN Document Server

    Virdee, Tejinder S

    2004-01-01

    The Large Hadron Collider (LHC) at CERN, scheduled to come online in 2007, is a multi-TeV proton-proton collider with vast detectors. Two of the more significant detectors at the LHC are ATLAS and CMS. Currently, both detectors are more than 65% complete in terms of financial commitment, and the experiments are being assembled at an increasing pace. ATLAS is being built directly in its underground cavern, whereas CMS is being assembled above ground. When completed, both detectors will aid researchers in determining what lies at the high-energy frontier, in particular the mechanism by which particles attain mass. (Edited abstract).

  19. Manufacturing experience for the LHC inner triplet quadrupole cables

    CERN Document Server

    Scanlan, R M; Bossert, R; Kerby, J S; Ghosh, A K; Boivin, M; Roy, T

    2002-01-01

    The design for the U.S. LHC Inner Triplet Quadrupole magnet requires a 37 strand (inner layer) and a 46 strand (outer layer) cable. This represents the largest number of strands attempted to date for a production quantity of Rutherford-type cable. The cable parameters were optimized during the production of a series of short prototype magnets produced at FNAL. These optimization studies focused on critical current degradation, dimensional control, coil winding, and interstrand resistance. After the R&D phase was complete, the technology was transferred to NEEW and a new cabling machine was installed to produce these cables. At present, about 60 unit lengths, out of 90 required for the entire production series of magnets, have been completed for each type of cable. The manufacturing experience with these challenging cables will be reported. Finally, the implications for even larger cables, with more strands, will be discussed. (8 refs).

  20. LHC Startup

    CERN Document Server

    AUTHOR|(CDS)2067853

    2008-01-01

    The Large Hadron Collider will commence operations in the latter half of 2008. The plans of the LHC experiments ALICE, ATLAS, CMS and LHCb are described. The scenario for the progression of luminosity and the strategies of these 4 experiments to use the initial data are detailed. There are significant measurements possible with integrated luminosities of 1, 10 and 100 pb^-1. These measurements will provide essential calibration and tests of the detectors, understanding of the Standard Model backgrounds and a first opportunity to look for new physics.

  1. Timing, Trigger and Control Systems for LHC Detectors

    CERN Multimedia

    2002-01-01

    At the LHC, precise bunch-crossing clock and machine orbit signals must be broadcast over distances of several km from the Prevessin Control Room to the four experiment areas and other destinations. At the LHC experiments themselves, quite extensive distribution systems are also required for the transmission of timing, trigger and control (TTC) signals to large numbers of front-end electronics controllers from a single location in the vicinity of the central trigger processor. The systems must control the detector synchronization and deliver the necessary fast signals and messages that are phased with the LHC clock, orbit or bunch structure. These include the bunch-crossing clock, level-1 trigger decisions, bunch and event numbers, as well as test signals and broadcast commands. A common solution to this TTC system requirement is expected to result in important economies of scale and permit a rationalization of the development, operational and support efforts required. LHC Common Project RD12 is developi...

  2. The development of diamond tracking detectors for the LHC

    International Nuclear Information System (INIS)

    Adam, W.; Berdermann, E.; Bergonzo, P.; Boer, W. de; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; D'Angelo, P.; Dabrowski, W.; Delpierre, P.; Doroshenko, J.; Dulinski, W.; Eijk, B. van; Fallou, A.; Fischer, P.; Fizzotti, F.; Furetta, C.; Gan, K.K.; Ghodbane, N.; Grigoriev, E.; Hallewell, G.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kaplon, J.; Karl, C.; Kass, R.; Keil, M.; Knoepfle, K.T.; Koeth, T.; Krammer, M.; Logiudice, A.; Lu, R.; Mac Lynne, L.; Manfredotti, C.; Marshall, R.D.; Meier, D.; Menichelli, D.; Meuser, S.; Mishina, M.; Moroni, L.; Noomen, J.; Oh, A.; Perera, L.; Pernegger, H.; Pernicka, M.; Polesello, P.; Potenza, R.; Riester, J.L.; Roe, S.; Rudge, A.; Sala, S.; Sampietro, M.; Schnetzer, S.; Sciortino, S.; Stelzer, H.; Stone, R.; Sutera, C.; Trischuk, W.; Tromson, D.; Tuve, C.; Vincenzo, B.; Weilhammer, P.; Wermes, N.; Wetstein, M.; Zeuner, W.; Zoeller, M.

    2003-01-01

    Chemical vapor deposition diamond has been discussed extensively as an alternate sensor material for use very close to the interaction region of the LHC where extreme radiation conditions exist. During the last few years diamond devices have been manufactured and tested with LHC electronics with the goal of creating a detector usable by all LHC experiments. Extensive progress on diamond quality, on the development of diamond trackers and on radiation hardness studies has been made. Transforming the technology to the LHC specific requirements is now underway. In this paper we present the recent progress achieved.

  3. The development of diamond tracking detectors for the LHC

    CERN Document Server

    Adam, W; Bergonzo, P; de Boer, Wim; Bogani, F; Borchi, E; Brambilla, A; Bruzzi, M; Colledani, C; Conway, J; D'Angelo, P; Dabrowski, W; Delpierre, P A; Doroshenko, J; Dulinski, W; van Eijk, B; Fallou, A; Fischer, P; Fizzotti, F; Furetta, C; Gan, K K; Ghodbane, N; Grigoriev, E; Hallewell, G D; Han, S; Hartjes, F; Hrubec, Josef; Husson, D; Kagan, H; Kaplon, J; Karl, C; Kass, R; Keil, M; Knöpfle, K T; Koeth, T W; Krammer, M; Lo Giudice, A; Lü, R; MacLynne, L; Manfredotti, C; Marshall, R D; Meier, D; Menichelli, D; Meuser, S; Mishina, M; Moroni, L; Noomen, J; Oh, A; Perera, L; Pernegger, H; Pernicka, M; Polesello, P; Potenza, R; Riester, J L; Roe, S; Rudge, A; Sala, S; Sampietro, M; Schnetzer, S; Sciortino, S; Stelzer, H; Stone, R; Sutera, C; Trischuk, W; Tromson, D; Tuvé, C; Vincenzo, B; Weilhammer, P; Wermes, N; Wetstein, M; Zeuner, W; Zöller, M

    2003-01-01

    Chemical vapor deposition diamond has been discussed extensively as an alternate sensor material for use very close to the interaction region of the LHC where extreme radiation conditions exist. During the last few years diamond devices have been manufactured and tested with LHC electronics with the goal of creating a detector usable by all LHC experiments. Extensive progress on diamond quality, on the development of diamond trackers and on radiation hardness studies has been made. Transforming the technology to the LHC specific requirements is now underway. In this paper we present the recent progress achieved.

  4. The development of diamond tracking detectors for the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Adam, W.; Berdermann, E.; Bergonzo, P.; Boer, W. de; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; D' Angelo, P.; Dabrowski, W.; Delpierre, P.; Doroshenko, J.; Dulinski, W.; Eijk, B. van; Fallou, A.; Fischer, P.; Fizzotti, F.; Furetta, C.; Gan, K.K.; Ghodbane, N.; Grigoriev, E.; Hallewell, G.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H. E-mail: harris.kagan@cern.ch; Kaplon, J.; Karl, C.; Kass, R.; Keil, M.; Knoepfle, K.T.; Koeth, T.; Krammer, M.; Logiudice, A.; Lu, R.; Mac Lynne, L.; Manfredotti, C.; Marshall, R.D.; Meier, D.; Menichelli, D.; Meuser, S.; Mishina, M.; Moroni, L.; Noomen, J.; Oh, A.; Perera, L.; Pernegger, H.; Pernicka, M.; Polesello, P.; Potenza, R.; Riester, J.L.; Roe, S.; Rudge, A.; Sala, S.; Sampietro, M.; Schnetzer, S.; Sciortino, S.; Stelzer, H.; Stone, R.; Sutera, C.; Trischuk, W.; Tromson, D.; Tuve, C.; Vincenzo, B.; Weilhammer, P.; Wermes, N.; Wetstein, M.; Zeuner, W.; Zoeller, M

    2003-11-21

    Chemical vapor deposition diamond has been discussed extensively as an alternate sensor material for use very close to the interaction region of the LHC where extreme radiation conditions exist. During the last few years diamond devices have been manufactured and tested with LHC electronics with the goal of creating a detector usable by all LHC experiments. Extensive progress on diamond quality, on the development of diamond trackers and on radiation hardness studies has been made. Transforming the technology to the LHC specific requirements is now underway. In this paper we present the recent progress achieved.

  5. The development of diamond tracking detectors for the LHC

    Science.gov (United States)

    Adam, W.; Berdermann, E.; Bergonzo, P.; de Boer, W.; Bogani, F.; Borchi, E.; Brambilla, A.; Bruzzi, M.; Colledani, C.; Conway, J.; D'Angelo, P.; Dabrowski, W.; Delpierre, P.; Doroshenko, J.; Dulinski, W.; van Eijk, B.; Fallou, A.; Fischer, P.; Fizzotti, F.; Furetta, C.; Gan, K. K.; Ghodbane, N.; Grigoriev, E.; Hallewell, G.; Han, S.; Hartjes, F.; Hrubec, J.; Husson, D.; Kagan, H.; Kaplon, J.; Karl, C.; Kass, R.; Keil, M.; Knöpfle, K. T.; Koeth, T.; Krammer, M.; Logiudice, A.; Lu, R.; mac Lynne, L.; Manfredotti, C.; Marshall, R. D.; Meier, D.; Menichelli, D.; Meuser, S.; Mishina, M.; Moroni, L.; Noomen, J.; Oh, A.; Perera, L.; Pernegger, H.; Pernicka, M.; Polesello, P.; Potenza, R.; Riester, J. L.; Roe, S.; Rudge, A.; Sala, S.; Sampietro, M.; Schnetzer, S.; Sciortino, S.; Stelzer, H.; Stone, R.; Sutera, C.; Trischuk, W.; Tromson, D.; Tuve, C.; Vincenzo, B.; Weilhammer, P.; Wermes, N.; Wetstein, M.; Zeuner, W.; Zoeller, M.; RD42 Collaboration

    2003-11-01

    Chemical vapor deposition diamond has been discussed extensively as an alternate sensor material for use very close to the interaction region of the LHC where extreme radiation conditions exist. During the last few years diamond devices have been manufactured and tested with LHC electronics with the goal of creating a detector usable by all LHC experiments. Extensive progress on diamond quality, on the development of diamond trackers and on radiation hardness studies has been made. Transforming the technology to the LHC specific requirements is now underway. In this paper we present the recent progress achieved.

  6. Prototype HL-LHC magnet undergoes testing

    CERN Multimedia

    Corinne Pralavorio

    2016-01-01

    A preliminary short prototype of the quadrupole magnets for the High-Luminosity LHC has passed its first tests.   The first short prototype of the quadrupole magnet for the High Luminosity LHC. (Photo: G. Ambrosio (US-LARP and Fermilab), P. Ferracin and E. Todesco (CERN TE-MSC)) Momentum is gathering behind the High-Luminosity LHC (HL-LHC) project. In laboratories on either side of the Atlantic, a host of tests are being carried out on the various magnet models. In mid-March, a short prototype of the quadrupole magnet underwent its first testing phase at the Fermilab laboratory in the United States. This magnet is a pre-prototype of the quadrupole magnets that will be installed near the ATLAS and CMS detectors to squeeze the beams before collisions. Six quadrupole magnets will be installed on each side of each experiment, giving a total of 24 magnets, and will replace the LHC's triplet magnets. Made of superconducting niobium-tin, the magnets will be more powerful than their p...

  7. Press Conference: LHC Restart, Season 2

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    PRESS BRIEFING ON THE LARGE HADRON COLLIDER (LHC) RE-START, SEASON 2 AT CERN, GLOBE OF SCIENCE AND INNOVATION Where :   http://cern.ch/directions   at the Globe of Science and Innovation When : Thursday, 12 March from 2.30 to 3.30pm - Open seating as from 2.15pm Speakers : CERN’s Director General, Rolf Heuer and Director of Accelerators, Frédérick Bordry, and representatives of the LHC experiments Webcast : https://webcast.web.cern.ch/webcast/ Dear Journalists, CERN is pleased to invite you to the above press briefing which will take place on Thursday 12 March, in the Globe of Science and Innovation, 1st floor, from 2.30 to 3.30pm. The Large Hadron Collider (LHC) is ready to start up for its second three-year run. The 27km LHC is the largest and most powerful particle accelerator in the world operating at a temperature of -271 degrees Centigrade and powered to a current of 11,000 amps. Run 2 of the LHC follows a two-year technical s...

  8. New EU project supports LHC theorists

    CERN Multimedia

    Katarina Anthony

    2011-01-01

    LHCPhenoNet, a new EU-funded research network aimed at improving the theoretical predictions that guide the LHC experiments, has begun its 4-year run as a Marie Curie Initial Training Network. CERN joins the network as an associate partner, along with almost 30 multinational institutions and computing companies.   Theorists from around the world gathered in Valencia to attend LHCPhenoNet's kick-off meeting. LHCPhenoNet will create research opportunities for young, talented European theorists, providing funding for both doctoral and post-doctoral positions across the various participating institutions – including the University of Durham, DESY, and the Istituto Nazionale di Fisica Nucleare (INFN). LHCPhenoNet aims to improve the Quantum Field Theory calculations that set the parameters of the LHC experiments, focusing on the LHC phenomenology that gave it its name. The 4.5 million euro project is funded by the EU's 7th Research Framework Programme and will be coordinated through the Span...

  9. Refining animal experiments: the first Brazilian regulation on animal experimentation.

    Science.gov (United States)

    de A e Tréz, Thales

    2010-06-01

    The very first law on animal experimentation has been approved recently in Brazil, and is now part of the set of legal instruments that profile the Brazilian government's attitude toward the use of animals in experiments. Law 11794/08 establishes a new legal instrument that will guide new methods of conduct for ethics committees, researchers and representatives of animal protection societies. This comment aims to critically analyse the implications that this law brings to the Brazilian reality. The link between it and Russell and Burch's Three Rs concept is defined, and certain problems are identified. The conclusion is that the body of the law emphasises the refinement of animal experiments, but gives little importance to the principles of reduction and replacement.

  10. First experiences with the LHC BLM sanity checks

    CERN Document Server

    Emery, J; Effinger, E; Nordt, A; Sapinski, M G; Zamantzas, C

    2010-01-01

    Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early stage of the studies up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new ways of automatic checking have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that the system status after each test is "as good as new". The sanity checks are part of this strategy. They test the electrical part of the detectors (ionisation chamber or secondary emission detector), their cable connections to the front-end electronics, further connections to the back-end electronics and their ability to request a beam abort. During the installation and in the early commissioning phase, these checks have also shown their ability to find non-conformities caused by unexpected failure event scenarios. In everyday operation, a non-conformity discovere...

  11. Upgrade of the LHC Schottky Monitor, Operational Experience and First Results

    CERN Document Server

    Betz, Michael; Lefèvre, Thibaut; Wendt, Manfred

    2016-01-01

    The LHC Schottky system allows the measurement of beam parameters such as tune and chromaticity in an entirely non-invasive way by extracting information from the statistical fluctuations in the incoherent motion of particles. The system was commissioned in 2011 and provided satisfactory beam-parameter measurements during LHC Run 1 for lead ions. However, for protons its usability was substantially limited due to strong interfering signals originating from the coherent motion of the particle bunch. The system has recently been upgraded with optimized travelling-wave pick-ups and an improved 4.8 GHz microwave signal path, with the front-end and the triple down-mixing chain optimized to reduce coherent signals. Design and operational aspects for the complete system are shown and the results from measurements with LHC beams in Run II are presented and discussed.

  12. Beam Loss and Beam Shape at the LHC Collimators

    CERN Document Server

    Burkart, Florian

    In this master thesis the beam loss and the beam shape at the LHC collimators were measured, analysed, presented and discussed. Beginning with a short introduction of the LHC, the experiments, the superconducting magnet system and the basics of linear beam dynamics, a description of the LHC collimation system is given. This is followed by a presentation of the performance of the LHC collimation system during 2011. A method to convert the Beam Loss Monitor signal in Gy/s to a proton beam loss rate is introduced. The beam lifetime during the proton physics runs in 2011 is also presented and discussed. Finally, the shape of the LHC beams is analysed using data obtained by scraping the beam at the LHC primary collimators.

  13. CMOS pixel sensor development for the ATLAS experiment at the High Luminosity-LHC

    CERN Document Server

    Rimoldi, Marco; The ATLAS collaboration

    2017-01-01

    The current ATLAS Inner Detector will be replaced with a fully silicon-based detector called the Inner Tracker (ITk) before the start of the High Luminosity-LHC project (HL-LHC) in 2026. To cope with the harsh environment expected at the HL-LHC, new approaches are being developed for pixel detectors based on CMOS pixel technology. Such detectors provide charge collection and analog and digital amplification in the same silicon bulk. The radiation hardness is obtained with multiple nested wells that embed the CMOS electronics with sufficient shielding. The goal of this programme is to demonstrate that depleted CMOS pixels are suitable for high-rate, fast-timing and high-radiation operation at the LHC. A number of alternative solutions have been explored and characterised, and are presented in this document.

  14. First indication of LPM effect in LHCf, an LHC experiment

    Directory of Open Access Journals (Sweden)

    Del Prete, M.

    2016-01-01

    Full Text Available The Large Hadron Collider forward (LHCf) experiment is dedicated to the measurement of very forward neutral particle production in high energy hadron-hadron collisions at the LHC. The aim of the experiment is to improve the cosmic-ray air shower development models, and its setup gives an important opportunity to directly measure the Landau-Pomeranchuk-Migdal (LPM) effect in a heavy absorber. This work presents the analysis of the LPM effect on data taken in 2010 and 2013 at √s = 7 TeV and √sNN = 5.02 TeV, respectively. We study the interactions of gammas, mainly produced by π0 decay, in one of the calorimeter pairs of LHCf (Arm2), composed of 16 tungsten layers as absorbers and 16 plastic scintillators for energy measurements. We use three parameters to describe the mean shower profile with respect to the photon mean energy. The results, compared with EPICS Monte Carlo simulations with the LPM effect active and inactive, show a first evidence of the LPM effect.

  15. Detector Control System for an LHC experiment - User Requirements Document

    CERN Document Server

    CERN. Geneva

    1997-01-01

    The purpose of this document is to provide the user requirements for a detector control system kernel for the LHC experiments following the ESA standard PSS-05 [1]. The first issue will be used to provide the basis for an evaluation of possible development philosophies for a kernel DCS. As such it will cover all the major functionality but only to a level of detail sufficient for such an evaluation to be performed. Many of the requirements are therefore intentionally high level and generic, and are meant to outline the functionality that would be required of the kernel DCS, but not yet to the level of the detail required for implementation. The document is also written in a generic fashion in order not to rule out any implementation technology.

  16. Multiplicity dependence of two-particle correlation in $\sqrt{s}$ = 7 TeV pp collisions at LHC-ALICE experiment

    CERN Document Server

    Bhom, Jihyun; Esumi, Shinlchi

    The early stage of the universe and the interior of neutron stars are believed to be in a Quark Gluon Plasma (QGP) state. The QGP is a state of matter in quantum chromodynamics (QCD) which exists at extremely high temperature and/or high density. In high energy nuclear collision experiments, hot dense matter or the QGP has been studied, and the collective flow of the system has been one of the key issues in understanding the state of matter. The Large Hadron Collider (LHC) provided pp collisions at a nucleon-nucleon center-of-mass energy of 7 TeV in 2010, where the maximum charged particle multiplicity has been measured to be as large as 100$\sim$200 charged particles in $|\eta|<$2.5 in $dN/d\eta$, comparable to mid-peripheral Cu-Cu collisions at $\sqrt{s_{NN}} = 200$ GeV at RHIC. A Large Ion Collider Experiment (ALICE) at the LHC is optimized for studying the high-temperature and high-density system known as the QGP. Angular correlations between two charged particles are measured with central and forward detectors in the ALICE experiment. Th...

  17. Strategies for precision measurements of the charge asymmetry of the W boson mass at the LHC within the ATLAS experiment

    CERN Document Server

    Fayette, Florent

    This thesis dissertation presents a prospect for a measurement of the charge asymmetry of the W boson mass (MW+ - MW-) at the LHC within the ATLAS experiment. This measurement is of primordial importance for the LHC experimental program, both as a direct test of the charge-sign-independent coupling of the W bosons to the fermions and as a mandatory preliminary step towards the precision measurement of the charge-averaged W boson mass. This last pragmatic point can be understood since the specific LHC collisions will provide unprecedented kinematics for the positive and negative channels, while the SPS and Tevatron colliders produced W+ and W- on the same footing. For that reason, the study of the asymmetries between W+ and W- in Drell-Yan-like processes (production of a single W decaying into leptons), studied to extract the properties of the W boson, is described thoroughly in this document. Then, the prospect for a measurement of MW+ - MW- at the LHC is addressed in a perspective intending to decrease as much ...

  18. 2008 LHC Open Days: Super(-conducting) events and activities

    CERN Multimedia

    2008-01-01

    Superconductivity will be one of the central themes of the programme of events and discovery activities of the forthcoming LHC Open Days on 5 and 6 April. Visitors will be invited to take part in a range of activities, experiments and exchanges all about this amazing aspect of the LHC project. Why superconductivity? Simply because it’s the principle on which the very operation of the LHC is based. At the heart of the LHC magnets lie 7000 kilometres of superconducting cables, each strand containing between 6000 and 9000 filaments of the superconducting alloy niobium-titanium in a copper coating. These cables, cooled to a temperature close to absolute zero, are able to conduct electricity without resistance. 12000 amp currents - an intensity some 30000 times greater than that of a 100 watt light bulb - pass through the cables of the LHC magnets.   Programme:   BLDG 163 (Saturday 5 and Sunday 6 April): See weird and wonderful experiments with your own eyes In the workshop where the 2...
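    The bulb comparison above is easy to check with one line of arithmetic: a 100 W bulb on 230 V mains (the mains voltage is our assumption, not stated in the text) draws I = P/V ≈ 0.43 A, so 12000 A is indeed of order 30000 times that. A quick sketch:

    ```python
    # Sanity check of the "30000 times a 100 W light bulb" comparison.
    # Assumption: the bulb runs on European 230 V mains (not stated in the text).
    bulb_power_w = 100.0
    mains_voltage_v = 230.0
    bulb_current_a = bulb_power_w / mains_voltage_v   # I = P / V, about 0.43 A

    lhc_magnet_current_a = 12000.0                    # from the text
    ratio = lhc_magnet_current_a / bulb_current_a

    print(f"bulb current: {bulb_current_a:.2f} A")    # 0.43 A
    print(f"ratio: {ratio:.0f}")                      # ~27600, i.e. "some 30000 times"
    ```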

  19. The High-Luminosity upgrade of the LHC: Physics and Technology Challenges for the Accelerator and the Experiments

    Science.gov (United States)

    Schmidt, Burkhard

    2016-04-01

    In the second phase of the LHC physics program, the accelerator will provide an additional integrated luminosity of about 2500/fb over 10 years of operation to the general purpose detectors ATLAS and CMS. This will substantially enlarge the mass reach in the search for new particles and will also greatly extend the potential to study the properties of the Higgs boson discovered at the LHC in 2012. In order to meet the experimental challenges of unprecedented pp luminosity, the experiments will need to address the aging of the present detectors and to improve the ability to isolate and precisely measure the products of the most interesting collisions. The lectures gave an overview of the physics motivation and described the conceptual designs and the expected performance of the upgrades of the four major experiments, ALICE, ATLAS, CMS and LHCb, along with the plans to develop the appropriate experimental techniques and a brief overview of the accelerator upgrade. Only some key points of the upgrade program of the four major experiments are discussed in this report; more information can be found in the references given at the end.
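    To get a feel for what 2500/fb means, one can fold an assumed production cross-section into the standard yield formula N = σ · L_int. The cross-section below is an illustrative assumption (gluon-fusion Higgs production at 14 TeV is of order 50 pb), not a number from the text:

    ```python
    # Event-yield estimate N = sigma * integrated luminosity.
    # sigma is an assumed, round illustrative value, not taken from the text above.
    sigma_higgs_pb = 50.0        # assumed Higgs production cross-section [pb]
    int_lumi_inv_fb = 2500.0     # HL-LHC integrated luminosity from the text [fb^-1]

    sigma_higgs_fb = sigma_higgs_pb * 1000.0   # 1 pb = 1000 fb
    n_higgs = sigma_higgs_fb * int_lumi_inv_fb
    print(f"Higgs bosons produced: about {n_higgs:.0e}")  # of order 1e8
    ```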

  20. LHC(ATLAS, CMS, LHCb) Run 2 commissioning status

    CERN Document Server

    Zimmermann, Stephanie; The ATLAS collaboration

    2015-01-01

    After a very successful Run 1, the LHC accelerator and the LHC experiments underwent intensive consolidation, maintenance and upgrade activities during the last two years in what has become known as Long Shutdown 1 (LS1). LS1 ended in February this year, with beams back in the LHC since Easter. This talk gives a summary of the major shutdown activities of ATLAS, CMS and LHCb and reviews the status of commissioning for Run 2 physics data taking.

  1. Virtual machines & volunteer computing: Experience from LHC@Home: Test4Theory project

    CERN Document Server

    Lombraña González, Daniel; Blomer, Jakob; Buncic, Predrag; Harutyunyan, Artem; Marquina, Miguel; Segal, Ben; Skands, Peter; Karneyeu, Anton

    2012-01-01

    Volunteer desktop grids are nowadays becoming more and more powerful thanks to improved high-end components: multi-core CPUs, larger RAM memories and hard disks, better network connectivity and bandwidth, etc. As a result, desktop grid systems can run more complex experiments or simulations, but some problems remain: the heterogeneity of hardware architectures and software (library dependencies, code length, big repositories, etc.) makes it very difficult for researchers and developers to deploy and maintain a software stack for all the available platforms. In this paper, the employment of virtualization is shown to be the key to solving these problems. It provides a homogeneous layer allowing researchers to focus their efforts on running their experiments. Inside virtual custom execution environments, researchers can control and deploy very complex experiments or simulations running on heterogeneous grids of high-end computers. The following work presents the latest results from CERN’s LHC@home Test4Theory p...

  2. Searching dark matter at LHC

    International Nuclear Information System (INIS)

    Nojiri, Mihoko M.

    2007-01-01

    We now believe that the dark matter in our Universe must be an unknown elementary particle, which is charge neutral and weakly interacting. The standard model must be extended to include it. The dark matter was likely produced in the early universe from high-energy collisions of particles. The LHC experiment, starting in 2008, will create such high-energy collisions to explore the nature of the dark matter. In this article we explain in detail how dark matter and LHC physics will be connected. (author)

  3. LHC goes global

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    As CERN's major project for the future, the LHC sets a new scale in world-wide scientific collaboration. As well as researchers and engineers from CERN's 19 European Member States, preparations for the LHC now include scientists from several continents. Some 50 per cent of the researchers involved in one way or another with preparations for the LHC experimental programme now come from countries which are not CERN Member States. Underlining this enlarged international involvement is the recent decision by the Japanese Ministry of Education, Science and Culture ('Monbusho') to accord CERN a generous contribution of five billion yen (about 65 million Swiss francs) to help finance the construction of the LHC. This money will be held in a special fund earmarked for construction of specific LHC components and related activities. To take account of the new situation, CERN is proposing to set up a totally new 'Associate State' status. This is foreseen as a flexible bilateral framework which will be set up on a case-by-case basis to adapt to different circumstances. This proposal was introduced to CERN Council in June, and will be further discussed later this year. These developments reflect CERN's new role as a focus of world science, constituting a first step towards a wider level of international collaboration. At the June Council session, as a first step, Japan was unanimously elected as a CERN Observer State, giving it the right to attend Council meetings. Introducing the topic at the Council session, Director General Chris Llewellyn Smith sketched the history of Japanese involvement in CERN research. This began in 1957 and has gone on to include an important experiment at the LEAR low energy antiproton ring using laser spectroscopy of antiprotonic helium atoms, the new Chorus neutrino experiment using an emulsion target, and a major contribution to the Opal experiment at the LEP electron-positron collider. In welcoming the

  4. LHC an unprecedented technological challenge

    International Nuclear Information System (INIS)

    Baruch, J.O.

    2002-01-01

    This article presents the future LHC (Large Hadron Collider) in simple terms and gives some details concerning radiation detectors and superconducting magnets. The LHC will take the place of LEP inside the 27 km long underground tunnel near Geneva and is scheduled to start operating in 2007. Eight years after its official launch, the LHC project has accumulated a 2-year delay and has exceeded its initial budget (2 billion euros) by 18%. Technological challenges and design difficulties are the main causes of these overruns. The first challenge, the complete clearing out of the LEP installation, has been carried out successfully. In order to release 14 TeV in each proton-proton collision, powerful magnetic fields (8.33 Tesla) are necessary, and 1248 superconducting 15 m-long dipole magnets have to be built. 30% of the worldwide production of niobium-titanium wire will be used each year for 5 years in the fabrication of these coils. The global cryogenic system will be gigantic and will use 94 tons of helium. 4 radiation detectors are being built: ATLAS (A Toroidal LHC Apparatus), CMS (Compact Muon Solenoid), ALICE (A Large Ion Collider Experiment) and LHCb (Large Hadron Collider beauty). The first two will search for the Higgs boson, ALICE will be dedicated to the study of the quark-gluon plasma, and LHCb will gather data on the imbalance between matter and anti-matter. (A.C.)

  5. Tevatron-for-LHC Report: Preparations for Discoveries

    Energy Technology Data Exchange (ETDEWEB)

    Buescher, V.; Carena, Marcela S.; Dobrescu, Bogdan A.; Mrenna, S.; Rainwater, D.; Schmitt, M.

    2006-08-01

    This is the "TeV4LHC" report of the "Physics Landscapes" Working Group, focused on facilitating the start-up of physics explorations at the LHC by using the experience gained at the Tevatron. We present experimental and theoretical results that can be employed to probe various scenarios for physics beyond the Standard Model.

  6. Prospects on electroweak physics from the LHC

    International Nuclear Information System (INIS)

    Vikas, Pratibha

    2001-01-01

    The abundant production of gauge bosons, gauge boson pairs and top quarks at the LHC will offer the opportunity for comprehensive and challenging tests of theoretical predictions in the electroweak sector. Some issues which influence these measurements, followed by prospects for some possible measurements by the ATLAS and CMS experiments at the Large Hadron Collider (LHC) at CERN, are discussed. (author)

  7. LHC physics

    National Research Council Canada - National Science Library

    Binoth, T

    2012-01-01

    "Exploring the phenomenology of the Large Hadron Collider (LHC) at CERN, LHC Physics focuses on the first years of data collected at the LHC as well as the experimental and theoretical tools involved...

  8. LHC: Past, Present, and Future

    CERN Document Server

    Landsberg, Greg

    2013-01-01

    In this overview talk, I give highlights of the first three years of LHC operations at high energy, spanning heavy-ion physics, standard model measurements, and searches for new particles, which culminated in the discovery of the Higgs boson by the ATLAS and CMS experiments in 2012. I discuss what we have learned about the properties of the new particle in the 10 months since the discovery and then turn to the future LHC program and preparations for the 2015 run at a center-of-mass energy of ~13 TeV. These proceedings are meant to be a snapshot of the LHC results as of May 2013, the time of the conference. Many of the results shown in these proceedings have since been updated (sometimes significantly) just 4 months thereafter, when these proceedings were due. Nevertheless, keeping this writeup in sync with the results shown in the actual talk has some historical value, as, for one, it shows the reader how short the turnaround time for updating results at the LHC is. To help an appreciation of this fact, I b...

  9. LHC and the neutrino paradigm

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    I argue that LHC may shed light on the nature of neutrino mass through the probe of the seesaw mechanism. The smoking gun signature is lepton number violation through the production of same sign lepton pairs, a collider analogy of the neutrinoless double beta decay. I discuss this in the context of L-R symmetric theories, which predicted neutrino mass long before experiment and led to the seesaw mechanism. A WR gauge boson with a mass in a few TeV region could easily dominate neutrinoless double beta decay, and its discovery at LHC would have spectacular signatures of parity restoration and lepton number violation. I also discuss the collider signatures of the three types of seesaw mechanism, and show how in the case of Type II one can measure the PMNS mixing matrix at the LHC, complementing the low energy probes. Finally, I give an example of a simple realistic SU(5) grand unified theory that predicts the hybrid Type I + III seesaw with a weak fermion triplet at the LHC energies. The seminar will be fol...

  10. CMOS pixel sensor development for the ATLAS experiment at the High Luminosity-LHC

    Science.gov (United States)

    Rimoldi, M.

    2017-12-01

    The current ATLAS Inner Detector will be replaced with a fully silicon-based detector called the Inner Tracker (ITk) before the start of the High Luminosity-LHC project (HL-LHC) in 2026. To cope with the harsh environment expected at the HL-LHC, new approaches are being developed for pixel detectors based on CMOS technology. Such detectors can provide charge collection, analog amplification and digital processing in the same silicon wafer. The radiation hardness is improved thanks to multiple nested wells which give the embedded CMOS electronics sufficient shielding. The goal of this programme is to demonstrate that depleted CMOS pixels are suitable for high-rate, fast-timing and high-radiation operation at the LHC. A number of alternative solutions have been explored and characterised. In this document, test results of the sensors fabricated in different CMOS processes are reported.

  11. 2008 LHC Open Days LHC magnets on display

    CERN Multimedia

    2008-01-01

    Over the last few years you’ve probably seen many of the 15 m long blue LHC dipole magnets being ferried around the site. Most of them are underground now, but on the LHC Open Days on 5 and 6 April the magnets will also play a central role on the surface. Installation of one of the LHC dipole magnets on the Saint-Genis roundabout on 7 March. The LHC dipole testing facility with several magnets at various stages of testing. The 27 km ring of the LHC consists of 1232 double-aperture superconducting dipole magnets, 360 short straight sections (SSS) and 114 special SSS for the insertion regions. On the Open Day, you will be able to "Follow the LHC magnets" through different stages around the site, culminating in their descent into the tunnel. Discover all the many components that have to be precisely integrated in the magnet casings, and talk to the engine...

  12. LHC Report: Tests of new LHC running modes

    CERN Document Server

    Verena Kain for the LHC team

    2012-01-01

    On 13 September, the LHC collided lead ions with protons for the first time. This outstanding achievement was key preparation for the planned 2013 operation in this mode. Outside of two special physics runs, the LHC has continued productive proton-proton luminosity operation.   Celebrating proton-ion collisions. The first week of September added another 1 fb-1 of integrated luminosity to ATLAS’s and CMS’s proton-proton data set. It was a week of good and steady production mixed with the usual collection of minor equipment faults. The peak performance was slightly degraded at the start of the week but thanks to the work of the teams in the LHC injectors the beam brightness – and thus the LHC peak performance – were restored to previous levels by the weekend. The LHC then switched to new running modes and spectacularly proved its potential as a multi-purpose machine. This is due in large part to the LHC equipment and controls, which have been designed wi...

  13. The radiation monitoring system for the LHC experiments and experimental areas

    CERN Document Server

    Ilgner, C

    2004-01-01

    With the high energies stored in the beams of the LHC, special attention needs to be paid to accident scenarios involving beam losses which may have an impact on the installed experiments. Among others, an unsynchronized beam abort and a D1 magnet failure are considered serious cases. According to simulations, the CMS inner tracker in such accident scenarios can be damaged by instantaneous rates which are many orders of magnitude above normal conditions. Investigations of synthetic diamond as a beam condition monitor sensor, capable of generating a fast beam dump signal, will be presented. Furthermore, a system to monitor the radiation fields in the experimental areas is being developed. It must function in the radiation fields inside and around the experiments, over a large dynamic range. Several new active and passive sensors, such as RadFET, OSL (Optically Stimulated Luminescence) sensors, p-i-n diodes, Polymer-Alanine Dosimeters and TLDs (Thermoluminescent Dosimeters) are under investigation. Recent resul...

  14. AUTOMATING THE CONFIGURATION OF THE CONTROLS SYSTEMS OF THE LHC EXPERIMENTS

    CERN Multimedia

    Calheiros, F; Varela, F

    2007-01-01

    The supervisory layer of the Large Hadron Collider (LHC) experiments is based on the Prozessvisualisierungs- und Steuerungssystem (PVSS) [1] and the Joint COntrols Project (JCOP) Framework (FW) [2]. This controls framework includes a Finite State Machine (FSM) toolkit, which allows the control systems to be operated according to a well-defined set of states and commands. During the FSM transitions of the detectors, it is required to re-configure parts of the control systems. All configuration parameters of the devices integrated into the control system are stored in the so-called configuration database. In this paper the JCOP FW FSM-Configuration database tool is presented. This tool ensures the availability of all required configuration data, for a given type of run of the experiment, in the PVSS sub-detector control applications. The chosen implementation strategy is discussed in the paper. The approach enables the standalone operation of different partitions of the detectors simultaneously while ensuring indepe...

  15. 6 March 2013 - Committee for Employment and Learning, Northern Ireland Legislative Assembly, United Kingdom of Great Britain and Northern Ireland in the LHC tunnel and visiting the LHCb experiment at LHC Point 8. Director for Accelerators and Technology S. Myers with Vice-Chair T. Buchanan.

    CERN Multimedia

    Anna Pantelia

    2013-01-01


  16. QCD@LHC International Conference

    CERN Document Server

    2016-01-01

    The particle physics groups of UZH and ETH will host the QCD@LHC2016 conference (22.8.-26.8., UZH downtown campus), which is part of an annual conference series bringing together theorists and experimentalists working on hard scattering processes at the CERN LHC, ranging from precision studies of Standard Model processes to searches for new particles and phenomena. The format of the conference is a combination of plenary review talks and parallel sessions, with the latter providing a particularly good opportunity for junior researchers to present their results. The conference will take place shortly after the release of the new data taken by the LHC in spring 2016 at a collision energy of 13 TeV, expected to more than double the currently available data set. It will be one of the first opportunities to discuss these data in a broader context, and we expect the conference to become a very lively forum at the interface of phenomenology and experiment.

  17. Lectures on LHC physics

    CERN Document Server

    Plehn, Tilman

    2015-01-01

    With the discovery of the Higgs boson, the LHC experiments have closed the most important gap in our understanding of fundamental interactions, confirming that such interactions between elementary particles can be described by quantum field theory, more specifically by a renormalizable gauge theory. This theory is a priori valid for arbitrarily high energy scales and does not require an ultraviolet completion. Yet, when trying to apply the concrete knowledge of quantum field theory to actual LHC physics - in particular to the Higgs sector and certain regimes of QCD - one inevitably encounters an intricate maze of phenomenological know-how, common lore and other, often historically developed intuitions about what works and what doesn’t. These lectures cover three aspects to help understand LHC results in the Higgs sector and in searches for physics beyond the Standard Model: they discuss the many facets of Higgs physics, which is at the core of this significantly expanded second edition; then QCD, to the deg...

  18. Lead ions and Coulomb’s Law at the LHC (CERN)

    Science.gov (United States)

    Cid-Vidal, Xabier; Cid, Ramon

    2018-03-01

    Although for most of the time the Large Hadron Collider (LHC) at CERN collides protons, for around one month every year lead ions are collided, to expand the diversity of the LHC research programme. Furthermore, in an effort not originally foreseen, proton-lead collisions are also taking place, with results of high interest to the physics community. All the large experiments of the LHC have now joined the heavy-ion programme, including the LHCb experiment, which was not at first expected to be part of it. The aim of this article is to introduce a few simple physical calculations relating to some electrical phenomena that occur when lead-ion bunches are running in the LHC, using Coulomb’s Law, to be taken to the secondary school classroom to help students understand some important physical concepts.
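    In the spirit of the classroom calculations the article describes, here is a minimal Coulomb's-law example: the repulsive force between two fully stripped lead nuclei (Z = 82). The 1 nm separation is an arbitrary illustrative choice, not a figure from the article:

    ```python
    # Coulomb's law F = k * q1 * q2 / r^2 for two fully ionized Pb nuclei.
    K = 8.9875517923e9          # Coulomb constant [N m^2 C^-2]
    E_CHARGE = 1.602176634e-19  # elementary charge [C]

    def coulomb_force(q1, q2, r):
        """Magnitude of the electrostatic force between two point charges [N]."""
        return K * q1 * q2 / r**2

    q_pb = 82 * E_CHARGE                     # charge of a bare Pb nucleus (Z = 82)
    force = coulomb_force(q_pb, q_pb, 1e-9)  # two ions 1 nm apart (assumed distance)
    print(f"force at 1 nm: {force:.2e} N")
    ```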

  19. Operational Experience with the ATLAS Pixel Detector at LHC

    CERN Document Server

    Keil, M

    2013-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus crucial for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via front-end chips bump-bonded to 1744 n-on-n silicon substrates. In this paper results from the successful operation of the Pixel Detector at the LHC will be presented, including calibration procedures, detector performance and measurements of radiation damage. The detector performance is excellent: more than 95% of the pixels are operational, noise occupancy and hit efficiency exceed the des...

  20. Contribution of thermo-fluid analyses to the LHC experiments

    CERN Document Server

    Gasser, G

    2003-01-01

    The large amount of electrical and electronic equipment that will be installed in the four LHC experiments will cause significant heat dissipation into the detectors’ volumes. This is a major issue for the experimental groups, as temperature stability is often a fundamental requirement for the different sub-detectors to be able to provide a good measurement quality. The thermo-fluid analyses that are carried out in the ST/CV group are a very efficient tool to understand and predict the thermal behaviour of the detectors. These studies are undertaken according to the needs of the experimental groups; they aim to evaluate the thermal stability of a proposed design, or to compare different technical solutions in order to choose the best one for the final design. The usual approach to these studies is presented first, and then some practical examples of thermo-fluid analyses are presented, focusing on the main results in order to illustrate their contribution.

  1. Development of a timing detector for the TOTEM experiment at the LHC

    Science.gov (United States)

    Minafra, Nicola

    2017-09-01

    The upgrade program of the TOTEM experiment will include the installation of timing detectors inside vertical Roman Pots to allow the reconstruction of the longitudinal vertex position in the presence of event pile-up in dedicated high-β* runs. The small space available inside the Roman Pot, optimized for high-intensity LHC runs, and the required time precision led to the study of a solution using single-crystal CVD diamonds. The sensors are read out using fast low-noise front-end electronics developed by the TOTEM Collaboration, achieving a signal-to-noise ratio larger than 20 for MIPs. A prototype was designed, manufactured and tested during a test beam campaign, proving a time precision below 100 ps and an efficiency above 99%. The geometry of the detector has been designed to guarantee uniform occupancy in the expected running conditions while keeping the number of channels below 12. The read-out electronics was developed during an extensive campaign of beam tests dedicated first to the characterization of existing solutions and then to the optimization of the electronics designed within the Collaboration. The detectors were designed to be read out using the SAMPIC chip, a fast sampler designed specifically for picosecond timing measurements with high-rate capabilities; later, a modified version was realized using the HPTDC to achieve the higher trigger rates required for the CT-PPS experiment. The first set of prototypes was successfully installed and tested in the LHC in November 2015; moreover, the detectors modified for CT-PPS have been successfully part of the global CMS data taking since October 2016.
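    The quoted ~100 ps precision translates directly into a longitudinal vertex resolution: with one timing detector on each side, z = c(t1 - t2)/2, so two independent 100 ps measurements give σ_z = c·σ_t/√2, about 2 cm. A sketch of that relation (illustrative only, not TOTEM code):

    ```python
    import math

    C = 299792458.0  # speed of light [m/s]

    def vertex_z_resolution(sigma_t_per_arm_s):
        """z-resolution from z = c*(t1 - t2)/2 with equal, independent arms."""
        # Var(z) = (c/2)^2 * (sigma_t^2 + sigma_t^2)  =>  sigma_z = c*sigma_t/sqrt(2)
        return C * sigma_t_per_arm_s / math.sqrt(2.0)

    sigma_z = vertex_z_resolution(100e-12)  # 100 ps per arm, from the text
    print(f"vertex z resolution: {sigma_z * 100:.1f} cm")  # ~2.1 cm
    ```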

  2. Petroleum refining industry in China

    International Nuclear Information System (INIS)

    Walls, W.D.

    2010-01-01

    The oil refining industry in China has faced rapid growth in oil imports of increasingly sour grades of crude with which to satisfy growing domestic demand for a slate of lighter and cleaner finished products sold at subsidized prices. At the same time, the world petroleum refining industry has been moving from one that serves primarily local and regional markets to one that serves global markets for finished products, as world refining capacity utilization has increased. Globally, refined product markets are likely to experience continued globalization until refining investments significantly expand capacity in key demand regions. We survey the oil refining industry in China in the context of the world market for heterogeneous crude oils and growing world trade in refined petroleum products. (author)

  3. LHC, Astrophysics and Cosmology

    Directory of Open Access Journals (Sweden)

    Giulio Auriemma

    2014-12-01

    In this paper we discuss the impact on cosmology of recent results obtained by the LHC (Large Hadron Collider) experiments in the 2011-2012 runs, at √s = 7 and 8 TeV respectively. The capital achievement of the LHC in this period has been the discovery of a spin-0 particle with mass 126 GeV/c2, very similar to the Higgs boson of the Standard Model of particle physics. Less exciting, but no less important, negative results of searches for supersymmetric particles or other exotica, in direct production or rare decays, are discussed in connection with particle and V.H.E. astronomy searches for Dark Matter.

  4. Switch on to the LHC!

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    The LHC is preparing to collide beams at 3.5 TeV for the first time ever! Be part of the event and follow live what goes on at the world’s most powerful particle accelerator by connecting to LHC1. Below we give you a key to understanding the display, as well as a typical event display from the ATLAS and CMS experiments. Click on the image to enlarge it 1. This is the energy of the beams. 1 TeV = 1000 GeV. The LHC set the world record energy of 3.48 TeV per beam today, 19 March 2010. 2. The intensity of B1 (blue) and B2 (red), respectively. 3. The information in these boxes can vary. Operators display the graphs that are relevant to the specific operation. 4. Most of the flags are set automatically. They provide a quick summary of the machine status. In order to have collisions the ‘Stable Beams’ flag must be set to green. 5. Here operators write their messages to the experiments. Often they note the ongoing activity, followed by the plan for the coming hou...

  5. Changes to the LHC Beam Dumping System for LHC Run 2

    CERN Document Server

    Uythoven, Jan; Borburgh, Jan; Carlier, Etienne; Gabourin, Stéphane; Goddard, Brennan; Magnin, Nicolas; Senaj, Viliam; Voumard, Nicolas; Weterings, Wim

    2014-01-01

    The LHC beam dumping system performed according to expectations during Run 1 of the LHC (2009–2013). A brief overview of the experience is given, including a summary of the observed performance compared to expectations. A significant number of changes are being applied to the beam dumping system during the present Long Shutdown in order to further improve its safety and performance. They include the addition of a direct link between the Beam Interlock System and the re-triggering system of the dump kickers, the modification of the uninterrupted electrical power distribution architecture, the upgrade of the HV generators, the consolidation of the trigger synchronisation system, modifications to the triggering system of the power switches, and changes to the dump absorbers TCDQ.

  6. LHC Interaction Region Upgrade Phase I

    CERN Document Server

    Ostojic, R

    2009-01-01

    The LHC is starting operation with beam in 2008. The primary goal of CERN and the LHC community is to ensure that the collider is operated efficiently, maximizing its physics reach, and to achieve the nominal performance in the shortest term. For several years the community has been discussing the directions for upgrading the experiments, in particular ATLAS and CMS, the LHC machine and the CERN proton injector complex. A well-substantiated and coherent scenario for the first phase of the upgrade, foreseen in 2013, has now been approved by the CERN Council. In this paper, we present the goals and the proposed conceptual solution for the Phase-I upgrade of the LHC interaction regions. This phase relies on the mature Nb-Ti superconducting magnet technology, with the target of increasing the luminosity by a factor of 2-3 with respect to the nominal luminosity of 10^34 cm^-2 s^-1, while maximising the use of the existing infrastructure.

  7. LHC and CLIC LLRF final reports

    CERN Document Server

    Dexter, A; Woolley, B; Ambattu, P; Tahir, I; Syratchev, Igor; Wuensch, Walter

    2013-01-01

    Crab cavities rotate bunches from opposing beams to achieve effective head-on collisions in CLIC, or collisions at an adjustable angle in the LHC. Without crab cavities, 90% of the achievable luminosity at CLIC would be lost. In the LHC, the crab cavities allow the same or larger integrated luminosity while significantly reducing the required dynamic range of the physics detectors. The focus for CLIC is accurate phase synchronisation of the cavities, adequate damping of wakefields and modest amplitude stability. For the LHC, the main LLRF issues are related to imperfections: beam offsets in cavities, RF noise, measurement noise in feedback loops, failure modes and mitigations. This report develops issues associated with synchronising the CLIC cavities. It defines an RF system and experiments to validate the approach. It reports on the development of hardware for measuring the phase performance of the RF distribution system and cavities. For the LHC, the hardware being very close to the existing LLRF, the report focuses on...

  8. The VZERO detector, the present muon physics and its future with the ALICE experiment at the LHC

    International Nuclear Information System (INIS)

    Tieulent, R.

    2013-01-01

    The ALICE experiment studies Pb-Pb, proton-Pb and proton-proton collisions at the LHC to assess the fundamental features of the quark-gluon plasma (QGP). A brief introduction to the QGP and the physics of heavy ions is given in the first chapter. A detector named VZERO, composed of 2 hodoscopes made up of organic scintillators located on either side of the collision point, has been designed. The main purpose of VZERO is to provide the triggering signal for the ALICE experiment and to provide a second triggering signal sensitive to the energy density released in the collision. VZERO is described in the second chapter. The QGP can be studied through various observables. The muon is one of the most promising, as muons are produced at every stage of the QGP evolution and can be detected easily since they interact only weakly with the plasma. The muon spectrometer and its alignment system are described in chapter 3. Low-mass vector mesons such as the ρ meson are sensitive to medium effects and to the restoration of chiral symmetry. Chiral symmetry is spontaneously broken in QCD at ordinary energies and densities, but its restoration is predicted by QCD calculations at the temperatures reached at the LHC. The study of low-mass vector mesons is described in the fourth chapter. A new step forward for the ALICE experiment is being prepared in order to benefit fully from the increase in both luminosity and energy of the LHC in 2018. A new detector based on silicon pixels, the Muon Forward Tracker (MFT), is being designed. The experimental data of the muon spectrometer combined with those of the MFT will open a new road for muon physics. The last chapter is dedicated to the MFT

  9. Enabling technologies for silicon microstrip tracking detectors at the HL-LHC

    International Nuclear Information System (INIS)

    Feld, L.; Karpinski, W.; Klein, K.

    2016-04-01

    While the tracking detectors of the ATLAS and CMS experiments have shown excellent performance in Run 1 of LHC data taking, and are expected to continue to do so during LHC operation at design luminosity, both experiments will have to exchange their tracking systems when the LHC is upgraded to the high-luminosity LHC (HL-LHC) around the year 2024. The new tracking systems need to operate in an environment in which both the hit densities and the radiation damage will be about an order of magnitude higher than today. In addition, the new trackers need to contribute to the first level trigger in order to maintain a high data-taking efficiency for the interesting processes. Novel detector technologies have to be developed to meet these very challenging goals. The German groups active in the upgrades of the ATLAS and CMS tracking systems have formed a collaborative ''Project on Enabling Technologies for Silicon Microstrip Tracking Detectors at the HL-LHC'' (PETTL), which was supported by the Helmholtz Alliance ''Physics at the Terascale'' during the years 2013 and 2014. The aim of the project was to share experience and to work together on key areas of mutual interest during the R and D phase of these upgrades. The project concentrated on five areas, namely exchange of experience, radiation hardness of silicon sensors, low mass system design, automated precision assembly procedures, and irradiations. This report summarizes the main achievements.

  10. Enabling technologies for silicon microstrip tracking detectors at the HL-LHC

    Energy Technology Data Exchange (ETDEWEB)

    Feld, L.; Karpinski, W.; Klein, K. [RWTH Aachen Univ. (Germany). 1. Physikalisches Institut B; Collaboration: The PETTL Collaboration; and others

    2016-04-15

    While the tracking detectors of the ATLAS and CMS experiments have shown excellent performance in Run 1 of LHC data taking, and are expected to continue to do so during LHC operation at design luminosity, both experiments will have to exchange their tracking systems when the LHC is upgraded to the high-luminosity LHC (HL-LHC) around the year 2024. The new tracking systems need to operate in an environment in which both the hit densities and the radiation damage will be about an order of magnitude higher than today. In addition, the new trackers need to contribute to the first level trigger in order to maintain a high data-taking efficiency for the interesting processes. Novel detector technologies have to be developed to meet these very challenging goals. The German groups active in the upgrades of the ATLAS and CMS tracking systems have formed a collaborative ''Project on Enabling Technologies for Silicon Microstrip Tracking Detectors at the HL-LHC'' (PETTL), which was supported by the Helmholtz Alliance ''Physics at the Terascale'' during the years 2013 and 2014. The aim of the project was to share experience and to work together on key areas of mutual interest during the R and D phase of these upgrades. The project concentrated on five areas, namely exchange of experience, radiation hardness of silicon sensors, low mass system design, automated precision assembly procedures, and irradiations. This report summarizes the main achievements.

  11. Electron cloud in the CERN accelerators (PS, SPS, LHC)

    International Nuclear Information System (INIS)

    Iadarola, G; Rumolo, G

    2013-01-01

    Several indicators have pointed to the presence of an Electron Cloud (EC) in some of the CERN accelerators when operating with closely spaced bunched beams. In particular, spurious signals on the pick-ups used for beam detection, pressure rise and beam instabilities were observed at the Proton Synchrotron (PS) during the last stage of preparation of the beams for the Large Hadron Collider (LHC), as well as at the Super Proton Synchrotron (SPS). Since the LHC started operation in 2009, typical electron cloud phenomena have also appeared in this machine when running with trains of closely packed bunches (i.e. with spacings below 150 ns). Besides the above-mentioned indicators, other typical signatures were seen in this machine (due to its operation mode and/or more refined detection possibilities), such as heat load in the cold dipoles, bunch-dependent emittance growth and degraded lifetime in store, and bunch-by-bunch stable phase shift to compensate for the energy loss due to the electron cloud. An overview of the electron cloud status in the different CERN machines (PS, SPS, LHC) will be presented in this paper, with special emphasis on the dangers for future operation with more intense beams and the necessary countermeasures to mitigate or suppress the effect. (author)

  12. Proposal for a blanket purchase agreement for the supply and repair of subracks for the LHC experiments

    CERN Document Server

    2002-01-01

    This document concerns the award of a blanket purchase agreement for the supply and repair of subracks for the LHC experiments. Following a market survey carried out among 27 firms in seven Member States and one firm in a non-Member State, a call for tenders (IT-2916/EP) was sent on 9 November 2001 to 16 firms in five Member States. By the closing date, CERN had received six tenders. The Finance Committee is invited to agree to the negotiation of a blanket purchase agreement with WIENER, PLEIN & BAUS (DE) for the supply of subracks for a period of four years and a repair service for a period of ten years after expiry of the initial two year guarantee period, for a total amount not exceeding 5 600 000 euros, subject to revision for inflation from 1 January 2003. At the present rate of exchange, the total amount of the blanket purchase agreement is equivalent to approximately 8 300 000 Swiss francs. This requirement will be financed by the collaborating institutes of the LHC experiments and by CERN. CERN's ...

  13. SEAL: Common Core Libraries and Services for LHC Applications

    CERN Document Server

    Generowicz, J; Moneta, L; Roiser, S; Marino, M; Tuura, L A

    2003-01-01

    The CERN LHC experiments began the LHC Computing Grid project in 2001. One of the project's aims is to develop a common software infrastructure based on a development vision shared by the participating experiments. The SEAL project will provide common foundation libraries, services and utilities identified by the project's architecture blueprint report. This requires a broad range of functionality that no individual package suitably covers. SEAL thus selects external and experiment-developed packages, integrates them into a coherent whole, develops new code for missing functionality, and provides support to the experiments. We describe the set of basic components identified by the LHC Computing Grid project and thought to be sufficient for the development of higher-level framework components and specializations. Examples of such components are a plug-in manager, an object dictionary, object whiteboards, and an incident or event manager. We present the design and implementation of some of these components and the und...

  14. Enabling Technologies for Silicon Microstrip Tracking Detectors at the HL-LHC

    CERN Document Server

    Barth, C; Bloch, I.; Bögelspacher, F.; de Boer, W.; Daniels, M.; Dierlamm, A.; Eber, R.; Eckerlin, G.; Eckstein, D.; Eichhorn, T.; Erfle, J.; Feld, L.; Garutti, E.; Gregor, I. -M.; Guthoff, M.; Hartmann, F.; Hauser, M.; Husemann, U.; Jakobs, K.; Junkes, A.; Karpinski, W.; Klein, K.; Kuehn, S.; Lacker, H.; Mahboubi, K.; Müller, Th.; Mussgiller, A.; Nürnberg, A.; Parzefall, U.; Poehlsen, T.; Poley, L.; Preuten, M.; Rehnisch, L.; Sammet, J.; Schleper, P.; Schuwalow, S.; Sperlich, D.; Stanitzki, M.; Steinbrück, G.; Wlochal, M.

    2016-01-01

    While the tracking detectors of the ATLAS and CMS experiments have shown excellent performance in Run 1 of LHC data taking, and are expected to continue to do so during LHC operation at design luminosity, both experiments will have to exchange their tracking systems when the LHC is upgraded to the high-luminosity LHC (HL-LHC) around the year 2024. The new tracking systems need to operate in an environment in which both the hit densities and the radiation damage will be about an order of magnitude higher than today. In addition, the new trackers need to contribute to the first level trigger in order to maintain a high data-taking efficiency for the interesting processes. Novel detector technologies have to be developed to meet these very challenging goals. The German groups active in the upgrades of the ATLAS and CMS tracking systems have formed a collaborative "Project on Enabling Technologies for Silicon Microstrip Tracking Detectors at the HL-LHC" (PETTL), which was supported by the Helmholtz Alliance "Phys...

  15. Turning the LHC Ring into a New Physics Search Machine

    CERN Document Server

    Kalliokoski, Matti; Mieskolainen, Mikael; Orava, Risto

    2016-01-01

    By combining the LHC Beam Loss Monitoring (BLM) system with the LHC experiments, a powerful search machine for new physics beyond the Standard Model can be realised. The pair of final-state protons from the central production process exits the LHC beam vacuum chamber at locations determined by the protons' fractional momentum losses and will be detected by the BLM detectors. By mapping out the coincident pairs of BLM-identified proton candidates around the four LHC interaction regions, a scan for centrally produced particle states can be made independently of their decay modes.
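    The mass reach of such a scan follows from simple kinematics: in central exclusive production, the invariant mass of the central system is fixed by the two fractional momentum losses via M = √(s·ξ₁·ξ₂). A minimal illustrative sketch of this relation (the function name and the 1% example values are assumptions for illustration, not taken from the paper):

    ```python
    import math

    def central_mass(sqrt_s_tev, xi1, xi2):
        """Invariant mass (TeV) of a centrally produced system given the
        fractional momentum losses xi1, xi2 of the two forward protons:
        M = sqrt(s * xi1 * xi2) = sqrt(s) * sqrt(xi1 * xi2)."""
        return sqrt_s_tev * math.sqrt(xi1 * xi2)

    # Example: 13 TeV collisions, both protons losing 1% of their momentum
    m = central_mass(13.0, 0.01, 0.01)  # approx. 0.13 TeV, i.e. 130 GeV
    ```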

  16. LHC project. Exploring the smallest world with the highest energy beam

    International Nuclear Information System (INIS)

    Kondo, Takahiko; Kobayashi, Tomio

    2007-01-01

    The LHC accelerator at CERN will be completed soon and the experiments are about to start, making it possible to explore the TeV energy region for the first time in human history. There is a clear reason why the TeV region is especially important for experimental exploration: the Higgs particle, the last elusive element of the Standard Model, will very probably be discovered there. In addition, there are high chances of discovering signs of new physics beyond the Standard Model, such as SUSY particles, and dark matter may be discovered. As an introduction to this mini-special issue on the LHC, its goals and history are briefly reviewed, followed by a description of the LHC accelerator, the four LHC experiments and the contributions by Japan. (author)

  17. Perspectives of SM Higgs measurements at the LHC

    Indian Academy of Sciences (India)

    ... where significant signals can be expected from the LHC experiments. The most sensitive LHC Higgs signatures are reviewed and the discovery year is estimated as a function of the Higgs mass. Finally, we give some ideas about: 'What might be known about the production and decays of a SM Higgs boson' after 10 years ...

  18. LHC Report: A new luminosity record

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    After about one month of operation, the LHC has already accumulated an integrated luminosity of 28 pb^-1, which corresponds to over 50% of the total delivered to the experiments in 2010. This impressive start to the LHC run in 2011 bodes well for the rest of the year.   Following careful collimator set-up and validation, the first phase of beam commissioning 2011 has come to an end. The first stable beams were declared on Sunday 13 March with a moderate 3 bunches per beam and an initial luminosity of 1.6 × 10^30 cm^-2s^-1. Machine protection tests continued during the following week as the commissioning team made absolutely sure that all critical systems (beam dumps, beam interlock system, etc.) were functioning properly. When these tests had finished, the way was opened to increased intensity and the LHC quickly moved through the first part of its planned, staged intensity increase. Fills with increasing numbers of bunches were delivered to the experiments, culminating in a fill with 200...
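    For orientation, instantaneous luminosity relates to integrated luminosity by L_int = ∫L dt, with 1 pb^-1 = 10^36 cm^-2. A rough back-of-the-envelope sketch, assuming for simplicity a constant luminosity over a fill (real fills decay, so this overestimates):

    ```python
    CM2_PER_INV_PB = 1e36  # 1 pb^-1 = 10^36 cm^-2

    def integrated_lumi_pb(inst_lumi_cm2s, hours):
        """Integrated luminosity in pb^-1 for a fill of constant
        instantaneous luminosity (in cm^-2 s^-1) lasting `hours`."""
        return inst_lumi_cm2s * hours * 3600.0 / CM2_PER_INV_PB

    # e.g. the initial 2011 luminosity of 1.6e30 cm^-2 s^-1 over a 10-hour fill
    lumi = integrated_lumi_pb(1.6e30, 10.0)  # approx. 0.058 pb^-1
    ```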

  19. LHC Report: highs and wet lows

    CERN Multimedia

    Enrico Bravin and Stefano Redaelli for the LHC team

    2016-01-01

    Summertime, and the livin’ is easy… not so for the LHC, which is just entering four weeks of full-on luminosity production.   In the two weeks that followed the first technical stop (7-9 June), the LHC once again demonstrated outstanding performance. Thanks to the excellent availability of all systems, peaking at 93% in week 24, it was possible to chain physics fill after physics fill, with 60% of the time spent in collisions. We have now surpassed the total integrated luminosity delivered in 2015 (4.2 fb^-1). The integrated luminosity for 2016 now exceeds 6 fb^-1 for each of the two high-luminosity experiments, ATLAS and CMS. Long fills, exceeding 20 hours, are now part of regular operation, with some producing more than 0.5 fb^-1. With the summer conferences approaching, this certainly provides a good dataset for the LHC experiments to analyse and present. Several records were broken again, namely the highest instantaneous luminosity – over 9 × 10^33 cm^-2...

  20. Virtual reality visualization algorithms for the ALICE high energy physics experiment on the LHC at CERN

    Science.gov (United States)

    Myrcha, Julian; Trzciński, Tomasz; Rokita, Przemysław

    2017-08-01

    Analyzing massive amounts of data gathered during many high energy physics experiments, including but not limited to the LHC ALICE detector experiment, requires efficient and intuitive methods of visualisation. One of the possible approaches to that problem is stereoscopic 3D data visualisation. In this paper, we propose several methods that provide high quality data visualisation and we explain how those methods can be applied in virtual reality headsets. The outcome of this work is easily applicable to many real-life applications needed in high energy physics and can be seen as a first step towards using fully immersive virtual reality technologies within the frames of the ALICE experiment.

  1. High-Luminosity LHC moves to the next phase

    CERN Multimedia

    2015-01-01

    This week saw several meetings vital for the medium-term future of CERN.    From Monday to Wednesday, the Resource Review Board, RRB, that oversees resource allocation in the LHC experiments, had a series of meetings. Thursday then saw the close-out meeting for the Hi-Lumi LHC design study, which was partially funded by the European Commission. These meetings focused on the High Luminosity upgrade for the LHC, which responds to the top priority of the European Strategy for Particle Physics adopted by the CERN Council in 2013. This upgrade will transform the LHC into a facility for precision studies, the logical next step for the high-energy frontier of particle physics. It is a challenging upgrade, both for the LHC and the detectors. The LHC is already the highest luminosity hadron collider ever constructed, generating up to a billion collisions per second at the heart of the detectors. The High Luminosity upgrade will see that number rise by a factor of five from 2025. For the detectors...

  2. UFOs in the LHC: Observations, studies and extrapolations

    CERN Document Server

    Baer, T; Cerutti, F; Ferrari, A; Garrel, N; Goddard, B; Holzer, EB; Jackson, S; Lechner, A; Mertens, V; Misiowiec, M; Nebot del Busto, E; Nordt, A; Uythoven, J; Vlachoudis, V; Wenninger, J; Zamantzas, C; Zimmermann, F; Fuster, N

    2012-01-01

    Unidentified falling objects (UFOs) are potentially a major luminosity limitation for nominal LHC operation. They are presumably micrometer sized dust particles which lead to fast beam losses when they interact with the beam. With large-scale increases and optimizations of the beam loss monitor (BLM) thresholds, their impact on LHC availability was mitigated from mid 2011 onwards. For higher beam energy and lower magnet quench limits, the problem is expected to be considerably worse, though. In 2011/12, the diagnostics for UFO events were significantly improved: dedicated experiments and measurements in the LHC and in the laboratory were made and complemented by FLUKA simulations and theoretical studies. The state of knowledge, extrapolations for nominal LHC operation and mitigation strategies are presented

  3. CMOS Pixel Development for the ATLAS Experiment at HL-LHC

    CERN Document Server

    Ristic, Branislav; The ATLAS collaboration

    2017-01-01

    To cope with the rate and radiation environment expected at the HL-LHC, new approaches to CMOS pixel detectors are being developed, providing charge collection in a depleted layer. They are based on technologies that allow high depletion voltages (HV-MAPS) and high-resistivity wafers (HR-MAPS) for large depletion depths, and on radiation-hard processes with multiple nested wells that allow CMOS electronics to be embedded safely in the sensor substrate. We are investigating depleted CMOS pixels with monolithic or hybrid designs concerning their suitability for high-rate, fast-timing and high-radiation operation at the LHC. This paper discusses recent results on the main candidate technologies and the current development towards a monolithic solution.

  4. Academic Training - LHC luminosity upgrade: detector challenges

    CERN Multimedia

    Françoise Benz

    2006-01-01

    ACADEMIC TRAINING LECTURE SERIES 13, 14, 15, March, from 11:00 to 12:00 - 16 March from 10:00 to 12:00 Main Auditorium, bldg. 500 on 14, 15 March, Council Room on 13, 16 March LHC luminosity upgrade: detector challenges A. De Roeck / CERN-PH, D. Bortoletto / Purdue Univ. USA, R. Wigmans / Texas, Tech Univ. USA, W. Riegler / CERN-PH, W. Smith / Wisconsin Univ. USA The upgrade of the LHC machine towards higher luminosity (10^35 cm^-2s^-1) has been studied over the last few years. These studies have investigated scenarios to achieve the increase in peak luminosity by an order of magnitude, as well as the physics potential of such an upgrade and the impact of a machine upgrade on the LHC detectors. This series of lectures will cover the following topics: Physics motivation and machine scenarios for an order of magnitude increase in the LHC peak luminosity (lecture 1) Detector challenges including overview of ideas for R&D programs by the LHC experiments: tracking and calorimetry, other new detector ...

  5. ATLAS Plans for the High-Luminosity LHC

    CERN Document Server

    Walkowiak, Wolfgang; The ATLAS collaboration

    2018-01-01

    Despite the excellent performance of the Large Hadron Collider (LHC) at CERN, an upgrade to a High-Luminosity LHC (HL-LHC) with a peak instantaneous luminosity of up to $7.5\\times 10^{34}$ cm$^{-2}$s$^{-1}$ will be required after collecting a total dataset of approximately 300 fb$^{-1}$ by the end of Run 3 (in 2023). The upgrade will substantially increase the statistics available to the experiments for addressing the remaining open puzzles of particle physics. The HL-LHC is expected to start operating in 2026 and to deliver up to 4000 fb$^{-1}$ within twelve years. The corresponding upgrades of the ATLAS detector and the ATLAS beauty physics program at the HL-LHC are discussed. As examples, preliminary results on the expected sensitivities for the search for CP violation in the decay channel $B^0_s \\to J/\\psi \\,\\phi$ using the parameters $\\Delta\\Gamma_s$ and $\\phi_s$, as well as projections for the branching fractions of the rare decays $B^0_s \\to \\mu^+\\mu^-$ and $B^0\\to\\mu^+\\mu^-$, are provided.

  6. LHCb: Numerical Analysis of Machine Background in the LHCb Experiment for the Early and Nominal Operation of LHC

    CERN Multimedia

    Lieng, M H; Corti, G; Talanov, V

    2010-01-01

    We consider the formation of machine background induced by proton losses in the long straight section of the LHCb experiment at the LHC. Both showering from the tertiary collimators located in the LHCb insertion region and local beam-gas interactions are taken into account. We present the procedure for, and results of, numerical studies of such background for various conditions. Additionally, the expected impact on the experiment and the signal characteristics are discussed.

  7. Application of diamond based beam loss monitors at LHC

    International Nuclear Information System (INIS)

    Hempel, Maria

    2013-04-01

    The Large Hadron Collider (LHC) was conceived in the 1980s and started operation in 2008. More than 20 years were needed to plan and construct this accelerator and its experiments. Four main experiments are located around the ring: the Compact Muon Solenoid (CMS), A Toroidal LHC Apparatus (ATLAS), A Large Ion Collider Experiment (ALICE) and LHC beauty (LHCb). Two beams travelling in opposite directions in the LHC tunnel collide in each of the experiments to study the questions: ''What is mass?'', ''What is the universe made of?'' and ''Why is there no antimatter?''. The four experiments take data on the collision products and try to answer these fundamental questions of physics. The two larger detectors, CMS and ATLAS, are looking for the Higgs boson to study electroweak symmetry breaking. The two detectors were built with contrasting concepts to exclude potential error sources and to reaffirm each other's results. The smaller experiment LHCb studies the matter-antimatter asymmetry with a focus on the beauty quark. Another smaller experiment is ALICE, which studies the conditions right after the Big Bang by colliding heavy ions. The beams are steered by over 10000 magnets, and each beam has a stored energy of 362 MJ, which corresponds to the kinetic energy of a train like the TGV travelling at 150 km/h. Even a small percentage of that energy can damage the material in the LHC ring or the magnets. This would mean a repair time of months or years, without taking any data. To avoid such a scenario, it is important to monitor the beam conditions and measure the amount of beam losses. Such losses can for example happen due to dust particles in the vacuum chambers or due to deviations of the beam parameters. Several systems called beam loss monitors (BLMs) can measure beam losses. This thesis concentrates on two of them: ionization chambers and diamond detectors. Over 3600 ionization chambers are installed in the LHC, especially near each quadrupole and next to
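    The 362 MJ figure follows directly from the nominal beam parameters (2808 bunches of 1.15 × 10^11 protons at 7 TeV). A quick arithmetic check, with the function name being an illustrative assumption:

    ```python
    EV_TO_J = 1.602176634e-19  # elementary charge in joules per eV (exact SI value)

    def stored_energy_mj(n_bunches, protons_per_bunch, beam_energy_tev):
        """Total stored energy of one beam in megajoules:
        (number of protons) x (energy per proton)."""
        total_ev = n_bunches * protons_per_bunch * beam_energy_tev * 1e12
        return total_ev * EV_TO_J / 1e6

    # Nominal LHC: 2808 bunches of 1.15e11 protons at 7 TeV
    energy = round(stored_energy_mj(2808, 1.15e11, 7.0))  # -> 362 (MJ)
    ```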

  8. R-hadron and long lived particle searches at the LHC

    CERN Document Server

    Bressler, S

    2007-01-01

    If long-lived charged particles exist and are produced at the LHC, they may travel with velocities significantly slower than the speed of light. This unique signature was not considered during the design of the LHC experiments ATLAS and CMS. As a result, hardware and trigger capabilities need to be evaluated. Model-independent approaches for finding long-lived particles with the LHC experiments are introduced. They are tested using two benchmarks, one in GMSB and one in Split SUSY. The focus is on hardware and trigger issues, as well as reconstruction methods developed by ATLAS and CMS. Both experiments suggest time-of-flight (TOF) based methods; however, the implementation differs. In ATLAS a first beta estimate is made already at the trigger level. CMS also uses dE/dx to estimate beta.

  9. R-Hadron and long lived particle searches at the LHC

    CERN Document Server

    Bressler, S.

    2007-01-01

    If long-lived charged particles exist and are produced at the LHC, they may travel with velocities significantly slower than the speed of light. This unique signature was not considered during the design of the LHC experiments ATLAS and CMS. As a result, hardware and trigger capabilities need to be evaluated. Model-independent approaches for finding long-lived particles with the LHC experiments are introduced. They are tested using two benchmarks, one in GMSB and one in Split SUSY. The focus is on hardware and trigger issues, as well as reconstruction methods developed by ATLAS and CMS. Both experiments suggest time-of-flight (TOF) based methods; however, the implementation differs. In ATLAS a first beta estimate is made already at the trigger level. CMS also uses dE/dx to estimate beta.
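    The TOF-based methods rest on two textbook relations: β = L/(c·t) from the flight path and measured time, and, in natural units, m = p·√(1/β² − 1). A minimal sketch of the arithmetic only (illustrative, not the ATLAS or CMS reconstruction code; the 100 GeV example is an assumption):

    ```python
    import math

    C = 299792458.0  # speed of light, m/s

    def beta_from_tof(path_m, tof_ns):
        """Velocity in units of c from a flight path (m) and time of flight (ns)."""
        return path_m / (C * tof_ns * 1e-9)

    def mass_from_p_beta(p_gev, beta):
        """Particle mass in GeV from momentum and beta: m = p * sqrt(1/beta^2 - 1)."""
        return p_gev * math.sqrt(1.0 / beta**2 - 1.0)

    # A candidate with p = 100 GeV measured at beta = 0.8 has m approx. 75 GeV,
    # far above any Standard Model particle of that momentum and velocity.
    m = mass_from_p_beta(100.0, 0.8)
    ```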

  10. The first LHC insertion quadrupole

    CERN Multimedia

    2004-01-01

    An important milestone was reached in December 2003 at the CERN Magnet Assembly Facility. The team from the Accelerator Technology - Magnet and Electrical Systems group, AT-MEL, completed the first special superconducting quadrupole for the LHC insertions, which house the experiments and major collider systems. The magnet is 8 metres long and contains two matching quadrupole magnets and an orbit corrector (a dipole magnet used to correct errors in quadrupole alignment). All were tested in liquid helium and reached the ultimate performance criteria required for the LHC. After insertion in the cryostat, the superconducting magnet will be installed as the Q9 quadrupole in sector 7-8, the first sector of the LHC to be put in place in 2004. Members of the quadrupole team, from the AT-MEL group, gathered around the Q9 quadrupole at its inauguration on 12 December 2003 in building 181.

  11. The LHC inauguration in pictures

    CERN Multimedia

    2008-01-01

    The LHC inauguration ceremony was a memorable experience for everyone who attended. On Tuesday 21 October the ceremony hall, SMA18, was filled with over 1500 invited guests; VIPs included Swiss President Pascal Couchepin, French Prime Minister François Fillon and several ministers from CERN’s Member States and around the world. You can watch a video of the highlights of the ceremony at http://cds.cern.ch/record/1136012 The Heads of Delegations from all the Member and Observer States pose with the Director-General. "The LHC is a marvel of modern technology, which would not have been possible without the continuous support of our Member States," said the Director-General in his opening speech. "This is an opportunity for me to thank them on behalf of the world’s particle physics community." The LHC inauguration ceremony officially marked the end of 24 years of conception, development, constru...

  12. Using widgets to monitor the LHC experiments

    International Nuclear Information System (INIS)

    Caballero, I González; Sarkar, S

    2011-01-01

    The complexity of the LHC experiments requires monitoring systems to verify the correct functioning of the different sub-systems and to allow operators to quickly spot problems and issues that may cause loss of information and data. Due to the distributed nature of the collaborations and the different technologies involved, the information that needs to be correlated is usually spread over several databases, web pages and monitoring systems. On the other hand, although the complete set of monitorable aspects is known and fixed, the subset that each person needs to monitor is often different for each individual. Therefore, building a unique monitoring tool that suits every single collaborator becomes close to impossible. A modular approach with a set of customizable widgets (small autonomous portions of HTML and JavaScript) that can be aggregated to form private or public monitoring web pages can be a scalable and robust solution, where the information can be provided by a simple and thin set of web services. Among the different widget development toolkits available today, we have chosen the open project UWA (Unified Widget API) because of its portability to the most popular widget platforms (including iGoogle, Netvibes and Apple Dashboard). As an example, we show how this technology is currently being used to monitor parts of the CMS Computing project.

  13. Conceptual Design of the LHC Interaction Region Upgrade Phase-I

    CERN Document Server

    Ostojic, R; Baglin, V; Ballarino, A; Cerutti, F; Denz, R; Fartoukh, S; Fessia, P; Foraz, K; Fürstner, M; Herr, Werner; Karppinen, M; Kos, N; Mainaud-Durand, H; Mereghetti, A; Muttoni, Y; Nisbet, D; Prin, H; Tock, J P; Van Weelderen, R; Wildner, E

    2008-01-01

    The LHC is starting operation with beam. The primary goal of CERN and the LHC community is to ensure that the collider is operated efficiently and that it achieves nominal performance in the shortest term. Since several years the community has been discussing the directions for maximizing the physics reach of the LHC by upgrading the experiments, in particular ATLAS and CMS, the LHC machine and the CERN proton injector complex, in a phased approach. The first phase of the LHC interaction region upgrade was approved by Council in December 2007. This phase relies on the mature Nb-Ti superconducting magnet technology with the target of increasing the LHC luminosity to 2 to 3 10^34 cm^-2s^-1, while maximising the use of the existing infrastructure. In this report, we present the goals and the proposed conceptual solutions for the LHC IR Upgrade Phase-I which include the recommendations of the conceptual design review.

  14. Forward physics with the LHCf experiment: a LHC contribution to cosmic-ray physics

    Directory of Open Access Journals (Sweden)

    Bonechi L.

    2014-04-01

    LHCf is a small detector installed at the LHC accelerator to measure the neutral particle flow in the forward direction of proton-proton (p-p) and proton-nucleus (p-A) interactions. Thanks to the optimal performance that has characterised the last years' running of the LHC collider, several measurements have been taken since 2009 in different running conditions. After data taking for p-p interactions at √s = 900 GeV, 2.76 TeV and 7 TeV and for proton-lead (p-Pb) at √sNN = 5.02 TeV (the energy of a projectile-target nucleon pair in their centre-of-mass frame), LHCf is now going to complete its physics programme with the 13 TeV p-p run foreseen in 2015. The complete set of results will become a reference data set of forward physics for the calibration and tuning of the hadronic interaction models currently used for the simulation of the atmospheric showers induced by very-high-energy cosmic rays. For this reason we think that LHCf is making an important contribution to the study of cosmic rays at the highest energies. In this paper the experiment, the published results and the current status are reviewed.
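    The quoted 5.02 TeV is a consequence of the LHC's two-in-one magnet design: both rings run at the same magnetic rigidity, so a lead beam carries E_p·Z/A per nucleon, and in the ultrarelativistic limit √s_NN = 2√(E₁·E₂). A quick check (the function name and the 4 TeV proton energy of the 2013 p-Pb run are stated assumptions):

    ```python
    import math

    def sqrt_snn_tev(e_proton_tev, z=82, a=208):
        """Nucleon-nucleon centre-of-mass energy (TeV) for p-A collisions in a
        same-rigidity collider: the per-nucleon ion energy is E_p * Z / A, and
        sqrt(s_NN) = 2 * sqrt(E1 * E2) in the ultrarelativistic limit."""
        e_per_nucleon = e_proton_tev * z / a
        return 2.0 * math.sqrt(e_proton_tev * e_per_nucleon)

    # p-Pb with 4 TeV protons and lead (Z=82, A=208)
    snn = round(sqrt_snn_tev(4.0), 2)  # -> 5.02 (TeV)
    ```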

  15. HL-LHC Accelerator

    CERN Document Server

    Zimmermann, F

    2013-01-01

    The tentative schedule, key ingredients, as well as progress of pertinent R&D and component prototypes for the LHC luminosity upgrade, "HL-LHC," are reviewed. Also alternative scenarios based on performance-improving consolidations (PICs) instead of a full upgrade are discussed. Tentative time schedules and expected luminosity evolutions for the different scenarios are sketched. The important role of HL-LHC development as a step towards a future HE-LHC or VHE-LHC is finally highlighted. Presented at "Higgs & Beyond" Conference Tohoku University, Sendai 7 June 2013.

  16. The ATLAS Experiment at the LHC Collider

    CERN Document Server

    Djama, Fares; The ATLAS collaboration

    2017-01-01

    The talk shows selected results which illustrate the variety of physics measurements conducted by the ATLAS Collaboration at the LHC. Subdetectors and reconstructed objects are reviewed with their respective performance. Higgs boson and top quark properties are presented as well as electroweak and QCD measurements. Searches are briefly mentioned as they are covered by dedicated talks. The opportunity of the 25th anniversary of the Lomonosov Conference is also taken to look back 25 years ago, at the time of the ATLAS Letter of Intent.

  17. ELECTRO-THERMAL AND MECHANICAL VALIDATION EXPERIMENT ON THE LHC MAIN BUSBAR SPLICE CONSOLIDATION

    CERN Document Server

    Willering, GP; Bourcey, N; Bottura, L; Charrondiere, M; Cerqueira Bastos, M; Deferne, G; Dib, G; Giloux, Chr; Grand-Clement, L; Heck, S; Hudson, G; Kudryavtsev, D; Perret, P; Pozzobon, M; Prin, H; Scheuerlein, Chr; Rijllart, A; Triquet, S; Verweij, AP

    2012-01-01

    To eliminate the risk of thermal runaways in the LHC interconnections, the Task Force Splices Consolidation has proposed a consolidation based on placing shunts on the main bus bar interconnections. To validate the design, two special SSS spare magnets are placed on a test bench in SM-18 to measure the interconnection between them under conditions as close as possible to those in the LHC. Two dipole interconnections are instrumented and prepared with worst-case conditions to study the thermo-electric stability limits. Two quadrupole interconnections are instrumented and prepared to study the effect of current cycling on the mechanical stability of the consolidation design. All four shunted interconnections showed very stable behaviour, well beyond the LHC design current cycle.

  18. First operational experience with the LHC machine protection system when operating with beam energies beyond the 100MJ range

    CERN Document Server

    Assmann, R; Ferro-Luzzi, M; Goddard, B; Lamont, M; Schmidt, R; Siemko, A; Uythoven, J; Wenninger, J; Zerlauth, M

    2012-01-01

    The Large Hadron Collider (LHC) at CERN has made remarkable progress during 2011, surpassing its ambitious goal for the year in terms of luminosity delivered to the LHC experiments. This achievement was made possible by a progressive increase of beam intensities by more than 5 orders of magnitude during the first months of operation, reaching stored beam energies beyond the 100 MJ range at the end of the year, less than a factor of 4 from the nominal design value. The correct functioning of the machine protection systems is vital during the different operational phases, for initial operation and even more when approaching nominal beam parameters, where already a small fraction of the stored energy is sufficient to damage accelerator equipment or experiments in case of uncontrolled beam loss. Safe operation of the machine in the presence of such high intensity proton beams is guaranteed by the interplay of many different systems: beam dumping system, beam interlocks, beam instrumentation, equipment monitoring, colli...

  19. Top quark production at the LHC

    CERN Document Server

    Ferreira da Silva, Pedro

    2016-01-01

    Twenty years past its discovery, the top quark continues attracting great interest as experiments keep unveiling its properties. An overview of the latest measurements in the domain of top quark production, performed by the ATLAS and CMS experiments at the CERN LHC, is given. The latest measurements of top quark production rates via strong and electroweak processes are reported and compared to different perturbative QCD predictions. Fundamental properties, such as the mass or the couplings of the top quark, as well as re-interpretations seeking for beyond the standard model contributions in the top quark sector, are extracted from these measurements. In each case an attempt to highlight the first results and main prospects for the on-going Run 2 of the LHC is made.

  20. The Electronic Logbook for the Information Storage of ATLAS Experiment at LHC (ELisA)

    CERN Document Server

    Corso-Radu, A; The ATLAS collaboration; Magnoni, L

    2012-01-01

    A large experiment like ATLAS at LHC (CERN), with over three thousand members and a shift crew of 15 people running the experiment 24/7, needs an easy and reliable tool to gather all the information concerning the experiment development, installation, deployment and exploitation over its lifetime. With the increasing number of users and the accumulation of stored information since the experiment start-up, the electronic logbook actually in use, ATLOG, started to show its limitations in terms of speed and usability. Its monolithic architecture makes the maintenance and implementation of new functionality a hard-to-almost-impossible process. A new tool ELisA has been developed to replace the existing ATLOG. It is based on modern web technologies: the Spring framework using a Model-View-Controller architecture was chosen, thus helping building flexible and easy to maintain applications. The new tool implements all features of the old electronic logbook with increased performance and better graphics: it uses the ...

  1. Next Generation High Quality Videoconferencing Service for the LHC

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    In recent times, we have witnessed an explosion of video initiatives in the industry worldwide. Several advancements in video technology are currently improving the way we interact and collaborate. These advancements are shaping usage trends and overall experiences: any device on any network can be used to collaborate, in most cases with high overall quality. To keep pace with this technological progress, the CERN IT Department has taken the leading role in establishing strategies and directions to improve the user experience in remote dispersed meetings and in remote collaboration at large in the worldwide LHC communities. Because the LHC user communities are highly dispersed, they depend critically on videoconferencing technology, which must be robust and of high quality to provide the best possible user experience. We will present an analysis of the factors that influenced the technical and strategic choices to improve the reliability, efficiency and overall quality of the LHC remote sessions. In particular, ...

  2. Forward and Small-x QCD Physics Results from CMS Experiment at LHC

    CERN Document Server

    AUTHOR|(CDS)2079608

    2016-01-01

    The Compact Muon Solenoid (CMS) is one of the two large, multi-purpose experiments at the Large Hadron Collider (LHC) at CERN. During Run I a large pp collision dataset was collected, and the CMS collaboration has explored measurements that shed light on a new era. Forward and small-$x$ quantum chromodynamics (QCD) measurements with the CMS experiment cover a wide range of physics subjects. Some highlights are presented, in terms of testing very low-$x$ QCD, underlying-event and multiple-interaction characteristics, photon-mediated processes, jets with large rapidity separation at high pseudo-rapidities and the inelastic proton-proton cross section dominated by diffractive interactions. Results are compared to Monte Carlo (MC) models with different parameter tunes for the description of the underlying event and to perturbative QCD calculations. The prominent role of multi-parton interactions has been confirmed in the semihard sector but no clear deviation from the standard DGLAP parto...

  3. Experiential learning in high energy physics: a survey of students at the LHC

    Science.gov (United States)

    Camporesi, Tiziano; Catalano, Gelsomina; Florio, Massimo; Giffoni, Francesco

    2017-03-01

    More than 36 000 students and post-docs will be involved until 2025 in research at the Large Hadron Collider (LHC), mainly through international collaborations. To what extent do they value the skills they acquire? Do students expect that their learning experience will have an impact on their professional future? Drawing on earlier literature on experiential learning, we have designed a survey of current and former students at the LHC. To quantitatively measure the students’ perceptions, we compare the salary expectations of current students with the assessments of those now employed in different jobs. Survey data are analysed by ordered logistic regression models, which allow multivariate statistical analyses with limited dependent variables. Results suggest that experiential learning at the LHC positively correlates with both current and former students’ salary expectations. Those already employed clearly confirm the expectations of current students. At least two not mutually exclusive explanations underlie the results. First, the training at the LHC is perceived to provide students with valuable skills, which in turn affect their salary expectations; secondly, the LHC research experience per se may act as a signal in the labour market. Respondents put a price tag on their learning experience, an ‘LHC salary premium’ ranging from 5% to 12% compared with what they would have expected for their career without such an experience at CERN.
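    The ordered logistic (cumulative logit) model mentioned in this abstract can be sketched in a few lines. This is a minimal illustration of how such a model turns a linear predictor into probabilities over ordered outcome categories; the cutpoints and the predictor value below are hypothetical, not the authors' fitted parameters.

```python
import numpy as np

def ordered_logit_probs(x_beta, cutpoints):
    """Category probabilities under an ordered (cumulative) logit model:
    P(y <= j | x) = sigmoid(theta_j - x.beta); the probability of each
    ordered category is the difference of adjacent cumulative probabilities."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    cum = np.concatenate(([0.0], sigmoid(np.asarray(cutpoints) - x_beta), [1.0]))
    return np.diff(cum)

# Hypothetical example: three ordered salary-expectation bands separated by
# illustrative cutpoints theta = (-1, 1), with linear predictor x.beta = 0.4.
probs = ordered_logit_probs(0.4, [-1.0, 1.0])
print(probs.round(3))  # three band probabilities, summing to 1
```

    In a fitted model, x.beta would encode covariates such as the LHC experience indicator, and the estimated coefficient would translate into the salary-premium effect reported above.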

  4. HERA and the LHC: A Workshop on the implications of HERA for LHC physics: Proceedings Part A

    CERN Document Server

    De Roeck, A.; Startup Meeting; Working Group Meeting; Mid-term Review Meeting; Working Group Meeting; Working Group Meeting; Final Meeting

    2005-01-01

    The HERA electron--proton collider has collected 100 pb$^{-1}$ of data since its start-up in 1992, and recently moved into a high-luminosity operation mode, with upgraded detectors, aiming to increase the total integrated luminosity per experiment to more than 500 pb$^{-1}$. HERA has been a machine of excellence for the study of QCD and the structure of the proton. The Large Hadron Collider (LHC), which will collide protons with a centre-of-mass energy of 14 TeV, will be completed at CERN in 2007. The main mission of the LHC is to discover and study the mechanisms of electroweak symmetry breaking, possibly via the discovery of the Higgs particle, and search for new physics in the TeV energy scale, such as supersymmetry or extra dimensions. Besides these goals, the LHC will also make a substantial number of precision measurements and will offer a new regime to study the strong force via perturbative QCD processes and diffraction. For the full LHC physics programme a good understanding of QCD phenomena and the ...

  5. Disk storage at CERN: Handling LHC data and beyond

    International Nuclear Information System (INIS)

    Espinal, X; Adde, G; Chan, B; Iven, J; Presti, G Lo; Lamanna, M; Mascetti, L; Pace, A; Peters, A; Ponce, S; Sindrilaru, E

    2014-01-01

    The CERN-IT Data Storage and Services (DSS) group stores and provides access to data coming from the LHC and other physics experiments. We implement specialised storage services to provide tools for optimal data management, based on the evolution of data volumes, the available technologies and the observed experiment and user usage patterns. Our current solutions are CASTOR, for highly-reliable tape-backed storage for heavy-duty Tier-0 workflows, and EOS, for disk-only storage for full-scale analysis activities. CASTOR is evolving towards a simplified disk layer in front of the tape robotics, focusing on recording the primary data from the detectors. EOS is now a well-established storage service used intensively by the four big LHC experiments. Its conceptual design, based on multiple replicas and an in-memory namespace, makes it the perfect system for data-intensive workflows. The LHC Long Shutdown 1 (LS1) presents a window of opportunity to shape up both of our storage services and validate them against the ongoing analysis activity in order to successfully face the new LHC data-taking period in 2015. In this paper, the current state and foreseen evolutions of CASTOR and EOS will be presented together with a study about the reliability of our systems.

  6. Supersymmetry and the LHC (Lectures CANCELLED)

    CERN Multimedia

    CERN. Geneva

    2006-01-01

    I will first give a pedagogical motivation for, and introduction to, supersymmetric extensions of the Standard Model. The biggest obstacle that prevents theorists from making clear-cut predictions for the production of superparticles at the LHC is our lack of knowledge of how supersymmetry is broken. I will review the most promising SUSY breaking mechanisms that have been suggested so far, and outline the resulting signatures for LHC experiments. Finally, I will try to make contact with other areas of particle physics and cosmology, where supersymmetry also might play a role.

  7. LHC Report: LHC hit the target!

    CERN Multimedia

    Enrico Bravin for the LHC team

    2016-01-01

    Last week, the accumulated integrated luminosity reached the target value for 2016 of 25 fb-1 in both ATLAS and CMS.   The integrated luminosity delivered to ATLAS and CMS reached (and already passed!) 25 fb-1 – the target for the whole year! Tuesday, 30 August was just a regular day for the 2016 LHC run. However, on that day, the integrated luminosity delivered to ATLAS and CMS reached 25 fb-1 – the target for the whole year! How did we get here? A large group of committed scientists and technical experts work behind the scenes at the LHC, ready to adapt to the quirks of this truly impressive machine. After the push to produce as many proton-proton collisions as possible before the summer conferences, several new ideas and production techniques (such as Bunch Compression Multiple Splitting, BCMS) have been incorporated in the operation of the LHC in order to boost its performance even further. Thanks to these improvements, the LHC was routinely operated with peak luminos...

  8. Muon pair study at LHC: ALICE experiment

    International Nuclear Information System (INIS)

    Chevallier, M.; Cheynis, B.; Grossiord, J.Y.; Guinet, D.; Guichard, A.; Lautesse, P.; Jacquin, M.; Nikulin, V.

    1998-01-01

    Nuclear matter at very high density, possibly in the form of a quark-gluon plasma, will be studied with ALICE at the LHC via the measurement of heavy-quark resonances detected through their dimuon decay. The group has participated since the end of 1996 in the development of the tracking chambers of the dimuon arm. These detectors are wire chambers with segmented cathodes and should measure the position of the tracks with a resolution of ≅ 100 μm in order to achieve a dimuon mass resolution better than 100 MeV. (authors)
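    The dimuon mass quoted in this abstract is the invariant mass of the muon pair, reconstructed from the two measured four-momenta. A minimal sketch of the kinematics (plain Python with illustrative momenta, not the ALICE reconstruction software):

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system from (E, px, py, pz)
    four-vectors in GeV: m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    E = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Illustrative back-to-back muons with 5 GeV/c momentum each
# (muon mass ~0.10566 GeV): the pair mass equals twice the muon energy.
m_mu = 0.10566
E = math.sqrt(5.0**2 + m_mu**2)
m_pair = invariant_mass((E, 0.0, 0.0, 5.0), (E, 0.0, 0.0, -5.0))
```

    The quoted ≅ 100 μm position resolution matters because the momenta entering this formula are obtained from track curvature, so the track resolution propagates directly into the mass resolution.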

  9. Application of diamond based beam loss monitors at LHC

    Energy Technology Data Exchange (ETDEWEB)

    Hempel, Maria

    2013-04-15

    The Large Hadron Collider (LHC) was conceived in the 1980s and started operation in 2008. More than 20 years were needed to plan and construct the accelerator and its experiments. Four main experiments are located around the ring: the Compact Muon Solenoid (CMS), A Toroidal LHC Apparatus (ATLAS), A Large Ion Collider Experiment (ALICE) and LHC beauty (LHCb). Two beams travelling in opposite directions in the LHC tunnel collide in each of the experiments to study the questions: "What is mass?", "What is the universe made of?" and "Why is there no antimatter?". The four experiments record the collision products and try to answer these fundamental questions of physics. The two larger detectors, CMS and ATLAS, are looking for the Higgs boson to study electroweak symmetry breaking. The two detectors were built with contrasting concepts to exclude potential error sources and to reaffirm the results. The smaller experiment LHCb studies the matter-antimatter asymmetry with a focus on the beauty quark. ALICE, another smaller experiment, studies the conditions right after the Big Bang by colliding heavy ions. The beams are steered by over 10000 magnets, and each beam has a stored energy of 362 MJ, which corresponds to the kinetic energy of a train like the TGV travelling at 150 km/h. Even a small fraction of that energy can damage material in the LHC ring or the magnets, which would mean a repair time of months or years without taking any data. To avoid such a scenario, it is important to monitor the beam conditions and measure the beam losses. Such losses can happen, for example, due to dust particles in the vacuum chambers or deviations of the beam parameters. Several systems called beam loss monitors (BLMs) can measure beam losses. This thesis concentrates on two of them, ionization chambers and diamond detectors. Over 3600 ionization chambers are installed in
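    The train comparison in this abstract can be checked with the kinetic-energy formula E = ½mv². A quick back-of-the-envelope sketch (the figures are taken from the abstract; the conclusion that the implied mass is of TGV scale, roughly 400 tonnes, is the consistency check):

```python
# Cross-check of the stored-energy comparison: if 362 MJ equals the kinetic
# energy E = 1/2 * m * v^2 of a train at 150 km/h, the implied train mass
# m = 2E / v^2 should be of TGV scale (roughly 400 tonnes).
E_beam = 362e6           # stored beam energy per beam, in joules
v = 150 / 3.6            # 150 km/h converted to m/s
m = 2 * E_beam / v**2    # implied train mass in kg
print(f"implied train mass: {m / 1000:.0f} t")
```
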

  10. Upgrade of the ATLAS Silicon Tracker for the sLHC

    CERN Document Server

    Minano, M; The ATLAS collaboration

    2009-01-01

    While the CERN Large Hadron Collider (LHC) will start taking data this year, scenarios for a machine upgrade to achieve a much higher luminosity are being developed. In the current planning, it is foreseen to increase the luminosity of the LHC at CERN around 2016 by about an order of magnitude, with the upgraded machine dubbed Super-LHC or SLHC. As radiation damage scales with integrated luminosity, the particle physics experiments at the SLHC will need to be equipped with a new generation of radiation-hard detectors. This is of particular importance for the semiconductor tracking detectors located close to the LHC interaction region, where the highest radiation doses occur. The ATLAS experiment will require a new particle tracking system for SLHC operation. In order to cope with the increase in background events by about one order of magnitude at the higher luminosity, an all-silicon detector with enhanced radiation hardness is being designed. The new silicon strip detector will use significantly shorter stri...

  11. Searches for SUSY at LHC

    International Nuclear Information System (INIS)

    Kharchilava, A.

    1997-01-01

    One of the main motivations of experiments at the LHC is to search for SUSY particles. The talk is based on recent analyses performed by the CMS Collaboration within the framework of the Supergravity-motivated minimal SUSY extension of the Standard Model. The emphasis is put on leptonic channels. The strategies for obtaining experimental signatures of strongly and weakly interacting sparticle production, as well as examples of the determination of SUSY masses and model parameters, are discussed. The domain of parameter space where SUSY can be discovered is investigated. Results show that if SUSY is of relevance at the electroweak scale it could hardly escape detection at the LHC. (author)

  12. Development of Diamond Tracking Detectors for High Luminosity Experiments at the LHC, HL-LHC and Beyond

    CERN Document Server

    Kagan, Harris (Ohio State)

    2018-01-01

    The RD42 collaboration at CERN is leading the effort to develop radiation-tolerant devices based on polycrystalline Chemical Vapor Deposition (pCVD) diamond as a material for tracking detectors operating in harsh radiation environments. Diamond has properties that make it suitable for such detector applications. During the last few years the RD42 group has succeeded in producing and characterising a number of devices to address specific issues related to their use at the LHC and HL-LHC. Herein we present the status of the RD42 project with emphasis on recent beam test results and our proposed three-year research plan. In particular, we review recent results on the stability of the signal size as a function of incident particle rate in diamond detectors over a range of particle fluxes up to 20 MHz/cm2, on the radiation tolerance of CVD diamond, on the diamond work with ATLAS and CMS, on the results of 3D diamond detectors fabricated in pCVD diamond and on the work with diamond manufacturers. In addition, we present the details ...

  13. ELECTRONICS FOR CALORIMETERS AT LHC

    International Nuclear Information System (INIS)

    Radeka, V.

    2001-01-01

    Some principal design features of front-end electronics for calorimeters in experiments at the LHC will be highlighted. Some concerns arising in the transition from the research and development and design phase to the construction will be discussed. Future challenges will be indicated

  14. Operational Experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Keil, M; The ATLAS collaboration

    2012-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC and its status after three years of operation will be presented, including calibration procedures, timing optimization and detector performance. The detector performance is excellent: ~96 % of the pixels are operational, noise occupancy and hit efficiency e...

  15. Operational Experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Keil, M; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  16. Operational experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Hirschbuehl, D; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this paper results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 96.7% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  17. Operational experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Lapoire, C; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  18. Operational Experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Lapoire, C; The ATLAS collaboration

    2012-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as B-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this paper, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures and detector performance. The detector performance is excellent: 96.2% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification.

  19. Operational Experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Keil, M

    2012-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this paper results from the successful operation of the Pixel Detector at the LHC will be presented, including calibration procedures, timing optimization and detector performance. The detector performance is excellent: approximately 97% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  20. Operational experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Ince, T; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 96.8% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  1. Operational experience with the ATLAS Pixel detector at the LHC

    CERN Document Server

    Deluca, C; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this paper, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, ...

  2. Operational Experience with the ATLAS Pixel Detector at the LHC

    CERN Document Server

    Lange, C; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, a...

  3. Operational experience with the ATLAS Pixel detector at the LHC

    CERN Document Server

    Deluca, C; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  4. Radiation protection issues after 20 years of LHC operation

    CERN Document Server

    Forkel-Wirth, D.; Roesler, S.; Theis, C.; Ulrici, L.; Vincke, H.; Vincke, Hz.

    2011-01-01

    Since November 2009, the LHC commissioning has progressed very well, both with proton and lead beams. It will continue in 2011, and nominal LHC operation is expected to be attained in 2013. In parallel, plans for various LHC upgrades are under discussion, suggesting a High-Luminosity (HL) upgrade first and a High-Energy (HE) upgrade at a later stage. Whereas the upgrade in luminosity would require the modification of only a few key accelerator components, such as the inner triplets, the upgrade in beam energy from 7 TeV to 16.5 TeV would require the exchange of all dipoles and of numerous other accelerator components. The paper gives an overview of the radiation protection issues related to the dismantling of LHC components prior to the installation of the HE-LHC components, i.e. after about 20 years of LHC operation. Two main topics will be discussed: (i) the exposure of workers to ionizing radiation during the dismantling of dipoles, inner triplets or collimators and experiments and (ii) the production, condition...

  5. Identifying Supersymmetry at the CERN LHC and Indirect Dark Matter Detection Experiments

    CERN Document Server

    Grajek, Phillip

    Supersymmetry (SUSY) remains the most well-motivated scenario for new physics beyond the Standard Model. There is strong reason to expect that if nature is supersymmetric it will be observed at the LHC. Consequently, searches for SUSY are among the primary tasks of the LHC program. However, much of this work focuses on scenarios such as mSUGRA, which include many simplifying assumptions. It is necessary, therefore, to consider the broader SUSY parameter space, and explore the implications of various other model choices on the spectrum of possible experimental signatures. This thesis addresses this phenomenologically challenging problem. We present several studies that examine the relationship between various SUSY scenarios and experimental phenomena, and introduce new techniques to extract meaningful information about fundamental parameters. First, we discuss identification of multiple top quark production from gluino decay at the LHC. We find that 4-top production can be discovered in excess of Standard Mode...

  6. CMS RPC muon detector performance with 2010-2012 LHC data

    CERN Document Server

    INSPIRE-00316302; Ban, Y.; Cai, J.; Li, Q.; Liu, S.; Qian, S.; Wang, D.; Xu, Z.; Zhang, F.; Choi, Y.; Kim, D.; Goh, J.; Choi, S.; Hong, B.; Kang, J.W.; Kang, M.; Kwon, J.H.; Lee, K.S.; Lee, S.K.; Park, S.K.; Pant, L.M.; Mohanty, A.K.; Chudasama, R.; Singh, J.B.; Bhatnagar, V.; Mehta, A.; Kumar, R.; Cauwenbergh, S.; Costantini, S.; Cimmino, A.; Crucy, S.; Fagot, A.; Garcia, G.; Ocampo, A.; Poyraz, D.; Salva, S.; Thyssen, F.; Tytgat, M.; Zaganidis, N.; Doninck, W.V.; Cabrera, A.; Chaparro, L.; Gomez, J.P.; Gomez, B.; Sanabria, J.C.; Avila, C.; Ahmad, A.; Muhammad, S.; Shoaib, M.; Hoorani, H.; Awan, I.; Ali, I.; Ahmed, W.; Asghar, M.I.; Shahzad, H.; Sayed, A.; Ibrahim, A.; Aly, S.; Assran, Y.; Radi, A.; Elkafrawy, T.; Sharma, A.; Colafranceschi, S.; Abbrescia, M.; Calabria, C.; Colaleo, A.; Iaselli, G.; Loddo, F.; Maggi, M.; Nuzzo, S.; Radogna, R.; Venditti, R.; Verwilligen, P.; Benussi, L.; Bianco, S.; Piccolo, D.; Paolucci, P.; Buontempo, S.; Cavallo, N.; Merola, M.; Fabozzi, F.; Iorio, O.M.; Braghieri, A.; Montagna, P.; Riccardi, C.; Salvini, P.; Vitulo, P.; Vai, I.; Magnani, A.; Dimitrov, A.; Litov, L.; Pavlov, B.; Petkov, P.; Aleksandrov, A.; Genchev, V.; Iaydjiev, P.; Rodozov, M.; Sultanov, G.; Vutova, M.; Stoykova, S.; Hadjiiska, R.; Ibargüen, H.S.; Morales, M.I.P.; Bernardino, S.C.; Bagaturia, I.; Tsamalaidze, Z.; Crotty, I.; Kim, M.S.

    2014-12-05

    The muon spectrometer of the CMS (Compact Muon Solenoid) experiment at the Large Hadron Collider (LHC) is equipped with a redundant system made of Resistive Plate Chambers (RPC) and Drift Tubes in the barrel region, and of RPCs and Cathode Strip Chambers in the endcap region. In this paper, the operations and performance of the RPC system during the first three years of LHC activity are reported. The integrated charge was about 2 mC/cm$^{2}$ for the most exposed detectors. The stability of the RPC system, with particular attention to detector performance figures such as efficiency, cluster size and noise, will be reported. Finally, the radiation background levels on the RPC system have been measured as a function of the LHC luminosity. Extrapolations to the LHC design conditions and to the HL-LHC are also discussed.

  7. LHC Optics Measurement with Proton Tracks Detected by the Roman Pots of the TOTEM Experiment

    CERN Document Server

    INSPIRE-00062364; Aspell, P; Atanassov, I; Avati, V; Baechler, J; Berardi, V; Berretti, M; Bossini, E; Bottigli, U; Bozzo, M; Brücken, E; Buzzo, A; Cafagna, F S; Catanesi, M G; Covault, C; Csanád, M; Csörgö, T; Deile, M; Doubek, M; Eggert, K; Eremin, V; Ferro, F; Fiergolski, A; Garcia, F; Georgiev, V; Giani, S; Grzanka, L; Hammerbauer, J; Heino, J; Hilden, T; Karev, A; Kašpar, J; Kopal, J; Kundrát, V; Lami, S; Latino, G; Lauhakangas, R; Leszko, T; Lippmaa, E; Lippmaa, J; Lokajíček, M V; Losurdo, L; Lo Vetere, M; Lucas Rodríguez, F; Macrí, M; Mäki, T; Mercadante, A; Minafra, N; Minutoli, S; Nemes, F; Niewiadomski, H; Oliveri, E; Oljemark, F; Orava, R; Oriunno, M; Österberg, K; Palazzi, P; Peroutka, Z; Procházka, J; Quinto, M; Radermacher, E; Radicioni, E; Ravotti, F; Robutti, E; Ropelewski, L; Ruggiero, G; Saarikko, H; Scribano, A; Smajek, J; Snoeys, W; Sziklai, J; Taylor, C; Turini, N; Vacek, V; Welti, J; Whitmore, J; Wyszkowski, P; Zielinski, K

    2014-10-28

    Precise knowledge of the beam optics at the LHC is crucial to fulfil the physics goals of the TOTEM experiment, where the kinematics of the scattered protons is reconstructed with the near-beam telescopes -- so-called Roman Pots (RP). Before being detected, the protons' trajectories are influenced by the magnetic fields of the accelerator lattice. Thus precise understanding of the proton transport is of key importance for the experiment. A novel method of optics evaluation is proposed which exploits kinematical distributions of elastically scattered protons observed in the RPs. Theoretical predictions, as well as Monte Carlo studies, show that the residual uncertainty of this optics estimation method is smaller than 0.25 percent.
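
The linear proton transport underlying the method in this abstract can be sketched in a few lines: the transverse position and angle at a Roman Pot are a linear map of the vertex position and scattering angle at the interaction point. All matrix elements below (magnification, effective length, and the second-row terms) are invented for illustration and are not the actual LHC optics.

```python
import numpy as np

# 2x2 transfer matrix from the IP to a Roman Pot (illustrative values only)
v_x = -2.0   # magnification (dimensionless), assumed
L_x = 25.0   # effective length in metres, assumed
M = np.array([[v_x,  L_x],
              [0.01, -0.4]])  # second row (angular terms) also assumed

def transport(x_star_m, theta_x_rad):
    """Propagate (x*, theta_x*) at the IP to (x, theta_x) at the Roman Pot."""
    return M @ np.array([x_star_m, theta_x_rad])

# A 100 microrad elastic scattering angle from a vertex at x* = 0:
x_rp, th_rp = transport(0.0, 100e-6)
print(x_rp * 1e3, "mm at the Roman Pot")  # L_x * theta_x* = 2.5 mm
```

Inverting this map, as the abstract describes, lets the experiment reconstruct the scattering kinematics at the IP from positions measured in the pots, which is why the matrix elements must be known to the quoted sub-percent precision.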

  8. The LHC Computing Grid Project

    CERN Multimedia

    Åkesson, T

    In the last ATLAS eNews I reported on the preparations for the LHC Computing Grid Project (LCGP). Significant LCGP resources were mobilized during the summer, and there have been numerous iterations on the formal paper to put forward to the CERN Council to establish the LCGP. ATLAS, and also the other LHC-experiments, has been very active in this process to maximally influence the outcome. Our main priorities were to ensure that the global aspects are properly taken into account, that the CERN non-member states are also included in the structure, that the experiments are properly involved in the LCGP execution and that the LCGP takes operative responsibility during the data challenges. A Project Launch Board (PLB) was active from the end of July until the 10th of September. It was chaired by Hans Hoffmann and had the IT division leader as secretary. Each experiment had a representative (me for ATLAS), and the large CERN member states were each represented while the smaller were represented as clusters ac...

  9. Scalar-mediated double beta decay and LHC

    International Nuclear Information System (INIS)

    Gonzalez, L.; Helo, J.C.; Hirsch, M.; Kovalenko, S.G.

    2016-01-01

    The decay rate of neutrinoless double beta (0νββ) decay could be dominated by Lepton Number Violating (LNV) short-range diagrams involving only heavy scalar intermediate particles, known as “topology-II” diagrams. Examples are diagrams with diquarks, leptoquarks or charged scalars. Here, we compare the LNV discovery potentials of the LHC and 0νββ-decay experiments, resorting to three example models, which cover the range of the optimistic-pessimistic cases for 0νββ decay. We use the LHC constraints from dijet as well as leptoquark searches and find that already with 20/fb the LHC will test interesting parts of the parameter space of these models, not excluded by the current limits on 0νββ-decay.

  10. Scalar-mediated double beta decay and LHC

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, L. [Universidad Técnica Federico Santa María, Centro-Científico-Tecnológico de Valparaíso,Casilla 110-V, Valparaíso (Chile); Helo, J.C. [Universidad Técnica Federico Santa María, Centro-Científico-Tecnológico de Valparaíso,Casilla 110-V, Valparaíso (Chile); Departamento de Física, Facultad de Ciencias, Universidad de La Serena,Avenida Cisternas 1200, La Serena (Chile); Hirsch, M. [AHEP Group, Instituto de Física Corpuscular - C.S.I.C./Universitat de València,Edificio de Institutos de Paterna, Apartado 22085, E-46071 València (Spain); Kovalenko, S.G. [Universidad Técnica Federico Santa María, Centro-Científico-Tecnológico de Valparaíso,Casilla 110-V, Valparaíso (Chile)

    2016-12-23

    The decay rate of neutrinoless double beta (0νββ) decay could be dominated by Lepton Number Violating (LNV) short-range diagrams involving only heavy scalar intermediate particles, known as “topology-II” diagrams. Examples are diagrams with diquarks, leptoquarks or charged scalars. Here, we compare the LNV discovery potentials of the LHC and 0νββ-decay experiments, resorting to three example models, which cover the range of the optimistic-pessimistic cases for 0νββ decay. We use the LHC constraints from dijet as well as leptoquark searches and find that already with 20/fb the LHC will test interesting parts of the parameter space of these models, not excluded by the current limits on 0νββ-decay.

  11. LHC Results on Charmonium in Heavy Ions

    CERN Document Server

    Hong, Byungsik

    2012-01-01

    In heavy-ion collisions at high energies, quantum chromodynamics (QCD) predicts the production of a deconfined quark-gluon plasma (QGP) state. Quarkonia ($c\\bar{c}$ or $b\\bar{b}$ bound states) are a useful means to probe the QGP and to investigate the behavior of QCD in a high parton-density environment. Up to now, the Large Hadron Collider (LHC) at CERN has provided two runs of PbPb collisions at $\\sqrt{s_{NN}}$ = 2.76 TeV, in the years 2010 and 2011. The ALICE, ATLAS, and CMS experiments at the LHC have analyzed the yields and spectra of the $J/\\psi$ and $\\Upsilon$ families. In this article, we review in particular the recent charmonium results in PbPb collisions at the LHC from the 2010 run.

  12. The LHC babies

    CERN Multimedia

    Laëtitia Pedroso

    2011-01-01

    With the machine restart and first collisions at 3.5 TeV, 2009 and 2010 were two action-packed years at the LHC. The events were a real media success, but one important result that remained well hidden was the ten births in the LHC team over the same period. The mothers – engineers, cryogenics experts and administrative assistants working for the LHC – confirm that it is possible to maintain a reasonable work-life balance. Two of them tell us more…   Verena Kain (left) and Reyes Alemany (right) in the CERN Control Centre. With the LHC running around the clock, LHC operations engineers have high-pressure jobs with unsociable working hours. These past two years, which will undoubtedly go down in the annals of CERN history, the LHC team had their work cut out, but despite their high-octane professional lives, several female members of the team took up no less of a challenge in their private lives, creating a mini-baby-boom by which the LHC start-up will also be remembe...

  13. PDF4LHC recommendations for LHC Run II

    CERN Document Server

    Butterworth, Jon; Cooper-Sarkar, Amanda; De Roeck, Albert; Feltesse, Joel; Forte, Stefano; Gao, Jun; Glazov, Sasha; Huston, Joey; Kassabov, Zahari; McNulty, Ronan; Morsch, Andreas; Nadolsky, Pavel; Radescu, Voica; Rojo, Juan; Thorne, Robert

    2016-01-01

    We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. We finally discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.
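
The Monte Carlo replica delivery mentioned in this abstract lends itself to a short illustration: an observable is evaluated once per replica, the central value is the replica mean and the PDF uncertainty is the standard deviation of the spread. The replica values below are synthetic random numbers, not a real PDF set.

```python
import numpy as np

# Synthetic stand-in for "one cross-section value per PDF replica"
rng = np.random.default_rng(0)
cross_sections = 50.0 + 1.5 * rng.standard_normal(100)  # pb, 100 replicas

central = cross_sections.mean()          # central prediction
pdf_unc = cross_sections.std(ddof=1)     # 68% CL estimate from replica spread
print(f"sigma = {central:.2f} +- {pdf_unc:.2f} pb")
```

For a Hessian delivery the analogous recipe sums eigenvector shifts in quadrature instead of taking a replica standard deviation; the combined PDF4LHC sets can be delivered in either form.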

  14. Muon performance aspects and measurement of the inclusive ZZ production cross section through the four lepton final state with the ATLAS experiment at the LHC

    CERN Document Server

    Meyer, Jochen; Ströhmer, Raimund

    2013-07-08

    The "Large Hadron Collider" (LHC) is currently the most powerful particle accelerator. It provides particle collisions at a center-of-mass energy in the Tera-electronvolt range, which had never been reached in a laboratory before. Thereby a new era in high-energy particle physics has begun. Now it is possible to test one of the most precise theories in physics, the Standard Model of particle physics, at these high energies. This purpose is served in particular by the four large experiments installed at the LHC, namely "A Toroidal LHC ApparatuS" (ATLAS), the "Compact Muon Solenoid" (CMS), the "Large Hadron Collider beauty" (LHCb) experiment and "A Large Ion Collider Experiment" (ALICE). Besides exploring the high-energy behavior of the well-established portions of the Standard Model, one of the main objectives is to find the Higgs boson included in the model, but not discovered by any preceding effort. It is of tremendous importance since fermions and heavy electroweak gauge bosons acquire mass because of this boson. Although ...

  15. Low-scale gravity black holes at LHC

    CERN Document Server

    Regos, E; Gamsizkan, H; Trocsanyi, Z

    2009-01-01

    We search for extra dimensions by looking for black holes at the LHC. Theoretical investigations provide the basis for the collider experiments. We use black-hole generators to simulate the experimental signatures (colour, charge, spectrum of emitted particles, missing transverse energy) of black holes at the LHC in models with TeV-scale quantum gravity, rotation, fermion splitting, brane tension and Hawking radiation. We implement the extra-dimensional simulations in the CMS data analysis and also test further beyond-Standard-Model black-hole scenarios.

  16. Start of run2 physics at the Large Hadron Collider (LHC)

    CERN Multimedia

    Brice, Maximilien

    2015-01-01

    Images from the CERN Control Centre (CCC), where operators control the LHC, and from the control rooms of the ALICE, ATLAS, CMS and LHCb experiments, where operators control huge detectors that capture data from collisions between beams of protons in the LHC.

  17. PHOBOS in the LHC era

    Energy Technology Data Exchange (ETDEWEB)

    Steinberg, Peter, E-mail: peter.steinberg@bnl.gov

    2015-01-15

    The PHOBOS experiment ran at the RHIC collider from 2000 to 2005, under the leadership of Wit Busza. These proceedings summarize selected PHOBOS results, highlighting their continuing relevance amidst the wealth of new results from the lead–lead program at the Large Hadron Collider (LHC)

  18. Implementation of an object oriented track reconstruction model into multiple LHC experiments*

    Science.gov (United States)

    Gaines, Irwin; Gonzalez, Saul; Qian, Sijin

    2001-10-01

    An Object Oriented (OO) model (Gaines et al., 1996; 1997; Gaines and Qian, 1998; 1999) for track reconstruction by the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. The model has been coded in the C++ programming language and has been successfully implemented into the OO computing environments of both the CMS (1994) and ATLAS (1994) experiments at the future Large Hadron Collider (LHC) at CERN. We shall report: how the OO model was adapted, with largely the same code, to different scenarios and serves different reconstruction aims in different experiments (i.e. the level-2 trigger software for ATLAS and the offline software for CMS); how the OO model has been incorporated into different OO environments with a similar integration structure (demonstrating the ease of re-use of an OO program); what the OO model's performance is, including execution time, memory usage, track-finding efficiency, ghost rate, etc.; and additional physics performance based on use of the OO tracking model. We shall also mention the experience and lessons learned from implementing the OO model into the general OO software framework of the experiments. In summary, our practice shows that OO technology makes software development and integration straightforward and convenient; this may be particularly beneficial for physicists who are not computing professionals.
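
The Kalman filtering approach named in this abstract can be illustrated with a minimal one-dimensional straight-line track fit: the state (position, slope) is propagated layer by layer and updated with each hit. The layer geometry, noise values and measurement model below are illustrative assumptions, not the ATLAS or CMS implementation.

```python
import numpy as np

def kalman_track_fit(zs, ys, meas_sigma=0.1):
    """Progressively fit y(z) = y0 + slope*z from hits on successive layers."""
    x = np.array([ys[0], 0.0])           # state: [y at current layer, slope dy/dz]
    P = np.diag([meas_sigma**2, 1.0])    # covariance; broad prior on the slope
    H = np.array([[1.0, 0.0]])           # each layer measures y only
    R = np.array([[meas_sigma**2]])
    for k in range(1, len(zs)):
        dz = zs[k] - zs[k - 1]
        F = np.array([[1.0, dz], [0.0, 1.0]])    # straight-line propagation
        x = F @ x                                 # predict state at next layer
        P = F @ P @ F.T                           # (no process noise in this sketch)
        S = H @ P @ H.T + R                       # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        x = x + K @ (np.array([ys[k]]) - H @ x)   # measurement update
        P = (np.eye(2) - K @ H) @ P
    return x  # filtered [y at last layer, slope]

# Hits lying exactly on y = 1 + 0.5*z:
zs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0 + 0.5 * z for z in zs]
y_last, slope = kalman_track_fit(zs, ys, meas_sigma=0.01)
```

The appeal of this progressive formulation, in both trigger and offline contexts, is that each hit is absorbed with a constant amount of work, so the same fitting core can serve fast level-2 code and full offline reconstruction.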

  19. Study of jet production in ALICE experiment at LHC collider

    CERN Document Server

    Jangal, Swensy

    The jet is one of the probes that allow testing the predictions of the theory of the strong interaction, QCD, and extracting the physical properties of a particular state of nuclear matter: the Quark Gluon Plasma (QGP). This PhD work aims to show ALICE's capacity to measure jets from collisions produced at the Large Hadron Collider (LHC). Detecting the particles that constitute jets, associating them with reconstruction algorithms and building observables such as the jet pT spectrum or the Hump-Backed Plateau is a challenging task. We detail these different steps, starting from simulations that allow us to estimate the jet rates we can expect for our analysis and to evaluate the impact of the experimental measurement on the final observables. We finally present the pT spectrum and the Hump-Backed Plateau from the first p+p collisions at the LHC, to which mean corrections have been applied.

  20. RF upgrade program in LHC injectors and LHC machine

    International Nuclear Information System (INIS)

    Jensen, E.

    2012-01-01

    The main themes of the RF upgrade program are: the Linac4 project, the LLRF upgrade and the study of a tuning-free wide-band system for the PSB, the upgrade of the SPS 800 MHz amplifiers and beam controls, and the upgrade of the transverse dampers of the LHC. Whilst the LHC Splice Consolidation is certainly the top priority for LS1, some RF consolidation and upgrade work is also needed to assure LHC performance for the next 3-year run period. This includes: 1) necessary maintenance and consolidation work that could not fit into the shorter technical stops of the last years, 2) the upgrade of the SPS 200 MHz system from the present 4 to 6 cavities and possibly 3) the replacement of one LHC cavity module. On the longer term, the LHC luminosity upgrade requires crab cavities, for which some preparatory work in the SPS Coldex must be scheduled during LS1. (author)

  1. WZ di-boson measurements with the ATLAS experiment at the LHC and performance of resistive Micromegas in view of HL-LHC applications

    International Nuclear Information System (INIS)

    Manjarres-Ramos, Joany

    2013-01-01

    During the past two years, the CERN Large Hadron Collider (LHC) has performed exceptionally. The data collected by ATLAS made possible the first Standard Model physics measurements and produced a number of important experimental results. In the first part of this document the measurement of WZ production with the ATLAS detector is presented; the second part is devoted to the study of resistive Micromegas properties, in view of their installation in the forward regions of the ATLAS spectrometer for the first phase of the High Luminosity LHC (HL-LHC). The measurement of WZ production probes the electroweak sector of the Standard Model at high energies and allows for generic tests of New Physics beyond the Standard Model. Two datasets of LHC proton-proton collisions were analyzed: 4.8 fb$^{-1}$ of integrated luminosity at a center-of-mass energy of 7 TeV, and 13 fb$^{-1}$ at 8 TeV, collected in 2011 and the first half of 2012 respectively. Fully leptonic decay events are selected with electrons, muons and missing transverse momentum in the final state. Different data-driven estimates of the background were developed in the context of this analysis. The fiducial and total cross sections of WZ production are measured and limits on anomalous triple gauge boson couplings are set. The second part of the document is devoted to the upgrade of the ATLAS detector. The conditions at the High Luminosity LHC call for detectors capable of operating in a flux of collisions and background particles approximately ten times larger than in today's conditions. The efficiency, resolution and robustness of resistive Micromegas were studied, as part of the R and D project aimed at the construction of large-area spark-resistant muon chambers using the Micromegas technology. (author)

  2. LHC@home is ready to support HiLumi LHC: take part!

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    Recently relaunched, the LHC@home volunteer computing project is now ready to support the HiLumi LHC project, the design phase of the planned upgrade of the LHC that will increase its luminosity by a factor of 5 to 10 beyond its original design value. HiLumi will need massive simulations to test the beam dynamics. Whether you are at home or at work, you can help experts design the future LHC by connecting your computer to LHC@home. Go for it!   LHC@home is aimed at involving the public in real science. If you have a computer that is connected to the Internet, you can join the large team of volunteers who are already supporting its two main projects: Test4Theory, which runs computer simulations of high-energy particle collisions, and SixTrack, which is aimed at optimizing the LHC performance by performing beam dynamics simulations. In both cases, the software is designed to run only when your computer is idle and causes no disruption to your normal activities. To the simulations run by the Six...

  3. Search for long-lived supersymmetry particles by signature of a high track-multiplicity displaced vertex using the LHC-ATLAS Experiment

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00360876

    Long-lived supersymmetry (SUSY) particles decaying within the tracking volume of the LHC-ATLAS Experiment can be reconstructed as a displaced vertex (DV). The search strategy involves attempting to reassemble the decay point of the long-lived particles (LLPs) by fitting vertices from the trajectories of the charged decay products. A search for the signature of a massive, high track-multiplicity DV has been conducted using data collected during 2012 by the LHC-ATLAS Experiment at $\\sqrt{s}~=~8$ TeV, corresponding to an integrated luminosity of 20.3 fb$^{-1}$. The signature of a massive displaced vertex is especially powerful because there are no heavy long-lived Standard Model particles, giving an analysis that is nearly background-free. This dissertation describes the new, much more generic, "$DV+\\text{jets}$" channel. In this channel events with high momentum jets and at least one displaced vertex are considered. Eliminating the requirement of an associated $\\mu$ generated, to date of w...

  4. Some LHC milestones...

    CERN Multimedia

    2008-01-01

    October 1995 The LHC technical design report is published. This document details the operation and the architecture of the future accelerator. November 2000 The first of the 1232 main dipole magnets for the LHC are delivered. May 2005 The first interconnection between two magnets of the accelerator is made. To carry out the 1700 interconnections of the LHC, 123 000 operations are necessary. February 2006 The new CERN Control Centre, which combines all the control rooms for the accelerators, the cryogenics and the technical infrastructure, starts operation. The LHC will be controlled from here. October 2006 Construction of the largest refrigerator in the world is complete. The 27 km cryogenic distribution line inside the LHC tunnel will circulate helium in liquid and gas phases to provide cryogenic conditions for the superconducting magnets of the accelerator. November 2006 Magnet production for the LHC is complete. The last of t...

  5. LHC Report: Full data production mode

    CERN Multimedia

    Mike Lamont for the LHC Team

    2012-01-01

    The LHC is accumulating as much data as possible for the experiments before the summer conferences. Performance is impressive, with 1380 bunches of around 1.5×10$^{11}$ protons per bunch giving a peak luminosity of 6.8×10$^{33}$ cm$^{-2}$s$^{-1}$, and with integrated rates topping 20 pb$^{-1}$ an hour at the start of fill.  As of today (13 June), the LHC has delivered more collisions in 2012 than it did in the whole of 2011. Not only that, the collisions have been at the higher energy of 4 TeV. In 2011, the LHC delivered an integrated luminosity of around 5.6 fb$^{-1}$ to both ATLAS and CMS. Now, just a few months after the machine began its 2012 run, these integrated luminosity levels have been passed. Follow the LHC performance and statistics on the dedicated page. The step-up in particle collision rates compared with 2011 is due to a further reduction in the beam sizes at the interaction point, in conjunction with the use of tight collimator settings, the increase in energy to 4 TeV and the continued excellent beam quality from...
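
As a quick unit check on the figures quoted in this report: integrating the peak luminosity of 6.8×10$^{33}$ cm$^{-2}$s$^{-1}$ over one hour should be of the same order as the quoted ~20 pb$^{-1}$ per hour delivered at the start of a fill (slightly above it, since luminosity decays during the fill).

```python
# Convert a peak luminosity into an hourly integrated-luminosity rate.
PEAK_LUMI = 6.8e33          # cm^-2 s^-1, from the report above
CM2_PER_INV_PB = 1e36       # 1 pb^-1 corresponds to 1e36 cm^-2
SECONDS_PER_HOUR = 3600

hourly = PEAK_LUMI * SECONDS_PER_HOUR / CM2_PER_INV_PB  # pb^-1 per hour
print(f"{hourly:.1f} pb^-1/hour at peak")  # ~24.5, consistent with >20 quoted
```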

  6. Gas Condensates onto a LHC Type Cryogenic Vacuum System Subjected to Electron Cloud

    CERN Multimedia

    Baglin, V

    2004-01-01

    In the Large Hadron Collider (LHC), the gas desorbed via photon stimulated molecular desorption or electron stimulated molecular desorption will be physisorbed onto the beam screen held between 5 and 20 K. Studies of the effects of the electron cloud onto a LHC type cryogenic vacuum chamber have been done with the cold bore experiment (COLDEX) installed in the CERN Super Proton Synchrotron (SPS). Experiments performed with gas condensates such as H2, H2O, CO and CO2 are described. Implications for the LHC design and operation are discussed.

  7. Leading lead through the LHC

    CERN Multimedia

    2011-01-01

    Three of the LHC experiments - ALICE, ATLAS and CMS - will be studying the upcoming heavy-ion collisions. Given the excellent results from the short heavy-ion run last year, expectations have grown even higher in experiment control centres. Here they discuss their plans:   ALICE For the upcoming heavy-ion run, the ALICE physics programme will take advantage of a substantial increase of the LHC luminosity with respect to last year’s heavy-ion run.  The emphasis will be on the acquisition of rarely produced signals by implementing selective triggers. This is a different operation mode to that used during the first low luminosity heavy-ion run in 2010, when only minimum-bias triggered events were collected. In addition, ALICE will benefit from increased acceptance coverage by the electromagnetic calorimeter and the transition radiation detector. In order to double the amount of recorded events, ALICE will exploit the maximum available bandwidth for mass storage at 4 GB/s and t...

  8. The Phase-1 Upgrade for the Level-1 Muon Barrel Trigger of the ATLAS Experiment at LHC

    CERN Document Server

    Izzo, Vincenzo; The ATLAS collaboration

    2018-01-01

    The Level-1 Muon Barrel Trigger of the ATLAS Experiment at LHC makes use of Resistive Plate Chamber (RPC) detectors. The on-detector trigger electronics modules are able to identify muons with predefined transverse momentum values (pT) by executing a coincidence logic on signals coming from the various detector layers. On-detector trigger boards then transfer trigger data to the off-detector electronics. A complex trigger system processes the incoming data by combining trigger information from the barrel and the endcap regions, and providing the combined muon candidate to the Central Trigger Processor (CTP). For almost a decade, the Level-1 Trigger system operated very well, despite the challenging requirements on trigger efficiency and performance, and the continuously increasing LHC luminosity. In order to cope with these constraints, various upgrades for the full trigger system were already deployed, and others have been designed to be installed in the next years. Most of the upgrades to the trigger system...
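
The coincidence logic described in this abstract can be sketched as a toy majority requirement: a muon candidate is accepted when at least 3 of 4 RPC layers report a hit inside a road (window) consistent with a given transverse-momentum threshold. The window width, layer count and hit positions below are invented for illustration and do not reflect the actual ATLAS trigger firmware.

```python
def coincidence_trigger(layer_hits, road_center, road_half_width, majority=3):
    """layer_hits: per-layer lists of hit positions. Count the layers with a
    hit inside the road and compare against the majority requirement."""
    layers_in_road = sum(
        any(abs(h - road_center) <= road_half_width for h in hits)
        for hits in layer_hits
    )
    return layers_in_road >= majority

hits = [[10.1], [10.4], [], [9.8]]           # one layer missed the muon
print(coincidence_trigger(hits, 10.0, 1.0))  # True: 3 of 4 layers fired
```

Narrower roads correspond to higher pT thresholds (stiffer tracks bend less), which is how a handful of predefined windows maps onto the predefined pT values mentioned above.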

  9. The Phase-1 Upgrade for the Level-1 Muon Barrel Trigger of the ATLAS Experiment at LHC

    CERN Document Server

    Izzo, Vincenzo; The ATLAS collaboration

    2018-01-01

    The Level-1 Muon Barrel Trigger of the ATLAS Experiment at LHC makes use of Resistive Plate Chamber (RPC) detectors. The on-detector trigger electronics modules are able to identify muons with predefined transverse momentum values (pT) by executing a coincidence logic on signals coming from the various detector layers. Then, on-detector trigger boards transfer trigger data to the off-detector electronics. A complex trigger system processes the incoming data by combining trigger information from the Barrel and the End-cap regions, and by providing the combined muon candidate to the Central Trigger Processor (CTP). For almost a decade, the Level-1 Trigger system has been operating very well, despite the challenging requirements on trigger efficiency and performance, and the continuously increasing LHC luminosity. In order to cope with these constraints, various upgrades for the full trigger system were already deployed, and others have been designed to be installed in the next years. Most of the upgrades to the...

  10. Higher brightness beams from the SPS for the HL-LHC era

    CERN Document Server

    AUTHOR|(CDS)2085448; Bracco, Chiara (CERN)

    The need to push the LHC beyond its limits and increase the luminosity deliverable to the experiments by about one order of magnitude has driven the ongoing injector and HL-LHC upgrades. The higher luminosity requires an increase in beam brightness, which translates directly into the need to adapt the different machine protection systems. Among all the foreseen upgrades, the transfer line collimators (TCDI) and the LHC injection protection systems will be revised. In particular, the guaranteed protection is evaluated in this Ph D work, together with the specification for the minimum shielded aperture in case of injection failures. A detailed model is also developed which ensures a more reliable and efficient procedure for the validation of the TCDI setup within the required accuracy. Physics beyond colliders will also be pushed past its current limits in the HL-LHC era. SHiP, a new proposed fixed-target experiment served by the SPS, is under study. The unprecedented level of requested protons on target per ...

  11. Latest news from the LHC

    CERN Document Server

    CERN Bulletin

    2010-01-01

    Last week the LHC passed the threshold of 3 pb-1 total integrated luminosity delivered to the experiments, of which about half was delivered in just one week. These excellent results were achieved by operating the machine with up to 50 nominal bunches per beam.   After a very successful week that saw intense beams circulating for long periods (a total of 76.5 hours of stable beams, corresponding to about 40% of the time), there has been a technical stop this week. Over the coming days, experts will work on bunch trains with 150 ns spacing between bunches (the current minimum spacing is 1000 ns). This will involve making the necessary changes throughout the injector chain, as well as in the LHC itself. In the LHC, bunch trains imply working with a crossing angle throughout the machine cycle, in order to avoid unwanted parasitic collisions, which means that the whole process of injection, ramp and squeeze will have to be re-commissioned. The task also includes re-commissioning all the protection syste...

  12. Commissioning the cryogenic system of the first LHC sector

    International Nuclear Information System (INIS)

    Millet, F.; Claudet, S.; Ferlin, G.; Perin, A.; Riddone, G.; Serio, L.; Soubiran, M.; Tavian, L.; CERN; Ronayette, L.; GHMFL, Grenoble; Rabehl, R.; Fermilab

    2007-01-01

    The LHC machine, composed of eight sectors with superconducting magnets and accelerating cavities, requires a complex cryogenic system providing high cooling capacities (18 kW equivalent at 4.5 K and 2.4 W at 1.8 K per sector produced in large cold boxes and distributed via 3.3-km cryogenic transfer lines). After individual reception tests of the cryogenic subsystems (cryogen storages, refrigerators, cryogenic transfer lines and distribution boxes) performed since 2000, the commissioning of the cryogenic system of the first LHC sector has been under way since November 2006. After a brief introduction to the LHC cryogenic system and its specificities, the commissioning is reported detailing the preparation phase (pressure and leak tests, circuit conditioning and flushing), the cool-down sequences including the handling of cryogenic fluids, the magnet powering phase and finally the warm-up. Preliminary conclusions on the commissioning of the first LHC sector will be drawn with the review of the critical points already solved or still pending. The last part of the paper reports on the first operational experience of the LHC cryogenic system in the perspective of the commissioning of the remaining LHC sectors and the beam injection test

  13. Tevatron-for-LHC Report of the QCD Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Albrow, Michael G.; Begel, M.; Bourilkov, D.; Campanelli, M.; Chlebana, F.; De Roeck, A.; Dittmann, J.R.; Ellis, S.D.; Field, B.; Field, R.; Gallinaro, M.; /Fermilab

    2006-10-01

    The experiments at Run 2 of the Tevatron have each accumulated over 1 fb-1 of high-transverse-momentum data. Such a dataset allows for the first precision tests of QCD at a hadron collider, i.e. comparisons between theory and experiment at the few-percent level. While the Large Hadron Collider has been designed as a discovery machine, basic QCD analyses will still need to be performed to understand the working environment. The Tevatron-for-LHC workshop was conceived as a communication link to pass on the expertise of the Tevatron and to test new analysis ideas coming from the LHC community. The TeV4LHC QCD Working Group focused on important aspects of QCD at hadron colliders: jet definitions, extraction and use of Parton Distribution Functions, the underlying event, Monte Carlo tunes, and diffractive physics. This report summarizes some of the results achieved during this workshop.

  14. LHC data and cosmic ray coplanarity at superhigh energies

    Directory of Open Access Journals (Sweden)

    Mukhamedshin R.A.

    2017-01-01

    A new phenomenological model, FANSY 2.0, is designed, which makes it possible to simulate hadron interactions via traditional and coplanar generation of the most energetic particles, as well as to reproduce a wide range of LHC data (ALICE, ATLAS, CMS, TOTEM, LHCf). Features of the model are compared with LHC data. Problems of coplanarity are considered and a testing experiment is proposed.

  15. LHC Power Distribution

    CERN Document Server

    Pedersen, J

    1999-01-01

    The power distribution for the LHC machine and its experiments will be realised making extensive use of the existing infrastructure for LEP. The overall power requirement is approximately the same, about 125 MW. The load distribution will however change: the even points will lose in importance, while points 1 and 5 will gain due to the installation of ATLAS and CMS. A thorough reorganisation of the 18 kV distribution will thus be necessary. Due to the substantial cryogenic installations required for the LHC, the 3.3 kV distribution system, supplying mainly cryogenic compressors, will be extended with a number of new substations. The large number of new surface buildings, underground caverns and other underground structures will all receive general service installations: lighting and power. The new injection tunnels will require complete installations: A.C. supplies for the power converters and for general service, and D.C. cabling for the magnets of the beam line. Special safe power installations ar...

  16. Combined Ramp and Squeeze to 6.5 TeV in the LHC

    CERN Document Server

    Solfaroli Camillocci, Matteo; Tomás, Rogelio; Wenninger, Jorg

    2016-01-01

    The cycle of the LHC is composed of an energy ramp followed by a betatron squeeze, needed to reduce the beta-star value in the interaction points. Since Run 1, studies have been carried out to investigate the feasibility of combining the two operations, thus considerably reducing the duration of the operational cycle. In Run 2, the LHC is operating at an energy of 6.5 TeV, which requires a much longer cycle than that of Run 1; the performance gain from a Combined Ramp and Squeeze (CRS) is therefore all the more interesting. Merging the energy ramp and the betatron squeeze could result in a gain of several minutes for each LHC cycle. With increasing maturity of LHC operation, it is now possible to envisage more complex beam manipulations; this paper describes the first machine experiment with beam, aiming at validating the combination of ramp and squeeze, which was performed in 2015, during a machine development phase. The operation experience with the LHC run at 2.51 TeV, when CRS down to 4 meters was deployed and ...

  17. The first experience with LHC beam gas ionization monitor

    CERN Document Server

    Sapinski, M; Dehning, B; Guerrero, A; Patecki, M; Versteegen, R

    2012-01-01

    The Beam Gas Ionization Monitors (BGI) are used to measure beam emittance in the LHC. This paper describes the detectors and their operation, and discusses the issues met during commissioning. It also discusses the various calibration procedures used to correct for the non-uniformity of the micro-channel plates and to correct the beam size for effects affecting the electron trajectory after ionization.

  18. Development of a highly selective muon trigger exploiting the high spatial resolution of monitored drift-tube chambers for the ATLAS experiment at the HL-LHC

    CERN Document Server

    Kortner, Oliver; The ATLAS collaboration

    2018-01-01

    The High-Luminosity LHC will provide the unique opportunity to explore the nature of physics beyond the Standard Model. Highly selective first-level triggers are essential for the physics programme of the ATLAS experiment at the HL-LHC, where the instantaneous luminosity will exceed the LHC design instantaneous luminosity by almost an order of magnitude. The ATLAS first-level muon trigger rate is dominated by low-momentum muons, selected due to the moderate momentum resolution of the current system. This first-level trigger limitation can be overcome by including data from the precision muon drift tube (MDT) chambers. This requires the fast continuous transfer of the MDT hits to the off-detector trigger logic and a fast track reconstruction algorithm performed in the trigger logic. The feasibility of this approach was studied with LHC collision data and simulated data. Two main options for the hardware implementation will be studied with demonstrators: an FPGA-based option with an embedded ARM microprocessor ...

  19. Development of a Highly Selective Muon Trigger Exploiting the High Spatial Resolution of Monitored Drift-Tube Chambers for the ATLAS Experiment at the HL-LHC

    CERN Document Server

    Kortner, Oliver; The ATLAS collaboration

    2018-01-01

    The High-Luminosity LHC will provide the unique opportunity to explore the nature of physics beyond the Standard Model. Highly selective first-level triggers are essential for the physics programme of the ATLAS experiment at the HL-LHC, where the instantaneous luminosity will exceed the LHC design instantaneous luminosity by almost an order of magnitude. The ATLAS first-level muon trigger rate is dominated by low-momentum muons, selected due to the moderate momentum resolution of the current system. This first-level trigger limitation can be overcome by including data from the precision muon drift tube (MDT) chambers. This requires the fast continuous transfer of the MDT hits to the off-detector trigger logic and a fast track reconstruction algorithm performed in the trigger logic. The feasibility of this approach was studied with LHC collision data and simulated data. Two main options for the hardware implementation are currently studied with demonstrators: an FPGA-based option with an embedded ARM microproc...

  20. Tile Calorimeter Upgrade Program for the Luminosity Increasing at the LHC

    CERN Document Server

    Cerqueira, Augusto Santiago; The ATLAS collaboration

    2015-01-01

    The Tile Calorimeter (TileCal) is the central hadronic calorimeter of the ATLAS experiment at the Large Hadron Collider (LHC). TileCal is a sampling calorimeter with approximately 10,000 channels and has been operating successfully in ATLAS (data-quality efficiency above 99%) since the start of LHC collisions. The LHC is scheduled to undergo a major upgrade in 2022 for the High Luminosity LHC (HL-LHC), where the luminosity will be increased by a factor of 10 above the original design value. The ATLAS upgrade program for high luminosity is split into three phases: Phase 0 occurred during 2013-2014 (Long Shutdown 1) and prepared the LHC for Run 2; Phase 1, foreseen for 2019 (Long Shutdown 2), will prepare the LHC for Run 3, after which the peak luminosity will reach 2-3 x 10^{34} cm^{-2}s^{-1}; finally, Phase 2, foreseen for 2024 (Long Shutdown 3), will prepare the collider for HL-LHC operation (5-7 x 10^{34} cm^{-2}s^{-1}). The TileCal main activities for Phase 0 were the installation of the new low v...

  1. Tile Calorimeter Upgrade Program for the Luminosity Increasing at the LHC

    CERN Document Server

    Cerqueira, Augusto Santiago; The ATLAS collaboration

    2015-01-01

    The Tile Calorimeter (TileCal) is the central hadronic calorimeter of the ATLAS experiment at the Large Hadron Collider (LHC). TileCal is a sampling calorimeter with approximately 10,000 channels and has been operating successfully in ATLAS (data-quality efficiency above 99%) since the start of LHC collisions. The LHC is scheduled to undergo a major upgrade in 2022 for the High Luminosity LHC (HL-LHC), where the luminosity will be increased by a factor of 10 above the original design value. The ATLAS upgrade program for high luminosity is split into three phases: Phase 0 occurred during 2013-2014 (Long Shutdown 1) and prepared the LHC for Run 2; Phase 1, foreseen for 2019 (Long Shutdown 2), will prepare the LHC for Run 3, after which the peak luminosity will reach 2-3 x 10^{34} cm^{-2}s^{-1}; finally, Phase 2, foreseen for 2023 (Long Shutdown 3), will prepare the collider for HL-LHC operation (5-7 x 10^{34} cm^{-2}s^{-1}). The TileCal main activities for Phase 0 were the installation of the new low v...

  2. High Energy LHC Document prepared for the European HEP strategy update

    CERN Document Server

    Brüning, O; Mangano, M; Myers, S; Rossi, L; Todesco, E; Zimmermann, F

    2012-01-01

    The LHC will run to produce physics at the energy frontier of 13-14 TeV c.o.m. for protons for the next 20-25 years. The possibility of increasing the proton beam energy well beyond its nominal value of 7 TeV was addressed by a study group in 2010 and then discussed in a workshop in October 2010. The reuse of the CERN infrastructure, the “ease” of producing luminosity with a circular proton collider and the practical and technical experience gained with the LHC are all concurring reasons to explore this route. The High Energy LHC relies on the “natural” evolution of the LHC technologies. The High Luminosity LHC (HL-LHC) demands going 50% beyond the magnetic field limit of the LHC: HL-LHC can therefore be considered the first milestone on the path toward the highest energy. The beam energy is set by the strength of the superconducting magnets: assuming a dipole field in the range 16-20 T, the maximum attainable collision energy falls in the range of 26 to 33 TeV in the centre of mass. The driving techno...
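    The quoted 26-33 TeV range follows from beam rigidity: at fixed bending radius, the beam energy scales linearly with the dipole field. A minimal sketch of that scaling, assuming the standard LHC reference point of 7 TeV per beam at 8.33 T:

```python
# Beam energy scales linearly with dipole field at fixed ring radius:
# E_beam ~ E_LHC * (B / B_LHC). Nominal LHC design: 8.33 T gives 7 TeV/beam.
# The linear-rigidity model is a simplification (it ignores changes in the
# magnetic filling factor of the ring).

B_LHC, E_LHC = 8.33, 7.0  # tesla, TeV per beam (nominal LHC design values)

def com_energy(dipole_field_t: float) -> float:
    """Centre-of-mass energy (TeV) for a given dipole field in the LHC tunnel."""
    return 2 * E_LHC * dipole_field_t / B_LHC

for b in (16.0, 20.0):
    print(f"{b:.0f} T -> {com_energy(b):.1f} TeV c.o.m.")
```

    For 16-20 T dipoles this reproduces the 26-33 TeV centre-of-mass window quoted in the abstract.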

  3. Top Quark and Higgs Boson Physics at LHC-ATLAS

    Science.gov (United States)

    Tomoto, M.

    2013-03-01

    One of the main goals of the Large Hadron Collider (LHC) experiments at CERN in Switzerland is to address the "origin of mass" by discovering the Higgs boson and understanding the interaction of the Higgs boson with the elementary particles. ATLAS, which is one of the LHC experiments, has taken about 5 fb-1 of physics-quality data and has published several results regarding the "origin of mass" since March 2010. This presentation focuses on the latest results on the heaviest elementary particle, namely the top quark, and on the Higgs boson searches from ATLAS.

  4. The LHC Computing Grid in the starting blocks

    CERN Multimedia

    Danielle Amy Venton

    2010-01-01

    As the Large Hadron Collider ramps up operations and breaks world records, it is an exciting time for everyone at CERN. To get the computing perspective, the Bulletin this week caught up with Ian Bird, leader of the Worldwide LHC Computing Grid (WLCG). He is confident that everything is ready for the first data.   The metallic globe illustrating the Worldwide LHC Computing GRID (WLCG) in the CERN Computing Centre. The Worldwide LHC Computing Grid (WLCG) collaboration has been in place since 2001 and for the past several years it has continually run the workloads for the experiments as part of their preparations for LHC data taking. So far, the numerous and massive simulations of the full chain of reconstruction and analysis software could only be carried out using Monte Carlo simulated data. Now, for the first time, the system is starting to work with real data and with many simultaneous users accessing them from all around the world. “During the 2009 large-scale computing challenge (...

  5. LHC II system sensitivity to magnetic fluids

    CERN Document Server

    Cotae, Vlad

    2005-01-01

    Experiments have been designed to reveal the influences of ferrofluid treatment and static magnetic field exposure on photosystem II, where the light harvesting complex (LHC II) controls the ratio chlorophyll a / chlorophyll b (revealing, indirectly, the photosynthesis rate). Spectrophotometric measurement of the chlorophyll content revealed different influences for relatively low ferrofluid concentrations (10-30 µl/l) in comparison to higher concentrations (70-100 µl/l). The superimposed static magnetic field further enhanced the stimulatory action of the ferrofluid on the LHC II system in young poppy plantlets.

  6. Future Plans of the ATLAS Collaboration for the HL-LHC

    CERN Document Server

    Hristova, Ivana; The ATLAS collaboration

    2018-01-01

    These proceedings report the current plans to upgrade the ATLAS detector at CERN for the High Luminosity LHC (HL-LHC). The HL-LHC is expected to start operations in the middle of 2026, aiming to reach an ultimate peak instantaneous luminosity of 7.5 x 10^{34} cm^{-2}s^{-1}, corresponding to approximately 200 inelastic proton-proton collisions per bunch crossing, and to deliver over a period of twelve years more than ten times the integrated luminosity of the Large Hadron Collider (LHC) Runs 1-3 combined (up to 4000 fb-1). This is a huge challenge to all sub-systems of the detector, which will need extensive upgrades to allow the experiment to pursue a rich and interesting physics programme in the future.
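    The ~200 collisions per crossing quoted above can be recovered from the peak luminosity with a back-of-the-envelope pile-up estimate; the inelastic cross-section, colliding-bunch count and revolution frequency below are typical published values, assumed here only for illustration:

```python
# Rough pile-up estimate: mu = L * sigma_inel / f_crossing.
# Only the 7.5e34 luminosity comes from the abstract; the ~80 mb inelastic
# pp cross-section, 2556 colliding bunches and 11.245 kHz revolution
# frequency are typical values, used here as assumptions.

L = 7.5e34            # cm^-2 s^-1, HL-LHC ultimate peak luminosity
sigma_inel = 80e-27   # cm^2  (~80 mb inelastic pp cross-section)
n_bunches = 2556      # colliding bunch pairs (typical 25 ns filling scheme)
f_rev = 11245.0       # Hz, LHC revolution frequency

event_rate = L * sigma_inel          # inelastic collisions per second
crossing_rate = n_bunches * f_rev    # bunch crossings per second
mu = event_rate / crossing_rate
print(f"mean pile-up mu ~ {mu:.0f}")
```

    With these inputs the estimate lands near the ~200 collisions per crossing quoted in the abstract.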

  7. Study of Rare Beauty Decays with the ATLAS Detector at LHC and MDT Chamber Performances

    CERN Document Server

    Policicchio, Antonio

    2006-01-01

    The Large Hadron Collider (LHC) is a proton-proton collider that will operate at a centre-of-mass energy of 14 TeV and at a maximum luminosity of L = 10^{34} cm^{-2}s^{-1}. The LHC will reproduce interactions similar to those which existed when the universe was only ~10^{-12} s old, conditions which have not been achieved in any previous collider. The primary goals of the LHC project are to discover the origin of particle masses, to explain why different particles have different masses and to search for new phenomena beyond the Standard Model. Heavy-quark systems and precision measurements of Standard Model parameters will also be subjects of LHC physics studies. ATLAS (A Toroidal LHC ApparatuS) is one of the two LHC general-purpose experiments. The guiding principle in optimizing the ATLAS experiment has been maximizing the discovery potential for New Physics such as Higgs bosons and supersymmetric particles, while keeping the capability of high precision measurements of known objects such as heavy quar...

  8. Muon Event Filter Software for the ATLAS Experiment at LHC

    CERN Document Server

    Biglietti, M; Assamagan, Ketevi A; Baines, J T M; Bee, C P; Bellomo, M; Bogaerts, J A C; Boisvert, V; Bosman, M; Caron, B; Casado, M P; Cataldi, G; Cavalli, D; Cervetto, M; Comune, G; Conde, P; Conde-Muíño, P; De Santo, A; De Seixas, J M; Di Mattia, A; Dos Anjos, A; Dosil, M; Díaz-Gómez, M; Ellis, Nick; Emeliyanov, D; Epp, B; Falciano, S; Farilla, A; George, S; Ghete, V M; González, S; Grothe, M; Kabana, S; Khomich, A; Kilvington, G; Konstantinidis, N P; Kootz, A; Lowe, A; Luminari, L; Maeno, T; Masik, J; Meessen, C; Mello, A G; Merino, G; Moore, R; Morettini, P; Negri, A; Nikitin, N V; Nisati, A; Padilla, C; Panikashvili, N; Parodi, F; Pinfold, J L; Pinto, P; Primavera, M; Pérez-Réale, V; Qian, Z; Resconi, S; Rosati, S; Santamarina-Rios, C; Scannicchio, D A; Schiavi, C; Segura, E; Sivoklokov, S Yu; Soluk, R A; Stefanidis, E; Sushkov, S; Sutton, M; Sánchez, C; Tapprogge, Stefan; Thomas, E; Touchard, F; Venda-Pinto, B; Ventura, A; Vercesi, V; Werner, P; Wheeler, S; Wickens, F J; Wiedenmann, W; Wielers, M; Zobernig, G; Computing In High Energy Physics

    2005-01-01

    At the LHC, the 40 MHz bunch crossing rate dictates a high selectivity of the ATLAS Trigger system, which has to keep the full physics potential of the experiment in spite of a limited storage capability. The level-1 trigger, implemented in custom hardware, will reduce the initial rate to 75 kHz and is followed by the software-based level-2 trigger and Event Filter, usually referred to as the High Level Triggers (HLT), which further reduce the rate to about 100 Hz. In this paper an overview is given of the implementation of the offline muon reconstruction algorithms MOORE (Muon Object Oriented REconstruction) and MuId (Muon Identification) as Event Filter in the ATLAS online framework. The MOORE algorithm performs the reconstruction inside the Muon Spectrometer, providing a precise measurement of the muon track parameters outside the calorimeters; MuId combines the measurements of all ATLAS sub-detectors in order to identify muons and provides the best estimate of their momentum at the production vertex. In the HLT implementatio...
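    The selectivity described above corresponds to very large per-stage rejection factors; a small sketch tabulating them directly from the rates quoted in the abstract:

```python
# Rejection factors implied by the quoted ATLAS trigger rates:
# 40 MHz bunch crossings -> 75 kHz after level-1 -> ~100 Hz after the HLT.
rates = {"bunch crossing": 40e6, "level-1": 75e3, "HLT output": 100.0}  # Hz

stages = list(rates.items())
for (name_in, r_in), (name_out, r_out) in zip(stages, stages[1:]):
    print(f"{name_in} -> {name_out}: rejection factor ~{r_in / r_out:,.0f}")
print(f"overall: ~{rates['bunch crossing'] / rates['HLT output']:,.0f}")
```

    The overall factor of about 400,000 is what the combination of custom hardware (level-1) and software triggers (HLT) must deliver.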

  9. LHC report

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    This week's Report, by Gianluigi Arduini, will be included in the LHC Physics Day, dedicated to reviews of the LHC physics results presented at ICHEP 2010. See http://indico.cern.ch/conferenceDisplay.py?confId=102669

  10. PDF4LHC recommendations for LHC Run II

    NARCIS (Netherlands)

    Butterworth, Jon; Carrazza, Stefano; Cooper-Sarkar, Amanda; Roeck, Albert de; Feltesse, Joel; Forte, Stefano; Gao, Jun; Glazov, Sasha; Huston, Joey; Kassabov, Zahari; McNulty, Ronan; Morsch, Andreas; Nadolsky, Pavel; Radescu, Voica; Rojo, Juan; Thorne, Robert S.

    2015-01-01

    We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+alpha_s uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new

  11. LHC 2012 proton run extended by seven weeks

    CERN Multimedia

    James Gillies

    2012-01-01

    An important piece of news that almost got lost in the excitement of the Higgs update seminar on 4 July is that the 2012 LHC proton run is to be extended.   On 3 July, a meeting was held between the CERN Management and representatives of the LHC and the experiments to discuss the merits of increasing the data target for this year in the light of the announcement to be made the following day. The conclusion was that an additional seven weeks of running would allow the luminosity goal for the year to be increased from 15 inverse femtobarns to 20, giving the experiments a good supply of data to work on during the LHC’s first long shut-down (LS1), and allowing them to make progress in determining the properties of the new particle whose discovery was announced last week. The current LHC schedule foresees proton running reaching a conclusion on 16 October, with a proton-ion run scheduled for November. In the preliminary new schedule, proton running is planned to continue until 16 December, ...

  12. CORAL and COOL during the LHC long shutdown.

    CERN Document Server

    Valassi, Andrea; Dykstra, D; Goyal, N; Salnikov, A; Trentadue, R; Wache, M

    2014-01-01

    CORAL and COOL are two software packages used by the LHC experiments for managing detector conditions and other types of data using relational database technologies. They have been developed and maintained within the LCG Persistency Framework, a common project of the CERN IT department with ATLAS, CMS and LHCb. This presentation reports on the status of CORAL and COOL at the time of CHEP2013, covering the new features and enhancements in both packages, as well as the changes and improvements in the software process infrastructure. It also reviews the usage of the software in the experiments and the outlook for ongoing and future activities during the LHC long shutdown (LS1) and beyond.

  13. CORAL and COOL during the LHC long shutdown

    CERN Multimedia

    Valassi, A; Dykstra, D; Goyal, N; Salnikov, A; Trentadue, R; Wache, M

    2013-01-01

    CORAL and COOL are two software packages used by the LHC experiments for managing detector conditions and other types of data using relational database technologies. They have been developed and maintained within the LCG Persistency Framework, a common project of the CERN IT department with ATLAS, CMS and LHCb. This presentation reports on the status of CORAL and COOL at the time of CHEP2013, covering the new features and enhancements in both packages, as well as the changes and improvements in the software process infrastructure. It also reviews the usage of the software in the experiments and the outlook for ongoing and future activities during the LHC long shutdown (LS1) and beyond.

  14. The last stage of LHC construction

    International Nuclear Information System (INIS)

    Serin, L.

    2006-01-01

    A few months ago the installation of the LHC (Large Hadron Collider) machine began in the LEP tunnel at CERN. The LHC is composed of 1200 dipole magnets that are progressively being installed in the 27 km long underground circular facility. Its two general-purpose experiments, ATLAS and CMS, huge both in the size of their respective detectors (40 x 20 x 20 m) and in their number of participants (about 1500 people each), are being built in gigantic caverns. All efforts are concentrated on having every component of the machine fully installed by summer 2007, in order to get the first collisions before 2008

  15. Heavy-ion performance of the LHC and future colliders

    Energy Technology Data Exchange (ETDEWEB)

    Schaumann, Michaela

    2015-04-29

    In 2008 the Large Hadron Collider (LHC) and its experiments started operation at the European Organization for Nuclear Research (CERN) in Geneva, with the main aim of finding or excluding the Higgs boson. Only four years later, on the 4th of July 2012, the discovery of a Higgs-like particle was announced and first published by the two main experiments, ATLAS and CMS. Even though proton-proton collisions are the main operation mode of the LHC, it also acts as a heavy-ion collider. Here, the term "heavy-ion collisions" refers to collisions between fully stripped nuclei. While the major hardware systems of the LHC are compatible with heavy-ion operation, the beam dynamics and performance limits of ion beams are quite different from those of protons. Because of the higher mass and charge of the ions, beam dynamic effects like intra-beam scattering and radiation damping are stronger. Also the electromagnetic cross-sections in the collisions are larger, leading to significantly faster intensity decay and thus shorter luminosity lifetimes. As the production cross-sections for various physics processes under study by the experiments are still small at energies reachable with the LHC, and because the heavy-ion run time is limited to a few days per year, it is essential to obtain the highest possible collision rate, i.e. to maximise the instantaneous luminosity, in order to obtain enough events and therefore low statistical errors. Within this thesis, the past performance of the LHC in lead-lead (Pb-Pb) collisions, at a centre-of-mass energy of 2.76 TeV per colliding nucleon pair, is analysed and potential luminosity limitations are identified. Tools are developed to predict future performance and techniques are presented to further increase the luminosity. Finally, a perspective on the future of high-energy heavy-ion colliders is given.

  16. Keeping HL-LHC accountable

    CERN Multimedia

    2015-01-01

    This week saw the cost and schedule of the High Luminosity LHC (HL-LHC) and LHC Injectors Upgrade (LIU) projects come under close scrutiny from the external review committee set up for the purpose.    HL-LHC, whose implementation requires an upgrade to the CERN injector complex, responds directly to one of the key recommendations of the updated European Strategy for Particle Physics, which urges CERN to prepare for a ‘major luminosity upgrade’, a recommendation that is also perfectly in line with the P5 report on the US strategy for the field. Responding to this recommendation, CERN set up the HL-LHC project in 2013, partially supported by FP7 funding through the HiLumi LHC Design Study (2011-2015), and coordinated with the American LARP project, which oversees the US contribution to the upgrade. A key element of HL-LHC planning is a mechanism for receiving independent expert advice on all aspects of the project.  To this end, several technical reviews h...

  17. Availability modeling approach for future circular colliders based on the LHC operation experience

    CERN Document Server

    AUTHOR|(CDS)2096726; Apollonio, Andrea; Gutleber, Johannes; Sollander, Peter; Penttinen, Jussi-Pekka; Virtanen, Seppo Johannes

    2016-01-01

    Reaching the challenging integrated luminosity production goals of a future circular hadron collider (FCC-hh) and the high-luminosity LHC (HL-LHC) requires a thorough understanding of today's most powerful high energy physics research infrastructure, the LHC accelerator complex at CERN. FCC-hh, a four times larger collider ring, aims at delivering 10-20 ab^-1 of integrated luminosity at seven times higher collision energy. Since the identification of the key factors that impact availability and cost is far from obvious, a dedicated activity has been launched in the frame of the future circular collider study to develop models to study possible ways to optimize accelerator availability. This paper introduces the FCC reliability and availability study, which takes a fresh new look at assessing and modeling reliability and availability of particle accelerator infrastructures. The paper presents a probabilistic approach for Monte Carlo simulation of the machine operational cycle, schedule and availability for p...
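    The probabilistic approach described above can be illustrated with a toy Monte Carlo of an operational cycle. This is only a sketch of the technique, not the study's model: the fault rate, repair time and turnaround values are invented placeholders, not measured LHC or FCC parameters.

```python
# Toy Monte Carlo of an accelerator operational cycle: alternate turnaround,
# an exponentially distributed fault-free fill, and a repair period, then
# report the fraction of scheduled time spent in stable beams.
# All numeric parameters below are invented placeholders.
import random

random.seed(42)  # reproducible toy result

def simulate_year(hours=5000.0, mtbf=12.0, mttr=6.0, turnaround=3.0):
    """Return the fraction of scheduled time spent delivering stable beams."""
    t, stable = 0.0, 0.0
    while t < hours:
        t += turnaround                        # refill, ramp and squeeze
        fill = random.expovariate(1.0 / mtbf)  # hours until the next fault
        stable += min(fill, max(0.0, hours - t))
        t += fill + mttr                       # fill length plus repair time
    return stable / hours

avail = sum(simulate_year() for _ in range(200)) / 200
print(f"simulated stable-beam fraction ~ {avail:.0%}")
```

    In a fuller model of the kind the paper describes, each subsystem would contribute its own failure and repair distributions, and the schedule (technical stops, machine development) would be overlaid on the cycle.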

  18. Highlights of the SM Physics at the LHC

    CERN Document Server

    Yang, Haijun; The ATLAS collaboration

    2015-01-01

    This talk presents recent highlights of SM physics from the ATLAS and CMS experiments at the LHC. It includes precision measurements of diboson and triboson production and vector boson scattering, and indirect searches for new physics via anomalous triple/quartic gauge boson couplings. Some of the latest results from LHC Run 2 at 13 TeV are also presented. The talk was an invited presentation at the 5th KIAS Workshop on Particle Physics and Cosmology in Seoul on November 9-13, 2015.

  19. IONS FOR LHC STATUS OF THE INJECTOR CHAIN

    CERN Document Server

    Manglunki, Django; Borburgh, J; Carli, C; Chanel, M; Dumas, L; Fowler, T; Gourber-Pace, M; Hancock, S; Hourican, M; Jowett, John M; Küchler, D; Mahner, E; Martini, M; Maury, S; Pasinelli, S; Raich, U; Rey, A; Royer, J-P; Scrivens, R; Sermeus, L; Tranquille, G; Vallet, J L; Vandorpe, B

    2007-01-01

    The LHC will, in addition to proton runs, be operated with Pb ions and provide collisions at energies of 5.5 TeV per nucleon pair, i.e. more than 1.1 PeV per event, to experiments. The transformation of CERN's ion injector complex (Linac3-LEIR-PS-SPS) to allow collision of ions in LHC in 2008 is well under way. The status of these modifications and the latest results of commissioning will be presented. The remaining challenges are reviewed.
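    The quoted per-event energy is a simple product: 5.5 TeV per colliding nucleon pair summed over the 208 nucleons of a lead nucleus indeed exceeds 1.1 PeV. A quick cross-check:

```python
# Cross-check of the figures quoted above: 5.5 TeV per colliding nucleon
# pair times A = 208 nucleons of a 208Pb nucleus gives the total
# centre-of-mass energy per Pb-Pb event.
sqrt_s_nn = 5.5   # TeV, centre-of-mass energy per nucleon pair
A = 208           # mass number of lead-208

total_tev = sqrt_s_nn * A
print(f"total c.o.m. energy per Pb-Pb event: {total_tev / 1000:.2f} PeV")
```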

  20. Exotic highly ionising particles at the LHC

    CERN Document Server

    De Roeck, A; Mermod, P; Milstead, D; Sloan, T

    2012-01-01

    The experiments at the Large Hadron Collider (LHC) are able to discover or set limits on the production of exotic particles with TeV-scale masses possessing values of electric and/or magnetic charge such that they appear as highly ionising particles (HIPs). In this paper the sensitivity of the LHC experiments to HIP production is discussed in detail. It is shown that a number of different detection methods are required to investigate as fully as possible the charge-mass range. These include direct detection as the HIPs pass through detectors and, in the case of magnetically charged objects, the so-called induction method with which monopoles which stop in accelerator and detector material could be observed. The benefit of using complementary approaches to HIP detection is discussed.

  1. Transverse emittance measurement and preservation at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, Maria

    2016-06-20

    During LHC Run 1, significant transverse emittance growth throughout the LHC cycle was observed. About 30% of the potential luminosity performance was lost through the different phases of the LHC cycle. At the LHC design stage the total allowed emittance increase through the cycle was set to 7%. Measurements indicated that most of the blow-up occurred during the injection plateau and the ramp. Intra-beam scattering was one of the main drivers of emittance growth. In April 2015 the LHC re-started with a collision energy of 6.5 TeV per beam. This thesis presents the first transverse emittance preservation studies in LHC Run 2 with 25 ns beams. A breakdown of the growth throughout the various phases of the LHC cycle is given for low-intensity beams measured with wire scanners. Also presented is data collected from synchrotron light monitors and the LHC experiments. Finally, the emittance growth results are compared to intra-beam scattering simulations. A theory of emittance growth due to noise from the LHC transverse damper and other external sources is discussed. The results of the investigations are summarized, and an outlook in terms of emittance blow-up for future LHC upgrade scenarios with low-emittance beams is given.
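    A simplified round-beam model, which ignores hourglass and crossing-angle factors, links the two numbers quoted above: luminosity scales inversely with emittance, so the 7% emittance budget costs only a few percent of luminosity, while a ~30% luminosity loss corresponds to roughly 40% emittance blow-up.

```python
# Simplified round-beam model (an assumption, not the thesis's analysis):
# luminosity L scales as 1/emittance, so a fractional emittance growth g
# costs a luminosity fraction 1 - 1/(1+g).
def lumi_loss(emittance_growth: float) -> float:
    """Fractional luminosity loss for a given fractional emittance growth."""
    return 1.0 - 1.0 / (1.0 + emittance_growth)

for g in (0.07, 0.40):
    print(f"emittance growth {g:.0%} -> luminosity loss {lumi_loss(g):.0%}")
```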

  2. Bremsstrahlung from Relativistic Heavy Ions in a Fixed Target Experiment at the LHC

    International Nuclear Information System (INIS)

    Mikkelsen, Rune E.; Uggerhøj, Ulrik I.; Sørensen, Allan H.

    2015-01-01

    We calculate the emission of bremsstrahlung from lead and argon ions in ultraperipheral collisions in a fixed target experiment (AFTER) that uses the LHC beams. With nuclear charges Ze equal to 82e and 18e, respectively, these ions are accelerated to energies of 7 TeV × Z. The bremsstrahlung peaks around 100 GeV and the spectrum exposes the nuclear structure of the incoming ion. The peak structure is significantly different from the flat power spectrum pertaining to a point charge. Photons are predominantly emitted within an angle of 1/γ of the direction of ion propagation. Our calculations are based on the Weizsäcker-Williams method of virtual quanta with application of existing experimental data on photonuclear interactions.
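    The 1/γ emission cone can be made concrete for the lead beam; a rough estimate, assuming an average rest energy per nucleon of about 0.931 GeV (an approximation, not a value from the abstract):

```python
# Rough size of the 1/gamma emission cone for a 7 TeV x Z lead beam.
# The average rest energy per nucleon (0.931 GeV) is an assumed
# approximation used only for this order-of-magnitude estimate.
Z, A = 82, 208                          # lead-208
E_per_nucleon_tev = 7.0 * Z / A         # ~2.76 TeV per nucleon
m_nucleon_gev = 0.931                   # average rest energy per nucleon

gamma = E_per_nucleon_tev * 1000.0 / m_nucleon_gev
print(f"gamma ~ {gamma:.0f}, opening angle 1/gamma ~ {1e6 / gamma:.0f} microrad")
```

    The emission is thus confined to a cone of a few hundred microradians around the beam direction.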

  3. Operational experience with the CMS pixel detector in LHC Run II

    CERN Document Server

    Karancsi, Janos

    2016-01-01

    The CMS pixel detector was successfully repaired, calibrated and commissioned for the second run of the Large Hadron Collider during the first long shutdown, between 2013 and 2015. The replaced pixel modules were calibrated separately and show the expected behaviour of an un-irradiated detector. In 2015, the system performed very well, with spatial resolution even improved compared to 2012. During this time, the operational team faced various challenges, including the loss of a sector in one half-shell, which was only partially recovered. In 2016, the detector is expected to withstand instantaneous luminosities beyond the design limits and will need a combined effort of both online and offline teams in order to provide the high-quality data that is required to reach the physics goals of CMS. We present the operational experience gained during the second run of the LHC and show the latest performance results of the CMS pixel detector.

  4. Recent photon physics results from the ALICE experiment at the LHC

    CERN Document Server

    Arbor, Nicolas

    2013-01-01

We present an overview of the photon analyses in pp and Pb-Pb collisions with data taken by the ALICE experiment at the LHC. ALICE reconstructs photons using its two electromagnetic calorimeters (a photon spectrometer and a sampling calorimeter) and the central tracking systems, the latter via photons that convert to e+e− pairs in the material of the inner ALICE layers. In Pb-Pb collisions the direct-photon calculations underpredict the data below 4 GeV/c, where a contribution from thermal radiation is expected. The direct-photon measurement also shows evidence for a non-zero elliptic flow for 1 < pT < 3 GeV/c. The nuclear modification factor of π0 production at different collision centralities shows a clear pattern of strong suppression in a hot QCD medium with respect to pp collisions. Finally, parton fragmentation following hard collisions is investigated by correlating high-momentum direct photons and charged hadrons, with the goal of revealing new insights into medium effects in the QGP.

  5. LHC synchronization test successful

    CERN Multimedia

The synchronization of the LHC's clockwise beam transfer system and the rest of CERN's accelerator chain was successfully achieved last weekend. Tests began on Friday 8 August when a single bunch of a few particles was taken down the transfer line from the SPS accelerator to the LHC. After a period of optimization, one bunch was kicked up from the transfer line into the LHC beam pipe and steered about 3 kilometres around the LHC itself on the first attempt. On Saturday, the test was repeated several times to optimize the transfer before the operations group handed the machine back for hardware commissioning to resume on Sunday. The anti-clockwise synchronization systems will be tested over the weekend of 22 August. Picture: http://lhc-injection-test.web.cern.ch/lhc-injection-test/

  6. A new resource for the entire LHC community

    CERN Multimedia

    2010-01-01

    The first time I addressed the CERN community as Director-General in January 2009, I said that I wished to see the intellectual life of the Laboratory develop. With the experiments rapidly accumulating data, now is the time for that to happen. CERN is known as a global reference point for excellence in accelerator science, and our track record of providing world-class facilities is second to none. Simply stated, the division of labour between CERN and the experiments it hosts is that CERN has provided the beams and support systems from experimental areas to IT, while the experiments have done the physics. That doesn't mean, however, that CERN has no part to play in the intellectual life of the experiments. Our Theory group has always provided support to CERN's experiments, while CERN physicists, Staff and Fellows, are an essential part of every experiment conducted here. With the LHC coming on stream, the time is right to create a focal point at CERN dedicated to the LHC research programme and open to...

  7. Calibration techniques and strategies for the present and future LHC electromagnetic calorimeters

    Science.gov (United States)

    Aleksa, M.

    2018-02-01

    This document describes the different calibration strategies and techniques applied by the two general purpose experiments at the LHC, ATLAS and CMS, and discusses them underlining their respective strengths and weaknesses from the view of the author. The resulting performances of both calorimeters are described and compared on the basis of selected physics results. Future upgrade plans for High Luminosity LHC (HL-LHC) are briefly introduced and planned calibration strategies for the upgraded detectors are shown.

  8. The relationship between viscosity and refinement efficiency of pure aluminum by Al-Ti-B refiner

    Energy Technology Data Exchange (ETDEWEB)

    Yu Lina [Key Laboratory of Liquid Structure and Heredity of Materials, Ministry of Education, Shandong University, 73 Jingshi Road, Jinan 250061 (China); Liu Xiangfa [Key Laboratory of Liquid Structure and Heredity of Materials, Ministry of Education, Shandong University, 73 Jingshi Road, Jinan 250061 (China)]. E-mail: xfliu@sdu.edu.cn

    2006-11-30

The relationship between viscosity and refinement efficiency of pure aluminum with the addition of Al-Ti-B master alloy was studied in this paper. The experimental results show that, after the addition of different Al-Ti-B master alloys, the finer the grain size of the solidified sample, the higher the viscosity of the melt. This indicates that viscosity can be used, to a certain extent, to estimate the refinement efficiency of Al-Ti-B refiners in production. The main reason is also discussed, on the basis of transmission electron microscopy (TEM) analysis and differential scanning calorimetry (DSC) experiments.

  9. Measurement of the top quark properties at the Tevatron and the LHC

    CERN Document Server

    INSPIRE-00040958

    2014-01-01

Almost two decades after its discovery at Fermilab's Tevatron collider experiments, the top quark is still under the spotlight due to its connections to some of the most interesting puzzles in the Standard Model. The Tevatron was shut down two years ago, yet interesting results are still coming out of the CDF and D0 collaborations. The LHC collider at CERN has produced two orders of magnitude more top quarks than the Tevatron, giving birth to a new era of top quark physics. While the LHC is also down at the time of this writing, many top quark physics results are being extracted from the 7 TeV and 8 TeV proton-proton collisions by the ATLAS and CMS collaborations, and many more are expected to appear before the LHC is turned on again sometime in 2015. These proceedings cover a selection of recent results from the Tevatron and LHC experiments.

  10. Trigger system study of the dimuon spectrometer in the ALICE experiment at CERN-LHC; Etude du systeme de declenchement du spectrometre dimuons de l'experience alice au Cern-LHC

    Energy Technology Data Exchange (ETDEWEB)

    Roig, O

    1999-12-01

This work is a contribution to the study of nucleus-nucleus collisions at the LHC with ALICE. The aim of this experiment is to search for a new phase of matter, the quark-gluon plasma (QGP). The dimuon forward spectrometer should measure one of the most promising probes of the QGP, the production of heavy-quark vector mesons (J/ψ, Υ, Υ′, Υ″) through their muonic decays. The dimuon trigger selects the interesting events by performing a cut on the transverse momentum of the tracks. The trigger decision is taken by dedicated electronics using information from the Resistive Plate Chamber (RPC) detectors. We have carried out our own R&D program on the RPC detector with various beam tests. We show the performances obtained during these tests with a low-resistivity RPC operating in streamer mode. The ALICE requirements concerning the rate capability, the cluster size and the time resolution are fulfilled. We have optimised the trigger with simulations which include a complete description of the read-out planes and the trigger logic (algorithm). In particular, a technique of clustering is proposed and validated. A method called "Ds reduction" is introduced in order to limit the effects of combinatorial background on the trigger rates. The efficiencies and the trigger rates are calculated for Pb-Pb, Ca-Ca and p-p collisions at the LHC. Other, more sophisticated cuts using the RPC information, for example on the invariant mass, have also been simulated but have not shown significant improvements of the trigger rates. (author)

  11. Beam Cleaning in Experimental IRs in HL-LHC for the Incoming Beam

    CERN Document Server

    Garcia-Morales, H; Bruce, Roderik; Redaelli, Stefano

    2015-01-01

    The HL-LHC will store 675 MJ of energy per beam, about 300 MJ more than the nominal LHC. Due to the increase in stored energy and a different interaction region (IR) optics layout, the collimation system for the incoming beam must be revisited in order to avoid dangerous losses that could cause quenches or machine damage. This paper studies the effectiveness of the current LHC collimation system in intercepting cleaning losses close to the experiments in the HL-LHC. The study reveals that additional tertiary collimators would be beneficial in order to protect not only the final focusing triplets but also the two quadrupoles further upstream.

  12. Beam cleaning of the incoming beam in experimental IRs in HL-LHC

    CERN Document Server

    Garcia Morales, Hector; Redaelli, Stefano; De Maria, Riccardo; CERN. Geneva. ATS Department

    2017-01-01

The HL-LHC will store 675 MJ of energy per beam, about 300 MJ more than the nominal LHC. Due to the increase in stored energy and a different interaction region (IR) layout and optics design, the collimation system for the incoming beam must be revisited in order to avoid dangerous losses that could cause quenches and machine damage. This paper studies the effectiveness of the current LHC collimation system in intercepting cleaning losses close to the experiments in the HL-LHC. The study reveals that, in addition to the triplet, the Q4 also needs local protection, which could be provided by an additional pair of TCTs.

  13. Safe LHC beam commissioning

    International Nuclear Information System (INIS)

    Uythoven, J.; Schmidt, R.

    2007-01-01

Due to the large amount of energy stored in the magnets and beams, safe operation of the LHC is essential. The commissioning of the LHC machine protection system will be an integral part of the general LHC commissioning program. A brief overview of the LHC Machine Protection System is given, identifying the main components: the Beam Interlock System, the Beam Dumping System, the Collimation System, the Beam Loss Monitoring System and the Quench Protection System. An outline is given of the commissioning strategy for these systems during the different commissioning phases of the LHC: without beam, at injection, and the different phases with stored beam depending on beam intensity and energy. (author)

  14. Important step towards the LHC

    CERN Document Server

    2001-01-01

    The TI2 tunnel, one of the two tunnels that will transfer protons from the SPS to the LHC, broke through into the LEP/LHC ring on 15 May. TI2 will carry clockwise-moving protons from under the Laboratory's West Area to Point 2, future home of the ALICE experiment. It is coming up to 16:00 on 15 May and a group of some 50 people, fully kitted out in boots, helmets, and masks is intently watching a point on the wall in front of them. They are down in the LEP/LHC tunnel waiting for civil engineers to excavate the last few centimetres separating them from the TI2 transfer tunnel. The noise of machines begins, and just five minutes later the wall comes tumbling down. The excavator breaks through right on target, bringing a two-year project to a happy conclusion. Later, the survey team published the outstanding result that the tunnel junction was made within 6 millimetres of target. TI2 measures 2648 metres in length and three metres in diameter. Around 32,000 cubic metres of rock have been excavated to make it, so...

  15. An FPGA-based track finder for the L1 trigger of the CMS experiment at the HL-LHC

    CERN Document Server

    Cieri, Davide; Harder, Kristian; Manolopoulos, Konstantinos; Shepherd-Themistocleous, Claire; Tomalin, Ian; Aggleton, Robin; Ball, Fionn; Brooke, Jim; Clement, Emyr; Newbold, Dave; Paramesvaran, Sudarshan; Hobson, Peter; Morton, Alexander Davide; Reid, Ivan; Hall, Geoff; Iles, Gregory; James, Thomas Owen; Matsushita, Takashi; Pesaresi, Mark; Rose, Andrew William; Shtipliyski, Antoni; Summers, Sioni; Tapper, Alex; Uchida, Kirika; Vichoudis, Paschalis; Ardila-Perez, Luis; Balzer, Matthias; Caselle, Michele; Sander, Oliver; Schuh, Thomas; Weber, Marc

    2017-01-01

    A new tracking detector is under development for use by the CMS experiment at the High-Luminosity LHC (HL-LHC). A crucial component of this upgrade will be the ability to reconstruct within a few microseconds all charged particle tracks with transverse momentum above 3 GeV, so they can be used in the Level-1 trigger decision. A concept for an FPGA-based track finder using a fully time-multiplexed architecture is presented, where track candidates are reconstructed using a projective binning algorithm based on the Hough Transform followed by a track fitting based on the linear regression technique. A hardware demonstrator using MP7 processing boards has been assembled to prove the entire system, from the output of the tracker readout boards to the reconstruction of tracks with fitted helix parameters. It successfully operates on one eighth of the tracker solid angle at a time, processing events taken at 40 MHz, each with up to 200 superimposed proton-proton interactions, whilst satisfying latency constraints. T...
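The chain described in the abstract, a projective binning (Hough transform) step followed by a linear-regression track fit, can be illustrated with a toy model. Below, a track is a straight line phi(r) = phi0 + k·r with k playing the role of q/pT; the layer radii, bin counts and parameter ranges are invented for the sketch and are not the CMS values:

```python
import math

def hough_find(hits, k_values, phi0_edges):
    """Fill a (k, phi0) accumulator: each hit votes once per candidate k."""
    nb = len(phi0_edges) - 1
    width = phi0_edges[1] - phi0_edges[0]
    acc = [[0] * nb for _ in k_values]
    for r, phi in hits:
        for i, k in enumerate(k_values):
            phi0 = phi - k * r           # intercept implied by this hit if slope were k
            j = math.floor((phi0 - phi0_edges[0]) / width)
            if 0 <= j < nb:
                acc[i][j] += 1
    # peak bin = best track candidate
    return max((acc[i][j], i, j) for i in range(len(k_values)) for j in range(nb))

def fit_line(pts):
    """Least-squares fit phi = phi0 + k*r (the 'linear regression' step)."""
    n = len(pts)
    sr = sum(r for r, _ in pts); sp = sum(p for _, p in pts)
    srr = sum(r * r for r, _ in pts); srp = sum(r * p for r, p in pts)
    k = (n * srp - sr * sp) / (n * srr - sr * sr)
    return k, (sp - k * sr) / n

radii = [25.0, 35.0, 50.0, 70.0, 90.0, 110.0]   # toy layer radii, not CMS values
true_phi0, true_k = 0.305, 0.02                  # k stands in for q/pT
hits = [(r, true_phi0 + true_k * r) for r in radii]

k_values = [0.001 * i for i in range(41)]        # candidate slope bins
phi0_edges = [0.01 * j for j in range(61)]       # intercept bin edges
votes, ik, jp = hough_find(hits, k_values, phi0_edges)
k_fit, phi0_fit = fit_line(hits)
```

Each hit votes once per candidate k, a peak in the accumulator identifies a track candidate, and the least-squares fit then refines the parameters; the real system pipelines these steps in FPGA logic within the Level-1 latency budget.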

  16. Measurement of K(892)*0 resonance production in Pb-Pb collisions with the ALICE experiment at the LHC

    CERN Document Server

    Bellini, Francesca

    The analysis of the K(892)*0 resonance production in Pb–Pb collisions at √sNN = 2.76 TeV with the ALICE detector at the LHC is presented. The analysis is motivated by the interest in the measurement of short-lived resonances production that can provide insights on the properties of the medium produced in heavy-ion collisions both during its partonic (Quark-Gluon Plasma) and hadronic phase. This particular analysis exploits particle identification of the ALICE Time-Of-Flight detector. The ALICE experiment is presented, with focus on the performance of the Time-Of-Flight system. The aspects of calibration and data quality controls are discussed in detail, while illustrating the excellent and very stable performance of the system in different collision environments at the LHC. A full analysis of the K*0 resonance production is presented: from the resonance reconstruction to the determination of the efficiency and the systematic uncertainty. The results show that the analysis strategy discussed is a valid too...
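The resonance-reconstruction step described above combines identified K and π candidates into pairs and computes their invariant mass; the K*(892)0 then appears as a peak near 0.896 GeV/c² over a combinatorial background. A minimal self-contained sketch of the kinematics (approximate PDG masses; this is not the analysis code):

```python
import math

M_KSTAR, M_K, M_PI = 0.896, 0.494, 0.140  # GeV/c^2, approximate PDG values

def two_body_momentum(M, m1, m2):
    """Daughter momentum p* in the rest frame of a parent M -> m1 + m2."""
    return math.sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)

def invariant_mass(p1, m1, p2, m2):
    """Invariant mass of two daughters given their 3-momenta (GeV/c)."""
    e1 = math.sqrt(m1**2 + sum(c * c for c in p1))
    e2 = math.sqrt(m2**2 + sum(c * c for c in p2))
    px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt((e1 + e2)**2 - (px**2 + py**2 + pz**2))

# K*0 decaying at rest: back-to-back daughters carrying momentum p*
pstar = two_body_momentum(M_KSTAR, M_K, M_PI)
mass = invariant_mass((pstar, 0.0, 0.0), M_K, (-pstar, 0.0, 0.0), M_PI)
```

In the analysis the same combination is formed for every TOF-identified Kπ pair in an event; the combinatorial background under the peak is typically estimated with event-mixing or like-sign pairs.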

  17. New data processing technologies at LHC: From Grid to Cloud Computing and beyond

    International Nuclear Information System (INIS)

    De Salvo, A.

    2011-01-01

For a few years now the LHC experiments at CERN have been successfully using Grid computing technologies for their distributed data processing activities, on a global scale. Recently, the experience gained with the current systems allowed the design of the future Computing Models, involving new technologies like Cloud Computing, virtualization and high-performance distributed database access. In this paper we describe the new computational technologies of the LHC experiments at CERN, comparing them with the current models in terms of features and performance.

  18. Tracking detectors for the sLHC, the LHC upgrade

    CERN Document Server

    Sadrozinski, Hartmut F W

    2005-01-01

    The plans for an upgrade of the Large Hadron Collider (LHC) to the Super-LHC (sLHC) are reviewed with special consideration of the environment for the inner tracking system. A straw-man detector upgrade for ATLAS is presented, which is motivated by the varying radiation levels as a function of radius, and choices for detector geometries and technologies are proposed, based on the environmental constraints. A few promising technologies for detectors are discussed, both for sensors and for the associated front-end electronics. On-going research in silicon detectors and in ASIC technologies will be crucial for the success of the upgrade.

  19. A proposal to study a tracking/preshower detector for the LHC

    CERN Document Server

    Munday, D J; Anghinolfi, Francis; Bonino, R; Campbell, M; Fassò, A; Gildemeister, O; Heijne, Erik H M; Jarron, Pierre; Mapelli, Livio P; Pentney, J M; Poppleton, Alan; Stevenson, Graham Roger; Gössling, C; Pollmann, D; Sondermann, V; Tsesmelis, E; Clark, A G; Kienzle-Focacci, M N; Martin, M; Rosselet, L; Fretwurst, E; Lindström, G; Reich, V; Bardos, R A; Gorfine, G W; Taylor, G; Tovey, Stuart N; Stapnes, Steinar; Weidberg, A R; Lubrano, P; Pepé, M; Grayer, Geoffrey H; Sharp, P; Bakich, A M; Peak, L S; CERN. Geneva. Detector Research and Development Committee

    1990-01-01

    We describe a program of studies aimed at determining whether the track stub/preshower technique of electron identification can be used at the highest operating luminosities of the proposed LHC collider. The proposal covers detector and electronics developments required for the construction of a track-stub and preshower detector preceding the electromagnetic calorimeter of an LHC experiment.

  20. The Radiation Tolerant Electronics for the LHC Cryogenic Controls: Basic Design and First Operational Experience

    CERN Document Server

    Casas-Cubillos, J; Rodríguez-Ruiz, M A

    2008-01-01

The LHC optics is based on the extensive use of superconducting magnets covering 23 km of the tunnel. The associated cryogenic system for keeping the magnets at nominal conditions is hence distributed all around the 27 km LHC tunnel, and the cryogenic instrumentation exposed to the LHC radiation environment comprises about 18'000 sensors and actuators. Radiation Tolerant (RadTol) electronics was designed and procured in order to preserve signal integrity against electromagnetic interference and to reduce the cabling costs that would otherwise have been required to route the analog signals to the 30 radiation-protected areas. This paper presents the basic design, the qualification of the main RadTol components and the first operational results.

  1. Instrumentation for silicon tracking at the HL-LHC

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00524651; Strandberg, Sara; Garcia-Sciveres, Maurice

    2017-06-14

In 2027 the Large Hadron Collider (LHC) at CERN will enter a high-luminosity phase, delivering 3000 fb−1 over the course of ten years. The High Luminosity LHC (HL-LHC) will increase the instantaneous luminosity delivered by a factor of 5 compared to the current operation period. This will impose significant technical challenges on all aspects of the ATLAS detector, but particularly on the Inner Detector, trigger, and data acquisition systems. In addition, many of the components of the Inner Detector are reaching the end of their designed lifetime and will need to be exchanged. As such, the Inner Detector will be entirely replaced by an all-silicon tracker, known as the Inner Tracker (ITk). The layout of the pixel and strip detectors will be optimised for the upgrade and will extend their forward coverage. To reduce the per-pixel hit rate and explore novel techniques for dealing with the conditions at the HL-LHC, an inter-experiment collaboration called RD53 has been formed. RD53 is tasked with producing a front...

  2. ATLAS physics prospects with the High-Luminosity LHC

    CERN Document Server

    Khanov, Alexander; The ATLAS collaboration

    2016-01-01

Run-I at the LHC was very successful, with the discovery of a new boson of about 125 GeV mass with properties compatible with those of the Higgs boson predicted by the Standard Model. Precise measurements of the properties of this new boson, and the search for new physics beyond the Standard Model, are primary goals of the just-restarted LHC running at 13 TeV collision energy and of all future running at the LHC, including its luminosity upgrade, the HL-LHC, which should allow the collection of 3000 fb−1 of data per experiment. The physics prospects with a pp centre-of-mass energy of 14 TeV are presented for 300 and 3000 fb−1. The ultimate precision attainable on measurements of the couplings of the 125 GeV boson to elementary fermions and bosons is discussed, as well as perspectives on the searches for partners associated with it. The electroweak sector is further studied with the analysis of vector boson scattering, testing the SM predictions. Supersymmetry is one of the best motivated extensions of the Standard Mode...

  3. 3rd report from the LHC performance workshop

    CERN Multimedia

    Bulletin's correspondent from Chamonix

    2012-01-01

Outside it's a little warmer but Wednesday was spent inside looking forward to the long shutdown (LS1) planned for 2013/14. The total length of the shutdown for the LHC is provisionally around 20 months and there is a huge, huge amount of work on the cards. Provisional planning was presented. The key driver is the splice consolidation work, which foresees opening every magnet interconnect in the ring and measuring carefully the resistance of each joint in the cables which carry the current between the dipoles and quadrupoles in the arcs of the LHC. It is estimated that 15% of the splices will be re-done; shunts and clamps will be installed across each splice. The aim is to definitively exclude the possibility of a repeat of the incident of 19 September 2008. Besides this, each of the LHC experiments has an extensive program of maintenance and upgrades. Some of the key LHC systems (cryogenics, vacuum, quench protection system, electrical distribution, cooling, ventilation, access, and RF) will undergo m...

  4. Pulling the trigger on LHC electronics

    CERN Document Server

    CERN. Geneva

    2001-01-01

The conditions at CERN's Large Hadron Collider pose severe challenges for the designers and builders of front-end, trigger and data acquisition electronics. A recent workshop reviewed the encouraging progress so far and discussed what remains to be done. The LHC experiments have addressed level one trigger systems with a variety of high-speed hardware. The CMS Calorimeter Level One Regional Trigger uses 160 MHz logic boards plugged into the front and back of a custom backplane, which provides point-to-point links between the cards. Much of the processing in this system is performed by five types of 160 MHz digital application-specific integrated circuits designed using Vitesse submicron high-integration gallium arsenide gate-array technology. The LHC experiments make extensive use of field programmable gate arrays (FPGAs). These offer programmable reconfigurable logic, which has the flexibility that trigger designers need to be able to alter algorithms so that they can follow the physics and detector perform...

  5. CERN LHC Technical Infrastructure Monitoring (TIM)

    CERN Document Server

    Epting, U; Martini, R; Sollander, P; Bartolomé, R; Vercoutter, B; Morodo-Testa, M C

    1999-01-01

    The CERN Large Hadron Collider (LHC) will start to deliver particles to its experiments in the year 2005. However, all the primary services such as electricity, cooling, ventilation, safety systems and others such as vacuum and cryogenics will be commissioned gradually between 2001 and 2005. This technical infrastructure will be controlled using industrial control systems, which have either already been purchased from specialized companies or are currently being put together for tender. This paper discusses the overall architecture and interfaces that will be used by the CERN Technical Control Room (TCR) to monitor the technical services at CERN and those of the LHC and its experiments. The issue of coherently integrating existing and future control systems over a period of five years with constantly evolving technology is addressed. The paper also summarizes the functionality of all the tools needed by the control room such as alarm reporting, data logging systems, man machine interfaces and the console mana...

  6. Test and performances of the RPC trigger chambers of the ATLAS experiment at LHC

    CERN Document Server

    Aielli, G; Ammosov, A; Biglietti, M; Brambilla, Elena; Camarri, P; Canale, V; Caprio, M A; Cardarelli, R; Carlino, G; Cataldi, G; Chiodini, G; Di Simone, A; Di Ciaccio, A; Della Volpe, D; De Asmundis, R; Della Pietra, M; Grancagnolo, F; Gorini, E; Iengo, P; Liberti, B; Patricelli, S; Perrino, R; Primavera, M; Santonico, R; Sehkniadze, G; Spagnolo, S; Sviridov, Yu; Zaetz, V G

    2004-01-01

RPCs will be used as trigger detectors in the barrel region of the Muon Spectrometer of the ATLAS experiment at the LHC. The total number of RPC units to be installed is 1088, covering a total surface of about 3500 m². ATLAS RPCs work in avalanche mode with a C2H2F4/C4H10/SF6 (94.7%/5%/0.3%) gas mixture. A cosmic-ray test stand has been designed and built in the Naples laboratories in order to carry out a complete test of the ATLAS RPC units. Since August 2002 about 300 units have been tested. A description of the test stand, the test procedure and the results is presented.

  7. The LHC at the AAAS

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    The American Association for the Advancement of Science held its annual meeting in the Walter E. Washington Convention Center in Washington D.C. last week.   Veteran science writer Tim Radford introduces LHC scientists during a media briefing at the AAAS annual meeting. Left to right: Felicitas Pauss, Tom LeCompte, Yves Schutz and Nick Hadley. As the world’s largest popular science meeting, the AAAS meeting is a major event in the calendar of science journalists.  At this year’s LHC session, CERN’s coordinator for international relations, Felicitas Pauss, opened the discussion, paving the way for Tom LeCompte of ATLAS, Joe Incandela of CMS, Yves Schutz of ALICE and Monica Pepe-Altarelli of LHCb to report on the status of the first year’s analysis from their experiments.    

  8. The CMS ECAL Upgrade for Precision Crystal Calorimetry at the HL-LHC

    CERN Document Server

    Petyt, David Anthony

    2018-01-01

The electromagnetic calorimeter (ECAL) of the Compact Muon Solenoid experiment (CMS) is operating at the Large Hadron Collider (LHC) in 2016 with proton-proton collisions at 13 TeV centre-of-mass energy and at a bunch spacing of 25 ns. Challenging running conditions are expected for CMS after the High-Luminosity upgrade of the LHC (HL-LHC). We review the design and R&D studies for the CMS ECAL crystal calorimeter upgrade and present first test beam studies. Particular challenges at the HL-LHC are the harsh radiation environment, the increasing data rates and the extreme level of pile-up, with up to 200 simultaneous proton-proton collisions. We present test beam results of hadron-irradiated PbWO$_{4}$ crystals up to the fluences expected at the HL-LHC. We also report on the R&D for the new readout and trigger electronics, which must be upgraded due to the increased trigger and latency requirements at the HL-LHC.

  9. Open heart surgery at the LHC

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

On 17 January this year there was a race against time in the CMS cavern. In order to replace a faulty LHC component, members of the Vacuum, Surfaces and Coatings (VSC) Group, in collaboration with the CMS experiment team, had to extract and then reinsert a 2-m long section of vacuum chamber. And they had one hour to do it.   At the start of the LHC's winter technical stop, an X-ray was done to check the position of the RF fingers at Point 5. The X-ray at the top confirmed that the RF fingers (in the red circle) were not in the correct position, unlike on the lower picture. If the vacuum is insufficient, pressure mounts and the problems start. In the LHC the ideal pressure is around 10−10 mbar. Once this threshold is exceeded, the "noise", that is, the interference generated by the residual gas present in the machine, compromises physics measurements. In early summer 2011, a pressure a hundred times in excess of the ideal pressure was observed at the connec...

  10. Feasibility Studies for Quarkonium Production at a Fixed-Target Experiment Using the LHC Proton and Lead Beams (AFTER@LHC)

    International Nuclear Information System (INIS)

    Hadjidakis, C.; Kikola, D.; Massacrier, L.; Trzeciak, B.; Lansberg, J. P.; Fleuret, F.; Shao, H.-S.

    2015-01-01

Used in the fixed-target mode, the multi-TeV LHC proton and lead beams allow for studies of heavy-flavour hadroproduction with unprecedented precision at backward rapidities, far negative Feynman-x, using conventional detection techniques. At the nominal LHC energies, quarkonia can be studied in detail in p+p, p+d, and p+A collisions at √sNN ≃ 115 GeV and in Pb+p and Pb+A collisions at √sNN ≃ 72 GeV, with luminosities roughly equivalent to those of the collider mode, i.e. up to 20 fb−1 yr−1 in p+p and p+d collisions, up to 0.6 fb−1 yr−1 in p+A collisions, and up to 10 nb−1 yr−1 in Pb+A collisions. In this paper, we assess the feasibility of such studies by performing fast simulations using the performance of a LHCb-like detector.

  11. Feasibility studies for quarkonium production at a fixed-target experiment using the LHC proton and lead beams (AFTER@LHC)

    CERN Document Server

    Massacrier, L; Fleuret, F; Hadjidakis, C; Kikola, D; Lansberg, J P; Shao, H -S

    2015-01-01

Used in the fixed-target mode, the multi-TeV LHC proton and lead beams allow for studies of heavy-flavour hadroproduction with unprecedented precision at backward rapidities - far negative Feynman-x - using conventional detection techniques. At the nominal LHC energies, quarkonia can be studied in detail in p+p, p+d and p+A collisions at sqrt(s_NN) ~ 115 GeV as well as in Pb+p and Pb+A collisions at sqrt(s_NN) ~ 72 GeV with luminosities roughly equivalent to that of the collider mode, i.e. up to 20 fb-1 yr-1 in p+p and p+d collisions, up to 0.6 fb-1 yr-1 in p+A collisions and up to 10 nb-1 yr-1 in Pb+A collisions. In this paper, we assess the feasibility of such studies by performing fast simulations using the performance of a LHCb-like detector.
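The centre-of-mass energies quoted in these two records follow from the fixed-target relation √sNN ≈ √(2 E_N m_N), where E_N is the beam energy per nucleon (7 TeV for protons; 7 TeV × Z/A ≈ 2.76 TeV per nucleon for a fully stripped Pb beam) and m_N ≈ 0.938 GeV is the nucleon mass. A quick numerical check, using only the numbers stated in the abstracts:

```python
import math

M_N = 0.938  # nucleon mass, GeV

def sqrt_s_nn(e_per_nucleon_gev):
    """Fixed-target nucleon-nucleon c.m. energy for an ultrarelativistic beam
    hitting a nucleon at rest: sqrt(s_NN) ~ sqrt(2 * E_N * m_N)."""
    return math.sqrt(2 * e_per_nucleon_gev * M_N)

e_p = 7000.0               # proton beam energy, GeV
e_pb = 7000.0 * 82 / 208   # Pb beam: energy per nucleon = 7 TeV * Z/A

sqrt_s_pp = sqrt_s_nn(e_p)    # close to the quoted ~115 GeV
sqrt_s_pb = sqrt_s_nn(e_pb)   # close to the quoted ~72 GeV
```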

  12. Vector meson production in the dimuon channel in the ALICE experiment at the LHC

    CERN Document Server

    Massacrier, L.

    2011-01-01

The purpose of the ALICE experiment at the LHC is the study of the Quark Gluon Plasma (QGP) formed in ultra-relativistic heavy-ion collisions, a state of matter in which quarks and gluons are deconfined. The properties of this state of strongly-interacting matter can be accessed through the study of light vector mesons (ρ, ω and φ). Indeed, the strange-quark content (ss̄) of the φ meson makes its study interesting in connection with the strangeness enhancement observed in heavy-ion collisions. Moreover, ρ and ω spectral-function studies give information on chiral symmetry restoration. Vector meson production in pp collisions is important as a baseline for heavy-ion studies and for constraining hadronic models. We present results on light vector meson production obtained with the muon spectrometer of the ALICE experiment in pp collisions at √s = 7 TeV. Production ratios, integrated and differential cross sections for φ and ω are presented. Those result...

  13. ASD IC for the thin gap chambers in the LHC Atlas experiment

    International Nuclear Information System (INIS)

    Sasaki, Osamu; Yoshida, Mitsuhiro

    1999-01-01

An amplifier-shaper-discriminator (ASD) chip was designed and built for Thin Gap Chambers in the forward muon trigger system of the LHC Atlas experiment. The ASD IC uses SONY Analog Master Slice bipolar technology. The IC contains 4 channels in a QFP48 package. The gain of its first stage (preamplifier) is approximately 0.8 V/pC and output from the preamplifier is received by a shaper (main-amplifier) with a gain of 7. The baseline restoration circuit is incorporated in the main-amplifier. The threshold voltage for the discriminator section is common to the 4 channels and their digital output level is LVDS-compatible. The IC also has analog output of the preamplifier. The equivalent noise charge at an input capacitance of 150 pF is around 7,500 electrons. The power dissipation with LVDS outputs (100 Ω load) is 59 mW/ch

  14. ASD IC for the thin gap chambers in the LHC ATLAS experiment

    CERN Document Server

    Sasaki, O

    1998-01-01

    An amplifier-shaper-discriminator (ASD) chip was designed and built for the Thin Gap Chambers in the forward muon trigger system of the LHC ATLAS experiment. The ASD IC uses SONY Analog Master Slice bipolar technology. The IC contains 4 channels in a QFP48 package. The gain of its first stage (preamplifier) is approximately 0.8 V/pC, and the output of the preamplifier is received by a shaper (main amplifier) with a gain of 7. A baseline restoration circuit is incorporated in the main amplifier. The threshold voltage for the discriminator section is common to the 4 channels, and the digital output level is LVDS-compatible. The IC also provides an analog output of the preamplifier. The equivalent noise charge at an input capacitance of 150 pF is around 7500 electrons. The power dissipation with LVDS outputs (100 Ω load) is 59 mW/ch. (8 refs).

  15. ASD IC for the thin gap chambers in the LHC ATLAS Experiment

    CERN Document Server

    Sasaki, O

    1999-01-01

    An amplifier-shaper-discriminator (ASD) chip was designed and built for the Thin Gap Chambers in the forward muon trigger system of the LHC ATLAS experiment. The ASD IC uses SONY Analog Master Slice bipolar technology. The IC contains 4 channels in a QFP48 package. The gain of its first stage (preamplifier) is approximately 0.8 V/pC, and the output of the preamplifier is received by a shaper (main amplifier) with a gain of 7. A baseline restoration circuit is incorporated in the main amplifier. The threshold voltage for the discriminator section is common to the 4 channels, and the digital output level is LVDS-compatible. The IC also provides an analog output of the preamplifier. The equivalent noise charge at an input capacitance of 150 pF is around 7500 electrons. The power dissipation with LVDS outputs (100 Ω load) is 59 mW/ch.
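As a numerical cross-check of the analogue chain described in these records, the quoted preamplifier gain (≈0.8 V/pC), shaper gain (×7) and equivalent noise charge can be combined to estimate signal and noise amplitudes at the shaper output. A minimal sketch in Python; the 100 fC test charge is an assumed example, not a value from the records:

```python
# Back-of-envelope model of the TGC ASD analogue chain, using figures
# quoted in the record; the 0.1 pC test charge is an assumed example.
PREAMP_GAIN_V_PER_PC = 0.8   # preamplifier gain, V/pC
SHAPER_GAIN = 7.0            # shaper (main-amplifier) gain
ENC_ELECTRONS = 7500         # equivalent noise charge at 150 pF
E_CHARGE_PC = 1.602e-7       # electron charge in pC

def shaper_output_mv(charge_pc):
    """Approximate shaper output amplitude (mV) for an input charge in pC."""
    return charge_pc * PREAMP_GAIN_V_PER_PC * SHAPER_GAIN * 1000.0

signal_mv = shaper_output_mv(0.1)  # 0.1 pC (100 fC) assumed test charge
noise_mv = shaper_output_mv(ENC_ELECTRONS * E_CHARGE_PC)  # rms noise at output
print(f"signal ~ {signal_mv:.0f} mV, rms noise ~ {noise_mv:.1f} mV")
```

For this assumed test charge the signal sits roughly 80 times above the noise floor, which illustrates why a single threshold shared across the four channels is workable.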

  16. Results from the first heavy ion run at the LHC

    CERN Document Server

    Schukraft, J

    2012-01-01

    In early November 2010, the LHC collided heavy ions, Pb on Pb, for the first time, at a centre-of-mass energy of 2.76 TeV/nucleon. This date marked both the end of almost 20 years of preparation for nuclear collisions at the LHC and the start of a new era in ultra-relativistic heavy-ion physics, at energies exceeding those of previous machines by more than an order of magnitude. This contribution summarizes some of the early results from all three experiments participating in the LHC heavy-ion program (ALICE, ATLAS, and CMS), which show that the high-density matter created at the LHC, while much hotter and larger, still behaves like the very strongly interacting, almost perfect liquid discovered at RHIC. Some surprising and even puzzling results are seen in particle ratios, jet-quenching, and quarkonia suppression observables. The overall experimental conditions at the LHC, together with its set of powerful and state-of-the-art detectors, should allow for precision measurements of quark-gluon-plasma parameters like v...

  17. An improved scattering routine for collimation tracking studies at LHC

    CERN Document Server

    Tambasco, Claudia; Salvachua Ferrando, Maria Belen; Cavoto, Gianluca

    The present Master's thesis work has been carried out at CERN in the framework of the LHC (Large Hadron Collider) Collimation project. The LHC accelerates proton beams up to 7 TeV, colliding them in the experiment detectors installed at four points of the accelerator ring. The LHC is built to store an energy of 360 MJ in each beam. The energy deposition induced by local beam losses could quench the superconducting magnets located around the accelerator beam pipes. To prevent and keep under control dangerous beam losses, an efficient collimation system is required. In addition, the achievable LHC beam intensity is related to the beam loss rate and, consequently, to the cleaning efficiency of the collimation system. Collimation studies at the LHC are also carried out by means of simulations using SixTrack, a dedicated simulation tool that tracks a large number of particles for many turns around the ring. The SixTrack code includes a scattering routine to model proton interactions with the material of the collimators j...

  18. Real-time data analysis at the LHC: present and future

    CERN Document Server

    Gligorov, V.V.

    2015-01-01

    The Large Hadron Collider (LHC), which collides protons at an energy of 14 TeV, produces hundreds of exabytes of data per year, making it one of the largest sources of data in the world today. At present it is not possible to even transfer most of this data from the four main particle detectors at the LHC to "offline" data facilities, much less to permanently store it for future processing. For this reason the LHC detectors are equipped with real-time analysis systems, called triggers, which process this volume of data and select the most interesting proton-proton collisions. The LHC experiment triggers reduce the data produced by the LHC by between 1/1000 and 1/100000, to tens of petabytes per year, allowing its economical storage and further analysis. The bulk of the data-reduction is performed by custom electronics which ignores most of the data in its decision making, and is therefore unable to exploit the most powerful known data analysis strategies. I cover the present status of real-time data analysis ...
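The data-reduction factors quoted above can be sanity-checked with simple arithmetic: a trigger keeping between 1/1000 and 1/100000 of the raw volume brackets the "tens of petabytes per year" figure. A sketch, where the 100 EB/year raw input is an assumed round number for illustration, not a figure from the text:

```python
# Sanity check of the trigger data-reduction factors quoted in the abstract.
# The 100 EB/year raw input is an assumed illustrative value.
EB = 1e18  # bytes
PB = 1e15

def stored_per_year(raw_bytes, kept_fraction):
    """Bytes kept per year by a trigger retaining `kept_fraction` of the data."""
    return raw_bytes * kept_fraction

raw = 100 * EB  # "hundreds of exabytes per year" -> assume 100 EB
for fraction in (1e-3, 1e-5):
    print(f"keep {fraction:g}: {stored_per_year(raw, fraction) / PB:g} PB/year")
```

The two quoted extremes yield 100 PB/year and 1 PB/year respectively, bracketing the stated tens of petabytes.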

  19. Nuclear suppression of J/Ψ: From RHIC to the LHC

    International Nuclear Information System (INIS)

    Kopeliovich, B.Z.; Potashnikova, I.K.; Schmidt, Ivan

    2011-01-01

    A parameter-free calculation for J/Ψ suppression in pA collisions, based on the dipole description, is confronted with the new data from the PHENIX experiment. Having achieved good agreement, we employed this model to predict the contribution of initial-state interactions (ISI) to J/Ψ suppression in AA collisions. Such a transition from pA to AA is not straightforward, since it involves specific effects of double color filtering and boosting of the saturation scale. Relying on this refined ISI contribution, we updated the previous analysis of RHIC data on J/Ψ production in Cu-Cu and Au-Au collisions at √s = 200 GeV, and determined the transport coefficient of the created dense medium to be q̂_0 = 0.6 GeV²/fm. Nuclear effects for J/Ψ production at the LHC are predicted using the transport coefficient q̂_0 = 0.8 GeV²/fm, extracted from data on suppression of high-p_T hadrons in central lead-lead collisions at √s = 2.76 TeV. Our analysis covers only direct J/Ψ production, while data may also include the feed-down from decays of heavier states and B mesons.

  20. A Virtual CAD Model of the LHC

    CERN Document Server

    Chemli, S; Messerli, R; Muttoni, Y; Prin, H; Van Uytvinck, E

    2000-01-01

    Integrating the large and complex LHC machine into the existing LEP tunnel is a major challenge. Space was not really a problem when fitting the LEP machine into its tunnel, but the LHC cryostats are much larger than the LEP quadrupoles, and the external cryogenic line fills the tunnel even more. Space problems lead to small clearances. Possible conflicts, or at least the most penalising ones, between installed equipment or with transport must be solved beforehand in order to avoid unacceptable delays and extra costs during installation. Experience gained with LEP has already shown the help that Computer-Aided Engineering tools can provide for the integration. A virtual model of the LHC is presently being prepared. The actual LEP tunnel, known with quite good accuracy (centimetre level), has been modelled, and all the elements of the machine, constructed as 3D objects with the CAD system, are positioned accurately on the basis of data generated from the theoretical definition. These layouts are used to generate the refe...

  1. Remote Inspection, Measurement and Handling for LHC

    CERN Document Server

    Kershaw, K; Coin, A; Delsaux, F; Feniet, T; Grenard, J L; Valbuena, R

    2007-01-01

    Personnel access to the LHC tunnel will be restricted to varying extents during the life of the machine due to radiation, cryogenic and pressure hazards. The ability to carry out visual inspection, measurement and handling activities remotely during periods when the LHC tunnel is potentially hazardous offers advantages in terms of safety, accelerator down time, and costs. The first applications identified were remote measurement of radiation levels at the start of shut-down, remote geometrical survey measurements in the collimation regions, and remote visual inspection during pressure testing and initial machine cool-down. In addition, for remote handling operations, it will be necessary to be able to transmit several real-time video images from the tunnel to the control room. The paper describes the design, development and use of a remotely controlled vehicle to demonstrate the feasibility of meeting the above requirements in the LHC tunnel. Design choices are explained along with operating experience to-dat...

  2. Heavy-ion performance of the LHC and future colliders

    CERN Document Server

    AUTHOR|(SzGeCERN)696614; Stahl, Achim; Jowett, John M

    2015-10-09

    In 2008 the Large Hadron Collider (LHC) and its experiments started operation at the European Organization for Nuclear Research (CERN) in Geneva, with the main aim of finding or excluding the Higgs boson. Only four years later, on the 4th of July 2012, the discovery of a Higgs-like particle was announced and first published by the two main experiments, ATLAS and CMS. Even though proton–proton collisions are the main operation mode of the LHC, it also acts as a heavy-ion collider. Here, the term “heavy-ion collisions” refers to collisions between fully stripped nuclei. While the major hardware systems of the LHC are compatible with heavy-ion operation, the beam dynamics and performance limits of ion beams are quite different from those of protons. Because of the higher mass and charge of the ions, beam-dynamic effects like intra-beam scattering and radiation damping are stronger. Also, the electromagnetic cross-sections in the collisions are larger, leading to significantly faster intensity decay and thus shorter l...

  3. The great adventure of the LHC - From big bang to the Higgs boson

    International Nuclear Information System (INIS)

    Denegri, D.; Guyot, C.; Hoecker, A.; ); Roos, L.; Rubbia, C.

    2014-03-01

    This book presents the biggest scientific instrument ever built: the LHC (Large Hadron Collider) and its associated experiments (ATLAS, CMS, LHCb and ALICE), which led to the discovery of the Higgs boson in 2012. About 10,000 physicists and engineers from 50 countries have taken part in the project, which began in 1989. This book is composed of the following chapters: 1) the standard model (SM) of particle physics, 2) the experimental successes of the SM, 3) the shortfalls of the SM, 4) new physics, 5) the original big bang, 6) the LHC, 7) particle detection, 8) the ATLAS and CMS experiments, 9) the first data from the LHC, 10) data analysis, 11) the quest for the Higgs boson, 12) the search for new physics, 13) the LHCb and ALICE experiments, and 14) future prospects

  4. CERN data services for LHC computing

    Science.gov (United States)

    Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.

    2017-10-01

    Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent, complex production workloads. In parallel, our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as large-scale storage; CERNBox for end-user access and sharing; Ceph as the data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we summarise the experience in supporting the LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment, with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.
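The throughput figures quoted in this record can be cross-checked against each other: 11 PB ingested over July 2016 (31 days) corresponds to a sustained average of roughly 4 GB/s, comfortably below the quoted 14 GB/s Pb-Pb peak. A quick sketch:

```python
# Average ingest rate implied by 11 PB recorded over July 2016 (31 days),
# using the figures quoted in the abstract.
PB = 1e15
GB = 1e9

def avg_rate_gb_s(volume_bytes, days):
    """Sustained average rate in GB/s for a volume spread over `days` days."""
    return volume_bytes / (days * 86400) / GB

july_avg = avg_rate_gb_s(11 * PB, 31)
print(f"{july_avg:.1f} GB/s sustained average")
```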

  5. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    Science.gov (United States)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and since 2011 has been expanded with a wider range of applications. The traditional CERN accelerator-physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle-theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam-physics studies have been carried out, yielding an improved understanding of charged-particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  6. Grain refinement of AZ31 by (SiC)P: Theoretical calculation and experiment

    International Nuclear Information System (INIS)

    Guenther, R.; Hartig, Ch.; Bormann, R.

    2006-01-01

    Grain refinement of gravity die-cast Mg alloys can be achieved via two methods: in situ refinement by primary precipitated metallic or intermetallic phases, and inoculation of the melt via ceramic particles that remain stable in the melt due to their high thermodynamic stability. In order to clarify grain-refinement mechanisms and optimize possible potent refiners in Mg alloys, a simulation method for heterogeneous nucleation based on a free-growth model has been developed. It allows the prediction of the grain size as a function of the particle size distribution, the volumetric content of ceramic inoculants, the cooling rate and the alloy constitution. The model assumptions were examined experimentally in a study of the grain refinement of (SiC)P in AZ31. Additions of (SiC)P result in significant grain refinement if appropriate parameters for the ceramic particles are chosen. The model makes quantitatively correct predictions for the grain size and its variation with cooling rate
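Free-growth models of this kind rest on a simple criterion (due to Greer and co-workers): a particle of diameter d becomes an active nucleant only once the melt undercooling exceeds ΔT_fg = 4γ/(ΔS_v·d), so larger particles activate first. A sketch with assumed illustrative material constants; γ and ΔS_v below are not values from the paper:

```python
# Free-growth criterion: critical undercooling for a nucleant particle of
# diameter d. GAMMA and DSV are assumed illustrative values, not from the paper.
GAMMA = 0.15   # solid-liquid interfacial energy, J/m^2 (assumed)
DSV = 1.0e6    # entropy of fusion per unit volume, J/(K*m^3) (assumed)

def free_growth_undercooling_k(d_m):
    """Critical undercooling (K) for free growth from a particle of diameter d (m)."""
    return 4.0 * GAMMA / (DSV * d_m)

# Larger particles need less undercooling, so they nucleate grains first:
for d_um in (0.5, 1.0, 5.0):
    dT = free_growth_undercooling_k(d_um * 1e-6)
    print(f"d = {d_um} um -> dT_fg = {dT:.2f} K")
```

Scanning such a criterion over a measured particle-size distribution, together with the cooling rate, is what allows a model of this kind to predict the final grain size.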

  7. Heavy Ion Physics with the ATLAS Detector at the LHC

    International Nuclear Information System (INIS)

    Trzupek, A.

    2009-01-01

    The heavy-ion program at the LHC will be pursued by three experiments, including ATLAS, a multipurpose detector built to study p + p collisions. A report on the potential of the ATLAS detector to uncover new physics in Pb + Pb collisions, at energies thirty times larger than the energy available at RHIC, will be presented. Key aspects of the heavy-ion program of the ATLAS experiment, implied by measurements at RHIC, will be discussed. They include the measurement capability for high-p_T hadronic and electromagnetic probes and quarkonia, as well as elliptic flow and other bulk phenomena. Measurements by the ATLAS experiment will provide crucial information about the formation of a quark-gluon plasma at the new energy scale accessible at the LHC. (author)

  8. Radiation background with the CMS RPCs at the LHC

    CERN Document Server

    Costantini, Silvia; Cai, J.; Li, Q.; Liu, S.; Qian, S.; Wang, D.; Xu, Z.; Zhang, F.; Choi, Y.; Goh, J.; Kim, D.; Choi, S.; Hong, B.; Kang, J.W.; Kang, M.; Kwon, J.H.; Lee, K.S.; Lee, S.K.; Park, S.K.; Pant, L.M.; Mohanty, A.K.; Chudasama, R.; Singh, J.B.; Bhatnagar, V.; Mehta, A.; Kumar, R.; Cauwenbergh, S.; Cimmino, A.; Crucy, S.; Fagot, A.; Garcia, G.; Ocampo, A.; Poyraz, D.; Salva, S.; Thyssen, F.; Tytgat, M.; Zaganidis, N.; Doninck, W.V.; Cabrera, A.; Chaparro, L.; Gomez, J.P.; Gomez, B.; Sanabria, J.C.; Avila, C.; Ahmad, A.; Muhammad, S.; Shoaib, M.; Hoorani, H.; Awan, I.; Ali, I.; Ahmed, W.; Asghar, M.I.; Shahzad, H.; Sayed, A.; Ibrahim, A.; Aly, S.; Assran, Y.; Radi, A.; Elkafrawy, T.; Sharma, A.; Colafranceschi, S.; Abbrescia, M.; Calabria, C.; Colaleo, A.; Iaselli, G.; Loddo, F.; Maggi, M.; Nuzzo, S.; Pugliese, G.; Radogna, R.; Venditti, R.; Verwilligen, P.; Benussi, L.; Bianco, S.; Piccolo, D.; Paolucci, P.; Buontempo, S.; Cavallo, N.; Merola, M.; Fabozzi, F.; Iorio, O.M.; Braghieri, A.; Montagna, P.; Riccardi, C.; Salvini, P.; Vitulo, P.; Vai, I.; Magnani, A.; Dimitrov, A.; Litov, L.; Pavlov, B.; Petkov, P.; Aleksandrov, A.; Genchev, V.; Iaydjiev, P.; Rodozov, M.; Sultanov, G.; Vutova, M.; Stoykova, S.; Hadjiiska, R.; Ibargüen, H.S.; Morales, M.I.P.; Bernardino, S.C.; Bagaturia, I.; Tsamalaidze, Z.; Crotty, I.; Kim, M.S.

    2015-05-28

    The Resistive Plate Chambers (RPCs) are employed in the CMS experiment at the LHC as a dedicated trigger system, both in the barrel and in the endcaps. This note presents results of the radiation-background measurements performed with the 2011 and 2012 proton-proton collision data collected by CMS. Emphasis is given to the measurements of the background distribution inside the RPCs. The expected background rates during future running of the LHC are estimated both from extrapolated measurements and from simulation.

  9. Plans for Deployment of Hollow Electron Lenses at the LHC for Enhanced Beam Collimation

    Energy Technology Data Exchange (ETDEWEB)

    Redaelli, S. [CERN; Bertarelli, A. [CERN; Bruce, R. [CERN; Perini, D. [CERN; Rossi, A. [CERN; Salvachua, B. [CERN; Stancari, G. [Fermilab; Valishev, A. [Fermilab

    2015-06-01

    Hollow electron lenses are considered as a possible means to improve the LHC beam collimation system, providing active control of halo diffusion rates and suppressing the population of transverse halos. After a very successful experience at the Tevatron, a conceptual design of a hollow e-lens optimized for the LHC was produced. Recent further studies have led to a mature preliminary technical design. In this paper, possible scenarios for the deployment of this technology at the LHC are elaborated in the context of the scheduled LHC long shutdowns until the full implementation of the HL-LHC upgrade in 2023. Possible setups of electron beam test stands at CERN and synergies with other relevant electron beam programmes are also discussed.

  10. LHC collimator controls for a safe LHC operation

    International Nuclear Information System (INIS)

    Redaelli, S.; Assmann, R.; Losito, R.; Donze, M.; Masi, A.

    2012-01-01

    The Large Hadron Collider (LHC) collimation system is designed to protect the machine against beam losses and consists of 108 collimators, 100 of which are movable, located along the 27 km long ring and in the transfer lines. The cleaning performance and machine protection role of the system depend critically on accurate jaw positioning. A fully redundant control system has been developed to ensure that the collimators dynamically follow optimum settings in all phases of the LHC operational cycle. Jaw positions and collimator gaps are interlocked against dump limits defined redundantly as functions of time, beam energy and the β functions, which describe the focusing property of the beams. In this paper, the architectural choices that guarantee a safe LHC operation are presented. Hardware and software implementations that ensure the required performance are described. (authors)
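The interlock logic described above can be illustrated with a toy check: a measured jaw gap is compared against a dump limit computed from the beam energy and the local β function, using the standard n-sigma convention with beam size σ = √(βε_n/γ_rel). All numbers below are hypothetical illustrations, not actual CERN settings:

```python
# Toy version of a collimator-gap interlock: dump if the measured full gap
# exceeds its n-sigma limit. Numbers are illustrative, not real LHC settings.
import math

def beam_sigma_mm(energy_gev, beta_m, emit_norm_m=3.5e-6):
    """RMS beam size (mm) from the local beta function and normalised emittance."""
    gamma_rel = energy_gev / 0.938272  # proton rest mass in GeV
    return math.sqrt(beta_m * emit_norm_m / gamma_rel) * 1000.0

def gap_is_safe(measured_gap_mm, n_sigma_max, energy_gev, beta_m):
    """True if the full gap is within its n-sigma dump limit."""
    limit_mm = 2.0 * n_sigma_max * beam_sigma_mm(energy_gev, beta_m)
    return measured_gap_mm <= limit_mm

# A 2 mm gap at 7 TeV, beta = 100 m, 6-sigma limit (~2.6 mm) passes; 3 mm does not.
print(gap_is_safe(2.0, 6, 7000.0, 100.0), gap_is_safe(3.0, 6, 7000.0, 100.0))
```

In the real system such limits are defined redundantly as functions of time, energy and β, and evaluated continuously through the operational cycle.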

  11. Irradiation of a very forward calorimeter in the LHC environment: Some consequences

    International Nuclear Information System (INIS)

    Ferrando, A.; Josa, M. I.; Malinin, A.; Martinez-Laso, L.; Pojidaev, V.; Salicio, J. M.

    1994-01-01

    We have computed the level of irradiation in the very forward region (2.5 < |η| < 4.7) of an LHC experiment, using the proposed CMS (Compact Solenoidal Detector for LHC) setup. Information about the induced radioactivity in the absorber of a proposed iron/gas Very Forward Calorimeter has been extracted. (Author) 11 refs

  12. Irradiation of a very forward calorimeter in the LHC environment: Some consequences

    Energy Technology Data Exchange (ETDEWEB)

    Ferrando, A.; Josa, M. I.; Malinin, A.; Martinez-Laso, L.; Pojidaev, V.; Salicio, J. M.

    1994-07-01

    We have computed the level of irradiation in the very forward region (2.5 < |η| < 4.7) of an LHC experiment, using the proposed CMS (Compact Solenoidal Detector for LHC) setup. Information about the induced radioactivity in the absorber of a proposed iron/gas Very Forward Calorimeter has been extracted. (Author) 11 refs.

  13. The QuarkNet CMS masterclass: bringing the LHC to students

    Science.gov (United States)

    Cecire, Kenneth; McCauley, Thomas

    2016-04-01

    QuarkNet is an educational program which brings high school teachers and their students into the particle physics research community. The program supports research experiences and professional development workshops and provides inquiry-oriented investigations, some using real experimental data. The CMS experiment at the LHC has released several thousand proton-proton collision events for use in education and outreach. QuarkNet, in collaboration with CMS, has developed a physics masterclass and e-Lab based on this data. A masterclass is a day-long educational workshop where high school students travel to nearby universities and research laboratories. There they learn from LHC physicists about the basics of particle physics and detectors. They then perform a simple measurement using LHC data, and share their results with other students around the world via videoconference. Since 2011 thousands of students from over 25 countries have participated in the CMS masterclass as organized by QuarkNet and the International Particle Physics Outreach Group (IPPOG). We describe here the masterclass exercise: the physics, the online event display and database preparation behind it, the measurement the students undertake, their results and experiences, and future plans for the exercise.

  14. Operating the worldwide LHC computing grid: current and future challenges

    International Nuclear Information System (INIS)

    Molina, J Flix; Forti, A; Girone, M; Sciaba, A

    2014-01-01

    The Worldwide LHC Computing Grid project (WLCG) provides the computing and storage resources required by the LHC collaborations to store, process and analyse their data. It includes almost 200,000 CPU cores, 200 PB of disk storage and 200 PB of tape storage distributed among more than 150 sites. The WLCG operations team is responsible for several essential tasks, such as the coordination of testing and deployment of Grid middleware and services, communication with the experiments and the sites, follow-up and resolution of operational issues, and medium/long-term planning. In 2012 WLCG critically reviewed all operational procedures and restructured the organisation of the operations team into a more coherent effort in order to improve its efficiency. In this paper we describe how the new organisation works, its recent successes, and the changes to be implemented during the long LHC shutdown in preparation for LHC Run 2.

  15. Towards automated crystallographic structure refinement with phenix.refine

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V., E-mail: pafonine@lbl.gov; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel; Headd, Jeffrey J.; Moriarty, Nigel W. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Mustyakimov, Marat; Terwilliger, Thomas C. [Los Alamos National Laboratory, M888, Los Alamos, NM 87545 (United States); Urzhumtsev, Alexandre [CNRS–INSERM–UdS, 1 Rue Laurent Fries, BP 10142, 67404 Illkirch (France); Université Henri Poincaré, Nancy 1, BP 239, 54506 Vandoeuvre-lès-Nancy (France); Zwart, Peter H. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); Adams, Paul D. [Lawrence Berkeley National Laboratory, One Cyclotron Road, MS64R0121, Berkeley, CA 94720 (United States); University of California Berkeley, Berkeley, CA 94720 (United States)

    2012-04-01

    phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits, using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customization for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods.

  16. Lecture 7: Worldwide LHC Computing Grid Overview

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This presentation will introduce, in an informal but technically correct way, the challenges linked to the needs of massively distributed computing architectures in the context of LHC offline computing. The topics include technological and organizational aspects touching many areas of LHC computing, from data access, to maintenance of large databases and huge collections of files, to the organization of computing farms and monitoring. Fabrizio Furano holds a Ph.D. in Computer Science and has worked in the field of computing for high-energy physics for many years. Some of his preferred topics include application architectures, system design and project management, with a focus on performance and scalability of data access. Fabrizio has experience in a wide variety of environments, from private companies to academic research, in particular in object-oriented methodologies, mainly using C++. He also has teaching experience at university level in Software Engineering and C++ programming.

  17. LHC beam energy in 2012

    International Nuclear Information System (INIS)

    Siemko, A.; Charifouline, Z.; Dahlerup-Petersen, K.; Denz, R.; Ravaioli, E.; Schmidt, R.; Verweij, A.

    2012-01-01

    The interconnections between the LHC main magnets are made of soldered joints (splices) of two superconducting cables stabilized by a copper bus bar. The measurements performed in 2009 in the whole machine, in particular in sector 3-4 during the repair after the 2008 accident, demonstrated that there is a significant fraction of defective copper bus bar joints in the machine. In this paper, the limiting factors for operating the LHC at higher energies with defective 13 kA bus bar joints are briefly reviewed. The experience gained during the 2011 run, including the quench statistics and dedicated quench propagation tests impacting the maximum safe energy, is presented. The impact of the by-pass diode contact resistance issue is also addressed. Finally, a proposal for running at the highest possible safe energy compatible with the pre-defined risk level is presented. (authors)

  18. LHC Beam Energy in 2012

    CERN Document Server

    Siemko, A; Dahlerup-Petersen, K; Denz, R; Ravaioli, E; Schmidt, R; Verweij, A

    2012-01-01

    The interconnections between the LHC main magnets are made of soldered joints (splices) of two superconducting cables stabilized by a copper bus bar. The measurements performed in 2009 in the whole machine, in particular in sector 3-4 during the repair after the 2008 accident, demonstrated that there is a significant fraction of defective copper bus bar joints in the machine. In this paper, the limiting factors for operating the LHC at higher energies with defective 13 kA bus bar joints are briefly reviewed. The experience gained during the 2011 run, including the quench statistics and dedicated quench propagation tests impacting the maximum safe energy, is presented. The impact of the by-pass diode contact resistance issue is also addressed. Finally, a proposal for running at the highest possible safe energy compatible with the pre-defined risk level is presented.

  19. Progress with Long-Range Beam-Beam Compensation Studies for High Luminosity LHC

    Energy Technology Data Exchange (ETDEWEB)

    Rossi, Adriana; et al.

    2017-05-01

    Long-range beam-beam (LRBB) interactions can be a source of emittance growth and beam losses in the LHC during physics and will become even more relevant with the smaller β* and higher bunch intensities foreseen for the High Luminosity LHC upgrade (HL-LHC), in particular if operated without crab cavities. Both beam losses and emittance growth could be mitigated by compensating the non-linear LRBB kick with a correctly placed current-carrying wire. Such a compensation scheme is currently being studied in the LHC through a demonstration test using current-bearing wires embedded into collimator jaws, installed on either side of the high-luminosity interaction regions. For the HL-LHC two options are considered: a current-bearing wire as for the demonstrator, or electron lenses, as the ideal distance between the particle beam and the compensating current may be too small to allow the use of solid materials. This paper reports on the ongoing activities for both options, covering the progress of the wire-in-jaw collimators, the foreseen LRBB experiments at the LHC, and first considerations for the design of the electron lenses to ultimately replace material wires for the HL-LHC.

  20. Contribution to the gamma calibration by the radiative decay Z → μμγ, in the CMS experiment at LHC (CERN)

    International Nuclear Information System (INIS)

    Baty, C.

    2009-11-01

    The LHC started taking data in November 2009, opening a new era of discovery in particle physics. The CMS detector is one of the main experiments at the LHC (CERN). One goal of this experiment is the discovery of the Higgs boson, which is related to electroweak symmetry breaking. After placing the LHC and CMS in the context of present-day particle physics, I explain the whole chain leading from the physical event to the final analysis, in order to extract the reconstructed particles and the information allowing us, in the end, to discover new particles such as the Higgs boson. The first part of this work concerned the measurement and study of the acquisition-electronics gain ratios. This work aimed at a precise measurement of the photon energy over the whole available energy range (35 MeV -> 1.7 TeV). In particular, it deals with the validation of the different calibration methods for the VFE acquisition cards within the detector. A second part of my work concerned how to generate physics events while avoiding double counting between photons coming from matrix-element generators and those coming from parton-shower algorithms. An anti-double-counting veto has been created. Finally, the last part of the work shows how the radiative decay of the Z^0 neutral electroweak gauge boson allows, through the selection of certified photons, the extraction of the photon energy scale inside the electromagnetic calorimeter of CMS. (author)

  1. First years experience of LHC Beam Instrumentation

    CERN Document Server

    Jones, O R

    2011-01-01

    The LHC is equipped with a full suite of sophisticated beam instrumentation which has been essential for rapid commissioning, the safe increase in total stored beam power and the understanding of machine optics and accelerator physics phenomena. This paper will comment on all of these systems and on their contributions to the various stages of beam commissioning. It will include details on: the beam position system and its use for realtime global orbit feedback; the beam loss system and its role in machine protection; total and bunch by bunch intensity measurements; tune measurement and feedback; synchrotron light diagnostics for transverse beam size measurements, abort gap monitoring and longitudinal density measurements. Issues and problems encountered along the way will also be discussed together with the prospect for future upgrades.

  2. The CMS Experiment at the HL-LHC

    Directory of Open Access Journals (Sweden)

    Pozzobon Nicola

    2016-01-01

    A major upgrade of the readout and trigger electronics of the CMS Drift Tubes muon detector is foreseen in order to allow its efficient operation at the High Luminosity LHC. A proposal for a new L1 Trigger Primitives Generator for this detector is presented, featuring an algorithm that operates on the charge-collection time measurements provided by the asynchronous readout of the new TDC system being developed. The algorithm is being designed around the implementation, in state-of-the-art FPGA devices, of a Compact Hough Transform (CHT) algorithm combined with a Majority Mean-Timer, to identify both the parent bunch crossing and the muon track parameters. The current state of the design is presented along with the performance requirements, focusing on future developments.

  3. Single Event Burnout in DC-DC Converters for the LHC Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Claudio H. Rivetta et al.

    2001-09-24

    High voltage transistors in DC-DC converters are prone to catastrophic Single Event Burnout in the LHC radiation environment. This paper presents a systematic methodology to analyze single event effects sensitivity in converters and proposes solutions based on de-rating input voltage and output current or voltage.

  4. Past Experiences and Future Trends on Vertex Detector Cooling at LHC

    CERN Document Server

    Petagna, Paolo

    2014-01-01

    Substantially different approaches have been adopted for the refrigeration plants of the first generation of vertex detectors at the LHC: those of ALICE, ATLAS and CMS use PFC fluids, either in single phase or in a traditional Joule-Thomson cycle, while carbon dioxide in a pumped two-phase loop has been selected for the LHCb VELO. As concerns the on-board thermal management of the sensors and related electronics, a traditional design has been followed, based on a common general approach and differing only in the specific choices related to the local configuration. Although the global performance of the detectors in this first phase of LHC operation can be claimed as fully satisfactory, it appears that the additional challenges posed by the coming upgrade phases can only be tackled through an effort on technology innovation and, in particular, a much stronger and earlier integration of all cooling-related aspects in the detector conception. Carbon dioxide seems to be the preferred choice for the refrige...

  5. An FPGA based track finder for the L1 trigger of the CMS experiment at the High Luminosity LHC

    CERN Document Server

    Tomalin, Ian; Ball, Fionn Amhairghen; Balzer, Matthias Norbert; Boudoul, Gaelle; Brooke, James John; Caselle, Michele; Calligaris, Luigi; Cieri, Davide; Clement, Emyr John; Dutta, Suchandra; Hall, Geoffrey; Harder, Kristian; Hobson, Peter; Iles, Gregory Michiel; James, Thomas Owen; Manolopoulos, Konstantinos; Matsushita, Takashi; Morton, Alexander; Newbold, David; Paramesvaran, Sudarshan; Pesaresi, Mark Franco; Pozzobon, Nicola; Reid, Ivan; Rose, A. W; Sander, Oliver; Shepherd-Themistocleous, Claire; Shtipliyski, Antoni; Schuh, Thomas; Skinnari, Louise; Summers, Sioni Paris; Tapper, Alexander; Thea, Alessandro; Uchida, Kirika; Vichoudis, Paschalis; Viret, Sebastien; Weber, M; Aggleton, Robin Cameron

    2017-12-14

    A new tracking detector is under development for use by the CMS experiment at the High-Luminosity LHC (HL-LHC). A crucial requirement of this upgrade is to provide the ability to reconstruct all charged particle tracks with transverse momentum above 2-3 GeV within 4 μs so they can be used in the Level-1 trigger decision. A concept for an FPGA-based track finder using a fully time-multiplexed architecture is presented, where track candidates are reconstructed using a projective binning algorithm based on the Hough Transform, followed by a combinatorial Kalman Filter. A hardware demonstrator using MP7 processing boards has been assembled to prove the entire system functionality, from the output of the tracker readout boards to the reconstruction of tracks with fitted helix parameters. It successfully operates on one eighth of the tracker solid angle acceptance at a time, processing events taken at 40 MHz, each with up to 200 superimposed proton-proton interactions, whilst satisfying the latency requirement. ...
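    The projective binning step mentioned above can be illustrated with a toy Hough transform: each hit (r, φ) votes for every track-parameter bin (φ0, k) consistent with a straight-line model φ = φ0 + k·r, and bins that collect enough votes become track candidates. This is a schematic sketch of the technique, not the CMS firmware algorithm; all bin counts and thresholds are illustrative.

```python
import math

def hough_track_candidates(hits, n_phi=64, n_k=32, k_max=1e-3, threshold=4):
    """Toy Hough transform for hits (r, phi) under the linear model
    phi = phi0 + k*r. Returns (phi0, k) bin centres with >= threshold votes."""
    acc = {}
    for r, phi in hits:
        for j in range(n_k):                      # scan curvature bins
            k = -k_max + 2.0 * k_max * j / (n_k - 1)
            phi0 = phi - k * r                    # intercept implied by this k
            i = int((phi0 + math.pi) / (2.0 * math.pi) * n_phi) % n_phi
            acc[(i, j)] = acc.get((i, j), 0) + 1  # vote
    return [(-math.pi + (i + 0.5) * 2.0 * math.pi / n_phi,
             -k_max + 2.0 * k_max * j / (n_k - 1))
            for (i, j), votes in acc.items() if votes >= threshold]
```

In the real system the accumulator is filled in parallel in FPGA logic, and the surviving candidates are passed to the Kalman Filter for the precise fit.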

  6. Silicon Detectors for the sLHC - an Overview of Recent RD50 Results

    CERN Document Server

    Pellegrini, Giulio

    2009-01-01

    It is foreseen to significantly increase the luminosity of the Large Hadron Collider (LHC) at CERN around 2018 by upgrading the LHC towards the sLHC (Super-LHC). Due to the radiation damage to the silicon detectors used, the physics experiments will require new tracking detectors for sLHC operation. All-silicon central trackers are being studied in ATLAS, CMS and LHCb, with extremely radiation-hard silicon sensors on the innermost layers. The radiation hardness of these new sensors must surpass that of the LHC detectors by roughly an order of magnitude. Within the CERN RD50 collaboration, a massive R&D programme is underway to develop silicon sensors with sufficient radiation tolerance. Among the R&D topics is the development of new sensor types, such as 3D silicon detectors, designed for the extreme radiation levels of the sLHC. We will report on the recent results obtained by RD50 from tests of several detector technologies and silicon materials at radiation levels corresponding to sLHC fluences. Based on ...

  7. The CMS ECAL Upgrade for Precision Crystal Calorimetry at the HL-LHC

    CERN Document Server

    Jofrehei, Arash

    2017-01-01

    The Compact Muon Solenoid Experiment (CMS) is operating at the Large Hadron Collider (LHC) with proton-proton collisions at 13 TeV center-of-mass energy and at a bunch spacing of 25 ns. Challenging running conditions for CMS are expected after the High-Luminosity upgrade of the LHC (HL-LHC). We review the CMS ECAL crystal calorimeter upgrade and present results from the first test beam studies. Particular challenges at HL-LHC are the harsh radiation environment, the increasing data rates and the extreme level of pile-up events, with up to 200 simultaneous proton-proton collisions. Precision timing can be exploited to reduce the effect of the pile-up. We report on the timing resolution studies performed with test-beams. We discuss the new readout and trigger electronics, which must be upgraded due to the increased trigger and latency requirements at the HL-LHC.

  8. Open access to high-level data and analysis tools in the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Calderon, A; Rodriguez-Marrero, A; Colling, D; Huffman, A; Lassila-Perini, K; McCauley, T; Rao, A; Sexton-Kennedy, E

    2015-01-01

    The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton-proton collision data (by volume) at 7 TeV from 2010 in CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and histogram application that run in the browser. In addition a virtual machine containing a CMS software environment along with XRootD access to the data is available. Within the virtual machine the public can analyse CMS data; example code is provided. We describe the accompanying tools and documentation and discuss the first experiences of data use. (paper)

  9. A browser-based event display for the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Hategan, M; McCauley, T; Nguyen, P

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers become powerful platforms on which applications can be run. Such applications are trivial to install, readily extensible and easy to use. In an educational setting, web applications offer a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser supporting the HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to event display can have broader usage and impact for experts and public alike.
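    A JSON event format of the kind mentioned above is straightforward to consume in any language. The sketch below uses a hypothetical schema (a top-level "tracks" list with pt, eta, phi per track; the real CMS public event format differs) to show the projection step an event display performs when drawing the end-on view:

```python
import json
import math

# Hypothetical event record; the actual CMS JSON schema is not reproduced here.
event_json = '{"tracks": [{"pt": 5.2, "eta": 0.3, "phi": 1.1}]}'

def project_xy(track, r=1.0):
    """Project a track's direction onto the transverse (x-y) view at radius r,
    as an event display does for the beam-axis view."""
    return (r * math.cos(track["phi"]), r * math.sin(track["phi"]))

tracks = json.loads(event_json)["tracks"]
points = [project_xy(t) for t in tracks]
```

In the browser the same computation is done in JavaScript and the resulting points are drawn on an HTML5 canvas.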

  10. Trigger and data-acquisition challenges at the LHC

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    We review the main requirements placed on the Trigger and Data Acquisition (DAQ) systems of the LHC experiments by their rich physics program and the LHC environment. A description of the architecture of the various systems, the motivation for each alternative and the conceptual design of each filtering stage will be discussed. We will then turn to a description of the major elements of the three distinct sub-systems, namely the Level-1 trigger; the DAQ, with particular attention to the event building and overall control and monitoring; and finally the High-Level Trigger system and the online farms.

  11. Physics validation of detector simulation tools for LHC

    International Nuclear Information System (INIS)

    Beringer, J.

    2004-01-01

    Extensive studies aimed at validating the physics processes built into the detector simulation tools Geant4 and Fluka are in progress within all Large Hadron Collider (LHC) experiments, within the collaborations developing these tools, and within the LHC Computing Grid (LCG) Simulation Physics Validation Project, which has become the primary forum for these activities. This work includes detailed comparisons with test beam data, as well as benchmark studies of simple geometries and materials with single incident particles of various energies for which experimental data are available. We give an overview of these validation activities with emphasis on the latest results

  12. HL-LHC alternatives

    CERN Document Server

    Tomás, R; White, S

    2014-01-01

    The HL-LHC parameters assume unexplored regimes for hadron colliders in various aspects of accelerator beam dynamics and technology. This paper reviews three alternatives that could potentially improve the LHC performance: (i) the alternative filling scheme 8b+4e, (ii) the use of a 200 MHz RF system in the LHC and (iii) the use of proton cooling methods to reduce the beam emittance (at top energy and at injection). The alternatives are assessed in terms of feasibility, pros and cons, risks versus benefits and the impact on beam availability.

  13. The super-LHC

    CERN Document Server

    Mangano, Michelangelo L

    2010-01-01

    We review here the prospects of a long-term upgrade programme for the Large Hadron Collider (LHC), CERN laboratory's new proton-proton collider. The super-LHC, which is currently under evaluation and design, is expected to deliver of the order of ten times the statistics of the LHC. In addition to a non-technical summary of the principal physics arguments for the upgrade, I present a pedagogical introduction to the technological challenges on the accelerator and experimental fronts, and a review of the current status of the planning.

  14. Scientific highlights from ATLAS – LHC Run 2

    CERN Document Server

    Barr, Alan; The ATLAS collaboration

    2017-01-01

    We review the recent progress made at the ATLAS experiment at the LHC, concentrating particularly on the scalar sector and on searches for new particles. Slides for an invited plenary talk at Scalars 2017, Warsaw, 30 November 2017

  15. LHC Supertable

    International Nuclear Information System (INIS)

    Pereira, M.; Lahey, T.E.; Lamont, M.; Mueller, G.J.; Teixeira, D.D.; McCrory, E.S.

    2012-01-01

    LHC operations generate enormous amounts of data. This data is being stored in many different databases. Hence, it is difficult for operators, physicists, engineers and management to have a clear view on the overall accelerator performance. Until recently the logging database, through its desktop interface TIMBER, was the only way of retrieving information on a fill-by-fill basis. The LHC Supertable has been developed to provide a summary of key LHC performance parameters in a clear, consistent and comprehensive format. The columns in this table represent main parameters that describe the collider operation such as luminosity, beam intensity, emittance, etc. The data is organized in a tabular fill-by-fill manner with different levels of detail. Particular emphasis was placed on data sharing by making data available in various open formats. Typically the contents are calculated for periods of time that map to the accelerator's states or beam modes such as Injection, Stable Beams, etc. Data retrieval and calculation is triggered automatically after the end of each fill. The LHC Supertable project currently publishes 80 columns of data on around 100 fills. (authors)
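    The fill-by-fill tabular organisation described above amounts to pivoting per-state records into one row per fill, with one column per beam mode. A minimal sketch with invented field names and values (the real Supertable publishes around 80 columns and draws on the logging database):

```python
from collections import defaultdict

# Invented example records: one entry per (fill, beam mode).
records = [
    {"fill": 2536, "mode": "INJECTION",    "peak_lumi": 0.0},
    {"fill": 2536, "mode": "STABLE BEAMS", "peak_lumi": 58.1},
    {"fill": 2537, "mode": "STABLE BEAMS", "peak_lumi": 61.4},
]

def supertable(records):
    """Pivot per-mode records into a fill-indexed table: table[fill][mode]."""
    table = defaultdict(dict)
    for rec in records:
        table[rec["fill"]][rec["mode"]] = rec["peak_lumi"]
    return dict(table)
```

A pivot of this kind is what makes it possible to publish the same content in several open formats, one row per fill.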

  16. Track Finding for the Level-1 Trigger of the CMS Experiment

    CERN Document Server

    James, Thomas Owen

    2017-01-01

    A new tracking system is under development for the CMS experiment at the High Luminosity LHC (HL-LHC), located at CERN. It includes a silicon tracker that will correlate clusters in two closely spaced sensor layers, for the rejection of hits from low transverse momentum tracks. This will allow tracker data to be read out to the Level-1 trigger at 40 MHz. The Level-1 track finder must be able to identify tracks with transverse momentum above 2-3 GeV/c within latency constraints. A concept for an FPGA-based track finder using a fully time-multiplexed architecture is presented, where track candidates are identified using a Hough Transform and then refined with a Kalman Filter. Both steps are fully implemented in FPGA firmware. A hardware system built from MP7 MicroTCA processing cards has been assembled, which demonstrates a realistic slice of the track finder in order to help gauge the performance and requirements for a final system.

  17. A 120 mm Bore Quadrupole for the Phase 1 LHC Upgrade

    CERN Document Server

    Fessia, P; Borgnolutti, F; Regis, F; Richter, D; Todesco, E

    2010-01-01

    The phase I LHC upgrade foresees the installation of new final focusing for the high luminosity experiments in order to be able to focus the beams at the interaction points to β* ≈ 0.25 m. The key element of this upgrade is a large-bore (120 mm) superconducting quadrupole. This article proposes a magnet design that will make use of the LHC main dipole superconducting cable. Due to the schedule constraints and to the budget restrictions, it is mandatory to integrate into the design the maximum number of features successfully used during the LHC construction. This paper presents this design option and the rationale behind the several technical choices.

  18. Quench Heater Experiments on the LHC Main Superconducting Magnets

    OpenAIRE

    Rodríguez-Mateos, F; Pugnat, P; Sanfilippo, S; Schmidt, R; Siemko, A; Sonnemann, F

    2000-01-01

    In case of a quench in one of the main dipoles and quadrupoles of CERN's Large Hadron Collider (LHC), the magnet has to be protected against excessive temperatures and high voltages. In order to uniformly distribute the stored magnetic energy in the coils, heater strips installed in the magnet are fired after quench detection. Tests of different quench heater configurations were performed on various 1 m long model and 15 m long prototype dipole magnets, as well as on a 3 m long prototype quad...

  19. Optimization of the powering tests of the LHC superconducting circuits

    CERN Document Server

    Bellesia, B; Denz, R; Fernandez-Robles, C; Pojer, M; Saban, R; Schmidt, R; Solfaroli Camillocci, M; Thiesen, H; Vergara Fernández, A

    2010-01-01

    The Large Hadron Collider (LHC) has 1572 superconducting circuits, distributed along the eight 3.5 km LHC sectors [1]. Time and resources during the commissioning of the LHC technical systems were mostly consumed by the powering tests of each circuit. The tests consisted of carrying out several powering cycles at different current levels for each superconducting circuit. The Hardware Commissioning Coordination was in charge of planning, following up and piloting the execution of the test program. The first powering test campaign was carried out in summer 2007 for sector 7-8 with an expected duration of 12 weeks. The experience gained during these tests was used by the commissioning team to minimise the duration of the following powering campaigns and comply with the stringent LHC project deadlines. Improvements concerned several areas: strategy, procedures, control tools, automation, and resource allocation; together they led to an average daily test rate increase from 25 to 200 tests per day. This paper desc...

  20. Potential of stochastic cooling of heavy ions in the LHC

    CERN Document Server

    Schaumann, M; Blaskiewicz, M

    2013-01-01

    The dynamics of the high intensity lead beams in the LHC are strongly influenced by intra-beam scattering (IBS), leading to significant emittance growth and particle losses at all energies. Particle losses during collisions are dominated by nuclear electromagnetic processes and the debunching effect arising from the influence of IBS, resulting in a non-exponential intensity decay during the fill and short luminosity lifetimes. In the LHC heavy ion runs, 3 experiments will be taking data and the average fill duration will be rather short as a consequence of the high burn-off rate. The achievements with stochastic cooling at RHIC suggest that such a system at LHC could substantially reduce the emittance growth and the debunching component during injection and collisions. The luminosity lifetime and fill length could be improved to optimize the use of the limited run time of 4 weeks per year. This paper discusses the first results of a feasibility study to use stochastic cooling on the lead ion beams in the LHC....
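    The non-exponential intensity decay mentioned above already follows from the simplest burn-off model: if losses are dominated by collisions and the luminosity scales as N², then dN/dt = -k·N² and the intensity falls hyperbolically rather than exponentially. A one-line illustration (k and N0 are arbitrary here, not measured LHC values):

```python
def burn_off_intensity(n0, k, t):
    """Closed-form solution of dN/dt = -k * N**2 (pure burn-off):
    N(t) = N0 / (1 + k*N0*t). Hyperbolic, not exponential, decay."""
    return n0 / (1.0 + k * n0 * t)

# With k*N0 = 1 per unit time the beam halves after t = 1,
# but needs until t = 3 (not t = 2) to halve again: not exponential.
```

In the real machine IBS and debunching add further, emittance-dependent loss terms on top of this burn-off baseline, which is what cooling aims to suppress.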

  1. Very forward measurements at the LHC

    CERN Document Server

    Berretti, Mirko

    2017-01-01

    In this talk we present a selection of forward physics results recently obtained with the run-1 and run-2 LHC data by the CMS, LHCf and TOTEM experiments. The status of the very forward LHC proton spectrometer, CT-PPS, is discussed: emphasis is given to the physics potential of CT-PPS and to the analyses that are currently ongoing with the data collected in 2016. Very recent forward measurements obtained with the LHCf and the CMS-CASTOR calorimeter are then addressed. In particular, CMS measured the inclusive energy spectrum in the very forward direction for proton-proton collisions at a center-of-mass energy of 13 TeV and the jet cross sections for p+Pb collisions at 5.02 TeV. The LHCf experiment has instead recently published the inclusive energy spectra of forward photons for pp collisions at 13 TeV. Finally, the new measurements of the total, elastic and inelastic cross sections obtained by the TOTEM collaboration at 2.76 and 13 TeV center of mass energy are presented.

  2. The online muon identification of the ATLAS experiment at the LHC

    CERN Document Server

    Bernard, C; The ATLAS collaboration

    2014-01-01

    Identifying muons in the busy LHC environment is an important challenge for the ATLAS detector. This paper gives an overview of the ATLAS three-level muon trigger system, summarizing the online performance. In particular it discusses processing time and trigger rates as well as efficiency, resolution and other general performance figures.

  3. LHC-GCS a model-driven approach for automatic PLC and SCADA code generation

    CERN Document Server

    Thomas, Geraldine; Barillère, Renaud; Cabaret, Sebastien; Kulman, Nikolay; Pons, Xavier; Rochez, Jacques

    2005-01-01

    The LHC experiments’ Gas Control System (LHC GCS) project [1] aims to provide the four LHC experiments (ALICE, ATLAS, CMS and LHCb) with control for their 23 gas systems. To ease the production and maintenance of 23 control systems, a model-driven approach has been adopted to generate automatically the code for the Programmable Logic Controllers (PLCs) and for the Supervision Control And Data Acquisition (SCADA) systems. The first milestones of the project have been achieved. The LHC GCS framework [4] and the generation tools have been produced. A first control application has actually been generated and is in production, and a second is in preparation. This paper describes the principle and the architecture of the model-driven solution. It will in particular detail how the model-driven solution fits with the LHC GCS framework and with the UNICOS [5] data-driven tools.

  4. Engineering status of the superconducting end cap toroid magnets for the ATLAS experiment at LHC

    CERN Document Server

    Baynham, D Elwyn; Carr, F S; Courthold, M J D; Cragg, D A; Densham, C J; Evans, D; Holtom, E; Rochford, J; Sole, D; Towndrow, Edwin F; Warner, G P

    2000-01-01

    The ATLAS experiment at LHC, CERN will utilise a large, superconducting, air-cored toroid magnet system for precision muon measurements. The magnet system will consist of a long barrel and two end-cap toroids. Each end-cap toroid will contain eight racetrack coils mounted as a single cold mass in cryostat vessel of ~10 m diameter. The project has now moved from the design/specification stage into the fabrication phase. This paper presents the engineering status of the cold masses and vacuum vessels that are under fabrication in industry. Final designs of cold mass supports, cryogenic systems and control/protection systems are presented. Planning for toroid integration, test and installation is described. (3 refs).

  5. HL-LHC updates in Japan

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    At a recent meeting in Japan, updates on the High Luminosity LHC (HL-LHC) project were presented, including the progress made so far and the deadlines still to be met for the upgraded machine to be operational from 2020.   New magnets made with advanced superconductor Nb3Sn in the framework of the HL-LHC project. These magnets are currently under construction at CERN by the TE-MSC group. The LHC is the world’s most powerful particle accelerator, and in 2015 it will reach yet another new record for the energy of its colliding beams. One key factor of its discovery potential is its ability to produce collisions described in mathematical terms by the parameter known as “luminosity”. In 2025, the HL-LHC project will allow the total number of collisions in the LHC to increase by a factor of 10. The first step in this rich upgrade programme is the delivery of the Preliminary Design Report (PDR), which is also a key milestone of the HiLumi LHC Design Study partly fund...

  6. Proposal to negotiate an amendment to an existing blanket purchase contract for the supply and repair of subracks for the LHC experiments

    CERN Document Server

    2006-01-01

    This document concerns the proposal to negotiate an amendment to an existing blanket purchase contract for the supply and repair of subracks for the LHC experiments. For the reasons explained in this document, the Finance Committee is invited to agree to the negotiation of an amendment to the blanket purchase contract for the supply and repair of subracks for the LHC experiments with the company WIENER, PLEIN & BAUS (DE), for an extension of the period of supply from four to six years for an amount exceeding the previously authorised amount of 5 600 000 euros, subject to revision for inflation from January 2003, by up to 1 880 000 euros, subject to revision for inflation, bringing the total amount of the blanket purchase contract to a maximum amount of 7 480 000 euros, subject to revision for inflation. At the present rate of exchange, the total amended amount of the blanket purchase contract is equivalent to approximately 11 800 000 Swiss francs. CERN's total financial contribution to the funding of the ...

  7. Proposal for the award of a blanket contract for automatic air-sampling systems for fire and gas detection in the LHC experiments

    CERN Document Server

    2004-01-01

    This document concerns the award of a blanket contract for automatic air-sampling systems for fire and gas detection in the LHC experiments. Following a market survey carried out among 119 firms in ten Member States, a call for tenders (IT-2891/ST) was sent on 1 August 2003 to four firms, in three Member States. By the closing date, CERN had received two tenders from one firm and one consortium, in three Member States. The Finance Committee is invited to agree to the negotiation of a blanket contract with ICARE (FR), the lowest bidder, for the supply of automatic air-sampling systems for fire and gas detection in the LHC experiments for a total amount not exceeding 1 750 000 euros (2 714 000 Swiss francs), subject to revision for inflation from 1 January 2007 with options for air-sampling smoke detection systems for electrical racks, for an additional amount of 400 000 euros (620 000 Swiss francs), subject to revision for inflation from 1 January 2007, bringing the total amount to a maximum of 2 150 000 euros...

  8. A Forward Silicon Strip System for the ATLAS HL-LHC Upgrade

    CERN Document Server

    Wonsak, S; The ATLAS collaboration

    2012-01-01

    The LHC is successfully accumulating luminosity at a centre-of-mass energy of 8 TeV this year. At the same time, plans are rapidly progressing for a series of upgrades, culminating roughly eight years from now in the High Luminosity LHC (HL-LHC) project. The HL-LHC is expected to deliver approximately five times the LHC nominal instantaneous luminosity, resulting in a total integrated luminosity of around 3000 fb-1 by 2030. The ATLAS experiment has a rather well advanced plan to build and install a completely new Inner Tracker (IT) system entirely based on silicon detectors by 2020. This new IT will be made from several pixel and strip layers. The silicon strip detector system will consist of single-sided p-type detectors with five barrel layers and six endcap (EC) disks on each forward side. Each disk will consist of 32 trapezoidal objects dubbed “petals”, with all services (cooling, read-out, command lines, LV and HV power) integrated into the petal. Each petal will contain 18 silicon sensors grouped in...

  9. submitter LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    CERN Document Server

    Barranco, Javier; Cameron, David; Crouch, Matthew; De Maria, Riccardo; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Van der Veken, Frederik; Zacharov, Igor

    2017-01-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted i...

  10. Performance of the LHCb RICH detector at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Adinolfi, M.; Brook, N.H.; Coombes, M.; Hampson, T.; Rademacker, J.H.; Solomin, A.; Voong, D. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Aglieri Rinella, G.; Albrecht, E.; D' Ambrosio, C.; Forty, R.; Frei, C.; Gys, T.; Kanaya, N.; Koblitz, S.; Mollen, A.; Morant, J.; Piedigrossi, D.; Storaci, B.; Ullaland, O.; Vervink, K.; Wyllie, K. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Bellunato, T.; Calvi, M.; Fanchini, E.; Giachero, A.; Gotti, C.; Kucharczyk, M.; Maino, M.; Matteuzzi, C.; Perego, D.L.; Pessina, G. [Sezione INFN di Milano Bicocca, Milano (Italy); Benson, S.; Eisenhardt, S.; Fitzpatrick, C.; Kim, Y.M.; Lambert, D.; Main, A.; Muheim, F.; Playfer, S.; Sparkes, A.; Young, R. [University of Edinburgh, School of Physics and Astronomy, Edinburgh (United Kingdom); Blake, T. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Imperial College London, London (United Kingdom); Blanks, C.; Cameron, B.; Carson, L.; Egede, U.; Owen, P.; Patel, M.; Plackett, R.; Savidge, T.; Sepp, I.; Soomro, F.; Websdale, D. [Imperial College London, London (United Kingdom); Brisbane, S.; Contu, A.; Gandini, P.; Gao, R.; Harnew, N.; Hill, D.; Hunt, P.; John, M.; Johnson, D.; Malde, S.; Muresan, R.; Powell, A.; Thomas, C.; Topp-Joergensen, S.; Torr, N.; Wilkinson, G.; Xing, F. [University of Oxford, Department of Physics, Oxford (United Kingdom); Cardinale, R.; Fontanelli, F.; Mini' , G.; Petrolini, A.; Sannino, M. [Sezione INFN di Genova, Genova (Italy); Easo, S. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); STFC Rutherford Appleton Laboratory, Didcot (United Kingdom); Garra Tico, J.; Gibson, V.; Gregson, S.; Haines, S.C.; Jones, C.R.; Katvars, S.; Kerzel, U.; Mangiafave, N.; Rogers, G.J.; Sigurdsson, S.; Wotton, S.A. [University of Cambridge, Cavendish Laboratory, Cambridge (United Kingdom); Mountain, R. 
[Syracuse University, Syracuse, NY (United States); Morris, J.V.; Nardulli, J.; Papanestis, A.; Patrick, G.N.; Ricciardi, S. [STFC Rutherford Appleton Laboratory, Didcot (United Kingdom); Sail, P.; Soler, F.J.P.; Spradlin, P. [University of Glasgow, School of Physics and Astronomy, Glasgow (United Kingdom); Collaboration: The LHCb RICH Collaboration

    2013-05-15

    The LHCb experiment has been taking data at the Large Hadron Collider (LHC) at CERN since the end of 2009. One of its key detector components is the Ring-Imaging Cherenkov (RICH) system, which provides charged-particle identification over a wide momentum range, from 2 to 100 GeV/c. The operation and control, software, and online monitoring of the RICH system are described. The particle-identification performance, as measured using data from the LHC, is presented. Excellent separation of hadronic particle types (π, K, p) is achieved. (orig.)

  11. LHC Nobel Symposium Proceedings

    Science.gov (United States)

    Ekelöf, Tord

    2013-12-01

    In the summer of 2012, a great discovery emerged at the Large Hadron Collider (LHC) at CERN in Geneva. A plethora of new precision data had already by then been collected by the ATLAS and CMS experiments at LHC, providing further extensive support for the validity of the Standard Model of particle physics. But what now appeared was the first evidence for what was not only the last unverified prediction of the Standard Model, but also perhaps the most decisive one: the prediction made already in 1964 of a unique scalar boson required by the theory of François Englert and Peter Higgs on how fundamental particles acquire mass. At that moment in 2012, it seemed particularly appropriate to start planning a gathering of world experts in particle physics to take stock of the situation and try to answer the challenging question: what next? By May 2013, when the LHC Nobel Symposium was held at the Krusenberg Mansion outside Uppsala in Sweden, the first signs of a great discovery had already turned into fully convincing experimental evidence for the existence of a scalar boson of mass about 125 GeV, having properties compatible with the 50-year-old prediction. And in October 2013, the evidence was deemed so convincing that the Swedish Royal Academy of Sciences awarded the Nobel Prize in Physics to Englert and Higgs for their pioneering work. At the same time the search at the LHC for other particles, beyond those predicted by the Standard Model, with heavier masses up to—and in some cases beyond—1 TeV, had provided no positive result. The triumph of the Standard Model seems resounding, in particular because the mass of the discovered scalar boson is such that, when identified with the Higgs boson, the Standard Model is able to provide predictions at energies as high as the Planck mass, although at the price of accepting that the vacuum would be metastable. However, even if there were some feelings of triumph, the ambience at the LHC Nobel Symposium was more one of

  12. The LHC and its electrotechnical challenges

    International Nuclear Information System (INIS)

    Bordry, F.

    2010-01-01

    After a brief presentation of CERN, the European Organization for Nuclear Research, this article presents the LHC, the Large Hadron Collider, the largest and most powerful particle accelerator in the world. The project, which in a sense began in 1984, rested on several technological challenges, described here: superconducting magnets (their characteristics and cryogenic operation), operational safety given the particularly high energies stored in the magnets and beams, and the LHC electricity supply (electrical circuits with large time constants, the precision and reproducibility of the magnetic field required during all operating phases, and the importance of power converters). The author then recalls the start-up procedures, the serious damage that occurred, and the restart of operations, with spectacular results in terms of beam energy. Future experiments and expected results are also discussed

  13. The long journey to the Higgs boson and beyond at the LHC: Emphasis on ATLAS

    Science.gov (United States)

    Jenni, Peter

    2016-09-01

    The journey in search of the Higgs boson with the ATLAS and CMS experiments at the Large Hadron Collider (LHC) at CERN started more than two decades ago, and the first discussions motivating the LHC project dream date back even further, into the 1980s. This article recalls some of these early historical considerations, mentions some of the LHC machine milestones and achievements, focuses, as an example of a technological challenge, on the unique ATLAS superconducting magnet system, and then gives an account of the physics results so far, featuring in particular the Higgs boson results, before finally sketching prospects for the future. With its emphasis on the ATLAS experiment it is complementary to the preceding article by Tejinder S. Virdee, which focused on the CMS experiment.

  14. Statistics for the LHC: Quantifying our Scientific Narrative (1/4)

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Now that the LHC physics program is well under way and results have begun to pour out of the experiments, the statistical methodology used for these results is a hot topic. This is a challenge at the LHC, as we have sensitivity to discover new physics in a stage of the experiments where systematic uncertainties can still be quite large. The emphasis of these lectures is how we can translate the scientific narrative of why we think we know what we know into quantitative statistical statements about the presence or absence of new physics. Topics will include statistical modeling, incorporation of control samples to constrain systematics, and Bayesian and Frequentist statistical tests that are capable of answering these questions.
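The kind of frequentist discovery test discussed in this record can be illustrated with a minimal counting-experiment sketch (not from the lectures themselves; the event counts are invented, and the standard asymptotic profile-likelihood formula is used):

```python
import math

def discovery_significance(n_obs: float, b: float) -> float:
    """Asymptotic discovery significance for a counting experiment:
    how strongly n_obs observed events reject the background-only
    hypothesis with expected background b, using the asymptotic
    profile-likelihood-ratio formula Z = sqrt(2(n ln(n/b) - (n - b)))."""
    if n_obs <= b:
        return 0.0  # no excess, no evidence against background-only
    return math.sqrt(2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b)))

# Illustrative numbers: 120 events over an expected background of 100
# is roughly a 1.9 sigma excess, far short of the 5 sigma convention.
print(round(discovery_significance(120, 100), 2))  # -> 1.94
```

In real LHC analyses the background b carries systematic uncertainty, constrained with control samples as the lectures describe, which dilutes this naive significance.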

  15. The ATLAS computing challenge for HL-LHC

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment successfully commissioned a software and computing infrastructure to support the physics program during LHC Run 2. The next phases of the accelerator upgrade will present new challenges in the offline area. In particular, at the High Luminosity LHC (also known as Run 4) the data-taking conditions will be very demanding in terms of computing resources: between 5 and 10 kHz of event rate from the HLT to be reconstructed (and possibly further reprocessed) with an average pile-up of up to 200 events per collision, and an equivalent number of simulated samples to be produced. The same parameters for the current run are lower by up to an order of magnitude. While processing and storage resources would need to scale accordingly, the funding situation allows one at best to consider a flat budget over the next few years for offline computing needs. In this paper we present a study quantifying the challenge in terms of computing resources for HL-LHC and present ideas about the possible evolution of the ...
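The resource challenge quoted in this abstract can be put in rough numbers with a back-of-the-envelope sketch (illustrative only, not the ATLAS resource model; the Run-2 baseline figures here are assumptions, and reconstruction cost is crudely taken as linear in pile-up):

```python
# Naive CPU-scaling estimate from the HL-LHC parameters quoted above:
# CPU need ~ HLT output rate x per-event reconstruction cost,
# with per-event cost assumed proportional to pile-up.

run2_rate_khz = 1.0   # assumed Run-2 HLT output rate (kHz), illustrative
run2_pileup = 35      # assumed typical Run-2 pile-up, illustrative
hl_rate_khz = 7.5     # mid-range of the 5-10 kHz quoted in the abstract
hl_pileup = 200       # pile-up quoted in the abstract

scale = (hl_rate_khz / run2_rate_khz) * (hl_pileup / run2_pileup)
print(f"naive CPU scaling factor: {scale:.0f}x")  # -> about 43x
```

Even this crude estimate shows why a flat budget cannot absorb HL-LHC conditions without major software and workflow evolution.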

  16. LHC Report: First 13 TeV collisions

    CERN Multimedia

    Jan Uythoven for the LHC team

    2015-01-01

    On Wednesday 20 May at around 10.30 p.m., protons collided in the LHC at the record-breaking energy of 13 TeV for the first time. These test collisions were to set up various systems and, in particular, the collimators. The tests and the technical adjustments will continue in the coming days.   The CCC was abuzz as the LHC experiments saw 13 TeV collisions.   Preparation for the first physics run at 6.5 TeV per beam has continued in the LHC. This included the set-up and verification of the machine protection systems. In addition, precise measurements of the overall focusing properties of the ring – the so-called “optics” – were performed by inducing oscillations of the bunches, and observing the response over many turns with the beam position monitors (BPM). The transverse beam size in the accelerator changes from the order of a millimetre around most of the circumference down to some tens of microns at the centre of the exper...

  17. Topics in the measurement of electrons with the ATLAS detector at the LHC

    CERN Document Server

    Thioye, Moustapha

    2008-01-01

    Upon completion in 2008, the Large Hadron Collider (LHC) will accelerate and collide protons at a 14 TeV center-of-mass energy with a design luminosity of $10^{34}\rm{cm^{-2}s^{-1}}$. The LHC will also be able to accelerate and collide heavy ions (Pb-Pb) at a nucleon-nucleon center-of-mass energy of 5.5 TeV. It will be the most powerful instrument ever built to investigate particle properties. The ATLAS (A Toroidal LHC ApparatuS) experiment is one of five experiments at the LHC. ATLAS is a general-purpose detector designed for the discovery of new particles predicted by the Standard Model (e.g. the Higgs boson) and of signatures of physics beyond the Standard Model (e.g. supersymmetry). These discoveries require highly efficient detection and high-resolution measurement of leptons and photons in the final state. In ATLAS, the liquid-argon (LAr) calorimeters identify and measure electrons and photons with high resolution. This dissertation reports on a study of various topics relevant to the measurement of electrons ...

  18. The LHC is safe

    CERN Document Server

    CERN. Geneva; Alvarez-Gaumé, Luís

    2008-01-01

    Concerns have been expressed from time to time about the safety of new high-energy colliders, and the LHC has been no exception. The LHC Safety Assessment Group (LSAG)(*) was asked last year by the CERN management to review previous LHC safety analyses in light of additional experimental results and theoretical understanding. LSAG confirms, updates and extends previous conclusions that there is no basis for any conceivable threat from the LHC. Indeed, recent theoretical and experimental developments reinforce this conclusion. In this Colloquium, the basic arguments presented by LSAG will be reviewed. Cosmic rays of much higher effective centre-of-mass energies have been bombarding the Earth and other astronomical objects for billions of years, and their continued existence shows that the Earth faces no dangers from exotic objects such as hypothetical microscopic black holes that might be produced by the LHC - as discussed in a detailed paper by Giddings and Mangano(**). Measurements of strange particle produc...

  19. External post-operational checks for the LHC beam dumping system

    International Nuclear Information System (INIS)

    Magnin, N.; Baggiolini, V.; Carlier, E.; Goddard, B.; Gorbonosov, R.; Khasbulatov, D.; Uythoven, J.; Zerlauth, M.

    2012-01-01

    The LHC Beam Dumping System (LBDS) is a critical part of the LHC machine protection system. After every LHC beam dump action the various signals and transient data recordings of the beam dumping control systems and beam instrumentation measurements are automatically analysed by the external Post-Operational Checks (XPOC) system to verify the correct execution of the dump action and the integrity of the related equipment. This software system complements the LHC machine protection hardware, and has to ascertain that the beam dumping system is 'as good as new' before the start of the next operational cycle. This is the only way by which the stringent reliability requirements can be met. The XPOC system has been developed within the framework of the LHC 'Post-Mortem' system, allowing highly dependable data acquisition, data archiving, live analysis of acquired data and replay of previously recorded events. It is composed of various analysis modules, each one dedicated to the analysis of measurements coming from specific equipment. This paper describes the global architecture of the XPOC system and gives examples of the analyses performed by some of the most important analysis modules. It explains the integration of the XPOC into the LHC control infrastructure along with its integration into the decision chain to allow proceeding with beam operation. Finally, it discusses the operational experience with the XPOC system acquired during the first years of LHC operation, and illustrates examples of internal system faults or abnormal beam dump executions which it has detected. (authors)
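The modular analysis pattern described above, where each module checks one class of recorded signals and beam operation may only resume if every module passes, can be sketched as follows (hypothetical module names, signal keys, and thresholds for illustration; this is not the real XPOC code):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class CheckResult:
    module: str
    passed: bool
    message: str

def check_kicker_waveform(event: Dict) -> CheckResult:
    # Illustrative tolerance on the recorded extraction-kicker peak.
    ok = abs(event["kicker_peak_kV"] - event["kicker_ref_kV"]) < 1.0
    return CheckResult("kicker", ok,
                       "waveform within tolerance" if ok else "peak out of range")

def check_beam_losses(event: Dict) -> CheckResult:
    # Illustrative ceiling on beam losses recorded during the dump.
    ok = event["max_loss"] < 1e-3
    return CheckResult("losses", ok,
                       "losses nominal" if ok else "excessive losses")

def run_xpoc(event: Dict,
             modules: List[Callable[[Dict], CheckResult]]) -> bool:
    """Run every analysis module on one dump event; operation may
    proceed only if all modules pass, mirroring the role of XPOC in
    the decision chain."""
    results = [m(event) for m in modules]
    for r in results:
        print(f"[{r.module}] {'OK' if r.passed else 'FAIL'}: {r.message}")
    return all(r.passed for r in results)

dump_event = {"kicker_peak_kV": 30.2, "kicker_ref_kV": 30.0, "max_loss": 2e-4}
print("proceed with beam operation:",
      run_xpoc(dump_event, [check_kicker_waveform, check_beam_losses]))
```

The all-modules-must-pass rule reflects the "as good as new" requirement quoted above: a single failed check is enough to hold the next operational cycle for investigation.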

  20. Development of reconstruction algorithms for inelastic processes studies in the TOTEM experiment at LHC

    CERN Document Server

    Berretti, Mirko; Latino, Giuseppe

    The TOTEM experiment at the Large Hadron Collider (LHC) is designed and optimized to measure the total pp cross section at a center-of-mass energy of √s = 14 TeV with a precision of about 1-2%, to study the elastic pp cross section over a wide range of the squared four-momentum transfer (10^{-3} GeV^2 < |t| < 10 GeV^2), and to perform a comprehensive physics program on diffractive dissociation processes, partially in cooperation with the CMS experiment. Based on the “luminosity independent method”, the evaluation of the total cross section with such a small error will in particular require the simultaneous measurement of the pp elastic scattering cross section dσ/dt down to |t| ~ 10^{-3} GeV^2 (to be extrapolated to t = 0) as well as of the pp inelastic interaction rate, with large coverage in the forward region. The TOTEM physics program will be accomplished by using three different types of detectors: elastically scattered protons will be detected by Roman Pot detectors (based on sili...