WorldWideScience

Sample records for polarimeter uvsp software

  1. The Penn Polarimeters

    Directory of Open Access Journals (Sweden)

    Robert H. Koch

    2012-03-01

Full Text Available This report describes the inception, development and extensive use over 30 years of elliptical polarimeters at the University of Pennsylvania. The initial Mark I polarimeter design utilized oriented retarder plates and a calcite Foster-Clarke prism as the analyzer. The Mark I polarimeter was used on the Kitt Peak 0.9 m in 1969-70 to carry out a survey of approximately 70 objects before the device was relocated to the 0.72 m reflector at the Flower and Cook Observatory. Successive generations of automation and improvements included an optical redesign in the early 1980s to use a photoelastic-modulated wave plate and an Ithaco lock-in amplifier, yielding the photoelastic modulating polarimeter. The final design, completed in 2000, was a fully remotely operable device. The legacy of the polarimetric programs includes studies of close binaries, pulsating hot stars, and luminous late-type variables.

  2. HERTZ, A Submillimeter Polarimeter

    Science.gov (United States)

    Schleuning, D. A.; Dowell, C. D.; Hildebrand, R. H.; Platt, S. R.; Novak, G.

    1997-03-01

We describe a 32 pixel polarimeter, Hertz, for use at the Caltech Submillimeter Observatory. We present polarization maps of the Orion molecular cloud (OMC-1) at 350 μm (46 detections) and 450 μm (19 detections) with 3σ or better statistical significance. The 350 μm polarization ranges from 1.4 to 6.8% with a median value of 3.3%. The position angles are fairly uniform across the source at an angle of ~30 degrees (east of north). We describe the design and performance characteristics of the polarimeter and discuss systematic effects due to telescope and instrumental polarization, atmospheric fluctuations, and reference beam flux. (SECTION: Astronomical Instrumentation)

  3. Metasurface-Based Polarimeters

    Directory of Open Access Journals (Sweden)

    Fei Ding

    2018-04-01

Full Text Available The state of polarization (SOP) is an inherent property of light that can be used to gain crucial information about the composition and structure of materials interrogated with light. However, the SOP is difficult to determine experimentally, since it involves phase information between orthogonal polarization states and is uncorrelated with the light intensity and frequency, which can be easily determined with photodetectors and spectrometers. Rapid progress on optical gradient metasurfaces has resulted in the development of conceptually new approaches to SOP characterization. In this paper, we review the fundamentals of and recent developments within metasurface-based polarimeters. Starting by introducing the concepts of the generalized Snell's law and the Stokes parameters, we explain the Pancharatnam-Berry phase (PB phase), which is instrumental for differentiating between orthogonal circular polarizations. Then we review the recent progress in metasurface-based polarimeters, including polarimeters, spectropolarimeters, orbital angular momentum (OAM) spectropolarimeters, and photodetector-integrated polarimeters. The review ends with a short conclusion and perspectives for future developments.
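For readers unfamiliar with the notation, the Stokes parameters referred to above are conventionally defined from six intensity measurements behind ideal analyzers (horizontal/vertical, ±45°, right/left circular). This is the standard textbook definition, stated here for context rather than quoted from the reviewed paper:

```latex
S_0 = I_H + I_V,\qquad
S_1 = I_H - I_V,\qquad
S_2 = I_{+45^\circ} - I_{-45^\circ},\qquad
S_3 = I_{\mathrm{RCP}} - I_{\mathrm{LCP}}
```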

  4. Optical polarimeter based on Fourier analysis and electronic control

    International Nuclear Information System (INIS)

    Vilardy, J; Salas, V.; Torres, C.

    2016-01-01

In this paper, we show the design and implementation of an optical polarimeter using electronic control and Fourier analysis. The polarimeter prototype will be used as a main tool for students of the Universidad Popular del Cesar in the following university programs: Electronics Engineering (optoelectronics area), the Math and Physics degree, and the Master in Physics Sciences, in order to learn the theoretical and experimental aspects of the state of optical polarization via Stokes vector measurement. Using the electronic polarimeter proposed in this paper, the students will be able to observe (on an optical bench) and understand the different interactions of the states of optical polarization when optical waves pass through polarizers and retarder wave plates. The electronic polarimeter has software that captures the optical intensity measurements and evaluates the Stokes vector. (Author)
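The abstract does not spell out the modulation scheme, so purely as an illustration of the Fourier-analysis idea, here is a minimal NumPy sketch of one common configuration: a quarter-wave plate rotated in front of a fixed horizontal polarizer, with the Stokes vector recovered from the Fourier coefficients of the detected intensity. The function names and the choice of 36 retarder positions are arbitrary and not taken from the paper.

```python
import numpy as np

def detected_intensity(stokes, theta):
    """Intensity behind a quarter-wave plate (fast axis at angle theta, radians)
    followed by a fixed horizontal linear polarizer, for input Stokes vector."""
    s0, s1, s2, s3 = stokes
    return 0.5 * (s0
                  + s1 * np.cos(2 * theta) ** 2
                  + s2 * np.sin(2 * theta) * np.cos(2 * theta)
                  - s3 * np.sin(2 * theta))

def stokes_from_fourier(intensity, theta):
    """Recover S from the DC, sin(2*theta), cos(4*theta) and sin(4*theta)
    Fourier coefficients of the measured intensity curve."""
    n = len(theta)
    a = intensity.mean()
    b = 2.0 / n * np.sum(intensity * np.sin(2 * theta))
    c = 2.0 / n * np.sum(intensity * np.cos(4 * theta))
    d = 2.0 / n * np.sum(intensity * np.sin(4 * theta))
    return np.array([2 * (a - c), 4 * c, 4 * d, -2 * b])

theta = np.linspace(0.0, np.pi, 36, endpoint=False)   # 36 wave-plate positions
s_true = np.array([1.0, 0.3, -0.2, 0.5])              # test input state
print(stokes_from_fourier(detected_intensity(s_true, theta), theta))
# -> approximately [1.0, 0.3, -0.2, 0.5]
```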

  5. Optoelectronic polarimeter controlled by a graphical user interface of Matlab

    International Nuclear Information System (INIS)

    Vilardy, J M; Torres, R; Jimenez, C J

    2017-01-01

We show the design and implementation of an optical polarimeter using electronic control. The polarimeter has software with a graphical user interface (GUI) that controls the optoelectronic setup and captures the optical intensity measurements, and finally this software evaluates the Stokes vector of a state of polarization (SOP) by means of synchronous detection of optical waves. The proposed optoelectronic polarimeter can determine the Stokes vector of a SOP in a rapid and efficient way. Using the polarimeter proposed in this paper, students will be able to observe (on an optical bench) and understand the different interactions of the SOP when optical waves pass through linear polarizers and retarder wave plates. The polarimeter prototype could be used as a main tool for students to learn the theoretical and experimental aspects of the SOP of optical waves via Stokes vector measurement. The proposed polarimeter, controlled by a Matlab GUI, is attractive and well suited for teaching and learning about the polarization of optical waves. (paper)
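To illustrate the kind of polarizer and wave-plate interactions the students are meant to observe, the following is a small, generic NumPy sketch that applies textbook Mueller matrices to a Stokes vector. It is not the instrument's Matlab GUI code, and retarder sign conventions vary between references.

```python
import numpy as np

def polarizer(phi):
    """Mueller matrix of an ideal linear polarizer, transmission axis at angle phi."""
    c, s = np.cos(2 * phi), np.sin(2 * phi)
    return 0.5 * np.array([[1, c,     s,     0],
                           [c, c * c, c * s, 0],
                           [s, c * s, s * s, 0],
                           [0, 0,     0,     0]])

def retarder(phi, delta):
    """Mueller matrix of a linear retarder (retardance delta, fast axis at phi).
    One common sign convention; others differ in the sign of the S3 terms."""
    c, s = np.cos(2 * phi), np.sin(2 * phi)
    cd, sd = np.cos(delta), np.sin(delta)
    return np.array([[1, 0,                0,                0],
                     [0, c*c + s*s*cd,     c*s*(1 - cd),    -s*sd],
                     [0, c*s*(1 - cd),     s*s + c*c*cd,     c*sd],
                     [0, s*sd,            -c*sd,             cd]])

S_in = np.array([1.0, 1.0, 0.0, 0.0])           # horizontally polarized light
qwp = retarder(np.deg2rad(45), np.pi / 2)        # quarter-wave plate at 45 degrees
print(qwp @ S_in)                                # -> circular polarization [1, 0, 0, 1]
print(polarizer(np.deg2rad(90)) @ S_in)          # crossed polarizer blocks the beam
```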

  6. Microprocessor Card for Cuban Series polarimeters Laserpol

    International Nuclear Information System (INIS)

    Arista Romeu, E.; Mora Mazorra, W.

    2012-01-01

We present the design of a card based on an 8-bit microprocessor that adds new hardware and software components to the LASERPOL series of polarimeters, allowing new services to be delivered and expanding their possible use in other applications, such as polarimetric detection. Given the limitations of the original card, a series of changes was necessary to address new user requirements and broaden the possible applications of the instruments. The EPROM and RAM capacity was expanded, the memory-map decoder circuit was implemented with a programmable integrated circuit, and a real-time clock with nonvolatile RAM was introduced. These features are exploited to add new functions, such as calibration of the polarimeter by the user from a reference sample or calibration standard, and the addition of time and date stamps to the measurement reports required by industry for quality-control processes. The resulting card, together with the rest of the components, is pin-to-pin compatible with the LASERPOL 101M, 3M and LP4 series polarimeters, which facilitates its incorporation into instruments already operating in industry through 'in situ' replacement of cards from previous models, and extends the possibilities for statistical processing and the precision and accuracy of the instruments. Measurements in industry are improved, resulting in significant savings by eliminating losses of production and raw materials. The reading response speed of the LASERPOL polarimeters and polarimetric detectors is also improved. (Author)

  7. PEM-based polarimeters for industrial applications

    Science.gov (United States)

    Wang, Baoliang

    2010-11-01

A polarimeter is an optical instrument used in the transmissive mode for determining the polarization state of a light beam, or the polarization-altering properties of a sample, such as diattenuation, retardation and depolarization. (Reflective "polarimeters" are typically called ellipsometers.) Polarimeters can, thus, be broadly categorized as either light-measuring polarimeters or sample-measuring polarimeters. A light-measuring polarimeter is also known as a Stokes polarimeter, which measures the polarization state of a light beam as described by the Stokes parameters. A sample-measuring polarimeter is also known as a Mueller polarimeter, which measures the complete set or a subset of the polarization-altering properties of a sample. Polarimeters can also be categorized by whether they measure the complete set of polarization properties. If a Stokes polarimeter measures all four Stokes parameters, it is called a complete Stokes polarimeter; otherwise, an incomplete or special Stokes polarimeter. Similarly, there are complete and incomplete Mueller polarimeters. Nearly all sample-measuring polarimeters are incomplete or special polarimeters, particularly for industrial applications. These special polarimeters bear different names. For example, a circular dichroism spectrometer, which measures the differential absorption between left and right circularly polarized light (ΔA = A_L - A_R), is a special polarimeter for measuring the circular diattenuation of a sample; a linear birefringence measurement system is a special polarimeter for measuring the linear retardation of a sample. Polarimeters have a broad range of applications in both academic research and industrial metrology. Polarimeters are applied to chemistry, biology, physics, astronomy, materials science and many other scientific areas. Polarimeters are used as metrology tools in the semiconductor, fiber telecommunication, flat panel display, pharmaceutical and many other industries. Different branches of

  8. Proton polarimeters for spin transfer experiments

    International Nuclear Information System (INIS)

    McNaughton, M.W.

    1985-01-01

    The design and use of proton polarimeters for spin transfer (Wolfenstein parameter) measurements is discussed. Polarimeters are compared with polarized targets for spin dependent experiments. 32 refs., 4 figs

  9. DAQ system for high energy polarimeter at the LHE, JINR: implementation based on the qdpb (data processing with branchpoints) system

    International Nuclear Information System (INIS)

    Isupov, A.Yu.

    2001-01-01

The implementation of the online data acquisition (DAQ) system for the High Energy Polarimeter (HEP) at the LHE, JINR is described. The HEP DAQ is based on the qdpb system. Software modules specific to this implementation (HEP data- and hardware-dependent) are discussed

  10. Laser-based capillary polarimeter.

    Science.gov (United States)

    Swinney, K; Hankins, J; Bornhop, D J

    1999-01-01

A laser-based capillary polarimeter has been configured to allow for the detection of optically active molecules in capillary tubes with a characteristic inner diameter of 250 μm and a 39-nL sample volume. The simple optical configuration consists of a HeNe laser, polarizing optic, fused-silica capillary, and charge-coupled device (CCD) camera in communication with a laser beam analyzer. The capillary-scale polarimeter is based on the interaction between a polarized laser beam and a capillary tube, which results in a 360 degree fan of scattered light. This array of scattered light contains a set of interference fringes, which respond in a reproducible manner to changes in solute optical activity. The polarimetric utility of the instrument is demonstrated by the analysis of two optically active solutes, R-mandelic acid and D-glucose, in addition to the non-optically active control, glycerol. The polarimetric response of the system is quantifiable, with detection limits of 1.7 x 10(-3) M, or 68 x 10(-12) mol (7 x 10(-9) g).

  11. Intrinsic coincident full-Stokes polarimeter using stacked organic photovoltaics and architectural comparison of polarimeter techniques

    Science.gov (United States)

    Yang, Ruonan; Sen, Pratik; O'Connor, B. T.; Kudenov, M. W.

    2017-08-01

An intrinsic coincident full-Stokes polarimeter is demonstrated by using strain-aligned polymer-based organic photovoltaics (OPVs), which can preferentially absorb certain polarization states of incident light. The photovoltaic-based polarimeter is capable of measuring the four Stokes parameters by cascading four semitransparent OPVs in series along the same optical axis. Two wave plates were incorporated into the system to modulate the S3 Stokes parameter so as to reduce the condition number of the measurement matrix. The model for the full-Stokes polarimeter was established and validated, demonstrating an average RMS error of 0.84%. The optimization, based on minimizing the condition number of the 4-cell OPV design, showed that a condition number of 2.4 is possible. Performance of this in-line polarimeter concept was compared to other polarimeter architectures, including Division of Time (DoT), Division of Amplitude (DoAm), Division of Focal Plane (DoFP), and Division of Aperture (DoA), from a signal-to-noise ratio (SNR) perspective. This in-line polarimeter concept has the potential to enable both high temporal (as compared with a DoT polarimeter) and high spatial resolution (as compared with DoFP and DoA polarimeters). We conclude that the intrinsic design has the same √2 SNR advantage as the DoAm polarimeter, but with greater compactness.
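As a generic illustration of why the condition number of the measurement matrix matters (the analyzer states below are not the OPV instrument's actual ones), this NumPy sketch builds a 4x4 matrix W whose rows are analyzer Stokes vectors, checks its condition number against the theoretical optimum of sqrt(3) for four measurements, and inverts it to recover a test Stokes vector from intensity readings:

```python
import numpy as np

# Four analyzer states at the vertices of a regular tetrahedron on the
# Poincare sphere; this arrangement minimizes the condition number of W.
directions = np.array([[ 1.0,  1.0,  1.0],
                       [ 1.0, -1.0, -1.0],
                       [-1.0,  1.0, -1.0],
                       [-1.0, -1.0,  1.0]]) / np.sqrt(3.0)
W = 0.5 * np.column_stack([np.ones(4), directions])   # rows: 0.5*[1, s_x, s_y, s_z]

print(np.linalg.cond(W))       # ~1.732 = sqrt(3), the optimum for four analyzers

# A larger condition number amplifies intensity noise when solving I = W @ S:
S_true = np.array([1.0, 0.2, -0.4, 0.3])
I_meas = W @ S_true + 1e-3 * np.random.randn(4)        # noisy intensity readings
print(np.linalg.solve(W, I_meas))                      # ~S_true
```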

  12. Hard X-ray Photoelectric Polarimeter

    Data.gov (United States)

    National Aeronautics and Space Administration — Our objective is to determine the gas mixtures and pressures that would enable a sensitive, hard X-ray polarimeter using existing flight components with the goal of...

  13. The Compton polarimeter at ELSA

    International Nuclear Information System (INIS)

    Doll, D.

    1998-06-01

In order to measure the degree of transverse polarization of the stored electron beam in the Electron Stretcher Accelerator ELSA, a Compton polarimeter has been built. The measurement is based on the polarization-dependent cross section for Compton scattering of circularly polarized photons off polarized electrons. Using a high-power laser beam and detecting the scattered photons, a measuring time of two minutes with a statistical error of 5% is expected from numerical simulations. The design and the results of a computer-controlled feedback system to enhance the laser beam stability at the interaction point in ELSA are presented. The detection of the scattered photons is based on a lead converter and a silicon-microstrip detector. The design and test results of the detector module, including readout electronics and computer control, are discussed. (orig.)

  14. Polarimeter for the General Fusion SPECTOR machine

    Energy Technology Data Exchange (ETDEWEB)

    Carle, Patrick, E-mail: patrick.carle@generalfusion.com; Froese, Aaron; Wong, Adrian; Howard, Stephen; O’Shea, Peter; Laberge, Michel [General Fusion, Inc., Burnaby, British Columbia V3N4T5 (Canada)

    2016-11-15

A polarimeter has been designed to measure Faraday rotation and help constrain the plasma safety factor profile, q, on the recently built SPECTOR magnetized target fusion machine at General Fusion. The polarimeter uses two counter-rotating, circularly polarized, 118.8 μm beams to probe the plasma. Grad-Shafranov simulations have been used to investigate the effect of measurement error and chord geometry.

  15. A Burst Chasing X-ray Polarimeter

    Science.gov (United States)

Hill, Joanne; Hill, Joe; Barthelmy, S.; Black, K.; Deines-Jones, P.; Jahoda, K.; Sakamoto, T.; Kaaret, P.; McConnell, M.; Bloser, P.

    2007-01-01

This is a viewgraph presentation discussing the X-ray Polarimeter. Gamma-ray bursts are among the most powerful explosions in the universe and have been detected out to distances of almost 13 billion light years. The exact origin of these energetic explosions is still unknown, but the resulting huge release of energy is thought to create a highly relativistic jet of material and a power-law distribution of electrons. There are several theories describing the origin of the prompt GRB emission that currently cannot be distinguished. Measurements of the linear polarization would provide unique and important constraints on the mechanisms thought to drive these powerful explosions. We present the design of a sensitive and extremely versatile gamma-ray burst polarimeter. The instrument is a photoelectric polarimeter based on a time-projection chamber. The photoelectric time-projection technique combines high sensitivity with broad band-pass and is potentially the most powerful method between 2 and 100 keV, where the photoelectric effect is the dominant interaction process. We present measurements of polarized and unpolarized X-rays obtained with a prototype detector and describe two mission concepts: the Gamma-Ray Burst Polarimeter (GRBP) for the U.S. Naval Academy satellite MidSTAR-2, and the Low Energy Polarimeter (LEP) onboard POET, a broadband polarimetry concept for a small explorer mission.

  16. PoET: Polarimeters for Energetic Transients

    Science.gov (United States)

    McConnell, Mark; Barthelmy, Scott; Hill, Joanne

    2008-01-01

This presentation focuses on PoET (Polarimeters for Energetic Transients): a Small Explorer mission concept proposed to NASA in January 2008. The principal scientific goal of PoET is to measure GRB polarization between 2 and 500 keV. The payload consists of two wide-FoV instruments: a Low Energy Polarimeter (LEP) capable of polarization measurements in the energy range from 2-15 keV, and a high-energy polarimeter (Gamma-Ray Polarimeter Experiment, GRAPE) that will measure polarization in the 60-500 keV energy range. Spectra will be measured from 2 keV up to 1 MeV. The PoET spacecraft provides a zenith-pointed platform for maximizing the exposure to deep space. Spacecraft rotation will provide a means of effectively dealing with systematics in the polarization response. PoET will provide sufficient sensitivity and sky coverage to measure statistically significant polarization for up to 100 GRBs in a two-year mission. Polarization data will also be obtained for solar flares, pulsars and other sources of astronomical interest.

  17. Imaging polarimetry of circumstellar environments with the Extreme Polarimeter

    NARCIS (Netherlands)

    Rodenhuis, M.; Canovas, H.; Jeffers, S.V.; Min, M.; Keller, C.U.

    2010-01-01

Three successful observation campaigns have been conducted with the Extreme Polarimeter, an imaging polarimeter for the study of circumstellar environments in scattered light at visible wavelengths. A contrast ratio between the central star and the circumstellar source of 10^-5 can be achieved with

  18. Cryogenic system for liquid hydrogen polarimeter

    International Nuclear Information System (INIS)

    Kitami, T.; Chiba, M.; Hirabayashi, H.; Ishii, T.; Kato, S.

    1979-01-01

A cryogenic system has been constructed for a liquid hydrogen polarimeter in order to measure the polarization of high-energy protons at the 1.3 GeV electron synchrotron of the Institute for Nuclear Study, University of Tokyo. The system principally consists of a cryogenerator with a cryogenic transfer line, a liquid hydrogen cryostat, and a 14.5 l target container of thin aluminum alloy in which liquid hydrogen is provided for the experiment. The refrigeration capacity is about 54 W at 20.4 K without a target container. (author)

  19. Multichannel far-infrared interferometer/polarimeter

    International Nuclear Information System (INIS)

    Young, P.E.

    1984-01-01

    Studies of the time development of the current density profile in a tokamak plasma have been incomplete due to the lack of adequate, direct measurements of the internal magnetic field. Experimental confirmation of anomalous current penetration during the startup phase of a tokamak discharge and the detailed study of major disruptions and their relation to the presence of tearing modes are needed to complete the understanding of tokamak confinement properties. The major obstacle to studies of the tokamak current density profile has been the lack of a reliable, nonperturbing diagnostic. This dissertation describes a multichannel interferometer/polarimeter that was developed for this purpose

  20. A fast Stokes polarimeter: preliminary design

    Science.gov (United States)

    Vaughn, Israel J.; Alenin, Andrey S.; Tyo, J. Scott

    2017-09-01

Designing polarimetric systems directly in the channel space has provided insight into how to design new types of polarimetric systems, including systems which use carriers in hybrid domains of space, time, or spectrum. Utilizing linear systems theory, we present a full-Stokes imaging polarimeter design which has the potential to operate at half the frame rate of the imaging sensor of the system by utilizing a hybrid spatio-temporal carrier design. The design places channels on the faces and the edges of the Nyquist cube, resulting in the potential for half the Nyquist limit to be achieved, provided that the spatial frequency content of the objects being imaged is bandlimited to less than 0.25 cycles per pixel. If the objects are not spatially bandlimited, then the achievable temporal bandwidth is more difficult to analyze. However, a spatio-temporal tradeoff still exists, allowing for increased temporal bandwidth. We present the design of a "Fast Stokes" polarimeter and some simulated images using this design.

  1. Spectral line polarimetry with a channeled polarimeter.

    Science.gov (United States)

    van Harten, Gerard; Snik, Frans; Rietjens, Jeroen H H; Martijn Smit, J; Keller, Christoph U

    2014-07-01

    Channeled spectropolarimetry or spectral polarization modulation is an accurate technique for measuring the continuum polarization in one shot with no moving parts. We show how a dual-beam implementation also enables spectral line polarimetry at the intrinsic resolution, as in a classic beam-splitting polarimeter. Recording redundant polarization information in the two spectrally modulated beams of a polarizing beam-splitter even provides the possibility to perform a postfacto differential transmission correction that improves the accuracy of the spectral line polarimetry. We perform an error analysis to compare the accuracy of spectral line polarimetry to continuum polarimetry, degraded by a residual dark signal and differential transmission, as well as to quantify the impact of the transmission correction. We demonstrate the new techniques with a blue sky polarization measurement around the oxygen A absorption band using the groundSPEX instrument, yielding a polarization in the deepest part of the band of 0.160±0.010, significantly different from the polarization in the continuum of 0.2284±0.0004. The presented methods are applicable to any dual-beam channeled polarimeter, including implementations for snapshot imaging polarimetry.

  2. Active polarimeter optical system laser hazard analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2005-07-01

A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on the ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers, and the ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium-doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height-adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off of nadir, producing an illuminating spot geometry that can vary from circular (at nadir) to elliptical (off of nadir). The JP Innovations crystal Cr:LiSAF laser parameters are presented in section II. The illuminating laser spot size is variable and can be adjusted by changing the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level. The laser spot is adjusted to a particular size at a particular distance (elevation) from the laser by setting the separation distance (d_diffuser) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operating elevation.

  3. Analysis, optimization and implementation of a variable retardance based polarimeter

    Directory of Open Access Journals (Sweden)

    Moreno I.

    2010-06-01

Full Text Available We present a comprehensive analysis, optimization and implementation of a Stokes polarimeter based on two liquid crystals acting as variable retarders. For the optimization process, the Condition Number and the Equally Weighted Variance indicators are applied and compared as a function of different numbers of polarization analyzers. Moreover, some of the optimized polarimeter configurations are experimentally implemented, and the influence of small experimental deviations from the optimized configuration values on the amplification of the Stokes component error is also studied. Some experimental results obtained using the implemented polarimeters, when measuring different incident states of polarization, are provided.
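The two figures of merit named above are commonly defined from the singular values mu_k of the polarimetric measurement matrix W; the definitions below are the standard ones from the polarimeter-optimization literature and are given here for context rather than quoted from the paper:

```latex
\kappa(W) = \frac{\mu_{\max}}{\mu_{\min}},
\qquad
\mathrm{EWV}(W) = \operatorname{trace}\!\left[(W^{T}W)^{-1}\right]
               = \sum_{k} \frac{1}{\mu_{k}^{2}}
```

Minimizing either quantity reduces the amplification of intensity noise into the reconstructed Stokes components, which is why both are used to choose the analyzer states.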

  4. DAQ systems for the high energy and nuclotron internal target polarimeters with network access to polarization calculation results and raw data

    International Nuclear Information System (INIS)

    Isupov, A.Yu.

    2004-01-01

The online data acquisition (DAQ) system for the Nuclotron Internal Target Polarimeter (ITP) at the LHE, JINR, is described with respect to its design and implementation, based on the distributed data acquisition and processing system qdpb. Software modules specific to this implementation (dependent on the ITP data contents and hardware layout) are discussed briefly in comparison with those for the High Energy Polarimeter (HEP) at the LHE, JINR. User access methods to both the raw data and the results of polarization calculations for the ITP and HEP are discussed

  5. Electro-Optic Imaging Fourier Transform Spectral Polarimeter, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Boulder Nonlinear Systems, Inc. (BNS) proposes to develop an Electro-Optic Imaging Fourier Transform Spectral Polarimeter (E-O IFTSP). The polarimetric system is...

  6. Gamma-Ray Imager Polarimeter for Solar Flares Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose here to develop the Gamma-Ray Imager/Polarimeter for Solar flares (GRIPS), the next-generation instrument for high-energy solar observations. GRIPS will...

  7. MOPTOP: Multi-colour Optimised Optical Polarimeter

    Science.gov (United States)

    Jermak, Helen; Steele, Iain A.; Smith, Robert J.

    Polarimetric measurements are essential for the study of jetted sources associated with black holes, such as γ-ray bursts and blazars. The relativistic jets launched from regions close to the black hole are threaded with magnetic fields, which produce synchrotron emission, and can be studied with polarimetric measurements. The multi-colour, optimised, optical polarimeter (MOPTOP) is a multi-band imaging instrument designed for use on the Liverpool Telescope. By replacing the rotating polaroid with a half wave plate and beam splitter, the instrument utilises twice as much of the incoming beam of light from the telescope compared to its predecessor, Ringo3. MOPTOP also builds on the successful introduction of dichroic mirrors to perform simultaneous multi-waveband polarimetric and photometric analysis in Ringo3, and enhances the sensitivity of the instrument with sCMOS cameras to use all photons as efficiently as possible.

  8. Performance of neutron polarimeter SMART-NPOL

    International Nuclear Information System (INIS)

    Noji, S.; Miki, K.; Yako, K.; Kawabata, T.; Kuboki, H.; Sakai, H.; Sekiguchi, K.; Suda, K.

    2007-01-01

The neutron polarimeter SMART-NPOL has been constructed at the RIKEN Accelerator Research Facility for measuring polarization correlations of proton-neutron systems. The SMART-NPOL system consists of 12 parallel neutron counter planes of two-dimensionally position-sensitive plastic scintillators with a size of 60 x 60 x 3.0 cm^3. Polarimetry measurements were made using the analyzing power of the ¹H(n⃗,n)¹H reaction occurring in the plastic scintillators. The effective analyzing power of SMART-NPOL was measured with polarized neutrons from the zero-degree ⁶Li(d⃗,n⃗) reaction at an incident deuteron energy of 135 MeV/A. The effective analyzing power thus obtained was 0.26 ± 0.01(stat) ± 0.03(syst), and the double-scattering efficiency was 1.1 x 10^-3
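For context, the statistical quality of such a polarimeter is usually summarized by a figure of merit combining the effective analyzing power and the double-scattering efficiency. The relation below is the standard one and is not quoted in the abstract; with the numbers above it gives F of roughly 7 x 10^-5:

```latex
\delta P \simeq \frac{1}{A_{y}^{\mathrm{eff}}\sqrt{\varepsilon\,N}}
        = \frac{1}{\sqrt{F\,N}},
\qquad
F \equiv \varepsilon\,\bigl(A_{y}^{\mathrm{eff}}\bigr)^{2}
```

Here N is the number of neutrons incident on the polarimeter and epsilon is the double-scattering efficiency, so a higher figure of merit F directly reduces the statistical uncertainty on the measured polarization.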

  9. The SLAC E-154 ³He polarimeter

    Energy Technology Data Exchange (ETDEWEB)

Hughes, E.W. [California Institute of Technology, Pasadena, California 91125 (United States); Chupp, T.E.; Coulter, K.P.; Smith, T.B.; Welsh, R. [University of Michigan, Ann Arbor, Michigan 48109 (United States); Thompson, A.K. [National Institute of Standards and Technology, Gaithersburg, Maryland 20899 (United States); Romalis, M.V.; Bogorad, P.L.; Cates, G.D.; Kumar, K.S. [Princeton University, Princeton, New Jersey 08544 (United States); Johnson, J.R. [University of Wisconsin, Madison, Wisconsin 53706 (United States)

    1998-01-01

We describe the NMR and Rb Zeeman frequency shift polarimeters used for determining the ³He polarization in a recent precision measurement of the neutron spin structure function g₁ at SLAC (E-154). We performed a detailed study of the systematic errors associated with the calibration of the NMR polarimeter. A new technique was used for determining the ³He polarization from the frequency shift of the Rb Zeeman resonance. © 1998 American Institute of Physics.

  10. Performance of the PRAXyS X-ray polarimeter

    Energy Technology Data Exchange (ETDEWEB)

    Iwakiri, W.B., E-mail: wataru.iwakiri@riken.jp [RIKEN Nishina Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Black, J.K. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Rock Creek Scientific, 1400 East-West Hwy, Silver Spring, MD 20910 (United States); Cole, R. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Enoto, T. [The Hakubi Center for Advanced Research, Kyoto University, Kyoto 606-8302 (Japan); Department of Astronomy, Kyoto University, Kitashirakawa-Oiwake-cho, Sakyo-ku, Kyoto 606-8502 (Japan); Hayato, A. [RIKEN Nishina Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Hill, J.E.; Jahoda, K. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Kaaret, P. [University of Iowa, Iowa City, IA 52242 (United States); Kitaguchi, T. [Department of Physical Science, Hiroshima University, 1-3-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Kubota, M. [Department of Physics, Tokyo University of Science, 3-1 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan. (Japan); RIKEN Nishina Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Marlowe, H.; McCurdy, R. [University of Iowa, Iowa City, IA 52242 (United States); Takeuchi, Y. [Department of Physics, Tokyo University of Science, 3-1 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan. (Japan); RIKEN Nishina Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Tamagawa, T. [RIKEN Nishina Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Department of Physics, Tokyo University of Science, 3-1 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan. (Japan)

    2016-12-01

    The performance of the Time Projection Chamber (TPC) polarimeter for the Polarimeter for Relativistic Astrophysical X-ray Sources (PRAXyS) Small Explorer was evaluated using polarized and unpolarized X-ray sources. The PRAXyS mission will enable exploration of the universe through X-ray polarimetry in the 2–10 keV energy band. We carried out performance tests of the polarimeter at the Brookhaven National Laboratory, National Synchrotron Light Source (BNL-NSLS) and at NASA's Goddard Space Flight Center. The polarimeter was tested with linearly polarized, monochromatic X-rays at 11 different energies between 2.5 and 8.0 keV. At maximum sensitivity, the measured modulation factors at 2.7, 4.5 and 8.0 keV are 27%, 43% and 59%, respectively and the measured angle of polarization is consistent with the expected value at all energies. Measurements with a broadband, unpolarized X-ray source placed a limit of less than 1% on false polarization in the PRAXyS polarimeter.

  11. The rocket experiment demonstration of a soft x-ray polarimeter (REDSoX Polarimeter)

    Science.gov (United States)

    Marshall, Herman L.; Schulz, Norbert S.; Trowbridge Heine, Sarah N.; Heilmann, Ralf K.; Günther, H. Moritz; Egan, Mark; Hellickson, Tim; Schattenburg, Mark; Chakrabarty, Deepto; Windt, David L.; Gullikson, Eric M.; Ramsey, Brian D.; Weisskopf, Martin; Tagliaferri, Gianpiero; Pareschi, Giovanni; Marscher, Alan; Jorstad, Svetlana

    2017-08-01

The Rocket Experiment Demonstration of a Soft X-ray Polarimeter (REDSoX Polarimeter) is a sounding rocket instrument that can make the first measurement of the linear X-ray polarization of an extragalactic source in the 0.2-0.8 keV band, at levels as low as 10%. We employ multilayer-coated mirrors as Bragg reflectors at the Brewster angle. By matching the dispersion of a spectrometer, using replicated optics from MSFC and critical-angle transmission gratings from MIT, to three laterally graded multilayer mirrors (LGMLs), we achieve polarization modulation factors over 90%. We present a novel arrangement of gratings, designed optimally for the purpose of polarimetry with a converging beam. The entrance aperture is divided into six equal sectors; pairs of blazed gratings from opposite sectors are oriented to disperse to the same LGML. The LGML position angles are at 120 degrees to each other. CCD detectors then measure the intensities of the dispersed spectra after reflection and polarization by the LGMLs, giving the three Stokes parameters needed to determine a source's linear polarization fraction and orientation. A current grant is funding further development to improve the LGMLs. Sample gratings for the project have been fabricated at MIT, and the development team continues to improve them under separate funding. Our technological approach is the basis for a possible orbital mission.
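Schematically, three analyzer channels at position angles 120 degrees apart are exactly enough to solve for the linear Stokes parameters. Assuming an ideal, identical modulation factor m for each channel (the real instrument's channel-by-channel calibration will differ), the relations are:

```latex
I_k = \tfrac{1}{2}\bigl[\,I + m\,(Q\cos 2\phi_k + U\sin 2\phi_k)\bigr],
\qquad \phi_k = \phi_1 + (k-1)\cdot 120^\circ,\quad k=1,2,3
```

```latex
I = \tfrac{2}{3}\sum_k I_k,\qquad
Q = \frac{4}{3m}\sum_k I_k\cos 2\phi_k,\qquad
U = \frac{4}{3m}\sum_k I_k\sin 2\phi_k,\qquad
p = \frac{\sqrt{Q^{2}+U^{2}}}{I},\qquad
\psi = \tfrac{1}{2}\arctan\frac{U}{Q}
```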

  12. Design of a polarimeter for slow e⁺ beams

    CERN Document Server

    Kumita, T; Hamatsu, R; Hirose, M; Hirose, T; Irako, M; Kawasaki, N; Yang, J

    2000-01-01

A polarimeter which utilizes ortho-positronium quenching in a magnetic field is used to measure the polarization of slow positron beams. This polarimeter is employed for a polarization measurement at an e⁺ beam system where the beam is provided from the β⁺ decay of ²⁷Si produced via the ²⁷Al(p,n)²⁷Si reaction caused by proton irradiation. The beam polarization is determined to be 38.4 ± 4.0(statistical) ± 8.7(systematic)%.

  13. A variable energy Moeller polarimeter at the S-DALINAC

    Energy Technology Data Exchange (ETDEWEB)

    Barday, Roman; Enders, Joachim [Institut fuer Kernphysik, TU Darmstadt (Germany); Mueller, Wolfgang; Steiner, Bastian [Institut fuer Theorie Elektromagnetischer Felder, TU Darmstadt (Germany)

    2008-07-01

A coincidence Moeller polarimeter has been designed for both cw and pulsed beams of the Superconducting Darmstadt Linear Accelerator S-DALINAC, where polarized electron beams will become available in 2008. The designed polarimeter covers an energy region between 15 and 130 MeV. The beam polarisation at currents of up to 1 μA is inferred from measurement of the asymmetry in polarized electron-electron scattering from an Fe-Co foil. The influence of the atomic motion of the target electrons on the measured polarisation, the so-called Levchuk effect, was investigated.

  14. The Skylab ten color photoelectric polarimeter. [sky brightness

    Science.gov (United States)

    Weinberg, J. L.; Hahn, R. C.; Sparrow, J. G.

    1975-01-01

    A 10-color photoelectric polarimeter was used during Skylab missions SL-2 and SL-3 to measure sky brightness and polarization associated with zodiacal light, background starlight, and the spacecraft corona. A description is given of the instrument and observing routines together with initial results on the spacecraft corona and polarization of the zodiacal light.

  15. Polarisation at HERA. Reanalysis of the HERA II polarimeter data

    Energy Technology Data Exchange (ETDEWEB)

    Sobloher, B.; Behnke, T.; Olsson, J.; Pitzl, D.; Schmitt, S.; Tomaszewska, J.; Fabbri, R.

    2012-01-15

    In this technical note we briefly present the analysis of the HERA polarimeters (transversal and longitudinal) as of summer 2011. We present the final reanalysis of the TPOL data, and discuss the systematic uncertainties. A procedure to combine and average LPOL and TPOL data is presented. (orig.)

  16. Polarisation at HERA. Reanalysis of the HERA II polarimeter data

    International Nuclear Information System (INIS)

    Sobloher, B.; Behnke, T.; Olsson, J.; Pitzl, D.; Schmitt, S.; Tomaszewska, J.; Fabbri, R.

    2012-01-01

    In this technical note we briefly present the analysis of the HERA polarimeters (transversal and longitudinal) as of summer 2011. We present the final reanalysis of the TPOL data, and discuss the systematic uncertainties. A procedure to combine and average LPOL and TPOL data is presented. (orig.)

  17. Dynamic spectro-polarimeter based on a modified Michelson interferometric scheme.

    Science.gov (United States)

    Dembele, Vamara; Jin, Moonseob; Baek, Byung-Joon; Kim, Daesuk

    2016-06-27

A simple dynamic spectro-polarimeter based on a modified Michelson interferometric scheme is described. The proposed system can extract the spectral Stokes vector of a transmissive anisotropic object. The detailed theoretical background is derived, and experiments are conducted to verify the feasibility of the proposed snapshot spectro-polarimeter. The proposed dynamic spectro-polarimeter enables us to extract a highly accurate spectral Stokes vector of any transmissive anisotropic object at a frame rate of more than 20 Hz.

  18. An upgraded interferometer-polarimeter system for broadband fluctuation measurements

    International Nuclear Information System (INIS)

    Parke, E.; Ding, W. X.; Brower, D. L.; Duff, J.

    2016-01-01

    Measuring high-frequency fluctuations (above tearing mode frequencies) is important for diagnosing instabilities and transport phenomena. The Madison Symmetric Torus interferometer-polarimeter system has been upgraded to utilize improved planar-diode mixer technology. The new mixers reduce phase noise and allow more sensitive measurements of fluctuations at high frequency. Typical polarimeter rms phase noise values of 0.05°–0.07° are obtained with 400 kHz bandwidth. The low phase noise enables the resolution of fluctuations up to 250 kHz for polarimetry and 600 kHz for interferometry. The importance of probe beam alignment for polarimetry is also verified; previously reported tolerances of ≤0.1 mm displacement for equilibrium and tearing mode measurements minimize contamination due to spatial misalignment to within acceptable levels for chords near the magnetic axis.

  19. An upgraded interferometer-polarimeter system for broadband fluctuation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Parke, E., E-mail: eparke@ucla.edu; Ding, W. X.; Brower, D. L. [Department of Physics and Astronomy, University of California Los Angeles, Los Angeles, California 90095 (United States); Duff, J. [Department of Physics, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States)

    2016-11-15

    Measuring high-frequency fluctuations (above tearing mode frequencies) is important for diagnosing instabilities and transport phenomena. The Madison Symmetric Torus interferometer-polarimeter system has been upgraded to utilize improved planar-diode mixer technology. The new mixers reduce phase noise and allow more sensitive measurements of fluctuations at high frequency. Typical polarimeter rms phase noise values of 0.05°–0.07° are obtained with 400 kHz bandwidth. The low phase noise enables the resolution of fluctuations up to 250 kHz for polarimetry and 600 kHz for interferometry. The importance of probe beam alignment for polarimetry is also verified; previously reported tolerances of ≤0.1 mm displacement for equilibrium and tearing mode measurements minimize contamination due to spatial misalignment to within acceptable levels for chords near the magnetic axis.

  20. PHYSICS WITH A FOCAL PLANE PROTON POLARIMETER FOR HALL A AT CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Ron Gilman; F.T. Baker; Louis Bimbot; Ed Brash; Charles Glashausser; Mark Jones; Gerfried Kumbartzki; Sirish Nanda; Charles F. Perdrisat; Vina Punjabi; Ronald Ransome; Paul Rutt

    1994-09-01

A focal plane polarimeter intended for the CEBAF Hall A high resolution hadron spectrometer is under construction at Rutgers University and the College of William and Mary. Experiments with focal plane polarimeters are only now beginning at electron accelerators; they play a prominent role in the list of approved experiments for Hall A. Construction of the polarimeter is in progress; it is expected to be brought to CEBAF in spring 1995. Several coincidence (e,e'p) and singles (γ,p) measurements by the Hall A Collaboration are expected to start in 1996. In this paper we describe the polarimeter and the physics program planned for it.

  1. The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP)

    Science.gov (United States)

Ishikawa, Shin-nosuke; Kano, R.; Kobayashi, K.; Bando, T.; Narukage, N.; Ishikawa, R.; Kubo, M.; Katsukawa, Y.; Suematsu, Y.; Hara, H.

    2014-01-01

To understand the energy release processes in the Sun, including solar flares, it is essential to measure the magnetic field of the solar atmosphere. Magnetic field measurement of the upper layers (upper chromosphere and above) has been technically difficult and is not yet well investigated. The Chromospheric Lyman-Alpha SpectroPolarimeter (CLASP) sounding rocket, to be launched in 2015, will measure the magnetic field of the upper chromosphere and transition region. The proposal has already been selected, and development of the flight components is under way.

  2. Simple modification of Compton polarimeter to redirect synchrotron radiation

    Directory of Open Access Journals (Sweden)

    J. Benesch

    2015-11-01

    Full Text Available Synchrotron radiation produced as an electron beam passes through a bending magnet is a significant source of background in many experiments. Using modeling, we show that simple modifications of the magnet geometry can reduce this background by orders of magnitude in some circumstances. Specifically, we examine possible modifications of the four dipole magnets used in Jefferson Lab’s Hall A Compton polarimeter chicane. This Compton polarimeter has been a crucial part of experiments with polarized beams and the next generation of experiments will utilize increased beam energies, up to 11 GeV, requiring a corresponding increase in Compton dipole field to 1.5 T. In consequence, the synchrotron radiation (SR from the dipole chicane will be greatly increased. Three possible modifications of the chicane dipoles are studied; each design moves about 2% of the integrated bending field to provide a gentle bend in critical regions along the beam trajectory which, in turn, greatly reduces the synchrotron radiation within the acceptance of the Compton polarimeter photon detector. Each of the modifications studied also softens the SR energy spectrum at the detector sufficiently to allow shielding with 5 mm of lead. Simulations show that these designs are each capable of reducing the background signal due to SR by three orders of magnitude. The three designs considered vary in their need for vacuum vessel changes and in their effectiveness.
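The benefit of moving part of the bend into a gentler field follows from the strong dependence of the synchrotron critical energy on the bending radius. The standard accelerator-physics estimate below is given for context and is not a number taken from the article:

```latex
\varepsilon_{c} = \frac{3\hbar c\,\gamma^{3}}{2\rho}
\;\approx\; 2.218\,\frac{E^{3}\,[\mathrm{GeV}]}{\rho\,[\mathrm{m}]}\ \mathrm{keV}
```

Because the critical energy scales as 1/rho at fixed beam energy, a gentle bend in the critical regions yields a much softer photon spectrum, which is what allows the residual radiation to be absorbed by a few millimeters of lead.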

  3. The Polarimeter for Relativistic Astrophysical X-ray Sources

    Science.gov (United States)

    Jahoda, Keith; Kallman, Timothy R.; Kouveliotou, Chryssa; Angelini, Lorella; Black, J. Kevin; Hill, Joanne E.; Jaeger, Theodore; Kaaret, Philip E.; Markwardt, Craig B.; Okajima, Takashi; Petre, Robert; Schnittman, Jeremy; Soong, Yang; Strohmayer, Tod E.; Tamagawa, Toru; Tawara, Yuzuru

    2016-07-01

    The Polarimeter for Relativistic Astrophysical X-ray Sources (PRAXyS) is one of three Small Explorer (SMEX) missions selected by NASA for Phase A study, with a launch date in 2020. The PRAXyS Observatory exploits grazing incidence X-ray mirrors and Time Projection Chamber Polarimeters capable of measuring the linear polarization of cosmic X-ray sources in the 2-10 keV band. PRAXyS combines well-characterized instruments with spacecraft rotation to ensure low systematic errors. The PRAXyS payload is developed at the Goddard Space Flight Center with the Johns Hopkins University Applied Physics Laboratory, University of Iowa, and RIKEN (JAXA) collaborating on the Polarimeter Assembly. The LEOStar-2 spacecraft bus is developed by Orbital ATK, which also supplies the extendable optical bench that enables the Observatory to be compatible with a Pegasus class launch vehicle. A nine month primary mission will provide sensitive observations of multiple black hole and neutron star sources, where theory predicts polarization is a strong diagnostic, as well as exploratory observations of other high energy sources. The primary mission data will be released to the community rapidly and a Guest Observer extended mission will be vigorously proposed.

  4. X-Ray Spectro-Polarimetry with Photoelectric Polarimeters

    Science.gov (United States)

    Strohmayer, T. E.

    2017-01-01

We derive a generalization of forward fitting for X-ray spectroscopy to include linear polarization of X-ray sources, appropriate for the anticipated next generation of space-based photoelectric polarimeters. We show that the inclusion of polarization sensitivity requires joint fitting to three observed spectra, one for each of the Stokes parameters, I(E), U(E), and Q(E). The equations for Stokes I(E) (the total intensity spectrum) are identical to the familiar case with no polarization sensitivity, for which the model-predicted spectrum is obtained by a convolution of the source spectrum, F(E'), with the familiar energy response function, A(E') R(E',E), where A(E') and R(E',E) are the effective area and energy redistribution matrix, respectively. In addition to the energy spectrum, the two new relations for U(E) and Q(E) include the source polarization fraction and position angle versus energy, a(E') and ψ0(E'), respectively, and the model-predicted spectra for these relations are obtained by a convolution with the modulated energy response function, μ(E') A(E') R(E',E), where μ(E') is the energy-dependent modulation fraction that quantifies a polarimeter's angular response to 100% polarized radiation. We present results of simulations with response parameters appropriate for the proposed PRAXyS Small Explorer observatory to illustrate the procedures and methods, and we discuss some aspects of photoelectric polarimeters with relevance to understanding their calibration and operation.
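Written out, the three convolutions described above take the following schematic form (the symbols follow the abstract as reconstructed here; exact normalization conventions, such as factors of 1/2 or the definition of ψ0, may differ in the paper itself):

```latex
I(E) = \int dE'\, A(E')\,R(E',E)\,F(E')
```

```latex
Q(E) = \int dE'\, \mu(E')\,A(E')\,R(E',E)\,a(E')\cos\!\bigl[2\psi_{0}(E')\bigr]\,F(E')
```

```latex
U(E) = \int dE'\, \mu(E')\,A(E')\,R(E',E)\,a(E')\sin\!\bigl[2\psi_{0}(E')\bigr]\,F(E')
```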

  5. A New Cost-Effective Diode Laser Polarimeter Apparatus Constructed by Undergraduate Students

    Science.gov (United States)

Lisboa, Pedro; Sotomayor, João; Ribeiro, Paulo

    2010-01-01

    The construction of a diode laser polarimeter apparatus by undergraduate students is described. The construction of the modular apparatus by undergraduate students gives them an insight into how it works and how the measurement of a physical or chemical property is conducted. The students use the polarimeter to obtain rotation angle values for the…

  6. Calibration of a neutron polarimeter in the 0.2-1.1 GeV region

    International Nuclear Information System (INIS)

    Semenov, A.Yu.; Zhang, W.M.; Madey, R.; Ahmidouch, A.; Anderson, B.D.; Assamagan, K.; Avery, S.; Baldwin, A.R.; Crowell, A.S.; Eden, T.; Manley, D.M.; Markowitz, P.; Milleret, G.; Prout, D.; Reichelt, T.; Semenova, I.A.; Ulmer, P.E.; Voutier, E.; Watson, J.W.; Wells, S.P.

    2006-01-01

We measured the analyzing power and the efficiency of a neutron polarimeter at the Saturne National Laboratory in France with central energies of the neutron beam of 261, 533, 752, 922, and 1057 MeV. This polarimeter was a prototype designed to measure G_E^n, the neutron electric form factor, at the Thomas Jefferson National Accelerator Facility

  7. The 270 MeV deuteron beam polarimeter at the Nuclotron Internal Target Station

    Energy Technology Data Exchange (ETDEWEB)

    Kurilkin, P.K. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Moscow State Institute of Radio-engineering Electronics and Automation (Technical University), Moscow (Russian Federation); Ladygin, V.P., E-mail: vladygin@jinr.ru [Joint Institute for Nuclear Research, Dubna (Russian Federation); Moscow State Institute of Radio-engineering Electronics and Automation (Technical University), Moscow (Russian Federation); Uesaka, T. [Center for Nuclear Study, University of Tokyo, Tokyo 113-0033 (Japan); Suda, K. [RIKEN Nishina Center, Saitama (Japan); Gurchin, Yu.V.; Isupov, A.Yu. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Itoh, K. [Department of Physics, Saitama University, Saitama (Japan); Janek, M. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Physics Department, University of Zilina, 010 26 Zilina (Slovakia); Karachuk, J.-T. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Advanced Research Institute for Electrical Engineering, Bucharest (Romania); Kawabata, T. [Center for Nuclear Study, University of Tokyo, Tokyo 113-0033 (Japan); Khrenov, A.N.; Kiselev, A.S.; Kizka, V.A. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Kliman, J. [Institute of Physics of Slovak Academy of Sciences, Bratislava (Slovakia); Krasnov, V.A.; Livanov, A.N. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Institute for Nuclear Research, Moscow (Russian Federation); Maeda, Y. [Kyushi University, Hakozaki (Japan); Malakhov, A.I. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Matousek, V.; Morhach, M. [Institute of Physics of Slovak Academy of Sciences, Bratislava (Slovakia)

    2011-06-21

A deuteron beam polarimeter has been constructed at the Internal Target Station of the Nuclotron at JINR. The polarimeter is based on spin-asymmetry measurements in d-p elastic scattering at large angles at a deuteron kinetic energy of 270 MeV. It allows the vector and tensor components of the deuteron beam polarization to be measured simultaneously.

  8. A Shoebox Polarimeter: An Inexpensive Analytical Tool for Teachers and Students

    Science.gov (United States)

    Mehta, Akash; Greenbowe, Thomas J.

    2011-01-01

    A polarimeter can determine the optical activity of an organic or inorganic compound by providing information about the optical rotation of plane-polarized light when transmitted through that compound. This "Journal" has reported various construction methods for polarimeters. We report a unique construction using a shoebox, recycled office…

  9. LEAP - A Large Area GRB Polarimeter for the ISS

    Science.gov (United States)

McConnell, Mark L.; Baring, Matthew G.; Bloser, Peter F.; Briggs, Michael Stephen; Connaughton, Valerie; Dwyer, Joseph; Gaskin, Jessica; Grove, J. Eric; Gunji, Shuichi; Hartmann, Dieter; Hayashida, Kiyoshi; Hill, Joanne E.; Kippen, R. Marc; Kishimoto, Shunji; Kishimoto, Yuji; Krizmanic, John F.; Lundman, Christoffer; Mattingly, David; McBreen, Sheila; Meegan, Charles A.; Mihara, Tatehiro; Nakamori, Takeshi; Pearce, Mark; Phlips, Bernard; Preece, Robert D.; Produit, Nicolas; Ryan, James M.; Ryde, Felix; Sakamoto, Takanori; Strickman, Mark Samuel; Sturner, Steven J.; Takahashi, Hiromitsu; Toma, Kenji; Vestrand, W. Thomas; Wilson-Hodge, Colleen A.; Yatsu, Yoichi; Yonetoku, Daisuke; Zhang, Bing

    2017-08-01

The LargE Area burst Polarimeter (LEAP) is a mission concept for a wide-FOV Compton scatter polarimeter instrument that would be mounted as an external payload on the International Space Station (ISS) in 2022. It has recently been proposed as an astrophysics Mission of Opportunity (MoO), with the primary objective of measuring polarization of the prompt emission of Gamma Ray Bursts (GRBs). It will achieve its science objectives with a simple mission design that features a single instrument based entirely on well-established, flight-proven scintillator-photomultiplier tube (PMT) technologies. LEAP will provide GRB polarization measurements from 30-500 keV and GRB spectroscopy from 5 keV up to 5 MeV, and will self-sufficiently provide the source localization that is required for analysis of the polarization data. The instrument consists of 9 independent polarimeter modules and associated electronics. Each module is a 12 x 12 array of independent plastic and CsI(Tl) scintillator elements, each with individual PMT readout, to identify and measure Compton scatter events. It will provide coverage of GRB spectra over a range that includes most values of Ep. With a total geometric scintillator area of 5000 cm2, LEAP will provide a total effective area for polarization (double scatter) events of ~500 cm2. LEAP will trigger on >200 GRBs within its FOV during a two-year mission. At least 120 GRBs will have sufficient counts to enable localization and a minimum detectable polarization (MDP) better than 30%. If GRBs are polarized at levels >50%, as suggested by published results, LEAP will provide definitive polarization measurements on ~100 GRBs. These data will allow LEAP to differentiate between the intrinsic and geometric classes of GRB models and further distinguish between two geometric models at the 95% confidence level. Detailed time-resolved and/or energy-resolved studies will be conducted for the brightest GRBs.

  10. Hard X-ray Imaging Polarimeter for PolariS

    Science.gov (United States)

    Hayashida, Kiyoshi

    2016-07-01

We present the current status of development of hard X-ray imaging polarimeters for the small satellite mission PolariS. The primary aim of PolariS is hard X-ray (10-80 keV) polarimetry of sources brighter than 10 mCrab. Its targets include stellar black holes, neutron stars, supernova remnants, and active galactic nuclei. This aim is achieved with three sets of hard X-ray telescopes and imaging polarimeters installed at their focal planes. The imaging polarimeter consists of two kinds of scintillator pillars (plastic and GSO) and multi-anode photomultiplier tubes (MAPMTs). When an X-ray photon incident on a plastic scintillator undergoes Compton scattering, the recoil electron produces a signal on the corresponding MAPMT pixel, and the scattered X-ray absorbed in the surrounding GSO produces another signal. This provides information on the incident position and the scattering direction; the latter is employed for polarimetry. For 20 keV X-ray incidence, the recoil electron energy is as low as 1 keV. Thus, the performance of this imaging polarimeter is primarily determined by the efficiency with which we can detect the low-level signals of recoil electrons generated in the plastic scintillators. The efficiency could depend on multiple factors, e.g. quenching of light in the scintillators, electronic noise, pedestal error, cross talk of light to adjacent MAPMT pixels, MAPMT dark current, etc. In this paper, we examine these processes experimentally and optimize the event selection algorithm, in which single photo-electron events are selected. We then performed an X-ray (10-80 keV monochromatic polarized beam) irradiation test at a synchrotron facility. The modulation contrast (M) is about 60% in the 15-80 keV range. We succeeded in detecting recoil electrons for 10-80 keV X-ray incidence, though the detection efficiency is lower at the lowest end of the energy range. The expected MDP will also be shown.
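For reference, the minimum detectable polarization (MDP) quoted for such instruments is usually computed from the modulation factor M, the source and background counting rates, and the exposure time via the standard expression below, which is not given in the abstract:

```latex
\mathrm{MDP}_{99\%} = \frac{4.29}{M\,R_{\mathrm{src}}}
\sqrt{\frac{R_{\mathrm{src}} + R_{\mathrm{bkg}}}{T}}
```

Here R_src and R_bkg are the source and background count rates and T is the observing time, so a higher modulation contrast M translates directly into a lower (better) MDP for a given observation.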

  11. Design of an adaptable Stokes polarimeter for exploring chromospheric magnetism

    Science.gov (United States)

    Louis, Rohan E.; Bayanna, A. Raja; Socas Navarro, Héctor

    2017-10-01

The chromosphere is a highly complex and dynamic layer of the Sun that serves as a conduit for mass and energy supply between two very distinct regions of the solar atmosphere, namely the photosphere and corona. Inferring magnetic fields in the chromosphere has thus become an important topic, which can be addressed with large-aperture solar telescopes carrying out highly sensitive polarimetric measurements. In this article, we present a design of a polarimeter for investigating the chromospheric magnetic field. The instrument consists of a number of lenses, two ferro-electric liquid crystals, a Wollaston prism, and a CCD camera. The optical design is similar to that of a commercial zoom lens, which allows a variable f# while maintaining focus and keeping aberrations well within the Airy disc. The optical design of the Adaptable ChRomOspheric POLarimeter (ACROPOL) makes use of off-the-shelf components and is described for the 70 cm Vacuum Tower Telescope and the 1.5 m GREGOR telescope at the Observatorio del Teide, Tenerife, Spain. Our design shows that the optical train can be separated into two units, where the first unit, consisting of a single lens, has to be changed when going from the VTT to the GREGOR configuration. We also discuss the tolerances within which diffraction-limited performance can be achieved with our design.

  12. Liquid Water Cloud Properties During the Polarimeter Definition Experiment (PODEX)

    Science.gov (United States)

    Alexandrov, Mikhail D.; Cairns, Brian; Wasilewski, Andrzei P.; Ackerman, Andrew S.; McGill, Matthew J.; Yorks, John E.; Hlavka, Dennis L.; Platnick, Steven; Arnold, George; Van Diedenhoven, Bastiaan

    2015-01-01

    We present retrievals of water cloud properties from measurements made by the Research Scanning Polarimeter (RSP) during the Polarimeter Definition Experiment (PODEX), held between January 14 and February 6, 2013. The RSP was onboard the high-altitude NASA ER-2 aircraft based at the NASA Dryden Aircraft Operations Facility in Palmdale, California. The retrieved cloud characteristics include the cloud optical thickness and the effective radius and variance of the cloud droplet size distribution derived using a parameter-fitting technique, as well as the complete droplet size distribution function obtained by means of the Rainbow Fourier Transform. Multi-modal size distributions are decomposed into several modes, and the respective effective radii and variances are computed. The methodology used to produce the retrieval dataset is illustrated with the examples of a marine stratocumulus deck off the California coast and stratus/fog over California's Central Valley. In the latter case, the observed bimodal droplet size distributions were attributed to a two-layer cloud structure. All retrieval data are available online from the NASA GISS website.
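
    The effective radius and effective variance quoted above are standard moments of the droplet size distribution. The sketch below uses the generic definitions, not the RSP retrieval code, and the example distribution is invented.

    ```python
    # Minimal sketch: effective radius and effective variance of a droplet size
    # distribution n(r), using the standard moment definitions.
    import numpy as np

    def effective_radius_and_variance(r, n):
        """r: radii (micron); n: droplet size distribution n(r) (arbitrary units)."""
        second = np.trapz(n * r**2, r)                  # second moment (area-weighted)
        r_eff = np.trapz(n * r**3, r) / second
        v_eff = np.trapz(n * (r - r_eff)**2 * r**2, r) / (r_eff**2 * second)
        return r_eff, v_eff

    # Example: a bimodal distribution with modes near 8 and 15 micron
    r = np.linspace(0.1, 40.0, 2000)
    n = np.exp(-0.5 * ((r - 8.0) / 1.5)**2) + 0.4 * np.exp(-0.5 * ((r - 15.0) / 2.5)**2)
    print(effective_radius_and_variance(r, n))
    ```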

  13. SHARP: the SHARC-II polarimeter for CSO

    Science.gov (United States)

    Li, H.; Attard, M.; Dowell, C. D.; Hildebrand, R. H.; Houde, M.; Kirby, L.; Novak, G.; Vaillancourt, J. E.

    2006-06-01

    SHARC-II is a 32 × 12 pixel submillimeter camera that is used with the ten-meter diameter Caltech Submillimeter Observatory (CSO) on Mauna Kea. This camera can be operated at either 350 or 450 microns. We developed a module that is installed at the CSO Nasmyth focus in order to convert SHARC-II into a sensitive imaging polarimeter, which we refer to as "SHARP". SHARP splits the incident beam into two orthogonal polarized beams that are then re-imaged onto different halves of the SHARC-II bolometer array. When this removable polarimetry module is in use, SHARC-II becomes a dual-beam 12 × 12 pixel polarimeter. Sky noise is a significant source of error for submillimeter continuum observations. Because SHARP will simultaneously observe two orthogonal polarization components, we are able to eliminate or greatly reduce this source of error. Here we describe the design of SHARP and report preliminary results of tests and observations carried out during our first two runs at CSO in August 2005 and January 2006.

  14. Unique electron polarimeter analyzing power comparison and precision spin-based energy measurement

    International Nuclear Information System (INIS)

    Joseph Grames; Charles Sinclair; Joseph Mitchell; Eugene Chudakov; Howard Fenker; Arne Freyberger; Douglas Higinbotham; Poelker, B.; Michael Steigerwald; Michael Tiefenback; Christian Cavata; Stephanie Escoffier; Frederic Marie; Thierry Pussieux; Pascal Vernin; Samuel Danagoulian; Kahanawita Dharmawardane; Renee Fatemi; Kyungseon Joo; Markus Zeier; Viktor Gorbenko; Rakhsha Nasseripour; Brian Raue; Riad Suleiman; Benedikt Zihlmann

    2004-01-01

    Precision measurements of the relative analyzing powers of five electron beam polarimeters, based on Compton, Møller, and Mott scattering, have been performed using the CEBAF accelerator at the Thomas Jefferson National Accelerator Facility (Jefferson Laboratory). A Wien filter in the 100 keV beamline of the injector was used to vary the electron spin orientation exiting the injector. High statistical precision measurements of the scattering asymmetry as a function of the spin orientation were made with each polarimeter. Since each polarimeter receives beam with the same magnitude of polarization, these asymmetry measurements permit a high statistical precision comparison of the relative analyzing powers of the five polarimeters. This is the first time a precise comparison of the analyzing powers of Compton, Møller, and Mott scattering polarimeters has been made. Statistically significant disagreements among the values of the beam polarization calculated from the asymmetry measurements made with each polarimeter reveal either errors in the values of the analyzing power, or failure to correctly include all systematic effects. The measurements reported here represent a first step toward understanding the systematic effects of these electron polarimeters. Such studies are necessary to realize high absolute accuracy (ca. 1%) electron polarization measurements, as required for some parity violation measurements planned at Jefferson Laboratory. Finally, a comparison of the values of the spin orientation exiting the injector that provide maximum longitudinal polarization in each experimental hall leads to an independent and very precise (better than 10⁻⁴) absolute measurement of the final electron beam energy.
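
    The spin-based energy determination mentioned in the last sentence rests on the fact that, in a bend of angle θ_bend, an electron's spin precesses relative to its momentum by θ_spin = a γ θ_bend, with a = (g−2)/2. The sketch below is a generic illustration with placeholder numbers, not the Jefferson Lab analysis; it simply inverts this relation for the beam energy.

    ```python
    # Hedged sketch of the spin-precession energy relation: theta_spin = a * gamma * theta_bend,
    # so gamma (and hence the beam energy) follows from the measured precession.
    import math

    A_ELECTRON = 1.15965218e-3      # electron anomalous magnetic moment (g-2)/2
    MC2_MEV = 0.51099895            # electron rest energy in MeV

    def beam_energy_from_precession(theta_spin_rad, theta_bend_rad):
        """Return total beam energy (MeV) from the measured spin precession."""
        gamma = theta_spin_rad / (A_ELECTRON * theta_bend_rad)
        return gamma * MC2_MEV

    # Example: a hypothetical 37.5 rad net spin precession through a 2*pi rad bend
    print(f"{beam_energy_from_precession(37.5, 2 * math.pi):.1f} MeV")
    ```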

  15. Characterizing and Modeling the Noise and Complex Impedance of Feedhorn-Coupled TES Polarimeters

    Science.gov (United States)

    Appel, J. W.; Austermann, J. E.; Beall, J. A.; Becker, D.; Benson, B. A.; Bleem, L. E.; Britton, J.; Chang, C. L.; Carlstrom, J. E.; Cho, H. M.; Crites, A. T.; Essinger-Hileman, T.; Everett, W.; Halverson, N. W.; Henning, J. W.; Hilton, G. C.; Irwin, K. D.; McMahon, J.; Mehl, J.; Meyer, S. S.; Niemack, M. D.; Parker, L. P.; Simon, S. M.; Staggs, S. T.; Visnjic, C.; Yoon, K. W.; Zhao, Y.

    2009-12-01

    We present results from modeling the electrothermal performance of feedhorn-coupled transition edge sensor (TES) polarimeters under development for use in cosmic microwave background (CMB) polarization experiments. Each polarimeter couples radiation from a corrugated feedhorn through a planar orthomode transducer, which transmits power from orthogonal polarization modes to two TES bolometers. We model our TES with two- and three-block thermal architectures. We fit the complex impedance data at multiple points in the TES transition. From the fits, we predict the noise spectra. We present comparisons of these predictions to the data for two TESes on a prototype polarimeter.

  16. The MESA polarimetry chain and the status of its double scattering polarimeter

    International Nuclear Information System (INIS)

    Aulenbacher, K.; Bartolomé, P. Aguar; Molitor, M.; Tioukine, V.

    2013-01-01

    We plan to have two independent polarimetry systems at MESA based on totally different physical processes. The first aims to minimize the systematic uncertainties in double-polarized Møller scattering, which is to be achieved with stored hydrogen atoms in an atomic trap (Hydro-Møller polarimeter). The other relies on the equality of polarizing and analyzing power, which allows the effective analyzing power of a polarimeter to be measured with very high accuracy. Since the status of the Hydro-Møller polarimeter is presented in a separate paper, we concentrate on the double scattering polarimeter in this article.

  17. Passive New UV Polarimeter for Remote Surface and Atmospheric Sensing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Our imaging polarimeter concept makes available for the first time, the passive remote imagery of all four Stokes vector components at UV wavelengths shorter than...

  18. Development of the GEM-TPC X-ray Polarimeter with the Scalable Readout System

    Directory of Open Access Journals (Sweden)

    Kitaguchi Takao

    2018-01-01

    We have developed a gaseous Time Projection Chamber (TPC) containing a single-layered foil of a gas electron multiplier (GEM) to open up a new window on cosmic X-ray polarimetry in the 2–10 keV band. The micro-pattern TPC polarimeter, in combination with the Scalable Readout System produced by the RD51 collaboration, has been built as an engineering model to optimize detector parameters and improve polarimeter sensitivity. The polarimeter was characterized with unpolarized X-rays from an X-ray generator in a laboratory and with polarized X-rays on the BL32B2 beamline at the SPring-8 synchrotron radiation facility. Preliminary results show that the polarimeter has a modulation factor comparable to that of a prototype of the flight polarimeter.

  19. The Hertz/VPM Polarimeter: Design and First Light Observations

    Science.gov (United States)

    Krejny, Megan; Chuss, David; d'Aubigny, Christian Drouet; Golish, Dathon; Houde, Martin; Hui, Howard; Kulesa, Craig; Loewenstein, Robert F.; Moseley, Harvey; Novak, Giles

    2008-01-01

    We present first results of Hertz/VPM, the first submillimeter polarimeter employing the dual Variable-delay Polarization Modulator (dual-VPM). This device differs from previously used polarization modulators in that it operates in translation rather than mechanical rotation. We discuss the basic theory behind this device, and its potential advantages over the commonly used half-wave plate (HWP). The dual-VPM was tested both at the Submillimeter Telescope Observatory (SMTO) and in the lab. In each case we present a detailed description of the setup. We discovered nonideal behavior in the system. This is at least in part due to properties of the VPM wire grids (diameter, spacing) employed in our experiment. Despite this, we found that the dual-VPM system is robust, operating with high efficiency and low instrumental polarization. This device is well suited for air- and space-borne applications.

  20. Interpolation strategies for reducing IFOV artifacts in microgrid polarimeter imagery.

    Science.gov (United States)

    Ratliff, Bradley M; LaCasse, Charles F; Tyo, J Scott

    2009-05-25

    Microgrid polarimeters are composed of an array of micro-polarizing elements overlaid upon an FPA sensor. In the past decade systems have been designed and built in all regions of the optical spectrum. These systems have rugged, compact designs and the ability to obtain a complete set of polarimetric measurements during a single image capture. However, these systems acquire the polarization measurements through spatial modulation and each measurement has a varying instantaneous field-of-view (IFOV). When these measurements are combined to estimate the polarization images, strong edge artifacts are present that severely degrade the estimated polarization imagery. These artifacts can be reduced when interpolation strategies are first applied to the intensity data prior to Stokes vector estimation. Here we formally study IFOV error and the performance of several bilinear interpolation strategies used for reducing it.
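
    As a concrete illustration of the idea, the sketch below (not the authors' algorithm; the 2×2 superpixel layout and channel ordering are assumptions) bilinearly interpolates each microgrid channel to the full focal-plane grid before forming the Stokes images, which is the kind of pre-estimation interpolation the paper studies.

    ```python
    # Hedged sketch: interpolate each sub-sampled microgrid polarization channel to
    # the full FPA grid before Stokes estimation, reducing IFOV mismatch.
    # Assumes a 2x2 superpixel with analyzers at 0/45/90/135 degrees.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def interpolate_channel(raw, row0, col0):
        """Bilinearly interpolate one sub-sampled channel onto the full grid."""
        rows, cols = raw.shape
        sub = raw[row0::2, col0::2]                      # samples of this channel
        rr, cc = np.mgrid[0:rows, 0:cols].astype(float)
        # map full-grid coordinates into the sub-sampled channel's index space
        coords = [(rr - row0) / 2.0, (cc - col0) / 2.0]
        return map_coordinates(sub, coords, order=1, mode='nearest')

    def microgrid_to_stokes(raw):
        """Return (S0, S1, S2) images from a 0/45/90/135 microgrid frame."""
        i0 = interpolate_channel(raw, 0, 0)
        i45 = interpolate_channel(raw, 0, 1)
        i90 = interpolate_channel(raw, 1, 1)
        i135 = interpolate_channel(raw, 1, 0)
        s0 = 0.5 * (i0 + i45 + i90 + i135)
        s1 = i0 - i90
        s2 = i45 - i135
        return s0, s1, s2
    ```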

  1. Rocket Experiment Demonstration of a Soft X-ray Polarimeter

    Science.gov (United States)

    Marshall, Herman

    This proposal is the lead proposal. Boston University will submit, via NSPIRES, a Co-I proposal, per instructions for Suborbital proposals for multiple-award. Our scientific goal of the Rocket Experiment Demonstration of a Soft X-ray Polarimeter (REDSoX Polarimeter) is to make the first measurement of the linear X-ray polarization of an extragalactic source in the 0.2-0.8 keV band. The first flight of the REDSoX Polarimeter would target Mk 421, which is commonly modeled as a highly relativistic jet aimed nearly along the line of sight. Such sources are likely to be polarized at a level of 30-60%, so the goal is to obtain a significant detection even if the polarization is as low as 10%. Significant revisions to the models of jets emanating from black holes at the cores of active galaxies would be required if the polarization fraction is lower than 10%. We employ multilayer-coated mirrors as Bragg reflectors at the Brewster angle. By matching to the dispersion of a spectrometer, one may take advantage of high multilayer reflectivities and achieve polarization modulation factors over 90%. Using replicated foil mirrors from MSFC and gratings made at MIT, we construct a spectrometer that disperses to three laterally graded multilayer mirrors (LGMLs). The lateral grading changes the wavelength of the Bragg peak for 45 degree reflections linearly across the mirror, matching the dispersion of the spectrometer. By dividing the entrance aperture into six equal sectors, pairs of blazed gratings from opposite sectors are oriented to disperse to the same LGML. The position angles of the LGMLs are at 120 degrees to each other. CCD detectors then measure the intensities of the dispersed spectra after reflection and polarization analysis by the LGMLs, giving the three Stokes parameters needed to determine the source polarization. We will rely on components whose performance has been verified in the laboratory or in space. The CCD detectors are based on Chandra and Suzaku heritage. The mirror fabrication team

  2. Use of proportional tubes in a muon polarimeter

    International Nuclear Information System (INIS)

    Kenney, C.J.; Eckhause, M.; Ginkel, J.F.

    1988-01-01

    A prototype muon polarimeter was built to study the feasibility of measuring the positive muon polarization in the decay K_L → μ⁺μ⁻. The system consisted of alternating layers of extruded aluminum gas proportional tubes and polarization-retaining absorber plates of either aluminum or marble. Longitudinally polarized positive muons from the Stopped Muon Channel at the Clinton P. Anderson Meson Physics Facility (LAMPF) were stopped in the absorber plates, where they precessed in a field of 60 gauss. Decay times were recorded in 100 ns first-in-first-out memories for all wires hit during a 12.8 μs period centered about the muon stop trigger. The performance of the system was studied for different beam rates and absorber thicknesses. The value of imposing time and spatial cuts on track data to enhance the precession signal was also investigated. 7 refs., 4 figs., 1 tab

  3. CLASP2: The Chromospheric LAyer Spectro-Polarimeter

    Science.gov (United States)

    Rachmeler, Laurel; McKenzie, David E.; Ishikawa, Ryohko; Trujillo Bueno, Javier; Auchère, Frédéric; Kobayashi, Ken; Winebarger, Amy; Bethge, Christian; Kano, Ryouhei; Kubo, Masahito; Song, Donguk; Narukage, Noriyuki; Ishikawa, Shin-nosuke; De Pontieu, Bart; Carlsson, Mats; Yoshida, Masaki; Belluzzi, Luca; Stepan, Jiri; del Pino Alemán, Tanausú; Ballester, Ernest Alsina; Asensio Ramos, Andres

    2017-08-01

    We present the instrument, science case, and timeline of the CLASP2 sounding rocket mission. The successful CLASP (Chromospheric Lyman-Alpha Spectro-Polarimeter) sounding rocket flight in 2015 resulted in the first-ever linear polarization measurements of the solar hydrogen Lyman-alpha line, which is sensitive to the Hanle effect and can be used to constrain the magnetic field and geometric complexity of the upper chromosphere. Lyman-alpha is one of several upper chromospheric lines that contain magnetic information. In the spring of 2019, we will re-fly the modified CLASP telescope to measure the full Stokes profile of Mg II h & k near 280 nm. This set of lines is sensitive to the upper chromospheric magnetic field via both the Hanle and the Zeeman effects.

  4. Faraday rotation calculations for a FIR polarimeter on ITER

    International Nuclear Information System (INIS)

    Nieswand, C.

    1997-01-01

    The measurement of the safety factor profile has been considered an essential diagnostic for ITER. Without the presence of a neutral beam, the only reliable diagnostic which can at present fulfill the requirements for q-profile determination is polarimetry. This paper presents the results of calculations of the Faraday rotation and the Cotton-Mouton effect for various plasma configurations (considered as typical) and various beam geometries which can eventually be realized in spite of the restricted access. The calculations should help in deciding on the wavelength and on the number and positions of the observation chords of a possible polarimeter system on ITER. The paper does not deal with technical questions concerning the implementation of such a system on ITER. The potential use of internal retro-reflectors or waveguides for the beams is not discussed. (author) 4 figs., 3 refs

  5. Analysis of AGS polarimeter data at G gamma=7.5.

    CERN Document Server

    Huang, H; Spinka, H M; Underwood, D G

    2003-01-01

    Data were collected with the AGS internal polarimeter at G gamma = 7.5 during the recent FY02 polarized proton run. The addition of new forward scintillation counters permitted an absolute calibration of the polarimeter for both nylon and carbon targets. The results are summarized, and the polarization measured at G gamma = 7.5 is compared to that at 200 MeV.
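
    The quantity behind such a calibration is simple: the beam polarization is the measured left-right asymmetry divided by the analyzing power. The snippet below is a minimal illustration with placeholder numbers, not AGS data.

    ```python
    # Minimal illustration: beam polarization from the measured left-right
    # asymmetry and the calibrated analyzing power, P = epsilon / A_N.
    def asymmetry(n_left, n_right):
        return (n_left - n_right) / (n_left + n_right)

    def beam_polarization(n_left, n_right, analyzing_power):
        return asymmetry(n_left, n_right) / analyzing_power

    print(beam_polarization(n_left=52_000, n_right=48_000, analyzing_power=0.05))
    ```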

  6. Calibration of the Gamma-RAy Polarimeter Experiment (GRAPE) at a polarized hard X-ray beam

    International Nuclear Information System (INIS)

    Bloser, P.F.; Legere, J.S.; McConnell, M.L.; Macri, J.R.; Bancroft, C.M.; Connor, T.P.; Ryan, J.M.

    2009-01-01

    The Gamma-RAy Polarimeter Experiment (GRAPE) is a concept for an astronomical hard X-ray Compton polarimeter operating in the 50-500 keV energy band. The instrument has been optimized for wide-field polarization measurements of transient outbursts from energetic astrophysical objects such as gamma-ray bursts and solar flares. The GRAPE instrument is composed of identical modules, each of which consists of an array of scintillator elements read out by a multi-anode photomultiplier tube (MAPMT). Incident photons Compton scatter in plastic scintillator elements and are subsequently absorbed in inorganic scintillator elements; a net polarization signal is revealed by a characteristic asymmetry in the azimuthal scattering angles. We have constructed a prototype GRAPE module containing a single CsI(Na) calorimeter element, at the center of the MAPMT, surrounded by 60 plastic elements. The prototype has been combined with custom readout electronics and software to create a complete 'engineering model' of the GRAPE instrument. This engineering model has been calibrated using a nearly 100% polarized hard X-ray beam at the Advanced Photon Source at Argonne National Laboratory. We find modulation factors of 0.46±0.06 and 0.48±0.03 at 69.5 and 129.5 keV, respectively, in good agreement with Monte Carlo simulations. In this paper we present details of the beam test, data analysis, and simulations, and discuss the implications of our results for the further development of the GRAPE concept.
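
    A measured modulation factor translates into sensitivity through the standard minimum detectable polarization (MDP) expression. The sketch below uses the usual 99%-confidence formula with placeholder rates and exposure, purely as an illustration of how a value such as M ≈ 0.46 would be used; it is not GRAPE analysis code.

    ```python
    # Standard 99%-confidence minimum detectable polarization for a polarimeter
    # with modulation factor M, source rate R_S, and background rate R_B.
    import math

    def mdp99(modulation_factor, source_rate, background_rate, exposure_s):
        """MDP at 99% confidence; rates in counts/s, exposure in seconds."""
        return (4.29 / (modulation_factor * source_rate)
                * math.sqrt((source_rate + background_rate) / exposure_s))

    # Example: M = 0.46, 5 counts/s source, 1 count/s background, 100 ks exposure
    print(f"MDP99 = {100 * mdp99(0.46, 5.0, 1.0, 1.0e5):.2f} %")
    ```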

  7. Super-resolution for imagery from integrated microgrid polarimeters.

    Science.gov (United States)

    Hardie, Russell C; LeMaster, Daniel A; Ratliff, Bradley M

    2011-07-04

    Imagery from microgrid polarimeters is obtained by using a mosaic of pixel-wise micropolarizers on a focal plane array (FPA). Each distinct polarization image is obtained by subsampling the full FPA image. Thus, the effective pixel pitch for each polarization channel is increased and the sampling frequency is decreased. As a result, aliasing artifacts from such undersampling can corrupt the true polarization content of the scene. Here we present the first multi-channel multi-frame super-resolution (SR) algorithms designed specifically for the problem of image restoration in microgrid polarization imagers. These SR algorithms can be used to address aliasing and other degradations, without sacrificing field of view or compromising optical resolution with an anti-aliasing filter. The new SR methods are designed to exploit correlation between the polarimetric channels. One of the new SR algorithms uses a form of regularized least squares and has an iterative solution. The other is based on the faster adaptive Wiener filter SR method. We demonstrate that the new multi-channel SR algorithms are capable of providing significant enhancement of polarimetric imagery and that they outperform their independent channel counterparts.

  8. SPIDER: Probing the Early Universe with a Suborbital Polarimeter

    Science.gov (United States)

    Fraisse, Aurélien A.; SPIDER Collaboration

    2012-01-01

    SPIDER is a balloon-borne polarimeter designed to detect a divergence-free polarization pattern ("B-modes") in the Cosmic Microwave Background (CMB). In the inflationary scenario, the spectrum of the tensor perturbations that generate this signal is proportional to that of the primordial scalar perturbations through the tensor-to-scalar ratio r. The expected level of systematic error in the SPIDER instrument is significantly below the amplitude of an interesting cosmological B-mode signal with r=0.03. An optimized scanning strategy enables us to minimize uncertainty in the reconstruction of the Stokes parameters used to characterize the CMB, while providing access to a relatively wide range of angular scales. In the SPIDER field, the polarized emission from interstellar dust is as bright or brighter than the cosmological r=0.03 B-mode signal at all SPIDER frequencies (90, 150, and 280 GHz), a situation similar to that found in the "Southern Hole." Despite this foreground contamination, two 20-day flights of the SPIDER instrument will constrain the amplitude of the B-mode signal to r ≲ 0.03. SPIDER is supported in the United States by NASA (APRA-NNX07AL64G), the National Science Foundation (ANT-1043515), the Gordon and Betty Moore Foundation, and the David and Lucile Packard Foundation. Support in Canada is provided by NSERC, the Canadian Space Agency, the Canada Foundation for Innovation, and CIFAR.

  9. Design, construction and calibration of a polarimeter for gamma radiation

    International Nuclear Information System (INIS)

    Macchiavelli, A.O.; Marti, G.V.; Gimenez, C.R.; Laffranchi, J.A.; Behar, M.

    1980-01-01

    Information on different nuclear states can be obtained from the analysis of the angular distribution of the emitted gamma radiation. When this information is not sufficient to determine certain relevant parameters, or is ambiguous, a measurement of the linear polarization of the radiation together with the angular distribution allows, in many cases, the ambiguity to be resolved. This, in turn, requires a polarization-sensitive detector: here, a planar Ge(Li) detector with a width d of the compensated zone smaller than its length L (L/d greater than 1), built from a germanium block of square section, 33 mm on a side, compensated with lithium up to 3 mm depth by means of the usual techniques. The detector characteristics, measured with conventional electronics, were: total system resolution (Full Width at Half Maximum) of 2.4 keV, peak-to-Compton ratio of 6/1, and relative efficiency of 0. for γ rays of 1.33 MeV from 60Co. Using γ-γ fast-slow coincidence techniques (Ge(Li)-NaI system), the polarization efficiency curve in the 0.1-1.5 MeV energy range was determined, and a polarization efficiency of approximately 17% was obtained over this energy range. This value is comparable to the results obtained in previous works for polarimeters of similar dimensions and can be used to determine the multipolarity of nuclear states. (M.E.L.)

  10. Lifetime estimation of a time projection chamber x-ray polarimeter

    Science.gov (United States)

    Hill, Joanne E.; Black, J. Kevin; Brieda, Lubos; Dickens, Patsy L.; Montt de Garcia, Kristina; Hawk, Douglas L.; Hayato, Asami; Jahoda, Keith; Mohammed, Jelila

    2013-09-01

    The Gravity and Extreme Magnetism Small Explorer (GEMS) X-ray polarimeter Instrument (XPI) was designed to measure the polarization of 23 sources over the course of its 9 month mission. The XPI design consists of two telescopes each with a polarimeter assembly at the focus of a grazing incidence mirror. To make sensitive polarization measurements the GEMS Polarimeter Assembly (PA) employed a gas detection system based on a Time Projection Chamber (TPC) technique. Gas detectors are inherently at risk of degraded performance arising from contamination from outgassing of internal detector components or due to loss of gas. This paper describes the design and the materials used to build a prototype of the flight polarimeter with the required GEMS lifetime. We report the results from outgassing measurements of the polarimeter subassemblies and assemblies, enclosure seal tests, life tests, and performance tests that demonstrate that the GEMS lifetime is achievable. Finally we report performance measurements and the lifetime enhancement from the use of a getter.

  11. Run05 Proton Beam Polarization Measurements by pC-Polarimeter (ver. 1.1)

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa,I.; Alekseev, I.; Bazilevsky, A.; Bravar, A.; Bunce, G.; Dhawan, S.; Eyser, K.O.; Gill, R.; Haeberli, W.; Huang, H.; Makdisi, Y.; Nass, A.; Okada, H.; Stephenson, E.; Svirida, D.N.; Wise, T.; Wood, J.; Yip, K.; Zelenski, A.

    2008-07-01

    The polarization of the proton beams [1, 2] at the Relativistic Heavy Ion Collider (RHIC) [3] is measured using both an atomic beam source hydrogen gas jet (H-Jet) [4, 5] and proton-carbon (pC) polarimeters. The H-Jet polarimeter is located at the collision point, allowing measurements of both beams; it provides the absolute normalization, requiring about 1-2 days to obtain a ~5% statistical uncertainty (in Run05). Two identical pC-polarimeters are installed in the yellow and blue rings, where the rings are separated. The pC-polarimeter measures the relative polarization to a few percent statistical accuracy within 20 to 30 seconds using an ultra-thin (typically 6-8 μg/cm²) carbon ribbon target, providing fast feedback to beam operations and experiments. Thus, the operation of the carbon polarimeters was focused on better control of the relative stability from one measurement to another rather than on measuring the absolute polarization.

  12. Airborne Polarimeter Intercomparison for the NASA Aerosols-Clouds-Ecosystems (ACE) Mission

    Science.gov (United States)

    Knobelspiesse, Kirk; Redemann, Jens

    2014-01-01

    The Aerosols-Clouds-Ecosystems (ACE) mission, recommended by the National Research Council's Decadal Survey, calls for a multi-angle, multi-spectral polarimeter devoted to observations of atmospheric aerosols and clouds. In preparation for ACE, NASA funds the deployment of airborne polarimeters, including the Airborne Multi-angle SpectroPolarimeter Imager (AirMSPI), the Passive Aerosol and Cloud Suite (PACS) and the Research Scanning Polarimeter (RSP). These instruments have been operated together on NASA's ER-2 high altitude aircraft as part of field campaigns such as the POlarimeter DEfinition EXperiment (PODEX) (California, early 2013) and Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, California and Texas, summer 2013). Our role in these efforts has been to serve as an assessment team performing level 1 (calibrated radiance, polarization) and level 2 (retrieved geophysical parameter) instrument intercomparisons, and to promote unified and generalized calibration, uncertainty assessment and retrieval techniques. We will present our progress in this endeavor thus far and describe upcoming research in 2015.

  13. Progress in Airborne Polarimeter Inter Comparison for the NASA Aerosols-Clouds-Ecosystems (ACE) Mission

    Science.gov (United States)

    Knobelspiesse, Kirk; Redemann, Jens

    2014-01-01

    The Aerosols-Clouds-Ecosystems (ACE) mission, recommended by the National Research Council's Decadal Survey, calls for a multi-angle, multi-spectral polarimeter devoted to observations of atmospheric aerosols and clouds. In preparation for ACE, NASA funds the deployment of airborne polarimeters, including the Airborne Multiangle SpectroPolarimeter Imager (AirMSPI), the Passive Aerosol and Cloud Suite (PACS) and the Research Scanning Polarimeter (RSP). These instruments have been operated together on NASA's ER-2 high altitude aircraft as part of field campaigns such as the POlarimeter DEfinition EXperiment (PODEX) (California, early 2013) and Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, California and Texas, summer 2013). Our role in these efforts has been to serve as an assessment team performing level 1 (calibrated radiance, polarization) and level 2 (retrieved geophysical parameter) instrument intercomparisons, and to promote unified and generalized calibration, uncertainty assessment and retrieval techniques. We will present our progress in this endeavor thus far and describe upcoming research in 2015.

  14. Role of the null space of the DRM in the performance of modulated polarimeters.

    Science.gov (United States)

    LaCasse, Charles F; Tyo, J Scott; Chipman, Russell A

    2012-03-15

    Imaging polarimeters infer the spatial distribution of the polarization state of the optical field as a function of time and/or wavelength. A polarimeter indirectly determines the polarization state by first modulating the intensity of the light field and then demodulating the measured data to infer the polarization parameters. This Letter considers passive Stokes parameter polarimeters and their inversion methods. The most widely used method is the data reduction matrix (DRM), which builds up a matrix equation that can be inverted to find the polarization state from a set of intensity measurements. An alternate strategy uses linear system formulations that allow band-limited reconstruction from a filtering perspective. Here we compare these two strategies for overdetermined polarimeters and find that design of the null space of the inversion operator provides degrees of freedom to optimize the trade-off between accuracy and signal-to-noise ratio. We further describe adaptive filtering techniques that could optimize the reconstruction for a particular experimental configuration. This Letter considers time-varying Stokes parameters, but the methods apply equally to polarimeters that are modulated in space or in wavelength.
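
    A minimal example of the DRM formulation being discussed (a generic ideal-analyzer model, not the Letter's specific systems): stacking analyzer vectors into a measurement matrix W and applying its pseudo-inverse recovers the Stokes parameters from an overdetermined set of intensities.

    ```python
    # Sketch of the data-reduction-matrix approach: each measured intensity is the
    # dot product of an analyzer vector with the Stokes vector; the pseudo-inverse
    # of the stacked measurement matrix is the DRM.
    import numpy as np

    def analyzer_row(theta_deg):
        """Ideal linear analyzer at angle theta: I = 0.5*(S0 + S1*cos2t + S2*sin2t)."""
        t = np.deg2rad(theta_deg)
        return 0.5 * np.array([1.0, np.cos(2 * t), np.sin(2 * t)])

    angles = [0, 30, 60, 90, 120, 150]               # overdetermined: 6 measurements
    W = np.array([analyzer_row(a) for a in angles])  # 6 x 3 measurement matrix
    W_pinv = np.linalg.pinv(W)                       # 3 x 6 data reduction matrix

    s_true = np.array([1.0, 0.3, -0.1])              # S0, S1, S2
    intensities = W @ s_true + 0.002 * np.random.default_rng(1).standard_normal(6)
    print(W_pinv @ intensities)                      # recovered Stokes parameters
    ```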

  15. Using a polarizing film in the manufacture of panoramic Stokes polarimeters at the Main Astronomical Observatory of NAS of Ukraine

    Science.gov (United States)

    Syniavskyi, I.; Ivanov, Yu.; Vidmachenko, A. P.; Sergeev, A.

    2015-08-01

    The construction of an imaging Stokes polarimeter at the MAO NAS of Ukraine is proposed. It allows the three components of the Stokes vector to be measured simultaneously over a large FOV without restrictions on the relative aperture of the system. Moreover, the polarimeter can be converted into a low-resolution spectropolarimeter by placing a transmission diffraction grating into the optical axis.

  16. MHD marking using the MSE polarimeter optics in ILW JET plasmas

    CERN Document Server

    Reyes Cortes, S.; Alves, D.; Baruzzo, M.; Bernardo, J.; Buratti, P.; Coelho, R.; Challis, C.; Chapman, I.; Hawkes, N.; Hender, T.C.; Hobirk, J.; Joffrin, E.

    2016-01-01

    In this communication we propose a novel diagnostic technique, which uses the collection optics of the JET Motional Stark Effect (MSE) diagnostic to perform polarimetry marking of observed MHD in high temperature plasma regimes. To introduce the technique, we first present measurements of the coherence between MSE polarimeter, electron cyclotron emission, and Mirnov coil signals, aiming to show the feasibility of the method. The next step consists of measuring the amplitude fluctuation of the raw MSE polarimeter signals, for each MSE channel, carefully following the MHD frequency in the Mirnov coil data spectrograms. A variety of experimental examples in JET ITER-Like Wall (ILW) plasmas are presented, providing an adequate picture and interpretation of the MSE optics polarimeter technique.
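
    The coherence analysis referred to above can be sketched as follows (placeholder sampling rate and synthetic signals, not JET data): the magnitude-squared coherence between a raw MSE-polarimeter channel and a Mirnov coil signal reveals whether an MHD mode appears in both.

    ```python
    # Hedged sketch: coherence between a raw polarimeter channel and a Mirnov coil
    # signal, peaking at the frequency of a shared MHD mode.
    import numpy as np
    from scipy.signal import coherence

    fs = 250_000.0                                   # sample rate (Hz), assumed
    t = np.arange(0, 0.2, 1 / fs)
    rng = np.random.default_rng(2)
    mode = np.sin(2 * np.pi * 8_000 * t)             # a hypothetical 8 kHz MHD mode
    mirnov = mode + 0.5 * rng.standard_normal(t.size)
    mse_raw = 0.2 * mode + rng.standard_normal(t.size)

    f, coh = coherence(mirnov, mse_raw, fs=fs, nperseg=4096)
    print(f"peak coherence {coh.max():.2f} at {f[np.argmax(coh)] / 1e3:.1f} kHz")
    ```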

  17. Origins Space Telescope: The Far Infrared Imager and Polarimeter FIP

    Science.gov (United States)

    Staguhn, Johannes G.; Chuss, David; Howard, Joseph; Meixner, Margaret; Vieira, Joaquin; Amatucci, Edward; Bradley, Damon; Carter, Ruth; Cooray, Asantha; Flores, Anel; Leisawitz, David; Moseley, Samuel Harvey; Wollack, Edward; Origins Space Telescope Study Team

    2018-01-01

    The Origins Space Telescope (OST)* is the mission concept for the Far-Infrared Surveyor, one of the four science and technology definition studies commissioned by NASA Headquarters for the 2020 Astronomy and Astrophysics Decadal Survey. The current "Concept 1", which envisions a cold (4 K) 9 m space telescope, includes 5 instruments providing wavelength coverage from 6 μm to 667 μm. The achievable sensitivity of the observatory will provide three to four orders of magnitude of improvement over current observational capabilities, allowing a wide range of new and so far inaccessible scientific questions to be addressed, ranging from bio-signatures on exoplanets to mapping primordial H_2 from the "dark ages" before the universe went through the phase of re-ionization. Here we present the Far Infrared Imager and Polarimeter (FIP) for OST. The camera will cover four bands: 40 μm, 80 μm, 120 μm, and 240 μm. It will allow for differential polarimetry in those bands, with the ability to observe two colors simultaneously in polarimetry mode, while all four bands can be observed simultaneously in total power mode. While the confusion limit will be reached in only 32 ms at 240 μm, at 40 μm the source density on the sky is so low that, at OST's angular resolution of 1" at this wavelength, there will be no source confusion even for the longest integration times. Science topics that can be addressed by FIP include, but are not limited to, galactic and extragalactic magnetic field studies, deep galaxy surveys, and outer Solar System objects. *Origins will enable flagship-quality general observing programs led by the astronomical community in the 2030s. We welcome you to contact the Science and Technology Definition Team (STDT) with your science needs and ideas by emailing us at ost_info@lists.ipac.caltech.edu

  18. Measurement of the nuclear polarization of hydrogen and deuterium molecules using a Lamb-shift polarimeter

    Energy Technology Data Exchange (ETDEWEB)

    Engels, Ralf, E-mail: r.w.engels@fz-juelich.de; Gorski, Robert; Grigoryev, Kiril; Mikirtychyants, Maxim; Rathmann, Frank; Seyfarth, Hellmut; Ströher, Hans; Weiss, Philipp [Institut für Kernphysik, Forschungszentrum Jülich, Wilhelm-Johnen-Str. 1, 52428 Jülich (Germany); Kochenda, Leonid; Kravtsov, Peter; Trofimov, Viktor; Tschernov, Nikolay; Vasilyev, Alexander; Vznuzdaev, Marat [Laboratory of Cryogenic and Superconductive Technique, Petersburg Nuclear Physics Institute, Orlova Roscha 1, 188300 Gatchina (Russian Federation); Schieck, Hans Paetz gen. [Institut für Kernphysik, Universität zu Köln, Zülpicher Str. 77, 50937 Köln (Germany)

    2014-10-15

    Lamb-shift polarimeters are used to measure the nuclear polarization of protons and deuterons at energies of a few keV. In combination with an ionizer, the polarization of hydrogen and deuterium atoms was determined after taking into account the loss of polarization during the ionization process. The present work shows that the nuclear polarization of hydrogen or deuterium molecules can be measured as well, by ionizing the molecules and injecting the H{sub 2}{sup +} (or D{sub 2}{sup +}) ions into the Lamb-shift polarimeter.

  19. Multilayer based soft-x-ray polarimeter at MAX IV Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Grizolli, Walan; Laksman, Joakim; Hennies, Franz; Jensen, Brian Norsk; Nyholm, Ralf; Sankari, Rami, E-mail: rami.sankari@maxlab.lu.se [MAX IV Laboratory, P.O. Box 118, SE-22100 Lund (Sweden)

    2016-02-15

    A high precision five rotation-axes polarimeter using transmission multilayers as polarizers and reflection multilayers as analyzers has been designed and manufactured. To cover the extreme ultraviolet regime, Mo/Si, Cr/C, Sc/Cr, and W/B4C multilayers for transmission and reflection have also been designed and produced. The polarimeter mechanics is supported on a hexapod to simplify the alignment relative to the photon beam. The instrument is designed so that it can easily be transferred between different beamlines.

  20. SOLPOL: A Solar Polarimeter for Hard X-Rays and Gamma-Rays

    Science.gov (United States)

    McConnell, Michael L.

    1999-01-01

    The goal of this project was to continue the development of a hard X-ray polarimeter for studying solar flares. In earlier work (funded by a previous SR&T grant), we had already achieved several goals, including the following: 1) development of a means of producing a polarized radiation source in the lab that could be used for prototype development; 2) demonstration of the basic Compton scatter polarimeter concept using a simple laboratory setup; 3) use of the laboratory results to verify our Monte Carlo simulations; and 4) investigation of various detector technologies that could be incorporated into the polarimeter design. For the current one-year program, we wanted to fabricate and test a laboratory science model based on our SOLPOL (Solar Polarimeter) design. The long-term goal of this effort is to develop and test a prototype design that could be used to study flare emissions from either a balloon- or space-borne platform. The current program has achieved its goal of fabricating and testing a science model of the SOLPOL design, although additional testing of the design (and detailed comparison with Monte Carlo simulations) is still desired. This one-year program was extended by six months (no-cost extension) to cover the summer of 1999, when undergraduate student support was available to complete some of the laboratory testing.

  1. The ZIMPOL high-contrast imaging polarimeter for SPHERE: design, manufacturing, and testing

    NARCIS (Netherlands)

    Roelfsema, R.; Schmid, H.M.; Pragt, J.; Gisler, D.; Waters, R.; Bazzon, A.; Baruffolo, A.; Beuzit, J.-L.; Boccaletti, A.; Charton, J.; Cumani, C.; Dohlen, K.; Downing, M.; Elswijk, E.; Feldt, M.; Groothuis, C.; de Haan, M.; Hanenburg, H.; Hubin, N.; Joos, F.; Kasper, M.; Keller, C.; Kragt, J.; Lizon, J.-L.; Mouillet, D.; Pavlov, A.; Rigal, F.; Rochat, S.; Salasnich, B.; Steiner, P.; Thalmann, C.; Venema, L.; Wildi, F.

    2010-01-01

    ZIMPOL is the high contrast imaging polarimeter subsystem of the ESO SPHERE instrument. ZIMPOL is dedicated to detect the very faint reflected and hence polarized visible light from extrasolar planets. ZIMPOL is located behind an extreme AO system (SAXO) and a stellar coronagraph. SPHERE is foreseen

  2. SHARP: The SHARC-II polarimeter at the Caltech Submillimeter Observatory

    Science.gov (United States)

    Vaillancourt, John E.; Attard, M.; Dowell, C. D.; Hildebrand, R. H.; Houde, M.; Kirby, L.; Krejny, M.; Li, H.; Novak, G.; Shinnaga, H.

    2006-12-01

    The Submillimeter High Angular Resolution Camera II (SHARC-II) is a 12 × 32 pixel camera used with the 10 meter diameter Caltech Submillimeter Observatory (CSO). We have deployed an optics module between the telescope and camera which converts SHARC-II into a sensitive imaging polarimeter, "SHARP." The camera and polarimeter currently operate at wavelengths of 350 and 450 μm; we are planning an additional passband at 620 μm. The incident beam is split into two orthogonally polarized components by the SHARP optics module and re-imaged onto opposite ends of the SHARC-II array. The result is a dual-beam 12 × 12 pixel polarimeter. The modular nature of the optics design allows the user to easily switch between polarimeter and camera modes during a single observing session. Here we review the optical design of SHARP, report on the instrument's performance, and review our data reduction methodology. SHARP will be used to study the magnetic field structure and dust emission properties in young stellar objects, Galactic clouds, and external galaxies. We present the first polarimetric maps of celestial sources made from SHARP observations and compare them to previous results. This work has been supported by NSF grants AST 02-41356 and 05-05230 to Northwestern University and 05-05124 to the University of Chicago.
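
    The sky-noise rejection of the dual-beam design can be seen in a toy model: both orthogonally polarized beams see the same atmospheric fluctuation, so the normalized difference cancels it while retaining the source polarization. The numbers below are invented for illustration and do not describe SHARP data.

    ```python
    # Toy model of dual-beam sky-noise rejection: the common-mode sky fluctuation
    # cancels in the normalized difference of the two polarized beams.
    import numpy as np

    rng = np.random.default_rng(3)
    sky = 1.0 + 0.05 * rng.standard_normal(10_000)   # common-mode sky fluctuation
    source_h, source_v = 1.02, 0.98                  # slightly polarized source
    beam_h = sky * source_h                          # one half of the array
    beam_v = sky * source_v                          # the other half
    q = (beam_h - beam_v) / (beam_h + beam_v)        # sky term cancels in the ratio
    print(q.mean(), q.std())                         # ~0.02 with essentially no scatter
    ```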

  3. A CAMAC-resident microprocessor for the monitoring of polarimeter spin states

    International Nuclear Information System (INIS)

    Reid, D.; DuPlantis, D.; Yoder, N.; Dale, D.

    1992-01-01

    A CAMAC module for the reporting of polarimeter spin states is being developed using a resident microcontroller. The module will allow experimenters at the Indiana University Cyclotron Facility to monitor spin states and correlate spin information with other experimental data. The use of a microprocessor allows for adaptation of the module as new requirements ensue without change to the printed circuit board layout. (author)

  4. Exploring a possible origin of a 14 deg y-normal spin tilt at RHIC polarimeter

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-06-15

    A possible origin of a 14 deg y-normal spin n₀ tilt at the polarimeter is in snake angle defects. This possible cause is investigated by scanning the snake axis angle μ and the spin rotation angle at the snake, φ, in the vicinity of their nominal values.

  5. Development of a hard x-ray polarimeter for gamma-ray bursts

    International Nuclear Information System (INIS)

    McConnell, M. L.; Forrest, D. J.; Macri, J.; Ryan, J. M.; Vestrand, W. T.

    1998-01-01

    We describe recent work on the development of a Compton scatter polarimeter for measuring the polarization of hard X-rays (100-300 keV) from astrophysical sources. Results from measurements with a laboratory prototype are summarized, along with comparisons to Monte Carlo simulations. We also present a new design concept that envisions a complete polarimeter module on the front end of a 5-inch position-sensitive PMT. Although the emphasis of our effort is measuring hard X-ray polarization in solar flares, our design has the advantage that it is sensitive over a rather large FoV (>1 sr), a feature that makes the design especially attractive for γ-ray burst studies

  6. High-precision soft x-ray polarimeter at Diamond Light Source.

    Science.gov (United States)

    Wang, H; Dhesi, S S; Maccherozzi, F; Cavill, S; Shepherd, E; Yuan, F; Deshmukh, R; Scott, S; van der Laan, G; Sawhney, K J S

    2011-12-01

    The development and performance of a high-precision polarimeter for the polarization analysis in the soft x-ray region is presented. This versatile, high-vacuum compatible instrument is supported on a hexapod to simplify the alignment with a resolution less than 5 μrad, and can be moved with its own independent control system easily between different beamlines and synchrotron facilities. The polarimeter can also be used for the characterization of reflection and transmission properties of optical elements. A W/B(4)C multilayer phase retarder was used to characterize the polarization state up to 1200 eV. A fast and accurate alignment procedure was developed, and complete polarization analysis of the APPLE II undulator at 712 eV has been performed.

  7. Non-uniformity calibration for MWIR polarization imagery obtained with integrated microgrid polarimeters

    Science.gov (United States)

    Liu, Hai-Zheng; Shi, Ze-Lin; Feng, Bin; Hui, Bin; Zhao, Yao-Hong

    2016-03-01

    Integrating microgrid polarimeters on the focal plane array (FPA) of an infrared detector causes non-uniformity of the polarization response. In order to reduce the effect of polarization non-uniformity, this paper constructs an experimental setup for capturing raw flat-field images and proposes a procedure for acquiring a non-uniformity calibration (NUC) matrix and calibrating raw polarization images. The proposed procedure treats the incident radiation as a polarization vector and derives a calibration matrix for each pixel. Both our matrix calibration and a two-point calibration are applied to our mid-wavelength infrared (MWIR) polarization imaging system with integrated microgrid polarimeters. Compared with the two-point calibration, our matrix calibration reduces non-uniformity by 30-40% in flat-field tests with polarized input. An outdoor scene observation experiment indicates that our calibration can effectively reduce polarization non-uniformity and improve the image quality of our MWIR polarization imaging system.
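
    The per-pixel matrix calibration concept can be sketched as follows (a simplified model, not the authors' NUC procedure; the 4×3 response matrix and gain errors are invented): each superpixel's measured channel vector is mapped back to Stokes parameters through the pseudo-inverse of its own calibration matrix, removing response non-uniformity that a scalar two-point correction cannot capture.

    ```python
    # Simplified per-pixel matrix calibration: invert each superpixel's measured
    # response matrix to recover the Stokes vector from the four channel readings.
    import numpy as np

    def calibrate_superpixel(raw_intensities, calib_matrix):
        """raw_intensities: measured 4-vector (0/45/90/135 channels);
        calib_matrix: 4x3 flat-field response mapping (S0, S1, S2) to channels."""
        return np.linalg.pinv(calib_matrix) @ raw_intensities  # estimated Stokes

    # Example with a slightly non-ideal (hypothetical) pixel response
    ideal = 0.5 * np.array([[1, 1, 0], [1, 0, 1], [1, -1, 0], [1, 0, -1]], float)
    nonuniform = ideal * np.array([[1.05], [0.97], [1.02], [0.94]])  # gain errors
    stokes_in = np.array([1.0, 0.2, -0.1])
    raw = nonuniform @ stokes_in
    print(calibrate_superpixel(raw, nonuniform))  # recovers stokes_in
    ```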

  8. Design and Deployment of a Multichroic Polarimeter Array on the Atacama Cosmology Telescope

    Science.gov (United States)

    Datta, R.; Austermann, J.; Beall, J. A.; Becker, D.; Coughlin, K. P.; Duff, S. M.; Gallardo, P.A.; Grace, E.; Hasselfield, M.; Henderson, S. W.

    2016-01-01

    We present the design and the preliminary on-sky performance with respect to beams and pass bands of a multichroic polarimeter array covering the 90 and 146 GHz cosmic microwave background bands and its enabling broad-band optical system recently deployed on the Atacama Cosmology Telescope (ACT). The constituent pixels are feedhorn-coupled multichroic polarimeters fabricated at NIST. This array is coupled to the ACT telescope via a set of three silicon lenses incorporating novel broad-band metamaterial anti-reflection coatings. This receiver represents the first multichroic detector array deployed for a CMB experiment and paves the way for the extensive use of multichroic detectors and broad-band optical systems in the next generation of CMB experiments.

  9. A novel comparison of Møller and Compton electron-beam polarimeters

    Directory of Open Access Journals (Sweden)

    J.A. Magee

    2017-03-01

    We have performed a novel comparison between electron-beam polarimeters based on Møller and Compton scattering. A sequence of electron-beam polarization measurements were performed at low beam currents (<5 μA) during the Qweak experiment in Hall C at Jefferson Lab. These low current measurements were bracketed by the regular high current (180 μA) operation of the Compton polarimeter. All measurements were found to be consistent within experimental uncertainties of 1% or less, demonstrating that electron polarization does not depend significantly on the beam current. This result lends confidence to the common practice of applying Møller measurements made at low beam currents to physics experiments performed at higher beam currents. The agreement between two polarimetry techniques based on independent physical processes sets an important benchmark for future precision asymmetry measurements that require sub-1% precision in polarimetry.

  10. Radiometric and Polarimetric Accuracy Assessment and Calibration of the Hyper-Angular Rainbow Polarimeter (HARP) Instrument

    Science.gov (United States)

    McBride, B.; Martins, J. V.; Fernandez Borda, R. A.; Barbosa, H. M.

    2017-12-01

    The Laboratory for Aerosols, Clouds, and Optics (LACO) at the University of Maryland, Baltimore County (UMBC) presents a novel, wide-FOV, hyper-angular imaging polarimeter for the microphysical sampling of clouds and aerosols from aircraft and space. The instrument, the Hyper-Angular Rainbow Polarimeter (HARP), is a precursor to the multi-angle imaging polarimeter solicited by the upcoming NASA Aerosols, Clouds, and Ecosystems (ACE) mission. HARP currently operates in two forms: a spaceborne CubeSat slated for a January 2018 launch to the ISS orbit, and an identical aircraft platform that participated in the Lake Michigan Ozone Study (LMOS) and Aerosol Characterization from Polarimeter and Lidar (ACEPOL) NASA campaigns in 2017. To ensure and validate the instrument's ability to produce high quality Level 2 cloud and aerosol microphysical products, a comprehensive calibration scheme that accounts for flat-fielding, radiometry, and all optical interference processes that contribute to the retrieval of Stokes parameters I, Q, and U is applied across the entirety of HARP's 114° FOV. We present an innovative calibration algorithm that convolves incident polarization from a linear polarization state generator with intensity information observed at three distinct linear polarizations. The retrieved results are pixel-level, modified Mueller matrices that characterize the entire HARP optical assembly, without the need to characterize every individual element or perform ellipsometric studies. Here we show results from several pre- and post-LMOS-campaign radiometric calibrations at NASA GSFC and polarimetric calibration using a "polarization dome" that allows for full-FOV characterization of Stokes parameters I, Q, and U. The polarization calibration is verified by passing unpolarized light through partially-polarized, tilted glass plates with a well-characterized degree of linear polarization (DoLP). We apply this calibration to a stratocumulus cloud deck case observed
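
    The degree and angle of linear polarization used in the verification step are simple functions of the Stokes parameters; the generic snippet below (not HARP calibration code) computes them.

    ```python
    # Generic degree and angle of linear polarization from Stokes I, Q, U.
    import numpy as np

    def dolp_aolp(i, q, u):
        """Return (DoLP, AoLP in degrees) for Stokes parameters I, Q, U."""
        dolp = np.sqrt(q**2 + u**2) / i
        aolp = 0.5 * np.degrees(np.arctan2(u, q))
        return dolp, aolp

    print(dolp_aolp(1.0, 0.12, -0.05))
    ```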

  11. The Cosmology Large Angular Scale Surveyor (CLASS): 38 GHz Detector Array of Bolometric Polarimeters

    Science.gov (United States)

    Appel, John W.; Ali, Aamir; Amiri, Mandana; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; Colazo, Felipe

    2014-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) experiment aims to map the polarization of the Cosmic Microwave Background (CMB) at angular scales larger than a few degrees. Operating from Cerro Toco in the Atacama Desert of Chile, it will observe over 65% of the sky at 38, 93, 148, and 217 GHz. In this paper we discuss the design, construction, and characterization of the CLASS 38 GHz detector focal plane, the first ever Q-band bolometric polarimeter array.

  12. The multichannel triple-laser interferometer/polarimeter system at RTP

    NARCIS (Netherlands)

    Rommers, J. H.; Donne, A. J. H.; Karelse, F. A.; Howard, J.

    1997-01-01

    A 19-channel combined interferometer and polarimeter system has recently become operational at the Rijnhuizen Tokamak Project (a = 0.164 m, R = 0.72 m, B-tor less than or equal to 2.5 T, I-p less than or equal to 150 kA, plasma pulse duration less than or equal to 500 ms), in order to determine the

  13. POLARIMETER: A Soft X-Ray 8-Axis UHV-Diffractometer at BESSY II

    Directory of Open Access Journals (Sweden)

    Andrey Sokolov

    2016-11-01

    A versatile UHV polarimeter for the EUV/XUV spectral range is described which incorporates two optical elements: a phase retarder and a reflection analyzer. Both optics are azimuthally rotatable around the incident synchrotron radiation beam, and the incidence angle is freely selectable. This allows for a variety of reflectometry, polarimetry and ellipsometry applications on magnetic or non-magnetic samples and multilayer optical elements.

  14. Instrumentations in x-ray plasma polarization spectroscopy. Crystal spectrometer, polarimeter and detectors for astronomical observations

    Energy Technology Data Exchange (ETDEWEB)

    Baronova, Elena O.; Stepanenko, Mikhail M. [RRC Kurchatov Institute, Nuclear Fusion Institute, Moscow (Russian Federation); Jakubowski, Lech [Soltan Institute for Nuclear Studies, Swierk-Otwock (Poland); Tsunemi, Hiroshi [Osaka Univ., Graduate School of Science, Osaka (Japan)

    2002-08-01

    This report discusses the various problems which are encountered when a crystal spectrometer is used for the purpose of observing polarized x-ray lines. A polarimeter is proposed based on the novel idea of using two series of equivalent atomic planes in a single crystal. The present status of the astronomical x-ray detection techniques are described with emphasis on two dimensional detectors which are polarization sensitive. (author)

  15. POMME: A medium energy deuteron polarimeter based on semi-inclusive d-carbon scattering

    International Nuclear Information System (INIS)

    Bonin, B.; Boudard, A.; Fanet, H.; Fergerson, R.W.; Garcon, M.; Giorgetti, C.; Habault, J.; Le Meur, J.; Lombard, R.M.; Lugol, J.C.; Mayer, B.; Mouly, J.P.; Tomasi-Gustafsson, E.; Morlet, M.; Wiele, J. van de; Willis, A.; Greeniaus, G.; British Columbia Univ., Vancouver; Gaillard, G.; Markowitz, P.; Perdrisat, C.F.; Abegg, R.; Hutcheon, D.A.

    1990-01-01

    POMME is the first calibrated deuteron polarimeter using a d + carbon semi-inclusive scattering reaction. We present the results of its calibration in the region T_d = 150-700 MeV, with the polarized deuteron beam from the synchrotron Saturne. A parametrization of the measured analyzing powers, and a discussion of the obtained efficiency and figure of merit, are also given. (orig.)

  16. Development of a Hydrogen Møller Polarimeter for Precision Parity-Violating Electron Scattering

    Science.gov (United States)

    Gray, Valerie M.

    2013-10-01

    Parity-violating electron scattering experiments allow for testing the Standard Model at low energy accelerators. Future parity-violating electron scattering experiments, like the P2 experiment at the Johannes Gutenberg University, Mainz, Germany, and the MOLLER and SoLID experiments at Jefferson Lab will measure observables predicted by the Standard Model to high precision. In order to make these measurements, we will need to determine the polarization of the electron beam to sub-percent precision. The present way of measuring the polarization, with Møller scattering in iron foils or using Compton laser backscattering, will not easily be able to reach this precision. The novel Hydrogen Møller Polarimeter presents a non-invasive way to measure the electron polarization by scattering the electron beam off of atomic hydrogen gas polarized in a 7 Tesla solenoidal magnetic trap. This apparatus is expected to be operational by 2016 in Mainz. Currently, simulations of the polarimeter are used to develop the detection system at College of William & Mary, while the hydrogen trap and superconducting solenoid magnet are being developed at the Johannes Gutenberg University, Mainz. I will discuss the progress of the design and development of this novel polarimeter system. This material is based upon work supported by the National Science Foundation under Grant No. PHY-1206053.

  17. Silicon photomultipliers as readout elements for a Compton effect polarimeter: the COMPASS project

    CERN Document Server

    Del Monte, E; Brandonisio, A; Muleri, F; Soffitta, P; Costa, E; di Persio, G; Cosimo, S Di; Massaro, E; Morbidini, A; Morelli, E; Pacciani, L; Fabiani, S; Michilli, D; Giarrusso, S; Catalano, O; Impiombato, D; Mineo, T; Sottile, G; Billotta, S

    2016-01-01

    COMpton Polarimeter with Avalanche Silicon readout (COMPASS) is a research and development project that aims to measure the polarization of X-ray photons through Compton Scattering. The measurement is obtained by using a set of small rods of fast scintillation materials with both low-Z (as active scatterer) and high-Z (as absorber), all read-out with Silicon Photomultipliers. By this method we can operate scattering and absorbing elements in coincidence, in order to reduce the background. In the laboratory we are characterising the SiPMs using different types of scintillators and we are optimising the performances in terms of energy resolution, energy threshold and photon tagging efficiency. We aim to study the design of two types of satellite-borne instruments: a focal plane polarimeter to be coupled with multilayer optics for hard X-rays and a large area and wide field of view polarimeter for transients and Gamma Ray Bursts. In this paper we describe the status of the COMPASS project, we report about the la...

  18. First results from the J-TEXT high-resolution three-wave polarimeter-interferometera)

    Science.gov (United States)

    Chen, J.; Zhuang, G.; Wang, Z. J.; Gao, L.; Li, Q.; Chen, W.; Brower, D. L.; Ding, W. X.

    2012-10-01

    A laser-based far-infrared polarimeter-interferometer system utilizing the three-wave technique has been implemented on the J-TEXT tokamak. The polarimeter determines the Faraday effect by measuring the phase difference between two collinear, counter-rotating, circularly polarized laser beams. The first results from the polarimeter-interferometer designed for J-TEXT have been obtained in the most recent J-TEXT experimental campaign. Simultaneous polarimetric and interferometric measurement is achieved, with phase resolution up to 0.1°, at a bandwidth of 50 kHz. The temporal resolution, which depends on the lasers' frequency offset, is ~1 μs. Continuous spatial measurement covering 45 cm (80% of the plasma cross-section) is realized by utilizing 1D parabolic beam-expansion optics. Three initial test chords are installed, and future plans call for expansion up to 30 chords with 1.5 cm chord spacing, providing high spatial resolution for measurement of the electron density and current density profiles. The reliability of both the polarimetric and interferometric measurements is confirmed by comparison with computation and with data from a hydrogen cyanide (HCN) laser interferometer. With the high temporal and phase resolution, perturbations associated with the sawtooth cycle and MHD activity have been observed.
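
    The polarimetric principle stated above reduces to a simple relation: the Faraday rotation angle is half the phase difference accumulated between the two counter-rotating circularly polarized beams. The snippet below is a generic illustration with an invented phase difference, not a J-TEXT calibration.

    ```python
    # Generic three-wave polarimetry relation: Faraday rotation is half the phase
    # difference between the left- and right-hand circularly polarized probe beams.
    import numpy as np

    def faraday_angle_deg(phase_lcp_rad, phase_rcp_rad):
        """Return the Faraday rotation angle (degrees) from the two beam phases."""
        return 0.5 * np.degrees(phase_lcp_rad - phase_rcp_rad)

    # Example: a 0.7 degree phase difference between the circular components
    print(f"{faraday_angle_deg(np.deg2rad(0.7), 0.0):.2f} deg Faraday rotation")
    ```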

  19. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  20. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter

    Science.gov (United States)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with 10 e-/pixel/second dark current, 25 e- read noise, a gain of 2.0 +/- 0.5 and 1.0 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.

  1. Imaging as a tool for the characterization of the gas pixel detector photoelectric polarimeter

    Science.gov (United States)

    Fabiani, Sergio

    2017-08-01

    The Gas Pixel Detector (GPD) is an X-ray polarimeter exploiting the photoelectric effect both to measure polarization and to obtain the image of astrophysical sources. This detector is on board the IXPE (Imaging X-ray Polarimetry Explorer) mission selected by NASA in the framework of the Explorer program. We report on the imaging capability of the GPD as a tool to perform a full detector characterization. The analysis of a radiation beam penetrating the gas cell at an inclined angle is discussed, showing how this measurement can be used to study different properties of the gas.

  2. Total elimination of sampling errors in polarization imagery obtained with integrated microgrid polarimeters.

    Science.gov (United States)

    Tyo, J Scott; LaCasse, Charles F; Ratliff, Bradley M

    2009-10-15

    Microgrid polarimeters operate by integrating a focal plane array with an array of micropolarizers. The Stokes parameters are estimated by comparing polarization measurements from pixels in a neighborhood around the point of interest. The main drawback is that the measurements used to estimate the Stokes vector are made at different locations, leading to a false polarization signature owing to instantaneous field-of-view (IFOV) errors. We demonstrate for the first time, to our knowledge, that spatially band limited polarization images can be ideally reconstructed with no IFOV error by using a linear system framework.
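
    For context, the baseline estimator that such schemes improve upon is plain superpixel differencing over a 2x2 block of micropolarizers. The sketch below assumes a common 0/45/90/135 degree layout and a spatially uniform scene (so the IFOV error discussed in the abstract vanishes); it is not the band-limited reconstruction filter proposed by the authors.

    ```python
    import numpy as np

    # Assumed 2x2 micropolarizer layout (degrees); real sensors vary.
    pattern = np.array([[0, 45],
                        [90, 135]])

    # Synthesize a raw frame from a spatially uniform Stokes vector (S0, S1, S2).
    s_true = np.array([1.0, 0.3, -0.1])
    h, w = 8, 8
    ang = np.deg2rad(pattern[np.arange(h)[:, None] % 2, np.arange(w)[None, :] % 2])
    raw = 0.5 * (s_true[0] + s_true[1] * np.cos(2 * ang) + s_true[2] * np.sin(2 * ang))

    # Naive superpixel demosaicing: one Stokes estimate per 2x2 block.
    i0   = raw[0::2, 0::2]
    i45  = raw[0::2, 1::2]
    i90  = raw[1::2, 0::2]
    i135 = raw[1::2, 1::2]

    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    print(s0[0, 0], s1[0, 0], s2[0, 0])   # ~ (1.0, 0.3, -0.1) for this uniform scene
    ```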

  3. Algorithm Validation of the Current Profile Reconstruction of EAST Based on Polarimeter/Interferometer

    International Nuclear Information System (INIS)

    Qian Jinping; Ren Qilong; Wan Baonian; Liu Haiqin; Zeng Long; Luo Zhengping; Chen Dalong; Shi Tonghui; Sun Youwen; Shen Biao; Xiao Bingjia; Lao, L. L.; Hanada, K.

    2015-01-01

    The method of plasma current profile reconstruction using the polarimeter/interferometer (POINT) data from a simulated equilibrium is explored and validated. It is shown that the safety factor (q) profile can be generally reconstructed from the external magnetic and POINT data. The reconstructed q profile is found to reasonably agree with the initial equilibria. Comparisons of reconstructed q and density profiles using the magnetic data and the POINT data with 3%, 5% and 10% random errors are investigated. The result shows that the POINT data can be used for a reasonably accurate determination of the q profile. (fusion engineering)

  4. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The entry into force of the Intellectual Property Rights Law (HAKI) has opened up a new alternative: using open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many notions of what open source software is, ranging from software that is free of charge to software without a license. Not all of the claims about open source software are true, so the concept needs to be introduced properly, covering its history, its licenses and how to choose among them, and the considerations involved in selecting from the open source software that is available. Keywords: License, Open Source, HAKI (intellectual property rights)

  5. An optimized photoelectron track reconstruction method for photoelectric X-ray polarimeters

    Science.gov (United States)

    Kitaguchi, Takao; Black, Kevin; Enoto, Teruaki; Fukazawa, Yasushi; Hayato, Asami; Hill, Joanne E.; Iwakiri, Wataru B.; Jahoda, Keith; Kaaret, Philip; McCurdy, Ross; Mizuno, Tsunefumi; Nakano, Toshio; Tamagawa, Toru

    2018-02-01

    We present a data processing algorithm for angular reconstruction and event selection applied to 2-D photoelectron track images from X-ray polarimeters. The method reconstructs the initial emission angle of a photoelectron from the initial portion of the track, which is obtained by continuously cutting a track until the image moments or the number of pixels fall below tunable thresholds. In addition, event selection that rejects round tracks, quantified with eccentricity and circularity, is performed so that the polarimetry sensitivity, considering a trade-off between the modulation factor and signal acceptance, is maximized. The modulation factors with track selection applied are 26.6 ± 0.4, 46.1 ± 0.4, 62.3 ± 0.4, and 61.8 ± 0.3% at 2.7, 4.5, 6.4, and 8.0 keV, respectively, using the same data previously analyzed by Iwakiri et al. (2016), where the corresponding numbers are 26.9 ± 0.4, 43.4 ± 0.4, 54.4 ± 0.3, and 59.1 ± 0.3%. The method improves polarimeter sensitivity by 5%-10% at the high energy end of the band previously presented (Iwakiri et al. 2016).
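
    Once emission angles have been reconstructed, the modulation factor is obtained by fitting the standard modulation curve N(phi) ∝ 1 + mu·cos(2(phi − phi0)) to the angle histogram. The snippet below fits that curve by linear least squares on synthetic Poisson data; the numbers are invented and the fit is deliberately simpler than the full event-selection optimization described in the abstract.

    ```python
    import numpy as np

    # Toy histogram of reconstructed photoelectron emission angles (not real data).
    rng = np.random.default_rng(0)
    mu_true, phi0_true = 0.45, np.deg2rad(30.0)
    phi = np.linspace(0, 2 * np.pi, 36, endpoint=False)      # bin centres
    expected = 1000.0 * (1 + mu_true * np.cos(2 * (phi - phi0_true)))
    counts = rng.poisson(expected)

    # Linear least-squares fit of N(phi) = a0 + a1*cos(2*phi) + a2*sin(2*phi)
    design = np.column_stack([np.ones_like(phi), np.cos(2 * phi), np.sin(2 * phi)])
    a0, a1, a2 = np.linalg.lstsq(design, counts, rcond=None)[0]

    mu = np.hypot(a1, a2) / a0                 # modulation factor
    phi0 = 0.5 * np.arctan2(a2, a1)            # reconstructed polarization angle
    print(f"mu = {mu:.3f}, phi0 = {np.rad2deg(phi0):.1f} deg")
    ```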

  6. Compact acousto-optic imaging spectro-polarimeter for mineralogical investigations in the near infrared.

    Science.gov (United States)

    Belyaev, Denis A; Yushkov, Konstantin B; Anikin, Sergey P; Dobrolenskiy, Yuri S; Laskin, Aleksander; Mantsevich, Sergey N; Molchanov, Vladimir Ya; Potanin, Sergey A; Korablev, Oleg I

    2017-10-16

    Spectral imaging in the near infrared is a promising method for mineralogy analysis, in particular well-suited for airless celestial objects or those with faint atmospheres. Additional information about structure and composition of minerals can be obtained using spectral polarimetry with high spatial resolution. We report design and performance of a laboratory prototype for a compact near infrared acousto-optic imaging spectro-polarimeter, which may be implemented for remote or close-up analysis of planetary surfaces. The prototype features telecentric optics, apochromatic design over the bandwidth of 0.8-1.75 µm, and simultaneous imaging of two orthogonal linear polarizations of the same scene with a single FPA detector. When validating the scheme, reflectance spectra of several minerals were measured with a spectral resolution of 100 cm⁻¹ (10 nm passband at 1 µm). When imaging samples, a spatial resolution of 0.6 mm at a target distance of one meter was reached. It corresponds to 100 by 100 diffraction-limited elements resolved at the focal plane array (FPA) for each of the two light polarizations. A similar prototype is also being designed for the spectral range from 1.7 to 3.5 µm. This type of spectro-polarimeter is considered as a potential reconnaissance and analysis tool for future planetary or moon landers and rovers.

  7. A Wavefront Division Polarimeter for the Measurements of Solute Concentrations in Solutions

    Directory of Open Access Journals (Sweden)

    Sergio Calixto

    2017-12-01

    Full Text Available Polarimeters are useful instruments that measure concentrations of optically active substances in a given solution. The conventional polarimetric principle consists of measuring the rotation angle of linearly polarized light. Here, we present a novel polarimeter based on the study of interference patterns. A Mach–Zehnder interferometer with linearly polarized light at the input is used. One beam passes through the liquid sample and the other is a reference beam. As the linearly polarized sample beam propagates through the optically active solution, the vibration plane of the electric field rotates. As a result, the visibility of the interference pattern at the interferometer output decreases. Fringe contrast is maximum when both beams present a polarization perpendicular to the plane of incidence. However, minimum visibility is obtained when, after propagation through the sample, the polarization of the sample beam is oriented parallel to the plane of incidence. By using different solute concentrations, a calibration plot is obtained showing the behavior of the visibility.
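
    A minimal reduction for such an instrument, assuming equal intensities in the two interferometer arms so that the fringe visibility equals |cos(theta)| for a rotation theta of the sample beam's polarization plane, could look like the sketch below. The fringe extrema, path length and the use of the textbook specific rotation of sucrose are all assumptions for illustration, not values from the paper.

    ```python
    import numpy as np

    # Measured fringe extrema (toy numbers, arbitrary units).
    i_max, i_min = 0.93, 0.07
    visibility = (i_max - i_min) / (i_max + i_min)

    # Equal-intensity assumption: V = |cos(theta)|  ->  theta in degrees.
    theta = np.degrees(np.arccos(np.clip(visibility, 0.0, 1.0)))

    # Biot's law: theta = [alpha] * L * c  ->  c = theta / ([alpha] * L)
    specific_rotation = 66.4   # deg dm^-1 (g/mL)^-1, textbook value for sucrose at 589 nm
    path_dm = 1.0              # optical path through the cell [dm] (assumed)
    concentration = theta / (specific_rotation * path_dm)
    print(f"theta = {theta:.1f} deg, c = {concentration:.3f} g/mL")
    ```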

  8. A Michelson interferometer/polarimeter on the Tokamak Fusion Test Reactor (TFTR)

    International Nuclear Information System (INIS)

    Park, H.K.; Mansfield, D.K.; Johnson, L.C.; Ma, C.H.

    1987-01-01

    A multichannel interferometer/polarimeter for the Tokamak Fusion Test Reactor (TFTR) has been developed in order to study the time dependent plasma current density (J_p) and electron density (n_e) profiles simultaneously. The goal of the TFTR is demonstration of breakeven via deuterium and tritium (DT) plasmas. In order to be operated and maintained during the DT operation phase, the system is designed based on the Michelson geometry, which possesses intrinsic standing wave problems. So far, there have been no observable signals due to these standing waves. However, a standing wave resulting from the beam path design adopted to achieve optimum use of the laser power was found. This standing wave has not prevented initial 10 channel interferometer operation. However, a single channel polarimeter test indicated that this standing wave was fatal for Faraday rotation measurements. Techniques employing 1/2 wave plates and polarizers have been applied to eliminate this standing wave problem. The completion of 10 channel Faraday rotation measurements may be feasible in the near future

  9. First results of the J-TEXT high-resolution 3-wave polarimeter-interferometer system

    Science.gov (United States)

    Zhuang, G.; Chen, J.; Li, Q.; Gao, L.; Wang, Z. J.; Liu, Y.; Chen, W.

    2013-10-01

    A far-infrared laser polarimeter-interferometer system based on the three-wave technique has been established on the J-TEXT tokamak. The system determines the Faraday angle by measuring the phase difference between two collinear, counter-rotating, circularly polarized laser beams, and simultaneously acquires the line-integrated electron density by phase comparison between the two beams and a third local oscillator (LO) beam. Three separately pumped HCOOH lasers at 432 μm are adopted as sources, supplying more than 100 mW of power output in total. Parabolic mirrors are used to expand the probe beams to 450 mm wide, covering ~80% of the plasma cross-section, which allows profile measurement with high spatial resolution. First experimental results of the polarimeter-interferometer have been obtained. Simultaneous polarimetric and interferometric measurements on 12 chords (3 cm chord spacing) are achieved, with phase resolution up to 0.1° at a bandwidth of 50 kHz. With the high temporal and phase resolution, perturbations associated with the sawtooth cycle and MHD activity have been observed.

  10. A polarimeter for GeV protons of recirculating synchrotron beams

    CERN Document Server

    Bauer, F

    1999-01-01

    A polarimeter for use in recirculating beams of proton synchrotrons with energies from 300 MeV up to several GeV has been developed. The polarimetry is based on the asymmetry measurement of elastic p->p scattering on an internal CH_2 fiber target. The forward going protons are detected with two scintillator systems on either side of the beam pipe close to the angle Θ_f of maximum analyzing power A_N. Each one operates in coincidence with a broad (ΔΘ_b = 21.4°), segmented detector system for the recoil proton of kinematically varying direction Θ_b; this position resolution is also used for a concurrent measurement of the p->C and nonelastic p->p background. The CH_2 fiber can be replaced by a carbon fiber for detailed background studies; 'false' asymmetries are accounted for with a rotation of the polarimeter around the beam axis. Polarimetry has been performed in the internal beam of the Cooler Synchrotron COSY at fixed energies as well as during proton acceleratio...
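
    For a left-right polarimeter of this kind, a common way to form the scattering asymmetry while cancelling instrumental ('false') asymmetries to first order is the geometric-mean (cross-ratio) estimator over the two beam-spin states; the abstract instead rotates the whole polarimeter, which serves the same purpose. The counts and analyzing power below are invented.

    ```python
    import numpy as np

    def cross_ratio_asymmetry(l_up, r_up, l_down, r_down):
        """Geometric-mean (cross-ratio) left-right asymmetry. Combining the two
        beam-spin states this way cancels detector-efficiency and luminosity
        ('false') asymmetries to first order."""
        num = np.sqrt(l_up * r_down) - np.sqrt(l_down * r_up)
        den = np.sqrt(l_up * r_down) + np.sqrt(l_down * r_up)
        return num / den

    # Toy counts for spin-up / spin-down beam in the left and right telescopes.
    eps = cross_ratio_asymmetry(l_up=52000, r_up=48000, l_down=48000, r_down=52000)
    a_n = 0.25                      # analyzing power at the chosen angle (assumed)
    print(f"epsilon = {eps:.4f}, P = {eps / a_n:.3f}")
    ```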

  11. Performance Verification of the Gravity and Extreme Magnetism Small Explorer GEMS X-Ray Polarimeter

    Science.gov (United States)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kanako, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi; hide

    2014-01-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at the Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor greater than or equal to 35% above 4 keV was obtained at the expected polarization angle. At energies below 4 keV, where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves the expected modulation angle and an expected modulation factor of approximately 20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives a sensitivity, parameterized by the minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
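
    The sensitivity figure quoted at the end, the minimum detectable polarization, follows from the modulation factor and the source and background rates through a standard closed-form expression. The sketch below evaluates that expression with purely illustrative numbers, not the GEMS values.

    ```python
    import numpy as np

    def mdp99(mu, rate_src, rate_bkg, t_obs):
        """Minimum detectable polarization at 99% confidence, using the standard
        expression MDP99 = 4.29 / (mu * R_S) * sqrt((R_S + R_B) / T)."""
        return 4.29 / (mu * rate_src) * np.sqrt((rate_src + rate_bkg) / t_obs)

    # Illustrative numbers only: 35% modulation factor, 5 counts/s source rate,
    # 0.5 counts/s background, 10^5 s exposure.
    print(f"MDP99 = {100 * mdp99(0.35, 5.0, 0.5, 1e5):.2f} %")
    ```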

  12. Software Metrics and Software Metrology

    CERN Document Server

    Abran, Alain

    2010-01-01

    Most of the software measures currently proposed to the industry bring few real benefits to either software managers or developers. This book looks at the classical metrology concepts from science and engineering, using them as criteria to propose an approach to analyze the design of current software measures and then design new software measures (illustrated with the design of a software measure that has been adopted as an ISO measurement standard). The book includes several case studies analyzing strengths and weaknesses of some of the software measures most often quoted. It is meant for sof

  13. Forever software

    NARCIS (Netherlands)

    Rensink, Arend; Margaria, Tiziana; Steffen, Bernhard

    2014-01-01

    Any attempt to explain software engineering to a lay audience soon falls back on analogy: building software is like building a bridge, a car, a television set. A large part of the established practice within software engineering is also based on this premise. However, the analogy is false in some

  14. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  15. Measurement of the inclusive p-C analyzing power and cross section in the 1 GeV region and calibration of the new polarimeter POMME

    International Nuclear Information System (INIS)

    Bonin, B.; Boudard, A.; Fanet, H.; Fergerson, R.W.; Garcon, M.; Giorgetti, C.; Habault, J.; Le Meur, J.; Lombard, R.M.; Lugol, J.C.; Mayer, B.; Mouly, J.P.; Tomasi-Gustafsson, E.; Morlet, M.; Wiele, J. van de; Willis, A.; Greeniaus, G.; British Columbia Univ., Vancouver; Gaillard, G.; Markowitz, P.; Perdrisat, C.F.; Abegg, R.; Hutcheon, D.A.

    1990-01-01

    We describe the polarimeter POMME, and give the results of its calibration in the region 200-1200 MeV, using the Saturne polarized proton beam. The high energy part of this domain (800-1200 MeV) was previously unexplored. Parametrizations of the inclusive p-C analyzing power and polarimeter efficiency as a function of scattering angle and incident energy are given, completing the data already available at lower energies. The optimization of proton polarimeters in this energy domain is also discussed. (orig.)

  16. Fringe-jump corrected far infrared tangential interferometer/polarimeter for a real-time density feedback control system of NSTX plasmas.

    Science.gov (United States)

    Juhn, J-W; Lee, K C; Hwang, Y S; Domier, C W; Luhmann, N C; Leblanc, B P; Mueller, D; Gates, D A; Kaita, R

    2010-10-01

    The far infrared tangential interferometer/polarimeter (FIReTIP) of the National Spherical Torus Experiment (NSTX) has been set up to provide reliable electron density signals for a real-time density feedback control system. This work consists of two main parts: suppression of the fringe jumps that have prevented the plasma density from being used in direct feedback to actuators, and the conceptual design of a density feedback control system including the FIReTIP, control hardware, and software that takes advantage of the NSTX plasma control system (PCS). By investigating numerous shot data after July 2009, when the new electronics were installed, fringe jumps in the FIReTIP are well characterized, and consequently the suppressing algorithms work properly, as shown in comparisons with the Thomson scattering diagnostic. This approach is also applicable to signals taken at a 5 kHz sampling rate, which is a fundamental constraint imposed by the digitizers providing inputs to the PCS. The fringe jump correction algorithm, as well as safety and feedback modules, will be included as submodules either in the gas injection system category or a new category of density in the PCS.
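
    The essence of a fringe-jump suppression algorithm is to detect sample-to-sample phase steps close to an integer number of fringes and to subtract the accumulated offset from all later samples. The sketch below illustrates that idea on a synthetic signal; it is a generic illustration, not the FIReTIP/PCS implementation.

    ```python
    import numpy as np

    def remove_fringe_jumps(phase, fringe=2 * np.pi, threshold=0.5):
        """Remove discrete fringe jumps from a phase (density) signal.
        Any sample-to-sample step larger than `threshold` fringes is interpreted
        as an integer number of fringe jumps and subtracted from later samples."""
        steps = np.diff(phase)
        jumps = np.round(steps / fringe)                 # integer fringes per step
        jumps[np.abs(steps) < threshold * fringe] = 0.0  # ignore ordinary changes
        offset = np.concatenate([[0.0], np.cumsum(jumps)]) * fringe
        return phase - offset

    # Toy signal: slow density evolution plus two artificial fringe jumps.
    t = np.linspace(0, 1, 2000)
    clean = 3.0 * np.sin(2 * np.pi * 1.5 * t)
    corrupt = clean.copy()
    corrupt[700:] += 2 * np.pi
    corrupt[1500:] -= 4 * np.pi
    fixed = remove_fringe_jumps(corrupt)
    print(np.max(np.abs(fixed - clean)))   # ~0 if all jumps were caught
    ```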

  17. Software Licensing

    OpenAIRE

    Nygrýnová, Dominika

    2014-01-01

    Summary: Software Licensing The thesis deals with different practical aspects of commercial software licensing from the perspective of the Czech legal system. The focus is on the software license agreement as the most important legal instrument granting rights of use for computer programs. The thesis opens with a summary of Czech legislation in force in this area in the context of European Community law and international law. The legislation in effect is largely governed by the Copyright Act....

  18. Analysis of low energy AGS polarimeter data and potential consequences for RHIC spin physics

    International Nuclear Information System (INIS)

    Cadman, R.; Krueger, K.; Spinka, H.; Underwood, D.; Yokosawa, A.; Huang, H.

    2001-01-01

    The small asymmetries measured at Gγ = 7.5 during the RHIC spin commissioning were a serious concern. In earlier runs, asymmetries double those from the spin commissioning time (September 2000) had sometimes been observed, and there had been few changes to the AGS polarimeter hardware or operating conditions. Recently, the observed changes in the asymmetries measured at Gγ = 7.5 have been ascribed to contamination of the carbon target asymmetry with that from the fishline target and vice-versa, because of the sizeable beam spot size compared to the separation of the targets. This note addresses this hypothesis using the observed asymmetries. This problem could directly impact spin physics at RHIC

  19. Characterization of a high-temperature superconducting bearing for use in a cosmic microwave background polarimeter

    Energy Technology Data Exchange (ETDEWEB)

    Hull, John R [Energy Technology Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Hanany, Shaul [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Matsumura, Tomotake [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Johnson, Bradley [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Jones, Terry [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States)

    2005-02-01

    We have previously presented a design for a cosmic microwave background (CMB) polarimeter in which a cryogenically cooled half-wave plate rotates by means of a high-temperature superconducting (HTS) bearing. Here, a prototype bearing, consisting of a commercially available ring-shaped permanent magnet and an array of YBCO bulk HTS material, has been constructed. We measured its coefficient of friction and vibrational properties as a function of several parameters, including temperature between 15 and 83 K, rotation frequency between 0.3 and 3.5 Hz, levitation distance between 6 and 10 mm and ambient pressure of ~10^-7 Torr. We concluded that the low rotational drag of the HTS bearing would allow rotations for long periods with minimal input power and negligible wear and tear, thus making this technology suitable for a future satellite mission.

  20. A Spin-Light Polarimeter for Multi-GeV Longitudinally Polarized Electron Beams

    Energy Technology Data Exchange (ETDEWEB)

    Mohanmurthy, Prajwal [Mississippi State University, Starkville, MS (United States); Dutta, Dipangkar [Mississippi State University, Starkville, MS (United States) and Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

    2014-02-01

    The physics program at the upgraded Jefferson Lab (JLab) and the physics program envisioned for the proposed electron-ion collider (EIC) include large efforts to search for interactions beyond the Standard Model (SM) using parity violation in electroweak interactions. These experiments require precision electron polarimetry with an uncertainty of < 0.5%. The spin-dependent synchrotron radiation, called "spin light," can be used to monitor the electron beam polarization. In this article we develop a conceptual design for a "spin-light" polarimeter that can be used at a high intensity, multi-GeV electron accelerator. We have also built a Geant4 based simulation for a prototype device and report some of the results from these simulations.

  1. Single image super-resolution via regularized extreme learning regression for imagery from microgrid polarimeters

    Science.gov (United States)

    Sargent, Garrett C.; Ratliff, Bradley M.; Asari, Vijayan K.

    2017-08-01

    The advantage of division-of-focal-plane imaging polarimeters is their ability to obtain temporally synchronized intensity measurements across a scene; however, they sacrifice spatial resolution in doing so, due to the spatially modulated arrangement of the pixel-level polarizers, and often produce aliased imagery. Here, we propose a super-resolution method based upon two previously trained extreme learning machines (ELM) that attempt to recover missing high-frequency and low-frequency content beyond the spatial resolution of the sensor. This method yields a computationally fast and simple way of recovering lost high- and low-frequency content when demosaicing raw microgrid polarimetric imagery. The proposed method outperforms other state-of-the-art single-image super-resolution algorithms in terms of structural similarity and peak signal-to-noise ratio.
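
    An extreme learning machine of the kind referred to here is simply a single random hidden layer whose readout weights are solved in closed form by ridge regression. The class below is a generic sketch of that idea applied to an arbitrary patch-to-value regression; the architecture, sizes and toy data are assumptions and it does not reproduce the authors' trained super-resolution models.

    ```python
    import numpy as np

    class RegularizedELM:
        """Minimal regularized extreme learning machine: a single random hidden
        layer followed by a ridge-regression readout (a sketch of the general
        technique, not the authors' trained super-resolution models)."""
        def __init__(self, n_hidden=200, ridge=1e-3, seed=0):
            self.n_hidden, self.ridge = n_hidden, ridge
            self.rng = np.random.default_rng(seed)

        def _hidden(self, x):
            return np.tanh(x @ self.w + self.b)

        def fit(self, x, y):
            self.w = self.rng.normal(size=(x.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            h = self._hidden(x)
            # Ridge solution for the readout weights: (H^T H + lambda I)^-1 H^T y
            a = h.T @ h + self.ridge * np.eye(self.n_hidden)
            self.beta = np.linalg.solve(a, h.T @ y)
            return self

        def predict(self, x):
            return self._hidden(x) @ self.beta

    # Toy usage: learn a nonlinear mapping from 4-pixel neighborhoods to a value.
    rng = np.random.default_rng(1)
    x_train = rng.uniform(size=(500, 4))
    y_train = np.sin(x_train.sum(axis=1))            # stand-in for missing detail
    model = RegularizedELM().fit(x_train, y_train)
    print(np.abs(model.predict(x_train) - y_train).mean())
    ```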

  2. Design and Tests of the Hard X-Ray Polarimeter X-Calibur

    Science.gov (United States)

    Beilicke, M.; Baring, M. G.; Barthelmy, S.; Binns, W. R.; Buckley, J.; Cowsik, R.; Dowkontt, P.; Garson, A.; Guo, Q.; Haba, Y.; hide

    2012-01-01

    X-ray polarimetry promises to give qualitatively new information about high-energy astrophysical sources, such as binary black hole systems, micro-quasars, active galactic nuclei, and gamma-ray bursts. We designed, built and tested a hard X-ray polarimeter, X-Calibur, to be used in the focal plane of the InFOCμS grazing incidence hard X-ray telescope. X-Calibur combines a low-Z Compton scatterer with a CZT detector assembly to measure the polarization of 10 - 80 keV X-rays, making use of the fact that polarized photons Compton scatter preferentially perpendicular to the electric field orientation. X-Calibur achieves a high detection efficiency of order unity.

  3. Derivation of Cumulus Cloud Dimensions and Shape from the Airborne Measurements by the Research Scanning Polarimeter

    Science.gov (United States)

    Alexandrov, Mikhail D.; Cairns, Brian; Emde, Claudia; Ackerman, Andrew S.; Ottaviani, Matteo; Wasilewski, Andrzej P.

    2016-01-01

    The Research Scanning Polarimeter (RSP) is an airborne instrument, whose measurements have been extensively used for retrievals of microphysical properties of clouds. In this study we show that for cumulus clouds the information content of the RSP data can be extended by adding the macroscopic parameters of the cloud, such as its geometric shape, dimensions, and height above the ground. This extension is possible by virtue of the high angular resolution and high frequency of the RSP measurements, which allow for geometric constraint of the cloud's 2D cross section between a number of tangent lines of view. The retrieval method is tested on realistic 3D radiative transfer simulations and applied to actual RSP data.

  4. A three-cell liquid hydrogen target for an extended focal plane polarimeter

    International Nuclear Information System (INIS)

    Golovanov, L.B.; Chesny, P.; Gheller, J.M.; Guillier, G.; Ladygin, V.P.; Theure, Ph.; Tomasi-Gustafsson, E.

    1996-01-01

    This article describes the design and working principle of a three-cell liquid hydrogen target produced for the high-energy deuteron polarimeter HYPOM. This target uses liquid helium as a cooling agent. After a general description of the apparatus, tests and operating modes are thoroughly explained. In particular, the air-controlled self-regulation of the helium flow in the cryostat to stabilize the liquid hydrogen level is presented. The main features of this target are the simplicity of the design as well as its safety in the event of any incident. Results on cooling down, filling of the target, and the stabilization regime were obtained during a physics experiment at the Saturne II synchrotron. (orig.)

  5. In-line phase retarder and polarimeter for conversion of linear to circular polarization

    Energy Technology Data Exchange (ETDEWEB)

    Kortright, J.B.; Smith, N.V.; Denlinger, J.D. [Lawrence Berkeley National Lab., CA (United States)] [and others

    1997-04-01

    An in-line polarimeter including a phase retarder and a linear polarizer was designed and commissioned on undulator beamline 7.0 for the purpose of converting linear to circular polarization for experiments downstream. In commissioning studies, Mo/Si multilayers at 95 eV were used both as the upstream, freestanding phase retarder and the downstream linear polarizer. The polarization properties of the phase retarder were characterized by direct polarimetry and by collecting MCD spectra in photoemission from Gd and other magnetic surfaces. The resonant birefringence of transmission multilayers results from differing distributions of s- and p-component wave fields in the multilayer when operating near a structural (Bragg) interference condition. The resulting phase retardation is especially strong when the interference is at or near the Brewster angle, which is roughly 45° in the EUV and soft x-ray ranges.

  6. Design and Tests of the Hard X-ray Polarimeter X-Calibur

    Directory of Open Access Journals (Sweden)

    M. Beilicke

    2014-12-01

    Full Text Available X-ray polarimetry promises to give qualitatively new information about high-energy astrophysical sources, such as binary black hole systems, micro-quasars, active galactic nuclei, and gamma-ray bursts. We designed, built and tested a hard X-ray polarimeter, X-Calibur, to be used in the focal plane of the InFOCuS grazing incidence hard X-ray telescope. X-Calibur combines a low-Z Compton scatterer with a CZT detector assembly to measure the polarization of 20−60 keV X-rays, making use of the fact that polarized photons Compton scatter preferentially perpendicular to the electric field orientation; in principle, a similar space-borne experiment could be operated in the 5−100 keV regime. X-Calibur achieves a high detection efficiency of order unity.

  7. Two-dimensional polarimeter with a charge-coupled-device image sensor and a piezoelastic modulator.

    Science.gov (United States)

    Povel, H P; Keller, C U; Yadigaroglu, I A

    1994-07-01

    We present the first measurements and scientific observations of the solar photosphere obtained with a new two-dimensional polarimeter based on piezoelastic modulators and synchronous demodulation in a CCD imager. This instrument, which was developed for precision solar vector polarimetry, contains a specially masked CCD that has every second row covered with an opaque mask. During exposure the charges are shifted back and forth between covered and light-sensitive rows, synchronized with the modulation. In this way Stokes I and one of the other Stokes parameters can be recorded. Since the charge shifting is performed at frequencies well above the seeing frequencies and both polarization states are measured with the same pixel, highly sensitive and accurate polarimetry is achieved. We have tested the instrument in laboratory conditions as well as at three solar telescopes.

  8. Measurement errors induced by deformation of optical axes of achromatic waveplate retarders in RRFP Stokes polarimeters.

    Science.gov (United States)

    Dong, Hui; Tang, Ming; Gong, Yandong

    2012-11-19

    The optical axes of achromatic waveplate retarders (AWR) may deform from the ideal linear eigenpolarizations and be frequency-dependent owing to imperfect design and fabrication. Such deformations result in an ellipticity error and an orientation error of an AWR away from the nominal values. In this paper, we address the measurement errors of Stokes parameters induced by deformation of the optical axes of AWRs in rotatable-retarder fixed-polarizer (RRFP) Stokes polarimeters. A set of theoretical formulas is derived to reveal that such measurement errors actually depend on both the retardance and the angular orientations of the AWR in use, as well as the state of polarization (SOP) under test. We demonstrate that, by rotating the AWR to N (N≥5) uniformly spaced angles with an angle step of 180°/N or 360°/N, the measurement errors of Stokes parameters induced by the ellipticity error of the AWR can be suppressed compared with the result using any set of four specific angles, especially when the SOP under test is nearly circular. On the other hand, the measurement errors induced by the orientation error of the AWR have more complicated relationships with the angular orientations of the AWR: 1) when the SOP under test is nearly circular, the above-mentioned N (N≥5) uniformly spaced angles also lead to much smaller measurement errors than any set of four specific angles; 2) when the SOP under test is nearly linear, N (N≥5) uniformly spaced angles result in smaller or larger measurement errors, depending on the SOP under test, compared with the usually-recommended sets of four specific angles. By theoretical calculations and numerical simulations, we can conclude that RRFP Stokes polarimeters employing angle sets of N (N≥5) uniformly spaced angles, (±90°, -54°, -18°, 18°, 54°) for instance, can effectively reduce the measurement errors of Stokes parameters induced by the optical axis deformation of the AWR.
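
    For reference, an ideal RRFP measurement with a quarter-wave retarder rotated to N uniformly spaced angles can be reduced to the Stokes vector by a small linear least-squares fit. The sketch below uses one common sign convention for the intensity model and simulated, noise-free readings; it does not include the axis-deformation errors that are the subject of the paper.

    ```python
    import numpy as np

    def design_row(theta):
        """Intensity model for an ideal quarter-wave retarder at angle theta
        followed by a fixed horizontal polarizer (one common sign convention):
        I = 0.5*(S0 + S1*cos^2(2t) + S2*sin(2t)*cos(2t) - S3*sin(2t))."""
        c, s = np.cos(2 * theta), np.sin(2 * theta)
        return 0.5 * np.array([1.0, c * c, s * c, -s])

    # N >= 5 uniformly spaced retarder angles over 180 degrees, as recommended
    # in the abstract for suppressing axis-deformation errors.
    n = 9
    thetas = np.arange(n) * np.pi / n
    a = np.array([design_row(t) for t in thetas])

    s_true = np.array([1.0, 0.2, -0.3, 0.6])          # toy input Stokes vector
    intensities = a @ s_true                           # simulated detector readings

    s_est, *_ = np.linalg.lstsq(a, intensities, rcond=None)
    print(np.round(s_est, 6))                          # recovers s_true
    ```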

  9. Determination of the Kinematics of the Qweak Experiment and Investigation of an Atomic Hydrogen Moller Polarimeter

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Valerie M. [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2018-01-01

    The Qweak experiment has tested the Standard Model through a precise measurement of the weak charge of the proton (Q_W^p). This was done by measuring the parity-violating asymmetry for polarized electrons scattering off of unpolarized protons. The parity-violating asymmetry measured is directly proportional to the four-momentum transfer (Q^2) from the electron to the proton. The extraction of Q_W^p from the measured asymmetry requires a precise Q^2 determination. The Qweak experiment had a Q^2 = 24.8 ± 0.1 m(GeV^2), which achieved the goal of an uncertainty of <= 0.5%. From the measured asymmetry and Q^2, Q_W^p was determined to be 0.0719 ± 0.0045, which is in good agreement with the Standard Model prediction. This puts a 7.5 TeV lower limit on possible "new physics". This dissertation describes the analysis of Q^2 for the Qweak experiment. Future parity-violating electron scattering experiments similar to the Qweak experiment will measure asymmetries to high precision in order to test the Standard Model. These measurements will require the beam polarization to be measured to sub-0.5% precision. Presently the electron beam polarization is measured through Moller scattering off of a ferromagnetic foil or through using Compton scattering, both of which can have issues reaching this precision. A novel Atomic Hydrogen Moller Polarimeter has been proposed as a non-invasive way to measure the polarization of an electron beam via Moller scattering off of polarized monatomic hydrogen gas. This dissertation describes the development and initial analysis of a Monte Carlo simulation of an Atomic Hydrogen Moller Polarimeter.

  10. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  11. Determination of electron beam polarization using electron detector in Compton polarimeter with less than 1% statistical and systematic uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Narayan, Amrendra [Mississippi State Univ., Mississippi State, MS (United States)

    2015-05-01

    The Q-weak experiment aims to measure the weak charge of the proton with a precision of 4.2%. The proposed precision on the weak charge required a 2.5% measurement of the parity violating asymmetry in elastic electron - proton scattering. Polarimetry was the largest experimental contribution to this uncertainty, and a new Compton polarimeter was installed in Hall C at Jefferson Lab to make the goal achievable. In this polarimeter the electron beam collides with green laser light in a low gain Fabry-Perot cavity; the scattered electrons are detected in 4 planes of a novel diamond micro-strip detector while the backscattered photons are detected in lead tungstate crystals. This diamond micro-strip detector is the first such device to be used as a tracking detector in a nuclear and particle physics experiment. The diamond detectors are read out using custom built electronic modules that include a preamplifier, a pulse shaping amplifier and a discriminator for each detector micro-strip. We use field programmable gate array based general purpose logic modules for event selection and histogramming. Extensive Monte Carlo simulations and data acquisition simulations were performed to estimate the systematic uncertainties. Additionally, the Moller and Compton polarimeters were cross calibrated at low electron beam currents using a series of interleaved measurements. In this dissertation, we describe all the subsystems of the Compton polarimeter with emphasis on the electron detector. We focus on the FPGA based data acquisition system built by the author and the data analysis methods implemented by the author. The simulations of the data acquisition and the polarimeter that helped rigorously establish the systematic uncertainties of the polarimeter are also elaborated, resulting in the first sub-1% measurement of low energy (~1 GeV) electron beam polarization with a Compton electron detector. We have demonstrated that diamond based micro-strip detectors can be used for tracking in a

  12. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  13. First attempt of the measurement of the beam polarization at an accelerator with the optical electron polarimeter POLO

    CERN Document Server

    Collin, B; Essabaa, S; Frascaria, R; Gacougnolle, R; Kunne, Ronald Alexander; Aulenbacher, K; Tioukine, V

    2004-01-01

    The conventional methods for measuring the polarization of electron beams are either time consuming, invasive or accurate only to a few percent. We developed a method to measure electron beam polarization by observing the light emitted by argon atoms following their excitation by the impact of polarized electrons. The degree of circular polarization of the emitted fluorescence is directly related to the electron polarization. We tested the polarimeter on a test GaAs source available at the MAMI electron accelerator in Mainz, Germany. The polarimeter determines the polarization of a 50 keV electron beam decelerated to a few eV and interacting with an effusive argon gas jet. The resulting decay of the excited states produces the emission of circularly polarized radiation at 811.5 nm, which is observed and analyzed.

  14. Double-wedged Wollaston-type polarimeter design and integration to RTT150-TFOSC; initial tests, calibration, and characteristics

    Science.gov (United States)

    Helhel, S.; Khamitov, I.; Kahya, G.; Bayar, C.; Kaynar, S.; Gumerov, R.

    2015-10-01

    The photometric and spectroscopic observation capabilities of the 1.5-m Russian-Turkish Telescope RTT150 have been broadened with the integration of the presented polarimeter. The well-known double-wedged Wollaston-type dual-beam technique was chosen for its design and production. The polarimeter was integrated into the telescope detector TFOSC and is called TFOSC-WP. Its capabilities and limitations were assessed through a number of observation sets. Non-polarized and strongly polarized stars were observed to determine its limitations as well as its linearity. The instrumental intrinsic polarization was determined for the 1 × 5 arcmin field of view in the equatorial coordinate system, the systematic error of the polarization degree as 0.2%, and that of the position angle as 1.9∘. These capabilities and limitations are considered good enough to support the telescope's present and future astrophysical work related to the GAIA and SRG space missions.
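
    Dual-beam data of this kind are conventionally reduced with the double-ratio method, which combines the ordinary and extraordinary fluxes at two half-wave-plate positions so that channel gains and transparency variations cancel. The function below is that standard textbook reduction with invented fluxes; it is not necessarily the exact TFOSC-WP pipeline.

    ```python
    import numpy as np

    def normalized_stokes(f_o_a, f_e_a, f_o_b, f_e_b):
        """Standard dual-beam double-ratio reduction: combine ordinary (o) and
        extraordinary (e) fluxes taken at two half-wave-plate positions 45 deg
        apart (e.g. 0 and 45 deg for q; 22.5 and 67.5 deg for u). The double
        ratio cancels channel gains and transparency variations."""
        r = np.sqrt((f_o_a / f_e_a) / (f_o_b / f_e_b))
        return (r - 1.0) / (r + 1.0)

    # Toy fluxes (arbitrary units) for a ~2% polarized source.
    q = normalized_stokes(10200.0, 9800.0, 9800.0, 10200.0)
    u = normalized_stokes(10000.0, 10000.0, 10000.0, 10000.0)
    p = np.hypot(q, u)
    angle = 0.5 * np.degrees(np.arctan2(u, q))
    print(f"q={q:.4f}, u={u:.4f}, P={100*p:.2f}%, PA={angle:.1f} deg")
    ```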

  15. Software Reviews.

    Science.gov (United States)

    McGrath, Diane

    1990-01-01

    Reviews two programs: (1) "The Weather Machine" on understanding weather and weather forecasting and (2) "The Mystery of the Hotel Victoria" on problem solving in mathematics. Presents the descriptions, advantages, and weaknesses of the software. (YP)

  16. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  17. Software Reviews.

    Science.gov (United States)

    Slatta, Richard W. And Others

    1987-01-01

    Describes a variety of computer software. Subjects reviewed include history simulations and wordprocessing programs. Some of the eleven packages reviewed are Thog, North Utilities, HBJ Writer, Textra, Pro-cite, and Simulation Construction Kit. (BSR)

  18. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1990-01-01

    Reviewed are six computer software packages including "Lunar Greenhouse," "Dyno-Quest," "How Weather Works," "Animal Trackers," "Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)

  19. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  20. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  1. Design and initial performance of SHARP, a polarimeter for the SHARC-II camera at the Caltech Submillimeter Observatory

    Science.gov (United States)

    Li, H.; Dowell, C. D.; Kirby, L.; Novak, G.; Vaillancourt, J. E.

    2008-01-01

    We have developed a foreoptics module that converts the Submillimeter High Angular Resolution Camera generation II (SHARC-II) camera at the Caltech Submillimeter Observatory into a sensitive imaging polarimeter at wavelengths of 350 and 450 μm. We refer to this module as "SHARP." SHARP splits the incident radiation into two orthogonally polarized beams that are then reimaged onto opposite ends of the 32×12 pixel detector array in SHARC-II. A rotating half-wave plate is used just upstream from the polarization-splitting optics. The effect of SHARP is to convert SHARC-II into a dual-beam 12×12 pixel polarimeter. A novel feature of SHARP's design is the use of a crossed grid in a submillimeter polarimeter. Here we describe the detailed optical design of SHARP and present results of tests carried out during our first few observing runs. At 350 μm, the beam size (9 arc sec), throughput (75%), and instrumental polarization (<1%) are all very close to our design goals.

  2. Beam Size Measurement by Optical Diffraction Radiation and Laser System for Compton Polarimeter

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Chuyu [Peking Univ., Beijing (China)

    2012-12-31

    Beam diagnostics is an essential constituent of any accelerator, so much so that it has been called the "organs of sense" or the "eyes of the accelerator." Beam diagnostics is a rich field, and a great variety of physical effects and principles are made use of in it. Some devices are based on the electro-magnetic influence of moving charges, such as Faraday cups, beam transformers, and pick-ups; some are related to the Coulomb interaction of charged particles with matter, such as scintillators, viewing screens, and ionization chambers; nuclear or elementary particle physics interactions happen in some other devices, like beam loss monitors, polarimeters, and luminosity monitors; some measure photons emitted by moving charges, such as transition radiation and synchrotron radiation monitors and diffraction radiation, which is the topic of the first part of this thesis; also, some make use of the interaction of particles with photons, such as laser wires and Compton polarimeters, which are the subject of the second part of this thesis. Diagnostics let us perceive what properties a beam has and how it behaves in a machine, give us guidelines for commissioning and controlling the machine, and provide indispensable parameters vital to physics experiments. In the next two decades, the research highlights will be colliders (TESLA, CLIC, JLC) and fourth-generation light sources (TESLA FEL, LCLS, Spring 8 FEL) based on linear accelerators. These machines require a new generation of accelerator with smaller beams, better stability and greater efficiency. Compared with existing linear accelerators, the performance of the next generation of linear accelerators will be greatly improved in all aspects, such as 10 times smaller horizontal beam size, more than 10 times smaller vertical beam size and a few or more times higher peak power. Furthermore, some special positions in the accelerator have even more stringent requirements, such as the interaction point of colliders and the wiggler of free electron lasers. Higher performance of these accelerators increases the

  3. Combined retrievals of boreal forest fire aerosol properties with a polarimeter and lidar

    Directory of Open Access Journals (Sweden)

    K. Knobelspiesse

    2011-07-01

    Full Text Available Absorbing aerosols play an important, but uncertain, role in the global climate. Much of this uncertainty is due to a lack of adequate aerosol measurements. While great strides have been made in observational capability in the previous years and decades, it has become increasingly apparent that this development must continue. Scanning polarimeters have been designed to help resolve this issue by making accurate, multi-spectral, multi-angle polarized observations. This work involves the use of the Research Scanning Polarimeter (RSP). The RSP was designed as the airborne prototype for the Aerosol Polarimetry Sensor (APS), which was due to be launched as part of the (ultimately failed) NASA Glory mission. Field observations with the RSP, however, have established that simultaneous retrievals of aerosol absorption and vertical distribution over bright land surfaces are quite uncertain. We test a merger of RSP and High Spectral Resolution Lidar (HSRL) data with observations of boreal forest fire smoke, collected during the Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS). During ARCTAS, the RSP and HSRL instruments were mounted on the same aircraft, and validation data were provided by instruments on an aircraft flying a coordinated flight pattern. We found that the lidar data did indeed improve aerosol retrievals using an optimal estimation method, although not primarily because of the constraints imposed on the aerosol vertical distribution. The more useful piece of information from the HSRL was the total column aerosol optical depth, which was used to select the initial value (optimization starting point) of the aerosol number concentration. When ground based sun photometer network climatologies of number concentration were used as an initial value, we found that roughly half of the retrievals had unrealistic sizes and imaginary indices, even though the retrieved spectral optical depths agreed within

  4. Combined Retrievals of Boreal Forest Fire Aerosol Properties with a Polarimeter and Lidar

    Science.gov (United States)

    Knobelspiesse, K.; Cairns, B.; Ottaviani, M.; Ferrare, R.; Haire, J.; Hostetler, C.; Obland, M.; Rogers, R.; Redemann, J.; Shinozuka, Y.; hide

    2011-01-01

    Absorbing aerosols play an important, but uncertain, role in the global climate. Much of this uncertainty is due to a lack of adequate aerosol measurements. While great strides have been made in observational capability in the previous years and decades, it has become increasingly apparent that this development must continue. Scanning polarimeters have been designed to help resolve this issue by making accurate, multi-spectral, multi-angle polarized observations. This work involves the use of the Research Scanning Polarimeter (RSP). The RSP was designed as the airborne prototype for the Aerosol Polarimetry Sensor (APS), which was due to be launched as part of the (ultimately failed) NASA Glory mission. Field observations with the RSP, however, have established that simultaneous retrievals of aerosol absorption and vertical distribution over bright land surfaces are quite uncertain. We test a merger of RSP and High Spectral Resolution Lidar (HSRL) data with observations of boreal forest fire smoke, collected during the Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS). During ARCTAS, the RSP and HSRL instruments were mounted on the same aircraft, and validation data were provided by instruments on an aircraft flying a coordinated flight pattern. We found that the lidar data did indeed improve aerosol retrievals using an optimal estimation method, although not primarily because of the constraints imposed on the aerosol vertical distribution. The more useful piece of information from the HSRL was the total column aerosol optical depth, which was used to select the initial value (optimization starting point) of the aerosol number concentration. When ground based sun photometer network climatologies of number concentration were used as an initial value, we found that roughly half of the retrievals had unrealistic sizes and imaginary indices, even though the retrieved spectral optical depths agreed within uncertainties to

  5. Concept and realization of the A4 Compton backscattering polarimeter at MAMI

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jeong Han

    2008-12-15

    The main concern of the A4 parity violation experiment at the Mainzer Microtron accelerator facility is to study the electric and magnetic contributions of strange quarks to the charge and magnetism of the nucleons in the low momentum transfer region. More precisely, the A4 collaboration investigates the strange quarks' contribution to the electric and magnetic vector form factors of the nucleons. Thus, it is important that the A4 experiment uses an adequate and precise non-destructive online monitoring tool for the electron beam polarization when measuring single spin asymmetries in elastic scattering of polarized electrons from unpolarized nucleons. As a consequence, the A4 Compton backscattering polarimeter was designed and installed such that an absolute measurement of the electron beam polarization can be taken without interrupting the parity violation experiment. The present study describes the development of an electron beam line, called the chicane, for the A4 Compton backscattering polarimeter. The chicane is an electron beam transport line and provides an interaction region where the electron beam and the laser beam overlap. After carefully studying the properties of the beam line components, we developed an electron beam control system that establishes the overlap between the electron beam and the laser beam. Using this system, we can easily achieve the beam overlap in a short time. The electron beam control system, whose performance is outstanding, is being used in production beam times. The study also presents the development of a scintillating fiber electron detector that reduces the statistical error in the electron polarization measurement. We completely redesigned the scintillating fiber detector. The data taken during a 2008 beam time show a large background suppression, approximately 80 percent, while leaving the Compton spectra almost unchanged when a coincidence between the fiber detector and the photon detector is used. Thus, the

  6. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  7. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows analyzing an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, marking all pathologies on the images and reporting their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...

  8. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  9. First Results from BISTRO: A SCUBA-2 Polarimeter Survey of the Gould Belt

    Energy Technology Data Exchange (ETDEWEB)

    Ward-Thompson, Derek; Pattle, Kate; Kirk, Jason M. [Jeremiah Horrocks Institute, University of Central Lancashire, Preston PR1 2HE (United Kingdom); Bastien, Pierre; Coudé, Simon [Centre de recherche en astrophysique du Québec and département de physique, Université de Montréal, C.P. 6128, Succ. Centre-ville, Montréal, QC, H3C 3J7 (Canada); Furuya, Ray S. [Tokushima University, Minami Jousanajima-machi 1-1, Tokushima 770-8502 (Japan); Kwon, Woojin; Choi, Minho; Hoang, Thiem [Korea Astronomy and Space Science Institute, 776 Daedeokdae-ro, Yuseong-gu, Daejeon 34055 (Korea, Republic of); Lai, Shih-Ping [Institute of Astronomy and Department of Physics, National Tsing Hua University, Hsinchu 30013, Taiwan (China); Qiu, Keping [School of Astronomy and Space Science, Nanjing University, 163 Xianlin Avenue, Nanjing 210023 (China); Berry, David; Friberg, Per; Graves, Sarah F. [East Asian Observatory, 660 N. A‘ohōkū Place, University Park, Hilo, HI 96720 (United States); Francesco, James Di; Johnstone, Doug [NRC Herzberg Astronomy and Astrophysics, 5071 West Saanich Road, Victoria, BC V9E 2E7 (Canada); Franzmann, Erica [Department of Physics and Astronomy, The University of Manitoba, Winnipeg, Manitoba R3T2N2 (Canada); Greaves, Jane S. [School of Physics and Astronomy, Cardiff University, The Parade, Cardiff, CF24 3AA (United Kingdom); Houde, Martin [Department of Physics and Astronomy, The University of Western Ontario, 1151 Richmond Street, London N6A 3K7 (Canada); Koch, Patrick M., E-mail: dward-thompson@uclan.ac.uk, E-mail: kmpattle@uclan.ac.uk, E-mail: jmkirk@uclan.ac.uk, E-mail: spseyres@uclan.ac.uk [Academia Sinica Institute of Astronomy and Astrophysics, P.O. Box 23-141, Taipei 10617, Taiwan (China); and others

    2017-06-10

    We present the first results from the B-fields In STar-forming Region Observations (BISTRO) survey, using the Sub-millimetre Common-User Bolometer Array 2 camera, with its associated polarimeter (POL-2), on the James Clerk Maxwell Telescope in Hawaii. We discuss the survey’s aims and objectives. We describe the rationale behind the survey, and the questions that the survey will aim to answer. The most important of these is the role of magnetic fields in the star formation process on the scale of individual filaments and cores in dense regions. We describe the data acquisition and reduction processes for POL-2, demonstrating both repeatability and consistency with previous data. We present a first-look analysis of the first results from the BISTRO survey in the OMC 1 region. We see that the magnetic field lies approximately perpendicular to the famous “integral filament” in the densest regions of that filament. Furthermore, we see an “hourglass” magnetic field morphology extending beyond the densest region of the integral filament into the less-dense surrounding material, and discuss possible causes for this. We also discuss the more complex morphology seen along the Orion Bar region. We examine the morphology of the field along the lower-density northeastern filament. We find consistency with previous theoretical models that predict magnetic fields lying parallel to low-density, non-self-gravitating filaments, and perpendicular to higher-density, self-gravitating filaments.

  10. Effects of stray lights on Faraday rotation measurement for polarimeter-interferometer system on EAST

    Science.gov (United States)

    Zou, Z. Y.; Liu, H. Q.; Ding, W. X.; Chen, J.; Brower, D. L.; Lian, H.; Wang, S. X.; Li, W. M.; Yao, Y.; Zeng, L.; Jie, Y. X.

    2018-01-01

A double-pass, radially viewing, 11-chord polarimeter-interferometer system has been operated on the Experimental Advanced Superconducting Tokamak (EAST) and provides important current profile information for plasma control. Stray light originating from spurious reflections along the optical path (unwanted reflections from various optical components/mounts and transmissive optical elements such as windows, waveplates, and lenses, as well as the detectors) and direct feedback from the retro-reflector used to realize the double-pass configuration can both contaminate the Faraday rotation measurement. Modulation of the Faraday rotation signal due to interference from multiple reflections is observable when the interferometer phase (plasma density) varies with time. Direct reflection from the detector itself can be suppressed by employing an optical isolator consisting of a λ/4 waveplate and a polarizer positioned in front of the mixer. The Faraday angle oscillation during density ramp-up (or ramp-down) can be reduced from 5°-10° to 1°-2° by eliminating reflections from the detector. Residual modulation arising from misalignment and stray light from other sources must be minimized to achieve accurate measurements of Faraday rotation.
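
    To illustrate the mechanism, the following toy model (a deliberately simplified sketch in Python, not the actual EAST signal chain; the stray-light amplitudes and the 5° rotation angle are hypothetical) adds an un-rotated stray reflection coherently to a beam carrying a fixed Faraday rotation and shows how the apparent rotation angle oscillates as the interferometer phase sweeps, with a modulation that shrinks as the reflection is suppressed.

        import numpy as np

        def apparent_angle(alpha, eps, phase):
            """Toy model: polarization orientation of the combined field when a stray,
            un-rotated reflection (relative amplitude eps, extra optical phase 'phase')
            adds coherently to the main beam carrying Faraday rotation alpha."""
            ex = np.cos(alpha) + eps * np.exp(1j * phase)   # x-component: main beam + stray copy
            ey = np.sin(alpha)                              # y-component: main beam only
            return 0.5 * np.arctan2(2 * np.real(ex * np.conj(ey)),
                                    np.abs(ex) ** 2 - np.abs(ey) ** 2)

        alpha = np.deg2rad(5.0)                    # true Faraday rotation (hypothetical)
        phase = np.linspace(0, 4 * np.pi, 400)     # interferometer phase sweep (density ramp)
        for eps in (0.2, 0.02):                    # before/after suppressing the reflection
            psi = np.degrees(apparent_angle(alpha, eps, phase))
            print(eps, round(psi.max() - psi.min(), 2), "deg peak-to-peak modulation")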

  11. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    Science.gov (United States)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC), the National Astronomical Observatory of Japan (NAOJ), and other partners. The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6 nm) light emitted by the Sun's chromosphere. The polarized spectrum imaged by the CCD cameras will capture information about the local magnetic field, allowing for measurements of magnetic strength and structure. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, read noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of 2.0 ± 0.5 e-/DN, a read noise level less than 25 e-, a dark current level less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.

  12. Squids, snakes, and polarimeters: A new technique for measuring the magnetic moments of polarized beams

    International Nuclear Information System (INIS)

    Cameron, P.R.; Luccio, A.U.; Shea, T.J.; Tsoupas, N.; Goldberg, D.A.

    1997-01-01

    Effective polarimetry at high energies in hadron and lepton synchrotrons has been a long-standing and difficult problem. In synchrotrons with polarized beams it is possible to cause the direction of the polarization vector of a given bunch to alternate at a frequency which is some subharmonic of the rotation frequency. This can result in the presence of lines in the beam spectrum which are due only to the magnetic moment of the beam and which are well removed from the various lines due to the charge of the beam. The magnitude of these lines can be calculated from first principles. They are many orders of magnitude weaker than the Schottky signals. Measurement of the magnitude of one of these lines would be an absolute measurement of beam polarization. For measuring magnetic field, the Superconducting Quantum Interference Device, or squid, is about five orders of magnitude more sensitive than any other transducer. Using a squid, such a measurement might be accomplished with the proper combination of shielding, pickup loop design, and filtering. The resulting instrument would be fast, non-destructive, and comparatively cheap. In addition, techniques developed in the creation of such an instrument could be used to measure the Schottky spectrum in unprecedented detail. We present specifics of a polarimeter design for the Relativistic Heavy Ion Collider (RHIC) and briefly discuss the possibility of using this technique to measure polarization at high-energy electron machines like LEP and HERA. copyright 1997 American Institute of Physics

  13. First Results from BISTRO: A SCUBA-2 Polarimeter Survey of the Gould Belt

    Science.gov (United States)

    Ward-Thompson, Derek; Pattle, Kate; Bastien, Pierre; Furuya, Ray S.; Kwon, Woojin; Lai, Shih-Ping; Qiu, Keping; Berry, David; Choi, Minho; Coudé, Simon; Di Francesco, James; Hoang, Thiem; Franzmann, Erica; Friberg, Per; Graves, Sarah F.; Greaves, Jane S.; Houde, Martin; Johnstone, Doug; Kirk, Jason M.; Koch, Patrick M.; Kwon, Jungmi; Lee, Chang Won; Li, Di; Matthews, Brenda C.; Mottram, Joseph C.; Parsons, Harriet; Pon, Andy; Rao, Ramprasad; Rawlings, Mark; Shinnaga, Hiroko; Sadavoy, Sarah; van Loo, Sven; Aso, Yusuke; Byun, Do-Young; Eswaraiah, Chakali; Chen, Huei-Ru; Chen, Mike C.-Y.; Chen, Wen Ping; Ching, Tao-Chung; Cho, Jungyeon; Chrysostomou, Antonio; Chung, Eun Jung; Doi, Yasuo; Drabek-Maunder, Emily; Eyres, Stewart P. S.; Fiege, Jason; Friesen, Rachel K.; Fuller, Gary; Gledhill, Tim; Griffin, Matt J.; Gu, Qilao; Hasegawa, Tetsuo; Hatchell, Jennifer; Hayashi, Saeko S.; Holland, Wayne; Inoue, Tsuyoshi; Inutsuka, Shu-ichiro; Iwasaki, Kazunari; Jeong, Il-Gyo; Kang, Ji-hyun; Kang, Miju; Kang, Sung-ju; Kawabata, Koji S.; Kemper, Francisca; Kim, Gwanjeong; Kim, Jongsoo; Kim, Kee-Tae; Kim, Kyoung Hee; Kim, Mi-Ryang; Kim, Shinyoung; Lacaille, Kevin M.; Lee, Jeong-Eun; Lee, Sang-Sung; Li, Dalei; Li, Hua-bai; Liu, Hong-Li; Liu, Junhao; Liu, Sheng-Yuan; Liu, Tie; Lyo, A.-Ran; Mairs, Steve; Matsumura, Masafumi; Moriarty-Schieven, Gerald H.; Nakamura, Fumitaka; Nakanishi, Hiroyuki; Ohashi, Nagayoshi; Onaka, Takashi; Peretto, Nicolas; Pyo, Tae-Soo; Qian, Lei; Retter, Brendan; Richer, John; Rigby, Andrew; Robitaille, Jean-François; Savini, Giorgio; Scaife, Anna M. M.; Soam, Archana; Tamura, Motohide; Tang, Ya-Wen; Tomisaka, Kohji; Wang, Hongchi; Wang, Jia-Wei; Whitworth, Anthony P.; Yen, Hsi-Wei; Yoo, Hyunju; Yuan, Jinghua; Zhang, Chuan-Peng; Zhang, Guoyin; Zhou, Jianjun; Zhu, Lei; André, Philippe; Dowell, C. Darren; Falle, Sam; Tsukamoto, Yusuke

    2017-06-01

    We present the first results from the B-fields In STar-forming Region Observations (BISTRO) survey, using the Sub-millimetre Common-User Bolometer Array 2 camera, with its associated polarimeter (POL-2), on the James Clerk Maxwell Telescope in Hawaii. We discuss the survey’s aims and objectives. We describe the rationale behind the survey, and the questions that the survey will aim to answer. The most important of these is the role of magnetic fields in the star formation process on the scale of individual filaments and cores in dense regions. We describe the data acquisition and reduction processes for POL-2, demonstrating both repeatability and consistency with previous data. We present a first-look analysis of the first results from the BISTRO survey in the OMC 1 region. We see that the magnetic field lies approximately perpendicular to the famous “integral filament” in the densest regions of that filament. Furthermore, we see an “hourglass” magnetic field morphology extending beyond the densest region of the integral filament into the less-dense surrounding material, and discuss possible causes for this. We also discuss the more complex morphology seen along the Orion Bar region. We examine the morphology of the field along the lower-density northeastern filament. We find consistency with previous theoretical models that predict magnetic fields lying parallel to low-density, non-self-gravitating filaments, and perpendicular to higher-density, self-gravitating filaments.

  14. The First Multichroic Polarimeter Array on the Atacama Cosmology Telescope: Characterization and Performance

    Science.gov (United States)

    Ho, S. P.; Pappas, C. G.; Austermann, J.; Beall, J. A.; Becker, D.; Choi, S. K.; Datta, R.; Duff, S. M.; Gallardo, P. A.; Grace, E.; hide

    2016-01-01

The Atacama Cosmology Telescope Polarimeter (ACTPol) is a polarization-sensitive receiver for the 6-meter Atacama Cosmology Telescope (ACT) and measures the small angular scale polarization anisotropies in the cosmic microwave background (CMB). The full focal plane is composed of three detector arrays, containing over 3000 transition edge sensor (TES) detectors in total. The first two detector arrays, observing at 146 gigahertz, were deployed in 2013 and 2014, respectively. The third and final array is composed of multichroic pixels sensitive to both 90 and 146 gigahertz and saw first light in February 2015. Fabricated at NIST, this dichroic array consists of 255 pixels, with a total of 1020 polarization-sensitive bolometers, and is coupled to the telescope with a monolithic array of broad-band silicon feedhorns. The detectors are read out using time-division SQUID multiplexing and cooled to 110 millikelvin by a dilution refrigerator. We present an overview of the assembly and characterization of this multichroic array in the lab, and the initial detector performance in Chile. The detector array has a TES detector electrical yield of 85 percent, a total array sensitivity of less than 10 μK√s, and detector time constants and saturation powers suitable for ACT CMB observations.

  15. Scanning polarimeter for measurement of the poloidal magnetic field in a tokamak

    International Nuclear Information System (INIS)

    Wroblewski, D.; Huang, L.K.; Moos, H.W.

    1988-01-01

The internal magnetic field in a magnetically confined plasma may be deduced from the analysis of circular polarization of spectral lines emitted by the plasma. The theory of the measurement and a detailed design of a polarimeter constructed to measure the poloidal field profile in the Texas Experimental Tokamak (TEXT) are presented. The instrument measures the difference between left-hand and right-hand circularly polarized line profiles, a quantity directly proportional to the magnetic field component in the direction of observation. The high throughput of the Fabry-Perot interferometer employed in this design, combined with efficient light-collecting optics and lock-in detection of the polarization signal, allows measurement of the fractional circular polarization of the magnetic dipole line Ti XVII 3834 Å with an accuracy on the order of 10^-3. The line-of-sight averaged poloidal field is determined with an uncertainty as small as 50 G. The line emission used in the present measurement is not well localized in the plasma, necessitating the use of a spatial inversion procedure to obtain the local values of the field.
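
    In the weak-field limit, the proportionality stated above between the circular-polarization profile and the line-of-sight field can be written in the standard longitudinal Zeeman form (quoted here for orientation only; it is a textbook relation, not necessarily the exact expression used in the TEXT analysis):

        V(\lambda) \simeq -4.67\times 10^{-13}\, g_{\mathrm{eff}}\, \lambda_0^{2}\, B_{\parallel}\, \frac{dI}{d\lambda},

    with \lambda_0 the line wavelength in angstroms, B_{\parallel} the field component along the line of sight in gauss, and g_{\mathrm{eff}} the effective Landé factor, so that the circular-polarization signal is directly proportional to B_{\parallel}.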

  16. The Primordial Inflation Explorer (PIXIE): A Nulling Polarimeter for Cosmic Microwave Background Observations

    Science.gov (United States)

    Kogut, Alan J.; Fixsen, D. J.; Chuss, D. T.; Dotson, J.; Dwek, E.; Halpern, M.; Hinshaw, G. F.; Meyer, S. M.; Moseley, S. H.; Seiffert, M. D.; hide

    2011-01-01

The Primordial Inflation Explorer (PIXIE) is a concept for an Explorer-class mission to measure the gravity-wave signature of primordial inflation through its distinctive imprint on the linear polarization of the cosmic microwave background. The instrument consists of a polarizing Michelson interferometer configured as a nulling polarimeter to measure the difference spectrum between orthogonal linear polarizations from two co-aligned beams. Either input can view the sky or a temperature-controlled absolute reference blackbody calibrator. The proposed instrument can map the absolute intensity and linear polarization (Stokes I, Q, and U parameters) over the full sky in 400 spectral channels spanning 2.5 decades in frequency from 30 GHz to 6 THz (1 cm to 50 micron wavelength). Multi-moded optics provide background-limited sensitivity using only 4 detectors, while the highly symmetric design and multiple signal modulations provide robust rejection of potential systematic errors. The principal science goal is the detection and characterization of linear polarization from an inflationary epoch in the early universe, with tensor-to-scalar ratio r < 10^-3 at 5 standard deviations. The rich PIXIE data set can also constrain physical processes ranging from Big Bang cosmology to the nature of the first stars to physical conditions within the interstellar medium of the Galaxy.

  17. Software Reviews.

    Science.gov (United States)

    Science Software Quarterly, 1984

    1984-01-01

    Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…

  18. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  19. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - the dynamic response to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  20. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  1. MEASUREMENT OF LOW ENERGY DETECTION EFFICIENCY OF A PLASTIC SCINTILLATOR: IMPLICATIONS ON THE LOWER ENERGY LIMIT AND SENSITIVITY OF A HARD X-RAY FOCAL PLANE COMPTON POLARIMETER

    International Nuclear Information System (INIS)

    Chattopadhyay, T.; Vadawale, S. V.; Shanmugam, M.; Goyal, S. K.

    2014-01-01

Polarization measurements in X-rays offer a unique opportunity for the study of physical processes under the extreme conditions prevalent at compact X-ray sources, including gravitation, magnetic field, and temperature. Unfortunately, there has been no real progress in observational X-ray polarimetry thus far. Although photoelectron tracking-based X-ray polarimeters provide realistic prospects of polarimetric observations, they are effective only in the soft X-ray band. With the advent of hard X-ray optics, it has become possible to design sensitive hard X-ray polarimeters based on Compton scattering. An important point that should be carefully considered for Compton polarimeters is the lower energy threshold of the active scatterer, which typically consists of a plastic scintillator because of its very low effective atomic number. Therefore, an accurate understanding of the plastic scintillator's energy threshold is essential to make a realistic estimate of the energy range and sensitivity of any Compton polarimeter. In this context, we set up an experiment to investigate the behavior of a plastic scintillator for very low energy deposition events. The experiment involves the detection of Compton-scattered photons from a long, thin plastic scintillator (a configuration similar to that of the eventual Compton polarimeter) by a high-resolution CdTe detector at different scattering angles. We find that it is possible to detect energy depositions well below 1 keV, though with decreasing efficiency. We present detailed semianalytical modeling of our experimental setup and discuss the results in the context of the energy range and sensitivity of Compton polarimeters involving plastic scintillators.
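
    The kinematic reason the scatterer threshold matters can be made concrete with the standard Compton formula: the energy left in the plastic by a hard X-ray photon is only a small fraction of the photon energy. The short sketch below (Python; the example energies are illustrative, not values taken from the experiment) evaluates the recoil-electron energy deposited in the scatterer as a function of scattering angle.

        import numpy as np

        ME_KEV = 511.0  # electron rest energy in keV

        def scattered_energy(e_kev, theta):
            """Photon energy after Compton scattering through angle theta (radians)."""
            return e_kev / (1.0 + (e_kev / ME_KEV) * (1.0 - np.cos(theta)))

        def deposited_energy(e_kev, theta):
            """Energy transferred to the recoil electron in the scatterer (keV)."""
            return e_kev - scattered_energy(e_kev, theta)

        # energy left in the plastic for 90-degree scattering of hard X-rays:
        # at 25 keV it is barely above 1 keV, hence the importance of a sub-keV threshold
        for e in (25, 50, 100, 200):
            print(e, "keV ->", round(deposited_energy(e, np.pi / 2), 2), "keV deposited")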

  2. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP)

    Science.gov (United States)

    Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtin, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike

    2014-01-01

The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E allows for a relatively high (30%) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤10 e-/pixel/second dark current, ≤25 e- read noise, a gain of 2.0, and ≤0.1% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: dark current, read noise, camera gain and residual non-linearity.
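
    As an illustration of how such camera parameters can be measured, the sketch below implements the generic photon-transfer (mean-variance) estimate of gain and read noise from pairs of flat-field and bias frames. This is a common laboratory technique shown only as a hedged example; it is not claimed to be the specific CLASP test procedure, and the function and variable names are hypothetical.

        import numpy as np

        def gain_and_read_noise(flat_a, flat_b, bias_a, bias_b):
            """Photon-transfer (mean-variance) estimates from two flats and two biases.

            flat_a, flat_b : flat-field frames at identical illumination [DN]
            bias_a, bias_b : bias (zero-exposure) frames [DN]
            Differencing paired frames removes fixed-pattern structure.
            """
            signal = 0.5 * (flat_a.mean() + flat_b.mean()) - 0.5 * (bias_a.mean() + bias_b.mean())
            var_flat = np.var(flat_a - flat_b) / 2.0     # shot noise + read noise [DN^2]
            var_bias = np.var(bias_a - bias_b) / 2.0     # read noise only [DN^2]
            gain = signal / (var_flat - var_bias)        # [e-/DN], from Poisson statistics
            read_noise = gain * np.sqrt(var_bias)        # [e- rms]
            return gain, read_noise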

  3. Use of the Far Infrared Tangential Interferometer/Polarimeter diagnostic for the study of rf driven plasma waves on NSTX.

    Science.gov (United States)

    Kim, J; Lee, K C; Kaita, R; Phillips, C K; Domier, C W; Valeo, E; Luhmann, N C; Bonoli, P T; Park, H

    2010-10-01

An rf detection system for waves in the 30 MHz range has been constructed for the Far Infrared Tangential Interferometer/Polarimeter on the National Spherical Torus Experiment (NSTX). It is aimed at monitoring high frequency density fluctuations driven by 30 MHz high harmonic fast wave fields. The levels of density fluctuations at various radial chords and antenna phase angles can be estimated using the electric field calculated by the TORIC code and the linearized continuity equation for the electron density. In this paper, the experimental arrangement for the detection of the rf signal and preliminary simulation results are discussed.

  4. Development and manufacturing of panoramic Stokes polarimeter using the polarization films in the Main Astronomical Observatory of NAS of Ukraine

    Science.gov (United States)

    Vidmachenko, A. P.; Ivanov, Yu. S.; Syniavskyi, I. I.; Sergeev, A. V.

    2015-08-01

In the Main Astronomical Observatory of NAS of Ukraine the concept of an imaging Stokes polarimeter has been proposed and implemented [1-5]. This device allows measurements of the four Stokes vector components to be carried out simultaneously, over a wide field, and without any restrictions on the relative aperture of the optical system. Its scheme is designed so that, simply by turning a wheel with replaceable elements, the photopolarimeter can be transformed into a low-resolution spectropolarimeter. The device has four film polarizers with position angles of 0°, 45°, 90°, and 135°, and uses a system of special deflecting prisms in each channel. These prisms are achromatized over the spectral range of 420-850 nm [2], and the distortion of the polarimeter optical system is less than 0.65%. The manufactured version of the spectropolarimeter provides for the use of a transmission diffraction grating with a frequency of up to 100 lines/mm. Laboratory testing of the instrument has begun. References. 1. Sinyavskii I. I., Ivanov Yu. S., Vidmachenko A. P., Karpov N. V. Panoramic Stokes-polarimeter // Ecological Bulletin of Research Centers of the Black Sea Economic Cooperation. - 2013. - V. 3, No 4. - P. 123-127. 2. Sinyavskii I. I., Ivanov Yu. S., Vil'machenko A. P. Concept of the construction of the optical setup of a panoramic Stokes polarimeter for small telescopes // Journal of Optical Technology. - 2013. - V. 80, Issue 9. - P. 545-548. 3. Vidmachenko A. P., Ivanov Yu. S., Morozhenko A. V., Nevodovsky E. P., Syniavskyi I. I., Sosonkin M. G. Spectropolarimeter of ground-based accompanying for the space experiment "Planetary Monitoring" // Kosmichna Nauka i Tekhnologiya. - 2007. - V. 13, No. 1. - P. 63-70. 4. Yatskiv Ya. S., Vidmachenko A. P., Morozhenko A. V., Sosonkin M. G., Ivanov Yu. S., Syniavskyi I. I. Spectropolarimetric device for overatmospheric investigations of Solar System bodies // Kosmichna Nauka i Tekhnologiya. - 2008. - V. 14, No. 2. - P. 56

  5. Correlation lifetimes of quiet and magnetic granulation from the SOUP instrument on Spacelab 2. [Solar Optical Universal Polarimeter

    Science.gov (United States)

    Title, A.; Tarbell, T.; Topka, K.; Acton, L.; Duncan, D.

    1988-01-01

Time sequences of diffraction-limited granulation images obtained by the Solar Optical Universal Polarimeter on Spacelab 2 are presented. The uncorrected autocorrelation lifetime in magnetic regions is dominated by the 5-min oscillation. The removal of this oscillation causes the autocorrelation lifetime to increase by more than a factor of 2. The results suggest that a significant fraction of granule lifetimes are terminated by nearby explosions. Horizontal displacements and transverse velocities in the intensity field are measured. Lower limits to the lifetime in the quiet and magnetic Sun are set at 440 s and 950 s, respectively.

  6. Preliminary Neutronics Analysis of the ITER Toroidal Interferometer and Polarimeter Diagnostic Corner Cube Retroreflectors

    Energy Technology Data Exchange (ETDEWEB)

    Tresemer, K. R.

    2015-07-01

ITER is an international project under construction in France that will demonstrate nuclear fusion at a power plant-relevant scale. The Toroidal Interferometer and Polarimeter (TIP) Diagnostic will be used to measure the plasma electron line density along 5 laser-beam chords. This line-averaged density measurement will be an input to the ITER feedback-control system. The TIP is considered the primary diagnostic for these measurements, which are needed for basic ITER machine control; therefore, system reliability and accuracy are critical elements in TIP's design. There are two major challenges to the reliability of the TIP system: first, the survivability and performance of the in-vessel optics, and second, maintaining optical alignment over long optical paths and large vessel movements. Both of these issues depend greatly on minimizing the overall distortion due to neutron and gamma heating of the Corner Cube Retroreflectors (CCRs). These are small optical mirrors embedded in five first-wall locations around the vacuum vessel, corresponding to certain plasma tangency radii. During the development of the design and location of these CCRs, several iterations of neutronics analyses were performed to determine and minimize the total distortion due to nuclear heating of the CCRs. The CCR corresponding to TIP Channel 2 was chosen for analysis as a representative middle case, being an average distance from the plasma (among the five channels) and having moderate neutron shielding from its blanket shield housing. Results show that Channel 2 meets the requirements of the TIP Diagnostic, but only barely. These results suggest other CCRs might be at risk of exceeding the allowable thermal deformation due to nuclear heating.

  7. Remote Sensing of Cloud Top Heights Using the Research Scanning Polarimeter

    Science.gov (United States)

    Sinclair, Kenneth; van Diedenhoven, Bastiaan; Cairns, Brian; Yorks, John; Wasilewski, Andrzej

    2015-01-01

Clouds cover roughly two thirds of the globe and act as an important regulator of Earth's radiation budget. Of these, multilayered clouds occur about half of the time and are predominantly two-layered. Changes in cloud top height (CTH) have been predicted by models to have a globally averaged positive feedback; however, observed changes in CTH have shown uncertain results. Additional CTH observations are necessary to better understand and quantify the effect. Improved CTH observations will also allow for improved sub-grid parameterizations in large-scale models, and accurate CTH information is important when studying variations in freezing point and cloud microphysics. NASA's airborne Research Scanning Polarimeter (RSP) is able to measure cloud top height using a novel multi-angular contrast approach. The RSP scans along the aircraft track and obtains measurements at 152 viewing angles at any aircraft location. The approach presented here aggregates measurements from multiple scans to a single location at cloud altitude using a correlation function designed to identify the location-distinct features in each scan. During NASA's SEAC4RS air campaign, the RSP was mounted on the ER-2 aircraft along with the Cloud Physics Lidar (CPL), which made simultaneous measurements of CTH. The RSP's unique method of determining CTH is presented. The capabilities of using single channels and combinations of channels within the approach are investigated. A detailed comparison of RSP-retrieved CTHs with those of CPL reveals the accuracy of the approach. Results indicate a strong ability for the RSP to accurately identify cloud heights. Interestingly, the analysis reveals an ability for the approach to identify multiple cloud layers in a single scene and estimate the CTH of each layer. Capabilities and limitations of identifying single and multiple cloud layer heights are explored. Special focus is given to sources of error in the method including optically thin clouds, physically thick clouds, multi

  8. Performance measurement of HARPO: A time projection chamber as a gamma-ray telescope and polarimeter

    Science.gov (United States)

    Gros, P.; Amano, S.; Attié, D.; Baron, P.; Baudin, D.; Bernard, D.; Bruel, P.; Calvet, D.; Colas, P.; Daté, S.; Delbart, A.; Frotin, M.; Geerebaert, Y.; Giebels, B.; Götz, D.; Hashimoto, S.; Horan, D.; Kotaka, T.; Louzir, M.; Magniette, F.; Minamiyama, Y.; Miyamoto, S.; Ohkuma, H.; Poilleux, P.; Semeniouk, I.; Sizun, P.; Takemoto, A.; Yamaguchi, M.; Yonamine, R.; Wang, S.

    2018-01-01

    We analyse the performance of a gas time projection chamber (TPC) as a high-performance gamma-ray telescope and polarimeter in the e+e- pair-creation regime. We use data collected at a gamma-ray beam of known polarisation. The TPC provides two orthogonal projections (x, z) and (y, z) of the tracks induced by each conversion in the gas volume. We use a simple vertex finder in which vertices and pseudo-tracks exiting from them are identified. We study the various contributions to the single-photon angular resolution using Monte Carlo simulations, compare them with the experimental data and find that they are in excellent agreement. The distribution of the azimuthal angle of pair conversions shows a bias due to the non-cylindrical-symmetric structure of the detector. This bias would average out for a long duration exposure on a space mission, but for this pencil-beam characterisation we have ensured its accurate simulation by a double systematics-control scheme, data taking with the detector rotated at several angles with respect to the beam polarisation direction and systematics control with a non-polarised beam. We measure, for the first time, the polarisation asymmetry of a linearly polarised gamma-ray beam in the low energy pair-creation regime. This sub-GeV energy range is critical for cosmic sources as their spectra are power laws which fall quickly as a function of increasing energy. This work could pave the way to extending polarised gamma-ray astronomy beyond the MeV energy regime.
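
    To make the polarisation-asymmetry measurement concrete, the sketch below fits the standard azimuthal modulation model, N(φ) ∝ 1 + A cos 2(φ − φ0), to a synthetic event histogram. The bin counts, amplitudes and the implied analysing power are all hypothetical, and the real analysis includes the detector-rotation and unpolarised-beam systematics controls described above.

        import numpy as np
        from scipy.optimize import curve_fit

        def modulation(phi, n0, a, phi0):
            """Expected azimuthal distribution of pair-conversion events:
            a flat term plus a cos(2*phi) modulation tied to the beam polarisation."""
            return n0 * (1.0 + a * np.cos(2.0 * (phi - phi0)))

        # phi: bin centres of the azimuthal-angle histogram (radians); counts: synthetic data
        phi = np.linspace(-np.pi, np.pi, 36, endpoint=False) + np.pi / 36
        counts = modulation(phi, 100.0, 0.1, 0.3) + np.random.default_rng(0).normal(0, 3, phi.size)

        popt, pcov = curve_fit(modulation, phi, counts, p0=(counts.mean(), 0.05, 0.0))
        n0_fit, a_fit, phi0_fit = popt
        # dividing a_fit by the (simulation-calibrated) analysing power gives the polarisation fraction
        print(a_fit, phi0_fit)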

  9. THE IMAGING PROPERTIES OF THE GAS PIXEL DETECTOR AS A FOCAL PLANE POLARIMETER

    International Nuclear Information System (INIS)

    Fabiani, S.; Costa, E.; Del Monte, E.; Muleri, F.; Soffitta, P.; Rubini, A.; Bellazzini, R.; Brez, A.; De Ruvo, L.; Minuti, M.; Pinchera, M.; Sgró, C.; Spandre, G.; Spiga, D.; Tagliaferri, G.; Pareschi, G.; Basso, S.; Citterio, O.; Burwitz, V.; Burkert, W.

    2014-01-01

X-rays are particularly suited to probing the physics of extreme objects. However, despite the enormous improvements of X-ray astronomy in imaging, spectroscopy, and timing, polarimetry remains largely unexplored. We propose the photoelectric polarimeter Gas Pixel Detector (GPD) as a candidate instrument to fill the gap created by more than 30 yr without measurements. The GPD, in the focus of a telescope, will increase the sensitivity by orders of magnitude. Moreover, since it can measure the energy, the position, the arrival time, and the polarization angle of every single photon, it allows us to perform polarimetry of subsets of data singled out from the spectrum, the light curve, or an image of the source. The GPD has an intrinsic, very fine imaging capability, and in this work we report on the calibration campaign carried out in 2012 at the PANTER X-ray testing facility of the Max-Planck-Institut für extraterrestrische Physik in Garching (Germany), in which, for the first time, we coupled it with a JET-X optics module with a focal length of 3.5 m and an angular resolution of 18 arcsec at 4.5 keV. This configuration was proposed in 2012 aboard the X-ray Imaging Polarimetry Explorer (XIPE) in response to the ESA call for a small mission. We derive the imaging and polarimetric performance for extended sources like pulsar wind nebulae and supernova remnants as case studies for the XIPE configuration, and also discuss possible improvements from coupling the detector with advanced optics that have a finer angular resolution and larger effective areas to study extended objects in more detail.

  10. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  11. Calibration and performance studies of the balloon-borne hard X-ray polarimeter PoGO+

    Science.gov (United States)

    Chauvin, M.; Friis, M.; Jackson, M.; Kawano, T.; Kiss, M.; Mikhalev, V.; Ohashi, N.; Stana, T.; Takahashi, H.; Pearce, M.

    2017-07-01

    Polarimetric observations of celestial sources in the hard X-ray band stand to provide new information on emission mechanisms and source geometries. PoGO+ is a Compton scattering polarimeter (20-150 keV) optimised for the observation of the Crab (pulsar and wind nebula) and Cygnus X-1 (black hole binary), from a stratospheric balloon-borne platform launched from the Esrange Space Centre in summer 2016. Prior to flight, the response of the polarimeter has been studied with polarised and unpolarised X-rays allowing a Geant4-based simulation model to be validated. The expected modulation factor for Crab observations is found to be MCrab = (41.75 ± 0.85) % , resulting in an expected Minimum Detectable Polarisation (MDP) of 7.3% for a 7 day flight. This will allow a measurement of the Crab polarisation parameters with at least 5 σ statistical significance assuming a polarisation fraction ∼ 20 % - a significant improvement over the PoGOLite Pathfinder mission which flew in 2013 and from which the PoGO+ design is developed.
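
    The quoted MDP follows from the standard expression for the minimum detectable polarisation at 99% confidence. The sketch below evaluates it for illustrative count rates; the source and background rates shown are hypothetical, not the PoGO+ values, so it is not expected to reproduce the 7.3% figure.

        import math

        def mdp99(mod_factor, src_rate, bkg_rate, t_seconds):
            """Minimum Detectable Polarisation at 99% confidence.

            mod_factor : modulation factor for a 100% polarised source (0..1)
            src_rate   : source count rate [counts/s]
            bkg_rate   : background count rate [counts/s]
            t_seconds  : observation time [s]
            """
            return 4.29 / (mod_factor * src_rate) * math.sqrt((src_rate + bkg_rate) / t_seconds)

        # hypothetical rates with the quoted modulation factor and a 7-day observation
        print(mdp99(0.4175, 0.3, 0.3, 7 * 24 * 3600))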

  12. Measurement errors resulted from misalignment errors of the retarder in a rotating-retarder complete Stokes polarimeter.

    Science.gov (United States)

    Dai, Hu; Yan, Changxiang

    2014-05-19

Rotatable retarder fixed polarizer (RRFP) Stokes polarimeters, which employ uniformly spaced angles over 180° or 360°, are most commonly used to detect the state of polarization (SOP) of an electromagnetic (EM) wave. The misalignment error of the retarder is one of the major error sources. We suppose that the misalignment errors of the retarder obey a uniform normal distribution and are independent of each other. Then, we derive analytically the covariance matrices of the measurement errors. Based on the covariance matrices derived, we can conclude that 1) the measurement errors are independent of the incident intensity s0, but depend strongly on the Stokes parameters (s1, s2, s3) and the retardance of the retarder δ; 2) for any mean incident SOP, the optimal initial angle and retardance that minimize the measurement error can both be found; 3) when N = 5, 10, 12, the initial orienting angle could be used as an added degree of freedom to strengthen the immunity of RRFP Stokes polarimeters to the misalignment error. Finally, a series of simulations are performed to verify these theoretical results.
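
    As a concrete picture of how an RRFP polarimeter recovers the SOP from intensities taken at uniformly spaced retarder angles, the sketch below builds the Mueller-matrix forward model for a rotating retarder followed by a fixed horizontal polarizer and inverts it by least squares. This is a generic, textbook-style sketch (the sign conventions, the N = 16 angles and the quarter-wave retardance are illustrative choices), not the error analysis of the paper; retarder misalignment can be mimicked simply by perturbing the angles passed to the forward model.

        import numpy as np

        def rot(t):
            """Mueller rotation matrix for a rotation by angle t (radians)."""
            c, s = np.cos(2 * t), np.sin(2 * t)
            return np.array([[1, 0, 0, 0],
                             [0, c, s, 0],
                             [0, -s, c, 0],
                             [0, 0, 0, 1]])

        def retarder(theta, delta):
            """Linear retarder with fast axis at theta and retardance delta."""
            m = np.array([[1, 0, 0, 0],
                          [0, 1, 0, 0],
                          [0, 0, np.cos(delta), np.sin(delta)],
                          [0, 0, -np.sin(delta), np.cos(delta)]])
            return rot(-theta) @ m @ rot(theta)

        POL_H = 0.5 * np.array([[1, 1, 0, 0],       # ideal horizontal linear polarizer
                                [1, 1, 0, 0],
                                [0, 0, 0, 0],
                                [0, 0, 0, 0]])

        def measurement_matrix(angles, delta):
            """Row k maps the input Stokes vector to the intensity at retarder angle k."""
            return np.array([(POL_H @ retarder(a, delta))[0] for a in angles])

        # N angles uniformly spaced over 180 degrees, quarter-wave retarder
        angles = np.linspace(0.0, np.pi, 16, endpoint=False)
        A = measurement_matrix(angles, np.pi / 2)

        s_true = np.array([1.0, 0.3, -0.2, 0.1])              # example input Stokes vector
        intensities = A @ s_true                               # simulated (noiseless) readings
        s_hat, *_ = np.linalg.lstsq(A, intensities, rcond=None)
        print(np.round(s_hat, 6))                              # recovers s_true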

  13. Using a polarizing film in the manufacture of panoramic Stokes polarimeters at the Main Astronomical Observatory of the National Academy of Sciences of Ukraine

    Science.gov (United States)

    Sinyavskiy, I. I.; Ivanov, Yu. S.; Vidmachenko, A. P.; Sergeev, A. V.

    2013-09-01

MAO of NASU proposed and implemented the concept [1] of an imaging Stokes polarimeter, which allows four components of the Stokes vector to be measured at the same time, over a wide field, and without restrictions on the relative aperture of the system. The polarimeter can be converted into a low-resolution spectropolarimeter by rotating a wheel with replaceable elements. For full utilization of the CCD area, four film polarizers with position angles of 0°, 45°, 90°, and 135° are installed in the device. Each channel contains a system of special deflecting prisms, achromatized for the spectral range 420-850 nm [2]. Distortion is less than 0.65%. A transmission diffraction grating with a frequency of up to 100 lines/mm can also be used. References. 1. Sinyavskii I. I., Ivanov Yu. S., Vidmachenko A. P., Karpov N. V. Panoramic Stokes polarimeter // Ecological Bulletin of Research Centers of the Black Sea Economic Cooperation, ISSN: 1729-5459. - 2013. - V. 3, No 4. - P. 123-127. 2. Sinyavskii I. I., Ivanov Yu. S., Vil'machenko A. P. Concept of the construction of the optical setup of a panoramic Stokes polarimeter for small telescopes // Journal of Optical Technology. - 2013. - V. 80, Issue 9. - P. 545-548.
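
    As a hedged illustration of the simultaneous-channel idea (covering only the linear part of the Stokes vector; recovering all four components requires the additional optical elements of the actual instrument), the sketch below combines the intensities behind ideal polarizers at 0°, 45°, 90° and 135° into I, Q, U and the derived degree and angle of linear polarization. All numbers are synthetic.

        import numpy as np

        def linear_stokes(i0, i45, i90, i135):
            """Linear Stokes parameters from four simultaneous polarizer channels."""
            I = 0.5 * (i0 + i45 + i90 + i135)
            Q = i0 - i90
            U = i45 - i135
            dolp = np.hypot(Q, U) / I           # degree of linear polarization
            aolp = 0.5 * np.arctan2(U, Q)       # angle of linear polarization [rad]
            return I, Q, U, dolp, aolp

        # forward-model check: I_ch = 0.5 * (I + Q*cos(2a) + U*sin(2a)) for an ideal polarizer at angle a
        I_in, Q_in, U_in = 1.0, 0.23, 0.19
        chans = [0.5 * (I_in + Q_in * np.cos(2 * a) + U_in * np.sin(2 * a))
                 for a in np.deg2rad([0, 45, 90, 135])]
        print(linear_stokes(*chans))            # recovers I_in, Q_in, U_in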

  14. PEPSI: The high-resolution échelle spectrograph and polarimeter for the Large Binocular Telescope

    Science.gov (United States)

    Strassmeier, K. G.; Ilyin, I.; Järvinen, A.; Weber, M.; Woche, M.; Barnes, S. I.; Bauer, S.-M.; Beckert, E.; Bittner, W.; Bredthauer, R.; Carroll, T. A.; Denker, C.; Dionies, F.; DiVarano, I.; Döscher, D.; Fechner, T.; Feuerstein, D.; Granzer, T.; Hahn, T.; Harnisch, G.; Hofmann, A.; Lesser, M.; Paschke, J.; Pankratow, S.; Plank, V.; Plüschke, D.; Popow, E.; Sablowski, D.

    2015-05-01

    PEPSI is the bench-mounted, two-arm, fibre-fed and stabilized Potsdam Echelle Polarimetric and Spectroscopic Instrument for the 2×8.4 m Large Binocular Telescope (LBT). Three spectral resolutions of either 43 000, 120 000 or 270 000 can cover the entire optical/red wavelength range from 383 to 907 nm in three exposures. Two 10.3k×10.3k CCDs with 9-μm pixels and peak quantum efficiencies of 94-96 % record a total of 92 échelle orders. We introduce a new variant of a wave-guide image slicer with 3, 5, and 7 slices and peak efficiencies between 92-96 %. A total of six cross dispersers cover the six wavelength settings of the spectrograph, two of them always simultaneously. These are made of a VPH-grating sandwiched by two prisms. The peak efficiency of the system, including the telescope, is 15 % at 650 nm, and still 11 % and 10 % at 390 nm and 900 nm, respectively. In combination with the 110 m2 light-collecting capability of the LBT, we expect a limiting magnitude of ≈ 20th mag in V in the low-resolution mode. The R = 120 000 mode can also be used with two, dual-beam Stokes IQUV polarimeters. The 270 000-mode is made possible with the 7-slice image slicer and a 100-μm fibre through a projected sky aperture of 0.74 arcsec, comparable to the median seeing of the LBT site. The 43 000-mode with 12-pixel sampling per resolution element is our bad seeing or faint-object mode. Any of the three resolution modes can either be used with sky fibers for simultaneous sky exposures or with light from a stabilized Fabry-Pérot étalon for ultra-precise radial velocities. CCD-image processing is performed with the dedicated data-reduction and analysis package PEPSI-S4S. Its full error propagation through all image-processing steps allows an adaptive selection of parameters by using statistical inferences and robust estimators. A solar feed makes use of PEPSI during day time and a 500-m feed from the 1.8 m VATT can be used when the LBT is busy otherwise. In this paper, we

  15. Radiation, smoke and clouds observed in the southeastern Atlantic with the Research Scanning Polarimeter

    Science.gov (United States)

    Sinclair, K.; Cairns, B.; Stamnes, S.; Chowdhary, J.; Ferrare, R. A.; Hostetler, C. A.; Burton, S. P.

    2017-12-01

The ObseRvations of Aerosols above Clouds and their interactions (ORACLES) project is making a series of field deployments to the southeastern Atlantic with NASA ER-2 and P3 aircraft to acquire both detailed remote sensing observations and in situ measurements of the aerosols and clouds in that region. This area is home to one of the largest low-level cloud decks on Earth, which is seasonally affected by vast plumes of smoke from biomass burning; in effect, this provides a natural experiment testing the radiative and microphysical interactions between the smoke and the clouds. The downward solar radiation at the surface, or at cloud top, is always reduced by the presence of smoke. However, whether the amount of sunlight reflected back out to space is increased or decreased by the presence of smoke depends sensitively on the brightness of the clouds and the fraction of light that the smoke absorbs each time light hits a smoke particle. In this study we use data from the Research Scanning Polarimeter, an along-track scanning instrument that provides measurements of the Stokes parameters I, Q and U at 410, 470, 555, 670, 865, 960, 1590, 1880 and 2260 nm at 150 viewing angles over a range of ±60° from nadir for each contiguous sub-aircraft pixel (approximately 300 m in size). A retrieval algorithm using a look-up table technique, similar to that of the operational POLDER algorithm, is applied to the acquired data to provide a first guess of the complex refractive index, optical depth and size distribution of the smoke particles, together with cloud droplet size and optical depth. A subsequent iterative fitting procedure, exploiting the fact that the doubling/adding method allows the construction of the Green's function for the radiative transfer equation, is used to obtain an efficient and statistically optimal estimate of the aerosol and cloud retrieval parameters. These retrieval parameters are evaluated against in situ observations, when available, and the optical depth and

  16. Applications of the Hyper Angular Rainbow Polarimeter (HARP) instrument from aircraft and from space

    Science.gov (United States)

    Martins, J. V.; Fernandez Borda, R. A.; McBride, B.; Remer, L. A.; Barbosa, H. M.; Dubovik, O.

    2017-12-01

The remote sensing of aerosol and cloud microphysics is essential for the global assessment of aerosol and cloud properties. Current spectral techniques utilized by MODIS, VIIRS and similar sensors lack details on the retrieval of the cloud and aerosol particle microphysical properties desired by the scientific community. Multi-spectral hyperangular polarization measurements provide enough information for these additional microphysical retrievals. HARP (the HyperAngular Rainbow Polarimeter) is a compact and modular imaging instrument with a wide field of view (94° cross track and up to 114° along track) and up to 60 along-track viewing angles. Spectrally, HARP is envisioned to have modules in the UV, VNIR and SWIR ranges. Currently there are two existing HARP VNIR sensors, for airborne (AirHARP) and space-borne applications respectively, both with 4 wavelengths centered at 440, 550, 670, and 865 nm. The space-borne HARP sensor has been designed for a 3U CubeSat satellite currently scheduled for launch to the International Space Station in January 2018, to be released as a free-flying satellite shortly after. At this orbit HARP will provide a pixel resolution at the ground of about 400 m, which will be binned to coarser resolutions (e.g., 2.5 km) for data-rate reduction. The AirHARP instrument has recently flown on the NASA Langley UC12 aircraft during LMOS (the Lake Michigan Ozone Study), collecting a large data set on aerosol, cloud, and surface properties. AirHARP will also fly in the ACEPOL campaign on board the NASA ER-2 aircraft in October/November 2017. These campaigns support HARP's algorithm development and validation in preparation for HARP's CubeSat launch and possibly other HARP space-borne missions. This presentation will describe details of the HARP and AirHARP instruments, as well as preliminary results with level 1 and level 2 data collected during the LMOS and ACEPOL aircraft campaigns, showing cloud and aerosol retrieval results.

  17. Retrievals of Cloud Droplet Size from the Research Scanning Polarimeter Data: Validation Using in Situ Measurements

    Science.gov (United States)

    Alexandrov, M. D.; Cairns, B.; Sinclair, K.; Wasilewski, A. P.; Ziemba, L. D.; Crosbie, E.; Hair, J. W.; Hu, Y.; Hostetler, C. A.; Stamnes, S.

    2016-12-01

We present comparisons of cloud droplet size distributions retrieved from the Research Scanning Polarimeter (RSP) data with correlative in situ measurements made during the North Atlantic Aerosols and Marine Ecosystems Study (NAAMES). This field experiment was based at St. John's airport, Newfoundland, Canada, with the latest deployment in May - June 2016. RSP was onboard the NASA C-130 aircraft together with an array of in situ and other remote sensing instrumentation. The RSP is an along-track scanner measuring polarized and total reflectances in 9 spectral channels. Its unique high angular resolution allows for characterization of liquid water droplet size using the rainbow structure observed in the polarized reflectances in the scattering angle range between 135 and 165 degrees. A parametric fitting algorithm applied to the polarized reflectances provides retrievals of the droplet effective radius and variance assuming a prescribed size distribution shape (gamma distribution). In addition to this, we use a non-parametric method, the Rainbow Fourier Transform (RFT), which allows us to retrieve the droplet size distribution (DSD) itself. The latter is important in the case of clouds with complex structure, which results in multi-modal DSDs. During NAAMES the aircraft performed a number of flight patterns specifically designed for comparison of remote sensing retrievals and in situ measurements. These patterns consisted of two flight segments above the same straight ground track. One of these segments was flown above clouds allowing for remote sensing measurements, while the other was at the cloud top where cloud droplets were sampled. We compare the DSDs retrieved from the RSP data with in situ measurements made by the Cloud Droplet Probe (CDP). The comparisons show generally good agreement, with deviations explainable by the position of the aircraft within the cloud and by the presence of additional cloud layers in the RSP view that do not contribute to the in situ DSDs. In the
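
    The prescribed gamma size distribution mentioned above is commonly written in the effective-radius/effective-variance form. The short sketch below evaluates that form on a radius grid; the 10 μm / 0.05 values are arbitrary examples, and the exact normalization used in the RSP retrieval may differ.

        import numpy as np

        def gamma_dsd(r, r_eff, v_eff):
            """Gamma droplet size distribution in the effective-radius/effective-variance form:
            n(r) proportional to r**((1 - 3*v_eff)/v_eff) * exp(-r / (r_eff * v_eff))."""
            n = r ** ((1.0 - 3.0 * v_eff) / v_eff) * np.exp(-r / (r_eff * v_eff))
            return n / np.trapz(n, r)                     # normalize to unit area

        r = np.linspace(0.5, 40.0, 500)                   # droplet radius grid [micrometres]
        dsd = gamma_dsd(r, r_eff=10.0, v_eff=0.05)        # example: 10 um effective radius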

  18. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  19. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  20. Development of fast data processing electronics for a stacked x-ray detector system with application as a polarimeter

    Science.gov (United States)

    Maier, Daniel; Dick, Jürgen; Distratis, Giuseppe; Kendziorra, Eckhard; Santangelo, Andrea; Schanz, Thomas; Tenzer, Christoph; Warth, Gabriele

    2012-09-01

We have assembled a stacked setup consisting of a soft and a hard X-ray detector with cooling capability and control, readout, and data processing electronics at the Institut für Astronomie und Astrophysik Tübingen (IAAT). The detector system is a 64×64 DePFET matrix in front of a CdTe Caliste module. The detectors were developed at the Max-Planck Institute Semiconductor Laboratory (HLL) in Neuperlach and the Commissariat à l'Energie Atomique (CEA) in Saclay, respectively. In this combined structure the DePFET detector works as the Low Energy Detector (LED), while the Caliste module serves as the High Energy Detector (HED) and detects only the high-energy photons that have passed through the LED. In this work we present the current status of the setup. Furthermore, an intended application of the detector system as a polarimeter is described.

  1. A solar magnetic and velocity field measurement system for Spacelab 2: The solar optical universal polarimeter (SOUP)

    Science.gov (United States)

    Tarbell, Theodore D.; Title, Alan M.

    1992-08-01

    The Solar Optical Universal Polarimeter flew on the Shuttle Mission Spacelab 2 (STS-51F) in August, 1985, and collected historic solar observations. SOUP is the only solar telescope on either a spacecraft or balloon which has delivered long sequences of diffraction-limited images. These movies led to several discoveries about the solar atmosphere which were published in the scientific journals. After Spacelab 2, reflights were planned on the Space Shuttle Sunlab Mission, which was cancelled after the Challenger disaster, and on balloon flights, which were also cancelled for funding reasons. In the meantime, the instrument was used in a productive program of ground-based observing, which collected excellent scientific data and served as instrument tests. This report gives an overview of the history of the SOUP program, the scientific discoveries, and the instrument design and performance.

  2. The IMaX polarimeter for the solar telescope SUNRISE of the NASA long duration balloon program

    Science.gov (United States)

    Alvarez-Herrero, A.; Martínez-Pillet, V.; Del Toro Iniesta, J. C.; Domingo, V.

    2010-06-01

On June 8th 2009 the SUNRISE mission was successfully launched. This mission consisted of a 1 m aperture solar telescope on board a stratospheric balloon within the Long Duration Balloon NASA program. The flight followed the foreseen circumpolar trajectory over the Arctic, and its duration was 5 days and 17 hours. One of the two postfocal instruments on board was IMaX, the Imaging Magnetograph eXperiment. This instrument is a solar magnetograph: a diffraction-limited imager capable of resolving 100 km on the solar surface and, simultaneously, a high-sensitivity polarimeter. Its novel technologies come from the aerospace field, and IMaX is an important precedent for future space missions such as Solar Orbiter from ESA. Among these novel technologies, the use of Liquid Crystal Variable Retarders (LCVRs) as polarization modulators and a LiNbO3 etalon as a tunable spectral filter are remarkable. Currently the data obtained are being analyzed, and the preliminary results show unprecedented information about the solar dynamics.

  3. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
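
    The following sketch is only an illustration of the runtime-monitoring idea described above, not SAVAnT itself; since the SAGE requirements language is not reproduced here, the constraints are written as plain Python predicates that are checked while the program executes.

        # Hedged sketch of runtime requirements monitoring; SAVAnT/SAGE specifics
        # are not available here, so constraints are ordinary Python predicates.
        def monitored(constraints):
            """Wrap a function so each result is checked against named requirements."""
            def wrap(func):
                def inner(*args, **kwargs):
                    result = func(*args, **kwargs)
                    for name, predicate in constraints.items():
                        if not predicate(result):
                            raise AssertionError("requirement violated: " + name)
                    return result
                return inner
            return wrap

        @monitored({"thrust is non-negative": lambda value: value >= 0.0})
        def compute_thrust(throttle_setting):
            return 10.0 * throttle_setting      # hypothetical monitored routine

        compute_thrust(2.0)                     # passes the constraint check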

  4. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  5. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  6. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  7. Software Metrics: Measuring Haskell

    OpenAIRE

    Ryder, Chris; Thompson, Simon

    2005-01-01

    Software metrics have been used in software engineering as a mechanism for assessing code quality and for targeting software development activities, such as testing or refactoring, at areas of a program that will most benefit from them. Haskell has many tools for software engineering, such as testing, debugging and refactoring tools, but software metrics have mostly been neglected. The work presented in this paper identifies a collection of software metrics for use with Haskell programs. Thes...

  8. Software systems as cities

    OpenAIRE

    Wettel, Richard; Lanza, Michele

    2010-01-01

    Software understanding takes up a large share of the total cost of a software system. The high costs attributed to software understanding activities are caused by the size and complexity of software systems, by the continuous evolution that these systems are subject to, and by the lack of physical presence which makes software intangible. Reverse engineering helps practitioners deal with the intrinsic complexity of software, by providing a broad range of patterns and techniques. One of...

  9. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  10. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Recommend that DoN create a software acquisition specialty, mandate basic schooling for software acquisition specialists, close certain acquisition loopholes that permit poor development practices...

  11. Software Release Management

    National Research Council Canada - National Science Library

    Hoek, Andre van der; Hall, Richard S; Heimbigner, Dennis; Wolf, Alexander L

    1996-01-01

    .... Both developers and users of such software are affected by these complications. Developers need to accurately document complex and changing dependencies among the systems constituting the software...

  12. A new polarimeter scheme based on solid state semiconductors

    Directory of Open Access Journals (Sweden)

    Heiner Castro Gutierrez

    2012-12-01

    Full Text Available A new kind of polarimeter scheme is suggested using solid state semiconductors. The new approach is based on the modulation of the intensities of the beams diffracted by a two-dimensional chiral grating, reported recently. It is demonstrated that at least four intensity measurements of non-equivalent diffracted beams are needed in order to estimate the polarization state of the incident beam. The incident beam azimuth was varied by rotating a linear polarizer mounted on a stepper motor. The intensities of four diffracted beams were measured using a screen, a CCD camera and some algorithms running on a computer. The LabVIEW development environment was used for controlling the hardware and for presenting the results. MATLAB was used for calculating the intensities of the diffracted beams and computing the azimuth of the incident beam. Although both the azimuth and ellipticity should be estimated, the experiments show that only the azimuth estimation yields accurate results; the ellipticity cannot be estimated with precision. The error of the azimuth estimation depends on the variation in the power of the incident beam. It was found that the azimuth estimation is accurate in the ranges [0, 140) and (150, 180] degrees. The cause of the large azimuth errors between 140 and 150 degrees remains unknown.
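
    As a rough illustration of the inversion step described above (not the authors' LabVIEW/MATLAB code), the sketch below recovers a Stokes vector from the intensities of four non-equivalent diffracted beams by least squares; the 4x4 instrument matrix is a hypothetical placeholder that, in practice, would come from calibrating the chiral grating with known polarization states.

        import numpy as np

        # Hypothetical instrument matrix A mapping the incident Stokes vector
        # S = (S0, S1, S2, S3) to the four measured beam intensities, I = A @ S.
        A = np.array([
            [1.0,  0.5,  0.1,  0.2],
            [1.0, -0.4,  0.3, -0.1],
            [1.0,  0.2, -0.5,  0.3],
            [1.0, -0.1, -0.2, -0.4],
        ])

        def stokes_from_intensities(intensities, instrument_matrix=A):
            """Least-squares inversion of I = A @ S; needs >= 4 independent beams."""
            S, *_ = np.linalg.lstsq(instrument_matrix, np.asarray(intensities), rcond=None)
            return S

        def azimuth_deg(S):
            """Polarization azimuth (degrees) from the linear Stokes components."""
            return (0.5 * np.degrees(np.arctan2(S[2], S[1]))) % 180.0

        # Simulate the four beam intensities for a known input and invert them.
        S_true = np.array([1.0, 0.6, 0.3, 0.0])
        S_est = stokes_from_intensities(A @ S_true)
        print(azimuth_deg(S_est))               # ~13.3 degrees, matching S_true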

  13. Software Testing Techniques and Strategies

    OpenAIRE

    Isha,; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades, promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. Further, it describes different software testing techniques and software testing strategies.

  14. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K Caglayan, J.C. Knight, L.D. Lee, D.F...J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software

  15. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  16. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  17. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  18. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  19. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  20. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the software...

  1. Multichannel spin polarimeter for energy- and angle-dispersive photoemission measurements; Vielkanal-Spinpolarimeter fuer energie- und winkeldispersive Photoemissionsmessungen

    Energy Technology Data Exchange (ETDEWEB)

    Kolbe, Michaela

    2011-09-09

    Spin polarization measurements of free electrons have remained challenging since their first realization by Mott. The relevant quantity of a spin polarimeter is its figure of merit, FoM = S^2 I/I_0, with the asymmetry function S and the ratio I/I_0 between scattered and primary intensity. State-of-the-art devices are based on single-channel scattering (spin-orbit or exchange interaction) and are characterized by FoM ≈ 10^-4. On the other hand, modern hemispherical analyzers feature efficient multichannel detection of spin-integral intensity with more than 10^4 data points recorded simultaneously. Comparing spin-resolved and spin-integral electron spectroscopy, we are thus faced with a difference in counting efficiency of 8 orders of magnitude. The present work concentrates on the development and investigation of a novel technique for increasing the efficiency of spin-resolved electron spectroscopy by multichannel detection. The spin detector was integrated in a μ-metal shielded UHV chamber and mounted behind a conventional hemispherical analyzer. The geometry of the electrostatic lens system was determined by electron-optical simulations. The basic concept is the k_parallel-conserving elastic scattering of the (0,0) beam on a W(100) scattering crystal under a 45° impact angle. It could be demonstrated that approximately 960 data points (15 energy and 64 angular points) can be displayed simultaneously on a delayline detector within an energy interval of ≈3 eV. This leads to a two-dimensional figure of merit of FoM_2D = 1.7. Compared to conventional spin detectors, the new type is thus characterized by a gain in efficiency of 4 orders of magnitude. The operational reliability of the new spin polarimeter was proven by measurements on Fe/MgO(100) and O p(1×1)/Fe(100) samples, where results from the literature were reproduced with strongly decreased measuring time. Due to the high intensity it becomes possible to
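
    A back-of-envelope check of the quoted efficiency gain, using only the figures given in the abstract:

        fom_single_channel = 1e-4           # typical single-channel spin detector
        n_points = 15 * 64                  # simultaneous energy x angle data points = 960
        fom_2d = 1.7                        # quoted two-dimensional figure of merit
        gain = fom_2d / fom_single_channel  # = 1.7e4, i.e. roughly 4 orders of magnitude
        print(n_points, gain)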

  2. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  3. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  4. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    ...) of Senate Report 106-50, is given for reference in Table 1-1 of the body of this report. This paper recommends a set of software acquisition and software engineering best practices that addresses the issues raised in the Senate Report...

  5. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    Today, concern for quality has become an international movement. Even though most industrial organizations have now adopted modern quality principles, the software community has continued to rely on testing as the principal quality management method. Different decades have different trends in software engineering.

  6. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  7. Optical Alignment of the Chromospheric Lyman-Alpha SpectroPolarimeter using Sophisticated Methods to Minimize Activities under Vacuum

    Science.gov (United States)

    Giono, G.; Katsukawa, Y.; Ishikawa, R.; Narukage, N.; Kano, R.; Kubo, M.; Ishikawa, S.; Bando, T.; Hara, H.; Suematsu, Y.; hide

    2016-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding-rocket instrument developed at the National Astronomical Observatory of Japan (NAOJ) as part of an international collaboration. The instrument's main scientific goal is to achieve polarization measurements of the Lyman-alpha line at 121.56 nm, emitted from the solar upper chromosphere and transition region, with an unprecedented 0.1% accuracy. For this purpose, the optics are composed of a Cassegrain telescope with a "cold mirror" coating optimized for UV reflection and a dual-channel spectrograph allowing simultaneous observation of the two orthogonal states of polarization. Although the polarization sensitivity is the most important aspect of the instrument, its spatial and spectral resolutions are also crucial for observing chromospheric features and resolving the Lyman-alpha profiles. A precise alignment of the optics is required to ensure these resolutions, but experiments under vacuum conditions are needed since Lyman-alpha is absorbed by air, making alignment experiments difficult. To bypass this issue, we developed methods to align the telescope and the spectrograph separately in visible light. We will explain these methods and present the results of the optical alignment of the CLASP telescope and spectrograph. We will then discuss the combined performance of both parts to derive the expected resolutions of the instrument, and compare them with the flight observations performed on September 3rd, 2015.

  8. Combined neural network/Phillips–Tikhonov approach to aerosol retrievals over land from the NASA Research Scanning Polarimeter

    Directory of Open Access Journals (Sweden)

    A. Di Noia

    2017-11-01

    Full Text Available In this paper, an algorithm for the retrieval of aerosol and land surface properties from airborne spectropolarimetric measurements – combining neural networks and an iterative scheme based on Phillips–Tikhonov regularization – is described. The algorithm – which is an extension of a scheme previously designed for ground-based retrievals – is applied to measurements from the Research Scanning Polarimeter (RSP on board the NASA ER-2 aircraft. A neural network, trained on a large data set of synthetic measurements, is applied to perform aerosol retrievals from real RSP data, and the neural network retrievals are subsequently used as a first guess for the Phillips–Tikhonov retrieval. The resulting algorithm appears capable of accurately retrieving aerosol optical thickness, fine-mode effective radius and aerosol layer height from RSP data. Among the advantages of using a neural network as initial guess for an iterative algorithm are a decrease in processing time and an increase in the number of converging retrievals.
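
    A minimal sketch of the two-step idea described above: a neural-network first guess refined by a Phillips-Tikhonov (regularized Gauss-Newton) iteration. The toy linear forward model, its Jacobian and the regularization weight are assumptions for illustration, not the RSP processing code.

        import numpy as np

        def phillips_tikhonov(y, forward_model, jacobian, x_first_guess,
                              gamma=1.0, n_iter=10):
            """Iteratively minimize ||y - F(x)||^2 + gamma * ||x - x0||^2."""
            x0 = x_first_guess.copy()      # NN retrieval doubles as the regularization point
            x = x0.copy()
            for _ in range(n_iter):
                K = jacobian(x)                            # linearize F around x
                residual = y - forward_model(x)
                lhs = K.T @ K + gamma * np.eye(x.size)
                rhs = K.T @ residual - gamma * (x - x0)
                x = x + np.linalg.solve(lhs, rhs)
            return x

        # Toy usage with a linear "forward model" so the sketch runs end to end.
        A = np.array([[1.0, 0.5], [0.2, 1.5], [0.7, 0.3]])
        x_true = np.array([0.8, 0.1])                  # e.g. optical thickness, radius
        x_nn = np.array([0.7, 0.15])                   # pretend neural-network guess
        x_hat = phillips_tikhonov(A @ x_true, lambda x: A @ x, lambda x: A,
                                  x_nn, gamma=0.01)
        print(x_hat)                                   # close to x_true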

  9. The IMaX polarimeter for the solar telescope SUNRISE of the NASA long duration balloon program

    Directory of Open Access Journals (Sweden)

    Domingo V.

    2010-06-01

    Full Text Available On June 8th 2009 the SUNRISE mission was successfully launched. The mission consisted of a 1 m aperture solar telescope on board a stratospheric balloon within NASA's Long Duration Balloon program. The flight followed the foreseen circumpolar trajectory over the Arctic and lasted 5 days and 17 hours. One of the two postfocal instruments on board was IMaX, the Imaging Magnetograph eXperiment. This instrument is a solar magnetograph: a diffraction-limited imager capable of resolving 100 km on the solar surface, and simultaneously a high-sensitivity polarimeter (<10^-3) and a high-resolution spectrograph (bandwidth <70 mÅ). The magnetic vector map can be extracted thanks to the well-known Zeeman effect, which takes place in the solar atoms and allows polarization and spectral measurements to be related to magnetic fields. The technological challenge of the IMaX development is especially relevant due to the use of innovative technologies in the aerospace field, and it is an important precedent for future space missions such as ESA's Solar Orbiter. Among these novel technologies, the use of Liquid Crystal Variable Retarders (LCVRs) as polarization modulators and a LiNbO3 etalon as a tunable spectral filter is remarkable. The data obtained are currently being analyzed and the preliminary results show unprecedented information about the solar dynamics.
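
    For context (a standard textbook relation, not taken from the IMaX papers themselves), imaging magnetographs of this kind usually rely on the weak-field approximation, in which the Zeeman splitting and the circular-polarization signal are

        \Delta\lambda_B = 4.67\times10^{-13}\, g_{\mathrm{eff}}\, \lambda_0^{2}\, B
        \qquad (\lambda_0\ \text{in Å},\ B\ \text{in gauss}),

        V(\lambda) \;\approx\; -\,\Delta\lambda_B\,\cos\gamma\,\frac{\partial I(\lambda)}{\partial \lambda},

    so the line-of-sight field follows from the ratio of the measured Stokes V profile to the local gradient of the intensity profile.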

  10. The Effect of Trifluoroethanol and Glycerol on the Thermal Properties of Collagen Using Optical Displacement-Enhanced Heterodyne Polarimeter

    Directory of Open Access Journals (Sweden)

    Chien-Ming Wu

    2015-11-01

    Full Text Available An angular displacement-enhanced heterodyne polarimeter has been employed to investigate the interplay between trifluoroethanol (TFE) and glycerol in the thermal denaturation of type I collagen. The concentration of the collagen solution was fixed at 0.341 mg/mL, and the samples were heated from 25 °C to 55 °C. TFE solutions with concentrations of 5%, 10%, 15%, 20%, 40% and 80% (v/v) were prepared, and the phase change was recorded for the determination of thermal denaturation. It was observed that the thermal denaturation temperature (Td) decreases with increasing TFE concentration due to the partial cleavage of the triple-helical structure. For TFE concentrations higher than 20% (v/v), the degree of optical rotation appears to be nearly the same, reflecting that the collagen triple helices have been completely disrupted. Moreover, the role of glycerol in inhibiting the thermal denaturation of collagen is investigated. It is shown that glycerol can improve the thermal stability of both collagen and TFE-mixed collagen. Experimental results show that, in the presence of 2 M glycerol, the Td of collagen remained at around 41.9 °C, while the Td of 20% (v/v) TFE-mixed collagen was significantly restored to 32.8 °C.
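
    As an illustrative analysis sketch (not the authors' software), the denaturation temperature Td can be read off as the midpoint of a sigmoidal transition fitted to the recorded phase versus temperature; the synthetic data below simply reuse the 41.9 °C value quoted above.

        import numpy as np
        from scipy.optimize import curve_fit

        def boltzmann(T, phase_folded, phase_unfolded, Td, width):
            """Sigmoidal folded-to-unfolded transition; Td is the midpoint."""
            return phase_unfolded + (phase_folded - phase_unfolded) / (
                1.0 + np.exp((T - Td) / width))

        # Synthetic data standing in for a 25-55 degC temperature scan.
        T = np.linspace(25.0, 55.0, 61)
        phase = boltzmann(T, 1.0, 0.2, 41.9, 1.5) + np.random.normal(0.0, 0.01, T.size)

        popt, _ = curve_fit(boltzmann, T, phase, p0=[1.0, 0.2, 40.0, 2.0])
        print("estimated Td = %.1f degC" % popt[2])   # ~41.9 degC for this synthetic run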

  11. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation is therefore vital for just about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation

  12. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  13. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, V.F.

    1996-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for predicting reliability in software project development. Also, time series forecasting of cumulative interfailure time is proposed and illustrated.

  14. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  15. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  16. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square-feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  17. Software service history report

    Science.gov (United States)

    2002-01-01

    The safe and reliable operation of software within civil aviation systems and equipment has historically been assured through the application of rigorous design assurance applied during the software development process. Increasingly, manufacturers ar...

  18. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  19. Agent Building Software

    Science.gov (United States)

    2000-01-01

    AgentBuilder is a software component developed under an SBIR contract between Reticular Systems, Inc., and Goddard Space Flight Center. AgentBuilder allows software developers without experience in intelligent agent technologies to easily build software applications using intelligent agents. Agents are components of software that will perform tasks automatically, with no intervention or command from a user. AgentBuilder reduces the time and cost of developing agent systems and provides a simple mechanism for implementing high-performance agent systems.

  20. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  1. Software quality challenges.

    OpenAIRE

    Fitzpatrick, Ronan; Smith, Peter; O'Shea, Brendan

    2004-01-01

    This paper sets out a number of challenges facing the software quality community. These challenges relate to the broader view of quality and the consequences for software quality definitions. These definitions are related to eight perspectives of software quality in an end-to-end product life cycle. Research and study of software quality has traditionally focused on product quality for management information systems and this paper considers the challenge of defining additional quality factors...

  2. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  3. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  4. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  5. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  6. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  7. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  8. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  9. Astronomical Software Directory Service

    Science.gov (United States)

    Hanisch, R. J.; Payne, H.; Hayes, J.

    1998-01-01

    This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.

  10. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  11. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  12. Global Software Development : - Software Architecture - Organization - Communication

    OpenAIRE

    Førde, Dan Sørensen

    2003-01-01

    Our globalized world has an impact on almost every area of our lives. Globalization affects businesses around the globe and forces employees and managers to think of new ways of doing business. Globalization in the software development industry increased through the 1990s and is still increasing. The Internet makes collaboration possible, and developers no longer need to be co-located to work together on a common software development project. The ...

  13. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – the Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  14. Design of the Telescope Truss and Gondola for the Balloon-Borne X-ray Polarimeter X-Calibur

    Science.gov (United States)

    Kislat, Fabian; Beheshtipour, Banafsheh; Dowkontt, Paul; Guarino, Victor; Lanzi, R. James; Okajima, Takashi; Braun, Dana; Cannon, Scott; de Geronimo, Gialuigi; Heatwole, Scott; Hoorman, Janie; Li, Shaorui; Mori, Hideyuki; Shreves, Christopher M.; Stuchlik, David; Krawczynski, Henric

    X-ray polarimetry has seen a growing interest in recent years. Improvements in detector technology and focusing X-ray optics now enable sensitive astrophysical X-ray polarization measurements. These measurements will provide new insights into the processes at work in accreting black holes, the emission of X-rays from neutron stars and magnetars, and the structure of AGN jets. X-Calibur is a balloon-borne hard X-ray scattering polarimeter. An X-ray mirror with a focal length of 8m focuses X-rays onto the detector, which consists of a plastic scattering element surrounded by Cadmium-Zinc-Telluride detectors, which absorb and record the scattered X-rays. Since X-rays preferentially scatter perpendicular to their polarization direction, the polarization properties of an X-ray beam can be inferred from the azimuthal distribution of scattered X-rays. A close alignment of the X-ray focal spot with the center of the detector is required in order to reduce systematic uncertainties and to maintain a high photon detection efficiency. This places stringent requirements on the mechanical and thermal stability of the telescope structure. During the flight on a stratospheric balloon, X-Calibur makes use of the Wallops Arc-Second Pointer (WASP) to point the telescope at astrophysical sources. In this paper, we describe the design, construction, and test of the telescope structure, as well as its performance during a 25-h flight from Ft. Sumner, New Mexico. The carbon fiber-aluminum composite structure met the requirements set by X-Calibur and its design can easily be adapted for other types of experiments, such as X-ray imaging or spectroscopic telescopes.
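
    A hedged sketch of the basic scattering-polarimetry analysis implied above (not the X-Calibur flight or ground software): fit the azimuthal distribution of scattered photons with a cos(2*phi) modulation; the fitted amplitude divided by an assumed modulation factor mu_100 gives the polarization fraction, and the minimum of the curve lies along the polarization direction because scattering peaks perpendicular to it.

        import numpy as np
        from scipy.optimize import curve_fit

        def modulation(phi, mean_counts, amplitude, phi0):
            # Azimuthal scattering distribution, peaking 90 deg from the E-vector.
            return mean_counts * (1.0 + amplitude * np.cos(2.0 * (phi - phi0)))

        rng = np.random.default_rng(0)
        phi_bins = np.radians(np.arange(7.5, 360.0, 15.0))       # 24 azimuth bins
        counts = rng.poisson(modulation(phi_bins, 1000.0, 0.15, np.radians(110.0)))

        popt, _ = curve_fit(modulation, phi_bins, counts,
                            p0=[counts.mean(), 0.1, 0.0], sigma=np.sqrt(counts))
        amp, phi0 = popt[1], popt[2]
        if amp < 0:                    # resolve the cos(2*phi) amplitude/phase degeneracy
            amp, phi0 = -amp, phi0 + np.pi / 2.0
        mu_100 = 0.5                   # assumed modulation factor for a 100% polarized beam
        print(amp / mu_100, (np.degrees(phi0) + 90.0) % 180.0)   # fraction, E-vector angle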

  15. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  16. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  17. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios, and aim to identify challenges

  18. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online, (2) can be syndicated, shared, reused or remixed, and (3) let people learn easily from and capitalize on the behavior and knowledge of others [1]. SoSo include a wide variety of tools, such as instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, and virtual worlds. Though normally belonging rather to the private realm, the use of social software in a corporate context has been reported, e.g. as a way

  19. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution...... one particular framework named Tran SAT, which addresses the above problems of software architecture evolution. Tran SAT provides a new element in the software architecture descriptions language, called an architectural aspect, for describing new concerns and their integration into an existing...... architecture. Following the early aspect paradigm, Tran SAT allows the software architect to design a software architecture stepwise in terms of aspects at the design stage. It realises the evolution as the weaving of new architectural aspects into an existing software architecture....

  20. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in current large software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case study, a project with many similarities to those currently under way in HEP.

  1. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  2. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  3. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. Explains two maintenance standards: IEEE/EIA 1219 and ISO/IEC 14764. Discusses several commercial reverse and domain engineering toolkits. Slides for instructors are available online. Information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  4. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  5. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts.

  6. Software systems for astronomy

    CERN Document Server

    Conrad, Albert R

    2014-01-01

    This book covers the use and development of software for astronomy. It describes the control systems used to point the telescope and operate its cameras and spectrographs, as well as the web-based tools used to plan those observations. In addition, the book also covers the analysis and archiving of astronomical data once it has been acquired. Readers will learn about existing software tools and packages, develop their own software tools, and analyze real data sets.

  7. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programed in order to control the function that they perform. The basics of microprograming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogramed circuit itself. 15 figures, 2 tables

  8. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  9. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  10. Software evolution in prototyping

    OpenAIRE

    Berzins, V.; Qi, Lu

    1996-01-01

    This paper proposes a model of software changes for supporting the evolution of software prototypes. The software evolution steps are decomposed into primitive substeps that correspond to monotonic specification changes. This structure is used to rearrange chronological derivation sequences into structures containing only meaning-preserving changes. The authors indicate how this structure can be used to automatically combine different changes to a specification. A set of examples illustrates ...

  11. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an objective-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)

  12. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  13. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – the Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  14. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  15. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  16. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  17. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
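
    A minimal sketch of a dtest-style runner is shown below (this is not the actual JPL dtest code; the config-file name "test.cfg", its one-command-per-line format, and the directory-name pattern are assumptions made purely for illustration):

        # Hypothetical sketch of a directory-scanning, parallel test runner.
        import fnmatch
        import os
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        CONFIG_NAME = "test.cfg"   # hypothetical per-directory test description file

        def find_test_dirs(root, pattern="test_*"):
            """Yield directories under root whose name matches pattern and that
            contain a config file describing tests."""
            for dirpath, dirnames, filenames in os.walk(root):
                if fnmatch.fnmatch(os.path.basename(dirpath), pattern) and CONFIG_NAME in filenames:
                    yield dirpath

        def run_tests_in(dirpath):
            """Run every command listed in the directory's config file; return failures."""
            failures = []
            with open(os.path.join(dirpath, CONFIG_NAME)) as cfg:
                for line in cfg:
                    cmd = line.strip()
                    if not cmd or cmd.startswith("#"):
                        continue
                    result = subprocess.run(cmd, shell=True, cwd=dirpath)
                    if result.returncode != 0:
                        failures.append((dirpath, cmd))
            return failures

        def run_suite(root, workers=4):
            """Distribute per-directory test runs over multiple CPU cores."""
            with ProcessPoolExecutor(max_workers=workers) as pool:
                results = list(pool.map(run_tests_in, find_test_dirs(root)))
            return [f for failures in results for f in failures]

        if __name__ == "__main__":
            for dirpath, cmd in run_suite("."):
                print(f"FAILED in {dirpath}: {cmd}")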

  18. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  19. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications.

  20. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. The contribution attempts to identify the reasons for this situation as seen from the software development. The concepts correctness and reliability of programmes are explained as they are understood in the specialist discussion of today. Measures and methods are discussed which are particularly relevant as far as the obtaining of fault-free and reliable programmes is concerned. Conclusions are drawn for the user of software so that he is in the position to judge himself what can be justly expected from the product software compared to other products. (orig./LH) [de

  1. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  2. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  3. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.; Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  4. UWB Tracking Software Development

    Science.gov (United States)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
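
    The following structural sketch illustrates the Input/Output Interface and Algorithm Core split described above (it is not the NASA flight code; the host/port, packet format, and the simple two-bearing triangulation are assumptions made for the example, and the step that estimates an angle of arrival from one cluster's waveforms is omitted):

        # Hypothetical sketch of the I/O interface / algorithm core separation.
        import math
        import socket
        import struct

        class ReceiverInterface:
            """I/O interface: reads one block of float32 waveform samples from a UWB radio."""
            def __init__(self, host, port, n_samples=1024):
                self.sock = socket.create_connection((host, port))
                self.n_bytes = 4 * n_samples

            def read_waveform(self):
                raw = self.sock.recv(self.n_bytes)
                return struct.unpack(f"<{len(raw) // 4}f", raw)

        def triangulate(theta_a, theta_b, baseline):
            """Algorithm core: intersect the bearing lines from clusters located at
            (0, 0) and (baseline, 0); angles are measured from the x-axis in radians."""
            ta, tb = math.tan(theta_a), math.tan(theta_b)
            x = baseline * tb / (tb - ta)
            return x, x * ta

        # Example: clusters 10 m apart, target seen at 60 and 120 degrees.
        print(triangulate(math.radians(60), math.radians(120), 10.0))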

  5. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  6. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolutions are related to multiple platforms as shown in our...

  7. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    For this reason, this paper defines the concept of the software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix for software, which is necessary and different from that of other products, so that the product can succeed in the market.

  8. ITOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, S.; Pruess, K.; Fraser, P.

    1996-10-01

    The purpose of this report is to provide all software baseline documents necessary for the software qualification of ITOUGH2. ITOUGH2 is a computer program providing inverse modeling capabilities for TOUGH2. TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media.

  9. Cactus: Software Priorities

    Science.gov (United States)

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  10. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  11. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  12. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  13. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  14. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  15. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  16. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  17. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    'Software ecosystems' is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature ... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field having evolved outside the existing definitions of software ecosystems and we thus propose the update of the definition of software ecosystems.

  18. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  19. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  20. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what ... project- and quality management and their implementation in practice. So far, our results suggest that the necessity for a systematic software development is well recognized, while software development still follows an ad-hoc rather than a systematized style. Our results provide initial findings, which we ...

  1. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  2. Colors in kindergarten software

    Directory of Open Access Journals (Sweden)

    Montell, Ireivys

    2012-01-01

    Full Text Available The article aims to address elements related to the use of color in educational software for early ages. The meaning of colors at pre-school age is presented from a theoretical perspective. A psychoeducational assessment of the influence of colors in educational software as a teaching aid to develop general intellectual abilities is explained. Likewise, the paper explains how achieving a balance between colors and software design leads to a proper interaction of children with new technology, a new resource for achieving objectives in education and stimulating cognitive process development, both in institutions and in non-institutional channels.

  3. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth rates of 10 to 20 percent per year. This instalment of Practitioner's Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It's based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  4. Flow Analysis Software Toolkit

    Science.gov (United States)

    Watson, Velvin; Castagnera, Karen; Plessel, Todd; Merritt, Fergus; Kelaita, Paul; West, John; Sandstrom, Tim; Clucas, Jean; Globus, AL; Bancroft, Gordon; hide

    1993-01-01

    Flow Analysis Software Toolkit (FAST) computer program provides software environment facilitating visualization of data. A collection of separate programs (modules) running simultaneously, it helps the user examine the results of numerical and experimental simulations. Intended for graphical depiction of computed flows, it also assists in analysis of other types of data. Combines capabilities of such programs as PLOT3D, RIP, SURF, and GAS into one software environment with modules sharing data. All modules have consistent, highly interactive graphical user interface. Modular construction makes it flexible and extensible. Environment custom-configured, and new modules developed and added as needed. Written in ANSI compliant FORTRAN 77 and C language.

  5. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  6. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  7. Sobre software libre

    OpenAIRE

    Matellán Olivera, Vicente; González Barahona, Jesús; Heras Quirós, Pedro de las; Robles Martínez, Gregorio

    2004-01-01

    220 p. "Sobre software libre" brings together almost thirty essays on highly topical issues related to free software (of which Linux is the best-known exponent). The essays the reader will find are divided into thematic blocks that range from intellectual property or the economic and social questions of this model to its use in education and public administration, including one that reviews the history of free software in...

  8. "IBSAR" Software 4.0

    Directory of Open Access Journals (Sweden)

    2004-06-01

    Full Text Available A review of an Arabic software package entitled "IBSAR", designed to help blind users operate the computer; the software pronounces the commands and the contents of screens and applications browsed by users. This review includes a general introduction to the software, its components and commands, system requirements, and its functions with the Windows operating system and Microsoft Word.

  9. TestingScientificSoftware.pdf

    OpenAIRE

    Dubey, Anshu

    2017-01-01

    Testing scientific software is critical for producing credible results and for code maintenance. The IDEAS scientific software productivity project aims toward increasing software productivity and sustainability, with participants from many projects that define the state of practice in software engineering in the HPC community. This tutorial distills the combined knowledge of IDEAS team members in the area of scientific software testing.

  10. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  11. Collaborative software development

    NARCIS (Netherlands)

    M. de Jonge (Merijn); E. Visser; J.M.W. Visser (Joost)

    2001-01-01

    We present an approach to collaborative software development where obtaining components and contributing components across organizational boundaries are explicit phases in the development process. A lightweight generative infrastructure supports this approach with an online package base,

  12. Next Generation Hydro Software

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; Van Dam, A.; De Goede, E.; Icke, J.; Putten, H.

    2014-01-01

    An overview paper that describes the motivation and main deliverables of the Next Generation Hydro Software (NGHS) project. Important technological innovations include the development of the new computational core Delft3D Flexible Mesh, as well as the open modelling environment Delta Shell.

  13. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more streamlined, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software on nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers. These papers are reported here in full in the following sections.

  14. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines ... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation ... For those who want to get an overview of the different aspects of the topic, and for those who are experts with many years of experience, it particularly targets the needs of researchers and Ph.D. students in the area of software and systems engineering or information systems who study advanced topics concerning ...

  15. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than ... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage. We found collaboration and teams, processes and organization, sourcing and supplier management, and success factors to be the topics gaining the most interest of researchers and practitioners. Beyond the analysis of the past conferences, we also look at current trends in GSE to motivate further...

  16. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  17. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  18. Social Software and Strategy

    OpenAIRE

    Haefliger S; Monteiro E; Foray D; von Krogh G

    2011-01-01

    Social software challenges strategic thinking in important ways: empowering creative, independent individuals implies indeterminate and uncertain reactions and creations in support of or in opposition to management's original thinking. We build a framework that organizes research on social software, taking perspectives from both inside and outside companies. We use this framework to introduce the contributions to this special issue in terms of strategy, technology and community, and to ask a seri...

  19. Deprogramming Large Software Systems

    OpenAIRE

    Coppel, Yohann; Candea, George

    2008-01-01

    Developers turn ideas, designs and patterns into source code, then compile the source code into executables. Decompiling turns executables back into source code, and deprogramming turns code back into designs and patterns. In this paper we introduce DeP, a tool for deprogramming software systems. DeP abstracts code into a dependency graph and mines this graph for patterns. It also gives programmers visual means for manipulating the program. We describe DeP's use in several software engineerin...

  20. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the middle stages of my doctoral research, based on ethnographic work conducted in IT companies in India and in Denmark, on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD.

  1. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    Software’s Impact on System and System of Systems Reliability [Goodenough 2010]. This report suggests some techniques that address those challenges ... noted in Goodenough (2012), confidence for software-intensive systems is a slippery subject [Goodenough 2012]. There is a subjective aspect to it ... on finding and removing code faults through code inspection and testing [Goodenough 2010]. But funding a software reliability improvement program

  2. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for the Ruby product quality measurement tool Rubocop to measure the product quality characteristics defined in the ISO 2502n standard series. The paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object-oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  3. A Framework for Instituting Software Metrics in Small Software Organizations

    OpenAIRE

    Hisham M. Haddad; Nancy C. Ross; Donald E. Meredith

    2012-01-01

    The role of metrics in software quality is well-recognized; however, software metrics are yet to be standardized and integrated into development practices across the software industry. Literature reports indicate that software companies with less than 50 employees may represent up to 85% of the software organizations in several countries, including the United States. While process, project, and product metrics share a common goal of contributing to software quality and reliability, utilizatio...

  4. Managing Research Software Development: Better Software, Better Research

    OpenAIRE

    Hong, Neil Chue

    2017-01-01

    Research in most domains relies on software, but many researchers have not had experience in managing software projects, and many institutions are only recently providing infrastructure and support for software development. This talk will highlight recent best practice and efforts to improve the development and maintenance of software used in research, including Software Management Plans and what we can learn from the Open Source Software community.

  5. Survey on Impact of Software Metrics on Software Quality

    OpenAIRE

    Mrinal Singh Rawat; Arpita Mittal; Sanjay Kumar Dubey

    2012-01-01

    Software metrics provide a quantitative basis for planning and predicting software development processes. Therefore the quality of software can be controlled and improved easily. Quality in fact aids higher productivity, which has brought software metrics to the forefront. This research paper focuses on different views on software quality. Moreover, many metrics and models have been developed; promoted and utilized resulting in remarkable successes. This paper examines the realm of software e...

  6. Software Process Assessment (SPA)

    Science.gov (United States)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  7. Encyclopedia of Software Components

    Science.gov (United States)

    Warren, Lloyd V. (Inventor); Beckman, Brian C. (Inventor)

    1997-01-01

    Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface, such as a keyboard or a mouse, for transmitting user selections. A picture of encyclopedia volumes is presented with respective visible labels referring to types of software, in accordance with a metaphor in which each volume includes a page listing general topics under the software type of the volume and pages listing software components for each of the general topics. The picture is altered to open one of the volumes in response to an initial user selection specifying that volume, displaying on the monitor a picture of its page of general topics; the picture is then altered to display the page listing software components under one of the general topics in response to a next user selection specifying that topic. Finally, a picture of a set of informative plates is presented, depicting different types of information about one of the software components, in response to a further user selection specifying that component.

  8. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood...
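
    A minimal sketch of the daily-nest-survival likelihood that the compared packages implement in different ways is given below (assuming, for illustration, that each visit interval records its length in days and whether the nest was still active at the end of the interval; this is not the SAS or Program MARK code):

        # Illustrative sketch of a daily-nest-survival likelihood.
        import math

        def nest_log_likelihood(daily_survival, intervals):
            """intervals: iterable of (days, survived) pairs.  A nest surviving an
            interval of t days contributes s**t to the likelihood; a nest failing
            during the interval contributes 1 - s**t."""
            s = daily_survival
            ll = 0.0
            for days, survived in intervals:
                p = s ** days
                ll += math.log(p if survived else 1.0 - p)
            return ll

        # Example: maximize over a coarse grid for one nest visited every 3 days
        # and found failed on the final visit.
        visits = [(3, True), (3, True), (3, False)]
        best = max((nest_log_likelihood(s / 1000, visits), s / 1000) for s in range(1, 1000))
        print(best)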

  9. Computing and software

    Science.gov (United States)

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood...

  10. Optimal configuration of partial Mueller matrix polarimeter for measuring the ellipsometric parameters in the presence of Poisson shot noise and Gaussian noise

    Science.gov (United States)

    Quan, Naicheng; Zhang, Chunmin; Mu, Tingkui

    2018-05-01

    We address the optimal configuration of a partial Mueller matrix polarimeter used to determine the ellipsometric parameters in the presence of additive Gaussian noise and signal-dependent shot noise. The numerical results show that, for a PSG/PSA consisting of a variable retarder and a fixed polarizer, the detection process immune to these two types of noise is optimally composed of a 121.2° retardation with a pair of azimuths ±71.34° and a 144.48° retardation with a pair of azimuths ±31.56° for the measurement of four Mueller matrix elements. Compared with the existing configurations, the configuration presented in this paper can effectively decrease the measurement variance and thus statistically improve the measurement precision of the ellipsometric parameters.
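
    As an illustration only (standard Mueller calculus, not the authors' code), the quoted settings can be checked numerically by building the analysis vector of a variable retarder followed by a fixed horizontal polarizer for each of the four states and inspecting the conditioning of the resulting measurement matrix; the element ordering and the horizontal polarizer orientation are assumptions of this sketch:

        # Sketch: conditioning of a retarder-plus-polarizer analyzer configuration.
        import numpy as np

        def rot(theta):
            c, s = np.cos(2 * theta), np.sin(2 * theta)
            return np.array([[1, 0, 0, 0], [0, c, s, 0], [0, -s, c, 0], [0, 0, 0, 1]])

        def retarder(delta, theta):
            c, s = np.cos(delta), np.sin(delta)
            m = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, c, s], [0, 0, -s, c]])
            return rot(-theta) @ m @ rot(theta)

        POLARIZER = 0.5 * np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])

        def analysis_vector(delta_deg, theta_deg):
            """First row of polarizer @ retarder: the Stokes vector actually analyzed."""
            return (POLARIZER @ retarder(np.radians(delta_deg), np.radians(theta_deg)))[0]

        # The four settings reported in the abstract.
        settings = [(121.2, 71.34), (121.2, -71.34), (144.48, 31.56), (144.48, -31.56)]
        W = np.array([analysis_vector(d, t) for d, t in settings])
        print("condition number of W:", np.linalg.cond(W))
        print("equally weighted variance:", np.trace(np.linalg.inv(W.T @ W)))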

  11. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  12. Accounting for the water-leaving radiance in the simultaneous retrieval of atmosphere and ocean properties from collocated polarimeters and lidar measurements: results for the SABOR campaign

    Science.gov (United States)

    Chowdhary, J.; Brian, C.; Stamnes, S.; Hostetler, C. A.; Cetinic, I.; Slade, W. H.; Hu, Y.

    2017-12-01

    Ocean spectra typically contribute less than 10% to top-of-atmosphere (TOA) radiance observations in the visible (VIS). The remaining 90% of TOA radiance originates from scattering in the atmosphere which needs to be removed (i.e. corrected) but varies substantially with the aerosol present at the time of observation. The traditional approach for atmospheric correction (AC), used for ocean color sensors such as SeaWiFS, MODIS, and VIIRS, estimates aerosol scattering properties from TOA radiance observations in the near-infrared/short-wave infrared (NIR/SWIR) where the ocean becomes dark. The aerosol model is subsequently used to compute the atmospheric scattering contribution to the TOA radiance in the VIS. The final step is to subtract this computed scattering contribution from the real (i.e. observed) TOA radiance. As an alternative to the traditional approach for AC, we retrieve the atmosphere (i.e., aerosol) and ocean (i.e., color) properties simultaneously from measurements in the VIS. To separate the information content for the atmosphere and ocean, we use lidar measurements and multi-angle polarization measurements. Lidar and polarimeter measurements are powerful tools to enhance the ocean product retrievals from conventional ocean color sensors, and are under consideration to accompany future generation ocean color sensors. Here, we present results of simultaneous atmosphere-ocean retrievals using collocated airborne lidar and polarimeter data that were acquired during the Ship-Aircraft Bio-Optical Research (SABOR) campaign. We discuss 2 hydrosol models (which differ in number of free parameters) that were used for these inversions. We then compare our ocean retrievals with measurements obtained from the accompanying cruise ship. Finally, we touch upon a next generation of hydrosol models that accommodates the unique sensitivity of ocean lidar profiles to plankton morphology.
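
    A toy sketch of the traditional atmospheric-correction step described above might look as follows (placeholders only, not the SeaWiFS/MODIS processing code; the model list, band names, and the injected path-radiance function are invented for the example):

        # Toy atmospheric-correction sketch: choose an aerosol model in the NIR,
        # then subtract the predicted path radiance from the visible bands.

        def choose_aerosol_model(l_toa, models, nir_bands, path_radiance):
            """Pick the model whose predicted NIR path radiance is closest to the
            observation (the ocean contribution is assumed ~0 in the NIR)."""
            def misfit(model):
                return sum((l_toa[b] - path_radiance(model, b)) ** 2 for b in nir_bands)
            return min(models, key=misfit)

        def correct_visible(l_toa, model, vis_bands, path_radiance):
            """Subtract the modeled atmospheric contribution band by band."""
            return {b: l_toa[b] - path_radiance(model, b) for b in vis_bands}

        # Toy usage with a fabricated one-parameter path-radiance model.
        models = [{"eps": 1.0}, {"eps": 1.1}, {"eps": 1.2}]
        path_radiance = lambda m, band: m["eps"] * {"nir1": 2.0, "nir2": 1.5, "blue": 6.0, "green": 4.0}[band]
        l_toa = {"nir1": 2.2, "nir2": 1.65, "blue": 7.0, "green": 4.6}
        model = choose_aerosol_model(l_toa, models, ["nir1", "nir2"], path_radiance)
        print(correct_visible(l_toa, model, ["blue", "green"], path_radiance))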

  13. Imaging Polarimeter for a Sub-MeV Gamma-Ray All-sky Survey Using an Electron-tracking Compton Camera

    Energy Technology Data Exchange (ETDEWEB)

    Komura, S.; Takada, A.; Mizumura, Y.; Miyamoto, S.; Takemura, T.; Kishimoto, T.; Kubo, H.; Matsuoka, Y.; Mizumoto, T.; Nakamasu, Y.; Nakamura, K.; Oda, M.; Parker, J. D.; Sonoda, S.; Tanimori, T.; Tomono, D.; Yoshikawa, K. [Graduate School of Science, Kyoto University, Sakyo, Kyoto 606-8502 (Japan); Kurosawa, S. [New Industry Creation Hatchery Center (NICHe), Tohoku University, Sendai, Miyagi, 980-8579 (Japan); Miuchi, K. [Department of Physics, Kobe University, Kobe, Hyogo, 658-8501 (Japan); Sawano, T., E-mail: komura@cr.scphys.kyoto-u.ac.jp [College of Science and Engineering, School of Mathematics and Physics, Kanazawa University, Kanazawa, Ishikawa, 920-1192 (Japan)

    2017-04-10

    X-ray and gamma-ray polarimetry is a promising tool to study the geometry and the magnetic configuration of various celestial objects, such as binary black holes or gamma-ray bursts (GRBs). However, statistically significant polarizations have been detected in few of the brightest objects. Even though future polarimeters using X-ray telescopes are expected to observe weak persistent sources, there are no effective approaches to survey transient and serendipitous sources with a wide field of view (FoV). Here we present an electron-tracking Compton camera (ETCC) as a highly sensitive gamma-ray imaging polarimeter. The ETCC provides powerful background rejection and a high modulation factor over an FoV of up to 2π sr thanks to its excellent imaging based on a well-defined point-spread function. Importantly, we demonstrated for the first time the stability of the modulation factor under realistic conditions of off-axis incidence and huge backgrounds using the SPring-8 polarized X-ray beam. The measured modulation factor of the ETCC was 0.65 ± 0.01 at 150 keV for an off-axis incidence with an oblique angle of 30° and was not degraded compared to the 0.58 ± 0.02 at 130 keV for on-axis incidence. These measured results are consistent with the simulation results. Consequently, we found that the satellite-ETCC proposed in Tanimori et al. would provide all-sky surveys of weak persistent sources of 13 mCrab with 10% polarization for a 10^7 s exposure and over 20 GRBs down to a 6 × 10^-6 erg cm^-2 fluence and 10% polarization during a one-year observation.
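
    For illustration, a modulation factor such as the 0.65 quoted above is commonly obtained by histogramming the azimuthal scattering angles of the reconstructed Compton events and fitting a cos(2φ) modulation; the sketch below uses this generic definition and is not the authors' analysis chain:

        # Sketch: modulation factor from an azimuthal scattering-angle distribution.
        import numpy as np

        def modulation_factor(azimuths_rad, n_bins=36):
            counts, edges = np.histogram(azimuths_rad, bins=n_bins, range=(0, 2 * np.pi))
            centers = 0.5 * (edges[:-1] + edges[1:])
            # Least-squares fit of counts ~ a + b*cos(2*phi) + c*sin(2*phi)
            design = np.column_stack([np.ones_like(centers), np.cos(2 * centers), np.sin(2 * centers)])
            a, b, c = np.linalg.lstsq(design, counts, rcond=None)[0]
            return np.hypot(b, c) / a   # (N_max - N_min) / (N_max + N_min)

        # Toy usage: events drawn with a true modulation of 0.5.
        rng = np.random.default_rng(0)
        phi = rng.uniform(0, 2 * np.pi, 200000)
        keep = rng.uniform(size=phi.size) < 0.5 * (1 + 0.5 * np.cos(2 * phi))
        print(modulation_factor(phi[keep]))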

  14. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
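
    A textbook sketch of the Kaczmarz-type iteration underlying such block iterative methods is given below (illustrative only; Ettention's optimized GPU/Xeon Phi implementation and its electron-tomography adaptations are not reproduced here):

        # Sketch: classic Kaczmarz sweeps over the ray equations A x = b.
        import numpy as np

        def kaczmarz(A, b, n_sweeps=10, relaxation=1.0, x0=None):
            x = np.zeros(A.shape[1]) if x0 is None else x0.astype(float).copy()
            row_norms = (A * A).sum(axis=1)
            for _ in range(n_sweeps):
                for i in range(A.shape[0]):
                    if row_norms[i] == 0:
                        continue
                    residual = b[i] - A[i] @ x
                    x += relaxation * residual / row_norms[i] * A[i]
            return x

        # Tiny example: recover x = (1, 2, 3) from an overdetermined ray system.
        A = np.array([[1.0, 0, 1], [0, 1, 1], [1, 1, 0], [1, 1, 1]])
        x_true = np.array([1.0, 2.0, 3.0])
        print(kaczmarz(A, A @ x_true, n_sweeps=50))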

  15. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  16. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  17. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software maintenance is the process of modifying existing operational software by correcting errors, migration of the software to new technologies and platforms, and adapting it to deal with new environmental requirements. It denotes any change made to a software product before and after delivery to customer or user.

  18. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  19. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  20. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  1. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  2. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  3. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  4. CONRAD Software Architecture

    Science.gov (United States)

    Guzman, J. C.; Bennett, T.

    2008-08-01

    The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the required common software to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volumes of data in real time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.

  5. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration "Testnet", carried out in the period 1 April 2006 to 31 December 2008. The network deals in particular with topics within testing of embedded and technical software, but a number of examples of problems and solutions associated with testing of...... administrative software are also included. The report is divided into the following 3 parts: Overview. Here we give a summary of the network's purpose, activities and results. The state of the art of software testing is sketched. We mention that CISS and the network are taking new initiatives. The Network. Purpose, participants and topics treated at ti...

  6. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  7. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  8. New Media as Software

    Directory of Open Access Journals (Sweden)

    Manuel Portela

    2014-03-01

    Full Text Available Review of Lev Manovich, Software Takes Command: Extending the Language of New Media. London: Bloomsbury, 2013, 358 pp. ISBN 978-1-6235-6817-7. In Lev Manovich’s most recent book, this programmatic interrogation of our medial condition leads to the following question: do media still exist after software? This is the question that triggers Manovich’s dialogue both with computing history and with theories of digital media of recent decades, including the extension of his own previous formulations in The Language of New Media, published in 2001, and which became a major reference work in the field. The subtitle of the new book points precisely to this critical revisiting of his earlier work in the context of ubiquitous computing and accelerated transcoding of social, cultural and artistic practices by software.

  9. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. In its review of static analysis it discusses the deductive method and model-checking methods, and weighs the pros and cons of each; a classification of test techniques is given for each method. The paper also presents and analyzes the characteristics and mechanisms of static dependency analysis, and the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in the code verified with static methods. The article also covers the identification of dependencies within the framework of abstract interpretation and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, together with some of the tools that can be applied when using them. The paper concludes by describing the most relevant problems of these analysis techniques, methods of their solutions and
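
    To make the idea of symbolic execution concrete, the following illustrative Python sketch (not from the article) enumerates the path constraints of a tiny two-branch function by hand and checks each path's satisfiability with the Z3 solver; it assumes the open-source z3-solver package is installed:

        # Illustrative sketch of symbolic execution (not from the article).
        # Requires the z3-solver package: pip install z3-solver
        from z3 import Int, Solver, sat

        # Program under analysis (conceptually):
        #   def f(x):
        #       if x > 10:
        #           return x - 10   # path A
        #       else:
        #           return x        # path B

        x = Int("x")  # symbolic input

        # Path constraints collected along each branch.
        paths = {
            "A: x > 10, returns x - 10": [x > 10],
            "B: x <= 10, returns x":     [x <= 10],
        }

        for name, constraints in paths.items():
            s = Solver()
            s.add(*constraints)
            if s.check() == sat:
                # A concrete input that drives execution down this path.
                print(name, "-> example input:", s.model()[x])
            else:
                print(name, "-> infeasible path")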

  10. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
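
    As a rough illustration of the SRGM idea (not the Bayesian scheme developed in the paper), the Python sketch below fits the classical Goel-Okumoto NHPP mean-value function m(t) = a(1 - e^(-bt)) to hypothetical cumulative defect counts and reads off the estimated number of remaining defects:

        # Illustrative sketch of fitting a simple NHPP reliability growth model
        # (Goel-Okumoto), not the Bayesian scheme proposed in the paper.
        import numpy as np
        from scipy.optimize import curve_fit

        def mean_value(t, a, b):
            """Expected cumulative number of defects found by test time t."""
            return a * (1.0 - np.exp(-b * t))

        # Hypothetical test data: weeks of testing vs. cumulative defects found.
        weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
        defects = np.array([5, 9, 12, 14, 15, 16, 16, 17], dtype=float)

        (a_hat, b_hat), _ = curve_fit(mean_value, weeks, defects, p0=(20.0, 0.3))

        remaining = a_hat - mean_value(weeks[-1], a_hat, b_hat)
        print(f"estimated total defects a = {a_hat:.1f}")
        print(f"detection rate b = {b_hat:.2f} per week")
        print(f"estimated defects remaining after week {weeks[-1]:.0f}: {remaining:.1f}")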

  11. Inventory of safeguards software

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Horino, Koichi

    2009-03-01

    This survey will serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools are surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards and discussing a next R and D program for developing a general-purpose safeguards tool should be beneficial to safeguards system design and indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

  12. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts for resources consumed, in real-time, on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  13. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  14. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  15. The PANIC software system

    Science.gov (United States)

    Ibáñez Mengual, José M.; Fernández, Matilde; Rodríguez Gómez, Julio F.; García Segura, Antonio J.; Storz, Clemens

    2010-07-01

    PANIC is the Panoramic Near Infrared Camera for the 2.2m and 3.5m telescopes at Calar Alto observatory. The aim of the project is to build a wide-field general purpose NIR camera. In this paper we describe the software system of the instrument, which comprises four main packages: GEIRS for the instrument control and the data acquisition; the Observation Tool (OT), the software used for detailed definition and pre-planning the observations, developed in Java; the Quick Look tool (PQL) for easy inspection of the data in real-time and a scientific pipeline (PAPI), both based on the Python programming language.

  16. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  17. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  18. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-07-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been widely published whose titles refer to it as an art, a role or a process. But because software complexity is increasing every year, this paper proposes a new approach that conceives of testing as a science, since the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. The paper examines the similarities between testing and science and the characteristics that testing shares with a science.

  19. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software Development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: What is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  20. Agile Software Development

    OpenAIRE

    Stewart, Rhonda

    2009-01-01

    One of the most noticeable changes to software process thinking in the last ten years has been the appearance of the word ‘agile’ (Fowler, 2005). In the Information Technology (IT) industry Agile Software Development, or simply Agile is used to refer to a family of lightweight development approaches that share a common set of values and principles1 focused around adapting to change and putting people first (Fowler, 2005). Such Agile methods2 provide an alternative to the well-established Wate...

  1. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been widely published whose titles refer to it as an art, a role or a process. But because software complexity is increasing every year, this paper proposes a new approach that conceives of testing as a science, since the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. The paper examines the similarities between testing and science and the characteristics that testing shares with a science.

  2. Processeringsoptimering med Canons software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    . The possibilities of software optimization were studied in relation to optimal image quality and control exposures, in order to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Methods and materials: a quantitative experimental study based on experiments with a technical and...... a human phantom. The CD Rad phantom was used as the technical phantom; the images were analyzed with the CD Rad software, and the result was an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable to the absorption of a five-year-old child. The human test images were...

  3. Some challenges facing software engineers developing software for scientists

    OpenAIRE

    Segal, Judith

    2009-01-01

    In this paper, the author discusses two types of challenges facing software engineers as they develop software for scientists. The first type is those challenges that arise from the experience that scientists might have of developing their own software. From this experience, they internalise a model of software development but may not realise the contextual factors which make such a model successful. They thus have expectations and assumptions which prove challenging to software engineers. Th...

  4. The fallacy of Software Patents

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Software patents are usually used as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation while having little value for software users and our society in general.

  5. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  6. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  7. JSATS Decoder Software Manual

    Energy Technology Data Exchange (ETDEWEB)

    Flory, Adam E.; Lamarche, Brian L.; Weiland, Mark A.

    2013-05-01

    The Juvenile Salmon Acoustic Telemetry System (JSATS) Decoder is a software application that converts a digitized acoustic signal (a waveform stored in the .bwm file format) into a list of potential JSATS Acoustic MicroTransmitter (AMT) tagcodes along with other data about the signal, including time of arrival and signal-to-noise ratios (SNR). The software can decode single files or entire directories and can display raw acoustic waveforms. When coupled with the JSATS Detector, the Decoder is capable of decoding in ‘real-time’ and can also provide statistical information about acoustic beacons placed within receive range of hydrophones within a JSATS array. This document details the features and functionality of the software. The document begins with software installation instructions (section 2), followed in order by instructions for decoder setup (section 3), decoding process initiation (section 4), then monitoring of beacons (section 5) using real-time decoding features. The last section in the manual describes the beacon, beacon statistics, and results file formats. This document does not consider the raw binary waveform file format.
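
    As a rough illustration of one quantity the Decoder reports, the Python sketch below (not the actual JSATS code) estimates the SNR of a pulse in a digitized waveform by comparing the power inside a detection window with the power outside it; the sample rate and window positions are invented:

        # Illustrative SNR estimate for a pulse in a digitized waveform
        # (not the actual JSATS Decoder implementation).
        import numpy as np

        def estimate_snr_db(waveform, pulse_start, pulse_len):
            """SNR in dB: power inside the pulse window vs. power outside it."""
            pulse = waveform[pulse_start:pulse_start + pulse_len]
            noise = np.concatenate([waveform[:pulse_start],
                                    waveform[pulse_start + pulse_len:]])
            signal_power = np.mean(pulse ** 2)
            noise_power = np.mean(noise ** 2)
            return 10.0 * np.log10(signal_power / noise_power)

        # Synthetic example: noise with a short tone burst in the middle.
        fs = 10_000                      # sample rate (Hz), assumed
        t = np.arange(2000) / fs
        wave = 0.05 * np.random.randn(t.size)
        wave[900:1100] += np.sin(2 * np.pi * 1000 * t[900:1100])

        print(f"estimated SNR: {estimate_snr_db(wave, 900, 200):.1f} dB")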

  8. Reflections on Software Research

    Indian Academy of Sciences (India)

    Dennis M. Ritchie, "Reflections on Software Research", Classics, Resonance – Journal of Science Education, Volume 17, Issue 8, August 2012, pp. 810-816. Full text: http://www.ias.ac.in/article/fulltext/reso/017/08/0810-0816

  9. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  10. Overview of NWIS software

    Energy Technology Data Exchange (ETDEWEB)

    Mullens, J.A.; Mihalczo, J.T.

    2000-01-01

    The Nuclear Weapons Identification System (NWIS) is a system that performs radiation signature measurements on objects such as nuclear weapons components. NWIS consists of a ²⁵²Cf fission source, radiation detectors and associated analog electronics, data acquisition boards, and a computer running Windows NT and the application software. NWIS uses signal processing techniques to produce a radiation signature from the radiation emitted from the object. The signature can be stored and later compared to another signature to determine whether two objects are similar. A library of such signatures can be used to identify objects in closed containers as well as to determine attributes such as fissile mass and, in some cases, enrichment. There are three executables built from the software: (1) a Windows NT kernel-mode device driver; (2) a data acquisition application; and (3) a data analysis application. The device driver is the interface between the NWIS data acquisition boards and the remainder of the software. The data acquisition executable is the user's tool for making an NWIS measurement; it has limited data display abilities. The data analysis executable is a user's tool for displaying an NWIS measurement, including matching it to other NWIS measurements. A user's manual for the software is included.
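
    As a purely illustrative sketch of the signature-matching step (not the NWIS data analysis code), two stored signatures could be compared with a normalized correlation coefficient, assuming each signature is an equal-length numeric array; the threshold shown is invented:

        # Illustrative signature comparison (not the NWIS data analysis code).
        # Assumes a radiation signature is stored as an equal-length numeric array.
        import numpy as np

        def signature_similarity(sig_a, sig_b):
            """Normalized correlation between two signatures (1.0 = identical shape)."""
            a = (sig_a - sig_a.mean()) / sig_a.std()
            b = (sig_b - sig_b.mean()) / sig_b.std()
            return float(np.mean(a * b))

        reference = np.array([0.1, 0.4, 1.0, 0.7, 0.3, 0.1])
        measured  = np.array([0.12, 0.38, 0.95, 0.72, 0.31, 0.09])

        score = signature_similarity(reference, measured)
        print(f"similarity: {score:.3f}")
        print("match" if score > 0.95 else "no match")  # threshold is illustrative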

  11. The FARE Software

    Science.gov (United States)

    Pitarello, Adriana

    2015-01-01

    This article highlights the importance of immediate corrective feedback in tutorial software for language teaching in an academic learning environment. We aim to demonstrate that, rather than simply reporting on the performance of the foreign language learner, this feedback can act as a mediator of students' cognitive and metacognitive activity.…

  12. Green Software Products

    NARCIS (Netherlands)

    Jagroep, E.A.

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for more green, energy efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reach sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  13. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    2011. This memorandum expanded on the previous 2010 ban on the procurement of servers and voice switching equipment to include construction...Autodesk products, at a discount of up to 10% off of the GSA price. CA Unicenter enterprise management software is available at 64% off; BPwin and Erwin

  14. Hardening Software Defined Networks

    Science.gov (United States)

    2014-07-01

    the layers to act upon each other in very distinct ways. Examining the literature, we selected bipartite and tripartite network models as those...identify characteristics of multilayered networks. Bipartite and tripartite models are potentially most promising (and somewhat underutilized) in the... tripartite models are particularly well-suited to a confluence of traditional networks and software defined networks where SDN components are

  15. Software configuration management

    International Nuclear Information System (INIS)

    Arribas Peces, E.; Martin Faraldo, P.

    1993-01-01

    Software Configuration Management is directed towards identifying system configuration at specific points of its life cycle, so as to control changes to the configuration and to maintain the integrity and traceability of the configuration throughout its life. SCM functions and tasks are presented in the paper

  16. Software management issues

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1990-06-01

    The difficulty of managing the software in large HEP collaborations appears to be becoming progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it will become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  17. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing the requirements in a problem and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  18. Patterns in Software Development

    DEFF Research Database (Denmark)

    Corry, Aino Vonge

    the university and I entered a project to industry within Center for Object Technology (COT). I focused on promoting the pattern concept to the Danish software industry in order to help them take advantage of the benefits of applying patterns in system development. In the obligatory stay abroad, I chose to visit...

  19. SEER*Stat Software

    Science.gov (United States)

    If you have access to SEER Research Data, use SEER*Stat to analyze SEER and other cancer-related databases. View individual records and produce statistics including incidence, mortality, survival, prevalence, and multiple primary. Tutorials and related analytic software tools are available.

  20. MOCASSIN-prot software

    Science.gov (United States)

    MOCASSIN-prot is a software tool, implemented in Perl and Matlab, for constructing protein similarity networks to classify proteins. Both domain composition and quantitative sequence similarity information are utilized in constructing the directed protein similarity networks. For each reference protein i...

  1. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: `Current status of user level sparse BLAS`; `Current status of the sparse BLAS toolkit`; and `Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit`.
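
    To make concrete the kind of kernel such a toolkit provides, here is a minimal sparse matrix-vector multiply in compressed sparse row (CSR) form, written as a generic textbook sketch in Python rather than code from the sparse BLAS toolkit discussed at the workshop:

        # Minimal sparse matrix-vector multiply (y = A @ x) in CSR format.
        # A generic textbook sketch, not code from the sparse BLAS toolkit.

        def csr_matvec(data, indices, indptr, x):
            """data/indices/indptr are the standard CSR arrays for an n-row matrix."""
            n = len(indptr) - 1
            y = [0.0] * n
            for row in range(n):
                acc = 0.0
                for k in range(indptr[row], indptr[row + 1]):
                    acc += data[k] * x[indices[k]]
                y[row] = acc
            return y

        # 3x3 example:  [[2, 0, 1],
        #                [0, 3, 0],
        #                [4, 0, 5]]
        data    = [2.0, 1.0, 3.0, 4.0, 5.0]
        indices = [0, 2, 1, 0, 2]
        indptr  = [0, 2, 3, 5]

        print(csr_matvec(data, indices, indptr, [1.0, 1.0, 1.0]))  # -> [3.0, 3.0, 9.0]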

  2. Software Architecture Evolution

    Science.gov (United States)

    2013-12-01

    software development, such as TOGAF and the Rational Unified Process. Example: "Rather than simply utilize EA group and the knowledge and experience...we have [...], we're trying to use a couple additional tools, one of which is TOGAF. We're not strictly following that, but that's the basis of

  3. Limits of Software Reuse

    NARCIS (Netherlands)

    Holenderski, L.

    2006-01-01

    Software reuse is considered one of the main techniques to increase software productivity. We present two simple mathematical arguments that show some theoretical limits of reuse. It turns out that the increase of productivity due to internal reuse is at most linear, far from the needed exponential

  4. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from months to weeks. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
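
    For readers who want to see the predict/update cycle that the GKF generalizes, here is a bare-bones linear Kalman filter written as an illustrative Python sketch; it is not the GKF library itself (which is written in ANSI C), and the constant-velocity example values are invented:

        # Bare-bones linear Kalman filter predict/update cycle (illustrative sketch,
        # not the GKF library, which is written in ANSI C).
        import numpy as np

        def kalman_step(x, P, z, F, H, Q, R):
            """One predict + update cycle.
            x, P : prior state estimate and covariance
            z    : new measurement
            F, H : state-transition and measurement matrices
            Q, R : process and measurement noise covariances
            """
            # Predict (propagate state and covariance)
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Update (incorporate the measurement)
            S = H @ P_pred @ H.T + R                 # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new

        # 1-D constant-velocity example: state = [position, velocity].
        dt = 1.0
        F = np.array([[1.0, dt], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])                   # we only measure position
        Q = 1e-3 * np.eye(2)
        R = np.array([[0.25]])

        x, P = np.array([0.0, 0.0]), np.eye(2)
        for z in [1.1, 2.0, 2.9, 4.2]:               # noisy position measurements
            x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
        print("estimated position, velocity:", np.round(x, 2))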

  5. TWRS engineering bibliography software listing

    International Nuclear Information System (INIS)

    Husa, E.I.

    1995-01-01

    This document contains the computer software listing for the Engineering Bibliography software developed by E. Ivar Husa. This software is in the working prototype stage of development. The code has not been tested to requirements. TWRS Engineering created this software for engineers to share bibliographic references across the Hanford site network (HLAN). This software is intended to store several hundred to several thousand references (a compendium with limited range). Code changes are needed to support the larger number of references

  6. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.

  7. Understanding service-oriented software.

    OpenAIRE

    Gold, N. E.; Knight, C.; Mohan, A.; Munro, M.

    2004-01-01

    Service-oriented software is being hailed as the next revolutionary approach to software development. Service orientation allows organizations to rapidly and dynamically form new software applications to meet changing business needs, thus alleviating the software evolution problems that occur with traditional applications. The largest of these problems is the need to understand existing software before changing it. This article looks ahead toward the automated construction of service-oriented...

  8. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as other environments like the GN&C analyst's simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  9. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
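
    As a small illustration of the idea (not code from the article), a contract test can be written once against an abstract interface and then run against every concrete implementation, so quality is assessed at the interface rather than through a graphical user interface. The Python sketch below uses hypothetical names (Repository, InMemoryRepository):

        # Illustrative interface-based test: one contract test exercises every
        # implementation of an interface. Names and classes here are hypothetical.
        from abc import ABC, abstractmethod

        class Repository(ABC):
            """The interface under test."""
            @abstractmethod
            def save(self, key: str, value: str) -> None: ...
            @abstractmethod
            def load(self, key: str) -> str: ...

        class InMemoryRepository(Repository):
            def __init__(self):
                self._data = {}
            def save(self, key, value):
                self._data[key] = value
            def load(self, key):
                return self._data[key]

        def check_roundtrip(repo: Repository) -> None:
            """Contract test written only against the interface."""
            repo.save("greeting", "hello")
            assert repo.load("greeting") == "hello"

        # Run the same contract test against every available implementation.
        for impl in (InMemoryRepository,):
            check_roundtrip(impl())
            print(f"{impl.__name__}: interface contract satisfied")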

  10. Software Tag : Empirical Software Engineering Data for Traceability and Transparency of Software Project

    OpenAIRE

    Inoue, Katsuro

    2007-01-01

    In this paper, we propose a scheme, named Software Tag, for software trade and development that improves traceability and transparency. Empirical data is collected during development and processed into two types, an open tag and a secret tag, composing a software tag which is finally delivered to the software purchaser.

  11. Software testing - A way to improve software reliability

    Science.gov (United States)

    Mahindru, Andy

    1986-01-01

    Various software testing techniques are described. The techniques are classified as dynamic or static, structural or functional, and manual or automated. The objects tested include the elements designed during the development of the software, such as codes, data structures, and requirements. Testing techniques and procedures applicable to each phase of software development are examined; the development phases are: software requirements analysis, preliminary design, detailed design, coding, testing, and operation and maintenance. The characteristics of a future software engineering environment for software testing and validation are discussed.

  12. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing using mathematically precise notations and a highly reviewable tabular format to specify software requirements and software design, and to verify the design against the requirements and the code against the design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software produced, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  13. Chemical recognition software

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.J. Jr.; Tisone, G.C.

    1994-12-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.
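
    The multivariate concentration estimate can be pictured as inverting a linear mixing model (mixture spectrum = library spectra x concentrations). The Python sketch below is a toy with invented Gaussian "spectra", not the package described above; it uses SciPy's non-negative least squares to keep the recovered concentrations physically meaningful:

        # Toy linear-unmixing sketch (not the actual chemical recognition package):
        # mixture spectrum = library spectra @ concentrations + noise.
        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical library: each column is the reference spectrum of one chemical.
        wavelengths = np.linspace(0, 1, 50)
        library = np.column_stack([
            np.exp(-((wavelengths - 0.3) / 0.05) ** 2),   # chemical A
            np.exp(-((wavelengths - 0.6) / 0.08) ** 2),   # chemical B
        ])

        true_conc = np.array([0.7, 0.2])
        mixture = library @ true_conc + 0.01 * np.random.randn(wavelengths.size)

        # Non-negative least squares keeps concentrations physically meaningful.
        est_conc, residual = nnls(library, mixture)
        print("estimated concentrations:", np.round(est_conc, 2))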

  14. Chemical recognition software

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, J.S.; Trahan, M.W.; Nelson, W.E.; Hargis, P.H. Jr.; Tisone, G.C.

    1994-06-01

    We have developed a capability to make real time concentration measurements of individual chemicals in a complex mixture using a multispectral laser remote sensing system. Our chemical recognition and analysis software consists of three parts: (1) a rigorous multivariate analysis package for quantitative concentration and uncertainty estimates, (2) a genetic optimizer which customizes and tailors the multivariate algorithm for a particular application, and (3) an intelligent neural net chemical filter which pre-selects from the chemical database to find the appropriate candidate chemicals for quantitative analyses by the multivariate algorithms, as well as providing a quick-look concentration estimate and consistency check. Detailed simulations using both laboratory fluorescence data and computer synthesized spectra indicate that our software can make accurate concentration estimates from complex multicomponent mixtures, even when the mixture is noisy and contaminated with unknowns.

  15. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...... in conjunction with informal roles and relationships such as clan-like control inherent in agile development. Overall, the study demonstrates that, if appropriately applied, communication technologies can significantly support distributed, agile practices by allowing concurrent enactment of both formal...

  16. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  17. Software Configurable Multichannel Transceiver

    Science.gov (United States)

    Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter

    2009-01-01

    Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate the viability of a promising chipset that performs conversion of radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.

  18. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  19. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2

  20. Continuous software delivery

    OpenAIRE

    Krmavnar, Nina

    2015-01-01

    The main purpose of the thesis is the demonstration of one of the best possible approaches to an automated continuous delivery process as it relates to certain application types. In the introductory part, the main reason for choosing the subject is presented, along with a few examples of why nowadays - in order to keep pace with the competition - such an approach seems necessary. Following chapters discuss the basics of software delivery, starting with configuration and version control manage...

  1. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (script kiddies, hackers, advanced persistent threats (APT), nation states, etc.) and the attack surface has expanded as networks have become interconnected. Factors in an organization's security posture include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), industrial control systems (ICS), and software security (COTS, FOSS, custom, etc.).

  2. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that provide a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
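
    The conical-scan correction described above can be pictured as extracting the first Fourier harmonic of the received power around the scan circle: the harmonic's phase gives the direction of the pointing error and its amplitude scales with the offset. The Python toy below illustrates only that principle; it is not the ACR servo algorithm, and all numbers are invented:

        # Toy conical-scan error estimate (illustration of the principle only,
        # not the ACR servo algorithm).
        import numpy as np

        scan_angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)

        # Simulated received power: strongest when the scan passes closest to the
        # target, which here is offset toward scan angle ~45 degrees.
        true_direction = np.deg2rad(45.0)
        power = 1.0 + 0.1 * np.cos(scan_angles - true_direction)
        power += 0.01 * np.random.randn(scan_angles.size)      # measurement noise

        # First Fourier harmonic of the power over one scan revolution.
        harmonic = np.mean(power * np.exp(1j * scan_angles))
        error_direction = np.angle(harmonic)       # direction of the pointing error
        error_depth = 2.0 * np.abs(harmonic)       # modulation depth, grows with offset

        print(f"correction direction: {np.degrees(error_direction):.1f} deg")
        print(f"relative modulation depth: {error_depth:.3f}")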

  3. Software Intensive Systems

    Science.gov (United States)

    2006-07-01

    Mr. Carl Siel - CHENG. Executive Secretaries: Dr. William Bail, MITRE; Ms. Cathy Ricketts, PEO-IWS; Mr. Fred Heinemann, EDO. Study... [figure: computer producers by location - China, US, Japan] Globalizing of Software and Hardware: In order to fulfill the growing needs, companies have been...computer manufacturers, the trend towards offshoring has been significant. In a three-year period, the proportion of 300mm fabrication plants in the U.S

  4. ThermalTracker Software

    Energy Technology Data Exchange (ETDEWEB)

    2016-08-10

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.

  5. Introduction to Software Design

    Science.gov (United States)

    1989-01-01

    Copies are also available through the National Technical Information Service. For information on ordering, please contact NTIS directly: National Technical Information Service, U.S. Department of Commerce, Springfield, VA 22161. Use of any trademarks in this report is not intended in any way to infringe on... Related SEI educational materials: Maintenance Exercises for a Software Engineering Project Course; CM-2, Introduction to Software Design; CM-3, The Software Technical Review Process; EM-2, APSE

  6. Office software Individual coaching

    CERN Multimedia

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get the help of our trainer who will come to your workplace for a multiple of 1-hour slots. All fields in which our trainer can help are detailed in the course description in our training catalogue (Microsoft Office software, Adobe applications, i-applications etc.). Please discover these new courses in our catalogue! Tel. 74924

  7. Image Processing Software

    Science.gov (United States)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  8. Dependency Tree Annotation Software

    Science.gov (United States)

    2015-11-01

    between words. DTE supports the widely used Conference on Computational Natural Language Learning (CoNLL)-X format as well as several other file...formats, and it provides numerous options for customizing how dependency trees are displayed. Built entirely in Java, it can run on a wide range of...software application called Dependency Tree Editor (DTE) that can read files in Computational Natural Language Learning (CoNLL)-X format and use them

  9. Standard software for CAMAC

    International Nuclear Information System (INIS)

    Lenkszus, F.R.

    1978-01-01

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  10. FPGAs for software programmers

    CERN Document Server

    Hannig, Frank; Ziener, Daniel

    2016-01-01

    This book makes powerful Field Programmable Gate Array (FPGA) and reconfigurable technology accessible to software engineers by covering different state-of-the-art high-level synthesis approaches (e.g., OpenCL and several C-to-gates compilers). It introduces FPGA technology, its programming model, and how various applications can be implemented on FPGAs without going through low-level hardware design phases. Readers will get a realistic sense for problems that are suited for FPGAs and how to implement them from a software designer’s point of view. The authors demonstrate that FPGAs and their programming model reflect the needs of stream processing problems much better than traditional CPU or GPU architectures, making them well-suited for a wide variety of systems, from embedded systems performing sensor processing to large setups for Big Data number crunching. This book serves as an invaluable tool for software designers and FPGA design engineers who are interested in high design productivity through behavi...

  11. Evidence of Absence software

    Science.gov (United States)

    Dalthorp, Daniel; Huso, Manuela M. P.; Dail, David; Kenyon, Jessica

    2014-01-01

    Evidence of Absence software (EoA) is a user-friendly application used for estimating bird and bat fatalities at wind farms and designing search protocols. The software is particularly useful in addressing whether the number of fatalities has exceeded a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software is applicable even when zero carcasses have been found in searches. Depending on the effectiveness of the searches, such an absence of evidence of mortality may or may not be strong evidence that few fatalities occurred. Under a search protocol in which carcasses are detected with nearly 100 percent certainty, finding zero carcasses would be convincing evidence that overall mortality rate was near zero. By contrast, with a less effective search protocol with low probability of detecting a carcass, finding zero carcasses does not rule out the possibility that large numbers of animals were killed but not detected in the searches. EoA uses information about the search process and scavenging rates to estimate detection probabilities to determine a maximum credible number of fatalities, even when zero or few carcasses are observed.
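
    The underlying logic can be sketched very simply: if each fatality is detected with overall probability g, then finding zero carcasses when M fatalities occurred has probability (1 - g)^M, and the largest M for which that probability stays above a chosen evidence level is the "maximum credible" count. The Python toy below illustrates only this logic; it is not the EoA estimator, which accounts for search schedules, scavenging, and uncertainty in g:

        # Toy illustration of the "evidence of absence" logic (not the EoA software,
        # which uses a full estimator with search and scavenging data).

        def max_credible_fatalities(detection_prob: float, alpha: float = 0.05) -> int:
            """Largest fatality count M for which finding zero carcasses is still
            plausible, i.e. P(0 detections | M) = (1 - detection_prob)**M >= alpha."""
            m = 0
            while (1.0 - detection_prob) ** (m + 1) >= alpha:
                m += 1
            return m

        for g in (0.10, 0.50, 0.90):   # overall probability of detecting a carcass
            print(f"detection prob {g:.2f}: up to {max_credible_fatalities(g)} "
                  f"fatalities are consistent with finding zero carcasses")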

  12. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...... presented at the conference, and the experiences of the authors, give rise to certain recommendations for instructors and software developers. The main conclusions are that both calculation software and illustration software are needed in the teaching and learning process, and that no computing skills...

  13. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  14. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2009-03-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  15. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2012-08-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  16. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adapting standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing....

  17. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  18. NASA PC software evaluation project

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The product categories covered include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  19. The software invention cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    J.A. Bergstra; P. Klint (Paul)

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention

  20. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    Jun 1, 2013 ... after delivery to the customer or user. Software maintenance is an important activity in many organizations today. This is ..... experience with the application. Businesswise, software is evolved because the software is successful in the marketplace, the revenue accruing from it is high, user demand is strong, ...

  1. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  2. The Software Invention Cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention Cube

  3. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  4. Design Principles for Interactive Software

    DEFF Research Database (Denmark)

    The book addresses the crucial intersection of human-computer interaction (HCI) and software engineering by asking both what users require from interactive systems and what developers need to produce well-engineered software. Needs are expressed as...

  5. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  6. Software Engineering for Human Spaceflight

    Science.gov (United States)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  7. Perspectives on Open Source Software

    National Research Council Canada - National Science Library

    Hissam, Scott

    2001-01-01

    Open source software (OSS) is emerging as the software community's next "silver bullet" and appears to be playing a significant role in the acquisition and development plans of the Department of Defense (DoD) and industry...

  8. Imperfect Requirements in Software Development

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet; Sawyer, Pete; Paech, Barbara; Heymans, Patrick

    2007-01-01

    Requirement Specifications are very difficult to define. Due to lack of information and differences in interpretation, software engineers are faced with the necessity to redesign and iterate. This imperfection in software requirement specifications is commonly addressed by incremental design. In

  9. Software Metrics Capability Evaluation Guide

    National Research Council Canada - National Science Library

    Budlong, Faye

    1995-01-01

    ...: disseminating information regarding the U.S. Air Force Policy on software metrics, providing metrics information to the public through CrossTalk, conducting customer workshops in software metrics, guiding metrics technology adoption programs...

  10. Software Components for Web Services

    OpenAIRE

    Ramachandran, Muthu; Nair, T. R. Gopalakrsihnan; Selvarani, R.

    2010-01-01

    Service-oriented computing has emerged as a new area addressing software as a service. This paper proposes a model for component-based development of service-oriented systems and presents best-practice guidelines for software component design.

  11. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not carry the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  12. Research on Software Security Testing

    OpenAIRE

    Gu Tian-yang; Shi Yin-sheng; Fang You-yuan

    2010-01-01

    Software security testing is an important means of ensuring software security and trustworthiness. This paper first discusses the definition and classification of software security testing and broadly surveys its methods and tools. It then analyzes and summarizes the advantages and disadvantages of the various methods and their scope of application, and presents a taxonomy of security testing tools. Finally, the paper points out the future focus and development directions of so...

  13. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  14. Motivasi Pembajakan Software: Perspektif Mahasiswa

    OpenAIRE

    Wahid, Fathul

    2004-01-01

    The study aims to investigate (1) the pattern of software piracy among Indonesian students and (2) the motivating factors behind the piracy. A survey of 122 undergraduate students in informatics discloses that the rate of software piracy among students is high. This finding is in line with global software piracy statistics showing that the piracy rate in Indonesia is very high, though the revenue loss to piracy is very small (1.1%) when compared to that in the US and Canada. Unaffordable price of legal softwar...

  15. Motivasi Pembajakan Software: Perspektif Mahasiswa

    OpenAIRE

    Wahid, Fathul

    2009-01-01

    The study aims to investigate (1) the pattern of software piracy among Indonesian students and (2) the motivating factors behind the piracy. A survey of 122 undergraduate students in informatics discloses that the rate of software piracy among students is high. This finding is in line with global software piracy statistics showing that the piracy rate in Indonesia is very high, though the revenue loss to piracy is very small (1.1%) when compared to that in the US and Canada. Unaffordable price of legal softwar...

  16. Software engineering a practitioner's approach

    CERN Document Server

    Pressman, Roger S

    1997-01-01

    This indispensable guide to software engineering exploration enables practitioners to navigate the ins and outs of this rapidly changing field. Pressman's fully revised and updated Fourth Edition provides in-depth coverage of every important management and technical topic in software engineering. Moreover, readers will find the inclusion of the hottest developments in the field such as: formal methods and cleanroom software engineering, business process reengineering, and software reengineering.

  17. Free software: Some Brazilian translations

    OpenAIRE

    Pinheiro, Alexandre; Cukierman, Henrique

    2004-01-01

    We examine two histories in this paper. First, we briefly look at North American history and the relationship of free software with the founding principles of democracy. Second, we examine recent Brazilian history, especially the policy decision to adopt free software, affirming technological autonomy. Democratic ideals, defended by the free software movement, are transformed in Brazilian politics, leading both to further free software development and a stronger democr...

  18. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea already works today. But there are still difficulties when it comes to behaviour. Actually, there is no lack of models...

  19. Decision making in software architecture

    NARCIS (Netherlands)

    van Vliet, Hans; Tang, Anthony

    2016-01-01

    Traditionally, software architecture is seen as the result of the software architecture design process, the solution, usually represented by a set of components and connectors. Recently, the why of the solution, the set of design decisions made by the software architect, is complementing or even

  20. Software product lines : Organizational alternatives

    NARCIS (Netherlands)

    Bosch, J

    2001-01-01

    Software product lines enjoy increasingly wide adoption in the software industry. Most authors focus on the technical and process aspects and assume an organizational model consisting of a domain engineering unit and several application engineering units. In our cooperation with several software

  1. Reflections on Software Engineering Education

    NARCIS (Netherlands)

    van Vliet, H.

    2006-01-01

    In recent years, the software engineering community has focused on organizing its existing knowledge and finding opportunities to transform that knowledge into a university curriculum. SWEBOK (the Guide to the Software Engineering Body of Knowledge) and Software Engineering 2004 are two initiatives

  2. Software that meets its Intent

    NARCIS (Netherlands)

    Huisman, Marieke; Bos, Herbert; Brinkkemper, Sjaak; van Deursen, Arie; Groote, Jan Friso; Lago, Patricia; van de Pol, Jaco; Visser, Eelco; Margaria, Tiziana; Steffen, Bernhard

    2016-01-01

    Software is widely used, and society increasingly depends on its reliability. However, software has become so complex and it evolves so quickly that we fail to keep it under control. Therefore, we propose intents: fundamental laws that capture a software system's intended behavior (resilient,

  3. Free Software and Free Textbooks

    Science.gov (United States)

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  4. Software Startups - A Research Agenda

    Directory of Open Access Journals (Sweden)

    Michael Unterkalmsteiner

    2016-10-01

    Full Text Available Software startup companies develop innovative, software-intensive products within limited time frames and with few resources, searching for sustainable and scalable business models. Software startups are quite distinct from traditional mature software companies, but also from micro-, small-, and medium-sized enterprises, introducing new challenges relevant for software engineering research. This paper's research agenda focuses on software engineering in startups, identifying, in particular, 70+ research questions in the areas of supporting startup engineering activities, startup evolution models and patterns, ecosystems and innovation hubs, human aspects in software startups, applying startup concepts in non-startup environments, and methodologies and theories for startup research. We connect and motivate this research agenda with past studies in software startup research, while pointing out possible future directions. While all authors of this research agenda have their main background in Software Engineering or Computer Science, their interest in software startups broadens the perspective to the challenges, but also to the opportunities that emerge from multi-disciplinary research. Our audience is therefore primarily software engineering researchers, even though we aim at stimulating collaborations and research that crosses disciplinary boundaries. We believe that with this research agenda we cover a wide spectrum of the software startup industry's current needs.

  5. BLTC control system software

    Energy Technology Data Exchange (ETDEWEB)

    Logan, J.B., Fluor Daniel Hanford

    1997-02-10

    This is a direct revision to Rev. 0 of the BLTC Control System Software. The entire document is being revised and released as HNF-SD-FF-CSWD-025, Rev 1. The changes incorporated by this revision include addition of a feature to automate the sodium drain when removing assemblies from sodium wetted facilities. Other changes eliminate locked in alarms during cold operation and improve the function of the Oxygen Analyzer. See FCN-620498 for further details regarding these changes. Note the change in the document number prefix, in accordance with HNF-MD-003.

  6. Office software Individual coaching

    CERN Multimedia

    HR Department

    2010-01-01

    If one or several particular topics cause you sleepless nights, you can get help from our trainer who will come to your workplace for one or more 1-hour slots. All fields in which our trainer can help are detailed in the course description in our training catalogue (Microsoft Office software, Adobe applications, i-applications etc.) Discover these new courses in our catalogue! http://cta.cern.ch/cta2/f?p=110:9 Technical Training Service Technical.Training@cern.ch Tel 74924

  7. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data-processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  8. Calidad del software: camino hacia una verdadera industria del software

    Directory of Open Access Journals (Sweden)

    Saulo Ernesto Rojas Salamanca

    1999-07-01

    Full Text Available Software is perhaps one of the engineering products that has evolved the most in a very short time, moving from empirical or artisanal software to software developed under the principles and tools of software engineering. Within these changes, however, the people in charge of building software have faced very common problems: some due to the ever greater demands on what software must deliver, driven by permanently changing conditions that increase its complexity and obsolescence; and others due to the lack of adequate tools and organizational standards aimed at improving software development processes. This article is oriented toward the search for mechanisms to solve these latter problems...

  9. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  10. High resolution polarimeter-interferometer system for fast equilibrium dynamics and MHD instability studies on Joint-TEXT tokamak (invited)a)

    Science.gov (United States)

    Chen, J.; Zhuang, G.; Li, Q.; Liu, Y.; Gao, L.; Zhou, Y. N.; Jian, X.; Xiong, C. Y.; Wang, Z. J.; Brower, D. L.; Ding, W. X.

    2014-11-01

    A high-performance Faraday-effect polarimeter-interferometer system has been developed for the J-TEXT tokamak. This system has a time response of up to 1 μs and phase resolution sufficient to resolve perturbations associated with intrinsic Magneto-Hydro-Dynamic (MHD) instabilities and external coil-induced Resonant Magnetic Perturbations (RMP). The 3-wave technique, in which the line-integrated Faraday angle and electron density are measured simultaneously by three laser beams with specific polarizations and frequency offsets, is used. In order to achieve optimum resolution, three frequency-stabilized HCOOH lasers (694 GHz, >35 mW per cavity) and sensitive Planar Schottky Diode mixers are used, providing stable intermediate-frequency signals (0.5-3 MHz) with S/N > 50. The collinear R- and L-wave probe beams, which propagate vertically through the plasma poloidal cross section (a = 0.25-0.27 m), are expanded using parabolic mirrors to cover the entire plasma column. Sources of systematic error, e.g., stemming from mechanical vibration, beam non-collinearity, and beam polarization distortion, are individually examined and minimized to ensure measurement accuracy. Simultaneous density and Faraday measurements have been successfully achieved for 14 chords. Based on these measurements, the temporal evolution of the safety factor profile, current density profile, and electron density profile are resolved. Core magnetic and density perturbations associated with MHD tearing instabilities are clearly detected. Effects of non-axisymmetric 3D RMP in ohmically heated plasmas are directly observed by polarimetry for the first time.
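
    For background (a standard relation quoted here as context, not a result from the article), the quantity measured by such a Faraday-effect polarimeter is the line-integrated product of electron density and parallel magnetic field,

        \[ \alpha_F\,[\mathrm{rad}] \;=\; 2.62\times10^{-13}\,\lambda^{2}\int n_e\,B_{\parallel}\,\mathrm{d}l , \]

    with the wavelength λ in m, n_e in m^-3, B_parallel in T and the path element dl in m; combined with the interferometric phase, which yields the line-integrated density alone, this is what allows the current-density and safety-factor profiles to be reconstructed.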

  11. Polarization Calibration of the Chromospheric Lyman-Alpha SpectroPolarimeter for a 0.1% Polarization Sensitivity in the VUV Range. Part II: In-Flight Calibration

    Science.gov (United States)

    Giono, G.; Ishikawa, R.; Narukage, N.; Kano, R.; Katsukawa, Y.; Kubo, M.; Ishikawa, S.; Bando, T.; Hara, H.; Suematsu, Y.; Winebarger, A.; Kobayashi, K.; Auchère, F.; Trujillo Bueno, J.; Tsuneta, S.; Shimizu, T.; Sakao, T.; Cirtain, J.; Champey, P.; Asensio Ramos, A.; Štěpán, J.; Belluzzi, L.; Manso Sainz, R.; De Pontieu, B.; Ichimoto, K.; Carlsson, M.; Casini, R.; Goto, M.

    2017-04-01

    The Chromospheric Lyman-Alpha SpectroPolarimeter is a sounding rocket instrument designed to measure for the first time the linear polarization of the hydrogen Lyman-α line (121.6 nm). The instrument was successfully launched on 3 September 2015 and observations were conducted at the solar disc center and close to the limb during the five-minute flight. In this article, the disc center observations are used to provide an in-flight calibration of the instrument's spurious polarization. The derived in-flight spurious polarization is consistent with the spurious polarization levels determined during the pre-flight calibration, and a statistical analysis of the polarization fluctuations of solar origin is conducted to ensure a 0.014% precision on the spurious polarization. The combination of the pre-flight and the in-flight polarization calibrations provides a complete picture of the instrument response matrix, and a proper error transfer method is used to confirm the achieved polarization accuracy. As a result, the unprecedented 0.1% polarization accuracy of the instrument in the vacuum ultraviolet is ensured by the polarization calibration.
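
    Schematically (a generic formulation, not the instrument-specific matrix reported in the article), the calibration determines the response matrix X that maps the true Stokes vector onto the measured one,

        \[ \mathbf{S}_{\mathrm{obs}} = \mathbf{X}\,\mathbf{S}_{\mathrm{true}}, \qquad \mathbf{S}_{\mathrm{true}} = \mathbf{X}^{-1}\,\mathbf{S}_{\mathrm{obs}} , \]

    so the spurious-polarization terms (the elements coupling intensity I into Q and U) must be known at roughly the 10^-3 level or better for the inversion to deliver the quoted 0.1% polarization accuracy.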

  12. Calibration of the Breit-Rabi Polarimeter for the PAX Spin-Filtering Experiment at COSY/Jülich and AD/CERN

    CERN Document Server

    Barschel, Colin

    2010-01-01

    The PAX (Polarized Antiproton eXperiment) experiment is proposed to polarize a stored antiproton beam for use at the planned High Energy Storage Ring (HESR) of the FAIR facility at GSI (Darmstadt, Germany). The polarization build-up will be achieved by spin-filtering, i.e., by a repetitive passage of the antiproton beam through a polarized atomic hydrogen or deuterium gas target. The experimental setup requires a Polarized Internal gas Target (PIT) surrounded by silicon detectors. The PIT includes an Atomic Beam Source (ABS), the target cell and a Breit-Rabi Polarimeter (BRP). The first phase of the Spin-Filtering Studies for PAX covers the commissioning of the PIT components and the measurement of an absolute calibration standard for the BRP at the COSY ring in Jülich. The spin-filtering studies with protons aim at confirming the results of the FILTEX experiment and determining the pp hadronic spin-dependent cross sections at 50 MeV. The second phase will be realized in the Antiproton Decelerator ring (AD) at CERN to po...

  13. An efficient, FPGA-based, cluster detection algorithm implementation for a strip detector readout system in a Time Projection Chamber polarimeter

    Science.gov (United States)

    Gregory, Kyle J.; Hill, Joanne E.; Black, J. Kevin; Baumgartner, Wayne H.; Jahoda, Keith

    2016-05-01

    A fundamental challenge in a spaceborne application of a gas-based Time Projection Chamber (TPC) for observation of X-ray polarization is handling the large amount of data collected. The TPC polarimeter described uses the APV-25 Application Specific Integrated Circuit (ASIC) to read out a strip detector. Two-dimensional photoelectron track images are created with a time projection technique and used to determine the polarization of the incident X-rays. The detector produces a 128x30 pixel image per photon interaction with each pixel registering 12 bits of collected charge. This creates challenging requirements for data storage and downlink bandwidth with only a modest incidence of photons and can have a significant impact on the overall mission cost. An approach is described for locating and isolating the photoelectron track within the detector image, yielding a much smaller data product, typically between 8x8 pixels and 20x20 pixels. This approach is implemented using a Microsemi RT-ProASIC3-3000 Field-Programmable Gate Array (FPGA), clocked at 20 MHz and utilizing 10.7k logic gates (14% of FPGA), 20 Block RAMs (17% of FPGA), and no external RAM. Results will be presented, demonstrating successful photoelectron track cluster detection with minimal impact to detector dead-time.
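
    The cluster-isolation step can be pictured with a short software sketch (Python/NumPy, purely for illustration; the approach described above is implemented in FPGA logic, not in software): threshold the charge image, locate the above-threshold pixels, and keep only a padded bounding box around them.

        import numpy as np

        def isolate_track(image, threshold, pad=2):
            """Crop a small window around the above-threshold pixels of a charge image."""
            hits = np.argwhere(image > threshold)        # (row, col) of candidate track pixels
            if hits.size == 0:
                return None                              # no photoelectron track in this frame
            (r0, c0), (r1, c1) = hits.min(axis=0), hits.max(axis=0)
            r0, c0 = max(r0 - pad, 0), max(c0 - pad, 0)
            r1 = min(r1 + pad, image.shape[0] - 1)
            c1 = min(c1 + pad, image.shape[1] - 1)
            return image[r0:r1 + 1, c0:c1 + 1]           # typically ~8x8 to ~20x20 pixels

        # Synthetic 128x30 frame with a small bright cluster, mimicking one photon interaction
        frame = np.random.poisson(5, size=(128, 30))
        frame[60:66, 10:14] += 200
        print(isolate_track(frame, threshold=50).shape)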

  14. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
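
    As a purely illustrative sketch of the if-trigger-then-action idea (the rule structure and event fields below are hypothetical, not the authors' actual notation), a rule binds a predicate over storage events to an action such as indexing or replication:

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class Rule:
            """if <trigger> then <action>, evaluated whenever a storage event arrives."""
            trigger: Callable[[Dict], bool]   # predicate over an event description
            action: Callable[[Dict], None]    # what to do when the predicate holds

        def dispatch(event: Dict, rules: List[Rule]) -> None:
            for rule in rules:
                if rule.trigger(event):
                    rule.action(event)

        # Hypothetical rule: when a new FITS file appears under /data/raw, index and replicate it.
        rules = [Rule(trigger=lambda e: e["type"] == "file_created" and e["path"].endswith(".fits"),
                      action=lambda e: print("index and replicate", e["path"]))]
        dispatch({"type": "file_created", "path": "/data/raw/scan042.fits"}, rules)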

  15. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  16. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures used to review software written for computer-based instrumentation and control functions in nuclear facilities. The use of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofitted into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants", and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations", for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections of such systems are done, inspectors examine very closely the processes followed in developing the codes, the errors that were detected, how they were found, and the analysis that went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future.

  17. CSAM Metrology Software Tool

    Science.gov (United States)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
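
    A minimal sketch of this kind of analysis (illustrative only; the actual color mapping and delamination criteria used by CMeST are not described in this abstract, so the thresholds below are hypothetical) classifies pixels of a false-color image by color range and reports the delaminated-area fraction for a selected region:

        import numpy as np

        def delamination_fraction(rgb, region=None, red_min=180, green_max=100, blue_max=100):
            """Fraction of pixels whose false color falls in an assumed delamination range."""
            if region is not None:                      # region = (row0, row1, col0, col1)
                r0, r1, c0, c1 = region
                rgb = rgb[r0:r1, c0:c1]
            mask = (rgb[..., 0] >= red_min) & (rgb[..., 1] <= green_max) & (rgb[..., 2] <= blue_max)
            return float(mask.mean())

        # Baseline-versus-later comparison for the same package (images loaded elsewhere):
        # growth = delamination_fraction(later_image) - delamination_fraction(baseline_image)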

  18. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available perform according to customers’ expectations. Software methodologies are an important aspect in software development companies. Maddison (1984) defines a methodology as a “recommended collection of philosophies, phases, procedures, rules, techniques..., tools, documentation, management, and training for developers of information systems”. Hence, understanding the differences in customer interaction between software methodologies is not only important to the software team but also important...

  19. The Ragnarok Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    Ragnarok is an experimental software development environment that focuses on enhanced support for managerial activities in large scale software development taking the daily work of the software developer as its point of departure. The main emphasis is support in three areas: management, navigation......, and collaboration. The leitmotif is the software architecture, which is extended to handle managerial data in addition to source code; this extended software architecture is put under tight version- and configuration management control and furthermore used as basis for visualisation. Preliminary results of using...

  20. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems requires the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. The software optimization project must also meet the organization's need to earn profit. The software optimization project is an integral part of the application life cycle because it shares the same resources, depends on other stages and influences subsequent phases. The optimization project has some particularities because it works on a finished product, focusing on its quality. The process is quality- and performance-oriented and assumes that the product life cycle is almost finished.

  1. Secure software development training course

    Directory of Open Access Journals (Sweden)

    Victor S. Gorbatov

    2017-06-01

    Full Text Available Information security is one of the most important criteria for the quality of developed software. To obtain a sufficient level of application security, companies integrate a security process into the software development life cycle. At this stage software companies encounter a deficit of employees able to solve problems of secure software design, implementation and application security. This article provides a description of a secure software development training course. The application security training course is designed for the joint education of students from different IT specializations.

  2. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    In software development, there is an interplay between Software Process models and Software Process enactments. The former tends to be abstract descriptions or plans. The latter tends to be specific instantiations of some ideal procedure. In this paper, we examine the role of work artifacts...... and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts...

  3. Robust Software Architecture for Robots

    Science.gov (United States)

    Aghazanian, Hrand; Baumgartner, Eric; Garrett, Michael

    2009-01-01

    Robust Real-Time Reconfigurable Robotics Software Architecture (R4SA) is the name of both a software architecture and software that embodies the architecture. The architecture was conceived in the spirit of current practice in designing modular, hard real-time aerospace systems. The architecture facilitates the integration of new sensory, motor, and control software modules into the software of a given robotic system. R4SA was developed for initial application aboard exploratory mobile robots on Mars, but is adaptable to terrestrial robotic systems, real-time embedded computing systems in general, and robotic toys.

  4. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    Science.gov (United States)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
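
    One common way to express the probability of failure-free operation (quoted here as a generic textbook model, not necessarily the specific models used by the Space Station Software Analysis team) is a non-homogeneous Poisson process such as the Goel-Okumoto model,

        \[ \mu(t) = a\,(1 - e^{-bt}), \qquad R(x \mid t) = \exp\!\left[-\big(\mu(t+x) - \mu(t)\big)\right] , \]

    where a is the expected total number of faults, b the per-fault detection rate, μ(t) the expected number of failures observed by test time t, and R(x|t) the probability of failure-free operation for an additional duration x after testing up to time t; fitting a and b to observed failure times yields the failure-rate estimates mentioned above.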

  5. Factors That Affect Software Testability

    Science.gov (United States)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testabilities from the outset, i.e., create software with as high a degree of testability as possible to avoid the problems of having undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. We do this in order to decrease the likelihood that faults will remain undetected during testing.
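
    A small, hypothetical example of the information-loss phenomenon: the internal fault below corrupts the program state on every iteration, yet a later operation collapses many distinct internal states onto the same output, so the defect only rarely propagates to an observable failure.

        def parity_of_sum(values):
            total = 0
            for v in values:
                total += v + 1       # FAULT: should be `total += v`
            return total % 2         # the modulo discards most of the corrupted state

        # The extra "+1" per element cancels under mod 2 whenever len(values) is even,
        # so tests that happen to use even-length inputs never observe a failure.
        assert parity_of_sum([2, 4]) == 0          # passes despite the fault
        assert parity_of_sum([1, 2, 3, 4]) == 0    # passes despite the fault
        print(parity_of_sum([1, 2, 3]))            # prints 1, but the correct parity is 0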

  6. Software for safety critical applications

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Jurickova, M.; Chudy, R.

    2001-01-01

    The contribution gives an overview of the project of the software development for safety critical applications. This project has been carried out since 1997. The principal goal of the project was to establish a research laboratory for the development of the software with the highest requirements for quality and reliability. This laboratory was established at the department, equipped with proper hardware and software to support software development. A research team of predominantly young researchers for software development was created. The activities of the research team started with studying and proposing the software development methodology. In addition, this methodology was applied to the real software development. The verification and validation process followed the software development. The validation system for the integrated hardware and software tests was brought into being and its control software was developed. The quality of the software tools was also observed, and the SOSAT tool was used during these activities. National and international contacts were established and maintained during the project solution.(author)

  7. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

    The customer's satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  8. Managing the Software Development Process

    Science.gov (United States)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We will also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper, so various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  9. Software engineering beyond the project

    DEFF Research Database (Denmark)

    Dittrich, Yvonne

    2014-01-01

    ... these conditions are not given? The article claims that this is the case for software product specific ecosystems. As software is increasingly developed, adopted and deployed in the form of customisable and configurable products, software engineering as a discipline needs to take on the challenge to support software ecosystems. Objective The article provides a holistic understanding of the observed and reported practices as a starting point to devise specific support for the development in software ecosystems. Method A qualitative interview study was designed based on previous long-term ethnographically inspired research. Results The analysis results in a set of common features of product development and evolution despite differences in size, kind of software and business models. Design is distributed and needs to be coordinated across heterogeneous design constituencies that, together with the software...

  10. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  11. Software libre vs. software propietario: programando nuestro futuro

    Directory of Open Access Journals (Sweden)

    Rafael Gómez Sánchez

    2008-12-01

    Full Text Available This work studies the evolution of two opposed models: proprietary software and free software. While the first is fully established and supported by the traditional software industry, free software appears as an attractive alternative that promises to correct many of that model's deficiencies. Based on the philosophy of respecting the user's freedoms (the freedom to share, improve and use the programs), an increasing number of administrations, companies and other users are opting for free software. The interaction between both models and its consequences, as well as the attempts of the software multinationals not to lose market share, will also be studied.

  12. Documenting Software Architectures in an Agile World

    National Research Council Canada - National Science Library

    Clements, Paul

    2003-01-01

    This report compares the Software Engineering Institute's Views and Beyond approach for documenting software architectures with the documentation philosophy embodied in agile software-development methods...

  13. Energy Tracking Software Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ryan Davis; Nathan Bird; Rebecca Birx; Hal Knowles

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And, social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  14. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...

  15. WISDAAM software programmer's manual

    International Nuclear Information System (INIS)

    Ball, J.R.

    1992-10-01

    The WISDAAM system was developed to provide quality control over test data associated with in situ testing at the Waste Isolation Pilot Plant (WIPP). Assurance of data quality is of critical importance as these tests supply the information which will be used for development and verification of the technology required for repository implementation. The amount of data collected from the tests, which are some of the largest ever fielded in an underground facility, prompted the undertaking of a major project task to address data processing. The goal was to create a conceptual umbrella under which all of the activities associated with processing WIPP data (i.e., data reduction, archiving, retrieval, etc.) could be grouped. The WISDAAM system was the product of this task. The overall system covers electronic as well as manual data processing; however, this document deals primarily with those operations implemented by software running on a VAX computer

  16. BNL multiparticle spectrometer software

    International Nuclear Information System (INIS)

    Saulys, A.C.

    1984-01-01

    This paper discusses some solutions to problems common to the design, management and maintenance of a large high energy physics spectrometer software system. The experience of dealing with a large, complex program and the necessity of having the program controlled by various people at different levels of computer experience has led us to design a program control structure of mnemonic and self-explanatory nature. The use of this control language in both on-line and off-line operation of the program will be discussed. The solution of structuring a large program for modularity so that substantial changes to the program can be made easily for a wide variety of high energy physics experiments is discussed. Specialized tools for this type of large program management are also discussed

  17. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big... (...%). Beyond these specific topics, the study results also show an increasing interest into secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps directing future research by identifying under-researched topics awaiting further investigation.

  18. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data networks are designed and managed. This thesis argues that SDN can greatly simplify QoS provisioning in communication networks, and even improve QoS in various ways. To this end, the impact of SDN on QoS is assessed from both a network performance perspective (e.g. bandwidth, delay), and also from a more generic perspective (e.g. service provisioning speed, resources availability). As a result, new mechanisms for providing QoS are proposed, solutions for SDN-specific QoS challenges are designed and tested, and new network management concepts are prototyped, all aiming to improve QoS for network services...

  19. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  20. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.
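
    A minimal sketch of the hierarchical, self-similar structure described above, using hypothetical class and method names rather than actual ENSS code:

    ```python
    # Hypothetical illustration of the hierarchical, self-similar NBF structure
    # described in the abstract; names and behavior are assumptions, not ENSS code.
    from dataclasses import dataclass, field
    from typing import List, Optional


    @dataclass
    class NeuralBasisFunction:
        """One NBF: a heuristic layer (HNS) over an autonomic layer (ANS)."""
        name: str
        ruler: Optional["NeuralBasisFunction"] = None
        subordinates: List["NeuralBasisFunction"] = field(default_factory=list)

        def add_subordinate(self, nbf: "NeuralBasisFunction") -> None:
            nbf.ruler = self
            self.subordinates.append(nbf)

        def act(self, sensory_input: float) -> float:
            # Placeholder for the HNS/ANS mapping of sensory input to action.
            return 0.5 * sensory_input

        def tree(self, depth: int = 0) -> str:
            lines = ["  " * depth + self.name]
            for sub in self.subordinates:
                lines.extend(sub.tree(depth + 1).splitlines())
            return "\n".join(lines)


    if __name__ == "__main__":
        node = NeuralBasisFunction("ruler-node")
        for i in range(2):
            ruler = NeuralBasisFunction(f"ruler-nbf-{i}")
            for j in range(3):
                ruler.add_subordinate(NeuralBasisFunction(f"nbf-{i}-{j}"))
            node.add_subordinate(ruler)
        print(node.tree())
    ```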

  1. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure quality of safety critical software, software should be developed in accordance with software development procedures and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2,3 and 4 nuclear power plants by Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulation requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project will be described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  2. Software Configuration Management Problems and Solutions to Software Variability Management

    DEFF Research Database (Denmark)

    Bendix, Lars Gotfred

    2003-01-01

    These days more and more software is produced as product families. Products that have a lot in common, but all the same vary slightly in one or more aspects. Developing and maintaining these product families is a complex task. Software configuration management (SCM) can, in general, support the development and evolution of one single software product and to some degree also supports the concept of variants. It would be interesting to explore to what degree SCM already has solutions to some of the problems of product families and what are the problems where SCM has to invent new techniques to support software variability management.

  3. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  4. GNSS Software Receiver for UAVs

    DEFF Research Database (Denmark)

    Olesen, Daniel Madelung; Jakobsen, Jakob; von Benzon, Hans-Henrik

    2016-01-01

    This paper describes the current activities of GPS/GNSS Software receiver development at DTU Space. GNSS Software receivers have received a great deal of attention in the last two decades and numerous implementations have already been presented. DTU Space has just recently started development of our own GNSS software receiver targeted for mini UAV applications, and we will in this paper present our current progress and briefly discuss the benefits of software receivers in relation to our research interests.

  5. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
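
    A small illustration of the kind of model described above: a random, connected, directed acyclic graph of processing tasks mapped onto pipeline stages. The graph construction and greedy mapping rule are invented for illustration and are not taken from the paper:

    ```python
    # Illustrative sketch (not the paper's model or algorithms): a random directed
    # acyclic task graph with random node weights, mapped greedily onto pipeline
    # stages with a fixed per-stage processing budget.
    import random

    random.seed(0)
    N_TASKS = 10
    weights = [random.uniform(1.0, 5.0) for _ in range(N_TASKS)]

    # Random DAG: only allow edges from lower to higher index, so it stays acyclic.
    edges = [(i, j) for i in range(N_TASKS) for j in range(i + 1, N_TASKS)
             if random.random() < 0.3]

    # Indices 0..N-1 are already a valid topological order by construction.
    STAGE_BUDGET = 8.0
    stages, current, load = [], [], 0.0
    for task in range(N_TASKS):
        if load + weights[task] > STAGE_BUDGET and current:
            stages.append(current)
            current, load = [], 0.0
        current.append(task)
        load += weights[task]
    if current:
        stages.append(current)

    for k, stage in enumerate(stages):
        print(f"stage {k}: tasks {stage}, load {sum(weights[t] for t in stage):.1f}")
    print(f"{len(edges)} edges in the task graph")
    ```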

  6. Recommendation systems in software engineering

    CERN Document Server

    Robillard, Martin P; Walker, Robert J; Zimmermann, Thomas

    2014-01-01

    With the growth of public and private data stores and the emergence of off-the-shelf data-mining technology, recommendation systems have emerged that specifically address the unique challenges of navigating and interpreting software engineering data. This book collects, structures and formalizes knowledge on recommendation systems in software engineering. It adopts a pragmatic approach with an explicit focus on system design, implementation, and evaluation. The book is divided into three parts: "Part I - Techniques" introduces basics for building recommenders in software engineering, including techniques for collecting and processing software engineering data, but also for presenting recommendations to users as part of their workflow. "Part II - Evaluation" summarizes methods and experimental designs for evaluating recommendations in software engineering. "Part III - Applications" describes needs, issues and solution concepts involved in entire recommendation systems for specific software engineering tasks, fo...

  7. Sustainable embedded software lifecycle planning

    OpenAIRE

    Lee, Dong-Hyun; In, Hoh Peter; Lee, Keun; Park, Sooyong; Hinchey, Mike

    2012-01-01

    Time-to-market is a crucial factor in increasing market share in the consumer electronics (CE) market. Furthermore, fierce competition in the market tends to sharply lower the prices of brand-new CE products as soon as they are released. Software-intensive embedded system design methods such as hardware/software co-design have been studied with the goal of reducing development lead-time by designing hardware and software simultaneously. Many researchers, however, concentra...

  8. UTM TCL2 Software Requirements

    Science.gov (United States)

    Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo

    2017-01-01

    The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher-level UTM TCL 2 System Requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond-visual-line-of-sight flight, tracking of operations, and operations flying over sparsely populated areas.

  9. Jitter Controller Software

    Science.gov (United States)

    Lansdowne, Chatwin; Schlensinger, Adam

    2011-01-01

    Sinusoidal jitter is produced by simply modulating a clock frequency sinusoidally with a given frequency and amplitude. But this can be expressed as phase jitter, frequency jitter, or cycle-to-cycle jitter, rms or peak, absolute units, or normalized to the base clock frequency. Jitter using other waveforms requires calculating and downloading these waveforms to an arbitrary waveform generator, and helping the user manage relationships among phase jitter crest factor, frequency jitter crest factor, and cycle-to-cycle jitter (CCJ) crest factor. Software was developed for managing these relationships, automatically configuring the generator, and saving test results documentation. Tighter management of clock jitter and jitter sensitivity is required by new codes that further extend the already high performance of space communication links, completely correcting symbol error rates higher than 10 percent, and therefore typically requiring demodulation and symbol synchronization hardware to operate at signal-to-noise ratios of less than one. To accomplish this, greater demands are also made on transmitter performance, and measurement techniques are needed to confirm performance. It was discovered early that sinusoidal jitter can be stepped on a grid such that one can connect points by constant phase jitter, constant frequency jitter, or constant cycle-to-cycle jitter. The tool automates adherence to a grid while also allowing adjustments off-grid. Also, the jitter can be set by the user on any dimension and the others are calculated. The calculations are all recorded, allowing the data to be rapidly plotted or re-plotted against different interpretations just by changing pointers to columns. A key advantage is taking data on a carefully controlled grid, which allowed a single data set to be post-analyzed many different ways. Another innovation was building a software tool to provide very tight coupling between the generator and the recorded data product, and the operator
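
    As an illustration of the relationships such a tool has to manage, the sketch below derives peak frequency jitter, period jitter, and cycle-to-cycle jitter from a purely sinusoidal phase-jitter specification. The formulas are the standard ones for sinusoidal phase modulation; they are not taken from the NASA tool itself, and the numbers are invented:

    ```python
    # Back-of-the-envelope relationships for purely sinusoidal jitter; a sketch of
    # the kind of conversions the abstract describes, not the NASA tool itself.
    import math

    f0 = 100e6               # base clock frequency [Hz]
    fm = 1e5                 # jitter (modulation) frequency [Hz]
    phase_jitter_ui = 0.05   # peak phase jitter in unit intervals (cycles of f0)

    # Peak phase jitter expressed in radians and in seconds (time-interval error).
    phase_jitter_rad = 2 * math.pi * phase_jitter_ui
    tie_peak_s = phase_jitter_ui / f0

    # Instantaneous frequency f0 + A*fm*cos(2*pi*fm*t): peak deviation is A*fm,
    # with A the peak phase deviation in radians.
    freq_jitter_peak_hz = phase_jitter_rad * fm

    # Period jitter is the first difference of the TIE sampled once per cycle;
    # cycle-to-cycle jitter is the difference of consecutive periods.
    period_jitter_peak_s = tie_peak_s * 2 * math.sin(math.pi * fm / f0)
    ccj_peak_s = tie_peak_s * (2 * math.sin(math.pi * fm / f0)) ** 2

    print(f"peak TIE             : {tie_peak_s * 1e12:.2f} ps")
    print(f"peak frequency jitter: {freq_jitter_peak_hz / 1e3:.1f} kHz")
    print(f"peak period jitter   : {period_jitter_peak_s * 1e12:.2f} ps")
    print(f"peak cycle-to-cycle  : {ccj_peak_s * 1e15:.1f} fs")
    ```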

  10. Overview of NWIS Software

    Energy Technology Data Exchange (ETDEWEB)

    Mullens, J.A.

    1999-08-30

    The Nuclear Weapons Identification System (NWIS) is a system that performs radiation signature measurements on objects such as nuclear weapons components. NWIS consists of a {sup 252}Cf fission source, radiation detectors and associated analog electronics, data acquisition boards, and a computer running Windows NT and the application software. NWIS uses signal processing techniques to produce a radiation signature from the radiation emitted from the object. This signature can be stored and later compared to another signature to determine whether two objects are similar. A library of such signatures can be used to identify objects in closed containers as well as determine attributes such as fissile mass and in some cases enrichment. NWIS uses a {sup 252}Cf source on one side of the object to produce radiation that its detectors measure on the other side of the target (active mode). If the object naturally emits enough radiation, the {sup 252}Cf source is not required (passive mode). The NWIS data acquisition hardware has five detector channels. Each channel receives shaped detector pulses and times those pulses with 1 nanosecond resolution. In active mode measurements one of these channels receives pulses from a detector measuring the {sup 252}Cf source fissions. Thus, for active mode measurements, NWIS has the time of each {sup 252}Cf fission and the subsequent injection of neutrons and gamma rays into the object. The remaining channels receive pulses from the detectors measuring radiation from the object. These detectors record the amount and time of radiation exiting the object. By correlating the radiation events among the source and the other detectors, and among the detectors themselves, a characteristic response of the object to {sup 252}Cf radiation or its own internal radiation is measured. The data acquisition hardware consists of two custom-made boards. The Data Capture and Compression (DCC) board is built around a Gallium Arsenide (GaAs) chip designed at
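
    A toy illustration of the source/detector time-correlation idea described above, using synthetic event times rather than NWIS data or code:

    ```python
    # Toy illustration of correlating timestamped source and detector events
    # (synthetic data only; not NWIS code).
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "source fission" times over 1 ms, expressed in nanoseconds.
    source_times = np.sort(rng.uniform(0, 1e6, 2000))

    # Detector events: a fraction correlated with a fission (mean 40 ns delay),
    # plus uniform uncorrelated background.
    picked = source_times[rng.random(source_times.size) < 0.3]
    correlated = picked + rng.normal(40.0, 10.0, size=picked.size)
    background = rng.uniform(0, 1e6, 3000)
    detector_times = np.sort(np.concatenate([correlated, background]))

    # Histogram of detector-minus-source time differences within +/-200 ns.
    window = 200.0
    diffs = []
    for t in source_times:
        lo = np.searchsorted(detector_times, t - window)
        hi = np.searchsorted(detector_times, t + window)
        diffs.extend(detector_times[lo:hi] - t)
    counts, edges = np.histogram(diffs, bins=40, range=(-window, window))
    peak_bin = counts.argmax()
    print(f"correlation peak near {0.5 * (edges[peak_bin] + edges[peak_bin + 1]):.0f} ns")
    ```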

  11. Modeling software systems by domains

    Science.gov (United States)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  12. Software quality: Process or people

    Science.gov (United States)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  13. SBA Network Components & Software Inventory

    Data.gov (United States)

    Small Business Administration — SBA’s Network Components & Software Inventory contains a complete inventory of all devices connected to SBA’s network including workstations, servers, routers,...

  14. CEBAF beam viewer imaging software

    International Nuclear Information System (INIS)

    Bowling, B.A.; McDowell, C.

    1993-01-01

    This paper discusses the various software used in the analysis of beam viewer images at CEBAF. This software, developed at CEBAF, includes a three-dimensional viewscreen calibration code which takes into account such factors as multiple camera/viewscreen rotations and perspective imaging, and maintaining a calibration database for each unit. Additional software allows single-button beam spot detection, with determination of beam location, width, and quality, in less than three seconds. Software has also been implemented to assist in the determination of proper chopper RF control parameters from digitized chopper circles, providing excellent results
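
    A minimal sketch of the kind of beam-spot estimation described above: an intensity-weighted centroid and rms widths computed on a synthetic image. This is an illustration only, not the CEBAF software:

    ```python
    # Beam-spot location and width via intensity-weighted moments on a synthetic
    # viewer image (illustration only; not the CEBAF code).
    import numpy as np

    ny, nx = 240, 320
    y, x = np.mgrid[0:ny, 0:nx]

    # Synthetic Gaussian beam spot plus a small noise floor.
    rng = np.random.default_rng(0)
    image = np.exp(-((x - 180) ** 2 / (2 * 12**2) + (y - 100) ** 2 / (2 * 8**2)))
    image += 0.02 * rng.random((ny, nx))

    # Background subtraction, then first and second intensity-weighted moments.
    img = np.clip(image - np.median(image), 0, None)
    total = img.sum()
    cx, cy = (img * x).sum() / total, (img * y).sum() / total
    sx = np.sqrt((img * (x - cx) ** 2).sum() / total)
    sy = np.sqrt((img * (y - cy) ** 2).sum() / total)

    print(f"beam centre ~({cx:.1f}, {cy:.1f}) px, rms widths ~({sx:.1f}, {sy:.1f}) px")
    ```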

  15. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts, the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware. In this sense the system support software forms the kernel of the software at TSTA. The kernel software performs several functions. It gathers data from CAMAC modules and makes that data available for subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and provides for a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI). The MMI allows the operators a window into the physical hardware and subsystem process state. Finally the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. Such kernel software as developed and implemented at TSTA is described
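
    A minimal sketch of the kernel idea described above, with CAMAC access stubbed out and all names invented: subsystem code sees only the kernel's data, command, and global-state interface, never the hardware:

    ```python
    # Sketch of a kernel that isolates subsystem code from hardware; CAMAC access
    # is stubbed with random numbers, and every name here is illustrative only.
    import random
    import time


    class Kernel:
        def __init__(self):
            self.latest = {}            # channel -> last value read
            self.global_state = "IDLE"  # system-wide state shared by subsystems
            self.log = []

        def poll_hardware(self):
            # Stand-in for reading CAMAC modules.
            for channel in ("pressure", "temperature", "flow"):
                self.latest[channel] = random.gauss(100.0, 1.0)

        def send_command(self, channel, value):
            self.log.append((time.time(), channel, value))  # would go to hardware

        def read(self, channel):
            return self.latest.get(channel)


    def subsystem_step(kernel: Kernel):
        # Subsystem logic uses only the kernel interface.
        pressure = kernel.read("pressure")
        if pressure is not None and pressure > 101.0:
            kernel.send_command("vent_valve", "OPEN")
            kernel.global_state = "VENTING"


    kernel = Kernel()
    for _ in range(5):
        kernel.poll_hardware()
        subsystem_step(kernel)
    print(kernel.global_state, len(kernel.log), "commands logged")
    ```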

  16. A Software Configuration Management Course

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    Software Configuration Management has been a big success in research and creation of tools. There are also many vendors in the market of selling courses to companies. However, in the education sector Software Configuration Management has still not quite made it - at least not into the university curriculum. It is either not taught at all or is just a minor part of a general course in software engineering. In this paper, we report on our experience with giving a full course entirely dedicated to Software Configuration Management topics and start a discussion of what ideally should be the goal...

  17. A Software Configuration Management Course

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    Software Configuration Management has been a big success in research and creation of tools. There are also many vendors in the market of selling courses to companies. However, in the education sector Software Configuration Management has still not quite made it - at least not into the university curriculum. It is either not taught at all or is just a minor part of a general course in software engineering. In this paper, we report on our experience with giving a full course entirely dedicated to Software Configuration Management topics and start a discussion of what ideally should be the goal...

  18. Silverlight 4 Business Intelligence Software

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software allows you to view different components of a business using a single visual platform, which makes comprehending mountains of data easier. BI is everywhere. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI. Currently, we are in the second generation of BI software - called BI 2.0 - which is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increasingly visually rich tech

  19. The Software Engineering Laboratory: An operational software experience factory

    Science.gov (United States)

    Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon

    1992-01-01

    For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.

  20. Software energy profiling: comparing releases of a software product

    NARCIS (Netherlands)

    Jagroep, Erik; van der Werf, J.M.E.M.; Brinkkemper, S.; Procaccianti, Guiseppe; Lago, Patricia; Blom, Leen; van Vliet, Rob

    2016-01-01

    In the quest for energy efficiency of Information and Communication Technology, so far research has mostly focused on the role of hardware. However, as hardware technology becomes more sophisticated, the role of software becomes crucial. Recently, the impact of software on energy consumption has

  1. Software Energy Profiling: Comparing Releases of a Software Product

    NARCIS (Netherlands)

    Jagroep, E.A.; van der Werf, J.M.E.M.; Procaccianti, G.; Lago, P.; Brinkkemper, Sjaak; Blom, L.; van Vliet, Rob; Dillon, Laura; Visser, Willem; Williams, Laurie

    2016-01-01

    In the quest for energy efficiency of Information and Communication Technology, so far research has mostly focused on the role of hardware. However, as hardware technology becomes more sophisticated, the role of software becomes crucial. Recently, the impact of software on energy consumption has

  2. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  3. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
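
    A toy sketch of the factor-criteria-metric hierarchy the abstract refers to: metric scores roll up into criteria, and criteria into quality factors. The names, scores, and equal weighting are invented for illustration and are not the cited methodology:

    ```python
    # Toy roll-up of a metrics -> criteria -> factors hierarchy (names, scores and
    # equal weighting are invented for illustration; not the cited methodology).
    metrics = {                      # metric name -> normalized score in [0, 1]
        "node_failover_time": 0.8,
        "redundant_paths": 0.6,
        "module_coupling": 0.7,
        "interface_growth_room": 0.9,
    }

    criteria = {                     # criterion -> contributing metrics
        "fault_tolerance": ["node_failover_time", "redundant_paths"],
        "modularity": ["module_coupling"],
        "spare_capacity": ["interface_growth_room"],
    }

    factors = {                      # quality factor -> contributing criteria
        "survivability": ["fault_tolerance"],
        "expandability": ["modularity", "spare_capacity"],
    }

    criterion_score = {c: sum(metrics[m] for m in ms) / len(ms)
                       for c, ms in criteria.items()}
    factor_score = {f: sum(criterion_score[c] for c in cs) / len(cs)
                    for f, cs in factors.items()}
    print(factor_score)
    ```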

  4. Factors that motivate software developers in Nigerian's software ...

    African Journals Online (AJOL)

    It was also observed that courtesy, good reward systems, regular training, recognition, tolerance of mistakes and good leadership were high motivators of software developers. Keywords: Software developers, information technology, project managers, Nigeria International Journal of Natural and Applied Sciences, 6(4): ...

  5. Parallel time integration software

    Energy Technology Data Exchange (ETDEWEB)

    2014-07-01

    This package implements an optimal-scaling multigrid solver for the (non)linear systems that arise from the discretization of problems with evolutionary behavior. Typically, solution algorithms for evolution equations are based on a time-marching approach, solving sequentially for one time step after the other. Parallelism in these traditional time-integration techniques is limited to spatial parallelism. However, current trends in computer architectures are leading towards systems with more, but not faster, processors. Therefore, faster compute speeds must come from greater parallelism. One approach to achieve parallelism in time is with multigrid, but extending classical multigrid methods for elliptic operators to this setting is a significant achievement. In this software, we implement a non-intrusive, optimal-scaling time-parallel method based on multigrid reduction techniques. The examples in the package demonstrate optimality of our multigrid-reduction-in-time algorithm (MGRIT) for solving a variety of parabolic equations in two and three spatial dimensions. These examples can also be used to show that MGRIT can achieve significant speedup in comparison to sequential time marching on modern architectures.
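
    As a much simpler illustration of the parallel-in-time idea (a parareal-style iteration, not the MGRIT algorithm this package implements), the sketch below combines a cheap serial coarse propagator with fine propagators whose evaluations are independent of each other and could therefore run concurrently:

    ```python
    # Parareal-style illustration of parallel-in-time integration for du/dt = -u.
    # This is a simplified stand-in for the general idea only; the package itself
    # implements the (different) MGRIT multigrid-reduction algorithm.
    import math

    LAM, T, N = -1.0, 2.0, 8            # du/dt = LAM*u on [0, T], N coarse intervals
    dt = T / N

    def coarse(u, dt):                  # one backward-Euler step (cheap, serial)
        return u / (1.0 - LAM * dt)

    def fine(u, dt, substeps=100):      # many small steps (expensive, parallelizable)
        h = dt / substeps
        for _ in range(substeps):
            u = u / (1.0 - LAM * h)
        return u

    u = [1.0] + [0.0] * N
    for n in range(N):                  # initial serial coarse sweep
        u[n + 1] = coarse(u[n], dt)

    for k in range(5):                  # parareal correction iterations
        fine_vals = [fine(u[n], dt) for n in range(N)]   # independent -> parallel
        new = [1.0] + [0.0] * N
        for n in range(N):
            new[n + 1] = coarse(new[n], dt) + fine_vals[n] - coarse(u[n], dt)
        u = new

    print("parareal u(T) =", u[-1], " exact =", math.exp(LAM * T))
    ```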

  6. CMS Simulation Software

    CERN Document Server

    Banerjee, Sunanda

    2012-01-01

    The CMS simulation, based on the Geant4 toolkit, has been operational within the new CMS software framework for more than four years. The description of the detector including the forward regions has been completed and detailed investigation of detector positioning and material budget has been carried out using collision data. Detailed modeling of detector noise has been performed and validated with the collision data. In view of the high luminosity runs of the Large Hadron Collider, simulation of pile-up events has become a key issue. Challenges have arisen from the point of view of providing a realistic luminosity profile and modeling of out-of-time pileup events, as well as computing issues regarding memory footprint and IO access. These will be especially severe in the simulation of collision events for the LHC upgrades; a new pileup simulation architecture has been introduced to cope with these issues. The CMS detector has observed anomalous energy deposit in the calorimeters and there has been a sub...

  7. Open source clustering software.

    Science.gov (United States)

    de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S

    2004-06-12

    We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
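
    A short usage sketch of the Python interface described above. The kcluster call and its (clusterid, error, nfound) return follow the library's documented interface, though exact signatures may vary between versions, so treat this as illustrative:

    ```python
    # Usage sketch of the Pycluster interface described above: k-means on a small
    # data matrix. Signature details may differ between library versions.
    import numpy as np
    import Pycluster

    data = np.array([[1.0, 1.1],
                     [0.9, 1.0],
                     [8.0, 8.2],
                     [8.1, 7.9]])

    # kcluster returns the cluster assignment per row, the within-cluster error,
    # and how often the optimal solution was found across npass random restarts.
    clusterid, error, nfound = Pycluster.kcluster(data, nclusters=2, npass=10)
    print(clusterid, error, nfound)
    ```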

  8. Evaluating software testing strategies

    Science.gov (United States)

    Selby, R. W., Jr.; Basili, V. R.; Page, J.; Mcgarry, F. E.

    1984-01-01

    The strategies of code reading, functional testing, and structural testing are compared in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. The major results are the following: (1) Code readers detected more faults than did those using the other techniques, while functional testers detected more faults than did structural testers; (2) Code readers had a higher fault detection rate than did those using the other methods, while there was no difference between functional testers and structural testers; (3) Subjects testing the abstract data type detected the most faults and had the highest fault detection rate, while individuals testing the database maintainer found the fewest faults and spent the most effort testing; (4) Subjects of intermediate and junior expertise were not different in number or percentage of faults found, fault detection rate, or fault detection effort; (5) Subjects of advanced expertise found a greater number of faults than did the others, found a greater percentage of faults than did just those of junior expertise, and were not different from the others in either fault detection rate or effort; and (6) Code readers and functional testers both detected more omission faults and more control faults than did structural testers, while code readers detected more interface faults than did those using the other methods.

  9. Building Software with Gradle

    CERN Multimedia

    CERN. Geneva; Studer, Etienne

    2014-01-01

    In this presentation, we will give an overview of the key concepts and main features of Gradle, the innovative build system that has become the de-facto standard in the enterprise. We will cover task declaration and task graph execution, incremental builds, multi-project builds, dependency management, applying plugins, extracting reusable build logic, bootstrapping a build, and using the Gradle daemon. By the end of this talk, you will have a good understanding of what makes Gradle so powerful yet easy to use. You will also understand why companies like Pivotal, LinkedIn, Google, and other giants with complex builds count on Gradle. About the speakers Etienne is leading the Tooling Team at Gradleware. He has been working as a developer, architect, project manager, and CTO over the past 15 years. He has spent most of his time building software products from the ground up and successfully shipping them to happy customers. He had ...

  10. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; hide

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
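
    A sketch of the "single button runs the whole chain" pattern described above, written in Python with placeholder command names; the real MAS is a set of PERL scripts driving Cassini-specific navigation and sequencing tools:

    ```python
    # Sketch of sequencing a chain of maneuver-related programs from one entry
    # point; command names are placeholders, not the actual MAS tool names.
    import subprocess
    import sys

    PIPELINE = [
        ["design_maneuver", "--tracking", "latest.trk"],     # hypothetical tools
        ["build_command_sequence", "maneuver.design"],
        ["predict_performance", "maneuver.seq"],
        ["generate_reports", "maneuver.seq"],
    ]

    def run_pipeline(dry_run=True):
        for cmd in PIPELINE:
            print("running:", " ".join(cmd))
            if dry_run:
                continue
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode != 0:        # stop the chain on the first failure
                sys.exit(f"step failed: {cmd[0]}\n{result.stderr}")

    if __name__ == "__main__":
        run_pipeline(dry_run=True)
    ```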

  11. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  12. Explaining Synthesized Software

    Science.gov (United States)

    VanBaalen, Jeffrey; Robinson, Peter; Lowry, Michael; Pressburger, Thomas; Lau, Sonie (Technical Monitor)

    1998-01-01

    Motivated by NASA's need for high-assurance software, NASA Ames' Amphion project has developed a generic program generation system based on deductive synthesis. Amphion has a number of advantages, such as the ability to develop a new synthesis system simply by writing a declarative domain theory. However, as a practical matter, the validation of the domain theory for such a system is problematic because the link between generated programs and the domain theory is complex. As a result, when generated programs do not behave as expected, it is difficult to isolate the cause, whether it be an incorrect problem specification or an error in the domain theory. This paper describes a tool we are developing that provides formal traceability between specifications and generated code for deductive synthesis systems. It is based on extensive instrumentation of the refutation-based theorem prover used to synthesize programs. It takes augmented proof structures and abstracts them to provide explanations of the relation between a specification, a domain theory, and synthesized code. In generating these explanations, the tool exploits the structure of Amphion domain theories, so the end user is not confronted with the intricacies of raw proof traces. This tool is crucial for the validation of domain theories as well as being important in everyday use of the code synthesis system. It plays an important role in validation because when generated programs exhibit incorrect behavior, it provides the links that can be traced to identify errors in specifications or domain theory. It plays an important role in the everyday use of the synthesis system by explaining to users what parts of a specification or of the domain theory contribute to what pieces of a generated program. Comments are inserted into the synthesized code that document these explanations.

  13. Software Development at Belle II

    Science.gov (United States)

    Kuhr, Thomas; Hauth, Thomas

    2015-12-01

    Belle II is a next generation B-factory experiment that will collect 50 times more data than its predecessor Belle. This requires not only a major upgrade of the detector hardware, but also of the simulation, reconstruction, and analysis software. The challenges of the software development at Belle II and the tools and procedures to address them are reviewed in this article.

  14. The language of social software

    NARCIS (Netherlands)

    D.J.N. van Eijck (Jan)

    2010-01-01

    Computer software is written in languages like C, Java or Haskell. In many cases social software is expressed in natural language. The paper explores connections between the areas of natural language analysis and analysis of social protocols, and proposes an extended program for natural

  15. Assessing optimal software architecture maintainability

    NARCIS (Netherlands)

    Bosch, Jan; Bengtsson, P.O.; Smedinga, Rein; Sousa, P; Ebert, J

    2000-01-01

    Over the last decade, several authors have studied the maintainability of software architectures. In particular, the assessment of maintainability has received attention. However, even when one has a quantitative assessment of the maintainability of a software architecture, one still does not have

  16. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  17. Future of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  18. R&D software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Hood, F.C.

    1991-10-01

    Research software quality assurance (QA) requirements must be adequate to strengthen development or modification objectives, but flexible enough not to restrict creativity. Application guidelines are needed for the different kinds of research and development (R&D) software activities to assure project objectives are achieved.

  19. What is Free Software?

    Indian Academy of Sciences (India)

    We would want to use free software for a variety of reasons. First of all, it is generally more reliable than its counterparts. Linus' Law explains why: Given enough .... minix gave way to Linux in the early 1990's, a creation of Linus Torvalds, a Finnish student. Linus Torvalds released his software under the GPL license.

  20. Music Software for Special Needs.

    Science.gov (United States)

    McCord, Kimberly

    2001-01-01

    Discusses the use of computer software for students with special needs in the music classroom. Focuses on software programs that are appropriate for children with special needs such as: "Musicshop," "Band-in-a-Box," "Rock Rap'n Roll," "Music Mania," "Music Ace" and "Music Ace 2," and "Children's Songbook." (CMK)