WorldWideScience

Sample records for implementing wide baseline

  1. Implementing wide baseline matching algorithms on a graphics processing unit.

    Energy Technology Data Exchange (ETDEWEB)

    Rothganger, Fredrick H.; Larson, Kurt W.; Gonzales, Antonio Ignacio; Myers, Daniel S.

    2007-10-01

    Wide baseline matching is the state of the art for object recognition and image registration problems in computer vision. Though effective, these algorithms are computationally expensive, which limits their application to many real-world problems. The performance of wide baseline matching algorithms may be improved by using a graphics processing unit (GPU) as a fast multithreaded co-processor. In this paper, we present an implementation of the difference of Gaussian feature extractor, based on the CUDA system of GPU programming developed by NVIDIA, and implemented on their hardware. For a 2000x2000 pixel image, the GPU-based method executes nearly thirteen times faster than a comparable CPU-based method, with no significant loss of accuracy.
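
    The report above centres on the difference-of-Gaussian (DoG) feature extractor. As an illustration only (the authors' implementation is written in CUDA for NVIDIA GPUs and is not reproduced here), a CPU-side Python/NumPy sketch of the DoG stage and scale-space extremum detection might look like the following; function names and parameter values are illustrative, not taken from the report.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def dog_stack(image, num_scales=5, sigma0=1.6, k=2 ** 0.5):
            """Build a difference-of-Gaussian stack for one octave."""
            blurred = [gaussian_filter(image.astype(np.float32), sigma0 * k ** i)
                       for i in range(num_scales)]
            return np.stack([blurred[i + 1] - blurred[i] for i in range(num_scales - 1)])

        def dog_keypoints(dog, threshold=0.03):
            """Keep pixels that are local extrema over space and scale and exceed a contrast threshold."""
            s, h, w = dog.shape
            keypoints = []
            for i in range(1, s - 1):
                for y in range(1, h - 1):
                    for x in range(1, w - 1):
                        patch = dog[i - 1:i + 2, y - 1:y + 2, x - 1:x + 2]
                        v = dog[i, y, x]
                        if abs(v) > threshold and (v == patch.max() or v == patch.min()):
                            keypoints.append((i, y, x))
            return keypoints

    The per-pixel, data-parallel work in the inner loops is exactly what maps naturally onto a GPU, which is where the reported thirteen-fold speed-up comes from.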

  2. Efficient Wide Baseline Structure from Motion

    Science.gov (United States)

    Michelini, Mario; Mayer, Helmut

    2016-06-01

    This paper presents a Structure from Motion approach for complex unorganized image sets. To achieve high accuracy and robustness, image triplets are employed and (an approximate) camera calibration is assumed to be known. The focus lies on a complete linking of images even in case of large image distortions, e.g., caused by wide baselines, as well as weak baselines. A method for embedding image descriptors into Hamming space is proposed for fast image similarity ranking. The latter is employed to limit the number of pairs to be matched by a wide baseline method. An iterative graph-based approach is proposed formulating image linking as the search for a terminal Steiner minimum tree in a line graph. Finally, additional links are determined and employed to improve the accuracy of the pose estimation. By this means, loops in long image sequences are implicitly closed. The potential of the proposed approach is demonstrated by results for several complex image sets also in comparison with VisualSFM.
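
    One concrete way to realise an embedding of descriptors into Hamming space of the kind mentioned above is random-hyperplane hashing followed by Hamming-distance ranking. The Python sketch below illustrates that general idea only; the paper's actual embedding and ranking scheme may differ, and all names and sizes here are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        proj = rng.standard_normal((64, 128))        # 64 random hyperplanes for 128-D descriptors

        def hamming_embed(descriptors):
            """Binarise descriptors by the sign of random projections, then pack to bytes."""
            bits = descriptors @ proj.T > 0
            return np.packbits(bits, axis=1)          # 8 bytes per descriptor

        def image_distance(codes_a, codes_b):
            """Crude image-level score: mean Hamming distance to the nearest code in the other image."""
            d = np.unpackbits(codes_a[:, None, :] ^ codes_b[None, :, :], axis=2).sum(axis=2)
            return d.min(axis=1).mean()

        # toy usage: rank candidate images against a query by ascending distance
        query = hamming_embed(rng.standard_normal((200, 128)))
        candidates = {i: hamming_embed(rng.standard_normal((200, 128))) for i in range(5)}
        ranking = sorted(candidates, key=lambda i: image_distance(query, candidates[i]))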

  3. [Implementation and results of the EU-wide baseline studies on the prevalence of Salmonella spp. in slaughter and breeding pigs in Austria].

    Science.gov (United States)

    Kostenzer, Klaus; Much, Peter; Kornschober, Christian; Lassnig, Heimo; Köfer, Josef

    2014-01-01

    The Member States of the European Union are following a common strategy on the control of Salmonella and other foodborne pathogens (Anonym, 2003). Within that framework, baseline studies on the most relevant animal populations have been carried out. This paper describes the implementation and the results of the baseline studies on Salmonella spp. in slaughter and breeding pigs in Austria. A total of 647 slaughter pigs were sampled in 28 slaughterhouses between October 2006 and September 2007. Samples were taken from the ileocaecal lymph nodes to detect infection in pigs and from the surface of the carcasses to detect contamination. Out of the 617 datasets included in the final analysis, Salmonella prevalences of 2% in lymph nodes and 1.1% on the carcass surface were observed. S. Derby, S. Enteritidis and S. Typhimurium were the three most frequently identified serovars. In an additional study, a total of 252 holdings with breeding pigs were sampled between January and December in breeding and combined multiplier herds. Prevalences of 5.8, 5.3 and 9.1%, respectively, were obtained, with S. Typhimurium being the most frequently isolated serovar. Overall, compared to neighbouring Member States, a rather low prevalence of Salmonella spp. in pigs was documented for Austria, in particular in slaughter pigs. The serovar distribution seemed to be similar throughout the pig populations, some serovars also being represented in Austrian human isolates. Contamination of feed seems to play a minor role considering the overall low prevalence, but nevertheless has to be taken into account in any future control or monitoring strategy for Salmonella spp. in pigs.

  4. SSA FITARA Common Baseline Implementation Plan

    Data.gov (United States)

    Social Security Administration — This document describes the agency's plan to implement the Federal Information Technology Acquisition Reform Act (FITARA) Common Baseline per OMB memorandum M-15-14.

  5. Camera Trajectory from Wide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    angle θ of its corresponding rays w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the viewpoint can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real-time. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches that is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling as
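
    To make two of the steps in this excerpt concrete, the fragment below shows, in Python/NumPy, the reconstructed ray-angle model θ = ar/(1 + br²) and tentative matching by mutually closest descriptor pairs. It is a minimal sketch of those ideas, not the authors' code; in particular the descriptors here are plain arrays rather than Discrete Cosine Descriptors of LAF-normalised regions.

        import numpy as np

        def ray_angle(r, a, b):
            """Two-parameter omnidirectional model: angle of the optical ray for image radius r."""
            return a * r / (1.0 + b * r * r)

        def mutual_nearest_matches(desc_a, desc_b):
            """Tentative matches as mutually closest descriptor pairs under Euclidean distance."""
            d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
            nn_ab = d.argmin(axis=1)      # best region in B for each region of A
            nn_ba = d.argmin(axis=0)      # best region in A for each region of B
            return [(i, j) for i, j in enumerate(nn_ab) if nn_ba[j] == i]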

  6. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches in computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. Here, a novel scheme is presented called double topological relationship consistency (DCTR). The combination of double topological configuration includes the consistency of first topological relationship (CFTR) and the consistency of second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. By this method, we can obtain correspondences with high precision on wide baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on the image pairs.

  7. A FPGA Implementation of JPEG Baseline Encoder for Wearable Devices.

    Science.gov (United States)

    Li, Yuecheng; Jia, Wenyan; Luan, Bo; Mao, Zhi-Hong; Zhang, Hong; Sun, Mingui

    2015-04-01

    In this paper, an efficient field-programmable gate array (FPGA) implementation of the JPEG baseline image compression encoder is presented for wearable devices in health and wellness applications. In order to gain flexibility in developing FPGA-specific software and to balance real-time performance against resource utilization, a High Level Synthesis (HLS) tool is utilized in our system design. An optimized dataflow configuration with a padding scheme simplifies the timing control for data transfer. Our experiments with a system-on-chip multi-sensor system have verified our FPGA implementation with respect to real-time performance, computational efficiency, and FPGA resource utilization.
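
    For readers unfamiliar with the algorithm being put on the FPGA, the numerical core of a JPEG baseline encoder is an 8x8 block DCT followed by quantisation and zig-zag reordering (entropy coding then follows). The Python sketch below illustrates only that textbook core; it is not the authors' HLS design.

        import numpy as np
        from scipy.fft import dctn

        # Standard JPEG luminance quantisation table (Annex K of the JPEG standard)
        Q_LUMA = np.array([
            [16, 11, 10, 16, 24, 40, 51, 61],
            [12, 12, 14, 19, 26, 58, 60, 55],
            [14, 13, 16, 24, 40, 57, 69, 56],
            [14, 17, 22, 29, 51, 87, 80, 62],
            [18, 22, 37, 56, 68, 109, 103, 77],
            [24, 35, 55, 64, 81, 104, 113, 92],
            [49, 64, 78, 87, 103, 121, 120, 101],
            [72, 92, 95, 98, 112, 100, 103, 99]])

        def zigzag(block8):
            """Zig-zag scan of an 8x8 coefficient block."""
            out = []
            for s in range(15):
                diag = [(y, s - y) for y in range(8) if 0 <= s - y < 8]
                if s % 2 == 0:
                    diag.reverse()
                out.extend(block8[y, x] for y, x in diag)
            return out

        def encode_block(block):
            """Level-shift, 2-D DCT, quantise and reorder one 8x8 pixel block."""
            coeffs = dctn(block.astype(np.float64) - 128.0, norm='ortho')
            return zigzag(np.round(coeffs / Q_LUMA).astype(int))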

  8. AN INTEGRATED RANSAC AND GRAPH BASED MISMATCH ELIMINATION APPROACH FOR WIDE-BASELINE IMAGE MATCHING

    Directory of Open Access Journals (Sweden)

    M. Hasheminasab

    2015-12-01

    In this paper we propose an integrated approach to increase the precision of feature point matching. Many different algorithms have been developed to optimize short-baseline image matching, while wide-baseline image matching remains difficult to handle because of illumination differences and viewpoint changes. Fortunately, recent developments in the automatic extraction of local invariant features make wide-baseline image matching possible. Matching algorithms based on the local feature similarity principle use feature descriptors to establish correspondences between feature point sets. To date, the most remarkable descriptor is the scale-invariant feature transform (SIFT) descriptor, which is invariant to image rotation and scale, and remains robust across a substantial range of affine distortion, presence of noise, and changes in illumination. The epipolar constraint based on the RANSAC (random sample consensus) method is a conventional model for mismatch elimination, particularly in computer vision. Because only the distance from the epipolar line is considered, a few false matches remain in the matching results selected by epipolar geometry and RANSAC. Aguilar et al. proposed the Graph Transformation Matching (GTM) algorithm to remove outliers, which has difficulties when mismatched points are surrounded by the same local neighbor structure. In this study, to overcome the limitations mentioned above, a new three-step matching scheme is presented in which the SIFT algorithm is used to obtain initial corresponding point sets. In the second step, the RANSAC algorithm is applied to reduce the outliers. Finally, to remove the remaining mismatches, GTM based on the adjacent K-NN graph is applied. Four different close range image datasets with changes in viewpoint are utilized to evaluate the performance of the proposed method and the experimental results indicate its robustness and
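
    The first two stages of the three-step scheme (SIFT matching followed by RANSAC-based epipolar filtering) can be sketched with OpenCV as below. This is a generic OpenCV sketch rather than the authors' implementation, and the final GTM stage on the adjacent K-NN graph is omitted.

        import cv2
        import numpy as np

        def sift_ransac_matches(img1, img2):
            """Stage 1: SIFT correspondences; stage 2: keep only epipolar-consistent inliers."""
            sift = cv2.SIFT_create()
            kp1, des1 = sift.detectAndCompute(img1, None)
            kp2, des2 = sift.detectAndCompute(img2, None)
            matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
            pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
            pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
            F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
            if mask is None:                      # estimation failed
                return F, []
            return F, [m for m, keep in zip(matches, mask.ravel()) if keep]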

  9. Implementing a University-Wide Change Initiative

    Science.gov (United States)

    Styron, Ronald A., Jr.; Michaelsen, Larry K.; Styron, Jennifer L.

    2015-01-01

    This paper provides an account of the pilot and first year of a university improvement initiative, developed in response to a reaffirmation mandate from the Southern Association of Colleges and Schools Commission on Colleges. The initiative focused on increasing student retention and enhancing learning through the campus-wide use of team-based…

  10. Homography Propagation and Optimization for Wide-Baseline Street Image Interpolation.

    Science.gov (United States)

    Nie, Yongwei; Zhang, Zhensong; Sun, Hanqiu; Su, Tan; Li, Guiqing

    2017-10-01

    Wide-baseline street image interpolation is useful but very challenging. Existing approaches either rely on heavyweight 3D reconstruction or computationally intensive deep networks. We present a lightweight and efficient method which uses simple homography computing and refining operators to estimate piecewise smooth homographies between input views. To achieve this goal, we show how to combine homography fitting and homography propagation based on reliable and unreliable superpixel discrimination. Such a combination, rather than using homography fitting only, dramatically increases the accuracy and robustness of the estimated homographies. Then, we integrate the concepts of homography and mesh warping, and propose a novel homography-constrained warping formulation which enforces smoothness between neighboring homographies by utilizing the first-order continuity of the warped mesh. This further eliminates small artifacts of overlapping, stretching, etc. The proposed method is lightweight and flexible, and allows wide-baseline interpolation. It improves the state of the art and demonstrates that homography computation suffices for interpolation. Experiments on city and rural datasets validate the efficiency and effectiveness of our method.
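
    A minimal sketch of the fitting-plus-propagation idea, assuming matches have already been assigned to superpixels (for example from a SLIC segmentation) and that a superpixel adjacency map is available, could look like this in Python with OpenCV. The names, thresholds and the simple copy-from-neighbour propagation rule are illustrative simplifications, not the authors' formulation.

        import numpy as np
        import cv2

        def fit_superpixel_homographies(src_pts, dst_pts, labels, min_matches=8):
            """Fit one homography per superpixel from the matches inside it; too few matches
            marks the superpixel as unreliable (None) so it can be filled by propagation."""
            H_map = {}
            for sp in np.unique(labels):
                idx = np.nonzero(labels == sp)[0]
                if len(idx) >= min_matches:
                    H, _ = cv2.findHomography(src_pts[idx], dst_pts[idx], cv2.RANSAC, 3.0)
                    H_map[sp] = H
                else:
                    H_map[sp] = None
            return H_map

        def propagate(H_map, adjacency):
            """Copy a neighbouring reliable homography into each unreliable superpixel."""
            for sp, H in H_map.items():
                if H is None:
                    for nb in adjacency.get(sp, []):
                        if H_map.get(nb) is not None:
                            H_map[sp] = H_map[nb]
                            break
            return H_map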

  11. Implementation of a School-wide Clinical Intervention Documentation System

    OpenAIRE

    Stevenson, T. Lynn; Fox, Brent I.; Andrus, Miranda; Carroll, Dana

    2011-01-01

    Objective. To evaluate the effectiveness and impact of a customized Web-based software program implemented in 2006 for school-wide documentation of clinical interventions by pharmacy practice faculty members, pharmacy residents, and student pharmacists.

  12. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

    Energy Technology Data Exchange (ETDEWEB)

    University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

    2012-06-13

    Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and data filtration methods as well as an immediate potential for automation of methods to choose building occupied modes.
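
    As a concrete example of the kind of baseline model and shed calculation being compared, the Python sketch below implements a simple averaging baseline (mean demand at the event hour over prior eligible days) and the corresponding shed estimate. It illustrates the general idea only; the paper evaluates several baseline models and implementation choices, none of which are reproduced here.

        import numpy as np

        def average_baseline(history_kw, event_hour):
            """Counterfactual demand: mean load at the event hour over prior non-event days.
            history_kw is a (days, 24) array of hourly demand for eligible baseline days."""
            return history_kw[:, event_hour].mean()

        def load_shed(history_kw, observed_kw, event_hour):
            """Demand-response shed estimate: baseline prediction minus observed demand."""
            return average_baseline(history_kw, event_hour) - observed_kw

        # toy usage: ten prior days of hourly data, DR event at hour 15
        rng = np.random.default_rng(1)
        history = 100 + 10 * rng.standard_normal((10, 24))
        print(load_shed(history, observed_kw=85.0, event_hour=15))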

  13. Preliminary design and implementation of the baseline digital baseband architecture for advanced deep space transponders

    Science.gov (United States)

    Nguyen, T. M.; Yeh, H.-G.

    1993-01-01

    The baseline design and implementation of the digital baseband architecture for advanced deep space transponders is investigated and identified. Trade studies on the selection of the number of bits for the analog-to-digital converter (ADC) and optimum sampling schemes are presented. In addition, the proposed optimum sampling scheme is analyzed in detail. Descriptions of possible implementations for the digital baseband (or digital front end) and digital phase-locked loop (DPLL) for carrier tracking are also described.

  14. Hop-Diffusion Monte Carlo for Epipolar Geometry Estimation between Very Wide-Baseline Images.

    Science.gov (United States)

    Brahmachari, Aveek S; Sarkar, Sudeep

    2013-03-01

    We present a Monte Carlo approach for epipolar geometry estimation that efficiently searches for minimal sets of inlier correspondences in the presence of many outliers in the putative correspondence set, a condition that is prevalent when we have wide baselines, significant scale changes, rotations in depth, occlusion, and repeated patterns. The proposed Monte Carlo algorithm uses Balanced LOcal and Global Search (BLOGS) to find the best minimal set of correspondences. The local search is a diffusion process using Joint Feature Distributions that captures the dependencies among the correspondences. The global search is a hopping search process across the minimal set space controlled by photometric properties. Using a novel experimental protocol that involves computing errors for manually marked ground truth points and images with outlier rates as high as 90 percent, we find that BLOGS is better than related approaches such as MAPSAC, NAPSAC, and BEEM. BLOGS results are of similar quality to those of the other approaches, but BLOGS generates them in 10 times fewer iterations. The time per iteration for BLOGS is also the lowest among the ones we studied.

  15. Effects of Coaching on Teachers' Implementation of Tier 1 School-Wide Positive Behavioral Interventions and Support Strategies

    Science.gov (United States)

    Bethune, Keri S.

    2017-01-01

    Fidelity of implementation of School-Wide Positive Behavioral Interventions and Supports (SWPBIS) procedures within schools is critical to the success of the program. Coaching has been suggested as one approach to helping ensure accuracy of implementation of SWPBIS plans. This study used a multiple baseline across participants design to examine…

  16. Broadband Wide Angle Lens Implemented with Dielectric Metamaterials

    Directory of Open Access Journals (Sweden)

    Anthony Starr

    2011-08-01

    The Luneburg lens is a powerful imaging device, exhibiting aberration free focusing for parallel rays incident from any direction. However, its advantages are offset by a focal surface that is spherical and thus difficult to integrate with standard planar detector and emitter arrays. Using the recently developed technique of transformation optics, it is possible to transform the curved focal surface to a flat plane while maintaining the perfect focusing behavior of the Luneburg over a wide field of view. Here we apply these techniques to a lesser-known refractive Luneburg lens and implement the design with a metamaterial composed of a semi-crystalline distribution of holes drilled in a dielectric. In addition, we investigate the aberrations introduced by various approximations made in the implementation of the lens. The resulting design approach has improved mechanical strength with small aberrations and is ideally suited to implementation at infrared and visible wavelengths.

  17. Broadband Wide Angle Lens Implemented with Dielectric Metamaterials

    Science.gov (United States)

    Hunt, John; Kundtz, Nathan; Landy, Nathan; Nguyen, Vinh; Perram, Tim; Starr, Anthony; Smith, David R.

    2011-01-01

    The Luneburg lens is a powerful imaging device, exhibiting aberration free focusing for parallel rays incident from any direction. However, its advantages are offset by a focal surface that is spherical and thus difficult to integrate with standard planar detector and emitter arrays. Using the recently developed technique of transformation optics, it is possible to transform the curved focal surface to a flat plane while maintaining the perfect focusing behavior of the Luneburg over a wide field of view. Here we apply these techniques to a lesser-known refractive Luneburg lens and implement the design with a metamaterial composed of a semi-crystalline distribution of holes drilled in a dielectric. In addition, we investigate the aberrations introduced by various approximations made in the implementation of the lens. The resulting design approach has improved mechanical strength with small aberrations and is ideally suited to implementation at infrared and visible wavelengths. PMID:22164056

  18. Toward Robust Climate Baselining: Objective Assessment of Climate Change Using Widely Distributed Miniaturized Sensors for Accurate World-Wide Geophysical Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Teller, E; Leith, C; Canavan, G; Marion, J; Wood, L

    2001-11-13

    A gap-free, world-wide, ocean-, atmosphere-, and land surface-spanning geophysical data-set of three decades time-duration containing the full set of geophysical parameters characterizing global weather is the scientific prerequisite for defining the climate; the generally-accepted definition in the meteorological community is that climate is the 30-year running-average of weather. Until such a tridecadal climate baseline exists, climate change discussions inevitably will have a semi-speculative, vs. a purely scientific, character, as the baseline against which changes are referenced will be at least somewhat uncertain. The contemporary technology base provides ways-and-means for commencing the development of such a meteorological measurement-intensive climate baseline, moreover with a program budget far less than the approximately $2.5 B/year which the U.S. currently spends on "global change" studies. In particular, the recent advent of satellite-based global telephony enables real-time control of, and data-return from, instrument packages of very modest scale, and Silicon Revolution-based sensor, data-processing and -storage advances permit 'intelligent' data-gathering payloads to be created with 10 gram-scale mass budgets. A geophysical measurement system implemented in such modern technology is a populous constellation of long-lived, highly-miniaturized robotic weather stations deployed throughout the weather-generating portions of the Earth's atmosphere, throughout its oceans and across its land surfaces. Leveraging these technological advances, the fully-developed atmospheric weather station of this system has a projected weight of the order of 1 ounce, and contains a satellite telephone, a GPS receiver, a full set of atmospheric sensing instruments and a control computer - and has an operational life of the order of 1 year and a mass-production cost of the order of $20. Such stations are effectively

  19. Toward Robust Climate Baselining: Objective Assessment of Climate Change Using Widely Distributed Miniaturized Sensors for Accurate World-Wide Geophysical Measurements

    Science.gov (United States)

    Teller, E.; Leith, C.; Canavan, G.; Marion, J.; Wood, L.

    2001-11-13

    A gap-free, world-wide, ocean-, atmosphere-, and land surface-spanning geophysical data-set of three decades time-duration containing the full set of geophysical parameters characterizing global weather is the scientific prerequisite for defining the climate; the generally-accepted definition in the meteorological community is that climate is the 30-year running-average of weather. Until such a tridecadal climate baseline exists, climate change discussions inevitably will have a semi-speculative, vs. a purely scientific, character, as the baseline against which changes are referenced will be at least somewhat uncertain.

  20. AN EFFICIENT IMPLEMENTATION ARCHITECTURE FOR WIDE-BAND DIGITAL DOWNCONVERSION

    Institute of Scientific and Technical Information of China (English)

    Gao Zhicheng; Xiao Xianci

    2001-01-01

    The wide-band digital receiving systems require digital downconversion (DDC) with a high data rate and short tuning time in order to intercept narrow-band signals within a broad tuning bandwidth. But these requirements cannot be met by commercial DDC devices. In this paper an efficient implementation architecture is presented. It combines the flexibility of DFT tuning with the efficiency of the polyphase filter bank decomposition. By first decimating the data prior to filtering and mixing, this architecture gives a better solution to the mismatch between the lower hardware speed and the high data rate. Computer simulations show the feasibility of this processing architecture.
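
    The decimate-then-filter structure described above is the classic polyphase DFT channelizer. The Python sketch below shows that general structure (commutate the input into polyphase branches, filter each branch, then take a DFT across the branches); branch ordering and phase conventions are glossed over, and nothing here is taken from the paper's hardware architecture.

        import numpy as np
        from scipy.signal import firwin

        def polyphase_channelizer(x, num_channels, taps_per_branch=8):
            """Split a wide-band signal into num_channels sub-bands at the decimated rate."""
            h = firwin(num_channels * taps_per_branch, 1.0 / num_channels)   # prototype low-pass
            x = x[:len(x) - len(x) % num_channels]
            xp = x.reshape(-1, num_channels).T      # polyphase branches of the input
            hp = h.reshape(-1, num_channels).T      # polyphase branches of the prototype filter
            branches = np.stack([np.convolve(xp[k], hp[k])[:xp.shape[1]]
                                 for k in range(num_channels)])
            return np.fft.ifft(branches, axis=0)    # one row per down-converted channel

        # toy usage: a complex tone concentrates its energy in a single sub-band
        fs, n = 8000.0, 4096
        t = np.arange(n) / fs
        x = np.exp(2j * np.pi * 3000.0 * t)
        channels = polyphase_channelizer(x, num_channels=8)
        print(np.round(np.mean(np.abs(channels) ** 2, axis=1), 3))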

  1. Broadband Wide Angle Lens Implemented with Dielectric Metamaterials

    OpenAIRE

    Anthony Starr; Nathan Landy; Vinh Nguyen; Tim Perram; Nathan Kundtz; John Hunt; Smith, David R.

    2011-01-01

    The Luneburg lens is a powerful imaging device, exhibiting aberration free focusing for parallel rays incident from any direction. However, its advantages are offset by a focal surface that is spherical and thus difficult to integrate with standard planar detector and emitter arrays. Using the recently developed technique of transformation optics, it is possible to transform the curved focal surface to a flat plane while maintaining the perfect focusing behavior of the Luneburg over a wide fi...

  2. World wide web implementation of the Langley technical report server

    Science.gov (United States)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.

    1994-01-01

    On January 14, 1993, NASA Langley Research Center (LaRC) made approximately 130 formal, 'unclassified, unlimited' technical reports available via the anonymous FTP Langley Technical Report Server (LTRS). LaRC was the first organization to provide a significant number of aerospace technical reports for open electronic dissemination. LTRS has been successful in its first 18 months of operation, with over 11,000 reports distributed and has helped lay the foundation for electronic document distribution for NASA. The availability of World Wide Web (WWW) technology has revolutionized the Internet-based information community. This paper describes the transition of LTRS from a centralized FTP site to a distributed data model using the WWW, and suggests how the general model for LTRS can be applied to other similar systems.

  3. Implementation of revised strategy of filaria control-baseline clinico-parasitological survey.

    Science.gov (United States)

    Patel, S; Rai, R N; Mishra, R N

    2000-06-01

    A clinico-epidemiological study of filariasis was carried out in Varanasi District in October and November, 1997 to generate baseline data for assessing the impact of Mass Drug Administration (MDA) in the district. Disease rate was found to be 6.6% (9.9% in males and 3.0% in females) and microfilaria rate was 5.3% (5.2% in males and 5.5% in females). Mean microfilaria density was found to be 9.86 per 20 Cu.mm blood. Genital manifestations (77.5%) outnumbered all other forms of clinical manifestations. Vector infectivity rate was found to be 0.93%.

  4. Salud Mesoamérica 2015 Initiative: design, implementation, and baseline findings.

    Science.gov (United States)

    Mokdad, Ali H; Colson, Katherine Ellicott; Zúñiga-Brenes, Paola; Ríos-Zertuche, Diego; Palmisano, Erin B; Alfaro-Porras, Eyleen; Anderson, Brent W; Borgo, Marco; Desai, Sima; Gagnier, Marielle C; Gillespie, Catherine W; Giron, Sandra L; Haakenstad, Annie; Romero, Sonia López; Mateus, Julio; McKay, Abigail; Mokdad, Ali A; Murphy, Tasha; Naghavi, Paria; Nelson, Jennifer; Orozco, Miguel; Ranganathan, Dharani; Salvatierra, Benito; Schaefer, Alexandra; Usmanova, Gulnoza; Varela, Alejandro; Wilson, Shelley; Wulf, Sarah; Hernandez, Bernardo; Lozano, Rafael; Iriarte, Emma; Regalia, Ferdinando

    2015-01-01

    Health has improved markedly in Mesoamerica, the region consisting of southern Mexico and Central America, over the past decade. Despite this progress, there remain substantial inequalities in health outcomes, access, and quality of medical care between and within countries. Poor, indigenous, and rural populations have considerably worse health indicators than national or regional averages. In an effort to address these health inequalities, the Salud Mesoamérica 2015 Initiative (SM2015), a results-based financing initiative, was established. For each of the eight participating countries, health targets were set to measure the progress of improvements in maternal and child health produced by the Initiative. To establish a baseline, we conducted censuses of 90,000 households, completed 20,225 household interviews, and surveyed 479 health facilities in the poorest areas of Mesoamerica. Pairing health facility and household surveys allows us to link barriers to care and health outcomes with health system infrastructure components and quality of health services. Indicators varied significantly within and between countries. Anemia was most prevalent in Panama and least prevalent in Honduras. Anemia varied by age, with the highest levels observed among children aged 0 to 11 months in all settings. Belize had the highest proportion of institutional deliveries (99%), while Guatemala had the lowest (24%). The proportion of women with four antenatal care visits with a skilled attendant was highest in El Salvador (90%) and the lowest in Guatemala (20%). Availability of contraceptives also varied. The availability of condoms ranged from 83% in Nicaragua to 97% in Honduras. Oral contraceptive pills and injectable contraceptives were available in just 75% of facilities in Panama. IUDs were observed in only 21.5% of facilities surveyed in El Salvador. These data provide a baseline of much-needed information for evidence-based action on health throughout Mesoamerica. Our baseline

  5. LHCb base-line level-0 trigger 3D-flow implementation

    CERN Document Server

    Crosetto, D

    1999-01-01

    The LHCb Level-0 trigger implementation with the 3D-Flow system offers full programmability, allowing it to adapt to unexpected operating conditions and enabling new, unpredicted physics. The implementation is described in detail and refers to components and technology available today. The 3D-Flow Processor system is a new, technology-independent concept in very fast, real-time system architectures. Based on the replication of a single type of circuit of 100 k gates, which communicates in six directions: bi-directional with North, East, West, and South neighbors, unidirectional from Top to Bottom, the system offers full programmability, modularity, ease of expansion and adaptation to the latest technology. A complete study of its applicability to the LHCb calorimeter triggers is presented. Full description of the input data handling, either in digital or mixed digital-analog form, of the data processing, and the transmission of results to the global level-0 trigger decision unit are provided. Any level-0 trig...

  6. The SFXC software correlator for Very Long Baseline Interferometry: Algorithms and Implementation

    CERN Document Server

    Keimpema, A; Pogrebenko, S V; Campbell, R M; Cimó, G; Duev, D A; Eldering, B; Kruithof, N; van Langevelde, H J; Marchal, D; Calvés, G Molera; Ozdemir, H; Paragi, Z; Pidopryhora, Y; Szomoru, A; Yang, J

    2015-01-01

    In this paper a description is given of the SFXC software correlator, developed and maintained at the Joint Institute for VLBI in Europe (JIVE). The software is designed to run on generic Linux-based computing clusters. The correlation algorithm is explained in detail, as are some of the novel modes that software correlation has enabled, such as wide-field VLBI imaging through the use of multiple phase centres and pulsar gating and binning. This is followed by an overview of the software architecture. Finally, the performance of the correlator as a function of number of CPU cores, telescopes and spectral channels is shown.

  7. Implementing meta-analysis from genome-wide association studies for pork quality traits

    Science.gov (United States)

    Pork quality plays an important role in the meat processing industry, thus different methodologies have been implemented to elucidate the genetic architecture of traits affecting meat quality. One of the most common and widely used approaches is to perform genome-wide association (GWA) studies. Howe...

  8. Teen Pregnancy Prevention: Implementation of a Multicomponent, Community-Wide Approach.

    Science.gov (United States)

    Mueller, Trisha; Tevendale, Heather D; Fuller, Taleria R; House, L Duane; Romero, Lisa M; Brittain, Anna; Varanasi, Bala

    2017-03-01

    This article provides an overview and description of implementation activities of the multicomponent, community-wide initiatives of the Teenage Pregnancy Prevention Program initiated in 2010 by the Office of Adolescent Health and the Centers for Disease Control and Prevention. The community-wide initiatives applied the Interactive Systems Framework for dissemination and implementation through training and technical assistance on the key elements of the initiative: implementation of evidence-based teen pregnancy prevention (TPP) interventions; enhancing quality of and access to youth-friendly reproductive health services; educating stakeholders about TPP; working with youth in communities most at risk of teen pregnancy; and mobilizing the community to garner support. Of nearly 12,000 hours of training and technical assistance provided, the majority was for selecting, implementing, and evaluating an evidence-based TPP program. Real-world implementation of a community-wide approach to TPP takes time and effort. This report describes implementation within each of the components and shares lessons learned during planning and implementation phases of the initiative.

  9. Hardware Implementation of an Automatic Rendering Tone Mapping Algorithm for a Wide Dynamic Range Display

    OpenAIRE

    2013-01-01

    Tone mapping algorithms are used to adapt captured wide dynamic range (WDR) scenes to the limited dynamic range of available display devices. Although there are several tone mapping algorithms available, most of them require manual tuning of their rendering parameters. In addition, the high complexities of some of these algorithms make it difficult to implement efficient real-time hardware systems. In this work, a real-time hardware implementation of an exponent-based tone mapping algorithm i...

  10. Implementing ADM1 for plant-wide benchmark simulations in Matlab/Simulink

    DEFF Research Database (Denmark)

    Rosen, Christian; Vrecko, Darko; Gernaey, Krist

    2006-01-01

    , in particular if the ADM1 is to be included in dynamic simulations of plant-wide or even integrated systems. In this paper, the experiences gained from a Matlab/Simulink implementation of ADM1 into the extended COST/IWA Benchmark Simulation Model (BSM2) are presented. Aspects related to system stiffness, model...

  11. An Analysis of Implementation Strategies in a School-Wide Vocabulary Intervention

    Science.gov (United States)

    Roskos, Katheen A.; Moe, Jennifer Randazzo; Rosemary, Catherine

    2017-01-01

    From an improvement research perspective, this study explores strategies used to implement a school-wide vocabulary intervention into language arts instruction at an urban elementary school. Academic language time, an innovative change in the instructional delivery system, allots time and structure for deliberate teaching of cross-disciplinary…

  12. Implementing Meta-analysis for genome-wide association studies of pork quality traits

    Science.gov (United States)

    Pork quality is a critical concern in the meat industry. Implementation of genome-wide association studies (GWA) allows identification of genomic regions that explain a substantial portion of the variation of relevant traits. It is also important to determine the consistency of results of GWA across...

  13. Akzo Nobel Morris Plant Implements a Site-Wide Energy Efficiency Plan

    Energy Technology Data Exchange (ETDEWEB)

    None

    2003-01-01

    Akzo Nobel's Surface Chemistry plant in Morris, Illinois, implemented an energy efficiency plan, which included a plant-wide energy efficiency assessment. The assessment revealed opportunities to save an estimated $1.2 million per year in operating and energy costs, reduce environmental impacts, and improve production capacity.

  14. Teacher Well-Being and the Implementation of School-Wide Positive Behavior Interventions and Supports

    Science.gov (United States)

    Ross, Scott W.; Romer, Natalie; Horner, Robert H.

    2012-01-01

    Teacher well-being has become a major issue in the United States with increasing diversity and demands across classrooms and schools. With this in mind, the current study analyzed the relationship between outcomes of teacher well-being, including burnout and efficacy, and the implementation of School-Wide Positive Behavioral Interventions and…

  15. Building Nation-Wide Information Infrastructures in Healthcare through Modular Implementation Strategies

    DEFF Research Database (Denmark)

    Aanestad, Margunn; Jensen, Tina Blegind

    2011-01-01

    Initiatives that seek to realize the vision of nation-wide information infrastructures (II) in healthcare have often failed to achieve their goals. In this paper, we focus on approaches used to plan, conduct, and manage the realization of such visions. Our empirical material describes two Danish initiatives, where a national project failed to deliver interoperable Electronic Patient Record (EPR) systems while a small, local solution grew and now offers a nation-wide solution for sharing patient record information. We apply II theory, specifically the five design principles proposed by Hanseth and Lyytinen, to contrast the organization and implementation strategies of the two projects. Our findings highlight how implementation strategies differ with respect to how stakeholders are mobilized. We argue that the realization of nation-wide IIs for healthcare not only requires a gradual transition...

  16. Resource utilization after implementing a hospital-wide standardized feeding tube placement pathway.

    Science.gov (United States)

    Richards, Morgan K; Li, Christopher I; Foti, Jeffrey L; Leu, Michael G; Wahbeh, Ghassan T; Shaw, Dennis; Libby, Arlene K; Melzer, Lilah; Goldin, Adam B

    2016-10-01

    Children requiring gastrostomy/gastrojejunostomy tubes (GT/GJ) are heterogeneous and medically complex patients with high resource utilization. We created and implemented a hospital-wide standardized pathway for feeding device placement. This study compares hospital resource utilization before and after pathway implementation. We performed a retrospective cohort study comparing outcomes through one year of follow-up for consecutive groups of children undergoing GT/GJ placement prepathway (n=298, 1/1/2010-12/31/2011) and postpathway (n=140, 6/1/2013-7/31/2014) implementation. We determined the change in the rate of hospital resource utilization events and time to first event. Prior to implementation, 145 (48.7%) devices were placed surgically, 113 (37.9%) endoscopically and 40 (13.4%) using image guidance. After implementation, 102 (72.9%) were placed surgically, 23 (16.4%) endoscopically and 15 (10.7%) using image guidance. Prior to implementation, 174/298 (58.4%) patients required additional hospital resource utilization compared to 60/143 (42.0%) corresponding to a multivariate adjusted 38% reduced risk of a subsequent feeding tube related event. Care of tube-feeding dependent patients is spread among multiple specialists leading to variability in the preoperative workup, intraoperative technique and postoperative care. Our study shows an association between implementation of a standardized pathway and a decrease in hospital resource utilization. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Implementation of Community-Wide Teen Pregnancy Prevention Initiatives: Focus on Partnerships.

    Science.gov (United States)

    Tevendale, Heather D; Fuller, Taleria R; House, L Duane; Dee, Deborah L; Koumans, Emilia H

    2017-03-01

    Seeking to reduce teen pregnancy and births in communities with rates above the national average, the Centers for Disease Control and Prevention, in partnership with the U.S. Department of Health and Human Services Office of Adolescent Health Teen Pregnancy Prevention Program, developed a joint funding opportunity through which grantees worked to implement and test an approach involving community-wide teen pregnancy prevention initiatives. Once these projects had been in the field for 2.5 years, Centers for Disease Control and Prevention staff developed plans for a supplemental issue of the Journal of Adolescent Health to present findings from and lessons learned during implementation of the community-wide initiatives. When the articles included in the supplemental issue are considered together, common themes emerge, particularly those related to initiating, building, and maintaining strong partnerships. Themes seen across articles include the importance of (1) sharing local data with partners to advance initiative implementation, (2) defining partner roles from the beginning of the initiatives, (3) developing teams that include community partners to provide direction to the initiatives, and (4) addressing challenges to maintaining strong partnerships including partner staff turnover and delays in implementation.

  18. Disparities in intimate partner violence prenatal counseling: setting a baseline for the implementation of the Guidelines for Women's Preventive Services.

    Science.gov (United States)

    Ta Park, Van M; Hayes, Donald K; Humphreys, Janice

    2014-05-01

    Prenatal health care counseling is associated with positive health outcomes for mothers and infants. Moreover, pregnant women are considered a vulnerable population at risk of being victims of intimate partner violence. Pregnancy provides a unique opportunity to identify and refer women experiencing intimate partner violence to community resources; however, in prior research, most women reported that their prenatal care providers did not talk to them about intimate partner violence. Given the importance for providers to offer prenatal health care counseling on intimate partner violence, it is concerning that there is scant knowledge on Asian, Native Hawaiian, and other Pacific Islander mothers' experiences in this area. The study's objectives were (a) to determine the proportion of mothers who received prenatal health care counseling on intimate partner violence; and, (b) to examine racial differences of those who received prenatal health care counseling on intimate partner violence. Hawai'i's Pregnancy Risk Assessment Monitoring System (PRAMS) data from 2004-08 were analyzed for 8,120 mothers with information on receipt of intimate partner violence prenatal health care counseling. Overall, 47.7% of mothers were counseled on intimate partner violence. Compared to Whites, Native Hawaiians, Japanese, Chinese, and Koreans were significantly less likely to report receiving prenatal health care counseling in intimate partner violence, but the opposite association was observed for Samoans. Intimate partner violence continues to be a significant problem for women, thus, this study's findings may be used as important baseline data to measure the progress made given the implementation of the new Guidelines for Women's Preventive Services in intimate partner violence screening and counseling.

  19. A cluster randomized trial of routine HIV-1 viral load monitoring in Zambia: study design, implementation, and baseline cohort characteristics.

    Directory of Open Access Journals (Sweden)

    John R Koethe

    BACKGROUND: The benefit of routine HIV-1 viral load (VL) monitoring of patients on antiretroviral therapy (ART) in resource-constrained settings is uncertain because of the high costs associated with the test and the limited treatment options. We designed a cluster randomized controlled trial to compare the use of routine VL testing at ART initiation and at 3, 6, 12, and 18 months, versus our local standard of care (which uses immunological and clinical criteria to diagnose treatment failure, with discretionary VL testing when the two do not agree). METHODOLOGY: Dedicated study personnel were integrated into public-sector ART clinics. We collected participant information in a dedicated research database. Twelve ART clinics in Lusaka, Zambia constituted the units of randomization. Study clinics were stratified into pairs according to matching criteria (historical mortality rate, size, and duration of operation) to limit the effect of clustering, and independently randomized to the intervention and control arms. The study was powered to detect a 36% reduction in mortality at 18 months. PRINCIPAL FINDINGS: From December 2006 to May 2008, we completed enrollment of 1973 participants. Measured baseline characteristics did not differ significantly between the study arms. Enrollment was staggered by clinic pair and truncated at two matched sites. CONCLUSIONS: A large clinical trial of routine VL monitoring was successfully implemented in a dynamic and rapidly growing national ART program. Close collaboration with local health authorities and adequate reserve staff were critical to success. Randomized controlled trials such as this will likely prove valuable in determining long-term outcomes in resource-constrained settings. TRIAL REGISTRATION: Clinicaltrials.gov NCT00929604.

  20. The relationship between baseline Organizational Readiness to Change Assessment subscale scores and implementation of hepatitis prevention services in substance use disorders treatment clinics: a case study

    Directory of Open Access Journals (Sweden)

    Hagedorn Hildi J

    2010-06-01

    Background: The Organizational Readiness to Change Assessment (ORCA) is a measure of organizational readiness for implementing practice change in healthcare settings that is organized based on the core elements and sub-elements of the Promoting Action on Research Implementation in Health Services (PARIHS) framework. General support for the reliability and factor structure of the ORCA has been reported. However, no published study has examined the utility of the ORCA in a clinical setting. The purpose of the current study was to examine the relationship between baseline ORCA scores and implementation of hepatitis prevention services in substance use disorders (SUD) clinics. Methods: Nine clinic teams from Veterans Health Administration SUD clinics across the United States participated in a six-month training program to promote evidence-based practices for hepatitis prevention. A representative from each team completed the ORCA evidence and context subscales at baseline. Results: Eight of nine clinics reported implementation of at least one new hepatitis prevention practice after completing the six-month training program. Clinic teams were categorized by level of implementation, high (n = 4) versus low (n = 5), based on how many hepatitis prevention practices were integrated into their clinics after completing the training program. High implementation teams had significantly higher scores on the patient experience and leadership culture subscales of the ORCA compared to low implementation teams. While not reaching significance in this small sample, high implementation clinics also had higher scores on the research, clinical experience, staff culture, leadership behavior, and measurement subscales as compared to low implementation clinics. Conclusions: The results of this study suggest that the ORCA was able to measure differences in organizational factors at baseline between clinics that reported high and low implementation of practice

  1. An implementation and test platform for wide area stability assessment methods

    DEFF Research Database (Denmark)

    Wittrock, Martin Lindholm; Jóhannsson, Hjörtur

    2013-01-01

    This paper presents a software platform developed in Matlab with the purpose of supporting research, development, and testing of wide area algorithms for stability assessment and control. The development and testing process of algorithms exploiting real-time wide area data from Phasor Measurement Units (PMU) can be very time consuming, especially if the testing procedure is not carried out in a systematic and automatic manner. The test platform overcomes this problem by automatically importing system model parameters, topology, and simulation output from a time domain simulation of an instability scenario and automatically generating synthetic PMU snapshots of the system conditions. To demonstrate the platform's potential for supporting research and development of wide area algorithms, a method to detect voltage instability is implemented and tested, giving results consistent with results from...
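
    The "synthetic PMU snapshot" step is conceptually simple: at each sampled instant of the time-domain simulation, the bus voltage magnitudes and angles are packaged as complex phasors, as a real PMU would report them. A minimal Python sketch, with hypothetical names and no attempt to reproduce the platform's actual data model, is:

        import numpy as np

        def pmu_snapshot(v_mag, v_ang_deg, timestamp):
            """Assemble a synthetic PMU snapshot: one complex voltage phasor per monitored bus."""
            return {"t": timestamp, "V": v_mag * np.exp(1j * np.deg2rad(v_ang_deg))}

        # toy usage: three buses at t = 1.0 s of a simulated instability scenario
        snap = pmu_snapshot(np.array([1.02, 0.98, 0.95]),
                            np.array([0.0, -12.5, -25.0]), timestamp=1.0)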

  2. Implementing ADM1 for plant-wide benchmark simulations in Matlab/Simulink.

    Science.gov (United States)

    Rosen, C; Vrecko, D; Gernaey, K V; Pons, M N; Jeppsson, U

    2006-01-01

    The IWA Anaerobic Digestion Model No.1 (ADM1) was presented in 2002 and is expected to represent the state-of-the-art model within this field in the future. Due to its complexity the implementation of the model is not a simple task and several computational aspects need to be considered, in particular if the ADM1 is to be included in dynamic simulations of plant-wide or even integrated systems. In this paper, the experiences gained from a Matlab/Simulink implementation of ADM1 into the extended COST/IWA Benchmark Simulation Model (BSM2) are presented. Aspects related to system stiffness, model interfacing with the ASM family, mass balances, acid-base equilibrium and algebraic solvers for pH and other troublesome state variables, numerical solvers and simulation time are discussed. The main conclusion is that if implemented properly, the ADM1 will also produce high-quality results in dynamic plant-wide simulations including noise, discrete sub-systems, etc. without imposing any major restrictions due to extensive computational efforts.
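
    One of the troublesome algebraic variables mentioned above is pH: the ion charge balance is an algebraic constraint that must be solved at every model evaluation. A common way to handle this outside Simulink is a scalar root-finder on the charge balance, as in the Python sketch below, which uses a heavily simplified liquor (one volatile fatty acid plus inorganic carbon and nitrogen) with illustrative concentrations rather than the full ADM1 ion set.

        import numpy as np
        from scipy.optimize import brentq

        KW = 1.0e-14            # water ion product
        KA_AC = 10 ** -4.76     # acetic acid
        KA_CO2 = 10 ** -6.35    # carbonic acid (first dissociation)
        KA_NH4 = 10 ** -9.25    # ammonium

        def charge_balance(h, s_cat, s_an, s_ac, s_ic, s_in):
            """Net charge (kmol/m3) as a function of [H+] for a simplified digester liquor."""
            oh = KW / h
            ac_minus = s_ac * KA_AC / (KA_AC + h)     # acetate
            hco3 = s_ic * KA_CO2 / (KA_CO2 + h)       # bicarbonate
            nh4 = s_in * h / (KA_NH4 + h)             # ammonium
            return s_cat + nh4 + h - oh - ac_minus - hco3 - s_an

        def solve_ph(s_cat=0.01, s_an=0.02, s_ac=0.01, s_ic=0.15, s_in=0.13):
            """Solve the algebraic charge balance for [H+] with a bracketing root-finder."""
            h = brentq(charge_balance, 1e-10, 1e-4, args=(s_cat, s_an, s_ac, s_ic, s_in))
            return -np.log10(h)

        print(round(solve_ph(), 2))   # a plausible near-neutral digester pH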

  3. A new wide range Euclidean distance circuit for neural network hardware implementations.

    Science.gov (United States)

    Gopalan, A; Titus, A H

    2003-01-01

    In this paper, we describe an analog very large-scale integration (VLSI) implementation of a wide range Euclidean distance computation circuit - the key element of many synapse circuits. This circuit is essentially a wide-range absolute value circuit that is designed to be as small as possible (80 × 76 μm) in order to achieve maximum synapse density while maintaining a wide range of operation (0.5 to 4.5 V) and low power consumption (less than 200 μW). The circuit has been fabricated in 1.5-μm technology through MOSIS. We present simulated and experimental results of the circuit, and compare these results. Ultimately, this circuit is intended for use as part of a high-density hardware implementation of a self-organizing map (SOM). We describe how this circuit can be used as part of the SOM and how the SOM is going to be used as part of a larger bio-inspired vision system based on the octopus visual system.

  4. Implementing meta-analysis from genome-wide association studies for pork quality traits.

    Science.gov (United States)

    Bernal Rubio, Y L; Gualdrón Duarte, J L; Bates, R O; Ernst, C W; Nonneman, D; Rohrer, G A; King, D A; Shackelford, S D; Wheeler, T L; Cantet, R J C; Steibel, J P

    2015-12-01

    Pork quality plays an important role in the meat processing industry. Thus, different methodologies have been implemented to elucidate the genetic architecture of traits affecting meat quality. One of the most common and widely used approaches is to perform genome-wide association (GWA) studies. However, a limitation of many GWA in animal breeding is the limited power due to small sample sizes in animal populations. One alternative is to implement a meta-analysis of GWA (MA-GWA) combining results from independent association studies. The objective of this study was to identify significant genomic regions associated with meat quality traits by performing MA-GWA for 8 different traits in 3 independent pig populations. Results from MA-GWA were used to search for genes possibly associated with the set of evaluated traits. Data from 3 pig data sets (U.S. Meat Animal Research Center, commercial, and Michigan State University Pig Resource Population) were used. A MA was implemented by combining z-scores derived for each SNP in every population and then weighting them using the inverse of the estimated variance of SNP effects. A search for annotated genes retrieved genes previously reported as candidates for shear force (calpain-1 catalytic subunit [CAPN1] and calpastatin [CAST]), as well as for ultimate pH, purge loss, and cook loss (protein kinase, AMP-activated, γ3 noncatalytic subunit [PRKAG3]). In addition, novel candidate genes were identified for intramuscular fat and cook loss (acyl-CoA synthetase family member 3, mitochondrial [ACSF3]) and for the objective measure of muscle redness, CIE a* (glycogen synthase 1, muscle [GYS1] and ferritin, light polypeptide [FTL]). Thus, implementation of MA-GWA allowed integration of results for economically relevant traits and identified novel genes to be tested as candidates for meat quality traits in pig populations.
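
    The meta-analysis itself, combining per-population SNP z-scores with inverse-variance weights, is a weighted Stouffer-type combination. A small Python sketch of that combination step is shown below; it illustrates the general statistic only and makes no attempt to reproduce the study's variance estimation for SNP effects.

        import numpy as np
        from scipy.stats import norm

        def meta_z(z_scores, variances):
            """Weighted z-score meta-analysis: combine per-population z-scores for one SNP
            using inverse-variance weights; returns the meta z and its two-sided p-value."""
            z = np.asarray(z_scores, dtype=float)
            w = 1.0 / np.asarray(variances, dtype=float)
            z_meta = (w * z).sum() / np.sqrt((w ** 2).sum())
            return z_meta, 2 * norm.sf(abs(z_meta))

        # toy usage: one SNP tested in three independent pig populations
        print(meta_z([2.1, 1.4, 2.8], [0.05, 0.12, 0.04]))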

  5. Automated localisation of Mars rovers using co-registered HiRISE-CTX-HRSC orthorectified images and wide baseline Navcam orthorectified mosaics

    Science.gov (United States)

    Tao, Yu; Muller, Jan-Peter; Poole, William

    2016-12-01

    We present a wide range of research results in the area of orbit-to-orbit and orbit-to-ground data fusion, achieved within the EU-FP7 PRoVisG project and EU-FP7 PRoViDE project. We focus on examples from three Mars rover missions, i.e. MER-A/B and MSL, to provide examples of a new fully automated offline method for rover localisation. We start by introducing the mis-registration discovered between the current HRSC and HiRISE datasets. Then we introduce the HRSC to CTX and CTX to HiRISE co-registration workflow. Finally, we demonstrate results of wide baseline stereo reconstruction with fixed mast position rover stereo imagery and its application to ground-to-orbit co-registration with HiRISE orthorectified image. We show examples of the quantitative assessment of recomputed rover traverses, and extensional exploitation of the co-registered datasets in visualisation and within an interactive web-GIS.

  6. Implementing an organization-wide quality improvement initiative: insights from project leads, managers, and frontline nurses.

    Science.gov (United States)

    Jeffs, Lianne P; Lo, Joyce; Beswick, Susan; Campbell, Heather

    2013-01-01

    With the movement to advance quality care and improve health care outcomes, organizations have increasingly implemented quality improvement (QI) initiatives to meet these requirements. Key to implementation success is the multilevel involvement of frontline clinicians and leadership. To explore the perceptions and experiences of frontline nurses, project leads, and managers associated with an organization-wide initiative aimed at engaging nurses in quality improvement work. To address the aims of this study, a qualitative research approach was used. Two focus groups were conducted with a total of 13 nurse participants, and individual interviews were done with 10 managers and 6 project leads. Emergent themes from the interview data included the following: improving care in a networked approach; driving QI and having a sense of pride; and overcoming challenges. Specifically, our findings elucidate the value of communities of practice and ongoing mentorship for nurses as key strategies to acquire and apply QI knowledge to a QI project on their respective units. Key challenges emerged including workload and time constraints, as well as resistance to change from staff. Our study findings suggest that leaders need to provide learning opportunities and protected time for frontline nurses to participate in QI projects.

  7. Hardware Implementation of an Automatic Rendering Tone Mapping Algorithm for a Wide Dynamic Range Display

    Directory of Open Access Journals (Sweden)

    Orly Yadid-Pecht

    2013-10-01

    Tone mapping algorithms are used to adapt captured wide dynamic range (WDR) scenes to the limited dynamic range of available display devices. Although there are several tone mapping algorithms available, most of them require manual tuning of their rendering parameters. In addition, the high complexities of some of these algorithms make it difficult to implement efficient real-time hardware systems. In this work, a real-time hardware implementation of an exponent-based tone mapping algorithm is presented. The algorithm performs a mixture of both global and local compression on colored WDR images. An automatic parameter selector has been proposed for the tone mapping algorithm in order to achieve good tone-mapped images without manual reconfiguration of the algorithm for each WDR image. Both algorithms are described in Verilog and synthesized for a field programmable gate array (FPGA). The hardware architecture employs a combination of parallelism and system pipelining, so as to achieve a high performance in power consumption, hardware resource usage and processing speed. Results show that the hardware architecture produces images of good visual quality that can be compared to software-based tone mapping algorithms. High peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) scores were obtained when the results were compared with output images obtained from software simulations using MATLAB.
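
    An exponent-based global operator of the kind referred to above maps scene luminance L to display values via d = 1 - exp(-kL), with k chosen from image statistics so that no manual tuning is needed. The Python sketch below illustrates that operator with one simple automatic rule for k; the paper's actual algorithm, its local component and its automatic parameter selector are not reproduced here.

        import numpy as np

        def exponent_tone_map(luminance, k=None):
            """Exponent-based global tone mapping d = 1 - exp(-k * L); if k is not given,
            choose it so that the mean scene luminance maps to mid-grey (illustrative rule)."""
            L = np.clip(luminance.astype(np.float64), 1e-6, None)
            if k is None:
                k = np.log(2.0) / L.mean()
            return 1.0 - np.exp(-k * L)

        def tone_map_color(rgb):
            """Apply the luminance mapping and scale colour channels proportionally."""
            lum = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
            mapped = exponent_tone_map(lum)
            return rgb * (mapped / np.clip(lum, 1e-6, None))[..., None]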

  8. Framework and implementation of a continuous network-wide health monitoring system for roadways

    Science.gov (United States)

    Wang, Ming; Birken, Ralf; Shahini Shamsabadi, Salar

    2014-03-01

    According to the 2013 ASCE report card, America's infrastructure scores only a D+. There are more than four million miles of roads (grade D) in the U.S. requiring a broad range of maintenance activities. The nation faces a monumental problem of infrastructure management in the scheduling and implementation of maintenance and repair operations, and in the prioritization of expenditures within budgetary constraints. The efficient and effective performance of these operations, however, is crucial to ensuring roadway safety, preventing catastrophic failures, and promoting economic growth. There is a critical need for technology that can cost-effectively monitor the condition of a network-wide road system and provide accurate, up-to-date information for maintenance activity prioritization. The Versatile Onboard Traffic Embedded Roaming Sensors (VOTERS) project provides a framework and the sensing capability to complement periodic localized inspections with continuous network-wide health monitoring. Research focused on the development of a cost-effective, lightweight package of multi-modal sensor systems compatible with this framework. An innovative software infrastructure is created that collects, processes, and evaluates these large time-lapse multi-modal data streams. A GIS-based control center manages multiple inspection vehicles and the data for further analysis, visualization, and decision making. VOTERS' technology can monitor road conditions at both the surface and sub-surface levels while the vehicle is navigating through daily traffic going about its normal business, thereby allowing for network-wide frequent assessment of roadways. This deterioration process monitoring at unprecedented time and spatial scales provides unique experimental data that can be used to improve life-cycle cost analysis models.

  9. School-Wide Positive Behavioral Interventions and Supports: A Snapshot of Implementation in Schools Serving Students with Significant Disabilities

    Science.gov (United States)

    Schelling, Amy L.; Harris, Monica L.

    2016-01-01

    Implementation of school-wide positive behavioral interventions and supports (SWPBIS) in K-12 schools is well documented in the literature. However, far less documentation can be found in the literature related to its implementation with students with significant intellectual and other developmental disabilities being served in either typical or…

  10. Scaling-up from an implementation trial to state-wide coverage: results from the preliminary Melbourne Diabetes Prevention Study

    Directory of Open Access Journals (Sweden)

    Janus Edward D

    2012-08-01

    Full Text Available Abstract Background The successful Greater Green Triangle Diabetes Prevention Program (GGT DPP), a small implementation trial, has been scaled up to the Victorian state-wide ‘Life!’ programme with over 10,000 individuals enrolled. The Melbourne Diabetes Prevention Study (MDPS) is an evaluation of the translation from the GGT DPP to the Life! programme. We report results from the preliminary phase (pMDPS) of this evaluation. Methods The pMDPS is a randomised controlled trial with 92 individuals aged 50 to 75 at high risk of developing type 2 diabetes randomised to Life! or usual care. Intervention consisted of six structured 90-minute group sessions: five fortnightly sessions and the final session at 8 months. Participants underwent anthropometric and laboratory tests at baseline and 12 months, and provided self-reported psychosocial, dietary, and physical activity measures. Intervention group participants additionally underwent these tests at 3 months. Paired t tests were used to analyse within-group changes over time. Chi-square tests were used to analyse differences between groups in goals met at 12 months. Differences between groups for changes over time were tested with generalised estimating equations and analysis of covariance. Results Intervention participants significantly improved at 12 months in mean body mass index (−0.98 kg/m², standard error (SE) = 0.26), weight (−2.65 kg, SE = 0.72), waist circumference (−7.45 cm, SE = 1.15), and systolic blood pressure (−3.18 mmHg, SE = 1.26), increased high-density lipoprotein-cholesterol (0.07 mmol/l, SE = 0.03), reduced energy from total fat (−2.00%, SE = 0.78) and saturated fat (−1.54%, SE = 0.41), and increased fibre intake (1.98 g/1,000 kcal energy, SE = 0.47). In controls, oral glucose at 2 hours deteriorated (0.59 mmol/l, SE = 0.27); only waist circumference reduced significantly (−4.02 cm, SE = 0.95). Intervention participants significantly
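
    The within-group comparisons reported above rest on paired t tests of baseline versus 12-month measurements. As a minimal, generic sketch (the values below are made up and unrelated to the MDPS data), the same kind of analysis can be reproduced with SciPy:

    ```python
    import numpy as np
    from scipy import stats

    # hypothetical weights (kg) for the same participants at baseline and 12 months
    baseline = np.array([92.1, 85.4, 101.3, 78.9, 95.0, 88.2, 90.7, 84.5])
    month12  = np.array([89.0, 84.1,  97.8, 78.2, 92.5, 86.9, 88.1, 83.0])

    t_stat, p_value = stats.ttest_rel(baseline, month12)   # paired (within-group) t test
    diff = month12 - baseline
    print(f"mean change = {diff.mean():.2f} kg, SE = {stats.sem(diff):.2f}, "
          f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```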

  11. Implementation of interval walking training in patients with type 2 diabetes in Denmark: rationale, design, and baseline characteristics

    Directory of Open Access Journals (Sweden)

    Ried-Larsen M

    2016-06-01

    Full Text Available Mathias Ried-Larsen,1–3 Reimar W Thomsen,2,4 Klara Berencsi,4 Cecilie F Brinkløv,1,5 Charlotte Brøns,1,5 Laura S Valentiner,1,6 Kristian Karstoft,1,3 Henning Langberg,1,6 Allan A Vaag,1,2,5 Bente K Pedersen,1,3 Jens S Nielsen7 1Department of Infectious Diseases, Centre for Physical Activity Research, Rigshospitalet, University of Copenhagen, Copenhagen, 2The Danish Diabetes Academy, Odense University Hospital, Odense, 3Department of Infectious Diseases, Centre of Inflammation and Metabolism, Rigshospitalet, University of Copenhagen, Copenhagen, 4Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus Nord, 5Department of Endocrinology (Diabetes and Metabolism), Rigshospitalet, University of Copenhagen, 6CopenRehab, Department of Public Health, Section of Social Medicine, University of Copenhagen, Copenhagen, 7Department of Endocrinology, Odense University Hospital, Odense, Denmark Abstract: Promoting physical activity is a first-line choice of treatment for patients with type 2 diabetes (T2D). However, there is a need for more effective tools and technologies to facilitate structured lifestyle interventions and to ensure a better compliance, sustainability, and health benefits of exercise training in patients with T2D. The InterWalk initiative and its innovative application (app) for smartphones described in this study were developed by the Danish Centre for Strategic Research in T2D aiming at implementing, testing, and validating interval walking in patients with T2D in Denmark. The interval walking training approach consists of repetitive 3-minute cycles of slow and fast walking with simultaneous intensity guiding, based on the exercise capacity of the user. The individual intensity during slow and fast walking is determined by a short initial self-conducted and audio-guided fitness test, which combined with automated audio instructions strives to motivate the individual to adjust the intensity to the predetermined

  12. Principal Leadership and School Culture with a School-Wide Implementation of Professional Crisis Management: A Redemptive v. Punitive Model

    Science.gov (United States)

    Adams, Mark Thomas

    2013-01-01

    This qualitative study investigated the nature of the relationship between principal leadership and school culture within a school-wide implementation of Professional Crisis Management (PCM). PCM is a comprehensive and fully integrated system designed to manage crisis situations effectively, safely, and with dignity. While designed primarily to…

  13. Implementation of a children's hospital-wide central venous catheter insertion and maintenance bundle

    NARCIS (Netherlands)

    K. Helder MScN (Onno); R.F. Kornelisse (René); C. van der Starre (Cynthia); D. Tibboel (Dick); C.W.N. Looman (Caspar); R.M.H. Wijnen (René); M.J. Poley (Marten); E. Ista (Erwin)

    2013-01-01

    textabstractBackground: Central venous catheter-associated bloodstream infections in children are an increasingly recognized serious safety problem worldwide, but are often preventable. Central venous catheter bundles have proved effective to prevent such infections. Successful implementation requir

  14. Migrating to Moodle: A Case Study Regarding a Department-Wide Implementation

    DEFF Research Database (Denmark)

    Konstantinidis, Andreas; Papadopoulos, Pantelis M.; Tsiatsos, Thrasyvoulos

    2008-01-01

    This paper presents the rationale behind the utilization of the Moodle Learning Management System for blended learning in our Informatics Department and examines the steps followed to replace the prior decentralized course organizational structure, which consisted of a multitude of different... systems. Our main goal was to implement a single, easy-to-operate, easy-to-maintain system, able to support students’ and instructors’ needs in all the courses. Furthermore, we present data which describe the pilot study of the Moodle implementation for the first semester and make evident the success...

  15. Implementing a Cost Effectiveness Analyzer for Web-Supported Academic Instruction: A Campus Wide Analysis

    Science.gov (United States)

    Cohen, Anat; Nachmias, Rafi

    2009-01-01

    This paper describes the implementation of a quantitative cost effectiveness analyzer for Web-supported academic instruction that was developed in Tel Aviv University during a long term study. The paper presents the cost effectiveness analysis of Tel Aviv University campus. Cost and benefit of 3,453 courses were analyzed, exemplifying campus-wide…

  16. Business-driven IT-wide agile (Scrum) and Kanban (lean) implementation an action guide for business and IT leaders

    CERN Document Server

    Pham, Andrew Thu

    2012-01-01

    Business-Driven IT-Wide Agile (Scrum) and Kanban (Lean) Implementation: An Action Guide for Business and IT Leaders explains how to increase IT delivery capabilities through the use of Agile and Kanban. Factoring in constant change, communication, a sense of urgency, clear and measurable goals, political realities, and infrastructure needs, it covers all the ingredients required for success. Using real-world examples, this practical guide illustrates how to implement Agile and Kanban in software project management and development across the entire IT department. To make things easier for busy

  17. Wide Area Real Time Kinematic (WARTK): Usage of RTCM format, and real-time implementation

    OpenAIRE

    Valls Moreno, Angel

    2008-01-01

    Nowadays GNSS enhancement techniques have reached a relevant importance and are present in a wide variety of applications. During the last years the Research Group of Astronomy and Geomatics (gAGE) from the Technical University of Catalonia (UPC) has developed and tested new satellite navigation techniques which allow the extension of local services based on real-time carrier phase ambiguity resolution to wide area scale (i.e., baselines between the rover and reference stations gr...

  18. Baseline Assessment of Campus-Wide General Health Status and Mental Health: Opportunity for Tailored Suicide Prevention and Mental Health Awareness Programming

    Science.gov (United States)

    Hawley, Lisa D.; MacDonald, Michael G.; Wallace, Erica H.; Smith, Julia; Wummel, Brian; Wren, Patricia A.

    2016-01-01

    Objective: A campus-wide assessment examined the physical and mental health status of a midsize midwestern public university. Participants: Two thousand and forty-nine students, faculty, and staff on a single college campus were assessed in March-April 2013. Methods: Participants completed an online survey with sections devoted to demographics,…

  19. Development, Implementation, and Sustainability of Comprehensive School-Wide Behavior Management Systems.

    Science.gov (United States)

    Rosenberg, Michael S.; Jackman, Lori A.

    2003-01-01

    This article describes the PAR (Preventing, Acting upon, and Resolving) Comprehensive Behavior Management System, a process-based model in which collaborative teams work together to form consensus on a positive and supportive school-wide approach to behavior management. It highlights the content and processes used to introduce and sustain the…

  20. Baseline rationing

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    The standard problem of adjudicating conflicting claims describes a situation in which a given amount of a divisible good has to be allocated among agents who hold claims against it exceeding the available amount. This paper considers more general rationing problems in which, in addition to claims......, there exist baselines (to be interpreted as objective entitlements, ideal targets, or past consumption) that might play an important role in the allocation process. The model we present is able to accommodate real-life rationing situations, ranging from resource allocation in the public health care sector...... to international protocols for the reduction of greenhouse emissions, or water distribution in drought periods. We define a family of allocation methods for such general rationing problems - called baseline rationing rules - and provide an axiomatic characterization for it. Any baseline rationing rule within...

  1. Implementation of a hypertext-based curriculum for emergency medicine on the World Wide Web.

    Science.gov (United States)

    Savitt, D L; Steele, D W

    1997-12-01

    This project reports the publication of a variety of existing curricular resources for emergency medicine on the global Internet in a format that allows hypertext links between related material, timely updates, and end-user feedback. Curricular elements were converted to Hypertext Markup Language with extensive links between related content. The completed document contains instructions for curriculum development, specific curricula for subspecialty areas within a residency, reading lists for subspecialty curricula, banks of images, and board-type questions with answers. Users are given a mechanism to provide immediate feedback to section editors with suggestions for changes, including new references. Access to all or part of the document can be controlled via passwords, but is potentially available to anyone with an Internet connection and a World Wide Web browser. The document may be viewed on the World Wide Web at: http://www.brown.edu/Administration/emergency_Medicine/curr.html.

  2. User Interface on the World Wide Web: How to Implement a Multi-Level Program Online

    Science.gov (United States)

    Cranford, Jonathan W.

    1995-01-01

    The objective of this Langley Aerospace Research Summer Scholars (LARSS) research project was to write a user interface that utilizes current World Wide Web (WWW) technologies for an existing computer program written in C, entitled LaRCRisk. The project entailed researching data presentation and script execution on the WWW and then writing input/output procedures for the database management portion of LaRCRisk.

  3. HIPAA and information security risk: implementing an enterprise-wide risk management strategy

    Science.gov (United States)

    Alberts, Christopher J.; Dorofee, Audrey

    2001-08-01

    The Health Insurance Portability and Accountability Act (HIPAA) of 1996 effectively establishes a standard of due care for healthcare information security. One of the challenges of implementing policies, procedures, and practices consistent with HIPAA requirements in the Department of Defense Military Health System is the need for a method that can tailor the requirements to a variety of organizational contexts. This paper will describe a self-directed information security risk evaluation that will enable military healthcare providers to assess their risks and to develop mitigation strategies consistent with HIPAA guidelines.

  4. Systems, computer-implemented methods, and tangible computer-readable storage media for wide-field interferometry

    Science.gov (United States)

    Lyon, Richard G. (Inventor); Leisawitz, David T. (Inventor); Rinehart, Stephen A. (Inventor); Memarsadeghi, Nargess (Inventor)

    2012-01-01

    Disclosed herein are systems, computer-implemented methods, and tangible computer-readable storage media for wide field imaging interferometry. The method includes for each point in a two dimensional detector array over a field of view of an image: gathering a first interferogram from a first detector and a second interferogram from a second detector, modulating a path-length for a signal from an image associated with the first interferogram in the first detector, overlaying first data from the modulated first detector and second data from the second detector, and tracking the modulating at every point in a two dimensional detector array comprising the first detector and the second detector over a field of view for the image. The method then generates a wide-field data cube based on the overlaid first data and second data for each point. The method can generate an image from the wide-field data cube.
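
    The patent abstract describes, in essence, a double-Fourier procedure: record a fringe packet per detector pixel while the path length is modulated, then transform along the modulation axis to obtain a spectral data cube over the field of view. The toy sketch below illustrates that idea with synthetic data only; it is not the disclosed system, and all dimensions and wavelengths are invented.

    ```python
    import numpy as np

    # Toy double-Fourier sketch: for each detector pixel, a fringe packet is
    # recorded as the optical path difference (OPD) is modulated; an FFT along
    # the OPD axis then yields a spectrum per pixel, i.e. a wide-field data cube.
    n_y, n_x, n_opd = 8, 8, 256
    opd = np.linspace(-50e-6, 50e-6, n_opd)               # metres of path modulation
    wavelengths = np.array([10e-6, 12e-6])                # hypothetical source lines

    rng = np.random.default_rng(0)
    scene = rng.random((n_y, n_x, wavelengths.size))      # per-pixel line intensities

    # interferogram for each pixel: sum of cosines over the source wavelengths
    fringes = np.sum(
        scene[..., None] * np.cos(2 * np.pi * opd / wavelengths[:, None]),
        axis=2,
    )                                                     # shape (n_y, n_x, n_opd)

    data_cube = np.abs(np.fft.rfft(fringes, axis=2))      # spectral cube per pixel
    print(data_cube.shape)
    ```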

  5. Implementation of case management to reduce cardiovascular disease risk in the Stanford and San Mateo Heart to Heart randomized controlled trial: study protocol and baseline characteristics

    Directory of Open Access Journals (Sweden)

    Stafford Randall S

    2006-09-01

    Full Text Available Abstract Background Case management has emerged as a promising alternative approach to supplement traditional one-on-one sessions between patients and doctors for improving the quality of care in chronic diseases such as coronary heart disease (CHD). However, data are lacking in terms of its efficacy and cost-effectiveness when implemented in ethnic and low-income populations. Methods The Stanford and San Mateo Heart to Heart (HTH) project is a randomized controlled clinical trial designed to rigorously evaluate the efficacy and cost-effectiveness of a multi-risk cardiovascular case management program in low-income, primarily ethnic minority patients served by a local county health care system in California. Randomization occurred at the patient level. The primary outcome measure is the absolute CHD risk over 10 years. Secondary outcome measures include adherence to guidelines on CHD prevention practice. We documented the study design, methodology, and baseline sociodemographic, clinical and lifestyle characteristics of 419 participants. Results We achieved equal distributions of the sociodemographic, biophysical and lifestyle characteristics between the two randomization groups. HTH participants had a mean age of 56 years, 63% were Latinos/Hispanics, 65% female, 61% less educated, and 62% were not employed. Twenty percent of participants reported having a prior cardiovascular event. 10-year CHD risk averaged 18% in men and 13% in women despite a modest low-density lipoprotein cholesterol level and a high on-treatment percentage at baseline. Sixty-three percent of participants were diagnosed with diabetes and an additional 22% had metabolic syndrome. In addition, many participants had depressed high-density lipoprotein (HDL) cholesterol levels and elevated values of total cholesterol-to-HDL ratio, triglycerides, triglyceride-to-HDL ratio, and blood pressure. Furthermore, nearly 70% of participants were obese, 45% had a family history of CHD or

  6. Organized nation-wide implementation of sentinel lymph node biopsy in Denmark

    DEFF Research Database (Denmark)

    Friis, E.; Galatius, H.; Garne, J.P.

    2008-01-01

    Prior to the initiation of a nationwide study of the sentinel node staging technique the Danish Breast Cancer Cooperative Group (DBCG) defined a set of minimum requirements to be met by surgical departments before they could include patients in the study. The requirements specified a minimum...... patient load in the individual surgical unit, a minimum surgical training in the sentinel node biopsy technique and a minimum quality outcome in a validating learning series of SNLB procedures. A working group assisted departments in meeting these terms and later audited and certified departments before...... they could include patients into the study. As a result of this strategy the sentinel lymph node staging was fully implemented in all Danish surgical breast cancer centres within three years and all sentinel node biopsies in the period were recorded in the DBCG data centre. Furthermore, the strategy...

  7. Implementing a Nation-Wide Mental Health Care Reform: An Analysis of Stakeholders' Priorities.

    Science.gov (United States)

    Lorant, Vincent; Grard, Adeline; Nicaise, Pablo

    2016-04-01

    Belgium has recently reformed its mental health care delivery system with the goals to strengthen the community-based supply of care, care integration, and the social rehabilitation of users and to reduce the resort to hospitals. We assessed whether these different reform goals were endorsed by stakeholders. One-hundred and twenty-two stakeholders ranked, online, eighteen goals of the reform according to their priorities. Stakeholders supported the goals of social rehabilitation of users and community care but were reluctant to reduce the resort to hospitals. Stakeholders were averse to changes in treatment processes, particularly in relation to the reduction of the resort to hospitals and mechanisms for more care integration. Goals heterogeneity and discrepancies between stakeholders' perspectives and policy priorities are likely to produce an uneven implementation of the reform process and, hence, reduce its capacity to achieve the social rehabilitation of users.

  8. Nature-based flood risk management -challenges in implementing catchment-wide management concepts

    Science.gov (United States)

    Thaler, Thomas; Fuchs, Sven

    2017-04-01

    Traditionally, flood risk management focused on coping with the flow at a given point by, for example, building dikes or straightening the watercourse. Increasingly, the emphasis has shifted to measures within the flood plain that delay the flow through storage; in this way the flow constraints imposed by the behaviour of the catchment at a given point are relocated upstream by human intervention. The implementation of flood storages and the use of natural retention areas are therefore promoted as mitigation measures to support sustainable flood risk management, easing the constraints on the floodplain by shifting them upstream. However, beyond the simple change of practices, it is often a question of land use change which is at stake in water management. As such, it poses the question of how to govern both water and land to satisfy the different stakeholders. Nature-based strategies often rely on voluntary agreements, which are promoted as an alternative instrument to traditional top-down command and control regulation. Voluntary agreements aim at bringing more efficiency, participation and transparency to solving problems between different social groups. In natural hazard risk management, voluntary agreements are now receiving high interest as a complement to existing policy instruments in order to achieve the objectives of the EU WFD and of the Floods Directive. This paper investigates the use of voluntary agreements as an alternative instrument to traditional top-down command and control regulation in the implementation of flood storages in Austria. The paper provides a framework of analysis to reveal barriers and opportunities associated with such an approach. The paper concludes that institutions and power are the central elements to address in order for voluntary agreements to succeed.

  9. System-wide electrification and appropriate functions of tractor and implement

    Directory of Open Access Journals (Sweden)

    Sebastian Tetzlaff

    2015-10-01

    Full Text Available The advantages of electric drive technology in industrial applications have been known for a long time. In addition to the flexibility and variability for system integration, the very good controllability and the overload capacity should be mentioned. To substantially increase the effectiveness of agricultural machinery and equipment, the different electrical/electronic systems, drives and functions have to be interconnected both within the machine and externally, based on a system-wide approach. In this way single machines, machinery combinations and finally complete harvest chains can be used in a smarter and more efficient way. Using the example of a tractor-swather combination, the suitability of the electric drives themselves and of the hybrid and interface concept is demonstrated. Newly developed functions for overload protection and prediction of the working process are presented and their integration into the machine-overarching energy and operational management is described. The transferability of the results and solutions to related applications is ensured.

  10. Modeling and Implementation of Omnidirectional Soccer Robot with Wide Vision Scope Applied in Robocup-MSL

    Directory of Open Access Journals (Sweden)

    Mohsen Taheri

    2010-04-01

    Full Text Available The purpose of this paper is to design and implement a middle-size soccer robot that conforms to the RoboCup MSL league. First, according to the rules of RoboCup, we design the middle-size soccer robot. The proposed autonomous soccer robot consists of the mechanical platform, motion control module, omni-directional vision module, front vision module, image processing and recognition module, target object positioning and real coordinate reconstruction, robot path planning, competition strategies, and obstacle avoidance. The robot is equipped with a laptop computer system and interface circuits to make decisions. The omnidirectional vision sensor of the vision system handles the image processing and positioning for obstacle avoidance and target tracking. The boundary-following algorithm (BFA) is applied to find the important features of the field. We utilize sensor data fusion for the control system parameters, self-localization and world modeling. A vision-based self-localization system and the conventional odometry system are fused for robust self-localization. The localization algorithm includes filtering, sharing and integration of the data for different types of objects recognized in the environment. In the control strategies, we present three state modes, which include the Attack Strategy, Defense Strategy and Intercept Strategy. The methods have been tested on middle-size robots in many RoboCup competition fields.
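
    The paper's own fusion and localization code is not available here; as a minimal sketch of the general idea of fusing drift-prone odometry with a drift-free but noisier vision-based pose fix, a complementary-filter style blend (all values and the weight alpha are illustrative assumptions) might look like:

    ```python
    import numpy as np

    def fuse_pose(odom_pose, vision_pose, alpha=0.8):
        """Complementary-filter style fusion of an odometry pose estimate with a
        vision-based (absolute) pose estimate.  Poses are (x, y, theta) in metres
        and radians; alpha weights the smooth odometry track, (1 - alpha) the
        drift-free but noisier vision fix."""
        x = alpha * odom_pose[0] + (1 - alpha) * vision_pose[0]
        y = alpha * odom_pose[1] + (1 - alpha) * vision_pose[1]
        # blend angles on the unit circle to avoid wrap-around problems
        theta = np.arctan2(
            alpha * np.sin(odom_pose[2]) + (1 - alpha) * np.sin(vision_pose[2]),
            alpha * np.cos(odom_pose[2]) + (1 - alpha) * np.cos(vision_pose[2]),
        )
        return np.array([x, y, theta])

    print(fuse_pose(np.array([1.02, 2.10, 0.31]), np.array([0.95, 2.00, 0.28])))
    ```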

  11. Institutional wide implementation of key advice for socially inclusive teaching in higher education. A Practice Report

    Directory of Open Access Journals (Sweden)

    Lisa Thomas

    2014-03-01

    Full Text Available Government policy and institutional initiatives have influenced increases in enrolment of non-traditional students to Australian universities. For these students, university culture is often incongruent with their own, making it difficult to understand the tacit requirements for participation and success. Academic teaching staff are important in creating socially inclusive learning experiences, particularly in first year subjects. This paper presents an institution-wide approach to enhancing socially inclusive teaching at one Australian university. Underpinned by a framework of “bridging social-incongruity”, the initiative was guided by six principles of socially inclusive teaching to support practice as proposed in the 2012 “Effective support of students from low socioeconomic backgrounds in higher education” report commissioned by the Australian Office of Learning and Teaching. Feedback from 150 academic teaching staff from various disciplines and campus locations suggests this initiative was effective in increasing understanding of socially inclusive teaching practices, with many participants indicating the teaching enhancements were applicable for their teaching context.

  12. Clickers at UMass: a successful program of campus-wide implementation

    Science.gov (United States)

    Schneider, Stephen

    2006-12-01

    In the early 1990s, the Physics Department of the University of Massachusetts was a testing ground for one of the forerunners of the modern classroom response systems. Today, UMass is one of largest users of the wireless descendants of this system, with “clickers” being used across all disciplines. In Astronomy (and many other departments) we use clickers primarily in our large lecture classrooms. We have found that they can be used to (a) engage students in making predictions about classroom experiments. (b) encourage cooperative work with other students to develop mathematical and reasoning skills. (c) help students explore their own misconceptions. (d) All of the above. [correct answer!] Our early uses of clickers showed that simple testing of student knowledge was often perceived negatively as, in effect, “just taking attendance.” However, when students are challenged with difficult and interesting problems, the classroom response system is a positive addition to classroom teaching. Several successful examples, using demos, experiments, and even horoscopes, are shown, and the process involved in developing a strong campus-wide program at UMass is described.

  13. The Relationship Between Implementation of School-Wide Positive Behavior Intervention and Supports and Performance on State Accountability Measures

    Directory of Open Access Journals (Sweden)

    Adriana M. Marin

    2013-10-01

    Full Text Available This study examined data from 96 schools in a Southeastern U.S. state participating in training and/or coaching on School-Wide Positive Behavioral Interventions and Supports (SWPBIS) provided by the State Personnel Development Grant (SPDG) in their state. Schools studied either received training only (“non-intensive” sites) or training and on-site coaching (“intensive” sites). Fidelity of implementation was self-evaluated by both types of schools using the Benchmarks of Quality (BOQ). Some schools were also externally evaluated using the School-Wide Evaluation Tool (SET), with those scoring 80% or higher determined “model sites.” Using independent-samples t tests, analyses revealed statistically significant differences between intensive and non-intensive schools’ Quality of Distribution Index (QDI) scores and between model sites and non-model sites on QDI scores. Correlations were performed to determine whether the fidelity of implementation of SWPBIS as measured by the BOQ was related to any of the state’s accountability measures: performance classification, QDI, or growth.

  14. Creation and implementation of department-wide structured reports: an analysis of the impact on error rate in radiology reports.

    Science.gov (United States)

    Hawkins, C Matthew; Hall, Seth; Zhang, Bin; Towbin, Alexander J

    2014-10-01

    The purpose of this study was to evaluate and compare textual error rates and subtypes in radiology reports before and after implementation of department-wide structured reports. Randomly selected radiology reports that were generated following the implementation of department-wide structured reports were evaluated for textual errors by two radiologists. For each report, the text was compared to the corresponding audio file. Errors in each report were tabulated and classified. Error rates were compared to results from a prior study performed prior to implementation of structured reports. Calculated error rates included the average number of errors per report, average number of nongrammatical errors per report, the percentage of reports with an error, and the percentage of reports with a nongrammatical error. Identical versions of voice-recognition software were used for both studies. A total of 644 radiology reports were randomly evaluated as part of this study. There was a statistically significant reduction in the percentage of reports with nongrammatical errors (33 to 26%; p = 0.024). The likelihood of at least one missense omission error (omission errors that changed the meaning of a phrase or sentence) occurring in a report was significantly reduced from 3.5 to 1.2% (p = 0.0175). A statistically significant reduction in the likelihood of at least one commission error (retained statements from a standardized report that contradict the dictated findings or impression) occurring in a report was also observed (3.9 to 0.8%; p = 0.0007). Carefully constructed structured reports can help to reduce certain error types in radiology reports.
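
    The before/after comparisons of error proportions reported above are, statistically, two-proportion tests. As a hedged sketch only (the counts below are invented and loosely echo the 33% vs. 26% figure; the paper's exact denominators and test are not reproduced), such a comparison can be run as a chi-square test on a 2×2 table:

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # hypothetical counts: [reports with >=1 non-grammatical error, reports without],
    # before vs. after structured reporting
    before = np.array([213, 432])
    after  = np.array([167, 477])

    table = np.vstack([before, after])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    ```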

  15. Effects of school-wide positive behavioral interventions and supports and fidelity of implementation on problem behavior in high schools.

    Science.gov (United States)

    Flannery, K B; Fenning, P; Kato, M McGrath; McIntosh, K

    2014-06-01

    High school is an important time in the educational career of students. It is also a time when adolescents face many behavioral, academic, and social-emotional challenges. Current statistics about the behavioral, academic, and social-emotional challenges faced by adolescents, and the impact on society through incarceration and dropout, have prompted high schools to direct their attention toward keeping students engaged and reducing high-risk behavioral challenges. The purpose of the study was to examine the effects of School-Wide Positive Behavioral Interventions and Supports (SW-PBIS) on the levels of individual student problem behaviors during a 3-year effectiveness trial without random assignment to condition. Participants were 36,653 students in 12 high schools. Eight schools implemented SW-PBIS, and four schools served as comparison schools. Results of a multilevel latent growth model showed statistically significant decreases in student office discipline referrals in SW-PBIS schools, with increases in comparison schools, when controlling for enrollment and percent of students receiving free or reduced price meals. In addition, as fidelity of implementation increased, office discipline referrals significantly decreased. Results are discussed in terms of effectiveness of a SW-PBIS approach in high schools and considerations to enhance fidelity of implementation.

  16. Attitudes of students and employees towards the implementation of a totally smoke free university campus policy at King Saud University in Saudi Arabia: a cross sectional baseline study on smoking behavior following the implementation of policy.

    Science.gov (United States)

    Almutairi, Khalid M

    2014-10-01

    Tobacco smoking is a major preventable health issue worldwide. The harmful consequences of tobacco smoking and exposure to second-hand tobacco smoke are well documented. The aim of this study is to compare the prevalence of smoking among students, faculty and staff and examine their interest in quitting. The study also examines differences in the perceptions of smoking and non-smoking students, faculty and staff with regard to the implementation of a smoke-free policy. A cross-sectional survey was administered at one of the largest universities in Riyadh, Saudi Arabia during the academic year of 2013. Questionnaire items on attitudes towards smoking and a smoke-free policy used a Likert scale. The chi-squared test was used to compare support for a completely smoke-free campus between smokers and non-smokers. Smoking rates were highest among staff members (36.8%) followed by students (11.2%) and faculty (6.4%). About half of the smokers (53.7%) within the university had attempted to quit smoking. Students (OR 3.10, 95% CI 1.00-9.60) and faculty (OR 4.06, 95% CI 1.16-14.18) were more likely to attempt to quit smoking than staff members. The majority of respondents (89.6%) were supportive of a smoke-free policy and indicated that it should be strictly enforced, especially in public places. Results also showed that smokers were more likely to support a smoke-free policy if there are no fines or penalties. These baseline findings will provide information to administrators in formulating and carrying out a total smoke-free policy. Although the majority of people within King Saud University demonstrate high support for a smoke-free policy, administrators should consider differences between smokers' and non-smokers' attitudes when implementing such a policy.

  17. "Islands of Innovation" and "School-Wide Implementations": Two Patterns of ICT-Based Pedagogical Innovations in Schools

    Directory of Open Access Journals (Sweden)

    Alona Forkosh-Baruch

    2005-01-01

    Full Text Available The study reported here is a secondary analysis of data collected in 10 schools as part of Israel’s participation in two international studies: IEA’s SITES Module 2, focusing on innovative pedagogical practices at the classroom level, and the OECD/CERI case studies of ICT and organizational innovation, focusing on ICT-related innovations at the school system level. We identify and analyze two patterns of ICT-based curricular innovations: “islands of innovation” and “school-wide implementations.” In the analysis of both patterns we focus on (a) the levels and domains of innovation reached in schools; (b) the communication agents and school variables affecting the diffusion of the innovation; and (c) the role of internal and external factors affecting the diffusion of the innovation. In the discussion we elaborate the potential value of sustainable islands of innovation models as agents of innovation, and the similarities and differences between both patterns of ICT implementation in schools.

  18. Implementation of Remaining Useful Lifetime Transformer Models in the Fleet-Wide Prognostic and Health Management Suite

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Vivek [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lybeck, Nancy J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pham, Binh [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rusaw, Richard [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States); Bickford, Randall [Expert Microsystems, Orangevale, CA (United States)

    2015-02-01

    Research and development efforts are required to address aging and reliability concerns of the existing fleet of nuclear power plants. As most plants continue to operate beyond the license life (i.e., towards 60 or 80 years), plant components are more likely to incur age-related degradation mechanisms. To assess and manage the health of aging plant assets across the nuclear industry, the Electric Power Research Institute has developed a web-based Fleet-Wide Prognostic and Health Management (FW-PHM) Suite for diagnosis and prognosis. FW-PHM is a set of web-based diagnostic and prognostic tools and databases, comprised of the Diagnostic Advisor, the Asset Fault Signature Database, the Remaining Useful Life Advisor, and the Remaining Useful Life Database, that serves as an integrated health monitoring architecture. The main focus of this paper is the implementation of prognostic models for generator step-up transformers in the FW-PHM Suite. One prognostic model discussed is based on the functional relationship between the degree of polymerization (the most commonly used metric for assessing the health of the winding insulation in a transformer) and the furfural concentration in the insulating oil. The other model is based on thermal-induced degradation of the transformer insulation. By utilizing transformer loading information, established thermal models are used to estimate the hot spot temperature inside the transformer winding. Both models are implemented in the Remaining Useful Life Database of the FW-PHM Suite. The Remaining Useful Life Advisor utilizes the implemented prognostic models to estimate the remaining useful life of the paper winding insulation in the transformer based on actual oil testing and operational data.
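
    The paper's transformer models themselves (the degree-of-polymerization/furfural correlation and the thermal model) are not reproduced here. As a loosely related, illustrative sketch of the thermal-aging idea only, the snippet below computes an Arrhenius-style aging acceleration factor in the spirit of the IEEE C57.91 loading guide and turns it into a crude insulation-life figure; the constants are illustrative assumptions, not values from the FW-PHM Suite.

    ```python
    import numpy as np

    def aging_acceleration_factor(hot_spot_c, reference_c=110.0, activation=15000.0):
        """Arrhenius-style aging acceleration factor for paper insulation,
        relative to a reference hot-spot temperature (constants illustrative)."""
        return np.exp(activation / (reference_c + 273.0)
                      - activation / (hot_spot_c + 273.0))

    def expected_life_hours(hot_spot_c, nominal_life_hours=180000.0):
        """Very rough expected insulation life if operated continuously at this
        hot-spot temperature: nominal life divided by the aging acceleration."""
        return nominal_life_hours / aging_acceleration_factor(hot_spot_c)

    for t in (98.0, 110.0, 120.0):
        print(f"hot spot {t:.0f} C -> acceleration {aging_acceleration_factor(t):.2f}, "
              f"expected life ~{expected_life_hours(t):,.0f} h")
    ```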

  19. SCPS: a fast implementation of a spectral method for detecting protein families on a genome-wide scale

    Directory of Open Access Journals (Sweden)

    Paccanaro Alberto

    2010-03-01

    Full Text Available Abstract Background An important problem in genomics is the automatic inference of groups of homologous proteins from pairwise sequence similarities. Several approaches have been proposed for this task which are "local" in the sense that they assign a protein to a cluster based only on the distances between that protein and the other proteins in the set. It was shown recently that global methods such as spectral clustering have better performance on a wide variety of datasets. However, currently available implementations of spectral clustering methods mostly consist of a few loosely coupled Matlab scripts that assume a fair amount of familiarity with Matlab programming and hence they are inaccessible for large parts of the research community. Results SCPS (Spectral Clustering of Protein Sequences) is an efficient and user-friendly implementation of a spectral method for inferring protein families. The method uses only pairwise sequence similarities, and is therefore practical when only sequence information is available. SCPS was tested on difficult sets of proteins whose relationships were extracted from the SCOP database, and its results were extensively compared with those obtained using other popular protein clustering algorithms such as TribeMCL, hierarchical clustering and connected component analysis. We show that SCPS is able to identify many of the family/superfamily relationships correctly and that the quality of the obtained clusters as indicated by their F-scores is consistently better than all the other methods we compared it with. We also demonstrate the scalability of SCPS by clustering the entire SCOP database (14,183 sequences) and the complete genome of the yeast Saccharomyces cerevisiae (6,690 sequences). Conclusions Besides the spectral method, SCPS also implements connected component analysis and hierarchical clustering, it integrates TribeMCL, it provides different cluster quality tools, it can extract human-readable protein
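
    SCPS itself is a compiled tool and its code is not reproduced here; the snippet below is only a minimal illustration of the underlying idea, spectral clustering of a precomputed pairwise-similarity matrix (normalized-Laplacian eigenvectors followed by k-means), applied to a toy matrix with two obvious "families".

    ```python
    import numpy as np
    from scipy.cluster.vq import kmeans2

    def spectral_clusters(similarity, n_clusters):
        """Minimal spectral clustering of a symmetric pairwise-similarity matrix
        (e.g. derived from sequence-comparison scores): normalized-Laplacian
        eigenvectors followed by k-means."""
        d = similarity.sum(axis=1)
        d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
        lap = np.eye(len(similarity)) - (d_inv_sqrt[:, None] * similarity * d_inv_sqrt[None, :])
        eigvals, eigvecs = np.linalg.eigh(lap)
        embedding = eigvecs[:, :n_clusters]                  # smallest eigenvectors
        embedding /= np.linalg.norm(embedding, axis=1, keepdims=True) + 1e-12
        _, labels = kmeans2(embedding, n_clusters, minit='++')
        return labels

    # toy similarity matrix with two clearly separated groups of three proteins
    block = lambda v: np.full((3, 3), v)
    S = np.block([[block(0.9), block(0.05)], [block(0.05), block(0.9)]])
    np.fill_diagonal(S, 1.0)
    print(spectral_clusters(S, 2))
    ```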

  20. Implementation of high precision optical and radiometric LRO tracking data in the orbit determination to supplement the baseline S-band tracking

    Science.gov (United States)

    Mao, D.; Torrence, M. H.; Mazarico, E.; Neumann, G. A.; Smith, D. E.; Zuber, M. T.

    2016-12-01

    LRO has been in a polar lunar orbit for 7 years since it was launched in June 2009. Seven instruments are onboard LRO to perform a global and detailed geophysical, geological and geochemical mapping of the Moon, some of which have very high spatial resolution. To take full advantage of the high resolution LRO datasets from these instruments, the spacecraft orbit must be reconstructed precisely. The baseline LRO tracking was NASA's White Sands station in New Mexico and a commercial network, the Universal Space Network (USN), providing up to 20 hours per day of an almost continuous S-band radio frequency link to LRO. The USN stations produce S-band range data with a 0.4 m precision and Doppler data with a 0.8 mm/s precision. Using the S-band tracking data together with the high-resolution gravity field model from the GRAIL mission, definitive LRO orbit solutions are obtained with an accuracy of 10 m in total position and 0.5 m radially. Confirmed by the 0.50-m high-resolution NAC images from the LROC team, these orbits represent the LRO orbit "truth" well. In addition to the S-band data, one-way Laser Ranging (LR) to LRO provides a unique LRO optical tracking dataset over 5 years, from June 2009 to September 2014. Ten international satellite laser ranging stations contributed over 4000 hours of LR data with 0.05-0.10 m normal point precision. Another set of high precision LRO tracking data is provided by the Deep Space Network (DSN), which produces radiometric tracking data more precise than the USN S-band data. In the last two years of the LRO mission, the temporal coverage of the USN data has decreased significantly. We show that LR and DSN data can be a good supplement to the baseline tracking data for the orbit reconstruction.

  1. Implementation and Evaluation of a Comprehensive, School-Wide Bullying Prevention Program in an Urban/Suburban Middle School

    Science.gov (United States)

    Bowllan, Nancy M.

    2011-01-01

    Background: This intervention study examined the prevalence of bullying in an urban/suburban middle school and the impact of the Olweus Bullying Prevention Program (OBPP). Methods: A quasi-experimental design consisting of a time-lagged contrast between age-equivalent groups was utilized. Baseline data collected for 158 students prior to…

  2. Factors that influenced county system leaders to implement an evidence-based program: a baseline survey within a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Brown C Hendricks

    2010-10-01

    Full Text Available Abstract Background Despite the burgeoning number of well-validated interventions that have been shown in randomized trials to produce superior outcomes compared to usual services, it is estimated that only 10% of public systems deliver evidence-based mental health services. In California, for example, more than 15,000 children are placed in group homes or residential centers with some evidence of iatrogenic effects. The present study evaluates the willingness among county leaders of child public service systems to adopt a new evidence-based model, Multidimensional Treatment Foster Care (MTFC), as a way to decrease the prevalence of out-of-home placements. Specifically, the study examines how county-level socio-demographic factors and child public service system leaders' perceptions of their county's organizational climate influence their decision of whether or not to consider adopting MTFC. Methods Two levels were examined in this study: Stable and historical factors from 40 California counties gathered from public records including population size, number of entries into out-of-home care, financing of mental health services, and percent minority population; and system leaders' perceptions of their county's organizational climate and readiness for change measured via a web-based survey. The number of days-to-consent was the primary outcome variable defined as the duration of time between being notified of the opportunity to implement MTFC and the actual signing of a consent form indicating interest in considering implementation. Survival analysis methods were used to assess the predictors of this time-to-event measure. The present study is part of a larger randomized trial comparing two methods of implementation where counties are randomized to one of three time cohorts and two implementation conditions. Results The number of entries into care was the primary predictor of days-to-consent. This variable was significantly correlated to county
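
    The abstract's primary analysis is a survival (time-to-event) model of days-to-consent, with censoring for counties that never consented. As a generic sketch only (the data are invented and this is not the study's analysis), a minimal Kaplan-Meier estimator for such data looks like:

    ```python
    import numpy as np

    def kaplan_meier(durations, observed):
        """Tiny Kaplan-Meier estimator: durations are days-to-consent, observed
        is 1 if the county consented at that time and 0 if it was censored."""
        durations = np.asarray(durations, dtype=float)
        observed = np.asarray(observed, dtype=int)
        times = np.unique(durations[observed == 1])
        surv, s = [], 1.0
        for t in times:
            at_risk = np.sum(durations >= t)
            events = np.sum((durations == t) & (observed == 1))
            s *= 1.0 - events / at_risk
            surv.append((t, s))
        return surv

    # hypothetical durations in days for 10 counties; consented flag:
    # 1 = consented on that day, 0 = still undecided at last follow-up (censored)
    days      = [12, 30, 30, 45, 60, 75, 90, 120, 120, 180]
    consented = [ 1,  1,  0,  1,  1,  0,  1,   1,   0,   0]
    for t, s in kaplan_meier(days, consented):
        print(f"day {t:>5.0f}: P(not yet consented) = {s:.2f}")
    ```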

  3. The Implementation of Life Space Crisis Intervention as a School-Wide Strategy for Reducing Violence and Supporting Students' Continuation in Public Schools

    Science.gov (United States)

    Ramin, John E.

    2011-01-01

    The purpose of this study was to explore the effectiveness of implementing Life Space Crisis Intervention as a school-wide strategy for reducing school violence. Life Space Crisis Intervention (LSCI) is a strength-based verbal interaction strategy (Long, Fecser, Wood, 2001). LSCI utilizes naturally occurring crisis situations as teachable…

  4. Implementation of a nation-wide automated auditory brainstem response hearing screening programme in neonatal intensive care units

    NARCIS (Netherlands)

    Straaten, H.L.M. van; Hille, E.T.M.; Kok, J.H.; Verkerk, P.H.; Baerts, W.; Bunkers, C.M.; Smink, E.W.A.; Elburg, R.M. van; Kleine, M.J.K. de; Ilsen, A.; Maingay-Visser, A.P.G.F.; Vries, L.S. de; Weisglas-Kuperus, N.

    2003-01-01

    Aim: As part of a future national neonatal hearing screening programme in the Netherlands, automated auditory brainstem response (AABR) hearing screening was implemented in seven neonatal intensive care units (NICUs). The objective was to evaluate key outcomes of this programme: participation rate,

  5. The "Teaching Pyramid": A Model for the Implementation of Classroom Practices within a Program-Wide Approach to Behavior Support

    Science.gov (United States)

    Hemmeter, Mary Louise; Fox, Lise

    2009-01-01

    The "Teaching Pyramid" (Fox, Dunlap, Hemmeter, Joseph, & Strain, 2003) is a framework for organizing evidence-based practices for promoting social-emotional development and preventing and addressing challenging behavior in preschool programs. In this article, we briefly describe the "Teaching Pyramid" as a framework for implementing effective…

  6. The Seniors Health Research Transfer Network Knowledge Network Model: system-wide implementation for health and healthcare of seniors.

    Science.gov (United States)

    Chambers, Larry W; Luesby, Deirdre; Brookman, Catherine; Harris, Megan; Lusk, Elizabeth

    2010-01-01

    The Ontario Seniors Health Research Transfer Network (SHRTN) aims to improve the health of older adults through increasing the knowledge capacity of 850 community care agencies and 620 long-term care homes. The SHRTN includes caregivers, researchers, policy makers, administrators, educators, and organizations. The SHRTN comprises communities of practice, a library service, a network of 7 research institutes, and local implementation teams. The SHRTN combines face-to-face meetings with information technology to promote change at the client care level in organizational and provincial policies and in the promotion of health services research.

  7. Development and implementation of a NATO-wide state-of-the-art interim geospatial intelligence support tool

    Science.gov (United States)

    Teufert, John F.

    2004-09-01

    In order to enhance operational planning capabilities of the NATO Force Headquarters (KFOR, SFOR, ISAF), the NC3A Geo Team has developed a web-based interim geospatial intelligence support tool (IGEOSIT). The NC3A IGEOSIT displays geospatial data, such as digital topographic maps and satellite/air photo imagery, together with selectable overlay objects retrieved from distributed operational databases (DBs), for example minefields, bridges, culverts and military units. The NC3A IGEOSIT is a state-of-the-art web-based and Java-based multi-tier solution consisting of applications distributed over multiple servers within each Force HQ. The IGEOSIT provides advanced GIS terrain analysis capabilities based on the available Geo-data, including line-of-sight, 3-D perspective views, terrain profiles, and the definition of go/no-go areas. The system also performs vector-based route analysis and enhances the real-time tracking capabilities of mobile vehicles and troops. The IGEOSIT analyzes overlay data sets according to their attributes and dependencies in order to highlight otherwise hidden spatial relations that may be critical for mission planning. After performing geospatial analysis, the system compiles maps automatically to provide the user with immediate hard copy results, according to NATO standards, if necessary. The successful implementation of the IGEOSIT currently provides all NATO FORCE HQ staff members with a common operational picture of the theatre. This ensures that a common set of recently-updated information overlays forms the basis for all operational decisions. This paper describes the architecture, technology, performance tests (including test environment, analysis and measurement tools, hardware, selected test scenarios and results) and the lessons learned implementing advanced network and Java-based multi-tier solutions within the NATO Force Headquarters.
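
    Among the terrain-analysis functions listed above, line-of-sight is the easiest to illustrate. The sketch below is not the IGEOSIT implementation; it is a toy grid-based visibility check over a small digital elevation model, with all values invented.

    ```python
    import numpy as np

    def line_of_sight(dem, start, end, observer_height=2.0):
        """Very small grid-based line-of-sight check over a digital elevation
        model (2D array of terrain heights), sampling along the ray between two
        cells; an illustration of one GIS terrain-analysis function a tool like
        the IGEOSIT exposes (not its actual implementation)."""
        (r0, c0), (r1, c1) = start, end
        n = int(max(abs(r1 - r0), abs(c1 - c0))) + 1
        rows = np.linspace(r0, r1, n)
        cols = np.linspace(c0, c1, n)
        heights = dem[rows.round().astype(int), cols.round().astype(int)]
        eye = heights[0] + observer_height
        # sight-line elevation interpolated linearly between observer and target
        sight = np.linspace(eye, heights[-1], n)
        return bool(np.all(heights[1:-1] <= sight[1:-1]))

    dem = np.array([[10, 10, 10, 10],
                    [10, 25, 12, 12],
                    [10, 12, 12, 12],
                    [10, 10, 10, 10]], dtype=float)
    print(line_of_sight(dem, (0, 0), (3, 3)))   # False: the 25 m ridge on the diagonal blocks the view
    ```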

  8. Process, policy, and implementation of pool-wide drawdowns on the Upper Mississippi River: a promising approach for ecological restoration of large impounded rivers

    Science.gov (United States)

    Kenow, Kevin P.; Gretchen Benjamin,; Tim Schlagenhaft,; Ruth Nissen,; Mary Stefanski,; Gary Wege,; Scott A. Jutila,; Newton, Teresa J.

    2016-01-01

    The Upper Mississippi River (UMR) has been developed and subsequently managed for commercial navigation by the U.S. Army Corps of Engineers (USACE). The navigation pools created by a series of lock and dams initially provided a complex of aquatic habitats that supported a variety of fish and wildlife. However, biological productivity declined as the pools aged. The River Resources Forum, an advisory body to the St. Paul District of the USACE, established a multiagency Water Level Management Task Force (WLMTF) to evaluate the potential of water level management to improve ecological function and restore the distribution and abundance of fish and wildlife habitat. The WLMTF identified several water level management options and concluded that summer growing season drawdowns at the pool scale offered the greatest potential to provide habitat benefits over a large area. Here we summarize the process followed to plan and implement pool-wide drawdowns on the UMR, including involvement of stakeholders in decision making, addressing requirements to modify reservoir operating plans, development and evaluation of drawdown alternatives, pool selection, establishment of a monitoring plan, interagency coordination, and a public information campaign. Three pool-wide drawdowns were implemented within the St. Paul District and deemed successful in providing ecological benefits without adversely affecting commercial navigation and recreational use of the pools. Insights are provided based on more than 17 years of experience in planning and implementing drawdowns on the UMR. 

  9. Implementation and Investigation of a Compact Circular Wide Slot UWB Antenna with Dual Notched Band Characteristics using Stepped Impedance Resonators

    Directory of Open Access Journals (Sweden)

    Yingsong Li

    2012-04-01

    Full Text Available A coplanar waveguide (CPW) fed ultra-wideband (UWB) antenna with dual notched band characteristics is presented in this paper. The circular wide slot and circular radiation patch are utilized to broaden the impedance bandwidth of the UWB antenna. The dual notched band functions are achieved by employing two stepped impedance resonators (SIRs), which are etched on the circular radiation patch and the CPW excitation line, respectively. The two notched bands can be controlled by adjusting the dimensions of the two stepped impedance resonators, which gives tunable notched band functions. The proposed dual notched band UWB antenna has been designed in detail and optimized by means of HFSS. Experimental and numerical results show that the proposed antenna, with a compact size of 32 × 24 mm², has an impedance bandwidth ranging from 2.8 GHz to 13.5 GHz for a voltage standing-wave ratio (VSWR) less than 2, except for the notch bands 5.0 GHz - 6.2 GHz for HIPERLAN/2 and IEEE 802.11a (5.1 GHz - 5.9 GHz) and 8.0 GHz - 9.3 GHz for satellite and military applications.

  10. StatoilHydro chooses Platform LSF and EnginFrame to implement an enterprise-wide computing grid

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This article described a single enterprise-wide computing grid developed by Canadian company Platform Computing Inc. for StatoilHydro. The system used LSF workload management software with a multi-cluster capability and a web portal for job submissions. The system gave Statoil engineers access to all computing resources in the division and allowed computing resources to be managed centrally. The system was designed to accommodate Statoil's extensive 3-D modelling programs. The company initially used separate platforms for all 4 of its Norway locations. However, the disparity in the relative number of units in each location caused problems in terms of differences in engineering process consistency and reservoir simulation accuracy. The new network was established in under 20 days, and will allow the oil company to dynamically grow the size of its grid by harvesting unused CPU cycles from any of the 185 user workstations across the division that may not be running at full capacity. It was concluded that when StatoilHydro opens a new location, additional clusters can be added to the grid.

  11. An economic analysis of a system wide Lean approach: cost estimations for the implementation of Lean in the Saskatchewan healthcare system for 2012-2014.

    Science.gov (United States)

    Sari, Nazmi; Rotter, Thomas; Goodridge, Donna; Harrison, Liz; Kinsman, Leigh

    2017-08-03

    The costs of investing in health care reform initiatives to improve quality and safety have been underreported and are often underestimated. This paper reports direct and indirect cost estimates for the initial phase of the province-wide implementation of Lean activities in Saskatchewan, Canada. In order to obtain detailed information about each type of Lean event, as well as the total number of corresponding Lean events, we used the Provincial Kaizen Promotion Office (PKPO) Kaizen database. While the indirect cost of Lean implementation has been estimated using the corresponding wage rate for the event participants, the direct cost has been estimated using the fees paid to the consultant and other relevant expenses. The total cost for implementation of Lean over two years (2012-2014), including consultants and new hires, ranged from $44 million CAD to $49.6 million CAD, depending upon the assumptions used. Consultant costs accounted for close to 50% of the total. The estimated cost of Lean events alone ranged from $16 million CAD to $19.5 million CAD, with Rapid Process Improvement Workshops requiring the highest input of resources. Recognizing the substantial financial and human investments required to undertake reforms designed to improve quality and contain cost, policy makers must carefully consider whether and how these efforts result in the desired transformations. Evaluation of the outcomes of these investments must be part of the accountability framework, even prior to implementation.
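
    The split between indirect costs (participant time valued at wage rates) and direct costs (consultant fees and other expenses) described above amounts to simple arithmetic. The sketch below uses invented figures purely for illustration; it does not reproduce the Saskatchewan data.

```python
# Illustrative cost model for a Lean implementation (hypothetical figures only).
# Indirect cost = participant hours valued at wage rates; direct cost = consultant fees and other expenses.

def lean_event_cost(participants: int, hours_per_participant: float, hourly_wage: float) -> float:
    """Indirect (opportunity) cost of a single Lean event."""
    return participants * hours_per_participant * hourly_wage

indirect = sum(
    lean_event_cost(p, h, w)
    for p, h, w in [
        (12, 40, 45.0),   # e.g. a Rapid Process Improvement Workshop (assumed numbers)
        (8, 16, 52.0),    # e.g. a shorter kaizen event (assumed numbers)
    ]
)
direct = 95_000.0          # consultant fees and other expenses (assumed)
total = direct + indirect
print(f"indirect ≈ ${indirect:,.0f}, direct ≈ ${direct:,.0f}, total ≈ ${total:,.0f}")
```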

  12. Personalized Medicine in the U.S. and Germany: Awareness, Acceptance, Use and Preconditions for the Wide Implementation into the Medical Standard.

    Science.gov (United States)

    Kichko, Kateryna; Marschall, Paul; Flessa, Steffen

    2016-05-02

    The aim of our research was to collect comprehensive data about public and physician awareness, acceptance and use of Personalized Medicine (PM), as well as their opinions on PM reimbursement and genetic privacy protection in the U.S. and Germany. In order to give a better overview, we compared our survey results with the results from other studies and discussed the preconditions for wide implementation of Personalized Medicine into the medical standard. For the data collection, using the same methodology, we performed several surveys in Pennsylvania (U.S.) and Bavaria (Germany). Physicians were contacted by letter, while public representatives were approached in person. Survey results, analyzed by means of descriptive and non-parametric statistical methods, have shown that awareness, acceptance, use and opinions on PM aspects in Pennsylvania and Bavaria were not significantly different. In both states there were strong concerns about genetic privacy protection and no support for a single genetic database. The costs for Personalized Medicine were expected to be covered by health insurers and governmental funds. In summary, we conclude that wide implementation of PM will require adjusting the healthcare reimbursement system, as well as adopting new laws that protect against genetic misuse while still enabling voluntary data provision.

  13. School-wide implementation of the elements of effective classroom instruction: Lessons from a high-performing, high-poverty urban school

    Science.gov (United States)

    Dyson, Hilarie

    2008-10-01

    The purpose of the study was to identify structures and systems implemented in a high-performing high-poverty urban school to promote high academic achievement among students of color. The researcher used a sociocultural theoretical framework to examine the influence of culture on the structures and systems that increased performance by African American and Hispanic students. Four research questions guided the study: (1) What are the trends and patterns of student performance among students of color? (2) What are the organizational structures and systems that are perceived to contribute to high student performance in high-poverty urban schools with high concentrations of students of color? (3) How are the organizational structures and systems implemented to support school-wide effective classroom instruction that promotes student learning? (4) How is the construct of race reflected in the school's structures and systems? Qualitative data were collected through interviews, observations, and artifact collection. A single case study method was employed and collected data were triangulated to capture and explore the rich details of the study. The study focused on a high-performing high-poverty urban elementary school located in southern California. The school population consisted of 99% students of color and 93% were economically disadvantaged. The school was selected for making significant and consistent growth in Academic Performance Index and Adequate Yearly Progress over a 3-year period. The school-wide structures and systems studied were (a) leadership, (b) school climate and culture, (c) standards-based instruction, (d) data-driven decision making, and (e) professional development. Four common themes emerged from the findings: (a) instructional leadership that focused on teaching and learning; (b) high expectations for all students; (c) school-wide focus on student achievement using standards, data, and culturally responsive teaching; and (d) positive

  14. Implementation of a campus-wide Irish hospital smoking ban in 2009: prevalence and attitudinal trends among staff and patients in lead up.

    LENUS (Irish Health Repository)

    Fitzpatrick, Patricia

    2012-02-01

    We report the evidence base that supported the decision to implement the first campus-wide hospital smoking ban in the Republic of Ireland with effect from 1 January 2009. Three separate data sources are utilized: surveillance data collected from patients and staff in 8 surveys between 1997 and 2006, a 1-week observational study to assess smoker behaviour in designated smoking shelters and an attitudinal interview with 28 smoker patients and 30 staff on the implications of the 2004 indoors workplace smoking ban, conducted in 2005. The main outcome measures were trends in prevalence of smoking over time according to age, sex and occupational groups and attitudes to the 2004 ban and a projected outright campus ban. Smoking rates among patients remained steady, 24.2% in 1997/98 and 22.7% in 2006. Staff smoking rates declined from 27.4% to 17.8%, with a strong occupational gradient. Observational evidence suggested a majority of those using smoking shelters in 2005 were women and health-care workers rather than patients. Attitudes of patients and staff were positive towards the 2004 ban, but with some ambivalence on the effectiveness of current arrangements. Staff particularly were concerned with patient safety issues associated with smoking outdoors. The 2004 ban was supported by 87.6% of patients and 81.3% of staff in 2006 and a majority of 58.6% of patients and 52.4% of staff agreed with an outright campus ban being implemented. These findings were persuasive in instigating a process in 2007/08 to go totally smoke-free by 2009, the stages for which are discussed.

  15. Magic Baseline Beta Beam

    CERN Document Server

    Agarwalla, Sanjib Kumar; Raychaudhuri, Amitava

    2007-01-01

    We study the physics reach of an experiment where neutrinos produced in a beta-beam facility at CERN are observed in a large magnetized iron calorimeter (ICAL) at the India-based Neutrino Observatory (INO). The CERN-INO distance is close to the so-called "magic" baseline which helps evade some of the parameter degeneracies and allows for a better measurement of the neutrino mass hierarchy and $\theta_{13}$.
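
    For context, the "magic" baseline is the distance at which the solar-driven interference terms in the oscillation probability vanish. A commonly quoted form of the condition (added here for orientation; it is not quoted from the paper) is

$$
\sin\!\left(\frac{\sqrt{2}\,G_F\,n_e\,L}{2}\right) = 0
\quad\Longrightarrow\quad
\sqrt{2}\,G_F\,n_e\,L = 2\pi ,
$$

    where $G_F$ is the Fermi constant and $n_e$ is the average electron number density along the path. For trajectories through the Earth's mantle this gives a baseline of roughly 7000-7500 km, which is why the CERN-INO distance is described as close to magic.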

  16. Rationing with baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    2013-01-01

    We introduce a new operator for general rationing problems in which, besides conflicting claims, individual baselines play an important role in the rationing process. The operator builds onto ideas of composition, which are not only frequent in rationing, but also in related problems such as bargaining, choice, and queuing. We characterize the operator and show how it preserves some standard axioms in the literature on rationing. We also relate it to recent contributions in such literature.

  17. The TDAQ Baseline Architecture

    CERN Multimedia

    Wickens, F J

    The Trigger-DAQ community is currently busy preparing material for the DAQ, HLT and DCS TDR. Over the last few weeks a very important step has been a series of meetings to complete agreement on the baseline architecture. An overview of the architecture indicating some of the main parameters is shown in figure 1. As reported at the ATLAS Plenary during the February ATLAS week, the main area where the baseline had not yet been agreed was around the Read-Out System (ROS) and details in the DataFlow. The agreed architecture has: Read-Out Links (ROLs) from the RODs using S-Link; Read-Out Buffers (ROB) sited near the RODs, mounted in a chassis - today assumed to be a PC, using PCI bus at least for configuration, control and monitoring. The baseline assumes data aggregation, in the ROB and/or at the output (which could either be over a bus or in the network). Optimization of the data aggregation will be made in the coming months, but the current model has each ROB card receiving input from 4 ROLs, and 3 such c...

  18. Design, Implementation, and Wide Pilot Deployment of FitForAll: An Easy to use Exergaming Platform Improving Physical Fitness and Life Quality of Senior Citizens.

    Science.gov (United States)

    Konstantinidis, Evdokimos I; Billis, Antonis S; Mouzakidis, Christos A; Zilidou, Vasiliki I; Antoniou, Panagiotis E; Bamidis, Panagiotis D

    2016-01-01

    Many platforms have emerged in response to the call for technology supporting active and healthy aging. Key requirements for any such e-health systems and any subsequent business exploitation are tailor-made design and proper evaluation. This paper presents the design, implementation, wide deployment, and evaluation of the low-cost physical exercise and gaming (exergaming) FitForAll (FFA) platform; system usability, user adherence to exercise, and efficacy are explored. The design of FFA is tailored to elderly populations, distilling literature guidelines and recommendations. The FFA architecture introduces standard physical exercise protocols in exergaming software engineering, as well as standard physical assessment tests for augmented adaptability through adjustable exercise intensity. This opens up the way to next generation exergaming software, which may be more automatically/smartly adaptive. A total of 116 elderly users piloted FFA five times/week during an eight-week controlled intervention. Usability evaluation was formally conducted (SUS, SUMI questionnaires). The control group consisted of a size-matched elderly group following cognitive training. Efficacy was assessed objectively through the senior fitness (Fullerton) test, and subjectively through pre- and post-intervention WHOQoL-BREF comparisons between groups. Adherence to schedule was measured by attendance logs. The global SUMI score was 68.33±5.85%, while SUS was 77.7. Good usability perception is reflected in relatively high adherence of 82% for a daily, two-month pilot schedule. Compared to the control group, elderly users of FFA significantly improved strength, flexibility, endurance, and balance, while showing a significant trend towards quality-of-life improvements. This is the first elderly-focused exergaming platform intensively evaluated with more than 100 participants. The use of formal tools makes the findings comparable to other studies and forms an elderly exergaming corpus.

  19. Practice, science and governance in interaction: European effort for the system-wide implementation of the International Classification of Functioning, Disability and Health (ICF) in Physical and Rehabilitation Medicine.

    Science.gov (United States)

    Stucki, Gerold; Zampolini, Mauro; Juocevicius, Alvydas; Negrini, Stefano; Christodoulou, Nicolas

    2017-04-01

    Since its launch in 2001, relevant international, regional and national PRM bodies have aimed to implement the International Classification of Functioning, Disability and Health (ICF) in Physical and Rehabilitation Medicine (PRM), thereby contributing to the development of suitable practical tools. These tools are available for implementing the ICF in day-to-day clinical practice, standardized reporting of functioning outcomes in quality management and research, and guiding evidence-informed policy. Educational efforts have reinforced PRM physicians' and other rehabilitation professionals' ICF knowledge, and numerous implementation projects have explored how the ICF is applied in clinical practice, research and policy. Largely lacking, though, is the system-wide implementation of ICF in day-to-day practice across all rehabilitation services of national health systems. In Europe, system-wide implementation of ICF requires the interaction between practice, science and governance. Considering its mandate, the UEMS PRM Section and Board have decided to lead a European effort towards system-wide ICF implementation in PRM, rehabilitation and health care at large, in interaction with governments, non-governmental actors and the private sector, and aligned with ISPRM's collaboration plan with WHO. In this paper we present the current PRM internal and external policy agenda towards system-wide ICF implementation and the corresponding implementation action plan, while highlighting priority action steps - promotion of ICF-based standardized reporting in national quality management and assurance programs, development of unambiguous rehabilitation service descriptions using the International Classification System for Service Organization in Health-related Rehabilitation, development of Clinical Assessment Schedules, qualitative linkage and quantitative mapping of data to the ICF, and the cultural adaptation of the ICF Clinical Data Collection Tool in European languages.

  20. Waste management project technical baseline description

    Energy Technology Data Exchange (ETDEWEB)

    Sederburg, J.P.

    1997-08-13

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

  1. How Valid Are the Portland Baseline Essays?

    Science.gov (United States)

    Martel, Erich

    1991-01-01

    Portland, Oregon's "African-American Baseline Essays," widely used in creating multicultural curricula, inaccurately depicts ancient Egyptians as black people and Olmec civilization as derived from African influences. The authors advance racial theories long abandoned by mainline Africa scholars, attribute mystical powers to pyramids,…

  2. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the key CDM criterion that a project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are intended to help stakeholders better understand the CDM and should ultimately contribute to maximizing the effect of the CDM in achieving the ultimate goal of UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)
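
    The role a baseline plays in crediting a CDM project reduces to one line of arithmetic: credited reductions are baseline emissions minus project emissions (minus any leakage). The sketch below uses invented numbers and is only a schematic of that relationship, not an approved methodology.

```python
# Minimal illustration of how a baseline enters a CDM emission-reduction calculation.
# All figures are invented; real methodologies are considerably more detailed.

def emission_reductions(baseline_tco2e: float, project_tco2e: float, leakage_tco2e: float = 0.0) -> float:
    """Credited reductions = baseline emissions - project emissions - leakage."""
    return baseline_tco2e - project_tco2e - leakage_tco2e

baseline = 120_000.0   # tCO2e/yr that would have been emitted without the project (baseline scenario)
project = 35_000.0     # tCO2e/yr emitted with the CDM project in place
print(f"Credited reductions ≈ {emission_reductions(baseline, project):,.0f} tCO2e per year")
```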

  3. Owning the program technical baseline for future space systems acquisition: program technical baseline tracking tool

    Science.gov (United States)

    Nguyen, Tien M.; Guillen, Andy T.; Hant, James J.; Kizer, Justin R.; Min, Inki A.; Siedlak, Dennis J. L.; Yoh, James

    2017-05-01

    The U.S. Air Force (USAF) has recognized the need to own the program and technical knowledge within the Air Force concerning the systems being acquired, in order to ensure success. This paper extends the previous work done by the authors [1-2] on the "Resilient Program Technical Baseline Framework for Future Space Systems" and "Portfolio Decision Support Tool (PDST)" to the development and implementation of the Program and Technical Baseline (PTB) Tracking Tool (PTBTL) for the DOD acquisition life cycle. The paper describes the "simplified" PTB tracking model with a focus on the pre-award phases and discusses how to implement this model in PDST.

  4. Long Baseline Neutrino Experiments

    Science.gov (United States)

    Mezzetto, Mauro

    2016-05-01

    Following the discovery of neutrino oscillations by the Super-Kamiokande collaboration, recently awarded the Nobel Prize, two generations of long baseline experiments have been set up to further study neutrino oscillations. The first generation experiments, K2K in Japan, Minos in the States and Opera in Europe, focused on confirming the Super-Kamiokande result, improving the precision with which oscillation parameters had been measured and demonstrating the ντ appearance process. Second generation experiments, T2K in Japan and very recently NOνA in the States, went further, being optimized to look for genuine three neutrino phenomena like non-zero values of θ13 and first glimpses of leptonic CP violation (LCPV) and neutrino mass ordering (NMO). The discovery of leptonic CP violation will require third-generation setups; at the moment two strong proposals are ongoing, Dune in the States and Hyper-Kamiokande in Japan. This review focuses in more detail on these future initiatives.

  5. Biofuels Baseline 2008

    Energy Technology Data Exchange (ETDEWEB)

    Hamelinck, C.; Koper, M.; Berndes, G.; Englund, O.; Diaz-Chavez, R.; Kunen, E.; Walden, D.

    2011-10-15

    The European Union is promoting the use of biofuels and other renewable energy in transport. In April 2009, the Renewable Energy Directive (2009/28/EC) was adopted, setting a 10% target for renewable energy in transport in 2020. The directive sets several requirements on the sustainability of biofuels marketed in the frame of the Directive. The Commission is required to report to the European Parliament on a regular basis on a range of sustainability impacts resulting from the use of biofuels in the EU. This report serves as a baseline of information for regular monitoring of the impacts of the Directive. Chapter 2 discusses the EU biofuels market, the production and consumption of biofuels and international trade. It derives where the feedstock for EU-consumed biofuels originally comes from. Chapter 3 discusses the biofuel policy framework in the EU and major third countries of supply. It looks at various policy aspects that are relevant to compliance with the EU sustainability requirements. Chapter 4 discusses the environmental and social sustainability aspects associated with EU biofuels and their feedstock. Chapter 5 discusses the macro-economic effects that indirectly result from increased EU biofuels consumption, on commodity prices and land use. Chapter 6 presents country factsheets for the main third countries that supplied biofuels to the EU market in 2008.

  6. Effects of a Provincial-Wide Implementation of Screening for Distress on Healthcare Professionals' Confidence and Understanding of Person-Centered Care in Oncology.

    Science.gov (United States)

    Tamagawa, Rie; Groff, Shannon; Anderson, Jennifer; Champ, Sarah; Deiure, Andrea; Looyis, Jennifer; Faris, Peter; Watson, Linda

    2016-10-01

    Although published studies report that screening for distress (SFD) improves the quality of care for patients with cancer, little is known about how SFD impacts healthcare professionals (HCPs). This quality improvement project examined the impact of implementing the SFD intervention on HCPs' confidence in addressing patient distress and awareness of person-centered care. This project involved pre-evaluation and post-evaluation of the impact of implementing SFD. A total of 254 HCPs (cohort 1) were recruited from 17 facilities across the province to complete questionnaires. SFD was then implemented at all cancer care facilities over a 10-month implementation period, after which 157 HCPs (cohort 2) completed post-implementation questionnaires. At regional and community care centers, navigators supported the integration of SFD into routine practice; therefore, the impact of navigators was examined. HCPs in cohort 2 reported significantly greater confidence in managing patients' distress and greater awareness about person-centered care relative to HCPs in cohort 1. HCPs at regional and community sites reported greater awareness in person-centeredness before and after the intervention, and reported fewer negative impacts of SFD relative to HCPs at tertiary sites. Caring for single or multiple tumor types was an effect modifier, with effects observed only in the HCPs treating multiple tumors. Implementation of SFD was beneficial for HCPs' confidence and awareness of person-centeredness. Factors comprising different models of care, such as having site-based navigators and caring for single or multiple tumors, influenced outcomes. Copyright © 2016 by the National Comprehensive Cancer Network.

  7. The Gambia Impact Evaluation Baseline Report

    OpenAIRE

    2015-01-01

    The Government of The Gambia is implementing the Maternal and Child Nutrition and Health Results Project (MCNHRP) to increase the utilization of community nutrition and primary maternal and child health services. In collaboration with the Government, the World Bank is conducting an impact evaluation (IE) to assess the impact of the project on key aspects of maternal and child nutrition and health. The baseline survey for the MCNHRP IE took place between November 2014 and February 2015. It c...

  8. Relationship between Leadership Styles of High School Teachers, Principals, and Assistant Principals and Their Attitudes toward School Wide Positive Behavior and Support Implementation

    Science.gov (United States)

    Lampton-Holmes, Geneva Cosweler

    2014-01-01

    The purpose of this study was to determine if seventh through twelfth grade educators' attitudes towards School-Wide Positive Behavior Support (SWPBS) are affected based on their gender, years of experience, school discipline policy, leadership style, and knowledge of SWPBS. Through an online survey, an analysis of the leadership style and…

  9. Effects on Problem Behavior and Social Skills Associated with the Implementation of School Wide Positive Behavioral Intervention and Supports Approach in an Alternative School Setting

    Science.gov (United States)

    Evans, Erica

    2013-01-01

    In spite of research documenting the negative effects of punishment, most high schools and correctional facilities rely on punishment to establish order and compliance with rules and routines (Nelson, Sprague, Jolivette, Smith, & Tobin, 2009). One alternative to punitive consequences is School Wide Positive Behavioral Intervention and Supports…

  10. Using Baseline Studies in the Investigation of Test Impact

    Science.gov (United States)

    Wall, Dianne; Horak, Tania

    2007-01-01

    The purpose of this article is to discuss the role of "baseline studies" in investigations of test impact and to illustrate the type of thinking underlying the design and implementation of such studies by reference to a recent study relating to a high-stakes test of English language proficiency. Baseline studies are used to describe an educational…

  11. Baseline inventory data users guide to abiotic GIS layers

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Baseline Inventory Team was chartered by the Fulfilling the Promises Implementation Team to recommend minimum abiotic and biotic inventories for the National...

  12. Butler Hollow Glades : Baseline assessment and vegetation monitoring establishment

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Several sampling and documentation protocols were implemented to establish baseline vegetation data. These data will provide a comparison point for future...

  13. An analysis of the return on investment of Navy Enterprise Resource Planning as implemented Navy-wide FY04-FY15

    OpenAIRE

    Kovack, Robert G.; Lindley, Philip R.

    2011-01-01

    MBA Professional Report Since 2003, the United States Navy has invested hundreds of millions of dollars into the Enterprise Resource Planning (ERP) Program. ERP evolved from four pilot programs into a single solution. Furthermore, the Navy has invested approximately 2 billion dollars for ERP implementation and developed several programs to streamline the financial reporting practices. This thesis project analyzes the evolution and development of ERP, identifies the Navy's projections f...

  14. Design and implementation of World Wide Web-based tools for image management in computed tomography, magnetic resonance imaging, and ultrasonography

    OpenAIRE

    Henri, Christopher J.; Rubin, Richard K.; Cox, Robert D.; Bret, Patrice M.

    1997-01-01

    This article describes our experience in developing and using several web-based tools to facilitate access to and management of images from inside and outside of our department. Having recently eliminated film in ultrasound, computed tomography (CT), and magnetic resonance imaging (MRI), we required a simple method to access imaging from computers already existing throughout the hospital. The success of the World Wide Web (WWW), the familiarity of end users with web browsers, and the relative ...

  15. How good are we at implementing evidence to support the management of birth related perineal trauma? A UK wide survey of midwifery practice

    Directory of Open Access Journals (Sweden)

    Bick Debra E

    2012-06-01

    Full Text Available Abstract Background The accurate assessment and appropriate repair of birth related perineal trauma require high levels of skill and competency, with evidence based guideline recommendations available to inform UK midwifery practice. Implementation of guideline recommendations could reduce maternal morbidity associated with perineal trauma, which is commonly reported and persistent, with potential to deter women from a future vaginal birth. Despite evidence, limited attention is paid to this important aspect of midwifery practice. We wished to identify how midwives in the UK assessed and repaired perineal trauma and the extent to which practice reflected evidence based guidance. Findings would be used to inform the content of a large intervention study. Methods A descriptive cross sectional study was completed. One thousand randomly selected midwives were accessed via the Royal College of Midwives (RCM) and sent a questionnaire. Study inclusion criteria included that the midwives were in clinical practice and undertook perineal assessment and management within their current role. Quantitative and qualitative data were collated. Associations between midwife characteristics and implementation of evidence based recommendations for perineal assessment and management were examined using chi-square tests of association. Results 405 midwives (40.5%) returned a questionnaire, 338 (83.5%) of whom met inclusion criteria. The majority worked in a consultant led unit (235, 69.5%) and over a third had been qualified for 20 years or longer (129, 38.2%). Compliance with evidence was poor. Few (6%) midwives used evidence based suturing methods to repair all layers of perineal trauma and only 58 (17.3%) performed rectal examination as part of routine perineal trauma assessment. Over half (192, 58.0%) did not suture all second degree tears. Feeling confident to assess perineal trauma all of the time was only reported by 116 (34.3%) midwives, with even fewer (73, 21
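
    The chi-square tests of association mentioned above can be sketched as follows; the contingency table is invented for illustration and does not reproduce the survey data.

```python
# Hypothetical 2x2 association test of the kind described above (invented counts).
from scipy.stats import chi2_contingency

# Rows: qualified < 20 years vs >= 20 years; columns: sutures all second-degree tears (yes / no).
# Counts are made up; only the row totals loosely echo the abstract.
table = [[98, 111],
         [48, 81]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```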

  16. Implementing an online tool for genome-wide validation of survival-associated biomarkers in ovarian-cancer using microarray data from 1287 patients

    DEFF Research Database (Denmark)

    Győrffy, Balázs; Lánczky, András; Szállási, Zoltán

    2012-01-01

    The validation of prognostic biomarkers in large independent patient cohorts is a major bottleneck in ovarian cancer research. We implemented an online tool to assess the prognostic value of the expression levels of all microarray-quantified genes in ovarian cancer patients. First, a database was set up using gene expression data and survival information of 1287 ovarian cancer patients downloaded from Gene Expression Omnibus and The Cancer Genome Atlas (Affymetrix HG-U133A, HG-U133A 2.0, and HG-U133 Plus 2.0 microarrays). After quality control and normalization, only probes present on all three platforms were retained. The result is a biomarker validation platform that mines all available microarray data to assess the prognostic power of 22 277 genes in 1287 ovarian cancer patients. We specifically used this tool to evaluate the effect of 37 previously published biomarkers on ovarian cancer prognosis.
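
    A greatly simplified sketch of the kind of analysis such a validation tool performs — splitting patients at the median expression of a gene and comparing survival between the two groups with a log-rank test — is shown below. It uses the lifelines package and synthetic data, and is not the authors' implementation.

```python
# Simplified survival-based biomarker assessment (synthetic data; not the published tool).
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 200
expression = rng.normal(size=n)                  # stand-in for one gene's expression values
# Synthetic survival: shorter expected survival for high expression, with ~30% censoring.
time = rng.exponential(scale=np.where(expression > np.median(expression), 40, 60), size=n)
event = rng.random(n) < 0.7                      # True = death observed, False = censored

high = expression > np.median(expression)
result = logrank_test(time[high], time[~high],
                      event_observed_A=event[high], event_observed_B=event[~high])
print(f"log-rank p-value: {result.p_value:.4f}")
```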

  17. Awareness and attitudes of pre-exposure prophylaxis for HIV prevention among physicians in Guatemala: Implications for country-wide implementation

    Science.gov (United States)

    Mejia, Carlos; Melendez, Johanna; Chan, Philip A.; Nunn, Amy C.; Powderly, William; Goodenberger, Katherine; Liu, Jingxia; Mayer, Kenneth H.

    2017-01-01

    Introduction HIV continues to be a major health concern with approximately 2.1 million new infections occurring worldwide in 2015. In Central America, Guatemala had the highest incident number of HIV infections (3,700) in 2015. Antiretroviral pre-exposure prophylaxis (PrEP) was recently recommended by the World Health Organization (WHO) as an efficacious intervention to prevent HIV transmission. PrEP implementation efforts are underway in Guatemala and success will require providers that are knowledgeable and willing to prescribe PrEP. We sought to explore current PrEP awareness and prescribing attitudes among Guatemalan physicians in order to inform future PrEP implementation efforts. Methods We conducted a cross-sectional survey of adult internal medicine physicians at the main teaching hospital in Guatemala City in March 2015. The survey included demographics, medical specialty, years of HIV patient care, PrEP awareness, willingness to prescribe PrEP, previous experience with post-exposure prophylaxis, and concerns about PrEP. The primary outcome was willingness to prescribe PrEP, which was assessed using a 5-point Likert scale for different at-risk population scenarios. Univariate and multivariate logistic regression was performed to identify predictors for willingness to prescribe PrEP. Results Eighty-seven physicians completed the survey; 66% were male, 64% were internal medicine residency trainees, and 10% were infectious disease (ID) specialists. Sixty-nine percent of physicians were PrEP aware, of which 9% had previously prescribed PrEP. Most (87%) of respondents were willing to prescribe PrEP to men who have sex with men (MSM), sex workers, injection drug users, or HIV-uninfected persons having known HIV-positive sexual partners. Concerns regarding PrEP included development of resistance (92%), risk compensation (90%), and cost (64%). Univariate logistic regression showed that younger age, being a resident trainee, and being a non-ID specialist were
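
    The univariate logistic regressions reported above can be sketched as follows; the data are synthetic and the predictor (residency trainee status) is used only as an example.

```python
# Sketch of a univariate logistic regression for willingness to prescribe PrEP (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 87
resident = rng.integers(0, 2, n)                               # 1 = residency trainee (invented)
willing = (rng.random(n) < np.where(resident == 1, 0.9, 0.75)).astype(int)

X = sm.add_constant(resident.astype(float))
model = sm.Logit(willing, X).fit(disp=False)
odds_ratio = np.exp(model.params[1])
print(f"odds ratio for trainees vs non-trainees ≈ {odds_ratio:.2f}")
```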

  18. Hanford Site technical baseline database

    Energy Technology Data Exchange (ETDEWEB)

    Porter, P.E.

    1996-09-30

    This document includes a cassette tape that contains the Hanford specific files that make up the Hanford Site Technical Baseline Database as of September 30, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 4 (May 10, 1996) of the Hanford Site Technical Baseline Database.

  19. Hanford Site technical baseline database

    Energy Technology Data Exchange (ETDEWEB)

    Porter, P.E., Westinghouse Hanford

    1996-05-10

    This document includes a cassette tape that contains the Hanford specific files that make up the Hanford Site Technical Baseline Database as of May 10, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 3 (April 10, 1996) of the Hanford Site Technical Baseline Database.

  20. Plutonium Immobilization Project Baseline Formulation

    Energy Technology Data Exchange (ETDEWEB)

    Ebbinghaus, B.

    1999-02-01

    A key milestone for the Immobilization Project (AOP Milestone 3.2a) in Fiscal Year 1998 (FY98) is the definition of the baseline composition or formulation for the plutonium ceramic form. The baseline formulation for the plutonium ceramic product must be finalized before the repository- and plant-related process specifications can be determined. The baseline formulation that is currently specified is given in Table 1.1. In addition to the baseline formulation specification, this report provides specifications for two alternative formulations, related compositional specifications (e.g., precursor compositions and mixing recipes), and other preliminary form and process specifications that are linked to the baseline formulation. The preliminary specifications, when finalized, are not expected to vary tremendously from the preliminary values given.

  1. Development of a System-Wide Predator Control Program: Stepwise Implementation of a Predation Index, Predator Control Fisheries, and Evaluation Plan in the Columbia River Basin; Northern Pikeminnow Management Program, 2001 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Russell G.; Winther, Eric C.; Fox, Lyle G.

    2003-03-01

    This report presents results for year eleven in a basin-wide program to harvest northern pikeminnow (Ptychocheilus oregonensis). This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991--a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible.
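
    The modeling claim summarized above — that a sustained 10-20% exploitation rate on predator-size fish could roughly halve predation — can be illustrated with a toy equilibrium model. The sketch below uses invented survival and recruitment parameters and is not the John Day Reservoir model.

```python
# Toy illustration of how sustained exploitation of predator-size fish reduces their abundance
# (and hence predation). Parameters are invented for illustration only.

def predation_index(exploitation_rate: float, years: int = 10,
                    natural_survival: float = 0.85, recruitment: float = 100.0) -> float:
    """Relative abundance of predator-size fish after `years` of added harvest."""
    unexploited = recruitment / (1 - natural_survival)          # unexploited equilibrium
    abundance = unexploited
    for _ in range(years):
        abundance = abundance * natural_survival * (1 - exploitation_rate) + recruitment
    return abundance / unexploited

for rate in (0.0, 0.10, 0.20):
    print(f"exploitation {rate:.0%}: predator abundance index ≈ {predation_index(rate):.2f}")
```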

  2. Development of a System-Wide Predator Control Program: Stepwise Implementation of a Predation Index, Predator Control Fisheries, and Evaluation Plan in the Columbia River Basin; Northern Pikeminnow Management Program, 2002 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Russell G.; Winther, Eric C.; Fox, Lyle G.

    2004-01-01

    This report presents results for year twelve in a basin-wide program to harvest northern pikeminnow1 (Ptychocheilus oregonensis). This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991--a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible. Estimates of combined annual exploitation rates resulting from the sport-reward and damangling fisheries remained at the low end of our target range of 10-20%. This suggested the need for additional effective harvest techniques. During 1991 and 1992, we developed and tested a modified

  3. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs the oscillation analysis and identifies modes of oscillations (frequency, damping, energy, and shape). The tool also does oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
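
    As a rough illustration of what identifying a mode's frequency and damping involves, the sketch below recovers a single damped oscillation from a synthetic ringdown signal; it is a toy example and not the OBAT algorithm.

```python
# Toy mode identification on a synthetic ringdown signal (not the OBAT implementation).
import numpy as np

fs = 30.0                                   # synchrophasor reporting rate, samples/s (assumed)
t = np.arange(0, 20, 1 / fs)
true_freq, true_damping = 0.7, 0.05         # Hz, 1/s (invented mode)
x = np.exp(-true_damping * t) * np.cos(2 * np.pi * true_freq * t)
x += 0.01 * np.random.default_rng(0).normal(size=t.size)

# Mode frequency from the FFT peak (skipping the DC bin)
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
f_est = freqs[np.argmax(spectrum[1:]) + 1]

# Damping from a log-linear fit to the positive local maxima (the decay envelope)
idx = [i for i in range(1, t.size - 1) if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
damping_est = -np.polyfit(t[idx], np.log(x[idx]), 1)[0]

print(f"estimated mode: {f_est:.2f} Hz, damping ≈ {damping_est:.3f} 1/s")
```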

  4. Development and Implementation of a School-Wide Institute for Excellence in Education to Enable Educational Scholarship by Medical School Faculty.

    Science.gov (United States)

    Cofrancesco, Joseph; Barone, Michael A; Serwint, Janet R; Goldstein, Mitchell; Westman, Michael; Lipsett, Pamela A

    2017-07-28

    Educational scholarship is an important component for faculty at Academic Medical Centers, especially those with single-track promotion systems. Yet, faculty may lack the skills and mentorship needed to successfully complete projects. In addition, many educators feel undervalued. To reinvigorate our school's educational mission, the Institute for Excellence in Education (IEE) was created. Here we focus on one of the IEE's strategic goals, that of inspiring and supporting educational research, scholarship, and innovation. Using the 6-step curriculum development process as a framework, we describe the development and outcomes of IEE programs aimed at enabling educational scholarship at the Johns Hopkins University School of Medicine. Four significant programs that focused on educational scholarship were developed and implemented: (a) an annual conference, (b) a Faculty Education Scholars' Program, (c) a "Shark Tank" small-grant program, and (d) Residency Redesign Challenge grants. A diverse group of primarily junior faculty engaged in these programs with strong mentorship, successfully completing and disseminating projects. Faculty members have been able to clarify their personal goals and develop a greater sense of self-efficacy for their desired paths in teaching and educational research. Faculty require programs and resources for educational scholarship and career development, focused on skills building in methodology, assessment, and statistical analysis. Mentoring and the time to work on projects are critical. Key to the IEE's success in maintaining and building programs has been ongoing needs assessment of faculty and learners and a strong partnership with our school's fund-raising staff. The IEE will next try to expand opportunities by adding additional mentoring capacity and further development of our small-grants programs.

  5. Improving the sterile sperm identification method for its implementation in the Area-wide Sterile Insect Technique Program against Ceratitis capitata (Diptera: Tephritidae) in Spain.

    Science.gov (United States)

    Juan-Blasco, M; Urbaneja, A; San Andrés, V; Castañera, P; Sabater-Muñoz, B

    2013-12-01

    The success of sterile males in area-wide sterile insect technique (aw-SIT) programs against Ceratitis capitata (Wiedemann) is currently measured using indirect methods such as the wild:sterile male ratio captured in monitoring traps. In the past decade, molecular techniques have been used to improve these methods. The development of a polymerase chain reaction-restriction fragment-length polymorphism-based method to identify the transfer of sterile sperm to wild females, the target of SIT, was considered a significant step in this direction. This method relies on identification of sperm by detecting the presence of Y chromosomes in spermathecal DNA extract, complemented by the identification of the genetic origin of this sperm: Vienna-8 males or wild haplotype. However, the application of this protocol to aw-SIT programs is limited by handling time and personnel cost. The objective of this work was to obtain a high-throughput protocol to facilitate routine measurement of the presence of sterile sperm in wild females in a pest population. The polymerase chain reaction-restriction fragment-length polymorphism markers previously developed were validated in Mediterranean fruit fly samples collected from various locations worldwide. A previously published laboratory protocol was modified to allow for the analysis of more samples at the same time. Preservation methods and preservation times commonly used for Mediterranean fruit fly female samples were assessed for their influence on the correct molecular detection of sterile sperm. This high-throughput methodology, as well as the results of sample management presented here, provides a robust, efficient, fast, and economical sterile sperm identification method ready to be used in all Mediterranean fruit fly SIT programs.

  6. Quivira NWR biological baseline data

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This dataset is biological baseline data for Quivira National Wildlife Refuge as of January 2016. It contains data on species found on the refuge, when and where...

  7. 324 Building Baseline Radiological Characterization

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  8. SRP baseline hydrogeologic investigation, Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, H.W.

    1987-11-01

    As discussed in the program plan for the Savannah River Plant (SRP) Baseline Hydrogeologic Investigation, this program has been implemented for the purpose of updating and improving the current state of knowledge and understanding of the hydrogeologic systems underlying the Savannah River Plant (SRP). The objective of the program is to install a series of observation well clusters (wells installed in each major water bearing formation at the same site) at key locations across the plant site in order to: (1) provide detailed information on the lithology, stratigraphy, and groundwater hydrology, (2) provide observation wells to monitor the groundwater quality, head relationships, gradients, and flow paths.

  9. Very Long Baseline Interferometry with the SKA

    CERN Document Server

    Paragi, Zsolt; Reynolds, Cormac; Rioja, Maria; Deller, Adam; Zhang, Bo; Gurvits, Leonid; Bietenholz, Michael; Szomoru, Arpad; Bignall, Hayley; Boven, Paul; Charlot, Patrick; Dodson, Richard; Frey, Sandor; Garrett, Michael; Imai, Hiroshi; Lobanov, Andrei; Reid, Mark; Ros, Eduardo; van Langevelde, Huib; Zensus, J Anton; Zheng, Xing Wu; Alberdi, Antxon; Agudo, Ivan; An, Tao; Argo, Megan; Beswick, Rob; Biggs, Andy D; Brunthaler, Andreas; Campbell, Robert M; Cimo, Giuseppe; Colomer, Francisco; Corbel, Stephane; Conway, John; Cseh, David; Deane, Roger; Falcke, Heino; Gabanyi, Krisztina; Gawronski, Marcin; Gaylard, Michael; Giovannini, Gabriele; Giroletti, Marcello; Goddi, Ciriaco; Goedhart, Sharmila; Gomez, Jose L; Gunn, Alastair; Jung, Taehyun; Kharb, Preeti; Klockner, Hans-Rainer; Kording, Elmar; Kovalev, Yurii Yu; Kunert-Bajraszewska, Magdalena; Lindqvist, Michael; Lister, Matt; Mantovani, Franco; Marti-Vidal, Ivan; Mezcua, Mar; McKean, John; Middelberg, Enno; Miller-Jones, James; Moldon, Javier; Muxlow, Tom; O'Brien, Tim; Pérez-Torres, Miguel; Pogrebenko, Sergei; Quick, Jonathan; Rushton, Anthony P; Schilizzi, Richard; Smirnov, Oleg; Sohn, Bong Won; Surcis, Gabriele; Taylor, Greg; Tingay, Steven; Tudose, Valeriu; van der Horst, Alexander; van Leeuwen, Joeri; Venturi, Tiziana; Vermeulen, Rene; Vlemmings, Wouter; de Witt, Aletha; Wucknitz, Olaf; Yang, Jun

    2014-01-01

    Adding VLBI capability to the SKA arrays will greatly broaden the science of the SKA, and is feasible within the current specifications. SKA-VLBI can be initially implemented by providing phased-array outputs for SKA1-MID and SKA1-SUR and using these extremely sensitive stations with other radio telescopes, and in SKA2 by realising a distributed configuration providing baselines up to thousands of km, merging it with existing VLBI networks. The motivation for and the possible realization of SKA-VLBI is described in this paper.

  10. SRP Baseline Hydrogeologic Investigation, Phase 3

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, H.W.

    1988-08-01

    The SRP Baseline Hydrogeologic Investigation was implemented for the purpose of updating and improving the knowledge and understanding of the hydrogeologic systems underlying the SRP site. Phase III, which is discussed in this report, includes the drilling of 7 deep coreholes (sites P-24 through P-30) and the installation of 53 observation wells ranging in depth from approximately 50 ft to more than 970 ft below the ground surface. In addition to the collection of geologic cores for lithologic and stratigraphic study, samples were also collected for the determination of physical characteristics of the sediments and for the identification of microorganisms.

  11. SRP baseline hydrogeologic investigation: Aquifer characterization

    Energy Technology Data Exchange (ETDEWEB)

    Strom, R.N.; Kaback, D.S.

    1992-03-31

    An investigation of the mineralogy and chemistry of the principal hydrogeologic units and the geochemistry of the water in the principal aquifers at Savannah River Site (SRS) was undertaken as part of the Baseline Hydrogeologic Investigation. This investigation was conducted to provide background data for future site studies and reports and to provide a site-wide interpretation of the geology and geochemistry of the Coastal Plain Hydrostratigraphic province. Ground water samples were analyzed for major cations and anions, minor and trace elements, gross alpha and beta, tritium, stable isotopes of hydrogen, oxygen, and carbon, and carbon-14. Sediments from the well borings were analyzed for mineralogy and major and minor elements.

  12. Development of a System-Wide Predator Control Program: Stepwise Implementation of a Predation Index, Predator Control Fisheries, and Evaluation Plan in the Columbia River Basin; Northern Pikeminnow Management Program, 2000 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Russell G.; Glaser, Bryce G.; Amren, Jennifer

    2003-03-01

    This report presents results for year ten in a basin-wide program to harvest northern pikeminnow (Ptychocheilus oregonensis). This program was started in an effort to reduce predation by northern pikeminnow on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River Basin suggested predation by northern pikeminnow on juvenile salmonids might account for most of the 10-20% mortality juvenile salmonids experience in each of eight Columbia River and Snake River reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated that, if predator-size northern pikeminnow were exploited at a 10-20% rate, the resulting restructuring of their population could reduce their predation on juvenile salmonids by 50%. To test this hypothesis, we implemented a sport-reward angling fishery and a commercial longline fishery in the John Day Pool in 1990. We also conducted an angling fishery in areas inaccessible to the public at four dams on the mainstem Columbia River and at Ice Harbor Dam on the Snake River. Based on the success of these limited efforts, we implemented three test fisheries on a system-wide scale in 1991--a tribal longline fishery above Bonneville Dam, a sport-reward fishery, and a dam-angling fishery. Low catch of target fish and high cost of implementation resulted in discontinuation of the tribal longline fishery. However, the sport-reward and dam-angling fisheries were continued in 1992 and 1993. In 1992, we investigated the feasibility of implementing a commercial longline fishery in the Columbia River below Bonneville Dam and found that implementation of this fishery was also infeasible. Estimates of combined annual exploitation rates resulting from the sport-reward and damangling fisheries remained at the low end of our target range of 10-20%. This suggested the need for additional effective harvest techniques. During 1991 and 1992, we developed and tested a modified

  13. Learning to Baseline Business Technology

    Directory of Open Access Journals (Sweden)

    David Gore

    2013-10-01

    Full Text Available bills, sign multi-year contracts, and make purchasing decisions without having an overall technology plan. That plan includes a technology baseline to fully assess existing technology. A CIO's goal is to align IT with business goals. Businesses must know total cost of ownership and the return on investment for all technology purchases and monthly costs. A business must also be able to manage technology assets and best utilize resources across the business. Teaching students to baseline technology will enable them to track and manage costs, discover errors and waste, and consolidate and improve existing technology.
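
    The baselining exercise described above ultimately reduces to totalling ownership costs and comparing them with returns; a minimal, hypothetical sketch of total cost of ownership (TCO) and return on investment (ROI) follows.

```python
# Hypothetical technology baseline: total cost of ownership (TCO) and return on investment (ROI).
# All figures are invented for illustration.

def tco(purchase: float, annual_costs: list[float]) -> float:
    """Purchase price plus recurring costs (licences, support, power) over the planning horizon."""
    return purchase + sum(annual_costs)

def roi(gain: float, cost: float) -> float:
    return (gain - cost) / cost

cost = tco(purchase=25_000.0, annual_costs=[4_000.0] * 3)     # three-year horizon
gain = 55_000.0                                               # estimated benefit over the same horizon
print(f"TCO = ${cost:,.0f}, ROI = {roi(gain, cost):.0%}")
```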

  14. OCIO FITARA Common Baseline Implementation Plan and Self-Assessment

    Data.gov (United States)

    National Aeronautics and Space Administration — This document outlines NASA's IT management and decision-making structure as well as the the Office of the Chief Information Officer's (OCIO) self-assessment against...

  15. SRNL RADIONUCLIDE FIELD LYSIMETER EXPERIMENT: BASELINE CONSTRUCTION AND IMPLEMENTATION

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, K.; Kaplan, D.; Bagwell, L.; Powell, B.; Almond, P.; Emerson, H.; Hixon, A.; Jablonski, J.; Buchanan, C.; Waterhouse, T.

    2012-10-17

    The purpose of this document is to compile information regarding experimental design, facility design, construction, radionuclide source preparation, and path forward for the ten year Savannah River National Laboratory (SRNL) Radionuclide Field Lysimeter Experiment at the Savannah River Site (SRS). This is a collaborative effort by researchers at SRNL and Clemson University. The scientific objectives of this study are to: Study long-term radionuclide transport under conditions more representative of vadose zone conditions than laboratory experiments; Provide more realistic quantification of radionuclide transport and geochemistry in the vadose zone, providing better information pertinent to radioactive waste storage solutions than presently exists; Reduce uncertainty and improve justification for geochemical models such as those used in performance assessments and composite analyses.

  16. Baseline Removal From EMG Recordings

    Science.gov (United States)

    2007-11-02

    a time-varying baseline contamination. Acknowledgements: Work funded by the Departamento de Salud del Gobierno de Navarra and by a Spanish MEC... Performing organization: Departamento de Ingeniería Eléctrica y Electrónica, Universidad Pública de Navarra, Pamplona, Spain.
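
    Although only a fragment of the abstract survives above, a common approach to removing a slowly varying baseline from an EMG recording is zero-phase high-pass filtering; the sketch below illustrates that generic technique and is not necessarily the method of the cited report.

```python
# Generic baseline removal from an EMG-like signal via zero-phase high-pass filtering.
# This is a common textbook approach; parameters and signals below are invented.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                    # sampling rate, Hz
t = np.arange(0, 5, 1 / fs)
baseline = 0.5 * np.sin(2 * np.pi * 0.3 * t)   # slow, time-varying baseline wander
emg = 0.1 * np.random.default_rng(0).normal(size=t.size)   # stand-in for the EMG signal
contaminated = emg + baseline

b, a = butter(4, 20.0 / (fs / 2), btype="highpass")   # 20 Hz cut-off, typical for surface EMG
cleaned = filtfilt(b, a, contaminated)
print(f"baseline wander RMS: {np.std(baseline):.3f}, "
      f"residual error after filtering: {np.std(cleaned - emg):.3f}")
```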

  17. Towards system-wide implementation of the International Classification of Functioning, Disability and Health (ICF) in routine practice: Developing simple, intuitive descriptions of ICF categories in the ICF Generic and Rehabilitation Set.

    Science.gov (United States)

    Prodinger, Birgit; Reinhardt, Jan D; Selb, Melissa; Stucki, Gerold; Yan, Tiebin; Zhang, Xia; Li, Jianan

    2016-06-13

    A national, multi-phase, consensus process to develop simple, intuitive descriptions of International Classification of Functioning, Disability and Health (ICF) categories contained in the ICF Generic and Rehabilitation Sets, with the aim of enhancing the utility of the ICF in routine clinical practice, is presented in this study. A multi-stage, national, consensus process was conducted. The consensus process involved 3 expert groups and consisted of a preparatory phase, a consensus conference with consecutive working groups and 3 voting rounds (votes A, B and C), followed by an implementation phase. In the consensus conference, participants first voted on whether they agreed that an initially developed proposal for simple, intuitive descriptions of an ICF category was in fact simple and intuitive. The consensus conference was held in August 2014 in mainland China. Twenty-one people with a background in physical medicine and rehabilitation participated in the consensus process. Four ICF categories achieved consensus in vote A, 16 in vote B, and 8 in vote C. This process can be seen as part of a larger effort towards the system-wide implementation of the ICF in routine clinical and rehabilitation practice to allow for the regular and comprehensive evaluation of health outcomes most relevant for the monitoring of quality of care.

  18. Mode S Baseline Radar Tracking.

    Science.gov (United States)

    1982-11-01

    range units and 20 azimuth units) overlaying the position of the beacon reports. In the cases analyzed where beacon reports were not radar reinforced...

  19. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    Energy Technology Data Exchange (ETDEWEB)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen [Galson Sciences Ltd. Oakham, Rutland (United Kingdom); Bolton, Gary [National Nuclear Laboratory Risley, Warrington (United Kingdom); McKinney, James; Morris, Darrell [Nuclear Decommissioning Authority Moor Row, Cumbria (United Kingdom); Angus, Mike [National Nuclear Laboratory Risley, Warrington (United Kingdom); Cann, Gavin; Binks, Tracey [National Nuclear Laboratory Sellafield (United Kingdom)

    2013-07-01

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach is made up of two sections: assessment of base-lining needs, and definition of the base-lining approach. During the assessment of base-lining needs, a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes affecting safety functions, and their corresponding measurable indicators, should then be identified so that the most suitable indicators can be selected for base-lining. In defining the approach, opportunities to collect data and the associated constraints are identified before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  20. Development of a System-Wide Predator Control Program : Stepwise Implementation of a Predation Index, Predator Control Fisheries and Evaluation Plan in the Columbia River Basin, 1991 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Nigro, Anthony A.; Willis, Charles F.

    1993-02-01

    We report our results from the first year of a basin-wide program to harvest northern squawfish in an effort to reduce mortality due to northern squawfish predation on juvenile salmonids during their emigration from natal streams to the ocean. Earlier work in the Columbia River basin suggested predation by northern squawfish on juvenile salmonids may account for most of the 10 to 20 percent mortality juvenile salmonids experience in each of eight Columbia and Snake river reservoirs. Modeling simulations based on work in John Day Reservoir from 1982 through 1988 indicated it is not necessary to eradicate northern squawfish to substantially reduce predation-caused mortality of juvenile salmonids. Instead, if northern squawfish were exploited at a 10 to 20 percent rate, reductions in their numbers and restructuring of their populations could reduce their predation on juvenile salmonids by 50 percent or more. Consequently, we designed and tested a sport reward hook-and-line fishery and a longline fishery in the John Day pool in 1990. Based on the success of these limited efforts, we implemented three test fisheries on a multi-pool or system-wide scale in 1991: a tribal longline fishery, a sport reward fishery, and a dam angling (hook-and-line) fishery. In addition, we examined several alternative harvest techniques to determine their potential for use in system-wide test fisheries. Evaluation of the success of the three test fisheries conducted in 1991 in achieving a 20 percent exploitation rate on northern squawfish, together with information regarding the economic, social, and legal feasibility of sustaining each fishery, is presented in Section II of this report.

  1. Education Organization Baseline Control Protection and Trusted Level Security

    Directory of Open Access Journals (Sweden)

    Wasim A. Al-Hamdani

    2007-12-01

    Full Text Available Many education organizations have adopted the enterprise best practices for security implementation on their campuses, while others focus on the ISO standards and/or those of the National Institute of Standards and Technology. All these adoptions depend on IT personnel and their experience or knowledge of the standard. On top of this is the size of the education organization: the larger the population of an education organization, the clearer the problems of information and security become. Thus, larger organizations have been obliged to comply with information security requirements and adopt national or international standards. The case is quite different when the population of the education organization is smaller. Such organizations may use social security numbers as student IDs, issue administrative rights to faculty and lab managers, or release personal information because they are not aware of the Family Educational Rights and Privacy Act (FERPA). The problem of education organization security is wide open and depends on the IT staff and their information security knowledge, as well as on the education culture (education, scholarship and services), which has very special characteristics compared with an enterprise or comparable organization. This paper is part of a research effort to develop an "Education Organization Baseline Control Protection and Trusted Level Security." The research has three parts: adopting (standards), testing, and modifying (if needed).

  2. Ultra wide band antennas

    CERN Document Server

    Begaud, Xavier

    2013-01-01

    Ultra Wide Band Technology (UWB) has reached a level of maturity that allows us to offer wireless links with either high or low data rates. These wireless links are frequently associated with a location capability for which ultimate accuracy varies with the inverse of the frequency bandwidth. Using time or frequency domain waveforms, they are currently the subject of international standards facilitating their commercial implementation. Drawing up a complete state of the art, Ultra Wide Band Antennas is aimed at students, engineers and researchers and presents a summary of internationally recognized...

  3. Vegetation baseline report : Connacher great divide project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-08-01

    This baseline report supported an application by Connacher Oil and Gas Ltd. to the Alberta Energy and Utilities Board (EUB) and Alberta Environment (AENV) for the Great Divide Steam Assisted Gravity Drainage (SAGD) Project. The goal of the report was to document the distribution and occurrence of ecosite phases and wetland classes in the project footprint as well as to document the distribution of rare plants; rare plant communities; and intrusive species and old-growth communities, including species of management concern. The methodology of the baseline report was presented, including details of mapping and field surveys. Six vegetation types in addition to the disturbed land unit were identified in the project footprint and associated buffer. It was noted that all vegetation types are common for the boreal forest natural regions. Several species of management concern were identified during the spring rare plant survey, including rare bryophytes and non-native or invasive species. Mitigation was identified through a slight shift of the footprint, transplanting of appropriate bryophyte species and implementation of a weed management plan. It was noted that results of future surveys for rare plants will be submitted upon completion. It was concluded that the effects of the project on existing vegetation are expected to be low because of the small footprint, prior disturbance history, available mitigation measures and conservation and reclamation planning. 27 refs., 5 tabs., 4 figs.

  4. System Wide Information Management (SWIM)

    Science.gov (United States)

    Hritz, Mike; McGowan, Shirley; Ramos, Cal

    2004-01-01

    This viewgraph presentation lists questions regarding the implementation of System Wide Information Management (SWIM). Some of the questions concern policy issues and strategies, technology issues and strategies, or transition issues and strategies.

  5. Pipeline integrity: ILI baseline data for QRA

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Todd R. [Tuboscope Pipeline Services, Houston, TX (United States)]. E-mail: tporter@varco.com; Silva, Jose Augusto Pereira da [Pipeway Engenharia, Rio de Janeiro, RJ (Brazil)]. E-mail: guto@pipeway.com; Marr, James [MARR and Associates, Calgary, AB (Canada)]. E-mail: jmarr@marr-associates.com

    2003-07-01

    The initial phase of a pipeline integrity management program (IMP) is conducting a baseline assessment of the pipeline system and segments as part of Quantitative Risk Assessment (QRA). This gives the operator's integrity team the opportunity to identify critical areas and deficiencies in the protection, maintenance, and mitigation strategies. As part of data gathering and integration from a wide variety of sources, in-line inspection (ILI) data is a key element. In order to move forward in the integrity program development and execution, the baseline geometry of the pipeline must be determined with accuracy and confidence. From this, all subsequent analysis and conclusions will be derived. Tuboscope Pipeline Services (TPS), in conjunction with Pipeway Engenharia of Brazil, operates ILI inertial navigation system (INS) and Caliper geometry tools to address this integrity requirement. These INS and Caliper ILI tool data provide pipeline trajectory at centimetre-level resolution and sub-metre 3D position accuracy, along with internal geometry: ovality, dents, misalignment, and wrinkle/buckle characterization. Global strain can be derived from precise INS curvature measurements and departure from the initial pipeline state. Accurate pipeline elevation profile data is essential in the identification of sag/overbend sections for fluid dynamic and hydrostatic calculations. These data, along with pipeline construction, operations, direct assessment and maintenance data, are integrated in LinaViewPRO™, a pipeline data management system for decision support functions and subsequent QRA operations. This technology provides the baseline for an informed, accurate and confident integrity management program. This paper/presentation will detail these aspects of an effective IMP, and experience will be presented, showing the benefits for liquid and gas pipeline systems. (author)
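
    As a rough illustration of the curvature-to-strain step mentioned above (a minimal sketch under small-deflection beam-bending assumptions, not TPS's actual processing), outer-fibre bending strain is approximately the measured curvature times half the pipe's outside diameter. The diameter and curvature values below are hypothetical.

      import numpy as np

      def bending_strain(curvature_per_m: np.ndarray, outer_diameter_m: float) -> np.ndarray:
          """Approximate outer-fibre bending strain from pipeline curvature.

          Small-deflection beam bending: strain = curvature * D/2. A real ILI strain
          assessment would also subtract the as-built (baseline) curvature and consider
          axial and thermal components.
          """
          return curvature_per_m * outer_diameter_m / 2.0

      # Hypothetical curvatures (1/m) from the INS trajectory of a 762 mm pipeline
      curvature = np.array([0.0, 1.0e-4, 2.5e-4])
      print(bending_strain(curvature, outer_diameter_m=0.762))  # ~[0.0, 3.81e-05, 9.5e-05]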

  6. The Effect of a Class-wide Training on Prosocial Bystander Behaviors

    OpenAIRE

    Barnes, Charity Deanne

    2015-01-01

    The purpose of this study was to decrease school bullying by implementing a class-wide intervention that targets bystanders. Hypotheses include that an intervention will increase prosocial bystander behaviors that will result in reduced rates of bullying and improved positive peer responses. Ross and Horner’s Positive Behavior Supports bullying prevention program was modified to increase incentives for students who defend others from bullying. A multiple baseline design across three general e...

  7. Long-Baseline Neutrino Experiments

    CERN Document Server

    Diwan, M V; Qian, X; Rubbia, A

    2016-01-01

    We review long-baseline neutrino experiments in which neutrinos are detected after traversing macroscopic distances. Over such distances neutrinos have been found to oscillate among flavor states. Experiments with solar, atmospheric, reactor, and accelerator neutrinos have resulted in a coherent picture of neutrino masses and mixing of the three known flavor states. We will summarize the current best knowledge of neutrino parameters and phenomenology with our focus on the evolution of the experimental technique. We proceed from the first evidence produced by astrophysical neutrino sources to the current open questions and the goals of future research.

  8. Long-Baseline Neutrino Experiments

    Science.gov (United States)

    Diwan, M. V.; Galymov, V.; Qian, X.; Rubbia, A.

    2016-10-01

    We review long-baseline neutrino experiments in which neutrinos are detected after traversing macroscopic distances. Over such distances neutrinos have been found to oscillate among flavor states. Experiments with solar, atmospheric, reactor, and accelerator neutrinos have resulted in a coherent picture of neutrino masses and mixing of the three known flavor states. We summarize the current best knowledge of neutrino parameters and phenomenology, with a focus on the evolution of the experimental technique. We proceed from the first evidence produced by astrophysical neutrino sources to the current open questions and the goals of future research.

  9. Baseline LAW Glass Formulation Testing

    Energy Technology Data Exchange (ETDEWEB)

    Kruger, Albert A. [USDOE Office of River Protection, Richland, WA (United States); Mooers, Cavin [The Catholic University of America, Washington, DC (United States). Vitreous State Lab.; Bazemore, Gina [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Pegg, Ian L. [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Hight, Kenneth [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Lai, Shan Tao [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Buechele, Andrew [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Rielley, Elizabeth [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Gan, Hao [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Muller, Isabelle S. [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Cecil, Richard [The Catholic University of America, Washington, DC (United States). Vitreous State Lab

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  10. Radio sources - Very, Very Long Baseline Interferometry

    Science.gov (United States)

    Roberts, D. H.

    1983-03-01

    With resolution of a thousandth of an arcsecond, the radio technique of Very Long Baseline Interferometry (VLBI) provides astronomers with their highest-resolution view of the universe. Data taken with widely-separated antennas are combined, with the help of atomic clocks, to form a Michelson interferometer whose size may be as great as the earth's diameter. Extraordinary phenomena, from the birth of stars as signaled by the brilliant flashes of powerful interstellar masers to the 'faster-than-light' expansion of the cores of distant quasars, are being explored with this technique. However, earth-bound VLBI suffers from several restrictions due to the location of the component antennas at fixed places on the earth's surface. The use of one or more antennas in space in concert with ground-based equipment will greatly expand the technical and scientific capabilities of VLBI, leading to a more complete and even higher resolution view of cosmic phenomena.

  11. FED baseline engineering studies report

    Energy Technology Data Exchange (ETDEWEB)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  12. Pinellas Plant Environmental Baseline Report

    Energy Technology Data Exchange (ETDEWEB)

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  13. TWRS privatization process technical baseline

    Energy Technology Data Exchange (ETDEWEB)

    Orme, R.M.

    1996-09-13

    The U.S. Department of Energy (DOE) is planning a two-phased program for the remediation of Hanford tank waste. Phase 1 is a pilot program to demonstrate the procurement of treatment services. The volume of waste treated during the Phase 1 is a small percentage of the tank waste. During Phase 2, DOE intends to procure treatment services for the balance of the waste. The TWRS Privatization Process Technical Baseline (PPTB) provides a summary level flowsheet/mass balance of tank waste treatment operations which is consistent with the tank inventory information, waste feed staging studies, and privatization guidelines currently available. The PPTB will be revised periodically as privatized processing concepts are crystallized.

  14. Integrated Baseline Review (IBR) Handbook

    Science.gov (United States)

    Fleming, Jon F.; Kehrer, Kristen C.

    2016-01-01

    This handbook is intended to be a how-to guide for preparing for, conducting, and closing out an Integrated Baseline Review (IBR). It discusses the steps that should be considered, describes roles and responsibilities, offers tips for tailoring the IBR based on risk, cost, and need for management insight, and provides lessons learned from past IBRs. Appendices contain example documentation typically used in connection with an IBR. Note that these appendices are examples only, and should be tailored to meet the needs of individual projects and contracts. Following the guidance in this handbook will help customers and suppliers preparing for an IBR understand the expectations of the IBR, and ensure that the IBR meets the requirements for both in-house and contract efforts.

  15. 2016 Annual Technology Baseline (ATB)

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; O'Connor, Patrick; Waldoch, Connor

    2016-09-01

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel Workbook. The ATB includes the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  16. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O'Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  17. Atlantic NAD 83 SLA Baseline Points

    Data.gov (United States)

    Bureau of Ocean Energy Management, Department of the Interior — This data set contains baseline points in ArcGIS shapefile format for the BOEM Atlantic Region. Baseline points are the discrete coordinate points along the...

  18. Atlantic NAD 83 SLA Baseline Points

    Data.gov (United States)

    Bureau of Ocean Energy Management, Department of the Interior — This data set contains baseline points in ArcGIS shapefile format for the BOEM Atlantic Region. Baseline points are the discrete coordinate points along the...

  19. Atlantic NAD 83 SLA Baseline Tangents

    Data.gov (United States)

    Bureau of Ocean Energy Management, Department of the Interior — This data set contains baseline tangent lines in ArcGIS shapefile format for the BOEM Atlantic Region. Baseline tangent lines are typically bay or river closing...

  20. 40 CFR 1042.825 - Baseline determination.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment, Provisions for Remanufactured Marine Engines, § 1042.825 Baseline determination. (a) For the purpose of this... not valid. (f) Use good engineering judgment for all aspects of the baseline determination. We may...

  1. LBCS: The LOFAR Long-Baseline Calibrator Survey

    Science.gov (United States)

    Jackson, N.; Tagore, A.; Deller, A.; Moldón, J.; Varenius, E.; Morabito, L.; Wucknitz, O.; Carozzi, T.; Conway, J.; Drabent, A.; Kapinska, A.; Orrù, E.; Brentjens, M.; Blaauw, R.; Kuper, G.; Sluman, J.; Schaap, J.; Vermaas, N.; Iacobelli, M.; Cerrigone, L.; Shulevski, A.; ter Veen, S.; Fallows, R.; Pizzo, R.; Sipior, M.; Anderson, J.; Avruch, I. M.; Bell, M. E.; van Bemmel, I.; Bentum, M. J.; Best, P.; Bonafede, A.; Breitling, F.; Broderick, J. W.; Brouw, W. N.; Brüggen, M.; Ciardi, B.; Corstanje, A.; de Gasperin, F.; de Geus, E.; Eislöffel, J.; Engels, D.; Falcke, H.; Garrett, M. A.; Grießmeier, J. M.; Gunst, A. W.; van Haarlem, M. P.; Heald, G.; Hoeft, M.; Hörandel, J.; Horneffer, A.; Intema, H.; Juette, E.; Kuniyoshi, M.; van Leeuwen, J.; Loose, G. M.; Maat, P.; McFadden, R.; McKay-Bukowski, D.; McKean, J. P.; Mulcahy, D. D.; Munk, H.; Pandey-Pommier, M.; Polatidis, A. G.; Reich, W.; Röttgering, H. J. A.; Rowlinson, A.; Scaife, A. M. M.; Schwarz, D. J.; Steinmetz, M.; Swinbank, J.; Thoudam, S.; Toribio, M. C.; Vermeulen, R.; Vocks, C.; van Weeren, R. J.; Wise, M. W.; Yatawatta, S.; Zarka, P.

    2016-11-01

    We outline the LOFAR Long-Baseline Calibrator Survey (LBCS), whose aim is to identify sources suitable for calibrating the highest-resolution observations made with the International LOFAR Telescope, which include baselines >1000 km. Suitable sources must contain significant correlated flux density (≳ 50 - 100 mJy) at frequencies around 110-190 MHz on scales of a few hundred milliarcseconds. At least for the 200-300-km international baselines, we find around 1 suitable calibrator source per square degree over a large part of the northern sky, in agreement with previous work. This should allow a randomly selected target to be successfully phase calibrated on the international baselines in over 50% of cases. Products of the survey include calibrator source lists and fringe-rate and delay maps of wide areas - typically a few degrees - around each source. The density of sources with significant correlated flux declines noticeably with baseline length over the range 200-600 km, with good calibrators on the longest baselines appearing only at the rate of 0.5 per sq. deg. Coherence times decrease from 1-3 min on 200-km baselines to about 1 min on 600-km baselines, suggesting that ionospheric phase variations contain components with scales of a few hundred kilometres. The longest median coherence time, at just over 3 min, is seen on the DE609 baseline, which at 227 km is close to being the shortest. We see median coherence times of between 80 and 110 s on the four longest baselines (580-600 km), and about 2 min for the other baselines. The success of phase transfer from calibrator to target is shown to be influenced by distance, in a manner that suggests a coherence patch at 150-MHz of the order of 1 deg. Although source structures cannot be measured in these observations, we deduce that phase transfer is affected if the calibrator source structure is not known. We give suggestions for calibration strategies and choice of calibrator sources, and describe the access to

  2. Lorentz symmetry and very long baseline interferometry

    Science.gov (United States)

    Le Poncin-Lafitte, C.; Hees, A.; Lambert, S.

    2016-12-01

    Lorentz symmetry violations can be described by an effective field theory framework that contains both general relativity and the Standard Model of particle physics, called the Standard Model extension (SME). Recently, postfit analysis of Gravity Probe B and binary pulsars led to an upper limit at the $10^{-4}$ level on the time-time coefficient $\bar s^{TT}$ of the pure-gravity sector of the minimal SME. In this work, we derive the observable of very long baseline interferometry (VLBI) in SME and then implement it into a real data analysis code of geodetic VLBI observations. Analyzing all available observations recorded since 1979, we compare estimates of $\bar s^{TT}$ and errors obtained with various analysis schemes, including global estimations over several time spans, and with various Sun elongation cutoff angles, and by analysis of radio source coordinate time series. We obtain a constraint on $\bar s^{TT} = (-5 \pm 8) \times 10^{-5}$, directly fitted to the observations and improving on previous postfit analysis estimates by a factor of 5.

  3. Lorentz symmetry and Very Long Baseline Interferometry

    CERN Document Server

    Le Poncin-Lafitte, C.; Lambert, S.

    2016-01-01

    Lorentz symmetry violations can be described by an effective field theory framework that contains both General Relativity and the Standard Model of particle physics called the Standard-Model extension (SME). Recently, post-fit analysis of Gravity Probe B and binary pulsars led to an upper limit at the $10^{-4}$ level on the time-time coefficient $\bar s^{TT}$ of the pure-gravity sector of the minimal SME. In this work, we derive the observable of Very Long Baseline Interferometry (VLBI) in SME and then we implement it into a real data analysis code of geodetic VLBI observations. Analyzing all available observations recorded since 1979, we compare estimates of $\bar s^{TT}$ and errors obtained with various analysis schemes, including global estimations over several time spans and with various Sun elongation cut-off angles, and with analysis of radio source coordinate time series. We obtain a constraint on $\bar s^{TT}=(-5\pm 8)\times 10^{-5}$, directly fitted to the observations and improving by a factor 5 pr...

  4. Tools for NEPA compliance: Baseline reports and compliance guides

    Energy Technology Data Exchange (ETDEWEB)

    Wolff, T.A. [Sandia National Labs., Albuquerque, NM (United States); Hansen, R.P. [Hansen Environmental Consultants, Englewood, CO (United States)

    1994-12-31

    Environmental baseline documents and NEPA compliance guides should be carried in every NEPA implementation "tool kit". These two indispensable tools can play a major role in avoiding repeated violations of NEPA requirements that have occurred over the past 26 years. This paper describes these tools, discusses their contents, and explains how they are used to prepare better NEPA documents more cost-effectively. Focus is on experience at Sandia Laboratories (NM).

  5. Design and Implementation of Blade Structure Data Concentrator in Wide Area Measurement System

    Institute of Scientific and Technical Information of China (English)

    刘伟; 王亮; 陈玉林; 侯学勇; 吕航; 文继锋

    2012-01-01

    Wide area measurement systems (WAMS) based on synchronized phasor measurement technology provide a new data support platform for research and development across all areas of the power system, making it possible to monitor and control the dynamic characteristics of the grid and greatly promoting research on related applications. The phasor data concentrator (PDC) is a key part of a WAMS, connecting synchronized phasor measurement units (PMU) with the main measuring station. Currently, most solutions are implemented with industrial computers. Practice shows that industrial-computer-based PDCs have apparent deficiencies in reliability and scalability. A new PDC designed around a blade-structure concept significantly improves reliability and scalability and, in particular, adapts well to smart substations based on the IEC 61850 process bus. Practical operation in substations has proved the validity and practicality of the solution.
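
    To illustrate the data-concentration role described above (a hedged sketch of the general idea, not the blade-based PDC itself), the fragment below aligns phasor measurements from several PMUs by reporting timestamp before they would be forwarded to the main measuring station. Field names and message layout are hypothetical; a real PDC would parse IEEE C37.118 frames, buffer within a wait-time window, and handle late or missing data.

      from collections import defaultdict
      from typing import Dict, List, Tuple

      # A measurement is (timestamp_s, pmu_id, voltage phasor); fields are hypothetical.
      Measurement = Tuple[float, str, complex]

      def concentrate(measurements: List[Measurement],
                      expected_pmus: int) -> Dict[float, Dict[str, complex]]:
          """Group phasors by reporting timestamp and keep only complete frames."""
          by_time: Dict[float, Dict[str, complex]] = defaultdict(dict)
          for ts, pmu, phasor in measurements:
              by_time[ts][pmu] = phasor
          return {ts: frame for ts, frame in by_time.items() if len(frame) == expected_pmus}

      # Two PMUs reporting at 50 frames/s; one frame from PMU-B is lost.
      data = [
          (0.00, "PMU-A", 1.00 + 0.02j), (0.00, "PMU-B", 0.98 - 0.01j),
          (0.02, "PMU-A", 1.01 + 0.02j),
          (0.04, "PMU-A", 1.00 + 0.03j), (0.04, "PMU-B", 0.99 - 0.02j),
      ]
      print(concentrate(data, expected_pmus=2))  # aligned frames at t=0.00 and t=0.04 only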

  6. LBCS: the LOFAR Long-Baseline Calibrator Survey

    CERN Document Server

    Jackson, N; Deller, A; Moldón, J; Varenius, E; Morabito, L; Wucknitz, O; Carozzi, T; Conway, J; Drabent, A; Kapinska, A; Orrù, E; Brentjens, M; Blaauw, R; Kuper, G; Sluman, J; Schaap, J; Vermaas, N; Iacobelli, M; Cerrigone, L; Shulevski, A; ter Veen, S; Fallows, R; Pizzo, R; Sipior, M; Anderson, J; Avruch, M; Bell, M; van Bemmel, I; Bentum, M; Best, P; Bonafede, A; Breitling, F; Broderick, J; Brouw, W; Brüggen, M; Ciardi, B; Corstanje, A; de Gasperin, F; de Geus, E; Eislöffel, J; Engels, D; Falcke, H; Garrett, M; Griessmeier, J; Gunst, A; van Haarlem, M; Heald, G; Hoeft, M; Hörandel, J; Horneffer, A; Intema, H; Juette, E; Kuniyoshi, M; van Leeuwen, J; Loose, G; Maat, P; McFadden, R; McKay-Bukowski, D; McKean, J; Mulcahy, D; Munk, H; Pandey-Pommier, M; Polatidis, A; Reich, W; Röttgering, H; Rowlinson, A; Scaife, A; Schwarz, D; Steinmetz, M; Swinbank, J; Thoudam, S; Toribio, M; Vermeulen, R; Vocks, C; van Weeren, R; Wise, M; Yatawatta, S; Zarka, P

    2016-01-01

    (abridged). We outline LBCS (the LOFAR Long-Baseline Calibrator Survey), whose aim is to identify sources suitable for calibrating the highest-resolution observations made with the International LOFAR Telescope, which include baselines >1000 km. Suitable sources must contain significant correlated flux density (50-100mJy) at frequencies around 110--190~MHz on scales of a few hundred mas. At least for the 200--300-km international baselines, we find around 1 suitable calibrator source per square degree over a large part of the northern sky, in agreement with previous work. This should allow a randomly selected target to be successfully phase calibrated on the international baselines in over 50% of cases. Products of the survey include calibrator source lists and fringe-rate and delay maps of wide areas -- typically a few degrees -- around each source. The density of sources with significant correlated flux declines noticeably with baseline length over the range 200--600~km, with good calibrators on the longest...

  7. Post Auction Coverage Baseline 2.0

    Data.gov (United States)

    Federal Communications Commission — FINAL TELEVISION CHANNEL ASSIGNMENT INFORMATION RELATED TO INCENTIVE AUCTION REPACKING. NOTE: This file provides new baseline coverage and population data for all...

  8. A Simple Method to Control Positive Baseline Trend within Data Nonoverlap

    Science.gov (United States)

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2014-01-01

    Nonoverlap is widely used as a statistical summary of data; however, these analyses rarely correct unwanted positive baseline trend. This article presents and validates the graph rotation for overlap and trend (GROT) technique, a hand calculation method for controlling positive baseline trend within an analysis of data nonoverlap. GROT is…

  9. National baselines for the Sustainable Development Goals assessed in the SDG Index and Dashboards

    Science.gov (United States)

    Schmidt-Traub, Guido; Kroll, Christian; Teksoz, Katerina; Durand-Delacre, David; Sachs, Jeffrey D.

    2017-08-01

    The Sustainable Development Goals (SDGs) -- agreed in 2015 by all 193 member states of the United Nations and complemented by commitments made in the Paris Agreement -- map out a broad spectrum of economic, social and environmental objectives to be achieved by 2030. Reaching these goals will require deep transformations in every country, as well as major efforts in monitoring and measuring progress. Here we introduce the SDG Index and Dashboards as analytical tools for assessing countries' baselines for the SDGs that can be applied by researchers in the cross-disciplinary analyses required for implementation. The Index and Dashboards synthesize available country-level data for all 17 goals, and for each country estimate the size of the gap towards achieving the SDGs. They will be updated annually. All 149 countries for which sufficient data is available face significant challenges in achieving the goals, and many countries' development strategies are imbalanced across the economic, social and environmental priorities. We illustrate the analytical value of the index by examining its relationship with other widely used development indices and by showing how it accounts for cross-national differences in subjective well-being. Given significant data gaps, scope and coverage of the Index and Dashboards are limited, but we suggest that these analyses represent a starting point for a comprehensive assessment of national SDG baselines and can help policymakers determine priorities for early action and monitor progress. The tools also identify data gaps that must be closed for SDG monitoring.
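
    As a rough sketch of the kind of aggregation such an index involves (the actual SDG Index methodology is more elaborate, with goal-level averaging and bounds derived from the data and the targets; the indicators and bounds below are invented), each indicator can be rescaled to 0-100 between a worst and a best bound and the rescaled scores averaged into a single country score:

      def rescale(value: float, worst: float, best: float) -> float:
          """Rescale an indicator to 0-100 between a worst and a best bound (clipped)."""
          score = 100.0 * (value - worst) / (best - worst)
          return max(0.0, min(100.0, score))

      # Hypothetical indicators for one country: (observed value, worst bound, best bound)
      indicators = {
          "poverty_rate_pct":         (12.0, 70.0, 0.0),    # lower is better
          "secondary_completion_pct": (85.0, 20.0, 100.0),
          "co2_per_capita_t":         (6.0, 20.0, 0.0),     # lower is better
      }
      scores = {name: rescale(v, worst, best) for name, (v, worst, best) in indicators.items()}
      index = sum(scores.values()) / len(scores)
      print(scores, round(index, 1))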

  10. TAPIR--Finnish national geochemical baseline database.

    Science.gov (United States)

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various
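
    A minimal sketch of how regional baseline values could serve as trigger values in a first screening step is shown below; the element limits and sample concentrations are invented for illustration and are not TAPIR or Decree values.

      from typing import Dict

      # Hypothetical regional baseline upper limits (mg/kg) for one geochemical province.
      BASELINE_UPPER_LIMIT: Dict[str, float] = {"As": 10.0, "Cr": 80.0, "Ni": 60.0, "Zn": 150.0}

      def screen_sample(concentrations: Dict[str, float]) -> Dict[str, bool]:
          """Flag elements whose measured concentration exceeds the regional baseline.

          Exceeding a baseline (trigger) value does not by itself mean the soil is
          contaminated; it indicates the need for further risk assessment.
          """
          return {el: conc > BASELINE_UPPER_LIMIT[el]
                  for el, conc in concentrations.items() if el in BASELINE_UPPER_LIMIT}

      sample = {"As": 14.2, "Cr": 55.0, "Ni": 61.5, "Zn": 120.0}
      print(screen_sample(sample))  # {'As': True, 'Cr': False, 'Ni': True, 'Zn': False}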

  11. Development of a System Wide Predator Control Program: Stepwise Implementation of a Predation Index, Predator Control Fisheries, and Evaluation Plan in the Columbia River Basin; Section II: Evaluation; 1996 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Franklin R.

    1997-11-01

    Predator control fisheries aimed at reducing predation on juvenile salmonids by northern squawfish (Ptychocheilus oregonensis) were implemented for the seventh consecutive year in the mainstem Columbia and Snake rivers.

  12. Health related baseline millennium development goals indicators for local authorities in Malawi.

    Science.gov (United States)

    Kalanda, Boniface

    2007-03-01

    The Malawi Social Action Fund (MASAF) is implementing a 12 year programme to close service gaps in rural communities. These service gaps are primarily those in health, education, household food security, water and sanitation, transport and communications. The impact indicators of the Project are selected Millennium Development Goal indicators. MASAF conducted a baseline study of the MDG indicators for all districts in Malawi. This paper presents available health related MDG baseline indicators for all districts in Malawi. Other stakeholders implementing health interventions could use these baseline indicators for planning purposes.

  13. COMBINED GPS/GLONASS PRECISE POSITIONING FOR LONG-DISTANCE BASELINES

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Combined GPS/GLONASS can increase the accuracy and reliability of positioning, especially in applications with many impediments. Due to the atmospheric delay, the methods commonly used for processing short-distance baselines cannot be applied to long-distance baselines. In this paper, a new data processing strategy for long-distance baselines is proposed, which uses the properties of some combination observables of combined GPS/GLONASS. The relative accuracy of long-distance baselines may come to the order of 10^-8, and combined GPS/GLONASS improves the accuracy over that of GPS-only positioning, which brings benefit to crustal deformation monitoring and research on geodynamics.

  14. Wide Band Artificial Pulsar

    Science.gov (United States)

    Parsons, Zackary

    2017-01-01

    The Wide Band Artificial Pulsar (WBAP) is an instrument verification device designed and built by the National Radio Astronomy Observatory (NRAO) in Green Bank, West Virginia. The site currently operates the Green Bank Ultimate Pulsar Processing Instrument (GUPPI) and the Versatile Green Bank Astronomical Spectrometer (VEGAS) digital backends for its radio telescopes. The commissioning and continued support for these sophisticated backends have demonstrated a need for a device capable of producing an accurate artificial pulsar signal. The WBAP is designed to provide a very close approximation to an actual pulsar signal. This presentation is intended to provide an overview of the current hardware and software implementations and also to share the current results from testing using the WBAP.
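
    At the simplest level, an artificial pulsar signal is a periodic pulse profile buried in noise; the sketch below generates such a test signal and recovers the profile by folding at the known period. All parameters and function names are illustrative, not the WBAP hardware or signal specification.

      import numpy as np

      def artificial_pulsar(n_samples: int, period: float, width: float, amplitude: float,
                            fs: float, noise_rms: float, seed: int = 0) -> np.ndarray:
          """Periodic Gaussian pulses plus white noise (illustrative parameters only)."""
          rng = np.random.default_rng(seed)
          t = np.arange(n_samples) / fs
          phase = (t % period) / period
          pulses = amplitude * np.exp(-0.5 * ((phase - 0.5) / width) ** 2)
          return pulses + rng.normal(0.0, noise_rms, n_samples)

      def fold(signal: np.ndarray, period: float, fs: float, n_bins: int = 64) -> np.ndarray:
          """Fold the time series at the pulse period to recover the mean pulse profile."""
          phase_bins = ((np.arange(signal.size) / fs % period) / period * n_bins).astype(int)
          profile = np.bincount(phase_bins, weights=signal, minlength=n_bins)
          counts = np.bincount(phase_bins, minlength=n_bins)
          return profile / np.maximum(counts, 1)

      sig = artificial_pulsar(200_000, period=0.1, width=0.02, amplitude=1.0,
                              fs=10_000.0, noise_rms=2.0)
      print(fold(sig, period=0.1, fs=10_000.0).max())  # folded peak close to the pulse amplitude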

  15. Treatment decisions based on scalar and functional baseline covariates.

    Science.gov (United States)

    Ciarleglio, Adam; Petkova, Eva; Ogden, R Todd; Tarpey, Thaddeus

    2015-12-01

    The amount and complexity of patient-level data being collected in randomized-controlled trials offer both opportunities and challenges for developing personalized rules for assigning treatment for a given disease or ailment. For example, trials examining treatments for major depressive disorder are not only collecting typical baseline data such as age, gender, or scores on various tests, but also data that measure the structure and function of the brain such as images from magnetic resonance imaging (MRI), functional MRI (fMRI), or electroencephalography (EEG). These latter types of data have an inherent structure and may be considered as functional data. We propose an approach that uses baseline covariates, both scalars and functions, to aid in the selection of an optimal treatment. In addition to providing information on which treatment should be selected for a new patient, the estimated regime has the potential to provide insight into the relationship between treatment response and the set of baseline covariates. Our approach can be viewed as an extension of "advantage learning" to include both scalar and functional covariates. We describe our method and how to implement it using existing software. Empirical performance of our method is evaluated with simulated data in a variety of settings and also applied to data arising from a study of patients with major depressive disorder from whom baseline scalar covariates as well as functional data from EEG are available.
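
    To make the combination of scalar and functional baseline covariates concrete, here is a hedged sketch (not the authors' estimator): the functional covariate, e.g. a sampled EEG curve, is reduced to a few basis coefficients, concatenated with the scalar covariates, and passed through a linear decision rule whose weights would in practice be estimated from randomized trial data, for instance by advantage learning. All variable names and numeric values below are hypothetical.

      import numpy as np

      def functional_to_coefficients(curve: np.ndarray, n_basis: int = 4) -> np.ndarray:
          """Project a sampled functional covariate onto a small cosine basis."""
          t = np.linspace(0.0, 1.0, curve.size)
          basis = np.stack([np.cos(np.pi * k * t) for k in range(n_basis)])  # (n_basis, T)
          return basis @ curve / curve.size

      def recommend_treatment(scalars: np.ndarray, curve: np.ndarray, weights: np.ndarray) -> int:
          """Linear decision rule: recommend treatment 1 if the estimated advantage is positive.

          `weights` stands in for coefficients that a method such as advantage learning
          would estimate from trial data; here they are placeholders.
          """
          features = np.concatenate([scalars, functional_to_coefficients(curve)])
          return int(features @ weights > 0.0)

      rng = np.random.default_rng(0)
      scalars = np.array([34.0, 1.0, 22.5])        # e.g., age, sex, baseline severity (hypothetical)
      eeg_curve = rng.standard_normal(256)         # stand-in for a functional covariate
      weights = rng.standard_normal(3 + 4) * 0.1   # placeholder, not estimated
      print(recommend_treatment(scalars, eeg_curve, weights))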

  16. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  17. Airborne infection control in India: Baseline assessment of health facilities

    Science.gov (United States)

    Parmar, Malik M.; Sachdeva, K.S.; Rade, Kiran; Ghedia, Mayank; Bansal, Avi; Nagaraja, Sharath Burugina; Willis, Matthew D.; Misquitta, Dyson P.; Nair, Sreenivas A.; Moonan, Patrick K.; Dewan, Puneet K.

    2016-01-01

    Background Tuberculosis transmission in health care settings represents a major public health problem. In 2010, national airborne infection control (AIC) guidelines were adopted in India. These guidelines included specific policies for TB prevention and control in health care settings. However, the feasibility and effectiveness of these guidelines have not been assessed in routine practice. This study aimed to conduct baseline assessments of AIC policies and practices within a convenience sample of 35 health care settings across 3 states in India and to assess the level of implementation at each facility after one year. Method A multi-agency, multidisciplinary panel of experts performed site visits using a standardized risk assessment tool to document current practices and review resource capacity. At the conclusion of each assessment, facility-specific recommendations were provided to improve AIC performance to align with national guidelines. Result Upon initial assessment, AIC systems were found to be poorly developed and implemented. Administrative controls were not commonly practiced and many departments needed renovation to achieve minimum environmental standards. One year after the baseline assessments, there were substantial improvements in both policy and practice. Conclusion A package of capacity building and systems development that followed national guidelines substantially improved implementation of AIC policies and practice. PMID:26970461

  18. Breton Island, Louisiana Baseline (Geographic, NAD83)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Breton Island, Louisiana Baseline (Geographic, NAD83) consists of vector line data that were input into the Digital Shoreline Analysis System (DSAS) version 4.0,...

  19. NRAO Very Long Baseline Array (VLBA)

    Data.gov (United States)

    Federal Laboratory Consortium — The Very Long Baseline Array (VLBA) comprises ten radio telescopes spanning 5,351 miles. It's the world's largest, sharpest, dedicated telescope array. With an eye...

  20. Breton Island, Louisiana Baseline (Geographic, NAD83)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Breton Island, Louisiana Baseline (Geographic, NAD83) consists of vector line data that were input into the Digital Shoreline Analysis System (DSAS) version 4.0,...

  1. Hanford Site technical baseline database. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Porter, P.E.

    1995-01-27

    This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available.

  2. Registration Combining Wide and Narrow Baseline Feature Tracking Techniques for Markerless AR Systems

    Directory of Open Access Journals (Sweden)

    Bo Yang

    2009-12-01

    Full Text Available Augmented reality (AR) is a field of computer research which deals with the combination of real-world and computer-generated data. Registration is one of the most difficult problems currently limiting the usability of AR systems. In this paper, we propose a novel natural-feature-tracking-based registration method for AR applications. The proposed method has the following advantages: (1) it is simple and efficient, as no man-made markers are needed for either indoor or outdoor AR applications; moreover, it can work with arbitrary geometric shapes, including planar, near-planar and non-planar structures, which greatly enhances the usability of AR systems. (2) Thanks to the reduced-SIFT-based augmented optical flow tracker, the virtual scene can still be augmented on the specified areas even under occlusion and large changes in viewpoint during the entire process. (3) It is easy to use, because the adaptive classification-tree-based matching strategy gives fast and accurate initialization, even when the initial camera view differs from the reference image to a large degree. Experimental evaluations validate the performance of the proposed method for online pose tracking and augmentation.
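
    The registration idea rests on standard natural-feature matching; the following is an illustrative OpenCV sketch (plain SIFT matching plus a RANSAC homography for a planar target), not the paper's reduced-SIFT optical-flow tracker or adaptive classification-tree matcher. File names in the usage note are hypothetical.

      import cv2
      import numpy as np

      def estimate_pose_homography(reference_bgr, frame_bgr):
          """Match SIFT features between a reference image and the current frame and
          estimate a planar homography for augmentation. Returns None if matching fails."""
          sift = cv2.SIFT_create()
          kp1, des1 = sift.detectAndCompute(cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY), None)
          kp2, des2 = sift.detectAndCompute(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY), None)
          if des1 is None or des2 is None:
              return None

          matcher = cv2.BFMatcher(cv2.NORM_L2)
          matches = matcher.knnMatch(des1, des2, k=2)
          good = []
          for pair in matches:                       # Lowe's ratio test
              if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
                  good.append(pair[0])
          if len(good) < 10:
              return None

          src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
          dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
          return H  # virtual content drawn on the reference plane can be warped with H

      # Usage (image paths are hypothetical):
      # H = estimate_pose_homography(cv2.imread("reference.png"), cv2.imread("frame.png"))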

  3. View synthesis from wide-baseline views using occlusion aware estimation of large disparities

    Science.gov (United States)

    Elliethy, Ahmed S.; Aly, Hussein A.; Sharma, Gaurav

    2014-03-01

    Accurate disparity estimation is a key ingredient required when generating a high-fidelity novel view from a set of input views. In this paper, a high-quality disparity estimation method is proposed for view synthesis from multiple input images with large disparities and occlusions. The method optimally selects one out of three image pairs to estimate the disparity map for different regions of the novel view. The novel view is then formed using this disparity map. We introduce two novel elements: a) an enhanced visibility map that is able to segment the scene accurately near object boundaries and b) a backward unilateral and bilateral disparity estimation procedure using the Gabor transform on an expandable search window to tackle large disparities. The quality of the interpolated virtual views produced by the proposed method is assessed and compared against two prominent previously reported methods. The proposed method offers a significant improvement both in the visual quality of the interpolated views and in the peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) metrics.
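
    Once a disparity map for the virtual viewpoint is available, forming the novel view is essentially a warp of a source view along the baseline. The fragment below is a minimal backward-warping sketch for a horizontally rectified pair (the disparity sign convention and the interpolation factor alpha are assumptions), and it omits the occlusion handling and enhanced visibility map that the paper's method relies on.

      import numpy as np

      def synthesize_view(left: np.ndarray, disparity: np.ndarray, alpha: float) -> np.ndarray:
          """Backward-warp the left image toward a virtual viewpoint.

          left:      (H, W) or (H, W, C) image
          disparity: (H, W) disparity of the virtual view w.r.t. the left image (pixels)
          alpha:     0.0 = left camera, 1.0 = right camera (virtual view in between)
          """
          h, w = disparity.shape
          xs = np.arange(w)[None, :].repeat(h, axis=0)               # target x-coordinates
          src_x = np.clip(np.rint(xs + alpha * disparity).astype(int), 0, w - 1)
          rows = np.arange(h)[:, None].repeat(w, axis=1)
          return left[rows, src_x]

      # Tiny hypothetical example: constant 4-pixel disparity, halfway viewpoint
      left = np.tile(np.arange(16, dtype=float), (8, 1))
      disp = np.full((8, 16), 4.0)
      print(synthesize_view(left, disp, alpha=0.5)[0])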

  4. COMSATCOM service technical baseline strategy development approach using PPBW concept

    Science.gov (United States)

    Nguyen, Tien M.; Guillen, Andy T.

    2016-05-01

    This paper presents an innovative approach to developing a Commercial Satellite Communications (COMSATCOM) service Technical Baseline (TB) and associated Program Baseline (PB) strategy using the Portable Pool Bandwidth (PPBW) concept. The concept involves trading the purchased commercial transponders' bandwidths (BWs) with existing commercial satellites' bandwidths participating in a "designated pool bandwidth" according to agreed terms and conditions. The Space and Missile Systems Center (SMC) has been implementing the Better Buying Power (BBP 3.0) directive and recommending that System Program Offices (SPO) own the Program and Technical Baseline (PTB) [1, 2] to develop flexible acquisition strategies, achieve affordability, and increase competition. This paper defines and describes the critical PTB parameters and associated requirements that are important to the government SPO for "owning" an affordable COMSATCOM services contract using the PPBW trading concept. The paper describes a step-by-step approach to optimally perform the PPBW trading to meet DoD's and its stakeholders' (i) affordability requirements and (ii) fixed and variable bandwidth requirements, by optimizing communications performance, cost and PPBW accessibility in terms of Quality of Service (QoS), Bandwidth Sharing Ratio (BSR), Committed Information Rate (CIR), Burstable Information Rate (BIR), transponder equivalent bandwidth (TPE) and transponder Net Present Value (NPV). The affordable optimal solution that meets variable bandwidth requirements will consider the operating and trading terms and conditions described in the Fair Access Policy (FAP).

  5. Preliminary Report: Analysis of the baseline study on the prevalence of Salmonella in laying hen flocks of Gallus gallus

    DEFF Research Database (Denmark)

    Hald, Tine

    This is a preliminary report on the analysis of the Community-wide baseline study to estimate the prevalence of Salmonella in laying hen flocks. It is being published pending the full analysis of the entire dataset from the baseline study. The report contains the elements necessary for the establ...

  6. Salton Sea sampling program: baseline studies

    Energy Technology Data Exchange (ETDEWEB)

    Tullis, R.E.; Carter, J.L.; Langlois, G.W.

    1981-04-13

    Baseline data are provided on three species of fish from the Salton Sea, California. The fishes considered were the orange mouth corvina (Cynoscion xanthulus), gulf croaker (Bairdiella icistius) and sargo (Anisotremus davidsonii). Morphometric and meristic data are presented as a baseline to aid in the evaluation of any physiological stress the fish may experience as a result of geothermal development. Analyses were made on muscle, liver, and bone of the fishes sampled to provide baseline data on elemental tissue burdens. The elements measured were: As, Br, Ca, Cu, Fe, Ga, K, Mn, Ni, Pb, Rb, Se, Sr, Zn, and Zr. These data are important if an environmentally sound progression of geothermal power production is to occur at the Salton Sea.

  7. Geochemical baseline studies of soil in Finland

    Science.gov (United States)

    Pihlaja, Jouni

    2017-04-01

    Soil element concentrations vary considerably from region to region in Finland. This is mostly caused by the different bedrock types, which are reflected in the soil properties. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the current phase, the research is focusing on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts or to measure the natural state of soil in industrial areas or mine districts. The field work is done by taking soil samples, typically at a depth of 0-10 cm. Sampling sites are chosen to represent the areas most vulnerable to human exposure from potentially toxic soil element contents: playgrounds, day-care centers, schools, parks and residential areas. In the mine districts, samples are taken from areas outside those affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is freely available to all users via an internet browser. Through this map service it is possible to calculate regional soil baseline values using the geochemical data stored in the map service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).

  8. Health related baseline millennium development goals indicators for local authorities in Malawi

    OpenAIRE

    Kalanda, Boniface

    2007-01-01

    The Malawi Social Action Fund (MASAF) is implementing a 12 year programme to close service gaps in rural communities. These service gaps are primarily those in health, education, household food security, water and sanitation, transport and communications. The impact indicators of the Project are selected Millennium Development Goal indicators. MASAF conducted a baseline study of the MDG indicators for all districts in Malawi. This paper presents available health related MDG baseline indicator...

  9. Neutrino Interactions and Long-Baseline Experiments

    CERN Document Server

    Mosel, Ulrich

    2016-01-01

    The extraction of neutrino mixing parameters and the CP-violating phase requires knowledge of the neutrino energy. This energy must be reconstructed from the final state of a neutrino-nucleus reaction since all long-baseline experiments use nuclear targets. This reconstruction requires detailed knowledge of the neutrino reactions with bound nucleons and of the final state interactions of hadrons with the nuclear environment. Quantum-kinetic transport theory can be used to build an event generator for this reconstruction that takes basic nuclear properties, such as binding, into account. Some examples are discussed that show the effects of nuclear interactions on observables in long-baseline experiments

  10. Long-baseline Neutrino Oscillation at DUNE

    Science.gov (United States)

    Worcester, Elizabeth; DUNE Collaboration

    2017-01-01

    The Deep Underground Neutrino Experiment (DUNE) is a long-baseline neutrino oscillation experiment with primary physics goals of determining the neutrino mass hierarchy and measuring δCP with sufficient sensitivity to discover CP violation in neutrino oscillation. CP violation sensitivity in DUNE requires careful understanding of systematic uncertainty, with contributions expected from uncertainties in the neutrino flux, neutrino interactions, and detector effects. In this presentation, we will describe the expected sensitivity of DUNE to long-baseline neutrino oscillation parameters, how various aspects of the experimental design contribute to that sensitivity, and the planned strategy for constraining systematic uncertainty in these measurements.

  11. Development of a System-Wide Predator Control Program : Stepwise Implementation of a Predation Index Predator Control Fisheries and Evaluation Plan in the Columbia River Basin, 1990 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Nigro, Anthony A.

    1990-12-01

    The papers in this document report the results of studies to develop a Columbia River basin-wide program to control northern squawfish predation on juvenile salmonids. Our studies focus on (1) determining where in the basin northern squawfish predation is a problem, (2) conducting various fisheries for northern squawfish, and (3) testing a plan to evaluate how well fisheries are controlling northern squawfish populations. These studies were initiated as part of a basin-wide effort to reduce mortality of juvenile salmonids on their journey from natal streams to the ocean. Earlier work in the Columbia River basin suggested predation by northern squawfish on juvenile salmonids may account for most of the 10 to 20 percent mortality juvenile salmonids experience in each of eight Columbia and Snake river reservoirs. Modeling simulations based on work in John Day Reservoir from 1982--1988 indicated it is not necessary to eradicate northern squawfish to substantially reduce predation-caused mortality of juvenile salmonids. Instead, if northern squawfish were exploited at a 20 percent rate, reductions in their numbers and restructuring of their populations could reduce their predation on juvenile salmonids by 50 percent. We tested three fisheries in 1990, a tribal long-line fishery, a recreational-reward fishery, and a dam hook-and-line fishery.

  12. A baseline algorithm for face detection and tracking in video

    Science.gov (United States)

    Manohar, Vasant; Soundararajan, Padmanabhan; Korzhova, Valentina; Boonstra, Matthew; Goldgof, Dmitry; Kasturi, Rangachar

    2007-10-01

    Establishing benchmark datasets, performance metrics and baseline algorithms has considerable research significance in gauging the progress in any application domain. These primarily allow both users and developers to compare the performance of various algorithms on a common platform. In our earlier works, we focused on developing performance metrics and establishing a substantial dataset with ground truth for object detection and tracking tasks (text and face) in two video domains -- broadcast news and meetings. In this paper, we present the results of a face detection and tracking algorithm on broadcast news videos with the objective of establishing a baseline performance for this task-domain pair. The detection algorithm uses a statistical approach that was originally developed by Viola and Jones and later extended by Lienhart. The algorithm uses a Haar-like feature set and a cascade of boosted decision tree classifiers as a statistical model. In this work, we used the Intel Open Source Computer Vision Library (OpenCV) implementation of the Haar face detection algorithm. The optimal values for the tunable parameters of this implementation were found through an experimental design strategy commonly used in statistical analyses of industrial processes. Tracking was accomplished as continuous detection, with the detected objects in two frames mapped using a greedy algorithm based on the distances between the centroids of bounding boxes. Results on the evaluation set containing 50 sequences (~ 2.5 mins.) using the developed performance metrics show good performance of the algorithm, reflecting the state of the art, which makes it an appropriate choice as the baseline algorithm for the problem.
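
    Since the abstract names its concrete building blocks (OpenCV's Haar cascade detector and a greedy centroid match between consecutive frames), a minimal sketch of that pipeline follows. It illustrates the general approach only, not the authors' tuned system: the cascade file and detection parameters are the stock OpenCV defaults rather than the experimentally optimized values mentioned above.

        # Haar-cascade face detection per frame (OpenCV), with tracking done as continuous
        # detection plus a greedy pairing of bounding-box centroids between two frames.
        import math
        import cv2

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        def detect_faces(frame):
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            # return (centroid_x, centroid_y, box) for each detection
            return [(x + w / 2.0, y + h / 2.0, (x, y, w, h)) for (x, y, w, h) in boxes]

        def greedy_match(prev, curr):
            """Pair detections of consecutive frames by smallest centroid distance."""
            pairs, used_prev, used_curr = [], set(), set()
            candidates = sorted(((math.dist(p[:2], c[:2]), i, j)
                                 for i, p in enumerate(prev) for j, c in enumerate(curr)))
            for _, i, j in candidates:
                if i not in used_prev and j not in used_curr:
                    pairs.append((i, j)); used_prev.add(i); used_curr.add(j)
            return pairs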

  13. Research and Implementation of a Correction Algorithm for Wide-Angle Image Distortion

    Institute of Scientific and Technical Information of China (English)

    吴开兴; 段马丽; 张惠民; 王鹏

    2014-01-01

    In order to correct the nonlinear distortion of wide-angle images, a new digital correction method is proposed. First, a grid-template calibration is used to obtain the offsets of the distorted image points along the x- and y-axis directions, based on the mapping between corresponding pixels of the distorted and ideal images. Then, a cubic B-spline interpolation function is used to interpolate these offsets into an offset surface for the distorted pixels; from the offset surface and the coordinates of the distorted points, the coordinate transformation of each pixel is computed. Finally, an undistorted image is obtained by grey-level reconstruction with bilinear interpolation, completing the correction of the wide-angle image. To test the speed and reliability of the algorithm, it was run on a DSP platform. The experimental results show that the algorithm corrects wide-angle distorted images quickly and effectively.
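
    A minimal sketch of this correction pipeline follows, assuming the per-node offsets measured from the grid template are already available. SciPy's RectBivariateSpline is used here as a stand-in for the cubic B-spline offset surface, and OpenCV's remap performs the bilinear grey-level reconstruction; the function name and its arguments are illustrative, not taken from the paper.

        import numpy as np
        import cv2
        from scipy.interpolate import RectBivariateSpline

        def correct_image(img, node_rows, node_cols, dx_nodes, dy_nodes):
            """node_rows/node_cols: grid-node coordinates (increasing 1-D arrays);
            dx_nodes/dy_nodes: offsets from each ideal (output) pixel to its source
            location in the distorted image, sampled at the nodes, shape (rows, cols)."""
            h, w = img.shape[:2]
            sx = RectBivariateSpline(node_rows, node_cols, dx_nodes, kx=3, ky=3)
            sy = RectBivariateSpline(node_rows, node_cols, dy_nodes, kx=3, ky=3)
            rows, cols = np.arange(h), np.arange(w)
            # Dense offset surfaces, then the source coordinate of every output pixel.
            map_x = (cols[None, :] + sx(rows, cols)).astype(np.float32)
            map_y = (rows[:, None] + sy(rows, cols)).astype(np.float32)
            # Bilinear grey-level reconstruction of the undistorted image.
            return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)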

  14. Physics Potential of Long-Baseline Experiments

    Directory of Open Access Journals (Sweden)

    Sanjib Kumar Agarwalla

    2014-01-01

    The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, θ13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of non-maximal 2-3 mixing angle, causing the octant ambiguity of θ23. In this paper, I will review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I will give a brief description of the neutrino oscillation phenomenon. Then, I will discuss our present global understanding of the neutrino mass-mixing parameters and will identify the major unknowns in this sector. After that, I will present the physics reach of current generation long-baseline experiments. Finally, I will conclude with a discussion on the physics capabilities of accelerator-driven possible future long-baseline precision oscillation facilities.

  15. Geochemical modelling baseline compositions of groundwater

    DEFF Research Database (Denmark)

    Postma, Diederik Jan; Kjøller, Claus; Andersen, Martin Søgaard

    2008-01-01

    Reactive transport models, were developed to explore the evolution in groundwater chemistry along the flow path in three aquifers; the Triassic East Midland aquifer (UK), the Miocene aquifer at Valreas (F) and the Cretaceous aquifer near Aveiro (P). All three aquifers contain very old groundwaters...... of the evolution in natural baseline properties in groundwater....

  16. Solid Waste Program technical baseline description

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  17. Rationing in the presence of baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter

    2013-01-01

    We analyze a general model of rationing in which agents have baselines, in addition to claims against the (insufficient) endowment of the good to be allocated. Many real-life problems fit this general model (e.g., bankruptcy with prioritized claims, resource allocation in the public health care...

  18. National Cyberethics, Cybersafety, Cybersecurity Baseline Study

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2009

    2009-01-01

    This article presents findings from a study that explores the nature of the Cyberethics, Cybersafety, and Cybersecurity (C3) educational awareness policies, initiatives, curriculum, and practices currently taking place in the U.S. public and private K-12 educational settings. The study establishes baseline data on C3 awareness, which can be used…

  19. Development of a System-wide Predator Control Program: Stepwise Implementation of a Predation Index, Predator Control Fisheries, and Evaluation Plan in the Columbia River Basin; Northern Pikeminnow Management Program, 1998 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Franklin R.; Wachtel, Mark L.; Petersen, Marc R.

    2003-03-01

    We are reporting on the progress of the Northern Pikeminnow Sport-Reward Fishery (NPSRF) in the lower Columbia and Snake rivers for 1998. The objectives of this project were to (1) implement a sport fishery that rewards anglers who harvest northern pikeminnow Ptychocheilus oregonensis ≥ 279 mm (11 inches) total length, (2) collect catch data on selected fish species caught by fishery participants while targeting northern pikeminnow, (3) monitor and report incidental catch of sensitive salmonid species by anglers targeting northern pikeminnow, and (4) collect, monitor and report data on angler participation, catch and catch per angler day of northern pikeminnow during the season. A total of 108,903 northern pikeminnow ≥ 279 mm were harvested during the 1998 season and 21,959 angler days were spent harvesting these fish. Harvest was below the seven-year average of 150,874 and participation was well below the seven-year average of 51,013 angler days. Catch per angler day for all anglers during the season was 4.96 and exceeded the seven-year average of 2.96 northern pikeminnow per angler day. Peamouth (Mylocheilus caurinus) and white sturgeon (Acipenser transmontanus) were the other species most often harvested by returning NPSRF anglers targeting northern pikeminnow. Harvest of salmonids (Oncorhynchus spp.) by NPSRF anglers targeting northern pikeminnow remained below limits established by the National Marine Fisheries Service (NMFS).

  20. Ultrasonic Techniques for Baseline-Free Damage Detection in Structures

    Science.gov (United States)

    Dutta, Debaditya

    damage threshold and prevents the occurrence of false alarms resulting from imperfections and noise in the measurement system. The threshold computation from only the measured signals is the key behind baseline-free damage detection in plates. Chapters 3 and 4 are concerned with nonlinear ultrasonic techniques for crack detection in metallic structures. Chapter 3 describes a nonlinear guided wave technique based on the principle of super-harmonic production due to crack-induced nonlinearity. A semi-analytical method is formulated to investigate the behavior of a bilinear crack model. Upon comparing the behavior with experimental observations, it is inferred that a bilinear model can only partially capture the signal characteristics arising from a fatigue crack. A correlation between the extent of nonlinear behavior of a breathing crack and the different stages of fatigue crack growth is also made in Chapter 3. In Chapter 4, a nonlinear system identification method through coherence measurement is proposed. A popular electro-magnetic impedance circuit was used to detect acoustic nonlinearity produced by a crack. Chapters 5 and 6 comprise the final part of this thesis, where wavefield images from a scanning laser vibrometer are digitally processed to detect defects in composite structures. Once processed, the defect in the scanned surface stands out as an outlier in the background of the undamaged area. An outlier analysis algorithm is then implemented to detect and localize the damage automatically. In Chapter 5, exploratory groundwork on wavefield imaging is done by obtaining wave propagation images from specimens made of different materials and with different geometries. In Chapter 6, a hitherto unnoted phenomenon of standing wave formation in delaminated composite plates is observed and explained. Novel signal and image processing techniques are also proposed in this chapter, of which the isolation of standing waves using wavenumber-frequency domain manipulation

  1. Design and Implementation of an Adaptive Bandwidth PLL with Wide Temperature Range and Low Jitter

    Institute of Scientific and Technical Information of China (English)

    刘颖; 田泽; 邵刚; 刘敏侠

    2015-01-01

    With the development of high-speed communication systems and the continuous increase in transmission rates, the phase-locked loop (PLL), as the core circuit providing a precise clock signal, must not only generate a low-jitter, low-noise clock but also cover a wide frequency range and support multiple protocols; a fixed-bandwidth PLL cannot meet the bandwidth requirements of multiple protocols. To meet the multi-protocol requirements within a unified architecture, an adaptive-bandwidth PLL with a wide temperature range and low jitter is designed. A comparator module and the charge pump form a feedback loop that flexibly changes the charge-pump current, so that the loop bandwidth is adaptively adjusted during locking at different frequencies. An improved duty-cycle corrector, voltage-controlled oscillator and charge-pump circuit are also adopted to reduce the PLL noise. The chip is fabricated in a 0.13 μm CMOS process. Measurement results show an output frequency of 1.0625-3 GHz, a data-rate coverage of 1.0625-5.9 Gbps, RJ < 1.3 ps and an operating temperature range of -55 to 125 °C, meeting the requirements of the FC-PI-4, PCIe 1.1 and RapidIO 1.3 protocols; the PLL has been successfully applied in several high-speed SerDes chips.

  2. CASA Uno GPS orbit and baseline experiments

    Science.gov (United States)

    Schutz, B. E.; Ho, C. S.; Abusali, P. A. M.; Tapley, B. D.

    1990-01-01

    CASA Uno data from sites distributed in longitude from Australia to Europe have been used to determine orbits of the GPS satellites. The characteristics of the orbits determined from double difference phase have been evaluated through comparisons of two-week solutions with one-week solutions and by comparisons of predicted and estimated orbits. Evidence of unmodeled effects is demonstrated, particularly associated with the orbit planes that experience solar eclipse. The orbit accuracy has been assessed through the repeatability of unconstrained estimated baseline vectors ranging from 245 km to 5400 km. Both the baseline repeatability and the comparison with independent space geodetic methods give results at the level of 1-2 parts in 100,000,000. In addition, the Mojave/Owens Valley (245 km) and Kokee Park/Ft. Davis (5409 km) estimates agree with VLBI and SLR to better than 1 part in 100,000,000.

  3. Joint Multi-baseline SAR Interferometry

    Directory of Open Access Journals (Sweden)

    S. Tebaldini

    2005-12-01

    We propose a technique to provide interferometry by combining multiple images of the same area. This technique differs from the multi-baseline approach in the literature in that (a) it exploits all the images simultaneously, (b) it performs a spectral shift preprocessing to remove most of the decorrelation, and (c) it exploits distributed targets. The technique is mainly intended for DEM generation at centimetric accuracy, as well as for differential interferometry. The problem is framed in the context of single-input multiple-output (SIMO) channel estimation via the cross-relations (CR) technique, and the resulting algorithm provides significant improvements with respect to conventional approaches based either on independent analysis of single interferograms or on multi-baseline phase analysis of single pixels in the current literature, for those targets that are correlated in all the images, such as long-term coherent areas, or for acquisitions taken with a short revisit time (as those gathered with future satellite constellations).

  4. Dissipative Effect in Long Baseline Neutrino Experiments

    CERN Document Server

    Oliveira, Roberto L N

    2016-01-01

    The propagation of neutrinos in long-baseline experiments may be influenced by dissipation effects. Using the Lindblad master equation, we evolve neutrinos taking these dissipative effects into account. The MSW and dissipative effects may change the behavior of the oscillation probabilities. In this work, we show and explain how the behavior of the probabilities can change when the decoherence and relaxation effects act individually together with the MSW effect. A new exotic peak appears in this case, and we show the difference between the decoherence and relaxation effects in the appearance of this peak. We also adapt the usual approximate expressions for the survival and appearance probabilities to include all possible decoherence effects. Assuming the DUNE baseline, we show how each decoherence parameter changes the probabilities, analyzing the possible modifications with both numeric and analytic approaches.
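
    As a toy illustration of how a Lindblad-type term damps the oscillatory part of a probability, the two-flavour sketch below multiplies the oscillating term of the standard vacuum survival probability by an exponential decoherence factor. This is only a schematic reduction of the full three-flavour, matter-affected treatment discussed above; the decoherence parameter is expressed in 1/km purely for convenience and all numbers are illustrative.

        import numpy as np

        def p_survival(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=0.97, gamma_per_km=0.0):
            """Two-flavour muon-neutrino survival probability with decoherence damping."""
            phase = 2.0 * 1.267 * dm2_eV2 * L_km / E_GeV    # Delta m^2 L / (2E), natural units
            damping = np.exp(-gamma_per_km * L_km)          # Lindblad-type exponential damping
            return 1.0 - 0.5 * sin2_2theta * (1.0 - damping * np.cos(phase))

        # Undamped vs. damped oscillation at a DUNE-like baseline of 1300 km.
        for E in (1.0, 2.5, 4.0):
            print(E, p_survival(1300.0, E), p_survival(1300.0, E, gamma_per_km=1e-3))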

  6. Systematic errors in long baseline oscillation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Deborah A.; /Fermilab

    2006-02-01

    This article gives a brief overview of long baseline neutrino experiments and their goals, and then describes the different kinds of systematic errors that are encountered in these experiments. Particular attention is paid to the uncertainties that come about because of imperfect knowledge of neutrino cross sections and more generally how neutrinos interact in nuclei. Near detectors are planned for most of these experiments, and the extent to which certain uncertainties can be reduced by the presence of near detectors is also discussed.

  7. Geochemical baseline data, Youngs Bay, Oregon, 1974

    Energy Technology Data Exchange (ETDEWEB)

    McMechan, K.J. (ed.); Johnson, V.G.; Cutshall, N.H.

    1975-04-01

    This report comprises one part of a final report to the Alumax Pacific Aluminum Corporation on the "Physical, Chemical and Biological Studies of Youngs Bay". The data reported herein are the product of the geochemical baseline section of the project. The primary objectives of the geochemical study were: to provide a baseline record of fluoride and selected trace metal levels in Youngs Bay bottom sediment, to identify areas that might function as heavy metal traps, and to attempt to determine the recent depositional history of sediment in the bay. In addition to these primary objectives, a number of secondary tasks were undertaken during the study. While time did not allow these additional studies to be carried to completion, preliminary results are included herein because of their potential usefulness in assessing the impact of environmental releases of fluoride to aquatic systems in the vicinity of Youngs Bay or elsewhere. This report is made up of two major sections. In the first, a description of sample collection and analytical procedures is followed by a discussion of the baseline results. Obvious vertical and horizontal patterns of elemental distribution are identified and their origins considered. Problems needing further research are also discussed. In the second section, the data are presented in interpretive, graphical form, as well as in tables. 35 refs., 29 figs., 14 tabs.

  8. Multiple mechanisms account for variation in base-line sensitivity to azole fungicides in field isolates of Mycosphaerella graminicola

    NARCIS (Netherlands)

    Stergiopoulos, I.; Nistelrooy, van J.G.M.; Kema, G.H.J.; Waard, de M.A.

    2003-01-01

    Molecular mechanisms that account for variation in base-line sensitivity to azole fungicides were examined in a collection of twenty field isolates, collected in France and Germany, of the wheat pathogen Mycosphaerella graminicola (Fuckel) Schroeter. The isolates tested represent the wide base-line

  9. Gravity sensing with Very Long Baseline Atom Interferometry

    Science.gov (United States)

    Schlippert, Dennis; Albers, Henning; Richardson, Logan L.; Nath, Dipankar; Meiners, Christian; Wodey, Étienne; Schubert, Christian; Ertmer, Wolfgang; Rasel, Ernst M.

    2016-04-01

    Very Long Baseline Atom Interferometry (VLBAI) represents a new class of atom optics experiments with applications in high-accuracy absolute gravimetry, gravity gradiometry, and tests of fundamental physics. Extending the baseline of atomic gravimeters from tens of centimeters to several meters opens the route towards competition with superconducting gravimeters. The VLBAI test stand will consist of a 10 m baseline atom interferometer allowing for free-fall times on the order of seconds, which will be implemented at the Hannover Institut für Technologie (HITec) of the Leibniz Universität Hannover. In order to suppress environmental noise, the facility utilizes a state-of-the-art vibration isolation platform and a three-layer magnetic shield. We envisage a resolution of the local gravitational acceleration of 5 x 10^-10 m/s^2 with an inaccuracy < 10^-9 m/s^2. Operation as a gravity gradiometer will allow the first-order gravity gradient to be resolved with a resolution of 5 x 10^-10 s^-2. The operation of VLBAI as a differential dual-species gravimeter using ultracold mixtures of ytterbium and rubidium atoms enables quantum tests of the universality of free fall (UFF) at an unprecedented level [1], with the potential to surpass the accuracy of the best experiments to date [2]. We report on the first quantum test of the UFF using two different chemical elements, 39K and 87Rb [3], reaching a 100 ppb inaccuracy, and show the potential of UFF tests in VLBAI at an inaccuracy of 10^-13 and beyond. References: [1] J. Hartwig et al., New J. Phys. 17, 035011 (2015); [2] S. Schlamminger et al., Phys. Rev. Lett. 100, 041101 (2008); [3] D. Schlippert et al., Phys. Rev. Lett. 112, 203002 (2014).

  10. Model Order Selection in Multi-baseline Interferometric Radar Systems

    Directory of Open Access Journals (Sweden)

    Fulvio Gini

    2005-12-01

    Synthetic aperture radar interferometry (InSAR) is a powerful technique to derive three-dimensional terrain images. Interest is growing in exploiting the advanced multi-baseline mode of InSAR to solve layover effects from complex orography, which generate reception of unexpected multicomponent signals that degrade imagery of both terrain radar reflectivity and height. This work addresses a few problems related to the implementation into interferometric processing of nonlinear algorithms for estimating the number of signal components, including a system trade-off analysis. The performance of various eigenvalue-based information-theoretic criteria (ITC) algorithms is numerically investigated under some realistic conditions. In particular, speckle effects from surface and volume scattering are taken into account as multiplicative noise in the signal model. Robustness to leakage of signal power into the noise eigenvalues and operation with a small number of looks are investigated. The issue of baseline optimization for detection is also addressed. The use of diagonally loaded ITC methods is then proposed as a tool for robust operation in the presence of speckle decorrelation. Finally, case studies of a nonuniform array are presented and recommendations for a proper combination of ITC methods and system configuration are given.
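
    The eigenvalue-based ITC criteria referred to above are typically of the Wax-Kailath AIC/MDL form; the short sketch below shows that selection rule in its basic version, without the diagonal-loading robustification discussed in the paper. The eigenvalues and look count in the example are made up.

        import numpy as np

        def itc_order(lam, n_looks, rule="MDL"):
            """Estimate the number of signal components from sample-covariance eigenvalues."""
            lam = np.sort(np.asarray(lam, dtype=float))[::-1]
            p = lam.size
            scores = []
            for k in range(p):                                  # candidate number of signals
                tail = lam[k:]                                  # presumed noise eigenvalues
                ratio = np.exp(np.mean(np.log(tail))) / np.mean(tail)  # geometric/arithmetic mean
                loglik = -n_looks * (p - k) * np.log(ratio)
                if rule == "AIC":
                    scores.append(2.0 * loglik + 2.0 * k * (2 * p - k))
                else:                                           # MDL
                    scores.append(loglik + 0.5 * k * (2 * p - k) * np.log(n_looks))
            return int(np.argmin(scores))

        print(itc_order([5.1, 2.3, 0.31, 0.30, 0.29, 0.28], n_looks=32))   # -> 2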

  11. Impact of selected energy conservation technologies on baseline demands

    Energy Technology Data Exchange (ETDEWEB)

    Doernberg, A

    1977-09-01

    This study is an application of the modeling and demand projection capability existing at Brookhaven National Laboratory to specific options in energy conservation. Baseline energy demands are modified by introducing successively three sets of conservation options. The implementation of improved building standards and the use of co-generation in industry are analyzed in detail and constitute the body of this report. Two further sets of energy demands are presented that complete the view of a low energy use, "conservation" scenario. An introduction to the report covers the complexities in evaluating "conservation" in view of the ways it is inextricably linked to technology, prices, policy, and the mix of output in the economy. The term as used in this report is narrowly defined, and methodologies are suggested by which these other aspects listed can be studied in the future.

  12. Laser-Ranging Long Baseline Differential Atom Interferometers for Space

    CERN Document Server

    Chiow, Sheng-wey; Yu, Nan

    2015-01-01

    High sensitivity differential atom interferometers are promising for precision measurements in science frontiers in space, including gravity field mapping for Earth science studies and gravitational wave detection. We propose a new configuration of twin atom interferometers connected by a laser ranging interferometer (LRI-AI) to provide precise information of the displacements between the two AI reference mirrors and a means to phase-lock the two independent interferometer lasers over long distances, thereby further enhancing the feasibility of long baseline differential atom interferometers. We show that a properly implemented LRI-AI can achieve equivalent functionality to the conventional differential atom interferometer measurement system. LRI-AI isolates the laser requirements for atom interferometers and for optical phase readout between distant locations, thus enabling optimized allocation of available laser power within a limited physical size and resource budget. A unique aspect of LRI-AI also enables...

  13. Adapting a Cryogenic Sapphire Oscillator for Very Long Baseline Interferometry

    CERN Document Server

    Doeleman, Sheperd; Rogers, Alan; Hartnett, John; Tobar, Michael; Nand, Nitin; 10.1086/660156

    2011-01-01

    Extension of very long baseline interferometry (VLBI) to observing wavelengths shorter than 1.3 mm provides exceptional angular resolution (~20 micro arcsec) and access to new spectral regimes for the study of astrophysical phenomena. To maintain phase coherence across a global VLBI array at these wavelengths requires that ultrastable frequency references be used for the heterodyne receivers at all participating telescopes. Hydrogen masers have traditionally been used as VLBI references, but atmospheric turbulence typically limits (sub)millimeter VLBI coherence times to ~1-30 s. Cryogenic Sapphire Oscillators (CSO) have better stability than hydrogen masers on these time scales and are potential alternatives to masers as VLBI references. Here, we describe the design, implementation and tests of a system to produce a 10 MHz VLBI frequency standard from the microwave (11.2 GHz) output of a CSO. To improve the long-term stability of the new reference, the CSO was locked to the timing signal from the Global Positionin...

  14. Comparison of Small Baseline Interferometric SAR Processors for Estimating Ground Deformation

    Directory of Open Access Journals (Sweden)

    Wenyu Gong

    2016-04-01

    The Small Baseline Synthetic Aperture Radar (SAR) Interferometry (SBI) technique has been widely and successfully applied in various ground deformation monitoring applications. Over the last decade, a variety of SBI algorithms have been developed based on the same fundamental concepts. Recently developed SBI toolboxes provide an open environment for researchers to apply different SBI methods for various purposes. However, there has been no thorough discussion comparing the particular characteristics of different SBI methods and their corresponding performance in ground deformation reconstruction. Thus, two SBI toolboxes that implement a total of four SBI algorithms were selected for comparison. This study discusses and summarizes the main differences, pros and cons of these four SBI implementations, which could help users to choose a suitable SBI method for their specific application. The study focuses on exploring the suitability of each SBI module under various data set conditions, including a small/large number of interferograms, the presence or absence of larger time gaps, urban/vegetation ground coverage, and temporally regular/irregular ground displacement with multiple spatial scales. Within this paper we discuss the corresponding theoretical background of each SBI method. We present a performance analysis of these SBI modules based on two real data sets characterized by different environmental and surface deformation conditions. The study shows that all four SBI processors are capable of generating similar ground deformation results when the data set has sufficient temporal sampling and a stable ground backscatter mechanism such as an urban area. Strengths and limitations of the different SBI processors were analyzed based on data set configuration and environmental conditions and are summarized in this paper to guide future users of SBI techniques.

  15. THE IMPACT OF EX-ANTE VERSUS EX-POST CDM BASELINES ON A MONOPOLY FIRM

    Energy Technology Data Exchange (ETDEWEB)

    Jiro Akita [Graduate School of Economics and Management, Tohoku University, Kawauchi, Aoba-Ku, Sendai (Japan); Haruo Imai [Kyoto Institute of Economic Research, Kyoto University, Yoshida, Sakyo-Ku, Kyoto (Japan); Hidenori Niizawa [School of Economics, University of Hyogo, Kobe (Japan)]

    2008-09-30

    CDM baseline-setting methods may be broadly classified into ex ante methods and ex post methods. An ex post baseline takes into consideration information that becomes available ex post facto, that is, after the CDM project is implemented, in addition to data available prior to the project. Incorporating ex post information, however, inadvertently runs the risk of distorting the incentives of project participants. When the output scale is endogenously determined, an ex post baseline tends to boost output. We show that this may increase total emissions despite the reduction in emissions per unit of output. With an ex ante baseline, output is suppressed, bringing about the benefit of reduced total emissions. But lower output implies reduced consumer and producer surplus. We show that total social welfare may actually deteriorate because of the CDM.

  16. Optimization of the CLIC Baseline Collimation System

    Energy Technology Data Exchange (ETDEWEB)

    Resta-Lopez, Javier; /Oxford U., JAI; Angal-Kalinin, Deepa; /Daresbury; Fernandez-Hernando, Juan; /Daresbury; Jackson, Frank; /Daresbury; Dalena, Barbara; /CERN; Schulte, Daniel; /CERN; Tomas, Rogelio; /CERN; Seryi, Andrei; /SLAC

    2012-07-06

    Important efforts have recently been dedicated to the improvement of the design of the baseline collimation system of the Compact Linear Collider (CLIC). Different aspects of the design have been optimized: the transverse collimation depths have been recalculated in order to reduce the collimator wakefield effects while maintaining a good efficiency in cleaning the undesired beam halo; the geometric design of the spoilers has also been reviewed to minimize wakefields; in addition, the optics design has been polished to improve the collimation efficiency. This paper describes the current status of the CLIC collimation system after this optimization.

  17. Does Baseline Heart Rate Variability Reflect Stable Positive Emotionality?

    Science.gov (United States)

    Silvia, Paul J; Jackson, Bryonna A; Sopko, Rachel S

    2014-11-01

    Several recent studies have found significant correlations, medium in effect size, between baseline heart rate variability (HRV) and measures of positive functioning, such as extraversion, agreeableness, and trait positive affectivity. Other research, however, has suggested an optimal level of HRV and found nonlinear effects. In the present study, a diverse sample of 239 young adults completed a wide range of measures that reflect positive psychological functioning, including personality traits, an array of positive emotions (measured with the Dispositional Positive Emotions Scale), and depression, anxiety, and stress symptoms (measured with the DASS and CESD). HRV was measured with a 6-minute baseline period and quantified using many common HRV metrics (e.g., respiratory sinus arrhythmia, root mean square of successive differences, and others), and potentially confounding behavioral and lifestyle variables (e.g., BMI, caffeine and nicotine use, sleep quality) were assessed. Neither linear nor non-linear effects were found, and the effect sizes were small and near zero. The findings suggest that the cross-sectional relationship between HRV and positive experience deserves more attention and meta-analytic synthesis.
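
    For readers unfamiliar with the HRV metrics named above, the short sketch below computes two common time-domain quantities, SDNN and the root mean square of successive differences (RMSSD), from a series of inter-beat (RR) intervals; the interval values are illustrative placeholders, not study data.

        import numpy as np

        rr_ms = np.array([812, 845, 830, 861, 855, 840, 870, 825], dtype=float)  # RR intervals in ms

        sdnn = np.std(rr_ms, ddof=1)                   # overall beat-to-beat variability
        rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))  # root mean square of successive differences

        print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")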

  18. Direct coal liquefaction baseline design and system analysis

    Energy Technology Data Exchange (ETDEWEB)

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  19. Direct coal liquefaction baseline design and system analysis

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  20. Waste Assessment Baseline for the IPOC Second Floor, West Wing

    Energy Technology Data Exchange (ETDEWEB)

    McCord, Samuel A [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Waste Management and Pollution Prevention

    2015-04-01

    Following a building-wide waste assessment in September 2014, and a subsequent presentation to Sandia leadership regarding the goal of Zero Waste by 2025, the occupants of the IPOC Second Floor, West Wing contacted the Materials Sustainability and Pollution Prevention (MSP2) team to guide them to Zero Waste in advance of the rest of the site. The occupants are from Center 3600, Public Relations and Communications, and Center 800, Independent Audit, Ethics and Business Conduct. To accomplish this, MSP2 conducted a new limited waste assessment from March 2-6, 2015 to compare the second floor, west wing to the building as a whole. The assessment also serves as a baseline with which to mark improvements in diversion in approximately six months.

  1. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  2. Vitamin D production after UVB exposure depends on baseline vitamin D and total cholesterol but not on skin pigmentation

    DEFF Research Database (Denmark)

    Bogh, Morten Huus; Schmedes, Anne V; Philipsen, Peter Alshede

    2010-01-01

    cholesterol on 25(OH)D production after UVB exposure, 182 persons were screened for 25(OH)D level. A total of 50 participants with a wide range in baseline 25(OH)D levels were selected to define the importance of baseline 25(OH)D level. Of these, 28 non-sun worshippers with limited past sun exposure were used...... was measured at baseline. The increase in 25(OH)D level after UVB exposure was negatively correlated with baseline 25(OH)D level (P...

  3. Physics Potential of Long-Baseline Experiments

    CERN Document Server

    Agarwalla, Sanjib Kumar

    2014-01-01

    The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, theta13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of non-maximal 2-3 mixing angle, causing the octant ambiguity of theta23. In this paper, I will review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I will give a brief description of neutrino oscillation phenomeno...

  4. Baseline and benchmark model development for hotels

    Science.gov (United States)

    Hooks, Edward T., Jr.

    The hotel industry currently faces rising energy costs and requires the tools to maximize energy efficiency. In order to achieve this goal, a clear definition is given of the current methods used to measure and monitor energy consumption. The main purpose is to uncover the limitations of the most commonly practiced analysis strategies and to present methods that can potentially overcome those limitations. The techniques presented can be used for measurement and verification of energy efficiency plans and retrofits. Modern energy modeling tools are also introduced to demonstrate how they can be utilized for benchmark and baseline models, which provides the ability to obtain energy-saving recommendations and parametric analyses to explore energy-saving potential. These same energy models can be used in design decisions for new construction. An energy model is created of a resort-style hotel that is over one million square feet and has over one thousand rooms, and a simulation and detailed analysis is performed on a hotel room. The planning process for creating the model and acquiring data from the hotel room to calibrate and verify the simulation is explained, along with how this type of modeling can potentially benefit future baseline and benchmarking strategies for the hotel industry. Ultimately, the conclusion addresses some common obstacles the hotel industry faces in reaching its full potential of energy efficiency and how these techniques can best serve it.

  5. Pilot implementation

    DEFF Research Database (Denmark)

    Hertzum, Morten; Bansler, Jørgen P.; Havn, Erling C.;

    2012-01-01

    implementation and provide three empirical illustrations of our model. We conclude that pilot implementation has much merit as an ISD technique when system performance is contingent on context. But we also warn developers that, despite their seductive conceptual simplicity, pilot implementations can be difficult...

  6. Rapid ambiguity resolution over medium-to-long baselines based on GPS/BDS multi-frequency observables

    Science.gov (United States)

    Gong, Xiaopeng; Lou, Yidong; Liu, Wanke; Zheng, Fu; Gu, Shengfeng; Wang, Hua

    2017-02-01

    Medium-to-long baseline RTK positioning generally needs a long initialization time to find an accurate position due to non-negligible atmospheric delay residuals. In order to shorten the initialization or re-convergence time, a rapid phase ambiguity resolution method based on GPS/BDS multi-frequency observables is employed in this paper. The method is realized in two steps. First, double-differenced un-combined observables (i.e., L1/L2 and B1/B2/B3 observables) are used to obtain a float solution, with the atmospheric delay estimated as a random-walk parameter in a Kalman filter. This model enables an easy and consistent implementation for different systems and different frequency observables and can readily be extended to use more satellite navigation systems (e.g., Galileo, QZSS). Additional prior constraints on atmospheric information can be quickly added as well, because the atmospheric delay is parameterized. Second, in order to fix ambiguities rapidly and reliably, the ambiguities are divided into three types (extra-wide-lane (EWL), wide-lane (WL) and narrow-lane (NL)) according to their wavelengths and are fixed sequentially using the LAMBDA method. Several baselines ranging from 61 km to 232 km, collected by Trimble and Panda receivers, are used to validate the method. The results illustrate that it takes only approximately 1, 2 and 6 epochs (30 s intervals) to fix the EWL, WL and NL ambiguities, respectively. More epochs of observables are needed to fix the WL and NL ambiguities around local time 14:00 than at other times, mainly due to higher ionospheric activity. As for the re-convergence time, the simulated results show that 90% of epochs can be fixed within 2 epochs by using prior atmospheric delay information obtained from the previous 5 min. Finally, as for positioning accuracy, meter-, decimeter- and centimeter-level positioning results are obtained according to the different ambiguity resolution levels, i.e., EWL, WL and NL fixed solutions.
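
    The cascaded EWL/WL/NL strategy works because the linear phase combinations have very different effective wavelengths, so the longest ones can be fixed almost instantly. The short sketch below computes a few such combination wavelengths from the nominal GPS L1/L2 and BDS-2 B1/B2/B3 carrier frequencies; the specific combinations shown are common textbook choices and not necessarily the exact ones used in the paper.

        C = 299_792_458.0  # speed of light, m/s

        freqs_hz = {
            "GPS_L1": 1575.42e6, "GPS_L2": 1227.60e6,
            "BDS_B1": 1561.098e6, "BDS_B2": 1207.14e6, "BDS_B3": 1268.52e6,
        }

        def combo_wavelength(i, j, fa, fb):
            """Wavelength of the carrier-phase combination i*phi_a + j*phi_b (in cycles)."""
            return C / (i * fa + j * fb)

        print("GPS wide-lane   L1-L2:", combo_wavelength(1, -1, freqs_hz["GPS_L1"], freqs_hz["GPS_L2"]))  # ~0.86 m
        print("BDS extra-wide  B3-B2:", combo_wavelength(1, -1, freqs_hz["BDS_B3"], freqs_hz["BDS_B2"]))  # ~4.88 m
        print("BDS wide-lane   B1-B2:", combo_wavelength(1, -1, freqs_hz["BDS_B1"], freqs_hz["BDS_B2"]))  # ~0.85 m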

  7. The OPERA long baseline neutrino oscillation experiment

    Science.gov (United States)

    Wilquet, G.

    2008-05-01

    OPERA is a long-baseline neutrino oscillation experiment designed to observe the appearance of ντ in a pure νμ beam in the parameter space indicated by the atmospheric neutrino oscillation signal. The detector is situated in the underground LNGS laboratory, under 3800 meters of water equivalent, at a distance of 730 km from CERN, where the CNGS neutrino beam to which it is exposed originates. It consists of two identical 0.68 kiloton lead/nuclear emulsion targets, each instrumented with a tracking device and complemented by a muon spectrometer. The concept and the status of the detector are described, and the first results obtained with cosmic rays and during two weeks of beam commissioning in 2006 are reported.

  8. In-Space Manufacturing Baseline Property Development

    Science.gov (United States)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Because of the limited availability of crew time, sample sizes are restricted, which in turn limits the application of traditional design-allowables approaches to developing a material property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. This study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.

  9. Pentek concrete scabbling system: Baseline report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-31

    The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE®, SQUIRREL®-I, and SQUIRREL®-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten-carbide-tipped bits. The bits are designed to remove concrete in 3/8 inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation during the human factors assessment focused on two main areas: noise and dust.

  10. Steganography Based on Baseline Sequential JPEG Compression

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Information hiding in Joint Photographic Experts Group (JPEG) compressed images is investigated in this paper. Quantization is the source of information loss in the JPEG compression process. Therefore, information hidden in images is likely to be destroyed by JPEG compression. This paper presents an algorithm to reliably embed information into JPEG bit streams in the process of JPEG encoding. Information extraction is performed in the process of JPEG decoding. The basic idea of our algorithm is to modify the quantized direct current (DC) coefficients and non-zero alternating current (AC) coefficients to represent one bit of information (0 or 1). Experimental results on gray images using baseline sequential JPEG encoding show that the cover images (images without secret information) and the stego-images (images with secret information) are perceptually indiscernible.
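
    A minimal sketch of this embedding principle follows: after the block-DCT and quantization stages of a baseline sequential encoder, the least-significant bit of the DC coefficient and of each non-zero AC coefficient is overwritten with one message bit. This is an illustrative re-implementation of the idea, not the authors' encoder; a real implementation would also guard against turning a +/-1 AC coefficient into zero and would entropy-code the modified coefficients afterwards.

        import numpy as np
        from scipy.fftpack import dct

        Q50 = np.array([  # standard JPEG luminance quantization table (quality 50)
            [16, 11, 10, 16, 24, 40, 51, 61], [12, 12, 14, 19, 26, 58, 60, 55],
            [14, 13, 16, 24, 40, 57, 69, 56], [14, 17, 22, 29, 51, 87, 80, 62],
            [18, 22, 37, 56, 68, 109, 103, 77], [24, 35, 55, 64, 81, 104, 113, 92],
            [49, 64, 78, 87, 103, 121, 120, 101], [72, 92, 95, 98, 112, 100, 103, 99]])

        def embed_block(block_8x8, bits):
            """Quantize one 8x8 pixel block and overwrite the LSB of the DC coefficient and
            of every non-zero AC coefficient with bits popped from the front of `bits`."""
            coeffs = dct(dct(block_8x8 - 128.0, axis=0, norm="ortho"), axis=1, norm="ortho")
            q = np.round(coeffs / Q50).astype(int)
            for idx in np.ndindex(8, 8):
                if not bits:
                    break
                if idx == (0, 0) or q[idx] != 0:            # DC, or non-zero AC coefficient
                    q[idx] = (q[idx] & ~1) | bits.pop(0)    # force LSB to the message bit
            return q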

  11. Intensity interferometry: Optical imaging with kilometer baselines

    CERN Document Server

    Dravins, Dainis

    2016-01-01

    Optical imaging with microarcsecond resolution will reveal details across and outside stellar surfaces but requires kilometer-scale interferometers, challenging to realize either on the ground or in space. Intensity interferometry, electronically connecting independent telescopes, has a noise budget that relates to the electronic time resolution, circumventing issues of atmospheric turbulence. Extents up to a few km are becoming realistic with arrays of optical air Cherenkov telescopes (primarily erected for gamma-ray studies), enabling an optical equivalent of radio interferometer arrays. Pioneered by Hanbury Brown and Twiss, digital versions of the technique have now been demonstrated, reconstructing diffraction-limited images from laboratory measurements over hundreds of optical baselines. This review outlines the method from its beginnings, describes current experiments, and sketches prospects for future observations.

  12. Long-Baseline Neutrino Oscillation Experiments

    CERN Document Server

    Sousa, Alexandre

    2011-01-01

    During the past decade, long-baseline neutrino experiments played a fundamental role in confirming neutrino flavor change and in measuring the neutrino mixing matrix with high precision. This role will be amplified with the next generation of experiments, which will begin probing the possibility of CP violation in the leptonic sector and possibly pin down the neutrino mass hierarchy. An account of the most recent results from the MINOS experiment is presented, along with the earlier measurement from the K2K experiment. The next generation projects, T2K and NOvA, are described and their current status, schedule and physics reach discussed. Finally, we report on future efforts, currently in the R&D stage, such as the LBNE and T2KK projects.

  13. The WITCH Model. Structure, Baseline, Solutions.

    Energy Technology Data Exchange (ETDEWEB)

    Bosetti, V.; Massetti, E.; Tavoni, M.

    2007-07-01

    WITCH - World Induced Technical Change Hybrid - is a regionally disaggregated hard-link hybrid global model with a neoclassical optimal growth structure (top-down) and energy input detail (bottom-up). The model endogenously accounts for technological change, both through learning curves affecting the prices of new vintages of capital and through R and D investments. The model features the main economic and environmental policies in each world region as the outcome of a dynamic game. WITCH belongs to the class of Integrated Assessment Models as it possesses a climate module that feeds climate change back into the economy. In this paper we provide a thorough discussion of the model structure and baseline projections. We report detailed information on the evolution of energy demand, technology and CO2 emissions. Finally, we explicitly quantify the role of free riding in determining the emissions scenarios.

  14. Octant degeneracy, CPV phase at Long Baseline ν experiments

    CERN Document Server

    Bora, Kalpana; Dutta, Debajyoti

    2015-01-01

    In a recent work by two of us, we studied how the CP violation discovery potential can be improved at long-baseline neutrino experiments (LBNE/DUNE) by combining them with their near detector (ND) and reactor experiments. In this work, we discuss how that study can be extended to resolve the entanglement between the quadrant of the CPV phase and the octant of the atmospheric mixing angle θ23 at LBNEs. The study is done for both NH (normal hierarchy) and IH (inverted hierarchy). We further show how leptogenesis can enhance the resolution of this entanglement. A detailed analytic and numerical study of baryogenesis through leptogenesis is performed in this framework in a model-independent way. We then compare our result for the baryon-to-photon ratio with the current observational data on the baryon asymmetry.

  15. Zoning: focused support: a trust wide implementation project.

    Science.gov (United States)

    Gamble, C; Dodd, G; Grellier, J; Hever, M; O'Conner, C; Clarke, T; Chipere, R; Mellor, M; Ness, M

    2010-02-01

    Applying pragmatic risk management procedures to facilitate the sharing of clinical knowledge in and across mental health teams. Zoning: focused support is a pragmatic risk management support procedure that enhances adherence to operational policies, provides a forum in which staff can receive support, and visually facilitates the sharing of clinical knowledge. This paper presents a 3-year multi-method management project that sought to introduce zoning principles into all teams of an inner-city Mental Health NHS Trust. Findings indicate that, by changing the language and culture of the organization, a positive attitudinal shift has occurred in how the approach is perceived. It is considered to be of value to staff, service users and their families, and 73% of teams are now using it routinely.

  16. World wide IFC phosphoric acid fuel cell implementation

    Energy Technology Data Exchange (ETDEWEB)

    King, J.M. Jr

    1996-04-01

    International Fuel Cells, a subsidiary of United Technologies Corporation, is engaged in research and development of all types of fuel cell technologies and currently manufactures alkaline fuel cell power plants for the U.S. manned space flight program and natural-gas-fueled stationary power plants using phosphoric acid fuel cells. This paper describes the phosphoric acid fuel cell power plants.

  17. Design and implementation of a hospital wide waveform capture system.

    Science.gov (United States)

    Blum, James M; Joo, Heyon; Lee, Henry; Saeed, Mohammed

    2015-06-01

    The use of telemetry and invasive monitoring is exceptionally common in modern healthcare. To date, the vast majority of this information is not stored for more than a brief duration on the local monitor, which precludes extensive investigation into waveform data. We describe a system to collect such data in a quaternary care facility. Using standardized "packet sniffing" technology along with routine manual documentation, we reverse engineered the Unity network protocol used to transmit waveform data across the University of Michigan mission-critical monitor network. Data were subsequently captured using proprietary software writing waveform data to local disks. Nightly, these data are post-processed using data from the admit-discharge-transfer system into individual patient waveforms for the day, regardless of location. Over a 10-month period, 2,785 individual patients had a total of 65,112 waveforms captured: 15,978 from the operating rooms and 49,134 from the ICUs. The average OR case collected over 11 MB of data, and the average single-day data collection consisted of 8.6 GB of data. Hospital-wide waveform data collection is possible using internally developed software, enabling research on waveform data with minimal technical burden. Further research is required to determine the long-term storage and processing of such data.

  18. Tightly coupled long baseline/ultra-short baseline integrated navigation system

    Science.gov (United States)

    Batista, Pedro; Silvestre, Carlos; Oliveira, Paulo

    2016-06-01

    This paper proposes a novel integrated navigation filter based on a combined long baseline/ultra-short baseline acoustic positioning system with application to underwater vehicles. With a tightly coupled structure, the position, linear velocity, attitude, and rate gyro bias are estimated, considering the full nonlinear system dynamics without resorting to any algebraic inversion or linearisation techniques. The resulting solution ensures convergence of the estimation error to zero for all initial conditions, exponentially fast. Finally, it is shown in a simulation environment that the filter achieves very good performance in the presence of sensor noise.

  19. Biological baseline data Youngs Bay, Oregon, 1974

    Energy Technology Data Exchange (ETDEWEB)

    McMechan, K.J. (ed.); Higley, D.L.; Holton, R.L.

    1975-04-01

    This report presents biological baseline information gathered during the research project "Physical, Chemical and Biological Studies on Youngs Bay." Youngs Bay is a shallow embayment located on the south shore of the Columbia River, near Astoria, Oregon. Research on Youngs Bay was motivated by the proposed construction by Alumax Pacific Aluminum Corporation of an aluminum reduction plant at Warrenton, Oregon. The research was designed to provide biological baseline information on Youngs Bay in anticipation of potential harmful effects from plant effluents. The information collected concerns the kinds of animals found in the Youngs Bay area, and their distribution and seasonal patterns of abundance. In addition, information was collected on the feeding habits of selected fish species, and on the life history and behavioral characteristics of the most abundant benthic amphipod, Corophium salmonis. Sampling was conducted at approximately three-week intervals, using commonly accepted methods of animal collection. Relatively few stations were sampled for fish, because of the need to standardize conditions of capture. Data on fish capture are reported in terms of catch-per-unit effort by a particular sampling gear at a specific station. Methods used in sampling invertebrates were generally more quantitative, and allowed sampling at a greater variety of places, as well as a valid basis for the computation of densities. Checklists of invertebrate species and fish species were developed from these samples, and are referred to throughout the report. The invertebrate checklist is more specific taxonomically than are tables reporting invertebrate densities. This is because the methods employed in identification were more precise than those used in counts. 9 refs., 27 figs., 25 tabs.

  20. Baseline and Multimodal UAV GCS Interface Design

    Science.gov (United States)

    2013-07-01

    necessary. Our situational awareness questionnaire which was implemented during the experimental session had a number of questions that were confusing...during the training phase, one where the participant is required to abort the landing and one where the UAV can land safely. The additional practice...scenario was added to remove the bias of experiencing only an abort scenario during the training phase. A new visual indicator of engine

  1. Pilot implementation

    DEFF Research Database (Denmark)

    Hertzum, Morten; Bansler, Jørgen P.; Havn, Erling C.

    2012-01-01

    A recurrent problem in information-systems development (ISD) is that many design shortcomings are not detected during development, but first after the system has been delivered and implemented in its intended environment. Pilot implementations appear to promise a way to extend prototyping from...... the laboratory to the field, thereby allowing users to experience a system design under realistic conditions and developers to get feedback from realistic use while the design is still malleable. We characterize pilot implementation, contrast it with prototyping, propose a five-element model of pilot...... implementation and provide three empirical illustrations of our model. We conclude that pilot implementation has much merit as an ISD technique when system performance is contingent on context. But we also warn developers that, despite their seductive conceptual simplicity, pilot implementations can be difficult...

  2. Pilot implementation

    DEFF Research Database (Denmark)

    Hertzum, Morten; Bansler, Jørgen P.; Havn, Erling C.

    2012-01-01

    A recurrent problem in information-systems development (ISD) is that many design shortcomings are not detected during development, but first after the system has been delivered and implemented in its intended environment. Pilot implementations appear to promise a way to extend prototyping from...... the laboratory to the field, thereby allowing users to experience a system design under realistic conditions and developers to get feedback from realistic use while the design is still malleable. We characterize pilot implementation, contrast it with prototyping, propose a five-element model of pilot...... implementation, and provide three empirical illustrations of our model. We conclude that pilot implementation has much merit as an ISD technique when system performance is contingent on context. But we also warn developers that, despite their seductive conceptual simplicity, pilot implementations can...

  3. The Wide Field Imaging Interferometry Testbed

    CERN Document Server

    Zhang, X; Leisawitz, D T; Leviton, D B; Martino, A J; Mather, J C; Zhang, Xiaolei; Feinberg, Lee; Leisawitz, Dave; Leviton, Douglas B.; Martino, Anthony J.; Mather, John C.

    2001-01-01

    We are developing a Wide-Field Imaging Interferometry Testbed (WIIT) in support of design studies for NASA's future space interferometry missions, in particular the SPIRIT and SPECS far-infrared/submillimeter interferometers. WIIT operates at optical wavelengths and uses Michelson beam combination to achieve both wide-field imaging and high-resolution spectroscopy. It will be used chiefly to test the feasibility of using a large-format detector array at the image plane of the sky to obtain wide-field interferometry images through mosaicing techniques. In this setup each detector pixel records interferograms corresponding to averaging a particular pointing range on the sky as the optical path length is scanned and as the baseline separation and orientation is varied. The final image is constructed through spatial and spectral Fourier transforms of the recorded interferograms for each pixel, followed by a mosaic/joint-deconvolution procedure of all the pixels. In this manner the image within the pointing range ...
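    The spectral step of this reconstruction is ordinary Fourier transform spectroscopy: for one pixel, the spectrum is recovered by transforming the recorded interferogram with respect to the optical path difference. A minimal sketch with an idealized, noise-free interferogram and illustrative line positions (all numerical values are assumptions, not WIIT parameters):

        # Recover a toy spectrum from a simulated double-sided interferogram by an FFT
        # over the optical path difference (OPD) scan. All parameters are illustrative.
        import numpy as np

        n = 1024
        opd = np.linspace(-0.5e-3, 0.5e-3, n)          # OPD scan in metres (assumed range)
        lines = np.array([1.0e5, 2.0e5])               # source lines in cycles per metre
        weights = np.array([1.0, 0.6])

        # Ideal interferogram: one cosine in OPD per spectral line.
        interferogram = sum(w * np.cos(2 * np.pi * k * opd)
                            for w, k in zip(weights, lines))

        # Spectral estimate: magnitude of the FFT over the OPD axis.
        spectrum = np.abs(np.fft.rfft(interferogram))
        wavenumbers = np.fft.rfftfreq(n, d=opd[1] - opd[0])   # cycles per metre
        print(wavenumbers[spectrum.argmax()])                 # strongest recovered line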

  4. Arc melter demonstration baseline test results

    Energy Technology Data Exchange (ETDEWEB)

    Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; Oden, L.L.; O'Connor, W.K.; Turner, P.C.

    1994-07-01

    This report describes the test results and evaluation for the Phase 1 (baseline) arc melter vitrification test series conducted for the Buried Waste Integrated Demonstration program (BWID). Phase 1 tests were conducted on surrogate mixtures of as-incinerated wastes and soil. Some buried wastes, soils, and stored wastes at the INEL and other DOE sites, are contaminated with transuranic (TRU) radionuclides and hazardous organics and metals. The high temperature environment in an electric arc furnace may be used to process these wastes to produce materials suitable for final disposal. An electric arc furnace system can treat heterogeneous wastes and contaminated soils by (a) dissolving and retaining TRU elements and selected toxic metals as oxides in the slag phase, (b) destroying organic materials by dissociation, pyrolyzation, and combustion, and (c) capturing separated volatilized metals in the offgas system for further treatment. Structural metals in the waste may be melted and tapped separately for recycle or disposal, or these metals may be oxidized and dissolved into the slag. The molten slag, after cooling, will provide a glass/ceramic final waste form that is homogeneous, highly nonleachable, and extremely durable. These features make this waste form suitable for immobilization of TRU radionuclides and toxic metals for geologic timeframes. Further, the volume of contaminated wastes and soils will be substantially reduced in the process.

  5. The LOFAR long baseline snapshot calibrator survey

    CERN Document Server

    Moldón, J; Wucknitz, O; Jackson, N; Drabent, A; Carozzi, T; Conway, J; Kapińska, A D; McKean, P; Morabito, L; Varenius, E; Zarka, P; Anderson, J; Asgekar, A; Avruch, I M; Bell, M E; Bentum, M J; Bernardi, G; Best, P; Bîrzan, L; Bregman, J; Breitling, F; Broderick, J W; Brüggen, M; Butcher, H R; Carbone, D; Ciardi, B; de Gasperin, F; de Geus, E; Duscha, S; Eislöffel, J; Engels, D; Falcke, H; Fallows, R A; Fender, R; Ferrari, C; Frieswijk, W; Garrett, M A; Grießmeier, J; Gunst, A W; Hamaker, J P; Hassall, T E; Heald, G; Hoeft, M; Juette, E; Karastergiou, A; Kondratiev, V I; Kramer, M; Kuniyoshi, M; Kuper, G; Maat, P; Mann, G; Markoff, S; McFadden, R; McKay-Bukowski, D; Morganti, R; Munk, H; Norden, M J; Offringa, A R; Orru, E; Paas, H; Pandey-Pommier, M; Pizzo, R; Polatidis, A G; Reich, W; Röttgering, H; Rowlinson, A; Scaife, A M M; Schwarz, D; Sluman, J; Smirnov, O; Stappers, B W; Steinmetz, M; Tagger, M; Tang, Y; Tasse, C; Thoudam, S; Toribio, M C; Vermeulen, R; Vocks, C; van Weeren, R J; White, S; Wise, M W; Yatawatta, S; Zensus, A

    2014-01-01

    Aims. An efficient means of locating calibrator sources for International LOFAR is developed and used to determine the average density of usable calibrator sources on the sky for subarcsecond observations at 140 MHz. Methods. We used the multi-beaming capability of LOFAR to conduct a fast and computationally inexpensive survey with the full International LOFAR array. Sources were pre-selected on the basis of 325 MHz arcminute-scale flux density using existing catalogues. By observing 30 different sources in each of the 12 sets of pointings per hour, we were able to inspect 630 sources in two hours to determine if they possess a sufficiently bright compact component to be usable as LOFAR delay calibrators. Results. Over 40% of the observed sources are detected on multiple baselines between international stations and 86 are classified as satisfactory calibrators. We show that a flat low-frequency spectrum (from 74 to 325 MHz) is the best predictor of compactness at 140 MHz. We extrapolate from our sample to sho...

  6. Gated integrator with signal baseline subtraction

    Science.gov (United States)

    Wang, Xucheng

    1996-01-01

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.

  7. Gated integrator with signal baseline subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Wang, X.

    1996-12-17

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.

  8. Resetting predator baselines in coral reef ecosystems

    Science.gov (United States)

    Bradley, Darcy; Conklin, Eric; Papastamatiou, Yannis P.; McCauley, Douglas J.; Pollock, Kydd; Pollock, Amanda; Kendall, Bruce E.; Gaines, Steven D.; Caselle, Jennifer E.

    2017-01-01

    What did coral reef ecosystems look like before human impacts became pervasive? Early efforts to reconstruct baselines resulted in the controversial suggestion that pristine coral reefs have inverted trophic pyramids, with disproportionally large top predator biomass. The validity of the coral reef inverted trophic pyramid has been questioned, but until now, was not resolved empirically. We use data from an eight-year tag-recapture program with spatially explicit, capture-recapture models to re-examine the population size and density of a key top predator at Palmyra atoll, the same location that inspired the idea of inverted trophic biomass pyramids in coral reef ecosystems. Given that animal movement is suspected to have significantly biased early biomass estimates of highly mobile top predators, we focused our reassessment on the most mobile and most abundant predator at Palmyra, the grey reef shark (Carcharhinus amblyrhynchos). We estimated a density of 21.3 (95% CI 17.8, 24.7) grey reef sharks/km2, which is an order of magnitude lower than the estimates that suggested an inverted trophic pyramid. Our results indicate that the trophic structure of an unexploited reef fish community is not inverted, and that even healthy top predator populations may be considerably smaller, and more precarious, than previously thought. PMID:28220895

  9. Baseline air quality study at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Dave, M.J.; Charboneau, R.

    1980-10-01

    Air quality and meteorological data collected at Fermi National Accelerator Laboratory are presented. The data represent baseline values for the pre-construction phase of a proposed coal-gasification test facility. Air quality data were characterized through continuous monitoring of gaseous pollutants, collection of meteorological data, data acquisition and reduction, and collection and analysis of discrete atmospheric samples. Seven air quality parameters were monitored and recorded on a continuous real-time basis: sulfur dioxide, ozone, total hydrocarbons, nonreactive hydrocarbons, nitric oxide, nitrogen oxides, and carbon monoxide. A 20.9-m tower was erected near Argonne's mobile air monitoring laboratory, which was located immediately downwind of the proposed facility. The tower was instrumented at three levels to collect continuous meteorological data. Wind speed was monitored at three levels; wind direction, horizontal and vertical, at the top level; ambient temperature at the top level; and differential temperature between all three levels. All continuously-monitored parameters were digitized and recorded on magnetic tape. Appropriate software was prepared to reduce the data. Statistical summaries, graphical displays, and correlation studies also are presented.

  10. Generating synthetic baseline populations from register data

    DEFF Research Database (Denmark)

    Rich, Jeppe; Mulalic, Ismir

    2012-01-01

    algorithm. The solution strategy consists of establishing a harmonisation process for the population targets, which, combined with a linear programming approach, is applied to generate a consistent target representation. The model approach is implemented and tested on Danish administrative register data...... A test on historical census data shows that a 2006 population could be predicted by a 1994 population with an overall percentage deviation of 5–6%, given that targets were known. It is also indicated that the deviation is approximately a linear function of the length of the forecast period....
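    The harmonisation and fitting machinery is only summarised here; as a rough, generic illustration of target matching (classical iterative proportional fitting rather than the authors' linear-programming formulation, with made-up numbers), a seed table can be re-weighted to hit given marginal targets like this:

        # Iterative proportional fitting: re-weight a seed contingency table so its
        # row and column sums match given targets (illustrative, not the paper's model).
        import numpy as np

        def ipf(seed: np.ndarray, row_targets: np.ndarray, col_targets: np.ndarray,
                iters: int = 100, tol: float = 1e-9) -> np.ndarray:
            table = seed.astype(float).copy()
            for _ in range(iters):
                table *= (row_targets / table.sum(axis=1))[:, None]   # match row sums
                table *= (col_targets / table.sum(axis=0))[None, :]   # match column sums
                if (np.abs(table.sum(axis=1) - row_targets).max() < tol and
                        np.abs(table.sum(axis=0) - col_targets).max() < tol):
                    break
            return table

        # Example: a 2x2 seed re-weighted to new marginals.
        seed = np.array([[40.0, 60.0], [35.0, 65.0]])
        print(ipf(seed, row_targets=np.array([120.0, 80.0]),
                  col_targets=np.array([90.0, 110.0])))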

  11. Digital Offshore Cadastre (DOC) - Pacific83 - Baseline Tangent Lines

    Data.gov (United States)

    Bureau of Ocean Energy Management, Department of the Interior — This data set contains baseline tangent lines and bay closing lines in ESRI Arc/Info export and Arc/View shape file formats for the BOEM Pacific Region. Baseline...

  12. The effect of short-baseline neutrino oscillations on LBNE

    Energy Technology Data Exchange (ETDEWEB)

    Louis, William C. [Physics Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2015-10-15

    Short-baseline neutrino oscillations can have a relatively big effect on long-baseline oscillations, due to the cross terms that arise from multiple mass scales. The existing short-baseline anomalies suggest that short-baseline oscillations can affect the νμ → νe appearance probabilities by up to 20-40%, depending on the values of the CP-violating parameters.

  13. The effect of short-baseline neutrino oscillations on LBNE

    Science.gov (United States)

    Louis, William C.

    2015-10-01

    Short-baseline neutrino oscillations can have a relatively big effect on long-baseline oscillations, due to the cross terms that arise from multiple mass scales. The existing short-baseline anomalies suggest that short-baseline oscillations can affect the νμ → νe appearance probabilities by up to 20-40%, depending on the values of the CP-violating parameters.
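    For orientation, the short-baseline anomalies are usually interpreted with the effective two-flavour appearance probability of a 3+1 scheme (standard notation, not a formula quoted in this record):

        P(\nu_\mu \to \nu_e) \simeq \sin^2 2\theta_{\mu e}\,\sin^2\!\left(\frac{\Delta m^2_{41} L}{4E}\right),
        \qquad
        \sin^2 2\theta_{\mu e} \equiv 4\,|U_{e4}|^2\,|U_{\mu 4}|^2 .

    At a long-baseline detector this amplitude interferes with the Δm²31-driven amplitude, and it is these cross terms that can shift the appearance probability by the 20-40% quoted above.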

  14. Implementation First

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The G20 summit achieved better-than-expected results, but whether they can be implemented to resuscitate the sluggish global economy remains a question. Chinese economists welcomed the outcomes of the G20 London summit, but said they worry about the implementation of the agreements made. Wu Qiang, professor

  15. Vertical Implementation

    NARCIS (Netherlands)

    Rensink, Arend; Gorrieri, Roberto

    2001-01-01

    We investigate criteria to relate specifications and implementations belonging to conceptually different levels of abstraction. For this purpose, we introduce the generic concept of a vertical implementation relation, which is a family of binary relations indexed by a refinement function that maps

  16. Pilot Implementations

    DEFF Research Database (Denmark)

    Manikas, Maria Ie

    tensions and negotiations are fundamental characteristics of pilot implementations. Based on the analysis of a project that is pilot implementing an electronic pre-hospital patient record for emergency medical services in Danish health care, I investigate other perceptions of pilot implementations...... The analysis is conducted by means of a theoretical framework that centres on the concept of infrastructure. With infrastructure I understand the relation between organised practice and the information systems supporting this practice. Thus, infrastructure is not a thing but a relational and situated concept...... understanding of pilot implementations as enacted interventions into existing infrastructures. Moreover, being embedded in the day-to-day organisation of work, pilot implementations intervene in the conventions of practice, making the taken-for-granted visible. This allows project participants to attend...

  17. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Science.gov (United States)

    2010-07-01

    ... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a... gasoline volume of the facility, per § 80.91. (b) Baseline exhaust benzene emissions—simple model. (1) Simple model exhaust benzene emissions of conventional gasoline shall be determined using the following...

  18. 10 CFR 850.20 - Baseline beryllium inventory.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the... inventory, the responsible employer must: (1) Review current and historical records; (2) Interview...

  19. World wide biomass resources

    NARCIS (Netherlands)

    Faaij, A.P.C.

    2012-01-01

    In a wide variety of scenarios, policy strategies, and studies that address the future world energy demand and the reduction of greenhouse gas emissions, biomass is considered to play a major role as renewable energy carrier. Over the past decades, the modern use of biomass has increased rapidly in

  20. Health Care Wide Hazards

    Science.gov (United States)

    Topics covered include: (lack of) PPE, slips/trips/falls, stress, tuberculosis, universal precautions, workplace violence, use of medical lasers, needlesticks, noise, mercury, and other healthcare-wide hazards.

  1. World wide biomass resources

    NARCIS (Netherlands)

    Faaij, A.P.C.

    2012-01-01

    In a wide variety of scenarios, policy strategies, and studies that address the future world energy demand and the reduction of greenhouse gas emissions, biomass is considered to play a major role as renewable energy carrier. Over the past decades, the modern use of biomass has increased rapidly in

  2. World wide biomass resources

    NARCIS (Netherlands)

    Faaij, A.P.C.

    2012-01-01

    In a wide variety of scenarios, policy strategies, and studies that address the future world energy demand and the reduction of greenhouse gas emissions, biomass is considered to play a major role as renewable energy carrier. Over the past decades, the modern use of biomass has increased

  3. Nonintrusive methodology for wellness baseline profiling

    Science.gov (United States)

    Chung, Danny Wen-Yaw; Tsai, Yuh-Show; Miaou, Shaou-Gang; Chang, Walter H.; Chang, Yaw-Jen; Chen, Shia-Chung; Hong, Y. Y.; Chyang, C. S.; Chang, Quan-Shong; Hsu, Hon-Yen; Hsu, James; Yao, Wei-Cheng; Hsu, Ming-Sin; Chen, Ming-Chung; Lee, Shi-Chen; Hsu, Charles; Miao, Lidan; Byrd, Kenny; Chouikha, Mohamed F.; Gu, Xin-Bin; Wang, Paul C.; Szu, Harold

    2007-04-01

    We develop an effective and affordable set of smart paired devices to curb the growing expenditure on healthcare for the aging population, which will not be sustainable once the post-war baby boomers retire (78 million people would cost 1/5 to 1/4 of GDP in the US alone). To design an accessible test-bed for distributed points of homecare, we choose two exemplars of the set to demonstrate the possibility of translating modern military and clinical know-how, because the two exemplars share the same noninvasive algorithm, adapted to smart sensor pairs for real-world persistent surveillance. Currently, the standard diagnoses for malignant tumors and diabetes disorders are blood serum tests, X-ray CAT scans, and biopsies, used occasionally in physical checkups by physicians against cohort-average wellness baselines. The loss of quality of life in making second careers productive may be caused by a lack of timeliness in correct diagnosis and easier treatment, which contributes to the quarter of human errors that generate lawsuits against physicians and hospitals, and which further escalates insurance costs and wasteful healthcare expenditure. Such a vicious cycle should be entirely eliminated by building "individual diagnostic aids (IDA)," similar to the trend toward personalized drugs, developed from daily noninvasive intelligent databases of "wellness baseline profiling (WBP)". Since our physiological state undulates diurnally, Nyquist anti-aliasing theory dictates a minimum twice-a-day sampling of the WBP for the IDA, which must be made affordable by means of a noninvasive, unsupervised and unbiased methodology at the convenience of home. Thus, a pair of military infrared (IR) spectral cameras has been demonstrated for a noninvasive spectrogram ratio test of the thermal radiation spontaneously emitted from a normal human body at 37°C. This invisible self-emission spans radiation wavelengths from 3 microns to 12 microns

  4. 1993 baseline solid waste management system description

    Energy Technology Data Exchange (ETDEWEB)

    Armacost, L.L.; Fowler, R.A.; Konynenbelt, H.S.

    1994-02-01

    Pacific Northwest Laboratory has prepared this report under the direction of Westinghouse Hanford Company. The report provides an integrated description of the system planned for managing Hanford's solid low-level waste, low-level mixed waste, transuranic waste, and transuranic mixed waste. The primary purpose of this document is to illustrate a collective view of the key functions planned at the Hanford Site to handle existing waste inventories, as well as solid wastes that will be generated in the future. By viewing this system as a whole rather than as individual projects, key facility interactions and requirements are identified and a better understanding of the overall system may be gained. The system is described so as to form a basis for modeling the system at various levels of detail. Model results provide insight into issues such as facility capacity requirements, alternative system operating strategies, and impacts of system changes (i.e., startup dates). This description of the planned Hanford solid waste processing system: defines a baseline system configuration; identifies the entering waste streams to be managed within the system; identifies basic system functions and waste flows; and highlights system constraints. This system description will evolve and be revised as issues are resolved, planning decisions are made, additional data are collected, and assumptions are tested and changed. Out of necessity, this document will also be revised and updated so that a documented system description, which reflects current system planning, is always available for use by engineers and managers. It does not provide any results generated from the many alternatives that will be modeled in the course of analyzing solid waste disposal options; such results will be provided in separate documents.

  5. Profile of NASA software engineering: Lessons learned from building the baseline

    Science.gov (United States)

    Hall, Dana; Mcgarry, Frank

    1993-01-01

    It is critically important in any improvement activity to first understand the organization's current status, strengths, and weaknesses and, only after that understanding is achieved, examine and implement promising improvements. This fundamental rule is certainly true for an organization seeking to further its software viability and effectiveness. This paper addresses the role of the organizational process baseline in a software improvement effort and the lessons we learned assembling such an understanding for NASA overall and for the NASA Goddard Space Flight Center in particular. We discuss important, core data that must be captured and contrast that with our experience in actually finding such information. Our baselining efforts have evolved into a set of data gathering, analysis, and crosschecking techniques and information presentation formats that may prove useful to others seeking to establish similar baselines for their organization.

  6. The LBNO long-baseline oscillation sensitivities with two conventional neutrino beams at different baselines

    CERN Document Server

    Agarwalla, S.K.; Aittola, M.; Alekou, A.; Andrieu, B.; Antoniou, F.; Asfandiyarov, R.; Autiero, D.; Besida, O.; Balik, A.; Ballett, P.; Bandac, I.; Banerjee, D.; Bartmann, W.; Bay, F.; Biskup, B.; Blebea-Apostu, A.M.; Blondel, A.; Bogomilov, M.; Bolognesi, S.; Borriello, E.; Brancus, I.; Bravar, A.; Buizza-Avanzini, M.; Caiulo, D.; Calin, M.; Calviani, M.; Campanelli, M.; Cantini, C.; Cata-Danil, G.; Chakraborty, S.; Charitonidis, N.; Chaussard, L.; Chesneanu, D.; Chipesiu, F.; Crivelli, P.; Dawson, J.; De Bonis, I.; Declais, Y.; del Amo Sanchez, P.; Delbart, A.; Di Luise, S.; Duchesneau, D.; Dumarchez, J.; Efthymiopoulos, I.; Eliseev, A.; Emery, S.; Enqvist, T.; Enqvist, K.; Epprecht, L.; Erykalov, A.N.; Esanu, T.; Franco, D.; Friend, M.; Galymov, V.; Gavrilov, G.; Gendotti, A.; Giganti, C.; Gilardoni, S.; Goddard, B.; Gomoiu, C.M.; Gornushkin, Y.A.; Gorodetzky, P.; Haesler, A.; Hasegawa, T.; Horikawa, S.; Huitu, K.; Izmaylov, A.; Jipa, A.; Kainulainen, K.; Karadzhov, Y.; Khabibullin, M.; Khotjantsev, A.; Kopylov, A.N.; Korzenev, A.; Kosyanenko, S.; Kryn, D.; Kudenko, Y.; Kuusiniemi, P.; Lazanu, I.; Lazaridis, C.; Levy, J.M.; Loo, K.; Maalampi, J.; Margineanu, R.M.; Marteau, J.; Martin-Mari, C.; Matveev, V.; Mazzucato, E.; Mefodiev, A.; Mineev, O.; Mirizzi, A.; Mitrica, B.; Murphy, S.; Nakadaira, T.; Narita, S.; Nesterenko, D.A.; Nguyen, K.; Nikolics, K.; Noah, E.; Novikov, Yu.; Oprima, A.; Osborne, J.; Ovsyannikova, T.; Papaphilippou, Y.; Pascoli, S.; Patzak, T.; Pectu, M.; Pennacchio, E.; Periale, L.; Pessard, H.; Popov, B.; Ravonel, M.; Rayner, M.; Resnati, F.; Ristea, O.; Robert, A.; Rubbia, A.; Rummukainen, K.; Saftoiu, A.; Sakashita, K.; Sanchez-Galan, F.; Sarkamo, J.; Saviano, N.; Scantamburlo, E.; Sergiampietri, F.; Sgalaberna, D.; Shaposhnikova, E.; Slupecki, M.; Smargianaki, D.; Stanca, D.; Steerenberg, R.; Sterian, A.R.; Sterian, P.; Stoica, S.; Strabel, C.; Suhonen, J.; Suvorov, V.; Toma, G.; Tonazzo, A.; Trzaska, W.H.; Tsenov, R.; Tuominen, K.; Valram, M.; Vankova-Kirilova, G.; Vannucci, F.; Vasseur, G.; Velotti, F.; Velten, P.; Venturi, V.; Viant, T.; Vihonen, S.; Vincke, H.; Vorobyev, A.; Weber, A.; Wu, S.; Yershov, N.; Zambelli, L.; Zito, M.

    2014-01-01

    The proposed Long Baseline Neutrino Observatory (LBNO) initially consists of a ~20 kton liquid double-phase TPC complemented by a magnetised iron calorimeter, to be installed at the Pyhäsalmi mine, at a distance of 2300 km from CERN. The conventional neutrino beam is produced by 400 GeV protons accelerated at the SPS accelerator delivering 700 kW of power. The long baseline provides a unique opportunity to study neutrino flavour oscillations over their 1st and 2nd oscillation maxima, exploring the L/E behaviour and distinguishing effects arising from δ_CP and matter. In this paper we show how this comprehensive physics case can be further enhanced and complemented if a neutrino beam produced at the Protvino IHEP accelerator complex, at a distance of 1160 km and with a modest power of 450 kW, is aimed towards the same far detectors. We show that the coupling of two independent sub-MW conventional neutrino and antineutrino beams at different baselines from CERN and Protvino will allow to measure ...

  7. Pilot implementation

    DEFF Research Database (Denmark)

    Hertzum, Morten; Bansler, Jørgen P.; Havn, Erling C.

    2012-01-01

    A recurrent problem in information-systems development (ISD) is that many design shortcomings are not detected during development, but first after the system has been delivered and implemented in its intended environment. Pilot implementations appear to promise a way to extend prototyping from...... the laboratory to the field, thereby allowing users to experience a system design under realistic conditions and developers to get feedback from realistic use while the design is still malleable. We characterize pilot implementation, contrast it with prototyping, propose a five-element model of pilot...

  8. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

    Energy Technology Data Exchange (ETDEWEB)

    Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

    1994-01-01

    This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

  9. Short baseline neutrino oscillations: when entanglement suppresses coherence

    CERN Document Server

    Boyanovsky, Daniel

    2011-01-01

    For neutrino oscillations to take place, the entangled quantum state of a neutrino and a charged lepton produced via charged-current interactions must be disentangled. Implementing a non-perturbative Wigner-Weisskopf method, we obtain the correct entangled quantum state of neutrinos and charged leptons from the (two-body) decay of a parent particle. The source lifetime and the disentanglement length scale lead to a suppression of the oscillation probabilities in short-baseline experiments. The suppression is determined by π L_d/L_osc, where L_d is the smaller of the decay length of the parent particle and the disentanglement length scale. For L_d ≥ L_osc, coherence and oscillations are suppressed. These effects are more prominent in short-baseline experiments and at low neutrino energy. We obtain the corrections to the appearance and disappearance probabilities modified by both the lifetime of the source and the disentanglement scale and discuss their implications for accelerato...

  10. Fort Stewart integrated resource assessment. Volume 2, Baseline detail

    Energy Technology Data Exchange (ETDEWEB)

    Keller, J.M.; Sullivan, G.P.; Wahlstrom, R.R.; Larson, L.L.

    1993-08-01

    This report documents the assessment of baseline energy use at Fort Stewart, a US Army Forces Command facility located near Savannah, Georgia. This is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) tasked Pacific Northwest Laboratory (PNL) to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Stewart. PNL, in support of the US Department of Energy (DOE) Federal Energy Management Program (FEMP), has designed a model program applicable to the federal sector for this purpose. The model program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Stewart. The analysis examines the characteristics of electric, natural gas, oil, propane, and wood chip use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Fort Stewart by building type, fuel type, and energy end use. A complete energy consumption reconciliation is presented that accounts for the distribution of all major energy uses and losses among buildings, utilities, and central systems.

  11. Removal of Baseline Wander Noise from Electrocardiogram (ECG) using Fifth-order Spline Interpolation

    OpenAIRE

    John A. OJO; Temilade B. ADETOYI; Solomon A. Adeniran

    2016-01-01

    Baseline wandering can mask some important features of the electrocardiogram (ECG) signal, hence it is desirable to remove this noise for proper analysis and display of the ECG signal. This paper presents the implementation and evaluation of spline interpolation and linear-phase FIR filtering methods to remove this noise. The spline interpolation method requires the QRS waves to be detected first, after which a fifth-order (quintic) interpolation technique is applied to determine the smo...
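    A minimal sketch of the spline idea (fit a smooth curve through fiducial samples anchored to already-detected beats and subtract it), using scipy with k=5 to mirror the quintic order; the fiducial indices and the synthetic signal are illustrative assumptions, not the paper's data or exact algorithm:

        # Estimate baseline wander by fitting a fifth-order spline through fiducial
        # samples (e.g. points anchored to detected R peaks) and subtracting it.
        import numpy as np
        from scipy.interpolate import make_interp_spline

        def remove_baseline_wander(ecg: np.ndarray, fiducial_idx: np.ndarray) -> np.ndarray:
            """Return the ECG with a quintic-spline baseline estimate removed."""
            spline = make_interp_spline(fiducial_idx, ecg[fiducial_idx], k=5)
            baseline = spline(np.arange(len(ecg)))
            return ecg - baseline

        # Synthetic demo: an ECG-like trace corrupted by slow sinusoidal wander.
        fs = 360                                          # sampling rate in Hz
        t = np.arange(0, 10, 1 / fs)
        wander = 0.3 * np.sin(2 * np.pi * 0.2 * t)        # low-frequency drift
        trace = np.sin(2 * np.pi * 1.2 * t) + wander      # stand-in for an ECG signal
        knots = np.arange(0, len(t), fs // 2)             # hypothetical fiducial points
        corrected = remove_baseline_wander(trace, knots)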

  12. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald L.; Joe, Jeffrey C.

    2015-02-01

    For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

  13. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
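    A minimal sketch of the core comparison (hypothetical inputs: two arrays of corresponding feature-point coordinates, one per epoch, each in its own scanner coordinate system), computing the length of every baseline within each scan and reporting the change, with no registration between the scans:

        # Compare baseline lengths (distances between feature-point pairs) across two epochs.
        # Point correspondence between the epochs is assumed to be known.
        import itertools
        import numpy as np

        def baseline_changes(points_t0: np.ndarray, points_t1: np.ndarray):
            """Yield (i, j, length_t0, length_t1, difference) for every point pair."""
            for i, j in itertools.combinations(range(len(points_t0)), 2):
                d0 = np.linalg.norm(points_t0[i] - points_t0[j])
                d1 = np.linalg.norm(points_t1[i] - points_t1[j])
                yield i, j, d0, d1, d1 - d0

        # Toy example: three feature points, one of which moved by a few centimetres.
        epoch0 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
        epoch1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.03, 0.0]])
        for i, j, d0, d1, diff in baseline_changes(epoch0, epoch1):
            print(f"baseline {i}-{j}: {d0:.3f} m -> {d1:.3f} m (change {diff:+.3f} m)")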

  14. Estimating complicated baselines in analytical signals using the iterative training of Bayesian regularized artificial neural networks.

    Science.gov (United States)

    Mani-Varnosfaderani, Ahmad; Kanginejad, Atefeh; Gilany, Kambiz; Valadkhani, Abolfazl

    2016-10-12

    The present work deals with the development of a new baseline correction method based on the comparative learning capabilities of artificial neural networks. The developed method uses the Bayes probability theorem to prevent over-fitting and to find a generalized baseline. The method has been applied to simulated and real metabolomic gas-chromatography (GC) and Raman data sets. The results revealed that the proposed method can handle different types of baselines with concave, convex, curvilinear, triangular and sinusoidal patterns. For further evaluation of its performance, it has been compared with benchmark baseline correction methods such as corner-cutting (CC), morphological weighted penalized least squares (MPLS), adaptive iteratively-reweighted penalized least squares (airPLS) and iterative polynomial fitting (iPF). In order to compare the methods, the projected difference resolution (PDR) criterion has been calculated for the data before and after the baseline correction procedure. The calculated values of PDR after baseline correction using the iBRANN, airPLS, MPLS, iPF and CC algorithms for the GC metabolomic data were 4.18, 3.64, 3.88, 1.88 and 3.08, respectively. The results obtained in this work demonstrate that the developed iterative Bayesian regularized neural network (iBRANN) method thoroughly detects the baselines and is superior to the CC, MPLS, airPLS and iPF techniques. A graphical user interface has been developed for the suggested algorithm and can be used for easy implementation of the iBRANN algorithm for the correction of different chromatography, NMR and Raman data sets.
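    For context on the simplest of the benchmark methods named above, a minimal sketch of iterative polynomial fitting (iPF) baseline estimation, in which samples above the current fit are clipped so the polynomial settles onto the baseline; this is a generic textbook variant under assumed parameters, not the exact implementation compared in the paper and not the iBRANN method itself:

        # Iterative polynomial fitting: repeatedly fit a low-order polynomial and clip
        # the signal to the fit so peaks stop pulling the baseline estimate upward.
        import numpy as np

        def ipf_baseline(y: np.ndarray, order: int = 3, iterations: int = 50) -> np.ndarray:
            x = np.arange(len(y))
            work = y.astype(float).copy()
            for _ in range(iterations):
                coeffs = np.polyfit(x, work, order)
                fit = np.polyval(coeffs, x)
                work = np.minimum(work, fit)        # clip points above the current fit
            return fit

        # Toy chromatogram: two Gaussian peaks riding on a curved baseline.
        x = np.linspace(0.0, 1.0, 500)
        baseline = 0.5 + 0.8 * x - 0.6 * x**2
        peaks = np.exp(-((x - 0.3) / 0.02) ** 2) + 0.7 * np.exp(-((x - 0.7) / 0.03) ** 2)
        signal = baseline + peaks
        corrected = signal - ipf_baseline(signal)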

  15. Scientific Opportunities with the Long-Baseline Neutrino Experiment

    CERN Document Server

    Adams, C; Andrews, M; Anghel, I; Arrieta-Diaz, E; Artuso, M; Asaadi, J; Bai, X; Baird, M; Balantekin, B; Baller, B; Baptista, B; Barker, D; Barletta, W; Barr, G; Bashyal, A; Bass, M; Bellini, V; Berger, B E; Bergevin, M; Berman, E; Berns, H; Bernstein, A; Bernstein, R; Bhatnagar, V; Bhuyan, B; Bishai, M; Blake, A; Blaufuss, E; Bleakley, B; Blucher, E; Blusk, S; Bocean, V; Bolton, T; Breedon, R; Brandt, A; Bromberg, C; Brown, R; Buchanan, N; Bugg, B; Camilleri, L; Carr, R; Carminati, G; Cavanna, F; Chen, A; Chen, H; Chen, K; Cherdack, D; Chi, C; Childress, S; Choudhary, B; Christofferson, C; Church, E; Cline, D; Coan, T; Coelho, J; Coleman, S; Conrad, J; Convery, M; Corey, R; Corwin, L; Davies, G S; Dazeley, S; de Gouvea, A; de Jong, J K; Escobar, C; De, K; Demuth, D; Diwan, M; Djurcic, Z; Dolph, J; Drake, G; Duyang, H; Dye, S; Edmunds, D; Elliott, S; Eno, S; Enomoto, S; Farbin, A; Falk, L; Felde, J; Feyzi, F; Fields, L; Fleming, B; Fowler, J; Fox, W; Friedland, A; Fujikawa, B; Gallagher, H; Gandhi, R; Garvey, G; Gehman, V M; Geronimo, G; Gill, R; Goodman, M C; Goon, J; Graham, M; Gran, R; Grant, C; Greenlee, H; Greenler, L; Guarino, V; Guardincerri, E; Guenette, R; Habib, S; Habig, A; Hackenburg, R W; Hahn, A; Haines, T; Handler, T; Hans, S; Hartnell, J; Harton, J; Hatcher, R; Hatzikoutelis, A; Hays, S; Hazen, E; Headley, M; Heavey, A; Heeger, K; Heise, J; Hellauer, R; Himmel, A; Hogan, M; Holin, A; Horton-Smith, G; Howell, J; Hurh, P; Huston, J; Hylen, J; Imlay, R; Insler, J; Isvan, Z; Jackson, C; Jaffe, D; James, C; Johnson, M; Johnson, R; Johnson, S; Johnston, W; Johnstone, J; Jones, B; Jostlein, H; Junk, T; Kadel, R; Karagiorgi, G; Kaspar, J; Katori, T; Kayser, B; Kearns, E; Keener, P; Kettell, S H; Kirby, M; Klein, J; Koizumi, G; Kopp, S; Kropp, W; Kudryavtsev, V A; Kumar, A; Kumar, J; Kutter, T; Lande, K; Lane, C; Lang, K; Lanni, F; Lanza, R; Latorre, T; La Zia, F; Learned, J; Lee, D; Lee, K; Li, S; Li, Y; Li, Z; Libo, J; Linden, S; Ling, J; Link, J; Littenberg, L; Liu, H; Liu, Q; Liu, T; Losecco, J; Louis, W; Lundberg, B; Lundin, T; Maesano, C; Magill, S; Mahler, G; Malys, S; Mammoliti, F; Mandal, S; Mann, A; Mantsch, P; Marchionni, A; Marciano, W; Mariani, C; Maricic, J; Marino, A; Marshak, M; Marshall, J; Matsuno, S; Mauger, C; Mayer, N; McCluskey, E; McDonald, K; McFarland, K; McKee, D; McKeown, R; McTaggart, R; Mehdiyev, R; Mei, D; Meng, Y; Mercurio, B; Messier, M D; Metcalf, W; Meyhandan, R; Milincic, R; Miller, W; Mills, G; Mishra, S; Sher, S Moed; Mokhov, N; Montanari, D; Moore, C D; Morfin, J; Morse, W; Mufson, S; Muller, D; Musser, J; Naples, D; Napolitano, J; Newcomer, M; Niner, E; Norris, B; Olson, T; Page, B; Pakvasa, S; Paley, J; Palamara, O; Paolone, V; Papadimitriou, V; Park, S; Parsa, Z; Paulos, B; Partyka, K; Pavlovic, Z; Perch, A; Perkin, J D; Peeters, S; Petti, R; Plunkett, R; Polly, C; Pordes, S; Potenza, R; Prakash, A; Prokofiev, O; Perdue, G; Qian, X; Raaf, J L; Radeka, V; Rajendran, R; Rakhno, I; Rameika, R; Ramsey, J; Rebel, B; Rescia, S; Reitzner, D; Richardson, M; Riesselman, K; Robinson, M; Ronquest, M; Rosen, M; Rosenfeld, C; Rucinski, R; Sahijpal, S; Sahoo, H; Samios, N; Sanchez, M C; Schellman, H; Schmitt, R; Schmitz, D; Schneps, J; Scholberg, K; Seibert, S; Shaevitz, M; Shanahan, P; Sharma, R; Shaw, T; Simos, N; Singh, V; Sinnis, G; Sippach, W; Skwarnicki, T; Smy, M; Sobel, H; Soderberg, M; Sondericker, J; Sondheim, W; Spooner, N J C; Stancari, M; Stancu, I; Stefanik, A; Stewart, J; Stone, S; Strait, J; Strait, M; Striganov, S; Sullivan, G; Suter, 
L; Svoboda, R; Szczerbinska, B; Szydagis, M; Szelc, A; Talaga, R; Tamsett, M; Tariq, S; Tayloe, R; Taylor, C; Taylor, D; Teymourian, A; Themann, H; Thiesse, M; Thomas, J; Thompson, L F; Thomson, M; Thorn, C; Tian, X; Tiedt, D; Toki, W; Tolich, N; Tripathi, M; Tropin, I; Tzanov, M; Urheim, J; Usman, S; Vagins, M; Van Berg, R; Van de Water, R; Varner, G; Vaziri, K; Velev, G; Viren, B; Wachala, T; Wahl, D; Waldron, A; Walter, C W; Wang, H; Wang, W; Warner, D; Wasserman, R; Watson, B; Weber, A; Wei, W; Wendell, R; Wetstein, M; White, A; White, H; Whitehead, L; Whittington, D; Willhite, J; Willis, W; Wilson, R J; Winslow, L; Worcester, E; Wyman, T; Xin, T; Yarritu, K; Ye, J; Yu, J; Yeh, M; Yu, B; Zeller, G; Zhang, C; Zimmerman, E D; Zwaska, R

    2013-01-01

    In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained `near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC...

  16. Scientific Opportunities with the Long-Baseline Neutrino Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Adams, C.; et al.

    2013-07-28

    In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained `near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC) deployed there as a 'far' detector. The facilities envisioned are expected to enable many other science opportunities due to the high event rates and excellent detector resolution from beam neutrinos in the near detector and atmospheric neutrinos in the far detector. This is a mature, well developed, world class experiment whose relevance, importance, and probability of unearthing critical and exciting physics has increased with time.

  17. Baseline analysis and recommendations for BREEAM travel plan implementation : Case company: Finnair Oyj

    OpenAIRE

    Han, Adela Bianca

    2016-01-01

    This thesis was commissioned by Finnair Oyj. Communication is a critical factor in an organisation’s operational success. With a direct influence on the employees’ awareness of the company’s values and goals, internal communication helps employees to understand the planned changes, engaging and involving them to commit to be part of the change process. The theory framework of this thesis covers all the concepts needed to explain the importance of internal communications within the or...

  18. 10 CFR 707.5 - Submission, approval, and implementation of a baseline workplace substance abuse program.

    Science.gov (United States)

    2010-01-01

    ... workplace substance abuse program. 707.5 Section 707.5 Energy DEPARTMENT OF ENERGY WORKPLACE SUBSTANCE ABUSE... substance abuse program. (a) Each contractor subject to this part shall develop a written program consistent... employees concerning problems of substance abuse, including illegal drug use, and the availability...

  19. Implementing reliable Web services

    OpenAIRE

    Koskipää, Otto

    2012-01-01

    Web services are a common and standard way to implement communication between information systems and to provide documented interfaces. Web services usually use SOAP because it is a widespread, well-documented and widely used standard. The SOAP standard defines a message structure, the envelope, that is sent over the Internet using HTTP and contains XML data. An important part of the SOAP structure is the exception mechanism, which returns a Fault element in the response. The SOAP Fault is a stan...
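    As a rough illustration of the Fault mechanism mentioned above, a small Python sketch that checks a SOAP 1.1 response for a Fault and reads its code and reason; the sample envelope and service wording are made up, while the namespace and the faultcode/faultstring element names follow the SOAP 1.1 convention:

        # Parse a SOAP 1.1 response body and extract the Fault, if one is present.
        import xml.etree.ElementTree as ET

        SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

        def extract_fault(response_xml: str):
            """Return (faultcode, faultstring) if the response carries a Fault, else None."""
            root = ET.fromstring(response_xml)
            fault = root.find(f".//{{{SOAP_NS}}}Fault")
            if fault is None:
                return None
            return fault.findtext("faultcode", ""), fault.findtext("faultstring", "")

        sample = f"""<soap:Envelope xmlns:soap="{SOAP_NS}">
          <soap:Body>
            <soap:Fault>
              <faultcode>soap:Server</faultcode>
              <faultstring>Order service temporarily unavailable</faultstring>
            </soap:Fault>
          </soap:Body>
        </soap:Envelope>"""

        print(extract_fault(sample))  # ('soap:Server', 'Order service temporarily unavailable')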

  20. Jihadism, Narrow and Wide

    DEFF Research Database (Denmark)

    Sedgwick, Mark

    2015-01-01

    The term “jihadism” is popular, but difficult. It has narrow senses, which are generally valuable, and wide senses, which may be misleading. This article looks at the derivation and use of “jihadism” and of related terms, at definitions provided by a number of leading scholars, and at media usage....... It distinguishes two main groups of scholarly definitions, some careful and narrow, and some appearing to match loose media usage. However, it shows that even these scholarly definitions actually make important distinctions between jihadism and associated political and theological ideology. The article closes...

  1. Design and implementation of a real-time acquisition system for ultra-wide band radar echo signals based on the 9223 module

    Institute of Scientific and Technical Information of China (English)

    张延波; 王忠民; 徐文青; 杨秀蔚

    2016-01-01

    To meet the acquisition requirements of ultra-wide band radar echo signals, this paper presents an IF echo signal acquisition system based on the 9223 module. First, the system obtains the IF echo signal by mixing the radar echo signal with the local oscillator signal. Second, the 9223 acquisition module performs synchronous four-channel signal acquisition, and the acquired data are transmitted to a computer over Ethernet. Finally, following the data sampling procedure, the system records the echo signals of targets located 1 meter and 2 meters behind a concrete wall (wall thickness 12 cm). The test results show that the system effectively avoids the difficulty of implementing direct sampling of ultra-wide band radar signals, that the echo signal can clearly resolve the target peaks, and that the system meets the requirements of ultra-wide band data acquisition.
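    The digitised signal described above is an IF echo; a minimal sketch of how the downstream peak-resolution step might be prototyped, simulating an IF echo from two targets and locating the peaks to estimate range. All sampling, IF, and target parameters are illustrative assumptions, the analogue mixing stage is taken as already done in hardware, and this is not the actual 9223 acquisition code:

        # Simulate a digitized IF echo containing returns from two targets and locate
        # the echo peaks to estimate round-trip delay and range. Values are illustrative.
        import numpy as np
        from scipy.signal import hilbert, find_peaks

        fs = 1.0e9                       # digitizer sampling rate (assumed)
        f_if = 100.0e6                   # intermediate frequency after mixing (assumed)
        c = 3.0e8                        # propagation speed in m/s
        t = np.arange(0.0, 60e-9, 1.0 / fs)

        def if_echo(delay_s: float, amp: float, width_s: float = 1.5e-9) -> np.ndarray:
            """A short IF burst arriving after the given round-trip delay."""
            envelope = np.exp(-0.5 * ((t - delay_s) / width_s) ** 2)
            return amp * envelope * np.cos(2.0 * np.pi * f_if * (t - delay_s))

        # Two targets roughly 1 m and 2 m from the antenna (round-trip delay = 2d / c).
        echo = if_echo(2 * 1.0 / c, 1.0) + if_echo(2 * 2.0 / c, 0.6)

        envelope = np.abs(hilbert(echo))
        peaks, _ = find_peaks(envelope, prominence=0.15)
        for p in peaks:
            print(f"echo at {t[p] * 1e9:.1f} ns  ->  range ~ {c * t[p] / 2:.2f} m")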

  2. Implementation Politics

    DEFF Research Database (Denmark)

    2008-01-01

    level are supplemented or even replaced by national priorities. The chapter concludes that in order to capture the domestic politics associated with CFP implementation in Denmark, it is important to understand the policy process as a synergistic interaction between dominant interests, policy alliances...

  3. Implementing TQM.

    Science.gov (United States)

    Bull, G; Maffetone, M A; Miller, S K

    1992-01-01

    Total quality management (TQM) is an organized, systematic approach to problem solving and continuous improvement. American corporations have found that TQM is an excellent way to improve competitiveness, lower operating costs, and improve productivity. Increasing numbers of laboratories are investigating the benefits of TQM. For this month's column, we asked our respondents: What steps has your laboratory taken to implement TQM?

  4. Fort Drum integrated resource assessment. Volume 2, Baseline detail

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

    1992-12-01

    The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.

  5. Baseline Profile of Soil Samples from Upian River Watershed

    Directory of Open Access Journals (Sweden)

    Wilanfranco Caballero TAYONE

    2014-06-01

    Full Text Available The Mines and Geosciences Bureau (MGB) in the Philippines is currently mapping out the entire Davao City Watershed Area (DCWA). There are 8 major watershed areas within the DCWA that have been identified by the MGB, and the largest is the Davao River Watershed Area (DRWA). A smaller sub-watershed within the DRWA, the Upian River Watershed Area (URWA), was proposed, whose boundary and soil profile are yet to be established. This study focused on the analyses of the soil samples from the URWA. The results for pH, organic matter, cation exchange capacity, N, P, K, Ca and Mg were then compared to the Bureau of Soil standard for its fertility rating. Analysis of lead (Pb) was also included as a pollutant indicator for possible soil contamination. There are 4 sampling sites with unfavorable ratings for pH, 3 for both organic matter and phosphorus, and 2 stations for both nitrogen and calcium. The fertility rating is generally good for cation exchange capacity, potassium and magnesium. The Bureau of Soil has no existing standards for micronutrients; however, all sampling sites were found to be too low in micronutrients according to Gershuny and Smillie. There is no indication of lead contamination or pollution at any site as far as natural levels of lead in surface soil are concerned. This study will provide baseline information that is useful to all stakeholders, to the people living near the area, farmers, planners, and resource managers. It can also provide inputs to key government agencies in the Philippines, such as the Department of Environment and Natural Resources (DENR) and the City Planning Office of Davao, in formulating policies for sustainable management of the resource upon implementation of their programs and projects. Without the aforementioned information, planners would have difficulty predicting the impact or recommending best management strategies for a specific land use.

  6. IODINE SALT CONSUMPTION IN INDONESIAN HOUSEHOLDS: BASELINE HEALTH SURVEY 2007

    Directory of Open Access Journals (Sweden)

    Ni Ketut Aryastami

    2012-11-01

    Full Text Available Background: The Iodine Deficiency Disorder (IDD) reduction program has been implemented since 1976. According to the National Economic Survey 2002, the average consumption of iodized salt was 6.26 grams. The results of the Iodine Salt Survey (SGY) 2003 showed that iodized salt consumption at the household level was 73.2%, whereas the baseline health survey (Riskesdas) 2007 showed a reduction in iodized salt consumption to 60.2%. Methods: The study was a secondary data analysis with a cross-sectional design utilizing the Riskesdas 2007 data. The sample was selected purposively according to the previous SGY survey, based on endemicity criteria: highly endemic, moderately endemic and non-endemic. Results: The analysis showed a discrepancy in iodized salt consumption between urban and rural areas as well as across mothers' education levels. Iodized salt consumption was higher in urban areas (65.5%) compared to rural areas (52.9%). The higher the education of the mother, the better the iodized salt consumed. The usage of iodized salt in households based on the rapid salt test was 60.2%, whereas according to salt titration it was only 23.4%. The urinary iodine excretion results showed that adequate iodine intake among school children (age 6-12 years) was 12.8%, still below the cut-off prevalence of greater than 50%. The conclusion of this analysis is that there was evidence of reduced iodized salt consumption at the household level. Conversely, there was an increase in the percentage of urinary iodine levels among school children in Indonesia in 2007. It is recommended that a policy analysis be conducted on the achievement of the Universal Salt Iodization target, especially in endemic areas, to assess the existence of IDD prevalence. Key words: iodized salt in households, iodized salt consumption, urinary iodine excretion

  7. Long-Term Stewardship Baseline Report and Transition Guidance

    Energy Technology Data Exchange (ETDEWEB)

    Kristofferson, Keith

    2001-11-01

    Long-term stewardship consists of those actions necessary to maintain and demonstrate continued protection of human health and the environment after facility cleanup is complete. As the Department of Energy’s (DOE) lead laboratory for environmental management programs, the Idaho National Engineering and Environmental Laboratory (INEEL) administers DOE’s long-term stewardship science and technology efforts. The INEEL provides DOE with the technical and scientific expertise needed to oversee its long-term environmental management obligations complex-wide. Long-term stewardship is administered and overseen by the Environmental Management Office of Science and Technology. The INEEL Long-Term Stewardship Program is currently developing the management structures and plans to complete INEEL-specific, long-term stewardship obligations. This guidance document (1) assists in ensuring that the program leads transition planning for the INEEL with respect to facility and site areas and (2) describes the classes and types of criteria and data required to initiate transition for areas and sites where the facility mission has ended and cleanup is complete. Additionally, this document summarizes current information on INEEL facilities, structures, and release sites likely to enter long-term stewardship at the completion of DOE’s cleanup mission. This document is not intended to function as a discrete checklist or local procedure to determine readiness to transition. It is an overarching document meant as guidance in implementing specific transition procedures. Several documents formed the foundation upon which this guidance was developed. Principal among these were the Long-Term Stewardship Draft Technical Baseline; A Report to Congress on Long-Term Stewardship, Volumes I and II; Infrastructure Long-Range Plan; Comprehensive Facility Land Use Plan; INEEL End-State Plan; and INEEL Institutional Plan.

  8. Widely tunable edge emitters

    Science.gov (United States)

    Sarlet, Gert; Wesstrom, Jan-Olof; Rigole, Pierre-Jean; Broberg, Bjoern

    2001-11-01

    We will present the current state-of-the-art in widely tunable edge emitting lasers for WDM applications. Typical applications for a tunable laser will be discussed, and the different types of tunable lasers available today will be compared with respect to the requirements posed by these applications. We will focus on the DBR-type tunable lasers - DBR, SG-DBR and GCSR - which at present seem to be the only tunable lasers mature enough for real-life applications. Their main advantages are that they are all monolithic, with no moving parts, and can be switched from one frequency to the other very rapidly since the tuning is based on carrier injection and not on thermal or mechanical changes. We will briefly discuss the working principle of each of these devices, and present typical performance characteristics. From a manufacturing point of view, rapid characterization of the lasers is crucial; therefore an overview will be given of different characterization schemes that have recently been proposed. For the end user, reliability is the prime issue. We will show results of degradation studies on these lasers and outline how the control electronics that drive the laser can compensate for any frequency drift. Finally, we will also discuss the impact of the requirement for rapid frequency switching on the design of the control electronics.

  9. Wide HI profile galaxies

    CERN Document Server

    Brosch, Noah; Zitrin, Adi

    2011-01-01

    We investigate the nature of objects in a complete sample of 28 galaxies selected from the first sky area fully covered by ALFALFA, being well-detected and having HI profiles wider than 550 km/s. The selection does not use brightness, morphology, or any other property derived from optical or other spectral bands. We investigate the degree of isolation, the morphology, and other properties gathered or derived from open data bases and show that some objects have wide HI profiles probably because they are disturbed or are interacting, or might be confused in the ALFALFA beam. We identify a sub-sample of 14 galaxies lacking immediate interacting neighbours and showing regular, symmetric, two-horned HI profiles that we propose as candidate high-mass disk systems (CHMDs). We measure the net-Halpha emission from the CHMDs and combine this with public multispectral data to model the global star formation (SF) properties of each galaxy. The Halpha observations show SFRs not higher than a few solar masses per year. Sim...

  10. 77 FR 26535 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-04

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Hope Gas, Inc.; Notice of Baseline Filing Take notice that on April 26, 2012, Hope Gas, Inc. (Hope Gas) submitted a baseline filing of their Statement of Operating Conditions...

  11. 77 FR 31841 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-30

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Hope Gas, Inc.; Notice of Baseline Filing Take notice that on May 16, 2012, Hope Gas, Inc. (Hope Gas) submitted a revised baseline filing of their Statement of...

  12. Adapting the M3 Surveillance Metrics for an Unknown Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Hamada, Michael Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Abes, Jeff I. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Jaramillo, Brandon Michael Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-30

    The original M3 surveillance metrics assume that the baseline is known. In this article, adapted M3 metrics are presented for the case where the baseline is not known and must be estimated from available data. How much available data is enough is also discussed.

  13. 40 CFR 80.93 - Individual baseline submission and approval.

    Science.gov (United States)

    2010-07-01

    ...: (i) Refinery block flow diagram, showing principal refining units; (ii) Principal refining unit..., whether or not the auditor was retained through the baseline approval process. (ii) Identification of the... samples from batch processes, including volume of each batch sampled; and (G) Baseline fuel parameter...

  14. Digital Offshore Cadastre (DOC) - Pacific83 - Baseline Points

    Data.gov (United States)

    Bureau of Ocean Energy Management, Department of the Interior — This data set contains baseline points in ESRI Arc/Info export and Arc/View shape file formats for the BOEM Pacific Region. Baseline points are used by the BOEM to...

  15. Searching for neutrino oscillation parameters in long baseline experiments

    CERN Document Server

    Vihonen, Sampsa

    2016-01-01

    Developing neutrino astronomy requires a good understanding of the neutrino oscillations mechanism. The European strategy for neutrino oscillation physics sets a high priority on future long baseline neutrino experiments with the aim to measure the intrinsic parameters that govern the neutrino oscillations. In this work we take a look at the next generation of long baseline experiments and discuss their prospects in future research.

  16. The 2014 ALMA Long Baseline Campaign : An Overview

    NARCIS (Netherlands)

    ALMA Partnership, [Unknown; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Asaki, Y.; Matsushita, S.; Dent, W. R. F.; Hills, R. E.; Phillips, N.; Richards, A. M. S.; Cox, P.; Amestica, R.; Broguiere, D.; Cotton, W.; Hales, A. S.; Hiriart, R.; Hirota, A.; Hodge, J. A.; Impellizzeri, C. M. V.; Kern, J.; Kneissl, R.; Liuzzo, E.; Marcelino, N.; Marson, R.; Mignano, A.; Nakanishi, K.; Nikolic, B.; Perez, J. E.; Pérez, L. M.; Toledo, I.; Aladro, R.; Butler, B.; Cortes, J.; Cortes, P.; Dhawan, V.; Di Francesco, J.; Espada, D.; Galarza, F.; Garcia-Appadoo, D.; Guzman-Ramirez, L.; Humphreys, E. M.; Jung, T.; Kameno, S.; Laing, R. A.; Leon, S.; Mangum, J.; Marconi, G.; Nagai, H.; Nyman, L.-A.; Radiszcz, M.; Rodón, J. A.; Sawada, T.; Takahashi, S.; Tilanus, R. P. J.; van Kempen, T.; Vila Vilaro, B.; Watson, L. C.; Wiklind, T.; Gueth, F.; Tatematsu, K.; Wootten, A.; Castro-Carrizo, A.; Chapillon, E.; Dumas, G.; de Gregorio-Monsalvo, I.; Francke, H.; Gallardo, J.; Garcia, J.; Gonzalez, S.; Hibbard, J. E.; Hill, T.; Kaminski, T.; Karim, A.; Krips, M.; Kurono, Y.; Lopez, C.; Martin, S.; Maud, L.; Morales, F.; Pietu, V.; Plarre, K.; Schieven, G.; Testi, L.; Videla, L.; Villard, E.; Whyborn, N.; Alves, F.; Andreani, P.; Avison, A.; Barta, M.; Bedosti, F.; Bendo, G. J.; Bertoldi, F.; Bethermin, M.; Biggs, A.; Boissier, J.; Brand, J.; Burkutean, S.; Casasola, V.; Conway, J.; Cortese, L.; Dabrowski, B.; Davis, T. A.; Diaz Trigo, M.; Fontani, F.; Franco-Hernandez, R.; Fuller, G.; Galvan Madrid, R.; Giannetti, A.; Ginsburg, A.; Graves, S. F.; Hatziminaoglou, E.; Hogerheijde, M.; Jachym, P.; Jimenez Serra, I.; Karlicky, M.; Klaasen, P.; Kraus, M.; Kunneriath, D.; Lagos, C.; Longmore, S.; Leurini, S.; Maercker, M.; Magnelli, B.; Marti Vidal, I.; Massardi, M.; Maury, A.; Muehle, S.; Muller, S.; Muxlow, T.; O’Gorman, E.; Paladino, R.; Petry, D.; Pineda, J.; Randall, S.; Richer, J. S.; Rossetti, A.; Rushton, A.; Rygl, K.; Sanchez Monge, A.; Schaaf, R.; Schilke, P.; Stanke, T.; Schmalzl, M.; Stoehr, F.; Urban, S.; van Kampen, E.; Vlemmings, W.; Wang, K.; Wild, W.; Yang, Y.; Iguchi, S.; Hasegawa, T.; Saito, M.; Inatani, J.; Mizuno, N.; Asayama, S.; Kosugi, G.; Morita, K.-I.; Chiba, K.; Kawashima, S.; Okumura, S. K.; Ohashi, N.; Ogasawara, R.; Sakamoto, S.; Noguchi, T.; Huang, Y.-D.; Liu, S.-Y.; Kemper, F.; Koch, P. M.; Chen, M.-T.; Chikada, Y.; Hiramatsu, M.; Iono, D.; Shimojo, M.; Komugi, S.; Kim, J.; Lyo, A.-R.; Muller, E.; Herrera, C.; Miura, R. E.; Ueda, J.; Chibueze, J.; Su, Y.-N.; Trejo-Cruz, A.; Wang, K.-S.; Kiuchi, H.; Ukita, N.; Sugimoto, M.; Kawabe, R.; Hayashi, M.; Miyama, S.; Ho, P. T. P.; Kaifu, N.; Ishiguro, M.; Beasley, A. J.; Bhatnagar, S.; Braatz, J. A., III; Brisbin, D. G.; Brunetti, N.; Carilli, C.; Crossley, J. H.; D’Addario, L.; Donovan Meyer, J. L.; Emerson, D. T.; Evans, A. S.; Fisher, P.; Golap, K.; Griffith, D. M.; Hale, A. E.; Halstead, D.; Hardy, E. J.; Hatz, M. C.; Holdaway, M.; Indebetouw, R.; Jewell, P. R.; Kepley, A. A.; Kim, D.-C.; Lacy, M. D.; Leroy, A. K.; Liszt, H. S.; Lonsdale, C. J.; Matthews, B.; McKinnon, M.; Mason, B. S.; Moellenbrock, G.; Moullet, A.; Myers, S. T.; Ott, J.; Peck, A. B.; Pisano, J.; Radford, S. J. E.; Randolph, W. T.; Rao Venkata, U.; Rawlings, M. G.; Rosen, R.; Schnee, S. L.; Scott, K. S.; Sharp, N. K.; Sheth, K.; Simon, R. S.; Tsutsumi, T.; Wood, S. J.

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification targets.

  17. Preclinical models for neuroblastoma: establishing a baseline for treatment.

    Directory of Open Access Journals (Sweden)

    Tal Teitz

    Full Text Available BACKGROUND: Preclinical models of pediatric cancers are essential for testing new chemotherapeutic combinations for clinical trials. The most widely used genetic model for preclinical testing of neuroblastoma is the TH-MYCN mouse. This neuroblastoma-prone mouse recapitulates many of the features of human neuroblastoma. Limitations of this model include the low frequency of bone marrow metastasis, the lack of information on whether the gene expression patterns in this system parallel human neuroblastomas, the relatively slow rate of tumor formation and variability in tumor penetrance on different genetic backgrounds. As an alternative, preclinical studies are frequently performed using human cell lines xenografted into immunocompromised mice, either as flank implants or orthotopically. Drawbacks of this system include the use of cell lines that have been in culture for years, the inappropriate microenvironment of the flank or difficult, time-consuming surgery for orthotopic transplants and the absence of an intact immune system. PRINCIPAL FINDINGS: Here we characterize and optimize both systems to increase their utility for preclinical studies. We show that TH-MYCN mice develop tumors in the paraspinal ganglia, but not in the adrenal, with cellular and gene expression patterns similar to human NB. In addition, we present a new ultrasound-guided, minimally invasive orthotopic xenograft method. This injection technique is rapid, provides accurate targeting of the injected cells and leads to efficient engraftment. We also demonstrate that tumors can be detected, monitored and quantified prior to visualization using ultrasound, MRI and bioluminescence. Finally we develop and test a "standard of care" chemotherapy regimen. This protocol, which is based on current treatments for neuroblastoma, provides a baseline for comparison of new therapeutic agents. SIGNIFICANCE: The studies suggest that use of both the TH-MYCN model of neuroblastoma and the

  18. Esophageal impedance baseline according to different time intervals

    Directory of Open Access Journals (Sweden)

    Ummarino Dario

    2012-06-01

    Full Text Available Abstract Background The impedance baseline has been shown to reflect esophageal integrity, and to be decreased in patients with esophagitis. However, different methods for the determination of the impedance baseline have not been compared. Methods The median impedance baseline was calculated in 10 consecutive multichannel intraluminal impedance recordings in children with non-erosive reflux disease. All children underwent an endoscopy with a biopsy as part of the clinical work-up to exclude esophagitis. The impedance baseline was obtained both by including and excluding all impedance episodes (IE; reflux, swallows and gas episodes) during the full recording, and during the first 1-minute period without an IE every hour (method 1), every 2 hours (method 2) or every 4 hours (method 3). The impedance baseline obtained during the full recording was set at 100%, and the variation (difference in impedance baseline for the different methods) and variability (difference in impedance baseline during one analysis period) were assessed. Results None of the participants had esophagitis. The mean difference over the six channels between the impedance baseline over the total recording with and without IE was approximately 2.5%, and comparable for each channel (range 0.47% to 5.55%). A mean of 1,028 IEs were excluded in each tracing, and it took between 4 and 24 hours to delete all events in one tracing. The difference in the impedance baseline obtained with and without IEs was mainly caused by the gas episodes in the upper channels and swallows in the lower channels. The median impedance baseline according to the three one-minute analysis methods was comparable to the median impedance baseline according to the 24 hour analysis. Conclusions The automatic determination of the median impedance baseline over the total tracing including IEs is an adequate method. In isolated tracings with numerous IEs, the calculation of the median impedance baseline over one minute
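
    As a rough illustration of the two calculations compared above, the sketch below computes a median impedance baseline over a full recording with impedance episodes excluded, and, following method 1, over the first episode-free one-minute window of every hour. The sampling rate, array layout and episode mask are assumptions of the sketch, not details of the study, and the window search steps in whole minutes for simplicity.

```python
import numpy as np

def median_baseline(impedance, episode_mask):
    """Median impedance over the whole recording, excluding samples that
    fall inside any impedance episode (reflux, swallow, gas)."""
    return float(np.median(impedance[~episode_mask]))

def median_baseline_hourly_windows(impedance, episode_mask, fs=50):
    """Method 1 of the abstract: take the first one-minute stretch without
    an episode in every hour and pool those samples."""
    samples_per_min = 60 * fs
    samples_per_hour = 3600 * fs
    pooled = []
    for start in range(0, len(impedance), samples_per_hour):
        hour_imp = impedance[start:start + samples_per_hour]
        hour_mask = episode_mask[start:start + samples_per_hour]
        for w in range(0, len(hour_imp) - samples_per_min + 1, samples_per_min):
            if not hour_mask[w:w + samples_per_min].any():
                pooled.append(hour_imp[w:w + samples_per_min])
                break  # only the first clean minute of each hour is used
    return float(np.median(np.concatenate(pooled))) if pooled else float("nan")

# Example: 2 hours of synthetic data at 50 Hz with one 30-second "episode".
fs = 50
imp = np.random.normal(2800, 100, size=2 * 3600 * fs)
mask = np.zeros_like(imp, dtype=bool)
mask[1000:1000 + 30 * fs] = True
print(median_baseline(imp, mask), median_baseline_hourly_windows(imp, mask, fs))
```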

  19. Implementation of an Internet Weight Loss Program in a Worksite Setting

    Directory of Open Access Journals (Sweden)

    Kathryn M. Ross

    2016-01-01

    Full Text Available Background. Worksite wellness programs typically produce modest weight losses. We examined whether an efficacious Internet behavioral weight loss program could be successfully implemented in a worksite setting. Methods. Participants were 75 overweight or obese employees/dependents of a large healthcare system who were given access to a 12-week Internet-based, multicomponent behavioral weight loss program. Assessments occurred at baseline, Month 3 (end of intervention), and Month 6 (follow-up). Results. Retention was excellent (93% at Month 3 and 89% at Month 6). Intent-to-treat analyses demonstrated that participants lost an average (±SE) of -5.8±.60 kg from baseline to Month 3 and regained 1.1±.31 kg from Month 3 to Month 6; overall, weight loss from baseline to Month 6 was -4.7±.71 kg, p<.001. Men lost more weight than women, p=.022, and individuals who had a college degree or higher lost more weight than those with less education, p=.005. Adherence to viewing lessons (8 of 12) and self-monitoring (83% of days) was excellent and significantly associated with weight loss, ps<.05. Conclusions. An Internet-based behavioral weight management intervention can be successfully implemented in a worksite setting and can lead to clinically significant weight losses. Given the low costs of offering this program, it could easily be widely disseminated.

  20. Testing a simple and low-cost method for long-term (baseline) CO2 monitoring in the shallow subsurface

    NARCIS (Netherlands)

    Gaasbeek, H.; Goldberg, T.; Koenen, M.; Visser, W.; Wildenborg, T.; Steeghs, P.

    2014-01-01

    Implementation of geological CO2 storage requires monitoring for potential leakage, with an essential part being establishment of baseline CO2 in soil gas. CO2 concentrations and weather parameters were monitored for ∼2 years at three locations in the Netherlands. CO2 concentrations in soil ranged f

  2. Baseline Analyses of SIG Applications and SIG-Eligible and SIG-Awarded Schools. NCEE 2011-4019

    Science.gov (United States)

    Hurlburt, Steven; Le Floch, Kerstin Carlson; Therriault, Susan Bowles; Cole, Susan

    2011-01-01

    The Study of School Turnaround is an examination of the implementation of School Improvement Grants (SIG) authorized under Title I section 1003(g) of the "Elementary and Secondary Education Act" and supplemented by the "American Recovery and Reinvestment Act of 2009." "Baseline Analyses of SIG Applications and SIG-Eligible…

  3. Removal of Baseline Wander Noise from Electrocardiogram (ECG) using Fifth-order Spline Interpolation

    Directory of Open Access Journals (Sweden)

    John A. OJO

    2016-10-01

    Full Text Available Baseline wandering can mask some important features of the electrocardiogram (ECG) signal, hence it is desirable to remove this noise for proper analysis and display of the ECG signal. This paper presents the implementation and evaluation of spline interpolation and linear-phase FIR filtering methods to remove this noise. The spline interpolation method requires the QRS waves to be detected first, after which a fifth-order (quintic) interpolation technique is applied to determine the smoothest curve joining several QRS points. Filtering of the ECG baseline wander was performed by using the difference between the estimated baseline wander and the noisy ECG signal. ECG signals from the MIT-BIH arrhythmia database were used to test the system, while the technique was implemented in MATLAB. The performance of the system was evaluated using the Average Power (AP) after filtering, the Mean Square Error (MSE) and the Signal-to-Noise Ratio (SNR). The quintic spline interpolation gave the best performance in terms of AP, MSE and SNR when compared with linear-phase filtering and cubic (3rd-order) spline interpolation methods.
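
    A minimal sketch of the approach summarised above: fit a fifth-order (quintic) spline through one fiducial sample per detected QRS complex, treat that curve as the baseline wander, and subtract it from the signal. The R-peak positions, the choice of a pre-QRS sample as the knot, and the synthetic test signal are assumptions of the sketch, not details of the paper (which was implemented in MATLAB on MIT-BIH records).

```python
import numpy as np
from scipy.interpolate import make_interp_spline

def remove_baseline_wander(ecg, r_peaks, fs, offset_s=0.08, k=5):
    """Estimate baseline wander with a quintic spline through one
    isoelectric (pre-QRS) sample per beat and subtract it.
    Knot positions must be strictly increasing and at least k+1 in number."""
    knots_x = np.clip(r_peaks - int(offset_s * fs), 0, len(ecg) - 1)
    knots_y = ecg[knots_x]
    spline = make_interp_spline(knots_x, knots_y, k=k)
    baseline = spline(np.arange(len(ecg)))
    return ecg - baseline, baseline

# Toy demonstration: a synthetic "ECG" of narrow spikes plus slow drift.
fs = 360
t = np.arange(0, 10, 1 / fs)
r_peaks = np.arange(int(0.5 * fs), len(t), fs)   # one beat per second
ecg = 0.3 * np.sin(2 * np.pi * 0.15 * t)         # slow baseline wander
ecg[r_peaks] += 1.0                              # crude QRS spikes
clean, baseline = remove_baseline_wander(ecg, r_peaks, fs)
```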

  4. The effectiveness of the PRISMA integrated service delivery network: preliminary report on methods and baseline data.

    Science.gov (United States)

    Hébert, Réjean; Dubois, Marie-France; Raîche, Michel; Dubuc, Nicole

    2008-02-14

    The PRISMA study analyzes an innovative coordination-type integrated service delivery (ISD) system developed to improve continuity and increase the effectiveness and efficiency of services, especially for older and disabled populations. The objective of the PRISMA study is to evaluate the effectiveness of this system to improve health, empowerment and satisfaction of frail older people, modify their health and social services utilization, without increasing the burden of informal caregivers. The objective of this paper is to present the methodology and give baseline data on the study participants. A quasi-experimental study with pre-test, multiple post-tests, and a comparison group was used to evaluate the impact of PRISMA ISD. Elders at risk of functional decline (501 experimental, 419 control) participated in the study. At entry, the two groups were comparable for most variables. Over the first year, when the implementation rate was low (32%), participants from the control group used fewer services than those from the experimental group. After the first year, no significant statistical difference was observed for functional decline and changes in the other outcome variables. This first year must be considered a baseline year, showing the situation without significant implementation of PRISMA ISD systems. Results for the following years will have to be examined with consideration of these baseline results.

  5. The effectiveness of the PRISMA integrated service delivery network: preliminary report on methods and baseline data

    Directory of Open Access Journals (Sweden)

    Réjean Hébert

    2008-02-01

    Full Text Available Purpose: The PRISMA study analyzes an innovative coordination-type integrated service delivery (ISD) system developed to improve continuity and increase the effectiveness and efficiency of services, especially for older and disabled populations. The objective of the PRISMA study is to evaluate the effectiveness of this system to improve health, empowerment and satisfaction of frail older people, modify their health and social services utilization, without increasing the burden of informal caregivers. The objective of this paper is to present the methodology and give baseline data on the study participants. Methods: A quasi-experimental study with pre-test, multiple post-tests, and a comparison group was used to evaluate the impact of PRISMA ISD. Elders at risk of functional decline (501 experimental, 419 control) participated in the study. Results: At entry, the two groups were comparable for most variables. Over the first year, when the implementation rate was low (32%), participants from the control group used fewer services than those from the experimental group. After the first year, no significant statistical difference was observed for functional decline and changes in the other outcome variables. Conclusion: This first year must be considered a baseline year, showing the situation without significant implementation of PRISMA ISD systems. Results for the following years will have to be examined with consideration of these baseline results.

  6. Baseline Optimization for the Measurement of CP Violation, Mass Hierarchy, and $\\theta_{23}$ Octant in a Long-Baseline Neutrino Oscillation Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Bass, M. [Colorado State U.; Bishai, M. [Brookhaven; Cherdack, D. [Colorado State U.; Diwan, M. [Brookhaven; Djurcic, Z. [Argonne; Hernandez, J. [Houston U.; Lundberg, B. [Fermilab; Paolone, V. [Pittsburgh U.; Qian, X. [Brookhaven; Rameika, R. [Fermilab; Whitehead, L. [Houston U.; Wilson, R. J. [Colorado State U.; Worcester, E. [Brookhaven; Zeller, G. [Fermilab

    2015-03-19

    Next-generation long-baseline electron neutrino appearance experiments will seek to discover CP violation, determine the mass hierarchy and resolve the θ23 octant. In light of the recent precision measurements of θ13, we consider the sensitivity of these measurements in a study to determine the optimal baseline, including practical considerations regarding beam and detector performance. We conclude that a detector at a baseline of at least 1000 km in a wide-band muon neutrino beam is the optimal configuration.

  7. Effects of School-Wide Positive Behavior Support on Teacher Self-Efficacy

    Science.gov (United States)

    Kelm, Joanna L.; McIntosh, Kent

    2012-01-01

    This study examined the relationships between implementation of a school-wide approach to behavior, School-wide Positive Behavior Support (SWPBS), and teacher self-efficacy. Twenty-two teachers from schools implementing SWPBS and 40 teachers from schools not implementing SWPBS completed a questionnaire measuring aspects of self-efficacy.…

  8. Baseline values of immunologic parameters in the lizard Salvator merianae (Teiidae, Squamata

    Directory of Open Access Journals (Sweden)

    Ana Paula Mestre

    2017-05-01

    Full Text Available The genus Salvator is widely distributed throughout South America. In Argentina, the most abundant and widely distributed species is Salvator merianae. Particularly in Santa Fe province, the area occupied by populations of these lizards overlaps with areas where agriculture has expanded. With the aim of establishing baseline values for four widely used immunologic biomarkers, 36 tegu lizards were evaluated, taking into account different age classes and both sexes. Total leukocyte counts did not differ between age classes. Among the leukocyte counts, eosinophil levels were higher in neonates compared with juveniles and adults; nevertheless, heterophils were the most prevalent leukocytes in the peripheral blood in all age classes. Lymphocyte, monocyte, heterophil, azurophil and basophil levels did not differ with age. Natural antibody titres were higher in adults compared with neonate and juvenile lizards. Lastly, complement system activity was lower in neonates compared with juveniles and adults. Statistical analysis within each age group showed that gender was not a factor in the outcomes. Based on the results, we conclude that S. merianae demonstrated age-related (but not gender-related) differences in the immune parameters analyzed. Having established baseline values for these four widely used immunologic biomarkers, ongoing studies will seek to optimize the use of the S. merianae model in future research.

  9. National greenhouse gas emissions baseline scenarios. Learning from experiences in developing countries

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    This report reviews national approaches to preparing baseline scenarios of greenhouse-gas (GHG) emissions. It does so by describing and comparing in non-technical language existing practices and choices made by ten developing countries - Brazil, China, Ethiopia, India, Indonesia, Kenya, Mexico, South Africa, Thailand and Vietnam. The review focuses on a number of key elements, including model choices, transparency considerations, choices about underlying assumptions and challenges associated with data management. The aim is to improve overall understanding of baseline scenarios and facilitate their use for policy-making in developing countries more broadly. The findings are based on the results of a collaborative project involving a number of activities undertaken by the Danish Energy Agency, the Organisation for Economic Co-operation and Development (OECD) and the UNEP Risoe Centre (URC), including a series of workshops on the subject. The ten contributing countries account for approximately 40% of current global GHG emissions - a share that is expected to increase in the future. The breakdown of emissions by sector varies widely among these countries. In some countries, the energy sector is the leading source of emissions; for others, the land-use sector and/or agricultural sector dominate emissions. The report underscores some common technical and financial capacity gaps faced by developing countries when preparing baseline scenarios. It does not endeavour to propose guidelines for preparing baseline scenarios. Rather, it is hoped that the report will inform any future attempts at preparing such kind of guidelines. (Author)

  10. The MINK methodology: background and baseline. [USA - Midwest Region

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, N.J.; Crosson, P.R.; Frederick, K.D.; Easterling, W.E. III; McKenney, M.S.; Bowes, M.D.; Sedjo, R.A.; Darmstadter, J.; Katz, L.A.; Lemon, K.M. (Resources for the Future, Washington, DC (United States))

    1993-06-01

    A four-step methodology has been developed for study of the regional impacts of climate change and the possible responses thereto. First the region's climate-sensitive sectors and total economy are described (Task A, current baseline). Next a scenario of climate change is imposed on the current baseline (Task B, current baseline with climate change). A new baseline describing the climate-sensitive sectors and total regional economy is projected for some time in the future (Task C, future baseline, year 2030) in the absence of climate change. Finally, the climate change scenario is reimposed on the future baseline (Task D, future baseline with climate change). Impacts of the climate change scenario on the current and future regional economies are determined by means of simulation models and other appropriate techniques. These techniques are also used to assess the impacts of an elevated CO2 concentration (450 ppm) and of various forms of adjustments and adaptations. The region chosen for the first test of the methodology is composed of the four U.S. states of Missouri, Iowa, Nebraska and Kansas. The climate change scenario is the actual weather of the 1930s decade in the MINK region. 'Current' climate is the actual weather of the period 1951-1980. 20 refs., 3 figs., 2 tabs.

  11. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    Science.gov (United States)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  12. Wide-Bandgap Semiconductors

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, M.S.

    2005-11-22

    With the increase in demand for more efficient, higher-power, and higher-temperature operation of power converters, design engineers face the challenge of increasing the efficiency and power density of converters [1, 2]. Development in power semiconductors is vital for achieving the design goals set by the industry. Silicon (Si) power devices have reached their theoretical limits in terms of higher-temperature and higher-power operation by virtue of the physical properties of the material. To overcome these limitations, research has focused on wide-bandgap materials such as silicon carbide (SiC), gallium nitride (GaN), and diamond because of their superior material advantages such as large bandgap, high thermal conductivity, and high critical breakdown field strength. Diamond is the ultimate material for power devices because of its greater than tenfold improvement in electrical properties compared with silicon; however, it is more suited for higher-voltage (grid level) higher-power applications based on the intrinsic properties of the material [3]. GaN and SiC power devices have similar performance improvements over Si power devices. GaN performs only slightly better than SiC. Both SiC and GaN have processing issues that need to be resolved before they can seriously challenge Si power devices; however, SiC is at a more technically advanced stage than GaN. SiC is considered to be the best transition material for future power devices before high-power diamond device technology matures. Since SiC power devices have lower losses than Si devices, SiC-based power converters are more efficient. With the high-temperature operation capability of SiC, thermal management requirements are reduced; therefore, a smaller heat sink would be sufficient. In addition, since SiC power devices can be switched at higher frequencies, smaller passive components are required in power converters. Smaller heat sinks and passive components result in higher-power-density power converters

  13. Lichen bioindication of biodiversity, air quality, and climate: baseline results from monitoring in Washington, Oregon, and California.

    Science.gov (United States)

    Sarah. Jovan

    2008-01-01

    Lichens are highly valued ecological indicators known for their sensitivity to a wide variety of environmental stressors like air quality and climate change. This report summarizes baseline results from the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) Lichen Community Indicator covering the first full cycle of data collection (...

  14. The long-period eccentric orbit of the particle accelerator HD 167971 revealed by long baseline interferometry

    NARCIS (Netherlands)

    De Becker, M.; Sana, H.; Absil, O.; Le Bouquin, J.-B.; Blomme, R.

    2012-01-01

    Using optical long baseline interferometry, we resolved for the first time the two wide components of HD 167971, a candidate hierarchical triple system known to efficiently accelerate particles. Our multi-epoch Very Large Telescope Interferometer observations provide direct evidence for a gravitatio

  15. Trinocular Stereo Matching Based on Correlations Between Baselines and Disparities

    Institute of Scientific and Technical Information of China (English)

    GUAN Yepeng; GU Weikang

    2004-01-01

    The gray correlation technique is utilized to take multi-peak feature points whose gray correlation coefficients lie within a certain range of the maximal correlation coefficient as the potential candidate matching set. There exists a maximal correlation between the correct disparities and their corresponding baselines. A trinocular stereo matching algorithm is therefore proposed based on the correlations between baselines and disparities. After computing the correlations between the baselines and disparities, the unique matches can be determined by the maximal correlation coefficient. The validity and reliability of the proposed algorithm are demonstrated by 3-D reconstruction on two pairs of actual natural stereo images.

  16. Neutrino oscillations: what is magic about the "magic" baseline?

    CERN Document Server

    Smirnov, A Yu

    2006-01-01

    A physics interpretation of the "magic" baseline, which can play an important role in future oscillation experiments, is given. The "magic" baseline coincides with the refraction length, $l_0$. The latter, in turn, approximately equals the oscillation length in matter at high energies. Therefore at the baseline $L = l_0$ the oscillation phase is $2\pi$, and consequently, the "solar" amplitude of oscillations driven by the mixing angle $\theta_{12}$ and mass splitting $\Delta m^2_{21}$ vanishes. As a result, in the lowest order (i) the interference of amplitudes in the $\
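
    The abstract above is cut off mid-formula. As a hedged sketch of the condition it describes, written in standard constant-density notation (with matter potential $V = \sqrt{2} G_F n_e$, an assumption of this sketch rather than notation taken from the record):

```latex
% Minimal sketch of the "magic baseline" condition, standard notation assumed.
\[
  l_0 = \frac{2\pi}{V} = \frac{2\pi}{\sqrt{2}\,G_F n_e},
  \qquad
  L = l_0 \;\Rightarrow\; \frac{V L}{2} = \pi
  \;\Rightarrow\; \sin\!\Big(\frac{V L}{2}\Big) = 0 ,
\]
% so the "solar" term of the appearance probability, which is proportional to
% \sin(VL/2) and carries the \theta_{12} and \Delta m^2_{21} dependence,
% vanishes, as does the CP-sensitive interference term; for typical Earth-crust
% densities this corresponds to a baseline of roughly 7000--7500 km.
```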

  17. Dynamic baseline detection method for power data network service

    Science.gov (United States)

    Chen, Wei

    2017-08-01

    This paper proposes a dynamic baseline traffic detection method based on historical traffic data for the power data network. The method uses Cisco's NetFlow acquisition tool to collect the original historical traffic data from network elements at fixed intervals, and works with three dimensions of information: the communication port, time, and traffic (number of bytes or number of packets). By filtering, removing deviating values, calculating the dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether the current network traffic is abnormal.
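
    A minimal sketch of this kind of procedure, under assumptions of the sketch: per-port byte counts collected at fixed intervals (as a NetFlow export would provide), deviating values trimmed with a median/MAD rule, and a per-time-slot baseline band used to flag the current value. The field names and thresholds are illustrative, not taken from the paper.

```python
from collections import defaultdict
import statistics

def build_baseline(history, k=3.0):
    """history: iterable of (port, slot, nbytes) samples, where `slot` is a
    recurring time bucket (e.g. hour of day). Returns a per-(port, slot)
    baseline band after discarding MAD-based outliers."""
    groups = defaultdict(list)
    for port, slot, nbytes in history:
        groups[(port, slot)].append(nbytes)

    baseline = {}
    for key, values in groups.items():
        med = statistics.median(values)
        mad = statistics.median(abs(v - med) for v in values) or 1.0
        kept = [v for v in values if abs(v - med) <= k * mad]  # drop deviations
        base = statistics.median(kept)
        spread = statistics.pstdev(kept) if len(kept) > 1 else mad
        baseline[key] = (base, spread)
    return baseline

def is_abnormal(baseline, port, slot, nbytes, k=3.0):
    """Flag traffic that falls outside the dynamic baseline band."""
    if (port, slot) not in baseline:
        return False  # no history for this port/slot: cannot judge
    base, spread = baseline[(port, slot)]
    return abs(nbytes - base) > k * max(spread, 1.0)
```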

  18. WEB GIS: IMPLEMENTATION ISSUES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    With the rapid expansion and development of the Internet and the WWW (World Wide Web or Web), Web GIS (Web Geographical Information System) is becoming ever more popular, and as a result numerous sites have added GIS capability to their Web sites. In this paper, the reasons behind developing a Web GIS instead of a “traditional” GIS are first outlined. Then the current status of Web GIS is reviewed, and implementation methodologies are explored as well. The underlying technologies for developing Web GIS, such as the Web server, Web browser, CGI (Common Gateway Interface), Java, and ActiveX, are discussed, and some typical implementation tools from both the commercial and public domains are given as well. Finally, the future development direction of Web GIS is predicted.

  19. Altering school climate through school-wide Positive Behavioral Interventions and Supports: findings from a group-randomized effectiveness trial.

    Science.gov (United States)

    Bradshaw, Catherine P; Koth, Christine W; Thornton, Leslie A; Leaf, Philip J

    2009-06-01

    Positive Behavioral Interventions and Supports (PBIS) is a universal, school-wide prevention strategy that is currently implemented in over 7,500 schools to reduce disruptive behavior problems. The present study examines the impact of PBIS on staff reports of school organizational health using data from a group-randomized controlled effectiveness trial of PBIS conducted in 37 elementary schools. Longitudinal multilevel analyses on data from 2,596 staff revealed a significant effect of PBIS on the schools' overall organizational health, resource influence, staff affiliation, and academic emphasis over the 5-year trial; the effects on collegial leadership and institutional integrity were significant when implementation fidelity was included in the model. Trained schools that adopted PBIS the fastest tended to have higher levels of organizational health at baseline, but the later-implementing schools tended to experience the greatest improvements in organizational health after implementing PBIS. This study indicated that changes in school organizational health are important consequences of the PBIS whole-school prevention model, and may in turn be a potential contextual mediator of the effect of PBIS on student performance.

  20. Baseline hospital performance and the impact of medical emergency teams: Modelling vs. conventional subgroup analysis

    Directory of Open Access Journals (Sweden)

    Hillman Ken

    2009-12-01

    Full Text Available Abstract Background To compare two approaches to the statistical analysis of the relationship between the baseline incidence of adverse events and the effect of medical emergency teams (METs). Methods Using data from a cluster randomized controlled trial (the MERIT study), we analysed the relationship between the baseline incidence of adverse events and its change from baseline to the MET activation phase using quadratic modelling techniques. We compared the findings with those obtained with conventional subgroup analysis. Results Using linear and quadratic modelling techniques, we found that each unit increase in the baseline incidence of adverse events in MET hospitals was associated with a 0.59 unit subsequent reduction in adverse events (95%CI: 0.33 to 0.86) after MET implementation and activation. This applied to cardiac arrests (0.74; 95%CI: 0.52 to 0.95), unplanned ICU admissions (0.56; 95%CI: 0.26 to 0.85) and unexpected deaths (0.68; 95%CI: 0.45 to 0.90). Control hospitals showed a similar reduction only for cardiac arrests (0.95; 95%CI: 0.56 to 1.32). Comparison using conventional subgroup analysis, on the other hand, detected no significant difference between MET and control hospitals. Conclusions Our study showed that, in the MERIT study, when there was dependence of treatment effect on baseline performance, an approach based on regression modelling helped illustrate the nature and magnitude of such dependence while sub-group analysis did not. The ability to assess the nature and magnitude of such dependence may have policy implications. Regression technique may thus prove useful in analysing data when there is a conditional treatment effect.
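
    A small sketch of the modelling idea described above: regressing the change from baseline on the baseline incidence itself, here with linear and quadratic terms. The data arrays are placeholders invented for the sketch; the MERIT data are not reproduced here.

```python
import numpy as np

# Hypothetical per-hospital data (placeholders, not MERIT values):
# baseline incidence of adverse events and incidence during the MET phase.
baseline = np.array([3.1, 5.2, 7.8, 4.4, 9.0, 6.5, 2.9, 8.2])
followup = np.array([2.5, 3.9, 5.1, 3.8, 5.6, 4.9, 2.8, 5.4])
change = followup - baseline  # negative values mean improvement

# Quadratic model: change ~ b0 + b1*baseline + b2*baseline^2.
# The linear coefficient plays the role of the "unit reduction per unit
# of baseline incidence" reported in the abstract.
coefs = np.polyfit(baseline, change, deg=2)
print("quadratic, linear, intercept terms:", coefs)

# Conventional subgroup analysis would instead split hospitals into
# high/low baseline groups and compare mean changes, discarding the
# continuous dependence that the regression captures.
```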

  1. Baseline and post-bronchodilator interrupter resistance and spirometry in asthmatic children.

    Science.gov (United States)

    Beydon, Nicole; Mahut, Bruno; Maingot, L; Guillo, H; La Rocca, M C; Medjahdi, N; Koskas, M; Boulé, M; Delclaux, Christophe

    2012-10-01

    In children unable to perform reliable spirometry, the interrupter resistance (R(int)) technique for assessing respiratory resistance is easy to perform. However, few data are available on the possibility to use R(int) as a surrogate for spirometry. We aimed at comparing R(int) and spirometry at baseline and after bronchodilator administration in a large population of asthmatic children. We collected retrospectively R(int) and spirometry results measured in 695 children [median age 7.8 (range 4.8-13.9) years] referred to our lab for routine assessment of asthma disease. Correlations between R(int) and spirometry were studied using data expressed as z-scores. Receiver operator characteristic curves for the baseline R(int) value (z-score) and the bronchodilator effect (percentage predicted value and z-score) were generated to assess diagnostic performance. At baseline, the relationship between raw values of R(int) and FEV(1) was not linear. Despite a highly significant inverse correlation between R(int) and all of the spirometry indices (FEV(1), FVC, FEV(1)/FVC, FEF(25-75%); P 12% baseline increase) with 70% sensitivity and 69% specificity (AUC = 0.79). R(int) measurements fitted a one-compartment model that explained the relationship between flows and airway resistance. We found that R(int) had poor sensitivity to detect baseline obstruction, but fairly good sensitivity and specificity to detect reversibility. However, in order to implement asthma guidelines for children unable to produce reliable spirometry, bronchodilator response measured by R(int) should be systematically studied and further assessed in conjunction with clinical outcomes.

  2. Baseline inventory data recommendations for National Wildlife Refuges

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Baseline Inventory Team recommends that each refuge have available abiotic “data layers” for topography, aerial photography, hydrography, soils, boundaries, and...

  3. Baseline Biomarkers for Outcome of Melanoma Patients Treated with Pembrolizumab

    NARCIS (Netherlands)

    Weide, Benjamin; Martens, Alexander; Hassel, Jessica C.; Berking, Carola; Postow, Michael A.; Bisschop, Kees; Simeone, Ester; Mangana, Johanna; Schilling, Bastian; Di Giacomo, Anna Maria; Brenner, Nicole; Kaehler, Katharina; Heinzerling, Lucie; Gutzmer, Ralf; Bender, Armin; Gebhardt, Christoffer; Romano, Emanuela; Meier, Friedegund; Martus, Peter; Maio, Michele; Blank, Christian; Schadendorf, Dirk; Dummer, Reinhard; Ascierto, Paolo A.; Hospers, Geke; Garbe, Claus; Wolchok, Jedd D.

    2016-01-01

    Purpose: Biomarkers for outcome after immune-checkpoint blockade are strongly needed as these may influence individual treatment selection or sequence. We aimed to identify baseline factors associated with overall survival (OS) after pembrolizumab treatment in melanoma patients. Experimental Design:

  4. Seier NWR second year baseline CCP preparation surveys

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Final report for a project to collect baseline biological information on John W. And Louise Seier National Wildlife Refuge, to aid in preparation of 2014...

  5. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  7. Sterile Neutrino Fits to Short-Baseline Neutrino Oscillation Measurements

    Directory of Open Access Journals (Sweden)

    J. M. Conrad

    2013-01-01

    (3 + 2) and (3 + 3) fits, rather than (3 + 1) fits, for future neutrino oscillation phenomenology. These results motivate the pursuit of further short-baseline experiments, such as those reviewed in this paper.

  8. Baseline vegetation mapping : Fort Niobrara National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Final report for the baseline vegetation mapping project on Fort Niobrara National Wildlife Refuge. This project aims to create a vegetation map showing the...

  9. Baseline assessment of fish communities of the Flower Garden Banks

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The work developed baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys employed diving,...

  10. Fusion of a Variable Baseline System and a Range Finder

    Directory of Open Access Journals (Sweden)

    Rafael Arnay

    2011-12-01

    Full Text Available One of the greatest difficulties in stereo vision is the appearance of ambiguities when matching similar points from different images. In this article we analyze the effectiveness of using a fusion of multiple baselines and a range finder from a theoretical point of view, focusing on the results of using both prismatic and rotational articulations for baseline generation, and offer a practical case to prove its efficiency on an autonomous vehicle.

  11. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Partnership, ALMA [Astrophysics Research Institute, Liverpool John Moores University, IC2, Liverpool Science Park, 146 Brownlow Hill, Liverpool L3 5RF (United Kingdom); Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S. [Joint ALMA Observatory, Alonso de Córdova 3107, Vitacura, Santiago (Chile); Lucas, R. [Institut de Planétologie et d’Astrophysique de Grenoble (UMR 5274), BP 53, F-38041 Grenoble Cedex 9 (France); Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Asaki, Y. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Matsushita, S. [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Hills, R. E. [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Richards, A. M. S. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Broguiere, D., E-mail: efomalon@nrao.edu [Institut de Radioastronomie Millimétrique (IRAM), 300 rue de la Piscine, Domaine Universitaire, F-38406 Saint Martin d’Hères (France); and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.

  12. MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation

    Directory of Open Access Journals (Sweden)

    Howard Daniel

    2006-01-01

    Full Text Available Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift that sheds light on the inherent difficulty of its removal by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal processing method to obtain a numerically straightforward baseline shift removal method as it includes a free parameter that can be optimized for different baseline drift removal applications. Therefore, research that determines putative biomarkers from the spectral data might benefit from a sensitivity analysis to the underlying spectral measurement that is made possible by varying the SB free parameter. This can be manually tuned (for constant or tuned with evolutionary computation (for .
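
    As a rough illustration only: the sketch below removes a slow drift with an ordinary (deterministic) Bernstein polynomial smoother of fixed degree, which is a simplified stand-in for the stochastic Bernstein scheme the paper describes; the degree here loosely plays the role of the free parameter mentioned above, and the spectrum is synthetic.

```python
import numpy as np
from math import comb

def bernstein_smooth(y, degree=20):
    """Evaluate a low-degree Bernstein polynomial approximation of y on [0, 1];
    the very smooth result is used as a drift estimate."""
    n = degree
    x = np.linspace(0.0, 1.0, len(y))
    nodes = np.linspace(0, len(y) - 1, n + 1).astype(int)
    # Control values: local minima around the nodes k/n, so that narrow
    # peaks do not pull the drift estimate upwards.
    w = max(1, len(y) // (2 * n))
    f = np.array([y[max(0, i - w):i + w + 1].min() for i in nodes])
    basis = np.array([comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)])
    return f @ basis

# Synthetic "spectrum": peaks on top of a decaying baseline drift.
x = np.linspace(0, 1, 4000)
drift = 2.0 * np.exp(-3 * x)
peaks = sum(np.exp(-((x - c) / 0.002) ** 2) for c in (0.2, 0.45, 0.7))
spectrum = drift + peaks

baseline_est = bernstein_smooth(spectrum, degree=20)
corrected = spectrum - baseline_est
```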

  13. Precise baseline determination for the TanDEM-X mission

    Science.gov (United States)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive for generating a global precise Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, already launched on June 15, 2007, and the TanDEM-X satellite to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called Integrated GPS Occultation Receiver (IGOR), and a Laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as the millimeter-level determination of the baseline or distance between the two spacecraft is needed to derive meter-level accurate DEMs. Within the TanDEM-X ground segment GFZ is responsible for the operational provision of precise baselines. For this GFZ uses two software chains, first its Earth Parameter and Orbit System (EPOS) software and second the BERNESE software, for backup purposes and quality control. In a concerted effort the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Using GRACE as an example, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the inter-satellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and therefore may be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent baselines by GFZ and DLR, the accuracy can even be increased. The K-band validation however covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the
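
    The record does not detail the EPOS, BERNESE or FRNS processing; the minimal sketch below only illustrates how a GPS-derived relative position can be split into along-track, cross-track and radial components, which is why a K-band range check (essentially the along-track separation for a trailing pair such as GRACE) constrains only part of the baseline. Function and variable names are assumptions for illustration (Python):

        import numpy as np

        def baseline_components(r1, v1, r2):
            """Split the baseline r2 - r1 into the local orbital frame of satellite 1:
            along-track (velocity direction), cross-track (orbit normal) and radial."""
            r1, v1, r2 = (np.asarray(a, dtype=float) for a in (r1, v1, r2))
            baseline = r2 - r1
            along = v1 / np.linalg.norm(v1)
            cross = np.cross(r1, v1)
            cross /= np.linalg.norm(cross)
            radial = np.cross(along, cross)            # completes the right-handed triad
            return {
                "length_m": np.linalg.norm(baseline),
                "along_track_m": baseline @ along,     # roughly what a K-band range checks
                "cross_track_m": baseline @ cross,
                "radial_m": baseline @ radial,
            }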

  14. A Baseline Study of Piermont Marsh as Nekton Habitat

    Science.gov (United States)

    Ortega, M.; Bloomfield, F.; Torres, T.; Ward, J.; Sanders, D.; Lobato, A.

    2011-12-01

    Between 2007 and 2011 we conducted a study of fish populations and water quality in the Piermont Marsh, a brackish tidal wetland about 40 km north of Manhattan. This 5-year period represents the baseline for an ongoing ecological study of the marsh. The marsh, along with similar wetlands between the Federal Dam at Troy and the Battery, is an important refuge for juvenile fish, and it is believed that estuarine wetland dynamics are critical in population recruitment for coastal fisheries. Piermont Marsh has undergone a rapid transition from a primarily Spartina alterniflora and Spartina patens setting to one dominated by an invasive genotype of the common reed Phragmites australis. The impact of this shift on local fish populations, species diversity, and adult recruitment is not well understood. The long-term goal of this study is to tease apart the factors involved in the use of the marsh as a nekton habitat. Fish were collected in unbaited Gee minnow traps which were deployed at slack tide and left for 24 hours. Samples were preserved in 10% buffered formalin. All organisms were identified to the lowest practical taxonomic level, enumerated, and measured. Gross weight was recorded for each sample set. Water quality measurements such as temperature, salinity and dissolved oxygen were collected concurrently with all sampling events. Sample collections were focused on the tidal creeks crossing the marsh, which provide the primary exchange of water and nutrients between the marsh interior and the Hudson River estuary. As expected, most minnows captured were Fundulus heteroclitus. However, a wide variety of other nekton, including species that are important to commercial and recreational coastal Atlantic fish stocks, was recorded as well. Comparisons are made between habitats such as erosional and depositional banks, rivulets, and exterior and interior marsh settings, as well as transient conditions such as temperature, salinity, dissolved oxygen levels, and hydroperiod

  15. Minimal sufficient balance-a new strategy to balance baseline covariates and preserve randomness of treatment allocation.

    Science.gov (United States)

    Zhao, Wenle; Hill, Michael D; Palesch, Yuko

    2015-12-01

    In many clinical trials, baseline covariates could affect the primary outcome. Commonly used strategies to balance baseline covariates include stratified constrained randomization and minimization. Stratification is limited to a few categorical covariates, minimization lacks the randomness of treatment allocation, and both apply only to categorical covariates. As a result, serious imbalances could occur in important baseline covariates not included in the randomization algorithm. Furthermore, randomness of treatment allocation could be significantly compromised because of the high proportion of deterministic assignments associated with stratified block randomization and minimization, potentially resulting in selection bias. Serious baseline covariate imbalances and selection biases often contribute to controversial interpretation of the trial results. The National Institute of Neurological Disorders and Stroke recombinant tissue plasminogen activator Stroke Trial and the Captopril Prevention Project are two examples. In this article, we propose a new randomization strategy, termed minimal sufficient balance randomization, which both prevents serious imbalances in all important baseline covariates, including categorical and continuous types, and preserves the randomness of treatment allocation. Computer simulations are conducted using the data from the National Institute of Neurological Disorders and Stroke recombinant tissue plasminogen activator Stroke Trial. Serious imbalances in four continuous and one categorical covariate are prevented with a small cost in treatment allocation randomness. A scenario of simultaneously balancing 11 baseline covariates is explored with similar promising results. The proposed minimal sufficient balance randomization algorithm can be easily implemented in computerized central randomization systems for large multicenter trials.
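
    As a rough illustration of the idea (intervening only when a covariate is seriously imbalanced, and then only with a biased coin so allocation stays random), here is a simplified sketch; the z-like imbalance statistic, the threshold and the coin probability are illustrative assumptions, not the published algorithm (Python):

        import random
        import statistics

        def msb_assign(new_subject, history, covariates, p_bias=0.7, z_threshold=1.5):
            """One allocation under a simplified 'minimal sufficient balance'-style rule.
            history: list of (covariate_dict, arm) for already randomized subjects.
            Only continuous covariates are handled; imbalance is flagged with a crude
            z-like statistic, and a biased coin (p_bias) is used only when the flagged
            covariates agree on a preferred arm -- otherwise allocation stays 50/50."""
            votes = {"A": 0, "B": 0}
            for cov in covariates:
                a = [c[cov] for c, arm in history if arm == "A"]
                b = [c[cov] for c, arm in history if arm == "B"]
                if len(a) < 2 or len(b) < 2:
                    continue
                sd = statistics.pstdev(a + b)
                if sd == 0:
                    continue
                z = (statistics.mean(a) - statistics.mean(b)) / sd
                if abs(z) < z_threshold:
                    continue                      # imbalance not 'serious': no intervention
                low_arm = "B" if z > 0 else "A"   # arm with the smaller mean
                high_arm = "A" if z > 0 else "B"
                overall_mean = statistics.mean(a + b)
                # Send a high value to the low arm (and vice versa) to shrink the gap.
                votes[low_arm if new_subject[cov] > overall_mean else high_arm] += 1
            if votes["A"] > votes["B"]:
                return "A" if random.random() < p_bias else "B"
            if votes["B"] > votes["A"]:
                return "B" if random.random() < p_bias else "A"
            return random.choice(["A", "B"])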

  16. Single-baseline RTK GNSS Positioning for Hydrographic Surveying

    Science.gov (United States)

    Metin Alkan, Reha; Murat Ozulu, I.; Ilçi, Veli; Kahveci, Muzaffer

    2015-04-01

    Positioning with GNSS can be carried out in two ways, absolute and relative. Since Selective Availability (SA) was permanently disabled in May 2000, absolute point positioning accuracies of a few meters have been achievable in real time. Today, accuracies obtainable from absolute point positioning using code observations are not sufficient for most surveying applications. Thus, to meet higher accuracy requirements, differential methods using single- or dual-frequency geodetic-grade GNSS receivers that measure carrier phase have to be used. However, this method requires time-consuming field and office work, and if the measurement is not carried out with the conventional RTK method, the user needs GNSS data processing software to estimate the coordinates. If RTK is used, at least two GNSS receivers are required, one as a reference and the other as a rover. Moreover, the distance between the receivers must not exceed 15-20 km in order to be able to rapidly and reliably resolve the carrier phase ambiguities. On the other hand, based on the innovations and improvements in satellite geodesy and GNSS modernization studies of the last decade, many new positioning methods and approaches have been developed. One of them is Network-RTK (commonly known as CORS) and another is single-baseline RTK. These methods are widely used for many surveying applications in many countries. The user of the system can obtain his/her position to within a few centimeters of accuracy in real time with only a single GNSS receiver that has Network RTK (CORS) capability. When compared with the conventional differential and RTK methods, this technique has several significant advantages as it is easy to use and it produces accurate, cost-effective and rapid solutions. In Turkey, the establishment of a multi-base RTK network was completed and opened for civilian use in 2009. This network is called CORS-TR and consists of 146 reference stations with about 80-100 km inter-station distances. It is possible

  17. Small pulmonary nodules in baseline and incidence screening rounds of low-dose CT lung cancer screening.

    Science.gov (United States)

    Walter, Joan E; Heuvelmans, Marjolein A; Oudkerk, Matthijs

    2017-02-01

    Currently, lung cancer screening by low-dose computed tomography (LDCT) is widely recommended for high-risk individuals by US guidelines, but there is still an ongoing debate concerning respective recommendations for European countries. Nevertheless, the available data regarding pulmonary nodules released by lung cancer screening studies could improve future screening guidelines, as well as the clinical practice of incidentally detected pulmonary nodules on routine CT scans. Most lung cancer screening trials present results for baseline and incidence screening rounds separately, clustering pulmonary nodules initially found at baseline screening and newly detected pulmonary nodules after baseline screening together. This approach does not account for possible differences between pulmonary nodules detected at baseline and those first detected at incidence screening rounds, and it is heavily influenced by methodological differences among the respective screening trials. This review intends to create a basis for assessing non-calcified pulmonary nodules detected during LDCT lung cancer screening in a more clinically relevant manner. The aim is to present data on non-calcified pulmonary baseline nodules and new non-calcified pulmonary incident nodules without clustering them together, thereby also simplifying translation to the clinical practice of incidentally detected pulmonary nodules. Small pulmonary nodules newly detected at incidence screening rounds of LDCT lung cancer screening may possess a greater lung cancer probability than pulmonary baseline nodules at a smaller size, which is essential for the development of new guidelines.

  18. SRP baseline hydrogeologic investigation: Aquifer characterization. Groundwater geochemistry of the Savannah River Site and vicinity

    Energy Technology Data Exchange (ETDEWEB)

    Strom, R.N.; Kaback, D.S.

    1992-03-31

    An investigation of the mineralogy and chemistry of the principal hydrogeologic units and the geochemistry of the water in the principal aquifers at Savannah River Site (SRS) was undertaken as part of the Baseline Hydrogeologic Investigation. This investigation was conducted to provide background data for future site studies and reports and to provide a site-wide interpretation of the geology and geochemistry of the Coastal Plain Hydrostratigraphic province. Ground water samples were analyzed for major cations and anions, minor and trace elements, gross alpha and beta, tritium, stable isotopes of hydrogen, oxygen, and carbon, and carbon-14. Sediments from the well borings were analyzed for mineralogy and major and minor elements.

  19. Wide Aperture Multipole Magnets of Separator COMBAS

    CERN Document Server

    Artukh, A G; Gridnev, G F; Gruszecki, M; Koscielniak, F; Semchenkova, O V; Sereda, Yu M; Shchepunov, V A; Szmider, J; Teterev, Yu G; Severgin, Yu P; Rozhdestvensky, B V; Myasnikov, Yu A; Shilkin, N F; Lamzin, E A; Nagaenko, M G; Sytchevsky, S E; Vishnevski, I N

    2000-01-01

    The high-resolving wide aperture separator COMBAS has been designed and commissioned at the FLNR, JINR. Its magneto-optical structure is based on the strong focusing principle. The magnetic fields of the analysing magnets M_1, M_2, M_7, M_8 contain quadrupole components of alternating sign that provide the necessary beam focusing. Besides, all the magnets M_1-M_8 contain sextupole and octupole field components, which minimize the 2nd and 3rd order aberrations. All this allowed the apertures to be increased, a beam of the required size to be formed effectively, and the channel length to be decreased. This implementation of wide aperture magnets with combined functions is unique for the separation technology. Three-component magnetic measurements of all the magnets were performed. The measured data allow reconstructing the 3D distributions of the fields in all the magnets. The 3D maps are intended to be used for particle trajectory simulations throughout the entire separator.

  20. Promoting and supporting PBL interests world wide

    DEFF Research Database (Denmark)

    Enemark, Stig; Kolmos, Anette; Moesby, Egon

    2006-01-01

    of projects world wide focusing on institutional change toward a more student centred, project organised, and problem based approach to learning. The Centre is also establishing a UCPBL Global Network on Problem Based Learning in order to facilitate better access to and co-operation within the PBL area.......-Based Learning (PBL) in Engineering Education, an increasing number of universities and engineering schools throughout the world are seeking consultancy and cooperation with Aalborg University. The establishment of UCPBL is therefore a timely opportunity to merge the efforts into one organisational structure...... aiming to promote and support PBL interests worldwide. This paper presents the UCPBL profile and plan of action. This includes a wide range of activities such as promoting research and development within the various PBL models and their implementation; Education and training in PBL through offering...

  1. Way to increase the user access at the LCLS baseline

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni

    2010-01-01

    The LCLS beam is meant for a single user, but the baseline undulator is long enough to serve two users simultaneously. To this end, we propose a setup composed of two elements: an X-ray mirror pair for X-ray beam deflection, and a 4 m-long magnetic chicane, which creates an offset for installing the mirror pair in the middle of the baseline undulator. The insertable mirror pair can spatially separate the X-ray beams generated in the first and in the second half of the baseline undulator. Rapid switching of the FEL amplification process allows one half of the undulator to be deactivated and the other half activated. As proposed elsewhere, using a kicker installed upstream of the LCLS baseline undulator and an already existing corrector in the first half of the undulator, it is possible to rapidly switch the X-ray beam from one user to another. We present simulation results for the LCLS baseline, and show that it is possible to generate two saturated SASE X-ray beams in the whole 0.8-8 keV photon energy range in the...

  2. Deformation Monitoring of the Submillimetric UPV Calibration Baseline

    Science.gov (United States)

    García-Asenjo, Luis; Baselga, Sergio; Garrigues, Pascual

    2017-06-01

    A 330 m calibration baseline was established at the Universitat Politècnica de València (UPV) in 2007. Absolute scale was subsequently transferred in 2012 from the Nummela Standard Baseline in Finland and distances between pillars were determined with uncertainties ranging from 0.1 mm to 0.3 mm. In order to assess the long-term stability of the baseline, three field campaigns were carried out from 2013 to 2015 in a co-operative effort with the Universidad Complutense de Madrid (UCM), which provided the only Mekometer ME5000 distance meter available in Spain. Since the application of the full ISO 17123-4 procedure did not suffice to come to a definite conclusion about possible displacements of the pillars, we opted for the traditional geodetic network approach. This approach had to be adapted to the case at hand in order to deal with problems such as the geometric weakness inherent to calibration baselines and the scale uncertainty derived from both the use of different instruments and the high correlation between the meteorological correction and scale determination. Additionally, the so-called maximum number of stable points method was also tested. This contribution describes the process followed to assess the stability of the UPV submillimetric calibration baseline during the period from 2012 to 2015.

  3. Detection of abrupt baseline length changes using cumulative sums

    Science.gov (United States)

    Janssen, Volker

    2009-06-01

    Dynamic processes are usually monitored by collecting a time series of observations, which is then analysed in order to detect any motion or non-standard behaviour. Geodetic examples include the monitoring of dams, bridges, high-rise buildings, landslides, volcanoes and tectonic motion. The cumulative sum (CUSUM) test is recognised as a popular means to detect changes in the mean and/or the standard deviation of a time series and has been applied to various monitoring tasks. This paper briefly describes the CUSUM technique and how it can be utilised for the detection of small baseline length changes by differencing two perpendicular baselines sharing a common site. A simulation is carried out in order to investigate the expected behaviour of the resulting CUSUM charts for a variety of typical deformation monitoring scenarios. This simulation shows that using first differences (between successive epochs) as input, rather than the original baseline lengths, produces clear peaks or jumps in the differenced CUSUM time series when a sudden change in baseline length occurs. These findings are validated by analysing several GPS baseline pairs of a network deployed to monitor the propagation of an active ice shelf rift on the Amery Ice Shelf, East Antarctica.
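
    A minimal sketch of the chart construction described here; in the paper the input is further built from two perpendicular baselines sharing a common site, whereas the sketch below only shows the CUSUM of epoch-to-epoch first differences, and the file name and units are illustrative assumptions (Python):

        import numpy as np

        def cusum(series):
            """Classical cumulative-sum chart: running sum of deviations from the mean."""
            x = np.asarray(series, dtype=float)
            return np.cumsum(x - x.mean())

        # Hypothetical daily baseline-length solutions in metres.
        lengths = np.loadtxt("baseline_lengths.txt")
        chart = cusum(np.diff(lengths))   # first differences: an abrupt length change
                                          # appears as a sharp jump in the chart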

  4. Combined GPS + BDS for short to long baseline RTK positioning

    Science.gov (United States)

    Odolinski, R.; Teunissen, P. J. G.; Odijk, D.

    2015-04-01

    The BeiDou Navigation Satellite System (BDS) has become fully operational in the Asia-Pacific region and it is of importance to evaluate what BDS brings when combined with the Global Positioning System (GPS). In this contribution we will look at the short, medium and long single-baseline real-time kinematic (RTK) positioning performance. Short baseline refers to when the distance between the two receivers is at most a few kilometers so that the relative slant ionospheric and tropospheric delays can be assumed absent, whereas with medium baseline we refer to when the uncertainty of these ionospheric delays can reliably be modeled as a function of the baseline length. With long baseline we refer to the necessity to parameterize the ionospheric delays and (wet) Zenith Tropospheric Delay (ZTD) as completely unknown. The GNSS real data are collected in Perth, Australia. It will be shown that combining the two systems allows for the use of higher than customary elevation cut-off angles. This can be of particular benefit in environments with restricted satellite visibility such as in open pit mines or urban canyons.

  5. Multiproject baselines for evaluation of electric power projects

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

    2003-03-12

    Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
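
    As a toy illustration of the kind of quantity being standardized, the sketch below computes a generation-weighted baseline emission rate (kg C/kWh) for a hypothetical set of plants; the plant selection rules, which the paper identifies as the difficult part, are not reproduced, and the numbers are illustrative only (Python):

        def baseline_emission_rate(plants):
            """Generation-weighted emission rate (kg C per kWh) over the plants chosen
            for the multiproject baseline."""
            total_c = sum(p["generation_kwh"] * p["kg_c_per_kwh"] for p in plants)
            total_kwh = sum(p["generation_kwh"] for p in plants)
            return total_c / total_kwh

        recent_plants = [
            {"generation_kwh": 2.0e9, "kg_c_per_kwh": 0.26},   # coal unit
            {"generation_kwh": 1.5e9, "kg_c_per_kwh": 0.10},   # gas combined cycle
            {"generation_kwh": 0.5e9, "kg_c_per_kwh": 0.00},   # hydro
        ]
        rate = baseline_emission_rate(recent_plants)
        # project reductions ~ project_kwh * (rate - project_emission_rate)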

  6. Atmospheric pressure loading parameters from very long baseline interferometry observations

    Science.gov (United States)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to change in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; vanDam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
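
    The station-by-station sensitivities quoted here are regression-type estimates; the sketch below shows the basic idea of estimating an admittance in mm/mbar from height residuals and co-located surface pressure by least squares. The variable names and the simple one-parameter model are assumptions, not the paper's full estimation inside the VLBI solution (Python):

        import numpy as np

        def pressure_admittance(vertical_mm, pressure_mbar):
            """Least-squares slope of station height residuals against local surface
            pressure; a value near -0.4 mm/mbar would mean the station drops ~0.4 mm
            per mbar of excess pressure."""
            p = np.asarray(pressure_mbar, dtype=float)
            h = np.asarray(vertical_mm, dtype=float)
            p -= p.mean()
            h -= h.mean()
            return np.sum(p * h) / np.sum(p * p)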

  7. Circular polarization control for the LCLS baseline in the soft X-ray regime

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni

    2010-01-01

    The LCLS baseline includes a planar undulator system, producing linearly polarized light in the range 0.15-1.5 nm. Polarization control in the soft X-ray region from linear to circular is highly desirable. Several schemes using helical undulators have been discussed for the LCLS. One consists in replacing three of the last planar undulator segments by APPLE III. A second proposal, the 2nd harmonic helical afterburner, uses short, crossed undulators tuned to the second harmonic. This last scheme is expected to be the better one. Its advantages are a high and stable degree of circular polarization and a low cost. Its disadvantage is a small output power and a narrow wavelength range. We propose a novel method to generate 10 GW level power at the fundamental harmonic with 99% degree of circular polarization from the LCLS baseline. Its merits are low cost, simplicity and easy implementation. After the baseline undulator, the electron beam is sent through a 40 m long straight section, and subsequently passes throu...

  8. Design, baseline characteristics, and early findings of the MPS VI (mucopolysaccharidosis VI) Clinical Surveillance Program (CSP).

    Science.gov (United States)

    Hendriksz, Christian J; Giugliani, Roberto; Harmatz, Paul; Lampe, Christina; Martins, Ana Maria; Pastores, Gregory M; Steiner, Robert D; Leão Teles, Elisa; Valayannopoulos, Vassili

    2013-03-01

    To outline the design, baseline data, and 5-year follow-up data of patients with mucopolysaccharidosis (MPS) VI enrolled in the Clinical Surveillance Program (CSP), a voluntary, multinational, observational program. The MPS VI CSP was opened in 2005 to collect, for at least 15 years, observational data from standard clinical and laboratory assessments of patients with MPS VI. Baseline and follow-up data are documented by participating physicians in electronic case report forms. Between September 2005 and March 2010 the CSP enrolled 132 patients, including 123 who received enzyme replacement therapy (ERT) with galsulfase. Median age at enrolment was 13 years (range 1-59). Mean baseline data showed impaired growth, hepatosplenomegaly, and reduced endurance and pulmonary function. The most common findings were heart valve disease (90%), reduced visual acuity (79%), impaired hearing (59%), and hepatosplenomegaly (54%). Follow-up data up to 5 years in patients with pre- and post-ERT measurements showed a decrease in urinary glycosaminoglycans and increases in height and weight in patients with MPS VI. This first report provides information on the design and implementation of the program and population statistics for several clinical variables in patients with MPS VI. Data collected over 5 years suggest that ERT provides clinical benefit and is well-tolerated with no new safety concerns.

  9. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
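
    The record does not list the specific accuracy metrics used, so the sketch below computes two metrics that are conventional for baseline energy models, normalized mean bias error (NMBE) and CV(RMSE); treating these as the report's own metrics would be an assumption (Python):

        import numpy as np

        def goodness_of_fit(measured, predicted):
            """Prediction-accuracy metrics comparing metered and model-predicted energy use."""
            m = np.asarray(measured, dtype=float)
            p = np.asarray(predicted, dtype=float)
            resid = m - p
            nmbe = resid.sum() / (len(m) * m.mean()) * 100.0          # bias, percent
            cvrmse = np.sqrt((resid ** 2).mean()) / m.mean() * 100.0  # scatter, percent
            return {"NMBE_%": nmbe, "CVRMSE_%": cvrmse}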

  10. A publication database for optical long baseline interferometry

    CERN Document Server

    Malbet, Fabien; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain

    2010-01-01

    Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site using MySQL and a collection of XML or PHP scripts in order to store and classify these publications. Each entry is defined by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted in categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.
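
    A minimal, self-contained sketch of the kind of bibcode-plus-tags layout described, using SQLite from Python rather than the MySQL/PHP stack of the actual service, and with a clearly placeholder bibcode:

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE paper (
                bibcode TEXT PRIMARY KEY,      -- ADS bibcode
                title   TEXT,
                year    INTEGER
            );
            CREATE TABLE tag (
                bibcode  TEXT REFERENCES paper(bibcode),
                category TEXT,                 -- e.g. facility, instrument, target type
                value    TEXT
            );
        """)
        con.execute("INSERT INTO paper VALUES (?, ?, ?)",
                    ("PLACEHOLDER.BIBCODE", "Example interferometry paper", 2010))
        con.execute("INSERT INTO tag VALUES (?, ?, ?)",
                    ("PLACEHOLDER.BIBCODE", "facility", "VLTI"))
        # The kind of statistic the record mentions, e.g. papers per facility:
        rows = con.execute("""SELECT value, COUNT(*) FROM tag
                              WHERE category = 'facility' GROUP BY value""").fetchall()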

  11. Environmental baselines: preparing for shale gas in the UK

    Science.gov (United States)

    Bloomfield, John; Manamsa, Katya; Bell, Rachel; Darling, George; Dochartaigh, Brighid O.; Stuart, Marianne; Ward, Rob

    2014-05-01

    Groundwater is a vital source of freshwater in the UK. It provides almost 30% of public water supply on average, but locally, for example in south-east England, it constitutes nearly 90% of public supply. In addition to public supply, groundwater has a number of other uses including agriculture, industry, and food and drink production. It is also vital for maintaining river flows especially during dry periods and so is essential for maintaining ecosystem health. Recently, there have been concerns expressed about the potential impacts of shale gas development on groundwater. The UK has abundant shales and clays which are currently the focus of considerable interest and there is active research into their characterisation, resource evaluation and exploitation risks. The British Geological Survey (BGS) is undertaking research to provide information to address some of the environmental concerns related to the potential impacts of shale gas development on groundwater resources and quality. The aim of much of this initial work is to establish environmental baselines, such as a baseline survey of methane occurrence in groundwater (National methane baseline study) and the spatial relationships between potential sources and groundwater receptors (iHydrogeology project), prior to any shale gas exploration and development. The poster describes these two baseline studies and presents preliminary findings. BGS are currently undertaking a national survey of baseline methane concentrations in groundwater across the UK. This work will enable any potential future changes in methane in groundwater associated with shale gas development to be assessed. Measurements of methane in potable water from the Cretaceous, Jurassic and Triassic carbonate and sandstone aquifers are variable and reveal methane concentrations of up to 500 micrograms per litre, but the mean value is relatively low at 2km. The geological modelling process will be presented and discussed along with maps combining

  12. Association of Fetal Heart Rate Baseline Change and Neonatal Outcomes.

    Science.gov (United States)

    Yang, Michael; Stout, Molly J; López, Julia D; Colvin, Ryan; Macones, George A; Cahill, Alison G

    2017-07-01

    Objective The objective of this study was to describe the incidence of baseline change within normal range during labor and its prediction of neonatal outcomes. Materials and Methods This was a prospective cohort of singleton, nonanomalous, term neonates with continuous electronic fetal monitoring and a normal baseline fetal heart rate throughout the last 2 hours of labor. We determined the baseline in 10-minute segments using Eunice Kennedy Shriver National Institute of Child Health and Human Development criteria. We evaluated baseline changes of ≥ 20 and ≥ 30 bpm for association with acidemia (umbilical cord arterial pH ≤ 7.10) and neonatal intensive care unit (NICU) admission. Finally, we performed a sensitivity analysis of normal neonates, excluding those with acidemia, NICU admission, or 5-minute Apgar bpm; 272 (9.0%) had ≥ 30 bpm. Among normal neonates (n = 2,939), 1,221 (41.5%) had change ≥ 20 bpm. Acidemia was not associated with baseline change of any direction or magnitude. NICU admission was associated with decrease ≥ 20 bpm (adjusted odds ratio [aOR]: 2.93; 95% confidence interval [CI]: 1.19-7.21) or any direction ≥ 20 bpm (aOR: 4.06; 95% CI: 1.46-11.29). For decrease ≥ 20 bpm, sensitivity and specificity were 40.0 and 81.7%; for any direction ≥ 20 bpm, 75.0 and 58.3%. Conclusion Changes of normal baseline are common in term labor and poorly predict morbidity, regardless of direction or magnitude.

  13. Baseline-Dependent Responses of Soil Organic Carbon Dynamics to Climate and Land Disturbances

    Directory of Open Access Journals (Sweden)

    Zhengxi Tan

    2013-01-01

    Full Text Available Terrestrial carbon (C) sequestration through optimizing land use and management is widely considered a realistic option to mitigate the global greenhouse effect. But how the responses of individual ecosystems to changes in land use and management are related to baseline soil organic C (SOC) levels still needs to be evaluated at various scales. In this study, we modeled SOC dynamics within both natural and managed ecosystems in North Dakota, United States, and found that the average SOC stock in the top 20 cm of soil was lost at a rate of 450 kg C ha−1 yr−1 in cropland and 110 kg C ha−1 yr−1 in grassland between 1971 and 1998. Since 1998, the study area had become a SOC sink at a rate of 44 kg C ha−1 yr−1. The annual rate of SOC change in all types of lands substantially depends on the magnitude of initial SOC contents, but such dependency varies more with climatic variables within natural ecosystems and with management practices within managed ecosystems. Additionally, soils with high baseline SOC stocks tend to be C sources following any land surface disturbances, whereas soils having low baseline C contents likely become C sinks following conservation management.

  14. Multiadaptive Bionic Wavelet Transform: Application to ECG Denoising and Baseline Wandering Reduction

    Directory of Open Access Journals (Sweden)

    Sayadi Omid

    2007-01-01

    Full Text Available We present a new modified wavelet transform, called the multiadaptive bionic wavelet transform (MABWT), that can be applied to ECG signals in order to remove noise from them under a wide range of noise variations. By using the definition of the bionic wavelet transform and adaptively determining both the center frequency of each scale and the T-function, the problem of desired signal decomposition is solved. Applying a newly proposed thresholding rule works successfully in denoising the ECG. Moreover, by using the multiadaptation scheme, lowpass noisy interference effects on the ECG baseline are removed as a direct task. The method was extensively tested with real and simulated clinical ECG signals, showing high noise-reduction performance, comparable to that of the wavelet transform (WT). Quantitative evaluation of the proposed algorithm shows that the average SNR improvement of MABWT is 1.82 dB more than the WT-based results, for the best case. The procedure has also proved largely advantageous over wavelet-based methods for baseline wander cancellation, including both DC components and baseline drifts.
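
    The MABWT's adaptive centre frequencies and thresholding rule are not given in this record; the sketch below shows only the conventional wavelet-threshold denoising that the record uses as its comparison point, with a crude suppression of baseline wander by zeroing the coarsest approximation. The wavelet, decomposition level and universal threshold are illustrative assumptions (Python, using the PyWavelets package):

        import numpy as np
        import pywt   # PyWavelets

        def wt_denoise(ecg, wavelet="db4", level=6):
            """Soft-threshold the detail coefficients and drop the coarsest
            approximation, which carries most of the slow baseline wander."""
            ecg = np.asarray(ecg, dtype=float)
            coeffs = pywt.wavedec(ecg, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
            thr = sigma * np.sqrt(2 * np.log(len(ecg)))         # universal threshold
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            coeffs[0] = np.zeros_like(coeffs[0])                # crude baseline-wander removal
            return pywt.waverec(coeffs, wavelet)[: len(ecg)]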

  15. Detecting dark energy in long baseline neutrino oscillations

    Institute of Scientific and Technical Information of China (English)

    GU Pei-Hong; BI Xiao-Jun; FENG Bo; YOUNG Bing-Lin; ZHANG Xin-Min

    2008-01-01

    In this paper, we discuss a possibility of studying properties of dark energy in long baseline neutrino oscillation experiments. We consider two types of models of neutrino dark energy. For one type of models the scalar field is taken to be quintessence-like and for the other phantom-like. In these models the scalar fields couple to the neutrinos to give rise to spatially varying neutrino masses. We will show that the two types of models predict different behaviors of the spatial variation of the neutrino masses inside the Earth and consequently result in different signals in long baseline neutrino oscillation experiments.

  16. Future long-baseline neutrino oscillations: View from Asia

    Energy Technology Data Exchange (ETDEWEB)

    Hayato, Yoshinari [Kamioka Observatory, ICRR, The University of Tokyo (Japan)

    2015-07-15

    Accelerator-based long-baseline neutrino oscillation experiments have been playing important roles in revealing the nature of neutrinos. However, it turned out that the current experiments are not sufficient to study two major remaining problems, the CP violation in the lepton sector and the mass hierarchy of neutrinos. Therefore, several new experiments have been proposed. Among them, two accelerator-based long-baseline neutrino oscillation experiments have been proposed in Asia: the J-PARC neutrino beam with Hyper-Kamiokande, and MOMENT. These two projects are reviewed in this article.

  17. Solar central electric power generation - A baseline design

    Science.gov (United States)

    Powell, J. C.

    1976-01-01

    The paper presents the conceptual technical baseline design of a solar electric power plant using the central receiver concept, and derives credible cost estimates from the baseline design. The major components of the plant - heliostats, tower, receiver, tower piping, and thermal storage - are discussed in terms of technical and cost information. The assumed peak plant output is 215 MW(e), over 4000 daylight hours. The contribution of total capital investment to energy cost is estimated to be about 55 mills per kwh in mid-1974 dollars.

  18. Intermediate baseline appearance experiments and three-neutrino mixing schemes

    CERN Document Server

    Cardall, Christian Y.; Fuller, George M.; Cline, David

    1997-01-01

    Three-neutrino mixing schemes suggested by Cardall & Fuller and Acker & Pakvasa are compared and contrasted. Both of these schemes seek to solve the solar and atmospheric neutrino problems and to account for the possible neutrino oscillation signal in the LSND experiment. These neutrino oscillation schemes have different atmospheric and solar neutrino signatures that will be discriminated by Super-Kamiokande and SNO. They will also have different signatures in proposed long-baseline accelerator and reactor experiments. In particular, both of these schemes would give dramatic (and dramatically different) signals in an "intermediate baseline" experiment, such as the proposed ICARUS detector in the Jura mountains 17 km from CERN.

  19. Subtracting Technique of Baselines for Capillary Electrophoresis Signals

    Institute of Scientific and Technical Information of China (English)

    WANG Ying; MO Jin-yuan; CHEN Zuan-guang; GAO Yan

    2004-01-01

    The drifting baselines of capillary electrophoresis greatly affect the accuracy of analysis. This paper presents the Threshold Fitting Technique (TFT), which subtracts the baselines from the original signals and thereby corrects them. In TFT, wavelet and curve fitting techniques are applied together, and thresholds are decided automatically by the computer. Many signal processing experiments indicate that TFT is simple to use, involves few operator-dependent factors, and gives satisfactory results. TFT can be applied to noisy signals without any pre-processing.

  20. EQUIVALENT BASELINE AND INTERFEROMETRIC PHASE OF CLUSTER SATELLITE SAR

    Institute of Scientific and Technical Information of China (English)

    Gong Min; Zhang Chuanwu; Huang Shunji

    2005-01-01

    The change of the equivalent baseline and interferometric phase of cluster SAR satellites is analyzed for the case in which the constellation circles the Earth while the satellites rotate around the cluster center at the same time. The letter provides an assessment of the baseline error and phase error which influence the precision of height measurement in the across-track interferometric mode. The mathematical model of cluster satellite movement is built, and simulation analyses and the curve of height error are presented. The simulation results show that the height measurement error can be compensated by the formulae derived in this letter; therefore, the Digital Elevation Models (DEMs) are recovered accurately.
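
    The letter's own compensation formulae are not reproduced in this record; for orientation, the standard across-track InSAR sensitivities below (textbook relations quoted as an assumption about the relevant error propagation, with slant range r, look angle θ, wavelength λ, perpendicular baseline B⊥ and p = 1 for single-pass or p = 2 for repeat-pass operation) show how phase and baseline errors map into height error:

        \Delta h_{\varphi} \approx \frac{\lambda\, r \sin\theta}{2\pi\, p\, B_{\perp}}\,\Delta\varphi ,
        \qquad
        \Delta h_{B} \approx -\,\frac{\Delta B_{\perp}}{B_{\perp}}\, h ,

    where h is the height relative to the reference surface, so to first order a fractional error in the perpendicular baseline produces the same fractional error in the reconstructed relative height.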

  1. Addendum to the 2015 Eastern Interconnect Baselining and Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    This report serves as an addendum to the 2015 Eastern Interconnect Baselining and Analysis Report (Amidan, Follum, and Freeman, 2015). This addendum investigates the following: the impact of shorter record lengths and of adding a daily regularization term to the date/time models for angle-pair measurements; further development of a method to monitor the trend in phase angle pairs; the effect of changing the length of time used to determine a baseline when calculating atypical events; and a comparison between quantitatively discovered atypical events and actual events.

  2. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  3. Recognizing the Political in Implementation Research

    Science.gov (United States)

    McDonnell, Lorraine M.; Weatherford, M. Stephen

    2016-01-01

    The widely publicized opposition to the implementation of the Common Core State Standards (CCSS) is in marked contrast to its relatively uncontroversial development and adoption--a contrast that points to the importance of understanding how the politics of enactment differs from the politics of implementation. In this article, we draw on the…

  4. Unintended cultivation, shifting baselines, and conflict between objectives for fisheries and conservation.

    Science.gov (United States)

    Brown, Christopher J; Trebilco, Rowan

    2014-06-01

    The effects of fisheries on marine ecosystems, and their capacity to drive shifts in ecosystem states, have been widely documented. Less well appreciated is that some commercially valuable species respond positively to fishing-induced ecosystem change and can become important fisheries resources in modified ecosystems. Thus, the ecological effects of one fishery can unintentionally increase the abundance and productivity of other fished species (i.e., cultivate). We reviewed examples of this effect in the peer-reviewed literature. We found 2 underlying ecosystem drivers of the effect: trophic release of prey species when predators are overfished and habitat change. Key ecological, social, and economic conditions required for one fishery to unintentionally cultivate another include strong top-down control of prey by predators, the value of the new fishery, and the capacity of fishers to adapt to a new fishery. These unintended cultivation effects imply strong trade-offs between short-term fishery success and conservation efforts to restore ecosystems toward baseline conditions because goals for fisheries and conservation may be incompatible. Conflicts are likely to be exacerbated if fisheries baselines shift relative to conservation baselines and there is investment in the new fishery. However, in the long-term, restoration toward ecosystem baselines may often benefit both fishery and conservation goals. Unintended cultivation can be identified and predicted using a combination of time-series data, dietary studies, models of food webs, and socioeconomic data. Identifying unintended cultivation is necessary for management to set compatible goals for fisheries and conservation. © 2014 Society for Conservation Biology.

  5. The application of long-lived bivalve sclerochronology in environmental baseline monitoring

    Directory of Open Access Journals (Sweden)

    Juliane Steinhardt

    2016-09-01

    Full Text Available Assessments of the impact of construction, operation and removal of large infrastructures and other human activities on the marine environment are limited because they do not fully quantify the background baseline conditions and relevant scales of natural variability. Baselines as defined in Environmental Impact Assessments typically reflect the status of the environment and its variability drawn from published literature and augmented with some short term site specific characterization. Consequently, it can be difficult to determine whether a change in the environment subsequent to industrial activity is within or outside the range of natural background variability representative of an area over decades or centuries. An innovative approach that shows some promise in overcoming the limitations of traditional baseline monitoring methodology involves the analysis of shell material (sclerochronology from molluscs living upon or within the seabed in potentially affected areas. Bivalves especially can be effective biomonitors of their environment over a wide range of spatial and temporal scales. A rapidly expanding body of research has established that numerous characteristics of the environment can be reflected in morphological and geochemical properties of the carbonate shell material in bivalve shells, as well as in functional responses such as growth rates. In addition, the annual banding pattern in shells can provide an absolute chronometer of environmental variability and/or industrial effects. Further, some species of very long-lived bivalves can be crossdated back in time, like trees, by comparing the annual banding patterns in their shells. It is therefore feasible to develop extended timeseries of certain marine environmental variables that can provide important insights into long temporal scales of baseline variability. We review recent innovative work on the shell structure, morphology and geochemistry of bivalves and conclude that they

  6. ECG baseline wander reduction using linear phase filters

    NARCIS (Netherlands)

    Alsté, van J.A.; Eck, van W.; Hermann, O.E.

    1986-01-01

    The continuous real-time reduction of baseline wander is a considerable problem in electrocardiography during exercise. Our solution consists of spectral filtering. The legitimacy of high-pass filtering of the ECG by means of digital linear phase filters with a low cut-off frequency as high as the
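
    A minimal sketch of linear-phase FIR high-pass filtering of an ECG in this spirit; the sampling rate, the 0.67 Hz cut-off and the filter length are illustrative assumptions, not values from the paper (Python):

        import numpy as np
        from scipy.signal import firwin, lfilter

        def remove_baseline_wander(ecg, fs=500.0, cutoff=0.67, numtaps=1001):
            """High-pass filter the ECG with a linear-phase FIR filter.  An odd number
            of taps keeps the group delay an integer number of samples, (numtaps-1)/2,
            which is compensated here so the output stays aligned with the input."""
            ecg = np.asarray(ecg, dtype=float)
            taps = firwin(numtaps, cutoff, fs=fs, pass_zero=False)   # high-pass design
            delay = (numtaps - 1) // 2
            padded = np.concatenate([ecg, np.zeros(delay)])
            return lfilter(taps, 1.0, padded)[delay:]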

  7. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    None

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  8. IEA Wind Task 26: Offshore Wind Farm Baseline Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Gavin [Offshore Renewable Energy Catapult, Blyth, Northumberland (United Kingdom); Smith, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sperstad, Iver Bakken [SINTEF Energy Research, Trondheim (Norway); Prinsen, Bob [Ecofys, Utrecht (Netherlands). TKI Wind Op Zee; Lacal-Arantegui, Roberto [European Commission Joint Research Centre (JRC), Brussels (Belgium)

    2016-06-02

    This document has been produced to provide the definition and rationale for the Baseline Offshore Wind Farm established within IEA Wind Task 26--Cost of Wind Energy. The Baseline has been developed to provide a common starting point for country comparisons and sensitivity analysis on key offshore wind cost and value drivers. The baseline project reflects an approximate average of the characteristics of projects installed between 2012 and 2014, with the project life assumed to be 20 years. The baseline wind farm is located 40 kilometres (km) from construction and operations and maintenance (O&M) ports and from export cable landfall. The wind farm consists of 100 4-megawatt (MW) wind turbines mounted on monopile foundations in an average water depth of 25 metres (m), connected by 33-kilovolt (kV) inter-array cables. The arrays are connected to a single offshore substation (33kV/220kV) mounted on a jacket foundation, with the substation connected via a single 220kV export cable to an onshore substation, 10km from landfall. The wind farm employs a port-based O&M strategy using crew-transfer vessels.

  9. Revised SRC-I project baseline. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    The SRC Process Area Design Baseline consists of six volumes. The first four were submitted to DOE on 9 September 1981. The fifth volume, summarizing the Category A Engineering Change Proposals (ECPs), was not submitted. The sixth volume, containing proprietary information on Kerr-McGee's Critical Solvent Deashing System, was forwarded to BRHG Synthetic Fuels, Inc. for custody, according to past instructions from DOE, and is available for perusal by authorized DOE representatives. DOE formally accepted the Design Baseline under ICRC Release ECP 4-1001, at the Project Configuration Control Board meeting in Oak Ridge, Tennessee on 5 November 1981. The documentation was then revised by Catalytic, Inc. to incorporate the Category B and C and Post-Baseline Engineering Change Proposals. Volumes I through V of the Revised Design Baseline, dated 22 October 1982, are nonproprietary and they were issued to the DOE via Engineering Change Notice (ECN) 4-1 on 23 February 1983. Volume VI again contains proprietary information on Kerr-McGee's Critical Solvent Deashing System; it was issued to Burns and Roe Synthetic Fuels, Inc. Subsequently, updated process descriptions, utility summaries, and errata sheets were issued to the DOE and Burns and Roe Synthetic Fuels, Inc. on nonproprietary Engineering Change Notices 4-2 and 4-3 on 24 May 1983.

  10. The Dutch CAFE baseline: In or out of line?

    NARCIS (Netherlands)

    Jimmink BA; Folkert RJM; Thomas R; Beck JP; Eerdt MM van; Elzenga HE; Hoek KW van der; Hoen A; Peek CJ; LED; KMD; NMD; LVM; RIM; LDL

    2004-01-01

    The European Commission is constructing a strategy on air pollution within the Clean Air For Europe (CAFE) programme. This strategy will be based on assessments using the RAINS model for different policy ambitions where the CAFE baseline scenario and control strategies are employed. The Netherlands

  11. Emergency Response Capability Baseline Needs Assessment - Requirements Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2016-10-04

    This document was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by LLNL Emergency Management Department Head James Colson. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response.

  12. An Overview of the 2014 ALMA Long Baseline Campaign

    CERN Document Server

    Partnership, ALMA; Vlahakis, C; Corder, S; Remijan, A; Barkats, D; Lucas, R; Hunter, T R; Brogan, C L; Asaki, Y; Matsushita, S; Dent, W R F; Hills, R E; Phillips, N; Richards, A M S; Cox, P; Amestica, R; Broguiere, D; Cotton, W; Hales, A S; Hiriart, R; Hirota, A; Hodge, J A; Impellizzeri, C M V; Kern, J; Kneissl, R; Liuzzo, E; Marcelino, N; Marson, R; Mignano, A; Nakanishi, K; Nikolic, B; Perez, J E; Pérez, L M; Toledo, I; Aladro, R; Butler, B; Cortes, J; Cortes, P; Dhawan, V; Di Francesco, J; Espada, D; Galarza, F; Garcia-Appadoo, D; Guzman-Ramirez, L; Humphreys, E M; Jung, T; Kameno, S; Laing, R A; Leon, S; Mangum, J; Marconi, G; Nagai, H; Nyman, L -A; Perley, R; Radiszcz, M; Rodón, J A; Sawada, T; Takahashi, S; Tilanus, R P J; van Kempen, T; Vilaro, B Vila; Watson, L C; Wiklind, T; Gueth, F; Tatematsu, K; Wootten, A; Castro-Carrizo, A; Chapillon, E; Dumas, G; de Gregorio-Monsalvo, I; Francke, H; Gallardo, J; Garcia, J; Gonzalez, S; Hibbard, J E; Hill, T; Kaminski, T; Karim, A; Krips, M; Kurono, Y; Lopez, C; Martin, S; Maud, L; Morales, F; Pietu, V; Plarre, K; Schieven, G; Testi, L; Videla, L; Villard, E; Whyborn, N; Zwaan, M A; Alves, F; Andreani, P; Avison, A; Barta, M; Bedosti, F; Bendo, G J; Bertoldi, F; Bethermin, M; Biggs, A; Boissier, J; Brand, J; Burkutean, S; Casasola, V; Conway, J; Cortese, L; Dabrowski, B; Davis, T A; Trigo, M Diaz; Fontani, F; Franco-Hernandez, R; Fuller, G; Madrid, R Galvan; Giannetti, A; Ginsburg, A; Graves, S F; Hatziminaoglou, E; Hogerheijde, M; Jachym, P; Serra, I Jimenez; Karlicky, M; Klaasen, P; Kraus, M; Kunneriath, D; Lagos, C; Longmore, S; Leurini, S; Maercker, M; Magnelli, B; Vidal, I Marti; Massardi, M; Maury, A; Muehle, S; Muller, S; Muxlow, T; O'Gorman, E; Paladino, R; Petry, D; Pineda, J; Randall, S; Richer, J S; Rossetti, A; Rushton, A; Rygl, K; Monge, A Sanchez; Schaaf, R; Schilke, P; Stanke, T; Schmalzl, M; Stoehr, F; Urban, S; van Kampen, E; Vlemmings, W; Wang, K; Wild, W; Yang, Y; Iguchi, S; Hasegawa, T; Saito, M; Inatani, J; Mizuno, N; Asayama, S; Kosugi, G; Morita, K -I; Chiba, K; Kawashima, S; Okumura, S K; Ohashi, N; Ogasawara, R; Sakamoto, S; Noguchi, T; Huang, Y -D; Liu, S -Y; Kemper, F; Koch, P M; Chen, M -T; Chikada, Y; Hiramatsu, M; Iono, D; Shimojo, M; Komugi, S; Kim, J; Lyo, A -R; Muller, E; Herrera, C; Miura, R E; Ueda, J; Chibueze, J; Su, Y -N; Trejo-Cruz, A; Wang, K -S; Kiuchi, H; Ukita, N; Sugimoto, M; Kawabe, R; Hayashi, M; Miyama, S; Ho, P T P; Kaifu, N; Ishiguro, M; Beasley, A J; Bhatnagar, S; Braatz, J A; Brisbin, D G; Brunetti, N; Carilli, C; Crossley, J H; D'Addario, L; Meyer, J L Donovan; Emerson, D T; Evans, A S; Fisher, P; Golap, K; Griffith, D M; Hale, A E; Halstead, D; Hardy, E J; Hatz, M C; Holdaway, M; Indebetouw, R; Jewell, P R; Kepley, A A; Kim, D -C; Lacy, M D; Leroy, A K; Liszt, H S; Lonsdale, C J; Matthews, B; McKinnon, M; Mason, B S; Moellenbrock, G; Moullet, A; Myers, S T; Ott, J; Peck, A B; Pisano, J; Radford, S J E; Randolph, W T; Venkata, U Rao; Rawlings, M; Rosen, R; Schnee, S L; Scott, K S; Sharp, N K; Sheth, K J; Simon, R S; Tsutsumi, T; Wood, S J

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ~15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from September to late November 2014, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long...

  13. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M., E-mail: hayden.rampadarath@icrar.org [International Centre for Radio Astronomy Research, Curtin University, GPO Box U1987, Perth, WA (Australia)

    2012-08-15

    The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and by using automated data analysis techniques were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW Hz⁻¹ on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.

  14. Moon-Based INSAR Geolocation and Baseline Analysis

    Science.gov (United States)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    The Earth observation platform hosts the sensors, and its characteristics to a large extent determine the capability for Earth observation. Currently most platforms under development are satellites; by contrast, carrying out systematic observations from a Moon-based Earth observation platform is still a new concept. The Moon is Earth's only natural satellite and the only one that humans have reached, and it offers a different perspective when the Earth is observed with sensors placed on the Moon. Moon-based InSAR (SAR interferometry), an important Earth observation technology, has all-day, all-weather observation capability, but its particular characteristics still need to be analysed. This article discusses the key issues of geometric positioning and of the baseline parameters of Moon-based InSAR. Based on ephemeris data, the position, libration and attitude of the Earth and the Moon are obtained, and the position of the Moon-based SAR sensor is derived by coordinate transformation from a fixed selenocentric coordinate system to a terrestrial coordinate system; together with the range-Doppler equation, the positioning model is analysed. After establishing the Moon-based InSAR baseline equation, the different baseline errors are analysed, and the influence of the Moon-based InSAR baseline on Earth observation applications is derived.

  15. Automated baseline change detection phase I. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
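
    As a rough illustration of the image-subtraction principle described above, the following Python sketch flags pixels whose absolute difference between a registered baseline image and a current inspection image exceeds a threshold. The array sizes, the threshold value, and the simulated defect are illustrative assumptions, not parameters of the ABCD system.

```python
# Minimal sketch of absolute change detection by image subtraction, assuming
# the current and baseline images are already precisely registered.
import numpy as np

def change_mask(baseline_img, current_img, threshold=25):
    """Return a boolean mask of pixels whose absolute difference exceeds threshold."""
    diff = np.abs(current_img.astype(np.int16) - baseline_img.astype(np.int16))
    return diff > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    baseline_img = rng.integers(0, 256, size=(240, 320), dtype=np.uint8)
    current_img = baseline_img.copy()
    current_img[100:120, 150:180] = 255        # simulate a new visible defect
    mask = change_mask(baseline_img, current_img)
    print("changed pixels:", int(mask.sum()))
```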

  16. Attendance at Health Promotion Programs: Baseline Predictors and Program Outcomes.

    Science.gov (United States)

    Atkins, Catherine J.; And Others

    1990-01-01

    As part of a family cardiovascular health promotion project, 111 Mexican-American and 95 Anglo-American families with fifth or sixth grade children were assigned to either a primary prevention program involving 18 sessions or to a control condition. Correlates of attendance were low baseline scores on physical activity and cardiovascular fitness…

  17. International Space Station EXPRESS Pallet. Ground Demonstration Baseline Design Review

    Science.gov (United States)

    Schaffer, James R.

    1995-01-01

    This publication is comprised of the viewgraphs from the presentations of the EXPRESS Pallet Baseline Design Review meeting held July 20, 1995. Individual presentations addressed general requirements and objectives; mechanical, electrical, and data systems; software; operations and KSC (Kennedy Space Center) integration; payload candidates; thermal considerations; ground vs. flight demo; and recommended actions.

  18. Magical properties of 2540 km baseline Superbeam Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Umasankar, Sankagiri; Raut, Sushant [Department of Physics, Indian Institute of Technology - I.I.T. Bombay, Mumbai, 400076 Maharashtra (India); Singh, Ravi Shanker [Department of Physics, Barus-Holley building, 184 Hope Street, Brown University, Box 1843, Providence, RI 02912 (United States)

    2010-07-01

    The determination of the neutrino mixing matrix and mass-squared differences is one of the aims of neutrino physics today. This is a complicated affair, owing to the various parameter degeneracies. While the proposed 7500 km long 'magic baseline' experiment simplifies the task considerably, the intense beam required for such an experiment seems futuristic by current standards. As an alternative, we highlight the 'magical' properties of the 2540 km baseline. We propose a Superbeam experiment at this much shorter baseline with a narrow band NuMI-like beam, and demonstrate the ability of this single setup to distinguish between the two mass hierarchies. This, we show, is possible with a moderate exposure and by running the experiment in the neutrino mode only. Our results hold up to fairly small values of the mixing angle {theta}{sub 13} and irrespective of the value of the CP violating parameter. Unlike the magic baseline, it may also be possible to use this setup to measure CP violation in neutrino oscillation experiments. (authors)

  19. Baseline correction of intraoperative electromyography using discrete wavelet transform.

    Science.gov (United States)

    Rampp, Stefan; Prell, Julian; Thielemann, Henning; Posch, Stefan; Strauss, Christian; Romstöck, Johann

    2007-08-01

    In intraoperative analysis of electromyographic signals (EMG) for monitoring purposes, baseline artefacts frequently pose considerable problems. Since artefact sources in the operating room can only be reduced to a limited degree, signal-processing methods are needed to correct the registered data online without major changes to the relevant data itself. We describe a method for baseline correction based on the "discrete wavelet transform" (DWT) and evaluate its performance compared to commonly used digital filters. EMG data from 10 patients who underwent removal of acoustic neuromas were processed. Effectiveness, preservation of relevant EMG patterns and processing speed of the DWT-based correction method were assessed and compared to a range of commonly used Butterworth, Resistor-Capacitor and Gaussian filters. Butterworth and DWT filters showed better performance regarding artefact correction and pattern preservation compared to Resistor-Capacitor and Gaussian filters. Assuming equal weighting of both characteristics, DWT outperformed the other methods: while Butterworth, Resistor-Capacitor and Gaussian filters provided good pattern preservation, their effectiveness was low, and vice versa, whereas DWT baseline correction at level 6 performed well in both characteristics. The DWT method allows reliable and efficient intraoperative baseline correction in real-time. It is superior to commonly used methods and may be crucial for intraoperative analysis of EMG data, for example for intraoperative assessment of facial nerve function.
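
    The record above does not include implementation details, but the general idea of wavelet-based baseline correction can be sketched as follows: decompose the signal, suppress the coarsest approximation band that carries the slow baseline drift, and reconstruct. The PyWavelets package, the db4 wavelet, decomposition level 6, and the synthetic signal below are assumptions for illustration only, not the parameters of the cited study.

```python
# Minimal sketch of DWT-based baseline correction on a synthetic EMG-like signal.
import numpy as np
import pywt

def dwt_baseline_correct(signal, wavelet="db4", level=6):
    """Remove slow baseline drift by zeroing the coarsest approximation band."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])        # approximation band ~ baseline drift
    corrected = pywt.waverec(coeffs, wavelet)
    return corrected[: len(signal)]             # trim possible padding sample

if __name__ == "__main__":
    fs = 2000.0                                 # assumed sampling rate in Hz
    t = np.arange(0, 2.0, 1.0 / fs)
    emg = 0.1 * np.random.randn(t.size)         # stand-in for EMG activity
    drift = 0.5 * np.sin(2 * np.pi * 0.5 * t)   # slow baseline artefact
    cleaned = dwt_baseline_correct(emg + drift)
    print("residual drift (RMS):", np.sqrt(np.mean((cleaned - emg) ** 2)))
```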

  20. The Dutch CAFE baseline: In or out of line?

    NARCIS (Netherlands)

    Jimmink BA; Folkert RJM; Thomas R; Beck JP; Eerdt MM van; Elzenga HE; Hoek KW van der; Hoen A; Peek CJ; LED; KMD; NMD; LVM; RIM; LDL

    2004-01-01

    The European Commission is constructing a strategy on air pollution within the Clean Air For Europe (CAFE) programme. This strategy will be based on assessments using the RAINS model for different policy ambitions where the CAFE baseline scenario and control strategies are employed. The Netherlands

  1. Baseline design of an OTEC pilot plantship. Volume C. Specifications

    Energy Technology Data Exchange (ETDEWEB)

    Glosten, L. R.; Bringloe, Thomas; Soracco, Dave; Fenstermacher, Earl; Magura, Donald; Sander, Olof; Richards, Dennis; Seward, Jerry

    1979-05-01

    Volume C is part of a three-volume report that presents a baseline engineering design of an Ocean Thermal Energy Conversion (OTEC) plantship. This volume provides the specifications for the hull, cold-water pipe, ship outfitting and machinery, OTEC power system, electrical system, and folded-tube heat exchangers.

  2. Delta Healthy Sprouts: Participants' Diet and Food Environment at Baseline

    Science.gov (United States)

    Local food environments influence the nutrition and health of area residents. This baseline analysis focuses on the food environments of women who participated in the Delta Healthy Sprouts project, a randomized, controlled, comparative trial designed to test the efficacy of two Maternal, Infant, an...

  3. Baseline Gas Turbine Development Program. Eleventh quarterly progress report

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, F.W.; Wagner, C.E.

    1975-07-31

    Progress is reported for a program to demonstrate by 1976 an experimental gas turbine powered automobile which meets the 1978 Federal Emissions Standards, has significantly improved fuel economy, and is competitive in performance, reliability, and potential manufacturing cost with the conventional piston engine powered, standard size American automobile. NASA completed initial heat balance testing of a baseline engine. An additional 450 hours were run on ceramic regenerators and seals. Seal wear rates are very good, and the elastomeric mounting system was satisfactory. An engine/control oil supply system based on the power steering pump is successfully operating in baseline vehicles. The design of the upgraded engine power turbine nozzle actuator was finalized, and layouts of the inlet guide vane actuator are in process. A lock-up torque converter was installed in the free rotor vehicle. Baseline engine and vehicle testing of water injection and variable inlet guide vanes was completed. A thermal analysis of the gas generator is in process. A steady-state, full power analysis was made. A three-dimensional stress analysis of the compressor cover was made. The power turbine nozzle actuating system layout was completed. The analytical studies of the power turbine rotor bearings were completed. MTI completed the design of the gas generator rotor simulation fixture and is starting to build it. Optimized reduction gears were successfully tested in a baseline engine.

  4. Revised SRC-I project baseline. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

  5. Measurement of western U.S. baseline ozone from the surface to the tropopause and assessment of downwind impact regions

    Science.gov (United States)

    Cooper, O. R.; Oltmans, S. J.; Johnson, B. J.; Brioude, J.; Angevine, W.; Trainer, M.; Parrish, D. D.; Ryerson, T. R.; Pollack, I.; Cullis, P. D.; Ives, M. A.; Tarasick, D. W.; Al-Saadi, J.; Stajner, I.

    2011-11-01

    Since 1997, baseline ozone monitoring from the surface to the tropopause along the U.S. west coast has been limited to the weekly ozonesondes from Trinidad Head, California. To explore baseline ozone at other latitudes, an ozonesonde network was implemented during spring 2010, including four launch sites along the California coast. Modeling indicated that North American pollution plumes impacted the California coast primarily below 3 km, but had no measurable impact on the average coastal ozone profiles. Vertical and latitudinal variation in free tropospheric baseline ozone appears to be partly explained by polluted and stratospheric air masses that descend isentropically along the west coast. Above 3 km, the dominant sources of ozone precursors were China and international shipping, while international shipping was the greatest source below 2 km. Approximately 8-10% of the baseline ozone that enters California in the 0-6 km range impacts the surface of the USA, but very little reaches the eastern USA. Within California, the major impact of baseline ozone above 2 km is on the high elevation terrain of eastern California. Baseline ozone below 2 km has its strongest impact on the low elevation sites throughout the state. To quantify ozone production within California we compared inland ozone measurements to baseline measurements. For average daytime conditions, we found no enhancements of lower tropospheric ozone in the northern Central Valley, but enhancements of 12-23% were found in the southern Central Valley. Enhancements above Joshua Tree were greater, 33-41%, while the greatest enhancements occurred over the LA Basin, 32-63%.

  6. Baseline of indicators for R&D and Innovation in ICT: a tool for decision-making, design and monitoring of public policies

    Energy Technology Data Exchange (ETDEWEB)

    Mora Holguin, H.; Lucio-Arias, D.; Zarate, S.; Castro, N.; Pardo, C.

    2016-07-01

    Development and implementation of sophisticated strategies to improve the competitiveness of sectors relies on precise monitoring of the sectors' dynamics and, particularly, of the evolution of capacities for generating scientific and technological development and innovation (STI). In a knowledge-based economy, non-technological innovation plays an important role due to the importance of information and knowledge management for individuals and organizations (OECD, 2011). According to the World Economic Forum, the role of ICT in stimulating economic growth and creating new employment opportunities for highly qualified personnel has never received as much attention as today and, as a result, it has become a common concern for researchers. The positive impact of ICT on the efficiency of firms has been widely acknowledged: ICT allows businessmen to optimize their firms' production and mobilize resources to other, more productive investments. ICTs are also regarded as an innovation source that can accelerate growth, favor technology adoption and adaptation, and promote technological change due to their effect in reducing transaction costs and minimizing the importance of geographical distance in innovation processes. As a result of the importance of ICTs and of monitoring STI capabilities, it is necessary to have updated and relevant statistical information that facilitates the design and monitoring of public policies for the sector. In Colombia, the lack of such information resulted in the initiative to create a baseline of indicators providing information on STI activities. The set of proposed indicators should be beneficial to the academic sector, the government, the industry and society in general. We briefly discuss the importance of the baseline and the methodology underlying its design and construction. (Author)

  7. ALMA Long Baseline Campaigns: Phase Characteristics of Atmosphere at Long Baselines in the Millimeter and Submillimeter Wavelengths

    Science.gov (United States)

    Matsushita, Satoki; Asaki, Yoshiharu; Fomalont, Edward B.; Morita, Koh-Ichiro; Barkats, Denis; Hills, Richard E.; Kawabe, Ryohei; Maud, Luke T.; Nikolic, Bojan; Tilanus, Remo P. J.; Vlahakis, Catherine; Whyborn, Nicholas D.

    2017-03-01

    We present millimeter- and submillimeter-wave phase characteristics measured between 2012 and 2014 during Atacama Large Millimeter/submillimeter Array long baseline campaigns. This paper presents the first detailed investigation of the characteristics of phase fluctuation and phase correction methods obtained with baseline lengths up to ∼15 km. The basic phase fluctuation characteristics can be expressed with the spatial structure function (SSF). Most of the SSFs show that the phase fluctuation increases as a function of baseline length, with a power-law slope of ∼0.6. In many cases, we find that the slope becomes shallower (average of ∼0.2–0.3) at baseline lengths longer than ∼1 km, namely showing a turn-over in the SSF. These power-law slopes do not change with the amount of precipitable water vapor (PWV), but the fitted constants have a weak correlation with PWV, so that the phase fluctuation at a baseline length of 10 km also increases as a function of PWV. The phase correction method using water vapor radiometers (WVRs) works well, especially for the cases where PWV > 1 mm, and reduces the degree of phase fluctuations by a factor of two in many cases. However, phase fluctuations still remain after the WVR phase correction, suggesting the existence of another turbulent constituent that causes phase fluctuations. This is supported by occasional SSFs that do not exhibit any turn-over; these are only seen when the PWV is low (i.e., when the WVR phase correction works less effectively) or after WVR phase correction. This means that the phase fluctuation caused by this turbulent constituent is inherently smaller than that caused by water vapor. Since in these rare cases there is no turn-over in the SSF up to the maximum baseline length of ∼15 km, this turbulent constituent must have a scale height of 10 km or more, and thus cannot be water vapor, whose scale height is around 1 km. Based on the characteristics, this large scale height turbulent constituent is
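
    The power-law behaviour of the spatial structure function described above can be illustrated with a simple log-log fit of rms phase against baseline length. The synthetic data, scatter model, and fitting choice in the sketch below are illustrative assumptions rather than the campaign's actual analysis.

```python
# Minimal sketch: fit a power-law slope to a synthetic phase structure function.
import numpy as np

rng = np.random.default_rng(2)
baselines_m = np.logspace(1.5, 4.2, 300)                    # ~30 m to ~15 km
true_slope = 0.6
rms_phase_deg = 5.0 * (baselines_m / 100.0) ** true_slope   # idealised SSF
rms_phase_deg *= np.exp(0.1 * rng.standard_normal(baselines_m.size))  # scatter

# Log-log linear fit recovers the power-law exponent.
slope, intercept = np.polyfit(np.log10(baselines_m), np.log10(rms_phase_deg), 1)
print(f"fitted SSF slope: {slope:.2f} (input {true_slope})")
```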

  8. Circular polarization control for the LCLS baseline in the soft X-ray regime

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-12-15

    The LCLS baseline includes a planar undulator system, which produces intense linearly polarized light in the wavelength range 0.15-1.5 nm. In the soft X-ray wavelength region polarization control from linear to circular is highly desirable for studying ultrafast magnetic phenomena and material science issues. Several schemes using helical undulators have been discussed in the context of the LCLS. One consists in replacing three of the last planar undulator segments by helical (APPLE III) ones. A second proposal, the 2nd harmonic helical afterburner, is based on the use of short, crossed undulators tuned to the second harmonic. This last scheme is expected to be the better one. Its advantages are a high (over 90%) and stable degree of circular polarization and a low cost. Its disadvantage is a small output power (1% of the power at the fundamental harmonic) and a narrow wavelength range. We propose a novel method to generate 10 GW level power at the fundamental harmonic with 99% degree of circular polarization from the LCLS baseline. Its merits are low cost, simplicity and easy implementation. In the option presented here, the microbunching of the planar undulator is used too. After the baseline undulator, the electron beam is sent through a 40 m long straight section, and subsequently passes through a short helical (APPLE II) radiator. In this case the microbunch structure is easily preserved, and intense coherent radiation is emitted in the helical radiator. The background radiation from the baseline undulator can be easily suppressed by letting the radiation and electron beam pass through horizontal and vertical slits upstream of the helical radiator, where the radiation spot size is about ten times larger than the electron bunch transverse size. Using thin Beryllium foils for the slits the divergence of the electron beam halo will increase by Coulomb scattering, but the beam will propagate through the setup without electron losses. The applicability of our method is not

  9. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

  10. Long-Baseline Neutrino Physics in the U.S

    Energy Technology Data Exchange (ETDEWEB)

    Kopp, Sacha E. [Department of Physics, University of Texas at Austin, 1 University Station C1600, Austin, Texas 78712 (United States)

    2007-06-15

    Long-baseline neutrino oscillation physics in the U.S. is centered at the Fermi National Accelerator Laboratory (FNAL), in particular at the Neutrinos at the Main Injector (NuMI) beamline commissioned in 2004-2005. Already, the MINOS experiment has published its first results confirming the disappearance of {nu}{sub {mu}}'s across a 735 km baseline. The forthcoming NO{nu}A experiment will search for the transition {nu}{sub {mu}}{yields}{nu}{sub e} and use this transition to understand the mass hierarchy of neutrinos. These, as well as other conceptual ideas for future experiments using the NuMI beam, will be discussed. The turn-on of the NuMI facility has been positive, with over 310 kW beam power achieved. Plans for increasing the beam intensity once the Main Injector accelerator is fully dedicated to the neutrino program will be presented.

  11. Coral reef baselines: how much macroalgae is natural?

    Science.gov (United States)

    Bruno, John F; Precht, William F; Vroom, Peter S; Aronson, Richard B

    2014-03-15

    Identifying the baseline or natural state of an ecosystem is a critical step in effective conservation and restoration. Like most marine ecosystems, coral reefs are being degraded by human activities: corals and fish have declined in abundance and seaweeds, or macroalgae, have become more prevalent. The challenge for resource managers is to reverse these trends, but by how much? Based on surveys of Caribbean reefs in the 1970s, some reef scientists believe that the average cover of seaweed was very low in the natural state: perhaps less than 3%. On the other hand, evidence from remote Pacific reefs, ecological theory, and impacts of over-harvesting in other systems all suggest that, historically, macroalgal biomass may have been higher than assumed. Uncertainties about the natural state of coral reefs illustrate the difficulty of determining the baseline condition of even well studied systems.

  12. Statistical Mechanics of Node-perturbation Learning with Noisy Baseline

    Science.gov (United States)

    Hara, Kazuyuki; Katahira, Kentaro; Okada, Masato

    2017-02-01

    Node-perturbation learning is a type of statistical gradient descent algorithm that can be applied to problems where the objective function is not explicitly formulated, including reinforcement learning. It estimates the gradient of an objective function by using the change in the objective function in response to a perturbation. The value of the objective function for an unperturbed output is called a baseline. Cho et al. proposed node-perturbation learning with a noisy baseline. In this paper, we report on building the statistical mechanics of Cho's model and on deriving coupled differential equations of order parameters that depict the learning dynamics. We also show how to derive the generalization error by solving the differential equations of order parameters. On the basis of the results, we show that Cho's results also apply in general cases and present some general performance properties of Cho's model.
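
    As a hedged illustration of node-perturbation learning with a noisy baseline, the sketch below perturbs the output of a single linear unit, estimates the gradient from the change of the objective relative to a baseline evaluation corrupted by noise, and updates the weights. The teacher-student setup and all constants are assumptions for illustration and are not taken from Cho et al. or the statistical-mechanical analysis of the paper.

```python
# Minimal sketch of node-perturbation learning with a noisy baseline
# (single linear output unit, squared-error objective).
import numpy as np

rng = np.random.default_rng(0)
n_in, sigma, eta, noise_std = 20, 0.1, 0.05, 0.01

w_true = rng.standard_normal(n_in)        # teacher weights
w = np.zeros(n_in)                        # student weights

def objective(w_vec, x, y_target, perturb=0.0):
    y = w_vec @ x + perturb               # perturbation applied at the output node
    return 0.5 * (y - y_target) ** 2

for step in range(5000):
    x = rng.standard_normal(n_in) / np.sqrt(n_in)
    y_target = w_true @ x
    xi = sigma * rng.standard_normal()                      # output-node perturbation
    baseline = objective(w, x, y_target) + noise_std * rng.standard_normal()  # noisy baseline
    perturbed = objective(w, x, y_target, perturb=xi)
    grad_est = (perturbed - baseline) * xi / sigma ** 2     # estimate of dJ/dy
    w -= eta * grad_est * x                                 # chain rule back to the weights

print("squared distance to teacher:", float(0.5 * np.sum((w - w_true) ** 2)))
```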

  13. Measurement of baseline and orientation between distributed aerospace platforms.

    Science.gov (United States)

    Wang, Wen-Qin

    2013-01-01

    Distributed platforms play an important role in aerospace remote sensing, radar navigation, and wireless communication applications. However, besides the requirement of highly accurate time and frequency synchronization for coherent signal processing, the baseline between the transmitting platform and the receiving platform and the orientation of the platforms towards each other must be measured in real time during data recording. In this paper, we propose an improved pulsed duplex microwave ranging approach, which allows determining the spatial baseline and orientation between distributed aerospace platforms by the proposed high-precision time-interval estimation method. This approach is novel in the sense that it cancels the effect of oscillator frequency synchronization errors due to the separate oscillators that are used in the platforms. Several performance specifications are also discussed. The effectiveness of the approach is verified by simulation results.

  14. Hepatitis C treatment response kinetics and impact of baseline predictors

    DEFF Research Database (Denmark)

    Lindh, M; Arnholm, B; Eilard, A

    2011-01-01

    Summary. The optimal duration of treatment for hepatitis C virus (HCV) infections is highly variable but critical for achieving cure (sustained virological response, SVR). We prospectively investigated the impact of age, fibrosis, baseline viraemia and genotype on the early viral kinetics...... above 400 000 IU/mL were strongly associated with slower second phase declines of HCV RNA. Genotype 2/3 infections responded more rapidly than genotype 1, reaching week 4 negativity (RVR) in 59%vs 22%. We conclude that baseline response predictors such as age, fibrosis and viral load were well reflected...... by the early viral kinetics as assessed by repeated HCV RNA quantifications. The kinetic patterns and the high relapse rate in genotype 2/3 patients without RVR suggest that this group might benefit from treatment durations longer than 24 weeks....

  15. NA61/SHINE Data For Long Baseline Neutrino Experiments

    CERN Document Server

    Hälser, Alexis

    2015-01-01

    Accelerator-based long baseline neutrino experiments require precise neutrino flux predictions to reach their physics goals. These experiments are commonly based on a set of two detectors. At the near detector, cross section measurements are performed and the neutrino flux can be observed before oscillation, while at the far detector the signal for neutrino oscillations is studied. Accurate knowledge of hadron production is mandatory in order to predict the neutrino fluxes. The NA61/SHINE facility at the CERN SPS has proven its ability to deliver high quality measurements of hadron production for the long baseline neutrino experiments. In this paper, the latest results from NA61/SHINE for the neutrino physics programme are reviewed and future plans are presented.

  16. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2016-01-01

    One of the greatest challenges for sensorimotor adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. This information could guide individually customized countermeasures, which would enable more efficient use of crew time and provide better outcomes. The principal aim of this work is to look for baseline performance metrics that relate to locomotor adaptability. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations ("noise") in motor performance, as a predictor of individual adaptive capabilities.

  17. Forecasting Sensorimotor Adaptability from Baseline Inter­-Trial Correlations

    Science.gov (United States)

    Beaton, Kara H.; Bloomberg, Jacob J.

    2016-01-01

    One of the greatest challenges for sensorimotor adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a one-size-fits-all approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. This information could guide individually customized countermeasures, which would enable more efficient use of crew time and provide better outcomes. The principal aim of this work is to look for baseline performance metrics that relate to locomotor adaptability. To date, a strong relationship has been found between baseline inter-trial correlations, the trial-to-trial fluctuations ("noise") in motor performance, and adaptability in two oculomotor systems (see Preliminary Results). We now propose an analogous predictive mechanism in the locomotor system.

  18. Configurations of the Long-Baseline Neutrino Experiment

    CERN Document Server

    Barger, Vernon; Chatterjee, Animesh; Gandhi, Raj; Marfatia, Danny; Masud, Mehedi

    2014-01-01

    We perform a comprehensive study of the ability of the Long-Baseline Neutrino Experiment (LBNE) to answer outstanding questions in the neutrino sector. We consider the sensitivities to the mass hierarchy, the octant of $\theta_{23}$ and to CP violation using data from beam and atmospheric neutrinos. We evaluate the dependencies on the precision with which $\theta_{13}$ will be measured by reactor experiments, on the detector size, beam power and exposure time, on detector magnetization, and on the systematic uncertainties achievable with and without a near detector. We find that a 35 kt LBNE with a near detector will resolve the eight-fold degeneracy that is intrinsic to long baseline experiments and will meet the primary goals of oscillation physics that it is designed for.

  19. Centimeter repeatability of the VLBI estimates of European baselines

    Science.gov (United States)

    Rius, Antonio; Zarraoa, Nestor; Sardon, Esther; Ma, Chopo

    1992-01-01

    In the last three years, the European Geodetic Very Long Baseline Interferometry (VLBI) Network has grown to a total of six fixed antennas placed in Germany, Italy, Spain and Sweden, all equipped with the standard geodetic VLBI instrumentation and data recording systems. During this period of time, several experiments have been carried out using this interferometer providing data of very high quality due to the excellent sensitivity and performance of the European stations. The purpose of this paper is to study the consistency of the VLBI geodetic results on the European baselines with respect to the different degrees of freedom in the analysis procedure. Used to complete this study were both real and simulated data sets, two different software packages (OCCAM 3.0 and CALC 7.4/SOLVE), and a variety of data analysis strategies.

  20. Fissile materials disposition program plutonium immobilization project baseline formulation

    Energy Technology Data Exchange (ETDEWEB)

    Ebbinghaus, B B; Armantrout, G A; Gray, L; Herman, C C; Shaw, H F; Van Konynenburg, R A

    2000-09-01

    Since 1994 Lawrence Livermore National Laboratory (LLNL), with the help of several other laboratories and university groups, has been the lead laboratory for the Plutonium Immobilization Project (PIP). This involves, among other tasks, the development of a formulation and a fabrication process for a ceramic to be used in the immobilization of excess weapons-usable plutonium. This report reviews the history of the project as it relates to the development of the ceramic form. It describes the sample test plan for the pyrochlore-rich ceramic formulation that was selected, and it specifies the baseline formulation that has been adopted. It also presents compositional specifications (e.g. precursor compositions and mixing recipes) and other form and process specifications that are linked or potentially linked to the baseline formulation.

  1. The Fermilab Short-Baseline Program: MicroBooNE

    Energy Technology Data Exchange (ETDEWEB)

    Schukraft, Anne [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The MicroBooNE experiment is the first of three detectors of the Fermilab short-baseline neutrino program that started operation in the Booster Neutrino Beamline in October 2015 [1]. When completed, the three-detector lineup will explore short-baseline neutrino oscillations and will be sensitive to sterile neutrino scenarios. MicroBooNE in itself is now starting its own physics program, with the measurement of neutrino-argon cross sections in the ~1GeV range being one of its main physics goals. These proceedings describe the status of the detector, the start of operation, and the automated reconstruction of the first neutrino events observed with MicroBooNE. Prospects for upcoming cross section measurements are also given.

  2. Re-Creating Missing Population Baselines for Pacific Reef Sharks

    OpenAIRE

    Marc O Nadon; Julia K. Baum; Ivor D Williams; Mcpherson, Jana M; Zgliczynski, Brian J.; Richards, Benjamin L.; Schroeder, Robert E.; Russell E Brainard

    2012-01-01

    Sharks and other large predators are scarce on most coral reefs, but studies of their historical ecology provide qualitative evidence that predators were once numerous in these ecosystems. Quantifying density of sharks in the absence of humans (baseline) is, however, hindered by a paucity of pertinent time-series data. Recently researchers have used underwater visual surveys, primarily of limited spatial extent or nonstandard design, to infer negative associations between ree...

  3. Baseline Gas Turbine Development Program. Fourteenth quarterly progress report

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, F W; Wagner, C E

    1976-04-30

    Progress is reported for a Baseline Gas Turbine Development Program sponsored by the Heat Engine Systems Branch, Division of Transportation Energy Conservation (TEC) of the Energy Research and Development Administration (ERDA). Structurally, this program is made up of three parts: (1) documentation of the existing automotive gas turbine state-of-the-art; (2) conduction of an extensive component improvement program; and (3) utilization of the improvements in the design, and building of an Upgraded Engine capable of demonstrating program goals.

  4. GPS dynamic cycle slip detection and correction with baseline constraint

    Institute of Scientific and Technical Information of China (English)

    Liu Zhenkun; Huang Ahunji

    2009-01-01

    When cycle slips take place in the attitude determination of a moving platform, the precision of the attitude will be badly impaired. A method of cycle slip detection and correction is proposed, which is suitable for dynamic measurement using the GPS carrier phase: cycle slip detection is first achieved with triple difference observables, and cycle slip correction is then performed with a baseline length constraint. The simulation results show that the proposed method is effective for the dynamic cycle slip problem.
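
    A minimal sketch of the epoch-differencing step is given below: cycle slips in double-differenced carrier phase show up as jumps in the triple (epoch-to-epoch) difference and can be repaired by subtracting the rounded integer jump. The detection threshold and the repair rule are illustrative assumptions; the baseline-length constraint used in the paper for the correction step is not reproduced here.

```python
# Minimal sketch: detect and repair cycle slips via triple differences.
import numpy as np

def detect_and_fix_slips(dd_phase_cycles, threshold=0.5):
    """dd_phase_cycles: double-differenced carrier phase per epoch, in cycles."""
    phase = np.array(dd_phase_cycles, dtype=float)
    triple_diff = np.diff(phase)                       # triple difference across epochs
    slip_epochs = np.where(np.abs(triple_diff) > threshold)[0] + 1
    for idx in slip_epochs:
        jump = np.round(triple_diff[idx - 1])          # assume the slip is an integer number of cycles
        phase[idx:] -= jump                            # repair all subsequent epochs
    return phase, slip_epochs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.cumsum(0.01 * rng.standard_normal(50))  # smooth geometry/noise term
    observed = clean.copy()
    observed[30:] += 3.0                               # simulated 3-cycle slip at epoch 30
    fixed, where = detect_and_fix_slips(observed)
    print("slips detected at epochs:", where, " max residual:", np.abs(fixed - clean).max())
```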

  5. A Resilient Program technical baseline framework for future space systems

    Science.gov (United States)

    Nguyen, Tien M.; Guillen, Andy T.; Matsunaga, Sumner S.

    2015-05-01

    The recent Better Buying Power (BBP) initiative for improving DoD's effectiveness in developing complex systems includes "Owning the Technical Baseline" (OTB). This paper presents an innovative approach for the development of a "Resilient Program" Technical Baseline Framework (PTBF). The framework provides a recipe for generating the "Resilient Program" Technical Baseline (PTB) components using the Integrated Program Management (IPM) approach to integrate Key Program Elements (KPEs) with System Engineering (SE) process/tools, acquisition policy/process/tools, Cost and Schedule estimating tools, DOD Architecture Framework (DODAF) process/tools, Open System Architecture (OSA) process/tools, Risk Management process/tools, Critical Chain Program Management (CCPM) process, and Earned Value Management System (EVMS) process/tools. The proposed resilient framework includes a matrix that maps the required tools/processes to technical features of a comprehensive reference U.S. DOD "owned" technical baseline. Resilient PTBF employs a new Open System Approach (OSAP) combining existing OSA and NOA (Naval Open Architecture) frameworks, supplemented by additional proposed OA (Open Architecture) principles. The new OSAP being recommended to SMC (Space and Missiles Systems Center) presented in this paper is referred to as SMC-OSAP. Resilient PTBF and SMC-OSAP conform to U.S. DOD Acquisition System (DAS), Joint Capabilities Integration and Development System (JCIDS), and DODAF processes. The paper also extends Ref. 21 on the "Program Resiliency" concept by describing how the new OSAP can be used to align SMC acquisition management with DOD BBP 3.0 and SMC's vision for resilient acquisition and sustainment efforts.

  6. Baseline Scotland : the Lower Devonian aquifer of Strathmore

    OpenAIRE

    O Dochartaigh, B.E.; Smedley, P. L.; MacDonald, A M; Darling, W. G.

    2006-01-01

    This report presents a summary of the groundwater chemistry of the Devonian sedimentary aquifer in Strathmore, eastern Scotland. The area covered by this study extends from Perth in the southwest to Stonehaven in the northeast. The survey forms part of the ongoing Baseline Scotland project. The Devonian sedimentary rocks of Strathmore form an important regional aquifer in an area of some of the most fertile agricultural land in Scotland, with a number of major urban settleme...

  7. Logistics Operations Management Center: Maintenance Support Baseline (LOMC-MSB)

    Science.gov (United States)

    Kurrus, R.; Stump, F.

    1995-01-01

    The Logistics Operations Management Center Maintenance Support Baseline is defined. A historical record of systems applied to, and deleted from, designs is provided in support of future management and/or technical analysis. All Flight elements, Ground Support Equipment, Facility Systems and Equipment and Test Support Equipment for which LOMC has responsibilities at Kennedy Space Center and other locations are listed. International Space Station Alpha Program documentation is supplemented. The responsibility of the Space Station Launch Site Support Office is established.

  8. A moving baseline for evaluation of advanced coal extraction systems

    Science.gov (United States)

    Bickerton, C. R.; Westerfield, M. D.

    1981-01-01

    Results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000 are reported. The systems used were selected from contemporary coal mining technology and from conservative conjectures of year-2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness and should not be extended to other seam thicknesses.

  9. Degeneracies in long-baseline neutrino experiments from nonstandard interactions

    CERN Document Server

    Liao, Jiajun; Whisnant, Kerry

    2016-01-01

    We study parameter degeneracies that can occur in long-baseline neutrino appearance experiments due to nonstandard interactions (NSI). For a single off-diagonal NSI parameter, and neutrino and antineutrino measurements at a single L/E, there exists a continuous four-fold degeneracy (related to the mass hierarchy and $\theta_{23}$ octant) that renders the mass hierarchy, octant, and CP phase unknowable. Even with a combination of NO$\

  10. Integrated Baseline System (IBS) Version 2.0: User guide

    Energy Technology Data Exchange (ETDEWEB)

    Bower, J.C. [Bower Software Services, Kennewick, WA (United States); Burford, M.J.; Downing, T.R.; Matsumoto, S.W.; Schrank, E.E.; Williams, J.R.; Winters, C.; Wood, B.M.

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the Federal Emergency Management Agency. This User Guide explains how to start and use the IBS Program, which is designed to help civilian emergency management personnel to plan for and support their responses to a chemical-releasing event at a military chemical stockpile. The intended audience for this document is all users of the IBS, especially emergency management planners and analysts.

  11. Linking solar and long baseline terrestrial neutrino experiments.

    Science.gov (United States)

    Akhmedov, E K; Branco, G C; Rebelo, M N

    2000-04-17

    We show that, in the framework of three light neutrino species with hierarchical masses and assuming no fine tuning between the entries of the neutrino mass matrix, one can use the solar neutrino data to obtain information on the element U(e3) of the lepton mixing matrix. Conversely, a measurement of U(e3) in atmospheric or long baseline accelerator or reactor neutrino experiments would help discriminate between possible oscillation solutions of the solar neutrino problem.

  12. Challenges of collecting baseline data in emergency settings

    Directory of Open Access Journals (Sweden)

    Jennifer Schlecht

    2007-12-01

    Full Text Available Although the humanitarian community acknowledges the need for good quality data in programme design and monitoring, the challenges and demands of field settings have too often led to the argument that "we just don't have time" or "it is too difficult". Yet without the allocation of time and resources to the collection of baseline and monitoring data, project activities cannot be grounded in strong evidence from programme evaluation.

  13. The long-period eccentric orbit of the particle accelerator HD 167971 revealed by long baseline interferometry

    Science.gov (United States)

    De Becker, M.; Sana, H.; Absil, O.; Le Bouquin, J.-B.; Blomme, R.

    2012-07-01

    Using optical long baseline interferometry, we resolved for the first time the two wide components of HD 167971, a candidate hierarchical triple system known to efficiently accelerate particles. Our multi-epoch Very Large Telescope Interferometer observations provide direct evidence for a gravitational link between the O8 supergiant and the close eclipsing O + O binary. The separation varies from 8 to 15 mas over the 3-year baseline of our observations, suggesting that the components evolve on a wide and very eccentric orbit (most probably e > 0.5). These results provide evidence that the wide orbit revealed by our study is not coplanar with the orbit of the inner eclipsing binary. From our measurements of the near-infrared luminosity ratio, we constrain the spectral classification of the components in the close binary to be O6-O7, and confirm that these stars are likely main-sequence objects. Our results are discussed in the context of the bright non-thermal radio emission already reported for this system, and we provide arguments in favour of a maximum radio emission coincident with periastron passage. HD 167971 turns out to be an efficient O-type particle accelerator that constitutes a valuable target for future high angular resolution radio imaging using Very Long Baseline Interferometry facilities. Based on observations collected at the European Southern Observatory, Paranal, Chile, under the programme IDs 381.D-0095, 086.D-0586 and 087.D-0264.

  14. Microservices Validation: Methodology and Implementation

    OpenAIRE

    Savchenko, D.; Radchenko, G.

    2015-01-01

    Due to the wide spread of cloud computing, questions about the architecture, design and implementation of cloud applications have become pressing. The microservice model describes the design and development of loosely coupled cloud applications when computing resources are provided on the basis of automated IaaS and PaaS cloud platforms. Such applications consist of hundreds and thousands of service instances, so automated validation and testing of cloud applications developed on the basis of microservic...

  15. Gravity sensing with Very Long Baseline Atom Interferometry

    Science.gov (United States)

    Schlippert, Dennis; Albers, Henning; Richardson, Logan L.; Nath, Dipankar; Meiners, Christian; Wodey, Etienne; Schubert, Christian; Ertmer, Wolfgang; Rasel, Ernst M.

    2016-05-01

    Very Long Baseline Atom Interferometry (VLBAI) has applications in high-accuracy absolute gravimetry, gravity-gradiometry, and for tests of fundamental physics. Extending the baseline of atomic gravimeters from tens of centimeters to meters opens the route towards competition with superconducting gravimeters. The VLBAI test stand will consist of a 10 m baseline atom interferometer allowing for free fall times of seconds. In order to suppress environmental noise, the facility utilizes a state-of-the-art vibration isolation platform and a three-layer magnetic shield. We envisage a resolution of local gravitational acceleration of 5×10⁻¹⁰ m/s² with sub-ppb inaccuracy. Operation as a gradiometer will allow the gravity gradient to be resolved at a resolution of 5×10⁻¹⁰ s⁻². The operation of VLBAI as a differential dual-species gravimeter using ultracold mixtures of Yb and Rb atoms enables quantum tests of the universality of free fall (UFF) at an unprecedented level, with the potential to surpass the accuracy of the best experiments to date. We report on a quantum test of the UFF using two different chemical elements, 39K and 87Rb, reaching a 100 ppb inaccuracy, and show the potential of UFF tests in VLBAI at an inaccuracy of 10⁻¹³ and beyond.

  16. Baseline brain activity predicts response to neuromodulatory pain treatment.

    Science.gov (United States)

    Jensen, Mark P; Sherlin, Leslie H; Fregni, Felipe; Gianas, Ann; Howe, Jon D; Hakimian, Shahin

    2014-12-01

    The objective of this study was to examine the associations between baseline electroencephalogram (EEG)-assessed brain oscillations and subsequent response to four neuromodulatory treatments. Based on available research, we hypothesized that baseline theta oscillations would prospectively predict response to hypnotic analgesia. Analyses involving other oscillations and the other treatments (meditation, neurofeedback, and both active and sham transcranial direct current stimulation) were viewed as exploratory, given the lack of previous research examining brain oscillations as predictors of response to these other treatments. Randomized controlled study of single sessions of four neuromodulatory pain treatments and a control procedure. Thirty individuals with spinal cord injury and chronic pain had their EEG recorded before each session of four active treatments (hypnosis, meditation, EEG biofeedback, transcranial direct current stimulation) and a control procedure (sham transcranial direct stimulation). As hypothesized, more presession theta power was associated with greater response to hypnotic analgesia. In exploratory analyses, we found that less baseline alpha power predicted pain reduction with meditation. The findings support the idea that different patients respond to different pain treatments and that between-person treatment response differences are related to brain states as measured by EEG. The results have implications for the possibility of enhancing pain treatment response by either 1) better patient/treatment matching or 2) influencing brain activity before treatment is initiated in order to prepare patients to respond. Research is needed to replicate and confirm the findings in additional samples of individuals with chronic pain. Wiley Periodicals, Inc.

  17. Optimized Two-Baseline Beta-Beam Experiment

    CERN Document Server

    Choubey, Sandhya; Donini, Andrea; Fernandez-Martinez, Enrique

    2009-01-01

    We propose a realistic Beta-Beam experiment with four source ions and two baselines for the best possible sensitivity to theta_{13}, CP violation and mass hierarchy. Neutrinos from 18Ne and 6He with Lorentz boost gamma=350 are detected in a 500 kton water Cerenkov detector at a distance L=650 km (first oscillation peak) from the source. Neutrinos from 8B and 8Li are detected in a 50 kton magnetized iron detector at a distance L=7000 km (magic baseline) from the source. Since the decay ring requires a tilt angle of 34.5 degrees to send the beam to the magic baseline, the far end of the ring has a maximum depth of d=2132 m for magnetic field strength of 8.3 T, if one demands that the fraction of ions that decay along the straight sections of the racetrack geometry decay ring (called livetime) is 0.3. We alleviate this problem by proposing to trade reduction of the livetime of the decay ring with the increase in the boost factor of the ions, such that the number of events at the detector remains almost the same....

  18. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    One of the greatest challenges surrounding adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. Such knowledge could guide individually customized countermeasures, which would enable more efficient use of crew time, both preflight and inflight, and provide better outcomes. The primary goal of this project is to look for a baseline performance metric that can forecast sensorimotor adaptability without exposure to an adaptive stimulus. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations in motor performance, as a predictor of individual sensorimotor adaptive capabilities. To-date, a strong relationship has been found between baseline inter-trial correlations and adaptability in two oculomotor systems. For this project, we will explore an analogous predictive mechanism in the locomotion system. METHODS: Baseline Inter-trial Correlations: Inter-trial correlations specify the relationships among repeated trials of a given task that transpire as a consequence of correcting for previous performance errors over multiple timescales. We can quantify the strength of inter-trial correlations by measuring the decay of the autocorrelation function (ACF), which describes how rapidly information from past trials is "forgotten." Processes whose ACFs decay more slowly exhibit longer-term inter-trial correlations (longer memory processes), while processes whose ACFs decay more rapidly exhibit shorterterm inter-trial correlations (shorter memory processes). Longer-term correlations reflect low-frequency activity, which is more easily
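
    One simple way to quantify baseline inter-trial correlations of the kind discussed above is to compute the autocorrelation function of a trial-by-trial performance series and summarize how quickly it decays. The AR(1) surrogate data and the 1/e decay criterion in the sketch below are illustrative assumptions, not the project's actual metric.

```python
# Minimal sketch: quantify ACF decay of a trial-by-trial performance series.
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / denom for k in range(max_lag + 1)])

def decay_lag(acf_vals, threshold=1.0 / np.e):
    """Lag at which the ACF first drops below the threshold (smaller = faster decay)."""
    below = np.where(acf_vals < threshold)[0]
    return int(below[0]) if below.size else len(acf_vals)

rng = np.random.default_rng(1)
phi, n_trials = 0.8, 200
errors = np.zeros(n_trials)                # AR(1) surrogate for trial-to-trial "noise"
for t in range(1, n_trials):
    errors[t] = phi * errors[t - 1] + rng.standard_normal()

vals = acf(errors, max_lag=30)
print("ACF decay lag (1/e):", decay_lag(vals))
```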

  19. Pilot Implementation of Health Information Systems: Issues and challenges

    DEFF Research Database (Denmark)

    Bansler, Jørgen Peter; Havn, Erling C.

    2010-01-01

    Objectives: This study aims to explore the issues and challenges involved in designing and organizing pilot implementations of health information systems (HIS). Pilot implementations are a widely used approach for identifying design flaws and implementation issues before full-scale deployment...... implementation of an electronic Pregnancy Record (ePR) in Denmark. Our primary data collection methods comprised participant observations, semi-structured interviews and document analyses. Results: Based on a comprehensive evaluation of the implementation process, we identify three major challenges...

  20. The Class-Wide Good Behavior Board Game

    Science.gov (United States)

    Cipani, Ennio

    2010-01-01

    This paper describes the design and implementation of a class-wide behavior management system that is derived from the Good Behavior Game developed in the 1960s. The Good Behavior Game has several decades of empirical evidence demonstrating its efficacy in ameliorating many classroom problems. This management system can be used across a variety of…

  1. The Social Validity of Program-Wide Positive Behavior Support

    Science.gov (United States)

    Frey, Andy J.; Lee Park, Kristy; Browne-Ferrigno, Tricia; Korfhage, Tara L.

    2010-01-01

    In preschool settings, the majority of interventions are individualized for children at high risk for challenging behavior. However, a few early childhood sites have begun to conceptualize and implement prevention and intervention initiatives modeled after the principles and key features associated with school-wide positive behavior support. In…

  2. Wide field of view telescope

    Energy Technology Data Exchange (ETDEWEB)

    Ackermann, Mark R. (Albuquerque, NM); McGraw, John T. (Placitas, NM); Zimmer, Peter C. (Albuquerque, NM)

    2008-01-15

    A wide field of view telescope having two concave and two convex reflective surfaces, each with an aspheric surface contour, has a flat focal plane array. Each of the primary, secondary, tertiary, and quaternary reflective surfaces is rotationally symmetric about the optical axis. The combination of the reflective surfaces results in a wide field of view in the range of approximately 3.8° to approximately 6.5°. The length of the telescope along the optical axis is approximately equal to or less than the diameter of the largest of the reflective surfaces.

  3. Baseline Surveys - Tecolote Canyon, San Diego Co. [ds655

    Data.gov (United States)

    California Department of Resources — Various resource projects have been conducted in the City of San Diego's Open Space Parks as part of the implementation of the City's Multiple Species Conservation...

  4. Plant Wide Assessment for SIFCO Industries, Inc.

    Energy Technology Data Exchange (ETDEWEB)

    Kelly Kissock, Arvind Thekdi et. al.

    2005-07-06

    Sifco Industries carried out a plant-wide energy assessment under a collaborative program with the U.S. Department of Energy from October 2004 to September 2005. During the year, personnel from EIS, E3M, DPS, BuyCastings.Com, and Sifco plant facilities and maintenance staff worked as a team to collect energy use, construction, process, equipment and operational information about the plant. Based on this information, the team identified 13 energy savings opportunities. Near-term savings opportunities have a total potential savings of about $1,329,000 per year and a combined simple payback of about 11 months. Implementation of these recommendations would reduce CO2 emissions by about 16,000,000 pounds per year, which would reduce overall plant CO2 emissions by about 45%. These totals do not include another $830,000 per year in potential savings, with an estimated 9-month payback, from converting the forging hammers from steam to compressed air.

  5. Consistent implementation of decisions in the brain.

    Directory of Open Access Journals (Sweden)

    James A R Marshall

    Full Text Available Despite the complexity and variability of decision processes, motor responses are generally stereotypical and independent of decision difficulty. How is this consistency achieved? Through an engineering analogy we consider how and why a system should be designed to realise not only flexible decision-making, but also consistent decision implementation. We specifically consider neurobiologically-plausible accumulator models of decision-making, in which decisions are made when a decision threshold is reached. To trade-off between the speed and accuracy of the decision in these models, one can either adjust the thresholds themselves or, equivalently, fix the thresholds and adjust baseline activation. Here we review how this equivalence can be implemented in such models. We then argue that manipulating baseline activation is preferable as it realises consistent decision implementation by ensuring consistency of motor inputs, summarise empirical evidence in support of this hypothesis, and suggest that it could be a general principle of decision making and implementation. Our goal is therefore to review how neurobiologically-plausible models of decision-making can manipulate speed-accuracy trade-offs using different mechanisms, to consider which of these mechanisms has more desirable decision-implementation properties, and then review the relevant neuroscientific data on which mechanism brains actually use.
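
    The following sketch (Python; not the authors' model) illustrates the trade-off described above with a toy two-accumulator race: lowering the decision threshold and raising the baseline activation both speed decisions, but only the baseline manipulation keeps the activation reached at decision time, and hence the input handed to the motor system, fixed. All parameter values are illustrative assumptions.

```python
import numpy as np

def race_trial(drift, threshold=1.0, baseline=0.0, dt=1e-3, noise=0.3, rng=None):
    """Simulate one trial of a two-alternative race of noisy accumulators."""
    rng = np.random.default_rng(rng)
    x = np.full(2, baseline, dtype=float)        # starting (baseline) activation
    t = 0.0
    while (x < threshold).all():                 # decide when either unit hits threshold
        x += np.asarray(drift) * dt + noise * np.sqrt(dt) * rng.standard_normal(2)
        t += dt
    return int(np.argmax(x)), t, x.max()         # choice, decision time, activation at decision

# Two equivalent ways to trade speed for accuracy:
#   (a) keep the baseline at 0 and lower the threshold, or
#   (b) keep the threshold fixed and raise the baseline activation.
# Under (b) the activation at decision time stays near the same fixed threshold,
# so the input passed on to the motor system is consistent across conditions.
fast_a = race_trial(drift=[0.6, 0.4], threshold=0.7, baseline=0.0, rng=1)
fast_b = race_trial(drift=[0.6, 0.4], threshold=1.0, baseline=0.3, rng=1)
```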

  6. The mixed waste management facility. Project baseline revision 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Streit, R.D.; Throop, A.L.

    1995-04-01

Revision 1.2 to the Project Baseline (PB) for the Mixed Waste Management Facility (MWMF) is in response to DOE directives and verbal guidance to (1) Collocate the Decontamination and Waste Treatment Facility (DWTF) and MWMF into a single complex, integrating certain overlapping functions as a cost-saving measure; (2) Meet certain fiscal year (FY) new-BA funding objectives ($15.3M in FY95) with lower and roughly balanced funding for out-years; (3) Reduce Total Project Cost (TPC) for the MWMF Project; (4) Include costs for all appropriate permitting activities in the project TPC. This baseline revision also incorporates revisions in the technical baseline design for Molten Salt Oxidation (MSO) and Mediated Electrochemical Oxidation (MEO). Changes in the WBS dictionary that are necessary as a result of this rebaseline, as well as minor title changes, at WBS Level 3 or above (DOE control level) are approved as a separate document. For completeness, the WBS dictionary that reflects these changes is contained in Appendix B. The PB, with revisions as described in this document, was also the basis for the FY97 Validation Process, presented to DOE and its reviewers on March 21-22, 1995. Appendix C lists information related to prior revisions to the PB. Several key changes relate to the integration of functions and sharing of facilities between the portion of the DWTF that will house the MWMF and those portions that are used by the Hazardous Waste Management (HWM) Division at LLNL. This collocation has been directed by DOE as a cost-saving measure and has been implemented in a manner that maintains separate operational elements from a safety and permitting viewpoint. Appendix D provides background information on the decision and implications of collocating the two facilities.

  7. DiFX: A Software Correlator for Very Long Baseline Interferometry Using Multiprocessor Computing Environments

    Science.gov (United States)

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-03-01

We describe the development of an FX-style correlator for very long baseline interferometry (VLBI), implemented in software and intended to run in multiprocessor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high-performance computing, such as multiprocessor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software and the fact that the highly parallel and scalable nature of the correlation task is well suited to a multiprocessor computing environment. We suggest scientific applications where such an approach to VLBI correlation is most suited and will give the best returns. We report detailed results from the Distributed FX (DiFX) software correlator running on the Swinburne supercomputer (a Beowulf cluster of ~300 commodity processors), including measures of the performance of the system. For example, to correlate all Stokes products for a 10-antenna array with an aggregate bandwidth of 64 MHz per station, and using typical time and frequency resolution, currently requires on the order of 100 desktop-class compute nodes. Due to the effect of Moore's law on commodity computing performance, the total number and cost of compute nodes required to meet a given correlation task continue to decrease rapidly with time. We show detailed comparisons between DiFX and two existing hardware-based correlators: the Australian Long Baseline Array S2 correlator and the NRAO Very Long Baseline Array correlator. In both cases, excellent agreement was found between the correlators. Finally, we describe plans for the future operation of DiFX on the Swinburne supercomputer for both astrophysical and geodetic science.
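
    As a rough illustration of the FX approach described above (a toy sketch, not DiFX itself), each station's voltage stream is first channelized with an FFT (the "F" step), and the spectra of every station pair are then cross-multiplied and time-averaged (the "X" step). The station count, channel count, and data below are illustrative assumptions.

```python
import numpy as np

def fx_correlate(voltages, nchan=256):
    """voltages: (nstation, nsamples) array of real-valued sampled voltages."""
    nstation, nsamples = voltages.shape
    nseg = nsamples // (2 * nchan)
    # F step: segment each station's stream and FFT each segment
    segs = voltages[:, :nseg * 2 * nchan].reshape(nstation, nseg, 2 * nchan)
    spectra = np.fft.rfft(segs, axis=-1)          # shape (nstation, nseg, nchan + 1)
    # X step: cross-multiply and time-average the spectra for every baseline
    visibilities = {}
    for i in range(nstation):
        for j in range(i + 1, nstation):
            visibilities[(i, j)] = np.mean(spectra[i] * np.conj(spectra[j]), axis=0)
    return visibilities

vis = fx_correlate(np.random.randn(4, 1 << 16))   # 4 stations of toy data
```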

  8. Evaluation of sliding baseline methods for spatial estimation for cluster detection in the biosurveillance system

    Directory of Open Access Journals (Sweden)

    Leuze Michael

    2009-07-01

Full Text Available Abstract Background The Centers for Disease Control and Prevention's (CDC's) BioSense system provides near-real time situational awareness for public health monitoring through analysis of electronic health data. Determination of anomalous spatial and temporal disease clusters is a crucial part of the daily disease monitoring task. Our study focused on finding useful anomalies at manageable alert rates according to available BioSense data history. Methods The study dataset included more than 3 years of daily counts of military outpatient clinic visits for respiratory and rash syndrome groupings. We applied four spatial estimation methods in implementations of space-time scan statistics cross-checked in Matlab and C. We compared the utility of these methods according to the resultant background cluster rate (a false alarm surrogate) and sensitivity to injected cluster signals. The comparison runs used a spatial resolution based on the facility zip code in the patient record and a finer resolution based on the residence zip code. Results Simple estimation methods that account for day-of-week (DOW) data patterns yielded a clear advantage both in background cluster rate and in signal sensitivity. A 28-day baseline gave the most robust results for this estimation; the preferred baseline is long enough to remove daily fluctuations but short enough to reflect recent disease trends and data representation. Background cluster rates were lower for the rash syndrome counts than for the respiratory counts, likely because of seasonality and the large scale of the respiratory counts. Conclusion The spatial estimation method should be chosen according to characteristics of the selected data streams. In this dataset with strong day-of-week effects, the overall best detection performance was achieved using subregion averages over a 28-day baseline stratified by weekday or weekend/holiday behavior. Changing the estimation method for particular scenarios involving
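
    A minimal sketch of the preferred estimator singled out above: the expected count for a given day is the mean of the previous 28 days' counts restricted to the same stratum (weekday versus weekend/holiday). The full method applies this per subregion inside a space-time scan statistic; the alerting rule and toy data below are assumptions.

```python
import numpy as np

def sliding_baseline(counts, is_weekend_or_holiday, baseline_days=28):
    """Expected daily count from the previous `baseline_days`, stratified by day type."""
    counts = np.asarray(counts, dtype=float)
    stratum = np.asarray(is_weekend_or_holiday, dtype=bool)
    expected = np.full(len(counts), np.nan)
    for t in range(baseline_days, len(counts)):
        window = np.arange(t - baseline_days, t)
        same = window[stratum[window] == stratum[t]]   # same stratum as day t
        if same.size:
            expected[t] = counts[same].mean()
    return expected

# Toy usage: flag days whose observed count far exceeds the sliding baseline.
rng = np.random.default_rng(0)
daily = rng.poisson(20, size=90)
weekend = np.array([(d % 7) in (5, 6) for d in range(90)])
baseline = sliding_baseline(daily, weekend)
alerts = daily > baseline + 3 * np.sqrt(baseline)
```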

  9. ParticipACTION: Overview and introduction of baseline research on the "new" ParticipACTION

    Science.gov (United States)

    2009-01-01

    Background This paper provides a brief overview of the Canadian physical activity communications and social marketing organization "ParticipACTION"; introduces the "new" ParticipACTION; describes the research process leading to the collection of baseline data on the new ParticipACTION; and outlines the accompanying series of papers in the supplement presenting the detailed baseline data. Methods Information on ParticipACTION was gathered from close personal involvement with the organization, from interviews and meetings with key leaders of the organization, from published literature and from ParticipACTION archives. In 2001, after nearly 30 years of operation, ParticipACTION ceased operations because of inadequate funding. In February 2007 the organization was officially resurrected and the launch of the first mass media campaign of the "new" ParticipACTION occurred in October 2007. The six-year absence of ParticipACTION, or any equivalent substitute, provided a unique opportunity to examine the impact of a national physical activity social marketing organization on important individual and organizational level indicators of success. A rapid response research team was established in January 2007 to exploit this natural intervention research opportunity. Results The research team was successful in obtaining funding through the new Canadian Institutes of Health Research Intervention Research (Healthy Living and Chronic Disease Prevention) Funding Program. Data were collected on individuals and organizations prior to the complete implementation of the first mass media campaign of the new ParticipACTION. Conclusion Rapid response research and funding mechanisms facilitated the collection of baseline information on the new ParticipACTION. These data will allow for comprehensive assessments of future initiatives of ParticipACTION. PMID:19995455

  10. ParticipACTION: Overview and introduction of baseline research on the "new" ParticipACTION.

    Science.gov (United States)

    Tremblay, Mark S; Craig, Cora L

    2009-12-09

    This paper provides a brief overview of the Canadian physical activity communications and social marketing organization "ParticipACTION"; introduces the "new" ParticipACTION; describes the research process leading to the collection of baseline data on the new ParticipACTION; and outlines the accompanying series of papers in the supplement presenting the detailed baseline data. Information on ParticipACTION was gathered from close personal involvement with the organization, from interviews and meetings with key leaders of the organization, from published literature and from ParticipACTION archives. In 2001, after nearly 30 years of operation, ParticipACTION ceased operations because of inadequate funding. In February 2007 the organization was officially resurrected and the launch of the first mass media campaign of the "new" ParticipACTION occurred in October 2007. The six-year absence of ParticipACTION, or any equivalent substitute, provided a unique opportunity to examine the impact of a national physical activity social marketing organization on important individual and organizational level indicators of success. A rapid response research team was established in January 2007 to exploit this natural intervention research opportunity. The research team was successful in obtaining funding through the new Canadian Institutes of Health Research Intervention Research (Healthy Living and Chronic Disease Prevention) Funding Program. Data were collected on individuals and organizations prior to the complete implementation of the first mass media campaign of the new ParticipACTION. Rapid response research and funding mechanisms facilitated the collection of baseline information on the new ParticipACTION. These data will allow for comprehensive assessments of future initiatives of ParticipACTION.

  11. ParticipACTION: Overview and introduction of baseline research on the "new" ParticipACTION

    Directory of Open Access Journals (Sweden)

    Craig Cora L

    2009-12-01

Full Text Available Abstract Background This paper provides a brief overview of the Canadian physical activity communications and social marketing organization "ParticipACTION"; introduces the "new" ParticipACTION; describes the research process leading to the collection of baseline data on the new ParticipACTION; and outlines the accompanying series of papers in the supplement presenting the detailed baseline data. Methods Information on ParticipACTION was gathered from close personal involvement with the organization, from interviews and meetings with key leaders of the organization, from published literature and from ParticipACTION archives. In 2001, after nearly 30 years of operation, ParticipACTION ceased operations because of inadequate funding. In February 2007 the organization was officially resurrected and the launch of the first mass media campaign of the "new" ParticipACTION occurred in October 2007. The six-year absence of ParticipACTION, or any equivalent substitute, provided a unique opportunity to examine the impact of a national physical activity social marketing organization on important individual and organizational level indicators of success. A rapid response research team was established in January 2007 to exploit this natural intervention research opportunity. Results The research team was successful in obtaining funding through the new Canadian Institutes of Health Research Intervention Research (Healthy Living and Chronic Disease Prevention) Funding Program. Data were collected on individuals and organizations prior to the complete implementation of the first mass media campaign of the new ParticipACTION. Conclusion Rapid response research and funding mechanisms facilitated the collection of baseline information on the new ParticipACTION. These data will allow for comprehensive assessments of future initiatives of ParticipACTION.

  12. Markets for energy efficiency: Exploring the implications of an EU-wide 'Tradable White Certificate' scheme

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis [International Institute for Industrial Environmental Economics at Lund University, P.O. Box 196, SE-221 00 Lund (Sweden)], E-mail: Luis.Mundaca@iiiee.lu.se

    2008-11-15

Recent developments in European energy policy reveal an increasing interest in implementing the so-called 'Tradable White Certificate' (TWC) schemes to improve energy efficiency. Based on three evaluation criteria (cost-effectiveness, environmental effectiveness and distributional equity) this paper analyses the implications of implementing a European-wide TWC scheme targeting the household and commercial sectors. Using a bottom-up model, quantitative results show significant cost-effective potentials for improvements (ca. 1400 TWh in cumulative energy savings by 2020), with the household sector, gas and space heating representing most of the TWC supply in terms of eligible sector, fuel and energy service demand, respectively. If a single market price of negative externalities is considered, a societal cost-effective potential of energy savings above 30% (compared to the baseline) is observed. In environmental terms, the resulting greenhouse gas emission reductions are around 200 Mt CO2-eq by 2010, representing nearly 60% of the EU-Kyoto-target. From the qualitative perspective, several embedded ancillary benefits are identified (e.g. employment generation, improved comfort level, reduced 'fuel poverty', security of energy supply). Whereas an EU-wide TWC increases liquidity and reduces the risks of market power, autarky compliance strategies may be expected in order to capture co-benefits nationally. Cross subsidies could occur due to investment recovery mechanisms and there is a risk that effects may be regressive for low-income households. Assumptions undertaken by the modelling approach strongly indicate that high effectiveness of other policy instruments is needed for an EU-wide TWC scheme to be cost-effective.

  13. Markets for energy efficiency. Exploring the implications of an EU-wide 'Tradable White Certificate' scheme

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis [International Institute for Industrial Environmental Economics at Lund University, P.O. Box 196, SE-221 00 Lund (Sweden)

    2008-11-15

Recent developments in European energy policy reveal an increasing interest in implementing the so-called 'Tradable White Certificate' (TWC) schemes to improve energy efficiency. Based on three evaluation criteria (cost-effectiveness, environmental effectiveness and distributional equity) this paper analyses the implications of implementing a European-wide TWC scheme targeting the household and commercial sectors. Using a bottom-up model, quantitative results show significant cost-effective potentials for improvements (ca. 1400 TWh in cumulative energy savings by 2020), with the household sector, gas and space heating representing most of the TWC supply in terms of eligible sector, fuel and energy service demand, respectively. If a single market price of negative externalities is considered, a societal cost-effective potential of energy savings above 30% (compared to the baseline) is observed. In environmental terms, the resulting greenhouse gas emission reductions are around 200 Mt CO2-eq by 2010, representing nearly 60% of the EU-Kyoto-target. From the qualitative perspective, several embedded ancillary benefits are identified (e.g. employment generation, improved comfort level, reduced 'fuel poverty', security of energy supply). Whereas an EU-wide TWC increases liquidity and reduces the risks of market power, autarky compliance strategies may be expected in order to capture co-benefits nationally. Cross subsidies could occur due to investment recovery mechanisms and there is a risk that effects may be regressive for low-income households. Assumptions undertaken by the modelling approach strongly indicate that high effectiveness of other policy instruments is needed for an EU-wide TWC scheme to be cost-effective. (author)

  14. Re-shifting the ecological baseline for the overexploited Mediterranean red coral

    Science.gov (United States)

    Garrabou, J.; Sala, E.; Linares, C.; Ledoux, J. B.; Montero-Serra, I.; Dominici, J. M.; Kipson, S.; Teixidó, N.; Cebrian, E.; Kersting, D. K.; Harmelin, J. G.

    2017-02-01

    Overexploitation leads to the ecological extinction of many oceanic species. The depletion of historical abundances of large animals, such as whales and sea turtles, is well known. However, the magnitude of the historical overfishing of exploited invertebrates is unclear. The lack of rigorous baseline data limits the implementation of efficient management and conservation plans in the marine realm. The precious Mediterranean red coral Corallium rubrum has been intensively exploited since antiquity for its use in jewellery. It shows dramatic signs of overexploitation, with no untouched populations known in shallow waters. Here, we report the discovery of an exceptional red coral population from a previously unexplored shallow underwater cave in Corsica (France) harbouring the largest biomass (by more than 100-fold) reported to date in the Mediterranean. Our findings challenge current assumptions on the pristine state of this emblematic species. Our results suggest that, before intense exploitation, red coral lived in relatively high-density populations with a large proportion of centuries-old colonies, even at very shallow depths. We call for the re-evaluation of the baseline for red coral and question the sustainability of the exploitation of a species that is still common but ecologically (functionally) extinct and in a trajectory of further decline.

  15. CNE article: safety culture in Australian intensive care units: establishing a baseline for quality improvement.

    Science.gov (United States)

    Chaboyer, Wendy; Chamberlain, Di; Hewson-Conroy, Karena; Grealy, Bernadette; Elderkin, Tania; Brittin, Maureen; McCutcheon, Catherine; Longbottom, Paula; Thalib, Lukman

    2013-03-01

    Workplace safety culture is a crucial ingredient in patients' outcomes and is increasingly being explored as a guide for quality improvement efforts. To establish a baseline understanding of the safety culture in Australian intensive care units. In a nationwide study of physicians and nurses in 10 Australian intensive care units, the Safety Attitudes Questionnaire intensive care unit version was used to measure safety culture. Descriptive statistics were used to summarize the mean scores for the 6 subscales of the questionnaire, and generalized-estimation-equations models were used to test the hypotheses that safety culture differed between physicians and nurses and between nurse leaders and bedside nurses. A total of 672 responses (50.6% response rate) were received: 513 (76.3%) from nurses, 89 (13.2%) from physicians, and 70 (10.4%) from respondents who did not specify their professional group. Ratings were highest for teamwork climate and lowest for perceptions of hospital management and working conditions. Four subscales, job satisfaction, teamwork climate, safety climate, and working conditions, were rated significantly higher by physicians than by nurses. Two subscales, working conditions and perceptions of hospital management, were rated significantly lower by nurse leaders than by bedside nurses. Measuring the baseline safety culture of an intensive care unit allows leaders to implement targeted strategies to improve specific dimensions of safety culture. These strategies ultimately may improve the working conditions of staff and the care that patients receive.
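
    A hedged sketch of the analysis style described above, using the statsmodels GEE implementation to regress a subscale score on professional group while accounting for clustering of respondents within intensive care units. The column names, Gaussian family, and exchangeable working correlation are assumptions, not the study's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "score": rng.normal(70, 10, size=200),          # e.g. a safety-climate subscale score
    "is_physician": rng.integers(0, 2, size=200),   # professional-group indicator
    "icu": np.repeat(np.arange(10), 20),            # clustering unit (intensive care unit)
})

X = sm.add_constant(df[["is_physician"]].astype(float))
model = sm.GEE(df["score"], X, groups=df["icu"],
               family=sm.families.Gaussian(),
               cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```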

  16. Improvement during baseline: three case studies encouraging collaborative research when evaluating caregiver training.

    Science.gov (United States)

    Sohlberg, M M; Glang, A; Todis, B

    1998-04-01

The trend in cognitive rehabilitation toward reduced services that provide more functionally relevant outcomes, together with the recognition of limited maintenance and generalization with many existing interventions, challenges current research models. There is a need to develop and evaluate interventions that can be implemented by persons other than rehabilitation professionals and that are well suited to naturalistic settings. The researchers responded to these challenges by designing a series of single subject experiments evaluating the effectiveness of training caregivers to provide appropriate cognitive support to persons with brain injury within their own natural living environments. The goals of the original research project included evaluating a collaborative mode of interaction with the subjects and their support persons (as opposed to traditional directive treatment models), in which the caregivers and subjects were instrumental in designing the intervention and collecting performance data. This paper presents the data from the initial three subject/caregiver groups, all of whom demonstrated improvement in the target behaviours during the baseline period. It appeared that the act of measuring client performance changed the behaviours of the support persons and resulted in positive changes in baseline levels. The research and clinical implications of these findings are discussed.

  17. Energy Consumption Analysis for Concrete Residences—A Baseline Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Kuo-Liang Lin

    2017-02-01

Full Text Available Estimating building energy consumption is difficult because it deals with complex interactions among uncertain weather conditions, occupant behaviors, and building characteristics. To facilitate estimation, this study employs a benchmarking methodology to obtain an energy baseline for sample buildings. Utilizing a scientific simulation tool, this study attempts to develop energy consumption baselines of two typical concrete residences in Taiwan, and subsequently allows a simplified energy consumption prediction process at an early design stage of building development. Using weather data of three metropolitan cities as testbeds, the annual energy consumption of two types of modern residences is determined through a series of simulation sessions with different building settings. The impacts of key building characteristics, including building insulation, air tightness, orientation, location, and residence type, are carefully investigated. Sample utility bills are then collected to validate the simulated results, resulting in three adjustment parameters for normalization, including 'number of residents', 'total floor area', and 'air conditioning comfort level', to account for occupant behaviors in different living conditions. Study results not only provide valuable benchmarking data serving as references for performance evaluation of different energy-saving strategies, but also show how effective extended building insulation, enhanced air tightness, and prudent selection of residence location and orientation can be for successful implementation of building sustainability in tropical and subtropical regions.
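
    A minimal sketch of the normalization idea described above: scale a simulated annual-consumption baseline by occupant count, floor area, and an air-conditioning comfort factor before comparing it against utility bills. The reference values and the simple multiplicative form are assumptions, not the study's calibrated model.

```python
def normalized_baseline(simulated_kwh, n_residents, floor_area_m2, ac_comfort_factor,
                        ref_residents=4, ref_area_m2=100.0):
    """Adjust a simulated annual baseline (kWh) for occupancy, size, and AC comfort level.

    The reference household (4 residents, 100 m^2) and the linear scaling are
    illustrative assumptions only.
    """
    occupancy_adj = n_residents / ref_residents
    area_adj = floor_area_m2 / ref_area_m2
    return simulated_kwh * occupancy_adj * area_adj * ac_comfort_factor

# Toy usage: a 3-person, 120 m^2 home that runs its AC slightly cooler than the reference.
estimate = normalized_baseline(8200.0, n_residents=3, floor_area_m2=120.0,
                               ac_comfort_factor=1.1)
```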

  18. Construct Validation of a Measure to Assess Sustainability of School-Wide Behavior Interventions

    Science.gov (United States)

    Hume, Amanda; McIntosh, Kent

    2013-01-01

    This study assessed aspects of construct validity of the School-wide Universal Behavior Sustainability Index-School Teams (SUBSIST), a measure evaluating critical features of the school context related to sustainability of school-wide interventions. Participants at 217 schools implementing School-wide Positive Behavior Support (SWPBS) were…

  19. Paracetamol, widely used hardly understood

    NARCIS (Netherlands)

    C.D. van der Marel (Caroline)

    2003-01-01

Paracetamol (APAP), known in the USA as acetaminophen, is widely used both in hospital settings and at home for antipyresis and mild (postoperative) pain. Although APAP is available over the counter and is ranked in third place, following nystatin and cisapride, whe

  20. World Wide Web Homepage Design.

    Science.gov (United States)

    Tillman, Michael L.

    This paper examines hypermedia design and draws conclusions about how educational research and theory applies to various aspects of World Wide Web (WWW) homepage design. "Hypermedia" is defined as any collection of information which may be textual, graphical, visual, or auditory in nature and which may be accessed via a nonlinear route.…

  1. Medication-wide association studies

    NARCIS (Netherlands)

    P.B. Ryan (Patrick); D. Madigan (David); P.E. Stang (Paul); M.J. Schuemie (Martijn); G. Hripcsak (G.)

    2013-01-01

Undiscovered side effects of drugs can have a profound effect on the health of the nation, and electronic health-care databases offer opportunities to speed up the discovery of these side effects. We applied a "medication-wide association study" approach that combined multivariate analys

  2. The World Wide Web Revisited

    Science.gov (United States)

    Owston, Ron

    2007-01-01

Nearly a decade ago, the author wrote in Educational Researcher one of the first widely cited academic articles about the educational role of the web. He argued that educators must be able to demonstrate that the web (1) can increase access to learning, (2) must not result in higher costs for learning, and (3) can lead to improved learning. These…

  3. Medication-wide association studies

    NARCIS (Netherlands)

    P.B. Ryan (Patrick); D. Madigan (David); P.E. Stang (Paul); M.J. Schuemie (Martijn); G. Hripcsak (G.)

    2013-01-01

Undiscovered side effects of drugs can have a profound effect on the health of the nation, and electronic health-care databases offer opportunities to speed up the discovery of these side effects. We applied a "medication-wide association study" approach that combined multivariate analys

  4. World-Wide Information Networks.

    Science.gov (United States)

    Samuelson, Kjell A. H. W.

The future paths of research and development towards world-wide, automated information networks in full operation are examined. From international network planning and projects under way it appears that exploratory as well as normative approaches have been taken. To some extent adequate technological facilities have already come into existence…

  5. Improving Ambiguity Resolution for Medium Baselines Using Combined GPS and BDS Dual/Triple-Frequency Observations.

    Science.gov (United States)

    Gao, Wang; Gao, Chengfa; Pan, Shuguo; Wang, Denghui; Deng, Jiadong

    2015-10-30

The regional constellation of the BeiDou navigation satellite system (BDS) has been providing continuous positioning, navigation and timing services since 27 December 2012, covering China and the surrounding area. Real-time kinematic (RTK) positioning with combined BDS and GPS observations is feasible. Moreover, all satellites of BDS can transmit triple-frequency signals. Using multi-pseudorange and carrier observations from multiple systems and frequencies is expected to be of much benefit for ambiguity resolution (AR). We propose an integrated AR strategy for medium baselines using combined GPS and BDS dual/triple-frequency observations. In this method, the extra-wide-lane (EWL) ambiguities of the triple-frequency system, i.e., BDS, are determined first. Then the dual-frequency WL ambiguities of BDS and GPS are resolved with a geometry-based model by using the BDS ambiguity-fixed EWL observations. After that, the basic (i.e., L1/L2 or B1/B2) ambiguities of BDS and GPS are estimated with the so-called ionosphere-constrained model, in which the ambiguity-fixed WL observations are added to enhance the model strength. During both the WL and basic AR, a partial ambiguity fixing (PAF) strategy is adopted to weaken the negative influence of new-rising or low-elevation satellites. Experiments are presented in which GPS/BDS dual/triple-frequency data were collected in Nanjing and Zhengzhou, China, with baseline distances varying from about 28.6 to 51.9 km. The results indicate that, compared to the single triple-frequency BDS system, the combined system can significantly enhance the AR model strength, and thus improve AR performance for medium baselines, with a 75.7% reduction of initialization time on average. In addition, more accurate and stable positioning results can also be derived by using the combined GPS/BDS system.
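
    One building block of the cascaded EWL/WL/basic strategy described above is wide-lane (WL) ambiguity estimation. The sketch below uses the standard Melbourne-Wübbena combination on GPS L1/L2 as an illustration; the observable values are made up, and the paper's geometry-based and ionosphere-constrained models are not reproduced here.

```python
C = 299_792_458.0               # speed of light, m/s
F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies, Hz

def widelane_ambiguity(phi1_cyc, phi2_cyc, p1_m, p2_m, f1=F1, f2=F2):
    """Float wide-lane ambiguity from the Melbourne-Wuebbena combination.

    phi1_cyc, phi2_cyc: carrier phases in cycles; p1_m, p2_m: pseudoranges in metres.
    In practice the float value is averaged over epochs and rounded to an integer.
    """
    lam_wl = C / (f1 - f2)       # wide-lane wavelength (~0.86 m for L1/L2)
    mw = lam_wl * (phi1_cyc - phi2_cyc) - (f1 * p1_m + f2 * p2_m) / (f1 + f2)
    return mw / lam_wl

# Toy values only (not self-consistent observables), to show the call signature.
n_wl = widelane_ambiguity(phi1_cyc=120300.25, phi2_cyc=93750.10,
                          p1_m=22_900_000.0, p2_m=22_900_000.4)
```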

  6. Baseline characteristics of patients enrolled in the Exenatide Study of Cardiovascular Event Lowering (EXSCEL).

    Science.gov (United States)

    Mentz, Robert J; Bethel, M Angelyn; Gustavson, Stephanie; Thompson, Vivian P; Pagidipati, Neha J; Buse, John B; Chan, Juliana C; Iqbal, Nayyar; Maggioni, Aldo P; Marso, Steve P; Ohman, Peter; Poulter, Neil; Ramachandran, Ambady; Zinman, Bernard; Hernandez, Adrian F; Holman, Rury R

    2017-05-01

    EXSCEL is a randomized, double-blind, placebo-controlled trial examining the effect of exenatide once-weekly (EQW) versus placebo on time to the primary composite outcome (cardiovascular death, nonfatal myocardial infarction or nonfatal stroke) in patients with type 2 diabetes mellitus (DM) and a wide range of cardiovascular (CV) risk. Patients were enrolled at 688 sites in 35 countries. We describe their baseline characteristics according to prior CV event status and compare patients with those enrolled in prior glucagon-like peptide-1 receptor agonist (GLP-1RA) outcomes trials. Of a total of 14,752 participants randomized between June 2010 and September 2015, 6,788 (46.0%) patients were enrolled in Europe; 3,708 (25.1%), North America; 2,727 (18.5%), Latin America; and 1,529 (10.4%), Asia Pacific. Overall, 73% had at least one prior CV event (70% coronary artery disease, 24% peripheral arterial disease, 22% cerebrovascular disease). The median (IQR) age was 63 years (56, 69), 38% were female, median baseline HbA1c was 8.0% (7.3, 8.9) and 16% had a prior history of heart failure. Those without a prior CV event were younger with a shorter duration of diabetes and better renal function than those with at least one prior CV event. Compared with prior GLP-1RA trials, EXSCEL has a larger percentage of patients without a prior CV event and a notable percentage who were taking a dipeptidyl peptidase-4 inhibitor at baseline (15%). EXSCEL is one of the largest global GLP-1RA trials, evaluating the safety and efficacy of EQW with a broad patient population that may extend generalizability compared to prior GLP-1RA trials (ClinicalTrials.gov number, NCT01144338). Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Quantifying the extent of North American mammal extinction relative to the pre-anthropogenic baseline.

    Science.gov (United States)

    Carrasco, Marc A; Barnosky, Anthony D; Graham, Russell W

    2009-12-16

    Earth has experienced five major extinction events in the past 450 million years. Many scientists suggest we are now witnessing a sixth, driven by human impacts. However, it has been difficult to quantify the real extent of the current extinction episode, either for a given taxonomic group at the continental scale or for the worldwide biota, largely because comparisons of pre-anthropogenic and anthropogenic biodiversity baselines have been unavailable. Here, we compute those baselines for mammals of temperate North America, using a sampling-standardized rich fossil record to reconstruct species-area relationships for a series of time slices ranging from 30 million to 500 years ago. We show that shortly after humans first arrived in North America, mammalian diversity dropped to become at least 15%-42% too low compared to the "normal" diversity baseline that had existed for millions of years. While the Holocene reduction in North American mammal diversity has long been recognized qualitatively, our results provide a quantitative measure that clarifies how significant the diversity reduction actually was. If mass extinctions are defined as loss of at least 75% of species on a global scale, our data suggest that North American mammals had already progressed one-fifth to more than halfway (depending on biogeographic province) towards that benchmark, even before industrialized society began to affect them. Data currently are not available to make similar quantitative estimates for other continents, but qualitative declines in Holocene mammal diversity are also widely recognized in South America, Eurasia, and Australia. Extending our methodology to mammals in these areas, as well as to other taxa where possible, would provide a reasonable way to assess the magnitude of global extinction, the biodiversity impact of extinctions of currently threatened species, and the efficacy of conservation efforts into the future.
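
    A minimal sketch of the baseline idea described above: fit a power-law species-area relationship S = c*A^z to sampling-standardized richness data for one time slice, then read the expected ("baseline") diversity off the fitted curve. The numbers are illustrative, not the study's data.

```python
import numpy as np

def fit_species_area(areas_km2, richness):
    """Fit S = c * A**z by linear regression in log-log space; returns (c, z)."""
    z, log_c = np.polyfit(np.log(areas_km2), np.log(richness), 1)
    return np.exp(log_c), z

def expected_richness(area_km2, c, z):
    """Baseline species richness predicted for a region of the given area."""
    return c * area_km2 ** z

# Toy usage with made-up richness values for four nested sampling areas.
c, z = fit_species_area([1e3, 1e4, 1e5, 1e6], [12, 21, 38, 66])
baseline_S = expected_richness(5e5, c, z)
```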

  8. Quantifying the extent of North American mammal extinction relative to the pre-anthropogenic baseline.

    Directory of Open Access Journals (Sweden)

    Marc A Carrasco

Full Text Available Earth has experienced five major extinction events in the past 450 million years. Many scientists suggest we are now witnessing a sixth, driven by human impacts. However, it has been difficult to quantify the real extent of the current extinction episode, either for a given taxonomic group at the continental scale or for the worldwide biota, largely because comparisons of pre-anthropogenic and anthropogenic biodiversity baselines have been unavailable. Here, we compute those baselines for mammals of temperate North America, using a sampling-standardized rich fossil record to reconstruct species-area relationships for a series of time slices ranging from 30 million to 500 years ago. We show that shortly after humans first arrived in North America, mammalian diversity dropped to become at least 15%-42% too low compared to the "normal" diversity baseline that had existed for millions of years. While the Holocene reduction in North American mammal diversity has long been recognized qualitatively, our results provide a quantitative measure that clarifies how significant the diversity reduction actually was. If mass extinctions are defined as loss of at least 75% of species on a global scale, our data suggest that North American mammals had already progressed one-fifth to more than halfway (depending on biogeographic province) towards that benchmark, even before industrialized society began to affect them. Data currently are not available to make similar quantitative estimates for other continents, but qualitative declines in Holocene mammal diversity are also widely recognized in South America, Eurasia, and Australia. Extending our methodology to mammals in these areas, as well as to other taxa where possible, would provide a reasonable way to assess the magnitude of global extinction, the biodiversity impact of extinctions of currently threatened species, and the efficacy of conservation efforts into the future.

  9. Sandia National Laboratories site-wide hydrogeologic characterization project calendar year 1992 annual report

    Energy Technology Data Exchange (ETDEWEB)

Crowson, D.; Gibson, J.D.; Haase, C.S.; Holt, R.; Hyndman, D.; Krumhansl, J.; Lauffer, F.; McCord, J.P.; McCord, J.T.; Neel, D. [and others]

    1993-10-01

The Sandia National Laboratories, New Mexico (SNL/NM) Site-Wide Hydrogeologic Characterization (SWHC) project has been implemented as part of the SNL/NM Environmental Restoration (ER) Program to develop the regional hydrogeologic framework and baseline for the approximately 100 mi² of Kirtland Air Force Base (KAFB) and adjacent withdrawn public lands upon which SNL/NM has performed research and development activities. Additionally, the SWHC project will investigate and characterize generic hydrogeologic issues associated with the 172 ER sites owned by SNL/NM across its facilities on KAFB. As called for in the Hazardous and Solid Waste Amendments (HSWA) to the Resource Conservation and Recovery Act (RCRA) Part B permit agreement between the U.S. Environmental Protection Agency (EPA) as the permitter and the U.S. Department of Energy (DOE) and SNL/NM as the permittees, an annual report is to be prepared by the SWHC project team. This document serves two primary purposes: (1) to identify and describe the conceptual framework for the hydrogeologic system underlying SNL/NM and (2) to describe characterization activities undertaken in the preceding year that add to our understanding (reduce our uncertainties) regarding the conceptual and quantitative hydrogeologic framework. This SWHC project annual report focuses primarily on purpose 1, providing a summary description of the current "state of knowledge" of the Sandia National Laboratories/Kirtland Air Force Base (SNL/KAFB) hydrogeologic setting.

  10. Prowess – A Software Model for the Ooty Wide Field Array

    Indian Academy of Sciences (India)

    Visweshwar Ram Marthi;

    2017-03-01

One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H I emission from z ~ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A complete software model for OWFA has been developed with a view to understanding the instrument-induced systematics. This model has been implemented through a suite of programs, together called Prowess, which has been conceived with the dual role of an emulator and observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a co-ordinate system suitable for OWFA in which the baselines are defined. The foregrounds are simulated from their angular power spectra. The visibilities are then computed from the foregrounds. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H I experiment.
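
    A toy sketch (not Prowess itself) of the step described above in which visibilities are computed from a foreground model: for point-like foreground components, V(u, v) = Σ_k S_k exp[-2πi(u l_k + v m_k)]. The baseline coordinates and source parameters below are illustrative assumptions.

```python
import numpy as np

def model_visibilities(u, v, flux, l, m):
    """u, v: baseline coordinates in wavelengths; flux, l, m: per-source arrays."""
    u = np.asarray(u, dtype=float)[:, None]
    v = np.asarray(v, dtype=float)[:, None]
    phase = -2j * np.pi * (u * np.asarray(l) + v * np.asarray(m))
    return (np.asarray(flux) * np.exp(phase)).sum(axis=1)   # one complex visibility per baseline

# Toy usage: three baselines observing two point-like foreground components.
vis = model_visibilities(u=[0.0, 15.0, 30.0], v=[0.0, 0.0, 0.0],
                         flux=[3.0, 1.2], l=[0.01, -0.02], m=[0.0, 0.005])
```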

  11. Re-creating missing population baselines for Pacific reef sharks.

    Science.gov (United States)

    Nadon, Marc O; Baum, Julia K; Williams, Ivor D; McPherson, Jana M; Zgliczynski, Brian J; Richards, Benjamin L; Schroeder, Robert E; Brainard, Russell E

    2012-06-01

    Sharks and other large predators are scarce on most coral reefs, but studies of their historical ecology provide qualitative evidence that predators were once numerous in these ecosystems. Quantifying density of sharks in the absence of humans (baseline) is, however, hindered by a paucity of pertinent time-series data. Recently researchers have used underwater visual surveys, primarily of limited spatial extent or nonstandard design, to infer negative associations between reef shark abundance and human populations. We analyzed data from 1607 towed-diver surveys (>1 ha transects surveyed by observers towed behind a boat) conducted at 46 reefs in the central-western Pacific Ocean, reefs that included some of the world's most pristine coral reefs. Estimates of shark density from towed-diver surveys were substantially lower (sharks observed in towed-diver surveys and human population in models that accounted for the influence of oceanic primary productivity, sea surface temperature, reef area, and reef physical complexity. We used these models to estimate the density of sharks in the absence of humans. Densities of gray reef sharks (Carcharhinus amblyrhynchos), whitetip reef sharks (Triaenodon obesus), and the group "all reef sharks" increased substantially as human population decreased and as primary productivity and minimum sea surface temperature (or reef area, which was highly correlated with temperature) increased. Simulated baseline densities of reef sharks under the absence of humans were 1.1-2.4/ha for the main Hawaiian Islands, 1.2-2.4/ha for inhabited islands of American Samoa, and 0.9-2.1/ha for inhabited islands in the Mariana Archipelago, which suggests that density of reef sharks has declined to 3-10% of baseline levels in these areas.
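
    A heavily simplified sketch of the modelling idea described above: fit a log-linear model of observed shark density on human population density and environmental covariates, then predict density with the human term set to zero to approximate the no-humans baseline. The model form and the toy numbers are assumptions and are far simpler than the study's analysis.

```python
import numpy as np

def fit_log_density_model(density, humans, productivity, sst):
    """Fit log(density) ~ b0 + b1*humans + b2*productivity + b3*sst by least squares."""
    X = np.column_stack([np.ones_like(humans), humans, productivity, sst])
    coef, *_ = np.linalg.lstsq(X, np.log(density), rcond=None)
    return coef

def baseline_density(coef, productivity, sst):
    """Predicted density with the human-population term set to zero."""
    return float(np.exp(coef[0] + coef[2] * productivity + coef[3] * sst))

# Toy data: sharks per hectare against (made-up) human density and covariates.
density = np.array([0.10, 0.25, 0.60, 1.10, 1.70, 2.10])
humans = np.array([80.0, 40.0, 10.0, 3.0, 1.0, 0.0])
productivity = np.array([0.30, 0.35, 0.50, 0.60, 0.70, 0.80])
sst = np.array([26.0, 26.5, 27.0, 27.5, 28.0, 28.5])

coef = fit_log_density_model(density, humans, productivity, sst)
print(baseline_density(coef, productivity=0.7, sst=28.0))
```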

  12. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

    Energy Technology Data Exchange (ETDEWEB)

    Burford, M.J.; Downing, T.R.; Williams, J.R. [Pacific Northwest Lab., Richland, WA (United States); Bower, J.C. [Bower Software Services, Kennewick, WA (United States)

    1994-03-01

The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

  13. Future long-baseline neutrino oscillations: View from Europe

    Energy Technology Data Exchange (ETDEWEB)

    Patzak, T. [APC, AstroParticule et Cosmologie, Université Paris Diderot, CNRS/IN2P3, CEA/Irfu, Observatoire de Paris, Sorbonne Paris Cité, 10, rue Alice Domon et Léonie Duquet, 75205 Paris Cedex 13 (France)

    2015-07-15

For about a decade, the European physics community interested in neutrino physics and neutrino astrophysics has been developing a plan to conceive the next-generation large underground neutrino observatory. Recently, the LAGUNA-LBNO collaboration made public the outcome of the FP7 design study, which shows a clear path to the realization of such an experiment. In this paper the LAGUNA and LAGUNA-LBNO design studies, resulting in a proposal for the LBNO experiment, are discussed. The author focuses on the long-baseline neutrino oscillation search, especially on the potential to discover the neutrino mass ordering and the search for CP violation in the lepton sector.

  14. Implications of 3+1 short-baseline neutrino oscillations

    Energy Technology Data Exchange (ETDEWEB)

Giunti, Carlo, E-mail: giunti@to.infn.it [INFN, Sezione di Torino, Via P. Giuria 1, I-10125 Torino (Italy); Laveder, Marco, E-mail: laveder@pd.infn.it [Dipartimento di Fisica 'G. Galilei', Università di Padova, and INFN, Sezione di Padova, Via F. Marzolo 8, I-35131 Padova (Italy)

    2011-12-06

We present an upgrade of the 3+1 global fit of short-baseline neutrino oscillation data obtained with the addition of KARMEN and LSND ν_e + ¹²C → ¹²N(g.s.) + e⁻ scattering data. We discuss the implications for the measurements of the effective neutrino mass in β-decay and neutrinoless double-β-decay experiments. We find respective predicted ranges of about 0.1-0.7 eV and 0.01-0.1 eV.
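
    For reference, the effective masses referred to above are conventionally defined as follows (standard definitions, not specific to this fit), with mixing-matrix elements U_ei and mass eigenvalues m_i in a 3+1 scheme:

```latex
% Effective beta-decay mass and effective Majorana mass in a 3+1 scheme.
\[
  m_\beta = \sqrt{\sum_{i=1}^{4} \lvert U_{ei}\rvert^{2}\, m_i^{2}},
  \qquad
  m_{\beta\beta} = \Bigl\lvert \sum_{i=1}^{4} U_{ei}^{2}\, m_i \Bigr\rvert .
\]
```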

  15. Long baseline accelerator neutrino experiments present and future

    CERN Document Server

    Rubbia, André

    2000-01-01

A ν_μ disappearance effect has been seen in atmospheric neutrino experiments. This has led to the "evidence for neutrino oscillations". The next problem in neutrino physics is to perform the right experiment(s) to elucidate in a comprehensive way the pattern of neutrino masses and mixings. The long baseline experiments will play a fundamental role in settling definitively the question of flavor oscillation and in measuring with good precision the oscillation parameters. The CERN-NGS beam coupled with the proposed ICANOE and OPERA detectors is the only programme capable of sensitive tau and electron appearance searches. (14 refs).

  16. Integrated Baseline System (IBS) Version 1.03: Utilities guide

    Energy Technology Data Exchange (ETDEWEB)

    Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

    1993-01-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

  17. Emergency Response Capability Baseline Needs Assessment Compliance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, John A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-16

This document is the second of a two-part analysis of the Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyses the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.

  18. Project W-320 thermal hydraulic model benchmarking and baselining

    Energy Technology Data Exchange (ETDEWEB)

    Sathyanarayana, K.

    1998-09-28

    Project W-320 will be retrieving waste from Tank 241-C-106 and transferring the waste to Tank 241-AY-102. Waste in both tanks must be maintained below applicable thermal limits during and following the waste transfer. Thermal hydraulic process control models will be used for process control of the thermal limits. This report documents the process control models and presents a benchmarking of the models with data from Tanks 241-C-106 and 241-AY-102. Revision 1 of this report will provide a baselining of the models in preparation for the initiation of sluicing.

  19. Scanner baseliner monitoring and control in high volume manufacturing

    Science.gov (United States)

    Samudrala, Pavan; Chung, Woong Jae; Aung, Nyan; Subramany, Lokesh; Gao, Haiyong; Gomez, Juan-Manuel

    2016-03-01

We analyze the performance of different customized models on baseliner overlay data and demonstrate a reduction in overlay residuals of about 10%. Smart Sampling sets were assessed and compared with the full wafer measurements. We found that grid performance can still be maintained with one-third of the total sampling points, while reducing metrology time by 60%. We also demonstrate the feasibility of achieving time-to-time matching using the scanner fleet manager, and thus of identifying tool drifts even when the tool monitoring controls are within spec limits. We also explore the variation of the scanner feedback constants with illumination sources.

  20. Baseline review of the U.S. LHC Accelerator project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23-26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. US scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the US LHC Accelerator project team. Cost estimates for each subsystem of the US LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then-year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of US deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as

  1. Organic Contamination Baseline Study on NASA JSC Astromaterial Curation Gloveboxes

    Science.gov (United States)

    Calaway, Michael J.; Allton, J. H.; Allen, C. C.; Burkett, P. J.

    2013-01-01

    Future planned sample return missions to carbon-rich asteroids and Mars in the next two decades will require strict handling and curation protocols as well as new procedures for reducing organic contamination. After the Apollo program, astromaterial collections have mainly been concerned with inorganic contamination [1-4]. However, future isolation containment systems for astromaterials, possibly nitrogen enriched gloveboxes, must be able to reduce organic and inorganic cross-contamination. In 2012, a baseline study was orchestrated to establish the current state of organic cleanliness in gloveboxes used by NASA JSC astromaterials curation labs that could be used as a benchmark for future mission designs.

  2. [Baseline correction method for spectrum signal of SF6 insulating air with optimum wavelet basis].

    Science.gov (United States)

    Liu, Yan; Liu, Kai; Tao, Wei-Liang; Wang, Xian-Pei

    2010-06-01

SF6 gas is widely used in power equipment as an excellent electrical insulating and arc-quenching medium. In the present paper, a baseline correction method based on an optimum wavelet basis is proposed for spectral detection of the composition of SF6 insulating gas, in the interest of power-system safety. In this method, the optimum wavelet basis is selected from the wavelet packet according to an energy-concentration criterion, so that the spectrum signal can be expressed in the time-frequency domain. The strong line-spectrum components are then removed from the signal by thresholding, to eliminate their interference with the fitting of the continuous spectrum. Finally, the fitted continuous spectrum is subtracted from the original spectrum to obtain the useful line-spectrum signal. The spectral-line intensities processed with the proposed algorithm reflect the concentrations of the components to be measured in the SF6 gas. Experiments analyzing the absorption spectrum of the SF6 insulating gas mixture show that the proposed algorithm can estimate and correct the drifting baseline accurately, and that its performance is better than that of an algorithm based on iterative wavelet transforms.
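
    A minimal sketch in the spirit of the method above (not the paper's exact algorithm, which selects an optimum basis from a wavelet packet): keep only the coarse approximation coefficients of a multilevel wavelet decomposition as the slowly drifting baseline and subtract it from the spectrum. The wavelet name, decomposition level, and synthetic spectrum are assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_baseline(spectrum, wavelet="db8", level=6):
    """Estimate a slowly varying baseline by zeroing all detail coefficients."""
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]   # drop detail scales
    return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

# Toy usage: a narrow absorption line riding on a linear drift.
x = np.linspace(0.0, 1.0, 2048)
spectrum = np.exp(-((x - 0.4) / 0.002) ** 2) + 0.3 * x
corrected = spectrum - wavelet_baseline(spectrum)
```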

  3. An intermediate gamma beta-beam neutrino experiment with long baseline

    CERN Document Server

    Meloni, Davide; Orme, Christopher; Palomares-Ruiz, Sergio; Pascoli, Silvia

    2008-01-01

    In order to address some fundamental questions in neutrino physics a wide, future programme of neutrino oscillation experiments is currently under discussion. Among those, long baseline experiments will play a crucial role in providing information on the value of theta13, the type of neutrino mass ordering and on the value of the CP-violating phase delta, which enters in 3-neutrino oscillations. Here, we consider a beta-beam setup with an intermediate Lorentz factor gamma=450 and a baseline of 1050 km. This could be achieved in Europe with a beta-beam sourced at CERN to a detector located at the Boulby mine in the United Kingdom. We analyse the physics potential of this setup in detail and study two different exposures (1 x 10^{21} and 5 x 10^{21} ions-kton-years). In both cases, we find that the type of neutrino mass hierarchy could be determined at 99% CL, for all values of delta, for sin^2(2 theta13) > 0.03. In the high-exposure scenario, we find that the value of the CP-violating phase delta could be meas...

  4. A multivariate based event detection method and performance comparison with two baseline methods.

    Science.gov (United States)

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or by the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded a higher probability of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate.
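
    A minimal sketch of the detection idea described above: compute the pairwise correlation coefficients of several water-quality sensors over a sliding window, stack them into a correlation-indicator vector, and raise an alarm when its Euclidean distance from a baseline indicator exceeds a threshold. The window length, threshold, and toy data are assumptions.

```python
import numpy as np

def correlation_indicator(window):
    """window: (nsamples, nsensors) array of recent sensor readings."""
    corr = np.corrcoef(window, rowvar=False)
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]                               # vector of pairwise correlation coefficients

def detect_event(window, reference_indicator, threshold=0.8):
    """Flag an event if the indicator drifts far (Euclidean distance) from the reference."""
    distance = np.linalg.norm(correlation_indicator(window) - reference_indicator)
    return distance > threshold, distance

# Toy usage: build the reference from clean data, then screen incoming windows.
clean = np.random.default_rng(0).normal(size=(120, 4))     # 4 sensors, 120 samples
reference = correlation_indicator(clean)
alarm, d = detect_event(clean, reference)                   # distance ~0, no alarm
```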

  5. NOvA Short-Baseline Tau-Neutrino Appearance Search

    Science.gov (United States)

    Keloth, Rijeesh

    2017-01-01

    Three-flavor neutrino oscillations have successfully explained a wide range of neutrino oscillation experiment results. However, anomalous results, such as the electron-antineutrino appearance excess seen by LSND and MiniBooNE, do not fit the three-flavor paradigm and can be explained by the addition of a sterile neutrino at a larger mass scale than the existing three flavor mass states. The NOvA experiment consists of two finely segmented, liquid scintillator detectors operating 14.6 mrad off-axis from the NuMI muon-neutrino beam. The Near Detector is located on the Fermilab campus, 1 km from the NuMI target, while the Far Detector is located at Ash River, MN, 810 km from the NuMI target. The NOvA experiment is primarily designed to measure electron-neutrino appearance at the Far Detector using the Near Detector to control systematic uncertainties; however, the Near Detector is well suited for searching for anomalous short-baseline oscillations. I will present a novel method for selecting tau neutrino interactions with high purity at the Near Detector using a convolutional neural network. Using this method, the sensitivity to anomalous short-baseline tau-neutrino appearance due to sterile neutrino oscillations will be presented.

  6. Comprehensive baseline environmental audit of former underground test areas in Colorado, Nevada, and New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    1994-05-01

    This report documents the results of the Comprehensive Baseline Environmental Audit of Former Underground Test Areas (FUTAS) in the States of Colorado, Nevada, and New Mexico. DOE and contractor systems for management of environmental protection activities on the Nevada Test Site (NTS) were not within the scope of the audit. The audit was conducted May 16-May 26, 1994, by the Office of Environmental Audit (EH-24). DOE 5482.1B, "Environment, Safety, and Health Appraisal Program", establishes the mission of EH-24, which is to provide comprehensive, independent oversight of Department-wide environmental programs on behalf of the Secretary of Energy. The ultimate goal of EH-24 is to enhance environmental protection and minimize risk to public health and the environment. EH-24 accomplishes its mission using systematic and periodic evaluations of DOE's environmental programs within line organizations and supplemental activities that strengthen self-assessment and oversight functions within program, field, and contractor organizations. These evaluations function as a vehicle through which the Secretary and program managers are apprised of the status and vulnerabilities of Departmental environmental activities and environmental management systems. Several types of evaluations are conducted, including: (1) comprehensive baseline environmental audits; (2) routine environmental audits; (3) environmental management assessments; and (4) special issue reviews.

  7. Rapidly shifting environmental baselines among fishers of the Gulf of California

    Science.gov (United States)

    Sáenz-Arroyo, Andrea; Roberts, Callum M; Torre, Jorge; Cariño-Olvera, Micheline; Enríquez-Andrade, Roberto R

    2005-01-01

    Shifting environmental baselines are inter-generational changes in perception of the state of the environment. As one generation replaces another, people's perceptions of what is natural change even to the extent that they no longer believe historical anecdotes of past abundance or size of species. Although widely accepted, this phenomenon has yet to be quantitatively tested. Here we survey three generations of fishers from Mexico's Gulf of California (N=108), where fish populations have declined steeply over the last 60 years, to investigate how far and fast their environmental baselines are shifting. Compared to young fishers, old fishers named five times as many species and four times as many fishing sites as once being abundant/productive but now depleted (Kruskal–Wallis tests, both p<0.001) with no evidence of a slowdown in rates of loss experienced by younger compared to older generations (Kruskal–Wallis test, n.s. in both cases). Old fishers caught up to 25 times as many Gulf grouper Mycteroperca jordani as young fishers on their best ever fishing day (regression r2=0.62, p<0.001). Despite times of plentiful large fish still being within living memory, few young fishers appreciated that large species had ever been common or nearshore sites productive. Such rapid shifts in perception of what is natural help explain why society is tolerant of the creeping loss of biodiversity. They imply a large educational hurdle in efforts to reset expectations and targets for conservation. PMID:16191603

  8. Large-θ13 perturbation theory of neutrino oscillation for long-baseline experiments

    Science.gov (United States)

    Asano, Katsuhiro; Minakata, Hisakazu

    2011-06-01

    The Cervera et al. formula, the best known approximate formula of the neutrino oscillation probability for long-baseline experiments, can be regarded as a second-order perturbative formula with small expansion parameter ε ≡ Δm²₂₁/Δm²₃₁ ≃ 0.03 under the assumption s₁₃ ≃ ε. If θ13 is large, as suggested by a candidate νe event at T2K as well as the recent global analyses, higher-order corrections in s₁₃ to the formula would be needed for better accuracy. We compute the corrections systematically by formulating a perturbative framework that takes θ13 to be of order s₁₃ ∼ √ε ≃ 0.18, which guarantees its validity in a wide range of θ13 below the Chooz limit. We show on general grounds that the correction terms must be of order ε². Yet, they nicely fill the mismatch between the approximate and the exact formulas at low energies and relatively long baselines. General theorems are derived which serve for a better understanding of the δ-dependence of the oscillation probability. Some interesting implications of the large-θ13 hypothesis are discussed.
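
    Stated compactly (as a reading of the abstract rather than a reproduction of the paper's derivation), the power counting behind this framework is

    ```latex
    \varepsilon \equiv \frac{\Delta m^2_{21}}{\Delta m^2_{31}} \simeq 0.03, \qquad
    s_{13} \sim \sqrt{\varepsilon} \simeq 0.18
    \quad\Longrightarrow\quad s_{13}^{2} \sim \varepsilon ,
    ```

    so terms quadratic in s₁₃ enter at the same order as the solar mass-squared splitting, and corrections beyond the second-order Cervera et al. formula first appear at order ε², as stated above.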

  9. The latitudinal distribution of the baseline geomagnetic field during the March 17, 2015 geomagnetic storm

    Science.gov (United States)

    Alberti, Tommaso; Piersanti, Mirko; Lepreti, Fabio; Vecchio, Antonio; De Michelis, Paola; Villante, Umberto; Carbone, Vincenzo

    2016-04-01

    Geomagnetic storms (GS) are global geomagnetic disturbances that result from the interaction between magnetized plasma propagating from the Sun and the plasma and magnetic fields in the near-Earth space environment. The Dst (Disturbance Storm Time) global ring current index is still taken to be the definitive representation of a geomagnetic storm and is widely used by researchers. Recent in situ measurements by satellites passing through the ring-current region (i.e., the Van Allen Probes) and computations with magnetospheric field models have shown that there are many other field contributions to the geomagnetic storm-time variations at middle and low latitudes. Applying the Empirical Mode Decomposition [Huang et al., 1998] to magnetospheric and ground observations, we detect the different magnetic field contributions during a GS and introduce the concepts of the modulated baseline and the fluctuations of the geomagnetic field. In this work, we apply this method to study the latitudinal distribution of the baseline geomagnetic field during the 2015 St. Patrick's Day geomagnetic storm in order to extract physical information concerning the differences between high-latitude and equatorial ground measurements.

  10. Scheme for generating and transporting THz radiation to the X-ray experimental floor at LCLS baseline

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2011-08-15

    This paper describes a novel scheme for integrating a coherent THz source into the baseline of the LCLS facility. Any method relying on the spent electron beam downstream of the baseline undulator should provide a way of transporting the radiation up to the experimental floor. Here we propose to use the dump-area access maze. In this way the THz output must propagate with limited transverse size for at least one hundred meters through the maze, following many turns, to reach the near experimental hall. The use of a standard, discrete, open beam waveguide formed by periodic reflectors, that is, a mirror guide, would lead to an unacceptably large system. To avoid these problems, we propose an alternative approach based on periodically spaced metallic screens with holes. This quasi-optical transmission line is referred to as an iris line. We present complete calculations for the iris line using both analytical and numerical methods, which we find to be in good agreement. We present a design of a THz edge-radiation source based on the use of an iris line. The proposed setup requires almost no cost or time to implement at the LCLS baseline, and can be used at other facilities as well. The edge-radiation source is limited in the maximally achievable field strength at the sample. An extension based on the use of an undulator in the presence of the iris line, which is feasible at the LCLS energies, is proposed as a possible upgrade of the baseline THz source. (orig.)

  11. Distributed State Machine Supervision for Long-baseline Gravitational-wave Detectors

    CERN Document Server

    Rollins, Jameson Graef

    2016-01-01

    The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely spaced, long-baseline gravitational-wave detectors. Each LIGO detector consists of a complex optical system, isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital feedback control systems. This article describes a novel state-machine-based automation platform developed to handle the automation and supervisory control challenges of the Advanced LIGO detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes, organized hierarchically for full detector control. User code is written in standard Python, and the platform is designed to facilitate the fast, intense development pace associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experime...
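
    The Guardian code itself is not reproduced here; the following is a minimal, hypothetical Python sketch of the pattern the abstract describes (an automaton node that steps through an ordered set of states toward a requested state), with all class and method names invented for illustration.

    ```python
    # Hypothetical sketch of a state-machine automaton node; not the Guardian API.

    class State:
        """A state runs `main` repeatedly until it reports completion."""
        def main(self):
            raise NotImplementedError

    class Down(State):
        def main(self):
            print("subsystem idle")
            return True                      # finished; the node may advance

    class Locked(State):
        def main(self):
            print("servo loops engaged, holding lock")
            return True

    class Node:
        """An automaton node that walks an ordered path of states toward a request."""
        def __init__(self, path):
            self.path = path                 # ordered list of State classes
            self.index = 0

        def request(self, target):
            goal = [s.__name__ for s in self.path].index(target)
            while self.index < goal:
                if self.path[self.index]().main():   # state finished successfully
                    self.index += 1
            print(f"reached {target}")

    # A supervisory node could itself issue requests to lower-level nodes,
    # giving the hierarchical organization mentioned above.
    Node([Down, Locked]).request("Locked")
    ```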

  12. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    Energy Technology Data Exchange (ETDEWEB)

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage, direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  13. Direct coal liquefaction baseline design and system analysis. Quarterly report, May--August 1990

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage, direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  14. The LOFT wide field monitor

    DEFF Research Database (Denmark)

    Brandt, Søren; Hernanz, M.; Alvarez, L.

    2012-01-01

    class large area detector (LAD) with a field of view and a wide field monitor (WFM) instrument based on the coded mask principle, providing coverage of more than 1/3 of the sky. The LAD will provide an effective area ~20 times larger than any previous mission and will by timing studies...... be able to address fundamental questions about strong gravity in the vicinity of black holes and the equation of state of nuclear matter in neutron stars. The prime goal of the WFM will be to detect transient sources to be observed by the LAD. However, with its wide field of view and good energy...... resolution of field of the Galactic Center. The high duty...

  15. Wide Bandgap Extrinsic Photoconductive Switches

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, James S. [State Univ. of New York (SUNY), Plattsburgh, NY (United States); Univ. of California, Davis, CA (United States)

    2012-01-20

    Photoconductive semiconductor switches (PCSS) have been investigated since the late 1970s. Some devices have been developed that withstand tens of kilovolts and others that switch hundreds of amperes. However, no single device has been developed that can reliably withstand both high voltage and switch high current. Yet photoconductive switches still hold the promise of reliable high-voltage and high-current operation with subnanosecond risetimes, particularly since good-quality, bulk, single-crystal, wide-bandgap semiconductor materials have recently become available. In this chapter we review the basic operation of PCSS devices, the status of PCSS devices, and the properties of the wide bandgap semiconductors 4H-SiC, 6H-SiC and 2H-GaN.

  16. Defining the baseline in social life cycle assessment

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Finkbeiner, Matthias; Jørgensen, Michael Søgaard

    2010-01-01

    A relatively broad consensus has formed that the purpose of developing and using the social life cycle assessment (SLCA) is to improve the social conditions for the stakeholders affected by the assessed product's life cycle. To create this effect, the SLCA, among other things, needs to provide...... valid assessments of the consequence of the decision that it is to support. The consequence of a decision to implement a life cycle of a product can be seen as the difference between the decision being implemented and 'non-implemented' product life cycle. This difference can to some extent be found...... using the consequential environmental life cycle assessment (ELCA) methodology to identify the processes that change as a consequence of the decision. However, if social impacts are understood as certain changes in the lives of the stakeholders, then social impacts are not only related to product life...

  17. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

    Energy Technology Data Exchange (ETDEWEB)

    Price, Lynn; Murtishaw, Scott; Worrell, Ernst

    2003-06-01

    industry-specific metric for reporting and tracking GHG emissions trends to accurately reflect year-to-year changes while protecting proprietary data. This GHG intensity index would provide Registry participants with a means for demonstrating improvements in their energy and GHG emissions per unit of production without divulging specific values. For the second research area, Berkeley Lab evaluated various methods used to calculate baselines for documentation of energy consumption or GHG emissions reductions, noting those that use industry-specific metrics. Accounting for actions to reduce GHGs can be done on a project-by-project basis or on an entity basis. Establishing project-related baselines for mitigation efforts has been widely discussed in the context of two of the so-called "flexible mechanisms" of the Kyoto Protocol to the United Nations Framework Convention on Climate Change (Kyoto Protocol): Joint Implementation (JI) and the Clean Development Mechanism (CDM).
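
    Purely as an illustration of the kind of industry-specific intensity metric discussed here (the Registry's actual metric definitions are not reproduced, and the numbers below are invented), a GHG intensity index can be expressed as emissions per unit of production, normalized to a base year so that trends can be reported without divulging absolute figures.

    ```python
    # Illustrative only: a normalized GHG intensity index (base year = 100),
    # assuming annual emissions (tCO2e) and production volumes are available.

    def intensity_index(emissions, production, base_year):
        """Emissions per unit of production, scaled so the base year equals 100."""
        intensity = {yr: emissions[yr] / production[yr] for yr in emissions}
        return {yr: round(100.0 * intensity[yr] / intensity[base_year], 1) for yr in intensity}

    emissions  = {2000: 120_000, 2001: 118_000, 2002: 121_000}   # tCO2e (made-up values)
    production = {2000:  50_000, 2001:  52_000, 2002:  56_000}   # units of product (made-up)
    print(intensity_index(emissions, production, base_year=2000))
    # A falling index indicates improvement per unit of production without
    # revealing the underlying production or emissions values.
    ```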

  18. Wide and High Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Post, Brian K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Roschli, Alex C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-03-01

    The goal of this project is to develop and demonstrate the enabling technologies for Wide and High Additive Manufacturing (WHAM). WHAM will open up new areas of U.S. manufacturing for very large tooling in support of the transportation and energy industries, significantly reducing cost and lead time. As with Big Area Additive Manufacturing (BAAM), the initial focus is on the deposition of composite materials.

  19. EMD-based GPS baseline solution and validation test

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; GAO Jing-xiang; WANG Jin-ling; XU Chang-hui

    2008-01-01

    A GPS baseline solution model based on the Empirical Mode Decomposition (EMD) is presented, which has the advantage of eliminating the error effects outside the model. The EMD technique is a new signal processing method for non-linear time series that decomposes a time series into a finite and often small number of Intrinsic Mode Functions (IMFs). The decomposition procedure is adaptive and data-driven, which makes it suitable for non-linear data series analysis. A multi-scale decomposition and reconstruction architecture is defined on the basis of the EMD theory, and the error mitigation model is demonstrated as well. A criterion for selecting the scales used to eliminate the errors outside the model is given in terms of the mean of the accumulated standardized modes. Thereafter, the scheme of the GPS baseline solution based on the EMD is presented. The float-solution residuals of the Double-Difference (DD) observation equation are used to extract the errors outside the model, which are then applied to correct the GPS DD measurements. The float solution is then computed again, and the fixed solution is obtained with the LAMBDA algorithm. Three schemes are designed to test the proposed model, and the experimental results show that the proposed model dramatically improves the reliability of ambiguity resolution after the elimination of the errors outside the model.
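
    A minimal sketch of the core step described above (decomposing float-solution residuals into IMFs and reconstructing the slowly varying errors from the coarser scales) is given below; it assumes the third-party PyEMD package, and the fixed cut-off index is an illustrative stand-in for the paper's scale-selection criterion. The subsequent re-estimation of the float solution and the LAMBDA ambiguity resolution are not reproduced.

    ```python
    import numpy as np
    from PyEMD import EMD   # assumes the PyEMD package is installed

    def emd_error_estimate(residuals, keep_from=3):
        """Decompose DD float-solution residuals into IMFs and rebuild the
        slowly varying 'errors outside the model' from the coarser scales.
        `keep_from` stands in for the scale-selection rule of the paper."""
        imfs = EMD()(residuals)              # rows: finest IMF ... residue (coarsest)
        return imfs[keep_from:].sum(axis=0)  # low-frequency error estimate

    # Synthetic example: white noise plus a slow drift standing in for unmodeled errors.
    t = np.linspace(0.0, 1.0, 512)
    residuals = 0.005 * np.random.randn(512) + 0.02 * np.sin(2 * np.pi * 1.5 * t)
    corrected = residuals - emd_error_estimate(residuals)
    print(residuals.std(), corrected.std())  # the corrected residuals should scatter less
    ```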

  20. Study of neutrino oscillations in long-baseline accelerator experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kudenko, Yurii G [Institute for Nuclear Research, Russian Academy of Sciences, Moscow (Russian Federation)

    2011-06-30

    A review of the title subject is given. The phenomenology of neutrino oscillations in the framework of the so-called neutrino Standard Model (νSM) with three active neutrinos is considered. The recently completed long-baseline accelerator experiment K2K and the currently in-progress MINOS and OPERA experiments are described in detail. The oscillation parameters obtained from the global analysis of all oscillation data are given. The short-baseline experiment MiniBooNE and its results on the search for light sterile neutrinos are discussed in detail. Considerable attention is given to searching for νμ → νe oscillations and measuring the θ13 angle in muon neutrino experiments. The concept of the off-axis neutrino beam is reviewed. The T2K experiment, collecting statistics since early 2010, is described in terms of its details and objectives. The NOνA experiment under construction and the next-generation beta-beam and neutrino factory experiments are also discussed. (reviews of topical problems)

  1. Baseline urologic surgical skills among medical students: Differentiating trainees.

    Science.gov (United States)

    Gupta, Vishaal; Lantz, Andrea G; Alzharani, Tarek; Foell, Kirsten; Lee, Jason Y

    2014-07-01

    Urology training programs seek to identify ideal candidates with the potential to become competent urologic surgeons. It is unclear whether innate technical ability has a role in this selection process. We aimed to determine whether there are any innate differences in baseline urologic technical skills among medical students. Second-year medical students from the University of Toronto were recruited for this study and stratified into surgical and non-surgical cohorts based on their reported career aspirations. After a pre-test questionnaire, subjects were tested on several urologic surgical skills: laparoscopy, cystoscopy and robotic surgery. Statistical analysis was performed using the chi-squared test, Student's t-tests and Spearman's correlation where appropriate. A total of 29 students participated in the study, and no significant baseline differences were found between cohorts with respect to demographics and prior surgical experience. For laparoscopic skills, the surgical cohort outperformed the non-surgical cohort on several exercises, including Lap Beans Missed (4.9 vs. 9.3), as well as on robotic surgery performance metrics such as Task Time (50.6 vs. 76.3). Baseline urologic technical skills, including laparoscopy and robotics, may differ between early trainees interested in a surgical career compared to those interested in a non-surgical career. Further studies are required to elicit what impact such differences have on future performance and competence.

  2. The IUGS/IAGC Task Group on Global Geochemical Baselines

    Institute of Scientific and Technical Information of China (English)

    David B.Smith; Shaun Reeder; Alecos Demetriades

    2012-01-01

    The Task Group on Global Geochemical Baselines, operating under the auspices of both the International Union of Geological Sciences (IUGS) and the International Association of Geochemistry (IAGC), has the long-term goal of establishing a global geochemical database to document the concentration and distribution of chemical elements in the Earth's surface or near-surface environment. The database and accompanying element distribution maps represent a geochemical baseline against which future human-induced or natural changes to the chemistry of the land surface may be recognized and quantified. In order to accomplish this long-term goal, the activities of the Task Group include: (1) developing partnerships with countries conducting broad-scale geochemical mapping studies; (2) providing consultation and training in the form of workshops and short courses; (3) organizing periodic international symposia to foster communication among the geochemical mapping community; (4) developing criteria for certifying those projects whose data are acceptable in a global geochemical database; (5) acting as a repository for data collected by those projects meeting the criteria for standardization; (6) preparing complete metadata for the certified projects; and (7) preparing, ultimately, a global geochemical database. This paper summarizes the history and accomplishments of the Task Group since its first predecessor project was established in 1988.

  3. Baselines for the Pan-Canadian science curriculum framework.

    Science.gov (United States)

    Liu, Xiufeng

    2013-01-01

    Using a Canadian student achievement assessment database, the Science Achievement Indicators Program (SAIP), and employing the Rasch partial credit measurement model, this study estimated the difficulties of items corresponding to the learning outcomes in the Pan-Canadian science curriculum framework and the latent abilities of students of grades 7, 8, 10, 11, 12 and OAC (Ontario Academic Course). The above estimates serve as baselines for validating the Pan-Canadian science curriculum framework in terms of the learning progression of learning outcomes and expected mastery of learning outcomes by grades. It was found that there was no statistically significant progression in learning outcomes from grades 4-6 to grades 7-9, and from grades 7-9 to grades 10-12; the curriculum framework sets mastery expectation about 2 grades higher than students' potential abilities. In light of the above findings, this paper discusses theoretical issues related to deciding progression of learning outcomes and setting expectation of student mastery of learning outcomes, and highlights the importance of using national assessment data to establish baselines for the above purposes. This paper concludes with recommendations for further validating the Pan-Canadian science curriculum frameworks.

  4. The IUGS/IAGC Task Group on Global Geochemical Baselines

    Science.gov (United States)

    Smith, David B.; Wang, Xueqiu; Reeder, Shaun; Demetriades, Alecos

    2012-01-01

    The Task Group on Global Geochemical Baselines, operating under the auspices of both the International Union of Geological Sciences (IUGS) and the International Association of Geochemistry (IAGC), has the long-term goal of establishing a global geochemical database to document the concentration and distribution of chemical elements in the Earth’s surface or near-surface environment. The database and accompanying element distribution maps represent a geochemical baseline against which future human-induced or natural changes to the chemistry of the land surface may be recognized and quantified. In order to accomplish this long-term goal, the activities of the Task Group include: (1) developing partnerships with countries conducting broad-scale geochemical mapping studies; (2) providing consultation and training in the form of workshops and short courses; (3) organizing periodic international symposia to foster communication among the geochemical mapping community; (4) developing criteria for certifying those projects whose data are acceptable in a global geochemical database; (5) acting as a repository for data collected by those projects meeting the criteria for standardization; (6) preparing complete metadata for the certified projects; and (7) preparing, ultimately, a global geochemical database. This paper summarizes the history and accomplishments of the Task Group since its first predecessor project was established in 1988.

  5. Verification and optimization of the CFETR baseline scenario

    Science.gov (United States)

    Zhao, D.; Lao, L. L.; Meneghini, O.; Staebler, G. M.; Candy, J.; Smith, S. P.; Snyder, P. B.; Prater, R.; Chen, X.; Chan, V. S.; Li, J.; Chen, J.; Shi, N.; Guo, W.; Pan, C.; Jian, X.

    2016-10-01

    The baseline scenario of the China Fusion Engineering Test Reactor (CFETR) was designed starting from 0D calculations. The CFETR baseline scenario satisfies the minimum goal of a Fusion Nuclear Science Facility aimed at bridging the gaps between ITER and DEMO. 1.5D calculations are presented to verify the ongoing efforts in higher-dimensional modeling of CFETR. Steady-state scenarios are calculated self-consistently with the OMFIT integrated modeling framework, which includes EFIT for equilibrium, ONETWO for sources and current, and TGYRO for transport. With 68 MW of neutral beam power and 8 MW of ECH injected into the plasma, the average ion temperature is maintained at 15 keV, while 150 MW of fusion power is produced. The neutral beams also drive 55% of the plasma current. Modest fast-ion diffusion will reduce the NBCD and substantially affect the profile. Top-launch ECH will increase the current drive and the power absorption rate. The EPED model is being included. Work supported by U.S. DOE under DE-FC02-04ER54698 and the USTC CFETR contract.

  6. A baseline for the multivariate comparison of resting state networks

    Directory of Open Access Journals (Sweden)

    Elena A Allen

    2011-02-01

    As the size of functional and structural MRI datasets expands, it becomes increasingly important to establish a baseline from which diagnostic relevance may be determined, a processing strategy that efficiently prepares data for analysis, and a statistical approach that identifies important effects in a manner that is both robust and reproducible. In this paper, we introduce a multivariate analytic approach that optimizes sensitivity and reduces unnecessary testing. We demonstrate the utility of this mega-analytic approach by identifying the effects of age and gender on the resting state networks of 603 healthy adolescents and adults (mean age: 23.4 years, range: 12 to 71 years). Data were collected on the same scanner, preprocessed using an automated analysis pipeline based in SPM, and studied using group independent component analysis. Resting state networks were identified and evaluated in terms of three primary outcome measures: time course spectral power, spatial map intensity, and functional network connectivity. Results revealed robust effects of age on all three outcome measures, largely indicating decreases in network coherence and connectivity with increasing age. Gender effects were of smaller magnitude but suggested stronger intra-network connectivity in females and more inter-network connectivity in males, particularly with regard to sensorimotor networks. These findings, along with the analysis approach and statistical framework described here, provide a useful baseline for future investigations of brain networks in health and disease.

  7. Pilot Implementation of Health Information Systems

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2009-01-01

    Pilot implementation is a powerful and widely used approach in identifying design flaws and implementation issues before the full-scale deployment of new health information systems. However, pilot implementations often fail in the sense that they say little about the usability and usefulness...... information system. Based on the findings from this study, we identify three main challenges: (1) defining an appropriate scope for pilot implementation, (2) managing the implementation process, and (3) ensuring commitment to the pilot. Finally, recommendations for future research and implications...... of the proposed system designs. This calls for studies that seek to uncover and analyze the reasons for failure, so that guidelines for conducting such pilots can be developed. In this paper, we present a qualitative field study of an ambitious, but unsuccessful pilot implementation of a Danish healthcare...

  9. Dynamical Evolution of Wide Binaries

    Directory of Open Access Journals (Sweden)

    Esmeralda H. Mallada

    2001-01-01

    We simulate numerically encounters of wide binaries with field stars and Giant Molecular Clouds (GMCs) by means of the impulse approximation. We analyze the time evolution of the distributions of eccentricities and semimajor axes of wide binaries with given initial conditions, at intervals of 10^9 yr, up to 10^10 yr (the assumed age of the Galaxy). We compute the fraction of surviving binaries for stellar encounters, for GMC encounters and for a combination of both, and hence the dynamical lifetime for different semimajor axes and different binary masses (0.5, 1, 1.2, 1.5, 2.5, and 3 Msolar). We find that the dynamical lifetime of wide binaries considering only GMCs is half of that considering only stars. For encounters with GMCs we analyze the influence of the initial inclination of the orbital plane of the binary with respect to the plane perpendicular to the relative velocity vector of the binary and the GMC. We find that the perturbation is maximum when this angle is minimum.

  10. Wide Bandgap Extrinsic Photoconductive Switches

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, James S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-07-03

    Semi-insulating gallium nitride and 4H and 6H silicon carbide are attractive materials for compact, high-voltage, extrinsic, photoconductive switches due to their wide bandgap, high dark resistance, high critical electric field strength and high electron saturation velocity. These wide bandgap semiconductors are made semi-insulating by the addition of vanadium (4H- and 6H-SiC) and iron (2H-GaN) impurities that form deep acceptors. These deep acceptors trap electrons donated from shallow donor impurities. The electrons can be optically excited from these deep acceptor levels into the conduction band to transition the wide bandgap semiconductor materials from a semi-insulating to a conducting state. Extrinsic photoconductive switches with opposing electrodes have been constructed using vanadium-compensated 6H-SiC and iron-compensated 2H-GaN. These extrinsic photoconductive switches were tested at high voltage and high power to determine whether they could be successfully used as the closing switch in compact medical accelerators.

  11. Managing Student Behavior in Dual Immersion Classrooms: A Study of Class-Wide Function-Related Intervention Teams.

    Science.gov (United States)

    Hansen, Blake D; Caldarella, Paul; Williams, Leslie; Wills, Howard P

    2017-09-01

    Classroom management in dual immersion classrooms includes unique challenges. The teacher must instruct and correct in the L2, in which students are beginning learners, and must use effective classroom management strategies appropriate to the L2 context. Class-Wide Function-Related Intervention Teams (CW-FIT) is a positive classroom management program that teaches social skills and uses group contingencies to improve behavior. The present study examined the ability of French immersion teachers to implement CW-FIT in the L2, including the effects of CW-FIT on teacher praise and reprimand rates as well as on students' classroom behavior. Social validity was also assessed. A single-subject multiple baseline design with embedded reversals was used to evaluate impact in second-, third-, and fourth-grade dual immersion classrooms. Results indicated that dual immersion teachers were able to implement CW-FIT in the L2 with fidelity. The intervention significantly increased teacher praise and improved classroom on-task behavior. Changes in teacher reprimand rates were inconsistent. Students and teachers reported CW-FIT to be socially valid.

  12. Teleradiology: opportunities, problems, implementation.

    Science.gov (United States)

    Williams, O L; Singh, S K

    1996-01-01

    With the introduction of computerized tomography and magnetic resonance imaging in the early 1970s, computers became integral to the imaging process. Shortly thereafter, scanners that create digitized images from film were introduced and teleradiology--images transmitted in real time--became possible. In the early 1980s, the idea of a picture archiving and communications system (PACS) began to develop. It promised to retrieve, connect and store every kind of image, from x-ray to CT, and render film obsolete. However, inflated expectations and inadequate technology hindered PACS implementation. Digital imaging offers the following benefits over film-based systems: - Less space is needed to store imaging studies. - Digital imaging files can be faster and easier for referring physicians to retrieve than film and are not susceptible to loss and damage. - Digital images can be enhanced, contrasted, colored and otherwise manipulated to maximize available information. - There are no chemicals to dispose of. While telemedicine promises to increase the efficiency and effectiveness of medical professionals, wide-scale implementation faces the following obstacles: - It has been difficult to establish a uniform standard that allows file transfer among systems made by different vendors. - There are unresolved legal questions about "interstate" medical practice as it occurs in teleradiology and about standards of care and image quality. - Any system available on a network is vulnerable to unauthorized users who may invade the database or operation of the system, and it is very difficult to detect fraud--data that has been tampered with--in digital records. - Connections to remote locations depend on local telephone lines, which may be slow. Other options are available, but they may be too expensive for facilities in the rural areas that need them the most. - Digital imaging equipment is still very costly to acquire and install. The future of telemedicine rests now with those who

  13. Baseline values of micronuclei and comet assay in the lizard Tupinambis merianae (Teiidae, Squamata).

    Science.gov (United States)

    Schaumburg, Laura G; Poletta, Gisela L; Siroski, Pablo A; Mudry, Marta D

    2012-10-01

    The Micronucleus test (MN) and Comet assay (CA) are currently the most widely used methods for characterizing DNA damage induced by physical and chemical agents in wild species. The continuous expansion of the cultivated areas in Argentina since the introduction of transgenic crops, mainly soy, in association with the increased use of pesticides, has deeply transformed the natural environments where the lizard Tupinambis merianae (tegu lizard) occurs. Despite the fact that reptiles have been shown to be excellent bioindicators of environmental contaminants, there is no record of genotoxicity studies in T. merianae. The aim of the present study was to adapt the MN test and CA protocols for application to erythrocytes of T. merianae and to determine the baseline values of DNA damage in this species. We used 20 adult lizards (10 males: 10 females) from Estación Zoológica Experimental "Granja La Esmeralda" (Santa Fe, Argentina). Peripheral blood samples were collected from all animals and the MN test and CA were applied according to the protocols established for other reptilian species. We tested critical parameters of the CA protocol (cell density, unwinding and electrophoresis times) using increasing concentrations of H2O2 (10, 25 and 50 μM) as a known genotoxic agent to induce DNA damage. Based on this, we determined the most suitable conditions for the CA in this species: a cell density of 4×10^3 erythrocytes per slide, 10 min of unwinding and 15 min of electrophoresis at approximately 0.90 V/cm. The baseline frequency of micronuclei (BFMN = MN/1000 erythrocytes counted) determined for this species was 0.95±0.27, and the basal damage index (BDI, calculated from 100 comet images classified in arbitrary units) was 103.85±0.97. No differences were observed between sexes in the BFMN or BDI (p>0.05), and no relation was found between baseline values and length or weight of the analyzed animals (p>0.05). These results demonstrated the sensitivity of both biomarkers of

  14. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2009-12-30

    provided, the remote location and low population density of some of the facilities. As such, the needs assessment contains equivalencies to the applicable requirements. The compliance assessment contains no such equivalencies and simply assesses the existing emergency response resources against the requirements of the BNA, and can be updated as compliance changes independent of the BNA update schedule. There are numerous NFPA codes and standards and other requirements and guidance documents that address the subject of emergency response. These requirements documents are not always well coordinated and may contain duplicative or conflicting requirements or even coverage gaps. Left unaddressed, this regulatory situation results in frequent interpretation of requirements documents. Different interpretations can then lead to inconsistent implementation. This BNA addresses this situation by compiling applicable requirements from all identified sources (see Section 5) and analyzing them collectively to address conflict and overlap as applicable to the hazards presented by the LLNL and Sandia/CA sites (see Section 7). The BNA also generates requirements when needed to fill any identified gaps in regulatory coverage. Finally, the BNA produces a customized, simple set of requirements, appropriate for the DOE protection goals (such as those defined in DOE O 420.1B), the hazard level, the population density, the topography, and the site layout at LLNL and Sandia/CA, that will be used as the baseline requirements set - the 'baseline needs' - for emergency response at LLNL and Sandia/CA. A template approach is utilized to accomplish this evaluation for each of the nine topical areas that comprise the baseline needs for emergency response. The basis for conclusions reached in determining the baseline needs for each of the topical areas is presented in Sections 7.1 through 7.9. This BNA identifies only mandatory requirements and establishes the minimum performance criteria. The minimum

  15. The ParkinsonNet trial: design and baseline characteristics.

    NARCIS (Netherlands)

    Keus, S.H.J.; Nijkrake, M.J.; Borm, G.F.; Kwakkel, G.; Roos, R.A.; Berendse, H.W.; Adang, E.M.M.; Overeem, S.; Bloem, B.R.; Munneke, M.

    2010-01-01

    The companion paper describes how implementation of professional networks (ParkinsonNet) may improve the quality and efficiency of allied health care in Parkinson's disease (PD). We designed a cluster-randomized controlled trial to evaluate this ParkinsonNet concept for one allied health discipline,

  16. Waste generator services implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, J.; Magleby, M.; Litus, M.

    1998-04-01

    Recurring waste management noncompliance problems have spurred a fundamental site-wide process revision to characterize and disposition wastes at the Idaho National Engineering and Environmental Laboratory. The reengineered method, termed Waste Generator Services, will streamline the waste acceptance process and provide waste generators comprehensive waste management services through a single, accountable organization to manage and disposition wastes in a timely, cost-effective, and compliant manner. This report outlines the strategy for implementing Waste Generator Services across the INEEL. It documents the culmination of efforts worked by the LMITCO Environmental Management Compliance Reengineering project team since October 1997. These efforts have included defining problems associated with the INEEL waste management process; identifying commercial best management practices; completing a review of DOE Complex-wide waste management training requirements; and involving others through an Integrated Process Team approach to provide recommendations on process flow, funding/charging mechanisms, and WGS organization. The report defines the work that will be performed by Waste Generator Services, the organization and resources, the waste acceptance process flow, the funding approach, methods for measuring performance, and the implementation schedule and approach. Field deployment will occur first at the Idaho Chemical Processing Plant in June 1998. Beginning in Fiscal Year 1999, Waste Generator Services will be deployed at the other major INEEL facilities in a phased approach, with implementation completed by March 1999.

  17. Spontaneous Neural Oscillations Bias Perception by Modulating Baseline Excitability.

    Science.gov (United States)

    Iemi, Luca; Chaumon, Maximilien; Crouzet, Sébastien M; Busch, Niko A

    2017-01-25

    The brain exhibits organized fluctuations of neural activity, even in the absence of tasks or sensory input. A prominent type of such spontaneous activity is the alpha rhythm, which influences perception and interacts with other ongoing neural activity. It is currently hypothesized that states of decreased prestimulus α oscillations indicate enhanced neural excitability, resulting in improved perceptual acuity. Nevertheless, it remains debated how changes in excitability manifest at the behavioral level in perceptual tasks. We addressed this issue by comparing two alternative models describing the effect of spontaneous α power on signal detection. The first model assumes that decreased α power increases baseline excitability, amplifying the response to both signal and noise, predicting a liberal detection criterion with no effect on sensitivity. The second model predicts that decreased α power increases the trial-by-trial precision of the sensory response, resulting in improved sensitivity. We tested these models in two EEG experiments in humans where we analyzed the effects of prestimulus α power on visual detection and discrimination using a signal detection framework. Both experiments provide strong evidence that decreased α power reflects a more liberal detection criterion, rather than improved sensitivity, consistent with the baseline model. In other words, when the task requires detecting stimulus presence versus absence, reduced α oscillations make observers more likely to report the stimulus regardless of actual stimulus presence. Contrary to previous interpretations, these results suggest that states of decreased α oscillations increase the global baseline excitability of sensory systems without affecting perceptual acuity. Spontaneous fluctuations of brain activity explain why a faint sensory stimulus is sometimes perceived and sometimes not. The prevailing view is that heightened neural excitability, indexed by decreased α oscillations, promotes
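
    For readers less familiar with the signal detection framework used here, a standard equal-variance SDT calculation (not the authors' analysis code) shows how criterion and sensitivity are separated from hit and false-alarm rates; the rates below are invented for illustration.

    ```python
    from statistics import NormalDist

    def sdt_measures(hit_rate, fa_rate):
        """Equal-variance signal detection theory: sensitivity d' and criterion c."""
        z = NormalDist().inv_cdf
        d_prime = z(hit_rate) - z(fa_rate)             # discriminability
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # negative c = liberal (more "yes")
        return d_prime, criterion

    # Under the baseline-excitability account, low alpha power amplifies responses to
    # signal *and* noise, so hits and false alarms both rise: c shifts, d' barely moves.
    print(sdt_measures(hit_rate=0.70, fa_rate=0.20))   # e.g. higher prestimulus alpha power
    print(sdt_measures(hit_rate=0.80, fa_rate=0.32))   # e.g. lower alpha power: more liberal c
    ```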

  18. Clean Energy-Related Economic Development Policy across the States: Establishing a 2016 Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Jeffrey J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-01-01

    States implement clean energy-related economic development policy to spur innovation, manufacturing, and to address other priorities. This report focuses on those policies most directly related to expanding new and existing manufacturing. The extent to which states invest in this policymaking depends on political drivers and jurisdictional economic development priorities. To date, no one source has collected all of the clean energy-related economic development policies available across the 50 states. Thus, it is unclear how many policies exist within each state and how these policies, when implemented, can drive economic development. Establishing the baseline of existing policy is a critical first step in determining the potential holistic impact of these policies on driving economic growth in a state. The goal of this report is to document the clean energy-related economic development policy landscape across the 50 states with a focus on policy that seeks to expand new or existing manufacturing within a state. States interested in promoting clean energy manufacturing in their jurisdictions may be interested in reviewing this landscape to determine how they compare to peers and to adjust their policies as necessary. This report documents over 900 existing clean energy-related economic development laws, financial incentives (technology-agnostic and clean energy focused), and other policies such as agency-directed programs and initiatives across the states.

  19. Wide-Field Detected Fourier Transform CARS Microscopy

    Science.gov (United States)

    Duarte, Alex Soares; Schnedermann, Christoph; Kukura, Philipp

    2016-11-01

    We present a wide-field imaging implementation of Fourier transform coherent anti-Stokes Raman scattering (wide-field detected FT-CARS) microscopy capable of acquiring high-contrast label-free but chemically specific images over the full vibrational ‘fingerprint’ region, suitable for a large field of view. Rapid resonant mechanical scanning of the illumination beam coupled with highly sensitive, camera-based detection of the CARS signal allows for fast and direct hyperspectral wide-field image acquisition, while minimizing sample damage. Intrinsic to FT-CARS microscopy, the ability to control the range of time-delays between pump and probe pulses allows for fine tuning of spectral resolution, bandwidth and imaging speed while maintaining full duty cycle. We outline the basic principles of wide-field detected FT-CARS microscopy and demonstrate how it can be used as a sensitive optical probe for chemically specific Raman imaging.
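
    A toy numerical illustration of the Fourier-transform step (not the instrument's actual processing pipeline): the CARS signal recorded as a function of pump-probe delay is an interferogram whose Fourier transform yields the Raman spectrum, and the maximum scanned delay sets the spectral resolution. The mode positions, delay step, and decay time below are arbitrary.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    c = 2.9979e10                       # speed of light, cm/s
    dt = 5e-15                          # delay step: 5 fs
    n = 4000                            # number of delay points -> 20 ps maximum delay
    t = np.arange(n) * dt
    modes_cm = [1000.0, 1600.0]         # hypothetical Raman shifts, cm^-1

    # Time-domain interferogram: each Raman mode contributes a decaying oscillation.
    signal = sum(np.cos(2 * np.pi * c * w * t) for w in modes_cm) * np.exp(-t / 5e-12)

    spectrum = np.abs(np.fft.rfft(signal))
    wavenumber = np.fft.rfftfreq(n, d=dt) / c            # frequency axis in cm^-1
    peaks, _ = find_peaks(spectrum, height=spectrum.max() / 2)

    print("spectral resolution ~", 1.0 / (c * n * dt), "cm^-1")   # set by the maximum delay
    print("recovered peaks near", wavenumber[peaks], "cm^-1")
    ```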

  20. Terahertz wide aperture reflection tomography

    Science.gov (United States)

    Pearce, Jeremy; Choi, Hyeokho; Mittleman, Daniel M.; White, Jeff; Zimdars, David

    2005-07-01

    We describe a powerful imaging modality for terahertz (THz) radiation, THz wide aperture reflection tomography (WART). Edge maps of an object's cross section are reconstructed from a series of time-domain reflection measurements at different viewing angles. Each measurement corresponds to a parallel line projection of the object's cross section. The filtered backprojection algorithm is applied to recover the image from the projection data. To our knowledge, this is the first demonstration of a reflection computed tomography technique using electromagnetic waves. We demonstrate the capabilities of THz WART by imaging the cross sections of two test objects.
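
    As a rough, self-contained illustration of the reconstruction step named above (filtered backprojection applied to parallel projections), the sketch below uses scikit-image's radon/iradon routines on a synthetic edge map; the formation of reflection-mode, time-domain projections is not modeled.

    ```python
    import numpy as np
    from skimage.transform import radon, iradon   # iradon = filtered backprojection

    # Stand-in "cross section": a hollow square, since WART reconstructs edge maps.
    image = np.zeros((128, 128))
    image[30:98, 30:98] = 1.0
    image[34:94, 34:94] = 0.0

    theta = np.linspace(0.0, 180.0, 90, endpoint=False)   # viewing angles, degrees
    sinogram = radon(image, theta=theta)                  # parallel line projections
    reconstruction = iradon(sinogram, theta=theta)        # default ramp-filtered backprojection

    err = np.abs(reconstruction - image).mean()
    print(f"mean absolute reconstruction error: {err:.3f}")
    ```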