WorldWideScience

Sample records for proposed baseline text

  1. A Proposed Arabic Handwritten Text Normalization Method

    Directory of Open Access Journals (Sweden)

    Tarik Abu-Ain

    2014-11-01

Full Text Available Text normalization is an important technique in document image analysis and recognition. It consists of many preprocessing stages, including slope correction, text padding, skew correction, and straightening of the writing line. As such, text normalization plays an important role in many procedures such as text segmentation, feature extraction, and character recognition. In the present article, a new method for text baseline detection, straightening, and slant correction of Arabic handwritten texts is proposed. The method comprises a set of sequential steps: first, the components are segmented and then thinned; next, the direction features of the skeletons are extracted and the candidate baseline regions are determined. After that, the correct baseline region is selected, and finally, the baselines of all components are aligned with the writing line. The experiments are conducted on the IFN/ENIT benchmark Arabic dataset. The results show that the proposed method has a promising and encouraging performance.
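To make "baseline detection" concrete, here is a minimal sketch using the classic horizontal-projection heuristic (the row containing the most ink pixels), which is much simpler than the paper's skeleton-direction method; the array and function names are ours:

```python
import numpy as np

def detect_baseline(binary_img):
    """Estimate the writing-baseline row of a binarized Arabic text line.

    Horizontal-projection heuristic: Arabic script's connecting stroke
    makes the baseline the row with the most ink pixels.  Illustrative
    only -- not the paper's skeleton-direction method.
    """
    projection = binary_img.sum(axis=1)   # ink pixels per row
    return int(np.argmax(projection))

# tiny synthetic "text line": a dense horizontal stroke on row 6
img = np.zeros((10, 40), dtype=np.uint8)
img[6, :] = 1         # the connecting baseline stroke
img[3, 5:10] = 1      # an ascender fragment above the baseline
print(detect_baseline(img))   # -> 6
```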

  2. New baseline correction algorithm for text-line recognition with bidirectional recurrent neural networks

    Science.gov (United States)

    Morillot, Olivier; Likforman-Sulem, Laurence; Grosicki, Emmanuèle

    2013-04-01

Many preprocessing techniques have been proposed for isolated word recognition. Recently, however, recognition systems have moved to text blocks and their constituent text lines. In this paper, we propose a new preprocessing approach to efficiently correct baseline skew and fluctuations. Our approach is based on a sliding window within which the vertical position of the baseline is estimated. Segmentation of text lines into subparts is thus avoided. Experiments conducted on a large publicly available database (Rimes), with a BLSTM (bidirectional long short-term memory) recurrent neural network recognition system, show that our baseline correction approach markedly improves performance.
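The sliding-window idea can be sketched as follows. This is a simplified stand-in (median ink height per window, linearly interpolated and subtracted), not the authors' estimator; all names are ours:

```python
import numpy as np

def correct_baseline(ys, xs, window=50, step=25):
    """Flatten baseline skew/fluctuation with a sliding window.

    ys, xs: row/column coordinates of the ink pixels of one text line.
    In each horizontal window the local baseline height is estimated
    (here: the median ink height); the windowed estimate is interpolated
    at every pixel and removed, so the line never has to be segmented.
    """
    ys = np.asarray(ys, dtype=float)
    xs = np.asarray(xs, dtype=float)
    centers, heights = [], []
    for x0 in range(0, int(xs.max()) + 1, step):
        in_win = (xs >= x0) & (xs < x0 + window)
        if in_win.any():
            centers.append(x0 + window / 2)
            heights.append(np.median(ys[in_win]))   # local baseline height
    return ys - np.interp(xs, centers, heights)

# a synthetic skewed line: baseline rises by 0.1 px per column
xs = np.arange(200)
flattened = correct_baseline(0.1 * xs, xs)
```

Away from the line ends, the skew is removed almost exactly; a real system would use a robust ink-specific height estimate instead of the plain median.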

  3. PROPOSAL OF A TABLE TO CLASSIFY THE RELIABILITY OF BASELINES OBTAINED BY GNSS TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Paulo Cesar Lima Segantine

Full Text Available The correct processing of GNSS measurements, as well as a correct interpretation of the results, are fundamental factors in analyzing the quality of land surveying work. In that sense, it is important to keep in mind that, although the statistical data provided by most commercial software used for GNSS data processing describe the credibility of the work, they do not give consistent information about the reliability of the processed coordinates. Based on that assumption, this paper proposes a table to classify the reliability of baselines obtained through GNSS data processing. As data input, GNSS measurements were performed during the years 2006 and 2008, considering different seasons of the year, geometric configurations of RBMC stations, and baseline lengths. As demonstrated in this paper, parameters such as baseline length, ambiguity solution, PDOP value, and the precision of the horizontal and vertical coordinates can be used as reliability parameters. The proposed classification meets the requirements of Brazilian Law No. 10.267/2001 of the National Institute of Colonization and Agrarian Reform (INCRA).

  4. A proposal to create an extension to the European baseline series.

    Science.gov (United States)

    Wilkinson, Mark; Gallo, Rosella; Goossens, An; Johansen, Jeanne D; Rustemeyer, Thomas; Sánchez-Pérez, Javier; Schuttelaar, Marie L; Uter, Wolfgang

    2018-02-01

    The current European baseline series consists of 30 allergens, and was last updated in 2015. To use data from the European Surveillance System on Contact Allergies (ESSCA) to propose an extension to the European baseline series in response to changes in environmental exposures. Data from departmental and national extensions to the baseline series, together with some temporary additions from departments contributing to the ESSCA, were collated during 2013-2014. In total, 31689 patients were patch tested in 46 European departments. Many departments and national groups already consider the current European baseline series to be a suboptimal screen, and use their own extensions to it. The haptens tested are heterogeneous, although there are some consistent themes. Potential haptens to include in an extension to the European baseline series comprise sodium metabisulfite, formaldehyde-releasing preservatives, additional markers of fragrance allergy, propolis, Compositae mix, and 2-hydroxyethyl methacrylate. In combination with other published work from the ESSCA, changes to the current European baseline series are proposed for discussion. As well as addition of the allergens listed above, it is suggested that primin and clioquinol should be deleted from the series, owing to reduced environmental exposure. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Baseline radiological monitoring at proposed uranium prospecting site at Rohil Sikar, Rajasthan

    International Nuclear Information System (INIS)

    Kumar, Rajesh; Jha, V.N.; Sahoo, N.K.; Jha, S.K.; Tripathi, R.M.

    2018-01-01

Once economically viable grades of uranium deposits are proposed for mining and processing by the industry, radiological baseline studies are required for future comparison during operational phases. The information collected during such studies serves as a connecting feature between regulatory compliance and technical information. The present paper summarizes the results of baseline monitoring of atmospheric 222Rn and gamma levels at the prospective mining, milling and waste disposal sites of Rohil, Rajasthan.

  6. Baseline environmental survey of proposed uranium mining projects of Domiasiat, Meghalaya

    International Nuclear Information System (INIS)

    Khathing, D.T.; Myrboh, B.; Nongkynrih, P.; War, S.A.; Marbaniang, D.G.; Iongwai, P.S.

    2005-01-01

The West Khasi Hills District of Meghalaya is identified as having large and rich deposits of uranium. However, actual extraction on a commercial scale, which may lead to an increase in the socio-economic development of the state in particular and the country in general, is yet to be undertaken. This is due to the lack of any baseline environmental survey, which has given rise to speculative information and caused a fear psychosis among the local populace about the negative effects of uranium mining. A preoperational survey and environmental monitoring of the proposed mining sites and their adjacent areas would establish the baseline status of the natural radioactivity and some chemical constituents in different environmental matrices, viz. air, water, soil, biota and aquatic ecosystems. The North Eastern Hill University, Shillong, Meghalaya has undertaken the project, funded by DST and BRNS, Department of Atomic Energy, Govt. of India, which aims to provide baseline environmental data on ambient air, water and soil quality in and around the proposed uranium mining site of Domiasiat, West Khasi Hills, in the state of Meghalaya. Trace elements (Mg, Zn, Ca, K, Na, Se, As, Fe, Cu, Co, Cr, Ni, Pb, Cd, Mn, etc.) and the activity status of the samples are determined. (author)

  7. A long baseline global stereo matching based upon short baseline estimation

    Science.gov (United States)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

In global stereo vision, balancing matching efficiency against computing accuracy seems impossible because the two contradict each other, and in the case of a long baseline this contradiction becomes more prominent. To solve this difficult problem, this paper proposes a novel idea to improve both the efficiency and the accuracy of global stereo matching for a long baseline. Reference images located between the long-baseline image pair are first chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is revealed by considering the quantization error, so that the disparity search range under the long baseline can be reduced under the guidance of the short baseline to gain matching efficiency. This idea is then integrated into graph cuts (GCs) to form a multi-step GC algorithm based on the short-baseline estimation, by which the disparity map under the long baseline can be calculated iteratively on the basis of the previous matching. Furthermore, image information from pixels that are non-occluded under the short baseline but occluded under the long baseline can be employed to improve the matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined in experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and is thus suitable for long-baseline stereo matching.
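The guidance step rests on a first-order relation: disparity scales with the baseline ratio, so a short-baseline match bounds the long-baseline search to a narrow band. A toy sketch (names and numbers ours, not the paper's formulation):

```python
def long_baseline_range(d_short, b_short, b_long, quant_err=1.0):
    """Bound the long-baseline disparity search using a short-baseline match.

    To first order d_long ~ d_short * (b_long / b_short); the +/-quant_err
    quantization error of the short-baseline disparity scales by the same
    ratio, so it widens the band symmetrically.
    """
    ratio = b_long / b_short
    center = d_short * ratio
    half = quant_err * ratio
    return center - half, center + half

lo, hi = long_baseline_range(d_short=8, b_short=0.1, b_long=0.4)
print(lo, hi)   # -> 28.0 36.0 : search only 28..36 instead of the full range
```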

  8. Spectrum from the Proposed BNL Very Long Baseline Neutrino Facility

    CERN Document Server

    Kahn, S A

    2005-01-01

    This paper calculates the neutrino flux that would be seen at the far detector location from the proposed BNL Very Long Baseline Neutrino Facility. The far detector is assumed to be located at an underground facility in South Dakota 2540 km from BNL. The neutrino beam facility uses a 1 MW upgraded AGS to provide an intense proton beam on the target and a magnetic horn to focus the secondary pion beam. The paper will examine the sensitivity of the neutrino flux at the far detector to the positioning of the horn and target so as to establish alignment tolerances for the neutrino system.

  9. Baseline Motivation Type as a Predictor of Dropout in a Healthy Eating Text Messaging Program.

    Science.gov (United States)

    Coa, Kisha; Patrick, Heather

    2016-09-29

    Growing evidence suggests that text messaging programs are effective in facilitating health behavior change. However, high dropout rates limit the potential effectiveness of these programs. This paper describes patterns of early dropout in the HealthyYou text (HYTxt) program, with a focus on the impact of baseline motivation quality on dropout, as characterized by Self-Determination Theory (SDT). This analysis included 193 users of HYTxt, a diet and physical activity text messaging intervention developed by the US National Cancer Institute. Descriptive statistics were computed, and logistic regression models were run to examine the association between baseline motivation type and early program dropout. Overall, 43.0% (83/193) of users dropped out of the program; of these, 65.1% (54/83; 28.0% of all users) did so within the first 2 weeks. Users with higher autonomous motivation had significantly lower odds of dropping out within the first 2 weeks. A one unit increase in autonomous motivation was associated with lower odds (odds ratio 0.44, 95% CI 0.24-0.81) of early dropout, which persisted after adjusting for level of controlled motivation. Applying SDT-based strategies to enhance autonomous motivation might reduce early dropout rates, which can improve program exposure and effectiveness.
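The reported effect is an odds ratio: each one-unit rise in autonomous motivation multiplies the odds of early dropout by 0.44, i.e. the logistic coefficient is beta = ln(0.44). The sketch below shows how the odds ratio acts on a set of odds; the baseline odds value is hypothetical, chosen only for illustration:

```python
import math

# OR = 0.44 from the abstract; beta is the corresponding logistic coefficient.
beta = math.log(0.44)
odds_base = 0.39                        # hypothetical odds of early dropout
odds_plus1 = odds_base * math.exp(beta) # odds after a one-unit motivation rise
prob_plus1 = odds_plus1 / (1.0 + odds_plus1)
print(round(odds_plus1, 4), round(prob_plus1, 3))   # -> 0.1716 0.146
```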

  10. Sandia National Laboratories, California proposed CREATE facility environmental baseline survey.

    Energy Technology Data Exchange (ETDEWEB)

    Catechis, Christopher Spyros

    2013-10-01

Sandia National Laboratories, Environmental Programs completed an environmental baseline survey (EBS) of 12.6 acres located at Sandia National Laboratories/California (SNL/CA) in support of the proposed Collaboration in Research and Engineering for Advanced Technology and Education (CREATE) Facility. The survey area comprises several parcels of land within SNL/CA, County of Alameda, California, located within T 3S, R 2E, Section 13. The purpose of this EBS is to document the nature, magnitude, and extent of any environmental contamination of the property; identify potential environmental contamination liabilities associated with the property; develop sufficient information to assess the health and safety risks; and ensure adequate protection for human health and the environment related to the property.

  11. 77 FR 22247 - Veterinary Feed Directive; Draft Text for Proposed Regulation

    Science.gov (United States)

    2012-04-13

    .... FDA-2010-N-0155] Veterinary Feed Directive; Draft Text for Proposed Regulation AGENCY: Food and Drug... the efficiency of FDA's Veterinary Feed Directive (VFD) program. The Agency is making this draft text..., rm. 1061, Rockville, MD 20852. FOR FURTHER INFORMATION CONTACT: Sharon Benz, Center for Veterinary...

  12. Constraining proposed combinations of ice history and Earth rheology using VLBI determined baseline length rates in North America

    Science.gov (United States)

    Mitrovica, J. X.; Davis, J. L.; Shapiro, I. I.

    1993-01-01

We predict the present-day rates of change of the lengths of 19 North American baselines due to the glacial isostatic adjustment process. Contrary to previously published research, we find that the three-dimensional motion of each of the sites defining a baseline, rather than only the radial motions of these sites, needs to be considered to obtain an accurate estimate of the rate of change of the baseline length. Predictions are generated using a suite of Earth models and late Pleistocene ice histories; these include specific combinations of the two that have been proposed in the literature as satisfying a variety of rebound-related geophysical observations from the North American region. A number of these published models are shown to predict rates which differ significantly from the VLBI observations.

  13. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    Directory of Open Access Journals (Sweden)

    M. Pinheiro

    2017-09-01

Full Text Available The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world topography with outstanding precision. In fact, performance analyses carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separation between the two satellites of up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR-derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the door to the generation of elevation models with improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and the very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and in particular for the phase unwrapping, have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of the large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.
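The trade-off driving the paper can be captured in one quantity: the height of ambiguity (elevation change per 2π fringe), which for a bistatic single-pass interferometer is h_amb = λ·r·sin(θ)/B⊥ and shrinks as the across-track baseline B⊥ grows, improving vertical sensitivity but making unwrapping harder. The values below are illustrative TanDEM-X-like numbers assumed by us, not taken from the paper:

```python
import math

def height_of_ambiguity(wavelength, slant_range, incidence_deg, b_perp):
    """h_amb = lambda * r * sin(theta) / B_perp (bistatic single-pass case)."""
    return wavelength * slant_range * math.sin(math.radians(incidence_deg)) / b_perp

# X-band wavelength 3.1 cm, ~600 km slant range, 40 deg incidence (assumed)
h_nominal = height_of_ambiguity(0.031, 600e3, 40.0, 150.0)    # standard baseline
h_science = height_of_ambiguity(0.031, 600e3, 40.0, 3000.0)   # large baseline
print(round(h_nominal, 1), round(h_science, 1))   # fringes become ~20x denser
```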

  14. THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2007-08-06

The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory (BNL) and Fermi National Accelerator Laboratory (FNAL) to investigate the potential for future U.S.-based long baseline neutrino oscillation experiments using MW-class conventional neutrino beams that can be produced at FNAL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing FNAL NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton-class Water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT Liquid Argon Time-Projection Chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn-focused wide-band neutrino beam options from FNAL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ₁₃ down to 2°.

  15. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    Science.gov (United States)

    Chen, Yunliang; Dai, Liankui

    2018-05-01

Raman spectra usually suffer from baseline drift caused by fluorescence or other effects. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method first adaptively determines the structuring element and then gradually removes the spectral peaks during iteration to obtain an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible enough to handle different kinds of baselines in various practical situations. A comparison of the proposed method with several state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method can hopefully be applied to baseline correction of other analytical instrument signals, such as IR spectra and chromatograms.
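The core of the morphological approach can be sketched in a few lines. This single-pass version with a fixed structuring-element width is a simplification (the paper determines the width adaptively and iterates): a flat grey opening erodes peaks narrower than the element while following the slowly drifting baseline beneath them.

```python
import numpy as np
from scipy.ndimage import grey_opening

def estimate_baseline(spectrum, width=51):
    """One-pass morphological baseline estimate (simplified sketch)."""
    return grey_opening(np.asarray(spectrum, dtype=float), size=width)

# synthetic spectrum: slow linear drift plus one narrow Raman-like peak
x = np.linspace(0.0, 1.0, 500)
baseline = 2.0 + 0.5 * x
peak = 5.0 * np.exp(-0.5 * ((x - 0.5) / 0.01) ** 2)
est = estimate_baseline(baseline + peak)
corrected = (baseline + peak) - est      # peak survives, drift is removed
```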

  16. A Kalman Filter-Based Short Baseline RTK Algorithm for Single-Frequency Combination of GPS and BDS

    Directory of Open Access Journals (Sweden)

    Sihao Zhao

    2014-08-01

Full Text Available The emerging Global Navigation Satellite Systems (GNSS), including the BeiDou Navigation Satellite System (BDS), offer more visible satellites to positioning users. To employ these new satellites in a real-time kinematic (RTK) algorithm to enhance positioning precision and availability, a data processing model for the dual constellation of GPS and BDS is proposed and analyzed. A Kalman filter-based algorithm is developed to estimate the float ambiguities for short baseline scenarios. The entire workflow of the high-precision algorithm based on the proposed model is investigated in detail. The model is validated with real GPS and BDS data recorded in one zero-baseline and two short-baseline experiments. Results show that the proposed algorithm can generate fixed baseline output at the same precision level as either a single-GPS or single-BDS RTK algorithm. The significantly improved fix rate and time to first fix of the proposed method demonstrate better availability and effectiveness in processing multiple GNSSs.
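A minimal view of the estimation step: a Kalman filter converging on a single float ambiguity from noisy phase-derived measurements. This toy scalar filter only illustrates the predict/update cycle; the paper's filter jointly estimates baseline coordinates and double-difference ambiguities across both constellations, and the noise values below are assumed:

```python
import numpy as np

def kalman_scalar(zs, r=0.01, q=1e-6, x0=0.0, p0=100.0):
    """Scalar Kalman filter: random-walk state observed with variance r."""
    x, p = x0, p0
    for z in zs:
        p += q                  # predict (slowly varying float ambiguity)
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update with the new measurement
        p *= (1.0 - k)
    return x

rng = np.random.default_rng(0)
true_ambiguity = 17.0           # cycles; the float estimate should approach it
zs = true_ambiguity + 0.05 * rng.standard_normal(200)
est = kalman_scalar(zs)
print(round(est))               # rounds (in the scalar case) to the integer 17
```

In a real RTK fix, the float vector would instead be passed to an integer search such as LAMBDA rather than rounded component-wise.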

  17. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener

    Directory of Open Access Journals (Sweden)

    Yun-Kyu An

    2016-09-01

Full Text Available This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique for the heat-affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only current data measured with a stripe-PZT sensor. The use of the stripe-PZT sensor significantly improves applicability to real structures and minimizes man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.

  18. Program Baseline Change Control Procedure

    International Nuclear Information System (INIS)

    1993-02-01

This procedure establishes the responsibilities and process for approving initial issues of, and changes to, the technical, cost, and schedule baselines and selected management documents developed by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System. This procedure implements the OCRWM Baseline Management Plan and DOE Order 4700.1, Chg 1. It streamlines the change control process to enhance integration, accountability, and traceability of Level 0 and Level 1 decisions through standardized Baseline Change Proposal (BCP) forms to be used by the Level 0, 1, 2, and 3 Baseline Change Control Boards (BCCBs) and to be tracked in the OCRWM-wide Configuration Information System (CIS) database. This procedure applies to all technical, cost, and schedule baselines controlled by the Energy System Acquisition Advisory Board (ESAAB) BCCB (Level 0) and the OCRWM Program Baseline Change Control Board (PBCCB) (Level 1). All baseline BCPs initiated by Level 2 or lower BCCBs that require approval from the ESAAB or PBCCB shall be processed in accordance with this procedure. This procedure also applies to all Program-level management documents controlled by the OCRWM PBCCB.

  19. Ecological baseline study of the Yakima Firing Center proposed land acquisition: A status report

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, L.E.; Beedlow, P.A.; Eberhardt, L.E.; Dauble, D.D.; Fitzner, R.E.

    1989-01-01

    This report provides baseline environmental information for the property identified for possible expansion of the Yakima Firing Center. Results from this work provide general descriptions of the animals and major plant communities present. A vegetation map derived from a combination of on-site surveillance and remotely sensed imagery is provided as part of this report. Twenty-seven wildlife species of special interest (protected, sensitive, furbearer, game animal, etc.), and waterfowl, were observed on the proposed expansion area. Bird censuses revealed 13 raptorial species (including four of special interest: bald eagle, golden eagle, osprey, and prairie falcon); five upland game bird species (sage grouse, California quail, chukar, gray partridge, and ring-necked pheasant); common loons (a species proposed for state listing as threatened); and five other species of special interest (sage thrasher, loggerhead shrike, mourning dove, sage sparrow, and long-billed curlew). Estimates of waterfowl abundance are included for the Priest Rapids Pool of the Columbia River. Six small mammal species were captured during this study; one, the sagebrush vole, is a species of special interest. Two large animal species, mule deer and elk, were noted on the site. Five species of furbearing animals were observed (coyote, beaver, raccoon, mink, and striped skunk). Four species of reptiles and one amphibian were noted. Fisheries surveys were conducted to document the presence of gamefish, and sensitive-classified fish and aquatic invertebrates. Rainbow trout were the only fish collected within the boundaries of the proposed northern expansion area. 22 refs., 10 figs., 4 tabs.

  20. Synthesis and Comparison of Baseline Avian and Bat Use, Raptor Nesting and Mortality Information from Proposed and Existing Wind Developments: Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Wallace P.

    2002-12-01

    Primarily due to concerns generated from observed raptor mortality at the Altamont Pass (CA) wind plant, one of the first commercial electricity generating wind plants in the U.S., new proposed wind projects both within and outside of California have received a great deal of scrutiny and environmental review. A large amount of baseline and operational monitoring data have been collected at proposed and existing U.S. wind plants. The primary use of the avian baseline data collected at wind developments has been to estimate the overall project impacts (e.g., very low, low, moderate, and high relative mortality) on birds, especially raptors and sensitive species (e.g., state and federally listed species). In a few cases, these data have also been used for guiding placement of turbines within a project boundary. This new information has strengthened our ability to accurately predict and mitigate impacts from new projects. This report should assist various stakeholders in the interpretation and use of this large information source in evaluating new projects. This report also suggests that the level of baseline data (e.g., avian use data) required to adequately assess expected impacts of some projects may be reduced. This report provides an evaluation of the ability to predict direct impacts on avian resources (primarily raptors and waterfowl/waterbirds) using less than an entire year of baseline avian use data (one season, two seasons, etc.). This evaluation is important because pre-construction wildlife surveys can be one of the most time-consuming aspects of permitting wind power projects. For baseline data, this study focuses primarily on standardized avian use data usually collected using point count survey methodology and raptor nest survey data. In addition to avian use and raptor nest survey data, other baseline data is usually collected at a proposed project to further quantify potential impacts. These surveys often include vegetation mapping and state or

  1. Data-Driven Baseline Estimation of Residential Buildings for Demand Response

    Directory of Open Access Journals (Sweden)

    Saehong Park

    2015-09-01

Full Text Available The advent of advanced metering infrastructure (AMI) generates a large volume of data related to energy service. This paper exploits a data mining approach for customer baseline load (CBL) estimation in demand response (DR) management. The CBL plays a significant role in the measurement and verification process, which quantifies the amount of demand reduction and authenticates the performance. The proposed data-driven baseline modeling is based on unsupervised learning; specifically, we leverage both the self-organizing map (SOM) and K-means clustering for accurate estimation. This two-level approach efficiently reduces the large data set into representative weight vectors in the SOM, and these weight vectors are then clustered by K-means to find the load pattern most similar to the potential load pattern of the DR event day. To verify the proposed method, we conduct nationwide-scale experiments in which the residential consumption of three major cities is monitored by smart meters. Our evaluation compares the proposed solution with various day-matching techniques, showing that our approach outperforms the existing methods by up to a 68.5% lower error rate.
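The compress-then-cluster idea can be sketched with scikit-learn. Since a SOM is not in scikit-learn, a first K-means stage stands in for it here (a loudly labeled substitution), so this shows only the two-level structure, not the paper's SOM training; the data are synthetic daily load profiles:

```python
import numpy as np
from sklearn.cluster import KMeans

# synthetic smart-meter data: 100 daily 24-hour profiles from two patterns
rng = np.random.default_rng(1)
days = np.vstack([
    1.0 + 0.1 * rng.standard_normal((50, 24)),   # low-consumption pattern
    2.0 + 0.1 * rng.standard_normal((50, 24)),   # high-consumption pattern
])

# stage 1: compress the data set into representative weight vectors
# (K-means used here as a stand-in for the paper's SOM stage)
stage1 = KMeans(n_clusters=10, n_init=10, random_state=0).fit(days)
weights = stage1.cluster_centers_

# stage 2: cluster the weight vectors to recover the load patterns
stage2 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(weights)
labels = stage2.predict(weights)

# a CBL candidate: mean profile of one recovered pattern
cbl = weights[labels == labels[0]].mean(axis=0)
print(cbl.shape)   # -> (24,)
```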

  2. A Mean-Shift-Based Feature Descriptor for Wide Baseline Stereo Matching

    Directory of Open Access Journals (Sweden)

    Yiwen Dou

    2015-01-01

Full Text Available We propose a novel Mean-Shift-based feature descriptor for wide baseline stereo matching. Initially, the scale-invariant feature transform (SIFT) is used to extract relatively stable feature points. Each matching SIFT feature point then needs a reasonable neighborhood range from which to choose its feature point set. Subsequently, with a view to selecting repeatable and highly robust feature points, Mean-Shift controls the corresponding feature scale. Finally, our approach is applied to depth image acquisition in wide baseline settings, and the Graph Cut algorithm optimizes the disparity information. Compared with existing methods such as SIFT, speeded-up robust features (SURF), and normalized cross-correlation (NCC), the presented approach has the advantages of higher robustness and accuracy. Experimental results on low-resolution images with weak feature description in wide baseline settings confirm the validity of our approach.

  3. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint

    Directory of Open Access Journals (Sweden)

    Ang Gong

    2015-12-01

Full Text Available For Global Navigation Satellite System (GNSS) single-frequency, single-epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to rigorously reconstruct the objective function. Then, the search strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space, ensuring that the correct ambiguity candidates lie within it and allowing the search to be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. Some of the vector candidates are further eliminated by a derived approximate inequality, which accelerates the search. Experimental results show that, compared to the traditional method with only a baseline length constraint, the new method can utilize a priori three-dimensional baseline knowledge to fix ambiguities reliably and achieve a high success rate. Experimental tests also verify that it is not very sensitive to baseline vector error and performs robustly when the angular error is not great.
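The pruning effect of a baseline-vector prior can be sketched in isolation. Each candidate integer-ambiguity set implies a baseline vector; candidates whose implied length strays from the known antenna separation are rejected before the expensive remainder of the search. This toy filter (names, tolerance, and vectors are ours) shows only the length constraint, not the heading/pitch constraints or the LAMBDA search itself:

```python
import numpy as np

def prune(candidates, known_length, tol=0.05):
    """Keep only candidate baseline vectors consistent with the known length."""
    return [b for b in candidates
            if abs(np.linalg.norm(b) - known_length) <= tol]

cands = [np.array([1.0, 0.0, 0.0]),    # length 1.000 -> kept
         np.array([0.8, 0.3, 0.0]),    # length 0.854 -> rejected
         np.array([0.6, 0.8, 0.02])]   # length 1.000 -> kept
print(len(prune(cands, known_length=1.0)))   # -> 2
```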

  4. STATUS OF THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2006-09-21

The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory and Fermi National Accelerator Laboratory to investigate the potential for future U.S.-based long baseline neutrino oscillation experiments beyond the currently planned program. The Study focused on MW-class conventional neutrino beams that can be produced at Fermilab or BNL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing Fermilab NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton-class Water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT Liquid Argon Time-Projection Chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn-focused wide-band neutrino beam options from Fermilab or BNL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ₁₃ down to 2.2°.

  5. DEEP LEARNING MODEL FOR BILINGUAL SENTIMENT CLASSIFICATION OF SHORT TEXTS

    Directory of Open Access Journals (Sweden)

    Y. B. Abdullin

    2017-01-01

    Full Text Available Sentiment analysis of short texts such as Twitter messages and comments in news portals is challenging due to the lack of contextual information. We propose a deep neural network model that uses bilingual word embeddings to effectively solve the sentiment classification problem for a given pair of languages. We apply our approach to two corpora of two different language pairs: English-Russian and Russian-Kazakh. We show how to train a classifier in one language and predict in another. Our approach achieves 73% accuracy for English and 74% accuracy for Russian. For Kazakh sentiment analysis, we propose a baseline method that achieves 60% accuracy, and a method for learning bilingual embeddings from a large unlabeled corpus using bilingual word pairs.

  6. Bandwidth Optimization of Normal Equation Matrix in Bundle Block Adjustment in Multi-baseline Rotational Photography

    Directory of Open Access Journals (Sweden)

    WANG Xiang

    2016-02-01

    Full Text Available A new bandwidth optimization method of normal equation matrix in bundle block adjustment in multi-baseline rotational close range photography by image index re-sorting is proposed. The equivalent exposure station of each image is calculated by its object space coverage and the relationship with other adjacent images. Then, according to the coordinate relations between equivalent exposure stations, new logical indices of all images are computed, based on which, the optimized bandwidth value can be obtained. Experimental results show that the bandwidth determined by our proposed method is significantly better than its original value, thus the operational efficiency, as well as the memory consumption of multi-baseline rotational close range photography in real-data applications, is optimized to a certain extent.

  7. A proposal to create an extension to the European baseline series

    DEFF Research Database (Denmark)

    Wilkinson, Mark; Gallo, Rosella; Goossens, An

    2018-01-01

    exposures. METHODS: Data from departmental and national extensions to the baseline series, together with some temporary additions from departments contributing to the ESSCA, were collated during 2013-2014. RESULTS: In total, 31689 patients were patch tested in 46 European departments. Many departments...

  8. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

    Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors and requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan; to analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is applied to two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
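    A minimal sketch of the baseline idea: because a baseline is a distance between two feature points within one scan, its length can be compared across epochs without registering the scans. The point names, coordinates, and threshold below are invented for illustration.

```python
import itertools
import math

def baseline_set(points):
    """All baselines (pairwise distances) between named feature points."""
    return {
        (a, b): math.dist(points[a], points[b])
        for a, b in itertools.combinations(sorted(points), 2)
    }

def changed_baselines(scan1, scan2, threshold):
    """Baselines whose length changed by more than `threshold` between scans.
    No registration is needed: lengths are invariant to the scan's pose."""
    b1, b2 = baseline_set(scan1), baseline_set(scan2)
    return {k: b2[k] - b1[k] for k in b1.keys() & b2.keys()
            if abs(b2[k] - b1[k]) > threshold}

# Hypothetical brick centres: 'c' shifts by 5 cm between epochs.
before = {"a": (0.0, 0.0, 0.0), "b": (1.0, 0.0, 0.0), "c": (0.0, 1.0, 0.0)}
after  = {"a": (0.0, 0.0, 0.0), "b": (1.0, 0.0, 0.0), "c": (0.0, 1.05, 0.0)}
print(changed_baselines(before, after, threshold=0.01))
```

    Every baseline touching the moved point 'c' is flagged, which is how the changed region is localized without a common coordinate frame.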

  9. An Energy Efficiency Evaluation Method Based on Energy Baseline for Chemical Industry

    Directory of Open Access Journals (Sweden)

    Dong-mei Yao

    2016-01-01

    Full Text Available According to the requirements and structure of the ISO 50001 energy management system, this study proposes an energy efficiency evaluation method based on an energy baseline for the chemical industry. With this method, the effect of implementing an energy plan in chemical production processes can be evaluated quantitatively, and evidence for system fault diagnosis can be provided. The method establishes energy baseline models that can meet the demands of different kinds of production processes and gives a general procedure for solving each kind of model from production data. The implementation effect of the energy plan can then be evaluated, and whether the system is running normally can be determined, through the baseline model. Finally, the method is applied to the cracked-gas compressor unit of an ethylene plant in a petrochemical enterprise, demonstrating that it is correct and practical.
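    As one simple instance of the idea, an energy baseline can be a regression model fitted to historical production data; the effect of an energy plan is then the gap between baseline-predicted and actual consumption. The load/energy figures below are invented, and a straight-line model is only one of the model kinds such a method might use.

```python
import numpy as np

# Hypothetical monthly records for a compressor unit before the energy
# plan: production load (t) and energy use (MWh). The baseline is a
# least-squares line fitted to this pre-improvement period.
load   = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
energy = np.array([410.0, 455.0, 500.0, 545.0, 590.0])
slope, intercept = np.polyfit(load, energy, 1)

def expected_energy(current_load):
    """Energy consumption the baseline predicts for a given load."""
    return slope * current_load + intercept

# After an efficiency measure: 100 t produced using 470 MWh.
saving = expected_energy(100.0) - 470.0
print(round(saving, 1))
```

    A persistent negative "saving" would instead be evidence of a fault, which is the diagnostic use mentioned in the abstract.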

  10. A TEACHING PROPOSAL OF PRODUCTION OF DISSERTATIVE-ARGUMENTATIVE TEXTS BASED ON THE THEORY OF SEMANTIC BLOCKS

    Directory of Open Access Journals (Sweden)

    Cláudio Primo Delanoy

    2015-12-01

    Full Text Available This paper aims to explain a teaching proposal for the production of dissertative-argumentative texts, based on concepts and principles of the Theory of Argumentation within Language (ADL) of Ducrot (1990, 2009) and, above all, on tools made available by the Theory of Semantic Blocks (TBS) of Carel (1995, 2005) and Carel and Ducrot (2005). To do so, the text production prompt of Enem 2012 is first analyzed so as to find the basic semantic units of its motivational texts; these units, when associated with the argumentative aspects of the semantic blocks that originate them, may guide effective argumentative routes to be realized in a dissertative-argumentative text from semantic relations within the same block. It is also verified to what extent transgressive argumentative chainings are presented in argumentative essays as more convincing than normative ones. As a result, this work may provide theoretical and methodological support for teachers who work directly with the teaching of reading and writing, at basic or higher education levels.

  11. Low-Power Bitstream-Residual Decoder for H.264/AVC Baseline Profile Decoding

    Directory of Open Access Journals (Sweden)

    Xu Ke

    2009-01-01

    Full Text Available We present the design and VLSI implementation of a novel low-power bitstream-residual decoder for the H.264/AVC baseline profile. It comprises a syntax parser, a parameter decoder, and an Inverse Quantization Inverse Transform (IQIT) decoder. The syntax parser detects and decodes each incoming codeword in the bitstream under the control of a hierarchical Finite State Machine (FSM); the IQIT decoder performs inverse transform and quantization with pipelining and parallelism. Various power reduction techniques, such as data-driven design based on statistical results, nonuniform partitioning, precomputation, guarded evaluation, hierarchical FSM decomposition, the TAG method, zero-block skipping, and clock gating, are adopted and integrated throughout the bitstream-residual decoder. With this innovative architecture, the proposed design is able to decode QCIF video sequences at 30 fps at a clock rate as low as 1.5 MHz. A prototype H.264/AVC baseline decoding chip utilizing the proposed decoder is fabricated in UMC 0.18 µm 1P6M CMOS technology. The proposed design is measured under supplies from 1 V to 1.8 V in 0.1 V steps. It dissipates 76 µW at 1 V and 253 µW at 1.8 V.

  12. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do not perform well in this setting. In this work we compare the performance of different noise reduction methods under different noise conditions in terms of speaker verification when the text is known and the system is trained on clean data (mis-matched conditions). We furthermore propose a new approach based on dictionary-based noise reduction and compare it to the baseline methods.

  13. On the feasibility of routine baseline improvement in processing of geomagnetic observatory data

    Science.gov (United States)

    Soloviev, Anatoly; Lesur, Vincent; Kudin, Dmitry

    2018-02-01

    We propose a new approach to the calculation of regular baselines at magnetic observatories. The proposed approach is based on simultaneous analysis of the irregular absolute observations and the continuous ΔF time series, widely used for estimating data quality. The systematic ΔF analysis makes it possible to take into account all available information about the operation of observatory instruments (i.e., continuous records of the field variations and of its modulus) in the intervals between the times of absolute observations, whereas the traditional baseline calculation considers only spot values. To establish a connection with the observed spot baseline values, we introduce a function for approximate evaluation of the intermediate baseline values. An important feature of the algorithm is its quantitative estimation of the resulting data precision, and thus its determination of problematic fragments in the raw data. We analyze the robustness of the algorithm using synthetic data sets. We also compare baselines and definitive data derived by the proposed algorithm with those derived by the traditional approach, using Saint Petersburg observatory data recorded in 2015 and accepted by INTERMAGNET. It is shown that the proposed method can essentially improve the resulting data quality when the baseline data are not good enough. The obtained results prove that the baseline variability in time can be quite rapid.

  14. The SSVEP-Based BCI Text Input System Using Entropy Encoding Algorithm

    Directory of Open Access Journals (Sweden)

    Yeou-Jiunn Chen

    2015-01-01

    Full Text Available Amyotrophic lateral sclerosis (ALS), also known as motor neuron disease (MND), is a neurodegenerative disease with various causes. It is characterized by muscle spasticity, rapidly progressive weakness due to muscle atrophy, and difficulty in speaking, swallowing, and breathing. Beyond their physical impairments, severely disabled patients share a common problem: communication. Steady-state visually evoked potential based brain-computer interfaces (BCIs), which apply visual stimuli, are well suited to serve as a communication interface for patients with neuromuscular impairments. In this study, an entropy encoding algorithm is proposed to encode the letters of the multilevel selection interface of a BCI text input system. According to the appearance frequency of each letter, the entropy encoding algorithm constructs a variable-length tree for the letter arrangement of the multilevel selection interface. Gaussian mixture models are then applied to recognize the electrical activity of the brain. According to the recognition results, the multilevel selection interface guides the subject in spelling and typing words. The experimental results showed that the proposed approach outperforms the baseline system, which does not consider the appearance frequency of each letter. Hence, the proposed approach eases text input for patients with neuromuscular impairments.
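    A variable-length tree built from letter frequencies is essentially Huffman coding; the sketch below, with made-up frequencies, shows how more frequent letters receive shorter codes and hence fewer selection steps in such an interface. The exact tree construction used by the paper may differ.

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Variable-length prefix codes: frequent letters get short codes.
    Heap entries are (frequency, tie-breaker, {letter: code-so-far})."""
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical appearance frequencies.
freqs = Counter({"e": 12, "t": 9, "a": 8, "q": 1, "z": 1})
codes = huffman_codes(freqs)
print(codes["e"], codes["z"])
```

    Each bit of a code corresponds to one binary selection step, so the expected number of steps per letter is minimized for the given frequencies.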

  15. Wide-Baseline Stereo-Based Obstacle Mapping for Unmanned Surface Vehicles

    Science.gov (United States)

    Mou, Xiaozheng; Wang, Han

    2018-01-01

    This paper proposes a wide-baseline stereo-based static obstacle mapping approach for unmanned surface vehicles (USVs). The proposed approach eliminates the complicated calibration work and the bulky rig of our previous binocular stereo system, and raises the ranging ability from 500 m to 1000 m with an even larger baseline obtained from the motion of the USV. By integrating a monocular camera with GPS and compass information, the proposed system reconstructs the world locations of the detected static obstacles while the USV is traveling, and an obstacle map is then built. To achieve more accurate and robust performance, multiple pairs of frames are leveraged to synthesize the final reconstruction results in a weighting model. Experimental results based on our own dataset demonstrate the high efficiency of our system. To the best of our knowledge, we are the first to address the task of wide-baseline stereo-based obstacle mapping in a maritime environment. PMID:29617293

  16. Way to increase the user access at the LCLS baseline

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg; Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-10-15

    Although the LCLS photon beam is meant for a single user, the baseline undulator is long enough to serve two users simultaneously. To this end, we propose a setup composed of two simple elements: an X-ray mirror pair for X-ray beam deflection, and a short (4 m-long) magnetic chicane, which creates an offset for mirror pair installation in the middle of the baseline undulator. The insertable mirror pair can be used for spatial separation of the X-ray beams generated in the first and in the second half of the baseline undulator. The method of deactivating one half and activating another half of the undulator is based on the rapid switching of the FEL amplification process. As proposed elsewhere, using a kicker installed upstream of the LCLS baseline undulator and an already existing corrector in the first half of the undulator, it is possible to rapidly switch the X-ray beam from one user to another, thus providing two active beamlines at any time. We present simulation results dealing with the LCLS baseline, and show that it is possible to generate two saturated SASE X-ray beams in the whole 0.8-8 keV photon energy range in the same baseline undulator. These can be exploited to serve two users. Implementation of the proposed technique does not perturb the baseline mode of operation of the LCLS undulator. Moreover, the magnetic chicane setup is very flexible, and can be used as a self-seeding setup too. We present simulation results for the LCLS baseline undulator with SHAB (second harmonic afterburner) and show that one can produce monochromatic radiation at the 2nd harmonic as well as at the 1st. We describe an efficient way for obtaining multi-user operation at the LCLS hard X-ray FEL. To this end, a photon beam distribution system based on the use of crystals in the Bragg reflection geometry is proposed. 
The reflectivity of crystal deflectors can be switched fast enough by flipping the crystals with piezoelectric devices, similar to those used for X-ray phase retarders.

  17. Dynamic baseline detection method for power data network service

    Science.gov (United States)

    Chen, Wei

    2017-08-01

    This paper proposes a dynamic baseline traffic detection method for the power data network, based on historical traffic data. The method uses Cisco's NetFlow acquisition tool to collect the original historical traffic data from network elements at fixed intervals, and works with three dimensions of information: communication port, time, and traffic volume (number of bytes or number of packets). By filtering, removing deviating values, calculating the dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether the current network traffic is abnormal.
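    A toy version of that pipeline — per-cell history, deviation filtering, baseline band, comparison — might look like the following. The MAD-based filter and the two-standard-deviation band are assumptions for illustration, not the paper's exact rules.

```python
import statistics

def dynamic_baseline(history, k=2.0):
    """Baseline band for one (port, time-of-day) cell of historical traffic.
    Deviating samples are discarded first, then a mean +/- k*std band is built."""
    med = statistics.median(history)
    mad = statistics.median(abs(x - med) for x in history) or 1.0
    kept = [x for x in history if abs(x - med) <= 3 * mad]   # deviation filter
    base = statistics.mean(kept)
    spread = statistics.pstdev(kept) or 1.0
    return base - k * spread, base + k * spread

def is_abnormal(history, actual, k=2.0):
    lo, hi = dynamic_baseline(history, k)
    return not (lo <= actual <= hi)

# Hypothetical byte counts for one port at 09:00 over past days; the 500
# spike is removed by the deviation filter before the baseline is computed.
hist = [100, 102, 98, 101, 99, 500]
print(is_abnormal(hist, 100), is_abnormal(hist, 400))
```

    Keeping one baseline per (port, time-slot) cell lets the band track normal daily traffic patterns instead of one global threshold.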

  18. Long-baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Crane, D.; Goodman, M.

    1994-01-01

    There is no unambiguous definition of a long-baseline neutrino oscillation experiment. The term is generally used for accelerator neutrino oscillation experiments which are sensitive to small values of Δm² and for which the detector is not on the accelerator site. The Snowmass N2L working group met to discuss the issues facing such experiments. The Fermilab Program Advisory Committee adopted several recommendations concerning the Fermilab neutrino program at their Aspen meeting immediately prior to the Snowmass Workshop, which heightened the attention given at the workshop to the proposals to use Fermilab for a long-baseline neutrino experiment. The plan for a neutrino oscillation program at Brookhaven was also thoroughly discussed. Opportunities at CERN were considered, particularly the use of detectors at the Gran Sasso laboratory. The idea of building a neutrino beam from KEK towards Superkamiokande was not discussed at the Snowmass meeting, but there has been considerable development of this idea since then. Brookhaven and KEK would use low-energy neutrino beams, while FNAL and CERN plan medium-energy beams. This report summarizes a few topics common to long-baseline proposals and attempts to give a snapshot of where things stand in this fast-developing field.

  19. Scoping paper on new CDM baseline methodology for cross-border power trade

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Poeyry has been sub-contracted by Carbon Limits, under the African Development Bank CDM Support Programme, to prepare a new CDM baseline methodology for cross border trade, based on a transmission line from Ethiopia to Kenya. The first step in that process is to review the response of the UNFCCC, particularly the Methodologies Panel ('Meth Panel') of the CDM Executive Board, to the various proposals on cross-border trade and interconnection of grids. This report reviews the Methodology Panel and Executive Board decisions on 4 requests for revisions of ACM2 'Consolidated baseline methodology for grid-connected electricity generation from renewable sources', and 5 proposed new baseline methodologies (NM255, NM269, NM272, NM318, NM342), all of which were rejected. We analyse the reasons the methodologies were rejected, and whether the proposed draft Approved Methodology (AM) that the Methodology Panel created in response to NM269 and NM272 is a suitable basis for a new methodology proposal.

  20. MeSH: a window into full text for document summarization.

    Science.gov (United States)

    Bhattacharya, Sanmitra; Ha-Thuc, Viet; Srinivasan, Padmini

    2011-07-01

    Previous research in the biomedical text-mining domain has historically been limited to titles, abstracts and metadata available in MEDLINE records. Recent research initiatives such as TREC Genomics and BioCreAtIvE strongly point to the merits of moving beyond abstracts and into the realm of full texts. Full texts are, however, more expensive to process not only in terms of resources needed but also in terms of accuracy. Since full texts contain embellishments that elaborate, contextualize, contrast, supplement, etc., there is greater risk for false positives. Motivated by this, we explore an approach that offers a compromise between the extremes of abstracts and full texts. Specifically, we create reduced versions of full text documents that contain only important portions. In the long-term, our goal is to explore the use of such summaries for functions such as document retrieval and information extraction. Here, we focus on designing summarization strategies. In particular, we explore the use of MeSH terms, manually assigned to documents by trained annotators, as clues to select important text segments from the full text documents. Our experiments confirm the ability of our approach to pick the important text portions. Using the ROUGE measures for evaluation, we were able to achieve maximum ROUGE-1, ROUGE-2 and ROUGE-SU4 F-scores of 0.4150, 0.1435 and 0.1782, respectively, for our MeSH term-based method versus the maximum baseline scores of 0.3815, 0.1353 and 0.1428, respectively. Using a MeSH profile-based strategy, we were able to achieve maximum ROUGE F-scores of 0.4320, 0.1497 and 0.1887, respectively. Human evaluation of the baselines and our proposed strategies further corroborates the ability of our method to select important sentences from the full texts. sanmitra-bhattacharya@uiowa.edu; padmini-srinivasan@uiowa.edu.
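    The core selection step can be sketched as scoring sentences by how many MeSH terms they mention and keeping the top scorers. The sentences, terms, and substring-match scoring rule below are invented stand-ins for the paper's strategies.

```python
def summarize(sentences, mesh_terms, k=1):
    """Keep the k sentences that mention the most MeSH terms."""
    def score(sentence):
        text = sentence.lower()
        return sum(term.lower() in text for term in mesh_terms)
    # Stable sort: ties keep original document order.
    return sorted(sentences, key=score, reverse=True)[:k]

sentences = [
    "The history of the assay is long.",
    "P53 mutation and apoptosis were measured in tumor cells.",
    "We thank the funding agencies.",
]
mesh = ["Apoptosis", "Tumor"]          # hypothetical assigned MeSH terms
print(summarize(sentences, mesh, k=1))
```

    A profile-based variant would weight terms (e.g. by how characteristic they are of the document's MeSH profile) instead of counting raw mentions.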

  1. Mobile Robots Path Planning Using the Overall Conflict Resolution and Time Baseline Coordination

    Directory of Open Access Journals (Sweden)

    Yong Ma

    2014-01-01

    Full Text Available This paper aims at resolving the path planning problem in a time-varying environment based on the idea of overall conflict resolution and the algorithm of time baseline coordination. The basic task of the introduced path planning algorithms is the automatic generation of the shortest paths from the defined start poses to the end poses, under various constraints, for multiple mobile robots. Building on this, by using overall conflict resolution within the polynomial-based paths, we take all the constraints into account together, including smoothness, motion boundaries, kinematic constraints, obstacle avoidance, and safety constraints among robots. A time baseline coordination algorithm is then proposed to solve the formulated problem. The foremost strength of the approach is that much computation time can be saved. Numerical simulations verify the effectiveness of our approach.

  2. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    Science.gov (United States)

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to a reasonable cost and a simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new peak extraction algorithm for raw spectra, in which the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model constituted by a smooth baseline part plus a sparse peak list convolved with a known peak shape; the model is then fitted under a Gaussian noise model. The proposed method is well suited to processing low-resolution spectra with an important baseline and unresolved peaks. The paper describes the derivation of this new peak deconvolution procedure and discusses some of its interpretations. The algorithm is then described in pseudo-code form, with the required optimization procedure detailed. For synthetic data the method is compared to a more conventional approach: the new method reduces the artifacts caused by the usual two-step procedure of baseline removal followed by peak extraction. Finally, results on real linear MALDI-ToF spectra are provided. In this study a collection of spectra of spiked proteins was acquired and then analyzed; better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis. In summary, we introduced a new method for peak picking in which peak deconvolution and baseline computation are performed jointly, and showed on simulated data that this global approach performs better than a classical one where baseline and peaks are processed sequentially.

  3. Robust extraction of baseline signal of atmospheric trace species using local regression

    Directory of Open Access Journals (Sweden)

    A. F. Ruckstuhl

    2012-11-01

    Full Text Available The identification of atmospheric trace species measurements that are representative of well-mixed background air masses is required for monitoring atmospheric composition change at background sites. We present a statistical method based on robust local regression that is well suited for the selection of background measurements and the estimation of associated baseline curves. The bootstrap technique is applied to calculate the uncertainty in the resulting baseline curve. The non-parametric nature of the proposed approach makes it a very flexible data filtering method. Application to carbon monoxide (CO) measured from 1996 to 2009 at the high-alpine site Jungfraujoch (Switzerland, 3580 m a.s.l.), and to measurements of 1,1-difluoroethane (HFC-152a) from Jungfraujoch (2000 to 2009) and Mace Head (Ireland, 1995 to 2009), demonstrates the feasibility and usefulness of the proposed approach.

    The determined average annual change of CO at Jungfraujoch for the 1996 to 2009 period, as estimated from filtered annual mean CO concentrations, is −2.2 ± 1.1 ppb yr⁻¹. For comparison, the linear trend of unfiltered CO measurements at Jungfraujoch for this time period is −2.9 ± 1.3 ppb yr⁻¹.
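    A much-simplified stand-in for the robust local regression — a moving-median baseline with a tolerance band for flagging background measurements — can illustrate the filtering idea. The window size, tolerance, and CO values below are invented; the paper's actual estimator is a robust local regression fit, not a plain median.

```python
import statistics

def robust_baseline(y, half_window=2):
    """Moving-median baseline: insensitive to short pollution-event outliers."""
    n = len(y)
    return [statistics.median(y[max(0, i - half_window): i + half_window + 1])
            for i in range(n)]

def background_flags(y, tol):
    """Flag measurements close to the baseline as well-mixed background air."""
    base = robust_baseline(y)
    return [abs(v - b) <= tol for v, b in zip(y, base)]

# Hypothetical CO series (ppb): flat background with one pollution spike.
co = [92.0, 91.0, 93.0, 150.0, 92.0, 91.0, 90.0]
flags = background_flags(co, tol=5.0)
print(flags)
```

    Trends such as the annual CO change above would then be computed only from the flagged background measurements.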

  4. Damage Identification of Bridge Based on Chebyshev Polynomial Fitting and Fuzzy Logic without Considering Baseline Model Parameters

    Directory of Open Access Journals (Sweden)

    Yu-Bo Jiao

    2015-01-01

    Full Text Available The paper presents an effective approach for damage identification of bridges based on Chebyshev polynomial fitting and fuzzy logic systems that does not require baseline model data. The modal curvature of the damaged bridge is obtained through central difference approximation of the displacement mode shape. From the modal curvature of the damaged structure, Chebyshev polynomial fitting is applied to recover the curvature of the undamaged one without baseline parameters. The modal curvature difference can therefore be derived and used for damage localization. Subsequently, the normalized modal curvature difference is treated as the input variable of a fuzzy logic system for damage condition assessment. Numerical simulation on a simply supported bridge demonstrates the feasibility of the proposed method.
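    The fitting step can be illustrated with NumPy's Chebyshev tools: a low-order fit to the damaged curvature smooths out the sharp local damage signature and so plays the role of the undamaged reference. The curvature signal, polynomial degree, and damage location below are synthetic assumptions.

```python
import numpy as np
from numpy.polynomial import chebyshev

# Synthetic modal curvature: a smooth global shape plus a sharp local
# bump at the damaged element.
x = np.linspace(-1.0, 1.0, 101)
curvature = np.sin(np.pi * x)
damage_index = 30
curvature[damage_index] += 0.4            # local damage signature

# A low-order Chebyshev fit follows the global shape but not the bump,
# so it approximates the undamaged curvature.
coeffs = chebyshev.chebfit(x, curvature, deg=7)
undamaged = chebyshev.chebval(x, coeffs)

# The curvature difference peaks at the damaged element.
difference = np.abs(curvature - undamaged)
print(int(np.argmax(difference)))
```

    In the paper, this (normalized) difference is then fed to the fuzzy logic system for condition assessment rather than simply taking the arg-max.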

  5. Predicting Prosody from Text for Text-to-Speech Synthesis

    CERN Document Server

    Rao, K Sreenivasa

    2012-01-01

    Predicting Prosody from Text for Text-to-Speech Synthesis covers the specific aspects of prosody, mainly focusing on how to predict the prosodic information from linguistic text, and then how to exploit the predicted prosodic knowledge for various speech applications. Author K. Sreenivasa Rao discusses proposed methods along with state-of-the-art techniques for the acquisition and incorporation of prosodic knowledge for developing speech systems. Positional, contextual and phonological features are proposed for representing the linguistic and production constraints of the sound units present in the text. This book is intended for graduate students and researchers working in the area of speech processing.

  6. Negation handling in sentiment classification using rule-based adapted from Indonesian language syntactic for Indonesian text in Twitter

    Science.gov (United States)

    Amalia, Rizkiana; Arif Bijaksana, Moch; Darmantoro, Dhinta

    2018-03-01

    The presence of a negation word can change the polarity of a text; if it is not handled properly, it degrades the performance of sentiment classification. Negation words in Indonesian are ‘tidak’, ‘bukan’, ‘belum’ and ‘jangan’. There are also contrastive conjunctions that can reverse the actual values, such as ‘tetapi’ or ‘tapi’. Unigram features fall short in dealing with negation because they treat the negation word and the negated words as separate tokens. A general approach for negation handling in English text gives the tag ‘NEG_’ to every word following a negation until the first punctuation, but this may tag words that are not actually negated, and it does not handle negation and conjunction occurring in one sentence. In this study, a rule-based method is proposed that determines which words are negated by adapting the syntactic rules of Indonesian negation to delimit the scope of negation. Using the adapted syntactic rules and ‘NEG_’ tagging with an SVM classifier (RBF kernel) gives better performance than the other experiments. In terms of average F1-score, the proposed method improves on the baseline by 1.79% (baseline without negation handling) and 5% (baseline with existing negation handling) for a dataset in which all tweets contain negation words, and by 2.69% (without negation handling) and 3.17% (with existing negation handling) for a second dataset with varying numbers of negation words per tweet.
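    A minimal tagger in the spirit of the described scope rules — open the scope at a negation word, close it at punctuation or at a contrastive conjunction — might look like this. The closing rule is a simplification of the paper's adapted syntactic rules.

```python
NEGATION = {"tidak", "bukan", "belum", "jangan"}
CONTRAST = {"tetapi", "tapi"}       # flips polarity back: closes the scope
PUNCT = {".", ",", "!", "?", ";"}

def tag_negation(tokens):
    """Prefix 'NEG_' to tokens inside a negation scope."""
    out, in_scope = [], False
    for tok in tokens:
        low = tok.lower()
        if low in NEGATION:
            out.append(tok)
            in_scope = True
        elif low in PUNCT or low in CONTRAST:
            out.append(tok)
            in_scope = False
        else:
            out.append("NEG_" + tok if in_scope else tok)
    return out

# "I do not like this movie, but the actors are good" (Indonesian).
print(tag_negation("saya tidak suka film ini , tapi aktornya bagus".split()))
```

    Note how ‘tapi’ ends the scope, so the praise after the conjunction keeps its original polarity, which the naive until-punctuation rule would also get right here but a negation-then-conjunction clause without punctuation would not.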

  7. Near Detectors based on gas TPCs for neutrino long baseline experiments

    CERN Document Server

    Blondel, A

    2017-01-01

    Time Projection Chambers have been used with success for the T2K ND280 near detector and are proposed for an upgrade of the T2K near detector. High pressure TPCs are also being considered for future long-baseline experiments like Hyper-Kamiokande and DUNE. A High Pressure TPC would be a very sensitive detector for the detailed study of neutrino-nucleus interactions, a limiting factor for extracting the ultimate precision in long baseline experiments. The requirements of TPCs for neutrino detectors are quite specific. We propose here the development of state-of-the-art near detectors based on gas TPC: atmospheric pressure TPCs for T2K-II and a high-pressure TPC for neutrino experiments. The project proposed here benefits from a strong involvement of the European (CERN) members of the T2K collaboration and beyond. It is a strongly synergetic precursor of other projects of near detectors using gas TPCs that are under discussion for the long baseline neutrino projects worldwide. It will help maintain and develop...

  8. A Proposal for a Three Detector Short-Baseline Neutrino Oscillation Program in the Fermilab Booster Neutrino Beam

    CERN Document Server

    Antonello, M.; Bellini, V.; Benetti, P.; Bertolucci, S.; Bilokon, H.; Boffelli, F.; Bonesini, M.; Bremer, J.; Calligarich, E.; Centro, S.; Cocco, A.G.; Dermenev, A.; Falcone, A.; Farnese, C.; Fava, A.; Ferrari, A.; Gibin, D.; Gninenko, S.; Golubev, N.; Guglielmi, A.; Ivashkin, A.; Kirsanov, M.; Kisiel, J.; Kose, U.; Mammoliti, F.; Mannocchi, G.; Menegolli, A.; Meng, G.; Mladenov, D.; Montanari, C.; Nessi, M.; Nicoletto, M.; Noto, F.; Picchi, P.; Pietropaolo, F.; Plonski, P.; Potenza, R.; Rappoldi, A.; Raselli, G.L.; Rossella, M.; Rubbia, C.; Sala, P.; Scaramelli, A.; Sobczyk, J.; Spanu, M.; Stefan, D.; Sulej, R.; Sutera, C.M.; Torti, M.; Tortorici, F.; Varanini, F.; Ventura, S.; Vignoli, C.; Wachala, T.; Zani, A.; Adams, C.; Andreopoulos, C.; Ankowski, A.M.; Asaadi, J.; Bagby, L.; Baller, B.; Barros, N.; Bass, M.; Bishai, M.; Bitadze, A.; Bugel, L.; Camilleri, L.; Cavanna, F.; Chen, H.; Chi, C.; Church, E.; Cianci, D.; Collin, G.H.; Conrad, J.M.; De Geronimo, G.; Dharmapalan, R.; Djurcic, Z.; Ereditato, A.; Esquivel, J.; Evans, J.; Fleming, B.T.; Foreman, W.M.; Freestone, J.; Gamble, T.; Garvey, G.; Genty, V.; Goldi, D.; Gramellini, E.; Greenlee, H.; Guenette, R.; Hackenburg, A.; Hanni, R.; Ho, J.; Howell, J.; James, C.; Jen, C.M.; Jones, B.J.P.; Kalousis, L.N.; Karagiorgi, G.; Ketchum, W.; Klein, J.; Klinger, J.; Kreslo, I.; Kudryavtsev, V.A.; Lissauer, D.; Livesly, P.; Louis, W.C.; Luthi, M.; Mariani, C.; Mavrokoridis, K.; McCauley, N.; McConkey, N.; Mercer, I.; Miao, T.; Mills, G.B.; Montanari, D.; Moon, J.; Moss, Z.; Mufson, S.; Norris, B.; Nowak, J.; Pal, S.; Palamara, O.; Pater, J.; Pavlovic, Z.; Perkin, J.; Pulliam, G.; Qian, X.; Qiuguang, L.; Radeka, V.; Rameika, R.; Ratoff, P.N.; Richardson, M.; von Rohr, C.Rudolf; Russell, B.; Schmitz, D.W.; Shaevitz, M.H.; Sippach, B.; Soderberg, M.; Soldner-Rembold, S.; Spitz, J.; Spooner, N.; Strauss, T.; Szelc, A.M.; Taylor, C.E.; Terao, K.; Thiesse, M.; Thompson, L.; Thomson, M.; Thorn, C.; Toups, M.; Touramanis, 
C.; Van de Water, R.G.; Weber, M.; Whittington, D.; Wongjirad, T.; Yu, B.; Zeller, G.P.; Zennamo, J.; Acciarri, R.; An, R.; Barr, G.; Blake, A.; Bolton, T.; Bromberg, C.; Caratelli, D.; Carls, B.; Convery, M.; Dytmam, S.; Eberly, B.; Gollapinni, S.; Graham, M.; Grosso, R.; Hen, O.; Hewes, J.; Horton-Smith, G.; Johnson, R.A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Li, Y.; Littlejohn, B.; Lockwitz, S.; Lundberg, B.; Marchionni, A.; Marshall, J.; McDonald, K.; Meddage, V.; Miceli, T.; Mooney, M.; Moulai, M.H.; Murrells, R.; Naples, D.; Nienaber, P.; Paolone, V.; Papavassiliou, V.; Pate, S.; Pordes, S.; Raaf, J.L.; Rebel, B.; Rochester, L.; Schukraft, A.; Seligman, W.; St. John, J.; Tagg, N.; Tsai, Y.; Usher, T.; Wolbers, S.; Woodruff, K.; Xu, M.; Yang, T.; Zhang, C.; Badgett, W.; Biery, K.; Brice, S.J.; Dixon, S.; Geynisman, M.; Moore, C.; Snider, E.; Wilson, P.

    2015-01-01

    A Short-Baseline Neutrino (SBN) physics program of three LAr-TPC detectors located along the Booster Neutrino Beam (BNB) at Fermilab is presented. This new SBN Program will deliver a rich and compelling physics opportunity, including the ability to resolve a class of experimental anomalies in neutrino physics and to perform the most sensitive search to date for sterile neutrinos at the eV mass-scale through both appearance and disappearance oscillation channels. Using data sets of 6.6e20 protons on target (P.O.T.) in the LAr1-ND and ICARUS T600 detectors plus 13.2e20 P.O.T. in the MicroBooNE detector, we estimate that a search for muon neutrino to electron neutrino appearance can be performed with ~5 sigma sensitivity for the LSND allowed (99% C.L.) parameter region. In this proposal for the SBN Program, we describe the physics analysis, the conceptual design of the LAr1-ND detector, the design and refurbishment of the T600 detector, the necessary infrastructure required to execute the program, and a possible...

  9. 40 CFR 74.20 - Data for baseline and alternative baseline.

    Science.gov (United States)

    2010-07-01

    Title 40 (Protection of Environment), § 74.20 — Data for baseline and alternative baseline. ENVIRONMENTAL PROTECTION AGENCY (CONTINUED), AIR... (a) Acceptable data. (1) The designated representative of a combustion...

  10. Magical properties of a 2540 km baseline superbeam experiment

    International Nuclear Information System (INIS)

    Raut, Sushant K.; Singh, Ravi Shanker; Uma Sankar, S.

    2011-01-01

    Lack of any information on the CP-violating phase δCP weakens our ability to determine the neutrino mass hierarchy. A magic baseline of 7500 km was proposed to overcome this problem. However, to obtain large enough fluxes at this very long baseline, one needs new techniques for generating high-intensity neutrino beams. In this Letter, we highlight the magical properties of a 2540 km baseline. At such a baseline, using a narrow-band neutrino superbeam whose no-oscillation event rate peaks around an energy of 3.5 GeV, we can determine the neutrino mass hierarchy independently of the CP phase. For sin²2θ13 ≥ 0.05, a very modest exposure of 10 kiloton-years is sufficient to determine the hierarchy. For 0.02 ≤ sin²2θ13 ≤ 0.05, an exposure of about 100 kiloton-years is needed.

  11. Measurement of baseline and orientation between distributed aerospace platforms.

    Science.gov (United States)

    Wang, Wen-Qin

    2013-01-01

    Distributed platforms play an important role in aerospace remote sensing, radar navigation, and wireless communication applications. However, besides requiring highly accurate time and frequency synchronization for coherent signal processing, the baseline between the transmitting and receiving platforms and their relative orientation must be measured in real time during data recording. In this paper, we propose an improved pulsed duplex microwave ranging approach that determines the spatial baseline and orientation between distributed aerospace platforms using a proposed high-precision time-interval estimation method. The approach is novel in that it cancels the effect of oscillator frequency synchronization errors arising from the separate oscillators used on the platforms. Several performance specifications are also discussed. The effectiveness of the approach is verified by simulation results.

  12. AN INTEGRATED RANSAC AND GRAPH BASED MISMATCH ELIMINATION APPROACH FOR WIDE-BASELINE IMAGE MATCHING

    Directory of Open Access Journals (Sweden)

    M. Hasheminasab

    2015-12-01

    In this paper we propose an integrated approach to increase the precision of feature point matching. Many algorithms have been developed to optimize short-baseline image matching, but wide-baseline matching remains difficult to handle because of illumination differences and viewpoint changes. Fortunately, recent developments in the automatic extraction of local invariant features make wide-baseline image matching possible. Matching algorithms based on the local feature similarity principle use feature descriptors to establish correspondence between feature point sets. To date, the most remarkable descriptor is the scale-invariant feature transform (SIFT) descriptor, which is invariant to image rotation and scale and remains robust across a substantial range of affine distortion, noise, and changes in illumination. The epipolar constraint based on RANSAC (random sample consensus) is a conventional model for mismatch elimination, particularly in computer vision. Because only the distance from the epipolar line is considered, a few false matches remain among the matches selected by epipolar geometry and RANSAC. Aguilar et al. proposed the Graph Transformation Matching (GTM) algorithm to remove outliers, but it has difficulty when mismatched points are surrounded by the same local neighbor structure. To overcome these limitations, a new three-step matching scheme is presented: the SIFT algorithm is used to obtain initial corresponding point sets; RANSAC is then applied to reduce the outliers; and finally, GTM based on the adjacent K-NN graph removes the remaining mismatches. Four close-range image datasets with changes in viewpoint are used to evaluate the performance of the proposed method, and the experimental results indicate its robustness and…
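
The RANSAC filtering stage described above can be illustrated with a toy sketch. For brevity the model below is a 2-D line rather than a fundamental matrix with epipolar distances, so it only demonstrates the sample-and-score idea, not the paper's actual geometry:

```python
import random

def ransac_line(points, n_iter=200, tol=0.5, seed=0):
    """Toy RANSAC: fit y = a*x + b from random 2-point samples and
    return the largest inlier set (stand-in for epipolar-distance scoring)."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)   # minimal sample
        if x1 == x2:
            continue                                  # degenerate sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # score: points within `tol` of the hypothesized model are inliers
        inliers = [p for p in points if abs(p[1] - (a * p[0] + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# 8 consistent "matches" on y = 2x + 1 plus two gross mismatches
pts = [(float(x), 2.0 * x + 1) for x in range(8)] + [(1.0, 40.0), (5.0, -30.0)]
inliers = ransac_line(pts)
```

In the real pipeline the minimal sample is 7 or 8 correspondences, the model is a fundamental matrix, and the residual is the point-to-epipolar-line distance.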

  13. Exploring non standard physics in long-baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Chatterjee, Sabya Sachi

    2015-01-01

    After the recent discovery of large θ13, the focus has shifted to the remaining fundamental issues: the neutrino mass ordering and CP violation in the leptonic sector. Proposed future long-baseline facilities such as DUNE (1300 km baseline from FNAL to Homestake) and LBNO (2290 km baseline from CERN to Pyhasalmi) are well suited to address these issues at high confidence level. Beyond the standard framework, these experiments are also highly capable of searching for new physics beyond the Standard Model. In this work, we explore whether these high-precision future facilities are sensitive to new U(1) global symmetries, and up to which confidence level. (author)

  14. MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation

    Directory of Open Access Journals (Sweden)

    Howard Daniel

    2006-01-01

    Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift, which sheds light on the inherent difficulty of removing it by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal-processing route to a numerically straightforward baseline-removal method, as it includes a free parameter that can be optimized for different baseline drift removal applications. Therefore, research that determines putative biomarkers from the spectral data might benefit from a sensitivity analysis to the underlying spectral measurement, made possible by varying the SB free parameter, which can be tuned manually or with evolutionary computation.
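
The iterative scheme can be imitated with the classical (deterministic) Bernstein operator, which reproduces a linear drift exactly while strongly flattening narrow peaks. This is only a sketch of the clip-and-smooth idea, not the authors' stochastic formulation:

```python
from math import comb

def bernstein_smooth(y):
    """One pass of the Bernstein operator:
    B(y)(x) = sum_k y[k] * C(n,k) * x**k * (1-x)**(n-k), x = i/n."""
    n = len(y) - 1
    return [sum(y[k] * comb(n, k) * (i / n) ** k * (1 - i / n) ** (n - k)
                for k in range(n + 1))
            for i in range(n + 1)]

def baseline_bernstein(signal, n_iter=30):
    """Iterative scheme: clip the working copy down to the current smooth
    estimate, so peaks are progressively ignored and only drift survives."""
    y = list(signal)
    for _ in range(n_iter):
        smooth = bernstein_smooth(y)
        y = [min(a, b) for a, b in zip(y, smooth)]
    return bernstein_smooth(y)

# linear drift plus one sharp peak (toy spectrum)
N = 41
drift = [1.0 + 0.05 * i for i in range(N)]
sig = list(drift)
sig[20] += 10.0                          # the "real" peak to preserve
base = baseline_bernstein(sig)
corrected = [s - b for s, b in zip(sig, base)]
```

Because the Bernstein operator reproduces linear functions exactly, the drift is recovered without distortion while the single-sample peak survives correction almost untouched.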

  15. A baseline-free procedure for transformation models under interval censorship.

    Science.gov (United States)

    Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin

    2005-12-01

    An important property of the Cox regression model is that estimation of the regression parameters by the partial likelihood procedure does not depend on the baseline survival function; we call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under an interval-censoring framework. The baseline-free procedure yields a simplified and stable computation algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, for which the estimation procedures available so far involve estimating the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean-zero martingale is provided.

  16. Multi-sensors multi-baseline mapping system for mobile robot using stereovision camera and laser-range device

    Directory of Open Access Journals (Sweden)

    Mohammed Faisal

    2016-06-01

    Countless applications today use mobile robots, including autonomous navigation, security patrolling, housework, search-and-rescue operations, material handling, manufacturing, and automated transportation systems. Regardless of the application, a mobile robot must use a robust autonomous navigation system. Autonomous navigation remains one of the primary challenges in the mobile-robot industry; many control algorithms and techniques have recently been developed that aim to overcome it. Among autonomous navigation methods, vision-based systems have been growing in recent years due to rapid gains in computational power and the reliability of visual sensors. The primary focus of research into vision-based navigation is to allow a mobile robot to navigate in an unstructured environment without collision. In recent years, several researchers have examined methods for setting up autonomous mobile robots for navigational tasks; among these, stereovision-based navigation is a promising approach for reliable and efficient navigation. In this article, we create and develop a novel mapping system for a robust autonomous navigation system. The main contribution of this article is the fusion of multi-baseline stereovision (narrow and wide baselines) and laser-range readings to enhance the accuracy of the point cloud, to reduce the ambiguity of correspondence matching, and to extend the field of view of the proposed mapping system to 180°. Another contribution is the pruning of the region of interest of the three-dimensional point clouds to reduce the computational burden of the stereo process. We therefore call the proposed system a multi-sensor multi-baseline mapping system. The experimental results illustrate its robustness and accuracy.

  17. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    Science.gov (United States)

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Recently, several methods have been developed to correct such baseline signals; however, most of them cannot estimate the baseline in complex, overlapped signals. In this study, a novel automatic baseline correction method based on ensemble empirical mode decomposition (EEMD) is proposed for (1)H-MRS spectra. The method was applied to both simulated data and in-vivo (1)H-MRS human brain signals. The results justify the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.

  18. Baseline Design and Performance Analysis of Laser Altimeter for Korean Lunar Orbiter

    Directory of Open Access Journals (Sweden)

    Hyung-Chul Lim

    2016-09-01

    Korea's lunar exploration project includes an orbiter, a lander (including a rover), and an experimental orbiter (referred to as a lunar pathfinder). Laser altimeters have played an important scientific role in lunar, planetary, and asteroid exploration missions since their first use in 1971 onboard the Apollo 15 mission to the Moon. In this study, a laser altimeter is proposed as a scientific instrument for the Korean lunar orbiter, to be launched by 2020, to study the global topography of the lunar surface and its gravitational field and to support other payloads such as a terrain-mapping camera or spectral imager. This study presents the baseline design and performance model for the proposed laser altimeter and discusses the expected performance based on numerical simulation results. The simulation results indicate that the chosen system parameters satisfy the performance requirements for detection probability and range error even under unfavorable conditions.

  19. Accelerated Best Basis Inventory Baselining Task

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2001-01-01

    The baselining effort was recently proposed to bring the Best-Basis Inventory (BBI) and Question No.8 of the Tank Interpretive Report (TIR) for all 177 tanks to the current standards and protocols and to prepare a TIR Question No.8 if one is not already available. This plan outlines the objectives and methodology of the accelerated BBI baselining task. BBI baselining meetings held during December 2000 resulted in a revised BBI methodology and an initial set of BBI creation rules to be used in the baselining effort. The objectives of the BBI baselining effort are to: (1) Provide inventories that are consistent with the revised BBI methodology and new BBI creation rules. (2) Split the total tank waste in each tank into six waste phases, as appropriate (Supernatant, saltcake solids, saltcake liquid, sludge solids, sludge liquid, and retained gas). In some tanks, the solids and liquid portions of the sludge and/or saltcake may be combined into a single sludge or saltcake phase. (3) Identify sampling events that are to be used for calculating the BBIs. (4) Update waste volumes for subsequent reconciliation with the Hanlon (2001) waste tank summary. (5) Implement new waste type templates. (6) Include any sample data that might have been unintentionally omitted in the previous BBI and remove any sample data that should not have been included. Sample data to be used in the BBI must be available on TWINS. (7) Ensure that an inventory value for each standard BBI analyte is provided for each waste component. Sample based inventories for supplemental BBI analytes will be included when available. (8) Provide new means and confidence interval reports if one is not already available and include uncertainties in reporting inventory values

  20. Baseline development, economic risk, and schedule risk: An integrated approach

    International Nuclear Information System (INIS)

    Tonkinson, J.A.

    1994-01-01

    The economic and schedule risks of Environmental Restoration (ER) projects are commonly analyzed toward the end of the baseline development process. Risk analysis is usually performed as the final element of the scheduling or estimating processes for the purpose of establishing cost and schedule contingency. However, there is an opportunity for earlier assessment of risks, during development of the technical scope and Work Breakdown Structure (WBS). Integrating the processes of risk management and baselining provides for early incorporation of feedback regarding schedule and cost risk into the proposed scope of work. Much of the information necessary to perform risk analysis becomes available during development of the technical baseline, as the scope of work and WBS are being defined. The analysis of risk can actually be initiated early on during development of the technical baseline and continue throughout development of the complete project baseline. Indeed, best business practices suggest that information crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable

  1. A baseline correction algorithm for Raman spectroscopy by adaptive knots B-spline

    International Nuclear Information System (INIS)

    Wang, Xin; Fan, Xian-guang; Xu, Ying-jie; Wang, Xiu-fen; He, Hao; Zuo, Yong

    2015-01-01

    The Raman spectroscopy technique is a powerful, non-invasive technique for molecular fingerprint detection that has been widely used in many areas, such as food safety, drug safety, and environmental testing. However, Raman signals are easily corrupted by a fluorescent background; therefore, we present a baseline correction algorithm to suppress this background. In this algorithm, the background of the Raman signal is suppressed by fitting a curve, called a baseline, using a cyclic approximation method. Instead of traditional polynomial fitting, we use a B-spline as the fitting basis because its low order and smoothness effectively avoid both under-fitting and over-fitting. In addition, we present an automatic adaptive knot generation method to replace traditional uniform knots. The algorithm achieves the desired performance for most Raman spectra with varying baselines without any user input or preprocessing step. In the simulation, three kinds of fluorescent background lines were introduced to test the effectiveness of the proposed method. We show that two real Raman spectra (parathion-methyl and colza oil) can be detected and their baselines corrected by the proposed method. (paper)
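
The cyclic approximation idea can be sketched with an ordinary least-squares polynomial standing in for the adaptive-knot B-spline — i.e., this is the classic variant the paper improves upon, not the proposed algorithm itself:

```python
def polyfit_lsq(x, y, deg):
    """Least-squares polynomial fit via normal equations with partial pivoting."""
    m = deg + 1
    A = [[sum(xi ** (i + j) for xi in x) for j in range(m)] for i in range(m)]
    b = [sum(yi * xi ** i for xi, yi in zip(x, y)) for i in range(m)]
    for col in range(m):                       # forward elimination
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * m                           # back substitution
    for i in range(m - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, m))) / A[i][i]
    return coef

def baseline_modpoly(x, y, deg=3, n_iter=50):
    """Cyclic approximation: fit, clip the working signal to the fit, repeat.
    Peaks are progressively excluded and the fit sinks to the background."""
    work = list(y)
    for _ in range(n_iter):
        c = polyfit_lsq(x, work, deg)
        fit = [sum(ci * xi ** i for i, ci in enumerate(c)) for xi in x]
        work = [min(w, f) for w, f in zip(work, fit)]
    return fit

# smooth fluorescent background plus one Raman peak (toy spectrum)
x = [i / 40 for i in range(41)]
bg = [2 + 0.5 * xi - 0.3 * xi * xi for xi in x]
sig = list(bg)
for i, bump in ((19, 2.5), (20, 5.0), (21, 2.5)):
    sig[i] += bump
base = baseline_modpoly(x, sig)
```

A B-spline with adaptive knots replaces the global polynomial precisely because low-order local pieces track varying baselines without the over/under-fitting a single polynomial exhibits.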

  2. A Fully Customized Baseline Removal Framework for Spectroscopic Applications.

    Science.gov (United States)

    Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby

    2017-07-01

    The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn and laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.

  3. Developing RESRAD-BASELINE for environmental baseline risk assessment

    International Nuclear Information System (INIS)

    Cheng, Jing-Jy.

    1995-01-01

    RESRAD-BASELINE is a computer code developed at Argonne National Laboratory for the US Department of Energy (DOE) to perform both radiological and chemical risk assessments. The code implements the baseline risk assessment guidance of the US Environmental Protection Agency (EPA 1989). It calculates (1) radiation doses and cancer risks from exposure to radioactive materials and (2) hazard indexes and cancer risks from exposure to noncarcinogenic and carcinogenic chemicals, respectively. The user can enter measured or predicted environmental media concentrations through the graphic interface and can simulate different exposure scenarios by selecting the appropriate pathways and modifying the exposure parameters. The database used by RESRAD-BASELINE includes dose conversion factors and slope factors for radionuclides as well as toxicity information and properties for chemicals, and the user can modify it for use in the calculation. Sensitivity analysis can be performed while running the code to examine the influence of the input parameters. Using RESRAD-BASELINE for risk analysis is easy, fast, and cost-saving; furthermore, it ensures consistency in methodology between radiological and chemical risk analyses
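
The chemical-risk quantities such a code reports follow the standard EPA baseline risk-assessment formulas (RAGS Part A). The sketch below uses illustrative values, not numbers from the code's database, and the slope factor is hypothetical:

```python
def chronic_daily_intake(conc, intake_rate, exp_freq, exp_dur, body_wt, avg_time):
    """CDI (mg/kg-day) = (C * IR * EF * ED) / (BW * AT), per EPA guidance."""
    return conc * intake_rate * exp_freq * exp_dur / (body_wt * avg_time)

def cancer_risk(cdi, slope_factor):
    return cdi * slope_factor        # dimensionless lifetime excess cancer risk

def hazard_quotient(cdi, rfd):
    return cdi / rfd                 # > 1 suggests potential noncancer effects

# illustrative drinking-water ingestion scenario
cdi = chronic_daily_intake(conc=0.005,        # mg/L in water
                           intake_rate=2.0,   # L/day
                           exp_freq=350,      # days/year
                           exp_dur=30,        # years
                           body_wt=70,        # kg
                           avg_time=70 * 365) # days (lifetime, for carcinogens)
risk = cancer_risk(cdi, slope_factor=1.5)     # (mg/kg-day)^-1, hypothetical
```

Radiological doses follow the analogous pattern of concentration times pathway-specific intake times a dose conversion factor.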

  4. Building a comprehensive syntactic and semantic corpus of Chinese clinical texts.

    Science.gov (United States)

    He, Bin; Dong, Bin; Guan, Yi; Yang, Jinfeng; Jiang, Zhipeng; Yu, Qiubin; Cheng, Jianyi; Qu, Chunyan

    2017-05-01

    To build a comprehensive corpus covering syntactic and semantic annotations of Chinese clinical texts with corresponding annotation guidelines and methods as well as to develop tools trained on the annotated corpus, which supplies baselines for research on Chinese texts in the clinical domain. An iterative annotation method was proposed to train annotators and to develop annotation guidelines. Then, by using annotation quality assurance measures, a comprehensive corpus was built, containing annotations of part-of-speech (POS) tags, syntactic tags, entities, assertions, and relations. Inter-annotator agreement (IAA) was calculated to evaluate the annotation quality and a Chinese clinical text processing and information extraction system (CCTPIES) was developed based on our annotated corpus. The syntactic corpus consists of 138 Chinese clinical documents with 47,426 tokens and 2612 full parsing trees, while the semantic corpus includes 992 documents that annotated 39,511 entities with their assertions and 7693 relations. IAA evaluation shows that this comprehensive corpus is of good quality, and the system modules are effective. The annotated corpus makes a considerable contribution to natural language processing (NLP) research into Chinese texts in the clinical domain. However, this corpus has a number of limitations. Some additional types of clinical text should be introduced to improve corpus coverage and active learning methods should be utilized to promote annotation efficiency. In this study, several annotation guidelines and an annotation method for Chinese clinical texts were proposed, and a comprehensive corpus with its NLP modules were constructed, providing a foundation for further study of applying NLP techniques to Chinese texts in the clinical domain. Copyright © 2017. Published by Elsevier Inc.
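
Inter-annotator agreement of the kind reported above is commonly computed with Cohen's kappa; a minimal sketch follows, with invented entity labels (not drawn from the corpus):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two annotators on the same items:
    kappa = (p_o - p_e) / (1 - p_e)."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n   # observed
    ca, cb = Counter(labels_a), Counter(labels_b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)              # chance
    return (p_o - p_e) / (1 - p_e)

# toy annotations of six clinical entity mentions
a = ["DISEASE", "DISEASE", "TREATMENT", "TEST", "TEST", "DISEASE"]
b = ["DISEASE", "TREATMENT", "TREATMENT", "TEST", "DISEASE", "DISEASE"]
kappa = cohens_kappa(a, b)
```

For relation and assertion layers with more than two annotators, pairwise kappa or F1-against-adjudicated-gold are typical alternatives.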

  5. A Unified Algorithm for Channel Imbalance and Antenna Phase Center Position Calibration of a Single-Pass Multi-Baseline TomoSAR System

    Directory of Open Access Journals (Sweden)

    Yuncheng Bu

    2018-03-01

    The multi-baseline synthetic aperture radar (SAR) tomography (TomoSAR) system is employed in applications such as disaster remote sensing, urban 3-D reconstruction, and forest carbon storage estimation because of its 3-D imaging capability from a single-pass platform. However, the high 3-D resolution of TomoSAR rests on the premise that the channel imbalance and antenna phase center (APC) positions are precisely known; if they are not, resolution performance is seriously degraded. In this paper, a unified algorithm for channel imbalance and APC position calibration of a single-pass multi-baseline TomoSAR system is proposed. Based on the maximum likelihood method, together with least squares and the damped Newton method, we calibrate the channel imbalance and APC positions. The algorithm is suitable for near-field conditions, and no phase unwrapping operation is required. Its effectiveness has been verified by simulation and experimental results.
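
The damped Newton step named above can be illustrated on a scalar toy cost; the actual calibration minimizes a multi-channel likelihood that the abstract does not spell out, so everything below is a generic sketch:

```python
def damped_newton(grad, hess, x0, lam=1.0, tol=1e-10, max_iter=100):
    """Scalar damped Newton: step = -g / (H + lam). The damping term lam keeps
    the step well-behaved where the Hessian is small or indefinite."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        x -= g / (hess(x) + lam)
    return x

# toy cost J(x) = (x - 3)**2 + 0.1 * (x - 3)**4, minimized at x = 3
xmin = damped_newton(grad=lambda x: 2 * (x - 3) + 0.4 * (x - 3) ** 3,
                     hess=lambda x: 2 + 1.2 * (x - 3) ** 2,
                     x0=0.0)
```

In the vector case the scalar division becomes a linear solve with H + lam*I, which is the Levenberg-style regularization typically paired with maximum-likelihood calibration.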

  6. A Kalman filter-based short baseline RTK algorithm for single-frequency combination of GPS and BDS.

    Science.gov (United States)

    Zhao, Sihao; Cui, Xiaowei; Guan, Feng; Lu, Mingquan

    2014-08-20

    The emerging Global Navigation Satellite Systems (GNSS), including the BeiDou Navigation Satellite System (BDS), offer more visible satellites to positioning users. To employ these new satellites in a real-time kinematic (RTK) algorithm and thereby enhance positioning precision and availability, a data processing model for the dual constellation of GPS and BDS is proposed and analyzed. A Kalman filter-based algorithm is developed to estimate the float ambiguities for short-baseline scenarios, and the entire workflow of the high-precision algorithm based on the proposed model is investigated in detail. The model is validated with real GPS and BDS data recorded in one zero-baseline and two short-baseline experiments. Results show that the proposed algorithm generates fixed baseline output with the same precision level as either a single-GPS or single-BDS RTK algorithm, and its significantly improved fixing rate and time to first fix demonstrate better availability and effectiveness in processing multiple GNSSs.
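
The predict/update cycle behind such a filter can be sketched for a single hypothetical float-ambiguity state; the real algorithm tracks a vector of double-difference ambiguities plus position states, so this scalar skeleton only shows the mechanics:

```python
import random

def kalman_constant(measurements, q=1e-6, r=0.25, x0=0.0, p0=100.0):
    """Scalar Kalman filter for a (nearly) constant state, e.g. one float
    carrier-phase ambiguity observed through noisy double differences."""
    x, p = x0, p0
    for z in measurements:
        p += q                  # predict: random-walk process noise
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update with innovation z - x
        p *= (1 - k)            # posterior variance
    return x, p

# simulated noisy double-difference observations of a true ambiguity of 7.3
rng = random.Random(42)
true_n = 7.3
zs = [true_n + rng.gauss(0, 0.5) for _ in range(200)]
n_hat, p_hat = kalman_constant(zs)
```

In an RTK pipeline the float estimate and its covariance would next be passed to an integer search (e.g. LAMBDA-style) to fix the ambiguities.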

  7. Physics Potential of Long-Baseline Experiments

    Directory of Open Access Journals (Sweden)

    Sanjib Kumar Agarwalla

    2014-01-01

    The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, θ13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of a non-maximal 2-3 mixing angle, which causes the octant ambiguity of θ23. In this paper, I review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I give a brief description of the neutrino oscillation phenomenon. Then, I discuss our present global understanding of the neutrino mass-mixing parameters and identify the major unknowns in this sector. After that, I present the physics reach of current-generation long-baseline experiments. Finally, I conclude with a discussion of the physics capabilities of accelerator-driven future long-baseline precision oscillation facilities.
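
The role of baseline and beam energy in these experiments can be traced to the standard two-flavor vacuum oscillation probability (a textbook approximation, not a formula taken from this paper):

```latex
P(\nu_\alpha \rightarrow \nu_\beta) \simeq \sin^2 2\theta \,
    \sin^2\!\left( \frac{1.27\,\Delta m^2[\mathrm{eV}^2]\; L[\mathrm{km}]}{E[\mathrm{GeV}]} \right)
```

The first oscillation maximum sits at 1.27 Δm² L/E ≈ π/2, which is why each facility pairs its baseline with a matched beam energy; matter effects and the sub-leading three-flavor terms emphasized above modify this simple picture.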

  8. C-018H Pre-Operational Baseline Sampling Plan

    International Nuclear Information System (INIS)

    Guzek, S.J.

    1993-01-01

    The objective of this task is to characterize and sample the soil in the field at selected locations along the proposed effluent line routes for Project C-018H. The overall purpose of this effort is to meet the proposed plan to discontinue the disposal of contaminated liquids into the Hanford soil column as described by DOE (1987). Detailed information describing the proposed transport pipeline route and the associated Kaiser Engineers Hanford Company (KEH) preliminary drawings (H288746...755), all inclusive, has been prepared by KEH (1992). The information developed from field monitoring and sampling will be used to characterize surface and subsurface soil along the proposed C-018H effluent pipeline and its associated facilities. Because existing contaminant levels may be encountered, soil characterization will provide a preoperational baseline reference for construction, support development of personnel safety requirements, and determine the need for any changes to the proposed routes prior to construction of the pipeline.

  9. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. Shippers are primarily concerned with safety, security, efficiency, and equipment requirements. Carriers are concerned with the potential impact that radioactive shipments may have on their operations, particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL): the HIGHWAY routing model predicts routes for truck transportation, the INTERLINE routing model predicts both rail and barge routes, and the AIRPORT locator model finds airports meeting specified criteria near a given location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and databases and to review enhancements that would expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report; the discussions pertaining to the different models are contained in separate sections.

  10. Ecotaxonmic baseline evaluation of the plant species in a ...

    African Journals Online (AJOL)

    The survey of the flora composition of an ecosystem is important in several environmental baseline studies. An ecotaxonomic assessment was carried out in the Ase-Ndoni proposed Rivgas Refinery project site in order to identify the plant species of medicinal and other economic value. The investigation was carried out to ...

  11. MERI: an ultra-long-baseline Moon-Earth radio interferometer.

    Science.gov (United States)

    Burns, J. O.

    Radiofrequency aperture synthesis, pioneered by Ryle and his colleagues at Cambridge in the 1960s, has evolved to ever longer baselines and larger arrays in recent years. The limiting resolution at a given frequency for modern ground-based very-long-baseline interferometry is simply determined by the physical diameter of the Earth. A second-generation, totally space-based VLB network was proposed recently by a group at the Naval Research Laboratory. The next logical extension of space-based VLBI would be a station or stations on the Moon. The Moon could serve as an outpost or even the primary correlator station for an extended array of space-based antennas.

  12. Text segmentation in degraded historical document images

    Directory of Open Access Journals (Sweden)

    A.S. Kavitha

    2016-07-01

    Full Text Available Text segmentation from degraded historical Indus script images helps an Optical Character Recognizer (OCR) achieve good recognition rates for Indus scripts; however, it is challenging due to the complex backgrounds in such images. In this paper, we present a new method for segmenting text and non-text in Indus documents based on the fact that text components are less cursive compared to non-text ones. To achieve this, we propose a new combination of Sobel and Laplacian filtering for enhancing degraded low-contrast pixels. Then the proposed method generates skeletons for text components in enhanced images to reduce the computational burden, which in turn helps in studying component structures efficiently. We propose to study the cursiveness of components based on branch information to remove false text components. The proposed method introduces a nearest-neighbor criterion for grouping components in the same line, which results in clusters. Furthermore, the proposed method classifies these clusters into text and non-text clusters based on characteristics of text components. We evaluate the proposed method on a large dataset containing a variety of images. The results are compared with those of existing methods to show that the proposed method is effective in terms of recall and precision.

  13. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener.

    Science.gov (United States)

    An, Yun-Kyu; Shen, Zhiqi; Wu, Zhishen

    2016-09-16

    This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique for the heat-affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only current data measured by a stripe-PZT sensor. The use of the stripe-PZT sensor makes it possible to significantly improve the applicability to real structures and minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.

  14. Performance Analysis for Airborne Interferometric SAR Affected by Flexible Baseline Oscillation

    Directory of Open Access Journals (Sweden)

    Liu Zhong-sheng

    2014-04-01

    Full Text Available The airborne interferometric SAR platform suffers from instability factors, such as air turbulence and mechanical vibrations during flight. Such factors cause the oscillation of the flexible baseline, which leads to significant degradation of the performance of the interferometric SAR system. This study is concerned with the baseline oscillation. First, the error of the slant range model under baseline oscillation conditions is formulated. Then, the SAR complex image signal and dual-channel correlation coefficient are modeled based on the first-order, second-order, and generic slant range error. Subsequently, the impact of the baseline oscillation on the imaging and interferometric performance of the SAR system is analyzed. Finally, simulations of the echo data are used to validate the theoretical analysis of the baseline oscillation in the airborne interferometric SAR.

  15. An Energy Efficiency Evaluation Method Based on Energy Baseline for Chemical Industry

    OpenAIRE

    Yao, Dong-mei; Zhang, Xin; Wang, Ke-feng; Zou, Tao; Wang, Dong; Qian, Xin-hua

    2016-01-01

    According to the requirements and structure of ISO 50001 energy management system, this study proposes an energy efficiency evaluation method based on energy baseline for chemical industry. Using this method, the energy plan implementation effect in the processes of chemical production can be evaluated quantitatively, and evidences for system fault diagnosis can be provided. This method establishes the energy baseline models which can meet the demand of the different kinds of production proce...

  16. Text summarization as a decision support aid

    Directory of Open Access Journals (Sweden)

    Workman T

    2012-05-01

    Full Text Available Abstract Background PubMed data potentially can provide decision support information, but PubMed was not exclusively designed to be a point-of-care tool. Natural language processing applications that summarize PubMed citations hold promise for extracting decision support information. The objective of this study was to evaluate the efficiency of a text summarization application called Semantic MEDLINE, enhanced with a novel dynamic summarization method, in identifying decision support data. Methods We downloaded PubMed citations addressing the prevention and drug treatment of four disease topics. We then processed the citations with Semantic MEDLINE, enhanced with the dynamic summarization method. We also processed the citations with a conventional summarization method, as well as with a baseline procedure. We evaluated the results using clinician-vetted reference standards built from recommendations in a commercial decision support product, DynaMed. Results For the drug treatment data, Semantic MEDLINE enhanced with dynamic summarization achieved average recall and precision scores of 0.848 and 0.377, while conventional summarization produced 0.583 average recall and 0.712 average precision, and the baseline method yielded average recall and precision values of 0.252 and 0.277. For the prevention data, Semantic MEDLINE enhanced with dynamic summarization achieved average recall and precision scores of 0.655 and 0.329. The baseline technique resulted in recall and precision scores of 0.269 and 0.247. No conventional Semantic MEDLINE method accommodating summarization for prevention exists. Conclusion Semantic MEDLINE with dynamic summarization outperformed conventional summarization in terms of recall, and outperformed the baseline method in both recall and precision. This new approach to text summarization demonstrates potential in identifying decision support data for multiple needs.
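    The recall and precision figures reported above follow the standard set-based definitions against a clinician-vetted reference standard. A minimal sketch, with hypothetical finding names for illustration:

```python
# Recall = fraction of reference items the system found;
# precision = fraction of system output that is in the reference.
def recall_precision(system, reference):
    tp = len(system & reference)          # true positives
    recall = tp / len(reference)
    precision = tp / len(system)
    return recall, precision

# Hypothetical drug-treatment findings for one disease topic.
reference = {"metformin", "insulin", "sulfonylurea", "exercise"}
system = {"metformin", "insulin", "statin"}
r, p = recall_precision(system, reference)
print(round(r, 3), round(p, 3))  # 0.5 0.667
```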

  17. Office of Civilian Radioactive Waste Management Program Cost and Schedule Baseline

    International Nuclear Information System (INIS)

    1992-09-01

    The purpose of this document is to establish quantitative expressions of proposed costs and schedule to serve as a basis for measurement of program performance. It identifies the components of the Program Cost and Schedule Baseline (PCSB) that will be subject to change control by the Executive (Level 0) and Program (Level 1) Change Control Boards (CCBs) and establishes their baseline values. This document also details PCSB reporting, monitoring, and corrective action requirements. The Program technical baseline contained in the Waste Management System Description (WMSD), the Waste Management System Requirements (WMSR), and the Physical System Requirements documents provides the technical basis for the PCSB. Changes to the PCSB will be approved by the Program Change Control Board (PCCB). In addition to the PCCB, the Energy System Acquisition Advisory Board Baseline CCB (ESAAB BCCB) will perform control functions relating to Total Project Cost (TPC) and major schedule milestones for the Yucca Mountain Site Characterization Project and the Monitored Retrievable Storage (MRS) Project.

  18. 75 FR 67768 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Baseline...

    Science.gov (United States)

    2010-11-03

    ... elements of I2P2 among establishments. The OSHA also proposes to conduct case study interviews with... more than 10 employees. Finally, the OSHA proposes to conduct case study interviews with government... Administration (OSHA) sponsored information collection request (ICR), ``Baseline Safety and Health Practices...

  19. Language Model Adaptation Using Machine-Translated Text for Resource-Deficient Languages

    Directory of Open Access Journals (Sweden)

    Sadaoki Furui

    2009-01-01

    Full Text Available Text corpus size is an important issue when building a language model (LM). This is a particularly important issue for languages where little data is available. This paper introduces an LM adaptation technique to improve an LM built using a small amount of task-dependent text with the help of a machine-translated text corpus. Icelandic speech recognition experiments were performed using data machine-translated (MT) from English to Icelandic on a word-by-word and sentence-by-sentence basis. LM interpolation using the baseline LM and an LM built from either word-by-word or sentence-by-sentence translated text reduced the word error rate significantly when manually obtained utterances used as a baseline were very sparse.
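    The LM interpolation described above linearly mixes the baseline LM with the LM built from machine-translated text. A minimal sketch, assuming unigram models and an illustrative interpolation weight (the paper's actual models and weights are not specified here):

```python
# Linear interpolation of two language models:
#   P_interp(w) = lam * P_baseline(w) + (1 - lam) * P_mt(w)
def interpolate(p_baseline, p_mt, lam):
    words = set(p_baseline) | set(p_mt)
    return {w: lam * p_baseline.get(w, 0.0) + (1 - lam) * p_mt.get(w, 0.0)
            for w in words}

# Toy unigram probabilities (hypothetical Icelandic words).
p_base = {"godan": 0.6, "dag": 0.4}          # sparse in-domain LM
p_mt = {"godan": 0.2, "dag": 0.3, "heim": 0.5}  # LM from MT text
p = interpolate(p_base, p_mt, 0.5)
print(round(p["godan"], 2))  # 0.4
```

    Note how the MT-derived LM contributes probability mass for words ("heim") unseen in the sparse in-domain data, which is the mechanism behind the reported error-rate reduction.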

  20. XML and Free Text.

    Science.gov (United States)

    Riggs, Ken Roger

    2002-01-01

    Discusses problems with marking free text, text that is either natural language or semigrammatical but unstructured, that prevent well-formed XML from marking text for readily available meaning. Proposes a solution to mark meaning in free text that is consistent with the intended simplicity of XML versus SGML. (Author/LRW)

  1. Detecting CP violation in a single neutrino oscillation channel at very long baselines

    International Nuclear Information System (INIS)

    Latimer, D. C.; Escamilla, J.; Ernst, D. J.

    2007-01-01

    We propose a way of detecting CP violation in a single neutrino oscillation channel at very long baselines (on the order of several thousands of kilometers), given precise knowledge of the smallest mass-squared difference. It is shown that CP violation can be characterized by a shift in L/E of the peak oscillation in the ν_e-ν_μ appearance channel, both in vacuum and in matter. In fact, matter effects enhance the shift at a fixed energy. We consider the case in which sub-GeV neutrinos are measured with varying baseline and also the case of a fixed baseline. For the varied baseline, accurate knowledge of the absolute neutrino flux would not be necessary; however, neutrinos must be distinguishable from antineutrinos. For the fixed baseline, it is shown that CP violation can be distinguished if the mixing angle θ_13 were known.

  2. Consideration of the baseline environment in examples of voluntary SEAs from Scotland

    International Nuclear Information System (INIS)

    Wright, Fiona

    2007-01-01

    Evidence from analysing and evaluating examples of three voluntary SEAs prepared in Scotland in the mid-to-late 1990s showed that different spatial and temporal scales were used when providing a baseline environment description. The SEAs analysed were prepared for: a wind farm siting programme that looked at national and short-term impacts; a land use plan that looked at regional and short-term impacts; and a transport plan that examined local and medium-term impacts. It was found that the two SEAs prepared by local government only considered impacts on the baseline environment within their jurisdictional boundaries, whilst the SEA prepared by the private business considered impacts on the national baseline. A mixture of baseline data about planning, economic, environmental and social issues was included in the SEAs; however, evidence suggested that each SEA only focussed on those baseline features that might be significantly affected by the proposal. Each SEA also made extensive use of existing baseline information available from a variety of sources, including local and central government records and information from statutory bodies. All of the SEAs acknowledged that baseline data deficiencies existed, and in certain cases steps were taken to obtain primary field data to help address these; however, it was also acknowledged that resource restrictions and decision-making deadlines limited the amount of primary baseline data that could be collected.

  3. The 1993 baseline biological studies and proposed monitoring plan for the Device Assembly Facility at the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Woodward, B.D.; Hunter, R.B.; Greger, P.D.; Saethre, M.B.

    1995-02-01

    This report contains baseline data and recommendations for future monitoring of plants and animals near the new Device Assembly Facility (DAF) on the Nevada Test Site (NTS). The facility is a large structure designed for safely assembling nuclear weapons. Baseline data was collected in 1993, prior to the scheduled beginning of DAF operations in early 1995. Studies were not performed prior to construction and part of the task of monitoring operational effects will be to distinguish those effects from the extensive disturbance effects resulting from construction. Baseline information on species abundances and distributions was collected on ephemeral and perennial plants, mammals, reptiles, and birds in the desert ecosystems within three kilometers (km) of the DAF. Particular attention was paid to effects of selected disturbances, such as the paved road, sewage pond, and the flood-control dike, associated with the facility. Radiological monitoring of areas surrounding the DAF is not included in this report.

  4. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2016-01-01

    One of the greatest challenges for sensorimotor adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. This information could guide individually customized countermeasures, which would enable more efficient use of crew time and provide better outcomes. The principal aim of this work is to look for baseline performance metrics that relate to locomotor adaptability. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations ("noise") in motor performance, as a predictor of individual adaptive capabilities.

  5. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. A baseline in the spectrum signal can induce uneven amplitude shifts across different wavenumbers and lead to poor results. Therefore, these amplitude shifts should be compensated before further analysis. Many algorithms are used to remove the baseline; however, fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through a continuous wavelet transform and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly used fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
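    The segment-interpolation step can be sketched as follows, with hand-picked anchor points standing in for the paper's wavelet-based feature detection (the anchor choice and signal are illustrative, not the paper's data):

```python
import numpy as np

# Estimate the baseline by piecewise-linear interpolation through points
# assumed to lie on the baseline, then subtract it from the signal.
def correct_baseline(x, y, anchor_idx):
    baseline = np.interp(x, x[anchor_idx], y[anchor_idx])
    return y - baseline

x = np.linspace(0, 10, 101)
signal = np.exp(-((x - 5) ** 2))      # a single spectral peak
drift = 0.3 * x                       # linear baseline drift
y = signal + drift
anchors = np.array([0, 10, 90, 100])  # indices assumed to sit on the baseline
corrected = correct_baseline(x, y, anchors)
```

    After correction the flat regions return to zero and the peak height is recovered; in AWFPSI the anchor points would instead come from continuous wavelet transform feature detection.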

  6. Stability analysis of geomagnetic baseline data obtained at Cheongyang observatory in Korea

    Directory of Open Access Journals (Sweden)

    S. M. Amran

    2017-07-01

    Full Text Available The stability of the baselines produced by Cheongyang (CYG) observatory from 2014 to 2016 is analysed. Step heights larger than 5 nT were found in the H and Z components in 2014 and 2015 due to magnetic noise in the absolute-measurement hut. In addition, a periodic modulation observed in the H and Z baseline curves was related to an annual temperature variation of about 20 °C in the fluxgate magnetometer hut. Improvement in data quality was evidenced by a small dispersion between successive measurements from June 2015 to the end of 2016. Moreover, the baseline was also improved by correcting the discontinuity in the H and Z baselines.

  7. A one-way text messaging intervention for obesity.

    Science.gov (United States)

    Ahn, Ahleum; Choi, Jaekyung

    2016-04-01

    Worldwide, there has been a startling increase in the number of people who are obese or overweight. Obesity increases the risk of cardiovascular disease and overall mortality. Mobile phone messaging is an important means of human communication globally. Because the mobile phone can be used anywhere at any time, mobile phone messaging has the potential to help manage obesity. We investigated the effectiveness of a one-way text messaging intervention for obesity. Participants' body mass index and waist circumference were measured at the beginning of the programme and again after 12 weeks. The text message group received text messages about exercise, dietary intake, and general information about obesity three times a week, while the control group did not receive any text messages from the study. Of the 80 participants, 25 subjects in the text message group and 29 participants in the control group completed the study. After adjusting for baseline body mass index, the body mass index was significantly lower in the text message group than in the control group (27.9 vs. 28.3; p = 0.02). After adjusting for baseline waist circumference, the difference in waist circumference between the text message group and the control group was not significant (93.4 vs. 94.6; p = 0.13). The one-way text messaging intervention was a simple and effective way to manage obesity. The one-way text messaging intervention may be a useful method for lifestyle modification in obese subjects. © The Author(s) 2015.

  8. Modeling and Simulation of Offshore Wind Power Platform for 5 MW Baseline NREL Turbine

    Directory of Open Access Journals (Sweden)

    Taufik Roni Sahroni

    2015-01-01

    Full Text Available This paper presents the modeling and simulation of an offshore wind power platform for oil and gas companies. Wind energy has become the fastest growing renewable energy in the world, and major gains in terms of energy generation are achievable when turbines are moved offshore. The objective of this project is to propose a new design of an offshore wind power platform. An offshore wind turbine (OWT) is composed of three main structures: the rotor/blades, the tower/nacelle, and the supporting structure. The modeling analysis was focused on the nacelle and supporting structure. The completed final design was analyzed using the finite element modeling tool ANSYS to obtain the structure's response to loading conditions and to ensure it complies with guidelines laid out by the classification authority Det Norske Veritas. As a result, a new model of the offshore wind power platform for the 5 MW Baseline NREL turbine was proposed.

  9. Baseline assessment of benthic communities of the Flower Garden Banks (2010 - present): 2011

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  10. Baseline assessment of fish communities of the Flower Garden Banks (2010 - present): 2011

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  11. Homography Propagation and Optimization for Wide-Baseline Street Image Interpolation.

    Science.gov (United States)

    Nie, Yongwei; Zhang, Zhensong; Sun, Hanqiu; Su, Tan; Li, Guiqing

    2017-10-01

    Wide-baseline street image interpolation is useful but very challenging. Existing approaches either rely on heavyweight 3D reconstruction or computationally intensive deep networks. We present a lightweight and efficient method which uses simple homography computing and refining operators to estimate piecewise smooth homographies between input views. To achieve this goal, we show how to combine homography fitting and homography propagation based on reliable and unreliable superpixel discrimination. Such a combination, rather than using homography fitting alone, dramatically increases the accuracy and robustness of the estimated homographies. Then, we integrate the concepts of homography and mesh warping, and propose a novel homography-constrained warping formulation which enforces smoothness between neighboring homographies by utilizing the first-order continuity of the warped mesh. This further eliminates small artifacts of overlapping, stretching, etc. The proposed method is lightweight and flexible, and allows wide-baseline interpolation. It improves the state of the art and demonstrates that homography computation suffices for interpolation. Experiments on city and rural datasets validate the efficiency and effectiveness of our method.
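    The homography-fitting building block referred to above can be sketched with a standard direct linear transform (DLT); the paper's propagation and mesh-warping layers are not reproduced here, and the point correspondences below are illustrative:

```python
import numpy as np

# Estimate the 3x3 homography H mapping src points to dst points via DLT:
# stack two linear constraints per correspondence and take the null space
# of the resulting matrix (last right-singular vector of its SVD).
def fit_homography(src, dst):
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]  # normalize so H[2,2] = 1

src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(0, 0), (2, 0), (2, 2), (0, 2)]  # a pure 2x scaling
H = fit_homography(src, dst)
```

    A piecewise-smooth estimate, as in the paper, would fit one such homography per reliable superpixel and propagate it to unreliable ones.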

  12. Shifted Baselines Reduce Willingness to Pay for Conservation

    Directory of Open Access Journals (Sweden)

    Loren McClenachan

    2018-02-01

    Full Text Available A loss of memory of past environmental degradation has resulted in shifted baselines, which may result in conservation and restoration goals that are less ambitious than if stakeholders had full knowledge of ecosystem potential. However, the link between perception of baseline states and support for conservation planning has not been tested empirically. Here, we investigate how perceptions of change in coral reef ecosystems affect stakeholders' willingness to pay (WTP) for the establishment of protected areas. Coral reefs are experiencing rapid, global change that is observable by the public, and therefore provide an ideal ecosystem to test links between beliefs about baseline states and willingness to support conservation. Our survey respondents perceived change to coral reef communities across six variables: coral abundance, fish abundance, fish diversity, fish size, sedimentation, and water pollution. Respondents who accurately perceived declines in reef health had significantly higher WTP for protected areas (US $256.80 vs. $102.50 per year), suggesting that shifted baselines may reduce engagement with conservation efforts. If WTP translates to engagement, this suggests that goals for restoration and recovery are likely to be more ambitious if the public is aware of long-term change. Therefore, communicating the scope and depth of environmental problems is essential in engaging the public in conservation.

  13. Performance of JPEG Image Transmission Using Proposed Asymmetric Turbo Code

    Directory of Open Access Journals (Sweden)

    Siddiqi Mohammad Umar

    2007-01-01

    Full Text Available This paper gives the results of a simulation study on the performance of JPEG image transmission over AWGN and Rayleigh fading channels using typical and proposed asymmetric turbo codes for error control coding. The baseline JPEG algorithm is used to compress a QCIF ("Suzie") image. The recursive systematic convolutional (RSC) encoder with generator polynomials (13/11) in decimal and a 3G interleaver are used for the typical WCDMA and CDMA2000 turbo codes. The proposed asymmetric turbo code uses generator polynomials (13/11; 13/9) in decimal and a code-matched interleaver. The effect of the interleaver in the proposed asymmetric turbo code is studied using weight distribution and simulation. The simulation results and the performance bound for the proposed asymmetric turbo code for the given frame length and code rate with a Log-MAP decoder over an AWGN channel are compared with the typical system. From the simulation results, it is observed that image transmission using the proposed asymmetric turbo code performs better than that with the typical system.

  14. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    Science.gov (United States)

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalances may jeopardize the validity of statistical inferences if they occur on prognostic factors. Thus, the diagnosis of such an imbalance is essential to adjust the statistical analysis if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselection of the covariates to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
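    The diagnostic idea works as follows: fit a PS model predicting arm membership from baseline covariates, then examine its c-statistic (the AUC, i.e. the probability that a randomly chosen arm-1 unit has a higher PS than a randomly chosen arm-0 unit). A c-statistic near 0.5 suggests balance; values well above 0.5 flag global imbalance. A minimal sketch, taking PS values as given and computing the c-statistic by pairwise concordance (the PS values below are illustrative):

```python
# c-statistic (AUC) of a propensity score model: fraction of
# (arm-1, arm-0) pairs in which the arm-1 unit has the higher score,
# counting ties as half-concordant.
def c_statistic(scores, arms):
    ones = [s for s, a in zip(scores, arms) if a == 1]
    zeros = [s for s, a in zip(scores, arms) if a == 0]
    concordant = sum((o > z) + 0.5 * (o == z) for o in ones for z in zeros)
    return concordant / (len(ones) * len(zeros))

# Balanced: the PS carries no information about arm membership.
print(c_statistic([0.5, 0.5, 0.5, 0.5], [0, 1, 0, 1]))  # 0.5
# Imbalanced: arm-1 units systematically score higher.
print(c_statistic([0.2, 0.3, 0.7, 0.8], [0, 0, 1, 1]))  # 1.0
```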

  15. Super-NOvA a long-baseline neutrino experiment with two off-axis detectors

    CERN Document Server

    Requejo, O M; Pascoli, S; Requejo, Olga Mena; Palomares-Ruiz, Sergio; Pascoli, Silvia

    2005-01-01

    Establishing the neutrino mass hierarchy is one of the fundamental questions that will have to be addressed in the near future. Its determination could be obtained with long-baseline experiments but typically suffers from degeneracies with other neutrino parameters. We consider here the NOvA experiment configuration and propose to place a second off-axis detector with a shorter baseline, such that, by exploiting matter effects, the type of neutrino mass hierarchy could be determined with only the neutrino run. We show that the determination of this parameter is free of degeneracies, provided the ratio L/E, where L is the baseline and E is the neutrino energy, is the same for both detectors.

  16. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
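    The registration-free idea rests on the fact that the distance between two feature points is invariant to the scan's coordinate frame, so corresponding baseline lengths can be compared across epochs without aligning the scans. A minimal sketch (point coordinates, indexing, and tolerance are illustrative, not the paper's data):

```python
import math

# Within one scan, form "baselines" connecting every pair of feature points
# and record their lengths, keyed by the point-index pair.
def baseline_lengths(points):
    n = len(points)
    return {(i, j): math.dist(points[i], points[j])
            for i in range(n) for j in range(i + 1, n)}

# Compare corresponding baseline lengths across two epochs; report pairs
# whose length changed by more than a tolerance (in scene units).
def detect_changes(epoch1, epoch2, tol=0.01):
    l1, l2 = baseline_lengths(epoch1), baseline_lengths(epoch2)
    return {pair: l2[pair] - l1[pair]
            for pair in l1 if abs(l2[pair] - l1[pair]) > tol}

# Same scene in two arbitrary frames; point 2 moved by 5 cm between epochs.
scan_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
scan_b = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.05, 0.0)]
print(detect_changes(scan_a, scan_b))  # only baselines touching point 2 change
```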

  17. Scheme for generating and transporting THz radiation to the X-ray experimental floor at LCLS baseline

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2011-08-15

    This paper describes a novel scheme for integrating a coherent THz source into the baseline of the LCLS facility. Any method relying on the spent electron beam downstream of the baseline undulator should provide a way of transporting the radiation up to the experimental floor. Here we propose to use the dump-area access maze. In this way the THz output must propagate with limited size for at least one hundred meters in a maze, following many turns, to reach the near experimental hall. The use of a standard, discrete, open beam waveguide formed by periodic reflectors, that is, a mirror guide, would lead to an unacceptable size of the system. To avoid these problems, in this paper we propose an alternative approach based on periodically spaced metallic screens with holes. This quasi-optical transmission line is referred to as an iris line. We present complete calculations for the iris line using both analytical and numerical methods, which we find in good agreement. We present a design of a THz edge-radiation source based on the use of an iris line. The proposed setup requires little cost or time to implement at the LCLS baseline, and can be used at other facilities as well. The edge-radiation source is limited in the maximally achievable field strength at the sample. An extension based on the use of an undulator in the presence of the iris line, which is feasible at LCLS energies, is proposed as a possible upgrade of the baseline THz source. (orig.)

  18. Scheme for generating and transporting THz radiation to the X-ray experimental floor at LCLS baseline

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2011-08-01

    This paper describes a novel scheme for integrating a coherent THz source into the baseline of the LCLS facility. Any method relying on the spent electron beam downstream of the baseline undulator should provide a way of transporting the radiation up to the experimental floor. Here we propose to use the dump-area access maze. In this way the THz output must propagate with limited size for at least one hundred meters in a maze, following many turns, to reach the near experimental hall. The use of a standard, discrete, open beam waveguide formed by periodic reflectors, that is, a mirror guide, would lead to an unacceptable size of the system. To avoid these problems, in this paper we propose an alternative approach based on periodically spaced metallic screens with holes. This quasi-optical transmission line is referred to as an iris line. We present complete calculations for the iris line using both analytical and numerical methods, which we find in good agreement. We present a design of a THz edge-radiation source based on the use of an iris line. The proposed setup requires little cost or time to implement at the LCLS baseline, and can be used at other facilities as well. The edge-radiation source is limited in the maximally achievable field strength at the sample. An extension based on the use of an undulator in the presence of the iris line, which is feasible at LCLS energies, is proposed as a possible upgrade of the baseline THz source. (orig.)

  19. Pilot evaluation of the text4baby mobile health program

    Directory of Open Access Journals (Sweden)

    Evans William Douglas

    2012-11-01

    Full Text Available Abstract Background Mobile phone technologies for health promotion and disease prevention have evolved rapidly, but few studies have tested the efficacy of mobile health in full-fledged programs. Text4baby is an example of mobile health based on behavioral theory, and it delivers text messages to traditionally underserved pregnant women and new mothers to change their health, health care beliefs, practices, and behaviors in order to improve clinical outcomes. The purpose of this pilot evaluation study is to assess the efficacy of this text messaging campaign. Methods We conducted a randomized pilot evaluation study. All participants were pregnant women first presenting for care at the Fairfax County, Virginia Health Department. We randomized participants to enroll in text4baby and receive usual health care (intervention), or continue simply to receive usual care (control). We then conducted a 24-item survey by telephone of attitudes and behaviors related to text4baby. We surveyed participants at baseline, before text4baby was delivered to the intervention group, and at follow-up at approximately 28 weeks of baby’s gestational age. Results We completed 123 baseline interviews in English and in Spanish. Overall, the sample was predominantly of Hispanic origin (79.7%), with an average age of 27.6 years. We completed 90 follow-up interviews, and achieved a 73% retention rate. We used a logistic generalized estimating equation model to evaluate intervention effects on measured outcomes. We found a significant effect of text4baby intervention exposure on increased agreement with the attitude statement “I am prepared to be a new mother” (OR = 2.73, CI = 1.04, 7.18, p = 0.042) between baseline and follow-up. For those who had attained a high school education or greater, we observed a significantly higher overall agreement to attitudes against alcohol consumption during pregnancy (OR = 2.80, CI = 1.13, 6.90, p = 0.026). We also observed a

  20. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    Science.gov (United States)

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
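
    The low-order polynomial interference that BCC-PLS targets can be illustrated with just its baseline-removal half: fitting and subtracting a first-order (linear) baseline by ordinary least squares. This is a hedged simplification only; the paper's actual contribution is embedding such constraints into the PLS weight selection itself, which is not reproduced here, and the function name is mine:

```python
def remove_linear_baseline(spectrum):
    """Remove a first-order (linear) baseline from a spectrum by ordinary
    least squares: fit intensity ~ slope * index + intercept, subtract it."""
    n = len(spectrum)
    mean_x = (n - 1) / 2.0
    mean_y = sum(spectrum) / n
    sxx = sum((x - mean_x) ** 2 for x in range(n))
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in enumerate(spectrum))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return [y - (slope * x + intercept) for x, y in enumerate(spectrum)]

# A spectrum that is purely a sloped line is reduced to (numerically) zero.
residual = remove_linear_baseline([2 * i + 5 for i in range(10)])
```

    Higher polynomial orders would be handled the same way with more basis terms; BCC-PLS avoids choosing the correction explicitly by constraining the PLS weights instead.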

  1. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

    International Nuclear Information System (INIS)

    Swita, W.R.

    1998-01-01

    This document provides a summary of the proposed Tank Waste Remediation System Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost) developed to demonstrate the Tank Waste Remediation System contractor's Readiness-to-Proceed in support of the Phase 1B mission

  2. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are proposed. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical analysis text mining. This technique analyzes the syntax of the meta-data to uncover local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security (text encryption/decryption), biomedical, and marketing applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high and low quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
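
    The spectral angle mapper (SAM) comparison mentioned above has a standard closed form: the angle between two spectra treated as vectors, which is insensitive to overall brightness scaling. A minimal sketch (the function name and pure-Python style are illustrative, not taken from SigDB):

```python
import math

def spectral_angle(test, reference):
    """Spectral angle mapper (SAM): angle in radians between two spectra
    treated as vectors. Smaller angles indicate closer spectral shape,
    independent of overall magnitude."""
    dot = sum(t * r for t, r in zip(test, reference))
    norm_t = math.sqrt(sum(t * t for t in test))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_angle = max(-1.0, min(1.0, dot / (norm_t * norm_r)))
    return math.acos(cos_angle)

# A spectrum scaled by a constant has zero angle to the original, so the
# ranking depends on shape rather than measured intensity.
```

    In the validation scheme described above, each test spectrum would be ranked by its angle to the mean spectrum of the ideal population set.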

  3. Baseline recommendations for greenhouse gas mitigation projects in the electric power sector

    Energy Technology Data Exchange (ETDEWEB)

    Kartha, Sivan; Lazarus, Michael [Stockholm Environment Institute/Tellus Institute, Boston, MA (United States); Bosi, Martina [International Energy Agency, Paris, 75 (France)

    2004-03-01

    The success of the Clean Development Mechanism (CDM) and other credit-based emission trading regimes depends on effective methodologies for quantifying a project's emissions reductions. The key methodological challenge lies in estimating the project's counterfactual emissions baseline, balancing the need for accuracy, transparency, and practicality. Baseline standardisation (e.g. methodology, parameters and/or emission rate) can be a means to achieve these goals. This paper compares specific options for developing standardised baselines for the electricity sector - a natural starting point for baseline standardisation given the magnitude of the emissions reduction opportunities. The authors review fundamental assumptions that baseline studies have made with respect to estimating the generation sources avoided by CDM or other emission-reducing projects. Typically, studies have assumed that such projects affect either the operation of existing power plants (the operating margin) or the construction of new generation facilities (the build margin). The authors show that both effects are important to consider and thus recommend a combined margin approach for most projects, based on grid-specific data. They propose a three-category framework, according to projects' relative scale and environmental risk. (Author)

  4. Scheme for generation of highly monochromatic X-rays from a baseline XFEL undulator

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2010-03-01

    One goal of XFEL facilities is the production of narrow bandwidth X-ray radiation. The self-seeding scheme was proposed to obtain a bandwidth narrower than that achievable with conventional X-ray SASE FELs. A self-seeded FEL is composed of two undulators separated by a monochromator and an electron beam bypass that must compensate for the path delay of X-rays in the monochromator. This leads to a long bypass, with a length in the order of 40-60 m, which requires modifications of the baseline undulator configuration. As an attempt to get around this obstacle, together with a study of the self-seeding scheme for the European XFEL, here we propose a novel technique based on a pulse doubler concept. Using a crystal monochromator installed within a short magnetic chicane in the baseline undulator, it is possible to decrease the bandwidth of the radiation well beyond the XFEL design, down to 10⁻⁵. The magnetic chicane can be installed without any perturbation of the XFEL focusing structure, and does not interfere with the baseline mode of operation. We present a feasibility study and give examples using the parameters of the SASE2 line of the European XFEL. (orig.)

  5. Digital signal processing reveals circadian baseline oscillation in majority of mammalian genes.

    Directory of Open Access Journals (Sweden)

    Andrey A Ptitsyn

    2007-06-01

    Full Text Available In mammals, circadian periodicity has been described for gene expression in the hypothalamus and multiple peripheral tissues. It is accepted that 10%-15% of all genes oscillate in a daily rhythm, regulated by an intrinsic molecular clock. Statistical analyses of periodicity are limited by the small size of datasets and high levels of stochastic noise. Here, we propose a new approach applying digital signal processing algorithms separately to each group of genes oscillating in the same phase. Combined with the statistical tests for periodicity, this method identifies circadian baseline oscillation in almost 100% of all expressed genes. Consequently, circadian oscillation in gene expression should be evaluated in any study related to biological pathways. Changes in gene expression caused by mutations or regulation of environmental factors (such as photic stimuli or feeding) should be considered in the context of changes in the amplitude and phase of genetic oscillations.
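
    A digital-signal-processing test for a fixed-period oscillation can be as simple as measuring the Fourier power at the circadian frequency of an evenly sampled expression time series. The sketch below is illustrative only (function name and sampling assumptions are mine, not the authors' algorithm):

```python
import math

def power_at_period(series, sample_hours, period_hours=24.0):
    """Power of the discrete Fourier component at a given period
    (e.g. circadian, 24 h) for an evenly sampled time series."""
    n = len(series)
    mean = sum(series) / n
    omega = 2 * math.pi * sample_hours / period_hours
    re = sum((y - mean) * math.cos(omega * k) for k, y in enumerate(series))
    im = sum((y - mean) * math.sin(omega * k) for k, y in enumerate(series))
    return (re * re + im * im) / n

# A gene sampled every 4 h over two days with a pure 24 h rhythm shows
# strong power at 24 h and essentially none at 12 h.
expression = [math.cos(math.pi * k / 3) for k in range(12)]
```

    Genes would then be grouped by phase (the angle of the complex component) before applying the statistical periodicity tests described in the abstract.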

  6. A comparison of baseline methodologies for 'Reducing Emissions from Deforestation and Degradation'

    Directory of Open Access Journals (Sweden)

    Kok Kasper

    2009-07-01

    Full Text Available Abstract Background A mechanism for emission reductions from deforestation and degradation (REDD) is very likely to be included in a future climate agreement. The choice of REDD baseline methodologies will crucially influence the environmental and economic effectiveness of the climate regime. We compare three different historical baseline methods and one innovative dynamic model baseline approach to appraise their applicability under a future REDD policy framework using a weighted multi-criteria analysis. Results The results show that each baseline method has its specific strengths and weaknesses. Although the dynamic model allows for the best environmental and for comparatively good economic performance, its high demand for data and technical capacity limit the current applicability in many developing countries. Conclusion The adoption of a multi-tier approach will allow countries to select the baseline method best suiting their specific capabilities and data availability while simultaneously ensuring scientific transparency, environmental effectiveness and broad political support.

  7. Bilingual Text4Walking Food Service Employee Intervention Pilot Study.

    Science.gov (United States)

    Buchholz, Susan Weber; Ingram, Diana; Wilbur, JoEllen; Fogg, Louis; Sandi, Giselle; Moss, Angela; Ocampo, Edith V

    2016-06-01

    Half of all adults in the United States do not meet the level of recommended aerobic physical activity. Physical activity interventions are now being conducted in the workplace. Accessible technology, in the form of widespread usage of cell phones and text messaging, is available for promoting physical activity. The purposes of this study, which was conducted in the workplace, were to determine (1) the feasibility of implementing a bilingual 12-week Text4Walking intervention and (2) the effect of the Text4Walking intervention on change in physical activity and health status in a food service employee population. Before conducting the study reported here, the Text4Walking research team developed a database of motivational physical activity text messages in English. Because Hispanic or Latino adults compose one-quarter of all adults employed in the food service industry, the Text4Walking team translated the physical activity text messages into Spanish. This pilot study was guided by the Physical Activity Health Promotion Framework and used a 1-group 12-week pre- and posttest design with food service employees who self-reported as being sedentary. The aim of the study was to increase the number of daily steps over the baseline by 3000 steps. Three physical activity text messages were delivered weekly. In addition, participants received 3 motivational calls during the study. SPSS version 19.0 and R 3.0 were used to perform the data analysis. There were 33 employees who participated in the study (57.6% female), with a mean age of 43.7 years (SD 8.4). The study included 11 Hispanic or Latino participants, 8 of whom requested that the study be delivered in Spanish. There was a 100% retention rate in the study. At baseline, the participants walked 102 (SD 138) minutes/day (per self-report). This rate increased significantly (P=.008) to 182 (SD 219) minutes/day over the course of the study. 
The participants had a baseline mean of 10,416 (SD 5097) steps, which also increased

  8. Historical baselines of coral cover on tropical reefs as estimated by expert opinion

    Directory of Open Access Journals (Sweden)

    Tyler D. Eddy

    2018-01-01

    Full Text Available Coral reefs are important habitats that represent global marine biodiversity hotspots and provide important benefits to people in many tropical regions. However, coral reefs are becoming increasingly threatened by climate change, overfishing, habitat destruction, and pollution. Historical baselines of coral cover are important to understand how much coral cover has been lost, e.g., to avoid the ‘shifting baseline syndrome’. There are few quantitative observations of coral reef cover prior to the industrial revolution, and therefore baselines of coral reef cover are difficult to estimate. Here, we use expert and ocean-user opinion surveys to estimate baselines of global coral reef cover. The overall mean estimated baseline coral cover was 59% (±19% standard deviation), compared to an average of 58% (±18% standard deviation) estimated by professional scientists. We did not find evidence of the shifting baseline syndrome, whereby respondents who first observed coral reefs more recently report lower estimates of baseline coral cover. These estimates of historical coral reef baseline cover are important for scientists, policy makers, and managers to understand the extent to which coral reefs have become depleted and to set appropriate recovery targets.

  9. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches in computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. Here, a novel scheme is presented called double topological relationship consistency (DCTR). The combination of double topological configuration includes the consistency of first topological relationship (CFTR) and the consistency of second topological relationship (CSTR). It not only sets up a more advanced model of matching, but also discards mismatches by iteratively computing the fitness of the feature matches, overcoming many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. By this method, we can obtain correspondences with high precision on wide baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on the image pairs.

  10. Layout-aware text extraction from full-text PDF of scientific articles

    Directory of Open Access Journals (Sweden)

    Ramakrishnan Cartic

    2012-05-01

    Full Text Available Abstract Background The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the ‘Layout-Aware PDF Text Extraction’ (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Results Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89, and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF

  11. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  12. Terrestrial gamma radiation baseline mapping using ultra low density sampling methods

    International Nuclear Information System (INIS)

    Kleinschmidt, R.; Watson, D.

    2016-01-01

    Baseline terrestrial gamma radiation maps are indispensable for providing basic reference information that may be used in assessing the impact of a radiation related incident, performing epidemiological studies, remediating land contaminated with radioactive materials, assessment of land use applications and resource prospectivity. For a large land mass, such as Queensland, Australia (over 1.7 million km²), it is prohibitively expensive and practically difficult to undertake detailed in-situ radiometric surveys of this scale. It is proposed that an existing, ultra-low density sampling program already undertaken for the purpose of a nationwide soil survey project be utilised to develop a baseline terrestrial gamma radiation map. Geoelement data derived from the National Geochemistry Survey of Australia (NGSA) was used to construct a baseline terrestrial gamma air kerma rate map, delineated by major drainage catchments, for Queensland. Three drainage catchments (sampled at the catchment outlet) spanning low, medium and high radioelement concentrations were selected for validation of the methodology using radiometric techniques including in-situ measurements and soil sampling for high resolution gamma spectrometry, and comparative non-radiometric analysis. A Queensland mean terrestrial air kerma rate, as calculated from the NGSA outlet sediment uranium, thorium and potassium concentrations, of 49 ± 69 nGy h⁻¹ (n = 311, 3σ 99% confidence level) is proposed as being suitable for use as a generic terrestrial air kerma rate background range. Validation results indicate that catchment outlet measurements are representative of the range of results obtained across the catchment and that the NGSA geoelement data is suitable for calculation and mapping of terrestrial air kerma rate. - Highlights: • A baseline terrestrial air kerma map of Queensland, Australia was developed using geochemical data from a major drainage catchment ultra-low density sampling program
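
    The conversion from soil radioelement concentrations to terrestrial air kerma rate is commonly done with UNSCEAR-style dose conversion coefficients. The sketch below uses the widely quoted coefficients for activity concentrations in Bq/kg; these are assumed here for illustration, and the study's exact conversion may differ:

```python
def air_kerma_rate(c_u, c_th, c_k):
    """Terrestrial absorbed dose rate in air (nGy/h) 1 m above ground from
    soil activity concentrations (Bq/kg) of the 238U series, the 232Th
    series, and 40K, using UNSCEAR-style conversion coefficients."""
    return 0.462 * c_u + 0.604 * c_th + 0.0417 * c_k

# World-average soil activities (roughly 33, 45 and 420 Bq/kg) give a dose
# rate near the often-quoted global average of ~60 nGy/h.
world_average = air_kerma_rate(33, 45, 420)
```

    Applied per catchment-outlet sample, a mean and spread like the paper's 49 ± 69 nGy h⁻¹ can then be computed over the mapped region.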

  13. Layout-aware text extraction from full-text PDF of scientific articles.

    Science.gov (United States)

    Ramakrishnan, Cartic; Patnia, Abhishek; Hovy, Eduard; Burns, Gully Apc

    2012-05-28

    The Portable Document Format (PDF) is the most commonly used file format for online scientific publications. The absence of effective means to extract text from these PDF files in a layout-aware manner presents a significant challenge for developers of biomedical text mining or biocuration informatics systems that use published literature as an information source. In this paper we introduce the 'Layout-Aware PDF Text Extraction' (LA-PDFText) system to facilitate accurate extraction of text from PDF files of research articles for use in text mining applications. Our paper describes the construction and performance of an open source system that extracts text blocks from PDF-formatted full-text research articles and classifies them into logical units based on rules that characterize specific sections. The LA-PDFText system focuses only on the textual content of the research articles and is meant as a baseline for further experiments into more advanced extraction methods that handle multi-modal content, such as images and graphs. The system works in a three-stage process: (1) Detecting contiguous text blocks using spatial layout processing to locate and identify blocks of contiguous text, (2) Classifying text blocks into rhetorical categories using a rule-based method and (3) Stitching classified text blocks together in the correct order resulting in the extraction of text from section-wise grouped blocks. We show that our system can identify text blocks and classify them into rhetorical categories with Precision = 0.96, Recall = 0.89, and F1 = 0.91. We also present an evaluation of the accuracy of the block detection algorithm used in step 2. Additionally, we have compared the accuracy of the text extracted by LA-PDFText to the text from the Open Access subset of PubMed Central. We then compared this accuracy with that of the text extracted by the PDF2Text system, commonly used to extract text from PDF. Finally, we discuss preliminary error analysis for

  14. NETWORK DESIGN IN CLOSE-RANGE PHOTOGRAMMETRY WITH SHORT BASELINE IMAGES

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-08-01

    Full Text Available The availability of automated software for image-based 3D modelling has changed the way people acquire images for photogrammetric applications. Short baseline images are required to match image points with SIFT-like algorithms, obtaining more images than those necessary for “old fashioned” photogrammetric projects based on manual measurements. This paper describes some considerations on network design for short baseline image sequences, especially on precision and reliability of bundle adjustment. Simulated results reveal that the large number of 3D points used for image orientation has very limited impact on network precision.

  15. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution to be reached with respect to traditional pulse-width limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released in operation in April 2015 and the second CryoSat reprocessing campaign was jointly initiated, benefiting from the upgrades implemented in the IPF1 processing chain but also from some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information for what concerns the mispointing angles and the calibration corrections. This poster will thus detail the evolutions that are currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  16. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    Science.gov (United States)

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
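
    The core of the abstract's approach can be sketched as: estimate the baseline trend robustly (e.g. a Theil-Sen median slope), subtract it from all observations, then take Kendall's rank correlation between phase (baseline vs. treatment) and the corrected scores. This is an illustrative simplification under assumed details, not Tarlow's exact published procedure (which, for instance, first tests whether the baseline trend is significant):

```python
def kendall_tau(x, y):
    """Kendall rank correlation (tau-a) between two equal-length sequences."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def baseline_corrected_tau(baseline, treatment):
    """Sketch: remove the baseline's Theil-Sen trend from all observations,
    then correlate phase (0 = baseline, 1 = treatment) with the corrected
    scores. Unconditional detrending is a simplifying assumption here."""
    # Theil-Sen slope of the baseline phase: median of all pairwise slopes
    slopes = sorted((baseline[j] - baseline[i]) / (j - i)
                    for i in range(len(baseline))
                    for j in range(i + 1, len(baseline)))
    mid = len(slopes) // 2
    slope = slopes[mid] if len(slopes) % 2 else (slopes[mid - 1] + slopes[mid]) / 2
    scores = list(baseline) + list(treatment)
    corrected = [y - slope * t for t, y in enumerate(scores)]
    phase = [0] * len(baseline) + [1] * len(treatment)
    return kendall_tau(phase, corrected)
```

    Because tau is rank-based, the corrected effect size stays bounded in [-1, +1], which is one of the limitations of Tau-U this statistic addresses.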

  17. Text Messaging to Improve Disease Management in Patients With Painful Diabetic Peripheral Neuropathy.

    Science.gov (United States)

    Bauer, Victoria; Goodman, Nancy; Lapin, Brittany; Cooley, Camille; Wang, Ed; Craig, Terri L; Glosner, Scott E; Juhn, Mark S; Cappelleri, Joseph C; Sadosky, Alesia B; Masi, Christopher

    2018-06-01

    Purpose The purpose of the study was to determine the impact of educational text messages on diabetes self-management activities and outcomes in patients with painful diabetic peripheral neuropathy (pDPN). Methods Patients with pDPN identified from a large integrated health system who agreed to participate were randomized to 6 months of usual care (UC) or UC plus twice-daily diabetes self-management text messages (UC+TxtM). Outcomes included the Pain Numerical Rating Scale, Summary of Diabetes Self-Care Activities (SDSCA), questions on diabetes health beliefs, and glycated hemoglobin (A1C). Changes from baseline were evaluated at 6 months and compared between groups. Results Demographic characteristics were balanced between groups (N = 62; 53% female, mean age = 63 years, 94% type 2 diabetes), as were baseline measures. After 6 months, pain decreased with UC+TxtM from 6.3 to 5.5 and with UC from 6.5 to 6.0, with no difference between groups. UC+TxtM but not UC was associated with significant improvements from baseline on all SDSCA subscales. On diabetes health beliefs, UC+TxtM patients reported significantly increased benefits and reduced barriers and susceptibility relative to UC at 6 months. A1C declined in both groups, but neither change was significant relative to baseline. Conclusions Patients with pDPN who receive twice-daily text messages regarding diabetes management reported reduced pain relative to baseline, although this change was not significant compared with usual care. In addition, text messaging was associated with increased self-management activities and improved diabetes health beliefs and total self-care. These results warrant further investigation.

  18. Informed baseline subtraction of proteomic mass spectrometry data aided by a novel sliding window algorithm.

    Science.gov (United States)

    Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J

    2016-01-01

    Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of
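
    For contrast with the paper's 'continuous' line segment algorithm, a naive sliding-window baseline subtraction looks like the sketch below: the baseline at each point is estimated as the local minimum within a window, then subtracted. This is the slow approach the paper improves upon, and the function name and window parameter are illustrative:

```python
def subtract_baseline(signal, half_window):
    """Naive sliding-window baseline subtraction: estimate the baseline at
    each point as the minimum intensity within +/- half_window points, then
    subtract it from the signal. O(n * window) per spectrum."""
    n = len(signal)
    baseline = []
    for i in range(n):
        lo = max(0, i - half_window)
        hi = min(n, i + half_window + 1)
        baseline.append(min(signal[lo:hi]))
    return [s - b for s, b in zip(signal, baseline)]

# A flat spectrum is reduced to zero; an isolated peak on a flat baseline
# survives subtraction unchanged.
```

    The m/z-axis transformation described in the abstract exists precisely so that a single window width works across the whole spectrum, despite peaks widening at higher m/z.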

  19. Mass hierarchy sensitivity of medium baseline reactor neutrino experiments with multiple detectors

    Directory of Open Access Journals (Sweden)

    Hong-Xin Wang

    2017-05-01

    Full Text Available We report the neutrino mass hierarchy (MH) determination of medium baseline reactor neutrino experiments with multiple detectors, where the sensitivity of measuring the MH can be significantly improved by adding a near detector. The impact of the baseline and target mass of the near detector on the combined MH sensitivity is then studied thoroughly. The optimal selections of the baseline and target mass of the near detector are ∼12.5 km and ∼4 kton respectively, for a far detector with a target mass of 20 kton and a baseline of 52.5 km. As typical examples of future medium baseline reactor neutrino experiments, the optimal location and target mass of the near detector are selected for the specific configurations of JUNO and RENO-50. Finally, we discuss the distinct effects of the reactor antineutrino energy spectrum uncertainty for setups with a single detector and with double detectors, which indicate that the spectrum uncertainty can be well constrained in the presence of the near detector.

  20. Hydrogeology baseline study Aurora Mine

    International Nuclear Information System (INIS)

    1996-01-01

    A baseline hydrogeologic study was conducted in the area of Syncrude's proposed Aurora Mine in order to develop a conceptual regional hydrogeologic model for the area that could be used to understand groundwater flow conditions. Geologic information was obtained from over 2,000 coreholes and from data obtained between 1980 and 1996 regarding water level for the basal aquifer. A 3-D numerical groundwater flow model was developed to provide quantitative estimates of the potential environmental impacts of the proposed mining operations on the groundwater flow system. The information was presented in the context of a regional study area which encompassed much of the Athabasca Oil Sands Region, and a local study area which was defined by the lowlands of the Muskeg River Basin. Characteristics of the topography, hydrology, climate, geology, and hydrogeology of the region are described. The conclusion is that groundwater flow in the aquifer occurs mostly in a westerly direction beneath the Aurora Mine towards its inferred discharge location along the Athabasca River. Baseflow in the Muskeg River is mostly related to discharge from shallow surficial aquifers. Water in the river under baseflow conditions was fresh, of calcium-carbonate type, with very little indication of mineralization associated with deeper groundwater in the Aurora Mine area. 44 refs., 5 tabs., 31 figs

  1. The influence of drinking, texting, and eating on simulated driving performance.

    Science.gov (United States)

    Irwin, Christopher; Monement, Sophie; Desbrow, Ben

    2015-01-01

    Driving is a complex task and distractions such as using a mobile phone for the purpose of text messaging are known to have a significant impact on driving. Eating and drinking are common forms of distraction that have received less attention in relation to their impact on driving. The aim of this study was to further explore and compare the effects of a variety of distraction tasks (i.e., text messaging, eating, drinking) on simulated driving. Twenty-eight healthy individuals (13 female) participated in a crossover design study involving 3 experimental trials (separated by ≥24 h). In each trial, participants completed a baseline driving task (no distraction) before completing a second driving task involving one of 3 different distraction tasks (drinking 400 mL water, drinking 400 mL water and eating a 6-inch Subway sandwich, drinking 400 mL water and composing 3 text messages). Primary outcome measures of driving consisted of standard deviation of lateral position (SDLP) and reaction time to auditory and visual critical events. Subjective ratings of difficulty in performing the driving tasks were also collected at the end of the study to determine perceptions of distraction difficulty on driving. Driving tasks involving texting and eating were associated with significant impairment in driving performance measures for SDLP compared to baseline driving (46.0 ± 0.08 vs. 41.3 ± 0.06 cm and 44.8 ± 0.10 vs. 41.6 ± 0.07 cm, respectively), number of lane departures compared to baseline driving (10.9 ± 7.8 vs. 7.6 ± 7.1 and 9.4 ± 7.5 vs. 7.1 ± 7.0, respectively), and auditory reaction time compared to baseline driving (922 ± 95 vs. 889 ± 104 ms and 933 ± 101 vs. 901 ± 103 ms, respectively). No difference in SDLP (42.7 ± 0.08 vs. 42.5 ± 0.07 cm), number of lane departures (7.6 ± 7.7 vs. 7.0 ± 6.8), or auditory reaction time (891 ± 98 and 885 ± 89 ms) was observed in the drive involving the drink-only condition compared to the corresponding baseline drive
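    SDLP, the primary lane-keeping measure reported above, is simply the standard deviation of the vehicle's lateral offset sampled over the drive; a minimal sketch (the sampled positions below are invented for illustration):

```python
import statistics

def sdlp_cm(lateral_positions_cm):
    """Standard deviation of lateral position (SDLP) in cm, computed
    from the vehicle's lateral offset samples over a drive."""
    return statistics.stdev(lateral_positions_cm)

# hypothetical lateral offsets (cm) relative to the lane centre
positions = [0, 5, -5, 10, -10, 3, -3, 0]
weave = sdlp_cm(positions)
```

    A larger SDLP indicates more weaving within the lane, which is why the texting and eating conditions (SDLP ≈ 46 and 45 cm) read as impaired relative to baseline driving (≈ 41 cm).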

  2. Baseline assessment of fish and benthic communities of the Flower Garden Banks (NODC Accession 0118358)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  3. DairyBISS Baseline report

    NARCIS (Netherlands)

    Buizer, N.N.; Berhanu, Tinsae; Murutse, Girmay; Vugt, van S.M.

    2015-01-01

    This baseline report of the Dairy Business Information Service and Support (DairyBISS) project presents the findings of a baseline survey among 103 commercial farms and 31 firms and advisors working in the dairy value chain. Additional results from the survey among commercial dairy farms are

  4. The optimized baseline project: Reinventing environmental restoration at Hanford

    International Nuclear Information System (INIS)

    Goodenough, J.D.; Janaskie, M.T.; Kleinen, P.J.

    1994-01-01

    The U.S. Department of Energy Richland Operations Office (DOE-RL) is using a strategic planning effort (termed the Optimized Baseline Project) to develop a new approach to the Hanford Environmental Restoration program. This effort seeks to achieve a quantum leap improvement in performance through results oriented prioritization of activities. This effort was conducted in parallel with the renegotiation of the Tri-Party Agreement and provided DOE with an opportunity to propose innovative initiatives to promote cost effectiveness, accelerate progress in the Hanford Environmental Restoration Program and involve stakeholders in the decision-making process. The Optimized Baseline project is an innovative approach to program planning and decision-making in several respects. First, the process is a top down, value driven effort that responds to values held by DOE, the regulatory community and the public. Second, planning is conducted in a way that reinforces the technical management process at Richland, involves the regulatory community in substantive decisions, and includes the public. Third, the Optimized Baseline Project is being conducted as part of a sitewide Hanford initiative to reinvent Government. The planning process used for the Optimized Baseline Project has many potential applications at other sites and in other programs where there is a need to build consensus among diverse, independent groups of stakeholders and decisionmakers. The project has successfully developed and demonstrated an innovative approach to program planning that accelerates the pace of cleanup, involves the regulators as partners with DOE in priority setting, and builds public understanding and support for the program through meaningful opportunities for involvement

  5. Baseline rationing

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    The standard problem of adjudicating conflicting claims describes a situation in which a given amount of a divisible good has to be allocated among agents who hold claims against it exceeding the available amount. This paper considers more general rationing problems in which, in addition to claims...... to international protocols for the reduction of greenhouse emissions, or water distribution in drought periods. We define a family of allocation methods for such general rationing problems - called baseline rationing rules - and provide an axiomatic characterization for it. Any baseline rationing rule within...... the family is associated with a standard rule and we show that if the latter obeys some properties reflecting principles of impartiality, priority and solidarity, the former obeys them too....
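    As a rough sketch of how a baseline rationing rule of this kind can operate, the following assumes truncated baselines are covered first and the proportional rule serves as the associated standard rule; this is an illustration of the idea, not the paper's exact axiomatic definition.

```python
def proportional(claims, amount):
    """Standard proportional rule: split `amount` in proportion to claims."""
    total = sum(claims)
    if total == 0:
        return [0.0] * len(claims)
    return [amount * c / total for c in claims]

def baseline_rationing(claims, baselines, amount, rule=proportional):
    """Baseline rationing sketch: truncate each baseline by its claim,
    cover the truncated baselines if possible, then allocate the
    remainder over residual claims with the standard rule."""
    t = [min(b, c) for b, c in zip(baselines, claims)]
    if sum(t) >= amount:
        # not enough even for the baselines: ration the baselines themselves
        return rule(t, amount)
    residual = [c - ti for c, ti in zip(claims, t)]
    extra = rule(residual, amount - sum(t))
    return [ti + e for ti, e in zip(t, extra)]

# two agents claim 60 and 40 with baselines 20 and 30; only 70 is available
shares = baseline_rationing(claims=[60, 40], baselines=[20, 30], amount=70)
```

    Here both baselines are covered (20 + 30 = 50 ≤ 70) and the remaining 20 is split proportionally over the residual claims 40 and 10, giving shares of 36 and 34.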

  6. Baseline restoration using current conveyors

    International Nuclear Information System (INIS)

    Morgado, A.M.L.S.; Simoes, J.B.; Correia, C.M.

    1996-01-01

    Good performance of high resolution nuclear spectrometry systems at high pulse rates demands restoration of the baseline between pulses, in order to remove rate-dependent baseline shifts. This restoration is performed by circuits named baseline restorers (BLRs), which also remove low frequency noise such as power supply hum and detector microphonics. This paper presents simple circuits for baseline restoration based on a commercial current conveyor (CCII01). Tests were performed on two circuits with periodic trapezoidal shaped pulses in order to measure the baseline restoration for several pulse rates and restorer duty cycles. For the current conveyor based Robinson restorer, the peak shift was less than 10 mV for duty cycles up to 60%, at high pulse rates. Duty cycles up to 80% were also tested, with a maximum peak shift of 21 mV. The peak shift for the current conveyor based Grubic restorer was also measured; the maximum value found was 30 mV at 82% duty cycle. Keeping the duty cycle below 60% greatly improves the restorer performance. The ability of both baseline restorer architectures to reject low frequency modulation was also measured, with good results for both circuits

  7. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. As widely observed, baseline drift is generated by fluctuations in laser energy, inhomogeneity of sample surfaces and background noise, and it has aroused the interest of many researchers. Most prevalent algorithms need some key parameters to be preset, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS, namely the sparsity of spectral peaks and the low-pass filtered nature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technique uses a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is employed to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure the convergence of the algorithm. To validate the proposed method, the concentration analysis of chromium (Cr), manganese (Mn) and nickel (Ni) contained in 23 certified high-alloy steel samples is assessed by using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the method proposed in this paper has better accuracy in quantitative analysis than other methods, and fully reflects its adaptive ability.
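    The abstract does not spell out the exact convex model; one widely used convex formulation with an asymmetric penalty is asymmetric least squares (AsLS) smoothing, sketched here as an illustration. The parameters `lam`, `p` and the toy spectrum are assumed, not taken from the paper.

```python
import numpy as np

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: minimise
    sum_i w_i (y_i - z_i)^2 + lam * sum_i (second difference of z)^2,
    with small weights above the current baseline estimate and large
    weights below it, so the baseline hugs the lower envelope."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)      # second-difference operator
    penalty = lam * D.T @ D
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + penalty, w * y)
        w = np.where(y > z, p, 1 - p)      # asymmetric reweighting
    return z

# toy spectrum: linear baseline plus one sharp emission peak
x = np.linspace(0, 1, 200)
true_base = 1 + 2 * x
y = true_base + 10 * np.exp(-((x - 0.5) / 0.02) ** 2)
z = asls_baseline(y)
```

    After a few reweighting iterations the estimate `z` tracks the sloping background while leaving the peak height in `y - z` essentially intact.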

  8. Large short-baseline νμ disappearance

    International Nuclear Information System (INIS)

    Giunti, Carlo; Laveder, Marco

    2011-01-01

    We analyze the LSND, KARMEN, and MiniBooNE data on short-baseline νμ→νe oscillations and the data on short-baseline νe disappearance obtained in the Bugey-3 and CHOOZ reactor experiments in the framework of 3+1 antineutrino mixing, taking into account the MINOS observation of long-baseline νμ disappearance and the KamLAND observation of very-long-baseline νe disappearance. We show that the fit of the data implies that the short-baseline disappearance of νμ is relatively large. We obtain a prediction of an effective amplitude sin²2θμμ ≳ 0.1 for short-baseline νμ disappearance generated by 0.2 eV² ≲ Δm² ≲ 2 eV², which could be measured in future experiments.
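    The "effective amplitude" quoted here is the coefficient of the two-flavor-like survival probability in 3+1 mixing; as a reminder, the standard short-baseline (SBL) phenomenology (not specific to this record) reads:

```latex
P^{\mathrm{SBL}}_{\nu_\mu \to \nu_\mu} \simeq
  1 - \sin^2 2\theta_{\mu\mu}\,
  \sin^2\!\left(\frac{\Delta m^2_{41}\,L}{4E}\right),
\qquad
\sin^2 2\theta_{\mu\mu} = 4\,|U_{\mu 4}|^2\left(1 - |U_{\mu 4}|^2\right).
```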

  9. Measuring complexity with multifractals in texts. Translation effects

    International Nuclear Information System (INIS)

    Ausloos, M.

    2012-01-01

    Highlights: ► Two texts in English and one in Esperanto are transformed into 6 time series. ► D(q) and f(alpha) of such (and shuffled) time series are obtained. ► A model for text construction is presented based on a parametrized Cantor set. ► The model parameters can also be used when examining machine translated texts. ► Suggested extensions to higher dimensions: in 2D image analysis and on hypertexts. - Abstract: Should quality be almost a synonym of complexity? To measure quality appears to be audacious, even very subjective. It is hereby proposed to use a multifractal approach in order to quantify quality, thus through complexity measures. A one-dimensional system is examined. It is known that (all) written texts can be represented as one-dimensional nonlinear maps. Thus, several written texts by the same author are considered, together with their translation into an unusual language, Esperanto, and, as a baseline, their corresponding shuffled versions. Different one-dimensional time series can be used: e.g. (i) one based on word lengths, (ii) the other based on word frequencies; both are used for studying, comparing and discussing the map structure. It is shown that a variety in style can be measured through the D(q) and f(α) curves characterizing multifractal objects. This allows one to observe on the one hand whether natural and artificial languages significantly influence the writing and the translation, and whether one author’s texts differ technically from each other. In fact, the f(α) curves of the original texts are similar to each other, but the translated text shows marked differences. However in each case, the f(α) curves are far from being parabolic, – in contrast to the shuffled texts. Moreover, the Esperanto text has more extreme values. Criteria are thereby suggested for estimating text quality, treating the text as a time series only. A model is introduced in order to substantiate the findings: it consists in considering a text as a random Cantor set
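    The first step of such an analysis, turning a text into series (i) of the abstract, a word-length time series, can be sketched as:

```python
import re

def word_length_series(text):
    """Map a text onto the one-dimensional series used as input for
    multifractal analysis: the sequence of word lengths (series (i)
    in the abstract). A frequency-based series (ii) would instead
    replace each word by its corpus frequency rank."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    return [len(w) for w in words]

series = word_length_series("To be or not to be, that is the question.")
# → [2, 2, 2, 3, 2, 2, 4, 2, 3, 8]
```

    The resulting series (and a shuffled copy of it, as the baseline) is what the D(q) and f(α) multifractal spectra are then computed from.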

  10. Efficient algorithm for baseline wander and powerline noise removal from ECG signals based on discrete Fourier series.

    Science.gov (United States)

    Bahaz, Mohamed; Benzid, Redha

    2018-03-01

    Electrocardiogram (ECG) signals are often contaminated with artefacts and noises which can lead to incorrect diagnosis when they are visually inspected by cardiologists. In this paper, the well-known discrete Fourier series (DFS) is re-explored and an efficient DFS-based method is proposed to reduce the contribution of both baseline wander (BW) and powerline interference (PLI) noises in ECG records. In the first step, the exact number of low-frequency harmonics contributing to BW is determined. Next, the baseline drift is estimated as the sum of all associated Fourier sinusoidal components. The baseline shift is then discarded efficiently by subtracting its approximated version from the original biased ECG signal. Concerning the PLI, subtracting the contributing harmonics, calculated in the same manner, efficiently reduces this type of noise. In addition to visual quality results, the proposed algorithm shows superior performance in terms of a higher signal-to-noise ratio and a smaller mean square error when compared to the DCT-based algorithm.
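    A minimal sketch of the baseline-wander part of this idea: estimate the baseline from the low-frequency Fourier harmonics of the record and subtract it. The 0.7 Hz cutoff and the synthetic signal are assumed illustrative choices, not values from the paper, which instead determines the exact number of contributing harmonics.

```python
import numpy as np

def remove_baseline_wander(ecg, fs, cutoff_hz=0.7):
    """Estimate baseline wander as the sum of the Fourier harmonics
    below `cutoff_hz` and subtract it from the record."""
    n = len(ecg)
    spectrum = np.fft.rfft(ecg)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    low = spectrum.copy()
    low[freqs >= cutoff_hz] = 0.0      # keep only low-frequency harmonics
    baseline = np.fft.irfft(low, n)
    return ecg - baseline, baseline

# demo: a 5 Hz tone standing in for the ECG, riding on 0.3 Hz wander
fs = 250
t = np.arange(0, 10, 1 / fs)
wander = 0.5 * np.sin(2 * np.pi * 0.3 * t)
corrected, est = remove_baseline_wander(wander + np.sin(2 * np.pi * 5 * t), fs)
```

    PLI could be handled in the same spirit by zeroing the harmonics in a narrow band around 50 or 60 Hz instead of the low-frequency band.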

  11. TAPIR--Finnish national geochemical baseline database.

    Science.gov (United States)

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various

  12. Automatic Coding of Short Text Responses via Clustering in Educational Assessment

    Science.gov (United States)

    Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank

    2016-01-01

    Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…

  13. A proposal of texts for political ideological work and the value formation of the future physical culture professional

    Directory of Open Access Journals (Sweden)

    Ana Isel Rodríguez-Cruz

    2013-08-01

    Full Text Available Values, as complex formations of the personality, are closely tied to a person's own existence and to each individual's political-ideological formation. For that reason, through a selection of texts on this theme used in the Communicative Spanish subject, the authors propose to deepen the political-ideological work and the main values in correspondence with the expectations, interests and necessities of current Cuban society. The application of theoretical, empirical and statistical methods confirmed the need to reinforce the political-ideological work and the formation of values in our students. To this end, the subject team puts into practice educational approaches and teaching strategies so that, from the classes themselves, the teacher brings about changes in the students and contributes to forming the professional that modern society demands. For this reason, the teacher works with texts on diverse related themes such as history, sport, personalities and important events; the texts place emphasis, among other topics, on the current struggle for the freedom of the Cuban Five. The analysis of the texts includes the search for key words, the relationship between signifier and signified, the translation, interpretation and extrapolation of the texts, the qualities of the paragraph, and the rhetorical patterns or methods of its development, among other aspects. All of these elements help to reinforce the values the students already hold, and at the same time the teacher's work encourages the students to express their feelings and thoughts in correspondence with their personality.

  14. High baseline activity in inferior temporal cortex improves neural and behavioral discriminability during visual categorization

    Directory of Open Access Journals (Sweden)

    Nazli Emadi

    2014-11-01

    Full Text Available Spontaneous firing is a ubiquitous property of neural activity in the brain. Recent literature suggests that this baseline activity plays a key role in perception. However, it is not known how the baseline activity contributes to neural coding and behavior. Here, by recording from single neurons in the inferior temporal cortex of monkeys performing a visual categorization task, we thoroughly explored the relationship between baseline activity, the evoked response, and behavior. Specifically, we found that a low-frequency (< 8 Hz) oscillation in the spike train, prior to and phase-locked with the stimulus onset, was correlated with increased gamma power and neuronal baseline activity. This enhancement of the baseline activity was then followed by an increase in neural selectivity and response reliability and, eventually, higher behavioral performance.

  15. Baseline review of the U.S. LHC Accelerator project

    International Nuclear Information System (INIS)

    1998-02-01

    The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23--26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. U.S. scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the U.S. LHC Accelerator project team. Cost estimates for each subsystem of the U.S. LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then-year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of U.S. deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as

  16. n-Gram-Based Text Compression

    Directory of Open Access Journals (Sweden)

    Vu H. Nguyen

    2016-01-01

    Full Text Available We propose an efficient method for compressing Vietnamese text using n-gram dictionaries. It achieves a significant compression ratio in comparison with state-of-the-art methods on the same dataset. Given a text, the proposed method first splits it into n-grams and then encodes them based on n-gram dictionaries. In the encoding phase, we use a sliding window with a size that ranges from bigram to five-gram to obtain the best encoding stream. Each n-gram is encoded in two to four bytes based on its corresponding n-gram dictionary. We collected a 2.5 GB text corpus from some Vietnamese news agencies to build n-gram dictionaries from unigram to five-gram, yielding dictionaries with a total size of 12 GB. In order to evaluate our method, we collected a testing set of 10 text files of different sizes. The experimental results indicate that our method achieves a compression ratio of around 90% and outperforms state-of-the-art methods.
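    The sliding-window encoding phase can be sketched as a greedy longest-match search from five-grams down to bigrams. The toy dictionaries and the `(n, index)` code format below are invented for illustration; the paper's actual byte-level format packs each code into two to four bytes.

```python
def encode_ngrams(words, dictionaries):
    """Greedy longest-match encoding: at each position try the longest
    n-gram (5 down to 2) present in its dictionary; fall back to the
    single word. Returns a list of (n, index-or-word) codes."""
    out, i = [], 0
    while i < len(words):
        for n in range(5, 1, -1):
            gram = tuple(words[i:i + n])
            if len(gram) == n and gram in dictionaries.get(n, {}):
                out.append((n, dictionaries[n][gram]))
                i += n
                break
        else:
            out.append((1, words[i]))   # no dictionary hit: emit the word
            i += 1
    return out

# hypothetical miniature dictionaries mapping n-grams to indices
dicts = {2: {("xin", "chao"): 0}, 3: {("hoc", "may", "thong"): 1}}
codes = encode_ngrams(["xin", "chao", "hoc", "may", "thong", "ke"], dicts)
# → [(2, 0), (3, 1), (1, 'ke')]
```

    Decoding reverses the lookup: each `(n, index)` code is expanded back into its n-gram via the corresponding dictionary.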

  17. Baseline assessment of the fish and benthic communities of the Flower Garden Banks (NODC Accession 0118358)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  18. Primary School Text Comprehension Predicts Mathematical Word Problem-Solving Skills in Secondary School

    Science.gov (United States)

    Björn, Piia Maria; Aunola, Kaisa; Nurmi, Jari-Erik

    2016-01-01

    This longitudinal study aimed to investigate the extent to which primary school text comprehension predicts mathematical word problem-solving skills in secondary school among Finnish students. The participants were 224 fourth graders (9-10 years old at the baseline). The children's text-reading fluency, text comprehension and basic calculation…

  19. A novel baseline-correction method for standard addition based derivative spectra and its application to quantitative analysis of benzo(a)pyrene in vegetable oil samples.

    Science.gov (United States)

    Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun

    2011-07-07

    In the present work, a baseline-correction method based on peak-to-derivative-baseline measurement was proposed for eliminating the complex matrix interference, mainly caused by unknown components and/or background, in the analysis of derivative spectra. This novel method is applicable particularly when the matrix interfering components show a broad spectral band, which is common in practical analysis. The derivative baseline is established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method were demonstrated through both theoretical simulation and practical application. First, Gaussian bands were used to simulate 'interfering' and 'analyte' bands, to investigate the effect of different parameters of the interfering band on the derivative baseline. This simulation analysis verified that the accuracy of the proposed method is remarkably better than that of other conventional measurements such as peak-to-zero, tangent, and peak-to-peak measurements. The proposed baseline-correction method was then applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. Satisfactory results were obtained by using the new method to analyze a certified reference material (coconut oil, BCR®-458), with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy and infrared spectroscopy.
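    The core geometric idea, joining two crossing points of standard-addition spectra and measuring the peak from that line, can be sketched as follows. The synthetic curve and chosen indices are invented for the demo; they are not the paper's spectra.

```python
import numpy as np

def crossing_points(y1, y2):
    """Indices just before sign changes of (y1 - y2): where two
    derivative spectra recorded with different standard additions
    cross each other."""
    s = np.sign(y1 - y2)
    return np.where(np.diff(s) != 0)[0]

def peak_to_derivative_baseline(x, y, i_left, i_right, i_peak):
    """Peak height measured from the straight line joining the two
    crossing points (the 'derivative baseline')."""
    slope = (y[i_right] - y[i_left]) / (x[i_right] - x[i_left])
    return y[i_peak] - (y[i_left] + slope * (x[i_peak] - x[i_left]))

# demo: a peak of height 2 sitting on a sloping background
x = np.linspace(0, 10, 101)
y = 0.5 * x + 2 * np.exp(-((x - 5) / 0.5) ** 2)
height = peak_to_derivative_baseline(x, y, i_left=20, i_right=80, i_peak=50)
```

    Measuring from the connecting line rather than from zero removes the sloping background contribution, which is the advantage over peak-to-zero or tangent measurements in the simulation study.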

  20. Microbunch preserving in-line system for an APPLE II helical radiator at the LCLS baseline

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL Project Team, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2011-05-15

    In a previous work we proposed a scheme for polarization control at the LCLS baseline, which exploited the microbunching from the planar undulator. After the baseline undulator, the electron beam is transported through a drift by a FODO focusing system, and through a short helical radiator. The microbunching structure can be preserved, and intense coherent radiation is emitted in the helical undulator at the fundamental harmonic. The driving idea of this proposal is that the background linearly-polarized radiation from the baseline undulator is suppressed by spatial filtering. Filtering is achieved by letting radiation and electron beam through Be slits upstream of the helical radiator, where the radiation spot size is about ten times larger than the electron beam transverse size. Several changes considered in the present paper were made to improve the previous design. Slits are now placed immediately behind the helical radiator. The advantage is that the electron beam can be spoiled by the slits, and a narrower slit width can be used for spatial filtering. Due to this fundamental reason, the present setup is shorter than the previous one. The helical radiator is now placed immediately behind the SHAB undulator. It is thus sufficient to use the existing FODO focusing system of the SHAB undulator for transporting the modulated electron beam. This paper presents complete GENESIS code calculations for the new design, starting from the baseline undulator entrance up to the helical radiator exit, including the modulated electron beam transport by the SHAB FODO focusing system. (orig.)

  1. Microbunch preserving in-line system for an APPLE II helical radiator at the LCLS baseline

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2011-05-01

    In a previous work we proposed a scheme for polarization control at the LCLS baseline, which exploited the microbunching from the planar undulator. After the baseline undulator, the electron beam is transported through a drift by a FODO focusing system, and through a short helical radiator. The microbunching structure can be preserved, and intense coherent radiation is emitted in the helical undulator at the fundamental harmonic. The driving idea of this proposal is that the background linearly-polarized radiation from the baseline undulator is suppressed by spatial filtering. Filtering is achieved by letting radiation and electron beam through Be slits upstream of the helical radiator, where the radiation spot size is about ten times larger than the electron beam transverse size. Several changes considered in the present paper were made to improve the previous design. Slits are now placed immediately behind the helical radiator. The advantage is that the electron beam can be spoiled by the slits, and a narrower slit width can be used for spatial filtering. Due to this fundamental reason, the present setup is shorter than the previous one. The helical radiator is now placed immediately behind the SHAB undulator. It is thus sufficient to use the existing FODO focusing system of the SHAB undulator for transporting the modulated electron beam. This paper presents complete GENESIS code calculations for the new design, starting from the baseline undulator entrance up to the helical radiator exit, including the modulated electron beam transport by the SHAB FODO focusing system. (orig.)

  2. Terrestrial gamma radiation baseline mapping using ultra low density sampling methods.

    Science.gov (United States)

    Kleinschmidt, R; Watson, D

    2016-01-01

    Baseline terrestrial gamma radiation maps are indispensable for providing basic reference information that may be used in assessing the impact of a radiation related incident, performing epidemiological studies, remediating land contaminated with radioactive materials, assessment of land use applications and resource prospectivity. For a large land mass, such as Queensland, Australia (over 1.7 million km(2)), it is prohibitively expensive and practically difficult to undertake detailed in-situ radiometric surveys of this scale. It is proposed that an existing, ultra-low density sampling program already undertaken for the purpose of a nationwide soil survey project be utilised to develop a baseline terrestrial gamma radiation map. Geoelement data derived from the National Geochemistry Survey of Australia (NGSA) was used to construct a baseline terrestrial gamma air kerma rate map, delineated by major drainage catchments, for Queensland. Three drainage catchments (sampled at the catchment outlet) spanning low, medium and high radioelement concentrations were selected for validation of the methodology using radiometric techniques including in-situ measurements and soil sampling for high resolution gamma spectrometry, and comparative non-radiometric analysis. A Queensland mean terrestrial air kerma rate, as calculated from the NGSA outlet sediment uranium, thorium and potassium concentrations, of 49 ± 69 nGy h(-1) (n = 311, 3σ 99% confidence level) is proposed as being suitable for use as a generic terrestrial air kerma rate background range. Validation results indicate that catchment outlet measurements are representative of the range of results obtained across the catchment and that the NGSA geoelement data is suitable for calculation and mapping of terrestrial air kerma rate. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
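The conversion from soil radioelement concentrations to a terrestrial gamma dose rate described above can be sketched as follows. Note the conversion coefficients below are the widely used UNSCEAR (2000) dose-rate factors and standard activity-per-concentration approximations, not values taken from the NGSA study, so treat them as illustrative assumptions:

```python
# Terrestrial gamma dose rate in air from soil radioelement concentrations.
# Coefficients are the commonly used UNSCEAR (2000) values and standard
# activity-per-concentration approximations -- assumptions for illustration,
# not numbers from the NGSA data set.

# Specific activity per unit concentration (Bq/kg per ppm, or per % K)
BQ_PER_PPM_U = 12.35    # 238U
BQ_PER_PPM_TH = 4.06    # 232Th
BQ_PER_PCT_K = 313.0    # 40K

# Dose-rate coefficients, nGy/h per Bq/kg (UNSCEAR 2000)
D_U, D_TH, D_K = 0.462, 0.604, 0.0417

def terrestrial_dose_rate(u_ppm: float, th_ppm: float, k_pct: float) -> float:
    """Outdoor terrestrial gamma absorbed dose rate in air (nGy/h) at 1 m
    above ground, which closely approximates the air kerma rate."""
    return (D_U * BQ_PER_PPM_U * u_ppm
            + D_TH * BQ_PER_PPM_TH * th_ppm
            + D_K * BQ_PER_PCT_K * k_pct)

# Example: a soil with 2 ppm U, 8 ppm Th and 1.5% K
rate = terrestrial_dose_rate(2.0, 8.0, 1.5)
```

For this example soil the result lands near 50 nGy/h, the same order as the Queensland mean quoted above.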

  3. Text2Floss: the feasibility and acceptability of a text messaging intervention to improve oral health behavior and knowledge.

    Science.gov (United States)

    Hashemian, Tony S; Kritz-Silverstein, Donna; Baker, Ryan

    2015-01-01

    Text messaging is useful for promoting numerous health-related behaviors. The Text2Floss Study examines the feasibility and utility of a 7-day text messaging intervention to improve oral health knowledge and behavior in mothers of young children. Mothers were recruited from a private practice and a community clinic. Of 156 mothers enrolled, 129 randomized into text (n = 60) and control groups (n = 69) completed the trial. Participants in the text group received text messages for 7 days, asking about flossing and presenting oral health information. Oral health behaviors and knowledge were surveyed pre- and post-intervention. At baseline, there were no differences between text and control group mothers in knowledge and behaviors (P > 0.10). Post-intervention, text group mothers flossed more (P = 0.01), had higher total (P = 0.0006) and specific (P Text messages were accepted and perceived as useful. Mothers receiving text messages improved their own oral health behaviors and knowledge as well as their behaviors regarding their children's oral health. Text messaging represents a viable method to improve oral health behaviors and knowledge. Its high acceptance may make it useful for preventing oral disease. © 2014 American Association of Public Health Dentistry.

  4. 40 CFR 1042.825 - Baseline determination.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Baseline determination. 1042.825... Provisions for Remanufactured Marine Engines § 1042.825 Baseline determination. (a) For the purpose of this... not valid. (f) Use good engineering judgment for all aspects of the baseline determination. We may...

  5. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
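The kind of modal identification such a tool performs (frequency and damping of an oscillatory mode) can be illustrated on a synthetic ringdown. This is a toy sketch, not OBAT code: it uses zero crossings for frequency and the logarithmic decrement for damping, far simpler than production synchrophasor analysis:

```python
import math

def mode_estimate(signal, dt):
    """Estimate frequency (Hz) and damping ratio of a single dominant
    oscillatory mode from a ringdown: frequency from rising zero crossings,
    damping from the logarithmic decrement of successive positive peaks."""
    # Frequency from the mean spacing of rising zero crossings
    crossings = [i for i in range(1, len(signal))
                 if signal[i - 1] < 0.0 <= signal[i]]
    periods = [(b - a) * dt for a, b in zip(crossings, crossings[1:])]
    freq = 1.0 / (sum(periods) / len(periods))
    # Damping from the decay of successive local maxima (log decrement)
    peaks = [signal[i] for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1] and signal[i] > 0]
    decrements = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
    delta = sum(decrements) / len(decrements)
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    return freq, zeta

# Synthetic 0.7 Hz mode with 5% damping, sampled at a PMU-like 30 Hz
dt, f0, z0 = 1 / 30, 0.7, 0.05
wn = 2 * math.pi * f0 / math.sqrt(1 - z0 ** 2)   # undamped natural frequency
sig = [math.exp(-z0 * wn * k * dt) * math.sin(2 * math.pi * f0 * k * dt)
       for k in range(600)]
freq, zeta = mode_estimate(sig, dt)
```

Real tools fit multiple modes with methods such as Prony or matrix-pencil analysis; the sketch only recovers a single dominant mode.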

  6. On the baseline evolution of automobile fuel economy in Europe

    International Nuclear Information System (INIS)

    Zachariadis, Theodoros

    2006-01-01

'Business as usual' scenarios in long-term energy forecasts are crucial for scenario-based policy analyses. This article focuses on fuel economy of passenger cars and light trucks, a long-disputed issue with serious implications for worldwide energy use and CO2 emissions. The current status in Europe is explained and future developments are analysed with the aid of historical data of the last three decades from the United States and Europe. As a result of this analysis, fuel economy values are proposed for use as assumptions in baseline energy/transport scenarios in the 15 'old' European Union Member States. Proposed values are given for new gasoline and diesel cars and for the years 2010, 2020 and 2030. The increasing discrepancy between vehicle fuel consumption measured under test conditions and that in the real world is also considered. One main conclusion is that the European Commission's voluntary agreement with the automobile industry should not be assumed to fully achieve its target under baseline conditions, nor should it be regarded as a major stimulus for autonomous vehicle efficiency improvements after 2010. A second conclusion is that three very recent studies enjoying authority across the EU tend to be overly optimistic as regards the technical progress for conventional and alternative vehicle propulsion technologies under 'business as usual' conditions.

  7. A teaching proposal on electrostatics based on the history of science through the reading of historical texts and argumentative discussions

    International Nuclear Information System (INIS)

    Castells, Marina; Konstantinidou, Aikaterini; Cerveró, Josep M.

    2015-01-01

Research on students' conceptions of electrostatics has found that students hold ideas that disagree with scientific models and that may explain their learning difficulties. To favour change in students' ideas and conceptions, a teaching sequence that relies on a historical study of electrostatics is proposed. It begins with an exploration of electrostatic phenomena that students carry out with everyday materials. Students must draw their own explanations of these phenomena, which are then shared and discussed in class. The teacher collects and summarizes the ideas and explanations that are closest to those from the history of science. A brief history of electrostatics is then introduced, and texts from scientists are used in a role-play debate activity in which 'supporters of a single fluid' and 'supporters of two fluids' have to present arguments for their model and/or against the other model to explain the phenomena observed in the exploration phase. Next, students read texts related to applications of science; the main aim of this activity is to relate electrostatic phenomena to current electricity. The first text explains how Franklin understood the nature of lightning and the lightning rod, and the second is a chapter of a novel about a historical episode set in the Barcelona of the XVIII century. Students use the historical one-fluid and two-fluid models to explain these two phenomena and compare them with the explanation from the 'accepted' science of today, introduced by the teacher. With this type of teaching proposal, conceptual aspects of electrostatics are learnt, but students also learn about the nature and history of science and culture, as well as about the practice of argumentation.

  8. A teaching proposal on electrostatics based on the history of science through the reading of historical texts and argumentative discussions

    Science.gov (United States)

    Castells, Marina; Konstantinidou, Aikaterini; Cerveró, Josep M.

    2016-05-01

Research on students' conceptions of electrostatics has found that students hold ideas that disagree with scientific models and that may explain their learning difficulties. To favour change in students' ideas and conceptions, a teaching sequence that relies on a historical study of electrostatics is proposed. It begins with an exploration of electrostatic phenomena that students carry out with everyday materials. Students must draw their own explanations of these phenomena, which are then shared and discussed in class. The teacher collects and summarizes the ideas and explanations that are closest to those from the history of science. A brief history of electrostatics is then introduced, and texts from scientists are used in a role-play debate activity in which "supporters of a single fluid" and "supporters of two fluids" have to present arguments for their model and/or against the other model to explain the phenomena observed in the exploration phase. Next, students read texts related to applications of science; the main aim of this activity is to relate electrostatic phenomena to current electricity. The first text explains how Franklin understood the nature of lightning and the lightning rod, and the second is a chapter of a novel about a historical episode set in the Barcelona of the XVIII century. Students use the historical one-fluid and two-fluid models to explain these two phenomena and compare them with the explanation from the "accepted" science of today, introduced by the teacher. With this type of teaching proposal, conceptual aspects of electrostatics are learnt, but students also learn about the nature and history of science and culture, as well as about the practice of argumentation.

  9. 33 CFR 2.20 - Territorial sea baseline.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line.... Normally, the territorial sea baseline is the mean low water line along the coast of the United States...

  10. Robust extraction of baseline signal of atmospheric trace species using local regression

    Science.gov (United States)

    Ruckstuhl, A. F.; Henne, S.; Reimann, S.; Steinbacher, M.; Vollmer, M. K.; O'Doherty, S.; Buchmann, B.; Hueglin, C.

    2012-11-01

    The identification of atmospheric trace species measurements that are representative of well-mixed background air masses is required for monitoring atmospheric composition change at background sites. We present a statistical method based on robust local regression that is well suited for the selection of background measurements and the estimation of associated baseline curves. The bootstrap technique is applied to calculate the uncertainty in the resulting baseline curve. The non-parametric nature of the proposed approach makes it a very flexible data filtering method. Application to carbon monoxide (CO) measured from 1996 to 2009 at the high-alpine site Jungfraujoch (Switzerland, 3580 m a.s.l.), and to measurements of 1,1-difluoroethane (HFC-152a) from Jungfraujoch (2000 to 2009) and Mace Head (Ireland, 1995 to 2009) demonstrates the feasibility and usefulness of the proposed approach. The determined average annual change of CO at Jungfraujoch for the 1996 to 2009 period as estimated from filtered annual mean CO concentrations is -2.2 ± 1.1 ppb yr-1. For comparison, the linear trend of unfiltered CO measurements at Jungfraujoch for this time period is -2.9 ± 1.3 ppb yr-1.
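The idea of robust local regression for baseline extraction can be sketched as follows: points lying far from the local fit (e.g. pollution events) are progressively downweighted, so the curve tracks the well-mixed background. This is an illustrative stand-in with assumed window and weight parameters, not the authors' implementation:

```python
def robust_baseline(t, y, half_width, n_iter=3):
    """Estimate a baseline curve by local linear regression with Tukey
    biweight robustness iterations: outliers far from the fit lose weight,
    so spikes barely pull the baseline up."""
    n = len(y)
    w_rob = [1.0] * n
    fit = list(y)
    for _ in range(n_iter):
        for i in range(n):
            # Weighted least-squares line over a window centred on t[i]
            idx = [j for j in range(n) if abs(t[j] - t[i]) <= half_width]
            sw = sum(w_rob[j] for j in idx) or 1e-12
            mt = sum(w_rob[j] * t[j] for j in idx) / sw
            my = sum(w_rob[j] * y[j] for j in idx) / sw
            sxx = sum(w_rob[j] * (t[j] - mt) ** 2 for j in idx)
            sxy = sum(w_rob[j] * (t[j] - mt) * (y[j] - my) for j in idx)
            slope = sxy / sxx if sxx > 0 else 0.0
            fit[i] = my + slope * (t[i] - mt)
        # Tukey biweight on residuals, scaled by a MAD-like statistic
        resid = [y[i] - fit[i] for i in range(n)]
        span = (max(y) - min(y)) or 1.0
        s = max(sorted(abs(r) for r in resid)[n // 2], 1e-6 * span)
        w_rob = [(1.0 - min(1.0, (r / (6.0 * s)) ** 2)) ** 2 for r in resid]
    return fit

# Synthetic trace: slow linear baseline, noise, two large "pollution" spikes
import random
random.seed(7)
t = list(range(60))
baseline_true = [10 + 0.1 * j for j in t]
y = [b + random.gauss(0, 0.3) for b in baseline_true]
y[20] += 30.0
y[40] += 25.0
fit = robust_baseline(t, y, half_width=8)
```

After a few robustness iterations the spikes carry essentially zero weight and the fitted curve returns to the underlying trend, which is the behaviour the filtering method above exploits to separate background from polluted air masses.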

  11. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    Science.gov (United States)

    Cook, Richard J; Wei, Wei

    2003-07-01

The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).

  12. Program Baseline Change Control Board charter

    International Nuclear Information System (INIS)

    1993-02-01

    The purpose of this Charter is to establish the Program Baseline Change Control Board (PBCCB) for the Office of Civilian Radioactive Waste Management (OCRWM) Program, and to describe its organization, responsibilities, and basic methods of operation. Guidance for implementing this Charter is provided by the OCRWM Baseline Management Plan (BMP) and OCRWM Program Baseline Change Control Procedure

  13. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

One of the greatest challenges surrounding adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. Such knowledge could guide individually customized countermeasures, which would enable more efficient use of crew time, both preflight and inflight, and provide better outcomes. The primary goal of this project is to look for a baseline performance metric that can forecast sensorimotor adaptability without exposure to an adaptive stimulus. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations in motor performance, as a predictor of individual sensorimotor adaptive capabilities. To date, a strong relationship has been found between baseline inter-trial correlations and adaptability in two oculomotor systems. For this project, we will explore an analogous predictive mechanism in the locomotion system. METHODS: Baseline Inter-trial Correlations: Inter-trial correlations specify the relationships among repeated trials of a given task that transpire as a consequence of correcting for previous performance errors over multiple timescales. We can quantify the strength of inter-trial correlations by measuring the decay of the autocorrelation function (ACF), which describes how rapidly information from past trials is "forgotten." Processes whose ACFs decay more slowly exhibit longer-term inter-trial correlations (longer memory processes), while processes whose ACFs decay more rapidly exhibit shorter-term inter-trial correlations (shorter memory processes). Longer-term correlations reflect low-frequency activity, which is more easily
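The ACF-decay contrast between longer- and shorter-memory processes can be demonstrated numerically. This sketch compares white noise (fast decay) with an AR(1) process (slow decay); the series and parameters are assumed for illustration and are unrelated to the study's data:

```python
def acf(x, max_lag):
    """Sample autocorrelation function of a series up to max_lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    return [sum((x[i] - mean) * (x[i + k] - mean) for i in range(n - k)) / var
            for k in range(max_lag + 1)]

# Short-memory process: white noise. Longer-memory process: AR(1) with
# coefficient 0.9, whose theoretical ACF decays as 0.9**lag.
import random
random.seed(4)
noise = [random.gauss(0, 1) for _ in range(4000)]
ar1 = [0.0]
for e in noise[1:]:
    ar1.append(0.9 * ar1[-1] + e)
acf_noise = acf(noise, 10)
acf_ar1 = acf(ar1, 10)
```

The white-noise ACF drops to roughly zero after lag 0, while the AR(1) ACF decays slowly (about 0.9 per lag), the signature of a longer-memory process in the sense used above.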

  14. n-Gram-Based Text Compression

    Science.gov (United States)

    Duong, Hieu N.; Snasel, Vaclav

    2016-01-01

    We propose an efficient method for compressing Vietnamese text using n-gram dictionaries. It has a significant compression ratio in comparison with those of state-of-the-art methods on the same dataset. Given a text, first, the proposed method splits it into n-grams and then encodes them based on n-gram dictionaries. In the encoding phase, we use a sliding window with a size that ranges from bigram to five grams to obtain the best encoding stream. Each n-gram is encoded by two to four bytes accordingly based on its corresponding n-gram dictionary. We collected 2.5 GB text corpus from some Vietnamese news agencies to build n-gram dictionaries from unigram to five grams and achieve dictionaries with a size of 12 GB in total. In order to evaluate our method, we collected a testing set of 10 different text files with different sizes. The experimental results indicate that our method achieves compression ratio around 90% and outperforms state-of-the-art methods. PMID:27965708
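The greedy sliding-window encoding described above can be sketched with toy dictionaries. The real system uses gigabyte-scale dictionaries and packs codes into two to four bytes; here the dictionaries are built from a tiny corpus and codes are plain integers, purely to show the mechanics:

```python
def build_dicts(corpus_tokens, max_n=5):
    """Build n-gram dictionaries (n-gram tuple -> integer code) from a
    training corpus, for n = 1..max_n."""
    dicts = {n: {} for n in range(1, max_n + 1)}
    for n in range(1, max_n + 1):
        for i in range(len(corpus_tokens) - n + 1):
            gram = tuple(corpus_tokens[i:i + n])
            dicts[n].setdefault(gram, len(dicts[n]))
    return dicts

def encode(tokens, dicts, max_n=5):
    """Greedy sliding window: at each position emit the longest n-gram
    (from max_n down to unigram) found in its dictionary, as (n, code)."""
    out, i = [], 0
    while i < len(tokens):
        for n in range(max_n, 0, -1):
            gram = tuple(tokens[i:i + n])
            if len(gram) == n and gram in dicts[n]:
                out.append((n, dicts[n][gram]))
                i += n
                break
        else:
            raise ValueError(f"token not in unigram dictionary: {tokens[i]}")
    return out

def decode(codes, dicts):
    """Invert the dictionaries and expand each (n, code) back to tokens."""
    rev = {n: {v: k for k, v in d.items()} for n, d in dicts.items()}
    return [tok for n, c in codes for tok in rev[n][c]]

corpus = "the cat sat on the mat the cat ran".split()
dicts = build_dicts(corpus)
msg = "the cat sat on the mat".split()
codes = encode(msg, dicts)
```

Six tokens collapse to two codes here (one 5-gram plus one unigram), which is the source of the compression gain when the dictionaries cover the language well.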

  15. Baseline assessment of benthic communities of the Flower Garden Banks (2010 - 2013) using technical diving operations: 2011

    Data.gov (United States)

National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  16. Baseline assessment of fish communities of the Flower Garden Banks (2010 - 2013) using technical diving operations: 2011

    Data.gov (United States)

National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  17. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    Energy Technology Data Exchange (ETDEWEB)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen [Galson Sciences Ltd. Oakham, Rutland (United Kingdom); Bolton, Gary [National Nuclear Laboratory Risley, Warrington (United Kingdom); McKinney, James; Morris, Darrell [Nuclear Decommissioning Authority Moor Row, Cumbria (United Kingdom); Angus, Mike [National Nuclear Laboratory Risley, Warrington (United Kingdom); Cann, Gavin; Binks, Tracey [National Nuclear Laboratory Sellafield (United Kingdom)

    2013-07-01

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach is made up of two sections; assessment of base-lining needs and definition of base-lining approach. During the assessment of base-lining needs a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes (affecting safety functions), and their corresponding indicators, that can be measured to provide a baseline for the waste-storage system should then be identified in order for the most suitable indicators to be selected for base-lining. In defining the approach, identification of opportunities to collect data and constraints is undertaken before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  18. Baselines For Land-Use Change In The Tropics: Application ToAvoided Deforestation Projects

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Sandra; Hall, Myrna; Andrasko, Ken; Ruiz, Fernando; Marzoli, Walter; Guerrero, Gabriela; Masera, Omar; Dushku, Aaron; Dejong,Ben; Cornell, Joseph

    2007-06-01

/infrastructure factors (less observable) in explaining empirical land-use patterns. We propose, from the lessons learned, a methodology comprised of three main steps and six tasks that can be used to begin developing credible baselines. We also propose that the baselines be projected over a 10-year period because, although projections beyond 10 years are feasible, they are likely to be unrealistic for policy purposes. In the first step, an historic land-use change and deforestation estimate is made by determining the analytic domain (size of the region relative to the size of the proposed project), obtaining historic data, analyzing candidate historic baseline drivers, and identifying three to four major drivers. In the second step, a baseline of where deforestation is likely to occur, a potential land-use change (PLUC) map, is produced using a spatial model such as GEOMOD that uses the key drivers from step one. Then rates of deforestation are projected over a 10-year baseline period using any of the three models. Using the PLUC maps, projected rates of deforestation, and carbon stock estimates, baseline projections are developed that can be used for project GHG accounting and crediting purposes. The final step proposes that, at an agreed interval (e.g., +10 years), the baseline assumptions about baseline drivers be re-assessed. This step reviews the viability of the 10-year baseline in light of changes in one or more key baseline drivers (e.g., new roads, new communities, a new protected area, etc.). The potential land-use change map and estimates of rates of deforestation could be redone at the agreed interval, allowing the rates and changes in spatial drivers to be incorporated into a defense of the existing baseline, or derivation of a new baseline projection.

  19. Effect of Enamel Caries Lesion Baseline Severity on Fluoride Dose-Response

    Directory of Open Access Journals (Sweden)

    Frank Lippert

    2017-01-01

Full Text Available This study aimed to investigate the effect of enamel caries lesion baseline severity on fluoride dose-response under pH cycling conditions. Early caries lesions were created in human enamel specimens at four different severities (8, 16, 24, and 36 h). Lesions were allocated to treatment groups (0, 83, and 367 ppm fluoride as sodium fluoride) based on Vickers surface microhardness (VHN) and pH cycled for 5 d. The cycling model comprised 3 × 1 min fluoride treatments sandwiched between 2 × 60 min demineralization challenges, with specimens stored in artificial saliva in between. VHN was measured again and changes versus lesion baseline were calculated (ΔVHN). Data were analyzed using two-way ANOVA (p<0.05). Increased demineralization times led to increased surface softening. The lesion severity × fluoride concentration interaction was significant (p<0.001). Fluoride dose-response was observed in all groups. Lesions initially demineralized for 16 and 8 h showed similar overall rehardening (ΔVHN) and more than 24 and 36 h lesions, which were similar. The 8 h lesions showed the greatest fluoride response differential (367 versus 0 ppm F), which diminished with increasing lesion baseline severity. The extent of rehardening as a result of the 0 ppm F treatment increased with increasing lesion baseline severity, whereas it decreased for the fluoride treatments. In conclusion, lesion baseline severity impacts the extent of the fluoride dose-response.

  20. Improving the accuracy of energy baseline models for commercial buildings with occupancy data

    International Nuclear Information System (INIS)

    Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping

    2016-01-01

Highlights: • We evaluated several baseline models predicting energy use in buildings. • Including occupancy data improved accuracy of baseline model prediction. • Occupancy is highly correlated with energy use in buildings. • This simple approach can be used in decision making for energy retrofit projects. - Abstract: More than 80% of energy is consumed during the operation phase of a building’s life cycle, so energy efficiency retrofit for existing buildings is considered a promising way to reduce energy use in buildings. The investment strategies of retrofit depend on the ability to quantify energy savings by “measurement and verification” (M&V), which compares actual energy consumption to how much energy would have been used without the retrofit (called the “baseline” of energy use). Although numerous models exist for predicting the baseline of energy use, a critical limitation is that occupancy has not been included as a variable. However, occupancy rate is essential for energy consumption and was emphasized by previous studies. This study develops a new baseline model which is built upon the Lawrence Berkeley National Laboratory (LBNL) model but includes the use of building occupancy data. The study also proposes metrics to quantify the accuracy of prediction and the impacts of variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for HVAC load. The reasons are discussed further. In addition, sensitivity analysis is conducted to show the influence of parameters in baseline models. The results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce risks of M&V and facilitate investment strategies of energy efficiency retrofit.
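The M&V comparison of baseline models with and without an occupancy variable can be sketched on synthetic data. Note the data below is deliberately constructed so occupancy matters, to show the mechanics of the comparison; the study itself found that adding occupancy did not significantly improve its model. All variable names and coefficients are assumptions:

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations, solved with
    Gauss-Jordan elimination (fine for a handful of predictors)."""
    k = len(X[0])
    A = [[sum(row[a] * row[b] for row in X) for b in range(k)]
         for a in range(k)]
    v = [sum(row[a] * yi for row, yi in zip(X, y)) for a in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))  # partial pivot
        A[c], A[p] = A[p], A[c]
        v[c], v[p] = v[p], v[c]
        for r in range(k):
            if r != c and A[c][c] != 0:
                f = A[r][c] / A[c][c]
                A[r] = [A[r][j] - f * A[c][j] for j in range(k)]
                v[r] -= f * v[c]
    return [v[c] / A[c][c] for c in range(k)]

def rmse(X, y, beta):
    n = len(y)
    return (sum((yi - sum(b * x for b, x in zip(beta, row))) ** 2
                for row, yi in zip(X, y)) / n) ** 0.5

# Synthetic hourly data: energy = base + temperature term + occupancy term
import random
random.seed(1)
temp = [random.uniform(10, 35) for _ in range(500)]
occ = [random.uniform(0, 1) for _ in range(500)]
energy = [50 + 2.0 * t + 30.0 * o + random.gauss(0, 3)
          for t, o in zip(temp, occ)]
X_no = [[1.0, t] for t in temp]
X_occ = [[1.0, t, o] for t, o in zip(temp, occ)]
err_no = rmse(X_no, energy, ols_fit(X_no, energy))
err_occ = rmse(X_occ, energy, ols_fit(X_occ, energy))
```

On this synthetic building the occupancy-aware baseline fits markedly better; whether that holds on real metered data is exactly the empirical question the study examines.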

  1. Negation scope and spelling variation for text-mining of Danish electronic patient records

    DEFF Research Database (Denmark)

    Thomas, Cecilia Engel; Jensen, Peter Bjødstrup; Werge, Thomas

    2014-01-01

    Electronic patient records are a potentially rich data source for knowledge extraction in biomedical research. Here we present a method based on the ICD10 system for text-mining of Danish health records. We have evaluated how adding functionalities to a baseline text-mining tool affected...

  2. BreakingNews: Article Annotation by Image and Text Processing.

    Science.gov (United States)

    Ramisa, Arnau; Yan, Fei; Moreno-Noguer, Francesc; Mikolajczyk, Krystian

    2018-05-01

    Building upon recent Deep Neural Network architectures, current approaches lying in the intersection of Computer Vision and Natural Language Processing have achieved unprecedented breakthroughs in tasks like automatic captioning or image retrieval. Most of these learning methods, though, rely on large training sets of images associated with human annotations that specifically describe the visual content. In this paper we propose to go a step further and explore the more complex cases where textual descriptions are loosely related to the images. We focus on the particular domain of news articles in which the textual content often expresses connotative and ambiguous relations that are only suggested but not directly inferred from images. We introduce an adaptive CNN architecture that shares most of the structure for multiple tasks including source detection, article illustration and geolocation of articles. Deep Canonical Correlation Analysis is deployed for article illustration, and a new loss function based on Great Circle Distance is proposed for geolocation. Furthermore, we present BreakingNews, a novel dataset with approximately 100K news articles including images, text and captions, and enriched with heterogeneous meta-data (such as GPS coordinates and user comments). We show this dataset to be appropriate to explore all aforementioned problems, for which we provide a baseline performance using various Deep Learning architectures, and different representations of the textual and visual features. We report very promising results and bring to light several limitations of current state-of-the-art in this kind of domain, which we hope will help spur progress in the field.

  3. Text Character Extraction Implementation from Captured Handwritten Image to Text Conversionusing Template Matching Technique

    Directory of Open Access Journals (Sweden)

    Barate Seema

    2016-01-01

Full Text Available Images contain various types of useful information that should be extracted whenever required. Various algorithms and methods have been proposed to extract text from a given image, so that the user is able to access the text from any image. Variations in text may occur because of differences in size, style, orientation and alignment of text, while low image contrast and composite backgrounds make text extraction difficult. If we develop an application that extracts and recognizes such text accurately in real time, it can be applied to many important applications like document analysis, vehicle license plate extraction and text-based image indexing, and many such applications have become realities in recent years. To overcome the above problems, we develop an application that converts an image into text using algorithms such as bounding box, HSV model, blob analysis, template matching, and template generation.
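The core template-matching step can be sketched as normalized cross-correlation: slide a character template over the image and take the best-scoring position. This is a generic illustration on a tiny synthetic glyph, not the paper's implementation (which also uses bounding boxes, an HSV model, and blob analysis):

```python
def match_template(image, template):
    """Slide a template over a grayscale image (2D lists) and return the
    (row, col) with the highest normalized cross-correlation score."""
    H, W = len(image), len(image[0])
    h, w = len(template), len(template[0])
    tm = sum(sum(r) for r in template) / (h * w)
    tdev = [[template[i][j] - tm for j in range(w)] for i in range(h)]
    tnorm = sum(v * v for r in tdev for v in r) ** 0.5
    best, best_pos = -2.0, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            patch = [[image[y + i][x + j] for j in range(w)]
                     for i in range(h)]
            pm = sum(sum(r) for r in patch) / (h * w)
            pdev = [[patch[i][j] - pm for j in range(w)] for i in range(h)]
            pnorm = sum(v * v for r in pdev for v in r) ** 0.5
            if pnorm == 0 or tnorm == 0:
                continue  # flat patch: correlation undefined, skip
            score = sum(tdev[i][j] * pdev[i][j]
                        for i in range(h) for j in range(w)) / (tnorm * pnorm)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

glyph = [[1, 1, 1],
         [0, 1, 0],
         [0, 1, 0]]                      # a tiny "T" character
image = [[0] * 8 for _ in range(8)]
for i in range(3):
    for j in range(3):
        image[2 + i][4 + j] = glyph[i][j]
pos, score = match_template(image, glyph)
```

In a recognition pipeline the same scan is repeated over every template in the character set and the best-matching template labels the segmented glyph.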

  4. Hazard Baseline Downgrade Effluent Treatment Facility

    International Nuclear Information System (INIS)

    Blanchard, A.

    1998-01-01

This Hazard Baseline Downgrade reviews the Effluent Treatment Facility in accordance with Department of Energy Order 5480.23, the WSRC11Q Facility Safety Document Manual, DOE-STD-1027-92, and DOE-EM-STD-5502-94. It provides a baseline grouping based on the chemical and radiological hazards associated with the facility. The determination of the baseline grouping for ETF will aid in establishing the appropriate set of standards for the facility.

  5. Circular polarization control for the LCLS baseline in the soft X-ray regime

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-12-15

The LCLS baseline includes a planar undulator system, which produces intense linearly polarized light in the wavelength range 0.15-1.5 nm. In the soft X-ray wavelength region, polarization control from linear to circular is highly desirable for studying ultrafast magnetic phenomena and material science issues. Several schemes using helical undulators have been discussed in the context of the LCLS. One consists in replacing three of the last planar undulator segments by helical (APPLE III) ones. A second proposal, the 2nd harmonic helical afterburner, is based on the use of short, crossed undulators tuned to the second harmonic. This last scheme is expected to be the better one. Its advantages are a high (over 90%) and stable degree of circular polarization and a low cost. Its disadvantage is a small output power (1% of the power at the fundamental harmonic) and a narrow wavelength range. We propose a novel method to generate 10 GW level power at the fundamental harmonic with 99% degree of circular polarization from the LCLS baseline. Its merits are low cost, simplicity and easy implementation. In the option presented here, the microbunching of the planar undulator is used too. After the baseline undulator, the electron beam is sent through a 40 m long straight section, and subsequently passes through a short helical (APPLE II) radiator. In this case the microbunch structure is easily preserved, and intense coherent radiation is emitted in the helical radiator. The background radiation from the baseline undulator can be easily suppressed by letting radiation and electron beam through horizontal and vertical slits upstream of the helical radiator, where the radiation spot size is about ten times larger than the electron bunch transverse size. Using thin Beryllium foils for the slits, the divergence of the electron beam halo will increase by Coulomb scattering, but the beam will propagate through the setup without electron losses. The applicability of our method is not

  6. Automatic extraction of ontological relations from Arabic text

    Directory of Open Access Journals (Sweden)

    Mohammed G.H. Al Zamil

    2014-12-01

    The proposed methodology has been designed to analyze Arabic text using lexical semantic patterns of the Arabic language according to a set of features. Next, the features have been abstracted and enriched with formal descriptions for the purpose of generalizing the resulting rules. The rules were then used to formulate a classifier that accepts Arabic text, analyzes it, and displays related concepts labeled with their designated relationships. Moreover, to resolve the ambiguity of homonyms, a set of machine translation, text mining, and part-of-speech tagging algorithms has been reused. We performed extensive experiments to measure the effectiveness of our proposed tools. The results indicate that our proposed methodology is promising for automating the process of extracting ontological relations.

  7. The artists' text as work of art

    NARCIS (Netherlands)

    van Rijn, I.A.M.J.

    2017-01-01

    Artists’ texts are texts written and produced by visual artists. With their number increasing since the 2000s, it has become important to clarify their obscure relationship to art institutions. Analysing and comparing four different artists’ texts on a textual level, this research proposes an alternative to

  8. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

    International Nuclear Information System (INIS)

    Swita, W.R.

    1998-01-01

    This document provides a summary of the Tank Waste Remediation System (TWRS) Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost), developed to demonstrate Readiness-to-Proceed (RTP) in support of the TWRS Phase 1B mission. This Updated Baseline is the proposed TWRS plan to execute and measure the mission work scope. This document and other supporting data demonstrate that the TWRS Project Hanford Management Contract (PHMC) team is prepared to fully support Phase 1B by executing the following scope, schedule, and cost baseline activities: Deliver the specified initial low-activity waste (LAW) and high-level waste (HLW) feed batches in a consistent, safe, and reliable manner to support private contractors' operations starting in June 2002; Deliver specified subsequent LAW and HLW feed batches during Phase 1B in a consistent, safe, and reliable manner; Provide for the interim storage of immobilized HLW (IHLW) products and the disposal of immobilized LAW (ILAW) products generated by the private contractors; Provide for disposal of byproduct wastes generated by the private contractors; and Provide the infrastructure to support construction and operations of the private contractors' facilities

  9. Instantaneous Real-Time Kinematic Decimeter-Level Positioning with BeiDou Triple-Frequency Signals over Medium Baselines

    Directory of Open Access Journals (Sweden)

    Xiyang He

    2015-12-01

    Full Text Available Many applications, such as marine navigation, land vehicle location, etc., require real-time precise positioning under medium- or long-baseline conditions. In this contribution, we develop a model of real-time kinematic decimeter-level positioning with BeiDou Navigation Satellite System (BDS) triple-frequency signals over medium distances. The ambiguities of two extra-wide-lane (EWL) combinations are fixed first, and then a wide-lane (WL) combination is reformed from the two EWL combinations for positioning. Theoretical and empirical analyses of the ambiguity fixing rate and the positioning accuracy of the presented method are given. The results indicate that the ambiguity fixing rate can exceed 98% with BDS medium-baseline observations, which is much higher than that of the dual-frequency Hatch-Melbourne-Wübbena (HMW) method. As for positioning accuracy, decimeter-level accuracy can be achieved with this method, which is comparable to that of the carrier-smoothed code differential positioning method. A signal-interruption simulation experiment indicates that the proposed method can realize fast high-precision positioning, whereas the carrier-smoothed code differential positioning method needs several hundred seconds to obtain high-precision results. We conclude that a relatively high accuracy and high fixing rate can be achieved with the triple-frequency WL method using single-epoch observations, a significant advantage compared to the traditional carrier-smoothed code differential positioning method.
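
The wavelengths behind such combinations can be sketched directly from the published BeiDou-2 carrier frequencies; the specific coefficient triples below are common illustrative choices, not necessarily the exact ones used in the paper:

```python
# Sketch: wavelengths of BDS (BeiDou-2) triple-frequency linear combinations.
# B1/B2/B3 are the published carrier frequencies; the combination
# coefficients are illustrative, not taken from the paper.
C = 299_792_458.0  # speed of light, m/s

F = {"B1": 1561.098e6, "B2": 1207.140e6, "B3": 1268.520e6}  # Hz

def combo_wavelength(i, j, k):
    """Wavelength (m) of the combination i*B1 + j*B2 + k*B3."""
    f = i * F["B1"] + j * F["B2"] + k * F["B3"]
    return C / f

# Two extra-wide-lane (EWL) combinations and the classic wide lane (WL):
ewl_1 = combo_wavelength(0, -1, 1)   # B3 - B2
ewl_2 = combo_wavelength(1, 0, -1)   # B1 - B3
wl    = combo_wavelength(1, -1, 0)   # B1 - B2
print(f"EWL(0,-1,1): {ewl_1:.2f} m  EWL(1,0,-1): {ewl_2:.2f} m  WL(1,-1,0): {wl:.3f} m")
```

The point of the EWL combinations is visible in the numbers: a wavelength of several metres makes single-epoch ambiguity fixing far easier than the sub-metre wide lane, at the price of a noisier observable.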

  10. Understanding Kendal aquifer system: a baseline analysis for sustainable water management proposal

    Science.gov (United States)

    Lukman, A.; Aryanto, M. D.; Pramudito, A.; Andhika, A.; Irawan, D. E.

    2017-07-01

    The north coast of Java has grown into a center of economic activity and a major connectivity hub between Sumatra and Bali. Sustainable water management must support this role, and one of its bases is understanding the baseline of groundwater occurrence and potential. However, the complex alluvial aquifer system is not well understood. Geoelectric measurements were performed to determine which rock layers have good potential as groundwater aquifers in the northern coast of Kaliwungu Regency, Kendal District, Central Java province. A total of 10 vertical electrical sounding (VES) points were measured, using a Schlumberger configuration with the current electrode spacing (AB/2) varying between 200 and 300 m and the potential electrode spacing (MN/2) varying between 0.5 and 20 m, with target depths of 150-200 m. Geoelectrical data processing was done using the Ip2win software, which generates the resistivity value, thickness, and depth of subsurface rock layers. Based on the correlation of resistivity values with regional geology, hydrogeology, and local well data, we identify three layers. The first layer is silty clay with resistivity values varying between 0 and 10 ohm.m; the second layer is tuffaceous claystone with resistivity values between 10 and 60 ohm.m. Both serve as impermeable layers. The third layer is sandy tuff with resistivity values between 60 and 100 ohm.m, which serves as a confined aquifer located 70-100 m below the surface; its thickness varies between 70 and 110 m. The aquifer layer is a mixture of volcanic and alluvial sediments, a member of the Damar Formation. The stratification of the aquifer system may change over short distances and depths, which prevents long, continuous correlation between layers. Aquifer discharge is estimated at between 5 and 71 L/s, with the potential deep well locations lying in the western and southeastern parts of the study area. These hydrogeological settings should be used
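
The layer interpretation described in the abstract amounts to a simple threshold rule. A minimal sketch, using the site-specific resistivity ranges reported there (these thresholds are local calibration, not a general-purpose classifier):

```python
# Sketch: map a VES layer resistivity (ohm.m) to the lithology inferred in
# the study. The ranges are site-specific calibration from the abstract.
def classify_layer(rho_ohm_m):
    if rho_ohm_m < 10:
        return "silty clay (impermeable)"
    if rho_ohm_m < 60:
        return "tuffaceous claystone (impermeable)"
    if rho_ohm_m <= 100:
        return "sandy tuff (confined aquifer)"
    return "outside calibrated range"

print(classify_layer(75))  # a reading in the confined-aquifer range
```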

  11. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

    International Nuclear Information System (INIS)

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A.

    2013-01-01

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Data System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in ''astronomical development'' with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009

  12. 10 CFR 850.20 - Baseline beryllium inventory.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the... inventory, the responsible employer must: (1) Review current and historical records; (2) Interview workers...

  13. 40 CFR 80.92 - Baseline auditor requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Baseline auditor requirements. 80.92... (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Anti-Dumping § 80.92 Baseline auditor requirements. (a... determination methodology, resulting baseline fuel parameter, volume and emissions values verified by an auditor...

  14. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster-based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created, the classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...
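
The cluster-then-classify idea can be illustrated on toy data: cluster the training vectors without their labels, train a small classifier inside each cluster, and route a test example through its nearest cluster. This is a minimal sketch with a hand-rolled k-means and per-cluster class centroids, not the paper's actual system:

```python
import numpy as np

# Toy "documents": 2-D feature vectors from two well-separated groups.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

def kmeans(X, k, iters=20):
    # Simple deterministic seeding: evenly spaced samples as initial centers.
    centers = X[:: max(1, len(X) // k)][:k].astype(float).copy()
    for _ in range(iters):
        assign = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(assign == c):
                centers[c] = X[assign == c].mean(axis=0)
    return centers, assign

# Step 1: cluster WITHOUT labels (as in the paper's setup).
centers, assign = kmeans(X, k=2)

# Step 2: per-cluster classifier -- here, class centroids computed from the
# training examples inside each cluster only.
per_cluster = {}
for c in range(2):
    idx = assign == c
    per_cluster[c] = {lbl: X[idx & (y == lbl)].mean(axis=0)
                      for lbl in np.unique(y[idx])}

# Step 3: route a test example to its nearest cluster's (smaller) model.
def predict(x):
    c = int(np.argmin(((centers - x) ** 2).sum(-1)))
    cents = per_cluster[c]
    return min(cents, key=lambda lbl: ((cents[lbl] - x) ** 2).sum())

print(predict(np.array([4.9, 5.1])))
```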

  15. Integrated planning: A baseline development perspective

    International Nuclear Information System (INIS)

    Clauss, L.; Chang, D.

    1994-01-01

    The FEMP Baseline establishes the basis for integrating environmental activity technical requirements with their cost and schedule elements. The result is a path forward to successfully achieving the FERMCO mission. Specific to cost management, the FEMP Baseline has been incorporated into the FERMCO Project Control System (PCS) to provide a time-phased budget plan against which contractor performance is measured with an earned value management system. The result is the Performance Measurement Baseline (PMB), an important tool for keeping costs under control

  16. Supplemental Environmental Baseline Survey for Proposed Land Use Permit Modification for Expansion of the Dynamic Explosive Test Site (DETS) 9940 Main Complex Parking Lot

    Energy Technology Data Exchange (ETDEWEB)

    Peek, Dennis W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    The “subject property” comprises a parcel of land within the Kirtland Military Reservation, Bernalillo County, New Mexico, as shown on the map in Appendix B of this document. The land requirement for the parking lot addition to the 9940 Main Complex is approximately 2.7 acres. The scope of this Supplemental Environmental Baseline Survey (SEBS) is for the parking lot addition land transfer only. For details on the original 9940 Main Complex see Environmental Baseline Survey, Land Use Permit Request for the 9940 Complex PERM/0-KI-00-0001, August 21, 2003, and for details on the 9940 Complex Expansion see Environmental Baseline Survey, Proposed Land Use Permit Expansion for 9940 DETS Complex, June 24, 2009. The 2.7-acre parcel of land for the new parking lot, which is the subject of this EBS (also referred to as the “subject property”), is adjacent to the southwest boundary of the original 12.3-acre 9940 Main Complex. No testing is known to have taken place on the subject property site. The only activity known to have taken place was the burial of overhead utility lines in 2014. Adjacent to the subject property, the 9940 Main Complex was originally a 12.3-acre site used by the Department of Energy (DOE) under a land use permit from the United States Air Force (USAF). Historical use of the site, dating from 1964, included arming, fusing, and firing of explosives and testing of explosive systems components. In the late 1970s and early 1980s, experiments at the 9940 Main Complex shifted toward reactor safety issues. From 1983 to 1988, fuel coolant interaction (FCI) experiments were conducted, as were experiments with conventional high explosives (HE). Today, the land is used for training of the Nuclear Emergency Response community and for research on energetic materials. In 2009, the original complex was expanded to include four additional 20-acre areas: 9940 Training South, 9940 Training East, T-Range 6, and Training West Landing Zone. The proposed use of

  17. Physics with a very long neutrino factory baseline

    International Nuclear Information System (INIS)

    Gandhi, Raj; Winter, Walter

    2007-01-01

    We discuss the neutrino oscillation physics of a very long neutrino factory baseline over a broad range of lengths (between 6000 km and 9000 km), centered on the 'magic baseline' (∼7500 km) where correlations with the leptonic CP phase are suppressed by matter effects. Since the magic baseline depends only on the density, we study the impact of matter density profile effects and density uncertainties over this range, and the impact of detector locations off the optimal baseline. We find that the optimal constant density describing the physics over this entire baseline range is about 5% higher than the average matter density. This implies that the magic baseline is significantly shorter than previously inferred. However, while a single-detector optimization requires fine-tuning of the (very long) baseline length, its combination with a near detector at a shorter baseline is much less sensitive to the far detector location and to uncertainties in the matter density. In addition, we point out different applications of this baseline which go beyond its excellent correlation and degeneracy resolution potential. We demonstrate that such a long baseline assists in the improvement of the θ_13 precision and in the resolution of the octant degeneracy. Moreover, we show that the neutrino data from such a baseline could be used to extract the matter density along the profile up to 0.24% at 1σ for large sin^2(2θ_13), providing a useful discriminator between different geophysical models
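
The quoted ∼7500 km can be checked with a back-of-envelope calculation: the magic baseline is where √2·G_F·n_e·L = 2π, so that the CP-phase-dependent terms vanish. A sketch under standard constants (the mean density and electron fraction used here are assumed values, not taken from the paper):

```python
import math

# Back-of-envelope "magic baseline": sqrt(2)*G_F*n_e*L = 2*pi.
# 7.63e-14 eV is the standard matter potential per (Ye * rho in g/cm^3).
def magic_baseline_km(rho_g_cm3, Ye=0.494):
    V_eV = 7.63e-14 * Ye * rho_g_cm3       # matter potential sqrt(2)*G_F*n_e
    L_inv_eV = 2 * math.pi / V_eV          # baseline in natural units (1/eV)
    hbar_c_m_eV = 1.97327e-7               # conversion factor: (1/eV) -> metres
    return L_inv_eV * hbar_c_m_eV / 1e3    # -> kilometres

# Assumed mean mantle density of ~4.3 g/cm^3 along such a chord:
print(f"{magic_baseline_km(4.3):.0f} km")
```

With a slightly higher effective density, the magic baseline comes out shorter, which is exactly the paper's point about the optimal constant density.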

  18. Baseline Estimation and Outlier Identification for Halocarbons

    Science.gov (United States)

    Wang, D.; Schuck, T.; Engel, A.; Gallman, F.

    2017-12-01

    The aim of this paper is to build a baseline model for halocarbons and to statistically identify outliers under specific conditions. Time series of regional CFC-11 and chloromethane measurements are discussed, taken over the last four years at two locations: a monitoring station northwest of Frankfurt am Main (Germany) and the Mace Head station (Ireland). In addition to analyzing the time series of CFC-11 and chloromethane, a statistical approach to outlier identification is introduced in order to make a better estimation of the baseline. A second-order polynomial plus harmonics is fitted to the CFC-11 and chloromethane mixing-ratio data. Measurements at a large distance from the fitted curve are regarded as outliers and flagged. The routine is applied iteratively, without the flagged measurements, until no additional outliers are found. Both the model fitting and the proposed outlier identification method are implemented in the programming language Python. During the period, CFC-11 shows a gradual downward trend, and there is a slight upward trend in the mixing ratios of chloromethane. The concentration of chloromethane also has a strong seasonal variation, mostly due to the seasonal cycle of OH. The use of this statistical method has a considerable effect on the results: it efficiently identifies a series of outliers according to the standard-deviation requirements. After removing the outliers, the fitted curves and trend estimates are more reliable.
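
The fit-and-flag loop described above can be sketched with NumPy: fit a second-order polynomial plus an annual harmonic by least squares, flag points far from the curve, and refit without them until the flag set stabilizes. The data are synthetic, and the 3σ threshold is an assumed setting, since the abstract does not state the exact criterion:

```python
import numpy as np

# Synthetic "mixing ratio" series: quadratic trend + annual cycle + noise,
# with a few injected pollution spikes to be flagged.
rng = np.random.default_rng(1)
t = np.linspace(0, 4, 400)                           # time in years
true = 550 - 2.0 * t + 0.1 * t**2 + 5 * np.sin(2 * np.pi * t)
y = true + rng.normal(0, 1.0, t.size)
y[::50] += 15                                        # inject 8 spikes

def design(t):
    # Second-order polynomial plus one annual harmonic (sin + cos).
    w = 2 * np.pi
    return np.column_stack([np.ones_like(t), t, t**2,
                            np.sin(w * t), np.cos(w * t)])

keep = np.ones(t.size, dtype=bool)
for _ in range(10):                                  # iterate until stable
    coef, *_ = np.linalg.lstsq(design(t[keep]), y[keep], rcond=None)
    resid = y - design(t) @ coef
    new_keep = np.abs(resid) < 3 * resid[keep].std() # 3-sigma clip (assumed)
    if np.array_equal(new_keep, keep):
        break
    keep = new_keep

print(f"flagged {np.count_nonzero(~keep)} outliers")
```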

  19. Biomarker Identification Using Text Mining

    Directory of Open Access Journals (Sweden)

    Hui Li

    2012-01-01

    Full Text Available Identifying molecular biomarkers has become one of the important tasks for scientists assessing the different phenotypic states of cells or organisms correlated to the genotypes of diseases from large-scale biological data. In this paper, we propose a text-mining-based method to discover biomarkers from PubMed. First, we construct a database based on a dictionary, and then we use a finite state machine to identify the biomarkers. Our text-mining method provides a highly reliable approach to discovering biomarkers in the PubMed database.
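
The dictionary-plus-finite-state-machine step can be illustrated with a trie walked character by character. The biomarker names are invented for illustration, and a real system would also need word-boundary and case handling:

```python
# Toy sketch: dictionary matching with a trie-backed finite state machine.
def build_trie(terms):
    trie = {}
    for term in terms:
        node = trie
        for ch in term.lower():
            node = node.setdefault(ch, {})
        node["$"] = term          # accepting state stores the canonical term
    return trie

def find_biomarkers(text, trie):
    text, hits = text.lower(), []
    for start in range(len(text)):
        node = trie
        for ch in text[start:]:
            if ch not in node:
                break
            node = node[ch]
            if "$" in node:       # reached an accepting state
                hits.append(node["$"])
    return hits

trie = build_trie(["PSA", "CA-125", "troponin"])   # invented dictionary
print(find_biomarkers("Serum PSA and troponin were elevated.", trie))
```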

  20. Exploring the potential of short-baseline physics at Fermilab

    Science.gov (United States)

    Miranda, O. G.; Pasquini, Pedro; Tórtola, M.; Valle, J. W. F.

    2018-05-01

    We study the capabilities of the short-baseline neutrino program at Fermilab to probe the unitarity of the lepton mixing matrix. We find the sensitivity to be slightly better than the current one. Motivated by the future DUNE experiment, we have also analyzed the potential of an extra liquid argon near detector in the LBNF beamline. Adding such a near detector to the DUNE setup will substantially improve the current sensitivity on nonunitarity. This would help to remove CP degeneracies due to the new complex phase present in the neutrino mixing matrix. We also study the sensitivity of our proposed setup to light sterile neutrinos for various configurations.

  1. National greenhouse gas emissions baseline scenarios. Learning from experiences in developing countries

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    This report reviews national approaches to preparing baseline scenarios of greenhouse-gas (GHG) emissions. It does so by describing and comparing in non-technical language existing practices and choices made by ten developing countries - Brazil, China, Ethiopia, India, Indonesia, Kenya, Mexico, South Africa, Thailand and Vietnam. The review focuses on a number of key elements, including model choices, transparency considerations, choices about underlying assumptions and challenges associated with data management. The aim is to improve overall understanding of baseline scenarios and facilitate their use for policy-making in developing countries more broadly. The findings are based on the results of a collaborative project involving a number of activities undertaken by the Danish Energy Agency, the Organisation for Economic Co-operation and Development (OECD) and the UNEP Risoe Centre (URC), including a series of workshops on the subject. The ten contributing countries account for approximately 40% of current global GHG emissions - a share that is expected to increase in the future. The breakdown of emissions by sector varies widely among these countries. In some countries, the energy sector is the leading source of emissions; for others, the land-use sector and/or agricultural sector dominate emissions. The report underscores some common technical and financial capacity gaps faced by developing countries when preparing baseline scenarios. It does not endeavour to propose guidelines for preparing baseline scenarios. Rather, it is hoped that the report will inform any future attempts at preparing such kind of guidelines. (Author)

  2. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Full Text Available Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user-defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
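
The zero-flow baseline feeds a standard calibration step. A minimal sketch, assuming the widely used Granier calibration for thermal dissipation probes (Baseliner's actual pipeline adds QA/QC, visualization, and per-sensor handling on top of this):

```python
# Sketch: sap flux density from a thermal dissipation probe reading and a
# zero-flow baseline, using the Granier calibration (assumed here; the
# abstract does not name the calibration Baseliner applies).
def sap_flux_density(dT, dT_max):
    """Granier: Js = 119e-6 * K**1.231 (m3 m-2 s-1),
    with K = (dT_max - dT) / dT and dT_max the zero-flow baseline."""
    K = (dT_max - dT) / dT
    return 119e-6 * K ** 1.231

# Example: midday reading of 8.0 C against a nighttime baseline of 10.0 C.
js = sap_flux_density(8.0, 10.0)
print(f"{js * 3600 * 1000:.2f} mm/h per unit sapwood area")
```

The whole difficulty the software addresses is choosing dT_max well: a baseline set too low biases every derived flux value.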

  3. Basic Test Framework for the Evaluation of Text Line Segmentation and Text Parameter Extraction

    Directory of Open Access Journals (Sweden)

    Darko Brodić

    2010-05-01

    Full Text Available Text line segmentation is an essential stage in off-line optical character recognition (OCR) systems. It is key because inaccurately segmented text lines will lead to OCR failure. Text line segmentation of handwritten documents is a complex and diverse problem, complicated by the nature of handwriting. Hence, text line segmentation is a leading challenge in handwritten document image processing. Due to inconsistencies in the measurement and evaluation of text segmentation algorithm quality, some basic set of measurement methods is required. Currently, there is no commonly accepted one, and all algorithm evaluation is custom-oriented. In this paper, a basic test framework for the evaluation of text feature extraction algorithms is proposed. This test framework consists of a few experiments primarily linked to text line segmentation, skew rate, and reference text line evaluation. Although they are mutually independent, the results obtained are strongly cross-linked. In the end, its suitability for different types of letters and languages as well as its adaptability are its main advantages. Thus, the paper presents an efficient evaluation method for text analysis algorithms.

  4. Mining consumer health vocabulary from community-generated text.

    Science.gov (United States)

    Vydiswaran, V G Vinod; Mei, Qiaozhu; Hanauer, David A; Zheng, Kai

    2014-01-01

    Community-generated text corpora can be a valuable resource for extracting consumer health vocabulary (CHV) and linking it to professional terminologies and alternative variants. In this research, we propose a pattern-based text-mining approach to identify pairs of CHV and professional terms from Wikipedia, a large text corpus created and maintained by the community. A novel measure, leveraging the ratio of frequency of occurrence, was used to differentiate consumer terms from professional terms. We empirically evaluated the applicability of this approach using a large data sample consisting of MedLine abstracts and all posts from an online health forum, MedHelp. The results show that the proposed approach is able to identify synonymous pairs and label terms as either consumer or professional with high accuracy. We conclude that the proposed approach has great potential to produce a high-quality CHV to improve the performance of computational applications in processing consumer-generated health text.
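
The frequency-ratio idea can be sketched as a simple decision rule: a term used far more often in the community corpus than in the professional corpus is labelled a consumer term, and vice versa. The counts, smoothing, and threshold below are invented for illustration, not the paper's tuned measure:

```python
# Sketch: label a term by the ratio of its corpus frequencies.
def label_term(freq_community, freq_professional, threshold=2.0):
    ratio = (freq_community + 1) / (freq_professional + 1)  # add-one smoothing
    if ratio > threshold:
        return "consumer"
    if ratio < 1 / threshold:
        return "professional"
    return "undecided"

# Invented counts for a synonym pair like "heart attack" / "myocardial infarction":
print(label_term(900, 40))    # heavy community use  -> consumer
print(label_term(12, 700))    # heavy MedLine use    -> professional
```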

  5. BILLIARDS: Baseline Instrumented Lithology Lander, Inspector and Asteroid Redirection Demonstration System

    Science.gov (United States)

    Marcus, Matthew; Sloane, Joshua; Ortiz, Oliver; Barbee, Brent

    2015-01-01

    BILLIARDS (Baseline Instrumented Lithology Lander, Inspector, and Asteroid Redirection Demonstration System) is a proposed demonstration mission for the Billiard-Ball concept, selecting an asteroid pair with a natural close approach to minimize cost and complexity. Primary objectives: rendezvous with a small (10 m) near-Earth ("alpha") asteroid; maneuver the alpha asteroid to a collision with a 100 m ("beta") asteroid; produce a detectable deflection or disruption of the beta asteroid. Secondary objectives: contribute knowledge of asteroid composition and characteristics; contribute knowledge of small-body formation; provide an opportunity for international collaboration.

  6. Sensitivity of amounts and distribution of tropical forest carbon credits depending on baseline rules

    International Nuclear Information System (INIS)

    Griscom, Bronson; Shoch, David; Stanley, Bill; Cortez, Rane; Virgilio, Nicole

    2009-01-01

    One of the largest sources of global greenhouse gas emissions can be addressed through conservation of tropical forests by channeling funds to developing countries at a cost-savings for developed countries. However, questions remain to be resolved in negotiating a system for including reduced emissions from deforestation and forest degradation (REDD) in a post-Kyoto climate treaty. The approach to determine national baselines, or reference levels, for quantifying REDD has emerged as central to negotiations over a REDD mechanism in a post-Kyoto policy framework. The baseline approach is critical to the success of a REDD mechanism because it affects the quantity, credibility, and equity of credits generated from efforts to reduce forest carbon emissions. We compared outcomes of seven proposed baseline approaches as a function of country circumstances, using a retrospective analysis of FAO-FRA data on forest carbon emissions from deforestation. Depending upon the baseline approach used, the total credited emissions avoided ranged over two orders of magnitude for the same quantity of actual emissions reductions. There was also a wide range in the relative distribution of credits generated among the five country types we identified. Outcomes were especially variable for countries with high remaining forest and low rates of deforestation (HFLD). We suggest that the most credible approaches measure emissions avoided with respect to a business-as-usual baseline scenario linked to historic emissions data, and allow limited adjustments based on forest carbon stocks.
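
The sensitivity to the baseline rule is easy to see in miniature: the same actual emissions path yields very different credited reductions under a historic-average rule versus a business-as-usual adjustment. All numbers below are invented for illustration, not taken from the FAO-FRA analysis:

```python
# Sketch: credited reductions = baseline (reference level) - actual emissions.
# Two illustrative baseline rules applied to the same invented country data.
historical = [100, 102, 98, 101, 99]   # MtCO2/yr, pre-crediting period
actual = 80                            # MtCO2/yr during the crediting period

baseline_hist_avg = sum(historical) / len(historical)   # historic-average rule
baseline_bau = baseline_hist_avg * 1.10                 # assumed +10% BAU trend

credits_hist = baseline_hist_avg - actual
credits_bau = baseline_bau - actual
print(f"historic-average rule: {credits_hist:.1f} Mt; BAU rule: {credits_bau:.1f} Mt")
```

For a high-forest, low-deforestation country the historic average is low, so an otherwise identical effort earns far fewer credits, which is the distributional effect the paper quantifies.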

  7. Proposal for modification to forward module design

    International Nuclear Information System (INIS)

    Lindsay, S.; Taylor, G.

    2000-01-01

    Concern for the baseline forward module's thermal and mechanical viability has led to the proposed modification to the design described here. In view of the tight schedule to finalise the module design, the proposed changes are constrained so that calculations and prototyping can be carried out without major changes to the key elements in the module. The following constraints were considered in the course of this work: 1. The hybrid contributes the bulk of the power to be removed from the module. 2. The temperature and its variation across the detector are the key specifications for the cooling design of the module. The hybrid temperature may have an impact via (secondary) convection and radiative heating, but its operating temperature is not assumed to be the major constraint. 3. The forward hybrid design is well advanced and represents a large effort that should be preserved. 4. The overall design of the module, in particular the overall dimensions and placement of precision mounting points, is well advanced. Assembly jigs based upon these dimensions are also advanced. The following problems are addressed by the current proposal: 1. The constraint of the small cooling point required to cool both the hybrid and the detector in the baseline is considered a serious limitation, demanding high performance in the design and implementation of this contact. 2. The small surface area of this contact is critical. Concerns that distortions of the block, or relative distortions in the module between the detector and the hybrid, might further reduce the critical effective contact area, as well as possibly causing other problems, give further impetus to the proposed design modification. 3. Thermo-mechanical stress due to the cooling points at both ends of the module. 4. Lack of support of the hybrid near the cable connectors. 5. Close proximity of the cooling pipe to the front-end electronics and the wire bonds. The proposal involves extending the hybrid substrate with two

  8. Base-line studies for DAE establishments

    International Nuclear Information System (INIS)

    Puranik, V.D.

    2012-01-01

    to ensure that the seasonal variations in parameters are considered. The data is generated for an area covering at least a 30 km radius around the proposed location of the facility, in a manner such that very dense data is generated close to the location and it becomes gradually sparser for the distant areas. Base-line data is generated with the help of local universities and institutions under constant interaction and supervision of the departmental personnel. The work is carried out as per the set protocols laid down by the department for such purposes. The protocols conform to the international practices for carrying out such work. The studies include measurement of concentrations of naturally occurring and man-made radionuclides and also heavy toxic metals and other pollutants in various environmental matrices such as air, sub-soil water, surface water, soil, sediment, biota, and locally consumed food items including meat, fish, milk, eggs, vegetables, and cereals. Studies on the density and variety of flora and fauna in the region are carried out. Health status and demographic status are recorded in detail. Hydrogeological studies are carried out to establish ground water movement at the location. Based on the data so generated, a Remote Sensing and Geographic Information System is prepared to collate the data. For coastal locations, studies of the nearby marine environment are also carried out. The baseline data is a valuable set of information on the environmental status of a location prevailing before the start of the departmental activity. Its importance is twofold: first, because it cannot be generated after the start of the activity at the given location, and second, because it is the most authentic data set which can be referred to later to assess the environmental impact of the facility by way of evaluating the changes in the environmental parameters, if any. (author)

  9. The TDAQ Baseline Architecture

    CERN Multimedia

    Wickens, F J

    The Trigger-DAQ community is currently busy preparing material for the DAQ, HLT and DCS TDR. Over the last few weeks a very important step has been a series of meetings to complete agreement on the baseline architecture. An overview of the architecture indicating some of the main parameters is shown in figure 1. As reported at the ATLAS Plenary during the February ATLAS week, the main area where the baseline had not yet been agreed was around the Read-Out System (ROS) and details in the DataFlow. The agreed architecture has: Read-Out Links (ROLs) from the RODs using S-Link; Read-Out Buffers (ROB) sited near the RODs, mounted in a chassis - today assumed to be a PC, using PCI bus at least for configuration, control and monitoring. The baseline assumes data aggregation, in the ROB and/or at the output (which could either be over a bus or in the network). Optimization of the data aggregation will be made in the coming months, but the current model has each ROB card receiving input from 4 ROLs, and 3 such c...

  10. Flexible frontiers for text division into rows

    Directory of Open Access Journals (Sweden)

    Dan L. Lacrămă

    2009-01-01

    Full Text Available This paper presents an original solution for the flexible division of handwritten text into rows. Unlike the standard procedure, the proposed method avoids amputating the extensions of isolated characters and reduces the recognition error rate in the final stage.

  11. Script-independent text line segmentation in freestyle handwritten documents.

    Science.gov (United States)

    Li, Yi; Zheng, Yefeng; Doermann, David; Jaeger, Stefan; Li, Yi

    2008-08-01

    Text line segmentation in freestyle handwritten documents remains an open document analysis problem. Curvilinear text lines and small gaps between neighboring text lines present a challenge to algorithms developed for machine printed or hand-printed documents. In this paper, we propose a novel approach based on density estimation and a state-of-the-art image segmentation technique, the level set method. From an input document image, we estimate a probability map, where each element represents the probability that the underlying pixel belongs to a text line. The level set method is then exploited to determine the boundary of neighboring text lines by evolving an initial estimate. Unlike connected component based methods ( [1], [2] for example), the proposed algorithm does not use any script-specific knowledge. Extensive quantitative experiments on freestyle handwritten documents with diverse scripts, such as Arabic, Chinese, Korean, and Hindi, demonstrate that our algorithm consistently outperforms previous methods [1]-[3]. Further experiments show the proposed algorithm is robust to scale change, rotation, and noise.
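The density-estimation step of such script-independent line segmentation can be sketched in a few lines. The code below is a hypothetical, much-simplified stand-in for the paper's method: it replaces the level-set boundary evolution with anisotropic Gaussian smoothing (strong horizontally, weak vertically) to build a text-line probability map, then thresholds and labels the connected blobs as lines; all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

def segment_text_lines(binary_img, sigma_y=1.0, sigma_x=6.0, thresh=0.05):
    """Estimate a text-line probability map by anisotropic density
    estimation (wide horizontal smoothing merges characters into line
    blobs), then label each blob as one text line.  Simplified stand-in
    for level-set boundary evolution."""
    density = gaussian_filter(binary_img.astype(float), sigma=(sigma_y, sigma_x))
    prob = density / density.max()          # normalize to [0, 1]
    line_mask = prob > thresh               # crude line-region estimate
    labels, n_lines = label(line_mask)
    return labels, n_lines

# Synthetic page: two horizontal "text lines" made of disconnected character blobs.
page = np.zeros((40, 80), dtype=np.uint8)
for x in range(5, 75, 10):
    page[8:12, x:x+4] = 1    # line 1
    page[26:30, x:x+4] = 1   # line 2

labels, n_lines = segment_text_lines(page)
print(n_lines)  # the two rows of blobs merge into two line regions: 2
```

Because no script-specific features enter the density estimate, the same sketch applies unchanged to Arabic, Chinese, Korean, or Hindi page images.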

  12. A New Approach to Estimate Forest Parameters Using Dual-Baseline Pol-InSAR Data

    Science.gov (United States)

    Bai, L.; Hong, W.; Cao, F.; Zhou, Y.

    2009-04-01

    In POL-InSAR applications using the ESPRIT technique, stable scattering centres are assumed to exist in the forest. However, forest observations suffer severely from volume and temporal decorrelation: the forest scatterers are not as stable as assumed, and the obtained interferometric information is not as accurate as expected. Moreover, the ESPRIT technique cannot identify which interferometric phases correspond to the ground and the canopy, and it provides multiple estimates of the height between two scattering centres because of phase unwrapping. Estimation errors are therefore introduced into the forest height results. To suppress these two types of errors, we use dual-baseline POL-InSAR data to estimate forest height. Dual-baseline coherence optimization is applied to obtain interferometric information of stable scattering centres in the forest. From the interferometric phases for the different baselines, the estimation errors caused by phase unwrapping are resolved, and other estimation errors can be suppressed as well. Experiments are performed on E-SAR L-band POL-InSAR data. The experimental results show that the proposed method provides more accurate forest heights than the ESPRIT technique.
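The benefit of a second baseline for resolving the phase-unwrapping ambiguity can be illustrated numerically. The sketch below is a toy model, not the paper's coherence-optimization method: each baseline yields candidate heights h = (φ + 2πn)/k_z for integer n, and the pair of candidates that agree across the two baselines picks out the unambiguous height. The vertical wavenumbers are hypothetical values.

```python
import numpy as np

def dual_baseline_height(phi1, kz1, phi2, kz2, n_max=3):
    """Resolve the 2*pi phase-unwrapping ambiguity using two baselines:
    enumerate candidate heights h = (phi + 2*pi*n)/kz per baseline and
    return the average of the best-agreeing pair."""
    n = np.arange(-n_max, n_max + 1)
    h1 = (phi1 + 2 * np.pi * n) / kz1          # candidates, baseline 1
    h2 = (phi2 + 2 * np.pi * n) / kz2          # candidates, baseline 2
    diff = np.abs(h1[:, None] - h2[None, :])   # all pairwise disagreements
    i, j = np.unravel_index(np.argmin(diff), diff.shape)
    return 0.5 * (h1[i] + h2[j])

# A 20 m scattering-centre height seen with two hypothetical vertical wavenumbers.
h_true = 20.0
kz1, kz2 = 0.10, 0.16                        # rad/m (illustrative)
phi1 = np.angle(np.exp(1j * kz1 * h_true))   # wrapped interferometric phases
phi2 = np.angle(np.exp(1j * kz2 * h_true))
print(round(dual_baseline_height(phi1, kz1, phi2, kz2), 2))  # 20.0
```

With the single 0.16 rad/m baseline alone, the wrapped phase is also consistent with heights near −19 m and 59 m; the second baseline eliminates those candidates.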

  13. A Novel Approach for Arabic Text Steganography Based on the “BloodGroup” Text Hiding Method

    Directory of Open Access Journals (Sweden)

    S. Malalla,

    2017-04-01

    Full Text Available Steganography is the science of hiding messages (data) within groups of irrelevant data, possibly of another form. Its purpose is covert communication: hiding the existence of a message from an intermediary. Text steganography is the process of embedding a secret message (text) in another text (the cover text) so that the existence of the secret message cannot be detected by a third party. This paper presents a novel approach for text steganography using the Blood Group (BG) method, based on the behaviour of blood groups. Experiments show that the proposed method achieves good results in capacity, time complexity, robustness, visibility, and similarity, which demonstrates its superiority over several existing methods.

  14. The first Malay language storytelling text-to-speech (TTS) corpus for ...

    African Journals Online (AJOL)

    speech annotations are described in detail in accordance with baseline work. The stories were recorded in two speaking styles, a neutral and a storytelling speaking style. The first Malay language storytelling corpus is not only necessary for the development of storytelling text-to-speech (TTS) synthesis. It is also ...

  15. The NuMAX Long Baseline Neutrino Factory Concept

    Energy Technology Data Exchange (ETDEWEB)

    Delahaye, J-P. [SLAC; Ankenbrandt, C. [MUONS Inc., Batavia; Bogacz, A. [Jefferson Lab; Huber, P. [Virginia Tech.; Kirk, H. [Brookhaven; Neuffer, D. [Fermilab; Palmer, M. A. [Fermilab; Ryne, R. [LBL, Berkeley; Snopok, P. [IIT, Chicago

    2018-03-19

    A Neutrino Factory, where neutrinos of all species are produced in equal quantities by muon decay, is described as a facility at the intensity frontier offering exquisite precision, providing ideal conditions for ultimate neutrino studies and an ideal complement to long-baseline facilities like LBNF at Fermilab. It is foreseen to be built in stages with progressively increasing complexity and performance, taking advantage of existing or proposed facilities at an established laboratory like Fermilab. A tentative layout based on a recirculating linac, offering opportunities for considerable savings, is discussed, as well as its possible evolution toward a muon collider if and when requested by physics. Tentative parameters of the various stages are presented, as well as the R&D necessary to address the technological issues and demonstrate their feasibility.

  16. Scene text recognition in mobile applications by character descriptor and structure configuration.

    Science.gov (United States)

    Yi, Chucai; Tian, Yingli

    2014-07-01

    Text characters and strings in natural scenes can provide valuable information for many applications. Extracting text directly from natural scene images or videos is a challenging task because of diverse text patterns and variable background interference. This paper proposes a method of scene text recognition from detected text regions. In text detection, our previously proposed algorithms are applied to obtain text regions from a scene image. First, we design a discriminative character descriptor by combining several state-of-the-art feature detectors and descriptors. Second, we model character structure for each character class by designing stroke configuration maps. Our algorithm design is compatible with the application of scene text extraction on smart mobile devices. An Android-based demo system is developed to show the effectiveness of our proposed method on extracting scene text information from nearby objects. The demo system also provides insight into algorithm design and performance improvement for scene text extraction. The evaluation results on benchmark data sets demonstrate that our proposed scheme of text recognition is comparable with the best existing methods.

  17. Long Baseline Observatory (LBO)

    Data.gov (United States)

    Federal Laboratory Consortium — The Long Baseline Observatory (LBO) comprises ten radio telescopes spanning 5,351 miles. It's the world's largest, sharpest, dedicated telescope array. With an eye...

  18. Hanford Site technical baseline database. Revision 1

    International Nuclear Information System (INIS)

    Porter, P.E.

    1995-01-01

    This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available

  19. DGEMP-OE (2008) Energy Baseline Scenario. Synthesis report

    International Nuclear Information System (INIS)

    2008-01-01

    the CAS scenarios rely primarily on 2000 data, despite the existence of sufficiently complete statistics through to 2005. The DGEMP, on the other hand, used a study by the BIPE (Office for Economic Information and Forecasting) provided by the SESP, the Ministry for Ecology, Energy, Sustainable Development and Spatial Planning's economic statistics and forecasting department. On the basis of the study's macro-economic projections of the French economy to 2020, the DGEMP was able to re-evaluate the prospects for activity in the industrial and tertiary sectors. In several respects (e.g. supply security, CO2 emissions, energy efficiency), the baseline scenario proposed here is clearly not a scenario conducive to satisfying French energy policy objectives. This is not a surprising conclusion, in that it implies the need to implement new policies and measures in addition to those already in place or approved. In particular, this scenario would lead to importing 66 billion cubic meters of gas (59 Mtoe) in 2020 and 78 billion cubic meters (70 Mtoe) in 2030, compared with the present 44 billion cubic meters. In addition to the resulting CO2 emissions, the near doubling of gas imports would pose a twofold problem as to the geographic origin of the gas imported (under appropriate supply contracts) and the infrastructure (LNG terminals, gas pipelines) required to transport it. Finally, the baseline scenario is of course a long way from achieving the Community targets, whether for CO2 emissions, projected to rise continually until 2020 and then even faster until 2030 (due to transport and electric power generation), or for the share of renewable energy in the energy mix. In that regard, the share of renewable energy in 'enlarged' final energy consumption, as described in the 'energy and climate change package', would grow to 13.4% in 2020 (versus 23% in the Commission's burden-sharing proposal) and to 13.7% in 2030, compared with the 10.3% share observed in 2006.

  20. FAIR - Baseline technical report. Executive summary

    International Nuclear Information System (INIS)

    Gutbrod, H.H.; Augustin, I.; Eickhoff, H.; Gross, K.D.; Henning, W.F.; Kraemer, D.; Walter, G.

    2006-09-01

    This document presents the Executive Summary, the first of six volumes comprising the 2006 Baseline Technical Report (BTR) for the international FAIR project (Facility for Antiproton and Ion Research). The BTR provides the technical description, cost, schedule, and assessments of risk for the proposed new facility. The purpose of the BTR is to provide a reliable basis for the construction, commissioning and operation of FAIR. The BTR is one of the central documents requested by the FAIR International Steering Committee (ISC) and its working groups, in order to prepare the legal process and the decisions on the construction and operation of FAIR in an international framework. It provides the technical basis for legal contracts on contributions to be made by, so far, 13 countries within the international FAIR Consortium. The BTR begins with this extended Executive Summary as Volume 1, which is also intended for use as a stand-alone document. The Executive Summary provides brief summaries of the accelerator facilities, the scientific programs and experimental stations, civil construction and safety, and of the work/project structure, costs and schedule. (orig.)

  1. Strategy as Texts

    DEFF Research Database (Denmark)

    Obed Madsen, Søren

    This article shows empirically how managers translate a strategy plan at an individual level. By analysing how managers in three organizations translate strategies, it identifies that the translation happens in two steps: first, the managers decipher the strategy by coding its different parts into four categories; second, the managers produce new texts based on the original strategy document by using four different translation models. The study's findings contribute to three areas. Firstly, the study shows that translation is more than a sociological process: it is also a craftsmanship that requires knowledge and skills, which unfortunately seems to be overlooked in both the literature and in practice. Secondly, it shows that even though a strategy text is singular, translation makes strategy plural. Thirdly, the article proposes a way to open up the black box of what ...

  2. Social Media Text Classification by Enhancing Well-Formed Text Trained Model

    Directory of Open Access Journals (Sweden)

    Phat Jotikabukkana

    2016-09-01

    Full Text Available Social media are a powerful communication tool in our era of digital information. The large amount of user-generated data is a useful novel source of data, even though it is not easy to extract the treasures from this vast and noisy trove. Since classification is an important part of text mining, many techniques have been proposed to classify this kind of information. We developed an effective technique of social media text classification by semi-supervised learning utilizing an online news source consisting of well-formed text. The computer first automatically extracts news categories, well-categorized by publishers, as classes for topic classification. A bag of words taken from news articles provides the initial keywords related to their category in the form of word vectors. The principal task is to retrieve a set of new productive keywords. Term Frequency-Inverse Document Frequency weighting (TF-IDF and Word Article Matrix (WAM are used as main methods. A modification of WAM is recomputed until it becomes the most effective model for social media text classification. The key success factor was enhancing our model with effective keywords from social media. A promising result of 99.50% accuracy was achieved, with more than 98.5% of Precision, Recall, and F-measure after updating the model three times.
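The TF-IDF keyword-vector idea behind this kind of semi-supervised setup can be sketched compactly. The code below is a minimal illustration with made-up toy "news" documents, not the article's WAM recomputation: each class gets a TF-IDF weighted keyword vector built from well-formed text, and a noisy post is assigned to the class with the highest dot-product score.

```python
import math
from collections import Counter

def tfidf_vectors(docs_by_class):
    """Build one TF-IDF weighted keyword vector per class from
    well-formed 'news' documents (a simplified stand-in for a
    Word Article Matrix)."""
    docs = [d for ds in docs_by_class.values() for d in ds]
    n_docs = len(docs)
    df = Counter(w for d in docs for w in set(d.split()))   # document frequency
    vectors = {}
    for cls, ds in docs_by_class.items():
        tf = Counter(w for d in ds for w in d.split())      # term frequency per class
        vectors[cls] = {w: c * math.log(n_docs / df[w]) for w, c in tf.items()}
    return vectors

def classify(text, vectors):
    """Assign a (noisy) social-media post to the class whose keyword
    vector it overlaps most, by a simple dot product."""
    words = Counter(text.split())
    score = lambda v: sum(words[w] * wt for w, wt in v.items())
    return max(vectors, key=lambda cls: score(vectors[cls]))

news = {
    "sports": ["the team won the match", "a late goal decided the match"],
    "finance": ["the market fell as rates rose", "bank shares and rates"],
}
v = tfidf_vectors(news)
print(classify("what a goal by the team", v))  # sports
```

In the article's pipeline, productive keywords harvested from correctly classified social-media posts would then be fed back to re-weight these vectors; the sketch stops at the initial model.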

  3. Quality Inspection of Printed Texts

    DEFF Research Database (Denmark)

    Pedersen, Jesper Ballisager; Nasrollahi, Kamal; Moeslund, Thomas B.

    2016-01-01

    two-folded: for customers of the printing and verification system, the overall grade is used to verify whether the text is of sufficient quality, while for the printer's manufacturer, the detailed character/symbol grades and quality measurements are used for the improvement and optimization of the printing task. The proposed system ...

  4. 75 FR 66748 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  5. Cost-effective way to enhance the capabilities of the LCLS baseline

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2010-08-01

    This paper discusses the potential for enhancing the LCLS hard X-ray FEL capabilities. In the hard X-ray regime, high longitudinal coherence will be the key to such a performance upgrade. The method considered here to obtain high longitudinal coherence is based on a novel single-bunch self-seeding scheme exploiting a single-crystal monochromator, which is extremely compact and can be straightforwardly installed in the LCLS baseline undulator. We present simulation results for the LCLS hard X-ray FEL and show that this method can produce fully coherent X-ray pulses at the 100 GW power level. With the radiation beam monochromatized down to the Fourier transform limit, a variety of very different techniques leading to further improvements of the LCLS performance become feasible. In particular, we describe an efficient way of obtaining full polarization control at the LCLS hard X-ray FEL. We also propose to exploit crystals in the Bragg reflection geometry as movable deflectors for the LCLS X-ray transport systems. The hard X-ray beam can be deflected by an angle of the order of a radian without perturbations; the monochromatization of the output radiation is the key to reaching this result. Finally, we describe a new optical-pump/hard-X-ray-probe technique which will allow time-resolved studies at the LCLS baseline on the femtosecond time scale. The principle of operation of the proposed scheme is essentially based on the use of the time jitter between pump and probe pulses. This eliminates the need for timing XFELs to high-power conventional lasers with femtosecond accuracy. (orig.)

  6. Cost-effective way to enhance the capabilities of the LCLS baseline

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-08-15

    This paper discusses the potential for enhancing the LCLS hard X-ray FEL capabilities. In the hard X-ray regime, high longitudinal coherence will be the key to such a performance upgrade. The method considered here to obtain high longitudinal coherence is based on a novel single-bunch self-seeding scheme exploiting a single-crystal monochromator, which is extremely compact and can be straightforwardly installed in the LCLS baseline undulator. We present simulation results for the LCLS hard X-ray FEL and show that this method can produce fully coherent X-ray pulses at the 100 GW power level. With the radiation beam monochromatized down to the Fourier transform limit, a variety of very different techniques leading to further improvements of the LCLS performance become feasible. In particular, we describe an efficient way of obtaining full polarization control at the LCLS hard X-ray FEL. We also propose to exploit crystals in the Bragg reflection geometry as movable deflectors for the LCLS X-ray transport systems. The hard X-ray beam can be deflected by an angle of the order of a radian without perturbations; the monochromatization of the output radiation is the key to reaching this result. Finally, we describe a new optical-pump/hard-X-ray-probe technique which will allow time-resolved studies at the LCLS baseline on the femtosecond time scale. The principle of operation of the proposed scheme is essentially based on the use of the time jitter between pump and probe pulses. This eliminates the need for timing XFELs to high-power conventional lasers with femtosecond accuracy. (orig.)

  7. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O' Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  8. Detecting text in natural scenes with multi-level MSER and SWT

    Science.gov (United States)

    Lu, Tongwei; Liu, Renjun

    2018-04-01

    The detection of the characters in the natural scene is susceptible to factors such as complex background, variable viewing angle and diverse forms of language, which leads to poor detection results. Aiming at these problems, a new text detection method was proposed, which consisted of two main stages, candidate region extraction and text region detection. At first stage, the method used multiple scale transformations of original image and multiple thresholds of maximally stable extremal regions (MSER) to detect the text regions which could detect character regions comprehensively. At second stage, obtained SWT maps by using the stroke width transform (SWT) algorithm to compute the candidate regions, then using cascaded classifiers to propose non-text regions. The proposed method was evaluated on the standard benchmark datasets of ICDAR2011 and the datasets that we made our own data sets. The experiment results showed that the proposed method have greatly improved that compared to other text detection methods.
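The stroke-width filtering stage can be approximated with a distance transform, since the Euclidean distance transform of a binary region peaks at roughly half the local stroke width; text strokes have near-constant width, so low width variance over a candidate region suggests text. The sketch below is a simplified stand-in for the ray-shooting SWT used in the paper, and its thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def stroke_width_stats(region_mask):
    """Approximate stroke width for a binary candidate region: the
    distance transform reaches about half the local stroke width on
    the stroke's ridge, so 2*d - 1 estimates width in pixels.  Returns
    (mean width, width std) over near-ridge samples; a low std
    suggests a text-like stroke of constant width."""
    d = distance_transform_edt(region_mask)
    ridge = d >= np.maximum(d.max() - 0.5, 1)   # pixels near the ridge
    widths = 2 * d[ridge] - 1
    return widths.mean(), widths.std()

img = np.zeros((20, 40), dtype=bool)
img[8:11, 2:38] = True               # a uniform 3-pixel-thick stroke
mean_w, std_w = stroke_width_stats(img)
print(mean_w, std_w)                 # constant width, zero variance
```

In the full pipeline described above, each MSER candidate would be scored this way before the cascaded classifiers prune the remaining non-text regions.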

  9. Quality baseline of the castilla blackberry (Rubus glaucus in its food chain

    Directory of Open Access Journals (Sweden)

    Fernanda Iza

    2016-09-01

    Full Text Available A proposal for improving the performance of the food chain of the castilla blackberry (Rubus glaucus), in order to raise its productivity, can only start from a baseline, or situational diagnosis, of fruit quality, from which the main points for improvement can be identified. The food chain of this fruit comprises three stages: harvest, post-harvest (storage and transport), and marketing or sale. The diagnosis of each stage was carried out in reverse order. The most representative producer was identified, together with the supplier serving the traders at the point of sale. Fruit quality was evaluated through chemical and physical characterization at the four stages. Weight losses were evident at all stages, with slight, non-significant changes of color from a bright bluish red at harvest to an opaque bluish red at sale, owing to the short cycle time and the non-climacteric character of the fruit. However, at all stages of collection, storage, transportation and sale, the maturity indices changed significantly, reflecting an increase in sugars, a decrease in pH, and an increase in acidity. The results indicate that the fruit changed its physicochemical characteristics during the stages of the food chain, affecting its productivity.

  10. Dynamic Chemical Model for $\text{H}_2$/$\text{O}_2$ Combustion Developed Through a Community Workflow

    KAUST Repository

    Oreluk, James; Needham, Craig D.; Baskaran, Sathya; Sarathy, Mani; Burke, Michael P.; West, Richard H.; Frenklach, Michael; Westmoreland, Phillip R.

    2018-01-01

    Elementary-reaction models for $\text{H}_2$/$\text{O}_2$ combustion were evaluated and optimized through a collaborative workflow, establishing accuracy and characterizing uncertainties. Quantitative findings were the optimized model, the importance of $\text{H}_2 + \text{O}_2(^1\Delta) = \text{H} + \text{HO}_2$ in high-pressure flames, and the inconsistency of certain low-temperature shock-tube data. The workflow described here is proposed to be even more important because the approach and publicly available cyberinfrastructure allow future community development of evolving improvements. The workflow steps applied here were to develop an initial reaction set using Burke et al. [2012], Burke et al. [2013], Sellevag et al. [2009], and Konnov [2015]; test it for thermodynamic and kinetic consistency and plausibility against other sets in the literature; assign estimated uncertainties where not stated in the sources; select key data targets (

  11. Dynamic Chemical Model for $\text{H}_2$/$\text{O}_2$ Combustion Developed Through a Community Workflow

    KAUST Repository

    Oreluk, James

    2018-01-30

    Elementary-reaction models for $\text{H}_2$/$\text{O}_2$ combustion were evaluated and optimized through a collaborative workflow, establishing accuracy and characterizing uncertainties. Quantitative findings were the optimized model, the importance of $\text{H}_2 + \text{O}_2(^1\Delta) = \text{H} + \text{HO}_2$ in high-pressure flames, and the inconsistency of certain low-temperature shock-tube data. The workflow described here is proposed to be even more important because the approach and publicly available cyberinfrastructure allow future community development of evolving improvements. The workflow steps applied here were to develop an initial reaction set using Burke et al. [2012], Burke et al. [2013], Sellevag et al. [2009], and Konnov [2015]; test it for thermodynamic and kinetic consistency and plausibility against other sets in the literature; assign estimated uncertainties where not stated in the sources; select key data targets (

  12. Building Background Knowledge through Reading: Rethinking Text Sets

    Science.gov (United States)

    Lupo, Sarah M.; Strong, John Z.; Lewis, William; Walpole, Sharon; McKenna, Michael C.

    2018-01-01

    To increase reading volume and help students access challenging texts, the authors propose a four-dimensional framework for text sets. The quad text set framework is designed around a target text: a challenging content area text, such as a canonical literary work, research article, or historical primary source document. The three remaining…

  13. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, B.C.; Menne, T.; Johansen, J.D.

    2008-01-01

    Background: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. Objectives: To examine associations of 21 allergens in the European baseline series with polysensitization. Patients/Methods: From a database-based study of 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. Results: No common denominator for the association between the allergens and polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as a risk indicator for polysensitization.
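The per-allergen odds-ratio calculation mentioned above is straightforward to reproduce. The sketch below uses made-up counts for a hypothetical allergen (only the cohort sizes echo the abstract), together with a 95% confidence interval from Woolf's log-OR standard error; it is an illustration of the statistic, not the study's data.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: exposed cases a, exposed non-cases b,
    unexposed cases c, unexposed non-cases d, with a 95% CI from the
    log-OR standard error (Woolf's method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
    return or_, (lo, hi)

# Hypothetical allergen: positive in 60 of 759 polysensitized patients
# versus 800 of the remaining 14 239 patients (counts are invented).
or_, ci = odds_ratio(60, 759 - 60, 800, 14239 - 800)
print(round(or_, 2), tuple(round(x, 2) for x in ci))  # 1.44 (1.1, 1.89)
```

A confidence interval straddling 1, as here, is exactly the kind of weak association the study reports for the baseline allergens.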

  14. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, Berit Christina; Menné, Torkil; Johansen, Jeanne Duus

    2008-01-01

    BACKGROUND: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. OBJECTIVES: To examine associations of 21 allergens in the European baseline series with polysensitization. PATIENTS/METHODS: From a database-based study of 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. RESULTS: No common denominator for the association between the allergens and polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as a risk indicator for polysensitization.

  15. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  16. Baselines and test data for cross-lingual inference

    DEFF Research Database (Denmark)

    Agic, Zeljko; Schluter, Natalie

    2018-01-01

    The recent years have seen a revival of interest in textual entailment, sparked by i) the emergence of powerful deep neural network learners for natural language processing and ii) the timely development of large-scale evaluation datasets such as SNLI. Recast as natural language inference, the problem now amounts to detecting the relation between pairs of statements: they either contradict or entail one another, or they are mutually neutral. Current research in natural language inference is effectively exclusive to English. In this paper, we propose to advance the research in SNLI-style natural language inference toward multilingual evaluation. To that end, we provide test data for four major languages: Arabic, French, Spanish, and Russian. We experiment with a set of baselines. Our systems are based on cross-lingual word embeddings and machine translation. While our best system scores an average ...
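NLI work typically also reports trivial lexical baselines against which embedding- or MT-based systems are compared. The sketch below is one such hypothetical word-overlap baseline, not one of the paper's systems: high coverage of the hypothesis words by the premise suggests entailment, an introduced negation suggests contradiction, and anything else is neutral. The threshold and the tiny negation list are illustrative choices.

```python
def overlap_baseline(premise, hypothesis, entail_t=0.65,
                     contra_words=frozenset({"not", "no", "never"})):
    """Trivial lexical-overlap NLI baseline: label a premise/hypothesis
    pair as entailment, contradiction, or neutral from surface words
    alone (no embeddings, no translation)."""
    p, h = set(premise.lower().split()), set(hypothesis.lower().split())
    if (h & contra_words) - p:        # negation in hypothesis, absent in premise
        return "contradiction"
    coverage = len(p & h) / len(h)    # fraction of hypothesis covered by premise
    return "entailment" if coverage >= entail_t else "neutral"

print(overlap_baseline("a man is playing a guitar on stage",
                       "a man is playing a guitar"))        # entailment
print(overlap_baseline("a man is playing a guitar on stage",
                       "the man is not playing"))           # contradiction
```

Such a baseline is language-agnostic in form, which is precisely why it is a useful floor when evaluation moves beyond English to Arabic, French, Spanish, and Russian test data.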

  17. Baseline prevalence and longitudinal evolution of non-motor symptoms in early Parkinson's disease: the PPMI cohort.

    Science.gov (United States)

    Simuni, Tanya; Caspell-Garcia, Chelsea; Coffey, Christopher S; Weintraub, Daniel; Mollenhauer, Brit; Lasch, Shirley; Tanner, Caroline M; Jennings, Danna; Kieburtz, Karl; Chahine, Lana M; Marek, Kenneth

    2018-01-01

    To examine the baseline prevalence and longitudinal evolution of non-motor symptoms (NMS) in a prospective cohort of patients with, at baseline, de novo Parkinson's disease (PD), compared with healthy controls (HC). The Parkinson's Progression Markers Initiative (PPMI) is a longitudinal, ongoing, controlled study of de novo PD participants and HC. NMS were rated using the Movement Disorder Society Unified Parkinson's Disease Rating Scale (MDS-UPDRS) Part I score and other validated NMS scales at baseline and after 2 years. Biological variables included cerebrospinal fluid (CSF) markers and dopamine transporter imaging. 423 PD subjects and 196 HC were enrolled and followed for 2 years. MDS-UPDRS Part I total mean (SD) scores increased from 5.6 (4.1) at baseline to 7.7 (5.0) at year 2 in PD subjects. Higher baseline NMS score was associated with female sex (p=0.008) and higher baseline MDS-UPDRS Part II scores. There was no association with the dose or class of dopaminergic therapy. This study of NMS in early PD identified clinical and biological variables associated with both baseline burden and predictors of progression. The association of a greater longitudinal increase in NMS with a lower baseline Aβ1-42 level is an important finding that will have to be replicated in other cohorts. ClinicalTrials.gov identifier: NCT01141023.

  18. Business-as-Unusual: Existing policies in energy model baselines

    International Nuclear Information System (INIS)

    Strachan, Neil

    2011-01-01

    Baselines are generally accepted as a key input assumption in long-term energy modelling, but energy models have traditionally been poor at identifying baseline assumptions. Notably, transparency about the current policy content of model baselines is now especially critical, as long-term climate mitigation policies have been under way for a number of years. This paper argues that the range of existing energy and emissions policies is an integral part of any long-term baseline, and hence already represents a 'with-policy' baseline, termed here Business-as-Unusual (BAuU). Crucially, existing energy policies are not a sunk effort; as the impacts of existing policy initiatives are targeted at future years, they may be revised through iterative policy making, and their quantitative effectiveness requires ex-post verification. To assess the long-term role of existing policies in energy modelling, currently identified UK policies are explicitly stripped out of the UK MARKAL Elastic Demand (MED) optimisation energy system model to generate a BAuU (with-policy) and a REF (without-policy) baseline. In terms of long-term mitigation costs, policy-baseline assumptions are comparable to another key exogenous modelling assumption: global fossil fuel prices. Therefore, best practice in energy modelling would be to have both a no-policy reference baseline and a current-policy reference baseline (BAuU). At a minimum, energy modelling studies should include a transparent assessment of the current policy contained within the baseline. Clearly identifying and comparing policy-baseline assumptions is required for cost-effective and objective policy making; otherwise energy models will underestimate the true cost of long-term emissions reductions.

  19. Pakistan, Sindh Province - Baseline Indicators System : Baseline Procurement Performance Assessment Report

    OpenAIRE

    World Bank

    2009-01-01

    This document provides an assessment of the public procurement system in Sindh province using the baseline indicators system developed by the Development Assistance Committee of the Organization for Economic Cooperation and Development (OECD-DAC). As part of this assessment, interviews and discussions were held with stakeholders from the public and private sectors as well as civil society. Developing...

  20. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  1. Baseline radionuclide concentrations in soils and vegetation around the proposed Weapons Engineering Tritium Facility and the Weapons Subsystems Laboratory at TA-16

    International Nuclear Information System (INIS)

    Fresquez, P.R.; Ennis, M.

    1995-09-01

    A preoperational environmental survey is required by the Department of Energy (DOE) for all federally funded research facilities that have the potential to cause adverse impacts on the environment. Therefore, in accordance with DOE Order 5400.1, an environmental survey was conducted over the proposed sites of the Weapons Engineering Tritium Facility (WETF) and the Weapons Subsystems Laboratory (WSL) at Los Alamos National Laboratory (LANL) at TA-16. Baseline concentrations of tritium (3H), plutonium (238Pu and 239Pu), and total uranium were measured in soils, vegetation (pine needles and oak leaves), and ground litter. Tritium was also measured from air samples, while cesium (137Cs) was measured in soils. The mean concentration of airborne tritiated water during 1987 was 3.9 pCi/m3. Although the mean annual concentration of 3H in soil moisture at the 0--5 cm (2 in) soil depth was measured at 0.6 pCi/mL, a better background level, based on long-term regional data, was considered to be 2.6 pCi/mL. Mean values for 137Cs, 238Pu, 239Pu, and total uranium in soils collected from the 0--5 cm depth were 1.08 pCi/g, 0.0014 pCi/g, 0.0325 pCi/g, and 4.01 microg/g, respectively. Ponderosa pine (Pinus ponderosa) needles contained higher values of 238Pu, 239Pu, and total uranium than did leaves collected from Gambel's oak (Quercus gambelii). In contrast, leaves collected from Gambel's oak contained higher levels of 137Cs than did pine needles.

  2. A Relational Reasoning Approach to Text-Graphic Processing

    Science.gov (United States)

    Danielson, Robert W.; Sinatra, Gale M.

    2017-01-01

    We propose that research on text-graphic processing could be strengthened by the inclusion of relational reasoning perspectives. We briefly outline four aspects of relational reasoning: "analogies", "anomalies", "antinomies", and "antitheses". Next, we illustrate how text-graphic researchers have been…

  3. Comparison of Document Index Graph Using TextRank and HITS Weighting Method in Automatic Text Summarization

    Science.gov (United States)

    Hadyan, Fadhlil; Shaufiah; Arif Bijaksana, Moch.

    2017-01-01

    Automatic summarization is a system that helps someone grasp the core information of a long text instantly by summarizing the text automatically. Many summarization systems have already been developed, but many problems remain in those systems. This final project proposes a summarization method based on a document index graph. The method adapts the PageRank and HITS formulas, originally used to rank web pages, to score the words in the sentences of a text document. The expected outcome of this final project is a system that summarizes a single document by using a document index graph with TextRank and HITS to improve the quality of the summary automatically.
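
    A graph-based sentence ranking of the kind this record describes can be sketched in a few lines. The sketch below is a generic TextRank-style power iteration over a word-overlap similarity graph, not the record's exact document-index-graph formulation; the tokenizer, similarity measure, and damping factor are illustrative assumptions.

```python
# TextRank-style sketch: sentences are graph nodes, normalized word overlap
# gives edge weights, and a PageRank power iteration scores each sentence.

def textrank_sentences(sentences, damping=0.85, iters=50):
    words = [set(s.lower().split()) for s in sentences]
    n = len(sentences)
    # Edge weight: word overlap normalized by combined sentence length.
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j and words[i] and words[j]:
                w[i][j] = len(words[i] & words[j]) / (len(words[i]) + len(words[j]))
    scores = [1.0 / n] * n
    for _ in range(iters):
        new = []
        for i in range(n):
            rank = 0.0
            for j in range(n):
                out = sum(w[j])
                if w[j][i] > 0 and out > 0:
                    rank += w[j][i] / out * scores[j]
            new.append((1 - damping) / n + damping * rank)
        scores = new
    return scores

sents = ["the cat sat on the mat",
         "the cat chased the mouse",
         "stock prices fell sharply today"]
scores = textrank_sentences(sents)
# The two related sentences reinforce each other and outrank the unrelated one.
```

    A summary is then produced by keeping the top-scoring sentences in document order.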

  4. Relationship of Baseline Hemoglobin Level with Serum Ferritin, Postphlebotomy Hemoglobin Changes, and Phlebotomy Requirements among HFE C282Y Homozygotes

    Directory of Open Access Journals (Sweden)

    Seyed Ali Mousavi

    2015-01-01

    Full Text Available Objectives. We aimed to examine whether baseline hemoglobin levels in C282Y-homozygous patients are related to the degree of serum ferritin (SF) elevation and whether patients with different baseline hemoglobin have different phlebotomy requirements. Methods. A total of 196 patients (124 males and 72 females) who had undergone therapeutic phlebotomy and had SF and both pre- and posttreatment hemoglobin values were included in the study. Results. Bivariate correlation analysis suggested that baseline SF explains approximately 6 to 7% of the variation in baseline hemoglobin. The results also showed that males who had higher (≥150 g/L) baseline hemoglobin levels had a significantly greater reduction in their posttreatment hemoglobin despite requiring fewer phlebotomies to achieve iron depletion than those who had lower (<150 g/L) baseline hemoglobin, regardless of whether baseline SF was below or above 1000 µg/L. There were no significant differences between hemoglobin subgroups regarding baseline and treatment characteristics, except for transferrin saturation between male subgroups with SF above 1000 µg/L. Similar differences were observed when females with higher (≥138 g/L) baseline hemoglobin were compared with those with lower (<138 g/L) baseline hemoglobin. Conclusion. Dividing C282Y-homozygous patients into just two subgroups according to the degree of baseline SF elevation may obscure important subgroup variations.

  5. FAQs about Baseline Testing among Young Athletes

    Science.gov (United States)

    ... a similar exam conducted by a health care professional during the season if an athlete has a suspected concussion. Baseline testing generally takes place during the pre-season—ideally prior to the first practice. It is important to note that some baseline ...

  6. 75 FR 74706 - Notice of Baseline Filings

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings November 24, 2010. Centana Intrastate Pipeline, LLC. Docket No. PR10-84-001. Centana Intrastate Pipeline, LLC... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  7. Ontology Assisted Formal Specification Extraction from Text

    Directory of Open Access Journals (Sweden)

    Andreea Mihis

    2010-12-01

    Full Text Available In the field of knowledge processing, ontologies are the most important means: they make it possible for the computer to better understand natural language and to make judgments. In this paper, a method that uses ontologies in the semi-automatic extraction of formal specifications from a natural language text is proposed.

  8. Translating genetic research into preventive intervention: The baseline target moderated mediator design

    Directory of Open Access Journals (Sweden)

    George W. Howe

    2016-01-01

    Full Text Available In this paper we present and discuss a novel research approach, the baseline target moderated mediation (BTMM) design, that holds substantial promise for advancing our understanding of how genetic research can inform prevention research. We first discuss how genetically informed research on developmental psychopathology can be used to identify potential intervention targets. We then describe the BTMM design, which employs moderated mediation within a longitudinal study to test whether baseline levels of intervention targets moderate the impact of the intervention on change in that target, and whether change in those targets mediates the causal impact of preventive or treatment interventions on distal health outcomes. We next discuss how genetically informed BTMM designs can be applied to both microtrials and full-scale prevention trials. We end with a discussion of some of the advantages and limitations of this approach.

  9. Baseline Plasma C-Reactive Protein Concentrations and Motor Prognosis in Parkinson Disease.

    Directory of Open Access Journals (Sweden)

    Atsushi Umemura

    Full Text Available C-reactive protein (CRP), a blood inflammatory biomarker, is associated with the development of Alzheimer disease. In animal models of Parkinson disease (PD), systemic inflammatory stimuli can promote neuroinflammation and accelerate dopaminergic neurodegeneration. However, the association between long-term systemic inflammation and neurodegeneration has not been assessed in PD patients. Objective: to investigate the longitudinal effects of baseline CRP concentrations on motor prognosis in PD. Methods: retrospective analysis of 375 patients (mean age, 69.3 years; mean PD duration, 6.6 years). Plasma concentrations of high-sensitivity CRP were measured in the absence of infections, and Unified Parkinson's Disease Rating Scale Part III (UPDRS-III) scores were measured at five follow-up intervals (Days 1-90, 91-270, 271-450, 451-630, and 631-900). Outcome: change of UPDRS-III scores from baseline to each of the five follow-up periods. Results: the change in UPDRS-III scores was significantly greater in PD patients with CRP concentrations ≥0.7 mg/L than in those with CRP concentrations <0.7 mg/L, as determined by a generalized estimation equation model (P = 0.021) for the entire follow-up period and by a generalized regression model (P = 0.030) for the last follow-up interval (Days 631-900). The regression coefficients of baseline CRP for the two periods were 1.41 (95% confidence interval [CI] 0.21-2.61) and 2.62 (95% CI 0.25-4.98), respectively, after adjusting for sex, age, baseline UPDRS-III score, dementia, and incremental L-dopa equivalent dose. Conclusions: baseline plasma CRP levels were associated with motor deterioration and predicted motor prognosis in patients with PD. These associations were independent of sex, age, PD severity, dementia, and anti-Parkinsonian agents, suggesting that subclinical systemic inflammation could accelerate neurodegeneration in PD.

  10. Geochemical baseline studies of soil in Finland

    Science.gov (United States)

    Pihlaja, Jouni

    2017-04-01

    Soil element concentrations vary considerably across Finland, mostly because of differences in bedrock type, which are reflected in soil composition. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the present phase, the research focuses on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts, or to measure the natural state of soil in industrial areas or mine districts. The field work is done by taking soil samples, typically at a depth of 0-10 cm. Sampling sites are chosen to represent the areas most sensitive to human exposure to potentially toxic soil element contents: playgrounds, day-care centers, schools, parks, and residential areas. In the mine districts, samples are taken from areas located outside the zones affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is freely available to all users via a web browser. Through this map service it is possible to calculate regional soil baseline values using the geochemical data stored in the service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).
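
    The abstract does not state which statistic TAPIR uses for a regional baseline value. As a hedged illustration only (an assumed convention, not taken from the source), geochemical mapping often reports an upper baseline limit such as a high percentile of the regional sample concentrations:

```python
# Illustrative regional baseline: the 95th percentile of sample
# concentrations, computed with linear interpolation between order statistics.

def percentile(values, p):
    s = sorted(values)
    k = (len(s) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

# Hypothetical nickel concentrations (mg/kg) from one region.
ni_mg_per_kg = [4.1, 5.0, 5.2, 6.3, 6.8, 7.0, 7.7, 8.4, 9.9, 21.5]
upper_baseline = percentile(ni_mg_per_kg, 95)
```

    Samples above such a limit would then be flagged for closer inspection, e.g. as candidates for remediation assessment.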

  11. Mouse Chromosome 4 Is Associated with the Baseline and Allergic IgE Phenotypes

    Directory of Open Access Journals (Sweden)

    Cynthia Kanagaratham

    2017-08-01

    Full Text Available Regulation of IgE concentration in the blood is a complex trait, with high concentrations associated with parasitic infections as well as allergic diseases. A/J strain mice have significantly higher plasma concentrations of IgE, both at baseline and after ovalbumin antigen exposure, when compared to C57BL/6J strain mice. Our objective was to determine the genomic regions associated with this difference in phenotype. To achieve this, we used a panel of recombinant congenic strains (RCS) derived from A/J and C57BL/6J strains. We measured IgE in the RCS panel at baseline and following allergen exposure. Using marker-by-marker analysis of the RCS genotype and phenotype data, we identified multiple regions associated with the IgE phenotype. A single region was associated with baseline IgE level, while multiple regions were associated with the phenotype after allergen exposure. The most significant region was found on Chromosome 4, from 81.46 to 86.17 Mbp. Chromosome 4 substitution strain mice had significantly higher concentrations of IgE than their background parental strain, C57BL/6J. Our data present multiple candidate regions associated with plasma IgE concentration at baseline and following allergen exposure, with the most significant one located on Chromosome 4.

  12. Overfitting Reduction of Text Classification Based on AdaBELM

    Directory of Open Access Journals (Sweden)

    Xiaoyue Feng

    2017-07-01

    Full Text Available Overfitting is an important problem in machine learning. Several algorithms, such as the extreme learning machine (ELM, suffer from this issue when facing high-dimensional sparse data, e.g., in text classification. One common issue is that the extent of overfitting is not well quantified. In this paper, we propose a quantitative measure of overfitting referred to as the rate of overfitting (RO and a novel model, named AdaBELM, to reduce the overfitting. With RO, the overfitting problem can be quantitatively measured and identified. The newly proposed model can achieve high performance on multi-class text classification. To evaluate the generalizability of the new model, we designed experiments based on three datasets, i.e., the 20 Newsgroups, Reuters-21578, and BioMed corpora, which represent balanced, unbalanced, and real application data, respectively. Experiment results demonstrate that AdaBELM can reduce overfitting and outperform classical ELM, decision tree, random forests, and AdaBoost on all three text-classification datasets; for example, it can achieve 62.2% higher accuracy than ELM. Therefore, the proposed model has a good generalizability.
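
    The abstract does not reproduce the exact formula of the proposed rate of overfitting (RO). As a rough, hypothetical stand-in for quantifying overfitting, one common convention is the relative gap between training and test accuracy:

```python
# Hedged illustration (not the paper's RO definition): the fraction of
# training accuracy that is lost when moving from training to test data.

def overfit_gap(train_acc, test_acc):
    """Return the relative train-test accuracy gap, clipped at zero."""
    if train_acc <= 0:
        raise ValueError("training accuracy must be positive")
    return max(0.0, (train_acc - test_acc) / train_acc)

gap = overfit_gap(0.99, 0.62)  # a large gap signals heavy overfitting
```

    A model like AdaBELM that reduces overfitting would show a smaller gap at comparable training accuracy.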

  13. Text Clustering Algorithm Based on Random Cluster Core

    Directory of Open Access Journals (Sweden)

    Huang Long-Jun

    2016-01-01

    Full Text Available Clustering has become a popular text mining algorithm, but huge data volumes place higher demands on the accuracy and performance of text mining. In view of the performance bottleneck of traditional text clustering algorithms, this paper proposes a text clustering algorithm with random features. It is a clustering algorithm based on text density that also uses neighboring heuristic rules; the concept of a random cluster core is introduced, which effectively reduces the complexity of the distance calculation.
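
    The core idea of a random cluster core can be sketched as follows. This is a hedged illustration, not the paper's algorithm: a few documents are drawn at random as cluster cores and every document is assigned to its most similar core, which avoids computing all pairwise distances. The tokenizer, the cosine similarity, and the explicit `cores` override (added here so the example is deterministic) are assumptions.

```python
import random

# Random-core clustering sketch: distance computations drop from O(n^2)
# (all pairs) to O(n * k) (each document against k random cores).

def bow(text):
    counts = {}
    for tok in text.lower().split():
        counts[tok] = counts.get(tok, 0) + 1
    return counts

def cosine(a, b):
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = (sum(v * v for v in a.values()) ** 0.5) * (sum(v * v for v in b.values()) ** 0.5)
    return num / den if den else 0.0

def random_core_cluster(docs, k, seed=0, cores=None):
    rng = random.Random(seed)
    vecs = [bow(d) for d in docs]
    if cores is None:
        cores = rng.sample(range(len(docs)), k)  # random cluster cores
    # Each document joins the cluster of its most similar core.
    return [max(cores, key=lambda c: cosine(vecs[i], vecs[c])) for i in range(len(docs))]

docs = ["apple banana fruit", "banana apple smoothie",
        "car engine oil", "engine oil car repair"]
labels = random_core_cluster(docs, k=2, cores=[0, 2])
# Fruit documents share core 0; car documents share core 2.
```

    The paper's density and neighboring heuristics would refine how cores are chosen; here they are selected uniformly at random (or passed in explicitly).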

  14. Efficient Text Encryption and Hiding with Double-Random Phase-Encoding

    Directory of Open Access Journals (Sweden)

    Mohammad S. Alam

    2012-10-01

    Full Text Available In this paper, a double-random phase-encoding technique-based text encryption and hiding method is proposed. First, the secret text is transformed into a 2-dimensional array and the higher bits of the elements in the transformed array are used to store the bit stream of the secret text, while the lower bits are filled with specific values. Then, the transformed array is encoded with double-random phase-encoding technique. Finally, the encoded array is superimposed on an expanded host image to obtain the image embedded with hidden data. The performance of the proposed technique, including the hiding capacity, the recovery accuracy of the secret text, and the quality of the image embedded with hidden data, is tested via analytical modeling and test data stream. Experimental results show that the secret text can be recovered either accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data by properly selecting the method of transforming the secret text into an array and the superimposition coefficient. By using optical information processing techniques, the proposed method has been found to significantly improve the security of text information transmission, while ensuring hiding capacity at a prescribed level.
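
    The optical core of the method, classical double-random phase encoding (DRPE) of a 2-D array, can be sketched with FFTs. The text-to-array packing and host-image superimposition steps described in the record are omitted; mask generation and array shapes here are illustrative.

```python
import numpy as np

# DRPE sketch: one random phase mask in the input plane, one in the Fourier
# plane. Decryption simply runs the steps in reverse with the same masks.

def drpe_encrypt(img, rng):
    m1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane mask
    m2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane mask
    enc = np.fft.ifft2(np.fft.fft2(img * m1) * m2)
    return enc, (m1, m2)

def drpe_decrypt(enc, masks):
    m1, m2 = masks
    # Unit-modulus masks, so dividing undoes each multiplication exactly.
    return np.fft.ifft2(np.fft.fft2(enc) / m2) / m1

rng = np.random.default_rng(42)
secret = rng.random((8, 8))            # stand-in for the packed text array
cipher, keys = drpe_encrypt(secret, rng)
recovered = drpe_decrypt(cipher, keys).real
```

    Without both masks the ciphertext is statistically noise-like, which is what gives the scheme its security in the optical setting.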

  15. Can Text Messages Increase Empathy and Prosocial Behavior? The Development and Initial Validation of Text to Connect

    Science.gov (United States)

    Konrath, Sara; Falk, Emily; Fuhrel-Forbis, Andrea; Liu, Mary; Swain, James; Tolman, Richard; Cunningham, Rebecca; Walton, Maureen

    2015-01-01

    To what extent can simple mental exercises cause shifts in empathic habits? Can we use mobile technology to make people more empathic? It may depend on how empathy is measured. Scholars have identified a number of different facets and correlates of empathy. This study is among the first to take a comprehensive, multidimensional approach to empathy to determine how empathy training could affect these different facets and correlates. In doing so, we can learn more about empathy and its multifaceted nature. Participants (N = 90) were randomly assigned to receive either an empathy-building text message program (Text to Connect) or one of two control conditions (active versus passive). Respondents completed measures of dispositional empathy (i.e. self-perceptions of being an empathic person), affective empathy (i.e. motivations to help, immediate feelings of empathic concern), and prosocial behavior (i.e. self-reports and observer-reports) at baseline, and then again after the 14-day intervention period. We found that empathy-building messages increased affective indicators of empathy and prosocial behaviors, but actually decreased self-perceptions of empathy, relative to control messages. Although the brief text messaging intervention did not consistently impact empathy-related personality traits, it holds promise for the use of mobile technology for changing empathic motivations and behaviors. PMID:26356504

  16. Text Summarization Using FrameNet-Based Semantic Graph Model

    Directory of Open Access Journals (Sweden)

    Xu Han

    2016-01-01

    Full Text Available Text summarization is the task of generating a condensed version of an original document. The major issues for text summarization are eliminating redundant information, identifying important differences among documents, and recovering the informative content. This paper proposes a FrameNet-based Semantic Graph Model (FSGM) that exploits the semantic information of sentences. FSGM treats sentences as vertexes and the semantic relationships between them as edges, and uses FrameNet and word embeddings to calculate the similarity of sentences. The method assigns weights to both sentence nodes and edges. Finally, it proposes an improved method to rank these sentences, considering both internal and external information. The experimental results show that applying the model to summarize text is feasible and effective.

  17. Rotation-invariant features for multi-oriented text detection in natural images.

    Directory of Open Access Journals (Sweden)

    Cong Yao

    Full Text Available Texts in natural scenes carry rich semantic information, which can be used to assist a wide range of applications, such as object recognition, image/video retrieval, mapping/navigation, and human-computer interaction. However, most existing systems are designed to detect and recognize horizontal (or near-horizontal) texts. Due to the increasing popularity of mobile-computing devices and applications, detecting texts of varying orientations from natural images under less controlled conditions has become an important but challenging task. In this paper, we propose a new algorithm to detect texts of varying orientations. Our algorithm is based on a two-level classification scheme and two sets of features specially designed for capturing the intrinsic characteristics of texts. To better evaluate the proposed method and compare it with the competing algorithms, we generate a comprehensive dataset with various types of texts in diverse real-world scenes. We also propose a new evaluation protocol, which is more suitable for benchmarking algorithms for detecting texts in varying orientations. Experiments on benchmark datasets demonstrate that our system compares favorably with the state-of-the-art algorithms when handling horizontal texts and achieves significantly enhanced performance on variant texts in complex natural scenes.

  18. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    Science.gov (United States)

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT using the laser system, which measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to the patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detected already at the second fraction. Over 90% of the baseline drift occurs during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a baseline drift of 5 mm or larger at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients. No clear baseline drift was observed during the first treatment session; however, baseline drift was markedly present in the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
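
    A minimal way to quantify baseline drift from a logged trace can be sketched as follows. This is an illustration, not the authors' exact pipeline: it assumes that averaging the AP position over one breathing cycle isolates the slow baseline component, and the window length and synthetic trace are made up for the example.

```python
import math

# Drift sketch: smooth out the breathing oscillation with a one-cycle moving
# average, then report the net change of the smoothed baseline (in mm;
# negative values mean posterior drift, as in the study).

def moving_average(x, w):
    return [sum(x[max(0, i - w + 1):i + 1]) / len(x[max(0, i - w + 1):i + 1])
            for i in range(len(x))]

def baseline_drift(trace_mm, samples_per_breath):
    base = moving_average(trace_mm, samples_per_breath)
    return base[-1] - base[0]

# Synthetic trace: a 4 mm peak-to-peak breathing oscillation superimposed on
# a slow -1.3 mm linear drift over the session.
n = 600
trace = [2.0 * math.sin(2 * math.pi * i / 20) - 1.3 * i / (n - 1) for i in range(n)]
drift = baseline_drift(trace, samples_per_breath=20)
```

    On real traces the drift estimate could then be compared against the positioning error from portal imaging, as the study does.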

  19. Extracting Baseline Electricity Usage Using Gradient Tree Boosting

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taehoon [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Lee, Dongeun [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Choi, Jaesik [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Spurlock, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-05

    To understand how specific interventions affect a process observed over time, we need to control for the other factors that influence outcomes. Such a model that captures all factors other than the one of interest is generally known as a baseline. In our study of how different pricing schemes affect residential electricity consumption, the baseline would need to capture the impact of outdoor temperature along with many other factors. In this work, we examine a number of different data mining techniques and demonstrate Gradient Tree Boosting (GTB) to be an effective method to build the baseline. We train GTB on data prior to the introduction of new pricing schemes, and apply the known temperature following the introduction of new pricing schemes to predict electricity usage with the expected temperature correction. Our experiments and analyses show that the baseline models generated by GTB capture the core characteristics over the two years with the new pricing schemes. In contrast to the majority of regression based techniques which fail to capture the lag between the peak of daily temperature and the peak of electricity usage, the GTB generated baselines are able to correctly capture the delay between the temperature peak and the electricity peak. Furthermore, subtracting this temperature-adjusted baseline from the observed electricity usage, we find that the resulting values are more amenable to interpretation, which demonstrates that the temperature-adjusted baseline is indeed effective.
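
    The workflow described above, training a boosted model on pre-intervention data and using its temperature-corrected predictions as the baseline, can be sketched as follows. To stay self-contained, this toy version boosts one-feature regression stumps on squared-error residuals rather than calling a GTB library (a real study would use one, e.g. scikit-learn, with many more covariates); the data and parameters are illustrative.

```python
# Gradient-boosting sketch: repeatedly fit a decision stump to the current
# residuals and add a shrunken copy of it to the model.

def fit_stump(x, r):
    # Best single threshold minimizing squared error of a two-level fit.
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        t = (x[order[k - 1]] + x[order[k]]) / 2
        left = [r[i] for i in range(len(x)) if x[i] <= t]
        right = [r[i] for i in range(len(x)) if x[i] > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((v - lm) ** 2 for v in left) + sum((v - rm) ** 2 for v in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    return best[1:]

def boost(x, y, rounds=200, lr=0.1):
    pred = [sum(y) / len(y)] * len(x)
    stumps = []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        t, lm, rm = fit_stump(x, resid)
        stumps.append((t, lm, rm))
        pred = [p + lr * (lm if xi <= t else rm) for p, xi in zip(pred, x)]
    base = sum(y) / len(y)
    def predict(xi):
        return base + lr * sum(lm if xi <= t else rm for t, lm, rm in stumps)
    return predict

# Toy pre-intervention data: usage rises with temperature (cooling load).
temps = [10, 12, 15, 18, 21, 24, 27, 30, 33, 36]
usage = [5, 5, 6, 7, 9, 12, 16, 21, 27, 34]
baseline = boost(temps, usage)
# Post-intervention, observed usage minus the temperature-adjusted baseline
# approximates the effect of the new pricing scheme.
adjusted = 30 - baseline(33)
```

    Subtracting the baseline in this way is exactly the step the study finds makes the remaining signal "more amenable to interpretation".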

  20. 75 FR 57268 - Notice of Baseline Filings

    Science.gov (United States)

    2010-09-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-103-000; Docket No. PR10-104-000; Docket No. PR10-105- 000 (Not Consolidated)] Notice of Baseline Filings September 13..., 2010, and September 10, 2010, respectively the applicants listed above submitted their baseline filing...

  1. Probing Neutrino Properties with Long-Baseline Neutrino Beams

    International Nuclear Information System (INIS)

    Marino, Alysia

    2015-01-01

    This final report covers an Early Career Award grant that began on April 15, 2010 and concluded on April 14, 2015. Alysia Marino's research is focused on making precise measurements of neutrino properties using intense accelerator-generated neutrino beams. As a part of this grant, she is collaborating on the Tokai-to-Kamioka (T2K) long-baseline neutrino experiment, currently taking data in Japan, and on the Deep Underground Neutrino Experiment (DUNE) design effort for a future Long-Baseline Neutrino Facility (LBNF) in the US. She is also a member of the NA61/SHINE particle production experiment at CERN, but as that effort is supported by other funds, it will not be discussed further here. T2K was designed to search for the disappearance of muon neutrinos (ν_μ) and the appearance of electron neutrinos (ν_e), using a muon neutrino beam that travels 295 km across Japan towards the Super-Kamiokande detector. In 2011 T2K first reported indications of ν_e appearance, a previously unobserved mode of neutrino oscillations. In the past year, T2K has published a combined analysis of ν_μ disappearance and ν_e appearance, and began taking data with a beam of anti-neutrinos, instead of neutrinos, to search for hints of violation of the CP symmetry of the universe. The proposed DUNE experiment has similar physics goals to T2K, but will be much more sensitive due to its more massive detectors and new higher-intensity neutrino beam. This effort will be a very high-priority particle physics project in the US over the next decade.

  2. Trends in Large Proposal Development at Major Research Institutions

    Science.gov (United States)

    Mulfinger, Lorraine M.; Dressler, Kevin A.; James, L. Eric; Page, Niki; Serrano, Eduardo; Vazquez, Jorge

    2016-01-01

    Research administrator interest in large research proposal development and submission support is high, arguably in response to the bleak funding landscape for research and federal agency trends toward making more frequent larger awards. In response, a team from Penn State University and Huron Consulting Group initiated a baseline study to…

  3. 75 FR 65010 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-21

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-1-000; Docket No. PR11-2-000; Docket No. PR11-3-000] Notice of Baseline Filings October 14, 2010. Cranberry Pipeline Docket..., 2010, respectively the applicants listed above submitted their baseline filing of its Statement of...

  4. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Partnership, ALMA [Astrophysics Research Institute, Liverpool John Moores University, IC2, Liverpool Science Park, 146 Brownlow Hill, Liverpool L3 5RF (United Kingdom); Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S. [Joint ALMA Observatory, Alonso de Córdova 3107, Vitacura, Santiago (Chile); Lucas, R. [Institut de Planétologie et d’Astrophysique de Grenoble (UMR 5274), BP 53, F-38041 Grenoble Cedex 9 (France); Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Asaki, Y. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Matsushita, S. [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Hills, R. E. [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Richards, A. M. S. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Broguiere, D., E-mail: efomalon@nrao.edu [Institut de Radioastronomie Millimétrique (IRAM), 300 rue de la Piscine, Domaine Universitaire, F-38406 Saint Martin d’Hères (France); and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.

  5. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    International Nuclear Information System (INIS)

    Partnership, ALMA; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W.; Asaki, Y.; Matsushita, S.; Hills, R. E.; Richards, A. M. S.; Broguiere, D.

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.

  6. Baseline assessment of fish and benthic communities of the Flower Garden Banks (2010 - present) using remotely operated vehicle (ROV) survey methods: 2011

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The proposed work will develop baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys will employ...

  7. 76 FR 8725 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings Enstor Grama Ridge Storage and Docket No. PR10-97-002. Transportation, L.L.C.. EasTrans, LLC Docket No. PR10-30-001... revised baseline filing of their Statement of Operating Conditions for services provided under section 311...

  8. 76 FR 5797 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-114-001; Docket No. PR10-129-001; Docket No. PR10-131- 001; Docket No. PR10-68-002 Not Consolidated] Notice of Baseline... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  9. Office of Geologic Repositories program baseline procedures notebook (OGR/B-1)

    International Nuclear Information System (INIS)

    1986-06-01

    Baseline management is typically applied to aid in the internal control of a program by providing consistent programmatic direction, control, and surveillance to an evolving system development. This fundamental concept of internal program control involves the establishment of a baseline to serve as a point of departure for consistent technical program coordination and to control subsequent changes from that baseline. The existence of a program-authorized baseline ensures that all participants are working to the same ground rules. Baseline management also ensures that, once the baseline is defined, changes are assessed and approved by a process which ensures adequate consideration of overall program impact. Baseline management also includes the consideration of exemptions from the baseline. The process of baseline management continues through all the phases of an evolving system development program. As the Program proceeds, there will be a progressive increase in the data contained in the baseline documentation. Baseline management has been selected as a management technique to aid in the internal control of the Office of Geologic Repositories (OGR) program. Specifically, an OGR Program Baseline, including technical and programmatic requirements, is used for program control of the four Mined Geologic Disposal System field projects, i.e., Basalt Waste Isolation Project, Nevada Nuclear Waste Storage Investigation, Salt Repository Project and Crystalline Repository Project. This OGR Program Baseline Procedures Notebook provides a description of the baseline management concept, establishes the OGR Program baseline itself, and provides procedures to be followed for controlling changes to that baseline. The notebook has a controlled distribution and will be updated as required.

  10. Pilot study of psychotherapeutic text messaging for depression.

    Science.gov (United States)

    Pfeiffer, Paul N; Henry, Jennifer; Ganoczy, Dara; Piette, John D

    2017-08-01

    Background Text messaging services could increase access to psychotherapeutic content for individuals with depression by avoiding barriers to in-person psychotherapy such as cost, transportation, and therapist availability. Determining whether text messages reflecting different psychotherapeutic techniques exhibit differences in acceptability or effectiveness may help guide service development. Objectives We aimed to determine: (1) the feasibility of delivering a psychotherapy-based text messaging service to people with depression identified via the internet, (2) whether there is variation in satisfaction with messages according to the type of psychotherapeutic technique they represent, and (3) whether symptoms of depression vary according to receipt of each message type and participants' satisfaction with the messages they received. Methods For this study 190 US adults who screened positive for a major depressive episode (Patient Health Questionnaire (PHQ-9) score ≥10) were recruited from online advertisements. Participants received a daily psychotherapy-based text message 6 days per week for 12 weeks. Text messages were developed by a team of psychiatrists, psychologists, and social workers to reflect three psychotherapeutic approaches: acceptance and commitment therapy (ACT), behavioural activation, and cognitive restructuring. Each week the message type for the week was randomly assigned from one of the three types, allowing for repeats. Participants were asked daily to rate each message. On the 7th day of each week, participants completed a two-item depression screener (PHQ-2). Web-based surveys at baseline, 6, and 12 weeks were used as the primary measure of depressive symptoms (PHQ-9). Results Of the 190 participants enrolled, 85 (45%) completed the 6-week web survey and 67 (35%) completed the 12-week survey. The mean baseline PHQ-9 score was 19.4 (SD 4.2) and there was a statistically significant mean improvement in PHQ-9 scores of -2.9 (SD 6.0; p

  11. Learning From Short Text Streams With Topic Drifts.

    Science.gov (United States)

    Li, Peipei; He, Lu; Wang, Haiyan; Hu, Xuegang; Zhang, Yuhong; Li, Lei; Wu, Xindong

    2017-09-18

    Short text streams such as search snippets and micro blogs have been popular on the Web with the emergence of social media. Unlike traditional text streams, these data present the characteristics of short length, weak signal, high volume, high velocity, topic drift, etc. Short text stream classification is hence a very challenging and significant task. However, this challenge has received little attention from the research community. Therefore, a new feature extension approach is proposed for short text stream classification with the help of a large-scale semantic network obtained from a Web corpus. It is built on an incremental ensemble classification model for efficiency. First, more semantic contexts based on the senses of terms in short texts are introduced to make up for the data sparsity using the open semantic network, in which all terms are disambiguated by their semantics to reduce the noise impact. Second, a concept cluster-based topic drifting detection method is proposed to effectively track hidden topic drifts. Finally, extensive studies demonstrate that, compared to several well-known concept drifting detection methods for data streams, our approach can detect topic drifts effectively, and it enables handling short text streams effectively while maintaining efficiency compared to several state-of-the-art short text classification approaches.

  12. 75 FR 70732 - Notice of Baseline Filings

    Science.gov (United States)

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-71-000; Docket No. PR11-72-000; Docket No. PR11-73- 000] Notice of Baseline Filings November 10, 2010. Docket No. PR11-71-000..., 2010, the applicants listed above submitted their baseline filing of their Statement of Operating...

  13. Precise baseline determination for the TanDEM-X mission

    Science.gov (United States)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive to generate a global precise Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, already launched on June 15, 2007, and the TanDEM-X satellite to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called Integrated GPS Occultation Receiver (IGOR), and a Laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives as the millimeter level determination of the baseline or distance between the two spacecraft is needed to derive meter level accurate DEMs. Within the TanDEM-X ground segment GFZ is responsible for the operational provision of precise baselines. For this, GFZ uses two software chains, first its Earth Parameter and Orbit System (EPOS) software and second the BERNESE software, for backup purposes and quality control. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Using GRACE as an example, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the intersatellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and therefore may be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent baselines by GFZ and DLR, the accuracy can even be increased. The K-band validation however covers solely the along-track component as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the
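
    The merging step mentioned above (combining the independent GFZ and DLR baseline solutions to increase accuracy) can be sketched as an inverse-variance weighted combination. The abstract does not state the actual merging algorithm, and the figures below are hypothetical:

```python
def combine_estimates(x1, var1, x2, var2):
    """Inverse-variance weighted combination of two independent estimates of
    the same baseline component; the merged variance is below either input's."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    return (w1 * x1 + w2 * x2) / (w1 + w2), 1.0 / (w1 + w2)

# hypothetical along-track baseline estimates (mm) from two processing chains
merged, merged_var = combine_estimates(1250.3, 0.64, 1250.9, 0.81)
print(round(merged, 2), round(merged_var, 3))  # → 1250.56 0.358
```

    The merged variance (0.358 mm²) is smaller than either input variance, which is the sense in which merging "can even increase" the accuracy.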

  14. Robust keyword retrieval method for OCRed text

    Science.gov (United States)

    Fujii, Yusaku; Takebe, Hiroaki; Tanaka, Hiroshi; Hotta, Yoshinobu

    2011-01-01

    Document management systems have become important because of the growing popularity of electronic filing of documents and scanning of books, magazines, manuals, etc., through a scanner or a digital camera, for storage or reading on a PC or an electronic book. Text information acquired by optical character recognition (OCR) is usually added to the electronic documents for document retrieval. Since texts generated by OCR generally include character recognition errors, robust retrieval methods have been introduced to overcome this problem. In this paper, we propose a retrieval method that is robust against both character segmentation and recognition errors. In the proposed method, allowing noise characters to be inserted into, and characters to be dropped from, the keyword during matching provides robustness against character segmentation errors, while substituting each keyword character with its OCR recognition candidates (or any other character) provides robustness against character recognition errors. The recall rate of the proposed method was 15% higher than that of the conventional method. However, the precision rate was 64% lower.
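
    As a generic illustration of this kind of error-tolerant matching (a plain approximate-search sketch, not the authors' exact algorithm), a Levenshtein-distance scan tolerates inserted noise characters, dropped characters, and OCR substitutions:

```python
def edit_distance(a, b):
    # classic Levenshtein DP: insertions, deletions, substitutions each cost 1
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,       # dropped character
                          d[i][j - 1] + 1,       # inserted noise character
                          d[i - 1][j - 1] + cost)  # OCR confusion (substitution)
    return d[m][n]

def fuzzy_find(keyword, ocr_text, max_errors=1):
    """Start offsets where `keyword` matches within `max_errors` edits."""
    hits = []
    k = len(keyword)
    for start in range(len(ocr_text)):
        # windows slightly shorter/longer than the keyword absorb segmentation errors
        for width in (k - 1, k, k + 1):
            if width <= 0 or start + width > len(ocr_text):
                continue
            if edit_distance(keyword, ocr_text[start:start + width]) <= max_errors:
                hits.append(start)
                break
    return hits

print(fuzzy_find("baseline", "the basel1ne was corrected"))  # → [4], despite '1' for 'i'
```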

  15. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  16. Impact of Baseline Central Retinal Thickness on Outcomes in the VIVID-DME and VISTA-DME Studies

    Directory of Open Access Journals (Sweden)

    Edoardo Midena

    2018-01-01

    Full Text Available Purpose. To report the impact of baseline central retinal thickness (CRT) on outcomes in patients with diabetic macular edema (DME) in VIVID-DME and VISTA-DME. Methods. Post hoc analyses of two randomized controlled trials in which 862 DME patients were randomized 1 : 1 : 1 to treatment with intravitreal aflibercept 2.0 mg every 4 weeks (2q4), intravitreal aflibercept 2.0 mg every 8 weeks after five initial monthly doses (2q8), or macular laser photocoagulation at baseline and as needed. We compared visual and anatomical outcomes in subgroups of patients with baseline CRT < 400 μm and ≥400 μm. Results. At weeks 52 and 100, outcomes with intravitreal aflibercept 2q4 and 2q8 were superior to those in laser control-treated patients regardless of baseline CRT. When looked at in a binary fashion, the treatment effect of intravitreal aflibercept versus laser was not significantly better in the ≥400 μm than in the <400 μm group; when looked at as a continuous variable, baseline CRT seemed to have an impact on the treatment effect of intravitreal aflibercept versus laser. Conclusions. Post hoc analyses of VIVID-DME and VISTA-DME demonstrated the benefits of intravitreal aflibercept treatment in DME patients with baseline CRT < 400 μm and ≥400 μm. This trial is registered with NCT01331681 and NCT01363440.

  17. IPCC Socio-Economic Baseline Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — The Intergovernmental Panel on Climate Change (IPCC) Socio-Economic Baseline Dataset consists of population, human development, economic, water resources, land...

  18. Ecological risk assessments for the baseline condition for the Port Hope and Port Granby Projects

    International Nuclear Information System (INIS)

    Hart, D.R.; Kleb, H.

    2006-01-01

    Baseline ecological risk assessments were completed in and around the areas where cleanup of low-level radioactive waste (LLRW) and marginally contaminated soil (MCS) is planned under the Port Hope Area Initiative (PHAI). Both aquatic and terrestrial environments were assessed, in the vicinity of the proposed waste management facilities near Welcome and Port Granby, in locations potentially influenced by LLRW and MCS that will be cleaned up in future, and in reference locations that are not potentially influenced. The calculated doses and risk quotients suggest potential radiation effects for pre-cleanup benthic invertebrates in Port Hope Harbour, for any ducks feeding exclusively in this area, and for soil invertebrates in some other waste sites. In addition, risk quotients suggest potential baseline effects from some elements, particularly uranium and arsenic, in localized areas that are influenced by LLRW and MCS. (author)
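
    The risk quotients referred to above are conventionally computed as the ratio of an estimated exposure to a toxicity benchmark. A minimal sketch with hypothetical numbers (the benchmark value below is assumed for illustration, not taken from the assessment):

```python
def risk_quotient(exposure, benchmark):
    """Screening-level risk quotient: estimated exposure (dose rate or
    concentration) divided by a toxicity benchmark, in matching units;
    RQ >= 1 flags the potential for effects."""
    return exposure / benchmark

# hypothetical screening: a dose rate to benthic invertebrates against an
# assumed ecological benchmark of 400 (same units)
rq = risk_quotient(520.0, 400.0)
print(rq, rq >= 1.0)  # → 1.3 True
```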

  19. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed to help stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)
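
    For crediting purposes, the baseline concept defined above reduces to a simple difference: emission reductions are baseline emissions minus project emissions, less any leakage. A minimal sketch with hypothetical figures:

```python
def certified_emission_reductions(baseline_tco2e, project_tco2e, leakage_tco2e=0.0):
    """Emission reductions credited to a CDM project: baseline emissions minus
    project emissions minus leakage, all in tonnes of CO2-equivalent."""
    return baseline_tco2e - project_tco2e - leakage_tco2e

# hypothetical figures: a grid-electricity baseline vs. a wind-power project activity
print(certified_emission_reductions(120_000.0, 15_000.0, leakage_tco2e=2_000.0))  # → 103000.0
```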

  20. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed to help stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  1. Baseline budgeting for continuous improvement.

    Science.gov (United States)

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  2. Safe and sensible preprocessing and baseline correction of pupil-size data.

    Science.gov (United States)

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
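
    The contrast between subtractive and divisive correction described above can be sketched directly. The numbers are hypothetical; the point is that an unrealistically small baseline (e.g. from a partial blink) only shifts subtractively corrected data by a constant, but inflates divisively corrected data multiplicatively:

```python
def baseline_correct(trace, baseline, method="subtractive"):
    """Correct a pupil-size trace against its baseline-period mean.
    Subtractive: size - baseline; divisive: size / baseline."""
    if method == "subtractive":
        return [s - baseline for s in trace]
    if method == "divisive":
        return [s / baseline for s in trace]
    raise ValueError(method)

trace = [1200.0, 1210.0, 1220.0]  # pupil sizes in arbitrary recorder units
good_baseline = 1200.0            # realistic baseline-period mean
bad_baseline = 100.0              # unrealistically small, e.g. a half-closed eyelid

# subtractive correction: the distortion is a constant additive offset
print([d - g for d, g in zip(baseline_correct(trace, bad_baseline),
                             baseline_correct(trace, good_baseline))])  # → [1100.0, 1100.0, 1100.0]
# divisive correction: the distortion is a multiplicative factor of 12
print([d / g for d, g in zip(baseline_correct(trace, bad_baseline, "divisive"),
                             baseline_correct(trace, good_baseline, "divisive"))])
```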

  3. Performance Measurement Baseline Change Request

    Data.gov (United States)

    Social Security Administration — The Performance Measurement Baseline Change Request template is used to document changes to scope, cost, schedule, or operational performance metrics for SSA's Major...

  4. Framework for ensuring appropriate maintenance of baseline PSA and risk monitor models in a nuclear power plant

    International Nuclear Information System (INIS)

    Vrbanic, I.; Sorman, J.

    2005-01-01

    The necessity of observing both long term and short term risk changes often imposes the need for a nuclear power plant to have a baseline PSA model to produce an estimate of long term averaged risk and a risk monitor to produce a time-dependent risk curve and/or safety function status at points in time or over a shorter time period of interest. By nature, a baseline PSA reflects plant systems and operation in terms of average conditions and provides time-invariant quantitative risk metrics. A risk monitor, on the other hand, requires condition-specific modeling to produce a quantitative and/or qualitative estimate of the plant's condition-specific risk metrics. While the risk monitor is used for computing condition-specific risk metrics over time, a baseline PSA model is needed for a variety of other risk-oriented applications, such as assessments of proposed design modifications or risk ranking of equipment. Having in mind their importance and roles, it is essential that both models, i.e. the baseline PSA model and the risk monitor, are maintained in such a way that they represent, as accurately as practically achievable, the actual plant status (e.g. systems' design and plant's procedures in effect) and its history (e.g. numbers of equipment failures and demands that influence relevant PSA parameters). The paper discusses the requirements for appropriate maintenance of the plant's baseline PSA model and risk monitor model and presents the framework for the plant's engineering and administrative procedures that would ensure they are met. (author)

  5. Arabic Text Categorization Using Improved k-Nearest neighbour Algorithm

    Directory of Open Access Journals (Sweden)

    Wail Hamood KHALED

    2014-10-01

    Full Text Available The quantity of text information published in the Arabic language on the net requires the implementation of effective techniques for the extraction and classification of relevant information contained in large corpora of texts. In this paper we present an implementation of an enhanced k-NN Arabic text classifier. We apply the traditional k-NN and Naive Bayes classifiers from the Weka Toolkit for comparison purposes. Our proposed modified k-NN algorithm features an improved decision rule that skips the classes that are less similar and identifies the right class from the k nearest neighbours, which increases accuracy. The study evaluates the improved decision rule technique using the standard measures of recall, precision, and F-measure as the basis of comparison. We conclude that the effectiveness of the proposed classifier is promising and that it outperforms the classical k-NN classifier.
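
    The abstract does not spell out the exact decision rule; the sketch below illustrates the general idea with an assumed rule that skips neighbours much less similar than the best match before voting (cosine similarity over term vectors — both the rule and the data are illustrative only):

```python
from collections import Counter
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def knn_classify(query, train, k=3, drop_ratio=0.5):
    """k-NN with an assumed pruned decision rule: neighbours whose similarity
    falls below drop_ratio * (similarity of the best neighbour) are skipped
    before majority voting. `train` is a list of (vector, label) pairs."""
    neighbours = sorted(((cosine(query, v), label) for v, label in train),
                        key=lambda t: t[0], reverse=True)[:k]
    best = neighbours[0][0]
    votes = Counter(label for sim, label in neighbours if sim >= drop_ratio * best)
    return votes.most_common(1)[0][0]

train = [([1.0, 0.0, 0.0], "sport"),
         ([0.9, 0.1, 0.0], "sport"),
         ([0.0, 0.0, 1.0], "politics")]
# the dissimilar "politics" neighbour is skipped rather than allowed to vote
print(knn_classify([1.0, 0.05, 0.0], train, k=3))  # → sport
```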

  6. Geochemical baseline level and function and contamination of phosphorus in Liao River Watershed sediments of China.

    Science.gov (United States)

    Liu, Shaoqing; Wang, Jing; Lin, Chunye; He, Mengchang; Liu, Xitao

    2013-10-15

    The quantitative assessment of P contamination in sediments is a challenge due to sediment heterogeneity and the lack of geochemical background or baseline levels. In this study, a procedure was proposed to determine the average P background level and P geochemical baseline level (GBL) and to develop P geochemical baseline functions (GBF) for riverbed sediments of the Liao River Watershed (LRW). The LRW has two river systems - the Liao River System (LRS) and the Daliao River System (DRS). Eighty-eight samples were collected and analyzed for P, Al, Fe, Ca, organic matter, pH, and texture. The results show that Fe can be used as a better particle-size proxy to construct the GBF of P (P (mg/kg) = 39.98 + 166.19 × Fe (%), R(2) = 0.835, n = 66). The GBL of P was 675 mg/kg, while the average background level of P was 355 mg/kg. Because many large cities are located in the DRS watershed, most of the contaminated sites were located within the DRS, and the riverbed sediments were more contaminated by P in the DRS watershed than in the LRS watershed. The geochemical background and baseline information of P are of great importance in managing P levels within the LRW. Copyright © 2013 Elsevier Ltd. All rights reserved.
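
    The fitted baseline function quoted in the abstract can be applied directly to screen samples: predict the Fe-expected baseline P, then compare the measured P against it. The sample values below are hypothetical:

```python
def baseline_p(fe_percent):
    """Geochemical baseline function fitted in the study:
    P (mg/kg) = 39.98 + 166.19 * Fe (%), Fe acting as a particle-size proxy."""
    return 39.98 + 166.19 * fe_percent

def enrichment_ratio(p_measured_mg_kg, fe_percent):
    """Measured P over the Fe-predicted baseline; values well above 1
    suggest anthropogenic P enrichment."""
    return p_measured_mg_kg / baseline_p(fe_percent)

# hypothetical sediment sample: 3% Fe, 950 mg/kg measured P
print(round(baseline_p(3.0), 2))               # → 538.55
print(round(enrichment_ratio(950.0, 3.0), 2))  # → 1.76
```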

  7. The texting and driving epidemic : changing norms to change behavior.

    Science.gov (United States)

    2013-09-01

    This campaign was created to reduce texting and driving and to increase awareness of the serious risks involved with texting and driving. The target audience of the campaign is University of Kansas students. This plan proposes an Anti-Texting and ...

  8. Geographical baselines of sustainable planning of the regional development of Zasavje region

    Directory of Open Access Journals (Sweden)

    Dušan Plut

    2002-12-01

    Full Text Available Geographical baselines for planning regional development and interventions in the geographical environment derive from the concept of permanently adjusting anthropogenic changes in the landscape to the specific capacities and limitations of landscape-forming components. In the landscape-degraded region of Zasavje, the improvement of environmental quality (curative measures) and regional economic progress within the carrying capacities of the space (preventative measures) are the primary, developmentally and environmentally devised goals of the development strategy.

  9. What oral text reading fluency can reveal about reading comprehension

    NARCIS (Netherlands)

    Veenendaal, N.J.; Groen, M.A.; Verhoeven, L.T.W.

    2015-01-01

    Text reading fluency – the ability to read quickly, accurately and with a natural intonation – has been proposed as a predictor of reading comprehension. In the current study, we examined the role of oral text reading fluency, defined as text reading rate and text reading prosody, as a contributor to reading comprehension.

  10. Estimation of Cross-Lingual News Similarities Using Text-Mining Methods

    Directory of Open Access Journals (Sweden)

    Zhouhao Wang

    2018-01-01

    Full Text Available In this research, two estimation algorithms for extracting cross-lingual news pairs based on machine learning from financial news articles have been proposed. Every second, innumerable text data, including all kinds of news, reports, messages, reviews, comments, and tweets, are generated on the Internet, written not only in English but also in other languages such as Chinese, Japanese, French, etc. By taking advantage of the multi-lingual text resources provided by Thomson Reuters News, we developed two estimation algorithms for extracting cross-lingual news pairs from multilingual text resources. In our first method, we propose a novel structure that uses word information and machine learning effectively in this task. Simultaneously, we developed a bidirectional Long Short-Term Memory (LSTM)-based method to calculate cross-lingual semantic text similarity for long text and short text, respectively. Thus, when an important news article is published, users can read similar news articles that are written in their native language using our method.

  11. A comparison of video modeling, text-based instruction, and no instruction for creating multiple baseline graphs in Microsoft Excel.

    Science.gov (United States)

    Tyner, Bryan C; Fienup, Daniel M

    2015-09-01

    Graphing is socially significant for behavior analysts; however, graphing can be difficult to learn. Video modeling (VM) may be a useful instructional method but lacks evidence for effective teaching of computer skills. A between-groups design compared the effects of VM, text-based instruction, and no instruction on graphing performance. Participants who used VM constructed graphs significantly faster and with fewer errors than those who used text-based instruction or no instruction. Implications for instruction are discussed. © Society for the Experimental Analysis of Behavior.

  12. Rationing with baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    2013-01-01

    We introduce a new operator for general rationing problems in which, besides conflicting claims, individual baselines play an important role in the rationing process. The operator builds on ideas of composition, which are frequent not only in rationing but also in related problems such as bargaining, choice, and queuing. We characterize the operator and show how it preserves some standard axioms in the literature on rationing. We also relate it to recent contributions in that literature.
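
    One plausible reading of such a baseline-composition operator (an assumption for illustration; the paper's exact construction may differ) is to ration the claims-truncated baselines first, and then apply the same base rule to the residual claims with whatever estate remains:

```python
def proportional(claims, estate):
    """A standard base rule: divide the estate in proportion to claims."""
    total = sum(claims)
    return [estate * c / total for c in claims] if total else [0.0] * len(claims)

def baseline_rationing(claims, baselines, estate, rule=proportional):
    """Sketch of a baseline composition operator (one plausible reading):
    truncate baselines by claims, ration the truncated baselines first,
    then ration the remaining estate over the residual claims with the
    same base rule."""
    t = [min(c, b) for c, b in zip(claims, baselines)]
    if estate <= sum(t):
        return rule(t, estate)                      # estate too small: ration baselines
    residual = rule([c - ti for c, ti in zip(claims, t)], estate - sum(t))
    return [ti + r for ti, r in zip(t, residual)]   # baselines met, then ration the rest

# claims 60/40, baselines 30/10, estate 80: baselines are met in full,
# and the remaining 40 is split proportionally to residual claims 30 and 30
print(baseline_rationing([60.0, 40.0], [30.0, 10.0], 80.0))  # → [50.0, 30.0]
```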

  13. ℓ2-optimized predictive image coding with ℓ∞ bound.

    Science.gov (United States)

    Chuah, Sceuchin; Dumitrescu, Sorina; Wu, Xiaolin

    2013-12-01

    In many scientific, medical, and defense applications of image/video compression, an ℓ∞ error bound is required. However, pure ℓ∞-optimized image coding, colloquially known as near-lossless image coding, is prone to structured errors such as contours and speckles if the bit rate is not sufficiently high; moreover, most of the previous ℓ∞-based image coding methods suffer from poor rate control. In contrast, the ℓ2 error metric aims for average fidelity and hence preserves the subtlety of smooth waveforms better than the ℓ∞ error metric, and it offers fine granularity in rate control, but pure ℓ2-based image coding methods (e.g., JPEG 2000) cannot bound individual errors as the ℓ∞-based methods can. This paper presents a new compression approach to retain the benefits and circumvent the pitfalls of the two error metrics. A common approach of near-lossless image coding is to embed into a DPCM prediction loop a uniform scalar quantizer of residual errors. The said uniform scalar quantizer is replaced, in the proposed new approach, by a set of context-based ℓ2-optimized quantizers. The optimization criterion is to minimize a weighted sum of the ℓ2 distortion and the entropy while maintaining a strict ℓ∞ error bound. The resulting method obtains good rate-distortion performance in both ℓ2 and ℓ∞ metrics and also increases the rate granularity. Compared with JPEG 2000, the new method not only guarantees lower ℓ∞ error for all bit rates, but also achieves higher PSNR for relatively high bit rates.
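
    The uniform scalar quantizer conventionally embedded in the DPCM loop (the starting point the paper improves upon, not its context-based ℓ2-optimized replacement) can be sketched as follows: with an integer bin width of 2δ + 1, the reconstruction error of any residual is guaranteed to satisfy the ℓ∞ bound |error| ≤ δ:

```python
def quantize(e, delta):
    """Bin index of integer residual e under a uniform quantizer of width 2*delta + 1."""
    return (e + delta) // (2 * delta + 1)  # floor division also handles negatives

def reconstruct(q, delta):
    """Bin centre; the reconstruction error never exceeds delta in magnitude."""
    return q * (2 * delta + 1)

delta = 2  # the strict l-infinity error bound
residuals = [-7, -3, 0, 3, 7, 11]
recon = [reconstruct(quantize(e, delta), delta) for e in residuals]
print(recon)                                                       # → [-5, -5, 0, 5, 5, 10]
print(all(abs(e - r) <= delta for e, r in zip(residuals, recon)))  # → True
```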

  14. Mercury baseline levels in Flemish soils (Belgium)

    International Nuclear Information System (INIS)

    Tack, Filip M.G.; Vanhaesebroeck, Thomas; Verloo, Marc G.; Van Rompaey, Kurt; Ranst, Eric van

    2005-01-01

    It is important to establish contaminant levels that are normally present in soils to provide baseline data for pollution studies. Mercury is a toxic element of concern. This study was aimed at assessing baseline mercury levels in soils in Flanders. In a previous study, mercury contents in soils in Oost-Vlaanderen were found to be significantly above levels reported elsewhere. For the current study, observations were extended over two more provinces, West-Vlaanderen and Antwerpen. Ranges of soil Hg contents were distinctly higher in the province Oost-Vlaanderen (interquartile range from 0.09 to 0.43 mg/kg) than in the other provinces (interquartile ranges from 0.07 to 0.13 and 0.07 to 0.15 mg/kg for West-Vlaanderen and Antwerpen, respectively). The standard threshold method was applied to separate soils containing baseline levels of Hg from the data. Baseline concentrations for Hg were characterised by a median of 0.10 mg Hg/kg dry soil, an interquartile range from 0.07 to 0.14 mg/kg and a 90% percentile value of 0.30 mg/kg. The influence of soil properties such as clay and organic carbon contents, and pH, on baseline Hg concentrations was not important. Maps of the spatial distribution of Hg levels showed that the province Oost-Vlaanderen exhibited zones with systematically higher Hg soil contents. This may be related to the former presence of many small-scale industries employing mercury in that region. - Increased mercury levels may reflect human activity.

  15. Associated diacritical watermarking approach to protect sensitive arabic digital texts

    Science.gov (United States)

    Kamaruddin, Nurul Shamimi; Kamsin, Amirrudin; Hakak, Saqib

    2017-10-01

    Among multimedia content, one of the most predominant media is text. There have been many efforts to protect and secure text information over the Internet. The limitations of existing works have been identified in terms of watermark capacity, time complexity and memory complexity. In this work, an invisible digital watermarking approach is proposed to protect and secure the most sensitive text, i.e., the Digital Holy Quran. The proposed approach works by XOR-ing only those Quranic letters that have certain diacritics associated with them. Due to the sensitive nature of the Holy Quran, diacritics play a vital role in the meaning of a particular verse. Hence, securing letters with certain diacritics will preserve the original meaning of Quranic verses in the event of an alteration attempt. Initial results have shown that the proposed approach is promising, with lower memory and time complexity compared to existing approaches.
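    The abstract does not detail the embedding itself, so the following is only a toy sketch of the underlying idea: keying an XOR-based integrity tag to exactly those letters that carry diacritics (harakat), so that altering a diacritic-bearing letter changes the tag while letters without diacritics are ignored. The code-point range and the key value are illustrative assumptions, not the paper's scheme.

```python
# Arabic harakat (fathatan ... sukun), U+064B through U+0652
DIACRITICS = set(range(0x064B, 0x0653))

def diacritic_tag(text, key=0x5A):
    """XOR-fold the code points of every letter that is immediately
    followed by a diacritic, together with the diacritic itself."""
    tag = key
    chars = list(text)
    for i in range(len(chars) - 1):
        if ord(chars[i + 1]) in DIACRITICS:
            tag ^= ord(chars[i]) ^ ord(chars[i + 1])
    return tag
```

    Re-computing the tag after transmission detects tampering with any diacritic-bearing letter, which is where the meaning of a verse is most fragile.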

  16. A unified framework for evaluating the risk of re-identification of text de-identification tools.

    Science.gov (United States)

    Scaiano, Martin; Middleton, Grant; Arbuckle, Luk; Kolhatkar, Varada; Peyton, Liam; Dowling, Moira; Gipson, Debbie S; El Emam, Khaled

    2016-10-01

    It has become regular practice to de-identify unstructured medical text for use in research using automatic methods, the goal of which is to remove patient identifying information to minimize re-identification risk. The metrics commonly used to determine if these systems are performing well do not accurately reflect the risk of a patient being re-identified. We therefore developed a framework for measuring the risk of re-identification associated with textual data releases. We apply the proposed evaluation framework to a data set from the University of Michigan Medical School. Our risk assessment results are then compared with those that would be obtained using a typical contemporary micro-average evaluation of recall in order to illustrate the difference between the proposed evaluation framework and the current baseline method. We demonstrate how this framework compares against common measures of the re-identification risk associated with an automated text de-identification process. For the probability of re-identification using our evaluation framework we obtained a mean value for direct identifiers of 0.0074 and a mean value for quasi-identifiers of 0.0022. The 95% confidence intervals for these estimates were below the relevant thresholds. The threshold for direct identifier risk was based on previously used approaches in the literature. The threshold for quasi-identifiers was determined based on the context of the data release, following commonly used de-identification criteria for structured data. Our framework attempts to correct for poorly distributed evaluation corpora, accounts for the data release context, and avoids the often optimistic assumptions that are made using the more traditional evaluation approach. It therefore provides a more realistic estimate of the true probability of re-identification. This framework should be used as a basis for computing re-identification risk in order to more realistically evaluate future text de-identification tools.
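    A minimal sketch of why micro-averaged recall can mask re-identification risk: recall counts identifiers, while re-identification happens per patient. The counts and the patient-level risk definition below are hypothetical simplifications, not the framework or the Michigan data.

```python
def microaverage_recall(records):
    """records: (identifiers_present, identifiers_removed) per document."""
    removed = sum(r for _, r in records)
    present = sum(p for p, _ in records)
    return removed / present

def patient_level_risk(records):
    """Fraction of documents with at least one missed identifier, which
    is closer in spirit to a probability of re-identification."""
    return sum(1 for p, r in records if r < p) / len(records)
```

    For 100 documents with 10 identifiers each, where two documents each leak one identifier, micro-averaged recall is 0.998 yet 2% of patients are exposed.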

  17. Ask and Ye Shall Receive? Automated Text Mining of Michigan Capital Facility Finance Bond Election Proposals to Identify Which Topics Are Associated with Bond Passage and Voter Turnout

    Science.gov (United States)

    Bowers, Alex J.; Chen, Jingjing

    2015-01-01

    The purpose of this study is to bring together recent innovations in the research literature around school district capital facility finance, municipal bond elections, statistical models of conditional time-varying outcomes, and data mining algorithms for automated text mining of election ballot proposals to examine the factors that influence the…

  18. Tank waste remediation system technical baseline summary description

    International Nuclear Information System (INIS)

    Raymond, R.E.

    1998-01-01

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations

  19. Carbon tetrachloride ERA soil-gas baseline monitoring

    International Nuclear Information System (INIS)

    Fancher, J.D.

    1994-01-01

    From December 1991 through December 1993, Westinghouse Hanford Company performed routine baseline monitoring of selected wells and soil-gas points twice weekly in the 200 West Area of the Hanford Site. This work supported the Carbon Tetrachloride Expedited Response Action (ERA) and provided a solid baseline of volatile organic compound (VOC) concentrations in wells and in the subsurface at the ERA site. As site remediation continues, comparisons to this baseline can be one means of measuring the success of carbon tetrachloride vapor extraction. This report contains observations of the patterns and trends associated with data obtained during soil-gas monitoring at the 200 West Area; monitoring performed since late 1991 includes monitoring soil-gas probes and wellheads for volatile organic compounds (VOCs). This report reflects monitoring data collected from December 1991 through December 1993.

  20. ASM Based Synthesis of Handwritten Arabic Text Pages

    Directory of Open Access Journals (Sweden)

    Laslo Dinges

    2015-01-01

    Full Text Available Document analysis tasks, such as text recognition, word spotting, or segmentation, are highly dependent on comprehensive and suitable databases for training and validation. However, their generation is expensive in terms of labor and time. As a matter of fact, there is a lack of such databases, which complicates research and development. This is especially true for Arabic handwriting recognition, which involves different preprocessing, segmentation, and recognition methods, each with individual demands on samples and ground truth. To bypass this problem, we present an efficient system that automatically turns Arabic Unicode text into synthetic images of handwritten documents and detailed ground truth. Active Shape Models (ASMs) based on 28046 online samples were used for character synthesis, and statistical properties were extracted from the IESK-arDB database to simulate baselines and word slant or skew. In the synthesis step, ASM-based representations are composed into words and text pages, smoothed by B-spline interpolation, and rendered considering writing speed and pen characteristics. Finally, we use the synthetic data to validate a segmentation method. An experimental comparison with the IESK-arDB database encourages training and testing document analysis methods on synthetic samples whenever no sufficient naturally ground-truthed data is available.
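    As a rough illustration of the smoothing step, the corner-cutting subdivision below (Chaikin's algorithm, whose limit curve is a uniform quadratic B-spline) approximates B-spline smoothing of a jagged synthetic pen trajectory. It stands in for, and is not, the authors' actual interpolation code.

```python
def chaikin(points, iterations=3):
    """Chaikin corner cutting: each edge is replaced by its 1/4 and 3/4
    points; the polyline converges to a uniform quadratic B-spline."""
    for _ in range(iterations):
        refined = [points[0]]                       # keep the start point
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        refined.append(points[-1])                  # keep the end point
        points = refined
    return points
```

    Because every new point is a convex combination of old ones, the smoothed stroke stays inside the original stroke's bounding box and preserves its endpoints.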

  1. Text conception(s) in the context of semi-present Distance Learning (DL)

    Directory of Open Access Journals (Sweden)

    Fabiana Komesu

    2013-02-01

    Full Text Available Following the approach proposed by Corrêa (2011) for investigating texts produced by undergraduate and pre-undergraduate students in two different assessments, this work aims to approach "hidden" aspects of the teaching of writing at the university (Street, 2009) through reflections produced in the language field, in particular those concerning what is "socially assumed", as proposed by Voloshinov/Bakhtin (s/d: 1926). It is particularly important to investigate the conception of text in a digital context, by studying the semiotic resources used by undergraduate students working on an internet-connected computer in semi-present Distance Learning (DL). The collected material comprises 29 (twenty-nine) texts produced by students of the semi-present Pedagogy Course of Univesp (Universidade Virtual do Estado de São Paulo – Virtual University of the State of São Paulo) who were studying "Education and Language" in 2010. This qualitative analysis shows that, on the institution's side, structural and procedural aspects prevail in the accomplishment of the proposed activity, while on the students' side production is characterized by a traditional conception of text, mainly recognized as written verbal text, even though the proposal prioritized the relation between verbal and non-verbal language. For discursive-linguistic studies, it is important to reflect on a conception of text that privileges the integration of multiple semioses, taking into account the socio-historical character of the interlocution established within the utterances of others.

  2. 324 Building Baseline Radiological Characterization

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  3. Digital baseline estimation method for multi-channel pulse height analyzing

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun

    2005-01-01

    The basic features of digital baseline estimation for multi-channel pulse height analysis are introduced. The weight function of the minimum-noise baseline filter is deduced with functional variational calculus. The frequency response of this filter is also deduced with the Fourier transform, and the influence of parameters on the amplitude-frequency response characteristics is discussed. With MATLAB software, the noise voltage signal from the charge-sensitive preamplifier is simulated, and the processing effect of minimum-noise digital baseline estimation is verified. According to the results of this research, the digital baseline estimation method can estimate the baseline optimally, and it is well suited to digital multi-channel pulse height analysis. (authors)
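    A stripped-down illustration of the principle (not the deduced minimum-noise weight function): when the noise is white, the plain average of the pre-trigger samples is already the least-squares baseline estimate, and the pulse height is the peak minus that baseline.

```python
def baseline_estimate(samples, window):
    """Mean of the pre-trigger samples: for white noise this is the
    least-squares (minimum-variance) estimate of the baseline."""
    pre = samples[:window]
    return sum(pre) / len(pre)

def pulse_height(samples, window):
    """Peak amplitude measured relative to the estimated baseline."""
    return max(samples) - baseline_estimate(samples, window)
```

    The filter deduced in the paper generalizes this by weighting the samples to minimize the output noise variance for a given noise spectrum.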

  4. Durability study and lifetime prediction of baseline proton exchange membrane fuel cell under severe operating conditions

    Energy Technology Data Exchange (ETDEWEB)

    Marrony, M.; Quenet, S.; Aslanides, A. [European Institute for Energy Research, Emmy-Noether Strasse 11, 76131 Karlsruhe (Germany); Barrera, R.; Ginocchio, S.; Montelatici, L. [Edison, Via Giorgio La Pira 2, 10028 Trofarello (Italy)

    2008-08-01

    Comparative studies of the mechanical and electrochemical properties of Nafion®- and sulfonated polyetheretherketone polymer-type membranes are carried out under the severe fuel cell conditions required by industry, within stationary and cycling electric load profiles. These membranes are proposed for use in PEM fuel cells between 70 and 90 °C as fluorinated and non-fluorinated baseline membranes, respectively. Though the performance of both membranes remains suitable, the Nafion® backbone brought better mechanical properties and higher electrochemical stability than the sulfonated polyetheretherketone backbone. The performance stability and the mechanical strength of the membrane-electrode assembly were shown to be influenced by several intrinsic properties of the membrane (e.g., thermal pre-treatment, thickness) and by external conditions (fuel cell operating temperature, relative humidity). A lifetime prediction for membranes under stationary conditions is then proposed, depending on the operating temperature. At equivalent thicknesses (i.e., 50 µm), Nafion® membranes were estimated to be able to operate in the 80-90 °C range, while sulfonated polyetheretherketone would be limited to the 70-80 °C range. This approach brings baseline information about the capability of these types of polymer electrolyte membrane under critical fuel cell operations. Finally, it is revealed as a potential tool for selecting the most promising advanced polymers for the ensuing research phase. (author)

  5. A text message intervention for alcohol risk reduction among community college students: TMAP.

    Science.gov (United States)

    Bock, Beth C; Barnett, Nancy P; Thind, Herpreet; Rosen, Rochelle; Walaska, Kristen; Traficante, Regina; Foster, Robert; Deutsch, Chris; Fava, Joseph L; Scott-Sheldon, Lori A J

    2016-12-01

    Students at community colleges comprise nearly half of all U.S. college students and show higher risk of heavy drinking and related consequences compared to students at 4-year colleges, but no alcohol safety programs currently target this population. To examine the feasibility, acceptability, and preliminary efficacy of an alcohol risk-reduction program delivered through text messaging designed for community college (CC) students. Heavy drinking adult CC students (N=60) were enrolled and randomly assigned to the six-week active intervention (Text Message Alcohol Program: TMAP) or a control condition of general motivational (not alcohol related) text messages. TMAP text messages consisted of alcohol facts, strategies to limit alcohol use and related risks, and motivational messages. Assessments were conducted at baseline, week 6 (end of treatment) and week 12 (follow-up). Most participants (87%) completed all follow-up assessments. Intervention messages received an average rating of 6.8 (SD=1.5) on a 10-point scale. At week six, TMAP participants were less likely than controls to report heavy drinking and negative alcohol consequences. The TMAP group also showed significant increases in self-efficacy to resist drinking in high-risk situations between baseline and week six, with no such increase among controls. Results were maintained through the week 12 follow-up. The TMAP alcohol risk-reduction program was feasible and highly acceptable, as indicated by high retention rates through the final follow-up assessment and good ratings for the text message content. Reductions in multiple outcomes provide positive indications of intervention efficacy. Copyright © 2016. Published by Elsevier Ltd.

  6. Report made on behalf of the joint parity commission in charge of proposing a text on the dispositions of the project of energy orientation law remaining under discussion

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The project of energy orientation law aims at fixing the main principles of the French energy policy for the next decades. It foresees: the re-launching of the French nuclear program (building of an experimental European pressurized reactor (EPR)), the reinforcement of the mastery of energy demand (3% per year, creation of energy saving certificates and reinforcement of buildings' energy efficiency rules), and support for the development of renewable energies. This document first presents a direct comparison, article by article, of the text adopted in second reading by the National Assembly with the text adopted in second reading by the Senate. A text is then proposed for the last dispositions that remained under discussion, presented in the second part of the report. (J.S.)

  7. Quantification of in vivo 1H magnetic resonance spectroscopy signals with baseline and lineshape estimation

    International Nuclear Information System (INIS)

    Osorio-Garcia, M I; Sima, D M; Van Huffel, S; Nielsen, F U; Dresselaers, T; Himmelreich, U; Van Leuven, F

    2011-01-01

    The in vivo quantification of magnetic resonance spectroscopy (MRS) signals is a method to estimate metabolite concentrations of living tissue. Obtaining reliable concentrations is still a challenge due to the experimental conditions affecting spectral quality. Additionally, lipids and macromolecules overlap with the metabolites of interest, affecting their reliable estimation. In this study, we propose to combine the self-deconvolution lineshape estimation method, which accounts for spectral shape distortions, with two different approaches for taking into account the macromolecular baseline contribution: (a) based on macromolecules and lipids measured in vivo using an inversion recovery technique, and (b) based on the simulation of macromolecular resonances using prior knowledge from a database of inversion recovery signals. The ultimate goal is to measure macromolecular and lipid data only once as described in (a) to create macromolecular and lipid profiles. These profiles can then be used as described in (b) for data measured under the same conditions. The method is evaluated on in vivo 1H MRS signals at 9.4 T from mouse hippocampus. Results show that better metabolite fits are obtained when lineshape and baseline estimations are performed simultaneously, and that baseline estimation based on prior knowledge from measured macromolecular signals can reliably replace time-consuming individual macromolecular and lipid acquisitions.

  8. First Grade Baseline Evaluation

    Science.gov (United States)

    Center for Innovation in Assessment (NJ1), 2013

    2013-01-01

    The First Grade Baseline Evaluation is an optional tool that can be used at the beginning of the school year to help teachers get to know the reading and language skills of each student. The evaluation is composed of seven screenings. Teachers may use the entire evaluation or choose to use those individual screenings that they find most beneficial…

  9. Baseline psychophysiological and cortisol reactivity as a predictor of PTSD treatment outcome in virtual reality exposure therapy.

    Science.gov (United States)

    Norrholm, Seth Davin; Jovanovic, Tanja; Gerardi, Maryrose; Breazeale, Kathryn G; Price, Matthew; Davis, Michael; Duncan, Erica; Ressler, Kerry J; Bradley, Bekh; Rizzo, Albert; Tuerk, Peter W; Rothbaum, Barbara O

    2016-07-01

    Baseline cue-dependent physiological reactivity may serve as an objective measure of posttraumatic stress disorder (PTSD) symptoms. Additionally, prior animal model and psychological studies would suggest that subjects with the greatest symptoms at baseline may have the greatest violation of expectancy to danger when undergoing exposure-based psychotherapy; thus, treatment approaches which enhance the learning under these conditions would be optimal for those with maximal baseline cue-dependent reactivity. However, methods to study this hypothesis objectively are lacking. Virtual reality (VR) methodologies have been successfully employed as an enhanced form of imaginal prolonged exposure therapy for the treatment of PTSD. Our goal was to examine the predictive nature of initial psychophysiological (e.g., startle, skin conductance, heart rate) and stress hormone responses (e.g., cortisol) during presentation of VR-based combat-related stimuli on PTSD treatment outcome. Combat veterans with PTSD underwent 6 weeks of VR exposure therapy combined with either d-cycloserine (DCS), alprazolam (ALP), or placebo (PBO). In the DCS group, startle response to VR scenes prior to initiation of treatment accounted for 76% of the variance in CAPS change scores, p < 0.001, in that higher responses predicted greater changes in symptom severity over time. Additionally, baseline cortisol reactivity was inversely associated with treatment response in the ALP group, p = 0.04. We propose that baseline cue-activated physiological measures will be sensitive to predicting patients' level of response to exposure therapy, in particular in the presence of enhancement (e.g., DCS). Published by Elsevier Ltd.

  10. Esophageal acid exposure decreases intraluminal baseline impedance levels

    NARCIS (Netherlands)

    Kessing, Boudewijn F.; Bredenoord, Albert J.; Weijenborg, Pim W.; Hemmink, Gerrit J. M.; Loots, Clara M.; Smout, A. J. P. M.

    2011-01-01

    Intraluminal baseline impedance levels are determined by the conductivity of the esophageal wall and can be decreased in gastroesophageal reflux disease (GERD) patients. The aim of this study was to investigate the baseline impedance in GERD patients, on and off proton pump inhibitor (PPI), and in

  11. Financial Statement Fraud Detection using Text Mining

    OpenAIRE

    Rajan Gupta; Nasib Singh Gill

    2013-01-01

    Data mining techniques have been used extensively by the research community in detecting financial statement fraud. Most of the research in this direction has used numbers (quantitative information), i.e., financial ratios present in the financial statements, for detecting fraud. There is very little or no research on the analysis of text such as auditors' comments or notes present in published reports. In this study we propose a text mining approach for detecting financial statement fraud...

  12. Measuring cognitive change with ImPACT: the aggregate baseline approach.

    Science.gov (United States)

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-11-01

    The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
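    The aggregate-baseline idea can be sketched with simulated data (all numbers below are synthetic, not ImPACT scores): averaging independent noisy baselines cancels measurement noise, so the aggregate predicts a later session better than any single baseline does.

```python
import random

def fit_line(x, y):
    """Ordinary least-squares fit y = a + b * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def sse(x, y):
    """Sum of squared residuals of the least-squares line."""
    a, b = fit_line(x, y)
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

random.seed(0)
ability = [random.gauss(100, 10) for _ in range(400)]   # latent cognition
def session():                                          # one noisy testing session
    return [t + random.gauss(0, 8) for t in ability]
s1, s2, s3, s4 = session(), session(), session(), session()
aggregate = [(a + b + c) / 3 for a, b, c in zip(s1, s2, s3)]
```

    Regressing the fourth session on the three-session aggregate leaves smaller residuals than regressing it on a single earlier session, which is the reliability gain the study reports.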

  13. Study on the calibration and optimization of double theodolites baseline

    Science.gov (United States)

    Ma, Jing-yi; Ni, Jin-ping; Wu, Zhi-chao

    2018-01-01

    Since the baseline of a double-theodolite measurement system serves as the benchmark for the scale of the measurement system and affects its accuracy, this paper puts forward a method for calibrating and optimizing the double-theodolite baseline. The double theodolites measure a reference ruler of known length, and the baseline formula is then inverted to solve for the baseline. Based on the law of error propagation, the analysis shows that the baseline error function is an important index of system accuracy, and that the position and posture of the reference ruler have an impact on the baseline error. An optimization model is established with the baseline error function as the objective function, and the position and posture of the reference ruler are optimized. The simulation results show that the height of the reference ruler has no effect on the baseline error, that the effect of posture is not uniform, and that the baseline error is smallest when the reference ruler is placed at x=500 mm and y=1000 mm in the measurement space. The experimental results are consistent with the theoretical analysis in the measurement space. The study of the placement of the reference ruler thus has reference value for improving the accuracy of double-theodolite measurement systems.
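    A planar sketch of the calibration idea, under the simplifying assumption of ideal horizontal-angle forward intersection: every coordinate computed by the system scales linearly with the baseline length, so measuring a reference ruler of known length fixes the scale. The full double-theodolite model adds vertical angles and instrument errors omitted here.

```python
from math import sin, cos, radians, pi

def intersect(alpha_deg, beta_deg, baseline=1.0):
    """Forward intersection in the plane: stations A = (0, 0) and
    B = (baseline, 0); alpha is the angle at A from the baseline to the
    target, beta the angle at B measured from the direction back to A."""
    a, b = radians(alpha_deg), radians(beta_deg)
    gamma = pi - a - b                      # angle at the target point
    d = baseline * sin(b) / sin(gamma)     # law of sines: |A-target|
    return d * cos(a), d * sin(a)

def calibrate_baseline(ruler_length, angles_p, angles_q):
    """Coordinates scale linearly with the baseline, so the true
    baseline is B = L_true / L(B = 1) for a ruler with endpoints P, Q."""
    x1, y1 = intersect(*angles_p, baseline=1.0)
    x2, y2 = intersect(*angles_q, baseline=1.0)
    unit_length = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return ruler_length / unit_length
```

    Solving with a unit baseline and rescaling by the known ruler length recovers the true baseline exactly in this idealized geometry.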

  14. Baselining the New GSFC Information Systems Center: The Foundation for Verifiable Software Process Improvement

    Science.gov (United States)

    Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.; hide

    1999-01-01

    This paper describes a study performed at the Information System Center (ISC) in NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims at characterizing people, processes and products of the new center, to provide a basis for proposing improvement actions and comparing the center before and after these actions have been performed. The paper presents the ISC, goals and methods of the study, results and suggestions for improvement, through the branch-level portion of this baselining effort.

  15. Retardo estatural em menores de cinco anos: um estudo "baseline" Linear growth retardation in children under five years of age: a baseline study

    Directory of Open Access Journals (Sweden)

    Anete Rissin

    2011-10-01

    Full Text Available OBJECTIVE: To describe the prevalence of, and analyze factors associated with, linear growth retardation in children under five years of age. METHODS: The baseline study analyzed 2,040 children under the age of five, establishing a possible association between growth delay (height/age index < -2 Z-scores) and variables in six hierarchical blocks: socio-economic, residence, sanitary, maternal, biological and healthcare access. Multivariate analysis was performed using Poisson regression with the robust standard error option, obtaining adjusted prevalence ratios with a CI of 95% and the respective significant probability values. Among non-binary variables, there was a positive association with roof type and number of inhabitants per room, and a negative association with income per capita, mother's schooling and birth weight. The adjusted analysis also indicated water supply, visits from the community health agent, birth delivery location, internment for diarrhea or for pneumonia, and birth weight as significant variables. Several risk factors were identified for linear growth retardation, pointing to the multi-causal aspects of the problem and highlighting the need for control measures by the various hierarchical government agents.

  16. Placental baseline conditions modulate the hyperoxic BOLD-MRI response.

    Science.gov (United States)

    Sinding, Marianne; Peters, David A; Poulsen, Sofie S; Frøkjær, Jens B; Christiansen, Ole B; Petersen, Astrid; Uldbjerg, Niels; Sørensen, Anne

    2018-01-01

    Human pregnancies complicated by placental dysfunction may be characterized by a high hyperoxic blood oxygen level-dependent (BOLD) MRI response. The pathophysiology behind this phenomenon remains to be established. The aim of this study was to evaluate whether it is associated with altered placental baseline conditions, including a lower oxygenation and altered tissue morphology, as estimated by the placental transverse relaxation time (T2*). We included 49 normal pregnancies (controls) and 13 pregnancies complicated by placental dysfunction (cases), defined by a low birth weight. We measured the relative ΔBOLD response ((hyperoxic BOLD - baseline BOLD)/baseline BOLD) from a dynamic single-echo gradient-recalled echo (GRE) MRI sequence and the absolute ΔT2* (hyperoxic T2* - baseline T2*) from breath-hold multi-echo GRE sequences. In the control group, the relative ΔBOLD response increased during gestation from 5% in gestational week 20 to 20% in week 40. In the case group, the relative ΔBOLD response was significantly higher (mean Z-score 4.94; 95% CI 2.41, 7.47). The absolute ΔT2*, however, did not differ between controls and cases (p = 0.37), whereas the baseline T2* was lower among cases (mean Z-score -3.13; 95% CI -3.94, -2.32). Furthermore, we demonstrated a strong negative linear correlation between the Log10 ΔBOLD response and the baseline T2* (r = -0.88). The high hyperoxic BOLD response in dysfunctional placentas thus appears to be explained by altered baseline conditions, as the absolute increase in placental oxygenation (ΔT2*) does not differ between groups. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Baseline values from the electrocardiograms of children and adolescents with ADHD

    Directory of Open Access Journals (Sweden)

    Zhang Shuyu

    2007-09-01

    Full Text Available Abstract Background An important issue in pediatric pharmacology is the determination of whether medications affect cardiac rhythm parameters, in particular the QT interval, which is a surrogate marker for the risk of adverse cardiac events and sudden death. To evaluate changes while on medication, it is useful to have a comparison of age-appropriate values while off medication. The present meta-analysis provides baseline ECG values (i.e., off medication) from approximately 6000 children and adolescents with attention-deficit/hyperactivity disorder (ADHD). Methods Subjects were aged 6–18 years and participated in global trials within the atomoxetine registration program. Patients were administered a 12-lead ECG at study screening and cardiac rhythm parameters were recorded. Baseline QT intervals were corrected for heart rate using 3 different methods: Bazett's, Fridericia's, and a population-data-derived formula. Results ECG data were obtained from 5289 North American and 641 non-North American children and adolescents. Means and percentiles are presented for each ECG measure and QTc interval based on pubertal status as defined by age and sex. Prior treatment history with stimulants and racial origin (Caucasian) were each associated with significantly longer mean QTc values. Conclusion Baseline ECG and QTc data from almost 6000 children and adolescents presenting with ADHD are provided to contribute to the knowledge base regarding mean values for pediatric cardiac parameters. Consistent with other studies of the QT interval in children and adolescents, Bazett's correction formula appears to overestimate the prevalence of prolonged QTc in the pediatric population.
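    The first two correction formulas referenced in the abstract are standard and can be made concrete (the study's third, population-data-derived formula is not specified here and is omitted). With the RR interval in seconds, Bazett divides QT by RR^(1/2) and Fridericia by RR^(1/3), which is why Bazett yields larger QTc values, and hence more apparent prolongation, at elevated heart rates.

```python
def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett's correction: QTc = QT / RR**0.5, RR in seconds."""
    rr = 60.0 / heart_rate_bpm
    return qt_ms / rr ** 0.5

def qtc_fridericia(qt_ms, heart_rate_bpm):
    """Fridericia's correction: QTc = QT / RR**(1/3)."""
    rr = 60.0 / heart_rate_bpm
    return qt_ms / rr ** (1.0 / 3.0)
```

    At 60 bpm (RR = 1 s) both leave QT unchanged; above 60 bpm Bazett corrects more aggressively, consistent with the abstract's note that it overestimates the prevalence of prolonged QTc.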

  18. A proposal for a drug information database and text templates for generating package inserts

    Directory of Open Access Journals (Sweden)

    Okuya R

    2013-07-01

    Full Text Available Ryo Okuya,1 Masaomi Kimura,2 Michiko Ohkura,2 Fumito Tsuchiya3 1Graduate School of Engineering and Science, 2Faculty of Engineering, Shibaura Institute of Technology, Tokyo, 3School of Pharmacy, International University of Health and Welfare, Tokyo, Japan Abstract: To prevent prescription errors caused by information systems, a database that stores complete and accurate drug information in a user-friendly format is needed. In previous studies, the primary method for obtaining the data stored in such a database has been to extract drug information from package inserts by employing pattern matching or more sophisticated methods such as text mining. However, it is difficult to obtain a complete database this way, because there is no strict rule governing the expressions used to describe drug information in package inserts. The authors' strategy was to first build a database and then automatically generate package inserts by embedding data from the database into templates. Creating this database requires the support of pharmaceutical companies to input accurate data. The system can be expected to gain this support, because it offers the companies a clear benefit: for newly developed drugs, it decreases the effort needed to create package inserts from scratch. This study designed the table schemata for the database and the text templates used to generate the package inserts. To handle the variety of drug-specific information in the package inserts, this information in drug composition descriptions was replaced with labels, and the resulting descriptions were analyzed using cluster analysis. To improve the way frequently repeated ingredient information and/or supplementary information is stored, the method was modified by introducing repeat tags in the templates to indicate repetition and by improving the insertion of data into the database. The validity of this method was confirmed by inputting the drug information described in existing package inserts and checking that the method could

  19. The Role of Baseline Vagal Tone in Dealing with a Stressor during Face to Face and Computer-Based Social Interactions

    Directory of Open Access Journals (Sweden)

    Daniele Rigoni

    2017-11-01

    Full Text Available Facing a stressor involves a cardiac vagal tone response and a feedback effect produced by social interaction in visceral regulation. This study evaluated the contribution of baseline vagal tone and of social engagement system (SES) functioning to the ability to deal with a stressor. Participants (n = 70) were grouped into a minimized social interaction condition (procedure administered through a PC) and a social interaction condition (procedure administered by an experimenter). The State Trait Anxiety Inventory, the Social Interaction Anxiety Scale, the Emotion Regulation Questionnaire, and a debriefing questionnaire were completed by the subjects. Cardiac vagal tone was recorded during the baseline, stressor, and recovery phases. The results highlighted a significant effect of baseline vagal tone on vagal suppression. No effect of the minimized vs. social interaction conditions on cardiac vagal tone during the stressor and recovery phases was detected. Cardiac vagal tone and the questionnaire results appear not to be correlated. The study highlighted the main role of baseline vagal tone in visceral regulation. Some observations on the SES, to be deepened in further research, were raised.

  20. Effect of the Interaction of Text Structure, Background Knowledge and Purpose on Attention to Text.

    Science.gov (United States)

    1982-04-01

    in the sense proposed by Craik and Lockhart (1972). All levels of representation would entail such preliminary processing operations as perceptual...109. Craik, F. I., & Lockhart, R. S. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 1972, 11... processes this information to a deeper level than those text elements that are less important or irrelevant. The terminology "deeper" level is used here

  1. Preoperational baseline and site characterization report for the Environmental Restoration Disposal Facility. Volume 2, Revision 2

    International Nuclear Information System (INIS)

    Weekes, D.C.; Lindsey, K.A.; Ford, B.H.; Jaeger, G.K.

    1996-12-01

    This document is Volume 2 in a two-volume series that comprise the site characterization report, the Preoperational Baseline and Site Characterization Report for the Environmental Restoration Disposal Facility. Volume 1 contains data interpretation and information supporting the conclusions in the main text. This document presents original data in support of Volume 1 of the report. The following types of data are presented: well construction reports; borehole logs; borehole geophysical data; well development and pump installation; survey reports; preoperational baseline chemical data and aquifer test data. Five groundwater monitoring wells, six deep characterization boreholes, and two shallow characterization boreholes were drilled at the Environmental Restoration Disposal Facility (ERDF) site to directly investigate site-specific hydrogeologic conditions

  2. Reasoning with Annotations of Texts

    OpenAIRE

    Ma , Yue; Lévy , François; Ghimire , Sudeep

    2011-01-01

    International audience; Linguistic and semantic annotations are important features for text-based applications. However, achieving and maintaining a good quality of a set of annotations is known to be a complex task. Many ad hoc approaches have been developed to produce various types of annotations, while comparing those annotations to improve their quality is still rare. In this paper, we propose a framework in which both linguistic and domain information can cooperate to reason with annotat...

  3. Baseline restoration technique based on symmetrical zero-area trapezoidal pulse shaper

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Guoqiang, E-mail: 24829500@qq.com [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Yang, Jian, E-mail: 22105653@qq.com [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Hu, Tianyu; Ge, Liangquan [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Ouyang, Xiaoping [Northwest Institute of Nuclear Technology, Xi’an 710024, China (China); Zhang, Qingxian; Gu, Yi [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China)

    2017-06-21

    Since the baseline of a unipolar pulse shaper has a direct-current (DC) offset and drift, an additional baseline estimator is needed to obtain baseline values in real time. Bipolar zero-area (BZA) pulse shapers can be used for baseline restoration, but they cannot restrain baseline drift because of their asymmetrical shape. In this study, three trapezoids are synthesized into a symmetrical zero-area (SZA) shape, which can remove the DC offset and restrain the baseline drift. This baseline restoration technique can be easily implemented in digital pulse processing (DPP) systems based on the recursive algorithm. To strengthen our approach, iron's characteristic X-ray was detected using a Si-PIN diode detector. Compared with traditional trapezoidal pulse shapers, the SZA trapezoidal pulse shaper improved the energy resolution from 237 eV to 216 eV for the 6.403 keV Kα peak.
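    The zero-area property can be demonstrated numerically: a shaping kernel whose coefficients sum to zero produces zero output for any constant (DC) input, so a DC baseline offset never reaches the amplitude measurement. The sketch below builds one symmetric zero-area shape from three trapezoids; the kernel parameters and the direct convolution are illustrative simplifications, not the authors' recursive DPP filter:

```python
import numpy as np

def sza_kernel(rise, flat):
    # central positive trapezoid flanked by two negative half-amplitude
    # trapezoids; the total area is zero by construction
    up = np.linspace(0.0, 1.0, rise, endpoint=False)
    center = np.concatenate([up, np.ones(flat), up[::-1]])
    side = -0.5 * center
    return np.concatenate([side, center, side])

k = sza_kernel(4, 6)
print(abs(k.sum()) < 1e-9)        # zero net area

# a pure DC baseline is completely suppressed by the shaper
dc = np.full(200, 3.7)
out = np.convolve(dc, k, mode="valid")
print(np.allclose(out, 0.0))
```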

  4. High Baseline Postconcussion Symptom Scores and Concussion Outcomes in Athletes.

    Science.gov (United States)

    Custer, Aimee; Sufrinko, Alicia; Elbin, R J; Covassin, Tracey; Collins, Micky; Kontos, Anthony

    2016-02-01

    Some healthy athletes report high levels of baseline concussion symptoms, which may be attributable to several factors (eg, illness, personality, somaticizing). However, the role of baseline symptoms in outcomes after sport-related concussion (SRC) has not been empirically examined. To determine if athletes with high symptom scores at baseline performed worse than athletes without baseline symptoms on neurocognitive testing after SRC. Cohort study. High school and collegiate athletic programs. A total of 670 high school and collegiate athletes participated in the study. Participants were divided into groups with either no baseline symptoms (Postconcussion Symptom Scale [PCSS] score = 0, n = 247) or a high level of baseline symptoms (PCSS score > 18 [top 10% of sample], n = 68). Participants were evaluated at baseline and 2 to 7 days after SRC with the Immediate Post-concussion Assessment and Cognitive Test and PCSS. Outcome measures were Immediate Post-concussion Assessment and Cognitive Test composite scores (verbal memory, visual memory, visual motor processing speed, and reaction time) and total symptom score on the PCSS. The groups were compared using repeated-measures analyses of variance with Bonferroni correction to assess interactions between group and time for symptoms and neurocognitive impairment. The no-symptoms group represented 38% of the original sample, whereas the high-symptoms group represented 11% of the sample. The high-symptoms group experienced a larger decline from preinjury to postinjury than the no-symptoms group in verbal (P = .03) and visual memory (P = .05). However, total concussion-symptom scores increased from preinjury to postinjury for the no-symptoms group (P = .001) but remained stable for the high-symptoms group. Reported baseline symptoms may help identify athletes at risk for worse outcomes after SRC. Clinicians should examine baseline symptom levels to better identify patients for earlier referral and treatment for their
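    The grouping described above, a no-symptoms group (PCSS score = 0) versus a top-10% high-symptoms group, reduces to a simple percentile split. A sketch using randomly generated stand-in scores (not study data; the study's top-10% cutoff happened to fall at PCSS > 18):

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical baseline PCSS totals for a cohort of 670 athletes
pcss = rng.integers(0, 40, size=670)

no_symptoms = pcss[pcss == 0]          # PCSS score of exactly 0
cutoff = np.percentile(pcss, 90)       # top-10% threshold
high_symptoms = pcss[pcss > cutoff]

print(len(no_symptoms), len(high_symptoms))
```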

  5. Baseline effects on carbon footprints of biofuels: The case of wood

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Eric, E-mail: johnsonatlantic@gmail.com [Atlantic Consulting, 8136 Gattikon (Switzerland); Tschudi, Daniel [ETH, Berghaldenstrasse 46, 8800 Thalwil (Switzerland)

    2012-11-15

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: • Four baseline types for biofuel footprinting are identified. • One type, 'biomass opportunity cost', is defined mathematically and graphically. • Choice of baseline can dramatically affect the footprint result. • The 'no baseline' approach is not acceptable. • Choice between the other three baselines depends on the question being addressed.
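    The paper's central point, that the baseline choice dominates the footprint result, can be illustrated with toy numbers. All values below are assumptions in gCO2-eq/kWh for illustration only, not figures from the paper:

```python
wood_power = 60.0  # assumed life-cycle emissions of wood-fired electricity

baselines = {
    "no baseline": 0.0,                  # footprint reported as-is
    "reference point": 450.0,            # e.g. an average grid mix (assumed)
    "marginal fossil fuel": 900.0,       # e.g. coal on the margin (assumed)
    "biomass opportunity cost": 250.0,   # forgone carbon storage (assumed)
}

# the same plant yields four very different "footprints"
footprints = {name: wood_power - ref for name, ref in baselines.items()}
for name, value in footprints.items():
    print(f"{name}: {value:+.0f} gCO2-eq/kWh")
```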

  6. Automated baseline change detection phase I. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
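    Absolute change detection by image subtraction, as described above, reduces to a per-pixel difference between the current image and the archived baseline image followed by a threshold. A minimal sketch (registration of the two images, which the record addresses via precise camera repositioning, is assumed already done; the threshold value is an arbitrary placeholder):

```python
import numpy as np

def change_mask(baseline_img, current_img, thresh=30):
    # per-pixel absolute difference between current and baseline images;
    # pixels above the threshold are flagged as changed
    diff = np.abs(current_img.astype(np.int32) - baseline_img.astype(np.int32))
    return diff > thresh

baseline = np.zeros((8, 8), dtype=np.uint8)
current = baseline.copy()
current[2:4, 2:4] = 200        # simulated patch of rust or a dent

mask = change_mask(baseline, current)
print(int(mask.sum()))         # 4 changed pixels
```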

  7. Automated baseline change detection phase I. Final report

    International Nuclear Information System (INIS)

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements

  8. Vegetation Parameter Extraction Using Dual Baseline Polarimetric SAR Interferometry Data

    Science.gov (United States)

    Zhang, H.; Wang, C.; Chen, X.; Tang, Y.

    2009-04-01

    For vegetation parameter inversion, single-baseline polarimetric SAR interferometry (POLinSAR) techniques, such as the three-stage method and the ESPRIT algorithm, are limited by observed data with a minimum ground-to-volume amplitude ratio, which affects the estimation of the effective phase center for the vegetation canopy or the surface and thus results in an underestimated vegetation height. In order to remove this limitation of the single-baseline inversion techniques to some extent, a second baseline of POLinSAR data is added for vegetation parameter estimation in this paper, and a dual-baseline POLinSAR technique for the extraction of vegetation parameters is investigated and improved to reduce the dynamic bias of the vegetation parameter estimation. Finally, simulated data and real data are used to validate this dual-baseline technique.

  9. Updated baseline for a staged Compact Linear Collider

    CERN Document Server

    Boland, M J; Giansiracusa, P J; Lucas, T G; Rassool, R P; Balazs, C; Charles, T K; Afanaciev, K; Emeliantchik, I; Ignatenko, A; Makarenko, V; Shumeiko, N; Patapenka, A; Zhuk, I; Abusleme Hoffman, A C; Diaz Gutierrez, M A; Gonzalez, M Vogel; Chi, Y; He, X; Pei, G; Pei, S; Shu, G; Wang, X; Zhang, J; Zhao, F; Zhou, Z; Chen, H; Gao, Y; Huang, W; Kuang, Y P; Li, B; Li, Y; Shao, J; Shi, J; Tang, C; Wu, X; Ma, L; Han, Y; Fang, W; Gu, Q; Huang, D; Huang, X; Tan, J; Wang, Z; Zhao, Z; Laštovička, T; Uggerhoj, U; Wistisen, T N; Aabloo, A; Eimre, K; Kuppart, K; Vigonski, S; Zadin, V; Aicheler, M; Baibuz, E; Brücken, E; Djurabekova, F; Eerola, P; Garcia, F; Haeggström, E; Huitu, K; Jansson, V; Karimaki, V; Kassamakov, I; Kyritsakis, A; Lehti, S; Meriläinen, A; Montonen, R; Niinikoski, T; Nordlund, K; Österberg, K; Parekh, M; Törnqvist, N A; Väinölä, J; Veske, M; Farabolini, W; Mollard, A; Napoly, O; Peauger, F; Plouin, J; Bambade, P; Chaikovska, I; Chehab, R; Davier, M; Kaabi, W; Kou, E; LeDiberder, F; Pöschl, R; Zerwas, D; Aimard, B; Balik, G; Baud, J-P; Blaising, J-J; Brunetti, L; Chefdeville, M; Drancourt, C; Geoffroy, N; Jacquemier, J; Jeremie, A; Karyotakis, Y; Nappa, J M; Vilalte, S; Vouters, G; Bernard, A; Peric, I; Gabriel, M; Simon, F; Szalay, M; van der Kolk, N; Alexopoulos, T; Gazis, E N; Gazis, N; Ikarios, E; Kostopoulos, V; Kourkoulis, S; Gupta, P D; Shrivastava, P; Arfaei, H; Dayyani, M K; Ghasem, H; Hajari, S S; Shaker, H; Ashkenazy, Y; Abramowicz, H; Benhammou, Y; Borysov, O; Kananov, S; Levy, A; Levy, I; Rosenblat, O; D'Auria, G; Di Mitri, S; Abe, T; Aryshev, A; Higo, T; Makida, Y; Matsumoto, S; Shidara, T; Takatomi, T; Takubo, Y; Tauchi, T; Toge, N; Ueno, K; Urakawa, J; Yamamoto, A; Yamanaka, M; Raboanary, R; Hart, R; van der Graaf, H; Eigen, G; Zalieckas, J; Adli, E; Lillestøl, R; Malina, L; Pfingstner, J; Sjobak, K N; Ahmed, W; Asghar, M I; Hoorani, H; Bugiel, S; Dasgupta, R; Firlej, M; Fiutowski, T A; Idzik, M; Kopec, M; Kuczynska, M; Moron, J; 
Swientek, K P; Daniluk, W; Krupa, B; Kucharczyk, M; Lesiak, T; Moszczynski, A; Pawlik, B; Sopicki, P; Wojtoń, T; Zawiejski, L; Kalinowski, J; Krawczyk, M; Żarnecki, A F; Firu, E; Ghenescu, V; Neagu, A T; Preda, T; Zgura, I-S; Aloev, A; Azaryan, N; Budagov, J; Chizhov, M; Filippova, M; Glagolev, V; Gongadze, A; Grigoryan, S; Gudkov, D; Karjavine, V; Lyablin, M; Olyunin, A; Samochkine, A; Sapronov, A; Shirkov, G; Soldatov, V; Solodko, A; Solodko, E; Trubnikov, G; Tyapkin, I; Uzhinsky, V; Vorozhtov, A; Levichev, E; Mezentsev, N; Piminov, P; Shatilov, D; Vobly, P; Zolotarev, K; Bozovic-Jelisavcic, I; Kacarevic, G; Lukic, S; Milutinovic-Dumbelovic, G; Pandurovic, M; Iriso, U; Perez, F; Pont, M; Trenado, J; Aguilar-Benitez, M; Calero, J; Garcia-Tabares, L; Gavela, D; Gutierrez, J L; Lopez, D; Toral, F; Moya, D; Ruiz-Jimeno, A; Vila, I; Argyropoulos, T; Blanch Gutierrez, C; Boronat, M; Esperante, D; Faus-Golfe, A; Fuster, J; Fuster Martinez, N; Galindo Muñoz, N; García, I; Giner Navarro, J; Ros, E; Vos, M; Brenner, R; Ekelöf, T; Jacewicz, M; Ögren, J; Olvegård, M; Ruber, R; Ziemann, V; Aguglia, D; Alipour Tehrani, N; Aloev, A; Andersson, A; Andrianala, F; Antoniou, F; Artoos, K; Atieh, S; Ballabriga Sune, R; Barnes, M J; Barranco Garcia, J; Bartosik, H; Belver-Aguilar, C; Benot Morell, A; Bett, D R; Bettoni, S; Blanchot, G; Blanco Garcia, O; Bonnin, X A; Brunner, O; Burkhardt, H; Calatroni, S; Campbell, M; Catalan Lasheras, N; Cerqueira Bastos, M; Cherif, A; Chevallay, E; Constance, B; Corsini, R; Cure, B; Curt, S; Dalena, B; Dannheim, D; De Michele, G; De Oliveira, L; Deelen, N; Delahaye, J P; Dobers, T; Doebert, S; Draper, M; Duarte Ramos, F; Dubrovskiy, A; Elsener, K; Esberg, J; Esposito, M; Fedosseev, V; Ferracin, P; Fiergolski, A; Foraz, K; Fowler, A; Friebel, F; Fuchs, J-F; Fuentes Rojas, C A; Gaddi, A; Garcia Fajardo, L; Garcia Morales, H; Garion, C; Gatignon, L; Gayde, J-C; Gerwig, H; Goldblatt, A N; Grefe, C; Grudiev, A; Guillot-Vignot, F G; Gutt-Mostowy, M L; 
Hauschild, M; Hessler, C; Holma, J K; Holzer, E; Hourican, M; Hynds, D; Inntjore Levinsen, Y; Jeanneret, B; Jensen, E; Jonker, M; Kastriotou, M; Kemppinen, J M K; Kieffer, R B; Klempt, W; Kononenko, O; Korsback, A; Koukovini Platia, E; Kovermann, J W; Kozsar, C-I; Kremastiotis, I; Kulis, S; Latina, A; Leaux, F; Lebrun, P; Lefevre, T; Linssen, L; Llopart Cudie, X; Maier, A A; Mainaud Durand, H; Manosperti, E; Marelli, C; Marin Lacoma, E; Martin, R; Mazzoni, S; Mcmonagle, G; Mete, O; Mether, L M; Modena, M; Münker, R M; Muranaka, T; Nebot Del Busto, E; Nikiforou, N; Nisbet, D; Nonglaton, J-M; Nuiry, F X; Nürnberg, A; Olvegard, M; Osborne, J; Papadopoulou, S; Papaphilippou, Y; Passarelli, A; Patecki, M; Pazdera, L; Pellegrini, D; Pepitone, K; Perez, F; Perez Codina, E; Perez Fontenla, A; Persson, T H B; Petrič, M; Pitters, F; Pittet, S; Plassard, F; Rajamak, R; Redford, S; Renier, Y; Rey, S F; Riddone, G; Rinolfi, L; Rodriguez Castro, E; Roloff, P; Rossi, C; Rude, V; Rumolo, G; Sailer, A; Santin, E; Schlatter, D; Schmickler, H; Schulte, D; Shipman, N; Sicking, E; Simoniello, R; Skowronski, P K; Sobrino Mompean, P; Soby, L; Sosin, M P; Sroka, S; Stapnes, S; Sterbini, G; Ström, R; Syratchev, I; Tecker, F; Thonet, P A; Timeo, L; Timko, H; Tomas Garcia, R; Valerio, P; Vamvakas, A L; Vivoli, A; Weber, M A; Wegner, R; Wendt, M; Woolley, B; Wuensch, W; Uythoven, J; Zha, H; Zisopoulos, P; Benoit, M; Vicente Barreto Pinto, M; Bopp, M; Braun, H H; Csatari Divall, M; Dehler, M; Garvey, T; Raguin, J Y; Rivkin, L; Zennaro, R; Aksoy, A; Nergiz, Z; Pilicer, E; Tapan, I; Yavas, O; Baturin, V; Kholodov, R; Lebedynskyi, S; Miroshnichenko, V; Mordyk, S; Profatilova, I; Storizhko, V; Watson, N; Winter, A; Goldstein, J; Green, S; Marshall, J S; Thomson, M A; Xu, B; Gillespie, W A; Pan, R; Tyrk, M A; Protopopescu, D; Robson, A; Apsimon, R; Bailey, I; Burt, G; Constable, D; Dexter, A; Karimian, S; Lingwood, C; Buckland, M D; Casse, G; Vossebeld, J; Bosco, A; Karataev, P; Kruchinin, K; 
Lekomtsev, K; Nevay, L; Snuverink, J; Yamakawa, E; Boisvert, V; Boogert, S; Boorman, G; Gibson, S; Lyapin, A; Shields, W; Teixeira-Dias, P; West, S; Jones, R; Joshi, N; Bodenstein, R; Burrows, P N; Christian, G B; Gamba, D; Perry, C; Roberts, J; Clarke, J A; Collomb, N A; Jamison, S P; Shepherd, B J A; Walsh, D; Demarteau, M; Repond, J; Weerts, H; Xia, L; Wells, J D; Adolphsen, C; Barklow, T; Breidenbach, M; Graf, N; Hewett, J; Markiewicz, T; McCormick, D; Moffeit, K; Nosochkov, Y; Oriunno, M; Phinney, N; Rizzo, T; Tantawi, S; Wang, F; Wang, J; White, G; Woodley, M

    2016-01-01

    The Compact Linear Collider (CLIC) is a multi-TeV high-luminosity linear e+e- collider under development. For an optimal exploitation of its physics potential, CLIC is foreseen to be built and operated in a staged approach with three centre-of-mass energy stages ranging from a few hundred GeV up to 3 TeV. The first stage will focus on precision Standard Model physics, in particular Higgs and top-quark measurements. Subsequent stages will focus on measurements of rare Higgs processes, as well as searches for new physics processes and precision measurements of new states, e.g. states previously discovered at LHC or at CLIC itself. In the 2012 CLIC Conceptual Design Report, a fully optimised 3 TeV collider was presented, while the proposed lower energy stages were not studied to the same level of detail. This report presents an updated baseline staging scenario for CLIC. The scenario is the result of a comprehensive study addressing the performance, cost and power of the CLIC accelerator complex as a function of...

  10. A Baseline-Free Defect Imaging Technique in Plates Using Time Reversal of Lamb Waves

    International Nuclear Information System (INIS)

    Jeong, Hyunjo; Cho, Sungjong; Wei, Wei

    2011-01-01

    We present an analytical investigation of baseline-free imaging of a defect in plate-like structures using the time reversal of Lamb waves. We first consider the flexural wave (A0 mode) propagating in a plate containing a defect, and the reception and time-reversal process of the output signal at the receiver. The received output signal is composed of two parts: a directly propagated wave and a wave scattered from the defect. The time reversal of these waves recovers the original input signal and produces two additional sidebands that contain the time-of-flight information on the defect location. One of the sideband signals is then extracted as a pure defect signal. A defect localization image is then constructed with a beamforming technique based on the time-frequency analysis of the sideband signal for each transducer pair in a network of sensors. The simulation results show that the proposed scheme enables accurate, baseline-free imaging of a defect.

  11. Effects of triplet Higgs bosons in long baseline neutrino experiments

    Science.gov (United States)

    Huitu, K.; Kärkkäinen, T. J.; Maalampi, J.; Vihonen, S.

    2018-05-01

    The triplet scalars (Δ = Δ++, Δ+, Δ0) utilized in the so-called type-II seesaw model to explain the lightness of neutrinos would generate nonstandard interactions (NSI) for a neutrino propagating in matter. We investigate the prospects to probe these interactions in long baseline neutrino oscillation experiments. We analyze the upper bounds that the proposed DUNE experiment might set on the nonstandard parameters and numerically derive upper bounds, as a function of the lightest neutrino mass, on the ratio between the mass MΔ of the triplet scalars and the strength |λϕ| of the coupling ϕϕΔ of the triplet Δ to the conventional Higgs doublet ϕ. We also discuss the possible misinterpretation of these effects as effects arising from a nonunitarity of the neutrino mixing matrix and compare the results with the bounds that arise from charged-lepton flavor-violating processes.

  12. Automatic path proposal computation for CT-guided percutaneous liver biopsy.

    Science.gov (United States)

    Helck, A; Schumann, C; Aumann, J; Thierfelder, K; Strobl, F F; Braunagel, M; Niethammer, M; Clevert, D A; Hoffmann, R T; Reiser, M; Sandner, T; Trumm, C

    2016-12-01

    To evaluate the feasibility of automatic software-based path proposals for CT-guided percutaneous biopsies. Thirty-three patients (60 ± 12 years) referred for CT-guided biopsy of focal liver lesions were consecutively included. Pre-interventional CT and dedicated software (FraunhoferMeVis Pathfinder) were used for (semi)automatic segmentation of relevant structures. The software subsequently generated three path proposals of decreasing quality for CT-guided biopsy. Proposed needle paths were compared with the consensus proposal of two experts (comparable, less suitable, not feasible). In case of comparable results, an approach equivalent to the software-based path proposal was used. The quality of the segmentation process was evaluated (Likert scale, 1 = best, 6 = worst), and the time for processing was registered. All biopsies were performed successfully without complications. In 91% one of the three automatic path proposals was rated comparable to the experts' proposal. None of the first proposals was rated not feasible, and 76% were rated comparable to the experts' proposal. 7% of automatic path proposals were rated not feasible, all being second or third choices. In 79%, segmentation was at least good. The average total time for establishing an automatic path proposal was 42 ± 9 s. Automatic software-based path proposal for CT-guided liver biopsies in the majority of cases provides path proposals that are easy to establish and comparable to experts' insertion trajectories.

  13. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  14. Mobile characters, mobile texts: homelessness and intertextuality in contemporary texts for young people

    Directory of Open Access Journals (Sweden)

    Mavis Reimer

    2013-06-01

    Full Text Available Since the 1990s, narratives about homelessness for and about young people have proliferated around the world. A cluster of thematic elements shared by many of these narratives of the age of globalization points to the deep anxiety that is being expressed about a social, economic, and cultural system under stress or struggling to find a new formation. More surprisingly, many of the narratives also use canonical cultural texts extensively as intertexts. This article considers three novels from three different national traditions to address the work of intertextuality in narratives about homelessness: Skellig by UK author David Almond, which was published in 1998; Chronicler of the Winds by Swedish author Henning Mankell, which was first published in 1988 in Swedish as Comédia Infantil and published in an English translation in 2006; and Stained Glass by Canadian author Michael Bedard, which was published in 2002. Using Julia Kristeva's definition of intertextuality as the “transposition of one (or several) sign systems into another,” I propose that all intertexts can be thought of as metaphoric texts, in the precise sense that they carry one text into another. In the narratives under discussion in this article, the idea of homelessness is in perpetual motion between texts and intertexts, ground and figure, the literal and the symbolic. What the child characters and the readers who take up the position offered to implied readers are asked to do, I argue, is to put on a way of seeing that does not settle, a way of being that strains forward toward the new.

  15. Impact of baseline Diabetic Retinopathy Severity Scale scores on visual outcomes in the VIVID-DME and VISTA-DME studies.

    Science.gov (United States)

    Staurenghi, Giovanni; Feltgen, Nicolas; Arnold, Jennifer J; Katz, Todd A; Metzig, Carola; Lu, Chengxing; Holz, Frank G

    2017-10-19

    To evaluate intravitreal aflibercept versus laser in subgroups of patients with baseline Diabetic Retinopathy Severity Scale (DRSS) scores ≤43, 47, and ≥53 in VIVID-DME and VISTA-DME. Patients with diabetic macular oedema were randomised to receive intravitreal aflibercept 2 mg every 4 weeks (2q4), intravitreal aflibercept 2 mg every 8 weeks after five initial monthly doses (2q8), or macular laser photocoagulation at baseline with sham injections at every visit. These post hoc analyses evaluate outcomes based on baseline DRSS scores in patients in the integrated dataset. The 2q4 and 2q8 treatment groups were also pooled. 748 patients had a baseline DRSS score based on fundus photographs (≤43, n=301; 47, n=153; ≥53, n=294). At week 100, the least squares mean difference between treatment groups (effect of intravitreal aflibercept above that of laser, adjusting for baseline best-corrected visual acuity) was 8.9 (95% CI 5.99 to 11.81), 9.7 (95% CI 5.54 to 13.91), and 11.0 (95% CI 7.96 to 14.1) letters in those with baseline DRSS scores ≤43, 47, and ≥53, respectively. The proportions of patients with ≥2 step DRSS score improvement were greater in the intravitreal aflibercept group versus laser, respectively, for those with baseline DRSS scores of ≤43 (13% vs 5.9%), 47 (25.8% vs 4.5%), and ≥53 (64.5% vs 28.4%). Regardless of baseline DRSS score, functional outcomes were superior in intravitreal aflibercept-treated patients, demonstrating consistent treatment benefit across various baseline levels of retinopathy. NCT01331681 and NCT01363440, Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Process industry energy retrofits: the importance of emission baselines for greenhouse gas reductions

    International Nuclear Information System (INIS)

    Aadahl, Anders; Harvey, Simon; Berntsson, Thore

    2004-01-01

    Fuel combustion for heat and/or electric power production is often the largest contributor to greenhouse gas (GHG) emissions from an industrial process plant. Economically feasible options to reduce these emissions include fuel switching and retrofitting the plant's energy system. Process integration methods and tools can be used to evaluate potential retrofit measures. Assessing the GHG emission reduction potential of the measures considered also requires defining appropriate GHG emission baselines. This paper presents a systematic GHG emission calculation method for retrofit situations, including improved heat exchange, integration of combined heat and power (CHP) units, and combinations of both. The proposed method is applied to five different industrial processes in order to compare the impact of process-specific and energy-market-specific parameters. For potential GHG emission reductions, the results of the applied study reveal that electricity grid emissions are significantly more important than differences between individual processes. Based on the results of the study, it is suggested that a conservative emission baseline is most appropriate for sustainable investment decisions. Even so, new industrial CHP in the Northern European energy market could play a significant role in the common effort to decrease GHG emissions.

  17. A study of man made radioactivity baseline in dietary materials

    International Nuclear Information System (INIS)

    de la Paz, L.; Estacio, J.; Palattao, M.V.; Anden, A.

    1986-01-01

    This paper describes the man-made radioactivity baseline derived from literature data from various countries where data are available. The years 1979-1985 were chosen as the baseline period for the following foodstuffs: milk (fresh and powdered), meat and meat products, cereals, fruits, coffee and tea, fish, and vegetables. Pre- and post-Chernobyl baseline data are given. (ELC). 21 figs; 17 refs

  18. Validity and Reliability of Baseline Testing in a Standardized Environment.

    Science.gov (United States)

    Higgins, Kathryn L; Caze, Todd; Maerlender, Arthur

    2017-08-11

    The Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) is a computerized neuropsychological test battery commonly used to determine cognitive recovery from concussion by comparing post-injury scores to baseline scores. This model is based on the premise that ImPACT baseline test scores are a valid and reliable measure of optimal cognitive function at baseline. Growing evidence suggests that this premise may not be accurate and that a large contributor to invalid and unreliable baseline test scores may be the protocol and environment in which baseline tests are administered. This study examined the effects of a standardized environment and administration protocol on the reliability and performance validity of athletes' baseline test scores on ImPACT by comparing scores obtained in two different group-testing settings. Three hundred sixty-one Division 1 cohort-matched collegiate athletes' baseline data were assessed using a variety of indicators of potential performance invalidity; internal reliability was also examined. Thirty-one to thirty-nine percent of the baseline cases had at least one indicator of low performance validity, but there were no significant differences in validity indicators based on the environment in which the testing was conducted. Internal consistency reliability scores were in the acceptable to good range, with no significant differences between administration conditions. These results suggest that athletes may be reliably performing at levels lower than their best effort would produce. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Learning Convolutional Text Representations for Visual Question Answering

    OpenAIRE

    Wang, Zhengyang; Ji, Shuiwang

    2017-01-01

    Visual question answering is a recently proposed artificial intelligence task that requires a deep understanding of both images and texts. In deep learning, images are typically modeled through convolutional neural networks, and texts are typically modeled through recurrent neural networks. While the requirement for modeling images is similar to traditional computer vision tasks, such as object recognition and image classification, visual question answering raises a different need for textual...

  20. Baseline effects on carbon footprints of biofuels: The case of wood

    International Nuclear Information System (INIS)

    Johnson, Eric; Tschudi, Daniel

    2012-01-01

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: ► Four baseline types for biofuel footprinting are identified. ► One type, ‘biomass opportunity cost’, is defined mathematically and graphically. ► Choice of baseline can dramatically affect the footprint result. ► The ‘no baseline’ approach is not acceptable. ► Choice between the other three baselines depends on the question being addressed.
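    As a deliberately crude illustration of why the baseline choice dominates the result, consider subtracting a baseline credit from a fixed stack-emissions figure. Every number below is hypothetical, and the credits only gesture at the four baseline types named in the abstract rather than implement the paper's accounting:

```python
# All numbers are hypothetical; only the arithmetic point matters:
# the same stack emissions yield very different net footprints
# depending on which baseline supplies the credit.
stack_kg_per_mwh = 1100  # wood-fired electricity, biogenic + supply chain

baseline_credit = {
    "no baseline": 0,                 # nothing is credited
    "reference point": 1000,          # biogenic CO2 treated as neutral
    "marginal fossil fuel": 900,      # displaced fossil generation
    "biomass opportunity cost": 400,  # forgone alternative use of the wood
}

net = {name: stack_kg_per_mwh - credit
       for name, credit in baseline_credit.items()}
```

    With these made-up inputs the net footprint spans an order of magnitude across baseline types, which is the sensitivity the paper reports.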

  1. Utilizing Multi-Field Text Features for Efficient Email Spam Filtering

    Directory of Open Access Journals (Sweden)

    Wuying Liu

    2012-06-01

    Full Text Available Large-scale spam emails cause a serious waste of time and resources. This paper investigates the text features of email documents and the feature noise among multi-field texts, observing a power-law distribution of feature strings within each text field. Based on this observation, we propose an efficient filtering approach comprising a compound weight method and a lightweight field text classification algorithm. The compound weight method considers both the historical classifying ability of each field classifier and the classifying contribution of each text field in the currently classified email. The lightweight field text classification algorithm straightforwardly calculates the arithmetic average of multiple conditional probabilities predicted from feature strings, using a string-frequency index that stores labeled emails. The string-frequency index structure has a random-sampling-based compressible property owing to the power-law distribution and can largely reduce storage space. The experimental results on the TREC spam track show that the proposed approach completes the filtering task at low space cost and high speed, and its overall 1-ROCA performance exceeds that of the best participant in the trec07p evaluation.
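    The lightweight field classifier described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the string-frequency index is reduced to a plain spam/ham count table, and the feature strings, Laplace smoothing, and the `train`/`spam_score` helpers are all assumptions.

```python
from collections import defaultdict

# Hypothetical string-frequency index: feature string -> [spam_count, ham_count]
index = defaultdict(lambda: [0, 0])

def train(strings, is_spam):
    """Record the feature strings of one labeled email in the index."""
    for s in strings:
        index[s][0 if is_spam else 1] += 1

def spam_score(strings):
    """Arithmetic average of per-string spam probabilities (Laplace-smoothed)."""
    probs = []
    for s in strings:
        spam, ham = index[s]
        probs.append((spam + 1) / (spam + ham + 2))
    return sum(probs) / len(probs) if probs else 0.5

train(["free", "winner", "cash"], is_spam=True)
train(["meeting", "agenda"], is_spam=False)
score = spam_score(["free", "cash"])  # above 0.5 for spam-leaning strings
```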

  2. Long baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Gallagher, H.

    2006-01-01

    In this paper I will review briefly the experimental results which established the existence of neutrino mixing, the current generation of long baseline accelerator experiments, and the prospects for the future. In particular I will focus on the recent analysis of the MINOS experiment. (author)

  3. Privacy Preserving Similarity Based Text Retrieval through Blind Storage

    Directory of Open Access Journals (Sweden)

    Pinki Kumari

    2016-09-01

    Full Text Available Cloud computing is growing rapidly owing to its many advantages, and more data owners are interested in outsourcing their data to cloud storage in order to centralize it. As huge numbers of files are stored in the cloud, keyword-based search must be provided to data users. At the same time, to protect privacy, sensitive data are encrypted before being outsourced to the cloud server, which makes searching over the encrypted data difficult. In this system we propose similarity-based text retrieval over encrypted data held in blind storage blocks. The blind storage system provides additional security because data are stored at random locations in cloud storage. In existing systems, the data owner cannot encrypt the document data, as encryption is done only at the server end, and anyone can access the data because no private-key mechanism is applied to maintain privacy. In our proposed system, the data owner encrypts the data using the RSA algorithm. RSA is a public-key cryptosystem widely used for sensitive data storage over the Internet. In our system we use a text mining process to build the index files of user documents, and before encryption we also use NLP (natural language processing) techniques to identify synonyms of the keywords in the data owner's documents. The text mining process examines the text word by word and collects the literal meaning beyond the word group that composes each sentence; the words are looked up in the WordNet API so that equivalent words can be identified for the index file. Our proposed system provides a more secure and authorized way of retrieving text from cloud storage with access control. Finally, our experimental results show that our system outperforms the existing one.
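    RSA is the one concrete primitive named in the abstract. Below is a minimal sketch of textbook RSA using tiny, well-known example primes; it is insecure by design (no padding, toy key size), none of these numbers come from the paper, and a real deployment would use padded RSA from a vetted cryptography library.

```python
# Textbook RSA with tiny, well-known example primes -- illustration only.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

def encrypt(m):
    """Encrypt an integer message m < n with the public key (n, e)."""
    return pow(m, e, n)

def decrypt(c):
    """Decrypt a ciphertext with the private exponent d."""
    return pow(c, d, n)

assert decrypt(encrypt(65)) == 65
```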

  4. Baseline composition of solar energetic particles

    International Nuclear Information System (INIS)

    Meyer, J.

    1985-01-01

    We analyze all existing spacecraft observations of the highly variable heavy element composition of solar energetic particles (SEP) during non- 3 He-rich events. All data show the imprint of an ever-present basic composition pattern (dubbed ''mass-unbiased baseline'' SEP composition) that differs from the photospheric composition by a simple bias related to first ionization potential (FIP). In each particular observation, this mass-unbiased baseline composition is being distorted by an additional bias, which is always a monotonic function of mass (or Z). This latter bias varies in amplitude and even sign from observation to observation. To first order, it seems related to differences in the A/Z* ratio between elements (Z* = mean effective charge)

  5. Micronuclei and erythrocytic abnormalities frequencies of freshwater fishes: Establishing a baseline for health status

    Science.gov (United States)

    Sousa, Debora Batista Pinheiro; Torres, Audalio Rebelo; Oliveira, Suelen Rosana Sampaio; Castro, Jonatas da Silva; Neta, Raimunda Nonata Fortes Carvalho

    2017-11-01

    Most papers show that the micronucleus test and erythrocyte abnormalities are excellent tools for monitoring fish health and the level of impact in aquatic ecosystems. Nevertheless, the baseline for these changes in freshwater fish communities of Brazilian north-eastern rivers is still unknown. In this study, we report the basal levels for two freshwater fish species (Colossoma macropomum, tambaqui, and Oreochromis niloticus, tilápia) with the aim of establishing background levels for these species. The animals were collected from the Ambude River in a protected area, and blood was collected from all fish for analysis. Erythrocyte indices, that is, mean corpuscular volume (MCV), mean corpuscular hemoglobin (MCH), and mean corpuscular hemoglobin concentration (MCHC), were calculated. Blood samples from all fish were examined for micronuclear changes after Giemsa staining. Micronuclei were found in fish from the Ambude River. The baseline values determined were, for tambaqui, micronuclei = 0.0071±0.0026, MCV = 0.0073±0.0037, and MCHC = 0.0071±0.0024, and, for tilapia, micronuclei = 0.0061±0.0026, MCV = 0.0037±0.0017, and MCHC = 0.056±0.0036. We propose using this genotoxic approach for estimating fish health status, as the technique allows examination of live fish in situ without the need for euthanasia. Moreover, these baseline levels can be used to establish background values and patterns for pathological and physiological research on these species in future biomonitoring programs.
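    The three erythrocyte indices named above have standard clinical definitions. A small sketch using those textbook formulas follows; the example inputs are hypothetical, and the paper itself reports normalized frequencies rather than these clinical units.

```python
def erythrocyte_indices(hgb_g_dl, hct_pct, rbc_millions_per_ul):
    """Textbook red-cell indices from haemoglobin, haematocrit and RBC count."""
    mcv = hct_pct * 10 / rbc_millions_per_ul    # mean corpuscular volume, fL
    mch = hgb_g_dl * 10 / rbc_millions_per_ul   # mean corpuscular hemoglobin, pg
    mchc = hgb_g_dl * 100 / hct_pct             # MCH concentration, g/dL
    return mcv, mch, mchc

# Hypothetical sample: Hgb 15 g/dL, Hct 45%, RBC 5.0 x 10^6/uL
mcv, mch, mchc = erythrocyte_indices(15.0, 45.0, 5.0)
```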

  6. 78 FR 31563 - Agency Information Collection Activities; Proposed Collection; Public Comment Request

    Science.gov (United States)

    2013-05-24

    ... a planned expansion, as per the statute, THCGME funding may only be used to support an expanded... Abstract: The THCGME Program Eligible Resident/FTE Chart published in the THCGME Funding Opportunity... program during the baseline academic year, and a projection of the program's proposed expansion over the...

  7. Baseline Antibody Titre against Salmonella enterica in Healthy Population of Mumbai, Maharashtra, India

    Directory of Open Access Journals (Sweden)

    Rucha Patki

    2017-01-01

    Full Text Available Objective. The aim of this study was to establish a baseline titre for the population of Mumbai, Maharashtra, India. Method. Four hundred healthy blood donors, attending blood donation camps, were screened using a survey questionnaire. The Widal tube agglutination test was performed on the diluted sera (with 0.9% normal saline) of the blood donors, with final dilutions ranging from 1:40 to 1:320. Results. Of the 400 individuals providing samples, 78 (19.5%) showed antibody titres ≥ 1:40 for at least one antigen and 322 (80.5%) showed no agglutination. The baseline antibody titres against the O antigen and H antigen of Salmonella enterica serotype Typhi were found to be 1:40 and 1:80, respectively. Similarly, the baseline antibody titres for the H antigens of Salmonella enterica serotypes Paratyphi A and Paratyphi B were found to be 1:40 and 1:80, respectively. Conclusion. Thus, the diagnostically significant cutoff of antibody titre from an acute-phase sample was ≥ 1:80 for the S. Typhi O antigen and ≥ 1:160 for both the S. Typhi H antigen and the S. Paratyphi B H antigen. An antibody titre of ≥ 1:80 can be considered significant for the S. Paratyphi A H antigen.
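    The titre ladder used in the Widal test is a simple doubling-dilution series; a small sketch is given below. The one-doubling-above-baseline cutoff shown in the comment mirrors the abstract's conclusion for the S. Typhi O antigen; everything else is generic.

```python
def dilution_series(start=40, end=320):
    """Reciprocal titres of a doubling dilution series: 1:40, 1:80, ..."""
    titres, t = [], start
    while t <= end:
        titres.append(t)
        t *= 2
    return titres

series = dilution_series()
# Baseline 1:40 -> significant cutoff one doubling higher, i.e. >= 1:80.
cutoff = series[series.index(40) + 1]
```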

  8. The effect of a motivational intervention on weight loss is moderated by level of baseline controlled motivation

    Directory of Open Access Journals (Sweden)

    Tate Deborah F

    2010-01-01

    Full Text Available Abstract Background Clinic-based behavioral weight loss programs are effective in producing significant weight loss. A one-size-fits-all approach is often taken with these programs. It may be beneficial to tailor programs based on participants' baseline characteristics. Type and level of motivation may be an important factor to consider. Previous research has found that, in general, higher levels of controlled motivation are detrimental to behavior change while higher levels of autonomous motivation improve the likelihood of behavior modification. Methods This study assessed the outcomes of two internet behavioral weight loss interventions and the effect of baseline motivation levels on program success. Eighty females (mean (SD) age 48.7 (10.6) years; BMI 32.0 (3.7) kg/m2; 91% Caucasian) were randomized to one of two groups, a standard group or a motivation-enhanced group. Both received a 16-week internet behavioral weight loss program and attended an initial and a four-week group session. Weight and motivation were measured at baseline, four and 16 weeks. Hierarchical regression analysis was conducted to test for moderation. Results There was significant weight loss at 16 weeks in both groups, with no difference between groups (p = 0.57; standard group 3.4 (3.6) kg; motivation-enhanced group 3.9 (3.4) kg). Further analysis was conducted to examine predictors of weight loss. Baseline controlled motivation level was negatively correlated with weight loss in the entire sample (r = -0.30; p = 0.01). Statistical analysis revealed an interaction between study group assignment and baseline level of controlled motivation. Weight loss was not predicted by baseline level of controlled motivation in the motivation-enhanced group, but was significantly predicted by controlled motivation in the standard group. Baseline autonomous motivation did not predict weight change in either group.
Conclusions This research found that, in participants with high levels of baseline controlled motivation

  9. A Text-Based Chat System Embodied with an Expressive Agent

    Directory of Open Access Journals (Sweden)

    Lamia Alam

    2017-01-01

    Full Text Available Life-like characters are playing a vital role in social computing by making human-computer interaction easier and more spontaneous. Nowadays, the use of these characters to interact in online virtual environments has gained immense popularity. In this paper, we propose a framework for a text-based chat system embodied with a life-like virtual agent that aims at natural communication between users. To achieve this, we developed an agent that performs nonverbal communication, such as generating facial expressions and motions, by analyzing the users' text messages. More specifically, the agent can generate facial expressions for six basic emotions, namely happy, sad, fear, angry, surprise, and disgust, along with two additional emotions, irony and determination. To make the interaction between users more realistic and lively, we added motions such as eye blinks and head movements. We evaluated the proposed system from different aspects and found the results satisfactory, which makes us believe that this kind of system can play a significant role in making an interaction episode more natural, effective, and interesting. Experimental evaluation reveals that the proposed agent can display emotive expressions correctly 93% of the time by analyzing the users' text input.
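    The abstract does not specify how emotions are extracted from the text, so the following is only a toy keyword-lexicon sketch of the general idea; all words, labels, and the fallback value are illustrative assumptions, not the paper's method.

```python
# Toy keyword lexicon -- every entry here is illustrative.
LEXICON = {
    "happy": {"glad", "great", "wonderful"},
    "sad": {"miss", "lonely", "cry"},
    "angry": {"hate", "furious", "annoyed"},
    "surprise": {"wow", "unbelievable", "amazing"},
}

def detect_emotion(message, default="neutral"):
    """Pick the emotion whose keywords overlap the message most."""
    words = set(message.lower().split())
    scores = {emo: len(words & kws) for emo, kws in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else default

label = detect_emotion("I hate waiting, so annoyed")  # -> "angry"
```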

  10. Text localization using standard deviation analysis of structure elements and support vector machines

    Directory of Open Access Journals (Sweden)

    Zagoris Konstantinos

    2011-01-01

    Full Text Available Abstract A text localization technique is required to successfully exploit document images such as technical articles and letters. The proposed method detects and extracts text areas from document images. Initially, a connected-components analysis technique detects blocks of foreground objects. Then, a descriptor consisting of a set of suitable document structure elements is extracted from the blocks. This is achieved by an algorithm called Standard Deviation Analysis of Structure Elements (SDASE), which maximizes the separability between the blocks. Another feature of the SDASE is that its length adapts to the requirements of the application. Finally, the descriptor of each block is used as input to a trained support vector machine that classifies the block as text or not. The proposed technique is also capable of adjusting to the text structure of the documents. Experimental results on benchmarking databases demonstrate the effectiveness of the proposed method.
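    The abstract does not spell out SDASE itself, so the following is only a toy analogue of the pipeline: a per-block descriptor built from simple structure-element responses (here, row transition counts of a binary block), with a plain threshold standing in for the trained support vector machine. Every name, number, and the threshold are assumptions.

```python
def block_descriptor(block):
    """Per-row black/white transition counts of a binary block --
    a toy stand-in for the paper's structure-element descriptor."""
    return [sum(row[i] != row[i + 1] for i in range(len(row) - 1))
            for row in block]

def is_text_block(block, threshold=2.0):
    """A fixed threshold on the mean response stands in for the SVM."""
    d = block_descriptor(block)
    return sum(d) / len(d) > threshold

text_like = [[0, 1, 0, 1, 0, 1],
             [1, 0, 1, 0, 1, 0],
             [0, 1, 1, 0, 1, 0]]      # many transitions, text-like
solid = [[1, 1, 1, 1, 1, 1]] * 3      # a filled graphic region
```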

  11. Health warnings promote healthier dietary decision making: Effects of positive versus negative message framing and graphic versus text-based warnings.

    Science.gov (United States)

    Rosenblatt, Daniel H; Bode, Stefan; Dixon, Helen; Murawski, Carsten; Summerell, Patrick; Ng, Alyssa; Wakefield, Melanie

    2018-08-01

    Food product health warnings have been proposed as a potential obesity prevention strategy. This study examined the effects of text-only and text-and-graphic, negatively and positively framed health warnings on dietary choice behavior. In a 2 × 5 mixed experimental design, 96 participants completed a dietary self-control task. After providing health and taste ratings of snack foods, participants completed a baseline measure of dietary self-control, operationalized as participants' frequency of choosing healthy but not tasty items and rejecting unhealthy yet tasty items to consume at the end of the experiment. Participants were then randomly assigned to one of five health warning groups and presented with 10 health warnings of a given form: text-based, negative framing; graphic, negative framing; text, positive framing; graphic, positive framing; or a no warning control. Participants then completed a second dietary decision making session to determine whether health warnings influenced dietary self-control. Linear mixed effects modeling revealed a significant interaction between health warning group and decision stage (pre- and post-health warning presentation) on dietary self-control. Negatively framed graphic health warnings promoted greater dietary self-control than other health warnings. Negatively framed text health warnings and positively framed graphic health warnings promoted greater dietary self-control than positively framed text health warnings and control images, which did not increase dietary self-control. Overall, HWs primed healthier dietary decision making behavior, with negatively framed graphic HWs being most effective. Health warnings have potential to become an important element of obesity prevention. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Use of Popular Culture Texts in Mother Tongue Education

    Science.gov (United States)

    Bal, Mazhar

    2018-01-01

    The aim of this study was to associate popular culture texts with the Turkish language lessons of middle school students. For this purpose, a model was proposed and a suitable curriculum was prepared for it. The goal was to determine how this program, which resulted from associating popular culture texts with Turkish language lessons…

  13. Report made on behalf of the parity mixed commission in charge of proposing a text about the dispositions of the project of energy orientation law remaining to be discussed

    International Nuclear Information System (INIS)

    2005-01-01

    The project of energy orientation law aims at fixing the main principles of the French energy policy for the next decades. It foresees: the re-launching of the French nuclear program (building of an experimental European pressurized reactor (EPR)), the reinforcement of energy demand management (3% per year, creation of energy saving certificates and reinforcement of buildings' energy efficiency rules), and support for the development of renewable energies. This document presents, first, a direct comparison, article by article, of the text adopted in second reading by the House of Commons with the text adopted in second reading by the Senate. A text is then proposed for the last dispositions that remained to be discussed; it is presented in the second part of the report. (J.S.)

  14. Scene text detection by leveraging multi-channel information and local context

    Science.gov (United States)

    Wang, Runmin; Qian, Shengyou; Yang, Jianfeng; Gao, Changxin

    2018-03-01

    As an important information carrier, text plays a significant role in many applications. However, text detection in unconstrained scenes is a challenging problem due to cluttered backgrounds, varied appearances, uneven illumination, etc. In this paper, an approach based on multi-channel information and local context is proposed to detect texts in natural scenes. Because character candidate detection plays a vital role in a text detection system, Maximally Stable Extremal Regions (MSERs) and a graph-cut-based method are integrated to obtain character candidates by leveraging multi-channel image information. A cascaded false-positive elimination mechanism is constructed from the perspectives of the character and the text line, respectively. Since local context information is very valuable, it is utilized to retrieve missing characters and boost text detection performance. Experimental results on two benchmark datasets, i.e., the ICDAR 2011 dataset and the ICDAR 2013 dataset, demonstrate that the proposed method achieves state-of-the-art performance.

  15. Visual Saliency Models for Text Detection in Real World.

    Directory of Open Access Journals (Sweden)

    Renwu Gao

    Full Text Available This paper evaluates the degree of saliency of texts in natural scenes using visual saliency models. A large-scale scene image database with pixel-level ground truth is created for this purpose. Using this scene image database and five state-of-the-art models, visual saliency maps that represent the degree of saliency of the objects are calculated. The receiver operating characteristic curve is employed in order to evaluate the saliency of scene texts as calculated by the visual saliency models. A visualization of the distribution of scene texts and non-texts is given in the space constructed by three kinds of saliency maps, which are calculated using Itti's visual saliency model with intensity, color, and orientation features. This visualization indicates that text characters are more salient than their non-text neighbors and can be captured from the background; therefore, scene texts can be extracted from the scene images. With this in mind, a new visual saliency architecture, named the hierarchical visual saliency model, is proposed. It is based on Itti's model and consists of two stages. In the first stage, Itti's model is used to calculate the saliency map, and Otsu's global thresholding algorithm is applied to extract the salient region of interest. In the second stage, Itti's model is applied to the salient region to calculate the final saliency map. An experimental evaluation demonstrates that the proposed model outperforms Itti's model in terms of captured scene texts.
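    Otsu's global thresholding, used in the first stage to isolate the salient region, is a standard algorithm; a self-contained histogram version follows (the toy pixel data is hypothetical).

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: choose the threshold that maximizes
    the between-class variance of the grey-level histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0              # background weight and intensity sum
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b        # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (total_sum - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated intensity clusters: the threshold lands between them.
t = otsu_threshold([10] * 50 + [200] * 50)
```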

  16. ASM Based Synthesis of Handwritten Arabic Text Pages.

    Science.gov (United States)

    Dinges, Laslo; Al-Hamadi, Ayoub; Elzobi, Moftah; El-Etriby, Sherif; Ghoneim, Ahmed

    2015-01-01

    Document analysis tasks such as text recognition, word spotting, and segmentation are highly dependent on comprehensive and suitable databases for training and validation. However, their generation is expensive in terms of labor and time. As a matter of fact, there is a lack of such databases, which complicates research and development. This is especially true for Arabic handwriting recognition, which involves different preprocessing, segmentation, and recognition methods, each with individual demands on samples and ground truth. To bypass this problem, we present an efficient system that automatically turns Arabic Unicode text into synthetic images of handwritten documents with detailed ground truth. Active Shape Models (ASMs) based on 28046 online samples were used for character synthesis, and statistical properties were extracted from the IESK-arDB database to simulate baselines and word slant or skew. In the synthesis step, ASM-based representations are composed into words and text pages, smoothed by B-spline interpolation, and rendered considering writing speed and pen characteristics. Finally, we use the synthetic data to validate a segmentation method. An experimental comparison with the IESK-arDB database encourages training and testing document analysis methods on synthetic samples whenever sufficient natural ground-truthed data is unavailable.

  17. Mining Sequential Update Summarization with Hierarchical Text Analysis

    Directory of Open Access Journals (Sweden)

    Chunyun Zhang

    2016-01-01

    Full Text Available The outbreak of unexpected news events such as large human accidents or natural disasters brings about a new information access problem where traditional approaches fail. Typically, news of these events is sparse early on and redundant later. Hence, it is very important to provide individuals with timely and important updates on such incidents as they develop, especially in wireless and mobile Internet of Things (IoT) applications. In this paper, we define the problem of sequential update summarization extraction and present a new hierarchical update mining system that can broadcast useful, novel, and timely sentence-length updates about a developing event. The system introduces a novel method that incorporates techniques from topic-level and sentence-level summarization. To evaluate the performance of the proposed system, we apply it to the sequential update summarization task of the temporal summarization (TS) track at the Text Retrieval Conference (TREC) 2013 and compute four measurements of the update mining system: expected gain, expected latency gain, comprehensiveness, and latency comprehensiveness. Experimental results show that our proposed method performs well.

  18. A high capacity text steganography scheme based on LZW compression and color coding

    Directory of Open Access Journals (Sweden)

    Aruna Malik

    2017-02-01

    Full Text Available In this paper, the capacity and security issues of text steganography are addressed by employing the LZW compression technique and a color-coding-based approach. The proposed technique uses the forward mail platform to hide the secret data. The algorithm first compresses the secret data and then hides the compressed data in the email addresses and in the cover message of the email. The secret data bits are embedded in the message (or cover text) by making it colored using a color coding table. Experimental results show that the proposed method not only produces a high embedding capacity but also reduces computational complexity. Moreover, the security of the proposed method is significantly improved by employing stego keys. The superiority of the proposed method has been experimentally verified by comparison with recently developed existing techniques.
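    LZW compression, the first stage of the scheme, is a standard algorithm; a minimal compressor follows (the color-coding and embedding stages of the paper are not reproduced here).

```python
def lzw_compress(text):
    """Classic LZW: emit dictionary codes for the longest known prefixes."""
    table = {chr(i): i for i in range(256)}  # seed with single characters
    next_code = 256
    w, out = "", []
    for ch in text:
        wc = w + ch
        if wc in table:
            w = wc                 # extend the current phrase
        else:
            out.append(table[w])   # emit code for the known prefix
            table[wc] = next_code  # learn the new phrase
            next_code += 1
            w = ch
    if w:
        out.append(table[w])
    return out

# Repeated substrings compress to single codes: fewer codes than characters.
codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
```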

  19. Probing neutrino oscillations jointly in long and very long baseline experiments

    International Nuclear Information System (INIS)

    Wang, Y.F.; Whisnant, K.; Young Binglin; Xiong Zhaohua; Yang Jinmin

    2002-01-01

    We examine the prospects of making a joint analysis of neutrino oscillations at two baselines with neutrino superbeams. Assuming narrow band superbeams and a 100 kiloton water Cherenkov calorimeter, we calculate the event rates and sensitivities to the matter effect, the signs of the neutrino mass differences, the CP phase, and the mixing angle θ 13 . Taking into account all possible experimental errors under general consideration, we explore the optimum cases of a narrow band beam to measure the matter effect and the CP violation effect at all baselines up to 3000 km. We then focus on two specific baselines, a long baseline of 300 km and a very long baseline of 2100 km, and analyze their joint capabilities. We find that the joint analysis can offer extra leverage to resolve some of the ambiguities that are associated with the measurement at a single baseline
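    For intuition about why the two baselines probe different physics, the standard two-flavour vacuum oscillation formula can be evaluated at both distances. This sketch ignores the matter effects the paper studies, and the beam energy and mixing parameters are illustrative assumptions, not the paper's values.

```python
import math

def survival_prob(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=1.0):
    """Two-flavour vacuum survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.267 * dm2 * L / E)."""
    return 1.0 - sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# Illustrative 1 GeV beam at the two baselines discussed in the paper:
p_300 = survival_prob(300, 1.0)
p_2100 = survival_prob(2100, 1.0)
```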

  20. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair eNeuman

    2015-06-01

    Full Text Available School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  1. Baseline cerebral oximetry values depend on non-modifiable patient characteristics.

    Science.gov (United States)

    Valencia, Lucía; Rodríguez-Pérez, Aurelio; Ojeda, Nazario; Santana, Romen Yone; Morales, Laura; Padrón, Oto

    2015-12-01

    The aim of the present study was to evaluate baseline regional cerebral oxygen saturation (rSO2) values and identify factors influencing preoperative rSO2 in elective minor surgery. Observational post-hoc analysis of data for the patient sample (n=50) of a previously conducted clinical trial in patients undergoing tumourectomy for breast cancer or inguinal hernia repair. Exclusion criteria included pre-existing cerebrovascular disease and anaemia. Baseline rSO2 and pulse oximetry values were recorded while the patient breathed room air, using the INVOS 5100C monitor™ (Covidien, Dublin, Ireland). Thirty-seven women (72%) and 13 men (28%), 48 ± 13 years of age, were enrolled in this study. Baseline rSO2 was 62.01 ± 10.38%. Baseline rSO2 differed significantly between men (67.6 ± 11.2%) and women (60 ± 9.4%) (P=0.023). There were also differences in baseline rSO2 across ASA physical status (ASA I: 67.6 ± 10.7%, ASA II: 61.6 ± 8.4%, ASA III: 55.8 ± 13.9%; P=0.045). Baseline rSO2 correlated positively with body weight (r=0.347, P=0.014) and height (r=0.345, P=0.014). We also found significant differences in baseline rSO2 between patients with and without chronic renal failure (P=0.005). No differences were found for any other studied variable. Non-modifiable patient characteristics (ASA physical status, sex, chronic renal failure, body weight and height) influence baseline rSO2. Copyright © 2015 Société française d’anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.

  2. Directed Activities Related to Text: Text Analysis and Text Reconstruction.

    Science.gov (United States)

    Davies, Florence; Greene, Terry

    This paper describes Directed Activities Related to Text (DART), procedures that were developed and are used in the Reading for Learning Project at the University of Nottingham (England) to enhance learning from texts and that fall into two broad categories: (1) text analysis procedures, which require students to engage in some form of analysis of…

  3. A baseline for the multivariate comparison of resting state networks

    Directory of Open Access Journals (Sweden)

    Elena A Allen

    2011-02-01

    Full Text Available As the size of functional and structural MRI datasets expands, it becomes increasingly important to establish a baseline from which diagnostic relevance may be determined, a processing strategy that efficiently prepares data for analysis, and a statistical approach that identifies important effects in a manner that is both robust and reproducible. In this paper, we introduce a multivariate analytic approach that optimizes sensitivity and reduces unnecessary testing. We demonstrate the utility of this mega-analytic approach by identifying the effects of age and gender on the resting state networks of 603 healthy adolescents and adults (mean age: 23.4 years, range: 12 to 71 years). Data were collected on the same scanner, preprocessed using an automated analysis pipeline based in SPM, and studied using group independent component analysis. Resting state networks were identified and evaluated in terms of three primary outcome measures: time course spectral power, spatial map intensity, and functional network connectivity. Results revealed robust effects of age on all three outcome measures, largely indicating decreases in network coherence and connectivity with increasing age. Gender effects were of smaller magnitude but suggested stronger intra-network connectivity in females and more inter-network connectivity in males, particularly with regard to sensorimotor networks. These findings, along with the analysis approach and statistical framework described here, provide a useful baseline for future investigations of brain networks in health and disease.

  4. Effects of Baseline Selection on Magnetocardiography: P-Q and T-P Intervals

    International Nuclear Information System (INIS)

    Lim, Hyun Kyoon; Kwon, Hyuk Chan; Kim, Tae En; Lee, Yong Ho; Kim, Jin Mok; Kim, In Seon; Kim, Ki Woong; Park, Yong Ki

    2007-01-01

    Baseline selection is the first and an important step in analyzing magnetocardiography (MCG) parameters. Selecting the baseline between the P- and Q-wave peaks (P-Q interval) of MCG recordings from healthy subjects poses no difficulty, because the P-Q intervals of healthy subjects do not vary much. However, patients with ischemic heart disease often show an unstable P-Q interval, which is not appropriate for the baseline. In that case, the T-P interval is recommended as an alternative baseline. However, there has been no study of the difference made by the baseline selection. In this study, we examined the effect of the two baseline choices. MCG data were analyzed from twenty healthy subjects and twenty-one patients whose baselines were alternatively selected in the T-P interval because of an inappropriate P-Q interval. A paired t-test was used to compare the two sets of data. Fifteen parameters derived from the R-wave peak, the T-wave peak, and the period Tmax/3 ∼ Tmax were compared between the two baseline selections. Most parameters showed no significant differences (p>0.05), with only a few exceptions. Therefore, no significant differences are expected whichever of the two intervals is selected for the MCG baseline. However, for consistent analysis, the P-Q interval is strongly recommended for the baseline correction.
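The paired comparison described above can be sketched as follows. This is a minimal illustration with invented synthetic data, not the study's dataset: one MCG parameter is computed for each subject under both baseline choices, and the paired t statistic on the per-subject differences is compared against the two-tailed critical value.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical data: one MCG parameter for 20 subjects, computed once with
# a P-Q-interval baseline and once with a T-P-interval baseline (synthetic).
pq = rng.normal(loc=1.00, scale=0.10, size=20)
tp = pq + rng.normal(loc=0.005, scale=0.02, size=20)

# Paired t-test on the per-subject differences.
d = pq - tp
n = d.size
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(n))

# Two-tailed critical value for alpha = 0.05, df = n - 1 = 19.
T_CRIT = 2.093
print(f"t = {t_stat:.3f}, differs significantly: {abs(t_stat) > T_CRIT}")
```

With real data, a non-significant result (|t| below the critical value) would support using either interval as the baseline.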

  5. Baseline trace metals in water and sediment of the Baleh River-a tropical river in Sarawak, Malaysia.

    Science.gov (United States)

    Sim, Siong Fong; Chai, Hui Ping; Nyanti, Lee; Ling, Teck Yee; Grinang, Jongkar

    2016-09-01

    Quantitative indices are classically employed to evaluate the contamination status of metals with reference to baseline concentrations. Baselines vary considerably across geographical zones, so it is imperative to determine the local geochemical baseline before evaluating contamination status. No study has yet established background concentrations in tropical rivers of this region. This paper reports the background concentrations of metals in water and sediment of the Baleh River, Sarawak, derived using statistical methods in which possibly disturbed areas are distinguished from undisturbed areas. The baseline levels of six elements in water were Al (0.34 mg/L), Fe (0.51 mg/L), Mn (0.12 mg/L), Cu (0.01 mg/L), Pb (0.03 mg/L), and Zn (0.05 mg/L); arsenic and selenium were below the detection limit. For sediment, background values were established using several statistical methods: mean + 2σ, iterative 2σ, cumulative distribution frequency, interquartile range, and the calculated distribution function. The background values derived using the iterative 2σ algorithm and the calculated distribution function were relatively lower. The baseline levels calculated were within the range reported in the literature, mainly from tropical and sub-tropical regions. The upper limits proposed for nine elements in sediment were Al (100,879 mg/kg), Cr (75.45 mg/kg), Cu (34.59 mg/kg), Fe (37,823 mg/kg), Mn (793 mg/kg), Ni (22.88 mg/kg), Pb (27.26 mg/kg), Zn (70.64 mg/kg), and Hg (0.33 mg/kg). The quantitative indices calculated suggest a low risk of contamination at the Baleh River.
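Two of the background-value methods named above, mean + 2σ and iterative 2σ, can be sketched in a few lines. The concentrations below are invented for illustration; the iterative variant repeatedly discards values outside mean ± 2σ before computing the baseline upper limit, which is why it yields lower values when anomalously enriched sites are present.

```python
import numpy as np

def iterative_2sigma(values, max_iter=100):
    """Iteratively discard values outside mean +/- 2*sigma until the
    retained set is stable, then return mean + 2*sigma of that set as
    the geochemical baseline upper limit."""
    x = np.asarray(values, dtype=float)
    for _ in range(max_iter):
        m, s = x.mean(), x.std(ddof=1)
        kept = x[(x >= m - 2 * s) & (x <= m + 2 * s)]
        if kept.size == x.size:
            break
        x = kept
    return x.mean() + 2 * x.std(ddof=1)

rng = np.random.default_rng(0)
# Synthetic sediment Cu concentrations (mg/kg): a natural background plus
# a few anomalously enriched sites (hypothetical numbers).
background = rng.normal(30.0, 5.0, size=95)
anomalies = np.array([120.0, 150.0, 180.0, 200.0, 240.0])
cu = np.concatenate([background, anomalies])

simple = cu.mean() + 2 * cu.std(ddof=1)   # mean + 2*sigma on raw data
iterative = iterative_2sigma(cu)          # after outlier screening
print(f"mean+2s: {simple:.1f}, iterative 2s: {iterative:.1f}")
```

The outliers inflate the simple estimate, while the iterative estimate settles near the true background distribution.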

  6. Prognostic value of baseline seric Syndecan-1 in initially unresectable metastatic colorectal cancer patients: a simple biological score.

    Science.gov (United States)

    Jary, Marine; Lecomte, Thierry; Bouché, Olivier; Kim, Stefano; Dobi, Erion; Queiroz, Lise; Ghiringhelli, Francois; Etienne, Hélène; Léger, Julie; Godet, Yann; Balland, Jérémy; Lakkis, Zaher; Adotevi, Olivier; Bonnetain, Franck; Borg, Christophe; Vernerey, Dewi

    2016-11-15

    In first-line metastatic colorectal cancer (mCRC), baseline prognostic factors allowing death-risk and treatment-strategy stratification are lacking. The soluble form of Syndecan-1 (CD138) has never been described as a prognostic biomarker in mCRC. We investigated its additional prognostic value for overall survival (OS). mCRC patients with unresectable disease at diagnosis were treated with bevacizumab-based chemotherapy in two independent prospective clinical trials (development set: n = 126, validation set: n = 51; studies NCT00489697 and NCT00544011, respectively). Serum was collected at baseline for CD138 measurement. OS determinants were assessed and, based on the final multivariate model, a prognostic score was proposed. Two independent OS prognostic factors were identified: high Lactate Dehydrogenase (LDH) level (p = 0.0066) and high log-CD138 level (p = 0.0190). Dichotomizing CD138 (cutoff: 75 ng/mL) allowed the construction of a biological prognostic score from the CD138 and LDH values, identifying three risk groups for death (median OS = 38.9, 30.1 and 19.8 months for the low-, intermediate- and high-risk groups, respectively). Baseline soluble CD138 thus adds prognostic value for OS in mCRC patients, and a simple biological scoring system combining the LDH and CD138 binary status values is proposed. © 2016 UICC.
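A two-factor binary score of this kind is mechanically simple. The sketch below is a hypothetical reading of the abstract (the abstract does not spell out the point-to-group mapping): one point for high LDH, one point for CD138 above the 75 ng/mL cutoff, with 0/1/2 points mapping to the low/intermediate/high risk groups.

```python
def biological_risk_group(ldh_high: bool, cd138_ng_ml: float) -> str:
    """Hypothetical two-factor score in the spirit of the abstract:
    one point for high LDH, one point for soluble CD138 above the
    75 ng/mL cutoff; 0/1/2 points map to low/intermediate/high risk."""
    points = int(ldh_high) + int(cd138_ng_ml > 75.0)
    return ("low", "intermediate", "high")[points]

print(biological_risk_group(False, 40.0))   # low
print(biological_risk_group(True, 40.0))    # intermediate
print(biological_risk_group(True, 120.0))   # high
```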

  7. Baseline projections of transportation energy consumption by mode: 1981 update

    Energy Technology Data Exchange (ETDEWEB)

    Millar, M; Bunch, J; Vyas, A; Kaplan, M; Knorr, R; Mendiratta, V; Saricks, C

    1982-04-01

    A comprehensive set of activity and energy-demand projections for each of the major transportation modes and submodes is presented. Projections are developed for a business-as-usual scenario, which provides a benchmark for assessing the effects of potential conservation strategies. This baseline scenario assumes a continuation of present trends, including fuel-efficiency improvements likely to result from current efforts of vehicle manufacturers. Because of anticipated changes in fuel efficiency, fuel price, modal shifts, and a lower-than-historic rate of economic growth, projected growth rates in transportation activity and energy consumption depart from historic patterns. The text discusses the factors responsible for this departure, documents the assumptions and methodologies used to develop the modal projections, and compares the projections with other efforts.

  8. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Science.gov (United States)

    2010-07-01

    ... baseline toxics value if it can determine an applicable toxics value for every batch of gasoline produced... of gasoline batch i produced or imported between January 1, 1998 and December 31, 2000, inclusive. i = Individual batch of gasoline produced or imported between January 1, 1998 and December 31, 2000, inclusive. n...

  9. Improved Small Baseline processing by means of CAESAR eigen-interferograms decomposition

    Science.gov (United States)

    Verde, Simona; Reale, Diego; Pauciullo, Antonio; Fornaro, Gianfranco

    2018-05-01

    The Component extrAction and sElection SAR (CAESAR) algorithm is a recently proposed method for selecting and filtering scattering mechanisms in the multibaseline interferometric SAR framework. Its strength lies in the possibility of selecting and extracting multiple dominant scattering mechanisms, even when they interfere in the same pixel, from the stage of interferogram generation onward, and of filtering decorrelation phase noise. Until now, CAESAR has been validated in the framework of SAR tomography for the model-based detection of Persistent Scatterers (PSs). In this paper we investigate the effectiveness of CAESAR eigen-interferograms in classical multi-baseline DInSAR processing based on the Small BAseline Subset (SBAS) strategy, typically adopted to extract large-scale distributed deformation and the atmospheric phase screen. Such components are also exploited to calibrate the full-resolution data for PS or tomographic analysis. Using COSMO-SkyMed (CSK) SAR data, we demonstrate that filtering the dominant scattering component effectively improves the monitoring of distributed, spatially decorrelated areas (e.g., bare soil and rocks) and brings to light man-made structures with dominant backscattering characteristics embedded in highly temporally decorrelated scenarios, such as isolated asphalt roads and blocks of buildings in non-urban areas. Moreover, thanks to CAESAR's separation of multiple scattering components, layover mitigation in the low-topography eigen-interferograms relieves phase unwrapping (PhU) errors caused by abrupt height variations in urban areas.

  10. An improved algorithm for information hiding based on features of Arabic text: A Unicode approach

    Directory of Open Access Journals (Sweden)

    A.A. Mohamed

    2014-07-01

    Full Text Available Steganography is the practice of hiding secret information in a cover medium so that other parties fail to realize its existence. Owing to the lack of data redundancy in text files compared with other carrier files, text steganography is a difficult problem. In this paper, we propose a new, promising steganographic algorithm for Arabic text based on features of the Arabic script. The focus is on a more secure algorithm with a high carrier capacity. Our extensive experiments with the proposed algorithm showed a high capacity of the carrier medium; the embedding capacity ratio of the proposed algorithm is high. In addition, the algorithm can resist traditional attacks because it keeps the changes to the carrier text as small as possible.
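To make the general idea of Unicode text steganography concrete, here is a deliberately simplified analogue: hiding bits as zero-width characters after cover characters. This is not the paper's feature-based Arabic method, just a minimal stand-in showing why text carriers can hide data invisibly.

```python
# Simplified Unicode text steganography: hide one bit per cover character
# as an invisible zero-width code point.  NOT the paper's algorithm.
ZW0, ZW1 = "\u200b", "\u200c"   # zero-width space / zero-width non-joiner

def embed(cover: str, secret_bits: str) -> str:
    out, bits = [], iter(secret_bits)
    for ch in cover:
        out.append(ch)
        b = next(bits, None)
        if b is not None:
            out.append(ZW1 if b == "1" else ZW0)
    return "".join(out)

def extract(stego: str) -> str:
    return "".join("1" if c == ZW1 else "0"
                   for c in stego if c in (ZW0, ZW1))

stego = embed("baseline text", "1011")
assert stego != "baseline text"   # longer, yet visually identical
print(extract(stego))             # -> 1011
```

Feature-based schemes like the one proposed in the paper exploit properties of the script itself (rather than extra code points), which makes them harder to detect by simple filtering.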

  11. Baseline Projection Data Book: GRI baseline projection of U.S. Energy Supply and Demand to 2010. 1992 Edition. Volume 1 and Volume 2

    International Nuclear Information System (INIS)

    Holtberg, P.D.; Woods, T.J.; Lihn, M.L.; Koklauner, A.K.

    1992-01-01

    The 1992 Baseline Projection Data Book provides backup data in tabular form for the 1992 GRI Baseline Projection of U.S. Energy Supply and Demand to 2010. Summary tables and data for the residential, commercial, industrial, electric utility, and transportation sectors are presented in the volume.

  12. A Comparison of Video Modeling, Text-Based Instruction, and No Instruction for Creating Multiple Baseline Graphs in Microsoft Excel

    Science.gov (United States)

    Tyner, Bryan C.; Fienup, Daniel M.

    2015-01-01

    Graphing is socially significant for behavior analysts; however, graphing can be difficult to learn. Video modeling (VM) may be a useful instructional method but lacks evidence for effective teaching of computer skills. A between-groups design compared the effects of VM, text-based instruction, and no instruction on graphing performance.…

  13. Experimental increase in baseline corticosterone level reduces oxidative damage and enhances innate immune response.

    Directory of Open Access Journals (Sweden)

    Csongor I Vágási

    Full Text Available Glucocorticoid (GC) hormones are significant regulators of homeostasis. The physiological effects of GCs critically depend on the time of exposure (short vs. long) as well as on their circulating levels (baseline vs. stress-induced). Previous experiments, in which chronic and high elevation of GC levels was induced, indicate that GCs impair both the activity of the immune system and the oxidative balance. Nonetheless, our knowledge of how mildly elevated GC levels, a situation much more common in nature, might influence homeostasis is limited. Therefore, we studied whether an increase in GC level within the baseline range suppresses or enhances condition (body mass, hematocrit and coccidian infestation) and physiological state (humoral innate immune system activity and oxidative balance). We implanted captive house sparrows Passer domesticus with either 60-day-release corticosterone (CORT) or control pellets. CORT-treated birds had elevated baseline CORT levels one week after implantation, but CORT subsequently returned to its pre-treatment level, and the experimental groups had similar CORT levels one and two months after implantation. The mass of tail feathers grown during the initial phase of treatment was smaller in treated than in control birds. CORT implantation had a transient negative effect on body mass and hematocrit, but both traits returned to their pre-treatment values by one month post-treatment. CORT treatment lowered oxidative damage to lipids (malondialdehyde) and enhanced constitutive innate immunity at one week and one month post-implantation. Our findings suggest that a relatively short-term (i.e. a few days) elevation of baseline CORT might have a positive and stimulatory effect on animal physiology.

  14. ConText : Contactless Sensors for Body Monitoring Incorporated in Textiles

    NARCIS (Netherlands)

    Langereis, G.; Voogd-Claessen, L. de; Spaepen, A.; Sipliä, A.; Rotsch, C.; Linz, T.

    2007-01-01

    The aim of the ConText project is to develop a vest with integrated sensors and electronics for continuous monitoring of muscle activity. The vest measures muscle activity in order to derive the psychological stress level of a person. To this end, the ConText project proposes to develop a contactless sensor technology that can be incorporated in textiles.

  15. Baseline for the cumulants of net-proton distributions at STAR

    International Nuclear Information System (INIS)

    Luo, Xiaofeng; Mohanty, Bedangadas; Xu, Nu

    2014-01-01

    We present a systematic comparison between the recently measured cumulants of the net-proton distributions by STAR for 0–5% central Au + Au collisions at √(sNN) = 7.7–200 GeV and two possible baseline measures, the Poisson and Binomial baselines. These baselines assume that the proton and anti-proton distributions independently follow Poisson or Binomial statistics, respectively. The higher-order cumulants of the net-proton data are observed to deviate from all the baseline measures studied at 19.6 and 27 GeV. We also compare net-proton with net-baryon fluctuations in the UrQMD and AMPT models, and convert net-proton fluctuations to net-baryon fluctuations in the AMPT model using a set of formulas.
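The Poisson baseline mentioned above has a closed form: if protons and anti-protons are independent Poisson variables, their difference follows a Skellam distribution, whose odd cumulants equal μp − μp̄ and whose even cumulants equal μp + μp̄. The sketch below computes these baseline cumulants for invented means and cross-checks the first two by Monte Carlo.

```python
import numpy as np

# Poisson (Skellam) baseline for net-proton cumulants: with protons ~
# Poisson(mu_p) and antiprotons ~ Poisson(mu_pbar), independent, the net
# number has odd cumulants mu_p - mu_pbar and even cumulants mu_p + mu_pbar.
mu_p, mu_pbar = 8.0, 3.0   # invented mean multiplicities
C1, C2, C3, C4 = (mu_p - mu_pbar, mu_p + mu_pbar,
                  mu_p - mu_pbar, mu_p + mu_pbar)
print(f"C3/C2 = {C3 / C2:.3f}, C4/C2 = {C4 / C2:.3f}")

# Monte Carlo check of the first two cumulants.
rng = np.random.default_rng(42)
net = rng.poisson(mu_p, 200_000) - rng.poisson(mu_pbar, 200_000)
print(f"sample C1 = {net.mean():.2f}, sample C2 = {net.var():.2f}")
```

Deviations of measured cumulant ratios such as C4/C2 from these baseline values are what signal non-Poissonian physics in the data.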

  16. Parametric estimation of time varying baselines in airborne interferometric SAR

    DEFF Research Database (Denmark)

    Mohr, Johan Jacob; Madsen, Søren Nørvang

    1996-01-01

    A method for the estimation of time-varying spatial baselines in airborne interferometric synthetic aperture radar (SAR) is described. The range and azimuth distortions between two images acquired with a non-linear baseline are derived. A parametric model of the baseline is then estimated, in a least-squares sense, from image shifts obtained by cross-correlation of numerous small patches throughout the image. The method has been applied to airborne EMISAR imagery from the 1995 campaign over the Storstrommen Glacier in North East Greenland conducted by the Danish Center for Remote Sensing. This has reduced the baseline uncertainties from several meters to the centimeter level in a 36 km scene. Though developed for airborne SAR, the method can easily be adapted to satellite data.
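The least-squares step described above can be sketched generically: fit a low-order parametric model of the baseline to the patch-wise shifts measured by cross-correlation. The model form, numbers, and noise level below are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setup: along-track position t (km) of image patches, and the
# shift (m) measured for each patch by cross-correlation.  The true
# time-varying baseline is modelled here as a quadratic in t.
t = np.linspace(0.0, 36.0, 40)               # 36 km scene, 40 patches
true_coeffs = np.array([0.002, -0.05, 1.2])  # quadratic, linear, constant
shift = np.polyval(true_coeffs, t) + rng.normal(0.0, 0.02, t.size)

# Least-squares fit of the parametric baseline model to the measured shifts.
A = np.vander(t, 3)                          # columns: t^2, t, 1
est, *_ = np.linalg.lstsq(A, shift, rcond=None)
print("estimated coefficients:", np.round(est, 3))
```

Because many patches constrain a few model parameters, the fitted baseline is far more precise than any individual shift measurement, which is how meter-level uncertainties shrink to centimeters.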

  17. An Approach to Retrieval of OCR Degraded Text

    Directory of Open Access Journals (Sweden)

    Yuen-Hsien Tseng

    1998-12-01

    Full Text Available The major problem with retrieval of OCR text is the unpredictable distortion of characters due to recognition errors. Because users have no idea of such distortion, the terms they query can hardly match the terms stored in the OCR text exactly. Retrieval effectiveness is thus significantly reduced, especially for low-quality input. To reduce the losses from retrieving such noisy OCR text, a fault-tolerant retrieval strategy based on automatic keyword extraction and fuzzy matching is proposed. In this strategy, terms, correct or not, and their term frequencies are extracted from the noisy text and presented for browsing and selection in response to users' initial queries. With an understanding of the real terms stored in the noisy text and of their estimated frequency distributions, users may then choose appropriate terms for more effective searching. A text retrieval system based on this strategy has been built, and examples demonstrating its effectiveness are given. Finally, some OCR issues for further enhancing retrieval effectiveness are discussed.
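The fuzzy-matching idea can be sketched with a standard string-similarity measure. This is a minimal stand-in, not the paper's exact algorithm: a query is matched against the (possibly mis-recognized) terms extracted from OCR output, and candidates above a similarity threshold are returned with their frequencies for the user to inspect.

```python
from difflib import SequenceMatcher

def fuzzy_lookup(query: str, ocr_terms: dict, threshold: float = 0.75):
    """Return OCR'd terms (with frequencies) similar to the query, ranked
    by similarity -- a minimal sketch of fault-tolerant lookup over noisy
    OCR output."""
    hits = []
    for term, freq in ocr_terms.items():
        score = SequenceMatcher(None, query, term).ratio()
        if score >= threshold:
            hits.append((term, freq, round(score, 2)))
    return sorted(hits, key=lambda h: h[2], reverse=True)

# Term frequencies as extracted from a noisy OCR'd document (invented).
ocr_index = {"baseline": 12, "basellne": 3, "bascline": 1, "retrieval": 7}
print(fuzzy_lookup("baseline", ocr_index))
```

A query for "baseline" also surfaces the OCR-distorted variants "basellne" and "bascline", so their occurrences are not lost to the search.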

  18. The Effect of Pretest Exercise on Baseline Computerized Neurocognitive Test Scores.

    Science.gov (United States)

    Pawlukiewicz, Alec; Yengo-Kahn, Aaron M; Solomon, Gary

    2017-10-01

    Baseline neurocognitive assessment plays a critical role in return-to-play decision making following sport-related concussions. Prior studies have assessed the effect of a variety of modifying factors on neurocognitive baseline test scores. However, relatively little investigation has been conducted on the effect of pretest exercise on baseline testing. The aim of our investigation was to determine the effect of pretest exercise on baseline Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores in adolescent and young adult athletes. We hypothesized that athletes undergoing self-reported strenuous exercise within 3 hours of baseline testing would perform more poorly on neurocognitive metrics and would report a greater number of symptoms than those who had not completed such exercise. Cross-sectional study; Level of evidence, 3. The ImPACT records of 18,245 adolescent and young adult athletes were retrospectively analyzed. After application of inclusion and exclusion criteria, participants were dichotomized into groups based on a positive (n = 664) or negative (n = 6609) self-reported history of strenuous exercise within 3 hours of the baseline test. Participants with a positive history of exercise were then randomly matched, based on age, sex, education level, concussion history, and hours of sleep prior to testing, on a 1:2 basis with individuals who had reported no pretest exercise. The baseline ImPACT composite scores of the 2 groups were then compared. Significant differences were observed for the ImPACT composite scores of verbal memory, visual memory, reaction time, and impulse control, as well as for the total symptom score. No significant between-group difference was detected for the visual motor composite score. Furthermore, pretest exercise was associated with a significant increase in the overall frequency of invalid test results. Our results suggest statistically significant differences in ImPACT composite scores between athletes who did and did not report strenuous exercise within 3 hours of baseline testing.

  19. New Regional and Global HFC Projections and Effects of National Regulations and Montreal Protocol Amendment Proposals

    Science.gov (United States)

    Velders, G. J. M.

    2015-12-01

    Hydrofluorocarbons (HFCs) are used as substitutes for ozone-depleting substances that are being phased out globally under Montreal Protocol regulations. New global scenarios of HFC emissions reach 4.0-5.3 GtCO2-eq yr-1 in 2050, corresponding to growth from 2015 to 2050 that is 9% to 29% of that projected for CO2 over the same period. New baseline scenarios are formulated for 10 HFC compounds, 11 geographic regions, and 13 use categories. These projections are the first to comprehensively assess production and consumption of individual HFCs in multiple use sectors and geographic regions with emission estimates constrained by atmospheric observations. In 2050, in percent of global HFC emissions, China (~30%), India and the rest of Asia (~25%), the Middle East and northern Africa (~10%), and the USA (~10%) are the principal source regions; refrigeration and stationary air conditioning are the major use sectors. National regulations to limit HFC use have recently been adopted in the European Union, Japan and the USA, and four proposals were submitted in 2015 to amend the Montreal Protocol to substantially reduce growth in HFC use. Calculated baseline emissions are reduced by 90% in 2050 by implementing the North America Montreal Protocol amendment proposal. Global adoption of the technologies required to meet national regulations would be sufficient to reduce 2050 baseline HFC consumption by more than 50% of the reduction achieved with the North America proposal for most developed and developing countries. The new HFC scenarios and the effects of national regulations and Montreal Protocol amendment proposals will be presented.

  20. Fat Metaplasia on Sacroiliac Joint Magnetic Resonance Imaging at Baseline Is Associated with Spinal Radiographic Progression in Patients with Axial Spondyloarthritis.

    Directory of Open Access Journals (Sweden)

    Kwi Young Kang

    Full Text Available To study the relationship between inflammatory and structural lesions in the sacroiliac joints (SIJs) on MRI and spinal progression observed on conventional radiographs in patients with axial spondyloarthritis (axSpA). One hundred and ten patients who fulfilled the ASAS axSpA criteria were enrolled. All underwent SIJ MRI at baseline and lumbar spine radiographs at baseline and after 2 years. Inflammatory and structural lesions on SIJ MRI were scored using the SPondyloArthritis Research Consortium of Canada (SPARCC) method. Spinal radiographs were scored using the Stoke AS Spinal Score (SASSS). Multivariate logistic regression analysis was performed to identify predictors of spinal progression. Among the 110 patients, 25 (23%) showed significant radiographic progression (change in SASSS ≥ 2 over 2 years). There was no change in the SASSS over 2 years according to the type of inflammatory lesion. Patients with fat metaplasia or ankylosis on baseline MRI showed a significantly higher SASSS at 2 years than those without (p<0.001). In univariate logistic regression analysis, age at diagnosis, HLA-B27 positivity, the presence of fat metaplasia, erosion and ankylosis on SIJ MRI, increased baseline CRP levels, and the presence of syndesmophytes at baseline were associated with spinal progression over 2 years. Multivariate analysis identified syndesmophytes and severe fat metaplasia on baseline SIJ MRI as predictive of spinal radiographic progression (OR, 14.74 and 5.66, respectively). Inflammatory lesions in the SIJs on baseline MRI were not associated with spinal radiographic progression. However, fat metaplasia at baseline was significantly associated with spinal progression after 2 years.

  1. FED baseline engineering studies report

    Energy Technology Data Exchange (ETDEWEB)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  2. FED baseline engineering studies report

    International Nuclear Information System (INIS)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  3. Measuring Baseline Agriculture-Related Sustainable Development Goals Index for Southern Africa

    Directory of Open Access Journals (Sweden)

    Charles Nhemachena

    2018-03-01

    Full Text Available Sustainable development has become the main focus of the global development agenda as presented in the 2015 Sustainable Development Goals (SDGs). However, for countries to assess progress, they need to have reliable baseline indicators. Therefore, the objective of this paper is to develop a composite baseline index of the agriculture-related SDGs in Southern Africa to guide progress reporting. The paper identified eight of the SDG indicators related to the agriculture sector. The paper relies on data for indicators from five SDGs (SDGs 1, 2, 6, 7 and 15). Applying the arithmetic mean method of aggregation, an agriculture-related SDG composite index for Southern Africa between zero (0 = poor performance) and 100 (best possible performance) was computed for thirteen countries that had data on all identified indicators. The results show that the best performing countries (Botswana, Angola, Namibia, Zambia and South Africa) in the assessment recorded high scores in SDGs 1, 2 and 7. The three countries (Democratic Republic of Congo, Zimbabwe and Madagascar) that performed poorly on both SDG 1 and 2 also had the least scores on the overall agriculture-related SDG composite index. The water stress indicator for SDG 6 recorded the worst performance among most countries in the region. Possible approaches to improve the contribution of agriculture to SDGs may include investing more resources in priority areas for each agriculture-related SDG depending on baseline country conditions. The implementation, monitoring and evaluation of regional and continental commitments in the agriculture sector and the SDGs are critical for achievement of the targets at the national and local levels. While the methods employed are well-grounded in literature, data unavailability for some of the SDGs in some countries presented a limitation to the study, and future efforts should focus on collecting data for the other SDGs in order to permit a wider application.
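The arithmetic-mean aggregation used above is straightforward once each indicator is normalized to the common 0-100 scale. The sketch below uses invented indicator values and benchmarks purely for illustration; note the third indicator shows how "lower is better" variables are handled by swapping the worst/best benchmarks.

```python
import numpy as np

def min_max_normalise(value, worst, best):
    """Map a raw indicator value to 0..100 given worst/best benchmarks."""
    return 100.0 * (value - worst) / (best - worst)

def composite_index(indicator_scores):
    """Arithmetic-mean aggregation of indicator scores already on the
    0 (worst performance) .. 100 (best performance) scale."""
    return np.asarray(indicator_scores, dtype=float).mean()

# Invented raw indicators for one country, with invented benchmarks:
# (value, worst benchmark, best benchmark).
raw = [(0.45, 0.0, 1.0),    # share-type indicator, higher is better
       (62.0, 0.0, 100.0),  # percentage indicator, higher is better
       (0.30, 1.0, 0.0)]    # stress-type indicator, lower is better
normalised = [min_max_normalise(v, w, b) for v, w, b in raw]
print([round(s, 1) for s in normalised],
      round(composite_index(normalised), 1))
```

With these invented inputs the three indicators normalize to 45, 62 and 70, giving a composite index of 59.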

  4. “By his wind, he put Yam into his net” – (R. H. [Chaim] Cohen correction proposal of the BHS text of Job 26:13

    Directory of Open Access Journals (Sweden)

    Osvaldo Luiz Ribeiro

    2015-07-01

    Full Text Available This article formulates, as a textual-criticism proposal, a suggested correction of the text of Job 26:13 in the Biblia Hebraica Stuttgartensia (BHS), drawing on the dissertation of Harold R. (Chaim) Cohen, 1975, published in 1978 under the title Biblical Hapax Legomena in the Light of Akkadian and Ugaritic. Cohen makes two claims: (1) he recovers the recommendation of Tur-Sinai (1941) that the word hrpX in Job 26:13 be translated from its Akkadian cognate "saparru", rendering it as "net", so that it would be a case of hapax legomenon; and (2) he argues that a copyist error occurred in the transmission of the Hebrew verse: two independent original words, ~X and ~y, were mistakenly run together by a scribe and turned into the form ~yIm:åv' now found in the standard BHS text. Cohen's suggestions restore the synonymic parallelism of the four cola in Job 26:12-13, since Yam, appearing in the corrected v. 13a, parallels the other sea monsters mentioned in vv. 12a, 12b and 13b. Job 26:13 should then be read as follows: "by his wind, he put Yam into his net". No version or commentary has been identified that has heeded Cohen's suggestion.

  5. Underlying topography extraction over forest areas from multi-baseline PolInSAR data

    Science.gov (United States)

    Fu, Haiqiang; Zhu, Jianjun; Wang, Changcheng; Li, Zhiwei

    2017-11-01

    In this paper, the digital elevation model (DEM) for a forest area is extracted from multi-baseline (MB) polarimetric interferometric synthetic aperture radar (PolInSAR) data. On the basis of the random-volume-over-ground (RVoG) model, the weighted complex least-squares adjustment (WCLSA) method is proposed for ground phase estimation, so that the MB PolInSAR observations can be constrained by a generalized observation function and the contribution of each observation to the solution can be adjusted by a weighting strategy. A baseline-length weighting strategy is then adopted to fuse the DEMs estimated from the ground phases. The results of the simulated experiment undertaken in this study demonstrate that the WCLSA method is sensitive to the number of redundant observations and can adjust the contributions of the different observations. We also applied the WCLSA method to E-SAR L- and P-band MB PolInSAR data from the Krycklan River catchment in northern Sweden. The results show that the two extracted DEMs are in close agreement with the Light Detection and Ranging (Lidar) DEM, with root-mean-square errors of 3.54 and 3.16 m, respectively. The DEM vertical error is correlated with the terrain slope and ground-cover condition, but not with the forest height.

  6. Influence of Baseline Psychological Health on Muscle Pain During Atorvastatin Treatment.

    Science.gov (United States)

    Zaleski, Amanda L; Taylor, Beth A; Pescatello, Linda S; Dornelas, Ellen A; White, Charles Michael; Thompson, Paul D

    3-Hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins) are generally well tolerated, with statin-associated muscle symptoms (SAMS) the most common side effect (~10%) seen in statin users. However, studies and clinical observations indicate that many self-reported SAMS appear to be nonspecific (ie, potentially not attributable to statins). Mental health and well-being influence self-perception of pain, so we sought to assess the effect of baseline well-being and depression on the development of muscle pain with 6 months of atorvastatin 80 mg/d (ATORVA) or placebo in healthy, statin-naive adults. The Psychological General Well-Being Index (n = 83) and Beck Depression Inventory (n = 55) questionnaires were administered at baseline to participants (aged 59.5 ± 1.2 years) from The Effect of Statins on Skeletal Muscle Function and Performance (STOMP) trial (NCT00609063). Muscle pain (Short-Form McGill Pain Questionnaire [SF-MPQ]), pain that interferes with daily life (Brief Pain Inventory [BPI]), and pain severity (BPI) were then measured before, throughout, and after treatment. At baseline, there were no differences in well-being (Psychological General Well-Being Index), depression (Beck Depression Inventory), or pain measures (SF-MPQ and BPI) (P values ≥ .05) between the placebo and ATORVA groups. Baseline well-being correlated negatively with baseline BPI pain severity (r = -0.290, P = .008). Baseline depression correlated with baseline pain (SF-MPQ; r = 0.314, P = .020). Baseline well-being and depression did not predict the change in pain severity or interference after 6 months in the total sample or between groups (P values ≥ .05). Baseline well-being and depression were thus not significant predictors of pain after 6 months of ATORVA (P values ≥ .05) and do not appear to increase the risk of SAMS in otherwise healthy adults.

  7. Baseline Report on HB2320

    Science.gov (United States)

    State Council of Higher Education for Virginia, 2015

    2015-01-01

    Staff provides this baseline report as a summary of its preliminary considerations and initial research in fulfillment of the requirements of HB2320 from the 2015 session of the General Assembly. Codified as § 23-7.4:7, this legislation compels the Education Secretary and the State Council of Higher Education for Virginia (SCHEV) Director, in…

  8. Pediatric Heart Transplantation: Transitioning to Adult Care (TRANSIT): Baseline Findings.

    Science.gov (United States)

    Grady, Kathleen L; Hof, Kathleen Van't; Andrei, Adin-Cristian; Shankel, Tamara; Chinnock, Richard; Miyamoto, Shelley; Ambardekar, Amrut V; Anderson, Allen; Addonizio, Linda; Latif, Farhana; Lefkowitz, Debra; Goldberg, Lee; Hollander, Seth A; Pham, Michael; Weissberg-Benchell, Jill; Cool, Nichole; Yancy, Clyde; Pahl, Elfriede

    2018-02-01

    Young adult solid organ transplant recipients who transfer from pediatric to adult care experience poor outcomes related to decreased adherence to the medical regimen. Our pilot trial for young adults with a heart transplant (HT) who transfer to adult care tests an intervention focused on increasing HT knowledge, self-management and self-advocacy skills, and enhancing support, as compared with usual care. We report baseline findings between groups regarding (1) patient-level outcomes and (2) components of the intervention. From 3/14 to 9/16, 88 subjects enrolled and were randomized to intervention (n = 43) or usual care (n = 45) at six pediatric HT centers. Patient self-report questionnaires and medical records data were collected at baseline and at 3 and 6 months after transfer. For this report, baseline findings (at enrollment and prior to transfer to adult care) were analyzed using Chi-square and t-tests, with significance set at p < .05. Baseline demographics were similar in the intervention and usual care arms: age 21.3 ± 3.2 vs 21.5 ± 3.3 years and female 44% vs 49%, respectively. At baseline, there were no differences between intervention and usual care for use of tacrolimus (70 vs 62%); tacrolimus level (mean ± SD = 6.5 ± 2.3 ng/ml vs 5.6 ± 2.3 ng/ml); average of the within-patient standard deviation of the baseline mean tacrolimus levels (1.6 vs 1.3); or adherence to the medical regimen [3.6 ± 0.4 vs 3.5 ± 0.5 (1 = hardly ever to 4 = all of the time)], respectively. At baseline, both groups had a modest amount of HT knowledge, were still learning self-management and self-advocacy, and perceived that they were adequately supported. Baseline findings indicate that transitioning HT recipients lack essential knowledge about HT and have incomplete self-management and self-advocacy skills.

  9. Association between baseline impedance values and response to proton pump inhibitors in patients with heartburn.

    Science.gov (United States)

    de Bortoli, Nicola; Martinucci, Irene; Savarino, Edoardo; Tutuian, Radu; Frazzoni, Marzio; Piaggi, Paolo; Bertani, Lorenzo; Furnari, Manuele; Franchi, Riccardo; Russo, Salvatore; Bellini, Massimo; Savarino, Vincenzo; Marchi, Santino

    2015-06-01

    Esophageal impedance measurements have been proposed to indicate the status of the esophageal mucosa, and might be used to study the roles of the impaired mucosal integrity and increased acid sensitivity in patients with heartburn. We compared baseline impedance levels among patients with heartburn who did and did not respond to proton pump inhibitor (PPI) therapy, along with the pathophysiological characteristics of functional heartburn (FH). In a case-control study, we collected data from January to December 2013 on patients with heartburn and normal findings from endoscopy who were not receiving PPI therapy and underwent impedance pH testing at hospitals in Italy. Patients with negative test results were placed on an 8-week course of PPI therapy (84 patients received esomeprazole and 36 patients received pantoprazole). Patients with more than 50% symptom improvement were classified as FH/PPI responders and patients with less than 50% symptom improvement were classified as FH/PPI nonresponders. Patients with hypersensitive esophagus and healthy volunteers served as controls. In all patients and controls, we measured acid exposure time, number of reflux events, baseline impedance, and swallow-induced peristaltic wave indices. FH/PPI responders had higher acid exposure times, numbers of reflux events, and acid refluxes compared with FH/PPI nonresponders (P < .05). Patients with hypersensitive esophagus had mean acid exposure times and numbers of reflux events similar to those of FH/PPI responders. Baseline impedance levels were lower in FH/PPI responders and patients with hypersensitive esophagus, compared with FH/PPI nonresponders and healthy volunteers (P < .001). Swallow-induced peristaltic wave indices were similar between FH/PPI responders and patients with hypersensitive esophagus. Patients with FH who respond to PPI therapy have impedance pH features similar to those of patients with hypersensitive esophagus. 
Baseline impedance measurements might allow for

  10. Active Collection of Land Cover Sample Data from Geo-Tagged Web Texts

    Directory of Open Access Journals (Sweden)

    Dongyang Hou

    2015-05-01

    Full Text Available Sample data plays an important role in land cover (LC) map validation. Traditionally, they are collected through field survey or image interpretation, either of which is costly, labor-intensive and time-consuming. In recent years, massive geo-tagged texts are emerging on the web and they contain valuable information for LC map validation. However, this kind of special textual data has seldom been analyzed and used for supporting LC map validation. This paper examines the potential of geo-tagged web texts as a new cost-free sample data source to assist LC map validation and proposes an active data collection approach. The proposed approach uses a customized deep web crawler to search for geo-tagged web texts based on land cover-related keywords and string-based rule matching. A data transformation based on buffer analysis is then performed to convert the collected web texts into LC sample data. Using three provinces and three municipalities directly under the Central Government in China as study areas, geo-tagged web texts were collected to validate the artificial surface class of China's 30-meter global land cover dataset (GlobeLand30-2010). A total of 6283 geo-tagged web texts were collected at a speed of 0.58 texts per second. The collected texts about built-up areas were transformed into sample data. A user's accuracy of 82.2% was achieved, which is close to that derived from formal expert validation. The preliminary results show that geo-tagged web texts are valuable ancillary data for LC map validation and the proposed approach can improve the efficiency of sample data collection.
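
    The buffer-analysis step, keeping a geo-tagged text as a validation sample only if it falls within a buffer around a map location, can be sketched roughly as follows. The buffer radius and coordinates are illustrative assumptions, and the haversine distance stands in for whatever GIS routine the authors actually used:

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two (lat, lon) points."""
        r = 6371000.0  # mean Earth radius in metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = p2 - p1
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def within_buffer(text_latlon, pixel_latlon, radius_m=500.0):
        """True if the geo-tagged text lies inside the buffer of a map location."""
        return haversine_m(*text_latlon, *pixel_latlon) <= radius_m

    # A text geo-tagged a few hundred metres from the candidate pixel centre
    print(within_buffer((39.9042, 116.4074), (39.9050, 116.4080)))
    ```

    In a real pipeline the buffer radius would be tied to the map's pixel size and the positional uncertainty of the geo-tag.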

  11. Text Messaging for Psychiatric Outpatients: Effect on Help-Seeking and Self-Harming Behaviors.

    Science.gov (United States)

    Kodama, Toyohiko; Syouji, Hiroko; Takaki, Sachiko; Fujimoto, Hirokazu; Ishikawa, Shinichi; Fukutake, Masaaki; Taira, Masaru; Hashimoto, Takeshi

    2016-04-01

    A mobile phone intervention was developed and tested with 30 psychiatric outpatients with mental illness, who had high ideation for suicide. The intervention involved promoting help-seeking behaviors by sending text messages, including information about social welfare services and reminders about medical appointments, for 6 months. After the intervention period, the number of participants who used social services significantly increased, and more than 80% of participants reported that the text messaging service was helpful and useful. Compared to baseline, participants' self-harming behaviors decreased and the attending psychiatrists rated their suicide ideation as weaker. This is the first intervention study to promote psychiatric patients' help-seeking using text messaging, and although it was not a randomized controlled trial, this intervention has practical value and may lead to the prevention of suicide. Copyright 2016, SLACK Incorporated.

  12. Modeling and Simulation of Offshore Wind Power Platform for 5 MW Baseline NREL Turbine

    Science.gov (United States)

    Roni Sahroni, Taufik

    2015-01-01

    This paper presents the modeling and simulation of an offshore wind power platform for oil and gas companies. Wind energy has become the fastest growing renewable energy in the world, and major gains in energy generation are achievable when turbines are moved offshore. The objective of this project is to propose a new design for an offshore wind power platform. An offshore wind turbine (OWT) is composed of three main structures: the rotor/blades, the tower/nacelle, and the supporting structure. The modeling analysis was focused on the nacelle and supporting structure. The completed final design was analyzed using the finite element modeling tool ANSYS to obtain the structure's response to loading conditions and to ensure that it complies with guidelines laid out by the classification authority Det Norske Veritas. As a result, a new model of the offshore wind power platform for the 5 MW Baseline NREL turbine was proposed. PMID:26550605

  13. Who's Gotta Have It?: The Ownership of Meaning and Mass Media Texts.

    Science.gov (United States)

    Wolfe, Arnold S.

    1992-01-01

    Argues that the contention that media texts have no meaning is problematic. Repositions the concept of "text" within the context of general semiotic theory. Uses an approach culled from literary, film, and communication perspectives to reanalyze canonical research on television texts. Proposes a new research agenda. (PRA)

  14. Detecting causality from online psychiatric texts using inter-sentential language patterns

    Directory of Open Access Journals (Sweden)

    Wu Jheng-Long

    2012-07-01

    Full Text Available Abstract Background Online psychiatric texts are natural language texts expressing depressive problems, published by Internet users via community-based web services such as web forums, message boards and blogs. Understanding the cause-effect relations embedded in these psychiatric texts can provide insight into the authors’ problems, thus increasing the effectiveness of online psychiatric services. Methods Previous studies have proposed the use of word pairs extracted from a set of sentence pairs to identify cause-effect relations between sentences. A word pair is made up of two words, with one coming from the cause text span and the other from the effect text span. Analysis of the relationship between these words can be used to capture individual word associations between cause and effect sentences. For instance, (broke up, life) and (boyfriend, meaningless) are two word pairs extracted from the sentence pair: “I broke up with my boyfriend. Life is now meaningless to me”. The major limitation of word pairs is that individual words in sentences usually cannot reflect the exact meaning of the cause and effect events, and thus may produce semantically incomplete word pairs, as the previous examples show. Therefore, this study proposes the use of inter-sentential language patterns such as <broke up, boyfriend>. Results Performance was evaluated on a corpus of texts collected from PsychPark (http://www.psychpark.org), a virtual psychiatric clinic maintained by a group of volunteer professionals from the Taiwan Association of Mental Health Informatics. Experimental results show that the use of inter-sentential language patterns outperformed the use of word pairs proposed in previous studies. Conclusions This study demonstrates the acquisition of inter-sentential language patterns for causality detection from online psychiatric texts. Such semantically more complete and precise features can improve causality detection performance.
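
    The word-pair baseline that the study compares against can be sketched as the cross product of content words from a cause/effect sentence pair. This is a minimal illustration; the stopword list and tokenization below are assumptions, not the study's preprocessing:

    ```python
    from itertools import product

    # Illustrative stopword list for the running example only
    STOPWORDS = {"i", "with", "my", "is", "now", "to", "me", "up"}

    def content_words(sentence):
        """Lowercase, strip punctuation, drop stopwords."""
        words = [w.strip(".,!?").lower() for w in sentence.split()]
        return [w for w in words if w and w not in STOPWORDS]

    def word_pairs(cause, effect):
        """Every (cause-word, effect-word) combination across the pair."""
        return set(product(content_words(cause), content_words(effect)))

    pairs = word_pairs("I broke up with my boyfriend.",
                       "Life is now meaningless to me.")
    print(sorted(pairs))
    ```

    The output contains pairs like (boyfriend, meaningless), illustrating the limitation the abstract notes: individual words cannot capture the full event ("broke up"), which is what the proposed inter-sentential patterns address.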

  15. Baseline risk assessment of ground water contamination at the uranium mill tailings sites near Slick Rock, Colorado

    International Nuclear Information System (INIS)

    1994-11-01

    This baseline risk assessment of ground water contamination at the uranium mill tailings sites near Slick Rock, Colorado, evaluates potential public health and environmental impacts resulting from ground water contamination at the former North Continent (NC) and Union Carbide (UC) uranium mill processing sites. The tailings at these sites will be placed in a disposal cell at the proposed Burro Canyon, Colorado, site. The US Department of Energy (DOE) anticipates the start of the first phase remedial action by the spring of 1995 under the direction of the DOE's Uranium Mill Tailings Remedial Action (UMTRA) Project. The second phase of the UMTRA Project will evaluate ground water contamination. This baseline risk assessment is the first site-specific document for these sites under the Ground Water Project. It will help determine the compliance strategy for contaminated ground water at the site. In addition, surface water and sediment are qualitatively evaluated in this report

  16. Baseline risk assessment of ground water contamination at the uranium mill tailings sites near Slick Rock, Colorado

    Energy Technology Data Exchange (ETDEWEB)

    1994-11-01

    This baseline risk assessment of ground water contamination at the uranium mill tailings sites near Slick Rock, Colorado, evaluates potential public health and environmental impacts resulting from ground water contamination at the former North Continent (NC) and Union Carbide (UC) uranium mill processing sites. The tailings at these sites will be placed in a disposal cell at the proposed Burro Canyon, Colorado, site. The US Department of Energy (DOE) anticipates the start of the first phase remedial action by the spring of 1995 under the direction of the DOE's Uranium Mill Tailings Remedial Action (UMTRA) Project. The second phase of the UMTRA Project will evaluate ground water contamination. This baseline risk assessment is the first site-specific document for these sites under the Ground Water Project. It will help determine the compliance strategy for contaminated ground water at the site. In addition, surface water and sediment are qualitatively evaluated in this report.

  17. Baseline Optimization for the Measurement of CP Violation, Mass Hierarchy, and $\\theta_{23}$ Octant in a Long-Baseline Neutrino Oscillation Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Bass, M. [Colorado State U.; Bishai, M. [Brookhaven; Cherdack, D. [Colorado State U.; Diwan, M. [Brookhaven; Djurcic, Z. [Argonne; Hernandez, J. [Houston U.; Lundberg, B. [Fermilab; Paolone, V. [Pittsburgh U.; Qian, X. [Brookhaven; Rameika, R. [Fermilab; Whitehead, L. [Houston U.; Wilson, R. J. [Colorado State U.; Worcester, E. [Brookhaven; Zeller, G. [Fermilab

    2015-03-19

    Next-generation long-baseline electron neutrino appearance experiments will seek to discover CP violation, determine the mass hierarchy and resolve the θ23 octant. In light of the recent precision measurements of θ13, we consider the sensitivity of these measurements in a study to determine the optimal baseline, including practical considerations regarding beam and detector performance. We conclude that a detector at a baseline of at least 1000 km in a wide-band muon neutrino beam is the optimal configuration.

  18. Should Studies of Diabetes Treatment Stratification Correct for Baseline HbA1c?

    Science.gov (United States)

    Jones, Angus G.; Lonergan, Mike; Henley, William E.; Pearson, Ewan R.; Hattersley, Andrew T.; Shields, Beverley M.

    2016-01-01

    Aims Baseline HbA1c is a major predictor of response to glucose-lowering therapy and therefore a potential confounder in studies aiming to identify other predictors. However, baseline adjustment may introduce error if the association between baseline HbA1c and response is substantially due to measurement error and regression to the mean. We aimed to determine whether studies of predictors of response should adjust for baseline HbA1c. Methods We assessed the relationship between baseline HbA1c and glycaemic response in 257 participants treated with GLP-1R agonists and assessed whether it reflected measurement error and regression to the mean, using duplicate ‘pre-baseline’ HbA1c measurements not included in the response variable. In this cohort and an additional 2659 participants treated with sulfonylureas, we assessed the relationship between covariates associated with baseline HbA1c and treatment response with and without baseline adjustment, and with a bias correction using pre-baseline HbA1c to adjust for the effects of error in baseline HbA1c. Results Baseline HbA1c was a major predictor of response (R2 = 0.19, β = -0.44, p < 0.001). Covariates associated with baseline HbA1c were associated with response; however, these associations were weak or absent after adjustment for baseline HbA1c. Bias correction did not substantially alter the associations. Conclusions Adjustment for the baseline HbA1c measurement is a simple and effective way to reduce bias in studies of predictors of response to glucose-lowering therapy. PMID:27050911
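
    The mechanism the authors describe can be illustrated with synthetic data: when the same noisy baseline measurement appears (negated) inside the response variable, measured baseline correlates with response even though true HbA1c does not. All numbers below are simulated assumptions, not study data:

    ```python
    import random

    random.seed(42)

    true_hba1c = [random.gauss(8.5, 1.0) for _ in range(2000)]
    noise = lambda: random.gauss(0.0, 0.5)               # assay measurement error
    baseline = [t + noise() for t in true_hba1c]         # measured at baseline
    follow_up = [t - 1.0 + noise() for t in true_hba1c]  # uniform 1.0-unit drop
    response = [f - b for f, b in zip(follow_up, baseline)]

    def corr(xs, ys):
        """Pearson correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return sxy / (sxx * syy) ** 0.5

    # Measured baseline "predicts" response purely through shared error:
    print(round(corr(baseline, response), 2))    # clearly negative
    print(round(corr(true_hba1c, response), 2))  # near zero
    ```

    Here every patient's true response is exactly -1.0, yet measured baseline still correlates with response because its error term is subtracted inside the response. This is exactly the regression-to-the-mean artefact that duplicate pre-baseline measurements help to disentangle.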

  19. Baseline series fragrance markers fail to predict contact allergy.

    Science.gov (United States)

    Mann, Jack; McFadden, John P; White, Jonathan M L; White, Ian R; Banerjee, Piu

    2014-05-01

    Negative patch test results with fragrance allergy markers in the European baseline series do not always predict a negative reaction to individual fragrance substances. To determine the frequencies of positive test reactions to the 26 fragrance substances for which labelling is mandatory in the EU, and how effectively reactions to fragrance markers in the baseline series predict positive reactions to the fragrance substances that are labelled. The records of 1951 eczema patients, routinely tested with the labelled fragrance substances and with an extended European baseline series in 2011 and 2012, were retrospectively reviewed. Two hundred and eighty-one (14.4%) (71.2% females) reacted to one or more allergens from the labelled-fragrance substance series and/or a fragrance marker from the European baseline series. The allergens that were positive with the greatest frequencies were cinnamyl alcohol (48; 2.46%), Evernia furfuracea (44; 2.26%), and isoeugenol (40; 2.05%). Of the 203 patients who reacted to any of the 26 fragrances in the labelled-fragrance substance series, only 117 (57.6%) also reacted to a fragrance marker in the baseline series. One hundred and seven (52.7%) reacted to either fragrance mix I or fragrance mix II, 28 (13.8%) reacted to Myroxylon pereirae, and 13 (6.4%) reacted to hydroxyisohexyl 3-cyclohexene carboxaldehyde. These findings confirm that the standard fragrance markers fail to identify patients with contact allergies to the 26 fragrances. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Indexation automatique des textes arabes : état de l’art

    Directory of Open Access Journals (Sweden)

    Mohamed Salim El Bazzi

    2016-11-01

    Full Text Available Document indexing is a crucial step in the text mining process. It is used to represent documents by the most relevant descriptors of their contents. Several approaches have been proposed in the literature, particularly for English, but they are unusable for Arabic documents, given the specific characteristics of Arabic and its morphological complexity in grammar and vocabulary. In this paper, we present a review of the state of the art of indexing methods and their contribution to improving Arabic document processing. We also propose a categorization of works according to the approaches and methods most used for indexing textual documents. We adopted a qualitative selection of papers, retaining those making notable indexing contributions and reporting significant results.

  1. An Approach to a Comprehensive Test Framework for Analysis and Evaluation of Text Line Segmentation Algorithms

    Directory of Open Access Journals (Sweden)

    Zoran N. Milivojevic

    2011-09-01

    Full Text Available The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key step for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms rely on text databases as reference templates, but such databases may not match real conditions, so a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-line text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error-type classification, are proposed. The first is based on the segmentation line error description, while the second one incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the procedure based on the segmentation line error description has some advantages, being characterized by five measures that describe the measurement procedure.

  2. A publication database for optical long baseline interferometry

    Science.gov (United States)

    Malbet, Fabien; Mella, Guillaume; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain

    2010-07-01

    Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site, using MySQL and a collection of XML or PHP scripts, in order to store and classify these publications. Each entry is defined by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted into categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed, and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.
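
    A toy sketch of such a tag-based publication store: the real database uses MySQL with XML/PHP scripts, and the table and column names below are assumptions for illustration only (shown in SQLite for self-containment):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    # One row per publication (keyed by ADS bibcode); tags are a separate
    # table so a paper can carry any number of (name, category) labels.
    conn.executescript("""
    CREATE TABLE publication (bibcode TEXT PRIMARY KEY, title TEXT, year INTEGER);
    CREATE TABLE tag (name TEXT, category TEXT, bibcode TEXT REFERENCES publication);
    """)
    conn.execute("INSERT INTO publication VALUES ('demo:0001', 'An interferometry paper', 2010)")
    conn.executemany(
        "INSERT INTO tag VALUES (?, ?, ?)",
        [("VLTI", "facility", "demo:0001"),
         ("K band", "wavelength", "demo:0001")])

    # Retrieve every paper tagged with a given facility
    rows = conn.execute(
        """SELECT p.bibcode FROM publication p
           JOIN tag t ON t.bibcode = p.bibcode
           WHERE t.category = 'facility' AND t.name = 'VLTI'""").fetchall()
    print(rows)
    ```

    The category column mirrors the abstract's tag categories (facility, instrumentation, wavelength, and so on), which is what makes the per-category statistical plots straightforward to generate.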

  3. Industrial Hazardous Waste Management In Egypt-the baseline study: An Updated review

    International Nuclear Information System (INIS)

    Farida M, S.

    1999-01-01

    Increased industrialization over the past decades in Egypt has resulted in increased and uncontrolled generation of industrial hazardous waste, without any concerted effort to control these wastes. Consequently, no system exists for handling or disposing of industrial wastes in general, and industrial hazardous wastes in particular. In 1993, a baseline report was formulated to assess the overall problem of industrial hazardous waste management in Egypt. Consequently, recommendations for priority actions were identified, and the main components of a national hazardous waste system under the provisions of Law 4/1994 were presented. This paper provides an updated review of this report in light of the proposed technical, legal and institutional guidelines, to help in the realization of such a needed waste management system in Egypt

  4. Predictive value of isolated DLCO reduction in systemic sclerosis patients without cardio-pulmonary involvement at baseline

    Directory of Open Access Journals (Sweden)

    M. Colaci

    2016-05-01

    Full Text Available Impaired diffusing capacity of the lung for carbon monoxide (DLCO) is frequently observed in systemic sclerosis (SSc) patients, generally related to the presence of interstitial lung disease (ILD) and/or pulmonary arterial hypertension (PAH). However, in clinical practice abnormally low DLCO values may also be found in the absence of these SSc complications. The objective was to investigate the prospective clinical relevance of isolated DLCO reduction at baseline in SSc patients. Ninety-seven SSc female patients (age at diagnosis: 51.3±14.5 years; disease duration: 10.4±6.6 years; limited/diffuse skin subsets: 92/5), without any clinical, radiological (high-resolution computed tomography), or echocardiographic manifestations of ILD or PAH at baseline, nor other lung or heart diseases able to affect DLCO, were recruited at our Rheumatology Centre. Patients with DLCO <55% (15 patients; group A) were compared with those with normal DLCO (82 patients; group B) at baseline and at the end of follow-up. At baseline, patients of group A showed a significantly higher percentage of anticentromere autoantibodies compared to group B (13/15, 86.6% vs 48/82, 58.5%; p=0.044). More interestingly, at the end of a long-lasting clinical follow-up (11.6±6.7 years), pre-capillary PAH (confirmed by right heart catheterization) developed solely in patients of group A (3/15, 20% vs 0/82; p=0.003). In SSc patients, the presence at baseline of isolated, marked DLCO reduction (<55% of predicted) together with serum anticentromere autoantibodies might characterize a peculiar SSc subset that may precede the development of PAH. Therefore, careful clinical follow-up of patients with isolated moderate-severe DLCO reduction should be mandatory.

  5. Back-calculating baseline creatinine overestimates prevalence of acute kidney injury with poor sensitivity.

    Science.gov (United States)

    Kork, F; Balzer, F; Krannich, A; Bernardi, M H; Eltzschig, H K; Jankowski, J; Spies, C

    2017-03-01

    Acute kidney injury (AKI) is diagnosed by a 50% increase in creatinine. For patients without a baseline creatinine measurement, guidelines suggest estimating baseline creatinine by back-calculation. The aim of this study was to evaluate different glomerular filtration rate (GFR) equations and different GFR assumptions for back-calculating baseline creatinine, as well as the effect on the diagnosis of AKI. The Modification of Diet in Renal Disease (MDRD), the Chronic Kidney Disease Epidemiology (CKD-EPI) and the Mayo quadratic (MQ) equations were evaluated to estimate baseline creatinine, each under the assumption of either a fixed GFR of 75 mL/min/1.73 m² or an age-adjusted GFR. Estimated baseline creatinine, and diagnoses and severity stages of AKI based on estimated baseline creatinine, were compared to measured baseline creatinine and the corresponding diagnoses and severity stages of AKI. The data of 34 690 surgical patients were analysed. Back-calculation overestimated baseline creatinine. Diagnosing AKI based on estimated baseline creatinine had only substantial agreement with AKI diagnoses based on measured baseline creatinine [Cohen's κ ranging from 0.66 (95% CI 0.65-0.68) to 0.77 (95% CI 0.76-0.79)] and overestimated AKI prevalence with fair sensitivity [ranging from 74.3% (95% CI 72.3-76.2) to 90.1% (95% CI 88.6-92.1)]. Staging AKI severity based on estimated baseline creatinine had moderate agreement with AKI severity based on measured baseline creatinine [Cohen's κ ranging from 0.43 (95% CI 0.42-0.44) to 0.53 (95% CI 0.51-0.55)]. Diagnosing AKI and staging AKI severity on the basis of estimated baseline creatinine in surgical patients is not feasible. Patients at risk for post-operative AKI should have a pre-operative creatinine measurement to adequately assess post-operative AKI. © 2016 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.
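
    The back-calculation under evaluation can be sketched by inverting the 4-variable MDRD equation under an assumed GFR of 75 mL/min/1.73 m². The coefficients and the 50%-rise AKI criterion below are commonly published values used here as assumptions, not the study's code:

    ```python
    def back_calculated_creatinine(age, female, black, assumed_gfr=75.0):
        """Invert MDRD: GFR = 175 * Scr^-1.154 * age^-0.203 * sex/race factors.

        Returns estimated baseline serum creatinine in mg/dL.
        """
        factor = 175.0 * age ** -0.203
        if female:
            factor *= 0.742
        if black:
            factor *= 1.212
        return (assumed_gfr / factor) ** (-1.0 / 1.154)

    def is_aki(measured_creatinine, baseline_creatinine):
        """The 50%-increase criterion mentioned in the abstract."""
        return measured_creatinine >= 1.5 * baseline_creatinine

    baseline = back_calculated_creatinine(age=60, female=False, black=False)
    print(round(baseline, 2), is_aki(1.6, baseline))
    ```

    The study's point is visible in this sketch: the AKI verdict depends entirely on the assumed GFR and equation, so a patient whose true baseline differs from the back-calculated value can be misclassified in either direction.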

  6. Solid Waste Program technical baseline description

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  7. Geodesy by radio interferometry - Determinations of baseline vector, earth rotation, and solid earth tide parameters with the Mark I very long baseline radio interferometry system

    Science.gov (United States)

    Ryan, J. W.; Clark, T. A.; Coates, R. J.; Ma, C.; Wildes, W. T.

    1986-01-01

    Thirty-seven very long baseline radio interferometry experiments performed between 1972 and 1978 are analyzed, and estimates of baseline vectors between six sites, five in the continental United States and one in Europe, are derived. No evidence of significant changes in baseline length is found. For example, with a statistical level of confidence of approximately 85 percent, upper bounds on such changes within the United States ranged from a low of 10 mm/yr for the 850 km baseline between Westford, Massachusetts, and Green Bank, West Virginia, to a high of 90 mm/yr for the nearly 4000 km baseline between Westford and Goldstone, California. Estimates for universal time and for the x component of the position of the earth's pole are obtained. For the last 15 experiments, the only ones employing wideband receivers, the root-mean-square differences between the derived values and the corresponding ones published by the Bureau International de l'Heure are 0.0012 s and 0.018 arc sec, respectively. The average value obtained for the radial Love number for the solid earth is 0.62 ± 0.02 (estimated standard error).
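The baseline-length rate bounds quoted above come from fitting length time series; a minimal weighted least-squares slope estimator of that kind (run here on synthetic data, not the Mark I measurements) might look like:

```python
import numpy as np

def baseline_rate(epochs_yr, lengths_mm, sigmas_mm):
    """Weighted least-squares slope (mm/yr) of baseline length vs. time."""
    t = np.asarray(epochs_yr, dtype=float)
    L = np.asarray(lengths_mm, dtype=float)
    w = 1.0 / np.asarray(sigmas_mm, dtype=float) ** 2   # inverse-variance weights
    t_bar = np.average(t, weights=w)
    L_bar = np.average(L, weights=w)
    return float(np.sum(w * (t - t_bar) * (L - L_bar)) /
                 np.sum(w * (t - t_bar) ** 2))
```

A synthetic series with a 10 mm/yr trend is recovered exactly, and the estimator downweights noisy epochs via the supplied formal errors.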

  8. A Customizable Text Classifier for Text Mining

    Directory of Open Access Journals (Sweden)

    Yun-liang Zhang

    2007-12-01

    Full Text Available Text mining deals with complex and unstructured texts. Usually a particular collection of texts specific to one or more domains is necessary. We have developed a customizable text classifier that lets users mine such a collection automatically. It derives from the sentence category of the HNC theory and corresponding techniques. It can start with a few texts, and it can adjust itself automatically or be adjusted by the user. The user can also control the number of domains chosen and set the standard by which texts are selected, based on demand and the abundance of material. The performance of the classifier varies with the user's choices.

  9. SIAM (Suicide intervention assisted by messages): the development of a post-acute crisis text messaging outreach for suicide prevention.

    Science.gov (United States)

    Berrouiguet, Sofian; Alavi, Zarrin; Vaiva, Guillaume; Courtet, Philippe; Baca-García, Enrique; Vidailhet, Pierre; Gravey, Michel; Guillodo, Elise; Brandt, Sara; Walter, Michel

    2014-11-18

    Suicidal behaviour and deliberate self-harm are common among adults. Research indicates that maintaining contact, either via letter or postcard, with at-risk adults following discharge from care services can reduce reattempt risk. Feasibility trials demonstrated that intervention through text message was also effective in preventing suicide repetition amongst suicide attempters. The aim of the current study is to investigate the effect of text message intervention versus traditional treatment on reducing the risk of suicide attempt repetition among adults after self-harm. The study will be a 2-year multicentric randomized controlled trial conducted by the Brest University Hospital, France. Participants will be adults discharged after self-harm, from emergency services or after a short hospitalization, recruited over a 12-month period. The intervention comprises an SMS sent at 48 h, day 7 and day 15 after discharge, and monthly thereafter. The text message enquires about the patients' well-being and includes information regarding individual sources of help and evidence-based self-help strategies. Participants will be assessed at baseline and at months 6 and 13. As the primary endpoint, we will assess the number of patients who reattempt suicide in each group at 6 months. As secondary endpoints, we will assess the number of patients who reattempt suicide at 13 months, the number of suicide attempts in the intervention and control groups at 6 and 13 months, and the number of deaths by suicide in the intervention and control groups at months 6 and 13. In both groups, suicidal ideation will be assessed at baseline and at months 6 and 13. Medical costs and satisfaction will be assessed at month 13. This paper describes the design and deployment of the SIAM trial, an easily reproducible intervention that aims to reduce suicide risk in adults after self-harm. It utilizes several characteristics of interventions that have shown a significant reduction in the number of suicide reattempts.
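The contact schedule (48 h, day 7, day 15, then monthly) is straightforward to sketch; the 30-day month below is our approximation, not a protocol detail:

```python
from datetime import date, timedelta

def sms_schedule(discharge, months=6):
    """Send dates after discharge: 48 h, day 7, day 15, then monthly.
    Months are approximated as 30-day intervals in this sketch."""
    sends = [discharge + timedelta(days=d) for d in (2, 7, 15)]
    sends += [discharge + timedelta(days=30 * m) for m in range(1, months + 1)]
    return sends
```

For a discharge on 2024-01-01 the first message goes out on 2024-01-03 and nine messages are scheduled over the first six months.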

  10. A meta-analysis of the effects of texting on driving.

    Science.gov (United States)

    Caird, Jeff K; Johnston, Kate A; Willness, Chelsea R; Asbridge, Mark; Steel, Piers

    2014-10-01

    Text messaging while driving is considered dangerous and known to produce injuries and fatalities. However, the effects of text messaging on driving performance have not been synthesized or summarily estimated. All available experimental studies that measured the effects of text messaging on driving were identified through database searches using variants of "driving" and "texting", without restriction on year of publication, through March 2014. Of the 1476 abstracts reviewed, 82 met general inclusion criteria. Of these, 28 studies were found to sufficiently compare reading or typing text messages while driving with a control or baseline condition. Independent variables (text-messaging tasks) were coded as typing, reading, or a combination of both. Dependent variables included eye movements, stimulus detection, reaction time, collisions, lane positioning, speed and headway. Statistics were extracted from studies to compute effect sizes (rc). A total sample of 977 participants from 28 experimental studies yielded 234 effect size estimates of the relationships among independent and dependent variables. Typing and reading text messages while driving adversely affected eye movements, stimulus detection, reaction time, collisions, lane positioning, speed and headway. Typing text messages alone produced decrements similar to those of typing and reading, whereas reading alone had smaller decrements over fewer dependent variables. Typing and reading text messages affect drivers' capability to adequately direct attention to the roadway, respond to important traffic events, control a vehicle within a lane and maintain speed and headway. This meta-analysis provides convergent evidence that texting compromises the safety of the driver, passengers and other road users. Combined efforts, including legislation, enforcement, blocking technologies, parent modeling, social media, social norms and education, will be required to prevent continued deaths and injuries from texting and driving.
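The pooling step behind such a synthesis can be sketched: convert each correlation-type effect size to Fisher's z, average with n − 3 weights, and transform back. This is a generic fixed-effect sketch, not the authors' exact rc procedure:

```python
import math

def pooled_r(rs, ns):
    """Fixed-effect pooled correlation via Fisher's z, weighted by n - 3."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # Fisher z-transform
    ws = [n - 3 for n in ns]                               # inverse-variance weights
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_bar)                                # back-transform to r
```

Identical study effects pool to themselves, and unequal effects pool to a value between them, closer to the larger study's estimate.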

  11. 40 CFR 80.1285 - How does a refiner apply for a benzene baseline?

    Science.gov (United States)

    2010-07-01

    ... baseline? 80.1285 Section 80.1285 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (abt) Program § 80.1285 How does a refiner apply for a benzene baseline? (a) A benzene baseline... credits. (b) For U.S. Postal delivery, the benzene baseline application shall be sent to: Attn: MSAT2...

  12. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    Science.gov (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications, which are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline (paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data span that varies from 2 to 14 years and baseline length that varies from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs is chosen with significant differences in their height (> 5 m) or type of the monument. For comparison, we also processed an approximately zero baseline with a distance of pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially with station pairs with
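Quantifying an annual signal of the kind discussed above usually means fitting an offset, a trend and an annual sine/cosine pair to each baseline component; a minimal least-squares sketch on synthetic data (not the processing chain actually used in the study):

```python
import numpy as np

def annual_amplitude(t_yr, pos_mm):
    """Least-squares fit of offset + trend + annual sinusoid to one position
    component; returns the annual amplitude in the input units."""
    t = np.asarray(t_yr, dtype=float)
    A = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(pos_mm, dtype=float), rcond=None)
    # Amplitude of a*sin + b*cos is hypot(a, b)
    return float(np.hypot(coef[2], coef[3]))
```

A four-year series with a 0.4 mm annual term (arbitrary phase) is recovered to within the fit noise, regardless of the superimposed linear trend.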

  13. Marker Registration Technique for Handwritten Text Marker in Augmented Reality Applications

    Science.gov (United States)

    Thanaborvornwiwat, N.; Patanukhom, K.

    2018-04-01

    Marker registration is a fundamental process for estimating camera poses in marker-based Augmented Reality (AR) systems. We developed an AR system that creates corresponding virtual objects on handwritten text markers. This paper presents a new registration method that is robust to low-content text markers, variation of camera poses, and variation of handwriting styles. The proposed method uses Maximally Stable Extremal Regions (MSER) and polygon simplification for feature point extraction. Experiments show that extracting only five feature points per image provides the best registration results. An exhaustive search is used to find the best matching pattern of the feature points in two images. We also compared the performance of the proposed method to some existing registration methods and found that the proposed method provides better accuracy and time efficiency.
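With only five points per image, an exhaustive correspondence search is cheap (5! = 120 permutations). The sketch below scores each permutation by comparing normalized pairwise distances, which are invariant to rotation, translation and scale; this cost function is our generic stand-in, since the paper's actual one is not specified in the abstract:

```python
import itertools
import math

def best_matching(pts_a, pts_b):
    """Brute-force the correspondence between two small point sets by
    comparing normalized pairwise-distance vectors."""
    def norm_dists(pts):
        d = [math.dist(p, q) for p, q in itertools.combinations(pts, 2)]
        s = sum(d)
        return [x / s for x in d]   # scale-invariant shape signature

    ref = norm_dists(pts_a)
    best, best_cost = None, float("inf")
    for perm in itertools.permutations(range(len(pts_b))):
        cand = norm_dists([pts_b[i] for i in perm])
        cost = sum((u - v) ** 2 for u, v in zip(ref, cand))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best
```

Given a scaled and translated copy of the point set in shuffled order, the search recovers the shuffling permutation.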

  14. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    Science.gov (United States)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environment (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. 
We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification
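The core fit-on-background, predict-in-analyte idea can be sketched with an off-the-shelf smoothing spline. Here SciPy's UnivariateSpline stands in for the authors' protocol, which additionally tunes the smoothing parameter against aerosol and blank samples:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def correct_baseline(wavenumber, absorbance, background_mask, s=0.0):
    """Fit a spline to the background subregions only, then subtract the
    predicted baseline over the whole spectrum (s=0 interpolates the
    background points; larger s smooths more)."""
    spline = UnivariateSpline(wavenumber[background_mask],
                              absorbance[background_mask], s=s)
    return absorbance - spline(wavenumber)
```

On a synthetic spectrum with a smooth PTFE-like baseline plus one analyte peak, masking out the peak region and subtracting the spline prediction recovers the peak alone.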

  15. Scheme for simultaneous generation of three-color ten GW-level X-ray pulses from baseline XFEL undulator and multi-user distribution system for XFEL laboratory

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2010-01-01

    The baseline design of present XFEL projects only considers the production of a single photon beam at a fixed wavelength from each baseline undulator. By contrast, the scheme described in this paper considers the simultaneous production of high-intensity SASE FEL radiation at three different wavelengths. We present a feasibility study of our scheme, with exemplifications using parameters of the baseline SASE2 line of the European XFEL operating in simultaneous mode at 0.05 nm, 0.15 nm and 0.4 nm. Our technique for generating the two colors at 0.05 nm and 0.15 nm is based in essence on a "fresh bunch" technique. For the generation of radiation at 0.4 nm we propose to use an "afterburner" technique. Implementation of these techniques does not perturb the baseline mode of operation of the SASE2 undulator. The present paper also describes an efficient way to obtain a multi-user facility. It is shown that, although the XFEL photon beam from a given undulator is meant for a single user, movable multilayer X-ray mirrors can be used to serve many users simultaneously. The proposed photon beam distribution system would allow the FEL beam to be switched quickly between many experiments, making efficient use of the source. Distribution of photons is achieved on the basis of pulse trains, and it is possible to distribute the multicolor photon beam among many independent beam lines, thereby enabling many users to work in parallel at different wavelengths. (orig.)

  16. IEA Wind Task 26: Offshore Wind Farm Baseline Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Gavin [Offshore Renewable Energy Catapult, Blyth, Northumberland (United Kingdom); Smith, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sperstad, Iver Bakken [SINTEF Energy Research, Trondheim (Norway); Prinsen, Bob [Ecofys, Utrecht (Netherlands). TKI Wind Op Zee; Lacal-Arantegui, Roberto [European Commission Joint Research Centre (JRC), Brussels (Belgium)

    2016-06-02

    This document has been produced to provide the definition and rationale for the Baseline Offshore Wind Farm established within IEA Wind Task 26--Cost of Wind Energy. The Baseline has been developed to provide a common starting point for country comparisons and sensitivity analysis on key offshore wind cost and value drivers. The baseline project reflects an approximate average of the characteristics of projects installed between 2012 and 2014, with the project life assumed to be 20 years. The baseline wind farm is located 40 kilometres (km) from construction and operations and maintenance (O&M) ports and from export cable landfall. The wind farm consists of 100 4-megawatt (MW) wind turbines mounted on monopile foundations in an average water depth of 25 metres (m), connected by 33-kilovolt (kV) inter-array cables. The arrays are connected to a single offshore substation (33kV/220kV) mounted on a jacket foundation, with the substation connected via a single 220kV export cable to an onshore substation, 10km from landfall. The wind farm employs a port-based O&M strategy using crew-transfer vessels.

  17. Monitoring What Governments “Give for” and “Spend on” Vaccine Procurement: Vaccine Procurement Assistance and Vaccine Procurement Baseline

    OpenAIRE

    Nelson, E. A. S.; Bloom, David E.; Mahoney, Richard T.

    2014-01-01

    BACKGROUND: The Global Vaccine Action Plan will require, inter alia, the mobilization of financial resources from donors and national governments - both rich and poor. Vaccine Procurement Assistance (VPA) and Vaccine Procurement Baseline (VPB) are two metrics that could measure government performance and track resources in this arena. VPA is proposed as a new subcategory of Official Development Assistance (ODA) given for the procurement of vaccines and VPB is a previously suggested measure of...

  18. Baselining PMU Data to Find Patterns and Anomalies

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Follum, James D.; Freeman, Kimberly A.; Dagle, Jeffery E.

    2016-10-25

    This paper looks at the application of situational awareness methodologies to power grid data. These methodologies establish baselines that capture typical patterns and flag atypical behavior in the data. The objectives of the baselining analyses are to provide real-time analytics, the capability to examine historical trends and events, and reliable predictions of the near-future state of the grid. Multivariate algorithms were created to establish normal baseline behavior and then score each moment in time according to its variance from the baseline. Detailed multivariate analytical techniques are described in this paper that produced ways to identify typical patterns and atypical behavior. Here, atypical behavior means behavior that was not envisioned in advance. Visualizations were also produced to help explain the behavior that was identified mathematically, and examples are shown to help describe how to read and interpret the analyses and visualizations. Preliminary work has been performed on PMU data sets from BPA (Bonneville Power Administration) and EI (Eastern Interconnect). Actual results are not fully shown here because of confidentiality issues. Comparisons between atypical events found mathematically and actual events showed that many of the actual events are also atypical events; however, there are many atypical events that do not correspond to any actual event. Additional work is needed to classify the atypical events into actual events so that their importance can be better understood.
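A minimal version of "score each moment against a multivariate baseline" is the Mahalanobis distance of each new snapshot from historical data. This is a generic sketch; the paper's actual algorithms are more elaborate:

```python
import numpy as np

def atypicality_scores(history, current):
    """Mahalanobis distance of each row of `current` from the baseline
    (mean and covariance) estimated on `history`."""
    mu = history.mean(axis=0)
    cov = np.cov(history, rowvar=False)
    cov_inv = np.linalg.pinv(cov)        # pinv guards against singular covariance
    d = current - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))
```

A snapshot near the historical mean scores near zero, while one far outside the historical cloud scores high and would be flagged as atypical.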

  19. A text zero-watermarking method based on keyword dense interval

    Science.gov (United States)

    Yang, Fan; Zhu, Yuesheng; Jiang, Yifeng; Qing, Yin

    2017-07-01

    Digital watermarking has been recognized as a useful technology for the copyright protection and authentication of digital information. However, few previous methods focus on the key content of the digital carrier. Protecting key content is more targeted and applies across digital media, including text, image and video. In this paper, we take text as the research object and propose a text zero-watermarking method that uses the keyword dense interval (KDI) as the key content. First, we construct the zero-watermarking model by introducing the concept of the KDI and giving the method of KDI extraction. Second, we design a detection model that includes secondary generation of the zero-watermark and a similarity computing method for keyword distribution. Experiments show that the proposed method gives better performance than other available methods, especially under sentence transformation and synonym substitution attacks.
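One plausible reading of a keyword dense interval (the abstract does not give the exact construction, so this is our illustration) is the fixed-width token window containing the most keyword occurrences, found with a sliding window over keyword positions:

```python
def keyword_dense_interval(tokens, keywords, width=50):
    """Return the (start, end) token window of the given width containing
    the most keyword occurrences."""
    kw = set(keywords)
    hits = [i for i, t in enumerate(tokens) if t in kw]
    best, best_count = (0, min(width, len(tokens))), 0
    lo = 0
    for hi in range(len(hits)):
        while hits[hi] - hits[lo] >= width:   # shrink window from the left
            lo += 1
        if hi - lo + 1 > best_count:
            best_count = hi - lo + 1
            best = (hits[lo], hits[lo] + width)
    return best
```

With one isolated keyword early in the text and a cluster of four later, the window locks onto the cluster, which is the region a zero-watermark built on key content would protect.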

  20. Energy Consumption Analysis for Concrete Residences—A Baseline Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Kuo-Liang Lin

    2017-02-01

    Full Text Available Estimating building energy consumption is difficult because it deals with complex interactions among uncertain weather conditions, occupant behaviors, and building characteristics. To facilitate estimation, this study employs a benchmarking methodology to obtain energy baselines for sample buildings. Using a scientific simulation tool, this study develops energy consumption baselines for two typical concrete residences in Taiwan, subsequently allowing a simplified energy consumption prediction process at an early design stage of building development. Using weather data from three metropolitan cities as testbeds, annual energy consumption of the two types of modern residences is determined through a series of simulation sessions with different building settings. The impacts of key building characteristics, including building insulation, air tightness, orientation, location, and residence type, are carefully investigated. Sample utility bills are then collected to validate the simulated results, yielding three adjustment parameters for normalization, 'number of residents', 'total floor area', and 'air conditioning comfort level', to account for occupant behaviors in different living conditions. Study results not only provide valuable benchmarking data serving as references for performance evaluation of different energy-saving strategies, but also show how effective extended building insulation, enhanced air tightness, and prudent selection of residence location and orientation can be for successful implementation of building sustainability in tropical and subtropical regions.
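The three adjustment parameters can be folded into a simple normalization so that bills from different households become comparable; the functional form below is our illustration, not the study's calibrated model:

```python
def normalized_eui(annual_kwh, floor_area_m2, residents,
                   comfort_factor=1.0, ref_residents=4):
    """Annual energy use intensity (kWh/m^2), scaled to a reference household
    size and divided by an air-conditioning comfort factor."""
    eui = annual_kwh / floor_area_m2              # energy use intensity
    return eui * (ref_residents / residents) / comfort_factor
```

A 120 m² home using 12 000 kWh/yr with four residents normalizes to 100 kWh/m²; a household that runs its air conditioning twice as aggressively (comfort factor 2) normalizes to half that, so the building fabric itself is compared on equal footing.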

  1. Automated analysis of instructional text

    Energy Technology Data Exchange (ETDEWEB)

    Norton, L.M.

    1983-05-01

    The development of a capability for automated processing of natural language text is a long-range goal of artificial intelligence. This paper discusses an investigation into the issues involved in the comprehension of descriptive, as opposed to illustrative, textual material. The comprehension process is viewed as the conversion of knowledge from one representation into another. The proposed target representation consists of statements of the Prolog language, which can be interpreted both declaratively and procedurally, much like production rules. A computer program has been written to model in detail some ideas about this process. The program successfully analyzes several heavily edited paragraphs adapted from an elementary textbook on programming, automatically synthesizing as a result of the analysis a working Prolog program which, when executed, can parse and interpret LET commands in the BASIC language. The paper discusses the motivations and philosophy of the project, the many kinds of prerequisite knowledge which are necessary, and the structure of the text analysis program. A sentence-by-sentence account of the analysis of the sample text is presented, describing the syntactic and semantic processing involved. The paper closes with a discussion of lessons learned from the project, possible alternative approaches, and possible extensions for future work. The entire project is presented as illustrative of the nature and complexity of the text analysis process, rather than as providing definitive or optimal solutions to any aspects of the task. 12 references.

  2. 200-BP-5 operable unit Technical Baseline report

    International Nuclear Information System (INIS)

    Jacques, I.D.; Kent, S.K.

    1991-10-01

    This report supports development of a remedial investigation/feasibility study work plan for the 200-BP-5 operable unit. The report summarizes baseline information for waste sites and unplanned release sites located in the 200-BP-5 operable unit. The sites were investigated by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (Westinghouse Hanford). The investigation consisted of review and evaluation of current and historical Hanford Site reports, drawings, and photographs, and was supplemented with recent inspections of the Hanford Site and employee interviews. No field investigations or sampling were conducted.

  3. Benchmarking infrastructure for mutation text mining.

    Science.gov (United States)

    Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo

    2014-02-25

    Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data, and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction focus mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.
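The performance metrics that the SPARQL queries compute reduce to set comparisons between gold-standard and predicted annotations; in plain Python:

```python
def prf(gold, predicted):
    """Precision, recall and F1 over two sets of annotations."""
    tp = len(gold & predicted)                     # true positives
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(gold) if gold else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```

For example, a system predicting three mutation annotations of which two appear in a four-item gold set scores precision 2/3, recall 1/2 and F1 4/7.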

  5. Environmental baselines: preparing for shale gas in the UK

    Science.gov (United States)

    Bloomfield, John; Manamsa, Katya; Bell, Rachel; Darling, George; Dochartaigh, Brighid O.; Stuart, Marianne; Ward, Rob

    2014-05-01

    Groundwater is a vital source of freshwater in the UK. It provides almost 30% of public water supply on average, but locally, for example in south-east England, it constitutes nearly 90% of public supply. In addition to public supply, groundwater has a number of other uses including agriculture, industry, and food and drink production. It is also vital for maintaining river flows, especially during dry periods, and so is essential for maintaining ecosystem health. Recently, concerns have been expressed about the potential impacts of shale gas development on groundwater. The UK has abundant shales and clays which are currently the focus of considerable interest, and there is active research into their characterisation, resource evaluation and exploitation risks. The British Geological Survey (BGS) is undertaking research to provide information to address some of the environmental concerns related to the potential impacts of shale gas development on groundwater resources and quality. The aim of much of this initial work is to establish environmental baselines, such as a baseline survey of methane occurrence in groundwater (National methane baseline study) and the spatial relationships between potential sources and groundwater receptors (iHydrogeology project), prior to any shale gas exploration and development. The poster describes these two baseline studies and presents preliminary findings. BGS are currently undertaking a national survey of baseline methane concentrations in groundwater across the UK. This work will enable any potential future changes in methane in groundwater associated with shale gas development to be assessed. Measurements of methane in potable water from the Cretaceous, Jurassic and Triassic carbonate and sandstone aquifers are variable and reveal methane concentrations of up to 500 micrograms per litre, but the mean value is relatively low at documented in the range 2km. The geological modelling process will be presented and discussed

  6. Evidence of the shifting baseline syndrome in ethnobotanical research.

    Science.gov (United States)

    Hanazaki, Natalia; Herbst, Dannieli Firme; Marques, Mel Simionato; Vandebroek, Ina

    2013-11-14

    The shifting baseline syndrome is a concept from ecology that can be analyzed in the context of ethnobotanical research. Evidence of shifting baseline syndrome can be found in studies dealing with intracultural variation of knowledge, when knowledge from different generations is compared and combined with information about changes in the environment and/or natural resources. We reviewed 84 studies published between 1993 and 2012 that compared ethnobotanical knowledge across different age classes. After analyzing these studies for evidence of the shifting baseline syndrome (lower knowledge levels in younger generations and mention of declining abundance of local natural resources), we searched within these studies for the use of the expressions "cultural erosion", "loss of knowledge", or "acculturation". The studies focused on different groups of plants (e.g. medicinal plants, foods, plants used for general purposes, or the uses of specific important species). More than half of all 84 studies (57%) mentioned a concern about cultural erosion or knowledge loss; 54% of the studies showed evidence of the shifting baseline syndrome; and 37% of the studies did not provide any evidence of shifting baselines (intergenerational knowledge differences but no information available about the abundance of natural resources). The general perception of knowledge loss among young people when comparing ethnobotanical repertoires among different age groups should be analyzed with caution. Changes in the landscape or in the abundance of plant resources may be associated with changes in ethnobotanical repertoires held by people of different age groups. Also, the relationship between the availability of resources and current plant use practices relies on a complex set of factors. Fluctuations in these variables can cause changes in the reference (baseline) of different generations and consequently be responsible for differences in intergenerational knowledge.

  7. Advanced text and video analytics for proactive decision making

    Science.gov (United States)

    Bowman, Elizabeth K.; Turek, Matt; Tunison, Paul; Porter, Reed; Thomas, Steve; Gintautas, Vadas; Shargo, Peter; Lin, Jessica; Li, Qingzhe; Gao, Yifeng; Li, Xiaosheng; Mittu, Ranjeev; Rosé, Carolyn Penstein; Maki, Keith; Bogart, Chris; Choudhari, Samrihdi Shree

    2017-05-01

    Today's warfighters operate in a highly dynamic and uncertain world and face many competing demands. Asymmetric warfare and the new focus on small, agile forces have altered the framework by which time-critical information is digested and acted upon by decision makers. Finding and integrating decision-relevant information is increasingly difficult in data-dense environments. In this new information environment, agile data algorithms, machine learning software, and threat alert mechanisms must be developed to automatically create alerts and drive quick response. Yet these advanced technologies must be balanced with awareness of the underlying context to accurately interpret machine-processed indicators, warnings, and recommendations. One promising approach to this challenge brings together information retrieval strategies from text, video, and imagery. In this paper, we describe a technology demonstration that represents two years of tri-service research seeking to meld text and video for enhanced content awareness. The demonstration used multisource data to develop an intelligence solution to a problem on a common dataset. Three technology highlights from this effort include 1) incorporation of external sources of context into imagery normalcy modeling and anomaly detection capabilities, 2) automated discovery and monitoring of targeted users from social media text, regardless of language, and 3) the concurrent use of text and imagery to characterize behaviour, using kinematic and text motifs to detect novel and anomalous patterns. Our demonstration provided a technology baseline for exploiting heterogeneous data sources to deliver timely and accurate synopses of data that contribute to a dynamic and comprehensive worldview.

  8. The influence of baseline marijuana use on treatment of cocaine dependence: application of an informative-priors Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Charles eGreen

    2012-10-01

    Full Text Available Background: Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. Method: A secondary data analysis of two previously published, double-blind, randomized controlled trials provided samples for the historical dataset (Study 1: N = 64 complete observations) and current dataset (Study 2: N = 113 complete observations). Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa vs. placebo), baseline marijuana use (days in past 30), and their interaction. Results: Bayesian analysis indicated that there was a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa the probability that baseline marijuana use confers harm, in terms of reducing TES, was 0.981, whereas the probability that marijuana confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES, while participants in the placebo condition demonstrated a 4.9% increase in TES. Conclusion: The potential moderating effect of marijuana on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect.
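
The informative-priors idea above can be sketched with a simple normal approximation: the interaction coefficient estimated in the historical trial (Study 1) acts as a prior that is combined with the current-trial (Study 2) estimate by precision weighting. All numbers below are hypothetical, and the conjugate-normal update is a deliberate simplification of the full negative binomial model used in the study.

```python
import math

def posterior_interaction(prior_mean, prior_se, est, se):
    """Conjugate normal update: combine a historical-study prior on an
    interaction coefficient with the current-study estimate."""
    w0, w1 = 1.0 / prior_se**2, 1.0 / se**2        # precisions
    post_var = 1.0 / (w0 + w1)
    post_mean = post_var * (w0 * prior_mean + w1 * est)
    return post_mean, math.sqrt(post_var)

def prob_negative(mean, sd):
    """P(coefficient < 0) under the normal posterior (standard normal CDF at 0)."""
    return 0.5 * (1.0 + math.erf((0.0 - mean) / (sd * math.sqrt(2.0))))

# Hypothetical estimates of the marijuana-by-medication interaction on log(TES):
# Study 1 supplies the prior, Study 2 the current-data likelihood.
m, s = posterior_interaction(prior_mean=-0.03, prior_se=0.02, est=-0.05, se=0.03)
p = prob_negative(m, s)   # posterior probability of a harmful interaction
```

Because both (made-up) estimates point the same way, the posterior probability of a harmful interaction is high, mirroring the kind of subgroup-probability statement reported in the abstract.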

  9. Forest baseline and deforestation map of the Dominican Republic through the analysis of time series of MODIS data

    Directory of Open Access Journals (Sweden)

    Florencia Sangermano

    2015-09-01

    Full Text Available Deforestation is one of the major threats to habitats in the Dominican Republic. In this work we present a forest baseline for the year 2000 and a deforestation map for the year 2011. Maps were derived from Moderate Resolution Imaging Spectroradiometer (MODIS) products at 250 m resolution. The vegetation continuous fields product (MOD44B) for the year 2000 was used to produce the forest baseline, while the vegetation indices product (MOD13Q1) was used to detect change between 2000 and 2011. Major findings based on the data presented here are reported in the manuscript “Habitat suitability and protection status of four species of amphibians in the Dominican Republic” (Sangermano et al., Appl. Geogr., 63, 2015, 55–65) [7].
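
A minimal sketch of the mapping logic: threshold a MOD44B-style percent-tree-cover layer to define the forest baseline, then flag baseline-forest pixels whose vegetation index declined between dates. The toy grids, the 50% cover threshold, and the 0.3 index-drop rule are illustrative assumptions, not the study's actual decision rules.

```python
import numpy as np

# Toy 4x4 grids standing in for MODIS-derived layers (values are made up).
tree_cover_2000 = np.array([[80, 65, 10,  5],
                            [70, 55, 20, 15],
                            [30, 60, 75, 40],
                            [10, 25, 90, 85]], dtype=float)  # % tree cover (MOD44B-like)
ndvi_2000 = tree_cover_2000 / 100.0      # crude vegetation-index proxy
ndvi_2011 = ndvi_2000.copy()
ndvi_2011[0, 0] = 0.15                   # simulate one cleared pixel

FOREST_THRESHOLD = 50.0   # assumed % cover defining "forest" in the baseline
NDVI_DROP = 0.3           # assumed decline flagging deforestation

forest_baseline = tree_cover_2000 >= FOREST_THRESHOLD
deforested = forest_baseline & ((ndvi_2000 - ndvi_2011) >= NDVI_DROP)
```

Restricting the change test to baseline-forest pixels keeps non-forest index fluctuations from being counted as forest loss.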

  10. Text-Independent Speaker Identification Using the Histogram Transform Model

    DEFF Research Database (Denmark)

    Ma, Zhanyu; Yu, Hong; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose a novel probabilistic method for the task of text-independent speaker identification (SI). In order to capture dynamic information during SI, we design super-MFCC features by cascading three neighboring Mel-frequency cepstral coefficient (MFCC) frames together. These super-MFCC vectors are utilized for probabilistic model training such that the speaker’s characteristics can be sufficiently captured. The probability density function (PDF) of the aforementioned super-MFCC features is estimated by the recently proposed histogram transform (HT) method....
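
The frame-cascading step can be sketched as a sliding window that concatenates each group of three neighboring MFCC frames into one super vector. The function name and the 13-coefficient synthetic input are illustrative assumptions.

```python
import numpy as np

def super_mfcc(frames: np.ndarray, context: int = 3) -> np.ndarray:
    """Cascade `context` neighboring MFCC frames into one super vector
    per window position (sliding window, no padding)."""
    n_frames, n_coeffs = frames.shape
    windows = [frames[i:i + context].reshape(-1)
               for i in range(n_frames - context + 1)]
    return np.stack(windows)

mfccs = np.random.randn(100, 13)       # 100 frames of 13 MFCCs (synthetic)
supers = super_mfcc(mfccs, context=3)  # 98 vectors of dimension 3 * 13 = 39
```

Each super vector spans three consecutive frames, so adjacent vectors overlap by two frames, which is how short-term dynamics enter the feature.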

  11. 41 CFR 109-1.5202 - Establishment of a personal property holdings baseline.

    Science.gov (United States)

    2010-07-01

    ... personal property holdings baseline. 109-1.5202 Section 109-1.5202 Public Contracts and Property Management...-1.5202 Establishment of a personal property holdings baseline. (a) If the contractor is a new... baseline or may perform a complete physical inventory of all personal property. This physical inventory is...

  12. Automated Determination of the Type of Genre and Stylistic Coloring of Russian Texts

    Directory of Open Access Journals (Sweden)

    Barakhnin Vladimir

    2017-01-01

    Full Text Available In this paper we propose an algorithm for the automated identification of the genre type and semantic characteristics of poetic texts in Russian. We formulate approaches to the construction of a joint (“two-dimensional”) classifier of genre types and stylistic colouring of poetic texts, based on the interdependence of the genre type and the stylistic colouring of the text. On the basis of these approaches, the principles of forming training samples for the algorithms that determine style and genre type were analyzed. Computational experiments were carried out on a corpus of A. S. Pushkin’s Lyceum lyrics, showing good results in determining the stylistic colouring of poetic texts and satisfactory results in determining the genres. The proposed algorithms can be used to automate the complex analysis of Russian poetic texts, significantly facilitating the work of the expert in determining their styles and genres by providing appropriate recommendations.

  13. Baseline PSA in a Spanish male population aged 40-49 years anticipates detection of prostate cancer.

    Science.gov (United States)

    Angulo, J C; Viñas, M A; Gimbernat, H; Fata, F Ramón de; Granados, R; Luján, M

    2015-12-01

    We researched the usefulness of optimizing prostate cancer (PC) screening in our community using baseline PSA readings in men between 40 and 49 years of age. A retrospective study was performed that analyzed baseline PSA in the fifth decade of life and its ability to predict the development of PC in a population of Madrid (Spain). An ROC curve was created and a cutoff was proposed. We compared the evolution of PSA from baseline in patients with consecutive readings using the Friedman test. We established baseline PSA ranges with different risks of developing cancer and assessed the diagnostic utility of the annual PSA velocity (PSAV) in this population. Some 4,304 men aged 40-49 years underwent opportunistic screening over the course of 17 years, with at least one serum PSA reading (6,001 readings) and a mean follow-up of 57.1±36.8 months. Of these, 768 underwent biopsy of some organ, and 104 underwent prostate biopsy. Fourteen patients (.33%) were diagnosed with prostate cancer. The median baseline PSA was .74 (.01-58.5) ng/mL for patients without PC and 4.21 (.76-47.4) ng/mL for those with PC. The median time from the reading to diagnosis was 26.8 (1.5-143.8) months. The optimal cutoff for detecting PC was 1.9 ng/mL (sensitivity, 92.86%; specificity, 92.54%; PPV, 3.9%; NPV, 99.97%), and the area under the curve was 92.8%. In terms of repeated readings, the evolution of PSA showed no statistically significant differences between the patients without cancer (p=.56) and those with cancer (p=.64). However, a PSAV >.3 ng/mL/year showed high specificity for detecting cancer in this population. A baseline PSA level ≥1.9 ng/mL in Spanish men aged 40-49 years predicted the development of PC. This value could therefore be of use for opportunistic screening at an early age. An appropriate follow-up adapted to the risk of this population needs to be defined, but an annual PSAV ≥.3 ng/mL/year appears useful for reaching an early diagnosis. Copyright © 2015 AEU
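
The cutoff evaluation reported above can be illustrated with a short sketch that computes sensitivity and specificity for a candidate baseline-PSA cutoff. The data below are synthetic toy values, not the study cohort; only the 1.9 ng/mL threshold is taken from the abstract.

```python
import numpy as np

def cutoff_metrics(psa, cancer, cutoff):
    """Sensitivity and specificity of a baseline-PSA cutoff for later cancer."""
    psa, cancer = np.asarray(psa, float), np.asarray(cancer, bool)
    pred = psa >= cutoff                      # screen-positive if PSA at/above cutoff
    tp = np.sum(pred & cancer);   fn = np.sum(~pred & cancer)
    tn = np.sum(~pred & ~cancer); fp = np.sum(pred & ~cancer)
    return tp / (tp + fn), tn / (tn + fp)

# Synthetic toy data (baseline PSA in ng/mL, later-cancer indicator).
psa    = [0.5, 0.8, 1.2, 2.5, 4.2, 0.7, 3.1, 1.8]
cancer = [0,   0,   1,   1,   1,   0,   1,   0]
sens, spec = cutoff_metrics(psa, cancer, cutoff=1.9)
```

Sweeping `cutoff` over the observed PSA values and plotting sensitivity against (1 − specificity) yields the ROC curve from which such an optimal threshold is read off.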

  14. A Baseline Load Schedule for the Manual Calibration of a Force Balance

    Science.gov (United States)

    Ulbrich, N.; Gisler, R.

    2013-01-01

    A baseline load schedule for the manual calibration of a force balance is defined that takes current capabilities at the NASA Ames Balance Calibration Laboratory into account. The chosen load schedule consists of 18 load series with a total of 194 data points. It was designed to satisfy six requirements: (i) positive and negative loadings should be applied for each load component; (ii) at least three loadings should be applied between 0 % and 100 % load capacity; (iii) normal and side force loadings should be applied at the forward gage location, aft gage location, and the balance moment center; (iv) the balance should be used in "up" and "down" orientation to get positive and negative axial force loadings; (v) the constant normal and side force approaches should be used to get the rolling moment loadings; (vi) rolling moment loadings should be obtained for 0, 90, 180, and 270 degrees balance orientation. In addition, three different approaches are discussed in the paper that may be used to independently estimate the natural zeros, i.e., the gage outputs of the absolute load datum of the balance. These three approaches provide gage output differences that can be used to estimate the weight of both the metric and non-metric part of the balance. Data from the calibration of a six-component force balance will be used in the final manuscript of the paper to illustrate characteristics of the proposed baseline load schedule.

  15. Automatically classifying sentences in full-text biomedical articles into Introduction, Methods, Results and Discussion.

    Science.gov (United States)

    Agarwal, Shashank; Yu, Hong

    2009-12-01

    Biomedical texts can be typically represented by four rhetorical categories: Introduction, Methods, Results and Discussion (IMRAD). Classifying sentences into these categories can benefit many other text-mining tasks. Although many studies have applied different approaches for automatically classifying sentences in MEDLINE abstracts into the IMRAD categories, few have explored the classification of sentences that appear in full-text biomedical articles. We first evaluated whether sentences in full-text biomedical articles could be reliably annotated into the IMRAD format and then explored different approaches for automatically classifying these sentences into the IMRAD categories. Our results show an overall annotation agreement of 82.14% with a Kappa score of 0.756. The best classification system is a multinomial naïve Bayes classifier trained on manually annotated data that achieved 91.95% accuracy and an average F-score of 91.55%, which is significantly higher than baseline systems. A web version of this system is available online at http://wood.ims.uwm.edu/full_text_classifier/.
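
A toy multinomial naive Bayes classifier with Laplace smoothing illustrates the kind of model reported above; the training sentences and labels are invented examples, not the annotated corpus, and only two of the four IMRAD categories are shown for brevity.

```python
from collections import Counter
import math

class MultinomialNB:
    """Minimal multinomial naive Bayes with Laplace (add-one) smoothing."""
    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.priors = {c: math.log(labels.count(c) / len(labels))
                       for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        for doc, lab in zip(docs, labels):
            self.counts[lab].update(doc.lower().split())
        self.vocab = {w for c in self.classes for w in self.counts[c]}
        return self

    def predict(self, doc):
        def score(c):
            total = sum(self.counts[c].values()) + len(self.vocab)
            return self.priors[c] + sum(
                math.log((self.counts[c][w] + 1) / total)
                for w in doc.lower().split())
        return max(self.classes, key=score)

# Invented two-class training set (Methods vs. Results sentences).
train = ["we collected blood samples", "patients were randomized to arms",
         "the accuracy was significantly higher", "mean f-score reached new levels"]
labels = ["Methods", "Methods", "Results", "Results"]
clf = MultinomialNB().fit(train, labels)
pred = clf.predict("samples were collected from patients")
```

The class score is the log prior plus summed log likelihoods of the sentence's words; smoothing keeps unseen words from zeroing out a class.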

  16. 75 FR 30014 - Consumers Energy Company; Notice of Baseline Filing

    Science.gov (United States)

    2010-05-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-25-000] Consumers Energy Company; Notice of Baseline Filing May 21, 2010. Take notice that on May 17, 2010, Consumers Energy Company (Consumers) submitted a baseline filing of its Statement of Operating Conditions for the...

  17. Preoperational baseline and site characterization report for the Environmental Restoration Disposal Facility

    International Nuclear Information System (INIS)

    Weekes, D.C.; Ford, B.H.; Jaeger, G.K.

    1996-09-01

    This document is Volume 2 of the two-volume site characterization report for the Environmental Restoration Disposal Facility. Volume 1 contains data interpretation and information supporting the conclusions in the main text. This document presents the original data in support of Volume 1 of the report. The following types of data are presented: well construction reports; borehole logs; borehole geophysical data; well development and pump installation records; survey reports; and preoperational baseline chemical data and aquifer test data. This does not represent the entire body of data available; other types of information are archived at BHI Document Control. Five groundwater monitoring wells were drilled at the Environmental Restoration Disposal Facility site to directly investigate site-specific hydrogeologic conditions. Well and borehole activity summaries are presented in Volume 1. Field borehole logs and geophysical data from the drilling are presented in this document. Well development and pump installation sheets are presented for the groundwater monitoring wells. Other data presented in this document include borehole geophysical logs from existing wells; chemical data from the sampling of soil, vegetation, and mammals from the ERDF to support the preoperational baseline; ERDF surface radiation surveys; and aquifer testing data for well 699-32-72B

  18. Baseline values of immunologic parameters in the lizard Salvator merianae (Teiidae, Squamata)

    Directory of Open Access Journals (Sweden)

    Ana Paula Mestre

    2017-05-01

    Full Text Available The genus Salvator is widely distributed throughout South America. In Argentina, the most abundant and widely distributed species is Salvator merianae. Particularly in Santa Fe province, the area occupied by populations of these lizards overlaps with areas where agriculture has expanded. With the aim of establishing baseline values for four widely used immunologic biomarkers, 36 tegu lizards were evaluated, taking into account different age classes and both sexes. Total leukocyte counts did not differ between age classes. Within the leukocyte counts, eosinophil levels were higher in neonates compared with juveniles and adults; nevertheless, heterophils were the most prevalent leukocyte in the peripheral blood in all age classes. Lymphocyte, monocyte, heterophil, azurophil and basophil levels did not differ with age. Natural antibody titres were higher in adults compared with neonate and juvenile lizards. Lastly, complement system activity was lower in neonates compared with juveniles and adults. Statistical analysis within each age group showed that gender was not a factor in the outcomes. Based on the results, we conclude that S. merianae demonstrated age-related (but not gender-related) differences in the immune parameters analyzed. Having established baseline values for these four widely used immunologic biomarkers, ongoing studies will seek to optimize the use of the S. merianae model in future research.

  19. Parkinson’s Disease Severity at 3 Years Can Be Predicted from Non-Motor Symptoms at Baseline

    Directory of Open Access Journals (Sweden)

    Alba Ayala

    2017-10-01

    Full Text Available Objective: The aim of this study is to present a predictive model of Parkinson’s disease (PD) global severity, measured with the Clinical Impression of Severity Index for Parkinson’s Disease (CISI-PD). Methods: This is an observational, longitudinal study with annual follow-up assessments over 3 years (four time points). Multilevel analysis and multiple imputation techniques were performed to generate a predictive model that estimates changes in the CISI-PD at 1, 2, and 3 years. Results: The clinical state of patients (CISI-PD) significantly worsened over the 3-year follow-up; however, this change was of small magnitude (effect size: 0.44). The following baseline variables were significant predictors of the change in global severity: baseline global severity of disease, levodopa equivalent dose, depression and anxiety symptoms, autonomic dysfunction, and cognitive state. The goodness-of-fit of the model was adequate, and the sensitivity analysis showed that the data imputation method applied was suitable. Conclusion: Disease progression depends more on the individual’s baseline characteristics than on the 3-year time period. These results may contribute to a better understanding of the evolution of PD, including the non-motor manifestations of the disease.

  20. Kuiseb environment: the development of a monitoring baseline. A report of the Committee for Terrestrial Ecosystems. National Programme for Environmental Sciences

    CSIR Research Space (South Africa)

    Huntley, BJ

    1985-01-01

    Full Text Available to occur in the area as a consequence of water extraction from the Kuiseb River, and provides details of features of the geomorphology, hydrology and ecology that might be used as baselines against which to measure changes within the system....

  1. FigSum: automatically generating structured text summaries for figures in biomedical literature.

    Science.gov (United States)

    Agarwal, Shashank; Yu, Hong

    2009-11-14

    Figures are frequently used in biomedical articles to support research findings; however, they are often difficult to comprehend based on their legends alone and information from the full-text articles is required to fully understand them. Previously, we found that the information associated with a single figure is distributed throughout the full-text article the figure appears in. Here, we develop and evaluate a figure summarization system - FigSum, which aggregates this scattered information to improve figure comprehension. For each figure in an article, FigSum generates a structured text summary comprising one sentence from each of the four rhetorical categories - Introduction, Methods, Results and Discussion (IMRaD). The IMRaD category of sentences is predicted by an automated machine learning classifier. Our evaluation shows that FigSum captures 53% of the sentences in the gold standard summaries annotated by biomedical scientists and achieves an average ROUGE-1 score of 0.70, which is higher than a baseline system.
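
The ROUGE-1 score reported above is essentially unigram recall against a reference summary, with per-word clipping. A minimal sketch, using made-up sentences rather than the gold-standard summaries:

```python
from collections import Counter

def rouge1_recall(candidate: str, reference: str) -> float:
    """ROUGE-1 recall: fraction of reference unigrams matched by the
    candidate summary, clipping repeated words to their reference counts."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(cand[w], n) for w, n in ref.items())
    return overlap / sum(ref.values())

score = rouge1_recall("the figure shows tumor growth over time",
                      "figure two shows tumor growth")
```

Here four of the five reference words are matched, giving a recall of 0.8; real evaluations average this over many figure summaries.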

  2. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense. We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types target-wrong and impostor-wrong, while it maintains comparable TD-SV performance when impostors speak a correct utterance with respect to the conventional system....

  3. Semantic Indexing of Multimedia Content Using Visual, Audio, and Text Cues

    Directory of Open Access Journals (Sweden)

    W. H. Adams

    2003-02-01

    Full Text Available We present a learning-based approach to the semantic indexing of multimedia content using cues derived from audio, visual, and text features. We approach the problem by developing a set of statistical models for a predefined lexicon. Novel concepts are then mapped in terms of the concepts in the lexicon. To achieve robust detection of concepts, we exploit features from multiple modalities, namely, audio, video, and text. Concept representations are modeled using Gaussian mixture models (GMMs), hidden Markov models (HMMs), and support vector machines (SVMs). Models such as Bayesian networks and SVMs are used in a late-fusion approach to model concepts that are not explicitly modeled in terms of features. Our experiments indicate promise in the proposed classification and fusion methodologies: our proposed fusion scheme achieves more than 10% relative improvement over the best unimodal concept detector.
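
Late fusion of the kind described can be sketched as a weighted average of per-modality concept scores. The modality names and scores below are hypothetical, and simple averaging stands in for the Bayesian-network/SVM fusion models used in the paper.

```python
import numpy as np

def late_fusion(scores_by_modality, weights=None):
    """Fuse per-modality concept-detector scores by weighted averaging."""
    scores = np.asarray(list(scores_by_modality.values()), dtype=float)
    if weights is None:
        weights = np.full(len(scores), 1.0 / len(scores))  # equal weights
    return float(np.average(scores, weights=weights))

# Hypothetical detector outputs for one concept on one video shot.
fused = late_fusion({"audio": 0.4, "visual": 0.9, "text": 0.8})
```

In practice the weights (or a trained fusion classifier) are learned on held-out data so that more reliable modalities dominate the fused score.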

  4. OntoGene web services for biomedical text mining.

    Science.gov (United States)

    Rinaldi, Fabio; Clematide, Simon; Marques, Hernani; Ellendorff, Tilia; Romacker, Martin; Rodriguez-Esteban, Raul

    2014-01-01

    Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top-ranked results in several of them.

  5. Resonant island divertor experiments on text

    International Nuclear Information System (INIS)

    deGrassie, J.S.; Evans, T.E.; Jackson, G.L.

    1988-09-01

    The first experimental tests of the resonant island divertor (RID) concept have been carried out on the Texas Experimental Tokamak (TEXT). Modular perturbation coils produce static resonant magnetic fields at the tokamak boundary. The resulting magnetic islands are used to guide heat and particle fluxes around a small scoop limiter head. An enhancement in the limiter collection efficiency over non-island operation, as evidenced by enhanced neutral density within the limiter head, of up to a factor of 4 is obtained. This enhancement is larger than one would expect given the measured magnitude of the cross-field particle transport in TEXT. It is proposed that electrostatic perturbations occur which enhance the ion convection rate around the islands. Preliminary experiments utilizing electron cyclotron heating (ECH) in conjunction with RID operation have also been performed. 6 refs., 3 figs

  6. The Harbin Cohort Study on Diet, Nutrition and Chronic Non-communicable Diseases: study design and baseline characteristics.

    Directory of Open Access Journals (Sweden)

    Lixin Na

    Full Text Available Diet and nutrition have been reported to be associated with many common chronic diseases, and blood-based assessment is vital for investigating these associations and their mechanisms; however, blood-based prospective studies are limited. The Harbin Cohort Study on Diet, Nutrition and Chronic Non-communicable Diseases was set up in 2010. From 2010 to 2012, 9,734 participants completed the baseline survey, covering demographic characteristics, dietary intake, lifestyle and physical condition, and anthropometrics. A re-survey of 490 randomly selected participants was done using the same methods employed in the baseline survey. For all participants, the mean age was 50 years and 36% of them were men. Approximately 99.4% of cohort members donated blood samples. The mean total energy intake was 2671.7 kcal/day in men and 2245.9 kcal/day in women; the mean body mass index was 25.7 kg/m2 in men and 24.6 kg/m2 in women, with 18.4% being obese (≥28 kg/m2), 12.7% being diabetic, and 29.5% being hypertensive. Good agreement was obtained for the physical measurements between the baseline survey and the re-survey. The resources from the cohort and its fasting and postprandial blood samples, collected both at baseline and in each follow-up, will be valuable in investigating the relationship between diet, nutrition and chronic diseases and in discovering novel blood biomarkers and the metabolism of these biomarkers related to chronic diseases.

  7. Waste management project technical baseline description

    International Nuclear Information System (INIS)

    Sederburg, J.P.

    1997-01-01

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project

  8. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  9. Baseline hematology and serum biochemistry results for Indian leopards (Panthera pardus fusca)

    Directory of Open Access Journals (Sweden)

    Arun Attur Shanmugam

    2017-07-01

    Full Text Available Aim: The aim of the study was to establish baseline hematology and serum biochemistry values for Indian leopards (Panthera pardus fusca) and to assess possible variations in these parameters based on age and gender. Materials and Methods: Hemato-biochemical test reports from a total of 83 healthy leopards, generated as part of routine health evaluation at Bannerghatta Biological Park and Manikdoh Leopard Rescue Center, were used to establish baseline hematology and serum biochemistry parameters for the subspecies. The hematological parameters considered for the analysis included hemoglobin (Hb), packed cell volume, total erythrocyte count (TEC), total leukocyte count (TLC), mean corpuscular volume (MCV), mean corpuscular Hb (MCH), and MCH concentration. The serum biochemistry parameters considered included total protein (TP), albumin, globulin, aspartate aminotransferase, alanine aminotransferase (ALT), blood urea nitrogen, creatinine, triglycerides, calcium, and phosphorus. Results: Although a few differences were observed in hematologic and biochemistry values between male and female Indian leopards, the differences were not statistically significant. Effects of age, however, were evident for many hematologic and biochemical parameters. Sub-adults had significantly greater values for Hb, TEC, and TLC compared with the adult and geriatric groups, whereas they had significantly lower MCV and MCH. Among the serum biochemistry parameters, the sub-adult age group had significantly lower values for TP and ALT than adult and geriatric leopards. Conclusion: The study provides a comprehensive analysis of hematologic and biochemical parameters for Indian leopards. The baselines established here will permit better captive management of the subspecies, serve as a guide to assess the health and physiological status of free-ranging leopards, and may contribute valuable information for making

  10. Historical baselines of coral cover on tropical reefs as estimated by expert opinion.

    Science.gov (United States)

    Eddy, Tyler D; Cheung, William W L; Bruno, John F

    2018-01-01

    Coral reefs are important habitats that represent global marine biodiversity hotspots and provide important benefits to people in many tropical regions. However, coral reefs are becoming increasingly threatened by climate change, overfishing, habitat destruction, and pollution. Historical baselines of coral cover are important to understand how much coral cover has been lost, e.g., to avoid the 'shifting baseline syndrome'. There are few quantitative observations of coral reef cover prior to the industrial revolution, and therefore baselines of coral reef cover are difficult to estimate. Here, we use expert and ocean-user opinion surveys to estimate baselines of global coral reef cover. The overall mean estimated baseline coral cover was 59% (±19% standard deviation), compared to an average of 58% (±18% standard deviation) estimated by professional scientists. We did not find evidence of the shifting baseline syndrome, whereby respondents who first observed coral reefs more recently report lower estimates of baseline coral cover. These estimates of historical coral reef baseline cover are important for scientists, policy makers, and managers to understand the extent to which coral reefs have become depleted and to set appropriate recovery targets.

  11. Print versus digital texts: understanding the experimental research and challenging the dichotomies

    Directory of Open Access Journals (Sweden)

    Bella Ross

    2017-11-01

    Full Text Available This article presents the results of a systematic critical review of interdisciplinary literature concerned with digital text (or e-text) uses in education and proposes recommendations for how e-texts can be implemented for impactful learning. A variety of e-texts can be found in the repertoire of educational resources accessible to students, and in the constantly changing terrain of educational technologies, they are rapidly evolving, presenting new opportunities and affordances for student learning. We highlight some of the ways in which academic studies have examined e-texts as part of teaching and learning practices, placing a particular emphasis on aspects of learning such as recall, comprehension, retention of information and feedback. We also review diverse practices associated with uses of e-text tools such as note-taking, annotation, bookmarking, hypertexts and highlighting. We argue that evidence-based studies into e-texts are overwhelmingly structured around reinforcing the existing dichotomy pitting print-based (‘traditional’) texts against e-texts. In this article, we query this approach and instead propose to focus on factors such as students’ level of awareness of their options in accessing learning materials and whether they are instructed and trained in how to take full advantage of the capabilities of e-texts, both of which have been found to affect learning performance.

  12. Mechanical Thrombectomy in Elderly Stroke Patients with Mild-to-Moderate Baseline Disability.

    Science.gov (United States)

    Slawski, Diana E; Salahuddin, Hisham; Shawver, Julie; Kenmuir, Cynthia L; Tietjen, Gretchen E; Korsnack, Andrea; Zaidi, Syed F; Jumaa, Mouhammad A

    2018-04-01

    The number of elderly patients suffering from ischemic stroke is rising. Randomized trials of mechanical thrombectomy (MT) generally exclude patients over the age of 80 years with baseline disability. The aim of this study was to understand the efficacy and safety of MT in elderly patients, many of whom may have baseline impairment. Between January 2015 and April 2017, 96 patients ≥80 years old who underwent MT for stroke were selected for a chart review. The data included baseline characteristics, time to treatment, the rate of revascularization, procedural complications, mortality, and 90-day good outcome defined as a modified Rankin Scale (mRS) score of 0-2 or return to baseline. Of the 96 patients, 50 had mild baseline disability (mRS score 0-1) and 46 had moderate disability (mRS score 2-4). Recanalization was achieved in 84% of the patients, and the rate of symptomatic hemorrhage was 6%. At 90 days, 34% of the patients had a good outcome. There were no significant differences in good outcome between those with mild and those with moderate baseline disability (43 vs. 24%, p = 0.08), between those aged ≤85 and those aged > 85 years (40.8 vs. 26.1%, p = 0.19), and between those treated within and those treated beyond 8 h (39 vs. 20%, p = 0.1). The mortality rate was 38.5% at 90 days. The Alberta Stroke Program Early CT Score (ASPECTS) and the National Institutes of Health Stroke Scale (NIHSS) predicted good outcome regardless of baseline disability ( p baseline disability, and delayed treatment are associated with sub-optimal outcomes after MT. However, redefining good outcome to include return to baseline functioning demonstrates that one-third of this patient population benefits from MT, suggesting the real-life utility of this treatment.

  13. 77 FR 31841 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-30

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-23-001] Hope Gas, Inc.; Notice of Baseline Filing Take notice that on May 16, 2012, Hope Gas, Inc. (Hope Gas) submitted a revised baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the...

  14. 77 FR 26535 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-04

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-23-000] Hope Gas, Inc.; Notice of Baseline Filing Take notice that on April 26, 2012, Hope Gas, Inc. (Hope Gas) submitted a baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the...

  15. 75 FR 33799 - EasTrans, LLC; Notice of Baseline Filing

    Science.gov (United States)

    2010-06-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-30-000] EasTrans, LLC; Notice of Baseline Filing June 8, 2010. Take notice that on June 4, 2010, EasTrans, LLC submitted a baseline filing of its Statement of Operating Conditions for services provided under section 311 of the...

  16. A Systematic Review and Meta-Analysis of Baseline Ohip-Edent Scores.

    Science.gov (United States)

    Duale, J M J; Patel, Y A; Wu, J; Hyde, T P

    2018-03-01

    OHIP-EDENT is widely used in the literature to assess Oral Health-Related Quality of Life (OHRQoL) for edentulous patients. However, the normal variance and mean of baseline OHIP-EDENT scores have not been reported. Knowledge of the normal variation and mean of baseline OHIP-EDENT scores would facilitate critical appraisal of studies. An established figure for baseline OHIP-EDENT, obtained from a meta-analysis, would simplify comparisons of studies and quantify variations in the initial OHRQoL of trial participants. The aim of this study is to quantify a normal baseline value for pre-operative OHIP-EDENT scores by a systematic review and meta-analysis of the available literature. A systematic literature review was carried out. 83 papers were identified that included OHIP-EDENT values. After screening and eligibility assessment, 7 papers were selected and included in the meta-analysis. A meta-analysis of the 7 papers using a random-effects model yielded a mean baseline OHIP-EDENT score of 28.63, with a 95% confidence interval from 21.93 to 35.34. A pre-operative baseline OHIP-EDENT score has thus been established by meta-analysis of published papers. This will facilitate comparison of the initial OHRQoL of one study population with that found elsewhere in the published literature. Copyright© 2018 Dennis Barber Ltd.
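
    Random-effects pooling of study means, as used in the meta-analysis above, can be sketched with the DerSimonian-Laird estimator. The study means and standard errors below are hypothetical placeholders, not the paper's actual data.

```python
import math

def random_effects_pool(means, ses):
    """DerSimonian-Laird random-effects pooling of study means.

    means: per-study mean scores; ses: their standard errors.
    Returns (pooled mean, 95% CI lower, 95% CI upper).
    """
    w = [1.0 / se ** 2 for se in ses]                  # fixed-effect weights
    fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
    df = len(means) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_star = [1.0 / (se ** 2 + tau2) for se in ses]    # random-effects weights
    pooled = sum(wi * m for wi, m in zip(w_star, means)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical baseline OHIP-EDENT means and standard errors from 7 studies.
means = [24.1, 30.5, 27.8, 33.2, 22.4, 29.9, 31.6]
ses = [2.0, 2.5, 1.8, 3.1, 2.2, 2.7, 2.4]
pooled, lo, hi = random_effects_pool(means, ses)
```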

  17. Ankylosing Spondylitis Patients Commencing Biologic Therapy Have High Baseline Levels of Comorbidity: A Report from the Australian Rheumatology Association Database

    Directory of Open Access Journals (Sweden)

    John Oldroyd

    2009-01-01

    Full Text Available Aims. To compare the baseline characteristics of a population-based cohort of patients with ankylosing spondylitis (AS) commencing biological therapy to the reported characteristics of bDMARD randomised controlled trial (RCT) participants. Methods. Descriptive analysis of AS participants in the Australian Rheumatology Association Database (ARAD) who were commencing bDMARD therapy. Results. Up to December 2008, 389 patients with AS were enrolled in ARAD. 354 (91.0%) had taken bDMARDs at some time, and 198 (55.9%) completed their entry questionnaire prior to or within 6 months of commencing bDMARDs. 131 (66.1%) had at least one comorbid condition, and 24 (6.8%) had a previous malignancy (15 nonmelanoma skin, 4 melanoma, 2 prostate, 1 breast, cervix, and bowel). Compared with RCT participants, ARAD participants were older, had longer disease duration and higher baseline disease activity. Conclusions. AS patients commencing bDMARDs in routine care are significantly different to RCT participants and have significant baseline comorbidities.

  18. How to write a research proposal?

    Directory of Open Access Journals (Sweden)

    K Sudheesh

    2016-01-01

    Full Text Available Writing the proposal of a research work in the present era is a challenging task due to the constantly evolving trends in the qualitative research design and the need to incorporate medical advances into the methodology. The proposal is a detailed plan or 'blueprint' for the intended study, and once it is completed, the research project should flow smoothly. Even today, many of the proposals at post-graduate evaluation committees and application proposals for funding are substandard. A search was conducted with keywords such as research proposal, writing proposal and qualitative using search engines, namely, PubMed and Google Scholar, and an attempt has been made to provide broad guidelines for writing a scientifically appropriate research proposal.

  19. Efficacy of a text messaging (SMS) based intervention for adults with hypertension: protocol for the StAR (SMS Text-message Adherence suppoRt trial) randomised controlled trial.

    Science.gov (United States)

    Bobrow, Kirsty; Brennan, Thomas; Springer, David; Levitt, Naomi S; Rayner, Brian; Namane, Mosedi; Yu, Ly-Mee; Tarassenko, Lionel; Farmer, Andrew

    2014-01-11

    Interventions to support people with hypertension in attending clinics and taking their medication have potential to improve outcomes, but delivery on a wide scale and at low cost is challenging. Some trials evaluating clinical interventions using short message service (SMS) text-messaging systems have shown important outcomes, although evidence is limited. We have developed a novel SMS system integrated with clinical care for use by people with hypertension in a low-resource setting. We aim to test the efficacy of the system in improving blood pressure control and treatment adherence compared to usual care. The SMS Text-message Adherence suppoRt trial (StAR) is a pragmatic individually randomised three-arm parallel group trial in adults treated for hypertension at a single primary care centre in Cape Town, South Africa. The intervention is a structured programme of clinic appointment, medication pick-up reminders, medication adherence support and hypertension-related education delivered remotely using an automated system with either informational or interactive SMS text-messages. Usual care is supplemented by infrequent non-hypertension related SMS text-messages. Participants are 1:1:1 individually randomised, to usual care or to one of the two active interventions using minimisation to dynamically adjust for gender, age, baseline systolic blood pressure, years with hypertension, and previous clinic attendance. The primary outcome is the change in mean systolic blood pressure at 12-month follow-up from baseline measured with research staff blinded to trial allocation. Secondary outcomes include the proportion of patients with 80% or more of days medication available, proportion of participants achieving a systolic blood pressure less than 140 mmHg and a diastolic blood pressure less than 90 mmHg, hospital admissions, health status, retention in clinical care, satisfaction with treatment and care, and patient related quality of life. Anonymised demographic data
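
    The trial allocates participants 1:1:1 by minimisation, dynamically balancing the arms on gender, age, baseline systolic blood pressure, years with hypertension, and previous clinic attendance. Below is a minimal Pocock-Simon-style sketch of that idea; the arm labels, factor names, and the 0.8 follow-the-minimum probability are illustrative assumptions, not the trial's actual specification.

```python
import random

ARMS = ("usual-care", "information-SMS", "interactive-SMS")

def minimisation_assign(patient, allocated, arms=ARMS, p_follow=0.8, rng=None):
    """Pocock-Simon-style minimisation for one incoming patient.

    patient: dict of stratification factor -> level,
    e.g. {"gender": "F", "age": "<60"}.
    allocated: list of (arm, patient) pairs already randomised.
    With probability p_follow the arm minimising marginal imbalance is
    chosen; otherwise a random arm, so allocation stays unpredictable.
    """
    rng = rng or random.Random()

    def imbalance(arm):
        # Count, summed over factors, of patients already in `arm`
        # sharing this patient's factor levels -- smaller is better.
        return sum(1 for a, p in allocated if a == arm
                   for f, level in patient.items() if p.get(f) == level)

    best = min(arms, key=imbalance)
    return best if rng.random() < p_follow else rng.choice(arms)
```

With `p_follow=1.0` the assignment is deterministic, which is convenient for checking the balancing logic before adding the random element back in.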

  20. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  1. Texting while driving: is speech-based text entry less risky than handheld text entry?

    Science.gov (United States)

    He, J; Chaparro, A; Nguyen, B; Burge, R J; Crandall, J; Chaparro, B; Ni, R; Cao, S

    2014-11-01

    Research indicates that using a cell phone to talk or text while maneuvering a vehicle impairs driving performance. However, few published studies directly compare the distracting effects of texting using a hands-free (i.e., speech-based interface) versus handheld cell phone, which is an important issue for legislation, automotive interface design and driving safety training. This study compared the effect of speech-based versus handheld text entries on simulated driving performance by asking participants to perform a car following task while controlling the duration of a secondary text-entry task. Results showed that both speech-based and handheld text entries impaired driving performance relative to the drive-only condition by causing more variation in speed and lane position. Handheld text entry also increased the brake response time and increased variation in headway distance. Text entry using a speech-based cell phone was less detrimental to driving performance than handheld text entry. Nevertheless, the speech-based text entry task still significantly impaired driving compared to the drive-only condition. These results suggest that speech-based text entry disrupts driving, but reduces the level of performance interference compared to text entry with a handheld device. In addition, the difference in the distraction effect caused by speech-based and handheld text entry is not simply due to the difference in task duration. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. National Cyberethics, Cybersafety, Cybersecurity Baseline Study

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2009

    2009-01-01

    This article presents findings from a study that explores the nature of the Cyberethics, Cybersafety, and Cybersecurity (C3) educational awareness policies, initiatives, curriculum, and practices currently taking place in the U.S. public and private K-12 educational settings. The study establishes baseline data on C3 awareness, which can be used…

  3. ParticipACTION: Overview and introduction of baseline research on the "new" ParticipACTION

    Directory of Open Access Journals (Sweden)

    Craig Cora L

    2009-12-01

    Full Text Available Abstract Background This paper provides a brief overview of the Canadian physical activity communications and social marketing organization "ParticipACTION"; introduces the "new" ParticipACTION; describes the research process leading to the collection of baseline data on the new ParticipACTION; and outlines the accompanying series of papers in the supplement presenting the detailed baseline data. Methods Information on ParticipACTION was gathered from close personal involvement with the organization, from interviews and meetings with key leaders of the organization, from published literature and from ParticipACTION archives. In 2001, after nearly 30 years of operation, ParticipACTION ceased operations because of inadequate funding. In February 2007 the organization was officially resurrected and the launch of the first mass media campaign of the "new" ParticipACTION occurred in October 2007. The six-year absence of ParticipACTION, or any equivalent substitute, provided a unique opportunity to examine the impact of a national physical activity social marketing organization on important individual and organizational level indicators of success. A rapid response research team was established in January 2007 to exploit this natural intervention research opportunity. Results The research team was successful in obtaining funding through the new Canadian Institutes of Health Research Intervention Research (Healthy Living and Chronic Disease Prevention) Funding Program. Data were collected on individuals and organizations prior to the complete implementation of the first mass media campaign of the new ParticipACTION. Conclusion Rapid response research and funding mechanisms facilitated the collection of baseline information on the new ParticipACTION. These data will allow for comprehensive assessments of future initiatives of ParticipACTION.

  4. A Survey of Text Mining in Social Media: Facebook and Twitter Perspectives

    Directory of Open Access Journals (Sweden)

    Said A. Salloum

    2017-01-01

    Full Text Available Text mining has become one of the trendy fields that has been incorporated in several research fields such as computational linguistics, Information Retrieval (IR) and data mining. Natural Language Processing (NLP) techniques are used to extract knowledge from text written by human beings. Text mining reads an unstructured form of data to provide meaningful information patterns in the shortest possible time. Social networking sites are a great source of communication, as most people in today’s world use these sites daily to keep connected to each other. It has become common practice not to write sentences with correct grammar and spelling. This practice may lead to different kinds of ambiguities, such as lexical, syntactic, and semantic, and due to this type of unclear data it is hard to determine the intended meaning. Accordingly, we are conducting an investigation with the aim of looking for different text mining methods to get various textual orders on social media websites. This survey aims to describe how studies in social media have used text analytics and text mining techniques for the purpose of identifying the key themes in the data. This survey focused on analyzing text mining studies related to Facebook and Twitter, the two dominant social media platforms in the world. Results of this survey can serve as the baselines for future text mining research.

  5. Groundwater chemical baseline values to assess the Recovery Plan in the Matanza-Riachuelo River basin, Argentina.

    Science.gov (United States)

    Zabala, M E; Martínez, S; Manzano, M; Vives, L

    2016-01-15

    The two most exploited aquifers in the Matanza-Riachuelo River basin are being monitored in the framework of the Integrated Environmental Sanitation Plan that implements the Basin Authority, Autoridad de Cuenca Matanza Riachuelo. In this context, this work identifies the groundwater chemical types and the natural processes behind them; determines spatial and temporal changes; establishes ranges of variation for chemical components, and proposes concentration values for the upper limit of the natural chemical background. A total of 1007 samples from three aquifer-layers (Upper Aquifer, top and bottom of Puelche Aquifer) have been studied. As concrete guidelines for practical determination of baseline values are not available in the region, the methodology used follows the proposals of European projects which assessed European water directives. The groundwater composition is very stable in terms of both chemical facies and mineralization degree, and the changes observed in the dry and wet periods analysed are subtle in general. Most of the groundwater is Na-HCO3 type, except a few samples that are Ca-HCO3, Na-ClSO4 and Na-Cl types. The Ca-HCO3 waters are the result of calcium carbonate dissolution, Na-HCO3 waters result from cation exchange and carbonate dissolution, while in the Na-ClSO4 and Na-Cl waters, mixing with connate water and with encroached old marine water from the underlying and overlying sediments are the most relevant processes. The proposed values for the upper limit of the natural background consider the influence of geology and Holocene marine ingressions in the baseline of coastal groundwater. This study allowed the initial chemical conditions of the groundwater system of the Matanza-Riachuelo River basin to be established, and provides the reference from which the Basin Authority can start to evaluate trends and monitor the recovery plan. At the same time, it sets a precedent for future studies in the region. Copyright © 2015 Elsevier B.V. All rights reserved.
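
    An upper limit of the natural chemical background is commonly derived as a high percentile of the measured concentration distribution, in the spirit of the European baseline projects mentioned above. The sketch below uses a linear-interpolation percentile with an illustrative 97.7th-percentile cut-off; the paper's exact convention is not stated here, and the chloride values are invented.

```python
def natural_background_upper_limit(concentrations, percentile=97.7):
    """Upper limit of the natural chemical background, taken as a high
    percentile of the measured concentration distribution (linear-
    interpolation percentile; the 97.7 default is an illustrative
    convention, not this study's stated cut-off)."""
    data = sorted(concentrations)
    k = (len(data) - 1) * percentile / 100.0
    f = int(k)
    c = min(f + 1, len(data) - 1)
    return data[f] + (data[c] - data[f]) * (k - f)

# Hypothetical chloride concentrations (mg/L) from one aquifer layer.
chloride = [12, 15, 14, 18, 22, 16, 13, 19, 25, 17, 140, 20, 21, 15, 16]
upper_limit = natural_background_upper_limit(chloride)
# Samples above upper_limit (here the outlier 140) would be flagged as
# anomalous rather than treated as part of the natural baseline.
```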

  6. Modelling text as process a dynamic approach to EFL classroom discourse

    CERN Document Server

    Yang, Xueyan

    2010-01-01

    A discourse analysis that is not based on grammar is likely to end up as a running commentary on a text, whereas a grammar-based one tends to treat text as a finished product rather than an on-going process. This book offers an approach to discourse analysis that is both grammar-based and oriented towards text as process. It proposes a model called TEXT TYPE within the framework of Hallidayan systemic-functional linguistics, which views grammatical choices in a text not as elements that combine to form a clause structure, but as semantic features that link successive clauses into an unfolding

  7. Important Text Characteristics for Early-Grades Text Complexity

    Science.gov (United States)

    Fitzgerald, Jill; Elmore, Jeff; Koons, Heather; Hiebert, Elfrieda H.; Bowen, Kimberly; Sanford-Moore, Eleanor E.; Stenner, A. Jackson

    2015-01-01

    The Common Core set a standard for all children to read increasingly complex texts throughout schooling. The purpose of the present study was to explore text characteristics specifically in relation to early-grades text complexity. Three hundred fifty primary-grades texts were selected and digitized. Twenty-two text characteristics were identified…

  8. Future Long-Baseline Neutrino Facilities and Detectors

    Directory of Open Access Journals (Sweden)

    Milind Diwan

    2013-01-01

    Full Text Available We review the ongoing effort in the US, Japan, and Europe of the scientific community to study the location and the detector performance of the next-generation long-baseline neutrino facility. For many decades, research on the properties of neutrinos and the use of neutrinos to study the fundamental building blocks of matter has unveiled new, unexpected laws of nature. Results of neutrino experiments have triggered a tremendous amount of development in theory: theories beyond the standard model or at least extensions of it and development of the standard solar model and modeling of supernova explosions as well as the development of theories to explain the matter-antimatter asymmetry in the universe. Neutrino physics is one of the most dynamic and exciting fields of research in fundamental particle physics and astrophysics. The next-generation neutrino detector will address two aspects: fundamental properties of the neutrino like mass hierarchy, mixing angles, and the CP phase, and low-energy neutrino astronomy with solar, atmospheric, and supernova neutrinos. Such a new detector naturally allows for major improvements in the search for nucleon decay. A next-generation neutrino observatory needs a huge, megaton scale detector which in turn has to be installed in a new, international underground laboratory, capable of hosting such a huge detector.

  9. Semantic Document Image Classification Based on Valuable Text Pattern

    Directory of Open Access Journals (Sweden)

    Hossein Pourghassem

    2011-01-01

    Full Text Available Knowledge extraction from detected document images is a complex problem in the field of information technology. This problem becomes more intricate when we know that only a negligible percentage of the detected document images are valuable. In this paper, a segmentation-based classification algorithm is used to analyze the document image. In this algorithm, using a two-stage segmentation approach, regions of the image are detected and then classified into document and non-document (pure) regions in a hierarchical classification. In this paper, a novel definition of value is proposed to classify document images into valuable or invaluable categories. The proposed algorithm is evaluated on a database consisting of document and non-document images collected from the Internet. Experimental results show the efficiency of the proposed algorithm in semantic document image classification. The proposed algorithm provides an accuracy rate of 98.8% for the valuable and invaluable document image classification problem.

  10. Atmospheric pressure loading parameters from very long baseline interferometry observations

    Science.gov (United States)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to change in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; vanDam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
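
    The per-station pressure sensitivity described above is simply the least-squares slope of the estimated vertical position against local surface pressure. A minimal sketch with made-up session estimates follows; the numbers are illustrative, chosen to fall inside the 0 to -0.6 mm/mbar range the study reports.

```python
def pressure_admittance(pressures_mbar, heights_mm):
    """Least-squares slope of station vertical position versus local
    surface pressure, i.e. the loading admittance in mm/mbar (expected
    to be negative: higher pressure depresses the crust)."""
    n = len(pressures_mbar)
    mp = sum(pressures_mbar) / n
    mh = sum(heights_mm) / n
    num = sum((p - mp) * (h - mh) for p, h in zip(pressures_mbar, heights_mm))
    den = sum((p - mp) ** 2 for p in pressures_mbar)
    return num / den

# Hypothetical session estimates: (surface pressure in mbar, vertical in mm).
sessions = [(990, 4.1), (1000, 0.2), (1010, -3.9), (1005, -2.1), (995, 2.0)]
slope = pressure_admittance([p for p, _ in sessions], [h for _, h in sessions])
# slope is about -0.4 mm/mbar for these toy values, within the reported range.
```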

  11. Safety Evaluation of the ESP Sludge Washing Baselines Runs. Revision 1

    International Nuclear Information System (INIS)

    Gupta, M.K.

    1994-08-01

    The purpose is to provide the technical basis for the evaluation of the Unreviewed Safety Question for the Extended Sludge Processing (ESP) Sludge Washing Baseline Runs. The Baseline runs are necessary to ascertain the mechanical fitness of the equipment and modifications not operated since 1988 and to resolve technical questions associated with process control, i.e., sludge suspension, sludge settling, heat transfer, and temperature control. These issues need to be resolved prior to resumption of normal ESP operations. The equipment used for the Baseline runs comprises Tanks 42H and 51H and their associated equipment.

  12. Network analysis of named entity co-occurrences in written texts

    Science.gov (United States)

    Amancio, Diego Raphael

    2016-06-01

    The use of methods borrowed from statistics and physics to analyze written texts has allowed the discovery of unprecedented patterns of human behavior and cognition by establishing links between model features and language structure. While current models have been useful to unveil patterns via analysis of syntactical and semantical networks, only a few works have probed the relevance of investigating the structure arising from the relationship between relevant entities such as characters, locations and organizations. In this study, we represent entities appearing in the same context as a co-occurrence network, where links are established according to a null model based on random, shuffled texts. Computational simulations performed on novels revealed that the proposed model displays interesting topological features, such as the small-world property, characterized by high values of the clustering coefficient. The effectiveness of our model was verified in a practical pattern recognition task in real networks. When compared with traditional word adjacency networks, our model displayed improved results in identifying unknown references in texts. Because the proposed representation plays a complementary role in characterizing unstructured documents via topological analysis of named entities, we believe that it could be useful to improve the characterization of written texts (and related systems), specially if combined with traditional approaches based on statistical and deeper paradigms.
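
    The entity co-occurrence network and its shuffled-text null model can be sketched as follows. The toy contexts stand in for sentences of a novel; the null model keeps each context's size while reassigning entities at random. This is a simplification of the paper's method (which accepts links only where they exceed the null expectation); the names and contexts are invented.

```python
import random
from itertools import combinations

def cooccurrence_graph(contexts):
    """Undirected graph linking entities that appear in the same context."""
    adj = {}
    for ctx in contexts:
        for a, b in combinations(sorted(set(ctx)), 2):
            adj.setdefault(a, set()).add(b)
            adj.setdefault(b, set()).add(a)
    return adj

def avg_clustering(adj):
    """Mean local clustering coefficient: the fraction of each node's
    neighbour pairs that are themselves linked."""
    coeffs = []
    for node, nbrs in adj.items():
        if len(nbrs) < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
        coeffs.append(links / (len(nbrs) * (len(nbrs) - 1) / 2))
    return sum(coeffs) / len(coeffs)

def shuffled_null(contexts, rng=None):
    """Null model in the spirit of the shuffled-text baseline: keep the
    context sizes but reassign entities at random."""
    rng = rng or random.Random(42)
    pool = [e for ctx in contexts for e in ctx]
    rng.shuffle(pool)
    out, i = [], 0
    for ctx in contexts:
        out.append(pool[i:i + len(ctx)])
        i += len(ctx)
    return out

# Toy contexts: named entities mentioned in the same sentence of a novel.
contexts = [["Alice", "Bob", "Carol"], ["Alice", "Bob"], ["Carol", "Dan"],
            ["Alice", "Carol"], ["Bob", "Carol", "Dan"]]
real_c = avg_clustering(cooccurrence_graph(contexts))
null_c = avg_clustering(cooccurrence_graph(shuffled_null(contexts)))
```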

  13. Baseline Architecture of ITER Control System

    Science.gov (United States)

    Wallander, A.; Di Maio, F.; Journeaux, J.-Y.; Klotz, W.-D.; Makijarvi, P.; Yonekawa, I.

    2011-08-01

    The control system of ITER consists of thousands of computers processing hundreds of thousands of signals. The control system, being the primary tool for operating the machine, shall integrate, control and coordinate all these computers and signals and allow a limited number of staff to operate the machine from a central location with minimum human intervention. The primary functions of the ITER control system are plant control, supervision and coordination, both during experimental pulses and 24/7 continuous operation. The former can be split into three phases: preparation of the experiment by defining all parameters; execution of the experiment, including distributed feed-back control; and finally collection, archiving, analysis and presentation of all data produced by the experiment. We define the control system as a set of hardware and software components with well-defined characteristics. The architecture addresses the organization of these components and their relationship to each other. We distinguish between physical and functional architecture, where the former defines the physical connections and the latter the data flow between components. In this paper, we identify the ITER control system based on the plant breakdown structure. Then, the control system is partitioned into a workable set of bounded subsystems. This partition considers at the same time the completeness and the integration of the subsystems. The components making up subsystems are identified and defined, a naming convention is introduced and the physical networks defined. Special attention is given to timing and real-time communication for distributed control. Finally we discuss baseline technologies for implementing the proposed architecture based on analysis, market surveys, prototyping and benchmarking carried out during the last year.

  14. Why some people discount more than others: Baseline activation in the dorsal PFC mediates the link between COMT genotype and impatient choice

    Directory of Open Access Journals (Sweden)

    Lorena R. R. Gianotti

    2012-05-01

    Full Text Available Individuals differ widely in how steeply they discount future rewards. The sources of these stable individual differences in delay discounting (DD) are largely unknown. One candidate is the COMT Val158Met polymorphism, known to modulate prefrontal dopamine levels and affect DD. To identify possible neural mechanisms by which this polymorphism may contribute to stable individual DD differences, we measured 73 participants’ neural baseline activation using resting electroencephalogram (EEG). Such neural baseline activation measures are highly heritable and stable over time, thus an ideal endophenotype candidate to explain how genes may influence behavior via individual differences in neural function. After EEG-recording, participants made a series of incentive-compatible intertemporal choices to determine the steepness of their DD. We found that COMT significantly affected DD and that this effect was mediated by baseline activation level in the left dorsal prefrontal cortex (DPFC): (i) COMT had a significant effect on DD such that the number of Val alleles was positively correlated with steeper DD (a higher number of Val alleles means greater COMT activity and thus lower dopamine levels). (ii) A whole-brain search identified a cluster in left DPFC where baseline activation was correlated with DD; lower activation was associated with steeper DD. (iii) COMT had a significant effect on the baseline activation level in this left DPFC cluster such that a higher number of Val alleles was associated with lower baseline activation. (iv) The effect of COMT on DD was explained by the mediating effect of neural baseline activation in the left DPFC cluster. Our study thus establishes baseline activation level in left DPFC as a salient neural signature in the form of an endophenotype that mediates the link between COMT and DD.
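
    The mediation logic, COMT genotype → DPFC baseline activation → discounting, follows the classic decomposition of a total regression effect into direct and indirect (a × b) components. A stdlib-only sketch on invented per-subject data (all numbers below are hypothetical, for illustration only):

```python
def slope(xs, ys):
    """Simple-regression slope of ys on xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def partial_effects(xs, ms, ys):
    """Coefficients (b_x, b_m) of the two-predictor regression
    y ~ x + m, via the 2x2 normal equations on centred data."""
    n = len(xs)
    cx = [x - sum(xs) / n for x in xs]
    cm = [m - sum(ms) / n for m in ms]
    cy = [y - sum(ys) / n for y in ys]
    sxx = sum(v * v for v in cx)
    smm = sum(v * v for v in cm)
    sxm = sum(u * v for u, v in zip(cx, cm))
    sxy = sum(u * v for u, v in zip(cx, cy))
    smy = sum(u * v for u, v in zip(cm, cy))
    det = sxx * smm - sxm * sxm
    return (sxy * smm - smy * sxm) / det, (smy * sxx - sxy * sxm) / det

# Invented per-subject data: Val allele count (0-2), baseline DPFC
# activation (arbitrary units), and discounting steepness.
val = [0, 0, 1, 1, 1, 2, 2, 2]
dpfc = [1.9, 2.1, 1.4, 1.6, 1.5, 0.9, 1.1, 1.0]
dd = [0.20, 0.25, 0.40, 0.35, 0.45, 0.60, 0.55, 0.65]

total = slope(val, dd)                       # c:  COMT -> DD
a = slope(val, dpfc)                         # a:  COMT -> DPFC baseline
direct, b = partial_effects(val, dpfc, dd)   # c' and b: DPFC -> DD given COMT
indirect = a * b                             # mediated portion of the effect
# For OLS the identity total == direct + indirect holds exactly; mediation
# is suggested when the indirect term carries most of the total effect.
```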

  15. Network meta-analysis of disconnected networks: How dangerous are random baseline treatment effects?

    Science.gov (United States)

    Béliveau, Audrey; Goring, Sarah; Platt, Robert W; Gustafson, Paul

    2017-12-01

    In network meta-analysis, the use of fixed baseline treatment effects (a priori independent) in a contrast-based approach is regularly preferred to the use of random baseline treatment effects (a priori dependent). That is because, often, there is no need to model baseline treatment effects, which carry the risk of model misspecification. However, in disconnected networks, fixed baseline treatment effects do not work (unless extra assumptions are made), as there is not enough information in the data to update the prior distribution on the contrasts between disconnected treatments. In this paper, we investigate to what extent the use of random baseline treatment effects is dangerous in disconnected networks. We take 2 publicly available datasets of connected networks and disconnect them in multiple ways. We then compare the results of treatment comparisons obtained from a Bayesian contrast-based analysis of each disconnected network, using random normally distributed and exchangeable baseline treatment effects, to those obtained from a Bayesian contrast-based analysis of the initial connected network using fixed baseline treatment effects. For the 2 datasets considered, we found that the use of random baseline treatment effects in disconnected networks was appropriate. Because those datasets were not cherry-picked, there should be other disconnected networks that would benefit from being analyzed using random baseline treatment effects. However, there is also a risk that the normality and exchangeability assumptions will be inappropriate in other datasets, even though we have not observed this situation in our case study. We provide code, so other datasets can be investigated. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Interactive Effects of Dopamine Baseline Levels and Cycle Phase on Executive Functions: The Role of Progesterone

    Directory of Open Access Journals (Sweden)

    Esmeralda Hidalgo-Lopez

    2017-07-01

    Full Text Available Estradiol and progesterone levels vary along the menstrual cycle and have multiple neuroactive effects, including on the dopaminergic system. Dopamine relates to executive functions in an “inverted U-shaped” manner and its levels are increased by estradiol. Accordingly, dopamine-dependent changes in executive functions along the menstrual cycle have previously been studied in the pre-ovulatory phase, when estradiol levels peak. Specifically, it has been demonstrated that working memory is enhanced during the pre-ovulatory phase in women with low dopamine baseline levels, but impaired in women with high dopamine baseline levels. However, the role of progesterone, which peaks in the luteal cycle phase, has not been taken into account previously. Therefore, the main goals of the present study were to extend these findings (i) to the luteal cycle phase and (ii) to other executive functions. Furthermore, the usefulness of the eye blink rate (EBR) as an indicator of dopamine baseline levels in menstrual cycle research was explored. Thirty-six naturally cycling women were tested during three cycle phases (menses–low sex hormones; pre-ovulatory–high estradiol; luteal–high progesterone and estradiol). During each session, women performed a verbal N-back task, as a measure of working memory, and a single-trial version of the Stroop task, as a measure of response inhibition and cognitive flexibility. Hormone levels were assessed from saliva samples and spontaneous eye blink rate was recorded during menses. In the N-back task, women were faster during the luteal phase the higher their progesterone levels, irrespective of their dopamine baseline levels. In the Stroop task, we found a dopamine-cycle interaction, which was also driven by the luteal phase and progesterone levels. For women with higher EBR, performance decreased during the luteal phase, whereas for women with lower EBR, performance improved during the luteal phase. These findings suggest an important

  17. Effects of milk curd on saliva secretion in healthy volunteer compared to baseline, 2% pilocarpine and equivalent pH adjusted acetic acid solutions

    Directory of Open Access Journals (Sweden)

    Neda Babaee

    2011-01-01

    Full Text Available Background: Dry mouth is a common clinical problem, and different products have been proposed to improve it. In this investigation, the effects of "milk curd" on the amount of saliva secretion were studied. Materials and Methods: A total of 32 patients (aged 20-30) were selected from healthy volunteers. Milk curd concentrations of 0.5, 1, 2 and 4%, and 2% pilocarpine, were prepared as drops. The impact of the drugs on the saliva weight was assessed after 1-5 min. To determine the effects of the pH of the milk curd on the amount of saliva secretion, different concentrations of acetic acid were used. Results: At the end of the first minute, the differences between the data for all groups were statistically significant, and the difference between the 2% and 4% milk curd groups was higher than the others (P < 0.0001). The differences in the amount of saliva secreted at the end of the second minute between the baseline and 4% milk curd groups and between the 0.5% and 4% milk curd groups were significant (P = 0.006 and P = 0.025, respectively). In total, there was no significant difference between the effect of the various pH treatments and the amount of baseline saliva secretion. Conclusion: Milk curd has a significant local impact, and the saliva increase depends on the dose. It seems that this effect is not related only to its acidic taste. As a result, factors other than pH are involved in the effect.

  18. Exhaled Breath Condensate Detects Baseline Reductions in Chloride and Increases in Response to Albuterol in Cystic Fibrosis Patients

    Directory of Open Access Journals (Sweden)

    Courtney M. Wheatley

    2013-01-01

    Full Text Available Impaired ion regulation and dehydration is the primary pathophysiology in cystic fibrosis (CF) lung disease. A potential application of exhaled breath condensate (EBC) collection is to assess airway surface liquid ionic composition at baseline and in response to pharmacological therapy in CF. Our aims were to determine whether EBC could detect differences in ion regulation between CF and healthy individuals, and to measure the effect of albuterol on EBC ions in these populations. Baseline EBC Cl−, DLCO and SpO2 were lower in CF (n = 16) compared to healthy participants (n = 16). Following albuterol, EBC Cl− increased in CF subjects, while there was no change in DLCO or membrane conductance, but a decrease in pulmonary-capillary blood volume in both groups. This resulted in an improvement in diffusion at the alveolar-capillary unit, and removal of the baseline difference in SpO2 by 90 minutes in CF subjects. These results demonstrate that EBC detects differences in ion regulation between healthy and CF individuals, and that albuterol mediates increases in Cl− in CF, suggesting that the benefits of albuterol extend beyond simple bronchodilation.

  19. Finding Text Information in the Ocean of Electronic Documents

    Energy Technology Data Exchange (ETDEWEB)

    Medvick, Patricia A.; Calapristi, Augustin J.

    2003-02-05

    Information management in natural resources has become an overwhelming task. A massive amount of electronic documents and data is now available for creating informed decisions. The problem is finding the relevant information to support the decision-making process. Determining gaps in knowledge, in order to propose new studies or to decide which proposals to fund for maximum potential, is a time-consuming and difficult task. Additionally, available data stores are increasing in complexity; they now may include not only text and numerical data, but also images, sounds, and video recordings. Information visualization specialists at Pacific Northwest National Laboratory (PNNL) have software tools for exploring electronic data stores and for discovering and exploiting relationships within data sets. These provide capabilities for unstructured text exploration, the use of data signatures (a compact format for the essence of a set of scientific data) for visualization (Wong et al. 2000), visualizations for multiple query results (Havre et al. 2001), and others (http://www.pnl.gov/infoviz). We will focus on IN-SPIRE, a MS Windows version of PNNL’s SPIRE (Spatial Paradigm for Information Retrieval and Exploration). IN-SPIRE was developed to assist information analysts in finding and discovering information in huge masses of text documents.

  20. Spent Nuclear Fuel Project technical baseline document. Fiscal year 1995: Volume 1, Baseline description

    International Nuclear Information System (INIS)

    Womack, J.C.; Cramond, R.; Paedon, R.J.

    1995-01-01

    This document is a revision to WHC-SD-SNF-SD-002, and is issued to support the individual projects that make up the Spent Nuclear Fuel Project in developing lower-tier functions, requirements, interfaces, and technical baseline items. It presents the results of engineering analyses since September 1994. The mission of the SNFP on the Hanford site is to provide safe, economic, and environmentally sound management of Hanford SNF in a manner that stages it to final disposition. This particularly involves K Basin fuel, although other SNF is involved as well.

  1. Mass hierarchy sensitivity of medium baseline reactor neutrino experiments with multiple detectors

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hong-Xin, E-mail: hxwang@iphy.me [Department of Physics, Nanjing University, Nanjing 210093 (China); Zhan, Liang; Li, Yu-Feng; Cao, Guo-Fu [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Chen, Shen-Jian [Department of Physics, Nanjing University, Nanjing 210093 (China)

    2017-05-15

    We report on the neutrino mass hierarchy (MH) determination of medium-baseline reactor neutrino experiments with multiple detectors, where the sensitivity of measuring the MH can be significantly improved by adding a near detector. The impact of the baseline and target mass of the near detector on the combined MH sensitivity has been studied thoroughly. The optimal baseline and target mass of the near detector are ∼12.5 km and ∼4 kton, respectively, for a far detector with a target mass of 20 kton and a baseline of 52.5 km. As typical examples of future medium-baseline reactor neutrino experiments, the optimal location and target mass of the near detector are selected for the specific configurations of JUNO and RENO-50. Finally, we discuss the distinct effects of the reactor antineutrino energy spectrum uncertainty for setups with a single detector and with double detectors, which indicate that the spectrum uncertainty can be well constrained in the presence of the near detector.

  2. Dependence of optimum baseline setting on scatter fraction and detector response function

    International Nuclear Information System (INIS)

    Atkins, F.B.; Beck, R.N.; Hoffer, P.B.; Palmer, D.

    1977-01-01

    A theoretical and experimental investigation has been undertaken to determine the dependence of an optimum baseline setting on the amount of scattered radiation recorded in a spectrum, and on the energy resolution of the detector. In particular, baseline settings were established for clinical examinations which differed greatly in the amount of scattered radiation, namely, liver and brain scans, for which individual variations were found to produce only minimal fluctuations in the optimum baseline settings. This analysis resulted in an optimum baseline setting of 125.0 keV for brain scans and 127.2 keV for liver scans for the scintillation camera used in these studies. The criterion that was used is based on statistical considerations of the measurement of an unscattered component in the presence of a background due to scattered photons. The limitations of such a criterion are discussed, and phantom images are presented to illustrate these effects at various baseline settings. (author)

  3. An Automatic Baseline Regulation in a Highly Integrated Receiver Chip for JUNO

    Science.gov (United States)

    Muralidharan, P.; Zambanini, A.; Karagounis, M.; Grewing, C.; Liebau, D.; Nielinger, D.; Robens, M.; Kruth, A.; Peters, C.; Parkalian, N.; Yegin, U.; van Waasen, S.

    2017-09-01

    This paper describes the data processing unit and the automatic baseline regulation of a highly integrated readout chip (Vulcan) for JUNO. The chip collects data continuously at 1 Gsample/s. The primary data processing performed in the integrated circuit can help reduce the memory and data-processing effort in the subsequent stages. In addition, a baseline regulator that compensates for shifts in the baseline is described.
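The baseline-regulation idea can be illustrated with a simple digital baseline restorer. This is a hypothetical sketch, not the Vulcan chip's actual circuit: it tracks slow baseline drift with an exponential moving average that is frozen during pulses, and subtracts the tracked baseline from each sample (the threshold and smoothing constant are invented for illustration).

```python
import numpy as np

def restore_baseline(samples, alpha=0.01, pulse_threshold=50.0):
    """Return baseline-corrected samples; freeze tracking during pulses."""
    baseline = float(samples[0])
    out = np.empty_like(samples, dtype=float)
    for i, s in enumerate(samples):
        if abs(s - baseline) < pulse_threshold:   # quiet sample: track drift
            baseline += alpha * (s - baseline)
        out[i] = s - baseline                     # subtract tracked baseline
    return out

# Synthetic waveform: slow baseline drift plus one pulse at sample 300.
n = 1000
drift = np.linspace(0.0, 10.0, n)
sig = drift.copy()
sig[300:305] += 200.0                             # the pulse to preserve
corrected = restore_baseline(sig)
```

After correction the slow drift is removed while the pulse amplitude is essentially preserved, which is the point of freezing the tracker during pulses.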

  4. A Text-Mining Framework for Supporting Systematic Reviews.

    Science.gov (United States)

    Li, Dingcheng; Wang, Zhen; Wang, Liwei; Sohn, Sunghwan; Shen, Feichen; Murad, Mohammad Hassan; Liu, Hongfang

    2016-11-01

    Systematic reviews (SRs) involve the identification, appraisal, and synthesis of all relevant studies for focused questions in a structured, reproducible manner. High-quality SRs follow strict procedures and require significant resources and time. We investigated advanced text-mining approaches to reduce the burden associated with abstract screening in SRs and to provide a high-level information summary. A text-mining SR-supporting framework consisting of three self-defined semantics-based ranking metrics was proposed: keyword relevance, indexed-term relevance, and topic relevance. Keyword relevance is based on the user-defined keyword list used in the search strategy. Indexed-term relevance is derived from the indexed vocabulary developed by domain experts for indexing journal articles and books. Topic relevance is defined as the semantic similarity among retrieved abstracts in terms of topics generated by latent Dirichlet allocation, a Bayesian model for discovering topics. We tested the proposed framework using three published SRs addressing a variety of topics (Mass Media Interventions, Rectal Cancer and Influenza Vaccine). The results showed that when 91.8%, 85.7%, and 49.3% of the abstract-screening labor was saved for the three cases, respectively, the recall remained as high as 100%. Relevant studies identified manually showed strong topic similarity through topic analysis, which supports the inclusion of topic analysis as a relevance metric. It was demonstrated that advanced text-mining approaches can significantly reduce the abstract-screening labor of SRs and provide an informative summary of relevant studies.
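The topic-relevance metric can be sketched as follows. This is an illustrative reimplementation, not the authors' code: abstracts are mapped to topic distributions with latent Dirichlet allocation (here via scikit-learn, an assumed substitute for whatever tooling the paper used), and candidates are ranked by cosine similarity to a known-relevant abstract.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny invented corpus standing in for retrieved abstracts.
abstracts = [
    "influenza vaccine efficacy in elderly patients",
    "vaccination coverage and influenza outcomes",
    "rectal cancer surgery long term survival",
    "mass media interventions for smoking cessation",
]
counts = CountVectorizer().fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)          # per-abstract topic distributions

def topic_similarity(i, j):
    """Cosine similarity between topic vectors of abstracts i and j."""
    a, b = theta[i], theta[j]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank the remaining abstracts by topic similarity to abstract 0,
# treated here as a known-relevant study.
ranking = sorted(range(1, len(abstracts)),
                 key=lambda j: topic_similarity(0, j), reverse=True)
```

On a real SR corpus the ranking would push topically similar abstracts to the top of the screening queue, which is what allows the labor savings reported above.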

  5. The effectiveness of the PRISMA integrated service delivery network: preliminary report on methods and baseline data

    Directory of Open Access Journals (Sweden)

    Réjean Hébert

    2008-02-01

    Full Text Available Purpose: The PRISMA study analyzes an innovative coordination-type integrated service delivery (ISD) system developed to improve continuity and increase the effectiveness and efficiency of services, especially for older and disabled populations. The objective of the PRISMA study is to evaluate the effectiveness of this system in improving the health, empowerment and satisfaction of frail older people and modifying their health and social services utilization, without increasing the burden on informal caregivers. The objective of this paper is to present the methodology and give baseline data on the study participants. Methods: A quasi-experimental study with a pre-test, multiple post-tests, and a comparison group was used to evaluate the impact of the PRISMA ISD. Elders at risk of functional decline (501 experimental, 419 control) participated in the study. Results: At entry, the two groups were comparable for most variables. Over the first year, when the implementation rate was low (32%), participants from the control group used fewer services than those from the experimental group. After the first year, no statistically significant difference was observed for functional decline and changes in the other outcome variables. Conclusion: This first year must be considered a baseline year, showing the situation without significant implementation of PRISMA ISD systems. Results for the following years will have to be examined in light of these baseline results.

  6. Semantic Linking and Contextualization for Social Forensic Text Analysis

    NARCIS (Netherlands)

    Ren, Z.; van Dijk, D.; Graus, D.; van der Knaap, N.; Henseler, H.; de Rijke, M.; Brynielsson, J.; Johansson, F.

    2013-01-01

    With the development of social media, forensic text analysis is becoming more and more challenging as forensic analysts have begun to include this information source in their practice. In this paper, we report on our recent work related to semantic search in e-discovery and propose the use of entity

  7. Fusion of space-borne multi-baseline and multi-frequency interferometric results based on extended Kalman filter to generate high quality DEMs

    Science.gov (United States)

    Zhang, Xiaojie; Zeng, Qiming; Jiao, Jian; Zhang, Jingfa

    2016-01-01

    Repeat-pass Interferometric Synthetic Aperture Radar (InSAR) is a technique that can be used to generate DEMs. But the accuracy of InSAR is greatly limited by geometric distortions, atmospheric effects, and decorrelation, particularly in mountainous areas such as western China, where no high-quality DEM has so far been produced. Since each of the InSAR DEMs generated using data of different frequencies and baselines has its own advantages and disadvantages, there is considerable potential to overcome some of the limitations of InSAR by fusing Multi-baseline and Multi-frequency Interferometric Results (MMIRs). This paper proposes a fusion method based on the Extended Kalman Filter (EKF), which takes the InSAR-derived DEMs as states in the prediction step and the flattened interferograms as observations in the update step to generate the final fused DEM. Before the fusion, detection of layover and shadow regions, low-coherence regions, and regions with large height errors is carried out, because MMIRs in these regions are believed to be unreliable and are thereafter excluded. The whole processing flow is tested with TerraSAR-X and Envisat ASAR datasets. Finally, the fused DEM is validated against ASTER GDEM and the national standard DEM of China. The results demonstrate that the proposed method is effective even in low-coherence areas.
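The core of such a fusion can be sketched as a per-pixel Kalman measurement update. This is a minimal illustration of the idea only, not the paper's full EKF (which additionally takes flattened interferograms as observations): one DEM is treated as the predicted state and a second DEM is fused in as a height observation, weighted by their error variances.

```python
import numpy as np

def fuse_dems(h_pred, var_pred, h_obs, var_obs):
    """Kalman measurement update fusing two height estimates with variances."""
    k = var_pred / (var_pred + var_obs)        # Kalman gain
    h_fused = h_pred + k * (h_obs - h_pred)    # pull prediction toward obs
    var_fused = (1.0 - k) * var_pred           # fused variance shrinks
    return h_fused, var_fused

# Two hypothetical 2x2 DEM tiles (metres) with assumed error variances.
dem_x = np.array([[100.0, 102.0], [98.0, 101.0]])   # e.g. an X-band result
dem_c = np.array([[101.0, 103.0], [97.0, 100.0]])   # e.g. a C-band result
fused, var = fuse_dems(dem_x, 4.0, dem_c, 9.0)
```

The fused variance is always smaller than either input variance, which is why combining multi-baseline and multi-frequency results can outperform any single InSAR DEM.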

  8. A new approach to the classification of African oral texts | Kam ...

    African Journals Online (AJOL)

    All of these reasons have led to a re-examination of the different oral genres in the African context, and to a proposed division of these texts into five broad categories. Keywords: oral literature, oral genres, oral texts, discourse, utterances, joking games, oral literature researchers. Tydskrif vir Letterkunde ...

  9. Stability analysis of geomagnetic baseline data obtained at Cheongyang observatory in Korea

    Science.gov (United States)

    Amran, Shakirah M.; Kim, Wan-Seop; Cho, Heh Ree; Park, Po Gyu

    2017-07-01

    The stability of the baselines produced by the Cheongyang (CYG) observatory over the period 2014 to 2016 is analysed. Step heights of more than 5 nT were found in the H and Z components in 2014 and 2015, due to magnetic noise in the absolute-measurement hut. In addition, a periodic modulation observed in the H and Z baseline curves was related to an annual temperature variation of about 20 °C in the fluxgate-magnetometer hut. Improvement in data quality was evidenced by the small dispersion between successive measurements from June 2015 to the end of 2016. Moreover, the baselines were further improved by correcting the discontinuities in H and Z.
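Screening for baseline steps of the kind described above can be sketched with a simple jump-threshold check. The observatory's actual procedure is not specified, so this is only an assumed illustration, and the sample values are invented.

```python
import numpy as np

def find_steps(baseline, threshold=5.0):
    """Indices i where |baseline[i+1] - baseline[i]| exceeds threshold (nT)."""
    jumps = np.abs(np.diff(baseline))
    return np.flatnonzero(jumps > threshold)

# Invented H-component baseline values (nT) with one 7.2 nT discontinuity.
h = np.array([21300.0, 21300.5, 21301.0, 21308.2, 21308.0, 21307.8])
steps = find_steps(h)   # flags the jump between indices 2 and 3
```

Once a step is located, the segment after it can be offset-corrected, which is the kind of discontinuity correction the abstract reports for the H and Z baselines.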

  10. Corrective action baseline report for underground storage tank 2331-U Building 9201-1

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this report is to provide baseline geochemical and hydrogeologic data relative to corrective action for underground storage tank (UST) 2331-U at the Building 9201-1 Site. Progress in support of the Building 9201-1 Site has included monitoring well installation and baseline groundwater sampling and analysis. This document represents the baseline report for corrective action at the Building 9201-1 Site and is organized into three sections. Section 1 presents introductory information relative to the site, including the regulatory initiative, site description, and progress to date. Section 2 summarizes the additional monitoring well installation activities and the results of baseline groundwater sampling. Section 3 presents the baseline hydrogeology and the planned zone of influence for groundwater remediation.

  11. 77 FR 28373 - Liberty Energy (Midstates) Corp.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-14

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-24-000] Liberty Energy (Midstates) Corp.; Notice of Baseline Filing Take notice that on April 30, 2012, Liberty Energy (Midstates) Corp. submitted a baseline filing of their Statement of Operating Conditions for services provided...

  12. Mining the Text: 34 Text Features that Can Ease or Obstruct Text Comprehension and Use

    Science.gov (United States)

    White, Sheida

    2012-01-01

    This article presents 34 characteristics of texts and tasks ("text features") that can make continuous (prose), noncontinuous (document), and quantitative texts easier or more difficult for adolescents and adults to comprehend and use. The text features were identified by examining the assessment tasks and associated texts in the national…

  13. Text extraction method for historical Tibetan document images based on block projections

    Science.gov (United States)

    Duan, Li-juan; Zhang, Xi-qun; Ma, Long-long; Wu, Jian

    2017-11-01

    Text extraction is an important initial step in digitizing historical documents. In this paper, we present a text extraction method for historical Tibetan document images based on block projections. The task of text extraction is treated as a text area detection and location problem. The images are divided equally into blocks, and the blocks are filtered using the categories of their connected components and their corner-point density. By analyzing the projections of the filtered blocks, the approximate text areas can be located and the text regions extracted. Experiments on a dataset of historical Tibetan documents demonstrate the effectiveness of the proposed method.
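The projection analysis at the heart of such methods can be illustrated on a single binary image. This sketch locates text rows from a horizontal projection profile; the paper's method additionally works block-wise and filters blocks by connected-component categories and corner-point density, which is omitted here.

```python
import numpy as np

def text_row_ranges(binary, min_ink=1):
    """Return (start, end) row ranges whose ink-pixel count >= min_ink."""
    profile = binary.sum(axis=1)               # horizontal projection profile
    rows = profile >= min_ink                  # rows containing enough ink
    ranges, start = [], None
    for i, on in enumerate(rows):
        if on and start is None:
            start = i                          # text band begins
        elif not on and start is not None:
            ranges.append((start, i))          # text band ends
            start = None
    if start is not None:
        ranges.append((start, len(rows)))
    return ranges

# Synthetic 8x10 binary page with two "text lines" (1 = ink pixel).
img = np.zeros((8, 10), dtype=int)
img[1:3, 2:9] = 1
img[5:7, 1:8] = 1
```

Here `text_row_ranges(img)` recovers the two ink bands; on a real page the same profile peaks mark candidate text areas for extraction.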

  14. The qualitative research proposal

    Directory of Open Access Journals (Sweden)

    H Klopper

    2008-09-01

    Full Text Available Qualitative research in the health sciences has had to overcome many prejudices and a number of misunderstandings, but today qualitative research is as acceptable as quantitative research designs and is widely funded and published. Writing the proposal of a qualitative study, however, can be a challenging feat, due to the emergent nature of the qualitative research design and the description of the methodology as a process. Even today, many sub-standard proposals are still submitted to post-graduate evaluation committees and funding bodies. This problem has led the researcher to develop a framework to guide the qualitative researcher in writing the proposal of a qualitative study, based on the following research questions: (i) What is the process of writing a qualitative research proposal? and (ii) What do the structure and layout of a qualitative proposal look like? The purpose of this article is to discuss the process of writing the qualitative research proposal, as well as to describe the structure and layout of a qualitative research proposal. The process of writing a qualitative research proposal is discussed with regard to the most important questions that need to be answered in the proposal, with consideration of the guidelines of being practical, being persuasive, making broader links, aiming for crystal clarity and planning before you write. The structure of the qualitative research proposal is then discussed with regard to the key sections of the proposal, namely the cover page, abstract, introduction, review of the literature, research problem and research questions, research purpose and objectives, research paradigm, research design, research method, ethical considerations, dissemination plan, budget and appendices.

  15. U-10Mo Baseline Fuel Fabrication Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Hubbard, Lance R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Arendt, Christina L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dye, Daniel F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clayton, Christopher K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lerchen, Megan E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lombardo, Nicholas J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zacher, Alan H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-27

    This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and serve as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.

  16. Spent nuclear fuel technical baseline description, Fiscal Year 1996: Volume II, supporting data

    International Nuclear Information System (INIS)

    Womack, J.C.

    1995-11-01

    The Technical Baseline Description documents the Project-Level functions and requirements, along with associated enabling assumptions, issues, trade studies, interfaces, and products. It is a snapshot in time of the baseline at the beginning of September 1995. It supports the individual subprojects in the development of lower-tier functions, requirements, and specifications in FY 1996. It also supports the need for Hanford site planning to be based on an integrated Hanford site systems engineering technical baseline, and is traceable to that baseline. This document replaces and supersedes WHC-SD-SNF-SD-003.

  17. 75 FR 70732 - Enterprise Texas Pipeline LLC; Notice of Baseline Filing

    Science.gov (United States)

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-92-001] Enterprise Texas Pipeline LLC; Notice of Baseline Filing November 10, 2010. Take notice that on November 9, 2010, Enterprise Texas Pipeline LLC submitted a revised baseline filing of its Statement of Operating Conditions...

  18. System maintenance test plan for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    The TWRS [Tank Waste Remediation System] Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the maintenance testing approach for software testing of the TCBD system once SCR/PRs are implemented.

  19. 40 CFR 80.290 - How does a refiner apply for a sulfur baseline?

    Science.gov (United States)

    2010-07-01

    ... baseline? 80.290 Section 80.290 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (abt) Program-General Information § 80.290 How does a refiner apply for a sulfur baseline? (a) The... accordance with § 80.217. (b) The sulfur baseline request must be sent to: U.S. EPA, Attn: Sulfur Program...

  20. Baseline inventory data recommendations for National Wildlife Refuges

    Data.gov (United States)

    Department of the Interior — The Baseline Inventory Team recommends that each refuge have available abiotic “data layers” for topography, aerial photography, hydrography, soils, boundaries, and...

  1. Baseline for trust: defining 'new and additional' climate funding

    Energy Technology Data Exchange (ETDEWEB)

    Stadelmann, Martin [University of Zurich (Switzerland); Roberts, J. Timmons [Brown University (United States); Huq, Saleemul

    2010-06-15

    Climate finance is becoming a dark curve on the road from Copenhagen to Cancún. Poorer nations fear that richer ones will fulfil the US$30 billion 'fast-start' climate finance promises made in the non-binding Copenhagen Accord by relabelling or diverting basic development aid, or by simply delivering on past climate finance pledges. The problem is simple: contributor countries are operating with no clear baseline against which their promise of 'new and additional' funding can be counted – and they do not accept the baselines put forth by developing countries. A viable solution for the short term is to use projections of business-as-usual development assistance as baselines. The longer-term benchmark could be the provision of truly 'new' funds from new funding sources. Substantial up-front negotiations may be required, but seizing this opportunity to define baselines will build confidence on both sides and create predictability for future finance.

  2. Predicting Baseline for Analysis of Electricity Pricing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Lee, D. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Choi, J. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Spurlock, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-03

    To understand the impact of a new pricing structure on residential electricity demand, we need a baseline model that captures every factor other than the new price. The standard baseline is a randomized control group; however, a good control group is hard to design. This motivates us to develop data-driven approaches. We explored many techniques and designed a strategy, named LTAP, that can predict hourly usage years ahead. The key challenge in this process is that the daily cycle of electricity demand peaks a few hours after the temperature reaches its peak. Existing methods rely on lagged variables of recent past usage to enforce this daily cycle, and they have trouble making predictions years ahead. LTAP avoids this trouble by assuming that the daily usage profile is determined by temperature and other factors. In a comparison against a well-designed control group, LTAP was found to produce accurate predictions.
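The LTAP premise (predicting usage from temperature and hour-of-day features rather than from lagged usage, so forecasts do not depend on recent past observations) can be sketched with an ordinary least-squares fit on synthetic data. Every numeric detail here is an assumption for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 30)                       # 30 synthetic days, hourly
hour = hours % 24
# Temperature peaks around 15:00; demand peaks a few hours later (~18:00).
temp = 20 + 8 * np.sin(2 * np.pi * (hour - 15) / 24)
usage = 1.0 + 0.05 * temp + 0.3 * np.sin(2 * np.pi * (hour - 18) / 24)
usage = usage + rng.normal(0, 0.02, size=usage.shape)

# Design matrix: temperature plus harmonic hour-of-day terms, no lags.
X = np.column_stack([
    np.ones_like(temp), temp,
    np.sin(2 * np.pi * hour / 24), np.cos(2 * np.pi * hour / 24),
])
coef, *_ = np.linalg.lstsq(X, usage, rcond=None)
pred = X @ coef                                  # hourly baseline prediction
```

Because the features are calendar- and weather-driven rather than lag-driven, the same fitted model can be evaluated for any future year given a temperature forecast, which is the property LTAP exploits.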

  3. CASA Uno GPS orbit and baseline experiments

    Science.gov (United States)

    Schutz, B. E.; Ho, C. S.; Abusali, P. A. M.; Tapley, B. D.

    1990-01-01

    CASA Uno data from sites distributed in longitude from Australia to Europe have been used to determine orbits of the GPS satellites. The characteristics of the orbits determined from double difference phase have been evaluated through comparisons of two-week solutions with one-week solutions and by comparisons of predicted and estimated orbits. Evidence of unmodeled effects is demonstrated, particularly associated with the orbit planes that experience solar eclipse. The orbit accuracy has been assessed through the repeatability of unconstrained estimated baseline vectors ranging from 245 km to 5400 km. Both the baseline repeatability and the comparison with independent space geodetic methods give results at the level of 1-2 parts in 100,000,000. In addition, the Mojave/Owens Valley (245 km) and Kokee Park/Ft. Davis (5409 km) estimates agree with VLBI and SLR to better than 1 part in 100,000,000.
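
The quoted accuracy of 1-2 parts in 100,000,000 translates directly into millimetre-level baseline repeatability, which a quick arithmetic sketch (illustrative, not from the paper) makes concrete:

```python
# Convert a fractional baseline accuracy into an absolute scatter.
def repeatability_mm(baseline_km, parts_per_1e8):
    """Baseline repeatability in millimetres for a given fractional accuracy,
    expressed in parts per 10^8."""
    return baseline_km * 1e6 * (parts_per_1e8 * 1e-8)

# Mojave/Owens Valley: 245 km at 1 part in 10^8 -> ~2.5 mm
# Kokee Park/Ft. Davis: 5409 km at 2 parts in 10^8 -> ~108 mm
```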

  4. Development Of Regional Climate Mitigation Baseline For A DominantAgro-Ecological Zone Of Karnataka, India

    Energy Technology Data Exchange (ETDEWEB)

    Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.
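
The comparability claim rests on simple stock ratios, which can be sketched as below. The per-hectare carbon stock figures here are illustrative assumptions chosen to reproduce the quoted ratios, not values from the study:

```python
# Ratio of project-specific to regional baseline carbon stock estimates.
def baseline_ratio(project_stock, regional_stock):
    """Both arguments in the same units (e.g. t C per hectare)."""
    return round(project_stock / regional_stock, 2)

# Hypothetical stocks reproducing the reported ratios:
assert baseline_ratio(25.5, 25.0) == 1.02   # wastelands
assert baseline_ratio(24.25, 25.0) == 0.97  # fallow lands
```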

  5. Communication dated 16 June 2008 received from the Permanent Mission of the Islamic Republic of Iran to the Agency concerning the text of the 'Islamic Republic of Iran's proposed package for constructive negotiation'

    International Nuclear Information System (INIS)

    2008-01-01

    The Secretariat has received a Note Verbale dated 16 June 2008 from the Permanent Mission of the Islamic Republic of Iran attaching the text of the 'Islamic Republic of Iran's proposed package for constructive negotiation'. The Note Verbale and, as requested therein, its attachment, are circulated herewith for the information of the Member States

  6. 75 FR 38802 - DCP Raptor Pipeline, LLC; Notice of Baseline Filing

    Science.gov (United States)

    2010-07-06

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-42-000] DCP Raptor Pipeline, LLC; Notice of Baseline Filing June 28, 2010. Take notice that on June 22, 2010, DCP Raptor Pipeline, LLC submitted a baseline filing of its Statement of Operating Conditions for services provided...

  7. Guidance on Port Biological Baseline Surveys (PBBS)

    Digital Repository Service at National Institute of Oceanography (India)

    Awad, A.; Haag, F.; Anil, A.C.; Abdulla, A.

    This publication has been prepared by GBP, IOI, CSIR-NIO and IUCN in order to serve as guidance to those who are planning to carry out a port biological baseline survey, in particular in the context of Ballast Water Management. It has been drafted...

  8. Baseline and Multimodal UAV GCS Interface Design

    Science.gov (United States)

    2013-07-01

    complete a computerized version of the NASA - TLX assessment of perceived mental workload. 2.3 Results The baseline condition ran smoothly and with...System MALE Medium-altitude, Long-endurance NASA - TLX NASA Task Load Index SA Situation Awareness TDT Tucker Davis Technologies UAV Uninhabited Aerial

  9. 48 CFR 34.202 - Integrated Baseline Reviews.

    Science.gov (United States)

    2010-10-01

    ... or contractor, and the Government, of the— (1) Ability of the project's technical plan to achieve the... successfully achieve the project schedule objectives; (3) Ability of the Performance Measurement Baseline (PMB... SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 34.202...

  10. Build It, But Will They Come? A Geoscience Cyberinfrastructure Baseline Analysis

    Directory of Open Access Journals (Sweden)

    Joel Cutcher-Gershenfeld

    2016-07-01

    Full Text Available Understanding the earth as a system requires integrating many forms of data from multiple fields. Builders and funders of the cyberinfrastructure designed to enable open data sharing in the geosciences risk a key failure mode: What if geoscientists do not use the cyberinfrastructure to share, discover and reuse data? In this study, we report a baseline assessment of engagement with the NSF EarthCube initiative, an open cyberinfrastructure effort for the geosciences. We find scientists perceive the need for cross-disciplinary engagement and engage where there is organizational or institutional support. However, we also find a possibly imbalanced involvement between cyber and geoscience communities at the outset, with the former showing more interest than the latter. This analysis highlights the importance of examining fields and disciplines as stakeholders to investments in the cyberinfrastructure supporting science.

  11. Baseline glucocorticoids are drivers of body mass gain in a diving seabird.

    Science.gov (United States)

    Hennin, Holly L; Wells-Berlin, Alicia M; Love, Oliver P

    2016-03-01

    Life-history trade-offs are influenced by variation in individual state, with individuals in better condition often completing life-history stages with greater success. Although resource accrual significantly impacts key life-history decisions such as the timing of reproduction, little is known about the underlying mechanisms driving resource accumulation. Baseline corticosterone (CORT, the primary avian glucocorticoid) mediates daily and seasonal energetics, responds to changes in food availability, and has been linked to foraging behavior, making it a strong potential driver of individual variation in resource accrual and deposition. Working with a captive colony of white-winged scoters (Melanitta fusca deglandi), we aimed to causally determine whether variation in baseline CORT drives individual body mass gains mediated through fattening rate (plasma triglycerides corrected for body mass). We implanted individuals with each of three treatment pellets to elevate CORT within a baseline range in a randomized order: control, low dose of CORT, high dose of CORT, then blood sampled and recorded body mass over a two-week period to track changes in baseline CORT, body mass, and fattening rates. The high CORT treatment significantly elevated levels of plasma hormone for a short period of time within the biologically relevant, baseline range for this species, but importantly did not inhibit the function of the HPA (hypothalamic-pituitary-adrenal) axis. Furthermore, an elevation in baseline CORT resulted in a consistent increase in body mass throughout the trial period compared to controls. This is some of the first empirical evidence demonstrating that elevations of baseline CORT within a biologically relevant range have a causal, direct, and positive influence on changes in body mass.

  13. Item response theory analysis of the mechanics baseline test

    Science.gov (United States)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
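
The item response model behind this kind of analysis can be sketched with the two-parameter logistic (2PL) function: the probability of a correct answer depends on student ability, item difficulty, and item discrimination. The parameter values below are illustrative, not fitted MBT parameters:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability a student of ability theta
    answers correctly an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A highly discriminating item (a = 2) separates students near its
# difficulty much more sharply than a weakly discriminating one (a = 0.3),
# which is how poorly performing questions can be identified.
```

An item whose fitted discrimination is near zero contributes little information about student skill, which is the basis for pruning to the "highest quality" subset the abstract describes.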

  14. Improving Ambiguity Resolution for Medium Baselines Using Combined GPS and BDS Dual/Triple-Frequency Observations.

    Science.gov (United States)

    Gao, Wang; Gao, Chengfa; Pan, Shuguo; Wang, Denghui; Deng, Jiadong

    2015-10-30

    The regional constellation of the BeiDou navigation satellite system (BDS) has been providing continuous positioning, navigation and timing services since 27 December 2012, covering China and the surrounding area. Real-time kinematic (RTK) positioning with combined BDS and GPS observations is feasible. Moreover, all BDS satellites transmit triple-frequency signals. Using the advantages of multi-frequency pseudorange and carrier observations from multiple systems is expected to be of much benefit for ambiguity resolution (AR). We propose an integrated AR strategy for medium baselines using combined GPS and BDS dual/triple-frequency observations. In this method, the extra-wide-lane (EWL) ambiguities of the triple-frequency system, i.e., BDS, are determined first. Then the dual-frequency WL ambiguities of BDS and GPS are resolved with the geometry-based model using the BDS ambiguity-fixed EWL observations. After that, the basic (i.e., L1/L2 or B1/B2) ambiguities of BDS and GPS are estimated together with the so-called ionosphere-constrained model, where the ambiguity-fixed WL observations are added to strengthen the model. During both the WL and basic AR, a partial ambiguity fixing (PAF) strategy is adopted to weaken the negative influence of newly risen or low-elevation satellites. Experiments were conducted in which GPS/BDS dual/triple-frequency data were collected in Nanjing and Zhengzhou, China, with baseline distances varying from about 28.6 to 51.9 km. The results indicate that, compared to the single triple-frequency BDS system, the combined system can significantly enhance the AR model strength and thus improve AR performance for medium baselines, with a 75.7% reduction of initialization time on average. More accurate and stable positioning results can also be derived using the combined GPS/BDS system.
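
The cascading EWL-then-WL-then-basic order exploits the wavelengths of the difference combinations: the longer the wavelength, the easier the integer ambiguity is to fix. A quick sketch from the published carrier frequencies (GPS L1/L2 and BDS B1I/B2I/B3I) shows why the BDS EWL is fixed first:

```python
C = 299_792_458.0  # speed of light, m/s

def combo_wavelength_m(f1_mhz, f2_mhz):
    """Wavelength of the difference (wide-lane) combination of two carriers,
    given their frequencies in MHz."""
    return C / ((f1_mhz - f2_mhz) * 1e6)

print(combo_wavelength_m(1575.42, 1227.60))   # GPS wide lane (L1-L2): ~0.86 m
print(combo_wavelength_m(1561.098, 1207.14))  # BDS wide lane (B1-B2): ~0.85 m
print(combo_wavelength_m(1268.52, 1207.14))   # BDS extra-wide lane (B3-B2): ~4.88 m
```

With a ~4.9 m wavelength, the EWL ambiguity can usually be fixed almost instantaneously even over medium baselines, and the fixed EWL observations then tighten the wide-lane and basic ambiguity searches in turn.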

  15. Information Technology Sector Baseline Risk Assessment

    Science.gov (United States)

    2009-08-01

    ...alternative root be economically advantageous, an actor's ability to exploit market forces and create an alternative root would be significantly improved... conduct their operations. Therefore, a loss or disruption to Internet services would not be advantageous for the desired outcomes of these syndicates... eCommerce Service loss or disruption [C] Traffic Redirection [C] = Undesired consequence

  16. The prognostic utility of baseline alpha-fetoprotein for hepatocellular carcinoma patients.

    Science.gov (United States)

    Silva, Jack P; Gorman, Richard A; Berger, Nicholas G; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2017-12-01

    Alpha-fetoprotein (AFP) has a valuable role in postoperative surveillance for hepatocellular carcinoma (HCC) recurrence. The utility of pretreatment or baseline AFP remains controversial. The present study hypothesized that elevated baseline AFP levels are associated with worse overall survival in HCC patients. Adult HCC patients were identified using the National Cancer Database (2004-2013). Patients were stratified according to baseline AFP measurements into four groups: Negative, Borderline, Elevated, and Highly Elevated. The primary outcome was overall survival (OS), which was analyzed by log-rank test and graphed using the Kaplan-Meier method. Multivariate regression modeling was used to determine hazard ratios (HR) for OS. Of 41 107 patients identified, 15 809 (33.6%) were Negative. Median overall survival was highest in the Negative group, followed by Borderline, Elevated, and Highly Elevated (28.7 vs 18.9 vs 8.8 vs 3.2 months; P < 0.001). On multivariate analysis, overall survival hazard ratios for the Borderline, Elevated, and Highly Elevated groups were 1.18 (P = 0.267), 1.94 (P < 0.001), and 1.77 (P = 0.007), respectively (reference Negative). Baseline AFP independently predicted overall survival in HCC patients regardless of treatment plan. A baseline AFP value is a simple and effective method to assist in estimating expected survival for HCC patients. © 2017 Wiley Periodicals, Inc.

  17. Effect of computer game playing on baseline laparoscopic simulator skills.

    Science.gov (United States)

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and no relationship between computer game playing and baseline performance on laparoscopic simulators has been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. The study was conducted at a local high school in Norway; forty-eight students from 2 high school classes volunteered to participate. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  18. Investigating the baselines of Bismuth, Optical Fiber and LED calibrated photomultiplier tubes

    CERN Document Server

    Evans, Hywel Turner

    2016-01-01

    LUCID is the luminosity monitor for the ATLAS experiment; its aim is to determine luminosity with an uncertainty of a few percent. The main purpose of this work is the study of the baseline stability of the LUCID readout channels during calibration runs. This study represents the first systematic treatment of this problem by the LUCID group. By replacing the mean baseline with the minimum baseline of each event, an upper limit of 2.85% was placed upon the possible improvement in determining the LED amplitude. It is therefore better to use a fixed baseline for LED calibration, since contamination was observed when the baseline is calculated event by event. For Bismuth and Fiber, the improvement cannot exceed the gain stability of 1%, so the existing method is verified as optimal.
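
The mean-versus-minimum comparison can be sketched on a toy digitized waveform (illustrative data and sample counts, not LUCID readout): estimate the per-event baseline either way, then subtract it before measuring the pulse amplitude.

```python
# Toy per-event baseline estimators for a digitized pulse.
def baseline_mean(samples, n_pre=4):
    """Mean of the first n_pre (pre-pulse) samples; n_pre is an assumption."""
    return sum(samples[:n_pre]) / n_pre

def baseline_min(samples):
    """Minimum sample of the event."""
    return min(samples)

def amplitude(samples, baseline):
    """Peak height above the chosen baseline."""
    return max(samples) - baseline

event = [10, 9, 10, 11, 10, 30, 50, 30, 10]  # flat pedestal, then a pulse
```

On noisy data the minimum sits systematically below the mean, so an event-by-event minimum can absorb noise fluctuations into the baseline, which is consistent with the "pollution" the study observed when calculating the baseline event by event.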

  19. Baseline Screening Mammography: Performance of Full-Field Digital Mammography Versus Digital Breast Tomosynthesis.

    Science.gov (United States)

    McDonald, Elizabeth S; McCarthy, Anne Marie; Akhtar, Amana L; Synnestvedt, Marie B; Schnall, Mitchell; Conant, Emily F

    2015-11-01

    Baseline mammography studies have significantly higher recall rates than mammography studies with available comparison examinations. Digital breast tomosynthesis reduces recalls when compared with digital mammographic screening alone, but many sites operate in a hybrid environment. To maximize the effect of screening digital breast tomosynthesis with limited resources, choosing which patient populations will benefit most is critical. This study evaluates digital breast tomosynthesis in the baseline screening population. Outcomes were compared for 10,728 women who underwent digital mammography screening, including 1204 (11.2%) baseline studies, and 15,571 women who underwent digital breast tomosynthesis screening, including 1859 (11.9%) baseline studies. Recall rates, cancer detection rates, and positive predictive values were calculated. Logistic regression estimated the odds ratios of recall for digital mammography versus digital breast tomosynthesis for patients undergoing baseline screening and previously screened patients, adjusted for age, race, and breast density. In the baseline subgroup, recall rates for digital mammography and digital breast tomosynthesis screening were 20.5% and 16.0%, respectively (p = 0.002); digital breast tomosynthesis screening in the baseline subgroup resulted in a 22% reduction in recall compared with digital mammography, or 45 fewer patients recalled per 1000 patients screened. Digital breast tomosynthesis screening in previously screened patients resulted in a recall reduction of 14.3%, indicating that patients undergoing baseline screening benefit more from digital breast tomosynthesis than from digital mammography alone.
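
The headline figures in the baseline subgroup follow directly from the two recall rates, as a quick arithmetic check shows:

```python
# Relative recall reduction and absolute recalls avoided per 1000 screens.
def relative_reduction_pct(old_rate, new_rate):
    """Relative reduction in percent, e.g. 0.205 -> 0.160 gives ~22%."""
    return 100.0 * (old_rate - new_rate) / old_rate

def fewer_per_1000(old_rate, new_rate):
    """Absolute reduction expressed per 1000 patients screened."""
    return round((old_rate - new_rate) * 1000)

print(relative_reduction_pct(0.205, 0.160))  # ~22 (% relative reduction)
print(fewer_per_1000(0.205, 0.160))          # 45 fewer recalls per 1000
```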

  20. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
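
Prediction-quality testing of this kind typically scores a model with normalized error metrics computed only from actual and predicted usage, never from the model internals. The sketch below uses two metrics common in building energy baselining practice (e.g. ASHRAE Guideline 14); treating them as the protocol's exact metrics is an assumption:

```python
import math

def nmbe_pct(actual, predicted):
    """Normalized mean bias error (%): systematic over/under-prediction."""
    n = len(actual)
    mean_a = sum(actual) / n
    return 100.0 * sum(a - p for a, p in zip(actual, predicted)) / (n * mean_a)

def cvrmse_pct(actual, predicted):
    """Coefficient of variation of RMSE (%): scatter of the predictions."""
    n = len(actual)
    mean_a = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return 100.0 * rmse / mean_a
```

Because both metrics need only the predicted series, a vendor can submit predictions for a benchmark dataset and be scored without disclosing any algorithm, which is the point the abstract emphasizes.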