WorldWideScience

Sample records for evaluated data

  1. Evaluated Nuclear Data

    Energy Technology Data Exchange (ETDEWEB)

    Oblozinsky, P.; Herman, M.; Mughabghab, S.F.

    2010-10-01

    This chapter describes the current status of evaluated nuclear data for nuclear technology applications. We start with evaluation procedures for neutron-induced reactions, focusing on incident energies from thermal energy up to 20 MeV, though higher energies are also mentioned. We then examine the status of evaluated neutron data for the actinides, which play a dominant role in most applications, followed by coolants/moderators, structural materials and fission products. We then discuss neutron covariance data that characterize uncertainties and correlations. We explain how modern evaluated nuclear data libraries are validated against an extensive set of integral benchmark experiments. Afterwards, we briefly examine other data of importance for nuclear technology, including fission yields, thermal neutron scattering and decay data. A description of three major evaluated nuclear data libraries is provided, covering the latest versions: the US library ENDF/B-VII.0, the European JEFF-3.1 and the Japanese JENDL-3.3. A brief introduction is given to current web retrieval systems that allow easy access to a vast amount of up-to-date evaluated nuclear data for nuclear technology applications.

  2. Temperature Data Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, David

    2003-03-01

    Groundwater temperature is sensitive to the competing processes of heat flow from below and the advective transport of heat by groundwater flow. Because groundwater temperature is sensitive to both conductive and advective processes, it may be used as a tracer to further constrain the uncertainty of predictions of advective radionuclide transport models constructed for the Nevada Test Site (NTS). Since heat transport, geochemical, and hydrologic models for a given area must all be consistent, uncertainty can be reduced by devaluing the weight of those models that do not match estimated heat flow. The objective of this study was to identify the quantity and quality of available heat flow data at the NTS. One hundred forty-five temperature logs from 63 boreholes were examined. Thirteen were found to have temperature profiles suitable for the determination of heat flow values from one or more intervals within the boreholes. If sufficient spatially distributed heat flow values are obtained, a heat transport model coupled to a hydrologic model may be used to reduce the uncertainty of a nonisothermal hydrologic model of the NTS.

  3. International scope of data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Pearlstein, S.

    1985-01-01

    This paper summarizes the principal national and international evaluation activities that contributed to the widespread use of evaluated data files. Emphasis is placed on those efforts that have become best known through the availability of data, documentation, and computer codes. Early attempts at nuclear data evaluation consisted of improving communication among measurers of similar information. As reactor methodology progressed from the four-factor formula to multigroup theory, the demand for detailed representation of nuclear data increased. Systematic access to large volumes of data required placing the information in computer-readable formats.

  4. NUCLEAR DATA EVALUATIONS AND RECOMMENDATIONS

    Energy Technology Data Exchange (ETDEWEB)

    HOLDEN, N.E.

    2005-05-08

    The published scientific literature is scanned and periodically evaluated for neutron and non-neutron nuclear data, and the resulting recommendations are published [1,2]. After the literature has been scanned and appropriate data collected, problems often arise in the treatment of the various types of data during the evaluation process and in the method by which recommendations are drawn from the assessment of the collection of individual measurements. Some problems with uncertainties are presented.

  5. The Decay Data Evaluation Project

    Science.gov (United States)

    Browne, E.

    1999-10-01

    Nuclear scientists from France, Germany, the United Kingdom, and the United States have joined their efforts to evaluate decay data for radionuclides specifically used in applied research and detector calibrations. The purpose of this collaboration, the Decay Data Evaluation Project (DDEP), is to produce recommended values and to suggest new measurements for data that are unsatisfactory. Uniformity and reproducibility are features of importance for this work. These features, as well as the general scope of this project and its relation to the Evaluated Nuclear Structure Data File (ENSDF), will be presented and illustrated with examples. This work is presented on behalf of the Decay Data Evaluation Project. Work supported in part by the US Department of Energy under contract number DE-AC03-76SF00098.

  6. Evaluated Nuclear Structure Data File

    Science.gov (United States)

    Tuli, Jagdish K.

    2004-10-01

    The Evaluated Nuclear Structure Data File (ENSDF) is a leading resource for experimental nuclear data. It is maintained and distributed by the National Nuclear Data Center, Brookhaven National Laboratory. The file is mainly contributed to by an international network of evaluators under the auspices of the International Atomic Energy Agency. The ENSDF is updated, generally by mass number, i.e., by evaluating together all isobars for a given mass number. If, however, experimental activity in an isobaric chain is limited to a particular nuclide, then only that nuclide is updated. The evaluations are published in the journal Nuclear Data Sheets, a publication of Elsevier. This presentation will briefly review this and other databases and dissemination services of the US and international networks, and reflect on how the network resources can help scientists in both basic and applied fields.

  7. Computational methods for data evaluation and assimilation

    CERN Document Server

    Cacuci, Dan Gabriel

    2013-01-01

    Data evaluation and data combination require the use of a wide range of probability theory concepts and tools, from deductive statistics mainly concerning frequencies and sample tallies to inductive inference for assimilating non-frequency data and a priori knowledge. Computational Methods for Data Evaluation and Assimilation presents interdisciplinary methods for integrating experimental and computational information. This self-contained book shows how the methods can be applied in many scientific and engineering areas. After presenting the fundamentals underlying the evaluation of experiment

  8. Evaluated data collections from ENSDF. [ORNL

    Energy Technology Data Exchange (ETDEWEB)

    Ewbank, W. B.

    1979-01-01

    For several years the Nuclear Data Project has been maintaining an Evaluated Nuclear Structure Data File (ENSDF), which is designed to include critically evaluated values for most nuclear spectroscopic quantities. The information in ENSDF is the same as in the Nuclear Data Sheets, which illustrate two particular output formats (drawings and tables). Spectroscopic information for nuclei with A < 45 is put into ENSDF from the evaluations of Ajzenberg-Selove and of Endt and van der Leun. An international network was organized to provide regular revisions of the data file. Computer facilities were developed to retrieve collections of evaluated data for special calculations or detailed examination.

  9. Evaluation of Oracle Big Data Integration Tools

    OpenAIRE

    Urhan, Harun; Baranowski, Zbigniew

    2015-01-01

    The project's objective is to evaluate Oracle's Big Data Integration Tools. The project covers the evaluation of two of Oracle's tools: Oracle Data Integrator Application Adapters for Hadoop, used to load data from an Oracle Database into Hadoop, and Oracle SQL Connectors for HDFS, used to query data stored on a Hadoop file system with SQL statements executed on an Oracle Database.

  10. Data Testing CIELO Evaluations with ICSBEP Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-03-09

    We review criticality data testing performed at Los Alamos with a combination of ENDF/B-VII.1 and potential CIELO nuclear data evaluations. The goal of CIELO is to develop updated, best available evaluated nuclear data files for 1H, 16O, 56Fe, 235,238U and 239Pu, because the major international evaluated nuclear data libraries do not agree on the internal cross-section details of these most important nuclides.

  11. Evaluation of Linked Data Approach on Scientific Geospatial Data

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Minnis, P.; Spagenberg, D. A.

    2012-12-01

    Vast amounts of scientific data are collected from the NASA Earth Observing System (EOS), which creates a challenge in collaborating on, discovering, and accessing large and diverse data sets that can span multiple disciplines. Typically, web tools or services are created to allow users or applications to discover, order, and retrieve the relevant data, which reside behind a repository in varying formats and semantics. This access model makes it difficult to search and access data sets across multiple repositories in a quick and efficient manner. The authors describe a framework for transforming various satellite data into the Resource Description Framework (RDF) data model, in which all observations are linked spatially and temporally in a Linked Data context, using new and existing vocabularies to describe these observations. Sample case-study data sets are transformed and stored in multiple semantic repositories for demonstration and evaluation. The challenges and results from the evaluation are presented and discussed.
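
    As a rough illustration of the kind of transformation described above, the sketch below builds spatially and temporally linked RDF triples for a single observation using the rdflib Python library. The EX vocabulary, observation identifier and values are invented for illustration and are not the authors' actual schema.

      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import RDF, XSD

      EX = Namespace("http://example.org/eos/")                    # hypothetical vocabulary
      GEO = Namespace("http://www.w3.org/2003/01/geo/wgs84_pos#")  # W3C WGS84 vocabulary

      g = Graph()
      obs = EX["obs/20120101-0001"]                                # hypothetical observation ID

      g.add((obs, RDF.type, EX.SatelliteObservation))
      g.add((obs, EX.parameter, Literal("cloud_fraction")))        # observed quantity
      g.add((obs, EX.value, Literal(0.42, datatype=XSD.double)))
      g.add((obs, GEO.lat, Literal(36.6, datatype=XSD.double)))    # spatial links
      g.add((obs, GEO.long, Literal(-97.5, datatype=XSD.double)))
      g.add((obs, EX.observedAt,                                   # temporal link
             Literal("2012-01-01T00:00:00Z", datatype=XSD.dateTime)))

      print(g.serialize(format="turtle"))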

  12. Genetic algorithm for nuclear data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Arthur, Jennifer Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-02

    These are slides on a genetic algorithm for nuclear data evaluation. The following topics are covered: the initial population, fitness (outer loop), calculating fitness, selection (first part of the inner loop), reproduction (second part of the inner loop), the solution, and examples. A minimal sketch of this loop is given below.
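
    The following is a minimal sketch of the generic GA loop those topics describe, applied to a toy curve-fitting problem. The fitness function, population size and mutation rate are illustrative assumptions, not the implementation presented in the slides.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy evaluation target: recover parameters that reproduce pseudo-data.
      # The curve shape, grid and GA settings are illustrative assumptions.
      x = np.linspace(0.1, 20.0, 50)                       # incident energy grid (MeV)
      data = 2.0 * np.exp(-0.3 * x)                        # pseudo "measured" curve

      def fitness(p):
          # Higher is better: negative sum of squared residuals
          return -np.sum((p[0] * np.exp(-p[1] * x) - data) ** 2)

      # Initial population of candidate parameter vectors
      pop = rng.uniform([0.0, 0.0], [5.0, 1.0], size=(40, 2))

      for generation in range(200):                        # outer loop
          fit = np.array([fitness(p) for p in pop])        # calculate fitness
          # Selection (first part of inner loop): binary tournament
          i, j = rng.integers(len(pop), size=(2, len(pop)))
          parents = np.where((fit[i] > fit[j])[:, None], pop[i], pop[j])
          # Reproduction (second part of inner loop): blend crossover + mutation
          mates = parents[rng.permutation(len(parents))]
          alpha = rng.random((len(parents), 1))
          pop = alpha * parents + (1.0 - alpha) * mates
          pop += rng.normal(0.0, 0.02, pop.shape)

      best = max(pop, key=fitness)                         # solution
      print("best parameters:", best)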

  13. Evaluation of neutron data for americium-241

    Energy Technology Data Exchange (ETDEWEB)

    Maslov, V.M.; Sukhovitskij, E.Sh.; Porodzinskij, Yu.V.; Klepatskij, A.B.; Morogovskij, G.B. [Radiation Physics and Chemistry Problems Inst., Minsk-Sosny (Belarus)

    1997-03-01

    The evaluation of neutron data for {sup 241}Am is made in the energy region from 10{sup -5} eV up to 20 MeV. The results of the evaluation are compiled in the ENDF/B-VI format. This work is performed under the Project Agreement CIS-03-95 with the International Science and Technology Center (Moscow). The Financing Party for the Project is Japan. The evaluation was requested by Y. Kikuchi (JAERI). (author). 60 refs.

  14. Data Mining of Undergraduate Course Evaluations

    Science.gov (United States)

    Jian, Yuheng Helen; Javaad, Sohail Syed; Golab, Lukasz

    2016-01-01

    In this paper, we take a new look at the problem of analyzing course evaluations. We examine ten years of undergraduate course evaluations from a large Engineering faculty. To the best of our knowledge, our data set is an order of magnitude larger than those used by previous work on this topic, at over 250,000 student evaluations of over 5,000…

  15. Computer-Assisted Evaluation of Videokymographic Data

    Czech Academy of Sciences Publication Activity Database

    Novozámský, Adam; Sedlář, Jiří; Zita, A.; Švec, J. G.; Zitová, Barbara; Flusser, Jan; Hauzar, D.

    2013-01-01

    Vol. 1, No. 1 (2013), p. 49. ISSN 1805-8698. [EFMI STC Prague: Data and Knowledge for Medical Decision Support, 17.04.2013-19.04.2013, Praha] Institutional support: RVO:67985556. Keywords: videokymography * image processing * computer-assisted evaluation. Subject RIV: JD - Computer Applications, Robotics. http://library.utia.cas.cz/separaty/2013/ZOI/novozamsky-computer-assisted evaluation of videokymographic data.pdf

  16. An evaluated neutronic data file for bismuth

    Energy Technology Data Exchange (ETDEWEB)

    Guenther, P.T.; Lawson, R.D.; Meadows, J.W.; Smith, A.B.; Smith, D.L.; Sugimoto, M. (Argonne National Lab., IL (USA)); Howerton, R.J. (Lawrence Livermore National Lab., CA (USA))

    1989-11-01

    A comprehensive evaluated neutronic data file for bismuth, extending from 10{sup {minus}5} eV to 20.0 MeV, is described. The experimental database, the application of the theoretical models, and the evaluation rationale are outlined. Attention is given to uncertainty specification, and comparisons are made with the prior ENDF/B-V evaluation. The corresponding numerical file, in ENDF/B-VI format, has been transmitted to the National Nuclear Data Center, Brookhaven National Laboratory. 106 refs., 10 figs., 6 tabs.

  17. ONTOLOGY BASED QUALITY EVALUATION FOR SPATIAL DATA

    Directory of Open Access Journals (Sweden)

    C. Yılmaz

    2015-08-01

    Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services; it is expected that this will be replaced by semantic web services. The quality of the data provided is important for the decision-making process and the accuracy of transactions, so the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions, and a methodology is required for data quality evaluation. There are studies on data quality, including ISO standards, academic studies and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software such as 1Spatial's 1Validate and ESRI's Data Reviewer offers quality evaluation based on its own classification of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we look for the technical components to devise and implement a rule-based approach with ontologies, using free and open-source software in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end-users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open-source software. To test data against rules, sample GeoSPARQL queries are created and associated with specifications.
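
    As a rough illustration of the rule-based idea, the sketch below uses the rdflib Python library and a plain SPARQL query (GeoSPARQL's spatial functions are omitted) to flag features violating a toy attribute-consistency rule. The namespace and the rule are invented, not taken from the paper's ontology.

      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import RDF, XSD

      EX = Namespace("http://example.org/nsdi/")   # hypothetical ontology namespace

      g = Graph()
      # Two road features; the second violates the toy rule "laneCount >= 1"
      for name, lanes in [("road1", 2), ("road2", 0)]:
          g.add((EX[name], RDF.type, EX.Road))
          g.add((EX[name], EX.laneCount, Literal(lanes, datatype=XSD.integer)))

      # Rule expressed as a query, in the spirit of the paper's GeoSPARQL checks
      violations = g.query("""
          PREFIX ex: <http://example.org/nsdi/>
          SELECT ?road ?lanes WHERE {
              ?road a ex:Road ; ex:laneCount ?lanes .
              FILTER (?lanes < 1)
          }
      """)
      for row in violations:
          print(f"attribute consistency violation: {row.road} (lanes = {row.lanes})")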

  18. International Cooperation in Nuclear Data Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Katakura, J.; Koning, A. J.; Nordborg, C.

    2011-08-01

    The OECD Nuclear Energy Agency (NEA) is organising a co-operation between the major nuclear data evaluation projects in the world. The co-operation involves the ENDF, JEFF, and JENDL projects and, owing to the collaboration with the International Atomic Energy Agency (IAEA), also the Russian BROND and the Chinese CENDL projects. The Working Party on international nuclear data Evaluation Co-operation (WPEC), comprising about 20 core members, manages this co-operation and meets annually to discuss progress in each evaluation project as well as related experimental activities. The WPEC assesses common needs for nuclear data improvements, and these needs are then addressed by initiating joint evaluation efforts. The work is performed in specially established subgroups, consisting of experts from the participating evaluation projects. The outcome of these subgroups is published in reports issued by the NEA. Current WPEC activities include, for example, a number of studies related to nuclear data uncertainties, including a review of methods for the combined use of integral experiments and covariance data, as well as evaluations of some of the major actinides, such as {sup 235}U and {sup 239}Pu. This paper gives an overview of current and planned activities within the WPEC.

  19. Image quality evaluation: the data mining approach

    Science.gov (United States)

    Cui, Chengwu

    2005-01-01

    It is difficult, if not impossible, to derive a model that adequately describes the entire visual, cognitive and preference decision process of image quality evaluation and to replace it with objective alternatives. Even if some parts of the process can be modeled based on current knowledge of the visual system, there is often a lack of sufficient data to support the modeling process. On the other hand, image quality evaluation is constantly required of those working on imaging devices and software. Measurements and surveys are regularly conducted to test a new processing algorithm or methodology, and large-scale subjective measurements or surveys are often conducted before a product is released. Here we propose to combine the two processes and apply data mining techniques to achieve both goals: routine subjective testing and modeling. Specifically, we propose to use relational databases to log and store regular evaluation processes. When combined with web applications, the relational database approach allows one to maximally improve the efficiency of designing, conducting, analyzing, and reporting test data. The collection of large amounts of data makes it possible to apply data mining techniques to discover knowledge and patterns in the data. Here we report one such system for printing quality evaluation and some approaches to data mining, including data visualization, observer mining, text comment mining, test case mining and model mining. We also present some preliminary results based on some of these techniques.

  20. Building Assessment Survey and Evaluation Data (BASE)

    Science.gov (United States)

    The Building Assessment Survey and Evaluation (BASE) study was a five-year study to characterize determinants of indoor air quality and occupant perceptions in representative public and commercial office buildings across the U.S. This data source is the raw indoor air quality data from this study.

  1. Nuclear Structure Data Evaluation and the US Nuclear Data Program

    Science.gov (United States)

    Helmer, Richard G.

    2001-10-01

    The United States Nuclear Data Program and the Nuclear Structure and Decay Data network of the International Atomic Energy Agency work in collaboration to provide recommended nuclear structure and decay data for use in basic and applied research. Data are evaluated for nuclear levels and for various types of radiation. In addition, an extensive bibliographic database of nuclear structure, reaction, and decay data references is maintained. The databases maintained by the National Nuclear Data Center at Brookhaven National Laboratory include the Evaluated Nuclear Structure Data File (ENSDF), the Experimental Unevaluated Nuclear Data Library (XUNDL), and the Nuclear Science References (NSR). These data are all available via the Internet, and some are available in hard copy. Although the data in the ENSDF database for a particular nucleus have often not been updated in the last few years, XUNDL provides compiled experimental data from papers with a delay of only about one month. Also, the NSR bibliographic database is current for the many journals that are monitored regularly.

  2. Columbia River Component Data Evaluation Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    C.S. Cearlock

    2006-08-02

    The purpose of the Columbia River Component Data Compilation and Evaluation task was to compile, review, and evaluate existing information for constituents that may have been released to the Columbia River due to Hanford Site operations. Through this effort an extensive compilation of information pertaining to Hanford Site-related contaminants released to the Columbia River has been completed for almost 965 km of the river.

  3. Evaluating Emergency Physicians: Data Envelopment Analysis Approach

    Science.gov (United States)

    Fiallos, Javier; Farion, Ken; Michalowski, Wojtek; Patrick, Jonathan

    2013-01-01

    The purpose of this research is to develop an evaluation tool to assess performance of Emergency Physicians according to such criteria as resource utilization, patient throughput and the quality of care. Evaluation is conducted using a mathematical programming model known as Data Envelopment Analysis (DEA). Use of this model does not require the subjective assignment of weights associated with each criterion – a feature typical of methodologies that rely on composite scores. The DEA model presented in this paper was developed using a hypothetical data set describing a representative set of profiles of Emergency Physicians. The solution to the model relates the performance of each Emergency Physician in relation to the others and to a benchmark. We discuss how such an evaluation tool can be used in practice. PMID:24551348
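
    As a rough illustration of the technique, the sketch below scores each decision-making unit with the standard input-oriented CCR multiplier form of DEA, solving one linear program per physician with SciPy. The input/output measures and the numbers are hypothetical, not the paper's data set.

      import numpy as np
      from scipy.optimize import linprog

      # Hypothetical profiles: rows = physicians.
      # Inputs: hours worked, tests ordered. Outputs: patients treated, quality score.
      X = np.array([[40.0, 30.0], [45.0, 25.0], [38.0, 40.0]])   # inputs
      Y = np.array([[90.0, 8.0], [100.0, 7.0], [85.0, 9.0]])     # outputs
      n, m, s = X.shape[0], X.shape[1], Y.shape[1]

      for k in range(n):
          # CCR (input-oriented, multiplier form):
          #   max  u . y_k   s.t.  v . x_k = 1,  u . y_j - v . x_j <= 0 for all j
          # Decision variables z = [u (s outputs), v (m inputs)], all >= 0.
          c = np.concatenate([-Y[k], np.zeros(m)])       # linprog minimizes, so negate
          A_ub = np.hstack([Y, -X])                      # u . y_j - v . x_j <= 0
          b_ub = np.zeros(n)
          A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]
          b_eq = np.array([1.0])
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                        bounds=[(0, None)] * (s + m), method="highs")
          print(f"physician {k}: efficiency = {-res.fun:.3f}")

    An efficiency of 1.0 means the physician lies on the empirical best-practice frontier formed by the peers; no weight vector could make a lower score look better, which is the "no subjective weights" property the abstract highlights.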

  4. Adaptive Monte Carlo for nuclear data evaluation

    Directory of Open Access Journals (Sweden)

    Schnabel Georg

    2017-01-01

    An adaptive Monte Carlo method for nuclear data evaluation is presented. A fast evaluation method based on the linearization of the nuclear model guides the adaptation of the sampling distribution towards the posterior distribution. The method is suited for parallel computation and provides detailed uncertainty information about nuclear model parameters. In particular, the posterior distribution of the model parameters is not restricted to be multivariate normal. The method is demonstrated in an evaluation of the 181Ta total cross section for incident neutrons. Future applications include use as an efficient sampling scheme in the Total Monte Carlo method and the constraint of parameter uncertainties in nuclear models by both differential and integral data.
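
    The adaptation idea can be illustrated with a toy version of adaptive importance sampling (population Monte Carlo): a Gaussian proposal is repeatedly refit from importance weights so that it drifts towards the posterior. The pseudo-data, one-parameter model and weighted-refit adaptation below are illustrative assumptions; the paper's method instead guides the adaptation with a linearized nuclear model.

      import numpy as np

      rng = np.random.default_rng(1)

      # Pseudo "differential data": y = a * x with noise; infer parameter a.
      x = np.linspace(1.0, 10.0, 20)
      y = 3.0 * x + rng.normal(0.0, 0.5, x.size)
      sigma = 0.5

      def log_posterior(a):
          # Gaussian likelihood, flat prior (illustrative assumptions)
          return -0.5 * np.sum((y - a * x) ** 2) / sigma**2

      # Initial (deliberately poor) Gaussian proposal
      mu, tau = 0.0, 5.0

      for iteration in range(10):
          samples = rng.normal(mu, tau, 2000)
          log_w = np.array([log_posterior(a) for a in samples])
          log_w -= -0.5 * ((samples - mu) / tau) ** 2 - np.log(tau)  # divide by proposal
          w = np.exp(log_w - log_w.max())
          w /= w.sum()
          # Adapt the proposal toward the posterior using the weighted sample
          mu = np.sum(w * samples)
          tau = max(np.sqrt(np.sum(w * (samples - mu) ** 2)), 1e-3)  # floor avoids collapse

      print(f"posterior mean ~ {mu:.3f}, posterior std ~ {tau:.3f}")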

  5. Evaluation of 233Pa decay data.

    Science.gov (United States)

    Xiaolong, Huang; Ping, Liu; Baosong, Wang

    2005-05-01

    An evaluation of the complete scheme and data for (233)Pa decay, including results of the recent measurements, is presented. Several data evaluation procedures were used in the analysis of the half-life data. The half-life has been determined to be 26.971+/-0.013 days. All the gamma-ray emission probabilities ever published have been examined, and the gamma-ray emission probability for the reference 312-keV gamma line is recommended as 38.35+/-0.28%. The calculated internal conversion coefficients and their uncertainties have been used to obtain the complete decay intensity balance. Other decay characteristics have been calculated using the ENSDF analysis program. Finally, a new (233)Pa decay scheme has been built.

  6. Performance indicators to evaluate Spatial Data Infrastructures

    Directory of Open Access Journals (Sweden)

    Lola Jiménez-Calderón

    2017-08-01

    The importance of geographic information in decision-making and the ability of Spatial Data Infrastructures (SDIs) to transform government actions from a spatial perspective have made the SDI a fundamental element of decision-making at many levels. In view of its relevance and the major investments in this area, different sectors of society have a great interest in observing the impact of SDIs, their evolution and future scenarios. This creates the need to assess their impact and obtain an approximate measure of the success of these initiatives, which in turn requires reliable evaluation mechanisms covering different approaches and objectives; despite a wide variety of studies and proposals, this remains a difficult task. Among SDI evaluation and monitoring approaches, performance indicators are widely used as a valid mechanism to estimate SDI development. In this context, experts and organizations have worked both to agree on a method to evaluate SDIs and to define the indicators that form the essential part of various methodological approaches. Given that the indicators chiefly reflect a desire to measure, marked differences in the purpose of evaluation and in the name and scope of the indicators themselves are observed, which indicates that there is still much work to be done. This article presents a comparative analysis of the use of indicators for monitoring and evaluating SDI development, considering five major initiatives: the IDEC of Catalonia, GIDEON of the Netherlands, eSDI-Net+, the European INSPIRE, and UN-GGIM: Americas.

  7. Evaluation of 235U decay data.

    Science.gov (United States)

    Xiaolong, Huang; Baosong, Wang

    2009-09-01

    An evaluation of the complete decay scheme and data for (235)U, including new measurements, is presented in this report; literature data available up to June 2008 are included. The half-life is determined to be (7.04+/-0.01) x 10(8) yr. All known measured gamma-ray absolute intensities have been examined; the gamma-ray emission probability of the reference gamma-ray line of 185.72 keV is recommended to be 57.0+/-0.3%. The calculated internal conversion coefficients and their uncertainties have been used to obtain the complete decay intensity balance. The other decay characteristics are calculated using the ENSDF analysis program. Finally, the new decay scheme for (235)U is presented.

  8. Evaluation of 225Ac decay data.

    Science.gov (United States)

    Xiaolong, Huang; Baosong, Wang

    2007-06-01

    An evaluation of the complete decay scheme and data for (225)Ac, including new measurements, is presented in this report; literature data available up to March 2006 are included. The half-life is determined to be 10.0+/-0.1 days. All known measured gamma-ray relative intensities have been examined; the gamma-ray emission probability of the reference gamma-ray line of 150.04 keV is recommended to be 0.693+/-0.012%. The calculated internal conversion coefficients and their uncertainties have been used to obtain the complete decay intensity balance. The other decay characteristics are calculated using the ENSDF analysis program. Finally, the new decay scheme for (225)Ac is presented.

  9. Biological Modeling As A Method for Data Evaluation and ...

    Science.gov (United States)

    Biological models for evaluating the consistency of data and integrating diverse data, with examples from pharmacokinetics and pharmacodynamics.

  10. A critical evaluation of HIPE data.

    LENUS (Irish Health Repository)

    O'Callaghan, A

    2012-01-01

    Resource allocation and planning of future services is dependent on current volumes, making it imperative that procedural data is accurately recorded. We sought to evaluate the effectiveness of the information gathered by the Hospital Inpatient Enquiry (HIPE) system in recording such activity. Five index vascular procedures (open/endovascular abdominal aneurysm repair, carotid endarterectomy, lower limb angioplasty/bypass) were chosen to reflect activity. The Economic and Social Research Institute (ESRI) and HIPE databases were interrogated to obtain the regional and hospital-specific figures for the years 2005, 2006 and 2009, which were then compared with the prospective vascular database in St James's hospital. Data for 2006 (the most recent year available) show significant discrepancies between the HIPE and vascular database figures for St James's hospital. The HIPE and database figures, respectively, were: open aneurysm 13/30 (-50%), endovascular aneurysm 39/31 (+25%), carotid 62/48 (+29%), angioplasty 242/111 (+100%) and bypass 24/10 (+100%). These inaccuracies are likely to be magnified at a regional and national level when pooling data.

  11. Participation and Data Quality in Open Data Use: Open Data Infrastructures Evaluated

    NARCIS (Netherlands)

    Zuiderwijk-van Eijk, A.M.G.; Janssen, M.F.W.H.A.

    2015-01-01

    Infrastructures may improve the use of Open Government Data (OGD) by providing insight into how individuals can participate in data reuse and into the quality of open data. Yet most OGD infrastructures do not support such activities. The objective of this paper is to evaluate the importance and

  12. Evaluation of the decay data of (153)Gd.

    Science.gov (United States)

    Xiaolong, Huang

    2010-01-01

    (153)Gd decays to the excited states of (153)Eu through the electron capture decay mode. The evaluation of the complete decay scheme and data of (153)Gd, including the recent new measurements, is presented in this report. The Limitation of Relative Statistical Weight (LRSW) method was applied to average values throughout the evaluation. The uncertainty assigned to the average value was always greater than or equal to the smallest uncertainty of the values used to calculate the average. The half-life is determined to be 239.47+/-0.07 days. All known measured gamma-ray relative emission probabilities have been examined, and the gamma-ray emission probability of the reference gamma line of 97.431 keV is recommended to be 29.5+/-0.5%. The calculated internal conversion coefficients and their uncertainties have been used to obtain the complete decay intensity balance. The other decay characteristics are calculated using the ENSDF analysis program. Finally, the new (153)Gd decay scheme was rebuilt.
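
    The averaging rule described above is straightforward to sketch in code. The version below computes an inverse-variance weighted mean, caps any single relative weight at 50% (the usual LRSW prescription, assumed here), and floors the final uncertainty at the smallest input uncertainty as the abstract states; the half-life values are invented for illustration.

      import numpy as np

      def lrsw_average(values, uncertainties, weight_cap=0.5):
          # Limitation of Relative Statistical Weight average (sketch)
          v = np.asarray(values, dtype=float)
          u = np.asarray(uncertainties, dtype=float)
          w = 1.0 / u**2
          rel = w / w.sum()
          # Cap the relative weight of a dominant measurement at weight_cap
          # by inflating its effective uncertainty (assumed LRSW prescription)
          while rel.max() > weight_cap:
              i = rel.argmax()
              w[i] = weight_cap / (1.0 - weight_cap) * (w.sum() - w[i])
              rel = w / w.sum()
          mean = np.sum(w * v) / w.sum()
          internal = np.sqrt(1.0 / w.sum())
          chi2 = np.sum(w * (v - mean) ** 2) / (len(v) - 1)
          external = internal * np.sqrt(chi2)
          # Final uncertainty: the larger of the internal/external estimates,
          # and never smaller than the smallest input uncertainty
          return mean, max(internal, external, u.min())

      # Invented half-life measurements (days) for illustration only
      mean, unc = lrsw_average([239.5, 239.4, 239.6], [0.1, 0.2, 0.15])
      print(f"recommended value: {mean:.2f} +/- {unc:.2f} days")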

  13. EML Gamma Spectrometry Data Evaluation Program

    Energy Technology Data Exchange (ETDEWEB)

    Decker, Karin M. [Environmental Measurements Laboratory (EML), New York, NY (United States)

    1998-02-28

    This report presents the results of the analyses for the second EML Gamma Spectrometry Data Evaluation Program (August 1997). A calibration spectrum, a background spectrum and three sample spectra were included for each software format as part of the evaluation. The calibration spectrum contained nuclides covering the range from 59.5 keV to 1836 keV. The participants were told that fallout and fission product nuclides as well as naturally occurring nuclides could be present. The samples were designed to test the detection and quantification of very low levels of nuclides and the ability of the software and user to properly resolve multiplets. The participants were asked to report values and uncertainties as becquerels per sample with no decay correction. Twenty-nine sets of results were reported from a total of 70 laboratories that received the spectra. The percentage of the results within 1 sigma of the expected value was 76, 67, and 55 for samples 1, 2, and 3, respectively. Across all three samples, 12% of the results were more than 3 sigma from the expected value. Sixty-two nuclides out of a total of 580 expected results were not reported for the three samples. Sixty percent of these false negatives were due to nuclides which were present at the minimum detectable activity level. There were 53 false positives reported, with 60% of the responses due to problems with background subtraction. The results indicate that the Program is beneficial to the participating laboratories in that it provides them with analysis problems that are difficult to create with spiked samples due to the unavailability of many nuclides and the short half-lives of others. EML will continue its annual distribution; the third is to be held in March 1999.

  14. Techniques for evaluating optimum data center operation

    Science.gov (United States)

    Hamann, Hendrik F.; Rodriguez, Sergio Adolfo Bermudez; Wehle, Hans-Dieter

    2017-06-14

    Techniques for modeling a data center are provided. In one aspect, a method for determining data center efficiency is provided. The method includes the following steps. Target parameters for the data center are obtained. Technology pre-requisite parameters for the data center are obtained. An optimum data center efficiency is determined given the target parameters for the data center and the technology pre-requisite parameters for the data center.

  15. Evaluating a healthcare data warehouse for cancer diseases

    OpenAIRE

    Sheta, Dr. Osama E.; Eldeen, Ahmed Nour

    2013-01-01

    This paper presents the evaluation of the architecture of a healthcare data warehouse specific to cancer diseases. This data warehouse contains relevant cancer medical information and patient data. The data warehouse provides the source for all current and historical health data to help executive managers and doctors improve the decision-making process for cancer patients. An evaluation model based on Bill Inmon's definition of a data warehouse is proposed to evaluate the cancer data warehouse.

  16. 40 CFR 610.25 - Evaluation of test data.

    Science.gov (United States)

    2010-07-01

    § 610.25 Evaluation of test data. Valid manufacturer-furnished test data will be evaluated with ... (d) Definition of claims which can be made based on the available data; and (e) Substantiation of ...

  17. EML Gamma Spectrometry Data Evaluation Program

    Energy Technology Data Exchange (ETDEWEB)

    Decker, Karin M. [Environmental Measurements Lab. (EML), New York, NY (United States)

    2001-01-01

    This report presents the results of the analyses for the third EML Gamma Spectrometry Data Evaluation Program (October 1999). This program assists laboratories in providing more accurate gamma spectra analysis results and provides a means for users of gamma data to assess how a laboratory performed on various types of gamma spectrometry analyses. This is accomplished through the use of synthetic gamma spectra. A calibration spectrum, a background spectrum, and three sample spectra are sent to each participant in the spectral file format requested by the laboratory. The calibration spectrum contains nuclides covering the energy range from 59.5 keV to 1836 keV. The participants are told that fallout and fission product nuclides could be present. The sample spectra are designed to test the ability of the software and user to properly resolve multiplets and to identify and quantify nuclides in a complicated fission product spectrum. The participants were asked to report values and uncertainties as becquerels per sample with no decay correction. Thirty-one sets of results were reported from a total of 60 laboratories that received the spectra. Six foreign laboratories participated. The percentage of the results within 1 sigma of the expected value was 68, 33, and 46 for samples 1, 2, and 3, respectively. Across all three samples, 18% of the results were more than 3 sigma from the expected value. Eighty-three (12%) values out of a total of 682 expected results were not reported for the three samples. Approximately 30% of these false negatives were due to the laboratories not reporting 144Pr in sample 2, which was present at the minimum detectable activity level. There were 53 false positives reported, with 25% of these responses due to problems with background subtraction. The results show improvement in the ability of the software or user to resolve peaks separated by 1 keV. Improvement is still needed either in the analysis report produced by the software or in the review of these

  18. Evaluation of Ammunition Data Cards (REDACTED)

    Science.gov (United States)

    2016-04-29

    ... the ammunition data card approval process. Train Joint Munitions Command onsite Government representatives on MIL-STD-1168C requirements. Commanding ... the ammunition data cards comply with MIL-STD-1168, the Ammunition-Data Repository Program User Manual, and contractual requirements. 7. Train DCMA onsite Government representatives to improve oversight of the ammunition data card approval process. Train DCMA onsite Government representatives on ...

  19. Building Assessment Survey and Evaluation Data (BASE)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Building Assessment Survey and Evaluation (BASE) study was a five year study to characterize determinants of indoor air quality and occupant perceptions in...

  1. Introducing Nuclear Data Evaluations of Prompt Fission Neutron Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Neudecker, Denise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-06-17

    Nuclear data evaluations provide recommended data sets for nuclear data applications such as reactor physics, stockpile stewardship and nuclear medicine. The evaluated data are often based on information from multiple experimental data sets and nuclear theory, combined using statistical methods. Therefore, they are collaborative efforts of evaluators, theoreticians, experimentalists, benchmark experts, statisticians and application-area scientists. In this talk, an introduction is given to the field of nuclear data evaluation using the specific example of a recent evaluation of the energy spectrum of neutrons emitted promptly after fission of 239Pu, induced by neutrons from thermal energies to 30 MeV.

  2. Technology evaluation for time sensitive data transport

    DEFF Research Database (Denmark)

    Wessing, Henrik; Breach, Tony; Colmenero, Alberto

    Emerging research and commercial services like IPTV, high-quality video conferencing, remote surgeries and cloud computing in particular are time sensitive, and their successful deployment assumes a network with minimal delay and jitter in combination with high bandwidth and preferably low packet loss... Backbone Bridging (PBB), "Transparent Interconnect of Lots of Links" (TRILL) to Optical Transport Network (OTN) and SDH. The transport technologies are evaluated theoretically, using simulations and/or experimentally. Each transport technology is evaluated based on its performance and capabilities...

  3. Component failure data handbook. Technical evaluation report

    Energy Technology Data Exchange (ETDEWEB)

    Gentillon, C.D.

    1991-04-01

    This report presents generic component failure rates that are used in reliability and risk studies of commercial nuclear power plants. The rates are computed using plant-specific data from published probabilistic risk assessments supplemented by selected other sources. Each data source is described. For rates with four or more separate estimates among the sources, plots show the data that are combined. The method for combining data from different sources is presented. The resulting aggregated rates are listed with upper bounds that reflect the variability observed in each rate across the nuclear power plant industry. Thus, the rates are generic. Both per hour and per demand rates are included. They may be used for screening in risk assessments or for forming distributions to be updated with plant-specific data.

  4. Evaluating osteological ageing from digital data

    DEFF Research Database (Denmark)

    Villa, Chiara; Buckberry, Jo; Lynnerup, Niels

    2018-01-01

    Age at death estimation of human skeletal remains is one of the key issues in constructing a biological profile both in forensic and archaeological contexts. The traditional adult osteological methods evaluate macroscopically the morphological changes that occur with increasing age of specific...

  5. Data Collection Instruments for Evaluating Family Involvement

    Science.gov (United States)

    Westmoreland, Helen; Bouffard, Suzanne; O'Carroll, Kelley; Rosenberg, Heidi

    2009-01-01

    As evidence supporting the benefits of family involvement in learning mounts, there is an increasing demand for evaluation of family involvement initiatives and for additional research to inform practice and policy. Those designing and implementing family involvement programs must be responsive to calls to bolster the quality of the evidence base…

  6. Bayesian inference data evaluation and decisions

    CERN Document Server

    Harney, Hanns Ludwig

    2016-01-01

    This new edition offers a comprehensive introduction to the analysis of data using Bayes' rule. It generalizes Gaussian error intervals to situations in which the data follow distributions other than Gaussian. This is particularly useful when the observed parameter is barely above the background or the histogram of multiparametric data contains many empty bins, so that the determination of the validity of a theory cannot be based on the chi-squared criterion. In addition to the solutions of practical problems, this approach provides an epistemic insight: the logic of quantum mechanics is obtained as the logic of unbiased inference from counting data. New sections feature factorizing parameters, commuting parameters, observables in quantum mechanics, the art of fitting with coherent and with incoherent alternatives, and fitting with the multinomial distribution. Additional problems and examples help deepen the knowledge. Requiring no knowledge of quantum mechanics, the book is written at an introductory level, with man...

  7. Model evaluation using grouped or individual data.

    Science.gov (United States)

    Cohen, Andrew L; Sanborn, Adam N; Shiffrin, Richard M

    2008-08-01

    Analyzing the data of individuals has several advantages over analyzing the data combined across the individuals (the latter we term group analysis): Grouping can distort the form of data, and different individuals might perform the task using different processes and parameters. These factors notwithstanding, we demonstrate conditions in which group analysis outperforms individual analysis. Such conditions include those in which there are relatively few trials per subject per condition, a situation that sometimes introduces distortions and biases when models are fit and parameters are estimated. We employed a simulation technique in which data were generated from each of two known models, each with parameter variation across simulated individuals. We examined how well the generating model and its competitor each fared in fitting (both sets of) the data, using both individual and group analysis. We examined the accuracy of model selection (the probability that the correct model would be selected by the analysis method). Trials per condition and individuals per experiment were varied systematically. Three pairs of cognitive models were compared: exponential versus power models of forgetting, generalized context versus prototype models of categorization, and the fuzzy logical model of perception versus the linear integration model of information integration. We show that there are situations in which small numbers of trials per condition cause group analysis to outperform individual analysis. Additional tables and figures may be downloaded from the Psychonomic Society Archive of Norms, Stimuli, and Data, www.psychonomic.org/archive.

  8. Evaluation of empirical atmospheric diffusion data

    Energy Technology Data Exchange (ETDEWEB)

    Horst, T.W.; Doran, J.C.; Nickola, P.W.

    1979-10-01

    A study has been made of atmospheric diffusion over level, homogeneous terrain of contaminants released from non-buoyant point sources up to 100 m in height. Current theories of diffusion are compared to empirical diffusion data, and specific dispersion estimation techniques are recommended which can be implemented with the on-site meteorological instrumentation required by the Nuclear Regulatory Commission. A comparison of both the recommended diffusion model and the NRC diffusion model with the empirical data demonstrates that the predictions of the recommended model have both smaller scatter and less bias, particularly for ground-level sources.

  9. New Evaluation Techniques of Hyperspectral Data

    Directory of Open Access Journals (Sweden)

    Veronika Kozma-Bognár

    2010-10-01

    Multiband aerial imagery in remote sensing is a technology used ever more widely in the present day. It is well suited to research fields that need high spectral resolution images in order to obtain adequate results. At present, data collection capabilities are at a much higher level than processing and use. As the technical development of sensors is followed by a significant delay in data processing methods and applications, it seems reasonable to refine processing methods as well as to widen practical uses (agriculture, environmental protection). In 2004, a new examination method based on fractal structure was introduced which, according to our experience, has made more accurate spectral measurement possible compared with other techniques. The mathematical process, named spectral fractal dimension (SFD), is directly applicable in multidimensional colour space as well, thus making it possible to choose new examination methods for multiband images. With the help of SFD, it is possible to obtain more of the useful data offered by high spectral resolution, or to choose the bands to be processed with different methods later.
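
    A rough illustration of a fractal measure in spectral space is given below as a generic box-counting estimate over an image's spectral point cloud. The published SFD formula differs in detail, and the test image here is random rather than real aerial imagery.

      import numpy as np

      def spectral_box_dimension(pixels, bit_depth=8, levels=6):
          # Box-counting dimension of an image's point cloud in spectral space.
          # pixels: (n_pixels, n_bands) integer array.
          ks = np.arange(1, levels + 1)
          counts = []
          for k in ks:
              # Divide each spectral axis into 2**k bins and count occupied boxes
              boxes = pixels >> (bit_depth - k)
              counts.append(len(np.unique(boxes, axis=0)))
          # Slope of log(occupied boxes) versus log(boxes per axis)
          return np.polyfit(ks * np.log(2.0), np.log(counts), 1)[0]

      # Random 4-band, 8-bit "image" as a stand-in for multiband aerial imagery
      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=(10000, 4))
      print(f"estimated spectral box dimension: {spectral_box_dimension(img):.2f}")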

  10. Data Collection Methods for Evaluating Museum Programs and Exhibitions

    Science.gov (United States)

    Nelson, Amy Crack; Cohn, Sarah

    2015-01-01

    Museums often evaluate various aspects of their audiences' experiences, be it what they learn from a program or how they react to an exhibition. Each museum program or exhibition has its own set of goals, which can drive what an evaluator studies and how an evaluation evolves. When designing an evaluation, data collection methods are purposefully…

  11. Evaluating Trajectory Queries over Imprecise Location Data

    DEFF Research Database (Denmark)

    Xie, Scott, Xike; Cheng, Reynold; Yiu, Man Lung

    2012-01-01

    Trajectory queries, which retrieve nearby objects for every point of a given route, can be used to identify alerts of potential threats along a vessel route or to monitor the rescuers adjacent to a travel path. However, the locations of these objects (e.g., threats, rescuers) may not be precisely obtained due to hardware limitations of measuring devices, as well as the constantly changing nature of the external environment. Ignoring data uncertainty can render low query quality and cause undesirable consequences such as missed alerts of threats and poor response times in rescue operations. Also...

  12. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 treats data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data and gives examples of CRECTJ usage. (author)

  13. Statistical modeling for visualization evaluation through data fusion.

    Science.gov (United States)

    Chen, Xiaoyu; Jin, Ran

    2017-11-01

    There is a high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference has been lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements and visualization logs available for user-centered evaluation. This paper proposes a data fusion model and an application procedure for quantitative and online visualization evaluation. Fifteen participants joined the study, based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluation and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
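
    A rough sketch of the fusion-then-regularized-regression idea: synthetic stand-ins for EEG, eye-movement and visualization-log features are concatenated, and a ridge regression predicts a task-complexity rating. The feature names, data and hyperparameters are invented; the paper's actual model and regularizer may differ.

      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 150  # trials across participants (synthetic)

      # Synthetic stand-ins for the three sensing channels
      eeg = rng.normal(size=(n, 4))      # e.g., band powers
      eye = rng.normal(size=(n, 3))      # e.g., fixation count/duration, saccade rate
      logs = rng.normal(size=(n, 2))     # e.g., click and zoom counts

      X = np.hstack([eeg, eye, logs])    # data fusion: concatenate feature sets
      # Synthetic complexity rating driven by a few features plus noise
      y = 2.0 * eeg[:, 0] - eye[:, 1] + 0.5 * logs[:, 0] + rng.normal(0, 0.3, n)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = Ridge(alpha=1.0).fit(X_tr, y_tr)   # regularized regression
      print(f"held-out R^2: {model.score(X_te, y_te):.2f}")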

  14. Neutron data evaluation of {sup 238}U

    Energy Technology Data Exchange (ETDEWEB)

    Maslov, V.M.; Porodzinskij, Y.V.; Hasegawa, Akira; Shibata, Keiichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-08-01

    Cross sections for neutron-induced reactions on {sup 238}U are calculated by using the Hauser-Feshbach-Moldauer theory, the coupled-channel model and the double-humped fission barrier model. The direct excitation of ground-state band levels is calculated with a rigid-rotator model. The direct excitation of vibrational octupole and K = 2{sup +} quadrupole bands is included using a soft (deformable) rotator model. The competition between inelastic scattering and the fission reaction is shown to be sensitive to the target nucleus level density at excitations above the pairing gap. For the fission, (n,2n), (n,3n), and (n,4n) reactions, secondary neutron spectrum data are consistently reproduced. Pre-equilibrium emission of the first neutron is included. Shell effects in the level densities are shown to be important for estimating the energy dependence of the non-emissive fission cross section. (author). 105 refs.

  15. Photonuclear data evaluation of {sup 239}Pu

    Energy Technology Data Exchange (ETDEWEB)

    Raskinyte, I.; Dupont, E.; Ridikas, D

    2006-07-01

    This document presents cross-section calculations up to 130 MeV for {sup 239}Pu using the Talys-0.64 code. The photoabsorption process is described by the giant dipole resonance and quasi-deuteron mechanisms. Pre-equilibrium particle emission is treated with the classical exciton model. At equilibrium, the compound nucleus decay channels are handled within the Hauser-Feshbach statistical model. Fission transmission coefficients are calculated with a double-humped parabolic barrier model. A few sensitive nuclear parameters were fine-tuned to better reproduce the experimental data available for ({gamma},n), ({gamma},2n) and ({gamma},f) partial cross-sections. In addition, the nuclear models provide predictions of the emitted neutron energy and angular distributions. (A.C.)

  16. STATISTICAL EVALUATION OF FATIGUE DATA OF COMPONENTS

    Directory of Open Access Journals (Sweden)

    Chi Nghia Chung

    2016-02-01

    A variety of steels, cast iron grades and other metals have long been used for the production of machine components. In recent years, however, new materials such as sintered materials and plastics have become increasingly important. Because of the large number of different fibers, matrices, stacking sequences, processing conditions and processes, and the variety of resulting material configurations, it is not possible to rely on the proven fatigue models for conventional materials. Moreover, the development of models which are valid for all composites is generally extremely difficult. In this work, a possible application of high-performance composites as materials for machine elements is investigated. This study attempts to predict the fatigue behavior, and the consequent durability, based on laboratory measurements. Using the statistics program JMP, the acquired data were subjected to a reliability analysis in order to ensure the plausibility, validity and accuracy of the measured values.

  17. Evaluating different methods of microarray data normalization

    Directory of Open Access Journals (Sweden)

    Ferreira Carlos

    2006-10-01

    Background: With the development of DNA hybridization microarray technologies, it is nowadays possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses. Therefore, choosing a suitable approach for normalization can be critical, deserving judicious consideration. Results: Here, we considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods which had yet to be used for normalization, namely Kernel smoothing and Support Vector Regression. The results obtained were compared using artificial microarray data and benchmark studies. The results indicate that Support Vector Regression is the most robust to outliers and that Kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion: In light of our results, Support Vector Regression is favored for microarray normalization due to its robustness in estimating the normalization curve compared with the other methods.
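
    To illustrate the favored approach, the sketch below fits an intensity-dependent normalization curve M = f(A) to simulated two-channel log-ratio data with Support Vector Regression and subtracts it. The simulated bias and the SVR hyperparameters are illustrative assumptions, not the paper's settings.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      n_genes = 2000

      # Simulated two-channel intensities with an intensity-dependent dye bias
      A = rng.uniform(6.0, 14.0, n_genes)            # average log-intensity
      bias = 0.4 * np.sin(A / 2.0)                   # smooth technical trend
      M = bias + rng.normal(0.0, 0.25, n_genes)      # log-ratio = bias + noise

      # Fit the normalization curve M = f(A) with SVR and subtract it
      svr = SVR(kernel="rbf", C=1.0, epsilon=0.05).fit(A.reshape(-1, 1), M)
      M_normalized = M - svr.predict(A.reshape(-1, 1))

      print(f"std of log-ratios: {M.std():.3f} before, {M_normalized.std():.3f} after")

    Robustness to outliers, the property the study highlights, comes from SVR's epsilon-insensitive loss, which bounds the influence of any single gene on the fitted curve.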

  18. EXPERIMENTAL EVALUATION OF LIDAR DATA VISUALIZATION SCHEMES

    Directory of Open Access Journals (Sweden)

    S. Ghosh

    2012-07-01

    LiDAR (Light Detection and Ranging) has attained the status of an industry-standard method of data collection for gathering three-dimensional topographic information. Datasets captured through LiDAR are dense, redundant and perceivable from multiple directions, unlike other geospatial datasets collected through conventional methods. This three-dimensional information has triggered an interest in the scientific community in developing methods for visualizing LiDAR datasets and value-added products. Elementary schemes of visualization use point clouds with intensity or colour, and triangulation- and tetrahedralization-based terrain models draped with texture. Newer methods use feature extraction, either through classification or segmentation. In this paper, the authors report a visualization experience survey in which 60 participants responded to a questionnaire. The questionnaire poses six different questions on the qualities of feature and depth perception for 12 visualization schemes, with answers obtained on a scale of 1 to 10. Results are presented using the non-parametric Friedman test, with post-hoc analysis for hypothetically ranking the visualization schemes based on the ratings received, and the rankings are finally confirmed through Page's trend test. Results show that a heuristic-based visualization scheme, developed by Ghosh and Lohani (2011), performs the best in terms of feature and depth perception.

  19. Evaluated Nuclear Structure Data File and Related Products

    Science.gov (United States)

    Tuli, Jagdish K.

    2005-05-01

    The Evaluated Nuclear Structure Data File (ENSDF) is a leading resource for experimental nuclear data. It is maintained and distributed by the National Nuclear Data Center, Brookhaven National Laboratory. The file is mainly contributed to by an international network of evaluators under the auspices of the International Atomic Energy Agency. The ENSDF is updated, generally by mass number, i.e., by evaluating together all isobars for a given mass number. If, however, experimental activity in an isobaric chain is limited to a particular nuclide, then only that nuclide is updated. The evaluations are published in the journal Nuclear Data Sheets (Academic Press, a division of Elsevier).

  20. Decay data evaluation project (DDEP): updated evaluations of the 233Th and 241Am decay characteristics.

    Science.gov (United States)

    Chechev, Valery P; Kuzmenko, Nikolay K

    2010-01-01

    The results of new decay data evaluations are presented for (233)Th (beta(-)) decay to nuclear levels in (233)Pa and (241)Am (alpha) decay to nuclear levels in (237)Np. These evaluated data have been obtained within the Decay Data Evaluation Project using information published up to 2009.

  1. Decay Data Evaluation Project (DDEP): Evaluation of the main {sup 233}Pa decay characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Chechev, Valery P. [V.G. Khlopin Radium Institute, 28 Second Murinsky Ave., St. Petersburg, 194021 (Russian Federation)]. E-mail: chechev@atom.nw.ru; Kuzmenko, Nikolay K. [V.G. Khlopin Radium Institute, 28 Second Murinsky Ave., St. Petersburg, 194021 (Russian Federation)

    2006-10-15

    The results of a decay data evaluation are presented for {sup 233}Pa ({beta} {sup -}) decay to nuclear levels in {sup 233}U. These evaluated data have been obtained within the Decay Data Evaluation Project using information published up to 2005.

  2. On data mining in context : cases, fusion and evaluation

    NARCIS (Netherlands)

    Putten, Petrus Wilhelmus Henricus van der

    2010-01-01

    Data mining can be seen as a process, with modeling as the core step. However, other steps such as planning, data preparation, evaluation and deployment are of key importance for applications. This thesis studies data mining in the context of these other steps with the goal of improving data mining.

  3. CENDL project, the chinese evaluated nuclear data library

    Science.gov (United States)

    Ge, Zhigang; Wu, Haicheng; Chen, Guochang; Xu, Ruirui

    2017-09-01

    The status of the Chinese Evaluated Nuclear Data Library (CENDL) and the associated CENDL project are introduced in this paper. Recently, a new version, CENDL-3.2b0, has been prepared on the basis of the previous CENDL-3.1. Data in the light-nuclide and actinide regions are updated from CENDL-3.1, and new evaluations and calculations are performed mainly for the structural-material and fission-product nuclide regions. Covariances were also evaluated for structural and actinide nuclides. At the same time, methodologies have been systematically developed to fulfil the evaluation requirements of CENDL-3.2b0. Updated nuclear reaction models for light and medium-heavy nuclides, non-model-dependent nuclear data evaluation, covariance evaluation approaches, systematics, and an integral validation system for nuclear data are incorporated in the present CENDL project. Future developments are also planned.

  4. Evaluating Clustering in Subspace Projections of High Dimensional Data

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Günnemann, Stephan; Assent, Ira

    2009-01-01

    Clustering high dimensional data is an emerging research field. Subspace clustering or projected clustering groups similar objects in subspaces, i.e. projections, of the full space. In the past decade, several clustering paradigms have been developed in parallel, without thorough evaluation and comparison ... and create a common baseline for future developments and comparable evaluations in the field. For repeatability, all implementations, data sets and evaluation measures are available on our website.

  5. Nuclear Structure and Decay Data Evaluation: Status and Perspectives

    Science.gov (United States)

    Kondev, F. G.; Browne, E.; Ouellet, C.; Pritychenko, B.; Reich, C.; Sonzogni, A.; Tandel, S.; Tuli, J. K.; Cameron, J.; Chen, A.; Singh, B.; Kelley, J.; Kwan, E.; Baglin, C.; Basunia, M. S.; Firestone, R. B.; Nica, N.; Nesaraja, C. D.; Smith, M. S.

    2009-10-01

    Reliable nuclear structure data represent the fundamental building blocks of nuclear structure physics and astrophysics research, and are also of vital importance in a large number of applications. Members of the Nuclear Structure and Decay Data Working Group of the U.S. Nuclear Data Program, in collaboration with scientists from Japan and other countries within the International Nuclear Structure and Decay Data Network (under the auspices of IAEA), are involved in compilation, evaluation, and dissemination of nuclear structure and decay data for all known nuclei. The network's principal effort is devoted to the timely revision of information in the Evaluated Nuclear Structure Data File (ENSDF) library, which is the only comprehensive nuclear structure and decay data database that is updated continuously. This presentation will briefly review recent achievements of the network, present on-going activities, and reflect on ideas for future projects and challenges in the field of nuclear structure and decay data evaluation.

  6. Big Data as Innovative Approach for Usability Evaluations of Buildings

    OpenAIRE

    Olsson, Nils; Bull-Berg, Heidi; Junghans, Antje

    2014-01-01

    Purpose: The purpose of this paper is to investigate how Big Data can add a new dimension to usability evaluations of buildings. Background: There is a tremendous growth in the volume of available data, creating the “Big Data” trend. Industries such as IT, retail and transportation can present a number of examples of successful applications of Big Data. Usability has traditionally been analysed by qualitative research methods, and Big Data gives an opportunity to add quantitative data in s...

  7. The Decay Data Evaluation Project (DDEP) and the JEFF-3.3 radioactive decay data library: Combining international collaborative efforts on evaluated decay data

    OpenAIRE

    Kellett Mark A.; Bersillon Olivier

    2017-01-01

    The Decay Data Evaluation Project (DDEP) is an international collaboration of decay data evaluators, formed from groups in France, Germany, the USA, China, Romania, Russia, Spain and the UK, mainly from the metrology community. DDEP members have evaluated over 220 radionuclides, following an agreed-upon methodology that includes a peer review. Evaluations include all relevant parameters relating to the nuclear decay and the associated atomic processes. An important output of these evaluations is ...

  8. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    Science.gov (United States)

    Lehrack, S.; Duckeck, G.; Ebke, J.

    2014-06-01

    The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as an execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
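
    The HDFS read-rate part of such a measurement can be approximated from the command line. A rough sketch assuming a configured hdfs client and a hypothetical test path; the actual evaluation used realistic ROOT analyses rather than raw reads:

        import subprocess
        import time

        path = "/user/test/analysis.root"        # hypothetical HDFS path

        size_bytes = int(subprocess.check_output(
            ["hdfs", "dfs", "-stat", "%b", path]).strip())

        start = time.monotonic()
        # Stream the file out of HDFS and discard it; wall-clock time gives the rate.
        subprocess.run(["hdfs", "dfs", "-cat", path],
                       stdout=subprocess.DEVNULL, check=True)
        elapsed = time.monotonic() - start

        print(f"read rate: {size_bytes / elapsed / 1e6:.1f} MB/s")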

  9. EVALUATED NUCLEAR STRUCTURE DATA FILE -- A MANUAL FOR PREPARATION OF DATA SETS.

    Energy Technology Data Exchange (ETDEWEB)

    TULI, J.K.

    2001-02-01

    This manual describes the organization and structure of the Evaluated Nuclear Structure Data File (ENSDF). This computer-based file is maintained by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory for the international Nuclear Structure and Decay Data Network. For every mass number (presently, A {le} 293), the Evaluated Nuclear Structure Data File (ENSDF) contains evaluated structure information. For masses A {ge} 44, this information is published in the Nuclear Data Sheets; for A < 44, ENSDF is based on compilations published in the journal Nuclear Physics. The information in ENSDF is updated by mass chain or by nuclide with a varying cycle time dependent on the availability of new information.

  10. Evaluated nuclear structure data file: A manual for preparation of data sets

    Energy Technology Data Exchange (ETDEWEB)

    Tuli, J.K.

    1987-04-01

    This manual describes the organization and structure of the Evaluated Nuclear Structure Data File (ENSDF). This computer-based file is maintained by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory for the international Nuclear Structure and Decay Data Network. For every mass number (presently, A {le} 263), the Evaluated Nuclear Structure Data File (ENSDF) contains evaluated structure information. For masses A {ge} 45, this information is documented in the Nuclear Data Sheets; for A < 45, ENSDF is based on compilations published in the journal Nuclear Physics. The information in ENSDF is updated by mass chains with a present cycle time of approximately six years.

  11. Working Party on International Nuclear Data Evaluation Cooperation (WPEC)

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, E.; Herman, M.; Dupont, E.; Chadwick, M. B.; Danon, Y.; De Saint Jean, C.; Dunn, M.; Fischer, U.; Forrest, R. A.; Fukahori, T.; Ge, Z.; Harada, H.; Herman, M.; Igashira, M.; Ignatyuk, A.; Ishikawa, M.; Iwamoto, O.; Jacqmin, R.; Kahler, A. C.; Kawano, T.; Koning, A. J.; Leal, L.; Lee, Y. O.; McKnight, R.; McNabb, D.; Mills, R. W.; Palmiotti, G.; Plompen, A.; Salvatores, M.; Schillebeeckx, P.

    2014-06-01

    The OECD Nuclear Energy Agency (NEA) organizes cooperation between the major nuclear data evaluation projects in the world. Moreover, the NEA Working Party on International Nuclear Data Evaluation Cooperation (WPEC) was established to promote the exchange of information on nuclear data evaluation, measurement, nuclear model calculation, validation, and related topics, and to provide a framework for cooperative activities between the participating projects. The working party assesses nuclear data improvement needs and addresses these needs by initiating joint activities in the framework of dedicated WPEC subgroups. Studies recently completed comprise a number of works related to nuclear data covariance and associated processing issues, as well as more specific studies related to the resonance parameter representation in the unresolved resonance region, the gamma production from fission product capture reactions, the 235U capture cross section, the EXFOR database, and the improvement of nuclear data for advanced reactor systems. Ongoing activities focus on the evaluation of 239Pu in the resonance region, scattering angular distribution in the fast energy range, and reporting/usage of experimental data for evaluation in the resolved resonance region. New activities include two subgroups on improved fission product yield evaluation methodologies and on modern nuclear database structures. Some future activities under discussion include a pilot project for a Collaborative International Evaluated Library Organization (CIELO) and methods to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data. In addition to the above mentioned short-term task-oriented subgroups, WPEC also hosts a longer-term subgroup charged with reviewing and compiling the most important nuclear data requirements in a high priority request list (HPRL).

  12. Working Party on International Nuclear Data Evaluation Cooperation (WPEC)

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, E., E-mail: wpec@oecd-nea.org [OECD Nuclear Energy Agency, Issy-les-Moulineaux (France); Chadwick, M.B. [Los Alamos National Laboratory, Los Alamos, NM (United States); Danon, Y. [Rensselaer Polytechnic Institute, Troy, NY (United States); De Saint Jean, C. [CEA, Nuclear Energy Division, Cadarache (France); Dunn, M. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Fischer, U. [Karlsruhe Institute of Technology, Karlsruhe (Germany); Forrest, R.A. [Nuclear Data Section, International Atomic Energy Agency (Austria); Fukahori, T. [Japan Atomic Energy Agency (Japan); Ge, Z. [China Institute of Atomic Energy (China); Harada, H. [Japan Atomic Energy Agency (Japan); Herman, M. [Brookhaven National Laboratory, Upton, NY (United States); Igashira, M. [Tokyo Institute of Technology, Tokyo (Japan); Ignatyuk, A. [Institute of Physics and Power Engineering, Obninsk (Russian Federation); Ishikawa, M.; Iwamoto, O. [Japan Atomic Energy Agency (Japan); Jacqmin, R. [CEA, Nuclear Energy Division, Cadarache (France); Kahler, A.C.; Kawano, T. [Los Alamos National Laboratory, Los Alamos, NM (United States); Koning, A.J. [Nuclear Research and Consultancy Group, Petten (Netherlands); Leal, L. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2014-06-15

    The OECD Nuclear Energy Agency (NEA) organizes cooperation between the major nuclear data evaluation projects in the world. The NEA Working Party on International Nuclear Data Evaluation Cooperation (WPEC) was established to promote the exchange of information on nuclear data evaluation, measurement, nuclear model calculation, validation, and related topics, and to provide a framework for cooperative activities between the participating projects. The working party assesses nuclear data improvement needs and addresses these needs by initiating joint activities in the framework of dedicated WPEC subgroups. Studies recently completed comprise a number of works related to nuclear data covariance and associated processing issues, as well as more specific studies related to the resonance parameter representation in the unresolved resonance region, the gamma production from fission product capture reactions, the {sup 235}U capture cross section, the EXFOR database, and the improvement of nuclear data for advanced reactor systems. Ongoing activities focus on the evaluation of {sup 239}Pu in the resonance region, scattering angular distribution in the fast energy range, and reporting/usage of experimental data for evaluation in the resolved resonance region. New activities include two subgroups on improved fission product yield evaluation methodologies and on modern nuclear database structures. Future activities under discussion include a pilot project for a Collaborative International Evaluated Library Organization (CIELO) and methods to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data. In addition to the above mentioned short-term task-oriented subgroups, WPEC also hosts a longer-term subgroup charged with reviewing and compiling the most important nuclear data requirements in a high priority request list (HPRL)

  13. Data formats and procedures for the Evaluated Nuclear Data File, ENDF

    Energy Technology Data Exchange (ETDEWEB)

    Garber, D.; Dunford, C.; Pearlstein, S.

    1975-10-01

    This report describes the philosophy of the Evaluated Nuclear Data File (ENDF) and the data formats and procedures that have been developed for it. The ENDF system was designed for the storage and retrieval of the evaluated nuclear data required for neutronics, photonics and decay-heat calculations. The system is composed of several parts, including a series of data processing codes and neutron and photon cross section and nuclear structure libraries.
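
    For orientation, ENDF-6 records are fixed-width 80-column lines: six 11-character data fields followed by MAT (material), MF (file) and MT (section) identifiers, with floating-point numbers written without the 'E' exponent marker. A minimal parsing sketch; the sample line is illustrative:

        def parse_endf_line(line):
            """Split one 80-column ENDF-6 record into data fields and identifiers."""
            fields = [line[i:i + 11] for i in range(0, 66, 11)]  # six 11-char fields
            mat = int(line[66:70])    # material number
            mf = int(line[70:72])     # file (data type) number
            mt = int(line[72:75])     # section (reaction) number
            return fields, mat, mf, mt

        def endf_float(field):
            """ENDF floats omit the 'E', e.g. '1.001000+3' means 1.001000e+3."""
            s = field.strip()
            for i in range(1, len(s)):            # locate the exponent sign
                if s[i] in "+-" and s[i - 1] not in "eE":
                    return float(s[:i] + "e" + s[i:])
            return float(s)

        line = ("{:>11}{:>11}{:>11}{:>11}{:>11}{:>11}".format(
                    "1.001000+3", "9.991673-1", "0", "0", "0", "5") + " 125 1451    1")
        fields, mat, mf, mt = parse_endf_line(line)
        print(endf_float(fields[0]), mat, mf, mt)   # 1001.0 125 1 451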

  14. Space Physics Cosmic & Heliospheric Data Evaluation Panel Report

    Science.gov (United States)

    McGuiere, R. E.; Cooper, J.; Gazis, P.; Kurth, W.; Lazarus, A.; McDonald, F.; McNutt, R.; Pyle, R.; Tsurutani, B. T.

    1995-01-01

    This Cosmic and Heliospheric (C&H) Data Evaluation Panel was charged with the task of identifying and prioritizing important C&H data sets. It was requested to provide C&H community input to the Space Physics Division for a program of revitalizing data holdings. Details and recommendations are provided. Highest C&H priority is assigned to Voyager, Pioneer, Helios, IMP-8, and ISEE-3 data.

  15. Hanford Site background: Evaluation of existing soil radionuclide data

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    This report is an evaluation of the existing data on radiological background for soils in the vicinity of the Hanford Site. The primary purpose of this report is to assess the adequacy of the existing data to serve as a radiological background baseline for use in environmental restoration and remediation activities at the Hanford Site. The soil background data compiled and evaluated in this report were collected by the Pacific Northwest Laboratory (PNL) and Washington State Department of Health (DOH) radiation surveillance programs in southeastern Washington. These two programs provide the largest well-documented, quantitative data sets available to evaluate background conditions at the Hanford Site. The data quality objectives (DQOs) considered in this evaluation include the amount of data, number of sampling localities, spatial coverage, number and types of radionuclides reported, frequency of reporting, documentation and traceability of sampling and laboratory methods used, and comparability between sets of data. Although other data on soil radionuclide abundances around the Hanford Site exist, they are generally limited in scope and lack the DQOs necessary for consideration with the PNL and DOH data sets. Collectively, these two sources provide data on the activities of 25 radionuclides and four other parameters (gross alpha, gross beta, total uranium, and total thorium). These measurements were made on samples from the upper 2.5 cm of soil at over 70 localities within the region.

  16. Evaluation of technological data in the DFI and PIES models

    Energy Technology Data Exchange (ETDEWEB)

    Bhagat, N.; Beller, M.; Hermelee, A.; Wagner, J.; Lamontagne, J.

    1979-04-01

    This report evaluates the data used in two of the models available to the Energy Information Administration (EIA). Specifically, the study involves updating, reviewing, and documenting the technological data on primary energy conversion, transportation, distribution and end-use conversion. The major focus is upon data used in the Decision Focus, Inc. (DFI), LEAP model. This is an abbreviated version of the Gulf-Stanford Research, Inc., energy model developed to assess the potential future impacts of synthetic fuels in the US energy system. A parallel effort assesses the data used in the model commonly known as the Project Independence Evaluation System (PIES).

  17. Data-driven performance evaluation method for CMS RPC trigger ...

    Indian Academy of Sciences (India)

    Data-driven performance evaluation method for CMS RPC trigger system. The information transmitted from the three muon subsystems (DT, CSC and RPC) is collected by the Global Muon Trigger (GMT) Board and merged. A method for evaluating the RPC system trigger ...

  18. Cross-lingual tagger evaluation without test data

    DEFF Research Database (Denmark)

    Agic, Zeljko; Plank, Barbara; Søgaard, Anders

    2017-01-01

    We address the challenge of cross-lingual POS tagger evaluation in absence of manually annotated test data. We put forth and evaluate two dictionary-based metrics. On the tasks of accuracy prediction and system ranking, we reveal that these metrics are reliable enough to approximate test set...

  19. An Evaluation Framework for Data Competitions in TEL

    NARCIS (Netherlands)

    Drachsler, Hendrik; Stoyanov, Slavi; d'Aquin, Mathieu; Herder, Eelco; Guy, Marieke; Dietze, Stefanie

    2015-01-01

    This paper presents a study describing the development of an Evaluation Framework (EF) for data competitions in TEL. The study applies the Group Concept Method (GCM) to empirically depict criteria and their indicators for evaluating software applications in TEL. A statistical analysis of

  20. Accurate evaluation and analysis of functional genomics data and methods

    Science.gov (United States)

    Greene, Casey S.; Troyanskaya, Olga G.

    2016-01-01

    The development of technology capable of inexpensively performing large-scale measurements of biological systems has generated a wealth of data. Integrative analysis of these data holds the promise of uncovering gene function, regulation, and, in the longer run, understanding complex disease. However, their analysis has proved very challenging, as it is difficult to quickly and effectively assess the relevance and accuracy of these data for individual biological questions. Here, we identify biases that present challenges for the assessment of functional genomics data and methods. We then discuss evaluation methods that, taken together, begin to address these issues. We also argue that the funding of systematic data-driven experiments and of high-quality curation efforts will further improve evaluation metrics so that they more accurately assess functional genomics data and methods. Such metrics will allow researchers in the field of functional genomics to continue to answer important biological questions in a data-driven manner. PMID:22268703

  1. Lazy evaluation of FP programs: A data-flow approach

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Y.H. [International Business Machines Corp., Yorktown Heights, NY (United States). Thomas J. Watson Research Center; Gaudiot, J.L. [University of Southern California, Los Angeles, CA (United States). Computer Research Inst.

    1988-12-31

    This paper presents a lazy evaluation system for the list-based functional language, Backus' FP, in a data-driven environment. A superset language of FP, called DFP (Demand-driven FP), is introduced. FP eager programs are transformed into DFP lazy programs which contain the notion of demands. The data-driven execution of DFP programs has the same effect as lazy evaluation. DFP lazy programs have the property of always evaluating a sufficient and necessary result. An infinite sequence generator is used to demonstrate the eager-to-lazy program transformation and the execution of the lazy programs.
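
    The demand-driven behaviour that DFP grafts onto FP corresponds closely to what generators provide in modern languages. A loose analogue in Python (not FP itself), showing an infinite sequence evaluated only as far as demanded:

        from itertools import islice

        def integers_from(n):
            """Infinite sequence: each value is produced only when demanded."""
            while True:
                yield n
                n += 1

        def lazy_map(f, seq):
            """Apply f element by element, propagating demands to the source."""
            for x in seq:
                yield f(x)

        # Only the five demanded elements are ever computed, despite the
        # conceptually infinite sequence -- the effect of lazy evaluation.
        squares = lazy_map(lambda x: x * x, integers_from(1))
        print(list(islice(squares, 5)))   # [1, 4, 9, 16, 25]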

  2. Can complex health interventions be evaluated using routine clinical and administrative data? - a realist evaluation approach.

    Science.gov (United States)

    Riippa, Iiris; Kahilakoski, Olli-Pekka; Linna, Miika; Hietala, Minni

    2014-12-01

    Interventions aimed at improving chronic care typically consist of multiple interconnected parts, all of which are essential to the effect of the intervention. Limited attention has been paid to the use of routine clinical and administrative data in the evaluation of these complex interventions. The purpose of this study is to examine the feasibility of routinely collected data for evaluating complex interventions and to demonstrate how a theory-based, realist approach to evaluation may increase the feasibility of routine data. We present a case study of evaluating a complex intervention, namely the chronic care model (CCM), in Finnish primary health care. Issues typically faced when evaluating the effects of a complex intervention on health outcomes and resource use are identified by using routine data in a natural setting, and we apply the context-mechanism-outcome (CMO) approach from the realist evaluation paradigm to improve the feasibility of using routine data in evaluating complex interventions. From the experimentalist approach that dominates the medical literature, routine data collected from a single centre offered a poor starting point for evaluating complex interventions. However, the CMO approach offered tools for identifying the indicators needed to evaluate complex interventions. Applying the CMO approach can aid in a typical evaluation setting encountered by primary care managers: one in which the intervention is complex, the primary data source is routinely collected clinical and administrative data from a single centre, and randomization of patients into two research arms is too resource-consuming to arrange.

  3. A transparent and transportable methodology for evaluating Data Linkage software.

    Science.gov (United States)

    Ferrante, Anna; Boyd, James

    2012-02-01

    There has been substantial growth in Data Linkage (DL) activities in recent years. This reflects growth in both the demand for, and the supply of, linked or linkable data. Increased utilisation of DL "services" has brought with it an increased need for impartial information about the suitability and performance capabilities of DL software programs and packages. Although evaluations of DL software exist, most have been restricted to the comparison of two or three packages. Evaluations of a large number of packages are rare because of the time and resource burden placed on the evaluators and the need for a suitable "gold standard" evaluation dataset. In this paper we present an evaluation methodology that overcomes a number of these difficulties. Our approach involves the generation and use of representative synthetic data; the execution of a series of linkages using a pre-defined linkage strategy; and the use of standard linkage quality metrics to assess performance. The methodology is both transparent and transportable, producing genuinely comparable results. It was used by the Centre for Data Linkage (CDL) at Curtin University in an evaluation of ten DL software packages, and is also being used to evaluate larger linkage systems (not just packages). The methodology provides a unique opportunity to benchmark the quality of linkages in different operational environments.
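
    The "standard linkage quality metrics" are typically pairwise precision, recall and F-measure against the truth known from the synthetic data. A minimal sketch under that assumption:

        def linkage_quality(true_pairs, predicted_pairs):
            """Pairwise precision/recall/F-measure for one linkage run.
            Each argument is a set of frozensets holding two record ids that
            refer to the same entity (truth) or were linked (prediction)."""
            tp = len(true_pairs & predicted_pairs)       # correctly linked pairs
            precision = tp / len(predicted_pairs) if predicted_pairs else 0.0
            recall = tp / len(true_pairs) if true_pairs else 0.0
            f = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
            return precision, recall, f

        truth = {frozenset(p) for p in [("r1", "r2"), ("r3", "r4"), ("r5", "r6")]}
        linked = {frozenset(p) for p in [("r1", "r2"), ("r3", "r4"), ("r5", "r7")]}
        print(linkage_quality(truth, linked))   # each metric works out to 2/3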

  4. Coherent investigation of nuclear data at CEA DAM: Theoretical models, experiments and evaluated data

    Energy Technology Data Exchange (ETDEWEB)

    Bauge, E.; Belier, G.; Cartier, J.; Chatillon, A.; Daugas, J.M.; Delaroche, J.P.; Dossantos-Uzarralde, P.; Duarte, H.; Dubray, N.; Ducauze-Philippe, M.; Gaudefroy, L.; Gosselin, G.; Granier, T.; Hilaire, S.; Chau, Huu-Tai P.; Laborie, J.M.; Laurent, B.; Ledoux, X.; Le Luel, C.; Meot, V.; Morel, P.; Morillon, B.; Roig, O.; Romain, P.; Taieb, J.; Varignon, C. [CEA, DAM, DIF, Arpajon (France); Authier, N.; Casoli, P.; Richard, B. [CEA Valduc, Is-sur-Tille (France)

    2012-08-15

    The domain of evaluated nuclear data involves, at the same time, a close interaction between the field of nuclear applications and that of nuclear physics, and a close interaction between experiments and theory. The final product, the evaluated data file, synthesises vast amounts of information stemming from all of the above fields. At CEA DAM, all these aspects of nuclear data are investigated in a consistent way, making full use of experimental facilities and high-performance computing as well as numerous national and international collaborations, for the measurement, calculation, evaluation, and validation of nuclear data. (orig.)

  5. INTERNATIONAL CO-OPERATION IN NUCLEAR DATA EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Katakura,J.; Koning,A.; Nordborg,C.

    2010-04-30

    The OECD Nuclear Energy Agency (NEA) is organising a co-operation between the major nuclear data evaluation projects in the world. The co-operation involves the ENDF, JEFF, and JENDL projects, and, owing to the collaboration with the International Atomic Energy Agency (IAEA), also the Russian RUSFOND and the Chinese CENDL projects. The Working Party on International Nuclear Data Evaluation Cooperation (WPEC), comprising about 20 core members, manages this co-operation and meets annually to discuss progress in each evaluation project and in related experimental activities. The WPEC assesses common needs for nuclear data improvements, and these needs are then addressed by initiating joint evaluation efforts. The work is performed in specially established subgroups consisting of experts from the participating evaluation projects. The outcome of these subgroups is published in reports issued by the NEA. Current WPEC activities comprise, for example, a number of studies related to nuclear data uncertainties, including a review of methods for the combined use of integral experiments and covariance data, as well as evaluations of some of the major actinides, such as {sup 235}U and {sup 239}Pu. This paper gives an overview of current and planned activities within the WPEC.

  6. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING & EVALUATION METHODS & REQUIREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    SCHOFIELD JS

    2007-10-04

    This document has two purposes: (1) to describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated; and (2) to provide the basic review requirements for HRR data when HRR is deployed as a leak-detection method during SST waste retrievals.

  7. Data-driven performance evaluation method for CMS RPC trigger ...

    Indian Academy of Sciences (India)

    2012-10-06

    Data-driven performance evaluation method for CMS RPC trigger system using 2011 data at LHC. A. Sharma and S. B. Beri, on behalf of the CMS Collaboration. Department of Physics, Panjab University, Chandigarh 160 014, India. Corresponding author e-mail: archiesharma12@gmail.com.

  8. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Hendrik, Drachsler; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  9. TENDL-2011: TALYS-based Evaluated Nuclear Data Library

    Energy Technology Data Exchange (ETDEWEB)

    Rochman, D.; Koning, A. J. [Nuclear Research and Consultancy Group, Petten (Netherlands)

    2012-07-01

    The fourth release of the TENDL library, TENDL-2011 (TALYS-based Evaluated Nuclear Data Library), is described. This library consists of a complete set of nuclear reaction data for incident neutrons, photons, protons, deuterons, tritons, helions and alpha particles, from 10{sup -5} eV up to 200 MeV, for all isotopes from {sup 6}Li to {sup 281}Ds that are either stable or have a half-life longer than 1 second. All data are completely and consistently evaluated using a software system consisting of the TALYS-1.2 nuclear reaction code and other programs to handle resonance data, experimental data and data from existing evaluations, and to provide the final ENDF-6 formatting. The result is a nuclear data library with mutually consistent reaction information for all isotopes and a quality that increases with yearly updates. To produce this library, TALYS input parameters are adjusted for many nuclides so that calculated cross sections agree with experimental data, while for important nuclides experimental data are included directly. All information is available on www.talys.eu and www.talys.eu/TENDL-2011. (authors)

  10. Nuclear Structure and Decay Data Evaluation: Challenges and Perspectives

    Science.gov (United States)

    Kondev, F. G.; Tuli, J. K.

    2003-10-01

    The expression "Nuclear Structure and Decay Data" refers to complex nuclear level schemes and tables of numerical values, which quantify fundamental properties of nuclear structure, such as level energies, quantum numbers and state lifetimes, as well as various decay modes and associated radiation. These data are not only at the core of basic nuclear structure and nuclear astrophysics research, but are also relevant for many applied technologies, including nuclear energy production, reactor design and safety, medical diagnostics and radiotherapy, health physics, environmental research and monitoring, safeguards, and material analysis. The mission of the Nuclear Structure and Decay Data Working Group of the US DOE-funded Nuclear Data Program is to evaluate, compile, maintain, and disseminate nuclear structure and decay data for all known nuclei (more than 2900!). The network's principal effort is devoted to the timely revision of information in the ENSDF (Evaluated Nuclear Structure Data File) database. In recent years, however, special attention has been given to topical evaluations of properties of particular interest to the nuclear structure community, such as log ft values, α- and proton-decay properties, super-deformed and magnetic-dipole collective structures, nuclear moments, and nuclear isomers (under development). This presentation will briefly review recent achievements of the network, present on-going activities, and reflect on ideas for future projects and challenges in the field of nuclear structure and decay data evaluation.

  11. Data governance tools evaluation criteria, big data governance, and alignment with enterprise data management

    CERN Document Server

    Soares, Sunil

    2015-01-01

    Data governance programs often start off using programs such as Microsoft Excel and Microsoft SharePoint to document and share data governance artifacts, but these tools often lack critical functionality. Meanwhile, vendors have matured their data governance offerings to the extent that today's organizations need to consider tools a critical component of their data governance programs. In this book, data governance expert Sunil Soares reviews the Enterprise Data Management (EDM) reference architecture and discusses key data governance tasks that can be automated by tools for business glossaries

  12. Nuclear data evaluation methodology including estimates of covariances

    Directory of Open Access Journals (Sweden)

    Smith D.L.

    2010-10-01

    Evaluated nuclear data, rather than raw experimental and theoretical information, are employed in nuclear applications such as the design of nuclear energy systems. Therefore, the process by which such information is produced and ultimately used is of critical interest to the nuclear science community. This paper provides an overview of various contemporary methods employed to generate evaluated cross sections and related physical quantities such as particle-emission angular distributions and energy spectra. The emphasis here is on data associated with neutron-induced reaction processes, with consideration of the uncertainties in these data, and on the more recent evaluation methods, e.g., those based on stochastic (Monte Carlo) techniques. There is no unique way to perform such evaluations, nor are nuclear data evaluators united in their opinions as to which methods are superior to others in various circumstances. In some cases it is not critical which approaches are used, as long as there is consistency and proper use is made of the available physical information. However, in other instances there are definite advantages to using particular methods as opposed to other options. Some of these distinctions are discussed in this paper, and suggestions are offered regarding fruitful areas for future research in the development of evaluation methodology.
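
    The stochastic flavour of evaluation mentioned above can be caricatured in a few lines: sample model parameters from their assumed uncertainties, propagate each sample through the nuclear model, and take the covariance of the outputs. A toy sketch with an invented two-parameter model, not any evaluator's production code:

        import numpy as np

        rng = np.random.default_rng(42)
        energies = np.array([0.1, 1.0, 5.0, 14.0])     # MeV, illustrative grid

        def model(scale, slope):
            """Toy 'cross section' as a function of two model parameters."""
            return scale * np.exp(-slope * energies)

        # Sample the parameters from their assumed prior uncertainties.
        samples = np.array([model(rng.normal(2.0, 0.1), rng.normal(0.3, 0.03))
                            for _ in range(10_000)])

        mean = samples.mean(axis=0)            # central values on the energy grid
        cov = np.cov(samples, rowvar=False)    # covariance matrix across energies
        print(mean.round(3))
        print(cov.round(5))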

  13. Dissemination and visualisation of reference decay data from Decay Data Evaluation Project (DDEP)

    Science.gov (United States)

    Dulieu, Christophe; Kellett, Mark A.; Mougeot, Xavier

    2017-09-01

    As a primary laboratory in the field of ionising radiation metrology, the Laboratoire National Henri Becquerel (LNE-LNHB), CEA Saclay, is involved in measurements, evaluations and dissemination of radioactive decay data. Data measurements undertaken by various laboratories are evaluated by an international commission of experts (the Decay Data Evaluation Project) coordinated by LNHB staff in order to establish a set of recommended decay scheme data. New nuclide evaluations are regularly added to our website, the Nucléide database, published in the BIPM-5 Monographie series and uploaded to our web application Laraweb, a dedicated tool for alpha and gamma spectrometry. The Mini Table of Radionuclides is produced from time to time with data extracted from our database. Various publications are described, along with new search criteria and decay scheme visualisation in Laraweb.

  14. Data driven uncertainty evaluation for complex engineered system design

    Science.gov (United States)

    Liu, Boyuan; Huang, Shuangxi; Fan, Wenhui; Xiao, Tianyuan; Humann, James; Lai, Yuyang; Jin, Yan

    2016-09-01

    Complex engineered systems are often difficult to analyze and design because of the tangled interdependencies among their subsystems and components. Conventional design methods often require exact modeling or accurate structure decomposition, which limits their practical application. The rapid expansion of data makes its use to guide and improve system design indispensable in practical engineering. In this paper, a data-driven uncertainty evaluation approach is proposed to support the design of complex engineered systems. The core of the approach is a data-mining-based uncertainty evaluation method that predicts the uncertainty level of a specific system design by analyzing association relations along different system attributes and synthesizing the information entropy of the covered attribute areas; a quantitative measure of system uncertainty is obtained accordingly. Monte Carlo simulation is introduced to obtain the uncertainty extrema, and the possible data distributions under different situations are discussed in detail. The uncertainty values can be normalized using the simulation results and used to evaluate different system designs. A prototype system was established, and two case studies were carried out. The case of an inverted pendulum system validates the effectiveness of the proposed method, and the case of an oil sump design shows its practicality when two or more design plans need to be compared. This research makes it possible to evaluate the uncertainty of complex engineered systems relying entirely on data, and is ideally suited for plan selection and performance analysis in system design.
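
    The entropy-synthesis step can be illustrated compactly: estimate the Shannon entropy of each attribute's observed distribution and average the normalized values into a single score. Illustrative only; the method in the paper also mines association relations between attributes:

        import numpy as np

        def shannon_entropy(values, bins=10):
            """Entropy (bits) of the empirical distribution of one attribute."""
            counts, _ = np.histogram(values, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return -(p * np.log2(p)).sum()

        def uncertainty_score(design_data, bins=10):
            """Mean normalized entropy over all attributes of a candidate design."""
            entropies = [shannon_entropy(col, bins) for col in design_data.T]
            return float(np.mean(entropies) / np.log2(bins))

        rng = np.random.default_rng(7)
        design_a = rng.normal(0, 0.1, (500, 4))    # concentrated data -> lower score
        design_b = rng.uniform(-1, 1, (500, 4))    # spread-out data -> higher score
        print(uncertainty_score(design_a), uncertainty_score(design_b))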

  15. Engaging Youth in Evaluation: Using Clickers for Data Collection

    Directory of Open Access Journals (Sweden)

    Lynne M. Borden

    2012-03-01

    Now, more than ever, evaluation is an essential component of all programs. Although the need for outcome data is clear, collecting data from youth populations is often difficult, particularly among youth who are vulnerable and/or disenfranchised. While paper-and-pencil (PAP) surveys are a commonly used method of data collection, technological methods such as online surveys, text messaging, and personal digital assistants (PDAs) are increasingly employed in data collection efforts. This article explores the use of audience response systems ("clickers") as an innovative data collection method that is especially suited for use with youth. In this paper we examine qualitative findings from key informant interviews regarding data collected from youth participants on a youth program quality measure using clicker technology. Findings from the study indicate that the use of clickers may increase youth engagement in, and improve the efficiency of, the data collection process.

  16. A public data set of human balance evaluations

    Directory of Open Access Journals (Sweden)

    Damiana A. Santos

    2016-11-01

    The goal of this study was to create a public data set with results of qualitative and quantitative evaluations related to human balance. Subjects' balance was evaluated by posturography using a force platform and by the Mini Balance Evaluation Systems Test. In the posturography test, we evaluated subjects standing still for 60 s in four conditions in which vision and the standing surface were manipulated: on a rigid surface with eyes open; on a rigid surface with eyes closed; on an unstable surface with eyes open; and on an unstable surface with eyes closed. Each condition was performed three times and the order of the conditions was randomized. In addition, the following tests were employed to better characterize each subject: the Short Falls Efficacy Scale International; the International Physical Activity Questionnaire, Short Version; and the Trail Making Test. The subjects were also interviewed to collect information about their socio-cultural, demographic, and health characteristics. The data set comprises signals from the force platform (raw data for the forces, moments of force, and centers of pressure) of 163 subjects, plus one file with information about the subjects and balance conditions and the results of the other evaluations. All the data are available at PhysioNet and at Figshare.
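
    As a usage example, the centers of pressure can be recomputed from the raw platform signals. A sketch assuming the common convention COPx = -My/Fz and COPy = Mx/Fz; axis conventions vary between platforms, so the data set documentation should be checked:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 6000                                  # 60 s at a hypothetical 100 Hz
        fz = 700 + rng.normal(0, 5, n)            # vertical force (N), ~body weight
        mx = rng.normal(0, 2, n)                  # moments about x and y (N*m)
        my = rng.normal(0, 2, n)

        cop_x = -my / fz * 100                    # center of pressure (cm)
        cop_y = mx / fz * 100

        # Two standard posturographic summary measures:
        sway_path = np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y)))
        rms_x = np.sqrt(np.mean((cop_x - cop_x.mean()) ** 2))
        print(f"sway path = {sway_path:.1f} cm, RMS(x) = {rms_x:.2f} cm")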

  17. Evaluation of the Wishart test statistics for polarimetric SAR data

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    2003-01-01

    A test statistic for equality of two covariance matrices following the complex Wishart distribution has previously been used in new algorithms for change detection, edge detection and segmentation in polarimetric SAR images. Previously, the results for change detection and edge detection have been quantitatively evaluated. This paper deals with the evaluation of segmentation. A segmentation performance measure originally developed for single-channel SAR images has been extended to polarimetric SAR images, and used to evaluate segmentation for a merge-using-moment algorithm for polarimetric SAR data.
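
    The statistic in question has a closed form (Conradsen et al., 2003): for X ~ W(p, n, S) and Y ~ W(p, m, S), ln Q = p[(n+m)ln(n+m) - n ln n - m ln m] + n ln|X| + m ln|Y| - (n+m)ln|X+Y|, with -2 ln Q approximately chi-squared distributed. A numerical sketch on synthetic matrices; constants and the exact distributional correction should be checked against the paper:

        import numpy as np

        def ln_q(x, y, n, m):
            """Log likelihood-ratio statistic for equality of two complex
            Wishart matrices X and Y with n and m looks respectively."""
            p = x.shape[0]
            _, ldx = np.linalg.slogdet(x)
            _, ldy = np.linalg.slogdet(y)
            _, ldxy = np.linalg.slogdet(x + y)
            return (p * ((n + m) * np.log(n + m) - n * np.log(n) - m * np.log(m))
                    + n * ldx + m * ldy - (n + m) * ldxy)

        rng = np.random.default_rng(5)
        a = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
        x = a @ a.conj().T + 3 * np.eye(3)        # Hermitian positive definite
        y = x + 0.1 * np.eye(3)                   # nearly equal covariance

        n = m = 16                                # looks per image
        print(-2 * ln_q(n * x, m * y, n, m))      # near 0 -> consistent with equality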

  18. Evaluated kinetic and photochemical data for atmospheric chemistry

    Science.gov (United States)

    Baulch, D. L.; Cox, R. A.; Hampson, R. F., Jr.; Kerr, J. A.; Troe, J.; Watson, R. T.

    1980-01-01

    This paper contains a critical evaluation of the kinetics and photochemistry of gas phase chemical reactions of neutral species involved in middle atmosphere chemistry (10-55 km altitude). Data sheets have been prepared for 148 thermal and photochemical reactions, containing summaries of the available experimental data with notes giving details of the experimental procedures. For each reaction a preferred value of the rate coefficient at 298 K is given together with a temperature dependency where possible. The selection of the preferred value is discussed, and estimates of the accuracies of the rate coefficients and temperature coefficients have been made for each reaction. The data sheets are intended to provide the basic physical chemical data needed as input for calculations which model atmospheric chemistry. A table summarizing the preferred rate data is provided, together with an appendix listing the available data on enthalpies of formation of the reactant and product species.
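
    The "preferred value at 298 K together with a temperature dependence" is conventionally quoted in Arrhenius form, k(T) = A exp(-B/T) with B = E/R. A sketch with invented parameters, not values from the evaluation:

        import numpy as np

        def rate_coefficient(temp_k, a_factor, e_over_r):
            """Arrhenius expression k(T) = A * exp(-(E/R)/T); for a bimolecular
            reaction the units of A and k are cm^3 molecule^-1 s^-1."""
            return a_factor * np.exp(-e_over_r / temp_k)

        A, E_R = 2.0e-12, 1400.0          # invented illustrative parameters

        for T in (220.0, 260.0, 298.0):   # stratospheric to room temperature
            print(f"k({T:.0f} K) = {rate_coefficient(T, A, E_R):.2e}")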

  19. Evaluating Terra MODIS Satellite Sensor Data Products for Maize ...

    African Journals Online (AJOL)

    Celeste

    Evaluating Terra MODIS Satellite Sensor Data Products for Maize Yield Estimation in South Africa. The Terra (EOS AM-1) research satellite carries the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. ... VIS = visible wavelengths. EVI is computed using the equation of Huete et al. (2002), sketched below.
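
    The EVI equation of Huete et al. (2002) referenced in the snippet is EVI = G(NIR - Red)/(NIR + C1*Red - C2*Blue + L) with G = 2.5, C1 = 6, C2 = 7.5 and L = 1. A sketch applying it to surface-reflectance values:

        import numpy as np

        def evi(nir, red, blue):
            """Enhanced Vegetation Index (Huete et al., 2002), with reflectances
            expressed as fractions in the range 0..1."""
            return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

        # Illustrative MODIS-like surface reflectances for three pixels.
        nir = np.array([0.45, 0.30, 0.25])
        red = np.array([0.08, 0.12, 0.20])
        blue = np.array([0.04, 0.06, 0.10])
        print(evi(nir, red, blue).round(3))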

  20. Issues in Evaluating Model Fit With Missing Data

    Science.gov (United States)

    Davey, Adam

    2005-01-01

    Effects of incomplete data on fit indexes remain relatively unexplored. We evaluate a wide set of fit indexes (chi-squared, root mean squared error of approximation, Normed Fit Index [NFI], Tucker-Lewis Index, comparative fit index, gamma-hat, and McDonald's Centrality Index) varying conditions of sample size (100-1,000 in increments of 50),…

  1. Statistical methods to evaluate thermoluminescence ionizing radiation dosimetry data

    Energy Technology Data Exchange (ETDEWEB)

    Segre, Nadia; Matoso, Erika; Fagundes, Rosane Correa, E-mail: nadia.segre@ctmsp.mar.mil.b [Centro Tecnologico da Marinha em Sao Paulo (CEA/CTMSP), Ipero, SP (Brazil). Centro Experimental Aramar

    2011-07-01

    Ionizing radiation levels, evaluated through the exposure of CaF{sub 2}:Dy thermoluminescence dosimeters (TLD-200), have been monitored at Centro Experimental Aramar (CEA), located at Ipero in Sao Paulo state, Brazil, since 1991, resulting in a large number of measurements up to 2009 (more than 2,000). The volume of data, together with the dispersion present in any measurement process, reinforces the need for statistical tools to evaluate the results, a procedure also imposed by the Brazilian Standard CNEN-NN-3.01/PR-3.01-008, which regulates radiometric environmental monitoring. Thermoluminescence ionizing radiation dosimetry data are statistically compared in order to evaluate the potential environmental impact of CEA's activities. The statistical tools discussed in this work are box plots, control charts and analysis of variance. (author)
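
    Of the tools listed, the control chart is the most compact to sketch: readings are compared against limits placed three standard deviations around the long-term mean. Invented dose values, not CEA monitoring data:

        import numpy as np

        rng = np.random.default_rng(11)
        doses = rng.normal(0.8, 0.05, 72)          # invented quarterly TLD doses (mSv)

        mean = doses.mean()
        sigma = doses.std(ddof=1)
        ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # control limits

        out_of_control = np.flatnonzero((doses > ucl) | (doses < lcl))
        print(f"mean = {mean:.3f} mSv, limits = [{lcl:.3f}, {ucl:.3f}] mSv")
        print("readings outside limits:", out_of_control)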

  2. Assessment of Existing Data and Reports for System Evaluation

    Science.gov (United States)

    Matolak, David W.; Skidmore, Trent A.

    2000-01-01

    This report describes work done under Task 1 of the Weather Datalink Research project grant: the assessment of the suitability of available reports and data for use in evaluating candidate weather datalink systems, and the development of a performance parameter set for comparative system evaluation. It was found that existing data and reports are inadequate for a complete physical-layer characterization, but that they provide a good foundation for system comparison. These reports also contain some information useful for evaluation at higher layers. The performance parameter list compiled can be viewed as near complete; additional investigations, both analytical/simulation and experimental, will likely result in additions and improvements to this list.

  3. Electromagnetic Nondestructive Evaluation of Tubes using Data Mining Procedure

    Science.gov (United States)

    Savin, A.; Iftimie, N.; Vizureanu, P.; Steigmann, R.; Dobrescu, G. S.

    2017-06-01

    The fundamental issues in nondestructive evaluation consist in identifying events corresponding to flaws that can appear in the examined object and extracting them from noise. This is usually done by comparison with pre-established thresholds, determined experimentally using standard samples or on the basis of the solution of the forward problem and simulations. This paper presents feature extraction using a data mining procedure for steam generator tubes having different flaws. The data mining is carried out using models simulated in CIVA 9 and experimental data gathered using an inner differential sensor developed for this purpose.

  4. PrismTech Data Distribution Service Java API Evaluation

    Science.gov (United States)

    Riggs, Cortney

    2008-01-01

    My internship duties with Launch Control Systems required me to start performance testing of an implementation of the Object Management Group's (OMG) Data Distribution Service (DDS) specification by PrismTech Limited, through the Java programming language application programming interface (API). DDS is a networking middleware for real-time data distribution. The performance testing involves latency, redundant publishers, extended duration, redundant failover, and read performance. Time constraints allowed only a data throughput test; I designed the testing applications to perform all performance tests when time allows. Performance evaluation data such as megabits per second and central processing unit (CPU) time consumption were not easily attainable through the Java programming language; they required new methods and classes in the test applications. Evaluation of this product showed the rate at which data can be sent across the network. Performance is better on Linux platforms than on AIX and Sun platforms. Compared to the previous C++ API, the evaluation also shows the language differences for the implementation: the Java API of the DDS has lower throughput performance than the C++ API.

  5. Evaluation of ERTS-1 data for acquiring land use data of northern Megalopolis. [New England

    Science.gov (United States)

    Simpson, R. B.; Lindgren, D. T.; Goldstein, W. D.

    1974-01-01

    State planners are increasingly becoming interested in ERTS as a possible method for acquiring land use data. An important consideration to them is whether ERTS can provide such data at a savings in both time and money over alternative systems. A preliminary evaluation of ERTS as a planning tool is given.

  6. Generic and Automated Data Evaluation in Analytical Measurement.

    Science.gov (United States)

    Adam, Martin; Fleischer, Heidi; Thurow, Kerstin

    2017-04-01

    In past years, automation has become more and more important in elemental and structural chemical analysis, reducing the high degree of manual operation and processing time as well as human error. A high number of data points is thus generated, which requires fast and automated data evaluation. To handle the preprocessed export data from different analytical devices running software from various vendors, a standardized solution requiring no programming knowledge is preferable. In modern laboratories, multiple users will use this software on multiple personal computers with different operating systems (e.g., Windows, Macintosh, Linux); mobile devices such as smartphones and tablets have also gained growing importance. The developed software, Project Analytical Data Evaluation (ADE), is implemented as a web application. To transmit the pre-evaluated data from the device software to the Project ADE, the exported XML report files are detected and the included data are imported into the entities database using the Data Upload software. Different calculation types of a sample within one measurement series (e.g., method validation) are identified using information tags inside the sample name. The results are presented in tables and diagrams at different information levels (general; detailed for one analyte or sample).
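
    The import step - detecting exported XML reports and loading the contained values into a database - can be sketched with the standard library alone. The folder layout and tag names below are hypothetical, since report schemas differ between device vendors:

        import sqlite3
        import xml.etree.ElementTree as ET
        from pathlib import Path

        db = sqlite3.connect("results.db")
        db.execute("CREATE TABLE IF NOT EXISTS results"
                   " (sample TEXT, analyte TEXT, value REAL)")

        # Assumed schema: <report><sample name="...">
        #   <result analyte="..." value="..."/> ... </sample></report>
        for report in Path("exports").glob("*.xml"):
            root = ET.parse(report).getroot()
            for sample in root.iter("sample"):
                for result in sample.iter("result"):
                    db.execute("INSERT INTO results VALUES (?, ?, ?)",
                               (sample.get("name"), result.get("analyte"),
                                float(result.get("value"))))
        db.commit()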

  7. Critical Evaluation of Validation Rules Automated Extraction from Data

    Directory of Open Access Journals (Sweden)

    David Pejcoch

    2014-10-01

    The goal of this article is to critically evaluate the possibility of automatically extracting rules that could later be used within a Data Quality Management process to validate records newly arriving in an information system. For practical demonstration, the 4FT-Miner procedure implemented in the LISp-Miner system was chosen. The motivation for this task is the potential simplification of projects focused on Data Quality Management. The article first critically evaluates the possibility of fully automated extraction, with the aim of identifying the strengths and weaknesses of this approach in comparison to its alternative, when at least some a priori knowledge is available. As a result of practical implementation, the article provides the design of a recommended process that can be used as a guideline for future projects. The question of how to store and maintain extracted rules, and how to integrate them with existing tools supporting Data Quality Management, is also discussed.

  8. An evaluated neutronic data file for elemental cobalt

    Energy Technology Data Exchange (ETDEWEB)

    Guenther, P.; Lawson, R.; Meadows, J.; Sugimoto, M.; Smith, A.; Smith, D.; Howerton, R.

    1988-08-01

    A comprehensive evaluated neutronic data file for elemental cobalt is described. The experimental data base, the calculational methods, the evaluation techniques and judgments, and the physical content are outlined. The file contains: neutron total and scattering cross sections and associated properties, (n,2n) and (n,3n) processes, neutron radiative capture processes, charged-particle-emission processes, and photon-production processes. The file extends from 10{sup -5} eV to 20 MeV, and is presented in the ENDF/B-VI format. Detailed attention is given to the uncertainties and correlations associated with the prominent neutron-induced processes. The numerical contents of the file have been transmitted to the National Nuclear Data Center, Brookhaven National Laboratory. 143 refs., 16 figs., 5 tabs.

  9. The Challenges of Data Quality Evaluation in a Joint Data Warehouse.

    Science.gov (United States)

    Bae, Charles J; Griffith, Sandra; Fan, Youran; Dunphy, Cheryl; Thompson, Nicolas; Urchek, John; Parchman, Alandra; Katzan, Irene L

    2015-01-01

    The use of clinically derived data from electronic health records (EHRs) and other electronic clinical systems can greatly facilitate clinical research as well as operational and quality initiatives. One approach for making these data available is to incorporate data from different sources into a joint data warehouse. When using such a data warehouse, it is important to understand the quality of the data. The primary objective of this study was to determine the completeness and concordance of common types of clinical data available in the Knowledge Program (KP) joint data warehouse, which contains feeds from several electronic systems including the EHR. A manual review was performed of specific data elements for 250 patients from an EHR, and these were compared with the corresponding elements in the KP data warehouse. Completeness and concordance were calculated for five categories of data: demographics, vital signs, laboratory results, diagnoses, and medications. In general, data elements for demographics, vital signs, diagnoses, and laboratory results were present in more cases in the source EHR than in the KP. When data elements were available in both sources, concordance was high. In contrast, the KP data warehouse documented a higher prevalence of deaths and medications than the EHR. Several factors contributed to the discrepancies between data in the KP and the EHR, including the start date and frequency of data feed updates into the KP, the inability to transfer data stored in unstructured formats (e.g., free text or scanned documents), and incomplete or missing data variables in the source EHR. When evaluating the quality of a data warehouse with multiple data sources, assessing completeness and concordance between the data set and the source data may be better than designating one as a gold standard. This will allow the user to optimize the method and timing of data transfer in order to capture data with better accuracy.
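
    The two quantities are simple to state precisely: completeness is the fraction of audited elements present in a source, and concordance is the agreement rate where both sources hold a value. A pandas sketch with hypothetical column names:

        import pandas as pd

        # Hypothetical audit extract: one data element from two sources.
        audit = pd.DataFrame({
            "sbp_ehr":       [120, 135, None, 150, 118],
            "sbp_warehouse": [120, None, None, 150, 122],
        })

        completeness = audit.notna().mean()     # fraction present, per source
        both = audit.dropna()                   # rows where both sources have a value
        concordance = (both["sbp_ehr"] == both["sbp_warehouse"]).mean()

        print(completeness.round(2).to_dict())  # {'sbp_ehr': 0.8, 'sbp_warehouse': 0.6}
        print(f"concordance = {concordance:.2f}")   # 2 of 3 shared values agree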

  10. Evaluation of a Data Messaging System Solution : Case: Evaluation of Apache Kafka™ at Accanto Systems

    OpenAIRE

    Nguyen, Huong

    2017-01-01

    This study aims to explore the process of adapting a new Data Messaging System Solution, i.e Apache Kafka™ (Kafka), and to evaluate whether it is suitable for the needs at Accanto Systems. The research follows the framework for Design Science research methodology. Evaluation of the artefact involves the use of a software quality model. The results of the study confirm that Kafka is satisfactory as a Data Messaging System solution. The results may also serve as an implementation guide...

  11. Data-Intensive Evaluation: The Concept, Methods, and Prospects of Higher Education Monitoring Evaluation

    Science.gov (United States)

    Wang, Zhanjun; Qiao, Weifeng; Li, Jiangbo

    2016-01-01

    Higher education monitoring evaluation is a process that uses modern information technology to continually collect and deeply analyze relevant data, visually present the state of higher education, and provide an objective basis for value judgments and scientific decision making by diverse bodies. Higher education monitoring evaluation is…

  12. NASA Data Evaluation (2015): Chemical Kinetics and Photochemical Data for Use in Atmospheric Studies

    Science.gov (United States)

    Burkholder, J. B.; Sander, S. P.; Abbatt, J.; Barker, J. R.; Huie, R. E.; Kolb, C. E., Jr.; Kurylo, M. J., III; Orkin, V. L.; Wilmouth, D. M.; Wine, P. H.

    2015-12-01

    Atmospheric chemistry models must include a large number of processes to accurately describe the temporal and spatial behavior of atmospheric composition. They require a wide range of chemical and physical data (parameters) that describe elementary gas-phase and heterogeneous processes. The review and evaluation of chemical and physical data has, therefore, played an important role in the development of chemical models and in their use in environmental assessment activities. The NASA data panel evaluation has a broad atmospheric focus that includes Ox, O(1D), singlet O2, HOx, NOx, Organic, FOx, ClOx, BrOx, IOx, SOx, and Na reactions, three-body reactions, equilibrium constants, photochemistry, Henry's Law coefficients, aqueous chemistry, heterogeneous chemistry and processes, and thermodynamic parameters. The 2015 evaluation includes critical coverage of ~700 bimolecular reactions, 86 three-body reactions, 33 equilibrium constants, ~220 photochemical species, ~360 aqueous and heterogeneous processes, and thermodynamic parameters for ~800 species with over 5000 literature citations reviewed. Each evaluation includes (1) recommended values (e.g. rate coefficients, absorption cross sections, solubilities, and uptake coefficients) with estimated uncertainty factors and (2) a note describing the available experimental and theoretical data and an explanation for the recommendation. This presentation highlights some of the recent additions to the evaluation that include: (1) expansion of thermochemical parameters, including Hg species, (2) CH2OO (Criegee) chemistry, (3) Isoprene and its major degradation product chemistry, (4) halocarbon chemistry, (5) Henry's law solubility data, and (6) uptake coefficients. In addition, a listing of complete references with the evaluation notes has been implemented. Users of the data evaluation are encouraged to suggest potential improvements and ways that the evaluation can better serve the atmospheric chemistry community.

  13. Data analysis and management for the Uranium Resource Evaluation Project

    Energy Technology Data Exchange (ETDEWEB)

    Kane, V.E.

    1980-01-01

    The Department of Energy has funded a large data collection effort with the purpose of determining US uranium resources. This Uranium Resource Evaluation (URE) Project required a large data management effort that involved collection, retrieval, processing, display, and analysis of large volumes of data. Many of the characteristics of this data processing system are relevant to other applications, particularly where routine processing involves analyses for input into numerous technical reports. The URE Project computing system has a modular program structure which has enabled a straightforward interface with both special and general graphics and analysis packages such as SAS, BMDP, and SURFACE II. Other topics include cost-effective computing, data quality, report-quality computer output, and test versus production program development.

  14. Modern Nuclear Data Evaluation with the TALYS Code System

    Energy Technology Data Exchange (ETDEWEB)

    Koning, A.J., E-mail: koning@nrg.eu [Nuclear Research and Consultancy Group NRG, P.O. Box, 1755 ZG Petten (Netherlands); Rochman, D. [Nuclear Research and Consultancy Group NRG, P.O. Box, 1755 ZG Petten (Netherlands)

    2012-12-15

    This paper presents a general overview of nuclear data evaluation and its applications as developed at NRG, Petten. Based on concepts such as robustness, reproducibility and automation, modern calculation tools are exploited to produce original nuclear data libraries that meet the current demands on quality and completeness. This requires a system which comprises differential measurements, theory development, nuclear model codes, resonance analysis, evaluation, ENDF formatting, data processing and integral validation in one integrated approach. Software, built around the TALYS code, will be presented in which all these essential nuclear data components are seamlessly integrated. Besides the quality of the basic data and its extensive format testing, a second goal lies in the diversity of processing for different types of users. The implications of this scheme are unprecedented. The most important are: 1. Complete ENDF-6 nuclear data files, in the form of the TENDL library, including covariance matrices, for many isotopes, particles, energies, reaction channels and derived quantities. All isotopic data files are mutually consistent and are intended to rival those of the major world libraries. 2. More exact uncertainty propagation from basic nuclear physics to applied (reactor) calculations based on a Monte Carlo approach: 'Total' Monte Carlo (TMC), using random nuclear data libraries. 3. Automatic optimization in the form of systematic feedback from integral measurements back to the basic data. This method of work also opens a new way of approaching the analysis of nuclear applications, with consequences in both applied nuclear physics and the safety of nuclear installations, and several examples are given here. This applied experience and feedback is integrated in a final step to improve the quality of the nuclear data, to change the users' vision and finally to orchestrate their integration into simulation codes.
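
    As a concrete illustration of the 'Total' Monte Carlo idea described above, the following minimal sketch propagates nuclear data uncertainty by repeatedly sampling a random "library" and re-running a calculation. The two-parameter library and the linear stand-in for a reactor calculation are invented for illustration; this is not the TALYS/TENDL machinery itself.

      import random
      import statistics

      def sample_random_library(rng):
          """Draw a toy 'random nuclear data library': two reaction parameters
          perturbed around nominal values (numbers are illustrative only)."""
          return {
              "capture_xs": rng.gauss(2.70, 0.05),   # barns, nominal +/- 1 sigma
              "fission_xs": rng.gauss(585.0, 4.0),   # barns
          }

      def reactor_calculation(lib):
          """Stand-in for a full transport calculation: maps a sampled library
          to an integral quantity such as k-eff (purely schematic)."""
          return (1.0 + 0.001 * (lib["fission_xs"] - 585.0)
                      - 0.002 * (lib["capture_xs"] - 2.70))

      rng = random.Random(42)
      keff = [reactor_calculation(sample_random_library(rng)) for _ in range(1000)]

      # The spread of the output distribution is the propagated data uncertainty.
      print(f"k-eff = {statistics.mean(keff):.5f} +/- {statistics.stdev(keff):.5f}")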

  15. Charting and Evaluation of Environmental Microbial Monitoring Data.

    Science.gov (United States)

    Bar, Raphael

    2015-01-01

    Statistical tools are required to organize and present microbial environmental monitoring data for the purpose of evaluating it against regulatory action limits and of determining if the microbial monitoring process is in a state of control. This paper applies a known methodology of a simple and straightforward construction of control XmR (X data and moving range) charts of individual microbial counts as they are or of contamination rates derived from them, irrespective of the type of the parent data distribution and without the need to transform the data into a normal distribution. Plotting of monthly and cumulative sample contamination rates, as newly suggested by USP , is also shown. Both types of the control charts and plots allow an evaluation of the behavior of the microbial monitoring process. After addressing the magnitude of microbial counts expected in environmental monitoring samples, this paper presents the rationale behind the use of XmR charts. Employing data taken from environmental monitoring programs of pharmaceutical manufacturing facilities, this paper analyzes examples of (1) microbial counts from passive or active air sampling in area Grade D or B or Class 100,000 in XmR charts, (2) contamination recovery rates as suggested by USP from active air samples in area Grade B and contact plates in area Grade C, and (3) instantaneous contamination rates with calculations illustrated on microbial counts of contact plates in area Grade D. Pharmaceutical companies conduct environmental monitoring programs, and samples of air (active and passive sampling) and of surfaces (contact plates) are routinely tested for microbiological quality. Thus, hundreds of microbial counts of tested environmental monitoring samples are routinely generated and recorded. Statistical tools are required to organize and present this abundant data for the purpose of evaluating it against regulatory action limits and determining if the microbial monitoring process is in a state of control.
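
    The XmR construction applied above can be stated in a few lines: the individuals (X) chart is centred on the mean count, the moving-range (mR) chart on the mean absolute difference of successive counts, and the limits use the standard Shewhart constants 2.66 and 3.267. The sketch below uses invented counts and is not the paper's own software.

      # XmR (individuals and moving range) chart limits; counts are invented.
      counts = [3, 0, 1, 2, 0, 5, 1, 0, 2, 1, 4, 0]   # e.g., CFU per plate

      x_bar = sum(counts) / len(counts)
      moving_ranges = [abs(a - b) for a, b in zip(counts[1:], counts[:-1])]
      mr_bar = sum(moving_ranges) / len(moving_ranges)

      # Standard Shewhart constants for moving ranges of size 2.
      x_ucl = x_bar + 2.66 * mr_bar
      x_lcl = max(0.0, x_bar - 2.66 * mr_bar)   # counts cannot be negative
      mr_ucl = 3.267 * mr_bar

      print(f"X chart:  CL={x_bar:.2f}  UCL={x_ucl:.2f}  LCL={x_lcl:.2f}")
      print(f"mR chart: CL={mr_bar:.2f}  UCL={mr_ucl:.2f}")
      print("points above UCL:", [i for i, c in enumerate(counts) if c > x_ucl])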

  16. Threshold evaluation data revision and computer program enhancement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-02-27

    The Threshold Evaluation System was developed to assist the Division of Buildings and Community Systems of the Department of Energy in performing preliminary evaluation of projects being considered for funding. In addition, the evaluation has been applied to on-going projects, because information obtained through RD&D may alter the expected benefits and costs of a project, making it necessary to reevaluate project funding. The system evaluates each project according to its expected energy savings and costs. A number of public and private sector criteria are calculated, upon which comparisons between projects may be based. A summary of the methodology is given in Appendix B. The purpose of this task is to upgrade both the quality of the data used for input to the system and the usefulness and efficiency of the computer program used to perform the analysis. The modifications required to produce a better, more consistent set of data are described in Section 2. Program changes that have had a significant impact on the methodology are discussed in Section 3, while those that affected only the computer code are presented as a system flow diagram and program listing in Appendix C. These improvements in the project evaluation methodology and data will provide BCS with a more efficient and comprehensive management tool. The direction of future work will be toward integrating this system with a large scale (at ORNL) so that information used by both systems may be stored in a common data base. A discussion of this and other unresolved problems is given in Section 4.

  17. Evaluation of mask data preparation with OASIS and P10

    Science.gov (United States)

    Kuriyama, Koki; Machiya, Yuji; Yamasaki, Kiyoshi; Narukawa, Shogo; Hayashi, Naoya

    2005-06-01

    OASIS (Open Artwork System Interchange Standard) is the new stream format intended to replace the conventional GDSII format, and became a SEMI standard in 2003. Some EDA software tools already support OASIS. OASIS applies not only to the layout design field but also to the photomask industry. OASIS is effective in reducing data volume even for fractured data, and is therefore expected to solve the file-size explosion problem. From a mask manufacturer's perspective, it is also necessary to consider mask layout information. At present, there are various kinds of layout information and jobdeck formats. These circumstances require complicated data handling and preparation processes at the mask manufacturers. Computerized, automated processes need to be utilized more fully to eliminate mistakes and miscommunications at the planning department. SEMI standard P10 (Specification of Data Structures for Photomask Orders) is one of the solutions. P10 is basically intended to communicate mask order data, which include layout information. This paper reports the results of an evaluation of mask data preparation unified with two SEMI standards: P39 (OASIS) and P10. We have developed a reticle pattern viewer (HOTSCOPE) which can view photomask data combining OASIS with P10. Figure 1 shows the connection between mask data formats, including the OASIS and P10 formats, and our reticle pattern viewer. HOTSCOPE provides a review of mask data as a photomask image. It will serve as an interface between device manufacturers and mask manufacturers.

  18. Normalizing Google Scholar data for use in research evaluation.

    Science.gov (United States)

    Mingers, John; Meyer, Martin

    2017-01-01

    Using bibliometric data for the evaluation of the research of institutions and individuals is becoming increasingly common. Bibliometric evaluations across disciplines require that the data be normalized to the field because the fields are very different in their citation processes. Generally, the major bibliographic databases such as Web of Science (WoS) and Scopus are used for this but they have the disadvantage of limited coverage in the social sciences and humanities. Coverage in Google Scholar (GS) is much better but GS has less reliable data and fewer bibliometric tools. This paper tests a method for GS normalization developed by Bornmann et al. (J Assoc Inf Sci Technol 67:2778-2789, 2016) on an alternative set of data involving journal papers, book chapters and conference papers. The results show that GS normalization is possible although at the moment it requires extensive manual involvement in generating and validating the data. A comparison of the normalized results for journal papers with WoS data shows a high degree of convergent validity.
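
    Field normalisation of the kind tested here amounts to dividing each item's citation count by the mean count of a field- and year-matched reference set. The sketch below shows that arithmetic with invented numbers; it is not Bornmann et al.'s exact procedure.

      import statistics

      def field_normalised_scores(papers, reference_sets):
          """Divide each paper's citations by the mean citations of a
          reference set for its (field, year). All data are invented."""
          scores = {}
          for title, (field, year, cites) in papers.items():
              baseline = statistics.mean(reference_sets[(field, year)])
              scores[title] = cites / baseline
          return scores

      reference_sets = {
          ("management", 2015): [0, 2, 5, 9, 14, 30],   # comparable GS items
          ("philosophy", 2015): [0, 0, 1, 2, 4, 8],
      }
      papers = {
          "paper A": ("management", 2015, 12),
          "paper B": ("philosophy", 2015, 6),
      }
      for title, score in field_normalised_scores(papers, reference_sets).items():
          # Paper B scores higher despite fewer raw citations.
          print(f"{title}: normalised score = {score:.2f}")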

  19. Chemical kinetic and photochemical data for use in stratospheric modeling evaluation number 4: NASA panel for data evaluation

    Science.gov (United States)

    1981-01-01

    Evaluated sets of rate constants and photochemical cross sections compiled by the Panel are presented. The primary application of the data is in the modelling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  20. Evaluation of thematic mapper data for natural resource assessment

    Science.gov (United States)

    Haas, R.H.; Waltz, F.A.

    1983-01-01

    The U.S. Geological Survey EROS Data Center evaluated the utility of Landsat Thematic Mapper (TM) data for natural resource assessment, emphasizing manual interpretation and digital classification of the data for U.S. Department of the Interior applications. Substantially more information was derived from TM data than from Landsat Multispectral Scanner (MSS) data. Greater resolution of TM data aided in locating roads, small stock ponds, and many other land features that could be used as landmarks. The improved spatial resolution of TM data also permitted more efficient visual interpretations of land use, better identification of resource types, and improved assessment of ecological status of natural vegetation. TM data also provided a new source of spectral information that was useful for natural resource assessment. New mid-infrared spectral bands, TM band 5 and band 7, aided in distinguishing water resources, wetland vegetation resources, and other important terrain features. The added information was useful for both manual interpretation and digital data classification of vegetation resources and land features. Results from the analyses of both TM and TM simulator (TMS) spectral data suggest that the coefficient of variation for major land cover types is generally less for TM data than for MSS data taken from the same area. This reduction in variance should contribute to an improved multispectral analysis, contributing new information about vegetation in natural ecosystems. Although the amount of new information in TM bands 5 and 7 is small, it is unique in that the same information cannot be derived from four-band Landsat MSS spectral data.

  1. Item Selection, Evaluation, and Simple Structure in Personality Data.

    Science.gov (United States)

    Pettersson, Erik; Turkheimer, Eric

    2010-08-01

    We report an investigation of the genesis and interpretation of simple structure in personality data using two very different self-reported data sets. The first consists of a set of relatively unselected lexical descriptors, whereas the second is based on responses to a carefully constructed instrument. In both data sets, we explore the degree of simple structure by comparing factor solutions to solutions from simulated data constructed to have either strong or weak simple structure. The analysis demonstrates that there is little evidence of simple structure in the unselected items, and a moderate degree among the selected items. In both instruments, however, much of the simple structure that could be observed originated in a strong dimension of positive vs. negative evaluation.

  2. General practice ethnicity data: evaluation of a tool

    Directory of Open Access Journals (Sweden)

    Neuwelt P

    2014-03-01

    Full Text Available INTRODUCTION: There is evidence that the collection of ethnicity data in New Zealand primary care is variable and that data recording in practices does not always align with the procedures outlined in the Ethnicity Data Protocols for the Health and Disability Sector. In 2010, the Ministry of Health funded the development of a tool to audit the collection of ethnicity data in primary care. The aim of this study was to pilot the Ethnicity Data Audit Tool (EAT) in general practice. The goal was to evaluate the tool and identify recommendations for its improvement. METHODS: Eight general practices in the Waitemata District Health Board region participated in the EAT pilot. Feedback about the pilot process was gathered by questionnaires and interviews, to gain an understanding of practices’ experiences in using the tool. Questionnaire and interview data were analysed using a simple analytical framework and a general inductive method. FINDINGS: General practice receptionists, practice managers and general practitioners participated in the pilot. Participants found the pilot process challenging but enlightening. The majority felt that the EAT was a useful quality improvement tool for handling patient ethnicity data. Larger practices were the most positive about the tool. CONCLUSION: The findings suggest that, with minor improvements to the toolkit, the EAT has the potential to lead to significant improvements in the quality of ethnicity data collection and recording in New Zealand general practices. Other system-level factors also need to be addressed.

  3. A conceptual framework for evaluating data suitability for observational studies.

    Science.gov (United States)

    Shang, Ning; Weng, Chunhua; Hripcsak, George

    2017-09-08

    To contribute a conceptual framework for evaluating data suitability to satisfy the research needs of observational studies. Suitability considerations were derived from a systematic literature review on researchers' common data needs in observational studies and a scoping review on frequent clinical database design considerations, and were harmonized to construct a suitability conceptual framework using a bottom-up approach. The relationships among the suitability categories are explored from the perspective of 4 facets of data: intrinsic, contextual, representational, and accessible. A web-based national survey of domain experts was conducted to validate the framework. Data suitability for observational studies hinges on the following key categories: Explicitness of Policy and Data Governance, Relevance, Availability of Descriptive Metadata and Provenance Documentation, Usability, and Quality. We describe 16 measures and 33 sub-measures. The survey uncovered the relevance of all categories, with a 5-point Likert importance score of 3.9 ± 1.0 for Explicitness of Policy and Data Governance, 4.1 ± 1.0 for Relevance, 3.9 ± 0.9 for Availability of Descriptive Metadata and Provenance Documentation, 4.2 ± 1.0 for Usability, and 4.0 ± 0.9 for Quality. The suitability framework evaluates a clinical data source's fitness for research use. Its construction reflects both researchers' points of view and data custodians' design features. The feedback from domain experts rated Usability, Relevance, and Quality categories as the most important considerations.

  4. Data Collection Guidelines for Consistent Evaluation of Data from Verification and Monitoring Safeguard Systems

    Energy Technology Data Exchange (ETDEWEB)

    Castleberry, K.; Lenarduzzi, R.; Whitaker, M.

    1999-09-20

    One of the several activities the International Atomic Energy Agency (IAEA) inspectors perform in the verification process of Safeguard operations is the review and correlation of data from different sources. This process is often complex due to the different forms in which the data is presented. This paper describes some of the elements that are necessary to create a "standardized" structure for the verification of data. When properly collected and formatted, data can be analyzed with off-the-shelf software applications using customized macros to automate the commands for the desired analysis. The standardized-data collection methodology is based on instrumentation guidelines as well as data structure elements, such as verifiable timing of data entry, automated data logging, identification codes, and others. The identification codes are used to associate data items with their sources and to correlate them with items from other data logging activities. The addition of predefined parameter ranges allows automated evaluation with the capability to provide a data summary, a cross-index of all data related to a specific event. Instances of actual databases are used as examples. The data collection guidelines described in this paper facilitate the use of data from a variety of instrumentation platforms and also allow the instrumentation itself to be more easily applied in subsequent monitoring applications.

  5. Evaluation of data quality in Japanese National Forest Inventory.

    Science.gov (United States)

    Kitahara, Fumiaki; Mizoue, Nobuya; Yoshida, Shigejiro

    2009-12-01

    We evaluated the quality of data being collected for the Japanese National Forest Inventory (NFI). The inventory program commenced in 1999 but has not incorporated a quality assurance (QA) program; we sought to determine what effect this was having on the quality of data being collected. Forty-eight plots in four prefectures were measured by operational field teams and then remeasured by a control team that made careful and unhurried measurements. The paired data were evaluated, including diameter, total height, tree count, species richness, and topographic condition. Compared to the control team, all field teams of each prefecture tended to significantly underestimate all of the continuous variables. Most variables had larger variability in the inventory data than has been reported in the published literature. The findings of consistent bias and large variation in the field team measurements call for urgent implementation of a quality assurance program (extensive field training and regular remeasurement) in the Japanese NFI to improve data quality, and this conclusion could be applied to the inventory system of any country that does not include a QA program.

  6. Evaluating de Bruijn graph assemblers on 454 transcriptomic data.

    Directory of Open Access Journals (Sweden)

    Xianwen Ren

    Full Text Available Next generation sequencing (NGS) technologies have greatly changed the landscape of transcriptomic studies of non-model organisms. Since there is no reference genome available, de novo assembly methods play key roles in the analysis of these data sets. Because of the huge amount of data generated by NGS technologies for each run, many assemblers, e.g., ABySS, Velvet and Trinity, have been developed based on de Bruijn graphs due to their time- and space-efficiency. However, most of these assemblers were developed initially for the Illumina/Solexa platform. The performance of these assemblers on 454 transcriptomic data is unknown. In this study, we evaluated and compared the relative performance of these de Bruijn graph based assemblers on both simulated and real 454 transcriptomic data. The results suggest that Trinity, the Illumina/Solexa-specialized transcriptomic assembler, performs the best among the multiple de Bruijn graph assemblers, comparable to or even outperforming the standard 454 assembler Newbler, which is based on the overlap-layout-consensus algorithm. Our evaluation is expected to provide helpful guidance for researchers to choose assemblers when analyzing 454 transcriptomic data.
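
    For readers unfamiliar with the data structure named above, the sketch below builds a bare de Bruijn graph from reads: nodes are (k-1)-mers and every k-mer contributes an edge. Real assemblers such as Velvet or Trinity add error correction and graph simplification; the reads and k are toy values.

      from collections import defaultdict

      def de_bruijn_graph(reads, k):
          """Nodes are (k-1)-mers; each k-mer adds a prefix -> suffix edge."""
          graph = defaultdict(list)
          for read in reads:
              for i in range(len(read) - k + 1):
                  kmer = read[i:i + k]
                  graph[kmer[:-1]].append(kmer[1:])
          return graph

      reads = ["ATGGCGT", "GGCGTGC", "GTGCAAT"]   # toy overlapping reads
      for prefix, suffixes in sorted(de_bruijn_graph(reads, k=4).items()):
          print(prefix, "->", ", ".join(sorted(suffixes)))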

  7. Data Evaluation for 56Co epsilon + beta+ Decay

    Energy Technology Data Exchange (ETDEWEB)

    Baglin, Coral M.; MacMahon, T. Desmond

    2005-02-28

    Recommended values for nuclear and atomic data pertaining to the ε + β⁺ decay of ⁵⁶Co are provided here, followed by comments on evaluation procedures and a summary of all available experimental data. ⁵⁶Co is a radionuclide which is potentially very useful for Ge detector efficiency calibration because it is readily produced via the ⁵⁶Fe(p,n) reaction, its half-life of 77.24 days is conveniently long, and it provides a number of relatively strong γ rays with energies up to ≈3500 keV. The transition intensities recommended here for the strongest lines will be included in the forthcoming International Atomic Energy Agency Coordinated Research Programme document "Update of X- and Gamma-ray Decay Data Standards for Detector Calibration and Other Applications", and the analysis for all transitions along with relevant atomic data have been provided to the Decay Data Evaluation Project.

  8. Automated Quality Evaluation for a More Effective Data Peer Review

    Directory of Open Access Journals (Sweden)

    A Düsterhus

    2014-06-01

    Full Text Available A peer review scheme comparable to that used in traditional scientific journals is a major element missing in bringing publications of raw data up to standards equivalent to those of traditional publications. This paper introduces a quality evaluation process designed to analyse the technical quality as well as the content of a dataset. This process is based on quality tests, the results of which are evaluated with the help of the knowledge of an expert. As a result, the quality is estimated by a single value only. Further, the paper includes an application and a critical discussion on the potential for success, the possible introduction of the process into data centres, and practical implications of the scheme.

  9. ENDF-102 DATA FORMATS AND PROCEDURES FOR THE EVALUATED NUCLEAR DATA FILE ENDF-6.

    Energy Technology Data Exchange (ETDEWEB)

    MCLANE,V.

    2001-05-15

    The Evaluated Nuclear Data File (ENDF) formats and libraries are decided by the Cross Section Evaluation Working Group (CSEWG), a cooperative effort of national laboratories, industry, and universities in the U.S. and Canada, and are maintained by the National Nuclear Data Center (NNDC). Earlier versions of the ENDF format provided representations for neutron cross sections and distributions, photon production from neutron reactions, a limited amount of charged-particle production from neutron reactions, photo-atomic interaction data, thermal neutron scattering data, and radionuclide production and decay data (including fission products). Version 6 (ENDF-6) allows higher incident energies, adds more complete descriptions of the distributions of emitted particles, and provides for incident charged particles and photonuclear data by partitioning the ENDF library into sub-libraries. Decay data, fission product yield data, thermal scattering data, and photo-atomic data have also been formally placed in sub-libraries. In addition, this rewrite represents an extensive update to the Version V manual.

  10. Analysis of the evaluated data discrepancies for minor actinides and development of improved evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ignatyuk, A. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)

    1997-03-01

    The work is directed at the compilation of experimental and evaluated data available for neutron-induced reaction cross sections on ²³⁷Np, ²⁴¹Am, ²⁴²ᵐAm and ²⁴³Am isotopes, at the analysis of the old data and renormalizations connected with changes of standards, and at the comparison of experimental data with theoretical calculations. The main results of the analysis performed so far are presented in this report. (J.P.N.)

  11. Possibilistic NDT data fusion for evaluating concrete structures

    OpenAIRE

    PLOIX, M.A.; Garnier, V.; Breysse, D.; Moysan, J.

    2009-01-01

    A new application of data fusion is presented within the context of a national research project named SENSO. The aim is to improve the evaluation of indicators or pathologies of in situ concrete structures by combining measurements from different NDE techniques (radar, electrical resistivity and capacity, infrared thermography, impact echo and ultrasounds). Every non-destructive measurement is likely to provide an estimation of the unknown indicators with a certain confidence...

  12. Evaluated nuclear structure data file. a manual for preparation of data sets

    CERN Document Server

    Tuli, J K

    2001-01-01

    This manual describes the organization and structure of the Evaluated Nuclear Structure Data File (ENSDF). This computer-based file is maintained by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory for the international Nuclear Structure and Decay Data Network. Evaluated structure information is included for every mass number; for A ≥ 44, this information is published in the Nuclear Data Sheets, while for A < 44, ENSDF is based on compilations published in the journal Nuclear Physics. The information in ENSDF is updated by mass chain or by nuclide with a varying cycle time dependent on the availability of new information.

  13. The Decay Data Evaluation Project (DDEP and the JEFF-3.3 radioactive decay data library: Combining international collaborative efforts on evaluated decay data

    Directory of Open Access Journals (Sweden)

    Kellett Mark A.

    2017-01-01

    Full Text Available The Decay Data Evaluation Project (DDEP) is an international collaboration of decay data evaluators formed with groups from France, Germany, USA, China, Romania, Russia, Spain and the UK, mainly from the metrology community. DDEP members have evaluated over 220 radionuclides, following an agreed upon methodology, including a peer review. Evaluations include all relevant parameters relating to the nuclear decay and the associated atomic processes. An important output of these evaluations is a set of recommendations for new measurements, which can serve as a basis for future measurement programmes. Recently evaluated radionuclides include: 18F, 59Fe, 82Rb, 82Sr, 88Y, 90Y, 89Zr, 94mTc, 109Cd, 133Ba, 140Ba, 140La, 151Sm and 169Er. The DDEP recommended data have recently been incorporated into the JEFF-3.3 Radioactive Decay Data Library. Other sources of nuclear data include 900 or so radionuclides converted from the Evaluated Nuclear Structure Data File (ENSDF), 500 from two UK libraries (UKPADD6.12 and UKHEDD2.6), the IAEA Actinide Decay Data Library, with the remainder converted from the NUBASE evaluation of nuclear properties. Mean decay energies for a number of radionuclides determined from total absorption gamma-ray spectroscopy (TAGS) have also been included, as well as more recent European results from TAGS measurements performed at the University of Jyväskylä by groups from the University of Valencia, Spain and SUBATECH, the University of Nantes, France. The current status of the DDEP collaboration and the JEFF Radioactive Decay Data Library will be presented.

  14. The Decay Data Evaluation Project (DDEP) and the JEFF-3.3 radioactive decay data library: Combining international collaborative efforts on evaluated decay data

    Science.gov (United States)

    Kellett, Mark A.; Bersillon, Olivier

    2017-09-01

    The Decay Data Evaluation Project (DDEP) is an international collaboration of decay data evaluators formed with groups from France, Germany, USA, China, Romania, Russia, Spain and the UK, mainly from the metrology community. DDEP members have evaluated over 220 radionuclides, following an agreed upon methodology, including a peer review. Evaluations include all relevant parameters relating to the nuclear decay and the associated atomic processes. An important output of these evaluations is a set of recommendations for new measurements, which can serve as a basis for future measurement programmes. Recently evaluated radionuclides include: 18F, 59Fe, 82Rb, 82Sr, 88Y, 90Y, 89Zr, 94mTc, 109Cd, 133Ba, 140Ba, 140La, 151Sm and 169Er. The DDEP recommended data have recently been incorporated into the JEFF-3.3 Radioactive Decay Data Library. Other sources of nuclear data include 900 or so radionuclides converted from the Evaluated Nuclear Structure Data File (ENSDF), 500 from two UK libraries (UKPADD6.12 and UKHEDD2.6), the IAEA Actinide Decay Data Library, with the remainder converted from the NUBASE evaluation of nuclear properties. Mean decay energies for a number of radionuclides determined from total absorption gamma-ray spectroscopy (TAGS) have also been included, as well as more recent European results from TAGS measurements performed at the University of Jyväskylä by groups from the University of Valencia, Spain and SUBATECH, the University of Nantes, France. The current status of the DDEP collaboration and the JEFF Radioactive Decay Data Library will be presented. Note to the reader: the pdf file has been changed on September 22, 2017.

  15. Evaluation of the Global Land Data Assimilation System (GLDAS) air temperature data products

    Science.gov (United States)

    Ji, Lei; Senay, Gabriel B.; Verdin, James P.

    2015-01-01

    There is a high demand for agrohydrologic models to use gridded near-surface air temperature data as the model input for estimating regional and global water budgets and cycles. The Global Land Data Assimilation System (GLDAS) developed by combining simulation models with observations provides a long-term gridded meteorological dataset at the global scale. However, the GLDAS air temperature products have not been comprehensively evaluated, although the accuracy of the products was assessed in limited areas. In this study, the daily 0.25° resolution GLDAS air temperature data are compared with two reference datasets: 1) 1-km-resolution gridded Daymet data (2002 and 2010) for the conterminous United States and 2) global meteorological observations (2000–11) archived from the Global Historical Climatology Network (GHCN). The comparison of the GLDAS datasets with the GHCN datasets, including 13 511 weather stations, indicates a fairly high accuracy of the GLDAS data for daily temperature. The quality of the GLDAS air temperature data, however, is not always consistent in different regions of the world; for example, some areas in Africa and South America show relatively low accuracy. Spatial and temporal analyses reveal a high agreement between GLDAS and Daymet daily air temperature datasets, although spatial details in high mountainous areas are not sufficiently estimated by the GLDAS data. The evaluation of the GLDAS data demonstrates that the air temperature estimates are generally accurate, but caution should be taken when the data are used in mountainous areas or places with sparse weather stations.

  16. A program evaluation of classroom data collection with bar codes.

    Science.gov (United States)

    Saunders, M D; Saunders, J L; Saunders, R R

    1993-01-01

    A technology incorporating bar code symbols and hand-held optical scanners was evaluated for its utility for routine data collection in a special education classroom. A different bar code symbol was created for each Individualized Educational Plan objective, each type of response occurrence, and each student in the first author's classroom. These symbols were organized by activity and printed as data sheets. The teacher and paraprofessionals scanned relevant codes with scanners when the students emitted targeted behaviors. The codes, dates, and approximate times of the scans were retained in the scanner's electronic memory until they could be transferred by communication software to a computer file. The data from the computer file were organized weekly into a printed report of student performance using a program written with commercially available database software. Advantages, disadvantages, and costs of using the system are discussed.

  17. Evaluating strategies to normalise biological replicates of Western blot data.

    Science.gov (United States)

    Degasperi, Andrea; Birtwistle, Marc R; Volinsky, Natalia; Rauch, Jens; Kolch, Walter; Kholodenko, Boris N

    2014-01-01

    Western blot data are widely used in quantitative applications such as statistical testing and mathematical modelling. To ensure accurate quantitation and comparability between experiments, Western blot replicates must be normalised, but it is unclear how the available methods affect statistical properties of the data. Here we evaluate three commonly used normalisation strategies: (i) by fixed normalisation point or control; (ii) by sum of all data points in a replicate; and (iii) by optimal alignment of the replicates. We consider how these different strategies affect the coefficient of variation (CV) and the results of hypothesis testing with the normalised data. Normalisation by fixed point tends to increase the mean CV of normalised data in a manner that naturally depends on the choice of the normalisation point. Thus, in the context of hypothesis testing, normalisation by fixed point reduces false positives and increases false negatives. Analysis of published experimental data shows that choosing normalisation points with low quantified intensities results in a high normalised data CV and should thus be avoided. Normalisation by sum or by optimal alignment redistributes the raw data uncertainty in a mean-dependent manner, reducing the CV of high intensity points and increasing the CV of low intensity points. This causes the effect of normalisations by sum or optimal alignment on hypothesis testing to depend on the mean of the data tested; for high intensity points, false positives are increased and false negatives are decreased, while for low intensity points, false positives are decreased and false negatives are increased. These results will aid users of Western blotting to choose a suitable normalisation strategy and also understand the implications of this normalisation for subsequent hypothesis testing.
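
    The effect of the first two normalisation strategies on the coefficient of variation can be reproduced in miniature: the sketch below normalises invented band intensities by a fixed point and by the replicate sum, then compares the mean CV across replicates. It is a schematic of the strategies discussed, not the authors' analysis code.

      import statistics

      def cv(values):
          """Coefficient of variation (stdev/mean)."""
          return statistics.stdev(values) / statistics.mean(values)

      # Invented band intensities: 3 replicates x 4 points (e.g., a time course).
      replicates = [
          [100.0, 220.0, 310.0, 150.0],
          [ 90.0, 260.0, 280.0, 170.0],
          [120.0, 210.0, 330.0, 140.0],
      ]

      def by_fixed_point(rep, idx=0):
          """Strategy (i): divide by one chosen point (e.g., a control)."""
          return [v / rep[idx] for v in rep]

      def by_sum(rep):
          """Strategy (ii): divide by the replicate's total signal."""
          return [v / sum(rep) for v in rep]

      for name, fn in [("fixed point", by_fixed_point), ("sum", by_sum)]:
          normed = [fn(rep) for rep in replicates]
          point_cvs = [cv([rep[i] for rep in normed]) for i in range(4)]
          print(f"{name:>11}: mean CV = {statistics.mean(point_cvs):.3f}")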

  18. Privacy of fingermarks data in forensic science: forensic evaluation and individual data protection

    NARCIS (Netherlands)

    Baartmans, Chloë; Meuwly, Didier; Kosta, Eline

    2014-01-01

    Expert-based methods are used from the beginning of the 20th century for forensic evaluation of fingermarks (trace specimens) and fingerprints (reference specimens). Currently semi-automatic systems using biometric data, biometric technology and statistical models are developed to support the

  19. A Category Based Threat Evaluation Model Using Platform Kinematics Data

    Directory of Open Access Journals (Sweden)

    Mustafa Çöçelli

    2017-08-01

    Full Text Available Command and control (C2) systems direct operators to make accurate decisions as early as possible in the stressful atmosphere of the battlefield. Powerful tools fuse various instantaneous pieces of information and present a summary to operators. Threat evaluation is one of the important fusion methods that provide this assistance to military personnel. However, C2 systems can be deprived of valuable data sources due to the absence of capable equipment, which has an unfavorable influence on the quality of the tactical picture in front of C2 operators. In this paper, we study a threat evaluation model that takes these deficiencies into account. Our method extracts the threat level of various targets mostly from their kinematics in two-dimensional space, in situations where classification of entities around the battlefield is unavailable and only the category of targets, i.e., whether entities belong to the air or surface environment, is determined from sensor processing. The threat evaluation model thus consists of three fundamental steps that run separately on entities belonging to different environments: the extraction of threat assessment cues, threat selection based on Bayesian inference, and the calculation of a threat assessment rating. We have evaluated the performance of the proposed model by simulating a set of synthetic scenarios.
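
    The Bayesian inference step named above can be sketched as a sequential update of a threat probability from kinematic cues. All priors, likelihoods and cue names below are invented placeholders for the paper's actual threat assessment cues.

      def bayes_threat_update(prior, likelihoods, cues):
          """Sequentially update P(threat) from observed cues, treating the
          cues as conditionally independent (naive Bayes)."""
          p = prior
          for cue in cues:
              l_t, l_nt = likelihoods[cue]   # P(cue|threat), P(cue|no threat)
              p = l_t * p / (l_t * p + l_nt * (1.0 - p))
          return p

      likelihoods = {
          "closing_course": (0.8, 0.3),   # heading toward the defended asset
          "high_speed":     (0.7, 0.4),
          "low_altitude":   (0.6, 0.2),   # meaningful for the air category only
      }

      # Air and surface tracks are processed separately, as in the model above.
      air_track = ["closing_course", "high_speed", "low_altitude"]
      surface_track = ["closing_course"]
      print(f"air track:     {bayes_threat_update(0.1, likelihoods, air_track):.2f}")
      print(f"surface track: {bayes_threat_update(0.1, likelihoods, surface_track):.2f}")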

  20. Evaluation of aileron actuator reliability with censored data

    Directory of Open Access Journals (Sweden)

    Li Huaiyuan

    2015-08-01

    Full Text Available For the purpose of enhancing the reliability of the aileron of the Airbus new-generation A350XWB, an evaluation of aileron reliability on the basis of maintenance data is presented in this paper. Practical maintenance data contain a large number of censored samples, whose information uncertainty makes it hard to evaluate the reliability of the aileron actuator. Considering that the true lifetime of a censored sample has an identical distribution to that of complete samples, if a censored sample is transformed into a complete sample, the conversion frequency of the censored sample can be estimated according to the frequency of complete samples. On the one hand, standard life table estimation and the product limit method are improved on the basis of such conversion frequencies, enabling accurate estimation for various censored samples. On the other hand, by taking such frequencies as weight factors and integrating the variance of order statistics under a standard distribution, a weighted least squares estimate is formed for accurately estimating various censored samples. Extensive experiments and simulations show that the reliabilities from the improved life table and the improved product limit method are closer to the true value and more conservative; moreover, the weighted least squares estimate (WLSE), with the conversion frequency of censored samples and the variances of order statistics as the weights, can still estimate accurately with a high proportion of censored data in the samples. The algorithm in this paper performs well and can accurately estimate the reliability of the aileron actuator even with small samples and high censoring rates. This research has certain significance in theory and engineering practice.
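
    The product limit (Kaplan-Meier) estimator that the paper improves upon handles censored records as follows: at each observed failure the survival estimate is multiplied by (n-1)/n for the n units still at risk, while censored units simply leave the risk set. The sketch below is the textbook estimator with invented flight-hour data, not the paper's improved variant.

      def kaplan_meier(samples):
          """Product limit survival estimate. samples: (time, observed)
          pairs; observed=False marks a right-censored record."""
          at_risk = len(samples)
          survival, s = [], 1.0
          for time, observed in sorted(samples):
              if observed:                   # an actual failure at this time
                  s *= (at_risk - 1) / at_risk
                  survival.append((time, s))
              at_risk -= 1                   # censored units leave the risk set
          return survival

      # Invented lifetimes in flight hours; False marks censored records.
      data = [(120, True), (150, False), (200, True), (250, True),
              (300, False), (340, True), (400, False)]
      for t, s in kaplan_meier(data):
          print(f"t = {t:4d} h   S(t) = {s:.3f}")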

  1. Lifetime-Aware Cloud Data Centers: Models and Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Luca Chiaraviglio

    2016-06-01

    Full Text Available We present a model to evaluate the server lifetime in cloud data centers (DCs). In particular, when the server power level is decreased, the failure rate tends to be reduced as a consequence of the limited number of components powered on. However, the variation between the different power states triggers a failure rate increase. We therefore consider these two effects in a server lifetime model, subject to an energy-aware management policy. We then evaluate our model in a realistic case study. Our results show that the impact on the server lifetime is far from negligible. As a consequence, we argue that a lifetime-aware approach should be pursued to decide how and when to apply a power state change to a server.

  2. Evaluation and Monitoring of Jpss Land Surface Temperature Data

    Science.gov (United States)

    Yu, Y.; Yu, P.; Liu, Y.; Csiszar, I. A.

    2016-12-01

    Land Surface Temperature (LST) is one of the environmental data records (EDRs) produced operationally through the U.S. Joint Polar Satellite System (JPSS) mission. LST is an important parameter for understanding climate change and modeling the hydrological and biogeochemical cycles, and is a prime candidate for Numerical Weather Prediction (NWP) assimilation models. Recently, the International LST and Emissivity Working Group (ILSTE-WG) has been promoting the inclusion of LST as an essential climate variable (ECV) in the Global Climate Observation System (GCOS) of the World Meteorological Organization (WMO). At the Center for Satellite Applications and Research (STAR) of the National Oceanic and Atmospheric Administration (NOAA), we, as a science team, are responsible for the science of JPSS LST production. In this work, we present our activities and accomplishments in JPSS LST evaluation and monitoring since the launch of the first JPSS satellite, S-NPP. Beta, provisional, and validated stage 1 versions of the S-NPP LST products were announced in May 2013, July 2014, and March 2015, respectively. Evaluation of the LST products has been performed against ground measurements and other polar-orbiting satellite LST data (e.g., MODIS LSTs); some results will be illustrated. A daily monitoring system for JPSS LST production has been developed, which presents daily, weekly and monthly global LST maps and inter-comparison results on the STAR JPSS program website. Further, an evaluation of the enterprise LST algorithm for the JPSS mission, currently in development at STAR, is presented in this work. Finally, the evaluation and monitoring plan for LST production from the JPSS-1 satellite is also presented.

  3. Building test data from real outbreaks for evaluating detection algorithms.

    Directory of Open Access Journals (Sweden)

    Gaetan Texier

    Full Text Available Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method-ITSM, Metropolis-Hastings Random Walk, Metropolis-Hastings Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and of the simulation parameters (i.e., days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e., overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within a range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals.
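
    Of the resampling processes listed, inverse transform sampling (ITSM) is the most direct to sketch: build a cumulative distribution over days from the historical curve, draw case times from it, and apply a homothetic stretch of the duration. The curve and parameters below are invented.

      import random

      def simulate_outbreak(historical_curve, n_cases, scale_days=1.0, rng=None):
          """Resample a tailored outbreak signal from a historical epidemic
          curve via inverse transform sampling, with a homothetic stretch of
          the duration (scale_days). All inputs are illustrative."""
          rng = rng or random.Random()
          total = sum(historical_curve)
          cdf, acc = [], 0.0
          for count in historical_curve:          # cumulative day distribution
              acc += count / total
              cdf.append(acc)
          cdf[-1] = 1.0                           # guard against round-off
          n_days = int(round(len(historical_curve) * scale_days))
          simulated = [0] * n_days
          for _ in range(n_cases):
              u = rng.random()
              day = next(i for i, c in enumerate(cdf) if u <= c)
              simulated[min(int(day * scale_days), n_days - 1)] += 1
          return simulated

      historical = [1, 3, 8, 15, 22, 18, 10, 5, 2, 1]   # invented outbreak shape
      print(simulate_outbreak(historical, n_cases=60, scale_days=1.5,
                              rng=random.Random(1)))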

  4. Noise Hazard Evaluation Sound Level Data on Noise Sources

    Science.gov (United States)

    1975-01-01

    [Abstract unavailable: the record consists of OCR residue from the source report's sound level data tables and documentation page. Recoverable content includes sound level entries in the 80-90 dB range for Ski-Doo snowmobile models (300, Alpine 399 ER, Elan, Nordic 375) and a fragment noting that the threshold criterion applies when the daily exposure is composed of two or more periods of noise exposure.]

  5. Evaluating Frequency Quality of Nordic System using PMU data

    DEFF Research Database (Denmark)

    Xu, Zhao; Østergaard, Jacob; Togeby, Mikael

    2008-01-01

    This paper focuses on analysing the frequency quality of the Nordic power system using measurements from Phasor Measurement Units (PMUs). PMU data covering a one-year period are used, which have very high time resolution (20 ms per sample) and can provide detailed information for evaluating frequency quality and its correlation with time. The results show that the frequency quality of the Nordic power system is not satisfactory according to the suggested requirements. Electricity market operation is found to be one of the major reasons behind this. Based on the results, a discussion of frequency control...

  6. Evaluation of Electronic Medical Record Administrative data Linked Database (EMRALD).

    Science.gov (United States)

    Tu, Karen; Mitiku, Tezeta F; Ivers, Noah M; Guo, Helen; Lu, Hong; Jaakkimainen, Liisa; Kavanagh, Doug G; Lee, Douglas S; Tu, Jack V

    2014-01-01

    Primary care electronic medical records (EMRs) represent a potentially rich source of information for research and evaluation. To assess the completeness of primary care EMR data compared with administrative data. Retrospective comparison of provincial health-related administrative databases and patient records for more than 50,000 patients of 54 physicians in 15 geographically distinct clinics in Ontario, Canada, contained in the Electronic Medical Record Administrative data Linked Database (EMRALD). Physician billings, laboratory tests, medications, specialist consultation letters, and hospital discharges captured in EMRALD were compared with health-related administrative data in a universal access healthcare system. The mean (standard deviation [SD]) percentage of clinic primary care outpatient visits captured in EMRALD compared with administrative data was 94.4% (4.88%). Consultation letters from specialists for first consultations and for hospital discharges were captured at a mean (SD) rate of 72.7% (7.98%) and 58.5% (15.24%), respectively, within 30 days of the occurrence. The mean (SD) capture within EMRALD of the most common laboratory tests billed and the most common drugs dispensed was 67.3% (21.46%) and 68.2% (8.32%), respectively, for all clinics. We found reasonable capture of information within the EMR compared with administrative data, with the advantage in the EMR of having actual laboratory results, prescriptions for patients of all ages, and detailed clinical information. However, the combination of complete EMR records and administrative data is needed to provide a full comprehensive picture of patient health histories and processes, and outcomes of care.

  7. Sensitivity analysis of critical experiments with evaluated nuclear data libraries

    Energy Technology Data Exchange (ETDEWEB)

    Fujiwara, D.; Kosaka, S. [Tepco Systems Corporation, Nuclear Engineering Dept., Tokyo (Japan)

    2008-07-01

    Criticality benchmark testing was performed with evaluated nuclear data libraries for thermal, low-enriched uranium fuel rod applications. C/E values for k-eff were calculated with the continuous-energy Monte Carlo code MVP2 and its libraries generated from ENDF/B-VI.8, ENDF/B-VII.0, JENDL-3.3 and JEFF-3.1. Subsequently, the observed k-eff discrepancies between libraries were decomposed to specify the source of the differences in the nuclear data libraries using a sensitivity analysis technique. The obtained sensitivity profiles are also utilized to estimate the adequacy of cold critical experiments to the boiling water reactor under hot operating conditions. (authors)

  8. Evaluation of Crops Moisture Provision by Space Remote Sensing Data

    Science.gov (United States)

    Ilienko, Tetiana

    2016-08-01

    The article focuses on the theoretical and experimental rationale for the use of space data to determine the moisture provision of agricultural landscapes and agricultural plants. The improvement of space remote sensing methods to evaluate plant moisture availability is the aim of this research. The possibility of replacing high-spatial-resolution satellite imagery with freely available medium-spatial-resolution imagery to determine crop moisture content at the local level was demonstrated. Mathematical models to determine the moisture content of winter wheat plants from spectral indices were developed based on the results of experimental field research and satellite (Landsat, MODIS/Terra, RapidEye, SICH-2) data. Maps of the moisture content in winter wheat plants at test sites were constructed from the obtained models using modern GIS technology.
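
    The spectral indices referred to above are typically of the normalised-difference family. As a generic, assumed example (the paper's own regression models are not reproduced), the sketch below computes a moisture-sensitive index from NIR and SWIR reflectances, such as Landsat 8 bands 5 and 6; the pixel values are invented.

      def ndmi(nir, swir):
          """Normalized Difference Moisture Index from NIR and SWIR
          reflectances; higher values indicate wetter vegetation."""
          return (nir - swir) / (nir + swir)

      # Invented surface reflectances for three winter wheat pixels.
      pixels = [(0.42, 0.21), (0.35, 0.28), (0.30, 0.33)]
      for nir, swir in pixels:
          print(f"NIR={nir:.2f}  SWIR={swir:.2f}  ->  NDMI={ndmi(nir, swir):+.3f}")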

  9. Bayesian Monte Carlo method for nuclear data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Koning, A.J. [Nuclear Research and Consultancy Group NRG, P.O. Box 25, ZG Petten (Netherlands)

    2015-12-15

    A Bayesian Monte Carlo method is outlined which allows a systematic evaluation of nuclear reactions using the nuclear model code TALYS and the experimental nuclear reaction database EXFOR. The method is applied to all nuclides at the same time. First, the global predictive power of TALYS is numerically assessed, which enables to set the prior space of nuclear model solutions. Next, the method gradually zooms in on particular experimental data per nuclide, until for each specific target nuclide its existing experimental data can be used for weighted Monte Carlo sampling. To connect to the various different schools of uncertainty propagation in applied nuclear science, the result will be either an EXFOR-weighted covariance matrix or a collection of random files, each accompanied by the EXFOR-based weight. (orig.)

  10. Bayesian Monte Carlo method for nuclear data evaluation

    Science.gov (United States)

    Koning, A. J.

    2015-12-01

    A Bayesian Monte Carlo method is outlined which allows a systematic evaluation of nuclear reactions using the nuclear model code TALYS and the experimental nuclear reaction database EXFOR. The method is applied to all nuclides at the same time. First, the global predictive power of TALYS is numerically assessed, which enables to set the prior space of nuclear model solutions. Next, the method gradually zooms in on particular experimental data per nuclide, until for each specific target nuclide its existing experimental data can be used for weighted Monte Carlo sampling. To connect to the various different schools of uncertainty propagation in applied nuclear science, the result will be either an EXFOR-weighted covariance matrix or a collection of random files, each accompanied by the EXFOR-based weight.
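
    A minimal sketch of the Bayesian Monte Carlo scheme described in the two records above: sample model parameters from a prior space, score each sample against experimental points through chi-square, and weight by exp(-chi2/2). The two-parameter "model" and the data points below stand in for TALYS and EXFOR and are invented.

      import math
      import random

      def chi_square(curve, experiments):
          """Fit of one sampled model calculation to EXFOR-style points;
          experiments is a list of (energy_index, value, sigma) triples."""
          return sum(((curve[i] - y) / s) ** 2 for i, y, s in experiments)

      def model(params):
          """Toy stand-in for a TALYS calculation: a cross-section curve
          from two sampled parameters (shape is arbitrary)."""
          a, b = params
          return [a * math.exp(-b * e) for e in range(10)]

      experiments = [(1, 4.1, 0.3), (4, 1.9, 0.2), (8, 0.7, 0.1)]  # invented

      # 1) sample the prior parameter space; 2) weight each sample by the
      # likelihood exp(-chi2/2) derived from the experimental data.
      rng = random.Random(0)
      samples = [(rng.uniform(3.0, 7.0), rng.uniform(0.1, 0.4))
                 for _ in range(5000)]
      weights = [math.exp(-0.5 * chi_square(model(p), experiments))
                 for p in samples]

      total = sum(weights)
      post_a = sum(w * p[0] for w, p in zip(weights, samples)) / total
      post_b = sum(w * p[1] for w, p in zip(weights, samples)) / total
      print(f"posterior means: a = {post_a:.3f}, b = {post_b:.3f}")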

  11. Evaluation of Empirical Data and Modeling Studies to Support ...

    Science.gov (United States)

    This study is an evaluation of empirical data and select modeling studies of the behavior of petroleum hydrocarbon (PHC) vapors in subsurface soils and how they can affect subsurface-to-indoor air vapor intrusion (VI), henceforth referred to as petroleum vapor intrusion or "PVI" for short. The purpose of this study is to support the development of a soil vapor screening methodology for PHC compounds for the U.S. Environmental Protection Agency's Office of Underground Storage Tanks (U.S. EPA OUST); consequently, the focus is primarily on characterizing PVI at Subtitle I underground storage tank (UST) sites with petroleum fuel releases. However, PVI data from other types of sites (fuel terminals, petroleum refineries) are also presented and discussed.

  12. Upgrading TDPAC Data Acquisition and Evaluation software Interlude

    CERN Document Server

    Laulainen, Joonatan Eemi

    2017-01-01

    In Time-Differential Perturbed Angular Correlation (TDPAC) spectroscopy, local hyperfine fields are probed through the observation of perturbed decay time spectra of an intermediate excited state in a γ-ray cascade. At the ISOLDE Materials and Life Science groups, data acquisition and evaluation are done using the program Interlude. However, Interlude had numerous outstanding bugs and issues, which have now been removed. During the construction of the observable function from the raw data, a systematic bias was identified in the fitting algorithm and significantly reduced through the addition of a more complicated fitting function. Steps were taken towards generalising the program for a variety of end users and input file types, with the aim of sharing the program among users and publishing a paper on it in early 2018.

  13. New Methods for Air Quality Model Evaluation with Satellite Data

    Science.gov (United States)

    Holloway, T.; Harkey, M.

    2015-12-01

    Despite major advances in the ability of satellites to detect gases and aerosols in the atmosphere, there remains significant, untapped potential to apply space-based data to air quality regulatory applications. Here, we showcase research findings geared toward increasing the relevance of satellite data to support operational air quality management, focused on model evaluation. Particular emphasis is given to nitrogen dioxide (NO2) and formaldehyde (HCHO) from the Ozone Monitoring Instrument aboard the NASA Aura satellite, and evaluation of simulations from the EPA Community Multiscale Air Quality (CMAQ) model. This work is part of the NASA Air Quality Applied Sciences Team (AQAST), and is motivated by ongoing dialog with state and federal air quality management agencies. We present the response of satellite-derived NO2 to meteorological conditions, satellite-derived HCHO:NO2 ratios as an indicator of ozone production regime, and the ability of models to capture these sensitivities over the continental U.S. In the case of NO2-weather sensitivities, we find boundary layer height, wind speed, temperature, and relative humidity to be the most important variables in determining near-surface NO2 variability. CMAQ agreed with relationships observed in satellite data, as well as in ground-based data, over most regions. However, we find that the southwest U.S. is a problem area for CMAQ, where modeled NO2 responses to insolation, boundary layer height, and other variables are at odds with the observations. Our analyses utilize software developed by our team, the Wisconsin Horizontal Interpolation Program for Satellites (WHIPS): a free, open-source program designed to make satellite-derived air quality data more usable. WHIPS interpolates level 2 satellite retrievals onto a user-defined fixed grid, in effect creating a custom-gridded level 3 satellite product. Currently, WHIPS can process the following data products: OMI NO2 (NASA retrieval); OMI NO2 (KNMI retrieval); OMI

  14. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-10-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the proposed method. A large number of studies have used DEA as a benchmarking tool to measure service quality, but these models do not propose a coherent performance evaluation construct and consequently fail to deliver guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  15. Evaluating Temporal Analysis Methods Using Residential Burglary Data

    Directory of Open Access Journals (Sweden)

    Martin Boldt

    2016-08-01

    Law enforcement agencies, as well as researchers, rely on temporal analysis methods in many crime analyses, e.g., spatio-temporal analyses. A number of temporal analysis methods are in use, but a structured comparison in different configurations has yet to be done. This study aims to fill this research gap by comparing the accuracy of five existing, and one novel, temporal analysis methods in approximating offense times for residential burglaries, which often lack precise time information. The temporal analysis methods are evaluated in eight different configurations with varying temporal resolution, as well as varying amounts of data (number of crimes) available during analysis. A dataset of all Swedish residential burglaries reported between 2010 and 2014 is used (N = 103,029). From that dataset, a subset of burglaries with known precise offense times is used for evaluation. The accuracy of the temporal analysis methods in approximating the distribution of burglaries with known precise offense times is investigated. The aoristic and the novel aoristic ext methods perform significantly better than three of the traditional methods. Experiments show that the novel aoristic ext method was most suitable for estimating crime frequencies at the day-of-the-year temporal resolution when reduced numbers of crimes were available during analysis. In the other configurations investigated, the aoristic method showed the best results. The results also show the potential of temporal analysis methods for approximating the temporal distributions of residential burglaries in situations where limited data are available.
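
    The aoristic technique mentioned above can be stated compactly: each offense whose exact time is unknown contributes a unit weight spread uniformly over its possible offense window. A minimal Python sketch of the hour-of-day variant follows (field names and data are illustrative; the Swedish dataset itself is not reproduced here):

      from collections import defaultdict

      def aoristic_hourly(crimes):
          """Each crime is a (start_hour, end_hour) window of possible
          offense times; spread its unit weight uniformly over the window
          and accumulate per hour of day (modulo 24 handles windows that
          wrap past midnight)."""
          weights = defaultdict(float)
          for start, end in crimes:
              n_hours = end - start + 1
              for h in range(start, end + 1):
                  weights[h % 24] += 1.0 / n_hours
          return dict(weights)

      # A burglary known only to have occurred between 08:00 and 17:00
      print(aoristic_hourly([(8, 17)]))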

  16. Evaluation of the 125Sb nuclear decay data

    Science.gov (United States)

    Lile, Liu; Xiaolong, Huang; Mengxiao, Kang; Guochang, Chen; Jimin, Wang; Liyang, Jiang

    2016-02-01

    Evaluation of the complete decay scheme and data for 125Sb/125mTe decay to nuclear levels in 125Te is presented in this report; literature data available up to June 2015 are included. The Limitation of Relative Statistical Weight Method (LRSW) was applied to average numbers throughout the evaluation. The uncertainty assigned to the average value was always greater than or equal to the smallest uncertainty of the values used to calculate the average. The half-life has been determined to be 1007.4±0.3 days. All known measured γ-ray absolute or relative intensities have been examined; the γ-ray emission probability of the reference γ-ray line of 427.874 keV is recommended to be 29.54±0.10%. The theoretical internal conversion coefficients and their uncertainties were used to obtain the complete decay scheme intensity balance. The values of other decay characteristics were calculated using the ENSDF analysis program. Finally the updated decay scheme for 125Sb is presented.
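
    The LRSW procedure used in this and similar decay-data evaluations is, at its core, an inverse-variance weighted mean with safeguards: no single measurement may dominate the average, and the quoted uncertainty may not fall below the smallest input uncertainty. A simplified sketch follows (the full procedure includes additional consistency checks not shown, and the input values here are illustrative, not the evaluated measurements):

      import numpy as np

      def lrsw_average(values, uncertainties, max_rel_weight=0.5):
          """Weighted mean with LRSW-style safeguards: cap any single
          relative weight (here at 50%) and floor the final uncertainty
          at the smallest input uncertainty."""
          v = np.asarray(values, dtype=float)
          u = np.asarray(uncertainties, dtype=float)
          w = 1.0 / u**2
          w = np.minimum(w, max_rel_weight * w.sum())   # cap dominant weights
          mean = np.sum(w * v) / w.sum()
          internal = np.sqrt(1.0 / w.sum())
          return mean, max(internal, u.min())           # uncertainty floor

      # Illustrative half-life measurements (days)
      print(lrsw_average([1007.3, 1007.6, 1007.2], [0.4, 0.5, 0.6]))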

  18. Methods of experimental settlement of contradicting data in evaluated nuclear data libraries

    Directory of Open Access Journals (Sweden)

    V. A. Libman

    2016-12-01

    The latest versions of the evaluated nuclear data libraries (ENDLs) contain contradictory data on neutron cross sections. To resolve these contradictions, we propose a method of experimental verification. The method is based on the use of filtered neutron beams and subsequent measurements on appropriate samples. The basic idea is to modify a suitable filtered neutron beam so that the differences between the neutron cross sections given by the different ENDLs become measurable. The method is demonstrated on cerium, for which the latest versions of four ENDLs give significantly different total neutron cross sections.

  20. Evaluation Digital Elevation Model Generated by Synthetic Aperture Radar Data

    Science.gov (United States)

    Makineci, H. B.; Karabörk, H.

    2016-06-01

    The digital elevation model (DEM), which represents the physical topography of the earth, is a three-dimensional digital model obtained from surface elevations using an appropriate interpolation method. DEMs are used in many areas, such as management of natural resources, engineering and infrastructure projects, disaster and risk analysis, archaeology, security, aviation, forestry, energy, topographic mapping, landslide and flood analysis, and Geographic Information Systems (GIS). Digital elevation models, which are a fundamental component of cartography, are calculated by many methods: in general, they can be obtained by terrestrial survey methods or by digitizing existing maps. Today, DEM data are generated by processing stereo optical satellite images, radar images (radargrammetry, interferometry) and lidar data using remote sensing and photogrammetric techniques. Radar technology, one of the fundamental components of remote sensing, is now very advanced and is used increasingly often in various fields; determining the shape of the topography and creating digital elevation models are among the foremost of these applications. This work evaluates the quality of a DEM derived from a Sentinel-1A SAR image, acquired by the European Space Agency (ESA) in Interferometric Wide Swath imaging mode at C band, against DTED-2 (Digital Terrain Elevation Data). The application uses the RMS statistic to assess the precision of the data. The results show that the variance of the checkpoints decreases markedly from mountainous areas to flat areas.
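
    The RMS comparison described above reduces to the root mean square of elevation differences between the SAR-derived DEM and the DTED-2 reference over valid cells. A minimal sketch (the array contents are toy values, not the study's data):

      import numpy as np

      def dem_rmse(dem_test, dem_ref):
          """RMSE between a test DEM and a reference DEM over cells
          where both are valid (NaN marks voids)."""
          diff = dem_test - dem_ref
          valid = ~np.isnan(diff)
          return float(np.sqrt(np.mean(diff[valid] ** 2)))

      test = np.array([[101.0, 250.5], [99.2, np.nan]])   # SAR-derived heights (m)
      ref  = np.array([[100.0, 248.0], [100.0, 300.0]])   # DTED-2 reference (m)
      print(dem_rmse(test, ref))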

  1. Evaluation of Gaussian approximations for data assimilation in reservoir models

    KAUST Repository

    Iglesias, Marco A.

    2013-07-14

    The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability at reproducing the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. Our
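
    Of the Gaussian approximations assessed above, the ensemble Kalman filter is the easiest to illustrate: the analysis update is built from ensemble statistics rather than the full posterior. The toy sketch below uses a linear observation operator and is not the authors' reservoir implementation:

      import numpy as np

      def enkf_update(ensemble, H, y, obs_cov, rng):
          """Stochastic EnKF analysis step for a linear observation
          operator H; ensemble has shape (n_members, n_params)."""
          X = ensemble
          n = X.shape[0]
          Xm = X - X.mean(axis=0)
          P = Xm.T @ Xm / (n - 1)                    # ensemble covariance
          S = H @ P @ H.T + obs_cov                  # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
          y_pert = y + rng.multivariate_normal(np.zeros(len(y)), obs_cov, n)
          return X + (y_pert - X @ H.T) @ K.T        # updated ensemble

      rng = np.random.default_rng(0)
      prior = rng.normal(0.0, 1.0, size=(200, 2))    # 200 members, 2 parameters
      H = np.array([[1.0, 0.0]])                     # observe the first parameter
      post = enkf_update(prior, H, np.array([0.8]), 0.1 * np.eye(1), rng)
      print(post.mean(axis=0))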

  2. Evaluation of ERA-Interim precipitation data in complex terrain

    Science.gov (United States)

    Gao, Lu; Bernhardt, Matthias; Schulz, Karsten

    2013-04-01

    Precipitation controls a large variety of environmental processes and is an essential input parameter for land surface models, e.g. in hydrology, ecology and climatology. However, rain gauge networks, which provide the necessary information, are commonly sparse in complex terrain, especially in high mountainous regions. Reanalysis products (e.g. ERA-40 and NCEP-NCAR) have increasingly been applied as surrogate data in the past years. Although they continue to improve, previous studies showed that these products should be objectively evaluated due to their various uncertainties. In this study, we evaluated precipitation data from ERA-Interim, the latest reanalysis product developed by ECMWF. ERA-Interim daily total precipitation is compared with the high-resolution gridded observation dataset E-OBS on 0.25°×0.25° grids for the period 1979-2010 over the central Alps (45.5-48°N, 6.25-11.5°E). Wet and dry days are defined using different threshold values (0.5 mm, 1 mm, 5 mm, 10 mm and 20 mm). The correspondence ratio (CR), the ratio of days on which precipitation occurs in both the ERA-Interim and E-OBS datasets, is applied for frequency comparison. The results show that ERA-Interim captures precipitation occurrence very well, with CR ranging from 0.80 to 0.97 for the 0.5 mm to 20 mm thresholds. However, the intensity bias increases with rising thresholds. The mean absolute error (MAE) varies between 4.5 mm day-1 and 9.5 mm day-1 on wet days for the whole area. In terms of the mean annual cycle, ERA-Interim has almost the same standard deviation of the interannual variability of daily precipitation as E-OBS, 1.0 mm day-1. Significant wet biases occur in ERA-Interim throughout the warm season (May to August) and dry biases in the cold season (November to February). The spatial distribution of mean annual daily precipitation shows that ERA-Interim significantly underestimates precipitation intensity in the high mountains and on the northern flank of the Alpine chain from November to March, while pronounced
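
    The correspondence ratio and MAE used above are simple to compute from paired daily series. In the sketch below the CR is taken as the fraction of days wet in both datasets among days wet in either (the paper's exact denominator may differ), and the arrays are toy values:

      import numpy as np

      def correspondence_ratio(era, eobs, threshold=1.0):
          """Fraction of days wet in both datasets among days wet in either."""
          wet_era, wet_obs = era >= threshold, eobs >= threshold
          either = np.sum(wet_era | wet_obs)
          return np.sum(wet_era & wet_obs) / either if either else np.nan

      def wet_day_mae(era, eobs, threshold=1.0):
          """Mean absolute error of daily totals on days wet in both."""
          wet = (era >= threshold) & (eobs >= threshold)
          return float(np.mean(np.abs(era[wet] - eobs[wet])))

      era  = np.array([0.0, 2.3, 11.0, 0.4, 6.1])    # mm/day
      eobs = np.array([0.2, 1.8,  7.5, 0.0, 5.0])
      print(correspondence_ratio(era, eobs), wet_day_mae(era, eobs))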

  3. Evaluation of bispectral LIDAR data for urban vegetation mapping

    Science.gov (United States)

    Nabucet, Jean; Hubert-Moy, Laurence; Corpetti, Thomas; Launeau, Patrick; Lague, Dimitri; Michon, Cyril; Quenol, Herve

    2016-10-01

    Because of the large increase in urban population in recent decades, the question of sustainable development in urban areas is crucial. In this context, vegetation plays a significant role in urban planning, environmental protection and sustainable development policy making, the heating and cooling requirements of buildings, animal movement and dispersion, the concentration of pollutants, and well-being. In numerous cities, vegetation mapping is limited to public areas surveyed with GPS or aerial remote sensing data. Recently, very high-resolution sensors such as Light Detection and Ranging (LiDAR) have permitted significant improvements in vegetation mapping in urban areas. This paper presents an evaluation of a new generation of airborne bispectral discrete-point LiDAR (Optech Titan) for mapping and characterizing urban vegetation. The methodology is based on a four-step approach: 1) analysis of the quality of the data in order to estimate the noise between the green and near-infrared LiDAR point clouds; 2) removal of the topographic effects; 3) a first classification, devoted to the elimination of the non-vegetation class, performed on the intensity values of the two channels; and 4) classification of the tree coverage into seven categories of strata combination, using specific descriptors related to the organization of the point clouds. These first results show that, compared to monospectral LiDAR data, bispectral LiDAR significantly improves both the extraction and the characterization of urban objects. This opens new perspectives for mapping and characterizing urban patterns and other complex structures.

  4. Evaluating Micrometeoroid and Orbital Debris Risk Assessments Using Anomaly Data

    Science.gov (United States)

    Squire, Michael

    2017-01-01

    The accuracy of micrometeoroid and orbital debris (MMOD) risk assessments can be difficult to evaluate. A team from the National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) has completed a study that compared MMOD-related failures on operational satellites to predictions of how many of those failures should occur according to NASA's MMOD risk assessment methodology and tools. The study team used the Poisson probability to quantify the degree of inconsistency between the predicted and reported numbers of failures. Many elements go into a risk assessment, and each of those elements represents a possible source of uncertainty or bias that will influence the end result. There are also challenges in obtaining accurate and useful data on MMOD-related failures.
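
    The Poisson comparison described above asks how improbable the reported failure count is, given the predicted mean number of failures. A minimal sketch (the numbers are invented, not the study's results):

      from scipy.stats import poisson

      def consistency_p(predicted_mean, observed_count):
          """Probability of a count at least as extreme as the observed
          one under Poisson(predicted_mean)."""
          if observed_count >= predicted_mean:
              return float(poisson.sf(observed_count - 1, predicted_mean))
          return float(poisson.cdf(observed_count, predicted_mean))

      # If the model predicts 2.0 failures but 7 were reported:
      print(consistency_p(2.0, 7))   # ~0.0045, i.e. the prediction looks low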

  5. Decay data evaluation project (DDEP): Updated evaluations of the {sup 233}Th and {sup 241}Am decay characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Chechev, Valery P., E-mail: chechev@khlopin.r [V.G. Khlopin Radium Institute, 28 Second Murinsky Ave., St. Petersburg 194021 (Russian Federation); Kuzmenko, Nikolay K. [V.G. Khlopin Radium Institute, 28 Second Murinsky Ave., St. Petersburg 194021 (Russian Federation)

    2010-07-15

    The results of new decay data evaluations are presented for {sup 233}Th ({beta}{sup -}) decay to nuclear levels in {sup 233}Pa and {sup 241}Am ({alpha}) decay to nuclear levels in {sup 237}Np. These evaluated data have been obtained within the Decay Data Evaluation Project using information published up to 2009.

  6. Evaluation of Algebraic Reynolds Stress Model Assumptions Using Experimental Data

    Science.gov (United States)

    Jyoti, B.; Ewing, D.; Matovic, D.

    1996-11-01

    The accuracy of Rodi's ASM assumption is examined by evaluating the terms in the Reynolds stress transport equation and their modelled counterparts. The basic model assumption, $D\tau_{ij}/Dt + \partial T_{ijl}/\partial x_l = (\tau_{ij}/k)\,(Dk/Dt + \partial T_l/\partial x_l)$ (Rodi W., ZAMM 56, pp. 219-221, 1976), can also be broken into two stronger assumptions: (1) $Da_{ij}/Dt = 0$ and (2) $\partial T_{ijl}/\partial x_l = (\tau_{ij}/k)\,\partial T_l/\partial x_l$ (e.g. Taulbee D. B., Phys. of Fluids 4(11), pp. 2555-2561, 1992). Fu et al. (Fu S., Huang P.G., Launder B.E. & Leschziner M.A., J. Fluids Eng. 110(2), pp. 216-221, 1988) examined the accuracy of Rodi's assumption using the results of RSM calculations of axisymmetric jets. Since the RSM results did not accurately predict the experimental results either, it may be useful to examine the basic ASM model assumptions using experimental data. The database of Hussein, Capp and George (J. Fluid Mech. 258, pp. 31-75, 1994) is sufficiently detailed to evaluate the terms of the Reynolds stress transport equations individually, thus allowing both Rodi's and the stronger assumptions to be tested. For this flow, assumption (1) is well satisfied for all the components (including $\overline{uv}$); however, assumption (2) does not seem as well satisfied.

  7. Evaluation of the decay data of 109Cd

    Science.gov (United States)

    Xiaolong, Huang; Shenggui, Yin; Chunsheng, Deng

    2010-09-01

    109Cd decays to the excited states of 109Ag through the electron capture decay mode. An evaluation of the complete decay scheme and data of 109Cd, including recent new measurements, is presented in this report. The limitation of relative statistical weight method (LRSW) was applied to average numbers throughout the evaluation. The uncertainty assigned to the average value was always greater than or equal to the smallest uncertainty of the values used to calculate the average. The half-life is determined to be 462.0±0.3 days. All known measured absolute emission probabilities for gamma rays and K X-rays have been examined, and the gamma-ray emission probability of the reference γ line at 88.0336 keV is recommended to be 3.65±0.3%. The internal conversion coefficients and their uncertainties were used to obtain the complete decay intensity balance. The other decay characteristics are calculated using the ENSDF analysis program. Finally, the updated 109Cd decay scheme is given.

  8. Evaluation of robust functions for data reconciliation in thermal systems

    Directory of Open Access Journals (Sweden)

    Regina Luana Santos de França

    2016-04-01

    Process variables are regularly used to control and evaluate industrial processes. Information containing gross errors may in some cases not be attenuated by data reconciliation and may change the calculation of the process balance, leading optimization results towards infeasible regions or non-optimal points. A promising alternative for reconciliation is the use of robust functions. This paper considers the above scenario and evaluates the fitness of some robust functions in solving steady-state data reconciliation problems in chemical processes represented by linear and nonlinear systems in the presence of gross errors. The traditional Cauchy, Fair, Contaminated Normal and Logistic robust functions are used in the reconciliation problem, and their estimates are compared to those obtained with more recent functions such as New Target and Alarm. Gross error magnitudes in the tests were limited to between 4 and 10σ of the measured value, defining a region of outliers. The results showed that the New Target and Alarm functions differ from the others as the magnitude of the gross error increases, tending towards the true values specified by the set point.
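
    Robust data reconciliation of the kind evaluated above amounts to minimizing a robust loss on the measurement adjustments subject to the process balance constraints. A toy sketch with the Cauchy function and a single linear mass balance follows (not the paper's code; the tuning constant and data are illustrative):

      import numpy as np
      from scipy.optimize import minimize

      def cauchy_loss(r, c=2.385):
          """Cauchy robust function of standardized residuals r."""
          return np.sum((c**2 / 2) * np.log1p((r / c) ** 2))

      y = np.array([10.2, 4.9, 9.0])      # measured flows; stream 3 = 1 + 2
      sigma = np.array([0.3, 0.3, 0.3])   # measurement standard deviations
      # y[2] carries a gross error: the balance implies roughly 15, not 9

      res = minimize(
          lambda x: cauchy_loss((x - y) / sigma),
          x0=y,
          constraints={"type": "eq", "fun": lambda x: x[0] + x[1] - x[2]},
          method="SLSQP",
      )
      print(res.x)   # reconciled flows satisfying the mass balance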

  9. Linear model applied to the evaluation of pharmaceutical stability data

    Directory of Open Access Journals (Sweden)

    Renato Cesar Souza

    2013-09-01

    The expiry date on the packaging of a product gives the consumer confidence that the product will retain its identity, content, quality and purity throughout its period of validity. In the pharmaceutical industry, the definition of this term is based on stability data obtained during product registration. Accordingly, this work applies linear regression, following guideline ICH Q1E (2003), to evaluate some aspects of a product undergoing registration in Brazil. The evaluation was carried out with the development center of a multinational company in Brazil, on samples of three different batches containing two active pharmaceutical ingredients in two different packages. Based on the preliminary results obtained, it was possible to observe the different degradation tendencies of the product in the two packages and the relationship between the variables studied, adding knowledge so that new linear models can be applied and developed for other products.
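
    Under ICH Q1E, the shelf life is typically read off where the one-sided 95% confidence bound on the fitted regression line crosses the specification limit. A minimal single-batch sketch (the assay values and specification limit are illustrative assumptions, not the study's data):

      import numpy as np
      from scipy import stats

      t = np.array([0.0, 3, 6, 9, 12, 18])                      # months
      assay = np.array([100.1, 99.6, 99.0, 98.7, 98.1, 97.2])   # % label claim
      spec_lower = 95.0

      slope, intercept, *_ = stats.linregress(t, assay)
      n = len(t)
      s = np.sqrt(np.sum((assay - (intercept + slope * t))**2) / (n - 2))
      tcrit = stats.t.ppf(0.95, n - 2)
      sxx = np.sum((t - t.mean())**2)

      def lower_95cb(month):
          """One-sided 95% lower confidence bound of the mean response."""
          se = s * np.sqrt(1.0/n + (month - t.mean())**2 / sxx)
          return intercept + slope * month - tcrit * se

      months = np.linspace(0, 60, 601)
      shelf_life = months[lower_95cb(months) >= spec_lower].max()
      print(f"estimated shelf life: {shelf_life:.1f} months")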

  10. Evaluation of the value of radar QPE data and rain gauge data for hydrological modeling

    DEFF Research Database (Denmark)

    He, Xin; Sonnenborg, Torben Obel; Refsgaard, Jens Christian

    2013-01-01

    Weather radar-based quantitative precipitation estimation (QPE) is in principle superior to areal precipitation estimated by using rain gauge data only, and has therefore become increasingly popular in applications such as hydrological modeling. The present study investigates the potential of using multiannual radar QPE data in coupled surface water-groundwater modeling, with emphasis given to the groundwater component. Since the radar QPE is partly dependent on the rain gauge observations, it is necessary to evaluate the impact of rain gauge network density on the quality of the estimated precipitation. The value of the QPE data is in fact more obvious for groundwater than for surface water at the daily scale. Moreover, a substantial negative impact on the simulated hydrological responses is observed due to the reduction in the operational rain gauge network between 2006 and 2010. The radar QPE based model demonstrates the added...

  11. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T W; Sutton, M

    2011-09-19

    Thermodynamic data are essential for understanding and evaluating geochemical processes, as by speciation-solubility calculations, reaction-path modeling, or reactive transport simulation. These data are required to evaluate both equilibrium states and the kinetic approach to such states (via the affinity term or its equivalent in commonly used rate laws). These types of calculations and the data needed to carry them out are a central feature of geochemistry in many applications, including water-rock interactions in natural systems at low and high temperatures. Such calculations are also made in engineering studies, for example studies of interactions involving man-made materials such as metal alloys and concrete. They are used in a fairly broad spectrum of repository studies where interactions take place among water, rock, and man-made materials (e.g., usage on YMP and WIPP). Waste form degradation, engineered barrier system performance, and near-field and far-field transport typically incorporate some level of thermodynamic modeling, requiring the relevant supporting data. Typical applications of thermodynamic modeling involve calculations of aqueous speciation (which is of great importance in the case of most radionuclides), solubilities of minerals and related solids, solubilities of gases, and stability relations among the various possible phases that might be present in a chemical system at a given temperature and pressure. If a phase can have a variable chemical composition, then a common calculational task is to determine that composition. Thermodynamic modeling also encompasses ion exchange and surface complexation processes. Any and all of these processes may be important in a geochemical process or reactive transport calculation. Such calculations are generally carried out using computer codes. For geochemical modeling calculations, codes such as EQ3/6 and PHREEQC, are commonly used. These codes typically provide 'full service' geochemistry

  12. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers to be positive, and they also pointed out that the program could promote holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  14. Evaluating and Extending the Ocean Wind Climate Data Record.

    Science.gov (United States)

    Wentz, Frank J; Ricciardulli, Lucrezia; Rodriguez, Ernesto; Stiles, Bryan W; Bourassa, Mark A; Long, David G; Hoffman, Ross N; Stoffelen, Ad; Verhoef, Anton; O'Neill, Larry W; Farrar, J Tomas; Vandemark, Douglas; Fore, Alexander G; Hristova-Veleva, Svetla M; Turk, F Joseph; Gaston, Robert; Tyler, Douglas

    2017-05-01

    Satellite microwave sensors, both active scatterometers and passive radiometers, have been systematically measuring near-surface ocean winds for nearly 40 years, establishing an important legacy in studying and monitoring weather and climate variability. As an aid to such activities, the various wind datasets are being intercalibrated and merged into consistent climate data records (CDRs). The ocean wind CDRs (OW-CDRs) are evaluated by comparisons with ocean buoys and intercomparisons among the different satellite sensors and among the different data providers. Extending the OW-CDR into the future requires exploiting all available datasets, such as OSCAT-2, scheduled to launch in July 2016. Three planned methods of calibrating the OSCAT-2 σ0 measurements include 1) direct Ku-band σ0 intercalibration to QuikSCAT and RapidScat; 2) multisensor wind speed intercalibration; and 3) calibration to stable rainforest targets. Unfortunately, RapidScat failed in August 2016 and cannot be used to directly calibrate OSCAT-2. A particular future continuity concern is the absence of scheduled new or continuation radiometer missions capable of measuring wind speed. Specialized model assimilations provide 30-year-long, high temporal/spatial resolution wind vector grids that composite the satellite wind information from OW-CDRs of multiple satellites viewing the Earth at different local times.

  15. Global TEC maps based on GNNS data: 2. Model evaluation

    Science.gov (United States)

    Mukhtarov, P.; Pancheva, D.; Andonov, B.; Pashova, L.

    2013-07-01

    The present paper presents a detailed statistical evaluation of the global empirical background TEC model built using the CODE TEC data for 13 full years, 1999-2011, and described in Part 1. It has been found that the empirical probability density distribution resembles the Laplace distribution more than the Gaussian. Further insight into the nature and sources of the model's error led to the construction of a new error model, built using an approach similar to that of the background TEC model. The spatial-temporal variability of the RMSE (root mean square error) is represented as a product of three separable functions which describe the solar cycle, seasonal and LT dependences. The error model contains 486 constants that have been determined by least squares fitting techniques. The overall standard deviation of the predicted RMSE with respect to the empirical one is 0.7 TECU. The error model thus offers an approach by which the RMSE can be predicted as a function of solar activity, season and LT.
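
    The finding that the model errors follow a Laplace rather than a Gaussian distribution can be checked by comparing maximum-likelihood fits of the two distributions to the residuals. A sketch with synthetic residuals standing in for the real ones:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      residuals = rng.laplace(0.0, 2.0, size=5000)   # stand-in model errors (TECU)

      for dist in (stats.norm, stats.laplace):
          params = dist.fit(residuals)
          loglik = np.sum(dist.logpdf(residuals, *params))
          print(dist.name, round(loglik, 1))   # higher log-likelihood wins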

  16. Introducing A Hybrid Data Mining Model to Evaluate Customer Loyalty

    Directory of Open Access Journals (Sweden)

    H. Alizadeh

    2016-12-01

    The main aim of this study was to introduce a comprehensive model for evaluating bank customers' loyalty based on the assessment and comparison of the performance of different clustering methods. This study also pursues the following specific objectives: (a) using different clustering methods and comparing them for customer classification, (b) finding the effective variables in determining customer loyalty, and (c) using different ensemble classification methods to increase modeling accuracy and comparing the results with the basic methods. Since loyal customers generate more profit, this study introduces a two-step model for classifying customers and their loyalty. For this purpose, various clustering methods such as K-medoids, X-means and K-means were used, the last of which outperformed the other two when compared with the Davies-Bouldin index. Customers were clustered using K-means, and the members of the resulting four clusters were analyzed and labeled. Then, a predictive model was run on the demographic variables of customers using various classification methods such as DT (Decision Tree), ANN (Artificial Neural Networks), NB (Naive Bayes), KNN (K-Nearest Neighbors) and SVM (Support Vector Machine), as well as their bagging and boosting variants, to predict the class of loyal customers. The results showed that bagging-ANN was the most accurate method in predicting loyal customers. This two-stage model can be used in banks and financial institutions with similar data to identify the types of future customers.
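
    The first stage described above, choosing among clusterings with the Davies-Bouldin index (lower is better), can be sketched with scikit-learn; the customer features here are synthetic stand-ins for the bank's real variables:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import davies_bouldin_score

      rng = np.random.default_rng(42)
      # Stand-in features, e.g. recency, frequency, monetary value
      X = np.vstack([rng.normal(m, 0.5, size=(100, 3)) for m in (0, 2, 4, 6)])

      for k in (2, 3, 4, 5, 6):
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
          print(k, round(davies_bouldin_score(X, labels), 3))  # pick the minimum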

  17. Analysis and evaluation of operational data. Annual report, 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-01

    The United States Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) has published reports of its activities since 1984. The first report covered January through June of 1984, and the second report covered July through December of 1984. After those first two semiannual reports, AEOD published annual reports of its activities from 1985 through 1993. Beginning with the report for 1986, AEOD Annual Reports have been published as NUREG-1272. Beginning with the report for 1987, NUREG-1272 has been published in two parts, No. 1 covering power reactors and No. 2 covering nonreactors (changed to 'nuclear materials' with the 1993 report). AEOD changed its annual report from a calendar year (CY) to a fiscal year report, and added part No. 3 covering technical training, beginning with the combined Annual Report for CY 1994 and fiscal year 1995, NUREG-1272, Vol. 9, Nos. 1-3. This report, NUREG-1272, Vol. 10, No. 2, covers nuclear materials and presents a review of the events and concerns associated with the use of licensed material in applications other than power reactors. NUREG-1272, Vol. 10, No. 1, covers power reactors and presents an overview of the fiscal year 1996 operating experience of the nuclear power industry from the NRC perspective. NUREG-1272, Vol. 10, No. 3, covers technical training and presents the activities of the Technical Training Center in support of the NRC's mission. Throughout these reports, whenever information is presented for a calendar year, it is so designated. Fiscal year information is designated by the four digits of the fiscal year.

  18. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    Science.gov (United States)

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  19. Coded Statutory Data Sets for Evaluation of Public Health Law

    Science.gov (United States)

    Costich, Julia Field

    2012-01-01

    Background and objectives: The evaluation of public health law requires reliable accounts of underlying statutes and regulations. States often enact public health-related statutes with nonuniform provisions, and variation in the structure of state legal codes can foster inaccuracy in evaluating the impact of specific categories of law. The optimal…

  20. Evaluation of O2PLS in Omics data integration

    NARCIS (Netherlands)

    el Bouhaddani, S.; Houwing-Duistermaat, Jeanine; Salo, Perttu; Perola, Markus; Jongbloed, G.; Uh, Hae-Won

    2016-01-01

    Background
    Rapid computational and technological developments have made large amounts of omics data available at different biological levels. It is becoming clear that simultaneous data analysis methods are needed for better interpretation and understanding of the underlying systems biology.

  1. Achieving Fairness in Data Fusion Performance Evaluation Development

    Science.gov (United States)

    2006-04-30

    data link MADL, SATCOM intelligence broadcast links, voice and data communications, messaging, GPS, integrated navigation, IRS, TACAN and other landing... avionics status and environment data; Intelligence Preparation of the Battlespace (IPB): threat laydown, weather, EW data base, etc. Key sensor... support is gratefully acknowledged herein, especially the programmatic support of Dr. Neal Glassman of AFOSR, and the technical guidance of Mr. Pete Burke

  2. Using Concrete and Realistic Data in Evaluating Initial Visualization Designs

    DEFF Research Database (Denmark)

    Knudsen, Søren; Pedersen, Jeppe Gerner; Herdal, Thor

    2016-01-01

    to span a range of factors, such as the role of the person doing the data collection and the type of instrumentation used. The three cases relate to visualizing sports, construction, and cooking domain data, and use primarily time-domain data and visualizations. For each case, we briefly describe...

  3. An Evaluation of PET Using Extant Achievement Data.

    Science.gov (United States)

    Mandeville, Garrett K.

    Data gathered under various statewide testing programs in South Carolina were used to assess the effects of PET (Program for Effective Teaching) training for teachers on student achievement. Reading and mathematics achievement data for students in grades 1 through 4 in South Carolina were compared for teachers who had received PET training and those who did not. A…

  4. Data collection and evaluation for experimental computer science research

    Science.gov (United States)

    Zelkowitz, Marvin V.

    1983-01-01

    The Software Engineering Laboratory has been monitoring software development at NASA Goddard Space Flight Center since 1976. The data collection activities of the Laboratory and some of the difficulties of obtaining reliable data are described. In addition, the application of this data collection process to a current prototyping experiment is reviewed.

  5. Evaluation of Cirrus Cloud Simulations using ARM Data-Development of Case Study Data Set

    Science.gov (United States)

    Starr, David OC.; Demoz, Belay; Wang, Yansen; Lin, Ruei-Fong; Lare, Andrew; Mace, Jay; Poellot, Michael; Sassen, Kenneth; Brown, Philip

    2002-01-01

    Cloud-resolving models (CRMs) are being increasingly used to develop parametric treatments of clouds and related processes for use in global climate models (GCMs). CRMs represent the integrated knowledge of the physical processes acting to determine cloud system lifecycle and are well matched to typical observational data in terms of physical parameters/measurables and scale-resolved physical processes. Thus, they are suitable for direct comparison to field observations for model validation and improvement. The goal of this project is to improve state-of-the-art CRMs used for studies of cirrus clouds and to establish a relative calibration with GCMs through comparisons among CRMs, single column model (SCM) versions of the GCMs, and observations. The objective is to compare and evaluate a variety of CRMs and SCMs, under the auspices of the GEWEX Cloud System Study (GCSS) Working Group on Cirrus Cloud Systems (WG2), using ARM data acquired at the Southern Great Plains (SGP) site. This poster reports on progress in developing a suitable WG2 case study data set based on the September 26, 1996 ARM IOP case - the Hurricane Nora outflow case. Progress in assessing cloud and other environmental conditions is described. Results of preliminary simulations using a regional cloud system model (MM5) and a CRM are discussed. Focal science questions for the model comparison are strongly based on the results of the idealized GCSS WG2 cirrus cloud model comparison projects (the Idealized Cirrus Cloud Model Comparison Project and the Cirrus Parcel Model Comparison Project), which are also briefly summarized.

  6. Oil Points - Life Cycle Evaluations without the Data Problem

    DEFF Research Database (Denmark)

    Bey, Niki; Lenau, Torben Anker; Larsen, Michael Holm

    1999-01-01

    Environmental aspects of products over their whole life cycle are of increasing importance in industry [1]. Therefore, several methods and tools for environmental life cycle evaluation have been developed in recent years. Formal Life Cycle Assessment (LCA), the state-of-the-art in environmental... indicator-based evaluation is dependent on the existence and availability of such indicators. In order to avoid this, the Oil Point Method (OPM) has been developed. Its application requires only limited resources while still providing a valuable evaluation. The OPM is based on primary energy considerations...

  7. 76 FR 78661 - Correction for Draft Vieques Report: An Evaluation of Environmental, Biological, and Health Data...

    Science.gov (United States)

    2011-12-19

    ... Evaluation of Environmental, Biological, and Health Data From the Island of Vieques, PR AGENCY: Agency for... Draft Vieques Report: An Evaluation of Environmental, Biological, and Health Data From the Island of...

  8. Key data for outbreak evaluation: building on the Ebola experience.

    Science.gov (United States)

    Cori, Anne; Donnelly, Christl A; Dorigatti, Ilaria; Ferguson, Neil M; Fraser, Christophe; Garske, Tini; Jombart, Thibaut; Nedjati-Gilani, Gemma; Nouvellet, Pierre; Riley, Steven; Van Kerkhove, Maria D; Mills, Harriet L; Blake, Isobel M

    2017-05-26

    Following the detection of an infectious disease outbreak, rapid epidemiological assessment is critical for guiding an effective public health response. To understand the transmission dynamics and potential impact of an outbreak, several types of data are necessary. Here we build on experience gained in the West African Ebola epidemic and prior emerging infectious disease outbreaks to set out a checklist of data needed to: (1) quantify severity and transmissibility; (2) characterize heterogeneities in transmission and their determinants; and (3) assess the effectiveness of different interventions. We differentiate data needs into individual-level data (e.g. a detailed list of reported cases), exposure data (e.g. identifying where/how cases may have been infected) and population-level data (e.g. size/demographics of the population(s) affected and when/where interventions were implemented). A remarkable amount of individual-level and exposure data was collected during the West African Ebola epidemic, which allowed the assessment of (1) and (2). However, gaps in population-level data (particularly around which interventions were applied when and where) posed challenges to the assessment of (3). Here we highlight recurrent data issues, give practical suggestions for addressing these issues and discuss priorities for improvements in data collection in future outbreaks.This article is part of the themed issue 'The 2013-2016 West African Ebola epidemic: data, decision-making and disease control'. © 2017 The Authors.

  9. Wind tunnel evaluation of Hi-Vol TSP effectiveness data

    Data.gov (United States)

    U.S. Environmental Protection Agency — Wind tunnel evaluation of EPA's Hi-Vol TSP sampler for sampling effectiveness with regards to aerodynamic particle diameter (5 to 35 microns), wind speed (2, 8, 24...

  10. Report on the activities of the decay data evaluation project; Compte rendu d'activite du groupe ''decay data evaluation project''

    Energy Technology Data Exchange (ETDEWEB)

    Browne, E.; Be, M.M.; Desmond Mac Mahon, T.; Helmer, R.G

    2001-12-01

    This report summarizes the work of the DDEP collaboration since its establishment in 1994. The aim of this group is to evaluate and propose decay data for approximately 250 radionuclides of interest in various applications. It also presents a projected schedule of nuclear data evaluations for 2002-2003, and the minutes of the DDEP Meeting held at Braunschweig, Germany, May 15, 2001. A sample of a DDEP evaluation is presented in this report. (authors)

  11. Evaluation of Remote Collaborative Manipulation for Scientific Data Analysis

    OpenAIRE

    Fleury, Cédric; Duval, Thierry; Gouranton, Valérie; Steed, Anthony

    2012-01-01

    In the context of scientific data analysis, we propose to compare a remote collaborative manipulation technique with a single-user manipulation technique. The manipulation task consists in positioning a clipping plane in order to perform cross-sections of scientific data that show several points of interest located inside the data. For the remote collaborative manipulation, we have chosen to use the 3-hand manipulation technique proposed by Aguerreche et al., which is...

  12. [Efficient multicentric data acquisition via internet - a method evaluation].

    Science.gov (United States)

    Neymeyer, J; Tunn, R

    2001-08-01

    Data input via web HTML forms in a browser is much more efficient than writing into paper-based forms. The specification-compliant HTML forms for data entry are downloaded from the homepage of the project manager by the participants of the study. The transformation of the content of differently formatted HTML forms is performed by the program "security-based form converter (SFC)", which is located on a web server or a proxy server. From there, the transformed data are returned to the project manager via e-mail. The incoming data are imported into databases by means of an ODBC data source, and pivoted from sequential format to table format. The data entry principle presented here is a client-server-structured, web-based intranet program. Only standard software and relational databases with ODBC interfaces are used; no further registered commercial programs are needed. General data protection is ensured by encrypted data transmission.

  13. Cognition-inspired route evaluation using mobile phone data

    NARCIS (Netherlands)

    Wang, H.; Huang, J.; Zhou, E.; Huang, Z.; Zhong, N.

    2015-01-01

    With the increasing popularity of mobile phones, large amounts of real and reliable mobile phone data are being generated every day. These mobile phone data represent the practical travel routes of users and imply their intelligence in selecting a suitable route. Usually, an experienced user

  14. An Evaluation of PET Based on Longitudinal Data.

    Science.gov (United States)

    Mandeville, Garrett K.

    Although teacher inservice programs based on Madeline Hunter's Program for Effective Teaching (PET) have become very popular in U.S. schools, there is little evidence that the Hunter model ultimately results in increased student achievement. This longitudinal study attempts to evaluate the effects of Hunter-based staff development programs on…

  15. Estimation of Total Tree Height from Renewable Resources Evaluation Data

    Science.gov (United States)

    Charles E. Thomas

    1981-01-01

    Many ecological, biological, and genetic studies use the measurement of total tree height. Until recently, the Southern Forest Experiment Station's inventory procedures through Renewable Resources Evaluation (RRE) have not included total height measurements. This note provides equations to estimate total height based on other RRE measurements.

  16. Bayesian Monte Carlo Method for Nuclear Data Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Koning, A.J., E-mail: koning@nrg.eu

    2015-01-15

    A Bayesian Monte Carlo method is outlined which allows a systematic evaluation of nuclear reactions using TALYS. The result will be either an EXFOR-weighted covariance matrix or a collection of random files, each accompanied by an experiment-based weight.
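
    The Bayesian Monte Carlo scheme outlined above reduces to weighting random model calculations by their likelihood against experimental data. A stripped-down sketch in which a random sampler stands in for the TALYS runs (all numbers are invented):

      import numpy as np

      rng = np.random.default_rng(7)
      exp_y, exp_u = 1.20, 0.05   # one experimental observable and its sigma

      # Stand-in for model predictions from randomly sampled parameters
      samples = rng.normal(1.0, 0.2, size=10_000)

      chi2 = ((samples - exp_y) / exp_u) ** 2
      w = np.exp(-0.5 * chi2)
      w /= w.sum()                                 # experiment-based weights

      post_mean = np.sum(w * samples)
      post_std = np.sqrt(np.sum(w * (samples - post_mean) ** 2))
      print(post_mean, post_std)                   # weighted evaluated value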

  17. Building Assessment Survey and Evaluation Study Summarized Data - HVAC Characteristics

    Science.gov (United States)

    In the Building Assessment Survey and Evaluation (BASE) Study, information on the characteristics of the heating, ventilation, and air conditioning (HVAC) system(s) in the entire BASE building, including types of ventilation, equipment configurations, and operation and maintenance issues, was acquired by examining the building plans, conducting a building walk-through, and speaking with the building owner, manager, and/or operator.

  18. Evaluating company growth potential using AI and web media data

    DEFF Research Database (Denmark)

    Droll, Andrew; Khan, Shahzad; Tanev, Stoyan

    2017-01-01

    The article focuses on adapting and validating the use of an existing web search and analytics engine to evaluate the growth and competitive potential of new technology start-ups and existing firms in the newly emerging precision medicine sector. The results are based on two different search onto...

  19. Dynamic data evaluation for solid-liquid equilibria

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Kang, Jeong Won

    The accuracy and reliability of the measured data sets to be used in the regression of model parameters is an important issue in the modeling of phase equilibria. It is clear that good parameters for any model cannot be obtained from low-quality data. A thermodynamic consistency test for solid... of the developed tests were based on the quality tests proposed for VLE data by Kang et al. [3], and a methodology that combines solute activity coefficients in the liquid phase at infinite dilution with a theoretically based term to account for the non-ideality in dilute solutions is discussed. In this work, case...

  20. Evaluating parallel relational databases for medical data analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; Wilson, Andrew T.

    2012-03-01

    Hospitals have always generated and consumed large amounts of data concerning patients, treatment and outcomes. As computers and networks have permeated the hospital environment, it has become feasible to collect and organize all of this data. This naturally raises the question of how to deal with the resulting mountain of information. In this report we detail a proof-of-concept test using two commercially available parallel database systems to analyze a set of real, de-identified medical records. We examine database scalability as data sizes increase, as well as responsiveness under load from multiple users.

  1. Uncertainty representation, quantification and evaluation for data and information fusion

    CSIR Research Space (South Africa)

    De Villiers, Johan P

    2015-07-01

    ...to be modelled), datum uncertainty (where uncertainty is introduced by representing real-world information by a mathematical quantity), data generation abstraction (where uncertainty is introduced through a mathematical representation of the mapping between a...

  2. Pre-Licensing Evaluation of Legacy SFR Metallic Fuel Data

    Energy Technology Data Exchange (ETDEWEB)

    Yacout, A. M. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Billone, M. C. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division

    2016-09-16

    The US sodium-cooled fast reactor (SFR) metallic fuel performance data that are of interest for advanced fast reactor applications can be attributed mostly to the Integral Fast Reactor (IFR) program between 1984 and 1994. Metallic fuel data collected prior to the IFR program were associated with types of fuel that are not of interest for future advanced reactor deployment (e.g., the earlier U-fissium alloy fuel). The IFR fuels data were collected from irradiation of U-Zr based fuel alloys, with and without Pu additions, clad in different types of steel, including HT9, D9, and 316 stainless steel. Different types of data were generated during the program, based on the requirements associated with the DOE Advanced Liquid Metal Cooled Reactor (ALMR) program.

  3. Airborne camera and spectrometer experiments and data evaluation

    Science.gov (United States)

    Lehmann, F. F.; Bucher, T.; Pless, S.; Wohlfeil, J.; Hirschmüller, H.

    2009-09-01

    New stereo push-broom camera systems have been developed at the German Aerospace Center (DLR). The new small multispectral systems (Multi Functional Camerahead - MFC, Advanced Multispectral Scanner - AMS) are lightweight and compact and provide three or five RGB stereo lines of 8,000, 10,000 or 14,000 pixels, which are used for stereo processing and the generation of Digital Surface Models (DSM) and near-true orthoimage mosaics (TOM). Simultaneous acquisition with different types of MFC cameras for infrared and RGB data has been successfully tested. All spectral channels record the image data at full resolution; pan-sharpening is not necessary. Analogous to the line scanner data, an automatic processing chain for UltraCamD and UltraCamX exists. The different systems have been flown for different types of applications; the main fields of interest, among others, are environmental applications (flooding simulations, monitoring tasks, classification) and 3D modelling (e.g. city mapping). From the DSM and TOM data, Digital Terrain Models (DTM) and 3D city models are derived. Textures for the facades are taken from oblique orthoimages, which are created from the same input data as the TOM and the DSM. The resulting models are characterised by high geometric accuracy and the perfect fit of image data and DSM. The DLR is permanently developing and testing a wide range of sensor types and imaging platforms for terrestrial and space applications. The MFC sensors have been flown in combination with laser systems and imaging spectrometers, and special data fusion products have been developed. These products include hyperspectral orthoimages and 3D hyperspectral data.

  4. Evaluation of Channel Infill Processes in Relation to Forcing Data

    Science.gov (United States)

    2016-05-01

    provides the water that drives flow in surface waters through surface runoff and groundwater recharge for spring-fed channels and influences sediment... data collectively as a tool to estimate future dredging volumes as well as likely location and sediment composition of these dredging events... beneficial use projects and, when paired with historical dredge history data, can improve estimation of expected future dredging needs. Channel

  5. Evaluation of Healthcare Interventions and Big Data: Review of Associated Data Issues.

    Science.gov (United States)

    Asche, Carl V; Seal, Brian; Kahler, Kristijan H; Oehrlein, Elisabeth M; Baumgartner, Meredith Greer

    2017-08-01

    Although the analysis of 'big data' holds tremendous potential to improve patient care, there remain significant challenges before it can be realized. Accuracy and completeness of data, linkage of disparate data sources, and access to data are areas that require particular focus. This article discusses these areas and shares strategies to promote progress. Improvement in clinical coding, innovative matching methodologies, and investment in data standardization are potential solutions to data validation and linkage problems. Challenges to data access still require significant attention with data ownership, security needs, and costs representing significant barriers to access.

  6. Data envelopment analysis in service quality evaluation: An empirical study

    OpenAIRE

    Najafi, Seyesvahid; Saati, Saber; Tavana, Madjid

    2015-01-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A...
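
    The DEA machinery behind such a single service-quality index can be illustrated with a small linear program. Below is a minimal sketch of an input-oriented CCR multiplier model in Python with scipy; it is not the authors' PSQI implementation, and the unit data are hypothetical.

        # Minimal input-oriented CCR DEA sketch (not the authors' PSQI code).
        # Efficiency of unit o: maximize weighted outputs, subject to the
        # weighted inputs of unit o equal to 1 and no unit scoring above 1.
        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, o):
            """X: (n_units, n_inputs), Y: (n_units, n_outputs), o: unit index."""
            n, m = X.shape
            _, s = Y.shape
            # Decision variables: output weights u (s of them), then input weights v (m).
            c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u.y_o
            A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
            A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                          A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
            return -res.fun

        # Hypothetical example: 4 service units, 2 inputs, 1 output.
        X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
        Y = np.array([[1.0], [1.0], [1.0], [1.0]])
        print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])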

  7. Community detection algorithm evaluation with ground-truth data

    Science.gov (United States)

    Jebabli, Malek; Cherifi, Hocine; Cherifi, Chantal; Hamouda, Atef

    2018-02-01

    Community structure is of paramount importance for the understanding of complex networks. Consequently, there is a tremendous effort to develop efficient community detection algorithms. Unfortunately, the fair assessment of these algorithms remains an open question. If the ground-truth community structure is available, various clustering-based metrics are used to compare it with the one discovered by these algorithms. However, these metrics, defined at the node level, are fairly insensitive to variation of the overall community structure. To overcome these limitations, we propose to exploit the topological features of the 'community graphs' (where the nodes are the communities and the links represent their interactions) in order to evaluate the algorithms. To illustrate our methodology, we conduct a comprehensive analysis of overlapping community detection algorithms using a set of real-world networks with known a priori community structure. The results provide a better perception of their relative performance compared to classical metrics. Moreover, they show that more emphasis should be put on the topology of the community structure. We also investigate the relationship between the topological properties of the community structure and the alternative evaluation measures (quality metrics and clustering metrics). It appears clearly that they present different views of the community structure and that they must be combined in order to evaluate the effectiveness of community detection algorithms.
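
    To make the 'community graph' idea concrete, the sketch below (an assumed construction for illustration, not the authors' code) collapses a node-level partition into a weighted graph whose nodes are communities; topological features such as density or degree distribution can then be compared between the ground-truth and detected community graphs.

        # Sketch: build a weighted community graph from a node-level partition.
        import networkx as nx

        def community_graph(G, partition):
            """G: networkx Graph; partition: dict mapping node -> community id."""
            C = nx.Graph()
            C.add_nodes_from(set(partition.values()))
            for u, v in G.edges():
                cu, cv = partition[u], partition[v]
                if cu == cv:
                    continue  # intra-community edges do not link community nodes
                w = C[cu][cv]["weight"] + 1 if C.has_edge(cu, cv) else 1
                C.add_edge(cu, cv, weight=w)
            return C

        G = nx.karate_club_graph()
        partition = {n: G.nodes[n]["club"] for n in G}  # two known factions
        C = community_graph(G, partition)
        print(C.number_of_nodes(), C.size(weight="weight"))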

  8. Radiometric Quality Evaluation of INSAT-3D Imager Data

    Science.gov (United States)

    Prakash, S.; Jindal, D.; Badal, N.; Kartikeyan, B.; Gopala Krishna, B.

    2014-11-01

    INSAT-3D is an advanced meteorological satellite of ISRO which acquires imagery in optical and infra-red (IR) channels for the study of weather dynamics in the Indian sub-continent region. In this paper, the methodology of radiometric quality evaluation for Level-1 products of the Imager, one of the payloads onboard INSAT-3D, is described. First, the overall visual quality of a scene is assessed in terms of dynamic range, edge sharpness or modulation transfer function (MTF), presence of striping, and other image artefacts. Uniform targets in desert and sea regions are identified, for which a detailed radiometric performance evaluation of the IR channels is carried out. The mean brightness temperature (BT) of the targets is computed and validated against independently generated radiometric references. Further, diurnal/seasonal trends in target BT values and radiometric uncertainty or sensor noise are studied. Results of radiometric quality evaluation over a duration of eight months (January to August 2014) and a comparison of radiometric consistency pre/post yaw flip of the satellite are presented. The radiometric analysis indicates that INSAT-3D images have high contrast (MTF > 0.2) and low striping effects. A bias of ... specifications.

  9. Evaluation of Decision Trees for Cloud Detection from AVHRR Data

    Science.gov (United States)

    Shiffman, Smadar; Nemani, Ramakrishna

    2005-01-01

    Automated cloud detection and tracking is an important step in assessing changes in radiation budgets associated with global climate change via remote sensing. Data products based on satellite imagery are available to the scientific community for studying trends in the Earth's atmosphere. The data products include pixel-based cloud masks that assign cloud-cover classifications to pixels. Many cloud-mask algorithms have the form of decision trees. The decision trees employ sequential tests that scientists designed based on empirical astrophysics studies and simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In a previous study we compared automatically learned decision trees to cloud masks included in Advanced Very High Resolution Radiometer (AVHRR) data products from the year 2000. In this paper we report the replication of the study for five years of data, and for a gold standard based on surface observations performed by scientists at weather stations in the British Isles. For our sample data, the accuracy of automatically learned decision trees was greater than the accuracy of the cloud masks (p < 0.001).
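
    As an illustration of the learned-decision-tree approach (a sketch under an assumed data layout, not the study's actual pipeline), a tree can be trained on per-pixel channel values against surface-observation labels and scored alongside an existing cloud mask:

        # Sketch: learn a cloud/clear decision tree from labeled pixels and
        # compare its accuracy with an existing cloud mask. Synthetic data
        # stand in for AVHRR channel values and surface-observation labels.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 5))                 # 5 AVHRR channels per pixel
        y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # hypothetical ground truth
        mask = (X[:, 0] > 0.1).astype(int)             # stand-in product cloud mask

        X_tr, X_te, y_tr, y_te, _, mask_te = train_test_split(
            X, y, mask, test_size=0.3, random_state=0)

        tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)
        print("tree accuracy:", (tree.predict(X_te) == y_te).mean())
        print("mask accuracy:", (mask_te == y_te).mean())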

  10. Evaluation of data quality at the Gambia national cancer registry.

    Science.gov (United States)

    Shimakawa, Yusuke; Bah, Ebrima; Wild, Christopher P; Hall, Andrew J

    2013-02-01

    The Gambia National Cancer Registry (GNCR) is one of the few nationwide population-based cancer registries in sub-Saharan Africa. Most registries in sub-Saharan Africa are limited to cities; therefore, the GNCR is important in providing estimates of cancer incidence in rural Africa. Our study assesses the quality of its data. The methods proposed by Bray and Parkin, and Parkin and Bray (Eur J Cancer 2009;45:747-64) were applied to the registry data from 1990 to 2009 to assess comparability, validity and completeness. The system used for classification and coding of neoplasms followed international standards. The percentage of cases morphologically verified was 18.1% for men and 33.1% for women, and that of death-certificate-only cases was 6.6% and 3.6%, respectively. Incidence rates in rural regions were lower than in the urban part of the country, except amongst young male adults. Comparison with other West African registries showed that the incidences of liver and uterine cervical cancer were comparable, but those of prostate and breast cancer in The Gambia were relatively low. The overall completeness was estimated at 50.3% using the capture-recapture method. The GNCR applies international standard practices to data collection and handling, providing valuable data on cancer incidence in sub-Saharan Africa. However, the data are incomplete for the rural and elderly populations, probably because of health care access and use. Copyright © 2012 UICC.
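
    The completeness figure rests on two-source capture-recapture logic. A minimal sketch follows, using the bias-corrected Chapman form of the Lincoln-Petersen estimator with made-up counts; the registry's actual sources and matching rules are not reproduced here.

        # Two-source capture-recapture sketch (Chapman estimator).
        # M: cases found by source 1, C: cases found by source 2,
        # R: cases found by both. Counts below are illustrative only.
        def chapman_estimate(M, C, R):
            return (M + 1) * (C + 1) / (R + 1) - 1

        M, C, R = 800, 600, 350
        N_hat = chapman_estimate(M, C, R)
        registered = M + C - R                  # unique cases actually registered
        print(f"estimated true cases: {N_hat:.0f}")
        print(f"completeness: {100 * registered / N_hat:.1f}%")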

  11. SOIL FERTILITY EVALUATION FOR FERTILISER RECOMMENDATION USING HYPERION DATA

    Directory of Open Access Journals (Sweden)

    Ranendu Ghosh

    2015-12-01

    Soil fertility, characterised by nitrogen, phosphorus, potassium, calcium, magnesium and sulphur, is traditionally measured from soil samples collected in the field. The process is cumbersome and time-intensive. Hyperspectral data from the Hyperion payload of EO-1 were used to facilitate preparation of a soil fertility map of Udaipur district, Rajasthan state, India. The Hyperion data were pre-processed for band and area subsetting, atmospheric correction and reflectance data preparation. Spectral analyses in the form of SFF and PPI were carried out to select ground-truth sites for soil sample collection. Soil samples collected from forty-one sites were analysed for nutrient composition. A correlogram was generated, followed by multiple regression, to identify the most important bands and spectral parameters for nutrient map generation.
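
    The correlogram-plus-regression step can be sketched as follows; this is an assumed form of the workflow with synthetic reflectance and nutrient values, and the actual Hyperion band set and regression model are not reproduced.

        # Sketch: rank bands by correlation with a nutrient, then fit a
        # multiple regression on the top bands. Synthetic data throughout.
        import numpy as np

        rng = np.random.default_rng(1)
        refl = rng.random((41, 150))                 # 41 sites x 150 bands
        nutrient = 2.0 * refl[:, 30] - 1.5 * refl[:, 90] + rng.normal(0, 0.05, 41)

        corr = np.array([np.corrcoef(refl[:, b], nutrient)[0, 1]
                         for b in range(refl.shape[1])])
        top = np.argsort(-np.abs(corr))[:5]          # five most correlated bands
        A = np.column_stack([refl[:, top], np.ones(41)])
        coef, *_ = np.linalg.lstsq(A, nutrient, rcond=None)
        print("selected bands:", top, "coefficients:", np.round(coef, 2))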

  12. Evaluating the radiance transformation for normalizing Landsat data

    Science.gov (United States)

    Middleton, E. M.; Lu, Y. C.

    1982-01-01

    A technique is examined for improving the comparability of Landsat multispectral scanner (MSS) data acquired on different dates. The technique involves conversion of digital brightness counts to relative radiance values measured in energy units (milliwatts per square centimeter-steradian). Signature statistics from 23 land-cover (or biomass) classifications derived from all three Landsats were compared before and after the radiance normalization. Significant convergence occurred among the data sets for mean spectral values and the variances associated with each of seven major land-cover types for MSS bands 4, 5, and 7. Overall, the variance attributed to the sensor component was reduced from 5.39 to 2.69 percent, with the largest decrease occurring in band 4 (14.4 percent to 3.7 percent).
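
    The count-to-radiance conversion described here is linear per band. A sketch follows, assuming 8-bit rescaled MSS counts and illustrative per-band calibration limits Lmin/Lmax; the study's actual coefficients are not given in the abstract.

        # Sketch: counts to radiance, L = Lmin + (Lmax - Lmin) * DN / 255,
        # with hypothetical calibration limits in mW/(cm^2 sr).
        import numpy as np

        def counts_to_radiance(dn, lmin, lmax, dn_max=255):
            return lmin + (lmax - lmin) * dn.astype(float) / dn_max

        dn_band4 = np.array([[12, 40], [90, 255]])   # toy 2x2 image, MSS band 4
        print(counts_to_radiance(dn_band4, lmin=0.08, lmax=2.48))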

  13. Reanalysis Data Evaluation to Study Temperature Extremes in Siberia

    Science.gov (United States)

    Shulgina, T. M.; Gordov, E. P.

    2014-12-01

    Ongoing global climate changes are strongly pronounced in Siberia, with significant warming in the second half of the 20th century and recent extreme events such as the 2010 heat wave and the 2013 flood in Russia's Far East. To improve our understanding of observed climate extremes and to provide regional decision makers with reliable, scientifically based information on the climate state at high spatial and temporal resolution, we need accurate meteorological data. However, of the 231 available stations across Siberia, only 130 provide homogeneous daily temperature time series. The sparse station network, especially at high latitudes, forces us to use reanalysis data, which may differ from observations. To obtain reliable information on temperature extreme "hot spots" in Siberia, we compared daily temperatures from the ERA-40, ERA-Interim, JRA-25, JRA-55, NCEP/DOE and MERRA reanalyses and the HadEX2 and GHCNDEX gridded datasets with observations from the RIHMI-WDC/CDIAC dataset for the overlap period 1981-2000. Data agreement was estimated at station coordinates, to which the reanalysis data were interpolated using a modified Shepard method. Comparison of annual mean temperatures averaged over 20 years shows general agreement for Siberia except for the Baikal region, where the reanalyses significantly underestimate the observed temperatures. The annual temperatures closest to the observed ones were obtained from ERA-40 and ERA-Interim. Furthermore, t-test results show homogeneity of these datasets, which allows one to combine them for long-term time series analysis. In particular, we compared the combined data with observations for percentile-based extreme indices. In Western Siberia, reanalysis and gridded data accurately reproduce observed daily maximum/minimum temperatures. For East Siberia, in the Lake Baikal area, ERA-Interim slightly underestimates the TN90p and TX90p values. The results allow regional decision-makers to obtain the required high spatial resolution (0,25°×0
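
    TN90p/TX90p are percentile-based indices: the percentage of days whose daily minimum/maximum temperature exceeds the 90th percentile of a base period. A simplified sketch follows, using a single overall base-period percentile rather than the calendar-day percentiles of the full ETCCDI definition.

        # Simplified TX90p sketch: percentage of days with Tmax above the
        # base-period 90th percentile (ETCCDI uses calendar-day percentiles).
        import numpy as np

        rng = np.random.default_rng(2)
        base = rng.normal(10, 8, 20 * 365)        # base-period daily Tmax, deg C
        recent = rng.normal(11, 8, 365)           # a later year to evaluate

        p90 = np.percentile(base, 90)
        tx90p = 100.0 * np.mean(recent > p90)
        print(f"TX90p = {tx90p:.1f}% of days")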

  14. Evaluation, or Just Data Collection? An Exploration of the Evaluation Practice of Selected UK Environmental Educators

    Science.gov (United States)

    West, Sarah Elizabeth

    2015-01-01

    Little is known about the evaluation practices of environmental educators. Questionnaires and discussion groups with a convenience sample of UK-based practitioners were used to uncover their evaluation methods. Although many report that they are evaluating regularly, this is mainly monitoring numbers of participants or an assessment of enjoyment.…

  15. GIS in Evaluation: Utilizing the Power of Geographic Information Systems to Represent Evaluation Data

    Science.gov (United States)

    Azzam, Tarek; Robinson, David

    2013-01-01

    This article provides an introduction to geographic information systems (GIS) and how the technology can be used to enhance evaluation practice. As a tool, GIS enables evaluators to incorporate contextual features (such as accessibility of program sites or community health needs) into evaluation designs and highlights the interactions between…

  16. The ENSDF_toolbox program package: tool for the evaluator of nuclear data

    OpenAIRE

    Shulyak, G. I.; A. A. Rodionov

    2010-01-01

    A program package for working with the Evaluated Nuclear Structure Data File (ENSDF) is discussed. A program shell designed to unify the nuclear data evaluation process is proposed. This shell may be used in the regular work of the nuclear data evaluator, and for common use by scientists and engineers who need up-to-date data on nuclear states and transitions from the ENSDF database.

  17. Performance evaluation of multi-sensor data fusion technique for ...

    Indian Academy of Sciences (India)

    We have adopted the state-vector fusion technique for fusing multiple-sensor track data to provide complete and precise trajectory information about the flight vehicle under test, for the purpose of flight safety monitoring and decision-making at the Test Range. The present paper brings out the performance of the algorithm...

  18. Performance evaluation of multi-sensor data fusion technique for ...

    Indian Academy of Sciences (India)

    ... state-vector fusion technique for fusing multiple-sensor track data to provide complete and precise trajectory information about the flight vehicle under test, for the purpose of flight safety monitoring and decision-making at the Test Range. The present paper brings out the performance of the algorithm for different process noise ...
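
    Entries 17 and 18 describe state-vector (track-to-track) fusion. Below is a minimal sketch of the standard covariance-weighted combination of two track estimates, ignoring cross-correlation between sensors; the papers' exact formulation, including process-noise handling, is not reproduced.

        # Sketch: covariance-weighted fusion of two track state estimates.
        # x1, x2: state vectors; P1, P2: their covariance matrices.
        import numpy as np

        def fuse_tracks(x1, P1, x2, P2):
            P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
            Pf = np.linalg.inv(P1i + P2i)          # fused covariance
            xf = Pf @ (P1i @ x1 + P2i @ x2)        # fused state
            return xf, Pf

        x1, P1 = np.array([100.0, 9.8]), np.diag([4.0, 0.25])
        x2, P2 = np.array([103.0, 10.1]), np.diag([9.0, 0.16])
        xf, Pf = fuse_tracks(x1, P1, x2, P2)
        print(xf, np.diag(Pf))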

  19. Study on the AMO data production and evaluation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Yong Joo; Yoo, B. D.; Choe, A. S.; Han, J. M.; Jung, E. C.; Rho, S. P.; Yi, J. H.; Jeong, D. Y.; Lee, K. S.; Park, H. M.; Kim, S. K.; Song, K. S.; Lee, J. M

    1998-01-01

    AMODS (Atomic, Molecular, and Optical Database System), accessible at http://amods.kaeri.re.kr, consists of an Alpha workstation 600 running UNIX with the APACHE 1.2 WWW server installed on an independently mounted 4.3 GB file system. Currently the data in AMODS are mostly atom-related and consist of atomic spectral lines, atomic transition probabilities, atomic energy levels, atomic transition lines, and CODATA 86, as well as several reference datasets. Meanwhile, spectroscopic parameters of Sm, one of the rare-earth elements, have been measured, producing 36 isotope-shift data for the high-lying even-parity states, followed by measurement of autoionization states. Thirty-one new autoionization states were found and their energy levels measured. The Fano q parameters were determined through theoretical analysis of the experimental data. (author). 11 refs., 3 tabs., 15 figs

  20. Simulation of PVT and swelling experimental data: a systematic evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Paulo S.M.V. [PETROBRAS S.A., Salvador, BA (Brazil). Unidade de Negocios da Bahia]. E-mail: psrocha@petrobras.com.br; Alves, Danilo C.R.; Sacramento, Vinicio S.; Costa, Gloria M.N. [Universidade Salvador (UNIFACS), Salvador, BA (Brazil). Centro de Estudos em Petroleo e Gas Natural (CEPGN)]. E-mail: gloria.costa@unifacs.br

    2004-07-01

    Accurate data on the phase behavior of oil and gas mixtures are needed, for example, for the design of process plants and for reservoir simulation studies. Often experimental PVT data are available, but in practice only a few PVT measurements are carried out for a given mixture. Therefore, it is necessary to use a thermodynamic model when planning production strategies for a given petroleum reservoir. This raises the question of what accuracy can be obtained using a cubic equation of state for phase equilibrium calculations, for example at conditions in which oil and gas are being produced. The only way to improve the agreement between measured and calculated results is to adjust the equation-of-state parameters. Currently, there is no clear methodology for making these modifications. The objective of this study is to investigate the best tuning to describe the PVT experimental data: differential liberation, constant composition expansion and swelling tests. The following programs were used: SPECS and MI-PVT (Technical University of Denmark) and WinProp (Windows version of CMGPROP). The Soave-Redlich-Kwong equation of state was used. Experimental data for six oil samples from the Reconcavo Basin (Bahia, Brazil) were obtained at the CEPGN (Study Center on Oil and Natural Gas at UNIFACS) and used in the tuning (author)
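
    The Soave-Redlich-Kwong calculation at the core of such tuning can be sketched for a pure component as follows. These are the standard SRK relations; the critical constants below are for methane, and the code is illustrative, not the SPECS/WinProp implementation.

        # Sketch: SRK compressibility factor for a pure component.
        # Z^3 - Z^2 + (A - B - B^2) Z - A B = 0, with
        # A = a P / (R T)^2 and B = b P / (R T).
        import numpy as np

        R = 8.314  # J/(mol K)

        def srk_z(T, P, Tc, Pc, omega):
            m = 0.480 + 1.574 * omega - 0.176 * omega**2
            alpha = (1 + m * (1 - np.sqrt(T / Tc)))**2
            a = 0.42748 * R**2 * Tc**2 / Pc * alpha
            b = 0.08664 * R * Tc / Pc
            A, B = a * P / (R * T)**2, b * P / (R * T)
            roots = np.roots([1.0, -1.0, A - B - B**2, -A * B])
            real = roots[np.isreal(roots)].real
            return real.max(), real.min()       # vapor-like, liquid-like Z

        print(srk_z(T=300.0, P=5e6, Tc=190.6, Pc=4.599e6, omega=0.011))  # methane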

  1. Application of an Aesthetic Evaluation Model to Data Entry Screens.

    Science.gov (United States)

    Ngo, D. C. L.; Byrne, J. G.

    2001-01-01

    Describes a new model for quantitatively assessing screen formats. Results of applying the model to data entry screens support the use of the model. Also described is a critiquing mechanism embedded in a user interface design environment as a demonstration of this approach. (Author/AEF)

  2. Evaluation of Data on Simple Turbulent Reacting Flows

    Science.gov (United States)

    1985-09-01

    normally elementary assessment to any of the data, due to the absence of appropriate information. ... LITERATURE SEARCH: The discussion of available ... AIAA J. 16, 279. ... Razdan, M.K. and ... (1985) CO/air turbulent diffusion flame: measurements and modeling. Comb. Flame 59, 289

  3. Evaluating rehabilitation efforts following the Milford Flat Fire—Data

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The focus of the study associated with these data is a 540-km² area at the low-elevation northern end of the 1460-km² Milford Flat Fire in west-central Utah, and...

  4. Development and evaluation of an integrated electronic data ...

    African Journals Online (AJOL)

    Discussion: The database system has been incorporated into the daily flow of clinical work, thus reducing duplication of note keeping and avoiding the need for data capturers. After improving the design and user interface, better compliance was noted. This provided useful insight into critical care database development.

  5. 76 FR 77234 - Availability of Draft Vieques Report: An Evaluation of Environmental, Biological, and Health Data...

    Science.gov (United States)

    2011-12-12

    ... Evaluation of Environmental, Biological, and Health Data From the Island of Vieques, Puerto Rico AGENCY... availability of the Draft Vieques Report: An Evaluation of Environmental, Biological, and Health Data from the... implications posed by the site under consideration. This analysis generally involves an evaluation of relevant...

  6. Evaluation of life cycle inventory data for recycling systems

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Damgaard, Anders; Jensen, Morten Bang

    2014-01-01

    This paper reviews databases on material recycling (primary as well as secondary production) used in life cycle assessments (LCA) of waste management systems. A total of 366 datasets, from 1980 to 2010 and covering 14 materials, were collected from databases and reports. Totals for CO2-equivalent emissions were compared to illustrate variations in the data. It was hypothesised that emissions from material production and the recycling industry had decreased over time due to increasing regulation, energy costs and process optimisation, but the reported datasets did not reveal such a general trend. The differences for the primary production of newsprint, HDPE and glass were 238%, 443% and 452%, respectively. For steel and aluminium the differences were 1761% and 235%, respectively. There is a severe lack of data for some recycled materials; for example, only one dataset existed for secondary cardboard. The study shows...

  7. Cockpit data link displays - An evaluation of textual formats

    Science.gov (United States)

    Mcgann, Alison; Lozito, Sandra; Corker, Kevin

    1992-01-01

    Data link technologies are being investigated for air/ground information transfer for commercial aircraft operation. This study was designed to measure which of four alpha-numeric display formats for display of data link information would lead to the quickest and most accurate memory retrieval in a part-task simulation environment. Pilots viewed a clearance for 10-15 seconds and were subsequently queried about the content of that clearance. Speed and accuracy of responses were measured across three retention tasks: free recall of a particular clearance, recognition of a previous clearance, and the comparison of element values between a previously displayed and a current clearance. Each format was tested with and without a distraction task. Subjective ratings of each format were also collected. The analyses revealed no significant differences in reaction time or accuracy among the four formats. Explanations for these results as well as alternative methodologies are discussed.

  8. Data center network performance evaluation in ns3

    DEFF Research Database (Denmark)

    Andrus, Bogdan-Mihai; Vegas Olmos, Juan José

    2015-01-01

    In the following paper we present the analysis of highly interconnected topologies like hypercube and torus and how they can be implemented in data centers in order to cope with the rapid increase and demands for performance of the internal traffic. By replicating the topologies in NS3, we scale the network from 16 to 512 switches. The performance measurements are supported by abstract metrics that also give a cost and complexity indication in choosing the right topology for the required application.

  9. New algorithm for biological objects' shape evaluation and data reduction

    Directory of Open Access Journals (Sweden)

    Stanislav Bartoň

    2010-01-01

    The paper presents a software procedure (using MAPLE 11) intended for considerable reduction of a digital image data set to a more easily treatable extent. An example with an image of a peach stone is presented. The peach stone, displayed in the digital photo, was represented as a polygon described by the coordinates of the pixels forming its perimeter. Photos taken at high resolution (and the corresponding data sets) contain coordinates of thousands of pixels - the polygon's vertices. The presented approach substitutes this polygon with a new one using a smaller number of vertices. The task is solved by use of an adapted least squares method. The presented algorithm enables reduction of the number of vertices to 10% of the original extent with acceptable accuracy of +/- one pixel (distance between the initial and final polygons). The procedure can be used for processing similar types of 2D images and for acceleration of subsequent computations.
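
    The authors use an adapted least-squares method; as a simpler stand-in that honours the same one-pixel tolerance, the sketch below greedily removes the vertex whose deletion deviates least from the current polygon. It is illustrative only, not the MAPLE procedure.

        # Sketch: greedy polygon vertex decimation with a pixel tolerance.
        import numpy as np

        def point_segment_distance(p, a, b):
            ab = b - a
            denom = float(ab @ ab)
            t = 0.0 if denom == 0.0 else np.clip((p - a) @ ab / denom, 0.0, 1.0)
            return float(np.linalg.norm(p - (a + t * ab)))

        def decimate(polygon, tol=1.0):
            pts = [np.asarray(p, dtype=float) for p in polygon]
            while len(pts) > 3:
                n = len(pts)
                errs = [point_segment_distance(pts[i], pts[i - 1], pts[(i + 1) % n])
                        for i in range(n)]
                i = int(np.argmin(errs))
                if errs[i] > tol:
                    break              # removing any vertex would exceed tolerance
                pts.pop(i)
            return np.array(pts)

        theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
        outline = np.c_[100 * np.cos(theta), 60 * np.sin(theta)]  # toy perimeter
        print(len(outline), "->", len(decimate(outline, tol=1.0)), "vertices")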

  10. Evaluation of contraceptive history data in the Republic of Korea.

    Science.gov (United States)

    Pebley, A R; Goldman, N; Choe, M K

    1986-01-01

    The consistency of retrospective and current status data on contraceptive use from a series of national fertility surveys carried out during the 1970s in Korea is investigated. Aggregate consistency is examined among random samples from the same cohort or cohorts of women interviewed in each survey. The results indicate that estimates of trends in contraceptive use from a retrospective history in one survey, or from cross-sectional estimates in a series of surveys, can each yield misleading findings. Data from the 1974 Korean National Fertility Survey (KNFS) appear to be more reliable than those from other surveys, possibly because an interval-by-interval contraceptive history was used, explicit definitions of contraceptive methods were given prior to taking the contraceptive history, and the KNFS involved longer interviewer training and, perhaps, less time pressure during interviews.

  11. Performance evaluation of two highly interconnected Data Center networks

    DEFF Research Database (Denmark)

    Andrus, Bogdan-Mihai; Mihai Poncea, Ovidiu; Vegas Olmos, Juan José

    2015-01-01

    In this paper we present the analysis of highly interconnected topologies like hypercube and torus and how they can be implemented in data centers in order to cope with the rapid increase and demands for performance of the internal traffic. By replicating the topologies and subjecting them to ... the size of the network was increased by a factor of 32. The performance measurements are supported by abstract metrics that also give a cost and complexity indication in choosing the right topology for the required application.

  12. Uranium internal exposure evaluation based on urine assay data

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.N.P.

    1984-09-01

    The difficulties in assessing internal exposures to uranium from urine assay data are described. A simplified application of the ICRP-30 and ICRP Lung Model concepts to the estimation of uranium intake is presented. A discussion follows on the development of a computer code utilizing the ICRP-30-based uranium elimination model with the existing urine assay information. The calculated uranium exposures from 1949 through 1983 are discussed. 13 references, 1 table.

  13. Liquidity in JGB Markets: An Evaluation from Transaction Data

    OpenAIRE

    Tetsuo Kurosaki; Yusuke Kumano; Kota Okabe; Teppei Nagano

    2015-01-01

    There is no single, widely-accepted definition of "market liquidity" even though the expression "market liquidity is high/low" is frequently used, and measuring market liquidity is not easy. Recognizing these challenges, this paper formulates a set of new liquidity indicators using transaction data of the markets related to Japanese government bonds (JGBs), including futures, cash, and special collateral (SC) repo, thereby examining market liquidity from various angles. Traditional liquidity ...

  14. Neuromorphic Computing for Very Large Test and Evaluation Data Analysis

    Science.gov (United States)

    2014-05-01

    different types of sensors including their control and data flow. In the case of small unmanned systems, microcontrollers are certainly capable of handling...area, similar to how computer microprocessors are fabricated today. In Sections 3.2 and 4.2, a physical description is presented by which three...chips containing memristive Resistive Random Access Memory (ReRAM) devices in several different size crossbar architectures were acquired for use in

  15. National Needs for Critically Evaluated Physical and Chemical Data.

    Science.gov (United States)

    1978-01-01

    methods, fluidized beds, boiler efficiency, stack gas technology, liquefaction, low- and high-Btu gas, open-cycle gas turbines, alkali metal vapor turbine... Pharmaceutical research; industrial pollution; agricultural pollution. Table 7.3, Some Data Needs in Current Federal R&D Programs and Their Current Degree of...: products, coatings, films/fabrics, glass, 17%; 4. heavy equipment, steel products, rubber, transport, 13%; 5. food, drugs, pharmaceuticals, diagnostics, 8%

  16. A Signal Detection Theory Approach to Evaluating Oculometer Data Quality

    Science.gov (United States)

    Latorella, Kara; Lynn, William, III; Barry, John S.; Kelly, Lon; Shih, Ming-Yun

    2013-01-01

    Currently, data quality is described in terms of spatial and temporal accuracy and precision [Holmqvist et al., in press]. While this approach provides precise errors in pixels or visual angle, experiments are often more concerned with whether subjects' points of gaze can be said to be reliable with respect to experimentally relevant areas of interest (AOIs). This paper proposes a method to characterize oculometer data quality using Signal Detection Theory (SDT) [Marcum 1947]. SDT classification results in four cases: Hit (correct report of a signal), Miss (failure to report a signal), False Alarm (a signal falsely reported), and Correct Reject (absence of a signal correctly reported). A technique is proposed in which subjects are directed to look at points inside and outside of an AOI, and the resulting points of gaze (POG) are classified as Hits (points known to be internal to an AOI are classified as such), Misses (AOI points are not indicated as such), False Alarms (points external to AOIs are indicated as in the AOI), or Correct Rejects (points external to the AOI are indicated as such). SDT metrics describe performance in terms of discriminability, sensitivity, and specificity. This paper presentation will provide the procedure for conducting this assessment and an example of data collected for AOIs in a simulated flight deck environment.
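
    From the four counts, the standard SDT quantities follow directly. A sketch with illustrative counts; scipy's inverse normal supplies the z-transform for d'.

        # Sketch: SDT metrics from hit/miss/false-alarm/correct-reject counts.
        from scipy.stats import norm

        def sdt_metrics(hits, misses, fas, crs):
            sensitivity = hits / (hits + misses)        # hit rate
            specificity = crs / (crs + fas)             # correct-reject rate
            fa_rate = 1.0 - specificity
            d_prime = norm.ppf(sensitivity) - norm.ppf(fa_rate)  # discriminability
            return sensitivity, specificity, d_prime

        print(sdt_metrics(hits=88, misses=12, fas=9, crs=91))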

  17. Evaluation of commercial available fusion algorithms for Geoeye data

    Science.gov (United States)

    Vaiopoulos, Aristides D.; Nikolakopoulos, Konstantinos G.

    2013-10-01

    In this study, ten commercially available fusion techniques - Ehlers, Gram-Schmidt, High Pass Filter, Local Mean Matching (LMM), Local Mean and Variance Matching (LMVM), Modified IHS (ModIHS), Pansharp, PCA, HCS (Hyperspherical Color Space) and Wavelet - were used for the fusion of Geoeye panchromatic and multispectral data. The panchromatic data have a spatial resolution of 0.5 m, while the multispectral data have a spatial resolution of 2.0 m. The visual result, the statistical parameters and different quality indexes such as ERGAS, Q and entropy were examined, and the results are presented. The broader area of Pendeli mountain near the city of Athens, Greece, and more specifically two sub-areas with different characteristics, were chosen for the comparison. The first sub-area is located at the edge of the urban fabric and combines the characteristics of an urban and a rural area. The second sub-area comprises a large open quarry and is suitable for examining which fused product is most appropriate for mine monitoring.
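
    Of the quality indexes named, ERGAS has a compact closed form: ERGAS = 100 (h/l) sqrt((1/N) sum_k (RMSE_k / mu_k)^2), where h/l is the pan-to-MS pixel-size ratio (0.5/2.0 here). A sketch, assuming band-interleaved arrays:

        # Sketch: ERGAS between a reference multispectral image and a fused one.
        import numpy as np

        def ergas(reference, fused, ratio):
            """reference, fused: arrays of shape (bands, H, W); ratio = h/l."""
            rmse = np.sqrt(((reference - fused) ** 2).mean(axis=(1, 2)))
            means = reference.mean(axis=(1, 2))
            return 100.0 * ratio * np.sqrt(np.mean((rmse / means) ** 2))

        rng = np.random.default_rng(3)
        ref = rng.random((4, 64, 64))
        fus = ref + rng.normal(0, 0.01, ref.shape)
        print(ergas(ref, fus, ratio=0.5 / 2.0))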

  18. Safety evaluation of a Medical Device Data System.

    Science.gov (United States)

    Liddle, Stephanie; Grover, Lata; Zhang, Rachel; Khitrov, Maxim; Brown, Joan C; Cobb, J Perren; Goldman, Julian; Chou, Joseph; Yagoda, Daniel; Westover, Brandon; Reisner, Andrew T

    2012-01-01

    Our hospital became interested in the extraction of electronic data from our bedside monitor network to enrich clinical care and enable various quality improvement projects, research projects, and future applications involving advanced decision support. We conducted a range of tests to confirm the safety of deploying BedMaster (Excel Medical Electronics, Jupiter FL, USA), third-party software sold expressly to provide electronic data extraction and storage from networked General Electric Healthcare bedside patient monitors. We conducted a series of tests examining the changes in network performance when the BedMaster system was on our isolated patient-monitor network. We found that use of BedMaster led to measurable but trivial increases in network traffic and latency. We did not identify any failure scenarios in our analysis and testing. The major value of this report is to highlight potential challenges inherent in data and electronic-device integration within the healthcare setting. In describing our strategy for testing the BedMaster system, our intention is to present one testing protocol and to generate thought and discussion in the broader community about what types of problems can arise with interoperability, and what types of testing are necessary to mitigate these risks. Standards for interoperability would surely reduce the inherent risks.

  19. Evaluation of Meteorology Data for MOPITT Operational Processing

    Science.gov (United States)

    Ziskin, D.; Deeter, M. N.; Worden, H. M.; Mao, D.; Dean, V.

    2015-12-01

    Measurements Of Pollution In The Troposphere [1] (MOPITT) is an instrument flying aboard NASA's Terra satellite [2]. It measures CO using correlated spectroscopy [3]. As part of its processing it uses a surface temperature, an atmospheric temperature profile, and a water vapor profile from analysis. Since there are many analysis products available (e.g. GMAO, NCEP, ECMWF, etc.) that meet MOPITT's operational requirements, the question arises as to which product is most apt. There is a collection of "validation data" against which MOPITT compares its CO retrievals [4]. The validation dataset has been acquired from in-situ air samples taken by aircraft at a series of altitudes. We can run our processing system in "validation mode", which processes the satellite data only for the days on which validation data exist and for a spatial subset corresponding to the region where the validation data were collected. We will run the MOPITT retrievals in validation mode separately using each variety of analysis data. We will create a cost function that provides a scalar estimate of the retrieved CO profile error relative to the validation dataset, which is assumed to be "the truth". The retrieval errors for each of the input datasets will be compared to each other to provide insight into the best choice for use in operational MOPITT processing. [1] Drummond, J.R., "Measurements of Pollution in the Troposphere (MOPITT)," in The Use of EOS for Studies of Atmospheric Physics, J. C. Gille, G. Visconti, eds. (North Holland, Amsterdam), pp. 77-101, 1992. [2] 1999 EOS Reference Handbook: A Guide to NASA's Earth Science Enterprise and the Earth Observing System; Eds. Michael D. King and Reynold Greenstone; NASA, Greenbelt, MD, 1999. [3] Drummond, J.R., G. P. Brasseur, G. R. Davis, J. C. Gille, J. C. McConnell, G. D. Pesket, H. G. Reichle, N. Roulet, MOPITT Mission Description Document (Department of Physics, University of Toronto, Toronto, Ontario, Canada M5S 1A7), 1993. [4] Deeter, M. N
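
    One plausible form for the scalar cost function mentioned (an assumption for illustration, not the team's published metric) is the root-mean-square fractional error between retrieved and in-situ CO profiles, averaged over the validation matches:

        # Sketch: scalar retrieval-error cost against validation profiles.
        # A plausible RMS fractional error; not MOPITT's published metric.
        import numpy as np

        def profile_cost(retrieved, truth):
            """retrieved, truth: (n_matches, n_levels) CO mixing ratios."""
            frac_err = (retrieved - truth) / truth
            return float(np.sqrt(np.mean(frac_err ** 2)))

        truth = np.full((10, 7), 100.0)     # ppbv, toy validation profiles
        retrieved = truth * (1 + np.random.default_rng(4).normal(0, 0.08, truth.shape))
        print(f"cost = {profile_cost(retrieved, truth):.3f}")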

  20. A framework for evaluating forest landscape model predictions using empirical data and knowledge

    Science.gov (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson; William D. Dijak; Qia. Wang

    2014-01-01

    Evaluation of forest landscape model (FLM) predictions is indispensable to establish the credibility of predictions. We present a framework that evaluates short- and long-term FLM predictions at site and landscape scales. Site-scale evaluation is conducted through comparing raster cell-level predictions with inventory plot data whereas landscape-scale evaluation is...

  1. Evaluation of soil radioactivity data from the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Since 1951, 933 nuclear tests have been conducted at the Nevada Test Site (NTS) and test areas on the adjacent Tonopah Test Range (TTR) and Nellis Air Force Range (NAFR). Until the early 1960s, the majority of tests were atmospheric, involving detonation of nuclear explosive devices on the ground, on a tower, suspended from a balloon, or dropped from an airplane. Since the signing of the Limited Test Ban Treaty in 1963, most tests have been conducted underground, although several shallow subsurface tests took place between 1962 and 1968. As a result of the aboveground and near-surface nuclear explosions, as well as ventings of underground tests, destruction of nuclear devices with conventional explosives, and nuclear-rocket engine tests, the surface soil on portions of the NTS has been contaminated with radionuclides. Relatively little consideration was given to the environmental effects of nuclear testing during the first two decades of operations at the NTS. Since the early 1970s, however, increasingly strict environmental regulations have forced greater attention to be given to contamination problems at the site and how to remediate them. One key element in the current environmental restoration program at the NTS is determining the amount and extent of radioactivity in the surface soil. The general distribution of soil radioactivity on the NTS is already well known as a result of several programs carried out in the 1970s and 1980s. However, questions have been raised as to whether the data from those earlier studies are suitable for use in the current environmental assessments and risk analyses. The primary purpose of this preliminary data review is to determine to what extent the historical data collected at the NTS can be used in the characterization/remediation process.

  2. Evaluation of Target Picking Methods for Magnetic Data

    Science.gov (United States)

    2008-03-01

    The area is relatively flat, open grassland with elevation increasing from 5100 feet above sea level on the west, to 5400 feet above sea level on... the targets in this area. This was a lengthy endeavor, and during the systematic process of zooming and scrolling through the data this small...

  3. Towards consistent nuclear models and comprehensive nuclear data evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, O [Los Alamos National Laboratory; Hale, G M [Los Alamos National Laboratory; Lynn, J E [Los Alamos National Laboratory; Talou, P [Los Alamos National Laboratory; Bernard, D [FRANCE; Litaize, O [FRANCE; Noguere, G [FRANCE; De Saint Jean, C [FRANCE; Serot, O [FRANCE

    2010-01-01

    The essence of this paper is to highlight the consistency achieved nowadays in nuclear data and uncertainty assessments in terms of compound nucleus reaction theory, from the neutron separation energy to the continuum. By making the theories used in the resolved resonance (R-matrix theory), unresolved resonance (average R-matrix theory) and continuum (optical model) ranges continuous through a generalization of the so-called SPRT method, consistent average parameters are extracted from observed measurements, and the associated covariances are calculated over the whole energy range. This paper recalls, in particular, recent advances in fission cross-section calculations and suggests some hints for future developments.

  4. Performance Evaluation of Affinity Propagation Approaches on Data Clustering

    OpenAIRE

    Refianti, R.; Mutiara, A. B.; A.A. Syamsudduha

    2016-01-01

    Classical techniques for clustering, such as k-means clustering, are very sensitive to the initial set of data centers, so they need to be rerun many times in order to obtain an optimal result. A relatively new clustering approach named Affinity Propagation (AP) has been devised to resolve these problems. Although AP seems to be very powerful, it still has several issues that need to be improved. In this paper several improvements and developments are discussed, i.e. four other approaches: Adap...
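
    For reference, plain affinity propagation is available off the shelf; below is a minimal scikit-learn usage sketch of the baseline algorithm, not the paper's adaptive variants.

        # Sketch: baseline Affinity Propagation clustering with scikit-learn.
        import numpy as np
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(5)
        X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])

        ap = AffinityPropagation(damping=0.9, random_state=0).fit(X)
        print("clusters found:", len(ap.cluster_centers_indices_))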

  5. Evaluating the operations capability of Freedom's Data Management System

    Science.gov (United States)

    Sowizral, Henry A.

    1990-01-01

    Three areas of Data Management System (DMS) performance are examined: raw processor speed, the subjective speed of the Lynx OS X-Window system, and the operational capacity of the Runtime Object Database (RODB). It is concluded that the proposed processor will operate at its specified rate of speed and that the X-Window system operates within users' subjective needs. It is also concluded that the RODB cannot provide the required level of service, even with a two-orders-of-magnitude (100-fold) improvement in speed.

  6. 75 FR 34452 - Center for Drug Evaluation and Research Data Standards Plan; Availability for Comment

    Science.gov (United States)

    2010-06-17

    ... HUMAN SERVICES Food and Drug Administration Center for Drug Evaluation and Research Data Standards Plan... development of a comprehensive data standards program in the Center for Drug Evaluation and Research (CDER... Administration (FDA) is announcing the availability for public comment of the draft document entitled ``CDER Data...

  7. LLNL Data Disk Evaluation Report and Information Gathering Document #449.R1.3

    Energy Technology Data Exchange (ETDEWEB)

    BeLue, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-29

    This is a report on Data Storage Disk Evaluation and characterization. The purpose of this memo is to capture new recipients due to some recent characterization issues with the Hammer Mill process. The Data Storage Disk Evaluation report was generated utilizing data acquired during 2009 and 2010 from submitted storage media.

  8. Evaluation of the surface reflectance retrieval on the satellite data

    Science.gov (United States)

    Xu, Chunyan; Feng, Xuezhi; Xiao, Pengfeng; Wang, Peifa

    2007-06-01

    Electromagnetic radiance acquired by sensors is distorted mainly by atmospheric absorption and scattering. Atmospheric correction is required for quantitative analysis of remote sensing information. Atmospheric correction based on a radiative transfer model usually needs some atmospheric parameters to be chosen and estimated reasonably in advance when atmospheric observation data are lacking. In our work, a radiometric calibration was first applied to the satellite data using revised coefficients. Several parameters were then determined for the correction process, taking into account the earth-surface and atmospheric properties of the study area. The atmospheric correction was implemented using the 6S code, and the surface reflectance was retrieved. Lastly, the influence of atmospheric correction on the spectral response characteristics of different land covers was discussed with respect to the spectral response curve, NDVI, and the classification process. The results showed that the reflectance of all land covers decreases markedly in the three visible bands, but increases in the near-infrared and shortwave-infrared bands after atmospheric correction. NDVI of land covers also increases noticeably after the atmospheric influence is removed, and NDVI derived from the surface reflectance is the highest compared with that from the original digital numbers and the top-of-atmosphere reflectance. The accuracy of the supervised classification improves greatly, to 87.23%, after the atmospheric effect is corrected. The methods of parameter determination can serve as a reference in similar studies.
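
    The calibration step preceding 6S can be sketched as the standard top-of-atmosphere reflectance conversion, rho = pi L d^2 / (ESUN cos(theta_s)). The coefficient values below are illustrative placeholders, not the study's revised coefficients.

        # Sketch: radiance to top-of-atmosphere reflectance.
        # rho = pi * L * d^2 / (ESUN * cos(theta_s)); placeholder constants.
        import numpy as np

        def toa_reflectance(radiance, esun, sun_elev_deg, d_au=1.0):
            theta_s = np.radians(90.0 - sun_elev_deg)   # solar zenith angle
            return np.pi * radiance * d_au**2 / (esun * np.cos(theta_s))

        L = np.array([[45.0, 60.0], [52.0, 71.0]])      # W/(m^2 sr um), toy band
        print(toa_reflectance(L, esun=1550.0, sun_elev_deg=55.0))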

  9. Using Abdominal CT Data for Visceral Fat Evaluation

    Directory of Open Access Journals (Sweden)

    M Pop

    2013-10-01

    Background: Quantitative assessment of body fat is important for the diagnosis and treatment of diseases related to obesity, with computed tomography (CT) becoming the standard procedure for measuring abdominal fat distribution. Material and method: The retrospective study included 111 inpatients who underwent routine abdominal CT exams in the Radiology Laboratory of SCJU Tg.Mures (2013). MPR MDCT (SOMATOM AS 64) data were processed using custom-written MATLAB R2009b software, with ImageJ used for tracing the visceral fat area (VFA). Patient data (including blood glucose, cholesterol and triglycerides) were analyzed using MS Excel and GraphPad Inprism5. Results: The visceral fat percentage varied in the population from 14.59 to 68.69 (SD = 11.83), with a significant difference between sexes (male vs. female, 46.98 vs. 31.62, p ...). Cholesterol > 220 mg% and triglycerides > 150 mg% are significantly associated with the VF percent (p < 0.05). Overall there is a weak correlation between the laboratory variables and the measured fat, the strongest being between triglycerides and the VFA (r = +0.23) and between age and the VFA percentage (certain samples). Conclusions: The technique used should decrease the human error in marking the fat areas, providing a better estimation of the VF/VF percentage. CT-measured VF relates to certain laboratory tests. Further analysis is required for better use of CT in the diagnosis and treatment of obesity-related pathology
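
    The pixel-counting step behind such VFA measurements can be sketched with the commonly used adipose-tissue Hounsfield window of roughly -190 to -30 HU; the visceral region-of-interest mask is assumed given (in the study it was traced in ImageJ), and the data below are synthetic.

        # Sketch: visceral fat share from one abdominal CT slice.
        # Adipose tissue is commonly windowed to about -190..-30 HU.
        import numpy as np

        def fat_percentage(hu_slice, roi_mask, lo=-190, hi=-30):
            fat = (hu_slice >= lo) & (hu_slice <= hi)
            visceral_fat = np.count_nonzero(fat & roi_mask)
            total_fat = np.count_nonzero(fat)
            return 100.0 * visceral_fat / total_fat if total_fat else 0.0

        rng = np.random.default_rng(6)
        hu = rng.integers(-250, 100, size=(256, 256))    # toy HU values
        roi = np.zeros_like(hu, dtype=bool)
        roi[64:192, 64:192] = True                       # assumed visceral ROI
        print(f"visceral fat share: {fat_percentage(hu, roi):.1f}%")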

  10. A new approach for evaluating measured wake data

    Energy Technology Data Exchange (ETDEWEB)

    Magnusson, Mikael [Uppsala Univ. (Sweden). Dept. of Meteorology

    1996-12-01

    Wind turbine wakes have been studied by analysing a large set of atmospheric data from a wind farm with four turbines sited in a flat coastal area. The results obtained have been generalized to allow tests against data from other full-scale wind turbines as well as wind tunnel simulations. These comparisons are found to give very satisfactory results. The thrust coefficient is found to be a better parameter than wind speed for describing wake characteristics, because it implicitly includes the effect of regulation. It is also found that down-wind travel time is more convenient to use than down-wind distance in this context. The travel time to the end of the near-wake region, i.e. to the point where a single velocity-deficit peak first appears, is found to be inversely proportional to the rotational frequency of the turbine and to the turbulence intensity of the ambient air flow, and proportional to the ratio of the wake radius to the hub height. For larger travel times, i.e. for the far-wake region, it is found that the centre-line relative velocity deficit decreases with the logarithm of the travel time and depends parametrically on the time constant and the thrust coefficient. 3 refs, 5 figs
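
    The stated near-wake proportionality can be written as t_near = C (R_w / H) / (f I), where C is an empirical constant not given in the abstract; a one-line sketch with placeholder values:

        # Sketch: near-wake travel time from the stated proportionality,
        # t_near = C * (R_w / H) / (f * I). C is a placeholder constant.
        def near_wake_travel_time(R_w, H, f, I, C=1.0):
            return C * (R_w / H) / (f * I)

        # Placeholder values: 20 m wake radius, 30 m hub height,
        # 0.5 Hz rotation, 10% ambient turbulence intensity.
        print(f"{near_wake_travel_time(R_w=20.0, H=30.0, f=0.5, I=0.10):.1f} s")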

  11. Aerodynamic flight evaluation analysis and data base update

    Science.gov (United States)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  12. Use of Biomonitoring Data to Evaluate Methyl Eugenol Exposure

    Science.gov (United States)

    Robison, Steven H.; Barr, Dana B.

    2006-01-01

    Methyl eugenol is a naturally occurring material found in a variety of food sources, including spices, oils, and nutritionally important foods such as bananas and oranges. Given its natural occurrence, a broad cross-section of the population is likely exposed. The availability of biomonitoring and toxicology data offers an opportunity to examine how biomonitoring data can be integrated into risk assessment. Methyl eugenol has been used as a biomarker of exposure. An analytical method to detect methyl eugenol in human blood samples is well characterized but not readily available. Human studies indicate that methyl eugenol is short-lived in the body, and despite the high potential for exposure through the diet and environment, human blood levels are relatively low. The toxicology studies in animals demonstrate that relatively high-bolus doses administered orally result in hepatic neoplasms. However, an understanding is lacking regarding how this effect relates to the exposures that result when food containing methyl eugenol is consumed. Overall, the level of methyl eugenol detected in biomonitoring studies indicates that human exposure is several orders of magnitude lower than the lowest dose used in the bioassay. Furthermore, there are no known health effects in humans that result from typical dietary exposure to methyl eugenol. PMID:17107870

  13. Collection and evaluation of salt mixing data with the real time data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Glazer, S.; Chiu, C.; Todreas, N.E.

    1977-09-01

    A minicomputer-based real-time data acquisition system was designed and built to facilitate data collection during salt mixing tests in mock-ups of LMFBR rod bundles. The system represents an expansion of data collection capabilities over previous equipment. It performs steady-state and transient monitoring and recording of up to 512 individual electrical-resistance probes. Extensive real-time software was written to govern all phases of the data collection procedure, including probe definition, probe calibration, salt mixing test data acquisition and storage, and data editing. Offline software was also written to permit data examination and reduction to dimensionless salt concentration maps. Finally, the computer program SUPERENERGY was modified to permit rapid extraction of parameters from dimensionless salt concentration maps. The document describes the computer system and includes circuit diagrams of all custom-built components. It also includes descriptions and listings of all software written, as well as extensive user instructions.

  14. Evaluation method of nuclear data: half-lives, gamma-ray intensities etc

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Yasukazu; Miyatake, Osamu; Toyama, Masao

    1998-03-01

    The evaluation method has been studied. The basic problem is how to estimate and treat systematic errors. Nuclear decay data were evaluated, and eight practical examples of half-life evaluations are shown in this report. (author)
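
    A core operation in such evaluations is the uncertainty-weighted mean with a consistency check. Below is a minimal sketch using standard inverse-variance weighting with internal/external uncertainties and the Birge ratio; the half-life measurements are made up, not taken from the report.

        # Sketch: inverse-variance weighted mean of half-life measurements,
        # with internal/external uncertainties and the Birge ratio.
        import numpy as np

        def weighted_mean(values, sigmas):
            w = 1.0 / np.asarray(sigmas) ** 2
            mean = np.sum(w * values) / np.sum(w)
            sig_int = np.sqrt(1.0 / np.sum(w))                 # internal
            chi2_nu = np.sum(w * (values - mean) ** 2) / (len(values) - 1)
            sig_ext = sig_int * np.sqrt(chi2_nu)               # external
            return mean, max(sig_int, sig_ext), np.sqrt(chi2_nu)

        t_half = np.array([5.271, 5.274, 5.269, 5.280])        # made-up, years
        sigma = np.array([0.003, 0.004, 0.002, 0.006])
        print(weighted_mean(t_half, sigma))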

  15. MoDOT pavement preservation research program volume IV, pavement evaluation tools-data collection methods.

    Science.gov (United States)

    2015-10-01

    The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3: Pavement Evaluation Tools - Data Collection Methods, was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to ...

  16. RealWorld evaluation: working under budget, time, data, and political constraints

    National Research Council Canada - National Science Library

    Bamberger, Michael; Rugh, Jim; Mabry, Linda

    2012-01-01

    This book addresses the challenges of conducting program evaluations in real-world contexts where evaluators and their clients face budget and time constraints and where critical data may be missing...

  17. Industrial process heat data analysis and evaluation. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Lewandowski, A; Gee, R; May, K

    1984-07-01

    The Solar Energy Research Institute (SERI) has modeled seven of the Department of Energy (DOE) sponsored solar Industrial Process Heat (IPH) field experiments and has generated thermal performance predictions for each project. Additionally, these performance predictions have been compared with actual performance measurements taken at the projects. Predictions were generated using SOLIPH, an hour-by-hour computer code with the capability of modeling many types of solar IPH components and system configurations. Comparisons of reported and predicted performance resulted in good agreement when field-test reliability and availability were high. Volume I contains the main body of the work: objective, model description, site configurations, model results, data comparisons, and summary. Volume II contains complete performance prediction results (tabular and graphic output) and computer program listings.

  18. Photonuclear data evaluation of U{sup 235}

    Energy Technology Data Exchange (ETDEWEB)

    Raskinyte, I.; Dupont, E.; Ridikas, D

    2005-11-15

    This report presents cross-section calculations up to 130 MeV for U{sup 235}. The photoabsorption process is described by the giant dipole resonance and quasi-deuteron mechanisms. Neutron emission is treated within the preequilibrium and statistical models, while fission is also calculated within this statistical approach using a double-humped fission barrier. The U{sup 235}({gamma},n), ({gamma},2n) and ({gamma},f) cross-sections are calculated with the TALYS-0.64 code using a coupled-channels optical potential. Changes in level densities did not affect the results significantly. Fission barrier heights and widths were modified to reproduce the experimental data. Neither fission level densities nor other default fission parameters were changed. More calculations are needed in order to check the sensitivity of the cross-sections to other level densities and fission models. (A.C.)

  19. Monitoring and evaluation of rowing performance using mobile mapping data

    Science.gov (United States)

    Mpimis, A.; Gikas, V.

    2011-12-01

    Traditionally, the term mobile mapping refers to a means of collecting geospatial data using mapping sensors mounted on a mobile platform. Historically, this process was driven mainly by the need for highway infrastructure mapping and transportation corridor inventories. However, recent advances in mapping sensor and telecommunication technologies have created the opportunity for completely new, emergent application areas of mobile mapping to evolve rapidly. This article examines the potential of mobile mapping technology (MMT) in sports science, and in particular in competitive rowing. Notably, the concept of mobile mapping in this study differs somewhat from the traditional one, in that the end result is not the geospatial information acquired as the moving platform travels in space. Instead, the interest is placed on the moving platform (the rowing boat) itself and on its various subsystems, which are also in continuous motion.

  20. First evaluation of MyOcean altimetric data in the Arctic Ocean

    DEFF Research Database (Denmark)

    Cheng, Yongcun; Andersen, Ole Baltazar; Knudsen, Per

    2012-01-01

    The MyOcean V2 preliminary (V2p) data set of weekly gridded sea level anomaly (SLA) maps from 1993 to 2009 over the Arctic region is evaluated against existing altimetric data sets and tide gauge data. Compared with DUACS V3.0.0 (Data Unification and Altimeter Combination System) data set, My...

  1. EVALUATION AND MAPPING OF RANGELANDS DEGRADATION USING REMOTELY SENSED DATA

    Directory of Open Access Journals (Sweden)

    Majid Ajorlo

    2005-05-01

    Empirical and scientific evidence shows that misuse of natural resources causes their degradation, so natural resource conservation is important for achieving sustainable development aims. In the current study, Landsat Thematic Mapper images and the grazing gradient method were used to map the extent and degree of rangeland degradation. During ground-based data collection, factors such as vegetation cover, litter, plant diversity, bare soil, and stones and gravel were estimated as biophysical indicators of degradation. After geometric correction and the necessary pre-processing of the study area's images, the most suitable vegetation index for mapping rangeland degradation was selected from among the Normalized Difference Vegetation Index (NDVI), the Soil Adjusted Vegetation Index (SAVI), and the Perpendicular Vegetation Index (PVI). The rangeland degradation map was then produced using the selected vegetation index and a distance parameter. The results of the ground-based data analysis reveal a significant relation between increasing distance from critical points and both plant diversity and the percentage of litter. There is also a significant relation between vegetation cover percentage and distance from villages, i.e. vegetation cover increases with increasing distance from villages, although the same did not hold around the stock watering points. Bare soil behaved similarly to vegetation cover, while no significant relation was found between the stones-and-gravel indicator and distance from critical points. The results of the image processing show that NDVI is sensitive to vegetation changes along the grazing gradient and can be a suitable vegetation index for mapping rangeland degradation. The degradation map shows high degradation around the critical points; these areas need urgent attention for soil conservation. Generally, it
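
    NDVI itself is a one-line computation on the red and near-infrared bands; a minimal sketch with toy values (TM bands 3 and 4 carry red and near-infrared, respectively):

        # Sketch: NDVI = (NIR - Red) / (NIR + Red), guarding against zero sums.
        import numpy as np

        def ndvi(nir, red):
            nir = nir.astype(float)
            red = red.astype(float)
            denom = nir + red
            return np.where(denom == 0, 0.0, (nir - red) / denom)

        nir = np.array([[120, 180], [90, 200]], dtype=float)
        red = np.array([[60, 50], [80, 40]], dtype=float)
        print(ndvi(nir, red))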

  2. Decay Data Evaluation Project (DDEP): Updated evaluation of the 133Ba, 140Ba, 140La and 141Ce decay characteristics

    Science.gov (United States)

    Chechev, Valerii P.; Kuzmenko, Nikolai K.

    2017-09-01

    Within the Decay Data Evaluation Project (DDEP) an updated comprehensive assessment has been made of the decay characteristics of 133Ba, 140Ba, 140La, and 141Ce. Experimental data published up to 2016 along with other information (new compilations, analyses and corrections) were taken into account. Newly evaluated values of the half-lives and a number of other key decay characteristics are presented in this paper for all four radionuclides.

  3. TUNL Nuclear Structure Data Evaluation on A = 2-20 Nuclides

    Science.gov (United States)

    Truong, Thinh; Kelley, John; Sheu, Grace

    2016-09-01

    Nuclear data represent measured or evaluated probabilities of various physical interactions involving the nuclei of atoms. The nuclear data group at Triangle Universities Nuclear Laboratory (TUNL) compiles, evaluates, and disseminates nuclear structure data relevant to light nuclei in the mass region A = 2-20. Our activities primarily involve surveying literature articles and producing recommended values for inclusion in various United States Nuclear Data Program databases, such as the Experimental Unevaluated Nuclear Data List (XUNDL) and the Evaluated Nuclear Structure Data File (ENSDF). We have projects related to analyzing beta-decay lifetimes, compiling structure data from recently published articles, and producing full nuclear structure data evaluations of nuclides based on all existing literature. The disseminated nuclear data are used for theoretical model development in nuclear physics and for applications involving radiation and nuclear power technologies. This work is supported by the U.S. National Science Foundation Grant No. NSF-PHY-1461204 and Duke/TUNL.

  4. Decay Data Evaluation Project (DDEP): Updated decay data evaluations for (24)Na, (46)Sc, (51)Cr, (54)Mn, (57)Co, (59)Fe, (88)Y, (198)Au.

    Science.gov (United States)

    Chechev, Valery P; Kuzmenko, Nikolay K

    2016-03-01

    Updated DDEP evaluations have been presented for the decay characteristics of the radionuclides (24)Na, (46)Sc, (51)Cr, (54)Mn, (57)Co, (59)Fe, (88)Y and (198)Au. Previous DDEP evaluations for these radionuclides were published in the BIPM-5 monographie in 2004. The experimental data published during the intervening period of 2004-2014 were taken into account in the current evaluations as well as other information: new compilations, analyses, and corrections. The updated evaluations are compared to previous results.

  5. Evaluating post-disaster ecosystem resilience using MODIS GPP data

    Science.gov (United States)

    Frazier, Amy E.; Renschler, Chris S.; Miles, Scott B.

    2013-04-01

    An integrated community resilience index (CRI) quantifies the status, exposure, and recovery of the physical, economic, and socio-cultural capital for a specific target community. However, most CRIs do not account for the recovery of ecosystem functioning after extreme events, even though many aspects of a community depend on the services provided by the natural environment. The primary goal of this study was to monitor the recovery of ecosystem functionality (ecological capital) using remote sensing-derived gross primary production (GPP) as an indicator of 'ecosystem-wellness' and assess the effect of resilience of ecological capital on the recovery of a community via an integrated CRI. We developed a measure of ecosystem resilience using remotely sensed GPP data and applied the modeling prototype ResilUS in a pilot study for a four-parish coastal community in southwestern Louisiana, USA that was impacted by Hurricane Rita in 2005. The results illustrate that after such an extreme event, the recovery of ecological capital varies according to land use type and may take many months to return to full functionality. This variable recovery can potentially impact the recovery of certain businesses that rely heavily on ecosystem services such as agriculture, forestry, fisheries, and tourism.
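
    One simple way to express the recovery idea in this record is to normalize post-event GPP by a pre-event baseline. The sketch below illustrates that idea under stated assumptions (monthly GPP values, hypothetical numbers); it is not the ResilUS formulation itself.

```python
import numpy as np

def recovery_index(gpp, event_idx, baseline_window=12):
    """Post-event GPP relative to the pre-event baseline mean (1.0 = fully recovered)."""
    baseline = np.mean(gpp[event_idx - baseline_window:event_idx])
    return np.asarray(gpp[event_idx:]) / baseline

# Toy monthly GPP series: stable at 10.0, dropping to 4.0 after the event at
# index 24, then recovering linearly over the following year
gpp = np.concatenate([np.full(24, 10.0), np.linspace(4.0, 10.0, 12)])
print(recovery_index(gpp, 24).round(2))
```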

  6. Evaluation of ERTS data for certain hydrological uses

    Science.gov (United States)

    Wiesnet, D. R.; Mcginnis, D. F. (Principal Investigator); Matson, M.

    1973-01-01

    The author has identified the following significant results. Mapping of snow cover using ERTS-1 data proved to be six times faster than that done from U-2 photography. However, NOAA-2 VHRR snow cover mapping was almost as fast as ERTS-1, and it is available more frequently. Ice conditions in the Great Lakes can be readily determined by ERTS-1. Ice features characteristic of thawing conditions such as rotten ice, lack of pressure ridges, brash belts, and compacted ice edges can be identified. A great decrease in apparent reflectivity in band 7 as compared to band 4 also indicated melting conditions. Using sidelap from two successive ERTS-1 images of Lake Erie (February 17 and 18, 1973) a measure of ice movement was made, agreeing closely with the estimate from conventional methods. The same imagery permitted tentative identification of the following features: shuga, light and dark nilas, fast ice, icefoot, ice breccia, brash ice, fracturing, ridging, rafting, sastrugi, thaw holes, rotten ice, ice islands, dried ice puddles, hummocked ice, and leads.

  7. Police Enforcement Policy and Programmes on European Roads (PEPPER). Deliverable 4a: Good practice in data, data collection and data use for monitoring and evaluating Traffic Law Enforcement.

    NARCIS (Netherlands)

    Schagen, I.N.L.G.; Bernhoft, I.M.; Erke, A.; Ewert, U.; Kallberg, V.-P. & Skladana, P.

    2008-01-01

    This report is the Deliverable of task 4.3a of the PEPPER project. It describes the good practice requirements regarding data, data collection and data use for monitoring and evaluating Traffic Law Enforcement (TLE). The aim is that, eventually, individual police forces/countries put the identified

  8. Photoneutron reaction cross sections from various experiments - analysis and evaluation using physical criteria of data reliability

    Science.gov (United States)

    Varlamov, Vladimir; Ishkhanov, Boris; Orlin, Vadim; Peskov, Nikolai; Stepanov, Mikhail

    2017-09-01

    The majority of the photonuclear reaction cross sections important for many fields of science and technology, and of the various data files (EXFOR, RIPL, ENDF, etc.) supported by the IAEA, were obtained in experiments using quasimonoenergetic annihilation photons. There are well-known systematic discrepancies between the partial photoneutron reactions (γ, 1n), (γ, 2n), and (γ, 3n). Objective physical criteria were proposed for analyzing the reliability of these data. It was found that the experimental data for many nuclei are not reliable because of large systematic uncertainties in the neutron multiplicity sorting method used. A combined experimental-theoretical method was proposed for evaluating reaction cross section data that satisfy the reliability criteria. Partial and total reaction cross sections were evaluated for many nuclei. In many cases the evaluated data differ noticeably from both the experimental data and the data evaluated earlier for the IAEA Photonuclear Data Library. It has therefore become evident that the IAEA Library needs to be revised and updated.

  9. Identifying chronic conditions in Medicare claims data: evaluating the Chronic Condition Data Warehouse algorithm.

    Science.gov (United States)

    Gorina, Yelena; Kramarow, Ellen A

    2011-10-01

    To examine the strengths and limitations of the Centers for Medicare and Medicaid Services' Chronic Condition Data Warehouse (CCW) algorithm for identifying chronic conditions in older persons from Medicare beneficiary data, records from participants of the NHANES I Epidemiologic Follow-up Study (NHEFS 1971-1992) were linked to Medicare claims data from 1991 to 2000. We estimated the percentage of preexisting cases of chronic conditions correctly identified by the CCW algorithm during its reference period and the number of years of claims data necessary to find a preexisting condition. The CCW algorithm identified 69 percent of preexisting diabetes cases but only 17 percent of preexisting arthritis cases. Cases identified by the CCW are a mix of preexisting and newly diagnosed conditions. The prevalence of conditions needing less frequent health care utilization (e.g., arthritis) may be underestimated by the CCW algorithm, and the CCW reference periods may not be sufficient for all analytic purposes.
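
    The flagging logic being evaluated here follows a general claims-based pattern: a beneficiary is flagged for a condition if a qualifying claim appears within the algorithm's reference period. The sketch below is a deliberately simplified illustration of that pattern; the actual CCW algorithm uses condition-specific code lists, claim-type rules, and reference periods that are not reproduced here, and all codes and dates below are hypothetical.

```python
import pandas as pd

# Toy claims table (beneficiary IDs, service dates, hypothetical ICD-9 codes)
claims = pd.DataFrame({
    "beneficiary": ["A", "A", "B"],
    "date": pd.to_datetime(["1998-03-01", "1999-07-15", "2000-01-10"]),
    "code": ["250.0", "250.0", "714.0"],
})

DIABETES_CODES = {"250.0"}                               # placeholder code set
REFERENCE_END = pd.Timestamp("2000-12-31")
window_start = REFERENCE_END - pd.DateOffset(years=2)    # illustrative 2-year reference period

qualifying = claims[claims["code"].isin(DIABETES_CODES) & (claims["date"] >= window_start)]
flagged = sorted(set(qualifying["beneficiary"]))         # beneficiaries flagged for the condition
print(flagged)                                           # ['A']
```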

  10. Developing a Survey Instrument for Evaluating the Effectiveness of Data Management Training Materials

    Science.gov (United States)

    Hou, C. Y.; Soyka, H.; Hutchison, V.; Budden, A. E.

    2016-12-01

    Education and training resources that focus on best practices and guidelines for working with data such as: data management, data sharing, quality metadata creation, and maintenance for reuse, have vital importance not only to the users of Data Observation Network for Earth (DataONE), but also to the broader scientific, research, and academic communities. However, creating and maintaining relevant training/educational materials that remain sensitive and responsive to community needs is dependent upon careful evaluations of the current landscape in order to promote and support thoughtful development of new resources. Using DataONE's existing training/educational resources as the basis for this project, the authors have worked to develop an evaluation instrument that can be used to evaluate the effectiveness of data management training/education resources. The evaluation instrument is in the form of a digital questionnaire/survey. The evaluation instrument also includes the structure and content as recommended by the best practices/guidelines of questionnaire/survey design, based on a review of the literature. Additionally, the evaluation instrument can be customized to evaluate various training/education modalities and be implemented using a web-based questionnaire/survey platform. Finally, the evaluation instrument can be used for site-wide evaluation of DataONE teaching materials and resources, and once made publicly available and openly accessible, other organizations may also utilize the instrument. One key outcome of developing the evaluation instrument is to help in increasing the effectiveness of data management training/education resources across the Earth/Geoscience community. Through this presentation, the authors will provide the full background and motivations for creating an instrument for evaluating the effectiveness of data management training/education resources. The presentation will also discuss in detail the process and results of the current

  11. Tracking Foodborne Pathogens from Farm to Table: Data Needs to Evaluate Control Options

    OpenAIRE

    Anonymous; Jensen, Helen H.; Unnevehr, Laurian J.

    1995-01-01

    Food safety policymakers and scientists came together at a conference in January 1995 to evaluate data available for analyzing control of foodborne microbial pathogens. This proceedings starts with data regarding human illnesses associated with foodborne pathogens and moves backwards in the food chain to examine pathogen data in the processing sector and at the farm level. Of special concern is the inability to link pathogen data throughout the food chain. Analytical tools to evaluate the imp...

  12. A data set for evaluating the performance of multi-class multi-object video tracking

    OpenAIRE

    Chakraborty, Avishek; Stamatescu, Victor; Wong, Sebastien C.; Wigley, Grant; Kearney, David

    2017-01-01

    One of the challenges in evaluating multi-object video detection, tracking and classification systems is having publicly available data sets with which to compare different systems. However, the measures of performance for tracking and classification are different. Data sets that are suitable for evaluating tracking systems may not be appropriate for classification. Tracking video data sets typically only have ground truth track IDs, while classification video data sets only have ground tru...

  13. THE AIMS AND ACTIVITIES OF THE INTERNATIONAL NETWORK OF NUCLEAR STRUCTURE AND DECAY DATA EVALUATORS.

    Energy Technology Data Exchange (ETDEWEB)

    NICHOLS,A.L.; TULI, J.K.

    2007-04-22

    The International Network of Nuclear Structure and Decay Data (NSDD) Evaluators consists of a number of evaluation groups and data service centers in several countries that appreciate the merits of working together to maintain and ensure the quality and comprehensive content of the ENSDF database (Evaluated Nuclear Structure Data File). Biennial meetings of the network are held under the auspices of the International Atomic Energy Agency (IAEA) to assign evaluation responsibilities, monitor progress, discuss improvements and emerging difficulties, and agree on actions to be undertaken by individual members. The evaluated data and bibliographic details are made available to users via various media, such as the journals Nuclear Physics A and Nuclear Data Sheets, the World Wide Web, CD-ROM, wall charts of the nuclides and the Nuclear Wallet Cards. While the ENSDF master database is maintained by the US National Nuclear Data Center at Brookhaven National Laboratory, these data are also available from other nuclear data centers, including the IAEA Nuclear Data Section. The Abdus Salam International Centre for Theoretical Physics (ICTP), Trieste, Italy, in cooperation with the IAEA, organizes workshops on NSDD at regular intervals. The primary aims of these workshops are to provide hands-on training in the data evaluation processes and to encourage new evaluators to participate in NSDD activities. The technical contents of these NSDD workshops are described, along with the rationale for the inclusion of various topics.

  14. Evaluation of nuclear data for R and D projects; development of database for medical nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Tae Suk [Catholic University, Seoul (Korea); Shin, D. O. [Kyung Hee University, Seoul (Korea); Joh, C. W.; Chang, J. S. [Ajou University, Suwon (Korea); Choi, Y. [Sungkyunkwan University, Seoul (Korea); Kim, S. H. [Hanyang University, Seoul (Korea); Park, S. Y. [National Cancer Center, Seoul (Korea); Shin, D. H.; Lee, S [Kyonggi University, Seoul (Korea)

    2002-04-01

    Medical nuclear data used in Korea are not provided by the relevant academic associations and organizations, or even by the relevant government bodies. This work aimed to investigate the diagnostic and therapeutic equipment in clinical use and the domestic status of nuclear data and physical properties of sealed and unsealed radioactive isotopes, and to establish a nuclear database. About 120 domestic centers perform nuclear medicine tests and 52 medical centers provide radiotherapy. Some 30 different radionuclides are commonly used in nuclear medicine in the country: about 30 kinds of unsealed sources are used for diagnosis and therapy, and about 10 kinds of sealed sources for brachytherapy. Special radiotherapy includes Gamma Knife, linac-based stereotactic radiosurgery, conformal radiotherapy, and intensity-modulated radiotherapy. A nuclear database has been built from the collected data, and the accompanying web site makes the nuclear data easily available to anyone who needs them. 39 refs., 20 figs., 8 tabs. (Author)

  15. Development and evaluation of a data-adaptive alerting algorithm for univariate temporal biosurveillance data.

    Science.gov (United States)

    Elbert, Yevgeniy; Burkom, Howard S

    2009-11-20

    This paper discusses further advances in making robust predictions with the Holt-Winters forecasts for a variety of syndromic time series behaviors and introduces a control-chart detection approach based on these forecasts. Using three collections of time series data, we compare biosurveillance alerting methods with quantified measures of forecast agreement, signal sensitivity, and time-to-detect. The study presents practical rules for initialization and parameterization of biosurveillance time series. Several outbreak scenarios are used for detection comparison. We derive an alerting algorithm from forecasts using Holt-Winters-generalized smoothing for prospective application to daily syndromic time series. The derived algorithm is compared with simple control-chart adaptations and to more computationally intensive regression modeling methods. The comparisons are conducted on background data from both authentic and simulated data streams. Both types of background data include time series that vary widely by both mean value and cyclic or seasonal behavior. Plausible, simulated signals are added to the background data for detection performance testing at signal strengths calculated to be neither too easy nor too hard to separate the compared methods. Results show that both the sensitivity and the timeliness of the Holt-Winters-based algorithm proved to be comparable or superior to that of the more traditional prediction methods used for syndromic surveillance.
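
    A minimal sketch of the forecast-based alerting idea follows, using the Holt-Winters implementation in statsmodels on simulated daily counts. The one-sided 3-sigma limit is one plausible control-chart adaptation, not the paper's exact parameterization.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Simulated daily syndromic counts with a weekly cycle (illustrative only)
rng = np.random.default_rng(0)
idx = pd.date_range("2009-01-01", periods=8 * 7, freq="D")
counts = 50 + 10 * np.sin(2 * np.pi * idx.dayofweek / 7) + rng.normal(0, 3, len(idx))
series = pd.Series(counts, index=idx)

train, today = series[:-1], series.iloc[-1]
fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                           seasonal_periods=7).fit()
forecast = fit.forecast(1).iloc[0]
sigma = (train - fit.fittedvalues).std()    # residual spread on the training window

if today > forecast + 3 * sigma:            # one-sided control limit
    print("alert: observed count exceeds forecast-based threshold")
```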

  16. Online Traffic Condition Evaluation Method for Connected Vehicles Based on Multisource Data Fusion

    Directory of Open Access Journals (Sweden)

    Pang-wei Wang

    2017-01-01

    With the development of connected vehicles (CV) and Vehicle-to-X (V2X) communication, more and more traffic data are being collected from the road network. In order to predict future traffic conditions from connected-vehicle data in real time, we present an online traffic condition evaluation model utilizing V2X communication. This model employs the Analytic Hierarchy Process (AHP) and multilevel fuzzy set theory to fuse multiple sources of information for prediction. First, contemporary vehicle data from the On-Board Diagnostic (OBD) interface are fused with static road data in the Road Side Unit (RSU). Then, real-time traffic evaluation scores are calculated using a variable membership model. Real data collected by OBU in a field test demonstrate the feasibility of the evaluation model. Compared with traditional evaluation systems, the proposed model can handle more types of data while demanding less data transfer.
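
    The AHP step of such a fusion model derives indicator weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch follows, with a hypothetical 3x3 judgment matrix; the paper's actual indicators and judgments are not reproduced here.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons for three traffic indicators
# (e.g., mean speed vs. density vs. occupancy)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                     # normalized AHP weights

CI = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index of the judgments
print(w.round(3), round(CI, 4))
```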

  17. Use of genomic data in risk assessment case study: II. Evaluation of the dibutyl phthalate toxicogenomic data set

    Energy Technology Data Exchange (ETDEWEB)

    Euling, Susan Y., E-mail: euling.susan@epa.gov [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, DC (United States); White, Lori D. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Kim, Andrea S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, DC (United States); Sen, Banalata [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Wilson, Vickie S. [National Health and Environmental Effects Research Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Keshava, Channa; Keshava, Nagalakshmi [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, DC (United States); Hester, Susan [National Health and Environmental Effects Research Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Ovacik, Meric A.; Ierapetritou, Marianthi G.; Androulakis, Ioannis P. [National Center for Environmental Research Science to Achieve Results (STAR) Bioinformatics Center, Environmental Bioinformatics and Computational Toxicology Center (ebCTC), Rutgers University and University of Medicine and Dentistry of New Jersey, Piscataway, NJ (United States); Gaido, Kevin W. [Center for Veterinary Medicine, U.S. Food and Drug Administration, Rockville, MD 20855 (United States)

    2013-09-15

    An evaluation of the toxicogenomic data set for dibutyl phthalate (DBP) and male reproductive developmental effects was performed as part of a larger case study to test an approach for incorporating genomic data in risk assessment. The DBP toxicogenomic data set is composed of nine in vivo studies from the published literature that exposed rats to DBP during gestation and evaluated gene expression changes in testes or Wolffian ducts of male fetuses. The exercise focused on qualitative evaluation, based on a lack of available dose–response data, of the DBP toxicogenomic data set to postulate modes and mechanisms of action for the male reproductive developmental outcomes, which occur in the lower dose range. A weight-of-evidence evaluation was performed on the eight DBP toxicogenomic studies of the rat testis at the gene and pathway levels. The results showed relatively strong evidence of DBP-induced downregulation of genes in the steroidogenesis pathway and lipid/sterol/cholesterol transport pathway as well as effects on immediate early gene/growth/differentiation, transcription, peroxisome proliferator-activated receptor signaling and apoptosis pathways in the testis. Since two established modes of action (MOAs), reduced fetal testicular testosterone production and Insl3 gene expression, explain some but not all of the testis effects observed in rats after in utero DBP exposure, other MOAs are likely to be operative. A reanalysis of one DBP microarray study identified additional pathways within cell signaling, metabolism, hormone, disease, and cell adhesion biological processes. These putative new pathways may be associated with DBP effects on the testes that are currently unexplained. This case study on DBP identified data gaps and research needs for the use of toxicogenomic data in risk assessment. Furthermore, this study demonstrated an approach for evaluating toxicogenomic data in human health risk assessment that could be applied to future chemicals

  18. General Guidelines on Criteria for Adoption or Rejection of Evaluated Libraries and Data by the Nuclear Data Team

    Energy Technology Data Exchange (ETDEWEB)

    Neudecker, Denise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gray, Mark Girard [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, Donald Kent [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); White, Morgan Curtis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-12

    This memo contains general guidelines on what documentation and tools need to be in place as well as format and data testing requirements such that evaluated nuclear data sets or entire libraries can be adopted by the nuclear data team. Additional requirements beyond this memo might apply for specific nuclear data observables. These guidelines were established based on discussions between J.L. Conlin, M.G. Gray, A.P. McCartney, D. Neudecker, D.K. Parsons and M.C. White.

  19. Chemical Kinetics and Photochemical Data for Use in Stratospheric Modeling. Evaluation No. 12

    Science.gov (United States)

    DeMore, W. B.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.; Kolb, C. E.; Molina, M. J.

    1997-01-01

    This is the twelfth in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The primary application of the data is in the modeling of stratospheric processes, with special emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  20. Chemical kinetics and photochemical data for use in stratospheric modeling: Evaluation number 11

    Science.gov (United States)

    Demore, W. B.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.; Kolb, C. E.; Molina, M. J.

    1994-01-01

    This is the eleventh in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The primary application of the data is in the modeling of stratospheric processes, with special emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  1. Status of experimental data of proton-induced reactions for intermediate-energy nuclear data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, Yukinobu; Kawano, Toshihiko [Kyushu Univ., Fukuoka (Japan); Yamano, Naoki; Fukahori, Tokio

    1998-11-01

    The present status of experimental data of proton-induced reactions is reviewed, with particular attention to total reaction cross section, elastic and inelastic scattering cross section, double-differential particle production cross section, isotope production cross section, and activation cross section. (author)

  2. 10 CFR 1045.16 - Criteria for evaluation of restricted data and formerly restricted data information.

    Science.gov (United States)

    2010-01-01

    ... by significantly assisting potential adversaries to develop or improve a nuclear weapon capability, produce nuclear weapons materials, or make other military use of nuclear energy; (4) whether publication ... [excerpt from 10 CFR 1045.16, "Criteria for evaluation of restricted data and formerly restricted data information," Title 10, Energy, Department of Energy (General Provisions)]

  3. Syndromic surveillance using veterinary laboratory data: data pre-processing and algorithm performance evaluation.

    Science.gov (United States)

    Dórea, Fernanda C; McEwen, Beverly J; McNab, W Bruce; Revie, Crawford W; Sanchez, Javier

    2013-06-06

    Diagnostic test orders to an animal laboratory were explored as a data source for monitoring trends in the incidence of clinical syndromes in cattle. Four years of real data and over 200 simulated outbreak signals were used to compare pre-processing methods that could remove temporal effects in the data, as well as temporal aberration detection algorithms that provided high sensitivity and specificity. Weekly differencing demonstrated solid performance in removing day-of-week effects, even in series with low daily counts. For aberration detection, the results indicated that no single algorithm showed performance superior to all others across the range of outbreak scenarios simulated. Exponentially weighted moving average charts and Holt-Winters exponential smoothing demonstrated complementary performance, with the latter offering an automated method to adjust to changes in the time series that will likely occur in the future. Shewhart charts provided lower sensitivity but earlier detection in some scenarios. Cumulative sum charts did not appear to add value to the system; however, the poor performance of this algorithm was attributed to characteristics of the data monitored. These findings indicate that automated monitoring aimed at early detection of temporal aberrations will likely be most effective when a range of algorithms are implemented in parallel.
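
    A compact sketch of two of the building blocks highlighted here, weekly differencing as pre-processing followed by an exponentially weighted moving average statistic, applied to simulated daily counts; the parameters are illustrative, not the study's tuned values.

```python
import numpy as np
import pandas as pd

# Simulated daily counts with a weekday/weekend (day-of-week) effect
rng = np.random.default_rng(1)
idx = pd.date_range("2010-01-01", periods=4 * 365, freq="D")
lam = 10 + 4 * (idx.dayofweek < 5)
counts = pd.Series(rng.poisson(lam), index=idx)

diff7 = counts.diff(7).dropna()        # weekly differencing removes the day-of-week effect
ewma = diff7.ewm(alpha=0.4).mean()     # smoothed monitoring statistic
limit = 3 * diff7.std()                # fixed 3-sigma limit (illustrative)
alarms = ewma[ewma > limit]
print(len(alarms))
```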

  4. Summary documentation of LASL nuclear data evaluations for ENDF/B-V

    Energy Technology Data Exchange (ETDEWEB)

    Young, P.G. (comp.)

    1979-01-01

    Summaries are presented of nuclear data evaluations performed at Los Alamos Scientific Laboratory (LASL) that will comprise part of Version V of the Evaluated Nuclear Data File, ENDF/B. A total of 18 general purpose evaluations of neutron-induced data are summarized, together with 6 summaries directed specifically at covariance data evaluations. The general purpose evaluation summaries cover the following isotopes: 1-3H, 3,4He, 6,7Li, 10B, 14,15N, 16O, 27Al, 182,183,184,186W, 233U, and 242Pu. The covariance data summaries are given for 1H, 6Li, 10B, 14N, 16O, and 27Al. 28 figures.

  5. Chemical Kinetics and Photochemical Data for Use in Atmospheric Studies: Evaluation Number 18

    Science.gov (United States)

    Burkholder, J. B.; Sander, S. P.; Abbatt, J. P. D.; Barker, J. R.; Huie, R. E.; Kolb, C. E.; Kurylo, M. J.; Orkin, V. L.; Wilmouth, D. M.; Wine, P. H.

    2015-01-01

    This is the eighteenth in a series of evaluated sets of rate constants, photochemical cross sections, heterogeneous parameters, and thermochemical parameters compiled by the NASA Panel for Data Evaluation. The data are used primarily to model stratospheric and upper tropospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena. The evaluation is available in electronic form from the following Internet URL: http://jpldataeval.jpl.nasa.gov/

  6. Chemical kinetics and photochemical data for use in stratospheric modeling evaluation Number 8

    Science.gov (United States)

    Demore, W. B.; Molina, M. J.; Sander, S. P.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.

    1987-01-01

    This is the eighth in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The primary application of the data is in the modeling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena. Copies of this evaluation are available from the Jet Propulsion Laboratory, Documentation Section, 111-116B, California Institute of Technology, Pasadena, California, 91109.

  7. Chemical Kinetics and Photochemical Data for Use in Atmospheric Studies Evaluation Number 15

    Science.gov (United States)

    Sander, S. P.; Friedl, R. R.; Golden, D. M.; Kurylo, M. J.; Moortgat, G. K.; Wine, P. H.; Ravishankara, A. R.; Kolb, C. E.; Molina, M. J.; Finlayson-Pitts, B. J.; et al.

    2006-01-01

    This is the fifteenth in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The data are used primarily to model stratospheric and upper tropospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena. Copies of this evaluation are available in electronic form and may be printed from the following Internet URL: http://jpldataeval.jpl.nasa.gov/.

  8. Evaluating functional diversity: Missing trait data and the importance of species abundance structure and data transformation

    Czech Academy of Sciences Publication Activity Database

    Májeková, M.; Paal, T.; Plowman, Nichola S.; Bryndová, Michala; Kasari, L.; Norberg, A.; Weiss, Matthias; Bishop, T. R.; Luke, S. H.; Sam, Kateřina; Le Bagousse-Pinguet, Y.; Lepš, Jan; Götzenberger, Lars; de Bello, Francesco

    2016-01-01

    Roč. 11, č. 2 (2016), č. článku e0149270. E-ISSN 1932-6203 R&D Projects: GA ČR GB14-36098G; GA ČR(CZ) GP14-32024P; GA ČR GAP505/12/1296 Grant - others:GA JU(CZ) 156/2013/P Institutional support: RVO:60077344 ; RVO:67985939 Keywords : data incompleteness * functional diversity * species abundance Subject RIV: EH - Ecology, Behaviour; EH - Ecology, Behaviour (BU-J) Impact factor: 2.806, year: 2016 http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0149270

  9. Computer program system for evaluation of FP nuclear data for JENDL. Smooth part

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Watanabe, Takashi; Iijima, Shungo

    1997-12-01

    This report describes computer programs used to evaluate nuclear data of fission product (FP) nuclides stored in an evaluated nuclear data library JENDL, especially in the smooth part above the resonance region. Many programs were used for determination of nuclear model parameters, calculation of nuclear data, handling of experimental and/or calculated data, and so on. Among them, reported here are programs for determination of level density parameters (ENSDFRET, LVLPLOT, LEVDES), for making sets of JCL and input data for the theoretical calculation program CASTHY (JOBSETTER, INDES/CASTHY), and for conversion of format of CASTHY output files to the ENDF format (CTOB2). (author). 51 refs.

  10. Update on the U.S. Nuclear Structure and Decay Data Evaluation Program*

    Science.gov (United States)

    Baglin, Coral M.

    2000-04-01

    The U.S. Nuclear Data Program (USNDP) maintains and provides easy access to several large databases to assist scientists in sifting through and assessing the vast quantity of published nuclear structure and decay data. The Evaluated Nuclear Structure Data File (ENSDF) presents evaluated experimental nuclear data; the Nuclear Structure Reference (NSR) database provides bibliographic information; an Experimental Unevaluated Data Listing (XUNDL) has also recently been established to provide rapid access to data from the most recent publications (primarily in high-spin physics). In addition to the ongoing revision of data in ENSDF, information from the latest evaluations for A=21-44 by Endt [1] and evaluated data for recently-discovered heavy elements (up to Z=118) are currently being incorporated into ENSDF. An overview of nuclear structure and decay data available through the USNDP will be presented, with emphasis on recent and forthcoming additions to the material available. Feedback concerning the extent to which users' nuclear structure and decay data needs are being met by this data program will be welcomed. 1. P.M. Endt, Nucl. Phys. A521, 1 (1998). *On behalf of the U.S. Nuclear Data Program

  11. Systematic Evaluation of Methods for Integration of Transcriptomic Data into Constraint-Based Models of Metabolism

    DEFF Research Database (Denmark)

    Machado, Daniel; Herrgard, Markus

    2014-01-01

    is then systematically evaluated using published data from three different case studies in E. coli and S. cerevisiae. The flux predictions made by different methods using transcriptomic data are compared against experimentally determined extracellular and intracellular fluxes (from 13C-labeling data). The sensitivity...... biological principles of metabolic regulation....

  12. Evaluation of two precipitation data sets for the Rhine River using streamflow simulations

    NARCIS (Netherlands)

    Photiadou, C.|info:eu-repo/dai/nl/325842914; Weerts, A.H.; van den Hurk, B.J.J.M.

    2011-01-01

    This paper presents an extended version of a widely used precipitation data set and evaluates it along with a recently released precipitation data set, using streamflow simulations. First, the existing precipitation data set issued by the Commission for the Hydrology of the Rhine basin (CHR),

  13. Summary Report of a Specialized Workshop on Nuclear Structure and Decay Data (NSDD) Evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, Alan L. [Univ. of Surrey, Guildford (United Kingdom); Dimitrious, P. [IAEA Nuclear Data Section, Vienna (Austria); Kondev, F. G. [Argonne National Lab. (ANL), Argonne, IL (United States); Ricard-McCutchan, E. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-04-27

    A three-day specialised workshop on Nuclear Structure and Decay Data Evaluations was organised and held at the headquarters of the International Atomic Energy Agency in Vienna, Austria, from 27 to 29 April 2015. This workshop covered a wide range of important topics and issues addressed when evaluating and maintaining the Evaluated Nuclear Structure Data File (ENSDF). The primary aim was to improve evaluators’ abilities to identify and understand the most appropriate evaluation processes to adopt in the formulation of individual ENSDF data sets. Participants assessed and reviewed existing policies, procedures and codes, and round-table discussions included the debate and resolution of specific difficulties experienced by ENSDF evaluators (i.e., all workshop participants). The contents of this report constitute a record of this workshop, based on the presentations and subsequent discussions.

  14. Release of the ENDF/B-VII.1 Evaluated Nuclear Data File

    Energy Technology Data Exchange (ETDEWEB)

    Brown, David

    2012-06-30

    The Cross Section Evaluation Working Group (CSEWG) released the ENDF/B-VII.1 library on December 22, 2011. The ENDF/B-VII.1 library is CSEWG's latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and incorporates advances made in the five years since the release of ENDF/B-VII.0, including many new evaluations in the neutron sublibrary (423 in all, over 190 of which contain covariances), new fission product yields, and a greatly improved decay data sublibrary. This summary barely touches on the five years' worth of advances present in the ENDF/B-VII.1 library. We expect these changes to lead to improved integral performance in reactors and other applications. Furthermore, the expansion of covariance data in this release will allow for better uncertainty quantification, reducing design margins and costs. The ENDF library is an ongoing and evolving effort. Currently, the ENDF data community is embarking on several parallel efforts to improve library management: (1) the adoption of a continuous integration system to provide evaluators 'instant' feedback on the quality of their evaluations and to provide data users with working 'beta' quality libraries in between major releases; (2) the transition to a new hierarchical data format, the Generalized Nuclear Data (GND) format, which we expect to enable new kinds of evaluated data that cannot be accommodated in the legacy ENDF format; and (3) the development of data assimilation and uncertainty propagation techniques to enable the consistent use of integral experimental data in the evaluation process.

  15. Chemical kinetics and photochemical data for use in stratospheric modeling. Evaluation number 6

    Science.gov (United States)

    Demore, W. B.; Molina, M. J.; Watson, R. T.; Golden, D. M.; Hampson, R. F.; Kurylo, M. J.; Howard, C. J.; Ravishankara, A. R.

    1983-01-01

    Evaluated sets of rate constants and photochemical cross sections are presented. The primary application of the data is in the modeling of stratospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  16. Chemical kinetics and photochemical data for use in stratospheric modeling: Evaluation number 5

    Science.gov (United States)

    Demore, W. B.

    1982-01-01

    Evaluated sets of rate constants and photochemical cross sections are compiled. The primary application of the data is in the modeling of stratospheric processes, with emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena.

  17. Evaluated kinetic and photochemical data for atmospheric chemistry: Volume VI – heterogeneous reactions with liquid substrates

    Directory of Open Access Journals (Sweden)

    M. Ammann

    2013-08-01

    This article, the sixth in the ACP journal series, presents data evaluated by the IUPAC Task Group on Atmospheric Chemical Kinetic Data Evaluation. It covers the heterogeneous processes involving liquid particles present in the atmosphere with an emphasis on those relevant for the upper troposphere/lower stratosphere and the marine boundary layer, for which uptake coefficients and adsorption parameters have been presented on the IUPAC website since 2009. The article consists of an introduction and guide to the evaluation, giving a unifying framework for parameterisation of atmospheric heterogeneous processes. We provide summary sheets containing the recommended uptake parameters for the evaluated processes. The experimental data on which the recommendations are based are provided in data sheets in separate appendices for the four surfaces considered: liquid water, deliquesced halide salts, other aqueous electrolytes and sulfuric acid.

  18. Co-ordination of the International Network of Nuclear Structure and Decay Data Evaluators

    Energy Technology Data Exchange (ETDEWEB)

    Ricard-McCutchan, E. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dimitriou, P. [Intl Atomic Energy Agency (IAEA), Vienna (Austria); Nichols, A. L. [Univ. of Surrey, Guildford (United Kingdom)

    2015-08-01

    The 21st meeting of the International Network of Nuclear Structure and Decay Data Evaluators was convened at the IAEA Headquarters, Vienna, from 20 to 24 April 2015 under the auspices of the IAEA Nuclear Data Section. This meeting was attended by 36 scientists from 15 Member States, plus IAEA staff, concerned with the compilation, evaluation and dissemination of nuclear structure and decay data. A summary of the meeting, data centre reports, various proposals considered, and actions agreed by the participants, as well as recommendations/conclusions are presented within this document.

  19. Analysis of data user's needs for performance evaluation of solar heating and cooling systems

    Science.gov (United States)

    Christensen, D. L.

    1978-01-01

    In a successful data acquisition program, the information needs must be evaluated, the design and cost factors of the program must be determined, and a data management loop must be organized and operated in order to collect, process, and disseminate the needed information in useable formats. This paper describes each of these program elements in detail as an aid for the solar heating and cooling data manager and user to implement effective data acquisition and monitoring systems. Consideration is given to the development of evaluation techniques which will aid in the determination of solar energy systems performances.

  20. Benchmarking Data Sets for the Evaluation of Virtual Ligand Screening Methods: Review and Perspectives.

    Science.gov (United States)

    Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu

    2015-07-27

    Virtual screening methods are commonly used nowadays in drug discovery processes. However, to ensure their reliability, they have to be carefully evaluated. The evaluation of these methods is often realized in a retrospective way, notably by studying the enrichment of benchmarking data sets. To this purpose, numerous benchmarking data sets were developed over the years, and the resulting improvements led to the availability of high quality benchmarking data sets. However, some points still have to be considered in the selection of the active compounds, decoys, and protein structures to obtain optimal benchmarking data sets.
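
    The retrospective "enrichment" referred to here is typically quantified by an enrichment factor: the proportion of actives recovered in the top-ranked fraction of a benchmarking set, relative to random selection. A minimal sketch with toy scores and labels:

```python
def enrichment_factor(scores, labels, fraction=0.01):
    """EF at a given fraction: actives found in the top fraction vs. random expectation.

    scores: higher = predicted more active; labels: 1 = active, 0 = decoy.
    """
    ranked = sorted(zip(scores, labels), key=lambda t: t[0], reverse=True)
    n_top = max(1, int(len(ranked) * fraction))
    actives_top = sum(lbl for _, lbl in ranked[:n_top])
    actives_total = sum(labels)
    return (actives_top / n_top) / (actives_total / len(labels))

# Toy example: 2 actives among 10 compounds, both ranked in the top 20% -> EF = 5.0
print(enrichment_factor([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05],
                        [1, 1, 0, 0, 0, 0, 0, 0, 0, 0], fraction=0.2))
```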

  1. Mathematics of Sensing, Exploitation, and Execution (MSEE) Hierarchical Representations for the Evaluation of Sensed Data

    Science.gov (United States)

    2016-06-01

    AFRL-RY-WP-TR-2016-0123: Mathematics of Sensing, Exploitation, and Execution (MSEE), Hierarchical Representations for the Evaluation of Sensed Data. Final report, December 2015.

  2. Evaluation of Multiple Imputation in Missing Data Analysis: An Application on Repeated Measurement Data in Animal Science

    Directory of Open Access Journals (Sweden)

    Gazel Ser

    2015-12-01

    The purpose of this study was to evaluate the performance of the multiple imputation method, under missing-at-random and missing-completely-at-random observation structures, within the framework of the general linear mixed model. The data set consisted of 77 Norduz ram lambs at 7 months of age. pH values measured at five time points after slaughter were the dependent variable; hot carcass weight, muscle glycogen level, and fasting duration were included as independent variables in the model. Starting from the dependent variable without missing observations, two missing observation structures, Missing Completely at Random (MCAR) and Missing at Random (MAR), were created by deleting observations at rates of 10% and 25%. Complete data sets were then obtained from the data sets with missing observations using multiple imputation (MI). The results of applying the general linear mixed model to the MI-completed data sets were compared with the results for the original complete data. The mixed models fitted to the complete data and to the MI data sets selected the same covariance structures, and the parameter estimates and their standard errors were quite close to those from the complete data. In conclusion, this study shows that choosing MI as the imputation method yields reliable mixed-model inference for both of the missing observation structures and rates considered.
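
    For readers who want to reproduce the general workflow, the sketch below shows one common way to generate multiple imputations in Python, using scikit-learn's IterativeImputer with posterior sampling. This stands in for whatever MI implementation the study used; the data are simulated, not the lamb data.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (activates the API)
from sklearn.impute import IterativeImputer
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(42)
X = rng.normal(size=(77, 5))               # stand-in for 5 pH time points on 77 animals
mask = rng.random(X.shape) < 0.10          # 10% MCAR missingness
X_miss = X.copy()
X_miss[mask] = np.nan

# m = 5 imputations; sample_posterior=True draws from the predictive distribution,
# which is what makes this *multiple* rather than single imputation
imputations = [
    IterativeImputer(estimator=BayesianRidge(), sample_posterior=True,
                     random_state=m).fit_transform(X_miss)
    for m in range(5)
]
pooled_mean = np.mean([imp.mean(axis=0) for imp in imputations], axis=0)
print(pooled_mean.round(3))
```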

  3. Mapping and evaluation of toxicological data for organic fibres used as insulation materials (Kortlægning og evaluering af toksikologiske data for organiske fibre, der anvendes som isoleringsmaterialer)

    DEFF Research Database (Denmark)

    Hansen, Ernst Jan de Place

    2001-01-01

    Summary of a report on the mapping and evaluation of toxicological data for organic fibres used as insulation materials, prepared by the Danish Toxicology Centre under the Danish Energy Agency's development programme "Miljø- og arbejdsmiljøvenlig isolering" (environment- and working-environment-friendly insulation).

  4. Using Statistical Control Charts to Analyze Data from Student Evaluations of Teaching

    Science.gov (United States)

    Marks, Neil B.; O'Connell, Richard T.

    2003-01-01

    In this paper, a method for analyzing data from student evaluations of teaching is presented. The first step of the process requires development of a regression model for teacher's summary rating as a function of student's expected grade. Then, two-sigma control charts for individual evaluation scores (section averages) and residuals from the…
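
    The method's two steps, regressing section-average ratings on expected grade and then charting the residuals with two-sigma limits, can be sketched as follows; the section averages below are hypothetical.

```python
import numpy as np

# Hypothetical section-level data: x = mean expected grade, y = mean teacher rating
x = np.array([3.2, 3.5, 2.9, 3.8, 3.1, 3.6, 3.4, 3.0])
y = np.array([4.1, 4.4, 3.7, 4.6, 3.9, 4.5, 4.2, 3.8])

b1, b0 = np.polyfit(x, y, 1)               # regress rating on expected grade
resid = y - (b0 + b1 * x)
sigma = resid.std(ddof=2)                  # ddof=2: two fitted parameters

out_of_control = np.abs(resid) > 2 * sigma # two-sigma control limits
print(out_of_control)
```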

  5. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    Science.gov (United States)

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  6. Review of recent benchmark experiments on integral test for high energy nuclear data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi; Tanaka, Susumu; Konno, Chikara; Fukahori, Tokio; Hayashi, Katsumi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-11-01

    A survey work of recent benchmark experiments on an integral test for high energy nuclear data evaluation was carried out as one of the work of the Task Force on JENDL High Energy File Integral Evaluation (JHEFIE). In this paper the results are compiled and the status of recent benchmark experiments is described. (author)

  7. Improvement of evaluated neutron nuclear data for {sup 237}Np and {sup 241}Am

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo; Iwamoto, Osamu; Hasegawa, Akira [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-01-01

    The nuclear data of {sup 237}Np and {sup 241}Am, which are particularly important among the minor actinides, were investigated by comparing JENDL-3.2 with recent evaluated data and available experimental data. The study revealed several defects in the JENDL-3.2 data, which were corrected on the basis of experimental data or recent evaluated data. For both nuclides, the main quantities revised in the present work were the resonance parameters, cross sections, angular and energy distributions of secondary neutrons, and the number of neutrons per fission. The data are given in the neutron energy range from 10{sup -5} eV to 20 MeV and compiled in the ENDF-6 format. (author)

  8. How should the completeness and quality of curated nanomaterial data be evaluated?†

    Science.gov (United States)

    Marchese Robinson, Richard L.; Lynch, Iseult; Peijnenburg, Willie; Rumble, John; Klaessig, Fred; Marquardt, Clarissa; Rauscher, Hubert; Puzyn, Tomasz; Purian, Ronit; Åberg, Christoffer; Karcher, Sandra; Vriens, Hanne; Hoet, Peter; Hoover, Mark D.; Hendren, Christine Ogilvie; Harper, Stacey L.

    2016-01-01

    Nanotechnology is of increasing significance. Curation of nanomaterial data into electronic databases offers opportunities to better understand and predict nanomaterials’ behaviour. This supports innovation in, and regulation of, nanotechnology. It is commonly understood that curated data need to be sufficiently complete and of sufficient quality to serve their intended purpose. However, assessing data completeness and quality is non-trivial in general and is arguably especially difficult in the nanoscience area, given its highly multidisciplinary nature. The current article, part of the Nanomaterial Data Curation Initiative series, addresses how to assess the completeness and quality of (curated) nanomaterial data. In order to address this key challenge, a variety of related issues are discussed: the meaning and importance of data completeness and quality, existing approaches to their assessment and the key challenges associated with evaluating the completeness and quality of curated nanomaterial data. Considerations which are specific to the nanoscience area and lessons which can be learned from other relevant scientific disciplines are considered. Hence, the scope of this discussion ranges from physicochemical characterisation requirements for nanomaterials and interference of nanomaterials with nanotoxicology assays to broader issues such as minimum information checklists, toxicology data quality schemes and computational approaches that facilitate evaluation of the completeness and quality of (curated) data. This discussion is informed by a literature review and a survey of key nanomaterial data curation stakeholders. Finally, drawing upon this discussion, recommendations are presented concerning the central question: how should the completeness and quality of curated nanomaterial data be evaluated? PMID:27143028

  9. Blinded Anonymization: a method for evaluating cancer prevention programs under restrictive data protection regulations.

    Science.gov (United States)

    Bartholomäus, Sebastian; Hense, Hans Werner; Heidinger, Oliver

    2015-01-01

    Evaluating cancer prevention programs requires collecting and linking data on a case specific level from multiple sources of the healthcare system. Therefore, one has to comply with data protection regulations which are restrictive in Germany and will likely become stricter in Europe in general. To facilitate the mortality evaluation of the German mammography screening program, with more than 10 Million eligible women, we developed a method that does not require written individual consent and is compliant to existing privacy regulations. Our setup is composed of different data owners, a data collection center (DCC) and an evaluation center (EC). Each data owner uses a dedicated software that preprocesses plain-text personal identifiers (IDAT) and plain-text evaluation data (EDAT) in such a way that only irreversibly encrypted record assignment numbers (RAN) and pre-aggregated, reversibly encrypted EDAT are transmitted to the DCC. The DCC uses the RANs to perform a probabilistic record linkage which is based on an established and evaluated algorithm. For potentially identifying attributes within the EDAT ('quasi-identifiers'), we developed a novel process, named 'blinded anonymization'. It allows selecting a specific generalization from the pre-processed and encrypted attribute aggregations, to create a new data set with assured k-anonymity, without using any plain-text information. The anonymized data is transferred to the EC where the EDAT is decrypted and used for evaluation. Our concept was approved by German data protection authorities. We implemented a prototype and tested it with more than 1.5 Million simulated records, containing realistically distributed IDAT. The core processes worked well with regard to performance parameters. We created different generalizations and calculated the respective suppression rates. We discuss modalities, implications and limitations for large data sets in the cancer registry domain, as well as approaches for further
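
    The k-anonymity guarantee at the heart of the generalization step can be checked mechanically: every combination of quasi-identifier values must occur at least k times in the released data. A minimal sketch with toy generalized records (the attribute names are hypothetical, not the study's):

```python
import pandas as pd

def is_k_anonymous(df, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs at least k times."""
    return df.groupby(quasi_identifiers).size().min() >= k

# Toy generalized records: year of birth and a 2-digit postcode prefix
records = pd.DataFrame({
    "birth_year": [1950, 1950, 1950, 1962, 1962, 1962],
    "postcode2":  ["48", "48", "48", "59", "59", "59"],
})
print(is_k_anonymous(records, ["birth_year", "postcode2"], k=3))  # True
```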

  10. Addressing data center efficiency. Lessons learned from process evaluations of utility energy efficiency programs

    Energy Technology Data Exchange (ETDEWEB)

    Howard, A.J.; Holmes, J. [Energy Market Innovations, Inc, 83 Columbia St., Suite 303, Seattle, WA 98104 (United States)

    2012-01-15

    This paper summarizes the unique challenges related to addressing energy efficiency in the data center industry and lessons learned from original research and two process evaluations of energy efficiency programs with components that specifically target data centers. The lessons learned include: creating program opportunities specifically focused on data centers; clearly identifying target data centers able to implement energy efficiency programs; understanding decision making in these facilities; and effectively communicating the program opportunities to the target market. The growing energy use of data centers has drawn international attention from policy makers, regulators, industry consortiums, and electric utilities. Any program effective at improving the energy performance of data centers must include specific strategies and processes aimed at confronting a number of challenges specific to this industry, including: the concentrated and rapidly growing energy use of these facilities; the rapid pace of innovation; the extremely high reliability requirements; and the significant split incentives due to the typical data center management structure. The process evaluations covered in this paper are the Pacific Gas and Electric (PG&E) High-Tech program and the Silicon Valley Power (SVP) Public Benefits Program. While the PG&E evaluation was a more complete process evaluation, the SVP evaluation focused specifically on participation from co-location facilities. These process evaluations together included interviews with program participants, nonparticipants and utility staff and also included outreach to a large variety of industry stakeholders. In addition, the PG&E evaluation included detailed process-mapping used to identify the necessity and importance of all program processes. The insights gathered from these evaluations are not only applicable to US electrical utilities but can also be applied to any international organization looking to create

  11. Data glove embedded with 9-axis IMU and force sensing sensors for evaluation of hand function.

    Science.gov (United States)

    Pei-Chi Hsiao; Shu-Yu Yang; Bor-Shing Lin; I-Jung Lee; Chou, Willy

    2015-08-01

    A hand injury can greatly affect a person's daily life. Physicians must evaluate the state of recovery of a patient's injured hand. However, current manual evaluations of hand functions are imprecise and inconvenient. In this paper, a data glove embedded with 9-axis inertial sensors and force sensitive resistors is proposed. The proposed data glove system enables hand movement to be tracked in real-time. In addition, the system can be used to obtain useful parameters for physicians, is an efficient tool for evaluating the hand function of patients, and can improve the quality of hand rehabilitation.

  12. Existing test data for the Act on Registration & Evaluation, etc. of Chemical Substances.

    Science.gov (United States)

    Choi, Bong-In; Ryu, Byung-Taek; Na, Suk-Hyun; Chung, Seon-Yong

    2015-01-01

    In this study, the possibility of using existing test data provided in Korea and elsewhere for the registration of chemical substances was examined. Data on 510 chemical substances that are among the first subject to registration under the "Act on the Registration and Evaluation, etc. of Chemical Substances (K-REACH)" were analyzed. The possibility of using existing data from 16 reference databases was examined for the 510 chemical substances notified in July 2015 as being subject to registration. Test data with the reliability required for the registration of chemical substances under the K-REACH constituted 48.4% of the required physicochemical characteristics, 6.5% of the required health hazards, and 9.4% of the required environmental hazards. Some existing test data were not within the scope of this research, including data used for registration in the European Union (EU). Thus, considering that 350 of these 510 substances are registered under the EU Registration, Evaluation, Authorisation & Restriction of Chemicals (REACH), more test data may exist that can be utilized in addition to the data identified in this study. Furthermore, the K-REACH states that non-testing data (test results predicted through Read-Across or Quantitative Structure-Activity Relationships) and weight of evidence (test results predicted based on test data with low reliability) can also be utilized for registration data. Therefore, if methods for using such data were actively reviewed, it would be possible to reduce the cost of securing the test data required for the registration of chemical substances.

  13. Preliminary data for the 20 May 1974, simultaneous evaluation of remote sensors experiment. [water pollution monitoring

    Science.gov (United States)

    Johnson, R. W.; Batten, C. E.; Bowker, D. E.; Bressette, W. E.; Grew, G. W.

    1975-01-01

    Several remote sensors were simultaneously used to collect data over the tidal James River from Hopewell to Norfolk, Virginia. Sensors evaluated included the Multichannel-Ocean Color Sensor, multispectral scanners, and multispectral photography. Ground truth measurements and remotely sensed data are given. Preliminary analysis indicates that suspended sediment and concentrated industrial effluent are observable from all sensors.

  14. A data collection and processing procedure for evaluating a research program

    Science.gov (United States)

    Giuseppe Rensi; H. Dean Claxton

    1972-01-01

    A set of computer programs compiled for the information processing requirements of a model for evaluating research proposals is described. The programs serve to assemble and store information, periodically update it, and convert it to a form usable for decision-making. Guides for collecting and coding data are explained. The data-processing options available and...

  15. Evaluating satellite imagery-based land use data for describing forestland development in western Washington

    Science.gov (United States)

    Jeffrey D. Kline; Alissa Moses; David L. Azuma; Andrew. Gray

    2009-01-01

    Forestry professionals are concerned about how forestlands are affected by residential and other development. To address those concerns, researchers must find appropriate data with which to describe and evaluate rates and patterns of forestland development and the impact of development on the management of remaining forestlands. We examine land use data gathered from...

  16. Learning Disability Identification Consistency: The Impact of Methodology and Student Evaluation Data

    Science.gov (United States)

    Maki, Kathrin E.; Burns, Matthew K.; Sullivan, Amanda

    2017-01-01

    Learning disability (LD) identification has long been controversial and has undergone substantive reform. This study examined the consistency of school psychologists' LD identification decisions across three identification methods and across student evaluation data conclusiveness levels. Data were collected from 376 practicing school psychologists…

  17. The Use of Consumer Injury Registry Data to Evaluate Physical Abuse.

    Science.gov (United States)

    Wissow, Lawrence S.; Wilson, Modena H.

    1988-01-01

    Descriptive case information evaluated by 68 medical personnel included a fall from a highchair as the explanation of an injury, with or without injury pattern data obtained for such falls from the U.S. Consumer Product Safety Commission (CPSC). Respondents given the CPSC data appropriately had less confidence in the explanation. (Author/JW)

  18. Creating the data basis for environmental evaluations with the Oil Point Method

    DEFF Research Database (Denmark)

    Bey, Niki; Lenau, Torben Anker

    1999-01-01

    A simple, indicator-based method for environmental evaluations, the Oil Point Method, has been developed. Oil Points are derived from energy data and refer to kilograms of oil, hence the name. In the Oil Point Method, a certain degree of inaccuracy is explicitly accepted, as is the case with rules-of-thumb. The central idea is that missing indicators can be calculated or estimated by the designers themselves. After discussing energy-related environmental evaluation and arguing for its application in the evaluation of concepts, the paper focuses on the basic problem of missing data and describes the way in which the problem may be solved by making Oil Point evaluations. Sources of energy data are mentioned. Typical deficits to be aware of - such as the neglect of efficiency factors - are revealed and discussed. Comparative case studies which have shown encouraging results are mentioned as well.

  19. Evaluated kinetic and photochemical data for atmospheric chemistry: Volume V – heterogeneous reactions on solid substrates

    Directory of Open Access Journals (Sweden)

    J. N. Crowley

    2010-09-01

    Full Text Available This article, the fifth in the ACP journal series, presents data evaluated by the IUPAC Subcommittee on Gas Kinetic Data Evaluation for Atmospheric Chemistry. It covers the heterogeneous processes on surfaces of solid particles present in the atmosphere, for which uptake coefficients and adsorption parameters have been presented on the IUPAC website in 2010. The article consists of an introduction and guide to the evaluation, giving a unifying framework for parameterisation of atmospheric heterogeneous processes. We provide summary sheets containing the recommended uptake parameters for the evaluated processes. Four substantial appendices contain detailed data sheets for each process considered for ice, mineral dust, sulfuric acid hydrate and nitric acid hydrate surfaces, which provide information upon which the recommendations are made.

  20. Evaluation study for a multi-user oriented medical data visualization method.

    Science.gov (United States)

    Kopanitsa, Georgy

    2014-01-01

    The chosen evaluation concept is based on the Guideline for Good Evaluation Practice in Health Informatics (GEP-HI). The stages of the study were identified, and each stage was given a detailed description. We also identified the participants and their required qualifications and responsibilities. The developed evaluation concept was used for the evaluation study of the developed medical data visualization method. The study was performed in Tomsk, Russia. This helped to involve more doctors and patients in the study. It also facilitated the involvement of patients, because they already had experience using a patient portal.

  1. Integrate Data into Scientific Workflows for Terrestrial Biosphere Model Evaluation through Brokers

    Science.gov (United States)

    Wei, Y.; Cook, R. B.; Du, F.; Dasgupta, A.; Poco, J.; Huntzinger, D. N.; Schwalm, C. R.; Boldrini, E.; Santoro, M.; Pearlman, J.; Pearlman, F.; Nativi, S.; Khalsa, S.

    2013-12-01

    Terrestrial biosphere models (TBMs) have become integral tools for extrapolating local observations and process-level understanding of land-atmosphere carbon exchange to larger regions. Model-model and model-observation intercomparisons are critical to understand the uncertainties within model outputs, to improve model skill, and to improve our understanding of land-atmosphere carbon exchange. The DataONE Exploration, Visualization, and Analysis (EVA) working group is evaluating TBMs using scientific workflows in UV-CDAT/VisTrails. This workflow-based approach promotes collaboration and improved tracking of evaluation provenance. But challenges still remain. The multi-scale and multi-discipline nature of TBMs makes it necessary to include diverse and distributed data resources in model evaluation. These include, among others, remote sensing data from NASA, flux tower observations from various organizations including DOE, and inventory data from the US Forest Service. A key challenge is to make heterogeneous data from different organizations and disciplines discoverable and readily integrated for use in scientific workflows. This presentation introduces the brokering approach taken by the DataONE EVA to fill the gap between TBM evaluation scientific workflows and cross-organization, cross-discipline data resources. The DataONE EVA started the development of an Integrated Model Intercomparison Framework (IMIF) that leverages standards-based discovery and access brokers to dynamically discover, access, and transform (e.g., subsetting and resampling) diverse data products from DataONE, Earth System Grid (ESG), and other data repositories into a format that can be readily used by scientific workflows in UV-CDAT/VisTrails. The discovery and access brokers serve as an independent middleware that bridges existing data repositories and TBM evaluation scientific workflows while introducing little overhead to either component. In the initial work, an OpenSearch-based discovery broker
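
    The OpenSearch convention mentioned at the end of the abstract is a simple HTTP/Atom interface; a hedged sketch of what a client-side call to such a discovery broker could look like follows. The endpoint URL and parameter names are placeholders, not the project's actual service:

    import xml.etree.ElementTree as ET
    import requests

    ATOM_NS = "{http://www.w3.org/2005/Atom}"

    def discover(endpoint, search_terms, start_index=1):
        """Issue an OpenSearch-style keyword query and list Atom entries."""
        resp = requests.get(endpoint,
                            params={"q": search_terms, "startIndex": start_index},
                            timeout=30)
        resp.raise_for_status()
        root = ET.fromstring(resp.content)
        # Each Atom <entry> describes one discovered data granule or collection.
        return [(e.findtext(f"{ATOM_NS}title"), e.findtext(f"{ATOM_NS}id"))
                for e in root.iter(f"{ATOM_NS}entry")]

    # Example call (placeholder URL):
    # for title, ident in discover("https://example.org/opensearch", "biomass flux"):
    #     print(title, ident)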

  2. The WA Hospital Morbidity Data System: an evaluation of its performance and the impact of electronic data transfer.

    Science.gov (United States)

    Unwin, E; Codde, J; Gill, L; Stevens, S; Nelson, T

    The paper evaluates the performance of the Hospital Morbidity Data System, maintained by the Health Statistics Branch (HSB) of the Health Department of Western Australia (WA). The time taken to process discharge summaries was compared in the first and second halves of 1995, using the number of weeks taken to process 90% of all discharges and the percentage of records processed within four weeks as indicators of throughput. Both the hospitals and the HSB showed improvements in timeliness during the second half of the year. The paper also examines the impact of a recently introduced electronic data transfer system for WA country public hospitals on the timeliness of morbidity data. The processing time of country hospital records by the HSB was reduced to a similar time as for metropolitan hospitals, but the processing time in the hospitals increased, resulting in little improvement in total processing time.

  3. Feasibility of Automatic Extraction of Electronic Health Data to Evaluate a Status Epilepticus Clinical Protocol.

    Science.gov (United States)

    Hafeez, Baria; Paolicchi, Juliann; Pon, Steven; Howell, Joy D; Grinspan, Zachary M

    2016-05-01

    Status epilepticus is a common neurologic emergency in children. Pediatric medical centers often develop protocols to standardize care. Widespread adoption of electronic health records by hospitals affords the opportunity for clinicians to rapidly, and electronically evaluate protocol adherence. We reviewed the clinical data of a small sample of 7 children with status epilepticus, in order to (1) qualitatively determine the feasibility of automated data extraction and (2) demonstrate a timeline-style visualization of each patient's first 24 hours of care. Qualitatively, our observations indicate that most clinical data are well labeled in structured fields within the electronic health record, though some important information, particularly electroencephalography (EEG) data, may require manual abstraction. We conclude that a visualization that clarifies a patient's clinical course can be automatically created using the patient's electronic clinical data, supplemented with some manually abstracted data. Future work could use this timeline to evaluate adherence to status epilepticus clinical protocols. © The Author(s) 2015.
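
    A timeline of the sort described can be rendered directly from extracted event data; the following is a minimal sketch with invented example events, not the authors' visualization code:

    import matplotlib.pyplot as plt

    # (hours since admission, event label) - hypothetical values
    events = [(0.2, "seizure onset"), (0.5, "lorazepam"), (1.0, "EEG started"),
              (2.5, "fosphenytoin"), (6.0, "seizure resolved")]

    fig, ax = plt.subplots(figsize=(8, 2))
    for hour, label in events:
        ax.axvline(hour, color="tab:blue", alpha=0.6)     # mark each event time
        ax.text(hour, 0.05, label, rotation=45, ha="left", va="bottom")
    ax.set_xlim(0, 24)                                    # first 24 hours of care
    ax.set_yticks([])
    ax.set_xlabel("Hours since admission")
    plt.tight_layout()
    plt.show()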

  4. Research on Holographic Evaluation of Service Quality in Power Data Network

    Science.gov (United States)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

    With the rapid development of the power data network and the continuous development of power data application service systems, more and more service systems are being put into operation. This raises higher requirements for network quality and service quality in actual network operation and maintenance. This paper describes the status of the power data network and its services. A holographic assessment model is presented to achieve a comprehensive, intelligent assessment of the power data network and its quality of service in operation and maintenance. This evaluation method avoids the problems caused by traditional approaches, which assess network performance quality along a single dimension. The intelligent evaluation method can improve the efficiency of network operation and maintenance and guarantee the quality of real-time services in the power data network.

  5. Evaluation of the Radiometric Integrity of LANDSAT-4 Thematic Mapper Band 6 Data

    Science.gov (United States)

    Schott, J. R.

    1984-01-01

    An approach for experimentally evaluating the radiometric calibration of the LANDSAT-4 band 6 data is described. It draws on a method used to radiometrically calibrate the HCMR data, which involved underflying the satellite with an infrared line scanner. By extending this technique to higher altitudes, experimental radiance data suitable for radiometric calibration of the TM band 6 sensor can be generated. Repetition of this experiment can permit evaluation of long-term drift in the sensor and provide a data base for evaluating atmospheric propagation models for radiation transfer. To date, efforts have been concentrated on modifying the infrared line scanner to match the spectral response of the TM band 6 sensor. In addition, the LOWTRAN code corresponding to a satellite overpass of September 1982 was run to yield a plot of transmission and path radiance as a function of altitude.

  6. An Integrated Health Monitoring Method for Structural Fatigue Life Evaluation Using Limited Sensor Data.

    Science.gov (United States)

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Wang, Yanrong; Zhang, Weifang

    2016-11-04

    A general framework for structural fatigue life evaluation under fatigue cyclic loading using limited sensor data is proposed in this paper. First, limited sensor data are measured from various sensors which are preset on the complex structure. Then the strain data at remote spots are used to obtain the strain responses at critical spots by the strain/stress reconstruction method based on empirical mode decomposition (REMD method). All the computations in this paper are performed directly in the time domain. After the local stress responses at critical spots are determined, fatigue life evaluation can be performed for structural health management and risk assessment. Fatigue life evaluation using the reconstructed stresses from remote strain gauge measurement data is also demonstrated with detailed error analysis. Following this, the proposed methodology is demonstrated using a three-dimensional frame structure and a simplified airfoil structure. Finally, several conclusions are drawn and future work is outlined based on the proposed study.
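
    The empirical-mode-decomposition step that the REMD method builds on can be illustrated with the third-party PyEMD package (pip install EMD-signal); this is an assumption for illustration, not the authors' full strain/stress reconstruction code:

    import numpy as np
    from PyEMD import EMD

    t = np.linspace(0, 1, 1000)
    rng = np.random.default_rng(0)
    # Synthetic "remote strain gauge" signal: slow load cycle + vibration + noise
    signal = (np.sin(2 * np.pi * 5 * t) + 0.4 * np.sin(2 * np.pi * 40 * t)
              + 0.05 * rng.standard_normal(t.size))

    imfs = EMD().emd(signal, t)       # intrinsic mode functions, highest frequency first
    denoised = imfs[1:].sum(axis=0)   # drop the noisiest IMF, keep the rest

    print(f"{imfs.shape[0]} IMFs extracted; reconstruction keeps {imfs.shape[0] - 1}")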

  7. Current status of the verification and processing system GALILÉE-1 for evaluated data

    Science.gov (United States)

    Coste-Delclaux, Mireille; Jouanne, Cédric; Moreau, Frédéric; Mounier, Claude; Visonneau, Thierry; Dolci, Florence

    2017-09-01

    This paper describes the current status of GALILÉE-1, the new verification and processing system for evaluated data developed at CEA. It consists of various components respectively dedicated to reading/writing the evaluated data whatever the format, to diagnosing inconsistencies in the evaluated data, and to providing continuous-energy and multigroup data as well as probability tables for transport and depletion codes. All these components are written in the C++ language and share the same objects. Cross-comparisons with other processing systems (NJOY, CALENDF or PREPRO) are systematically carried out at each step in order to fully master possible discrepancies. Some results of such comparisons are provided.

  8. An Integrated Health Monitoring Method for Structural Fatigue Life Evaluation Using Limited Sensor Data

    Directory of Open Access Journals (Sweden)

    Jingjing He

    2016-11-01

    Full Text Available A general framework for structural fatigue life evaluation under fatigue cyclic loading using limited sensor data is proposed in this paper. First, limited sensor data are measured from various sensors which are preset on the complex structure. Then the strain data at remote spots are used to obtain the strain responses at critical spots by the strain/stress reconstruction method based on empirical mode decomposition (REMD method). All the computations in this paper are performed directly in the time domain. After the local stress responses at critical spots are determined, fatigue life evaluation can be performed for structural health management and risk assessment. Fatigue life evaluation using the reconstructed stresses from remote strain gauge measurement data is also demonstrated with detailed error analysis. Following this, the proposed methodology is demonstrated using a three-dimensional frame structure and a simplified airfoil structure. Finally, several conclusions are drawn and future work is outlined based on the proposed study.

  9. San Jose, California: Evaluating Local Solar Energy Generation Potential (City Energy: From Data to Decisions)

    Energy Technology Data Exchange (ETDEWEB)

    Office of Strategic Programs, Strategic Priorities and Impact Analysis Team

    2017-09-29

    This fact sheet "San Jose, California: Evaluating Local Solar Energy Generation Potential" explains how the City of San Jose used data from the U.S. Department of Energy's Cities Leading through Energy Analysis and Planning (Cities-LEAP) and the State and Local Energy Data (SLED) programs to inform its city energy planning. It is one of ten fact sheets in the "City Energy: From Data to Decisions" series.

  10. Data Availability in Appliance Standards and Labeling Program Development and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Romankiewicz, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Khanna, Nina [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Vine, Edward [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Zhou, Nan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-05-01

    In this report, we describe the necessary data inputs for both standards development and program evaluation and perform an initial assessment of the availability and uncertainty of those data inputs in China. For standards development, we find that China and its standards and labeling program administrators currently have access to the basic market and technical data needed for conducting market and technology assessments and technological and economic analyses. Some data, such as shipments data, are readily available from the China Energy Label product registration database, while the availability of other data, including average unit energy consumption, prices and design options, needs improvement. Unlike in some other countries such as the United States, most of the necessary data for conducting standards development analyses are not publicly available or compiled in a consolidated data source. In addition, improved data on design and efficiency options as well as cost data (e.g., manufacturing costs, mark-ups, production and product use-phase costs) – key inputs to several technoeconomic analyses – are particularly needed given China's unconsolidated manufacturing industry. For program evaluation, we find that while China can conduct simple savings evaluations of its incentive programs with the data currently available from the Ministry of Finance (the program administrator), the savings estimates produced by such an evaluation will carry high uncertainty. As such, China could benefit from an increase in surveying and metering in the next one to three years to decrease the uncertainty surrounding key data points such as unit energy savings and free ridership.

  11. Complete Evaluation of Available Laboratory-scale Data for the Independence Model

    Energy Technology Data Exchange (ETDEWEB)

    Holland, Troy Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kress, Joel David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-30

    Year 1 Objectives (August 2016 – December 2016) – The original Independence model is a sequentially regressed set of parameters from numerous data sets in the Aspen Plus modeling framework. The immediate goal with the basic data model is to collect and evaluate those data sets relevant to the thermodynamic submodels (pure substance heat capacity, solvent mixture heat capacity, loaded solvent heat capacities, and volatility data). These data are informative for the thermodynamic parameters involved in both vapor-liquid equilibrium, and in the chemical equilibrium of the liquid phase.

  12. Evaluation of Big Data Containers for Popular Storage, Retrieval, and Computation Primitives in Earth Science Analysis

    Science.gov (United States)

    Das, K.; Clune, T.; Kuo, K. S.; Mattmann, C. A.; Huang, T.; Duffy, D.; Yang, C. P.; Habermann, T.

    2015-12-01

    Data containers are infrastructures that facilitate storage, retrieval, and analysis of data sets. Big data applications in Earth Science require a mix of processing techniques, data sources and storage formats that are supported by different data containers. Some of the most popular data containers used in Earth Science studies are Hadoop, Spark, SciDB, AsterixDB, and RasDaMan. These containers optimize different aspects of the data processing pipeline and are, therefore, suitable for different types of applications. These containers are expected to undergo rapid evolution and the ability to re-test, as they evolve, is very important to ensure the containers are up to date and ready to be deployed to handle large volumes of observational data and model output. Our goal is to develop an evaluation plan for these containers to assess their suitability for Earth Science data processing needs. We have identified a selection of test cases that are relevant to most data processing exercises in Earth Science applications and we aim to evaluate these systems for optimal performance against each of these test cases. The use cases identified as part of this study are (i) data fetching, (ii) data preparation for multivariate analysis, (iii) data normalization, (iv) distance (kernel) computation, and (v) optimization. In this study we develop a set of metrics for performance evaluation, define the specifics of governance, and test the plan on current versions of the data containers. The test plan and the design mechanism are expandable to allow repeated testing with both new containers and upgraded versions of the ones mentioned above, so that we can gauge their utility as they evolve.
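
    A repeatable timing harness is the common core of such test plans; the sketch below shows the shape such a measurement could take. The benchmarked callable is a placeholder and is not tied to any of the containers named above:

    import statistics
    import time

    def benchmark(task, repeats=5):
        """Run `task` several times and report median/stdev wall-clock seconds."""
        samples = []
        for _ in range(repeats):
            start = time.perf_counter()
            task()
            samples.append(time.perf_counter() - start)
        return statistics.median(samples), statistics.stdev(samples)

    def fetch_subset():
        # Placeholder for test case (i), data fetching, on a real container.
        sum(x * x for x in range(10**6))

    median_s, spread_s = benchmark(fetch_subset)
    print(f"median {median_s:.4f}s  (stdev {spread_s:.4f}s over 5 runs)")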

  13. REPLACING OUTLYING WOOD ANATOMY IN THE EVALUATION OF PROCESSING ROUGHNESS DATA AT SANDING

    Directory of Open Access Journals (Sweden)

    Lidia GURĂU

    2015-09-01

    Full Text Available Anatomical irregularities should be removed from the evaluation if reliable processing roughness is to be evaluated. In order to obtain only measures of processing, wood anatomy can be removed with a method based on the Abbot-curve, which separates the core data from outliers by means of an upper and a lower threshold. Although researchers agree on the need to separate processing roughness from anatomical roughness, there has been no study on the most appropriate method of replacing the missing data in the roughness profiles. This paper examined three methods of replacing the outlying data and their effect on roughness parameters calculated on sampling lengths, as instructed by ISO 4287 (1998), and on the evaluation length. The Zero method replaced outliers with zero values that were disregarded in further calculations, the Predicted method replaced the missing data by cubic spline interpolation, and the Total removal method removed wood anatomical features completely, up to the profile mean line. The results showed that the Zero method is the best choice when roughness parameters are calculated on the evaluation length. Compared to the Predicted method, it has the advantage of using real data while giving similar results. The Total removal method dramatically reduces the number of profile valleys in the evaluation, biasing the roughness parameters.
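
    The Zero and Predicted methods compared above can be sketched on a synthetic profile; the thresholds standing in for the Abbot-curve limits and all values are invented:

    import numpy as np
    from scipy.interpolate import CubicSpline

    rng = np.random.default_rng(0)
    x = np.arange(200.0)
    profile = np.sin(x / 6.0) + 0.2 * rng.standard_normal(x.size)
    profile[[40, 41, 120]] -= 5.0            # deep "vessel" valleys (wood anatomy)

    lower, upper = -2.0, 2.0                 # stand-ins for Abbot-curve thresholds
    outliers = (profile < lower) | (profile > upper)

    # Zero method: outliers excluded from parameter calculations
    core_only = profile[~outliers]

    # Predicted method: outliers replaced by cubic-spline interpolation
    spline = CubicSpline(x[~outliers], profile[~outliers])
    predicted = profile.copy()
    predicted[outliers] = spline(x[outliers])

    print(f"Ra zero-method: {np.abs(core_only - core_only.mean()).mean():.3f}")
    print(f"Ra predicted:   {np.abs(predicted - predicted.mean()).mean():.3f}")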

  14. Evaluation of Assimilated SMOS Soil Moisture Data for US Cropland Soil Moisture Monitoring

    Science.gov (United States)

    Yang, Zhengwei; Sherstha, Ranjay; Crow, Wade; Bolten, John; Mladenova, Iva; Yu, Genong; Di, Liping

    2016-01-01

    Remotely sensed soil moisture data can provide timely, objective and quantitative crop soil moisture information with broad geospatial coverage and sufficiently high resolution observations collected throughout the growing season. This paper evaluates the feasibility of using the assimilated ESA Soil Moisture Ocean Salinity (SMOS) mission L-band passive microwave data for operational US cropland soil surface moisture monitoring. The assimilated SMOS soil moisture data are first categorized to match the United States Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) survey-based weekly soil moisture observation data, which are ordinal. The categorized assimilated SMOS soil moisture data are compared with NASS's survey-based weekly soil moisture data for consistency and robustness using visual assessment and rank correlation. Preliminary results indicate that the assimilated SMOS soil moisture data highly co-vary with NASS field observations across a large geographic area. Therefore, SMOS data have great potential for US operational cropland soil moisture monitoring.
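
    The rank-correlation check described here can be sketched in a few lines; the five-level ordinal coding and the sample values below are invented:

    from scipy.stats import spearmanr

    # Hypothetical ordinal scale, e.g. 1="very short" ... 5="surplus"
    nass_rating = [2, 3, 3, 4, 1, 5, 2, 4, 3, 1]   # survey observations
    smos_rating = [2, 3, 4, 4, 1, 4, 2, 5, 3, 2]   # categorized SMOS values

    rho, p_value = spearmanr(nass_rating, smos_rating)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")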

  15. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method.

    Science.gov (United States)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-02-01

    The data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5-12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio method in forensic evidence evaluation. These data present a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in (D. Meuwly, D. Ramos, R. Haraksim) [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the methods used is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validating the LR methods, which are used to calculate the LR values from the data, and for the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared. But these images do not constitute the core data for the validation, contrary to the LRs, which are shared.
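
    One widely used scalar when validating LR methods is the log-likelihood-ratio cost, Cllr. Whether [1] uses exactly this criterion is not restated here, and the LR values below are invented:

    import numpy as np

    def cllr(lr_same_source, lr_diff_source):
        """Cllr = 0.5 * ( mean log2(1 + 1/LR_ss) + mean log2(1 + LR_ds) ).
        Lower is better; 1.0 matches an uninformative system with LR = 1."""
        ss = np.log2(1.0 + 1.0 / lr_same_source).mean()
        ds = np.log2(1.0 + lr_diff_source).mean()
        return 0.5 * (ss + ds)

    same = np.array([30.0, 120.0, 8.0, 300.0])    # marks truly from the donor
    diff = np.array([0.02, 0.5, 0.001, 0.1])      # marks from other sources
    print(f"Cllr = {cllr(same, diff):.3f}")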

  16. Privacy-preserving data cube for electronic medical records: An experimental evaluation.

    Science.gov (United States)

    Kim, Soohyung; Lee, Hyukki; Chung, Yon Dohn

    2017-01-01

    The aim of this study is to evaluate the effectiveness and efficiency of privacy-preserving data cubes of electronic medical records (EMRs). An EMR data cube is a complex of EMR statistics that are summarized or aggregated by all possible combinations of attributes. Data cubes are widely utilized for efficient big data analysis and also have great potential for EMR analysis. For safe data analysis without privacy breaches, we must consider the privacy preservation characteristics of the EMR data cube. In this paper, we introduce a design for a privacy-preserving EMR data cube and the anonymization methods needed to achieve data privacy. We further focus on changes in efficiency and effectiveness that are caused by the anonymization process for privacy preservation. Thus, we experimentally evaluate various types of privacy-preserving EMR data cubes using several practical metrics and discuss the applicability of each anonymization method with consideration for the EMR analysis environment. We construct privacy-preserving EMR data cubes from anonymized EMR datasets. A real EMR dataset and demographic dataset are used for the evaluation. There are a large number of anonymization methods to preserve EMR privacy, and the methods are classified into three categories (i.e., global generalization, local generalization, and bucketization) by anonymization rules. According to this classification, three types of privacy-preserving EMR data cubes were constructed for the evaluation. We perform a comparative analysis by measuring the data size, cell overlap, and information loss of the EMR data cubes. Global generalization considerably reduced the size of the EMR data cube and did not cause the data cube cells to overlap, but incurred a large amount of information loss. Local generalization maintained the data size and generated only moderate information loss, but there were cell overlaps that could decrease the search performance. Bucketization did not cause cells to overlap

  17. Bias corrections of GOSAT SWIR XCO2 and XCH4 with TCCON data and their evaluation using aircraft measurement data

    Science.gov (United States)

    Inoue, Makoto; Morino, Isamu; Uchino, Osamu; Nakatsuru, Takahiro; Yoshida, Yukio; Yokota, Tatsuya; Wunch, Debra; Wennberg, Paul O.; Roehl, Coleen M.; Griffith, David W. T.; Velazco, Voltaire A.; Deutscher, Nicholas M.; Warneke, Thorsten; Notholt, Justus; Robinson, John; Sherlock, Vanessa; Hase, Frank; Blumenstock, Thomas; Rettinger, Markus; Sussmann, Ralf; Kyrö, Esko; Kivi, Rigel; Shiomi, Kei; Kawakami, Shuji; De Mazière, Martine; Arnold, Sabrina G.; Feist, Dietrich G.; Barrow, Erica A.; Barney, James; Dubey, Manvendra; Schneider, Matthias; Iraci, Laura T.; Podolske, James R.; Hillyard, Patrick W.; Machida, Toshinobu; Sawa, Yousuke; Tsuboi, Kazuhiro; Matsueda, Hidekazu; Sweeney, Colm; Tans, Pieter P.; Andrews, Arlyn E.; Biraud, Sebastien C.; Fukuyama, Yukio; Pittman, Jasna V.; Kort, Eric A.; Tanaka, Tomoaki

    2016-08-01

    We describe a method for removing systematic biases of column-averaged dry air mole fractions of CO2 (XCO2) and CH4 (XCH4) derived from short-wavelength infrared (SWIR) spectra of the Greenhouse gases Observing SATellite (GOSAT). We conduct correlation analyses between the GOSAT biases and simultaneously retrieved auxiliary parameters. We use these correlations to bias correct the GOSAT data, removing these spurious correlations. Data from the Total Carbon Column Observing Network (TCCON) were used as reference values for this regression analysis. To evaluate the effectiveness of this correction method, the uncorrected/corrected GOSAT data were compared to independent XCO2 and XCH4 data derived from aircraft measurements taken for the Comprehensive Observation Network for TRace gases by AIrLiner (CONTRAIL) project, the National Oceanic and Atmospheric Administration (NOAA), the US Department of Energy (DOE), the National Institute for Environmental Studies (NIES), the Japan Meteorological Agency (JMA), the HIAPER Pole-to-Pole observations (HIPPO) program, and the GOSAT validation aircraft observation campaign over Japan. These comparisons demonstrate that the empirically derived bias correction improves the agreement between GOSAT XCO2/XCH4 and the aircraft data. Finally, we present spatial distributions and temporal variations of the derived GOSAT biases.
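
    The empirical correction amounts to regressing the GOSAT-minus-TCCON difference on retrieved auxiliary parameters and subtracting the fitted bias; a minimal numerical sketch with invented parameters and values:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 500
    aux = rng.standard_normal((n, 2))            # e.g. retrieved albedo, airmass
    true_bias = 1.5 * aux[:, 0] - 0.8 * aux[:, 1]
    gosat_minus_tccon = true_bias + 0.3 * rng.standard_normal(n)

    X = np.column_stack([np.ones(n), aux])       # intercept + auxiliary predictors
    coef, *_ = np.linalg.lstsq(X, gosat_minus_tccon, rcond=None)

    corrected = gosat_minus_tccon - X @ coef     # subtract the fitted bias
    print(f"bias spread before: {gosat_minus_tccon.std():.2f}")
    print(f"bias spread after:  {corrected.std():.2f}")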

  18. Evaluating New York City's abortion reporting system: insights for public health data collection systems.

    Science.gov (United States)

    Toprani, Amita; Madsen, Ann; Das, Tara; Gambatese, Melissa; Greene, Carolyn; Begier, Elizabeth

    2014-01-01

    New York City (NYC) mandates reporting of all abortion procedures. These reports enable tracking of abortion incidence and underpin programs, policy, and research. Since January 2011, the majority of abortion facilities must report electronically. We conducted an evaluation of NYC's abortion reporting system and its transition to electronic reporting. We summarize the evaluation methodology and results and draw lessons relevant to other vital statistics and public health reporting systems. The evaluation followed Centers for Disease Control and Prevention guidelines for evaluating public health surveillance systems. We interviewed key stakeholders and conducted a data provider survey. In addition, we compared the system's abortion counts with external estimates and calculated the proportion of missing and invalid values for each variable on the report form. Finally, we assessed the process for changing the report form and estimated system costs. NYC Health Department's Bureau of Vital Statistics. Usefulness, simplicity, flexibility, data quality, acceptability, sensitivity, timeliness, and stability of the abortion reporting system. Ninety-five percent of abortion data providers considered abortion reporting important; 52% requested training regarding the report form. Thirty percent reported problems with electronic biometric fingerprint certification, and 18% reported problems with the electronic system's stability. Estimated system sensitivity was 88%. Of 17 variables, education and ancestry had more than 5% missing values in 2010. Changing the electronic reporting module was costly and time-consuming. System operating costs were estimated at $80 136 to $89 057 annually. The NYC abortion reporting system is sensitive and provides high-quality data, but opportunities for improvement include facilitating biometric certification, increasing electronic platform stability, and conducting ongoing outreach and training for data providers. This evaluation will help data

  19. Development of a Visual Inspection Data Collection Tool for Evaluation of Fielded PV Module Condition

    Energy Technology Data Exchange (ETDEWEB)

    Packard, C. E.; Wohlgemuth, J. H.; Kurtz, S. R.

    2012-08-01

    A visual inspection data collection tool for the evaluation of fielded photovoltaic (PV) modules has been developed to facilitate describing the condition of PV modules with regard to field performance. The proposed data collection tool consists of 14 sections, each documenting the appearance or properties of a part of the module. This report instructs on how to use the collection tool and defines each attribute to ensure reliable and valid data collection. This tool has been evaluated through the inspection of over 60 PV modules produced by more than 20 manufacturers and fielded at two different sites for varying periods of time. Aggregated data from such a single data collection tool has the potential to enable longitudinal studies of module condition over time, technology evolution, and field location for the enhancement of module reliability models.

  20. Comprehensive analysis and evaluation of big data for main transformer equipment based on PCA and Apriority

    Science.gov (United States)

    Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun

    2018-01-01

    With the power supply level of urban power grids developing toward high reliability, it is necessary to adopt appropriate methods for the comprehensive evaluation of existing equipment. Given the breadth and multi-dimensionality of power system data, big data mining is used to explore the potential laws and value in power system equipment data. Based on the monitoring data of main transformers and the records of defects and faults, this paper integrates data on the power grid equipment environment. Apriori is used as an association identification algorithm to extract the frequent correlation factors of the main transformer, and the potential dependencies in the big data are analyzed through support and confidence measures. The integrated data are then analyzed by PCA, and an integrated quantitative scoring model is constructed. The evaluation algorithm and scheme are shown to be effective by validation against a test set. This paper provides a new idea for data fusion in the smart grid and a reference for further evaluation of big data from power grid equipment.
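
    The pairing of the two named techniques can be sketched as follows, assuming the third-party scikit-learn and mlxtend packages; the condition flags and monitoring variables are invented:

    import pandas as pd
    from mlxtend.frequent_patterns import apriori
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # One-hot condition flags per inspection record (hypothetical)
    flags = pd.DataFrame({
        "oil_temp_high":   [1, 1, 0, 1, 0, 1],
        "dga_alarm":       [1, 1, 0, 1, 0, 0],
        "load_over_80pct": [0, 1, 1, 1, 0, 1],
    }).astype(bool)
    # Frequent co-occurring factors, ranked by support
    print(apriori(flags, min_support=0.5, use_colnames=True))

    # Composite score from continuous monitoring variables (hypothetical)
    monitors = pd.DataFrame({"oil_temp": [55, 71, 60, 75, 52, 68],
                             "h2_ppm":   [20, 90, 35, 120, 15, 80]})
    score = PCA(n_components=1).fit_transform(
        StandardScaler().fit_transform(monitors))
    print(score.ravel())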

  1. The Design, Development, and Evaluation of a Qualitative Data Collection Application for Pregnant Women.

    Science.gov (United States)

    Keedle, Hazel; Schmied, Virginia; Burns, Elaine; Dahlen, Hannah

    2018-01-01

    This article explores the development and evaluation of a smartphone mobile software application (app) to collect qualitative data. The app was specifically designed to capture real-time qualitative data from women planning a vaginal birth after caesarean delivery. This article outlines the design and development of the app, including funding, ethics approval, and the recruitment of an app developer, as well as the evaluation of the app's use by seven participants. Data collection methods used in qualitative research include interviews and focus groups (either online, face-to-face, or by phone), participant diaries, or observations of interactions. This article identifies an alternative data collection methodology using a smartphone app to collect real-time data. The app provides real-time data and instant access to data, alongside the ability to access participants in a variety of locations. This allows the researcher to gain insight into the experiences of participants through audio or video recordings in longitudinal studies without the need for constant interactions or interviews with participants. Using smartphone applications can allow researchers to access participants who are traditionally hard to reach and to access their data in real time. Evaluating these apps before use in research is invaluable. © 2017 Sigma Theta Tau International.

  2. Evaluation of thyroid radioactivity measurement data from Hanford workers, 1944--1946

    Energy Technology Data Exchange (ETDEWEB)

    Ikenberry, T.A.

    1991-05-01

    This report describes the preliminary results of an evaluation conducted in support of the Hanford Environmental Dose Reconstruction (HEDR) Project. The primary objective of the HEDR Project is to estimate the radiation doses that populations could have received from nuclear operations at the Hanford Site since 1944. A secondary objective is to make the information that HEDR staff members used in estimating radiation doses available to the public. The objectives of this report are to make available thyroid measurement data from Hanford workers for the years 1944 through 1946, and to investigate the suitability of those data for use in the HEDR dose estimation process. An important part of this investigation was to provide a description of the uncertainty associated with the data. Lack of documentation on thyroid measurements from this period required that assumptions be made to perform data evaluations. These assumptions introduce uncertainty into the evaluations that could be significant. It is important to recognize the nature of these assumptions, the inherent uncertainty, and the propagation of this uncertainty through data evaluations to any conclusions that can be made by using the data. 15 refs., 1 fig., 5 tabs.

  3. Development, implementation and evaluation of an information model for archetype based user responsive medical data visualization.

    Science.gov (United States)

    Kopanitsa, Georgy; Veseli, Hasan; Yampolsky, Vladimir

    2015-06-01

    When medical data have been successfully recorded or exchanged between systems, there appears a need to present the data consistently to ensure that they are clearly understood and interpreted. A standards-based user interface can provide interoperability on the visual level. The goal of this research was to develop, implement and evaluate an information model for building user interfaces for archetype-based medical data. The following types of knowledge were identified as important elements and were included in the information model: medical content related attributes, data type related attributes, user-related attributes, and device-related attributes. In order to support flexible and efficient user interfaces, an approach was chosen that represents different types of knowledge with different models, separating the medical concept from the visual concept and the interface realization. We evaluated the developed approach using the Guideline for Good Evaluation Practice in Health Informatics (GEP-HI). We developed a higher-level information model to complement the ISO 13606 archetype model. This enabled the specification of the presentation properties at the moment of the archetypes' definition. The model allows realizing different users' perspectives on the data. The approach was implemented and evaluated within a functioning EHR system. The evaluation involved 30 patients of different age and IT experience and 5 doctors. One month of testing showed that the time required to read electronic health records decreased for both doctors (from an average of 310 to 220 s) and patients (from an average of 95 to 39 s). Users reported a high level of satisfaction and motivation to use the presented data visualization approach, especially in comparison with their previous experience. The introduced information model allows separating medical knowledge and presentation knowledge. The additional presentation layer will enrich the graphical user interface's flexibility and will allow an optimal presentation of
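
    The separation of knowledge types described above can be loosely sketched as follows; the attribute names and the archetype identifier are invented, and this is not the authors' ISO 13606 extension:

    from dataclasses import dataclass

    @dataclass
    class MedicalContent:          # what the archetype element means
        archetype_id: str
        label: str

    @dataclass
    class Presentation:            # data-type-, user- and device-related knowledge
        data_type: str             # e.g. "quantity", "coded_text"
        audience: str              # e.g. "doctor" or "patient"
        device: str                # e.g. "desktop" or "mobile"
        widget: str                # chosen rendering, e.g. "trend_chart"

    @dataclass
    class DisplayElement:          # binds medical concept to visual concept
        content: MedicalContent
        presentation: Presentation

    element = DisplayElement(
        MedicalContent("ISO13606-ENTRY.blood_pressure.v1", "Systolic BP"),
        Presentation("quantity", "patient", "mobile", "trend_chart"),
    )
    print(element)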

  4. Evaluating Public Policies with High Frequency Data: Evidence for Driving Restrictions in Mexico City Revisited

    OpenAIRE

    Christian Salas

    2010-01-01

    The evaluation of public policies is at the heart of the efficient management of public resources. As complex as it generally is, any reform should be assessed on its ability to achieve its preconceived goals. This research paper attempts to show the importance of the design of a public policy's empirical evaluation, considering the susceptibility that its conclusions might have to changes in the approach to the data. The work of Davis (2008), which finds that a driving restrictions program had n...

  5. Annual report of the project CIS-03-95, `evaluation of actinide nuclear data`

    Energy Technology Data Exchange (ETDEWEB)

    Maslov, V.M. [Radiation Physics and Chemistry Problems Inst., Minsk-Sosny (Belarus)

    1997-03-01

    The evaluation of neutron data for {sup 243}Cm, {sup 245}Cm and {sup 246}Cm is made in the energy region from 10{sup -5} eV up to 20 MeV. The results of the evaluation are compiled in the ENDF/B-VI format. This work is performed under the Project Agreement CIS-03-95 with the International Science and Technology Center (Moscow). This is the annual report of the project CIS-03-95. (author)

  6. Evaluation of PM-3 Chemistry Data and Possible Interpretations of 3H Observations, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Robert [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States); Marutzky, Sam J. [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States)

    2015-02-01

    This report summarizes the analyses of the groundwater results from sampling of PM-3-1 (deep) and PM-3-2 (shallow), with a particular focus on evaluating the groundwater geochemistry data in comparison to the geochemistry observed in other wells in the Thirsty Canyon area, and on evaluating the potential source of the {sup 3}H observed in these piezometers during previous sampling activities, which employed depth-discrete bailers or a Bennett submersible piston pump.

  7. Multisensor Data Fusion for Water Quality Evaluation Using Dempster-Shafer Evidence Theory

    OpenAIRE

    Jian Zhou; Linfeng Liu; Jian Guo; Lijuan Sun

    2013-01-01

    A multisensor data fusion approach for water quality evaluation using Dempster-Shafer evidence theory is presented. To evaluate water quality, each sensor measurement is considered as a piece of evidence. Based on the water quality parameters measured by sensor node, the mass function of water quality class is calculated. Evidence from each sensor is given a reliability discounting and then combined with the others by D-S rule. According to the decision rule which uses the fusion mass functio...
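
    Dempster's rule of combination referenced above can be sketched compactly; the mass values and the two-class frame of discernment below are invented for illustration:

    from itertools import product

    def combine(m1, m2):
        """Combine two mass functions over frozenset focal elements (D-S rule)."""
        combined, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass landing on the empty set
        # Normalize by (1 - conflict) per Dempster's rule
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    I, II = frozenset({"I"}), frozenset({"II"})
    either = I | II                          # ignorance: "class I or II"
    sensor1 = {I: 0.6, II: 0.1, either: 0.3} # discounted evidence, sensor 1
    sensor2 = {I: 0.5, II: 0.2, either: 0.3} # discounted evidence, sensor 2
    for focal, mass in combine(sensor1, sensor2).items():
        print(set(focal), round(mass, 3))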

  8. Challenges in Evaluating Relationships Between Quantitative Data (Carbon Dioxide) and Qualitative Data (Self-Reported Visual Changes)

    Science.gov (United States)

    Mendez, C. M.; Foy, M.; Mason, S.; Wear, M. L.; Meyers, V.; Law, J.; Alexander, D.; Van Baalen, M.

    2014-01-01

    Understanding the nuances in clinical data is critical in developing a successful data analysis plan. Carbon dioxide (CO2) data are collected on board the International Space Station (ISS) in a continuous stream. Clinical data on ISS are primarily collected via conversations between individual crewmembers and NASA Flight Surgeons during weekly Private Medical Conferences (PMC). Law et al., 2014 [1] demonstrated a statistically significant association between weekly average CO2 levels on ISS and self-reported headaches over the reporting period from March 14, 2001 to May 31, 2012. The purpose of this analysis is to describe the evaluation of a possible association between visual changes and CO2 levels on ISS and to discuss challenges in developing an appropriate analysis plan. METHODS & PRELIMINARY RESULTS: A first analysis was conducted following the same study design as the published work on CO2 and self-reported headaches [1], substituting self-reported changes in visual acuity in place of self-reported headaches. The analysis demonstrated no statistically significant association between visual impairment, characterized by vision symptoms self-reported during PMCs, and average CO2 levels over ISS missions. Closer review of the PMC records showed that vision outcomes are not well-documented in terms of clinical severity, timing of onset, or timing of resolution, perhaps due to the incipient nature of vision changes. Vision has been monitored in ISS crewmembers, pre- and post-flight, using standard optometry evaluations. In-flight visual assessments were limited early in the ISS program, primarily consisting of self-perceived changes reported by crewmembers. Recently, on-orbit capabilities have greatly improved. Vision data ranges from self-reported post-flight changes in visual acuity, pre- to post-flight changes identified during fundoscopic examination, and in-flight progression measured by advanced on-orbit clinical imaging capabilities at predetermined testing

  9. Evaluation in medical education: A topical review of target parameters, data collection tools and confounding factors

    Directory of Open Access Journals (Sweden)

    Schiekirka, Sarah

    2015-09-01

    Full Text Available Background and objective: Evaluation is an integral part of education in German medical schools. According to the quality standards set by the German Society for Evaluation, evaluation tools must provide an accurate and fair appraisal of teaching quality. Thus, data collection tools must be highly reliable and valid. This review summarises the current literature on evaluation of medical education with regard to the possible dimensions of teaching quality, the psychometric properties of survey instruments and potential confounding factors. Methods: We searched Pubmed, PsycINFO and PSYNDEX for literature on evaluation in medical education and included studies published up until June 30, 2011 as well as articles identified in the “grey literature”. Results are presented as a narrative review. Results: We identified four dimensions of teaching quality: structure, process, teacher characteristics, and outcome. Student ratings are predominantly used to address the first three dimensions, and a number of reliable tools are available for this purpose. However, potential confounders of student ratings pose a threat to the validity of these instruments. Outcome is usually operationalised in terms of student performance on examinations, but methodological problems may limit the usability of these data for evaluation purposes. In addition, not all examinations at German medical schools meet current quality standards. Conclusion: The choice of tools for evaluating medical education should be guided by the dimension that is targeted by the evaluation. Likewise, evaluation results can only be interpreted within the context of the construct addressed by the data collection tool that was used as well as its specific confounding factors.

  10. Systems biology: model based evaluation and comparison of potential explanations for given biological data.

    Science.gov (United States)

    Cedersund, Gunnar; Roll, Jacob

    2009-02-01

    Systems biology and its usage of mathematical modeling to analyse biological data is rapidly becoming an established approach to biology. A crucial advantage of this approach is that more information can be extracted from observations of intricate dynamics, which allows nontrivial complex explanations to be evaluated and compared. In this minireview we explain this process, and review some of the most central available analysis tools. The focus is on the evaluation and comparison of given explanations for a given set of experimental data and prior knowledge. Three types of methods are discussed: (a) for evaluation of whether a given model is sufficiently able to describe the given data to be nonrejectable; (b) for evaluation of whether a slightly superior model is significantly better; and (c) for a general evaluation and comparison of the biologically interesting features in a model. The most central methods are reviewed, both in terms of underlying assumptions, including references to more advanced literature for the theoretically oriented reader, and in terms of practical guidelines and examples, for the practically oriented reader. Many of the methods are based upon analysis tools from statistics and engineering, and we emphasize that the systems biology focus on acceptable explanations puts these methods in a nonstandard setting. We highlight some associated future improvements that will be essential for future developments of model based data analysis in biology.
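
    Method type (a), testing whether a model describes the data well enough to be non-rejectable, is often implemented as a chi-square test on the residuals when the measurement noise is known; a sketch with an invented model and data:

    import numpy as np
    from scipy.stats import chi2

    def chi2_rejection_test(residuals, sigma, n_params, alpha=0.05):
        """Reject the model if the normalized residual sum exceeds the
        chi-square threshold with (n_data - n_params) degrees of freedom."""
        stat = float(np.sum((residuals / sigma) ** 2))
        threshold = chi2.ppf(1 - alpha, residuals.size - n_params)
        return stat, threshold, stat > threshold

    t = np.linspace(0, 10, 50)
    rng = np.random.default_rng(2)
    data = 2.0 * np.exp(-0.5 * t) + 0.1 * rng.standard_normal(t.size)
    model = 2.0 * np.exp(-0.5 * t)              # candidate explanation
    stat, threshold, rejected = chi2_rejection_test(data - model, 0.1, n_params=2)
    print(f"chi2 = {stat:.1f}, threshold = {threshold:.1f}, rejected: {rejected}")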

  11. Fuzzy norm method for evaluating random vibration of airborne platform from limited PSD data

    Directory of Open Access Journals (Sweden)

    Wang Zhongyu

    2014-12-01

    Full Text Available For the random vibration of an airborne platform, accurate evaluation is a key requirement for ensuring the normal operation of airborne equipment in flight. However, only limited power spectral density (PSD) data can be obtained at the flight test stage. Conventional evaluation methods therefore cannot be employed, because the distribution characteristics and prior information are unknown. In this paper, the fuzzy norm method (FNM) is proposed, which combines the advantages of fuzzy theory and norm theory. The proposed method can extract deep system information from limited data without taking the probability distribution into account. Firstly, the FNM is employed to evaluate the variable interval and expanded uncertainty from limited PSD data, and the performance of the FNM is demonstrated in terms of confidence level, reliability and the computing accuracy of the expanded uncertainty. In addition, the optimal fuzzy parameters are discussed to meet the requirements of aviation standards and metrological practice. Finally, computer simulation is used to demonstrate the adaptability of the FNM. Compared with statistical methods, the FNM is superior for evaluating expanded uncertainty from limited data. The results show that the reliability of the calculation and evaluation is better than 95%.

  12. A data set for evaluating the performance of multi-class multi-object video tracking

    Science.gov (United States)

    Chakraborty, Avishek; Stamatescu, Victor; Wong, Sebastien C.; Wigley, Grant; Kearney, David

    2017-05-01

    One of the challenges in evaluating multi-object video detection, tracking and classification systems is having publicly available data sets with which to compare different systems. However, the measures of performance for tracking and classification are different. Data sets that are suitable for evaluating tracking systems may not be appropriate for classification. Tracking video data sets typically only have ground truth track IDs, while classification video data sets only have ground truth class-label IDs. The former identifies the same object over multiple frames, while the latter identifies the type of object in individual frames. This paper describes an advancement of the ground truth meta-data for the DARPA Neovision2 Tower data set to allow the evaluation of both tracking and classification. The ground truth data sets presented in this paper contain unique object IDs across 5 different classes of object (Car, Bus, Truck, Person, Cyclist) for 24 videos of 871 image frames each. In addition to the object IDs and class labels, the ground truth data also contain the original bounding box coordinates together with new bounding boxes in instances where un-annotated objects were present. The unique IDs are maintained during occlusions between multiple objects or when objects re-enter the field of view. This will provide: a solid foundation for evaluating the performance of multi-object tracking of different types of objects, a straightforward comparison of tracking system performance using the standard Multi Object Tracking (MOT) framework, and classification performance using the Neovision2 metrics. These data have been hosted publicly.
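
    Scoring a tracker against such ground truth in the standard MOT framework can be sketched with the third-party py-motmetrics package (an assumption for illustration, not a tool named by the paper); the IDs and distances below are invented:

    import numpy as np
    import motmetrics as mm

    acc = mm.MOTAccumulator(auto_id=True)
    # Frame 1: GT objects 1,2 vs. hypotheses 'a','b'; entries are distances
    # (np.nan means the pairing is not allowed, e.g. overlap below threshold).
    acc.update([1, 2], ["a", "b"], [[0.2, np.nan],
                                    [np.nan, 0.4]])
    # Frame 2: object 2 is missed by the tracker
    acc.update([1, 2], ["a"], [[0.3],
                               [np.nan]])

    mh = mm.metrics.create()
    print(mh.compute(acc, metrics=["num_frames", "mota", "motp"], name="demo"))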

  13. SoFAR: software for fully automatic evaluation of real-time PCR data.

    Science.gov (United States)

    Wilhelm, Jochen; Pingoud, Alfred; Hahn, Meinhard

    2003-02-01

    Quantitative real-time PCR has proven to be an extremely useful technique in life sciences for many applications. Although a lot of attention has been paid to the optimization of the assay conditions, the analysis of the data acquired is often done with software tools that do not make optimum use of the information provided by the data. Particularly, this is the case for high-throughput analysis, which requires a careful characterization and interpretation of the complete data by suitable software. Here we present a software solution for the robust, reliable, accurate, and fast evaluation of real-time PCR data, called SoFAR. The software automatically evaluates the data acquired with the LightCycler system. It applies new algorithms for an adaptive background correction of signal trends, the calculation of the effective signal noise, the automated identification of the exponential phases, the adaptive smoothing of the raw data, and the correction of melting curve data. Finally, it provides information regarding the validity of the results obtained. The SoFAR software minimizes the time required for evaluation and increases the accuracy and reliability of the results. The software is available upon request.
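
    The abstract names SoFAR's algorithmic steps only at a high level. One common way to locate the exponential phase of an amplification curve (not necessarily SoFAR's algorithm) is the second-derivative maximum of the smoothed signal; a sketch with a simulated curve:

    import numpy as np
    from scipy.signal import savgol_filter

    cycles = np.arange(1, 41, dtype=float)
    rng = np.random.default_rng(3)
    # Simulated sigmoid amplification curve with baseline drift and noise
    fluor = (10.0 / (1.0 + np.exp(-(cycles - 24) / 1.5))
             + 0.01 * cycles + 0.05 * rng.standard_normal(cycles.size))

    smooth = savgol_filter(fluor, window_length=7, polyorder=3)   # adaptive smoothing
    second_deriv = np.gradient(np.gradient(smooth, cycles), cycles)
    cq = cycles[np.argmax(second_deriv)]     # second-derivative maximum
    print(f"estimated Cq ~ cycle {cq:.0f}")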

  14. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    Science.gov (United States)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  15. New approaches to provide feedback from nuclear and covariance data adjustment for effective improvement of evaluated nuclear data files

    Energy Technology Data Exchange (ETDEWEB)

    Palmiotti, Giuseppe; Salvatores, Massimo; Hursin, Mathieu; Kodeli, Ivo; Gabrielli, Fabrizio; Hummel, Andrew

    2016-11-01

    A critical examination of the role of uncertainty assessment, target accuracies, the role of integral experiments for validation and, consequently, of data adjustment methods has been under way for several years at the OECD-NEA. The objective is to provide criteria and practical approaches for using the results of sensitivity analyses and cross-section adjustments effectively as feedback to evaluators and experimentalists, in order to improve without ambiguity the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications, and to meet new requirements and constraints for innovative reactor and fuel cycle system design. An approach is described that expands as much as possible the use in the adjustment procedure of selected integral experiments that provide information on “elementary” phenomena, on separated individual physics effects related to specific isotopes, or on specific energy ranges. An application to a large experimental data base has been performed and the results are discussed in the perspective of new evaluation projects such as the CIELO initiative.

  16. New approaches to provide feedback from nuclear and covariance data adjustment for effective improvement of evaluated nuclear data files

    Science.gov (United States)

    Palmiotti, Giuseppe; Salvatores, Massimo; Hursin, Mathieu; Kodeli, Ivo; Gabrielli, Fabrizio; Hummel, Andrew

    2017-09-01

    A critical examination of the role of uncertainty assessment, target accuracies, the role of integral experiments for validation and, consequently, of data adjustment methods has been under way for several years at the OECD-NEA. The objective is to provide criteria and practical approaches for using the results of sensitivity analyses and cross-section adjustments effectively as feedback to evaluators and experimentalists, in order to improve without ambiguity the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications, and to meet new requirements and constraints for innovative reactor and fuel cycle system design. An approach is described that expands as much as possible the use in the adjustment procedure of selected integral experiments that provide information on "elementary" phenomena, on separated individual physics effects related to specific isotopes, or on specific energy ranges. An application to a large experimental data base has been performed and the results are discussed in the perspective of new evaluation projects such as the CIELO initiative.
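
    The adjustment equations are not reproduced in the abstract. For orientation, the generalized least-squares update that underlies most cross-section adjustment procedures is sketched below (x prior parameters, C their covariance, S the sensitivity matrix, y the integral measurements, V their covariance; all numbers invented).

```python
import numpy as np

def gls_adjust(x, C, S, y, V):
    """Generalized least-squares nuclear data adjustment:
    x' = x + C S^T (S C S^T + V)^-1 (y - S x)
    C' = C - C S^T (S C S^T + V)^-1 S C
    """
    K = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)  # gain matrix
    x_post = x + K @ (y - S @ x)
    C_post = C - K @ S @ C
    return x_post, C_post

# Toy example: two cross-section parameters, one integral experiment.
x = np.array([1.00, 2.00])
C = np.diag([0.04, 0.09])
S = np.array([[0.5, 0.3]])    # sensitivity of the integral response
y = np.array([1.15])          # measured integral response
V = np.array([[0.01]])        # experimental covariance
print(gls_adjust(x, C, S, y, V))
```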

  17. The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions

    Science.gov (United States)

    Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.

    2016-01-01

    A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates the various nuclear reaction models needed to describe reactions induced by nucleons, light charged nuclei up to alpha particles, and photons. The code is written in the C++ programming language using object-oriented techniques. It was first applied to neutron-induced reaction data on actinides, which were compiled into the JENDL Actinide File 2008 and JENDL-4.0, and it has since been used extensively in nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded for nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions, and it was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system, indicating the concept and design of its coding and inputs. Details of the formulation for modeling direct, pre-equilibrium and compound reactions are presented. Applications to nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reactions and β-delayed neutron emission are described.
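
    For readers unfamiliar with the compound-reaction modeling mentioned above, the textbook Hauser-Feshbach expression is shown below in schematic form; actual implementations such as CCONE additionally apply width-fluctuation corrections and more detailed level-density treatments.

```latex
% Hauser-Feshbach compound-nucleus cross section (schematic):
% formation in channel alpha, decay to channel beta, with
% transmission coefficients T for each open channel gamma.
\sigma_{\alpha\beta}(E) \;=\; \sigma^{\mathrm{CN}}_{\alpha}(E)\,
  \frac{T_{\beta}}{\sum_{\gamma} T_{\gamma}}
```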

  18. Evaluation of Land Surface Temperature Operationally Retrieved from Korean Geostationary Satellite (COMS Data

    Directory of Open Access Journals (Sweden)

    A-Ra Cho

    2013-08-01

    Full Text Available We evaluated the precision of land surface temperature (LST) operationally retrieved from the Korean multipurpose geostationary satellite, the Communication, Ocean and Meteorological Satellite (COMS). The split-window (SW)-type retrieval algorithm was developed through radiative transfer model simulations under various atmospheric profiles, satellite zenith angles, surface emissivity values and surface lapse-rate conditions using Moderate Resolution Atmospheric Transmission version 4 (MODTRAN4). The estimation capabilities of the COMS SW (CSW) LST algorithm were evaluated for various impacting factors, and the retrieval accuracy of COMS LST data was evaluated against collocated Moderate Resolution Imaging Spectroradiometer (MODIS) LST data. The surface emissivity values for the two SW channels were generated using a vegetation cover method. The CSW algorithm estimated the LST distribution reasonably well (average bias = 0.00 K, root mean square error (RMSE) = 1.41 K, correlation coefficient = 0.99); however, its estimation capabilities were significantly affected by large brightness temperature differences and surface lapse rates. The CSW algorithm reproduced the spatiotemporal variations of LST well in comparison with MODIS LST data, irrespective of the month or time of day the data were collected. The one-year evaluation against MODIS LST data showed that the annual mean bias, RMSE and correlation coefficient for the CSW algorithm were −1.009 K, 2.613 K and 0.988, respectively.
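
    The exact CSW coefficients are not given in the abstract. A generic split-window LST form of the kind such algorithms use is sketched below; the coefficients here are purely illustrative, whereas operational values come from radiative-transfer regression of the sort described above.

```python
def split_window_lst(t11, t12, emissivity, a0=1.0, a1=1.0, a2=2.5, a3=50.0):
    """Generic split-window land surface temperature estimate:
    LST = a0 + a1*T11 + a2*(T11 - T12) + a3*(1 - emissivity)
    T11/T12 are brightness temperatures (K) of the two SW channels.
    Coefficients are illustrative placeholders, not the CSW values."""
    return a0 + a1 * t11 + a2 * (t11 - t12) + a3 * (1.0 - emissivity)

print(split_window_lst(t11=295.2, t12=293.8, emissivity=0.97))
```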

  19. A guide to evaluating linkage quality for the analysis of linked data.

    Science.gov (United States)

    Harron, Katie L; Doidge, James C; Knight, Hannah E; Gilbert, Ruth E; Goldstein, Harvey; Cromwell, David A; van der Meulen, Jan H

    2017-10-01

    Linked datasets are an important resource for epidemiological and clinical studies, but linkage error can lead to biased results. For data security reasons, linkage of personal identifiers is often performed by a third party, making it difficult for researchers to assess the quality of the linked dataset in the context of specific research questions. This is compounded by a lack of guidance on how to determine the potential impact of linkage error. We describe how linkage quality can be evaluated and provide widely applicable guidance for both data providers and researchers. Using an illustrative example of a linked dataset of maternal and baby hospital records, we demonstrate three approaches for evaluating linkage quality: applying the linkage algorithm to a subset of gold standard data to quantify linkage error; comparing characteristics of linked and unlinked data to identify potential sources of bias; and evaluating the sensitivity of results to changes in the linkage procedure. These approaches can inform our understanding of the potential impact of linkage error and provide an opportunity to select the most appropriate linkage procedure for a specific analysis. Evaluating linkage quality in this way will improve the quality and transparency of epidemiological and clinical research using linked data.
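
    Of the three approaches, the first, applying the linkage algorithm to a gold-standard subset, reduces to counting false and missed matches. A minimal sketch, assuming links are represented as pairs of record IDs:

```python
def linkage_error_rates(gold_links, algorithm_links):
    """Compare algorithm links against gold-standard links.
    Both arguments are sets of (record_a, record_b) pairs."""
    gold, algo = set(gold_links), set(algorithm_links)
    false_matches = algo - gold          # links made in error
    missed_matches = gold - algo         # true links not made
    sensitivity = len(gold & algo) / len(gold) if gold else float("nan")
    ppv = len(gold & algo) / len(algo) if algo else float("nan")
    return {"sensitivity": sensitivity, "positive_predictive_value": ppv,
            "false": len(false_matches), "missed": len(missed_matches)}

print(linkage_error_rates({(1, "A"), (2, "B"), (3, "C")},
                          {(1, "A"), (2, "D"), (3, "C")}))
```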

  20. Existing test data for the Act on Registration & Evaluation, etc. of Chemical Substances

    Directory of Open Access Journals (Sweden)

    Bong-In Choi

    2015-12-01

    Full Text Available Objectives In this study, the possibility of using existing test data provided in Korea and elsewhere for the registration of chemical substances was examined. Data on 510 chemical substances that are among the first subject to registration under the Act on the Registration and Evaluation, etc. of Chemical Substances (K-REACH) were analyzed. Methods The possibility of using existing data from 16 reference databases was examined for the 510 chemical substances notified in July 2015 as being subject to registration. Results Test data with the reliability required for the registration of chemical substances under the K-REACH constituted 48.4% of the required physicochemical characteristics, 6.5% of the required health hazards, and 9.4% of the required environmental hazards. Conclusions Some existing test data were not within the scope of this research, including data used for registration in the European Union (EU). Thus, considering that 350 of these 510 substances are registered under the EU Registration, Evaluation, Authorisation & Restriction of Chemicals (REACH), more usable test data may exist beyond those identified in this study. Furthermore, the K-REACH states that non-testing data (test results predicted through read-across or quantitative structure-activity relationships) and weight of evidence (test results predicted based on test data with low reliability) can also be utilized for registration. Therefore, if methods for using such data were actively reviewed, it would be possible to reduce the cost of securing the test data required for the registration of chemical substances.

  1. BioBenchmark Toyama 2012: an evaluation of the performance of triple stores on biological data.

    Science.gov (United States)

    Wu, Hongyan; Fujiwara, Toyofumi; Yamamoto, Yasunori; Bolleman, Jerven; Yamaguchi, Atsuko

    2014-01-01

    Biological databases vary enormously in size and data complexity, from small databases that contain a few million Resource Description Framework (RDF) triples to large databases that contain billions of triples. In this paper, we evaluate whether RDF native stores can be used to meet the needs of a biological database provider. Prior evaluations have used synthetic data of limited database size. For example, the largest BSBM benchmark uses 1 billion synthetic e-commerce RDF triples on a single node. However, real-world biological data differ substantially from such simple synthetic data, and it is difficult to determine whether results on synthetic e-commerce data are representative of biological databases. Therefore, for this evaluation, we used five real data sets from biological databases. We evaluated five triple stores, 4store, Bigdata, Mulgara, Virtuoso, and OWLIM-SE, with five biological data sets, Cell Cycle Ontology, Allie, PDBj, UniProt, and DDBJ, ranging in size from approximately 10 million to 8 billion triples. For each database, we loaded all the data onto a single node and prepared the database for use in a classical data warehouse scenario. Then, we ran a series of SPARQL queries against each endpoint and recorded the execution time and the accuracy of the query responses. Our results show that, with appropriate configuration, Virtuoso and OWLIM-SE can satisfy the basic requirements for loading and querying biological data sets of up to roughly 8 billion triples on a single node, with 64 clients accessing simultaneously. OWLIM-SE performs best for databases with approximately 11 million triples; for data sets containing 94 million and 590 million triples, OWLIM-SE and Virtuoso perform best, with no overwhelming advantage over each other; for data sets over 4 billion triples, Virtuoso works best. 4store performs well on small data sets with limited features when the number of triples is less than 100 million, but our tests show its scalability is poor; Bigdata
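
    A benchmark run of this kind boils down to timing SPARQL queries against each endpoint. A minimal sketch using the SPARQLWrapper library; the endpoint URL and query are placeholders, and a real run would iterate over the full query mix and client counts.

```python
import time
from SPARQLWrapper import SPARQLWrapper, JSON

def time_query(endpoint_url, query):
    """Execute one SPARQL query and return (seconds, result count)."""
    sparql = SPARQLWrapper(endpoint_url)
    sparql.setQuery(query)
    sparql.setReturnFormat(JSON)
    start = time.perf_counter()
    results = sparql.query().convert()
    elapsed = time.perf_counter() - start
    return elapsed, len(results["results"]["bindings"])

# Placeholder endpoint and query.
print(time_query("http://localhost:8890/sparql",
                 "SELECT ?s WHERE { ?s ?p ?o } LIMIT 100"))
```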

  2. Internal evaluation of a physically-based distributed model using data from a Mediterranean mountain catchment

    Directory of Open Access Journals (Sweden)

    S. P. Anderton

    2002-01-01

    Full Text Available An evaluation of the performance of a physically-based distributed model of a small Mediterranean mountain catchment is presented. This was carried out using hydrological response data, including measurements of runoff, soil moisture, phreatic surface level and actual evapotranspiration. A-priori model parameterisation was based as far as possible on property data measured in the catchment. Limited model calibration was required to identify an appropriate value for terms controlling water loss to a deeper regional aquifer. The model provided good results for an initial calibration period, when judged in terms of catchment discharge. However, model performance for runoff declined substantially when evaluated against a consecutive, rather drier, period of data. Evaluation against other catchment responses allowed identification of the problems responsible for the observed lack of model robustness in flow simulation. In particular, it was shown that an incorrect parameterisation of the soil water model was preventing adequate representation of drainage from soils during hydrograph recessions. This excess moisture was then being removed via an overestimation of evapotranspiration. It also appeared that the model underestimated canopy interception. The results presented here suggest that model evaluation against catchment scale variables summarising its water balance can be of great use in identifying problems with model parameterisation, even for distributed models. Evaluation using spatially distributed data yielded less useful information on model performance, owing to the relative sparseness of data points, and problems of mismatch of scale between the measurement and the model grid. Keywords: physically-based distributed model, SHETRAN, parameterisation, Mediterranean mountain catchment, internal evaluation, multi-response

  3. Evaluation of relevant information for optimal reflector modeling through data assimilation procedures

    OpenAIRE

    Argaud Jean-Philippe; Bouriquet Bertrand; Clerc Thomas; Lucet-Sanchez Flora; Ponçot Angélique

    2015-01-01

    The goal of this study is to determine the amount of information required to obtain a relevant parameter optimisation by data assimilation for physical models in neutronic diffusion calculations, and to determine which information best reaches the optimum accuracy at the lowest cost. To evaluate the quality of the optimisation, we study the covariance matrix that represents the accuracy of the optimised parameter. This matrix is a classical output of the data assimilation ...
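
    In the linear-Gaussian setting that underlies such studies, the optimised-parameter covariance has a closed form. A minimal numpy sketch (B prior covariance, H observation operator, R observation-error covariance; values invented):

```python
import numpy as np

def posterior_covariance(B, H, R):
    """Analysis-error covariance of linear-Gaussian data assimilation:
    A = (B^-1 + H^T R^-1 H)^-1
    Smaller diagonal entries of A mean better-constrained parameters."""
    return np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)

B = np.diag([1.0, 4.0])                  # prior parameter covariance
H = np.array([[1.0, 0.0], [1.0, 1.0]])   # two observations of two parameters
R = np.diag([0.5, 0.5])                  # observation-error covariance
print(posterior_covariance(B, H, R))
```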

  4. Internal evaluation of a physically-based distributed model using data from a Mediterranean mountain catchment

    OpenAIRE

    Anderton, S. P.; J. Latron; White, S M; P. Llorens; Gallart, F.; C. Salvany; P. E. O’Connell

    2002-01-01

    International audience; An evaluation of the performance of a physically-based distributed model of a small Mediterranean mountain catchment is presented. This was carried out using hydrological response data, including measurements of runoff, soil moisture, phreatic surface level and actual evapotranspiration. A-priori model parameterisation was based as far as possible on property data measured in the catchment. Limited model calibration was required to identify an appropriate value for ter...

  5. Evaluation of Swiss slaughterhouse data for integration in a syndromic surveillance system

    OpenAIRE

    Vial, Flavie; Reist, Martin

    2014-01-01

    BACKGROUND: We evaluated Swiss slaughterhouse data for integration in a national syndromic surveillance system for the early detection of emerging diseases in production animals. We analysed meat inspection data for cattle, pigs and small ruminants slaughtered between 2007 and 2012 (including emergency slaughters of sick/injured animals); investigating patterns in the number of animals slaughtered and condemned; the reasons invoked for whole carcass condemnations; reporting biases and re...

  6. Advancing a framework to enable characterization and evaluation of data streams useful for biosurveillance.

    Directory of Open Access Journals (Sweden)

    Kristen J Margevicius

    Full Text Available In recent years, biosurveillance has become the buzzword under which a diverse set of ideas and activities regarding detecting and mitigating biological threats are incorporated depending on context and perspective. Increasingly, biosurveillance practice has become global and interdisciplinary, requiring information and resources across public health, One Health, and biothreat domains. Even within the scope of infectious disease surveillance, multiple systems, data sources, and tools are used with varying and often unknown effectiveness. Evaluating the impact and utility of state-of-the-art biosurveillance is, in part, confounded by the complexity of the systems and the information derived from them. We present a novel approach conceptualizing biosurveillance from the perspective of the fundamental data streams that have been or could be used for biosurveillance and to systematically structure a framework that can be universally applicable for use in evaluating and understanding a wide range of biosurveillance activities. Moreover, the Biosurveillance Data Stream Framework and associated definitions are proposed as a starting point to facilitate the development of a standardized lexicon for biosurveillance and characterization of currently used and newly emerging data streams. Criteria for building the data stream framework were developed from an examination of the literature, analysis of information on operational infectious disease biosurveillance systems, and consultation with experts in the area of biosurveillance. To demonstrate utility, the framework and definitions were used as the basis for a schema of a relational database for biosurveillance resources and in the development and use of a decision support tool for data stream evaluation.

  7. Study on Evaluation of Project Management Data for Decommissioning of Uranium Refining and Conversion Plant - 12234

    Energy Technology Data Exchange (ETDEWEB)

    Usui, Hideo; Izumo, Sari; Tachibana, Mitsuo [Japan Atomic Energy Agency, Tokai-mura, Naka-gun, Ibaraki, 319-1195 (Japan); Shibahara, Yuji [Japan Atomic Energy Agency, Tokai-mura, Naka-gun, Ibaraki, 319-1195 (Japan); University of Fukui, Fukui-shi, Fukui, 910-8507 (Japan); Morimoto, Yasuyuki; Tokuyasu, Takashi; Takahashi, Nobuo; Tanaka, Yoshio; Sugitsue, Noritake [Japan Atomic Energy Agency, Kagamino-cho, Tomata-gun, Okayama, 708-0698 (Japan)

    2012-07-01

    Some nuclear facilities that are no longer required have been decommissioned at JAEA (Japan Atomic Energy Agency), and many more will have to be decommissioned in the near future. To implement the decommissioning of nuclear facilities, it is important to prepare a rational decommissioning plan. Therefore, a project management data evaluation system for dismantling activities (PRODIA code) has been developed, which will be useful for making a detailed decommissioning plan for a given facility. Dismantling of the dry conversion facility in the uranium refining and conversion plant (URCP) at Ningyo-toge began in 2008. During the dismantling activities, project management data such as manpower and the amount of waste generated have been collected. The collected project management data have been evaluated and used to establish calculation formulas for the manpower needed to dismantle equipment of the chemical process and for the manpower needed to use a green house (GH), a temporary structure for preventing the spread of contaminants during dismantling. In the calculation formula for project management data related to the dismantling of equipment, the relation of dismantling manpower to each piece of equipment was evaluated, as was the relation of dismantling manpower to each chemical process. The results showed promise for evaluating dismantling manpower with respect to each chemical process. In the calculation formula for project management data related to use of the GH, the relations of GH installation and removal manpower to GH footprint were evaluated, and a calculation formula for secondary waste generation was established. In this study, project management data related to the dismantling of equipment and the use of the GH were evaluated and analyzed; the data considered were manpower for dismantling of equipment, manpower for installation and removal of the GH, and secondary waste generation from the GH.
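
    The abstract does not give the fitted formulas. Evaluations of this kind typically regress manpower against a size measure such as GH footprint, as in this illustrative least-squares sketch; the numbers are invented, not JAEA data.

```python
import numpy as np

# Hypothetical (footprint m^2, installation person-days) pairs.
footprint = np.array([10.0, 25.0, 40.0, 60.0, 85.0])
manpower = np.array([6.0, 13.0, 19.0, 31.0, 42.0])

# Fit manpower = a * footprint + b by least squares.
a, b = np.polyfit(footprint, manpower, 1)
print(f"manpower ≈ {a:.2f} * footprint + {b:.2f}")
print("predicted for 50 m^2:", a * 50.0 + b)
```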

  8. Ultrasound machine comparison: an evaluation of ergonomic design, data management, ease of use, and image quality.

    Science.gov (United States)

    Wynd, Kimberly P; Smith, Hugh M; Jacob, Adam K; Torsher, Laurence C; Kopp, Sandra L; Hebl, James R

    2009-01-01

    The use of ultrasound technology for vascular access and regional anesthesia is gaining widespread acceptance among anesthesia providers. As a result, many group practices and medical institutions are considering purchasing ultrasound equipment. Currently, comparative information regarding the ergonomic design, physical and adjustable features, data management, ease of use, cost, and image quality of various ultrasound machines is not available. The primary goal of this investigation was to develop an objective process of evaluating ultrasound equipment before purchase. The process of evaluation used in the current investigation may be used when comparing a variety of medical technologies. A randomized, side-by-side comparative evaluation of 7 different ultrasound machine models was performed. Sixteen resident physicians without prior ultrasound experience (inexperienced providers) completed a formal evaluation of each machine model after performing a standardized machine configuration and performance checklist. Inexperienced providers and 10 faculty members experienced in ultrasound-guided regional anesthesia evaluated the image quality of 2 standardized images acquired from each machine model. Overall, evaluators rated questions on the machine evaluation form as "very good" or "outstanding" 70% or more of the time for all machine models. The largest, most complex ultrasound machine was rated as having the best image quality by both inexperienced and experienced providers. Ultrasound machine models with the simplest ergonomic design and user interface were rated highest by inexperienced study participants. Anesthesia providers considering an ultrasound equipment purchase should objectively evaluate machine models that have features most important to their own clinical practice. Ergonomic design, physical and adjustable features, data management, ease of use, image quality, and cost are important features to consider when evaluating an ultrasound machine.

  9. Evaluation of data quality of interRAI assessments in home and community care.

    Science.gov (United States)

    Hogeveen, Sophie E; Chen, Jonathan; Hirdes, John P

    2017-10-30

    The aim of this project is to describe the quality of assessment data regularly collected in home and community care, using techniques adapted from an evaluation of the quality of long-term care data in Canada. Data collected using the Resident Assessment Instrument - Home Care (RAI-HC) in Ontario and British Columbia (BC) as well as the interRAI Community Health Assessment (CHA) in Ontario were analyzed using descriptive statistics, Pearson's r correlation, and Cronbach's alpha in order to assess trends in population characteristics, convergent validity, and scale reliability. Results indicate that RAI-HC data from Ontario and BC behave in a consistent manner, with stable trends in internal consistency providing evidence of good reliability (alpha values range from 0.72-0.94, depending on the scale and province). The associations between various scales, such as those reflecting functional status and cognition, were found to be as expected and stable over time within each setting (r values range from 0.42-0.45 in Ontario and 0.41-0.43 in BC). These trends in convergent validity demonstrate that constructs in the data behave as they should, providing evidence of good data quality. In most cases, CHA data quality matches that of RAI-HC data quality and shows evidence of good validity and reliability. The findings are comparable to those observed in the evaluation of data from the long-term care sector. Despite an increasingly complex client population in the home and community care sectors, the results from this work indicate that data collected using the RAI-HC and the CHA are of an overall quality that may be trusted when used to inform decision-making at the organizational or policy level. High quality data and information are vital when used to inform steps taken to improve quality of care and enhance quality of life. This work also provides evidence that a method used to evaluate the quality of data obtained in the long-term care setting may be used to
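
    For reference, the scale-reliability statistic reported above, Cronbach's alpha, can be computed directly from an items-by-respondents score matrix. A minimal sketch with invented scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

scores = np.array([[3, 4, 3], [2, 2, 3], [4, 5, 4], [3, 3, 2], [5, 4, 5]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```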

  10. Quality evaluation of value sets from cancer study common data elements using the UMLS semantic groups.

    Science.gov (United States)

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2012-06-01

    The objective of this study is to develop an approach to evaluate the quality of terminological annotations on the value set (ie, enumerated value domain) components of the common data elements (CDEs) in the context of clinical research using both unified medical language system (UMLS) semantic types and groups. The CDEs of the National Cancer Institute (NCI) Cancer Data Standards Repository, the NCI Thesaurus (NCIt) concepts and the UMLS semantic network were integrated using a semantic web-based framework for a SPARQL-enabled evaluation. First, the set of CDE-permissible values with corresponding meanings in external controlled terminologies were isolated. The corresponding value meanings were then evaluated against their NCI- or UMLS-generated semantic network mapping to determine whether all of the meanings fell within the same semantic group. Of the enumerated CDEs in the Cancer Data Standards Repository, 3093 (26.2%) had elements drawn from more than one UMLS semantic group. A random sample (n=100) of this set of elements indicated that 17% of them were likely to have been misclassified. The use of existing semantic web tools can support a high-throughput mechanism for evaluating the quality of large CDE collections. This study demonstrates that the involvement of multiple semantic groups in an enumerated value domain of a CDE is an effective anchor to trigger an auditing point for quality evaluation activities. This approach produces a useful quality assurance mechanism for a clinical study CDE repository.
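
    The auditing trigger described, flagging a value set whose permissible-value meanings span more than one semantic group, can be sketched compactly. The mapping below is hypothetical; the study itself ran SPARQL queries over the integrated sources.

```python
def flag_mixed_value_sets(value_sets, semantic_group_of):
    """Return the CDE value sets whose permissible-value meanings span
    more than one UMLS semantic group (candidates for quality audit)."""
    flagged = {}
    for cde_id, meanings in value_sets.items():
        groups = {semantic_group_of[m] for m in meanings if m in semantic_group_of}
        if len(groups) > 1:
            flagged[cde_id] = groups
    return flagged

# Hypothetical data: one well-formed value set, one mixed one.
groups = {"Melanoma": "Disorders", "Carcinoma": "Disorders", "Biopsy": "Procedures"}
value_sets = {"CDE-1": ["Melanoma", "Carcinoma"], "CDE-2": ["Melanoma", "Biopsy"]}
print(flag_mixed_value_sets(value_sets, groups))  # flags CDE-2
```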

  11. An Intervention to Decrease the Occurrence of Invalid Data on Neuropsychological Evaluation.

    Science.gov (United States)

    Horner, Michael David; Turner, Travis H; VanKirk, Kathryn K; Denning, John H

    2017-03-01

    This study tested whether patients who were given a handout based on deterrence theory, immediately prior to evaluation, would provide invalid data less frequently than patients who were simply given an informational handout. All outpatients seen for clinical evaluation in a VA Neuropsychology Clinic were randomly given one of the two handouts immediately prior to evaluation. The "Intervention" handout emphasized the importance of trying one's hardest, explicitly listed consequences of valid and invalid responding and asked patients to sign and initial it. The "Control" handout provided general information about neuropsychological evaluation. Examiners were blinded to condition. Patients were excluded from analyses if they were diagnosed with major neurocognitive disorder or could not read the handout. Medical Symptom Validity Test (MSVT) was used to determine performance validity. Groups did not differ on age, education, or litigation status. For the entire sample (N = 251), there was no effect of handout on passing versus failing MSVT. However, among patients who were seeking disability benefits at the time of evaluation (n = 70), the Intervention handout was associated with lower frequency of failing MSVT than the Control handout. This brief, theory-based, cost-free intervention was associated with lower frequency of invalid data among patients seeking disability benefits at the time of clinical evaluation. We suggest methodological modifications that might produce a more potent intervention that could be effective with additional subsets of patients.

  12. Project DATA-TECH. 1990-91 Final Evaluation Profile. OREA Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.

    An evaluation was done of New York City Public Schools' Project DATA-TECH, which served limited English proficient high school students interested in computer-aided drafting (CAD) and cosmetology programs. The program served 190 students at Sara J. Hale High School in Brooklyn, of whom 89.5 percent were eligible for the Free Lunch Program and most…

  13. Use of Kinematic Viscosity Data for the Evaluation of the Molecular Weight of Petroleum Oils

    Science.gov (United States)

    Maroto, J. A.; Quesada-Perez, M.; Ortiz-Hernandez, A. J.

    2010-01-01

    A new laboratory procedure for the evaluation of the mean molecular weight (mean relative molecular mass) of petroleum oils with high accuracy is described. The density and dynamic viscosity of three commercial petroleum oils are measured at different temperatures. These experimental data are used to calculate the kinematic viscosity as a function…
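
    The core calculation mentioned, obtaining kinematic viscosity from measured density and dynamic viscosity, is simply ν = μ/ρ. A one-line sketch with illustrative values (the molecular-weight correlation itself is not reproduced here):

```python
def kinematic_viscosity(mu_pa_s, rho_kg_m3):
    """Kinematic viscosity nu = mu / rho (m^2/s); 1 cSt = 1e-6 m^2/s."""
    return mu_pa_s / rho_kg_m3

# Illustrative oil at 40 C: mu = 0.0288 Pa*s, rho = 860 kg/m^3.
nu = kinematic_viscosity(0.0288, 860.0)
print(f"nu = {nu:.3e} m^2/s = {nu * 1e6:.1f} cSt")
```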

  14. A Distributed Data Base System Concept for Defense Test and Evaluation.

    Science.gov (United States)

    1983-03-01

    paper is defined as the combined use of communications facilities and data processing equipment. Dominant among the many elements supporting this...objectives, and evaluation criteria related to the satisfaction of mission need shall be established before tests begin. o Successful planning for and

  15. Architecture and evaluation of software-defined optical switching matrix for hybrid data centers

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    2016-01-01

    A software architecture is proposed for hybrid packet/optical data centers employing programmable NETCONF-enabled optical switching matrix, and a performance evaluation is presented comparing hybrid and electrical-only architectures for elephant flows under different traffic patterns. Network thr...

  16. Teaching Effectiveness, Impression Management, and Dysfunctional Behavior: Student Evaluation of Teaching Control Data

    Science.gov (United States)

    Crumbley, D. Larry; Reichelt, Kenneth J.

    2009-01-01

    Purpose: Student evaluation of teaching (SET) questionnaires are used in many countries, although much current research questions the validity of these surveys. US research indicates that more than 90 percent of academic accounting departments use this performance measurement. This paper aims to focus on the validity of SET data.…

  17. Evaluation of the similarity of gene expression data estimated with SAGE and Affymetrix GeneChips

    NARCIS (Netherlands)

    van Ruissen, Fred; Ruijter, Jan M.; Schaaf, Gerben J.; Asgharnegad, Lida; Zwijnenburg, Danny A.; Kool, Marcel; Baas, Frank

    2005-01-01

    Background: Serial Analysis of Gene Expression (SAGE) and microarrays have found widespread application, but much ambiguity exists regarding the evaluation of these technologies. Cross-platform utilization of gene expression data from the SAGE and microarray technologies could reduce the need for

  18. Recent developments in geostatistical resource evaluation - Learning from production data for optimized extraction of mineral resources

    NARCIS (Netherlands)

    Benndorf, J.

    2015-01-01

    The resource model is the foundation for mineral project evaluation, mine planning and operational production control. The representativeness of such a model depends on both the quality of the data available and the modelling technique applied. This contribution reviews recent developments in

  19. Proceedings of the 1st conference on nuclear structure data evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Firestone, R.B.; Shirley, V.S.; Dairiki, J.M. (eds.)

    1982-04-01

    The 1st Conference on Nuclear Structure Data Evaluation was organized by the Isotopes Project of the Lawrence Berkeley Laboratory in order to encourage the open discussion of the scientific aspects of ENSDF production and usage. Summaries of the roundtable discussion sessions, abstracts of the presented papers, and additional contributed papers are contained in these Proceedings.

  20. Partners in a Common Cause: External Evaluators Team with Practitioners to Build Data Use Practices

    Science.gov (United States)

    Wilkerson, Stephanie B.; Johnson, Margie

    2017-01-01

    For many districts, evaluation is an afterthought to implementing a new initiative. Educators participate in professional learning experiences, apply what they learn to their practice, and then, at some point, school and district staff begin wondering if the initiative is making a difference. Then they scramble for nuggets of data that provide any…

  1. Creating the Data Basis for Environmental Evaluations with the Oil Point Method

    DEFF Research Database (Denmark)

    Bey, Niki; Lenau, Torben Anker

    1999-01-01

    In order to support designers in decision-making, some methods have been developed which are based on environmental indicators. These methods, however, can only be used if indicators for the specific product concept exist and are readily available. Based on this situation, the authors developed an approach in which, as with rules-of-thumb, the central idea is that missing indicators can be calculated or estimated by the designers themselves. After discussing energy-related environmental evaluation and arguing for its application in the evaluation of concepts, the paper focuses on the basic problem of missing data and describes the way in which the problem may be solved by making Oil Point evaluations. Sources of energy data are mentioned. Typical deficits to be aware of, such as the negligence of efficiency factors, are revealed and discussed. Comparative case studies which have shown encouraging results are mentioned.

  2. Use of experience data for seismic evaluations at Department of Energy facilities

    Energy Technology Data Exchange (ETDEWEB)

    Murray, R.C. [Lawrence Livermore National Lab., CA (United States); Kimball, J.K.; Guzy, D.J.; Hill, J.R. [USDOE, Washington, DC (United States)

    1994-12-07

    Seismic evaluations of essential systems and components at Department of Energy (DOE) facilities will be conducted over the next several years. For many of these systems and components, few, if any, seismic requirements applied to the original design, procurement, installation, and maintenance process. Thus the verification of the seismic adequacy of existing systems and components presents a difficult challenge. DOE has undertaken development of the criteria and procedures for these seismic evaluations that will maximize safety benefits in a timely and cost effective manner. As demonstrated in previous applications at DOE facilities and by the experience from the commercial nuclear power industry, use of experience data for these evaluations is the only viable option for most existing systems and components. This paper describes seismic experience data, the needs at DOE facilities, the precedent of application of nuclear power plants and DOE facilities, and the program underway for the seismic verification task ahead for DOE.

  3. Evaluation of active mortality surveillance system data for monitoring hurricane-related deaths-Texas, 2008.

    Science.gov (United States)

    Choudhary, Ekta; Zane, David F; Beasley, Crystal; Jones, Russell; Rey, Araceli; Noe, Rebecca S; Martin, Colleen; Wolkin, Amy F; Bayleyegn, Tesfaye M

    2012-08-01

    The Texas Department of State Health Services (DSHS) implemented an active mortality surveillance system to enumerate and characterize hurricane-related deaths during Hurricane Ike in 2008. This surveillance system used established guidelines and case definitions to categorize deaths as directly, indirectly, and possibly related to Hurricane Ike. The objective of this study was to evaluate Texas DSHS' active mortality surveillance system using US Centers for Disease Control and Prevention's (CDC) surveillance system evaluation guidelines. Using CDC's Updated Guidelines for Surveillance System Evaluation, the active mortality surveillance system of the Texas DSHS was evaluated. Data from the active mortality surveillance system were compared with Texas vital statistics data for the same time period to estimate the completeness of reported disaster-related deaths. From September 8 through October 13, 2008, medical examiners (MEs) and Justices of the Peace (JPs) in 44 affected counties reported deaths daily by using a one-page, standardized mortality form. The active mortality surveillance system identified 74 hurricane-related deaths, whereas a review of vital statistics data revealed only four deaths that were hurricane-related. The average time of reporting a death by active mortality surveillance and vital statistics was 14 days and 16 days, respectively. Texas's active mortality surveillance system successfully identified hurricane-related deaths. Evaluation of the active mortality surveillance system suggested that it is necessary to collect detailed and representative mortality data during a hurricane because vital statistics do not capture sufficient information to identify whether deaths are hurricane-related. The results from this evaluation will help improve active mortality surveillance during hurricanes which, in turn, will enhance preparedness and response plans and identify public health interventions to reduce future hurricane-related mortality rates.

  4. Evaluation of Active Mortality Surveillance System Data for Monitoring Hurricane-Related Deaths—Texas, 2008

    Science.gov (United States)

    Choudhary, Ekta; Zane, David F.; Beasley, Crystal; Jones, Russell; Rey, Araceli; Noe, Rebecca S.; Martin, Colleen; Wolkin, Amy F.; Bayleyegn, Tesfaye M.

    2015-01-01

    Introduction The Texas Department of State Health Services (DSHS) implemented an active mortality surveillance system to enumerate and characterize hurricane-related deaths during Hurricane Ike in 2008. This surveillance system used established guidelines and case definitions to categorize deaths as directly, indirectly, and possibly related to Hurricane Ike. Objective The objective of this study was to evaluate Texas DSHS’ active mortality surveillance system using US Centers for Disease Control and Prevention’s (CDC) surveillance system evaluation guidelines. Methods Using CDC’s Updated Guidelines for Surveillance System Evaluation, the active mortality surveillance system of the Texas DSHS was evaluated. Data from the active mortality surveillance system were compared with Texas vital statistics data for the same time period to estimate the completeness of reported disaster-related deaths. Results From September 8 through October 13, 2008, medical examiners (MEs) and Justices of the Peace (JPs) in 44 affected counties reported deaths daily by using a one-page, standardized mortality form. The active mortality surveillance system identified 74 hurricane-related deaths, whereas a review of vital statistics data revealed only four deaths that were hurricane-related. The average time of reporting a death by active mortality surveillance and vital statistics was 14 days and 16 days, respectively. Conclusions Texas’s active mortality surveillance system successfully identified hurricane-related deaths. Evaluation of the active mortality surveillance system suggested that it is necessary to collect detailed and representative mortality data during a hurricane because vital statistics do not capture sufficient information to identify whether deaths are hurricane-related. The results from this evaluation will help improve active mortality surveillance during hurricanes which, in turn, will enhance preparedness and response plans and identify public health

  5. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    Energy Technology Data Exchange (ETDEWEB)

    Shreekanth Mandayam; Robi Polikar; John C. Chen

    2004-04-01

    The objectives of this research project are: (1) To design sensor data fusion algorithms that can synergistically combine defect related information from heterogeneous sensors used in gas pipeline inspection for reliably and accurately predicting the condition of the pipe-wall. (2) To develop efficient data management techniques for signals obtained during multisensor interrogation of a gas pipeline. During this reporting period, Rowan University designed, developed and exercised multisensor data fusion algorithms for identifying defect related information present in magnetic flux leakage, ultrasonic testing and thermal imaging nondestructive evaluation signatures of a test-specimen suite representative of benign and anomalous indications in gas transmission pipelines.
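
    The abstract does not specify the fusion algorithms. One common pattern for fusing heterogeneous NDE signatures is feature-level fusion, where features extracted from each sensor are normalized, concatenated, and fed to a single classifier. A sketch under that assumption, using scikit-learn and invented feature arrays:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import StandardScaler

# Hypothetical per-specimen feature vectors from three NDE modalities.
rng = np.random.default_rng(0)
mfl = rng.normal(size=(40, 5))        # magnetic flux leakage features
ut = rng.normal(size=(40, 4))         # ultrasonic testing features
thermal = rng.normal(size=(40, 3))    # thermal imaging features
labels = rng.integers(0, 2, size=40)  # 0 = benign, 1 = anomalous

# Feature-level fusion: normalize each modality, then concatenate.
fused = np.hstack([StandardScaler().fit_transform(m) for m in (mfl, ut, thermal)])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(fused, labels)
print("training accuracy:", clf.score(fused, labels))
```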

  6. PyCorrFit-generic data evaluation for fluorescence correlation spectroscopy.

    Science.gov (United States)

    Müller, Paul; Schwille, Petra; Weidemann, Thomas

    2014-09-01

    We present a graphical user interface (PyCorrFit) for the fitting of theoretical model functions to experimental data obtained by fluorescence correlation spectroscopy (FCS). The program supports many data file formats and features a set of tools specialized in FCS data evaluation. The Python source code is freely available for download from the PyCorrFit web page at http://pycorrfit.craban.de. We offer binaries for Ubuntu Linux, Mac OS X and Microsoft Windows.
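
    As an example of the kind of model fitting PyCorrFit performs, the standard confocal 3D free-diffusion autocorrelation model can be fit with scipy; the synthetic data below are for illustration only, not PyCorrFit's internals.

```python
import numpy as np
from scipy.optimize import curve_fit

def fcs_3d(tau, N, tau_d, S):
    """Standard confocal 3D free-diffusion FCS model:
    G(tau) = (1/N) * 1/(1 + tau/tau_d) * 1/sqrt(1 + tau/(S^2 tau_d))"""
    return (1.0 / N) / (1.0 + tau / tau_d) / np.sqrt(1.0 + tau / (S**2 * tau_d))

tau = np.logspace(-6, 0, 60)                      # lag times in seconds
g = fcs_3d(tau, N=2.0, tau_d=1e-4, S=5.0)
g_noisy = g + np.random.default_rng(1).normal(0, 0.002, tau.size)

popt, pcov = curve_fit(fcs_3d, tau, g_noisy, p0=[1.0, 1e-3, 5.0])
print("N, tau_d, S =", popt)
```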

  7. Perspective, Method, and Data Triangulation in the Evaluation of a Local Educational Landscape

    Directory of Open Access Journals (Sweden)

    Kristina Ackel-Eisnach

    2012-07-01

    Full Text Available Communal engagement and co-operation in the field of education have become increasingly important in recent years. The goal is to expand local educational landscapes by improving the quality of educational programs. To this end, responsibilities, networks and newly designed educational programs have to be developed. To monitor the efficiency of these arrangements and their budgets, many councils commission experts to evaluate the new implementations. The following article describes the evaluation for the city of Ulm, whose local educational landscape program is known as the "Bildungsoffensive Ulm." The evaluation employs methods such as triangulation of perspectives, a mixed-method design and data triangulation. In the following, the local situation is presented and the methods applied in the evaluation are justified methodologically. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120354

  8. Ohio's Abandoned Mine Lands Reclamation Program: a Study of Data Collection and Evaluation Techniques

    Science.gov (United States)

    Sperry, S. L.

    1982-01-01

    The planning process for a statewide reclamation plan of Ohio abandoned minelands, in response to the Federal Surface Mining Control and Reclamation Act of 1977, included: (1) the development of a screening and ranking methodology; (2) the establishment of a statewide review of major watersheds affected by mining; (3) the development of an immediate action process; and (4) a prototypical study of a priority watershed demonstrating the data collection, analysis, display and evaluation to be used for the remaining state watersheds. Historical methods for map information analysis and evaluation, as well as the methodologies currently in use, are discussed. Various computer mapping and analysis programs were examined for their usability in evaluating the priority reclamation sites. Hand methods were chosen over automated procedures; intuitive evaluation was the primary reason.

  9. A proposed integrated data collection, analysis and sharing platform for impact evaluation

    Directory of Open Access Journals (Sweden)

    Andreas Kipf

    2016-06-01

    Full Text Available Global poverty reduction efforts value monitoring and evaluation, but often struggle to translate lessons learned from one intervention into practical application in another. Commonly, data are not easily or often shared between interventions, and summary data collected as part of an impact evaluation are often not available until after the intervention is complete. Equally limiting, the workflows that lead to research results are rarely published in a reproducible, reusable, and easy-to-understand fashion. Information and communication technologies widely used in commercial and government programs are growing in relevance for global development professionals and offer the potential for better data and workflow sharing. However, the technical and custom nature of many data management systems limits their accessibility to non-ICT professionals. The authors propose an end-to-end data collection, management, and dissemination platform designed for use by global development program managers and researchers. The system leverages smartphones, cellular-based sensors, and cloud storage and computing to lower the entry barrier to impact evaluation.

  10. [Evaluation of population data quality and coverage of registration of deaths for the Brazilian regions].

    Science.gov (United States)

    Paes, N A; Albuquerque, M E

    1999-02-01

    This study evaluates the quality of population data and the coverage of death registration for all Brazilian federal units, by sex, in 1990. The population data came from censuses, and the recorded death data from the "Fundação Instituto Brasileiro de Geografia e Estatística" and the Health Ministry. The population data were evaluated by applying classical demographic methods. Three techniques were chosen to evaluate the extent of death registration coverage. The degree of precision of the age statement for the majority of the Brazilian regions improved from "low precision" or "moderate" to "precise" during the 1980s. The coverage of deaths in 1990 was classified as "good" or "satisfactory" for all federal units in the South, Southeast and Centre-West and for the northeastern states below Rio Grande do Norte. All the remaining states were classified as "regular" or "unsatisfactory". There was a significant improvement in the quality of the census population data and an increase in the coverage of death registration. It is possible to obtain reliable mortality indicators for many Brazilian states.

  11. Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System

    Science.gov (United States)

    Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana

    2011-01-01

    The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing well into the 5 terabyte/day and tens of thousands of job range, both of which comprise a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.

  12. Evaluation of Tropospheric and Ionospheric Effects on the Geographic Localization of Data Collection Platforms

    Directory of Open Access Journals (Sweden)

    C. C. Celestino

    2007-01-01

    Full Text Available The Brazilian National Institute for Space Research (INPE) is operating the Brazilian Environmental Data Collection System, which currently amounts to a user community of around 100 organizations and more than 700 data collection platforms installed in Brazil. This system uses the SCD-1, SCD-2, and CBERS-2 low Earth orbit satellites to accomplish the data collection services. The main system applications are hydrology, meteorology, oceanography, water quality, and others. One of the functionalities offered by this system is the geographic localization of the data collection platforms, using Doppler shifts and a batch estimator based on the least-squares technique. There is a growing demand to improve the quality of the geographical location of data collection platforms for animal tracking. This work presents an evaluation of the ionospheric and tropospheric effects on the Brazilian Environmental Data Collection System transmitter geographic location. Some models of the ionosphere and troposphere are presented to simulate their impacts and to evaluate the performance of the platform location algorithm. The results of the Doppler shift measurements, using the SCD-2 satellite and the data collection platform (DCP) located in Cuiabá town, are presented and discussed.
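
    The localization principle is that each received frequency constrains the platform position through the Doppler equation. A minimal sketch of the forward model used in such least-squares estimation, simplified to a stationary platform and ignoring the relativistic and propagation effects the paper studies; the frequency and geometry are illustrative.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def predicted_doppler(f0, sat_pos, sat_vel, platform_pos):
    """Received frequency under the classical Doppler model,
    f = f0 * (1 - range_rate / c), for a stationary platform.
    Residuals against measured frequencies drive the least-squares fit."""
    r = np.asarray(platform_pos, float) - np.asarray(sat_pos, float)
    r_hat = r / np.linalg.norm(r)
    range_rate = -np.dot(np.asarray(sat_vel, float), r_hat)  # >0 when receding
    return f0 * (1.0 - range_rate / C)

# Illustrative UHF uplink: satellite at ~750 km altitude passing nearby.
f = predicted_doppler(401.65e6,
                      sat_pos=[7128e3, 0.0, 0.0],
                      sat_vel=[0.0, 7.5e3, 0.0],
                      platform_pos=[6378e3, 200e3, 0.0])
print(f"predicted received frequency: {f:.1f} Hz")
```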

  13. A new technique for evaluating mesospheric momentum balance utilizing radars and satellite data

    Directory of Open Access Journals (Sweden)

    D. J. Frame

    2000-04-01

    Full Text Available A new method for evaluating momentum balance in the mesosphere using radar and satellite data is presented. This method is applied to radar wind data from two medium-frequency installations (near Adelaide, Australia, and Christchurch, New Zealand) and satellite temperature data from the Improved Stratospheric and Mesospheric Sounder (ISAMS). Because of limitations in data availability and vertical extent, the technique can only be applied to evaluate the momentum balance at 80 km above the radar sites for May 1992. The technique allows the calculation of the residual terms in the momentum balance which are usually attributed to the effects of breaking gravity waves. Although the results are inconclusive above Adelaide, this method produces values of zonal and meridional residual accelerations above Christchurch which are consistent with expectation. In both locations it is apparent that geostrophic balance is a poor approximation of reality. (This result is not dependent on a mismatch between the radar- and satellite-derived winds, but rather is inherent in the satellite data alone.) Despite significant caveats about data quality, the technique appears robust and could be of use with data from future instruments. Key words: Meteorology and atmospheric dynamics (middle atmosphere dynamics; waves and tides; instruments and techniques)

  14. Integrating computational fluid dynamics (CFD) models with GIS: an evaluation on data conversion formats

    Science.gov (United States)

    Wong, David W.; Camelli, Fernando; Sonwalkar, Mukul

    2007-06-01

    Computational fluid dynamics (CFD) models are powerful computational tools for simulating urban-landscape-scale atmospheric dispersion events, and they have proven very useful for security management and emergency response. Essential inputs to CFD models include landscape characteristics, which are often captured by various GIS data layers. While it is logical to couple GIS and CFD models to take advantage of available GIS data and the visualization and cartographic rendering capabilities of GIS, the integration of the two tools has been minimal. In this paper, we take a first step toward evaluating the use of GIS data in CFD modeling. Specifically, we explore how efficient it is to use GIS data in CFD models and how sensitive the CFD results are to different GIS data formats. Using campus topography and building data and the FEFLO-URBAN CFD model, we performed atmospheric release simulations using topographic data in contour and raster formats. We found that using the raster format was quite efficient, whereas the contour data required significant effort. Though the simulation outputs from the two data formats were not identical, their overall outcomes were similar and did not pose alarming discrepancies. We conclude that using GIS data has tremendous potential for CFD modeling.

  15. Update and evaluation of decay data for spent nuclear fuel analyses

    Science.gov (United States)

    Simeonov, Teodosi; Wemple, Charles

    2017-09-01

    Studsvik's approach to spent nuclear fuel analyses combines isotopic concentrations and multi-group cross-sections, calculated by the CASMO5 or HELIOS2 lattice transport codes, with core irradiation history data from the SIMULATE5 reactor core simulator and tabulated isotopic decay data. These data sources are used and processed by the code SNF to predict spent nuclear fuel characteristics. Recent advances in the generation procedure for the SNF decay data are presented. The SNF decay data includes basic data, such as decay constants, atomic masses and nuclide transmutation chains; radiation emission spectra for photons from radioactive decay, alpha-n reactions, bremsstrahlung, and spontaneous fission, electrons and alpha particles from radioactive decay, and neutrons from radioactive decay, spontaneous fission, and alpha-n reactions; decay heat production; and electro-atomic interaction data for bremsstrahlung production. These data are compiled from fundamental (ENDF, ENSDF, TENDL) and processed (ESTAR) sources for nearly 3700 nuclides. A rigorous evaluation procedure of internal consistency checks and comparisons to measurements and benchmarks, and code-to-code verifications is performed at the individual isotope level and using integral characteristics on a fuel assembly level (e.g., decay heat, radioactivity, neutron and gamma sources). Significant challenges are presented by the scope and complexity of the data processing, a dearth of relevant detailed measurements, and reliance on theoretical models for some data.

  16. Update and evaluation of decay data for spent nuclear fuel analyses

    Directory of Open Access Journals (Sweden)

    Simeonov Teodosi

    2017-01-01

    Full Text Available Studsvik's approach to spent nuclear fuel analyses combines isotopic concentrations and multi-group cross-sections, calculated by the CASMO5 or HELIOS2 lattice transport codes, with core irradiation history data from the SIMULATE5 reactor core simulator and tabulated isotopic decay data. These data sources are used and processed by the code SNF to predict spent nuclear fuel characteristics. Recent advances in the generation procedure for the SNF decay data are presented. The SNF decay data includes basic data, such as decay constants, atomic masses and nuclide transmutation chains; radiation emission spectra for photons from radioactive decay, alpha-n reactions, bremsstrahlung, and spontaneous fission, electrons and alpha particles from radioactive decay, and neutrons from radioactive decay, spontaneous fission, and alpha-n reactions; decay heat production; and electro-atomic interaction data for bremsstrahlung production. These data are compiled from fundamental (ENDF, ENSDF, TENDL) and processed (ESTAR) sources for nearly 3700 nuclides. A rigorous evaluation procedure of internal consistency checks and comparisons to measurements and benchmarks, and code-to-code verifications is performed at the individual isotope level and using integral characteristics on a fuel assembly level (e.g., decay heat, radioactivity, neutron and gamma sources). Significant challenges are presented by the scope and complexity of the data processing, a dearth of relevant detailed measurements, and reliance on theoretical models for some data.
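
    At its core, the decay-heat part of such data reduces to a summation over nuclides. A minimal sketch of that calculation; the nuclide inventories, half-lives and Q-values below are invented.

```python
import numpy as np

def decay_heat(atoms, half_lives_s, q_mev):
    """Summation decay heat P = sum_i lambda_i * N_i * Q_i, in watts.
    atoms: nuclide inventories N_i; q_mev: recoverable energy per decay."""
    lam = np.log(2.0) / np.asarray(half_lives_s, float)  # decay constants
    mev_to_j = 1.602176634e-13
    power = np.sum(lam * np.asarray(atoms, float) * np.asarray(q_mev, float))
    return float(power * mev_to_j)

# Invented three-nuclide inventory.
print(decay_heat(atoms=[1e20, 5e19, 2e18],
                 half_lives_s=[2.6e6, 9.5e5, 3.0e4],
                 q_mev=[0.6, 1.2, 2.8]))
```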

  17. A Framework for Evaluation of Marine Spatial Data Geoportals Using Case Studies

    Directory of Open Access Journals (Sweden)

    Tavra Marina

    2014-12-01

    Full Text Available The need for a Marine Spatial Data Infrastructure (MSDI) as a component of a National Spatial Data Infrastructure (NSDI) is widely recognized. An MSDI is relevant not only for hydrographers and government planners, but also for many other sectors which take an interest in marine spatial data, whether they are data users, data providers, or data managers [9]. An MSDI encompasses marine and coastal geographic and business information. For efficient use of marine spatial data, it is necessary to ensure its valid and accessible distribution. A geoportal is a specialized web portal for sharing spatial information at different levels over the Internet. This paper re-examines the implementation of an MSDI and what it means for data custodians and end users. Several geoportals are reviewed (German and Australian) to determine their web services functionality, capabilities and the scope to which they support the sharing and reuse of marine spatial data, in order to assist the development of the Croatian MSDI geoportal. This framework provides a context for better understanding the information base of spatial data standards, and a tool for evaluating MSDI dissemination through a geoportal.

  18. Evaluating Precipitation from Orbital Data Products of TRMM and GPM over the Indian Subcontinent

    Science.gov (United States)

    Jayaluxmi, I.; Kumar, D. N.

    2015-12-01

    The rapidly growing records of microwave-based precipitation data made available from various earth observation satellites have created a pressing need to evaluate the associated uncertainty, which arises from different sources such as retrieval error, spatial/temporal sampling error and sensor-dependent error. In microwave remote sensing, most studies in the literature focus on gridded data products; fewer studies exist on evaluating the uncertainty inherent in orbital data products. Evaluation of the latter is essential, as they potentially cause large uncertainties during real-time flood forecasting studies, especially at the watershed scale. The present study evaluates the uncertainty of precipitation data derived from the orbital data products of the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the 2A12, 2A25 and 2B31 products. Case study results over the flood-prone basin of the Mahanadi, India, are analyzed for precipitation uncertainty through four facets, viz.: a) uncertainty quantification using the volumetric metrics from the contingency table [Aghakouchak and Mehran 2014]; b) error characterization using additive and multiplicative error models; c) error decomposition to identify systematic and random errors; and d) comparative assessment with the orbital data from the GPM mission. The homoscedastic random errors from the multiplicative error models justify a better representation of precipitation estimates by the 2A12 algorithm. It can be concluded that although the radiometer-derived 2A12 precipitation data are known to suffer from many sources of uncertainty, spatial analysis over the case study region of India testifies that they are in excellent agreement with the reference estimates for the data period considered [Indu and Kumar 2015]. References: A. AghaKouchak and A. Mehran (2014), Extended contingency table: Performance metrics for satellite observations and climate model simulations, Water Resources Research, vol. 49.
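
    The multiplicative error model and the systematic/random decomposition mentioned above can be illustrated with a short sketch. This is a generic recipe on synthetic rain rates, not the study's code: the model sat = a*ref^b*eps is fitted in log space, and the total error splits into a regression (systematic) part and a residual (random) part.

    ```python
    # Hedged sketch of a multiplicative error model with a Willmott-style
    # systematic/random decomposition; all data below are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(0)
    ref = rng.gamma(2.0, 5.0, 500)                        # "reference" rain rates
    sat = 1.2 * ref**0.9 * rng.lognormal(0.0, 0.3, 500)   # synthetic satellite rates

    x, y = np.log(ref), np.log(sat)                       # multiplicative -> linear in logs
    b, log_a = np.polyfit(x, y, 1)                        # fitted systematic relation
    y_hat = log_a + b * x

    mse_total = np.mean((y - x) ** 2)
    mse_systematic = np.mean((y_hat - x) ** 2)            # removable by recalibration
    mse_random = np.mean((y - y_hat) ** 2)                # irreducible scatter
    print(f"a={np.exp(log_a):.2f}, b={b:.2f}, systematic share="
          f"{mse_systematic / mse_total:.0%}, random share={mse_random / mse_total:.0%}")
    ```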

  19. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    Science.gov (United States)

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but a lack of uniformity among countries and among the different jurisdictions in the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software as well as on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash.

  20. The Evaluation on Data Mining Methods of Horizontal Bar Training Based on BP Neural Network

    Directory of Open Access Journals (Sweden)

    Zhang Yanhui

    2015-01-01

    Full Text Available With the rapid development of science and technology, data analysis has become an indispensable part of people’s work and life. Horizontal bar training involves multiple categories, and reducing the number of categories used in training and competition is an emphasis of research in this field. The application of data mining methods is discussed for the problem of reducing the categories of horizontal bar training. The BP neural network is applied to cluster analysis and principal component analysis, which are used to evaluate horizontal bar training. The two data mining methods are analyzed from two aspects, namely the operational convenience of data mining and the rationality of the results. It turns out that principal component analysis is more suitable for the data processing of horizontal bar training.
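
    As a minimal sketch of the two routes being compared, the snippet below runs cluster analysis and principal component analysis on a synthetic score matrix; the paper's horizontal-bar data set and its BP-network coupling are not reproduced here.

    ```python
    # Hedged sketch: cluster analysis vs. PCA on a synthetic training-score matrix.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    scores = rng.normal(size=(60, 8))                 # 60 athletes x 8 score features
    X = StandardScaler().fit_transform(scores)

    clusters = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)

    pca = PCA(n_components=2).fit(X)
    reduced = pca.transform(X)                        # 2 principal components per athlete
    print(clusters[:10], pca.explained_variance_ratio_)
    ```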

  1. Preventing Data Ambiguity in Infectious Diseases with Four-Dimensional and Personalized Evaluations.

    Directory of Open Access Journals (Sweden)

    Michelle J Iandiorio

    Full Text Available Diagnostic errors can occur, in infectious diseases, when anti-microbial immune responses involve several temporal scales. When responses span from nanosecond to week and larger temporal scales, any pre-selected temporal scale is likely to miss some (faster or slower responses. Hoping to prevent diagnostic errors, a pilot study was conducted to evaluate a four-dimensional (4D method that captures the complexity and dynamics of infectious diseases.Leukocyte-microbial-temporal data were explored in canine and human (bacterial and/or viral infections, with: (i a non-structured approach, which measures leukocytes or microbes in isolation; and (ii a structured method that assesses numerous combinations of interacting variables. Four alternatives of the structured method were tested: (i a noise-reduction oriented version, which generates a single (one data point-wide line of observations; (ii a version that measures complex, three-dimensional (3D data interactions; (iii a non-numerical version that displays temporal data directionality (arrows that connect pairs of consecutive observations; and (iv a full 4D (single line-, complexity-, directionality-based version.In all studies, the non-structured approach revealed non-interpretable (ambiguous data: observations numerically similar expressed different biological conditions, such as recovery and lack of recovery from infections. Ambiguity was also found when the data were structured as single lines. In contrast, two or more data subsets were distinguished and ambiguity was avoided when the data were structured as complex, 3D, single lines and, in addition, temporal data directionality was determined. The 4D method detected, even within one day, changes in immune profiles that occurred after antibiotics were prescribed.Infectious disease data may be ambiguous. Four-dimensional methods may prevent ambiguity, providing earlier, in vivo, dynamic, complex, and personalized information that facilitates both

  2. Research and primary evaluation of an automatic fusion method for multisource tooth crown data.

    Science.gov (United States)

    Dai, Ning; Li, Dawei; Yang, Xu; Cheng, Cheng; Sun, Yuchun

    2017-11-01

    With the development of 3-dimensional (3D) scanning technologies in dentistry, high-accuracy optical scanning data of the crown and cone beam computed tomography data of the root can be acquired easily. In many dental fields, especially in digital orthodontics, it is useful to fuse the data from the crown and the root. However, the manual fusion method is complex and difficult. A novel automatic fusion method for the 2-source data from the crown and the root was developed, and its accuracy was evaluated in this study. An occlusal splint with several alumina ceramic spheres was fabricated using heat-curing resin. A multipoint (center of each sphere) alignment method was performed to achieve rapid registration of the crown data from optical scanning and the root data from cone beam computed tomography. A segmentation algorithm based on heuristic search was adopted to extract and segment the crown from the whole optical scanning data set. The level set algorithm and the marching cubes (MC) algorithm were used to reconstruct the digital imaging and communications in medicine data into a 3D model. A novel multisource data fusion algorithm based on iterative Laplacian deformation (ILD) was developed and applied to achieve automatic fusion. Finally, the 3D errors of the method were evaluated. The 3 groups of typical tooth data were automatically fused within 2 seconds. The mean standard deviation was less than 0.02 mm. The novel method can aid the construction of a high-quality 3D model of complete teeth to enable orthodontists to safely, reliably, and visually plan tooth alignment programs. Copyright © 2017 John Wiley & Sons, Ltd.
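
    The multipoint (sphere-center) alignment step amounts to rigid point-set registration with known correspondences, which has a closed-form SVD-based (Kabsch) solution. The sketch below is not the authors' implementation; the sphere centers are invented to show the mechanics.

    ```python
    # Hedged sketch: closed-form rigid registration from matched sphere centers.
    import numpy as np

    def rigid_align(src, dst):
        """Least-squares R, t with dst ~ R @ src + t (Kabsch method)."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
        R = Vt.T @ D @ U.T
        return R, dst_c - R @ src_c

    optical = np.array([[0, 0, 0], [30, 0, 0], [0, 25, 0], [10, 10, 15]], float)
    theta = np.deg2rad(20.0)                          # simulate the CBCT frame
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    cbct = optical @ R_true.T + np.array([5.0, -3.0, 12.0])

    R, t = rigid_align(optical, cbct)
    print(np.max(np.abs(optical @ R.T + t - cbct)))   # ~0: frames registered
    ```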

  3. Decay Data Evaluation Project (DDEP): evaluation of the {sup 237}U,{sup 236}Np, {sup 236m}Np and {sup 241}Pu decay characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Checheva, V.P.; Kuzmenko, N.K. [V.G. Khlopin Radium Institute, Saint Petersburg (Russian Federation)

    2008-07-01

    The results of decay data evaluations are presented for {sup 237}U, {sup 236}Np, {sup 236m}Np and {sup 241}Pu. These evaluated data have been obtained within the Decay Data Evaluation Project and the IAEA CRP 'Updated Decay Data Library for Actinides' using information published up to 2007. The following decay characteristics have been evaluated: half-life, decay energy, energies and probabilities of alpha, beta and electron-capture transitions, energies and transition probabilities of gamma transitions, internal conversion coefficients, and energies and absolute emission probabilities of gamma rays, X-rays and electron emissions.

  4. The evaluation of experimental data in the fast range for n+56Fe(n,inl)

    Directory of Open Access Journals (Sweden)

    Qian Jing

    2017-01-01

    Full Text Available Iron is one of the five materials selected for evaluation within the pilot international evaluation project CIELO. Analysis of experimental data for the n+56Fe reaction is the basis for constraining theoretical calculations and the eventual creation of the evaluated file. A detailed analysis was performed for inelastic cross sections of neutron-induced reactions with 56Fe in the fast range up to 20 MeV, where there are significant differences among the main evaluated libraries, caused mainly by the different inelastic scattering cross section measurements. Gamma-ray production cross sections provide a way to gain experimental information about the inelastic cross section. Large discrepancies between experimental data for the 847-keV gamma ray produced in the 56Fe(n,n1'γ) reaction were analyzed. In addition, experimental data for the elastic scattering cross section between 9.41 and 11 MeV were used to deduce the inelastic cross section from the unitarity constraint.

  5. Data Processing and Quality Evaluation of a Boat-Based Mobile Laser Scanning System

    Science.gov (United States)

    Vaaja, Matti; Kukko, Antero; Kaartinen, Harri; Kurkela, Matti; Kasvi, Elina; Flener, Claude; Hyyppä, Hannu; Hyyppä, Juha; Järvelä, Juha; Alho, Petteri

    2013-01-01

    Mobile mapping systems (MMSs) are used for mapping topographic and urban features which are difficult and time-consuming to measure with other instruments. The benefits of MMSs include efficient data collection and versatile usability. This paper investigates the data processing steps and the quality of boat-based mobile mapping system (BoMMS) data for generating terrain and vegetation points in a river environment. Our aim in data processing was to filter noise points, detect shorelines as well as points below the water surface, and conduct ground point classification. Previous studies of BoMMS have investigated elevation accuracies and usability in the detection of fluvial erosion and deposition areas. The new findings concerning BoMMS data are that the improved data processing approach allows for identification of multipath reflections and shoreline delineation. We demonstrate the possibility of measuring bathymetry data in shallow (0–1 m) and clear water. Furthermore, we evaluate for the first time the accuracy of the BoMMS ground point classification compared to manually classified data. We also demonstrate the spatial variations of the ground point density and assess elevation and vertical accuracies of the BoMMS data. PMID:24048340

  6. Evaluating uncertainty to strengthen epidemiologic data for use in human health risk assessments.

    Science.gov (United States)

    Burns, Carol J; Wright, J Michael; Pierson, Jennifer B; Bateson, Thomas F; Burstyn, Igor; Goldstein, Daniel A; Klaunig, James E; Luben, Thomas J; Mihlan, Gary; Ritter, Leonard; Schnatter, A Robert; Symons, J Morel; Yi, Kun Don

    2014-11-01

    There is a recognized need to improve the application of epidemiologic data in human health risk assessment especially for understanding and characterizing risks from environmental and occupational exposures. Although there is uncertainty associated with the results of most epidemiologic studies, techniques exist to characterize uncertainty that can be applied to improve weight-of-evidence evaluations and risk characterization efforts. This report derives from a Health and Environmental Sciences Institute (HESI) workshop held in Research Triangle Park, North Carolina, to discuss the utility of using epidemiologic data in risk assessments, including the use of advanced analytic methods to address sources of uncertainty. Epidemiologists, toxicologists, and risk assessors from academia, government, and industry convened to discuss uncertainty, exposure assessment, and application of analytic methods to address these challenges. Several recommendations emerged to help improve the utility of epidemiologic data in risk assessment. For example, improved characterization of uncertainty is needed to allow risk assessors to quantitatively assess potential sources of bias. Data are needed to facilitate this quantitative analysis, and interdisciplinary approaches will help ensure that sufficient information is collected for a thorough uncertainty evaluation. Advanced analytic methods and tools such as directed acyclic graphs (DAGs) and Bayesian statistical techniques can provide important insights and support interpretation of epidemiologic data. The discussions and recommendations from this workshop demonstrate that there are practical steps that the scientific community can adopt to strengthen epidemiologic data for decision making.

  7. Evaluation and Use of Registry Data in a GIS Analysis of Diabetes

    Directory of Open Access Journals (Sweden)

    Mungrue Kameel

    2015-07-01

    Full Text Available Objectives: to evaluate registry data routinely collected by the Chronic Disease Electronic Management System (CDEMS) in the monitoring of type 2 diabetes mellitus (T2DM) in the Eastern half of the island, and to use the data to describe the spatial epidemiological patterns of T2DM. Design and Method: The starting point was access and retrieval of all existing data in the diabetes registry. These data were subsequently validated using handwritten medical records. Several clinical indicators were selected to evaluate the registry. The address of each patient was extracted and georeferenced using ArcGIS 10.0, and several maps were created. Results: The registry had data for thirteen (13) of the sixteen (16) health facilities. We found that less than 15 percent of all patients actually had diabetic indicator tests done according to World Health Organization (WHO) standards. The overall prevalence of T2DM was 20.8 per 1000 population. The highest prevalence of diabetes occurred at the northeastern tip of the island. In addition, 57.58% of patients with T2DM resided inland and 40.75% resided in coastal areas. Conclusions: In conclusion, we provide evidence that the data collected by the diabetes registry, although lacking in many areas, were adequate for spatial epidemiological analysis.

  8. Examination of various roles for covariance matrices in the development, evaluation, and application of nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-01-01

    The last decade has been a period of rapid development in the implementation of covariance-matrix methodology in nuclear data research. This paper offers some perspective on the progress which has been made, on some of the unresolved problems, and on the potential yet to be realized. These discussions address a variety of issues related to the development of nuclear data. Topics examined are: the importance of designing and conducting experiments so that error information is conveniently generated; the procedures for identifying error sources and quantifying their magnitudes and correlations; the combination of errors; the importance of consistent and well-characterized measurement standards; the role of covariances in data parameterization (fitting); the estimation of covariances for values calculated from mathematical models; the identification of abnormalities in covariance matrices and the analysis of their consequences; the problems encountered in representing covariance information in evaluated files; the role of covariances in the weighting of diverse data sets; the comparison of various evaluations; the influence of primary-data covariance in the analysis of covariances for derived quantities (sensitivity); and the role of covariances in the merging of the diverse nuclear data information. 226 refs., 2 tabs.

  9. MR 201104: Evaluation of Discrimination Technologies and Classification Results and MR 201157: Demonstration of MetalMapper Static Data Acquisition and Data Analysis

    Science.gov (United States)

    2016-09-23

    This record covers ESTCP projects MR-201104 (Evaluation of Discrimination Technologies and Classification Results) and MR-201157 (Demonstration of MetalMapper Static Data Acquisition and Data Analysis).

  10. Evaluation of the Radiometric Integrity of LANDSAT 4 Thematic Mapper Band 6 Data

    Science.gov (United States)

    Schott, J. R.

    1985-01-01

    Probably the most generally accepted method for processing radiometric data from space is to correct the observed radiance or apparent temperature to a surface radiance or temperature value using atmospheric propagation models. As part of NASA's Heat Capacity Mapping Mission (HCMM) experiment, the atmospheric propagation models were used in reverse in an attempt to evaluate the post-launch radiometric response of the radiometer. Techniques successfully used to radiometrically calibrate the HCMM sensor were extended. The HCMM experiment is described and used as a base for the evaluation of the TM band 6 (infrared) sensor.

  11. Evaluation of nuclear data of {sup 244}Pu and {sup 237}Pu

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo; Konshin, V.A. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1995-10-01

    The evaluation of nuclear data for {sup 244}Pu and {sup 237}Pu was made in the neutron energy region from 10{sup -5} eV to 20 MeV. For both nuclides, the total, elastic and inelastic scattering, fission, capture, (n,2n) and (n,3n) reaction cross sections were evaluated on the basis of theoretical calculations. Resonance parameters were given for {sup 244}Pu. The angular and energy distributions of secondary neutrons were also estimated for both nuclides. The results were compiled in the ENDF-5 format and will be adopted in the JENDL Actinoid File. (author).

  12. Evaluation research of small and medium-sized enterprise informatization on big data

    Science.gov (United States)

    Yang, Na

    2017-09-01

    Against the background of big data, the informatization level of small and medium-sized enterprises needs to be strengthened; the cost of informatization is large, but the investment can bring benefits to small and medium-sized enterprises. This paper establishes a small and medium-sized enterprise informatization evaluation system covering hardware and software security, information organization, information technology application and profit, and information capability. Rough set theory is used to reduce the indexes, and the evaluation is then carried out with a support vector machine (SVM) model. Finally, examples are used to verify the theory and prove the effectiveness of the method.
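
    A minimal sketch of the two-stage scheme follows: an index-reduction step and an SVM evaluation model. Rough-set reduction is not available in scikit-learn, so a plain variance filter stands in for it here, and the indicator matrix and labels are synthetic.

    ```python
    # Hedged sketch: index reduction (variance filter standing in for rough sets)
    # followed by SVM-based evaluation; data are synthetic stand-ins.
    import numpy as np
    from sklearn.feature_selection import VarianceThreshold
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    indicators = rng.normal(size=(80, 12))            # 80 firms x 12 evaluation indexes
    level = rng.integers(0, 3, size=80)               # informatization level class

    reduced = VarianceThreshold(threshold=0.8).fit_transform(indicators)
    model = SVC(kernel="rbf", C=1.0)
    print(cross_val_score(model, reduced, level, cv=5).mean())
    ```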

  13. An evaluation of strain and temperature instrumentation technology used for SRM nozzle static test data acquisition

    Science.gov (United States)

    Lanius, S. J.; Brasfield, R. G.

    1986-08-01

    A program to investigate the status of strain gauge, thermocouple, and attachment technology as used in solid rocket motor static tests is being conducted. The objective for the first part of this program is to critically evaluate strain and temperature measuring instruments for material, configuration, and application deficiencies. Results of the component and application analysis and recommendations for the development of alternatives are presented. The analysis includes evaluation of strain and temperature transducers, lead wires, attachment and signal conditioning/data reduction technologies and procedures.

  14. 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Beck, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Descalles, M. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Escher, J. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hoffman, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mattoon, C. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Navratil, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Nobre, G. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ormand, W. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Summers, N. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Thompson, I. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vogt, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Barnowski, R. [Univ. of California, Berkeley, CA (United States)

    2015-05-12

    LLNL’s Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to produce the last of three major releases of LLNL’s evaluated nuclear database, ENDL2011. ENDL2011 is designed to support LLNL’s current and future nuclear data needs by providing the best nuclear data available to our programmatic customers. This library contains many new evaluations for radiochemical diagnostics, structural materials, and thermonuclear reactions. We have made an effort to eliminate all holes in reaction networks, allowing in-line isotopic creation and depletion calculations. We have striven to keep ENDL2011 at the leading edge of nuclear data library development by reviewing and incorporating new evaluations as they are made available to the nuclear data community. Finally, this release is our most highly tested release as we have strengthened our already rigorous testing regime by adding tests against IPPE Activation Ratio Measurements, many more new critical assemblies and a more complete set of classified testing (to be detailed separately).

  15. Stem analysis program (GOAP) for evaluating increment and growth data of individual trees

    Directory of Open Access Journals (Sweden)

    Gafura Aylak Özdemir

    2016-07-01

    Full Text Available Stem analysis is a method for evaluating in detail the increment and growth data of an individual tree over past periods, and it is widely used in various forestry disciplines. The raw data of a stem analysis consist of annual ring counts and measurements performed on cross sections taken from an individual tree by the section method. Evaluating these raw data takes quite some time. Thus, a computer program was developed in this study to perform stem analysis quickly and efficiently. This program, which evaluates the raw stem analysis data numerically and graphically, was written as a macro using the Visual Basic for Applications feature of MS Excel 2013, currently the most widely used spreadsheet program. In the program, the height growth model is formed from two different approaches, and individual tree volume based on the section method, cross-sectional area, increments of diameter, height and volume, volume increment percent, and stem form factor at breast height are calculated for the desired period lengths. These calculated values are given as tables. The development of diameter, height, volume, the increments of these variables, volume increment percent, and stem form factor at breast height according to periodic age are given as charts. A stem model showing the development of the diameter, height, and shape of the individual tree over past periods can also be produced by the program as a chart.

  16. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Directory of Open Access Journals (Sweden)

    Parmanto Bambang

    2008-06-01

    Full Text Available Abstract Background Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable management and analysis of spatial data, but have limitations in performing analysis of numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for data analysis for CHA. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included: task completion time, success in answering the tasks, and system satisfaction. Results Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT was statistically significantly different (α = .01) from SPSS-GIS for satisfaction and time. Conclusion Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than with SPSS-GIS.

  17. Performance Evaluation of Virtualization Techniques for Control and Access of Storage Systems in Data Center Applications

    Science.gov (United States)

    Ahmadi, Mohammad Reza

    2013-09-01

    Virtualization is a new technology that creates virtual environments based on existing physical resources. This article evaluates the effect of virtualization techniques on control servers and on the access method in storage systems [1, 2]. For control server virtualization, we present a tile-based evaluation with heterogeneous workloads to compare several key parameters and demonstrate the effectiveness of virtualization techniques. Moreover, we evaluate the virtualized model using VMotion techniques and maximum consolidation. For the access method, we prepared three different scenarios using direct, semi-virtual, and virtual attachment models. We evaluated the proposed models with several workloads including OLTP database, data streaming, file server, and web server. The evaluation results for different criteria confirm that the server virtualization technique offers high throughput and CPU usage as well as good performance with noticeable agility. The virtual technique is also a successful alternative for accessing storage systems, especially in large-capacity systems, and can therefore be an effective solution for expanding storage area and reducing access time. The results of the different evaluations and measurements demonstrate that virtualization of the control server and fully virtual access provide better performance, more agility, and higher utilization in the systems, and improve the business continuity plan.

  18. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    Energy Technology Data Exchange (ETDEWEB)

    Shreekanth Mandayam; Robi Polikar; John C. Chen

    2004-04-01

    The objectives of this research project are: (1) To design sensor data fusion algorithms that can synergistically combine defect related information from heterogeneous sensors used in gas pipeline inspection for reliably and accurately predicting the condition of the pipe-wall. (2) To develop efficient data management techniques for signals obtained during multisensor interrogation of a gas pipeline. During this reporting period, Rowan University fabricated test specimens with simulated defects for nondestructive evaluation (NDE); designed and developed two versions of a test platform for performing multi-sensor interrogation of test specimens under loaded conditions simulating pressurized gas pipelines; and performed magnetic flux leakage (MFL), ultrasonic testing (UT), thermal imaging and acoustic emission (AE) NDE on the test specimens. The data resulting from this work will be employed for designing multi-sensor data fusion algorithms.

  19. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Shreekanth Mandayam; Dr. Robi Polikar; Dr. John C. Chen

    2003-06-01

    The objectives of this research project are: (1) To design sensor data fusion algorithms that can synergistically combine defect related information from heterogeneous sensors used in gas pipeline inspection for reliably and accurately predicting the condition of the pipe-wall. (2) To develop efficient data management techniques for signals obtained during multisensor interrogation of a gas pipeline. During this reporting period, Rowan University fabricated test specimens with simulated defects for nondestructive evaluation (NDE); designed and developed two versions of a test platform for performing multi-sensor interrogation of test specimens under loaded conditions simulating pressurized gas pipelines; and performed acoustic emission (AE) NDE on the test specimens. The data resulting from this work will be employed for designing multi-sensor data fusion algorithms during the next reporting period.

  20. A comprehensive evaluation of machine learning techniques for cancer class prediction based on microarray data.

    Science.gov (United States)

    Raza, Khalid; Hasan, Atif N

    2015-01-01

    Prostate cancer is among the most common cancers in males and its heterogeneity is well known. Genomic-level changes can be detected in gene expression data, and those changes may serve as a standard model for class prediction on any cancer data set. Various techniques, including machine learning techniques, have been applied to prostate cancer data sets in order to accurately predict the cancer class. The large number of attributes but small number of samples in microarray data leads to poor training; therefore, the most challenging part is attribute reduction, i.e., removal of non-significant genes. In this work, a combination of interquartile range and the t-test is used for attribute reduction. Further, a comprehensive evaluation of ten state-of-the-art machine learning techniques for their accuracy in class prediction of prostate cancer is performed. Of these techniques, Bayes Network performed best with an accuracy of 94.11%, followed by Naïve Bayes with an accuracy of 91.17%.
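
    The attribute-reduction step described above can be sketched as follows: keep the genes with high interquartile range (a variability filter), then rank the survivors with a two-sample t-test between classes. The expression matrix and thresholds below are synthetic and illustrative, not the paper's.

    ```python
    # Hedged sketch: IQR variability filter + t-test ranking for gene selection.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    X = rng.normal(size=(102, 5000))                  # samples x genes (synthetic)
    y = rng.integers(0, 2, size=102)                  # tumor / normal labels

    iqr = stats.iqr(X, axis=0)
    keep = iqr > np.percentile(iqr, 75)               # keep top-quartile variability
    X_var = X[:, keep]

    t, p = stats.ttest_ind(X_var[y == 0], X_var[y == 1], axis=0)
    top = np.argsort(p)[:50]                          # 50 most class-separating genes
    print(X_var[:, top].shape)                        # reduced matrix for the classifiers
    ```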

  1. Kiwi: An Evaluated Library of Uncertainties in Nuclear Data and Package for Nuclear Sensitivity Studies

    Energy Technology Data Exchange (ETDEWEB)

    Pruet, J

    2007-06-23

    This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. As well, Kiwi provides access to calculations of k eigenvalues for critical assemblies. This allows the user to check implications of data modifications against integral experiments for multiplying systems. Kiwi is written in python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B division.

  2. Implementation and evaluation of a clinical data management programme in a primary care centre.

    LENUS (Irish Health Repository)

    Sweeney, J

    2014-11-01

    Electronic health records (EHR) support clinical management, administration, quality assurance, research, and service planning. The aim of this study was to evaluate a clinical data management programme to improve consistency, completeness and accuracy of EHR information in a large primary care centre with 10 General Practitioners (GPs). A Clinical Data Manager was appointed to implement a Data Management Strategy which involved coding consultations using ICPC-2 coding, tailored support and ongoing individualised feedback to clinicians. Over an eighteen month period there were improvements in engagement with and level of coding. Prior to implementation (August 2011) 4 of the 10 GPs engaged in regular coding and 69% of their consultation notes were coded. After 12 months, all 10 GPs and 6 nurses were ICPC-2 coding their consultations and monthly coding levels had increased to 98%. This structured Data Management Strategy provides a feasible sustainable way to improve information management in primary care.

  3. Evaluating the Efficacy of Wavelet Configurations on Turbulent-Flow Data

    Energy Technology Data Exchange (ETDEWEB)

    Li, Shaomeng; Gruchalla, Kenny; Potter, Kristin; Clyne, John; Childs, Hank

    2015-10-25

    I/O is increasingly becoming a significant constraint for simulation codes and visualization tools on modern supercomputers. Data compression is an attractive workaround, and, in particular, wavelets provide a promising solution. However, wavelets can be applied in multiple configurations, and the variations in configuration impact accuracy, storage cost, and execution time. While the variation of these factors across wavelet configurations has been explored in image processing, it is not well understood for visualization and analysis of scientific data. To illuminate this issue, we evaluate multiple wavelet configurations on turbulent-flow data. Our approach is to repeat established analysis routines on uncompressed and lossy-compressed versions of a data set, and then quantitatively compare their outcomes. Our findings show that accuracy varies greatly based on wavelet configuration, while storage cost and execution time vary less. Overall, our study provides new insights for simulation analysts and visualization experts, who need to make tradeoffs between accuracy, storage cost, and execution time.
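
    The evaluation loop can be sketched generically: lossy-compress a field under different wavelet configurations (basis, decomposition level, kept-coefficient fraction), reconstruct, and compare against the uncompressed result. The 2-D field below is synthetic; the study's turbulence data and analysis routines are not reproduced.

    ```python
    # Hedged sketch: compare reconstruction error across wavelet configurations.
    import numpy as np
    import pywt

    field = np.random.default_rng(3).normal(size=(256, 256))  # synthetic 2-D field

    def compress(data, wavelet, level, keep=0.05):
        coeffs = pywt.wavedec2(data, wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        thresh = np.quantile(np.abs(arr), 1.0 - keep)          # keep largest 5%
        arr = np.where(np.abs(arr) >= thresh, arr, 0.0)
        coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
        return pywt.waverec2(coeffs, wavelet)

    for wavelet in ("haar", "db4", "bior4.4"):
        rec = compress(field, wavelet, level=4)
        rmse = np.sqrt(np.mean((rec - field) ** 2))
        print(wavelet, round(float(rmse), 4))                  # accuracy vs. configuration
    ```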

  4. [A new model of comprehensive data linkage--evaluation of its application in femoral neck fracture].

    Science.gov (United States)

    Ohmann, Christian; Smektala, Rüdiger; Pientka, Ludger; Paech, Stefan; Neuhaus, Elke; Rieger, Martin; Schwabe, Wolfgang; Debold, Peter; Jonas, Michael; Hupe, Klaus; Bücker-Nott, Hans-Joachim; Guido, Giani; Szucs, Thomas D

    2005-01-01

    The aim of the project was a comprehensive assessment of the short- and middle-term outcome of femoral neck fracture by linkage and analysis of available data from routine care. For this purpose, a generic model of data linkage was developed, agreed with the data security officer, and applied in practice. Included were all patients of the AOK Westphalia-Lippe who were treated in a general or trauma surgery hospital in 1995/1999 for a femoral neck fracture (ICD-9: 820). For these patients, the linkage was based on the following sources: the data regarding the initial hospital stay were provided by the office of quality assurance of the chamber of physicians of Westphalia-Lippe; the administrative data were provided by the AOK Westphalia-Lippe; and the data evaluating nursing needs were obtained from the Medical Services of the Health Insurance of Westphalia-Lippe (MDK). This paper presents the model of data linkage and describes its practical implementation; it also presents medical data demonstrating that femoral neck fractures are associated with high mortality and an increase of nursing needs in the course of the disease. The benefit of the new model is manifold, and it can easily be extended to other clinical questions.

  5. Evaluation of Annual MODIS PTC Data for Deforestation and Forest Degradation Analysis

    Science.gov (United States)

    Gao, Y.; Ghilardi, A.; Mas, J. F.; Paneque-Galvez, J.; Skutsch, M.

    2016-06-01

    Anthropogenic land-cover changes, e.g. deforestation and forest degradation, cause carbon emissions. To estimate deforestation and forest degradation, it is important to have reliable data on forest cover. In this analysis, we evaluated annual MODIS Percent Tree Cover (PTC) data for the detection of forest change, including deforestation, forest degradation, reforestation and revegetation. The annual MODIS PTC data (2000 - 2010) were pre-processed by applying the quality layer. Based on the PTC values of the annual MODIS data, forest change maps were produced and assessed by comparison with data from visual interpretation of SPOT-5 images. The assessment was applied to two case studies: the Ayuquila Basin and the Monarch Reserve. The results show that the deforestation patches detected by visual interpretation are roughly 4 times as numerous as those detected in the MODIS PTC data, which can be partially attributed to the much higher spatial resolution of SPOT-5, which is able to pick up small deforestation patches. The analysis found poor spatial overlap for both case studies. Possible reasons for the discrepancy in quantity and spatial coincidence are provided. It is necessary to refine the methodology for forest change detection with PTC images, and also to refine the validation data in terms of data periods and forest change categories to ensure a better assessment.

  6. EVALUATION OF ANNUAL MODIS PTC DATA FOR DEFORESTATION AND FOREST DEGRADATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    Y. Gao

    2016-06-01

    Full Text Available Anthropogenic land-cover changes, e.g. deforestation and forest degradation, cause carbon emissions. To estimate deforestation and forest degradation, it is important to have reliable data on forest cover. In this analysis, we evaluated annual MODIS Percent Tree Cover (PTC) data for the detection of forest change, including deforestation, forest degradation, reforestation and revegetation. The annual MODIS PTC data (2000 – 2010) were pre-processed by applying the quality layer. Based on the PTC values of the annual MODIS data, forest change maps were produced and assessed by comparison with data from visual interpretation of SPOT-5 images. The assessment was applied to two case studies: the Ayuquila Basin and the Monarch Reserve. The results show that the deforestation patches detected by visual interpretation are roughly 4 times as numerous as those detected in the MODIS PTC data, which can be partially attributed to the much higher spatial resolution of SPOT-5, which is able to pick up small deforestation patches. The analysis found poor spatial overlap for both case studies. Possible reasons for the discrepancy in quantity and spatial coincidence are provided. It is necessary to refine the methodology for forest change detection with PTC images, and also to refine the validation data in terms of data periods and forest change categories to ensure a better assessment.

  7. Subjective evaluation of the comfort of popular denim: elaboration and validation of the data

    Science.gov (United States)

    Braga, I.; Abreu, M. J.; Oliveira, M.

    2017-10-01

    The main objective of this study is to describe the validation, through a pre-test, of a questionnaire for the subjective evaluation of the comfort of popular jeans. Through this research, we intend to define language that matches the understanding of the public participating in the research, and to use a response scale suited to the respondents' ability to assess the garments in question on the different comfort parameters. The group of evaluators consists of 10 women who are consumers in the popular markets of Fortaleza, aged between 18 and 40 years. With this research it was possible to formulate questions and answers aimed at the public's understanding, in order to choose the evaluation attributes under analysis, to define the response scale, and to validate the questionnaire as a data collection instrument.

  8. Evaluation of hydroconverted residues. Rationalization of analytical data through hydrogen transfer balance

    Energy Technology Data Exchange (ETDEWEB)

    Bacaud, Robert; Rouleau, Loiec [Institut de Recherches sur la Catalyse, CNRS, 2 Avenue Albert Einstein, 69626 Villeurbanne (France); Cebolla, Vicente L.; Membrado, Luis; Vela, Jesus [Departamento de Procesos Quimicos, Instituto de Carboquimica, CSIC, Calle Poeta Luciano Gracia 5, 50015 Zaragoza (Spain)

    1998-08-27

    Analytical evaluation of petroleum-based materials and processed feeds is a complex task relying on a compromise between tedious in-depth characterizations and fast-responding tools for process control. In the present paper, a large number of hydroprocessed vacuum residues, obtained under either catalytic or thermal conditions, have been submitted to the following analytical techniques: simulated distillation, coupled Simdist/MS, UV spectroscopy, {sup 13}C NMR, quantitative thin-layer chromatography/FID, and vapor phase osmometry. A comparison of the analytical data in the light of correlations with hydrogen transfer evaluation is proposed, which accounts for the observed variations in aromatic content. Conradson carbon residue largely influences the results obtained with some of the examined techniques. Apparent discrepancies are rationalized and a strategy for a comprehensive analytical evaluation of hydroprocessed feeds is proposed.

  9. Evaluating Technical Efficiency of Nursing Care Using Data Envelopment Analysis and Multilevel Modeling.

    Science.gov (United States)

    Min, Ari; Park, Chang Gi; Scott, Linda D

    2016-05-23

    Data envelopment analysis (DEA) is an advantageous non-parametric technique for evaluating relative efficiency of performance. This article describes use of DEA to estimate technical efficiency of nursing care and demonstrates the benefits of using multilevel modeling to identify characteristics of efficient facilities in the second stage of analysis. Data were drawn from LTCFocUS.org, a secondary database including nursing home data from the Online Survey Certification and Reporting System and Minimum Data Set. In this example, 2,267 non-hospital-based nursing homes were evaluated. Use of DEA with nurse staffing levels as inputs and quality of care as outputs allowed estimation of the relative technical efficiency of nursing care in these facilities. In the second stage, multilevel modeling was applied to identify organizational factors contributing to technical efficiency. Use of multilevel modeling avoided biased estimation of findings for nested data and provided comprehensive information on differences in technical efficiency among counties and states. © The Author(s) 2016.
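
    The first-stage computation can be sketched as one linear program per facility (decision-making unit): the input-oriented CCR envelopment model minimizes the factor theta by which a facility's inputs could shrink while a convex combination of peers still matches its outputs. The tiny data matrix below is invented; the study's staffing and quality variables are only named by analogy.

    ```python
    # Hedged sketch: input-oriented CCR DEA solved as a linear program per DMU.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[4.0, 2.0], [6.0, 3.0], [5.0, 5.0], [8.0, 4.0]]).T  # inputs x DMUs
    Y = np.array([[60.0], [70.0], [75.0], [80.0]]).T                  # outputs x DMUs
    m, n = X.shape
    s = Y.shape[0]

    def efficiency(j0):
        # Decision vector: [theta, lambda_1 ... lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.c_[-X[:, j0], X]        # sum_j lam_j * x_ij <= theta * x_i,j0
        A_out = np.c_[np.zeros(s), -Y]    # sum_j lam_j * y_rj >= y_r,j0
        b_ub = np.r_[np.zeros(m), -Y[:, j0]]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun                    # theta = 1 means technically efficient

    for j in range(n):
        print(f"DMU {j}: efficiency = {efficiency(j):.3f}")
    ```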

  10. Evaluation of empirical attributes for credit risk forecasting from numerical data

    Directory of Open Access Journals (Sweden)

    Augustinos Dimitras

    2017-03-01

    Full Text Available In this research, the authors propose a new method to evaluate borrowers' credit risk and the quality of the financial statement information they provide, using qualitative and quantitative criteria to measure the quality and the reliability of credit customers. Under this framework, the authors evaluate 35 features that are empirically utilized for forecasting the credit behavior of a Greek bank's borrowers. These features are initially selected according to universally accepted criteria. A set of historical data was collected and an extensive data analysis was performed using non-parametric models. The analysis revealed that by building a simplified model using only three out of the thirty-five initially selected features, one can achieve the same or slightly better forecasting accuracy than that achieved by the model that uses all the initial features. The experimentally verified claim that universally accepted criteria cannot be used globally to achieve optimal results is also discussed.

  11. Japanese evaluated nuclear data library version 3 revision-3: JENDL-3.3

    CERN Document Server

    Shibata, K; Kawano, T

    2002-01-01

    Evaluation for JENDL-3.3 has been performed by considering the accumulated feedback information and various benchmark tests of the previous library JENDL-3.2. The major problems of the JENDL-3.2 data were solved by the new library: overestimation of criticality values for thermal fission reactors was improved by modifications of the fission cross sections and fission neutron spectra for {sup 235}U; incorrect energy distributions of secondary neutrons from important heavy nuclides were replaced with statistical model calculations; and the inconsistency between elemental and isotopic evaluations was removed for medium-heavy nuclides. Moreover, covariance data were provided for 20 nuclides. The reliability of JENDL-3.3 was investigated by benchmark analyses of reactor and shielding performance. The results of the analyses indicate that JENDL-3.3 predicts various reactor and shielding characteristics better than JENDL-3.2. (author)

  12. Comprehensive evaluation of attitude and orbit estimation using real earth magnetic field data

    Science.gov (United States)

    Deutschmann, Julie; Bar-Itzhack, Itzhack

    1997-01-01

    A single, augmented extended Kalman filter (EKF) which simultaneously and autonomously estimates spacecraft attitude and orbit was developed and tested with simulated and real magnetometer and rate data. Since the earth's magnetic field is a function of time and position, and since time is accurately known, the differences between the computed and measured magnetic field components, as measured by the magnetometers throughout the entire spacecraft's orbit, are a function of orbit and attitude errors. These differences can be used to estimate the orbit and attitude. The test results of the EKF with magnetometer and gyro data from three NASA satellites are presented and evaluated.
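
    The core of the approach is a standard EKF measurement update in which the innovation is the difference between the measured magnetic field and the field predicted from a geomagnetic model at the estimated time and position. The sketch below shows only that generic update step; the state layout, Jacobian and numbers are stand-ins, not the paper's augmented attitude-plus-orbit formulation.

    ```python
    # Hedged sketch: generic EKF measurement update driven by a magnetometer residual.
    import numpy as np

    def ekf_update(x, P, z, h, H, R):
        """Standard EKF update: state x, covariance P, measurement z,
        predicted measurement h = h(x), Jacobian H, measurement noise R."""
        innovation = z - h                          # measured minus modeled field
        S = H @ P @ H.T + R                         # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        x_new = x + K @ innovation
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    x = np.zeros(6)                                 # stand-in 6-element error state
    P = np.eye(6) * 1e-2
    H = np.hstack([np.eye(3), np.zeros((3, 3))])    # stand-in measurement Jacobian
    x, P = ekf_update(x, P, z=np.array([120.0, -40.0, 15.0]),  # residual in nT
                      h=np.zeros(3), H=H, R=np.eye(3) * 25.0)
    print(x[:3])
    ```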

  13. Computer-assisted evaluation of the thermochemical data of the compounds of thorium

    Energy Technology Data Exchange (ETDEWEB)

    Wagman, D. D.; Schumm, R. H.; Parker, V. B.

    1977-08-01

    Selected values are given for the thermochemical properties of the compounds of thorium. They are obtained from a computer-assisted least sums-least squares approach to the evaluation of thermodynamic data networks. The properties given, where data are available, are enthalpy of formation, Gibbs energy of formation, and entropy at 298.15 K (ΔHf(298), ΔGf(298), and S(298)). The values are consistent with the CODATA Key Values for Thermodynamics. The reaction catalog from which this self-consistent set of values is generated is given with a statistical analysis. Some thermal functions are also given, as well as detailed comments when necessary.
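
    The network idea can be sketched in a few lines: each measured reaction enthalpy is a linear combination of species formation enthalpies, so stacking the reaction catalog gives an overdetermined linear system solved in the least-squares sense. The stoichiometry and numbers below are invented purely to show the shape of the computation.

    ```python
    # Hedged sketch: least-squares reconciliation of a thermochemical reaction
    # network; all stoichiometry and enthalpy values are invented for illustration.
    import numpy as np

    species = ["Th(cr)", "ThO2(cr)", "ThCl4(cr)"]
    # Rows: reactions; columns: net stoichiometric coefficient of each species.
    A = np.array([[-1.0,  1.0, 0.0],      # Th + O2 -> ThO2
                  [-1.0,  0.0, 1.0],      # Th + 2 Cl2 -> ThCl4
                  [ 0.0, -1.0, 1.0]])     # net relation between ThO2 and ThCl4
    dH = np.array([-1226.4, -1186.3, 39.8])   # "measured" kJ/mol, slightly inconsistent

    # Elements in their reference state have dHf = 0, so the Th(cr) column drops out.
    h, *_ = np.linalg.lstsq(A[:, 1:], dH, rcond=None)
    print(dict(zip(species[1:], np.round(h, 1))))  # reconciled formation enthalpies
    ```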

  14. On evaluated nuclear data for beta-delayed gamma rays following fission of special nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

    Mencarini, Leonardo de H.; Caldeira, Alexandre D., E-mail: mencarini@ieav.cta.b, E-mail: alexdc@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2011-07-01

    In this paper, a new type of information available in ENDF is discussed. During a consistency check of the evaluated nuclear data library ENDF/B-VII.0 performed at the Nuclear Data Subdivision of the Institute for Advanced Studies, the size of the files for some materials drew the attention of one of the authors. Almost 94 % of all available information for these special nuclear materials is used to represent the beta-delayed gamma rays following fission. This is the first time this information is included in an ENDF version. (author)

  15. History and evaluation of national-scale geochemical data sets for the United States

    OpenAIRE

    Smith, David B.; Smith, Steven M.; Horton, John D.

    2013-01-01

    Six national-scale, or near national-scale, geochemical data sets for soils or stream sediments exist for the United States. The earliest of these, here termed the ‘Shacklette’ data set, was generated by a U.S. Geological Survey (USGS) project conducted from 1961 to 1975. This project used soil collected from a depth of about 20 cm as the sampling medium at 1323 sites throughout the conterminous U.S. The National Uranium Resource Evaluation Hydrogeochemical and Stream Sediment Reconnaissance ...

  16. Evaluation of methodologies for interpolation of data for hydrological modeling in glacierized basins with limited information

    Science.gov (United States)

    Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier

    2017-04-01

    The availability and consistency of data is a determining factor for the reliability of any hydrological model and its simulated results. Unfortunately, there are many regions worldwide where data are not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables until it finally ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series. This database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when dealing with the assessment of current and future water resources. In this context, an evaluation of different methodologies to interpolate temperature and precipitation data at a monthly time step, as well as ice volume data, in glacierized basins with limited data is presented here. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data are available from nearby monitoring stations since 1983. The study period was 1983 - 1999, with a validation period of 1993 - 1999. For the temperature series the aim was to extend the observed data and interpolate them. NCEP reanalysis data were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very close results; by parsimony, simple correlation is shown to be a viable choice. For the precipitation series, the aim was to interpolate observed data. Two methodologies were evaluated: 1) Inverse Distance Weighting, whose results underestimate the amount
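
    The Inverse Distance Weighting scheme named above can be sketched in a few lines: a target point's monthly value is a weighted mean of nearby gauges, with weights 1/d^p. The station coordinates and values below are illustrative stand-ins.

    ```python
    # Hedged sketch: Inverse Distance Weighting for a monthly precipitation value.
    import numpy as np

    def idw(xy_obs, values, xy_target, power=2.0):
        d = np.linalg.norm(xy_obs - xy_target, axis=1)
        if np.any(d == 0):                          # target coincides with a gauge
            return values[d == 0][0]
        w = 1.0 / d**power
        return np.sum(w * values) / np.sum(w)

    gauges = np.array([[0.0, 0.0], [10.0, 2.0], [4.0, 8.0]])  # stand-in km coordinates
    precip = np.array([85.0, 120.0, 95.0])                    # stand-in mm/month values
    print(idw(gauges, precip, np.array([5.0, 3.0])))
    ```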

  17. An automated pressure data acquisition system for evaluation of pressure sensitive paint chemistries

    Science.gov (United States)

    Sealey, Bradley S.; Mitchell, Michael; Burkett, Cecil G.; Oglesby, Donald M.

    1993-01-01

    An automated pressure data acquisition system for testing of pressure sensitive phosphorescent paints was designed, assembled, and tested. The purpose of the calibration system is the evaluation and selection of pressure sensitive paint chemistries that could be used to obtain global aerodynamic pressure distribution measurements. The test apparatus and setup used for pressure sensitive paint characterizations is described. The pressure calibrations, thermal sensitivity effects, and photodegradation properties are discussed.

  18. Data mining approach to the evaluation of diagnostic tests in Wilson disease

    Science.gov (United States)

    Plutecki, Michal M.; Dądalski, Maciej; Socha, Piotr; Mulawka, Jan J.

    2009-06-01

    The purpose of this paper is to develop an evaluation method for diagnostic tests in Wilson disease that improves on those known so far. In order to find the most interesting classification models, various data mining techniques were applied to a real data set of patients suffering from Wilson disease. It turned out that a combination of two classification algorithms, with implementations in the Weka environment, may significantly increase classification ability.

  19. Evaluation of the Frequency for Gas Sampling for the High Burnup Confirmatory Data Project

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, Christine T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Alsaed, Halim A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marschman, Steven C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Scaglione, John M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-05-01

    This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of 1) the types and magnitudes of gases that could be present in the project cask and 2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations.

  20. Outlier Detection for Sensor Systems (ODSS): A MATLAB Macro for Evaluating Microphone Sensor Data Quality.

    Science.gov (United States)

    Vasta, Robert; Crandell, Ian; Millican, Anthony; House, Leanna; Smith, Eric

    2017-10-13

    Microphone sensor systems provide information that may be used for a variety of applications. Such systems generate large amounts of data. One concern is with microphone failure and unusual values that may be generated as part of the information collection process. This paper describes methods and a MATLAB graphical interface that provides rapid evaluation of microphone performance and identifies irregularities. The approach and interface are described. An application to a microphone array used in a wind tunnel is used to illustrate the methodology.
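
    The abstract does not spell out which detection rules the ODSS macro applies, so the sketch below shows a common robust baseline in Python rather than the tool's actual algorithm: flag samples whose modified z-score, based on the median absolute deviation (MAD), exceeds a conventional threshold.

```python
import numpy as np

def mad_outliers(x, thresh=3.5):
    """Flag samples whose modified z-score exceeds a threshold; the MAD
    makes the estimate robust to the outliers themselves."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    if mad == 0.0:                       # constant signal: nothing to flag
        return np.zeros(x.size, dtype=bool)
    modified_z = 0.6745 * (x - med) / mad
    return np.abs(modified_z) > thresh

channel = np.concatenate([np.random.default_rng(0).normal(0.0, 1.0, 1000),
                          [15.0, -12.0]])       # two injected spikes
print(np.flatnonzero(mad_outliers(channel)))    # -> [1000 1001]
```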

  1. Power-law behaviour evaluation from foreign exchange market data using a wavelet transform method

    Science.gov (United States)

    Wei, H. L.; Billings, S. A.

    2009-09-01

    Numerous studies in the literature have shown that the dynamics of many time series, including observations in foreign exchange markets, exhibit scaling behaviours. A simple new statistical approach, derived from the concept of the continuous wavelet transform correlation function (WTCF), is proposed for the evaluation of power-law properties from observed data. The new method reveals that foreign exchange rates obey power laws and thus belong to the class of self-similar processes.
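
    As a generic illustration of wavelet-based scaling analysis (not the authors' WTCF statistic), one can regress the log of the mean squared continuous wavelet transform coefficients on log scale; an approximately straight log-log line indicates power-law scaling. The wavelet choice and scale range below are assumptions.

```python
import numpy as np
import pywt

def scaling_exponent(x, scales=np.arange(2, 65)):
    """Regress log mean squared CWT coefficient energy on log scale."""
    coefs, _ = pywt.cwt(x, scales, 'morl')
    energy = (np.abs(coefs) ** 2).mean(axis=1)   # average energy per scale
    slope, _ = np.polyfit(np.log(scales), np.log(energy), 1)
    return slope

rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(4096))      # self-similar test signal
print(scaling_exponent(walk))                    # close to 2 for a random walk
```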

  2. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards

    Directory of Open Access Journals (Sweden)

    Charles S. Mayo, PhD

    2017-07-01

    Conclusions: Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.

  3. Extracting Sentiment from Healthcare Survey Data: An Evaluation of Sentiment Analysis Tools

    OpenAIRE

    Georgiou, D.; MacFarlane, A.; Russell-Rose, T.

    2015-01-01

    Sentiment analysis is an emerging discipline with many analytical tools available. This project aimed to examine a number of tools regarding their suitability for healthcare data. A comparison between commercial and non-commercial tools was made using responses from an online survey which evaluated design changes made to a clinical information service. The commercial tools were Semantria and TheySay and the non-commercial tools were WEKA and Google Prediction API. Different approaches were fo...

  4. ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chadwick, M B; Oblozinsky, P; Herman, M; Greene, N M; McKnight, R D; Smith, D L; Young, P G; MacFarlane, R E; Hale, G M; Haight, R C; Frankle, S; Kahler, A C; Kawano, T; Little, R C; Madland, D G; Moller, P; Mosteller, R; Page, P; Talou, P; Trellue, H; White, M; Wilson, W B; Arcilla, R; Dunford, C L; Mughabghab, S F; Pritychenko, B; Rochman, D; Sonzogni, A A; Lubitz, C; Trumbull, T H; Weinman, J; Brown, D; Cullen, D E; Heinrichs, D; McNabb, D; Derrien, H; Dunn, M; Larson, N M; Leal, L C; Carlson, A D; Block, R C; Briggs, B; Cheng, E; Huria, H; Kozier, K; Courcelle, A; Pronyaev, V; der Marck, S

    2006-10-02

    We describe the next generation general purpose Evaluated Nuclear Data File, ENDF/B-VII.0, of recommended nuclear data for advanced nuclear science and technology applications. The library, released by the U.S. Cross Section Evaluation Working Group (CSEWG) in December 2006, contains data primarily for reactions with incident neutrons, protons, and photons on almost 400 isotopes. The new evaluations are based on both experimental data and nuclear reaction theory predictions. The principal advances over the previous ENDF/B-VI library are the following: (1) New cross sections for U, Pu, Th, Np and Am actinide isotopes, with improved performance in integral validation criticality and neutron transmission benchmark tests; (2) More precise standard cross sections for neutron reactions on H, {sup 6}Li, {sup 10}B, Au and for {sup 235,238}U fission, developed by a collaboration with the IAEA and the OECD/NEA Working Party on Evaluation Cooperation (WPEC); (3) Improved thermal neutron scattering; (4) An extensive set of neutron cross sections on fission products developed through a WPEC collaboration; (5) A large suite of photonuclear reactions; (6) Extension of many neutron- and proton-induced reactions up to an energy of 150 MeV; (7) Many new light nucleus neutron and proton reactions; (8) Post-fission beta-delayed photon decay spectra; (9) New radioactive decay data; and (10) New methods developed to provide uncertainties and covariances, together with covariance evaluations for some sample cases. The paper provides an overview of this library, consisting of 14 sublibraries in the same ENDF-6 format as the earlier ENDF/B-VI library. We describe each of the 14 sublibraries, focusing on neutron reactions. Extensive validation, using radiation transport codes to simulate measured critical assemblies, shows major improvements: (a) The long-standing underprediction of low enriched U thermal assemblies is removed; (b) The {sup 238}U, {sup 208}Pb, and {sup 9}Be reflector

  5. Research on Full Data Planning Simulation State of Smart Distribution Automation and Simulation Evaluating System

    Directory of Open Access Journals (Sweden)

    Qiang Chang

    2014-05-01

    Full Text Available Smart distribution automation is an important part of the smart grid. In our country, the power distribution network is operated open-loop and radially. To optimize the distribution network, this paper conducts full-data planning for smart distribution automation from the perspective of annual total cost, explores an efficient and practical planning algorithm, and provides a fundamental basis for the development and completion of smart power distribution systems. We first analyze the background and features of smart distribution automation to establish the constraints and targets of the distribution network planning process. Then, according to the regional characteristics of smart distribution automation, we present application strategies of a multi-ant-colony algorithm for planning of smart distribution automation and obtain the simulated branch circuit data of the full-data plan for a given power supply region after 500 iterations. The algorithm converges well and the optimization results are good. Finally, after evaluating the results of the distribution network planning, we construct an index evaluation system and propose an evaluation process for the planning results based on the algorithm principle of AHP, providing a fundamental basis for self-healing smart distribution automation.

  6. EXTENSION OF THE NUCLEAR REACTION MODEL CODE EMPIRE TO ACTINIDES NUCLEAR DATA EVALUATION.

    Energy Technology Data Exchange (ETDEWEB)

    CAPOTE,R.; SIN, M.; TRKOV, A.; HERMAN, M.; CARLSON, B.V.; OBLOZINSKY, P.

    2007-04-22

    Recent extensions and improvements of the EMPIRE code system are outlined. They add new capabilities to the code, such as prompt fission neutron spectra calculations using Hauser-Feshbach plus pre-equilibrium pre-fission spectra, cross section covariance matrix calculations by the Monte Carlo method, fitting of optical model parameters, an extended set of optical model potentials including new dispersive coupled channel potentials, parity-dependent level densities and transmission through numerically defined fission barriers. These features, along with improved and validated ENDF formatting, exclusive/inclusive spectra, and recoils, make the current EMPIRE release a complete and well-validated tool for evaluation of nuclear data at incident energies above the resonance region. The current EMPIRE release has been used in evaluations of neutron-induced reaction files for {sup 232}Th and {sup 231,233}Pa nuclei in the fast neutron region at the IAEA. Triple-humped fission barriers and exclusive pre-fission neutron spectra were considered for the fission data evaluation. Total, fission, capture and neutron emission cross sections, average resonance parameters and angular distributions of neutron scattering are in excellent agreement with the available experimental data.

  7. Using Rainfall and Temperature Data in the Evaluation of National Malaria Control Programs in Africa.

    Science.gov (United States)

    Thomson, Madeleine C; Ukawuba, Israel; Hershey, Christine L; Bennett, Adam; Ceccato, Pietro; Lyon, Bradfield; Dinku, Tufa

    2017-09-01

    Since 2010, the Roll Back Malaria (RBM) Partnership, including National Malaria Control Programs, donor agencies (e.g., President's Malaria Initiative and Global Fund), and other stakeholders have been evaluating the impact of scaling up malaria control interventions on all-cause under-five mortality in several countries in sub-Saharan Africa. The evaluation framework assesses whether the deployed interventions have had an impact on malaria morbidity and mortality and requires consideration of potential nonintervention influencers of transmission, such as drought/floods or higher temperatures. Herein, we assess the likely effect of climate on the assessment of the impact of malaria interventions in 10 priority countries/regions in eastern, western, and southern Africa for the President's Malaria Initiative. We used newly available quality-controlled Enhanced National Climate Services rainfall and temperature products as well as global climate products to investigate likely impacts of climate on malaria evaluations and to test the assumption that changing the baseline period can significantly alter the influence of climate in the assessment of interventions. Based on current baseline periods used in national malaria impact assessments, we identify three countries/regions where current evaluations may overestimate the impact of interventions (Tanzania, Zanzibar, Uganda) and three countries where current malaria evaluations may underestimate the impact of interventions (Mali, Senegal, and Ethiopia). In four countries (Rwanda, Malawi, Mozambique, and Angola) there was no strong difference in climate suitability for malaria in the pre- and post-intervention periods. In part, this may be due to data quality and analysis issues.

  8. Evaluating the credibility of histopathology data in environmental endocrine toxicity studies.

    Science.gov (United States)

    Wolf, Jeffrey C; Maack, Gerd

    2017-03-01

    Agencies responsible for environmental protection are tasked with developing regulatory guidance that is based on the best available scientific evidence. Histopathology is a common endpoint in toxicologic bioassays; however, because of the subjective nature of this endpoint, and the advanced level of specialized training required for its effective utilization, the reliability of histopathology data can be inconsistent. Consequently, mechanisms for evaluating such data on a case-by-case basis are needed. The purposes of the present review are to describe a methodology that can be used to evaluate the credibility of histopathology findings and to discuss the results of such assessments as applied to real-world data collected from the scientific literature. A key outcome of these efforts was the finding that only 54% of the studies examined contained histopathology data that were considered to be either highly credible or credible, whereas data in 46% of those studies were of equivocal, dubious, or no credibility. In addition, the results indicated that the quality of the data examined tended to decline during the past 15 yr. The ultimate goals of the present review are to draw attention to reliability issues that can affect histopathology results, provide recommendations to improve the quality of this endpoint, and suggest an approach for the expeditious and judicious use of histopathology data in the weight-of-evidence determinations required for hazard and/or risk assessment. This exercise was conducted initially as part of a SETAC Pellston Workshop™ entitled "Environmental Hazard and Risk Assessment Approaches for Endocrine-Active Chemicals (EHRA): Developing Technical Guidance Based on Case Studies to Support Decision Making" that was held in Pensacola, Florida (USA) from 31 January to 5 February 2016. Environ Toxicol Chem 2017;36:601-611. © 2016 SETAC.

  9. ACCURACY EVALUATION OF TWO GLOBAL LAND COVER DATA SETS OVER WETLANDS OF CHINA

    Directory of Open Access Journals (Sweden)

    Z. G. Niu

    2012-07-01

    Full Text Available Although wetlands are well known as one of the most important ecosystems in the world, there are still few global wetland mapping efforts at present. To accurately evaluate the wetland-related types of data for both the Global Land Cover 2000 (GLC2000) data set and the MODIS land cover data set (MOD12Q1), we used the China wetland map of 2000, which was interpreted manually based on Landsat TM images, to examine the precision of these global land cover data sets from two aspects (class area accuracy and spatial agreement) across China. The results show that the area consistency coefficients of wetland-related types between the two global data sets and the reference data are 77.27% and 56.85%, respectively. However, the overall accuracy of relevant wetland types from GLC2000 is only 19.81% based on the results of the confusion matrix of spatial consistency; similarly, MOD12Q1 reaches merely 18.91%. Furthermore, the accuracy of the peatlands is much lower than that of the water bodies according to the results of per-pixel comparison. The categories where errors occurred frequently mainly include grasslands, croplands, bare lands and part of woodland (deciduous coniferous forest, deciduous broadleaf forest and open shrubland). The possible reasons for the low precision of wetland-related land cover types include (1) the different aims of various products and therefore the inconsistent wetland definitions in their systems; (2) the coarse spatial resolution of satellite images used in global data; and (3) discrepancies in dates when images were acquired between the global data sets and the reference data. Overall, the unsatisfactory results highlight that more attention should be paid to the application of these two global data products, especially in wetland-relevant types across China.

  10. Evaluating clinical stop-smoking services globally: towards a minimum data set.

    Science.gov (United States)

    Skinner, Andrew L; West, Robert; Raw, Martin; Anderson, Emma; Munafò, Marcus R

    2017-11-26

    Behavioural and pharmacological support for smoking cessation improves the chances of success and represents a highly cost-effective way of preventing chronic disease and premature death. There is a large number of clinical stop-smoking services throughout the world. These could be connected into a global network to provide data to assess what treatment components are most effective, for what populations and in what settings. To enable this, a minimum data set (MDS) is required to standardize the data captured from smoking cessation services globally. We describe some of the key steps involved in developing a global MDS for smoking cessation services and methodologies to be considered for their implementation, including approaches for reaching consensus on data items to include in a MDS and for its robust validation. We use informal approximations of these methods to produce an example global MDS for smoking cessation. Our aim with this is to stimulate further discussion around the development of a global MDS for smoking cessation services. Our example MDS comprises three sections. The first is a set of data items characterizing treatments offered by a service. The second is a small core set of data items describing clients' characteristics, engagement with the service and outcomes. The third is an extended set of client data items to be captured in addition to the core data items wherever resources permit. There would be benefit in establishing a minimum data set (MDS) to standardize data captured for smoking cessation services globally. Once implemented, a formal MDS could provide a basis for meaningful evaluations of different smoking cessation treatments in different populations in a variety of settings across many countries. © 2017 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  11. Multi-site evaluation of the JULES land surface model using global and local data

    Directory of Open Access Journals (Sweden)

    D. Slevin

    2015-02-01

    Full Text Available This study evaluates the ability of the JULES land surface model (LSM) to simulate photosynthesis using local and global data sets at 12 FLUXNET sites. Model parameters include site-specific (local) values for each flux tower site and the default parameters used in the Hadley Centre Global Environmental Model (HadGEM) climate model. Firstly, gross primary productivity (GPP) estimates from driving JULES with data derived from local site measurements were compared to observations from the FLUXNET network. When using local data, the model is biased, with total annual GPP underestimated by 16% across all sites compared to observations. Secondly, GPP estimates from driving JULES with data derived from global parameter and atmospheric reanalysis (on scales of 100 km or so) were compared to FLUXNET observations. It was found that model performance decreases further, with total annual GPP underestimated by 30% across all sites compared to observations. When JULES was driven using local parameters and global meteorological data, it was shown that global data could be used in place of FLUXNET data with a 7% reduction in total annual simulated GPP. Thirdly, the global meteorological data sets, WFDEI and PRINCETON, were compared to local data to find that the WFDEI data set more closely matches the local meteorological measurements (FLUXNET). Finally, the JULES phenology model was tested by comparing results from simulations using the default phenology model to those forced with the remote sensing product MODIS leaf area index (LAI). Forcing the model with daily satellite LAI results in only small improvements in predicted GPP at a small number of sites, compared to using the default phenology model.

  12. Chemical Kinetics and Photochemical Data for Use in Atmospheric Studies Evaluation Number 16. Supplement to Evaluation 15: Update of Key Reactions

    Science.gov (United States)

    Sander, S. P.; Friedl, R. R.; Barker, J. R.; Golden, D. M.; Kurylo, M. J.; Wine, P. H.; Abbatt, J.; Burkholder, J. B.; Kolb, C. E.; Moortgat, G. K.; et al.

    2009-01-01

    This is the supplement to the fifteenth in a series of evaluated sets of rate constants and photochemical cross sections compiled by the NASA Panel for Data Evaluation. The data are used primarily to model stratospheric and upper tropospheric processes, with particular emphasis on the ozone layer and its possible perturbation by anthropogenic and natural phenomena. Copies of this evaluation are available in electronic form and may be printed from the following Internet URL: http://jpldataeval.jpl.nasa.gov/.

  13. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    Science.gov (United States)

    Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.

    2009-12-01

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input; therefore, the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains

  14. RIPL-Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Capote, R.; Herman, M.; Oblozinsky, P.; Young, P.G.; Goriely, S.; Belgya, T.; Ignatyuk, A.V.; Koning, A.J.; Hilaire, S.; Plujko, V.A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M.B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V.M.; Reffo, G.; Sin, M.; Soukhovitskii, E.Sh.; Talou, P.

    2009-12-01

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input; therefore, the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and {gamma}-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains

  15. Summary of results from the National Renewable Energy Laboratory's vehicle evaluation data collection efforts

    Energy Technology Data Exchange (ETDEWEB)

    Whalen, P.; Kelly, K.; Motta, R.; Broderick, J.

    1996-05-01

    The U.S. DOE National Renewable Energy Laboratory conducted a data collection project for light-duty alternative fuel vehicles (AFVs) for about 4 years. The project has collected data on 10 vehicle models (from the original equipment manufacturers) spanning model years 1991 through 1995. Emissions data have also been collected from a number of vehicles converted to natural gas (CNG) and liquefied petroleum gas (LPG). Most of the vehicles involved in the data collection and evaluation are part of the General Services Administration's fleet of AFVs. This evaluation effort addressed the performance and reliability, fuel economy, and emissions of light-duty AFVs, with comparisons to similar gasoline vehicles when possible. Driver-reported complaints and unscheduled vehicle repairs were used to assess the performance and reliability of the AFVs compared to the comparable gasoline vehicles. Two sources of fuel economy were available, one from testing of vehicles on a chassis dynamometer, and the other from records of in-service fuel use. This report includes results from emissions testing completed on 169 AFVs and 161 gasoline control vehicles.

  16. [Evaluation of health promotion programs using health checkup data and medical receipts].

    Science.gov (United States)

    Mizushima, Shunsaku; Morikawa, Nozomi; Fujii, Hitoshi; Yokoyama, Tetsuji

    2012-01-01

    In a research project on the evaluation of health promotion programs using a health dataset including health checkup data and medical receipts, which was supported by a research grant from the Ministry of Health, Labour and Welfare, the potential preventive factors that influence disease severity and medical expenditure were studied. A dataset from a city with a population of 36,544 and a large elderly population (31.1%) was studied using 12 elementary school geographical areas as the basis of classification. A dataset from a company of 4,780 employees was analyzed to determine changes in health checkup data related to metabolic syndrome over 3 years. A simulation model was developed to identify factors that are important for reducing the incidence of complications such as ischemic heart disease and stroke. Geographical analyses showed a negative association between the rate of participation in health checkups and the prevalence of noncommunicable diseases. The percentage of new patients who started pharmaceutical treatment was lower in the health-advice-intervention group (5.2%) than in the nonintervention group (29.6%). The simulation model was built using data on health checkup participation, the percentage of those who undertook a lifestyle-modification program, and compliance with medical treatment, among others. Various health data, including health checkup participation, the percentage of those who undertook a lifestyle-modification program, and compliance with medical treatment over 3 years, were helpful in evaluating health promotion programs.

  17. TRAPLINE: a standardized and automated pipeline for RNA sequencing data analysis, evaluation and annotation.

    Science.gov (United States)

    Wolfien, Markus; Rimmbach, Christian; Schmitz, Ulf; Jung, Julia Jeannine; Krebs, Stefan; Steinhoff, Gustav; David, Robert; Wolkenhauer, Olaf

    2016-01-06

    Technical advances in Next Generation Sequencing (NGS) provide a means to acquire deeper insights into cellular functions. The lack of standardized and automated methodologies poses a challenge for the analysis and interpretation of RNA sequencing data. We critically compare and evaluate state-of-the-art bioinformatics approaches and present a workflow that integrates the best performing data analysis, data evaluation and annotation methods in a Transparent, Reproducible and Automated PipeLINE (TRAPLINE) for RNA sequencing data processing (suitable for Illumina, SOLiD and Solexa). Comparative transcriptomics analyses with TRAPLINE result in a set of differentially expressed genes, their corresponding protein-protein interactions, splice variants, promoter activity, predicted miRNA-target interactions and files for single nucleotide polymorphism (SNP) calling. The obtained results are combined into a single file for downstream analysis such as network construction. We demonstrate the value of the proposed pipeline by characterizing the transcriptome of our recently described stem cell derived antibiotic selected cardiac bodies ('aCaBs'). TRAPLINE supports NGS-based research by providing a workflow that requires no bioinformatics skills, decreases the processing time of the analysis and works in the cloud. The pipeline is implemented in the biomedical research platform Galaxy and is freely accessible via www.sbi.uni-rostock.de/RNAseqTRAPLINE or the specific Galaxy manual page (https://usegalaxy.org/u/mwolfien/p/trapline---manual).

  18. Preliminary Evaluation of the SMAP Radiometer Soil Moisture Product over China Using In Situ Data

    Directory of Open Access Journals (Sweden)

    Yayong Sun

    2017-03-01

    Full Text Available The Soil Moisture Active Passive (SMAP) satellite makes coincident global measurements of soil moisture using an L-band radar instrument and an L-band radiometer. It is crucial to evaluate the errors in the newest L-band SMAP satellite-derived soil moisture products before they are routinely used in scientific research and applications. This study represents the first evaluation of the SMAP radiometer soil moisture product over China. In this paper, a preliminary evaluation was performed using sparse in situ measurements from 655 China Meteorological Administration (CMA) monitoring stations between 1 April 2015 and 31 August 2016. The SMAP radiometer-derived soil moisture product was evaluated against two schemes of original soil moisture and the soil moisture anomaly in different geographical zones and land cover types. Four performance metrics, i.e., bias, root mean square error (RMSE), unbiased root mean square error (ubRMSE), and the correlation coefficient (R), were used in the accuracy evaluation. The results indicated that the SMAP radiometer-derived soil moisture product agreed relatively well with the in situ measurements, with ubRMSE values of 0.058 cm³·cm⁻³ and 0.039 cm³·cm⁻³ based on original data and anomaly data, respectively. The values of the SMAP radiometer-based soil moisture product were overestimated in wet areas, especially in the Southwest China, South China, Southeast China, East China, and Central China zones. The accuracies over croplands and in Northeast China were the worst. Soil moisture, surface roughness, and vegetation are crucial factors contributing to the error in the soil moisture product. Moreover, radio frequency interference contributes to the overestimation over the northern portion of the East China zone. This study provides guidelines for the application of the SMAP-derived soil moisture product in China and acts as a reference for improving the retrieval algorithm.
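
    The four metrics named above have standard definitions (ubRMSE is the RMSE with the mean bias removed), so a compact Python sketch with made-up sample values may help; nothing here is specific to the SMAP product.

```python
import numpy as np

def evaluate(sim, obs):
    """bias, RMSE, ubRMSE, and Pearson R between retrieved and in situ
    soil moisture (same units, e.g. cm^3/cm^3)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    diff = sim - obs
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    ubrmse = np.sqrt(rmse ** 2 - bias ** 2)   # RMSE after removing the mean bias
    r = np.corrcoef(sim, obs)[0, 1]
    return bias, rmse, ubrmse, r

retrieved = [0.21, 0.25, 0.30, 0.27, 0.33]    # made-up sample values
in_situ   = [0.18, 0.24, 0.31, 0.24, 0.30]
print(evaluate(retrieved, in_situ))
```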

  19. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    Energy Technology Data Exchange (ETDEWEB)

    Shreekanth Mandayam; Robi Polikar; John C. Chen

    2005-02-01

    The objectives of this research project are: (1) To design sensor data fusion algorithms that can synergistically combine defect related information from heterogeneous sensors used in gas pipeline inspection for reliably and accurately predicting the condition of the pipe-wall. (2) To develop efficient data management techniques for signals obtained during multisensor interrogation of a gas pipeline. During this reporting period, Rowan University designed, developed and exercised multisensor data fusion algorithms for identifying defect related information present in magnetic flux leakage, ultrasonic testing, thermal imaging and acoustic emission nondestructive evaluation signatures of a test-specimen suite representative of benign and anomalous indications in gas transmission pipelines. Specifically, the algorithms presented in the earlier reports were augmented to predict information related to defect depth (severity).

  20. Use of simulated data sets to evaluate the fidelity of metagenomic processing methods

    Energy Technology Data Exchange (ETDEWEB)

    Mavromatis, Konstantinos; Ivanova, Natalia; Barry, Kerri; Shapiro, Harris; Goltsman, Eugene; McHardy, Alice C.; Rigoutsos, Isidore; Salamov, Asaf; Korzeniewski, Frank; Land, Miriam; Lapidus, Alla; Grigoriev, Igor; Richardson, Paul; Hugenholtz, Philip; Kyrpides, Nikos C.

    2006-12-01

    Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (blast hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.

  1. Data evaluation techniques used for groundwater quality assessment at the Feed Materials Production Center

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, J.E.; Longmire, P.K.

    1990-01-01

    The Feed Materials Production Center has implemented a monitoring program which includes over 300 wells and piezometers to assess the impact of its operations on the ground water. Large volumes of monitoring data are being collected in support of a Remedial Investigation and Feasibility Study, a Resource Conservation and Recovery Act ground water quality assessment program, and an underground storage tank investigation. This program aims to establish background or upgradient ground water constituent concentrations, identify the presence and amount of contamination, determine the migration rate and extent of any contamination found, develop and calibrate hydrological and solute transport models, and track the progress of cleanup activities. This paper addresses the methodologies used for evaluation of the data generated by this program. A discussion will be provided on the decision making process utilized for selecting the appropriate statistical procedures, and the progress made in analysis of the ground water data. (MHB)

  2. Evaluation of Methods in Removing Batch Effects on RNA-seq Data

    Directory of Open Access Journals (Sweden)

    Qian Liu

    2016-04-01

    Full Text Available It is common and advantageous for researchers to combine RNA-seq data from similar studies to increase statistical power in genomics analysis. However, unwanted noise and hidden artifacts such as batch effects can dramatically reduce the accuracy of statistical inference. The performance of three different methods, SVA, ComBat, and PCA, for correcting batch effects in RNA-seq data is evaluated. Two simulated data sets were generated to mimic real data in a common RNA-seq experiment. The results show that the SVA method has the best performance, while the ComBat method over-corrects the batch effect. Most importantly, a carefully designed experiment that optimizes the even distribution of samples across batches can minimize the confounding or correlation between batches and thus lead to unbiased results.

  3. Reasons for 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Escher, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hoffman, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Luu, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ormand, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Summers, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Thompson, I. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-22

    LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011 release of the Evaluated Nuclear Data Library (ENDL2011). ENDL2011 is designed to support LLNL's current and future nuclear data needs. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles, surpassing ENDL2009.0 [1]. The ENDL2011 release [2] contains 918 transport-ready evaluations in the neutron sub-library alone. ENDL2011 was assembled with strong support from the ASC program, leveraged with support from NNSA science campaigns and the DOE/Office of Science US Nuclear Data Program.

  4. Evaluation of Simmental carcass EPD estimated using live and carcass data.

    Science.gov (United States)

    Crews, D H; Pollak, E J; Quaas, R L

    2004-03-01

    This study was conducted to compare carcass EPD predicted using yearling live animal data and/or progeny carcass data, and to quantify the association between the carcass phenotype of progeny and the sire EPD. The live data model (L) included scan weight, ultrasound fat thickness, longissimus muscle area, and percentage of intramuscular fat from yearling (369 d of age) Simmental bulls and heifers. The carcass data model (C) included hot carcass weight, fat thickness, longissimus muscle area, and marbling score from Simmental-sired steers and cull heifers (453 d of age). The combined data model (F) included live animal and carcass data as separate but correlated traits. All data and pedigree information on 39,566 animals were obtained from the American Simmental Association, and all EPD were predicted using animal model procedures. The genetic model included fixed effects of contemporary group and a linear covariate for age at measurement, and a random animal genetic effect. The EPD from L had smaller variance and range than those from either C or F. Further, EPD from F had the highest average accuracy. Correlations indicated that evaluations from C and F were most similar, and L would significantly (P < 0.05) re-rank sires. The association between progeny phenotype and sire EPD was quantified by regressing progeny phenotype on sire EPD using a model including contemporary group, and linear regressions for age at slaughter and the appropriate sire EPD. The regression coefficient was generally improved for sire EPD from L when genetic regression was used to scale EPD to the appropriate carcass trait basis. The EPD from C and F had similar linear associations with progeny phenotype, although EPD from F may be considered optimal because of increased accuracy. These data suggest that carcass EPD based on a combination of live and carcass data predict differences in progeny phenotype at or near theoretical expectation.

  5. An evaluation of a data linkage training workshop for research ethics committees.

    Science.gov (United States)

    Tan, Kate M; Flack, Felicity S; Bear, Natasha L; Allen, Judy A

    2015-03-04

    In Australia, research projects proposing the use of linked data require approval by a Human Research Ethics Committee (HREC). A sound evaluation of the ethical issues involved requires understanding of the basic mechanics of data linkage, the associated benefits and risks, and the legal context in which it occurs. The rapidly increasing number of research projects utilising linked data in Australia has led to an urgent need for enhanced capacity of HRECs to review research applications involving this emerging research methodology. The training described in this article was designed to respond to an identified need among the data linkage units in the Australian Population Health Research Network (PHRN) and HREC members in Australia. Five one-day face-to-face workshops were delivered in the study period to a total of 98 participants. Participants in the workshops represented all six categories of HREC membership composition listed in the National Health and Medical Research Council's (NHMRC) National Statement on Ethical Conduct in Human Research. Participants were assessed at three time points: prior to the training (T1), immediately after the training (T2), and 8 to 17 months after the training (T3). Ninety participants completed the pre and post questionnaires; 58 of them completed the deferred questionnaire. Participants reported significant improvements in levels of knowledge, understanding and skills in each of the eight areas evaluated. The training was beneficial for those with prior experience in the area of ethics and data linkage as well as those with no prior exposure. Our preliminary work in this area demonstrates that the provision of intensive face-to-face ethics training in data linkage is feasible and has a significant impact on participants' confidence in reviewing HREC applications.

  6. SU-E-I-92: Accuracy Evaluation of Depth Data in Microsoft Kinect.

    Science.gov (United States)

    Kozono, K; Aoki, M; Ono, M; Kamikawa, Y; Arimura, H; Toyofuku, F

    2012-06-01

    Microsoft Kinect has potential for use in real-time patient position monitoring in diagnostic radiology and radiotherapy. We evaluated the accuracy of depth image data and the device-to-device variation in various conditions simulating clinical applications in a hospital. The Kinect sensor consists of an infrared depth camera and an RGB camera. We developed a computer program using OpenNI and OpenCV for measuring quantitative distance data. The program displays the depth image obtained from the Kinect sensor on the screen, and the Cartesian coordinates at an arbitrary point selected by mouse-clicking can be measured. A rectangular box without luster (300 × 198 × 50 mm³) was used as the measuring object. The object was placed on the floor at various distances ranging from 0 to 400 cm in increments of 10 cm from the sensor, and depth data were measured for 10 points on the planar surface of the box. The measured distance data were calibrated by using the least squares method. The device-to-device variations were evaluated using five Kinect sensors. There was an almost linear relationship between true and measured values. The Kinect sensor was unable to measure at distances of less than 50 cm from the sensor. It was found that distance data calibration was necessary for each sensor. The device-to-device variation error for five Kinect sensors was within 0.46% at distances ranging from 50 cm to 2 m from the sensor. The maximum deviation of the distance data after calibration was 1.1 mm at distances from 50 to 150 cm. The overall average error of five Kinect sensors was 0.18 mm at the distance range of 50 to 150 cm. The Kinect sensor has a distance accuracy of about 1 mm if each device is properly calibrated. This sensor will be usable for positioning of patients in diagnostic radiology and radiotherapy. © 2012 American Association of Physicists in Medicine.
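
    The calibration step described above is an ordinary least-squares straight-line fit of measured depth against true distance. A minimal sketch with hypothetical readings follows (the numbers are illustrative, not the paper's data).

```python
import numpy as np

# Hypothetical raw sensor readings paired with known true distances (mm)
true_mm = np.array([500.0, 1000.0, 1500.0, 2000.0, 2500.0, 3000.0])
raw_mm  = np.array([512.0, 1018.0, 1531.0, 2046.0, 2562.0, 3075.0])

# Straight-line least-squares fit mapping raw readings to true distance
a, b = np.polyfit(raw_mm, true_mm, 1)
corrected = a * raw_mm + b
print(np.abs(corrected - true_mm).max())   # worst residual after calibration
```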

  7. Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data

    Directory of Open Access Journals (Sweden)

    Tintle Nathan L

    2012-08-01

    Full Text Available Background: Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. Results: We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Conclusions: Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and the utility of expression data.

  8. Design and evaluation of a NoSQL database for storing and querying RDF data

    Directory of Open Access Journals (Sweden)

    Kanda Runapongsa Saikaew

    2014-12-01

    Full Text Available The amount of web data has grown enormously, and its metadata is widely used in order to fully exploit web information resources. This creates the need for Semantic Web technology to quickly analyze such big data. Resource Description Framework (RDF) is a standard for describing web resources. In this paper, we propose a method to exploit a NoSQL database, specifically MongoDB, to store and query RDF data. We chose MongoDB to represent NoSQL databases because it is one of the most popular high-performance NoSQL databases. We evaluate the proposed design and implementation by using the Berlin SPARQL Benchmark, which is one of the most widely accepted benchmarks for comparing the performance of RDF storage systems. We compare three database systems: Apache Jena TDB (a native RDF store), MySQL (a relational database), and our proposed system with MongoDB (a NoSQL database). Based on the experimental results, our proposed system outperforms the other database systems for most queries when the data set size is small. However, for a larger data set, MongoDB performs well for queries with simple operators while MySQL offers an efficient solution for complex queries. The results of this work can provide guidelines for choosing an appropriate RDF database system and for applying a NoSQL database to storing and querying RDF data.
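
    The abstract does not give the document schema the authors used, but a natural MongoDB layout stores one document per (subject, predicate, object) triple and answers triple-pattern queries with an index. The database name and URIs below are assumptions for illustration only.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed local instance
triples = client.rdf_demo.triples                   # database/collection names are ours

# One document per RDF statement (subject, predicate, object)
triples.insert_one({
    "s": "http://example.org/book1",
    "p": "http://purl.org/dc/elements/1.1/title",
    "o": "Semantic Web Primer",
})
triples.create_index([("s", 1), ("p", 1)])          # speeds up triple-pattern lookups

# Triple-pattern query: all titles of book1
for doc in triples.find({"s": "http://example.org/book1",
                         "p": "http://purl.org/dc/elements/1.1/title"}):
    print(doc["o"])
```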

  9. EyeTribe Tracker Data Accuracy Evaluation and Its Interconnection with Hypothesis Software for Cartographic Purposes

    Directory of Open Access Journals (Sweden)

    Stanislav Popelka

    2016-01-01

    Full Text Available The mixed research design is a progressive methodological discourse that combines the advantages of quantitative and qualitative methods. Its possibilities of application are, however, dependent on the efficiency with which the particular research techniques are used and combined. The aim of the paper is to introduce the possible combination of Hypothesis with EyeTribe tracker. The Hypothesis is intended for quantitative data acquisition and the EyeTribe is intended for qualitative (eye-tracking) data recording. In the first part of the paper, Hypothesis software is described. The Hypothesis platform provides an environment for web-based computerized experiment design and mass data collection. Then, evaluation of the accuracy of data recorded by EyeTribe tracker was performed with the use of concurrent recording together with the SMI RED 250 eye-tracker. Both qualitative and quantitative results showed that data accuracy is sufficient for cartographic research. In the third part of the paper, a system for connecting EyeTribe tracker and Hypothesis software is presented. The interconnection was performed with the help of developed web application HypOgama. The created system uses open-source software OGAMA for recording the eye-movements of participants together with quantitative data from Hypothesis. The final part of the paper describes the integrated research system combining Hypothesis and EyeTribe.

  10. EyeTribe Tracker Data Accuracy Evaluation and Its Interconnection with Hypothesis Software for Cartographic Purposes.

    Science.gov (United States)

    Popelka, Stanislav; Stachoň, Zdeněk; Šašinka, Čeněk; Doležalová, Jitka

    2016-01-01

    The mixed research design is a progressive methodological discourse that combines the advantages of quantitative and qualitative methods. Its possibilities of application are, however, dependent on the efficiency with which the particular research techniques are used and combined. The aim of the paper is to introduce the possible combination of Hypothesis with EyeTribe tracker. The Hypothesis is intended for quantitative data acquisition and the EyeTribe is intended for qualitative (eye-tracking) data recording. In the first part of the paper, Hypothesis software is described. The Hypothesis platform provides an environment for web-based computerized experiment design and mass data collection. Then, evaluation of the accuracy of data recorded by EyeTribe tracker was performed with the use of concurrent recording together with the SMI RED 250 eye-tracker. Both qualitative and quantitative results showed that data accuracy is sufficient for cartographic research. In the third part of the paper, a system for connecting EyeTribe tracker and Hypothesis software is presented. The interconnection was performed with the help of developed web application HypOgama. The created system uses open-source software OGAMA for recording the eye-movements of participants together with quantitative data from Hypothesis. The final part of the paper describes the integrated research system combining Hypothesis and EyeTribe.

  11. Urban Land Extraction Using VIIRS Nighttime Light Data: An Evaluation of Three Popular Methods

    Directory of Open Access Journals (Sweden)

    Yinyin Dou

    2017-02-01

    Full Text Available Timely and accurate extraction of urban land area using the Suomi National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite (VIIRS) nighttime light data is important for urban studies. However, a comprehensive assessment of the existing methods for extracting urban land using VIIRS nighttime light data remains inadequate. Therefore, we first reviewed the relevant methods and selected three popular methods for extracting urban land area using nighttime light data. These methods included local-optimized thresholding (LOT), the vegetation-adjusted nighttime light urban index (VANUI), and integrated nighttime lights, normalized difference vegetation index, and land surface temperature support vector machine classification (INNL-SVM). Then, we assessed the performance of these methods for extracting urban land area based on the VIIRS nighttime light data in seven evaluation areas with various natural and socioeconomic conditions in China. We found that INNL-SVM had the best performance, with an average kappa of 0.80, which was 6.67% higher than the LOT and 2.56% higher than the VANUI. The superior performance of INNL-SVM was mainly attributed to the integration of information on nighttime light, vegetation cover, and land surface temperature. This integration effectively reduced the commission and omission errors arising from the overflow effect and low light brightness of the VIIRS nighttime light data. Additionally, INNL-SVM can extract urban land area more easily. Thus, we suggest that INNL-SVM has great potential for effectively extracting urban land with VIIRS nighttime light data at large scales.
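
    INNL-SVM, as described, feeds three per-pixel features (nighttime light, NDVI, land surface temperature) to a support vector machine. The sketch below illustrates that idea on synthetic feature values; it is not the authors' trained model, and all numbers are invented.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 500   # hypothetical training pixels per class

# Synthetic per-pixel features standing in for the three inputs INNL-SVM
# integrates: nighttime light (NTL), NDVI, and land surface temperature (LST)
urban = np.column_stack([rng.normal(40, 8, n), rng.normal(0.2, 0.05, n), rng.normal(305, 2, n)])
rural = np.column_stack([rng.normal(5, 3, n), rng.normal(0.6, 0.10, n), rng.normal(298, 2, n)])
X = np.vstack([urban, rural])
y = np.r_[np.ones(n), np.zeros(n)]                   # 1 = urban, 0 = non-urban

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([[35.0, 0.25, 304.0]]))            # -> [1.] (urban-like pixel)
```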

  12. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Data Loading Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory. This report is intended to assist the user in entering PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Toward this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 6.0 and Version 7.0. In general, the data transfer procedures for versions 6 and 7 are the same, but where deviations exist, the differences are noted. The guidance specified in this document will give a user sufficient knowledge both to understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  13. Evaluation of IRS-1C LISS-3 satellite data for Norway spruce defoliation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Falkenstroem, H.

    1999-02-01

    Satellite-based remote sensing, supported by air photo and field surveys, provides a means of area-covering forest health assessment on a regional scale. Landsat TM data has been extensively used in studies of spruce and fir defoliation in Europe and North America. The temporal coverage of Landsat TM in combination with cloudiness, however, restricts the availability of data. In this study the LISS-3 sensor onboard the Indian Remote Sensing satellite IRS-1C was evaluated for defoliation assessment in Norway spruce (Picea abies) in the central part of Sweden. The near-infrared wavelength band proved to be best correlated with mean stand defoliation. After normalisation of satellite data for topographic conditions, the correlation coefficient increased from -0.19 to -0.83. Normalising satellite data for species composition did not improve the results, though. The correction coefficients involved in the procedure were originally developed for Landsat TM and proved to be inadequate for the LISS-3 data set. A thorough examination of the effects of species composition on LISS-3 data is needed to yield better results. The correlation between observed defoliation in the verification stands and predicted defoliation (based on the inverse regression function between corrected NIR values and defoliation in reference stands) was 0.70, despite a very limited range of defoliation in the verification set. IRS-1C LISS-3 is fully comparable to Landsat TM for spruce defoliation studies, although the results would probably not be significantly improved. 49 refs, 7 figs, 10 tabs

  14. Results of Formal Evaluation of a Data and Modeling Driven Hydrology Learning Module

    Science.gov (United States)

    Ruddell, B. L.; Sanchez, C. A.; Schiesser, R.; Merwade, V.

    2014-12-01

    New hydrologists should not only develop a well-defined knowledge base of basic hydrological concepts, but also synthesize this factual learning with more authentic 'real-world' knowledge gained from the interpretation and analysis of data from hydrological models (Merwade and Ruddell, 2012; Wagener et al., 2007). However, hydrological instruction is often implemented using a traditional teacher-centered approach (e.g., lectures) (Wagener, 2007). The emergence of rich and dynamic computer simulation techniques allows students the opportunity for more authentic application of knowledge (Merwade & Ruddell, 2012). This study evaluates the efficacy of using such data-driven simulations to increase understanding of the field of hydrology in the lower-division undergraduate geoscience classroom. In this study, 88 students at a local community college who were enrolled in an Introductory Earth Science class were evaluated on their learning performance in a unit on applying the Rational Method to estimate hydrographs and flooding for urban areas. Students were either presented with a data and visualization rich computer module (n=52), or with paper and pencil calculation activities (n=36). All conceptual material presented in lecture was consistent across these two conditions. Students were evaluated not only for changes in their knowledge and application of the concepts within the unit (e.g., effects of urbanization and impervious cover, discharge rates), but also for their broad "T-shaped" profile of professional knowledge and skills. While results showed significant (p < 0.05) gains in learning areas for both groups, there was a significantly larger benefit for the data module group when it came to (1) understanding the effects of urbanization and impervious cover on flooding, (2) applying consistent vocabulary appropriately within context, and (3) explaining the roles and responsibilities of hydrologists and flood managers.
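
    The unit's central formula, the Rational Method, estimates peak discharge as Q = C·i·A. A one-line worked example follows; the catchment values are invented for illustration.

```python
# Rational Method peak discharge, Q = C * i * A, in US customary units:
# C  runoff coefficient (dimensionless, higher for impervious cover)
# i  rainfall intensity [in/hr] for the design storm
# A  drainage area [acres]; the unit-conversion factor (~1.008) is
#    conventionally taken as 1, so Q comes out in cubic feet per second [cfs]
C, i, A = 0.85, 2.0, 40.0        # hypothetical urbanized catchment
Q = C * i * A
print(Q)                         # 68.0 cfs peak discharge
```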

  15. Innovation in urban agriculture: Evaluation data of a participatory approach (ROIR)

    Directory of Open Access Journals (Sweden)

    Felix Zoll

    2016-06-01

    The data in this article represent an evaluation of a participatory process called Regional Open Innovation Roadmapping (ROIR). The approach aims at the promotion of regional development. In this case, it was carried out to develop a specific innovation in the field of ‘Zero-acreage farming’ (ZFarming), which is a building-related subtype of urban agriculture. For the evaluation of the process, an online survey was sent to the 58 participants of the ROIR on March 4, 2014. The survey ended on April 8, 2014, and a response rate of 53.54% resulted in a sample size of 31 respondents. The survey was divided into seven different blocks. We analyzed the ROIR process's contribution to knowledge generation, the establishment of networks among the participants, the implementation of new projects related to ZFarming, and the increase of acceptance of ZFarming and the selected ZFarming innovation. Furthermore, other remarks and personal information were collected. Hence, the objective of the survey was to assess whether ROIR is a useful tool to promote the aforementioned innovation drivers, and thereby the selected innovation, which was developed throughout the process. The data were used in the research article “Application and evaluation of a participatory “open innovation” approach (ROIR): the case of introducing zero-acreage farming in Berlin” (Specht et al., 2016) [1].

  16. Innovation in urban agriculture: Evaluation data of a participatory approach (ROIR).

    Science.gov (United States)

    Zoll, Felix; Specht, Kathrin; Siebert, Rosemarie

    2016-06-01

    The data in this article represent an evaluation of a participatory process called Regional Open Innovation Roadmapping (ROIR). The approach aims at the promotion of regional development. In this case, it was carried out to develop a specific innovation in the field of 'Zero-acreage farming' (ZFarming), which is a building-related subtype of urban agriculture. For the evaluation of the process, an online survey was sent to the 58 participants of the ROIR on March 4, 2014. The survey ended on April 8, 2014, and a response rate of 53.54% resulted in a sample size of 31 respondents. The survey was divided into seven different blocks. We analyzed the ROIR process's contribution to knowledge generation, the establishment of networks among the participants, the implementation of new projects related to ZFarming, and the increase of acceptance of ZFarming and the selected ZFarming innovation. Furthermore, other remarks and personal information were collected. Hence, the objective of the survey was to assess whether ROIR is a useful tool to promote the aforementioned innovation drivers, and thereby the selected innovation, which was developed throughout the process. The data were used in the research article "Application and evaluation of a participatory "open innovation" approach (ROIR): the case of introducing zero-acreage farming in Berlin" (Specht et al., 2016) [1].

  17. The Operating Characteristics of the Nonparametric Levene Test for Equal Variances with Assessment and Evaluation Data

    Directory of Open Access Journals (Sweden)

    David W. Nordstokke

    2011-02-01

    Many assessment and evaluation studies use statistical hypothesis tests, such as the independent-samples t test or analysis of variance, to test the equality of two or more means for gender, age group, culture, or language group comparisons. In addition, some, but far fewer, studies compare variability across these same groups or research conditions. Tests of the equality of variances can be used on their own for this purpose, but they are most often used alongside other methods to support assumptions made about variances. This is often done so that variances can be pooled across groups to yield an estimate of variance that is used in the standard error of the statistic in question. The purposes of this paper are twofold. The first purpose is to describe a new nonparametric Levene test for equal variances that can be used with widely available statistical software such as SPSS or SAS, and the second purpose is to investigate this test's operating characteristics, Type I error and statistical power, with real assessment and evaluation data. To date, the operating characteristics of the nonparametric Levene test have been studied with mathematical distributions in computer experiments and, although that information is valuable, this study is an important next step in documenting both the level of non-normality (skewness and kurtosis) of real assessment and evaluation data and how this new statistical test operates under these conditions.
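
    To make the procedure concrete, here is a minimal sketch, on synthetic data, of the rank-based test the article studies: pool the observations, replace them with ranks, and run the usual Levene-type ANOVA on absolute deviations of the ranks from each group's mean rank.

```python
# Nonparametric (rank-based) Levene test sketch on synthetic data.
import numpy as np
from scipy import stats

def nonparametric_levene(*groups):
    pooled = np.concatenate(groups)
    ranks = stats.rankdata(pooled)           # rank the pooled sample
    out, start = [], 0
    for g in groups:                         # split ranks back into groups
        r = ranks[start:start + len(g)]
        out.append(np.abs(r - r.mean()))     # absolute deviations from group mean rank
        start += len(g)
    return stats.f_oneway(*out)              # one-way ANOVA on the deviations

rng = np.random.default_rng(1)
a = rng.normal(0, 1, 40)
b = rng.normal(0, 2, 40)                     # deliberately larger variance
print(nonparametric_levene(a, b))            # small p-value flags unequal variances
```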

  18. Evaluation of historical and analytical data on the TAN TSF-07 Disposal Pond

    Energy Technology Data Exchange (ETDEWEB)

    Medina, S.M.

    1993-07-01

    The Technical Support Facility (TSF)-07 Disposal Pond, located at Test Area North at the Idaho National Engineering Laboratory, has been identified as part of Operable Unit 1-06 under the Comprehensive Environmental Response, Compensation, and Liability Act. The Environmental Restoration and Waste Management Department is conducting an evaluation of existing site characterization data for the TSF-07 Disposal Pond Track 1 investigation. The results from the site characterization data will be used to determine whether the operable unit will undergo a Track 2 investigation, an interim action, a remedial investigation/feasibility study, or result in a no-action decision. This report summarizes activities relevant to wastewaters discharged to the pond and characterization efforts conducted from 1982 through 1991. Plan view and vertical distribution maps of the significant contaminants contained in the pond are included. From this evaluation it was determined that cobalt-60, cesium-137, americium-241, mercury, chromium, and thallium are significant contaminants for soils. This report also evaluates the migration tendencies of the significant contaminants into the perched water zone under the pond and the surrounding terrain to support the investigation.

  19. Statistical evaluation of photon count rate data for nanoscale particle measurement in wastewaters.

    Science.gov (United States)

    Smeraldi, Josh; Ganesh, Rajagopalan; Safarik, Jana; Rosso, Diego

    2012-01-01

    The dynamic light scattering (DLS) technique can detect the concentration and size distribution of nanoscale particles in aqueous solutions by analyzing photon interactions. This study evaluated the applicability of using photon count rate data from DLS analyses for measuring levels of biogenic and manufactured nanoscale particles in wastewater. Statistical evaluations were performed using secondary wastewater effluent and a Malvern Zetasizer. Dynamic light scattering analyses were performed equally by two analysts over a period of two days using five dilutions and twelve replicates for each dilution. Linearity evaluation using the sixty-sample analysis yielded a coefficient of determination R² = 0.959. The accuracy analysis for various dilutions indicated a recovery of 100 ± 6%. Precision analyses indicated low variance coefficients for the impact of analysts, days, and within-sample error. The variation by analysts was apparent only in the most diluted sample (intermediate precision ~12%), where the photon count rate was close to the instrument detection limit. The variation between days was apparent in the two most concentrated samples, which indicated that wastewater samples must be analyzed for nanoscale particle measurement within the same day of collection. Upon addition of 10 mg l⁻¹ of nanosilica to wastewater effluent samples, the measured photon count rates were within 5% of the estimated values. The results indicated that photon count rate data can effectively complement the various techniques currently available to detect nanoscale particles in wastewaters.
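
    For illustration, the sketch below recreates the two headline statistics (linearity R² and percent recovery) on made-up photon count rates for a five-point dilution series with twelve replicates each; the study's raw numbers are not given.

```python
# Linearity and recovery statistics on a synthetic dilution series.
import numpy as np
from scipy import stats

dilution = np.repeat([1.0, 0.5, 0.25, 0.125, 0.0625], 12)       # relative concentration
rng = np.random.default_rng(7)
counts = 400 * dilution * rng.normal(1.0, 0.03, dilution.size)  # kcps, with 3% noise

res = stats.linregress(dilution, counts)
print(f"R^2 = {res.rvalue**2:.3f}")                  # linearity of count rate vs dilution

expected = res.intercept + res.slope * dilution
recovery = 100 * counts / expected                   # per-sample recovery, %
print(f"recovery = {recovery.mean():.0f} +/- {recovery.std(ddof=1):.0f}%")
```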

  20. A note on prognostic accuracy evaluation of regression models applied to longitudinal autocorrelated binary data

    Directory of Open Access Journals (Sweden)

    Giulia Barbati

    2014-11-01

    Background: The focus of this work was on evaluating the prognostic accuracy of two approaches for modelling binary longitudinal outcomes: a Generalized Estimating Equation (GEE) approach and a likelihood-based method, the Marginalized Transition Model (MTM), in which a transition model is combined with a marginal generalized linear model describing the average response as a function of measured predictors. Methods: A retrospective study on cardiovascular patients and a prospective study on sciatic pain were used to evaluate discrimination by computing the Area Under the Receiver-Operating-Characteristic curve (AUC), the Integrated Discrimination Improvement (IDI) and the Net Reclassification Improvement (NRI) at different time occasions. Calibration was also evaluated. A simulation study was run in order to compare the models' performance in a context of perfect knowledge of the data-generating mechanism. Results: Similar regression coefficient estimates and comparable calibration were obtained; a higher discrimination level for MTM was observed. No significant differences in calibration and MSE (Mean Square Error) emerged in the simulation study, which instead confirmed the higher discrimination level of MTM. Conclusions: The choice of the regression approach should depend on the scientific question being addressed, i.e. whether the overall population average and calibration or the subject-specific patterns and discrimination are the objectives of interest; some recently proposed discrimination indices are useful in evaluating predictive accuracy also in the context of longitudinal studies.
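
    For readers unfamiliar with the indices, the sketch below computes two of them (AUC and IDI) for a baseline and an extended logistic model on synthetic cross-sectional data, with the IDI taken as the difference in discrimination slopes. This is a generic illustration, not the paper's longitudinal GEE/MTM code.

```python
# AUC and IDI for two nested logistic models on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * x1 + 0.8 * x2))))

p_old = LogisticRegression().fit(x1[:, None], y).predict_proba(x1[:, None])[:, 1]
X = np.column_stack([x1, x2])
p_new = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]

def slope(p, y):
    # discrimination slope: mean predicted risk in events minus non-events
    return p[y == 1].mean() - p[y == 0].mean()

print("AUC old/new:", roc_auc_score(y, p_old), roc_auc_score(y, p_new))
print("IDI:", slope(p_new, y) - slope(p_old, y))
```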

  1. Safety assessment of mushrooms in dietary supplements by combining analytical data with in silico toxicology evaluation.

    Science.gov (United States)

    VanderMolen, Karen M; Little, Jason G; Sica, Vincent P; El-Elimat, Tamam; Raja, Huzefa A; Oberlies, Nicholas H; Baker, Timothy R; Mahony, Catherine

    2017-05-01

    Despite growing popularity in dietary supplements, many medicinal mushrooms have not been evaluated for their safe human consumption using modern techniques. The multifaceted approach described here relies on five key principles to evaluate the safety of non-culinary fungi for human use: (1) identification by sequencing the nuclear ribosomal internal transcribed spacer (ITS) region (commonly referred to as ITS barcoding), (2) screening an extract of each fungal raw material against a database of known fungal metabolites, (3) comparison of these extracts to those prepared from grocery store-bought culinary mushrooms using UHPLC-PDA-ELS-HRMS, (4) review of the toxicological and chemical literature for each fungus, and (5) evaluation of data establishing presence in the market. This weight-of-evidence approach was used to evaluate seven fungal raw materials and determine safe human use for each. Such an approach may provide an effective alternative to conventional toxicological animal studies, or may more efficiently identify when such studies are necessary, for the safety assessment of fungal dietary ingredients. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Diffraction cartography: applying microbeams to macromolecular crystallography sample evaluation and data collection.

    Science.gov (United States)

    Bowler, Matthew W; Guijarro, Matias; Petitdemange, Sebastien; Baker, Isabel; Svensson, Olof; Burghammer, Manfred; Mueller-Dieckmann, Christoph; Gordon, Elspeth J; Flot, David; McSweeney, Sean M; Leonard, Gordon A

    2010-08-01

    Crystals of biological macromolecules often exhibit considerable inter-crystal and intra-crystal variation in diffraction quality. This requires the evaluation of many samples prior to data collection, a practice that is already widespread in macromolecular crystallography. As structural biologists move towards tackling ever more ambitious projects, new automated methods of sample evaluation will become crucial to the success of many projects, as will the availability of synchrotron-based facilities optimized for high-throughput evaluation of the diffraction characteristics of samples. Here, two examples of the types of advanced sample evaluation that will be required are presented: searching within a sample-containing loop for microcrystals using an X-ray beam of 5 µm diameter and selecting the most ordered regions of relatively large crystals using X-ray beams of 5–50 µm in diameter. A graphical user interface developed to assist with these screening methods is also presented. For the case in which the diffraction quality of a relatively large crystal is probed using a microbeam, the usefulness and implications of mapping diffraction-quality heterogeneity (diffraction cartography) are discussed. The implementation of these techniques in the context of planned upgrades to the ESRF's structural biology beamlines is also presented.
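
    As a conceptual sketch of the mapping idea (not the ESRF beamline software), the code below raster-scans a grid of positions, records a diffraction-quality score at each point, and selects the best-diffracting region; the scoring function is a synthetic stand-in for per-image spot analysis.

```python
# Toy "diffraction cartography": score a grid scan, pick the best region.
import numpy as np

nx, ny = 30, 20                                        # grid steps across the sample loop
ix, iy = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
# Synthetic quality map: pretend diffraction quality peaks near grid point (12, 8).
quality = np.exp(-((ix - 12) ** 2 + (iy - 8) ** 2) / 20.0)

best = np.unravel_index(np.argmax(quality), quality.shape)
print("collect data at grid position", best)           # -> (12, 8)
```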

  3. Use of genomic data in risk assessment case study: I. Evaluation of the dibutyl phthalate male reproductive development toxicity data set.

    Science.gov (United States)

    Makris, Susan L; Euling, Susan Y; Gray, L Earl; Benson, Robert; Foster, Paul

    2013-09-15

    A case study was conducted, using dibutyl phthalate (DBP), to explore an approach to using toxicogenomic data in risk assessment. The toxicity and toxicogenomic data sets relative to DBP-related male reproductive developmental outcomes were considered conjointly to derive information about mode and mechanism of action. In this manuscript, we describe the case study evaluation of the toxicological database for DBP, focusing on identifying the full spectrum of male reproductive developmental effects. The data were assessed to 1) evaluate low dose and low incidence findings and 2) identify male reproductive toxicity endpoints without well-established modes of action (MOAs). These efforts led to the characterization of data gaps and research needs for the toxicity and toxicogenomic studies in a risk assessment context. Further, the identification of endpoints with unexplained MOAs in the toxicity data set was useful in the subsequent evaluation of the mechanistic information that the toxicogenomic data set evaluation could provide. The extensive analysis of the toxicology data set within the MOA context provided a resource of information for DBP in attempts to hypothesize MOAs (for endpoints without a well-established MOA) and to phenotypically anchor toxicogenomic and other mechanistic data both to toxicity endpoints and to available toxicogenomic data. This case study serves as an example of the steps that can be taken to develop a toxicological data source for a risk assessment, both in general and especially for risk assessments that include toxicogenomic data. Published by Elsevier Inc.

  4. Use of genomic data in risk assessment case study: I. Evaluation of the dibutyl phthalate male reproductive development toxicity data set

    Energy Technology Data Exchange (ETDEWEB)

    Makris, Susan L., E-mail: makris.susan@epa.gov [U.S. Environmental Protection Agency, National Center for Environmental Assessment, Office of Research and Development, (Mail code 8623P), 1200 Pennsylvania Ave., NW, Washington, DC 20460 (United States); Euling, Susan Y. [U.S. Environmental Protection Agency, National Center for Environmental Assessment, Office of Research and Development, (Mail code 8623P), 1200 Pennsylvania Ave., NW, Washington, DC 20460 (United States); Gray, L. Earl [U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Office of Research and Development, (MD-72), Highway 54, Research Triangle Park, NC 27711 (United States); Benson, Robert [U.S. Environmental Protection Agency, Region 8, (Mail code 8P-W), 1595 Wynkoop Street, Denver, CO 80202 (United States); Foster, Paul M.D. [National Toxicology Program, National Institute of Environmental Health Sciences, P.O. Box 12233 (MD K2-12), Research Triangle Park, NC 27709 (United States)

    2013-09-15

    A case study was conducted, using dibutyl phthalate (DBP), to explore an approach to using toxicogenomic data in risk assessment. The toxicity and toxicogenomic data sets relative to DBP-related male reproductive developmental outcomes were considered conjointly to derive information about mode and mechanism of action. In this manuscript, we describe the case study evaluation of the toxicological database for DBP, focusing on identifying the full spectrum of male reproductive developmental effects. The data were assessed to 1) evaluate low dose and low incidence findings and 2) identify male reproductive toxicity endpoints without well-established modes of action (MOAs). These efforts led to the characterization of data gaps and research needs for the toxicity and toxicogenomic studies in a risk assessment context. Further, the identification of endpoints with unexplained MOAs in the toxicity data set was useful in the subsequent evaluation of the mechanistic information that the toxicogenomic data set evaluation could provide. The extensive analysis of the toxicology data set within the MOA context provided a resource of information for DBP in attempts to hypothesize MOAs (for endpoints without a well-established MOA) and to phenotypically anchor toxicogenomic and other mechanistic data both to toxicity endpoints and to available toxicogenomic data. This case study serves as an example of the steps that can be taken to develop a toxicological data source for a risk assessment, both in general and especially for risk assessments that include toxicogenomic data.

  5. Evaluating thermodynamic models of enhancer activity on cellular resolution gene expression data.

    Science.gov (United States)

    Samee, Abul Hassan; Sinha, Saurabh

    2013-07-15

    With the advent of high-throughput sequencing and high-resolution transcriptomic technologies, there exists today an unprecedented opportunity to understand gene regulation at a quantitative level. State-of-the-art models of the relationship between regulatory sequence and gene expression have shown great promise, but also suffer from some major shortcomings. In this paper, we identify and address methodological challenges pertaining to quantitative modeling of gene expression from sequence, and test our models on the anterior-posterior patterning system in the Drosophila embryo. We first develop a framework to process cellular-resolution three-dimensional gene expression data from the Drosophila embryo and create data sets on which quantitative models can be trained. Next we propose a new score, called 'weighted pattern generating potential' (w-PGP), to evaluate model predictions, and show its advantages over the two most common scoring schemes in use today. The model building exercise uses w-PGP as the evaluation score and adopts a systematic strategy to increase a model's complexity while guarding against over-fitting. Our model identifies three transcription factors (ZELDA, SLOPPY-PAIRED, and NUBBIN) that have not previously been incorporated in quantitative models of this system as having significant regulatory influence. Finally, we show how fitting quantitative models on data sets comprising a handful of enhancers, as reported in earlier work, may lead to unreliable models. Copyright © 2013. Published by Elsevier Inc.

  6. Evaluation Method for Service Branding Using Word-of-Mouth Data

    Science.gov (United States)

    Shirahada, Kunio; Kosaka, Michitaka

    The development and spread of internet technology gives service firms a high capability for transmitting brand information, as well as for collecting the related customer feedback data. In this paper, we propose a new evaluation method for service branding that uses firms' and consumers' data on the internet. Based on the service marketing 7Ps (Product, Price, Place, Promotion, People, Physical evidence, Process), which are the key viewpoints for branding, we develop a brand evaluation system that includes coding methods for Word-of-Mouth (WoM) and corporate introductory information on the internet, in order to identify both the customers' service value recognition vector and the firm's service value proposition vector. Our system quantitatively clarifies both the customers' recognition of the firm's service value and the firm's strength in service value proposition, thereby revealing service brand communication gaps between firm and consumers. We applied this system to the Japanese ryokan hotel industry. Using six ryokan hotels' data on Jyaran-net and Rakuten Travel, we generated a total of 983 codes from WoM information and analyzed the hotels' service brand value in three price-based categories. As a result, we found that the characteristics of the customers' service value recognition vector differ according to price category. In addition, the system showed that one firm has a service value proposition vector that differs from its customers' recognition vector. This helps in analyzing corporate service brand strategy, and the method has significance as a system technology supporting service management.
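
    A minimal sketch, under assumed data, of comparing a firm's service value proposition vector with its customers' value recognition vector over the 7Ps: the vectors here are hypothetical code counts per P, and the gap is expressed as a cosine similarity (1.0 = perfectly aligned messaging).

```python
# Compare hypothetical 7P code-count vectors for one firm and its customers.
import numpy as np

ps = ["Product", "Price", "Place", "Promotion", "People", "Physical evidence", "Process"]
firm      = np.array([9, 2, 3, 8, 4, 6, 2], float)   # codes from corporate copy
customers = np.array([7, 6, 2, 1, 9, 5, 3], float)   # codes from WoM reviews

cos = firm @ customers / (np.linalg.norm(firm) * np.linalg.norm(customers))
print(f"alignment = {cos:.2f}")
for p, f, c in zip(ps, firm / firm.sum(), customers / customers.sum()):
    print(f"{p:18s} firm {f:.2f}  customers {c:.2f}")   # per-P share of codes
```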

  7. Simpler Evaluation of Predictions and Signature Stability for Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Yvonne E. Pittelkow

    2009-01-01

    Scientific advances are raising expectations that patient-tailored treatment will soon be available. The development of the resulting clinical approaches needs to be based on well-designed experimental and observational procedures that provide data to which proper biostatistical analyses are applied. Gene expression microarray and related technologies are rapidly evolving, providing extremely large gene expression profiles containing many thousands of measurements. Choosing a subset of these gene expression measurements to include in a gene expression signature is one of the many challenges to be met. The choice of signature depends on many factors, including the selection of patients in the training set, so the reliability and reproducibility of the resultant prognostic gene signature need to be evaluated in a way that is relevant to the clinical setting. A relatively straightforward approach is based on cross-validation, with separate selection of genes at each iteration to avoid selection bias. Within this approach we developed two different methods, one based on forward selection, the other on genes that were statistically significant in all training blocks of data. We demonstrate our approach to gene signature evaluation with a well-known breast cancer data set.
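
    A minimal sketch of the point about avoiding selection bias: gene selection is wrapped in a Pipeline so it is re-run inside every cross-validation fold, never on the full data set. The data are synthetic stand-ins for an expression matrix (samples x genes) with pure-noise labels, so an unbiased procedure should report chance-level accuracy.

```python
# Per-fold feature selection inside cross-validation avoids selection bias.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5000))             # 120 samples, 5000 "genes"
y = rng.integers(0, 2, 120)                  # noise labels: no real signal

model = Pipeline([
    ("select", SelectKBest(f_classif, k=50)),    # selection happens per fold
    ("clf", LogisticRegression(max_iter=1000)),
])
acc = cross_val_score(model, X, y, cv=5).mean()
print(f"unbiased CV accuracy ~ {acc:.2f} (chance level, as it should be)")
```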

  8. Efficiency evaluation of customer satisfaction index in e-banking using the fuzzy data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Aliyar Esmaeili

    2014-01-01

    E-commerce has created significant opportunities for corporations to understand customers' expectations and desired values, to increase customer satisfaction, and to expand their market share more efficiently. The most significant activity of e-commerce is in the field of e-banking and financial services. The customer satisfaction index is a concept introduced for evaluating service quality in electronic banking. Considering the relative importance of customer satisfaction in e-banking, defining scientific criteria for the assessment of this index is very important, so a scientific and efficient method is always needed. The purpose of this paper is to use fuzzy data envelopment analysis (DEA) techniques for evaluating and ranking the efficiency of the online customer satisfaction index in eight economic banks in Iran. Here, we first study fuzzy set theory and the traditional DEA method in the same model. Next, the relationship between them was developed, which provides a fuzzy DEA model for qualitative data. The SPSS and GAMS software packages were employed to analyze the data collected through questionnaires. The results show that three of the economic banks were located on the efficiency frontier in terms of customer satisfaction in e-banking and were thus fully efficient.
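
    As background, the crisp core that fuzzy DEA extends can be written as a small linear program. The sketch below solves an input-oriented CCR envelopment model with scipy's linprog for four hypothetical banks (one input, two satisfaction outputs); the fuzzy variant would replace these crisp numbers with interval data at chosen membership levels.

```python
# Input-oriented CCR DEA on hypothetical bank data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[5.0, 8.0, 6.0, 9.0]])             # inputs  (1 x n DMUs)
Y = np.array([[60, 70, 50, 90],
              [3.2, 4.1, 3.0, 4.5]])             # outputs (2 x n DMUs)
n = X.shape[1]

for o in range(n):
    # variables: [theta, lambda_1 .. lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_in  = np.hstack([-X[:, [o]], X])                    # sum(lam*x) <= theta * x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])    # sum(lam*y) >= y_o
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    print(f"bank {o + 1}: efficiency = {res.fun:.3f}")    # 1.000 = on the frontier
```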

  9. An Overview and Evaluation of Recent Machine Learning Imputation Methods Using Cardiac Imaging Data.

    Science.gov (United States)

    Liu, Yuzhe; Gopalakrishnan, Vanathi

    2017-03-01

    Many clinical research datasets have a large percentage of missing values that directly impacts their usefulness in yielding high-accuracy classifiers when used for training in supervised machine learning. While missing value imputation methods have been shown to work well with smaller percentages of missing values, their ability to impute sparse clinical research data can be problem-specific. We previously attempted to learn quantitative guidelines for ordering cardiac magnetic resonance imaging during the evaluation for pediatric cardiomyopathy, but missing data significantly reduced our usable sample size. In this work, we sought to determine if increasing the usable sample size through imputation would allow us to learn better guidelines. We first review several machine learning methods for estimating missing data. Then, we apply four popular methods (mean imputation, decision tree, k-nearest neighbors, and self-organizing maps) to a clinical research dataset of pediatric patients undergoing evaluation for cardiomyopathy. Using Bayesian Rule Learning (BRL) to learn ruleset models, we compared the performance of imputation-augmented models versus unaugmented models. We found that all four imputation-augmented models performed similarly to unaugmented models. While imputation did not improve performance, it did provide evidence for the robustness of our learned models.
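
    As a minimal illustration of the kind of comparison the paper performs, the sketch below applies two of the four named methods (mean imputation and k-nearest neighbors) to a synthetic data matrix with 20% of entries masked at random, scoring error only on the masked cells.

```python
# Compare mean and kNN imputation on synthetic data with a known ground truth.
import numpy as np
from sklearn.impute import SimpleImputer, KNNImputer

rng = np.random.default_rng(42)
X_true = rng.normal(size=(200, 8))
X_true[:, 1] = 0.7 * X_true[:, 0] + 0.3 * rng.normal(size=200)  # correlated column

mask = rng.random(X_true.shape) < 0.2          # 20% missing completely at random
X_miss = X_true.copy()
X_miss[mask] = np.nan

for name, imp in [("mean", SimpleImputer(strategy="mean")),
                  ("kNN", KNNImputer(n_neighbors=5))]:
    X_hat = imp.fit_transform(X_miss)
    rmse = np.sqrt(np.mean((X_hat[mask] - X_true[mask]) ** 2))
    print(f"{name}: RMSE on masked entries = {rmse:.3f}")
```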

  10. Evaluation of Driver Visibility from Mobile LIDAR Data and Weather Conditions

    Science.gov (United States)

    González-Jorge, H.; Díaz-Vilariño, L.; Lorenzo, H.; Arias, P.

    2016-06-01

    Visibility of drivers is crucial to ensure road safety. Visibility is influenced by two main factors: the geometry of the road and the weather present therein. This work presents an approach for automatic visibility evaluation using mobile LiDAR data and climate information provided by weather stations located in the neighbourhood of the road. The methodology is based on a ray-tracing algorithm that detects occlusions in the point clouds in order to identify the visible area from each driver position. The resulting data are normalized with the climate information to provide a polyline with an accurate area of visibility. Visibility ranges from 25 m (heavy fog) to more than 10,000 m (clean atmosphere). Values over 250 m are not taken into account for road safety purposes, since this value corresponds to the maximum braking distance of a vehicle. Two case studies are evaluated: an urban road in the city of Vigo (Spain) and an inter-urban road between the city of Ourense and the village of Castro Caldelas (Spain). In both cases, data from the Galician Weather Agency (Meteogalicia) are used. The algorithm shows promising results, allowing the detection of particularly dangerous areas from the viewpoint of driver visibility. The mountain road between Ourense and Castro Caldelas, with a great presence of slopes and sharp curves, is of special interest for this type of application. In this case, poor visibility can contribute especially to pedestrians or cyclists traveling on the road shoulders being run over.
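
    A minimal sketch of the normalisation step described above, under assumed numbers: the usable visibility at each driver position is the geometric sight distance from the point-cloud ray tracing, capped by the meteorological visibility, and then clipped at the 250 m braking-distance threshold.

```python
# Combine geometric sight distance with meteorological visibility.
import numpy as np

geometric_m = np.array([80, 300, 1200, 40, 600])   # ray-traced sight distances per position
weather_m = 150.0                                  # meteorological visibility (fog)

effective = np.minimum(geometric_m, weather_m)     # weather caps geometry
safety_relevant = np.clip(effective, None, 250.0)  # 250 m braking-distance ceiling
print(safety_relevant)                             # -> 80, 150, 150, 40, 150
```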

  11. Evaluating leaf litter beetle data sampled by Winkler extraction from Atlantic forest sites in southern Brazil

    Directory of Open Access Journals (Sweden)

    Philipp Werner Hopp

    2011-06-01

    To evaluate the reliability of data obtained by Winkler extraction in Atlantic forest sites in southern Brazil, we studied litter beetle assemblages in secondary forests (5 to 55 years after abandonment) and old-growth forests at two seasonally different points in time. For all regeneration stages, species density and abundance were lower in April than in August, but assemblage composition of the corresponding forest stages was similar in both months. We suggest that sampling of small litter-inhabiting beetles at different points in time using the Winkler technique reveals identical ecological patterns, which are more likely to be influenced by sample incompleteness than by differences in assemblage composition. A strong relationship between litter quantity and beetle occurrences indicates the importance of this variable for the temporal species density pattern. Additionally, the sampled beetle material was compared with beetle data obtained with pitfall traps in one old-growth forest. Over 60% of the focal species captured with pitfall traps were also sampled by Winkler extraction in different forest stages. Only a few beetles with a body size too large to be sampled by Winkler extraction were captured solely with pitfall traps, indicating that the local litter beetle fauna is dominated by small species. Hence, bearing in mind the exclusion of large beetles and of beetle species occurring during the wet season, the Winkler method reveals a reliable picture of the local leaf litter beetle community.

  12. Evaluating Corn (Zea Mays L.) N Variability Via Remote Sensed Data

    Science.gov (United States)

    Sullivan, D. G.; Shaw, J. N.; Mask, P. L.; Rickman, D.; Luvall, J.; Wersinger, J. M.

    2003-01-01

    Transformations and losses of nitrogen (N) throughout the growing season can be costly. Methods in place to improve N management and facilitate split N applications during the growing season can be time-consuming and logistically difficult. Remote sensing (RS) may be a method to rapidly assess temporal changes in crop N status and promote more efficient N management. This study was designed to evaluate the ability of three different RS platforms to predict N variability in corn (Zea mays L.) leaves during vegetative and early reproductive growth stages. Plots (15 x 15 m) were established in the Coastal Plain (CP) and Appalachian Plateau (AP) physiographic regions each spring from 2000 to 2002 in a completely randomized design. Treatments consisted of four N rates (0, 56, 112, and 168 kg N/ha) applied as ammonium nitrate (NH4NO3), replicated four times. Spectral measurements were acquired via spectroradiometer (λ = 350–1050 nm), the Airborne Terrestrial Applications Sensor (ATLAS) (λ = 400–12,500 nm), and the IKONOS satellite (λ = 450–900 nm). Spectroradiometer data were collected on a biweekly basis from V4 through R1. Due to the nature of satellite and aircraft acquisitions, these data were acquired per availability. Chlorophyll meter (SPAD) readings and tissue N were collected as ancillary data along with each RS acquisition. Results showed that vegetation indices derived from hand-held spectroradiometer measurements as early as V6-V8 were linearly related to yield and tissue N content. ATLAS data were correlated with tissue N at the AP site during the V6 stage (r² = 0.66), but no significant relationships were observed at the CP site. No significant relationships were observed between plant N and IKONOS imagery. Using a combination of the green normalized difference vegetation index (GNDVI) and the normalized difference vegetation index (NDVI), RS data acquired via ATLAS and the spectroradiometer could be used to evaluate tissue N variability and estimate corn yield variability.
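
    The two indices named in the last sentence are simple band ratios; the sketch below computes both from hypothetical reflectance values. NDVI = (NIR - Red)/(NIR + Red), and GNDVI substitutes the green band for red.

```python
# NDVI and GNDVI from hypothetical band reflectances.
import numpy as np

nir   = np.array([0.52, 0.48, 0.60])   # near-infrared reflectance
red   = np.array([0.08, 0.12, 0.06])
green = np.array([0.10, 0.14, 0.09])

ndvi  = (nir - red) / (nir + red)
gndvi = (nir - green) / (nir + green)
print("NDVI :", ndvi.round(2))
print("GNDVI:", gndvi.round(2))
```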

  13. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    Science.gov (United States)

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two to four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
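
    The StreamQRE implementation itself is a Java library whose exact API is not reproduced here; as a language-neutral illustration of the kind of computation described (a quantitative query evaluated incrementally, in one pass and with constant work per item, over a keyed stream), here is a small Python sketch.

```python
# One-pass, key-partitioned incremental aggregation over a stream.
from collections import defaultdict

def running_mean_by_key(stream):
    """Incrementally maintain a per-key mean over (key, value) items."""
    count = defaultdict(int)
    total = defaultdict(float)
    for key, value in stream:            # one pass; per-item work is O(1)
        count[key] += 1
        total[key] += value
        yield key, total[key] / count[key]

events = [("sensorA", 3.0), ("sensorB", 10.0), ("sensorA", 5.0)]
for key, mean in running_mean_by_key(events):
    print(key, mean)                     # sensorA 3.0, sensorB 10.0, sensorA 4.0
```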

  14. Office of Analysis and Evaluation of Operational Data 1989 annual report, Power reactors

    Energy Technology Data Exchange (ETDEWEB)

    None

    1990-07-01

    The annual report of the US Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) is devoted to the activities performed during 1989. The report is published in two separate parts. This document, NUREG-1272, Vol. 4, No. 1, covers power reactors and presents an overview of the operating experience of the nuclear power industry from the NRC perspective, including comments about the trends of some key performance measures. The report also includes the principal findings and issues identified in AEOD studies over the past year and summarizes information from such sources as licensee event reports, diagnostic evaluations, and reports to the NRC's Operations Center. This report also compiles the status of staff actions resulting from previous Incident Investigation Team (IIT) reports. 16 figs., 9 tabs.

  15. Using hydrogeologic data to evaluate geothermal potential in the eastern Great Basin

    Science.gov (United States)

    Masbruch, Melissa D.; Heilweil, Victor M.; Brooks, Lynette E.

    2012-01-01

    In support of a larger study to evaluate the geothermal resource development potential of high-permeability stratigraphic units in sedimentary basins, this paper integrates groundwater and thermal data to evaluate heat and fluid flow within the eastern Great Basin. Previously published information from a hydrogeologic framework, a potentiometric-surface map, and groundwater budgets was compared to a surficial heat-flow map. Comparisons between regional groundwater flow patterns and surficial heat flow indicate a strong spatial relation between regional groundwater movement and surficial heat distribution. By combining aquifer geometry and heat-flow maps, a group of subareas within the eastern Great Basin is identified that have high surficial heat flow and are underlain by a sequence of thick basin-fill deposits and permeable carbonate aquifers. These regions may have potential for future geothermal resource development.

  16. Channel Models for Capacity Evaluation of MIMO Handsets in Data Mode

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ødum; Yanakiev, Boyan; Barrio, Samantha Caporal Del

    2017-01-01

    This work investigates different correlation-based models useful for evaluation of the outage capacity (OC) of mobile multiple-input multiple-output (MIMO) handsets. The work is based on a large measurement campaign in a micro-cellular setup involving two dual-band base stations and 10 different handsets ... in an indoor environment for different use cases and test users. Several models are evaluated statistically, comparing the OC values estimated from the model and measurement data, respectively, for about 2,700 measurement routes. The models are based on either estimates of the full correlation matrices ... or simplifications. Among other results, it is shown that the OC can be predicted accurately (median error typically within 2.6%) with a model assuming knowledge only of the Tx-correlation coefficient and the mean power gain.
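
    As background for how OC figures like these are produced, the sketch below estimates outage capacity by Monte Carlo from random channel realizations, using C = log2 det(I + (SNR/Nt)·HHᴴ); an i.i.d. Rayleigh channel stands in for the measured or model-generated correlated channels.

```python
# Monte Carlo outage capacity for a 2x2 MIMO link at 10 dB SNR.
import numpy as np

rng = np.random.default_rng(3)
nt, nr, snr_db, trials = 2, 2, 10.0, 20000
snr_lin = 10 ** (snr_db / 10)

# i.i.d. Rayleigh channel realizations, shape (trials, nr, nt)
H = (rng.normal(size=(trials, nr, nt)) + 1j * rng.normal(size=(trials, nr, nt))) / np.sqrt(2)
G = snr_lin / nt * H @ H.conj().transpose(0, 2, 1)
cap = np.log2(np.linalg.det(np.eye(nr) + G).real)      # bits/s/Hz per realization

print(f"1% outage capacity = {np.percentile(cap, 1):.2f} bit/s/Hz")
```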

  17. Draft evaluation of the frequency for gas sampling for the high burnup confirmatory data project

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, Christine T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Alsaed, Halim A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-26

    This report fulfills the M3 milestone M3FT-15SN0802041, “Draft Evaluation of the Frequency for Gas Sampling for the High Burn-up Storage Demonstration Project”, under Work Package FT-15SN080204, “ST Field Demonstration Support – SNL”. This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of (1) the types and magnitudes of gases that could be present in the project cask and (2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations. Gas sampling will provide information on the presence of residual water (and byproducts associated with its reactions and decomposition) and breach of cladding, which could inform the decision of when to open the project cask.

  18. Evaluation of TRMM 3B43 Precipitation Data for Drought Monitoring in Jiangsu Province, China

    Directory of Open Access Journals (Sweden)

    Hui Tao

    2016-05-01

    Satellite-based precipitation monitoring at high spatial resolution is crucial for assessing the water and energy cycles at the global and regional scale. Based on the recently released 7th version of the TRMM Multi-satellite Precipitation Analysis (TMPA), the monthly precipitation data (3B43) of the Tropical Rainfall Measuring Mission (TRMM) are evaluated using observed monthly precipitation from 65 meteorological stations in Jiangsu Province, China, for the period 1998–2014. Additionally, the standardized precipitation index (SPI), which is derived by a nonparametric approach, is employed to investigate the suitability of the TRMM 3B43 precipitation data for drought monitoring in Jiangsu Province. The temporal correlations between observations and the TRMM 3B43 precipitation data show, in general, reasonable agreement for different time scales. However, in summer, only 50% of the stations present correlation coefficients that are statistically significant at the 95% confidence level. The overall best agreement of TRMM 3B43 precipitation data at the seasonal scale tends to occur in autumn (SON). The comparative analysis of the calculated SPI time series suggests that the accuracy of TRMM 3B43 decreases with increasing time scale. Stations with significant correlation coefficients also become less spatially homogeneous with increasing time scale. In summary, the findings demonstrate that TRMM 3B43 precipitation data can be used for reliable short-term drought monitoring in Jiangsu Province, while temporal-spatial limitations exist for longer time scales.
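
    For illustration, a common nonparametric SPI construction (assumed here, since the abstract does not spell out its exact variant) maps empirical probabilities from the Gringorten plotting position through the inverse standard normal:

```python
# Nonparametric SPI sketch on synthetic monthly precipitation totals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
precip = rng.gamma(shape=2.0, scale=40.0, size=204)   # 17 years of monthly totals, mm

ranks = stats.rankdata(precip)                  # rank 1 = driest month
prob = (ranks - 0.44) / (len(precip) + 0.12)    # Gringorten empirical probability
spi = stats.norm.ppf(prob)                      # standardized index, approximately N(0,1)

print(f"driest month SPI = {spi.min():.2f}, wettest = {spi.max():.2f}")
```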

  19. Quality evaluation of cancer study Common Data Elements using the UMLS Semantic Network.

    Science.gov (United States)

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-12-01

    The binding of controlled terminology has been regarded as important for the standardization of Common Data Elements (CDEs) in cancer research. However, the potential of such binding has not yet been fully explored, especially its quality assurance aspect. The objective of this study is to explore whether there is a relationship between terminological annotations and the UMLS Semantic Network (SN) that can be exploited to improve those annotations. We profiled the terminological concepts associated with the standard structure of the CDEs of the NCI Cancer Data Standards Repository (caDSR) using the UMLS SN. We processed 17,798 data elements and extracted 17,526 primary object class/property concept pairs. We identified dominant semantic types for the categories "object class" and "property" and determined that the preponderance of the instances were disjoint (i.e., the intersection of semantic types between the two categories is empty). We then performed a preliminary evaluation on the data elements whose asserted primary object class/property concept pairs conflict with this observation, i.e., where the semantic type of the object class fell into a SN category typically used by property, or vice versa. In conclusion, the UMLS SN based profiling approach is feasible for the quality assurance and accessibility of the cancer study CDEs. This approach could provide useful insight about how to build quality assurance mechanisms into a metadata repository. Copyright © 2011 Elsevier Inc. All rights reserved.
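
    The disjointness check at the heart of the profiling can be sketched in a few lines; the data elements and semantic types below are hypothetical stand-ins for the UMLS SN assignments.

```python
# Flag object class/property pairs whose semantic-type sets overlap.
pairs = [
    ({"Disease or Syndrome"}, {"Qualitative Concept"}),
    ({"Laboratory Procedure"}, {"Quantitative Concept"}),
    ({"Qualitative Concept"}, {"Qualitative Concept"}),   # suspicious pair
]

for i, (obj_types, prop_types) in enumerate(pairs):
    overlap = obj_types & prop_types
    if overlap:
        print(f"element {i}: object class and property share {overlap}: review")
```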

  20. 2009.3 Revision of the Evaluated Nuclear Data Library (ENDL2009.3)

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, I. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Beck, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Descalle, M. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mattoon, C. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jurgenson, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-11-06

    LLNL's Computational Nuclear Data and Theory Group has created a 2009.3 revised release of the Evaluated Nuclear Data Library (ENDL2009.3). This library is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. The ENDL2009 database was the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. It was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE/Office of Science's US Nuclear Data Program. This document lists the revisions and fixes made in the new release, ENDL2009.3, by comparing it with the data in the previous release, ENDL2009.2. These changes are made in conjunction with the revisions for ENDL2011.3, so that both .3 releases are as free as possible of known defects.