WorldWideScience

Sample records for multiple source databases

  1. The Protein Identifier Cross-Referencing (PICR) service: reconciling protein identifiers across multiple source databases

    Directory of Open Access Journals (Sweden)

    Leinonen Rasko

    2007-10-01

    Full Text Available Abstract Background Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources, or when querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist, but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. Results We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. Conclusion We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR…
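The REST access pattern described above can be sketched as a small client helper. The endpoint URL and parameter names below are illustrative assumptions, not the actual PICR API; only the query concepts (accession, target databases, taxonomic ID, activity status) come from the abstract.

```python
from urllib.parse import urlencode

def build_mapping_query(base_url, accession, databases, taxon_id=None, active_only=True):
    """Build a REST query URL asking which identifiers in the target
    databases map (by 100% sequence identity) to the given accession."""
    params = {
        "accession": accession,
        "database": ",".join(databases),      # limit mappings by source database
        "activeOnly": str(active_only).lower(),  # limit by activity status
    }
    if taxon_id is not None:
        params["taxonId"] = taxon_id          # limit by taxonomic ID
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and database names, purely for illustration:
url = build_mapping_query("https://example.org/picr/rest/getMapping",
                          "P12345", ["SWISSPROT", "ENSEMBL"], taxon_id=9606)
print(url)
```

A real client would issue an HTTP GET against the service's documented endpoint and parse the returned cross-references.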

  2. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    …clusters with differential expression during the differentiation toward megakaryocytes were identified. Conclusions TRAM is designed to create, and statistically analyze, quantitative transcriptome maps based on gene expression data from multiple sources. The release includes a FileMaker Pro runtime database management application and is freely available at http://apollo11.isto.unibo.it/software/, along with preconfigured implementations for mapping of human, mouse and zebrafish transcriptomes.

  3. Mobile Source Observation Database (MSOD)

    Science.gov (United States)

    The Mobile Source Observation Database (MSOD) is a relational database developed by the Assessment and Standards Division (ASD) of the U.S. EPA Office of Transportation and Air Quality (formerly the Office of Mobile Sources).

  4. Technical Note: A new global database of trace gases and aerosols from multiple sources of high vertical resolution measurements

    Directory of Open Access Journals (Sweden)

    G. E. Bodeker

    2008-09-01

    Full Text Available A new database of trace gases and aerosols with global coverage, derived from high vertical resolution profile measurements, has been assembled as a collection of binary data files; hereafter referred to as the "Binary DataBase of Profiles" (BDBP). Version 1.0 of the BDBP, described here, includes measurements from different satellite-based (HALOE, POAM II and III, SAGE I and II) and ground-based (ozonesonde) measurement systems. In addition to the primary product of ozone, secondary measurements of other trace gases, aerosol extinction, and temperature are included. All data are subjected to very strict quality control and for every measurement a percentage error on the measurement is included. To facilitate analyses, each measurement is added to 3 different instances (3 different grids) of the database where measurements are indexed by: (1) geographic latitude, longitude, altitude (in 1 km steps) and time, (2) geographic latitude, longitude, pressure (at levels ~1 km apart) and time, (3) equivalent latitude, potential temperature (8 levels from 300 K to 650 K) and time.

    In contrast to existing zonal mean databases, by including a wider range of measurement sources (both satellite and ozonesonde), the BDBP is sufficiently dense to permit calculation of changes in ozone by latitude, longitude and altitude. In addition, by including other trace gases such as water vapour, this database can be used for comprehensive radiative transfer calculations. By providing the original measurements rather than derived monthly means, the BDBP is applicable to a wider range of applications than databases containing only monthly mean data. Monthly mean zonal mean ozone concentrations calculated from the BDBP are compared with the database of Randel and Wu, which has been used in many earlier analyses. As opposed to that database, which is generated from regression model fits, the BDBP uses the original (quality controlled) measurements with no smoothing applied in any…
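The triple indexing scheme described above, where the same measurement is filed under three different grids, can be sketched as follows. The binning conventions are assumptions for illustration; only the 1 km altitude steps and the eight potential temperature levels from 300 K to 650 K come from the abstract.

```python
import math

# Potential temperature levels per the abstract: 8 levels, 300 K to 650 K.
THETA_LEVELS = [300, 350, 400, 450, 500, 550, 600, 650]

def grid_keys(lat, lon, alt_km, pressure_hpa, eq_lat, theta_k, t):
    """Return the three index keys under which one profile measurement
    would be filed: altitude grid, pressure grid, and equivalent-latitude /
    potential-temperature grid. Horizontal binning to whole degrees is an
    illustrative assumption."""
    alt_key = (round(lat), round(lon), math.floor(alt_km), t)    # 1 km steps
    pres_key = (round(lat), round(lon), round(pressure_hpa), t)  # ~1 km-spaced levels
    theta_key = (round(eq_lat),
                 min(THETA_LEVELS, key=lambda lv: abs(lv - theta_k)),  # nearest level
                 t)
    return alt_key, pres_key, theta_key
```

Indexing the same measurement three ways trades storage for fast retrieval in whichever coordinate system an analysis needs.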

  5. Mobile Source Observation Database (MSOD)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Mobile Source Observation Database (MSOD) is a relational database being developed by the Assessment and Standards Division (ASD) of the US Environmental...

  6. Open Source Vulnerability Database Project

    Directory of Open Access Journals (Sweden)

    Jake Kouns

    2008-06-01

    Full Text Available This article introduces the Open Source Vulnerability Database (OSVDB) project, which manages a global collection of computer security vulnerabilities, available for free use by the information security community. This collection contains information on known security weaknesses in operating systems, software products, protocols, hardware devices, and other infrastructure elements of information technology. The OSVDB project is intended to be the centralized global open source vulnerability collection on the Internet.

  7. Neutron source multiplication method

    International Nuclear Information System (INIS)

    Clayton, E.D.

    1985-01-01

    Extensive use has been made of neutron source multiplication in thousands of measurements of critical masses and configurations and in subcritical neutron-multiplication measurements in situ that provide data for criticality prevention and control in nuclear materials operations. There is continuing interest in developing reliable methods for monitoring the reactivity, or k_eff, of plant operations, but the required measurements are difficult to carry out and interpret on the far subcritical configurations usually encountered. The relationship between neutron multiplication and reactivity is briefly discussed and data presented to illustrate problems associated with the absolute measurement of neutron multiplication and reactivity in subcritical systems. A number of curves of inverse multiplication have been selected from a variety of experiments showing variations observed in multiplication during the course of critical and subcritical experiments where different methods of reactivity addition were used, with different neutron source detector position locations. Concern is raised regarding the meaning and interpretation of k_eff as might be measured in a far subcritical system because of the modal effects and spectrum differences that exist between the subcritical and critical systems. Because of this, the calculation of k_eff identical with unity for the critical assembly, although necessary, may not be sufficient to assure safety margins in calculations pertaining to far subcritical systems. Further study is needed on the interpretation and meaning of k_eff in the far subcritical system.
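As a point of reference for the inverse-multiplication curves mentioned above, the textbook point-model relation (a standard approximation, not taken from this report) is M = 1/(1 - k_eff), so the inverse multiplication 1/M falls linearly toward zero as k_eff approaches unity:

```python
def inverse_multiplication(k_eff):
    """Return 1/M under the point-model approximation M = 1/(1 - k_eff).

    The abstract's caution applies: in far subcritical systems, modal and
    spectrum effects make the measured multiplication deviate from this
    simple relation, so this is only the idealized reference curve."""
    if not 0.0 <= k_eff < 1.0:
        raise ValueError("point-model relation is only valid for subcritical k_eff")
    m = 1.0 / (1.0 - k_eff)   # subcritical multiplication
    return 1.0 / m            # equals 1 - k_eff; extrapolates to 0 at criticality

for k in (0.5, 0.9, 0.99):
    print(k, inverse_multiplication(k))
```

Plotting 1/M against fuel loading and extrapolating to 1/M = 0 is the classical way such curves are used to approach criticality safely.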

  8. Hyperdatabase: A schema for browsing multiple databases

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, M A [Dalhousie Univ., Halifax (Canada). Computer Science Div.]; Watters, C R [Waterloo Univ., Waterloo (Canada). Computer Science Dept.]

    1990-05-01

    In order to ensure effective information retrieval, a user may need to search multiple databases on multiple systems. Although front end systems have been developed to assist the user in accessing different systems, they access one retrieval system at a time and the search has to be repeated for each required database on each retrieval system. More importantly, the user interacts with the results as independent sessions. This paper models multiple bibliographic databases distributed over one or more retrieval systems as a hyperdatabase, i.e., a single virtual database. The hyperdatabase is viewed as a hypergraph in which each node represents a bibliographic item and the links among nodes represent relations among the items. In response to a query, bibliographic items are extracted from the hyperdatabase and linked together to form a transient hypergraph. This hypergraph is transient in the sense that it is "created" in response to a query and only "exists" for the duration of the query session. A hypertext interface permits the user to browse the transient hypergraph in a nonlinear manner. The technology to implement a system based on this model is available now, consisting of powerful workstations, distributed processing, high-speed communications, and CD-ROMs. As the technology advances and costs decrease, such systems should be generally available. (author). 13 refs, 5 figs.

  9. Hyperdatabase: A schema for browsing multiple databases

    International Nuclear Information System (INIS)

    Shepherd, M.A.; Watters, C.R.

    1990-05-01

    In order to ensure effective information retrieval, a user may need to search multiple databases on multiple systems. Although front end systems have been developed to assist the user in accessing different systems, they access one retrieval system at a time and the search has to be repeated for each required database on each retrieval system. More importantly, the user interacts with the results as independent sessions. This paper models multiple bibliographic databases distributed over one or more retrieval systems as a hyperdatabase, i.e., a single virtual database. The hyperdatabase is viewed as a hypergraph in which each node represents a bibliographic item and the links among nodes represent relations among the items. In response to a query, bibliographic items are extracted from the hyperdatabase and linked together to form a transient hypergraph. This hypergraph is transient in the sense that it is "created" in response to a query and only "exists" for the duration of the query session. A hypertext interface permits the user to browse the transient hypergraph in a nonlinear manner. The technology to implement a system based on this model is available now, consisting of powerful workstations, distributed processing, high-speed communications, and CD-ROMs. As the technology advances and costs decrease, such systems should be generally available. (author). 13 refs, 5 figs.

  10. Free software and open source databases

    Directory of Open Access Journals (Sweden)

    Napoleon Alexandru SIRITEANU

    2006-01-01

    Full Text Available The emergence of free/open source software (FS/OSS) enterprises seeks to push software development out of the academic stream into the commercial mainstream, and as a result, end-user applications such as open source database management systems (PostgreSQL, MySQL, Firebird) are becoming more popular. Companies like Sybase, Oracle, Sun, and IBM are increasingly implementing open source strategies and porting programs/applications into the Linux environment. Open source software is redefining the software industry in general and database development in particular.

  11. DEIMOS – an Open Source Image Database

    Directory of Open Access Journals (Sweden)

    M. Blazek

    2011-12-01

    Full Text Available The DEIMOS (DatabasE of Images: Open Source) is created as an open-source database of images and videos for the testing, verification and comparison of various image and/or video processing techniques such as enhancement, compression and reconstruction. The main advantage of DEIMOS is its orientation to various application fields: multimedia, television, security, assistive technology, biomedicine, astronomy, etc. DEIMOS is being created gradually, step by step, based upon the contributions of team members. The paper describes the basic parameters of the DEIMOS database and includes application examples.

  12. Data analysis and pattern recognition in multiple databases

    CERN Document Server

    Adhikari, Animesh; Pedrycz, Witold

    2014-01-01

    Pattern recognition in data is a well-known classical problem that falls under the ambit of data analysis. As we need to handle different data, the nature of the patterns, their recognition, and the types of data analyses are bound to change. Since the number of data collection channels has increased and diversified in recent times, many real-world data mining tasks can easily acquire multiple databases from various sources. In these cases, data mining becomes more challenging for several essential reasons. We may encounter sensitive data originating from different sources that cannot be amalgamated. Even if we are allowed to place different data together, we are certainly not able to analyse them when local identities of patterns are required to be retained. Thus, pattern recognition in multiple databases gives rise to a suite of new, challenging problems different from those encountered before. Association rule mining, global pattern discovery, and mining patterns of select items provide different...

  13. Plume rise from multiple sources

    International Nuclear Information System (INIS)

    Briggs, G.A.

    1975-01-01

    A simple enhancement factor for plume rise from multiple sources is proposed and tested against plume-rise observations. For bent-over buoyant plumes, this results in the recommendation that multiple-source rise be calculated as [(N + S)/(1 + S)]^(1/3) times the single-source rise, Δh_1, where N is the number of sources and S = 6 (total width of source configuration/(N^(1/3) Δh_1))^(3/2). For calm conditions a crude but simple method is suggested for predicting the height of plume merger and subsequent behavior which is based on the geometry and velocity variations of a single buoyant plume. Finally, it is suggested that large clusters of buoyant sources might occasionally give rise to concentrated vortices either within the source configuration or just downwind of it.
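The recommended enhancement factor transcribes directly into a short function; variable names are illustrative, with W standing for the total width of the source configuration:

```python
def multiple_source_rise(n_sources, width, dh1):
    """Multiple-source plume rise per the recommendation above:
    rise = [(N + S)/(1 + S)]**(1/3) * dh1,
    with S = 6 * (W / (N**(1/3) * dh1))**(3/2),
    where dh1 is the single-source rise and W the total source-row width."""
    s = 6.0 * (width / (n_sources ** (1.0 / 3.0) * dh1)) ** 1.5
    enhancement = ((n_sources + s) / (1.0 + s)) ** (1.0 / 3.0)
    return enhancement * dh1

# A single source (N = 1) recovers the single-source rise exactly:
print(multiple_source_rise(1, 50.0, 200.0))  # 200.0
```

Note that for N = 1 the enhancement factor is identically 1, and for widely spaced stacks (large W, hence large S) the factor also approaches 1, as the plumes barely interact.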

  14. The Development of Ontology from Multiple Databases

    Science.gov (United States)

    Kasim, Shahreen; Aswa Omar, Nurul; Fudzee, Mohd Farhan Md; Azhar Ramli, Azizul; Aizi Salamat, Mohamad; Mahdin, Hairulnizam

    2017-08-01

    The area of the halal industry is the fastest growing global business across the world. The halal food industry is thus crucial for Muslims all over the world as it serves to ensure them that the food items they consume daily are syariah compliant. Currently, ontology has been widely used in the computer science area, for example in heterogeneous information processing on the web, the semantic web, and information retrieval. However, ontology has still not been used widely in the halal industry. Today, the Muslim community still has problems verifying the halal status of products in the market, especially foods containing E numbers. This research tried to solve the problem of validating halal status from various halal sources. Various chemical ontologies from multiple databases were found to help this ontology development. The E numbers in this chemical ontology are codes for chemicals that can be used as food additives. With this E numbers ontology, the Muslim community could effectively identify and verify the halal status of products in the market.

  15. Linking Multiple Databases: Term Project Using "Sentences" DBMS.

    Science.gov (United States)

    King, Ronald S.; Rainwater, Stephen B.

    This paper describes a methodology for use in teaching an introductory Database Management System (DBMS) course. Students master basic database concepts through the use of a multiple component project implemented in both relational and associative data models. The associative data model is a new approach for designing multi-user, Web-enabled…

  16. Developing an Inhouse Database from Online Sources.

    Science.gov (United States)

    Smith-Cohen, Deborah

    1993-01-01

    Describes the development of an in-house bibliographic database by the U.S. Army Corps of Engineers Cold Regions Research and Engineering Laboratory on arctic wetlands research. Topics discussed include planning; identifying relevant search terms and commercial online databases; downloading citations; criteria for software selection; management…

  17. Power source roadmaps using bibliometrics and database tomography

    International Nuclear Information System (INIS)

    Kostoff, R.N.; Tshiteya, R.; Pfeil, K.M.; Humenik, J.A.; Karypis, G.

    2005-01-01

    Database Tomography (DT) is a textual database analysis system consisting of two major components: (1) algorithms for extracting multi-word phrase frequencies and phrase proximities (physical closeness of the multi-word technical phrases) from any type of large textual database, to augment (2) the interpretative capabilities of the expert human analyst. DT was used to derive technical intelligence from a Power Sources database derived from the Science Citation Index. Phrase frequency analysis by the technical domain experts provided the pervasive technical themes of the Power Sources database, and the phrase proximity analysis provided the relationships among the pervasive technical themes. Bibliometric analysis of the Power Sources literature supplemented the DT results with author/journal/institution/country publication and citation data.
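The first DT component, multi-word phrase frequency extraction, can be sketched as a simple n-gram counter; phrase proximity analysis and the analyst's interpretation (the second component) are not shown, and real DT preprocessing would be considerably more sophisticated:

```python
from collections import Counter

def phrase_frequencies(text, n=2):
    """Count the frequency of every n-word phrase in the text.
    A minimal illustration: lowercase, whitespace tokenization only."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

freqs = phrase_frequencies("power sources and power sources research")
print(freqs.most_common(2))
```

Ranking the most common phrases surfaces candidate "pervasive technical themes" for the domain expert to interpret.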

  18. Quality Assurance Source Requirements Traceability Database

    International Nuclear Information System (INIS)

    MURTHY, R.; NAYDENOVA, A.; DEKLEVER, R.; BOONE, A.

    2006-01-01

    At the Yucca Mountain Project the Project Requirements Processing System assists in the management of relationships between regulatory and national/industry standards source criteria, and Quality Assurance Requirements and Description document (DOE/RW-0333P) requirements, to create compliance matrices representing the respective relationships. The matrices are submitted to the U.S. Nuclear Regulatory Commission to assist in the commission's review, interpretation, and concurrence with the Yucca Mountain Project QA program document. The tool is highly customized to meet the needs of the Office of Civilian Radioactive Waste Management Office of Quality Assurance.

  19. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif

    2014-12-11

    Various examples are provided for land streamer seismic surveying using multiple sources. In one example, among others, a method includes disposing a land streamer in-line with first and second shot sources. The first shot source is at a first source location adjacent to a proximal end of the land streamer and the second shot source is at a second source location separated by a fixed length corresponding to a length of the land streamer. Shot gathers can be obtained when the shot sources are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first shot source. The second shot source is separated from the first shot source by a fixed overall length corresponding to the land streamer.

  20. Implementation of a database for the management of radioactive sources

    International Nuclear Information System (INIS)

    MOHAMAD, M.

    2012-01-01

    In Madagascar, the application of nuclear technology continues to develop. In order to protect human health and the environment against the harmful effects of ionizing radiation, each user of radioactive sources has to implement a program of nuclear safety and security and to declare their sources to the Regulatory Authority. This Authority must have access to all the information relating to all the sources and their uses. This work is based on the development of software using Python as the programming language and SQLite as the database. It makes it possible to computerize the management of radioactive sources. This application unifies the various existing databases and centralizes the activities of radioactive source management. The objective is to follow the movement of each source in the Malagasy territory in order to avoid the risks related to the use of radioactive sources and illicit trafficking. [fr

  1. Multiple brain atlas database and atlas-based neuroimaging system.

    Science.gov (United States)

    Nowinski, W L; Fang, A; Nguyen, B T; Raphel, J K; Jagannathan, L; Raghavan, R; Bryan, R N; Miller, G A

    1997-01-01

    For the purpose of developing multiple, complementary, fully labeled electronic brain atlases and an atlas-based neuroimaging system for analysis, quantification, and real-time manipulation of cerebral structures in two and three dimensions, we have digitized, enhanced, segmented, and labeled the following print brain atlases: Co-Planar Stereotaxic Atlas of the Human Brain by Talairach and Tournoux, Atlas for Stereotaxy of the Human Brain by Schaltenbrand and Wahren, Referentially Oriented Cerebral MRI Anatomy by Talairach and Tournoux, and Atlas of the Cerebral Sulci by Ono, Kubik, and Abernathey. Three-dimensional extensions of these atlases have been developed as well. All two- and three-dimensional atlases are mutually preregistered and may be interactively registered with an actual patient's data. An atlas-based neuroimaging system has been developed that provides support for reformatting, registration, visualization, navigation, image processing, and quantification of clinical data. The anatomical index contains about 1,000 structures and over 400 sulcal patterns. Several new applications of the brain atlas database also have been developed, supported by various technologies such as virtual reality, the Internet, and electronic publishing. Fusion of information from multiple atlases assists the user in comprehensively understanding brain structures and identifying and quantifying anatomical regions in clinical data. The multiple brain atlas database and atlas-based neuroimaging system have substantial potential impact in stereotactic neurosurgery and radiotherapy by assisting in visualization and real-time manipulation in three dimensions of anatomical structures, in quantitative neuroradiology by allowing interactive analysis of clinical data, in three-dimensional neuroeducation, and in brain function studies.

  2. Thermionic detector with multiple layered ionization source

    International Nuclear Information System (INIS)

    Patterson, P. L.

    1985-01-01

    Method and apparatus for analyzing specific chemical substances in a gaseous environment comprises a thermionic source formed of multiple layers of ceramic material composition, an electrical current instrumentality for heating the thermionic source to operating temperatures in the range of 100°C to 1000°C, an instrumentality for exposing the surface of the thermionic source to contact with the specific chemical substances for the purpose of forming gas phase ionization of the substances by a process of electrical charge emission from the surface, a collector electrode disposed adjacent to the thermionic source, an instrumentality for biasing the thermionic source at an electrical potential which causes the gas phase ions to move toward the collector, and an instrumentality for measuring the ion current arriving at the collector. The thermionic source is constructed of a metallic heater element molded inside a sub-layer of hardened ceramic cement material impregnated with a metallic compound additive which is non-corrosive to the heater element during operation. The sub-layer is further covered by a surface-layer formed of hardened ceramic cement material impregnated with an alkali metal compound in a manner that eliminates corrosive contact of the alkali compounds with the heater element. The sub-layer further protects the heater element from contact with gas environments which may be corrosive. The specific ionization of different chemical substances is varied over a wide range by changing the composition and temperature of the thermionic source, and by changing the composition of the gas environment.

  3. Binaural Processing of Multiple Sound Sources

    Science.gov (United States)

    2016-08-18

    AFRL-AFOSR-VA-TR-2016-0298, Binaural Processing of Multiple Sound Sources. William Yost, Arizona State University, Tempe, AZ. Final performance report covering 15 Jul 2012 to 14 Jul 2016. The three topics cited above are entirely within the scope of the AFOSR grant. Subject terms: binaural hearing, sound localization, interaural signal.

  4. The NASA Goddard Group's Source Monitoring Database and Program

    Science.gov (United States)

    Gipson, John; Le Bail, Karine; Ma, Chopo

    2014-12-01

    Beginning in 2003, the Goddard VLBI group developed a program to purposefully monitor when sources were observed and to increase the observations of "under-observed" sources. The heart of the program consists of a MySQL database that keeps track of, on a session-by-session basis: the number of observations that are scheduled for a source, the number of observations that are successfully correlated, and the number of observations that are used in a session. In addition, there is a table that contains the target number of successful sessions over the last twelve months. Initially this table just contained two categories. Sources in the geodetic catalog had a target of 12 sessions/year; the remaining ICRF-1 defining sources had a target of two sessions/year. All other sources did not have a specific target. As the program evolved, different kinds of sources with different observing targets were added. During the scheduling process, the scheduler has the option of automatically selecting N sources which have not met their target. We discuss the history and present some results of this successful program.
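The session-by-session bookkeeping described above might look roughly like the following schema sketch. The real system uses MySQL; all table and column names here are assumptions, and SQLite is used only to keep the sketch self-contained:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE source_usage (          -- one row per source per session
    session_id TEXT,
    source     TEXT,
    scheduled  INTEGER,              -- observations scheduled
    correlated INTEGER,              -- observations successfully correlated
    used       INTEGER               -- observations used in the session
);
CREATE TABLE source_target (         -- target successful sessions per year
    source            TEXT PRIMARY KEY,
    sessions_per_year INTEGER
);
""")
con.executemany("INSERT INTO source_usage VALUES (?,?,?,?,?)",
                [("R1", "0059+581", 40, 38, 35),
                 ("R2", "0059+581", 42, 40, 39)])
con.execute("INSERT INTO source_target VALUES ('0059+581', 12)")

# Sources that have not yet met their yearly target of successful sessions,
# the set a scheduler could draw its N under-observed sources from:
rows = con.execute("""
    SELECT t.source
    FROM source_target t
    LEFT JOIN source_usage u ON u.source = t.source AND u.used > 0
    GROUP BY t.source, t.sessions_per_year
    HAVING COUNT(u.session_id) < t.sessions_per_year
""").fetchall()
print(rows)
```

Here the example source appears in only two successful sessions against a target of twelve, so it is flagged for extra scheduling.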

  5. A New Global Open Source Marine Hydrocarbon Emission Site Database

    Science.gov (United States)

    Onyia, E., Jr.; Wood, W. T.; Barnard, A.; Dada, T.; Qazzaz, M.; Lee, T. R.; Herrera, E.; Sager, W.

    2017-12-01

    Hydrocarbon emission sites (e.g. seeps) discharge large volumes of fluids and gases into the oceans that are not only important for biogeochemical budgets, but also support abundant chemosynthetic communities. Documenting the locations of modern emissions is a first step towards understanding and monitoring how they affect the global state of the seafloor and oceans. Currently, no global open source (i.e. non-proprietary) detailed maps of emission sites are available. As a solution, we have created a database that is housed within an Excel spreadsheet and use the latest versions of Earthpoint and Google Earth for position coordinate conversions and data mapping, respectively. To date, approximately 1,000 data points have been collected from referenceable sources across the globe, and we are continually expanding the dataset. Due to the variety of spatial extents encountered, we identified each site using two different methods: (1) point (x, y, z) locations for individual sites; and (2) delineation of areas where sites are clustered. Certain well-known areas, such as the Gulf of Mexico and the Mediterranean Sea, have a greater abundance of information, whereas significantly less information is available in other regions due to the absence of emission sites, lack of data, or because the existing data is proprietary. Although the geographical extent of the data is currently restricted to regions where the most data is publicly available, as the database matures, we expect to have more complete coverage of the world's oceans. This database is an information resource that consolidates and organizes the existing literature on hydrocarbons released into the marine environment, thereby providing a comprehensive reference for future work. We expect that the availability of seafloor hydrocarbon emission maps will benefit scientific understanding of hydrocarbon-rich areas as well as potentially aiding hydrocarbon exploration and environmental impact assessments.

  6. Efficient Processing of Multiple DTW Queries in Time Series Databases

    DEFF Research Database (Denmark)

    Kremer, Hardy; Günnemann, Stephan; Ivanescu, Anca-Maria

    2011-01-01

    …In many of today's applications, however, large numbers of queries arise at any given time. Existing DTW techniques do not process multiple DTW queries simultaneously, a serious limitation which slows down overall processing. In this paper, we propose an efficient processing approach for multiple DTW queries…
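For context, a minimal single-query DTW distance, the primitive behind the batched queries discussed above, can be written as follows. The paper's actual contribution, simultaneous processing of many such queries, is not sketched here:

```python
def dtw(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between two
    numeric sequences, with absolute difference as the local cost."""
    inf = float("inf")
    d = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    d[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[len(a)][len(b)]
```

Because every query against a time series database pays this quadratic cost, sharing work across many simultaneous queries (as the paper proposes) is what makes large query workloads tractable.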

  7. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
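One plausible reading of the described mapping to the states correct, incorrect and unknown (an assumption for illustration, not the paper's exact combination rule) discounts each module's object-state distribution by its model-applicability mass, in Dempster-Shafer style:

```python
def to_three_state(p_correct, p_applicable):
    """Map a module's two distributions onto {correct, incorrect, unknown}.

    p_correct:    the module's probability that the database object is correct
    p_applicable: the module's probability that its road model applies here
    Mass the model cannot assign (1 - p_applicable) goes to "unknown"."""
    return {
        "correct":   p_correct * p_applicable,
        "incorrect": (1.0 - p_correct) * p_applicable,
        "unknown":   1.0 - p_applicable,
    }

print(to_three_state(0.8, 0.5))
```

Several such three-state masses, one per module, could then be fused with Dempster's rule of combination before deciding the object's final state.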

  8. Multiple k Nearest Neighbor Query Processing in Spatial Network Databases

    DEFF Research Database (Denmark)

    Xuegang, Huang; Jensen, Christian Søndergaard; Saltenis, Simonas

    2006-01-01

    This paper concerns the efficient processing of multiple k nearest neighbor queries in a road-network setting. The assumed setting covers a range of scenarios such as the one where a large population of mobile service users that are constrained to a road network issue nearest-neighbor queries for points of interest that are accessible via the road network. Given multiple k nearest neighbor queries, the paper proposes progressive techniques that selectively cache query results in main memory and subsequently reuse these for query processing. The paper initially proposes techniques for the case where an upper bound on k is known a priori and then extends the techniques to the case where this is not so. Based on empirical studies with real-world data, the paper offers insight into the circumstances under which the different proposed techniques can be used with advantage for multiple k nearest…
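The core reuse idea, that a cached result for k' neighbors at a location can answer any later query at that location with k ≤ k' by truncation, can be sketched as follows; the paper's actual road-network techniques are considerably more involved:

```python
class KnnCache:
    """Toy cache of ranked nearest-neighbor lists keyed by query location."""

    def __init__(self):
        self._cache = {}  # query location -> neighbors ranked by network distance

    def put(self, location, ranked_neighbors):
        self._cache[location] = list(ranked_neighbors)

    def get(self, location, k):
        cached = self._cache.get(location)
        if cached is not None and len(cached) >= k:
            return cached[:k]  # reuse: truncate the cached k'-NN list
        return None            # cache miss: the road network must be queried

cache = KnnCache()
cache.put("junction_17", ["cafe_a", "fuel_b", "atm_c"])
print(cache.get("junction_17", 2))
```

Whether caching pays off depends on how often queries repeat locations and on the memory budget, which is what the paper's empirical studies examine.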

  9. 46 CFR 111.10-5 - Multiple energy sources.

    Science.gov (United States)

    2010-10-01

    § 111.10-5 Multiple energy sources (46 CFR, Shipping; General Requirements, Power Supply): Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  10. Data Sources for Trait Databases: Comparing the Phenomic Content of Monographs and Evolutionary Matrices.

    Science.gov (United States)

    Dececchi, T Alex; Mabee, Paula M; Blackburn, David C

    2016-01-01

    Databases of organismal traits that aggregate information from one or multiple sources can be leveraged for large-scale analyses in biology. Yet the differences among these data streams and how well they capture trait diversity have never been explored. We present the first analysis of the differences between phenotypes captured in free text of descriptive publications ('monographs') and those used in phylogenetic analyses ('matrices'). We focus our analysis on osteological phenotypes of the limbs of four extinct vertebrate taxa critical to our understanding of the fin-to-limb transition. We find that there is low overlap between the anatomical entities used in these two sources of phenotype data, indicating that phenotypes represented in matrices are not simply a subset of those found in monographic descriptions. Perhaps as expected, compared to characters found in matrices, phenotypes in monographs tend to emphasize descriptive and positional morphology, be somewhat more complex, and relate to fewer additional taxa. While based on a small set of focal taxa, these qualitative and quantitative data suggest that either source of phenotypes alone will result in incomplete knowledge of variation for a given taxon. As a broader community develops to use and expand databases characterizing organismal trait diversity, it is important to recognize the limitations of the data sources and develop strategies to more fully characterize variation both within species and across the tree of life.

  11. Integrating multiple data sources for malware classification

    Science.gov (United States)

    Anderson, Blake Harrell; Storlie, Curtis B; Lane, Terran

    2015-04-28

    Disclosed herein are representative embodiments of tools and techniques for classifying programs. According to one exemplary technique, at least one graph representation of at least one dynamic data source of at least one program is generated. Also, at least one graph representation of at least one static data source of the at least one program is generated. Additionally, at least using the at least one graph representation of the at least one dynamic data source and the at least one graph representation of the at least one static data source, the at least one program is classified.

  12. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif; Schuster, Gerard T.

    2014-01-01

    …are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first…

  13. Reconstruction of multiple line source attenuation maps

    International Nuclear Information System (INIS)

    Celler, A.; Sitek, A.; Harrop, R.

    1996-01-01

    A simple configuration for a transmission source for single photon emission computed tomography (SPECT) was proposed, which utilizes a series of collimated line sources parallel to the axis of rotation of a camera. The detector is equipped with a standard parallel-hole collimator. We have demonstrated that this type of source configuration can be used to generate sufficient data for the reconstruction of the attenuation map when using 8-10 line sources spaced 3.5-4.5 cm apart for a 30 × 40 cm detector at 65 cm distance from the sources. Transmission data for a nonuniform thorax phantom were simulated, then binned and reconstructed using filtered backprojection (FBP) and iterative methods. The optimum maps are obtained with data binned into 2-3 bins and FBP reconstruction. The activity in the source was investigated for uniform and exponential activity distributions, as was the effect of gaps and overlaps of the neighboring fan beams. A prototype of the line source has been built and experimental verification of the technique has started

  14. MetaboSearch: tool for mass-based metabolite identification using multiple databases.

    Directory of Open Access Journals (Sweden)

    Bin Zhou

    Searching metabolites against databases according to their masses is often the first step in metabolite identification for a mass spectrometry-based untargeted metabolomics study. Major metabolite databases include the Human Metabolome Database (HMDB), the Madison Metabolomics Consortium Database (MMCD), Metlin, and LIPID MAPS. Since each of these databases covers only a fraction of the metabolome, integration of the search results from these databases is expected to yield more comprehensive coverage. However, manually combining multiple search results is generally difficult when identification of hundreds of metabolites is desired. We have implemented a web-based software tool that enables simultaneous mass-based search against the four major databases and integration of the results. In addition, more complete chemical identifier information for the metabolites is retrieved by cross-referencing multiple databases. The search results are merged based on IUPAC International Chemical Identifier (InChI) keys. Besides a simple list of m/z values, the software can accept ion annotation information as input for enhanced metabolite identification. The performance of the software is demonstrated on mass spectrometry data acquired in both positive and negative ionization modes. Compared with search results from individual databases, MetaboSearch provides better coverage of the metabolome and more complete chemical identifier information. The software tool is available at http://omics.georgetown.edu/MetaboSearch.html.
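The merging step the abstract describes, unifying per-database hits on a shared InChIKey, can be sketched as below. The database names, hit lists, and field layout are illustrative only and do not reflect MetaboSearch's actual internals:

```python
from collections import defaultdict

def merge_hits(per_database_hits):
    """Merge mass-search hits from several databases, keyed on InChIKey.

    per_database_hits: {db_name: [(inchikey, compound_name), ...]}
    Returns {inchikey: {"names": set_of_names, "sources": set_of_dbs}}.
    """
    merged = defaultdict(lambda: {"names": set(), "sources": set()})
    for db, hit_list in per_database_hits.items():
        for inchikey, name in hit_list:
            merged[inchikey]["names"].add(name)
            merged[inchikey]["sources"].add(db)
    return dict(merged)

# Hypothetical hits for one query mass from three databases; the same
# compound reported under different names collapses to one InChIKey entry.
hits = {
    "HMDB":   [("WQZGKKKJIJFFOK-GASJEMHNSA-N", "D-Glucose")],
    "MMCD":   [("WQZGKKKJIJFFOK-GASJEMHNSA-N", "Glucose")],
    "Metlin": [("BTCSSZJGUNDROE-UHFFFAOYSA-N", "GABA")],
}
merged = merge_hits(hits)
```

Keying on the InChIKey rather than on compound names is what makes the merge robust to the naming differences between databases.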

  15. Research on neutron source multiplication method in nuclear critical safety

    International Nuclear Information System (INIS)

    Zhu Qingfu; Shi Yongqian; Hu Dingsheng

    2005-01-01

    This paper concerns the neutron source multiplication method in nuclear criticality safety research. Based on the neutron diffusion equation with an external neutron source, the effective sub-critical multiplication factor k_s is deduced; k_s differs from the effective neutron multiplication factor k_eff in the case of a sub-critical system with an external neutron source. A verification experiment on a sub-critical system indicates that the parameter measured with the neutron source multiplication method is k_s, and that k_s depends on the external neutron source position in the sub-critical system and on the external neutron source spectrum. The relation between k_s and k_eff and their effect on nuclear criticality safety are discussed. (author)
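The source-multiplication relation underlying the method is conventionally written as a geometric sum: each source neutron spawns successive fission generations scaled by the sub-critical multiplication factor. This is the standard textbook form, not an equation quoted from the paper itself:

```latex
M \;=\; S\left(1 + k_s + k_s^{2} + \cdots\right) \;=\; \frac{S}{1 - k_s},
\qquad 0 \le k_s < 1,
```

where $S$ is the external source strength and $M$ the total neutron production. The measured multiplication thus determines $k_s$, which coincides with $k_{\mathrm{eff}}$ only in the limit as the system approaches criticality, consistent with the distinction the abstract draws.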

  16. Oak Ridge Reservation Environmental Protection Rad Neshaps Radionuclide Inventory Web Database and Rad Neshaps Source and Dose Database.

    Science.gov (United States)

    Scofield, Patricia A; Smith, Linda L; Johnson, David N

    2017-07-01

    The U.S. Environmental Protection Agency promulgated national emission standards for emissions of radionuclides other than radon from US Department of Energy facilities in Chapter 40 of the Code of Federal Regulations (CFR) 61, Subpart H. This regulatory standard limits the annual effective dose that any member of the public can receive from Department of Energy facilities to 0.1 mSv. As defined in the preamble of the final rule, all of the facilities on the Oak Ridge Reservation, i.e., the Y-12 National Security Complex, Oak Ridge National Laboratory, East Tennessee Technology Park, and any other U.S. Department of Energy operations on Oak Ridge Reservation, combined, must meet the annual dose limit of 0.1 mSv. At Oak Ridge National Laboratory, there are monitored sources and numerous unmonitored sources. To maintain radiological source and inventory information for these unmonitored sources, e.g., laboratory hoods, equipment exhausts, and room exhausts not currently venting to monitored stacks on the Oak Ridge National Laboratory campus, the Environmental Protection Rad NESHAPs Inventory Web Database was developed. This database is updated annually and is used to compile emissions data for the annual Radionuclide National Emission Standards for Hazardous Air Pollutants (Rad NESHAPs) report required by 40 CFR 61.94. It also provides supporting documentation for facility compliance audits. In addition, a Rad NESHAPs source and dose database was developed to import the source and dose summary data from Clean Air Act Assessment Package-1988 computer model files. This database provides Oak Ridge Reservation and facility-specific source inventory; doses associated with each source and facility; and total doses for the Oak Ridge Reservation dose.

  17. Learning from Multiple Sources for Video Summarisation

    OpenAIRE

    Zhu, Xiatian; Loy, Chen Change; Gong, Shaogang

    2015-01-01

    Many visual surveillance tasks, e.g. video summarisation, are conventionally accomplished through analysing imagery-based features. Relying solely on visual cues for public surveillance video understanding is unreliable, since visual observations obtained from public space CCTV video data are often not sufficiently trustworthy and events of interest can be subtle. On the other hand, non-visual data sources such as weather reports and traffic sensory signals are readily accessible but are not exp...

  18. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  19. Performance of an open-source heart sound segmentation algorithm on eight independent databases.

    Science.gov (United States)

    Liu, Chengyu; Springer, David; Clifford, Gari D

    2017-08-01

    Heart sound segmentation is a prerequisite step for the automatic analysis of heart sound signals, facilitating the subsequent identification and classification of pathological events. Recently, hidden Markov model-based algorithms have received increased interest due to their robustness in processing noisy recordings. In this study we aim to evaluate the performance of the recently published logistic regression based hidden semi-Markov model (HSMM) heart sound segmentation method, by using a wider variety of independently acquired data of varying quality. Firstly, we constructed a systematic evaluation scheme based on a new collection of heart sound databases, which we assembled for the PhysioNet/CinC Challenge 2016. This collection includes a total of more than 120 000 s of heart sounds recorded from 1297 subjects (including both healthy subjects and cardiovascular patients) and comprises eight independent heart sound databases sourced from multiple independent research groups around the world. Then, the HSMM-based segmentation method was evaluated using the assembled eight databases. The common evaluation metrics of sensitivity, specificity, and accuracy, as well as the F1 measure, were used. In addition, the effect of varying the tolerance window for determining a correct segmentation was evaluated. The results confirm the high accuracy of the HSMM-based algorithm on a separate test dataset comprised of 102 306 heart sounds. An average F1 score of 98.5% for segmenting S1 and systole intervals and 97.2% for segmenting S2 and diastole intervals was observed. The F1 score was shown to increase with an increase in the tolerance window size, as expected. The high segmentation accuracy of the HSMM-based algorithm on a large database confirmed the algorithm's effectiveness. The described evaluation framework, combined with the largest collection of open access heart sound data, provides essential resources for…

  20. An open source web interface for linking models to infrastructure system databases

    Science.gov (United States)

    Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.

    2016-12-01

    Models of networked engineered resource systems such as water or energy systems are often built collaboratively by developers from different domains working at different locations. These models can be linked to large-scale real-world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so has the need for online services and tools that enable the efficient development and deployment of these models. Hydra Platform is an open source, web-based data management system which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform exposes a JSON web API that allows external programs (referred to as 'Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform: the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs, and analyse results.
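The generalised network representation the abstract describes, topology stored as nodes and links and exchanged as JSON, can be sketched as below. The field names and payload shape here are illustrative assumptions, not Hydra Platform's actual API schema:

```python
import json

def build_network_payload(name, nodes, links):
    """Assemble a network description in a generalised node/link JSON
    form of the kind a Hydra Platform-style web API could accept.
    Field names ('network', 'nodes', 'links', ...) are hypothetical."""
    return json.dumps({
        "network": {
            "name": name,
            "nodes": [{"id": i, "name": n} for i, n in enumerate(nodes)],
            "links": [{"node_1": a, "node_2": b} for a, b in links],
        }
    })

# A two-node toy water network: one reservoir supplying one city
payload = build_network_payload("demo-basin", ["reservoir", "city"], [(0, 1)])
```

Keeping the topology in a discipline-neutral node/link form is what lets the same stored network serve water, energy, or other domain models, with domain semantics supplied by the Apps.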

  1. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  2. Volcano Monitoring using Multiple Remote Data Sources

    Science.gov (United States)

    Reath, K. A.; Pritchard, M. E.

    2016-12-01

    Satellite-based remote sensing instruments can be used to determine quantitative values related to precursory activity that can act as a warning sign of an upcoming eruption. These warning signs are measured by examining anomalous activity in: (1) thermal flux, (2) gas/aerosol emission rates, (3) ground deformation, and (4) ground-based seismic readings. Patterns in each of these data sources are then analyzed to create classifications of different phases of precursory activity. These different phases of activity act as guidelines to monitor the progression of precursory activity leading to an eruption. Current monitoring methods rely on high temporal resolution satellite imagery from instruments like the Advanced Very High Resolution Radiometer (AVHRR) and the Moderate Resolution Imaging Spectroradiometer (MODIS), for variations in thermal and aerosol emissions, and the Ozone Monitoring Instrument (OMI) and Ozone Mapping Profiler Suite (OMPS), for variations in gas emissions, providing a valuable resource for near real-time monitoring of volcanic activity. However, the low spatial resolution of these data enables only events that produce a high thermal output or a large amount of gas/aerosol emissions to be detected. High spatial resolution instruments, like the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor, have a small enough pixel size (90 m²) that subtle variations in both thermal flux and gas/aerosol emission rates in the pre-eruptive period can be detected. Including these data with the already established high temporal resolution data helps to identify and classify precursory activity patterns months before an eruption (Reath et al., 2016). By correlating these data with ground surface deformation data, determined from Interferometric Synthetic Aperture Radar (InSAR), and seismic data, collected by the Incorporated Research Institutions for Seismology (IRIS) data archive, subtle…

  3. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Directory of Open Access Journals (Sweden)

    Surasak Saokaew

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics, e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases, 20 from Thailand and 20 from Japan, were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA within the Asia-Pacific region is needed.

  4. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Science.gov (United States)

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics, e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases, 20 from Thailand and 20 from Japan, were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA within the Asia-Pacific region is needed.

  5. European database on indoor air pollution sources in buildings: Current status of database structure and software

    NARCIS (Netherlands)

    Molina, J.L.; Clausen, G.H.; Saarela, K.; Plokker, W.; Bluyssen, P.M.; Bishop, W.; Oliveira Fernandes, E. de

    1996-01-01

    …the European Joule II Project European Data Base for Indoor Air Pollution Sources in Buildings. The aim of the project is to produce a tool which would be used by designers to take into account the actual pollution of the air from the building elements and ventilation and air conditioning system…

  6. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    Science.gov (United States)

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  7. Optimising case detection within UK electronic health records : use of multiple linked databases for detecting liver injury

    NARCIS (Netherlands)

    Wing, Kevin; Bhaskaran, Krishnan; Smeeth, Liam; van Staa, Tjeerd P|info:eu-repo/dai/nl/304827762; Klungel, Olaf H|info:eu-repo/dai/nl/181447649; Reynolds, Robert F; Douglas, Ian

    2016-01-01

    OBJECTIVES: We aimed to create a 'multidatabase' algorithm for identification of cholestatic liver injury using multiple linked UK databases, before (1) assessing the improvement in case ascertainment compared to using a single database and (2) developing a new single-database case-definition

  8. Characteristics of pediatric multiple sclerosis: The Turkish pediatric multiple sclerosis database.

    Science.gov (United States)

    Yılmaz, Ünsal; Anlar, Banu; Gücüyener, Kıvılcım

    2017-11-01

    To document the clinical and paraclinical features of pediatric multiple sclerosis (MS) in Turkey. Data of MS patients with onset before age 18 years (n = 193) were collected from 27 pediatric neurology centers throughout Turkey. Earlier-onset (<12 years) and later-onset (≥12 years) groups were compared. There were 123 (63.7%) girls and 70 (36.3%) boys aged 4-17 years, median 14 years at disease onset. Family history of MS was 6.5%. The first presentation was polysymptomatic in 55.4% of patients, with brainstem syndromes (50.3%), sensory disturbances (44%), motor symptoms (33.2%), and optic neuritis (26.4%) as common initial manifestations. Nineteen children had facial paralysis and 10 had epileptic seizures at first attack; 21 (11%) were initially diagnosed with acute disseminated encephalomyelitis (ADEM). Oligoclonal bands were identified in 68% of patients. Magnetic resonance imaging revealed periventricular (96%), cortical/juxtacortical (64.2%), brainstem (63%), cerebellum (51.4%), and spinal cord (67%) involvement. Visual evoked potentials (VEP) were abnormal in 52%; serum 25-hydroxyvitamin D levels were low in 68.5% of patients. The earlier-onset group had a higher rate of infection/vaccination preceding initial attack, initial diagnosis of ADEM, longer interval between first 2 attacks, and more disability accumulating in the first 3 years of the disease. Brainstem and cerebellum are common sites of clinical and radiological involvement in pediatric-onset MS. VEP abnormalities are frequent even in patients without history of optic neuropathy. Vitamin D status does not appear to affect the course in early disease. MS beginning before 12 years of age has certain characteristics in history and course. Copyright © 2017 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.

  9. Coordinating a Large, Amalgamated REU Program with Multiple Funding Sources

    Science.gov (United States)

    Fiorini, Eugene; Myers, Kellen; Naqvi, Yusra

    2017-01-01

    In this paper, we discuss the challenges of organizing a large REU program amalgamated from multiple funding sources, including diverse participants, mentors, and research projects. We detail the program's structure, activities, and recruitment, and we hope to demonstrate that the organization of this REU is not only beneficial to its…

  10. Comparison of open source database systems(characteristics, limits of usage)

    OpenAIRE

    Husárik, Braňko

    2008-01-01

    The goal of this work is to compare selected open source database systems (Ingres, PostgreSQL, Firebird, MySQL). The first part of the work focuses on the history and present situation of the companies developing these products. The second part contains a comparison of a certain group of specific features and limits. A benchmark of selected operations forms its own part. The possibilities of using the mentioned database systems are summarized at the end of the work.

  11. Datafish Multiphase Data Mining Technique to Match Multiple Mutually Inclusive Independent Variables in Large PACS Databases.

    Science.gov (United States)

    Kelley, Brendan P; Klochko, Chad; Halabi, Safwan; Siegal, Daniel

    2016-06-01

    Retrospective data mining has tremendous potential in research but is time and labor intensive. Current data mining software contains many advanced search features but is limited in its ability to identify patients who meet multiple complex independent search criteria. Simple keyword and Boolean search techniques are ineffective when more complex searches are required, or when a search for multiple mutually inclusive variables becomes important. This is particularly true when trying to identify patients with a set of specific radiologic findings or proximity in time across multiple different imaging modalities. Another challenge that arises in retrospective data mining is that much variation still exists in how image findings are described in radiology reports. We present an algorithmic approach to solve this problem and describe a specific use case scenario in which we applied our technique to a real-world data set in order to identify patients who matched several independent variables in our institution's picture archiving and communication systems (PACS) database.
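The multi-criteria matching problem described, where reports must satisfy several mutually inclusive findings rather than a single keyword, can be sketched as successive filter phases that progressively narrow the candidate set. The report corpus and the patterns below are made up for illustration and are not the authors' actual implementation:

```python
import re

def multiphase_match(reports, patterns):
    """Keep only reports matching ALL of the given patterns (mutually
    inclusive criteria), narrowing the candidate set one phase at a time."""
    candidates = dict(reports)
    for pattern in patterns:
        rx = re.compile(pattern, re.IGNORECASE)
        candidates = {rid: text for rid, text in candidates.items()
                      if rx.search(text)}
    return set(candidates)

# Hypothetical radiology report snippets keyed by accession number
reports = {
    1: "CT chest: pulmonary embolism involving the right lower lobe.",
    2: "CT chest: no embolism. Small right pleural effusion.",
    3: "CT chest: pulmonary embolism with associated pleural effusion.",
}
# Only reports containing BOTH findings should survive
matched = multiphase_match(reports, [r"pulmonary embolism", r"pleural effusion"])
```

A plain Boolean OR over keywords would return all three reports here; requiring every pattern to match is what isolates the patients with the full set of findings.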

  12. Evaluation of Network Reliability for Computer Networks with Multiple Sources

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2012-01-01

    Evaluating the reliability of a network with multiple sources to multiple sinks is a critical issue from the perspective of quality management. Due to the unrealistic definition of paths in the network models of previous literature, existing models are not appropriate for real-world computer networks such as the Taiwan Advanced Research and Education Network (TWAREN). This paper proposes a modified stochastic-flow network model to evaluate the network reliability of a practical computer network with multiple sources where data is transmitted through several light paths (LPs). Network reliability is defined as the probability of delivering a specified amount of data from the sources to the sink. It is taken as a performance index to measure the service level of TWAREN. This paper studies the network reliability of the international portion of TWAREN from two sources (Taipei and Hsinchu) to one sink (New York), which goes through submarine and land surface cables between Taiwan and the United States.

  13. Multisensory softness perceived compliance from multiple sources of information

    CERN Document Server

    Luca, Massimiliano Di

    2014-01-01

    Offers a unique multidisciplinary overview of how humans interact with soft objects and how multiple sensory signals are used to perceive material properties, with an emphasis on object deformability. The authors describe a range of setups that have been employed to study and exploit sensory signals involved in interactions with compliant objects, as well as techniques to simulate and modulate softness, including a psychophysical perspective of the field. Multisensory Softness focuses on the cognitive mechanisms underlying the use of multiple sources of information in softness perception.

  14. Tracking of Multiple Moving Sources Using Recursive EM Algorithm

    Directory of Open Access Journals (Sweden)

    Böhme Johann F

    2005-01-01

    We deal with recursive direction-of-arrival (DOA) estimation of multiple moving sources. Based on the recursive EM algorithm, we develop two recursive procedures to estimate the time-varying DOA parameter for narrowband signals. The first procedure requires no prior knowledge about the source movement. The second procedure assumes that the motion of the moving sources is described by a linear polynomial model. The proposed recursion updates the polynomial coefficients as new data arrive. The suggested approaches have two major advantages: simple implementation and easy extension to wideband signals. Numerical experiments show that both procedures provide excellent results in a slowly changing environment. When the DOA parameter changes fast or two source directions cross each other, the procedure designed for the linear polynomial model performs better than the general procedure. Compared to the beamforming technique based on the same parameterization, our approach is computationally favorable and has a wider range of applications.

  15. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low-Density Parity-Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental…

  16. Research on amplification multiple of source neutron number for ADS

    International Nuclear Information System (INIS)

    Liu Guisheng; Zhao Zhixiang; Zhang Baocheng; Shen Qingbiao; Ding Dazhao

    1998-01-01

    The NJOY-91.91 and MILER code systems were applied to process and generate 44-group cross sections in AMPX master library format from CENDL-2 and ENDF/B-6. It is important that an ADS (Accelerator-Driven System) assembly spectrum be used as the weighting spectrum for generating multi-group constants. Amplification multiples of the source neutron number for several fast assemblies were calculated

  17. 75 FR 69591 - Medicaid Program; Withdrawal of Determination of Average Manufacturer Price, Multiple Source Drug...

    Science.gov (United States)

    2010-11-15

    ..., Multiple Source Drug Definition, and Upper Limits for Multiple Source Drugs AGENCY: Centers for Medicare... withdrawing the definition of ``multiple source drug'' as it was revised in the ``Medicaid Program; Multiple Source Drug Definition'' final rule published in the October 7, 2008 Federal Register. DATES: Effective...

  18. Quantum Query Complexity for Searching Multiple Marked States from an Unsorted Database

    International Nuclear Information System (INIS)

    Shang Bin

    2007-01-01

    An important and common class of search problems is to find all marked states in an unsorted database with a large number of states. Grover's original quantum search algorithm finds a single marked state with some uncertainty; it has been generalized to the case of multiple marked states and modified to find a single marked state with certainty. However, the query complexity of finding all marked states has not been addressed. We use a generalized Long's algorithm with high precision to solve such a problem. We calculate the approximate query complexity, which increases with the number of marked states and with the precision that we demand. In the end we introduce an algorithm for the problem on a 'duality computer' and show its advantage over other algorithms.
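    The scaling of query complexity with the number of marked states can be illustrated with the standard Grover iteration count (the paper's generalized Long's algorithm differs in detail; this sketch only shows the growth of the naive "rerun until all are found" strategy):

```python
import math

def grover_iterations(n_states, n_marked):
    """Optimal number of standard Grover iterations to find one of n_marked states."""
    theta = math.asin(math.sqrt(n_marked / n_states))
    return max(1, round(math.pi / (4 * theta)))

def queries_to_find_all(n_states, n_marked):
    """Naive upper bound: rerun the search as each found marked state is removed."""
    return sum(grover_iterations(n_states, m) for m in range(n_marked, 0, -1))

# For N = 2**20 states, finding all marked states costs more queries as M grows.
assert queries_to_find_all(2**20, 4) > queries_to_find_all(2**20, 1)
```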

  19. Multiple approaches to microbial source tracking in tropical northern Australia

    KAUST Repository

    Neave, Matthew

    2014-09-16

    Microbial source tracking is an area of research in which multiple approaches are used to identify the sources of elevated bacterial concentrations in recreational lakes and beaches. At our study location in Darwin, northern Australia, water quality in the harbor is generally good, however dry-season beach closures due to elevated Escherichia coli and enterococci counts are a cause for concern. The sources of these high bacteria counts are currently unknown. To address this, we sampled sewage outfalls, other potential inputs, such as urban rivers and drains, and surrounding beaches, and used genetic fingerprints from E. coli and enterococci communities, fecal markers and 454 pyrosequencing to track contamination sources. A sewage effluent outfall (Larrakeyah discharge) was a source of bacteria, including fecal bacteria that impacted nearby beaches. Two other treated effluent discharges did not appear to influence sites other than those directly adjacent. Several beaches contained fecal indicator bacteria that likely originated from urban rivers and creeks within the catchment. Generally, connectivity between the sites was observed within distinct geographical locations and it appeared that most of the bacterial contamination on Darwin beaches was confined to local sources.

  20. DABAM: an open-source database of X-ray mirrors metrology

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez del Rio, Manuel, E-mail: srio@esrf.eu [ESRF - The European Synchrotron, 71 Avenue des Martyrs, 38000 Grenoble (France); Bianchi, Davide [AC2T Research GmbH, Viktro-Kaplan-Strasse 2-C, 2700 Wiener Neustadt (Austria); Cocco, Daniele [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Glass, Mark [ESRF - The European Synchrotron, 71 Avenue des Martyrs, 38000 Grenoble (France); Idir, Mourad [NSLS II, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Metz, Jim [InSync Inc., 2511C Broadbent Parkway, Albuquerque, NM 87107 (United States); Raimondi, Lorenzo; Rebuffi, Luca [Elettra-Sincrotrone Trieste SCpA, Basovizza (TS) (Italy); Reininger, Ruben; Shi, Xianbo [Advanced Photon Source, Argonne National Laboratory, Argonne, IL 60439 (United States); Siewert, Frank [BESSY II, Helmholtz Zentrum Berlin, Institute for Nanometre Optics and Technology, Albert-Einstein-Strasse 15, 12489 Berlin (Germany); Spielmann-Jaeggi, Sibylle [Swiss Light Source at Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Takacs, Peter [Instrumentation Division, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States); Tomasset, Muriel [Synchrotron Soleil (France); Tonnessen, Tom [InSync Inc., 2511C Broadbent Parkway, Albuquerque, NM 87107 (United States); Vivo, Amparo [ESRF - The European Synchrotron, 71 Avenue des Martyrs, 38000 Grenoble (France); Yashchuk, Valeriy [Advanced Light Source, Lawrence Berkeley National Laboratory, MS 15-R0317, 1 Cyclotron Road, Berkeley, CA 94720-8199 (United States)

    2016-04-20

    DABAM is an open-source database of X-ray mirror metrology, to be used with ray-tracing and wave-propagation codes for simulating the effect of surface errors on the performance of a synchrotron radiation beamline. An open-source database containing metrology data for X-ray mirrors is presented. It makes available metrology data (mirror height and slope profiles) that can be used with simulation tools for calculating the effects of optical surface errors on the performance of an optical instrument, such as a synchrotron beamline. A typical case is the degradation of the intensity profile at the focal position in a beamline due to mirror surface errors. This database for metrology (DABAM) aims to provide users of simulation tools with data from real mirrors. The data included in the database are described in this paper, with details of how the mirror parameters are stored. Accompanying software is provided to allow simple access to and processing of these data, to calculate the most common statistical parameters, and to create input files for the most widely used simulation codes. Some optics simulations are presented and discussed to illustrate the use of real profiles from the database.

  1. MyMolDB: a micromolecular database solution with open source and free components.

    Science.gov (United States)

    Xia, Bing; Tai, Zheng-Fu; Gu, Yu-Cheng; Li, Bang-Jing; Ding, Li-Sheng; Zhou, Yan

    2011-10-01

    Managing chemical structures is an important daily task in small laboratories. Few solutions are available on the internet, and most of them are closed-source applications. The open-source applications typically have limited capability and only basic cheminformatics functionality. In this article, we describe an open-source solution for managing chemicals in research groups, built from open-source and free components. It has a user-friendly interface with functions for chemical handling and intensive searching. MyMolDB is a micromolecular database solution that supports exact, substructure, similarity, and combined searching. The solution is implemented mainly in the scripting language Python, with a web-based interface for compound management and searching. Almost all searches are in essence done with pure SQL on the database, exploiting the high performance of the database engine. Impressive search speeds have thus been achieved on large data sets, since no external CPU-intensive languages are involved in the key steps of the search. MyMolDB is open-source software and can be modified and/or redistributed under the GNU General Public License version 3 published by the Free Software Foundation (Free Software Foundation Inc. The GNU General Public License, Version 3, 2007. Available at: http://www.gnu.org/licenses/gpl.html). The software itself can be found at http://code.google.com/p/mymoldb/. Copyright © 2011 Wiley Periodicals, Inc.
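    The "pure SQL inside the engine" idea can be sketched with the standard-library sqlite3 module. The real MyMolDB uses chemistry-aware keys and structure searching; the molecular-formula column here is a simplified, hypothetical stand-in for an exact search:

```python
import sqlite3

# Minimal sketch: an exact compound search executed entirely in SQL,
# so no per-row work happens in Python. Table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE compounds (id INTEGER PRIMARY KEY, name TEXT, formula TEXT)")
conn.executemany(
    "INSERT INTO compounds (name, formula) VALUES (?, ?)",
    [("ethanol", "C2H6O"), ("dimethyl ether", "C2H6O"), ("water", "H2O")],
)

# The database engine does the filtering and sorting.
rows = conn.execute(
    "SELECT name FROM compounds WHERE formula = ? ORDER BY name", ("C2H6O",)
).fetchall()
assert [r[0] for r in rows] == ["dimethyl ether", "ethanol"]
```

    Substructure and similarity searches would add precomputed fingerprint columns so that they, too, can be filtered by the engine rather than in application code.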

  2. Localizing Brain Activity from Multiple Distinct Sources via EEG

    Directory of Open Access Journals (Sweden)

    George Dassios

    2014-01-01

    Full Text Available An important question arising in the framework of electroencephalography (EEG) is whether one can recognize, by means of a recorded surface potential, the number of activated areas in the brain. In the present paper, employing a homogeneous spherical conductor as an approximation of the brain, we provide a criterion which determines whether the measured surface potential is evoked by a single or by multiple localized neuronal excitations. We show that the uniqueness of the inverse problem for a single dipole is closely connected with the measured data satisfying certain relations. Further, we present the necessary and sufficient conditions which decide whether the collected data originate from a single dipole or from numerous dipoles. In the case where the EEG data arise from multiple parallel dipoles, an isolation of the individual sources is, in general, not possible.

  3. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis, supporting better predictions about risks and opportunities, identification of trends, and spotting of anomalies. Although a number of open-source spatial analysis libraries like geopandas and shapely are available today, most of them are restricted to the manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces network overload, as the complete data need not be replicated into the user's local system and only a subset of the entire dataset need be fetched into memory at any one time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4.
The basic architecture of the package consists of three main components - 1) a connection to the dashDB represented by the instance IdaDataBase, which uses a middleware API namely - pypyodbc or jaydebeapi to establish the database connection via
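    The "push operations into the database" idea can be illustrated with a toy wrapper: a Python-side condition is translated into SQL and executed by the engine, so only the qualifying subset reaches local memory. This uses the standard-library sqlite3 as a stand-in for dashDB; the class and method names are invented and do not reproduce ibmdbpy's real API:

```python
import sqlite3

class IdaLikeTable:
    """Toy stand-in for an in-database table handle (hypothetical API)."""

    def __init__(self, conn, table):
        self.conn, self.table = conn, table

    def filter_sql(self, condition):
        """Build the SQL once; nothing is fetched until explicitly asked for."""
        return f"SELECT * FROM {self.table} WHERE {condition}"

    def fetch(self, condition):
        # The filtering runs inside the database engine, not in Python.
        return self.conn.execute(self.filter_sql(condition)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (id INTEGER, x REAL, y REAL)")
conn.executemany("INSERT INTO points VALUES (?, ?, ?)",
                 [(1, 0.5, 0.5), (2, 5.0, 5.0), (3, 9.0, 1.0)])

table = IdaLikeTable(conn, "points")
inside = table.fetch("x < 2 AND y < 2")   # only one row crosses the network
assert [row[0] for row in inside] == [1]
```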

  4. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, they are platform-dependent, and they consume a great deal of CPU and memory. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their own customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not give users the ability to directly execute arbitrary SQL statements.

  5. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, they are platform-dependent, and they consume a great deal of CPU and memory. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their own customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not give users the ability to directly execute arbitrary SQL statements.

  6. Writing in the workplace: Constructing documents using multiple digital sources

    Directory of Open Access Journals (Sweden)

    Mariëlle Leijten

    2014-02-01

    Full Text Available In today’s workplaces professional communication often involves constructing documents from multiple digital sources—integrating one’s own texts/graphics with ideas based on others’ text/graphics. This article presents a case study of a professional communication designer as he constructs a proposal over several days. Drawing on keystroke and interview data, we map the professional’s overall process, plot the time course of his writing/design, illustrate how he searches for content and switches among optional digital sources, and show how he modifies and reuses others’ content. The case study reveals not only that the professional (1) searches extensively through multiple sources for content and ideas but also that he (2) constructs visual content (charts, graphs, photographs) as well as verbal content, and (3) manages his attention and motivation over this extended task. Since these three activities are not represented in current models of writing, we propose their addition not just to models of communication design, but also to models of writing in general.

  7. Freshwater Biological Traits Database (Traits)

    Science.gov (United States)

    The traits database was compiled for a project on climate change effects on river and stream ecosystems. The traits data, gathered from multiple sources, focused on information published or otherwise well-documented by trustworthy sources.

  8. Summary of Adsorption/Desorption Experiments for the European Database on Indoor Air Pollution Sources in Buildings

    DEFF Research Database (Denmark)

    Kjær, Ulla Dorte; Tirkkonen, T.

    1996-01-01

    Experimental data for adsorption/desorption in building materials. Contribution to the European Database on Indoor Air Pollution Sources in buildings.

  9. Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.

    Science.gov (United States)

    Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran

    2017-04-01

    Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risk inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences are drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). We therefore propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories about both the causes of events and the reporting on them. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
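    The intuition behind correcting underreporting with two overlapping sources can be sketched with the classic two-list capture-recapture logic: if each source independently reports a true event with some probability, the overlap between the sources identifies both the reporting rates and the true event rate. This is a simplified moment estimator, not the authors' full regression-based maximum likelihood estimator:

```python
import random

# Two-source underreporting model: an event occurs with probability p; source j
# independently reports a true event with probability r_j and never reports a
# non-event. The overlap (n11) identifies all three parameters.

def estimate(n, n11, n10, n01):
    n1p, np1 = n11 + n10, n11 + n01       # marginal report counts per source
    r1 = n11 / np1                        # P(source 1 reports | event)
    r2 = n11 / n1p                        # P(source 2 reports | event)
    p = n1p * np1 / (n * n11)             # corrected event probability
    return p, r1, r2

random.seed(0)
p_true, r1_true, r2_true, n = 0.3, 0.7, 0.5, 200000
n11 = n10 = n01 = 0
for _ in range(n):
    z = random.random() < p_true
    y1 = z and random.random() < r1_true
    y2 = z and random.random() < r2_true
    n11 += y1 and y2; n10 += y1 and not y2; n01 += y2 and not y1

p_hat, r1_hat, r2_hat = estimate(n, n11, n10, n01)
assert abs(p_hat - p_true) < 0.02 and abs(r1_hat - r1_true) < 0.02
```

    The paper generalizes this idea by letting both the true-event probability and each source's reporting probability depend on covariates.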

  10. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    Science.gov (United States)

    Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.
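    The mediator's "get" methods that derive a missing attribute value via a transformation can be sketched as below. The class and attribute names are invented for illustration; the patent's generated code is far more elaborate:

```python
# Sketch of a generated accessor: if an attribute was not present in the
# source database, the "get" method derives it from other attributes.

class GeneRecord:
    """Hypothetical warehouse object with a derivable attribute."""

    def __init__(self, start=None, end=None, length=None):
        self._start, self._end, self._length = start, end, length

    def get_length(self):
        # Derive the value on demand if it is missing from the source data.
        if self._length is None and None not in (self._start, self._end):
            self._length = self._end - self._start
        return self._length

rec = GeneRecord(start=100, end=250)   # length absent in the source database
assert rec.get_length() == 150         # derived by the transformation method
```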

  11. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    Science.gov (United States)

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers from limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high-scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. The increased identifications raise spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that the enhanced quantification contributes to improved sensitivity in differential expression analyses.
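    The false discovery rate control that such integration methods rely on is commonly estimated with target-decoy competition: at a score threshold, the FDR is estimated as the ratio of decoy hits to target hits above the threshold. This is a generic sketch with synthetic scores, not MSblender's actual probability model:

```python
# Target-decoy FDR estimation sketch. Scores are synthetic; in practice each
# spectrum is searched against both the real (target) and reversed (decoy)
# protein database, and decoy hits estimate the false-match rate.

def fdr_at_threshold(target_scores, decoy_scores, t):
    n_target = sum(s >= t for s in target_scores)
    n_decoy = sum(s >= t for s in decoy_scores)
    return n_decoy / n_target if n_target else 0.0

targets = [0.9, 0.85, 0.8, 0.7, 0.6, 0.3, 0.2]
decoys = [0.65, 0.35, 0.25, 0.15, 0.1, 0.05, 0.02]

assert fdr_at_threshold(targets, decoys, 0.7) == 0.0   # 4 targets, 0 decoys
assert fdr_at_threshold(targets, decoys, 0.5) == 0.2   # 5 targets, 1 decoy
```

    A combined score that accounts for correlation between engines, as MSblender does, lets more true PSMs clear a threshold at the same estimated FDR.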

  12. Interpretation of the TRADE In-Pile source multiplication experiments

    International Nuclear Information System (INIS)

    Mercatali, Luigi; Carta, Mario; Peluso, Vincenzo

    2006-01-01

    Within the framework of the neutronic characterization of the TRIGA RC-1 reactor in support of the TRADE (TRiga Accelerator Driven Experiment) program, the interpretation of the subcriticality level measurements performed in static regime during the TRADE In-Pile experimental program is presented. Different levels of subcriticality were measured using the MSA (Modified Source Approximated) method by inserting a standard fixed radioactive source into different core positions. Starting from a reference configuration, fuel elements were removed, control rods were moved outward as required for the coupling experiments envisioned with the proton accelerator, and fission chambers were inserted in order to measure subcritical count rates. A neutron-physics analysis based on the modified formulation of the source multiplication method (MSM) has been carried out, which requires the systematic solution, for each experimental configuration, of the homogeneous Boltzmann equation (in both forward and adjoint forms) and of the inhomogeneous equation. By means of this methodology, calculated correction factors were produced to be applied to the MSA measured reactivities, in order to account for the spatial and energetic effects that change the detector efficiencies and the effective source with respect to the calibration configuration. The methodology has been tested against a large number of experimental states. The measurements underline the sensitivity of the MSA measured reactivities to core geometry changes and control rod perturbations; the ability of the MSM factors to dramatically correct for this sensitivity is demonstrated, making this technique a relevant methodology in view of the upcoming US RACE program to be performed in TRIGA reactors
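    The arithmetic of the MSA estimate and the MSM correction described above can be sketched in a few lines: a calibration state with known reactivity and count rate scales the count rate measured in an unknown subcritical state, and MSM then multiplies by a calculated factor correcting for spatial/energetic changes. All numbers below are illustrative, not experimental values:

```python
# MSA: reactivity scales inversely with the detector count rate, calibrated
# against a state of known reactivity. MSM applies a calculated correction
# factor for changes in detector efficiency and effective source.

def rho_msa(rho_cal, count_cal, count_meas):
    """Modified Source Approximated reactivity (inverse count-rate scaling)."""
    return rho_cal * count_cal / count_meas

def rho_msm(rho_msa_value, f_correction):
    """Apply the calculated MSM correction factor."""
    return rho_msa_value * f_correction

rho_cal = -1.0       # known reactivity of the calibration state (illustrative units)
count_cal = 5000.0   # counts/s in the calibration state
count_meas = 2000.0  # counts/s in a deeper subcritical state

uncorrected = rho_msa(rho_cal, count_cal, count_meas)  # -2.5
corrected = rho_msm(uncorrected, 1.1)                  # MSM-adjusted value
assert abs(uncorrected - (-2.5)) < 1e-9
```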

  13. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

    Nowadays, the number of commercial tools available for accessing Databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web based tool written using Python and Javascript. It relies on jQuery and python libraries, and is intended to provide a simple handler to different Database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install, and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...

  14. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  15. Feature extraction from multiple data sources using genetic programming.

    Energy Technology Data Exchange (ETDEWEB)

    Szymanski, J. J. (John J.); Brumby, Steven P.; Pope, P. A. (Paul A.); Eads, D. R. (Damian R.); Galassi, M. C. (Mark C.); Harvey, N. R. (Neal R.); Perkins, S. J. (Simon J.); Porter, R. B. (Reid B.); Theiler, J. P. (James P.); Young, A. C. (Aaron Cody); Bloch, J. J. (Jeffrey J.); David, N. A. (Nancy A.); Esch-Mosher, D. M. (Diana M.)

    2002-01-01

    Feature extraction from imagery is an important and long-standing problem in remote sensing. In this paper, we report on work using genetic programming to perform feature extraction simultaneously from multispectral and digital elevation model (DEM) data. The tool used is the GENetic Imagery Exploitation (GENIE) software, which produces image-processing software that inherently combines spatial and spectral processing. GENIE is particularly useful in exploratory studies of imagery, such as one often conducts when combining data from multiple sources. The user trains the software by painting the feature of interest with a simple graphical user interface. GENIE then uses genetic programming techniques to produce an image-processing pipeline. Here, we demonstrate evolution of image-processing algorithms that extract a range of land-cover features including towns, grasslands, wildfire burn scars, and several types of forest. We use imagery from the DOE/NNSA Multispectral Thermal Imager (MTI) spacecraft, fused with USGS 1:24000 scale DEM data.
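    The train-by-painting loop can be sketched as an evolutionary search: candidate extractors are scored against the user's painted labels, then mutated and selected. Here a candidate "pipeline" is reduced to a single threshold parameter; the real GENIE evolves full spatial/spectral image-processing pipelines:

```python
import random

# Minimal evolutionary-search sketch: evolve a threshold that separates
# "feature" pixels from background, scored against painted training labels.
random.seed(1)

pixels = [0.1, 0.2, 0.35, 0.6, 0.7, 0.9]   # toy pixel values
labels = [0, 0, 0, 1, 1, 1]                 # painted labels (1 = feature)

def fitness(threshold):
    """Fraction of training pixels classified correctly by the threshold."""
    return sum((p > threshold) == bool(l)
               for p, l in zip(pixels, labels)) / len(pixels)

# Hill-climbing evolution: mutate the best candidate, keep improvements.
best = random.random()
for _ in range(200):
    candidate = min(1.0, max(0.0, best + random.gauss(0, 0.1)))
    if fitness(candidate) >= fitness(best):
        best = candidate

assert fitness(best) >= 0.5   # fitness never decreases during the search
```

    A population, crossover, and a richer set of image operators turn this skeleton into genuine genetic programming.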

  16. Estimation of subcriticality by neutron source multiplication method

    International Nuclear Information System (INIS)

    Sakurai, Kiyoshi; Suzaki, Takenori; Arakawa, Takuya; Naito, Yoshitaka

    1995-03-01

    Subcritical cores were constructed in the core tank of the TCA by arraying 2.6% enriched UO2 fuel rods into nxn square lattices of 1.956 cm pitch. Vertical distributions of the neutron count rates for the fifteen subcritical cores (n=17, 16, 14, 11, 8) with different water levels were measured at 5 cm intervals with 235U micro-fission counters at in-core and out-of-core positions, with a 252Cf neutron source placed near the core center. The continuous-energy Monte Carlo code MCNP-4A was used for the calculation of neutron multiplication factors and neutron count rates. The important conclusions of this study are as follows: (1) Differences between the neutron multiplication factors obtained from the exponential experiment and from MCNP-4A are below 1% in most cases. (2) Standard deviations of the neutron count rates calculated with MCNP-4A using 500000 histories are 5-8%. The calculated neutron count rates are consistent with the measured ones. (author)

  17. Assessing the use of multiple sources in student essays.

    Science.gov (United States)

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
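    The "multi-word" flexible pattern matching described above can be sketched with the standard-library re module: a content element is detected when its key words appear in order within a sentence, allowing a few intervening words. The patterns and gap size here are invented for illustration and do not reproduce the study's actual rubric (nor its LSA or SVM systems):

```python
import re

def build_pattern(keywords, gap=3):
    """Match the keywords in order, allowing up to `gap` words between them."""
    filler = r"(?:\W+\w+){0,%d}\W+" % gap
    return re.compile(filler.join(map(re.escape, keywords)), re.IGNORECASE)

# Hypothetical content element: "water ... evaporates ... sun".
pattern = build_pattern(["water", "evaporates", "sun"])

assert pattern.search("Water slowly evaporates under the hot sun.") is not None
assert pattern.search("The sun is bright.") is None
```

    LSA, by contrast, would score a student sentence by cosine similarity to the source sentence in a reduced vector space, which is why it was better at detecting paraphrased explicit content while pattern matching was better for cross-text inferences.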

  18. Performance of popular open source databases for HEP related computing problems

    International Nuclear Information System (INIS)

    Kovalskyi, D; Sfiligoi, I; Wuerthwein, F; Yagil, A

    2014-01-01

    Databases are used in many software components of HEP computing, from monitoring and job scheduling to data storage and processing. It is not always clear at the beginning of a project if a problem can be handled by a single server, or if one needs to plan for a multi-server solution. Before a scalable solution is adopted, it helps to know how well it performs in a single server case to avoid situations when a multi-server solution is adopted mostly due to sub-optimal performance per node. This paper presents comparison benchmarks of popular open source database management systems. As a test application we use a user job monitoring system based on the Glidein workflow management system used in the CMS Collaboration.

  19. ZeBase: an open-source relational database for zebrafish laboratories.

    Science.gov (United States)

    Hensley, Monica R; Hassenplug, Eric; McPhail, Rodney; Leung, Yuk Fai

    2012-03-01

    ZeBase is an open-source relational database for zebrafish inventory. It is designed for recording genetic, breeding, and survival information of fish lines maintained in a single- or multi-laboratory environment. Users can easily access ZeBase through standard web browsers anywhere on a network. Convenient search and reporting functions are available to facilitate routine inventory work; such functions can also be automated by simple scripting. Optional barcode generation and scanning are also built in for easy access to the information related to any fish. Further information on the database and an example implementation can be found at http://zebase.bio.purdue.edu.

  20. Multiple time-reversed guide-sources in shallow water

    Science.gov (United States)

    Gaumond, Charles F.; Fromm, David M.; Lingevitch, Joseph F.; Gauss, Roger C.; Menis, Richard

    2003-10-01

    Detection in a monostatic, broadband, active sonar system in shallow water is degraded by propagation-induced spreading. The detection improvement from multiple spatially separated guide sources (GSs) is presented as a method to mitigate this degradation. The improvement of detection by using information in a set of one-way transmissions from a variety of positions is shown using sea data. The experimental area is south of the Hudson Canyon off the coast of New Jersey. The data were taken using five elements of a time-reversing VLA. The five elements were contiguous and at midwater depth. The target and guide source was an echo repeater positioned at various ranges and at middepth. The transmitted signals were 3.0- to 3.5-kHz LFMs. The data are analyzed to show the amount of information present in the collection, a baseline probability of detection (PD) not using the collection of GS signals, and the improvement in PD from the use of various sets of GS signals. The dependence of the improvement as a function of range is also shown. [The authors acknowledge support from Dr. Jeffrey Simmen, ONR321OS, and the chief scientist Dr. Charles Holland. Work supported by ONR.]

  1. Consumer Product Category Database

    Science.gov (United States)

    The Chemical and Product Categories database (CPCat) catalogs the use of over 40,000 chemicals and their presence in different consumer products. The chemical use information is compiled from multiple sources while product information is gathered from publicly available Material Safety Data Sheets (MSDS). EPA researchers are evaluating the possibility of expanding the database with additional product and use information.

  2. Raman database of amino acids solutions: A critical study of Extended Multiplicative Signal Correction

    KAUST Repository

    Candeloro, Patrizio

    2013-01-01

    The Raman spectra of biological materials always exhibit complex profiles, comprising several peaks and/or bands which arise due to the large variety of biomolecules. The extraction of quantitative information from these spectra is not a trivial task. While qualitative information can be retrieved from the changes in peak frequencies or from the appearance/disappearance of some peaks, quantitative analysis requires an examination of peak intensities. Unfortunately, in biological samples it is not easy to identify a reference peak for normalizing intensities, and this makes it very difficult to study the peak intensities. In recent decades a more refined mathematical tool, the extended multiplicative signal correction (EMSC), has been proposed for treating infrared spectra, which is also capable of providing quantitative information. From the mathematical and physical point of view, EMSC can also be applied to Raman spectra, as recently proposed. In this work the reliability of the EMSC procedure is tested by application to a well defined biological system: the 20 standard amino acids and their combination in peptides. The first step is the collection of a Raman database of these 20 amino acids, and subsequently EMSC processing is applied to retrieve quantitative information from amino acids mixtures and peptides. A critical review of the results is presented, showing that EMSC has to be carefully handled for complex biological systems. © 2013 The Royal Society of Chemistry.
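
A basic EMSC-style correction can be sketched as an ordinary least-squares fit of a measured spectrum to a reference spectrum plus a low-order polynomial baseline; the Gaussian "peak" and the added baseline below are synthetic and purely illustrative.

```python
import numpy as np

def emsc_correct(spectrum, reference, poly_order=2):
    """Basic EMSC: fit spectrum ~ b*reference + polynomial baseline, then
    return the baseline-removed, scale-normalized spectrum."""
    x = np.linspace(-1.0, 1.0, len(spectrum))
    # Design matrix: the reference spectrum plus baseline terms 1, x, x^2, ...
    design = np.column_stack([reference] + [x**k for k in range(poly_order + 1)])
    coeffs, *_ = np.linalg.lstsq(design, spectrum, rcond=None)
    b, baseline_coeffs = coeffs[0], coeffs[1:]
    baseline = sum(c * x**k for k, c in enumerate(baseline_coeffs))
    return (spectrum - baseline) / b

# Synthetic check: a scaled reference with a linear baseline is recovered.
ref = np.exp(-np.linspace(-3, 3, 200) ** 2)        # one Gaussian "peak"
measured = 2.5 * ref + 0.3 + 0.1 * np.linspace(-1, 1, 200)
corrected = emsc_correct(measured, ref)
print(np.allclose(corrected, ref, atol=1e-8))
```

The caution raised in the abstract applies exactly here: if the sample contains constituents not spanned by the reference column(s), their signal leaks into the fitted baseline and scale factor.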

  3. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
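
The success-fail branch of such a method is conjugate: Beta priors updated with binomial demonstration data. Below is a simplified sketch with hypothetical prior parameters and credibility weights; the paper's maximum-entropy and data-consistency-check steps are omitted, and the mixture weights are updated by each component's marginal likelihood.

```python
from math import lgamma, exp

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def mixture_posterior_mean(priors, s, f):
    """Posterior mean of a detection rate under a credibility-weighted mixture
    of Beta priors, updated with s successes / f failures of test data.
    `priors` is a list of (weight, alpha, beta) tuples."""
    post = []
    for w, a, b in priors:
        # marginal likelihood of the data under this prior component
        log_ml = log_beta(a + s, b + f) - log_beta(a, b)
        post.append((w * exp(log_ml), a + s, b + f))
    z = sum(w for w, _, _ in post)
    return sum(w / z * a / (a + b) for w, a, b in post)

# Hypothetical: one optimistic and one pessimistic prior, 18/20 faults detected.
est = mixture_posterior_mean([(0.6, 9, 1), (0.4, 5, 5)], s=18, f=2)
print(round(est, 3))
```

Because the optimistic component explains 18/20 detections far better, it dominates the posterior mixture, pulling the point estimate toward its component mean of 0.9.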

  4. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  5. Performance Evaluation of a Database System in a Multiple Backend Configurations,

    Science.gov (United States)

    1984-10-01

    leaving a system process, the internal performance measurements of MMSD have been carried out. Methodologies for constructing test databases...access directory data via the AT, EDIT, and CDT. In designing the test database, one of the key concepts is the choice of the directory attributes in...internal timing. These requests are selected since they retrieve the smallest portion of the test database and the processing time for each request is

  6. NoSQL data model for semi-automatic integration of ethnomedicinal plant data from multiple sources.

    Science.gov (United States)

    Ningthoujam, Sanjoy Singh; Choudhury, Manabendra Dutta; Potsangbam, Kumar Singh; Chetia, Pankaj; Nahar, Lutfun; Sarker, Satyajit D; Basar, Norazah; Das Talukdar, Anupam

    2014-01-01

    Sharing traditional knowledge with the scientific community could refine scientific approaches to phytochemical investigation and conservation of ethnomedicinal plants. As such, integration of traditional knowledge with scientific data using a single platform for sharing is greatly needed. However, ethnomedicinal data are available in heterogeneous formats, which depend on cultural aspects, survey methodology and focus of the study. Phytochemical and bioassay data are also available from many open sources in various standards and customised formats. To design a flexible data model that could integrate both primary and curated ethnomedicinal plant data from multiple sources. The current model is based on MongoDB, one of the Not only Structured Query Language (NoSQL) databases. Although it does not contain schema, modifications were made so that the model could incorporate both standard and customised ethnomedicinal plant data format from different sources. The model presented can integrate both primary and secondary data related to ethnomedicinal plants. Accommodation of disparate data was accomplished by a feature of this database that supported a different set of fields for each document. It also allowed storage of similar data having different properties. The model presented is scalable to a highly complex level with continuing maturation of the database, and is applicable for storing, retrieving and sharing ethnomedicinal plant data. It can also serve as a flexible alternative to a relational and normalised database. Copyright © 2014 John Wiley & Sons, Ltd.
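
The flexibility described — documents from different sources carrying different field sets — can be sketched without a running MongoDB server. The records and field names below are invented for illustration, and `find` mimics only MongoDB-style exact matching on top-level fields.

```python
# Schema-less "documents": each record keeps only the fields its source
# provides, mirroring how one MongoDB collection can hold differently
# shaped documents.
plants = [
    {"species": "Justicia adhatoda", "local_name": "Vasaka",
     "uses": ["cough", "asthma"], "survey": {"district": "Cachar", "year": 2012}},
    {"species": "Centella asiatica", "phytochemicals": ["asiaticoside"],
     "bioassay": {"target": "antioxidant", "ic50_ug_ml": 45.0}},
]

def find(collection, query):
    """Minimal MongoDB-style exact-match query over top-level fields."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in query.items())]

hits = find(plants, {"species": "Justicia adhatoda"})
print(len(hits), hits[0]["local_name"])
```

In a relational, normalized design the two record shapes above would need either separate tables or many nullable columns; the document model stores each as-is.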

  7. The Net Enabled Waste Management Database as an international source of radioactive waste management information

    International Nuclear Information System (INIS)

    Csullog, G.W.; Friedrich, V.; Miaw, S.T.W.; Tonkay, D.; Petoe, A.

    2002-01-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an integral part of the IAEA's policies and strategy related to the collection and dissemination of information, both internal to the IAEA in support of its activities and external to the IAEA (publicly available). The paper highlights the NEWMDB's role in relation to the routine reporting of status and trends in radioactive waste management, in assessing the development and implementation of national systems for radioactive waste management, in support of a newly developed indicator of sustainable development for radioactive waste management, in support of reporting requirements for the Joint Convention on the Safety of Spent Fuel Management and on the Safety of Radioactive Waste Management, in support of IAEA activities related to the harmonization of waste management information at the national and international levels and in relation to the management of spent/disused sealed radioactive sources. (author)

  8. Free and Open Source Options for Creating Database-Driven Subject Guides

    Directory of Open Access Journals (Sweden)

    Edward M. Corrado

    2008-03-01

    This article reviews available cost-effective options libraries have for updating and maintaining pathfinders such as subject guides and course pages. The paper discusses many of the available options, from the standpoint of a mid-sized academic library which is evaluating alternatives to static-HTML subject guides. Static HTML guides, while useful, have proven difficult and time-consuming to maintain. The article includes a discussion of open source database-driven solutions (such as SubjectsPlus, LibData, Research Guide, and Library Course Builder), wikis, and social tagging sites like del.icio.us. This article discusses both the functionality and the relative strengths and weaknesses of each of these options.

  9. International patent analysis of water source heat pump based on orbit database

    Science.gov (United States)

    Li, Na

    2018-02-01

    Using the Orbit database, this paper analysed the international patents of the water source heat pump (WSHP) industry with patent analysis methods such as analysis of publication tendency, geographical distribution, technology leaders and top assignees. It is found that the beginning of the 21st century was a period of rapid growth in WSHP patent applications. Germany and the United States researched and developed WSHP early on, but Japan and China have now become important countries for patent applications. China has been developing faster and faster in recent years, but its patents are concentrated in universities and urgently need to be transferred. Through an objective analysis, this paper aims to provide appropriate decision references for the development of the domestic WSHP industry.

  10. DeitY-TU face database: its design, multiple camera capturing, characteristics, and evaluation

    Science.gov (United States)

    Bhowmik, Mrinal Kanti; Saha, Kankan; Saha, Priya; Bhattacharjee, Debotosh

    2014-10-01

    The development of the latest face databases is providing researchers different and realistic problems that play an important role in the development of efficient algorithms for solving the difficulties during automatic recognition of human faces. This paper presents the creation of a new visual face database, named the Department of Electronics and Information Technology-Tripura University (DeitY-TU) face database. It contains face images of 524 persons belonging to different nontribes and Mongolian tribes of north-east India, with their anthropometric measurements for identification. Database images are captured within a room with controlled variations in illumination, expression, and pose along with variability in age, gender, accessories, make-up, and partial occlusion. Each image contains the combined primary challenges of face recognition, i.e., illumination, expression, and pose. This database also represents some new features: soft biometric traits such as mole, freckle, scar, etc., and facial anthropometric variations that may be helpful for researchers for biometric recognition. It also provides a comparative study of existing two-dimensional face image databases. The database has been tested using two baseline algorithms, linear discriminant analysis and principal component analysis, whose performance scores other researchers may use as a control for comparison.

  11. Scanning an individual monitoring database for multiple occurrences using bi-gram analysis

    International Nuclear Information System (INIS)

    Van Dijk, J. W. E.

    2007-01-01

    Maintaining the integrity of the databases is one of the important aspects of quality assurance at individual monitoring services and national dose registers. This paper presents a method for finding and preventing the occurrence of duplicate entries in the databases that can occur, e.g. because of a variable spelling or misspelling of the name. The method is based on bi-gram text analysis techniques. The methods can also be used for retrieving dose data in historical databases in the framework of dose reconstruction efforts of persons of whom the spelling of the name as originally entered, possibly decades ago, is uncertain. (authors)
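
A bi-gram duplicate check of the kind described can be sketched with the Sørensen–Dice coefficient over character bi-grams; the names and the 0.7 cutoff below are invented for illustration.

```python
def bigrams(name):
    """Set of character bi-grams of a name, ignoring case and spaces."""
    s = name.lower().replace(" ", "")
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a, b):
    """Sorensen-Dice similarity over character bi-grams; spelling variants
    of the same name score high while unrelated names score low."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

def possible_duplicates(names, threshold=0.7):
    """Flag pairs of database entries whose name similarity exceeds a cutoff."""
    return [(x, y) for i, x in enumerate(names) for y in names[i + 1:]
            if dice(x, y) >= threshold]

print(possible_duplicates(["Jansen, P.", "Janssen, P.", "Smith, A."]))
```

Because the comparison works on bi-gram sets rather than exact strings, it also supports the retrieval use case mentioned: querying a historical register with an uncertain spelling and ranking candidate matches by similarity.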

  12. Federated or cached searches: providing expected performance from multiple invasive species databases

    Science.gov (United States)

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-01-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches have been proposed to allow users to search "deep" web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods, and show that federated searches will not provide the performance and flexibility users require, and that a central cache of the data is needed to improve performance.
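
The cached approach can be sketched as a harvest step that normalizes heterogeneous provider records into one local index, after which each search is a single local lookup rather than one round-trip per provider. The provider data and field names below are invented.

```python
# Two simulated provider databases with different record shapes. A federated
# search must query each provider at request time; the cached approach
# harvests everything up front into one local index.
provider_a = [{"name": "Tamarix ramosissima", "status": "invasive"}]
provider_b = [{"scientific_name": "Pueraria montana", "invasive": True}]

def harvest():
    """Normalize provider records into a single cached index keyed by name."""
    cache = {}
    for rec in provider_a:
        cache[rec["name"].lower()] = {"source": "A", **rec}
    for rec in provider_b:
        cache[rec["scientific_name"].lower()] = {"source": "B", **rec}
    return cache

cache = harvest()

def cached_search(query):
    """One local lookup instead of one network round-trip per provider."""
    return cache.get(query.lower())

print(cached_search("Pueraria montana")["source"])
```

The trade-off the paper weighs is visible even in this toy: the cache answers instantly and uniformly, but is only as fresh as the last harvest, whereas a federated search is always current but as slow as its slowest provider.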

  13. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    OpenAIRE

    Seok-Hyoung Lee; Hwan-Min Kim; Ho-Seop Choe

    2012-01-01

    While science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. Results of technologists' research, such as, journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to ach...

  14. Neutron generators with size scalability, ease of fabrication and multiple ion source functionalities

    Science.gov (United States)

    Elizondo-Decanini, Juan M

    2014-11-18

    A neutron generator is provided with a flat, rectilinear geometry and surface mounted metallizations. This construction provides scalability and ease of fabrication, and permits multiple ion source functionalities.

  15. Existing data sources for clinical epidemiology: the Danish Patient Compensation Association database.

    Science.gov (United States)

    Tilma, Jens; Nørgaard, Mette; Mikkelsen, Kim Lyngby; Johnsen, Søren Paaske

    2015-01-01

    Any patient in the Danish health care system who experiences a treatment injury can make a compensation claim to the Danish Patient Compensation Association (DPCA) free of charge. The aim of this paper is to describe the DPCA database as a source of data for epidemiological research. Data to DPCA are collected prospectively on all claims and include information on patient factors and health records, system factors, and administrative data. Approval of claims is based on the principle of injury caused by treatment below the experienced-specialist standard, or on intolerable, unexpected extensiveness of injury. Average processing time of a compensation claim is 6-8 months. Data collection is nationwide and started in 1992. The patient's central registration system number, a unique personal identifier, allows for data linkage to other registries such as the Danish National Patient Registry. The DPCA data are accessible for research following data usage permission and make it possible to analyze all claims or specific subgroups to identify predictors, outcomes, etc. DPCA data have until now been used only in few studies but could be a useful data source in future studies of health care-related injuries.

  16. Existing data sources for clinical epidemiology: the Danish Patient Compensation Association database

    Directory of Open Access Journals (Sweden)

    Tilma J

    2015-07-01

    Jens Tilma,1 Mette Nørgaard,1 Kim Lyngby Mikkelsen,2 Søren Paaske Johnsen1 1Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, 2Danish Patient Compensation Association, Copenhagen, Denmark Abstract: Any patient in the Danish health care system who experiences a treatment injury can make a compensation claim to the Danish Patient Compensation Association (DPCA) free of charge. The aim of this paper is to describe the DPCA database as a source of data for epidemiological research. Data to DPCA are collected prospectively on all claims and include information on patient factors and health records, system factors, and administrative data. Approval of claims is based on the principle of injury caused by treatment below the experienced-specialist standard, or on intolerable, unexpected extensiveness of injury. Average processing time of a compensation claim is 6–8 months. Data collection is nationwide and started in 1992. The patient's central registration system number, a unique personal identifier, allows for data linkage to other registries such as the Danish National Patient Registry. The DPCA data are accessible for research following data usage permission and make it possible to analyze all claims or specific subgroups to identify predictors, outcomes, etc. DPCA data have until now been used only in few studies but could be a useful data source in future studies of health care-related injuries. Keywords: public health care, treatment injuries, no-fault compensation, registries, research, Denmark

  17. Relationship between exposure to multiple noise sources and noise annoyance

    NARCIS (Netherlands)

    Miedema, H.M.E.

    2004-01-01

    Relationships between exposure to noise [metric: day-night level (DNL) or day-evening-night level (DENL)] from a single source (aircraft, road traffic, or railways) and annoyance based on a large international dataset have been published earlier. Also for stationary sources relationships have been

  18. H2DB: a heritability database across multiple species by annotating trait-associated genomic loci.

    Science.gov (United States)

    Kaminuma, Eli; Fujisawa, Takatomo; Tanizawa, Yasuhiro; Sakamoto, Naoko; Kurata, Nori; Shimizu, Tokurou; Nakamura, Yasukazu

    2013-01-01

    H2DB (http://tga.nig.ac.jp/h2db/), an annotation database of genetic heritability estimates for humans and other species, has been developed as a knowledge database to connect trait-associated genomic loci. Heritability estimates have been investigated for individual species, particularly in human twin studies and plant/animal breeding studies. However, there appears to be no comprehensive heritability database for both humans and other species. Here, we introduce an annotation database for genetic heritabilities of various species that was annotated by manually curating online public resources in PUBMED abstracts and journal contents. The proposed heritability database contains attribute information for trait descriptions, experimental conditions, trait-associated genomic loci and broad- and narrow-sense heritability specifications. Annotated trait-associated genomic loci, for which most are single-nucleotide polymorphisms derived from genome-wide association studies, may be valuable resources for experimental scientists. In addition, we assigned phenotype ontologies to the annotated traits for the purposes of discussing heritability distributions based on phenotypic classifications.

  19. Modeling water demand when households have multiple sources of water

    Science.gov (United States)

    Coulibaly, Lassina; Jakus, Paul M.; Keith, John E.

    2014-07-01

    A significant portion of the world's population lives in areas where public water delivery systems are unreliable and/or deliver poor quality water. In response, people have developed important alternatives to publicly supplied water. To date, most water demand research has been based on single-equation models for a single source of water, with very few studies that have examined water demand from two sources of water (where all nonpublic system water sources have been aggregated into a single demand). This modeling approach leads to two outcomes. First, the demand models do not capture the full range of alternatives, so the true economic relationship among the alternatives is obscured. Second, and more seriously, economic theory predicts that demand for a good becomes more price-elastic as the number of close substitutes increases. If researchers artificially limit the number of alternatives studied to something less than the true number, the price elasticity estimate may be biased downward. This paper examines water demand in a region with near universal access to piped water, but where system reliability and quality is such that many alternative sources of water exist. In extending the demand analysis to four sources of water, we are able to (i) demonstrate why households choose the water sources they do, (ii) provide a richer description of the demand relationships among sources, and (iii) calculate own-price elasticity estimates that are more elastic than those generally found in the literature.
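
In a log-log demand specification, the own-price elasticity discussed here is simply the slope coefficient on log price. A minimal sketch on invented household data:

```python
import numpy as np

# Hypothetical household observations: quantity demanded falls as price rises.
prices = np.array([1.0, 1.5, 2.0, 3.0, 4.0])
quantities = np.array([10.0, 7.5, 6.2, 4.6, 3.7])

# ln Q = a + e * ln P  ->  the slope e is the own-price elasticity of demand.
slope, intercept = np.polyfit(np.log(prices), np.log(quantities), 1)
print(round(slope, 2))
```

The paper's point is that if close substitutes (here, the alternative water sources) are omitted from the model, the estimated slope will tend to understate how price-elastic demand really is.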

  20. Multiple station beamline at an undulator x-ray source

    DEFF Research Database (Denmark)

    Als-Nielsen, J.; Freund, A.K.; Grübel, G.

    1994-01-01

    The undulator X-ray source is an ideal source for many applications: the beam is brilliant, highly collimated in all directions, quasi-monochromatic, pulsed and linearly polarized. Such a precious source can feed several independently operated instruments by utilizing a downstream series of X-ray transparent monochromator crystals. Diamond in particular is an attractive monochromator as it is rather X-ray transparent and can be fabricated to a high degree of crystal perfection. Moreover, it has a very high heat conductivity and a rather small thermal expansion so the beam X-ray heat load problem...

  1. Multiple approaches to microbial source tracking in tropical northern Australia

    KAUST Repository

    Neave, Matthew; Luter, Heidi; Padovan, Anna; Townsend, Simon; Schobben, Xavier; Gibb, Karen

    2014-01-01

    , other potential inputs, such as urban rivers and drains, and surrounding beaches, and used genetic fingerprints from E. coli and enterococci communities, fecal markers and 454 pyrosequencing to track contamination sources. A sewage effluent outfall

  2. Metasurface Cloak Performance Near-by Multiple Line Sources and PEC Cylindrical Objects

    DEFF Research Database (Denmark)

    Arslanagic, Samel; Yatman, William H.; Pehrson, Signe

    2014-01-01

    The performance/robustness of metasurface cloaks to a complex field environment which may represent a realistic scenario of radiating sources is presently reported. Attention is devoted to the cloak operation near-by multiple line sources and multiple perfectly electrically conducting cylinders. ...

  3. Aurorasaurus Database of Real-Time, Soft-Sensor Sourced Aurora Data for Space Weather Research

    Science.gov (United States)

    Kosar, B.; MacDonald, E.; Heavner, M.

    2017-12-01

    Aurorasaurus is an innovative citizen science project focused on two fundamental objectives i.e., collecting real-time, ground-based signals of auroral visibility from citizen scientists (soft-sensors) and incorporating this new type of data into scientific investigations pertaining to aurora. The project has been live since the Fall of 2014, and as of Summer 2017, the database compiled approximately 12,000 observations (5295 direct reports and 6413 verified tweets). In this presentation, we will focus on demonstrating the utility of this robust science quality data for space weather research needs. These data scale with the size of the event and are well-suited to capture the largest, rarest events. Emerging state-of-the-art computational methods based on statistical inference such as machine learning frameworks and data-model integration methods can offer new insights that could potentially lead to better real-time assessment and space weather prediction when citizen science data are combined with traditional sources.

  4. Speculative Attacks with Multiple Sources of Public Information

    OpenAIRE

    Cornand, Camille; Heinemann, Frank

    2005-01-01

    We propose a speculative attack model in which agents receive multiple public signals. It is characterised by its focus on an informational structure, which sets free from the strict separation between public information and private information. Diverse pieces of public information can be taken into account differently by players and are likely to lead to different appreciations ex post. This process defines players’ private value. The main result is to show that equilibrium uniqueness depend...

  5. Synergies of multiple remote sensing data sources for REDD+ monitoring

    NARCIS (Netherlands)

    Sy, de V.; Herold, M.; Achard, F.; Asner, G.P.; Held, A.; Kellndorfer, J.; Verbesselt, J.

    2012-01-01

    Remote sensing technologies can provide objective, practical and cost-effective solutions for developing and maintaining REDD+ monitoring systems. This paper reviews the potential and status of available remote sensing data sources with a focus on different forest information products and synergies

  6. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    Directory of Open Access Journals (Sweden)

    Seok-Hyoung Lee

    2012-06-01

    While science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. Results of technologists' research, such as, journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to achieve interoperability of the information and thus not easy to implement meaningful science technology information services through information convergence. This study aims to address the aforementioned issue by analyzing mapping systems between classification systems in order to design a structure to connect a variety of classification systems used in the academic information database of the Korea Institute of Science and Technology Information, which provides science and technology information portal service. This study also aims to design a mapping system for the classification systems to be applied to actual science and technology information services and information management systems.

  7. Multiple-source multiple-harmonic active vibration control of variable section cylindrical structures: A numerical study

    Science.gov (United States)

    Liu, Jinxin; Chen, Xuefeng; Gao, Jiawei; Zhang, Xingwu

    2016-12-01

    Air vehicles, space vehicles and underwater vehicles, the cabins of which can be viewed as variable section cylindrical structures, have multiple rotational vibration sources (e.g., engines, propellers, compressors and motors), making the spectrum of noise multiple-harmonic. The suppression of such noise has been a focus of interests in the field of active vibration control (AVC). In this paper, a multiple-source multiple-harmonic (MSMH) active vibration suppression algorithm with feed-forward structure is proposed based on reference amplitude rectification and conjugate gradient method (CGM). An AVC simulation scheme called finite element model in-loop simulation (FEMILS) is also proposed for rapid algorithm verification. Numerical studies of AVC are conducted on a variable section cylindrical structure based on the proposed MSMH algorithm and FEMILS scheme. It can be seen from the numerical studies that: (1) the proposed MSMH algorithm can individually suppress each component of the multiple-harmonic noise with an unified and improved convergence rate; (2) the FEMILS scheme is convenient and straightforward for multiple-source simulations with an acceptable loop time. Moreover, the simulations have similar procedure to real-life control and can be easily extended to physical model platform.

  8. PUBLIC EXPOSURE TO MULTIPLE RF SOURCES IN GHANA.

    Science.gov (United States)

    Deatanyah, P; Abavare, E K K; Menyeh, A; Amoako, J K

    2018-03-16

    This paper describes an effort to respond to the suggestion in the World Health Organization (WHO) research agenda to better quantify potential exposure levels from a range of radiofrequency (RF) sources at 200 public access locations in Ghana. Wide-band measurements were performed with a spectrum analyser and a log-periodic antenna using a three-point spatial averaging method. The overall results represented a maximum of 0.19% of the ICNIRP reference levels for public exposure. These results were generally lower than those found in some previous studies, but were 58% (2.0 dB) greater than those found in similar work conducted in the USA. Major contributing sources of RF fields were identified to be FM broadcast and mobile base station sites. Three locations with the greatest measured RF fields could represent potential areas for epidemiological studies.
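
The decibel figure can be checked directly: for power-like quantities, a ratio r corresponds to 10·log10(r) dB, so a 58% increase is about 2.0 dB.

```python
from math import log10

def ratio_to_db(power_ratio):
    """Convert a power ratio to decibels: dB = 10 * log10(ratio)."""
    return 10 * log10(power_ratio)

# A 58% higher measured power level, i.e. a ratio of 1.58:
print(round(ratio_to_db(1.58), 1))
```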

  9. Accommodating multiple illumination sources in an imaging colorimetry environment

    Science.gov (United States)

    Tobin, Kenneth W., Jr.; Goddard, James S., Jr.; Hunt, Martin A.; Hylton, Kathy W.; Karnowski, Thomas P.; Simpson, Marc L.; Richards, Roger K.; Treece, Dale A.

    2000-03-01

    Researchers at the Oak Ridge National Laboratory have been developing a method for measuring color quality in textile products using a tri-stimulus color camera system. Initial results of the Imaging Tristimulus Colorimeter (ITC) were reported during 1999. These results showed that the projection onto convex sets (POCS) approach to color estimation could be applied to complex printed patterns on textile products with high accuracy and repeatability. Image-based color sensors used for on-line measurement are not colorimetric by nature and require a non-linear transformation of the component colors based on the spectral properties of the incident illumination, imaging sensor, and the actual textile color. Our earlier work reports these results for a broad-band, smoothly varying D65 standard illuminant. To move the measurement to the on-line environment with continuously manufactured textile webs, the illumination source becomes problematic. The spectral content of these light sources varies substantially from the D65 standard illuminant and can greatly impact the measurement performance of the POCS system. Although absolute color measurements are difficult to make under different illumination, referential measurements to monitor color drift provide a useful indication of product quality. Modifications to the ITC system have been implemented to enable the study of different light sources. These results and the subsequent analysis of relative color measurements will be reported for textile products.

  10. Multiple sources of boron in urban surface waters and groundwaters

    Energy Technology Data Exchange (ETDEWEB)

    Hasenmueller, Elizabeth A., E-mail: eahasenm@wustl.edu; Criss, Robert E.

    2013-03-01

    Previous studies attribute abnormal boron (B) levels in streams and groundwaters to wastewater and fertilizer inputs. This study shows that municipal drinking water used for lawn irrigation contributes substantial non-point loads of B and other chemicals (S-species, Li, and Cu) to surface waters and shallow groundwaters in the St. Louis, Missouri, area. Background levels and potential B sources were characterized by analysis of lawn and street runoff, streams, rivers, springs, local rainfall, wastewater influent and effluent, and fertilizers. Urban surface waters and groundwaters are highly enriched in B (to 250 μg/L) compared to background levels found in rain and pristine, carbonate-hosted streams and springs (< 25 μg/L), but have similar concentrations (150 to 259 μg/L) compared to municipal drinking waters derived from the Missouri River. Other data including B/SO₄²⁻–S and B/Li ratios confirm major contributions from this source. Moreover, sequential samples of runoff collected during storms show that B concentrations decrease with increased discharge, proving that elevated B levels are not primarily derived from combined sewer overflows (CSOs) during flooding. Instead, non-point source B exhibits complex behavior depending on land use. In urban settings B is rapidly mobilized from lawns during “first flush” events, likely representing surficial salt residues from drinking water used to irrigate lawns, and is also associated with the baseflow fraction, likely derived from the shallow groundwater reservoir that over time accumulates B from drinking water that percolates into the subsurface. The opposite occurs in small rural watersheds, where B is leached from soils by recent rainfall and covaries with the event water fraction. Highlights: ► Boron sources and loads differ between urban and rural watersheds. ► Wastewaters are not the major boron source in small St. Louis, MO watersheds. ► Municipal drinking water used for lawn

  11. A Study of the Unified Theory of Acceptance and Use of Technology in the Use of Open Source Database Management System Software

    Directory of Open Access Journals (Sweden)

    Michael Sonny

    2016-06-01

    Full Text Available The development of computer software today is extremely rapid, and it is not limited to software under proprietary licences; open source software is developing just as quickly. This development is very encouraging for computer users, particularly in education and among students, because users have several options for which applications to use. Open source software also offers products that are generally free, supplied with their source code, and open to modification and further development. Research on open source applications covers a wide range, including programming tools (PHP, Gambas), Database Management Systems (MySQL, SQLite), and browsers (Mozilla Firefox, Opera). This study examines the acceptance of DBMS applications such as MySQL and SQLite using a model developed by Venkatesh (2003), the UTAUT (Unified Theory of Acceptance and Use of Technology). Certain factors, known as moderating factors, also influence the learning of these open source applications and can affect its effectiveness and efficiency. The results obtained are intended to support smoother learning of these open source applications.   Keywords— open source, Database Management System (DBMS), moderating

  12. Some problems of neutron source multiplication method for site measurement technology in nuclear critical safety

    International Nuclear Information System (INIS)

    Shi Yongqian; Zhu Qingfu; Hu Dingsheng; He Tao; Yao Shigui; Lin Shenghuo

    2004-01-01

    The paper gives the experimental theory and method of the neutron source multiplication method for site measurement technology in nuclear critical safety. The parameter actually measured by the source multiplication method is the sub-critical, with-source neutron effective multiplication factor k_s, not the neutron effective multiplication factor k_eff. The experimental research has been done on the uranium solution nuclear critical safety experiment assembly. The k_s of different sub-criticalities is measured by the neutron source multiplication experiment method. For k_eff of different sub-criticalities, the reactivity coefficient per unit solution level is first measured by the period method and then multiplied by the difference between the critical and sub-critical solution levels to obtain the reactivity at the sub-critical solution level; k_eff can finally be extracted from the reactivity formula. The effect on nuclear critical safety and the difference between k_eff and k_s are discussed
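
    The relations behind this measurement can be written in standard point-kinetics form (textbook relations, not equations taken from the paper): the detector sees the sub-critical multiplication of the source, and the reactivity formula links the solution-level measurement to k_eff.

```latex
% Sub-critical source multiplication: C is the detector count rate with
% the multiplying assembly present, C_0 the count rate from the bare source.
M \;=\; \frac{C}{C_0} \;=\; \frac{1}{1 - k_s}
% Reactivity, from which k_eff is finally extracted:
\rho \;=\; \frac{k_\mathrm{eff} - 1}{k_\mathrm{eff}}
```

    Because 1/C is proportional to (1 − k_s), the reciprocal count rate falls linearly toward zero as the assembly approaches critical, which is why subcriticality can be inferred from count rates alone.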

  13. Multiple Sources of Prescription Payment and Risky Opioid Therapy Among Veterans.

    Science.gov (United States)

    Becker, William C; Fenton, Brenda T; Brandt, Cynthia A; Doyle, Erin L; Francis, Joseph; Goulet, Joseph L; Moore, Brent A; Torrise, Virginia; Kerns, Robert D; Kreiner, Peter W

    2017-07-01

    Opioid overdose and other related harms are a major source of morbidity and mortality among US Veterans, in part due to high-risk opioid prescribing. We sought to determine whether having multiple sources of payment for opioids-as a marker for out-of-system access-is associated with risky opioid therapy among veterans. Cross-sectional study examining the association between multiple sources of payment and risky opioid therapy among all individuals with Veterans Health Administration (VHA) payment for opioid analgesic prescriptions in Kentucky during fiscal year 2014-2015. Sources of payment fell into three categories: (1) VHA was the only source of payment (sole source); (2) sources of payment were VHA and at least 1 cash payment [VHA+cash payment(s)], whether or not there was a third source of payment; and (3) sources included at least one other noncash source: Medicare, Medicaid, or private insurance [VHA+noncash source(s)]. Our outcomes were 2 risky opioid therapies: combination opioid/benzodiazepine therapy and high-dose opioid therapy, defined as a morphine equivalent daily dose ≥90 mg. Of the 14,795 individuals in the analytic sample, 81.9% were in the sole source category, 6.6% in the VHA+cash payment(s) category, and 11.5% in the VHA+noncash source(s) category. In logistic regression, controlling for age and sex, persons with multiple payment sources had significantly higher odds of each risky opioid therapy, with those in the VHA+cash payment(s) group having significantly higher odds than those in the VHA+noncash source(s) group. Prescribers should examine the prescription monitoring program, as multiple payment sources increase the odds of risky opioid therapy.

  14. Multiple Household Water Sources and Their Use in Remote Communities With Evidence From Pacific Island Countries

    Science.gov (United States)

    Elliott, Mark; MacDonald, Morgan C.; Chan, Terence; Kearton, Annika; Shields, Katherine F.; Bartram, Jamie K.; Hadwen, Wade L.

    2017-11-01

    Global water research and monitoring typically focus on the household's "main source of drinking-water." Use of multiple water sources to meet daily household needs has been noted in many developing countries but rarely quantified or reported in detail. We gathered self-reported data using a cross-sectional survey of 405 households in eight communities of the Republic of the Marshall Islands (RMI) and five Solomon Islands (SI) communities. Over 90% of households used multiple sources, with differences in sources and uses between wet and dry seasons. Most RMI households had large rainwater tanks and rationed stored rainwater for drinking throughout the dry season, whereas most SI households collected rainwater in small pots, precluding storage across seasons. Use of a source for cooking was strongly positively correlated with use for drinking, whereas use for cooking was negatively correlated or uncorrelated with nonconsumptive uses (e.g., bathing). Dry season water uses implied greater risk of water-borne disease, with fewer (frequently zero) handwashing sources reported and more unimproved sources consumed. Use of multiple sources is fundamental to household water management and feasible to monitor using electronic survey tools. We contend that recognizing multiple water sources can greatly improve understanding of household-level and community-level climate change resilience, that use of multiple sources confounds health impact studies of water interventions, and that incorporating multiple sources into water supply interventions can yield heretofore-unrealized benefits. We propose that failure to consider multiple sources undermines the design and effectiveness of global water monitoring, data interpretation, implementation, policy, and research.

  15. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    Science.gov (United States)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we have used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g. location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
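
    A minimal sketch of such a relational backbone, using SQLite in place of the SQL Server/ArcGIS stack the study describes; every table and column name below is hypothetical, chosen only to show the logical associations (location, date, instrument, processing level) the abstract mentions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE instrument (
    instrument_id INTEGER PRIMARY KEY,
    name          TEXT NOT NULL
);
CREATE TABLE flight (
    flight_id     INTEGER PRIMARY KEY,
    flight_date   TEXT NOT NULL,           -- ISO acquisition date
    instrument_id INTEGER NOT NULL REFERENCES instrument(instrument_id)
);
CREATE TABLE image (
    image_id         INTEGER PRIMARY KEY,
    flight_id        INTEGER NOT NULL REFERENCES flight(flight_id),
    processing_level TEXT NOT NULL,        -- e.g. 'DN' .. 'geocorrected reflectance'
    footprint_wkt    TEXT                  -- spatial extent as WKT, if available
);
""")
cur.execute("INSERT INTO instrument VALUES (1, 'hyperspectral sensor')")
cur.execute("INSERT INTO flight VALUES (10, '2016-07-15', 1)")
cur.execute("INSERT INTO image VALUES (100, 10, 'geocorrected reflectance', NULL)")
conn.commit()

# An image is linked to its acquisition date and instrument through joins.
row = cur.execute("""
    SELECT i.image_id, f.flight_date, s.name, i.processing_level
    FROM image i
    JOIN flight f ON f.flight_id = i.flight_id
    JOIN instrument s ON s.instrument_id = f.instrument_id
""").fetchone()
```

    The point of the design is that temporal and instrument metadata live once, in their own tables, and every processing level of an image chain traces back to the same flight record.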

  16. A robust poverty profile for Brazil using multiple data sources

    Directory of Open Access Journals (Sweden)

    Ferreira Francisco H. G.

    2003-01-01

    Full Text Available This paper presents a poverty profile for Brazil, based on three different sources of household data for 1996. We use PPV consumption data to estimate poverty and indigence lines. 'Contagem' data is used to allow for an unprecedented refinement of the country's poverty map. Poverty measures and shares are also presented for a wide range of population subgroups, based on the PNAD 1996, with new adjustments for imputed rents and spatial differences in cost of living. Robustness of the profile is verified with respect to different poverty lines, spatial price deflators, and equivalence scales. Overall poverty incidence ranges from 23% with respect to an indigence line to 45% with respect to a more generous poverty line. More importantly, however, poverty is found to vary significantly across regions and city sizes, with rural areas, small and medium towns and the metropolitan peripheries of the North and Northeast regions being poorest.

  17. Exploiting semantic linkages among multiple sources for semantic information retrieval

    Science.gov (United States)

    Li, JianQiang; Yang, Ji-Jiang; Liu, Chunchen; Zhao, Yu; Liu, Bo; Shi, Yuliang

    2014-07-01

    The vision of the Semantic Web is to build a global Web of machine-readable data to be consumed by intelligent applications. As the first step to make this vision come true, the initiative of linked open data has fostered many novel applications aimed at improving data accessibility in the public Web. By comparison, the enterprise environment is so different from the public Web that most potentially usable business information originates in an unstructured form (typically in free text), which poses a challenge for the adoption of semantic technologies in the enterprise environment. Considering that the business information in a company is highly specific and centred around a set of commonly used concepts, this paper describes a pilot study to migrate the concept of linked data into the development of a domain-specific application, i.e. the vehicle repair support system. The set of commonly used concepts, including the part names of a car and the symptom terms used in car repair, is employed to build the linkage between data and documents distributed among different sources, leading to the fusion of documents and data across source boundaries. Then, we describe the approaches of semantic information retrieval to consume these linkages for value creation for companies. The experiments on two real-world data sets show that the proposed approaches outperform the best baseline by 6.3-10.8% and 6.4-11.1% in terms of top-5 and top-10 precision, respectively. We believe that our pilot study can serve as an important reference for the development of similar semantic applications in an enterprise environment.

  18. Search Strategy of Detector Position For Neutron Source Multiplication Method by Using Detected-Neutron Multiplication Factor

    International Nuclear Information System (INIS)

    Endo, Tomohiro

    2011-01-01

    In this paper, an alternative definition of a neutron multiplication factor, the detected-neutron multiplication factor k_det, is introduced for the neutron source multiplication method (NSM). By using k_det, a search strategy for an appropriate detector position for NSM is also proposed. The NSM is one of the practical subcritical measurement techniques, i.e., the NSM does not require any special equipment other than a stationary external neutron source and an ordinary neutron detector. Additionally, the NSM method is based on steady-state analysis, so this technique is very suitable for quasi real-time measurement. It is noted that the correction factors play important roles in accurately estimating subcriticality from the measured neutron count rates. The present paper aims to clarify how to correct the subcriticality measured by the NSM method, the physical meaning of the correction factors, and how to reduce the impact of the correction factors by setting a neutron detector at an appropriate position

  19. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML database--Xindice.

    Science.gov (United States)

    Li, Feng; Li, Maoyu; Xiao, Zhiqiang; Zhang, Pengfei; Li, Jianling; Chen, Zhuchu

    2006-01-11

    Many proteomics initiatives require integration of all information with uniform criteria, from collection of samples and data display to publication of experimental results. The integration and exchange of these data of different formats and structures poses a great challenge. The XML technology shows promise in handling this task due to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, and has marked geographic and racial differences in incidence. Although there are some cancer proteome databases now, there is still no NPC proteome database. The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP and Xindice to provide access to the database via the Internet. On our website, two methods, keyword query and click query, were provided at the same time to access the entries of the NPC proteome database. Our 2D/MS repository can be used to share the raw NPC proteomics data that are generated from gel-based proteomics experiments. The database, as well as the PHP source codes for constructing users' own proteome repository, can be accessed at http://www.xyproteomics.org/.

  20. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML Database – Xindice

    Directory of Open Access Journals (Sweden)

    Li Jianling

    2006-01-01

    Full Text Available Abstract Background Many proteomics initiatives require integration of all information with uniform criteria, from collection of samples and data display to publication of experimental results. The integration and exchange of these data of different formats and structures poses a great challenge. The XML technology shows promise in handling this task due to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, and has marked geographic and racial differences in incidence. Although there are some cancer proteome databases now, there is still no NPC proteome database. Results The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP and Xindice to provide access to the database via the Internet. On our website, two methods, keyword query and click query, were provided at the same time to access the entries of the NPC proteome database. Conclusion Our 2D/MS repository can be used to share the raw NPC proteomics data that are generated from gel-based proteomics experiments. The database, as well as the PHP source codes for constructing users' own proteome repository, can be accessed at http://www.xyproteomics.org/.
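
    The "keyword query" access method described above can be illustrated against a toy XML document. The element names below are invented for illustration and are not the actual HUP-ML schema; a native XML database such as Xindice would evaluate a comparable XPath expression on the server side rather than in application code.

```python
import xml.etree.ElementTree as ET

# Hypothetical HUP-ML-like fragment; element names are illustrative only.
doc = """
<experiment>
  <spot id="s1"><protein>keratin</protein><mass>52000</mass></spot>
  <spot id="s2"><protein>annexin A1</protein><mass>38700</mass></spot>
</experiment>
"""

root = ET.fromstring(doc)

def keyword_query(root, keyword):
    """Return the ids of spots whose protein name contains the keyword."""
    return [spot.get("id")
            for spot in root.findall("spot")
            if keyword.lower() in spot.findtext("protein", "").lower()]

hits = keyword_query(root, "annexin")
```

    A "click query" would follow the same retrieval path, with the keyword supplied by a link on a gel image instead of a text field.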

  1. Joint part-of-speech and dependency projection from multiple sources

    DEFF Research Database (Denmark)

    Johannsen, Anders Trærup; Agic, Zeljko; Søgaard, Anders

    2016-01-01

    for multiple tasks from multiple source languages, relying on parallel corpora available for hundreds of languages. When training POS taggers and dependency parsers on jointly projected POS tags and syntactic dependencies using our algorithm, we obtain better performance than a standard approach on 20...

  2. The Usefulness of Multilevel Hash Tables with Multiple Hash Functions in Large Databases

    Directory of Open Access Journals (Sweden)

    A.T. Akinwale

    2009-05-01

    Full Text Available In this work, an attempt is made to select three good hash functions which uniformly distribute hash values, permute their internal states, and allow the input bits to generate different output bits. These functions are used at different levels of hash tables coded in the Java Programming Language, and a sizeable number of data records serve as primary data for testing the performance. The results show that two-level hash tables with three different hash functions give superior performance over a one-level hash table with two hash functions or a zero-level hash table with one function, in terms of reducing key conflicts and providing quick lookup for a particular element. The result helps to reduce the complexity of the join operation in a query language from O(n²) to O(1) by placing larger query results, if any, in multilevel hash tables with multiple hash functions, generating shorter query results.
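
    The claimed drop in join cost from O(n²) to O(1) per probe comes from replacing a nested-loop comparison with a hash-table lookup. A minimal sketch of that idea (in Python rather than the Java used in the study; all names are illustrative):

```python
def hash_join(left, right, key_left, key_right):
    """Join two lists of dicts via a hash table: O(n) expected total,
    versus O(n^2) for comparing every pair of rows."""
    table = {}
    for row in left:                          # build phase
        table.setdefault(row[key_left], []).append(row)
    joined = []
    for row in right:                         # probe phase: O(1) expected lookup
        for match in table.get(row[key_right], []):
            joined.append({**match, **row})
    return joined

departments = [{"dept_id": 1, "dept": "sales"}, {"dept_id": 2, "dept": "hr"}]
employees   = [{"name": "Ada", "dept_id": 1}, {"name": "Bob", "dept_id": 2},
               {"name": "Cyd", "dept_id": 1}]
rows = hash_join(departments, employees, "dept_id", "dept_id")
```

    The multilevel scheme in the study layers several such tables with distinct hash functions, so that keys colliding at one level are separated at the next.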

  3. Genetic diversity and antimicrobial resistance of Escherichia coli from human and animal sources uncovers multiple resistances from human sources.

    Directory of Open Access Journals (Sweden)

    A Mark Ibekwe

    Full Text Available Escherichia coli are widely used as indicators of fecal contamination, and in some cases to identify host sources of fecal contamination in surface water. Prevalence, genetic diversity and antimicrobial susceptibility were determined for 600 generic E. coli isolates obtained from surface water and sediment from creeks and channels along the middle Santa Ana River (MSAR) watershed of southern California, USA, during a 12-month study. Evaluation of E. coli populations along the creeks and channels showed that E. coli were more prevalent in sediment compared to surface water. E. coli populations were not significantly different (P = 0.05) between urban runoff sources and agricultural sources; however, E. coli genotypes determined by pulsed-field gel electrophoresis (PFGE) were less diverse in the agricultural sources than in urban runoff sources. PFGE also showed that E. coli populations in surface water were more diverse than in the sediment, suggesting isolates in sediment may be dominated by clonal populations. Twenty-four percent (144 of the 600 isolates) exhibited resistance to more than one antimicrobial agent. Most multiple resistances were associated with inputs from urban runoff and involved the antimicrobials rifampicin, tetracycline, and erythromycin. The occurrence of a greater number of E. coli with multiple antibiotic resistances from urban runoff sources than agricultural sources in this watershed provides useful evidence in planning strategies for water quality management and public health protection.

  4. Data integration and knowledge discovery in biomedical databases. Reliable information from unreliable sources

    Directory of Open Access Journals (Sweden)

    A Mitnitski

    2003-01-01

    Full Text Available To better understand information about human health from databases, we analyzed three datasets collected for different purposes in Canada: a biomedical database of older adults, a large population survey across all adult ages, and vital statistics. Redundancy in the variables was established, and this led us to derive a generalized (macroscopic) state variable, a fitness/frailty index that reflects both individual and group health status. Evaluation of the relationship between fitness/frailty and the mortality rate revealed that the latter could be expressed in terms of variables generally available from any cross-sectional database. In practical terms, this means that the risk of mortality might readily be assessed from standard biomedical appraisals collected for other purposes.

  5. The effect of energy distribution of external source on source multiplication in fast assemblies

    International Nuclear Information System (INIS)

    Karam, R.A.; Vakilian, M.

    1976-02-01

    The essence of this study is the effect of the energy distribution of a source on the detection rate as a function of k-effective in fast assemblies. This effect, as a function of k, was studied with a fission chamber, using the ABN cross-section set and the Mach 1 code. It was found that with a source which has a fission spectrum, the reciprocal count rate versus mass relationship is linear down to a k-effective of 0.59. For a thermal source, linearity was never achieved. (author)

  6. The risk of fracture in patients with multiple sclerosis: The UK general practice research database

    DEFF Research Database (Denmark)

    Bazelier, Marloes T; van Staa, Tjeerd; Uitdehaag, Bernard Mj

    2011-01-01

    Patients with multiple sclerosis (MS) may be at an increased risk of fracture owing to a greater risk of falling and decreased bone mineral density when compared with the general population. This study was designed to estimate the relative and absolute risk of fracture in patients with MS. We...... were used to derive adjusted hazard ratios (HRs) for fracture associated with MS. Time-dependent adjustments were made for age, comorbidity, and drug use. Absolute 5- and 10-year risks of fracture were estimated for MS patients as a function of age. Compared with controls, MS patients had an almost...... threefold increased risk of hip fracture [HR = 2.79,95% confidence interval (CI) 1.83-4.26] and a risk of osteoporotic fracture that was increased 1.4-fold (HR = 1.35,95% CI 1.13-1.62). Risk was greater in patients who had been prescribed oral/intravenous glucocorticoids (GCs; HR = 1.85, 95% CI 1...

  7. Investigating sources and pathways of perfluoroalkyl acids (PFAAs) in aquifers in Tokyo using multiple tracers

    International Nuclear Information System (INIS)

    Kuroda, Keisuke; Murakami, Michio; Oguma, Kumiko; Takada, Hideshige; Takizawa, Satoshi

    2014-01-01

    We employed a multi-tracer approach to investigate sources and pathways of perfluoroalkyl acids (PFAAs) in urban groundwater, based on 53 groundwater samples taken from confined aquifers and unconfined aquifers in Tokyo. While the median concentrations of groundwater PFAAs were several ng/L, the maximum concentrations of perfluorooctane sulfonate (PFOS, 990 ng/L), perfluorooctanoate (PFOA, 1800 ng/L) and perfluorononanoate (PFNA, 620 ng/L) in groundwater were several times higher than those of wastewater and street runoff reported in the literature. PFAAs were more frequently detected than sewage tracers (carbamazepine and crotamiton), presumably owing to the higher persistence of PFAAs, the multiple sources of PFAAs beyond sewage (e.g., surface runoff, point sources) and the formation of PFAAs from their precursors. Use of multiple methods of source apportionment including principal component analysis–multiple linear regression (PCA–MLR) and perfluoroalkyl carboxylic acid ratio analysis highlighted sewage and point sources as the primary sources of PFAAs in the most severely polluted groundwater samples, with street runoff being a minor source (44.6% sewage, 45.7% point sources and 9.7% street runoff, by PCA–MLR). Tritium analysis indicated that, while young groundwater (recharged during or after the 1970s, when PFAAs were already in commercial use) in shallow aquifers (< 50 m depth) was naturally highly vulnerable to PFAA pollution, PFAAs were also found in old groundwater (recharged before the 1950s, when PFAAs were not in use) in deep aquifers (50–500 m depth). This study demonstrated the utility of multiple uses of tracers (pharmaceuticals and personal care products; PPCPs, tritium) and source apportionment methods in investigating sources and pathways of PFAAs in multiple aquifer systems. - Highlights: • Aquifers in Tokyo had high levels of perfluoroalkyl acids (up to 1800 ng/L). • PFAAs were more frequently detected than sewage

  8. Investigating sources and pathways of perfluoroalkyl acids (PFAAs) in aquifers in Tokyo using multiple tracers

    Energy Technology Data Exchange (ETDEWEB)

    Kuroda, Keisuke, E-mail: keisukekr@gmail.com [Department of Urban Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-8656 (Japan); Murakami, Michio [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro, Tokyo 153-8505 (Japan); Oguma, Kumiko [Department of Urban Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-8656 (Japan); Takada, Hideshige [Laboratory of Organic Geochemistry (LOG), Institute of Symbiotic Science and Technology, Tokyo University of Agriculture and Technology, Fuchu, Tokyo 183-8509 (Japan); Takizawa, Satoshi [Department of Urban Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-8656 (Japan)

    2014-08-01

    We employed a multi-tracer approach to investigate sources and pathways of perfluoroalkyl acids (PFAAs) in urban groundwater, based on 53 groundwater samples taken from confined aquifers and unconfined aquifers in Tokyo. While the median concentrations of groundwater PFAAs were several ng/L, the maximum concentrations of perfluorooctane sulfonate (PFOS, 990 ng/L), perfluorooctanoate (PFOA, 1800 ng/L) and perfluorononanoate (PFNA, 620 ng/L) in groundwater were several times higher than those of wastewater and street runoff reported in the literature. PFAAs were more frequently detected than sewage tracers (carbamazepine and crotamiton), presumably owing to the higher persistence of PFAAs, the multiple sources of PFAAs beyond sewage (e.g., surface runoff, point sources) and the formation of PFAAs from their precursors. Use of multiple methods of source apportionment including principal component analysis–multiple linear regression (PCA–MLR) and perfluoroalkyl carboxylic acid ratio analysis highlighted sewage and point sources as the primary sources of PFAAs in the most severely polluted groundwater samples, with street runoff being a minor source (44.6% sewage, 45.7% point sources and 9.7% street runoff, by PCA–MLR). Tritium analysis indicated that, while young groundwater (recharged during or after the 1970s, when PFAAs were already in commercial use) in shallow aquifers (< 50 m depth) was naturally highly vulnerable to PFAA pollution, PFAAs were also found in old groundwater (recharged before the 1950s, when PFAAs were not in use) in deep aquifers (50–500 m depth). This study demonstrated the utility of multiple uses of tracers (pharmaceuticals and personal care products; PPCPs, tritium) and source apportionment methods in investigating sources and pathways of PFAAs in multiple aquifer systems. - Highlights: • Aquifers in Tokyo had high levels of perfluoroalkyl acids (up to 1800 ng/L). • PFAAs were more frequently detected than sewage
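
    The PCA–MLR apportionment used above can be sketched on synthetic data: principal components are extracted from the measured species matrix, and the total concentration is regressed on the retained scores. The data, component count, and variable names below are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 100 samples, 5 measured species, driven by 2 hidden sources.
sources = rng.uniform(0, 1, size=(100, 2))
profiles = np.array([[5.0, 1.0, 0.2, 3.0, 0.5],
                     [0.3, 4.0, 2.5, 0.1, 1.0]])
X = sources @ profiles + 0.01 * rng.normal(size=(100, 5))
y = X.sum(axis=1)                       # "total concentration" to apportion

# Step 1: PCA on the standardized species matrix (via SVD).
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                  # retain two components

# Step 2: multiple linear regression of y on the retained scores.
design = np.column_stack([np.ones(len(y)), scores])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
y_hat = design @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

    In a real apportionment the retained factors are interpreted against known source profiles (sewage, point sources, street runoff) and the regression coefficients are converted into percent contributions per source.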

  9. PSD Applicability Determination for Multiple Owner/Operator Point Sources Within a Single Facility

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  10. Separation of Correlated Astrophysical Sources Using Multiple-Lag Data Covariance Matrices

    Directory of Open Access Journals (Sweden)

    Baccigalupi C

    2005-01-01

    Full Text Available This paper proposes a new strategy to separate astrophysical sources that are mutually correlated. This strategy is based on second-order statistics and exploits prior information about the possible structure of the mixing matrix. Unlike ICA blind separation approaches, where the sources are assumed mutually independent and no prior knowledge is assumed about the mixing matrix, our strategy allows the independence assumption to be relaxed and performs the separation of even significantly correlated sources. Besides the mixing matrix, our strategy is also capable of evaluating the source covariance functions at several lags. Moreover, once the mixing parameters have been identified, a simple deconvolution can be used to estimate the probability density functions of the source processes. To benchmark our algorithm, we used a database that simulates the one expected from the instruments that will operate onboard ESA's Planck Surveyor Satellite to measure the CMB anisotropies all over the celestial sphere.

  11. Observational constraints on the physical nature of submillimetre source multiplicity: chance projections are common

    Science.gov (United States)

    Hayward, Christopher C.; Chapman, Scott C.; Steidel, Charles C.; Golob, Anneya; Casey, Caitlin M.; Smith, Daniel J. B.; Zitrin, Adi; Blain, Andrew W.; Bremer, Malcolm N.; Chen, Chian-Chou; Coppin, Kristen E. K.; Farrah, Duncan; Ibar, Eduardo; Michałowski, Michał J.; Sawicki, Marcin; Scott, Douglas; van der Werf, Paul; Fazio, Giovanni G.; Geach, James E.; Gurwell, Mark; Petitpas, Glen; Wilner, David J.

    2018-05-01

    Interferometric observations have demonstrated that a significant fraction of single-dish submillimetre (submm) sources are blends of multiple submm galaxies (SMGs), but the nature of this multiplicity, i.e. whether the galaxies are physically associated or chance projections, has not been determined. We performed spectroscopy of 11 SMGs in six multicomponent submm sources, obtaining spectroscopic redshifts for nine of them. For two additional component SMGs, we detected continuum emission but no obvious features. We supplement our observed sources with four single-dish submm sources from the literature. This sample allows us to statistically constrain the physical nature of single-dish submm source multiplicity for the first time. In three (3/7, or 43 per cent; +39/-33 per cent at 95 per cent confidence) of the single-dish sources for which the nature of the blending is unambiguous, the components for which spectroscopic redshifts are available are physically associated, whereas 4/7 (57 per cent; +33/-39 per cent) have at least one unassociated component. When components whose spectra exhibit continuum but no features and for which the photometric redshift is significantly different from the spectroscopic redshift of the other component are also considered, 6/9 (67 per cent; +26/-37 per cent) of the single-dish sources are comprised of at least one unassociated component SMG. The nature of the multiplicity of one single-dish source is ambiguous. We conclude that physically associated systems and chance projections both contribute to the multicomponent single-dish submm source population. This result contradicts the conventional wisdom that bright submm sources are solely a result of merger-induced starbursts, as blending of unassociated galaxies is also important.
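
    The asymmetric confidence ranges quoted above (e.g. 3/7 = 43 per cent with roughly +39/-33 per cent at 95 per cent confidence) are consistent with an exact binomial (Clopper-Pearson) interval for a small sample. A minimal stdlib-only sketch, using bisection on the binomial CDF:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact binomial confidence interval via bisection."""
    def solve(below_boundary):
        lo, hi = 0.0, 1.0
        for _ in range(60):  # bisection to well below float precision
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if below_boundary(mid) else (lo, mid)
        return (lo + hi) / 2
    lower = 0.0 if k == 0 else solve(lambda p: 1 - binom_cdf(k - 1, n, p) < alpha / 2)
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) >= alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(3, 7)
print(round(lo, 3), round(hi, 3))  # about 0.099 and 0.816
```

    For 3 successes out of 7 this gives an interval of roughly 0.10 to 0.82, matching the quoted 43 per cent with +39/-33 per cent.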

  12. Source attribution using FLEXPART and carbon monoxide emission inventories for the IAGOS In-situ Observation database

    Science.gov (United States)

    Fontaine, Alain; Sauvage, Bastien; Pétetin, Hervé; Auby, Antoine; Boulanger, Damien; Thouret, Valerie

    2016-04-01

    Since 1994, the IAGOS program (In-Service Aircraft for a Global Observing System http://www.iagos.org) and its predecessor MOZAIC have produced in-situ measurements of the atmospheric composition during more than 46000 commercial aircraft flights. To help analyze these observations and further understand the processes driving their evolution, we developed SOFT-IO, a modelling tool that quantifies their source/receptor link. We improved the methodology used by Stohl et al. (2003), based on the FLEXPART plume dispersion model, to simulate the contributions of anthropogenic and biomass burning emissions from the ECCAD database (http://eccad.aeris-data.fr) to the measured carbon monoxide mixing ratio along each IAGOS flight. Thanks to automated processes, contributions are simulated for the last 20 days before observation, separating individual contributions from the different source regions. The main goal is to supply added-value products to the IAGOS database showing the geographical origin and emission type of pollutants. Using this information, it may be possible to link trends in the atmospheric composition to changes in the transport pathways and to the evolution of emissions. This tool could be used for statistical validation as well as for inter-comparisons of emission inventories using large amounts of data, as Lagrangian models are able to bring the global scale emissions down to a smaller scale, where they can be directly compared to the in-situ observations from the IAGOS database.

  13. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    Science.gov (United States)

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lottig, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  14. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    Science.gov (United States)

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  15. Mortality and comorbidities in patients with multiple sclerosis compared with a population without multiple sclerosis: An observational study using the US Department of Defense administrative claims database.

    Science.gov (United States)

    Capkun, Gorana; Dahlke, Frank; Lahoz, Raquel; Nordstrom, Beth; Tilson, Hugh H; Cutter, Gary; Bischof, Dorina; Moore, Alan; Simeone, Jason; Fraeman, Kathy; Bancken, Fabrice; Geissbühler, Yvonne; Wagner, Michael; Cohan, Stanley

    2015-11-01

    Data are limited for mortality and comorbidities in patients with multiple sclerosis (MS). We compared mortality rates and event rates for comorbidities in MS (n=15,684) and non-MS (n=78,420) cohorts from the US Department of Defense (DoD) database. Comorbidities and all-cause mortality were assessed using the database. Causes of death (CoDs) were assessed through linkage with the National Death Index. Cohorts were compared using mortality (MRR) and event (ERR) rate ratios. All-cause mortality was 2.9-fold higher in the MS versus non-MS cohort (MRR, 95% confidence interval [CI]: 2.9, 2.7-3.2). Frequent CoDs in the MS versus non-MS cohort were infectious diseases (6.2, 4.2-9.4), diseases of the nervous (5.8, 3.7-9.0), respiratory (5.0, 3.9-6.4) and circulatory (2.1, 1.7-2.7) systems, and suicide (2.6, 1.3-5.2). Event rates for comorbidities, including sepsis (ERR, 95% CI: 5.7, 5.1-6.3), ischemic stroke (3.8, 3.5-4.2), attempted suicide (2.4, 1.3-4.5) and ulcerative colitis (2.0, 1.7-2.3), were higher in the MS versus non-MS cohort. The rate of cancers was also higher in the MS versus the non-MS cohort, including lymphoproliferative disorders (2.2, 1.9-2.6) and melanoma (1.7, 1.4-2.0). Rates of mortality and several comorbidities are higher in the MS cohort than in the non-MS cohort. Early recognition and management of comorbidities may reduce premature mortality and improve quality of life in patients with MS. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
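
    A mortality rate ratio with a 95% CI, as quoted above, is commonly computed with the log-normal approximation for the ratio of two incidence rates. A minimal sketch; the death counts and person-years below are invented for illustration, not values from the DoD study:

```python
from math import exp, sqrt

def rate_ratio(deaths_a, py_a, deaths_b, py_b, z=1.96):
    """Rate ratio with a Wald CI on the log scale.

    SE of log(RR) for two Poisson rates is sqrt(1/d_a + 1/d_b).
    """
    rr = (deaths_a / py_a) / (deaths_b / py_b)
    se = sqrt(1 / deaths_a + 1 / deaths_b)
    return rr, rr * exp(-z * se), rr * exp(z * se)

# Hypothetical: 290 deaths over 50,000 person-years versus
# 500 deaths over 250,000 person-years.
mrr, lo, hi = rate_ratio(290, 50_000, 500, 250_000)
print(f"MRR {mrr:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # MRR 2.9 (...)
```

    The same arithmetic applies to the event rate ratios (ERRs) for comorbidities.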

  16. Linked Patient-Reported Outcomes Data From Patients With Multiple Sclerosis Recruited on an Open Internet Platform to Health Care Claims Databases Identifies a Representative Population for Real-Life Data Analysis in Multiple Sclerosis.

    Science.gov (United States)

    Risson, Valery; Ghodge, Bhaskar; Bonzani, Ian C; Korn, Jonathan R; Medin, Jennie; Saraykar, Tanmay; Sengupta, Souvik; Saini, Deepanshu; Olson, Melvin

    2016-09-22

    An enormous amount of information relevant to public health is being generated directly by online communities. To explore the feasibility of creating a dataset that links patient-reported outcomes data, from a Web-based survey of US patients with multiple sclerosis (MS) recruited on open Internet platforms, to health care utilization information from health care claims databases. The dataset was generated by linkage analysis to a broader MS population in the United States using both pharmacy and medical claims data sources. US Facebook users with an interest in MS were alerted to a patient-reported survey by targeted advertisements. Eligibility criteria were diagnosis of MS by a specialist (primary progressive, relapsing-remitting, or secondary progressive), ≥12-month history of disease, age 18-65 years, and commercial health insurance. Participants completed a questionnaire including data on demographic and disease characteristics, current and earlier therapies, relapses, disability, health-related quality of life, and employment status and productivity. A unique anonymous profile was generated for each survey respondent. Each anonymous profile was linked to a number of medical and pharmacy claims datasets in the United States. Linkage rates were assessed and survey respondents' representativeness was evaluated based on differences in the distribution of characteristics between the linked survey population and the general MS population in the claims databases. The advertisement was placed on 1,063,973 Facebook users' pages generating 68,674 clicks, 3719 survey attempts, and 651 successfully completed surveys, of which 440 could be linked to any of the claims databases for 2014 or 2015 (67.6% linkage rate). Overall, no significant differences were found between patients who were linked and not linked for educational status, ethnicity, current or prior disease-modifying therapy (DMT) treatment, or presence of a relapse in the last 12 months. 
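
    The abstract does not specify how the "unique anonymous profile" linkage was implemented. The sketch below shows one generic deterministic approach: hash a salted combination of quasi-identifiers on both sides and join on the digest. The key fields (year of birth, sex, state) and the salt are hypothetical, chosen only to illustrate the mechanics and the linkage-rate arithmetic.

```python
import hashlib

def profile(year_of_birth, sex, state, salt="demo-salt"):
    """Anonymous profile: salted SHA-256 over hypothetical key fields."""
    raw = f"{year_of_birth}|{sex}|{state}|{salt}".encode()
    return hashlib.sha256(raw).hexdigest()

# Survey respondents and claims members, reduced to their profiles.
survey = [profile(1975, "F", "NY"), profile(1980, "M", "TX"), profile(1990, "F", "CA")]
claims = {profile(1975, "F", "NY"), profile(1990, "F", "CA")}

linked = [p for p in survey if p in claims]
linkage_rate = len(linked) / len(survey)
print(f"linked {len(linked)}/{len(survey)} ({linkage_rate:.1%})")  # 2/3 linked
```

    In the study, the analogous ratio was 440 linked out of 651 completed surveys, the reported 67.6% linkage rate.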
The frequencies of the

  17. An Approach to Measuring Semantic Relatedness of Geographic Terminologies Using a Thesaurus and Lexical Database Sources

    Directory of Open Access Journals (Sweden)

    Zugang Chen

    2018-03-01

    Full Text Available In geographic information science, semantic relatedness is important for Geographic Information Retrieval (GIR), Linked Geospatial Data, geoparsing, and geo-semantics. However, computing the semantic similarity/relatedness of geographic terminology is still an urgent issue to tackle. The thesaurus is a ubiquitous and sophisticated knowledge representation tool existing in various domains. In this article, we combined a generic lexical database (WordNet or HowNet) with the Thesaurus for Geographic Science and proposed a thesaurus–lexical relatedness measure (TLRM) to compute the semantic relatedness of geographic terminology. This measure quantifies the relationship between terminologies, interlinks the discrete term trees by using the generic lexical database, and enables the semantic relatedness computation of any two terminologies in the thesaurus. The TLRM was evaluated on a new relatedness baseline, namely, the Geo-Terminology Relatedness Dataset (GTRD), which we built, and obtained relatively high cognitive plausibility. Finally, we applied the TLRM to a geospatial data sharing portal to support data retrieval. The application results for the 30 most frequently used queries of the portal demonstrated that using the TLRM could improve the recall of geospatial data retrieval in most situations and rank the retrieval results by the matching scores between the user's query and the geospatial datasets.
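
    The TLRM formula itself is not given in this abstract. The sketch below shows only the underlying idea it describes: term trees from a thesaurus are interlinked by extra edges from a lexical database, and relatedness is then scored from graph distance. The terms, edges, and the 1/(1 + distance) scoring are all invented for illustration.

```python
from collections import deque

edges = {
    # Hypothetical thesaurus hierarchy (two separate term trees).
    ("landform", "mountain"), ("landform", "valley"),
    ("water body", "river"), ("water body", "lake"),
    # Hypothetical cross-link contributed by a generic lexical database,
    # bridging the otherwise disconnected trees.
    ("valley", "river"),
}

graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def relatedness(t1, t2):
    """1 / (1 + shortest-path length); 0.0 if the terms are disconnected."""
    seen, queue = {t1}, deque([(t1, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == t2:
            return 1 / (1 + dist)
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return 0.0

print(relatedness("mountain", "lake"))  # reachable only via the cross-link
```

    Without the lexical-database edge, "mountain" and "lake" would score 0.0; the cross-link is what makes any two terms in the thesaurus comparable.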

  18. Existing data sources for clinical epidemiology: Aarhus University Clinical Trial Candidate Database, Denmark.

    Science.gov (United States)

    Nørrelund, Helene; Mazin, Wiktor; Pedersen, Lars

    2014-01-01

    Denmark is facing a reduction in clinical trial activity as the pharmaceutical industry has moved trials to low-cost emerging economies. Competitiveness in industry-sponsored clinical research depends on speed, quality, and cost. Because Denmark is widely recognized as a region that generates high quality data, an enhanced ability to attract future trials could be achieved if speed can be improved by taking advantage of the comprehensive national and regional registries. A "single point-of-entry" system has been established to support collaboration between hospitals and industry. When assisting industry in early-stage feasibility assessments, potential trial participants are identified by use of registries to shorten the clinical trial startup times. The Aarhus University Clinical Trial Candidate Database consists of encrypted data from the Danish National Registry of Patients allowing an immediate estimation of the number of patients with a specific discharge diagnosis in each hospital department or outpatient specialist clinic in the Central Denmark Region. The free access to health care, thorough monitoring of patients who are in contact with the health service, completeness of registration at the hospital level, and ability to link all databases are competitive advantages in an increasingly complex clinical trial environment.

  19. Use of ultrasonic array method for positioning multiple partial discharge sources in transformer oil.

    Science.gov (United States)

    Xie, Qing; Tao, Junhan; Wang, Yongqiang; Geng, Jianghai; Cheng, Shuyi; Lü, Fangcheng

    2014-08-01

    Fast and accurate positioning of partial discharge (PD) sources in transformer oil is very important for the safe, stable operation of power systems because it allows timely elimination of insulation faults. There is usually more than one PD source once an insulation fault occurs in the transformer oil. This study, which has both theoretical and practical significance, proposes a method of identifying multiple PD sources in transformer oil. The method combines the two-sided correlation transformation algorithm for broadband signal focusing with the modified Gerschgorin disk estimator. Multiple signal classification is used to determine the directions of arrival of signals from multiple PD sources. The ultrasonic array positioning method is based on multi-platform direction finding and global optimization searching. Both a 4 × 4 square planar ultrasonic sensor array and an ultrasonic array detection platform were built to test the method of identifying and positioning multiple PD sources. The obtained results verify the validity and engineering practicability of this method.

  20. Multiple sclerosis: patients’ information sources and needs on disease symptoms and management

    Directory of Open Access Journals (Sweden)

    Albert I Matti

    2010-06-01

    Full Text Available Albert I Matti1, Helen McCarl2, Pamela Klaer2, Miriam C Keane1, Celia S Chen1; 1Department of Ophthalmology, Flinders Medical Centre and Flinders University, Bedford Park, SA, Australia; 2The Multiple Sclerosis Society of South Australia and Northern Territory, Klemzig, SA, Australia. Objective: To investigate the current information sources of patients with multiple sclerosis (MS) in the early stages of their disease and to identify patients' preferred sources of information. The relative amounts of information from the different sources were also compared. Methods: Participants at a newly diagnosed information session organized by the Multiple Sclerosis Society of South Australia were invited to complete a questionnaire. Participants were asked to rate on a visual analog scale how much information they had received about MS and optic neuritis from different information sources and how much information they would like to receive from each of the sources. Results: A close to ideal amount of information is being provided by the MS society and MS specialist nurses. There is a clear deficit between the information patients are currently receiving and the amount of information they actually want from various sources. Patients wish to receive significantly more information from treating general practitioners, eye specialists, neurologists, and education sessions. Patients identified less than adequate information received on optic neuritis from all sources. Conclusion: This study noted a clear information deficit regarding MS from all sources. This information deficit is more pronounced in relation to optic neuritis and needs to be addressed in the future. Practice implications: More patient information and counselling needs to be provided to MS patients even at early stages of their disease, especially in relation to management of disease relapse. Keywords: information sources, information needs, MS patients, optic neuritis

  1. Mobility and Sector-specific Effects of Changes in Multiple Sources ...

    African Journals Online (AJOL)

    Using the second and third Cameroon household consumption surveys, this study examined mobility and sector-specific effects of changes in multiple sources of deprivation in Cameroon. Results indicated that between 2001 and 2007, deprivations associated with human capital and labour capital reduced, while ...

  2. Transfer functions of double- and multiple-cavity Fabry-Perot filters driven by Lorentzian sources.

    Science.gov (United States)

    Marti, J; Capmany, J

    1996-12-20

    We derive expressions for the transfer functions of double- and multiple-cavity Fabry-Perot filters driven by laser sources with Lorentzian spectrum. These are of interest because of their applications in sensing and channel filtering in optical frequency-division multiplexing networks.

  3. Organizational Communication in Emergencies: Using Multiple Channels and Sources to Combat Noise and Capture Attention

    Science.gov (United States)

    Stephens, Keri K.; Barrett, Ashley K.; Mahometa, Michael J.

    2013-01-01

    This study relies on information theory, social presence, and source credibility to uncover what best helps people grasp the urgency of an emergency. We surveyed a random sample of 1,318 organizational members who received multiple notifications about a large-scale emergency. We found that people who received 3 redundant messages coming through at…

  4. Reading on the World Wide Web: Dealing with conflicting information from multiple sources

    NARCIS (Netherlands)

    Van Strien, Johan; Brand-Gruwel, Saskia; Boshuizen, Els

    2011-01-01

    Van Strien, J. L. H., Brand-Gruwel, S., & Boshuizen, H. P. A. (2011, August). Reading on the World Wide Web: Dealing with conflicting information from multiple sources. Poster session presented at the biannual conference of the European Association for Research on Learning and Instruction, Exeter,

  5. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array

    Directory of Open Access Journals (Sweden)

    Yankui Zhang

    2018-05-01

    Full Text Available Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have an insufficient degree of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and converge the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramer–Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.
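
    The degree-of-freedom advantage of a coprime array comes from its difference co-array: with coprime M and N, one common construction places M sensors at multiples of N and N sensors at multiples of M, and the pairwise position differences cover far more distinct lags than the sensor count suggests. A minimal sketch for the classic pair M = 3, N = 5:

```python
# Coprime array construction and its difference co-array.
M, N = 3, 5  # coprime pair
positions = sorted({N * m for m in range(M)} | {M * n for n in range(N)})
lags = sorted({p - q for p in positions for q in positions if p >= q})

print(positions)  # [0, 3, 5, 6, 9, 10, 12]
print(lags)       # [0, 1, 2, 3, 4, 5, 6, 7, 9, 10, 12]
```

    Seven physical sensors yield eleven distinct non-negative lags (with holes at 8 and 11), which is why covariance vectorization plus spatial smoothing, as described above, can resolve more sources than a uniform array of the same size.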

  6. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array.

    Science.gov (United States)

    Zhang, Yankui; Ba, Bin; Wang, Daming; Geng, Wei; Xu, Haiyun

    2018-05-08

    Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have an insufficient degree of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and converge the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramer–Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.

  7. A New Database Facilitates Characterization of Flavonoid Intake, Sources, and Positive Associations with Diet Quality among US Adults.

    Science.gov (United States)

    Sebastian, Rhonda S; Wilkinson Enns, Cecilia; Goldman, Joseph D; Martin, Carrie L; Steinfeldt, Lois C; Murayi, Theophile; Moshfegh, Alanna J

    2015-06-01

    Epidemiologic studies demonstrate inverse associations between flavonoid intake and chronic disease risk. However, lack of comprehensive databases of the flavonoid content of foods has hindered efforts to fully characterize population intakes and determine associations with diet quality. Using a newly released database of flavonoid values, this study sought to describe intake and sources of total flavonoids and 6 flavonoid classes and identify associations between flavonoid intake and the Healthy Eating Index (HEI) 2010. One day of 24-h dietary recall data from adults aged ≥ 20 y (n = 5420) collected in What We Eat in America (WWEIA), NHANES 2007-2008, were analyzed. Flavonoid intakes were calculated using the USDA Flavonoid Values for Survey Foods and Beverages 2007-2008. Regression analyses were conducted to provide adjusted estimates of flavonoid intake, and linear trends in total and component HEI scores by flavonoid intake were assessed using orthogonal polynomial contrasts. All analyses were weighted to be nationally representative. Mean intake of flavonoids was 251 mg/d, with flavan-3-ols accounting for 81% of intake. Non-Hispanic whites had significantly higher (P empty calories increased (P < 0.001) across flavonoid intake quartiles. A new database that permits comprehensive estimation of flavonoid intakes in WWEIA, NHANES 2007-2008; identification of their major food/beverage sources; and determination of associations with dietary quality will lead to advances in research on relations between flavonoid intake and health. Findings suggest that diet quality, as measured by HEI, is positively associated with flavonoid intake. © 2015 American Society for Nutrition.
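
    Nationally representative estimates such as the 251 mg/d mean intake come from survey-weighted statistics rather than simple averages. A minimal weighted-mean sketch; the intakes and sampling weights below are invented, not WWEIA/NHANES values:

```python
def weighted_mean(values, weights):
    """Survey-weighted mean: sum(w_i * x_i) / sum(w_i)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

intakes_mg = [120.0, 400.0, 90.0, 310.0]   # per-respondent flavonoid intake (hypothetical)
weights = [1500.0, 900.0, 2100.0, 700.0]   # survey sampling weights (hypothetical)

print(round(weighted_mean(intakes_mg, weights), 1))  # 181.9
```

    Note how the heavily weighted low-intake respondent pulls the estimate well below the unweighted mean of 230 mg/d; this is the effect the abstract's "weighted to be nationally representative" phrase refers to.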

  8. Simulation of neutron multiplicity measurements using Geant4. Open source software for nuclear arms control

    Energy Technology Data Exchange (ETDEWEB)

    Kuett, Moritz

    2016-07-07

    Nuclear arms control, including nuclear safeguards and verification technologies for nuclear disarmament, typically uses software in many different technological applications. This thesis proposes three open source criteria for such software, allowing users and developers to have free access to a program, have access to the full source code, and be able to publish modifications of the program. This proposition is presented and analyzed in detail, together with a description of the development of "Open Neutron Multiplicity Simulation", an open source software tool to simulate neutron multiplicity measurements. The description includes the physical background of the method, details of the developed program, and a comprehensive set of validation calculations.
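
    This is not Geant4: the sketch below is only a toy statistical stand-in showing what a neutron multiplicity measurement records, i.e. the distribution of how many neutrons are detected per event given a per-neutron detection efficiency. The emission distribution and the 40% efficiency are invented illustration values.

```python
import random
from collections import Counter

random.seed(42)

# Hypothetical emission probabilities: number of neutrons per event.
emission_pmf = {0: 0.03, 1: 0.16, 2: 0.32, 3: 0.30, 4: 0.15, 5: 0.04}
efficiency = 0.4  # assumed probability that an emitted neutron is detected

def detected_count():
    """Sample one event: emit n neutrons, detect each independently."""
    emitted = random.choices(list(emission_pmf),
                             weights=list(emission_pmf.values()))[0]
    return sum(random.random() < efficiency for _ in range(emitted))

histogram = Counter(detected_count() for _ in range(10_000))
print(sorted(histogram.items()))
```

    A full simulation like the one described in the thesis would replace this toy sampling with particle transport physics, but the measured quantity, a multiplicity histogram, has the same shape.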

  9. Multiple Speech Source Separation Using Inter-Channel Correlation and Relaxed Sparsity

    Directory of Open Access Journals (Sweden)

    Maoshen Jia

    2018-01-01

    Full Text Available In this work, a multiple speech source separation method using inter-channel correlation and relaxed sparsity is proposed. A B-format microphone with four spatially located channels is adopted because its compact array size preserves the spatial parameter integrity of the original signal. Specifically, we first measure the proportion of overlapped components among multiple sources and find that there exist many overlapped time-frequency (TF) components as the source number increases. Then, considering the relaxed sparsity of speech sources, we propose a dynamic threshold-based separation approach for sparse components, where the threshold is determined by the inter-channel correlation among the recorded signals. After conducting a statistical analysis of the number of active sources at each TF instant, a form of relaxed sparsity called the half-K assumption is proposed, under which the number of active sources in a given TF bin does not exceed half the total number of simultaneously occurring sources. By applying the half-K assumption, the non-sparse components are recovered by using the extracted sparse components as a guide, combined with vector decomposition and matrix factorization. Eventually, the final TF coefficients of each source are recovered by the synthesis of sparse and non-sparse components. The proposed method has been evaluated using up to six simultaneous speech sources under both anechoic and reverberant conditions. Both objective and subjective evaluations validated that the perceptual quality of the speech separated by the proposed approach outperforms that of existing blind source separation (BSS) approaches. Moreover, the method is robust across different speech signals, with all separated speeches showing similar perceptual quality.
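
    The half-K idea can be made concrete with a toy bin classifier: a TF bin is "sparse" when one source carries most of its energy, and under the half-K assumption at most K/2 sources are counted as active in any bin. The per-source energies, the 0.8 dominance threshold, and the 0.1 activity threshold below are invented; the paper's actual thresholds are derived from inter-channel correlation.

```python
def active_sources(bin_energies, rel_threshold=0.1):
    """Indices of sources whose energy exceeds a fraction of the bin total."""
    total = sum(bin_energies)
    return [i for i, e in enumerate(bin_energies) if e > rel_threshold * total]

def classify_bin(bin_energies, dominance=0.8):
    """A bin is 'sparse' when a single source dominates its energy."""
    total = sum(bin_energies)
    return "sparse" if max(bin_energies) > dominance * total else "non-sparse"

# K = 4 simultaneous sources; per-source energy in two example TF bins.
print(classify_bin([9.0, 0.3, 0.2, 0.1]))    # sparse
print(classify_bin([4.0, 3.5, 0.3, 0.2]))    # non-sparse
print(active_sources([4.0, 3.5, 0.3, 0.2]))  # [0, 1] -> within half-K (K/2 = 2)
```

    Sparse bins can be assigned directly; the overlapped (non-sparse) bins are what the vector-decomposition and matrix-factorization stage then resolves.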

  10. openBIS ELN-LIMS: an open-source database for academic laboratories.

    Science.gov (United States)

    Barillari, Caterina; Ottoz, Diana S M; Fuentes-Serna, Juan Mariano; Ramakrishnan, Chandrasekhar; Rinn, Bernd; Rudolf, Fabian

    2016-02-15

    The open-source platform openBIS (open Biology Information System) offers an Electronic Laboratory Notebook and a Laboratory Information Management System (ELN-LIMS) solution suitable for academic life science laboratories. openBIS ELN-LIMS allows researchers to efficiently document their work, to describe materials and methods and to collect raw and analyzed data. The system comes with a user-friendly web interface where data can be added, edited, browsed and searched. The openBIS software, a user guide and a demo instance are available at https://openbis-eln-lims.ethz.ch. The demo instance contains some data from our laboratory as an example to demonstrate the possibilities of the ELN-LIMS (Ottoz et al., 2014). For rapid local testing, a VirtualBox image of the ELN-LIMS is also available. © The Author 2015. Published by Oxford University Press.

  11. Treatment patterns and health care resource utilization associated with dalfampridine extended release in multiple sclerosis: a retrospective claims database analysis

    Directory of Open Access Journals (Sweden)

    Guo A

    2016-05-01

    Full Text Available Amy Guo,1 Michael Grabner,2 Swetha Rao Palli,2 Jessica Elder,1 Matthew Sidovar,1 Peter Aupperle,1 Stephen Krieger3 1Acorda Therapeutics Inc., Ardsley, New York, NY, USA; 2HealthCore Inc., Wilmington, DE, USA; 3Corinne Goldsmith Dickinson Center for MS, Icahn School of Medicine at Mount Sinai, New York, NY, USA Background: Although previous studies have demonstrated the clinical benefits of dalfampridine extended release (D-ER) tablets in patients with multiple sclerosis (MS), there are limited real-world data on D-ER utilization and associated outcomes in patients with MS. Purpose: The objective of this study was to evaluate treatment patterns, budget impact, and health care resource utilization (HRU) associated with D-ER use in a real-world setting. Methods: A retrospective claims database analysis was conducted using the HealthCore Integrated Research DatabaseSM. Adherence (measured by medication possession ratio [MPR]) and persistence (measured by days between the initial D-ER claim and discontinuation or end of follow-up) were evaluated over 1-year follow-up. Budget impact was calculated as cost per member per month (PMPM) over the available follow-up period. D-ER and control cohorts were propensity-score matched on baseline demographics, comorbidities, and MS-related resource utilization to compare walking-impairment-related HRU over follow-up. Results: Of the 2,138 MS patients identified, 1,200 were not treated with D-ER (control) and 938 were treated with D-ER. On average, patients were aged 51 years, and 74% were female. Approximately 82.6% of D-ER patients were adherent (MPR >80%). The estimated budget impact range of D-ER was $0.014–$0.026 PMPM. Propensity-score matching of D-ER patients and controls yielded 479 patients in each cohort. Postmatching comparison showed that the D-ER cohort had fewer physician visits (21.5% vs 62.4%, P<0.0001) and other outpatient visits (22.8% vs 51.4%, P<0.0001) over the 12-month follow-up. Changes in HRU from follow
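For readers unfamiliar with the adherence metric used above, the medication possession ratio can be computed from claims data roughly as follows. This is a simplified sketch: real analyses handle overlapping fills and censoring, and the fill dates here are invented for illustration.

```python
from datetime import date

def medication_possession_ratio(fills, period_start, period_end):
    """MPR = total days' supply dispensed / days in the observation period.
    fills: list of (fill_date, days_supply) tuples. Capped at 1.0 by convention."""
    period_days = (period_end - period_start).days + 1
    supplied = sum(days for fill_date, days in fills
                   if period_start <= fill_date <= period_end)
    return min(supplied / period_days, 1.0)

# Ten monthly 30-day fills over a one-year follow-up (hypothetical data).
fills = [(date(2014, m, 1), 30) for m in range(1, 11)]
mpr = medication_possession_ratio(fills, date(2014, 1, 1), date(2014, 12, 31))
# 300/365 ≈ 0.82, i.e. adherent under the MPR > 80% definition used in the study
```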

  12. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.

  13. Acoustic Database for Turbofan Engine Core-Noise Sources. I; Volume

    Science.gov (United States)

    Gordon, Grant

    2015-01-01

    In this program, a database of dynamic temperature and dynamic pressure measurements was acquired inside the core of a TECH977 turbofan engine to support investigations of indirect combustion noise. Dynamic temperature and pressure measurements were recorded for engine gas dynamics at temperatures up to 3100 degrees Fahrenheit and transient responses as high as 1000 hertz. These measurements were made at the entrance of the high-pressure turbine (HPT) and at the entrance and exit of the low-pressure turbine (LPT). Measurements were made at two circumferential clocking positions. In the combustor and inter-turbine duct (ITD), measurements were made at two axial locations to enable the exploration of time delays. The dynamic temperature measurements were made using dual thin-wire thermocouple probes. The dynamic pressure measurements were made using semi-infinite probes. Prior to the engine test, a series of bench, oven, and combustor rig tests were conducted to characterize the performance of the dual-wire temperature probes and to define and characterize the data acquisition systems. A measurement solution for acquiring dynamic temperature and pressure data on the engine was defined. A suite of hardware modifications was designed to incorporate the dynamic temperature and pressure instrumentation into the TECH977 engine. In particular, a probe actuation system was developed to protect the delicate temperature probes during engine startup and transients in order to maximize sensor life. A set of temperature probes was procured and the TECH977 engine was assembled with the suite of new and modified hardware. The engine was tested at four steady-state operating speeds, with repeats. Dynamic pressure and temperature data were acquired at each condition for at least one minute. At the two highest power settings, temperature data could not be obtained at the forward probe locations since the mean temperatures exceeded the capability of the probes.
The temperature data

  14. The Freight Analysis Framework Verson 4 (FAF4) - Building the FAF4 Regional Database: Data Sources and Estimation Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ho-Ling [ORNL; Hargrove, Stephanie [ORNL; Chin, Shih-Miao [ORNL; Wilson, Daniel W [ORNL; Taylor, Rob D [ORNL; Davidson, Diane [ORNL

    2016-09-01

    The Freight Analysis Framework (FAF) integrates data from a variety of sources to create a comprehensive national picture of freight movements among states and major metropolitan areas by all modes of transportation. It provides a national picture of current freight flows to, from, and within the United States, assigns the flows to the transportation network, and projects freight flow patterns into the future. The FAF4 is the fourth database of its kind: FAF1 provided estimates for truck, rail, and water tonnage for calendar year 1998; FAF2 provided a more complete picture based on the 2002 Commodity Flow Survey (CFS); and FAF3 made further improvements building on the 2007 CFS. Since the first FAF effort, a number of changes in both data sources and products have taken place. The FAF4 flow matrix described in this report is used as the base-year data to forecast future freight activities, projecting shipment weights and values from year 2020 to 2045 in five-year intervals. It also provides the basis for annual estimates to the FAF4 flow matrix, aiming to provide users with the timeliest data. Furthermore, FAF4 truck freight is routed on the national highway network to produce the FAF4 network database and flow assignments for trucks. This report details the data sources and methodologies applied to develop the base-year 2012 FAF4 database. An overview of the FAF4 components is briefly discussed in Section 2. Effects on FAF4 from the changes in the 2012 CFS are highlighted in Section 3. Section 4 provides a general discussion of the process used to fill data gaps within the domestic CFS matrix, specifically the estimation of suppressed/unpublished CFS cells. Over a dozen CFS OOS components of FAF4 are then addressed in Sections 5 through 11 of this report, including farm-based agricultural shipments in Section 5 and shipments from the fishery and logging sectors in Section 6. Shipments of municipal solid wastes and debris from construction

  15. Detecting and accounting for multiple sources of positional variance in peak list registration analysis and spin system grouping.

    Science.gov (United States)

    Smelter, Andrey; Rouchka, Eric C; Moseley, Hunter N B

    2017-08-01

    Peak lists derived from nuclear magnetic resonance (NMR) spectra are commonly used as input data for a variety of computer-assisted and automated analyses, including automated protein resonance assignment and protein structure calculation software tools. Prior to these analyses, peak lists must be aligned to each other and sets of related peaks must be grouped based on common chemical shift dimensions. Even when programs can perform peak grouping, they require the user to provide uniform match tolerances or use default values. However, peak grouping is further complicated by multiple sources of variance in peak position, limiting the effectiveness of grouping methods that use uniform match tolerances. In addition, no method currently exists for deriving peak positional variances from a single peak list for grouping peaks into spin systems, i.e., spin system grouping within a single peak list. We therefore developed a complementary pair of peak list registration analysis and spin system grouping algorithms designed to overcome these limitations. We have implemented these algorithms in an approach that can identify multiple dimension-specific positional variances in a single peak list and group peaks from a single peak list into spin systems. The resulting software tools generate a variety of useful statistics on both a single peak list and a pairwise peak list alignment, especially for quality assessment of peak list datasets. We used a range of low- and high-quality experimental solution NMR and solid-state NMR peak lists to assess the performance of our registration analysis and grouping algorithms. Analyses show that an algorithm using a single iteration with uniform match tolerances recovers only 50 to 80% of the spin systems, due to the presence of multiple sources of variance. Our algorithm recovers additional spin systems by reevaluating match tolerances over multiple iterations. To facilitate evaluation of the
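The grouping step described above hinges on dimension-specific tolerances rather than a single uniform one. A simplified illustration follows (a greedy centroid-based grouper, not the authors' registration algorithm; the peak positions and tolerances are hypothetical):

```python
def group_peaks(peaks, tolerances):
    """Greedy grouping of peaks into spin systems: a peak joins a group if its
    position in every grouping dimension is within that dimension's tolerance
    of the group centroid. Tolerances are per-dimension, as in the abstract."""
    groups = []  # each group is a list of peak tuples
    for peak in peaks:
        for group in groups:
            centroid = [sum(p[d] for p in group) / len(group)
                        for d in range(len(tolerances))]
            if all(abs(peak[d] - centroid[d]) <= tolerances[d]
                   for d in range(len(tolerances))):
                group.append(peak)
                break
        else:
            groups.append([peak])  # no existing group matched
    return groups

# Two spin systems in (1H, 15N) ppm; the 15N tolerance is looser than the 1H one.
peaks = [(8.20, 120.1), (8.21, 120.3), (4.55, 110.0), (4.54, 109.9)]
groups = group_peaks(peaks, tolerances=(0.05, 0.5))
```

The authors' contribution is precisely that such tolerances can be derived per dimension from the peak list itself instead of being supplied uniformly by the user.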

  16. Numerical Procedure to Forecast the Tsunami Parameters from a Database of Pre-Simulated Seismic Unit Sources

    Science.gov (United States)

    Jiménez, César; Carbonel, Carlos; Rojas, Joel

    2018-04-01

    We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the first-wave front and the maximum wave height, at real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green's functions) was obtained from numerical simulation of seismic unit sources (dimensions: 50 × 50 km²) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of the synthetic waveforms corresponding to the seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitudes greater than 1 m: for the Arica tide station the error (between the maximum heights of the observed and simulated waveforms) was 3.5%, for the Callao station it was 12%, and the largest error, 53.5%, occurred at Chimbote; however, given the low amplitude of the Chimbote wave (<1 m), this overestimate is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.
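The superposition step can be sketched as follows. This is an illustrative toy, not the study's code: the unit-source waveforms, slips, and detection threshold are invented, whereas in the real procedure the waveforms come from the pre-computed database of seismic unit sources described above.

```python
import numpy as np

def forecast_waveform(unit_waveforms, slips, threshold=0.1):
    """Linear superposition of pre-computed unit-source waveforms (Green's
    functions), each scaled by its unit source's slip. Returns the combined
    waveform, its peak height, and the first sample exceeding `threshold`
    (a stand-in for the arrival of the first-wave front)."""
    combined = sum(slip * w for slip, w in zip(slips, unit_waveforms))
    arrival_idx = int(np.argmax(np.abs(combined) > threshold))
    return combined, float(np.max(combined)), arrival_idx

t = np.linspace(0.0, 600.0, 601)                     # seconds, 1 s sampling
g1 = np.sin(2 * np.pi * (t - 120) / 300) * (t >= 120)  # toy unit-source waveform 1
g2 = np.sin(2 * np.pi * (t - 180) / 300) * (t >= 180)  # toy unit-source waveform 2
wave, peak, arrival = forecast_waveform([g1, g2], slips=[1.5, 0.8])
```

Because the governing equations are treated as linear, the forecast for an arbitrary source geometry reduces to a weighted sum over the unit sources it contains, which is what makes the method fast enough for early warning.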

  17. Source location in plates based on the multiple sensors array method and wavelet analysis

    International Nuclear Information System (INIS)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon

    2014-01-01

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on the plate and the distance between the impact position and the sensor must be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained from the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is estimated experimentally using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.
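The final localization step combines the MUSIC bearing with a range obtained from the time delay and the Lamb-wave group velocity. A minimal sketch under assumed values (the velocity, delay, and angle below are illustrative, not the paper's measurements):

```python
import math

def locate_impact(sensor_pos, doa_deg, time_delay_s, group_velocity):
    """Impact position in 2D plate coordinates: range = group velocity x time
    delay (from the wavelet analysis); bearing = the MUSIC DOA estimate."""
    distance = group_velocity * time_delay_s
    theta = math.radians(doa_deg)
    return (sensor_pos[0] + distance * math.cos(theta),
            sensor_pos[1] + distance * math.sin(theta))

# Hypothetical numbers: ~1500 m/s group velocity, 0.2 ms delay, 30 degree DOA.
x, y = locate_impact(sensor_pos=(0.0, 0.0), doa_deg=30.0,
                     time_delay_s=0.0002, group_velocity=1500.0)
# range of 0.3 m along a 30 degree bearing from the array
```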

  18. Source location in plates based on the multiple sensors array method and wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon [Inha University, Incheon (Korea, Republic of)

    2014-01-15

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on the plate and the distance between the impact position and the sensor must be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained from the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is estimated experimentally using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.

  19. Monte Carlo analyses of the source multiplication factor of the YALINA booster facility

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, Alberto; Gohar, Y.; Kondev, F.; Aliberti, Gerardo [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Bolshinsky, I. [Idaho National Laboratory, P. O. Box 2528, Idaho Falls, Idaho 83403 (United States); Kiyavitskaya, Hanna; Bournos, Victor; Fokov, Yury; Routkovskaya, Christina; Serafimovich, Ivan [Joint Institute for Power and Nuclear Research-Sosny, National Academy of Sciences, Minsk, acad. Krasin, 99, 220109 (Belarus)

    2008-07-01

    The multiplication factor of a subcritical assembly is affected by the energy spectrum and spatial distribution of the neutron source. In a critical assembly, neutrons emerge from the fission reactions with an average energy of approximately 2 MeV; in a deuteron accelerator driven subcritical assembly, neutrons emerge from the fusion target with a fixed energy of 2.45 or 14.1 MeV, from the Deuterium-Deuterium (D-D) and Deuterium-Tritium (D-T) reactions respectively. This study aims at generating accurate neutronics models for the YALINA Booster facility, based on the use of different Monte Carlo neutron transport codes, at defining the facility key physical parameters, and at comparing the neutron multiplication factor for three different neutron sources: fission, D-D and D-T. The calculated values are compared with the experimental results. (authors)

  20. Monte Carlo analyses of the source multiplication factor of the YALINA booster facility

    International Nuclear Information System (INIS)

    Talamo, Alberto; Gohar, Y.; Kondev, F.; Aliberti, Gerardo; Bolshinsky, I.; Kiyavitskaya, Hanna; Bournos, Victor; Fokov, Yury; Routkovskaya, Christina; Serafimovich, Ivan

    2008-01-01

    The multiplication factor of a subcritical assembly is affected by the energy spectrum and spatial distribution of the neutron source. In a critical assembly, neutrons emerge from the fission reactions with an average energy of ∼2 MeV; in a deuteron accelerator driven subcritical assembly, neutrons emerge from the fusion target with a fixed energy of 2.45 or 14.1 MeV, from the Deuterium-Deuterium (D-D) and Deuterium-Tritium (D-T) reactions respectively. This study aims at generating accurate neutronics models for the YALINA Booster facility, based on the use of different Monte Carlo neutron transport codes, at defining the facility key physical parameters, and at comparing the neutron multiplication factor for three different neutron sources: fission, D-D and D-T. The calculated values are compared with the experimental results. (authors)

  1. Distributed 3D Source Localization from 2D DOA Measurements Using Multiple Linear Arrays

    Directory of Open Access Journals (Sweden)

    Antonio Canclini

    2017-01-01

    Full Text Available This manuscript addresses the problem of 3D source localization from directions of arrival (DOAs) in wireless acoustic sensor networks. In this context, multiple sensors measure the DOA of the source, and a central node combines the measurements to yield the source location estimate. Traditional approaches require 3D DOA measurements; that is, each sensor estimates the azimuth and elevation of the source by means of a microphone array, typically in a planar or spherical configuration. The proposed methodology aims at reducing the hardware and computational costs by combining measurements related to 2D DOAs estimated from linear arrays arbitrarily placed in 3D space. Each sensor measures the DOA in the plane containing the array and the source. Measurements are then translated into an equivalent planar geometry, in which a set of coplanar equivalent arrays observes the source while preserving the original DOAs. This formulation is exploited to define a cost function whose minimization leads to the source location estimate. An extensive simulation campaign validates the proposed approach and compares its accuracy with state-of-the-art methodologies.
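Combining bearing measurements into a location estimate can be illustrated with the classic least-squares intersection of lines. This is a common formulation, not the paper's own equivalent-planar-geometry cost function: each sensor contributes a line through its position along its measured bearing, and the estimate minimizes the summed squared distance to all lines. Sensor positions and the source location below are invented.

```python
import numpy as np

def localize_from_doa_lines(points, directions):
    """Least-squares 3D source location from bearing lines. Line i passes
    through points[i] with unit direction directions[i]; the minimizer of the
    summed squared point-to-line distances solves a 3x3 linear system."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = np.asarray(d, float) / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the line
        A += P
        b += P @ np.asarray(p, float)
    return np.linalg.solve(A, b)

true_src = np.array([1.0, 2.0, 0.5])
sensors = [np.array([0.0, 0.0, 0.0]), np.array([4.0, 0.0, 0.0]),
           np.array([0.0, 5.0, 1.0])]
dirs = [true_src - s for s in sensors]   # noise-free bearings toward the source
est = localize_from_doa_lines(sensors, dirs)
```

With noise-free bearings the lines intersect exactly, so the solver recovers the true source; with noisy DOAs it returns the point closest to all lines in the least-squares sense.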

  2. A Monte Carlo multiple source model applied to radiosurgery narrow photon beams

    International Nuclear Information System (INIS)

    Chaves, A.; Lopes, M.C.; Alves, C.C.; Oliveira, C.; Peralta, L.; Rodrigues, P.; Trindade, A.

    2004-01-01

    Monte Carlo (MC) methods are nowadays often used in the field of radiotherapy. Through successive steps, radiation fields are simulated, producing source Phase Space Data (PSD) that enable a dose calculation with good accuracy. Narrow photon beams used in radiosurgery can also be simulated by MC codes. However, the poor efficiency of simulating these narrow photon beams produces PSD whose quality prevents calculating dose with the required accuracy. To overcome this difficulty, a multiple source model was developed that enhances the quality of the reconstructed PSD while also reducing computation time and storage requirements. This multiple source model was based on the full MC simulation, performed with the MC code MCNP4C, of the Siemens Mevatron KD2 (6 MV mode) linear accelerator head and additional collimators. The full simulation allowed the characterization of the particles coming from the accelerator head and from the additional collimators that shape the narrow photon beams used in radiosurgery treatments. Eight relevant photon virtual sources were identified from the full characterization analysis. Spatial and energy distributions were stored in histograms for the virtual sources representing the accelerator head components and the additional collimators. The photon directions were calculated for the virtual sources representing the accelerator head components, whereas for the virtual sources representing the additional collimators they were recorded into histograms. All these histograms were included in the MC code, the DPM code, and using a sampling procedure that reconstructed the PSDs, dose distributions were calculated in a water phantom divided into 20,000 voxels of 1 × 1 × 5 mm³. The model accurately calculates dose distributions in the water phantom for all the additional collimators; for depth dose curves, associated errors at 2σ were lower than 2.5% down to a depth of 202.5 mm for all the additional collimators, and for profiles at various depths, deviations between measured
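The sampling procedure that reconstructs PSDs from stored histograms can be illustrated by inverse-transform sampling from a binned spectrum. A hedged sketch: the bin edges and counts below are invented, and the actual virtual sources also store spatial and directional distributions, not just energy.

```python
import numpy as np

def sample_from_histogram(bin_edges, counts, n, rng):
    """Inverse-transform sampling from a binned distribution: pick a bin with
    probability proportional to its count, then draw uniformly within it.
    This mimics regenerating phase-space particles from stored histograms."""
    counts = np.asarray(counts, float)
    cdf = np.cumsum(counts) / counts.sum()
    u = rng.random(n)
    bins = np.searchsorted(cdf, u)           # bin index for each draw
    lo, hi = bin_edges[bins], bin_edges[bins + 1]
    return lo + rng.random(n) * (hi - lo)    # uniform within the chosen bin

rng = np.random.default_rng(42)
edges = np.linspace(0.0, 6.0, 61)            # photon energy bins in MeV (toy)
counts = np.exp(-np.arange(60) / 10.0)       # falling spectrum, illustrative only
energies = sample_from_histogram(edges, counts, 10000, rng)
```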

  3. Gas production strategy of underground coal gasification based on multiple gas sources.

    Science.gov (United States)

    Tianhong, Duan; Zuotang, Wang; Limin, Zhou; Dongdong, Li

    2014-01-01

    To lower the stability requirement for gas production in UCG (underground coal gasification), to create better space and opportunities for the development of UCG, an emerging sunrise industry, in its initial stage, and to reduce the emission of blast furnace gas, converter gas, and coke oven gas, this paper, for the first time, puts forward a new mode of utilizing multiple gas sources, mainly including ground gasifier gas, UCG gas, blast furnace gas, converter gas, and coke oven gas; the new mode was demonstrated by field tests. According to the field tests, existing power generation technology can fully adapt to the high hydrogen content, low calorific value, and fluctuating gas output of UCG in multiple-gas-source power generation; large fluctuations are tolerable and air can serve as the gasifying agent. In contrast, UCG gas production for the combined power-and-methanol mode based on multiple gas sources has a strict stability requirement. The field tests demonstrated that fluctuations in UCG gas production can be monitored effectively with a quality control chart method.
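The quality-control-chart monitoring mentioned above can be illustrated with a basic Shewhart chart: limits at the baseline mean plus or minus three standard deviations, with breaches flagged as out of control. The baseline gas-quality values below are invented for illustration, not field-test data.

```python
def control_chart_limits(samples, sigmas=3.0):
    """Shewhart-style control limits from an in-control baseline period:
    mean +/- sigmas * sample standard deviation."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    std = var ** 0.5
    return mean - sigmas * std, mean + sigmas * std

def out_of_control(samples, lcl, ucl):
    """Indices of observations breaching the control limits."""
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]

baseline = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.0]  # e.g. calorific value
lcl, ucl = control_chart_limits(baseline)        # limits: 9.4 and 10.6 here
flags = out_of_control([10.1, 9.9, 12.5, 10.0], lcl, ucl)
# the 12.5 reading breaches the upper limit
```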

  4. Gas Production Strategy of Underground Coal Gasification Based on Multiple Gas Sources

    Directory of Open Access Journals (Sweden)

    Duan Tianhong

    2014-01-01

    Full Text Available To lower the stability requirement for gas production in UCG (underground coal gasification), to create better space and opportunities for the development of UCG, an emerging sunrise industry, in its initial stage, and to reduce the emission of blast furnace gas, converter gas, and coke oven gas, this paper, for the first time, puts forward a new mode of utilizing multiple gas sources, mainly including ground gasifier gas, UCG gas, blast furnace gas, converter gas, and coke oven gas; the new mode was demonstrated by field tests. According to the field tests, existing power generation technology can fully adapt to the high hydrogen content, low calorific value, and fluctuating gas output of UCG in multiple-gas-source power generation; large fluctuations are tolerable and air can serve as the gasifying agent. In contrast, UCG gas production for the combined power-and-methanol mode based on multiple gas sources has a strict stability requirement. The field tests demonstrated that fluctuations in UCG gas production can be monitored effectively with a quality control chart method.

  5. Opinions on Drug Interaction Sources in Anticancer Treatments and Parameters for an Oncology-Specific Database by Pharmacy Practitioners in Asia

    Directory of Open Access Journals (Sweden)

    2010-01-01

    Full Text Available Cancer patients undergoing chemotherapy are particularly susceptible to drug-drug interactions (DDIs). Practitioners should keep themselves updated with the most current DDI information, particularly involving new anticancer drugs (ACDs). Databases can be useful to obtain up-to-date DDI information in a timely and efficient manner. Our objective was to investigate the DDI information sources of pharmacy practitioners in Asia and their views on the usefulness of an oncology-specific database for ACD interactions. A qualitative, cross-sectional survey was done to collect information on the respondents' practice characteristics, sources of DDI information and parameters useful in an ACD interaction database. The response rate was 49%. Electronic databases (70%), drug interaction textbooks (69%) and drug compendia (64%) were most commonly used. The majority (93%) indicated that a database catering towards ACD interactions would be useful. Essential parameters that should be included in the database were the mechanism and severity of the detected interaction, and the presence of a management plan (98% each). This study has improved our understanding of the usefulness of various DDI information sources for ACD interactions among pharmacy practitioners in Asia. An oncology-specific DDI database targeting ACD interactions is definitely attractive for clinical practice.

  6. Managing Multiple Sources of Competitive Advantage in a Complex Competitive Environment

    Directory of Open Access Journals (Sweden)

    Alexandre Howard Henry Lapersonne

    2013-12-01

    Full Text Available The aim of this article is to review the literature on sustained and temporary competitive advantage creation, specifically in dynamic markets, and to propose further research possibilities. After analyzing the main trends and scholars' works on the subject, it is concluded that a firm experiencing erosion of its core sources of economic rent generation should diversify its strategy portfolio in search of new sources of competitive advantage that can compensate for the decline in profits provoked by intensely competitive environments. The review advances the hypothesis that firms that decide to enter and manage multiple competitive environments should develop a multiple-strategies framework. Managing this portfolio of sources of competitive advantage should allow a firm's superior economic performance to persist through the management of the lifecycles of diverse temporary advantages and through a resilience effect, whereby a very successful source of competitive advantage compensates for those that have eroded. Additionally, the review indicates that the economies of emerging countries, such as those of the BRIC bloc, present a more complex competitive environment because of their history of cultural diversity, social contrasts, and frequent economic disruption, and also because recent institutional normalization has turned these markets toward hypercompetition. Consequently, the study of complex competition is appropriate in such environments.

  7. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
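The coverage measurement described above amounts to a volume-weighted lookup of product codes against a mapping table. A minimal sketch (the NDC-style codes shown are placeholders, not taken from the study's data):

```python
def dkb_coverage(prescriptions, mapping_table):
    """Share of total prescription volume whose product code appears in a
    drug-knowledge-base mapping table, i.e. the coverage measure above.
    prescriptions: dict mapping product_code -> prescription count."""
    total = sum(prescriptions.values())
    covered = sum(count for code, count in prescriptions.items()
                  if code in mapping_table)
    return covered / total if total else 0.0

# Hypothetical codes; the last one stands in for a locally invented code
# of the kind that drove noncoverage at the inpatient sources.
rx = {"00071-0155-23": 500, "00093-7146-56": 300, "99999-0001-01": 200}
dkb = {"00071-0155-23", "00093-7146-56"}
coverage = dkb_coverage(rx, dkb)   # 800 of 1000 prescriptions covered
```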

  8. Transparent mediation-based access to multiple yeast data sources using an ontology driven interface.

    Science.gov (United States)

    Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F

    2012-01-25

    Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (Proteins, Genes, RNAs...) is scattered across several data sources such as SGD, Yeastract, CYGD-MIPS, BioGrid and PhosphoGrid. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also reduces and limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML and mediator-based system. In this paper, we present our approach in developing this system, which takes advantage of SB-KOM to perform the needed query transformation and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specifically for integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously from multiple data sources. It has an easy-to-use, biologist-friendly interface. The system is available at http://www.khaos.uma.es/yeastmed/.

  9. Multiple Spectral Ratio Analyses Reveal Earthquake Source Spectra of Small Earthquakes and Moment Magnitudes of Microearthquakes

    Science.gov (United States)

    Uchide, T.; Imanishi, K.

    2016-12-01

    Spectral studies of macroscopic earthquake source parameters are helpful for characterizing the earthquake rupture process and hence for understanding earthquake source physics and fault properties. Such studies require muting the wave-propagation path and site effects in seismogram spectra to accentuate the source effect. We have recently developed the multiple spectral ratio method [Uchide and Imanishi, BSSA, 2016], which employs many empirical Green's function (EGF) events to reduce errors arising from the choice of EGF events. This method helps us estimate source spectra more accurately, as well as moment ratios between reference and EGF events, which are useful for constraining the seismic moment of microearthquakes. First, we focus on earthquake source spectra. The source spectra have generally been thought to obey the omega-square model with a single corner frequency. However, recent studies imply the existence of another corner frequency for some earthquakes. We analyzed small shallow inland earthquakes (3.5 multiple spectral ratio analyses. For 20,000 microearthquakes in the Fukushima Hamadori and northern Ibaraki prefecture area, we found that the JMA magnitudes (Mj), based on displacement or velocity amplitude, are systematically below Mw. The slope of the Mj-Mw relation is 0.5 for Mj 5. We propose a fitting curve for the obtained relationship as Mw = (1/2)Mj + (1/2)(Mj^γ + Mcor^γ)^(1/γ) + c, where Mcor is a corner magnitude, γ determines the sharpness of the corner, and c denotes an offset. We obtained Mcor = 4.1, γ = 5.6, and c = -0.47 to fit the observations. These parameters are useful for characterizing the Mj-Mw relationship. This non-linear relationship affects the b-value of the Gutenberg-Richter law. Quantitative discussions of b-values are affected by the definition of the magnitude used.
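The fitting curve given in the abstract can be evaluated directly. A minimal sketch using the reported parameters (Mcor = 4.1, γ = 5.6, c = -0.47); the function name is mine:

```python
def mw_from_mj(mj, m_cor=4.1, gamma=5.6, c=-0.47):
    """Fitted Mj-Mw relation from the abstract:
    Mw = (1/2) Mj + (1/2) (Mj^gamma + Mcor^gamma)^(1/gamma) + c."""
    return 0.5 * mj + 0.5 * (mj ** gamma + m_cor ** gamma) ** (1.0 / gamma) + c

# Below the corner magnitude the local slope is ~1/2; well above it, ~1,
# since the bracketed term approaches Mj itself for Mj >> Mcor.
small, large = mw_from_mj(2.0), mw_from_mj(8.0)
```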

  10. DMPD: Suppressor of cytokine signaling (SOCS) 2, a protein with multiple functions. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available 17070092 Suppressor of cytokine signaling (SOCS) 2, a protein with multiple functions. Authors: Rico-Bautista E, Flores-Morales A, Fernandez-Perez L. Epub 2006 Oct 27.

  11. DMPD: Multiple signaling pathways leading to the activation of interferon regulatory factor 3. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available 12213596 Multiple signaling pathways leading to the activation of interferon regulatory factor 3. PubmedID 12213596.

  12. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    Science.gov (United States)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial-condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  13. An efficient central DOA tracking algorithm for multiple incoherently distributed sources

    Science.gov (United States)

    Hassen, Sonia Ben; Samet, Abdelaziz

    2015-12-01

    In this paper, we develop a new tracking method for the direction-of-arrival (DOA) parameters of multiple incoherently distributed (ID) sources. The new approach is based on a simple covariance-fitting optimization technique that exploits the central and non-central moments of the source angular power densities to estimate the central DOAs. The current estimates are treated as measurements provided to a Kalman filter, which models the dynamics of the directional changes of the moving sources. The covariance-fitting-based algorithm and Kalman filtering theory are then combined to formulate an adaptive tracking algorithm. Our algorithm is compared to the fast approximated power iteration-total least squares-estimation of signal parameters via rotational invariance techniques (FAPI-TLS-ESPRIT) algorithm, which combines the TLS-ESPRIT method with subspace updating via the FAPI algorithm. It will be shown that the proposed algorithm offers excellent DOA tracking performance and outperforms the FAPI-TLS-ESPRIT method, especially at low signal-to-noise ratio (SNR) values. The performance of both methods improves as the SNR increases, and this improvement is more prominent for the FAPI-TLS-ESPRIT method; however, the performance of both degrades as the number of sources increases. It will also be shown that our method depends on the form of the angular distribution function when tracking the central DOAs. Finally, it will be shown that the more widely the sources are spaced, the more accurately the proposed method tracks the DOAs.
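
    The tracking stage described above feeds per-snapshot DOA estimates into a Kalman filter as measurements. A minimal sketch for a single central DOA follows; the constant-velocity state model, noise levels, and trajectory are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Constant-velocity state model for one central DOA: state = [theta, theta_dot]
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # only the angle is measured
Q = np.diag([1e-4, 1e-4])               # process noise covariance
R = np.array([[1.0]])                   # measurement noise variance (deg^2)

# Simulate a source drifting at 0.2 deg/step, observed with noisy DOA estimates
steps = 200
truth = 30.0 + 0.2 * np.arange(steps)
meas = truth + rng.normal(0.0, 1.0, steps)

x = np.array([meas[0], 0.0])            # initial state
P = np.eye(2)
est = []
for z in meas:
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the current DOA "measurement"
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])
est = np.array(est)
```

In the paper the measurements come from the covariance-fitting estimator rather than from simulated noise, but the filtering recursion has this standard predict/update shape.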

  14. Interpolating between random walks and optimal transportation routes: Flow with multiple sources and targets

    Science.gov (United States)

    Guex, Guillaume

    2016-05-01

    In recent articles about graphs, different models have proposed a formalism to find a type of path between two nodes, the source and the target, at a crossroads between the shortest path and the random-walk path. These models include a freely adjustable parameter, allowing one to tune the behavior of the path toward randomized movements or direct routes. This article presents a natural generalization of these models, namely a model with multiple sources and targets. In this context, source nodes can be viewed as locations with a supply of a certain good (e.g. people, money, information) and target nodes as locations with a demand for the same good. An algorithm is constructed to display the flow of goods in the network between sources and targets. Again with a freely adjustable parameter, this flow can be tuned to follow routes of minimum cost, thus displaying the flow in the context of the optimal transportation problem, or, by contrast, a random flow, known to be similar to the electrical current flow if the random walk is reversible. Moreover, a source-target coupling can be retrieved from this flow, offering an optimal assignment for the transportation problem. This algorithm is described in the first part of this article and then illustrated with case studies.

  15. PSI/TM-Coffee: a web server for fast and accurate multiple sequence alignments of regular and transmembrane proteins using homology extension on reduced databases.

    Science.gov (United States)

    Floden, Evan W; Tommaso, Paolo D; Chatzou, Maria; Magis, Cedrik; Notredame, Cedric; Chang, Jia-Ming

    2016-07-08

    The PSI/TM-Coffee web server performs multiple sequence alignment (MSA) of proteins by combining homology extension with a consistency-based alignment approach. Homology extension is performed with Position Specific Iterative (PSI) BLAST searches against a choice of redundant and non-redundant databases. The main novelty of this server is to allow databases of reduced complexity to rapidly perform homology extension. The server also makes it possible to use transmembrane protein (TMP) reference databases to allow even faster homology extension on this important category of proteins. Aside from an MSA, the server also outputs a topology prediction for TMPs using the HMMTOP algorithm. Previous benchmarking of the method has shown that this approach outperforms the most accurate alignment methods such as MSAProbs, Kalign, PROMALS, MAFFT, ProbCons and PRALINE™. The web server is available at http://tcoffee.crg.cat/tmcoffee. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Differentiating between anthropogenic and geological sources of nitrate using multiple geochemical tracers

    Science.gov (United States)

    Linhoff, B.; Norton, S.; Travis, R.; Romero, Z.; Waters, B.

    2017-12-01

    Nitrate contamination of groundwater is a major problem globally, including within the Albuquerque Basin in New Mexico. Ingesting high concentrations of nitrate (> 10 mg/L as N) can lead to an increased risk of cancer and to methemoglobinemia in infants. Numerous anthropogenic sources of nitrate have been identified within the Albuquerque Basin, including fertilizers, landfills, multiple sewer pipe releases, sewer lagoons, domestic septic leach fields, and a nitric acid line outfall. Furthermore, groundwater near ephemeral streams often exhibits elevated NO3 concentrations and high NO3/Cl ratios incongruous with an anthropogenic source. These results suggest that NO3 can be concentrated through evaporation beneath ephemeral streams and mobilized via irrigation or land-use change. This study seeks to use extensive geochemical analyses of groundwater and surface water to differentiate between the various sources of NO3 contamination. The U.S. Geological Survey collected 54 groundwater samples from wells and six samples from ephemeral streams, both within and outside of areas of known nitrate contamination. To fingerprint the sources of nitrate pollution, samples were analyzed for major ions, trace metals, nutrients, dissolved gases, δ15N and δ18O in NO3, δ15N in N2 gas, and δ2H and δ18O in H2O. Furthermore, most sites were sampled for artificial sweeteners and numerous contaminants of emerging concern, including pharmaceutical drugs, caffeine, and wastewater indicators. This study will also investigate the age distribution of groundwater and the approximate age of anthropogenic NO3 contamination using 3He/4He, δ13C, 14C, and 3H, as well as pharmaceutical drugs and artificial sweeteners with known patent and U.S. Food and Drug Administration approval dates. This broad suite of analytes will be used to differentiate between naturally occurring and multiple anthropogenic NO3 sources, and potentially to determine the approximate date of NO3 contamination.

  17. Validation and calibration of structural models that combine information from multiple sources.

    Science.gov (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior roughly reproducing select aspects of the modeled phenomenon, and it cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
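
    The view of calibration as estimation can be illustrated with a toy structural model: choose the parameter values that minimize the disagreement between model output and observed data. The exponential model, synthetic data, and grid-search procedure below are invented for illustration:

```python
import numpy as np

# Hypothetical "structural model": an exponential decline y = a * exp(-b t)
# whose two parameters must be calibrated against observed data.
def model(t, a, b):
    return a * np.exp(-b * t)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 50)
true_a, true_b = 2.0, 0.7
data = model(t, true_a, true_b) + rng.normal(0.0, 0.05, t.size)

# Calibration as estimation: least-squares search over a parameter grid.
a_grid = np.linspace(0.5, 3.5, 151)
b_grid = np.linspace(0.1, 1.5, 141)
preds = a_grid[:, None, None] * np.exp(-b_grid[None, :, None] * t[None, None, :])
sse = ((data - preds) ** 2).sum(axis=2)      # sum of squared errors per (a, b)
i, j = np.unravel_index(np.argmin(sse), sse.shape)
a_hat, b_hat = a_grid[i], b_grid[j]
```

When calibration is framed this way, standard estimation machinery (standard errors, identifiability diagnostics) applies; the informal "select inputs that roughly reproduce behavior" mode of calibration offers no such guarantees.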

  18. Glutathione provides a source of cysteine essential for intracellular multiplication of Francisella tularensis.

    Directory of Open Access Journals (Sweden)

    Khaled Alkhuder

    2009-01-01

    Full Text Available Francisella tularensis is a highly infectious bacterium causing the zoonotic disease tularemia. Its ability to multiply and survive in macrophages is critical for its virulence. By screening a bank of HimarFT transposon mutants of the F. tularensis live vaccine strain (LVS) to isolate intracellular growth-deficient mutants, we selected one mutant in a gene encoding a putative gamma-glutamyl transpeptidase (GGT). This gene (FTL_0766) was hence designated ggt. The mutant strain showed impaired intracellular multiplication and was strongly attenuated for virulence in mice. Here we present evidence that the GGT activity of F. tularensis allows utilization of glutathione (GSH, gamma-glutamyl-cysteinyl-glycine) and the gamma-glutamyl-cysteine dipeptide as cysteine sources to ensure intracellular growth. This is the first demonstration of the essential role of a nutrient acquisition system in the intracellular multiplication of F. tularensis. GSH is the most abundant source of cysteine in the host cytosol. Thus, the capacity this intracellular bacterial pathogen has evolved to utilize the GSH available in the host cytosol as a source of cysteine constitutes a paradigm of bacteria-host adaptation.

  19. Subcritical Neutron Multiplication Measurements of HEU Using Delayed Neutrons as the Driving Source

    International Nuclear Information System (INIS)

    Hollas, C.L.; Goulding, C.A.; Myers, W.L.

    1999-01-01

    A new method for determining the multiplication of highly enriched uranium (HEU) systems is presented. The method uses delayed neutrons to drive the HEU system. These delayed neutrons are from fission events induced by a pulsed 14-MeV neutron source. Between pulses, neutrons are detected within a medium-efficiency neutron detector using 3He ionization tubes within polyethylene enclosures. The neutron detection times are recorded relative to the initiation of the 14-MeV neutron pulse, and subsequently analyzed with the Feynman reduced-variance method to extract singles, doubles and triples neutron counting rates. Measurements have been made on a set of nested hollow spheres of 93% enriched uranium, with mass values from 3.86 kg to 21.48 kg. The singles, doubles and triples counting rates for each uranium system are compared to calculations from point kinetics models of neutron multiplicity to assign multiplication values. These multiplication values are compared to those from MCNP K-code calculations.
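
    The core statistic of the Feynman reduced-variance method can be sketched as follows. The event times here are a synthetic Poisson stream, for which the excess variance Y should be near zero; correlated fission-chain neutrons from a multiplying system would give Y > 0. The gate width and rate are illustrative, not the experiment's values:

```python
import numpy as np

rng = np.random.default_rng(2)

def feynman_y(times, gate_width, t_max):
    """Feynman reduced-variance statistic: Y = var(counts)/mean(counts) - 1
    over consecutive gates of fixed width. Y = 0 for an uncorrelated Poisson
    source; fission-chain correlations in a multiplying system give Y > 0."""
    edges = np.arange(0.0, t_max, gate_width)
    counts, _ = np.histogram(times, bins=edges)
    return counts.var() / counts.mean() - 1.0

# Sanity check: Poisson arrivals (rate ~100/s observed for 100 s) give Y ~ 0.
t_max = 100.0
times = np.cumsum(rng.exponential(1.0 / 100.0, size=12000))
times = times[times < t_max]
y = feynman_y(times, gate_width=0.01, t_max=t_max)
```

In practice Y is computed for a range of gate widths and fit against the point-kinetics prediction to extract the multiplication; the doubles and triples rates come from the higher factorial moments of the same gated counts.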

  20. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    Science.gov (United States)

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open-source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases, taking the individual clinical context into account. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  1. Development of repository-wide radionuclide transport model considering the effects of multiple sources

    International Nuclear Information System (INIS)

    Hatanaka, Koichiro; Watari, Shingo; Ijiri, Yuji

    1999-11-01

    Safety assessment of the geological isolation system under the groundwater scenario has traditionally been conducted based on a single canister configuration; the safety of the total system has then been evaluated based on dose rates obtained by multiplying the migration rates released from the engineered barrier and/or the natural barrier by dose conversion factors and by the total number of canisters disposed of in the repository. The dose conversion factors can be obtained from the biosphere analysis. In this study, we focused on the effect of multiple sources due to the disposal of canisters at different positions in the repository. When the effect of multiple sources is taken into consideration, concentration interference in the repository region can take place. Therefore, a radionuclide transport model/code considering the effect of concentration interference due to multiple sources was developed to assess the effect quantitatively. The newly developed model/code was verified through comparison with the existing radionuclide transport analysis code used in the second progress report. In addition, the effect of the concentration interference was evaluated by setting up a simple problem using the newly developed analysis code. The results show that the maximum peak value of the migration rates from the repository was about two orders of magnitude lower than that based on the single canister configuration. Since the analysis code was developed by assuming that all canisters disposed of along the one-dimensional groundwater flow contribute to the concentration interference in the repository region, this assumption should be verified by conducting two- or three-dimensional analyses considering heterogeneous geological structure as future work. (author)
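
    Because radionuclide transport at trace concentrations is linear, the concentration-interference idea can be illustrated by superposing single-canister solutions along a one-dimensional flow path. The Gaussian advection-dispersion kernel and every parameter value below are illustrative assumptions, not the report's model:

```python
import numpy as np

def plume(x, x_src, t, D=1.0, v=0.1, m=1.0):
    """Standard 1D advection-dispersion solution for an instantaneous
    point release of mass m at x_src (a textbook Gaussian kernel)."""
    return m / np.sqrt(4 * np.pi * D * t) * np.exp(
        -(x - x_src - v * t) ** 2 / (4 * D * t))

x = np.linspace(0.0, 200.0, 2001)
t = 100.0
sources = np.arange(40.0, 160.0, 10.0)   # canister positions along the flow path

# Linearity: the multi-source field is the sum of the single-source fields.
total = sum(plume(x, xs, t) for xs in sources)
single = plume(x, sources[0], t)
```

Overlapping plumes raise the concentration between canisters, which flattens the gradients driving release from each one; that interference is what the repository-wide model resolves and what the single-canister scaling ignores.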

  2. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases.

    Science.gov (United States)

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-07-01

    As global cloud frameworks for bioinformatics research databases become huge and heterogeneous, solutions face diametric challenges of cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN had published 192 mammalian, plant and protein life-sciences databases containing 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools such as SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life-sciences data securely under the control of programming languages popular among bioinformaticians, such as Perl and Ruby. Researchers have successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents such as ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.

  3. Relaxation dynamics in the presence of pulse multiplicative noise sources with different correlation properties

    Science.gov (United States)

    Kargovsky, A. V.; Chichigina, O. A.; Anashkina, E. I.; Valenti, D.; Spagnolo, B.

    2015-10-01

    The relaxation dynamics of a system described by a Langevin equation with pulse multiplicative noise sources with different correlation properties is considered. The solution of the corresponding Fokker-Planck equation is derived for Gaussian white noise. Moreover, two pulse processes with regulated periodicity are considered as noise sources: the dead-time-distorted Poisson process and the process with fixed time intervals, which is characterized by an infinite correlation time. We find that the steady state of the system depends on the correlation properties of the pulse noise: an increase in the noise correlation decreases the mean value of the steady-state solution. The analytical results are in good agreement with the numerical ones.
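
    A minimal Euler-Maruyama sketch of a Langevin equation with linear multiplicative Gaussian white noise, the simplest of the noise sources considered above. The drift, noise strength, and time step are illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

# Ito SDE with multiplicative Gaussian white noise: dX = -a*X dt + sigma*X dW.
a, sigma = 1.0, 0.5
x0, dt, n_steps, n_paths = 1.0, 1e-3, 1000, 2000

x = np.full(n_paths, x0)
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)   # Wiener increments
    x = x + (-a * x) * dt + sigma * x * dw       # Euler-Maruyama step

# For this linear SDE the exact ensemble mean is E[X_t] = x0 * exp(-a*t),
# so at t = 1 the relaxation has decayed the mean to about exp(-1) ~ 0.368.
mean_x1 = x.mean()
```

Replacing the Gaussian increments with pulse processes of different correlation (e.g. dead-time-distorted Poisson kicks) is what changes the steady state in the paper's analysis.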

  4. Prediction of the neutrons subcritical multiplication using the diffusion hybrid equation with external neutron sources

    Energy Technology Data Exchange (ETDEWEB)

    Costa da Silva, Adilson; Carvalho da Silva, Fernando [COPPE/UFRJ, Programa de Engenharia Nuclear, Caixa Postal 68509, 21941-914, Rio de Janeiro (Brazil); Senra Martinez, Aquilino, E-mail: aquilino@lmp.ufrj.br [COPPE/UFRJ, Programa de Engenharia Nuclear, Caixa Postal 68509, 21941-914, Rio de Janeiro (Brazil)

    2011-07-15

    Highlights: → We proposed a new neutron diffusion hybrid equation with an external neutron source. → A coarse-mesh finite difference method for the adjoint flux and reactivity calculation was developed. → A 1/M curve is used to predict the criticality condition. - Abstract: We used the neutron diffusion hybrid equation, in Cartesian geometry with external neutron sources, to predict the subcritical multiplication of neutrons in a pressurized water reactor, using a 1/M curve to predict the criticality condition. A coarse-mesh finite difference method was developed for the adjoint flux calculation and to obtain the reactivity values of the reactor. The results obtained were compared with benchmark values in order to validate the methodology presented in this paper.

  5. Prediction of the neutrons subcritical multiplication using the diffusion hybrid equation with external neutron sources

    International Nuclear Information System (INIS)

    Costa da Silva, Adilson; Carvalho da Silva, Fernando; Senra Martinez, Aquilino

    2011-01-01

    Highlights: → We proposed a new neutron diffusion hybrid equation with an external neutron source. → A coarse-mesh finite difference method for the adjoint flux and reactivity calculation was developed. → A 1/M curve is used to predict the criticality condition. - Abstract: We used the neutron diffusion hybrid equation, in Cartesian geometry with external neutron sources, to predict the subcritical multiplication of neutrons in a pressurized water reactor, using a 1/M curve to predict the criticality condition. A coarse-mesh finite difference method was developed for the adjoint flux calculation and to obtain the reactivity values of the reactor. The results obtained were compared with benchmark values in order to validate the methodology presented in this paper.
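
    The 1/M extrapolation used to predict the criticality condition rests on the subcritical multiplication M = 1/(1 - k_eff): detector counts grow as M, so 1/M falls toward zero as the system approaches critical. A sketch with idealized data (the linear dependence of k_eff on fuel mass is an assumption for illustration):

```python
import numpy as np

# Hypothetical approach-to-critical data: as fuel mass grows, k_eff rises
# and the subcritical multiplication M = 1/(1 - k_eff) diverges.
mass = np.array([5.0, 10.0, 15.0, 20.0])   # kg (illustrative)
k_eff = mass / 25.0                        # assumed linear in mass
counts = 1.0 / (1.0 - k_eff)               # detector counts proportional to M

# In this idealized case 1/M is exactly linear in mass; extrapolating the
# fitted line to 1/M = 0 predicts the critical mass without going critical.
slope, intercept = np.polyfit(mass, 1.0 / counts, 1)
predicted_critical_mass = -intercept / slope
```

Real 1/M curves are only approximately linear (source and detector positions matter), which is why the paper benchmarks the prediction against reference reactivity values.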

  6. An open-source software program for performing Bonferroni and related corrections for multiple comparisons

    Directory of Open Access Journals (Sweden)

    Kyle Lesack

    2011-01-01

    Full Text Available Increased type I error resulting from multiple statistical comparisons remains a common problem in the scientific literature. This may result in the reporting and promulgation of spurious findings. One approach to this problem is to correct groups of P-values for "family-wide significance" using a Bonferroni correction or the less conservative Bonferroni-Holm correction, or to correct for the "false discovery rate" with a Benjamini-Hochberg correction. Although several solutions are available for performing these corrections through commercially available software, there are no widely available, easy-to-use open-source programs to perform these calculations. In this paper we present an open-source program written in Python 3.2 that performs calculations for the standard Bonferroni, Bonferroni-Holm and Benjamini-Hochberg corrections.
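
    For reference, the three corrections named in the abstract can each be implemented in a few lines. This is our sketch, not the paper's program; the adjusted p-values follow the conventions of R's p.adjust:

```python
def bonferroni(pvals):
    """Standard Bonferroni: multiply each p-value by the number of tests."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def holm(pvals):
    """Bonferroni-Holm step-down adjusted p-values (family-wise error rate)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        # enforce monotonicity while stepping down through sorted p-values
        running_max = max(running_max, (m - rank) * pvals[i])
        adj[i] = min(1.0, running_max)
    return adj

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg step-up adjusted p-values (false discovery rate)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_min = 1.0
    for rank in range(m - 1, -1, -1):
        # step up from the largest p-value, enforcing monotonicity
        i = order[rank]
        running_min = min(running_min, pvals[i] * m / (rank + 1))
        adj[i] = running_min
    return adj
```

For example, for p-values [0.01, 0.02, 0.03, 0.04], Bonferroni gives [0.04, 0.08, 0.12, 0.16], Holm gives [0.04, 0.06, 0.06, 0.06], and Benjamini-Hochberg gives [0.04, 0.04, 0.04, 0.04], illustrating how much less conservative the FDR correction is.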

  7. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
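
    The classification stage, an FLD classifier evaluated with leave-one-out cross-validation, can be sketched with numpy alone. The two-class feature data below are invented stand-ins for the pose/shape/composition features of the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented two-class feature data (e.g. controls vs. patients), 5 features each.
n, d = 20, 5
X0 = rng.normal(0.0, 1.0, (n, d))          # "controls"
X1 = rng.normal(2.0, 1.0, (n, d))          # "patients" (shifted mean)
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

def fld_predict(X_tr, y_tr, x_te):
    """Fisher Linear Discriminant: project onto w = Sw^-1 (m1 - m0) and
    threshold at the midpoint of the projected class means."""
    m0 = X_tr[y_tr == 0].mean(axis=0)
    m1 = X_tr[y_tr == 1].mean(axis=0)
    Sw = np.cov(X_tr[y_tr == 0].T) + np.cov(X_tr[y_tr == 1].T)
    w = np.linalg.solve(Sw, m1 - m0)
    thresh = 0.5 * (w @ m0 + w @ m1)
    return int(w @ x_te > thresh)

# Leave-one-out cross-validation: hold out each subject once.
correct = 0
for i in range(len(y)):
    mask = np.ones(len(y), dtype=bool)
    mask[i] = False
    correct += int(fld_predict(X[mask], y[mask], X[i]) == y[i])
accuracy = correct / len(y)
```

With the well-separated synthetic classes here the LOO accuracy is near perfect; the paper's 76% reflects the much harder real separation between MDD and control anatomy.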

  8. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young [Low-temperature Plasma Laboratory, Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 305-701 (Korea, Republic of); An, Sang-Hyuk [Agency of Defense Development, Yuseong-gu, Daejeon 305-151 (Korea, Republic of)

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to measure the current in each coil in the 2-coil experiment. Based on the results, we could establish the feasibility of multiple ICP sources, owing to the direct change of impedance with current and the saturation of impedance caused by the skin-depth effect. However, a helicon plasma source is difficult to adapt to multiple sources because of the continual change of real impedance with mode transitions and the low uniformity of the B-field confinement. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  9. Shape optimization of an airfoil in a BZT flow with multiple-source uncertainties

    International Nuclear Information System (INIS)

    Congedo, P.M.; Corre, C.; Martinez, J.M.

    2011-01-01

    Bethe-Zel'dovich-Thompson (BZT) fluids are characterized by negative values of the fundamental derivative of gas dynamics for a range of temperatures and pressures in the vapor phase, which leads to non-classical gas dynamic behaviors such as the disintegration of compression shocks. These non-classical phenomena can be exploited, when using these fluids in Organic Rankine Cycles (ORCs), to increase isentropic efficiency. A predictive numerical simulation of these flows must account for two main sources of physical uncertainty: the BZT fluid properties, often difficult to measure accurately, and the usually fluctuating turbine inlet conditions. To take full advantage of the BZT properties, the turbine geometry must also be specifically designed, keeping in mind that the geometry achieved in practice after machining always differs slightly from the theoretical shape. This paper investigates efficient procedures to perform shape optimization in a 2D BZT flow with multiple-source uncertainties (thermodynamic model, operating conditions and geometry). To demonstrate the feasibility of the proposed strategies for shape optimization in the presence of multiple-source uncertainties, a zero-incidence symmetric airfoil wave-drag minimization problem is retained as a case study. This simplified configuration encompasses most of the features associated with a turbine design problem, as far as uncertainty quantification is concerned. A preliminary analysis of the contributions to the variance of the wave-drag allows the most significant sources of uncertainty to be selected using a reduced number of flow computations. The resulting mean value and variance of the objective are next turned into metamodels. The optimal Pareto sets corresponding to the minimization of various substitute functions are obtained using a genetic algorithm as optimizer, and their differences are discussed. (authors)

  10. Use of multiple data sources to estimate hepatitis C seroprevalence among prisoners: A retrospective cohort study.

    Directory of Open Access Journals (Sweden)

    Kathryn J Snow

    Full Text Available Hepatitis C is a major cause of preventable morbidity and mortality. Prisoners are a key population for hepatitis C control programs, and with the advent of highly effective therapies, prisons are increasingly important sites for hepatitis C diagnosis and treatment. Accurate estimates of hepatitis C prevalence among prisoners are needed in order to plan and resource service provision; however, many prevalence estimates are based on surveys compromised by limited and potentially biased participation. We aimed to compare estimates derived from three different data sources, and to assess whether the use of self-report as a supplementary data source may help researchers assess the risk of selection bias. We used three data sources to estimate the prevalence of hepatitis C antibodies in a large cohort of Australian prisoners: prison medical records, self-reported status during a face-to-face interview prior to release from prison, and data from a statewide notifiable-conditions surveillance system. Of 1,315 participants, 33.8% had at least one indicator of hepatitis C seropositivity; however, less than one third of these (9.5% of the entire cohort) were identified by all three data sources. Among participants of known status, self-report had a sensitivity of 80.1% and a positive predictive value of 97.8%. Any one data source used in isolation would have under-estimated the prevalence of hepatitis C in this cohort. Using multiple data sources in studies of hepatitis C seroprevalence among prisoners may improve case detection and help researchers assess the risk of selection bias due to non-participation in serological testing.
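
    The reported diagnostic properties of self-report follow from a standard 2x2 comparison against known serostatus. The counts below are invented to roughly reproduce the abstract's figures, purely for illustration:

```python
def sensitivity(tp, fn):
    """Fraction of truly positive cases that the data source detects."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """Fraction of positive calls from the data source that are truly positive."""
    return tp / (tp + fp)

# Hypothetical counts chosen so the figures land near the abstract's 80.1%
# sensitivity and 97.8% PPV for self-report vs. known serostatus.
tp, fn, fp = 398, 99, 9
sens = sensitivity(tp, fn)
ppv = positive_predictive_value(tp, fp)
```

High PPV with moderate sensitivity is exactly the pattern that makes self-report useful as a supplementary source: positive self-reports are trustworthy, but relying on self-report alone would miss about one in five true cases.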

  11. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects, helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design, without changing semantics. You'll learn how to evolve database schemas in step with source code, and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...

  12. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level in the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of denser materials. In response to local industry demand and in support of on-going research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large scale multiple source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open tube microfocus X-ray source and a 450 kV closed tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system currently can accommodate samples up to 0.5 x 0.5 x 0.5 m in size with weight up to 50 kg. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in the future.

  13. Optimizing Irrigation Water Allocation under Multiple Sources of Uncertainty in an Arid River Basin

    Science.gov (United States)

    Wei, Y.; Tang, D.; Gao, H.; Ding, Y.

    2015-12-01

    Population growth and climate change add additional pressures affecting water resources management strategies for meeting demands from different economic sectors. It is especially challenging in arid regions where fresh water is limited. For instance, in the Tailanhe River Basin (Xinjiang, China), a compromise must be made between water suppliers and users during drought years. This study presents a multi-objective irrigation water allocation model to cope with water scarcity in arid river basins. To deal with the uncertainties from multiple sources in the water allocation system (e.g., variations of available water amount, crop yield, crop prices, and water price), the model employs an interval linear programming approach. The multi-objective optimization model developed from this study is characterized by integrating eco-system service theory into water-saving measures. For evaluation purposes, the model is used to construct an optimal allocation system for irrigation areas fed by the Tailan River (Xinjiang Province, China). The objective functions to be optimized are formulated based on these irrigation areas' economic, social, and ecological benefits. The optimal irrigation water allocation plans are made under different hydroclimate conditions (wet year, normal year, and dry year), with multiple sources of uncertainty represented. The modeling tool and results are valuable for advising decision making by the local water authority—and the agricultural community—especially on measures for coping with water scarcity (by incorporating uncertain factors associated with crop production planning).
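
    As a rough sketch of the interval linear programming idea mentioned above (not the authors' model), uncertain coefficients enter as [lower, upper] intervals and the allocation problem is solved twice, once with pessimistic and once with optimistic bounds, to bracket the optimal benefit. The area names, benefit coefficients and water volumes below are invented for illustration:

```python
# Toy interval-programming sketch: uncertain inputs are [low, high] intervals,
# and the model is solved under both bounds to bracket the optimal benefit.

areas = {  # name: (benefit per unit water as (low, high), max demand)
    "area_A": ((2.0, 3.0), 40.0),
    "area_B": ((1.5, 2.5), 50.0),
}
water_available = (60.0, 80.0)  # interval of total available water

def allocate(total, which):
    """Greedy solution of max sum(b_i * x_i) s.t. sum(x_i) <= total,
    0 <= x_i <= demand_i -- optimal because there is one coupling row."""
    order = sorted(areas.items(), key=lambda kv: kv[1][0][which], reverse=True)
    benefit, left = 0.0, total
    for _, ((b_lo, b_hi), demand) in order:
        x = min(demand, left)        # give water to the best area first
        benefit += (b_lo, b_hi)[which] * x
        left -= x
    return benefit

lower = allocate(water_available[0], 0)  # pessimistic submodel
upper = allocate(water_available[1], 1)  # optimistic submodel
print(f"optimal benefit lies in [{lower:.1f}, {upper:.1f}]")  # [110.0, 220.0]
```

A full model of the kind the abstract describes would replace the greedy step with an LP solver and add social and ecological objectives, but the two-submodel bracketing structure stays the same.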

  14. Assessing regional groundwater stress for nations using multiple data sources with the groundwater footprint

    International Nuclear Information System (INIS)

    Gleeson, Tom; Wada, Yoshihide

    2013-01-01

    Groundwater is a critical resource for agricultural production, ecosystems, drinking water and industry, yet groundwater depletion is accelerating, especially in a number of agriculturally important regions. Assessing the stress of groundwater resources is crucial for science-based policy and management, yet water stress assessments have often neglected groundwater and used single data sources, which may underestimate the uncertainty of the assessment. We consistently analyze and interpret groundwater stress across whole nations using multiple data sources for the first time. We focus on two nations with the highest national groundwater abstraction rates in the world, the United States and India, and use the recently developed groundwater footprint and multiple datasets of groundwater recharge and withdrawal derived from hydrologic models and data synthesis. A minority of aquifers, mostly with known groundwater depletion, show groundwater stress regardless of the input dataset. The majority of aquifers are not stressed with any input data while less than a third are stressed for some input data. In both countries groundwater stress affects agriculturally important regions. In the United States, groundwater stress impacts a lower proportion of the national area and population, and is focused in regions with lower population and water well density compared to India. Importantly, the results indicate that the uncertainty is generally greater between datasets than within datasets and that much of the uncertainty is due to recharge estimates. Assessment of groundwater stress consistently across a nation and assessment of uncertainty using multiple datasets are critical for the development of a science-based rationale for policy and management, especially with regard to where and to what extent to focus limited research and management resources. (letter)
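
    The multi-dataset uncertainty check described above can be sketched with the groundwater footprint ratio; assuming the stress ratio takes the form GF/A = C/(R - E) (abstraction C, recharge R, groundwater contribution to environmental flows E, per the published groundwater footprint concept), with GF/A > 1 flagging a stressed aquifer. All numbers below are invented:

```python
# Sketch: the same aquifer evaluated against several recharge datasets,
# mimicking the between-dataset uncertainty the study emphasises.

def stress_ratio(abstraction, recharge, env_flow):
    """Groundwater footprint per unit aquifer area: C / (R - E)."""
    return abstraction / (recharge - env_flow)

recharge_datasets = {"model_1": 120.0, "model_2": 90.0, "synthesis": 150.0}  # mm/yr
C, E = 80.0, 30.0  # abstraction and environmental-flow needs, mm/yr (invented)

for name, R in recharge_datasets.items():
    r = stress_ratio(C, R, E)
    print(f"{name}: GF/A = {r:.2f} -> {'stressed' if r > 1 else 'not stressed'}")
```

With these invented inputs one recharge dataset flips the verdict to "stressed" while the other two do not, which is exactly the between-dataset spread the abstract identifies as the dominant source of uncertainty.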

  15. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    Science.gov (United States)

    Silva, Ikaro; Moody, George B

    The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  16. Sources of water column methylmercury across multiple estuaries in the Northeast U.S.

    Science.gov (United States)

    Balcom, Prentiss H; Schartup, Amina T; Mason, Robert P; Chen, Celia Y

    2015-12-20

    Estuarine water column methylmercury (MeHg) is an important driver of mercury (Hg) bioaccumulation in pelagic organisms and thus it is necessary to understand the sources and processes affecting environmental levels of MeHg. Increases in water column MeHg concentrations can ultimately be transferred to fish consumed by humans, but despite this, the sources of MeHg to the estuarine water column are still poorly understood. Here we evaluate MeHg sources across 4 estuaries and 10 sampling sites and examine the distributions and partitioning of sediment and water column MeHg across a geographic range (Maine to New Jersey). Our study sites present a gradient in the concentrations of sediment, pore water and water column Hg species. Suspended particle MeHg ranged from below detection to 187 pmol g⁻¹, dissolved MeHg from 0.01 to 0.68 pM, and sediment MeHg from 0.01 to 109 pmol g⁻¹. Across multiple estuaries, dissolved MeHg correlated with Hg species in the water column, and sediment MeHg correlated with sediment total Hg (HgT). Water column MeHg did not correlate well with sediment Hg across estuaries, indicating that sediment concentrations were not a good predictor of water MeHg concentrations. This is an unexpected finding since it has been shown that MeHg production from inorganic Hg²⁺ within sediment is the primary source of MeHg to coastal waters. Additional sources of MeHg regulate water column MeHg levels in some of the shallow estuaries included in this study.

  17. Combining evidence from multiple electronic health care databases: performances of one-stage and two-stage meta-analysis in matched case-control studies.

    Science.gov (United States)

    La Gamba, Fabiola; Corrao, Giovanni; Romio, Silvana; Sturkenboom, Miriam; Trifirò, Gianluca; Schink, Tania; de Ridder, Maria

    2017-10-01

    Clustering of patients in databases is usually ignored in one-stage meta-analysis of multi-database studies using matched case-control data. The aim of this study was to compare bias and efficiency of such a one-stage meta-analysis with a two-stage meta-analysis. First, we compared the approaches by generating matched case-control data under 5 simulated scenarios, built by varying: (1) the exposure-outcome association; (2) its variability among databases; (3) the confounding strength of one covariate on this association; (4) its variability; and (5) the (heterogeneous) confounding strength of two covariates. Second, we made the same comparison using empirical data from the ARITMO project, a multiple database study investigating the risk of ventricular arrhythmia following the use of medications with arrhythmogenic potential. In our study, we specifically investigated the effect of current use of promethazine. Bias increased for one-stage meta-analysis with increasing (1) between-database variance of exposure effect and (2) heterogeneous confounding generated by two covariates. The efficiency of one-stage meta-analysis was slightly lower than that of two-stage meta-analysis for the majority of investigated scenarios. Based on ARITMO data, there were no evident differences between one-stage (OR = 1.50, CI = [1.08; 2.08]) and two-stage (OR = 1.55, CI = [1.12; 2.16]) approaches. When the effect of interest is heterogeneous, a one-stage meta-analysis ignoring clustering gives biased estimates. Two-stage meta-analysis generates estimates at least as accurate and precise as one-stage meta-analysis. However, in a study using small databases and rare exposures and/or outcomes, a correct one-stage meta-analysis becomes essential. Copyright © 2017 John Wiley & Sons, Ltd.
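
    The two-stage approach the abstract compares can be sketched briefly: each database is analysed separately to yield a log-odds-ratio and standard error, and the per-database estimates are then pooled with fixed-effect inverse-variance weights. The per-database numbers below are invented for illustration, not ARITMO results:

```python
import math

# Stage 1 output per database: (odds ratio, standard error of log-OR).
# Values are illustrative only.
studies = [(1.42, 0.21), (1.66, 0.30), (1.51, 0.25)]

log_ors = [math.log(or_) for or_, _ in studies]
weights = [1.0 / se ** 2 for _, se in studies]  # inverse-variance weights

# Stage 2: fixed-effect pooling across databases.
pooled_log_or = sum(w * b for w, b in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

ci_low = math.exp(pooled_log_or - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_or + 1.96 * pooled_se)
print(f"pooled OR = {math.exp(pooled_log_or):.2f}, "
      f"95% CI = [{ci_low:.2f}; {ci_high:.2f}]")
```

A random-effects variant would add a between-database variance term to each weight, which is precisely where the heterogeneity the simulation study manipulates would enter.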

  18. Fast in-database cross-matching of high-cadence, high-density source lists with an up-to-date sky model

    Science.gov (United States)

    Scheers, B.; Bloemen, S.; Mühleisen, H.; Schellart, P.; van Elteren, A.; Kersten, M.; Groot, P. J.

    2018-04-01

    Coming high-cadence wide-field optical telescopes will image hundreds of thousands of sources per minute. Besides inspecting the near real-time data streams for transient and variability events, the accumulated data archive is a wealthy laboratory for making complementary scientific discoveries. The goal of this work is to optimise column-oriented database techniques to enable the construction of a full-source and light-curve database for large-scale surveys, that is accessible by the astronomical community. We adopted LOFAR's Transients Pipeline as the baseline and modified it to enable the processing of optical images that have much higher source densities. The pipeline adds new source lists to the archive database, while cross-matching them with the known catalogued sources in order to build a full light-curve archive. We investigated several techniques of indexing and partitioning the largest tables, allowing for faster positional source look-ups in the cross-matching algorithms. We monitored all query run times in long-term pipeline runs where we processed a subset of IPHAS data that have image source density peaks over 170,000 per field of view (500,000 deg⁻²). Our analysis demonstrates that horizontal table partitions of declination widths of one degree control the query run times. Usage of an index strategy where the partitions are densely sorted according to source declination yields another improvement. Most queries run in sublinear time and a few (< 20%) run in linear time, because of dependencies on input source-list and result-set size. We observed that for this logical database partitioning schema the limiting cadence achieved by the pipeline when processing IPHAS data is 25 s.
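
    The declination-partitioning strategy can be mimicked in a few lines: bucket the catalogued sources into one-degree declination stripes, then have each positional look-up scan only the candidate's stripe and its two neighbours instead of the full catalogue. This is a schematic sketch with invented coordinates, not the Transients Pipeline's actual SQL:

```python
import math
from collections import defaultdict

# Catalogue of (ra, dec) positions in degrees; values are invented.
catalogue = [(10.001, 45.0005), (10.5, 45.8), (11.2, 46.3)]

# Partition the catalogue into one-degree declination stripes.
stripes = defaultdict(list)
for ra, dec in catalogue:
    stripes[math.floor(dec)].append((ra, dec))

def match(ra, dec, radius_deg=1.0 / 3600):
    """Return catalogued sources within radius, scanning at most 3 stripes."""
    hits = []
    for stripe in (math.floor(dec) - 1, math.floor(dec), math.floor(dec) + 1):
        for cra, cdec in stripes.get(stripe, []):
            dra = (ra - cra) * math.cos(math.radians(dec))  # RA foreshortening
            if math.hypot(dra, dec - cdec) <= radius_deg:
                hits.append((cra, cdec))
    return hits

print(match(10.0011, 45.0006))  # finds the nearby catalogued source
print(match(200.0, 10.0))       # empty: no stripe near this position
```

The in-database version does the same thing with horizontal table partitions and declination-sorted indexes, so the look-up cost scales with the stripe population rather than the archive size.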

  19. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    International Nuclear Information System (INIS)

    Woracek, R.; Hofmann, T.; Bulat, M.; Sales, M.; Habicht, K.; Andersen, K.; Strobl, M.

    2016-01-01

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  20. Misconceptions and biases in German students' perception of multiple energy sources: implications for science education

    Science.gov (United States)

    Lee, Roh Pin

    2016-04-01

    Misconceptions and biases in energy perception could influence people's support for developments integral to the success of restructuring a nation's energy system. Science education, in equipping young adults with the cognitive skills and knowledge necessary to navigate in the confusing energy environment, could play a key role in paving the way for informed decision-making. This study examined German students' knowledge of the contribution of diverse energy sources to their nation's energy mix as well as their affective energy responses so as to identify implications for science education. Specifically, the study investigated whether and to what extent students hold mistaken beliefs about the role of multiple energy sources in their nation's energy mix, and assessed how misconceptions could act as self-generated reference points to underpin support/resistance of proposed developments. An in-depth analysis of spontaneous affective associations with five key energy sources also enabled the identification of underlying concerns driving people's energy responses and facilitated an examination of how affective perception, in acting as a heuristic, could lead to biases in energy judgment and decision-making. Finally, subgroup analysis differentiated by education and gender supported insights into a 'two culture' effect on energy perception and the challenge it poses to science education.

  1. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    Energy Technology Data Exchange (ETDEWEB)

    Woracek, R., E-mail: robin.woracek@esss.se [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Hofmann, T.; Bulat, M. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Sales, M. [Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark); Habicht, K. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Andersen, K. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Strobl, M. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark)

    2016-12-11

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  2. The Impact of Multiple Malignancies on Patients with Bladder Carcinoma: A Population-Based Study Using the SEER Database

    Directory of Open Access Journals (Sweden)

    Joshua R. Ehrlich

    2009-01-01

    Results. Analyses demonstrated diminished survival among AB and ABS cohorts. However, when cohorts were substratified by stage, patients in the high-stage BS cohort appeared to have a survival advantage over high-stage BO patients. Conclusions. Bladder cancer patients with multiple malignancies have diminished survival. The survival advantage of high-stage BS patients is likely a statistical phenomenon. Such findings are important to shape future research and to improve our understanding of patients with multiple malignancies.

  3. A multiple objective magnet sorting algorithm for the Advanced Light Source insertion devices

    International Nuclear Information System (INIS)

    Humphries, D.; Goetz, F.; Kownacki, P.; Marks, S.; Schlueter, R.

    1995-01-01

    Insertion devices for the Advanced Light Source (ALS) incorporate large numbers of permanent magnets which have a variety of magnetization orientation errors. These orientation errors can produce field errors which affect both the spectral brightness of the insertion devices and the storage ring electron beam dynamics. A perturbation study was carried out to quantify the effects of orientation errors acting in a hybrid magnetic structure. The results of this study were used to develop a multiple stage sorting algorithm which minimizes undesirable integrated field errors and essentially eliminates pole excitation errors. When applied to a measured magnet population for an existing insertion device, an order of magnitude reduction in integrated field errors was achieved while maintaining near-zero pole excitation errors.

  4. Use of multiple water surface flow constructed wetlands for non-point source water pollution control.

    Science.gov (United States)

    Li, Dan; Zheng, Binghui; Liu, Yan; Chu, Zhaosheng; He, Yan; Huang, Minsheng

    2018-05-02

    Multiple free water surface flow constructed wetlands (multi-FWS CWs) are a variant of conventional water treatment plants for the interception of pollutants. This review encapsulates their characteristics and applications in the field of ecological non-point source water pollution control technology. The roles of in-series design and operation parameters (hydraulic residence time, hydraulic load rate, water depth and aspect ratio, composition of influent, and plant species) in performance intensification are also analyzed; these are crucial to achieving sustainable and effective contaminant removal, especially the retention of nutrients. Mechanistic studies of the design and operation parameters governing nitrogen and phosphorus removal are also highlighted. Perspectives for further research on optimizing design/operation parameters and on advanced ecological restoration technologies are outlined to help elucidate the functions of multi-FWS CWs.

  5. The test beamline of the European Spallation Source - Instrumentation development and wavelength frame multiplication

    DEFF Research Database (Denmark)

    Woracek, R.; Hofmann, T.; Bulat, M.

    2016-01-01

    which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor...... wavelength band between 1.6 A and 10 A by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components....... This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects....

  6. Community Response to Multiple Sound Sources: Integrating Acoustic and Contextual Approaches in the Analysis

    Directory of Open Access Journals (Sweden)

    Peter Lercher

    2017-06-01

    Full Text Available Sufficient data attest to the relevant prevalence of sound exposure from mixed traffic sources in many nations. Furthermore, consideration of the potential effects of combined sound exposure is required in legal procedures such as environmental health impact assessments. Nevertheless, current practice still uses single exposure-response functions. It is silently assumed that those standard exposure-response curves also accommodate mixed exposures—although some evidence from experimental and field studies casts doubt on this practice. The ALPNAP study population (N = 1641) contains sufficient subgroups with combinations of rail-highway, highway-main road and rail-highway-main road sound exposure. In this paper we apply several approaches suggested in the literature to investigate exposure-response curves and their major determinants in the case of exposure to multiple traffic sources. High/moderate annoyance and full-scale mean annoyance served as outcomes. The results show several limitations of the current approaches. Even given the inherent methodological limitations (energy-equivalent summation of sound, rating of overall annoyance), considering the main contextual factors jointly occurring with the sources (such as vibration, air pollution or coping activities) and judgments of the wider area soundscape increases the variance explained from up to 8% (bivariate) to up to 15% (base adjustments) and up to 55% (full contextual model). The added predictors vary significantly depending on the source combination (e.g., significant vibration effects with main road/railway, but not with highway). Although no significant interactions were found, the observed additive effects are of public health importance. Especially in the case of a three-source exposure situation, the overall annoyance is already high at lower levels and the contribution of the acoustic indicators is small compared with the non-acoustic and contextual predictors. Noise mapping needs to go down to

  7. Multiple Sources of Pressure for Change: The Barroso Commission and Energy Policy for an Enlarged EU

    Directory of Open Access Journals (Sweden)

    Jan Frederik Braun

    2009-11-01

    Full Text Available This article presents a preliminary analysis of how and why the role, work and status of the European Commission are changing in an enlarged European Union. It does so by focusing on multiple sources of pressure for change. These include: enlargement, new modes of governance, administrative reforms and changed leadership under Barroso. Combined, though not interlinked, these multiple sources of pressure are evidence of the increasing difficulty for the Commission to design and propose Community-wide answers to complex challenges in a more diverse Union. For this reason, the Commission under Barroso relies less on its traditional monopoly power to propose formal legislation and more on non-traditional modes of policy-making. Energy policy, especially its external dimension, constitutes a policy field that has been affected by enlargement, i.e. characterised by an increasing heterogeneity of needs and preferences among the member states. Not only does it resist Community-wide answers, it also allows the Commission, as an agent, to make use of bureaucratic drift, i.e. to exploit its strategic position in the EU’s governance system and use a range of formal and informal resources of expertise. To deliver sustainable European added value to this complex policy area, however, the Commission must focus more on pragmatic policy results by making smart use of the EU’s increasing asymmetry, diversity and subsidiarity in a bottom-up approach. A non-legislative approach can serve as a modus vivendi to keep the momentum going in the Union’s difficult struggle to establish a workable energy regime.

  8. The Use of Source-Related Strategies in Evaluating Multiple Psychology Texts: A Student-Scientist Comparison

    Science.gov (United States)

    von der Mühlen, Sarah; Richter, Tobias; Schmid, Sebastian; Schmidt, Elisabeth Marie; Berthold, Kirsten

    2016-01-01

    Multiple text comprehension can greatly benefit from paying attention to sources and from using this information for evaluating text information. Previous research based on texts from the domain of history suggests that source-related strategies are acquired as part of the discipline expertise as opposed to the spontaneous use of these strategies…

  9. Optimal planning of multiple distributed generation sources in distribution networks: A new approach

    Energy Technology Data Exchange (ETDEWEB)

    AlRashidi, M.R., E-mail: malrash2002@yahoo.com [Department of Electrical Engineering, College of Technological Studies, Public Authority for Applied Education and Training (PAAET) (Kuwait); AlHajri, M.F., E-mail: mfalhajri@yahoo.com [Department of Electrical Engineering, College of Technological Studies, Public Authority for Applied Education and Training (PAAET) (Kuwait)

    2011-10-15

    Highlights: → A new hybrid PSO for optimal DGs placement and sizing. → Statistical analysis to fine tune PSO parameters. → Novel constraint handling mechanism to handle different constraints types. - Abstract: An improved particle swarm optimization algorithm (PSO) is presented for optimal planning of multiple distributed generation sources (DG). This problem can be divided into two sub-problems: the DG optimal size (continuous optimization) and location (discrete optimization) to minimize real power losses. The proposed approach addresses the two sub-problems simultaneously using an enhanced PSO algorithm capable of handling multiple DG planning in a single run. A design of experiment is used to fine tune the proposed approach via proper analysis of PSO parameters interaction. The proposed algorithm treats the problem constraints differently by adopting a radial power flow algorithm to satisfy the equality constraints, i.e. power flows in distribution networks, while the inequality constraints are handled by making use of some of the PSO features. The proposed algorithm was tested on the practical 69-bus power distribution system. Different test cases were considered to validate the proposed approach consistency in detecting optimal or near optimal solution. Results are compared with those of Sequential Quadratic Programming.

  10. Optimal planning of multiple distributed generation sources in distribution networks: A new approach

    International Nuclear Information System (INIS)

    AlRashidi, M.R.; AlHajri, M.F.

    2011-01-01

    Highlights: → A new hybrid PSO for optimal DGs placement and sizing. → Statistical analysis to fine tune PSO parameters. → Novel constraint handling mechanism to handle different constraints types. - Abstract: An improved particle swarm optimization algorithm (PSO) is presented for optimal planning of multiple distributed generation sources (DG). This problem can be divided into two sub-problems: the DG optimal size (continuous optimization) and location (discrete optimization) to minimize real power losses. The proposed approach addresses the two sub-problems simultaneously using an enhanced PSO algorithm capable of handling multiple DG planning in a single run. A design of experiment is used to fine tune the proposed approach via proper analysis of PSO parameters interaction. The proposed algorithm treats the problem constraints differently by adopting a radial power flow algorithm to satisfy the equality constraints, i.e. power flows in distribution networks, while the inequality constraints are handled by making use of some of the PSO features. The proposed algorithm was tested on the practical 69-bus power distribution system. Different test cases were considered to validate the proposed approach consistency in detecting optimal or near optimal solution. Results are compared with those of Sequential Quadratic Programming.
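
    A bare-bones sketch of the mixed placement/sizing idea (not the authors' enhanced PSO variant): each particle carries a (bus, size) pair, the bus coordinate is rounded to a discrete index at evaluation time, and a toy quadratic stands in for the radial power flow loss calculation. All bounds and the loss surface are invented:

```python
import random

random.seed(1)
N_BUS, SIZE_MAX = 10, 5.0

def losses(bus, size):
    """Toy stand-in for power-flow losses; a real study runs a power flow."""
    return (bus - 6) ** 2 + (size - 2.5) ** 2 + 1.0

def pso(n=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(1, N_BUS), random.uniform(0, SIZE_MAX)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    def f(p): return losses(round(p[0]), p[1])  # round bus -> discrete index
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            pos[i][0] = min(max(pos[i][0], 1), N_BUS)       # keep bus in range
            pos[i][1] = min(max(pos[i][1], 0.0), SIZE_MAX)  # keep size in range
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return round(gbest[0]), gbest[1]

bus, size = pso()
print(bus, round(size, 2))  # converges near bus 6, size 2.5
```

Handling bounds by clamping, as here, is one simple version of the constraint handling the abstract alludes to; the paper's mechanism additionally pushes equality constraints into the power flow itself.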

  11. Multiple Signal Classification Algorithm Based Electric Dipole Source Localization Method in an Underwater Environment

    Directory of Open Access Journals (Sweden)

    Yidong Xu

    2017-10-01

    Full Text Available A novel localization method based on the multiple signal classification (MUSIC) algorithm is proposed for positioning an electric dipole source in a confined underwater environment using an electric dipole-receiving antenna array. In this method, the boundary element method (BEM) is introduced to analyze the boundary of the confined region by use of a matrix equation. The voltage of each dipole pair is used as spatial-temporal localization data; unlike conventional field-based localization methods, there is no need to obtain the field component in each direction, so the method can be easily implemented in practical engineering applications. Then, a global-multiple region-conjugate gradient (CG) hybrid search method is used to reduce the computational burden and improve the operation speed. Two localization simulation models and a physical experiment are conducted. Both the simulation results and the physical experiment results show accurate positioning performance, helping to verify the effectiveness of the proposed localization method in underwater environments.

  12. E-SovTox: An online database of the main publicly-available sources of toxicity data concerning REACH-relevant chemicals published in the Russian language.

    Science.gov (United States)

    Sihtmäe, Mariliis; Blinova, Irina; Aruoja, Villem; Dubourguier, Henri-Charles; Legrand, Nicolas; Kahru, Anne

    2010-08-01

    A new open-access online database, E-SovTox, is presented. E-SovTox provides toxicological data for substances relevant to the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system, from publicly-available Russian language data sources. The database contains information selected mainly from scientific journals published during the Soviet Union era. The main information source for this database - the journal, Gigiena Truda i Professional'nye Zabolevania [Industrial Hygiene and Occupational Diseases], published between 1957 and 1992 - features acute, but also chronic, toxicity data for numerous industrial chemicals, e.g. for rats, mice, guinea-pigs and rabbits. The main goal of the abovementioned toxicity studies was to derive the maximum allowable concentration limits for industrial chemicals in the occupational health settings of the former Soviet Union. Thus, articles featured in the database include mostly data on LD50 values, skin and eye irritation, skin sensitisation and cumulative properties. Currently, the E-SovTox database contains toxicity data selected from more than 500 papers covering more than 600 chemicals. The user is provided with the main toxicity information, as well as abstracts of these papers in Russian and in English (given as provided in the original publication). The search engine allows cross-searching of the database by the name or CAS number of the compound, and the author of the paper. The E-SovTox database can be used as a decision-support tool by researchers and regulators for the hazard assessment of chemical substances. 2010 FRAME.

  13. Electrical source imaging of interictal spikes using multiple sparse volumetric priors for presurgical epileptogenic focus localization

    Directory of Open Access Journals (Sweden)

    Gregor Strobbe

    2016-01-01

    Full Text Available Electrical source imaging of interictal spikes observed in EEG recordings of patients with refractory epilepsy provides useful information for localizing the epileptogenic focus during the presurgical evaluation. However, selecting the time points or time epochs of the spikes in order to estimate the origin of the activity remains a challenge. In this study, we consider a Bayesian EEG source imaging technique for distributed sources, i.e. the multiple volumetric sparse priors (MSVP) approach. The approach allows the time courses of the source intensities corresponding to a specific time epoch of the spike to be estimated. Based on presurgical averaged interictal spikes in six patients who were successfully treated with surgery, we estimated the time courses of the source intensities for three different time epochs: (i) an epoch starting 50 ms before the spike peak and ending at 50% of the spike peak during the rising phase of the spike, (ii) an epoch starting 50 ms before the spike peak and ending at the spike peak and (iii) an epoch containing the full spike time period, starting 50 ms before the spike peak and ending 230 ms after the spike peak. To identify the primary source of the spike activity, the source with the maximum energy from 50 ms before the spike peak until 50% of the spike peak was subsequently selected for each of the time windows. For comparison, the activity at the spike peaks and at 50% of the peaks was localized using the LORETA inversion technique and an ECD approach. Both patient-specific spherical forward models and patient-specific 5-layered finite difference models were considered to evaluate the influence of the forward model. Based on the resected zones in each of the patients, extracted from post-operative MR images, we compared the distances to the resection border of the estimated activity. Using the spherical models, the distances to the resection border for the MSVP approach and each of the different time

  14. Prediction With Dimension Reduction of Multiple Molecular Data Sources for Patient Survival

    Directory of Open Access Journals (Sweden)

    Adam Kaplan

    2017-07-01

    Full Text Available Predictive modeling from high-dimensional genomic data is often preceded by a dimension reduction step, such as principal component analysis (PCA). However, the application of PCA is not straightforward for multisource data, wherein multiple sources of ‘omics data measure different but related biological components. In this article, we use recent advances in the dimension reduction of multisource data for predictive modeling. In particular, we apply exploratory results from Joint and Individual Variation Explained (JIVE), an extension of PCA for multisource data, to the prediction of differing response types. We conduct simulations to illustrate the practical advantages and interpretability of our approach. As an application example, we consider predicting survival for patients with glioblastoma multiforme from 3 data sources measuring messenger RNA expression, microRNA expression, and DNA methylation. We also introduce a method to estimate JIVE scores for new samples that were not used in the initial dimension reduction and study its theoretical properties; this method is implemented in the R package R.JIVE on CRAN, in the function jive.predict.
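As a simplified single-source analogue of the score-projection idea behind jive.predict (a sketch only — JIVE's joint-and-individual decomposition across multiple sources is not reproduced here), one can learn PCA loadings on training data and then score new samples with those fixed loadings:

```python
import numpy as np

def pca_scores(X, r):
    """Fit: centre X and project its rows onto the top-r principal components.
    Returns (scores, loadings Vt_r, column means)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # thin SVD of centred data
    return Xc @ Vt[:r].T, Vt[:r], mu

def project_new(Xnew, Vt_r, mu):
    """Score new samples using loadings learned from the training data
    (the role jive.predict plays for JIVE scores)."""
    return (Xnew - mu) @ Vt_r.T
```

The fitted scores can then feed a downstream survival or regression model; new patients are scored with `project_new` without refitting the decomposition.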

  15. SIGMA/B, Doses in Space Vehicle for Multiple Trajectories, Various Radiation Source

    International Nuclear Information System (INIS)

    Jordan, T.M.

    2003-01-01

    1 - Description of problem or function: SIGMA/B calculates radiation dose at arbitrary points inside a space vehicle, taking into account vehicle geometry, heterogeneous placement of equipment and stores, vehicle materials, time-weighted astronaut positions and many radiation sources from mission trajectories, e.g. geomagnetically trapped protons and electrons, solar flare particles, galactic cosmic rays and their secondary radiations. The vehicle geometry, equipment and supplies, and man models are described by quadric surfaces. The irradiating flux field may be anisotropic. The code can be used to perform simultaneous dose calculations for multiple vehicle trajectories, each involving several radiation sources. Results are presented either as dose as a function of shield thickness, or the dose received through designated outer sections of the vehicle. 2 - Method of solution: Automatic sectoring of the vehicle is performed by a Simpson's rule integration over angle; the dose is computed by a numerical angular integration of the dose attenuation kernels about the dose points. The kernels are curve-fit functions constructed from input data tables. 3 - Restrictions on the complexity of the problem: The code uses variable dimensioning techniques to store data. The only restriction on problem size is the available core storage.
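The angular integration step in the method of solution can be illustrated with a generic composite Simpson's rule routine (a Python sketch; the example integrand is a hypothetical exponential attenuation kernel, not SIGMA/B's curve-fit kernels):

```python
import math

def simpson(f, a, b, n=100):
    """Composite Simpson's rule on [a, b]; n is forced to be even."""
    if n % 2:
        n += 1
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        s += (4 if k % 2 else 2) * f(a + k * h)  # alternating 4/2 weights
    return s * h / 3.0

# Hypothetical angular dose integral: an attenuation kernel exp(-mu*x/cos(theta))
# weighted by the solid-angle factor sin(theta), integrated over polar angle.
dose = simpson(lambda t: math.exp(-2.0 / max(math.cos(t), 1e-9)) * math.sin(t),
               0.0, math.pi / 2)
```

Summing such angular integrals over sectors of the shield model gives the dose at a point, in the spirit of the kernel-integration scheme described above.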

  16. Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources

    Science.gov (United States)

    Jia, Z.; Zhan, Z.

    2017-12-01

    Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties for existing finite fault inversion algorithms, because they rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess the reliability and uncertainty of the obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate the rupture process of large earthquakes as a series of sub-events of varying location, timing and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain the first-order rupture complexity of large earthquakes robustly. Additionally, the relatively small number of parameters in the inverse problem yields improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and applications of the MHS method to real earthquakes show that our method can capture the major features of the rupture process of large earthquakes, and provide information for more detailed rupture history analysis.

  17. AutoLabDB: a substantial open source database schema to support a high-throughput automated laboratory.

    Science.gov (United States)

    Sparkes, Andrew; Clare, Amanda

    2012-05-15

    Modern automated laboratories need substantial data management solutions to both store and make accessible the details of the experiments they perform. To be useful, a modern Laboratory Information Management System (LIMS) should be flexible and easily extensible to support evolving laboratory requirements, and should be based on the solid foundations of a robust, well-designed database. We have developed such a database schema to support an automated laboratory that performs experiments in systems biology and high-throughput screening. We describe the design of the database schema (AutoLabDB), detailing the main features and describing why we believe it will be relevant to LIMS manufacturers or custom builders. This database has been developed to support two large automated Robot Scientist systems over the last 5 years, where it has been used as the basis of an LIMS that helps to manage both the laboratory and all the experiment data produced.

  18. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave

    Directory of Open Access Journals (Sweden)

    Ikaro Silva

    2014-09-01

    Full Text Available The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox allows direct loading into the MATLAB/Octave workspace of over 4 TB of biomedical signals, including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  19. Management of Multiple Nitrogen Sources during Wine Fermentation by Saccharomyces cerevisiae.

    Science.gov (United States)

    Crépin, Lucie; Truong, Nhat My; Bloem, Audrey; Sanchez, Isabelle; Dequin, Sylvie; Camarasa, Carole

    2017-03-01

    During fermentative growth in natural and industrial environments, Saccharomyces cerevisiae must redistribute the available nitrogen from multiple exogenous sources to amino acids in order to suitably fulfill anabolic requirements. To exhaustively explore the management of this complex resource, we developed an advanced strategy based on the reconciliation of data from a set of stable isotope tracer experiments with labeled nitrogen sources. Thus, quantifying the partitioning of the N compounds through the metabolism network during fermentation, we demonstrated that, contrary to the generally accepted view, only a limited fraction of most of the consumed amino acids is directly incorporated into proteins. Moreover, substantial catabolism of these molecules allows for efficient redistribution of nitrogen, supporting the operative de novo synthesis of proteinogenic amino acids. In contrast, catabolism of consumed amino acids plays a minor role in the formation of volatile compounds. Another important feature is that the α-keto acid precursors required for the de novo syntheses originate mainly from the catabolism of sugars, with a limited contribution from the anabolism of consumed amino acids. This work provides a comprehensive view of the intracellular fate of consumed nitrogen sources and the metabolic origin of proteinogenic amino acids, highlighting a strategy of distribution of metabolic fluxes implemented by yeast as a means of adapting to environments with changing and scarce nitrogen resources. IMPORTANCE A current challenge for the wine industry, in view of the extensive competition in the worldwide market, is to meet consumer expectations regarding the sensory profile of the product while ensuring an efficient fermentation process. Understanding the intracellular fate of the nitrogen sources available in grape juice is essential to the achievement of these objectives, since nitrogen utilization affects both the fermentative activity of yeasts and the

  20. Management of Multiple Nitrogen Sources during Wine Fermentation by Saccharomyces cerevisiae

    Science.gov (United States)

    Crépin, Lucie; Truong, Nhat My; Bloem, Audrey; Sanchez, Isabelle; Dequin, Sylvie

    2017-01-01

    ABSTRACT During fermentative growth in natural and industrial environments, Saccharomyces cerevisiae must redistribute the available nitrogen from multiple exogenous sources to amino acids in order to suitably fulfill anabolic requirements. To exhaustively explore the management of this complex resource, we developed an advanced strategy based on the reconciliation of data from a set of stable isotope tracer experiments with labeled nitrogen sources. Thus, quantifying the partitioning of the N compounds through the metabolism network during fermentation, we demonstrated that, contrary to the generally accepted view, only a limited fraction of most of the consumed amino acids is directly incorporated into proteins. Moreover, substantial catabolism of these molecules allows for efficient redistribution of nitrogen, supporting the operative de novo synthesis of proteinogenic amino acids. In contrast, catabolism of consumed amino acids plays a minor role in the formation of volatile compounds. Another important feature is that the α-keto acid precursors required for the de novo syntheses originate mainly from the catabolism of sugars, with a limited contribution from the anabolism of consumed amino acids. This work provides a comprehensive view of the intracellular fate of consumed nitrogen sources and the metabolic origin of proteinogenic amino acids, highlighting a strategy of distribution of metabolic fluxes implemented by yeast as a means of adapting to environments with changing and scarce nitrogen resources. IMPORTANCE A current challenge for the wine industry, in view of the extensive competition in the worldwide market, is to meet consumer expectations regarding the sensory profile of the product while ensuring an efficient fermentation process. Understanding the intracellular fate of the nitrogen sources available in grape juice is essential to the achievement of these objectives, since nitrogen utilization affects both the fermentative activity of yeasts and

  1. Development of a Monte Carlo multiple source model for inclusion in a dose calculation auditing tool.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurements of central axis depth dose data for a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm² to 30 × 30 cm². The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed that 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for the 6 MV and 10 MV models, respectively. Dose profile comparisons showed an average agreement using a ±2%/2 mm criterion of 98.0% and 99.0% for the 6 MV and 10 MV models, respectively. Phantom plan comparisons were evaluated using a ±3%/2 mm gamma criterion, and average passing rates between Monte Carlo and measurements were 87.4% and 89.9% for the 6 MV and 10 MV models, respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.

  2. A Numerical Study on the Excitation of Guided Waves in Rectangular Plates Using Multiple Point Sources

    Directory of Open Access Journals (Sweden)

    Wenbo Duan

    2017-12-01

    Full Text Available Ultrasonic guided waves are widely used to inspect and monitor the structural integrity of plates and plate-like structures, such as ship hulls and large storage-tank floors. Recently, ultrasonic guided waves have also been used to remove ice and fouling from ship hulls, wind-turbine blades and aeroplane wings. In these applications, the strength of the sound source must be high for scanning a large area, or to break the bond between ice, fouling and the plate substrate. More than one transducer may be used to achieve maximum sound power output. However, multiple sources can interact with each other and form a sound field in the structure with local constructive and destructive regions. Destructive regions are weak regions and should be avoided. When multiple transducers are used, it is important that they are arranged in a particular way so that the desired wave modes can be excited to cover the whole structure. The objective of this paper is to provide a theoretical basis for generating particular wave mode patterns in finite-width rectangular plates whose length is assumed to be infinite with respect to the width and thickness. The wave modes have displacements in both the width and thickness directions, and are thus different from the classical Lamb-type wave modes. A two-dimensional semi-analytical finite element (SAFE) method was used to study dispersion characteristics and mode shapes in the plate up to ultrasonic frequencies. The modal analysis provided information on the generation of modes suitable for a particular application. The number of point sources and the direction of loading for the excitation of a few representative modes were investigated. Based on the SAFE analysis, a standard finite element modelling package, Abaqus, was used to excite the designed modes in a three-dimensional plate. The generated wave patterns in Abaqus were then compared with mode shapes predicted in the SAFE model. Good agreement was observed between the

  3. Combining multiple sources of data to inform conservation of Lesser Prairie-Chicken populations

    Science.gov (United States)

    Ross, Beth; Haukos, David A.; Hagen, Christian A.; Pitman, James

    2018-01-01

    Conservation of small populations is often based on limited data from spatially and temporally restricted studies, so management actions may rest on an incomplete assessment of the population drivers. If fluctuations in abundance are related to changes in weather, proper management is especially important, because extreme weather events could disproportionately affect population abundance. Conservation assessments, especially for vulnerable populations, are aided by knowledge of how extreme events influence population status and trends. Although important for conservation efforts, data may be limited for small or vulnerable populations. Integrated population models maximize information from various sources of data to yield population estimates that fully incorporate uncertainty from multiple data sources while allowing for the explicit incorporation of environmental covariates of interest. Our goal was to assess the relative influence of population drivers for the Lesser Prairie-Chicken (Tympanuchus pallidicinctus) in the core of its range, western and southern Kansas, USA. We used data from roadside lek count surveys, nest monitoring surveys, and survival data from telemetry monitoring, combined with climate (Palmer drought severity index) data, in an integrated population model. Our results indicate that variability in population growth rate was most influenced by variability in juvenile survival. The Palmer drought severity index had no measurable direct effects on adult survival or the mean number of offspring per female; however, there were declines in population growth rate following severe drought. Because declines in population growth rate occurred at a broad spatial scale, declines in response to drought were likely due to decreases in chick and juvenile survival rather than emigration outside of the study area. Overall, our model highlights the importance of accounting for environmental and demographic sources of variability, and provides a thorough

  4. Estimating the prevalence of illicit opioid use in New York City using multiple data sources

    Directory of Open Access Journals (Sweden)

    McNeely Jennifer

    2012-06-01

    Full Text Available Abstract Background Despite concerns about its health and social consequences, little is known about the prevalence of illicit opioid use in New York City. Individuals who misuse heroin and prescription opioids are known to bear a disproportionate burden of morbidity and mortality. Service providers and public health authorities are challenged to provide appropriate interventions in the absence of basic knowledge about the size and characteristics of this population. While illicit drug users are underrepresented in population-based surveys, they may be identified in multiple administrative data sources. Methods We analyzed large datasets tracking hospital inpatient and emergency room admissions as well as drug treatment and detoxification services utilization. These were applied in combination with findings from a large general population survey and administrative records tracking prescriptions, drug overdose deaths, and correctional health services, to estimate the prevalence of heroin and non-medical prescription opioid use among New York City residents in 2006. These data were further applied to a descriptive analysis of opioid users entering drug treatment and hospital-based medical care. Results These data sources identified 126,681 cases of opioid use among New York City residents in 2006. After applying adjustment scenarios to account for potential overlap between data sources, we estimated over 92,000 individual opioid users. By contrast, just 21,600 opioid users initiated drug treatment in 2006. Opioid users represented 4% of all individuals hospitalized and accounted for over 44,000 hospitalizations during the calendar year. Conclusions Our findings suggest that innovative approaches are needed to provide adequate services to this sizeable population of opioid users. Given the observed high rates of hospital services utilization, greater integration of drug services into medical settings could be one component of an effective approach to

  5. ALFRED: An Allele Frequency Database for Microevolutionary Studies

    Directory of Open Access Journals (Sweden)

    Kenneth K Kidd

    2005-01-01

    Full Text Available Many kinds of microevolutionary studies require data on multiple polymorphisms in multiple populations. Increasingly, and especially for human populations, multiple research groups collect relevant data, and those data are dispersed widely in the literature. ALFRED has been designed to hold data from many sources and make them available over the web. Data are assembled from multiple sources, curated, and entered into the database. Multiple links to other resources are also established by the curators. A variety of search options are available, and additional geographically based interfaces are being developed. The database can serve the human anthropological genetics community by identifying which loci are already typed in many populations, thereby helping to focus efforts on a common set of markers. The database can also serve as a model for databases handling similar DNA polymorphism data for other species.

  6. DOA Estimation of Multiple LFM Sources Using a STFT-based and FBSS-based MUSIC Algorithm

    Directory of Open Access Journals (Sweden)

    K. B. Cui

    2017-12-01

    Full Text Available Direction of arrival (DOA) estimation is an important problem in array signal processing. An effective multiple signal classification (MUSIC) method based on the short-time Fourier transform (STFT) and forward/backward spatial smoothing (FBSS) techniques is addressed for the DOA estimation problem of multiple time-frequency (t-f) joint LFM sources. Previous work in the area, e.g. the STFT-MUSIC algorithm, cannot resolve completely or largely t-f joint sources because it can only select single-source t-f points. The proposed method constructs the spatial t-f distributions (STFDs) by selecting the multiple-source t-f points and uses the FBSS techniques to solve the problem of rank loss. In this way, the STFT-FBSS-MUSIC algorithm can resolve largely or completely t-f joint LFM sources. In addition, the proposed algorithm has fairly low computational complexity when resolving multiple LFM sources because it reduces the number of eigendecompositions and spectrum searches. The performance of the proposed method is compared with that of existing t-f based MUSIC algorithms through computer simulations, and the results show its good performance.
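For readers unfamiliar with the underlying MUSIC step, the sketch below implements plain narrowband MUSIC for a uniform linear array in Python with NumPy; it omits the STFT-based selection of multiple-source t-f points and the FBSS rank-restoration stage that the paper adds:

```python
import numpy as np

def music_spectrum(X, n_sources, angles_deg, d=0.5):
    """Narrowband MUSIC pseudospectrum for a uniform linear array.
    X: (n_sensors, n_snapshots) complex snapshot matrix; d: spacing in wavelengths."""
    n_sensors = X.shape[0]
    R = X @ X.conj().T / X.shape[1]            # sample covariance matrix
    w, V = np.linalg.eigh(R)                   # eigenvalues in ascending order
    En = V[:, : n_sensors - n_sources]         # noise-subspace eigenvectors
    k = np.arange(n_sensors)
    p = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * d * k * np.sin(th))     # steering vector
        p.append(1.0 / (np.linalg.norm(En.conj().T @ a) ** 2))
    return np.array(p)
```

Peaks of the returned pseudospectrum over the angle grid give the DOA estimates; the smoothed, t-f-selected covariance of the paper would simply replace `R`.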

  7. Directory of IAEA databases

    International Nuclear Information System (INIS)

    1991-11-01

    The first edition of the Directory of IAEA Databases is intended to describe the computerized information sources available to IAEA staff members. It contains a listing of all databases produced at the IAEA, together with information on their availability.

  8. Food ordering for children in restaurants: multiple sources of influence on decision making

    Science.gov (United States)

    Castro, Iana A; Williams, Christine B; Madanat, Hala; Pickrel, Julie L; Jun, Hee-Jin; Zive, Michelle; Gahagan, Sheila; Ayala, Guadalupe X

    2017-01-01

    Objective Restaurants are playing an increasingly important role in children’s dietary intake. Interventions to promote healthy ordering in restaurants have primarily targeted adults. Much remains unknown about how to influence ordering for and by children. Using an ecological lens, the present study sought to identify sources of influence on ordering behaviour for and by children in restaurants. Design A mixed-methods study was conducted using unobtrusive observations of dining parties with children and post-order interviews. Observational data included: child’s gender, person ordering for the child and server interactions with the dining party. Interview data included: child’s age, restaurant visit frequency, timing of child’s decision making, and factors influencing decision making. Setting Ten independent, table-service restaurants in San Diego, CA, USA participated. Subjects Complete observational and interview data were obtained from 102 dining parties with 150 children (aged 3–14 years). Results Taste preferences, family influences and menus impacted ordering. However, most children knew what they intended to order before arriving at the restaurant, especially if they dined there at least monthly. Furthermore, about one-third of children shared their meals with others and all shared meals were ordered from adult (v. children’s) menus. Parents placed most orders, although parental involvement in ordering was less frequent with older children. Servers interacted frequently with children but generally did not recommend menu items or prompt use of the children’s menu. Conclusions Interventions to promote healthy ordering should consider the multiple sources of influence that are operating when ordering for and by children in restaurants. PMID:27334904

  9. Multiple data sources improve DNA-based mark-recapture population estimates of grizzly bears.

    Science.gov (United States)

    Boulanger, John; Kendall, Katherine C; Stetz, Jeffrey B; Roon, David A; Waits, Lisette P; Paetkau, David

    2008-04-01

    A fundamental challenge to estimating population size with mark-recapture methods is heterogeneous capture probabilities and the subsequent bias of population estimates. Confronting this problem usually requires substantial sampling effort that can be difficult to achieve for some species, such as carnivores. We developed a methodology that uses two data sources to deal with heterogeneity and applied this to DNA mark-recapture data from grizzly bears (Ursus arctos). We improved population estimates by incorporating additional DNA "captures" of grizzly bears obtained by collecting hair from unbaited bear rub trees concurrently with baited, grid-based, hair snag sampling. We consider a Lincoln-Petersen estimator with hair snag captures as the initial session and rub tree captures as the recapture session, and develop an estimator in program MARK that treats hair snag and rub tree samples as successive sessions. Using empirical data from a large-scale project in the greater Glacier National Park area, Montana, USA, together with simulation modeling, we evaluate these methods and compare the results to hair-snag-only estimates. Empirical results indicate that, compared with hair-snag-only data, the joint hair-snag-rub-tree methods produce similar but more precise estimates if capture and recapture rates are reasonably high for both methods. Simulation results suggest that estimators are potentially affected by correlation of capture probabilities between sample types in the presence of heterogeneity. Overall, closed population Huggins-Pledger estimators showed the highest precision and were most robust to sparse data, heterogeneity, and capture probability correlation among sampling types. Results also indicate that these estimators can be used when a segment of the population has zero capture probability for one of the methods. We propose that this general methodology may be useful for other species in which mark-recapture data are available from multiple sources.
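The two-session estimator discussed above can be sketched as follows; this is the textbook Chapman bias-corrected form of the Lincoln-Petersen estimator, not the Huggins-Pledger models actually fitted in program MARK:

```python
def lincoln_petersen_chapman(n1, n2, m2):
    """Chapman's bias-corrected two-session abundance estimate.

    n1 -- animals marked in session 1 (e.g. hair-snag captures)
    n2 -- animals captured in session 2 (e.g. rub-tree captures)
    m2 -- marked animals recaptured in session 2
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
```

With, say, 50 bears identified at hair snags, 50 at rub trees, and 24 seen by both methods, the estimate is about 103 bears; heterogeneity in capture probability would bias such a naive estimate, which motivates the joint modeling described in the abstract.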

  10. Systematic identification of yeast cell cycle transcription factors using multiple data sources

    Directory of Open Access Journals (Sweden)

    Li Wen-Hsiung

    2008-12-01

    Full Text Available Abstract Background The eukaryotic cell cycle is a complex process and is precisely regulated at many levels. Many genes specific to the cell cycle are regulated transcriptionally and are expressed just before they are needed. To understand the cell cycle process, it is important to identify the cell cycle transcription factors (TFs) that regulate the expression of cell cycle-regulated genes. Results We developed a method to identify cell cycle TFs in yeast by integrating current ChIP-chip, mutant, transcription factor binding site (TFBS), and cell cycle gene expression data. We identified 17 cell cycle TFs, 12 of which are known cell cycle TFs, while the remaining five (Ash1, Rlm1, Ste12, Stp1, Tec1) are putative novel cell cycle TFs. For each cell cycle TF, we assigned the specific cell cycle phases in which the TF functions and identified the time lag for the TF to exert regulatory effects on its target genes. We also identified 178 novel cell cycle-regulated genes, 59 of which have unknown functions but may now be annotated as cell cycle-regulated genes. Most of our predictions are supported by previous experimental or computational studies. Furthermore, a high-confidence TF-gene regulatory matrix is derived as a byproduct of our method. Each TF-gene regulatory relationship in this matrix is supported by at least three data sources: gene expression, TFBS, and ChIP-chip and/or mutant data. We show that our method performs better than four existing methods for identifying yeast cell cycle TFs. Finally, an application of our method to different cell cycle gene expression datasets suggests that our method is robust. Conclusion Our method is effective for identifying yeast cell cycle TFs and cell cycle-regulated genes. Many of our predictions are validated by the literature. Our study shows that integrating multiple data sources is a powerful approach to studying complex biological systems.

  11. Field validation of secondary data sources: a novel measure of representativity applied to a Canadian food outlet database.

    Science.gov (United States)

    Clary, Christelle M; Kestens, Yan

    2013-06-19

    Validation studies of secondary datasets used to characterize neighborhood food businesses generally evaluate how accurately the database represents the true situation on the ground. Depending on the research objectives, the characterization of the business environment may tolerate some inaccuracies (e.g. minor imprecisions in location or errors in business names). Furthermore, if the number of false negatives (FNs) and false positives (FPs) is balanced within a given area, one could argue that the database still provides a "fair" representation of existing resources in this area. Yet, traditional validation measures do not relax matching criteria, and treat FNs and FPs independently. Through the field validation of food businesses found in a Canadian database, this paper proposes alternative criteria for validity. Field validation of the 2010 Enhanced Points of Interest (EPOI) database (DMTI Spatial®) was performed in 2011 in 12 census tracts (CTs) in Montreal, Canada. Some 410 food outlets were extracted from the database and 484 were observed in the field. First, traditional measures of sensitivity and positive predictive value (PPV) accounting for every single mismatch between the field and the database were computed. Second, relaxed measures of sensitivity and PPV that tolerate mismatches in business names or slight imprecisions in location were assessed. A novel measure of representativity that further allows for compensation between FNs and FPs within the same business category and area was proposed. Representativity was computed at CT level as (TPs + |FPs - FNs|) / (TPs + FNs), with TPs meaning true positives, and |FPs - FNs| being the absolute value of the difference between the number of FNs and the number of FPs within each outlet category. The EPOI database had a "moderate" capacity to detect an outlet present in the field (sensitivity: 54.5%) or to list only the outlets that actually existed in the field (PPV: 64.4%). Relaxed measures of sensitivity and PPV
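    The measures defined in this abstract can be sketched numerically. The true-positive count below (264) is not reported in the abstract; it is an illustrative value inferred so that the stated totals (410 database outlets, 484 field outlets) reproduce both reported rates:

```python
def sensitivity(tp, fn):
    # Proportion of field outlets that the database detects.
    return tp / (tp + fn)

def ppv(tp, fp):
    # Proportion of database outlets that actually exist in the field.
    return tp / (tp + fp)

def representativity(tp, fp, fn):
    # CT-level measure from the abstract: (TPs + |FPs - FNs|) / (TPs + FNs).
    return (tp + abs(fp - fn)) / (tp + fn)

tp = 264            # inferred, illustrative true-positive count
fp = 410 - tp       # database outlets not found in the field
fn = 484 - tp       # field outlets missing from the database
print(round(sensitivity(tp, fn) * 100, 1))  # 54.5
print(round(ppv(tp, fp) * 100, 1))          # 64.4
```

Note how the representativity term |FPs - FNs| lets errors of opposite kind within a category offset each other, which the traditional measures never allow.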

  12. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    Science.gov (United States)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.
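    The core reconstruction step, placing time-enabled scatter data at an arbitrary query time, can be illustrated with plain linear interpolation along a sampled trajectory. This is a generic sketch under simplifying assumptions, not the authors' TTA implementation (which also performs along-trajectory adjustment in the flow field):

```python
from bisect import bisect_left

def position_at(times, points, t):
    """Interpolate a tracked point's (x, y) position at query time t.
    times must be sorted; points are (x, y) samples along the trajectory."""
    i = bisect_left(times, t)
    if i == 0:
        return points[0]          # before the first sample: clamp
    if i == len(times):
        return points[-1]         # after the last sample: clamp
    t0, t1 = times[i - 1], times[i]
    w = (t - t0) / (t1 - t0)      # fractional position between samples
    (x0, y0), (x1, y1) = points[i - 1], points[i]
    return (x0 + w * (x1 - x0), y0 + w * (y1 - y0))

# A plume parcel sampled at t=0 and t=10, reconstructed at t=5:
print(position_at([0, 10], [(0.0, 0.0), (4.0, 2.0)], 5))  # (2.0, 1.0)
```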

  13. Development of an asymmetric multiple-position neutron source (AMPNS) method to monitor the criticality of a degraded reactor core

    International Nuclear Information System (INIS)

    Kim, S.S.; Levine, S.H.

    1985-01-01

    An analytical/experimental method has been developed to monitor the subcritical reactivity and unfold the k∞ distribution of a degraded reactor core. The method uses several fixed neutron detectors and a Cf-252 neutron source placed sequentially in multiple positions in the core. Therefore, it is called the Asymmetric Multiple Position Neutron Source (AMPNS) method. The AMPNS method employs nucleonic codes to analyze the neutron multiplication of a Cf-252 neutron source. An optimization program, GPM, is utilized to unfold the k∞ distribution of the degraded core, in which the desired performance measure minimizes the error between the calculated and the measured count rates of the degraded reactor core. The analytical/experimental approach is validated by performing experiments using the Penn State Breazeale TRIGA Reactor (PSBR). A significant result of this study is that it provides a method to monitor the criticality of a damaged core during the recovery period.

  14. High brightness fiber laser pump sources based on single emitters and multiple single emitters

    Science.gov (United States)

    Scheller, Torsten; Wagner, Lars; Wolf, Jürgen; Bonati, Guido; Dörfel, Falk; Gabler, Thomas

    2008-02-01

    Driven by the potential of the fiber laser market, the development of high brightness pump sources has been pushed during the last years. The main approaches to reach the targets of this market have been the direct coupling of single emitters (SE) on the one hand and the beam shaping of bars and stacks on the other hand, which often causes higher cost per watt. Meanwhile, the power of single emitters with 100μm emitter size for direct coupling has increased dramatically, which has also pushed a new generation of wide-stripe emitters or multi-emitters (ME) of up to 1000μm emitter size, as well as "minibars" with apertures of 3 to 5mm. The advantage of this emitter type compared to traditional bars is its scalability to power levels of 40W to 60W combined with a small aperture, which gives advantages when coupling into a fiber. We show concepts using these multiple single emitters for fiber-coupled systems of 25W up to 40W out of a 100μm fiber NA 0.22 with a reasonable optical efficiency. Taking into account further efficiency optimization and an increase in power of these devices in the near future, the EUR/W ratio pushed by the fiber laser manufacturers will further decrease. Results will be shown as well for higher power pump sources. Additionally, state-of-the-art tapered fiber bundles (TFBs) for photonic crystal fibers are used to combine 7 (19) pump sources to output powers of 100W (370W) out of a 130μm (250μm) fiber NA 0.6 with nominal 20W per port. Improving those TFBs in the near future and utilizing 40W per pump leg, an output power of even 750W out of a 250μm fiber NA 0.6 will be possible. Combined counter- and co-propagating pumping of the fiber will then lead to the first 1kW fiber laser oscillator.

  15. Synergistic effect of multiple indoor allergen sources on atopic symptoms in primary school children

    International Nuclear Information System (INIS)

    Chen, W-Y.; Tseng, H-I.; Wu, M-T.; Hung, H-C.; Wu, H-T.; Chen, H-L.; Lu, C.-C.

    2003-01-01

    Accumulating data show that the complex modern indoor environment contributes to the increasing prevalence of atopic diseases. However, the dose-response relationship between allergic symptoms and the complexity of indoor environmental allergen sources (IEAS) has not been clearly evaluated before. Therefore, we designed this study to investigate the overall effect of multiple IEAS on the appearance of asthma (AS), allergic rhinitis (AR), and eczema (EC) symptoms in 1472 primary school children. Among various IEAS analyzed, only stuffed toys, cockroaches, and mold patches fit the model of 'more IEAS, higher odds ratio (OR) of association'. The association of IEAS and AR increased stepwise as more IEAS appeared in the environment (1.71, 2.47, to 2.86). In AS and EC, the association was significant only when all three IEAS were present (1.42, 1.98, to 4.11 in AS; 1.40, 1.76, to 2.95 in EC). These results showed that different IEAS had a synergistic effect on their association with atopic symptoms and also suggest a dose-response relationship between the number of IEAS present and the risk of atopic disease.

  16. Integrating Multiple Data Sources for Combinatorial Marker Discovery: A Study in Tumorigenesis.

    Science.gov (United States)

    Bandyopadhyay, Sanghamitra; Mallik, Saurav

    2018-01-01

    Identification of combinatorial markers from multiple data sources is a challenging task in bioinformatics. Here, we propose a novel computational framework for identifying significant combinatorial markers ( s) using both gene expression and methylation data. The gene expression and methylation data are integrated into a single continuous data as well as a (post-discretized) boolean data based on their intrinsic (i.e., inverse) relationship. A novel combined score of methylation and expression data (viz., ) is introduced which is computed on the integrated continuous data for identifying an initial non-redundant set of genes. Thereafter, (maximal) frequent closed homogeneous genesets are identified using a well-known biclustering algorithm applied on the integrated boolean data of the determined non-redundant set of genes. A novel sample-based weighted support ( ) is then proposed that is consecutively calculated on the integrated boolean data of the determined non-redundant set of genes in order to identify the non-redundant significant genesets. The top few resulting genesets are identified as potential s. Since our proposed method generates a smaller number of significant non-redundant genesets than those produced by other popular methods, the method is much faster than the others. Application of the proposed technique to expression and methylation data for uterine tumor or prostate carcinoma produces a set of significant combinations of markers. We expect that such a combination of markers will produce fewer false positives than individual markers.
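    The integration step exploits the inverse relationship between methylation and expression. Below is a minimal sketch of one plausible realization (per-gene z-scoring, subtracting methylation from expression, then thresholding at zero to obtain the boolean matrix); the abstract does not spell out the actual combined score, so treat these details as assumptions:

```python
import numpy as np

def integrate(expr, meth):
    """Combine gene expression and methylation (genes x samples arrays)
    into one continuous matrix and one post-discretized boolean matrix."""
    def z(a):  # per-gene z-score across samples
        return (a - a.mean(axis=1, keepdims=True)) / a.std(axis=1, keepdims=True)
    continuous = z(expr) - z(meth)   # methylation enters with inverted sign
    boolean = continuous > 0         # post-discretized integrated data
    return continuous, boolean

expr = np.array([[1.0, 2.0, 3.0]])
meth = np.array([[3.0, 2.0, 1.0]])  # inversely related to expression
cont, boolean = integrate(expr, meth)
print(boolean[0].tolist())  # [False, False, True]
```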

  17. Use of Multiple Data Sources to Estimate the Economic Cost of Dengue Illness in Malaysia

    Science.gov (United States)

    Shepard, Donald S.; Undurraga, Eduardo A.; Lees, Rosemary Susan; Halasa, Yara; Lum, Lucy Chai See; Ng, Chiu Wan

    2012-01-01

    Dengue represents a substantial burden in many tropical and sub-tropical regions of the world. We estimated the economic burden of dengue illness in Malaysia. Information about economic burden is needed for setting health policy priorities, but accurate estimation is difficult because of incomplete data. We overcame this limitation by merging multiple data sources to refine our estimates, including an extensive literature review, discussion with experts, review of data from health and surveillance systems, and implementation of a Delphi process. Because Malaysia has a passive surveillance system, the number of dengue cases is under-reported. Using an adjusted estimate of total dengue cases, we estimated an economic burden of dengue illness of US$56 million (Malaysian Ringgit MYR196 million) per year, which is approximately US$2.03 (Malaysian Ringgit 7.14) per capita. The overall economic burden of dengue would be even higher if we included costs associated with dengue prevention and control, dengue surveillance, and long-term sequelae of dengue. PMID:23033404
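    A quick consistency check of the reported figures: dividing the total burden by the per-capita burden recovers the implied population, and the paired MYR/US$ totals imply the exchange rate used (roughly 3.5 MYR per US$). Illustrative arithmetic only, using only numbers stated in the abstract:

```python
total_usd = 56e6        # US$56 million per year
per_capita_usd = 2.03   # US$ per capita
implied_population = total_usd / per_capita_usd
print(round(implied_population / 1e6, 1))  # ~27.6 million people

implied_rate = 196e6 / 56e6                # MYR196 million over US$56 million
print(round(implied_rate, 1))              # ~3.5 MYR per US$
```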

  18. Maximum Likelihood DOA Estimation of Multiple Wideband Sources in the Presence of Nonuniform Sensor Noise

    Directory of Open Access Journals (Sweden)

    K. Yao

    2007-12-01

    Full Text Available We investigate the maximum likelihood (ML) direction-of-arrival (DOA) estimation of multiple wideband sources in the presence of unknown nonuniform sensor noise. A new closed-form expression for the direction estimation Cramér-Rao bound (CRB) has been derived. The performance of the conventional wideband uniform ML estimator under nonuniform noise has been studied. In order to mitigate the performance degradation caused by the nonuniformity of the noise, a new deterministic wideband nonuniform ML DOA estimator is derived and two associated processing algorithms are proposed. The first algorithm is based on an iterative procedure which stepwise concentrates the log-likelihood function with respect to the DOAs and the noise nuisance parameters, while the second is a noniterative algorithm that maximizes the derived approximately concentrated log-likelihood function. The performance of the proposed algorithms is tested through extensive computer simulations. Simulation results show that the stepwise-concentrated ML algorithm (SC-ML) requires only a few iterations to converge, and that both the SC-ML and the approximately-concentrated ML algorithm (AC-ML) attain a solution close to the derived CRB at high signal-to-noise ratio.

  19. Journey of a Package: Category 1 Source (Co-60) Shipment with Several Border Crossings, Multiple Modes

    International Nuclear Information System (INIS)

    Gray, P. A.

    2016-01-01

    Radioactive materials (RAM) are used extensively in a vast array of industries and in an even wider breadth of applications on a truly global basis each and every day. Over the past 50 years, these applications and the quantity (activity) of RAM shipped has grown significantly, with the next 50 years expected to show a continuing trend. The movement of these goods occurs in all regions of the world, and must therefore be conducted in a manner which will not adversely impact people or the environment. Industry and regulators have jointly met this challenge, so much so that RAM shipments are amongst the safest of any product. How has this level of performance been achieved? What is involved in shipping RAM from one corner of the world to another, often via a number of in-transit locations and often utilizing multiple modes of transport in any single shipment? This paper reviews one such journey, of Category 1 Cobalt-60 sources, as they move from point of manufacture through to point of use, including the detailed multi-approval process, the stringent regulatory requirements in place, the extensive communications required throughout, and the practical aspects needed to simply offer such a product for sale and transport. Upon completion, the rationale for such an exemplary safety and security record will be readily apparent. (author)

  20. Freezing of enkephalinergic functions by multiple noxious foci: a source of pain sensitization?

    Directory of Open Access Journals (Sweden)

    François Cesselin

    Full Text Available BACKGROUND: The functional significance of proenkephalin systems in processing pain remains an open question and indeed is puzzling. For example, a noxious mechanical stimulus does not alter the release of Met-enkephalin-like material (MELM) from segments of the spinal cord related to the stimulated area of the body, but does increase its release from other segments. METHODOLOGY/PRINCIPAL FINDINGS: Here we show that, in the rat, a noxious mechanical stimulus applied to either the right or the left hind paw elicits a marked increase of MELM release during perifusion of either the whole spinal cord or the cervico-trigeminal area. However, these stimulatory effects were not additive and indeed, disappeared completely when the right and left paws were stimulated simultaneously. CONCLUSION/SIGNIFICANCE: We have concluded that in addition to the concept of a diffuse control of the transmission of nociceptive signals through the dorsal horn, there is a diffuse control of the modulation of this transmission. The "freezing" of Met-enkephalinergic functions represents a potential source of central sensitization in the spinal cord, notably in clinical situations involving multiple painful foci, e.g. cancer with metastases, poly-traumatism or rheumatoid arthritis.

  1. Use of multiple data sources to estimate the economic cost of dengue illness in Malaysia.

    Science.gov (United States)

    Shepard, Donald S; Undurraga, Eduardo A; Lees, Rosemary Susan; Halasa, Yara; Lum, Lucy Chai See; Ng, Chiu Wan

    2012-11-01

    Dengue represents a substantial burden in many tropical and sub-tropical regions of the world. We estimated the economic burden of dengue illness in Malaysia. Information about economic burden is needed for setting health policy priorities, but accurate estimation is difficult because of incomplete data. We overcame this limitation by merging multiple data sources to refine our estimates, including an extensive literature review, discussion with experts, review of data from health and surveillance systems, and implementation of a Delphi process. Because Malaysia has a passive surveillance system, the number of dengue cases is under-reported. Using an adjusted estimate of total dengue cases, we estimated an economic burden of dengue illness of US$56 million (Malaysian Ringgit MYR196 million) per year, which is approximately US$2.03 (Malaysian Ringgit 7.14) per capita. The overall economic burden of dengue would be even higher if we included costs associated with dengue prevention and control, dengue surveillance, and long-term sequelae of dengue.

  2. Software listing: CHEMTOX database

    International Nuclear Information System (INIS)

    Moskowitz, P.D.

    1993-01-01

    Initially launched in 1983, the CHEMTOX Database was among the first microcomputer databases containing hazardous chemical information. The database is used in many industries and government agencies in more than 17 countries. Updated quarterly, the CHEMTOX Database provides detailed environmental and safety information on 7500-plus hazardous substances covered by dozens of regulatory and advisory sources. This brief listing describes the method of accessing data and provides ordering information for those wishing to obtain the CHEMTOX Database.

  3. Experiment Databases

    Science.gov (United States)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Next to running machine learning algorithms based on inductive queries, much can be learned by immediately querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are being executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queriable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.

  4. DUSTMS-D: DISPOSAL UNIT SOURCE TERM - MULTIPLE SPECIES - DISTRIBUTED FAILURE DATA INPUT GUIDE.

    Energy Technology Data Exchange (ETDEWEB)

    SULLIVAN, T.M.

    2006-01-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). Many of these physical processes are influenced by the design of the disposal facility (e.g., how the engineered barriers control infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This has been done and the resulting models have been incorporated into the computer code DUST-MS (Disposal Unit Source Term-Multiple Species). The DUST-MS computer code is designed to model water flow, container degradation, release of contaminants from the wasteform to the contacting solution and transport through the subsurface media. Water flow through the facility over time is modeled using tabular input. Container degradation models include three types of failure rates: (a) instantaneous (all containers in a control volume fail at once), (b) uniformly distributed failures (containers fail at a linear rate between a specified starting and ending time), and (c) gaussian failure rates (containers fail at a rate determined by a mean failure time, standard deviation and gaussian distribution). Wasteform release models include four release mechanisms: (a) rinse with partitioning (inventory is released instantly upon container failure subject to equilibrium partitioning (sorption) with
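    The three container-degradation models described above can be sketched as cumulative failed fractions over time. This is a generic illustration of the model family, not code from DUST-MS itself; the parameter names are assumptions for the sketch:

```python
import math

def failed_fraction(t, model, t_fail=0.0, t_start=0.0, t_end=1.0, mu=0.0, sigma=1.0):
    """Cumulative fraction of containers failed by time t."""
    if model == "instantaneous":      # (a) all containers fail at once
        return 1.0 if t >= t_fail else 0.0
    if model == "uniform":            # (b) linear failure rate between start and end
        if t <= t_start:
            return 0.0
        if t >= t_end:
            return 1.0
        return (t - t_start) / (t_end - t_start)
    if model == "gaussian":           # (c) normal CDF with mean mu, std sigma
        return 0.5 * (1.0 + math.erf((t - mu) / (sigma * math.sqrt(2.0))))
    raise ValueError(f"unknown model: {model}")

print(failed_fraction(50.0, "uniform", t_start=0.0, t_end=100.0))  # 0.5
print(failed_fraction(300.0, "gaussian", mu=300.0, sigma=25.0))    # 0.5
```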

  5. Effects of neutron spectrum and external neutron source on neutron multiplication parameters in accelerator-driven system

    International Nuclear Information System (INIS)

    Shahbunder, Hesham; Pyeon, Cheol Ho; Misawa, Tsuyoshi; Lim, Jae-Yong; Shiroya, Seiji

    2010-01-01

    The neutron multiplication parameters (the neutron multiplication M, the subcritical multiplication factor k_s, and the external source efficiency φ*) play an important role in the numerical assessment and reactor power evaluation of an accelerator-driven system (ADS). Those parameters can be evaluated by using the measured reaction rate distribution in the subcritical system. In this study, the experimental verification of this methodology is performed in various ADS cores: with a high-energy (100 MeV) proton-tungsten source in hard and soft neutron spectrum cores, and with a 14 MeV D-T neutron source in a soft spectrum core. The comparison between measured and calculated multiplication parameters reveals a maximum relative difference in the range of 6.6-13.7%, which is attributed to the uncertainty and accuracy of the nuclear data libraries for energies higher than 20 MeV, and is also dependent on the reaction rate distribution position and count rates. The effects of different core neutron spectra and external neutron sources on the neutron multiplication parameters are discussed.
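    The parameters above are linked by the standard point relation between the subcritical multiplication factor and the total multiplication. This is a textbook sketch assuming the conventional definitions, not any ADS-specific refinement the paper may use:

```python
def neutron_multiplication(k_s):
    """Total multiplication of an external source in a subcritical system:
    each source neutron plus its chain of progeny contributes
    1 + k_s + k_s**2 + ... = 1 / (1 - k_s)."""
    if not 0.0 <= k_s < 1.0:
        raise ValueError("k_s must lie in [0, 1) for a subcritical system")
    return 1.0 / (1.0 - k_s)

# A deeply subcritical core (k_s = 0.95) still multiplies the source 20-fold:
print(round(neutron_multiplication(0.95), 6))  # 20.0
```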

  6. Impact of multiple-dose versus single-dose inhaler devices on COPD patients’ persistence with long-acting β2-agonists: a dispensing database analysis

    Science.gov (United States)

    van Boven, Job FM; van Raaij, Joost J; van der Galiën, Ruben; Postma, Maarten J; van der Molen, Thys; Dekhuijzen, PN Richard; Vegter, Stefan

    2014-01-01

    Background: With a growing availability of different devices and types of medication, additional evidence is required to assist clinicians in prescribing the optimal medication in relation to chronic obstructive pulmonary disease (COPD) patients’ persistence with long-acting β2-agonists (LABAs). Aims: To assess the impact of the type of inhaler device (multiple-dose versus single-dose inhalers) on 1-year persistence and switching patterns with LABAs. Methods: A retrospective observational cohort study was performed comparing a cohort of patients initiating multiple-dose inhalers and a cohort initiating single-dose inhalers. The study population consisted of long-acting bronchodilator naive COPD patients, initiating inhalation therapy with mono-LABAs (formoterol, indacaterol or salmeterol). Analyses were performed using pharmacy dispensing data from 1994 to 2012, obtained from the IADB.nl database. Study outcomes were 1-year persistence and switching patterns. Results were adjusted for initial prescriber, initial medication, dosing regimen and relevant comorbidities. Results: In all, 575 patients initiating LABAs were included in the final study cohort. Among them, 475 (83%) initiated a multiple-dose inhaler and 100 (17%) a single-dose inhaler. Further, 269 (47%) initiated formoterol, 9 (2%) indacaterol and 297 (52%) salmeterol. There was no significant difference in persistence between users of multiple-dose or single-dose inhalers (hazard ratio: 0.98, 95% confidence interval: 0.76–1.26, P=0.99). Over 80% re-started or switched medication. Conclusions: There seems no impact of inhaler device (multiple-dose versus single-dose inhalers) on COPD patients’ persistence with LABAs. Over 80% of patients who initially seemed to discontinue LABAs, re-started their initial medication or switched inhalers or medication within 1 year. PMID:25274453

  7. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web. This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  8. NAViGaTing the micronome--using multiple microRNA prediction databases to identify signalling pathway-associated microRNAs.

    Directory of Open Access Journals (Sweden)

    Elize A Shirdel

    2011-02-01

    Full Text Available MicroRNAs are a class of small RNAs known to regulate gene expression at the transcript level, the protein level, or both. Since microRNA binding is sequence-based but possibly structure-specific, work in this area has resulted in multiple databases storing predicted microRNA:target relationships computed using diverse algorithms. We integrate prediction databases, compare predictions to in vitro data, and use cross-database predictions to model the microRNA:transcript interactome (referred to as the micronome) to study microRNA involvement in well-known signalling pathways as well as associations with disease. We make this data freely available with a flexible user interface as our microRNA Data Integration Portal, mirDIP (http://ophid.utoronto.ca/mirDIP). mirDIP integrates prediction databases to elucidate accurate microRNA:target relationships. Using NAViGaTOR to produce interaction networks implicating microRNAs in literature-based, KEGG-based and Reactome-based pathways, we find these signalling pathway networks have significantly more microRNA involvement compared to chance (p<0.05), suggesting microRNAs co-target many genes in a given pathway. Further examination of the micronome shows two distinct classes of microRNAs: universe microRNAs, which are involved in many signalling pathways, and intra-pathway microRNAs, which target multiple genes within one signalling pathway. We find universe microRNAs to have more targets (p<0.0001), to be more studied (p<0.0002), and to have higher degree in the KEGG cancer pathway (p<0.0001), compared to intra-pathway microRNAs. Our pathway-based analysis of mirDIP data suggests microRNAs are involved in intra-pathway signalling. We identify two distinct classes of microRNAs, suggesting a hierarchical organization of microRNAs co-targeting genes both within and between pathways, and implying differential involvement of universe and intra-pathway microRNAs at the disease level.

  9. Application of the modified neutron source multiplication method for a measurement of sub-criticality in AGN-201K reactor

    International Nuclear Information System (INIS)

    Myung-Hyun Kim

    2010-01-01

    Measurement of sub-criticality is a challenging and necessary task in the nuclear industry, both for nuclear criticality safety and for physics tests in nuclear power plants. A relatively new method named the Modified Neutron Source Multiplication Method (MNSM) was proposed in Japan. This method is an improvement of the traditional Neutron Source Multiplication (NSM) method, in which three correction factors are applied additionally. In this study, MNSM was tested in the calculation of rod worth using an educational reactor at Kyung Hee University, AGN-201K. For this study, a revised nuclear data library and the neutron transport code system TRANSX-PARTISN were used to calculate the correction factors for various control rod positions and source locations. Experiments were designed and performed to accentuate the errors in NSM arising from the location effects of the source and detectors. MNSM can correct for these effects, but the current results showed little correction effect. (author)
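    Uncorrected NSM, which MNSM refines with its three correction factors, rests on the proportionality C ∝ S/(1 − k): the count rate relative to a reference state of known k gives the unknown multiplication factor. The sketch below shows only this classical relation; the MNSM correction factors from the paper are not included, and the parameter names are assumptions:

```python
def k_from_counts(c_ref, k_ref, c, source_strength_ratio=1.0):
    """Classical NSM estimate: count rate C ∝ S / (1 - k), so
    (1 - k) = (c_ref / c) * (S / S_ref) * (1 - k_ref)."""
    return 1.0 - (c_ref / c) * source_strength_ratio * (1.0 - k_ref)

# Count rate doubles relative to a k = 0.95 reference state, so the
# subcriticality (1 - k) halves: k = 0.975.
print(round(k_from_counts(c_ref=100.0, k_ref=0.95, c=200.0), 6))  # 0.975
```

Errors from source and detector locations (the effect probed by the experiments above) enter through the proportionality constant, which is exactly what the MNSM correction factors are meant to absorb.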

  10. Relative accuracy and availability of an Irish National Database of dispensed medication as a source of medication history information: observational study and retrospective record analysis.

    LENUS (Irish Health Repository)

    Grimes, T

    2013-01-27

    WHAT IS KNOWN AND OBJECTIVE: The medication reconciliation process begins by identifying which medicines a patient used before presentation to hospital. This is time-consuming, labour intensive and may involve interruption of clinicians. We sought to identify the availability and accuracy of data held in a national dispensing database, relative to other sources of medication history information. METHODS: For patients admitted to two acute hospitals in Ireland, a Gold Standard Pre-Admission Medication List (GSPAML) was identified and corroborated with the patient or carer. The GSPAML was compared for accuracy and availability to PAMLs from other sources, including the Health Service Executive Primary Care Reimbursement Scheme (HSE-PCRS) dispensing database. RESULTS: Some 1111 medications were assessed for 97 patients, who had a median age of 74 years (range 18-92 years), a median of four co-morbidities (range 1-9) and used a median of 10 medications (range 3-25); half (52%) were male. The HSE-PCRS PAML was the most accurate source compared to lists provided by the general practitioner, community pharmacist or cited in previous hospital documentation: the list agreed for 74% of the medications the patients actually used, representing complete agreement for all medications in 17% of patients. It was as contemporaneous as the other sources, but was less reliable for male than female patients, those using increasing numbers of medications and those using one or more items that were not reimbursable by the HSE. WHAT IS NEW AND CONCLUSION: The HSE-PCRS database is a relatively accurate, available and contemporaneous source of medication history information and could support acute hospital medication reconciliation.

  11. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    Science.gov (United States)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides exhibit themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth surface. Making landslide databases available online via the WWW (World Wide Web) promotes the spreading and reaching out of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets along with the landslide susceptibility map were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front-end application, and PostgreSQL with the PostGIS extension serves as the back-end application for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of the landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.

  12. Multiple uses for an old ibm-pc 486 in nuclear medicine using open source software

    International Nuclear Information System (INIS)

    Anselmi, C.E.; Anselmi, O.E.

    2002-01-01

Multiple uses for an old ibm-pc 486 in nuclear medicine using open source software. Aim: To use a low budget platform to: 1 - send patient's images from the processing workstation to the nuclear medicine information system; 2 - backup data files from acquisition in DICOM format on cd-rom; 3 - move data across different hospitals allowing remote processing and reading of studies. Both nuclear medicine systems in the two hospitals are Siemens Icon workstations. Material and methods: The computer used is an ibm-pc 486, which sells for about US $70. The operating system installed is Red Hat Linux 6.2. The sending of the patient's images to the information system is performed through AppleTalk and Samba. The backup of acquisition files is performed by communication from the workstation through DICOM to the Storage Class Provider (Office Dicom Toolkit) running on the 486, and the files are later burned on cd-rom. A similar configuration is present in the other hospital, with minor differences in processor type. Data from either hospital can be sent to the other one through remote synchronization performed by Rsync. The connection between both Linux computers is encrypted through Secure Shell (OpenSSH). All software installed on the 486 was downloaded from the internet at no cost. No software was installed on the workstations. Results: The whole system is recognized transparently by the workstation's system as a local storage disk, such as the acquisition cameras or the other workstations. The transfer of images from the workstation to the information system or to a remote hospital is done the same way as copying data from the acquisition cameras in the vendor's software. When transferring large files across hospitals, the synchronization may take 1 to 3 minutes through broadband internet. The backup in DICOM format on cd-rom allows review of patient data on any computer equipped with DICOM viewing software, as well as the re-processing of that
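The inter-hospital transfer step above, rsync tunnelled through SSH, can be sketched as follows. The host names, directories and helper function are hypothetical, not those of the original installation:

```python
# Hypothetical sketch of the inter-hospital synchronization step:
# mirroring a local DICOM spool directory to a remote Linux host
# over an encrypted SSH transport, as the abstract describes.
import subprocess

def build_sync_cmd(src_dir, remote_host, dest_dir):
    """Assemble an rsync-over-SSH command for mirroring a directory."""
    return [
        "rsync", "-az", "--partial",
        "-e", "ssh",                  # encrypt the transport with OpenSSH
        src_dir + "/",                # trailing slash: sync directory contents
        f"{remote_host}:{dest_dir}",
    ]

cmd = build_sync_cmd("/var/dicom/outbox", "nucmed@hospital-b", "/var/dicom/inbox")
# On a host with rsync and SSH keys configured, the transfer would run as:
# subprocess.run(cmd, check=True)
```

Running this periodically (e.g. from cron) reproduces the "remote synchronization performed by Rsync" behaviour: only changed files are transferred, which keeps the 1-3 minute window quoted above plausible for large studies.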

  13. A methodology for combining multiple commercial data sources to improve measurement of the food and alcohol environment: applications of geographical information systems

    Directory of Open Access Journals (Sweden)

    Dara D. Mendez

    2014-11-01

Full Text Available Commercial data sources have been increasingly used to measure and locate community resources. We describe a methodology for combining and comparing the differences in commercial data of the food and alcohol environment. We used data from two commercial databases (InfoUSA and Dun & Bradstreet) for 2003 and 2009 to obtain information on food and alcohol establishments and developed a matching process using computer algorithms and manual review, applying ArcGIS to geocode addresses, standard industrial classification and North American industry classification taxonomy for type of establishment, and establishment name. We constructed population- and area-based density measures (e.g. grocery stores) and assessed differences across data sources and used ArcGIS to map the densities. The matching process resulted in 8,705 and 7,078 unique establishments for 2003 and 2009, respectively. There were more establishments captured in the combined dataset than by relying on one data source alone, and the additional establishments captured ranged from 1,255 to 2,752 in 2009. The correlations for the density measures between the two data sources were highest for alcohol outlets (r = 0.75 and 0.79 for per capita and area, respectively) and lowest for grocery stores/supermarkets (r = 0.32 for both). This process of applying geographical information systems to combine multiple commercial data sources and develop measures of the food and alcohol environment captured more establishments than relying on one data source alone. This replicable methodology was found to be useful for understanding the food and alcohol environment when local or public data are limited.
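The automated part of the matching process can be sketched as name-plus-address comparison. This is a simplified stand-in for the study's combination of geocoding, classification codes and manual review; the normalization rules, similarity threshold and record fields are illustrative assumptions:

```python
# Sketch of cross-database record matching: link establishments from two
# commercial sources by normalized address equality and name similarity.
from difflib import SequenceMatcher

def normalize(s):
    """Lowercase, strip punctuation, and collapse whitespace."""
    return " ".join(s.lower().replace(".", "").replace(",", "").split())

def is_match(rec_a, rec_b, name_thresh=0.85):
    """Match if addresses agree exactly and names are highly similar."""
    if normalize(rec_a["address"]) != normalize(rec_b["address"]):
        return False
    sim = SequenceMatcher(None, normalize(rec_a["name"]),
                          normalize(rec_b["name"])).ratio()
    return sim >= name_thresh

a = {"name": "Joe's Grocery Inc.", "address": "12 Main St, Pittsburgh PA"}
b = {"name": "Joes Grocery Inc",   "address": "12 Main St., Pittsburgh PA"}
matched = is_match(a, b)
```

In practice the pairs falling near the threshold are exactly the ones the study routes to manual review; establishments unmatched in either direction are the "additional establishments captured" by combining the sources.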

  14. Exploiting Deep Neural Networks and Head Movements for Robust Binaural Localization of Multiple Sources in Reverberant Environments

    DEFF Research Database (Denmark)

    Ma, Ning; May, Tobias; Brown, Guy J.

    2017-01-01

    This paper presents a novel machine-hearing system that exploits deep neural networks (DNNs) and head movements for robust binaural localization of multiple sources in reverberant environments. DNNs are used to learn the relationship between the source azimuth and binaural cues, consisting...... of the complete cross-correlation function (CCF) and interaural level differences (ILDs). In contrast to many previous binaural hearing systems, the proposed approach is not restricted to localization of sound sources in the frontal hemifield. Due to the similarity of binaural cues in the frontal and rear...

  15. Development of an updated phytoestrogen database for use with the SWAN food frequency questionnaire: intakes and food sources in a community-based, multiethnic cohort study.

    Science.gov (United States)

    Huang, Mei-Hua; Norris, Jean; Han, Weijuan; Block, Torin; Gold, Ellen; Crawford, Sybil; Greendale, Gail A

    2012-01-01

    Phytoestrogens, heterocyclic phenols found in plants, may benefit several health outcomes. However, epidemiologic studies of the health effects of dietary phytoestrogens have yielded mixed results, in part due to challenges inherent in estimating dietary intakes. The goal of this study was to improve the estimates of dietary phytoestrogen consumption using a modified Block Food Frequency Questionnaire (FFQ), a 137-item FFQ created for the Study of Women's Health Across the Nation (SWAN) in 1994. To expand the database of sources from which phytonutrient intakes were computed, we conducted a comprehensive PubMed/Medline search covering January 1994 through September 2008. The expanded database included 4 isoflavones, coumestrol, and 4 lignans. The new database estimated isoflavone content of 105 food items (76.6%) vs. 14 (10.2%) in the 1994 version and computed coumestrol content of 52 food items (38.0%), compared to 1 (0.7%) in the original version. Newly added were lignans; values for 104 FFQ food items (75.9%) were calculated. In addition, we report here the phytonutrient intakes for each racial and language group in the SWAN sample and present major food sources from which the phytonutrients came. This enhanced ascertainment of phytoestrogens will permit improved studies of their health effects.

  16. Investigation of black and brown carbon multiple-wavelength-dependent light absorption from biomass and fossil fuel combustion source emissions

    Science.gov (United States)

    Michael R. Olson; Mercedes Victoria Garcia; Michael A. Robinson; Paul Van Rooy; Mark A. Dietenberger; Michael Bergin; James Jay Schauer

    2015-01-01

    Quantification of the black carbon (BC) and brown carbon (BrC) components of source emissions is critical to understanding the impact combustion aerosols have on atmospheric light absorption. Multiple-wavelength absorption was measured from fuels including wood, agricultural biomass, coals, plant matter, and petroleum distillates in controlled combustion settings....

  17. Frequency-swept laser light source at 1050 nm with higher bandwidth due to multiple semiconductor optical amplifiers in series

    DEFF Research Database (Denmark)

    Marschall, Sebastian; Thrane, Lars; Andersen, Peter E.

    2009-01-01

    We report on the development of an all-fiber frequency-swept laser light source in the 1050 nm range based on semiconductor optical amplifiers (SOA) with improved bandwidth due to multiple gain media. It is demonstrated that even two SOAs with nearly equal gain spectra can improve the performance...

  18. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulation of total nitrogen (TN) loss intensity of all 38 subbasins, spatial distribution characteristics of nitrogen loss and critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contribution of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order of monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At all time scales, land use type (such as farmland and forest) was the dominant factor affecting the spatial distribution of nitrogen loss, while precipitation and runoff influenced nitrogen loss only in months without fertilization and during storm flood events on non-fertilization dates. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.
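Once per-subbasin TN loads are simulated, critical-source-area screening is essentially a ranking exercise. A minimal sketch, assuming a simple cumulative-share criterion (the subbasin ids, loads and 50% threshold are made up; the real values come from the SWAT simulation):

```python
# Illustrative sketch of critical-source-area screening: rank subbasins by
# simulated TN load and flag the smallest set contributing a chosen share
# of the watershed total.

def critical_areas(tn_load, share=0.5):
    """Return subbasin ids whose cumulative load first reaches `share` of total."""
    total = sum(tn_load.values())
    ranked = sorted(tn_load, key=tn_load.get, reverse=True)
    picked, acc = [], 0.0
    for sb in ranked:
        picked.append(sb)
        acc += tn_load[sb]
        if acc >= share * total:
            break
    return picked

loads = {"SB01": 4.1, "SB07": 12.5, "SB12": 2.2, "SB20": 6.0, "SB33": 1.2}
hotspots = critical_areas(loads, share=0.5)
```

Running the same screening on yearly, monthly and storm-flood loads is what exposes the time-scale dependence reported above: the set of flagged subbasins can differ between scales even though the ranking logic is identical.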

  19. Determination of Key Risk Supervision Areas around River-Type Water Sources Affected by Multiple Risk Sources: A Case Study of Water Sources along the Yangtze’s Nanjing Section

    Directory of Open Access Journals (Sweden)

    Qi Zhou

    2017-02-01

Full Text Available To provide a reference for risk management of water sources, this study screens the key risk supervision areas around river-type water sources (hereinafter referred to as the water sources) threatened by multiple fixed risk sources (the risk sources), and establishes a comprehensive methodological system. Specifically, it comprises: (1) a method of partitioning risk-source-concentrated sub-regions for screening water source perimeter key risk supervision areas; (2) an approach for determining sub-regional risk indexes (SrRI), which characterize the scale of sub-regional risks considering factors like risk distribution intensity within sub-regions, risk indexes of risk sources (RIRS, characterizing the risk scale of risk sources) and the number of risk sources; and (3) a method of calculating a sub-region's risk threat to the water sources (SrTWS), which considers the positional relationship between water sources and sub-regions as well as SrRI, and the criteria for determining key supervision sub-regions. Favorable effects are achieved by applying this methodological system to determining water source perimeter sub-regions distributed along the Yangtze's Nanjing section. Results revealed that for water sources, the key sub-regions needing supervision were SD16, SD06, SD21, SD26, SD15, SD03, SD02, SD32, SD10, SD11, SD14, SD05, SD27, etc., in the order of criticality. The sub-region with the greatest risk threat to the water sources was SD16, which was located in the middle reaches of the Yangtze River. In general, sub-regions along the upper Yangtze reaches posed greater threats to water sources than the lower reach sub-regions, other than SD26 and SD21. Upstream water sources were less subject to the threats of sub-regions than the downstream sources, other than NJ09B and NJ03.

  20. submitter BioSharing: curated and crowd-sourced metadata standards, databases and data policies in the life sciences

    CERN Document Server

    McQuilton, Peter; Rocca-Serra, Philippe; Thurston, Milo; Lister, Allyson; Maguire, Eamonn; Sansone, Susanna-Assunta

    2016-01-01

    BioSharing (http://www.biosharing.org) is a manually curated, searchable portal of three linked registries. These resources cover standards (terminologies, formats and models, and reporting guidelines), databases, and data policies in the life sciences, broadly encompassing the biological, environmental and biomedical sciences. Launched in 2011 and built by the same core team as the successful MIBBI portal, BioSharing harnesses community curation to collate and cross-reference resources across the life sciences from around the world. BioSharing makes these resources findable and accessible (the core of the FAIR principle). Every record is designed to be interlinked, providing a detailed description not only on the resource itself, but also on its relations with other life science infrastructures. Serving a variety of stakeholders, BioSharing cultivates a growing community, to which it offers diverse benefits. It is a resource for funding bodies and journal publishers to navigate the metadata landscape of the ...

  1. Comparison of cluster-based and source-attribution methods for estimating transmission risk using large HIV sequence databases.

    Science.gov (United States)

    Le Vu, Stéphane; Ratmann, Oliver; Delpech, Valerie; Brown, Alison E; Gill, O Noel; Tostevin, Anna; Fraser, Christophe; Volz, Erik M

    2018-06-01

Phylogenetic clustering of HIV sequences from a random sample of patients can reveal epidemiological transmission patterns, but interpretation is hampered by limited theoretical support, and the statistical properties of clustering analysis remain poorly understood. Alternatively, source attribution methods allow fitting of HIV transmission models and thereby quantify aspects of disease transmission. A simulation study was conducted to assess error rates of clustering methods for detecting transmission risk factors. We modeled HIV epidemics among men having sex with men and generated phylogenies comparable to those that can be obtained from HIV surveillance data in the UK. Clustering and source attribution approaches were applied to evaluate their ability to identify patient attributes as transmission risk factors. We find that commonly used methods show a misleading association between cluster size or odds of clustering and covariates that are correlated with time since infection, regardless of their influence on transmission. Clustering methods usually have higher error rates and lower sensitivity than the source attribution method for identifying transmission risk factors. But neither method provides robust estimates of transmission risk ratios. The source attribution method can alleviate drawbacks of phylogenetic clustering, but formal population genetic modeling may be required to estimate quantitative transmission risk factors. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Real-Time Localization of Moving Dipole Sources for Tracking Multiple Free-Swimming Weakly Electric Fish

    Science.gov (United States)

    Jun, James Jaeyoon; Longtin, André; Maler, Leonard

    2013-01-01

    In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The signal source location can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and the distance, but difficulty of the source localization increases if there is an additional dependence on the orientation of a signal source. In such cases, the signal source could be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found a dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real-time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animal’s positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need for calibrating individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF. Furthermore, our method could be extended to other application areas involving dipole source

  3. Real-Time Localization of Moving Dipole Sources for Tracking Multiple Free-Swimming Weakly Electric Fish.

    Directory of Open Access Journals (Sweden)

    James Jaeyoon Jun

Full Text Available In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The signal source location can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and the distance, but the difficulty of source localization increases if there is an additional dependence on the orientation of a signal source. In such cases, the signal source could be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found a dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real-time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animal's positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need for calibrating individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF.
Furthermore, our method could be extended to other application areas involving dipole
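The LUT idea above can be sketched in a few lines. The 2-D dipole field model, detector layout and grid spacing below are illustrative assumptions, not the paper's derivation; the point is the mechanism of matching normalized RSI patterns against a precomputed table:

```python
# Minimal sketch of lookup-table (LUT) dipole localization. A 2-D dipole's
# signal at a detector is modeled as cos(angle between the dipole axis and
# the direction to the detector) / distance**2.
import math

DETECTORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]

def rsi(dipole_xy, theta, det):
    """Model RSI at one detector for a dipole at dipole_xy with axis angle theta."""
    dx, dy = det[0] - dipole_xy[0], det[1] - dipole_xy[1]
    d = math.hypot(dx, dy)
    return math.cos(math.atan2(dy, dx) - theta) / d**2

def normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

# Precompute normalized RSI patterns over candidate positions/orientations.
LUT = {}
for xi in range(1, 10):
    for yi in range(1, 10):
        for k in range(8):  # 8 candidate orientations
            key = (float(xi), float(yi), k * math.pi / 4)
            LUT[key] = normalize([rsi(key[:2], key[2], d) for d in DETECTORS])

def locate(measured):
    """Return the LUT entry whose normalized RSIs best match the measurement."""
    m = normalize(measured)
    return min(LUT, key=lambda k: sum((a - b) ** 2 for a, b in zip(LUT[k], m)))

true_pos, true_theta = (4.0, 7.0), math.pi / 2
measured = [rsi(true_pos, true_theta, d) for d in DETECTORS]
est = locate(measured)
```

Normalizing the RSI vector removes the unknown source strength, which is why the table only needs positions and orientations; the paper's refinement step then repeats the search on a finer grid around the best coarse match.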

  4. Brain Tumor Database, a free relational database for collection and analysis of brain tumor patient information.

    Science.gov (United States)

    Bergamino, Maurizio; Hamilton, David J; Castelletti, Lara; Barletta, Laura; Castellan, Lucio

    2015-03-01

    In this study, we describe the development and utilization of a relational database designed to manage the clinical and radiological data of patients with brain tumors. The Brain Tumor Database was implemented using MySQL v.5.0, while the graphical user interface was created using PHP and HTML, thus making it easily accessible through a web browser. This web-based approach allows for multiple institutions to potentially access the database. The BT Database can record brain tumor patient information (e.g. clinical features, anatomical attributes, and radiological characteristics) and be used for clinical and research purposes. Analytic tools to automatically generate statistics and different plots are provided. The BT Database is a free and powerful user-friendly tool with a wide range of possible clinical and research applications in neurology and neurosurgery. The BT Database graphical user interface source code and manual are freely available at http://tumorsdatabase.altervista.org. © The Author(s) 2013.
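A relational layout of this kind pairs a patient table with a tumor table keyed by patient id. A hedged sketch in that spirit: the original uses MySQL and PHP, while sqlite3 is used here only to keep the example self-contained, and the column names are illustrative rather than the published schema:

```python
# Minimal relational schema sketch: patients linked to tumor records
# carrying clinical, anatomical and radiological attributes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    id INTEGER PRIMARY KEY,
    label TEXT NOT NULL,
    birth_year INTEGER
);
CREATE TABLE tumor (
    id INTEGER PRIMARY KEY,
    patient_id INTEGER NOT NULL REFERENCES patient(id),
    location TEXT,          -- anatomical attribute
    who_grade INTEGER,      -- clinical feature
    volume_ml REAL          -- radiological characteristic
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'Case 001', 1962)")
conn.execute("INSERT INTO tumor VALUES (1, 1, 'frontal lobe', 2, 14.2)")

# A typical analytic query: mean tumor volume per WHO grade.
row = conn.execute(
    "SELECT who_grade, AVG(volume_ml) FROM tumor GROUP BY who_grade"
).fetchone()
```

The database's built-in statistics and plots reduce to aggregate queries of this shape; the web front-end simply renders their results.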

  5. Estimation of Multiple Point Sources for Linear Fractional Order Systems Using Modulating Functions

    KAUST Repository

    Belkhatir, Zehor; Laleg-Kirati, Taous-Meriem

    2017-01-01

    This paper proposes an estimation algorithm for the characterization of multiple point inputs for linear fractional order systems. First, using polynomial modulating functions method and a suitable change of variables the problem of estimating

  6. Source term determination from subcritical multiplication measurements at Koral-1 reactor

    International Nuclear Information System (INIS)

    Blazquez, J.B.; Barrado, J.M.

    1978-01-01

By using an AmBe neutron source, two independent procedures have been established for the zero-power experimental fast reactor Coral-1 in order to measure the source term which appears in the point kinetics equations. In the first one, the source term is measured when the reactor is just critical with the source, by taking advantage of the wide range of the linear approach to critical for Coral-1. In the second one, the measurement is made in the subcritical state by making use of the previously calibrated control rods. Several applications are also included, such as the measurement of the detector dead time, the determination of the reactivity of small samples and the shape of the neutron importance of the source. (author)

  7. Automated classification of seismic sources in a large database: a comparison of Random Forests and Deep Neural Networks.

    Science.gov (United States)

    Hibert, Clement; Stumpf, André; Provost, Floriane; Malet, Jean-Philippe

    2017-04-01

In the past decades, the increasing quality of seismic sensors and the capability to transfer large quantities of data remotely led to a fast densification of local, regional and global seismic networks for near real-time monitoring of crustal and surface processes. This technological advance permits the use of seismology to document geological and natural/anthropogenic processes (volcanoes, ice-calving, landslides, snow and rock avalanches, geothermal fields), but also leads to an ever-growing quantity of seismic data. This wealth of seismic data makes the construction of complete seismicity catalogs, which include earthquakes but also other sources of seismic waves, more challenging and very time-consuming, as this critical pre-processing stage is classically done by human operators and because hundreds of thousands of seismic signals have to be processed. To overcome this issue, the development of automatic methods for the processing of continuous seismic data appears to be a necessity. The classification algorithm should satisfy the need for a method that is robust, precise and versatile enough to be deployed to monitor the seismicity in very different contexts. In this study, we evaluate the ability of machine learning algorithms, namely Random Forest and Deep Neural Network classifiers, to analyze seismic sources at the Piton de la Fournaise volcano. We gather a catalog of more than 20,000 events, belonging to 8 classes of seismic sources. We define 60 attributes, based on the waveform, the frequency content and the polarization of the seismic waves, to parameterize the recorded seismic signals. We show that both algorithms provide similar positive classification rates, with values exceeding 90% of the events. When trained with a sufficient number of events, the rate of positive identification can reach 99%.
These very high rates of positive identification open the perspective of an operational implementation of these algorithms for near-real time monitoring of
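The attribute-extraction stage that feeds such classifiers can be illustrated with a few simple waveform features. The study uses 60 attributes; the three below (energy, peak amplitude, dominant frequency via a direct DFT) are simplified stand-ins, not the paper's feature set:

```python
# Illustrative waveform attributes of the kind used to parameterize
# seismic signals before classification.
import math

def attributes(signal, fs):
    """Return (energy, peak amplitude, dominant frequency) of a sampled signal."""
    n = len(signal)
    energy = sum(x * x for x in signal) / fs
    peak = max(abs(x) for x in signal)
    # Magnitude of the DFT at each positive-frequency bin (direct evaluation).
    mags = []
    for k in range(1, n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append((math.hypot(re, im), k * fs / n))
    dom_freq = max(mags)[1]  # frequency of the strongest bin
    return energy, peak, dom_freq

fs = 100.0  # sampling rate in Hz
sig = [math.sin(2 * math.pi * 5.0 * t / fs) for t in range(200)]  # 5 Hz tone
energy, peak, dom = attributes(sig, fs)
```

Each event becomes a fixed-length feature vector of such attributes, which is exactly the input format Random Forest and neural-network classifiers expect.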

  8. Reduced dose uncertainty in MRI-based polymer gel dosimetry using parallel RF transmission with multiple RF sources

    International Nuclear Information System (INIS)

    Sang-Young Kim; Jung-Hoon Lee; Jin-Young Jung; Do-Wan Lee; Seu-Ran Lee; Bo-Young Choe; Hyeon-Man Baek; Korea University of Science and Technology, Daejeon; Dae-Hyun Kim; Jung-Whan Min; Ji-Yeon Park

    2014-01-01

In this work, we present the feasibility of using a parallel RF transmission with multiple RF sources imaging method (MultiTransmit imaging) in polymer gel dosimetry. Image quality and B1 field homogeneity were statistically better with the MultiTransmit imaging method than with the conventional single-source RF transmission imaging method. In particular, the standard uncertainty of R2 was lower on the MultiTransmit images than on the conventional images. Furthermore, the MultiTransmit measurement showed improved dose resolution. Improved image quality and B1 homogeneity result in reduced dose uncertainty, thereby suggesting the feasibility of MultiTransmit MR imaging in gel dosimetry. (author)

  9. On-line biomedical databases-the best source for quick search of the scientific information in the biomedicine.

    Science.gov (United States)

    Masic, Izet; Milinovic, Katarina

    2012-06-01

Most medical journals now have an electronic version, available over public networks. Printed and electronic versions may exist in parallel, but one form need not be published simultaneously with the other: the electronic version can appear a few weeks before the printed form, and the two need not have identical content. The electronic form of a journal may include extensions that the printed form cannot carry, such as animation or 3D displays, and may offer full text (mostly in PDF or XML format) or only the table of contents or abstracts. Access to full text is usually not free and can be achieved only if the institution (library or host) enters into an access agreement. Many medical journals, however, provide free access to some articles, or to the complete content after a certain time (6 months or a year). Such journals can be found through network archives such as HighWire Press and FreeMedicalJournals.com. Special mention must go to PubMed and PubMed Central, the first public digital archives offering unrestricted collections of available medical literature, which operate within the National Library of Medicine in Bethesda (USA). There are also so-called online medical journals published only in electronic form, which can be searched through online databases. In this paper the authors briefly describe about 30 databases and give short instructions on how to access and search for papers published in indexed medical journals.

  10. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system.

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

We demonstrate that direct sequence optical code-division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes generated by spreading in time-coherent data pulses can result from multiple reflections in the interferometers that can superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of complex amplitudes of the combined chip as the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.
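The coherence effect described here, amplitude summation versus intensity summation of overlapping chips, can be sketched numerically. The amplitudes and phases below are illustrative, not values from the paper:

```python
# Sketch: when the source coherence time exceeds the detector integration
# time, overlapping chip pulses add as complex field amplitudes; otherwise
# only their intensities add.
import cmath

def detected_intensity(amplitudes, phases, coherent):
    if coherent:
        field = sum(a * cmath.exp(1j * p) for a, p in zip(amplitudes, phases))
        return abs(field) ** 2
    return sum(a * a for a in amplitudes)

amps = [1.0, 1.0]
in_phase = detected_intensity(amps, [0.0, 0.0], coherent=True)      # constructive
opposed = detected_intensity(amps, [0.0, cmath.pi], coherent=True)  # destructive
incoh = detected_intensity(amps, [0.0, 0.0], coherent=False)        # intensity sum
```

The spread between the constructive and destructive cases is the coherence-induced impairment: with a highly coherent source the detected autocorrelation peak depends on uncontrolled optical phases, whereas the incoherent sum is phase-independent.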

  11. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

We demonstrate that direct sequence optical code-division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes generated by spreading in time-coherent data pulses can result from multiple reflections in the interferometers that can superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of complex amplitudes of the combined chip as the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.

  12. Application of the Modified Source Multiplication (MSM) Technique to Subcritical Reactivity Worth Measurements in Thermal and Fast Reactor Systems

    International Nuclear Information System (INIS)

    Blaise, P.; Fougeras, Ph.; Mellier, F.

    2011-01-01

    The Amplified Source Multiplication (ASM) method and its improved Modified Source Multiplication (MSM) method have been widely used in the CEA's EOLE and MASURCA critical facilities over the past decades for the determination of reactivity worths by using fission chambers in subcritical configurations. The ASM methodology uses relatively simple relationships between count rates of efficient miniature fission chambers located in slightly subcritical reference and perturbed configurations. While this method works quite well for small reactivity variations, the raw results need to be corrected to take into account the flux perturbation at the fission chamber location. This is performed by applying to the measurement a correction factor called MSM. This paper describes in detail both methodologies, with their associated uncertainties. Applications on absorber cluster worth in the MISTRAL-4 full MOX mock-up core and the last core loaded in MASURCA show the importance of the MSM correction on raw ASM data. (authors)
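The ASM estimate and its MSM correction can be sketched as simple count-rate arithmetic. This is a hedged illustration of the relationships described above (reactivity scaling with the inverse count-rate ratio, then multiplication by a correction factor); the numerical values and function names are made up:

```python
# Sketch of the ASM estimate and MSM correction for subcritical
# reactivity worth measurement from fission-chamber count rates.

def asm_reactivity(rho_ref, count_ref, count_pert):
    """Raw ASM estimate: reactivity scales inversely with the count rate."""
    return rho_ref * count_ref / count_pert

def msm_reactivity(rho_ref, count_ref, count_pert, msm_factor):
    """Correct the raw ASM estimate for the flux perturbation at the chamber."""
    return asm_reactivity(rho_ref, count_ref, count_pert) * msm_factor

rho_ref = -0.5   # known reference reactivity (slightly subcritical), in dollars
raw = asm_reactivity(rho_ref, count_ref=2.0e4, count_pert=5.0e3)
corrected = msm_reactivity(rho_ref, 2.0e4, 5.0e3, msm_factor=1.08)
```

The MSM factor itself comes from transport calculations of the detector response in the reference and perturbed configurations; the sketch only shows where it enters the measurement chain.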

  13. Identification of dust storm source areas in West Asia using multiple environmental datasets.

    Science.gov (United States)

    Cao, Hui; Amiraslani, Farshad; Liu, Jian; Zhou, Na

    2015-01-01

Sand and dust storms are common phenomena in arid and semi-arid areas. The West Asia region, especially the Tigris-Euphrates alluvial plain, has been recognized as one of the most important dust source areas in the world. In this paper, a method is applied to extract SDS (Sand and Dust Storm) sources in the West Asia region using thematic maps, climate and geography, the HYSPLIT model and satellite images. Of the 50 dust storms that occurred during 2000-2013 and were collected in the form of MODIS images, 27 events were used to demonstrate the trajectories simulated by the HYSPLIT model. Besides, a dataset of the newly released Landsat images was used as the base map for the interpretation of SDS source regions. As a result, six main clusters were recognized as dust source areas, of which three clusters situated in the Tigris-Euphrates plain were identified as severe SDS sources (including 70% of the dust storms in this research). Another cluster in the Sistan plain is also a potential source area. This approach also confirmed six main paths causing dust storms. These paths are driven by the climate system, including the Siberian and Polar anticyclones, the monsoon from the Indian Subcontinent and depressions from the north of Africa. The identification of SDS source areas and paths will improve our understanding of the mechanisms and impacts of dust storms on the socio-economy and environment of the region. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Start-up Characteristics of Swallow-tailed Axial-grooved Heat Pipe under the conditions of Multiple Heat Sources

    Science.gov (United States)

    Zhang, Renping

    2017-12-01

    A mathematical model was developed for predicting the start-up characteristics of a swallow-tailed axial-grooved heat pipe under the conditions of multiple heat sources. The effects of the heat capacitance of the heat source, liquid-vapour interfacial evaporation-condensation heat transfer, and shear stress at the interface were considered in the current model. The interfacial evaporating mass flow rate is based on kinetic analysis. Time variations of the evaporating mass rate, wall temperature and liquid velocity are studied from start-up to steady state. The calculated results show that the wall temperature exhibits a step transition at the junction between heated and unheated sections of the evaporator. The liquid velocity changes drastically in the heated evaporator sections, whereas it varies only slightly in the evaporator sections without a heat source. When the heat capacitance of the heat source is ignored, the predicted temperature shows a quicker response. With the capacitance of the heat source taken into account, the data obtained from the proposed model agree well with the experimental results.

  15. Multiple sources of metals of mineralization in Lower Cambrian black shales of South China: Evidence from geochemical and petrographic study

    Czech Academy of Sciences Publication Activity Database

    Pašava, J.; Kříbek, B.; Vymazalová, A.; Sýkorová, Ivana; Žák, Karel; Orberger, B.

    2008-01-01

    Roč. 58, č. 1 (2008), s. 25-42 ISSN 1344-1698 R&D Projects: GA AV ČR IAA300460510 Institutional research plan: CEZ:AV0Z30460519; CEZ:AV0Z30130516 Keywords: multiple source * Cambrian Ni-Mo-polymetallic black shale * SEDEX barite deposit Subject RIV: DD - Geochemistry Impact factor: 0.377, year: 2008

  16. Basalt generation at the Apollo 12 site. Part 2: Source heterogeneity, multiple melts, and crustal contamination

    Science.gov (United States)

    Neal, Clive R.; Hacker, Matthew D.; Snyder, Gregory A.; Taylor, Lawrence A.; Liu, Yun-Gang; Schmitt, Roman A.

    1994-01-01

    The petrogenesis of Apollo 12 mare basalts has been examined with emphasis on trace-element ratios and abundances. Vitrophyric basalts were used as parental compositions for the modeling, and proportions of fractionating phases were determined using the MAGFOX program of Longhi (1991). Crystal fractionation processes within crustal and sub-crustal magma chambers are evaluated as a function of pressure. Knowledge of the fractionating phases allows trace-element variations to be considered as either source related or as a product of post-magma-generation processes. For the ilmenite and olivine basalts, trace-element variations are inherited from the source, but the pigeonite basalt data have been interpreted in terms of open-system evolution through crustal assimilation. Three groups of basalts have been examined: (1) Pigeonite basalts-produced by the assimilation of lunar crustal material by a parental melt (up to 3% assimilation and 10% crystal fractionation, with an 'r' value of 0.3). (2) Ilmenite basalts-produced by variable degrees of partial melting (4-8%) of a source of olivine, pigeonite, augite, and plagioclase, brought together by overturn of the Lunar Magma Ocean (LMO) cumulate pile. After generation, which did not exhaust any of the minerals in the source, these melts experienced closed-system crystal fractionation/accumulation. (3) Olivine basalts-produced by variable degrees of partial melting (5-10%) of a source of olivine, pigeonite, and augite. After generation, again without exhausting any of the minerals in the source, these melts evolved through crystal accumulation. The evolved liquid counterparts of these cumulates have not been sampled. The source compositions for the ilmenite and olivine basalts were calculated by assuming that the vitrophyric compositions were primary and the magmas were produced by non-modal batch melting. 
Although the magnitude is unclear, evaluation of these source regions indicates that both are composed of early- and

  17. FormScanner: Open-Source Solution for Grading Multiple-Choice Exams

    Science.gov (United States)

    Young, Chadwick; Lo, Glenn; Young, Kaisa; Borsetta, Alberto

    2016-01-01

    The multiple-choice exam remains a staple for many introductory physics courses. In the past, people have graded these by hand or even with flaming needles. Today, one usually grades the exams with a form scanner that utilizes optical mark recognition (OMR). Several companies provide these scanners and particular forms, such as the eponymous…

  18. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression

    Science.gov (United States)

    Beckstead, Jason W.

    2012-01-01

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic…

  19. Using Multiple Sources of Information in Establishing Text Complexity. Reading Research Report. #11.03

    Science.gov (United States)

    Hiebert, Elfrieda H.

    2011-01-01

    A focus of the Common Core State Standards/English Language Arts (CCSS/ELA) is that students become increasingly more capable with complex text over their school careers. This focus has redirected attention to the measurement of text complexity. Although CCSS/ELA suggests multiple criteria for this task, the standards offer a single measure of…

  20. The Chandra Source Catalog : Automated Source Correlation

    Science.gov (United States)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed different numbers of times at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).
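    The positional matching described above can be illustrated with a toy cross-match. This is a hypothetical sketch, not the CSC pipeline itself: the catalog structure, the field names, and the 1-arcsecond match radius are all assumptions for illustration.

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions (haversine formula)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h)))

def correlate(catalog, detections, radius_deg=1.0 / 3600):
    """Match new detections to cataloged sources within radius_deg;
    unmatched detections become new master-source entries."""
    for det in detections:
        match = None
        for src in catalog:
            if ang_sep_deg(src["ra"], src["dec"], det["ra"], det["dec"]) < radius_deg:
                match = src
                break
        if match:
            match["n_obs"] += 1                      # update existing entry
        else:
            catalog.append({**det, "n_obs": 1})      # create a new master source
    return catalog
```

    A real pipeline must additionally handle the ambiguous cases the abstract mentions, e.g. one large detection overlapping several small ones, which a nearest-neighbour match alone cannot resolve.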

  1. Noise analysis of a white-light supercontinuum light source for multiple wavelength confocal laser scanning fluorescence microscopy

    Energy Technology Data Exchange (ETDEWEB)

    McConnell, Gail [Centre for Biophotonics, Strathclyde Institute for Biomedical Sciences, University of Strathclyde, 27 Taylor Street, Glasgow, G4 0NR (United Kingdom)

    2005-08-07

    Intensity correlations of a Ti:sapphire, Kr/Ar and a white-light supercontinuum were performed to quantify the typical signal amplitude fluctuations and hence ascertain the comparative output stability of the white-light supercontinuum source for confocal laser scanning microscopy (CLSM). Intensity correlations across a two-pixel sample (n = 1000) of up to 98%, 95% and 94% were measured for the Ti:sapphire, Kr/Ar and white-light supercontinuum source, respectively. The white-light supercontinuum noise level is therefore acceptable for CLSM, with the added advantage of wider wavelength flexibility over traditional CLSM excitation sources. The relatively low-noise white-light supercontinuum was then used to perform multiple wavelength sequential CLSM of guinea pig detrusor to confirm the reliability of the system and to demonstrate system flexibility.

  2. Optimizing the regimes of the Advanced LIGO gravitational wave detector for multiple source types

    International Nuclear Information System (INIS)

    Kondrashov, I. S.; Simakov, D. A.; Khalili, F. Ya.; Danilishin, S. L.

    2008-01-01

    We developed algorithms which allow us to find regimes of the signal-recycled Fabry-Perot-Michelson interferometer [for example, the Advanced Laser Interferometric Gravitational Wave Observatory (LIGO)], optimized concurrently for two (binary inspirals + bursts) and three (binary inspirals + bursts + millisecond pulsars) types of gravitational wave sources. We show that there exists a relatively large area in the interferometer parameters space where the detector sensitivity to the first two kinds of sources differs only by a few percent from the maximal ones for each kind of source. In particular, there exists a specific regime where this difference is ≅0.5% for both of them. Furthermore, we show that even more multipurpose regimes are also possible that provide significant sensitivity gain for millisecond pulsars with only minor sensitivity degradation for binary inspirals and bursts.

  3. Multiple human schemas and the communication-information sources use: An application of Q-methodology

    Directory of Open Access Journals (Sweden)

    Mansour Shahvali

    2014-12-01

    Full Text Available This study was conducted with the aim of developing a communication and information model for greenhouse farmers in Yazd city using schema theory. Applying the Q methodology together with factor analysis, the different variables were loaded onto five schematic factors: the human philosophical nature and ideological, economic, social, and environmental-conservation beliefs. Analysis in AMOS also revealed that the philosophical, ideological, social, economic and environmental schemas directly influence the use of personal communication-information sources. Furthermore, the environmental-conservation schema affects the use of personal communication-information sources both directly and indirectly. More importantly, this study indicated the important role that indigenous sources play in constructing, evaluating and retrieving environmental knowledge among the respondents. The research provides a suitable context for policymakers who seek to draw up more effective and appropriate communication and information strategies to address the needs of specific target groups.

  4. Ionizing radiation sources: very diversified means, multiple applications and a changing regulatory environment. Conference proceedings

    International Nuclear Information System (INIS)

    2011-11-01

    This document brings together the available presentations given at the conference organised by the French society of radiation protection about ionizing radiation source means, applications and regulatory environment. Twenty eight presentations (slides) are compiled in this document and deal with: 1 - Overview of sources - some quantitative data from the national inventory of ionizing radiation sources (Yann Billarand, IRSN); 2 - Overview of sources (Jerome Fradin, ASN); 3 - Regulatory framework (Sylvie Rodde, ASN); 4 - Alternatives to Iridium radiography - the case of pressure devices at the manufacturing stage (Henri Walaszek, Cetim; Bruno Kowalski, Welding Institute); 5 - Dosimetric stakes of medical scanner examinations (Jean-Louis Greffe, Charleroi hospital of Medical University); 6 - The removal of ionic smoke detectors (Bruno Charpentier, ASN); 7 - Joint-activity and reciprocal liabilities - Organisation of labour risk prevention in case of companies joint-activity (Paulo Pinto, DGT); 8 - Consideration of gamma-graphic testing in the organization of a unit outage activities (Jean-Gabriel Leonard, EDF); 9 - Radiological risk control at a closed and independent work field (Stephane Sartelet, Areva); 10 - Incidents and accidents status and typology (Pascale Scanff, IRSN); 11 - Regional overview of radiation protection significant events (Philippe Menechal, ASN); 12 - Incident leading to a tritium contamination in and urban area - consequences and experience feedback (Laurence Fusil, CEA); 13 - Experience feedback - loss of sealing of a calibration source (Philippe Mougnard, Areva); 14 - Blocking incident of a 60 Co source (Bruno Delille, Salvarem); 15 - Triggering of gantry's alarm: status of findings (Philippe Prat, Syctom); 16 - Non-medical electric devices: regulatory changes (Sophie Dagois, IRSN; Jerome Fradin, ASN); 17 - Evaluation of the dose equivalent rate in pulsed fields: method proposed by the IRSN and implementation test (Laurent Donadille, IRSN

  5. Multiple information sources and consequences of conflicting information about medicine use during pregnancy: a multinational Internet-based survey.

    Science.gov (United States)

    Hämeen-Anttila, Katri; Nordeng, Hedvig; Kokki, Esa; Jyrkkä, Johanna; Lupattelli, Angela; Vainio, Kirsti; Enlund, Hannes

    2014-02-20

    A wide variety of information sources on medicines is available for pregnant women. When using multiple information sources, there is the risk that information will vary or even conflict. The objective of this multinational study was to analyze the extent to which pregnant women use multiple information sources and the consequences of conflicting information, and to investigate which maternal sociodemographic, lifestyle, and medical factors were associated with these outcomes. An anonymous Internet-based questionnaire was made accessible during a period of 2 months, on 1 to 4 Internet websites used by pregnant women in 5 regions (Eastern Europe, Western Europe, Northern Europe, Americas, Australia). A total of 7092 responses were obtained (n=5090 pregnant women; n=2002 women with a child younger than 25 weeks). Descriptive statistics and logistic regression analysis were used. Of the respondents who stated that they needed information, 16.16% (655/4054) used one information source and 83.69% (3393/4054) used multiple information sources. Of respondents who used more than one information source, 22.62% (759/3355) stated that the information was conflicting. According to multivariate logistic regression analysis, factors significantly associated with experiencing conflict in medicine information included being a mother (OR 1.32, 95% CI 1.11-1.58), having university (OR 1.33, 95% CI 1.09-1.63) or other education (OR 1.49, 95% CI 1.09-2.03), residing in Eastern Europe (OR 1.52, 95% CI 1.22-1.89) or Australia (OR 2.28, 95% CI 1.42-3.67), use of 3 (OR 1.29, 95% CI 1.04-1.60) or >4 information sources (OR 1.82, 95% CI 1.49-2.23), and having ≥2 chronic diseases (OR 1.49, 95% CI 1.18-1.89). Because of conflicting information, 43.61% (331/759) decided not to use medication during pregnancy, 30.30% (230/759) sought a new information source, 32.67% (248/759) chose to rely on one source and ignore the conflicting one, 25.03% (190/759) became anxious, and 2.64% (20/759) did

  6. Research on numerical method for multiple pollution source discharge and optimal reduction program

    Science.gov (United States)

    Li, Mingchang; Dai, Mingxin; Zhou, Bin; Zou, Bin

    2018-03-01

    In this paper, an optimal method for designing a pollutant discharge reduction program is proposed, based on a nonlinear optimization algorithm, namely the genetic algorithm. The four main rivers in Jiangsu Province, China, are selected with the aim of reducing environmental pollution in the nearshore district. Dissolved inorganic nitrogen (DIN) is studied as the only pollutant. The environmental status and water quality standard in the nearshore district are used to constrain the reduction of discharges from the multiple river pollution sources. The results of the reduction program provide a basis for marine environmental management.
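    A genetic-algorithm formulation of such a reduction program can be sketched as follows. All numbers are invented for illustration (the river loads, response coefficients, and DIN standard are not taken from the paper); the sketch minimizes the total load removed subject to a water-quality constraint enforced by a penalty term.

```python
import random

random.seed(42)

# Hypothetical data: current DIN loads from four rivers (t/yr) and linear
# response coefficients mapping each load to nearshore concentration (mg/L per t/yr).
LOADS = [1200.0, 900.0, 600.0, 400.0]
RESP = [4e-4, 5e-4, 3e-4, 6e-4]
STANDARD = 0.8          # assumed DIN water-quality standard (mg/L)

def concentration(cuts):
    """Nearshore DIN concentration after applying reduction fractions `cuts`."""
    return sum(r * L * (1 - c) for r, L, c in zip(RESP, LOADS, cuts))

def fitness(cuts):
    """Total load removed, plus a heavy penalty if the standard is violated."""
    removed = sum(L * c for L, c in zip(LOADS, cuts))
    penalty = 1e6 * max(0.0, concentration(cuts) - STANDARD)
    return removed + penalty

def evolve(pop_size=60, generations=200, mut=0.1):
    pop = [[random.random() for _ in LOADS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(LOADS))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:              # small random mutation
                i = random.randrange(len(child))
                child[i] = min(1.0, max(0.0, child[i] + random.uniform(-0.1, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()   # reduction fraction per river, meeting the assumed standard
```

    The penalty weight makes any solution violating the standard strictly worse than a compliant one, so the search converges toward the cheapest compliant allocation of cuts across the four rivers.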

  7. How organic carbon derived from multiple sources contributes to carbon sequestration processes in a shallow coastal system?

    Science.gov (United States)

    Watanabe, Kenta; Kuwae, Tomohiro

    2015-04-16

    Carbon captured by marine organisms helps sequester atmospheric CO2, especially in shallow coastal ecosystems, where rates of primary production and burial of organic carbon (OC) from multiple sources are high. However, linkages between the dynamics of OC derived from multiple sources and carbon sequestration are poorly understood. We investigated the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of particulate OC (POC) and dissolved OC (DOC) in the water column and sedimentary OC using elemental, isotopic, and optical signatures in Furen Lagoon, Japan. Based on these data, we explored how OC from multiple sources contributes to sequestration via storage in sediments, water column sequestration, and air-sea CO2 exchanges, and analyzed how the contributions vary with salinity in a shallow seagrass meadow. The relative contribution of terrestrial POC in the water column decreased with increasing salinity, whereas autochthonous POC increased in the salinity range 10-30. Phytoplankton-derived POC dominated the water column POC (65-95%) within this salinity range; however, it was minor in the sediments (3-29%). In contrast, terrestrial and phytobenthos-derived POC were relatively minor contributors in the water column but were major contributors in the sediments (49-78% and 19-36%, respectively), indicating that terrestrial and phytobenthos-derived POC were selectively stored in the sediments. Autochthonous DOC, part of which can contribute to long-term carbon sequestration in the water column, accounted for >25% of the total water column DOC pool in the salinity range 15-30. Autochthonous OC production decreased the concentration of dissolved inorganic carbon in the water column and thereby contributed to atmospheric CO2 uptake, except in the low-salinity zone. Our results indicate that shallow coastal ecosystems function not only as transition zones between land and ocean but also as carbon sequestration filters. They

  8. Trends in Solar energy Driven Vertical Ground Source Heat Pump Systems in Sweden - An Analysis Based on the Swedish Well Database

    Science.gov (United States)

    Juhlin, K.; Gehlin, S.

    2016-12-01

    Sweden is a world leader in developing and using vertical ground source heat pump (GSHP) technology. GSHP systems extract passively stored solar energy in the ground and the Earth's natural geothermal energy. Geothermal energy has been an acknowledged renewable energy source in Sweden since 2007 and is the third largest renewable energy source in the country today. The Geological Survey of Sweden (SGU) is the authority in Sweden that provides open access geological data on rock, soil and groundwater for the public. All wells drilled must be registered in the SGU Well Database, and it is the well driller's duty to submit registration of drilled wells. Both active and passive geothermal energy systems are in use. Large GSHP systems, with at least 20 boreholes, are active geothermal energy systems. Energy is stored in the ground, which allows both comfort heating and cooling to be extracted. Active systems are therefore relevant for larger properties and industrial buildings. Since 1978 more than 600 000 wells (water wells, GSHP boreholes, etc.) have been registered in the Well Database, with around 20 000 new registrations per year. Of these, an estimated 320 000 wells are registered as GSHP boreholes. The vast majority of these boreholes are single boreholes for single-family houses. The number of properties with registered vertical borehole GSHP installations amounts to approximately 243 000. Of these sites, between 300 and 350 are large GSHP systems with at least 20 boreholes. While the number of new registrations for smaller homes and households has slowed after the rapid development in the 80's and 90's, the larger installations for commercial and industrial buildings have increased in number over the last ten years. This poster uses data from the SGU Well Database to quantify and analyze the trends in vertical GSHP systems reported between 1978 and 2015 in Sweden, with special focus on large systems. 
From the new aggregated data, conclusions can be drawn about

  9. From Big Data to Smart Data for Pharmacovigilance: The Role of Healthcare Databases and Other Emerging Sources.

    Science.gov (United States)

    Trifirò, Gianluca; Sultana, Janet; Bate, Andrew

    2018-02-01

    In the last decade 'big data' has become a buzzword used in several industrial sectors, including but not limited to telephony, finance and healthcare. Despite its popularity, it is not always clear what big data refers to exactly. Big data has become a very popular topic in healthcare, where the term primarily refers to the vast and growing volumes of computerized medical information available in the form of electronic health records, administrative or health claims data, disease and drug monitoring registries and so on. This kind of data is generally collected routinely during administrative processes and clinical practice by different healthcare professionals: from doctors recording their patients' medical history, drug prescriptions or medical claims to pharmacists registering dispensed prescriptions. For a long time, this data accumulated without its value being fully recognized and leveraged. Today big data has an important place in healthcare, including in pharmacovigilance. The expanding role of big data in pharmacovigilance includes signal detection, substantiation and validation of drug or vaccine safety signals, and increasingly new sources of information such as social media are also being considered. The aim of the present paper is to discuss the uses of big data for drug safety post-marketing assessment.

  10. Memory for Textual Conflicts Predicts Sourcing When Adolescents Read Multiple Expository Texts

    Science.gov (United States)

    Stang Lund, Elisabeth; Bråten, Ivar; Brante, Eva W.; Strømsø, Helge I.

    2017-01-01

    This study investigated whether memory for conflicting information predicted mental representation of source-content links (i.e., who said what) in a sample of 86 Norwegian adolescent readers. Participants read four texts presenting conflicting claims about sun exposure and health. With differences in gender, prior knowledge, and interest…

  11. Recent performances of the multiple charged heavy-ion source - triple mafios

    International Nuclear Information System (INIS)

    Briand, P.; Chan-tung, N.; Geller, R.; Jacquot, B.

    1977-01-01

    The principle and the characteristics of the ion source are described. We also furnish up-to-date performance figures concerning ion currents and global emittances of the beam, as well as the emittances of Ar1+ to Ar10+ in the radial and axial planes. (orig./WL) [de

  12. The GLIMS Glacier Database

    Science.gov (United States)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER imagery or glacier outlines from 2002 only, or from Autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), Map
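    Because the service is an OGC-compliant WMS, any client can request glacier layers with a standard GetMap URL. A minimal sketch in Python: the endpoint path and layer name below are hypothetical, while the parameter set follows the WMS 1.1.1 specification.

```python
from urllib.parse import urlencode

# Hypothetical endpoint and layer name; the actual GLIMS service defines its own.
WMS_BASE = "http://www.glims.org/mapservice"

def getmap_url(layers, bbox, width=800, height=600, srs="EPSG:4326"):
    """Build an OGC WMS 1.1.1 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy in SRS units
        "SRS": srs,
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return WMS_BASE + "?" + urlencode(params)

# Request a world map of the (hypothetical) glacier outlines layer.
url = getmap_url(["glacier_outlines"], (-180, -90, 180, 90))
```

    The same pattern with `REQUEST=GetFeature` against the WFS endpoint would return the vector features the abstract mentions, rather than a rendered image.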

  13. Evaluation of Personal and Built Environment Attributes to Physical Activity: A Multilevel Analysis on Multiple Population-Based Data Sources

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2012-01-01

    Full Text Available Background. Studies have documented that built environment factors potentially promote or impede leisure time physical activity (LTPA). This study explored the relationship between multiple built environment factors and individual characteristics on LTPA. Methods. Multiple data sources were utilized, including individual-level data on health behaviors and health status from the Nevada Behavioral Risk Factor Surveillance System (BRFSS) and community-level data from different data sources, including indicators for recreation facilities, safety, air quality, commute time, urbanization, population density, and land mix level. Mixed-model logistic regression and geographic information system (GIS) spatial analysis were conducted. Results. Among 6,311 respondents, 24.4% reported no LTPA engagement during the past 30 days. No engagement in LTPA was significantly associated with (1) individual factors: older age, less education, lower income, being obese, and low life satisfaction, and (2) community factors: longer commute time, higher crime rate, urban residence, and higher population density, but not with density of or distance to recreation facilities, air quality, or land mix. Conclusions. Multiple data systems, including complex population surveys and spatial analysis, are valuable tools for studies of health and the built environment.

  14. Pesticide pollution of multiple drinking water sources in the Mekong Delta, Vietnam: evidence from two provinces.

    Science.gov (United States)

    Chau, N D G; Sebesvari, Z; Amelung, W; Renaud, F G

    2015-06-01

    Pollution of drinking water sources with agrochemicals is often a major threat to human and ecosystem health in some river deltas, where agricultural production must meet the requirements of national food security or export aspirations. This study was performed to survey the use of different drinking water sources and their pollution with pesticides in order to document potential sources of pesticide exposure in rural areas of the Mekong River delta, Vietnam. The field work comprised both household surveys and monitoring of 15 frequently used pesticide active ingredients in different water sources used for drinking (surface water, groundwater, water at public pumping stations, surface water chemically treated at household level, harvested rainwater, and bottled water). Our research also considered the surrounding land use systems as well as the cropping seasons. Improper pesticide storage and waste disposal as well as inadequate personal protection during pesticide handling and application were widespread amongst the interviewed households, with little overall risk awareness for human and environmental health. The results show that despite the local differences in the amount and frequency of pesticides applied, pesticide pollution was ubiquitous. Isoprothiolane (max. concentration 8.49 μg L⁻¹), fenobucarb (max. 2.32 μg L⁻¹), and fipronil (max. 0.41 μg L⁻¹) were detected in almost all analyzed water samples (98% of all surface samples contained isoprothiolane, for instance). Other pesticides quantified comprised butachlor, pretilachlor, propiconazole, hexaconazole, difenoconazole, cypermethrin, fenoxaprop-P-ethyl, tebuconazole, trifloxystrobin, azoxystrobin, quinalphos, and thiamethoxam. Among the studied water sources, concentrations were highest in canal waters. Pesticide concentrations varied with cropping season but did not diminish through the year. Even in harvested rainwater or purchased bottled water, up to 12 different pesticides were detected at

  15. Locating non-volcanic tremor along the San Andreas Fault using a multiple array source imaging technique

    Science.gov (United States)

    Ryberg, T.; Haberland, C.H.; Fuis, G.S.; Ellsworth, W.L.; Shelly, D.R.

    2010-01-01

    Non-volcanic tremor (NVT) has been observed at several subduction zones and at the San Andreas Fault (SAF). Tremor locations are commonly derived by cross-correlating envelope-transformed seismic traces in combination with source-scanning techniques. Recently, they have also been located by using relative relocations with master events, that is, low-frequency earthquakes that are part of the tremor; locations are derived by conventional traveltime-based methods. Here we present a method to locate the sources of NVT using an imaging approach for multiple array data. The performance of the method is checked with synthetic tests and the relocation of earthquakes. We also applied the method to tremor occurring near Cholame, California. A set of small-aperture arrays (i.e., an array consisting of arrays) installed around Cholame provided the data set for this study. We observed several tremor episodes and located tremor sources in the vicinity of the SAF. During individual tremor episodes, we observed a systematic change of source location, indicating rapid migration of the tremor source along the SAF. © 2010 The Authors, Geophysical Journal International © 2010 RAS.

  16. Integrated Tsunami Database: simulation and identification of seismic tsunami sources, 3D visualization and post-disaster assessment on the shore

    Science.gov (United States)

    Krivorot'ko, Olga; Kabanikhin, Sergey; Marinin, Igor; Karas, Adel; Khidasheli, David

    2013-04-01

    One of the most important problems in tsunami investigation is the reconstruction of the seismic tsunami source. The non-profit organization WAPMERR (http://wapmerr.org) has provided a historical database of presumed tsunami sources around the world, obtained with the help of information about seaquakes. WAPMERR also has a database of observations of tsunami waves in coastal areas. The main idea of the presentation consists of determining the tsunami source parameters using seismic data and observations of the tsunami waves on the shore, and of expanding and refining the database of presupposed tsunami sources for operative and accurate prediction of hazards and assessment of risks and consequences. We also present 3D visualization of real-time tsunami wave propagation and loss assessment, characterizing the nature of the building stock in cities at risk, and monitoring by satellite images using the modern GIS technology ITRIS (Integrated Tsunami Research and Information System) developed by WAPMERR and Informap Ltd. The special scientific plug-in components are embedded in a specially developed GIS-type graphic shell for easy data retrieval, visualization and processing. The most suitable physical models for the simulation of tsunamis are based on the shallow water equations. We consider the initial-boundary value problem in Ω := {(x, y) ∈ R² : x ∈ (0, Lx), y ∈ (0, Ly), Lx, Ly > 0} for the well-known linear shallow water equations in the Cartesian coordinate system, written in terms of the liquid flow components in dimensional form: ∂η/∂t + ∂u/∂x + ∂v/∂y = 0, ∂u/∂t + gH ∂η/∂x = 0, ∂v/∂t + gH ∂η/∂y = 0, with η(x, y, 0) = q(x, y). Here η(x, y, t) defines the free water surface vertical displacement, i.e. the amplitude of a tsunami wave, q(x, y) is the initial amplitude of a tsunami wave, H(x, y) is the water depth and g is the gravitational acceleration. The lateral boundary is assumed to be a non-reflecting boundary of the domain, that is, it allows the free passage of the propagating waves. Assume that the free surface oscillation data at points (xm, ym) are given as measured output data from tsunami records: fm(t) := η(xm, ym, t), (xm
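    A minimal numerical sketch of the linear shallow water model described above, assuming a uniform depth H, a Gaussian initial displacement q(x, y), and a simple symplectic-Euler time step on a collocated grid; real tsunami codes use staggered grids and the non-reflecting boundaries mentioned in the abstract.

```python
import numpy as np

g = 9.81
H = 4000.0                         # assumed uniform depth (m)
nx = ny = 64
dx = dy = 5000.0                   # grid spacing (m)
dt = 0.5 * dx / np.sqrt(g * H)     # CFL-limited time step

# Initial free-surface displacement q(x, y): a Gaussian hump as the source.
x = np.arange(nx) * dx
y = np.arange(ny) * dy
X, Y = np.meshgrid(x, y, indexing="ij")
eta = np.exp(-((X - x.mean()) ** 2 + (Y - y.mean()) ** 2) / (2 * (10 * dx) ** 2))
u = np.zeros((nx, ny))             # depth-integrated flow components
v = np.zeros((nx, ny))

def step(eta, u, v):
    """One step of eta_t = -(u_x + v_y), u_t = -gH eta_x, v_t = -gH eta_y,
    updating the flow first (symplectic Euler) for stability."""
    u_new = u - dt * g * H * np.gradient(eta, dx, axis=0)
    v_new = v - dt * g * H * np.gradient(eta, dy, axis=1)
    eta_new = eta - dt * (np.gradient(u_new, dx, axis=0)
                          + np.gradient(v_new, dy, axis=1))
    return eta_new, u_new, v_new

for _ in range(50):                # propagate the wave for 50 time steps
    eta, u, v = step(eta, u, v)
```

    The forward problem above is what the source-reconstruction task inverts: given the records fm(t) at shore points, recover q(x, y).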

  17. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    International Nuclear Information System (INIS)

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)
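    The time-clustering idea described above can be illustrated with a toy sketch: scan candidate time windows bounded by event times and rank them by a Poisson log-likelihood ratio of observed counts versus the background expectation. This is only a hedged illustration of the clustering step under an assumed constant background rate, not the published unbinned-likelihood method; all event times and rates below are made up.

```python
import math
from itertools import combinations

def best_flare_windows(times, rate_bg, top_k=3):
    """Rank candidate flare windows [ti, tj] by a Poisson
    log-likelihood ratio of observed events vs. background."""
    results = []
    for ti, tj in combinations(sorted(times), 2):
        dt = tj - ti
        n_obs = sum(ti <= t <= tj for t in times)
        n_bg = rate_bg * dt  # expected background counts in the window
        if n_obs > n_bg > 0:
            # log-likelihood ratio for a Poisson excess over background
            llr = n_obs * math.log(n_obs / n_bg) - (n_obs - n_bg)
            results.append((llr, ti, tj))
    return sorted(results, reverse=True)[:top_k]

# two short clusters of events plus isolated background-like events
events = [0.1, 0.12, 0.13, 0.15, 2.0, 5.5, 5.52, 5.53, 9.0]
print(best_flare_windows(events, rate_bg=0.9))
```

    The method in the abstract goes further by summing the contributions of many such small clusters in a single likelihood term, which is what makes it sensitive to an overall excess spread over several individually undetectable flares.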

  18. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Energy Technology Data Exchange (ETDEWEB)

    Gora, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institute of Nuclear Physics PAN, Cracow (Poland); Bernardini, E.; Cruz Silva, A.H. [Institute of Nuclear Physics PAN, Cracow (Poland)

    2011-04-15

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  19. Using multiple isotopes to understand the source of ingredients used in golden beverages

    Science.gov (United States)

    Wynn, J. G.

    2011-12-01

    Traditionally, beer contains four simple ingredients: water, barley, hops and yeast. Each of the ingredients used in the brewing process contributes some combination of a number of "traditional" stable isotopes (i.e., isotopes of H, C, O, N and S) to the final product. As an educational exercise in an "Analytical Techniques in Geology" course, a group of students analyzed the isotopic composition of the gas, liquid and solid phases of a variety of beer samples (along with some other beverages) collected from throughout the world. The hydrogen and oxygen isotopic composition of the water closely followed the isotopic composition of local meteoric water at the source of the brewery, although there is a systematic offset from the global meteoric water line that may be due to the effects of CO2-H2O equilibration. The carbon isotopic composition of the CO2 reflected that of the solid residue (the source of carbon used as a fermentation substrate), but may potentially be modified by the addition of gas-phase CO2 from an inorganic source. The carbon isotopic composition of the solid residue similarly tracks that of the fermentation substrate, and may indicate some alcohol fermented from added sugars in some cases. The nitrogen isotopic composition of the solid residue was relatively constant, and may track the source of nitrogen in the barley, hops and yeast. Each of the analytical methods used is a relatively standard technique in geological applications, making this a "fun" exercise for those involved and giving the students hands-on experience with a variety of analytes from a non-traditional sample material.

  20. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    OpenAIRE

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-01-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This...

  1. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    OpenAIRE

    M. Wronna; R. Omira; M. A. Baptista

    2015-01-01

    In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines – Portugal, one of the test-sites of project ASTARTE. Sines holds one of the most important deep-water ports which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructures are facing the ocean southwest towards the main seismogenic sources. This work considers two different seis...

  2. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
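    The correction for measurement error that the abstract refers to can be sketched with the classical disattenuation formula, with G-theory supplying reliability (generalizability) coefficients that account for random-response, specific-factor, and transient error rather than a single error source. The numbers below are hypothetical, chosen only to show the calculation.

```python
def disattenuated_correlation(r_xy, rel_x, rel_y):
    """Correct an observed correlation for measurement error using
    the classical disattenuation formula r_xy / sqrt(rel_x * rel_y).
    rel_x and rel_y are reliability (or G-theory generalizability)
    coefficients for the two measures."""
    return r_xy / (rel_x * rel_y) ** 0.5

# hypothetical observed r = 0.42 between two scales whose
# G-coefficients are 0.80 and 0.70
print(round(disattenuated_correlation(0.42, 0.80, 0.70), 3))
```

    The abstract's point is that if rel_x and rel_y omit key error sources (e.g. transient error), they are too high, the correction is too small, and the true relationship between constructs is understated.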

  3. Identification of Multiple Subtypes of Campylobacter jejuni in Chicken Meat and the Impact on Source Attribution

    Directory of Open Access Journals (Sweden)

    John A. Hudson

    2013-09-01

    Full Text Available Most source attribution studies for Campylobacter use subtyping data based on single isolates from foods and environmental sources in an attempt to draw epidemiological inferences. It has been suggested that subtyping only one Campylobacter isolate per chicken carcass incurs a risk of failing to recognise the presence of clinically relevant, but numerically infrequent, subtypes. To investigate this, between 21 and 25 Campylobacter jejuni isolates from each of ten retail chicken carcasses were subtyped by pulsed-field gel electrophoresis (PFGE using the two restriction enzymes SmaI and KpnI. Among the 227 isolates, thirteen subtypes were identified, the most frequently occurring subtype being isolated from three carcasses. Six carcasses carried a single subtype, three carcasses carried two subtypes each and one carcass carried three subtypes. Some subtypes carried by an individual carcass were shown to be potentially clonally related. Comparison of C. jejuni subtypes from chickens with isolate subtypes from human clinical cases (n = 1248 revealed seven of the thirteen chicken subtypes were indistinguishable from human cases. None of the numerically minor chicken subtypes were identified in the human data. Therefore, typing only one Campylobacter isolate from individual chicken carcasses may be adequate to inform Campylobacter source attribution.

  4. The development of software and formation of a database on the main sources of environmental contamination in areas around nuclear power plants

    International Nuclear Information System (INIS)

    Palitskaya, T.A.; Novikov, A.V.; Makeicheva, M.A.; Ivanov, E.A.

    2004-01-01

    Providing environmental safety control during nuclear power plant (NPP) operation, environmental protection and rational use of natural resources is one of the most important tasks of the Rosenergoatom Concern. To ensure environmental safety, trustworthy, complete and timely information is needed on the availability and condition of natural resources and on the quality and contamination level of the natural environment. Industrial environmental monitoring allows obtaining, processing and evaluating data for making environmentally acceptable and economically efficient decisions. The industrial environmental monitoring system at NPPs is formed taking into account both radiation and non-radiation factors of impact. Data on non-radiation factors of the NPP impact are provided by a complex of special observations carried out by NPPs' environment protection services. The gained information is transmitted to the Rosenergoatom Concern and input to a database of the Environment Protection Division of the Concern Department of Radiation Safety, Environment Protection and Nuclear Materials Accounting. The database on the main sources of environmental contamination in the areas around NPPs will provide a high level of environmental control authenticity and maintenance of the set standards, as well as automation of the most labor-consuming and frequently repeated types of operations. The applied software is being developed by specialists from the All-Russia Research Institute of Nuclear Power Plants on the basis of the database management system Microsoft SQL Server using VBA and Microsoft Access. The data will be transmitted through open communication channels. Geo-referenced digital mapping information, based on ArcGIS and MapInfo, will be the main form of output data presentation. The Federal authority bodies, their regional units and the Concern's sub-divisions involved in the environmental protection activities will be the database

  5. The development of software and formation of a database on the main sources of environmental contamination in areas around nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Palitskaya, T.A.; Novikov, A.V. [Rosenergoatom Concern, Moscow (Russian Federation); Makeicheva, M.A.; Ivanov, E.A. [All-Russia Research Institute of Nuclear Power Plants, Moscow (Russian Federation)

    2004-07-01

    Providing environmental safety control during nuclear power plant (NPP) operation, environmental protection and rational use of natural resources is one of the most important tasks of the Rosenergoatom Concern. To ensure environmental safety, trustworthy, complete and timely information is needed on the availability and condition of natural resources and on the quality and contamination level of the natural environment. Industrial environmental monitoring allows obtaining, processing and evaluating data for making environmentally acceptable and economically efficient decisions. The industrial environmental monitoring system at NPPs is formed taking into account both radiation and non-radiation factors of impact. Data on non-radiation factors of the NPP impact are provided by a complex of special observations carried out by NPPs' environment protection services. The gained information is transmitted to the Rosenergoatom Concern and input to a database of the Environment Protection Division of the Concern Department of Radiation Safety, Environment Protection and Nuclear Materials Accounting. The database on the main sources of environmental contamination in the areas around NPPs will provide a high level of environmental control authenticity and maintenance of the set standards, as well as automation of the most labor-consuming and frequently repeated types of operations. The applied software is being developed by specialists from the All-Russia Research Institute of Nuclear Power Plants on the basis of the database management system Microsoft SQL Server using VBA and Microsoft Access. The data will be transmitted through open communication channels. Geo-referenced digital mapping information, based on ArcGIS and MapInfo, will be the main form of output data presentation. The Federal authority bodies, their regional units and the Concern's sub-divisions involved in the environmental protection activities will be the

  6. Multiple sources driving the organic matter dynamics in two contrasting tropical mangroves

    International Nuclear Information System (INIS)

    Ray, R.; Shahraki, M.

    2016-01-01

    In this study, we have selected two different mangroves based on their geological, hydrological and climatological variations to investigate the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of dissolved organic carbon (DOC) and particulate organic carbon (POC) in the water column and of the sedimentary OC, using elemental ratios and stable isotopes. Qeshm Island, representing the Iranian mangroves, received no attention before this study in terms of DOC and POC biogeochemistry and their sources, unlike the Sundarbans (Indian side), the world's largest mangrove system. Slightly higher DOC concentrations in the Iranian mangroves were recorded in our field campaigns between 2011 and 2014, compared to the Sundarbans (315 ± 25 μM vs. 278 ± 42 μM), owing to the longer water residence times, while a 9–10 times greater POC concentration (303 ± 37 μM, n = 82) was linked to both suspended load (345 ± 104 mg L−1) and high algal production. A yearlong phytoplankton bloom in the mangrove-lined Persian Gulf was reported to be the perennial source of both POC and DOC, contributing 80–86% to the DOC and 90–98% to the POC pool. In the Sundarbans, by contrast, riverine input contributed 50–58% to the DOC pool and POC composition was regulated by the seasonal litter fall, river discharge and phytoplankton production. Algal derived organic matter (microphytobenthos) represented the maximum contribution (70–76%) to the sedimentary OC at Qeshm Island, while mangrove leaf litter dominated the OC pool in the Indian Sundarbans. Finally, hydrographical settings (i.e. riverine transport) appeared to be the determinant factor in differentiating OM sources in the water column between the dry and wet mangroves. - Highlights: • Sources of OC have been identified and compared between two contrasting mangroves. • Phytoplankton dominated the DOC and POC pool in the Iranian mangroves. • River input contributed half of the total DOC and part of POC in the Indian

  7. Multiple sources driving the organic matter dynamics in two contrasting tropical mangroves

    Energy Technology Data Exchange (ETDEWEB)

    Ray, R., E-mail: raghab.ray@gmail.com [Institut Universitaire Européen de la Mer, UBO, UMR 6539 LEMAR, rue Dumont dUrville, 29280 Plouzane (France); Leibniz Center for Tropical Marine Ecology, Fahrenheitstr. 6, 28359 Bremen (Germany); Shahraki, M. [Leibniz Center for Tropical Marine Ecology, Fahrenheitstr. 6, 28359 Bremen (Germany); Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, Am Handelshafen 12, 27570 Bremerhaven (Germany)

    2016-11-15

    In this study, we have selected two different mangroves based on their geological, hydrological and climatological variations to investigate the origin (terrestrial, phytobenthos derived, and phytoplankton derived) of dissolved organic carbon (DOC) and particulate organic carbon (POC) in the water column and of the sedimentary OC, using elemental ratios and stable isotopes. Qeshm Island, representing the Iranian mangroves, received no attention before this study in terms of DOC and POC biogeochemistry and their sources, unlike the Sundarbans (Indian side), the world's largest mangrove system. Slightly higher DOC concentrations in the Iranian mangroves were recorded in our field campaigns between 2011 and 2014, compared to the Sundarbans (315 ± 25 μM vs. 278 ± 42 μM), owing to the longer water residence times, while a 9–10 times greater POC concentration (303 ± 37 μM, n = 82) was linked to both suspended load (345 ± 104 mg L−1) and high algal production. A yearlong phytoplankton bloom in the mangrove-lined Persian Gulf was reported to be the perennial source of both POC and DOC, contributing 80–86% to the DOC and 90–98% to the POC pool. In the Sundarbans, by contrast, riverine input contributed 50–58% to the DOC pool and POC composition was regulated by the seasonal litter fall, river discharge and phytoplankton production. Algal derived organic matter (microphytobenthos) represented the maximum contribution (70–76%) to the sedimentary OC at Qeshm Island, while mangrove leaf litter dominated the OC pool in the Indian Sundarbans. Finally, hydrographical settings (i.e. riverine transport) appeared to be the determinant factor in differentiating OM sources in the water column between the dry and wet mangroves. - Highlights: • Sources of OC have been identified and compared between two contrasting mangroves. • Phytoplankton dominated the DOC and POC pool in the Iranian mangroves. • River input contributed half of the total DOC and part of POC in

  8. A modeling study of saltwater intrusion in the Andarax delta area using multiple data sources

    DEFF Research Database (Denmark)

    Antonsson, Arni Valur; Engesgaard, Peter Knudegaard; Jorreto, Sara

    In groundwater model development, construction of the conceptual model is one of the (initial and) critical aspects that determines the model reliability and applicability in terms of e.g. system (hydrogeological) understanding, groundwater quality predictions, and general use in water resources … context. The validity of a conceptual model is determined by different factors, where both data quantity and quality are of crucial importance. Often, when dealing with saltwater intrusion, data is limited. Therefore, using different sources (and types) of data can be beneficial and increase…

  9. Estimation of Multiple Point Sources for Linear Fractional Order Systems Using Modulating Functions

    KAUST Repository

    Belkhatir, Zehor

    2017-06-28

    This paper proposes an estimation algorithm for the characterization of multiple point inputs for linear fractional order systems. First, using polynomial modulating functions method and a suitable change of variables the problem of estimating the locations and the amplitudes of a multi-pointwise input is decoupled into two algebraic systems of equations. The first system is nonlinear and solves for the time locations iteratively, whereas the second system is linear and solves for the input’s amplitudes. Second, closed form formulas for both the time location and the amplitude are provided in the particular case of single point input. Finally, numerical examples are given to illustrate the performance of the proposed technique in both noise-free and noisy cases. The joint estimation of pointwise input and fractional differentiation orders is also presented. Furthermore, a discussion on the performance of the proposed algorithm is provided.

  10. A variational approach to liver segmentation using statistics from multiple sources

    Science.gov (United States)

    Zheng, Shenhai; Fang, Bin; Li, Laquan; Gao, Mingqi; Wang, Yi

    2018-01-01

    Medical image segmentation plays an important role in digital medical research and in therapy planning and delivery. However, the presence of noise and low contrast renders automatic liver segmentation an extremely challenging task. In this study, we focus on a variational approach to liver segmentation in computed tomography scan volumes in a semiautomatic and slice-by-slice manner. In this method, one slice is selected and its connected component liver region is determined manually to initialize the subsequent automatic segmentation process. From this guiding slice, we execute the proposed method downward to the last slice and upward to the first one, respectively. A segmentation energy function is proposed by combining the statistical shape prior, global Gaussian intensity analysis, and an enforced local statistical feature under the level set framework. During segmentation, the shape of the liver is estimated by minimization of this function. The improved Chan-Vese model is used to refine the shape to capture the long and narrow regions of the liver. The proposed method was verified on two independent public databases, the 3D-IRCADb and the SLIVER07. Among all the tested methods, our method yielded the best volumetric overlap error (VOE) of 6.5 ± 2.8%, the best root mean square symmetric surface distance (RMSD) of 2.1 ± 0.8 mm, and the best maximum symmetric surface distance (MSD) of 18.9 ± 8.3 mm on the 3D-IRCADb dataset, and the best average symmetric surface distance (ASD) of 0.8 ± 0.5 mm and the best RMSD of 1.5 ± 1.1 mm on the SLIVER07 dataset, respectively. The results of the quantitative comparison show that the proposed liver segmentation method achieves competitive segmentation performance with state-of-the-art techniques.
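    The volumetric overlap error (VOE) reported above is a standard segmentation metric: 1 − |A ∩ B| / |A ∪ B|, expressed in percent. The sketch below computes it on small synthetic masks for illustration; it is not the authors' evaluation code, and the toy masks are assumptions.

```python
import numpy as np

def volumetric_overlap_error(seg, ref):
    """Volumetric overlap error (VOE) between a predicted mask and a
    reference mask: 100 * (1 - |A intersect B| / |A union B|)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    union = np.logical_or(seg, ref).sum()
    return 100.0 * (1.0 - inter / union)

# two 6x6 square masks, one shifted by a voxel in each direction
a = np.zeros((10, 10), bool); a[2:8, 2:8] = True
b = np.zeros((10, 10), bool); b[3:9, 3:9] = True
print(round(volumetric_overlap_error(a, b), 2))
```

    A perfect segmentation gives VOE = 0%; the surface-distance metrics in the abstract (ASD, RMSD, MSD) instead measure distances between the two mask boundaries.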

  11. Exploring multiple sources of climatic information within personal and medical diaries, Bombay 1799-1828

    Science.gov (United States)

    Adamson, George

    2016-04-01

    Private diaries are being recognised as an important source of information on past climatic conditions, providing place-specific, often daily records of meteorological information. As many were not intended for publication, or indeed to be read by anyone other than the author, issues of observer bias are lower than for some other types of documentary sources. This paper comprises an exploration of the variety of types of climatic information that can be mined from a single document or set of documents. The focus of the analysis is three private diaries and one medical diary kept by British colonists in Bombay, western India, during the first decades of the nineteenth century. The paper discusses the potential of the diaries for reconstruction of precipitation, temperature and extreme events. Ad-hoc temperature observations collected by the four observers prove to be particularly fruitful for reconstructing monthly extreme temperatures, with values comparable to more systematic observations collected during the period. This leads to a tentative conclusion that extreme temperatures in Bombay were around 5°C lower during the period than today, a difference likely attributable predominantly to the urban heat island effect.

  12. Testing the count rate performance of the scintillation camera by exponential attenuation: Decaying source; Multiple filters

    International Nuclear Information System (INIS)

    Adams, R.; Mena, I.

    1988-01-01

    An algorithm and two FORTRAN programs have been developed to evaluate the count rate performance of scintillation cameras from count rates reduced exponentially, either by a decaying source or by filtration. The first method is used with short-lived radionuclides such as 191mIr or 191mAu. The second implements a National Electrical Manufacturers' Association (NEMA) protocol in which the count rate from a source of 191mTc is attenuated by a varying number of copper filters stacked over it. The count rate at each data point is corrected for deadtime loss after assigning an arbitrary deadtime (τ). A second-order polynomial equation is fitted to the logarithms of the net count rate values: ln(R) = A + BT + CT², where R is the net corrected count rate (cps) and T is the elapsed time (or the filter thickness in the NEMA method). Depending on C, τ is incremented or decremented iteratively, and the count rate corrections and curve fittings are repeated until C approaches zero, indicating a correct value of the deadtime (τ).

  13. Examining Multiple Sources of Differential Item Functioning on the Clinician & Group CAHPS® Survey

    Science.gov (United States)

    Rodriguez, Hector P; Crane, Paul K

    2011-01-01

    Objective: To evaluate psychometric properties of a widely used patient experience survey. Data Sources: English-language responses to the Clinician & Group Consumer Assessment of Healthcare Providers and Systems (CG-CAHPS®) survey (n = 12,244) from a 2008 quality improvement initiative involving eight southern California medical groups. Methods: We used an iterative hybrid ordinal logistic regression/item response theory differential item functioning (DIF) algorithm to identify items with DIF related to patient sociodemographic characteristics, duration of the physician–patient relationship, number of physician visits, and self-rated physical and mental health. We accounted for all sources of DIF and determined its cumulative impact. Principal Findings: The upper end of the CG-CAHPS® performance range is measured with low precision. With sensitive settings, some items were found to have DIF. However, overall DIF impact was negligible, as 0.14 percent of participants had salient DIF impact. Latinos who spoke predominantly English at home had the highest prevalence of salient DIF impact at 0.26 percent. Conclusions: The CG-CAHPS® functions similarly across commercially insured respondents from diverse backgrounds. Consequently, previously documented racial and ethnic group differences likely reflect true differences rather than measurement bias. The impact of low precision at the upper end of the scale should be clarified. PMID:22092021

  14. The development of a methodology to assess population doses from multiple sources and exposure pathways of radioactivity

    International Nuclear Information System (INIS)

    Hancox, J.; Stansby, S.; Thorne, M.

    2002-01-01

    The Environment Agency (EA) has new duties in accordance with the Basic Safety Standards Directive under which it is required to ensure that doses to individuals received from exposure to anthropogenic sources of radioactivity are within defined limits. In order to assess compliance with these requirements, the EA needs to assess the doses to members of the most highly exposed population groups ('critical' groups) from all relevant potential sources of anthropogenic radioactivity and all relevant potential exposure pathways to such radioactivity. The EA has identified a need to develop a methodology for the retrospective assessment of effective doses from multiple sources of radioactive materials and exposure pathways associated with those sources. Under contract to the EA, AEA Technology has undertaken the development of a suitable methodology as part of EA R and D Project P3-070. The methodology developed under this research project has been designed to support the EA in meeting its obligations under the Euratom Basic Safety Standards Directive and is consistent with UK and international approaches to radiation dosimetry and radiological protection. The development and trial application of the methodology is described in this report

  15. The metal-organic framework MIL-53(Al) constructed from multiple metal sources: alumina, aluminum hydroxide, and boehmite.

    Science.gov (United States)

    Li, Zehua; Wu, Yi-nan; Li, Jie; Zhang, Yiming; Zou, Xin; Li, Fengting

    2015-04-27

    Three aluminum compounds, namely alumina, aluminum hydroxide, and boehmite, are probed as the metal sources for the hydrothermal synthesis of a typical metal-organic framework, MIL-53(Al). The process exhibits enhanced synthetic efficiency without the generation of strongly acidic byproducts. Time-course monitoring of the conversion of the different aluminum sources into MIL-53(Al) is achieved by multiple characterization techniques, which reveal similar but differentiated crystallinity, porosity, and morphology relative to typical MIL-53(Al) prepared from water-soluble aluminum salts. Moreover, the MIL-53(Al) prepared from the three insoluble aluminum sources exhibits an improved thermal stability of up to nearly 600 °C and enhanced yields. Alumina and boehmite are preferable to aluminum hydroxide in terms of product porosity, yield, and reaction time. The adsorption performance of a typical environmental endocrine disruptor, dimethyl phthalate, on the prepared MIL-53(Al) samples is also investigated. The improved structural stability of MIL-53(Al) prepared from these alternative aluminum sources enables doubly enhanced adsorption performance (up to 206 mg g−1) relative to conventionally obtained MIL-53(Al). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. THE USE OF MULTIPLE DATA SOURCES IN THE PROCESS OF TOPOGRAPHIC MAPS UPDATING

    Directory of Open Access Journals (Sweden)

    A. Cantemir

    2016-06-01

    Full Text Available The methods used in the process of updating maps have evolved and become more complex, especially upon the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agency portals. Images stored in the archives of satellite missions such as Sentinel, Landsat and others can be downloaded free of charge. The main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research of these images on the 1:50.000 scale map. DEMs that are globally available could represent an appropriate input for watershed delineation and stream network generation, which can be used as support for updating the hydrography thematic layer. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and Digital Terrain Models are the main products that can be used for feature extraction and update. On the other side, the use of georeferenced analogue basemaps represents a significant addition to the process. Concerning the thematic maps, the classic representation of the terrain by contour lines derived from the DTM remains the best method of representing the earth's surface on a map; nevertheless, correlation with other layers such as hydrography is mandatory.
In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as

  17. Multiple remote sensing data sources to assess spatio-temporal patterns of fire incidence over Campos Amazônicos Savanna Vegetation Enclave (Brazilian Amazon).

    Science.gov (United States)

    Alves, Daniel Borini; Pérez-Cabello, Fernando

    2017-12-01

    Fire activity plays an important role in the past, present and future of Earth system behavior. Monitoring and assessing spatial and temporal fire dynamics is of fundamental relevance to understanding ecological processes and human impacts on different landscapes and at multiple spatial scales. This work analyzes the spatio-temporal distribution of burned areas in one of the biggest savanna vegetation enclaves in the southern Brazilian Amazon, from 2000 to 2016, deriving information from multiple remote sensing data sources (Landsat and MODIS surface reflectance, TRMM pluviometry and Vegetation Continuous Field tree cover layers). A fire scars database with 30 m spatial resolution was generated using a Landsat time series. MODIS daily surface reflectance was used for accurate dating of the fire scars. TRMM pluviometry data were analyzed to dynamically establish time limits of the yearly dry season and burning periods. Burned area extent, frequency and recurrence were quantified comparing the results annually/seasonally. Additionally, Vegetation Continuous Field tree cover layers were used to analyze fire incidence over different types of tree cover domains. In the last seventeen years, 1.03 million ha were burned within the study area, distributed across 1432 fire occurrences, with 2005, 2010 and 2014 the most affected years. Middle dry season fires represent 86.21% of the total burned area and 32.05% of fire occurrences, and affected a larger share of high-density tree cover than fires in other burning periods. The results provide new insights into the analysis of burned areas of the neotropical savannas, spatially and statistically reinforcing important aspects linked to the seasonality patterns of fire incidence in this landscape. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. A Multiple Source Approach to Organisational Justice: The Role of the Organisation, Supervisors, Coworkers, and Customers

    Directory of Open Access Journals (Sweden)

    Agustin Molina

    2015-07-01

    Full Text Available The vast research on organisational justice has focused on the organisation and the supervisor. This study aims to further this line of research by integrating two trends within organisational justice research: the overall approach to justice perceptions and the multifoci perspective of justice judgments. Specifically, this study aims to explore the effects of two additional sources of justice, coworker-focused justice and customer-focused justice, on relevant employee outcomes (burnout, turnover intentions, job satisfaction, and workplace deviance) while controlling for the effects of organisation-focused justice and supervisor-focused justice. Given the increased importance attributed to coworkers and customers, we expect coworker-focused justice and customer-focused justice to explain incremental variance in the measured outcomes, above and beyond the effects of organisation-focused justice and supervisor-focused justice. Participants will be university students from Austria and Germany employed by service organisations. Data analysis will be conducted using structural equation modeling.

  19. High brightness--multiple beamlets source for patterned X-ray production

    Science.gov (United States)

    Leung, Ka-Ngo [Hercules, CA; Ji, Qing [Albany, CA; Barletta, William A [Oakland, CA; Jiang, Ximan [El Cerrito, CA; Ji, Lili [Albany, CA

    2009-10-27

    Techniques for controllably directing beamlets to a target substrate are disclosed. The beamlets may be either positive ions or electrons. It has been shown that beamlets may be produced with a diameter of 1 µm, with inter-aperture spacings of 12 µm. An array of such beamlets may be used for maskless lithography. By step-wise movement of the beamlets relative to the target substrate, individual devices may be directly e-beam written. Ion beams may be directly written as well. Due to the high brightness of the beamlets extracted from a multicusp source, exposure times for lithographic exposure are thought to be minimized. Alternatively, the beamlets may be electrons striking a high-Z material for X-ray production, thereafter collimated to provide patterned X-ray exposures such as those used in CAT scans. Such a device may be used for remote detection of explosives.

  20. Evidence for multiple sources of 10Be in the early solar system

    DEFF Research Database (Denmark)

    Wielandt, Daniel Kim Peel; Nagashima, Kazuhide; Krot, Alexander N.

    2012-01-01

    Beryllium-10 is a short-lived radionuclide (t1/2 = 1.4 Myr) uniquely synthesized by spallation reactions and inferred to have been present when the solar system's oldest solids (calcium-aluminum-rich inclusions, CAIs) formed. Yet, the astrophysical site of 10Be nucleosynthesis is uncertain. We...... in the gaseous CAI-forming reservoir, or in the inclusions themselves: this indicates at least two nucleosynthetic sources of 10Be in the early solar system. The most promising locale for 10Be synthesis is close to the proto-Sun during its early mass-accreting stages, as these are thought to coincide...

  1. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    Science.gov (United States)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-11-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, maximum inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults as the worst-case scenario, with wave heights of over 10 m reaching the coast approximately 22 min after the rupture. It dominates the aggregate scenario, accounting for about 60 % of the impact area at the test site in terms of maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km².
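
    The aggregate-scenario idea described above can be sketched as a per-cell maximum over single-scenario hazard grids, with the static tide applied as a uniform offset. This is an illustrative simplification, not the NSWING implementation; the function name and the uniform-offset treatment are assumptions.

    ```python
    import numpy as np

    def aggregate_hazard(scenario_depths, tide_offset=0.0):
        """Aggregate-scenario hazard grid: the per-cell maximum of the
        single-scenario maximum flow depth grids (computed at mean sea
        level), with a static tidal stage applied as a uniform offset in
        metres (e.g. positive towards MHHW, negative towards MLLW)."""
        stacked = np.stack([grid + tide_offset for grid in scenario_depths])
        return stacked.max(axis=0)
    ```

    A per-scenario contribution map (which scenario attains the maximum in each cell) would then identify the dominant source, as done for the Horseshoe-Marques de Pombal composite above.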

  2. Bayesian inference based modelling for gene transcriptional dynamics by integrating multiple source of knowledge

    Directory of Open Access Journals (Sweden)

    Wang Shu-Qiang

    2012-07-01

    Full Text Available Abstract Background A key challenge in the post-genome era is to identify genome-wide transcriptional regulatory networks, which specify the interactions between transcription factors and their target genes. Numerous methods have been developed for reconstructing gene regulatory networks from expression data. However, most of them are based on coarse-grained qualitative models and cannot provide a quantitative view of regulatory systems. Results A binding-affinity-based regulatory model is proposed to quantify the transcriptional regulatory network. Multiple quantities, including binding affinity and the activity level of the transcription factor (TF), are incorporated into a general learning model. The sequence features of the promoter and the possible occupancy of nucleosomes are exploited to estimate the binding probability of regulators. Compared with previous models that employ only microarray data, the proposed model can bridge the gap between the relative background frequency of the observed nucleotides and the gene's transcription rate. Conclusions We test the proposed approach on two real-world microarray datasets. Experimental results show that the proposed model can effectively identify the parameters and the activity level of the TF. Moreover, the kinetic parameters introduced in the proposed model reveal more biological insight than previous models do.
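
    The notion of combining a sequence-derived binding affinity with a TF activity level can be sketched with a Boltzmann-style two-state occupancy model. The abstract does not give the paper's parameterisation, so the function name, the logistic form, and the energy convention below are all assumptions for illustration only.

    ```python
    import math

    def binding_probability(site_energies, tf_activity):
        """Occupancy probability of each binding site under a two-state
        model: p = a * exp(-E) / (1 + a * exp(-E)), where E is a
        sequence-derived binding energy (lower = stronger binding) and
        `a` is an effective TF activity/concentration term."""
        return [1.0 / (1.0 + math.exp(energy) / max(tf_activity, 1e-12))
                for energy in site_energies]
    ```

    Under this form, raising the TF activity or lowering the site energy both push occupancy towards 1, which is the qualitative behaviour such models rely on.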

  3. Wheat multiple synthetic derivatives: a new source for heat stress tolerance adaptive traits

    Science.gov (United States)

    Elbashir, Awad Ahmed Elawad; Gorafi, Yasir Serag Alnor; Tahir, Izzat Sidahmed Ali; Kim, June-Sik; Tsujimoto, Hisashi

    2017-01-01

    Heat stress is detrimental to wheat (Triticum aestivum L.) productivity. In this study, we aimed to select heat-tolerant plants from a multiple synthetic derivatives (MSD) population and evaluate their agronomic and physiological traits. We selected six tolerant plants from the population with the background of the cultivar ‘Norin 61’ (N61) and established six MNH (MSD population of N61 selected as heat stress-tolerant) lines. We grew these lines with N61 in the field and in a growth chamber. In the field, we used optimum and late sowings to ensure plant exposure to heat. In the growth chamber, in addition to N61, we used the heat-tolerant cultivars ‘Gelenson’ and ‘Bacanora’. We confirmed that the MNH2 and MNH5 lines acquired heat tolerance. These lines had higher photosynthesis and stomatal conductance and exhibited no reduction in grain yield and biomass under heat stress compared to N61. We noticed that N61 had relatively good adaptability to heat stress. Our results indicate that the MSD population includes the diversity of Aegilops tauschii and is a promising resource to uncover useful quantitative traits derived from this wild species. Selected lines could be useful for heat stress tolerance breeding. PMID:28744178

  4. Food Habits Database (FHDBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NEFSC Food Habits Database has two major sources of data. The first, and most extensive, is the standard NEFSC Bottom Trawl Surveys Program. During these...

  5. Application of the Modified Source Multiplication (MSM) technique to subcritical reactivity worth measurements in thermal and fast reactor systems

    International Nuclear Information System (INIS)

    Blaise, P.; Fougeras, P.; Mellier, F.

    2009-01-01

    The Amplified Source Multiplication (ASM) method and its improved Modified Source Multiplication (MSM) method have been widely used in the CEA's EOLE and MASURCA critical facilities over the past decades for the determination of reactivity worths by using fission chambers in subcritical configurations. They have been successfully applied to absorber (single or cluster) worth measurements in both thermal and fast spectra, and to (sodium or water) void reactivity worths. The ASM methodology, which is the basic technique for estimating a reactivity worth, uses relatively simple relationships between the count rates of efficient miniature fission chambers located in slightly subcritical reference and perturbed configurations. While this method works quite well for small reactivity variations (a few effective delayed neutron fractions), its raw results need to be corrected to take into account the flux perturbation in the fission chamber. This is done by applying a correction factor, called MSM, to the measurement. Its characteristic is to take into account the local space and energy variation of the spectrum in the fission chamber, through standard perturbation theory applied to the neutron transport calculation in the perturbed configuration. The proposed paper describes both methodologies in detail, with their associated uncertainties. Applications to absorber cluster worth in the MISTRAL-4 full MOX mock-up core and the last core loaded in MASURCA show the importance of the MSM correction on raw data. (authors)
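
    The count-rate relationship behind ASM follows from source multiplication: in a source-driven subcritical core the detector count rate scales roughly as the inverse of the magnitude of the reactivity. A minimal sketch, with the MSM factor applied as a multiplicative correction (function and argument names are illustrative, not CEA nomenclature):

    ```python
    def asm_reactivity(rho_ref, count_ref, count_pert, msm_factor=1.0):
        """Reactivity of a perturbed subcritical configuration from
        fission-chamber count rates. Since C is proportional to 1/(-rho)
        for a fixed source, the raw ASM estimate is
        rho_pert ~ rho_ref * C_ref / C_pert. The MSM factor, computed
        separately via perturbation theory, corrects for the local
        space/energy flux change at the chamber; msm_factor=1.0 gives
        the uncorrected ASM result."""
        raw_asm = rho_ref * count_ref / count_pert
        return msm_factor * raw_asm
    ```

    For example, a halving of the count rate relative to a reference worth of -1 $ would yield a raw ASM estimate of -2 $, which the MSM factor then adjusts for the flux perturbation seen by the chamber.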

  6. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    Science.gov (United States)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-08-01

    In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid-bulk, coal and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, run-up and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite of the Horseshoe and Marques de Pombal faults as the worst-case scenario. It governs the aggregate scenario, contributing about 60 % of the impact, and inundates an area of 3.5 km².

  7. IPeak: An open source tool to combine results from multiple MS/MS search engines.

    Science.gov (United States)

    Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun

    2015-09-01

    Liquid chromatography coupled tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm and a multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, is implemented in JAVA, and works on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as both input and output, and has also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
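
    To illustrate the general idea of a multi-search strategy, the sketch below combines peptide-spectrum matches from several engines by naive consensus voting. Note that this is not IPeak's algorithm (IPeak rescores with Percolator rather than voting); the function and data layout are assumptions for illustration.

    ```python
    from collections import Counter, defaultdict

    def combine_search_results(results_per_engine, min_engines=2):
        """Naive consensus combination of peptide-spectrum matches: keep a
        spectrum's top-voted peptide only if at least `min_engines` engines
        report it. results_per_engine: list of dicts, spectrum_id -> peptide."""
        votes = defaultdict(Counter)
        for engine_results in results_per_engine:
            for spectrum, peptide in engine_results.items():
                votes[spectrum][peptide] += 1
        combined = {}
        for spectrum, counter in votes.items():
            peptide, n_agree = counter.most_common(1)[0]
            if n_agree >= min_engines:
                combined[spectrum] = peptide
        return combined
    ```

    Agreement across engines tends to raise confidence in an identification, which is the intuition that Percolator-based rescoring exploits in a statistically principled way.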

  8. Exploiting multiple sources of information in learning an artificial language: human data and modeling.

    Science.gov (United States)

    Perruchet, Pierre; Tillmann, Barbara

    2010-03-01

    This study investigates the joint influences of three factors on the discovery of new word-like units in a continuous artificial speech stream: the statistical structure of the ongoing input, the initial word-likeness of parts of the speech flow, and the contextual information provided by the earlier emergence of other word-like units. Results of an experiment conducted with adult participants show that these sources of information have strong and interactive influences on word discovery. The authors then examine the ability of different models of word segmentation to account for these results. PARSER (Perruchet & Vinter, 1998) is compared to the view that word segmentation relies on the exploitation of transitional probabilities between successive syllables, and to models based on the Minimum Description Length principle, such as INCDROP. The authors submit arguments suggesting that PARSER has the advantage of accounting for the whole pattern of data without ad-hoc modifications, while relying exclusively on general-purpose learning principles. This study strengthens the growing notion that nonspecific cognitive processes, mainly based on associative learning and memory principles, are able to account for a larger part of early language acquisition than previously assumed. Copyright © 2009 Cognitive Science Society, Inc.
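
    The transitional-probability account that PARSER is compared against can be sketched in a few lines: compute the forward transitional probability between successive syllables and posit a word boundary where it dips. The fixed threshold below is a deliberate simplification (actual proposals use local minima), and all names are illustrative.

    ```python
    from collections import Counter

    def segment_by_tp(syllables, threshold=0.75):
        """Segment a continuous syllable stream at dips in forward
        transitional probability, TP(a -> b) = count(a, b) / count(a).
        A boundary is posited wherever TP falls below `threshold`."""
        pair_counts = Counter(zip(syllables, syllables[1:]))
        pred_counts = Counter(syllables[:-1])  # occurrences as a predecessor
        words, current = [], [syllables[0]]
        for a, b in zip(syllables, syllables[1:]):
            if pair_counts[(a, b)] / pred_counts[a] < threshold:
                words.append("".join(current))  # boundary: flush current word
                current = []
            current.append(b)
        words.append("".join(current))
        return words
    ```

    On a Saffran-style stream of trisyllabic nonce words, within-word TPs approach 1 while between-word TPs stay low, so the dips recover the word boundaries.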

  9. Sequence-based analysis of the microbial composition of water kefir from multiple sources.

    Science.gov (United States)

    Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D

    2013-11-01

    Water kefir is a water-sucrose-based beverage, fermented by a symbiosis of bacteria and yeast to produce a final product that is lightly carbonated, acidic and that has a low alcohol percentage. The microorganisms present in water kefir are introduced via water kefir grains, which consist of a polysaccharide matrix in which the microorganisms are embedded. We aimed to provide a comprehensive sequencing-based analysis of the bacterial population of water kefir beverages and grains, while providing an initial insight into the corresponding fungal population. To facilitate this objective, four water kefirs were sourced from the UK, Canada and the United States. Culture-independent, high-throughput, sequencing-based analyses revealed that the bacterial fraction of each water kefir and grain was dominated by Zymomonas, an ethanol-producing bacterium, which has not previously been detected at such a scale. The other genera detected were representatives of the lactic acid bacteria and acetic acid bacteria. Our analysis of the fungal component established that it was comprised of the genera Dekkera, Hanseniaspora, Saccharomyces, Zygosaccharomyces, Torulaspora and Lachancea. This information will assist in the ultimate identification of the microorganisms responsible for the potentially health-promoting attributes of these beverages. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  10. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  11. The role of envelope shape in the localization of multiple sound sources and echoes in the barn owl.

    Science.gov (United States)

    Baxter, Caitlin S; Nelson, Brian S; Takahashi, Terry T

    2013-02-01

    Echoes and sounds of independent origin often obscure sounds of interest, but echoes can go undetected under natural listening conditions, a perception called the precedence effect. How does the auditory system distinguish between echoes and independent sources? To investigate, we presented two broadband noises to barn owls (Tyto alba) while varying the similarity of the sounds' envelopes. The carriers of the noises were identical except for a 2- or 3-ms delay. Their onsets and offsets were also synchronized. In owls, sound localization is guided by neural activity on a topographic map of auditory space. When there are two sources concomitantly emitting sounds with overlapping amplitude spectra, space map neurons discharge when the stimulus in their receptive field is louder than the one outside it and when the averaged amplitudes of both sounds are rising. A model incorporating these features calculated the strengths of the two sources' representations on the map (B. S. Nelson and T. T. Takahashi; Neuron 67: 643-655, 2010). The target localized by the owls could be predicted from the model's output. The model also explained why the echo is not localized at short delays: when envelopes are similar, peaks in the leading sound mask corresponding peaks in the echo, weakening the echo's space map representation. When the envelopes are dissimilar, there are few or no corresponding peaks, and the owl localizes whichever source is predicted by the model to be less masked. Thus the precedence effect in the owl is a by-product of a mechanism for representing multiple sound sources on its map.

  12. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    International Nuclear Information System (INIS)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C
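
    The core of the straight-line Gaussian plume model named above (with ground reflection via an image source) can be sketched as follows. This is the textbook formula only; ANEMOS's full treatment adds deposition options, plume rise, terrain adjustments and daughter in-growth, which this illustrative function omits, and the function name is an assumption.

    ```python
    import math

    def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
        """Straight-line Gaussian plume concentration (e.g. g/m^3) at a
        receptor, with ground reflection. Q: emission rate (g/s); u: wind
        speed at release height (m/s); y: crosswind offset (m); z: receptor
        height (m); H: effective release height (m); sigma_y, sigma_z:
        dispersion parameters (m) evaluated at the downwind distance."""
        lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                    + math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # image term
        return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
    ```

    Sector-averaged concentrations of the kind ANEMOS reports would replace the crosswind Gaussian with an average over the sector width; the vertical structure is unchanged.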

  13. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.

  14. Interrogating pollution sources in a mangrove food web using multiple stable isotopes.

    Science.gov (United States)

    Souza, Iara da C; Arrivabene, Hiulana P; Craig, Carol-Ann; Midwood, Andrew J; Thornton, Barry; Matsumoto, Silvia T; Elliott, Michael; Wunderlin, Daniel A; Monferrán, Magdalena V; Fernandes, Marisa N

    2018-06-01

    Anthropogenic activities including metal contamination create well-known problems in coastal mangrove ecosystems, but understanding and linking specific pollution sources to distinct trophic levels within these environments is challenging. This study evaluated anthropogenic impacts on two contrasting mangrove food webs, by using stable isotopes (δ13C, δ15N, 87Sr/86Sr, 206Pb/207Pb and 208Pb/207Pb) measured in sediments, mangrove trees (Rhizophora mangle, Laguncularia racemosa, Avicennia schaueriana), plankton, shrimps (Macrobranchium sp.), crabs (Aratus sp.), oysters (Crassostrea rhizophorae) and fish (Centropomus parallelus) from both areas. Strontium and Pb isotopes were also analysed in water and atmospheric particulate matter (PM). δ15N indicated that crab, shrimp and oyster are at intermediate levels within the local food web and fish, in this case C. parallelus, was confirmed at the highest trophic level. δ15N also indicates different anthropogenic pressures between both estuaries; Vitória Bay, close to intensive human activities, showed higher δ15N across the food web, apparently influenced by sewage. The 87Sr/86Sr ratio showed the primary influence of marine water throughout the entire food web. Pb isotope ratios suggest that PM is primarily influenced by metallurgical activities, with some secondary influence on mangrove plants and crabs sampled in the area adjacent to the smelting works. To our knowledge, this is the first demonstration of the effect of anthropogenic pollution (probable sewage pollution) on the isotopic fingerprint of estuarine-mangrove systems located close to a city compared to less impacted estuarine mangroves. The influence of industrial metallurgical activity detected using Pb isotopic analysis of PM and mangrove plants close to such an impacted area is also notable and illustrates the value of isotopic analysis in tracing the impact and species affected by atmospheric pollution. Copyright © 2018 Elsevier B

  15. EVIDENCE FOR MULTIPLE SOURCES OF 10Be IN THE EARLY SOLAR SYSTEM

    International Nuclear Information System (INIS)

    Wielandt, Daniel; Krot, Alexander N.; Bizzarro, Martin; Nagashima, Kazuhide; Huss, Gary R.; Ivanova, Marina A.

    2012-01-01

    Beryllium-10 is a short-lived radionuclide (t1/2 = 1.4 Myr) uniquely synthesized by spallation reactions and inferred to have been present when the solar system's oldest solids (calcium-aluminum-rich inclusions, CAIs) formed. Yet, the astrophysical site of 10Be nucleosynthesis is uncertain. We report Li-Be-B isotope measurements of CAIs from CV chondrites, including CAIs that formed with the canonical 26Al/27Al ratio of ∼5 × 10⁻⁵ (canonical CAIs) and CAIs with Fractionation and Unidentified Nuclear isotope effects (FUN-CAIs) characterized by 26Al/27Al ratios much lower than the canonical value. Our measurements demonstrate the presence of four distinct fossil 10Be/9Be isochrons, lower in the FUN-CAIs than in the canonical CAIs, and variable within these classes. Given that FUN-CAI precursors escaped evaporation-recondensation prior to evaporative melting, we suggest that the 10Be/9Be ratio recorded by FUN-CAIs represents a baseline level present in presolar material inherited from the protosolar molecular cloud, generated via enhanced trapping of galactic cosmic rays. The higher and possibly variable apparent 10Be/9Be ratios of canonical CAIs reflect additional spallogenesis, either in the gaseous CAI-forming reservoir, or in the inclusions themselves: this indicates at least two nucleosynthetic sources of 10Be in the early solar system. The most promising locale for 10Be synthesis is close to the proto-Sun during its early mass-accreting stages, as these are thought to coincide with periods of intense particle irradiation occurring on timescales significantly shorter than the formation interval of canonical CAIs.

  16. EarthCube Data Discovery Hub: Enhancing, Curating and Finding Data across Multiple Geoscience Data Sources.

    Science.gov (United States)

    Zaslavsky, I.; Valentine, D.; Richard, S. M.; Gupta, A.; Meier, O.; Peucker-Ehrenbrink, B.; Hudman, G.; Stocks, K. I.; Hsu, L.; Whitenack, T.; Grethe, J. S.; Ozyurt, I. B.

    2017-12-01

    EarthCube Data Discovery Hub (DDH) is an EarthCube Building Block project using technologies developed in CINERGI (Community Inventory of EarthCube Resources for Geoscience Interoperability) to enable geoscience users to explore a growing portfolio of EarthCube-created and other geoscience-related resources. Over 1 million metadata records are available for discovery through the project portal (cinergi.sdsc.edu). These records are retrieved from data facilities, including federal, state and academic sources, or contributed by geoscientists through workshops, surveys, or other channels. CINERGI metadata augmentation pipeline components 1) provide semantic enhancement based on a large ontology of geoscience terms, using text analytics to generate keywords with references to ontology classes, 2) add spatial extents based on place names found in the metadata record, and 3) add organization identifiers to the metadata. The records are indexed and can be searched via a web portal and standard search APIs. The added metadata content improves discoverability and interoperability of the registered resources. Specifically, the addition of ontology-anchored keywords enables faceted browsing and lets users navigate to datasets related by variables measured, equipment used, science domain, processes described, geospatial features studied, and other dataset characteristics that are generated by the pipeline. DDH also lets data curators access and edit the automatically generated metadata records using the CINERGI metadata editor, accept or reject the enhanced metadata content, and consider it in updating their metadata descriptions. We consider several complex data discovery workflows, in environmental seismology (quantifying sediment and water fluxes using seismic data), marine biology (determining available temperature, location, weather and bleaching characteristics of coral reefs related to measurements in a given coral reef survey), and river geochemistry (discovering

  17. Mosaic organization of the hippocampal neuroepithelium and the multiple germinal sources of dentate granule cells

    International Nuclear Information System (INIS)

    Altman, J.; Bayer, S.A.

    1990-01-01

    This study deals with the site of origin, migration, and settling of the principal cell constituents of the rat hippocampus during the embryonic period. The results indicate that the hippocampal neuroepithelium consists of three morphogenetically discrete components--the Ammonic neuroepithelium, the primary dentate neuroepithelium, and the fimbrial glioepithelium--and that these are discrete sources of the large neurons of Ammon's horn, the smaller granular neurons of the dentate gyrus, and the glial cells of the fimbria. The putative Ammonic neuroepithelium is marked in short-survival thymidine radiograms by a high level of proliferative activity and evidence of interkinetic nuclear migration from day E16 until day E19. On days E16 and E17 a diffuse band of unlabeled cells forms outside the Ammonic neuroepithelium. These postmitotic cells are considered to be stratum radiatum and stratum oriens neurons, which are produced in large numbers as early as day E15. A cell-dense layer, the incipient stratum pyramidale, begins to form on day E18 and spindle-shaped cells can be traced to it from the Ammonic neuroepithelium. This migratory band increases in size for several days, then declines, and finally disappears by day E22. It is inferred that this migration contains the pyramidal cells of Ammon's horn that are produced mostly on days E17 through E20. The putative primary dentate neuroepithelium is distinguished from the Ammonic neuroepithelium during the early phases of embryonic development by its location, shape, and cellular dynamics. It is located around a ventricular indentation, the dentate notch, contains fewer mitotic cells near the lumen of the ventricle than the Ammonic neuroepithelium, and shows a different labeling pattern both in short-survival and sequential-survival thymidine radiograms

  18. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the limited resources of federal and state health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  19. Towards single embryo transfer? Modelling clinical outcomes of potential treatment choices using multiple data sources: predictive models and patient perspectives.

    Science.gov (United States)

    Roberts, Sa; McGowan, L; Hirst, Wm; Brison, Dr; Vail, A; Lieberman, Ba

    2010-07-01

    specifically predicted multiple birth outcomes beyond those that predicted treatment success. In the fresh transfer following egg retrieval, SET would lead to a reduction of approximately one-third in the live birth probability compared with DET, a result consistent with the limited data from clinical trials. From the population or clinic perspective, selection of patients based on prognostic indicators might mitigate about half of the loss in live births associated with SET in the initial fresh transfer while achieving a twin rate of 10% or less. Data-based simulations suggested that, if all good-quality embryos are replaced over multiple frozen embryo transfers, repeated SET has the potential to produce more live birth events than repeated DET. However, this would depend on optimising cryopreservation procedures. Universal SET could both reduce the number of twin births and lead to more couples having a child, but at an average cost of one more embryo transfer procedure per egg retrieval. The interview and focus group data suggest that, despite the potential to maintain overall success rates, patients would prefer DET: the potential for twins was seen as positive, while additional transfer procedures can be emotionally, physically and financially draining. For any one transfer, SET has about a one-third loss of success rate relative to DET. This can be only partially mitigated by patient and treatment cycle selection, which may be criticised as unfair as all patients receiving SET will have a lower chance of success than they would with DET. However, considering complete cycles (fresh plus frozen transfers), it is possible for repeat SET to produce more live births than repeat DET. Such a strategy would require support from funders and acceptance by patients of both cryopreservation and the burden of additional transfer cycles. 
Future work should include development of improved clinical and regulatory database systems, surveys to quantify the extent of patients' beliefs and
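The complete-cycle argument above (repeat SET over all good-quality embryos versus repeat DET) can be illustrated with a toy expectation model; the implantation probability, pool size, and independence assumption are illustrative choices, not the report's fitted values:

```python
# Toy expectation model contrasting repeated SET with repeated DET when the
# same pool of good-quality embryos is replaced over successive transfers.
# Assumes each embryo implants independently with the same probability p.

def expected_birth_events(n_embryos, p, embryos_per_transfer):
    """Expected number of transfers that yield at least one live birth."""
    n_transfers = n_embryos // embryos_per_transfer
    p_event = 1 - (1 - p) ** embryos_per_transfer
    return n_transfers * p_event

p, pool = 0.3, 6
set_events = expected_birth_events(pool, p, 1)   # 6 single transfers
det_events = expected_birth_events(pool, p, 2)   # 3 double transfers
print(round(set_events, 2), round(det_events, 2))  # 1.8 1.53
```

Under these assumptions repeated SET yields more expected live birth events than repeated DET, at the cost of twice as many transfer procedures, which mirrors the trade-off the abstract describes.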

  20. Cr(Vi) reduction capacity of activated sludge as affected by nitrogen and carbon sources, microbial acclimation and cell multiplication

    International Nuclear Information System (INIS)

    Ferro Orozco, A.M.; Contreras, E.M.; Zaritzky, N.E.

    2010-01-01

The objectives of the present work were: (i) to analyze the capacity of activated sludge to reduce hexavalent chromium using different carbon sources as electron donors in batch reactors, (ii) to determine the relationship between biomass growth and the amount of Cr(VI) reduced considering the effect of the nitrogen to carbon source ratio, and (iii) to determine the effect of the Cr(VI) acclimation stage on the performance of the biological chromium reduction, assessing the stability of the Cr(VI) reduction capacity of the activated sludge. The highest specific Cr(VI) removal rate (q_Cr) was attained with cheese whey or lactose as electron donors, decreasing in the following order: cheese whey ∼ lactose > glucose > citrate > acetate. Batch assays with different nitrogen to carbon source ratios demonstrated that biological Cr(VI) reduction is associated with the cell multiplication phase; as a result, maximum Cr(VI) removal rates occur when there is no substrate limitation. The biomass can be acclimated to the presence of Cr(VI) and generate new cells that maintain the ability to reduce chromate. Therefore, the activated sludge process could be applied to a continuous Cr(VI) removal process.

  1. Analysis of large databases in vascular surgery.

    Science.gov (United States)

    Nguyen, Louis L; Barshes, Neal R

    2010-09-01

Large databases can be a rich source of clinical and administrative information on broad populations. These datasets are characterized by demographic and clinical data for over 1000 patients from multiple institutions. Since they are often collected and funded for other purposes, their use for secondary analysis increases their utility at relatively low costs. Advantages of large databases as a source include the very large numbers of available patients and their related medical information. Disadvantages include lack of detailed clinical information and absence of causal descriptions. Researchers working with large databases should also be mindful of data structure design and inherent limitations to large databases, such as treatment bias and systemic sampling errors. Notwithstanding these limitations, several important studies have been published in vascular care using large databases. They represent timely, "real-world" analyses of questions that may be too difficult or costly to address using prospective randomized methods. Large databases will be an increasingly important analytical resource as we focus on improving national health care efficacy in the setting of limited resources.

  2. Estimating average alcohol consumption in the population using multiple sources: the case of Spain.

    Science.gov (United States)

    Sordo, Luis; Barrio, Gregorio; Bravo, María J; Villalbí, Joan R; Espelt, Albert; Neira, Montserrat; Regidor, Enrique

    2016-01-01

    National estimates on per capita alcohol consumption are provided regularly by various sources and may have validity problems, so corrections are needed for monitoring and assessment purposes. Our objectives were to compare different alcohol availability estimates for Spain, to build the best estimate (actual consumption), characterize its time trend during 2001-2011, and quantify the extent to which other estimates (coverage) approximated actual consumption. Estimates were: alcohol availability from the Spanish Tax Agency (Tax Agency availability), World Health Organization (WHO availability) and other international agencies, self-reported purchases from the Spanish Food Consumption Panel, and self-reported consumption from population surveys. Analyses included calculating: between-agency discrepancy in availability, multisource availability (correcting Tax Agency availability by underestimation of wine and cider), actual consumption (adjusting multisource availability by unrecorded alcohol consumption/purchases and alcohol losses), and coverage of selected estimates. Sensitivity analyses were undertaken. Time trends were characterized by joinpoint regression. Between-agency discrepancy in alcohol availability remained high in 2011, mainly because of wine and spirits, although some decrease was observed during the study period. The actual consumption was 9.5 l of pure alcohol/person-year in 2011, decreasing 2.3 % annually, mainly due to wine and spirits. 2011 coverage of WHO availability, Tax Agency availability, self-reported purchases, and self-reported consumption was 99.5, 99.5, 66.3, and 28.0 %, respectively, generally with downward trends (last three estimates, especially self-reported consumption). The multisource availability overestimated actual consumption by 12.3 %, mainly due to tourism imbalance. Spanish estimates of per capita alcohol consumption show considerable weaknesses. Using uncorrected estimates, especially self-reported consumption, for
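The coverage figures above are simple ratios of each estimate to the corrected actual consumption; a sketch with round illustrative numbers (not the study's underlying data):

```python
# Coverage = estimate / corrected actual consumption, as a percentage.
# The 2011 values below are illustrative round numbers chosen to land near
# the coverage percentages reported in the abstract.

actual = 9.5                      # litres pure alcohol / person-year (2011)
estimates = {
    "WHO availability": 9.45,
    "Tax Agency availability": 9.45,
    "self-reported purchases": 6.30,
    "self-reported consumption": 2.66,
}
coverage = {k: round(100 * v / actual, 1) for k, v in estimates.items()}
print(coverage["self-reported consumption"])  # 28.0
```

The large gap for self-reported consumption is why the authors warn against using uncorrected survey estimates for monitoring.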

  3. Contrasts between estimates of baseflow help discern multiple sources of water contributing to rivers

    Science.gov (United States)

    Cartwright, I.; Gilfedder, B.; Hofmann, H.

    2014-01-01

    This study compares baseflow estimates using chemical mass balance, local minimum methods, and recursive digital filters in the upper reaches of the Barwon River, southeast Australia. During the early stages of high-discharge events, the chemical mass balance overestimates groundwater inflows, probably due to flushing of saline water from wetlands and marshes, soils, or the unsaturated zone. Overall, however, estimates of baseflow from the local minimum and recursive digital filters are higher than those based on chemical mass balance using Cl calculated from continuous electrical conductivity measurements. Between 2001 and 2011, the baseflow contribution to the upper Barwon River calculated using chemical mass balance is between 12 and 25% of the annual discharge with a net baseflow contribution of 16% of total discharge. Recursive digital filters predict higher baseflow contributions of 19 to 52% of discharge annually with a net baseflow contribution between 2001 and 2011 of 35% of total discharge. These estimates are similar to those from the local minimum method (16 to 45% of annual discharge and 26% of total discharge). These differences most probably reflect how the different techniques characterise baseflow. The local minimum and recursive digital filters probably aggregate much of the water from delayed sources as baseflow. However, as many delayed transient water stores (such as bank return flow, floodplain storage, or interflow) are likely to be geochemically similar to surface runoff, chemical mass balance calculations aggregate them with the surface runoff component. The difference between the estimates is greatest following periods of high discharge in winter, implying that these transient stores of water feed the river for several weeks to months at that time. Cl vs. discharge variations during individual flow events also demonstrate that inflows of high-salinity older water occurs on the rising limbs of hydrographs followed by inflows of low
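The chemical mass balance the study applies can be reduced to a two-component mixing model; a sketch with illustrative Cl concentrations (not the Barwon River data):

```python
# Two-component chemical (Cl) mass balance: river water is treated as a mix
# of surface runoff and groundwater, so the baseflow fraction follows from
# the three concentrations. All values below are illustrative.

def baseflow_fraction(c_river, c_runoff, c_groundwater):
    """Fraction of discharge supplied by groundwater for one observation."""
    return (c_river - c_runoff) / (c_groundwater - c_runoff)

# e.g. river Cl 60 mg/L, surface runoff 20 mg/L, regional groundwater 270 mg/L
f = baseflow_fraction(60.0, 20.0, 270.0)
print(round(f, 2))  # 0.16
```

Because transient stores such as bank return flow carry runoff-like chemistry, this calculation counts them with runoff, which is exactly why it yields lower baseflow than the hydrograph-only filters.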

  4. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  5. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases such...

  6. Construction of a single/multiple wavelength RZ optical pulse source at 40 GHz by use of wavelength conversion in a high-nonlinearity DSF-NOLM

    DEFF Research Database (Denmark)

    Yu, Jianjun; Yujun, Qian; Jeppesen, Palle

    2001-01-01

A single or multiple wavelength RZ optical pulse source at 40 GHz is successfully obtained by using wavelength conversion in a nonlinear optical loop mirror consisting of high-nonlinearity dispersion-shifted fiber.

  7. An elemental concentration open source database for Hogdahl-Convention and Westcott-Formalism based on K0-INAA method in Malaysia

    International Nuclear Information System (INIS)

    Yavar, A.R.; Sukiman Sarmani; Tan, C.Y.; Rafie, N.N.; Lim, S.W.E.; Khoo, K.S.

    2012-01-01

An electronic database has been developed and implemented for the k0-INAA method in Malaysia. Databases are often developed according to national requirements. This database contains the nuclear data constants for the k0-INAA method, with the Hogdahl convention and the Westcott formalism presented as separate command user interfaces. It was created using Microsoft Access 2007 under the Windows operating system. The database saves time, and the quality of results can be assured when neutron flux parameters and element concentrations are calculated by the k0-INAA method. The database was evaluated against the IAEA Soil-7 reference material, and the published results showed a high level of consistency. (Author)

  8. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  9. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    Science.gov (United States)

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
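The core quantitation step described above, integrating each precursor-product transition chromatogram and forming a light/heavy ratio, can be sketched as follows; the trapezoidal rule and the toy intensities are assumptions for illustration, not MRMer's actual code:

```python
# Sketch of MRM quantitation with an isotopic pair: integrate the extracted
# chromatogram of each transition, then ratio light against heavy standard.

def integrate(times, intensities):
    """Trapezoidal area under an extracted transition chromatogram."""
    area = 0.0
    for i in range(1, len(times)):
        area += 0.5 * (intensities[i] + intensities[i - 1]) * (times[i] - times[i - 1])
    return area

t = [0.0, 0.5, 1.0, 1.5, 2.0]      # retention time, minutes (toy values)
light = [0, 400, 900, 400, 0]      # light-isotope transition intensities
heavy = [0, 200, 450, 200, 0]      # heavy (spiked standard) transition

ratio = integrate(t, light) / integrate(t, heavy)
print(round(ratio, 2))  # 2.0
```

Since the heavy standard is spiked at known concentration, the ratio converts directly to an absolute peptide amount.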

  10. Quantum electrodynamics of the internal source x-ray holographies: Bremsstrahlung, fluorescence, and multiple-energy x-ray holography

    International Nuclear Information System (INIS)

    Miller, G.A.; Sorensen, L.B.

    1997-01-01

Quantum electrodynamics (QED) is used to derive the differential cross sections measured in the three new experimental internal source ensemble x-ray holographies: bremsstrahlung (BXH), fluorescence (XFH), and multiple-energy (MEXH) x-ray holography. The polarization dependence of the BXH cross section is also obtained. For BXH, we study analytically and numerically the possible effects of the virtual photons and electrons which enter QED calculations in summing over the intermediate states. For the low photon and electron energies used in the current experiments, we show that the virtual intermediate states produce only very small effects. This is because the uncertainty principle limits the distance that the virtual particles can propagate to be much shorter than the separation between the regions of high electron density in the adjacent atoms. We also find that using the asymptotic form of the scattering wave function causes about a 5 to 10% error for near-forward scattering. copyright 1997 The American Physical Society

  11. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

In this paper, we study simplified models of the ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on this model, the performance measures are analyzed under different output service schemes.

  12. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery.

    Science.gov (United States)

    Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M

    2017-09-01

Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection comprises compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.

  13. Design database for quantitative trait loci (QTL) data warehouse, data mining, and meta-analysis.

    Science.gov (United States)

    Hu, Zhi-Liang; Reecy, James M; Wu, Xiao-Lin

    2012-01-01

    A database can be used to warehouse quantitative trait loci (QTL) data from multiple sources for comparison, genomic data mining, and meta-analysis. A robust database design involves sound data structure logistics, meaningful data transformations, normalization, and proper user interface designs. This chapter starts with a brief review of relational database basics and concentrates on issues associated with curation of QTL data into a relational database, with emphasis on the principles of data normalization and structure optimization. In addition, some simple examples of QTL data mining and meta-analysis are included. These examples are provided to help readers better understand the potential and importance of sound database design.
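The normalization principles mentioned above can be shown with a minimal SQLite schema, where traits and sources are factored into their own tables so each fact is stored once; all table and column names here are invented, not those of the chapter's QTL database:

```python
# Minimal normalized relational sketch for curated QTL data: the qtl table
# references trait and source rows instead of repeating their text.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE trait  (trait_id  INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE source (source_id INTEGER PRIMARY KEY, citation TEXT);
CREATE TABLE qtl (
    qtl_id    INTEGER PRIMARY KEY,
    trait_id  INTEGER REFERENCES trait,
    source_id INTEGER REFERENCES source,
    chromosome TEXT, start_cm REAL, end_cm REAL);
""")
con.execute("INSERT INTO trait VALUES (1, 'backfat thickness')")
con.execute("INSERT INTO source VALUES (1, 'Study A (hypothetical)')")
con.execute("INSERT INTO qtl VALUES (1, 1, 1, 'SSC7', 55.0, 62.5)")

row = con.execute("""
    SELECT t.name, q.chromosome FROM qtl q JOIN trait t USING (trait_id)
""").fetchone()
print(row)  # ('backfat thickness', 'SSC7')
```

With this layout, meta-analysis queries (for example, all QTL for one trait across studies) become simple joins rather than string matching over denormalized records.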

  14. Factors Associated With Healthcare-Acquired Catheter-Associated Urinary Tract Infections: Analysis Using Multiple Data Sources and Data Mining Techniques.

    Science.gov (United States)

    Park, Jung In; Bliss, Donna Z; Chi, Chih-Lin; Delaney, Connie W; Westra, Bonnie L

    The purpose of this study was to identify factors associated with healthcare-acquired catheter-associated urinary tract infections (HA-CAUTIs) using multiple data sources and data mining techniques. Three data sets were integrated for analysis: electronic health record data from a university hospital in the Midwestern United States was combined with staffing and environmental data from the hospital's National Database of Nursing Quality Indicators and a list of patients with HA-CAUTIs. Three data mining techniques were used for identification of factors associated with HA-CAUTI: decision trees, logistic regression, and support vector machines. Fewer total nursing hours per patient-day, lower percentage of direct care RNs with specialty nursing certification, higher percentage of direct care RNs with associate's degree in nursing, and higher percentage of direct care RNs with BSN, MSN, or doctoral degree are associated with HA-CAUTI occurrence. The results also support the association of the following factors with HA-CAUTI identified by previous studies: female gender; older age (>50 years); longer length of stay; severe underlying disease; glucose lab results (>200 mg/dL); longer use of the catheter; and RN staffing. Additional findings from this study demonstrated that the presence of more nurses with specialty nursing certifications can reduce HA-CAUTI occurrence. While there may be valid reasons for leaving in a urinary catheter, findings show that having a catheter in for more than 48 hours contributes to HA-CAUTI occurrence. Finally, the findings suggest that more nursing hours per patient-day are related to better patient outcomes.
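One of the three techniques named above, logistic regression, can be sketched on a toy, linearly separable version of the catheter-duration signal; the single feature and the data points are invented purely to illustrate the fitting step, not drawn from the study:

```python
# Plain gradient-descent logistic regression on a toy HA-CAUTI-style signal.
import math

# feature: (catheter hours - 48) / 24; label: 1 = infection occurred
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))   # sigmoid prediction
        gw += (p - y) * x                          # gradient of log-loss
        gb += (p - y)
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

preds = [1 if w * x + b > 0 else 0 for x in xs]
print(preds == ys)  # True
```

In the study this step runs over many staffing, education, and clinical features at once; decision trees and support vector machines were fitted to the same integrated data set for comparison.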

  15. TH-CD-207B-01: BEST IN PHYSICS (IMAGING): Development of High Brightness Multiple-Pixel X-Ray Source Using Oxide Coated Cathodes

    International Nuclear Information System (INIS)

    Kandlakunta, P; Pham, R; Zhang, T

    2016-01-01

Purpose: To develop and characterize a high brightness multiple-pixel thermionic emission x-ray (MPTEX) source. Methods: Multiple-pixel x-ray sources allow for designs of novel x-ray imaging techniques, such as fixed gantry CT, digital tomosynthesis, tetrahedron beam computed tomography, etc. We are developing a high-brightness multiple-pixel thermionic emission x-ray (MPTEX) source based on oxide coated cathodes. The oxide cathode is chosen as the electron source due to its high emission current density and low operating temperature. A MPTEX prototype has been developed which may contain up to 41 micro-rectangular oxide cathodes at 4 mm pixel spacing. Electronics hardware was developed for source control and switching. The cathode emission current was evaluated and x-ray measurements were performed to estimate the focal spot size. Results: The oxide cathodes were able to produce ∼110 mA cathode current in pulse mode, which corresponds to an emission current density of 0.55 A/cm². The maximum kVp of the MPTEX prototype is currently limited to 100 kV due to the rating of the high voltage feedthrough. Preliminary x-ray measurements estimated the focal spot size as 1.5 × 1.3 mm². Conclusion: A MPTEX source was developed with thermionic oxide coated cathodes and preliminary source characterization was successfully performed. The MPTEX source is able to produce an array of high brightness x-ray beams with a fast switching speed.
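As a consistency check on the quoted figures, 110 mA at 0.55 A/cm² implies an emitting area of 0.2 cm² per cathode; note the area is derived here, not stated in the abstract:

```python
# Emission current density J = I / A, so the emitting area per cathode
# follows from the two reported numbers.
current_a = 0.110                # peak cathode current, A (110 mA)
density_a_cm2 = 0.55             # reported emission current density, A/cm^2
area_cm2 = current_a / density_a_cm2
print(round(area_cm2, 2))        # 0.2
```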

  16. Multiple source genes of HAmo SINE actively expanded and ongoing retroposition in cyprinid genomes relying on its partner LINE

    Directory of Open Access Journals (Sweden)

    Gan Xiaoni

    2010-04-01

Abstract Background We recently characterized HAmo SINE and its partner LINE in silver carp and bighead carp based on hybridization capture of repetitive elements from digested genomic DNA in solution using a bead-probe [1]. To reveal the distribution and evolutionary history of SINEs and LINEs in cyprinid genomes, we performed a multi-species search for HAmo SINE and its partner LINE using the bead-probe capture and internal-primer-SINE polymerase chain reaction (PCR) techniques. Results Sixty-seven full-size and 125 internal-SINE sequences (as well as 34 full-size and 9 internal sequences previously reported in bighead carp and silver carp) from 17 species of the family Cyprinidae were aligned, as well as 14 newly isolated HAmoL2 sequences. Four subfamilies (types I, II, III and IV), which were divided based on diagnostic nucleotides in the tRNA-unrelated region, expanded preferentially within a certain lineage or within the whole family of Cyprinidae as multiple active source genes. The copy numbers of HAmo SINEs were estimated by quantitative RT-PCR to vary from 10⁴ to 10⁶ in cyprinid genomes. Over one hundred type IV members were identified and characterized in the primitive cyprinid Danio rerio genome, but only tens of sequences were found to be similar to types I, II and III, since type IV is the oldest subfamily and its members are dispersed in almost all investigated cyprinid fishes. To determine the taxonomic distribution of HAmo SINE, inter-primer SINE PCR was conducted in other non-cyprinid fishes; the results show that HAmo SINE-related sequences may be dispersed in other families of the order Cypriniformes but are absent in other orders of bony fishes: Siluriformes, Polypteriformes, Lepidosteiformes, Acipenseriformes and Osteoglossiformes. Conclusions Depending on HAmo LINE2, multiple source genes (subfamilies) of HAmo SINE actively expanded and underwent retroposition in a certain lineage or within the whole family of Cyprinidae. From this

  17. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and whom to contact were prime questions for each of the database program managers. Additionally, how each agency uses the accident data was of major interest.

  18. JICST Factual Database(2)

    Science.gov (United States)

    Araki, Keisuke

A computer program that builds atom-bond connection tables from nomenclature has been developed. Chemical substances are input with their nomenclature and a variety of trivial names or experimental code numbers. The chemical structures in the database are stored stereospecifically and can be searched and displayed according to stereochemistry. Source data come from the laws and regulations of Japan, the RTECS of the US, and so on. The database plays a central role within the integrated fact database service of JICST and makes interrelational retrieval possible.

  19. Specialist Bibliographic Databases.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Trukhachev, Vladimir I; Kostyukova, Elena I; Gerasimov, Alexey N; Kitas, George D

    2016-05-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find particularly useful source selection criteria and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.


  1. DATABASE REPLICATION IN HETEROGENOUS PLATFORM

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2014-01-01

    The application of diverse database technologies in enterprises today is increasingly common practice. To provide high availability and survivability of real-time information, a database replication technology that is capable of replicating databases across heterogeneous platforms is required. The purpose of this research is to find a technology with such capability. In this research, the data source is stored in an MSSQL database server running on Windows. The data will be replicated to MyS...
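
    The heterogeneous replication idea (copying rows from one vendor's database into another's) can be illustrated with a toy, stdlib-only stand-in; two SQLite connections play the roles of the MSSQL source and the replication target, and the table name and contents are invented for illustration:

```python
import sqlite3

# Toy stand-in for a heterogeneous source/target pair: two SQLite databases
# illustrate the one-way copy-rows idea using only the standard library.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
dst.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 5), (2, 3)])
src.commit()

def replicate(source, target, table):
    # Naive snapshot replication keyed on the primary key; a production
    # replicator would track change logs rather than rescan whole tables.
    rows = source.execute(f"SELECT id, qty FROM {table}").fetchall()
    target.executemany(f"INSERT OR REPLACE INTO {table} VALUES (?, ?)", rows)
    target.commit()

replicate(src, dst, "orders")
print(dst.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # → 2
```

    The `INSERT OR REPLACE` upsert makes the copy idempotent, so the same snapshot can be re-applied safely after a failure.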

  2. Discovering perturbation of modular structure in HIV progression by integrating multiple data sources through non-negative matrix factorization.

    Science.gov (United States)

    Ray, Sumanta; Maulik, Ujjwal

    2016-12-20

    Detecting perturbation of modular structure during HIV-1 disease progression is an important step toward understanding the stage-specific infection pattern of the HIV-1 virus in human cells. In this article, we propose a novel methodology that integrates multiple biological information sources to identify such disruption in human gene modules during different stages of HIV-1 infection. We integrate three different types of biological information: gene expression, protein-protein interaction (PPI) and gene ontology information, into single gene meta-modules through non-negative matrix factorization (NMF). As the identified meta-modules inherit this information, detecting their perturbation reflects the changes in expression pattern, in PPI structure and in functional similarity of genes during the infection progression. To integrate modules of different data sources into strong meta-modules, NMF-based clustering is utilized here. Perturbation in the meta-modular structure is identified by investigating the topological and intramodular properties and ranking the meta-modules using a rank aggregation algorithm. We have also analyzed the preservation of the significant GO terms in which the human proteins of the meta-modules participate. Moreover, we have performed an analysis to show the change of the coregulation pattern of identified transcription factors (TFs) over the HIV progression stages.
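
    The integration scheme sketched above (stacking several nonnegative data views of the same genes and factorizing them jointly) can be illustrated with scikit-learn's NMF. All matrices and dimensions below are synthetic placeholders, not the paper's data:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
# Hypothetical inputs: three nonnegative gene-by-feature matrices standing in
# for expression, PPI and GO-similarity profiles of the same 50 genes.
expr = rng.random((50, 20))
ppi = rng.random((50, 30))
go = rng.random((50, 10))

# Integrate by concatenating the feature spaces, then factorize X ≈ W @ H.
X = np.hstack([expr, ppi, go])
model = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)   # gene weights per meta-module
H = model.components_        # meta-module profiles over all features

# Each gene joins the meta-module where its factor weight is largest.
meta_modules = W.argmax(axis=1)
print(W.shape, H.shape, meta_modules.shape)
```

    Detecting perturbation would then compare these meta-module assignments and their topological properties across infection stages.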

  3. Directory of IAEA databases

    International Nuclear Information System (INIS)

    1992-12-01

    This second edition of the Directory of IAEA Databases has been prepared within the Division of Scientific and Technical Information (NESI). Its main objective is to describe the computerized information sources available to staff members. This directory contains all databases produced at the IAEA, including databases stored on the mainframe, LANs and PCs. All IAEA Division Directors have been requested to register the existence of their databases with NESI. For the second edition, database owners were requested to review the existing entries for their databases and answer four additional questions. The four additional questions concerned the type of database (e.g. Bibliographic, Text, Statistical etc.), the category of database (e.g. Administrative, Nuclear Data etc.), the available documentation and the type of media used for distribution. In the individual entries on the following pages the answers to the first two questions (type and category) are always listed, but the answers to the second two questions (documentation and media) are listed only when the information has been made available.

  4. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as the available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and by adding new replicas should the load increase. Finally, database replication can provide fast local access, even for geographically distributed clients, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and

  5. Prediction of acute toxicity of phenol derivatives using multiple linear regression approach for Tetrahymena pyriformis contaminant identification in a median-size database.

    Science.gov (United States)

    Dieguez-Santana, Karel; Pham-The, Hai; Villegas-Aguilar, Pedro J; Le-Thi-Thu, Huong; Castillo-Garit, Juan A; Casañola-Martin, Gerardo M

    2016-12-01

    In this article, the modeling of growth inhibitory activity against Tetrahymena pyriformis is described. The 0-2D Dragon descriptors, based on structural aspects, are mainly used to gain some knowledge of the factors influencing aquatic toxicity. Moreover, the work uses an enlarged dataset of phenol derivatives, described for the first time and composed of 358 chemicals, surpassing the previous datasets of about one hundred compounds. The results of the model evaluation by the parameters in the training, prediction and validation sets are adequate and comparable with those of previous works. The most influential descriptors included in the model are X3A, MWC02, MWC10 and piPC03, with positive contributions to the dependent variable, and MWC09, piPC02 and TPC, with negative contributions. In a next step, a median-size database of nearly 8000 phenolic compounds extracted from ChEMBL was evaluated with the developed quantitative structure-toxicity relationship (QSTR) model, providing some clues (SARs) for the identification of ecotoxicological compounds. The outcome of this report is very useful for screening chemical databases to find the compounds responsible for aquatic contamination in the biomarker used in the current work. Copyright © 2016 Elsevier Ltd. All rights reserved.
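
    A multiple linear regression of the kind described can be sketched as follows. The descriptor matrix and toxicity endpoint are simulated; only the descriptor names and the signs of their contributions are taken from the abstract:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
descriptors = ["X3A", "MWC02", "MWC10", "piPC03", "MWC09", "piPC02", "TPC"]
X = rng.random((358, len(descriptors)))  # synthetic descriptor matrix
# Coefficient signs follow the reported contributions (magnitudes invented).
true_coefs = np.array([1.2, 0.8, 0.5, 0.9, -0.7, -0.4, -0.6])
y = X @ true_coefs + rng.normal(0, 0.05, 358)  # simulated toxicity endpoint

model = LinearRegression().fit(X, y)
for name, coef in zip(descriptors, model.coef_):
    print(f"{name}: {coef:+.2f}")
```

    With 358 samples and small noise, the fitted coefficients recover the positive contributions of the first four descriptors and the negative contributions of the last three.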

  6. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  7. Snowstorm Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Snowstorm Database is a collection of over 500 snowstorms dating back to 1900 and updated operationally. Only storms having large areas of heavy snowfall (10-20...

  8. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May 1,...

  9. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article about the national database for nursing research established at Dansk Institut for Sundheds- og Sygeplejeforskning. The aim of the database is to gather knowledge about research and development activities within nursing.

  10. Constructing a Geology Ontology Using a Relational Database

    Science.gov (United States)

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In the geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multiple-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods, such as the Geo-rule based method, the ontology life cycle method and the module design method, have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse engineering of abstracted semantic information from an existing database. The key is to construct rules for the transformation of database entities into the ontology. Relative to the human-computer interaction method, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritance and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database to an OWL-based geological ontology, which is based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In a geological ontology, inheritance and union operations between superclass and subclass were used to represent the nested relationship in a geochronology and the multiple inheritances
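
    A minimal, self-contained sketch of the database-to-ontology rule idea: rows of a hypothetical geology table, flattened to (entity, parent) pairs, are converted to OWL classes whose rdfs:subClassOf links preserve the table's inheritance relationships. The namespace and table contents are invented for illustration:

```python
# Hypothetical rows from a relational geology table, flattened to
# (entity, parent) pairs; the parent foreign key encodes inheritance.
rows = [
    ("Rock", None),
    ("IgneousRock", "Rock"),
    ("SedimentaryRock", "Rock"),
    ("Basalt", "IgneousRock"),
]

PREFIX = ("@prefix geo: <http://example.org/geology#> .\n"
          "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .\n"
          "@prefix owl: <http://www.w3.org/2002/07/owl#> .\n")

def to_turtle(rows):
    # Transformation rule: each row becomes an owl:Class, and each
    # parent foreign key becomes an rdfs:subClassOf axiom.
    lines = [PREFIX]
    for name, parent in rows:
        stmt = f"geo:{name} a owl:Class"
        if parent:
            stmt += f" ;\n    rdfs:subClassOf geo:{parent}"
        lines.append(stmt + " .")
    return "\n".join(lines)

print(to_turtle(rows))
```

    An inverse mapping (reading the subClassOf axioms back into (entity, parent) pairs) would verify that no semantics were lost, as the abstract describes.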

  11. Contextualized perceptions of movement as a source of expanded insight: People with multiple sclerosis' experience with physiotherapy.

    Science.gov (United States)

    Normann, Britt; Sørgaard, Knut W; Salvesen, Rolf; Moe, Siri

    2013-01-01

    Hospital outpatient clinics for people with multiple sclerosis (PwMS) play an important role in health care. Research regarding physiotherapy in such clinics is limited. The purpose was to investigate how PwMS perceive movement during single sessions of physiotherapy in a hospital outpatient clinic, and what these experiences mean for the patients' insight into their movement disturbances. Qualitative research interviews were performed with a purposive sample of 12 PwMS and supplemented with seven videotaped sessions. Content analysis was performed. The results indicate that contextualized perceptions of movement appear to be an essential source for PwMS to gain expanded insight into their individual movement disturbances, regardless of their ambulatory status. The contextualization implies that perceptions of movement are integrated with the physiotherapist's explanations regarding optimizing gait, balance or other activities of daily life. Perceptions of improvement in body part movement and/or functional activities are vital to enhancing patients' understanding of their individual movement disorders, and they may provide expanded insight regarding future possibilities and limitations involving everyday tasks. The implementation of movements, which transforms the perceived improvement into self-assisted exercises, appeared to be meaningful. Contextualized perceptions of improvements in movement may strengthen the person's sense of ownership and sense of agency, and thus promote autonomy and self-encouragement. The findings underpin the importance of contextualized perceptions of movement, based on exploration of the potential for change, as an integrated part of information and communication in the health care for PwMS. Further investigations are necessary to deepen our knowledge.

  12. Glacier changes and climate trends derived from multiple sources in the data scarce Cordillera Vilcanota region, southern Peruvian Andes

    Science.gov (United States)

    Salzmann, N.; Huggel, C.; Rohrer, M.; Silverio, W.; Mark, B. G.; Burns, P.; Portocarrero, C.

    2013-01-01

    The role of glaciers as temporal water reservoirs is particularly pronounced in the (outer) tropics because of the very distinct wet/dry seasons. Rapid glacier retreat caused by climatic changes is thus a major concern, and decision makers urgently demand regional/local glacier evolution trends, ice mass estimates and runoff assessments. However, in remote mountain areas, spatial and temporal data coverage is typically very scarce, and this is further complicated by high spatial and temporal variability in regions with complex topography. Here, we present an approach for dealing with these constraints. For the Cordillera Vilcanota (southern Peruvian Andes), which is the second largest glacierized cordillera in Peru (after the Cordillera Blanca) and also comprises the Quelccaya Ice Cap, we assimilate a comprehensive multi-decadal collection of available glacier and climate data from multiple sources (satellite images, meteorological station data and climate reanalysis), and analyze them for respective changes in glacier area and volume and related trends in air temperature, precipitation and, in a more general manner, specific humidity. While we found only marginal glacier changes between 1962 and 1985, there has been a massive ice loss since 1985 (about 30% of area and about 45% of volume). These high numbers corroborate studies from other glacierized cordilleras in Peru. The climate data show overall a moderate increase in air temperature and mostly weak, not significant trends for precipitation sums, and probably cannot fully explain the observed substantial ice loss. Therefore, the likely increase of specific humidity in the upper troposphere, where the glaciers are located, is further discussed, and we conclude that it played a major role in the observed massive ice loss of the Cordillera Vilcanota over the past decades.

  13. Glacier changes and climate trends derived from multiple sources in the data scarce Cordillera Vilcanota region, southern Peruvian Andes

    Directory of Open Access Journals (Sweden)

    N. Salzmann

    2013-01-01

    Full Text Available The role of glaciers as temporal water reservoirs is particularly pronounced in the (outer) tropics because of the very distinct wet/dry seasons. Rapid glacier retreat caused by climatic changes is thus a major concern, and decision makers urgently demand regional/local glacier evolution trends, ice mass estimates and runoff assessments. However, in remote mountain areas, spatial and temporal data coverage is typically very scarce, and this is further complicated by high spatial and temporal variability in regions with complex topography. Here, we present an approach for dealing with these constraints. For the Cordillera Vilcanota (southern Peruvian Andes), which is the second largest glacierized cordillera in Peru (after the Cordillera Blanca) and also comprises the Quelccaya Ice Cap, we assimilate a comprehensive multi-decadal collection of available glacier and climate data from multiple sources (satellite images, meteorological station data and climate reanalysis), and analyze them for respective changes in glacier area and volume and related trends in air temperature, precipitation and, in a more general manner, specific humidity. While we found only marginal glacier changes between 1962 and 1985, there has been a massive ice loss since 1985 (about 30% of area and about 45% of volume). These high numbers corroborate studies from other glacierized cordilleras in Peru. The climate data show overall a moderate increase in air temperature and mostly weak, not significant trends for precipitation sums, and probably cannot fully explain the observed substantial ice loss. Therefore, the likely increase of specific humidity in the upper troposphere, where the glaciers are located, is further discussed, and we conclude that it played a major role in the observed massive ice loss of the Cordillera Vilcanota over the past decades.

  14. The Hidden Health and Economic Burden of Rotavirus Gastroenteritis in Malaysia: An Estimation Using Multiple Data Sources.

    Science.gov (United States)

    Loganathan, Tharani; Ng, Chiu-Wan; Lee, Way-Seah; Jit, Mark

    2016-06-01

    Rotavirus gastroenteritis (RVGE) results in substantial mortality and morbidity worldwide. However, an accurate estimation of the health and economic burden of RVGE in Malaysia covering public, private and home treatment is lacking. Data from multiple sources were used to estimate diarrheal mortality and morbidity according to health service utilization. The proportion of this burden attributable to rotavirus was estimated from a community-based study and a meta-analysis we conducted of primary hospital-based studies. Rotavirus incidence was determined by multiplying acute gastroenteritis incidence with estimates of the proportion of gastroenteritis attributable to rotavirus. The economic burden of rotavirus disease was estimated from the health systems and societal perspectives. Annually, rotavirus results in 27 deaths, 31,000 hospitalizations, 41,000 outpatient visits and 145,000 episodes of home-treated gastroenteritis in Malaysia. We estimate an annual rotavirus incidence of 1 death per 100,000 children and 12 hospitalizations, 16 outpatient clinic visits and 57 home-treated episodes per 1000 children under 5 years. Annually, RVGE is estimated to cost US$ 34 million to the healthcare provider and US$ 50 million to society. Productivity loss contributes almost a third of the costs to society. Publicly, privately and home-treated episodes account for 52%, 27% and 21%, respectively, of the total societal costs. RVGE represents a considerable health and economic burden in Malaysia. Much of the burden lies in privately or home-treated episodes and is poorly captured in previous studies. This study provides vital information for future evaluations of cost-effectiveness, which are necessary for policy-making regarding universal vaccination.
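
    The central incidence calculation (acute gastroenteritis incidence multiplied by the rotavirus-attributable fraction) reduces to simple arithmetic. The AGE rates and fractions below are hypothetical placeholders, chosen only so the products land near the reported figures of 12 hospitalizations, 16 outpatient visits and 57 home-treated episodes per 1000 children:

```python
# Illustrative sketch: rotavirus incidence per 1000 children under 5
# = acute-gastroenteritis (AGE) incidence x fraction attributable to
# rotavirus. All input numbers are invented placeholders.
age_per_1000 = {"hospitalization": 40.0, "outpatient": 53.3, "home": 190.0}
rota_fraction = {"hospitalization": 0.30, "outpatient": 0.30, "home": 0.30}

rota_per_1000 = {k: round(age_per_1000[k] * rota_fraction[k], 1)
                 for k in age_per_1000}
print(rota_per_1000)
```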

  15. Text-Based Argumentation with Multiple Sources: A Descriptive Study of Opportunity to Learn in Secondary English Language Arts, History, and Science

    Science.gov (United States)

    Litman, Cindy; Marple, Stacy; Greenleaf, Cynthia; Charney-Sirott, Irisa; Bolz, Michael J.; Richardson, Lisa K.; Hall, Allison H.; George, MariAnne; Goldman, Susan R.

    2017-01-01

    This study presents a descriptive analysis of 71 videotaped lessons taught by 34 highly regarded secondary English language arts, history, and science teachers, collected to inform an intervention focused on evidence-based argumentation from multiple text sources. Studying the practices of highly regarded teachers is valuable for identifying…

  16. Effectiveness of Partition and Graph Theoretic Clustering Algorithms for Multiple Source Partial Discharge Pattern Classification Using Probabilistic Neural Network and Its Adaptive Version: A Critique Based on Experimental Studies

    Directory of Open Access Journals (Sweden)

    S. Venkatesh

    2012-01-01

    Full Text Available Partial discharge (PD is a major cause of failure of power apparatus and hence its measurement and analysis have emerged as a vital field in assessing the condition of the insulation system. Several efforts have been undertaken by researchers to classify PD pulses utilizing artificial intelligence techniques. Recently, the focus has shifted to the identification of multiple sources of PD since it is often encountered in real-time measurements. Studies have indicated that classification of multi-source PD becomes difficult with the degree of overlap and that several techniques such as mixed Weibull functions, neural networks, and wavelet transformation have been attempted with limited success. Since digital PD acquisition systems record data for a substantial period, the database becomes large, posing considerable difficulties during classification. This research work aims firstly at analyzing aspects concerning classification capability during the discrimination of multisource PD patterns. Secondly, it attempts at extending the previous work of the authors in utilizing the novel approach of probabilistic neural network versions for classifying moderate sets of PD sources to that of large sets. The third focus is on comparing the ability of partition-based algorithms, namely, the labelled (learning vector quantization and unlabelled (K-means versions, with that of a novel hypergraph-based clustering method in providing parsimonious sets of centers during classification.

  17. Analysis of Brassica oleracea early stage abiotic stress responses reveals tolerance in multiple crop types and for multiple sources of stress.

    Science.gov (United States)

    Beacham, Andrew M; Hand, Paul; Pink, David Ac; Monaghan, James M

    2017-12-01

    Brassica oleracea includes a number of important crop types such as cabbage, cauliflower, broccoli and kale. Current climate conditions and weather patterns are causing significant losses in these crops, meaning that new cultivars with improved tolerance of one or more abiotic stress types must be sought. In this study, genetically fixed B. oleracea lines belonging to a Diversity Fixed Foundation Set (DFFS) were assayed for their response to seedling stage-imposed drought, flood, salinity, heat and cold stress. Significant (P ≤ 0.05) variation in stress tolerance response was found for each stress, for each of four measured variables (relative fresh weight, relative dry weight, relative leaf number and relative plant height). Lines tolerant to multiple stresses were found to belong to several different crop types. There was no overall correlation between the responses to the different stresses. Abiotic stress tolerance was identified in multiple B. oleracea crop types, with some lines exhibiting resistance to multiple stresses. For each stress, no one crop type appeared significantly more or less tolerant than others. The results are promising for the development of more environmentally robust lines of different B. oleracea crops by identifying tolerant material and highlighting the relationship between responses to different stresses. © 2017 Society of Chemical Industry.

  18. Source apportionment of PM2.5 at multiple Northwest U.S. sites: Assessing regional winter wood smoke impacts from residential wood combustion

    Science.gov (United States)

    Kotchenruther, Robert A.

    2016-10-01

    Wood smoke from residential wood combustion is a significant source of elevated PM2.5 in many communities across the Northwest U.S. Accurate representation of residential wood combustion in source-oriented regional scale air quality models is challenging because of multiple uncertainties. As an alternative to source-oriented source apportionment, this work provides, through receptor-oriented source apportionment, an assessment of winter residential wood combustion impacts at multiple Northwest U.S. locations. Source apportionment was performed on chemically speciated PM2.5 from 19 monitoring sites using the Positive Matrix Factorization (PMF) receptor model. Each site was modeled independently, but a common data preparation and modeling protocol was used so that results were as comparable as possible across sites. Model solutions had from 4 to 8 PMF factors, depending on the site. PMF factors at each site were associated with a source classification (e.g., primary wood smoke), a dominant chemical composition (e.g., ammonium nitrate), or were some mixture. 15 different sources or chemical compositions were identified as contributing to PM2.5 across the 19 sites. The 6 most common were: aged wood smoke and secondary organic carbon, motor vehicles, primary wood smoke, ammonium nitrate, ammonium sulfate, and fugitive dust. Wood smoke was identified at every site, with both aged and primary wood smoke identified at most sites. Wood smoke contributions to PM2.5 were averaged for the two winter months of December and January, the months when wood smoke in the Northwest U.S. is mainly from residential wood combustion. The total contribution of residential wood combustion, that from primary plus aged smoke, ranged from 11.4% to 92.7% of average December and January PM2.5 depending on the site, with the highest percent contributions occurring in smaller towns that have fewer expected sources of winter PM2.5. Receptor modeling at multiple sites, such as that conducted in this
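
    PMF factors an observations-by-species matrix into nonnegative factor contributions and chemical profiles. As a simplified stand-in (real PMF additionally weights each value by its measurement uncertainty), scikit-learn's NMF illustrates the decomposition on synthetic data:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Hypothetical data: 200 daily samples x 12 chemical species (standing in
# for measured concentrations of OC, EC, nitrate, sulfate, metals, ...).
X = rng.random((200, 12))

model = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500)
G = model.fit_transform(X)  # factor contributions per sample
F = model.components_       # factor chemical profiles

# Percent contribution of factor 0 (e.g. "primary wood smoke") to total
# reconstructed mass over all samples.
factor_mass = (G[:, [0]] @ F[[0], :]).sum()
pct = 100 * factor_mass / (G @ F).sum()
print(f"factor 0 share: {pct:.1f}%")
```

    In an actual PMF study the factor count (4 to 8 here) is chosen per site, and each factor is labeled by inspecting which species dominate its profile.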

  19. TH-CD-207B-01: BEST IN PHYSICS (IMAGING): Development of High Brightness Multiple-Pixel X-Ray Source Using Oxide Coated Cathodes

    Energy Technology Data Exchange (ETDEWEB)

    Kandlakunta, P; Pham, R; Zhang, T [Washington University School of Medicine, St. Louis, MO (United States)

    2016-06-15

    Purpose: To develop and characterize a high brightness multiple-pixel thermionic emission x-ray (MPTEX) source. Methods: Multiple-pixel x-ray sources allow for designs of novel x-ray imaging techniques, such as fixed gantry CT, digital tomosynthesis, tetrahedron beam computed tomography, etc. We are developing a high-brightness multiple-pixel thermionic emission x-ray (MPTEX) source based on oxide coated cathodes. The oxide cathode is chosen as the electron source due to its high emission current density and low operating temperature. A MPTEX prototype has been developed which may contain up to 41 micro-rectangular oxide cathodes at 4 mm pixel spacing. Electronics hardware was developed for source control and switching. The cathode emission current was evaluated and x-ray measurements were performed to estimate the focal spot size. Results: The oxide cathodes were able to produce ∼110 mA cathode current in pulse mode, which corresponds to an emission current density of 0.55 A/cm². The maximum kVp of the MPTEX prototype is currently limited to 100 kV due to the rating of the high voltage feedthrough. Preliminary x-ray measurements estimated the focal spot size as 1.5 × 1.3 mm². Conclusion: A MPTEX source was developed with thermionic oxide coated cathodes and preliminary source characterization was successfully performed. The MPTEX source is able to produce an array of high brightness x-ray beams with a fast switching speed.

  20. Tunable optical frequency comb enabled scalable and cost-effective multiuser orthogonal frequency-division multiple access passive optical network with source-free optical network units.

    Science.gov (United States)

    Chen, Chen; Zhang, Chongfu; Liu, Deming; Qiu, Kun; Liu, Shuang

    2012-10-01

    We propose and experimentally demonstrate a multiuser orthogonal frequency-division multiple access passive optical network (OFDMA-PON) with source-free optical network units (ONUs), enabled by tunable optical frequency comb generation technology. By cascading a phase modulator (PM) and an intensity modulator and dynamically controlling the peak-to-peak voltage of the PM driving signal, a tunable optical frequency comb source can be generated. It is utilized to configure an OFDMA-PON enhanced with multiple source-free ONUs, in which simultaneous and interference-free multiuser upstream transmission over a single wavelength can be efficiently supported. The proposed multiuser OFDMA-PON is scalable and cost-effective, and its feasibility is successfully verified by experiment.

  1. The CERN accelerator measurement database: on the road to federation

    International Nuclear Information System (INIS)

    Roderick, C.; Billen, R.; Gourber-Pace, M.; Hoibian, N.; Peryt, M.

    2012-01-01

    The Measurement database, acting as short-term central persistence and front-end of the CERN accelerator Logging Service, receives billions of time-series data per day for 200000+ signals. A variety of data acquisition systems on hundreds of front-end computers publish source data that eventually end up being logged in the Measurement database. As part of a federated approach to data management, information about source devices is defined in a Configuration database, whilst the signals to be logged are defined in the Measurement database. A mapping, which is often complex and subject to change/extension, is required in order to subscribe to the source devices and write the published data to the corresponding named signals. Since 2005, this mapping was done by means of dozens of XML files, which were manually maintained by multiple persons, resulting in a configuration that was error prone. In 2010 this configuration was fully centralized in the Measurement database itself, significantly reducing the complexity and the number of actors in the process. Furthermore, logging processes immediately pick up modified configurations via JMS-based notifications sent directly from the database. This paper describes the architecture and the benefits of the current implementation, as well as the next steps on the road to a fully federated solution. (authors)

  2. Gender as a Modifying Factor Influencing Myotonic Dystrophy Type 1 Phenotype Severity and Mortality: A Nationwide Multiple Databases Cross-Sectional Observational Study.

    Directory of Open Access Journals (Sweden)

    Celine Dogan

    Full Text Available Myotonic Dystrophy type 1 (DM1) is one of the most heterogeneous hereditary diseases in terms of age of onset, clinical manifestations, and severity, challenging both medical management and clinical trials. The CTG expansion size is the main factor determining the age of onset, although no factor can finely predict phenotype and prognosis. Differences between males and females have not been specifically reported. Our aim is to study the impact of gender on DM1 phenotype and severity. We first performed a cross-sectional analysis of the main multiorgan clinical parameters in 1409 adult DM1 patients (>18 y) from the DM-Scope nationwide registry and observed different patterns in males and females. Then, we assessed the impact of gender on social and economic domains using the AFM-Téléthon DM1 survey (n = 970), and on morbidity and mortality using the French National Health Service Database (n = 3301). Men more frequently had (1) severe muscular disability with marked myotonia, muscle weakness, cardiac and respiratory involvement; (2) developmental abnormalities with facial dysmorphism and cognitive impairment, inferred from low educational levels and work in specialized environments; and (3) a lonely life. Alternatively, women more frequently had cataracts, dysphagia, digestive tract dysfunction, incontinence, thyroid disorder and obesity. Most differences were out of proportion to those observed in the general population. Compared to women, males were more affected in their social and economic life. In addition, they were more frequently hospitalized for cardiac problems and had a higher mortality rate. Gender is a previously unrecognized factor influencing DM1 clinical profile and severity, with worse socio-economic consequences of the disease and higher morbidity and mortality in males. Gender should be considered in the design of both stratified medical management and clinical trials.

  3. Assessment of factors which affect multiple uses of water sources at household level in rural Zimbabwe - A case study of Marondera, Murehwa and Uzumba Maramba Pfungwe districts

    Science.gov (United States)

    Katsi, Luckson; Siwadi, Japson; Guzha, Edward; Makoni, Fungai S.; Smits, Stef

    Water, with all its multiple uses, plays a pivotal role in the sustenance of rural livelihoods, especially those of the poor. As such, the provision of water which goes beyond domestic needs to include water for small-scale productive uses should be encouraged to enhance people's livelihood options by making a significant contribution to household income, food security, improved nutrition and health. All these multiple benefits, if combined, can assist in the fight against hunger and poverty. This study was conducted in Mashonaland East province, covering the Marondera, Murehwa and Uzumba Maramba Pfungwe districts in Zimbabwe, from December 2005 to May 2006, to assess factors which affect multiple uses of water sources at the household level. Participatory Rural Appraisal tools such as discussions, observations and interviews were used for data collection. The survey found that people indeed require water for productive purposes apart from domestic uses, which are often given top priority. The study found that multiple uses of water sources at the household level can be affected by the segmentation of water services into domestic and productive water supply schemes, technology and system design, water quality and quantity, and distance to water sources, among other factors. The study recommends that, for water service providers to be able to provide appropriate, efficient and sustainable services, they should understand and appreciate that people's water needs are integrated and are part and parcel of their multifaceted livelihood strategies.

  4. Determination of the multiplication factor and its bias by the 252Cf-source technique: A method for code benchmarking with subcritical configurations

    International Nuclear Information System (INIS)

    Perez, R.B.; Valentine, T.E.; Mihalczo, J.T.; Mattingly, J.K.

    1997-01-01

    A brief discussion of the Cf-252 source-driven method for subcritical measurements serves as an introduction to the concept and use of the spectral ratio, Γ. It has also been shown that Monte Carlo calculations of spectral densities and effective multiplication factors have the transport propagator as a common denominator. This commonality follows from the fact that the Neumann series expansion of the propagator lends itself to the Monte Carlo method. On this basis, a linear relationship between the spectral ratio and the effective multiplication factor has been shown. This relationship demonstrates the ability of subcritical measurements of the ratio of spectral densities to validate transport theory methods and cross sections.

  5. Students' Consideration of Source Information during the Reading of Multiple Texts and Its Effect on Intertextual Conflict Resolution

    Science.gov (United States)

    Kobayashi, Keiichi

    2014-01-01

    This study investigated students' spontaneous use of source information for the resolution of conflicts between texts. One hundred fifty-four undergraduate students read two conflicting explanations concerning the relationship between blood type and personality under two conditions: either one explanation with a higher credibility source and…

  6. Children's Ability to Distinguish between Memories from Multiple Sources: Implications for the Quality and Accuracy of Eyewitness Statements.

    Science.gov (United States)

    Roberts, Kim P.

    2002-01-01

    Outlines five perspectives addressing alternate aspects of the development of children's source monitoring: source-monitoring theory, fuzzy-trace theory, schema theory, person-based perspective, and mental-state reasoning model. Discusses research areas with relation to forensic developmental psychology: agent identity, prospective processing,…

  7. MEG (Magnetoencephalography) multipolar modeling of distributed sources using RAP-MUSIC (Recursively Applied and Projected Multiple Signal Characterization)

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J. C. (John C.); Baillet, S. (Sylvain); Jerbi, K. (Karim); Leahy, R. M. (Richard M.)

    2001-01-01

    We describe the use of truncated multipolar expansions for producing dynamic images of cortical neural activation from measurements of the magnetoencephalogram. We use a signal-subspace method to find the locations of a set of multipolar sources, each of which represents a region of activity in the cerebral cortex. Our method builds up an estimate of the sources in a recursive manner, i.e. we first search for point current dipoles, then magnetic dipoles, and finally first order multipoles. The dynamic behavior of these sources is then computed using a linear fit to the spatiotemporal data. The final step in the procedure is to map each of the multipolar sources into an equivalent distributed source on the cortical surface. The method is illustrated through an application to epileptic interictal MEG data.

  8. A principal-component and least-squares method for allocating polycyclic aromatic hydrocarbons in sediment to multiple sources

    International Nuclear Information System (INIS)

    Burns, W.A.; Mankiewicz, P.J.; Bence, A.E.; Page, D.S.; Parker, K.R.

    1997-01-01

    A method was developed to allocate polycyclic aromatic hydrocarbons (PAHs) in sediment samples to the PAH sources from which they came. The method uses principal-component analysis to identify possible sources and a least-squares model to find the source mix that gives the best fit of 36 PAH analytes in each sample. The method identified 18 possible PAH sources in a large set of field data collected in Prince William Sound, Alaska, USA, after the 1989 Exxon Valdez oil spill, including diesel oil, diesel soot, spilled crude oil in various weathering states, natural background, creosote, and combustion products from human activities and forest fires. Spill oil was generally found to be a small increment of the natural background in subtidal sediments, whereas combustion products were often the predominant sources for subtidal PAHs near sites of past or present human activity. The method appears to be applicable to other situations, including other spills
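The least-squares allocation step described above can be sketched as a nonnegative least-squares fit of candidate source profiles to a sample's analyte concentrations; the study derives its candidate fingerprints with principal-component analysis, whereas the profiles and mixture below are synthetic stand-ins, not the 18 Prince William Sound sources.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of the least-squares allocation step, assuming the candidate source
# "fingerprints" are already known. Profiles and the sample are synthetic.
rng = np.random.default_rng(0)
n_analytes, n_sources = 36, 4          # 36 PAH analytes, 4 candidate sources
profiles = rng.random((n_analytes, n_sources))

# Synthesize a sample as a known nonnegative mix of the source profiles.
true_mix = np.array([0.6, 0.0, 0.3, 0.1])
sample = profiles @ true_mix

# Nonnegative least squares: mass contributions cannot be negative.
mix, residual = nnls(profiles, sample)
print(np.round(mix, 3), residual)
```

With an exact synthetic mixture the fit recovers the true contributions; with field data, the residual indicates how well the chosen source set explains the sample.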

  9. Numerical databases in marine biology

    Digital Repository Service at National Institute of Oceanography (India)

    Sarupria, J.S.; Bhargava, R.M.S.

  10. RF model of the distribution system as a communication channel, phase 2. Volume 4: Software source program and illustrations ASCII database listings

    Science.gov (United States)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Listings of source programs and some illustrative examples of various ASCII data base files are presented. The listings are grouped into the following categories: main programs, subroutine programs, illustrative ASCII data base files. Within each category files are listed alphabetically.

  11. Database of Interacting Proteins (DIP)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The DIP database catalogs experimentally determined interactions between proteins. It combines information from a variety of sources to create a single, consistent...

  12. Real-time particle monitor calibration factors and PM2.5 emission factors for multiple indoor sources.

    Science.gov (United States)

    Dacunto, Philip J; Cheng, Kai-Chung; Acevedo-Bolton, Viviana; Jiang, Ruo-Ting; Klepeis, Neil E; Repace, James L; Ott, Wayne R; Hildemann, Lynn M

    2013-08-01

    Indoor sources can greatly contribute to personal exposure to particulate matter less than 2.5 μm in diameter (PM2.5). To accurately assess PM2.5 mass emission factors and concentrations, real-time particle monitors must be calibrated for individual sources. Sixty-six experiments were conducted with a common, real-time laser photometer (TSI SidePak™ Model AM510 Personal Aerosol Monitor) and a filter-based PM2.5 gravimetric sampler to quantify the monitor calibration factors (CFs), and to estimate emission factors for common indoor sources including cigarettes, incense, cooking, candles, and fireplaces. Calibration factors for these indoor sources were all significantly less than the factory-set CF of 1.0, ranging from 0.32 (cigarette smoke) to 0.70 (hamburger). Stick incense had a CF of 0.35, while fireplace emissions ranged from 0.44-0.47. Cooking source CFs ranged from 0.41 (fried bacon) to 0.65-0.70 (fried pork chops, salmon, and hamburger). The CFs of combined sources (e.g., cooking and cigarette emissions mixed) were linear combinations of the CFs of the component sources. The highest PM2.5 emission factors per time period were from burned foods and fireplaces (15-16 mg min(-1)), and the lowest from cooking foods such as pizza and ground beef (0.1-0.2 mg min(-1)).
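The reported linearity of combined-source calibration factors can be illustrated with a small sketch; the two CF values come from the abstract, while the 50/50 mass split and the raw reading are hypothetical.

```python
# Sketch of the reported linearity of combined-source calibration factors
# (CFs): the CF of a mixed aerosol is the mass-weighted combination of the
# component CFs. The two CF values come from the abstract; the 50/50 mass
# split and the raw reading are hypothetical.
component_cfs = {"cigarette": 0.32, "hamburger": 0.70}
mass_fractions = {"cigarette": 0.5, "hamburger": 0.5}   # assumed mix

combined_cf = sum(mass_fractions[s] * component_cfs[s] for s in component_cfs)

# Calibrated PM2.5 = raw photometer reading x CF (CF < 1, so the factory
# setting overestimates these indoor sources).
raw_reading_ug_m3 = 100.0
calibrated_ug_m3 = raw_reading_ug_m3 * combined_cf
print(round(combined_cf, 2), round(calibrated_ug_m3, 1))
```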

  13. Retrospective US database analysis of persistence with glatiramer acetate vs. available disease-modifying therapies for multiple sclerosis: 2001-2010.

    Science.gov (United States)

    Oleen-Burkey, MerriKay; Cyhaniuk, Anissa; Swallow, Eric

    2014-01-14

    Long-term persistence with treatment for chronic disease is difficult for patients to achieve, regardless of the disease or medication being used. The objective of this investigation was to examine treatment persistence with glatiramer acetate (GA) relative to available disease-modifying therapies (DMT) for multiple sclerosis (MS) over 12-, 24- and 36-month periods. Data from Clinformatics™ for DataMart affiliated with OptumInsight were used to identify patients using DMT between 2001 and 2010. Patients with 12, 24, and 36 months of follow-up were included. Persistence was defined as continuous use of the same DMT for the duration of follow-up regardless of treatment gaps. Regimen changes, including re-initiation of therapy following gaps of 15 days or more, switching therapy, and DMT discontinuation, were investigated. Descriptive statistics were used to summarize the results. Cohorts of GA users with 12 months (n = 12,144), 24 months (n = 7,386) and 36 months (n = 4,693) of follow-up were identified. Persistence rates with GA were 80% for all time periods; discontinuation rates declined over time while switching increased modestly. In contrast, the full DMT-treated cohorts showed persistence rates of 68.3% at 12 months (n = 35,312), 53.9% at 24 months (n = 21,927), and 70.1% at 36 months (n = 14,343). As with these full DMT-treated cohorts, the proportion of GA users remaining on their initial therapy without a gap of 15 days or more decreased with length of follow-up. However, the proportion of GA users with a gap in treatment who re-initiated GA increased over time (64.4% at 12 months, 75.1% at 24 months, and 80.1% at 36 months), while those in the full DMT-treated cohorts re-initiated therapy at rates of only 50-60%. Persistence rates for GA were 80% for the 12-, 24- and 36-month time periods, in contrast with the full DMT-treated cohorts, whose persistence rates never exceeded 70.0%. Although there were more gaps in therapy of 15 days or more with all DMT over time

  14. Use of a Bayesian isotope mixing model to estimate proportional contributions of multiple nitrate sources in surface water

    International Nuclear Information System (INIS)

    Xue Dongmei; De Baets, Bernard; Van Cleemput, Oswald; Hennessy, Carmel; Berglund, Michael; Boeckx, Pascal

    2012-01-01

    To identify different NO3− sources in surface water and to estimate their proportional contribution to the nitrate mixture in surface water, a dual isotope approach and a Bayesian isotope mixing model have been applied to six different surface waters affected by agriculture, greenhouses in an agricultural area, and households. Annual mean δ15N–NO3− values were between 8.0 and 19.4‰, while annual mean δ18O–NO3− values ranged from 4.5 to 30.7‰. SIAR was used to estimate the proportional contribution of five potential NO3− sources (NO3− in precipitation, NO3− fertilizer, NH4+ in fertilizer and rain, soil N, and manure and sewage). SIAR showed that “manure and sewage” contributed the most, “soil N”, “NO3− fertilizer” and “NH4+ in fertilizer and rain” contributed intermediate amounts, and “NO3− in precipitation” contributed the least. The SIAR output can be considered a “fingerprint” of the NO3− source contributions. However, the wide range of isotope values observed in surface water and in the NO3− sources limits its applicability. - Highlights: ► The dual isotope approach (δ15N– and δ18O–NO3−) identifies dominant nitrate sources in 6 surface waters. ► The SIAR model estimates proportional contributions for 5 nitrate sources. ► SIAR is a reliable approach to assess temporal and spatial variations of different NO3− sources. ► The wide range of isotope values observed in surface water and in the nitrate sources limits its applicability. - This paper successfully applied a dual isotope approach and a Bayesian isotopic mixing model to identify and quantify 5 potential nitrate sources in surface water.
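As a rough illustration of the mass balance underlying such mixing models (SIAR itself fits a full Bayesian model with Dirichlet-distributed proportions and uncertainty on every term), a deterministic two-isotope system with three hypothetical sources can be solved directly:

```python
import numpy as np

# Deterministic simplification of the isotope mixing mass balance. Source
# signatures are hypothetical values, not the study's measurements.
# Rows: d15N-NO3, d18O-NO3 (permil); columns: manure/sewage, soil N, NO3 fertilizer.
source_deltas = np.array([
    [15.0, 5.0,  3.0],
    [ 5.0, 6.0, 23.0],
])
mixture = np.array([10.0, 9.0])  # observed surface-water signature

# Mass balance d_mix = sum_i p_i * d_i, with the proportions summing to one.
A = np.vstack([source_deltas, np.ones(3)])
b = np.append(mixture, 1.0)
proportions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(proportions, 3))
```

With more sources than isotope equations (five sources, two tracers, as in the study) the system is underdetermined, which is precisely why a Bayesian model producing distributions of feasible proportions is used instead of a point solution.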

  15. Developments in diffraction databases

    International Nuclear Information System (INIS)

    Jenkins, R.

    1999-01-01

    Full text: There are a number of databases available to the diffraction community. Two of the more important of these are the Powder Diffraction File (PDF) maintained by the International Centre for Diffraction Data (ICDD), and the Inorganic Crystal Structure Database (ICSD) maintained by the Fachinformationszentrum Karlsruhe (FIZ). In application, the PDF has been an indispensable tool in phase identification and identification of unknowns. The ICSD database has extensive and explicit reference to the structures of compounds: atomic coordinates, space group and even thermal vibration parameters. A similar database, but for organic compounds, is maintained by the Cambridge Crystallographic Data Centre. These databases are often used as independent sources of information. However, little thought has been given to how to exploit the combined properties of structural database tools. A recently completed agreement between ICDD and FIZ, plus ICDD and Cambridge, provides a first step in the complementary use of the PDF and the ICSD databases. The focus of this paper (as indicated below) is to examine ways of exploiting the combined properties of both databases. In 1996, there were approximately 76,000 entries in the PDF and approximately 43,000 entries in the ICSD database. The ICSD database has now been used to calculate entries in the PDF. Deriving d-spacing and peak intensity data requires the synthesis of full diffraction patterns, i.e., we use the structural data in the ICSD database and then add instrumental resolution information. The combined data from the PDF and ICSD can be effectively used in many ways. For example, we can calculate PDF data for an ideally random crystal distribution and also in the absence of preferred orientation. Again, we can use systematic studies of intermediate members in solid solution series to help produce reliable quantitative phase analyses. In some cases, we can study how solid solution properties vary with composition and

  16. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973, through September 7, 2016. Analytical methods varied by the time period of data collection and the requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  17. Database Vs Data Warehouse

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Data warehouse technology includes a set of concepts and methods that offer users useful information for decision making. The necessity to build a data warehouse arises from the necessity to improve the quality of information in the organization. Data coming from different sources, in a variety of forms - both structured and unstructured - are filtered according to business rules and integrated into a single large data collection. Using informatics solutions, managers have understood that data stored in operational systems - including databases - are an informational gold mine that must be exploited. Data warehouses have been developed to answer the increasing demands for complex analysis, which could not be properly achieved with operational databases. The present paper emphasizes some of the criteria that information application developers can use in order to choose between a database solution and a data warehouse one.

  18. Multiple-source tracking: Investigating sources of pathogens, nutrients, and sediment in the Upper Little River Basin, Kentucky, water years 2013–14

    Science.gov (United States)

    Crain, Angela S.; Cherry, Mac A.; Williamson, Tanja N.; Bunch, Aubrey R.

    2017-09-20

    The South Fork Little River (SFLR) and the North Fork Little River (NFLR) are two major headwater tributaries that flow into the Little River just south of Hopkinsville, Kentucky. Both tributaries are included among those water bodies in Kentucky and across the Nation that have been reported with declining water quality. Each tributary has been listed by the Kentucky Energy and Environment Cabinet—Kentucky Division of Water in the 303(d) List of Waters for Kentucky Report to Congress as impaired by nutrients, pathogens, and sediment for contact recreation from point and nonpoint sources since 2002. In 2009, the Kentucky Energy and Environment Cabinet—Kentucky Division of Water developed a pathogen total maximum daily load (TMDL) for the Little River Basin, including the SFLR and NFLR Basins. Future nutrient and suspended-sediment TMDLs are planned once nutrient criteria and suspended-sediment protocols have been developed for Kentucky. In this study, different approaches were used to identify potential sources of fecal-indicator bacteria (FIB), nitrate, and suspended sediment; to inform the TMDL process; and to aid in the implementation of effective watershed-management activities. The main focus of source identification was in the SFLR Basin. To begin understanding the potential sources of fecal contamination, samples were collected at 19 sites for densities of FIB (E. coli) in water and fluvial sediment and at 11 sites for Bacteroidales genetic markers (General AllBac, human HF183, ruminant BoBac, canid BacCan, and waterfowl GFD) during the recreational season (May through October) in 2013 and 2014. Results indicated 34 percent of all E. coli water samples (n=227 samples) did not meet the U.S. Environmental Protection Agency 2012 recommended national criteria for primary recreational waters. No criterion currently exists for E. coli in fluvial sediment. Using Spearman's rank correlation, densities of FIB in fluvial sediments were observed to have a

  19. Part 2. Development of Enhanced Statistical Methods for Assessing Health Effects Associated with an Unknown Number of Major Sources of Multiple Air Pollutants.

    Science.gov (United States)

    Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford

    2015-06-01

    A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source-apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC*) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas. The Phoenix data included counts of cardiovascular deaths and daily PM2

  20. Ionizing radiation sources: very diversified means, multiple applications and a changing regulatory environment. Conference proceedings; Les sources de rayonnements ionisants: des moyens tres diversifies, des applications multiples et une reglementation en evolution. Recueil des presentations

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-11-15

    This document brings together the available presentations given at the conference organised by the French society of radiation protection about ionizing radiation source means, applications and regulatory environment. Twenty eight presentations (slides) are compiled in this document and deal with: 1 - Overview of sources - some quantitative data from the national inventory of ionizing radiation sources (Yann Billarand, IRSN); 2 - Overview of sources (Jerome Fradin, ASN); 3 - Regulatory framework (Sylvie Rodde, ASN); 4 - Alternatives to Iridium radiography - the case of pressure devices at the manufacturing stage (Henri Walaszek, Cetim; Bruno Kowalski, Welding Institute); 5 - Dosimetric stakes of medical scanner examinations (Jean-Louis Greffe, Charleroi hospital of Medical University); 6 - The removal of ionic smoke detectors (Bruno Charpentier, ASN); 7 - Joint-activity and reciprocal liabilities - Organisation of labour risk prevention in case of companies joint-activity (Paulo Pinto, DGT); 8 - Consideration of gamma-graphic testing in the organization of a unit outage activities (Jean-Gabriel Leonard, EDF); 9 - Radiological risk control at a closed and independent work field (Stephane Sartelet, Areva); 10 - Incidents and accidents status and typology (Pascale Scanff, IRSN); 11 - Regional overview of radiation protection significant events (Philippe Menechal, ASN); 12 - Incident leading to a tritium contamination in and urban area - consequences and experience feedback (Laurence Fusil, CEA); 13 - Experience feedback - loss of sealing of a calibration source (Philippe Mougnard, Areva); 14 - Blocking incident of a {sup 60}Co source (Bruno Delille, Salvarem); 15 - Triggering of gantry's alarm: status of findings (Philippe Prat, Syctom); 16 - Non-medical electric devices: regulatory changes (Sophie Dagois, IRSN; Jerome Fradin, ASN); 17 - Evaluation of the dose equivalent rate in pulsed fields: method proposed by the IRSN and implementation test (Laurent Donadille

  1. Having a Lot of a Good Thing: Multiple Important Group Memberships as a Source of Self-Esteem

    Science.gov (United States)

    Jetten, Jolanda; Branscombe, Nyla R.; Haslam, S. Alexander; Haslam, Catherine; Cruwys, Tegan; Jones, Janelle M.; Cui, Lijuan; Dingle, Genevieve; Liu, James; Murphy, Sean; Thai, Anh; Walter, Zoe; Zhang, Airong

    2015-01-01

    Membership in important social groups can promote a positive identity. We propose and test an identity resource model in which personal self-esteem is boosted by membership in additional important social groups. Belonging to multiple important group memberships predicts personal self-esteem in children (Study 1a), older adults (Study 1b), and former residents of a homeless shelter (Study 1c). Study 2 shows that the effects of multiple important group memberships on personal self-esteem are not reducible to number of interpersonal ties. Studies 3a and 3b provide longitudinal evidence that multiple important group memberships predict personal self-esteem over time. Studies 4 and 5 show that collective self-esteem mediates this effect, suggesting that membership in multiple important groups boosts personal self-esteem because people take pride in, and derive meaning from, important group memberships. Discussion focuses on when and why important group memberships act as a social resource that fuels personal self-esteem. PMID:26017554

  2. Having a lot of a good thing: multiple important group memberships as a source of self-esteem.

    Directory of Open Access Journals (Sweden)

    Jolanda Jetten

    Full Text Available Membership in important social groups can promote a positive identity. We propose and test an identity resource model in which personal self-esteem is boosted by membership in additional important social groups. Belonging to multiple important group memberships predicts personal self-esteem in children (Study 1a, older adults (Study 1b, and former residents of a homeless shelter (Study 1c. Study 2 shows that the effects of multiple important group memberships on personal self-esteem are not reducible to number of interpersonal ties. Studies 3a and 3b provide longitudinal evidence that multiple important group memberships predict personal self-esteem over time. Studies 4 and 5 show that collective self-esteem mediates this effect, suggesting that membership in multiple important groups boosts personal self-esteem because people take pride in, and derive meaning from, important group memberships. Discussion focuses on when and why important group memberships act as a social resource that fuels personal self-esteem.

  3. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
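A minimal sketch of the double-difference idea, assuming "cross-delta" means elementwise differences between the two correlated data sets and "adjacent-delta" means first differences within a sequence; the actual patented scheme may differ in detail:

```python
import numpy as np

# Minimal sketch of double-difference pre-coding and its inverse, under the
# stated assumptions about what the two delta operations mean.
def precode(a, b):
    """Cross-delta (b - a), then adjacent-delta along the result."""
    cross = b - a
    dd = np.empty_like(cross)
    dd[0] = cross[0]          # keep the first element as an anchor
    dd[1:] = np.diff(cross)
    return dd

def postdecode(a, dd):
    """Inverse: cumulative sum restores the cross-delta, then add back a."""
    return a + np.cumsum(dd)

a = np.array([10, 12, 15, 15, 14], dtype=np.int64)  # first spectral band
b = np.array([11, 14, 18, 17, 15], dtype=np.int64)  # correlated second band
dd = precode(a, b)
assert np.array_equal(postdecode(a, dd), b)         # lossless round trip
print(dd)
```

The round trip is exact, so the double-difference stream can be entropy-coded losslessly; its values cluster near zero when the two inputs are correlated, which is what improves compression.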

  4. SIMS: addressing the problem of heterogeneity in databases

    Science.gov (United States)

    Arens, Yigal

    1997-02-01

    The heterogeneity of remotely accessible databases -- with respect to contents, query language, semantics, organization, etc. -- presents serious obstacles to convenient querying. The SIMS (single interface to multiple sources) system addresses this global integration problem. It does so by defining a single language for describing the domain about which information is stored in the databases and using this language as the query language. Each database to which SIMS is to provide access is modeled using this language. The model describes a database's contents, organization, and other relevant features. SIMS uses these models, together with a planning system drawing on techniques from artificial intelligence, to decompose a given user's high-level query into a series of queries against the databases and other data manipulation steps. The retrieval plan is constructed so as to minimize data movement over the network and maximize parallelism to increase execution speed. SIMS can recover from network failures during plan execution by obtaining data from alternate sources, when possible. SIMS has been demonstrated in the domains of medical informatics and logistics, using real databases.

  5. Epidemiologic study of neural tube defects in Los Angeles County. I. Prevalence at birth based on multiple sources of case ascertainment

    Energy Technology Data Exchange (ETDEWEB)

    Sever, L.E. (Pacific Northwest Lab., Richland, WA); Sanders, M.; Monsen, R.

    1982-01-01

    Epidemiologic studies of the neural tube defects (NTDs), anencephalus and spina bifida, have for the most part been based on single sources of case ascertainment. The present investigation attempts total ascertainment of NTD cases in the newborn population of Los Angeles County residents for the period 1966 to 1972. The design of the study, sources of data, and estimates of prevalence rates based on single and multiple sources of case ascertainment are discussed here. Anencephalus cases totaled 448, spina bifida 442, and encephalocele 72, giving prevalence rates of 0.52, 0.51, and 0.08 per 1000 total births, respectively, for these neural tube defects - rates considered to be low. The Los Angeles County prevalence rates are compared with those of other recent North American studies, and support is provided for earlier suggestions of low rates on the West Coast.

  6. Fonte investigadora em Educação: registros do banco de teses da CAPES An investigation source in Education: the records in the CAPES theses database

    Directory of Open Access Journals (Sweden)

    Renata de Almeida Vieira

    2007-08-01

    Full Text Available This article presents the results of an investigation of the CAPES Theses Database - Abstracts, a multidisciplinary database made available over the Internet by CAPES, the Brazilian Coordination for the Improvement of Higher Education Personnel. By bringing together basic information on stricto sensu graduate research - at the Master's and Doctorate levels - from the various areas and sub-areas of human knowledge, developed at public and private higher education institutions throughout Brazil and defended from 1987 onward, this database of abstracts constitutes a comprehensive research source as well as a relevant instrument for the dissemination of Brazilian scientific knowledge. Given this scope and relevance, and after noticing some inconsistencies in the data retrieved by the system, as well as the absence of information effectively needed by those who research or seek preliminary information about studies, an exploratory investigation was carried out on a sample of its records, based on documentary research and initiated with the keyword preconceito (prejudice). The present essay, the product of that exploration, aims to present an overview of the findings concerning the formal aspects of the records and the structure of the abstracts in this database. It also aims to underscore the importance of studies that foster a critical-reflexive debate leading to actions that resolve the problems identified.

  7. Quantifying methane emission from fugitive sources by combining tracer release and downwind measurements - a sensitivity analysis based on multiple field surveys.

    Science.gov (United States)

    Mønster, Jacob G; Samuelsson, Jerker; Kjeldsen, Peter; Rella, Chris W; Scheutz, Charlotte

    2014-08-01

    Using a dual-species methane/acetylene instrument based on cavity ring-down spectroscopy (CRDS), the dynamic plume tracer dispersion method for quantifying the emission rate of methane was successfully tested in four measurement campaigns: (1) controlled methane and trace gas release with different trace gas configurations, (2) a landfill with unknown emission source locations, (3) a landfill with closely located emission sources, and (4) comparison with a Fourier transform infrared spectroscopy (FTIR) instrument using multiple trace gases for source separation. The new real-time, high-precision instrument can measure methane plumes more than 1.2 km away from small sources (about 5 kg h(-1)) in urban areas, with a measurement frequency allowing plume crossing at normal driving speed. The method can be used for quantification of total methane emissions from diffuse area sources down to 1 kg per hour and can be used to quantify individual sources with the right choice of wind direction and road distance. The placement of the trace gas is important for obtaining correct quantification; an uncertainty of up to 36% can be incurred when the trace gas is not co-located with the methane source. Measurements made at greater distances are less sensitive to errors in trace gas placement, and model calculations showed an uncertainty of less than 5% in both urban and open-country settings when the trace gas was placed 100 m from the source and measurements were made more than 3 km away. Using the ratio of the integrated plume concentrations of tracer gas and methane gives the most reliable results for measurements at various distances to the source, compared to the ratio of the highest concentrations in the plume, the direct concentration ratio, and a Gaussian plume model. Under suitable weather and road conditions, the CRDS system can quantify the emission from different sources located close to each other using only one kind of trace gas due to the high time resolution, while the FTIR
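The integrated-plume-ratio calculation favored above can be sketched as follows; the plume profiles, tracer release rate, and perfect co-location are invented for illustration.

```python
import numpy as np

# Sketch of the tracer dispersion calculation: the methane emission rate
# equals the known tracer release rate times the ratio of integrated
# above-background plume mixing ratios, converted by the molar-mass ratio.
M_CH4, M_C2H2 = 16.04, 26.04      # g/mol, methane and acetylene tracer
tracer_release = 0.5              # kg/h of acetylene, known by design

x = np.linspace(0.0, 200.0, 201)  # positions across the plume transect (m)
ch4_ppb = 80.0 * np.exp(-((x - 100.0) / 30.0) ** 2)     # above-background CH4
tracer_ppb = 20.0 * np.exp(-((x - 100.0) / 30.0) ** 2)  # co-located tracer

# Uniform spacing cancels in the ratio of the integrated concentrations.
ratio = ch4_ppb.sum() / tracer_ppb.sum()
emission = tracer_release * ratio * (M_CH4 / M_C2H2)    # kg/h of methane
print(round(emission, 3))
```

Because the ratio uses whole-plume integrals rather than peak values, imperfect overlap of the two plumes averages out to first order, which matches the abstract's finding that the integrated ratio is the most reliable estimator.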

  8. Fast in-database cross-matching of high-cadence, high-density source lists with an up-to-date sky model

    NARCIS (Netherlands)

    L.H.A. Scheers (Bart); S. Bloemen; H.F. Mühleisen (Hannes); P. Schellart; A. Van Elteren (Arjen); M.L. Kersten (Martin); P.J. Groot

    2018-01-01

    Coming high-cadence wide-field optical telescopes will image hundreds of thousands of sources per minute. Besides inspecting the near real-time data streams for transient and variability events, the accumulated data archive is a wealthy laboratory for making complementary scientific

  9. Testing for multiple invasion routes and source populations for the invasive brown treesnake (Boiga irregularis) on Guam: implications for pest management

    Science.gov (United States)

    Richmond, Jonathan Q.; Wood, Dustin A.; Stanford, James W.; Fisher, Robert N.

    2014-01-01

    The brown treesnake (Boiga irregularis) population on the Pacific island of Guam has reached iconic status as one of the most destructive invasive species of modern times, yet no published works have used genetic data to identify a source population. We used DNA sequence data from multiple genetic markers and coalescent-based phylogenetic methods to place the Guam population within the broader phylogeographic context of B. irregularis across its native range, and tested whether patterns of genetic variation on the island are consistent with one or multiple introductions from different source populations. We also modeled a series of demographic scenarios that differed in the effective size and duration of a population bottleneck immediately following the invasion of Guam, and measured the fit of these simulations to the observed data using approximate Bayesian computation. Our results exclude the possibility of serial introductions from different source populations, and instead verify a single origin from the Admiralty Archipelago off the north coast of Papua New Guinea. This finding is consistent with the hypothesis that B. irregularis was accidentally transported to Guam during military relocation efforts at the end of World War II. Demographic model comparisons suggest that multiple snakes were transported to Guam from the source locality, but that fewer than 10 individuals could be responsible for establishing the population. Our results also provide evidence that low genetic diversity stemming from the founder event has not been a hindrance to the ecological success of B. irregularis on Guam, and at the same time offers a unique 'genetic opening' to manage snake density using classical biological approaches.

  10. Pt Electrodes Enable the Formation of μ4-O Centers in MOF-5 from Multiple Oxygen Sources.

    Science.gov (United States)

    Li, Minyuan M; Dincă, Mircea

    2017-10-04

    The μ₄-O²⁻ ions in the Zn₄O(O₂C–)₆ secondary building units of Zn₄O(1,4-benzenedicarboxylate)₃ (MOF-5) electrodeposited under cathodic bias can be sourced from nitrate, water, and molecular oxygen when using platinum gauze as the working electrode. The use of Zn(ClO₄)₂·6H₂O, anhydrous Zn(NO₃)₂, or anhydrous Zn(CF₃SO₃)₂ as Zn²⁺ sources under rigorous control of other sources of oxygen, including water and O₂, confirms that the source of the μ₄-O²⁻ ions can be promiscuous. Although this finding reveals a relatively complicated manifold of electrochemical processes responsible for the crystallization of MOF-5 under cathodic bias, it further highlights the importance of hydroxide intermediates in the formation of the Zn₄O(O₂C–R) secondary building units in this iconic material and is illustrative of the complicated crystallization mechanisms of metal-organic frameworks in general.

  11. Design and commissioning of an aberration-corrected ultrafast spin-polarized low energy electron microscope with multiple electron sources.

    Science.gov (United States)

    Wan, Weishi; Yu, Lei; Zhu, Lin; Yang, Xiaodong; Wei, Zheng; Liu, Jefferson Zhe; Feng, Jun; Kunze, Kai; Schaff, Oliver; Tromp, Ruud; Tang, Wen-Xin

    2017-03-01

    We describe the design and commissioning of a novel aberration-corrected low energy electron microscope (AC-LEEM). A third magnetic prism array (MPA) is added to the standard two-prism AC-LEEM, allowing the incorporation of an ultrafast spin-polarized electron source alongside the standard cold field emission electron source without degrading spatial resolution. The high degree of symmetry of the AC-LEEM is exploited in the design of the electron optics of the ultrafast spin-polarized electron source, so as to minimize the deleterious effect of time broadening while maintaining full control of electron spin. A spatial resolution of 2 nm and a temporal resolution of 10 ps are expected in the future time-resolved aberration-corrected spin-polarized LEEM (TR-AC-SPLEEM). The commissioning of the three-prism AC-LEEM has been successfully completed with the cold field emission source, with a spatial resolution below 2 nm. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Stackfile Database

    Science.gov (United States)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software, with improved flexibility and documentation. It offers flexibility in the type of data that can be stored, and efficient retrieval across either the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  13. Advances in knowledge discovery in databases

    CERN Document Server

    Adhikari, Animesh

    2015-01-01

    This book presents recent advances in knowledge discovery in databases (KDD), with a focus on market basket databases, time-stamped databases, and multiple related databases. A variety of algorithms for data mining tasks are reported. A large number of association measures are presented, which play significant roles in decision support applications. The book presents, discusses, and contrasts new developments in mining time-stamped data, time-based data analyses, the identification of temporal patterns, the mining of multiple related databases, and local pattern analysis.

  14. NREL: U.S. Life Cycle Inventory Database - About the LCI Database Project

    Science.gov (United States)

    About the LCI Database Project The U.S. Life Cycle Inventory (LCI) Database is a publicly available database that allows users to objectively review and compare analysis results that are based on a similar source of critically reviewed LCI data through its LCI Database Project. NREL's High-Performance

  15. Development of Level-2 PSA Technology: A Development of the Database of the Parametric Source Term for Kori Unit 1 Using the MAAP4 Code

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Chang Soon; Mun, Ju Hyun; Yun, Jeong Ick; Cho, Young Hoo; Kim, Chong Uk [Seoul National University, Seoul (Korea, Republic of)

    1997-07-15

    To quantify the severe accident source term with the parametric model method, the uncertainty of the parameters should be analyzed. Generally, to analyze the uncertainties, the cumulative distribution functions (CDFs) of the parameters are derived. This report introduces a method for deriving the CDFs of the basic parameters FCOR, FVES and FDCH. The calculation tool for the source term is MAAP version 4.0. In the MAAP code, model parameters are used to represent uncertain physical and/or chemical phenomena. In general, these parameters take a range of values rather than a single point value. In this paper, considering this point, the input values of the model parameters influencing each basic parameter are sampled using Latin hypercube sampling (LHS). The calculation results are then presented in cumulative distribution form. For a case study, the CDFs of FCOR, FVES and FDCH of KORI Unit 1 are derived. The target scenarios for the calculation are those whose initiating events are a large LOCA, a small LOCA and a transient, respectively. It is found that the distributions of this study are consistent with those of NUREG-1150 and are proven adequate for assessing the uncertainties in the severe accident source term of KORI Unit 1. 15 refs., 27 tabs., 4 figs. (author)
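The sampling-and-CDF workflow described above can be sketched generically: draw one stratified sample per equal-probability bin for each uncertain parameter (Latin hypercube sampling), run the model on each sample, and summarize the outputs as an empirical CDF. This is a minimal stand-in, not the MAAP-specific procedure; the function names are mine:

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng=None):
    """Uniform [0,1) LHS design: one sample per equal-probability bin,
    independently shuffled per parameter column."""
    rng = np.random.default_rng(rng)
    u = (rng.random((n_samples, n_params))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        rng.shuffle(u[:, j])   # decouple the columns
    return u

def empirical_cdf(samples):
    """Sorted values and cumulative probabilities for plotting a CDF."""
    x = np.sort(samples)
    p = np.arange(1, len(x) + 1) / len(x)
    return x, p
```

In practice the uniform columns would be mapped through each model parameter's range (or inverse CDF) before being fed to the severe accident code.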

  16. The Chandra Source Catalog: Storage and Interfaces

    Science.gov (United States)

    van Stone, David; Harbo, Peter N.; Tibbetts, Michael S.; Zografou, Panagoula; Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is part of the Chandra Data Archive (CDA) at the Chandra X-ray Center. The catalog contains source properties and associated data objects such as images, spectra, and lightcurves. The source properties are stored in relational databases, and the data objects are stored in files with their metadata stored in databases. The CDA supports different versions of the catalog: multiple fixed release versions and a live database version. There are several interfaces to the catalog: CSCview, a graphical interface for building and submitting queries and for retrieving data objects; a command-line interface for property and source searches using ADQL; and VO-compliant services discoverable through the VO registry. This poster describes the structure of the catalog and provides an overview of the interfaces.

  17. SIAPEM - Brazilian Software Database for Multiple Sclerosis ...

    African Journals Online (AJOL)

    Results: It is used with the ACCESS 2000 program, and it is a software database that yields immediate results from the entered data, regardless of the number of patients, with a simple and practical design. Conclusions: It is a process that saves time and makes it possible to maintain a ...

  18. Global estimates of CO sources with high resolution by adjoint inversion of multiple satellite datasets (MOPITT, AIRS, SCIAMACHY, TES

    Directory of Open Access Journals (Sweden)

    M. Kopacz

    2010-02-01

    We combine CO column measurements from the MOPITT, AIRS, SCIAMACHY, and TES satellite instruments in a full-year (May 2004–April 2005) global inversion of CO sources at 4°×5° spatial resolution and monthly temporal resolution. The inversion uses the GEOS-Chem chemical transport model (CTM) and its adjoint applied to MOPITT, AIRS, and SCIAMACHY. Observations from TES, surface sites (NOAA/GMD), and aircraft (MOZAIC) are used for evaluation of the a posteriori solution. Using GEOS-Chem as a common intercomparison platform shows global consistency between the different satellite datasets and with the in situ data. Differences can be largely explained by different averaging kernels and a priori information. The global CO emission from combustion as constrained in the inversion is 1350 Tg a−1. This is much higher than current bottom-up emission inventories. A large fraction of the correction results from a seasonal underestimate of CO sources at northern mid-latitudes in winter, and suggests a larger-than-expected CO source from vehicle cold starts and residential heating. Implementing this seasonal variation of emissions solves the long-standing problem of models underestimating CO in the northern extratropics in winter-spring. A posteriori emissions also indicate a general underestimation of biomass burning in the GFED2 inventory. However, the tropical biomass burning constraints are not quantitatively consistent across the different datasets.

  19. Brain source localization: A new method based on MUltiple SIgnal Classification algorithm and spatial sparsity of the field signal for electroencephalogram measurements

    Science.gov (United States)

    Vergallo, P.; Lay-Ekuakille, A.

    2013-08-01

    Brain activity can be recorded by means of EEG (electroencephalogram) electrodes placed on the scalp of the patient. The EEG reflects the activity of groups of neurons located in the head, and a fundamental problem in neurophysiology is the identification of the sources responsible for brain activity; this is especially important when a seizure occurs. Studies conducted to formalize the relationship between the electromagnetic activity in the head and the recorded external field make it possible to characterize patterns of brain activity. The inverse problem, in which the underlying sources must be determined from the field sampled at the different electrodes, is more difficult because it may not have a unique solution, or because low spatial resolution makes it hard to distinguish between activities involving sources close to each other. Thus, sources of interest may be obscured or not detected, and established source localization methods such as MUSIC (MUltiple SIgnal Classification) can fail. Many advanced source localization techniques achieve better resolution by exploiting sparsity: if the number of sources is small, the neural power as a function of location is sparse. In this work, a solution based on the spatial sparsity of the field signal is presented and analyzed to improve the MUSIC method. For this purpose, a priori information about the sparsity of the signal must be provided. The problem is formulated and solved with a Tikhonov-type regularization method, which computes the solution that best balances two cost terms, one related to the fit to the data and the other to preserving the sparsity of the signal. First, the method is tested on simulated EEG signals obtained by solving the forward problem. For the model considered for the head and brain sources, the result obtained allows to
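The regularized inverse described here can be sketched in its simplest quadratic form. The closed-form ridge solution below is my illustration of the data-fit/regularization trade-off; the paper's sparsity-preserving penalty would replace the squared-norm term:

```python
import numpy as np

def tikhonov_inverse(leadfield, eeg, lam=0.1):
    """Solve argmin_s ||L s - y||^2 + lam * ||s||^2 in closed form.

    leadfield: (n_electrodes, n_sources) forward model L
    eeg: (n_electrodes,) measured scalp potentials y
    lam: regularization weight trading data fit against solution energy
    """
    L = np.asarray(leadfield, dtype=float)
    y = np.asarray(eeg, dtype=float)
    A = L.T @ L + lam * np.eye(L.shape[1])
    return np.linalg.solve(A, L.T @ y)
```

With `lam = 0` and a well-conditioned leadfield this reduces to the ordinary least-squares inverse; increasing `lam` shrinks the estimated source amplitudes, which is the regularization behavior the abstract describes.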

  20. Robust heart rate estimation from multiple asynchronous noisy sources using signal quality indices and a Kalman filter

    International Nuclear Information System (INIS)

    Li, Q; Mark, R G; Clifford, G D

    2008-01-01

    Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often severely corrupted by noise, artifact and missing data, which lead to large errors in the estimation of the heart rate (HR) and ABP. A robust HR estimation method is described that compensates for these problems. The method is based upon the concept of fusing multiple signal quality indices (SQIs) and HR estimates derived from multiple electrocardiogram (ECG) leads and an invasive ABP waveform recorded from ICU patients. Physiological SQIs were obtained by analyzing the statistical characteristics of each waveform and their relationships to each other. HR estimates from the ECG and ABP are tracked with separate Kalman filters, using a modified update sequence based upon the individual SQIs. Data fusion of each HR estimate was then performed by weighting each estimate by the Kalman filters' SQI-modified innovations. This method was evaluated on over 6000 h of simultaneously acquired ECG and ABP from a 437-patient subset of ICU data by adding real ECG and realistic artificial ABP noise. The method provides an accurate HR estimate even in the presence of high levels of persistent noise and artifact, and during episodes of extreme bradycardia and tachycardia.
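The fusion idea, each channel's HR estimate weighted by its signal quality, can be reduced to a short sketch. This is a simplified stand-in for the paper's Kalman innovation weighting, with names of my choosing:

```python
def fuse_hr(estimates, sqis):
    """Weighted fusion of per-channel heart-rate estimates.

    estimates: HR values (bpm) from individual ECG leads / the ABP waveform
    sqis: signal quality indices in [0, 1], one per estimate; a channel
    with SQI 0 contributes nothing to the fused value.
    """
    total = sum(sqis)
    if total == 0:
        raise ValueError("no channel with usable signal quality")
    return sum(h * q for h, q in zip(estimates, sqis)) / total
```

A noisy lead with SQI near zero is effectively ignored, which is how the full method keeps the fused HR accurate when one waveform is corrupted.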

  1. Development of multiple source data processing for structural analysis at a regional scale. [digital remote sensing in geology

    Science.gov (United States)

    Carrere, Veronique

    1990-01-01

    Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.

  2. Analysis of drug-drug interactions among patients receiving antiretroviral regimens using data from a large open-source prescription database.

    Science.gov (United States)

    Patel, Nimish; Borg, Peter; Haubrich, Richard; McNicholl, Ian

    2018-06-14

    Results of a study of contraindicated concomitant medication use among recipients of preferred antiretroviral therapy (ART) regimens are reported. A retrospective study was conducted to evaluate concomitant medication use in a cohort of previously treatment-naive, human immunodeficiency virus (HIV)-infected U.S. patients prescribed preferred ART regimens during the period April 2014-March 2015. Data were obtained from a proprietary longitudinal prescription database; elements retrieved included age, sex, and prescription data. The outcome of interest was the frequency of drug-drug interactions (DDIs) associated with concomitant use of contraindicated medications. Data on 25,919 unique treatment-naive patients who used a preferred ART regimen were collected. Overall, there were 384 instances in which a contraindicated medication was dispensed for concurrent use with a recommended ART regimen. Rates of contraindicated concomitant medication use differed significantly by ART regimen; the highest rate (3.2%) was for darunavir plus ritonavir plus emtricitabine-tenofovir disoproxil fumarate (DRV plus RTV plus FTC/TDF), followed by elvitegravir-cobicistat-emtricitabine-tenofovir disoproxil fumarate (EVG/c/FTC/TDF) (2.8%). The highest frequencies of DDIs were associated with ART regimens that included a pharmacoenhancing agent: DRV plus RTV plus FTC/TDF (3.2%) and EVG/c/FTC/TDF (2.8%). In a large population of treatment-naive HIV-infected patients, ART regimens that contained a pharmacoenhancing agent were involved most frequently in contraindicated medication-related DDIs. All of the DDIs could have been avoided by using therapeutic alternatives within the same class not associated with a DDI. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
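The screening step behind such a study, checking each patient's dispensed drugs against a list of contraindicated pairs, can be sketched as follows. The pairs below are illustrative examples only, not the study's actual interaction list, and a real screen would query a curated interaction database:

```python
# Hypothetical contraindicated pairs for illustration; not clinical guidance.
CONTRAINDICATED = {
    frozenset({"rifampin", "elvitegravir/cobicistat"}),
    frozenset({"simvastatin", "darunavir/ritonavir"}),
}

def flag_ddis(dispensed_drugs):
    """Return the contraindicated pairs present in one patient's record."""
    drugs = [d.lower() for d in dispensed_drugs]
    hits = []
    for i in range(len(drugs)):
        for j in range(i + 1, len(drugs)):
            if frozenset({drugs[i], drugs[j]}) in CONTRAINDICATED:
                hits.append((drugs[i], drugs[j]))
    return hits
```

Running this over every patient-month of a prescription database and counting non-empty results yields per-regimen DDI rates like those reported above.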

  3. Variable sulfur isotope composition of sulfides provide evidence for multiple sources of contamination in the Rustenburg Layered Suite, Bushveld Complex

    Science.gov (United States)

    Magalhães, Nivea; Penniston-Dorland, Sarah; Farquhar, James; Mathez, Edmond A.

    2018-06-01

    The Rustenburg Layered Suite (RLS) of the Bushveld Complex (BC) is famous for its platinum group element (PGE) ore, which is hosted in sulfides. The source of sulfur necessary to generate this type of mineralization is inferred to be the host rock of the intrusion. The RLS has a sulfur isotopic signature that indicates the presence of Archean surface-derived material (Δ³³S ≠ 0) in the magma. This signature, with an average value of Δ³³S = 0.112 ± 0.024‰, deviates from the expected mantle value of Δ³³S = 0 ± 0.008‰. Previous work suggested that this signature is uniform throughout the RLS, in contrast to radiogenic isotopes, which vary throughout the igneous stratigraphy of the RLS. In this study, samples from key intervals within the igneous stratigraphy were analyzed, showing that Δ³³S values vary in the same stratigraphic levels as Sr and Nd isotopes. However, the variation is not consistent; in some levels there is a positive correlation and in others a negative correlation. This observation suggests that in some cases distinct magma pulses contained assimilated sulfur from different sources. Textural analysis shows no evidence for late addition of sulfur. These results also suggest that it is unlikely that large-scale assimilation and/or efficient mixing of host rock material in a single magma chamber occurred during emplacement. The data do not uniquely identify the source of sulfur in the different layers of the RLS, but the variation in sulfur isotope composition and its relationship to radiogenic isotope data calls for a reevaluation of the models for the formation and evolution of the RLS, which has the potential to impact the knowledge of how PGE deposits form.

  4. An analysis of the vapor flow and the heat conduction through the liquid-wick and pipe wall in a heat pipe with single or multiple heat sources

    Science.gov (United States)

    Chen, Ming-Ming; Faghri, Amir

    1990-01-01

    A numerical analysis is presented for the overall performance of heat pipes with single or multiple heat sources. The analysis includes the heat conduction in the wall and liquid-wick regions as well as the compressibility effect of the vapor inside the heat pipe. The two-dimensional elliptic governing equations in conjunction with the thermodynamic equilibrium relation and appropriate boundary conditions are solved numerically. The solutions are in agreement with existing experimental data for the vapor and wall temperatures at both low and high operating temperatures.

  5. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  6. Three-dimensional printing of X-ray computed tomography datasets with multiple materials using open-source data processing.

    Science.gov (United States)

    Sander, Ian M; McGoldrick, Matthew T; Helms, My N; Betts, Aislinn; van Avermaete, Anthony; Owers, Elizabeth; Doney, Evan; Liepert, Taimi; Niebur, Glen; Liepert, Douglas; Leevy, W Matthew

    2017-07-01

    Advances in three-dimensional (3D) printing allow digital files to be turned into a "printed" physical product. For example, complex anatomical models derived from clinical or pre-clinical X-ray computed tomography (CT) data of patients or research specimens can be constructed using various printable materials. Although 3D printing has the potential to advance learning, many academic programs have been slow to adopt its use in the classroom despite increased availability of the equipment and digital databases already established for educational use. Herein, a protocol is reported for the production of an enlarged bone core and an accurate representation of human sinus passages in a 3D printed format using entirely consumer-grade printers and a combination of free software platforms. The comparative resolutions of three surface rendering programs were also determined using the sinus, human body, and human wrist data files to compare the abilities of different software available for surface map generation of biomedical data. The data show that 3D Slicer provided the highest compatibility and surface resolution for anatomical 3D printing. Generated surface maps were then 3D printed via fused deposition modeling (FDM) printing. In conclusion, a methodological approach is presented that explains the production of anatomical models using entirely consumer-grade fused deposition modeling machines and a combination of free software platforms. The methods outlined will facilitate the incorporation of 3D printed anatomical models in the classroom. Anat Sci Educ 10: 383-391. © 2017 American Association of Anatomists.
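The first step of any CT-to-print pipeline is segmenting the structure of interest before a surface map is generated. A minimal sketch of that step, with an assumed Hounsfield-unit threshold for bone and a crude surface-size proxy (both functions and the threshold are my illustration, not the paper's protocol):

```python
import numpy as np

def segment_bone(ct_volume, hu_threshold=300):
    """Binary mask of voxels at or above a Hounsfield-unit threshold."""
    return np.asarray(ct_volume) >= hu_threshold

def surface_voxel_count(mask):
    """Count mask voxels with at least one 6-connected background neighbor,
    a rough proxy for how much surface a mesh generator would produce."""
    padded = np.pad(mask, 1, constant_values=False)
    core = padded[1:-1, 1:-1, 1:-1]
    interior = core.copy()
    for axis in range(3):
        for shift in (1, -1):
            # neighbor along +/- axis, aligned back onto the core voxels
            interior &= np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
    return int(core.sum() - interior.sum())
```

In the actual workflow, the binary mask would be handed to a surface renderer (e.g. 3D Slicer) to produce an STL mesh for the FDM printer.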

  7. The Danish multiple sclerosis treatment register

    DEFF Research Database (Denmark)

    Magyari, Melinda; Koch-Henriksen, Nils; Sørensen, Per Soelberg

    2016-01-01

    Aim of the database: The Danish Multiple Sclerosis Treatment Register (DMSTR) serves as a clinical quality register, enabling the health authorities to monitor the quality of disease-modifying treatment, and it is an important data source for epidemiological research. Study population: The DMSTR includes all patients with multiple sclerosis who have been treated with disease-modifying drugs since 1996. At present, more than 8,400 patients have been registered in this database. Data are continuously entered online into a central database from all sites in Denmark at treatment start and at regular visits. Main variables: Age, sex, onset year and year of the diagnosis, basic clinical information, and information about treatment, side effects, and relapses. Descriptive data: Notification is done at treatment start, and thereafter at every scheduled clinical visit 3 months after treatment start...

  8. Identification of multiple detrital sources for Otway Supergroup sedimentary rocks: implications for basin models and chronostratigraphic correlations

    International Nuclear Information System (INIS)

    Mitchell, M.M.

    1997-01-01

    Correlation of apatite chlorine content (wt%) with apatite fission track age (Ma) from Lower Cretaceous Otway Supergroup sediments at present-day low temperatures allows identification of two characteristic detrital source regions. Apatites from eroded Palaeozoic basement terrains yield low Cl content (generally 0.5 wt%) and syndepositional fission track ages. Where post-depositional thermal annealing (>70 °C) has significantly reduced the fission track age, provenance information is preserved in the apatite Cl composition alone. In the Otway Supergroup, evidence for contemporaneous volcanism was found in both the Eumeralla Formation (Albian-Aptian) and the Crayfish Group (Aptian-Berriasian), in samples located towards the central rift where less sandy facies dominate. Results suggest that Crayfish Group sediments deposited along the northern margin of the basin were predominantly derived from eroding basement material, while the section located towards the central rift contains a greater proportion of volcanogenic detritus. Evidence from this study suggests that volcanogenic detritus was a distal sediment source throughout the entire early rift phase, prior to the main influx of arc-related volcanogenic material during deposition of the Eumeralla Formation. As diagenesis of volcanogenic sediments significantly reduces porosity and permeability of the sandstones, reservoir quality and petroleum potential may be significantly reduced in the Crayfish Group in deeper parts of the basin, where a greater proportion of volcanogenic detritus is suggested. The results presented here provide important information regarding Lower Cretaceous Otway Basin stratigraphy and clearly indicate that this methodology may have wider application. (authors)

  9. Towards P2P XML Database Technology

    NARCIS (Netherlands)

    Y. Zhang (Ying)

    2007-01-01

    To ease the development of data-intensive P2P applications, we envision a P2P XML Database Management System (P2P XDBMS) that acts as database middleware, providing a uniform database abstraction on top of a dynamic set of distributed data sources. In this PhD work, we research which

  10. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    This paper deals with a tool that enables import of coded data in a single text file into more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell-el Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the points' names coding) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of points' names coding or more complex attribute table creation).
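The core of such an import tool is grouping coded survey points into per-layer collections. A minimal sketch, assuming a hypothetical record format of "name code x y z" per line (the real v.in.survey coding conventions differ):

```python
from collections import defaultdict

def split_by_code(lines):
    """Group coded survey points into per-layer lists keyed by their code.

    Each input line is assumed to hold: point name, layer code, x, y, z.
    """
    layers = defaultdict(list)
    for line in lines:
        name, code, x, y, z = line.split()
        layers[code].append((name, float(x), float(y), float(z)))
    return dict(layers)
```

Each resulting list would then become one vector layer (with the point names feeding the attribute table), which is the behavior the abstract describes.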

  11. A Reconstruction Method for the Estimation of Temperatures of Multiple Sources Applied for Nanoparticle-Mediated Hyperthermia

    Directory of Open Access Journals (Sweden)

    Idan Steinberg

    2018-03-01

    Solid malignant tumors are one of the leading causes of death worldwide. Many times complete removal is not possible, and alternative methods such as focused hyperthermia are used. Precise control of the hyperthermia process is imperative for the successful application of such treatment. To that end, this research presents a fast method that enables the estimation of deep tissue heat distribution by capturing and processing the transient temperature at the boundary, based on a bio-heat transfer model. The theoretical model is rigorously developed and thoroughly validated by a series of experiments. A 10-fold improvement in resolution and visibility is demonstrated on tissue-mimicking phantoms. The inverse problem is demonstrated as well, with a successful application of the model for imaging deep-tissue embedded heat sources, thereby allowing the physician the ability to dynamically evaluate the hyperthermia treatment efficiency in real time.

  12. A Reconstruction Method for the Estimation of Temperatures of Multiple Sources Applied for Nanoparticle-Mediated Hyperthermia.

    Science.gov (United States)

    Steinberg, Idan; Tamir, Gil; Gannot, Israel

    2018-03-16

    Solid malignant tumors are one of the leading causes of death worldwide. Many times complete removal is not possible, and alternative methods such as focused hyperthermia are used. Precise control of the hyperthermia process is imperative for the successful application of such treatment. To that end, this research presents a fast method that enables the estimation of deep tissue heat distribution by capturing and processing the transient temperature at the boundary, based on a bio-heat transfer model. The theoretical model is rigorously developed and thoroughly validated by a series of experiments. A 10-fold improvement in resolution and visibility is demonstrated on tissue-mimicking phantoms. The inverse problem is demonstrated as well, with a successful application of the model for imaging deep-tissue embedded heat sources, thereby allowing the physician the ability to dynamically evaluate the hyperthermia treatment efficiency in real time.

  13. Climate Narratives: Combining multiple sources of information to develop risk management strategies for a municipal water utility

    Science.gov (United States)

    Yates, D. N.; Basdekas, L.; Rajagopalan, B.; Stewart, N.

    2013-12-01

    Municipal water utilities often develop Integrated Water Resource Plans (IWRP), with the goal of providing a reliable, sustainable water supply to customers in a cost-effective manner. Colorado Springs Utilities, a 5-service provider (potable and waste water, solid waste, natural gas and electricity) in Colorado, USA, recently undertook an IWRP in which they incorporated water supply, water demand, water quality, infrastructure reliability, environmental protection, and other measures within the context of complex water rights, such as their critically important 'exchange potential'. The IWRP noted that an uncertain climate was one of the greatest sources of uncertainty in achieving a sustainable water supply for a growing community of users. We describe how historic drought, paleo-climate, and climate change projections were blended into climate narratives that informed a suite of water resource systems models used by the utility to explore the vulnerabilities of their water systems.

  14. Full Data of Yeast Interacting Proteins Database (Original Version) - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Yeast Interacting Proteins Database Full Data of Yeast Interacting Proteins Database (Original Version) Data detail Data name Full Data of Yeast Interacting Proteins Database (Original Version) DOI 10.18908/lsdba.nbdc00742-004 Description of data contents The entire data in the Yeast Interacting Proteins Database...eir interactions are required. Several sources including YPD (Yeast Proteome Database, Costanzo, M. C., Hoga...ematic name in the SGD (Saccharomyces Genome Database; http://www.yeastgenome.org/). Bait gene name The gen

  15. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M.

    1992-11-09

    The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air-conditioning and refrigeration equipment. The database identifies sources of specific information on R-32, R-123, R-124, R-125, R-134, R-134a, R-141b, R-142b, R-143a, R-152a, R-245ca, R-290 (propane), R-717 (ammonia), ethers, and others as well as azeotropic and zeotropic blends of these fluids. It addresses lubricants including alkylbenzene, polyalkylene glycol, ester, and other synthetics as well as mineral oils. It also references documents on compatibility of refrigerants and lubricants with metals, plastics, elastomers, motor insulation, and other materials used in refrigerant circuits. A computerized version is available that includes retrieval software.

  16. THE EXTRAGALACTIC DISTANCE DATABASE

    International Nuclear Information System (INIS)

    Tully, R. Brent; Courtois, Helene M.; Jacobs, Bradley A.; Rizzi, Luca; Shaya, Edward J.; Makarov, Dmitry I.

    2009-01-01

    A database can be accessed on the Web at http://edd.ifa.hawaii.edu that was developed to promote access to information related to galaxy distances. The database has three functional components. First, tables from many literature sources have been gathered and enhanced with links through a distinct galaxy naming convention. Second, comparisons of results both at the levels of parameters and of techniques have begun and are continuing, leading to increasing homogeneity and consistency of distance measurements. Third, new material is presented arising from ongoing observational programs at the University of Hawaii 2.2 m telescope, radio telescopes at Green Bank, Arecibo, and Parkes and with the Hubble Space Telescope. This new observational material is made available in tandem with related material drawn from archives and passed through common analysis pipelines.

  17. Multiple metabolic alterations exist in mutant PI3K cancers, but only glucose is essential as a nutrient source.

    Directory of Open Access Journals (Sweden)

    Rebecca Foster

    Full Text Available Targeting tumour metabolism is becoming a major new area of pharmaceutical endeavour. Consequently, a systematic search to define whether there are specific energy source dependencies in tumours, and how these might be dictated by upstream driving genetic mutations, is required. The PI3K-AKT-mTOR signalling pathway has a seminal role in regulating diverse cellular processes including cell proliferation and survival, but has also been associated with metabolic dysregulation. In this study, we sought to define how mutations within PIK3CA may affect the metabolic dependency of a cancer cell, using precisely engineered isogenic cell lines. Studies revealed gene expression signatures in PIK3CA mutant cells indicative of a consistent up-regulation of glycolysis. Interestingly, the genes up- and down-regulated varied between isogenic models suggesting that the primary node of regulation is not the same between models. Additional gene expression changes were also observed, suggesting that metabolic pathways other than glycolysis, such as glutaminolysis, were also affected. Nutrient dependency studies revealed that growth of PIK3CA mutant cells is highly dependent on glucose, whereas glutamine dependency is independent of PIK3CA status. In addition, the glucose dependency exhibited by PIK3CA mutant cells could not be overridden by supplementation with other nutrients. This specific dependence on glucose for growth was further illustrated by studies evaluating the effects of targeted disruption of the glycolytic pathway using siRNA and was also found to be present across a wider panel of cancer cell lines harbouring endogenous PIK3CA mutations. In conclusion, we have found that PIK3CA mutations lead to a shift towards a highly glycolytic phenotype, and that despite suggestions that cancer cells are adept at utilising alternative nutrient sources, PIK3CA mutant cells are not able to compensate for glucose withdrawal. Understanding the metabolic

  18. Sewage pollution in urban stormwater runoff as evident from the widespread presence of multiple microbial and chemical source tracking markers.

    Science.gov (United States)

    Sidhu, J P S; Ahmed, W; Gernjak, W; Aryal, R; McCarthy, D; Palmer, A; Kolotelo, P; Toze, S

    2013-10-01

    The concurrence of human sewage contamination in urban stormwater runoff (n=23) from six urban catchments across Australia was assessed by using both microbial source tracking (MST) and chemical source tracking (CST) markers. Out of 23 stormwater samples, human adenovirus (HAv), human polyomavirus (HPv) and the sewage-associated markers Methanobrevibacter smithii nifH and Bacteroides HF183 were detected in 91%, 56%, 43% and 96% of samples, respectively. Similarly, CST markers paracetamol (87%), salicylic acid (78%), acesulfame (96%) and caffeine (91%) were frequently detected. Twenty-one samples (91%) were positive for six to eight sewage-related MST and CST markers, and the remaining two samples were positive for five and four markers, respectively. A very good consensus (>91%) was observed between the concurrence of the HF183, HAv, acesulfame and caffeine markers, suggesting good predictability of the presence of HAv in samples positive for one of the three markers. The high prevalence of HAv (91%) also suggests that other enteric viruses may be present in the stormwater samples, which may pose significant health risks. This study underscores the benefits of employing a set of MST and CST markers, which could include monitoring for HF183, adenovirus, caffeine and paracetamol, to accurately detect human sewage contamination along with credible information on the presence of human enteric viruses, which could be used for more reliable public health risk assessments. Based on the results obtained in this study, it is recommended that some degree of treatment of captured stormwater would be required if it were to be used for non-potable purposes. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.

  19. Connecting multiple clouds and mixing real and virtual resources via the open source WNoDeS framework

    CERN Multimedia

    CERN. Geneva; Italiano, Alessandro

    2012-01-01

    In this paper we present the latest developments introduced in the WNoDeS framework (http://web.infn.it/wnodes); we will in particular describe inter-cloud connectivity, support for multiple batch systems, and coexistence of virtual and real environments on the same hardware. Specific effort has been dedicated to the work needed to deploy a "multi-sites" WNoDeS installation. The goal is to give end users the possibility to submit requests for resources using cloud interfaces on several sites in a transparent way. To this extent, we will show how we have exploited already existing and deployed middleware within the framework of the IGI (Italian Grid Initiative) and EGI (European Grid Infrastructure) services. In this context, we will also describe the developments that have taken place in order to have the possibility to dynamically exploit public cloud services like Amazon EC2. The latter gives WNoDeS the capability to serve, for example, part of the user requests through external computing resources when ne...

  20. Uncertainty in biological monitoring: a framework for data collection and analysis to account for multiple sources of sampling bias

    Science.gov (United States)

    Ruiz-Gutierrez, Viviana; Hooten, Melvin B.; Campbell Grant, Evan H.

    2016-01-01

    Biological monitoring programmes are increasingly relying upon large volumes of citizen-science data to improve the scope and spatial coverage of information, challenging the scientific community to develop design and model-based approaches to improve inference. Recent statistical models in ecology have been developed to accommodate false-negative errors, although current work points to false-positive errors as equally important sources of bias. This is of particular concern for the success of any monitoring programme given that rates as small as 3% could lead to the overestimation of the occurrence of rare events by as much as 50%, and even small false-positive rates can severely bias estimates of occurrence dynamics. We present an integrated, computationally efficient Bayesian hierarchical model to correct for false-positive and false-negative errors in detection/non-detection data. Our model combines independent, auxiliary data sources with field observations to improve the estimation of false-positive rates, when a subset of field observations cannot be validated a posteriori or assumed as perfect. We evaluated the performance of the model across a range of occurrence rates, false-positive and false-negative errors, and quantity of auxiliary data. The model performed well under all simulated scenarios, and we were able to identify critical auxiliary data characteristics which resulted in improved inference. We applied our false-positive model to a large-scale, citizen-science monitoring programme for anurans in the north-eastern United States, using auxiliary data from an experiment designed to estimate false-positive error rates. Not correcting for false-positive rates resulted in biased estimates of occupancy in 4 of the 10 anuran species we analysed, leading to an overestimation of the average number of occupied survey routes by as much as 70%. The framework we present for data collection and analysis is able to efficiently provide reliable inference for
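The quoted magnitude of the false-positive bias can be checked with a back-of-envelope calculation (the occurrence and error rates below are hypothetical, chosen to match the abstract's "3% error, ~50% overestimate" claim for a rare species surveyed once):

```python
# Naive apparent occurrence for a single survey: true presences detected
# with probability p, plus false detections at the (1 - psi) unoccupied
# sites. Hypothetical numbers, not the study's data.
def apparent_occurrence(psi, fp, p=1.0):
    return psi * p + (1.0 - psi) * fp

psi_true = 0.06                       # assumed true occurrence of a rare species
naive = apparent_occurrence(psi_true, fp=0.03)
bias = (naive - psi_true) / psi_true  # relative overestimation, here ~47%
```

With a true occurrence of 6% and a 3% false-positive rate, the naive estimate is about 8.8%, a relative overestimate of roughly 47%, in line with the "as much as 50%" figure cited above.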

  1. Sr isotope tracing of multiple water sources in a complex river system, Noteć River, central Poland

    Energy Technology Data Exchange (ETDEWEB)

    Zieliński, Mateusz, E-mail: mateusz.zielinski@amu.edu.pl [Institute of Geoecology and Geoinformation, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland); Dopieralska, Jolanta, E-mail: dopieralska@amu.edu.pl [Poznań Science and Technology Park, Adam Mickiewicz University Foundation, Rubież 46, 61-612 Poznań (Poland); Belka, Zdzislaw, E-mail: zbelka@amu.edu.pl [Isotope Laboratory, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland); Walczak, Aleksandra, E-mail: awalczak@amu.edu.pl [Isotope Laboratory, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland); Siepak, Marcin, E-mail: siep@amu.edu.pl [Institute of Geology, Adam Mickiewicz University, Maków Polnych 16, 61-606 Poznań (Poland); Jakubowicz, Michal, E-mail: mjakub@amu.edu.pl [Institute of Geoecology and Geoinformation, Adam Mickiewicz University, Dzięgielowa 27, 61-680 Poznań (Poland)

    2016-04-01

    Anthropogenic impact on surface waters and other elements in the environment was investigated in the Noteć River basin in central Poland. The approach was to trace changes in the Sr isotope composition ({sup 87}Sr/{sup 86}Sr) and concentration in space and time. Systematic sampling of the river water shows a very wide range of {sup 87}Sr/{sup 86}Sr ratios, from 0.7089 to 0.7127. This strong variation, however, is restricted to the upper course of the river, whereas the water in the lower course typically shows {sup 87}Sr/{sup 86}Sr values around 0.7104–0.7105. Variations in {sup 87}Sr/{sup 86}Sr are associated with a wide range of Sr concentrations, from 0.14 to 1.32 mg/L. We find that strong variations in {sup 87}Sr/{sup 86}Sr and Sr concentrations can be accounted for by mixing of two end-members: 1) atmospheric waters charged with Sr from the near-surface weathering and wash-out of Quaternary glaciogenic deposits, and 2) waters introduced into the river from an open pit lignite mine. The first reservoir is characterized by a low Sr content and high {sup 87}Sr/{sup 86}Sr ratios, whereas mine waters display opposite characteristics. Anthropogenic pollution is also induced by extensive use of fertilizers which constitute the third source of Sr in the environment. The study has an important implication for future archeological studies in the region. It shows that the present-day Sr isotope signatures of river water, flora and fauna cannot be used unambiguously to determine the “baseline” for bioavailable {sup 87}Sr/{sup 86}Sr in the past. - Highlights: • Sr isotopes fingerprint water sources and their interactions in a complex river system. • Mine waters and fertilizers are critical anthropogenic additions in the river water. • Limited usage of environmental isotopic data in archeological studies. • Sr budget of the river is dynamic and temporary.
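The two-end-member mixing invoked above can be sketched as a concentration-weighted isotope balance. The end-member values below are taken from the extremes quoted in the abstract; assigning them to the atmospheric/weathering and mine-water sources is an illustrative assumption, not the authors' fitted model:

```python
import numpy as np

# Concentration-weighted two-end-member Sr mixing. End-member 1:
# atmospheric/weathering water (low Sr, radiogenic). End-member 2:
# lignite-mine water (high Sr, unradiogenic). Values are the abstract's
# extremes; the pairing is an illustrative assumption.
def sr_mixing(f, c1, r1, c2, r2):
    """f = mass fraction of end-member 1; returns (C_mix, 87Sr/86Sr_mix)."""
    c_mix = f * c1 + (1.0 - f) * c2
    r_mix = (f * c1 * r1 + (1.0 - f) * c2 * r2) / c_mix
    return c_mix, r_mix

C1, R1 = 0.14, 0.7127   # mg/L, 87Sr/86Sr
C2, R2 = 1.32, 0.7089

fractions = np.linspace(0.0, 1.0, 101)
ratios = np.array([sr_mixing(f, C1, R1, C2, R2)[1] for f in fractions])
# The mixing curve is a hyperbola in ratio-vs-concentration space and
# spans the full observed range 0.7089-0.7127.
```

Because the weighting is by Sr mass rather than water mass, a small admixture of the Sr-rich mine water pulls the river ratio strongly toward the mine end-member.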

  2. Development of a Consumer Product Ingredient Database for ...

    Science.gov (United States)

    Consumer products are a primary source of chemical exposures, yet little structured information is available on the chemical ingredients of these products and the concentrations at which ingredients are present. To address this data gap, we created a database of chemicals in consumer products using product Material Safety Data Sheets (MSDSs) publicly provided by a large retailer. The resulting database represents 1797 unique chemicals mapped to 8921 consumer products and a hierarchy of 353 consumer product “use categories” within a total of 15 top-level categories. We examine the utility of this database and discuss ways in which it will support (i) exposure screening and prioritization, (ii) generic or framework formulations for several indoor/consumer product exposure modeling initiatives, (iii) candidate chemical selection for monitoring near field exposure from proximal sources, and (iv) as activity tracers or ubiquitous exposure sources using “chemical space” map analyses. Chemicals present at high concentrations and across multiple consumer products and use categories that hold high exposure potential are identified. Our database is publicly available to serve regulators, retailers, manufacturers, and the public for predictive screening of chemicals in new and existing consumer products on the basis of exposure and risk. The National Exposure Research Laboratory’s (NERL’s) Human Exposure and Atmospheric Sciences Division (HEASD) conducts resear

  3. CBD: a biomarker database for colorectal cancer.

    Science.gov (United States)

    Zhang, Xueli; Sun, Xiao-Feng; Cao, Yang; Ye, Benchen; Peng, Qiliang; Liu, Xingyun; Shen, Bairong; Zhang, Hong

    2018-01-01

    Colorectal cancer (CRC) biomarker database (CBD) was established based on 870 identified CRC biomarkers and their relevant information from 1115 original articles in PubMed published from 1986 to 2017. In this version of the CBD, CRC biomarker data were collected, sorted, displayed and analysed. With its credible contents, the CBD is a powerful and time-saving tool that provides comprehensive and accurate information for further CRC biomarker research. The CBD was constructed under MySQL server. HTML, PHP and JavaScript languages have been used to implement the web interface. Apache was selected as the HTTP server. All of these web operations were implemented under the Windows system. The CBD provides users with information on individual biomarkers, categorized by the biological category, source and application of the biomarkers; the experimental methods, results, authors and publication sources; and the research region, average cohort age, gender, race, number of tumours, tumour location and stage. We only collected data from articles with clear and credible results proving that the biomarkers are useful in the diagnosis, treatment or prognosis of CRC. The CBD also provides a professional platform for researchers interested in CRC to communicate, exchange research ideas and design high-quality studies. They can submit new findings to our database via the submission page and communicate with us in the CBD. Database URL: http://sysbio.suda.edu.cn/CBD/.
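The categorized fields described above map naturally onto a relational schema. The sketch below uses SQLite as a stand-in for the MySQL server named in the abstract; the table, column names, and sample rows are invented for illustration and are not the CBD's actual schema:

```python
import sqlite3

# Hypothetical relational schema for a biomarker database (SQLite as a
# stand-in for MySQL; names and rows are invented, not the real CBD).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE biomarker (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        category TEXT,      -- biological category (DNA, RNA, protein, ...)
        source TEXT,        -- tissue, blood, stool, ...
        application TEXT    -- diagnosis, treatment, or prognosis
    )""")
cur.executemany(
    "INSERT INTO biomarker (name, category, source, application) VALUES (?,?,?,?)",
    [("KRAS", "DNA", "tissue", "treatment"),
     ("CEA", "protein", "blood", "prognosis"),
     ("SEPT9", "DNA", "blood", "diagnosis")])
# Typical categorized lookup: all diagnostic biomarkers.
rows = cur.execute(
    "SELECT name FROM biomarker WHERE application = 'diagnosis'").fetchall()
```

Parameterized queries of this kind are what a PHP front end would issue against the server on behalf of the web interface.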

  4. The 2008 Wells, Nevada earthquake sequence: Source constraints using calibrated multiple event relocation and InSAR

    Science.gov (United States)

    Nealy, Jennifer; Benz, Harley M.; Hayes, Gavin; Berman, Eric; Barnhart, William

    2017-01-01

    The 2008 Wells, NV earthquake represents the largest domestic event in the conterminous U.S. outside of California since the October 1983 Borah Peak earthquake in southern Idaho. We present an improved catalog, magnitude complete to 1.6, of the foreshock-aftershock sequence, supplementing the current U.S. Geological Survey (USGS) Preliminary Determination of Epicenters (PDE) catalog with 1,928 well-located events. In order to create this catalog, both subspace and kurtosis detectors are used to obtain an initial set of earthquakes and associated locations. The latter are then calibrated through the implementation of the hypocentroidal decomposition method and relocated using the BayesLoc relocation technique. We additionally perform a finite fault slip analysis of the mainshock using InSAR observations. By combining the relocated sequence with the finite fault analysis, we show that the aftershocks occur primarily updip and along the southwestern edge of the zone of maximum slip. The aftershock locations illuminate areas of post-mainshock strain increase; aftershock depths, ranging from 5 to 16 km, are consistent with InSAR imaging, which shows that the Wells earthquake was a buried source with no observable near-surface offset.

  5. Multiple energy computed tomography for neuroradiology with monochromatic x-rays from the National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Dilmanian, F.A.; Garrett, R.F.; Thomlinson, W.C.; Berman, L.E.; Chapman, L.D.; Gmuer, N.F.; Lazarz, N.M.; Moulin, H.R.; Oversluizen, T.; Slatkin, D.N.; Stojanoff, V.; Volkow, N.D.; Zeman, H.D.; Luke, P.N.; Thompson, A.C.

    1990-01-01

    Monochromatic and tunable 33--100 keV x-rays from the X17 superconducting wiggler of the National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory (BNL) will be used for computed tomography (CT) of the human head and neck. The CT configuration will be one of a fixed horizontal fan-shaped beam and a seated rotating subject. The system, which is under development, will employ a two-crystal monochromator with an energy bandwidth of about 0.1%, and a high-purity germanium linear array detector with 0.5 mm element width and 200 mm total width. Narrow energy bands not only eliminate beam hardening but are ideal for carrying out the following dual-energy methods: (a) dual-photon absorptiometry CT, which provides separate images of the low-Z and the intermediate-Z elements; and (b) K-edge subtraction CT of iodine and perhaps of heavier contrast elements. As a result, the system should provide ∼10-fold improvement in image contrast resolution and in quantitative precision over conventional CT. A prototype system for a 45 mm subject diameter will be ready in 1991, which will be used for studies with phantoms and small animals. The human imaging system will have a field of view of 200 mm. The in-plane spatial resolution in both systems will be 0.5 mm FWHM. 34 refs., 6 figs
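The K-edge subtraction principle can be sketched with a toy attenuation calculation: transmitted intensity is simulated just below and just above the iodine K-edge (~33.2 keV), where tissue attenuation is nearly flat but iodine attenuation jumps sharply. The coefficients and path lengths below are invented round numbers, not tabulated cross-section data:

```python
import numpy as np

# Toy K-edge subtraction. Attenuation coefficients (cm^-1) and path
# lengths (cm) are invented illustrative numbers.
mu_tissue = {"below": 0.35, "above": 0.34}   # nearly flat across the edge
mu_iodine = {"below": 6.0, "above": 36.0}    # jumps sharply at the K-edge

t_tissue, t_iodine = 10.0, 0.05              # path lengths through each material
I = {e: np.exp(-(mu_tissue[e] * t_tissue + mu_iodine[e] * t_iodine))
     for e in ("below", "above")}

# Log subtraction: the near-flat tissue term almost cancels, leaving a
# signal dominated by the iodine path length.
signal = np.log(I["below"]) - np.log(I["above"])
iodine_est = (signal - (mu_tissue["above"] - mu_tissue["below"]) * t_tissue) \
             / (mu_iodine["above"] - mu_iodine["below"])
```

With the residual tissue term subtracted, the recovered iodine path length matches the 0.05 cm that went into the simulation, which is why a narrow-band, tunable source makes the technique quantitative.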

  6. Mapping of wind energy potential over the Gobi Desert in Northwest China based on multiple sources of data

    Science.gov (United States)

    Li, Li; Wang, Xinyuan; Luo, Lei; Zhao, Yanchuang; Zong, Xin; Bachagha, Nabil

    2018-06-01

    In recent years, wind energy has been a fast-growing alternative source of electrical power due to its sustainability. In this paper, the wind energy potential over the Gobi Desert in Northwest China is assessed at the patch scale using geographic information systems (GIS). Data on land cover, topography, and administrative boundaries and 11 years (2000‒2010) of wind speed measurements were collected and used to map and estimate the region's wind energy potential. Based on the results, it was found that continuous regions of geographical potential (GeoP) are located in the middle of the research area (RA), with scattered areas of similar GeoP found in other regions. The results also show that the technical potential (TecP) levels are about 1.72‒2.67 times (2.20 times on average) higher than the actual levels. It was found that the GeoP patches can be divided into four classes: unsuitable regions, suitable regions, more suitable regions, and the most suitable regions. The GeoP estimation shows that 0.41 billion kW of wind energy are potentially available in the RA. The suitable regions account for 25.49%, the more suitable regions 24.45%, and the most suitable regions for more than half of the RA. It is also shown that Xinjiang and Gansu are more suitable for wind power development than Ningxia.

  7. Gender differences in drunk driving prevalence rates and trends: a 20-year assessment using multiple sources of evidence.

    Science.gov (United States)

    Schwartz, Jennifer

    2008-09-01

    This research tracked women's and men's drunk driving rates and the DUI sex ratio in the United States from 1982-2004 using three diverse sources of evidence. Sex-specific prevalence estimates and the sex ratio are derived from official arrest statistics from the Federal Bureau of Investigation, self-reports from the Centers for Disease Control and Prevention, and traffic fatality data from the National Highway and Transportation Safety Administration. Drunk driving trends were analyzed using Augmented Dickey-Fuller time series techniques. Female DUI arrest rates increased whereas male rates declined then stabilized, producing a significantly narrower sex ratio. According to self-report and traffic data, women's and men's drunk driving rates declined and the gender gap was unchanged. Women's overrepresentation in arrests relative to their share of offending began in the 1990s and accelerated in 2000. Women's arrest gains, contrasted with no systematic change in DUI behavior, and the timing of this shift suggest an increased vulnerability to arrest. More stringent laws and enforcement directed at less intoxicated offenders may inadvertently target female offending patterns.
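The core of a Dickey-Fuller unit-root check can be illustrated on synthetic data. The sketch below is the simple (non-augmented) regression of the first difference on the lagged level, a stand-in for the Augmented Dickey-Fuller techniques the study used, not a reimplementation of them:

```python
import numpy as np

# Minimal (non-augmented) Dickey-Fuller regression: regress delta-y on the
# lagged level and report the t-statistic on the lag coefficient.
# Synthetic data only; a simplified stand-in for the ADF procedure.
def dickey_fuller_t(y):
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
noise = rng.normal(size=500)
t_stationary = dickey_fuller_t(noise)        # white noise: strongly mean-reverting
t_walk = dickey_fuller_t(np.cumsum(noise))   # random walk: unit root
# A strongly negative t-statistic rejects a unit root (the trend series
# is stationary); a value near zero cannot reject it.
```

Applied to annual DUI rate series, rejecting the unit root is what licenses treating an observed trend shift as a genuine change rather than a random-walk artifact.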

  8. Multiple disparities in adult mortality in relation to social and health care perspective: results from different data sources.

    Science.gov (United States)

    Ranabhat, Chhabi Lal; Kim, Chun-Bae; Park, Myung-Bae; Acharaya, Sambhu

    2017-08-08

    Disparity in adult mortality (AM) with reference to social dynamics and health care has not been sufficiently examined. This study aimed to identify the gap in the understanding of AM in relation to religion, political stability, economic level, and universal health coverage (UHC). A cross-national study was performed with different sources of data, using the administrative record linkage theory. Data were created from the 2013 World Bank data catalogue by region, The Economist (Political instability index 2013), Stuckler David et al. (Universal health coverage, 2010), and religious categories of all UN country members. Descriptive statistics, a t-test, an ANOVA followed by a post hoc test, and a linear regression were used where applicable. The average AM rate for males and females was 0.20 ± 0.10 and 0.14 ± 0.10, respectively. There was high disparity of AM between countries with and without UHC and between groups with low and high income. UHC and political stability would significantly reduce AMR by >0.41 in both sexes, and high economic status would reduce male AMR by 0.44 and female AMR by 0.70. It can be concluded that effective health care, UHC and political stability significantly reduce AM.

  9. Differentiation among Multiple Sources of Anthropogenic Nitrate in a Complex Groundwater System using Dual Isotope Systematics: A case study from Mortandad Canyon, New Mexico

    Science.gov (United States)

    Larson, T. E.; Perkins, G.; Longmire, P.; Heikoop, J. M.; Fessenden, J. E.; Rearick, M.; Fabyrka-Martin, J.; Chrystal, A. E.; Dale, M.; Simmons, A. M.

    2009-12-01

    The groundwater system beneath Los Alamos National Laboratory has been affected by multiple sources of anthropogenic nitrate contamination. Average NO3-N concentrations of up to 18.2±1.7 mg/L have been found in wells in the perched intermediate aquifer beneath one of the more affected sites within Mortandad Canyon. Sources of nitrate potentially reaching the alluvial and intermediate aquifers include: (1) sewage effluent, (2) neutralized nitric acid, (3) neutralized 15N-depleted nitric acid (treated waste from an experiment enriching nitric acid in 15N), and (4) natural background nitrate. Each of these sources is unique in δ18O and δ15N space. Using nitrate stable isotope ratios, a mixing model for the three anthropogenic sources of nitrate was established, after applying a linear subtraction of the background component. The spatial and temporal variability in nitrate contaminant sources through Mortandad Canyon is clearly shown in ternary plots. While microbial denitrification has been shown to change groundwater nitrate stable isotope ratios in other settings, the redox potential, relatively high dissolved oxygen content, increasing nitrate concentrations over time, and lack of observed NO2 in these wells suggest minimal changes to the stable isotope ratios have occurred. Temporal trends indicate that the earliest form of anthropogenic nitrate in this watershed was neutralized nitric acid. Alluvial wells preserve a trend of decreasing nitrate concentrations and mixing models show decreasing contributions of 15N-depleted nitric acid. Nearby intermediate wells show increasing nitrate concentrations and mixing models indicate a larger component derived from 15N-depleted nitric acid. These data indicate that the pulse of neutralized 15N-depleted nitric acid that was released into Mortandad Canyon between 1986 and 1989 has infiltrated through the alluvial aquifer and is currently affecting two intermediate wells. This hypothesis is consistent with previous
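After the linear background subtraction, a three-source mixing model in δ15N-δ18O space reduces to a small linear system: two isotope-ratio balances plus mass conservation. The end-member delta values below are invented for illustration; the actual source signatures belong to the study:

```python
import numpy as np

# Three-end-member mixing solved as a 3x3 linear system: two isotope
# balances plus sum-to-one. End-member delta values are invented, not the
# study's measured signatures.
def mixing_fractions(d15n_obs, d18o_obs, ends):
    """ends: three (d15N, d18O) tuples, one per anthropogenic source."""
    A = np.array([[e[0] for e in ends],     # d15N balance
                  [e[1] for e in ends],     # d18O balance
                  [1.0, 1.0, 1.0]])         # fractions sum to 1
    b = np.array([d15n_obs, d18o_obs, 1.0])
    return np.linalg.solve(A, b)

ends = [(10.0, 0.0),    # sewage effluent (hypothetical)
        (0.0, 22.0),    # neutralized nitric acid (hypothetical)
        (-40.0, 22.0)]  # neutralized 15N-depleted nitric acid (hypothetical)

f = mixing_fractions(d15n_obs=2.0, d18o_obs=15.0, ends=ends)
# f gives the fractional contribution of each source; valid mixtures have
# all fractions between 0 and 1, which is what the ternary plots display.
```

Each groundwater sample solved this way yields one point on the ternary diagram, so temporal trends in the fractions trace the migration of the 15N-depleted acid pulse through the aquifers.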

  10. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm² to 40 × 40 cm². The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
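The ±2%/2 mm gamma criterion used in the validation can be illustrated with a simplified 1-D global gamma-index calculation. Real QA tools interpolate finely and operate in 3-D; this sketch, with an invented exponential depth-dose curve, only demonstrates the metric itself:

```python
import numpy as np

# Simplified 1-D global gamma index for a dose/distance criterion.
# Hypothetical dose curves; real gamma tools interpolate and work in 3-D.
def gamma_1d(x, ref_dose, eval_dose, dose_crit=0.02, dist_crit=2.0):
    """x in mm; dose_crit is a fraction of the reference dose maximum."""
    dmax = ref_dose.max()
    gam = np.empty_like(ref_dose)
    for i, (xi, di) in enumerate(zip(x, ref_dose)):
        dist_term = (x - xi) / dist_crit
        dose_term = (eval_dose - di) / (dose_crit * dmax)
        gam[i] = np.sqrt(dist_term**2 + dose_term**2).min()
    return gam

x = np.linspace(0.0, 100.0, 101)        # positions in mm
ref = np.exp(-x / 80.0)                 # hypothetical depth-dose curve
evl = np.exp(-(x - 1.0) / 80.0)         # the same curve shifted by 1 mm
passing = float(np.mean(gamma_1d(x, ref, evl) <= 1.0))
```

A 1 mm spatial shift sits comfortably inside 2%/2 mm, so every point passes here (gamma ≤ 1); larger shifts or dose scaling errors push points above gamma = 1 and lower the agreement percentages reported above.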

  11. Sequencing-Based Analysis of the Bacterial and Fungal Composition of Kefir Grains and Milks from Multiple Sources

    Science.gov (United States)

    Marsh, Alan J.; O’Sullivan, Orla; Hill, Colin; Ross, R. Paul; Cotter, Paul D.

    2013-01-01

    Kefir is a fermented milk-based beverage to which a number of health-promoting properties have been attributed. The microbes responsible for the fermentation of milk to produce kefir consist of a complex association of bacteria and yeasts, bound within a polysaccharide matrix, known as the kefir grain. The consistency of this microbial population, and that present in the resultant beverage, has been the subject of a number of previous, almost exclusively culture-based, studies which have indicated differences depending on geographical location and culture conditions. However, culture-based identification studies are limited by virtue of only detecting species with the ability to grow on the specific medium used and thus culture-independent, molecular-based techniques offer the potential for a more comprehensive analysis of such communities. Here we describe a detailed investigation of the microbial population, both bacterial and fungal, of kefir, using high-throughput sequencing to analyse 25 kefir milks and associated grains sourced from 8 geographically distinct regions. This is the first occasion that this technology has been employed to investigate the fungal component of these populations or to reveal the microbial composition of such an extensive number of kefir grains or milks. As a result several genera and species not previously identified in kefir were revealed. Our analysis shows that the bacterial populations in kefir are dominated by 2 phyla, the Firmicutes and the Proteobacteria. It was also established that the fungal populations of kefir were dominated by the genera Kazachstania, Kluyveromyces and Naumovozyma, but that a variable sub-dominant population also exists. PMID:23894461

  12. Sequencing-based analysis of the bacterial and fungal composition of kefir grains and milks from multiple sources.

    Directory of Open Access Journals (Sweden)

    Alan J Marsh

    Full Text Available Kefir is a fermented milk-based beverage to which a number of health-promoting properties have been attributed. The microbes responsible for the fermentation of milk to produce kefir consist of a complex association of bacteria and yeasts, bound within a polysaccharide matrix, known as the kefir grain. The consistency of this microbial population, and that present in the resultant beverage, has been the subject of a number of previous, almost exclusively culture-based, studies which have indicated differences depending on geographical location and culture conditions. However, culture-based identification studies are limited by virtue of only detecting species with the ability to grow on the specific medium used and thus culture-independent, molecular-based techniques offer the potential for a more comprehensive analysis of such communities. Here we describe a detailed investigation of the microbial population, both bacterial and fungal, of kefir, using high-throughput sequencing to analyse 25 kefir milks and associated grains sourced from 8 geographically distinct regions. This is the first occasion that this technology has been employed to investigate the fungal component of these populations or to reveal the microbial composition of such an extensive number of kefir grains or milks. As a result several genera and species not previously identified in kefir were revealed. Our analysis shows that the bacterial populations in kefir are dominated by 2 phyla, the Firmicutes and the Proteobacteria. It was also established that the fungal populations of kefir were dominated by the genera Kazachstania, Kluyveromyces and Naumovozyma, but that a variable sub-dominant population also exists.

  13. Contrasts between chemical and physical estimates of baseflow help discern multiple sources of water contributing to rivers

    Science.gov (United States)

    Cartwright, I.; Gilfedder, B.; Hofmann, H.

    2013-05-01

    This study compares geochemical and physical methods of estimating baseflow in the upper reaches of the Barwon River, southeast Australia. Estimates of baseflow from physical techniques such as local minima and recursive digital filters are higher than those based on chemical mass balance using continuous electrical conductivity (EC). Between 2001 and 2011 the baseflow flux calculated using chemical mass balance is between 1.8 × 10³ and 1.5 × 10⁴ ML yr⁻¹ (15 to 25% of the total discharge in any one year) whereas recursive digital filters yield baseflow fluxes of 3.6 × 10³ to 3.8 × 10⁴ ML yr⁻¹ (19 to 52% of discharge) and the local minimum method yields baseflow fluxes of 3.2 × 10³ to 2.5 × 10⁴ ML yr⁻¹ (13 to 44% of discharge). These differences most probably reflect how the different techniques characterise baseflow. Physical methods probably aggregate much of the water from delayed sources as baseflow. However, as many delayed transient water stores (such as bank return flow or floodplain storage) are likely to be geochemically similar to surface runoff, chemical mass balance calculations aggregate them with the surface runoff component. The mismatch between geochemical and physical estimates is greatest following periods of high discharge in winter, implying that these transient stores of water feed the river for several weeks to months. Consistent with these interpretations, modelling of bank storage indicates that bank return flows provide water to the river for several weeks after flood events. EC vs. discharge variations during individual flow events also imply that an inflow of low EC water stored within the banks or on the floodplain occurs as discharge falls. The joint use of physical and geochemical techniques allows a better understanding of the different components of water that contribute to river flow, which is important for the management and protection of water resources.
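
    Both estimation approaches contrasted above can be sketched in a few lines: a two-component chemical mass balance that partitions discharge using EC as the tracer, and a recursive digital filter of the Lyne-Hollick type. The end-member EC values, filter parameter, and discharge series below are illustrative, not taken from the study:

```python
def baseflow_cmb(q, ec_river, ec_runoff, ec_baseflow):
    """Two-component chemical mass balance: the baseflow share of total
    discharge q follows from river EC and two assumed end-member EC values."""
    frac = (ec_river - ec_runoff) / (ec_baseflow - ec_runoff)
    return q * max(0.0, min(1.0, frac))  # keep the fraction physical

def baseflow_lyne_hollick(q, alpha=0.925):
    """One forward pass of the Lyne-Hollick recursive digital filter:
    quickflow f[t] = alpha*f[t-1] + (1+alpha)/2 * (q[t] - q[t-1]);
    baseflow = q - max(f, 0). Applications typically run multiple passes."""
    f_prev, q_prev, base = 0.0, q[0], []
    for qt in q:
        f = alpha * f_prev + (1 + alpha) / 2 * (qt - q_prev)
        base.append(qt - max(f, 0.0))
        f_prev, q_prev = f, qt
    return base

# Illustrative numbers: 100 ML/day at river EC 300 uS/cm, with runoff and
# baseflow end-members of 100 and 900 uS/cm.
print(baseflow_cmb(100.0, 300.0, 100.0, 900.0))  # 25.0
print(baseflow_lyne_hollick([10.0, 50.0, 30.0, 15.0, 12.0]))
```

    The mismatch discussed in the abstract arises because the filter attributes all slowly receding flow to "baseflow", while the mass balance attributes any low-EC delayed water (bank return flow, floodplain storage) to the runoff end-member.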

  14. Sequencing-based analysis of the bacterial and fungal composition of kefir grains and milks from multiple sources.

    Science.gov (United States)

    Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D

    2013-01-01

    Kefir is a fermented milk-based beverage to which a number of health-promoting properties have been attributed. The microbes responsible for the fermentation of milk to produce kefir consist of a complex association of bacteria and yeasts, bound within a polysaccharide matrix, known as the kefir grain. The consistency of this microbial population, and that present in the resultant beverage, has been the subject of a number of previous, almost exclusively culture-based, studies which have indicated differences depending on geographical location and culture conditions. However, culture-based identification studies are limited by virtue of only detecting species with the ability to grow on the specific medium used and thus culture-independent, molecular-based techniques offer the potential for a more comprehensive analysis of such communities. Here we describe a detailed investigation of the microbial population, both bacterial and fungal, of kefir, using high-throughput sequencing to analyse 25 kefir milks and associated grains sourced from 8 geographically distinct regions. This is the first occasion that this technology has been employed to investigate the fungal component of these populations or to reveal the microbial composition of such an extensive number of kefir grains or milks. As a result several genera and species not previously identified in kefir were revealed. Our analysis shows that the bacterial populations in kefir are dominated by 2 phyla, the Firmicutes and the Proteobacteria. It was also established that the fungal populations of kefir were dominated by the genera Kazachstania, Kluyveromyces and Naumovozyma, but that a variable sub-dominant population also exists.

  15. Incorporating community and multiple perspectives in the development of acceptable drinking water source protection policy in catchments facing recreation demands.

    Science.gov (United States)

    Syme, Geoffrey J; Nancarrow, Blair E

    2013-11-15

    The protection of catchment areas for drinking water quality has become an increasingly disputed issue in Australia and internationally. This is particularly the case in regard to the growing demand for nature-based and rural recreation. Currently the policy for the protection of drinking water in Western Australia is to enforce a 2 km exclusion zone, surrounded by a much larger area with limited and prescribed access for recreators. The debate between recreators and water management agencies has been lively, culminating in a recent state government enquiry. This paper describes the second phase of a three-phase study to develop a methodology for defensible policy formulation which accounts for the points of view of all stakeholders. We examine the views of the general community, active recreators and professionals on the current policy of catchment protection and five proposed alternatives, using a social judgement theory approach. Key attitudinal determinants of the preferences for policies were identified. Overall, the recreators did not support the current policy, despite strong support from both the general community and the professional group. Nevertheless, it was evident that there was some support within the community for policies that would enable a slight relaxation of current recreational exclusion. It was also evident that a significant proportion of the general community was dissatisfied with current recreational opportunities and that, in future, it may be less easy to police exclusion zones even if current policy is maintained. The potential for future integration of recreational and water source protection is discussed, as well as the benefits of community research in understanding policy preferences in this regard. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Standard guide for formats for collection and compilation of corrosion data for metals for computerized database input

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1995-01-01

    1.1 This guide covers the data categories and specific data elements (fields) considered necessary to accommodate desired search strategies and reliable data comparisons in computerized corrosion databases. The data entries are designed to accommodate data relative to the basic forms of corrosion and to serve as guides for structuring multiple source database compilations capable of assessing compatibility of metals and alloys for a wide range of environments and exposure conditions.

  17. Multiple linear regression model for bromate formation based on the survey data of source waters from geographically different regions across China.

    Science.gov (United States)

    Yu, Jianwei; Liu, Juan; An, Wei; Wang, Yongjing; Zhang, Junzhi; Wei, Wei; Su, Ming; Yang, Min

    2015-01-01

    A total of 86 source water samples from 38 cities across major watersheds of China were collected for a bromide (Br⁻) survey, and the bromate (BrO₃⁻) formation potentials (BFPs) of 41 samples with Br⁻ concentration >20 μg L⁻¹ were evaluated using a batch ozonation reactor. Statistical analyses indicated that higher alkalinity, hardness, and pH of water samples could lead to higher BFPs, with alkalinity as the most important factor. Based on the survey data, a multiple linear regression (MLR) model including three parameters (alkalinity, ozone dose, and total organic carbon (TOC)) was established with a relatively good prediction performance (model selection criterion = 2.01, R² = 0.724), using logarithmic transformation of the variables. Furthermore, a contour plot was used to interpret the influence of alkalinity and TOC on BrO₃⁻ formation with prediction accuracy as high as 71%, suggesting that these two parameters, apart from ozone dosage, were the most important ones affecting the BFPs of source waters with Br⁻ concentration >20 μg L⁻¹. The model could be a useful tool for the prediction of the BFPs of source water.
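
    The model form described, a linear regression fit after log-transforming both the response and the predictors, can be sketched as follows. The data here are synthetic and the generating exponents are made up; the paper's survey data and fitted coefficients are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
alk = rng.uniform(50, 300, n)   # alkalinity, mg/L as CaCO3 (synthetic)
o3 = rng.uniform(0.5, 4.0, n)   # ozone dose, mg/L (synthetic)
toc = rng.uniform(1.0, 6.0, n)  # total organic carbon, mg/L (synthetic)

# Synthetic power-law relationship used only to generate example data.
bfp = 2.0 * alk**0.8 * o3**0.6 * toc**-0.3 * np.exp(rng.normal(0, 0.05, n))

# Log-transform both sides and fit by ordinary least squares:
# log(BFP) = b0 + b1*log(alkalinity) + b2*log(ozone) + b3*log(TOC)
X = np.column_stack([np.ones(n), np.log(alk), np.log(o3), np.log(toc)])
y = np.log(bfp)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = 1.0 - ((y - X @ coef) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(coef.round(2), round(r2, 3))
```

    Fitting in log space turns a multiplicative power-law relationship into a linear one, which is why the authors report the model with logarithmic transformation of the variables.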

  18. Interaction between different groundwaters in brittany catchments (france): characterizing multiple sources through Sr- and S isotope tracing

    Science.gov (United States)

    Negrel, Ph; Pauwels, H.

    2003-04-01

    Water resources in hard-rocks commonly involve different hydrogeological compartments such as overlying sediments, weathered rock, the weathered-fissured zone, and fractured bedrock. Streams, lakes and wetlands that drain such environments can drain groundwater, recharge groundwater, or do both. Groundwater resources in many countries are increasingly threatened by growing demand, wasteful use, and contamination. Surface water and shallow groundwater are particularly vulnerable to pollution, while deeper resources are more protected from contamination. Sr- and S-isotope data, as well as major ions, are presented from shallow and deep groundwater in three granite and Brioverian "schist" areas of the Armorican Massif (NW France), large parts of which are under intensive agriculture. The stable-isotope signatures of the waters plot close to the general meteoric-water line, reflecting a meteoric origin and the lack of significant evaporation or water-rock interaction. The water chemistry from the different catchments shows large variation in the major-element contents. Plots of Na, Mg, NO₃, K, SO₄ and Sr concentrations vs. Cl reflect agricultural input from hog and livestock farming and fertilizer applications, with local sewage-effluent influence, although some water samples are clearly unpolluted. The δ³⁴S(SO₄) is controlled by several potential sources (atmospheric sulphate, pyrite-derived sulphates, fertilizer sulphates). Some δ¹⁸O and δ³⁴S values are expected to increase through sulphate reduction, with a higher effect on δ³⁴S for dissimilatory processes and on δ¹⁸O for assimilatory processes. The range in Sr contents in groundwater from different catchments agrees with previous work on groundwater sampled from granites in France. The Sr content is well correlated with Mg and both are related to agricultural practices. As in granite-gneiss watersheds in France, ⁸⁷Sr/⁸⁶Sr ratios range from 0.71265 to 0.72009. The relationship between ⁸⁷Sr/⁸⁶Sr and Mg

  19. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

    Introduction to Database Systems; Functions of a Database; Database Management System; Database Components; Database Development Process; Conceptual Design and Data Modeling; Introduction to Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with Entity-Relationship Model; Table Structure and Normalization; Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Enforcing Constraints; Creating Database for Business Process; Physical Design and Database

  20. ARTI refrigerant database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M. [Calm (James M.), Great Falls, VA (United States)

    1996-04-15

    The Refrigerant Database is an information system on alternative refrigerants, associated lubricants, and their use in air conditioning and refrigeration. It consolidates and facilitates access to property, compatibility, environmental, safety, application and other information. It provides corresponding information on older refrigerants, to assist manufacturers and those using alternative refrigerants, to make comparisons and determine differences. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern. The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air-conditioning and refrigeration equipment. The complete documents are not included, though some may be added at a later date. The database identifies sources of specific information on refrigerants. It addresses lubricants including alkylbenzene, polyalkylene glycol, polyolester, and other synthetics as well as mineral oils. It also references documents addressing compatibility of refrigerants and lubricants with metals, plastics, elastomers, motor insulation, and other materials used in refrigerant circuits. Incomplete citations or abstracts are provided for some documents. They are included to accelerate availability of the information and will be completed or replaced in future updates. Citations in this report are divided into the following topics: thermophysical properties; materials compatibility; lubricants and tribology; application data; safety; test and analysis methods; impacts; regulatory actions; substitute refrigerants; identification; absorption and adsorption; research programs; and miscellaneous documents. Information is also presented on ordering instructions for the computerized version.

  1. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Cain, J.M. (Calm (James M.), Great Falls, VA (United States))

    1993-04-30

    The Refrigerant Database consolidates and facilitates access to information to assist industry in developing equipment using alternative refrigerants. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern. The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air-conditioning and refrigeration equipment. The complete documents are not included. The database identifies sources of specific information on R-32, R-123, R-124, R-125, R-134, R-134a, R-141b, R-142b, R-143a, R-152a, R-245ca, R-290 (propane), R-717 (ammonia), ethers, and others as well as azeotropic and zeotropic blends of these fluids. It addresses lubricants including alkylbenzene, polyalkylene glycol, ester, and other synthetics as well as mineral oils. It also references documents addressing compatibility of refrigerants and lubricants with metals, plastics, elastomers, motor insulation, and other materials used in refrigerant circuits. Incomplete citations or abstracts are provided for some documents to accelerate availability of the information and will be completed or replaced in future updates.

  2. ARTI refrigerant database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M. [Calm (James M.), Great Falls, VA (United States)

    1998-08-01

    The Refrigerant Database is an information system on alternative refrigerants, associated lubricants, and their use in air conditioning and refrigeration. It consolidates and facilitates access to property, compatibility, environmental, safety, application and other information. It provides corresponding information on older refrigerants, to assist manufacturers and those using alternative refrigerants, to make comparisons and determine differences. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern. The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air-conditioning and refrigeration equipment. The complete documents are not included, though some may be added at a later date. The database identifies sources of specific information on many refrigerants including propane, ammonia, water, carbon dioxide, propylene, ethers, and others as well as azeotropic and zeotropic blends of these fluids. It addresses lubricants including alkylbenzene, polyalkylene glycol, polyolester, and other synthetics as well as mineral oils. It also references documents addressing compatibility of refrigerants and lubricants with metals, plastics, elastomers, motor insulation, and other materials used in refrigerant circuits. Incomplete citations or abstracts are provided for some documents. They are included to accelerate availability of the information and will be completed or replaced in future updates.

  3. ARTI refrigerant database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M.

    1997-02-01

    The Refrigerant Database is an information system on alternative refrigerants, associated lubricants, and their use in air conditioning and refrigeration. It consolidates and facilitates access to property, compatibility, environmental, safety, application and other information. It provides corresponding information on older refrigerants, to assist manufacturers and those using alternative refrigerants, to make comparisons and determine differences. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern. The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air-conditioning and refrigeration equipment. The complete documents are not included, though some may be added at a later date. The database identifies sources of specific information on various refrigerants. It addresses lubricants including alkylbenzene, polyalkylene glycol, polyolester, and other synthetics as well as mineral oils. It also references documents addressing compatibility of refrigerants and lubricants with metals, plastics, elastomers, motor insulation, and other materials used in refrigerant circuits. Incomplete citations or abstracts are provided for some documents. They are included to accelerate availability of the information and will be completed or replaced in future updates.

  4. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece)

    International Nuclear Information System (INIS)

    Matiatos, Ioannis

    2016-01-01

    Nitrate (NO₃) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ¹⁵N–NO₃ and δ¹⁸O–NO₃) from groundwater of the Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO₃ sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area, far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater, leading to more effective planning of environmental measures and remediation strategies in river basins and water bodies as defined by the European Water Framework Directive (Directive 2000/60/EC). - Highlights: • More enriched N-isotope values were observed in the industrial/urban areas. • A Bayesian isotope mixing model was applied in a multiple land-use area. • A 3-component model explained the factors controlling nitrate content in groundwater. • Industrial/urban nitrogen source was
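
    A Bayesian mixing model such as SIAR estimates the proportional contributions of many sources together with their uncertainty. For two sources and a single tracer, the underlying idea reduces to a simple isotope mass balance, sketched here with hypothetical δ¹⁵N end-members (the study's actual source signatures are not reproduced):

```python
def mixing_fraction(d_sample, d_source_a, d_source_b):
    """Two end-member isotope mass balance: fraction of source A in a
    sample, assuming conservative mixing of one tracer. A Bayesian model
    like SIAR generalizes this to many sources with uncertainty."""
    f = (d_sample - d_source_b) / (d_source_a - d_source_b)
    return max(0.0, min(1.0, f))  # clip to the physical range

# Hypothetical delta-15N end-members (permil): manure/sewage ~ +12,
# synthetic fertilizer ~ +1; a sample at +7 is a roughly 55/45 mix.
print(round(mixing_fraction(7.0, 12.0, 1.0), 3))  # 0.545
```

    With more sources than tracers the deterministic balance is underdetermined, which is the motivation for the Bayesian formulation used in the study.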

  5. Filling Terrorism Gaps: VEOs, Evaluating Databases, and Applying Risk Terrain Modeling to Terrorism

    Energy Technology Data Exchange (ETDEWEB)

    Hagan, Ross F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-08-29

    This paper aims to address three issues: the lack of literature differentiating terrorism and violent extremist organizations (VEOs), terrorism incident databases, and the applicability of Risk Terrain Modeling (RTM) to terrorism. Current open source literature and publicly available government sources do not differentiate between terrorism and VEOs; furthermore, they fail to define them. Addressing the lack of a comprehensive comparison of existing terrorism data sources, a matrix comparing a dozen terrorism databases is constructed, providing insight toward the array of data available. RTM, a method for spatial risk analysis at a micro level, has some applicability to terrorism research, particularly for studies looking at risk indicators of terrorism. Leveraging attack data from multiple databases, combined with RTM, offers one avenue for closing existing research gaps in terrorism literature.

  6. Gully Erosion Mapping and Monitoring at Multiple Scales Based on Multi-Source Remote Sensing Data of the Sancha River Catchment, Northeast China

    Directory of Open Access Journals (Sweden)

    Ranghu Wang

    2016-11-01

    Full Text Available This research is focused on gully erosion mapping and monitoring at multiple spatial scales using multi-source remote sensing data of the Sancha River catchment in Northeast China, where gullies extend over a vast area. A high-resolution satellite image (Pleiades 1A, 0.7 m) was used to obtain the spatial distribution of the gullies of the overall basin. Image visual interpretation with field verification was employed to map the geometric gully features and evaluate gully erosion as well as the topographic differentiation characteristics. Unmanned Aerial Vehicle (UAV) remote sensing data and the 3D photo-reconstruction method were employed for detailed gully mapping at a site scale. The results showed that: (1) the sub-meter image showed a strong ability in the recognition of various gully types and obtained satisfactory results, and the topographic factors of elevation, slope and slope aspects exerted significant influence on the gully spatial distribution at the catchment scale; and (2) at a more detailed site scale, UAV imagery combined with 3D photo-reconstruction provided a Digital Surface Model (DSM) and ortho-image at the centimeter level as well as a detailed 3D model. The resulting products revealed the area of agricultural utilization and its shaping by human agricultural activities and water erosion in detail, and also provided the gully volume. The present study indicates that using multi-source remote sensing data, including satellite and UAV imagery simultaneously, results in an effective assessment of gully erosion over multiple spatial scales. The combined approach should be continued to regularly monitor gully erosion to understand the erosion process and its relationship with the environment from a comprehensive perspective.

  7. Mathematics for Databases

    NARCIS (Netherlands)

    ir. Sander van Laar

    2007-01-01

    A formal description of a database consists of the description of the relations (tables) of the database together with the constraints that must hold on the database. Furthermore the contents of a database can be retrieved using queries. These constraints and queries for databases can very well be

  8. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The

  9. DOT Online Database

    Science.gov (United States)

    Searchable full-text document database of Advisory Circulars and related records, with data collection and distribution policies; website provided by MicroSearch.

  10. The Saccharomyces Genome Database Variant Viewer.

    Science.gov (United States)

    Sheppard, Travis K; Hitz, Benjamin C; Engel, Stacia R; Song, Giltae; Balakrishnan, Rama; Binkley, Gail; Costanzo, Maria C; Dalusag, Kyla S; Demeter, Janos; Hellerstedt, Sage T; Karra, Kalpana; Nash, Robert S; Paskov, Kelley M; Skrzypek, Marek S; Weng, Shuai; Wong, Edith D; Cherry, J Michael

    2016-01-04

    The Saccharomyces Genome Database (SGD; http://www.yeastgenome.org) is the authoritative community resource for the Saccharomyces cerevisiae reference genome sequence and its annotation. In recent years, we have moved toward increased representation of sequence variation and allelic differences within S. cerevisiae. The publication of numerous additional genomes has motivated the creation of new tools for their annotation and analysis. Here we present the Variant Viewer: a dynamic open-source web application for the visualization of genomic and proteomic differences. Multiple sequence alignments have been constructed across high quality genome sequences from 11 different S. cerevisiae strains and stored in the SGD. The alignments and summaries are encoded in JSON and used to create a two-tiered dynamic view of the budding yeast pan-genome, available at http://www.yeastgenome.org/variant-viewer. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Identifying preferred format and source of exercise information in persons with multiple sclerosis that can be delivered by health-care providers.

    Science.gov (United States)

    Learmonth, Yvonne C; Adamson, Brynn C; Balto, Julia M; Chiu, Chung-Yi; Molina-Guzman, Isabel M; Finlayson, Marcia; Riskin, Barry J; Motl, Robert W

    2017-10-01

    There is increasing recognition of the benefits of exercise in individuals with multiple sclerosis (MS), yet the MS population does not engage in sufficient amounts of exercise to accrue health benefits. There has been little qualitative inquiry to establish the preferred format and source for receiving exercise information from health-care providers among persons with MS. We sought to identify the desired and preferred format and source of exercise information for persons with MS that can be delivered through health-care providers. Participants were adults with MS who had mild or moderate disability and participated in a range of exercise levels. All participants lived in the Midwest of the United States. Fifty semi-structured interviews were conducted and analysed using thematic analysis. Two themes emerged, (i) approach for receiving exercise promotion and (ii) ideal person for promoting exercise. Persons with MS want to receive exercise information through in-person consultations with health-care providers, print media and electronic media. Persons with MS want to receive exercise promotion from health-care providers with expertise in MS (i.e., neurologists) and with expertise in exercise (e.g., physical therapists). These data support the importance of understanding how to provide exercise information to persons with MS and identifying that health-care providers including neurologists and physical therapists should be involved in exercise promotion. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  12. The ESID Online Database network.

    Science.gov (United States)

    Guzman, D; Veit, D; Knerr, V; Kindle, G; Gathmann, B; Eades-Perner, A M; Grimbacher, B

    2007-03-01

    Primary immunodeficiencies (PIDs) belong to the group of rare diseases. The European Society for Immunodeficiencies (ESID), is establishing an innovative European patient and research database network for continuous long-term documentation of patients, in order to improve the diagnosis, classification, prognosis and therapy of PIDs. The ESID Online Database is a web-based system aimed at data storage, data entry, reporting and the import of pre-existing data sources in an enterprise business-to-business integration (B2B). The online database is based on Java 2 Enterprise System (J2EE) with high-standard security features, which comply with data protection laws and the demands of a modern research platform. The ESID Online Database is accessible via the official website (http://www.esid.org/). Supplementary data are available at Bioinformatics online.

  13. SAADA: Astronomical Databases Made Easier

    Science.gov (United States)

    Michel, L.; Nguyen, H. N.; Motch, C.

    2005-12-01

    Many astronomers wish to share datasets with their community but lack the manpower to develop databases with the functionalities required for high-level scientific applications. The SAADA project aims at automating the creation and deployment of such databases. A generic but scientifically relevant data model has been designed which allows one to build databases by providing only a limited number of product mapping rules. Databases created by SAADA rely on a relational database supporting JDBC, covered by a Java layer that includes a large amount of generated code. Such databases can simultaneously host spectra, images, source lists and plots. Data are grouped in user-defined collections whose content can be seen as one unique set per data type even if their formats differ. Datasets can be correlated with each other using qualified links. These links help, for example, to handle the nature of a cross-identification (e.g., a distance or a likelihood) or to describe their scientific content (e.g., by associating a spectrum with a catalog entry). The SAADA query engine is based on a language well suited to the data model which can handle constraints on linked data, in addition to classical astronomical queries. These constraints can be applied to the linked objects (number, class and attributes) and/or to the link qualifier values. Databases created by SAADA are accessed through a rich web interface or a Java API. We are currently developing an interoperability module implementing VO protocols.

  14. Databases of the marine metagenomics

    KAUST Repository

    Mineta, Katsuhiko

    2015-10-28

    Metagenomic data obtained from marine environments are highly useful for understanding marine microbial communities. Compared with the conventional amplicon-based approach to metagenomics, the recent shotgun sequencing-based approach has become a powerful tool that provides an efficient way of grasping the diversity of an entire microbial community at a sampling point in the sea. However, this approach accelerates the accumulation of metagenome data and increases data complexity. Moreover, when the metagenomic approach is used to monitor temporal changes in marine environments at multiple seawater locations, metagenomics data will accumulate at an enormous speed. Because this situation is becoming a reality at many marine research institutions and stations all over the world, data management and analysis will be confronted by so-called Big Data issues, such as how a database can be constructed efficiently and how useful knowledge should be extracted from a vast amount of data. In this review, we summarize all the major currently publicly available marine metagenome databases, noting that no database is devoted exclusively to marine metagenomes, and that the number of metagenome databases that include marine metagenome data is six, unexpectedly still small. We also extend our discussion to what we call reference databases, which will be useful for constructing a marine metagenome database as well as for complementing it with important information. We then point out a number of challenges to be overcome in constructing a marine metagenome database.

  15. Dietary Supplement Ingredient Database

    Science.gov (United States)

    ... and US Department of Agriculture Dietary Supplement Ingredient Database ... values can be saved to build a small database or add to an existing database for national, ...

  16. Energy Consumption Database

    Science.gov (United States)

    Consumption Database: The California Energy Commission has created this on-line database for informal reporting ... classifications. The database also provides easy downloading of energy consumption data into Microsoft Excel (XLSX) ...

  17. Database of Standardized Questionnaires About Walking & Bicycling

    Science.gov (United States)

    This database contains questionnaire items and a list of validation studies for standardized items related to walking and biking. The items come from multiple national and international physical activity questionnaires.

  18. Small-boat Cetacean Surveys Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The database contains multiple spreadsheets that hold data collected during each small-boat survey project conducted by the PIFSC CRP. This includes a summary of the...

  19. Multiple-source current steering in subthalamic nucleus deep brain stimulation for Parkinson's disease (the VANTAGE study): a non-randomised, prospective, multicentre, open-label study.

    Science.gov (United States)

    Timmermann, Lars; Jain, Roshini; Chen, Lilly; Maarouf, Mohamed; Barbe, Michael T; Allert, Niels; Brücke, Thomas; Kaiser, Iris; Beirer, Sebastian; Sejio, Fernando; Suarez, Esther; Lozano, Beatriz; Haegelen, Claire; Vérin, Marc; Porta, Mauro; Servello, Domenico; Gill, Steven; Whone, Alan; Van Dyck, Nic; Alesch, Francois

    2015-07-01

    High-frequency deep brain stimulation (DBS) with a single electrical source is effective for motor symptom relief in patients with Parkinson's disease. We postulated that a multiple-source, constant-current device that permits well defined distribution of current would lead to motor improvement in patients with Parkinson's disease. We did a prospective, multicentre, non-randomised, open-label intervention study of an implantable DBS device (the VANTAGE study) at six specialist DBS centres at universities in six European countries. Patients were judged eligible if they were aged 21-75 years, had been diagnosed with bilateral idiopathic Parkinson's disease with motor symptoms for more than 5 years, had a Hoehn and Yahr score of 2 or greater, and had a Unified Parkinson's disease rating scale part III (UPDRS III) score in the medication-off state of more than 30, which improved by 33% or more after a levodopa challenge. Participants underwent bilateral implantation in the subthalamic nucleus of a multiple-source, constant-current, eight-contact, rechargeable DBS system, and were assessed 12, 26, and 52 weeks after implantation. The primary endpoint was the mean change in UPDRS III scores (assessed by site investigators who were aware of the treatment assignment) from baseline (medication-off state) to 26 weeks after first lead implantation (stimulation-on, medication-off state). This study is registered with ClinicalTrials.gov, number NCT01221948. Of 53 patients enrolled in the study, 40 received a bilateral implant in the subthalamic nucleus and their data contributed to the primary endpoint analysis. Improvement was noted in the UPDRS III motor score 6 months after first lead implantation (mean 13·5 [SD 6·8], 95% CI 11·3-15·7) compared with baseline (37·4 [8·9], 34·5-40·2), with a mean difference of 23·8 (SD 10·6; 95% CI 20·3-27·3; p<0·0001). One patient died of pneumonia 24 weeks after implantation, which was judged to be unrelated to the procedure

  20. The NCBI BioSystems database.

    Science.gov (United States)

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.

  1. A Taxonomic Search Engine: Federating taxonomic databases using web services

    Directory of Open Access Journals (Sweden)

    Page Roderic DM

    2005-03-01

    Full Text Available Abstract Background The taxonomic name of an organism is a key link between different databases that store information on that organism. However, in the absence of a single, comprehensive database of organism names, individual databases lack an easy means of checking the correctness of a name. Furthermore, the same organism may have more than one name, and the same name may apply to more than one organism. Results The Taxonomic Search Engine (TSE) is a web application written in PHP that queries multiple taxonomic databases (ITIS, Index Fungorum, IPNI, NCBI, and uBIO) and summarises the results in a consistent format. It supports "drill-down" queries to retrieve a specific record. The TSE can optionally suggest alternative spellings the user can try. It also acts as a Life Science Identifier (LSID) authority for the source taxonomic databases, providing globally unique identifiers (and associated metadata) for each name. Conclusion The Taxonomic Search Engine is available at http://darwin.zoology.gla.ac.uk/~rpage/portal/ and provides a simple demonstration of the potential of the federated approach to providing access to taxonomic names.

  2. A Taxonomic Search Engine: federating taxonomic databases using web services.

    Science.gov (United States)

    Page, Roderic D M

    2005-03-09

    The taxonomic name of an organism is a key link between different databases that store information on that organism. However, in the absence of a single, comprehensive database of organism names, individual databases lack an easy means of checking the correctness of a name. Furthermore, the same organism may have more than one name, and the same name may apply to more than one organism. The Taxonomic Search Engine (TSE) is a web application written in PHP that queries multiple taxonomic databases (ITIS, Index Fungorum, IPNI, NCBI, and uBIO) and summarises the results in a consistent format. It supports "drill-down" queries to retrieve a specific record. The TSE can optionally suggest alternative spellings the user can try. It also acts as a Life Science Identifier (LSID) authority for the source taxonomic databases, providing globally unique identifiers (and associated metadata) for each name. The Taxonomic Search Engine is available at http://darwin.zoology.gla.ac.uk/~rpage/portal/ and provides a simple demonstration of the potential of the federated approach to providing access to taxonomic names.
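    The federated pattern the TSE demonstrates, i.e., querying several providers and normalising their heterogeneous responses into one consistent record format, can be sketched as follows. The stub lookup functions and field names below are invented stand-ins; a real client would issue HTTP queries to each service's actual API.

```python
# Hypothetical stub providers standing in for ITIS, NCBI, etc.
def itis_lookup(name):
    return [{"tsn": 180092, "combinedName": name}] if name == "Homo sapiens" else []

def ncbi_lookup(name):
    return [{"TaxId": 9606, "ScientificName": name}] if name == "Homo sapiens" else []

# Each provider pairs a query function with a normaliser that maps its
# native record shape onto a shared {id, name} schema.
PROVIDERS = {
    "ITIS": (itis_lookup, lambda r: {"id": str(r["tsn"]), "name": r["combinedName"]}),
    "NCBI": (ncbi_lookup, lambda r: {"TaxId": None, "id": str(r["TaxId"]), "name": r["ScientificName"]}),
}

def federated_search(name):
    """Query every provider and normalise each hit to {source, id, name}."""
    hits = []
    for source, (lookup, normalise) in PROVIDERS.items():
        for raw in lookup(name):
            record = normalise(raw)
            hits.append({"source": source, "id": record["id"], "name": record["name"]})
    return hits
```

    The point of the design is that callers see one uniform schema regardless of which upstream databases answered, which is also what makes a consistent summary page (or an LSID resolver) possible.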

  3. Multiple Sclerosis.

    Science.gov (United States)

    Plummer, Nancy; Michael, Nancy, Ed.

    This module on multiple sclerosis is intended for use in inservice or continuing education programs for persons who administer medications in long-term care facilities. Instructor information, including teaching suggestions, and a listing of recommended audiovisual materials and their sources appear first. The module goal and objectives are then…

  4. Database theory and SQL practice using Access

    International Nuclear Information System (INIS)

    Kim, Gyeong Min; Lee, Myeong Jin

    2001-01-01

    This book introduces database theory and SQL practice using Access. It comprises seven chapters, covering: an understanding of databases, with basic concepts and DBMSs; relational databases, with examples; building database tables and entering data using Access 2000; an introduction to the Structured Query Language (SQL); managing and building complex queries with SQL; advanced SQL commands, including the concepts of joins and virtual tables; the design of a database for an online bookstore in six steps; and the building of an application, covering its functions, structure and components, the principles of its operation, and a walkthrough of the programming source for the application menu.
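    The two SQL concepts the book pairs in its advanced chapter, joins and virtual tables (views), can be demonstrated together with Python's built-in sqlite3 module. The bookstore-style table and column names are invented for illustration, not taken from the book.

```python
import sqlite3

# In-memory database: a join across two tables, then a view over that join.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER);
INSERT INTO author VALUES (1, 'Kim');
INSERT INTO book VALUES (10, 'Database Theory', 1);
-- A view is a "virtual table": it stores the query, not the rows.
CREATE VIEW book_listing AS
  SELECT b.title, a.name AS author
  FROM book b JOIN author a ON b.author_id = a.id;
""")
rows = con.execute("SELECT title, author FROM book_listing").fetchall()
# rows == [('Database Theory', 'Kim')]
```

    Querying the view is indistinguishable from querying a base table, which is why views are useful for hiding join logic from application code such as an online bookstore's menu screens.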

  5. Collecting Taxes Database

    Data.gov (United States)

    US Agency for International Development — The Collecting Taxes Database contains performance and structural indicators about national tax systems. The database contains quantitative revenue performance...

  6. USAID Anticorruption Projects Database

    Data.gov (United States)

    US Agency for International Development — The Anticorruption Projects Database (Database) includes information about USAID projects with anticorruption interventions implemented worldwide between 2007 and...

  7. NoSQL databases

    OpenAIRE

    Mrozek, Jakub

    2012-01-01

    This thesis deals with database systems referred to as NoSQL databases. In the second chapter, I explain basic terms and the theory of database systems. A short explanation is dedicated to database systems based on the relational data model and the SQL standardized query language. Chapter Three explains the concept and history of the NoSQL databases, and also presents database models, major features and the use of NoSQL databases in comparison with traditional database systems. In the fourth ...

  8. SmallSat Database

    Science.gov (United States)

    Petropulos, Dolores; Bittner, David; Murawski, Robert; Golden, Bert

    2015-01-01

    SmallSats have unrealized potential in both private industry and the federal government. Currently over 70 companies, 50 universities and 17 governmental agencies are involved in SmallSat research and development. In 1994, the U.S. Army Missile and Defense mapped the moon using smallSat imagery. Since then, smart phones have introduced this imagery to the people of the world as diverse industries watched this trend. The deployment cost of smallSats is also greatly reduced compared to traditional satellites, because multiple units can be deployed in a single mission. Imaging payloads have become more sophisticated, smaller and lighter. In addition, the growth of small technology obtained from private industries has led to more widespread use of smallSats. This includes greater revisit rates in imagery, significantly lower costs, the ability to update technology more frequently, and decreased vulnerability to enemy attacks. The popularity of smallSats shows a changing mentality in this fast-paced world of tomorrow. What impact has this created on the NASA communication networks now and in future years? In this project, we are developing the SmallSat Relational Database, which can support a simulation of smallSats within the NASA SCaN Compatibility Environment for Networks and Integrated Communications (SCENIC) Modeling and Simulation Lab. The NASA Space Communications and Networks (SCaN) Program can use this modeling to project required network support needs in the next 10 to 15 years. The SmallSat Relational Database could model smallSats just as the other SCaN databases model the more traditional larger satellites, with a few exceptions. One is that the SmallSat database is designed to be built-to-order. The SmallSat database holds various hardware configurations that can be used to model a smallSat. It will require significant effort to develop, as the research material can only be populated by hand to obtain the unique data

  9. The eNanoMapper database for nanomaterial safety information

    Directory of Open Access Journals (Sweden)

    Nina Jeliazkova

    2015-07-01

    Full Text Available Background: The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. Results: The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. Conclusion: We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the “representational state

  10. The eNanoMapper database for nanomaterial safety information.

    Science.gov (United States)

    Jeliazkova, Nina; Chomenidis, Charalampos; Doganis, Philip; Fadeel, Bengt; Grafström, Roland; Hardy, Barry; Hastings, Janna; Hegi, Markus; Jeliazkov, Vedrin; Kochev, Nikolay; Kohonen, Pekka; Munteanu, Cristian R; Sarimveis, Haralambos; Smeets, Bart; Sopasakis, Pantelis; Tsiliki, Georgia; Vorgrimmler, David; Willighagen, Egon

    2015-01-01

    The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the "representational state transfer" (REST) API enables building user-friendly
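    The retrieve-then-preprocess workflow described above can be sketched as a small client that pulls substance records through a REST-style API and flattens one assay endpoint into rows ready for a modelling step. The endpoint path, JSON layout, substance names and values below are all invented for illustration; the real eNanoMapper API differs, so treat this purely as a shape of the pipeline. The HTTP layer is abstracted behind a callable so the sketch runs without a network.

```python
def fetch_substances(get_json, base_url):
    """Retrieve the substance list. `get_json` abstracts the HTTP layer
    (in a live client it might wrap requests.get(url).json())."""
    doc = get_json(base_url + "/substance")
    return doc["substance"]

def to_feature_rows(substances, endpoint_name):
    """Flatten one assay endpoint into (substance name, value) rows,
    the kind of table a preprocessing/ML step would consume."""
    rows = []
    for s in substances:
        for study in s.get("study", []):
            if study.get("endpoint") == endpoint_name:
                rows.append((s["name"], study["value"]))
    return rows

# Stubbed transport standing in for a live server (data is made up).
fake = {"substance": [
    {"name": "TiO2-NP", "study": [{"endpoint": "cell viability", "value": 0.82}]},
    {"name": "ZnO-NP", "study": [{"endpoint": "cell viability", "value": 0.41}]},
]}
rows = to_feature_rows(
    fetch_substances(lambda url: fake, "https://example.org/api"),
    "cell viability")
```

    Separating transport from flattening mirrors how a web application on top of such an API can swap data sources without touching its analysis code.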

  11. EuroFIR-BASIS - a combined composition and biological activity database for bioactive compounds in plant-based foods

    DEFF Research Database (Denmark)

    Gry, Jørn; Black, Lucinda; Eriksen, Folmer Damsted

    2007-01-01

    Mounting evidence suggests that certain non-nutrient bioactive compounds promote optimal human health and reduce the risk of chronic disease. An Internet-deployed database, EuroFIR-BASIS, which uniquely combines food composition and biological effects data for plant-based bioactive compounds, is being developed. The database covers multiple compound classes and 330 major food plants and their edible parts, with data sourced from quality-assessed, peer-reviewed literature. The database will be a valuable resource for food regulatory and advisory bodies, risk authorities, epidemiologists and researchers interested in diet and health relationships, and product developers within the food industry.

  12. Nitrate source identification in groundwater of multiple land-use areas by combining isotopes and multivariate statistical analysis: A case study of Asopos basin (Central Greece)

    Energy Technology Data Exchange (ETDEWEB)

    Matiatos, Ioannis, E-mail: i.matiatos@iaea.org

    2016-01-15

    Nitrate (NO₃) is one of the most common contaminants in aquatic environments and groundwater. Nitrate concentrations and environmental isotope data (δ¹⁵N–NO₃ and δ¹⁸O–NO₃) from groundwater of the Asopos basin, which has different land-use types, i.e., a large number of industries (e.g., textile, metal processing, food, fertilizers, paint), urban and agricultural areas and livestock breeding facilities, were analyzed to identify the nitrate sources of water contamination and N-biogeochemical transformations. A Bayesian isotope mixing model (SIAR) and multivariate statistical analysis of hydrochemical data were used to estimate the proportional contribution of different NO₃ sources and to identify the dominant factors controlling the nitrate content of the groundwater in the region. The comparison of SIAR and Principal Component Analysis showed that wastes originating from the urban and industrial zones of the basin are mainly responsible for nitrate contamination of groundwater in these areas. Agricultural fertilizers and manure likely contribute to groundwater contamination away from urban fabric and industrial land-use areas. Soil contribution to nitrate contamination due to organic matter is higher in the south-western part of the area, far from the industries and the urban settlements. The present study aims to highlight the use of environmental isotopes combined with multivariate statistical analysis in locating sources of nitrate contamination in groundwater, leading to more effective planning of environmental measures and remediation strategies in river basins and water bodies, as defined by the European Water Framework Directive (Directive 2000/60/EC). - Highlights: • More enriched N-isotope values were observed in the industrial/urban areas. • A Bayesian isotope mixing model was applied in a multiple land-use area. • A 3-component model explained the factors controlling nitrate content in groundwater. • Industrial
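    The core idea behind isotope mixing models like SIAR is a mass balance over source signatures. A deliberately simplified two-end-member linear version (no fractionation terms, no uncertainty, unlike the Bayesian treatment SIAR actually performs) can be written in one line; the δ¹⁵N values used below are invented placeholders, not data from this study.

```python
def two_source_fraction(delta_mix, delta_a, delta_b):
    """Fraction f of source A in a two-end-member linear mixing model:
    delta_mix = f * delta_a + (1 - f) * delta_b
    =>       f = (delta_mix - delta_b) / (delta_a - delta_b)
    This omits fractionation and uncertainty, which SIAR models explicitly."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Illustrative delta-15N end-members (per mil), values invented:
# sewage/manure ~ +12, synthetic fertiliser ~ +1, measured sample +9.25.
f_sewage = two_source_fraction(9.25, 12.0, 1.0)
# f_sewage == 0.75, i.e. three quarters attributed to the sewage end-member
```

    Real applications have more than two candidate sources and overlapping signatures, which is precisely why a probabilistic model such as SIAR, which returns distributions of source proportions rather than a single number, is used in studies like this one.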

  13. Household trends in access to improved water sources and sanitation facilities in Vietnam and associated factors: findings from the Multiple Indicator Cluster Surveys, 2000–2011

    Science.gov (United States)

    Tuyet-Hanh, Tran Thi; Lee, Jong-Koo; Oh, Juhwan; Van Minh, Hoang; Ou Lee, Chul; Hoan, Le Thi; Nam, You-Seon; Long, Tran Khanh

    2016-01-01

    Background Despite progress made by the Millennium Development Goal (MDG) number 7.C, Vietnam still faces challenges with regard to the provision of access to safe drinking water and basic sanitation. Objective This paper describes household trends in access to improved water sources and sanitation facilities separately, and analyses factors associated with access to improved water sources and sanitation facilities in combination. Design Secondary data from the Vietnam Multiple Indicator Cluster Survey in 2000, 2006, and 2011 were analyzed. Descriptive statistics and tests of significance describe trends over time in access to water and sanitation by location, demographic and socio-economic factors. Binary logistic regressions (2000, 2006, and 2011) describe associations between access to water and sanitation, and geographic, demographic, and socio-economic factors. Results There have been some outstanding developments in access to improved water sources and sanitation facilities from 2000 to 2011. In 2011, the proportion of households with access to improved water sources and sanitation facilities reached 90% and 77%, respectively, meeting the 2015 MDG targets for safe drinking water and basic sanitation set at 88% and 75%, respectively. However, despite these achievements, in 2011, only 74% of households overall had access to combined improved drinking water and sanitation facilities. There were also stark differences between regions. In 2011, only 47% of households had access to both improved water and sanitation facilities in the Mekong River Delta compared with 94% in the Red River Delta. In 2011, households in urban compared to rural areas were more than twice as likely (odds ratio [OR]: 2.2; 95% confidence interval [CI]: 1.9–2.5) to have access to improved water and sanitation facilities in combination, and households in the highest compared with the lowest wealth quintile were over 40 times more likely (OR: 42.3; 95% CI: 29.8–60.0). Conclusions More
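    The odds ratios with 95% confidence intervals reported above come from binary logistic regression; for a single binary factor, the unadjusted version reduces to a 2×2 table computation that is easy to sketch. The counts below are invented for illustration and do not reproduce the survey's figures.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: urban households with/without combined access
# vs. rural households with/without combined access.
or_, lo, hi = odds_ratio_ci(880, 120, 620, 380)
```

    A multivariable logistic regression, as used in the paper, generalises this by adjusting each factor's OR for the others (region, wealth quintile, and so on), which is why reported adjusted ORs differ from raw 2×2 ratios.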

  14. Household trends in access to improved water sources and sanitation facilities in Vietnam and associated factors: findings from the Multiple Indicator Cluster Surveys, 2000–2011

    Directory of Open Access Journals (Sweden)

    Tran Thi Tuyet-Hanh

    2016-02-01

    Full Text Available Background: Despite progress made by the Millennium Development Goal (MDG number 7.C, Vietnam still faces challenges with regard to the provision of access to safe drinking water and basic sanitation. Objective: This paper describes household trends in access to improved water sources and sanitation facilities separately, and analyses factors associated with ac