WorldWideScience

Sample records for methods electronic databases

  1. Method and electronic database search engine for exposing the content of an electronic database

    NARCIS (Netherlands)

    Stappers, P.J.

    2000-01-01

    The invention relates to an electronic database search engine comprising an electronic memory device suitable for storing and releasing elements from the database, a display unit, a user interface for selecting and displaying at least one element from the database on the display unit, and control

  2. Electronic database of arterial aneurysms

    Directory of Open Access Journals (Sweden)

    Fabiano Luiz Erzinger

    2014-12-01

    Full Text Available Background: The creation of an electronic database facilitates the storage of information and streamlines the exchange of data, making it easier to share knowledge for future research. Objective: To construct an electronic database containing comprehensive and up-to-date clinical and surgical data on the most common arterial aneurysms, to help advance scientific research. Methods: The most important specialist textbooks and articles found in journals and on internet databases were reviewed in order to define the basic structure of the protocol. Data were computerized using the SINPE© system for integrated electronic protocols and tested in a pilot study. Results: The data entered into the system were first used to create a Master protocol, organized into a structure of top-level directories covering a large proportion of the content on vascular diseases as follows: patient history; physical examination; supplementary tests and examinations; diagnosis; treatment; and clinical course. By selecting items from the Master protocol, Specific protocols were then created for the 22 arterial sites most often involved by aneurysms. The program provides a method for collecting data on patients, including clinical characteristics (patient history and physical examination), supplementary tests and examinations, treatments received, and follow-up care after treatment. Any information of interest on these patients that is contained in the protocol can then be used to query the database and select data for studies. Conclusions: It proved possible to construct a database of clinical and surgical data on the arterial aneurysms of greatest interest and, by adapting the data to specific software, the database was integrated into the SINPE© system, thereby providing a standardized method for collecting data on these patients and tools for retrieving this information in an organized manner for use in scientific studies.

  3. Electron Effective-Attenuation-Length Database

    Science.gov (United States)

    SRD 82 NIST Electron Effective-Attenuation-Length Database (PC database, no charge)   This database provides values of electron effective attenuation lengths (EALs) in solid elements and compounds at selected electron energies between 50 eV and 2,000 eV. The database was designed mainly to provide EALs (to account for effects of elastic-electron scattering) for applications in surface analysis by Auger-electron spectroscopy (AES) and X-ray photoelectron spectroscopy (XPS).
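
    EALs from such a database are typically applied through a simple exponential attenuation model, for example to estimate overlayer thickness from the attenuation of a substrate peak. A minimal sketch (illustrative numbers, not values from SRD 82):

```python
import math

def overlayer_thickness(eal_nm, theta_deg, i_substrate, i_clean):
    """Estimate overlayer thickness t from substrate-peak attenuation,
    using the simple exponential model I = I0 * exp(-t / (L * cos(theta))).
    eal_nm: effective attenuation length L (e.g. looked up in a database
    such as NIST SRD 82); theta_deg: emission angle from the surface normal."""
    return eal_nm * math.cos(math.radians(theta_deg)) * math.log(i_clean / i_substrate)

# Hypothetical numbers: L = 2.0 nm, normal emission, 50% attenuation.
t = overlayer_thickness(2.0, 0.0, 0.5, 1.0)
# t = 2.0 * cos(0) * ln(2) ≈ 1.386 nm
```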

  4. Electron Inelastic-Mean-Free-Path Database

    Science.gov (United States)

    SRD 71 NIST Electron Inelastic-Mean-Free-Path Database (PC database, no charge)   This database provides values of electron inelastic mean free paths (IMFPs) for use in quantitative surface analyses by AES and XPS.

  5. Database for Simulation of Electron Spectra for Surface Analysis (SESSA)

    Science.gov (United States)

    SRD 100 Database for Simulation of Electron Spectra for Surface Analysis (SESSA) (PC database for purchase)   This database has been designed to facilitate quantitative interpretation of Auger-electron and X-ray photoelectron spectra and to improve the accuracy of quantitation in routine analysis. The database contains all physical data needed to perform quantitative interpretation of an electron spectrum for a thin-film specimen of given composition. A simulation module provides an estimate of peak intensities as well as the energy and angular distributions of the emitted electron flux.

  6. Availability and Utilization of Electronic Information Databases by ...

    African Journals Online (AJOL)

    The study was undertaken to determine the availability and utilization of electronic information databases by staff of the Agricultural Complex, Ahmadu Bello University, Zaria. A survey method was used for the study. A stratified sampling method was used to select 209 respondents to accommodate the different strata of the ...

  7. Accessing Electronic Databases for Curriculum Delivery in Schools ...

    African Journals Online (AJOL)

    This paper discussed the role of electronic databases in education with emphasis on the means of accessing the electronic databases. The paper further highlighted the various types and categories of electronic databases which the schools can explore in the process of teaching and learning as well as the techniques of ...

  8. EMEN2: an object oriented database and electronic lab notebook.

    Science.gov (United States)

    Rees, Ian; Langley, Ed; Chiu, Wah; Ludtke, Steven J

    2013-02-01

    Transmission electron microscopy and associated methods, such as single particle analysis, two-dimensional crystallography, helical reconstruction, and tomography, are highly data-intensive experimental sciences, which also have substantial variability in experimental technique. Object-oriented databases present an attractive alternative to traditional relational databases for situations where the experiments themselves are continually evolving. We present EMEN2, an easy-to-use object-oriented database with a highly flexible infrastructure, originally targeted at transmission electron microscopy and tomography, which has been extended to be adaptable for use in virtually any experimental science. It is a pure object-oriented database designed for easy adoption in diverse laboratory environments and does not require professional database administration. It includes a full-featured, dynamic web interface in addition to APIs for programmatic access. EMEN2 installations currently support roughly 800 scientists worldwide with over half a million experimental records and over 20 TB of experimental data. The software is freely available with complete source.
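
    The advantage of a schemaless, object-oriented record store for continually evolving experiments can be illustrated with a toy sketch (not EMEN2's actual API):

```python
# Minimal illustration (not EMEN2's actual API) of schemaless experiment
# records: each record is a dict of arbitrary fields plus a protocol name,
# so new parameters can be recorded without migrating a fixed schema.
class RecordStore:
    def __init__(self):
        self._records = []

    def add(self, protocol, **fields):
        rec = {"protocol": protocol, **fields}
        self._records.append(rec)
        return rec

    def query(self, **criteria):
        return [r for r in self._records
                if all(r.get(k) == v for k, v in criteria.items())]

store = RecordStore()
store.add("tomography", specimen="ribosome", tilt_step=2.0)
# A new field appears mid-project; no schema migration is needed.
store.add("tomography", specimen="virus", tilt_step=1.5, detector="K3")
print(len(store.query(protocol="tomography")))  # 2
```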

  9. An Algorithm for Building an Electronic Database.

    Science.gov (United States)

    Cohen, Wess A; Gayle, Lloyd B; Patel, Nima P

    2016-01-01

    We propose an algorithm for creating a prospectively maintained database, which can then be used to analyze prospective data in a retrospective fashion. Our algorithm provides future researchers a road map on how to set up, maintain, and use an electronic database to improve evidence-based care and future clinical outcomes. The database was created using Microsoft Access and included demographic information, socioeconomic information, and intraoperative and postoperative details via standardized drop-down menus. A printed form from the Microsoft Access template was given to each surgeon to be completed after each case, and a member of the health care team then entered the case information into the database. By utilizing straightforward, HIPAA-compliant data input fields, we made data collection and transcription easy and efficient. Collecting a wide variety of data allowed us the freedom to evolve our clinical interests, while the platform also permitted new categories to be added at will. We have proposed a reproducible method for institutions to create a database, which will then allow senior and junior surgeons to analyze their outcomes and compare them with others in an effort to improve patient care and outcomes. This is a cost-efficient way to create and maintain a database without additional software.
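
    The drop-down-menu idea generalizes beyond Microsoft Access: constrained choice fields can be enforced by the database engine itself. A hedged sketch in SQLite, with hypothetical field names and categories:

```python
import sqlite3

# Sketch of the article's standardized drop-down fields, expressed as
# SQLite CHECK constraints so invalid entries are rejected at insert
# time. All field names and categories here are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cases (
    case_id        INTEGER PRIMARY KEY,
    surgeon        TEXT NOT NULL,
    procedure_type TEXT NOT NULL CHECK (procedure_type IN
                       ('reconstruction', 'reduction', 'revision')),
    asa_class      INTEGER CHECK (asa_class BETWEEN 1 AND 5),
    complication   TEXT CHECK (complication IN
                       ('none', 'infection', 'hematoma', 'other'))
);
""")
conn.execute("INSERT INTO cases VALUES (1, 'Dr. A', 'reconstruction', 2, 'none')")
try:
    conn.execute("INSERT INTO cases VALUES (2, 'Dr. B', 'unknown-op', 2, 'none')")
except sqlite3.IntegrityError:
    pass  # rejected, just as a drop-down menu would have prevented it
```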

  10. Awareness and use of electronic databases by public library users ...

    African Journals Online (AJOL)

    The study investigated awareness, access and use of electronic database by public library users in Ibadan Oyo State in Nigeria. The purpose of this study was to determine awareness of public library users' electronic databases, find out what these users used electronic databases to do and to identify problems associated ...

  11. Academic impact of a public electronic health database: bibliometric analysis of studies using the general practice research database.

    Directory of Open Access Journals (Sweden)

    Yu-Chun Chen

    Full Text Available BACKGROUND: Studies that use electronic health databases as research material are increasingly popular, but the influence of a single electronic health database has not been well investigated. The United Kingdom's General Practice Research Database (GPRD) is one of the few electronic health databases publicly available to academic researchers. This study analyzed studies that used the GPRD to demonstrate the scientific production and academic impact of a single public health database. METHODOLOGY AND FINDINGS: A total of 749 studies published between 1995 and 2009 with 'General Practice Research Database' as their topics, defined as GPRD studies, were extracted from Web of Science. By the end of 2009, the GPRD had attracted 1251 authors from 22 countries and been used extensively in 749 studies published in 193 journals across 58 study fields. Each GPRD study was cited 2.7 times by successive studies. Moreover, the total number of GPRD studies increased rapidly, and it is expected to reach 1500 by 2015, twice the number accumulated by the end of 2009. Since 17 of the most prolific authors (1.4% of all authors) contributed nearly half (47.9%) of GPRD studies, success in conducting GPRD studies may accumulate. The GPRD was used mainly in, but not limited to, the three study fields of "Pharmacology and Pharmacy", "General and Internal Medicine", and "Public, Environmental and Occupational Health". The UK and United States were the two most active regions of GPRD studies. One-third of GPRD studies were internationally co-authored. CONCLUSIONS: A public electronic health database such as the GPRD will promote scientific production in many ways. Data owners of electronic health databases at a national level should consider how to reduce access barriers and make data more available for research.

  12. Dissolution Methods Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — For a drug product that does not have a dissolution test method in the United States Pharmacopeia (USP), the FDA Dissolution Methods Database provides information on...

  13. Upgrade of laser and electron beam welding database

    CERN Document Server

    Furman, Magdalena

    2014-01-01

    The main purpose of this project was to fix existing issues and update the existing database holding parameters of laser-beam and electron-beam welding machines. Moreover, the database had to be extended to hold data for the new machines that recently arrived at the workshop. As a solution, the database was migrated to the Oracle framework, and a new user interface (using APEX) was designed and implemented, integrated with the CERN web services (EDMS, Phonebook, JMT, CDD and EDH).

  14. NIST/Sandia/ICDD Electron Diffraction Database: A Database for Phase Identification by Electron Diffraction.

    Science.gov (United States)

    Carr, M J; Chambers, W F; Melgaard, D; Himes, V L; Stalick, J K; Mighell, A D

    1989-01-01

    A new database containing crystallographic and chemical information, designed especially for application to electron diffraction search/match and related problems, has been developed. The new database was derived from two well-established x-ray diffraction databases, the JCPDS Powder Diffraction File and NBS CRYSTAL DATA, and incorporates 2 years of experience with an earlier version. It contains 71,142 entries, with space group and unit cell data for 59,612 of those. Unit cell and space group information were used, where available, to calculate patterns consisting of all allowed reflections with d-spacings greater than 0.8 Å for ~59,000 of the entries. Calculated patterns are used in the database in preference to experimental x-ray data when both are available, since experimental x-ray data sometimes omit high d-spacing data which fall at low diffraction angles. Intensity data are not given when calculated spacings are used. A search scheme using chemistry and r-spacing (reciprocal d-spacing) has been developed. Other potentially searchable data in this new database include space group, Pearson symbol, unit cell edge lengths, reduced cell edge lengths, and reduced cell volume. Compound and/or mineral names, formulas, and journal references are included in the output, as well as pointers to corresponding entries in NBS CRYSTAL DATA and the Powder Diffraction File where more complete information may be obtained. Atom positions are not given. Rudimentary search software has been written to implement a chemistry and r-spacing bit map search. With typical data, a full search through ~71,000 compounds takes 10-20 seconds on a PDP 11/23-RL02 system.
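
    The chemistry-plus-r-spacing bit map search the abstract describes can be sketched in miniature (toy entries and r values, not the original PDP-11 implementation):

```python
# Simplified sketch of a chemistry + r-spacing search of the kind the
# abstract describes (not the original implementation). Each entry gets a
# bitmask of the elements present; a query matches when the required
# elements are a subset of the entry's elements and each r-spacing window
# contains at least one of the entry's lines. All values are toy data.
ELEMENTS = ["H", "C", "N", "O", "Si", "Fe"]  # toy element list

def chem_mask(elements):
    return sum(1 << ELEMENTS.index(e) for e in elements)

entries = [
    {"name": "quartz-like", "mask": chem_mask(["Si", "O"]), "r": [0.235, 0.296]},
    {"name": "iron-oxide-like", "mask": chem_mask(["Fe", "O"]), "r": [0.198, 0.332]},
]

def search(required_elements, r_windows):
    want = chem_mask(required_elements)
    hits = []
    for e in entries:
        if e["mask"] & want == want and all(
            any(lo <= r <= hi for r in e["r"]) for lo, hi in r_windows
        ):
            hits.append(e["name"])
    return hits

print(search(["Si"], [(0.23, 0.24)]))  # ['quartz-like']
```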

  15. Academic Impact of a Public Electronic Health Database: Bibliometric Analysis of Studies Using the General Practice Research Database

    Science.gov (United States)

    Chen, Yu-Chun; Wu, Jau-Ching; Haschler, Ingo; Majeed, Azeem; Chen, Tzeng-Ji; Wetter, Thomas

    2011-01-01

    Background Studies that use electronic health databases as research material are increasingly popular, but the influence of a single electronic health database has not been well investigated. The United Kingdom's General Practice Research Database (GPRD) is one of the few electronic health databases publicly available to academic researchers. This study analyzed studies that used the GPRD to demonstrate the scientific production and academic impact of a single public health database. Methodology and Findings A total of 749 studies published between 1995 and 2009 with ‘General Practice Research Database’ as their topics, defined as GPRD studies, were extracted from Web of Science. By the end of 2009, the GPRD had attracted 1251 authors from 22 countries and been used extensively in 749 studies published in 193 journals across 58 study fields. Each GPRD study was cited 2.7 times by successive studies. Moreover, the total number of GPRD studies increased rapidly, and it is expected to reach 1500 by 2015, twice the number accumulated by the end of 2009. Since 17 of the most prolific authors (1.4% of all authors) contributed nearly half (47.9%) of GPRD studies, success in conducting GPRD studies may accumulate. The GPRD was used mainly in, but not limited to, the three study fields of “Pharmacology and Pharmacy”, “General and Internal Medicine”, and “Public, Environmental and Occupational Health”. The UK and United States were the two most active regions of GPRD studies. One-third of GPRD studies were internationally co-authored. Conclusions A public electronic health database such as the GPRD will promote scientific production in many ways. Data owners of electronic health databases at a national level should consider how to reduce access barriers and make data more available for research. PMID:21731733
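
    Two of the bibliometric quantities above, mean citations per study and a linear projection of cumulative study counts, can be sketched with illustrative numbers (not the paper's actual yearly data):

```python
# Toy sketch of two bibliometric quantities of the kind the abstract
# reports: mean citations per study, and a simple linear projection of
# cumulative study counts. The milestone and citation numbers below are
# hypothetical, chosen only to illustrate the arithmetic.
cumulative = {2005: 330, 2007: 500, 2009: 749}  # hypothetical milestones

def project_linear(counts, target_year):
    """Extrapolate from the slope of the last two milestones."""
    (x1, y1), (x2, y2) = sorted(counts.items())[-2:]
    slope = (y2 - y1) / (x2 - x1)
    return y2 + slope * (target_year - x2)

citations = [2, 0, 5, 1]  # hypothetical per-study citation counts
mean_citations = sum(citations) / len(citations)
projected_2015 = project_linear(cumulative, 2015)
# slope = (749 - 500) / 2 = 124.5; 749 + 124.5 * 6 = 1496.0
print(mean_citations, projected_2015)  # 2.0 1496.0
```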

  16. Users' satisfaction with the use of electronic database in university ...

    African Journals Online (AJOL)

    Users' satisfaction with the use of electronic database in university libraries in north ... file of digitized information (bibliographic records, abstracts, full-text documents, ... managed with the aid of database management system (DBMS) software.

  17. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    OpenAIRE

    Weycker Derek; Sofrygin Oleg; Seefeld Kim; Deeter Robert G; Legg Jason; Edelsberg John

    2013-01-01

    Abstract Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classifie...

  18. Standardizing terminology and definitions of medication adherence and persistence in research employing electronic databases.

    Science.gov (United States)

    Raebel, Marsha A; Schmittdiel, Julie; Karter, Andrew J; Konieczny, Jennifer L; Steiner, John F

    2013-08-01

    To propose a unifying set of definitions for prescription adherence research utilizing electronic health record prescribing databases, prescription dispensing databases, and pharmacy claims databases, and to provide a conceptual framework for operationalizing these definitions consistently across studies. We reviewed recent literature to identify definitions used in electronic database studies of prescription-filling patterns for chronic oral medications. We then developed a conceptual model and propose standardized terminology and definitions to describe prescription-filling behavior from electronic databases. The conceptual model we propose defines 2 separate constructs: medication adherence and persistence. We define primary and secondary adherence as distinct subtypes of adherence. Metrics for estimating secondary adherence are discussed and critiqued, including a newer metric (the New Prescription Medication Gap measure) that enables estimation of both primary and secondary adherence. Terminology currently used in prescription adherence research employing electronic databases lacks consistency. We propose a clear, consistent, broadly applicable conceptual model and terminology for such studies. The model and definitions facilitate research utilizing electronic medication prescribing, dispensing, and/or claims databases and encompass the entire continuum of prescription-filling behavior. Employing conceptually clear and consistent terminology to define medication adherence and persistence will facilitate future comparative effectiveness research and meta-analytic studies that utilize electronic prescription and dispensing records.
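
    The article's New Prescription Medication Gap measure is not spelled out in this abstract, so as a generic illustration of a secondary-adherence metric computed from dispensing records, here is a proportion-of-days-covered (PDC) style sketch:

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """Generic secondary-adherence estimate: the fraction of days in
    [start, end) covered by dispensed supply. `fills` is a list of
    (fill_date, days_supply) pairs. This is a common PDC-style metric,
    illustrative only; it is not the article's New Prescription
    Medication Gap measure."""
    covered = set()
    for fill_date, days_supply in fills:
        for i in range(days_supply):
            d = fill_date + timedelta(days=i)
            if start <= d < end:
                covered.add(d)
    return len(covered) / (end - start).days

# Two 30-day fills in a 90-day observation window (hypothetical data).
fills = [(date(2023, 1, 1), 30), (date(2023, 2, 15), 30)]
pdc = proportion_of_days_covered(fills, date(2023, 1, 1), date(2023, 4, 1))
# 60 covered days / 90-day window ≈ 0.667
```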

  19. DANBIO-powerful research database and electronic patient record

    DEFF Research Database (Denmark)

    Hetland, Merete Lund

    2011-01-01

    an overview of the research outcome and presents the cohorts of RA patients. The registry, which is approved as a national quality registry, includes patients with RA, PsA and AS, who are followed longitudinally. Data are captured electronically from the source (patients and health personnel). The IT platform...... as an electronic patient 'chronicle' in routine care, and at the same time provides a powerful research database....

  20. Prediction methods and databases within chemoinformatics

    DEFF Research Database (Denmark)

    Jónsdóttir, Svava Osk; Jørgensen, Flemming Steen; Brunak, Søren

    2005-01-01

    MOTIVATION: To gather information about available databases and chemoinformatics methods for prediction of properties relevant to the drug discovery and optimization process. RESULTS: We present an overview of the most important databases with 2-dimensional and 3-dimensional structural information...... about drugs and drug candidates, and of databases with relevant properties. Access to experimental data and numerical methods for selecting and utilizing these data is crucial for developing accurate predictive in silico models. Many interesting predictive methods for classifying the suitability...

  1. Creating a High-Frequency Electronic Database in the PICU: The Perpetual Patient.

    Science.gov (United States)

    Brossier, David; El Taani, Redha; Sauthier, Michael; Roumeliotis, Nadia; Emeriaud, Guillaume; Jouvet, Philippe

    2018-04-01

    Our objective was to construct a prospective high-quality and high-frequency database combining patient therapeutics and clinical variables in real time, automatically fed by the information system and network architecture available through fully electronic charting in our PICU. The purpose of this article is to describe the data acquisition process from bedside to the research electronic database. Descriptive report and analysis of a prospective database. A 24-bed PICU, medical ICU, surgical ICU, and cardiac ICU in a tertiary care free-standing maternal child health center in Canada. All patients less than 18 years old were included at admission to the PICU. None. Between May 21, 2015, and December 31, 2016, 1,386 consecutive PICU stays from 1,194 patients were recorded in the database. Data were prospectively collected from admission to discharge, every 5 seconds from monitors and every 30 seconds from mechanical ventilators and infusion pumps. These data were linked to the patient's electronic medical record. The database total volume was 241 GB. The patients' median age was 2.0 years (interquartile range, 0.0-9.0). Data were available for all mechanically ventilated patients (n = 511; recorded duration, 77,678 hr), and respiratory failure was the most frequent reason for admission (n = 360). The complete pharmacologic profile was synced to the database for all PICU stays. Following this implementation, a validation phase is in progress and several research projects are ongoing using this high-fidelity database. Using the existing bedside information system and network architecture of our PICU, we implemented an ongoing high-fidelity prospectively collected electronic database, preventing the continuous loss of scientific information. This offers the opportunity to develop research on clinical decision support systems and computational models of cardiorespiratory physiology, for example.
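
    Combining streams sampled at different rates (5-second monitor data, 30-second ventilator and pump data) requires aligning them on a common timeline; one simple approach is to carry the slower stream's last value forward. An illustrative sketch (toy data, not the PICU system's actual pipeline):

```python
import bisect

# Illustrative alignment of two sampling rates like those in the abstract
# (5 s monitor data, 30 s ventilator data) onto the monitor timeline by
# carrying the last ventilator value forward. Toy values throughout.
monitor = [(t, 100 + t % 7) for t in range(0, 120, 5)]  # (seconds, heart rate)
vent = [(0, 20), (30, 22), (60, 22), (90, 24)]          # (seconds, resp. rate)

vent_times = [t for t, _ in vent]

def last_vent_value(t):
    """Most recent ventilator sample at or before time t (None if before the first)."""
    i = bisect.bisect_right(vent_times, t) - 1
    return vent[i][1] if i >= 0 else None

aligned = [(t, hr, last_vent_value(t)) for t, hr in monitor]
print(aligned[7])  # (35, 100, 22): at t=35 s, the t=30 s ventilator value applies
```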

  2. 2008 Availability and Utilization of Electronic Information Databases ...

    African Journals Online (AJOL)

    Gbaje E.S

    Reasons for using electronic information databases include research work, updating knowledge in the staff's field of interest, and current awareness. ...

  3. New DMSP Database of Precipitating Auroral Electrons and Ions.

    Science.gov (United States)

    Redmon, Robert J; Denig, William F; Kilcommons, Liam M; Knipp, Delores J

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so too have the measurement capabilities, such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) through F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information (NCEI) and the Coordinated Data Analysis Web (CDAWeb). We describe how the new database is being applied to high-latitude studies of the co-location of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors, from single observatory studies to coordinated system science investigations.

  4. New DMSP database of precipitating auroral electrons and ions

    Science.gov (United States)

    Redmon, Robert J.; Denig, William F.; Kilcommons, Liam M.; Knipp, Delores J.

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so have the measurement capabilities such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) to F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information and the Coordinated Data Analysis Web. We describe how the new database is being applied to high-latitude studies of the colocation of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.
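
    Two of the derived quantities the database provides, total kinetic energy flux and characteristic (average) energy, follow directly from channel-wise differential number flux. A sketch with toy numbers (not real SSJ data):

```python
# Sketch of two derived quantities of the kind the database provides,
# computed from channel-wise differential number flux (toy numbers, not
# real SSJ data): total energy flux = sum of j_i * E_i * dE_i over the
# channels, and characteristic (average) energy = energy flux / number flux.
channels = [  # (center energy E_i [eV], width dE_i [eV], flux j_i [1/(cm^2 s sr eV)])
    (100.0, 50.0, 1.0e6),
    (300.0, 100.0, 5.0e5),
    (1000.0, 300.0, 1.0e5),
]

number_flux = sum(j * dE for E, dE, j in channels)          # 1.3e8
energy_flux = sum(j * E * dE for E, dE, j in channels)      # 5.0e10 eV
characteristic_energy = energy_flux / number_flux           # ≈ 384.6 eV

print(f"{characteristic_energy:.1f} eV")
```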

  5. Improvement of medication event interventions through use of an electronic database.

    Science.gov (United States)

    Merandi, Jenna; Morvay, Shelly; Lewe, Dorcas; Stewart, Barb; Catt, Char; Chanthasene, Phillip P; McClead, Richard; Kappeler, Karl; Mirtallo, Jay M

    2013-10-01

    Patient safety enhancements achieved through the use of an electronic Web-based system for responding to adverse drug events (ADEs) are described. A two-phase initiative was carried out at an academic pediatric hospital to improve processes related to "medication event huddles" (interdisciplinary meetings focused on ADE interventions). Phase 1 of the initiative entailed a review of huddles and interventions over a 16-month baseline period during which multiple databases were used to manage the huddle process and staff interventions were assigned via manually generated e-mail reminders. Phase 1 data collection included ADE details (e.g., medications and staff involved, location and date of event) and the types and frequencies of interventions. Based on the phase 1 analysis, an electronic database was created to eliminate the use of multiple systems for huddle scheduling and documentation and to automatically generate e-mail reminders on assigned interventions. In phase 2 of the initiative, the impact of the database during a 5-month period was evaluated; the primary outcome was the percentage of interventions documented as completed after database implementation. During the postimplementation period, 44.7% of assigned interventions were completed, compared with a completion rate of 21% during the preimplementation period, and interventions documented as incomplete decreased from 77% to 43.7%. Process changes, education, and medication order improvements were the most frequently documented categories of interventions. Implementation of a user-friendly electronic database improved intervention completion and documentation after medication event huddles.

  6. GMDD: a database of GMO detection methods.

    Science.gov (United States)

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-06-04

    Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection Method Database (GMDD) collects almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and provides a user-friendly search service over the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which facilitates the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to the database, and newly submitted information is released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.
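
    The kind of lookup GMDD offers, with methods grouped by strategy and searchable by event name or target, can be sketched in miniature (illustrative placeholder entries, not records from the real database):

```python
# Toy sketch of a GMDD-style lookup: detection methods grouped by
# strategy, searchable by GMO event or target sequence name. The entries
# are illustrative placeholders, not records from the real database.
methods = [
    {"strategy": "event-specific", "event": "MON810", "target": "junction region", "assay": "qPCR"},
    {"strategy": "screening", "event": None, "target": "P-35S promoter", "assay": "qPCR"},
    {"strategy": "gene-specific", "event": None, "target": "cry1Ab", "assay": "PCR"},
]

def find_methods(term):
    """Case-insensitive match against event name or target."""
    term = term.lower()
    return [m for m in methods
            if term in (m["event"] or "").lower() or term in m["target"].lower()]

print([m["strategy"] for m in find_methods("cry1Ab")])  # ['gene-specific']
```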

  7. Electronic Publishing Approaches to Curriculum: Videotex, Teletext and Databases.

    Science.gov (United States)

    Aumente, Jerome

    1986-01-01

    Describes the Journalism Resources Institute (JRI) of Rutgers University in terms of its administrative organization, computer resources, computer facilities use, involvement in electronic publishing, use of the Dow Jones News/Retrieval Database, curricular options, and professional continuing education. (AYC)

  8. PubChemQC Project: A Large-Scale First-Principles Electronic Structure Database for Data-Driven Chemistry.

    Science.gov (United States)

    Nakata, Maho; Shimazaki, Tomomi

    2017-06-26

    Large-scale molecular databases play an essential role in the investigation of various subjects such as the development of organic materials, in silico drug design, and data-driven studies with machine learning. We have developed a large-scale quantum chemistry database based on first-principles methods. Our database currently contains the ground-state electronic structures of 3 million molecules based on density functional theory (DFT) at the B3LYP/6-31G* level, and we successively calculated 10 low-lying excited states of over 2 million molecules via time-dependent DFT with the B3LYP functional and the 6-31+G* basis set. To select the molecules calculated in our project, we referred to the PubChem Project, which served as the source of the molecular structures, given as short strings in the InChI and SMILES representations. Accordingly, we have named our quantum chemistry database project "PubChemQC" ( http://pubchemqc.riken.jp/ ) and placed it in the public domain. In this paper, we show the fundamental features of the PubChemQC database and discuss the techniques used to construct the data set for large-scale quantum chemistry calculations. We also present a machine learning approach to predict the electronic structure of molecules as an example to demonstrate the suitability of the large-scale quantum chemistry database.

  9. An elemental concentration open source database for Hogdahl-Convention and Westcott-Formalism based on K0-INAA method in Malaysia

    International Nuclear Information System (INIS)

    Yavar, A.R.; Sukiman Sarmani; Tan, C.Y.; Rafie, N.N.; Lim, S.W.E.; Khoo, K.S.

    2012-01-01

    An electronic database has been developed and implemented for the k0-INAA method in Malaysia. Databases are often developed according to national requirements. This database contains the nuclear data constants for the k0-INAA method, presented through separate command user interfaces for the Hogdahl convention and the Westcott formalism. It was created using Microsoft Access 2007 under a Windows operating system. The database saves time and assures the quality of results when neutron flux parameters and element concentrations are calculated by the k0-INAA method. An evaluation of the database was conducted using the IAEA Soil-7 reference material, and the published results showed a high level of consistency. (Author)

  10. An examination of intrinsic errors in electronic structure methods using the Environmental Molecular Sciences Laboratory computational results database and the Gaussian-2 set

    International Nuclear Information System (INIS)

    Feller, D.; Peterson, K.A.

    1998-01-01

    The Gaussian-2 (G2) collection of atoms and molecules has been studied with Hartree–Fock and correlated levels of theory, ranging from second-order perturbation theory to coupled cluster theory with noniterative inclusion of triple excitations. By exploiting the systematic convergence properties of the correlation consistent family of basis sets, complete basis set limits were estimated for a large number of the G2 energetic properties. Deviations with respect to experimentally derived energy differences corresponding to rigid molecules were obtained for 15 basis set/method combinations, as well as the estimated complete basis set limit. The latter values are necessary for establishing the intrinsic error for each method. In order to perform this analysis, the information generated in the present study was combined with the results of many previous benchmark studies in an electronic database, where it is available for use by other software tools. Such tools can assist users of electronic structure codes in making appropriate basis set and method choices that will increase the likelihood of achieving their accuracy goals without wasteful expenditures of computer resources. copyright 1998 American Institute of Physics

  11. Infant feeding practices within a large electronic medical record database.

    Science.gov (United States)

    Bartsch, Emily; Park, Alison L; Young, Jacqueline; Ray, Joel G; Tu, Karen

    2018-01-02

    The emerging adoption of the electronic medical record (EMR) in primary care enables clinicians and researchers to efficiently examine epidemiological trends in child health, including infant feeding practices. We completed a population-based retrospective cohort study of 8815 singleton infants born at term in Ontario, Canada, April 2002 to March 2013. Newborn records were linked to the Electronic Medical Record Administrative data Linked Database (EMRALD™), which uses patient-level information from participating family practice EMRs across Ontario. We assessed exclusive breastfeeding patterns using an automated electronic search algorithm, with manual review of EMRs when the latter was not possible. We examined the rate of breastfeeding at visits corresponding to 2, 4 and 6 months of age, as well as sociodemographic factors associated with exclusive breastfeeding. Of the 8815 newborns, 1044 (11.8%) lacked breastfeeding information in their EMR. Rates of exclusive breastfeeding were 39.5% at 2 months, 32.4% at 4 months and 25.1% at 6 months. At age 6 months, exclusive breastfeeding rates were highest among mothers aged ≥40 vs. database.

  12. An XML-Based Networking Method for Connecting Distributed Anthropometric Databases

    Directory of Open Access Journals (Sweden)

    H Cheng

    2007-03-01

    Full Text Available Anthropometric data are used by numerous types of organizations for health evaluation, ergonomics, apparel sizing, fitness training, and many other applications. Data have been collected and stored in electronic databases since at least the 1940s. These databases are owned by many organizations around the world. In addition, the anthropometric studies stored in these databases often employ different standards, terminology, procedures, or measurement sets. To promote the use and sharing of these databases, the World Engineering Anthropometry Resources (WEAR) group was formed and tasked with the integration and publishing of member resources. It is easy to see that organizing worldwide anthropometric data into a single database architecture could be a daunting and expensive undertaking. The challenges of WEAR integration lie mainly in the areas of distributed and disparate data, different standards and formats, independent memberships, and limited development resources. Fortunately, XML schema and web services provide an alternative method for networking databases, referred to as the Loosely Coupled WEAR Integration. A standard XML schema can be defined and used as a type of Rosetta stone to translate the anthropometric data into a universal format, and a web services system can be set up to link the databases to one another. In this way, the originators of the data can keep their data locally along with their own data management system and user interface, but their data can be searched and accessed as part of the larger data network, and even combined with the data of others. This paper will identify requirements for WEAR integration, review XML as the universal format, review different integration approaches, and propose a hybrid web services/data mart solution.
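
The "Rosetta stone" translation step can be sketched as follows; the field names, schema elements, and record values below are hypothetical stand-ins, not the actual WEAR schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical local record with organization-specific field names and units.
local_record = {"subj_id": "A-104", "stature_cm": 172.4, "weight_kg": 68.0}

# Mapping from local field names to a (hypothetical) shared schema:
# each local field maps to a standard measurement name and unit.
FIELD_MAP = {"stature_cm": ("stature", "cm"), "weight_kg": ("body_mass", "kg")}

def to_universal_xml(record):
    # Emit the record in the shared XML format so any member database
    # can be searched through the same interface.
    subject = ET.Element("subject", id=record["subj_id"])
    for local_name, (std_name, unit) in FIELD_MAP.items():
        m = ET.SubElement(subject, "measurement", name=std_name, unit=unit)
        m.text = str(record[local_name])
    return ET.tostring(subject, encoding="unicode")

xml_doc = to_universal_xml(local_record)
```

Each member site would expose such translated records through a web service endpoint, keeping the underlying database untouched.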

  13. Antibiotics in Dutch general practice: nationwide electronic GP database and national reimbursement rates.

    NARCIS (Netherlands)

    Akkerman, A.E.; Kuyvenhoven, M.M.; Verheij, T.J.M.; Dijk, L. van

    2008-01-01

    PURPOSE: In order to assess whether different databases generate information which can be reliably compared with each other, this study aimed to assess to which degree prescribing rates for systemic antibiotics from a nationwide electronic general practitioner (GP) database correspond with national

  14. Computer Cataloging of Electronic Journals in Unstable Aggregator Databases: The Hong Kong Baptist University Library Experience.

    Science.gov (United States)

    Li, Yiu-On; Leung, Shirley W.

    2001-01-01

    Discussion of aggregator databases focuses on a project at the Hong Kong Baptist University library to integrate full-text electronic journal titles from three unstable aggregator databases into its online public access catalog (OPAC). Explains the development of the electronic journal computer program (EJCOP) to generate MARC records for…

  15. Use of electronic databases by postgraduate students in a university ...

    African Journals Online (AJOL)

    The purpose of this study was to investigate the use of electronic databases by postgraduate students in the Faculty of Science and Agriculture at the. University of KwaZulu-Natal, Pietermaritzburg. The study adopted a quantitative approach and a survey was conducted. The results of the study found that while postgraduate ...

  16. Use of large electronic health record databases for environmental epidemiology studies.

    Science.gov (United States)

    Background: Electronic health records (EHRs) are a ubiquitous component of the United States healthcare system and capture nearly all data collected in a clinic or hospital setting. EHR databases are attractive for secondary data analysis as they may contain detailed clinical rec...

  17. Engineering method to build the composite structure ply database

    Directory of Open Access Journals (Sweden)

    Qinghua Shi

    Full Text Available In this paper, a new method to build a composite ply database with engineering design constraints is proposed. This method has two levels: the core stacking sequence design and the whole stacking sequence design. The core stacking sequences are obtained by the full permutation algorithm considering the ply ratio requirement and the dispersion character which characterizes the dispersion of ply angles. The whole stacking sequences are the combinations of the core stacking sequences. By excluding the ply sequences which do not meet the engineering requirements, the final ply database is obtained. One example with the constraints that the total layer number is 100 and the ply ratio is 30:60:10 is presented to validate the method. This method provides a new way to set up the ply database based on the engineering requirements without adopting intelligent optimization algorithms. Keywords: Composite ply database, VBA program, Structure design, Stacking sequence
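
A minimal sketch of the full-permutation core-sequence generation under a ply-ratio constraint, assuming a hypothetical 10-ply core at the 30:60:10 ratio and an illustrative run-length rule (the actual engineering constraints in the paper may differ):

```python
from collections import Counter

def distinct_sequences(counts):
    # Full permutation over a multiset of ply angles, generating each
    # distinct stacking sequence exactly once via counting backtracking.
    if not counts:
        yield ()
        return
    for angle in sorted(counts):
        rest = counts.copy()
        rest[angle] -= 1
        if rest[angle] == 0:
            del rest[angle]
        for tail in distinct_sequences(rest):
            yield (angle,) + tail

def valid(seq, max_run=3):
    # Illustrative engineering constraint: no more than `max_run`
    # consecutive plies at the same angle.
    run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        if run > max_run:
            return False
    return True

# Hypothetical 10-ply core honoring the 30:60:10 ratio of 0/45/90 plies.
core_counts = Counter({0: 3, 45: 6, 90: 1})
core_sequences = [s for s in distinct_sequences(core_counts) if valid(s)]
```

The whole-laminate database would then be built by combining these core sequences, as the abstract describes.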

  18. Phonetic search methods for large speech databases

    CERN Document Server

    Moyal, Ami; Tetariy, Ella; Gishri, Michal

    2013-01-01

    “Phonetic Search Methods for Large Databases” focuses on Keyword Spotting (KWS) within large speech databases. The brief will begin by outlining the challenges associated with Keyword Spotting within large speech databases using dynamic keyword vocabularies. It will then continue by highlighting the various market segments in need of KWS solutions, as well as the specific requirements of each market segment. The work also includes a detailed description of the complexity of the task and the different methods that are used, including the advantages and disadvantages of each method and an in-depth comparison. The main focus will be on the Phonetic Search method and its efficient implementation. This will include a literature review of the various methods used for the efficient implementation of Phonetic Search Keyword Spotting, with an emphasis on the authors’ own research which entails a comparative analysis of the Phonetic Search method which includes algorithmic details. This brief is useful for resea...
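
A toy illustration of phonetic keyword spotting by approximate matching over a phone stream. The phone labels, tolerance, and plain edit-distance scoring are simplifying assumptions; production KWS systems work over recognition lattices with probabilistic scores:

```python
def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance over two sequences.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

def spot_keyword(phones, keyword, max_dist=1):
    # Slide a keyword-sized window over the phone stream and report the
    # positions whose edit distance is within the tolerance.
    k = len(keyword)
    return [i for i in range(len(phones) - k + 1)
            if edit_distance(phones[i:i + k], keyword) <= max_dist]

# Hypothetical phone stream from a recognizer, searched for the keyword
# "data" represented as the phone sequence /d ey t ax/.
stream = ["sil", "d", "ey", "d", "ax", "b", "ey", "s", "sil"]
hits = spot_keyword(stream, ["d", "ey", "t", "ax"])
```

The tolerance absorbs single recognition errors (here the recognizer produced "d" where "t" was spoken), which is the core idea behind phonetic rather than word-level search.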

  19. Antibiotics in Dutch General Practice: electronic GP databases and national reimbursement data.

    NARCIS (Netherlands)

    Kuyvenhoven, M.; Akkerman, A.E.; Dijk, L. van; Verheij, T.J.M.

    2007-01-01

    Background. A variety of databases such as data from registration forms, electronic patient records and claims data of health insurance companies, are used in evaluation studies on antimicrobial management in general practice. Aim. To assess to which degree prescribing figures for systemic

  20. ZZ ELAST2, Database of Cross Sections for the Elastic Scattering of Electrons and Positrons by Atoms

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Historical background and information: This database is an extension of the earlier database, 'Elastic Scattering of Electrons and Positrons by Atoms: Database ELAST', Report NISTIR 5188, 1993. Cross sections for the elastic scattering of electrons and positrons by atoms were calculated at energies from 1 keV to 100 MeV. Up to 10 MeV the RELEL code of Riley was used. Above 10 MeV the ELSCAT code was used, which calculates the factored cross sections and evaluates the screening factor Kscr in the WKB approximation. 2 - Application of the data: This database was developed to provide input for transport codes, such as ETRAN, and includes differential cross sections, the total cross section, and the transport cross sections. In addition, a code TRANSX is provided that generates transport cross sections of arbitrary order, needed as input for the calculation of the Goudsmit-Saunderson multiple-scattering angular distribution. 3 - Source and scope of data: The database includes cross sections at 61 energies for electrons and 41 energies for positrons, covering the energy region from 1 keV to 100 MeV. The number of deflection angles included in the database is 314. Total and transport cross sections are also included in this package. The data files have an extension (jjj) that represents the atomic number of the target atom. The database includes auxiliary data files that enable the ELASTIC code to include the following optional modifications: (i) the inclusion of the exchange correction for electron scattering; (ii) the conversion of the cross sections for scattering by free atoms to cross sections for scattering by atoms in solids; (iii) the reduction of the cross sections at large angles and at high energies when the nucleus is treated as an extended rather than a point charge
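
The transport cross sections of arbitrary order n generated by TRANSX follow the standard definition (stated here for context, not quoted from the package documentation):

```latex
\sigma_{\mathrm{tr},n} = \int \left[ 1 - P_n(\cos\theta) \right] \frac{d\sigma}{d\Omega}\, d\Omega
```

where P_n is the Legendre polynomial of order n and dσ/dΩ is the differential elastic cross section; the n = 1 case is the familiar momentum-transfer cross section used in multiple-scattering theory.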

  1. Construction of crystal structure prototype database: methods and applications.

    Science.gov (United States)

    Su, Chuanxun; Lv, Jian; Li, Quan; Wang, Hui; Zhang, Lijun; Wang, Yanchao; Ma, Yanming

    2017-04-26

    Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. With a similar method, a structure prototype analysis package (SPAP) program was developed to remove similar structures from CALYPSO prediction results and extract predicted low-energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and determination of the prototype structure in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery.
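
The similarity-by-interatomic-distances idea can be sketched as follows; the fingerprint, tolerance, and greedy grouping are simplified stand-ins for the hierarchical clustering actually used in the CSPD work, and the toy structures are illustrative:

```python
import numpy as np

def distance_fingerprint(positions):
    # Sorted list of all pairwise interatomic distances: a simple,
    # permutation-invariant structural fingerprint.
    pos = np.asarray(positions, dtype=float)
    d = [np.linalg.norm(pos[i] - pos[j])
         for i in range(len(pos)) for j in range(i + 1, len(pos))]
    return np.sort(d)

def similar(a, b, tol=0.1):
    fa, fb = distance_fingerprint(a), distance_fingerprint(b)
    return fa.shape == fb.shape and np.max(np.abs(fa - fb)) < tol

# Toy "structures" (illustrative coordinates): a square, a slightly
# distorted square, and an equilateral triangle.
square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
near_square = [(0, 0, 0), (1.02, 0, 0), (1, 1.01, 0), (0, 1, 0)]
triangle = [(0, 0, 0), (1, 0, 0), (0.5, 3 ** 0.5 / 2, 0)]

# Greedy prototype grouping: each structure joins the first prototype
# group it matches, mimicking the clustering-based definition.
prototypes = []
for s in (square, near_square, triangle):
    for group in prototypes:
        if similar(group[0], s):
            group.append(s)
            break
    else:
        prototypes.append([s])
```

The distorted square collapses into the square's prototype while the triangle starts a new one, which is the filtering behavior the abstract describes.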

  2. Construction of crystal structure prototype database: methods and applications

    International Nuclear Information System (INIS)

    Su, Chuanxun; Lv, Jian; Wang, Hui; Wang, Yanchao; Ma, Yanming; Li, Quan; Zhang, Lijun

    2017-01-01

    Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. With a similar method, a structure prototype analysis package (SPAP) program was developed to remove similar structures from CALYPSO prediction results and extract predicted low-energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and determination of the prototype structure in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery. (paper)

  3. Classification of monofloral honeys by voltammetric electronic tongue with chemometrics method

    Energy Technology Data Exchange (ETDEWEB)

    Wei Zhenbo [Department of Bio-systems Engineering, Zhejiang University, 268 Kaixuan Road, Hangzhou 310029, Zhejiang (China); Wang Jun, E-mail: jwang@zju.edu.cn [Department of Bio-systems Engineering, Zhejiang University, 268 Kaixuan Road, Hangzhou 310029, Zhejiang (China)

    2011-05-01

    Highlights: > We self-developed a voltammetric electronic tongue based on a new sensor array. > We advanced a new method to extract eigenvalues from signals obtained by VE-tongue. > We first detected the monofloral honeys of different floral origins using VE-tongue. - Abstract: A voltammetric electronic tongue (VE-tongue) based on multifrequency large amplitude pulse voltammetry (MLAPV) was developed to classify monofloral honeys of seven kinds of floral origins. The VE-tongue was composed of six working electrodes (gold, silver, platinum, palladium, tungsten, and titanium) in a standard three-electrode configuration. The applied waveform of MLAPV was composed of four individual frequencies: 1 Hz, 10 Hz, 100 Hz, and 1000 Hz. Two eigenvalues (the maximum value and the minimum value) of each cycle were extracted for building the first database (FDB); four eigenvalues (the maximum value, the minimum value, and two inflexion values) were extracted for building the second database (SDB). The two databases were analyzed by three pattern recognition techniques: principal component analysis (PCA), discriminant function analysis (DFA) and cluster analysis (CA), respectively. It was possible to discriminate the seven kinds of honeys of different floral origins completely based on FDB and SDB by PCA, DFA and CA, and FDB was confirmed to be an efficient database by comparison with SDB. Moreover, the most effective working electrodes and frequencies were identified as the best experimental setup for further study.
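
A sketch of the FDB-style feature extraction (per-cycle maximum and minimum) followed by PCA. The signals below are synthetic sine waves, and the fixed cycle length is an assumption; real voltammetric responses would come from the sensor array:

```python
import numpy as np

def cycle_features(signal, cycle_len):
    # FDB-style eigenvalues: the maximum and minimum of each cycle.
    cycles = np.reshape(signal, (-1, cycle_len))
    return np.concatenate([cycles.max(axis=1), cycles.min(axis=1)])

# Synthetic responses: 4 samples, each 3 cycles of 50 points, where the
# amplitude stands in for a sample-dependent electrode response.
rng = np.random.default_rng(0)
t = np.linspace(0, 3 * 2 * np.pi, 150)
samples = [np.sin(t) * a + rng.normal(0, 0.01, t.size)
           for a in (1.0, 1.1, 2.0, 2.1)]
X = np.array([cycle_features(s, 50) for s in samples])

# PCA via SVD on the mean-centered feature matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S  # projections of each sample onto the principal components
```

Samples with similar amplitudes land close together along the first principal component, which is the separation the honey classification relies on.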

  4. An effective suggestion method for keyword search of databases

    KAUST Repository

    Huang, Hai; Chen, Zonghai; Liu, Chengfei; Huang, He; Zhang, Xiangliang

    2016-01-01

    This paper solves the problem of providing high-quality suggestions for user keyword queries over databases. With the assumption that the returned suggestions are independent, existing query suggestion methods over databases score candidate

  5. An electronic health record-enabled obesity database

    Directory of Open Access Journals (Sweden)

    Wood G

    2012-05-01

    Full Text Available Abstract Background The effectiveness of weight loss therapies is commonly measured using body mass index and other obesity-related variables. Although these data are often stored in electronic health records (EHRs) and are potentially very accessible, few studies on obesity and weight loss have used data derived from EHRs. We developed processes for obtaining data from the EHR in order to construct a database on patients undergoing Roux-en-Y gastric bypass (RYGB) surgery. Methods Clinical data obtained as part of standard of care in a bariatric surgery program at an integrated health delivery system were extracted from the EHR and deposited into a data warehouse. Data files were extracted, cleaned, and stored in research datasets. To illustrate the utility of the data, Kaplan-Meier analysis was used to estimate length of post-operative follow-up. Results Demographic, laboratory, medication, co-morbidity, and survey data were obtained from 2028 patients who had undergone RYGB at the same institution since 2004. Pre- and post-operative diagnostic and prescribing information were available on all patients, while survey and laboratory data were available on a majority of patients. The number of patients with post-operative laboratory test results varied by test. Based on Kaplan-Meier estimates, over 74% of patients had post-operative weight data available at 4 years. Conclusion A variety of EHR-derived data related to obesity can be efficiently obtained and used to study important outcomes following RYGB.
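
The Kaplan-Meier estimate of follow-up length mentioned in the Methods can be sketched with the standard product-limit estimator; the follow-up times below are hypothetical, not the study data:

```python
def kaplan_meier(times, events):
    # Product-limit estimator: at each distinct event time t, multiply the
    # running survival probability by (1 - d/n), where d is the number of
    # events at t and n is the number still at risk (follow-up time >= t).
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        n = sum(1 for ti in times if ti >= t)
        if d:
            s *= 1 - d / n
            curve.append((t, s))
    return curve

# Hypothetical (time-in-years, event) pairs; event=0 marks a patient
# censored at the last observed visit.
times = [1, 2, 2, 3, 4, 4, 5]
events = [1, 1, 0, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
```

Reading the resulting step curve at 4 years gives the kind of "proportion with weight data available at 4 years" figure reported in the Results.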

  6. New e-learning method using databases

    Directory of Open Access Journals (Sweden)

    Andreea IONESCU

    2012-10-01

    Full Text Available The objective of this paper is to present a new e-learning method that uses databases. The solution could be implemented for any type of e-learning system in any domain. The article will propose a solution to improve the learning process for virtual classes.

  7. Evaluation of Electronic Healthcare Databases for Post-Marketing Drug Safety Surveillance and Pharmacoepidemiology in China.

    Science.gov (United States)

    Yang, Yu; Zhou, Xiaofeng; Gao, Shuangqing; Lin, Hongbo; Xie, Yanming; Feng, Yuji; Huang, Kui; Zhan, Siyan

    2018-01-01

    Electronic healthcare databases (EHDs) are used increasingly for post-marketing drug safety surveillance and pharmacoepidemiology in Europe and North America. However, few studies have examined the potential of these data sources in China. Three major types of EHDs in China (i.e., a regional community-based database, a national claims database, and an electronic medical records [EMR] database) were selected for evaluation. Forty core variables were derived based on the US Mini-Sentinel (MS) Common Data Model (CDM) as well as the data features in China that would be desirable to support drug safety surveillance. An email survey of these core variables and eight general questions as well as follow-up inquiries on additional variables was conducted. These 40 core variables across the three EHDs and all variables in each EHD along with those in the US MS CDM and Observational Medical Outcomes Partnership (OMOP) CDM were compared for availability and labeled based on specific standards. All of the EHDs' custodians confirmed their willingness to share their databases with academic institutions after appropriate approval was obtained. The regional community-based database contained 1.19 million people in 2015 with 85% of core variables. Resampled annually nationwide, the national claims database included 5.4 million people in 2014 with 55% of core variables, and the EMR database included 3 million inpatients from 60 hospitals in 2015 with 80% of core variables. Compared with MS CDM or OMOP CDM, the proportion of variables across the three EHDs available or able to be transformed/derived from the original sources are 24-83% or 45-73%, respectively. These EHDs provide potential value to post-marketing drug safety surveillance and pharmacoepidemiology in China. Future research is warranted to assess the quality and completeness of these EHDs or additional data sources in China.

  8. Assessment of COPD-related outcomes via a national electronic medical record database.

    Science.gov (United States)

    Asche, Carl; Said, Quayyim; Joish, Vijay; Hall, Charles Oaxaca; Brixner, Diana

    2008-01-01

    The technology and sophistication of healthcare utilization databases have expanded over the last decade to include results of lab tests, vital signs, and other clinical information. This review provides an assessment of the methodological and analytical challenges of conducting chronic obstructive pulmonary disease (COPD) outcomes research in a national electronic medical records (EMR) dataset and its potential application towards the assessment of national health policy issues, as well as a description of the challenges or limitations. An EMR database and its application to measuring outcomes for COPD are described. The ability to measure adherence to the COPD evidence-based practice guidelines, generated by the NIH and HEDIS quality indicators, in this database was examined. Case studies, before and after their publication, were used to assess the adherence to guidelines and gauge the conformity to quality indicators. EMR was the only source of information for pulmonary function tests, but low frequency in ordering by primary care was an issue. The EMR data can be used to explore impact of variation in healthcare provision on clinical outcomes. The EMR database permits access to specific lab data and biometric information. The richness and depth of information on "real world" use of health services for large population-based analytical studies at relatively low cost render such databases an attractive resource for outcomes research. Various sources of information exist to perform outcomes research. It is important to understand the desired endpoints of such research and choose the appropriate database source.

  9. The Establishment of the Chinese Full-text Electronic Periodical Database and Service System

    Directory of Open Access Journals (Sweden)

    Huei-Chu Chang

    2003-12-01

    Full Text Available A database that covers important journals to a critical mass, provides a powerful search interface, and is easy to access remotely is the most suitable electronic resource for users. Starting from the project of digitizing bio-medical journals in the Taiwan area through to CEPS, this article discusses related issues including the selection of journals, the digitization of back issues, the transfer of copyright from authors to database producers, and the payment of royalties to authors from revenue. It also covers the flow of journal publishing, marketing, functions, and the proposed cost-effectiveness of CEPS. [Article content in Chinese]

  10. Methods for measurement of electron emission yield under low energy electron-irradiation by collector method and Kelvin probe method

    Energy Technology Data Exchange (ETDEWEB)

    Tondu, Thomas; Belhaj, Mohamed; Inguimbert, Virginie [Onera, DESP, 2 Avenue Edouard Belin, 31400 Toulouse (France); Fondation STAE, 4 allee Emile Monso, BP 84234-31432, Toulouse Cedex 4 (France)]

    2010-09-15

    Secondary electron emission yield of gold under electron impact at normal incidence below 50 eV was investigated by the classical collector method and by the Kelvin probe method. The authors show that biasing a collector to ensure secondary electron collection while keeping the target grounded can lead to primary electron beam perturbations. Thus reliable secondary electron emission yields at low primary electron energy cannot be obtained with a biased collector. The authors present two collector-free methods based on current measurement and on electron pulse surface potential buildup (Kelvin probe method). These methods are consistent, but at very low energy (below 10 eV), measurements become sensitive to the Earth's magnetic field. For gold, the authors can extrapolate the total emission yield at 0 eV to 0.5, while a total electron emission yield of 1 is obtained at 40±1 eV.
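
The collector-free current-measurement method rests on a simple current balance: with the target grounded, the emitted current is the difference between the incident beam current and the measured target current, so the total yield is sigma = 1 - I_t / I_0. A sketch with hypothetical currents (the sign convention and the numerical values are assumptions for illustration):

```python
def total_yield(i_target, i_beam):
    # Current balance for a grounded target: the emitted-electron current
    # is I_beam - I_target, so sigma = 1 - I_target / I_beam.
    return 1.0 - i_target / i_beam

# Hypothetical (I_beam, I_target) pairs in nA.  A negative target current
# means more electrons leave the surface than arrive (yield above 1).
measurements = [(10.0, 5.0), (10.0, 0.0), (10.0, -1.0)]
yields = [total_yield(i_t, i_0) for i_0, i_t in measurements]
```

A zero target current corresponds to the crossover point sigma = 1, which the abstract places at 40±1 eV for gold.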

  11. Methods for measurement of electron emission yield under low energy electron-irradiation by collector method and Kelvin probe method

    International Nuclear Information System (INIS)

    Tondu, Thomas; Belhaj, Mohamed; Inguimbert, Virginie

    2010-01-01

    Secondary electron emission yield of gold under electron impact at normal incidence below 50 eV was investigated by the classical collector method and by the Kelvin probe method. The authors show that biasing a collector to ensure secondary electron collection while keeping the target grounded can lead to primary electron beam perturbations. Thus reliable secondary electron emission yields at low primary electron energy cannot be obtained with a biased collector. The authors present two collector-free methods based on current measurement and on electron pulse surface potential buildup (Kelvin probe method). These methods are consistent, but at very low energy (below 10 eV), measurements become sensitive to the Earth's magnetic field. For gold, the authors can extrapolate the total emission yield at 0 eV to 0.5, while a total electron emission yield of 1 is obtained at 40±1 eV.

  12. A database for AVLIS-U method

    International Nuclear Information System (INIS)

    Vasaru, Gheorghe

    2000-01-01

    Uranium enrichment is a critical step in transforming natural uranium into nuclear fuel to produce energy. Enrichment accounts for approximately one third of the cost of nuclear fuel and about 10% of the total cost of the electricity generated. Atomic vapor processes work on the principle of photo-ionization, whereby a powerful laser is used to ionize particular atoms present in a vapor of uranium metal. The positively-charged 235U ions are then attracted to a negatively-charged plate and collected. The main molecular processes which have been studied work on the principle of photo-dissociation of UF6 to solid UF5, using tuned laser radiation as above. Because it uses UF6, the molecular process fits more readily within the conventional fuel cycle than the atomic process. These two new methods have been the focus of interest for some time. They promise lower energy inputs, lower capital costs and lower tails assays, hence significant economic advantages. The program of work included: - theoretical studies of photon-atom interaction, including the effects of hyperfine structure, magnetic field and cross section; - experimental work to find theoretically favorable transitions between the levels in the atom and to measure relevant transition parameters using, initially, low density uranium vapor; - development of techniques for the precision tuning and stabilization of suitable lasers, obtaining the required band width, and amplifying the light to the required power; - materials and technology related to high density vapor production; - theoretical and experimental work on the efficient separation of selectively generated ions from a vapor stream. A number of techniques (sputtering, electron beam heating, etc.) have been used to produce suitable streams of uranium vapor. For AVLIS-U development, the following five areas of activity were focused on: - vapor production electron guns; - production of laser beams; - selective ionization of 235U; - separation and collection of tails and product

  13. Intelligent methods for data retrieval in fusion databases

    International Nuclear Information System (INIS)

    Vega, J.

    2008-01-01

    The plasma behaviour is identified through the recognition of patterns inside signals. The search for patterns is usually a manual and tedious procedure in which signals need to be examined individually. A breakthrough in data retrieval for fusion databases is the development of intelligent methods to search for patterns. A pattern (in the broadest sense) could be a single segment of a waveform, a set of pixels within an image or even a heterogeneous set of features made up of waveforms, images and any kind of experimental data. Intelligent methods will allow searching for data according to technical, scientific and structural criteria instead of an identifiable time interval or pulse number. Such search algorithms should be intelligent enough to avoid passing over the entire database. Benefits of such access methods are discussed and several available techniques are reviewed. In addition, the applicability of the methods from general purpose searching systems to ad hoc developments is covered
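
One simple form of the intelligent waveform-pattern search the abstract motivates is a normalized sliding-window match; this is a simplified stand-in for the similarity-search techniques reviewed, and the signal and pattern below are synthetic:

```python
import numpy as np

def find_pattern(signal, pattern, top_k=1):
    # Slide a pattern-sized window over the signal; z-score both the
    # window and the pattern so matching is amplitude/offset invariant.
    def z(x):
        s = x.std()
        return (x - x.mean()) / s if s else x - x.mean()
    p = z(np.asarray(pattern, float))
    sig = np.asarray(signal, float)
    dists = [np.linalg.norm(z(sig[i:i + p.size]) - p)
             for i in range(sig.size - p.size + 1)]
    return np.argsort(dists)[:top_k]  # indices of the best matches

# Synthetic waveform: a ramp with a scaled, offset copy of the pattern
# embedded at index 7.
pattern = [0, 1, 0, -1, 0]
signal = list(range(20))
signal[7:12] = [x * 3 + 5 for x in pattern]
best = find_pattern(signal, pattern)
```

Because the search is normalized, the same pattern is found regardless of the shot-to-shot scaling of the signal, which is what allows queries by shape rather than by pulse number or time interval.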

  14. A Web-based Alternative Non-animal Method Database for Safety Cosmetic Evaluations.

    Science.gov (United States)

    Kim, Seung Won; Kim, Bae-Hwan

    2016-07-01

    Animal testing was used traditionally in the cosmetics industry to confirm product safety, but has begun to be banned; alternative methods to replace animal experiments are either in development, or are being validated, worldwide. Research data related to test substances are critical for developing novel alternative tests. Moreover, safety information on cosmetic materials has neither been collected in a database nor shared among researchers. Therefore, it is imperative to build and share a database of safety information on toxicological mechanisms and pathways collected through in vivo, in vitro, and in silico methods. We developed the CAMSEC database (named after the research team; the Consortium of Alternative Methods for Safety Evaluation of Cosmetics) to fulfill this purpose. On the same website, our aim is to provide updates on current alternative research methods in Korea. The database will not be used directly to conduct safety evaluations, but researchers or regulatory individuals can use it to facilitate their work in formulating safety evaluations for cosmetic materials. We hope this database will help establish new alternative research methods to conduct efficient safety evaluations of cosmetic materials.

  15. Virtual materials design using databases of calculated materials properties

    International Nuclear Information System (INIS)

    Munter, T R; Landis, D D; Abild-Pedersen, F; Jones, G; Wang, S; Bligaard, T

    2009-01-01

    Materials design is most commonly carried out by experimental trial and error techniques. Current trends indicate that the increased complexity of newly developed materials, the exponential growth of the available computational power, and the constantly improving algorithms for solving the electronic structure problem, will continue to increase the relative importance of computational methods in the design of new materials. One possibility for utilizing electronic structure theory in the design of new materials is to create large databases of materials properties, and subsequently screen these for new potential candidates satisfying given design criteria. We utilize a database of more than 81 000 electronic structure calculations. This alloy database is combined with other published materials properties to form the foundation of a virtual materials design framework (VMDF). The VMDF offers a flexible collection of materials databases, filters, analysis tools and visualization methods, which are particularly useful in the design of new functional materials and surface structures. The applicability of the VMDF is illustrated by two examples. One is the determination of the Pareto-optimal set of binary alloy methanation catalysts with respect to catalytic activity and alloy stability; the other is the search for new alloy mercury absorbers.
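The Pareto-optimal screening mentioned above (binary alloy catalysts ranked by catalytic activity and alloy stability) can be sketched with a generic non-dominated filter; the objective values and alloy data here are invented for illustration, and the actual VMDF criteria are not reproduced.

```python
def pareto_front(points):
    """Return the points not dominated by any other point,
    maximizing every objective (no duplicate points assumed)."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] >= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# (activity, stability) pairs for hypothetical alloys
alloys = [(0.9, 0.2), (0.5, 0.5), (0.4, 0.4), (0.1, 0.9)]
print(pareto_front(alloys))  # -> [(0.9, 0.2), (0.5, 0.5), (0.1, 0.9)]
```

Here (0.4, 0.4) is dropped because (0.5, 0.5) is at least as good in both objectives; the remaining points form the trade-off set a designer would screen further.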

  16. Rosetta Mission: Electron Scattering Cross Sections—Data Needs and Coverage in BEAMDB Database

    Directory of Open Access Journals (Sweden)

    Bratislav P. Marinković

    2017-11-01

    The emission of [O I] lines in the coma of Comet 67P/Churyumov-Gerasimenko during the Rosetta mission has been explained by electron impact dissociation of water rather than by photodissociation. This is direct evidence that electron-induced processing plays a role on such a body. Analysis of other emission features is handicapped by a lack of detailed knowledge of electron impact cross sections, which highlights the need for a broad range of electron scattering data for the molecular systems detected on the comet. In this paper, we present an overview of the needs for electron scattering data relevant to understanding observations in the coma, the tenuous atmosphere and on the surface of 67P/Churyumov-Gerasimenko during the Rosetta mission. The relevant observations for elucidating the role of electrons come from optical spectra, particle analysis using the ion and electron sensors, and mass spectrometry measurements. To model these processes, electron impact data should be collated and reviewed in an electron scattering database; an example is given in BEAMDB, which is part of a larger consortium, the Virtual Atomic and Molecular Data Centre (VAMDC).

  17. The Danish Fetal Medicine Database

    DEFF Research Database (Denmark)

    Ekelund, Charlotte K; Petersen, Olav B; Jørgensen, Finn S

    2015-01-01

    OBJECTIVE: To describe the establishment and organization of the Danish Fetal Medicine Database and to report national results of first-trimester combined screening for trisomy 21 in the 5-year period 2008-2012. DESIGN: National register study using prospectively collected first-trimester screening data from the Danish Fetal Medicine Database. POPULATION: Pregnant women in Denmark undergoing first-trimester screening for trisomy 21. METHODS: Data on maternal characteristics, biochemical and ultrasonic markers are continuously sent electronically from local fetal medicine databases (Astraia GmbH software) to a central national database. Data are linked to outcome data from the National Birth Register, the National Patient Register and the National Cytogenetic Register via the mother's unique personal registration number. First-trimester screening data from 2008 to 2012 were retrieved. MAIN OUTCOME...

  18. Healthcare databases in Europe for studying medicine use and safety during pregnancy

    OpenAIRE

    Charlton, Rachel A.; Neville, Amanda J.; Jordan, Sue; Pierini, Anna; Damase-Michel, Christine; Klungsøyr, Kari; Andersen, Anne-Marie Nybo; Hansen, Anne Vinkel; Gini, Rosa; Bos, Jens H.J.; Puccini, Aurora; Hurault-Delarue, Caroline; Brooks, Caroline J.; De Jong-van den Berg, Lolkje T.V.; de Vries, Corinne S.

    2014-01-01

    Purpose The aim of this study was to describe a number of electronic healthcare databases in Europe in terms of the population covered, the source of the data captured and the availability of data on key variables required for evaluating medicine use and medicine safety during pregnancy. Methods A sample of electronic healthcare databases that captured pregnancies and prescription data was selected on the basis of contacts within the EUROCAT network. For each participating database, a data...

  19. FY1995 transduction method and CAD database systems for integrated design; 1995 nendo transduction ho to CAD database togo sekkei shien system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The Transduction method, developed by the research coordinator and Prof. Muroga, is one of the most popular methods for designing large-scale integrated circuits, and is thus used by major design tool companies in the USA and Japan. The major objectives of the research are to improve its capability and to exploit its reusable property by combining it with CAD databases. Major results of the project are as follows: (1) Improvement of the Transduction method: efficiency, capability and the maximum circuit size are improved; the error compensation method is also improved. (2) Applications to new logic elements: the Transduction method is modified to cope with wired logic and FPGAs. (3) CAD databases: one of the major advantages of the Transduction method is the 'reusability' of already designed circuits, which makes it suitable for combination with CAD databases; we designed CAD databases suitable for cooperative design using the Transduction method. (4) Program development: programs for Windows95 were developed for distribution. (NEDO)


  1. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  2. Database for inelastic collisions of lithium atoms with electrons, protons, and multiply charged ions

    NARCIS (Netherlands)

    Schweinzer, J; Brandenburg, R; Bray, [No Value; Hoekstra, R; Aumayr, F; Janev, RK; Winter, HP

    New experimental and theoretical cross-section data for inelastic collision processes of Li atoms in the ground state and excited states (up to n = 4) with electrons, protons, and multiply charged ions have been reported since the database assembled by Wutte et al. [ATOMIC DATA AND NUCLEAR DATA

  3. DRES Database of Methods for the Analysis of Chemical Warfare Agents

    National Research Council Canada - National Science Library

    D'Agostino, Paul

    1997-01-01

    .... Update of the database continues as an ongoing effort, and the DRES Database of Methods for the Analysis of Chemical Warfare Agents is available in hardcopy form or as a softcopy ProCite or WordPerfect file...

  4. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    Directory of Open Access Journals (Sweden)

    Weycker Derek

    2013-02-01

    Background: Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods: Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (low ANC, and body temperature ≥38.3°C or receipt of antibiotics) and per a claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Results: Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Conclusions: Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases, with an acceptable level of misclassification, using diagnosis codes for neutropenia, or neutropenia plus fever.
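The accuracy measures reported in this abstract follow directly from the confusion counts. Using the paper's own figures (82 gold-standard cases; the neutropenia-plus-fever definition flagged 28 subjects, all true positives):

```python
def ppv_sensitivity(tp, fp, fn):
    """Positive predictive value and sensitivity from confusion counts."""
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    return ppv, sensitivity

# Neutropenia + fever in any position: 28 flagged, all correct, 82 true cases
ppv, sens = ppv_sensitivity(tp=28, fp=0, fn=82 - 28)
print(round(ppv * 100), round(sens * 100))  # -> 100 34
```

This reproduces the reported PPV of 100% and sensitivity of 34% for that definition; the broader definitions trade PPV for sensitivity in the same way.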

  5. A generic method for improving the spatial interoperability of medical and ecological databases.

    Science.gov (United States)

    Ghenassia, A; Beuscart, J B; Ficheur, G; Occelli, F; Babykina, E; Chazard, E; Genin, M

    2017-10-03

    The availability of big data in healthcare and the intensive development of data reuse and georeferencing have opened up perspectives for health spatial analysis. However, fine-scale spatial studies of ecological and medical databases are limited by the change of support problem and thus a lack of spatial unit interoperability. The use of spatial disaggregation methods to solve this problem introduces errors into the spatial estimations. Here, we present a generic, two-step method for merging medical and ecological databases that avoids the use of spatial disaggregation methods, while maximizing the spatial resolution. Firstly, a mapping table is created after one or more transition matrices have been defined. The latter link the spatial units of the original databases to the spatial units of the final database. Secondly, the mapping table is validated by (1) comparing the covariates contained in the two original databases, and (2) checking the spatial validity with a spatial continuity criterion and a spatial resolution index. We used our novel method to merge a medical database (the French national diagnosis-related group database, containing 5644 spatial units) with an ecological database (produced by the French National Institute of Statistics and Economic Studies, and containing 36,594 spatial units). The mapping table yielded 5632 final spatial units. The mapping table's validity was evaluated by comparing the number of births in the medical database and the ecological database in each final spatial unit. The median [interquartile range] relative difference was 2.3% [0; 5.7]. The spatial continuity criterion was low (2.4%), and the spatial resolution index was greater than for most French administrative areas. Our innovative approach improves interoperability between medical and ecological databases and facilitates fine-scale spatial analyses.
We have shown that disaggregation models and large aggregation techniques are not necessarily the best ways to
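A toy version of the two-step mapping-table idea can be sketched as follows; the spatial unit codes, counts, and the validation-by-births comparison are invented stand-ins for the French databases described above.

```python
# Transition matrices as dicts: original spatial unit -> final spatial unit
medical_to_final = {"M1": "F1", "M2": "F1", "M3": "F2"}
ecological_to_final = {"E1": "F1", "E2": "F2", "E3": "F2"}

births_medical = {"M1": 10, "M2": 5, "M3": 7}
births_ecological = {"E1": 15, "E2": 4, "E3": 3}

def aggregate(values, mapping):
    """Sum values of original units into their final spatial units."""
    out = {}
    for unit, v in values.items():
        out[mapping[unit]] = out.get(mapping[unit], 0) + v
    return out

# Validation step: compare a shared covariate (births) per final unit
med = aggregate(births_medical, medical_to_final)
eco = aggregate(births_ecological, ecological_to_final)
for unit in med:
    rel = abs(med[unit] - eco[unit]) / eco[unit] * 100
    print(unit, round(rel, 1))  # small relative differences validate the table
```

The key property, as in the paper, is that only aggregation is ever performed, so no disaggregation error is introduced.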

  6. Exposure to benzodiazepines (anxiolytics, hypnotics and related drugs) in seven European electronic healthcare databases: a cross-national descriptive study from the PROTECT-EU Project.

    Science.gov (United States)

    Huerta, Consuelo; Abbing-Karahagopian, Victoria; Requena, Gema; Oliva, Belén; Alvarez, Yolanda; Gardarsdottir, Helga; Miret, Montserrat; Schneider, Cornelia; Gil, Miguel; Souverein, Patrick C; De Bruin, Marie L; Slattery, Jim; De Groot, Mark C H; Hesse, Ulrik; Rottenkolber, Marietta; Schmiedl, Sven; Montero, Dolores; Bate, Andrew; Ruigomez, Ana; García-Rodríguez, Luis Alberto; Johansson, Saga; de Vries, Frank; Schlienger, Raymond G; Reynolds, Robert F; Klungel, Olaf H; de Abajo, Francisco José

    2016-03-01

    Studies on drug utilization usually do not allow direct cross-national comparisons because of differences in the applied methods. This study aimed to compare time trends in BZD prescribing by applying a common protocol and analysis plan in seven European electronic healthcare databases. Crude and standardized prevalence rates of drug prescribing from 2001-2009 were calculated in databases from Spain, the United Kingdom (UK), The Netherlands, Germany and Denmark. Prevalence was stratified by age, sex, BZD type (using ATC codes: BZD-anxiolytics, BZD-hypnotics, BZD-related drugs and clomethiazole), indication and number of prescriptions. Crude prevalence rates of BZD prescribing ranged from 570 to 1700 per 10,000 person-years over the study period. Standardization by age and sex did not substantially change the differences. Standardized prevalence rates increased in the Spanish (+13%) and UK databases (+2% and +8%) over the study period, while they decreased in the Dutch databases (-4% and -22%), the German (-12%) and the Danish (-26%) database. Prevalence of anxiolytics outweighed that of hypnotics in the Spanish, Dutch and Bavarian databases, but the reverse was shown in the UK and Danish databases. Prevalence rates consistently increased with age and were two-fold higher in women than in men in all databases. A median of 18% of users received 10 or more prescriptions in 2008. Although similar methods were applied, the prevalence of BZD prescribing varied considerably across different populations. Clinical factors related to BZDs and characteristics of the databases may explain these differences. Copyright © 2015 John Wiley & Sons, Ltd.
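The age-and-sex standardization used above can be illustrated with direct standardization: each stratum-specific rate is weighted by a common reference population so that databases with different age structures become comparable. All numbers below are hypothetical, not from the study.

```python
def standardized_rate(stratum_rates, reference_weights):
    """Directly standardized rate: weighted mean of stratum-specific
    rates using a common reference population distribution."""
    assert abs(sum(reference_weights.values()) - 1.0) < 1e-9
    return sum(stratum_rates[s] * w for s, w in reference_weights.items())

# Prevalence per 10,000 person-years by age band (hypothetical database)
rates_db_a = {"18-44": 300, "45-64": 800, "65+": 2000}
# Shared reference population age distribution (hypothetical)
reference = {"18-44": 0.5, "45-64": 0.3, "65+": 0.2}
print(round(standardized_rate(rates_db_a, reference), 1))  # -> 790.0
```

Computing this for every database against the same reference removes the part of the crude-rate difference that is due only to population structure.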

  7. Prediction methods and databases within chemoinformatics: emphasis on drugs and drug candidates

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Jorgensen, FS; Brunak, Søren

    2005-01-01

    MOTIVATION: To gather information about available databases and chemoinformatics methods for prediction of properties relevant to the drug discovery and optimization process. RESULTS: We present an overview of the most important databases with 2-dimensional and 3-dimensional structural information about drugs and drug candidates, and of databases with relevant properties. Access to experimental data and numerical methods for selecting and utilizing these data is crucial for developing accurate predictive in silico models. Many interesting predictive methods for classifying the suitability of chemical compounds as potential drugs, as well as for predicting their physico-chemical and ADMET properties, have been proposed in recent years. These methods are discussed, and some possible future directions in this rapidly developing field are described.

  8. Examples how to use atomic and molecular databases

    International Nuclear Information System (INIS)

    Murakami, Izumi

    2012-01-01

    As examples of how to use atomic and molecular databases, the atomic spectra database (ASD) and molecular chemical kinetics database of the National Institute of Standards and Technology (NIST), the collision cross sections of the National Institute for Fusion Science (NIFS), the Open-Atomic Data and Analysis Structure (Open-ADAS) and the chemical reaction rate coefficients of GRI-Mech are presented. The sorting method differs in each database, and several options are provided. Atomic wavelengths/transition probabilities and electron-collision ionization, excitation and recombination cross sections/rate coefficients can be searched simply by specifying the atom or ion, using the general internet search engine (GENIE) of the IAEA. (T. Tanaka)

  9. Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus

    OpenAIRE

    RIBEIRO, CHRISTIE GRAF; MOREIRA, ANA TEREZA RAMOS; PINTO, JOSÉ SIMÃO DE PAULA; MALAFAIA, OSVALDO

    2016-01-01

    ABSTRACT Objective: to create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of a systematic data collection software named Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE(c)). Methods: this is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) compute...

  10. Development of prostate cancer research database with the clinical data warehouse technology for direct linkage with electronic medical record system.

    Science.gov (United States)

    Choi, In Young; Park, Seungho; Park, Bumjoon; Chung, Byung Ha; Kim, Choung-Soo; Lee, Hyun Moo; Byun, Seok-Soo; Lee, Ji Youl

    2013-01-01

    Despite the increasing number of prostate cancer patients, little is known about the impact of treatments for prostate cancer patients and the outcomes of different treatments based on nationwide data. In order to obtain more comprehensive information on Korean prostate cancer patients, many professionals have urged the creation of a national system to monitor the quality of prostate cancer care. To achieve this objective, the prostate cancer database system was planned and cautiously accommodated different views from various professions. This prostate cancer research database system incorporates information about prostate cancer research including demographics, medical history, operation information, laboratory results, and quality-of-life surveys. The system includes three different ways of collecting clinical data to produce a comprehensive database: direct data extraction from the electronic medical record (EMR) system, manual data entry after linking EMR documents such as magnetic resonance imaging findings, and paper-based collection of survey data from patients. We implemented clinical data warehouse technology to test the direct EMR link method with the St. Mary's Hospital system. Using this method, the total number of eligible patients was 2,300 from 1997 until 2012. Among them, 538 patients underwent surgery and the others received different treatments. Our database system could provide the infrastructure for collecting error-free data to support various retrospective and prospective studies.

  11. Guideline on radiation protection in medicine requires documentation of radioiodine therapy and follow-up. What are the benefits of an electronic database?

    International Nuclear Information System (INIS)

    Koch, W.; Rosa, F.; Knesewitsch, P.; Hahn, K.

    2005-01-01

    The recently updated German guideline on radiation protection in medicine (Richtlinie Strahlenschutz in der Medizin) requires the physician who administers radioactive substances for therapy to perform and document follow-up. In order to decrease the administrative burden, an electronic database was developed that interfaces with word processing software to generate written reports and statistical analyses. Methods: Based on Microsoft® Access and Microsoft® Visual Basic, a database was created to monitor patients with benign and malignant thyroid disorders after radioiodine therapy. It permits automatic creation of therapy documents and the necessary patient reports in Microsoft® Word. Intuitive handling, third-level normalization of the database architecture and automatic plausibility checks guarantee the integrity of the data and the efficacy of the database. Results, conclusion: The new software has been a success in over 1500 patients and over 3800 in- and outpatient therapies and visits. The effort of data entry is easily offset by the automatic generation of the necessary patient reports. The required supervision of follow-up appointments is now also user-friendly and efficient. (orig.)

  12. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases.

    Science.gov (United States)

    Weycker, Derek; Sofrygin, Oleg; Seefeld, Kim; Deeter, Robert G; Legg, Jason; Edelsberg, John

    2013-02-13

    Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (low ANC, and body temperature ≥38.3°C or receipt of antibiotics) and per a claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases, with an acceptable level of misclassification, using diagnosis codes for neutropenia, or neutropenia plus fever.

  13. Children's Culture Database (CCD)

    DEFF Research Database (Denmark)

    Wanting, Birgit

    A Dialogue-inspired database with documentation, network (individual and institutional profiles) and current news; paper presented at the research seminar: Electronic access to fiction, Copenhagen, November 11-13, 1996.

  14. E-learning platform for automated testing of electronic circuits using signature analysis method

    Science.gov (United States)

    Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel

    2016-12-01

    Dependability of electronic circuits can be ensured only through testing of circuit modules. This is done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during product operation. This paper presents the platform developed by the authors for training in testability in electronics in general, and in using the signature analysis method in particular. The platform highlights the two approaches in the field, namely the analog and digital signatures of circuits. As part of this e-learning platform, a database of signatures of different electronic components has been developed, meant to put into the spotlight different fault-detection techniques and, building on these, self-repairing techniques for systems with such components. An approach to realizing self-testing circuits based on the MATLAB environment and using the signature analysis method is proposed. The paper analyses the benefits of the signature analysis method and also simulates signature analyzer performance based on the use of pseudo-random sequences.
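Signature analysis, as used on the platform above, compresses a long output bit stream into a short signature with a linear-feedback shift register (LFSR); a fault alters the stream and, with high probability, the signature. A minimal sketch follows; the 16-bit tap polynomial 0x8005 is a common CRC-16 choice assumed for illustration, not necessarily the one used on the platform.

```python
def lfsr_signature(bits, taps=0x8005, width=16):
    """Compress a bit stream into a `width`-bit signature using a
    serial-input LFSR with the given feedback tap polynomial."""
    reg = 0
    for b in bits:
        msb = (reg >> (width - 1)) & 1          # bit shifted out
        reg = (reg << 1) & ((1 << width) - 1)   # shift left, keep width bits
        if msb ^ b:                             # feedback of (output XOR input)
            reg ^= taps
    return reg

good = [1, 0, 1, 1, 0, 0, 1, 0] * 4   # fault-free circuit output stream
faulty = good.copy()
faulty[5] ^= 1                        # a single stuck-at fault flips one bit
print(lfsr_signature(good) != lfsr_signature(faulty))  # -> True
```

Because the LFSR update is linear over GF(2) and the polynomial has a nonzero constant term, any single-bit error in the stream is guaranteed to change the signature; longer error patterns are caught with probability close to 1 - 2^-16.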

  15. GMOMETHODS: the European Union database of reference methods for GMO analysis.

    Science.gov (United States)

    Bonfini, Laura; Van den Bulcke, Marc H; Mazzara, Marco; Ben, Enrico; Patak, Alexandre

    2012-01-01

    In order to provide reliable and harmonized information on methods for GMO (genetically modified organism) analysis, we have published a database called "GMOMETHODS" that supplies information on PCR assays validated according to the principles and requirements of ISO 5725 and/or the International Union of Pure and Applied Chemistry protocol. In addition, the database contains methods that have been verified by the European Union Reference Laboratory for Genetically Modified Food and Feed in the context of compliance with a European Union legislative act. The web application provides search capabilities to retrieve primer and probe sequence information on the available methods. It further supplies core data required by analytical labs to carry out GM tests and comprises information on the applied reference material and plasmid standards. The GMOMETHODS database currently contains 118 different PCR methods allowing identification of 51 single GM events and 18 taxon-specific genes in a sample. It also provides screening assays for detection of eight different genetic elements commonly used for the development of GMOs. The application is referred to by the Biosafety Clearing House, a global mechanism set up by the Cartagena Protocol on Biosafety to facilitate the exchange of information on Living Modified Organisms. The publication of the GMOMETHODS database can be considered an important step toward worldwide standardization and harmonization in GMO analysis.

  16. Construction of Electronics Database for East Asian Countries and Empirical Analysis of International Competitiveness of Japanese Companies (Japanese)

    OpenAIRE

    MOTOHASHI Kazuyuki

    2010-01-01

    The international competitiveness of Japanese electronics firms is fading as firms in East Asian countries such as China, Korea, and Taiwan catch up. In this paper, we have constructed an electronics industry database from 1996 to 2005 for China, Korea, Japan, Taiwan, and the United States. It covers industrial statistics in these countries including trade and overseas production statistics, which makes it possible to control for global production activities of electronics firms. We have also...

  17. Examining database persistence of ISO/EN 13606 standardized electronic health record extracts: relational vs. NoSQL approaches.

    Science.gov (United States)

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Lozano-Rubí, Raimundo; Serrano-Balazote, Pablo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2017-08-18

    The objective of this research is to compare relational and non-relational (NoSQL) database system approaches for storing, recovering, querying and persisting standardized medical information in the form of ISO/EN 13606 normalized Electronic Health Record XML extracts, both in isolation and concurrently. NoSQL database systems have recently attracted much attention, but few studies in the literature address their direct comparison with relational databases when applied to build the persistence layer of a standardized medical information system. One relational and two NoSQL databases (one document-based and one native XML database) of three different sizes were created in order to evaluate and compare the response times (algorithmic complexity) of six queries of growing complexity, which were performed on them. Similar appropriate results available in the literature have also been considered. Both relational and non-relational NoSQL database systems show almost linear algorithmic complexity in query execution; however, they show very different linear slopes, the former being much steeper than the two latter. Document-based NoSQL databases perform better in concurrency than in isolation, and also better than relational databases in concurrency. Non-relational NoSQL databases seem to be more appropriate than standard relational SQL databases when the database size is extremely high (secondary use, research applications). Document-based NoSQL databases perform in general better than native XML NoSQL databases. Visualization and editing of EHR extracts are also document-based tasks better suited to NoSQL database systems. However, the appropriate database solution depends largely on each particular situation and specific problem.
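The near-linear scaling reported above can be checked on any engine by timing the same query at several database sizes. A sketch with SQLite standing in for the relational case (the table layout and query are illustrative assumptions, not the study's benchmark):

```python
import sqlite3
import time

def query_time(n_rows):
    """Time a full-scan query on an in-memory table of n_rows rows."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE extracts (id INTEGER, payload TEXT)")
    con.executemany("INSERT INTO extracts VALUES (?, ?)",
                    ((i, "x" * 100) for i in range(n_rows)))
    t0 = time.perf_counter()
    con.execute("SELECT COUNT(*) FROM extracts "
                "WHERE payload LIKE '%x%'").fetchone()
    return time.perf_counter() - t0

for n in (10_000, 20_000, 40_000):
    print(n, query_time(n))  # times should grow roughly linearly with n
```

Plotting time against size and fitting a line gives the "slope" the abstract compares across engines; a steeper slope means the engine degrades faster as the record store grows.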

  18. Truth Space Method for Caching Database Queries

    Directory of Open Access Journals (Sweden)

    S. V. Mosin

    2015-01-01

    We propose a new method of client-side data caching for relational databases with a central server and distant clients. Data are loaded into the client cache based on queries executed on the server. Every query has a corresponding DB table: the result of the query execution. These queries have a special form called a "universal relational query", based on three fundamental Relational Algebra operations: selection, projection and natural join. Such a form is the closest to natural language, and the majority of database search queries can be expressed in this way. Besides, this form allows us to analyze query correctness by checking the lossless join property. A subsequent query may be executed in a client's local cache if we can determine that the query result is entirely contained in the cache. For this we compare the truth spaces of the logical restrictions in the new user query and in the results of the queries already executed in the cache. Such a comparison can be performed analytically, without the need for additional database queries. This method may also be used to identify the data lacking in the cache and to execute the query on the server only for those data; here, too, the analytical approach is used, which distinguishes our paper from existing technologies. We propose four theorems for testing the required conditions. The conditions of the first and third theorems allow us to determine the existence of the required data in the cache. The second and fourth theorems state the conditions for executing queries with the cache only. The problem of cache data actualization is not discussed in this paper; however, it can be solved by cataloging queries on the server and serving them by triggers in background mode. The article is published in the author's wording.
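The containment test at the heart of this method can be illustrated with one-dimensional truth spaces: if the new query's restriction interval lies entirely inside a cached query's interval, the cache alone can answer it. The attribute, intervals and function name below are illustrative, and real truth spaces are multi-dimensional regions rather than single intervals.

```python
def contained(new, cached):
    """True if the truth space (here an interval) of the new query's
    restriction lies entirely inside that of a cached query, so the
    new query can be answered from the cache alone."""
    (new_lo, new_hi), (c_lo, c_hi) = new, cached
    return c_lo <= new_lo and new_hi <= c_hi

# Cached: SELECT ... WHERE 100 <= price <= 500
# New:    SELECT ... WHERE 200 <= price <= 300
print(contained((200, 300), (100, 500)))  # -> True
print(contained((50, 300), (100, 500)))   # -> False: fetch the missing part
```

In the negative case the method's refinement applies: only the part of the truth space not covered by the cache (here, prices in [50, 100)) needs to be requested from the server.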

  19. 31 CFR 203.10 - Electronic payment methods.

    Science.gov (United States)

    2010-07-01

    31 Money and Finance: Treasury; TAX AND LOAN PROGRAM, Electronic Federal Tax Payments; § 203.10 Electronic payment methods. (a) General. Electronic payment methods for Federal tax payments available under this subpart include ACH debit entries...

  20. A DATABASE OF >20 keV ELECTRON GREEN'S FUNCTIONS OF INTERPLANETARY TRANSPORT AT 1 AU

    Energy Technology Data Exchange (ETDEWEB)

    Agueda, N.; Sanahuja, B. [Departament d' Astronomia i Meteorologia, Institut de Ciencies del Cosmos, Universitat de Barcelona, Barcelona (Spain); Vainio, R. [Department of Physics, University of Helsinki, Helsinki (Finland)

    2012-10-15

    We use interplanetary transport simulations to compute a database of electron Green's functions, i.e., differential intensities resulting at the spacecraft position from an impulsive injection of energetic (>20 keV) electrons close to the Sun, for a large number of values of two standard interplanetary transport parameters: the scattering mean free path and the solar wind speed. The nominal energy channels of the ACE, STEREO, and Wind spacecraft have been used in the interplanetary transport simulations to conceive a unique tool for the study of near-relativistic electron events observed at 1 AU. In this paper, we quantify the characteristic times of the Green's functions (onset and peak time, rise and decay phase duration) as a function of the interplanetary transport conditions. We use the database to calculate the FWHM of the pitch-angle distributions at different times of the event and under different scattering conditions. This allows us to provide a first quantitative result that can be compared with observations, and to assess the validity of the frequently used term beam-like pitch-angle distribution.

  1. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    Science.gov (United States)

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes:
    • Support for multi-component compounds (mixtures)
    • Import and export of SD-files
    • Optional security (authorization)
    For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework

  2. The Camden & Islington Research Database: Using electronic mental health records for research.

    Science.gov (United States)

    Werbeloff, Nomi; Osborn, David P J; Patel, Rashmi; Taylor, Matthew; Stewart, Robert; Broadbent, Matthew; Hayes, Joseph F

    2018-01-01

    Electronic health records (EHRs) are widely used in mental health services. Case registers using EHRs from secondary mental healthcare have the potential to deliver large-scale projects evaluating mental health outcomes in real-world clinical populations. We describe the Camden and Islington NHS Foundation Trust (C&I) Research Database which uses the Clinical Record Interactive Search (CRIS) tool to extract and de-identify routinely collected clinical information from a large UK provider of secondary mental healthcare, and demonstrate its capabilities to answer a clinical research question regarding time to diagnosis and treatment of bipolar disorder. The C&I Research Database contains records from 108,168 mental health patients, of which 23,538 were receiving active care. The characteristics of the patient population are compared to those of the catchment area, of London, and of England as a whole. The median time to diagnosis of bipolar disorder was 76 days (interquartile range: 17-391) and median time to treatment was 37 days (interquartile range: 5-194). Compulsory admission under the UK Mental Health Act was associated with shorter intervals to diagnosis and treatment. Prior diagnoses of other psychiatric disorders were associated with longer intervals to diagnosis, though prior diagnoses of schizophrenia and related disorders were associated with decreased time to treatment. The CRIS tool, developed by the South London and Maudsley NHS Foundation Trust (SLaM) Biomedical Research Centre (BRC), functioned very well at C&I. It is reassuring that data from different organizations deliver similar results, and that applications developed in one Trust can then be successfully deployed in another. The information can be retrieved in a quicker and more efficient fashion than more traditional methods of health research. The findings support the secondary use of EHRs for large-scale mental health research in naturalistic samples and settings investigated across large

  3. Application of new type of distributed multimedia databases to networked electronic museum

    Science.gov (United States)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have been actively developed, building on advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system that can effectively perform cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can treat a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval; retrieval can be performed effectively through cooperative processing among multiple domains. Communication languages and protocols are also defined in the system and are used in every communication within it. A language interpreter in each machine translates the communication language into that machine's internal language. Using the language interpreter, internal modules such as the DBMS and the user interface modules can be selected freely. A concept of a 'content-set' is also introduced: a content-set is defined as a package of mutually related contents, and the system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents by referring to data indicating the relations among the contents in the content-set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built.
The results of this experiment indicate that the proposed system can effectively retrieve the objective contents under the control of a number of distributed

  4. The use of databases and registries to enhance colonoscopy quality.

    Science.gov (United States)

    Logan, Judith R; Lieberman, David A

    2010-10-01

    Administrative databases, registries, and clinical databases are designed for different purposes and therefore have different advantages and disadvantages in providing data for enhancing quality. Administrative databases provide the advantages of size, availability, and generalizability, but are subject to constraints inherent in the coding systems used and from data collection methods optimized for billing. Registries are designed for research and quality reporting but require significant investment from participants for secondary data collection and quality control. Electronic health records contain all of the data needed for quality research and measurement, but that data is too often locked in narrative text and unavailable for analysis. National mandates for electronic health record implementation and functionality will likely change this landscape in the near future. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Healthcare databases in Europe for studying medicine use and safety during pregnancy

    DEFF Research Database (Denmark)

    Charlton, Rachel A; Neville, Amanda J; Jordan, Sue

    2014-01-01

    PURPOSE: The aim of this study was to describe a number of electronic healthcare databases in Europe in terms of the population covered, the source of the data captured and the availability of data on key variables required for evaluating medicine use and medicine safety during pregnancy. METHODS: ... data recorded by primary-care practitioners. All databases captured maternal co-prescribing and a measure of socioeconomic status. CONCLUSION: This study suggests that within Europe, electronic healthcare databases may be valuable sources of data for evaluating medicine use and safety during pregnancy. The suitability of a particular database, however, will depend on the research question, the type of medicine to be evaluated, the prevalence of its use and any adverse outcomes of interest. © 2014 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd.

  6. Cell Centred Database (CCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Cell Centered Database (CCDB) is a web accessible database for high resolution 2D, 3D and 4D data from light and electron microscopy, including correlated imaging.

  7. The Method of Analysis Derived Coefficients of Database as a New Method of Historical Research (for Example, a Database of Ballistic Parameters of Naval Artillery

    Directory of Open Access Journals (Sweden)

    Nicholas W. Mitiukov

    2015-12-01

    Full Text Available In this paper we propose a new method of historical research based on the analysis of derived coefficients in a database (for example, the form factor in a database of ballistic data). This method offers much greater protection against subjectivism and direct falsification than analysis of numerical series taken directly from a source, since any intentional or unintentional distortion of the raw data produces a marked contrast between the derived coefficient and the average sample values. Applying this method to a ballistic database of naval artillery revealed facts that force a new look at some events in history: data on German naval artillery before World War I were probably overstated to disinform opponents of the Entente; during the First World War, Spain apparently held secret talks with the firm Bofors that ended in the purchase of Swedish shells; and the first Russian naval rifled guns were evidently created on the basis of the Blakely design, not Krupp's as traditionally considered.
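    The core idea, that tampering with raw figures shows up as sharp outliers in a derived coefficient, can be illustrated with a simple statistical screen. The sketch below is an assumption-laden illustration, not the author's procedure; the threshold and the numbers are made up.

    ```python
    import statistics

    def flag_suspect_entries(form_factors, threshold=1.5):
        """Return indices of derived coefficients lying more than `threshold`
        sample standard deviations from the mean; a sharp deviation hints at
        distorted or falsified raw data in the underlying source."""
        mean = statistics.mean(form_factors)
        sd = statistics.stdev(form_factors)
        return [i for i, c in enumerate(form_factors)
                if abs(c - mean) > threshold * sd]

    # A cluster of plausible form factors plus one implausible entry:
    print(flag_suspect_entries([1.00, 1.10, 0.90, 1.05, 5.00]))  # [4]
    ```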

  8. Structural pattern recognition methods based on string comparison for fusion databases

    International Nuclear Information System (INIS)

    Dormido-Canto, S.; Farias, G.; Dormido, R.; Vega, J.; Sanchez, J.; Duro, N.; Vargas, H.; Ratta, G.; Pereira, A.; Portas, A.

    2008-01-01

    Databases for fusion experiments are designed to store several million waveforms. Temporal evolution signals show the same patterns under the same plasma conditions and, therefore, pattern recognition techniques allow the identification of similar plasma behaviours. This article is focused on the comparison of structural pattern recognition methods. A pattern can be composed of simpler sub-patterns, where the most elementary sub-patterns are known as primitives. Selection of primitives is an essential issue in structural pattern recognition methods, because they determine what types of structural components can be constructed. However, it should be noted that there is not a general solution to extract structural features (primitives) from data. So, four different ways to compute the primitives of plasma waveforms are compared: (1) constant length primitives, (2) adaptive length primitives, (3) concavity method and (4) concavity method for noisy signals. Each method defines a code alphabet and, in this way, the pattern recognition problem is carried out via string comparisons. Results of the four methods with the TJ-II stellarator databases will be discussed
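    As a concrete illustration of the string-based approach, the sketch below encodes a waveform as a string of slope symbols, in the spirit of method (1), constant-length primitives, and compares two encodings by edit distance. The alphabet, window length and tolerance are hypothetical choices, not the paper's.

    ```python
    def encode_primitives(signal, seg_len=4, eps=1e-3):
        """Constant-length primitives: split the waveform into fixed windows
        and label each window by its mean slope: 'u' up, 'd' down, 'c' flat."""
        symbols = []
        for i in range(0, len(signal) - seg_len + 1, seg_len):
            slope = (signal[i + seg_len - 1] - signal[i]) / (seg_len - 1)
            symbols.append('u' if slope > eps else 'd' if slope < -eps else 'c')
        return ''.join(symbols)

    def edit_distance(a, b):
        """Levenshtein distance between two primitive strings; similar plasma
        behaviours yield encodings at small distance."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                               prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]

    rise_fall = encode_primitives([0, 1, 2, 3, 3, 2, 1, 0])  # 'ud'
    ramp = encode_primitives([0, 1, 2, 3, 4, 5, 6, 7])       # 'uu'
    print(edit_distance(rise_fall, ramp))  # 1
    ```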

  9. Structural pattern recognition methods based on string comparison for fusion databases

    Energy Technology Data Exchange (ETDEWEB)

    Dormido-Canto, S. [Dpto. Informatica y Automatica - UNED 28040, Madrid (Spain)], E-mail: sebas@dia.uned.es; Farias, G.; Dormido, R. [Dpto. Informatica y Automatica - UNED 28040, Madrid (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, 28040, Madrid (Spain); Sanchez, J.; Duro, N.; Vargas, H. [Dpto. Informatica y Automatica - UNED 28040, Madrid (Spain); Ratta, G.; Pereira, A.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, 28040, Madrid (Spain)

    2008-04-15

    Databases for fusion experiments are designed to store several million waveforms. Temporal evolution signals show the same patterns under the same plasma conditions and, therefore, pattern recognition techniques allow the identification of similar plasma behaviours. This article is focused on the comparison of structural pattern recognition methods. A pattern can be composed of simpler sub-patterns, where the most elementary sub-patterns are known as primitives. Selection of primitives is an essential issue in structural pattern recognition methods, because they determine what types of structural components can be constructed. However, it should be noted that there is not a general solution to extract structural features (primitives) from data. So, four different ways to compute the primitives of plasma waveforms are compared: (1) constant length primitives, (2) adaptive length primitives, (3) concavity method and (4) concavity method for noisy signals. Each method defines a code alphabet and, in this way, the pattern recognition problem is carried out via string comparisons. Results of the four methods with the TJ-II stellarator databases will be discussed.

  10. Database searches for qualitative research

    OpenAIRE

    Evans, David

    2002-01-01

    Interest in the role of qualitative research in evidence-based health care is growing. However, the methods currently used to identify quantitative research do not translate easily to qualitative research. This paper highlights some of the difficulties during searches of electronic databases for qualitative research. These difficulties relate to the descriptive nature of the titles used in some qualitative studies, the variable information provided in abstracts, and the differences in the ind...

  11. X-ray Photoelectron Spectroscopy Database (Version 4.1)

    Science.gov (United States)

    SRD 20 X-ray Photoelectron Spectroscopy Database (Version 4.1) (Web, free access)   The NIST XPS Database gives access to energies of many photoelectron and Auger-electron spectral lines. The database contains over 22,000 line positions, chemical shifts, doublet splittings, and energy separations of photoelectron and Auger-electron lines.

  12. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May 1,...

  13. An effective suggestion method for keyword search of databases

    KAUST Repository

    Huang, Hai

    2016-09-09

    This paper solves the problem of providing high-quality suggestions for user keyword queries over databases. Under the assumption that the returned suggestions are independent, existing query suggestion methods over databases score candidate suggestions individually and return the top-k best of them. However, the top-k suggestions have high redundancy with respect to topics. To provide informative suggestions, the returned k suggestions should be diverse, i.e., simultaneously maximizing the relevance to the user query and the diversity with respect to topics that the user might be interested in. In this paper, an objective function considering both factors is defined for evaluating a suggestion set. We show that maximizing the objective function is a submodular function maximization problem subject to n matroid constraints, which is an NP-hard problem. A greedy approximate algorithm with an approximation ratio O((Formula presented.)) is also proposed. Experimental results show that our method outperforms other methods in providing relevant and diverse suggestions. © 2016 Springer Science+Business Media New York
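    The greedy scheme for this kind of submodular maximization can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the objective here rewards relevance plus a one-time bonus for covering a new topic, and all names and the weight `lam` are assumptions.

    ```python
    def greedy_diverse_suggestions(candidates, k, lam=0.5):
        """Greedily pick k suggestions maximizing relevance plus topic coverage.
        candidates: list of (suggestion, relevance, topic) tuples.
        Granting each topic's bonus only once makes the objective submodular,
        which is what gives the greedy choice its approximation guarantee."""
        selected, covered = [], set()
        pool = list(candidates)
        for _ in range(min(k, len(pool))):
            best = max(pool,
                       key=lambda c: c[1] + (lam if c[2] not in covered else 0.0))
            pool.remove(best)
            selected.append(best[0])
            covered.add(best[2])
        return selected

    cands = [("db index tuning", 0.9, "tuning"),
             ("db index types", 0.8, "tuning"),
             ("db replication", 0.5, "replication")]
    # The second pick trades raw relevance for a new topic:
    print(greedy_diverse_suggestions(cands, 2))  # ['db index tuning', 'db replication']
    ```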

  14. Survey of Machine Learning Methods for Database Security

    Science.gov (United States)

    Kamra, Ashish; Ber, Elisa

    Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.

  15. An ontology-based method for secondary use of electronic dental record data

    Science.gov (United States)

    Schleyer, Titus KL; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P.; Liu, Kaihong; Hernandez, Pedro

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance. PMID:24303273

  16. An ontology-based method for secondary use of electronic dental record data.

    Science.gov (United States)

    Schleyer, Titus Kl; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P; Liu, Kaihong; Hernandez, Pedro

    2013-01-01

    A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance.

  17. The Electronic Library of the Thermal Physical Databases

    International Nuclear Information System (INIS)

    Zhuravleva, Y.; Mingaleeva, G.; Mokrousov, K.; Yashnikov, D.

    2008-01-01

    base library collects experimental databases not only for the verification of one-dimensional thermal-hydraulic system codes with averaged flow parameters but also includes data useful for subchannel thermal-hydraulic analyses in rod bundles. We update the library regularly. The library currently contains the following databases: a critical heat flux databank (10,000 experimental points, 136 test sections, 57 sources); post-crisis heat transfer; void fraction in vertical pipes and rod bundles; critical flow from pipes, valves and flow limiters; heat transfer to supercritical water; local thermal-hydraulic parameters in fuel rod assemblies; and transient processes at test sites and nuclear power plants. MS Access was used for the creation of the library; experiment descriptions and results are saved in MS Word and Excel formats, respectively. The basic principles of the library are: regular updating; searching and selecting data by user-specified parameters; simple data processing; and that the information in each databank be usable both for traditional analysis methods and for new approaches.(author)

  18. DOMe: A deduplication optimization method for the NewSQL database backups.

    Directory of Open Access Journals (Sweden)

    Longxiang Wang

    Full Text Available Reducing duplicated data in database backups is an important application scenario for data deduplication technology. NewSQL is an emerging class of database system that is now used more and more widely. NewSQL systems improve data reliability by periodically backing up in-memory data, which produces a lot of duplicated data. The traditional deduplication method is not optimized for the NewSQL server system and cannot take full advantage of hardware resources to optimize deduplication performance. Recent research has pointed out that future NewSQL servers will have thousands of CPU cores, large DRAM and huge NVRAM. How to utilize these hardware resources to optimize deduplication performance is therefore an important issue. To solve this problem, we propose a deduplication optimization method (DOMe) for NewSQL system backup. To exploit the large number of CPU cores in the NewSQL server, DOMe parallelizes the deduplication method based on the fork-join framework. The fingerprint index, the key data structure in the deduplication process, is implemented as a pure in-memory hash table, which makes full use of the large DRAM in the NewSQL system and eliminates the performance bottleneck of the fingerprint index in traditional deduplication methods. H-Store is used as a typical NewSQL database system to implement the DOMe method. DOMe is experimentally analyzed using two representative backup data sets. The experimental results show that: (1) DOMe can reduce duplicated NewSQL backup data; (2) DOMe significantly improves deduplication performance by parallelizing CDC algorithms; where the theoretical speedup ratio of the server is 20.8, DOMe achieves a speedup of up to 18; (3) DOMe improves deduplication throughput by 1.5 times through the pure in-memory index optimization.
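    The two building blocks named above, content-defined chunking (CDC) and a DRAM-resident fingerprint index, can be sketched in miniature. The toy below is an illustration under stated assumptions (a trivial rolling hash and an arbitrary boundary mask), not the DOMe implementation, and it omits the fork-join parallelization.

    ```python
    import hashlib

    def cdc_chunks(data, mask=0x3F, min_len=16):
        """Toy content-defined chunking: cut when a rolling hash of the bytes
        since the last cut matches a boundary pattern, so the same content
        tends to yield the same chunks across backups."""
        chunks, start, rolling = [], 0, 0
        for i, b in enumerate(data):
            rolling = (rolling * 31 + b) & 0xFFFFFFFF
            if i - start + 1 >= min_len and (rolling & mask) == 0:
                chunks.append(data[start:i + 1])
                start, rolling = i + 1, 0
        if start < len(data):
            chunks.append(data[start:])
        return chunks

    def deduplicate(backups):
        """Pure in-memory fingerprint index (a dict standing in for the DRAM
        hash table); returns (bytes actually stored, raw bytes seen)."""
        index, stored, raw = {}, 0, 0
        for backup in backups:
            for chunk in cdc_chunks(backup):
                raw += len(chunk)
                fp = hashlib.sha1(chunk).hexdigest()
                if fp not in index:        # only unseen chunks are stored
                    index[fp] = chunk
                    stored += len(chunk)
        return stored, raw

    data = bytes((i * 37 + i // 7) % 256 for i in range(1000))
    stored, raw = deduplicate([data, data])  # two identical "backups"
    print(stored, raw)  # the second backup adds nothing to `stored`
    ```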

  19. Methods to Secure Databases Against Vulnerabilities

    Science.gov (United States)

    2015-12-01

    for several languages such as C, C++, PHP, Java and Python [16]. MySQL will work well with very large databases. The documentation references ... using Eclipse and connected to each database management system using Python and Java drivers provided by MySQL, MongoDB, and Datastax (for Cassandra) ... tiers in Python and Java.

    Problem               MySQL       MongoDB     Cassandra
    1. Injection
       a. Tautologies     Vulnerable  Vulnerable  Not Vulnerable
       b. Illegal query   ...
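    The tautology attack listed above is the classic `' OR '1'='1` trick, and parameterized statements are the standard defense. A minimal sketch using Python's stdlib sqlite3 (standing in for the drivers discussed above; the schema and credentials are hypothetical):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    def login_unsafe(name, password):
        # String concatenation lets attacker input become SQL:
        # password = ' OR '1'='1  turns the WHERE clause into a tautology.
        q = f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
        return conn.execute(q).fetchall()

    def login_safe(name, password):
        # Placeholders keep attacker input as data, never as SQL.
        q = "SELECT * FROM users WHERE name = ? AND password = ?"
        return conn.execute(q, (name, password)).fetchall()

    payload = "' OR '1'='1"
    print(len(login_unsafe("alice", payload)))  # 1 -- login bypassed
    print(len(login_safe("alice", payload)))    # 0 -- attack neutralized
    ```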

  20. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    Science.gov (United States)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program/project risk management process. This presentation briefly covers standard risk management procedures, then thoroughly covers NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  1. Combining electronic structure and many-body theory with large databases: A method for predicting the nature of 4 f states in Ce compounds

    Science.gov (United States)

    Herper, H. C.; Ahmed, T.; Wills, J. M.; Di Marco, I.; Björkman, T.; Iuşan, D.; Balatsky, A. V.; Eriksson, O.

    2017-08-01

    Recent progress in materials informatics has opened up the possibility of a new approach to accessing properties of materials, in which one assays the aggregate properties of a large set of materials within the same class in addition to a detailed investigation of each compound in that class. Here we present a large-scale investigation of electronic properties and correlated magnetism in Ce-based compounds, accompanied by a systematic study of the electronic structure and 4f-hybridization function of a large body of Ce compounds, with the goal of elucidating the nature of the 4f states and their interrelation with the measured Kondo energy in these compounds. The hybridization function has been analyzed for more than 350 data sets (part of the IMS database) of cubic Ce compounds using electronic structure theory that relies on a full-potential approach. We demonstrate that the strength of the hybridization function, evaluated in this way, allows us to draw precise conclusions about the degree of localization of the 4f states in these compounds. The theoretical results are entirely consistent with all experimental information relevant to the degree of 4f localization for all investigated materials. Furthermore, a more detailed analysis of the electronic structure and the hybridization function allows us to make precise statements about Kondo correlations in these systems. The calculated hybridization functions, together with the corresponding densities of states, reproduce the expected exponential behavior of the observed Kondo temperatures and establish a consistent trend in real materials. This trend allows us to predict which systems may be correctly identified as Kondo systems. A strong anticorrelation between the size of the hybridization function and the volume of the system has been observed.
The information entropy for this set of systems is

  2. The UMIST database for astrochemistry 2006

    Science.gov (United States)

    Woodall, J.; Agúndez, M.; Markwick-Kemper, A. J.; Millar, T. J.

    2007-05-01

    Aims: We present a new version of the UMIST Database for Astrochemistry, the fourth such version to be released to the public. The current version contains some 4573 binary gas-phase reactions, an increase of 10% over the previous (1999) version, among 420 species, of which 23 are new to the database. Methods: Major updates have been made to ion-neutral reactions, neutral-neutral reactions, particularly at low temperature, and dissociative recombination reactions. We have included for the first time the interstellar chemistry of fluorine. In addition to the usual database, we have also released a reaction set in which the effects of dipole-enhanced ion-neutral rate coefficients are included. Results: These two reaction sets have been used in a dark cloud model and the results of these models are presented and discussed briefly. The database and associated software are available on the World Wide Web at www.udfa.net. Tables 1, 2, 4 and 9 are only available in electronic form at http://www.aanda.org
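    Each two-body entry in the UDfA releases is parameterized by three constants (α, β, γ), with the rate coefficient given by the standard Arrhenius–Kooij form k(T) = α (T/300)^β exp(−γ/T). A minimal sketch of evaluating such an entry; the numerical parameters below are made up for illustration, not taken from the database:

    ```python
    import math

    def rate_coefficient(alpha, beta, gamma, T):
        """Two-body rate coefficient in the UDfA parameterization:
        k(T) = alpha * (T/300)**beta * exp(-gamma/T), in cm^3 s^-1."""
        return alpha * (T / 300.0) ** beta * math.exp(-gamma / T)

    # Hypothetical parameters evaluated at a dark-cloud temperature of 10 K:
    k10 = rate_coefficient(alpha=1.0e-10, beta=-0.5, gamma=0.0, T=10.0)
    print(k10)
    ```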

  3. A Bayesian method for identifying missing enzymes in predicted metabolic pathway databases

    Directory of Open Access Journals (Sweden)

    Karp Peter D

    2004-06-01

    Full Text Available Abstract Background The PathoLogic program constructs Pathway/Genome databases by using a genome's annotation to predict the set of metabolic pathways present in an organism. PathoLogic determines the set of reactions composing those pathways from the enzymes annotated in the organism's genome. Most annotation efforts fail to assign function to 40-60% of sequences. In addition, large numbers of sequences may have non-specific annotations (e.g., thiolase family protein). Pathway holes occur when a genome appears to lack the enzymes needed to catalyze reactions in a pathway. If a protein has not been assigned a specific function during the annotation process, any reaction catalyzed by that protein will appear as a missing enzyme or pathway hole in a Pathway/Genome database. Results We have developed a method that efficiently combines homology and pathway-based evidence to identify candidates for filling pathway holes in Pathway/Genome databases. Our program not only identifies potential candidate sequences for pathway holes, but combines data from multiple, heterogeneous sources to assess the likelihood that a candidate has the required function. Our algorithm emulates the manual sequence annotation process, considering not only evidence from homology searches but also evidence from genomic context (i.e., is the gene part of an operon?) and functional context (e.g., are there functionally related genes nearby in the genome?) to determine the posterior belief that a candidate has the required function. The method can be applied across an entire metabolic pathway network and is generally applicable to any pathway database. The program uses a set of sequences encoding the required activity in other genomes to identify candidate proteins in the genome of interest, and then evaluates each candidate by using a simple Bayes classifier to determine the probability that the candidate has the desired function. We achieved 71% precision at a
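    The final scoring step, a simple Bayes classifier over heterogeneous evidence, can be sketched as follows. This is an illustrative toy under stated assumptions (binary evidence with made-up likelihoods), not the paper's trained model:

    ```python
    def posterior(prior, likelihoods):
        """Naive-Bayes posterior that a candidate has the required function F:
        P(F | e1..en) is proportional to P(F) * prod P(ei | F), normalized
        against the complement hypothesis. `likelihoods` holds
        (P(ei | F), P(ei | not F)) pairs, one per piece of evidence."""
        p_f, p_not = prior, 1.0 - prior
        for p_e_f, p_e_not in likelihoods:
            p_f *= p_e_f
            p_not *= p_e_not
        return p_f / (p_f + p_not)

    # Candidate with a strong homology hit and supportive genomic context
    # (all numbers are hypothetical illustrations):
    evidence = [(0.8, 0.1),   # homology score falls in a "has function" bin
                (0.6, 0.3)]   # gene lies near functionally related genes
    print(round(posterior(0.05, evidence), 3))  # 0.457
    ```

    Even with a low prior, two pieces of concordant evidence lift the belief substantially, which is the intuition behind combining homology with genomic and functional context.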

  4. Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus

    Directory of Open Access Journals (Sweden)

    CHRISTIE GRAF RIBEIRO

    Full Text Available ABSTRACT Objective: to create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of a systematic data collection software named Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE©). Methods: this is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) computerization of this theoretical ophthalmologic database using SINPE©; and (3) interpretation of the information with demonstration of results to validate the protocol. We entered data from the charts of fifty patients with known strabismus through the Electronic Protocol for testing and validation. Results: the new electronic protocol was able to store information regarding patient history, physical examination, laboratory exams, imaging results, diagnosis and treatment of patients with ophthalmologic diseases, with emphasis on strabismus. We included 2,141 items in this master protocol and created 20 new specific electronic protocols for strabismus, each with its own specifics. Validation was achieved through correlation and corroboration of the symptoms and confirmed diagnoses of the fifty included patients with the diagnostic criteria for the twenty new strabismus protocols. Conclusion: a new, validated electronic database focusing on ophthalmology, with emphasis on strabismus, was successfully created through the standardized collection of information, and computerization of the database using proprietary software. This protocol is ready for deployment to facilitate data collection, sorting and application for practitioners and researchers in numerous specialties.

  5. Databases and statistical methods of cohort studies (1979-1995) in Yangjiang, China

    International Nuclear Information System (INIS)

    Sun Quanfu; Zou Jianming; Liu Yusheng

    1997-01-01

    There are epidemiological databases of some 40 MB available for risk analysis, mainly including databases of cohort follow-up and deaths of 12,000 subjects for the periods 1979-1986 and 1987-1995, and a dosimetric database for 6,783 households in 526 hamlets. Because there is no strict one-to-one correspondence between the databases of the two periods 1979-1986 and 1987-1995, the authors developed methods to combine the data of the two periods into one for risk analysis. The first is to set up a theoretical cohort for 1979-1995 based on record linkage between the two periods. The other method is simply to sum up the stratified person-year tables of the different periods. Extensive analysis of the dosimetric data suggests that indoor exposure should be divided further into two parts (exposure received in bed and exposure received during other indoor activities), that outdoor exposure is homogeneous within a hamlet, and that occupancy factors are sex- and age-dependent. Cumulative dose estimates, based upon hamlet-specific averages of dose rates in the bedroom, the living room, and outdoors, and on sex- and age-specific occupancy factors, are derived for each cohort member. Person-years and numbers of deaths are tabulated with stratification by sex, attained age, calendar year, and dose. Cancer risks are analyzed for the period 1979-1990. Conclusion: The epidemiological studies in the high background radiation areas of Yangjiang have been greatly improved by extensive use of a database management system and advanced statistical analysis, with more attention paid to standardization and systematization of survey data management
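The stratified person-year tabulation described above can be sketched as follows. The record layout, age and dose cut-points, and toy cohort rows are assumptions for illustration, not the Yangjiang study's actual strata.

```python
from collections import defaultdict

# Illustrative cohort records: (sex, attained_age, cumulative_dose_mGy,
# person_years, died). Field layout and cut-points are hypothetical.
records = [
    ("M", 52, 18.0, 7.5, 0),
    ("F", 67, 35.0, 6.0, 1),
    ("M", 49, 35.0, 8.0, 0),
    ("F", 52, 18.0, 3.5, 1),
]

AGE_CUTS = (40, 50, 60, 70)      # strata: <40, 40-49, 50-59, 60-69, 70+
DOSE_CUTS = (10.0, 20.0, 30.0)   # cumulative-dose strata in mGy

def stratum(value, cuts):
    """Index of the stratum that `value` falls into."""
    for i, cut in enumerate(cuts):
        if value < cut:
            return i
    return len(cuts)

# key = (sex, age stratum, dose stratum) -> [person_years, deaths]
table = defaultdict(lambda: [0.0, 0])
for sex, age, dose, pyr, died in records:
    key = (sex, stratum(age, AGE_CUTS), stratum(dose, DOSE_CUTS))
    table[key][0] += pyr
    table[key][1] += died
```

Each cell of the resulting table supplies the person-years denominator and deaths numerator needed for stratified rate and risk estimation.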

  6. Database requirements for the Advanced Test Accelerator project

    International Nuclear Information System (INIS)

    Chambers, F.W.

    1984-01-01

    The database requirements for the Advanced Test Accelerator (ATA) project are outlined. ATA is a state-of-the-art electron accelerator capable of producing energetic (50 million electron volt), high current (10,000 ampere), short pulse (70 billionths of a second) beams of electrons for a wide variety of applications. Databasing is required for two applications. First, the description of the configuration of the facility itself requires an extended database. Second, experimental data gathered from the facility must be organized and managed to ensure its full utilization. The two applications are intimately related, since the acquisition and analysis of experimental data require knowledge of the system configuration. This report reviews the needs of the ATA program and the current implementation, intentions, and desires. These database applications have several unique aspects which are of interest and will be highlighted. The features desired in an ultimate database system are outlined. 3 references, 5 figures

  7. Photon and electron interaction databases and their use in medical applications

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1994-05-01

    This paper discusses the All Particle Method photon and electron interaction and atomic relaxation databases that were initially developed for use in medical applications. Currently these databases are being used in both medical and industrial applications. The All Particle Method databases are designed to allow modelling of individual collisions in as much detail as possible. Elastic scattering can be modelled as single, as opposed to multiple, scattering events. Ionization can be modelled at the atomic subshell level, to define which subshell was ionized, the spectrum of the initially emitted electron, as well as the spectra of electrons and photons emitted as the atom relaxes back to neutrality. These databases are currently being used in applications involving rather small spatial regions, where detailed calculations of individual events are required. While initially designed for use in medical applications, these databases are now being used in a variety of industrial applications, e.g., transport in microelectronics

  8. Research on Big Data Attribute Selection Method in Submarine Optical Fiber Network Fault Diagnosis Database

    Directory of Open Access Journals (Sweden)

    Chen Ganlang

    2017-11-01

    Full Text Available At present, in the fault diagnosis database of submarine optical fiber networks, attribute selection for big data is performed by directly detecting the attributes of the data, so the accuracy of the selection cannot be guaranteed. In this paper, a big data attribute selection method based on support vector machines (SVM) for the fault diagnosis database of submarine optical fiber networks is proposed. The method mines the big data in the fault diagnosis database, calculates attribute weights, and classifies attributes according to those weights, thereby completing the attribute selection. Experimental results show that the proposed method can improve the accuracy of big data attribute selection in the fault diagnosis database of submarine optical fiber networks and has high practical value.
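The weight-then-select step described above can be sketched as follows. The paper derives attribute weights from an SVM; here a plain per-attribute class-separation score stands in for the SVM weight so the example stays self-contained, and the toy records are invented.

```python
# Simplified stand-in for SVM-derived attribute weights: score each
# attribute by the gap between its per-class mean values, then keep
# the k highest-weight attributes.

def attribute_weights(samples, labels):
    """Score each attribute by the gap between per-class means."""
    n_attrs = len(samples[0])
    weights = []
    for j in range(n_attrs):
        pos = [s[j] for s, y in zip(samples, labels) if y == 1]
        neg = [s[j] for s, y in zip(samples, labels) if y == 0]
        weights.append(abs(sum(pos) / len(pos) - sum(neg) / len(neg)))
    return weights

def select_attributes(samples, labels, k):
    """Indices of the k highest-weight attributes, in ascending order."""
    w = attribute_weights(samples, labels)
    top = sorted(range(len(w)), key=lambda j: w[j], reverse=True)[:k]
    return sorted(top)

# Toy fault-diagnosis records: attribute 1 separates the classes,
# attributes 0 and 2 do not.
X = [(0.1, 5.0, 1.0), (0.2, 5.1, 1.1), (0.1, 0.9, 1.0), (0.2, 1.0, 1.1)]
y = [1, 1, 0, 0]
kept = select_attributes(X, y, k=1)
```

A real implementation would train a linear SVM and use the magnitude of its weight vector components as the attribute weights.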

  9. A computer-controlled conformal radiotherapy system. IV: Electronic chart

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; McShan, Daniel L.; Matrone, Gwynne M.; Weaver, Tamar A.; Lewis, James D.; Kessler, Marc L.

    1995-01-01

    Purpose: The design and implementation of a system for electronically tracking relevant plan, prescription, and treatment data for computer-controlled conformal radiation therapy is described. Methods and Materials: The electronic charting system is implemented on a computer cluster coupled by high-speed networks to computer-controlled therapy machines. A methodical approach to the specification and design of an integrated solution has been used in developing the system. The electronic chart system is designed to allow identification of and access to patient-specific data, including treatment-planning data, treatment prescription information, and charting of doses. An in-house developed database system is used to provide an integrated approach to the database requirements of the design. A hierarchy of databases is used for both centralization and distribution of the treatment data for specific treatment machines. Results: The basic electronic database system has been implemented and has been in use since July 1993. The system has been used to download and manage treatment data on all patients treated on our first fully computer-controlled treatment machine. To date, electronic dose charting functions have not been fully implemented clinically, requiring the continued use of paper charting for dose tracking. Conclusions: The routine clinical application of complex computer-controlled conformal treatment procedures requires the management of large quantities of information for describing and tracking treatments. An integrated and comprehensive approach to this problem has led to a full electronic chart for conformal radiation therapy treatments

  10. Drafting methods for electrical and electronic design

    International Nuclear Information System (INIS)

    Gungbon, Junchun

    1989-11-01

    This book concentrates on drafting for electrical and electronic design. It covers the meaning of electrical and electronic drafting; JIS standard regulations; the types of drafting, lines, and lettering; basic drafting with projection methods, plan projection, and development elevation; drafting of shop drawings; practical methods of design and drafting; design and drafting of technical illustrations; connection diagrams; wiring diagrams for lighting and illumination; development connection diagrams for sequence control; logic circuit signs for flow charts and manuals; drafting of electronic circuit diagrams; and drawing of PC boards.

  11. A simultaneous electron energy and dosimeter calibration method for an electron beam irradiator

    International Nuclear Information System (INIS)

    Tanaka, R.; Sunaga, H.; Kojima, T.

    1991-01-01

    In radiation processing using electron accelerators, the reproducibility of absorbed dose in the product depends not only on the variation of beam current and conveyor speed, but also on variations of other accelerator parameters. This requires routine monitoring of the beam current and the scan width, and also periodic calibration of routine dosimeters (usually film), of the electron energy, and of other radiation field parameters. The electron energy calibration is especially important for food processing. The dose calibration method using partial absorption calorimeters provides information only about absorbed dose. Measurement of the average electron current density provides basic information about the radiation field formed by the beam scanning and scattering at the beam window, though it does not allow direct dose calibration. The total absorption calorimeter with a thick absorber allows dose and dosimeter calibration, if the depth profile of relative dose in a reference absorber is given experimentally. It also allows accurate calibration of the average electron energy at the surface of the calorimeter core, if the electron fluence received by the calorimeter is measured at the same time. This means that both electron energy and dosimeters can be simultaneously calibrated by irradiation of a combined system including the calorimeter, the detector of the electron current density meter, and a thick reference absorber for depth profile measurement of relative dose. We have developed a simple and multifunctional system using this combined calibration method for 5 MeV electron beams. The paper describes the simultaneous calibration method for electron energy and film dosimeters, and describes the electron current density meter, the total absorption calorimeter, and the characteristics of the method. (author). 13 refs, 7 figs, 3 tabs

  12. Electronic cigarettes: human health effects

    OpenAIRE

    Callahan-Lyon, Priscilla

    2014-01-01

    Objective With the rapid increase in use of electronic nicotine delivery systems (ENDS), such as electronic cigarettes (e-cigarettes), users and non-users are exposed to the aerosol and product constituents. This is a review of published data on the human health effects of exposure to e-cigarettes and their components. Methods Literature searches were conducted through September 2013 using multiple electronic databases. Results Forty-four articles are included in this analysis. E-cigarette ae...

  13. Ionic Liquids for Absorption and Separation of Gases: An Extensive Database and a Systematic Screening Method

    DEFF Research Database (Denmark)

    Zhao, Yongsheng; Gani, Rafiqul; Afzal, Raja Muhammad

    2017-01-01

    requirements remains a challenging task. In this study, an extensive database of estimated Henry's law constants of twelve gases in more than ten thousand ILs at 313.15 K is established using the COSMO-RS method. Based on the database, a new systematic and efficient screening method for IL selection...

  14. ECMDB: The E. coli Metabolome Database

    OpenAIRE

    Guo, An Chi; Jewison, Timothy; Wilson, Michael; Liu, Yifeng; Knox, Craig; Djoumbou, Yannick; Lo, Patrick; Mandal, Rupasri; Krishnamurthy, Ram; Wishart, David S.

    2012-01-01

    The Escherichia coli Metabolome Database (ECMDB, http://www.ecmdb.ca) is a comprehensively annotated metabolomic database containing detailed information about the metabolome of E. coli (K-12). Modelled closely on the Human and Yeast Metabolome Databases, the ECMDB contains >2600 metabolites with links to ~1500 different genes and proteins, including enzymes and transporters. The information in the ECMDB has been collected from dozens of textbooks, journal articles and electronic databases. E...

  15. Requirements and specifications for a particle database

    International Nuclear Information System (INIS)

    2015-01-01

    One of the tasks of WPEC Subgroup 38 (SG38) is to design a database structure for storing the particle information needed for nuclear reaction databases and transport codes. Since the same particle may appear many times in a reaction database (produced by many different reactions on different targets), one of the long-term goals for SG38 is to move towards a central database of particle information to reduce redundancy and ensure consistency among evaluations. The database structure must be general enough to describe all relevant particles and their properties, including mass, charge, spin and parity, half-life, decay properties, and so on. Furthermore, it must be broad enough to handle not only excited nuclear states but also excited atomic states that can de-excite through atomic relaxation. Databases built with this hierarchy will serve as central repositories for particle information that can be linked to from codes and other databases. It is hoped that the final product is general enough for use in other projects besides SG38. While this is called a 'particle database', the definition of a particle (as described in Section 2) is very broad. The database must describe nucleons, nuclei, excited nuclear states (and possibly atomic states) in addition to fundamental particles like photons, electrons, muons, etc. Under this definition the list of possible particles becomes quite large. To help organize them the database will need a way of grouping related particles (e.g., all the isotopes of an element, or all the excited levels of an isotope) together into particle 'groups'. The database will also need a way to classify particles that belong to the same 'family' (such as 'leptons', 'baryons', etc.). Each family of particles may have special requirements as to what properties are required. One important function of the particle database will be to provide an easy way for codes and external databases to look up any particle stored inside. 
In order to make access as
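The particle hierarchy described in this record (particles with properties, grouped into families and groups, held in a central repository) can be sketched as a simple data structure. The field names and layout below are hypothetical; the actual SG38 schema is not reproduced in the abstract.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Particle:
    """One particle record; fields mirror the properties listed above."""
    pid: str                              # unique id, e.g. "e-", "Fe56"
    mass_amu: float
    charge: int
    spin: Optional[float] = None
    parity: Optional[int] = None
    half_life_s: Optional[float] = None   # None = stable
    family: str = "nucleus"               # e.g. "lepton", "baryon", "nucleus"
    group: Optional[str] = None           # e.g. all isotopes of an element

class ParticleDB:
    """Central repository keyed by particle id, with family lookups."""

    def __init__(self):
        self._by_id = {}

    def add(self, p: Particle):
        if p.pid in self._by_id:
            raise ValueError(f"duplicate particle id {p.pid!r}")
        self._by_id[p.pid] = p

    def get(self, pid: str) -> Particle:
        return self._by_id[pid]

    def in_family(self, family: str):
        return [p for p in self._by_id.values() if p.family == family]

db = ParticleDB()
db.add(Particle("e-", 5.485799e-4, -1, spin=0.5, family="lepton"))
db.add(Particle("Fe56", 55.9349375, 26, group="Fe"))
```

The single keyed repository gives codes and external databases one lookup path per particle, which is the consistency goal the record describes.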

  16. Integration of first-principles methods and crystallographic database searches for new ferroelectrics: Strategies and explorations

    International Nuclear Information System (INIS)

    Bennett, Joseph W.; Rabe, Karin M.

    2012-01-01

    In this concept paper, the development of strategies for the integration of first-principles methods with crystallographic database mining for the discovery and design of novel ferroelectric materials is discussed, drawing on the results and experience derived from exploratory investigations on three different systems: (1) the double perovskite Sr(Sb1/2Mn1/2)O3 as a candidate semiconducting ferroelectric; (2) polar derivatives of schafarzikite MSb2O4; and (3) ferroelectric semiconductors with formula M2P2(S,Se)6. A variety of avenues for further research and investigation are suggested, including automated structure type classification, low-symmetry improper ferroelectrics, and high-throughput first-principles searches for additional representatives of structural families with desirable functional properties. - Graphical abstract: Integration of first-principles methods with crystallographic database mining, for the discovery and design of novel ferroelectric materials, could potentially lead to new classes of multifunctional materials. Highlights: ► Integration of first-principles methods and database mining. ► Minor structural families with desirable functional properties. ► Survey of polar entries in the Inorganic Crystal Structure Database.

  17. Data-Based Decision-Making: Developing a Method for Capturing Teachers' Understanding of CBM Graphs

    Science.gov (United States)

    Espin, Christine A.; Wayman, Miya Miura; Deno, Stanley L.; McMaster, Kristen L.; de Rooij, Mark

    2017-01-01

    In this special issue, we explore the decision-making aspect of "data-based decision-making". The articles in the issue address a wide range of research questions, designs, methods, and analyses, but all focus on data-based decision-making for students with learning difficulties. In this first article, we introduce the topic of…

  18. Distance correlation methods for discovering associations in large astrophysical databases

    International Nuclear Information System (INIS)

    Martínez-Gómez, Elizabeth; Richards, Mercedes T.; Richards, Donald St. P.

    2014-01-01

    High-dimensional, large-sample astrophysical databases of galaxy clusters, such as the Chandra Deep Field South COMBO-17 database, provide measurements on many variables for thousands of galaxies and a range of redshifts. Current understanding of galaxy formation and evolution rests sensitively on relationships between different astrophysical variables; hence an ability to detect and verify associations or correlations between variables is important in astrophysical research. In this paper, we apply a recently defined statistical measure called the distance correlation coefficient, which can be used to identify new associations and correlations between astrophysical variables. The distance correlation coefficient applies to variables of any dimension, can be used to determine smaller sets of variables that provide equivalent astrophysical information, is zero only when variables are independent, and is capable of detecting nonlinear associations that are undetectable by the classical Pearson correlation coefficient. Hence, the distance correlation coefficient provides more information than the Pearson coefficient. We analyze numerous pairs of variables in the COMBO-17 database with the distance correlation method and with the maximal information coefficient. We show that the Pearson coefficient can be estimated with higher accuracy from the corresponding distance correlation coefficient than from the maximal information coefficient. For given values of the Pearson coefficient, the distance correlation method has a greater ability than the maximal information coefficient to resolve astrophysical data into highly concentrated horseshoe- or V-shapes, which enhances classification and pattern identification. These results are observed over a range of redshifts beyond the local universe and for galaxies from elliptical to spiral.
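The distance correlation coefficient discussed above can be estimated from a sample with a few lines of code. A minimal pure-Python sketch for scalar variables follows (production analyses would use vectorized implementations); the quadratic example shows it detecting a nonlinear association that Pearson's r misses.

```python
def _dist_matrix(xs):
    """Pairwise absolute-difference (distance) matrix."""
    return [[abs(a - b) for b in xs] for a in xs]

def _double_center(d):
    """Subtract row and column means, add back the grand mean."""
    n = len(d)
    row = [sum(r) / n for r in d]
    col = [sum(d[i][j] for i in range(n)) / n for j in range(n)]
    grand = sum(row) / n
    return [[d[i][j] - row[i] - col[j] + grand for j in range(n)]
            for i in range(n)]

def distance_correlation(xs, ys):
    """Sample distance correlation in [0, 1]; detects nonlinear association."""
    n = len(xs)
    a = _double_center(_dist_matrix(xs))
    b = _double_center(_dist_matrix(ys))
    dcov2 = sum(a[i][j] * b[i][j] for i in range(n) for j in range(n)) / n ** 2
    dvar_x = sum(v * v for r in a for v in r) / n ** 2
    dvar_y = sum(v * v for r in b for v in r) / n ** 2
    if dvar_x == 0.0 or dvar_y == 0.0:
        return 0.0
    return (dcov2 / (dvar_x * dvar_y) ** 0.5) ** 0.5

# y = x**2 on symmetric x: Pearson's r is exactly 0, yet the variables
# are clearly dependent; distance correlation is strictly positive here.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
dcor = distance_correlation(xs, [x * x for x in xs])
```

For an exactly linear relationship the two distance matrices are proportional and the coefficient equals 1, matching the property that it reaches its extremes only under full dependence.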

  19. Comparison of Electronic Data Capture (EDC) with the Standard Data Capture Method for Clinical Trial Data

    Science.gov (United States)

    Walther, Brigitte; Hossin, Safayet; Townend, John; Abernethy, Neil; Parker, David; Jeffries, David

    2011-01-01

    Background Traditionally, clinical research studies rely on collecting data with case report forms, which are subsequently entered into a database to create electronic records. Although well established, this method is time-consuming and error-prone. This study compares four electronic data capture (EDC) methods with the conventional approach with respect to duration of data capture and accuracy. It was performed in a West African setting, where clinical trials involve data collection from urban, rural and often remote locations. Methodology/Principal Findings Three types of commonly available EDC tools were assessed in face-to-face interviews; netbook, PDA, and tablet PC. EDC performance during telephone interviews via mobile phone was evaluated as a fourth method. The Graeco Latin square study design allowed comparison of all four methods to standard paper-based recording followed by data double entry while controlling simultaneously for possible confounding factors such as interview order, interviewer and interviewee. Over a study period of three weeks the error rates decreased considerably for all EDC methods. In the last week of the study the data accuracy for the netbook (5.1%, CI95%: 3.5–7.2%) and the tablet PC (5.2%, CI95%: 3.7–7.4%) was not significantly different from the accuracy of the conventional paper-based method (3.6%, CI95%: 2.2–5.5%), but error rates for the PDA (7.9%, CI95%: 6.0–10.5%) and telephone (6.3%, CI95% 4.6–8.6%) remained significantly higher. While EDC-interviews take slightly longer, data become readily available after download, making EDC more time effective. Free text and date fields were associated with higher error rates than numerical, single select and skip fields. Conclusions EDC solutions have the potential to produce similar data accuracy compared to paper-based methods. Given the considerable reduction in the time from data collection to database lock, EDC holds the promise to reduce research-associated costs
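The error rates above are reported with 95% confidence intervals. One common way to compute such an interval for a proportion is the Wilson score interval, sketched below; the abstract does not state which interval method the study actually used, so this is an assumption for illustration.

```python
import math

def wilson_interval(errors, n, z=1.96):
    """Wilson score interval for an error proportion errors/n.

    z = 1.96 gives an approximate 95% interval. Returns (low, high).
    """
    p = errors / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# e.g. 51 erroneous fields out of 1000 checked (~5.1% error rate)
low, high = wilson_interval(51, 1000)
```

Unlike the simple normal approximation, the Wilson interval behaves sensibly for small counts and proportions near 0 or 1.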

  20. Comparison of electronic data capture (EDC) with the standard data capture method for clinical trial data.

    Directory of Open Access Journals (Sweden)

    Brigitte Walther

    Full Text Available BACKGROUND: Traditionally, clinical research studies rely on collecting data with case report forms, which are subsequently entered into a database to create electronic records. Although well established, this method is time-consuming and error-prone. This study compares four electronic data capture (EDC) methods with the conventional approach with respect to duration of data capture and accuracy. It was performed in a West African setting, where clinical trials involve data collection from urban, rural and often remote locations. METHODOLOGY/PRINCIPAL FINDINGS: Three types of commonly available EDC tools were assessed in face-to-face interviews; netbook, PDA, and tablet PC. EDC performance during telephone interviews via mobile phone was evaluated as a fourth method. The Graeco Latin square study design allowed comparison of all four methods to standard paper-based recording followed by data double entry while controlling simultaneously for possible confounding factors such as interview order, interviewer and interviewee. Over a study period of three weeks the error rates decreased considerably for all EDC methods. In the last week of the study the data accuracy for the netbook (5.1%, CI95%: 3.5-7.2%) and the tablet PC (5.2%, CI95%: 3.7-7.4%) was not significantly different from the accuracy of the conventional paper-based method (3.6%, CI95%: 2.2-5.5%), but error rates for the PDA (7.9%, CI95%: 6.0-10.5%) and telephone (6.3%, CI95%: 4.6-8.6%) remained significantly higher. While EDC-interviews take slightly longer, data become readily available after download, making EDC more time effective. Free text and date fields were associated with higher error rates than numerical, single select and skip fields. CONCLUSIONS: EDC solutions have the potential to produce similar data accuracy compared to paper-based methods. Given the considerable reduction in the time from data collection to database lock, EDC holds the promise to reduce research

  1. Non-animal methods to predict skin sensitization (I): the Cosmetics Europe database.

    Science.gov (United States)

    Hoffmann, Sebastian; Kleinstreuer, Nicole; Alépée, Nathalie; Allen, David; Api, Anne Marie; Ashikaga, Takao; Clouet, Elodie; Cluzel, Magalie; Desprez, Bertrand; Gellatly, Nichola; Goebel, Carsten; Kern, Petra S; Klaric, Martina; Kühnl, Jochen; Lalko, Jon F; Martinozzi-Teissier, Silvia; Mewes, Karsten; Miyazawa, Masaaki; Parakhia, Rahul; van Vliet, Erwin; Zang, Qingda; Petersohn, Dirk

    2018-05-01

    Cosmetics Europe, the European Trade Association for the cosmetics and personal care industry, is conducting a multi-phase program to develop regulatory-accepted, animal-free testing strategies enabling the cosmetics industry to conduct safety assessments. Based on a systematic evaluation of test methods for skin sensitization, five non-animal test methods (DPRA (Direct Peptide Reactivity Assay), KeratinoSens™, h-CLAT (human cell line activation test), U-SENS™, SENS-IS) were selected for inclusion in a comprehensive database of 128 substances. Existing data were compiled and completed with newly generated data, the latter amounting to one-third of all data. The database was complemented with human and local lymph node assay (LLNA) reference data, physicochemical properties and use categories, and thoroughly curated. Although focused on the availability of human data, the substance selection nevertheless resulted in a high diversity of chemistries in terms of physico-chemical property ranges and use categories. Predictivities of skin sensitization potential and potency, where applicable, were calculated for the LLNA as compared to human data and for the individual test methods compared to both human and LLNA reference data. In addition, various aspects of applicability of the test methods were analyzed. Due to its high level of curation, comprehensiveness, and completeness, we propose our database as a point of reference for the evaluation and development of testing strategies, as done for example in the associated work of Kleinstreuer et al. We encourage the community to use it to meet the challenge of conducting skin sensitization safety assessment without generating new animal data.

  2. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by the chi-square distribution for image classification. Two parallel systems are developed, based on smart phone and web applications, for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification, and a class-specific threshold controls the security level of the face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
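The LBP texture features mentioned above are typically collected into a histogram and compared with a chi-square distance. The sketch below is a basic 8-neighbour LBP variant on a plain list-of-lists grayscale image; the neighbourhood, histogram size, and matching rule are common defaults, not necessarily the paper's exact configuration.

```python
def lbp_code(patch, r, c):
    """8-neighbour local binary pattern code for pixel (r, c)."""
    center = patch[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if patch[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(patch):
    """Normalised 256-bin histogram of LBP codes over the patch interior."""
    hist = [0] * 256
    rows, cols = len(patch), len(patch[0])
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            hist[lbp_code(patch, r, c)] += 1
    total = sum(hist)
    return [h / total for h in hist]

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between histograms (smaller = more similar)."""
    return sum((a - b) ** 2 / (a + b + eps) for a, b in zip(h1, h2))

# Tiny illustrative patches; real inputs would be face-image regions.
h1 = lbp_histogram([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
h2 = lbp_histogram([[9, 8, 7], [6, 5, 4], [3, 2, 1]])
```

Verification then reduces to accepting a probe face when its chi-square distance to the enrolled histogram falls below the class-specific threshold the record describes.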

  3. A development and integration of the concentration database for relative method, k0 method and absolute method in instrumental neutron activation analysis using Microsoft Access

    International Nuclear Information System (INIS)

    Hoh Siew Sin

    2012-01-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the concentration of an element in a sample at the National University of Malaysia, especially by students of the Nuclear Science Program. The lack of a database service leads users to take a longer time to calculate the concentration of an element in a sample, because they depend on software developed by foreign researchers, which is costly. To overcome this problem, a study has been carried out to build INAA database software. The objective of this study is to build database software that helps users of INAA with the Relative Method and the Absolute Method for calculating element concentrations in samples, using Microsoft Excel 2010 and Microsoft Access 2010. The study also integrates k0 data, k0 Concent and k0-Westcott to execute and complete the system. After the integration, a study was conducted to test the effectiveness of the database software by comparing the concentrations from the experiments with those in the database. Triple Bare Monitors Zr-Au and Cr-Mo-Au were used in Abs-INAA as monitors to determine the thermal to epithermal neutron flux ratio (f). Calculations involved in determining the concentration are the net peak area (Np), the measurement time (tm), the irradiation time (tirr), the k-factor (k), the thermal to epithermal neutron flux ratio (f), the parameter of the epithermal neutron flux distribution (α) and the detection efficiency (εp). For the Com-INAA database, the reference material IAEA-375 Soil was used to calculate the concentration of elements in the sample. CRMs and SRMs are also used in this database. After the INAA database integration, a verification process was carried out to examine the effectiveness of Abs-INAA by comparing sample concentrations between the database and the experiment. The experimental concentration values from the INAA database software showed high accuracy and precision. ICC
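At its core, the relative method compares the specific count rate of the sample with that of a co-irradiated standard of known concentration. A simplified sketch follows, assuming identical irradiation, geometry and counting conditions for sample and standard; real analyses add further corrections (involving f, α, εp, etc.), and the numbers here are invented.

```python
import math

def relative_concentration(np_sam, m_sam, np_std, m_std, c_std,
                           lam=0.0, t_decay_sam=0.0, t_decay_std=0.0):
    """Relative-method INAA concentration estimate (simplified).

    np_*: net peak areas; m_*: sample/standard masses (g);
    c_std: known concentration in the standard (e.g. ppm);
    lam: decay constant (1/s) and t_decay_*: decay times, used to
    correct both counts back to the end of irradiation.
    """
    a_sam = (np_sam / m_sam) * math.exp(lam * t_decay_sam)
    a_std = (np_std / m_std) * math.exp(lam * t_decay_std)
    return c_std * a_sam / a_std

# Hypothetical peak areas and masses, equal decay times (corrections cancel).
c = relative_concentration(np_sam=12000, m_sam=0.5,
                           np_std=20000, m_std=0.4, c_std=10.0)
```

Because most nuclear parameters cancel in the ratio, the relative method needs no k-factors or flux parameters, which is why it is the simpler of the two methods the database supports.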

  4. The synthesis method for design of electron flow sources

    Science.gov (United States)

    Alexahin, Yu I.; Molodozhenzev, A. Yu

    1997-01-01

    The synthesis method to design a relativistic magnetically-focused beam source is described in this paper. It allows one to find the shape of electrodes necessary to produce laminar space-charge flows. Electron guns with shielded cathodes designed with this method were analyzed using the EGUN code. The results obtained have shown the agreement of the synthesis and analysis calculations [1]. This method of electron gun calculation may be applied to immersed electron flows, which are of interest for the EBIS electron gun design.

  5. A method to add richness to the National Landslide Database of Great Britain

    Science.gov (United States)

    Taylor, Faith; Freeborough, Katy; Malamud, Bruce; Demeritt, David

    2014-05-01

    Landslides in Great Britain (GB) pose a risk to infrastructure, property and livelihoods. Our understanding of where landslide hazard and impact will be greatest is based on our knowledge of past events. Here, we present a method to supplement existing records of landslides in GB by searching electronic archives of local and regional newspapers. In Great Britain, the British Geological Survey (BGS) are responsible for updating and maintaining records of GB landslide events and their impacts in the National Landslide Database (NLD). The NLD contains records of approximately 16,500 landslide events in Great Britain. Data sources for the NLD include field surveys, academic articles, grey literature, news, public reports and, since 2012, social media. Here we aim to supplement the richness of the NLD by (i) identifying additional landslide events and (ii) adding more detail to existing database entries. This is done by systematically searching the LexisNexis digital archive of 568 local and regional newspapers published in the UK. The first step in the methodology was to construct Boolean search criteria that optimised the balance between minimising the number of irrelevant articles (e.g. "a landslide victory") and maximising those referring to landslide events. This keyword search was then applied to the LexisNexis archive of newspapers for all articles published between 1 January and 31 December 2012, resulting in 1,668 articles. These articles were assessed to determine whether they related to a landslide event. Of the 1,668 articles, approximately 30% (~700) referred to landslide events, with others referring to landslides more generally or themes unrelated to landslides. Examples of information obtained from newspaper articles included: date/time of landslide occurrence, spatial location, size, impact, landslide type and triggering mechanism, although the amount of detail and precision attainable from individual articles was variable. 
Of the 700 articles found for
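The article-filtering step described in this record (keeping landslide-event stories while discarding figurative uses such as "a landslide victory") can be sketched as a simple Boolean keyword rule. The inclusion and exclusion terms below are illustrative stand-ins, not the study's actual LexisNexis search string.

```python
import re

# Illustrative inclusion and exclusion terms; the study's real Boolean
# search criteria are not reproduced in the abstract.
INCLUDE = re.compile(r"\b(landslide|landslip|mudslide|rockfall)\b", re.I)
EXCLUDE = re.compile(r"\blandslide (victory|win|defeat|majority)\b", re.I)

def is_candidate(article_text: str) -> bool:
    """Return True if the article likely refers to a landslide event."""
    return bool(INCLUDE.search(article_text)) and not EXCLUDE.search(article_text)

articles = [
    "A landslip closed the A83 near Arrochar after heavy rain.",
    "The MP swept to a landslide victory in the by-election.",
]
flags = [is_candidate(a) for a in articles]
```

Each candidate article would still need manual review, as the abstract notes that only about 30% of keyword matches referred to actual landslide events.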

  6. Development status of component reliability database for Korean NPPs and a case study

    International Nuclear Information System (INIS)

    Choi, S. Y.; Yang, S. H.; Lee, S. C.; Kim, S. H.; Han, S. H.

    1999-01-01

We have applied a generic database to the PSA (Probabilistic Safety Assessment) of the Korean Standard NPPs (Nuclear Power Plants), since no plant-specific component reliability database exists. However, generic data, collected from foreign plants, cannot adequately reflect the specific characteristics of domestic plants. We are therefore developing a plant-specific component reliability database for domestic NPPs. In this paper, we describe the development status of the component reliability database and our approach to data collection and component failure analysis, and we summarize a case study of component failure analysis. We first collect failure and repair data from the TR (Trouble Report) electronic database and the daily operation report sheets. We have since added a data collection step that checks the original TR sheets to improve data quality. We have entered the failure and repair data for the principal components of about 30 systems into the component reliability database, and we are now analyzing the component failure data of 11 safety systems among them to calculate component failure rates, unavailability and other reliability parameters

  7. Efficient electronic structure methods applied to metal nanoparticles

    DEFF Research Database (Denmark)

    Larsen, Ask Hjorth

of efficient approaches to density functional theory and the application of these methods to metal nanoparticles. We describe the formalism and implementation of localized atom-centered basis sets within the projector augmented wave method. Basis sets allow for a dramatic increase in performance compared....... The basis set method is used to study the electronic effects for the contiguous range of clusters up to several hundred atoms. The s-electrons hybridize to form electronic shells consistent with the jellium model, leading to electronic magic numbers for clusters with full shells. Large electronic gaps...... and jumps in Fermi level near magic numbers can lead to alkali-like or halogen-like behaviour when main-group atoms adsorb onto gold clusters. A non-self-consistent Newns-Anderson model is used to more closely study the chemisorption of main-group atoms on magic-number Au clusters. The behaviour at magic...

  8. 14 CFR 1260.69 - Electronic funds transfer payment methods.

    Science.gov (United States)

    2010-01-01

    ... Government by electronic funds transfer through the Treasury Fedline Payment System (FEDLINE) or the... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Electronic funds transfer payment methods... COOPERATIVE AGREEMENTS General Special Conditions § 1260.69 Electronic funds transfer payment methods...

  9. Database of Small Molecule Thermochemistry for Combustion

    KAUST Repository

    Goldsmith, C. Franklin; Magoon, Gregory R.; Green, William H.

    2012-01-01

    High-accuracy ab initio thermochemistry is presented for 219 small molecules relevant in combustion chemistry, including many radical, biradical, and triplet species. These values are critical for accurate kinetic modeling. The RQCISD(T)/cc-PV∞QZ//B3LYP/6-311++G(d,p) method was used to compute the electronic energies. A bond additivity correction for this method has been developed to remove systematic errors in the enthalpy calculations, using the Active Thermochemical Tables as reference values. On the basis of comparison with the benchmark data, the 3σ uncertainty in the standard-state heat of formation is 0.9 kcal/mol, or within chemical accuracy. An uncertainty analysis is presented for the entropy and heat capacity. In many cases, the present values are the most accurate and comprehensive numbers available. The present work is compared to several published databases. In some cases, there are large discrepancies and errors in published databases; the present work helps to resolve these problems. © 2012 American Chemical Society.
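The bond additivity correction described in this abstract can be illustrated with a minimal sketch: per-bond correction terms, fitted against reference data such as the Active Thermochemical Tables, are added to the raw computed enthalpy to remove systematic method error. The bond types and correction values below are invented for illustration and are not the paper's fitted parameters.

```python
# Hypothetical per-bond corrections in kcal/mol (NOT the published values).
BAC_KCAL_MOL = {"C-H": 0.05, "C-C": -0.10, "C=O": 0.30}

def corrected_hf(raw_hf_kcal: float, bonds: dict) -> float:
    """Apply a bond additivity correction:
    H_f(corrected) = H_f(raw) + sum over bond types of n_i * bac_i."""
    return raw_hf_kcal + sum(n * BAC_KCAL_MOL[b] for b, n in bonds.items())

# Formaldehyde (CH2O): two C-H bonds and one C=O bond; raw value invented.
hf = corrected_hf(-26.0, {"C-H": 2, "C=O": 1})
```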

  10. Database of Small Molecule Thermochemistry for Combustion

    KAUST Repository

    Goldsmith, C. Franklin

    2012-09-13

    High-accuracy ab initio thermochemistry is presented for 219 small molecules relevant in combustion chemistry, including many radical, biradical, and triplet species. These values are critical for accurate kinetic modeling. The RQCISD(T)/cc-PV∞QZ//B3LYP/6-311++G(d,p) method was used to compute the electronic energies. A bond additivity correction for this method has been developed to remove systematic errors in the enthalpy calculations, using the Active Thermochemical Tables as reference values. On the basis of comparison with the benchmark data, the 3σ uncertainty in the standard-state heat of formation is 0.9 kcal/mol, or within chemical accuracy. An uncertainty analysis is presented for the entropy and heat capacity. In many cases, the present values are the most accurate and comprehensive numbers available. The present work is compared to several published databases. In some cases, there are large discrepancies and errors in published databases; the present work helps to resolve these problems. © 2012 American Chemical Society.

  11. Database Dump - fRNAdb | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

Full Text Available List Contact us fRNAdb Database Dump Data detail Data name Database Dump DOI 10.18908/lsdba.nbdc00452-002 De... data (tab-separated text) Data file File name: Database_Dump File URL: ftp://ftp....biosciencedbc.jp/archive/frnadb/LATEST/Database_Dump File size: 673 MB Simple search URL - Data acquisition...s. Data analysis method - Number of data entries 4 files - About This Database Database Description Download... License Update History of This Database Site Policy | Contact Us Database Dump - fRNAdb | LSDB Archive ...

  12. Variational methods in electron-atom scattering theory

    CERN Document Server

    Nesbet, Robert K

    1980-01-01

The investigation of scattering phenomena is a major theme of modern physics. A scattered particle provides a dynamical probe of the target system. The practical problem of interest here is the scattering of a low-energy electron by an N-electron atom. It has been difficult in this area of study to achieve theoretical results that are even qualitatively correct, yet quantitative accuracy is often needed as an adjunct to experiment. The present book describes a quantitative theoretical method, or class of methods, that has been applied effectively to this problem. Quantum mechanical theory relevant to the scattering of an electron by an N-electron atom, which may gain or lose energy in the process, is summarized in Chapter 1. The variational theory itself is presented in Chapter 2, both as currently used and in forms that may facilitate future applications. The theory of multichannel resonance and threshold effects, which provide a rich structure to observed electron-atom scattering data, is presented in Cha...

  13. Governance and oversight of researcher access to electronic health data: the role of the Independent Scientific Advisory Committee for MHRA database research, 2006-2015.

    Science.gov (United States)

    Waller, P; Cassell, J A; Saunders, M H; Stevens, R

    2017-03-01

    In order to promote understanding of UK governance and assurance relating to electronic health records research, we present and discuss the role of the Independent Scientific Advisory Committee (ISAC) for MHRA database research in evaluating protocols proposing the use of the Clinical Practice Research Datalink. We describe the development of the Committee's activities between 2006 and 2015, alongside growth in data linkage and wider national electronic health records programmes, including the application and assessment processes, and our approach to undertaking this work. Our model can provide independence, challenge and support to data providers such as the Clinical Practice Research Datalink database which has been used for well over 1,000 medical research projects. ISAC's role in scientific oversight ensures feasible and scientifically acceptable plans are in place, while having both lay and professional membership addresses governance issues in order to protect the integrity of the database and ensure that public confidence is maintained.

  14. System for cooling hybrid vehicle electronics, method for cooling hybrid vehicle electronics

    Science.gov (United States)

    France, David M.; Yu, Wenhua; Singh, Dileep; Zhao, Weihuan

    2017-11-21

    The invention provides a single radiator cooling system for use in hybrid electric vehicles, the system comprising a surface in thermal communication with electronics, and subcooled boiling fluid contacting the surface. The invention also provides a single radiator method for simultaneously cooling electronics and an internal combustion engine in a hybrid electric vehicle, the method comprising separating a coolant fluid into a first portion and a second portion; directing the first portion to the electronics and the second portion to the internal combustion engine for a time sufficient to maintain the temperature of the electronics at or below 175.degree. C.; combining the first and second portion to reestablish the coolant fluid; and treating the reestablished coolant fluid to the single radiator for a time sufficient to decrease the temperature of the reestablished coolant fluid to the temperature it had before separation.

  15. The European Fusion Material properties database

    Energy Technology Data Exchange (ETDEWEB)

    Karditsas, P.J. [UKAEA Fusion, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)]. E-mail: panos.karditsas@ukaea.org.uk; Lloyd, G. [Tessella Support Services plc, 3 Vineyard Chambers, Abingdon OX14 3PX (United Kingdom); Walters, M. [Tessella Support Services plc, 3 Vineyard Chambers, Abingdon OX14 3PX (United Kingdom); Peacock, A. [EFDA Close Support Unit, Garching D-85748 (Germany)

    2006-02-15

Materials research represents a significant part of the European and world effort on fusion research. A European Fusion Materials web-based relational database is being developed to collect, expand and preserve for the future the data produced in support of NET, DEMO and ITER. The database supports understanding of material properties and their critical parameters for fusion environments. The system uses J2EE technologies and the PostgreSQL relational database, and its flexibility ensures that new methods to automate material design for specific applications can be easily implemented. It runs on a web server and allows users access via the Internet using their preferred web browser. The database allows users to store, browse and search raw tests, material properties and qualified data, and electronic reports. For data security, users are issued with individual accounts, and the origin of all requests is checked against a list of trusted sites. Different user accounts have access to different datasets to ensure the data is not shared unintentionally. The system allows several levels of data checking/cleaning and validation. Data insertion is either online or through downloaded templates, and validation is through different expert groups, which can apply different criteria to the data.

  16. Online drug databases: a new method to assess and compare inclusion of clinically relevant information.

    Science.gov (United States)

    Silva, Cristina; Fresco, Paula; Monteiro, Joaquim; Rama, Ana Cristina Ribeiro

    2013-08-01

Evidence-Based Practice requires health care decisions to be based on the best available evidence. The model "Information Mastery" proposes that clinicians should use sources of information that have previously evaluated relevance and validity, provided at the point of care. Drug databases (DB) allow easy and fast access to information and have the benefit of more frequent content updates. Relevant information, in the context of drug therapy, is that which supports safe and effective use of medicines. Accordingly, the European Guideline on the Summary of Product Characteristics (EG-SmPC) was used as a standard to evaluate the inclusion of relevant information contents in DB. To develop and test a method to evaluate relevancy of DB contents, by assessing the inclusion of information items deemed relevant for effective and safe drug use. Hierarchical organisation and selection of the principles defined in the EG-SmPC; definition of criteria to assess inclusion of selected information items; creation of a categorisation and quantification system that allows score calculation; calculation of relative differences (RD) of scores for comparison with an "ideal" database, defined as the one that achieves the best quantification possible for each of the information items; pilot test on a sample of 9 drug databases, using 10 drugs frequently associated in literature with morbidity-mortality and also being widely consumed in Portugal. Main outcome measure: calculation of individual and global scores for clinically relevant information items of drug monographs in databases, using the categorisation and quantification system created. A--Method development: selection of sections, subsections, relevant information items and corresponding requisites; system to categorise and quantify their inclusion; score and RD calculation procedure. 
B--Pilot test: calculated scores for the 9 databases; globally, all databases evaluated significantly differed from the "ideal" database; some DB performed
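The score and relative-difference (RD) idea can be sketched under a simple additive assumption: each information item has a maximum attainable quantification, the "ideal" database attains that maximum on every item, and a real database's total is compared against it. The item names, maxima and RD formula below are illustrative, since the abstract does not give the exact scheme.

```python
# Hypothetical information items and their maximum quantification scores.
ITEM_MAX = {"indications": 2, "contraindications": 2, "interactions": 2, "dosage": 2}

def relative_difference(db_scores: dict) -> float:
    """Fraction by which a database falls short of the 'ideal' database,
    which attains the maximum score on every information item."""
    ideal = sum(ITEM_MAX.values())
    total = sum(db_scores.get(item, 0) for item in ITEM_MAX)
    return (ideal - total) / ideal  # 0.0 means equal to the ideal database

rd = relative_difference(
    {"indications": 2, "contraindications": 1, "interactions": 2, "dosage": 1}
)
```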

  17. Electron microscopy methods in studies of cultural heritage sites

    Science.gov (United States)

    Vasiliev, A. L.; Kovalchuk, M. V.; Yatsishina, E. B.

    2016-11-01

    The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient "nanotechnologies"; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.

  18. Clinical Databases Originating in Electronic Patient Records

    Czech Academy of Sciences Publication Activity Database

    Zvárová, Jana

    2002-01-01

Roč. 22, č. 1 (2002), s. 43-60 ISSN 0208-5216 R&D Projects: GA MŠk LN00B107 Keywords : medical informatics * telemedicine * electronic health record * electronic medical guidelines * decision-support systems * cardiology Subject RIV: BD - Theory of Information

  19. Electron microscopy methods in studies of cultural heritage sites

    Energy Technology Data Exchange (ETDEWEB)

    Vasiliev, A. L., E-mail: a.vasiliev56@gmail.com; Kovalchuk, M. V.; Yatsishina, E. B. [National Research Centre “Kurchatov Institute” (Russian Federation)

    2016-11-15

    The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient “nanotechnologies”; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.

  20. Electron microscopy methods in studies of cultural heritage sites

    International Nuclear Information System (INIS)

    Vasiliev, A. L.; Kovalchuk, M. V.; Yatsishina, E. B.

    2016-01-01

    The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient “nanotechnologies”; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.

  1. Computational methods of electron/photon transport

    International Nuclear Information System (INIS)

    Mack, J.M.

    1983-01-01

A review of computational methods simulating the non-plasma transport of electrons and their attendant cascades is presented. Remarks are mainly restricted to linearized formalisms at electron energies above 1 keV. The effectiveness of various methods is discussed, including moments, point-kernel, invariant imbedding, discrete-ordinates, and Monte Carlo. Future research directions and the potential impact on various aspects of science and engineering are indicated

  2. Attenuation relation for strong motion in Eastern Java based on appropriate database and method

    Science.gov (United States)

    Mahendra, Rian; Rohadi, Supriyanto; Rudyanto, Ariska

    2017-07-01

The selection and determination of an attenuation relation has become important for seismic hazard assessment in active seismic regions. This research first constructs an appropriate strong motion database, including site conditions and earthquake type. The data set consisted of a large number of earthquakes with 5 ≤ Mw ≤ 9 and distances less than 500 km that occurred around Java between 2009 and 2016. The earthquake locations and depths were relocated using the double-difference method to improve the quality of the database. Strong motion data from twelve BMKG accelerographs located in East Java are used. Site conditions are characterized using the dominant period and Vs30. The earthquakes are classified into crustal, interface, and intraslab events based on slab geometry analysis. A total of 10 Ground Motion Prediction Equations (GMPEs) are tested against the database using the Likelihood method (Scherbaum et al., 2004) and the Euclidean Distance Ranking method (Kale and Akkar, 2012). The evaluation leads to a set of GMPEs that can be applied for seismic hazard in East Java, where the strong motion data were collected. Because these methods still showed high deviations for the GMPEs, some GMPEs were modified using an inversion method. Validation was performed by analysing the attenuation curve of the selected GMPE against observation data from 2015 to 2016. The results show that the selected GMPE is suitable for estimating PGA values in East Java.
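As a rough illustration of ranking candidate GMPEs against observations, the sketch below scores each model by the Euclidean distance between observed and predicted ground motions. This is a simplification: the published Euclidean Distance Ranking method of Kale and Akkar also accounts for model uncertainty (sigma), which is omitted here, and all numbers are invented.

```python
import math

# Hypothetical PGA observations (in g) at three stations.
observed = [0.12, 0.30, 0.08]

def euclidean_distance(predicted: list) -> float:
    """Distance between the observed and predicted ground-motion vectors."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)))

# Two hypothetical candidate GMPEs with their predictions at the same stations.
gmpes = {"GMPE-A": [0.10, 0.28, 0.09], "GMPE-B": [0.20, 0.15, 0.02]}

# Rank models from best (smallest distance) to worst.
ranked = sorted(gmpes, key=lambda name: euclidean_distance(gmpes[name]))
```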

  3. SINBAD: Shielding integral benchmark archive and database

    International Nuclear Information System (INIS)

    Hunter, H.T.; Ingersoll, D.T.; Roussin, R.W.

    1996-01-01

    SINBAD is a new electronic database developed to store a variety of radiation shielding benchmark data so that users can easily retrieve and incorporate the data into their calculations. SINBAD is an excellent data source for users who require the quality assurance necessary in developing cross-section libraries or radiation transport codes. The future needs of the scientific community are best served by the electronic database format of SINBAD and its user-friendly interface, combined with its data accuracy and integrity

  4. Method of fabricating a cooled electronic system

    Science.gov (United States)

    Chainer, Timothy J; Gaynes, Michael A; Graybill, David P; Iyengar, Madhusudan K; Kamath, Vinod; Kochuparambil, Bejoy J; Schmidt, Roger R; Schultz, Mark D; Simco, Daniel P; Steinke, Mark E

    2014-02-11

    A method of fabricating a liquid-cooled electronic system is provided which includes an electronic assembly having an electronics card and a socket with a latch at one end. The latch facilitates securing of the card within the socket. The method includes providing a liquid-cooled cold rail at the one end of the socket, and a thermal spreader to couple the electronics card to the cold rail. The thermal spreader includes first and second thermal transfer plates coupled to first and second surfaces on opposite sides of the card, and thermally conductive extensions extending from end edges of the plates, which couple the respective transfer plates to the liquid-cooled cold rail. The extensions are disposed to the sides of the latch, and the card is securable within or removable from the socket using the latch without removing the cold rail or the thermal spreader.

  5. An electron moiré method for a common SEM

    Institute of Scientific and Technical Information of China (English)

    Y.M.Xing; S.Kishimoto; Y.R.Zhao

    2006-01-01

In the electron moiré method, a high-frequency grating is used to measure microscopic deformation, which promises significant potential applications for the method in the microscopic analysis of materials. However, a special beam-scanning control device is required to produce a grating and generate a moiré fringe pattern in the scanning electron microscope (SEM). Because only a few SEMs used in materials science studies are equipped with this device, the use of the electron moiré method is limited. In this study, an electron moiré method for a common SEM without the beam control device is presented. A grating based on a multi-scanning concept is fabricated in any observing mode. A real-time moiré pattern can also be generated in the SEM or an optical filtering system. Without the beam control device as a prerequisite, the electron moiré method can be more widely used. The experimental results from three different types of SEMs show that high-quality gratings with uniform lines and less pitch error can be fabricated by this method, and moiré patterns can also be correctly generated.

  6. Identifying potentially eligible subjects for research: paper-based logs versus the hospital administrative database.

    Science.gov (United States)

    Magee, L A; Massey, K; von Dadelszen, P; Fazio, M; Payne, B; Liston, R

    2011-12-01

    The Canadian Perinatal Network (CPN) is a national database focused on threatened very pre-term birth. Women with one or more conditions most commonly associated with very pre-term birth are included if admitted to a participating tertiary perinatal unit at 22 weeks and 0 days to 28 weeks and 6 days. At BC Women's Hospital and Health Centre, we compared traditional paper-based ward logs and a search of the Canadian Institute for Health Information (CIHI) electronic database of inpatient discharges to identify patients. The study identified 244 women potentially eligible for inclusion in the CPN admitted between April and December 2007. Of the 155 eligible women entered into the CPN database, each method identified a similar number of unique records (142 and 147) not ascertained by the other: 10 (6.4%) by CIHI search and 5 (3.2%) by ward log review. However, CIHI search achieved these results after reviewing fewer records (206 vs. 223) in less time (0.67 vs. 13.6 hours for ward logs). Either method is appropriate for identification of potential research subjects using gestational age criteria. Although electronic methods are less time-consuming, they cannot be performed until after the patient is discharged and records and charts are reviewed. Each method's advantages and disadvantages will dictate use for a specific project.
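The comparison of the two identification methods reduces to set operations on the records each method returned: records unique to one method are those the other missed. A minimal sketch with invented patient IDs:

```python
# Hypothetical sets of patient records identified by each method.
ward_log = {"P01", "P02", "P03", "P05"}      # paper-based ward log review
cihi_search = {"P02", "P03", "P04", "P05"}   # administrative-database search

unique_to_ward = ward_log - cihi_search      # found only by ward logs
unique_to_cihi = cihi_search - ward_log      # found only by the database search
found_by_both = ward_log & cihi_search       # identified by both methods
total_identified = ward_log | cihi_search    # union of the two methods
```

In the study itself, each method missed a small number of records the other caught (5 and 10 respectively), which is why the union of both methods gives the most complete ascertainment.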

  7. Methods of Analysis of Electronic Money in Banks

    Directory of Open Access Journals (Sweden)

    Melnychenko Oleksandr V.

    2014-03-01

Full Text Available The article identifies methods for the analysis of electronic money, formalises its instruments, and offers an integral indicator to be calculated both by issuing banks and by banks that carry out operations with electronic money issued by other banks. Calculation of the integral indicator allows a comprehensive assessment of a bank's activity with electronic money and comparison of different banks across the aggregate of indicators used to study the electronic money market, its level of development, etc. The article presents methods for the economic analysis of electronic money in banks along the following directions: solvency and liquidity, efficiency of electronic money issue, business activity of the bank, and social responsibility. The indicators proposed for each direction are to be taken into account when building the integral indicators with which banks are studied: business activity, profitability, solvency, liquidity and so on.

  8. Survey of electronic payment methods and systems

    NARCIS (Netherlands)

    Havinga, Paul J.M.; Smit, Gerardus Johannes Maria; Helme, A.; Verbraeck, A.

    1996-01-01

In this paper an overview of electronic payment methods and systems is given. This survey is done as part of the Moby Dick project. Electronic payment systems can be grouped into three broad classes: traditional money transactions, digital currency and credit-debit payments. Such payment systems have

  9. A theoretical-electron-density databank using a model of real and virtual spherical atoms.

    Science.gov (United States)

    Nassour, Ayoub; Domagala, Slawomir; Guillot, Benoit; Leduc, Theo; Lecomte, Claude; Jelsch, Christian

    2017-08-01

    A database describing the electron density of common chemical groups using combinations of real and virtual spherical atoms is proposed, as an alternative to the multipolar atom modelling of the molecular charge density. Theoretical structure factors were computed from periodic density functional theory calculations on 38 crystal structures of small molecules and the charge density was subsequently refined using a density model based on real spherical atoms and additional dummy charges on the covalent bonds and on electron lone-pair sites. The electron-density parameters of real and dummy atoms present in a similar chemical environment were averaged on all the molecules studied to build a database of transferable spherical atoms. Compared with the now-popular databases of transferable multipolar parameters, the spherical charge modelling needs fewer parameters to describe the molecular electron density and can be more easily incorporated in molecular modelling software for the computation of electrostatic properties. The construction method of the database is described. In order to analyse to what extent this modelling method can be used to derive meaningful molecular properties, it has been applied to the urea molecule and to biotin/streptavidin, a protein/ligand complex.

  10. Electronic cigarettes and nicotine clinical pharmacology

    OpenAIRE

    Schroeder, Megan J; Hoffman, Allison C

    2014-01-01

    Objective To review the available literature evaluating electronic cigarette (e-cigarette) nicotine clinical pharmacology in order to understand the potential impact of e-cigarettes on individual users, nicotine dependence and public health. Methods Literature searches were conducted between 1 October 2012 and 30 September 2013 using key terms in five electronic databases. Studies were included in the review if they were in English and publicly available; non-clinical studies, conference abst...

  11. New tools and methods for direct programmatic access to the dbSNP relational database.

    Science.gov (United States)

    Saccone, Scott F; Quan, Jiaxi; Mehta, Gaurang; Bolze, Raphael; Thomas, Prasanth; Deelman, Ewa; Tischfield, Jay A; Rice, John P

    2011-01-01

    Genome-wide association studies often incorporate information from public biological databases in order to provide a biological reference for interpreting the results. The dbSNP database is an extensive source of information on single nucleotide polymorphisms (SNPs) for many different organisms, including humans. We have developed free software that will download and install a local MySQL implementation of the dbSNP relational database for a specified organism. We have also designed a system for classifying dbSNP tables in terms of common tasks we wish to accomplish using the database. For each task we have designed a small set of custom tables that facilitate task-related queries and provide entity-relationship diagrams for each task composed from the relevant dbSNP tables. In order to expose these concepts and methods to a wider audience we have developed web tools for querying the database and browsing documentation on the tables and columns to clarify the relevant relational structure. All web tools and software are freely available to the public at http://cgsmd.isi.edu/dbsnpq. Resources such as these for programmatically querying biological databases are essential for viably integrating biological information into genetic association experiments on a genome-wide scale.
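A query against such a local mirror follows the usual parameterized-SQL pattern. The sketch below uses an in-memory SQLite table with a made-up schema as a stand-in, since the actual dbSNP mirror described in the article is a MySQL database with its own table layout:

```python
import sqlite3

# Stand-in schema: the real dbSNP relational schema is far larger and
# the article's tools target MySQL, not SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE snp (rs_id TEXT, chrom TEXT, pos INTEGER)")
conn.executemany(
    "INSERT INTO snp VALUES (?, ?, ?)",
    [("rs123", "1", 1000), ("rs456", "2", 2000), ("rs789", "1", 500)],
)

def snps_on_chrom(chrom: str) -> list:
    """Return SNP identifiers on a chromosome, ordered by position."""
    cur = conn.execute(
        "SELECT rs_id FROM snp WHERE chrom = ? ORDER BY pos", (chrom,)
    )
    return [row[0] for row in cur.fetchall()]

hits = snps_on_chrom("1")
```

Parameterized queries like this are also what the task-specific custom tables described in the article are meant to simplify, by pre-joining the relevant dbSNP tables for common lookups.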

  12. First-principles method for electron-phonon coupling and electron mobility

    DEFF Research Database (Denmark)

    Gunst, Tue; Markussen, Troels; Stokbro, Kurt

    2016-01-01

    We present density functional theory calculations of the phonon-limited mobility in n-type monolayer graphene, silicene, and MoS2. The material properties, including the electron-phonon interaction, are calculated from first principles. We provide a detailed description of the normalized full......-band relaxation time approximation for the linearized Boltzmann transport equation (BTE) that includes inelastic scattering processes. The bulk electron-phonon coupling is evaluated by a supercell method. The method employed is fully numerical and does therefore not require a semianalytic treatment of part...... of the problem and, importantly, it keeps the anisotropy information stored in the coupling as well as the band structure. In addition, we perform calculations of the low-field mobility and its dependence on carrier density and temperature to obtain a better understanding of transport in graphene, silicene...

  13. EDM 1.0: electron direct methods.

    Science.gov (United States)

    Kilaas, R; Marks, L D; Own, C S

    2005-02-01

    A computer program designed to provide a number of quantitative analysis tools for high-resolution imaging and electron diffraction data is described. The program includes basic image manipulation, both real space and reciprocal space image processing, Wiener-filtering, symmetry averaging, methods for quantification of electron diffraction patterns and two-dimensional direct methods. The program consists of a number of sub-programs written in a combination of C++, C and Fortran. It can be downloaded either as GNU source code or as binaries and has been compiled and verified on a wide range of platforms, both Unix based and PC's. Elements of the design philosophy as well as future possible extensions are described.

  14. Electronic Publishing in Library and Information Science.

    Science.gov (United States)

    Lee, Joel M.; And Others

    1988-01-01

    Discusses electronic publishing as it refers to machine-readable databases. Types of electronic products and services are described and related topics considered: (1) usage of library and information science databases; (2) production and distribution of databases; (3) trends and projections in the electronic information industry; and (4)…

  15. AN EFFICIENT DATA MINING METHOD TO FIND FREQUENT ITEM SETS IN LARGE DATABASE USING TR- FCTM

    Directory of Open Access Journals (Sweden)

    Saravanan Suba

    2016-01-01

    Full Text Available Mining association rules in large databases is one of the most popular data mining techniques for business decision makers. Discovering frequent item sets is the core process in association rule mining. Numerous algorithms are available in the literature to find frequent patterns. Apriori and FP-tree are the most common methods for finding frequent items. Apriori finds significant frequent items using candidate generation, at the cost of multiple database scans. FP-tree uses two database scans to find significant frequent items without candidate generation. The proposed TR-FCTM (Transaction Reduction - Frequency Count Table Method) discovers significant frequent items by generating full candidates once to form a frequency count table, using one database scan. Experimental results show that TR-FCTM outperforms both Apriori and FP-tree.
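    As a minimal sketch of the frequency-count idea (not the full TR-FCTM algorithm, which also applies transaction reduction), the following Python builds a frequency count table in a single pass over the transaction database and filters by minimum support; the item names and thresholds are illustrative:

```python
from collections import Counter

def frequent_items(transactions, min_support):
    """Count item frequencies in a single pass over the transaction
    database, then keep items meeting the minimum support threshold."""
    counts = Counter()
    for transaction in transactions:  # one database scan
        counts.update(set(transaction))
    n = len(transactions)
    return {item: c for item, c in counts.items() if c / n >= min_support}

# Illustrative transaction database
db = [["bread", "milk"],
      ["bread", "butter", "milk"],
      ["butter", "milk"],
      ["bread", "butter"]]

print(frequent_items(db, min_support=0.5))
```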

  16. New method of ionization energy calculation for two-electron ions

    International Nuclear Information System (INIS)

    Ershov, D.K.

    1997-01-01

    A new method for calculating the ionization energy of two-electron ions is proposed. The method is based on calculating the energy of the second electron's interaction with the field of a one-electron ion, whose potential is well known.

  17. Clinical Databases for Chest Physicians.

    Science.gov (United States)

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health condition or exposure. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture. We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  18. Development of Database Assisted Structure Identification (DASI) Methods for Nontargeted Metabolomics

    Directory of Open Access Journals (Sweden)

    Lochana C. Menikarachchi

    2016-05-01

    Full Text Available Metabolite structure identification remains a significant challenge in nontargeted metabolomics research. One commonly used strategy relies on searching biochemical databases using exact mass. However, this approach fails when the database does not contain the unknown metabolite (i.e., for unknown-unknowns). For these cases, constrained structure generation with combinatorial structure generators provides a potential option. Here we evaluated structure generation constraints based on the specification of: (1) substructures required (i.e., seed structures); (2) substructures not allowed; and (3) filters to remove incorrect structures. Our approach (database assisted structure identification, DASI) used predictive models in MolFind to find candidate structures with chemical and physical properties similar to the unknown. These candidates were then used for seed structure generation using eight different structure generation algorithms. One algorithm was able to generate correct seed structures for 21/39 test compounds. Eleven of these seed structures were large enough to constrain the combinatorial structure generator to fewer than 100,000 structures. In 35/39 cases, at least one algorithm was able to generate a correct seed structure. The DASI method has several limitations and will require further experimental validation and optimization. At present, it seems most useful for identifying the structure of unknown-unknowns with molecular weights <200 Da.
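    The mass-window filtering step that precedes candidate selection can be sketched as follows; the candidate tuples, masses, and the "pyranose" substructure flag are hypothetical stand-ins, not MolFind's actual chemical-property predictions:

```python
def filter_candidates(candidates, target_mass, tol_ppm, required_seed):
    """Keep candidate structures within a ppm mass tolerance of the
    unknown that also contain a required seed substructure (here a
    simple string flag standing in for a real substructure match)."""
    hits = []
    for name, mass, substructures in candidates:
        ppm = abs(mass - target_mass) / target_mass * 1e6
        if ppm <= tol_ppm and required_seed in substructures:
            hits.append(name)
    return hits

# Hypothetical candidate list: (name, monoisotopic mass, substructures)
candidates = [
    ("cand-A", 180.0634, {"pyranose"}),
    ("cand-B", 180.0642, {"benzene"}),
    ("cand-C", 181.0712, {"pyranose"}),
]
print(filter_candidates(candidates, target_mass=180.0634,
                        tol_ppm=10, required_seed="pyranose"))  # → ['cand-A']
```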

  19. Automated tools for cross-referencing large databases. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Clapp, N E; Green, P L; Bell, D [and others]

    1997-05-01

    A Cooperative Research and Development Agreement (CRADA) was funded with TRESP Associates, Inc., to develop a limited prototype software package operating on one platform (e.g., a personal computer, small workstation, or other selected device) to demonstrate the concepts of using an automated database application to improve the process of detecting fraud and abuse of the welfare system. An analysis was performed on Tennessee's welfare administration system. This analysis was undertaken to determine if the incidence of welfare waste, fraud, and abuse could be reduced and if the administrative process could be improved to reduce benefits overpayment errors. The analysis revealed a general inability to obtain timely data to support the verification of a welfare recipient's economic status and eligibility for benefits. It has been concluded that the provision of more modern computer-based tools and the establishment of electronic links to other state and federal data sources could increase staff efficiency, reduce the incidence of out-of-date information provided to welfare assistance staff, and make much of the new data required available in real time. Electronic data links have been proposed to allow near-real-time access to data residing in databases located in other states and at federal agency data repositories. The ability to provide these improvements to the local office staff would require the provision of additional computers, software, and electronic data links within each of the offices and the establishment of approved methods of accessing remote databases and transferring potentially sensitive data. In addition, investigations will be required to ascertain if existing laws would allow such data transfers, and if not, what changed or new laws would be required. The benefits, in both cost and efficiency, to the state of Tennessee of having electronically enhanced welfare system administration and control are expected to result in a rapid return on investment.

  20. Estimating the annotation error rate of curated GO database sequence annotations

    Directory of Open Access Journals (Sweden)

    Brown Alfred L

    2007-05-01

    Full Text Available Abstract Background Annotations that describe the function of sequences are enormously important to researchers during laboratory investigations and when making computational inferences. However, there has been little investigation into the data quality of sequence function annotations. Here we have developed a new method of estimating the error rate of curated sequence annotations, and applied this to the Gene Ontology (GO) sequence database (GOSeqLite). This method involved artificially adding errors to sequence annotations at known rates, and used regression to model the impact on the precision of annotations based on BLAST matched sequences. Results We estimated the error rate of curated GO sequence annotations in the GOSeqLite database (March 2006) at between 28% and 30%. Annotations made without use of sequence similarity based methods (non-ISS) had an estimated error rate of between 13% and 18%. Annotations made with the use of sequence similarity methodology (ISS) had an estimated error rate of 49%. Conclusion While the overall error rate is reasonably low, it would be prudent to treat all ISS annotations with caution. Electronic annotators that use ISS annotations as the basis of predictions are likely to have higher false prediction rates, and for this reason designers of these systems should consider avoiding ISS annotations where possible. Electronic annotators that use ISS annotations to make predictions should be viewed sceptically. We recommend that curators thoroughly review ISS annotations before accepting them as valid. Overall, users of curated sequence annotations from the GO database should feel assured that they are using a comparatively high quality source of information.
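    The calibration idea (inject errors at known rates, regress observed precision against the injected rate, and read the baseline error rate off the intercept) can be illustrated with a small simulation; the 29% "true" rate, sample sizes, and injection rates below are invented for the demonstration and are not the paper's data:

```python
import random

random.seed(42)

def observed_precision(true_error_rate, added_rate, n=20000):
    """Simulate annotation checks: an annotation is 'correct' unless it
    carries a pre-existing error or an artificially injected one."""
    correct = 0
    for _ in range(n):
        has_error = (random.random() < true_error_rate
                     or random.random() < added_rate)
        correct += not has_error
    return correct / n

# Inject errors at known rates and fit precision = a + b * added_rate
# by ordinary least squares; precision = (1-e)(1-r) is linear in r,
# so the intercept a estimates (1 - e).
added = [0.0, 0.05, 0.10, 0.15, 0.20]
prec = [observed_precision(0.29, r) for r in added]
n = len(added)
mx, my = sum(added) / n, sum(prec) / n
b = sum((x - mx) * (y - my) for x, y in zip(added, prec)) \
    / sum((x - mx) ** 2 for x in added)
a = my - b * mx
print(round(1 - a, 2))  # estimated baseline error rate, ≈ 0.29
```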

  1. SOME ASPECTS REGARDING THE INTERNATIONAL DATABASES NOWADAYS

    Directory of Open Access Journals (Sweden)

    Emilian M. DOBRESCU

    2015-01-01

    Full Text Available A national database (NDB) or an international one (IDB), often also called a "data bank", is a means of storing information and data on an external storage device, with the possibility of easy extension and of quickly retrieving that information. By an IDB, therefore, we understand not only a bibliometric or bibliographic index, i.e., a collection of references, which normally represents the "soft" side, but also the respective IDB "hard" side, namely the supporting storage technology. Usually a database, a very comprehensive notion in computer science, is a bibliographic index compiled with a specific purpose, objectives and means. In practice, national and international databases are operated through management systems, usually electronic and informational, based on advanced manipulation technologies in the virtual space. Online encyclopedias can also be considered important international databases (IDB). WorldCat, for example, is a world catalogue that includes identification data for the books held by circa 71,000 libraries in 112 countries, data classified through the Online Computer Library Center (OCLC) with the participation of the libraries in the respective countries, especially the national libraries.

  2. Undergraduate Use of Library Databases Decreases as Level of Study Progresses

    Directory of Open Access Journals (Sweden)

    Kimberly Miller

    2014-09-01

    Full Text Available A Review of: Mbabu, L.G., Bertram, A. B., & Varnum, K. (2013). Patterns of undergraduates' use of scholarly databases in a large research university. Journal of Academic Librarianship, 39(2), 189-193. http://dx.doi.org/10.1016/j.acalib.2012.10.004 Abstract Objective – To investigate undergraduate students' patterns of electronic database use to discover whether database use increases as undergraduate students progress into later stages of study with increasingly sophisticated information needs and demands. Design – User database authentication log analysis. Setting – A large research university in the Midwestern United States of America. Subjects – A total of 26,208 undergraduate students enrolled during the Fall 2009 academic semester. Methods – The researchers obtained logs of user-authenticated activity from the university's databases. Logged data for each user included: the user's action and details of that action (including database searches), the time of action, the user's relationship to the university, the individual school in which the user was enrolled, and the user's class standing. The data were analyzed to determine what proportion of undergraduate students accessed the library's electronic databases. The study reports that the logged data accounted for 61% of all database activity, and the authors suggest the other 39% of use is likely from "non-undergraduate members of the research community within the [university's] campus IP range" (p. 192). Main Results – The study found that 10,897 (42%) of the subject population of undergraduate students accessed the library's electronic databases. The study also compared database access by class standing, and found that freshman undergraduates had the highest proportion of database use, with 56% of enrolled freshmen accessing the library's databases. Sophomores had the second highest proportion of students accessing the databases at 40%; juniors and seniors
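    The core log-analysis step (the proportion of enrolled students in each class standing who appear in the authentication logs) might look like the sketch below; the enrollment counts and log records are fabricated to mirror the reported 56% and 40% figures:

```python
from collections import defaultdict

def usage_by_standing(enrolled, log_records):
    """Proportion of enrolled students per class standing who appear
    at least once in user-authenticated database activity logs."""
    seen = defaultdict(set)
    for user_id, standing in log_records:
        seen[standing].add(user_id)
    return {s: len(seen[s]) / total for s, total in enrolled.items()}

# Fabricated data mirroring the reported proportions
enrolled = {"freshman": 100, "sophomore": 100}
logs = ([(i, "freshman") for i in range(56)]
        + [(1000 + i, "sophomore") for i in range(40)])
print(usage_by_standing(enrolled, logs))  # → {'freshman': 0.56, 'sophomore': 0.4}
```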

  3. Electronic Publishing.

    Science.gov (United States)

    Lancaster, F. W.

    1989-01-01

    Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…

  4. Time-Dependent Close-Coupling Methods for Electron-Atom/Molecule Scattering

    International Nuclear Information System (INIS)

    Colgan, James

    2014-01-01

    The time-dependent close-coupling (TDCC) method centers on an accurate representation of the interaction between two outgoing electrons moving in the presence of a Coulomb field. It has been extensively applied to many problems of electrons, photons, and ions scattering from light atomic targets. Theoretical Description: The TDCC method centers on a solution of the time-dependent Schrödinger equation for two interacting electrons. The advantages of a time-dependent approach are two-fold; one treats the electron-electron interaction essentially in an exact manner (within numerical accuracy) and a time-dependent approach avoids the difficult boundary condition encountered when two free electrons move in a Coulomb field (the classic three-body Coulomb problem). The TDCC method has been applied to many fundamental atomic collision processes, including photon-, electron- and ion-impact ionization of light atoms. For application to electron-impact ionization of atomic systems, one decomposes the two-electron wavefunction in a partial wave expansion and represents the subsequent two-electron radial wavefunctions on a numerical lattice. The number of partial waves required to converge the ionization process depends on the energy of the incoming electron wavepacket and on the ionization threshold of the target atom or ion.

  5. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions to each of the database program managers. Additionally, how the agency uses the accident data was of major interest

  6. Method for controlling low-energy high current density electron beams

    International Nuclear Information System (INIS)

    Lee, J.N.; Oswald, R.B. Jr.

    1977-01-01

    A method and an apparatus for controlling the angle of incidence of low-energy, high current density electron beams are disclosed. The apparatus includes a current generating diode arrangement with a mesh anode for producing a drifting electron beam. An auxiliary grounded screen electrode is placed between the anode and a target for controlling the average angle of incidence of electrons in the drifting electron beam. According to the method of the present invention, movement of the auxiliary screen electrode relative to the target and the anode permits reliable and reproducible adjustment of the average angle of incidence of the electrons in low energy, high current density relativistic electron beams

   7. The Global Index of Vegetation-Plot Databases (GIVD): a new resource for vegetation science

    NARCIS (Netherlands)

    Dengler, J.; Jansen, F.; Glockler, F.; Schaminee, J.H.J.

    2011-01-01

    Question: How many vegetation plot observations (relevés) are available in electronic databases, how are they geographically distributed, what are their properties and how might they be discovered and located for research and application? Location: Global. Methods: We compiled the Global Index of

  8. Geometric methods for estimating representative sidewalk widths applied to Vienna's streetscape surfaces database

    Science.gov (United States)

    Brezina, Tadej; Graser, Anita; Leth, Ulrich

    2017-04-01

    Space, and in particular public space for movement and leisure, is a valuable and scarce resource, especially in today's growing urban centres. The distribution and absolute amount of urban space—especially the provision of sufficient pedestrian areas, such as sidewalks—is considered crucial for shaping living and mobility options as well as transport choices. Ubiquitous urban data collection and today's IT capabilities offer new possibilities for providing a relation-preserving overview and for keeping track of infrastructure changes. This paper presents three novel methods for estimating representative sidewalk widths and applies them to the official Viennese streetscape surface database. The first two methods use individual pedestrian area polygons and their geometrical representations of minimum circumscribing and maximum inscribing circles to derive a representative width of these individual surfaces. The third method utilizes aggregated pedestrian areas within the buffered street axis and results in a representative width for the corresponding road axis segment. Results are displayed as city-wide means in a 500 by 500 m grid and spatial autocorrelation based on Moran's I is studied. We also compare the results between methods as well as to previous research, existing databases and guideline requirements on sidewalk widths. Finally, we discuss possible applications of these methods for monitoring and regression analysis and suggest future methodological improvements for increased accuracy.
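    A simplified sketch of the maximum-inscribing-circle method: approximate the largest inscribed circle of a convex surface polygon by grid sampling and report twice its radius as the representative width. The grid search and the rectangular sidewalk polygon are illustrative simplifications of the paper's geometric computation, which operates on real streetscape surfaces:

```python
import math

def seg_dist(p, a, b):
    """Distance from point p to line segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def max_inscribed_radius(poly, step=0.1):
    """Approximate the maximum inscribed circle radius of a convex
    polygon by grid-sampling points and taking the largest distance
    to the polygon boundary."""
    xs = [p[0] for p in poly]; ys = [p[1] for p in poly]
    best = 0.0
    x = min(xs)
    while x <= max(xs):
        y = min(ys)
        while y <= max(ys):
            d = min(seg_dist((x, y), poly[i], poly[(i + 1) % len(poly)])
                    for i in range(len(poly)))
            best = max(best, d)
            y += step
        x += step
    return best

# A 10 m x 2 m rectangular sidewalk surface: the representative width
# is twice the maximum inscribed radius.
sidewalk = [(0, 0), (10, 0), (10, 2), (0, 2)]
print(round(2 * max_inscribed_radius(sidewalk), 1))  # → 2.0
```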

  9. A method to implement fine-grained access control for personal health records through standard relational database queries.

    Science.gov (United States)

    Sujansky, Walter V; Faus, Sam A; Stone, Ethan; Brennan, Patricia Flatley

    2010-10-01

    Online personal health records (PHRs) enable patients to access, manage, and share certain of their own health information electronically. This capability creates the need for precise access-control mechanisms that restrict the sharing of data to that intended by the patient. The authors describe the design and implementation of an access-control mechanism for PHR repositories that is modeled on the eXtensible Access Control Markup Language (XACML) standard, but intended to reduce the cognitive and computational complexity of XACML. The authors implemented the mechanism entirely in a relational database system using ANSI-standard SQL statements. Based on a set of access-control rules encoded as relational table rows, the mechanism determines via a single SQL query whether a user who accesses patient data from a specific application is authorized to perform a requested operation on a specified data object. Testing of this query on a moderately large database has demonstrated execution times consistently below 100 ms. The authors include the details of the implementation, including algorithms, examples, and a test database as Supplementary materials. Copyright © 2010 Elsevier Inc. All rights reserved.
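    A much-simplified sketch of the rules-as-rows approach, using SQLite rather than the authors' full XACML-modeled schema; the table layout, column names, and rule rows are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE access_rules (
    user_id   TEXT,
    app_id    TEXT,
    object    TEXT,
    operation TEXT
);
-- Patient-granted rules: dr_smith may read two data objects via the portal app
INSERT INTO access_rules VALUES ('dr_smith', 'portal', 'medications', 'read');
INSERT INTO access_rules VALUES ('dr_smith', 'portal', 'allergies',   'read');
""")

def is_authorized(user_id, app_id, obj, op):
    """Single-query authorization decision against the rule rows."""
    row = conn.execute(
        """SELECT 1 FROM access_rules
           WHERE user_id = ? AND app_id = ? AND object = ? AND operation = ?
           LIMIT 1""",
        (user_id, app_id, obj, op)).fetchone()
    return row is not None

print(is_authorized("dr_smith", "portal", "medications", "read"))   # True
print(is_authorized("dr_smith", "portal", "medications", "write"))  # False
```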

  10. Database Security: What Students Need to Know

    Science.gov (United States)

    Murray, Meg Coffin

    2010-01-01

    Database security is a growing concern evidenced by an increase in the number of reported incidents of loss of or unauthorized exposure to sensitive data. As the amount of data collected, retained and shared electronically expands, so does the need to understand database security. The Defense Information Systems Agency of the US Department of…

  11. A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases

    Science.gov (United States)

    Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie

    2018-01-01

    Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study to compare the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers installed with a PostgreSQL database, and a Windows Server installed with MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source Catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level with random sky positions and random cone search radii, computed on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete the query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and are linear with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4; cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (14 arcmin) and higher is masked to some extent in the timing scatter caused by the range of query sizes. At very high levels (20; 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begin to degrade
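    The cell-index-plus-refine pattern behind both HTM and HEALPix cone searches can be sketched with a plain RA/Dec grid standing in for the real tessellations; this toy version ignores RA wraparound and polar cell distortion, and the three-source catalog is invented:

```python
import math

CELL_DEG = 1.0  # grid cell size; smaller cells correspond to deeper index levels

def cell_of(ra, dec):
    """Map a sky position to a grid cell (a stand-in for an HTM/HEALPix id)."""
    return (int(ra // CELL_DEG), int((dec + 90.0) // CELL_DEG))

def ang_sep(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (spherical law of cosines)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(d1) * math.sin(d2)
         + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def build_index(catalog):
    """Group source ids by cell, as a B-tree over an index column would."""
    index = {}
    for i, (ra, dec) in enumerate(catalog):
        index.setdefault(cell_of(ra, dec), []).append(i)
    return index

def cone_search(catalog, index, ra0, dec0, radius):
    """Visit only cells overlapping the cone's bounding box, then refine."""
    hits = []
    for cx in range(int((ra0 - radius) // CELL_DEG),
                    int((ra0 + radius) // CELL_DEG) + 1):
        for cy in range(int((dec0 - radius + 90.0) // CELL_DEG),
                        int((dec0 + radius + 90.0) // CELL_DEG) + 1):
            for i in index.get((cx, cy), []):
                ra, dec = catalog[i]
                if ang_sep(ra0, dec0, ra, dec) <= radius:
                    hits.append(i)
    return hits

catalog = [(10.0, 0.0), (10.5, 0.2), (40.0, 20.0)]
index = build_index(catalog)
print(cone_search(catalog, index, 10.0, 0.0, 1.0))  # → [0, 1]
```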

  12. Using AMDD method for Database Design in Mobile Cloud Computing Systems

    OpenAIRE

    Silviu Claudiu POPA; Mihai-Constantin AVORNICULUI; Vasile Paul BRESFELEAN

    2013-01-01

    The development of wireless telecommunications technologies gave birth to new kinds of e-commerce, the so-called Mobile e-Commerce or m-Commerce. Mobile Cloud Computing (MCC) represents a new IT research area that combines mobile computing and cloud computing techniques. Behind a mobile cloud commerce system there is a database containing all the information necessary for transactions. By means of the Agile Model Driven Development (AMDD) method, we are able to achieve many benefits that smoo...

  13. Method of determining the position of an irradiated electron beam

    International Nuclear Information System (INIS)

    Fukuda, Wataru.

    1967-01-01

    The present invention relates to a method of determining the position of an irradiated electron beam, and in particular to a novel method of detecting the position of a p-n junction when an electron beam is irradiated onto a semiconductor wafer, so that the position of the beam can be controlled relative to that junction. When the electron beam is irradiated onto a semiconductor wafer possessing a p-n junction, the position of the junction, and hence of the irradiated beam, may be determined by detecting the electromotive force resulting from the junction with a metal disposed in the proximity of, but without mechanical contact with, the wafer. Furthermore, for any semiconductor wafer having at least one p-n junction, the present invention allows that junction to be used to determine the position of an irradiated electron beam. Thus, according to the present invention, the electromotive force resulting from the p-n junction may easily be detected by electrostatic coupling, enabling the position of the irradiated electron beam to be determined accurately. (Masui, R.)

  14. Multilayer electronic component systems and methods of manufacture

    Science.gov (United States)

    Thompson, Dane (Inventor); Wang, Guoan (Inventor); Kingsley, Nickolas D. (Inventor); Papapolymerou, Ioannis (Inventor); Tentzeris, Emmanouil M. (Inventor); Bairavasubramanian, Ramanan (Inventor); DeJean, Gerald (Inventor); Li, RongLin (Inventor)

    2010-01-01

    Multilayer electronic component systems and methods of manufacture are provided. In this regard, an exemplary system comprises a first layer of liquid crystal polymer (LCP), first electronic components supported by the first layer, and a second layer of LCP. The first layer is attached to the second layer by thermal bonds. Additionally, at least a portion of the first electronic components are located between the first layer and the second layer.

  15. 14 CFR 1274.931 - Electronic funds transfer payment methods.

    Science.gov (United States)

    2010-01-01

    ... cooperative agreement will be made by the Government by electronic funds transfer through the Treasury Fedline... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Electronic funds transfer payment methods... COOPERATIVE AGREEMENTS WITH COMMERCIAL FIRMS Other Provisions and Special Conditions § 1274.931 Electronic...

  16. Implementation of a database on drugs into a university hospital Intranet.

    Science.gov (United States)

    François, M; Joubert, M; Fieschi, D; Fieschi, M

    1998-01-01

    Several databases on drugs have been developed worldwide for drug information functions, and their sources are now electronically available. Our objective was to implement one of them in our University hospitals information system. Thériaque is a database which contains information on all the drugs available in France. Before its implementation we modeled its content (chemical classes, active components, excipients, indications, contra-indications, side effects, and so on) following an object-oriented method. From this model we designed dynamic HTML pages based on Microsoft's Internet Database Connector (IDC) technique. This allowed a fast implementation and does not require porting a client application to the thousands of workstations on the network of the University hospitals. The interface provides end-users with an easy-to-use and natural way to access information related to drugs in an Intranet environment.
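    Microsoft's IDC is long obsolete, but the underlying pattern (a parameterized database query whose results are substituted into an HTML page template) can be sketched in Python with SQLite; the drug record, schema, and template are invented:

```python
import sqlite3
from string import Template

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE drug (name TEXT, active_component TEXT, contra_indication TEXT);
INSERT INTO drug VALUES ('DrugX', 'componentA', 'pregnancy');
""")

# Page template with placeholders, in the spirit of an IDC .htx file
PAGE = Template("<html><body><h1>$name</h1>"
                "<p>Active component: $component</p>"
                "<p>Contra-indication: $contra</p></body></html>")

def drug_page(name):
    """Render a dynamic HTML page from a parameterized query result."""
    row = conn.execute(
        "SELECT name, active_component, contra_indication "
        "FROM drug WHERE name = ?", (name,)).fetchone()
    return PAGE.substitute(name=row[0], component=row[1], contra=row[2])

print(drug_page("DrugX"))
```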

  17. Software listing: CHEMTOX database

    International Nuclear Information System (INIS)

    Moskowitz, P.D.

    1993-01-01

    Initially launched in 1983, the CHEMTOX Database was among the first microcomputer databases containing hazardous chemical information. The database is used in many industries and government agencies in more than 17 countries. Updated quarterly, the CHEMTOX Database provides detailed environmental and safety information on 7500-plus hazardous substances covered by dozens of regulatory and advisory sources. This brief listing describes the method of accessing data and provides ordering information for those wishing to obtain the CHEMTOX Database

  18. CD-ROM-aided Databases

    Science.gov (United States)

    Masuyama, Keiichi

    CD-ROM has rapidly evolved as a new information medium with large capacity. In the U.S. it is predicted to become a two-hundred-billion-yen market within three years, and CD-ROM is thus a strategic target of the database industry. Here in Japan the movement toward its commercialization has been active since this year. Will the CD-ROM business ever conquer the information market as an on-disk database or electronic publication? Referring to some cases of application in the U.S., the author reviews the marketability and the future trend of this new optical disk medium.

  19. A data driven method to measure electron charge mis-identification rate

    CERN Document Server

    Bakhshiansohi, Hamed

    2009-01-01

    Electron charge mis-measurement is an important challenge in analyses which depend on the charge of the electron. To estimate the probability of electron charge mis-measurement, a data-driven method is introduced, and good agreement with MC-based methods is achieved. The third moment of the φ distribution of hits in the electron SuperCluster is studied. The correlation between this variable and the electron charge is also investigated. Using this 'new' variable and some other variables, the electron charge measurement is improved by two different approaches.
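    The discriminating variable itself is simple to compute: the third central moment of the hit φ positions, whose sign reflects the direction the track bends in the magnetic field. A toy illustration (the hit values are invented, and this is not the experiment's implementation):

```python
def third_central_moment(phis):
    """Third central moment of hit phi positions; its sign follows the
    bending direction of the track, and hence correlates with charge."""
    n = len(phis)
    mean = sum(phis) / n
    return sum((p - mean) ** 3 for p in phis) / n

# Invented hit phi values with a tail on the positive-phi side,
# giving a positive third moment
hits = [0.00, 0.01, 0.02, 0.03, 0.10]
print(third_central_moment(hits) > 0)  # → True
```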

  20. Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases

    CERN Document Server

    Askerov, Bahram M

    2010-01-01

    This book deals with theoretical thermodynamics and the statistical physics of electron and particle gases. While treating the laws of thermodynamics from both classical and quantum theoretical viewpoints, it posits that the basis of the statistical theory of macroscopic properties of a system is the microcanonical distribution of isolated systems, from which all canonical distributions stem. To calculate the free energy, the Gibbs method is applied to ideal and non-ideal gases, and also to a crystalline solid. Considerable attention is paid to the Fermi-Dirac and Bose-Einstein quantum statistics and its application to different quantum gases, and electron gas in both metals and semiconductors is considered in a nonequilibrium state. A separate chapter treats the statistical theory of thermodynamic properties of an electron gas in a quantizing magnetic field.

  1. Numerical methods in electron magnetic resonance

    International Nuclear Information System (INIS)

    Soernes, A.R.

    1998-01-01

    The focal point of the thesis is the development and use of numerical methods in the analysis, simulation and interpretation of Electron Magnetic Resonance experiments on free radicals in solids to uncover the structure, the dynamics and the environment of the system

  2. Numerical methods in electron magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Soernes, A.R

    1998-07-01

    The focal point of the thesis is the development and use of numerical methods in the analysis, simulation and interpretation of Electron Magnetic Resonance experiments on free radicals in solids to uncover the structure, the dynamics and the environment of the system.

  3. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web. This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  4. Database Optimizing Services

    Directory of Open Access Journals (Sweden)

    Adrian GHENCEA

    2010-12-01

    Full Text Available Almost every organization has a database at its centre. The database supports different activities, whether production, sales and marketing, or internal operations. Every day, databases are accessed for help in strategic decisions. Satisfying such needs therefore requires high-quality security and availability. Those needs can be met using a DBMS (Database Management System), which is, in fact, the software behind a database. Technically speaking, it is software which uses a standard method of cataloguing, retrieving, and running different data queries. The DBMS manages the input data, organizes it, and provides ways for its users or other programs to modify or extract the data. Managing a database is an operation that requires periodic updates, optimization and monitoring.
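
    The cataloguing-and-query role described above can be illustrated with a minimal sketch using Python's built-in sqlite3 module; the table and figures are invented for illustration, not taken from the article:

```python
import sqlite3

# In-memory database standing in for an organizational DBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0), ("north", 40.0)])

# A standard query: aggregate the data to support a strategic decision.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 160.0), ('south', 80.0)]
```

    The DBMS handles cataloguing (the schema), storage and retrieval, so the application only expresses *what* data it wants.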

  5. A curated gluten protein sequence database to support development of proteomics methods for determination of gluten in gluten-free foods.

    Science.gov (United States)

    Bromilow, Sophie; Gethings, Lee A; Buckley, Mike; Bromley, Mike; Shewry, Peter R; Langridge, James I; Clare Mills, E N

    2017-06-23

    The unique physicochemical properties of wheat gluten enable a diverse range of food products to be manufactured. However, gluten triggers coeliac disease, a condition which is treated using a gluten-free diet. Analytical methods are required to confirm if foods are gluten-free, but current immunoassay-based methods can be unreliable; proteomic methods offer an alternative but require comprehensive and well-annotated sequence databases, which are lacking for gluten. A manually curated database (GluPro V1.0) of gluten proteins, comprising 630 discrete unique full-length protein sequences, has been compiled. It is representative of the different types of gliadin and glutenin components found in gluten. An in silico comparison of their coeliac toxicity was undertaken by analysing the distribution of coeliac toxic motifs. This demonstrated that whilst the α-gliadin proteins contained more toxic motifs, these were distributed across all gluten protein sub-types. Comparison of annotations observed using a discovery proteomics dataset acquired using ion mobility MS/MS showed that more reliable identifications were obtained using the GluPro V1.0 database compared to the complete reviewed Viridiplantae database. This highlights the value of a curated sequence database specifically designed to support proteomic workflows and the development of methods to detect and quantify gluten. We have constructed the first manually curated open-source wheat gluten protein sequence database (GluPro V1.0) in FASTA format to support the application of proteomic methods for gluten protein detection and quantification. We have also analysed the manually verified sequences to give the first comprehensive overview of the distribution of sequences able to elicit a reaction in coeliac disease, the prevalent form of gluten intolerance. Provision of this database will improve the reliability of gluten protein identification by proteomic analysis, and aid the development of targeted mass

  6. EDDIX--a database of ionisation double differential cross sections.

    Science.gov (United States)

    MacGibbon, J H; Emerson, S; Liamsuwan, T; Nikjoo, H

    2011-02-01

    Monte Carlo track structure simulation is a method of choice in biophysical modelling and calculations. To precisely model 3D and 4D tracks, the cross section for ionisation by an incoming ion, double differential in the outgoing electron energy and angle, is required. However, the double differential cross section cannot be theoretically modelled over the full range of parameters. To address this issue, a database of all available experimental data has been constructed. Currently, the database of Experimental Double Differential Ionisation Cross sections (EDDIX) contains over 1200 digitised experimentally measured datasets from the 1960s to the present date, covering all available ion species (hydrogen to uranium) and all available target species. Double differential cross sections are also presented with the aid of an eight-parameter function fitted to the cross sections. The parameters include projectile species and charge, target nuclear charge and atomic mass, projectile atomic mass and energy, and electron energy and deflection angle. It is planned to freely distribute EDDIX and make it available to the radiation research community for use in the analytical and numerical modelling of track structure.
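
    The abstract does not give EDDIX's eight-parameter functional form, so as a hedged illustration the sketch below fits a hypothetical two-parameter fall-off to synthetic cross-section data; because the model is linear in 1/y versus w², ordinary least squares recovers the parameters:

```python
import numpy as np

# Hypothetical two-parameter stand-in for a fitted DDCS shape; the real EDDIX
# fit uses eight parameters covering projectile, target, energy and angle.
def ddcs_model(w, a, b):
    # Cross section falling off with ejected-electron energy w.
    return a / (1.0 + (w / b) ** 2)

w = np.linspace(1.0, 100.0, 50)
y = ddcs_model(w, 3.0, 20.0)           # synthetic "measured" cross sections

# 1/y = 1/a + w^2 / (a b^2): fit a straight line in w^2 to recover a and b.
c1, c0 = np.polyfit(w**2, 1.0 / y, 1)  # slope, intercept
a_fit, b_fit = 1.0 / c0, np.sqrt(c0 / c1)
print(a_fit, b_fit)  # recovers ~3.0 and ~20.0
```

    A real database fit would of course use the full parameter set and weighted nonlinear least squares against the experimental uncertainties.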

  7. Validation of asthma recording in electronic health records: a systematic review

    Directory of Open Access Journals (Sweden)

    Nissen F

    2017-12-01

    Full Text Available Francis Nissen,1 Jennifer K Quint,2 Samantha Wilkinson,1 Hana Mullerova,3 Liam Smeeth,1 Ian J Douglas1 1Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK; 2National Heart and Lung Institute, Imperial College, London, UK; 3RWD & Epidemiology, GSK R&D, Uxbridge, UK Objective: To describe the methods used to validate asthma diagnoses in electronic health records and summarize the results of the validation studies. Background: Electronic health records are increasingly being used for research on asthma to inform health services and health policy. Validation of the recording of asthma diagnoses in electronic health records is essential to use these databases for credible epidemiological asthma research. Methods: We searched EMBASE and MEDLINE databases for studies that validated asthma diagnoses detected in electronic health records up to October 2016. Two reviewers independently assessed the full text against the predetermined inclusion criteria. Key data including author, year, data source, case definitions, reference standard, and validation statistics (including sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) were summarized in two tables. Results: Thirteen studies met the inclusion criteria. Most studies demonstrated a high validity using at least one case definition (PPV >80%). Ten studies used a manual validation as the reference standard; each had at least one case definition with a PPV of at least 63%, up to 100%. We also found two studies using a second independent database to validate asthma diagnoses. The PPVs of the best performing case definitions ranged from 46% to 58%. We found one study which used a questionnaire as the reference standard to validate a database case definition; the PPV of the case definition algorithm in this study was 89%.
Conclusion: Attaining high PPVs (>80%) is possible using each of the discussed validation

  8. Constructing Effective Search Strategies for Electronic Searching.

    Science.gov (United States)

    Flanagan, Lynn; Parente, Sharon Campbell

    Electronic databases have grown tremendously in both number and popularity since their development during the 1960s. Access to electronic databases in academic libraries was originally offered primarily through mediated search services by trained librarians; however, the advent of CD-ROM and end-user interfaces for online databases has shifted the…

  9. [Electronic poison information management system].

    Science.gov (United States)

    Kabata, Piotr; Waldman, Wojciech; Kaletha, Krystian; Sein Anand, Jacek

    2013-01-01

    We describe the deployment of an electronic toxicological information database in the poison control center of the Pomeranian Center of Toxicology. The system was based on Google Apps technology, by Google Inc., using electronic, web-based forms and data tables. During the first 6 months from system deployment, we used it to archive 1471 poisoning cases, prepare monthly poisoning reports and facilitate statistical analysis of data. Electronic database usage made the Poison Center's work much easier.

  10. Optimization and Accessibility of the Qweak Database

    Science.gov (United States)

    Urban, Erik; Spayde, Damon

    2010-11-01

    The Qweak experiment is a multi-institutional collaborative effort at Thomas Jefferson National Accelerator Facility designed to accurately determine the weak nuclear charge of the proton through measurements of the parity-violating asymmetries of electron-proton elastic scattering that result from pulses of electrons with opposite helicities. Through the study of these scattering asymmetries, the Qweak experiment hopes to constrain extensions of the Standard Model or find indications of new physics. Since precision is critical to the success of the Qweak experiment, the collaboration will be taking data for thousands of hours. The Qweak database is responsible for storing the non-binary, processed data of this experiment in a meaningful and organized manner for use at a later date. The goal of this undertaking is not only to create a database which can input and output data quickly, but also to create one which can easily be accessed by those who have minimal knowledge of the database language. Through tests on the system, retrieval and insert times have been optimized; in addition, the implementation of summary tables and additional programs should make the majority of commonly sought results readily available to database novices.
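
    The summary-table idea, precomputing aggregates so that novices can query results directly instead of the raw event data, can be sketched with sqlite3; the run numbers and asymmetry values below are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE asym (run INTEGER, value REAL)")
conn.executemany("INSERT INTO asym VALUES (?, ?)",
                 [(1, -0.25), (1, -0.75), (2, 0.25), (2, 0.75)])

# Precomputed summary table: novices query this instead of the raw data.
conn.execute("""CREATE TABLE asym_summary AS
                SELECT run, AVG(value) AS mean_asym, COUNT(*) AS n
                FROM asym GROUP BY run""")
summary = conn.execute(
    "SELECT run, mean_asym, n FROM asym_summary ORDER BY run").fetchall()
print(summary)  # [(1, -0.5, 2), (2, 0.5, 2)]
```

    In a production database the summary tables would be refreshed as new runs are inserted, trading storage for much faster common queries.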

  11. Numerical simulation methods for electron and ion optics

    International Nuclear Information System (INIS)

    Munro, Eric

    2011-01-01

    This paper summarizes currently used techniques for simulation and computer-aided design in electron and ion beam optics. Topics covered include: field computation, methods for computing optical properties (including Paraxial Rays and Aberration Integrals, Differential Algebra and Direct Ray Tracing), simulation of Coulomb interactions, space charge effects in electron and ion sources, tolerancing, wave optical simulations and optimization. Simulation examples are presented for multipole aberration correctors, Wien filter monochromators, imaging energy filters, magnetic prisms, general curved axis systems and electron mirrors.

  12. A web-based database for EPR centers in semiconductors

    International Nuclear Information System (INIS)

    Umeda, T.; Hagiwara, S.; Katagiri, M.; Mizuochi, N.; Isoya, J.

    2006-01-01

    We develop a web-based database system for electron paramagnetic resonance (EPR) centers in semiconductors. This database is available to anyone at http://www.kc.tsukuba.ac.jp/div-media/epr/. It currently has more than 300 records of the spin-Hamiltonian parameters for major known EPR centers. One can upload their own new records to the database or use simulation tools powered by EPR-NMR(C). Here, we describe the features and objectives of this database, and mention some future plans.

  13. Medical databases in studies of drug teratogenicity: methodological issues

    Directory of Open Access Journals (Sweden)

    Vera Ehrenstein

    2010-03-01

    Full Text Available Vera Ehrenstein,1 Henrik T Sørensen,1 Leiv S Bakketeig,1,2 Lars Pedersen1 1Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, Denmark; 2Norwegian Institute of Public Health, Oslo, Norway. Abstract: More than half of all pregnant women take prescription medications, raising concerns about fetal safety. Medical databases routinely collecting data from large populations are potentially valuable resources for cohort studies addressing teratogenicity of drugs. These include electronic medical records, administrative databases, population health registries, and teratogenicity information services. Medical databases allow estimation of prevalences of birth defects with enhanced precision, but systematic error remains a potentially serious problem. In this review, we first provide a brief description of types of North American and European medical databases suitable for studying teratogenicity of drugs and then discuss manifestation of systematic errors in teratogenicity studies based on such databases. Selection bias stems primarily from the inability to ascertain all reproductive outcomes. Information bias (misclassification may be caused by paucity of recorded clinical details or incomplete documentation of medication use. Confounding, particularly confounding by indication, can rarely be ruled out. Bias that either masks teratogenicity or creates false appearance thereof, may have adverse consequences for the health of the child and the mother. Biases should be quantified and their potential impact on the study results should be assessed. Both theory and software are available for such estimation. Provided that methodological problems are understood and effectively handled, computerized medical databases are a valuable source of data for studies of teratogenicity of drugs. Keywords: databases, birth defects, epidemiologic methods, pharmacoepidemiology

  14. TIJAH: Embracing IR Methods in XML Databases

    NARCIS (Netherlands)

    List, Johan; Mihajlovic, V.; Ramirez, Georgina; de Vries, A.P.; Hiemstra, Djoerd; Blok, H.E.

    2005-01-01

    This paper discusses our participation in INEX (the Initiative for the Evaluation of XML Retrieval) using the TIJAH XML-IR system. TIJAH's system design follows a `standard' layered database architecture, carefully separating the conceptual, logical and physical levels. At the conceptual level, we

  15. Update History of This Database - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Update history: 2012/08/08, PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods is opened; its English archive site is opened. | LSDB Archive

  16. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    Science.gov (United States)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as a database and PHP/HTML as script language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server, and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. The access can be secured using a general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed

  17. Fast electronic structure methods for strongly correlated molecular systems

    International Nuclear Information System (INIS)

    Head-Gordon, Martin; Beran, Gregory J O; Sodt, Alex; Jung, Yousung

    2005-01-01

    A short review is given of newly developed fast electronic structure methods that are designed to treat molecular systems with strong electron correlations, such as diradicaloid molecules, for which standard electronic structure methods such as density functional theory are inadequate. These new local correlation methods are based on coupled cluster theory within a perfect pairing active space, containing either a linear or quadratic number of pair correlation amplitudes, to yield the perfect pairing (PP) and imperfect pairing (IP) models. This reduces the scaling of the coupled cluster iterations to no worse than cubic, relative to the sixth power dependence of the usual (untruncated) coupled cluster doubles model. A second order perturbation correction, PP(2), to treat the neglected (weaker) correlations is formulated for the PP model. To ensure minimal prefactors, in addition to favorable size-scaling, highly efficient implementations of PP, IP and PP(2) have been completed, using auxiliary basis expansions. This yields speedups of almost an order of magnitude over the best alternatives using 4-center 2-electron integrals. A short discussion of the scope of accessible chemical applications is given

  18. A Web-based Alternative Non-animal Method Database for Safety Cosmetic Evaluations

    OpenAIRE

    Kim, Seung Won; Kim, Bae-Hwan

    2016-01-01

    Animal testing was used traditionally in the cosmetics industry to confirm product safety, but has begun to be banned; alternative methods to replace animal experiments are either in development, or are being validated, worldwide. Research data related to test substances are critical for developing novel alternative tests. Moreover, safety information on cosmetic materials has neither been collected in a database nor shared among researchers. Therefore, it is imperative to build and share a d...

  19. The Human Communication Research Centre dialogue database.

    Science.gov (United States)

    Anderson, A H; Garrod, S C; Clark, A; Boyle, E; Mullin, J

    1992-10-01

    The HCRC dialogue database consists of over 700 transcribed and coded dialogues from pairs of speakers aged from seven to fourteen. The speakers are recorded while tackling co-operative problem-solving tasks and the same pairs of speakers are recorded over two years tackling 10 different versions of our two tasks. In addition there are over 200 dialogues recorded between pairs of undergraduate speakers engaged on versions of the same tasks. Access to the database, and to its accompanying custom-built search software, is available electronically over the JANET system by contacting liz@psy.glasgow.ac.uk, from whom further information about the database and a user's guide to the database can be obtained.

  20. O-GLYCBASE: a revised database of O-glycosylated proteins

    DEFF Research Database (Denmark)

    Hansen, Jan; Lund, Ole; Nielsen, Jens O.

    1996-01-01

    O-GLYCBASE is a comprehensive database of information on glycoproteins and their O-linked glycosylation sites. Entries are compiled and revised from the SWISS-PROT and PIR databases as well as directly from recently published reports. Nineteen percent of the entries extracted from the databases n...... of mucin type O-glycosylation sites in mammalian glycoproteins exclusively from the primary sequence is made available by E-mail or WWW. The O-GLYCBASE database is also available electronically through our WWW server or by anonymous FTP....

  1. Modified Monte Carlo method for study of electron transport in degenerate electron gas in the presence of electron-electron interactions, application to graphene

    Science.gov (United States)

    Borowik, Piotr; Thobel, Jean-Luc; Adamowicz, Leszek

    2017-07-01

    Standard computational methods used to account for the Pauli exclusion principle in Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron-electron (e-e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study transport properties of degenerate electrons in graphene with e-e interactions. This required adapting the treatment of e-e scattering to the case of a linear band dispersion relation; hence, this part of the simulation algorithm is described in detail.
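
    A common way to impose the Pauli exclusion principle in MC transport, not necessarily the paper's modified algorithm, is to reject a tentative scattering event with probability equal to the final-state occupancy f(E), so transitions into filled states are blocked. A minimal sketch with invented energies in eV:

```python
import math
import random

def fermi_dirac(e, mu, kT):
    # Equilibrium occupancy of a state at energy e (all energies in eV).
    return 1.0 / (1.0 + math.exp((e - mu) / kT))

def pauli_blocked_accept(e_final, mu, kT, rng):
    # Accept the tentative scattering only if the final state is empty:
    # rejecting with probability f(E_final) enforces the exclusion principle.
    return rng.random() > fermi_dirac(e_final, mu, kT)

rng = random.Random(0)
# Deep below the Fermi level (mu = 0) almost every attempt is blocked ...
blocked = sum(not pauli_blocked_accept(-0.5, 0.0, 0.025, rng)
              for _ in range(10000))
# ... while far above it almost none are.
free = sum(not pauli_blocked_accept(0.5, 0.0, 0.025, rng)
           for _ in range(10000))
print(blocked, free)
```

    In a self-consistent simulation f would be the evolving simulated distribution rather than the equilibrium Fermi-Dirac function used here for simplicity.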

  2. Identifying complications of interventional procedures from UK routine healthcare databases: a systematic search for methods using clinical codes.

    Science.gov (United States)

    Keltie, Kim; Cole, Helen; Arber, Mick; Patrick, Hannah; Powell, John; Campbell, Bruce; Sims, Andrew

    2014-11-28

    Several authors have developed and applied methods to routine data sets to identify the nature and rate of complications following interventional procedures. But, to date, there has been no systematic search for such methods. The objective of this article was to find, classify and appraise published methods, based on analysis of clinical codes, which used routine healthcare databases in a United Kingdom setting to identify complications resulting from interventional procedures. A literature search strategy was developed to identify published studies that referred, in the title or abstract, to the name or acronym of a known routine healthcare database and to complications from procedures or devices. The following data sources were searched in February and March 2013: Cochrane Methods Register, Conference Proceedings Citation Index - Science, Econlit, EMBASE, Health Management Information Consortium, Health Technology Assessment database, MathSciNet, MEDLINE, MEDLINE in-process, OAIster, OpenGrey, Science Citation Index Expanded and ScienceDirect. Of the eligible papers, those which reported methods using clinical coding were classified and summarised in tabular form using the following headings: routine healthcare database; medical speciality; method for identifying complications; length of follow-up; method of recording comorbidity. The benefits and limitations of each approach were assessed. From 3688 papers identified from the literature search, 44 reported the use of clinical codes to identify complications, from which four distinct methods were identified: 1) searching the index admission for specified clinical codes, 2) searching a sequence of admissions for specified clinical codes, 3) searching for specified clinical codes for complications from procedures and devices within the International Classification of Diseases 10th revision (ICD-10) coding scheme, which is the methodology recommended by the NHS Classification Service, and 4) conducting manual clinical
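
    Methods 1 and 2 above (searching the index admission versus the whole sequence of admissions for specified codes) can be sketched on hypothetical records; the patient IDs and ICD-10 codes below are invented for illustration:

```python
# Hypothetical admission records: (patient_id, admission_index, icd10_codes).
admissions = [
    ("p1", 0, {"I71.4", "T81.0"}),   # complication coded at index admission
    ("p2", 0, {"I71.4"}),
    ("p2", 1, {"T81.4"}),            # complication at a later admission
    ("p3", 0, {"I71.4"}),
]
COMPLICATION_CODES = {"T81.0", "T81.4"}

def method_index_admission(admissions, codes):
    # Method 1: look for complication codes in the index admission only.
    return {p for p, idx, recorded in admissions
            if idx == 0 and recorded & codes}

def method_admission_sequence(admissions, codes):
    # Method 2: look across the whole sequence of admissions,
    # catching complications that present after discharge.
    return {p for p, _idx, recorded in admissions if recorded & codes}

print(sorted(method_index_admission(admissions, COMPLICATION_CODES)))
print(sorted(method_admission_sequence(admissions, COMPLICATION_CODES)))
```

    The difference between the two result sets is exactly the trade-off the review discusses: longer follow-up captures more complications but risks attributing unrelated events to the procedure.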

  3. Analysing and Rationalising Molecular and Materials Databases Using Machine-Learning

    Science.gov (United States)

    de, Sandip; Ceriotti, Michele

    Computational materials design promises to greatly accelerate the process of discovering new or more performant materials. Several collaborative efforts are contributing to this goal by building databases of structures, containing between thousands and millions of distinct hypothetical compounds, whose properties are computed by high-throughput electronic-structure calculations. The complexity and sheer amount of information has made manual exploration, interpretation and maintenance of these databases a formidable challenge, making it necessary to resort to automatic analysis tools. Here we will demonstrate how, starting from a measure of (dis)similarity between database items built from a combination of local environment descriptors, it is possible to apply hierarchical clustering algorithms, as well as dimensionality reduction methods such as sketchmap, to analyse, classify and interpret trends in molecular and materials databases, as well as to detect inconsistencies and errors. Thanks to the agnostic and flexible nature of the underlying metric, we will show how our framework can be applied transparently to different kinds of systems ranging from organic molecules and oligopeptides to inorganic crystal structures as well as molecular crystals. Funded by National Center for Computational Design and Discovery of Novel Materials (MARVEL) and Swiss National Science Foundation.
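
    A hedged sketch of the clustering step described above: given a precomputed (dis)similarity matrix between database entries, hierarchical clustering groups them automatically. The matrix here is a toy example, not a real descriptor-based metric such as the one the authors build from local environment descriptors:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Toy dissimilarity matrix between four database entries: two tight pairs.
D = np.array([[0.00, 0.10, 0.90, 0.95],
              [0.10, 0.00, 0.85, 0.90],
              [0.90, 0.85, 0.00, 0.05],
              [0.95, 0.90, 0.05, 0.00]])

# Average-linkage hierarchical clustering on the condensed distance matrix.
Z = linkage(squareform(D), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # entries 0,1 share one cluster; entries 2,3 the other
```

    Because only the pairwise metric enters, the same pipeline applies unchanged to molecules, crystals or oligopeptides, which is the "agnostic" property the abstract emphasizes.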

  4. The Russian effort in establishing large atomic and molecular databases

    Science.gov (United States)

    Presnyakov, Leonid P.

    1998-07-01

    The database activities in Russia have been developed in connection with UV and soft X-ray spectroscopic studies of extraterrestrial and laboratory (magnetically confined and laser-produced) plasmas. Two forms of database production are used: i) a set of computer programs to calculate radiative and collisional data for the general atom or ion, and ii) development of numeric database systems with the data stored in the computer. The first form is preferable for collisional data. At the Lebedev Physical Institute, an appropriate set of the codes has been developed. It includes all electronic processes at collision energies from the threshold up to the relativistic limit. The ion-atom (and ion-ion) collisional data are calculated with the methods developed recently. The program for the calculations of the level populations and line intensities is used for spectral diagnostics of transparent plasmas. The second form of database production is widely used at the Institute of Physico-Technical Measurements (VNIIFTRI), and the Troitsk Center: the Institute of Spectroscopy and TRINITI. The main results obtained at the centers above are reviewed. Plans for future developments jointly with international collaborations are discussed.

  5. Constraints on Biological Mechanism from Disease Comorbidity Using Electronic Medical Records and Database of Genetic Variants.

    Directory of Open Access Journals (Sweden)

    Steven C Bagley

    2016-04-01

    Full Text Available Patterns of disease co-occurrence that deviate from statistical independence may represent important constraints on biological mechanism, which sometimes can be explained by shared genetics. In this work we study the relationship between disease co-occurrence and commonly shared genetic architecture of disease. Records of pairs of diseases were combined from two different electronic medical systems (Columbia, Stanford) and compared to a large database of published disease-associated genetic variants (VARIMED); data on 35 disorders were available across all three sources, which include medical records for over 1.2 million patients and variants from over 17,000 publications. Based on the sources in which they appeared, disease pairs were categorized as having predominant clinical, genetic, or both kinds of manifestations. Confounding effects of age on disease incidence were controlled for by only comparing diseases when they fall in the same cluster of similarly shaped incidence patterns. We find that disease pairs that are overrepresented in both electronic medical record systems and in VARIMED come from two main disease classes, autoimmune and neuropsychiatric. We furthermore identify specific genes that are shared within these disease groups.
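
    Deviation from statistical independence can be quantified by comparing the observed co-occurrence count with the count expected if the two diseases occurred independently. A sketch with invented counts, not the Columbia/Stanford data:

```python
# Hypothetical counts from a medical-record system.
n_patients = 100_000
n_a = 2_000      # patients with disease A
n_b = 1_500      # patients with disease B
n_ab = 120       # patients with both A and B

# Under independence, P(A and B) = P(A) * P(B), so the expected count is:
expected = n_a * n_b / n_patients
# Observed/expected ratio > 1 suggests co-occurrence beyond chance.
ratio = n_ab / expected
print(expected, ratio)  # 30.0 4.0
```

    A full analysis would attach a significance test (e.g. chi-squared) and, as the paper does, control for age by comparing only diseases with similarly shaped incidence patterns.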

  6. International Nuclear Safety Center (INSC) database

    International Nuclear Information System (INIS)

    Sofu, T.; Ley, H.; Turski, R.B.

    1997-01-01

    As an integral part of DOE's International Nuclear Safety Center (INSC) at Argonne National Laboratory, the INSC Database has been established to provide an interactively accessible information resource for the world's nuclear facilities and to promote free and open exchange of nuclear safety information among nations. The INSC Database is a comprehensive resource database aimed at a scope and level of detail suitable for safety analysis and risk evaluation for the world's nuclear power plants and facilities. It also provides an electronic forum for international collaborative safety research for the Department of Energy and its international partners. The database is intended to provide plant design information, material properties, computational tools, and results of safety analysis. Initial emphasis in data gathering is given to Soviet-designed reactors in Russia, the former Soviet Union, and Eastern Europe. The implementation is performed under the Oracle database management system, and the World Wide Web is used to serve as the access path for remote users. An interface between the Oracle database and the Web server is established through a custom designed Web-Oracle gateway which is used mainly to perform queries on the stored data in the database tables.

  7. Converting a paper proforma template to a user friendly electronic database to collect traumatic brain injury data

    Directory of Open Access Journals (Sweden)

    Prasad M. Veera

    2014-12-01

    Full Text Available A structured reporting system based on a uniform template will permit uniform data collection, support future statistics, and facilitate and validate independent or comparative audits of performance and quality of care. The successful establishment of a multi-center registry depends on the development of a concise data entry form, a data entry system, and data analysis to continuously maintain the registry. In the first phase we introduced the paper data collection form; in the second phase this form was converted to an electronic interface. In this second phase of the study, the paper proforma developed in the first phase was converted into an electronic database using FileMaker Pro 13 Advanced®. FileMaker Pro 13 Advanced® can store the data, provides a user-friendly interface for data entry, and can be compiled into a standalone runtime program for installation on any other computer system. The next step is to explore whether it would be feasible to use this as a multicenter traumatic brain injury registry.

  8. CEBS: a comprehensive annotated database of toxicological data

    Science.gov (United States)

    Lea, Isabel A.; Gong, Hui; Paleja, Anand; Rashid, Asif; Fostel, Jennifer

    2017-01-01

    The Chemical Effects in Biological Systems database (CEBS) is a comprehensive and unique toxicology resource that compiles individual and summary animal data from the National Toxicology Program (NTP) testing program and other depositors into a single electronic repository. CEBS has undergone significant updates in recent years and currently contains over 11 000 test articles (exposure agents) and over 8000 studies including all available NTP carcinogenicity, short-term toxicity and genetic toxicity studies. Study data provided to CEBS are manually curated, accessioned and subject to quality assurance review prior to release to ensure high quality. The CEBS database has two main components: data collection and data delivery. To accommodate the breadth of data produced by NTP, the CEBS data collection component is an integrated relational design that allows the flexibility to capture any type of electronic data (to date). The data delivery component of the database comprises a series of dedicated user interface tables containing pre-processed data that support each component of the user interface. The user interface has been updated to include a series of nine Guided Search tools that allow access to NTP summary and conclusion data and larger non-NTP datasets. The CEBS database can be accessed online at http://www.niehs.nih.gov/research/resources/databases/cebs/. PMID:27899660

  9. A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases

    Science.gov (United States)

    Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.

    2013-01-01

    How reliable are database-based results on the spatial distribution of biodiversity? Many studies have evidenced the uncertainty of this kind of analysis due to sampling-effort bias and the need for its quantification. Although a number of methods are available for this purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, owing to their null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator in all scenarios of sampling exhaustiveness, and it can therefore be efficiently applied to most databases to enhance the reliability of biodiversity analyses. PMID:23326357
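
The species accumulation curves underlying all three methods can be illustrated with a minimal sketch (the FIDEGAM algorithm itself is not specified in the abstract; this is only the generic curve, with a final-slope value as a naive completeness proxy of my own choosing):

```python
def accumulation_curve(records):
    """Cumulative number of distinct species as occurrence records are
    added one by one; a well-sampled area yields a curve that flattens
    toward the true richness."""
    seen, curve = set(), []
    for species in records:
        seen.add(species)
        curve.append(len(seen))
    return curve

def final_slope(curve, tail=3):
    """Naive completeness proxy (an assumption, not FIDEGAM): new species
    per record over the last `tail` records; near 0 suggests sampling is
    close to exhaustive."""
    return (curve[-1] - curve[-tail]) / (tail - 1)

records = ["A", "B", "A", "C", "B", "D", "A"]
curve = accumulation_curve(records)
print(curve)               # [1, 2, 2, 3, 3, 4, 4]
print(final_slope(curve))  # 0.5: still accumulating, area under-sampled
```

ROC analysis, as in the paper, would then test how well such a statistic separates areas known to be well or poorly sampled.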

  10. Fire test database

    International Nuclear Information System (INIS)

    Lee, J.A.

    1989-01-01

    This paper describes a project recently completed for EPRI by Impell. The purpose of the project was to develop a reference database of fire tests performed on non-typical fire-rated assemblies. The database is designed for use by utility fire protection engineers to locate test reports for power plant fire-rated assemblies. As utilities prepare to respond to Information Notice 88-04, the database will identify utilities, vendors or manufacturers who have specific fire test data. The database contains fire test report summaries for 729 tested configurations. For each summary, a contact is identified from whom a copy of the complete fire test report can be obtained. Five types of configurations are included: doors, dampers, seals, wraps and walls. The database is computerized, with one version for the IBM PC and one for the Mac, each accessed through user-friendly software that allows adding, deleting, browsing, etc. There are five major database files, one for each of the five types of tested configurations. The contents of each provide significant information regarding the test method and the physical attributes of the tested configuration. 3 figs

  11. Electron beam directed energy device and methods of using same

    Science.gov (United States)

    Retsky, Michael W.

    2007-10-16

    A method and apparatus are disclosed for an electron beam directed energy device. The device consists of an electron gun with one or more electron beams. The device includes one or more accelerating plates with holes aligned for beam passage. The plates may be flat or, preferably, shaped to direct each electron beam to exit the electron gun at a predetermined orientation. In one preferred application, the device is located in outer space with individual beams that are directed to focus at a distant target, to be used to impact and destroy missiles. The aiming of the separate beams is designed to overcome Coulomb repulsion. A method is also presented for directing the beams to a target considering the variable terrestrial magnetic field. In another preferred application, the electron beam is directed into the ground to produce a subsurface x-ray source to locate and/or destroy buried or otherwise hidden objects, including explosive devices.

  12. Methods for fabrication of flexible hybrid electronics

    Science.gov (United States)

    Street, Robert A.; Mei, Ping; Krusor, Brent; Ready, Steve E.; Zhang, Yong; Schwartz, David E.; Pierre, Adrien; Doris, Sean E.; Russo, Beverly; Kor, Siv; Veres, Janos

    2017-08-01

    Printed and flexible hybrid electronics (FHE) is an emerging technology with potential applications in smart labels, wearable electronics, soft robotics, and prosthetics. Printed solution-based materials are compatible with plastic film substrates that are flexible, soft, and stretchable, thus enabling conformal integration with non-planar objects. In addition, manufacturing by printing is scalable to large areas and is amenable to low-cost sheet-fed and roll-to-roll processes. FHE includes display and sensory components to interface with users and environments. At the system level, devices also require electronic circuits for power, memory, signal conditioning, and communications. These electronic components can be integrated onto a flexible substrate either by assembly or by printing. PARC has developed systems and processes for realizing both approaches. This talk presents fabrication methods with an emphasis on techniques recently developed for the assembly of off-the-shelf chips. A few examples of systems fabricated with this approach are also described.

  13. An Efficient Method for Electron-Atom Scattering Using Ab-initio Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yuan; Yang, Yonggang; Xiao, Liantuan; Jia, Suotang [Shanxi University, Taiyuan (China)

    2017-02-15

    We present an efficient method based on ab-initio calculations to investigate electron-atom scatterings. Those calculations profit from methods implemented in standard quantum chemistry programs. The new approach is applied to electron-helium scattering. The results are compared with experimental and other theoretical references to demonstrate the efficiency of our method.

  14. ''In situ'' electronic testing method of a neutron detector performance

    International Nuclear Information System (INIS)

    Gonzalez, J.M.; Levai, F.

    1987-01-01

    The method allows detection of any important change in the electrical characteristics of a neutron sensor channel. It checks the response signal produced by an electronic detector circuit when a pulse generator is connected as the input signal on the high-voltage supply. The electronic circuit compares the detector capacitance value, previously measured, against a reference value, which is adjusted in a window-type comparator circuit to detect any significant degradation of the capacitance value in the detector-cable system. The ''in situ'' electronic testing method of neutron detector performance has been verified in a laboratory environment as a potential method to detect any significant change in the capacitance value of a nuclear sensor and its connecting cable, also checking: detector disconnections, cable disconnections, length changes of the connecting cable, short or open circuits in the sensor channel, and any electrical fault in the detector-connector-cable system. The experimental work was carried out by simulating several electrical changes in a nuclear sensor-cable system of a linear D.C. channel which measures reactor power during nuclear reactor operation. It was performed at the Training Reactor Electronic Laboratory. The results and conclusions obtained at the laboratory were satisfactorily confirmed on the electronic instrumentation of the Budapest Technical University Training Reactor, Hungary
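
The window-comparator decision described above can be sketched in software terms (the capacitance values and the ±10% tolerance below are illustrative assumptions, not figures from the record):

```python
def capacitance_ok(measured_pf, reference_pf, tol_pct=10.0):
    """Window-comparator check: flag degradation when the measured
    detector-cable capacitance leaves a +/- tolerance window around the
    previously recorded reference value."""
    window = reference_pf * tol_pct / 100.0
    return abs(measured_pf - reference_pf) <= window

print(capacitance_ok(102.0, 100.0))  # True: inside the +/-10% window
print(capacitance_ok(75.0, 100.0))   # False: disconnection or cable fault suspected
```

A drop in measured capacitance would indicate, for example, a cable disconnection or length change; a short would push it out of the window on the other side.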

  15. [1012.5676] The Exoplanet Orbit Database

    Science.gov (United States)

    Title: The Exoplanet Orbit Database. Authors: Jason T. Wright, Onsi Fakhouri, Geoffrey W. Marcy, Eunkyu Han. We present a database of well-determined orbital parameters of exoplanets. The database comprises those parameters and the method used for each planet's discovery. This Exoplanet Orbit Database includes all planets

  16. The LHCb configuration database

    CERN Document Server

    Abadie, Lana; Gaspar, Clara; Jacobsson, Richard; Jost, Beat; Neufeld, Niko

    2005-01-01

    The Experiment Control System (ECS) will handle the monitoring, configuration and operation of all the LHCb experimental equipment. All parameters required to configure electronics equipment under the control of the ECS will reside in a configuration database. The database will contain two kinds of information: 1. Configuration properties of devices, such as hardware addresses, geographical location, and operational parameters associated with particular running modes (dynamic properties). 2. Connectivity between devices: this consists of describing the output and input connections of a device (static properties). The representation of these data using tables must be complete so that it can provide all the required information to the ECS, and must cater for all the subsystems. The design should also guarantee a fast response time, even if a query results in a large volume of data being loaded from the database into the ECS. To fulfil these constraints, we apply the following methodology: Determine from the d...

  17. A drainage data-based calculation method for coalbed permeability

    International Nuclear Information System (INIS)

    Lai, Feng-peng; Li, Zhi-ping; Fu, Ying-kun; Yang, Zhi-hao

    2013-01-01

    This paper establishes a drainage-data-based calculation method for coalbed permeability. The method combines material balance and production equations. We use a material balance equation to derive the average pressure of the coalbed during production. The dimensionless water production index is introduced into the production equation for the water-production stage. In the subsequent stage, in which both gas and water are produced, the gas-water production ratio is introduced to eliminate the effect of flush-flow radius, skin factor, and other uncertain factors in the calculation of coalbed methane permeability. The relationship between permeability and surface cumulative liquid production can be derived as a single-variable cubic equation. For ten wells in the southern Qinshui coalbed methane field, the trend shows that permeability initially declines and then increases. The results show an exponential relationship between permeability and cumulative water production. The relationship between permeability and cumulative gas production is represented by a linear curve, and that between permeability and surface cumulative liquid production by a cubic polynomial curve. The regression result for permeability and surface cumulative liquid production agrees with the theoretical mathematical relationship. (paper)
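
The cubic relationship between permeability and cumulative liquid production can be illustrated with a small sketch: a cubic is fixed by four points, so exact interpolation through four hypothetical (production, permeability) pairs recovers its coefficients (the data values are invented, not the Qinshui field data):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def cubic_through(points):
    """Coefficients (a0..a3) of k = a0 + a1*q + a2*q^2 + a3*q^3 through
    four (cumulative liquid production, permeability) points."""
    A = [[q ** p for p in range(4)] for q, _ in points]
    b = [k for _, k in points]
    return solve(A, b)

# Hypothetical trend: permeability first declines, then rises with production
points = [(0, 1.0), (1, 0.6), (2, 0.5), (3, 0.9)]
coeffs = cubic_through(points)
```

A field study would instead fit the cubic by least-squares regression over many data points, as the abstract's regression result implies.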

  18. Methods and apparatus for cooling electronics

    Science.gov (United States)

    Hall, Shawn Anthony; Kopcsay, Gerard Vincent

    2014-12-02

    Methods and apparatus are provided for choosing an energy-efficient coolant temperature for electronics by considering the temperature dependence of the electronics' power dissipation. This dependence is explicitly considered in selecting the coolant temperature T.sub.0 that is sent to the equipment. To minimize power consumption P.sub.Total for the entire system, where P.sub.Total=P.sub.0+P.sub.Cool is the sum of the electronic equipment's power consumption P.sub.0 plus the cooling equipment's power consumption P.sub.Cool, P.sub.Total is obtained experimentally, by measuring P.sub.0 and P.sub.Cool, as a function of three parameters: coolant temperature T.sub.0; weather-related temperature T.sub.3 that affects the performance of free-cooling equipment; and computational state C of the electronic equipment, which affects the temperature dependence of its power consumption. This experiment provides, for each possible combination of T.sub.3 and C, the value T.sub.0* of T.sub.0 that minimizes P.sub.Total. During operation, for any combination of T.sub.3 and C that occurs, the corresponding optimal coolant temperature T.sub.0* is selected, and the cooling equipment is commanded to produce it.
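
The selection step can be sketched as a lookup over the experimentally obtained table of optimal temperatures (the table values and the nearest-bin matching on T3 below are illustrative assumptions, not the patent's data):

```python
def optimal_coolant_temp(table, t3, c):
    """Look up the pre-measured optimal coolant temperature T0* for the
    current weather-related temperature T3 and computational state C,
    choosing the nearest measured T3 bin for that state."""
    candidates = [(abs(t3 - t3_m), t0) for (t3_m, c_m), t0 in table.items() if c_m == c]
    if not candidates:
        raise KeyError(f"no measurements for state {c!r}")
    return min(candidates)[1]

# Hypothetical experiment results: (T3, C) -> T0* that minimized P_Total
table = {(10, "idle"): 18.0, (30, "idle"): 22.0,
         (10, "busy"): 14.0, (30, "busy"): 16.0}
print(optimal_coolant_temp(table, 12, "busy"))  # 14.0 (nearest T3 bin is 10)
```

In operation, the cooling equipment would then be commanded to produce the returned T0*.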

  19. Modelling of phase diagrams and thermodynamic properties using Calphad method – Development of thermodynamic databases

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Aleš

    2013-01-01

    Roč. 66, JAN (2013), s. 3-13 ISSN 0927-0256 R&D Projects: GA MŠk(CZ) OC08053 Institutional support: RVO:68081723 Keywords : Calphad method * phase diagram modelling * thermodynamic database development Subject RIV: BJ - Thermodynamics Impact factor: 1.879, year: 2013

  20. Mixed ionic-electronic conductor-based radiation detectors and methods of fabrication

    Science.gov (United States)

    Conway, Adam; Beck, Patrick R; Graff, Robert T; Nelson, Art; Nikolic, Rebecca J; Payne, Stephen A; Voss, Lars; Kim, Hadong

    2015-04-07

    A method of fabricating a mixed ionic-electronic conductor (MIEC, e.g. TlBr)-based radiation detector having halide-treated surfaces, and associated methods of fabrication, which control polarization of the MIEC material to improve stability and operational lifetime.

  1. Methods for recovering metals from electronic waste, and related systems

    Science.gov (United States)

    Lister, Tedd E; Parkman, Jacob A; Diaz Aldana, Luis A; Clark, Gemma; Dufek, Eric J; Keller, Philip

    2017-10-03

    A method of recovering metals from electronic waste comprises providing a powder comprising electronic waste in at least a first reactor and a second reactor and providing an electrolyte comprising at least ferric ions in an electrochemical cell in fluid communication with the first reactor and the second reactor. The method further includes contacting the powders within the first reactor and the second reactor with the electrolyte to dissolve at least one base metal from each reactor into the electrolyte and reduce at least some of the ferric ions to ferrous ions. The ferrous ions are oxidized at an anode of the electrochemical cell to regenerate the ferric ions. The powder within the second reactor comprises a higher weight percent of the at least one base metal than the powder in the first reactor. Additional methods of recovering metals from electronic waste are also described, as well as an apparatus of recovering metals from electronic waste.
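
As a hedged illustration of the chemistry involved, taking copper as the base metal (the record speaks only of "base metals" generically), the ferric-ion demand and the anodic regeneration charge follow from stoichiometry:

```python
F = 96485.0  # Faraday constant, C/mol

def ferric_demand_and_charge(mass_cu_g, molar_mass_cu=63.55):
    """Stoichiometric sketch of the ferric-leach step:
    Cu + 2 Fe3+ -> Cu2+ + 2 Fe2+ in the reactors, then
    Fe2+ -> Fe3+ + e- at the cell anode to regenerate the oxidant."""
    n_cu = mass_cu_g / molar_mass_cu
    n_fe3 = 2.0 * n_cu        # mol Fe3+ reduced per mol Cu dissolved
    charge_c = n_fe3 * F      # coulombs needed to re-oxidize the Fe2+
    return n_fe3, charge_c

n_fe3, q = ferric_demand_and_charge(63.55)  # dissolve 1 mol of copper
print(round(n_fe3, 3), round(q))  # 2.0 192970
```

This closed Fe3+/Fe2+ loop is what lets the electrolyte be reused across the two reactors rather than consumed.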

  2. UNIDIRECTIONAL REPLICATION IN HETEROGENEOUS DATABASES

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2013-01-01

    The use of diverse database technologies in today's enterprises cannot be avoided, so technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we use a Windows-based MS SQL Server database as the source and a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly, together with testing of working models of the...
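
The core of unidirectional replication can be sketched in a few lines: periodically push new or changed rows from the source to the target, keyed by primary key. This toy uses sqlite on both ends as a stand-in for the MS SQL Server to Oracle setup in the study, and a hypothetical `customers` table:

```python
import sqlite3

def replicate_once(src, dst):
    """One pass of unidirectional replication: upsert every source row
    into the target, so the target converges on the source's state."""
    rows = src.execute("SELECT id, name FROM customers").fetchall()
    for rid, name in rows:
        dst.execute(
            "INSERT INTO customers (id, name) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
            (rid, name))
    dst.commit()

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
src.execute("INSERT INTO customers VALUES (1, 'Alice')")
replicate_once(src, dst)
print(dst.execute("SELECT name FROM customers WHERE id = 1").fetchone()[0])  # Alice
```

Production replication tools additionally capture deletes and use change logs rather than full-table scans, but the one-way push above is the essential shape.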

  3. Chemical evaluation of electronic cigarettes

    OpenAIRE

    Cheng, Tianrong

    2014-01-01

    Objective To review the available evidence evaluating the chemicals in refill solutions, cartridges, aerosols and environmental emissions of electronic cigarettes (e-cigarettes). Methods Systematic literature searches were conducted to identify research related to e-cigarettes and chemistry using 5 reference databases and 11 search terms. The search date range was January 2007 to September 2013. The search yielded 36 articles, of which 29 were deemed relevant for analysis. Results The levels ...

  4. Ab initio methods for electron-molecule collisions

    International Nuclear Information System (INIS)

    Collins, L.A.; Schneider, B.I.

    1987-01-01

    This review concentrates on recent advances in treating the electronic aspect of the electron-molecule interaction and leaves to other articles the description of the rotational and vibrational motions. We focus on those methods which give the most complete treatment of the direct, exchange, and correlation effects. Such full treatments are generally necessary at energies below a few Rydbergs (≅ 60 eV). This choice unfortunately necessitates omission of the active and vital areas devoted to the development of model potentials and approximate scattering formulations. The ab initio and model approaches complement each other, and both are extremely important to the full explication of the electron-scattering process. Because of the rapid developments of recent years, we concentrate on the approaches that provide the fullest treatment. 81 refs

  5. The impact of electronic cigarettes on the paediatric population

    OpenAIRE

    Durmowicz, Elizabeth L

    2014-01-01

    Objective To review the impact of electronic cigarettes (e-cigarettes) on children. Methods Five electronic databases were searched through 31 December 2013. Studies in English that included data for children younger than 18 years of age were included. In addition, relevant data from articles identified during searches of the e-cigarette literature, relevant state survey data and paediatric voluntary adverse event reports submitted to the US Food and Drug Administration (FDA) were reviewed an...

  6. Improving Care And Research Electronic Data Trust Antwerp (iCAREdata): a research database of linked data on out-of-hours primary care.

    Science.gov (United States)

    Colliers, Annelies; Bartholomeeusen, Stefaan; Remmen, Roy; Coenen, Samuel; Michiels, Barbara; Bastiaens, Hilde; Van Royen, Paul; Verhoeven, Veronique; Holmgren, Philip; De Ruyck, Bernard; Philips, Hilde

    2016-05-04

    Primary out-of-hours care is developing throughout Europe. High-quality databases with linked data from primary health services can help to improve research and future health services. In 2014, a central clinical research database infrastructure was established (iCAREdata: Improving Care And Research Electronic Data Trust Antwerp, www.icaredata.eu ) for primary and interdisciplinary health care at the University of Antwerp, linking data from General Practice Cooperatives, Emergency Departments and Pharmacies during out-of-hours care. Medical data are pseudonymised using the services of a Trusted Third Party, which encodes private information about patients and physicians before data is sent to iCAREdata. iCAREdata provides many new research opportunities in the fields of clinical epidemiology, health care management and quality of care. A key aspect will be to ensure the quality of data registration by all health care providers. This article describes the establishment of a research database and the possibilities of linking data from different primary out-of-hours care providers, with the potential to help to improve research and the quality of health care services.
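
The pseudonymisation step can be sketched with a keyed hash (the article does not specify the Trusted Third Party's algorithm; HMAC-SHA-256, the key value and the patient identifiers below are illustrative assumptions):

```python
import hashlib
import hmac

def pseudonymise(patient_id, secret_key):
    """Keyed-hash pseudonymisation sketch: only the Trusted Third Party
    holds secret_key, so the research database receives a stable
    pseudonym that cannot be reversed without the key."""
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

key = b"ttp-secret-key"  # held only by the Trusted Third Party
p1 = pseudonymise("BE-1234567", key)
p2 = pseudonymise("BE-1234567", key)
print(p1 == p2)  # True: the same patient links across GP, ED and pharmacy records
```

Because the pseudonym is deterministic under the key, records for one patient from General Practice Cooperatives, Emergency Departments and Pharmacies can be linked without exposing the real identifier.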

  7. DEPOT: Database for electronics parts and other things

    International Nuclear Information System (INIS)

    Logg, C.A.; Clancey, P.W.; Crane, G.

    1990-01-01

    DEPOT has been developed to provide tracking for the Stanford Linear Collider (SLC) control system equipment. For each piece of equipment entered in the database, a complete service, maintenance, modification, certification, location history, and, optionally, a radiation exposure history, can be maintained. To facilitate data entry accuracy, efficiency, and consistency, barcoding technology has been used extensively. DEPOT has been an important tool in improving the reliability of the microsystems controlling SLC. It is now being adopted by other systems at SLAC. 6 refs., 6 figs

  8. Determination of the Electronic Charge--Electrolysis of Water Method.

    Science.gov (United States)

    Venkatachar, Arun C.

    1985-01-01

    Presents an alternative method for measuring the electronic charge using data from the electrolysis of acidified distilled water. The process (carried out in a commercially available electrolytic cell) has the advantage of short completion time so that students can determine electron charge and mass in one laboratory period. (DH)
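
A sketch of the calculation with illustrative (not experimental) numbers: the total charge passed divides by twice the number of H2 molecules collected at the cathode, since each molecule requires two electrons, and the ideal-gas law converts the measured gas volume into a molecule count:

```python
def electron_charge(current_a, time_s, h2_volume_m3, pressure_pa=101325.0,
                    temp_k=293.15, n_avogadro=6.022e23, r_gas=8.314):
    """Estimate the electronic charge e from electrolysis data:
    e = Q / (2 * N_molecules), with N_molecules from PV = nRT."""
    q = current_a * time_s                                  # total charge passed
    n_mol = pressure_pa * h2_volume_m3 / (r_gas * temp_k)   # moles of H2
    n_molecules = n_mol * n_avogadro
    return q / (2.0 * n_molecules)

# Illustrative lab numbers: 0.5 A for 965 s, 60 mL of H2 near room conditions
e = electron_charge(0.5, 965.0, 60e-6)
print(f"{e:.2e} C")  # comes out near 1.6e-19 C
```

Note that this route assumes Avogadro's number; combined with a measured charge-to-mass ratio, it also yields the electron mass within the same laboratory period.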

  9. Final Report on Atomic Database Project

    International Nuclear Information System (INIS)

    Yuan, J.; Gui, Z.; Moses, G.A.

    2006-01-01

    LTE model, the calculation is simple since the Boltzmann distribution can be used. As long as we have the energy levels and the ionization energy, we can calculate the plasma population very easily. However, for the non-LTE model the calculation is very complex, since various atomic data are required to build the transition balance matrix. Currently, empirical formulas are used to calculate data such as electron collision ionization and autoionization. Furnished with these tested atomic data computing codes, we have developed a friendly user interface and a flexible atomic database [5]. The UTA model is considered the most practical method for medium- and high-Z elements, since it is very time-consuming and difficult to calculate the enormous number of transitions. However, the UTA model may overestimate the opacity; therefore, the DTA model is desirable even for medium- and high-Z elements. With the constant decrease in the cost of disk storage and the increase in CPU speed, it is possible to apply the DTA model to medium- and high-Z elements. In this project, we calculate opacities for high-Z elements in a fully detailed term accounting model for significantly populated states. For the various rate coefficients, we calculate the data using the detailed configuration accounting approximation. In order to handle the large volume of data generated for medium- to high-Z atoms, we use the HDF data format as our database format, which is becoming a standard for storing scientific data. We have built a sophisticated graphical user interface using Java technology to distinguish our atomic database from other existing databases. Unlike other atomic databases, in which users can obtain the opacity data only as pairs of photon energy and opacity, in our database the user can browse more detailed atomic data beyond the opacity data set by combining our atomic database and Java technology. For example, the user can find out the abundant ion stage and

  10. Pollution Prevention Successes Database (P2SDb) user guide

    International Nuclear Information System (INIS)

    1995-07-01

    When Pollution Prevention Opportunity Assessments (P2OAs) were launched at the Hanford Site during the summer of 1994, the first comment received from those using them expressed the desire for a method to report assessments electronically. As a temporary measure, macros were developed for use on word processing systems, but a more formal database was obviously needed. Additionally, increased DOE and Washington state reporting requirements for pollution prevention suggested that a database system would streamline the reporting process. The Pollution Prevention Group of Westinghouse Hanford Company (WHC) contracted with the Data Automation Engineering Department from ICF Kaiser Hanford Company (ICFKH) to develop the system. The scope was to develop a database that will track P2OAs conducted by the facilities and contractors at the Hanford Site. It will also track pollution prevention accomplishments that are not the result of P2OAs and document a portion of the Process Waste Assessments conducted in the past. To accommodate the above criteria, yet complete the system in a timely manner, the Pollution Prevention Successes Database (P2SDb) is being implemented in three phases. The first phase will automate the worksheets to provide both input and output of the data associated with the worksheets. The second phase will automate standard summary reports and ad hoc reports. The third phase will provide automated searching of the database to facilitate the sharing of pollution prevention experiences among various users. This User's Guide addresses only the Phase 1 system

  11. Statistics of electron multiplication in multiplier phototube: iterative method

    International Nuclear Information System (INIS)

    Grau Malonda, A.; Ortiz Sanchez, J.F.

    1985-01-01

    An iterative method is applied to study the variation of dynode response in the multiplier phototube. Three different situations are considered, corresponding to the following ways in which electrons are incident on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r̄ electrons. The responses are given for a number of stages between 1 and 5, and for values of the multiplication factor of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (author)
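
The iterative flavor of such calculations can be sketched for one common modelling assumption, Poisson-distributed secondary emission (the paper's exact recursion is not reproduced here). For a branching process in which each electron at one dynode produces a Poisson(m) number of secondaries, the mean and variance propagate stage by stage:

```python
def cascade_stats(m, stages, start=1.0):
    """Iterative mean/variance of the electron cascade: assuming each
    dynode multiplies every incoming electron by an independent
    Poisson(m) number of secondaries, each stage updates
      mean' = m * mean,   var' = mean * m + var * m**2
    (the compound-distribution / branching-process recursion)."""
    mean, var = start, 0.0
    for _ in range(stages):
        mean, var = m * mean, mean * m + var * m * m
    return mean, var

# One electron incident on the first dynode, gain 3 per stage, 5 stages
mean, var = cascade_stats(3, 5)
print(mean, var)  # 243.0 29403.0
```

Skewness and excess kurtosis can be propagated the same way from the higher moments of the per-stage distribution.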

  12. Segmentation of anatomical structures in chest radiographs using supervised methods: a comparative study on a public database

    DEFF Research Database (Denmark)

    van Ginneken, Bram; Stegmann, Mikkel Bille; Loog, Marco

    2006-01-01

    classification method that employs a multi-scale filter bank of Gaussian derivatives and a k-nearest-neighbors classifier. The methods have been tested on a publicly available database of 247 chest radiographs, in which all objects have been manually segmented by two human observers. A parameter optimization...

  13. Discovering new information in bibliographic databases

    Directory of Open Access Journals (Sweden)

    Emil Hudomalj

    2005-01-01

    Full Text Available Databases contain information that usually cannot be revealed by standard query systems. For that purpose, methods for knowledge discovery from databases can be applied, which enable the user to browse aggregated data, discover trends, produce online reports, explore possible new associations within the data, etc. Such methods are successfully employed in various fields, such as banking, insurance and telecommunications, while they are seldom used in libraries. The article reviews the development of query systems for bibliographic databases, including some early attempts to apply modern knowledge discovery methods. Analytical databases are described in more detail, since they usually serve as the basis for knowledge discovery. Data mining approaches are presented, since they are a central step in the knowledge discovery process. The key role that librarians can play in developing systems for finding new information in existing bibliographic databases is stressed.
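
A minimal example of the kind of data-mining step involved: counting keyword co-occurrences across bibliographic records to surface associations no single query would reveal (the records and keywords are hypothetical):

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(records, min_count=2):
    """Count how often keyword pairs appear together across bibliographic
    records -- a minimal association-style analysis of the kind a
    knowledge-discovery layer could run over an analytical database."""
    pairs = Counter()
    for keywords in records:
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return {p: c for p, c in pairs.items() if c >= min_count}

records = [
    ["databases", "data mining", "libraries"],
    ["data mining", "libraries"],
    ["databases", "OPAC"],
]
print(keyword_cooccurrence(records))  # {('data mining', 'libraries'): 2}
```

Full association-rule mining would add support and confidence thresholds, but the counting core is the same.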

  14. Scanning probe methods applied to molecular electronics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlicek, Niko

    2013-08-01

    Scanning probe methods on insulating films offer a rich toolbox to study the electronic, structural and spin properties of individual molecules. This work discusses three issues in the field of molecular and organic electronics. An STM head to be operated in high magnetic fields has been designed and built. The STM head is very compact and rigid, relying on a robust coarse-approach mechanism. This will facilitate investigations of the spin properties of individual molecules in the future. Combined STM/AFM studies revealed a reversible molecular switch based on two stable configurations of DBTH molecules on ultrathin NaCl films. AFM experiments visualize the molecular structure in both states. Our experiments allowed us to unambiguously determine the pathway of the switch. Finally, tunneling into and out of the frontier molecular orbitals of pentacene molecules has been investigated on different insulating films. These experiments show that the local symmetries of the initial and final electron wave functions are decisive for the ratio between elastic and vibration-assisted tunneling. The results can be generalized to electron transport in organic materials.

  15. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  16. Chemical databases evaluated by order theoretical tools.

    Science.gov (United States)

    Voigt, Kristina; Brüggemann, Rainer; Pudenz, Stefan

    2004-10-01

    Data on environmental chemicals are urgently needed to comply with the future chemicals policy in the European Union. The availability of data on parameters and chemicals can be evaluated by chemometric and environmetric methods. Different mathematical and statistical methods are considered in this paper. The emphasis is placed on a new discrete mathematical method called METEOR (method of evaluation by order theory). Application of the Hasse diagram technique (HDT) to the complete data matrix, comprising 12 objects (databases) x 27 attributes (parameters + chemicals), reveals that ECOTOX (ECO), the environmental fate database (EFD) and extoxnet (EXT)--also called multi-database databases--are best. Most specialised single databases are found in a minimal position in the Hasse diagram; these are the biocatalysis/biodegradation database (BID), the pesticide database (PES) and UmweltInfo (UMW). The aggregation of environmental parameters and chemicals (with equal weight) leads to a slimmer data matrix on the attribute side. However, no significant differences are found in the "best" and "worst" objects. The whole approach indicates a rather poor situation in terms of the availability of data on existing chemicals and hence an alarming signal concerning the new and existing chemicals policies of the EEC.
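
The partial order behind the Hasse diagram technique is simple to state in code: one database dominates another if it is at least as good on every attribute and strictly better on at least one; the maximal (undominated) objects are the "best" (the attribute vectors below are toy values, not the paper's 12 x 27 data matrix):

```python
def dominates(a, b):
    """a >= b componentwise and a != b: the order relation behind HDT."""
    return all(x >= y for x, y in zip(a, b)) and a != b

def maximal(objects):
    """Objects dominated by no other object: the top of the Hasse diagram."""
    return [k for k in objects
            if not any(dominates(objects[j], objects[k]) for j in objects if j != k)]

# Toy coverage scores per parameter (hypothetical, names borrowed from the abstract)
dbs = {"ECO": (3, 3, 2), "EFD": (3, 2, 3), "UMW": (1, 1, 1), "PES": (2, 1, 1)}
print(sorted(maximal(dbs)))  # ['ECO', 'EFD'] are incomparable maxima
```

Objects like UMW that every other object dominates end up in the minimal positions of the diagram, mirroring the paper's finding for the specialised databases.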

  17. The Usage Analysis of Databases at Ankara University Digital Library

    Directory of Open Access Journals (Sweden)

    Sacit Arslantekin

    2006-12-01

    Full Text Available The development of information and communication technologies has changed and improved the diversity of resources and services in libraries. These changes continue to develop rapidly throughout the world. As for our country, remarkable developments in this field, especially in university and special libraries, are worth consideration. In order to benefit from existing and forthcoming developments in the field of electronic libraries, the databases used by clients should be well demonstrated and followed closely. Providing wide use of electronic databases increases the production of scientific and social information, which is the ultimate goal. The article first points out electronic resources management and the effect of consortial developments in the field, and then evaluates the results of a questionnaire survey of faculty members at Ankara University on their use of the library's electronic databases.

  18. Constructing a Geology Ontology Using a Relational Database

    Science.gov (United States)

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In the geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multi-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods, such as the Geo-rule based method, the ontology life cycle method and the module design method, have been proposed for applied geological ontologies. Essentially, the relational database-based method is reverse engineering that abstracts semantic information from an existing database. The key is to construct rules for the transformation of database entities into the ontology. Relative to human-computer interaction methods, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritances and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database into an OWL-based geological ontology, based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In the geological ontology, inheritance and union operations between superclasses and subclasses were used to represent the nested relationships in a geochronology and the multiple inheritances
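As a minimal illustration of the kind of transformation rule described above (a sketch only, not the authors' actual rule set), the following converts rows of a hypothetical relational table with a parent_id foreign key into OWL-style triples, mapping the foreign key to rdfs:subClassOf to capture inheritance:

```python
# Hypothetical "rock_unit" table: each row becomes an owl:Class, and the
# parent_id foreign key becomes an rdfs:subClassOf relation (inheritance).
rows = [
    {"id": 1, "name": "SedimentaryRock", "parent_id": None},
    {"id": 2, "name": "Sandstone", "parent_id": 1},
    {"id": 3, "name": "Limestone", "parent_id": 1},
]

def rows_to_triples(rows):
    by_id = {r["id"]: r for r in rows}
    triples = []
    for r in rows:
        triples.append((r["name"], "rdf:type", "owl:Class"))
        if r["parent_id"] is not None:
            parent = by_id[r["parent_id"]]["name"]
            triples.append((r["name"], "rdfs:subClassOf", parent))
    return triples

for s, p, o in rows_to_triples(rows):
    print(s, p, o)
```

An inverse mapping (triples back to rows), as used by the authors to check semantic integrity, would simply invert these two rules.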

  19. Combination of a geolocation database access with infrastructure sensing in TV bands

    OpenAIRE

    Dionísio, Rogério; Ribeiro, Jorge; Marques, Paulo; Rodriguez, Jonathan

    2014-01-01

    This paper describes the implementation and the technical specifications of a geolocation database assisted by an outdoor spectrum-monitoring network. The geolocation database is populated according to the Electronic Communications Committee (ECC) Report 186 methodology. The application programming interface (API) between the sensor network and the geolocation database implements an effective and secure connection to successfully gather sensing data and send it to the geolocation database for ...

  20. Toxicity of ionic liquids: Database and prediction via quantitative structure–activity relationship method

    International Nuclear Information System (INIS)

    Zhao, Yongsheng; Zhao, Jihong; Huang, Ying; Zhou, Qing; Zhang, Xiangping; Zhang, Suojiang

    2014-01-01

    Highlights: • A comprehensive database on the toxicity of ionic liquids (ILs) was established. • The relationship between IL structure and toxicity has been analyzed qualitatively. • Two new QSAR models were developed for predicting the toxicity of ILs to IPC-81. • The accuracy of the proposed nonlinear SVM model is much higher than that of the linear MLR model. • The established models can be explored in designing novel green agents. - Abstract: A comprehensive database on the toxicity of ionic liquids (ILs) is established. The database includes over 4000 pieces of data. Based on the database, the relationship between an IL's structure and its toxicity has been analyzed qualitatively. Furthermore, quantitative structure–activity relationship (QSAR) models are developed to predict the toxicities (EC50 values) of various ILs toward the leukemia rat cell line IPC-81. Four parameters selected by the heuristic method (HM) are used to perform the studies of multiple linear regression (MLR) and support vector machine (SVM). The squared correlation coefficients (R²) and the root mean square errors (RMSE) of the training sets for the two QSAR models are 0.918 and 0.959, and 0.258 and 0.179, respectively. The prediction R² and RMSE for the QSAR test sets are 0.892 and 0.329 by the MLR model, and 0.958 and 0.234 by the SVM model, respectively. The nonlinear model developed with the SVM algorithm substantially outperforms MLR, which indicates that the SVM model is more reliable for predicting the toxicity of ILs. This study shows that increasing the relative number of O atoms in a molecule leads to a decrease in the toxicity of ILs.
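The model comparison above rests on two metrics, R² and RMSE. A small sketch of how they are computed, using invented predicted and observed toxicity values rather than the study's data:

```python
import math

def r_squared(obs, pred):
    # Coefficient of determination: 1 - SS_res / SS_tot
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def rmse(obs, pred):
    # Root mean square error of predictions
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

# Invented observed vs. predicted values, for illustration only
obs = [2.1, 2.9, 3.8, 4.2, 5.0]
pred = [2.0, 3.0, 3.7, 4.4, 4.9]
print(round(r_squared(obs, pred), 3), round(rmse(obs, pred), 3))
```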

  1. The effect of electron range on electron beam induced current collection and a simple method to extract an electron range for any generation function

    International Nuclear Information System (INIS)

    Lahreche, A.; Beggah, Y.; Corkish, R.

    2011-01-01

    The effect of electron range on electron beam induced current (EBIC) is demonstrated, and the problem of choosing the optimal electron ranges to use with simple uniform and point generation function models is resolved by proposing a method to extract an electron range-energy relationship (ERER). The results show that the use of these extracted electron ranges removes the previous disagreement between EBIC curves computed with simple forms of the generation model and those based on a more realistic generation model. The impact of these extracted electron ranges on the extraction of diffusion length, surface recombination velocity and EBIC contrast of defects is discussed. It is also demonstrated that, for the case of uniform generation, the computed EBIC current is independent of the assumed shape of the generation volume. -- Highlights: → The effect of electron ranges on modeling electron beam induced current is shown. → A method to extract an electron range for simple forms of the generation model is proposed. → For uniform generation the EBIC current is independent of the choice of its shape. → Use of the extracted electron ranges removes some existing ambiguity in the literature.

  2. Methods for eliciting, annotating, and analyzing databases for child speech development.

    Science.gov (United States)

    Beckman, Mary E; Plummer, Andrew R; Munson, Benjamin; Reidy, Patrick F

    2017-09-01

    Methods from automatic speech recognition (ASR), such as segmentation and forced alignment, have facilitated the rapid annotation and analysis of very large adult speech databases and databases of caregiver-infant interaction, enabling advances in speech science that were unimaginable just a few decades ago. This paper centers on two main problems that must be addressed in order to have analogous resources for developing and exploiting databases of young children's speech. The first problem is to understand and appreciate the differences between adult and child speech that cause ASR models developed for adult speech to fail when applied to child speech. These differences include the fact that children's vocal tracts are smaller than those of adult males and also changing rapidly in size and shape over the course of development, leading to between-talker variability across age groups that dwarfs the between-talker differences between adult men and women. Moreover, children do not achieve fully adult-like speech motor control until they are young adults, and their vocabularies and phonological proficiency are developing as well, leading to considerably more within-talker variability as well as more between-talker variability. The second problem then is to determine what annotation schemas and analysis techniques can most usefully capture relevant aspects of this variability. Indeed, standard acoustic characterizations applied to child speech reveal that adult-centered annotation schemas fail to capture phenomena such as the emergence of covert contrasts in children's developing phonological systems, while also revealing children's nonuniform progression toward community speech norms as they acquire the phonological systems of their native languages. 
Both problems point to the need for more basic research into the growth and development of the articulatory system (as well as of the lexicon and phonological system) that is oriented explicitly toward the construction of

  3. Nondestructive testing method for a new generation of electronics

    Directory of Open Access Journals (Sweden)

    Azin Anton

    2018-01-01

    Full Text Available The implementation of the Smart City system needs reliable and smoothly operating electronic equipment. The study is aimed at developing a nondestructive testing method for electronic equipment and its components. This method can be used to identify critical design defects of printed circuit boards (PCB and to predict their service life, taking into account the nature of probable operating loads. The study uses an acoustic emission method to identify and localize critical design defects of printed circuit boards. Geometric dimensions of detected critical defects can be determined by the X-ray tomography method. Based on the results of the study, a method combining acoustic emission and X-ray tomography was developed for nondestructive testing of printed circuit boards. The stress-strain state of solder joints containing detected defects was analyzed. This paper gives an example of using the developed method for estimating the degree of damage to joints between PCB components and predicting the service life of the entire PCB.

  4. Spatial access method for urban geospatial database management: An efficient approach of 3D vector data clustering technique

    DEFF Research Database (Denmark)

    Azri, Suhaibah; Ujang, Uznir; Rahman, Alias Abdul

    2014-01-01

    In the last few years, 3D urban data and its information have rapidly increased due to the growth of urban areas and the urbanization phenomenon. These datasets are then maintained and managed in a 3D spatial database system. However, performance deterioration is likely to happen due to the massiveness of 3D...... datasets. As a solution, a 3D spatial index structure is used as a booster to increase the performance of data retrieval. In commercial databases, the commonly and widely used index structure for 3D spatial databases is the 3D R-Tree, due to its simplicity and promising method of handling spatial data. However......D geospatial data clustering to be used in the construction of the 3D R-Tree and respectively could reduce the overlapping among nodes. The proposed method is tested on a 3D urban dataset for the application of urban infill development. By using several cases of data updating operations such as building...

  5. Designing and Testing a Database for the Qweak Measurement

    Science.gov (United States)

    Holcomb, Edward; Spayde, Damon; Pote, Tim

    2009-05-01

    The aim of the Qweak experiment is to make the most precise determination to date, aside from measurements at the Z-pole, of the Weinberg angle via a measurement of the proton's weak charge. The weak charge determines a particle's interaction with Z-type bosons. According to the Standard Model the value of the angle depends on the momentum of the exchanged Z boson and is well-determined. Deviations from the Standard Model would indicate new physics. During Qweak, bundles of longitudinally polarized electrons will be scattered from a proton target. Elastically scattered electrons will be detected in one of eight quartz bars via the emitted Cerenkov radiation. Periodically the helicity of these electrons will be reversed. The difference in the scattering rates of these two helicity states creates an asymmetry; the Weinberg angle can be calculated from this. Our role in the collaboration was the design, creation, and implementation of a database for the Qweak experiment. The purpose of this database is to store pertinent information, such as detector asymmetries and monitor calibrations, for later access. In my talk I plan to discuss the database design and the results of various tests.
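The helicity asymmetry described above is the relative difference between the scattering rates of the two helicity states. A minimal sketch with hypothetical rates (the measured parity-violating asymmetry in Qweak is tiny, of order 10⁻⁷):

```python
# Scattering-rate asymmetry between the two electron helicity states:
# A = (R+ - R-) / (R+ + R-); the Weinberg angle is extracted from A.
def asymmetry(rate_plus, rate_minus):
    return (rate_plus - rate_minus) / (rate_plus + rate_minus)

# Hypothetical, nearly equal rates, giving an asymmetry of order 1e-7
print(asymmetry(1.0000002, 0.9999998))
```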

  6. A Flexible Electronic Commerce Recommendation System

    Science.gov (United States)

    Gong, Songjie

    Recommendation systems have become very popular on E-commerce websites. Many of the largest commerce websites are already using recommender technologies to help their customers find products to purchase. An electronic commerce recommendation system learns from a customer and recommends the products that the customer will find most valuable from among those available. But most recommendation methods are hard-wired into the system and support only fixed recommendations. This paper presents a framework for a flexible electronic commerce recommendation system. The framework is composed of a user model interface, a recommendation engine, a recommendation strategy model, a recommendation technology group, a user interest model and a database interface. In the recommendation strategy model, the method can be collaborative filtering, content-based filtering, association rule mining, knowledge-based filtering, or a mixture of these. The system maps implementation to demand through the strategy model, and the whole system is designed as standard parts to adapt to changes in the recommendation strategy.
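One of the strategies named above, user-based collaborative filtering, can be sketched in a few lines: find the most similar user by cosine similarity and recommend items that user rated but the target user has not seen. Users, items and ratings below are invented for illustration:

```python
import math

# Invented ratings matrix: user -> {item: rating}
ratings = {
    "alice": {"camera": 5, "phone": 3, "laptop": 4},
    "bob":   {"camera": 5, "phone": 2, "laptop": 4, "tablet": 5},
    "carol": {"phone": 5, "tablet": 2},
}

def cosine(u, v):
    # Cosine similarity over the items both users rated
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def recommend(user):
    # Pick the nearest neighbour and suggest their unseen items
    others = [(cosine(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, nearest = max(others)
    return [i for i in ratings[nearest] if i not in ratings[user]]

print(recommend("alice"))
```

In the paper's framework this logic would sit behind the recommendation engine, selected at run time by the strategy model.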

  7. Development of database on the distribution coefficient. 2. Preparation of database

    Energy Technology Data Exchange (ETDEWEB)

    Takebe, Shinichi; Abe, Masayoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    The distribution coefficient is a very important parameter for environmental impact assessment of the disposal of radioactive waste arising from research institutes. The 'Database on the Distribution Coefficient' was built from information obtained through a domestic literature survey covering items such as the value, measuring method and measurement conditions of the distribution coefficient, in order to allow selection of a reasonable distribution coefficient value for use in safety evaluations. This report explains the outline of the preparation of this database and serves as a user guide for the database. (author)
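For context (a textbook definition, not taken from the report), the distribution coefficient Kd is conventionally the ratio of the activity concentration sorbed on the solid phase to that remaining in solution:

```python
# Kd = C_solid / C_solution; with C_solid in Bq/g and C_solution in Bq/mL,
# Kd comes out in mL/g. The measurement values below are hypothetical.
def distribution_coefficient(c_solid_bq_per_g, c_solution_bq_per_ml):
    return c_solid_bq_per_g / c_solution_bq_per_ml

print(distribution_coefficient(120.0, 0.6))  # hypothetical sorption data
```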

  9. Database Description - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Trypanosomes Database Database Description General information of database Database name Trypanosomes Database...stitute of Genetics Research Organization of Information and Systems Yata 1111, Mishima, Shizuoka 411-8540, JAPAN E mail: Database...y Name: Trypanosoma Taxonomy ID: 5690 Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database description The... Article title: Author name(s): Journal: External Links: Original website information Database maintenance s...DB (Protein Data Bank) KEGG PATHWAY Database DrugPort Entry list Available Query search Available Web servic

  10. Adaptive multiresolution method for MAP reconstruction in electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Acar, Erman, E-mail: erman.acar@tut.fi [Department of Signal Processing, Tampere University of Technology, P.O. Box 553, FI-33101 Tampere (Finland); BioMediTech, Tampere University of Technology, Biokatu 10, 33520 Tampere (Finland); Peltonen, Sari; Ruotsalainen, Ulla [Department of Signal Processing, Tampere University of Technology, P.O. Box 553, FI-33101 Tampere (Finland); BioMediTech, Tampere University of Technology, Biokatu 10, 33520 Tampere (Finland)

    2016-11-15

    3D image reconstruction with electron tomography poses problems due to the severely limited range of projection angles and the low signal-to-noise ratio of the acquired projection images. Maximum a posteriori (MAP) reconstruction methods have been successful in compensating for the missing information and suppressing noise with their intrinsic regularization techniques. There are two major problems in MAP reconstruction methods: (1) selection of the regularization parameter that controls the balance between the data fidelity and the prior information, and (2) long computation time. One aim of this study is to provide an adaptive solution to the regularization parameter selection problem without having additional knowledge about the imaging environment and the sample. The other aim is to realize the reconstruction using sequences of resolution levels to shorten the computation time. The reconstructions were analyzed in terms of accuracy and computational efficiency using a simulated biological phantom and publicly available experimental datasets of electron tomography. The numerical and visual evaluations of the experiments show that the adaptive multiresolution method can provide more accurate results than the weighted back projection (WBP), simultaneous iterative reconstruction technique (SIRT), and sequential MAP expectation maximization (sMAPEM) methods. The method is superior to sMAPEM also in terms of computation time and usability since it can reconstruct 3D images significantly faster without requiring any parameter to be set by the user. - Highlights: • An adaptive multiresolution reconstruction method is introduced for electron tomography. • The method provides more accurate results than conventional reconstruction methods. • The missing wedge and noise problems can be compensated for efficiently by the method.
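The MAP criterion and the regularization parameter referred to above can be written in standard textbook notation (not taken from the paper) as a penalized likelihood maximization, with a parameter β balancing data fidelity against the prior:

```latex
\hat{x}_{\mathrm{MAP}}
  = \arg\max_{x}\; \bigl[\, \log p(y \mid x) \;+\; \beta \log p(x) \,\bigr]
```

Here y denotes the measured projections, x the reconstructed volume, and β the regularization parameter whose adaptive selection is one aim of the study.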

  11. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    Science.gov (United States)

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases, taking the individual clinical context into account. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  12. Database in Artificial Intelligence.

    Science.gov (United States)

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., annual fee entitles user to unlimited access to database, document provision, and printed awareness…

  13. RA radiological characterization database application

    International Nuclear Information System (INIS)

    Steljic, M.M.; Ljubenov, V.Lj. (E-mail address of corresponding author: milijanas@vin.bg.ac.yu)

    2005-01-01

    Radiological characterization of the RA research reactor is one of the main activities in the first two years of the reactor decommissioning project. The raw characterization data from direct measurements or laboratory analyses (defined within the existing sampling and measurement programme) have to be interpreted, organized and summarized in order to prepare the final characterization survey report. This report should be made so that the radiological condition of the entire site is completely and accurately shown with the radiological condition of the components clearly depicted. This paper presents an electronic database application, designed as a serviceable and efficient tool for characterization data storage, review and analysis, as well as for the reports generation. Relational database model was designed and the application is made by using Microsoft Access 2002 (SP1), a 32-bit RDBMS for the desktop and client/server database applications that run under Windows XP. (author)

  14. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  15. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  16. ISSUES IN MOBILE DISTRIBUTED REAL TIME DATABASES: PERFORMANCE AND REVIEW

    OpenAIRE

    VISHNU SWAROOP,; Gyanendra Kumar Gupta,; UDAI SHANKER

    2011-01-01

    The increase in handy, small electronic devices in computing fields has made computing more popular and useful in business. Tremendous advances in wireless networks and portable computing devices have led to the development of mobile computing. Support for real-time database systems depends upon timing constraints; the availability of data in distributed databases and ubiquitous computing pull the mobile database concept, which emerges as a new form of technology: mobile distributed ...

  17. A simple method for serving Web hypermaps with dynamic database drill-down

    Directory of Open Access Journals (Sweden)

    Carson Ewart R

    2002-08-01

    Full Text Available Abstract Background HealthCyberMap http://healthcybermap.semanticweb.org aims at mapping parts of health information cyberspace in novel ways to deliver a semantically superior user experience. This is achieved through "intelligent" categorisation and interactive hypermedia visualisation of health resources using metadata, clinical codes and GIS. HealthCyberMap is an ArcView 3.1 project. WebView, the Internet extension to ArcView, publishes HealthCyberMap ArcView Views as Web client-side imagemaps. The basic WebView set-up does not support any GIS database connection, and published Web maps become disconnected from the original project. A dedicated Internet map server would be the best way to serve HealthCyberMap database-driven interactive Web maps, but is an expensive and complex solution to acquire, run and maintain. This paper describes HealthCyberMap's simple, low-cost method for "patching" WebView to serve hypermaps with dynamic database drill-down functionality on the Web. Results The proposed solution is currently used for publishing HealthCyberMap GIS-generated navigational information maps on the Web while maintaining their links with the underlying resource metadata base. Conclusion The authors believe their map serving approach as adopted in HealthCyberMap has been very successful, especially in cases when only map attribute data change without a corresponding effect on map appearance. It should also be possible to use the same solution to publish other interactive GIS-driven maps on the Web, e.g., maps of real-world health problems.
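The drill-down idea can be sketched as follows: generate a client-side imagemap whose area links pass a feature identifier to a server-side script that queries the metadata base. The script name and feature geometry below are hypothetical, not HealthCyberMap's actual implementation:

```python
# Invented map features; in practice these would come from the GIS project.
features = [
    {"id": 17, "shape": "rect", "coords": "10,10,60,40"},
    {"id": 42, "shape": "rect", "coords": "80,15,130,45"},
]

def imagemap_html(features, script="metadata_drilldown.cgi"):
    # Each <area> carries a feature id so the (hypothetical) server-side
    # script can look up the matching metadata record in the database.
    areas = "\n".join(
        '  <area shape="{shape}" coords="{coords}" '
        'href="{script}?feature={id}">'.format(script=script, **f)
        for f in features
    )
    return '<map name="healthmap">\n{}\n</map>'.format(areas)

print(imagemap_html(features))
```

This keeps the published map a plain client-side imagemap (as WebView produces) while restoring the database connection through the query string.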

  18. Improving quality of breast cancer surgery through development of a national breast cancer surgical outcomes (BRCASO) research database

    Directory of Open Access Journals (Sweden)

    Aiello Bowles Erin J

    2012-04-01

    Full Text Available Abstract Background Common measures of surgical quality are 30-day morbidity and mortality, which poorly describe breast cancer surgical quality given its extremely low morbidity and mortality rates. Several national quality programs have collected additional surgical quality measures; however, program participation is voluntary and results may not be generalizable to all surgeons. We developed the Breast Cancer Surgical Outcomes (BRCASO) database to capture meaningful breast cancer surgical quality measures among a non-voluntary sample, and to study variation in these measures across providers, facilities, and health plans. This paper describes our study protocol and data collection methods, and summarizes the strengths and limitations of these data. Methods We included 4524 women ≥18 years diagnosed with breast cancer between 2003 and 2008. All women whose initial breast cancer surgery was performed by a surgeon employed at the University of Vermont or three Cancer Research Network (CRN) health plans were eligible for inclusion. From the CRN institutions, we collected electronic administrative data including tumor registry information, Current Procedural Terminology codes for breast cancer surgeries, surgeons, surgical facilities, and patient demographics. We supplemented electronic data with medical record abstraction to collect additional pathology and surgery detail. All data were manually abstracted at the University of Vermont. Results The CRN institutions pre-filled 30% (22 of 72) of the elements using electronic data. The remaining elements, including detailed pathology margin status and breast and lymph node surgeries, required chart abstraction. The mean age was 61 years (range 20-98 years); 70% of women were diagnosed with invasive ductal carcinoma, 20% with ductal carcinoma in situ, and 10% with invasive lobular carcinoma.
Conclusions The BRCASO database is one of the largest, multi-site research resources of meaningful breast cancer surgical quality data

  19. A comparative study of different methods for calculating electronic transition rates

    Science.gov (United States)

    Kananenka, Alexei A.; Sun, Xiang; Schubert, Alexander; Dunietz, Barry D.; Geva, Eitan

    2018-03-01

    We present a comprehensive comparison of the following mixed quantum-classical methods for calculating electronic transition rates: (1) nonequilibrium Fermi's golden rule, (2) mixed quantum-classical Liouville method, (3) mean-field (Ehrenfest) mixed quantum-classical method, and (4) fewest switches surface-hopping method (in diabatic and adiabatic representations). The comparison is performed on the Garg-Onuchic-Ambegaokar benchmark charge-transfer model, over a broad range of temperatures and electronic coupling strengths, with different nonequilibrium initial states, in the normal and inverted regimes. Under weak to moderate electronic coupling, the nonequilibrium Fermi's golden rule rates are found to be in good agreement with the rates obtained via the mixed quantum-classical Liouville method that coincides with the fully quantum-mechanically exact results for the model system under study. Our results suggest that the nonequilibrium Fermi's golden rule can serve as an inexpensive yet accurate alternative to Ehrenfest and the fewest switches surface-hopping methods.
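For reference, the equilibrium Fermi's golden rule rate takes the standard textbook form (the paper studies a nonequilibrium generalization of this expression):

```latex
k_{i \to f} \;=\; \frac{2\pi}{\hbar}\,
  \bigl|\langle f \,|\, \hat{V} \,|\, i \rangle\bigr|^{2}\, \rho(E_f)
```

where V couples the initial and final electronic states and ρ(E_f) is the density of final states at the transition energy.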

  20. Technical efficiency and economic viability of different cattle identification methods allowed by the Brazilian traceability system

    Directory of Open Access Journals (Sweden)

    Marcos Aurelio Lopes

    2017-03-01

    Full Text Available We aimed to evaluate the technical efficiency and economic viability of the implementation and use of four cattle identification methods allowed by the Brazilian traceability system. The study was conducted in a beef cattle production system located in the State of Mato Grosso, from January to June 2012. Four identification methods (treatments) were compared: T1: ear tag in one ear and ear button in the other ear (eabu); T2: ear tag and iron brand on the right leg (eaib); T3: ear tag in one ear and tattoo on the other ear (eata); and T4: ear tag in one ear and electronic ear tag (eael) on the other. Each treatment was applied to 60 Nelore animals, totaling 240 animals, divided equally into three life stages (calves, young cattle, adult cattle). The study had two phases: implementation (phase 1) and reading and transfer of identification numbers to an electronic database (phase 2). All operating expenses related to the two phases of the study were determined. The database was constructed, and the statistical analyses were performed, using SPSS® 17.0 software. Regarding the time spent on implementation (phase 1), conventional ear tags and electronic ear tags produced similar results, which were lower than those of the hot iron and tattoo methods, which also differed from each other. Regarding the time required for reading the numbers on animals and transcribing them into a database (phase 2), electronic ear-tagging was the fastest method, followed by the conventional ear tag, hot iron and tattoo. Among the methods analyzed, the electronic ear tag had the highest technical efficiency because it required less time to implement identifiers and to complete the process of reading and transcription to an electronic database, and because it did not exhibit any errors. However, the cost of using the electronic ear-tagging method was higher, primarily due to the cost of the device.

  1. 77 FR 47690 - 30-Day Notice of Proposed Information Collection: Civilian Response Corps Database In-Processing...

    Science.gov (United States)

    2012-08-09

    DEPARTMENT OF STATE [Public Notice 7976] 30-Day Notice of Proposed Information Collection: Civilian Response Corps Database In-Processing Electronic Form, OMB Control Number 1405-0168, Form DS-4096... Title of Information Collection: Civilian Response Corps Database In-Processing Electronic Form. OMB...

  2. Using linked electronic data to validate algorithms for health outcomes in administrative databases.

    Science.gov (United States)

    Lee, Wan-Ju; Lee, Todd A; Pickard, Alan Simon; Shoaibi, Azadeh; Schumock, Glen T

    2015-08-01

    The validity of algorithms used to identify health outcomes in claims-based and administrative data is critical to the reliability of findings from observational studies. The traditional approach to algorithm validation, using medical charts, is expensive and time-consuming. An alternative method is to link the claims data to an external, electronic data source that contains information allowing confirmation of the event of interest. In this paper, we describe this external linkage validation method and delineate important considerations to assess the feasibility and appropriateness of validating health outcomes using this approach. This framework can help investigators decide whether to pursue an external linkage validation method for identifying health outcomes in administrative/claims data.
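
The linkage-validation idea described here, flag events with a claims-based algorithm and then confirm them against a linked external data source, reduces in the simplest case to computing a positive predictive value over the linked records. A minimal sketch (all field names and the toy data are hypothetical, not from the paper):

```python
# Sketch of external linkage validation: claims-flagged cases are joined to
# a linked external source (e.g. an EHR) and the algorithm's positive
# predictive value is computed against that gold standard.
from dataclasses import dataclass

@dataclass
class Case:
    patient_id: str
    flagged_by_algorithm: bool   # claims-based algorithm output
    confirmed_in_ehr: bool       # event confirmed via linked external data

def positive_predictive_value(cases):
    """PPV of the claims algorithm against the linked gold standard."""
    flagged = [c for c in cases if c.flagged_by_algorithm]
    if not flagged:
        return None
    confirmed = sum(c.confirmed_in_ehr for c in flagged)
    return confirmed / len(flagged)

cases = [
    Case("p1", True, True),
    Case("p2", True, False),
    Case("p3", True, True),
    Case("p4", False, False),
]
print(positive_predictive_value(cases))  # → 0.6666666666666666
```

The same joined table also supports sensitivity and specificity, provided the external source covers unflagged patients as well.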

  3. Database Description - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available SKIP Stemcell Database, Database Description. General information of database: Database name: SKIP Stemcell Database. Contact address: http://www.skip.med.keio.ac.jp/en/contact/. Database classification: Human Genes and Diseases; Stemcell. Organism Taxonomy Name: Homo sapiens; Taxonomy ID: 9606. Database maintenance site: Center for Medical Genetics, School of Medicine, ... Web services: not available. Need for user registration: not available. About This Database ...

  4. Database Description - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Arabidopsis Phenome Database, Database Description. General information of database: Database name: Arabidopsis Phenome Database. Database maintenance site: ... BioResource Center, Hiroshi Masuya. Database classification: Plant databases - Arabidopsis thaliana. Organism Taxonomy Name: Arabidopsis thaliana; Taxonomy ID: 3702. Database description: The Arabidopsis thaliana phenome ... their effective application. We developed the new Arabidopsis Phenome Database integrating two novel databases ... useful materials for their experimental research. The other, the “Database of Curated Plant Phenome”, focusing ...

  5. Replikasi Unidirectional pada Heterogen Database [Unidirectional Replication in Heterogeneous Databases]

    Directory of Open Access Journals (Sweden)

    Hendro Nindito

    2013-12-01

    Full Text Available The use of diverse database technologies in today's enterprises cannot be avoided, so technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we replicate from a Windows-based MS SQL Server database to a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly and working models of the interaction process are tested through repeated iterations. This research shows that database replication using Oracle GoldenGate can be applied in heterogeneous environments in real time.

  6. Innovative electron transport methods in EGS5

    International Nuclear Information System (INIS)

    Bielajew, A.F.; Wilderman, S.J.

    2000-01-01

    The initial formulation of a Monte Carlo scheme for the transport of high-energy (≳100 keV) electrons was established by Berger in 1963. Calling his method the 'condensed history theory', Berger combined the theoretical results of the previous generation of research into developing approximate solutions of the Boltzmann transport equation with numerical algorithms for exploiting the power of computers to permit iterative, piece-wise solution of the transport equation in a computationally intensive but much less approximate fashion. The methods devised by Berger, with comparatively little modification, provide the foundation of all present-day Monte Carlo electron transport simulation algorithms. Only in the last 15 years, beginning with the development and publication of the PRESTA algorithm, has there been a significant revisitation of the problem of simulating electron transport within the condensed history framework. Research in this area is ongoing, highly active, and far from complete. It presents an enormous challenge, demanding derivation of new analytical transport solutions based on underlying fundamental interaction mechanisms, intuitive insight in the development of computer algorithms, and state-of-the-art computer science skills in order to permit deployment of these techniques in an efficient manner. The EGS5 project, a modern ground-up rewrite of the EGS4 code, is now in the design phase. EGS5 will take modern photon and electron transport algorithms and deploy them in an easy-to-maintain, modern computer language: ANSI-standard C++. Moreover, the well-known difficulties of applying EGS4 to practical geometries (geometry code development, tally routine design) should be made easier and more intuitive through the use of a visual user interface being designed by Quantum Research, Inc., work that is presented elsewhere in this conference. This report commences with a historical review of electron transport models culminating with the proposal of a

  7. FCDD: A Database for Fruit Crops Diseases.

    Science.gov (United States)

    Chauhan, Rupal; Jasrai, Yogesh; Pandya, Himanshu; Chaudhari, Suman; Samota, Chand Mal

    2014-01-01

    The Fruit Crops Diseases Database (FCDD) builds on a number of biotechnology and bioinformatics tools. The FCDD is a unique bioinformatics resource that compiles detailed information on 162 fruit crop diseases, including disease type, causal organism, images, symptoms and their control. The FCDD also contains 171 phytochemicals from 25 fruits, their 2D images and their 20 possible sequences. This information has been manually extracted and manually verified from numerous sources, including other electronic databases, textbooks and scientific journals. FCDD is fully searchable and supports extensive text search. The main focus of the FCDD is on providing possible information on fruit crop diseases, which will help in the discovery of potential drugs from one of the common bioresources: fruits. The database was developed using MySQL. The database interface is developed in PHP, HTML and Java. FCDD is freely available. http://www.fruitcropsdd.com/

  8. New physical methods used in the study of composition, electronic properties and surface phenomena of solid substances. I. Electronic spectroscopies

    International Nuclear Information System (INIS)

    Toderean, A; Ilonca, Gh.

    1981-01-01

    The discovery of different kinds of interactions between solids and photonic, electronic, and ionic beams has led to the development of many new, very sensitive physical methods for the study of solids. This monograph presents some of these methods, useful in compositional analysis and in the study of the electronic properties and surface processes of solid substances, from the point of view both of the physical phenomena underlying them and of the information obtainable with them. The monograph is limited to methods based on the electronic properties of the elements present in the solid samples studied, and this paper presents only those in which the detected beam is an electron beam: ELS, DAPS, ILS, AES, AEAPS, INS, TSS, XPS and UPS. (authors)

  9. PROTEST AND CONSUMER: A CONTENT ANALYSIS OF EBSCO ELECTRONIC DATABASE ACROSS 50 YEARS

    Directory of Open Access Journals (Sweden)

    Kresno A. Hendarto

    2014-02-01

    Full Text Available Abstract: The objective of this research is to review articles on protest and consumers in the EBSCO electronic journal database. This objective is achieved by answering the following questions: (i) what are the methods and basic nature of the reviewed articles on protest; and (ii) what research agenda remains open for the future? The results showed that most reviewed articles used a qualitative approach, and most were empirical in nature. For the empirical articles, most of the data were obtained through observation and were analyzed descriptively. The literature on protest and consumers can be classified into three main categories: (1) articles discussing the frequency, causes, and objectives of protest; (2) articles discussing the consequences of protest; and (3) articles discussing the motivation of individuals to take part in protest. The review also showed major gaps in previous research, including the fact that attitude has not been the focal point in explaining why consumers take part in protest and that no article discussed consumers' attitude toward the object of the protest.

  10. A method for the direct measurement of electronic site populations in a molecular aggregate using two-dimensional electronic-vibrational spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Nicholas H. C.; Dong, Hui; Oliver, Thomas A. A.; Fleming, Graham R., E-mail: grfleming@lbl.gov [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Physical Biosciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Kavli Energy Nanosciences Institute at Berkeley, Berkeley, California 94720 (United States)

    2015-09-28

    Two-dimensional electronic spectroscopy has proved to be a valuable experimental technique for revealing electronic excitation dynamics in photosynthetic pigment-protein complexes, nanoscale semiconductors, organic photovoltaic materials, and many other types of systems. It does not, however, provide direct information concerning the spatial structure and dynamics of excitons. 2D infrared spectroscopy has become a widely used tool for studying structural dynamics but is incapable of directly providing information concerning electronic excited states. 2D electronic-vibrational (2DEV) spectroscopy provides a link between these domains, directly connecting the electronic excitation with the vibrational structure of the system under study. In this work, we derive response functions for the 2DEV spectrum of a molecular dimer and propose a method by which 2DEV spectra could be used to directly measure the electronic site populations as a function of time following the initial electronic excitation. We present results from the response function simulations which show that our proposed approach is substantially valid. This method provides, to our knowledge, the first direct experimental method for measuring the electronic excited state dynamics in the spatial domain, on the molecular scale.

  11. The Barcelona Hospital Clínic therapeutic apheresis database.

    Science.gov (United States)

    Cid, Joan; Carbassé, Gloria; Cid-Caballero, Marc; López-Púa, Yolanda; Alba, Cristina; Perea, Dolores; Lozano, Miguel

    2017-09-22

    A therapeutic apheresis (TA) database helps to increase knowledge about the indications and types of apheresis procedures that are performed in clinical practice. The objective of the present report was to describe the type and number of TA procedures performed at our institution over a 10-year period, from 2007 to 2016. The TA electronic database was created by transferring patient data from electronic medical records and consultation forms into a Microsoft Access database developed exclusively for this purpose. Since 2007, prospective data from every TA procedure have been entered in the database. A total of 5940 TA procedures were performed: 3762 (63.3%) plasma exchange (PE) procedures, 1096 (18.5%) hematopoietic progenitor cell (HPC) collections, and 1082 (18.2%) TA procedures other than PEs and HPC collections. The overall trend for the period was a progressive increase in the total number of TA procedures performed each year (from 483 TA procedures in 2007 to 822 in 2016). The tracking trend of each procedure during the 10-year period was different: the number of PE procedures and of other types of TA procedures increased 22% and 2818%, respectively, while the number of HPC collections decreased 28%. The TA database helped us increase our knowledge about the various indications and types of TA procedures performed in our current practice. We also believe that this database could serve as a model that other institutions can use to track service metrics. © 2017 Wiley Periodicals, Inc.

  12. Accurate Ionization Potentials and Electron Affinities of Acceptor Molecules I. Reference Data at the CCSD(T) Complete Basis Set Limit

    KAUST Repository

    Richard, Ryan M.

    2016-01-05

    © 2016 American Chemical Society. In designing organic materials for electronics applications, particularly for organic photovoltaics (OPV), the ionization potential (IP) of the donor and the electron affinity (EA) of the acceptor play key roles. This makes OPV design an appealing application for computational chemistry, since IPs and EAs are readily calculable with most electronic structure methods. Unfortunately, reliable, high-accuracy wave function methods, such as coupled cluster theory with single, double, and perturbative triples [CCSD(T)] in the complete basis set (CBS) limit, are too expensive for routine application to this problem for any but the smallest of systems. One solution is to calibrate approximate, less computationally expensive methods against a database of high-accuracy IP/EA values; however, to our knowledge, no such database exists for systems related to OPV design. The present work is the first of a multipart study whose overarching goal is to determine which computational methods can be used to reliably compute IPs and EAs of electron acceptors. This part introduces a database of 24 known organic electron acceptors and provides high-accuracy vertical IP and EA values expected to be within ±0.03 eV of the true non-relativistic, vertical CCSD(T)/CBS limit. Convergence of IP and EA values toward the CBS limit is studied systematically for the Hartree-Fock, MP2 correlation, and beyond-MP2 coupled cluster contributions to the focal point estimates.
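
Basis-set convergence studies of this kind commonly use two-point extrapolation of the correlation energy. As a hedged illustration, the widely used X^-3 form is sketched below; the energies are invented placeholders, not values from the paper, and the actual focal-point procedure may differ:

```python
# Two-point correlation-energy CBS extrapolation assuming
# E(X) = E_CBS + A * X**-3, where X is the basis-set cardinal number.

def cbs_extrapolate(e_x, x, e_y, y):
    """E_CBS from correlation energies at cardinal numbers x < y."""
    return (y**3 * e_y - x**3 * e_x) / (y**3 - x**3)

# e.g. triple-zeta (X=3) and quadruple-zeta (X=4) correlation energies
# in hartree (made-up numbers for illustration)
e_tz, e_qz = -1.0500, -1.0700
e_cbs = cbs_extrapolate(e_tz, 3, e_qz, 4)
print(e_cbs)  # slightly below e_qz, as expected for a convergent series
```

The extrapolated value lies below the larger-basis energy, consistent with monotonic convergence of the correlation energy from above.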

  13. Several cases of electronics and the measuring methods

    International Nuclear Information System (INIS)

    Supardiyono, Bb.; Kamadi, J.; Suparmono, M.; Indarto.

    1980-01-01

    Several cases of electronics and their measuring methods are described, covering the electric conductivity and electric potential of analog systems; the electric current, conductivity, and potential of semiconductor diodes; and the characteristics of transistors. (SMN)

  14. Comparing deep neural network and other machine learning algorithms for stroke prediction in a large-scale population-based electronic medical claims database.

    Science.gov (United States)

    Chen-Ying Hung; Wei-Chen Chen; Po-Tsun Lai; Ching-Heng Lin; Chi-Chun Lee

    2017-07-01

    Electronic medical claims (EMCs) can be used to accurately predict the occurrence of a variety of diseases, which can contribute to precise medical interventions. While there is growing interest in the application of machine learning (ML) techniques to clinical problems, the use of deep learning in healthcare has only recently gained attention. Deep learning, such as the deep neural network (DNN), has achieved impressive results in speech recognition, computer vision, and natural language processing in recent years. However, deep learning is often difficult to interpret due to the complexity of its framework, and it has not yet been demonstrated to outperform conventional ML algorithms in disease prediction tasks using EMCs. In this study, we utilize a large population-based EMC database of around 800,000 patients to compare a DNN with three other ML approaches for predicting 5-year stroke occurrence. The results show that the DNN and gradient boosting decision tree (GBDT) achieve similarly high prediction accuracies, better than the logistic regression (LR) and support vector machine (SVM) approaches. Meanwhile, the DNN achieves optimal results using smaller amounts of patient data than the GBDT method.
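
The evaluation protocol behind such comparisons, train several models on one split of the data and compare held-out accuracy, can be sketched with deliberately simple stand-in classifiers. The nearest-centroid and 1-NN models and the synthetic data below are illustrative only; they are not the paper's DNN/GBDT/LR/SVM models:

```python
# Minimal model-comparison harness: fit several classifiers on a training
# split and report accuracy on a held-out test split of synthetic 2-D data.
import random

random.seed(7)

def make_data(n):
    # two Gaussian blobs, class 0 around (0,0) and class 1 around (3,3)
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        cx = 3.0 * label
        data.append(((random.gauss(cx, 1.0), random.gauss(cx, 1.0)), label))
    return data

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def nearest_centroid(train):
    cents = {}
    for lbl in (0, 1):
        pts = [x for x, l in train if l == lbl]
        cents[lbl] = (sum(p[0] for p in pts) / len(pts),
                      sum(p[1] for p in pts) / len(pts))
    return lambda x: min(cents, key=lambda l: dist2(x, cents[l]))

def one_nn(train):
    return lambda x: min(train, key=lambda tl: dist2(x, tl[0]))[1]

def accuracy(model, test):
    return sum(model(x) == l for x, l in test) / len(test)

train, test = make_data(150), make_data(50)
for name, fit in [("nearest-centroid", nearest_centroid), ("1-NN", one_nn)]:
    print(name, accuracy(fit(train), test))
```

In a real study the split would be cross-validated and metrics like AUC would replace raw accuracy, but the comparison loop has the same shape.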

  15. Carbon footprinting of electronic products

    International Nuclear Information System (INIS)

    Vasan, Arvind; Sood, Bhanu; Pecht, Michael

    2014-01-01

    Highlights:
    • Challenges in adopting existing CF standards for electronic products are discussed.
    • The carbon footprint of electronic products is underestimated using existing standards.
    • A multipronged approach is presented to overcome the identified challenges.
    • The multipronged approach is demonstrated on a commercial and military grade DC–DC converter system.

    Abstract: In order to mitigate the effects of global warming, companies are being compelled by governments, investors, and customers to control their greenhouse gas (GHG) emissions. Similar to the European Union's legislation on the airline industry, legislation is expected to require the electronics industry to assess their products' carbon footprints before sale or use, as the electronics industry's contribution to global GHG emissions is comparable to the airline industry's. Thus, it is necessary for members of the electronics industry to assess their current GHG emission rates and identify methods to reduce environmental impacts. Organizations use carbon footprint (CF) analysis methods to identify and quantify the GHG emissions associated with the life cycle stages of their products or services. This paper discusses the prevailing methods used by organizations to estimate the CF of their electronic products and identifies the challenges faced by the electronics industry when adopting these methods in an environment of decreasing product development cycles with complex and diffuse supply chains. We find that, as a result of the inconsistencies arising from system boundary selection methods and databases, the use of outdated LCA approaches, and the lack of suppliers' emissions-related data, the CFs of electronic products are typically underestimated. To address these challenges, we present a comprehensive approach to the carbon footprinting of electronic products that involves the use of product-group-oriented standards, hybrid life cycle assessment techniques, and the

  16. Database thinking development in Context of School Education

    OpenAIRE

    Panský, Mikoláš

    2011-01-01

    The term database thinking is understood as the group of competencies that enables working with a database system. Database thinking development is a targeted educational intervention with the expected outcome that students are able to work with a database system. The thesis focuses on the purposes, content, and methods of database thinking development. The experimental part proposes quantitative metrics for database thinking development. KEYWORDS: Education, Database, Database thinking, Structured Query...

  17. Assessment of imputation methods using varying ecological information to fill the gaps in a tree functional trait database

    Science.gov (United States)

    Poyatos, Rafael; Sus, Oliver; Vilà-Cabrera, Albert; Vayreda, Jordi; Badiella, Llorenç; Mencuccini, Maurizio; Martínez-Vilalta, Jordi

    2016-04-01

    Plant functional traits are increasingly being used in ecosystem ecology thanks to the growing availability of large ecological databases. However, these databases usually contain a large fraction of missing data because measuring plant functional traits systematically is labour-intensive and because most databases are compilations of datasets with different sampling designs. As a result, within a given database there is inevitable variability in the number of traits available for each data entry and/or in the species coverage of a given geographical area. The presence of missing data may severely bias trait-based analyses, such as the quantification of trait covariation or trait-environment relationships, and may hamper efforts towards trait-based modelling of ecosystem biogeochemical cycles. Several data imputation (i.e. gap-filling) methods have recently been tested on compiled functional trait databases, but the performance of imputation methods applied to a functional trait database with a regular spatial sampling has not been thoroughly studied. Here, we assess the effects of data imputation on five tree functional traits (leaf biomass to sapwood area ratio, foliar nitrogen, maximum height, specific leaf area and wood density) in the Ecological and Forest Inventory of Catalonia, an extensive spatial database (covering 31,900 km²). We tested the performance of species-mean imputation, single imputation by the k-nearest neighbors algorithm (kNN) and a multiple imputation method, Multivariate Imputation with Chained Equations (MICE), at different levels of missing data (10%, 30%, 50%, and 80%). We also assessed the changes in imputation performance when additional predictors (species identity, climate, forest structure, spatial structure) were added to the kNN and MICE imputations. We evaluated the imputed datasets using a battery of indexes describing departure from the complete dataset in trait distribution, in the mean prediction error, in the correlation matrix
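
Two of the gap-filling strategies compared in this record, species-mean imputation and kNN imputation, can be illustrated on a toy trait table. The species, trait values, and neighbor count below are invented for illustration; MICE is omitted for brevity:

```python
# Toy gap-filling: fill missing trait values either with the species mean
# or with the mean of the k rows closest in another, observed trait.

# rows: (species, trait_a, trait_b); None marks a gap
rows = [
    ("pine", 10.0, 1.0),
    ("pine", 12.0, None),
    ("pine", 11.0, 1.2),
    ("oak",  20.0, 2.0),
    ("oak",  None, 2.2),
]

def species_mean_impute(rows, col):
    sums = {}
    for sp, *vals in rows:
        if vals[col] is not None:
            sums.setdefault(sp, []).append(vals[col])
    means = {sp: sum(vs) / len(vs) for sp, vs in sums.items()}
    return [vals[col] if vals[col] is not None else means[sp]
            for sp, *vals in rows]

def knn_impute(rows, col, other, k=2):
    # fill each gap with the mean of the k donors closest in the other trait
    out = []
    for sp, *vals in rows:
        if vals[col] is not None:
            out.append(vals[col])
            continue
        donors = sorted(
            (abs(v2[other] - vals[other]), v2[col])
            for _, *v2 in rows
            if v2[col] is not None and v2[other] is not None)
        out.append(sum(v for _, v in donors[:k]) / k)
    return out

print(species_mean_impute(rows, 1))  # pine gap filled with mean(1.0, 1.2)
print(knn_impute(rows, 0, 1))        # oak gap filled from 2 nearest donors
```

Species-mean imputation ignores within-species variation, while kNN can borrow strength from correlated traits; this difference drives much of the performance gap studied in the record above.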

  18. Reliability prediction system based on the failure rate model for electronic components

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Lee, Hwa Ki

    2008-01-01

    Although many methodologies for predicting the reliability of electronic components have been developed, their results can be subjective under a particular set of circumstances, and therefore reliability is not easy to quantify. Among the reliability prediction methods are statistical analysis based methods, similarity analysis methods based on an external failure rate database, and methods based on physics-of-failure models. In this study, we developed a system by which the reliability of electronic components can be predicted using the statistical analysis approach, which is the most straightforward to apply. The failure rate models applied are MIL-HDBK-217F N2, PRISM, and Telcordia (Bellcore), and these were compared with a general purpose system in order to validate the effectiveness of the developed system. Because it can predict the reliability of electronic components from the design stage, the system we have developed is expected to contribute to enhancing the reliability of electronic components
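
A statistical failure-rate prediction of the kind such a system wraps can be sketched as a parts-count calculation. The base rates and pi factors below are illustrative numbers only, not actual MIL-HDBK-217F handbook values:

```python
# Parts-count style failure-rate prediction: the equipment failure rate is
# the sum over part types of quantity * generic rate * quality factor,
# lambda_equip = sum(N_i * lambda_g_i * pi_Q_i), in failures per 10^6 hours.

def parts_count_failure_rate(parts):
    return sum(n * lam_g * pi_q for n, lam_g, pi_q in parts)

bom = [
    # (quantity, generic failure rate, quality factor) -- illustrative values
    (10, 0.0010, 1.0),   # resistors
    (5,  0.0020, 3.0),   # capacitors
    (1,  0.0500, 2.0),   # microcircuit
]
lam = parts_count_failure_rate(bom)
mtbf_hours = 1e6 / lam   # mean time between failures implied by lambda
print(lam, mtbf_hours)
```

More refined part-stress models multiply in additional factors (temperature, environment, electrical stress), but the summation structure stays the same, which is why such predictions are easy to automate.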

  19. On-line database of voltammetric data of immobilized particles for identifying pigments and minerals in archaeometry, conservation and restoration (ELCHER database).

    Science.gov (United States)

    Doménech-Carbó, Antonio; Doménech-Carbó, María Teresa; Valle-Algarra, Francisco Manuel; Gimeno-Adelantado, José Vicente; Osete-Cortina, Laura; Bosch-Reig, Francisco

    2016-07-13

    A web-based database of voltammograms is presented for characterizing artists' pigments and corrosion products of ceramic, stone and metal objects by means of the voltammetry of immobilized particles methodology. A description of the website and the database is provided. Voltammograms are, in most cases, accompanied by scanning electron microphotographs, X-ray spectra, infrared spectra acquired in attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) mode, and diffuse reflectance spectra in the UV-Vis region. To illustrate the usefulness of the database, two case studies involving identification of pigments and a case study describing the deterioration of an archaeological metallic object are presented. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Statistics of electron multiplication in a multiplier phototube; Iterative method

    International Nuclear Information System (INIS)

    Ortiz, J. F.; Grau, A.

    1985-01-01

    In the present paper an iterative method is applied to study the variation of dynode response in a multiplier phototube. Three different situations are considered, corresponding to the following modes of electron incidence on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r electrons. The responses are given for between 1 and 5 stages and for multiplication factors of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (Author) 11 refs
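
The dynode-cascade statistics discussed here follow standard branching-process results: with mean gain m per stage, the mean output after n stages is m^n, and for Poisson secondary emission the variance has a closed form. The simulation below is a generic illustration of a single-electron cascade, not the paper's iterative scheme:

```python
# Dynode cascade as a branching process: each electron at a stage produces
# a Poisson(m) number of secondaries at the next stage.
import math
import random

def cascade_mean(m, n):
    # mean output after n stages, starting from one electron
    return m ** n

def cascade_var_poisson(m, n):
    # Poisson(m) offspring have variance m, so the branching-process
    # formula gives Var(Z_n) = m**n * (m**n - 1) / (m - 1)
    return m ** n * (m ** n - 1) / (m - 1)

def simulate(m, n, trials, seed=1):
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's algorithm; adequate for small lam
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    total = 0
    for _ in range(trials):
        electrons = 1
        for _ in range(n):
            electrons = sum(poisson(m) for _ in range(electrons))
        total += electrons
    return total / trials

m, n = 2.5, 5
print(cascade_mean(m, n))                # 97.65625
print(round(simulate(m, n, 2000), 1))    # close to the analytic mean
```

The simulated mean converges to m^n, while the large variance explains why single-electron pulse-height spectra of photomultipliers are so broad.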

  1. Bibliographical database of radiation biological dosimetry and risk assessment: Part 2

    International Nuclear Information System (INIS)

    Straume, T.; Ricker, Y.; Thut, M.

    1990-09-01

    This is Part II of a database constructed to support research in radiation biological dosimetry and risk assessment. Relevant publications were identified through detailed searches of national and international electronic databases and through our personal knowledge of the subject. Publications were numbered and keyworded, and referenced in an electronic data-retrieval system that permits quick access through computerized searches on authors, key words, title, year, journal name, or publication number. Photocopies of the publications contained in the database are maintained in a file that is numerically arranged by our publication acquisition numbers. This volume contains 1048 additional entries, which are listed in alphabetical order by author. The computer software used for the database is a simple but sophisticated relational database program that permits quick information access, high flexibility, and the creation of customized reports. This program is inexpensive and is commercially available for the Macintosh and the IBM PC. Although the database entries were made using a Macintosh computer, we have the capability to convert the files into the IBM PC version. As of this date, the database cites 2260 publications. Citations in the database are from 200 different scientific journals. There are also references to 80 books and published symposia, and 158 reports. Information relevant to radiation biological dosimetry and risk assessment is widely distributed within the scientific literature, although a few journals clearly predominate. The journals publishing the largest number of relevant papers are Health Physics, with a total of 242 citations in the database, and Mutation Research, with 185 citations. Other journals with over 100 citations in the database are Radiation Research, with 136, and International Journal of Radiation Biology, with 132

  2. Electron paramagnetic resonance: A new method of quaternary dating

    International Nuclear Information System (INIS)

    Poupeau, G.; Rossi, A.; Teles, M.M.; Danon, J.

    1984-01-01

    Significant progress has occurred in recent years in Quaternary geochronology. One of these advances is the emergence of a new dating approach, the electron spin resonance method. The aim of this paper is to briefly review the method and discuss some aspects of the work at CBPF. (Author) [pt

  3. Electron paramagnetic resonance: a new method of quaternary dating

    International Nuclear Information System (INIS)

    Poupeau, G.; Rossi, A.; Universidade Federal Rural do Rio de Janeiro; Telles, M.; Danon, J.

    1984-01-01

    Significant progress has occurred in recent years in Quaternary geochronology. One of these advances is the emergence of a new dating approach, the electron spin resonance method. The aim of this paper is to briefly review the method and discuss some aspects of the work at CBPF. (Author) [pt

  4. Improving decoy databases for protein folding algorithms

    KAUST Repository

    Lindsey, Aaron

    2014-01-01

    Copyright © 2014 ACM. Predicting protein structures and simulating protein folding are two of the most important problems in computational biology today. Simulation methods rely on a scoring function to distinguish the native structure (the most energetically stable) from non-native structures. Decoy databases are collections of non-native structures used to test and verify these functions. We present a method to evaluate and improve the quality of decoy databases by adding novel structures and removing redundant structures. We test our approach on 17 different decoy databases of varying size and type and show significant improvement across a variety of metrics. We also test our improved databases on a popular modern scoring function and show that they contain a greater number of native-like structures than the original databases, thereby producing a more rigorous database for testing scoring functions.
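
The "removing redundant structures" step described above can be illustrated by a greedy similarity filter over toy coordinate sets. The threshold, the structures, and the plain coordinate RMSD (without the superposition a real pipeline would perform first) are all simplifying assumptions:

```python
# Greedy decoy de-duplication: keep a structure only if its RMSD to every
# already-kept structure is at least a threshold.
import math

def rmsd(a, b):
    # plain coordinate RMSD over matched atoms (no superposition;
    # illustration only)
    n = len(a)
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                         for (ax, ay, az), (bx, by, bz) in zip(a, b)) / n)

def deduplicate(structures, threshold=0.5):
    kept = []
    for s in structures:
        if all(rmsd(s, k) >= threshold for k in kept):
            kept.append(s)
    return kept

decoys = [
    [(0, 0, 0),   (1, 0, 0),   (2, 0, 0)],
    [(0, 0, 0.1), (1, 0, 0.1), (2, 0, 0.1)],  # near-duplicate of the first
    [(0, 0, 0),   (0, 1, 0),   (0, 2, 0)],    # genuinely different
]
print(len(deduplicate(decoys)))  # → 2
```

Novel-structure insertion is the mirror image of this filter: a candidate is added only when it is far enough from everything already in the database.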

  5. Domain Regeneration for Cross-Database Micro-Expression Recognition

    Science.gov (United States)

    Zong, Yuan; Zheng, Wenming; Huang, Xiaohua; Shi, Jingang; Cui, Zhen; Zhao, Guoying

    2018-05-01

    In this paper, we investigate the cross-database micro-expression recognition problem, where the training and testing samples come from two different micro-expression databases. Under this setting, the training and testing samples have different feature distributions, and hence the performance of most existing micro-expression recognition methods may decrease greatly. To solve this problem, we propose a simple yet effective method called the Target Sample Re-Generator (TSRG). Using TSRG, we are able to re-generate the samples from the target micro-expression database so that the re-generated target samples share the same or similar feature distributions as the original source samples. For this reason, we can then use the classifier learned on the labeled source samples to accurately predict the micro-expression categories of the unlabeled target samples. To evaluate the performance of the proposed TSRG method, extensive cross-database micro-expression recognition experiments designed based on the SMIC and CASME II databases are conducted. Compared with recent state-of-the-art cross-database emotion recognition methods, the proposed TSRG achieves more promising results.
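
As a simplified illustration of the general idea of re-generating target samples to match the source feature distribution, per-dimension mean and variance alignment works as a stand-in. This z-score alignment is not the actual TSRG method, and the feature values below are invented:

```python
# Align a target feature dimension to a source domain by matching its
# mean and standard deviation (a crude stand-in for sample re-generation).

def mean_std(xs):
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, var ** 0.5

def align(target, source):
    mt, st = mean_std(target)
    ms, ss = mean_std(source)
    # z-score the target, then rescale into the source distribution
    return [(x - mt) / (st or 1.0) * ss + ms for x in target]

source = [1.0, 2.0, 3.0, 4.0]       # source-domain feature values
target = [10.0, 20.0, 30.0, 40.0]   # same shape, different location/scale
regen = align(target, source)
print(regen)  # approximately [1.0, 2.0, 3.0, 4.0]
```

After alignment, a classifier fit on the source features can be applied to the re-generated target features without a gross distribution mismatch, which is the premise behind cross-database adaptation.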

  6. Electronic structure prediction via data-mining the empirical pseudopotential method

    Energy Technology Data Exchange (ETDEWEB)

    Zenasni, H; Aourag, H [LEPM, URMER, Departement of Physics, University Abou Bakr Belkaid, Tlemcen 13000 (Algeria); Broderick, S R; Rajan, K [Department of Materials Science and Engineering, Iowa State University, Ames, Iowa 50011-2230 (United States)

    2010-01-15

    We introduce a new approach for accelerating the calculation of the electronic structure of new materials by utilizing the empirical pseudopotential method combined with data mining tools. Combining data mining with the empirical pseudopotential method allows us to convert an empirical approach to a predictive approach. Here we consider tetrahedrally bonded III-V Bi semiconductors, and through the prediction of form factors based on basic elemental properties we can model the band structure and charge density for these semiconductors, for which limited results exist. This work represents a unique approach to modeling the electronic structure of a material which may be used to identify new promising semiconductors and is one of the few efforts utilizing data mining at an electronic level. (Abstract Copyright [2010], Wiley Periodicals, Inc.)

  7. The Danish Hysterectomy and Hysteroscopy Database

    DEFF Research Database (Denmark)

    Topsøe, Märta Fink; Ibfelt, Else Helene; Settnes, Annette

    2016-01-01

    AIM OF THE DATABASE: The steering committee of the Danish Hysterectomy and Hysteroscopy Database (DHHD) has defined the objective of the database: the aim is firstly to reduce complications, readmissions, reoperations; secondly to specify the need for hospitalization after hysterectomy; thirdly...... DATA: Annually approximately 4,300 hysterectomies and 3,200 operative hysteroscopies are performed in Denmark. Since the establishment of the database in 2003, 50,000 hysterectomies have been registered. DHHD's nationwide cooperation and research have led to national guidelines and regimes. Annual...... national meetings and nationwide workshops have been organized. CONCLUSION: The use of vaginal and laparoscopic hysterectomy methods has increased during the past decade and the overall complication rate and hospital stay have declined. The regional variation in operation methods has also decreased....

  8. Improved methods for high resolution electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.R.

    1987-04-01

    Existing methods of making support films for high resolution transmission electron microscopy are investigated and novel methods are developed. Existing methods of fabricating fenestrated, metal reinforced specimen supports (microgrids) are evaluated for their potential to reduce beam induced movement of monolamellar crystals of C/sub 44/H/sub 90/ paraffin supported on thin carbon films. Improved methods of producing hydrophobic carbon films by vacuum evaporation, and improved methods of depositing well ordered monolamellar paraffin crystals on carbon films are developed. A novel technique for vacuum evaporation of metals is described which is used to reinforce microgrids. A technique is also developed to bond thin carbon films to microgrids with a polymer bonding agent. Unique biochemical methods are described to accomplish site specific covalent modification of membrane proteins. Protocols are given which covalently convert the carboxy terminus of papain cleaved bacteriorhodopsin to a free thiol. 53 refs., 19 figs., 1 tab.

  9. Thick-Restart Lanczos Method for Electronic Structure Calculations

    International Nuclear Information System (INIS)

    Simon, Horst D.; Wang, L.-W.; Wu, Kesheng

    1999-01-01

    This paper describes two recent innovations related to the classic Lanczos method for eigenvalue problems, namely the thick-restart technique and dynamic restarting schemes. Combining these two new techniques we are able to implement an efficient eigenvalue problem solver. This paper will demonstrate its effectiveness on one particular class of problems for which this method is well suited: linear eigenvalue problems generated from non-self-consistent electronic structure calculations

  10. YMDB: the Yeast Metabolome Database

    Science.gov (United States)

    Jewison, Timothy; Knox, Craig; Neveu, Vanessa; Djoumbou, Yannick; Guo, An Chi; Lee, Jacqueline; Liu, Philip; Mandal, Rupasri; Krishnamurthy, Ram; Sinelnikov, Igor; Wilson, Michael; Wishart, David S.

    2012-01-01

    The Yeast Metabolome Database (YMDB, http://www.ymdb.ca) is a richly annotated ‘metabolomic’ database containing detailed information about the metabolome of Saccharomyces cerevisiae. Modeled closely after the Human Metabolome Database, the YMDB contains >2000 metabolites with links to 995 different genes/proteins, including enzymes and transporters. The information in YMDB has been gathered from hundreds of books, journal articles and electronic databases. In addition to its comprehensive literature-derived data, the YMDB also contains an extensive collection of experimental intracellular and extracellular metabolite concentration data compiled from detailed Mass Spectrometry (MS) and Nuclear Magnetic Resonance (NMR) metabolomic analyses performed in our lab. This is further supplemented with thousands of NMR and MS spectra collected on pure, reference yeast metabolites. Each metabolite entry in the YMDB contains an average of 80 separate data fields including comprehensive compound description, names and synonyms, structural information, physico-chemical data, reference NMR and MS spectra, intracellular/extracellular concentrations, growth conditions and substrates, pathway information, enzyme data, gene/protein sequence data, as well as numerous hyperlinks to images, references and other public databases. Extensive searching, relational querying and data browsing tools are also provided that support text, chemical structure, spectral, molecular weight and gene/protein sequence queries. Because of S. cerevisiae's importance as a model organism for biologists and as a biofactory for industry, we believe this kind of database could have considerable appeal not only to metabolomics researchers, but also to yeast biologists, systems biologists, the industrial fermentation industry, as well as the beer, wine and spirit industry. PMID:22064855

  11. Electronic properties of antiferromagnetic UBi2 metal by exact exchange for correlated electrons method

    Directory of Open Access Journals (Sweden)

    E Ghasemikhah

    2012-03-01

    Full Text Available This study investigated the electronic properties of antiferromagnetic UBi2 metal by using ab initio calculations based on the density functional theory (DFT, employing the augmented plane waves plus local orbital method. We used the exact exchange for correlated electrons (EECE method to calculate the exchange-correlation energy under a variety of hybrid functionals. Electric field gradients (EFGs at the uranium site in UBi2 compound were calculated and compared with the experiment. The EFGs were predicted experimentally at the U site to be very small in this compound. The EFG calculated by the EECE functional are in agreement with the experiment. The densities of states (DOSs show that 5f U orbital is hybrided with the other orbitals. The plotted Fermi surfaces show that there are two kinds of charges on Fermi surface of this compound.

  12. A Distributed Database System for Developing Ontological and Lexical Resources in Harmony

    NARCIS (Netherlands)

    Horák, A.; Vossen, P.T.J.M.; Rambousek, A.; Gelbukh, A.

    2010-01-01

    In this article, we present the basic ideas of creating a new information-rich lexical database of Dutch, called Cornetto, that is interconnected with corresponding English synsets and a formal ontology. The Cornetto database is based on two existing electronic dictionaries - the Referentie Bestand

  13. Locating relevant patient information in electronic health record data using representations of clinical concepts and database structures.

    Science.gov (United States)

    Pan, Xuequn; Cimino, James J

    2014-01-01

    Clinicians and clinical researchers often seek information in electronic health records (EHRs) that are relevant to some concept of interest, such as a disease or finding. The heterogeneous nature of EHRs can complicate retrieval, risking incomplete results. We frame this problem as the presence of two gaps: 1) a gap between clinical concepts and their representations in EHR data and 2) a gap between data representations and their locations within EHR data structures. We bridge these gaps with a knowledge structure that comprises relationships among clinical concepts (including concepts of interest and concepts that may be instantiated in EHR data) and relationships between clinical concepts and the database structures. We make use of available knowledge resources to develop a reproducible, scalable process for creating a knowledge base that can support automated query expansion from a clinical concept to all relevant EHR data.
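The two-gap bridging described above can be sketched as a breadth-first traversal of a small knowledge structure: concept-to-concept relationships expand the concept of interest, and concept-to-location mappings point into the database schema. All concept names, codes, and table/column names below are invented for illustration.

```python
from collections import deque

# Toy knowledge base (hypothetical contents): concept -> narrower or
# coded concepts, and coded concept -> (table, column) locations.
IS_A = {
    "diabetes mellitus": ["type 1 diabetes", "type 2 diabetes"],
    "type 1 diabetes": ["ICD9:250.01"],
    "type 2 diabetes": ["ICD9:250.00"],
}
LOCATIONS = {
    "ICD9:250.00": [("diagnosis_table", "icd9_code")],
    "ICD9:250.01": [("diagnosis_table", "icd9_code")],
}

def expand(concept):
    """Traverse concept relationships breadth-first and return every
    (table, column, code) triple a query for `concept` should cover."""
    hits, seen, queue = [], {concept}, deque([concept])
    while queue:
        c = queue.popleft()
        for table, column in LOCATIONS.get(c, []):
            hits.append((table, column, c))
        for child in IS_A.get(c, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return hits

print(expand("diabetes mellitus"))
```

A query for "diabetes mellitus" thus automatically reaches every coded representation and every table column where those codes are instantiated.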

  14. Implementation of K-Means Clustering Method for Electronic Learning Model

    Science.gov (United States)

    Latipa Sari, Herlina; Suranti Mrs., Dewi; Natalia Zulita, Leni

    2017-12-01

    The teaching and learning process at SMK Negeri 2 Bengkulu Tengah has applied an e-learning system for teachers and students. The e-learning was based on the classification of normative, productive, and adaptive subjects. SMK Negeri 2 Bengkulu Tengah consisted of 394 students and 60 teachers with 16 subjects. The record of the e-learning database was used in this research to observe students’ activity patterns in attending class. The K-Means algorithm was used to classify students’ learning activities in the e-learning system, yielding clusters of student activity and of improvement in student ability. Implementation of the K-Means clustering method for the electronic learning model at SMK Negeri 2 Bengkulu Tengah was conducted by observing 10 students’ activities, namely participation of students in the classroom, submit assignment, view assignment, add discussion, view discussion, add comment, download course materials, view article, view test, and submit test. In the e-learning model, testing was conducted on 10 students and yielded 2 clusters of membership data (C1 and C2). Cluster 1: membership percentage of 70%, consisting of 6 members, namely 1112438 Anggi Julian, 1112439 Anis Maulita, 1112441 Ardi Febriansyah, 1112452 Berlian Sinurat, 1112460 Dewi Anugrah Anwar and 1112467 Eka Tri Oktavia Sari. Cluster 2: membership percentage of 30%, consisting of 4 members, namely 1112463 Dosita Afriyani, 1112471 Erda Novita, 1112474 Eskardi and 1112477 Fachrur Rozi.
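The clustering step can be sketched with a plain two-cluster k-means over per-student activity counts. The counts below are synthetic stand-ins, not the study's data, and the deterministic first/last-row initialization is a simplification for the two-cluster case.

```python
import numpy as np

def kmeans2(X, iters=50):
    """Plain two-cluster k-means: deterministic init from the first
    and last rows, then alternating assignment / update steps."""
    centroids = X[[0, -1]].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(2):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic counts for 10 students across the 10 tracked e-learning
# actions (participation, submit assignment, ..., submit test)
rng = np.random.default_rng(1)
X = np.vstack([rng.integers(15, 25, size=(6, 10)),   # highly active
               rng.integers(0, 6, size=(4, 10))]).astype(float)
labels = kmeans2(X)
print(labels)  # → [0 0 0 0 0 0 1 1 1 1]
```

The two label groups correspond to the C1/C2 memberships reported in the abstract: one cluster of more active students and one of less active students.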

  15. Electronic cigarettes: product characterisation and design considerations

    OpenAIRE

    Brown, Christopher J; Cheng, James M

    2014-01-01

    Objective To review the available evidence regarding electronic cigarette (e-cigarette) product characterisation and design features in order to understand their potential impact on individual users and on public health. Methods Systematic literature searches in 10 reference databases were conducted through October 2013. A total of 14 articles and documents and 16 patents were included in this analysis. Results Numerous disposable and reusable e-cigarette product options exist, representing w...

  16. The validity of the density scaling method in primary electron transport for photon and electron beams

    International Nuclear Information System (INIS)

    Woo, M.K.; Cunningham, J.R.

    1990-01-01

    In the convolution/superposition method of photon beam dose calculations, inhomogeneities are usually handled by using some form of scaling involving the relative electron densities of the inhomogeneities. In this paper the accuracy of density scaling as applied to primary electrons generated in photon interactions is examined. Monte Carlo calculations are compared with density scaling calculations for air and cork slab inhomogeneities. For individual primary photon kernels as well as for photon interactions restricted to a thin layer, the results can differ significantly, by up to 50%, between the two calculations. However, for realistic photon beams where interactions occur throughout the whole irradiated volume, the discrepancies are much less severe. The discrepancies for the kernel calculation are attributed to the scattering characteristics of the electrons and the consequent oversimplified modeling used in the density scaling method. A technique called the kernel integration technique is developed to analyze the general effects of air and cork inhomogeneities. It is shown that the discrepancies become significant only under rather extreme conditions, such as immediately beyond the surface after a large air gap. In electron beams all the primary electrons originate from the surface of the phantom and the errors caused by simple density scaling can be much more significant. Various aspects relating to the accuracy of density scaling for air and cork slab inhomogeneities are discussed
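For primary transport, density scaling amounts to replacing geometric depth with a water-equivalent (radiological) depth: each slab's thickness is weighted by its electron density relative to water. A minimal sketch, with illustrative slab values:

```python
def effective_depth(layers):
    """Water-equivalent depth by density scaling. `layers` is a list
    of (thickness_cm, relative_electron_density) pairs; each slab
    contributes thickness * relative electron density."""
    return sum(t * rho for t, rho in layers)

# Hypothetical slab phantom: 3 cm water, 5 cm air gap, 2 cm cork
phantom = [(3.0, 1.0), (5.0, 0.001), (2.0, 0.3)]
print(effective_depth(phantom))  # → 3.605 (cm water-equivalent)
```

The discrepancies discussed in the abstract arise because this simple rescaling ignores how electron scattering itself changes across the inhomogeneity, which matters most just beyond a large air gap.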

  17. Electron and Positron Stopping Powers of Materials

    Science.gov (United States)

    SRD 7 NIST Electron and Positron Stopping Powers of Materials (PC database for purchase)   The EPSTAR database provides rapid calculations of stopping powers (collisional, radiative, and total), CSDA ranges, radiation yields and density effect corrections for incident electrons or positrons with kinetic energies from 1 keV to 10 GeV, and for any chemically defined target material.

  18. Design of multi-tiered database application based on CORBA component in SDUV-FEL system

    International Nuclear Information System (INIS)

    Sun Xiaoying; Shen Liren; Dai Zhimin

    2004-01-01

    The drawback of usual two-tiered database architecture was analyzed and the Shanghai Deep Ultraviolet-Free Electron Laser database system under development was discussed. A project for realizing the multi-tiered database architecture based on common object request broker architecture (CORBA) component and middleware model constructed by C++ was presented. A magnet database was given to exhibit the design of the CORBA component. (authors)

  19. Relativistic convergent close-coupling method applied to electron scattering from mercury

    International Nuclear Information System (INIS)

    Bostock, Christopher J.; Fursa, Dmitry V.; Bray, Igor

    2010-01-01

    We report on the extension of the recently formulated relativistic convergent close-coupling (RCCC) method to accommodate two-electron and quasi-two-electron targets. We apply the theory to electron scattering from mercury and obtain differential and integrated cross sections for elastic and inelastic scattering. We compare with previous nonrelativistic convergent close-coupling (CCC) calculations and, for a number of transitions, obtain significantly better agreement with the experiment. The RCCC method is able to resolve structure in the integrated cross sections for the energy regime in the vicinity of the excitation thresholds for the (6s6p) ³P₀,₁,₂ states. These cross sections are associated with the formation of negative-ion (Hg⁻) resonances that could not be resolved with the nonrelativistic CCC method. The RCCC results are compared with the experiment and other relativistic theories.

  20. Advanced approaches to intelligent information and database systems

    CERN Document Server

    Boonjing, Veera; Chittayasothorn, Suphamit

    2014-01-01

    This book consists of 35 chapters presenting different theoretical and practical aspects of Intelligent Information and Database Systems. Nowadays both Intelligent and Database Systems are applied in most of the areas of human activities which necessitates further research in these areas. In this book various interesting issues related to the intelligent information models and methods as well as their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications are presented and discussed both from the practical and theoretical points of view. The book is organized in four parts devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be interesting for both practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced ...

  1. Developing a stone database for clinical practice.

    Science.gov (United States)

    Turney, Benjamin W; Noble, Jeremy G; Reynard, John M

    2011-09-01

    Our objective was to design an intranet-based database to streamline stone patient management and data collection. The system developers used a rapid development approach that removed the need for laborious and unnecessary documentation, instead focusing on producing a rapid prototype that could then be altered iteratively. By using open source development software and website best practice, the development cost was kept very low in comparison with traditional clinical applications. Information about each patient episode can be entered via a user-friendly interface. The bespoke electronic stone database removes the need for handwritten notes, dictation, and typing. From the database, files may be automatically generated for clinic letters, operation notes, and letters to family doctors. These may be printed or e-mailed from the database. Data may be easily exported for audits, coding, and research. Data collection remains central to medical practice, to improve patient safety, to analyze medical and surgical outcomes, and to evaluate emerging treatments. Establishing prospective data collection is crucial to this process. In the current era, we have the opportunity to embrace available technology to facilitate this process. The database template could be modified for use in other clinics. The database that we have designed helps to provide a modern and efficient clinical stone service.
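The letter-generation step can be sketched as filling a text template from a structured episode record. The field names and wording below are invented for illustration and are not the system's actual schema:

```python
# Hypothetical clinic-letter template; placeholders map onto fields
# of a structured stone-episode record.
TEMPLATE = (
    "Dear Dr {gp},\n\n"
    "{name} was reviewed in the stone clinic on {date}. "
    "Imaging showed a {size_mm} mm {side} {location} calculus. "
    "Plan: {plan}.\n"
)

episode = {
    "gp": "Smith", "name": "J. Bloggs", "date": "2011-06-01",
    "size_mm": 7, "side": "left", "location": "renal pelvis",
    "plan": "extracorporeal shockwave lithotripsy",
}

letter = TEMPLATE.format(**episode)
print(letter)
```

Because the same structured record feeds letters, operation notes, and exports, the data is entered once and reused, which is the efficiency gain the abstract describes.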

  2. Database Description - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Yeast Interacting Proteins Database Database Description General information of database Database... name Yeast Interacting Proteins Database Alternative name - DOI 10.18908/lsdba.nbdc00742-000 Creator C...-ken 277-8561 Tel: +81-4-7136-3989 FAX: +81-4-7136-3979 E-mail : Database classif...s cerevisiae Taxonomy ID: 4932 Database description Information on interactions and related information obta...l Acad Sci U S A. 2001 Apr 10;98(8):4569-74. Epub 2001 Mar 13. External Links: Original website information Database

  3. Update History of This Database - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Trypanosomes Database Update History of This Database Date Update contents 2014/05/07 The co...ntact information is corrected. The features and manner of utilization of the database are corrected. 2014/02/04 Trypanosomes Databas...e English archive site is opened. 2011/04/04 Trypanosomes Database ( http://www.tan...paku.org/tdb/ ) is opened. About This Database Database Description Download Lice...nse Update History of This Database Site Policy | Contact Us Update History of This Database - Trypanosomes Database | LSDB Archive ...

  4. On the absorbed dose determination method in high energy electrons beams

    International Nuclear Information System (INIS)

    Scarlat, F.; Scarisoreanu, A.; Oane, M.; Mitru, E.; Avadanei, C.

    2008-01-01

    The absorbed dose determination method in water for electron beams with energies in the range from 1 MeV to 50 MeV is presented herein. The dosimetry equipment for measurements is composed of an UNIDOS PTW electrometer and different ionization chambers calibrated in terms of air kerma in a ⁶⁰Co beam. Starting from the code of practice for high energy electron beams, this paper describes the method adopted by the secondary standard dosimetry laboratory (SSDL) in NILPRP - Bucharest
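Air-kerma-based codes of practice for electron dosimetry share the general chain sketched below: an air-kerma calibration factor is converted to a dose-to-air chamber factor, then to dose to water via stopping-power ratios and perturbation corrections. Every numeric value here is illustrative, not tabulated protocol data.

```python
def dose_to_water(M, N_K, g=0.003, k_att=0.990, k_m=0.995,
                  s_w_air=1.05, p_cav=0.98):
    """Illustrative air-kerma-based dosimetry chain:
      N_D,air = N_K * (1 - g) * k_att * k_m
      D_w     = M * N_D,air * s_w,air * p_cav
    M is the corrected electrometer reading (C) and N_K the
    air-kerma calibration factor (Gy/C); all default factor
    values are made-up placeholders."""
    n_d_air = N_K * (1.0 - g) * k_att * k_m
    return M * n_d_air * s_w_air * p_cav

# Example with invented reading and calibration factor
print(dose_to_water(M=12e-9, N_K=4.5e7))  # dose in Gy, roughly 0.55
```

In practice each factor depends on chamber type, beam quality, and depth, and is taken from the protocol's tables rather than fixed constants.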

  5. Investigation on structuring the human body function database; Shintai kino database no kochiku ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Based on the concept of human life engineering database, a study was made to know how to technically make such a database fittable to the old people in the age-advancing society. It was then proposed that the old people`s human life engineering database should be prepared to serve for the development and design of life technology to be applied into the age-advancing society. An executive method of structuring the database was established through the `bathing` and `going out` selected as an action to be casestudied in the daily life of old people. As a result of the study, the proposal was made that the old people`s human body function database should be prepared as a R and D base for the life technology in the aged society. Based on the above proposal, a master plan was mapped out to structure this database with the concrete method studied for putting it into action. At the first investigation stage of the above study, documentation was made through utilizing the existing documentary database. Enterprises were also interviewed for the investigation. Pertaining to the function of old people, about 500 documents were extracted with many vague points not clarified yet. The investigation will restart in the next fiscal year. 4 refs., 38 figs., 30 tabs.

  6. Method for pulse to pulse dose reproducibility applied to electron linear accelerators

    International Nuclear Information System (INIS)

    Ighigeanu, D.; Martin, D.; Oproiu, C.; Cirstea, E.; Craciun, G.

    2002-01-01

    An original method for obtaining programmed beam single shots and pulse trains with programmed pulse number, pulse repetition frequency, pulse duration and pulse dose is presented. It is particularly useful for automatic control of absorbed dose rate level, irradiation process control as well as in pulse radiolysis studies, single pulse dose measurement or for research experiments where pulse-to-pulse dose reproducibility is required. This method is applied to the electron linear accelerators, ALIN-10 of 6.23 MeV and 82 W and ALID-7, of 5.5 MeV and 670 W, built in NILPRP. In order to implement this method, the accelerator triggering system (ATS) consists of two branches: the gun branch and the magnetron branch. ATS, which synchronizes all the system units, delivers trigger pulses at a programmed repetition rate (up to 250 pulses/s) to the gun (80 kV, 10 A and 4 ms) and magnetron (45 kV, 100 A, and 4 ms).The accelerated electron beam existence is determined by the electron gun and magnetron pulses overlapping. The method consists in controlling the overlapping of pulses in order to deliver the beam in the desired sequence. This control is implemented by a discrete pulse position modulation of gun and/or magnetron pulses. The instabilities of the gun and magnetron transient regimes are avoided by operating the accelerator with no accelerated beam for a certain time. At the operator 'beam start' command, the ATS controls electron gun and magnetron pulses overlapping and the linac beam is generated. The pulse-to-pulse absorbed dose variation is thus considerably reduced. Programmed absorbed dose, irradiation time, beam pulse number or other external events may interrupt the coincidence between the gun and magnetron pulses. Slow absorbed dose variation is compensated by the control of the pulse duration and repetition frequency. Two methods are reported in the electron linear accelerators' development for obtaining the pulse to pulse dose reproducibility: the method

  7. Database Description - TMBETA-GENOME | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ENOME is a database for transmembrane β-barrel proteins in complete genomes. For each genome, calculations with machine learning algo...rithms and statistical methods have been perfumed and th

  8. Digitizing Olin Eggen's Card Database

    Science.gov (United States)

    Crast, J.; Silvis, G.

    2017-06-01

    The goal of the Eggen Card Database Project is to recover as many of the photometric observations from Olin Eggen's Card Database as possible and preserve these observations, in digital forms that are accessible by anyone. Any observations of interest to the AAVSO will be added to the AAVSO International Database (AID). Given to the AAVSO on long-term loan by the Cerro Tololo Inter-American Observatory, the database is a collection of over 78,000 index cards holding all Eggen's observations made between 1960 and 1990. The cards were electronically scanned and the resulting 108,000 card images have been published as a series of 2,216 PDF files, which are available from the AAVSO web site. The same images are also stored in an AAVSO online database where they are indexed by star name and card content. These images can be viewed using the eggen card portal online tool. Eggen made observations using filter bands from five different photometric systems. He documented these observations using 15 different data recording formats. Each format represents a combination of filter magnitudes and color indexes. These observations are being transcribed onto spreadsheets, from which observations of value to the AAVSO are added to the AID. A total of 506 U, B, V, R, and I observations were added to the AID for the variable stars S Car and l Car. We would like the reader to search through the card database using the eggen card portal for stars of particular interest. If such stars are found and retrieval of the observations is desired, e-mail the authors, and we will be happy to help retrieve those data for the reader.

  9. Interactive searching of facial image databases

    Science.gov (United States)

    Nicholls, Robert A.; Shepherd, John W.; Shepherd, Jean

    1995-09-01

    A set of psychological facial descriptors has been devised to enable computerized searching of criminal photograph albums. The descriptors have been used to encode image databases of up to twelve thousand images. Using a system called FACES, the databases are searched by translating a witness' verbal description into corresponding facial descriptors. Trials of FACES have shown that this coding scheme is more productive and efficient than searching traditional photograph albums. An alternative method of searching the encoded database using a genetic algorithm is currently being tested. The genetic search method does not require the witness to verbalize a description of the target but merely to indicate a degree of similarity between the target and a limited selection of images from the database. The major drawback of FACES is that it requires manual encoding of images. Research is being undertaken to automate the process; however, it will require an algorithm that can predict human descriptive values. Alternatives to human-derived coding schemes exist using statistical classifications of images. Since databases encoded using statistical classifiers do not have an obvious direct mapping to human-derived descriptors, a search method that does not require the entry of human descriptors is needed. A genetic search algorithm is being tested for this purpose.
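The similarity-feedback search can be sketched as a genetic algorithm whose fitness function is the witness's rating of candidate faces. Here the witness is simulated by distance to a hidden target descriptor vector; all sizes, rates, and encodings are illustrative.

```python
import random

random.seed(3)

N_FEATURES, POP, GENERATIONS = 8, 12, 40
target = [random.random() for _ in range(N_FEATURES)]  # hidden "suspect"

def witness_score(face):
    """Stand-in for the witness's similarity judgement:
    higher means more similar to the hidden target."""
    return -sum((a - b) ** 2 for a, b in zip(face, target))

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(face, rate=0.2):
    return [min(1.0, max(0.0, x + random.gauss(0, 0.1)))
            if random.random() < rate else x for x in face]

pop = [[random.random() for _ in range(N_FEATURES)] for _ in range(POP)]
initial_err = -max(witness_score(f) for f in pop)
for _ in range(GENERATIONS):
    pop.sort(key=witness_score, reverse=True)
    parents = pop[:POP // 2]          # faces the witness rated most similar
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(POP - len(parents))]

final_err = -max(witness_score(f) for f in pop)
print(final_err <= initial_err)  # → True (elitism: the best face never worsens)
```

In the real setting the population members would be database images (or composites), and each generation would show the witness a small selection to rate.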

  10. Application of Macro Response Monte Carlo method for electron spectrum simulation

    International Nuclear Information System (INIS)

    Perles, L.A.; Almeida, A. de

    2007-01-01

    During the past years several variance reduction techniques for Monte Carlo electron transport have been developed in order to reduce the electron transport computation time for absorbed dose distributions. We have implemented the Macro Response Monte Carlo (MRMC) method to evaluate the electron spectrum, which can be used as a phase-space input for other simulation programs. The technique uses probability distributions for electron histories previously simulated in spheres (called kugels). These probabilities are used to sample the primary electron's final state, as well as the creation of secondary electrons and photons. We have compared the MRMC electron spectra simulated in a homogeneous phantom against the Geant4 spectra. The results showed an agreement better than 6% in the spectral peak energies and that the MRMC code is up to 12 times faster than Geant4 simulations
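The core speed-up of such macro-response schemes is replacing step-by-step transport inside a sphere with a single draw from a precomputed exit-state distribution. A minimal sketch, with an entirely made-up lookup table:

```python
import random

random.seed(0)

# Hypothetical "kugel" lookup: for an electron entering a sphere in a
# given energy bin, a precomputed table gives exit states and their
# probabilities (values below are invented for illustration).
KUGEL_TABLE = {
    "1MeV": [((0.8, 10.0), 0.6),   # ((exit energy MeV, angle deg), prob)
             ((0.6, 25.0), 0.3),
             ((0.4, 45.0), 0.1)],
}

def sample_exit_state(energy_bin):
    """Sample the electron's exit state from the precomputed
    distribution instead of simulating transport step by step."""
    states, weights = zip(*KUGEL_TABLE[energy_bin])
    return random.choices(states, weights=weights)[0]

counts = {}
for _ in range(10000):
    e, _ = sample_exit_state("1MeV")
    counts[e] = counts.get(e, 0) + 1
print(counts)  # sampled frequencies track the table probabilities
```

A real implementation would also sample secondary-particle creation from the same precomputed histories, which is where the method's accuracy-versus-speed trade-off lives.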

  11. Performance assessment of EMR systems based on post-relational database.

    Science.gov (United States)

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all the system users to access data-with a fast response time-anywhere and at anytime. Performance tests of databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the identical database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed perfect performance in the real-time EMR system.
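A response-time comparison of the kind described can be sketched with a simple timing harness. SQLite in memory is used here purely as a stand-in; it says nothing about the Caché and Oracle systems actually tested.

```python
import sqlite3
import time

def timed(fn):
    """Return the wall-clock time taken by fn()."""
    t0 = time.perf_counter()
    fn()
    return time.perf_counter() - t0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emr (patient_id INTEGER, note TEXT)")

# Bulk insert, then an indexed point lookup, timed separately
insert_t = timed(lambda: conn.executemany(
    "INSERT INTO emr VALUES (?, ?)",
    [(i, f"note {i}") for i in range(10000)]))
conn.execute("CREATE INDEX idx_pid ON emr (patient_id)")
query_t = timed(lambda: conn.execute(
    "SELECT note FROM emr WHERE patient_id = 4242").fetchall())

print(f"insert: {insert_t:.4f}s  indexed lookup: {query_t:.6f}s")
```

A fair cross-database benchmark would run the same workload and schema against each system, repeat the measurements, and report distributions rather than single runs.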

  12. New Combined Electron-Beam Methods of Wastewater Purification

    International Nuclear Information System (INIS)

    Pikaev, A.K.; Makarov, I.E.; Ponomarev, A.V.; Kartasheva, L.I.; Podzorova, E.A.; Chulkov, V.N.; Han, B.; Kim, D.K.

    1999-01-01

    The paper is a brief review of the results obtained with the participation of the authors from the study on combined electron-beam methods for purification of some wastewaters. The data on purification of wastewaters containing dyes or hydrogen peroxide and municipal wastewater in the aerosol flow are considered

  13. The Belle II VXD production database

    Energy Technology Data Exchange (ETDEWEB)

    Valentan, Manfred; Ritter, Martin [Max-Planck-Institut fuer Physik, Muenchen (Germany); Wuerkner, Benedikt; Leitl, Bernhard [Institut fuer Hochenergiephysik, Wien (Austria); Pilo, Federico [Istituto Nazionale di Fisica Nucleare, Pisa (Italy); Collaboration: Belle II-Collaboration

    2015-07-01

    The construction and commissioning of the Belle II Vertex Detector (VXD) is a huge endeavor involving a large number of valuable components. Both subsystems PXD (Pixel Detector) and SVD (Silicon Vertex Detector) deploy a large number of sensors, readout electronic parts and mechanical elements. These items are scattered around the world at many institutes, where they are built, measured and assembled. One has to keep track of measurement configurations and results, know at any time the location of the sensors, their processing state, quality, where they end up in an assembly, and who is responsible. These requirements call for a flexible and extensive database which is able to reflect the processes in the laboratories and the logistics between the institutes. This talk introduces the database requirements of a physics experiment using the PXD construction workflow as a showcase, and presents an overview of the database "HephyDb", which is used by the groups constructing the Belle II VXD.

  14. Database and Related Activities in Japan

    International Nuclear Information System (INIS)

    Murakami, Izumi; Kato, Daiji; Kato, Masatoshi; Sakaue, Hiroyuki A.; Kato, Takako; Ding, Xiaobin; Morita, Shigeru; Kitajima, Masashi; Koike, Fumihiro; Nakamura, Nobuyuki; Sakamoto, Naoki; Sasaki, Akira; Skobelev, Igor; Tsuchida, Hidetsugu; Ulantsev, Artemiy; Watanabe, Tetsuya; Yamamoto, Norimasa

    2011-01-01

    We have constructed and made available atomic and molecular (AM) numerical databases on collision processes such as electron-impact excitation and ionization, recombination and charge transfer of atoms and molecules relevant for plasma physics, fusion research, astrophysics, applied-science plasma, and other related areas. The retrievable data is freely accessible via the internet. We also work on atomic data evaluation and constructing collisional-radiative models for spectroscopic plasma diagnostics. Recently we have worked on Fe ions and W ions theoretically and experimentally. The atomic data and collisional-radiative models for these ions are examined and applied to laboratory plasmas. A visible M1 transition of the W26+ ion is identified at 389.41 nm by EBIT experiments and theoretical calculations. We have small non-retrievable databases in addition to our main database. Recently we evaluated photo-absorption cross sections for 9 atoms and 23 molecules and we present them as a new database. We established a new association, the "Forum of Atomic and Molecular Data and Their Applications", to exchange information among AM data producers, data providers and data users in Japan, and we hope this will help to encourage AM data activities in Japan.

  15. Implementing and evaluating a fictitious electron dynamics method for the calculation of electronic structure: Application to the Si(100) surface

    International Nuclear Information System (INIS)

    Hoffman, M J H; Claassens, C H

    2006-01-01

    A density-matrix-based fictitious electron dynamics method for calculating electronic structure has been implemented within a semi-empirical quantum chemistry environment. This method uses an equation of motion that implicitly ensures the idempotency constraint on the density matrix. Test calculations showed that this method has the potential to be combined with simultaneous atomic dynamics, in analogy to the popular Car-Parrinello method. In addition, the sparsity of the density matrix and the sophisticated though flexible way of ensuring idempotency conservation while integrating the equation of motion create the potential for developing a fast linear-scaling method.
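
    Idempotency (P² = P) of the density matrix, which the equation of motion above is designed to preserve, is classically restored in density-matrix methods by the McWeeny purification P → 3P² − 2P³. The sketch below (a standard illustration, not the authors' scheme) shows one purification step driving a slightly perturbed 2×2 projector back toward idempotency:

```python
def matmul(a, b):
    """Plain-Python product of two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mcweeny(p):
    """One McWeeny purification step: P -> 3P^2 - 2P^3."""
    p2 = matmul(p, p)
    p3 = matmul(p2, p)
    n = len(p)
    return [[3 * p2[i][j] - 2 * p3[i][j] for j in range(n)] for i in range(n)]

def idempotency_error(p):
    """Frobenius norm of P^2 - P, measuring how far P is from idempotent."""
    p2 = matmul(p, p)
    return sum((p2[i][j] - p[i][j]) ** 2
               for i in range(len(p)) for j in range(len(p))) ** 0.5

# A slightly perturbed projector (eigenvalues near 1 and 0).
p = [[0.95, 0.05], [0.05, 0.05]]
print(idempotency_error(p), idempotency_error(mcweeny(p)))
```

    The map 3x² − 2x³ has attractive fixed points at 0 and 1, so occupation numbers near those values are pushed back onto them with each application.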

  16. Electronic cigarettes in the USA: a summary of available toxicology data and suggestions for the future

    OpenAIRE

    Orr, Michael S

    2014-01-01

    Objective To review the available evidence evaluating the toxicological profiles of electronic cigarettes (e-cigarettes) in order to understand the potential impact of e-cigarettes on individual users and the public health. Methods Systematic literature searches were conducted between October 2012 and October 2013 using five electronic databases. Search terms such as ‘e-cigarettes’ and ‘electronic delivery devices’ were used to identify the toxicology information for e-cigarettes. Results As ...

  17. Evaluation of diagnostic tests when there is no gold standard. A review of methods

    NARCIS (Netherlands)

    Rutjes, A. W. S.; Reitsma, J. B.; Coomarasamy, A.; Khan, K. S.; Bossuyt, P. M. M.

    2007-01-01

    OBJECTIVE: To generate a classification of methods to evaluate medical tests when there is no gold standard. METHODS: Multiple search strategies were employed to obtain an overview of the different methods described in the literature, including searches of electronic databases, contacting experts

  18. The method of abstraction in the design of databases and the interoperability

    Science.gov (United States)

    Yakovlev, Nikolay

    2018-03-01

    When designing a database structure oriented to the content of the indicators found in the documents and communications of a subject area, the method of abstraction can be applied in two ways. First, abstraction is applied by extending the set of indicators with new, artificially constructed abstract concepts. The use of abstract concepts makes it possible to avoid registering many-to-many relations; for this reason, structures built with abstract concepts demonstrate greater stability as the data evolve. An example of an abstract concept in an address structure is a unique house identifier. Second, the method of abstraction can be used to transform concepts by omitting attributes that are unnecessary for solving certain classes of problems. Processing data associated with the simplified concepts is simpler, without losing the ability to solve the classes of problems under consideration. For example, the concept of a "street" can lose its binding to land plots; the content of the modified concept is then only the relation of houses to a declared street name, which is sufficient for most accounting and communication tasks.
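
    The first device, a synthetic unique house identifier, can be sketched as a schema: the house/street link becomes two one-to-many relations through the abstract key instead of a many-to-many table. Table and column names below are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Abstract concept: a synthetic, globally unique house id.
CREATE TABLE house  (house_id INTEGER PRIMARY KEY);
CREATE TABLE street (street_id INTEGER PRIMARY KEY, name TEXT);
-- Each row attaches one street-and-number label to one abstract house,
-- so the house/street link is a plain one-to-many from the label table.
CREATE TABLE address_label (
    label_id  INTEGER PRIMARY KEY,
    house_id  INTEGER REFERENCES house(house_id),
    street_id INTEGER REFERENCES street(street_id),
    number    TEXT
);
""")
conn.execute("INSERT INTO house (house_id) VALUES (1)")
conn.executemany("INSERT INTO street (street_id, name) VALUES (?, ?)",
                 [(1, "Main St"), (2, "Oak Ave")])
# A corner house carries a label on each street, yet remains one entity.
conn.executemany(
    "INSERT INTO address_label (house_id, street_id, number) VALUES (?, ?, ?)",
    [(1, 1, "12"), (1, 2, "1")],
)
rows = conn.execute("""
    SELECT s.name, a.number FROM address_label a
    JOIN street s USING (street_id)
    WHERE a.house_id = 1 ORDER BY s.name
""").fetchall()
print(rows)  # the one house seen under both of its street addresses
```

    Because the house is identified by the abstract key rather than by a (street, number) pair, renaming a street or re-numbering houses touches only the label table.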

  19. Name Authority Challenges for Indexing and Abstracting Databases

    OpenAIRE

    Denise Beaubien Bennett; Priscilla Williams

    2006-01-01

    Objective - This analysis explores alternative methods for managing author name changes in Indexing and Abstracting (I&A) databases. A searcher may retrieve incomplete or inaccurate results when the database provides no or faulty assistance in linking author name variations. Methods - The article includes an analysis of current name authority practices in I&A databases and of selected research into name disambiguation models applied to authorship of articles. Results - Several potential...

  20. Modified Monte Carlo method for study of electron transport in degenerate electron gas in the presence of electron–electron interactions, application to graphene

    International Nuclear Information System (INIS)

    Borowik, Piotr; Thobel, Jean-Luc; Adamowicz, Leszek

    2017-01-01

    Standard computational methods used to incorporate the Pauli exclusion principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron–electron (e–e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study the transport properties of degenerate electrons in graphene with e–e interactions. This required adapting the treatment of e–e scattering to the case of a linear band dispersion relation; hence, this part of the simulation algorithm is described in detail.

  1. Modified Monte Carlo method for study of electron transport in degenerate electron gas in the presence of electron–electron interactions, application to graphene

    Energy Technology Data Exchange (ETDEWEB)

    Borowik, Piotr, E-mail: pborow@poczta.onet.pl [Warsaw University of Technology, Faculty of Physics, ul. Koszykowa 75, 00-662 Warszawa (Poland); Thobel, Jean-Luc, E-mail: jean-luc.thobel@iemn.univ-lille1.fr [Institut d' Electronique, de Microélectronique et de Nanotechnologies, UMR CNRS 8520, Université Lille 1, Avenue Poincaré, CS 60069, 59652 Villeneuve d' Ascq Cédex (France); Adamowicz, Leszek, E-mail: adamo@if.pw.edu.pl [Warsaw University of Technology, Faculty of Physics, ul. Koszykowa 75, 00-662 Warszawa (Poland)

    2017-07-15

    Standard computational methods used to incorporate the Pauli exclusion principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron–electron (e–e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study the transport properties of degenerate electrons in graphene with e–e interactions. This required adapting the treatment of e–e scattering to the case of a linear band dispersion relation; hence, this part of the simulation algorithm is described in detail.
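
    Pauli blocking of this kind is commonly enforced in MC transport codes by a rejection step: a proposed final state is accepted with probability 1 − f(E_final). The sketch below is a minimal illustration with an equilibrium Fermi-Dirac occupation and hypothetical energies, not the authors' full algorithm:

```python
import math
import random

def fermi_dirac(e, mu, kT):
    """Equilibrium occupation f(E) of a state at energy E."""
    return 1.0 / (1.0 + math.exp((e - mu) / kT))

def attempt_scattering(e_initial, e_final, mu, kT, rng):
    """Accept a proposed scattering event with probability 1 - f(E_final),
    the usual rejection device that enforces Pauli blocking in MC transport."""
    return rng.random() < 1.0 - fermi_dirac(e_final, mu, kT)

rng = random.Random(1)
mu, kT = 0.1, 0.025  # eV; illustrative values for a degenerate electron gas
trials = 20_000
# Scattering into a state deep below the Fermi level is almost always blocked...
blocked = sum(attempt_scattering(0.3, mu - 0.2, mu, kT, rng) for _ in range(trials))
# ...while scattering far above it is almost always allowed.
allowed = sum(attempt_scattering(0.3, mu + 0.2, mu, kT, rng) for _ in range(trials))
print(blocked / trials, allowed / trials)
```

    The subtlety the abstract addresses is that f must in general be the simulated, possibly non-equilibrium distribution, and for e–e scattering both final states carry such a factor; the fixed Fermi-Dirac form here is only for illustration.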

  2. Selected topics on data and databases

    International Nuclear Information System (INIS)

    Ralchenko, Y.

    2001-01-01

    Dr. Ralchenko reviewed atomic data activities at the WIS Plasma Laboratory. Physical sputtering yields over a broad energy range for fusion are presented. A collisional database including electron and heavy-particle projectiles was completed. Quantum-mechanical calculations for the process of Stark broadening of atomic spectral lines are continuing. Systems considered include data for Be-like ions and lithium-like ions.

  3. Selected topics on data and databases

    Energy Technology Data Exchange (ETDEWEB)

    Ralchenko, Y [Weizmann Institute of Science, Rehovot (Israel)

    2001-12-01

    Dr. Ralchenko reviewed atomic data activities at the WIS Plasma Laboratory. Physical sputtering yields over a broad energy range for fusion are presented. A collisional database including electron and heavy-particle projectiles was completed. Quantum-mechanical calculations for the process of Stark broadening of atomic spectral lines are continuing. Systems considered include data for Be-like ions and lithium-like ions.

  4. On-line database of voltammetric data of immobilized particles for identifying pigments and minerals in archaeometry, conservation and restoration (ELCHER database)

    Energy Technology Data Exchange (ETDEWEB)

    Doménech-Carbó, Antonio, E-mail: antonio.domenech@uv.es [Departament de Química Analítica, Universitat de València, Dr. Moliner, 50, 46100, Burjassot, València (Spain); Doménech-Carbó, María Teresa, E-mail: tdomenec@crbc.upv.es [Institut de Restauració del Patrimoni, Universitat Politècnica de València, Camí de Vera 14, 46022, València (Spain); Valle-Algarra, Francisco Manuel; Gimeno-Adelantado, José Vicente [Departament de Química Analítica, Universitat de València, Dr. Moliner, 50, 46100, Burjassot, València (Spain); Osete-Cortina, Laura [Institut de Restauració del Patrimoni, Universitat Politècnica de València, Camí de Vera 14, 46022, València (Spain); Bosch-Reig, Francisco [Departament de Química Analítica, Universitat de València, Dr. Moliner, 50, 46100, Burjassot, València (Spain)

    2016-07-13

    A web-based database of voltammograms is presented for characterizing artists' pigments and corrosion products of ceramic, stone and metal objects by means of the voltammetry of immobilized particles methodology. A description of the website and the database is provided. Voltammograms are, in most cases, accompanied by scanning electron microphotographs, X-ray spectra, infrared spectra acquired in attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) mode and diffuse reflectance spectra in the UV–Vis region. To illustrate the usefulness of the database, two case studies involving the identification of pigments and a case study describing the deterioration of an archaeological metallic object are presented. - Highlights: • A web-based database of voltammograms is presented. • The voltammetry of immobilized particles is used. • Artists' pigments and corrosion products of ceramic, stone and metal objects are included. • Examples of application on works of art are discussed.

  5. On-line database of voltammetric data of immobilized particles for identifying pigments and minerals in archaeometry, conservation and restoration (ELCHER database)

    International Nuclear Information System (INIS)

    Doménech-Carbó, Antonio; Doménech-Carbó, María Teresa; Valle-Algarra, Francisco Manuel; Gimeno-Adelantado, José Vicente; Osete-Cortina, Laura; Bosch-Reig, Francisco

    2016-01-01

    A web-based database of voltammograms is presented for characterizing artists' pigments and corrosion products of ceramic, stone and metal objects by means of the voltammetry of immobilized particles methodology. A description of the website and the database is provided. Voltammograms are, in most cases, accompanied by scanning electron microphotographs, X-ray spectra, infrared spectra acquired in attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) mode and diffuse reflectance spectra in the UV–Vis region. To illustrate the usefulness of the database, two case studies involving the identification of pigments and a case study describing the deterioration of an archaeological metallic object are presented. - Highlights: • A web-based database of voltammograms is presented. • The voltammetry of immobilized particles is used. • Artists' pigments and corrosion products of ceramic, stone and metal objects are included. • Examples of application on works of art are discussed.

  6. Integration of curated databases to identify genotype-phenotype associations

    Directory of Open Access Journals (Sweden)

    Li Jianrong

    2006-10-01

    Full Text Available Abstract Background The ability to rapidly characterize an unknown microorganism is critical in both responding to infectious disease and biodefense. To do this, we need some way of anticipating an organism's phenotype based on the molecules encoded by its genome. However, the link between molecular composition (i.e. genotype) and phenotype for microbes is not obvious. While there have been several studies that address this challenge, none have yet proposed a large-scale method integrating curated biological information. Here we utilize a systematic approach to discover genotype-phenotype associations that combines phenotypic information from a biomedical informatics database, GIDEON, with the molecular information contained in the National Center for Biotechnology Information's Clusters of Orthologous Groups database (NCBI COGs). Results Integrating the information in the two databases, we are able to correlate the presence or absence of a given protein in a microbe with its phenotype as measured by certain morphological characteristics or survival in a particular growth medium. With a 0.8 correlation score threshold, 66% of the associations found were confirmed by the literature, and at a 0.9 correlation threshold, 86% were positively verified. Conclusion Our results suggest possible phenotypic manifestations for proteins biochemically associated with sugar metabolism and electron transport. Moreover, we believe our approach can be extended to linking pathogenic phenotypes with functionally related proteins.
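
    The association step described above, correlating a protein's presence/absence pattern with a phenotype across microbes and keeping pairs above a correlation threshold, can be sketched on toy data. The COG identifiers, phenotype vector and organisms below are hypothetical:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# Hypothetical toy data: one column per microbe.
# cog_present[g][m]: does microbe m carry COG g; phenotype: does it show the trait.
cog_present = {
    "COG0001": [1, 1, 1, 0, 0, 0],
    "COG0002": [1, 0, 1, 0, 1, 0],
}
phenotype = [1, 1, 1, 0, 0, 0]  # e.g. growth in a given medium

associations = {g: pearson(v, phenotype) for g, v in cog_present.items()}
strong = [g for g, r in associations.items() if abs(r) >= 0.8]
print(associations, strong)
```

    Here only the COG whose presence pattern tracks the phenotype survives the 0.8 cutoff, mirroring how the thresholds in the abstract trade recall for literature-confirmed precision.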

  7. Methods of organization of SCORM-compliant teaching materials in electronic format

    Directory of Open Access Journals (Sweden)

    Jacek Marciniak

    2012-06-01

    Full Text Available This paper presents a method of organizing electronic teaching materials based on their role in the teaching process rather than their technical structure. Our method allows SCORM materials stored as e-learning courses („electronic books”) to be subdivided and structured so that content can be used in multiple contexts. As a standard, SCORM defines rules for organizing content, but not how to divide and structure it. Our method uses UCTS nomenclature to divide content, define relationships between content entities, and aggregate those entities into courses. This allows content to be shared in different implementations of SCORM while guaranteeing that usability and consistency are maintained.

  8. Patterns of Undergraduates' Use of Scholarly Databases in a Large Research University

    Science.gov (United States)

    Mbabu, Loyd Gitari; Bertram, Albert; Varnum, Ken

    2013-01-01

    Authentication data was utilized to explore undergraduate usage of subscription electronic databases. These usage patterns were linked to the information literacy curriculum of the library. The data showed that out of the 26,208 enrolled undergraduate students, 42% of them accessed a scholarly database at least once in the course of the entire…

  9. Reusable data in public health databases - problems encountered in the Danish Children's Database.

    Science.gov (United States)

    Høstgaard, Anna Marie; Pape-Haugaard, Louise

    2012-01-01

    Denmark has unique health informatics databases, e.g. "The Children's Database", which since 2009 holds data on all Danish children from birth until 17 years of age. In the current set-up, a number of potential sources of error exist, both technical and human, which means that the data are flawed. This gives rise to erroneous statistics and makes the data unsuitable for research purposes. In order to make the data usable, it is necessary to develop new methods for validating the data generation process at the municipal/regional/national level. In the present ongoing research project, two research areas are combined, Public Health Informatics and Computer Science, and both ethnographic and system engineering research methods are used. The project is expected to generate new generic methods and knowledge about electronic data collection and transmission in different social contexts and by different social groups, and thus to be of international importance, since this is sparsely documented from the Public Health Informatics perspective. This paper presents the preliminary results, which indicate that the health information technology used ought to be subject to redesign, with thorough insight into work practices as the point of departure.

  10. On-line database of the spectral properties of polycyclic aromatic hydrocarbons

    International Nuclear Information System (INIS)

    Malloci, Giuliano; Joblin, Christine; Mulas, Giacomo

    2007-01-01

    We present an on-line database of computed molecular properties for a large sample of polycyclic aromatic hydrocarbons in four charge states: -1, 0, +1, and +2. At present our database includes 40 molecules ranging in size from naphthalene and azulene (C10H8) up to circumovalene (C66H20). We performed our calculations in the framework of the density functional theory (DFT) and the time-dependent DFT to obtain the most relevant molecular parameters needed for astrophysical applications. For each molecule in the sample, our database presents in a uniform way the energetic, rotational, vibrational, and electronic properties. It is freely accessible on the web at (http://astrochemistry.ca.astro.it/database/) and (http://www.cesr.fr/~joblin/database/)

  11. Online Databases for Health Professionals

    OpenAIRE

    Marshall, Joanne Gard

    1987-01-01

    Recent trends in the marketing of electronic information technology have increased interest among health professionals in obtaining direct access to online biomedical databases such as Medline. During 1985, the Canadian Medical Association (CMA) and Telecom Canada conducted an eight-month trial of the use made of online information retrieval systems by 23 practising physicians and one pharmacist. The results of this project demonstrated both the value and the limitations of these systems in p...

  12. Simple method for generating adjustable trains of picosecond electron bunches

    Directory of Open Access Journals (Sweden)

    P. Muggli

    2010-05-01

    Full Text Available A simple, passive method for producing an adjustable train of picosecond electron bunches is demonstrated. The key component of this method is an electron beam mask consisting of an array of parallel wires that selectively spoils the beam emittance. This mask is positioned in a high magnetic dispersion, low beta-function region of the beam line. The incoming electron beam striking the mask has a time/energy correlation that corresponds to a time/position correlation at the mask location. The mask pattern is transformed into a time pattern or train of bunches when the dispersion is brought back to zero downstream of the mask. Results are presented of a proof-of-principle experiment demonstrating this novel technique that was performed at the Brookhaven National Laboratory Accelerator Test Facility. This technique allows for easy tailoring of the bunch train for a particular application, including varying the bunch width and spacing, and enabling the generation of a trailing witness bunch.

  13. Update History of This Database - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Update History of This Database: 2017/02/27 - Arabidopsis Phenome Database English archive site is opened. - Arabidopsis Phenome Database (http://jphenome.info/?page_id=95) is opened.

  14. BUSINESS MODELLING AND DATABASE DESIGN IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Mihai-Constantin AVORNICULUI

    2015-04-01

    Full Text Available Electronic commerce has grown constantly from one year to the next over the last decade; few areas register such growth. It covers the exchange of computerized data, but also electronic messaging, linear data banks and electronic payment transfer. Cloud computing, a relatively new concept and term, is a model for accessing, via the internet, distributed systems of configurable computing resources that can be made available on demand quickly and with minimum management effort or intervention from the client and the provider. Behind an electronic commerce system in the cloud there is a database which contains the information necessary for the transactions in the system. Using business modelling, we gain many benefits, which makes the design of the database used by electronic commerce systems in the cloud considerably easier.

  15. Elektronische Informationsdienste im Bildungswesen (Electronic Information Services in Education) Gesellschaft Information Bildung Conference (GIB) (2nd, Berlin, Germany, November 17-18, 1994).

    Science.gov (United States)

    Diepold, Peter, Ed.; Rusch-Feja, Diann, Ed.

    These papers on educational technology were presented in three workshops at the second annual conference of the Society of Information Education (GIB). Discussion includes electronic networks, CD-ROMs, and online databases in education, the quality of educational software, database services and instructional methods, and the use of the Internet in…

  16. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects, helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design, without changing semantics. You'll learn how to evolve database schemas in step with source code, and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...
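
    As an illustration of the transition-period style of database refactoring described above, a "Rename Column" sketch (hypothetical `customer` table, SQLite used as a stand-in engine): add the new column, backfill it, and keep old and new in sync with a trigger until all application code has migrated:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, fname TEXT);
INSERT INTO customer (id, fname) VALUES (1, 'Ada');

-- Step 1 of the 'Rename Column' refactoring: add the new column and backfill.
ALTER TABLE customer ADD COLUMN first_name TEXT;
UPDATE customer SET first_name = fname;

-- Step 2: during the transition period, a trigger keeps old and new in sync
-- so not-yet-migrated application code can keep writing the old column.
CREATE TRIGGER sync_fname AFTER UPDATE OF fname ON customer
BEGIN
    UPDATE customer SET first_name = NEW.fname WHERE id = NEW.id;
END;
""")
conn.execute("UPDATE customer SET fname = 'Grace' WHERE id = 1")  # legacy write
row = conn.execute("SELECT fname, first_name FROM customer WHERE id = 1").fetchone()
print(row)
```

    Once every caller reads and writes `first_name`, the final step drops the trigger and the old column, completing the refactoring without any semantic change along the way.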

  17. Kernel polynomial method for a nonorthogonal electronic-structure calculation of amorphous diamond

    International Nuclear Information System (INIS)

    Roeder, H.; Silver, R.N.; Drabold, D.A.; Dong, J.J.

    1997-01-01

    The kernel polynomial method (KPM) has been successfully applied to tight-binding electronic-structure calculations as an O(N) method. Here we extend this method to nonorthogonal basis sets with a sparse overlap matrix S and a sparse Hamiltonian H. Since the KPM utilizes matrix-vector multiplications, it is necessary to apply S⁻¹H onto a vector. The multiplication by S⁻¹ is performed using a preconditioned conjugate-gradient method and does not involve the explicit inversion of S. Hence the method scales the same way as the original KPM, i.e., O(N), although there is an overhead due to the additional conjugate-gradient part. We apply this method to a large scale electronic-structure calculation of amorphous diamond. copyright 1997 The American Physical Society
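
    The central trick above, applying S⁻¹H to a vector with conjugate gradients rather than ever forming S⁻¹, can be sketched on a toy 2×2 overlap/Hamiltonian pair (illustrative matrices, plain unpreconditioned CG):

```python
def matvec(m, v):
    """Matrix-vector product for a square matrix stored as nested lists."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

def cg_solve(s, b, tol=1e-12, max_iter=100):
    """Conjugate gradients for S x = b with symmetric positive definite S,
    so that S^{-1} b is applied without explicitly inverting S."""
    x = [0.0] * len(b)
    r = b[:]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        sp = matvec(s, p)
        alpha = rs / sum(pi * spi for pi, spi in zip(p, sp))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * spi for ri, spi in zip(r, sp)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Toy sparse-style overlap and Hamiltonian (2x2 for readability).
S = [[1.0, 0.2], [0.2, 1.0]]
H = [[0.0, 0.3], [0.3, 0.1]]
v = [1.0, 0.0]

w = cg_solve(S, matvec(H, v))  # w = S^{-1} H v, the KPM building block
print(w)
```

    Since S is sparse, each CG iteration costs O(N), which is what preserves the overall linear scaling of the moment recursion.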

  18. PAMDB: a comprehensive Pseudomonas aeruginosa metabolome database.

    Science.gov (United States)

    Huang, Weiliang; Brewer, Luke K; Jones, Jace W; Nguyen, Angela T; Marcu, Ana; Wishart, David S; Oglesby-Sherrouse, Amanda G; Kane, Maureen A; Wilks, Angela

    2018-01-04

    The Pseudomonas aeruginosa Metabolome Database (PAMDB, http://pseudomonas.umaryland.edu) is a searchable, richly annotated metabolite database specific to P. aeruginosa. P. aeruginosa is a soil organism and significant opportunistic pathogen that adapts to its environment through a versatile energy metabolism network. Furthermore, P. aeruginosa is a model organism for the study of biofilm formation, quorum sensing, and bioremediation processes, each of which are dependent on unique pathways and metabolites. The PAMDB is modelled on the Escherichia coli (ECMDB), yeast (YMDB) and human (HMDB) metabolome databases and contains >4370 metabolites and 938 pathways with links to over 1260 genes and proteins. The database information was compiled from electronic databases, journal articles and mass spectrometry (MS) metabolomic data obtained in our laboratories. For each metabolite entered, we provide detailed compound descriptions, names and synonyms, structural and physiochemical information, nuclear magnetic resonance (NMR) and MS spectra, enzymes and pathway information, as well as gene and protein sequences. The database allows extensive searching via chemical names, structure and molecular weight, together with gene, protein and pathway relationships. The PAMDB and its future iterations will provide a valuable resource to biologists, natural product chemists and clinicians in identifying active compounds, potential biomarkers and clinical diagnostics. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. A MiniReview of the Use of Hospital-based Databases in Observational Inpatient Studies of Drugs

    DEFF Research Database (Denmark)

    Larsen, Michael Due; Cars, Thomas; Hallas, Jesper

    2013-01-01

    Inpatient databases in Asia, the United States and Europe were found. Most databases were automatically collected from claims data or generated from electronic medical records. The contents of the databases varied as well as the potential for linkage with other data sources such as laboratory and outpatient...

  20. Update History of This Database - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Update History of This Database: 2017/03/13 - SKIP Stemcell Database English archive site is opened. 2013/03/29 - SKIP Stemcell Database (https://www.skip.med.keio.ac.jp/SKIPSearch/top?lang=en) is opened.

  1. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.
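
    Comparing an ab initio method against a benchmark set of this kind typically reduces to a deviation statistic over the molecules. A sketch with hypothetical enthalpy-of-formation values (illustrative numbers, not CCCBDB data):

```python
# Hypothetical enthalpies of formation (kJ/mol): benchmark vs. two methods.
benchmark = {"H2O": -241.8, "CO2": -393.5, "CH4": -74.9}
method_a  = {"H2O": -239.1, "CO2": -390.2, "CH4": -76.4}
method_b  = {"H2O": -250.3, "CO2": -401.8, "CH4": -70.0}

def mean_abs_deviation(pred, ref):
    """Mean absolute deviation of a method from the benchmark set."""
    return sum(abs(pred[k] - ref[k]) for k in ref) / len(ref)

mad_a = mean_abs_deviation(method_a, benchmark)
mad_b = mean_abs_deviation(method_b, benchmark)
print(mad_a, mad_b)  # the smaller MAD marks the better-performing method
```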

  2. On some methods to produce high-energy polarized electron beams by means of proton synchrotrons

    International Nuclear Information System (INIS)

    Bessonov, E.G.; Vazdik, Ya.A.

    1980-01-01

    Some methods for the production of high-energy polarized electron beams by means of proton synchrotrons are considered. These methods are based on the transfer by protons of a part of their energy to the polarized electrons of a thin target placed inside the working volume of the synchrotron. It is suggested to use as polarized electron targets magnetized crystalline iron in which proton channeling is realized, polarized atomic beams, and polarized plasma. It is shown that by this method one can produce polarized electron beams with an energy of approximately 100 GeV and an energy spread of ±5%, at an intensity of approximately 10⁷ electrons/s with approximately 30% polarization, or at an intensity of approximately 10⁴-10⁵ electrons/s with approximately 100% polarization [ru]

  3. Electronic-projecting Moire method applying CBR-technology

    Science.gov (United States)

    Kuzyakov, O. N.; Lapteva, U. V.; Andreeva, M. A.

    2018-01-01

    An electronic-projecting method based on the Moire effect for examining surface topology is suggested. Conditions for forming Moire fringes and the dependence of their parameters on the reference parameters of the object and virtual grids are analyzed. The control system structure and decision-making subsystem are elaborated. The subsystem's execution includes CBR technology, based on applying a case base. The approach involves analysing and forming a decision for each separate local area, with consequent formation of a common topology map.

  4. The Danish Urogynaecological Database

    DEFF Research Database (Denmark)

    Guldberg, Rikke; Brostrøm, Søren; Hansen, Jesper Kjær

    2013-01-01

    INTRODUCTION AND HYPOTHESIS: The Danish Urogynaecological Database (DugaBase) is a nationwide clinical database established in 2006 to monitor, ensure and improve the quality of urogynaecological surgery. We aimed to describe its establishment and completeness and to validate selected variables. This is the first study based on data from the DugaBase. METHODS: The database completeness was calculated as a comparison between urogynaecological procedures reported to the Danish National Patient Registry and to the DugaBase. Validity was assessed for selected variables from a random sample of 200 women in the DugaBase from 1 January 2009 to 31 October 2010, using medical records as a reference. RESULTS: A total of 16,509 urogynaecological procedures were registered in the DugaBase by 31 December 2010. The database completeness has increased by calendar time, from 38.2 % in 2007 to 93.2 % in 2010 for public...
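
    The completeness measure described in the METHODS above is a simple ratio of database-reported to registry-reported procedures. A sketch with hypothetical yearly counts, chosen only to reproduce the quoted percentages:

```python
def completeness_pct(n_in_database, n_in_registry):
    """Share of registry-reported procedures also captured by the database."""
    return 100.0 * n_in_database / n_in_registry

# Hypothetical yearly counts in the style of the DugaBase validation.
registry = {2007: 5000, 2010: 6000}
dugabase = {2007: 1910, 2010: 5592}
by_year = {y: completeness_pct(dugabase[y], registry[y]) for y in registry}
print(by_year)
```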

  5. A method for ultrashort electron pulse-shape measurement using coherent synchrotron radiation

    International Nuclear Information System (INIS)

    Geloni, G.; Yurkov, M.V.

    2003-03-01

    In this paper we discuss a method for nondestructive measurements of the longitudinal profile of sub-picosecond electron bunches for X-ray free electron lasers (XFELs). The method is based on the detection of the coherent synchrotron radiation (CSR) spectrum produced by a bunch passing a dipole magnet system. This work also contains a systematic treatment of the synchrotron radiation theory which lies at the basis of CSR. The standard theory of synchrotron radiation uses several approximations whose applicability limits are often forgotten: here we present a systematic discussion of these assumptions. Properties of coherent synchrotron radiation from an electron moving along an arc of a circle are then derived and discussed. We also describe an effective and practical diagnostic technique based on the utilization of an electromagnetic undulator to record the energy of the coherent radiation pulse into the central cone. This measurement must be repeated many times with different undulator resonant frequencies in order to reconstruct the modulus of the bunch form factor. The retrieval of the bunch profile function from these data is performed by means of deconvolution techniques: for the present work we take advantage of a constrained deconvolution method. We illustrate with numerical examples the potential of the proposed method for electron beam diagnostics at the TESLA test facility (TTF) accelerator. Here we choose, for emphasis, experiments aimed at measuring the strongly non-Gaussian electron bunch profile in the TTF femtosecond-mode operation. We demonstrate that a tandem combination of a picosecond streak camera and a CSR spectrometer can be used to extract shape information from electron bunches with a narrow leading peak and a long tail. (orig.)
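
    For the special case of a Gaussian bunch profile, the relation between the form-factor modulus and the bunch length is analytic and can be inverted directly, which illustrates how spectral measurements at several undulator resonant frequencies encode the profile (general, non-Gaussian profiles need the constrained deconvolution the paper uses). All numbers below are illustrative:

```python
import math

sigma_true = 250e-15  # bunch rms length in seconds (illustrative fs scale)

def form_factor(omega, sigma):
    """Modulus of the form factor of a Gaussian bunch profile."""
    return math.exp(-0.5 * (omega * sigma) ** 2)

# 'Measured' form-factor moduli at a few undulator resonant frequencies.
omegas = [1e12, 2e12, 3e12]
measured = [form_factor(w, sigma_true) for w in omegas]

# For a Gaussian profile the inversion is analytic: sigma = sqrt(-2 ln|F|)/omega.
recovered = [math.sqrt(-2.0 * math.log(f)) / w for f, w in zip(measured, omegas)]
print(recovered)
```

    Agreement of the recovered sigma across frequencies is itself a consistency check; a spread would signal a non-Gaussian profile and the need for a full deconvolution.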

  6. Development and application of advanced methods for electronic structure calculations

    DEFF Research Database (Denmark)

    Schmidt, Per Simmendefeldt

    This thesis relates to improvements and applications of beyond-DFT methods for electronic structure calculations that are applied in computational material science. The improvements are of both technical and principal character. The well-known GW approximation is optimized for accurate calculations of electronic excitations in two-dimensional materials by exploiting exact limits of the screened Coulomb potential. This approach reduces the computational time by an order of magnitude, enabling large scale applications. The GW method is further improved by including so-called vertex corrections. This turns... For this reason, part of this thesis relates to developing and applying a new method for constructing so-called norm-conserving PAW setups that are applicable to GW calculations by using a genetic algorithm. The effect of applying the new setups significantly affects the absolute band positions, both for bulk...

  7. Pertukaran Data Antar Database Dengan Menggunakan Teknologi API [Data Exchange Between Databases Using API Technology]

    Directory of Open Access Journals (Sweden)

    Ahmad Hanafi

    2017-03-01

    Full Text Available Electronic data interchange between institutions or companies must be supported by data storage media of adequate capacity. MySQL is a database engine used to store data as information that can be utilized as needed. MySQL has the advantages of ease of use and the ability to work on different platforms. Because system requirements demand reliability and multitasking, the database serves not only as a data storage medium but can also be used as a means of data exchange. The Dropbox API is a well-suited technology for enabling the database to exchange data. The combination of the Dropbox API and a database can serve as a very cheap data-exchange solution for small companies, because it requires only a relatively modest Internet connection.
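A minimal sketch of the file-mediated exchange the record describes, with Python's built-in sqlite3 standing in for MySQL and a plain JSON file as the payload that a synced Dropbox folder would carry between the two parties. The table name, columns, and file name are illustrative, and the actual Dropbox API upload/download calls are omitted:

```python
import json
import os
import sqlite3
import tempfile

def export_table(conn, table, out_path):
    """Dump every row of `table` to a JSON file that a synced folder
    (e.g. one watched by a Dropbox client) could carry to the peer."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    rows = [dict(zip(cols, r)) for r in cur.fetchall()]
    with open(out_path, "w") as f:
        json.dump(rows, f)
    return len(rows)

def import_table(conn, table, in_path):
    """Read the JSON file on the receiving side and insert its rows."""
    with open(in_path) as f:
        rows = json.load(f)
    for row in rows:
        cols = ", ".join(row)
        marks = ", ".join("?" for _ in row)
        conn.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                     list(row.values()))
    return len(rows)

# Demo: two in-memory databases standing in for the two institutions.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE students (id INTEGER, name TEXT)")
src.executemany("INSERT INTO students VALUES (?, ?)",
                [(1, "Ana"), (2, "Budi")])
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE students (id INTEGER, name TEXT)")

path = os.path.join(tempfile.gettempdir(), "exchange.json")
sent = export_table(src, "students", path)
received = import_table(dst, "students", path)
```

In a real deployment the JSON file would simply be written into (or read from) the locally synced Dropbox directory, so the Dropbox client provides the transport.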

  8. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI): A Completed Reference Database of Lung Nodules on CT Scans

    International Nuclear Information System (INIS)

    2011-01-01

    Purpose: The development of computer-aided diagnostic (CAD) methods for lung nodule detection, classification, and quantitative assessment can be facilitated through a well-characterized repository of computed tomography (CT) scans. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) completed such a database, establishing a publicly available reference for the medical imaging research community. Initiated by the National Cancer Institute (NCI), further advanced by the Foundation for the National Institutes of Health (FNIH), and accompanied by the Food and Drug Administration (FDA) through active participation, this public-private partnership demonstrates the success of a consortium founded on a consensus-based process. Methods: Seven academic centers and eight medical imaging companies collaborated to identify, address, and resolve challenging organizational, technical, and clinical issues to provide a solid foundation for a robust database. The LIDC/IDRI Database contains 1018 cases, each of which includes images from a clinical thoracic CT scan and an associated XML file that records the results of a two-phase image annotation process performed by four experienced thoracic radiologists. In the initial blinded-read phase, each radiologist independently reviewed each CT scan and marked lesions belonging to one of three categories ("nodule ≥3 mm," "nodule <3 mm," and "non-nodule ≥3 mm"). In the subsequent unblinded-read phase, each radiologist independently reviewed their own marks along with the anonymized marks of the three other radiologists to render a final opinion. The goal of this process was to identify as completely as possible all lung nodules in each CT scan without requiring forced consensus. Results: The Database contains 7371 lesions marked "nodule" by at least one radiologist. 2669 of these lesions were marked "nodule ≥3 mm" by at least one radiologist, of which 928 (34.7%) received such marks from all
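A sketch of reading such per-radiologist marks from an annotation file; the element and attribute names below are hypothetical simplifications, as the actual LIDC/IDRI XML schema is considerably richer (reading sessions, ROI contours, nodule IDs):

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified annotation file for one reading session.
SAMPLE = """
<readingSession reader="R1">
  <mark category="nodule&gt;=3mm"/>
  <mark category="nodule&lt;3mm"/>
  <mark category="nodule&gt;=3mm"/>
  <mark category="non-nodule&gt;=3mm"/>
</readingSession>
"""

def count_marks(xml_text):
    """Tally annotation marks by category for one reading session."""
    root = ET.fromstring(xml_text)
    counts = {}
    for mark in root.iter("mark"):
        cat = mark.get("category")
        counts[cat] = counts.get(cat, 0) + 1
    return root.get("reader"), counts

reader, counts = count_marks(SAMPLE)
```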

  9. Projection-reduction method applied to deriving non-linear optical conductivity for an electron-impurity system

    Directory of Open Access Journals (Sweden)

    Nam Lyong Kang

    2013-07-01

    Full Text Available The projection-reduction method introduced by the present authors is known to give a validated theory for optical transitions in systems of electrons interacting with phonons. In this work, using this method, we derive the linear and first-order nonlinear optical conductivities for an electron-impurity system and examine whether the expressions faithfully satisfy the quantum mechanical philosophy, in the same way as for electron-phonon systems. The result shows that the Fermi distribution function for electrons, energy denominators, and electron-impurity coupling factors are contained properly, in an organized manner, along with the absorption of photons for each electron transition process in the final expressions. Furthermore, the result is shown to be represented properly by schematic diagrams, as in the formulation of the electron-phonon interaction. Therefore, in conclusion, we claim that this method can be applied in modeling optical transitions of electrons interacting with both impurities and phonons.

  10. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the database system Oracle Database 10g Express Edition, and a demonstration of administration tasks in this database system. The database was verified by means of a purpose-built access application.

  11. Data mining in time series databases

    CERN Document Server

    Kandel, Abraham; Bunke, Horst

    2004-01-01

    Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described, and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.
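As a minimal illustration of time-series segmentation in the spirit of the book's topics (not an algorithm taken from the book), Piecewise Aggregate Approximation replaces each fixed-length window of a series by its mean:

```python
def paa(series, n_segments):
    """Piecewise Aggregate Approximation: compress a time series to
    n_segments values, each the mean of one contiguous window."""
    n = len(series)
    out = []
    for k in range(n_segments):
        lo = k * n // n_segments
        hi = (k + 1) * n // n_segments
        seg = series[lo:hi]
        out.append(sum(seg) / len(seg))
    return out

# An 8-point series reduced to 4 segment means.
approx = paa([1, 1, 3, 3, 5, 5, 7, 7], 4)
```

Representations like this make indexing and similarity search tractable because distances in the reduced space lower-bound distances on the raw series.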

  12. [A web-based integrated clinical database for laryngeal cancer].

    Science.gov (United States)

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer that provides an information platform for clinical and fundamental research and meets the needs of clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards, Apache+PHP+MySQL technology, laryngeal cancer specialist characteristics and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. This database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system follows clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database forms integrate laryngeal cancer specialist characteristics and tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma offers comprehensive specialist information, strong expandability and high technical feasibility, and it conforms to the clinical characteristics of the laryngeal cancer specialty. By using clinical data standards and structured handling of clinical data, the database can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative. In addition, users can access and manipulate the database conveniently and swiftly over the Internet.

  13. Combining information from a clinical data warehouse and a pharmaceutical database to generate a framework to detect comorbidities in electronic health records.

    Science.gov (United States)

    Sylvestre, Emmanuelle; Bouzillé, Guillaume; Chazard, Emmanuel; His-Mahier, Cécil; Riou, Christine; Cuggia, Marc

    2018-01-24

    Medical coding is used for a variety of activities, from observational studies to hospital billing. However, comorbidities tend to be under-reported by medical coders. The aim of this study was to develop an algorithm to detect comorbidities in electronic health records (EHR) by using a clinical data warehouse (CDW) and a knowledge database. We enriched the Theriaque pharmaceutical database with the French national Comorbidities List to identify drugs associated with at least one major comorbid condition and diagnoses associated with a drug indication. Then, we compared the drug indications in the Theriaque database with the ICD-10 billing codes in EHR to detect potentially missing comorbidities based on drug prescriptions. Finally, we improved comorbidity detection by matching drug prescriptions and laboratory test results. We tested the obtained algorithm by using two retrospective datasets extracted from the Rennes University Hospital (RUH) CDW. The first dataset included all adult patients hospitalized in the ear, nose, throat (ENT) surgical ward between October and December 2014 (ENT dataset). The second included all adult patients hospitalized at RUH between January and February 2015 (general dataset). We reviewed medical records to find written evidence of the suggested comorbidities in current or past stays. Among the 22,132 Common Units of Dispensation (CUD) codes present in the Theriaque database, 19,970 drugs (90.2%) were associated with one or several ICD-10 diagnoses, based on their indication, and 11,162 (50.4%) with at least one of the 4878 comorbidities from the comorbidity list. Among the 122 patients of the ENT dataset, 75.4% had at least one drug prescription without corresponding ICD-10 code. The comorbidity diagnoses suggested by the algorithm were confirmed in 44.6% of the cases. Among the 4312 patients of the general dataset, 68.4% had at least one drug prescription without corresponding ICD-10 code. The comorbidity diagnoses suggested by the
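The core matching step can be sketched as follows. The CUD codes, ICD-10 codes, and the flagging rule (no indicated comorbidity coded at all for a stay) are illustrative assumptions for the sketch, not the published algorithm:

```python
# Hypothetical drug-to-indication mapping in the spirit of the enriched
# Theriaque table: CUD drug code -> ICD-10 codes of comorbid indications.
DRUG_INDICATIONS = {
    "CUD001": {"E11"},          # e.g. an oral antidiabetic -> type 2 diabetes
    "CUD002": {"I10", "I11"},   # e.g. an antihypertensive
    "CUD003": set(),            # drug with no major comorbid indication
}

def suggest_missing_comorbidities(prescriptions, billed_codes):
    """Return, per prescribed drug, the indicated ICD-10 codes that are
    absent from the stay's billing codes (candidate under-reporting)."""
    billed = set(billed_codes)
    suggestions = {}
    for cud in prescriptions:
        indicated = DRUG_INDICATIONS.get(cud, set())
        missing = indicated - billed
        if indicated and missing == indicated:
            # none of this drug's indications were coded for the stay
            suggestions[cud] = sorted(missing)
    return suggestions

hits = suggest_missing_comorbidities(
    prescriptions=["CUD001", "CUD002", "CUD003"],
    billed_codes=["I10"],  # hypertension coded, diabetes not
)
```

The suggestions would then be filtered against laboratory test results, as the study describes, before being shown for chart review.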

  14. Method of electron emission control in RF guns

    International Nuclear Information System (INIS)

    Khodak, I.V.; Kushnir, V.A.

    2001-01-01

    The electron emission control method for an RF gun is considered. According to the main idea of the method, an additional resonance system is created in the cathode region, where the RF field strength can be varied using external pulse equipment. The additional resonance system is composed of a coaxial cavity coupled with the cylindrical cavity of the RF gun via an axial hole. Computed radiofrequency and electrodynamic performance of such a two-cavity system and results of a pilot study of the RF gun model are presented. Results of particle dynamics simulation are also described.

  15. Method of electron emission control in RF guns

    CERN Document Server

    Khodak, I V

    2001-01-01

    The electron emission control method for an RF gun is considered. According to the main idea of the method, an additional resonance system is created in the cathode region, where the RF field strength can be varied using external pulse equipment. The additional resonance system is composed of a coaxial cavity coupled with the cylindrical cavity of the RF gun via an axial hole. Computed radiofrequency and electrodynamic performance of such a two-cavity system and results of a pilot study of the RF gun model are presented. Results of particle dynamics simulation are also described.

  16. Database Description - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Open TG-GATEs Pathological Image Database Database Description General information of database Database... name Open TG-GATEs Pathological Image Database Alternative name - DOI 10.18908/lsdba.nbdc00954-0...iomedical Innovation 7-6-8, Saito-asagi, Ibaraki-city, Osaka 567-0085, Japan TEL:81-72-641-9826 Email: Database... classification Toxicogenomics Database Organism Taxonomy Name: Rattus norvegi... Article title: Author name(s): Journal: External Links: Original website information Database

  17. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    Science.gov (United States)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  18. On a method for high-energy electron beam production in proton synchrotrons

    International Nuclear Information System (INIS)

    Bessonov, E.G.; Vazdik, Ya.A.

    1979-01-01

    It is suggested to produce high-energy electron beams by letting ultrarelativistic protons transfer part of their kinetic energy to the electrons of a thin target placed inside the working volume of the proton synchrotron. The kinematics of elastic scattering of relativistic protons on electrons at rest is treated. An estimate of the number of electrons elastically scattered by 1000 GeV and 3000 GeV proton beams is presented. The method under consideration is of practical interest and may prove preferable in a definite energy range of protons and electrons
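The kinematics referred to above obey a standard relativistic bound (not quoted in the abstract) on the kinetic energy an elastic collision can transfer from a projectile of mass M, speed βc and Lorentz factor γ to an electron at rest:

```latex
T_{\max} \;=\; \frac{2\, m_e c^2\, \beta^2 \gamma^2}{1 + 2\gamma\, m_e/M + (m_e/M)^2}
```

With β ≈ 1 and γ ≈ 1066 for 1000 GeV protons, the term 2γm_e/M ≈ 1.16, so T_max comes out to roughly 540 GeV, i.e. a sizable fraction of the proton energy, which is what makes the scheme attractive for producing high-energy electrons.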

  19. Gas Hydrate Research Database and Web Dissemination Channel

    Energy Technology Data Exchange (ETDEWEB)

    Micheal Frenkel; Kenneth Kroenlein; V Diky; R.D. Chirico; A. Kazakow; C.D. Muzny; M. Frenkel

    2009-09-30

    To facilitate advances in application of technologies pertaining to gas hydrates, a United States database containing experimentally-derived information about those materials was developed. The Clathrate Hydrate Physical Property Database (NIST Standard Reference Database #156) was developed by the TRC Group at NIST in Boulder, Colorado, paralleling a highly-successful database of thermodynamic properties of molecular pure compounds and their mixtures, and in association with an international effort on the part of CODATA to aid in international data sharing. Development and population of this database relied on the development of three components of information-processing infrastructure: (1) guided data capture (GDC) software designed to convert data and metadata into a well-organized, electronic format, (2) a relational data storage facility to accommodate all types of numerical and metadata within the scope of the project, and (3) a gas hydrate markup language (GHML) developed to standardize data communications between 'data producers' and 'data users'. Having developed the appropriate data storage and communication technologies, a web-based interface for both the new Clathrate Hydrate Physical Property Database, as well as Scientific Results from the Mallik 2002 Gas Hydrate Production Research Well Program was developed and deployed at http://gashydrates.nist.gov.

  20. Annual seminar on electronic sources of information

    International Nuclear Information System (INIS)

    Ravichandra Rao, I.K.

    2000-03-01

    With the rapid development of IT and the emergence of the Internet, a multitude of information sources are now available on electronic media. They include e-journals and other electronic publications - online databases, reference documents, newspapers, magazines, etc. In addition to these online sources, there are thousands of CD-ROM databases. The CD-ROM databases and the online sources are collectively referred to as electronic sources of information. Libraries in no part of the world can afford to ignore these sources. The emergence of these new sources has resulted in a change in traditional library functions, including collection development, acquisitions, cataloguing, user instruction, etc. It is inevitable that in the next five to ten years special libraries may have to allocate a considerable amount towards subscriptions to e-journals and other e-publications. The papers in this seminar volume discuss several aspects related to the theme of the seminar and cover e-journals, different sources available on the Net, classification of electronic sources, online public access catalogues, and different aspects of the Internet. Papers relevant to INIS are indexed separately

  1. System and method for authentication

    Science.gov (United States)

    Duerksen, Gary L.; Miller, Seth A.

    2015-12-29

    Described are methods and systems for determining authenticity. For example, the method may include providing an object of authentication, capturing characteristic data from the object of authentication, deriving authentication data from the characteristic data of the object of authentication, and comparing the authentication data with an electronic database comprising reference authentication data to provide an authenticity score for the object of authentication. The reference authentication data may correspond to one or more reference objects of authentication other than the object of authentication.
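A toy sketch of the comparison step, with a salted hash standing in for the patent's derivation of authentication data and exact equality standing in for its scoring. All names and data here are hypothetical; a real system would use a tolerant similarity metric over noisy characteristic data, not equality:

```python
import hashlib

def derive(characteristic: bytes, salt: bytes = b"demo") -> str:
    """Derive authentication data from captured characteristic data
    (illustrative choice: a salted SHA-256 digest)."""
    return hashlib.sha256(salt + characteristic).hexdigest()

# Toy reference database: object id -> reference authentication data.
REFERENCE_DB = {
    "banknote-42": derive(b"fiber pattern 0xA1B2"),
}

def authenticity_score(obj_id: str, characteristic: bytes) -> float:
    """Compare derived data with the reference entry; 1.0 = exact match."""
    ref = REFERENCE_DB.get(obj_id)
    if ref is None:
        return 0.0
    return 1.0 if derive(characteristic) == ref else 0.0

genuine = authenticity_score("banknote-42", b"fiber pattern 0xA1B2")
forged = authenticity_score("banknote-42", b"fiber pattern 0xFFFF")
```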

  2. Methods of measurements on incidental X-radiation from electron tubes

    International Nuclear Information System (INIS)

    1977-01-01

    The standard describes the method for detection of x-radiation and the method for the direct and indirect measurement of field pattern and exposure rate of random incidental radiation emanating from high voltage electron tubes. Required apparatus and calibration procedure for the exposure rate meter or film mount are described. (M.G.B.)

  3. SU-F-T-71: A Practical Method for Evaluation of Electron Virtual Source Position

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Z; Jiang, W; Stuart, B; Leu, S; Feng, Y [East Carolina University, Greenville, North Carolina (United States); Liu, T [Houston Methodist Hospital, Sugar Land, TX (United States)

    2016-06-15

    Purpose: Since electrons are easily scattered, the virtual source position for electrons is expected to be located below the x-ray target of medical linacs. However, the effective SSD method yields an electron virtual source position above the x-ray target for some applicators and energies in Siemens linacs. In this study, we propose using an IC Profiler (Sun Nuclear) to evaluate the electron virtual source position for the standard electron applicators at various electron energies. Methods: Profile measurements at various nominal source-to-detector distances (SDDs) of 100–115 cm were carried out for electron beam energies of 6–18 MeV. Two methods were used: one used a 0.125 cc ion chamber (PTW, Type 31010) with buildup mounted in a PTW water tank without water; the other used the IC Profiler with buildup to achieve charged-particle equilibrium. The full width at half-maximum (FWHM) method was used to determine the field sizes of the measured profiles. Back-projecting (by a straight line) the distance between the 50% points on the beam profiles for the various SDDs yielded the virtual source position for each applicator. Results: The profiles were obtained and the field sizes were determined by FWHM. The virtual source positions were determined through back-projection of the profiles for the applicators (5, 10, 15, 20, 25). For instance, they were 96.415 cm (IC Profiler) vs 95.844 cm (scanning ion chamber) for 9 MeV electrons with the 10×10 cm applicator and 97.160 cm vs 97.161 cm for 12 MeV electrons with the 10×10 cm applicator. The differences in the virtual source positions between the IC Profiler and the scanning ion chamber were within 1.5%. Conclusion: The IC Profiler provides a practical method for determining the electron virtual source position, and its results are consistent with those obtained from scanned ion chamber profiles with buildup.
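A sketch of the back-projection step under the stated assumptions: the 50% half-widths grow linearly with SDD, so a least-squares line through half-width vs. SDD extrapolates to zero at the virtual source position. The numbers below are synthetic, not the paper's measurements:

```python
def backproject_virtual_source(sdds, half_widths):
    """Fit half_width = m * (SDD - z_v) by least squares and return z_v,
    the x-intercept where the back-projected 50% edges converge."""
    n = len(sdds)
    mx = sum(sdds) / n
    my = sum(half_widths) / n
    sxx = sum((x - mx) ** 2 for x in sdds)
    sxy = sum((x - mx) * (y - my) for x, y in zip(sdds, half_widths))
    m = sxy / sxx                 # slope of half-width vs. SDD
    b = my - m * mx               # intercept
    return -b / m                 # half-width extrapolates to zero here

# Synthetic data generated from an assumed virtual source at z_v = 3.5 cm
# (distances in cm); a real analysis would use the measured 50% points.
z_true = 3.5
sdds = [100.0, 105.0, 110.0, 115.0]
widths = [0.052 * (d - z_true) for d in sdds]
z_est = backproject_virtual_source(sdds, widths)
```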

  4. SU-F-T-71: A Practical Method for Evaluation of Electron Virtual Source Position

    International Nuclear Information System (INIS)

    Huang, Z; Jiang, W; Stuart, B; Leu, S; Feng, Y; Liu, T

    2016-01-01

    Purpose: Since electrons are easily scattered, the virtual source position for electrons is expected to be located below the x-ray target of medical linacs. However, the effective SSD method yields an electron virtual source position above the x-ray target for some applicators and energies in Siemens linacs. In this study, we propose using an IC Profiler (Sun Nuclear) to evaluate the electron virtual source position for the standard electron applicators at various electron energies. Methods: Profile measurements at various nominal source-to-detector distances (SDDs) of 100–115 cm were carried out for electron beam energies of 6–18 MeV. Two methods were used: one used a 0.125 cc ion chamber (PTW, Type 31010) with buildup mounted in a PTW water tank without water; the other used the IC Profiler with buildup to achieve charged-particle equilibrium. The full width at half-maximum (FWHM) method was used to determine the field sizes of the measured profiles. Back-projecting (by a straight line) the distance between the 50% points on the beam profiles for the various SDDs yielded the virtual source position for each applicator. Results: The profiles were obtained and the field sizes were determined by FWHM. The virtual source positions were determined through back-projection of the profiles for the applicators (5, 10, 15, 20, 25). For instance, they were 96.415 cm (IC Profiler) vs 95.844 cm (scanning ion chamber) for 9 MeV electrons with the 10×10 cm applicator and 97.160 cm vs 97.161 cm for 12 MeV electrons with the 10×10 cm applicator. The differences in the virtual source positions between the IC Profiler and the scanning ion chamber were within 1.5%. Conclusion: The IC Profiler provides a practical method for determining the electron virtual source position, and its results are consistent with those obtained from scanned ion chamber profiles with buildup.

  5. Estimation of daily reference evapotranspiration (ETo) using artificial intelligence methods: Offering a new approach for lagged ETo data-based modeling

    Science.gov (United States)

    Mehdizadeh, Saeid

    2018-04-01

    Evapotranspiration (ET) is considered as a key factor in hydrological and climatological studies, agricultural water management, irrigation scheduling, etc. It can be directly measured using lysimeters. Moreover, other methods such as empirical equations and artificial intelligence methods can be used to model ET. In the recent years, artificial intelligence methods have been widely utilized to estimate reference evapotranspiration (ETo). In the present study, local and external performances of multivariate adaptive regression splines (MARS) and gene expression programming (GEP) were assessed for estimating daily ETo. For this aim, daily weather data of six stations with different climates in Iran, namely Urmia and Tabriz (semi-arid), Isfahan and Shiraz (arid), Yazd and Zahedan (hyper-arid) were employed during 2000-2014. Two types of input patterns consisting of weather data-based and lagged ETo data-based scenarios were considered to develop the models. Four statistical indicators including root mean square error (RMSE), mean absolute error (MAE), coefficient of determination (R2), and mean absolute percentage error (MAPE) were used to check the accuracy of models. The local performance of models revealed that the MARS and GEP approaches have the capability to estimate daily ETo using the meteorological parameters and the lagged ETo data as inputs. Nevertheless, the MARS had the best performance in the weather data-based scenarios. On the other hand, considerable differences were not observed in the models' accuracy for the lagged ETo data-based scenarios. In the innovation of this study, novel hybrid models were proposed in the lagged ETo data-based scenarios through combination of MARS and GEP models with autoregressive conditional heteroscedasticity (ARCH) time series model. It was concluded that the proposed novel models named MARS-ARCH and GEP-ARCH improved the performance of ETo modeling compared to the single MARS and GEP. In addition, the external
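The four accuracy indicators named above can be computed as follows (the observed/predicted values are toy numbers, not the study's data):

```python
import math

def error_metrics(obs, pred):
    """RMSE, MAE, R^2 and MAPE (%) for paired observed/predicted values."""
    n = len(obs)
    resid = [o - p for o, p in zip(obs, pred)]
    rmse = math.sqrt(sum(r * r for r in resid) / n)
    mae = sum(abs(r) for r in resid) / n
    mean_o = sum(obs) / n
    ss_res = sum(r * r for r in resid)
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    r2 = 1.0 - ss_res / ss_tot
    mape = 100.0 / n * sum(abs(r) / abs(o) for r, o in zip(resid, obs))
    return rmse, mae, r2, mape

obs = [3.0, 4.0, 5.0, 6.0]    # e.g. daily ETo (mm/day), illustrative
pred = [2.8, 4.1, 5.2, 5.9]
rmse, mae, r2, mape = error_metrics(obs, pred)
```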

  6. Trials by Juries: Suggested Practices for Database Trials

    Science.gov (United States)

    Ritterbush, Jon

    2012-01-01

    Librarians frequently utilize product trials to assess the content and usability of a database prior to committing funds to a new subscription or purchase. At the 2012 Electronic Resources and Libraries Conference in Austin, Texas, three librarians presented a panel discussion on their institutions' policies and practices regarding database…

  7. Optimising case detection within UK electronic health records : use of multiple linked databases for detecting liver injury

    NARCIS (Netherlands)

    Wing, Kevin; Bhaskaran, Krishnan; Smeeth, Liam; van Staa, Tjeerd P|info:eu-repo/dai/nl/304827762; Klungel, Olaf H|info:eu-repo/dai/nl/181447649; Reynolds, Robert F; Douglas, Ian

    2016-01-01

    OBJECTIVES: We aimed to create a 'multidatabase' algorithm for identification of cholestatic liver injury using multiple linked UK databases, before (1) assessing the improvement in case ascertainment compared to using a single database and (2) developing a new single-database case-definition

  8. Electronic Resource Management Systems

    Directory of Open Access Journals (Sweden)

    Mark Ellingsen

    2004-10-01

    Full Text Available Computer applications which deal with electronic resource management (ERM) are quite a recent development. They have grown out of the need to manage the burgeoning number of electronic resources, particularly electronic journals. Typically, in the early years of e-journal acquisition, library staff provided an easy means of accessing these journals by providing an alphabetical list on a web page. Some went as far as categorising the e-journals by subject and then grouping the journals either on a single web page or by using multiple pages. It didn't take long before it was recognised that it would be more efficient to dynamically generate the pages from a database rather than to continually edit the pages manually. Of course, once the descriptive metadata for an electronic journal was held within a database the next logical step was to provide administrative forms whereby that metadata could be manipulated. This in turn led to demands for incorporating more information and more functionality into the developing application.
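The pattern described above, generating the e-journal pages dynamically from stored metadata instead of hand-editing static HTML, can be sketched as follows (schema, titles and URLs are illustrative, with sqlite3 standing in for the library's actual database):

```python
import sqlite3

# Minimal e-journal metadata store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ejournal (title TEXT, subject TEXT, url TEXT)")
conn.executemany("INSERT INTO ejournal VALUES (?, ?, ?)", [
    ("Ariadne", "Library Science", "https://example.org/ariadne"),
    ("D-Lib Magazine", "Library Science", "https://example.org/dlib"),
])

def render_list(conn):
    """Generate the alphabetical e-journal list page from the database."""
    rows = conn.execute(
        "SELECT title, url FROM ejournal ORDER BY title").fetchall()
    items = "\n".join(f'<li><a href="{u}">{t}</a></li>' for t, u in rows)
    return f"<ul>\n{items}\n</ul>"

page = render_list(conn)
```

Adding or retiring a journal then only touches the database row; the page regenerates itself, which is exactly why the database-driven approach displaced hand-edited lists.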

  9. A Database for Decision-Making in Training and Distributed Learning Technology

    National Research Council Canada - National Science Library

    Stouffer, Virginia

    1998-01-01

    .... A framework for incorporating data about distributed learning courseware into the existing training database was devised and a plan for a national electronic courseware redistribution network was recommended...

  10. Limitations of the condensed history method for low-energy electrons

    International Nuclear Information System (INIS)

    Martin, W.R.; Ballinger, C.T.; Rathkopf, J.A.

    1991-01-01

    A systematic evaluation of the conventional, condensed history electron transport methodology has been performed through comparisons with more accurate single-scatter Monte Carlo calculations. These comparisons highlight the inaccuracies associated with the condensed history method and indicate its range of validity. The condensed history method is used in codes such as MCNP4, SANDYL, ETRAN, ITS, and EGS and requires a number of restrictive assumptions about the scattering characteristics to make tractable the analytical solution to the infinite-medium transport equation. Distributions describing electron characteristics after multiple collisions (multiscatter distributions) are constructed from such solutions and serve as the heart of the condensed history codes. A two-level approach is taken to quantify the errors inherent in condensed history. First, conventional condensed history multiscattering distributions in energy and angle are compared directly with analogous distributions generated with a single-scatter Monte Carlo code. This recently developed code directly simulates individual electron interactions. Second, the conventional distributions are replaced in the condensed history code by distributions constructed via a single-scatter Monte Carlo simulation

  11. The mass angular scattering power method for determining the kinetic energies of clinical electron beams

    International Nuclear Information System (INIS)

    Blais, N.; Podgorsak, E.B.

    1992-01-01

    A method for determining the kinetic energy of clinical electron beams is described, based on the measurement in air of the spatial spread of a pencil electron beam which is produced from the broad clinical electron beam. As predicted by the Fermi-Eyges theory, the dose distribution measured in air on a plane, perpendicular to the incident direction of the initial pencil electron beam, is Gaussian. The square of its spatial spread is related to the mass angular scattering power which in turn is related to the kinetic energy of the electron beam. The measured spatial spread may thus be used to determine the mass angular scattering power, which is then used to determine the kinetic energy of the electron beam from the known relationship between mass angular scattering power and kinetic energy. Energies obtained with the mass angular scattering power method agree with those obtained with the electron range method. (author)
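One way to extract the spatial spread from such a measured in-air Gaussian profile is via its full width at half-maximum, using the identity FWHM = 2·sqrt(2·ln 2)·sigma (a standard Gaussian relation; the profile below is synthetic, not measured data):

```python
import math

def sigma_from_fwhm(fwhm):
    """Gaussian spread from full width at half-maximum."""
    return fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

# Synthetic in-air pencil-beam profile with sigma = 2.0 cm,
# sampled on a 0.01 cm grid from -10 cm to +10 cm.
xs = [i * 0.01 - 10.0 for i in range(2001)]
prof = [math.exp(-x * x / (2.0 * 2.0 ** 2)) for x in xs]

# Locate the 50% points and convert FWHM to sigma.
half = max(prof) / 2.0
above = [x for x, p in zip(xs, prof) if p >= half]
fwhm = above[-1] - above[0]
sigma = sigma_from_fwhm(fwhm)
```

The squared spread obtained this way is what the authors relate, through Fermi-Eyges theory, to the mass angular scattering power and hence to the beam's kinetic energy.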

  12. Synthesis method for use in the design of an electron gun for a gyrotron

    International Nuclear Information System (INIS)

    Silva, C.A.B.

    1987-09-01

    In this work a synthesis method is applied to the design of an electron gun for a 94 GHz gyrotron. Using the synthesis method, the shape of the electrodes compatible with laminar flow is found, minimizing the effect of space charge on the electron velocity dispersion. A systematic procedure is presented to find the parameters of the synthesis method which, in turn, are closely related to the characteristics of the electron-optical system. (author) [pt

  13. Color electron microprobe cathodoluminescence of Bishunpur meteorite compared with the traditional optical microscopy method

    Directory of Open Access Journals (Sweden)

    Amanda Araujo Tosi

    Full Text Available Abstract Cathodoluminescence (CL) imaging is an outstanding method for the subclassification of Unequilibrated Ordinary Chondrites (UOC) - petrological type 3. CL can be obtained with several electron-beam apparatuses. The traditional method uses an electron gun coupled to an optical microscope (OM). Although many scanning electron microscopes (SEM) and electron microprobes (EPMA) are equipped with cathodoluminescence detectors, this technique has not been fully explored. Images obtained by the two methods differ because of the different kinds of signal acquisition: while CL-OM optical photography yields true colors, CL-EPMA results are grayscale monochromatic electronic signals. L-RGB filters were therefore used in the CL-EPMA analysis to obtain color data. The aim of this work is to compare cathodoluminescence data obtained by both techniques, optical microscope and electron microprobe, on the Bishunpur meteorite, classified as an LL 3.1 chondrite. The present study allows the conclusion that 20 keV and 7 nA are the best analytical conditions at the EPMA for testing the equivalence between CL-EPMA and CL-OM color results. Moreover, the color index proved to be a method for aiding the study of thermal metamorphism, but it is not definitive for meteorite classification.

  14. Electronic couplings for molecular charge transfer: Benchmarking CDFT, FODFT, and FODFTB against high-level ab initio calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kubas, Adam; Blumberger, Jochen, E-mail: j.blumberger@ucl.ac.uk [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Hoffmann, Felix [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Lehrstuhl für Theoretische Chemie, Ruhr-Universität Bochum, Universitätsstr. 150, 44801 Bochum (Germany); Heck, Alexander; Elstner, Marcus [Institute of Physical Chemistry, Karlsruhe Institute of Technology, Fritz-Haber-Weg 6, 76131 Karlsruhe (Germany); Oberhofer, Harald [Department of Chemistry, Technical University of Munich, Lichtenbergstr. 4, 85747 Garching (Germany)

    2014-03-14

    We introduce a database (HAB11) of electronic coupling matrix elements (H{sub ab}) for electron transfer in 11 π-conjugated organic homo-dimer cations. High-level ab initio calculations at the multireference configuration interaction MRCI+Q level of theory, n-electron valence state perturbation theory NEVPT2, and (spin-component scaled) approximate coupled cluster model (SCS)-CC2 are reported for this database to assess the performance of three DFT methods of decreasing computational cost, including constrained density functional theory (CDFT), fragment-orbital DFT (FODFT), and fragment-orbital density functional tight-binding (FODFTB). We find that the CDFT approach in combination with a modified PBE functional containing 50% Hartree-Fock exchange gives best results for absolute H{sub ab} values (mean relative unsigned error = 5.3%) and exponential distance decay constants β (4.3%). CDFT in combination with pure PBE overestimates couplings by 38.7% due to a too diffuse excess charge distribution, whereas the economic FODFT and highly cost-effective FODFTB methods underestimate couplings by 37.6% and 42.4%, respectively, due to neglect of interaction between donor and acceptor. The errors are systematic, however, and can be significantly reduced by applying a uniform scaling factor for each method. Applications to dimers outside the database, specifically rotated thiophene dimers and larger acenes up to pentacene, suggest that the same scaling procedure significantly improves the FODFT and FODFTB results for larger π-conjugated systems relevant to organic semiconductors and DNA.
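
    The uniform scaling correction mentioned above can be sketched as a one-parameter least-squares fit: find the scale factor that best maps the systematically biased DFT couplings onto the reference values. The coupling values below are hypothetical stand-ins, not HAB11 data.

```python
import numpy as np

def optimal_scale(h_dft, h_ref):
    """Least-squares uniform scaling factor c minimizing ||c*h_dft - h_ref||^2."""
    h_dft, h_ref = np.asarray(h_dft), np.asarray(h_ref)
    return np.dot(h_dft, h_ref) / np.dot(h_dft, h_dft)

def mrue(h_est, h_ref):
    """Mean relative unsigned error, in percent."""
    h_est, h_ref = np.asarray(h_est), np.asarray(h_ref)
    return 100.0 * np.mean(np.abs(h_est - h_ref) / np.abs(h_ref))

# hypothetical couplings (meV): a method that underestimates by roughly 40%
# with a small non-systematic scatter on top
h_ref = np.array([519.2, 456.3, 202.6, 79.9, 28.9])
h_dft = 0.62 * h_ref * np.array([1.02, 0.97, 1.01, 0.99, 1.03])

c = optimal_scale(h_dft, h_ref)
error_before = mrue(h_dft, h_ref)        # large, systematic
error_after = mrue(c * h_dft, h_ref)     # only the residual scatter remains
```

Because the error is dominated by a single multiplicative bias, one fitted constant removes most of it; the residual error reflects only the non-systematic scatter.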

  15. METHOD OF ELECTRON BEAM PROCESSING

    DEFF Research Database (Denmark)

    2003-01-01

    As a rule, electron beam welding takes place in a vacuum. However, this means that the workpieces in question have to be placed in a vacuum chamber and removed therefrom after welding. This is time-consuming and a serious limitation of a process whose greatest advantage is the option of welding workpieces of large thicknesses. The idea is therefore to guide the electron beam (2) to the workpiece via a hollow wire, said wire thereby acting as a prolongation of the vacuum chamber (4) down to the workpiece. Thus, a workpiece need not be placed inside the vacuum chamber, thereby exploiting the potential of electron beam processing to a greater degree than previously possible, for example in electron beam welding.

  16. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    Science.gov (United States)

    Zhang, S. Y.; Shen, G. H.; Sun, Y.; Zhou, D. Z.; Zhang, X. X.; Li, J. W.; Huang, C.; Zhang, X. G.; Dong, Y. J.; Zhang, W. J.; Zhang, B. Q.; Shi, C. Y.

    2016-05-01

    Space-based electron detectors are commonly tested using radioactive β-sources, which emit a continuous spectrum without spectral lines. Therefore, the tests are often considered only qualitative. This paper introduces a method which yields more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference 90Sr/90Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.
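
    The core idea, folding a reference spectrum through a simulated instrument response function and inverting the measurement, can be sketched with a toy response matrix. The spectrum shape, smearing width, and geometric factor below are illustrative, not the ODPD's actual IRF.

```python
import numpy as np

n = 40                                   # energy bins
E = np.linspace(0.1, 2.3, n)             # MeV, bin centers

# toy beta-like reference spectrum (endpoint near the 90Y value of 2.28 MeV)
ref = (2.28 - E).clip(0) * E
ref /= ref.sum()

# toy instrument response: Gaussian energy smearing, column-normalized
sigma = 0.08                             # MeV, illustrative resolution
R = np.exp(-0.5 * ((E[:, None] - E[None, :]) / sigma) ** 2)
R /= R.sum(axis=0, keepdims=True)
G = 0.85                                 # geometric factor, illustrative

measured = G * R @ ref                   # what the instrument would record

# invert the measurement using the simulated IRF, then compare with the
# reference spectrum; close agreement validates the assumed G and R
recovered = np.linalg.pinv(G * R) @ measured
```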

  17. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, methods are designed to be accomplished by individual Landsat scene. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show the vast majority of impervious surface change associated with urban development was captured, with overall RMSE from 6.86 to 13.12% for these areas. Changes of urban development density were also evaluated by characterizing the categories of change by percentile for impervious surface. This prototype method provides a relatively low cost, flexible approach to generate updated impervious surface using NLCD 2001 as the baseline.
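
    The change-detection step, computing per-pixel change vectors between the two normalized image dates and applying a conservative threshold, can be sketched on synthetic data. The band count, noise level, and threshold rule below are illustrative; the actual NLCD method uses per-class thresholds derived from the 2001 land cover.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic normalized reflectance: two dates, 6 bands, 100 x 100 pixels
bands, h, w = 6, 100, 100
t2001 = rng.uniform(0.0, 0.5, (bands, h, w))
t2006 = t2001 + rng.normal(0.0, 0.01, (bands, h, w))   # mostly stable scene
t2006[:, 40:60, 40:60] += 0.2                          # a patch of new development

# change-vector magnitude per pixel across all bands
cv = np.sqrt(((t2006 - t2001) ** 2).sum(axis=0))

# conservative threshold (a single global rule here; NLCD uses class-specific ones)
threshold = cv.mean() + 3.0 * cv.std()
changed = cv > threshold

# impervious surface would then be re-estimated only inside `changed`,
# carrying the 2001 values forward everywhere else
```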

  18. Method for the determination of the three-dimensional structure of ultrashort relativistic electron bunches

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca; Ilinski, Petr; Saldin, Evgeni; Schneidmiller, Evgeni; Yurkov, Mikhail

    2009-05-15

    We describe a novel technique to characterize ultrashort electron bunches in X-ray free-electron lasers. Namely, we propose to use coherent optical transition radiation to measure three-dimensional (3D) electron density distributions. Our method relies on the combination of two known diagnostic setups, an Optical Replica Synthesizer (ORS) and an Optical Transition Radiation (OTR) imager. Electron bunches are modulated at optical wavelengths in the ORS setup. When these electron bunches pass through a metal foil target, coherent radiation pulses with powers of tens of megawatts are generated. It is thereafter possible to exploit the advantages of coherent imaging techniques, such as direct imaging, diffractive imaging, Fourier holography and their combinations. The proposed method opens up the possibility of real-time, wavelength-limited, single-shot 3D imaging of an ultrashort electron bunch. (orig.)

  19. Update History of This Database - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Update History of This Database - Yeast Interacting Proteins Database. 2010/03/29: the English archive site is opened. 2000/12/4: the Yeast Interacting Proteins Database ( http://itolab.cb.k.u-tokyo.ac.jp/Y2H/ ) is released.

  20. A simplified spherical harmonic method for coupled electron-photon transport calculations

    International Nuclear Information System (INIS)

    Josef, J.A.

    1996-12-01

    In this thesis we have developed a simplified spherical harmonic method (the SP_N method) and associated efficient solution techniques for 2-D multigroup electron-photon transport calculations. The SP_N method has never before been applied to charged-particle transport. We have performed a first-time Fourier analysis of the source iteration scheme and the P_1 diffusion synthetic acceleration (DSA) scheme applied to the 2-D SP_N equations. Our theoretical analyses indicate that the source iteration and P_1 DSA schemes are as effective for the 2-D SP_N equations as for the 1-D S_N equations. Previous analyses have indicated that the P_1 DSA scheme is unstable (with sufficiently forward-peaked scattering and sufficiently small absorption) for the 2-D S_N equations, yet is very effective for the 1-D S_N equations. In addition, we have applied an angular multigrid acceleration scheme, and computationally demonstrated that it performs as well for the 2-D SP_N equations as for the 1-D S_N equations. It has previously been shown for 1-D S_N calculations that this scheme is much more effective than the DSA scheme when scattering is highly forward-peaked. We have investigated the applicability of the SP_N approximation to two different physical classes of problems: satellite electronics shielding from geomagnetically trapped electrons, and electron beam problems. In the space shielding study, the SP_N method produced solutions that are accurate within 10% of the benchmark Monte Carlo solutions, and often orders of magnitude faster than Monte Carlo. We have successfully modeled quasi-void problems and have obtained excellent agreement with Monte Carlo. We have observed that the SP_N method appears to be too diffusive an approximation for beam problems. This result, however, is in agreement with theoretical expectations.

  1. Implementation of an interactive database interface utilizing HTML, PHP, JavaScript, and MySQL in support of water quality assessments in the Northeastern North Carolina Pasquotank Watershed

    Science.gov (United States)

    Guion, A., Jr.; Hodgkins, H.

    2015-12-01

    The Center of Excellence in Remote Sensing Education and Research (CERSER) has implemented three research projects during the summer Research Experience for Undergraduates (REU) program gathering water quality data for local waterways. The data has been compiled manually utilizing pen and paper and then entered into a spreadsheet. With the spread of electronic devices capable of interacting with databases, the development of an electronic method of entering and manipulating the water quality data was pursued during this project. This project focused on the development of an interactive database to gather, display, and analyze data collected from local waterways. The database and entry form was built in MySQL on a PHP server allowing participants to enter data from anywhere Internet access is available. This project then researched applying this data to the Google Maps site to provide labeling and information to users. The NIA server at http://nia.ecsu.edu is used to host the application for download and for storage of the databases. Water Quality Database Team members included the authors plus Derek Morris Jr., Kathryne Burton and Mr. Jeff Wood as mentor.
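
    The kind of schema and entry-form workflow described above can be sketched with Python's built-in sqlite3 standing in for the project's MySQL/PHP stack. The table name, columns, and sample values are illustrative, not the project's actual schema.

```python
import sqlite3

# in-memory stand-in for the MySQL database described above
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE water_quality (
        id           INTEGER PRIMARY KEY,
        site         TEXT NOT NULL,      -- sampling site on the waterway
        sampled_at   TEXT NOT NULL,      -- ISO-8601 timestamp
        lat          REAL, lon REAL,     -- for a Google Maps overlay
        temp_c       REAL,
        ph           REAL,
        dissolved_o2 REAL                -- mg/L
    )""")

# the insert a web entry form would submit (parameterized to avoid injection)
con.execute(
    "INSERT INTO water_quality (site, sampled_at, lat, lon, temp_c, ph, dissolved_o2) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("Pasquotank River", "2015-07-14T10:30:00", 36.29, -76.25, 27.5, 7.1, 6.8))

# a query the analysis/display page might run
rows = con.execute(
    "SELECT site, ph FROM water_quality WHERE ph BETWEEN 6.5 AND 8.5").fetchall()
```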

  2. Application of the CTOF method to detect secondary charged particles from 2 GeV electrons

    International Nuclear Information System (INIS)

    Takahashi, Kazutoshi; Sanami, Toshiya; Ban, Syuichi; Lee, Hee-Seok; Sato, Tatsuhiko

    2002-01-01

    To design shielding and evaluate leakage radiation at high-energy electron accelerators, energy and angular data on secondary particles from the reactions of electrons with structural materials are required. Secondary neutron spectra from structural materials have been measured using the electron accelerator at PAL (Pohang Accelerator Laboratory). In the neutron measurements, electronics with a multi-hit TDC (MHTDC) were adopted to measure the time of flight (TOF) of every particle emitted from the reactions induced by each single electron bunch. The measurements have been extended to secondary charged particles. For charged-particle measurement, pulse-height data for every particle are indispensable to distinguish charged particles by the ΔE-E method, so a new system that can measure the pulse height of every particle is required instead of the MHTDC system. To meet this requirement, a method that records the output current from the detectors with a digital storage oscilloscope was developed; it is named the ''current time of flight'' (CTOF) method. The CTOF method can measure the pulse height and TOF of every particle produced by a single electron bunch. Electrons are accelerated to 2.04 GeV at a repetition rate of 10 Hz. These electrons bombard thin disk samples of Cu (1 mm), Al (4 mm) and W (0.5 mm). Secondary charged particles, protons and deuterons, are produced in the samples by photonuclear reactions. The two-dimensional ΔE-E spectrum measured by CTOF for each of the samples shows perfect separation between protons and deuterons; thus, proton and deuteron spectra are obtained from these data. (M. Suetake)
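
    The ΔE-E separation works because, for nonrelativistic ions of the same charge, the thin-detector energy loss scales roughly as ΔE ∝ m/E, so the product E·ΔE clusters by particle mass. A toy sketch with synthetic proton and deuteron events (the detector constant and resolution are arbitrary, not the PAL setup):

```python
import numpy as np

rng = np.random.default_rng(2)

def delta_e(E, mass_ratio):
    """Thin-detector energy loss, crude nonrelativistic Bethe scaling:
    dE ~ k * m * z^2 / E, with z = 1 for both protons and deuterons."""
    k = 5.0  # detector/material constant, arbitrary units
    return k * mass_ratio / E

# synthetic events: 1000 protons (m = 1) and 1000 deuterons (m = 2), 5-50 MeV,
# with 3% Gaussian resolution on the Delta-E signal
E_p = rng.uniform(5.0, 50.0, 1000)
E_d = rng.uniform(5.0, 50.0, 1000)
dE_p = delta_e(E_p, 1.0) * rng.normal(1.0, 0.03, 1000)
dE_d = delta_e(E_d, 2.0) * rng.normal(1.0, 0.03, 1000)

# E * dE is approximately constant per species, so one cut separates them
pid_p = E_p * dE_p   # clusters near 5
pid_d = E_d * dE_d   # clusters near 10
cut = 7.5
```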

  3. Observational database for studies of nearby universe

    Science.gov (United States)

    Kaisina, E. I.; Makarov, D. I.; Karachentsev, I. D.; Kaisin, S. S.

    2012-01-01

    We present a description of a database of galaxies of the Local Volume (LVG), located within 10 Mpc of the Milky Way. It contains more than 800 objects. Based on an analysis of functional capabilities, we chose the PostgreSQL DBMS as the management system for our LVG database. Applying semantic modelling methods, we developed a physical ER-model of the database. We describe the developed architecture of the database table structure and the implemented web access, available at http://www.sao.ru/lv/lvgdb.

  4. Dereplication of plant phenolics using a mass-spectrometry database independent method.

    Science.gov (United States)

    Borges, Ricardo M; Taujale, Rahil; de Souza, Juliana Santana; de Andrade Bezerra, Thaís; Silva, Eder Lana E; Herzog, Ronny; Ponce, Francesca V; Wolfender, Jean-Luc; Edison, Arthur S

    2018-05-29

    Dereplication, an approach to sidestep the effort involved in the isolation of known compounds, is generally accepted as the first stage of novel discoveries in natural product research. It is based on metabolite profiling analysis of complex natural extracts. The objective here is to present the application of LipidXplorer for automatic targeted dereplication of phenolics in plant crude extracts based on direct-infusion high-resolution tandem mass spectrometry data. LipidXplorer uses a user-defined molecular fragmentation query language (MFQL) to search for specific characteristic fragmentation patterns in large data sets and highlight the corresponding metabolites. To this end, MFQL files were written to dereplicate common phenolics occurring in plant extracts, and complementary MFQL files were used for validation purposes. New MFQL files with molecular formula restrictions for common classes of phenolic natural products were generated for the metabolite profiling of different representative crude plant extracts. The method was evaluated against open-source software for mass spectrometry data processing (MZmine) and against manual annotation based on published data. The targeted LipidXplorer method, implemented using common phenolic fragmentation patterns, was able to annotate more phenolics than MZmine, which relies on automated queries against the available databases. Additionally, screening for ascarosides, natural products structurally unrelated to plant phenolics and collected from the nematode Caenorhabditis elegans, demonstrated the specificity of this method by cross-testing both groups of chemicals in both plants and nematodes. Copyright © 2018 John Wiley & Sons, Ltd.
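
    The fragment-pattern idea behind an MFQL query can be sketched generically in Python (this is not MFQL syntax): flag a feature as a candidate member of a compound class when its MS/MS peak list contains all of the class's characteristic fragment ions within a mass tolerance. The fragment masses and tolerance below are illustrative, not validated values.

```python
# characteristic fragment ions (m/z) for a hypothetical phenolic class
FLAVONOID_FRAGMENTS = [153.0182, 179.0338]

def matches_pattern(ms2_peaks, fragments, tol=0.005):
    """True if every characteristic fragment appears in the (m/z, intensity) list."""
    return all(
        any(abs(mz - frag) <= tol for mz, _ in ms2_peaks)
        for frag in fragments
    )

# toy MS/MS spectrum of one feature from a crude extract
spectrum = [(153.0185, 820.0), (179.0340, 455.0), (285.0405, 1000.0)]
hit = matches_pattern(spectrum, FLAVONOID_FRAGMENTS)
```

Real MFQL queries add molecular formula restrictions and intensity conditions on top of this kind of mass matching.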

  5. Method of synthesizing small-diameter carbon nanotubes with electron field emission properties

    Science.gov (United States)

    Liu, Jie (Inventor); Du, Chunsheng (Inventor); Qian, Cheng (Inventor); Gao, Bo (Inventor); Qiu, Qi (Inventor); Zhou, Otto Z. (Inventor)

    2009-01-01

    Carbon nanotube materials having an outer diameter less than 10 nm and fewer than ten walls are disclosed. Also disclosed is an electron field emission device including a substrate, an optional adhesion-promoting layer, and a layer of electron field emission material. The electron field emission material includes a carbon nanotube having from two to ten concentric graphene shells per tube, an outer diameter of from 2 to 8 nm, and a nanotube length greater than 0.1 microns. One method to fabricate carbon nanotubes includes the steps of (a) producing a catalyst containing Fe and Mo supported on MgO powder, (b) using a mixture of hydrogen and a carbon-containing gas as precursors, and (c) heating the catalyst to a temperature above 950 °C to produce a carbon nanotube. Another method, of fabricating an electron field emission cathode, includes the steps of (a) synthesizing electron field emission materials containing carbon nanotubes with from two to ten concentric graphene shells per tube, an outer diameter of from 2 to 8 nm, and a length greater than 0.1 microns, (b) dispersing the electron field emission material in a suitable solvent, (c) depositing the electron field emission materials onto a substrate, and (d) annealing the substrate.

  6. Database Description - RMOS | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name RMOS Alternative nam...arch Unit Shoshi Kikuchi E-mail : Database classification Plant databases - Rice Microarray Data and other Gene Expression Database...s Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database description The Ric...19&lang=en Whole data download - Referenced database Rice Expression Database (RED) Rice full-length cDNA Database... (KOME) Rice Genome Integrated Map Database (INE) Rice Mutant Panel Database (Tos17) Rice Genome Annotation Database

  7. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

    Full Text Available As a high-level development environment, the Java technologies offer support for the development of distributed, platform-independent applications, providing a robust set of methods to access databases, used to create software components on the server side as well as on the client side. Analyzing the evolution of Java tools for data access, we notice that these tools evolved from simple methods permitting queries, insertion, update and deletion of data to advanced implementations such as distributed transactions, cursors and batch files. The client-server architecture allows, through JDBC (Java Database Connectivity), the execution of SQL (Structured Query Language) instructions and the manipulation of the results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to allow SQL queries to be issued against any DBMS (Database Management System). The JDBC native driver and the ODBC (Open Database Connectivity)-JDBC bridge, as well as the classes and interfaces of the JDBC API, will be described. The four steps needed to build a JDBC-driven application are presented briefly, emphasizing the way each step has to be accomplished and the expected results. In each step there are evaluations of the characteristics of the database systems and the way the JDBC programming interface adapts to each one. The data types provided by the SQL2 and SQL3 standards are analyzed by comparison with the Java data types, emphasizing the discrepancies between these and the SQL types, but also the methods that allow conversion between different types of data through the methods of the ResultSet object. Next, starting from the metadata role and studying the Java programming interfaces that allow the query of result sets, we will describe the advanced features of data mining with JDBC. As an alternative to result sets, the Rowsets add new functionalities that

  8. A case-control study of autism and mumps-measles-rubella vaccination using the general practice research database: design and methodology

    Directory of Open Access Journals (Sweden)

    Huang Xiangning

    2001-02-01

    Full Text Available Abstract Background An association between mumps-measles-rubella (MMR) vaccination and the onset of symptoms typical of autism has recently been suggested. This has led to considerable concern about the safety of the vaccine. Methods A matched case-control study using data derived from the United Kingdom General Practice Research Database. Children with a possible diagnosis of autism will be identified from their electronic health records. All diagnoses will be validated by a detailed review of hospital letters and by using information derived from a parental questionnaire. Ten controls per case will be selected from the database. Conditional logistic regression will be used to assess the association between MMR vaccination and autism. In addition, case series analyses will be undertaken to estimate the relative incidence of onset of autism in defined time intervals after vaccination. The study is funded by the United Kingdom Medical Research Council. Discussion Electronic health databases offer tremendous opportunities for evaluating the adverse effects of vaccines. However, there is much scope for bias and confounding. The rigorous validation of all diagnoses and the collection of additional information by parental questionnaire in this study are essential to minimise the possibility of misleading results.

  9. What to Ask Women Composers: Feminist Fieldwork in Electronic Dance Music

    Directory of Open Access Journals (Sweden)

    Magdalena Olszanowski

    2012-11-01

    Full Text Available This article reflects upon the research methods employed for microfemininewarfare (2012), an interactive database documentary that investigates female electronic dance music (EDM) artists. The purpose of the documentary is to feature the contributions of women as composers, to show how they came to be composers and to reveal the tactics used to approach significant issues of gender in the male-dominated world of EDM. I highlight the theoretical and methodological processes that went into the making of this documentary, subtitled “exploring women’s space in electronic music”. By constructing “electronic music by women” as a category, two objectives are addressed: first, the visibility of women’s contribution to the musical tradition is heightened; and, second, it allows an exploration of the broadening of discourses about female subjectivity. This article showcases feminist research-creation and friendship-as-method as effective research methods to glean meaningful content when applied to EDM fieldwork.

  10. The MPI facial expression database--a validated database of emotional and conversational facial expressions.

    Directory of Open Access Journals (Sweden)

    Kathrin Kaulard

    Full Text Available The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision to investigate the processing of a wider range of natural

  11. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for integrated management of the Liquid Metal Reactor Design Technology Development project using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results produced during phase II of the Liquid Metal Reactor Design Technology Development mid- and long-term nuclear R and D program. IOC is a linkage control system between subprojects for sharing and integrating the research results for KALIMER. The 3D CAD database provides a schematic design overview of KALIMER. The Team Cooperation system informs team members about research cooperation and meetings. Finally, the KALIMER Reserved Documents were developed to manage collected data and other documents after project accomplishment. This report describes the hardware and software features and the database design methodology for KALIMER.

  12. Ei Compendex: A new database makes life easier for engineers

    CERN Multimedia

    2001-01-01

    The Library is expanding its range of databases. The latest arrival, called Ei Compendex, is the world's most comprehensive engineering database, which indexes engineering literature published throughout the world. It also offers bibliographic entries for articles published in scientific journals and for conference proceedings and covers an extensive range of subjects from mechanical engineering to the environment, materials science, solid state physics and superconductivity. Moreover, it is the most relevant quality control and engineering management database. Ei Compendex contains over 4.6 million references from over 2600 journals, conference proceedings and technical reports dating from 1966 to the present. Every year, 220,000 new abstracts are added to the database which is also updated on a weekly basis. In the case of articles published in recent years, it provides an electronic link to the full texts of all major publishers. The database also contains the full texts of Elsevier periodicals (over 250...

  13. Apparatus and method for generating high density pulses of electrons

    International Nuclear Information System (INIS)

    Lee, C.; Oettinger, P.E.

    1981-01-01

    An apparatus and method are described for the production of high-density pulses of electrons using a laser-energized emitter. Caesium atoms from a low-pressure vapour atmosphere are adsorbed on and migrate over a metallic target rapidly heated by a laser to a high temperature. Because this heating time is short compared with the residence time of the caesium atoms adsorbed on the target surface, copious electrons are emitted, which form a high-current-density pulse. (U.K.)

  14. Electronic Versus Manual Data Processing: Evaluating the Use of Electronic Health Records in Out-of-Hospital Clinical Research

    Science.gov (United States)

    Newgard, Craig D.; Zive, Dana; Jui, Jonathan; Weathers, Cody; Daya, Mohamud

    2011-01-01

    Objectives To compare case ascertainment, agreement, validity, and missing values for clinical research data obtained, processed, and linked electronically from electronic health records (EHR), compared to “manual” data processing and record abstraction in a cohort of out-of-hospital trauma patients. Methods This was a secondary analysis of two sets of data collected for a prospective, population-based, out-of-hospital trauma cohort evaluated by 10 emergency medical services (EMS) agencies transporting to 16 hospitals, from January 1, 2006 through October 2, 2007. Eighteen clinical, operational, procedural, and outcome variables were collected and processed separately and independently using two parallel data processing strategies, by personnel blinded to patients in the other group. The electronic approach included electronic health record data exports from EMS agencies, reformatting and probabilistic linkage to outcomes from local trauma registries and state discharge databases. The manual data processing approach included chart matching, data abstraction, and data entry by a trained abstractor. Descriptive statistics, measures of agreement, and validity were used to compare the two approaches to data processing. Results During the 21-month period, 418 patients underwent both data processing methods and formed the primary cohort. Agreement was good to excellent (kappa 0.76 to 0.97; intraclass correlation coefficient 0.49 to 0.97), with exact agreement in 67% to 99% of cases, and a median difference of zero for all continuous and ordinal variables. The proportions of missing out-of-hospital values were similar between the two approaches, although electronic processing generated more missing outcomes (87 out of 418, 21%, 95% CI = 17% to 25%) than the manual approach (11 out of 418, 3%, 95% CI = 1% to 5%). Case ascertainment of eligible injured patients was greater using electronic methods (n = 3,008) compared to manual methods (n = 629). Conclusions In this
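
    The probabilistic linkage step mentioned above can be sketched in a Fellegi-Sunter style: each compared field contributes a log-likelihood agreement weight, and record pairs scoring above a threshold are accepted as links. The fields, m/u probabilities, and threshold below are hypothetical, not the study's actual linkage parameters.

```python
import math

# per-field probabilities: m = P(agree | true match), u = P(agree | non-match)
FIELDS = {
    "date_of_birth": (0.98, 0.01),
    "sex":           (0.99, 0.50),
    "incident_date": (0.95, 0.05),
    "destination":   (0.90, 0.10),
}

def link_weight(ems_record, registry_record):
    """Sum of log2 likelihood-ratio weights across compared fields."""
    w = 0.0
    for field, (m, u) in FIELDS.items():
        if ems_record.get(field) == registry_record.get(field):
            w += math.log2(m / u)          # agreement weight (positive)
        else:
            w += math.log2((1 - m) / (1 - u))  # disagreement weight (negative)
    return w

ems = {"date_of_birth": "1960-04-02", "sex": "M",
       "incident_date": "2006-03-11", "destination": "H16"}
registry = {"date_of_birth": "1960-04-02", "sex": "M",
            "incident_date": "2006-03-11", "destination": "H16"}
other = {"date_of_birth": "1971-09-30", "sex": "F",
         "incident_date": "2007-01-02", "destination": "H03"}

THRESHOLD = 5.0   # accept pairs above this total weight (requires tuning)
```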

  15. Improving decoy databases for protein folding algorithms

    KAUST Repository

    Lindsey, Aaron; Yeh, Hsin-Yi (Cindy); Wu, Chih-Peng; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    energetically stable) from non-native structures. Decoy databases are collections of non-native structures used to test and verify these functions. We present a method to evaluate and improve the quality of decoy databases by adding novel structures and removing

  16. CANDID: Comparison algorithm for navigating digital image databases

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, P.M.; Cannon, T.M.

    1994-02-21

In this paper, we propose a method for calculating the similarity between two digital images. A global signature describing the texture, shape, or color content is first computed for every image stored in a database, and a normalized distance between probability density functions of feature vectors is used to match signatures. This method can be used to retrieve images from a database that are similar to an example target image. The algorithm is applied to the problem of search and retrieval for a database containing pulmonary CT imagery, and experimental results are provided.
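The pipeline above (a global signature per image, then a normalized distance between signatures) can be sketched as follows; the intensity histogram and L2 distance here are illustrative stand-ins for CANDID's actual texture/shape/color features:

```python
import numpy as np

def signature(image, bins=16):
    """Global signature: normalized intensity histogram (a stand-in for
    texture/shape/color feature densities)."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

def distance(sig_a, sig_b):
    """Normalized Euclidean distance between two signatures (in [0, 1])."""
    return np.linalg.norm(sig_a - sig_b) / np.sqrt(2.0)

def retrieve(target, database, k=3):
    """Return the k database keys whose signatures are closest to the target."""
    t = signature(target)
    ranked = sorted(database, key=lambda name: distance(t, signature(database[name])))
    return ranked[:k]

rng = np.random.default_rng(0)
db = {
    "dark":   rng.integers(0, 80, size=(32, 32)),
    "mid":    rng.integers(80, 160, size=(32, 32)),
    "bright": rng.integers(160, 256, size=(32, 32)),
}
query = rng.integers(0, 80, size=(32, 32))
print(retrieve(query, db, k=1))  # the "dark" image ranks first
```

A real system would precompute and index the signatures so only the cheap signature-to-signature comparison runs at query time.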

  17. Introduction of the American Academy of Facial Plastic and Reconstructive Surgery FACE TO FACE Database.

    Science.gov (United States)

    Abraham, Manoj T; Rousso, Joseph J; Hu, Shirley; Brown, Ryan F; Moscatello, Augustine L; Finn, J Charles; Patel, Neha A; Kadakia, Sameep P; Wood-Smith, Donald

    2017-07-01

The American Academy of Facial Plastic and Reconstructive Surgery FACE TO FACE database was created to gather and organize patient data primarily from international humanitarian surgical mission trips, as well as local humanitarian initiatives. Similar to cloud-based Electronic Medical Records, this web-based user-generated database allows for more accurate tracking of provider and patient information and outcomes, regardless of site, and is useful when coordinating follow-up care for patients. The database is particularly useful on international mission trips as there are often different surgeons who may provide care to patients on subsequent missions, and patients who may visit more than 1 mission site. Ultimately, by pooling data across multiple sites and over time, the database has the potential to be a useful resource for population-based studies and outcome data analysis. The objective of this paper is to delineate the process involved in creating the AAFPRS FACE TO FACE database, to assess its functional utility, to draw comparisons to electronic medical records systems that are now widely implemented, and to explain the specific benefits and disadvantages of the use of the database as it was implemented on recent international surgical mission trips.

  18. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI): A Completed Reference Database of Lung Nodules on CT Scans

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-02-15

Purpose: The development of computer-aided diagnostic (CAD) methods for lung nodule detection, classification, and quantitative assessment can be facilitated through a well-characterized repository of computed tomography (CT) scans. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) completed such a database, establishing a publicly available reference for the medical imaging research community. Initiated by the National Cancer Institute (NCI), further advanced by the Foundation for the National Institutes of Health (FNIH), and accompanied by the Food and Drug Administration (FDA) through active participation, this public-private partnership demonstrates the success of a consortium founded on a consensus-based process. Methods: Seven academic centers and eight medical imaging companies collaborated to identify, address, and resolve challenging organizational, technical, and clinical issues to provide a solid foundation for a robust database. The LIDC/IDRI Database contains 1018 cases, each of which includes images from a clinical thoracic CT scan and an associated XML file that records the results of a two-phase image annotation process performed by four experienced thoracic radiologists. In the initial blinded-read phase, each radiologist independently reviewed each CT scan and marked lesions belonging to one of three categories ("nodule ≥3 mm," "nodule <3 mm," and "non-nodule ≥3 mm"). In the subsequent unblinded-read phase, each radiologist independently reviewed their own marks along with the anonymized marks of the three other radiologists to render a final opinion. The goal of this process was to identify as completely as possible all lung nodules in each CT scan without requiring forced consensus. Results: The Database contains 7371 lesions marked "nodule" by at least one radiologist. 2669 of these lesions were marked

  19. Database Description - SAHG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name SAHG Alternative nam...h: Contact address Chie Motono Tel : +81-3-3599-8067 E-mail : Database classification Structure Databases - ...e databases - Protein properties Organism Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database description... Links: Original website information Database maintenance site The Molecular Profiling Research Center for D...stration Not available About This Database Database Description Download License Update History of This Database Site Policy | Contact Us Database Description - SAHG | LSDB Archive ...

  20. Electron backscatter diffraction as a useful method for alloys microstructure characterization

    Energy Technology Data Exchange (ETDEWEB)

    Klimek, Leszek; Pietrzyk, Bozena

    2004-11-17

Microstructure examination of cast Co-Cr-Mo alloy is presented in this paper. The surface morphology and chemical composition of the alloy were investigated by means of scanning electron microscopy (SEM) and energy dispersive X-ray microanalysis (EDX). Identification of the alloy phases was carried out using electron backscatter diffraction (EBSD). Two different kinds of precipitates in the metallic matrix were found. They were identified as MC- and M₂₃C₆-type carbides in a Co-lattice solid solution. The advantages and limits of the EBSD method are described. EBSD is shown to be an excellent tool for phase identification and a valuable supplementary method for materials research.

  1. Overview of intelligent data retrieval methods for waveforms and images in massive fusion databases

    Energy Technology Data Exchange (ETDEWEB)

    Vega, J. [JET-EFDA, Culham Science Center, OX14 3DB Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense 22, 28040 Madrid (Spain)], E-mail: jesus.vega@ciemat.es; Murari, A. [JET-EFDA, Culham Science Center, OX14 3DB Abingdon (United Kingdom); Consorzio RFX-Associazione EURATOM ENEA per la Fusione, I-35127 Padua (Italy); Pereira, A.; Portas, A.; Ratta, G.A.; Castro, R. [JET-EFDA, Culham Science Center, OX14 3DB Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense 22, 28040 Madrid (Spain)

    2009-06-15

The JET database contains more than 42 Tbytes of data (waveforms and images) and doubles in size about every 2 years. The ITER database is expected to be orders of magnitude larger. Therefore, data access in such huge databases can no longer be efficiently based on shot number or temporal interval. Taking into account that diagnostics generate reproducible signal patterns (structural shapes) for similar physical behaviour, high-level data access systems can be developed. In these systems, the input parameter is a pattern and the outputs are the shot numbers and the temporal locations where similar patterns appear inside the database. These pattern-oriented techniques can be used for first data screening of any type of morphological aspect of waveforms and images. The article presents a new technique to look for similar images in huge databases in a fast and efficient way. Previous techniques to search for similar waveforms and to retrieve time-series data or images containing any kind of pattern are also reviewed.
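A minimal version of such a pattern-oriented search (find where a given structural shape occurs inside a stored waveform) can be sketched with a sliding-window Euclidean distance; this brute-force scan stands in for the indexed techniques the article reviews:

```python
import numpy as np

def find_pattern(waveform, pattern):
    """Return (offset, distance) of the best Euclidean match of pattern
    inside waveform -- a brute-force stand-in for indexed pattern search."""
    m = len(pattern)
    best_offset, best_dist = 0, float("inf")
    for i in range(len(waveform) - m + 1):
        d = np.linalg.norm(waveform[i:i + m] - pattern)
        if d < best_dist:
            best_offset, best_dist = i, d
    return best_offset, best_dist

# Synthetic "shot": noise with a Gaussian bump hidden at sample 300.
rng = np.random.default_rng(1)
signal = 0.05 * rng.normal(size=1000)
pulse = np.exp(-0.5 * ((np.arange(50) - 25) / 5.0) ** 2)
signal[300:350] += pulse

offset, dist = find_pattern(signal, pulse)
print(offset)   # the bump is found at (or immediately next to) sample 300
```

Production systems avoid this linear scan by indexing signature vectors of waveform segments, so the query cost no longer grows with the size of the archive.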

  2. Cluster randomized trial in the general practice research database: 2. Secondary prevention after first stroke (eCRT study: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Dregan Alex

    2012-10-01

Full Text Available Abstract Background The purpose of this research is to develop and evaluate methods for conducting pragmatic cluster randomized trials in a primary care electronic database. The proposal describes one application, in a less frequent chronic condition of public health importance, secondary prevention of stroke. A related protocol in antibiotic prescribing was reported previously. Methods/Design The study aims to implement a cluster randomized trial (CRT) using the electronic patient records of the General Practice Research Database (GPRD) as a sampling frame and data source. The specific objective of the trial is to evaluate the effectiveness of a computer-delivered intervention at enhancing the delivery of stroke secondary prevention in primary care. GPRD family practices will be allocated to the intervention or to usual care. The intervention promotes the use of electronic prompts to support adherence with the recommendations of the UK Intercollegiate Stroke Working Party and NICE guidelines for the secondary prevention of stroke in primary care. The primary outcome measure will be the difference in systolic blood pressure between intervention and control trial arms at 12-month follow-up. Secondary outcomes will be differences in serum cholesterol, prescribing of antihypertensive drugs, statins, and antiplatelet therapy. The intervention will continue for 12 months. Information on the utilization of the decision-support tools will also be analyzed. Discussion The CRT will investigate the effectiveness of using a computer-delivered intervention to reduce the risk of stroke recurrence following a first stroke event. The study will provide methodological guidance on the implementation of CRTs in electronic databases in primary care. Trial registration Current Controlled Trials ISRCTN35701810

  3. Database Description - PSCDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available abase Description General information of database Database name PSCDB Alternative n...rial Science and Technology (AIST) Takayuki Amemiya E-mail: Database classification Structure Databases - Protein structure Database...554-D558. External Links: Original website information Database maintenance site Graduate School of Informat...available URL of Web services - Need for user registration Not available About This Database Database Descri...ption Download License Update History of This Database Site Policy | Contact Us Database Description - PSCDB | LSDB Archive ...

  4. Database Description - ASTRA | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available abase Description General information of database Database name ASTRA Alternative n...tics Journal Search: Contact address Database classification Nucleotide Sequence Databases - Gene structure,...3702 Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database description The database represents classified p...(10):1211-6. External Links: Original website information Database maintenance site National Institute of Ad... for user registration Not available About This Database Database Description Dow

  5. Concurrency control in distributed database systems

    CERN Document Server

    Cellary, W; Gelenbe, E

    1989-01-01

Distributed Database Systems (DDBS) may be defined as integrated database systems composed of autonomous local databases, geographically distributed and interconnected by a computer network. The purpose of this monograph is to present DDBS concurrency control algorithms and their related performance issues. The most recent results have been taken into consideration. A detailed analysis and selection of these results has been made so as to include those which will promote applications and progress in the field. The application of the methods and algorithms presented is not limited to DDBSs but a
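As a concrete illustration of the kind of algorithm such a monograph surveys, here is a minimal shared/exclusive lock table of the sort used by two-phase locking (2PL) schedulers. It is a sketch only, with invented names, no waits-for graph, and no deadlock handling:

```python
class LockManager:
    """Minimal shared (S) / exclusive (X) lock table for one node.
    Illustrative only: a real scheduler also queues waiters and
    detects or prevents deadlocks."""

    def __init__(self):
        self.locks = {}   # item -> (mode, set of holding transactions)

    def acquire(self, txn, item, mode):
        """Try to take a lock; return False on conflict (caller must wait)."""
        held = self.locks.get(item)
        if held is None:
            self.locks[item] = (mode, {txn})
            return True
        held_mode, holders = held
        if mode == "S" and held_mode == "S":   # shared locks are compatible
            holders.add(txn)
            return True
        if holders == {txn}:                   # re-entry or S -> X upgrade
            self.locks[item] = ("X" if mode == "X" else held_mode, holders)
            return True
        return False

    def release_all(self, txn):
        """Release every lock held by txn (the 'shrinking' phase of 2PL)."""
        for item in list(self.locks):
            mode, holders = self.locks[item]
            holders.discard(txn)
            if not holders:
                del self.locks[item]

lm = LockManager()
print(lm.acquire("T1", "x", "S"))   # True
print(lm.acquire("T2", "x", "S"))   # True  (S locks coexist)
print(lm.acquire("T2", "x", "X"))   # False (T1 still holds an S lock)
lm.release_all("T1")
print(lm.acquire("T2", "x", "X"))   # True  (upgrade once T1 releases)
```

In a distributed setting each site runs such a manager for its local data, and the global scheduler must additionally coordinate lock acquisition and release ordering across sites.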

  6. Developing of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, J.; Ryu, W. S.

    2002-01-01

Constructing a database from tensile test data increases the usefulness of the test results. Basic data can easily be retrieved from the database when a new experiment is being prepared, and comparison with previous data helps produce high-quality results. Building the database requires detailed analysis and design, after which the system can meet users' varied requirements well. In this thesis, the tensile database system was developed as an internet application using the JSP (Java Server Pages) tool.

  7. Global Mammal Parasite Database version 2.0.

    Science.gov (United States)

    Stephens, Patrick R; Pappalardo, Paula; Huang, Shan; Byers, James E; Farrell, Maxwell J; Gehman, Alyssa; Ghai, Ria R; Haas, Sarah E; Han, Barbara; Park, Andrew W; Schmidt, John P; Altizer, Sonia; Ezenwa, Vanessa O; Nunn, Charles L

    2017-05-01

    Illuminating the ecological and evolutionary dynamics of parasites is one of the most pressing issues facing modern science, and is critical for basic science, the global economy, and human health. Extremely important to this effort are data on the disease-causing organisms of wild animal hosts (including viruses, bacteria, protozoa, helminths, arthropods, and fungi). Here we present an updated version of the Global Mammal Parasite Database, a database of the parasites of wild ungulates (artiodactyls and perissodactyls), carnivores, and primates, and make it available for download as complete flat files. The updated database has more than 24,000 entries in the main data file alone, representing data from over 2700 literature sources. We include data on sampling method and sample sizes when reported, as well as both "reported" and "corrected" (i.e., standardized) binomials for each host and parasite species. Also included are current higher taxonomies and data on transmission modes used by the majority of species of parasites in the database. In the associated metadata we describe the methods used to identify sources and extract data from the primary literature, how entries were checked for errors, methods used to georeference entries, and how host and parasite taxonomies were standardized across the database. We also provide definitions of the data fields in each of the four files that users can download. © 2017 by the Ecological Society of America.

  8. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, S.Y. [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Shen, G.H., E-mail: shgh@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Sun, Y., E-mail: sunying@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhou, D.Z., E-mail: dazhuang.zhou@gmail.com [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, X.X., E-mail: xxzhang@cma.gov.cn [National Center for Space Weather, Beijing (China); Li, J.W., E-mail: lijw@cma.gov.cn [National Center for Space Weather, Beijing (China); Huang, C., E-mail: huangc@cma.gov.cn [National Center for Space Weather, Beijing (China); Zhang, X.G., E-mail: zhangxg@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Dong, Y.J., E-mail: dyj@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, W.J., E-mail: zhangreatest@163.com [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, B.Q., E-mail: zhangbinquan@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Shi, C.Y., E-mail: scy@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China)

    2016-05-01

Space-based electron detectors are commonly tested using radioactive β-sources which emit a continuous spectrum without spectral lines. Therefore, the tests are often considered only qualitative. This paper introduces a method which yields more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference ⁹⁰Sr/⁹⁰Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.
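The inversion step can be illustrated with a toy discretized model: if the measured channel counts are y = R·x for a simulated response matrix R, recovering the incident spectrum x is a linear inverse problem. R below is synthetic (full-energy peak plus spill into lower channels), not the ODPD's actual Geant4-derived IRF:

```python
import numpy as np

# Toy spectrum inversion: measured counts y = R @ x, where R is the
# (synthetic) instrument response matrix.  Solving for x recovers the
# incident spectrum, which can then be compared with an independently
# measured reference spectrum.
n = 8
energies = np.arange(n)
# Diagonal = full-energy deposition; off-diagonals = partial deposition
# spilling counts from higher incident energies into lower channels.
R = np.eye(n) + 0.3 * np.eye(n, k=1) + 0.1 * np.eye(n, k=2)

true_spectrum = np.exp(-0.4 * energies)       # beta-like falling spectrum
measured = R @ true_spectrum

recovered, *_ = np.linalg.lstsq(R, measured, rcond=None)
print(np.allclose(recovered, true_spectrum))  # exact recovery, noise-free case
```

With real counting statistics the inversion would need regularization or a non-negativity constraint, but the comparison logic (inverted spectrum vs. reference spectrum) is the same.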

  9. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    International Nuclear Information System (INIS)

    Zhang, S.Y.; Shen, G.H.; Sun, Y.; Zhou, D.Z.; Zhang, X.X.; Li, J.W.; Huang, C.; Zhang, X.G.; Dong, Y.J.; Zhang, W.J.; Zhang, B.Q.; Shi, C.Y.

    2016-01-01

Space-based electron detectors are commonly tested using radioactive β-sources which emit a continuous spectrum without spectral lines. Therefore, the tests are often considered only qualitative. This paper introduces a method which yields more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference ⁹⁰Sr/⁹⁰Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.

  10. Electronic cigarettes: incorporating human factors engineering into risk assessments

    OpenAIRE

    Yang, Ling; Rudy, Susan F; Cheng, James M; Durmowicz, Elizabeth L

    2014-01-01

    Objective A systematic review was conducted to evaluate the impact of human factors (HF) on the risks associated with electronic cigarettes (e-cigarettes) and to identify research gaps. HF is the evaluation of human interactions with products and includes the analysis of user, environment and product complexity. Consideration of HF may mitigate known and potential hazards from the use and misuse of a consumer product, including e-cigarettes. Methods Five databases were searched through Januar...

  11. Database Description - RPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ase Description General information of database Database name RPD Alternative name Rice Proteome Database...titute of Crop Science, National Agriculture and Food Research Organization Setsuko Komatsu E-mail: Database... classification Proteomics Resources Plant databases - Rice Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database... description Rice Proteome Database contains information on protei...and entered in the Rice Proteome Database. The database is searchable by keyword,

  12. Database Description - PLACE | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available abase Description General information of database Database name PLACE Alternative name A Database...Kannondai, Tsukuba, Ibaraki 305-8602, Japan National Institute of Agrobiological Sciences E-mail : Databas...e classification Plant databases Organism Taxonomy Name: Tracheophyta Taxonomy ID: 58023 Database...99, Vol.27, No.1 :297-300 External Links: Original website information Database maintenance site National In...- Need for user registration Not available About This Database Database Descripti

  13. Krylov subspace method for evaluating the self-energy matrices in electron transport calculations

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Hansen, Per Christian; Petersen, D. E.

    2008-01-01

We present a Krylov subspace method for evaluating the self-energy matrices used in the Green's function formulation of electron transport in nanoscale devices. A procedure based on the Arnoldi method is employed to obtain solutions of the quadratic eigenvalue problem associated with the infinite ... calculations. Numerical tests within a density functional theory framework are provided to validate the accuracy and robustness of the proposed method, which in most cases is an order of magnitude faster than conventional methods.
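The quadratic eigenvalue problem mentioned above can be reduced to an ordinary eigenproblem by companion linearization, after which Krylov solvers such as Arnoldi apply. This sketch uses toy random matrices and a dense solver in place of Arnoldi, to show only the linearization and the residual check:

```python
import numpy as np

# Companion linearization of the quadratic eigenvalue problem
# (l^2 I + l B + C) v = 0:  with z = (v, l v) it becomes L z = l z, where
# L = [[0, I], [-C, -B]].  Toy matrices; np.linalg.eig stands in for an
# Arnoldi-type Krylov solver, which would target only the needed eigenpairs.
rng = np.random.default_rng(0)
n = 4
B = rng.normal(size=(n, n))
C = rng.normal(size=(n, n))

L = np.block([[np.zeros((n, n)), np.eye(n)],
              [-C, -B]])
eigvals, eigvecs = np.linalg.eig(L)

# Check the first eigenpair against the original quadratic problem:
# the top half of the linearized eigenvector is v itself.
l, v = eigvals[0], eigvecs[:n, 0]
residual = (l**2 * np.eye(n) + l * B + C) @ v
print(np.max(np.abs(residual)))   # near machine precision
```

The linearization doubles the problem size, which is one reason an iterative Krylov approach that extracts only the physically relevant solutions can be much faster than a full dense solve.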

  14. Quantum chemistry the development of ab initio methods in molecular electronic structure theory

    CERN Document Server

    Schaefer III, Henry F

    2004-01-01

    This guide is guaranteed to prove of keen interest to the broad spectrum of experimental chemists who use electronic structure theory to assist in the interpretation of their laboratory findings. A list of 150 landmark papers in ab initio molecular electronic structure methods, it features the first page of each paper (which usually encompasses the abstract and introduction). Its primary focus is methodology, rather than the examination of particular chemical problems, and the selected papers either present new and important methods or illustrate the effectiveness of existing methods in predi

  15. SU-E-T-544: A Radiation Oncology-Specific Multi-Institutional Federated Database: Initial Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, K; Phillips, M; Fishburn, M; Evans, K; Banerian, S; Mayr, N [University of Washington, Seattle, WA (United States); Wong, J; McNutt, T; Moore, J; Robertson, S [Johns Hopkins University, Baltimore, MD (United States)

    2014-06-01

Purpose: To implement a common database structure and user-friendly web-browser based data collection tools across several medical institutions to better support evidence-based clinical decision making and comparative effectiveness research through shared outcomes data. Methods: A consortium of four academic medical centers agreed to implement a federated database, known as Oncospace. Initial implementation has addressed issues of differences between institutions in workflow and in the types and breadth of structured information captured. This requires coordination of data collection from departmental oncology information systems (OIS), treatment planning systems, and hospital electronic medical records in order to include as much as possible of the multi-disciplinary clinical data associated with a patient's care. Results: The original database schema was well-designed and required only minor changes to meet institution-specific data requirements. Mobile browser interfaces for data entry and review for both the OIS and the Oncospace database were tailored to the workflow of individual institutions. Federation of database queries--the ultimate goal of the project--was tested using artificial patient data. The tests serve as proof-of-principle that the system as a whole--from data collection and entry to providing responses to research queries of the federated database--was viable. The resolution of inter-institutional use of patient data for research is still not completed. Conclusions: The migration from unstructured data, mainly in the form of notes and documents, to searchable, structured data is difficult. Making the transition requires cooperation of many groups within the department and can be greatly facilitated by using the structured data to improve clinical processes and workflow. The original database schema design is critical to providing enough flexibility for multi-institutional use to improve each institution's ability to study outcomes, determine best practices
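The federated-query idea (each site answers with aggregates, and patient-level rows never leave the institution) can be sketched as follows; the schema and field names are invented for illustration and are not Oncospace's:

```python
# Hypothetical per-site record stores; in reality each would be an
# institution's private database behind its own firewall.
site_a = [{"dose": 60, "toxicity": 1}, {"dose": 45, "toxicity": 0}]
site_b = [{"dose": 70, "toxicity": 1}, {"dose": 50, "toxicity": 1}]

def local_aggregate(records, min_dose):
    """Runs inside one institution: returns counts only, never raw rows."""
    matches = [r for r in records if r["dose"] >= min_dose]
    return {"n": len(matches), "toxic": sum(r["toxicity"] for r in matches)}

def federated_query(sites, min_dose):
    """Runs at the coordinator: merges the per-site aggregate answers."""
    parts = [local_aggregate(s, min_dose) for s in sites]
    return {"n": sum(p["n"] for p in parts),
            "toxic": sum(p["toxic"] for p in parts)}

print(federated_query([site_a, site_b], min_dose=50))  # {'n': 3, 'toxic': 3}
```

Keeping the aggregation local is what makes the governance question tractable: each institution can audit exactly what leaves its walls.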

  16. SU-E-T-544: A Radiation Oncology-Specific Multi-Institutional Federated Database: Initial Implementation

    International Nuclear Information System (INIS)

    Hendrickson, K; Phillips, M; Fishburn, M; Evans, K; Banerian, S; Mayr, N; Wong, J; McNutt, T; Moore, J; Robertson, S

    2014-01-01

Purpose: To implement a common database structure and user-friendly web-browser based data collection tools across several medical institutions to better support evidence-based clinical decision making and comparative effectiveness research through shared outcomes data. Methods: A consortium of four academic medical centers agreed to implement a federated database, known as Oncospace. Initial implementation has addressed issues of differences between institutions in workflow and in the types and breadth of structured information captured. This requires coordination of data collection from departmental oncology information systems (OIS), treatment planning systems, and hospital electronic medical records in order to include as much as possible of the multi-disciplinary clinical data associated with a patient's care. Results: The original database schema was well-designed and required only minor changes to meet institution-specific data requirements. Mobile browser interfaces for data entry and review for both the OIS and the Oncospace database were tailored to the workflow of individual institutions. Federation of database queries--the ultimate goal of the project--was tested using artificial patient data. The tests serve as proof-of-principle that the system as a whole--from data collection and entry to providing responses to research queries of the federated database--was viable. The resolution of inter-institutional use of patient data for research is still not completed. Conclusions: The migration from unstructured data, mainly in the form of notes and documents, to searchable, structured data is difficult. Making the transition requires cooperation of many groups within the department and can be greatly facilitated by using the structured data to improve clinical processes and workflow. The original database schema design is critical to providing enough flexibility for multi-institutional use to improve each institution's ability to study outcomes, determine best practices

  17. Towards seamlessly-integrated textile electronics: methods to coat fabrics and fibers with conducting polymers for electronic applications.

    Science.gov (United States)

    Allison, Linden; Hoxie, Steven; Andrew, Trisha L

    2017-06-29

    Traditional textile materials can be transformed into functional electronic components upon being dyed or coated with films of intrinsically conducting polymers, such as poly(aniline), poly(pyrrole) and poly(3,4-ethylenedioxythiophene). A variety of textile electronic devices are built from the conductive fibers and fabrics thus obtained, including: physiochemical sensors, thermoelectric fibers/fabrics, heated garments, artificial muscles and textile supercapacitors. In all these cases, electrical performance and device ruggedness is determined by the morphology of the conducting polymer active layer on the fiber or fabric substrate. Tremendous variation in active layer morphology can be observed with different coating or dyeing conditions. Here, we summarize various methods used to create fiber- and fabric-based devices and highlight the influence of the coating method on active layer morphology and device stability.

  18. Translation from the collaborative OSM database to cartography

    Science.gov (United States)

    Hayat, Flora

    2018-05-01

The OpenStreetMap (OSM) database includes original items very useful for geographical analysis and for creating thematic maps. Contributors record in the open database various themes regarding amenities, leisure, transports, buildings and boundaries. The Michelin mapping department develops map prototypes to test the feasibility of mapping based on OSM. To translate the OSM database structure into a database structure fitted to Michelin graphic guidelines, a research project is in development. It aims at defining the right structure for Michelin's uses. The research project relies on the analysis of semantic and geometric heterogeneities in OSM data. To that end, Michelin implements methods to transform the input geographical database into a cartographic image dedicated to specific uses (routing and tourist maps). The paper focuses on the mapping tools available to produce a personalised spatial database. Based on the processed data, paper and Web maps can be displayed. Two prototypes are described in this article: a vector tile web map and a mapping method to produce paper maps on a regional scale. The vector tile mapping method offers easy navigation within the map and within graphic and thematic guidelines. Paper maps can be partly drawn automatically. Drawing automation and data management are part of the map creation, as well as the final hand-drawing phase. Both prototypes have been set up using the OSM technical ecosystem.
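The tag-translation step (mapping OSM key/value pairs onto a target cartographic schema) can be sketched as a lookup table; the style classes below are invented for illustration and are not Michelin's proprietary guidelines:

```python
# Hypothetical translation table from OSM tags to a target cartographic
# schema.  The highway/amenity keys are real OSM tagging conventions; the
# layer/class/width values are invented style attributes.
OSM_TO_STYLE = {
    ("highway", "motorway"):    {"layer": "roads", "class": "major", "width": 3.0},
    ("highway", "residential"): {"layer": "roads", "class": "minor", "width": 1.0},
    ("amenity", "hospital"):    {"layer": "poi",   "class": "health", "icon": "cross"},
}

def translate(osm_tags):
    """Map a feature's OSM tags onto the first matching style rule."""
    for key, value in osm_tags.items():
        rule = OSM_TO_STYLE.get((key, value))
        if rule:
            return rule
    return None   # feature not rendered in this map theme

feature = {"highway": "motorway", "ref": "A10"}
print(translate(feature))  # {'layer': 'roads', 'class': 'major', 'width': 3.0}
```

The hard part in practice is exactly the heterogeneity the abstract mentions: contributors use overlapping or free-form tag values, so a production translator needs fallback rules and value normalization rather than a single exact-match table.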

  19. Comparison of optimization methods for electronic-structure calculations

    International Nuclear Information System (INIS)

    Garner, J.; Das, S.G.; Min, B.I.; Woodward, C.; Benedek, R.

    1989-01-01

    The performance of several local-optimization methods for calculating electronic structure is compared. The fictitious first-order equation of motion proposed by Williams and Soler is integrated numerically by three procedures: simple finite-difference integration, approximate analytical integration (the Williams-Soler algorithm), and the Born perturbation series. These techniques are applied to a model problem for which exact solutions are known, the Mathieu equation. The Williams-Soler algorithm and the second Born approximation converge equally rapidly, but the former involves considerably less computational effort and gives a more accurate converged solution. Application of the method of conjugate gradients to the Mathieu equation is discussed
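A sketch of the first procedure, simple finite-difference integration of the fictitious first-order equation of motion dψ/dt = -(H - ⟨H⟩)ψ, applied to the Mathieu operator H = -d²/dx² + 2q·cos(2x); the grid, step size, and iteration count are illustrative choices, not those of the paper:

```python
import numpy as np

# Relax a trial state onto the lowest eigenstate of the Mathieu operator
# H = -d^2/dx^2 + 2 q cos(2x) on a periodic grid, using explicit Euler
# steps of the fictitious equation of motion dpsi/dt = -(H - <H>) psi.
n, q, dt = 64, 1.0, 1e-3
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
h = x[1] - x[0]

def apply_H(psi):
    """Second-order finite-difference action of the Mathieu operator."""
    lap = (np.roll(psi, -1) - 2 * psi + np.roll(psi, 1)) / h**2
    return -lap + 2 * q * np.cos(2 * x) * psi

psi = np.ones(n) / np.sqrt(n)            # normalized trial state
for _ in range(20000):
    Hpsi = apply_H(psi)
    e = psi @ Hpsi                        # Rayleigh quotient (psi normalized)
    psi = psi - dt * (Hpsi - e * psi)     # one finite-difference time step
    psi /= np.linalg.norm(psi)

print(psi @ apply_H(psi))   # converges toward the known a0(q=1) ≈ -0.4551
```

The explicit step is stable only for dt below roughly 2/λmax of the discretized operator, which is why the analytical Williams-Soler integration and the Born series discussed in the abstract are attractive alternatives.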

  20. Database Description - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name JSNP Alternative nam...n Science and Technology Agency Creator Affiliation: Contact address E-mail : Database...sapiens Taxonomy ID: 9606 Database description A database of about 197,000 polymorphisms in Japanese populat...1):605-610 External Links: Original website information Database maintenance site Institute of Medical Scien...er registration Not available About This Database Database Description Download License Update History of This Database

  1. Database Description - RED | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ase Description General information of database Database name RED Alternative name Rice Expression Database...enome Research Unit Shoshi Kikuchi E-mail : Database classification Plant databases - Rice Database classifi...cation Microarray, Gene Expression Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database descripti... Article title: Rice Expression Database: the gateway to rice functional genomics...nt Science (2002) Dec 7 (12):563-564 External Links: Original website information Database maintenance site

  2. News from the Library: Looking for materials properties? Find the answer in CINDAS databases

    CERN Multimedia

    CERN Library

    2012-01-01

    Materials properties databases are a crucial source of information when doing research in Materials Science. The creation and regular updating of such databases requires identification and collection of relevant worldwide scientific and technical literature, followed by the compilation, critical evaluation, correlation and synthesis of both existing and new experimental data.   The Center for Information and Numerical Data Analysis and Synthesis (CINDAS) at Purdue University produces several databases on the properties and behaviour of materials. The databases include: - ASMD (Aerospace Structural Metals Database) which gives access to approximately 80,000 data curves on over 220 alloys used in the aerospace and other industries - the Microelectronics Packaging Materials Database (MPMD), providing data and information on the thermal, mechanical, electrical and physical properties of electronics packaging materials, and - the Thermophysical Properties of Matter Database (TPMD), covering the...

  3. Outputs and Growth of Primary Care Databases in the United Kingdom: Bibliometric Analysis

    Directory of Open Access Journals (Sweden)

    Zain Chaudhry

    2017-10-01

    Full Text Available Background: Electronic health database (EHD) data are increasingly used by researchers. The major United Kingdom EHDs are the ‘Clinical Practice Research Datalink’ (CPRD), ‘The Health Improvement Network’ (THIN) and ‘QResearch’. Over time, outputs from these databases have increased, but they have not been evaluated. Objective: This study compares research outputs from CPRD, THIN and QResearch, assessing growth and publication outputs over a 10-year period (2004-2013). CPRD was also reviewed separately over 20 years as a case study. Methods: Publications from CPRD and QResearch were extracted using the Science Citation Index (SCI) of the Thomson Scientific Institute for Scientific Information (Web of Science). THIN data were obtained from University College London and validated in Web of Science. All databases were analysed for growth in publications, the speciality areas and the journals in which their data have been published. Results: These databases collectively produced 1,296 publications over the 10-year period, with CPRD representing 63.6% (n=825) of papers, THIN 30.4% (n=394) and QResearch 5.9% (n=77). Pharmacoepidemiology and General Medicine were the most common specialities featured. Over the period 2004-2013, publications for THIN and QResearch increased slowly over time, whereas CPRD publications have increased substantially in the last 4 years, with almost 75% of CPRD publications published in the past 9 years. Conclusion: These databases are enhancing scientific research and are growing yearly, but they display variability in their growth. They could become more powerful research tools if the National Health Service and general practitioners can provide accurate and comprehensive data for inclusion in these databases.

  4. Seventy Years of RN Effectiveness: A Database Development Project to Inform Best Practice.

    Science.gov (United States)

    Lulat, Zainab; Blain-McLeod, Julie; Grinspun, Doris; Penney, Tasha; Harripaul-Yhap, Anastasia; Rey, Michelle

    2018-03-23

    The appropriate nursing staff mix is imperative to the provision of quality care. Nurse staffing levels and staff mix vary from country to country, as well as between care settings. Understanding how staffing skill mix impacts patient, organizational, and financial outcomes is critical in order to allow policymakers and clinicians to make evidence-informed staffing decisions. This paper reports on the methodology for the creation of an electronic database of studies exploring the effectiveness of Registered Nurses (RNs) on clinical and patient outcomes, organizational and nurse outcomes, and financial outcomes. Comprehensive literature searches were conducted in four electronic databases. Inclusion criteria for the database included studies published from 1946 to 2016, peer-reviewed international literature, and studies focused on RNs in all health-care disciplines, settings, and sectors. Master's-prepared nurse researchers conducted title and abstract screening and relevance review to determine eligibility of studies for the database. High-level analysis was conducted to determine key outcomes and the frequency at which they appeared within the database. Of the initial 90,352 records, a total of 626 abstracts were included within the database. Studies were organized into three groups corresponding to clinical and patient outcomes, organizational and nurse-related outcomes, and financial outcomes. Organizational and nurse-related outcomes represented the largest category in the database with 282 studies, followed by clinical and patient outcomes with 244 studies, and lastly financial outcomes, which included 124 studies. The comprehensive database of evidence for RN effectiveness is freely available at https://rnao.ca/bpg/initiatives/RNEffectiveness. The database will serve as a resource for the Registered Nurses' Association of Ontario, as well as a tool for researchers, clinicians, and policymakers for making evidence-informed staffing decisions. © 2018 The Authors

  5. A modified method of calculating the lateral build-up ratio for small electron fields

    International Nuclear Information System (INIS)

    Tyner, E; McCavana, P; McClean, B

    2006-01-01

    This note outlines an improved method of calculating dose per monitor unit values for small electron fields using Khan's lateral build-up ratio (LBR). This modified method obtains the LBR directly from the ratio of measured, surface-normalized, electron beam percentage depth dose curves. The LBR calculated using this modified method more accurately accounts for the change in lateral scatter with decreasing field size. The LBR is used along with Khan's dose per monitor unit formula to calculate dose per monitor unit values for a set of small fields. These calculated dose per monitor unit values agree with measured values to within 3.5% for all circular fields and electron energies examined. The modified method was further tested using a small triangular field, where a maximum difference of 4.8% was found. (note)
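    The central step, taking the LBR directly as a ratio of surface-normalized percentage depth dose (PDD) curves, can be sketched as follows. The depth-dose values are synthetic illustration data, not measurements, and the final dose-per-MU line omits the other factors (output factors, inverse square) of Khan's full formalism:

```python
# Sketch of the note's central step: the lateral build-up ratio (LBR) taken
# directly as the ratio of surface-normalised percentage depth dose (PDD)
# curves.  The depth-dose values below are synthetic illustration data, and
# the final dose-per-MU step omits the other factors (applicator/insert
# output factors, inverse square) of Khan's full formalism.

depths_cm = [0.0, 0.5, 1.0, 1.5, 2.0]
pdd_broad = [85.0, 95.0, 100.0, 90.0, 60.0]   # reference (broad) field
pdd_small = [85.0, 92.0, 90.0, 70.0, 40.0]    # small field: reduced build-up

def surface_normalise(pdd):
    """Rescale a PDD curve so its surface (first) value is 100."""
    return [100.0 * v / pdd[0] for v in pdd]

def lbr(small, broad):
    """LBR(d) = PDD_small(d) / PDD_broad(d), both surface-normalised."""
    s, b = surface_normalise(small), surface_normalise(broad)
    return [si / bi for si, bi in zip(s, b)]

ratios = lbr(pdd_small, pdd_broad)
# Illustrative only: scale a broad-field dose/MU (say 1 cGy/MU) by the LBR.
dose_per_mu_small = [1.0 * r for r in ratios]
for d, r in zip(depths_cm, ratios):
    print(f"d = {d:.1f} cm  LBR = {r:.3f}")
```

    Because both curves are normalized at the surface, the LBR is 1 at depth zero and falls with depth as the small field loses lateral scatter equilibrium.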

  6. Protein structure database search and evolutionary classification.

    Science.gov (United States)

    Yang, Jinn-Moon; Tung, Chi-Hua

    2006-01-01

    As more protein structures become available and structural genomics efforts provide structural models in a genome-wide strategy, there is a growing need for fast and accurate methods for discovering homologous proteins and evolutionary classifications of newly determined structures. We have developed 3D-BLAST, in part, to address these issues. 3D-BLAST is as fast as BLAST and calculates the statistical significance (E-value) of an alignment to indicate the reliability of the prediction. Using this method, we first identified 23 states of the structural alphabet that represent pattern profiles of the backbone fragments and then used them to represent protein structure databases as structural alphabet sequence databases (SADB). Our method enhanced BLAST as a search method, using a new structural alphabet substitution matrix (SASM) to find the longest common substructures with high-scoring structured segment pairs from an SADB database. Using personal computers with Intel Pentium 4 (2.8 GHz) processors, our method searched more than 10 000 protein structures in 1.3 s and achieved a good agreement with search results from detailed structure alignment methods. [3D-BLAST is available at http://3d-blast.life.nctu.edu.tw].
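    The structural-alphabet idea can be sketched in a few lines: each backbone fragment is reduced to a feature vector and assigned the letter of the nearest prototype, turning a 3D structure into a 1D sequence that sequence-search tools can handle. The prototypes below are invented for illustration; 3D-BLAST derives its 23 real states from clustered fragment profiles:

```python
# Toy sketch of the structural-alphabet encoding behind 3D-BLAST.  The
# prototype (kappa, alpha) backbone-angle pairs below are invented for
# illustration; 3D-BLAST uses 23 states learned from real fragment profiles.
import math

PROTOTYPES = {
    "A": (110.0, 50.0),    # helix-like (toy values)
    "B": (125.0, -170.0),  # strand-like (toy values)
    "C": (90.0, 0.0),      # loop-like (toy values)
}

def encode(fragments):
    """Map each fragment's (kappa, alpha) pair to its nearest prototype letter.

    Plain Euclidean distance is used here; a real implementation would
    handle the wrap-around of the angular coordinates.
    """
    return "".join(
        min(PROTOTYPES, key=lambda L: math.dist(ka, PROTOTYPES[L]))
        for ka in fragments
    )

structure = [(112.0, 48.0), (108.0, 55.0), (92.0, 5.0), (123.0, -165.0)]
sa_sequence = encode(structure)
print(sa_sequence)  # this string can now be searched with BLAST + a custom matrix
```

    Once every database structure is encoded this way, a standard BLAST run with a structure-aware substitution matrix (the paper's SASM) performs the actual search.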

  7. Database Description - ConfC | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available abase Description General information of database Database name ConfC Alternative name Database...amotsu Noguchi Tel: 042-495-8736 E-mail: Database classification Structure Database...s - Protein structure Structure Databases - Small molecules Structure Databases - Nucleic acid structure Database... services - Need for user registration - About This Database Database Description Download License Update History of This Database... Site Policy | Contact Us Database Description - ConfC | LSDB Archive ...

  8. Methods for coupling radiation, ion, and electron energies in grey Implicit Monte Carlo

    International Nuclear Information System (INIS)

    Evans, T.M.; Densmore, J.D.

    2007-01-01

    We present three methods for extending the Implicit Monte Carlo (IMC) method to treat the time-evolution of coupled radiation, electron, and ion energies. The first method splits the ion and electron coupling and conduction from the standard IMC radiation-transport process. The second method recasts the IMC equations such that part of the coupling is treated during the Monte Carlo calculation. The third method treats all of the coupling and conduction in the Monte Carlo simulation. We apply modified equation analysis (MEA) to simplified forms of each method that neglect the errors in the conduction terms. Through MEA we show that the third method is theoretically the most accurate. We demonstrate the effectiveness of each method on a series of 0-dimensional, nonlinear benchmark problems, where the accuracy of the third method is shown to be up to ten times greater than that of the other coupling methods for selected calculations.
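    The structure of the three-field coupling the methods must handle can be illustrated with a 0-D deterministic toy model. This is not an IMC scheme, the coupling constants are arbitrary, and explicit Euler stands in for the paper's operator treatments; the point is only the pairwise exchange terms and their conservation property:

```python
# 0-D sketch of three-field energy coupling: radiation (r), electron (e) and
# ion (i) energies exchanging through linear coupling terms.  This
# deterministic explicit-Euler model only illustrates the coupling structure
# and its conservation property; it is not an Implicit Monte Carlo scheme,
# and the coupling constants are arbitrary.

g_er, g_ei = 0.5, 2.0                  # radiation-electron and electron-ion rates
E = {"r": 10.0, "e": 1.0, "i": 0.2}    # initial energies (arbitrary units)
E_total0 = sum(E.values())

dt, steps = 1e-3, 5000
for _ in range(steps):
    x_re = g_er * (E["r"] - E["e"])    # radiation -> electron exchange
    x_ei = g_ei * (E["e"] - E["i"])    # electron -> ion exchange
    E["r"] -= dt * x_re
    E["e"] += dt * (x_re - x_ei)
    E["i"] += dt * x_ei

print(E, "total:", sum(E.values()))
```

    Because every exchange term appears once with each sign, total energy is conserved to rounding error; the three fields relax toward a common value, with the radiation field remaining hottest throughout.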

  9. Computational 2D Materials Database

    DEFF Research Database (Denmark)

    Rasmussen, Filip Anselm; Thygesen, Kristian Sommer

    2015-01-01

    We present a comprehensive first-principles study of the electronic structure of 51 semiconducting monolayer transition-metal dichalcogenides and -oxides in the 2H and 1T hexagonal phases. The quasiparticle (QP) band structures with spin-orbit coupling are calculated in the G(0)W(0) approximation...... and used as input to a 2D hydrogenic model to estimate exciton binding energies. Throughout the paper we focus on trends and correlations in the electronic structure rather than detailed analysis of specific materials. All the computed data is available in an open database......., and comparison is made with different density functional theory descriptions. Pitfalls related to the convergence of GW calculations for two-dimensional (2D) materials are discussed together with possible solutions. The monolayer band edge positions relative to vacuum are used to estimate the band alignment...

  10. Secure Distributed Databases Using Cryptography

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2006-01-01

    Full Text Available Computational encryption is used intensively by different database management systems to ensure the privacy and integrity of information that is physically stored in files. The information is also sent over the network and replicated on different distributed systems. It is proved that a satisfactory level of security is achieved if the rows and columns of tables are encrypted independently of the table or computer that holds the data. It is also very important that SQL (Structured Query Language) query requests and responses be encrypted over the network connection between the client and the database server. All these techniques and methods must be implemented by database administrators, designers and developers within a consistent security policy.
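    The idea of encrypting each cell independently of its table or host can be sketched with a toy per-cell keystream. The scheme below (HMAC-SHA256 in counter mode, XORed with the plaintext) is for illustration only; a real deployment should use a vetted AEAD cipher such as AES-GCM from a maintained library:

```python
# Toy sketch of independent cell-level encryption: each (row, column) cell is
# encrypted with its own keystream derived from a master key, so no two cells
# share cipher material.  Illustration only -- use a vetted AEAD cipher
# (e.g. AES-GCM) in practice.
import hashlib
import hmac

MASTER_KEY = b"demo master key -- not for production"

def _keystream(row_id, column, n):
    """Derive n pseudorandom bytes unique to one table cell."""
    out, counter = b"", 0
    while len(out) < n:
        msg = f"{row_id}|{column}|{counter}".encode()
        out += hmac.new(MASTER_KEY, msg, hashlib.sha256).digest()
        counter += 1
    return out[:n]

def encrypt_cell(row_id, column, plaintext: bytes) -> bytes:
    ks = _keystream(row_id, column, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt_cell = encrypt_cell   # XOR stream cipher: same operation both ways

ct = encrypt_cell(42, "salary", b"55000")
print(ct.hex())
print(decrypt_cell(42, "salary", ct))
```

    Because the keystream depends on the row identifier and column name, identical plaintexts in different cells produce different ciphertexts, which is the independence property the abstract describes.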

  11. Inventory of electronic money as method of its control: process approach

    Directory of Open Access Journals (Sweden)

    A.Р. Semenets

    2016-09-01

    Full Text Available The extent of legal regulation of the inventory of electronic money in a company is considered. The absence of developed techniques for the valuation of electronic money, as well as for its reflection in the accounts, is detected; this results in the distortion of indicators in the financial statements. The author develops organizational and methodical provisions for the inventory of electronic money, organized in stages so as to avoid misstatements in the financial statements and to provide users with more reliable information about the amount and balances of electronic money held by the company on the balance sheet date. The effects of accounting policies, provisions for the organization of accounting, and job descriptions on the control system for transactions with electronic money, including their inventory, are determined. The author identifies the typical violations that occur when transactions with electronic money are reflected in accounting; their early detection will enable appropriate adjustments to avoid misstatements of the information provided in the financial statements of the company.

  12. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs).The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focus on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  13. Monte Carlo methods in electron transport problems. Pt. 1

    International Nuclear Information System (INIS)

    Cleri, F.

    1989-01-01

    The condensed-history Monte Carlo method for charged-particle transport is reviewed and discussed, starting from a general form of the Boltzmann equation (Part I). The physics of the electronic interactions, together with some pedagogic examples, will be introduced in Part II. The lecture is directed at potential users of the method, for whom it can be a useful introduction to the subject matter, and it establishes the basis of the work on the computer code RECORD, which is at present in a developing stage.
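    The core idea of the condensed-history approach can be illustrated numerically: instead of simulating every small-angle elastic collision, many collisions are grouped into one macro-step and the net deflection is sampled from the resulting multiple-scattering distribution. The parameters below are illustrative, and the plain Gaussian model stands in for the proper multiple-scattering theories (Molière, Goudsmit-Saunderson) used by real codes:

```python
# Sketch of the condensed-history idea: the net angular deflection of m small
# collisions is sampled in a single draw with matched variance.  Parameters
# are illustrative; real codes use proper multiple-scattering theories
# (Moliere, Goudsmit-Saunderson) rather than this plain Gaussian model.
import random
import statistics

random.seed(1)
theta1 = 0.01      # r.m.s. deflection of a single elastic collision (rad)
m = 100            # collisions grouped into one condensed step
samples = 5000

# Analogue (collision-by-collision) simulation of one macro-step:
detailed = [sum(random.gauss(0.0, theta1) for _ in range(m))
            for _ in range(samples)]

# Condensed-history version: one draw per macro-step with matched variance.
condensed = [random.gauss(0.0, theta1 * m ** 0.5) for _ in range(samples)]

v_d = statistics.variance(detailed)
v_c = statistics.variance(condensed)
print(f"variance detailed  = {v_d:.3e}")
print(f"variance condensed = {v_c:.3e}")
```

    Both variances agree closely while the condensed version uses a hundred times fewer random draws per step, which is the efficiency gain that makes electron transport tractable.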

  14. Searching mixed DNA profiles directly against profile databases.

    Science.gov (United States)

    Bright, Jo-Anne; Taylor, Duncan; Curran, James; Buckleton, John

    2014-03-01

    DNA databases have revolutionised forensic science. They are a powerful investigative tool as they have the potential to identify persons of interest in criminal investigations. Routinely, a DNA profile generated from a crime sample could only be searched for in a database of individuals if the stain was from a single contributor (single source) or if a contributor could unambiguously be determined from a mixed DNA profile. This meant that a significant number of samples were unsuitable for database searching. The advent of continuous methods for the interpretation of DNA profiles offers an advanced way to draw inferential power from the considerable investment made in DNA databases. Using these methods, each profile on the database may be considered a possible contributor to a mixture and a likelihood ratio (LR) can be formed. Those profiles which produce a sufficiently large LR can serve as an investigative lead. In this paper, empirical studies are described to determine what constitutes a large LR. We investigate the effect on a database search of complex mixed DNA profiles with contributors in equal proportions with dropout as a consideration, and also the effect of an incorrect assignment of the number of contributors to a profile. In addition, we give, as a demonstration of the method, the results using two crime samples that were previously unsuitable for database comparison. We show that effective management of the selection of samples for searching and the interpretation of the output can be highly informative. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
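    The likelihood-ratio construction can be illustrated at a single locus with a simple two-contributor model. The allele frequencies are invented, and the model assumes no dropout or drop-in and Hardy-Weinberg proportions; the continuous models the paper uses also exploit peak heights:

```python
# Toy single-locus illustration of forming an LR against a mixed profile:
#   LR = P(mixture | POI + 1 unknown) / P(mixture | 2 unknowns)
# under a no-dropout, no-drop-in, two-contributor model with Hardy-Weinberg
# allele frequencies.  The frequencies below are invented for illustration.
from itertools import combinations_with_replacement

FREQS = {"a": 0.1, "b": 0.3, "c": 0.2, "d": 0.4}

def genotype_prob(g):
    p, q = FREQS[g[0]], FREQS[g[1]]
    return p * p if g[0] == g[1] else 2 * p * q

GENOTYPES = list(combinations_with_replacement(sorted(FREQS), 2))

def p_mixture(observed, known=()):
    """P(observed allele set) summed over unknown genotypes completing it."""
    n_unknown = 2 - len(known)
    total = 0.0

    def rec(i, alleles, prob):
        nonlocal total
        if i == n_unknown:
            if alleles == observed:
                total += prob
            return
        for g in GENOTYPES:
            rec(i + 1, alleles | set(g), prob * genotype_prob(g))

    rec(0, set().union(*known) if known else set(), 1.0)
    return total

observed = {"a", "b", "c"}
poi = ("a", "b")                       # person of interest's genotype
lr = p_mixture(observed, known=(poi,)) / p_mixture(observed)
print(f"LR = {lr:.2f}")
```

    A profile in the database whose alleles are contained in the mixture yields an LR above 1; in a database search, only profiles exceeding a suitably large threshold would be reported as leads.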

  15. The MPI Facial Expression Database — A Validated Database of Emotional and Conversational Facial Expressions

    Science.gov (United States)

    Kaulard, Kathrin; Cunningham, Douglas W.; Bülthoff, Heinrich H.; Wallraven, Christian

    2012-01-01

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

  16. Coordinate transformation based cryo-correlative methods for electron tomography and focused ion beam milling

    International Nuclear Information System (INIS)

    Fukuda, Yoshiyuki; Schrod, Nikolas; Schaffer, Miroslava; Feng, Li Rebekah; Baumeister, Wolfgang; Lucic, Vladan

    2014-01-01

    Correlative microscopy allows imaging of the same feature over multiple length scales, combining light microscopy with high resolution information provided by electron microscopy. We demonstrate two procedures for coordinate transformation based correlative microscopy of vitrified biological samples applicable to different imaging modes. The first procedure aims at navigating cryo-electron tomography to cellular regions identified by fluorescent labels. The second procedure, allowing navigation of focused ion beam milling to fluorescently labeled molecules, is based on the introduction of an intermediate scanning electron microscopy imaging step to overcome the large difference between cryo-light microscopy and focused ion beam imaging modes. These methods make it possible to image fluorescently labeled macromolecular complexes in their natural environments by cryo-electron tomography, while minimizing exposure to the electron beam during the search for features of interest. - Highlights: • Correlative light microscopy and focused ion beam milling of vitrified samples. • Coordinate transformation based cryo-correlative method. • Improved correlative light microscopy and cryo-electron tomography
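    The coordinate-transformation step shared by both procedures can be sketched as a least-squares fit of a similarity transform (rotation, uniform scale, translation) from matched fiducial positions. Treating 2D points as complex numbers makes the fit a one-line regression; the fiducial coordinates below are made up:

```python
# Sketch of coordinate-transformation-based correlation: fiducials seen in
# one imaging mode (e.g. cryo-light microscopy) are matched to the same
# fiducials in another mode (e.g. SEM), and a similarity transform is fitted
# by least squares.  Points are complex numbers, so the transform is
# w = a*z + b; the fiducial coordinates below are invented.

def fit_similarity(src, dst):
    """Least-squares a, b with dst ~ a*src + b (points as complex numbers)."""
    n = len(src)
    zc = sum(src) / n
    wc = sum(dst) / n
    num = sum((w - wc) * (z - zc).conjugate() for z, w in zip(src, dst))
    den = sum(abs(z - zc) ** 2 for z in src)
    a = num / den
    return a, wc - a * zc

# Fluorescent bead positions in LM coordinates (um) and SEM coordinates (px):
lm = [complex(1, 1), complex(4, 1), complex(4, 3), complex(1, 3)]
true_a, true_b = 50j, complex(200, 100)        # 90-degree rotation, scale 50
sem = [true_a * z + true_b for z in lm]

a, b = fit_similarity(lm, sem)
target_of_interest = complex(2.5, 2.0)          # feature picked in the LM image
print("maps to SEM pixel:", a * target_of_interest + b)
```

    With the transform fitted from a few beads, any fluorescently labelled feature can be navigated to in the other modality, which is exactly how the tomography or milling position is chosen while minimizing electron exposure during the search.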

  17. ESPSD, Nuclear Power Plant Siting Database

    International Nuclear Information System (INIS)

    Slezak, S.

    2001-01-01

    1 - Description of program or function: This database is a repository of comprehensive licensing and technical reviews of siting regulatory processes and acceptance criteria for advanced light water reactor (ALWR) nuclear power plants. The program is designed to be used by applicants for an early site permit or combined construction permit/operating license (10 CFR Part 52, Subparts A and C) as input for the development of the application. The database is a complete, menu-driven, self-contained package that can search and sort the supplied data by topic, keyword, or other input. The software is designed for operation on IBM-compatible computers with DOS. 2 - Method of solution: The database is an R:BASE Runtime program with all the necessary database files included.

  18. Planned and ongoing projects (POP) database: development and results.

    Science.gov (United States)

    Wild, Claudia; Erdös, Judit; Warmuth, Marisa; Hinterreiter, Gerda; Krämer, Peter; Chalon, Patrice

    2014-11-01

    The aim of this study was to present the development, structure and results of a database on planned and ongoing health technology assessment (HTA) projects (POP Database) in Europe. The POP Database (POP DB) was set up in an iterative process from a basic Excel sheet to a multifunctional electronic online database. The functionalities, such as the search terminology, the procedures to fill and update the database, the access rules to enter the database, as well as the maintenance roles, were defined in a multistep participatory feedback loop with EUnetHTA Partners. The POP Database has become an online database that hosts not only the titles and MeSH categorizations, but also some basic information on status and contact details about the listed projects of EUnetHTA Partners. Currently, it stores more than 1,200 planned, ongoing or recently published projects of forty-three EUnetHTA Partners from twenty-four countries. Because the POP Database aims to facilitate collaboration, it also provides a matching system to assist in identifying similar projects. Overall, more than 10 percent of the projects in the database are identical both in terms of pathology (indication or disease) and technology (drug, medical device, intervention). In addition, approximately 30 percent of the projects are similar, meaning that they have at least some overlap in content. Although the POP DB is successful concerning regular updates of most national HTA agencies within EUnetHTA, little is known about its actual effects on collaborations in Europe. Moreover, many non-nationally nominated HTA producing agencies neither have access to the POP DB nor can share their projects.
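    The matching system's distinction between "identical" and "similar" projects can be sketched with a simple rule: identical when both pathology and technology match, similar when keyword sets overlap sufficiently. The project records, keywords and threshold below are invented for illustration, not the POP DB's actual matching logic:

```python
# Toy sketch of project matching: "identical" when pathology and technology
# both match, "similar" when keyword sets overlap (Jaccard index).  The
# records, keywords and threshold are invented illustrations, not the POP
# DB's actual matching logic.
from itertools import combinations

projects = [
    {"id": 1, "pathology": "diabetes", "technology": "insulin pump",
     "keywords": {"diabetes", "insulin", "medical device"}},
    {"id": 2, "pathology": "diabetes", "technology": "insulin pump",
     "keywords": {"diabetes", "pump", "medical device"}},
    {"id": 3, "pathology": "asthma", "technology": "inhaled steroid",
     "keywords": {"asthma", "steroid", "drug"}},
    {"id": 4, "pathology": "diabetes", "technology": "education programme",
     "keywords": {"diabetes", "education"}},
]

def relation(p, q, threshold=0.25):
    if p["pathology"] == q["pathology"] and p["technology"] == q["technology"]:
        return "identical"
    inter = len(p["keywords"] & q["keywords"])
    union = len(p["keywords"] | q["keywords"])
    if union and inter / union >= threshold:   # Jaccard overlap
        return "similar"
    return "unrelated"

for p, q in combinations(projects, 2):
    print(p["id"], q["id"], relation(p, q))
```

    In the real database the keyword role is played by MeSH categorizations, which standardize terminology across agencies and languages.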

  19. Bibliographical database of radiation biological dosimetry and risk assessment: Part 1, through June 1988

    Energy Technology Data Exchange (ETDEWEB)

    Straume, T.; Ricker, Y.; Thut, M.

    1988-08-29

    This database was constructed to support research in radiation biological dosimetry and risk assessment. Relevant publications were identified through detailed searches of national and international electronic databases and through our personal knowledge of the subject. Publications were numbered and keyworded, and referenced in an electronic data-retrieval system that permits quick access through computerized searches on publication number, authors, key words, title, year, and journal name. Photocopies of all publications contained in the database are maintained in a file that is numerically arranged by citation number. This report of the database is provided as a useful reference and overview. It should be emphasized that the database will grow as new citations are added to it. With that in mind, we arranged this report in order of ascending citation number so that follow-up reports will simply extend this document. The database cites 1,212 publications from 119 different scientific journals; 27 of these journals are cited at least 5 times. It also contains references to 42 books and published symposia, and 129 reports. Information relevant to radiation biological dosimetry and risk assessment is widely distributed among the scientific literature, although a few journals clearly dominate. The four journals publishing the largest number of relevant papers are Health Physics, Mutation Research, Radiation Research, and International Journal of Radiation Biology. Publications in Health Physics make up almost 10% of the current database.

  20. Bibliographical database of radiation biological dosimetry and risk assessment: Part 1, through June 1988

    International Nuclear Information System (INIS)

    Straume, T.; Ricker, Y.; Thut, M.

    1988-01-01

    This database was constructed to support research in radiation biological dosimetry and risk assessment. Relevant publications were identified through detailed searches of national and international electronic databases and through our personal knowledge of the subject. Publications were numbered and keyworded, and referenced in an electronic data-retrieval system that permits quick access through computerized searches on publication number, authors, key words, title, year, and journal name. Photocopies of all publications contained in the database are maintained in a file that is numerically arranged by citation number. This report of the database is provided as a useful reference and overview. It should be emphasized that the database will grow as new citations are added to it. With that in mind, we arranged this report in order of ascending citation number so that follow-up reports will simply extend this document. The database cites 1,212 publications from 119 different scientific journals; 27 of these journals are cited at least 5 times. It also contains references to 42 books and published symposia, and 129 reports. Information relevant to radiation biological dosimetry and risk assessment is widely distributed among the scientific literature, although a few journals clearly dominate. The four journals publishing the largest number of relevant papers are Health Physics, Mutation Research, Radiation Research, and International Journal of Radiation Biology. Publications in Health Physics make up almost 10% of the current database.

  1. Electron-phonon thermalization in a scalable method for real-time quantum dynamics

    Science.gov (United States)

    Rizzi, Valerio; Todorov, Tchavdar N.; Kohanoff, Jorge J.; Correa, Alfredo A.

    2016-01-01

    We present a quantum simulation method that follows the dynamics of out-of-equilibrium many-body systems of electrons and oscillators in real time. Its cost is linear in the number of oscillators and it can probe time scales from attoseconds to hundreds of picoseconds. Contrary to Ehrenfest dynamics, it can thermalize starting from a variety of initial conditions, including electronic population inversion. While an electronic temperature can be defined in terms of a nonequilibrium entropy, a Fermi-Dirac distribution in general emerges only after thermalization. These results can be used to construct a kinetic model of electron-phonon equilibration based on the explicit quantum dynamics.

  2. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. The standardless algorithms are considerably faster than the methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to solve the problem of current variation, for example, in equipment with a cold field-emission gun. Due to significant advances in accuracy achieved during recent years, the product of successive efforts to improve the description of the generation, absorption and detection of X-rays, standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy they achieve, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  3. Database Description - RMG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ase Description General information of database Database name RMG Alternative name ...raki 305-8602, Japan National Institute of Agrobiological Sciences E-mail : Database... classification Nucleotide Sequence Databases Organism Taxonomy Name: Oryza sativa Japonica Group Taxonomy ID: 39947 Database...rnal: Mol Genet Genomics (2002) 268: 434–445 External Links: Original website information Database...available URL of Web services - Need for user registration Not available About This Database Database Descri

  4. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  5. The effects of photovoltaic electricity injection into microgrids: Combination of Geographical Information Systems, multicriteria decision methods and electronic control modeling

    International Nuclear Information System (INIS)

    Roa-Escalante, Gino de Jesús; Sánchez-Lozano, Juan Miguel; Faxas, Juan-Gabriel; García-Cascales, M. Socorro; Urbina, Antonio

    2015-01-01

    Highlights: • Geographical Information Systems can be used as a support to classify viable locations for photovoltaic facilities. • Multicriteria decision methods are useful tools for choosing the optimal locations for photovoltaic systems. • Variations of the photovoltaic power injected into the grid have been calculated for the optimum locations. • Grid stabilization can be achieved within 500 ms with electronic control strategies. - Abstract: This article presents a model to calculate the impact on the grid of the injection of electricity generated from photovoltaic systems. The methodology combines the use of Geographical Information System tools to classify the optimal locations for the installation of photovoltaic systems with the calculation of the impact on microgrids of the electricity generated in such locations. The case study is focused on the Murcia region, in south-east Spain, and on medium-size photovoltaic systems. The locations have been selected from a Geographical Information System database including several parameters, and evaluated and classified using a fuzzy version of the multicriteria decision method called Technique for Order Preference by Similarity to Ideal Solution. The weights for the criteria used in the evaluation were obtained with the Analytic Hierarchy Process. Finally, using meteorological data from a small set of possible locations, the impact on the grid of injecting power generated by photovoltaic systems connected to the grid via a module implementing different electronic control strategies has been calculated. Different electronic control strategies have been modeled to demonstrate that stabilization of the electrical parameters of a microgrid can be achieved within 500 ms in all cases, even when a relatively large power surge, or slower variations, is injected into the grid from the medium-size photovoltaic systems.
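    A minimal, crisp (non-fuzzy) TOPSIS sketch in Python, with hypothetical site criteria and AHP-style weights invented for illustration; the study itself uses a fuzzy variant:

```python
import math

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS: score alternatives (rows) over criteria (columns).
    benefit[j] is True for criteria to maximize, False to minimize."""
    ncols = len(weights)
    # Vector-normalize each column, then apply the criteria weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[w * row[j] / n for j, (w, n) in enumerate(zip(weights, norms))]
         for row in matrix]
    # Ideal and anti-ideal points per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    # Closeness coefficient: distance to anti-ideal over total distance.
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical locations scored on irradiation (kWh/m2/yr, maximize),
# slope (degrees, minimize), distance to grid connection (km, minimize);
# weights stand in for AHP-derived values.
matrix = [[1750, 3.0, 1.2],
          [1820, 6.5, 0.4],
          [1680, 1.5, 2.5]]
scores = topsis(matrix, weights=[0.5, 0.2, 0.3], benefit=[True, False, False])
best = scores.index(max(scores))
```

    Alternatives with scores closer to 1 are nearer the ideal point; the ranking drives the site classification.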

  6. Application of the method of continued fractions for electron scattering by linear molecules

    International Nuclear Information System (INIS)

    Lee, M.-T.; Iga, I.; Fujimoto, M.M.; Lara, O.; Brasilia Univ., DF

    1995-01-01

    The method of continued fractions (MCF) of Horacek and Sasakawa is adapted for the first time to study low-energy electron scattering by linear molecules. In particular, we have calculated the reactance K-matrices for an electron scattered by the hydrogen molecule and the hydrogen molecular ion, as well as by the polar LiH molecule, at the static-exchange level. For all the applications studied herein, the calculated physical quantities converge rapidly to the correct values, even for a strongly polar molecule such as LiH, and in most cases the convergence is monotonic. Our study suggests that the MCF could be an efficient method for studying electron-molecule scattering and also photoionization of molecules. (Author)

  7. Research Electronic Data Capture (REDCap®) used as an audit tool with a built-in database.

    Science.gov (United States)

    Kragelund, Signe H; Kjærsgaard, Mona; Jensen-Fangel, Søren; Leth, Rita A; Ank, Nina

    2018-05-01

    The aim of this study was to develop an audit tool with a built-in database using Research Electronic Data Capture (REDCap®) as part of an antimicrobial stewardship program at a regional hospital in the Central Denmark Region, and to analyse the need, if any, to involve more than one expert in the evaluation of cases of antimicrobial treatment, and the level of agreement among the experts. Patients treated with systemic antimicrobials in the period from 1 September 2015 to 31 August 2016 were included, in total 722 cases. Data were collected retrospectively and entered manually. The audit was based on seven flow charts regarding: (1) initiation of antimicrobial treatment; (2) infection; (3) prescription and administration of antimicrobials; (4) discontinuation of antimicrobials; (5) reassessment within 48 h after the first prescription of antimicrobials; (6) microbiological sampling in the period between suspicion of infection and the first administration of antimicrobials; (7) microbiological results. The audit was based on automatic calculations drawing on the entered data and on expert assessments. Initially, two experts completed the audit, and in the cases in which they disagreed, a third expert was consulted. In 31.9% of the cases, the two experts agreed on all elements of the audit. In 66.2%, the two experts reached agreement by discussing the cases. Finally, 1.9% of the cases were completed in cooperation with a third expert. The experts assessed 3406 flow charts, of which they agreed on 75.8%. We succeeded in creating an audit tool with a built-in database that facilitates independent expert evaluation using REDCap. We found a large inter-observer difference that needs to be considered when constructing a project based on expert judgements. Our two experts agreed on most of the flow charts after discussion, whereas the third expert's intervention did not have any influence on the overall assessment. Copyright © 2018 Elsevier Inc. All rights reserved.
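    The agreement figures of this kind (e.g. agreement on 75.8% of 3406 flow charts) are simple percent agreement between two raters. A small Python sketch with invented expert judgements:

```python
def percent_agreement(ratings_a, ratings_b):
    """Fraction of items on which two raters gave the same judgement."""
    assert len(ratings_a) == len(ratings_b)
    same = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return same / len(ratings_a)

# Hypothetical expert judgements on eight audit flow charts.
expert1 = ["ok", "ok", "deviation", "ok", "deviation", "ok", "ok", "deviation"]
expert2 = ["ok", "ok", "deviation", "deviation", "deviation", "ok", "ok", "ok"]
agreement = percent_agreement(expert1, expert2)  # 6 of 8 items agree -> 0.75
```

    Note that raw percent agreement does not correct for chance agreement; a statistic such as Cohen's kappa would be the usual refinement.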

  8. Database Description - DGBY | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name DGBY Alternative name Database...EL: +81-29-838-8066 E-mail: Database classification Microarray Data and other Gene Expression Databases Orga...nism Taxonomy Name: Saccharomyces cerevisiae Taxonomy ID: 4932 Database descripti...-called phenomics). We uploaded these data on this website which is designated DGBY(Database for Gene expres...ma J, Ando A, Takagi H. Journal: Yeast. 2008 Mar;25(3):179-90. External Links: Original website information Database

  9. Database Description - KOME | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name KOME Alternative nam... Sciences Plant Genome Research Unit Shoshi Kikuchi E-mail : Database classification Plant databases - Rice ...Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database description Information about approximately ...Hayashizaki Y, Kikuchi S. Journal: PLoS One. 2007 Nov 28; 2(11):e1235. External Links: Original website information Database...OS) Rice mutant panel database (Tos17) A Database of Plant Cis-acting Regulatory

  10. A parallel orbital-updating based plane-wave basis method for electronic structure calculations

    International Nuclear Information System (INIS)

    Pan, Yan; Dai, Xiaoying; Gironcoli, Stefano de; Gong, Xin-Gao; Rignanese, Gian-Marco; Zhou, Aihui

    2017-01-01

    Highlights: • Three parallel orbital-updating based plane-wave basis methods for electronic structure calculations are proposed. • These new methods avoid generating large-scale eigenvalue problems and thus reduce the computational cost. • These new methods allow for two-level parallelization, which is particularly interesting for large scale parallelization. • Numerical experiments show that these new methods are reliable and efficient for large scale calculations on modern supercomputers. - Abstract: Motivated by the recently proposed parallel orbital-updating approach in the real space method, we propose a parallel orbital-updating based plane-wave basis method for electronic structure calculations, for solving the corresponding eigenvalue problems. In addition, we propose two new modified parallel orbital-updating methods. Compared to the traditional plane-wave methods, our methods allow for two-level parallelization, which is particularly interesting for large scale parallelization. Numerical experiments show that these new methods are more reliable and efficient for large scale calculations on modern supercomputers.

  11. Generalized Hartree-Fock method for electron-atom scattering

    International Nuclear Information System (INIS)

    Rosenberg, L.

    1997-01-01

    In the widely used Hartree-Fock procedure for atomic structure calculations, trial functions in the form of linear combinations of Slater determinants are constructed and the Rayleigh-Ritz minimum principle is applied to determine the best in that class. A generalization of this approach, applicable to low-energy electron-atom scattering, is developed here. The method is based on a unique decomposition of the scattering wave function into open- and closed-channel components, so chosen that an approximation to the closed-channel component may be obtained by adopting it as a trial function in a minimum principle, whose rigor can be maintained even when the target wave functions are imprecisely known. Given a closed-channel trial function, the full scattering function may be determined from the solution of an effective one-body Schroedinger equation. Alternatively, in a generalized Hartree-Fock approach, the minimum principle leads to coupled integrodifferential equations to be satisfied by the basis functions appearing in a Slater-determinant representation of the closed-channel wave function; it also provides a procedure for optimizing the choice of nonlinear parameters in a variational determination of these basis functions. Inclusion of additional Slater determinants in the closed-channel trial function allows for systematic improvement of that function, as well as the calculated scattering parameters, with the possibility of spurious singularities avoided. Electron-electron correlations can be important in accounting for long-range forces and resonances. These correlation effects can be included explicitly by suitable choice of one component of the closed-channel wave function; the remaining component may then be determined by the generalized Hartree-Fock procedure. As a simple test, the method is applied to s-wave scattering of positrons by hydrogen. copyright 1997 The American Physical Society

  12. Solutions for medical databases optimal exploitation.

    Science.gov (United States)

    Branescu, I; Purcarea, V L; Dobrescu, R

    2014-03-15

    The paper discusses methods to apply OLAP techniques to multidimensional databases that leverage the existing performance-enhancing technique known as practical pre-aggregation, making this technique relevant to a much wider range of medical applications as logistic support for data warehousing techniques. The transformations have low computational complexity in practice and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies into current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases.
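    The idea behind practical pre-aggregation can be sketched in a few lines of Python (hypothetical hospital data; a real system would do this inside the relational engine): aggregates are computed once at the finest level of a dimension hierarchy, and coarser levels are rolled up from that pre-aggregate rather than recomputed from the raw fact table.

```python
from collections import defaultdict

# Hypothetical admissions fact rows: (department, date "YYYY-MM-DD", count).
rows = [
    ("cardiology", "2014-03-01", 4),
    ("cardiology", "2014-03-02", 6),
    ("oncology",   "2014-03-01", 3),
    ("oncology",   "2014-04-01", 5),
]

# Pre-aggregate once at the finest hierarchy level (department, day).
by_day = defaultdict(int)
for dept, date, n in rows:
    by_day[(dept, date)] += n

# The coarser (department, month) level is rolled up from the pre-aggregate,
# not from the raw fact table.
by_month = defaultdict(int)
for (dept, date), n in by_day.items():
    by_month[(dept, date[:7])] += n
```

    The rollup is only valid when the hierarchy is summarizable; the transformations discussed in the paper are precisely about repairing hierarchies so that this reuse stays correct.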

  13. Enriching Great Britain's National Landslide Database by searching newspaper archives

    Science.gov (United States)

    Taylor, Faith E.; Malamud, Bruce D.; Freeborough, Katy; Demeritt, David

    2015-11-01

    Our understanding of where landslide hazard and impact will be greatest is largely based on our knowledge of past events. Here, we present a method to supplement existing records of landslides in Great Britain by searching an electronic archive of regional newspapers. In Great Britain, the British Geological Survey (BGS) is responsible for updating and maintaining records of landslide events and their impacts in the National Landslide Database (NLD). The NLD contains records of more than 16,500 landslide events in Great Britain. Data sources for the NLD include field surveys, academic articles, grey literature, news, public reports and, since 2012, social media. We aim to supplement the richness of the NLD by (i) identifying additional landslide events, (ii) acting as an additional source of confirmation of events existing in the NLD and (iii) adding more detail to existing database entries. This is done by systematically searching the Nexis UK digital archive of 568 regional newspapers published in the UK. In this paper, we construct a robust Boolean search criterion by experimenting with landslide terminology for four training periods. We then apply this search to all articles published in 2006 and 2012. This resulted in the addition of 111 records of landslide events to the NLD over the 2 years investigated (2006 and 2012). We also find that we were able to obtain information about landslide impact for 60-90% of landslide events identified from newspaper articles. Spatial and temporal patterns of additional landslides identified from newspaper articles are broadly in line with those existing in the NLD, confirming that the NLD is a representative sample of landsliding in Great Britain. This method could now be applied to more time periods and/or other hazards to add richness to databases and thus improve our ability to forecast future events based on records of past events.
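    A toy version of such a Boolean search criterion, in Python with invented terms and articles (the paper's actual criterion was tuned on four training periods against Nexis UK):

```python
import re

# Hypothetical simplified criterion: an article must contain at least one
# landslide term AND must not contain a figurative use such as "landslide
# victory", which would otherwise flood the results with election coverage.
LANDSLIDE_TERMS = re.compile(r"\b(landslide|landslip|mudslide|rockfall)\b", re.I)
EXCLUDE = re.compile(r"\blandslide (victory|win|defeat)\b", re.I)

def matches(article_text):
    return bool(LANDSLIDE_TERMS.search(article_text)) and not EXCLUDE.search(article_text)

articles = [
    "Heavy rain triggered a landslip that closed the A83 in Argyll.",
    "The candidate swept to a landslide victory in the by-election.",
    "Council plans new flood defences after winter storms.",
]
hits = [a for a in articles if matches(a)]  # only the first article qualifies
```

    Matched articles would then be read manually to extract event date, location, and impact before a record is added to the database.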

  14. Final Results of Shuttle MMOD Impact Database

    Science.gov (United States)

    Hyde, J. L.; Christiansen, E. L.; Lear, D. M.

    2015-01-01

    The Shuttle Hypervelocity Impact Database documents damage features on each Orbiter thought to be from micrometeoroids (MM) or orbital debris (OD). Data are divided into tables for crew module windows, payload bay door radiators and thermal protection systems, along with other miscellaneous regions. The combined number of records in the database is nearly 3000. Each database record provides impact feature dimensions, location on the vehicle and relevant mission information. Additional detail on the type and size of particle that produced the damage site is provided when sampling data and definitive spectroscopic analysis results are available. Guidelines are described that were used in determining whether impact damage is from micrometeoroid or orbital debris impact, based on the findings from scanning electron microscopy chemical analysis. Relationships assumed when converting observed feature sizes in different shuttle materials to particle sizes are presented. A small number of significant impacts on the windows, radiators and wing leading edge are highlighted and discussed in detail, including the hypervelocity impact testing performed to estimate the particle sizes that produced the damage.

  15. Database Application Schema Forensics

    Directory of Open Access Journals (Sweden)

    Hector Quintus Beyers

    2014-12-01

    Full Text Available The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database, or operators of the database can be altered to deliver incorrect results when used in queries. This paper discusses categories of possibilities that exist to alter the application schema, with some practical examples. Two forensic environments in which a forensic investigation can take place are introduced, and arguments are provided for why these environments are important. Methods are presented for achieving these environments for the application schema layer of a DBMS. A process is proposed for extracting forensic evidence from the application schema layer of a DBMS. The application schema forensic evidence identification process can be applied to a wide range of forensic settings.
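    One simple way to detect metadata tampering of the kind described, sketched here as a hypothetical Python fragment rather than a method from the paper, is to fingerprint a known-good application schema and compare it against the schema found during the investigation:

```python
import hashlib

def schema_fingerprint(table_defs):
    """Hash a canonical, sorted dump of table definitions so that a later
    comparison can reveal metadata changes in the application schema layer."""
    canonical = "\n".join(f"{t}:{ddl}" for t, ddl in sorted(table_defs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical snapshot taken when the DBMS was known to be clean...
baseline = {"accounts": "id INT PRIMARY KEY, balance DECIMAL(12,2)"}
ref = schema_fingerprint(baseline)

# ...and the same table's definition as found during the investigation.
found = {"accounts": "id INT PRIMARY KEY, balance FLOAT"}
tampered = schema_fingerprint(found) != ref
```

    A fingerprint mismatch only shows that something changed; identifying what changed and whether it is forensically meaningful still requires the kind of structured process the paper proposes.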

  16. Database Vs Data Warehouse

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Data warehouse technology includes a set of concepts and methods that offer users useful information for decision making. The necessity to build a data warehouse arises from the necessity to improve the quality of information in the organization. The data, proceeding from different sources and having a variety of forms - both structured and unstructured - are filtered according to business rules and integrated into a single large data collection. Using informatics solutions, managers have understood that data stored in operational systems - including databases - are an informational gold mine that must be exploited. Data warehouses have been developed to answer the increasing demands for complex analysis, which could not be properly achieved with operational databases. The present paper emphasizes some of the criteria that information application developers can use in order to choose between a database solution and a data warehouse one.

  17. Calculation on spectrum of direct DNA damage induced by low-energy electrons including dissociative electron attachment.

    Science.gov (United States)

    Liu, Wei; Tan, Zhenyu; Zhang, Liming; Champion, Christophe

    2017-03-01

    In this work, direct DNA damage induced by low-energy electrons (sub-keV) is simulated using a Monte Carlo method. The distinctive features of the present simulation are that it considers the new mechanism of DNA damage due to dissociative electron attachment (DEA) and that it allows damage to specific bases (i.e., adenine, thymine, guanine, or cytosine) to be determined. The electron track structure in liquid water is generated based on the dielectric response model for describing electron inelastic scattering, and on a free-parameter theoretical model and the NIST database for calculating electron elastic scattering. Ionization cross sections of DNA bases are used to generate base radicals, and available DEA cross sections of DNA components are applied for determining DNA-strand breaks and base damage induced by sub-ionization electrons. The electron elastic scattering from DNA components is simulated using cross sections from different theoretical calculations. The resulting yields of various strand breaks and base damage in the cellular environment are given. In particular, the contributions of sub-ionization electrons to various strand breaks and base damage are quantitatively presented, and the correlation between complex clustered DNA damage and the corresponding damaged bases is explored. This work shows that the contribution of sub-ionization electrons to strand breaks is substantial, up to about 40-70%, and that this contribution is mainly focused on single-strand breaks. In addition, the base damage induced by sub-ionization electrons contributes about 20-40% of the total base damage, and there is an evident correlation between single-strand break and the damaged base pair A-T.
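    The core Monte Carlo step in track-structure codes of this kind is choosing the next interaction channel in proportion to its cross section. A minimal Python sketch (the cross-section values are invented placeholders, not those used in the paper):

```python
import random

# Hypothetical relative cross sections (arbitrary units) at one electron
# energy: the channel of the next interaction is drawn in proportion.
channels = {"elastic": 6.0, "ionization": 2.5, "excitation": 1.0, "DEA": 0.5}

def sample_channel(rng):
    """Draw an interaction channel with probability sigma_i / sigma_total."""
    total = sum(channels.values())
    r = rng.random() * total
    for name, sigma in channels.items():
        r -= sigma
        if r <= 0.0:
            return name
    return name  # guard against floating-point rounding

rng = random.Random(42)
counts = {name: 0 for name in channels}
for _ in range(10_000):
    counts[sample_channel(rng)] += 1
# With these weights, roughly 60% of interactions are elastic and ~5% DEA.
```

    A full simulation would also sample the energy loss and scattering angle for the chosen channel and update the electron's position and energy along the track.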

  18. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  19. Databases for rRNA gene profiling of microbial communities

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, Matthew

    2013-07-02

    The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.

  20. The electronic register of patients with hypertension in Tomsk Region

    Directory of Open Access Journals (Sweden)

    O. S. Kobyakova

    2012-01-01

    Full Text Available Within the framework of the regional program «Prevention and treatment of arterial hypertension for the period 2004—2008», an electronic register of patients with hypertension in Tomsk Region has been created. The electronic register is a two-level system in which two kinds of databases interact: the first level comprises the databases of individual medical organizations; the second level is the central integrated database. The basic inputs to the electronic register are documents approved by the Health Ministry of the Russian Federation, namely the out-patient coupon and the card of dynamic supervision of the patient with hypertension. All data about the patients included in the register are subdivided into unchangeable and changeable items. The electronic register is an effective monitoring system, providing health service bodies with high-quality and complete information to support decision-making and the measures taken for the prevention and treatment of hypertension.

  1. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)

    2002-03-01

    It is necessary to predict or evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control (I and C) equipment. However, most databases for reliability prediction contain no data for up-to-date equipment, and the failure modes are not classified. Prediction results for a specific component differ according to the methods and databases used, and for boards and systems each method likewise yields different values. This study concerns reliability prediction of the PDC system of Wolsong NPP1 as digital I and C equipment. Various reliability prediction methods and failure databases are used in calculating the reliability, to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)
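    Handbook-style reliability prediction of the kind compared in such studies is often a parts-count calculation: the board failure rate is the sum of component failure rates taken from a database, so different databases yield different predictions. A hypothetical Python sketch (all rates invented for illustration):

```python
# Parts-count style prediction: the failure rate of a board is the sum of
# quantity * component failure rate (here in failures per 10^6 hours).
board = {"microprocessor": 1, "dram": 4, "capacitor": 120, "resistor": 200}

# Two hypothetical failure-rate databases giving different component rates.
rates_db_a = {"microprocessor": 0.5, "dram": 0.2, "capacitor": 0.002, "resistor": 0.001}
rates_db_b = {"microprocessor": 1.1, "dram": 0.35, "capacitor": 0.004, "resistor": 0.002}

def board_failure_rate(quantities, rates):
    return sum(qty * rates[part] for part, qty in quantities.items())

lam_a = board_failure_rate(board, rates_db_a)  # failures per 10^6 h
lam_b = board_failure_rate(board, rates_db_b)
mtbf_a_hours = 1e6 / lam_a  # mean time between failures under database A
```

    The same board yields a different predicted failure rate under each database, which is exactly the sensitivity the report set out to quantify.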

  2. The Research Potential of the Electronic OED Database at the University of Waterloo: A Case Study.

    Science.gov (United States)

    Berg, Donna Lee

    1991-01-01

    Discusses the history and structure of the online database of the second edition of the Oxford English Dictionary (OED) and the software tools developed at the University of Waterloo to manipulate the unusually complex database. Four sample searches that indicate some types of problems that might be encountered are appended. (DB)

  3. Download - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods ...t_db_link_en.zip (36.3 KB) - 6 Genome analysis methods pgdbj_dna_marker_linkage_map_genome_analysis_methods_... of This Database Site Policy | Contact Us Download - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive ...

  4. ECG-ViEW II, a freely accessible electrocardiogram database

    Science.gov (United States)

    Park, Man Young; Lee, Sukhoon; Jeon, Min Seok; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

    The Electrocardiogram Vigilance with Electronic data Warehouse II (ECG-ViEW II) is a large, single-center database comprising numeric parameter data of the surface electrocardiograms of all patients who underwent testing from 1 June 1994 to 31 July 2013. The electrocardiographic data include the test date, clinical department, RR interval, PR interval, QRS duration, QT interval, QTc interval, P axis, QRS axis, and T axis. These data are connected with patient age, sex, ethnicity, comorbidities, age-adjusted Charlson comorbidity index, prescribed drugs, and electrolyte levels. This longitudinal observational database contains 979,273 electrocardiograms from 461,178 patients over a 19-year study period. This database can provide an opportunity to study electrocardiographic changes caused by medications, disease, or other demographic variables. ECG-ViEW II is freely available at http://www.ecgview.org. PMID:28437484
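    As an example of working with such parameter data, the standard Bazett correction relates the stored QT, RR, and QTc values (the measurements below are hypothetical, not records from ECG-ViEW II):

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Bazett's heart-rate correction: QTc = QT / sqrt(RR), with RR in seconds."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

# Hypothetical record: QT of 400 ms at an RR interval of 800 ms (75 bpm).
qtc = qtc_bazett(400.0, 800.0)  # about 447 ms
```

    Recomputing a derived quantity like QTc from the stored raw intervals is also a cheap consistency check when analysing a large electrocardiogram database.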

  5. Iteratively-coupled propagating exterior complex scaling method for electron-hydrogen collisions

    International Nuclear Information System (INIS)

    Bartlett, Philip L; Stelbovics, Andris T; Bray, Igor

    2004-01-01

    A newly-derived iterative coupling procedure for the propagating exterior complex scaling (PECS) method is used to efficiently calculate the electron-impact wavefunctions for atomic hydrogen. An overview of this method is given along with methods for extracting scattering cross sections. Differential scattering cross sections at 30 eV are presented for the electron-impact excitation to the n = 1, 2, 3 and 4 final states, for both PECS and convergent close coupling (CCC), which are in excellent agreement with each other and with experiment. PECS results are presented at 27.2 eV and 30 eV for symmetric and asymmetric energy-sharing triple differential cross sections, which are in excellent agreement with CCC and exterior complex scaling calculations, and with experimental data. At these intermediate energies, the efficiency of the PECS method with iterative coupling has allowed highly accurate partial-wave solutions of the full Schroedinger equation, for L ≤ 50 and a large number of coupled angular momentum states, to be obtained with minimal computing resources. (letter to the editor)

  6. Development of a standardized Intranet database of formulation records for nonsterile compounding, Part 2.

    Science.gov (United States)

    Haile, Michael; Anderson, Kim; Evans, Alex; Crawford, Angela

    2012-01-01

    In part 1 of this series, we outlined the rationale behind the development of a centralized electronic database used to maintain nonsterile compounding formulation records in the Mission Health System, which is a union of several independent hospitals and satellite and regional pharmacies that form the cornerstone of advanced medical care in several areas of western North Carolina. Hospital providers in many healthcare systems require compounded formulations to meet the needs of their patients (in particular, pediatric patients). Before a centralized electronic compounding database was implemented in the Mission Health System, each satellite or regional pharmacy affiliated with that system had a specific set of formulation records, but no standardized format for those records existed. In this article, we describe the quality control, database platform selection, description, implementation, and execution of our intranet database system, which is designed to maintain, manage, and disseminate nonsterile compounding formulation records in the hospitals and affiliated pharmacies of the Mission Health System. The objectives of that project were to standardize nonsterile compounding formulation records, create a centralized computerized database that would increase healthcare staff members' access to formulation records, establish beyond-use dates based on published stability studies, improve quality control, reduce the potential for medication errors related to compounding medications, and (ultimately) improve patient safety.

  7. [Method of traditional Chinese medicine formula design based on 3D-database pharmacophore search and patent retrieval].

    Science.gov (United States)

    He, Yu-su; Sun, Zhi-yi; Zhang, Yan-ling

    2014-11-01

    By using the pharmacophore model of mineralocorticoid receptor antagonists as a starting point, this study explores a method of traditional Chinese medicine formula design for anti-hypertensives. Pharmacophore models were generated with the 3D-QSAR pharmacophore (HypoGen) program of DS3.5, based on a training set composed of 33 mineralocorticoid receptor antagonists. The best pharmacophore model consisted of two hydrogen-bond acceptors, three hydrophobic features, and four excluded volumes. Its correlation coefficients for the training set and test set, N, and CAI value were 0.9534, 0.6748, 2.878, and 1.119, respectively. Database screening yielded 1700 active compounds from 86 source plants. Because traditional theory lacks an available anti-hypertensive medication strategy, this article takes advantage of patent retrieval in the world traditional medicine patent database in order to design drug formulae. Finally, two formulae were obtained for anti-hypertension.

  8. Database Description - SSBD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name SSBD Alternative nam...ss 2-2-3 Minatojima-minamimachi, Chuo-ku, Kobe 650-0047, Japan, RIKEN Quantitative Biology Center Shuichi Onami E-mail: Database... classification Other Molecular Biology Databases Database classification Dynamic databa...elegans Taxonomy ID: 6239 Taxonomy Name: Escherichia coli Taxonomy ID: 562 Database description Systems Scie...i Onami Journal: Bioinformatics/April, 2015/Volume 31, Issue 7 External Links: Original website information Database

  9. Database Description - GETDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available abase Description General information of database Database name GETDB Alternative n...ame Gal4 Enhancer Trap Insertion Database DOI 10.18908/lsdba.nbdc00236-000 Creator Creator Name: Shigeo Haya... Chuo-ku, Kobe 650-0047 Tel: +81-78-306-3185 FAX: +81-78-306-3183 E-mail: Database classification Expression... Invertebrate genome database Organism Taxonomy Name: Drosophila melanogaster Taxonomy ID: 7227 Database des...riginal website information Database maintenance site Drosophila Genetic Resource

  10. Calculation of dynamic and electronic properties of perfect and defect crystals by semiempirical quantum mechanical methods

    International Nuclear Information System (INIS)

    Zunger, A.

    1975-07-01

    Semiempirical all-valence-electron LCAO methods that were previously used to study the electronic structure of molecules are applied to three problems in solid state physics: the electronic band structure of covalent crystals, point defect problems in solids, and the lattice dynamics of molecular crystals. Calculation methods for the electronic band structure of regular solids are introduced, and problems regarding the computation of the density matrix in solids are discussed. Three models for treating the electronic eigenvalue problem in the solid, within the proposed calculation schemes, are discussed, and the proposed models and calculation schemes are applied to the calculation of the electronic structure of several solids belonging to different crystal types. The calculation models also describe electronic properties of deep defects in covalent insulating crystals. The possible usefulness of the semiempirical LCAO methods in determining the first-order intermolecular interaction potential in solids is presented, together with an improved model for treating the lattice dynamics and related thermodynamical properties of molecular solids. The improved lattice dynamical model is used to compute phonon dispersion curves, phonon density of states, the stable unit cell structure, lattice heat capacity and thermal crystal parameters in α- and γ-N2 crystals, using the N2-N2 intermolecular interaction potential computed from the semiempirical LCAO methods. (B.G.)

  11. JICST Factual DatabaseJICST Chemical Substance Safety Regulation Database

    Science.gov (United States)

    Abe, Atsushi; Sohma, Tohru

    JICST Chemical Substance Safety Regulation Database is based on the Database of Safety Laws for Chemical Compounds constructed by the Japan Chemical Industry Ecology-Toxicology & Information Center (JETOC), sponsored by the Science and Technology Agency in 1987. JICST has modified the JETOC database system, added data and started the online service through JOIS-F (JICST Online Information Service-Factual database) in January 1990. The JICST database comprises eighty-three laws and fourteen hundred compounds. The authors outline the database, data items, files and search commands. An example of an online session is presented.

  12. Advanced cluster methods for correlated-electron systems

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andre

    2015-04-27

    In this thesis, quantum cluster methods are used to calculate electronic properties of correlated-electron systems. A special focus lies on the determination of the ground state properties of a 3/4 filled triangular lattice within the one-band Hubbard model. At this filling, the electronic density of states exhibits a so-called van Hove singularity and the Fermi surface becomes perfectly nested, causing an instability towards a variety of spin-density-wave (SDW) and superconducting states. While chiral d+id-wave superconductivity has been proposed as the ground state in the weak coupling limit, the situation towards strong interactions is unclear. Additionally, quantum cluster methods are used here to investigate the interplay of Coulomb interactions and symmetry-breaking mechanisms within the nematic phase of iron-pnictide superconductors. The transition from a tetragonal to an orthorhombic phase is accompanied by a significant change in electronic properties, while long-range magnetic order is not yet established. The driving force of this transition may be not only phonons but also magnetic or orbital fluctuations. The signatures of these scenarios are studied with quantum cluster methods to identify the most important effects. Here, cluster perturbation theory (CPT) and its variational extension, the variational cluster approach (VCA), are used to treat the respective systems on a level beyond mean-field theory. Short-range correlations are incorporated numerically exactly by exact diagonalization (ED). In the VCA, long-range interactions are included by variational optimization of a fictitious symmetry-breaking field based on a self-energy functional approach. Due to limitations of ED, cluster sizes are limited to a small number of degrees of freedom. For the 3/4 filled triangular lattice, the VCA is performed for different cluster symmetries. A strong symmetry dependence and finite-size effects make a comparison of the results from different clusters difficult.

  13. SciELO, Scientific Electronic Library Online, a Database of Open Access Journals

    Science.gov (United States)

    Meneghini, Rogerio

    2013-01-01

    This essay discusses SciELO, a scientific journal database operating in 14 countries. It covers over 1000 journals, providing open access to full texts and to tables of scientometric data. In Brazil it is responsible for a collection of nearly 300 journals, selected over 15 years as the best Brazilian periodicals in the natural and social sciences.

  14. [SciELO: method for electronic publishing].

    Science.gov (United States)

    Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C

    2001-01-01

    It describes the SciELO (Scientific Electronic Library Online) Methodology for electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles on which the methodology's development was founded, its application in building the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible and comprehensive solution for scientific electronic publishing.

  15. Free Electron Laser Induced Forward Transfer Method of Biomaterial for Marking

    Science.gov (United States)

    Suzuki, Kaoru

    Biomaterials such as chitosan and poly(lactic acid) containing a fluorescent agent were deposited onto hard biological tissue, such as teeth or the fingernails of dogs or cats, or onto sapphire substrates by the free electron laser induced forward transfer method for direct-write marking. A spin-coated target of biomaterial with the fluorescent agent rhodamine 6G or zinc phthalocyanine on a sapphire plate was ablated by the free electron laser (resonance absorption wavelength of the biomaterial: 3380 nm). The influence of the spin-coating film-forming temperature on the hardness and adhesion strength of the biomaterial is studied in particular. The effect of resonant excitation of the biomaterial target, achieved by tuning the free electron laser, on damage to the biomaterial (rhodamine 6G or zinc phthalocyanine) for direct-write marking is discussed.

  16. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    Directory of Open Access Journals (Sweden)

    Tomi Kauppi

    2013-01-01

    Full Text Available We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation helping to collect class label, spatial span, and expert's confidence on lesions and a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions.
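
One standard way to combine manual segmentations from several experts, as the framework requires, is pixel-wise majority voting. This sketch is illustrative only and is not necessarily the fusion rule used for DiaRetDB1:

```python
def fuse_masks(masks, threshold=0.5):
    """Combine binary lesion masks from several experts by majority vote.

    masks: list of equally sized 2-D lists of 0/1 labels, one per expert.
    A pixel is kept if at least `threshold` of the experts marked it.
    """
    n_experts = len(masks)
    rows, cols = len(masks[0]), len(masks[0][0])
    fused = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            votes = sum(m[r][c] for m in masks)
            fused[r][c] = 1 if votes / n_experts >= threshold else 0
    return fused
```

Weighting votes by each expert's stated confidence would be a natural refinement.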

  17. Method for calculating ionic and electronic defect concentrations in Y-stabilised zirconia

    Energy Technology Data Exchange (ETDEWEB)

    Poulsen, F W [Risoe National Lab., Materials Research Dept., Roskilde (Denmark)

    1997-10-01

    A numerical (trial and error) method for calculation of the concentrations of ions, vacancies and ionic and electronic defects in solids (Brouwer-type diagrams) is presented. No approximations or truncations of the set of equations describing the chemistry for the various defect regions are used. Doped zirconia and doped thoria with simultaneous presence of protonic and electronic defects are taken as examples: 7 concentrations as a function of oxygen partial pressure and/or water vapour partial pressure are determined. Realistic values for the equilibrium constants for equilibration with oxygen gas and water vapour, as well as for the internal equilibrium between holes and electrons, were taken from the literature. The present mathematical method is versatile - it has also been employed by the author to treat more complex systems, such as perovskite structure oxides with over- and under-stoichiometry in oxygen, cation vacancies and simultaneous presence of protons. (au) 6 refs.
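
The kind of trial-and-error solution described above can be sketched on a toy defect model: two mass-action equilibria plus electroneutrality, solved by bisection in log space. The equilibrium constants and dopant level below are placeholders, not the literature values used in the paper:

```python
import math

def defect_concentrations(pO2, K_i=1e-30, K_red=1e-40, acceptor=1e-4):
    """Trial-and-error (bisection) solution of a toy defect-chemistry model.

    Illustrative equilibria (concentrations as site fractions, hypothetical
    constants):
      n * p = K_i                      (electron-hole equilibrium)
      [V_O] * n**2 * pO2**0.5 = K_red  (reduction: O_O -> 1/2 O2 + V_O + 2e')
    Electroneutrality: 2[V_O] + p = n + [A'] with a fixed acceptor dopant.
    """
    def residual(n):
        v_o = K_red / (n ** 2 * math.sqrt(pO2))
        p = K_i / n
        return 2 * v_o + p - n - acceptor

    lo, hi = 1e-30, 1.0          # bracket for the electron concentration n
    for _ in range(200):          # residual is strictly decreasing in n
        mid = math.sqrt(lo * hi)  # geometric mean suits log-spaced unknowns
        if residual(mid) > 0:
            lo = mid
        else:
            hi = mid
    n = math.sqrt(lo * hi)
    return {"n": n, "p": K_i / n, "V_O": K_red / (n ** 2 * math.sqrt(pO2))}
```

Sweeping pO2 and plotting the three concentrations on log-log axes yields a Brouwer-type diagram.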

  18. System and method employing a self-organizing map load feature database to identify electric load types of different electric loads

    Science.gov (United States)

    Lu, Bin; Harley, Ronald G.; Du, Liang; Yang, Yi; Sharma, Santosh K.; Zambare, Prachi; Madane, Mayura A.

    2014-06-17

    A method identifies electric load types of a plurality of different electric loads. The method includes providing a self-organizing map load feature database of a plurality of different electric load types and a plurality of neurons, each of the load types corresponding to a number of the neurons; employing a weight vector for each of the neurons; sensing a voltage signal and a current signal for each of the loads; determining a load feature vector including at least four different load features from the sensed voltage signal and the sensed current signal for a corresponding one of the loads; and identifying by a processor one of the load types by relating the load feature vector to the neurons of the database by identifying the weight vector of one of the neurons corresponding to the one of the load types that is a minimal distance to the load feature vector.
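
The identification step of the patent, relating a load feature vector to the neuron whose weight vector lies at the minimal distance, can be sketched as follows; the four feature names and the example map are illustrative assumptions, not taken from the patent:

```python
import math

def identify_load(feature_vec, som):
    """Relate a measured load feature vector to a trained self-organizing map.

    som: list of (weight_vector, load_type) pairs, one per neuron.
    Returns the load type of the neuron whose weight vector is at the
    minimal Euclidean distance from the feature vector.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    _, load_type = min(som, key=lambda neuron: dist(neuron[0], feature_vec))
    return load_type

# Hypothetical trained map with four features per neuron, e.g.
# (real power, reactive power, current THD, crest factor) -- assumed.
som = [
    ((100.0, 5.0, 0.02, 1.4), "resistive"),
    ((60.0, 45.0, 0.08, 1.5), "motor"),
    ((40.0, 10.0, 0.60, 2.8), "switch-mode supply"),
]
```

In a real SOM several neurons map to each load type; taking the minimum over all of them, as here, still implements the patent's nearest-weight-vector rule.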

  19. From Passive to Active in the Design of External Radiotherapy Database at Oncology Institute

    Directory of Open Access Journals (Sweden)

    Valentin Ioan CERNEA

    2009-12-01

    Full Text Available The implementation in 1997 of a computer network at the Oncology Institute “Prof. Dr. Ion Chiricuţă” from Cluj-Napoca (OICN) opened the era of the electronic patient file, of which the presented database is part. The database, developed before 2000 and used until December 2006 in all OICN reports, collected data from primary documents such as radiotherapy files. The present state of the computer network makes it possible to reverse the flow of data, from the computer to the primary document: the primary document is now first built electronically inside the computer and then, after validation, printed as the familiar paper document. The paper discusses the resulting issues concerning safety, functionality and access.

  20. Practical guide to electronic resources in the humanities

    CERN Document Server

    Dubnjakovic, Ana

    2010-01-01

    From full-text article databases to digitized collections of primary source materials, newly emerging electronic resources have radically impacted how research in the humanities is conducted and discovered. This book, covering high-quality, up-to-date electronic resources for the humanities, is an easy-to-use annotated guide for the librarian, student, and scholar alike. It covers online databases, indexes, archives, and many other critical tools in key humanities disciplines including philosophy, religion, languages and literature, and performing and visual arts. Succinct overviews of key eme

  1. CHANT (CHinese ANcient Texts): a comprehensive database of all ancient Chinese texts up to 600 AD

    OpenAIRE

    Ho, Che Wah

    2006-01-01

    The CHinese ANcient Texts (CHANT) database is a long-term project which began in 1988 to build up a comprehensive database of all ancient Chinese texts up to the sixth century AD. The project is near completion and the entire database, which includes both traditional and excavated materials, will be released on the CHANT Web site (www.chant.org) in mid-2002. With more than a decade of experience in establishing an electronic Chinese literary database, we have gained much insight useful to the...

  2. Database Description - KAIKOcDNA | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us KAIKOcDNA Database Description General information of database Database name KAIKOcDNA Alter...National Institute of Agrobiological Sciences Akiya Jouraku E-mail : Database classification Nucleotide Sequence Databases Organism Taxonomy Name: Bombyx mori Taxonomy ID: 7091 Database des...rnal: G3 (Bethesda) / 2013, Sep / vol.9 External Links: Original website information Database maintenance si...available URL of Web services - Need for user registration Not available About This Database Database

  3. Download - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Trypanosomes Database Download First of all, please read the license of this database. Data ...1.4 KB) Simple search and download Download via FTP FTP server is sometimes jammed. If it is, access [here]. About This Database Database Description Download License Update History of This Database Site Policy | Contact Us Download - Trypanosomes Database | LSDB Archive ...

  4. Database on aircraft accidents

    International Nuclear Information System (INIS)

    Nishio, Masahide; Koriyama, Tamio

    2012-09-01

    The Reactor Safety Subcommittee of the Nuclear Safety and Preservation Committee published the report 'The criteria on assessment of probability of aircraft crash into light water reactor facilities' as the standard method for evaluating the probability of aircraft crash into nuclear reactor facilities in July 2002. In response to the report, the Japan Nuclear Energy Safety Organization has been collecting open information on aircraft accidents of commercial airplanes, self-defense force (SDF) airplanes and US force airplanes every year since 2003, sorting them out and developing the database of aircraft accidents for the latest 20 years, to evaluate the probability of aircraft crash into nuclear reactor facilities. This year, the database was revised by adding aircraft accidents in 2010 to the existing database and deleting aircraft accidents in 1991 from it, resulting in the revised 2011 database covering the latest 20 years, from 1991 to 2010. Furthermore, flight information on commercial aircraft was also collected to develop the flight database for the latest 20 years, from 1991 to 2010, to evaluate the probability of aircraft crash into reactor facilities. The method for developing the database of aircraft accidents is based on the report 'The criteria on assessment of probability of aircraft crash into light water reactor facilities' described above. The 2011 revised database for the latest 20 years, from 1991 to 2010, shows the following. The trends in the 2011 database change little compared with last year's. (1) The data on commercial aircraft accidents are based on the 'Aircraft accident investigation reports of the Japan Transport Safety Board' of the Ministry of Land, Infrastructure, Transport and Tourism. 4 large fixed-wing aircraft accidents, 58 small fixed-wing aircraft accidents, 5 large bladed aircraft accidents and 114 small bladed aircraft accidents occurred. The relevant accidents for evaluating
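
In its simplest uniform fly-over form, an assessment of this kind reduces to a crash frequency per unit area multiplied by the facility's effective target area; the numbers below are made up for illustration and the actual criteria are more detailed:

```python
def crash_probability(accidents, years, region_area_km2, target_area_km2):
    """Crude fly-over model: expected annual crash frequency onto a facility,
    assuming crashes are distributed uniformly over the region. A gross
    simplification of the assessment criteria the report implements.
    """
    rate_per_km2_year = accidents / (years * region_area_km2)
    return rate_per_km2_year * target_area_km2
```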

  5. Examining the symptom of fatigue in primary care: a comparative study using electronic medical records

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2015-01-01

    Full Text Available Background The symptom of fatigue is one of the top five most frequently presented health complaints in primary care, yet it remains underexplored in the Canadian primary care context.Objective The objective of this study was to examine the prevalence and impact of patients presenting with fatigue in primary care, using the only known electronic database in Canada to capture patient-reported symptoms.Methods Data were extracted from the Deliver Primary Healthcare Information (DELPHI database, an electronic medical record database located in Ontario, Canada. Patients were identified using the International Classification of Primary Care, Revised Second Edition coding system. Two groups of patients (fatigue or non-fatigue symptom were followed for one year and compared. Both descriptive and multivariable analyses were conducted.Results A total of 103 fatigue symptom patients, and 103 non-fatigue symptom patients, were identified in the DELPHI database. The period prevalence of fatigue presentation was 8.2%, with the majority of patients being female and over 60 years of age. These patients experienced numerous co-occurring morbidities, in addition to the fatigue itself. During the one year follow-up period, fatigue symptom patients had significantly higher rates of subsequent visits (IRR = 1.19, p = 0.038 and investigations (IRR = 1.68, p < 0.001, and markedly high levels of referrals following their index visit.Conclusions This research used an electronic database to examine the symptom, fatigue. Using these data, fatigue symptom patients were found to have higher rates of health care utilisation, compared to non-fatigue symptom patients.
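
The incidence rate ratios quoted above (e.g. IRR = 1.19 for subsequent visits) come from regression models; a crude, unadjusted IRR with a Wald confidence interval can nevertheless be sketched to show what the statistic measures. The counts below are hypothetical:

```python
import math

def incidence_rate_ratio(events_a, time_a, events_b, time_b):
    """Crude incidence rate ratio with a Wald 95% CI on the log scale.

    The DELPHI study reports adjusted IRRs from regression; this unadjusted
    calculation only illustrates the statistic itself.
    """
    irr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(IRR)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    return irr, (lo, hi)
```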

  6. Reusability of coded data in the primary care electronic medical record : A dynamic cohort study concerning cancer diagnoses

    NARCIS (Netherlands)

    Sollie, Annet; Sijmons, Rolf H.; Helsper, Charles W.; Numans, Mattijs E.

    Objectives: To assess quality and reusability of coded cancer diagnoses in routine primary care data. To identify factors that influence data quality and areas for improvement. Methods: A dynamic cohort study in a Dutch network database containing 250,000 anonymized electronic medical records (EMRs)

  7. Statistics of electron multiplication in a multiplier phototube; Iterative method; Estadistica de la multiplicacion de electrones en un fotomultiplicador: Metodos iterativos

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, J F; Grau, A

    1985-07-01

    In the present paper an iterative method is applied to study the variation of the dynode response in a multiplier phototube. Three different situations are considered, corresponding to the following ways in which electrons are incident on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r electrons. The responses are given for a number of steps between 1 and 5, and for values of the multiplication factor of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (Author) 11 refs.
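
The iterative treatment of multiplication statistics can be sketched with the standard Galton-Watson recursion for the mean and variance of the cascade, assuming Poisson secondary emission at each dynode (an assumption of this sketch, not necessarily of the paper):

```python
def cascade_moments(gain, stages, n0=1):
    """Iteratively propagate the mean and variance of the electron number
    through the dynode chain (Galton-Watson recursion).

    Assumes each electron releases a Poisson-distributed number of
    secondaries with mean `gain` at every dynode, so the per-electron
    offspring variance equals `gain`.
    """
    mean, var = float(n0), 0.0   # exactly n0 electrons hit the first dynode
    for _ in range(stages):
        # law of total variance for a random sum of i.i.d. Poisson counts
        var = gain ** 2 * var + mean * gain
        mean = gain * mean
    return mean, var
```

Setting `n0` to r, or drawing it from a distribution, reproduces the other two incidence cases the paper considers (for a random n0 the initial variance would be nonzero).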

  8. License - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Arabidopsis Phenome Database License License to Use This Database Last updated : 2017/02/27 You may use this database...cense specifies the license terms regarding the use of this database and the requirements you must follow in using this database.... The license for this database is specified in the Creative ...Commons Attribution-Share Alike 4.0 International . If you use data from this database, please be sure attribute this database...ative Commons Attribution-Share Alike 4.0 International is found here . With regard to this database, you ar

  9. Study and Handling Methods of Power IGBT Module Failures in Power Electronic Converter Systems

    DEFF Research Database (Denmark)

    Choi, Uimin; Blaabjerg, Frede; Lee, Kyo-Beum

    2015-01-01

    Power electronics plays an important role in a wide range of applications in order to achieve high efficiency and performance. Increasing efforts are being made to improve the reliability of power electronics systems to ensure compliance with more stringent constraints on cost, safety, and availability in different applications. This paper presents an overview of the major failure mechanisms of IGBT modules and their handling methods in power converter systems improving reliability. The major failure mechanisms of IGBT modules are presented first, and methods for predicting lifetime and estimating the junction temperature of IGBT modules are then discussed. Subsequently, different methods for detecting open- and short-circuit faults are presented. Finally, fault-tolerant strategies for improving the reliability of power electronic systems under field operation are explained and compared.
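
Lifetime prediction for IGBT modules of the kind surveyed here commonly uses an empirical Coffin-Manson-Arrhenius model linking cycles to failure to the junction temperature swing and its mean; the model form is standard, but the constants below are placeholders, not fitted values from the paper:

```python
import math

def cycles_to_failure(dTj, Tj_mean_K, A=3.4e14, n=5.0, Ea_eV=0.8):
    """Coffin-Manson-Arrhenius power-cycling lifetime model:
        Nf = A * dTj**-n * exp(Ea / (k_B * Tjm))
    dTj: junction temperature swing (K); Tj_mean_K: mean junction
    temperature (K). A, n and Ea must be fitted to test data; the
    defaults here are illustrative only.
    """
    k_B = 8.617e-5  # Boltzmann constant, eV/K
    return A * dTj ** -n * math.exp(Ea_eV / (k_B * Tj_mean_K))
```

The qualitative behaviour is what matters: larger temperature swings and hotter mean junction temperatures both shorten the predicted lifetime.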

  10. Marker list - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods ...Database Site Policy | Contact Us Marker list - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive ...

  11. FDA toxicity databases and real-time data entry

    International Nuclear Information System (INIS)

    Arvidson, Kirk B.

    2008-01-01

    Structure-searchable electronic databases are valuable new tools that are assisting the FDA in its mission to promptly and efficiently review incoming submissions for regulatory approval of new food additives and food contact substances. The Center for Food Safety and Applied Nutrition's Office of Food Additive Safety (CFSAN/OFAS), in collaboration with Leadscope, Inc., is consolidating genetic toxicity data submitted in food additive petitions from the 1960s to the present day. The Center for Drug Evaluation and Research, Office of Pharmaceutical Science's Informatics and Computational Safety Analysis Staff (CDER/OPS/ICSAS) is separately gathering similar information from their submissions. Presently, these data are distributed in various locations such as paper files, microfiche, and non-standardized toxicology memoranda. The organization of the data into a consistent, searchable format will reduce paperwork, expedite the toxicology review process, and provide valuable information to industry that is currently available only to the FDA. Furthermore, by combining chemical structures with genetic toxicity information, biologically active moieties can be identified and used to develop quantitative structure-activity relationship (QSAR) modeling and testing guidelines. Additionally, chemicals devoid of toxicity data can be compared to known structures, allowing for improved safety review through the identification and analysis of structural analogs. Four database frameworks have been created: bacterial mutagenesis, in vitro chromosome aberration, in vitro mammalian mutagenesis, and in vivo micronucleus. Controlled vocabularies for these databases have been established. The four separate genetic toxicity databases are compiled into a single, structurally-searchable database for easy accessibility of the toxicity information. 
Beyond the genetic toxicity databases described here, additional databases for subchronic, chronic, and teratogenicity studies have been prepared.

  12. DATABASES DEVELOPED IN INDIA FOR BIOLOGICAL SCIENCES

    Directory of Open Access Journals (Sweden)

    Gitanjali Yadav

    2017-09-01

    Full Text Available The complexity of biological systems requires use of a variety of experimental methods with ever increasing sophistication to probe various cellular processes at molecular and atomic resolution. The availability of technologies for determining nucleic acid sequences of genes and atomic resolution structures of biomolecules prompted development of major biological databases like GenBank and PDB almost four decades ago. India was one of the few countries to realize early the utility of such databases for progress in modern biology/biotechnology. The Department of Biotechnology (DBT), India, established the Biotechnology Information System (BTIS) network in the late eighties. Starting with the genome sequencing revolution at the turn of the century, application of high-throughput sequencing technologies in biology and medicine for analysis of genomes, transcriptomes, epigenomes and microbiomes has generated massive volumes of sequence data. The BTIS network has not only provided state-of-the-art computational infrastructure to research institutes and universities for utilizing various biological databases developed abroad in their research, it has also actively promoted research and development (R&D) projects in Bioinformatics to develop a variety of biological databases in diverse areas. It is encouraging to note that a large number of biological databases or data-driven software tools developed in India have been published in leading peer-reviewed international journals like Nucleic Acids Research, Bioinformatics, Database, BMC, PLoS and NPG series publications. Some of these databases are not only unique, they are also highly accessed, as reflected in their number of citations. Apart from databases developed by individual research groups, BTIS has initiated consortium projects to develop major India-centric databases on Mycobacterium tuberculosis, Rice and Mango, which can potentially have practical applications in health and agriculture. Many of these biological

  13. Organizational-methodical provisions for the audit of operations with electronic money

    Directory of Open Access Journals (Sweden)

    Semenetz A.P.

    2017-06-01

    Full Text Available To obtain objective and unbiased information about the accuracy and completeness of electronic money transactions at an enterprise, it is necessary to conduct an audit. The results of an external audit of electronic money transactions help the company’s management personnel to assess the efficiency and rationality of using such a modern means of payment as electronic money, and to verify the proper functioning of the internal control service. The work substantiates the organizational and methodical provisions of the external audit of transactions with electronic money, clarifying the organizational provisions for conducting such an audit, namely the definition of its purpose, tasks, subjects and objects, and sources of information. Accordingly, the purpose of an audit of operations with electronic money is to provide the auditor’s unbiased opinion on the reliability of the financial statements of the enterprise with respect to operations with electronic money. Within the scope of this study, the object of the external audit is operations with electronic money: since electronic money is a new and contemporary object of accounting, the development of a scientifically grounded order for conducting an external audit of this object is necessary. The subject of an external audit of electronic money transactions is the set of business transactions settled in electronic money, that is, transactions involving their acquisition and redemption, and the accuracy of the information about them displayed in the financial statements. In the course of the study, the order of external audit procedures across the stages of an electronic money audit at the enterprise was determined, which makes it possible to confirm the correctness of the accounting of such a new and modern means of payment as electronic money. These proposals are aimed

  14. Pretreatment of Cellulose By Electron Beam Irradiation Method

    Science.gov (United States)

    Jusri, N. A. A.; Azizan, A.; Ibrahim, N.; Salleh, R. Mohd; Rahman, M. F. Abd

    2018-05-01

    Pretreatment of lignocellulosic biomass (LCB) to produce biofuel has been conducted using various methods, including physical, chemical, physicochemical and biological ones. The conversion process to bioethanol typically involves several steps: pretreatment, hydrolysis, fermentation and separation. In this project, microcrystalline cellulose (MCC) was used in place of LCB, since cellulose is the largest constituent of LCB, for the purpose of investigating the effectiveness of a new pretreatment method using radiation technology. Irradiation at different doses (100 kGy to 1000 kGy) was conducted using the electron beam accelerator at Agensi Nuklear Malaysia. Fourier Transform Infrared Spectroscopy (FTIR) and X-Ray Diffraction (XRD) analyses were performed to further understand the effect of the suggested pretreatment step on the MCC. Through this method, namely IRR-LCB, an optimal pretreatment condition prior to the production of biofuel from LCB may be introduced.
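
One common way to quantify the XRD-observed effect of irradiation dose on cellulose is the Segal crystallinity index; the abstract does not state which index was used, so this is only a plausible analysis step:

```python
def segal_cri(i_002, i_am):
    """Segal crystallinity index (%) from two XRD intensities: the 002
    peak (around 22.5 deg 2-theta for cellulose I) and the amorphous
    minimum (around 18 deg 2-theta).  CrI = (I002 - Iam) / I002 * 100.
    A drop in CrI with dose would indicate irradiation-induced
    amorphization of the MCC.
    """
    return (i_002 - i_am) / i_002 * 100.0
```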

  15. Database Description - AcEST | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description General information of database Database name AcEST Alternative n...hi, Tokyo-to 192-0397 Tel: +81-42-677-1111(ext.3654) E-mail: Database classificat...eneris Taxonomy ID: 13818 Database description This is a database of EST sequences of Adiantum capillus-vene...(3): 223-227. External Links: Original website information Database maintenance site Plant Environmental Res...base Database Description Download License Update History of This Database Site Policy | Contact Us Database Description - AcEST | LSDB Archive ...

  16. License - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SKIP Stemcell Database License License to Use This Database Last updated : 2017/03/13 You may use this database...specifies the license terms regarding the use of this database and the requirements you must follow in using this database.... The license for this database is specified in the Creative Commons Attribution-Share Alike 4.0 International . If you use data from this database, please be sure attribute this database...al ... . The summary of the Creative Commons Attribution-Share Alike 4.0 International is found here . With regard to this database

  17. Development of database for the divertor recycling in JT-60U and its analysis

    Energy Technology Data Exchange (ETDEWEB)

    Takizuka, Tomonori; Shimizu, Katsuhiro; Hayashi, Nobuhiko; Asakura, Nobuyuki [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment; Arakawa, Kazuya [Komatsu, Ltd., Tokyo (Japan)

    2003-05-01

    We have developed a database for the divertor recycling in JT-60U plasmas. This database makes it possible to investigate behaviors of the neutral-particle flux in plasmas and the ion flux to divertor plates under a condition for core-plasma parameters, such as electron density and heating power. The correlation between the electron density and the heating power is not strong in this database, and parameter scans for the density and the power in wide ranges are realized. On the basis of this database, we have analyzed the ion flux to divertor plates. The divertor-plate ion flux amplified by the recycling grows nonlinearly with the increase of the electron density n{sub e}. Its averaged dependence is a linear growth ({approx}n{sub e}{sup 1.0}) at the low density, and becomes a nonlinear growth ({approx}n{sub e}{sup 1.5}) at the high density. The spread of dependence from the averaged one is very large. This spread is caused mainly by complex physical characteristics of divertor plasmas, though it is little dependent on the heating power. The behavior of ion flux depends strongly on divertor configurations and divertor-plate/first-wall conditions. It is confirmed that the bifurcated transition takes place from the low-recycling divertor plasma at the low density to the high-recycling divertor plasma at the high density. The density at the transition is nearly proportional to the 1/4 power of the heating power. (author)
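
A local density exponent like the ~n_e^1.0 and ~n_e^1.5 scalings quoted above can be extracted from such a database by a least-squares line fit on log-log axes; the sketch below uses synthetic data, not the JT-60U measurements:

```python
import math

def fit_power_law(ne, flux):
    """Least-squares fit of flux = a * ne**b on log-log axes.

    Taking logs turns the power law into a straight line, whose slope b
    is the density exponent of the divertor-plate ion flux.
    """
    xs = [math.log(v) for v in ne]
    ys = [math.log(v) for v in flux]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b
```

Fitting low-density and high-density subsets separately would reproduce the paper's transition from a linear to a nonlinear growth of the flux.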

  18. Electronic Mail in the Library: A Perspective.

    Science.gov (United States)

    Whitaker, Becki

    1989-01-01

    Provides an overview of electronic mail in libraries. Topics discussed include general business applications; interlibrary loan; acquisition and claims systems; document delivery; complete text journal databases; reference requests and answers; obstacles to electronic mail usage; telecommunications; cost factors; and the impact of voice mail and…

  19. Simulation of therapeutic electron beam tracking through a non-uniform magnetic field using finite element method.

    Science.gov (United States)

    Tahmasebibirgani, Mohammad Javad; Maskani, Reza; Behrooz, Mohammad Ali; Zabihzadeh, Mansour; Shahbazian, Hojatollah; Fatahiasl, Jafar; Chegeni, Nahid

    2017-04-01

    In radiotherapy, megaelectron-volt (MeV) electron beams are employed for treatment of superficial cancers. Magnetic fields can be used to deflect and deform the electron flow, but the field produced by permanent magnets is non-uniform and the primary electrons are neither mono-energetic nor perfectly parallel, so calculating the beam deflection requires complex mathematical methods. In this study, a device was built to apply a magnetic field to an electron beam, and the electron paths through that field were simulated using the finite element method. A mini-applicator equipped with two neodymium permanent magnets was designed so that the distance between the magnets can be tuned, and this device was placed in a standard applicator of a Varian 2100 CD linear accelerator. The mini-applicator was simulated in the CST Studio finite element software, and the deflection angle and displacement of the electron beam after passing through the magnetic field were calculated. By setting the distance between the two poles to 2 to 5 cm, transverse magnetic fields of various intensities were created. The accelerator head was rotated so that the deflected electrons struck the water surface vertically. To measure the displacement of the electron beam, EBT2 GafChromic films were employed; after exposure, the films were scanned with an HP G3010 reflection scanner and their optical density was extracted with a program written in MATLAB. The measured displacement of the electron beam was compared with the simulation results, and the simulated magnetic field showed good agreement with the measured values. The maximum deflection angle, 32.9°, was obtained for the 12 MeV beam, and the minimum, 12.1°, for the 15 MeV beam. The film measurements confirmed the precision of the simulation in predicting the displacement of the electron beam. In summary, a magnetic mini-applicator was made and simulated using the finite element method, and the deflection angle and displacement of the electron beam were calculated.
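    The energy dependence reported above (a larger bend for the 12 MeV beam than for the 15 MeV beam) can be reproduced with a simple relativistic tracking sketch. This replaces the paper's finite-element field map with a uniform transverse field; the field strength and gap length below are assumed values, not the study's parameters.

    ```python
    import math

    E_REST = 0.511          # electron rest energy, MeV
    C = 2.998e8             # speed of light, m/s
    Q_OVER_M = 1.7588e11    # electron charge-to-mass ratio, C/kg

    def deflection(kinetic_mev, b_tesla, length_m, steps=20000):
        """Track an electron through a uniform transverse field of extent
        `length_m`; return (deflection angle in degrees, lateral
        displacement in metres).  A magnetic field does no work, so |v|
        is constant and only the direction of motion is integrated."""
        gamma = 1.0 + kinetic_mev / E_REST
        beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
        v = beta * C
        omega = Q_OVER_M * b_tesla / gamma   # relativistic gyro-frequency
        dt = length_m / v / steps
        theta = x = z = 0.0
        while z < length_m:
            theta += omega * dt              # rotate the velocity vector
            z += v * math.cos(theta) * dt    # advance along the beam axis
            x += v * math.sin(theta) * dt    # accumulate lateral offset
        return math.degrees(theta), x
    ```

    Because the gyro-frequency scales as 1/gamma, the stiffer 15 MeV beam bends less than the 12 MeV beam for the same field, which is the trend the abstract reports.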

  20. Study of developing a database of energy statistics

    Energy Technology Data Exchange (ETDEWEB)

    Park, T.S. [Korea Energy Economics Institute, Euiwang (Korea, Republic of)

    1997-08-01

    An integrated energy database should be prepared in advance for comprehensive management of energy statistics. However, since developing an integrated energy database requires considerable manpower and budget, it is difficult to establish one within a short period of time. Therefore, as a first stage of the energy database work, this study aims to develop methods for analyzing existing statistical data lists and consolidating insufficient data, and at the same time to analyze general database concepts and data structures. I also studied, as overseas cases, the data content and items of the energy databases operated by international energy-related organizations such as the IEA and APEC and by Japan and the USA, as well as the domestic state of energy databases and the hardware operating systems of the Japanese databases. I analyzed the production system of Korean energy databases, discussed the KEDB system, which is representative of total energy databases, and present design concepts for new energy databases. In addition, by analyzing the Korean energy statistical data and comparing them with the OECD/IEA system, I present the directions and contents for establishing future Korean energy databases, the data that should be collected as supply and demand statistics, and the organization of data collection. 26 refs., 15 figs., 11 tabs.
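    The supply-and-demand statistics described above map naturally onto a small relational schema. The sketch below is illustrative only; the table and column names are assumptions and are not taken from the KEDB design.

    ```python
    import sqlite3

    # In-memory database standing in for an integrated energy statistics store.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE energy_source (
        source_id  INTEGER PRIMARY KEY,
        name       TEXT NOT NULL            -- e.g. coal, oil, LNG, electricity
    );
    CREATE TABLE energy_balance (
        source_id  INTEGER NOT NULL REFERENCES energy_source(source_id),
        year       INTEGER NOT NULL,
        flow       TEXT NOT NULL,           -- 'supply' or 'demand'
        sector     TEXT NOT NULL,           -- industry, transport, residential...
        quantity   REAL NOT NULL,           -- in a common unit, e.g. TOE
        PRIMARY KEY (source_id, year, flow, sector)
    );
    """)
    conn.execute("INSERT INTO energy_source VALUES (1, 'coal')")
    conn.execute("INSERT INTO energy_balance VALUES (1, 1997, 'supply', 'total', 34.2)")

    # Aggregate query of the kind a statistics database must support.
    total, = conn.execute(
        "SELECT SUM(quantity) FROM energy_balance WHERE year = 1997"
    ).fetchone()
    ```

    Keeping every figure in a single common unit in the balance table is one way to make cross-source and cross-year comparisons (such as against OECD/IEA tables) straightforward.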