WorldWideScience

Sample records for computer database application

  1. Computer Application Of Object Oriented Database Management ...

    African Journals Online (AJOL)

    Object Oriented Systems (OOS) have been widely adopted in software engineering because of their superiority with respect to data extensibility. The present trend in the software engineering process (SEP) towards concurrent computing raises novel concerns for the facilities and technology available in database ...

  2. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  3. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Lewis, G.; Howman-Giles, R.

    1999-01-01

    Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images are presented and reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage is rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on a personal computer database is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database

  4. Some Aspects of Process Computers Configuration Control in Nuclear Power Plant Krsko - Process Computer Signal Configuration Database (PCSCDB)

    International Nuclear Information System (INIS)

    Mandic, D.; Kocnar, R.; Sucic, B.

    2002-01-01

    During the operation of NEK and other nuclear power plants it has been recognized that certain issues related to the usage of digital equipment and associated software in NPP technological process protection, control and monitoring are not adequately addressed in the existing programs and procedures. The term and the process of Process Computer Configuration Control join three 10CFR50 Appendix B quality requirements of process computer application in NPPs: Design Control, Document Control, and Identification and Control of Materials, Parts and Components. This paper describes the Process Computer Signal Configuration Database (PCSCDB), which was developed and implemented in order to resolve some aspects of Process Computer Configuration Control related to the signals or database points that exist in the life cycle of different Process Computer Systems (PCS) in Nuclear Power Plant Krsko. PCSCDB is a controlled master database for the definition and description of the configurable database points associated with all Process Computer Systems in NEK. It holds attributes related to the configuration of addressable and configurable real-time database points, as well as attributes related to signal life cycle references and history data, such as:
    - Input/Output signals, manually input database points, program constants, setpoints, and database points calculated by application programs or SCADA calculation tools
    - Control flags (for example, enabling or disabling a certain program feature)
    - Signal acquisition design references to the DCM (Document Control Module, application software for document control within the Management Information System, MIS) and MECL (Master Equipment and Component List, MIS application software for identification and configuration control of plant equipment and components)
    - Usage of particular database points in particular application software packages and in man-machine interface features (display mimics, printout reports, ...)
    - Signals history (EEAR Engineering

  5. Computer application for database management and networking of service radio physics

    International Nuclear Information System (INIS)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-01-01

    Databases in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the centre's computer network. A computer that acts as the server provides the database to the treatment units for the daily recording of quality control measurements and incidents. To avoid the common problems of shortcuts that stop working after data migration, duplicated data, and erroneous data loss caused by errors in network connections, we proceeded to manage connections and database access centrally, easing maintenance and making use common to all service personnel.

  6. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  7. Design of multi-tiered database application based on CORBA component

    International Nuclear Information System (INIS)

    Sun Xiaoying; Dai Zhimin

    2003-01-01

    As computer technology developed rapidly, middleware technology changed the traditional two-tier database system. The multi-tiered database system, consisting of client application programs, application servers and database servers, is now widely applied, and building multi-tiered database systems from CORBA components has become the mainstream technique. In this paper, the example of the DUV-FEL database system is presented, and the realization of a multi-tiered database based on CORBA components is discussed. (authors)

  8. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner dos Santos

    2016-01-01

    The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN), in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to safety, reliability, availability and maintainability of these facilities. This database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they will be able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, which is in accordance with the protocols installed in the IPEN intranet. The open-source relational database management system MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database, named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)
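    The abstract names MySQL, PHP and an entity-relationship model. A rough sketch of what such a component-reliability schema and a typical PSA query might look like follows; the table and column names are invented, and sqlite3 stands in for the MySQL server so the snippet is self-contained:

        # Minimal sketch of an entity-relationship schema for a component
        # reliability database such as PSADB. Table and column names are
        # illustrative assumptions; sqlite3 stands in for the MySQL server
        # described in the abstract.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE component (
            component_id INTEGER PRIMARY KEY,
            name         TEXT NOT NULL,
            facility     TEXT NOT NULL   -- e.g. a research reactor at IPEN
        );
        CREATE TABLE failure_event (
            event_id     INTEGER PRIMARY KEY,
            component_id INTEGER NOT NULL REFERENCES component(component_id),
            event_date   TEXT NOT NULL,
            description  TEXT
        );
        """)

        conn.execute("INSERT INTO component VALUES (1, 'primary pump', 'IEA-R1')")
        conn.execute("INSERT INTO failure_event VALUES (1, 1, '2015-03-02', 'seal leak')")

        # Failure counts per component: the kind of query a PSA study would run.
        for row in conn.execute("""
            SELECT c.name, COUNT(*) FROM failure_event f
            JOIN component c ON c.component_id = f.component_id
            GROUP BY c.name"""):
            print(row)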

  9. Distributed Database Access in the LHC Computing Grid with CORAL

    CERN Document Server

    Molnár, Z; Düllmann, D; Giacomo, G; Kalkhof, A; Valassi, A; CERN. Geneva. IT Department

    2009-01-01

    The CORAL package is the LCG Persistency Framework foundation for accessing relational databases. From the start CORAL has been designed to facilitate the deployment of the LHC experiment database applications in a distributed computing environment. In particular we cover:
    - improvements to database service scalability by client connection management
    - platform-independent, multi-tier scalable database access by connection multiplexing and caching
    - a secure authentication and authorisation scheme integrated with existing grid services.
    We will summarize the deployment experience from several experiment productions using the distributed database infrastructure, which is now available in LCG. Finally, we present perspectives for future developments in this area.

  10. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  11. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access is allowed only to previously registered professionals. Data updating, editing and searching tasks are controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database management system and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  12. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access is allowed only to previously registered professionals. Data updating, editing and searching tasks are controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database management system and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  13. Computer applications in radiation protection

    International Nuclear Information System (INIS)

    Cole, P.R.; Moores, B.M.

    1995-01-01

    Computer applications in general and diagnostic radiology in particular are becoming more widespread. Their application to the field of radiation protection in medical imaging, including quality control initiatives, is similarly becoming more widespread. Advances in computer technology have enabled departments of diagnostic radiology to have access to powerful yet affordable personal computers. The application of databases, expert systems and computer-based learning is under way. The executive information systems for the management of dose and QA data that are under way at IRS are discussed. An important consideration in developing these pragmatic software tools has been the range of computer literacy within the end-user group. User interfaces have been specifically designed to reflect the requirements of the many end users who will have little or no computer knowledge. (Author)

  14. Nuclear plant operations, maintenance, and configuration management using three-dimensional computer graphics and databases

    International Nuclear Information System (INIS)

    Tutos, N.C.; Reinschmidt, K.F.

    1987-01-01

    Stone and Webster Engineering Corporation has developed the Plant Digital Model concept as a new approach to Configuration Management of nuclear power plants. The Plant Digital Model development is a step-by-step process, based on existing manual procedures and computer applications, and is fully controllable by the plant managers and engineers. The Plant Digital Model is based on IBM computer graphics and relational database management systems, and therefore can be easily integrated with existing plant databases and corporate management-information systems

  15. An algorithm of discovering signatures from DNA databases on a computer cluster.

    Science.gov (United States)

    Lee, Hsiao Ping; Sheu, Tzu-Fang

    2014-10-05

    Signatures are short sequences that are unique, not similar to any other sequence in a database, and can therefore be used as the basis to identify different species. Even though several signature discovery algorithms have been proposed in the past, they require the entire database to be loaded into memory, restricting the amount of data they can process and making them unable to handle large databases. Moreover, those algorithms use sequential models and have slower discovery speeds, meaning that their efficiency can be improved. In this research, we introduce a divide-and-conquer strategy into signature discovery and propose a parallel signature discovery algorithm that runs on a computer cluster. The algorithm applies the divide-and-conquer strategy to solve the problem that prevents the existing algorithms from processing large databases, and uses a parallel computing mechanism to effectively improve the efficiency of signature discovery. Even when run with just the memory of regular personal computers, the algorithm can still process large databases, such as the human whole-genome EST database, which the existing algorithms were previously unable to process. The algorithm proposed in this research is not limited by the amount of usable memory and can rapidly find signatures in large databases, making it useful in applications such as Next Generation Sequencing and other large-database analysis and processing. The implementation of the proposed algorithm is available at http://www.cs.pu.edu.tw/~fang/DDCSDPrograms/DDCSD.htm.
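    The divide-and-conquer idea can be sketched in a few lines: split the database into chunks, process the chunks in parallel, and merge the partial results. The snippet below uses a simplified k-mer uniqueness test and is not the DDCSD algorithm itself:

        # Toy sketch of the divide-and-conquer strategy: split the sequence
        # database into chunks, count k-mer occurrences per chunk in parallel,
        # then merge. A k-mer seen exactly once across all chunks is a
        # candidate signature. This simplifies the real uniqueness criterion.
        from collections import Counter
        from multiprocessing import Pool

        K = 8  # signature length; an arbitrary choice for this sketch

        def count_kmers(chunk):
            """Count every k-mer occurring in one chunk of sequences."""
            counts = Counter()
            for seq in chunk:
                for i in range(len(seq) - K + 1):
                    counts[seq[i:i + K]] += 1
            return counts

        if __name__ == "__main__":
            database = ["ACGTACGTGGCATTGACGT",
                        "TTGACGTTCCAGGACGTAC",
                        "GGCATTGACCTTAGGCATT"]
            chunks = [database[i::2] for i in range(2)]   # divide
            with Pool(2) as pool:
                partials = pool.map(count_kmers, chunks)  # conquer in parallel
            total = sum(partials, Counter())              # merge
            signatures = [kmer for kmer, n in total.items() if n == 1]
            print(len(signatures), "candidate signatures")

    Because each worker only ever sees its own chunk, no process needs the whole database in memory, which is the property the paper exploits at cluster scale.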

  16. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

    This book constitutes the refereed proceedings of the 16th International Conference on Database and Expert Systems Applications, DEXA 2005, held in Copenhagen, Denmark, in August 2005. The 92 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 390 submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, XML schemata, query evaluation, semantic processing, information retrieval, temporal and spatial databases, querying XML, organisational aspects of databases, natural language processing, ontologies, Web data extraction, semantic Web, data stream management, data extraction, and distributed database systems.

  17. Database Application Schema Forensics

    Directory of Open Access Journals (Sweden)

    Hector Quintus Beyers

    2014-12-01

    Full Text Available The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database, or operators of the database can be altered to deliver incorrect results when used in queries. This paper discusses categories of possibilities that exist to alter the application schema, with some practical examples. Two forensic environments in which a forensic investigation can take place are introduced, and arguments are provided for why these environments are important. Methods are presented for how these environments can be achieved for the application schema layer of a DBMS. A process is proposed for how forensic evidence should be extracted from the application schema layer of a DBMS. The application schema forensic evidence identification process can be applied to a wide range of forensic settings.
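    As a purely hypothetical illustration of the kind of schema-layer tampering described above (not an example from the paper; sqlite3 is used so the snippet runs anywhere), a view can be redefined so that every query routed through it silently omits rows, while the base table remains intact:

        # Illustration of application-schema tampering: a view is redefined so
        # that queries silently omit rows. A forensic examiner comparing the
        # schema layer against a clean reference copy would detect the change.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE payments (id INTEGER PRIMARY KEY, payee TEXT, amount REAL);
        INSERT INTO payments VALUES (1, 'alice', 100.0), (2, 'mallory', 9999.0);
        -- Original, honest application-schema view:
        CREATE VIEW v_payments AS SELECT * FROM payments;
        """)
        print(conn.execute("SELECT COUNT(*) FROM v_payments").fetchone())  # (2,)

        # The schema layer is altered, hiding one payee from all queries that
        # go through the view -- the base table itself is untouched.
        conn.executescript("""
        DROP VIEW v_payments;
        CREATE VIEW v_payments AS SELECT * FROM payments WHERE payee <> 'mallory';
        """)
        print(conn.execute("SELECT COUNT(*) FROM v_payments").fetchone())  # (1,)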

  18. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    Md Saion Salikin; Muhammad Farid Abdul Khalid

    2002-01-01

    A Quality Assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme the performance of an x-ray machine used for diagnostic purposes is tested by using the approved procedure, which is commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder issued under the Atomic Energy Licensing Act 1984. There are a few computer applications (software) available in the market which can be used for this purpose. A computer application (software) using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis and storage as well as report writing of the quality control tests. In this paper important features of the software for quality control tests are explained in brief. A simple database, linked to the software, has been established for this purpose. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)

  19. Database computing in HEP

    International Nuclear Information System (INIS)

    Day, C.T.; Loken, S.; MacFarlane, J.F.; May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Baden, A.

    1992-01-01

    The major SSC experiments are expected to produce up to 1 Petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes based on relational and object-oriented databases of CDF data samples

  20. Artist Material BRDF Database for Computer Graphics Rendering

    Science.gov (United States)

    Ashbaugh, Justin C.

    The primary goal of this thesis was to create a physical library of artist material samples. This collection provides necessary data for the development of a gonio-imaging system for use in museums to more accurately document their collections. A sample set was produced consisting of 25 panels and containing nearly 600 unique samples. Selected materials are representative of those commonly used by artists both past and present. These take into account the variability in visual appearance resulting from the materials and application techniques used. Five attributes of variability were identified including medium, color, substrate, application technique and overcoat. Combinations of these attributes were selected based on those commonly observed in museum collections and suggested by surveying experts in the field. For each sample material, image data is collected and used to measure an average bi-directional reflectance distribution function (BRDF). The results are available as a public-domain image and optical database of artist materials at art-si.org. Additionally, the database includes specifications for each sample along with other information useful for computer graphics rendering such as the rectified sample images and normal maps.

  1. FaceWarehouse: a 3D facial expression database for visual computing.

    Science.gov (United States)

    Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun

    2014-03-01

    We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of her different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, kiss, etc. For every RGBD raw data record, a set of facial feature points on the color image such as eye corners, mouth contour, and the nose tip are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, for every person in our database, there is a much richer matching collection of expressions, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
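    The assembly of fitted meshes into a rank-3 tensor and the two-attribute bilinear model can be illustrated with a few lines of numpy; all dimensions and weight choices below are invented for the sketch and do not come from the FaceWarehouse data:

        # Sketch of evaluating a bilinear face model of the kind described: a
        # core tensor C (vertices x identities x expressions) contracted with
        # an identity weight vector and an expression weight vector yields one
        # face mesh. All dimensions here are made up for illustration.
        import numpy as np

        n_verts, n_id, n_exp = 3 * 1000, 50, 20      # 1000 3-D vertices
        core = np.random.rand(n_verts, n_id, n_exp)  # stands in for the learned core

        w_id = np.zeros(n_id);   w_id[3] = 1.0       # pick identity #3
        w_exp = np.zeros(n_exp); w_exp[7] = 1.0      # pick expression #7 (e.g. smile)

        # Mode-2 and mode-3 contractions: face = C x2 w_id x3 w_exp
        face = np.einsum('vie,i,e->v', core, w_id, w_exp)
        print(face.reshape(-1, 3).shape)             # (1000, 3) vertex positions

    Blending several non-zero weights instead of one-hot vectors is what makes applications like expression transfer and retargeting possible.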

  2. Computer applications in the nuclear reprocessing industry

    International Nuclear Information System (INIS)

    McKenzie, H.G.; Swartfigure, G.T.

    1985-01-01

    The subject is discussed under the headings: introduction; benefits of computer application; factors affecting productivity; implementation of engineering design systems; the conceptual model; system design database; plant design system; pipe detailing system; overall assessment of benefits; conclusions. (U.K.)

  3. Database characterisation of HEP applications

    International Nuclear Information System (INIS)

    Piorkowski, Mariusz; Grancher, Eric; Topurov, Anton

    2012-01-01

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications; these performance characterisations were used to deliver improved predictability and scalability, as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  4. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
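    A minimal sketch of the distributed-fitting idea (only intermediate results travel, never raw records) follows; here plain logistic regression stands in for the site-stratified Cox model, and the code illustrates the principle rather than the authors' R-based tools:

        # Sketch of "aggregate intermediate results, not raw data": each site
        # computes the gradient of its local log-likelihood; only the gradients
        # travel to the coordinator, which sums them and updates the shared
        # parameters. Logistic regression stands in for the Cox model.
        import numpy as np

        def local_gradient(X, y, beta):
            """Gradient of the logistic log-likelihood on one site's data."""
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            return X.T @ (y - p)

        rng = np.random.default_rng(0)
        sites = [(rng.normal(size=(100, 3)), rng.integers(0, 2, 100))
                 for _ in range(4)]                # four hospitals' private data

        beta = np.zeros(3)
        for _ in range(200):                       # gradient ascent at the coordinator
            grad = sum(local_gradient(X, y, beta) for X, y in sites)
            beta += 0.001 * grad                   # raw patient data never leaves a site
        print(beta)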

  5. Simple re-instantiation of small databases using cloud computing.

    Science.gov (United States)

    Tan, Tin Wee; Xie, Chao; De Silva, Mark; Lim, Kuan Siong; Patro, C Pawan K; Lim, Shen Jean; Govindarajan, Kunde Ramamoorthy; Tong, Joo Chuan; Choo, Khar Heng; Ranganathan, Shoba; Khan, Asif M

    2013-01-01

    Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full virtualization standard cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear.

  6. Just-in-time Database-Driven Web Applications

    Science.gov (United States)

    2003-01-01

    "Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109

  7. The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system

    Science.gov (United States)

    Zerkin, V. V.; Pritychenko, B.

    2018-04-01

    The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for the data sets uploads, renormalization, covariance matrix, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides an essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and Russian Federation.

  8. Planform: an application and database of graph-encoded planarian regenerative experiments.

    Science.gov (United States)

    Lobo, Daniel; Malone, Taylor J; Levin, Michael

    2013-04-15

    Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic models exist yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there is no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.
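    A toy version of such a graph encoding, using the networkx package with invented node and attribute names (the sketch illustrates the spirit of the formalism, not Planform's actual schema), is shown below:

        # Toy graph encoding of a planarian morphology: body regions become
        # labeled nodes, adjacency becomes edges. Attribute names are invented.
        import networkx as nx

        worm = nx.Graph()
        worm.add_node("head", organs=["brain", "eyes"])
        worm.add_node("trunk", organs=["pharynx"])
        worm.add_node("tail", organs=[])
        worm.add_edge("head", "trunk")
        worm.add_edge("trunk", "tail")

        # A two-headed regeneration outcome differs only in node labels, so
        # phenotypes can be compared by graph operations rather than free text.
        two_headed = worm.copy()
        two_headed.nodes["tail"]["organs"] = ["brain", "eyes"]
        print(nx.is_isomorphic(worm, two_headed))  # same structure, labels differ

    Encoding phenotypes as graphs rather than prose is exactly what makes the dataset machine-minable, which is the paper's stated goal.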

  9. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
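    The transactional discipline mentioned above (SQL transactions keeping the KFD consistent) can be sketched as follows; the table and column names are illustrative and sqlite3 stands in for the production DBMS:

        # Sketch of transactional edits to karst feature records: a batch of
        # statements either commits as a whole or rolls back, keeping the
        # database consistent. Table names are illustrative assumptions.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE karst_feature (id INTEGER PRIMARY KEY, "
                     "type TEXT, county TEXT, depth_m REAL)")
        try:
            with conn:  # one transaction: all-or-nothing
                conn.execute("INSERT INTO karst_feature "
                             "VALUES (1, 'sinkhole', 'Fillmore', 12.5)")
                conn.execute("INSERT INTO karst_feature "
                             "VALUES (1, 'spring', 'Olmsted', 0.0)")  # duplicate id
        except sqlite3.IntegrityError:
            print("duplicate id: transaction rolled back")
        print(conn.execute("SELECT COUNT(*) FROM karst_feature").fetchone())  # (0,)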

  10. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Performance of popular open source databases for HEP related computing problems

    International Nuclear Information System (INIS)

    Kovalskyi, D; Sfiligoi, I; Wuerthwein, F; Yagil, A

    2014-01-01

    Databases are used in many software components of HEP computing, from monitoring and job scheduling to data storage and processing. It is not always clear at the beginning of a project if a problem can be handled by a single server, or if one needs to plan for a multi-server solution. Before a scalable solution is adopted, it helps to know how well it performs in a single server case to avoid situations when a multi-server solution is adopted mostly due to sub-optimal performance per node. This paper presents comparison benchmarks of popular open source database management systems. As a test application we use a user job monitoring system based on the Glidein workflow management system used in the CMS Collaboration.
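    A single-node measurement of the kind described might look like the following sketch; sqlite3 is used here purely as a stand-in target, whereas the paper benchmarks server DBMSs under a Glidein job-monitoring workload:

        # The kind of single-server measurement the paper describes: time a
        # bulk load and a typical monitoring query against one candidate
        # database before deciding whether a multi-server solution is needed.
        import sqlite3, time

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE job (id INTEGER, site TEXT, state TEXT)")

        t0 = time.perf_counter()
        with conn:
            conn.executemany("INSERT INTO job VALUES (?, ?, ?)",
                             ((i, f"site{i % 20}", "running")
                              for i in range(100_000)))
        print(f"insert: {time.perf_counter() - t0:.3f}s")

        t0 = time.perf_counter()
        rows = conn.execute("SELECT site, COUNT(*) FROM job GROUP BY site").fetchall()
        print(f"query:  {time.perf_counter() - t0:.3f}s, {len(rows)} sites")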

  12. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  13. RA radiological characterization database application

    International Nuclear Information System (INIS)

    Steljic, M.M.; Ljubenov, V.Lj. E-mail address of corresponding author: milijanas@vin.bg.ac.yu

    2005-01-01

    Radiological characterization of the RA research reactor is one of the main activities in the first two years of the reactor decommissioning project. The raw characterization data from direct measurements or laboratory analyses (defined within the existing sampling and measurement programme) have to be interpreted, organized and summarized in order to prepare the final characterization survey report. This report should be made so that the radiological condition of the entire site is completely and accurately shown, with the radiological condition of the components clearly depicted. This paper presents an electronic database application, designed as a serviceable and efficient tool for characterization data storage, review and analysis, as well as for report generation. A relational database model was designed and the application was made by using Microsoft Access 2002 (SP1), a 32-bit RDBMS for desktop and client/server database applications that run under Windows XP. (author)

  14. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

    The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as to provide a system that simplifies the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database, and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  15. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2015-10-01

    Full Text Available Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. database design, installation, and operation. Therefore the authors have investigated MongoDB, a popular application, both from the perspective of industry retraining for database specialists and for teaching. This paper demonstrates some practical activities that can be done by students at the Eastern Institute of Technology New Zealand. In addition to testing and preparing new content for future students, this paper contributes to the very recent and emerging academic literature in this area. This paper concludes with general recommendations for IT educators, database engineers, and other IT professionals.
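    A first hands-on activity of the sort the paper recommends might look like this pymongo sketch; it assumes a local mongod on the default port, and the database and collection names are invented:

        # A first MongoDB exercise: insert schema-less documents and query
        # them. Assumes a local mongod on the default port; the database and
        # collection names below are invented for the example.
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017/")
        courses = client["eit_demo"]["courses"]

        courses.insert_many([
            {"code": "IT601", "title": "Databases", "tags": ["sql", "nosql"]},
            {"code": "IT602", "title": "Big Data", "tags": ["nosql", "hadoop"]},
        ])
        # Documents need no fixed schema; query by a value inside an array field.
        for doc in courses.find({"tags": "nosql"}, {"_id": 0, "code": 1}):
            print(doc)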

  16. Database Systems - Present and Future

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Database systems nowadays have an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet continues to develop worldwide. In the current informatics context, the development of applications with databases is the work of specialists, while using databases, accessing a database from various applications, and related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area regarding fundamental database systems issues, which are necessary in order to train specialists in economic informatics higher education. Database systems integrate and interfere with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a set of minimum, mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, data structures. The article also presents the current trends in the evolution of database systems, in the context of economic informatics.

  17. Some Considerations about Modern Database Machines

    Directory of Open Access Journals (Sweden)

    Manole VELICANU

    2010-01-01

    Full Text Available Optimizing the two computing resources of any computing system, time and space, has always been one of the priority objectives of any database. A current and effective solution in this respect is the database machine. Optimizing computer applications by means of database machines has been a steady preoccupation of researchers since the late seventies. Several information technologies have revolutionized the present information framework. Out of these, those which have brought a major contribution to the optimization of databases are: efficient handling of large volumes of data (Data Warehouse, Data Mining, OLAP - On Line Analytical Processing), the improvement of DBMS (Database Management Systems) facilities through the integration of new technologies, and the dramatic increase in computing power together with its efficient use (computer networks, massive parallel computing, Grid Computing and so on). All these information technologies, and others, have favored the resumption of research on database machines and the obtaining in the last few years of some very good practical results regarding the optimization of computing resources.

  18. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. The organization's departments can be located at different places and can be connected by an intranet environment. In such an environment, the maintenance of database records becomes an assignment of complexity which needs to be resolved. In this paper an intranet application is designed an...

  19. [A systematic evaluation of application of the web-based cancer database].

    Science.gov (United States)

    Huang, Tingting; Liu, Jialin; Li, Yong; Zhang, Rui

    2013-10-01

    In order to support the theory and practice of web-based cancer database development in China, we applied a systematic evaluation to assess the development of web-based cancer databases at home and abroad. We performed computer-based retrieval of the Ovid-MEDLINE, Springerlink, EBSCOhost, Wiley Online Library and CNKI databases for papers published between Jan. 1995 and Dec. 2011, and retrieved the references of these papers by hand. We selected qualified papers according to pre-established inclusion and exclusion criteria, and carried out information extraction and analysis of the papers. Eventually, searching the online databases, we obtained 1244 papers, and checking the reference lists, we found another 19 articles. Thirty-one articles met the inclusion and exclusion criteria; we extracted the evidence from these and assessed it. Analysis of this evidence showed that the U.S.A. ranked first, accounting for 26% of the databases. Thirty-nine percent of these web-based cancer databases are comprehensive cancer databases. As for single-cancer databases, breast cancer and prostatic cancer are at the top, each accounting for 10%. Thirty-two percent of the cancer databases are associated with cancer gene information. As for the technical applications, MySQL and PHP are applied most widely, at nearly 23% each.

  20. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

    The current radiation effect assessment system requires skillful technique in the application of various codes and a high level of specialized knowledge in each field. Therefore, it is very difficult for radiation users who do not have enough specialized knowledge to assess or recognize radiation effects properly. For this, we have already developed five Windows-based computer codes, constituting the radiation effect assessment system, for radiation-utilizing fields including nuclear power generation. A computer program is needed so that non-specialists can use these five computer codes with ease. So, we built an AI-based specialist system that can infer the assessment by itself, according to the characteristics of a given problem. The specialist program can guide users, search data, and inquire of the administrator directly. Conceptually, considering the circumstances which a user applying the five computer codes may actually encounter, we addressed the following aspects. First, the accessibility of needed concepts and data must be improved. Second, the acquisition of reference theory and the use of the corresponding computer code must be easy. Third, a Q and A function is needed for the solution of user questions not considered previously. Finally, the database must be renewed continuously. To express these requirements, we developed a client program to organize reference data, built an access methodology (query) for the organized data, and implemented a visible display function for searched data. An instruction method (an effective theory acquisition procedure and methodology) for acquiring the theory behind the five computer codes is also provided. A data structure access program (DBMS) was developed to renew data continuously with ease. For the Q and A function, a Q and A board is included in the client program so that users can search the content of questions and answers. (authors)

  1. COMPARISON OF POPULAR BIOINFORMATICS DATABASES

    OpenAIRE

    Abdulganiyu Abdu Yusuf; Zahraddeen Sufyanu; Kabir Yusuf Mamman; Abubakar Umar Suleiman

    2016-01-01

    Bioinformatics is the application of computational tools to capture and interpret biological data. It has wide applications in drug development, crop improvement, agricultural biotechnology and forensic DNA analysis. There are various databases available to researchers in bioinformatics. These databases are customized for specific needs and range in size, scope, and purpose. The main drawbacks of bioinformatics databases include redundant information, constant change, data spread over m...

  2. Thermodynamic database of multi-component Mg alloys and its application to solidification and heat treatment

    Directory of Open Access Journals (Sweden)

    Guanglong Xu

    2016-12-01

    Full Text Available An overview of a thermodynamic database of multi-component Mg alloys is given in this work. This thermodynamic database includes thermodynamic descriptions for 145 binary systems and 48 ternary systems in the 23-component (Mg–Ag–Al–Ca–Ce–Cu–Fe–Gd–K–La–Li–Mn–Na–Nd–Ni–Pr–Si–Sn–Sr–Th–Y–Zn–Zr) system. First, the major computational and experimental tools used to establish the thermodynamic database of Mg alloys are briefly described. Subsequently, among the investigated binary and ternary systems, representative ones are shown to demonstrate the major features of the database. Finally, the application of the thermodynamic database to solidification simulation and the selection of heat treatment schedules is described.

  3. Distributed Pseudo-Random Number Generation and Its Application to Cloud Database

    OpenAIRE

    Chen, Jiageng; Miyaji, Atsuko; Su, Chunhua

    2014-01-01

    Cloud databases are a rapidly growing trend in the cloud computing market. They enable clients to run their computations on outsourced databases or to access distributed database services on the cloud. At the same time, security and privacy concerns are a major challenge for cloud databases to continue growing. To enhance the security and privacy of cloud database technology, pseudo-random number generation (PRNG) plays an important role in data encryption and privacy-pr...
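    Why a PRNG matters here can be illustrated with a generic construction: HMAC-SHA256 in counter mode produces a keyed, deterministic keystream that masks a field before it is uploaded. This is a standard textbook scheme, not the construction proposed in the paper:

        # A keyed, deterministic keystream (HMAC-SHA256 in counter mode) masks
        # a field before it reaches the cloud; XOR with the same keystream
        # decrypts. Generic illustration only, not the paper's scheme.
        import hmac, hashlib

        def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
            out, counter = b"", 0
            while len(out) < n:
                out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                                hashlib.sha256).digest()
                counter += 1
            return out[:n]

        key, nonce = b"\x01" * 32, b"row-42"
        plain = b"patient: alice"
        cipher = bytes(a ^ b for a, b in zip(plain, keystream(key, nonce, len(plain))))
        # Only the ciphertext is stored in the cloud database.
        assert bytes(a ^ b for a, b in
                     zip(cipher, keystream(key, nonce, len(cipher)))) == plain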

  4. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  5. Computer applications in radiology business management

    International Nuclear Information System (INIS)

    Pratt, J.; Parrish, D.; Butler, J.; Gregg, S.; Farley, G.

    1987-01-01

    This presentation focuses on two areas of prime importance to radiology business management: financial/accounting applications and computer networking. The business management portion is an overview of accounts receivable management, financial reporting, management reporting, budgeting and forecasting (including cost/benefit analysis and break-even analysis), and personal and/or financial tax planning. The networking portion focuses on telecommunications and considers satellite facilities, electronic claims submission, and national database networking. Both numeric and graphic summaries are demonstrated in the presentation

  6. CERN database services for the LHC computing grid

    International Nuclear Information System (INIS)

    Girone, M

    2008-01-01

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed

  7. CERN database services for the LHC computing grid

    Energy Technology Data Exchange (ETDEWEB)

    Girone, M [CERN IT Department, CH-1211 Geneva 23 (Switzerland)], E-mail: maria.girone@cern.ch

    2008-07-15

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed.

  8. The ATLAS Wide-Range Database & Application Monitoring

    CERN Document Server

    Vasileva, Petya Tsvetanova; The ATLAS collaboration

    2018-01-01

    In HEP experiments at the LHC the database applications often become complex, reflecting the ever more demanding requirements of the researchers. The ATLAS experiment has several Oracle DB clusters with over 216 database schemas, each with its own set of database objects. To effectively monitor them, we designed a modern and portable application with exceptionally good characteristics. Some of them include: a concise view of the most important DB metrics; top SQL statements based on CPU, executions, block reads, etc.; volume growth plots per schema and DB object type; a database jobs section with signaling for problematic ones; and in-depth analysis in case of contention on data or processes. This contribution also describes the technical aspects of the implementation. The project can be separated into three independent layers. The first layer consists of highly-optimized database objects hiding all complicated calculations. The second layer is a server providing REST access to the underlying database backend. The th...

  9. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar eSreenivasaiah

    2010-12-01

    Full Text Available The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of the computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.

  10. Computer-assisted indexing for the INIS database

    International Nuclear Information System (INIS)

    Nevyjel, A.

    2006-01-01

    INIS has identified computer-assisted indexing as an area where information technology could best assist in maintaining database quality and indexing consistency, while containing production costs. Subject analysis is a very important but also very expensive process in the production of the INIS database. Given the current necessity to process an increased number of records, including subject analysis, without additional staff, INIS as well as the member states need improvements in their processing efficiency. Computer-assisted subject analysis is a promising way to achieve this. The quality of the INIS database is defined by its inputting rules. The Thesaurus is a terminological control device used in translating from the natural language of documents, indexers or users into a more constrained system language. It is a controlled and dynamic vocabulary of semantically and generically related terms. It is the essential tool for subject analysis as well as for advanced search engines. To support the identification of descriptors in the free text (title, abstract, free keywords), 'hidden terms' have been introduced as an extension of the Thesaurus; they identify phrases or character strings of free text and point to the valid descriptor that should be suggested. In the process of computer-assisted subject analysis the bibliographic records (including title and abstract) are analyzed by the software, resulting in a list of suggested descriptors. Within the working platform (graphical user interface) the suggested descriptors are sorted by importance (by their relevance for the content of the document) and the subject specialist clearly sees the highlighted context from which the terms were selected. The system allows the subject specialist to accept or reject descriptors from the suggested list and to assign additional descriptors when necessary. First experiences show that a performance enhancement of about 80-100% can be achieved in the subject analysis process. (author)
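
    A minimal sketch of the 'hidden terms' mechanism described above: free-text phrases are mapped to valid descriptors and the suggestions are ranked for review by the subject specialist. The tiny thesaurus and the ranking by raw match counts are illustrative assumptions, not INIS's actual rules.

    from collections import Counter

    HIDDEN_TERMS = {  # free-text phrase -> valid descriptor (toy examples)
        "nuclear power plant": "NUCLEAR POWER PLANTS",
        "pressure vessel": "PRESSURE VESSELS",
        "neutron flux": "NEUTRON FLUX",
    }

    def suggest_descriptors(title, abstract):
        text = f"{title} {abstract}".lower()
        hits = Counter()
        for phrase, descriptor in HIDDEN_TERMS.items():
            n = text.count(phrase)
            if n:
                hits[descriptor] += n
        # Most relevant first; the specialist accepts, rejects or adds terms.
        return hits.most_common()

    print(suggest_descriptors("Neutron flux monitoring",
                              "Measurements at a nuclear power plant."))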

  11. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    Science.gov (United States)

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes support for multi-component compounds (mixtures), import and export of SD-files, and optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework

  12. AIDA Asia. Artificial Insemination Database Application. User manual. 1

    International Nuclear Information System (INIS)

    Garcia Podesta, Mario

    2002-01-01

    Artificial Insemination Database Application (AIDA-Asia) is a computer application to store and analyze information from AI services (farms, inseminated females, semen, estrus characteristics, inseminator and pregnancy diagnosis data). The need for such an application arose during a consultancy undertaken by the author for the International Atomic Energy Agency (IAEA, Vienna) under the framework of its Regional Co-operative Agreement for Asia and the Pacific (RCA), which is implementing a project on 'Improving Animal Productivity and Reproductive Efficiency' (RAS/5/035). The detailed specifications for the application were determined through a Task Force Meeting of National Consultants from five RCA Member States, organized by the IAEA and held in Sri Lanka in April 2001. The application has been developed in MS Access 2000 and Visual Basic for Applications (VBA) 6.0. However, it can run as a stand-alone application through its own executable files. It is based on screen forms for data entry or editing of information and command buttons. The structure of the data, the design of the application and the VBA code cannot be seen or modified by users. However, the designated administrator of AIDA-Asia in each country can customize it.

  13. Needs assessment for next generation computer-aided mammography reference image databases and evaluation studies.

    Science.gov (United States)

    Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias

    2011-11-01

    Breast cancer is globally a major threat for women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, not any CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as data source for evaluation of mammography CAD systems, and the application of statistical evaluation methods were found highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM
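
    For readers unfamiliar with the statistical evaluation methods compared above, a short sketch of N-fold cross-validation with a fixed random seed; recording the seed and fold layout is precisely what would make the selection of test items reproducible across studies. The case identifiers and fold count are invented for illustration.

    from sklearn.model_selection import KFold

    case_ids = [f"ddsm_case_{i:04d}" for i in range(600)]  # illustrative IDs

    kfold = KFold(n_splits=10, shuffle=True, random_state=42)
    for fold, (train_idx, test_idx) in enumerate(kfold.split(case_ids)):
        # A CAD classifier would be trained on train_idx and scored on
        # test_idx here; publishing the seed and splits makes the study
        # reproducible.
        print(f"fold {fold}: {len(train_idx)} train / {len(test_idx)} test")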

  14. ATLAS database application enhancements using Oracle 11g

    CERN Document Server

    Dimitrov, G; The ATLAS collaboration; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at the LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN have...

  15. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    Science.gov (United States)

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

    Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested to assess these computed tools theoretically: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encompasses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
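
    One of the indexes discussed in case study (i) can be illustrated with a small sketch: foot-to-foot pulse wave velocity, computed as path length divided by transit time. The foot detection by simple minimum search and the synthetic waveforms are simplifying assumptions, not the paper's method.

    import numpy as np

    def foot_to_foot_pwv(p_prox, p_dist, fs_hz, path_length_m):
        """Pulse wave velocity (m/s) = path length / foot-to-foot transit time."""
        t_prox = np.argmin(p_prox) / fs_hz  # foot of the proximal waveform (s)
        t_dist = np.argmin(p_dist) / fs_hz  # foot of the distal waveform (s)
        return path_length_m / (t_dist - t_prox)

    # Synthetic check: the distal wave is the proximal one delayed by 50 ms
    # over a 0.5 m path, so the expected PWV is 0.5 / 0.05 = 10 m/s.
    fs = 1000.0
    t = np.arange(0.0, 1.0, 1.0 / fs)
    prox = np.sin(2.0 * np.pi * t)
    dist = np.roll(prox, 50)
    print(foot_to_foot_pwv(prox, dist, fs, path_length_m=0.5))  # -> 10.0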

  16. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Development and application of nuclear power operation database

    International Nuclear Information System (INIS)

    Shao Juying; Fang Zhaoxia

    1996-01-01

    The article describes the development of the Nuclear Power Operation Database, which includes the Domestic and Overseas Nuclear Event Scale Database, the Overseas Nuclear Power Operation Abnormal Event Database, the Overseas Nuclear Power Operation General Reliability Database and the Qinshan Nuclear Power Operation Abnormal Event Database. The development includes data collection and analysis, database construction and code design, and database management system selection. The application of the database to support the safety analysis of the NPPs which have been in commercial operation is also introduced.

  18. Application Of Database Program in selecting Sorghum (Sorghum bicolor L) Mutant Lines

    International Nuclear Information System (INIS)

    H, Soeranto

    2000-01-01

    The computer database software packages MSTAT and Paradox have been used in the field of mutation breeding, especially in the process of selecting plant mutant lines of sorghum. In MSTAT, selecting mutant lines can be done by activating the SELECTION function and then entering mathematical formulas for the selection criterion. Another alternative is to apply the desired selection intensity to the analysis results of the SORT subprogram. Including the selected mutant lines in the BRSERIES program makes their progenies easier to trace in subsequent generations. In Paradox, an application program for selecting mutant lines can be made by combining the Table, Form and Report facilities. Selecting mutant lines with a defined selection criterion can easily be done through data filtering. As a relational database, Paradox ensures that the application program for selecting mutant lines and tracing progenies can be made easy, efficient and interactive.
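
    A hedged sketch of the same selection step in a modern setting, using pandas instead of MSTAT/Paradox: one selection by an explicit criterion, one by selection intensity. Column names and thresholds are invented for illustration.

    import pandas as pd

    lines = pd.DataFrame({
        "line": ["M-01", "M-02", "M-03", "M-04", "M-05"],
        "yield_g": [41.0, 55.5, 48.2, 60.1, 38.7],
        "height_cm": [152, 138, 145, 131, 160],
    })

    # Criterion-based selection (cf. MSTAT's SELECTION function):
    selected = lines[(lines["yield_g"] > 45) & (lines["height_cm"] < 150)]

    # Intensity-based selection: keep the best 40% by yield (cf. SORT):
    top = lines.nlargest(int(len(lines) * 0.4), "yield_g")

    print(selected["line"].tolist())  # ['M-02', 'M-03', 'M-04']
    print(top["line"].tolist())       # ['M-04', 'M-02']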

  19. CQL: a database in smart card for health care applications.

    Science.gov (United States)

    Paradinas, P C; Dufresnes, E; Vandewalle, J J

    1995-01-01

    The CQL-Card is the first smart card in the world to use Database Management System (DBMS) concepts. The CQL-Card is particularly suited to a portable file in health applications where the information is required by many different partners, such as health insurance organizations, emergency services, and General Practitioners. All the information required by these different partners can be shared, with independent security mechanisms. Database engine functions are carried out by the card, which manages tables, views, and dictionaries. Medical information is stored in tables, and views are logical and dynamic subsets of tables. For owner-partners like a MIS (Medical Information System), it is possible to grant privileges (select, insert, update, and delete on a table or view) to other partners. Furthermore, dictionaries are structures that contain the requested data descriptions and allow adaptation to computer environments. Health information held in the CQL-Card is accessed using CQL (Card Query Language), a high-level database query language which is a subset of the standard SQL (Structured Query Language). With this language, the CQL-Card can be easily integrated into Medical Information Systems.

  20. ATLAS database application enhancements using Oracle 11g

    International Nuclear Information System (INIS)

    Dimitrov, G; Canali, L; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at the LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably, we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment during Q1 2012 and we outline plans for future work in this area.
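
    As a hedged illustration of the kind of bookkeeping behind the volume figures quoted above, a sketch using the python-oracledb driver to sum segment sizes per schema from the standard DBA_SEGMENTS view; the connection parameters are placeholders, and index segments could be excluded via SEGMENT_TYPE if the index overhead is to be left out.

    import oracledb  # python-oracledb; connection details are placeholders

    con = oracledb.connect(user="monitor", password="***",
                           dsn="atlas-db.example.org/prod")
    with con.cursor() as cur:
        cur.execute("""
            SELECT owner, ROUND(SUM(bytes) / POWER(1024, 4), 2) AS tb
            FROM dba_segments
            GROUP BY owner
            ORDER BY tb DESC
        """)
        for owner, tb in cur:
            print(f"{owner:30s} {tb:8.2f} TB")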

  1. The SQL Server Database for Non Computer Professional Teaching Reform

    Science.gov (United States)

    Liu, Xiangwei

    2012-01-01

    This paper summarizes the teaching methods of the SQL Server database course for non-computer majors and analyzes the current situation of the course. According to the teaching characteristics of the curriculum for non-computer majors, it puts forward some teaching reform methods and puts them into practice, improving the students' analysis ability, practice ability and…

  2. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program that performs obstetric calculations, written in the Clipper language and using data from ultrasonography, was developed for personal computers. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Database Management System was tested for its performance. It proved very useful in patient management with its convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women.

  3. Professional iOS database application programming

    CERN Document Server

    Alessi, Patrick

    2013-01-01

    Updated and revised coverage that includes the latest versions of iOS and Xcode Whether you're a novice or experienced developer, you will want to dive into this updated resource on database application programming for the iPhone and iPad. Packed with more than 50 percent new and revised material - including completely rebuilt code, screenshots, and full coverage of new features pertaining to database programming and enterprise integration in iOS 6 - this must-have book intends to continue the precedent set by the previous edition by helping thousands of developers master database

  4. THE NASA AMES POLYCYCLIC AROMATIC HYDROCARBON INFRARED SPECTROSCOPIC DATABASE: THE COMPUTED SPECTRA

    International Nuclear Information System (INIS)

    Bauschlicher, C. W.; Ricca, A.; Boersma, C.; Mattioda, A. L.; Cami, J.; Peeters, E.; Allamandola, L. J.; Sanchez de Armas, F.; Puerta Saborido, G.; Hudgins, D. M.

    2010-01-01

    The astronomical emission features, formerly known as the unidentified infrared bands, are now commonly ascribed to polycyclic aromatic hydrocarbons (PAHs). The laboratory experiments and computational modeling done at the NASA Ames Research Center to create a collection of PAH IR spectra relevant to test and refine the PAH hypothesis have been assembled into a spectroscopic database. This database now contains over 800 PAH spectra spanning 2-2000 μm (5000-5 cm⁻¹). These data are now available on the World Wide Web at www.astrochem.org/pahdb. This paper presents an overview of the computational spectra in the database and the tools developed to analyze and interpret astronomical spectra using the database. A description of the online and offline user tools available on the Web site is also presented.

  5. Computer system for International Reactor Pressure Vessel Materials Database support

    International Nuclear Information System (INIS)

    Arutyunjan, R.; Kabalevsky, S.; Kiselev, V.; Serov, A.

    1997-01-01

    This report presents a description of the computer tools for support of the International Reactor Pressure Vessel Materials Database developed at the IAEA. Work focused on search, retrieval, analysis, presentation and export of raw, qualified and processed materials data. The developed software has the following main functions: it provides software tools for querying and searching any type of data in the database; provides the capability to update the existing information in the database; provides the capability to present and print selected data; provides the possibility of exporting, on a yearly basis, the run-time IRPVMDB with raw, qualified and processed materials data to Database members; and provides the capability to export any selected sets of raw, qualified and processed materials data.

  6. A database application for wilderness character monitoring

    Science.gov (United States)

    Ashley Adams; Peter Landres; Simon Kingston

    2012-01-01

    The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...

  7. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real time access to both temporally accurate and multimedia data...

  8. Analysis and Design of Web-Based Database Application for Culinary Community

    Directory of Open Access Journals (Sweden)

    Choirul Huda

    2017-03-01

    Full Text Available This research is motivated by the rapid development of the culinary field and of information technology. Difficulties in communicating with culinary experts and in documenting recipes make proper media support very important. Therefore, a web-based database application for the public is important to help the culinary community with communication, searching and recipe management. The aim of the research was to design a web-based database application that could be used as social media for the culinary community. This research used literature review, user interviews, and questionnaires. Moreover, the database system development life cycle was used as a guide for designing the database, especially for conceptual database design, logical database design, and physical database design. The web-based application design used the eight golden rules for user interface design. The result of this research is a web-based database application that can fulfill the needs of users in the culinary field related to communication and recipe management.
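
    A minimal sketch of the physical database design stage mentioned above, materializing a fragment of a recipe-sharing schema in SQLite; table and column names are invented for illustration and do not come from the paper.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE member (
            member_id INTEGER PRIMARY KEY,
            name      TEXT NOT NULL
        );
        CREATE TABLE recipe (
            recipe_id  INTEGER PRIMARY KEY,
            member_id  INTEGER NOT NULL REFERENCES member(member_id),
            title      TEXT NOT NULL,
            directions TEXT
        );
    """)
    con.execute("INSERT INTO member VALUES (1, 'Ana')")
    con.execute("INSERT INTO recipe VALUES (1, 1, 'Rendang', '...')")
    # Recipe search, one of the community needs named in the abstract:
    print(con.execute("SELECT title FROM recipe WHERE title LIKE ?",
                      ("%Rend%",)).fetchall())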

  9. Thermochemistry in BWR. An overview of applications of program codes and databases

    International Nuclear Information System (INIS)

    Hermansson, H-P.; Becker, R.

    2010-01-01

    The Swedish work on the thermodynamics of metal-water systems relevant to BWR conditions has been ongoing since the 1970s, and at the present time a compilation and adaptation of codes and thermodynamic databases is in progress. In the previous work, basic thermodynamic data were compiled for parts of the system Fe-Cr-Ni-Co-Zn-S-H₂O at 25-300 °C. Since some thermodynamic information necessary for temperature extrapolations of data up to 300 °C was not published in the earlier works, these data have now been partially recalculated. This applies especially to the parameters of the HKF model, which are used to extrapolate the thermodynamic data for ionic and neutral aqueous species from 25 °C to BWR temperatures. Using the completed data, e.g. the change in standard Gibbs energy (ΔG⁰) and the equilibrium constant (log K) can be calculated for further applications at BWR/LWR conditions. In addition, a computer program is currently being developed at Studsvik for the calculation of equilibrium conductivity in high temperature water. The program is intended for PWR applications, but can also be applied to the BWR environment. Data as described above will be added to the database of this program. It will be relatively easy to further develop the program, e.g. to calculate Pourbaix diagrams, and these graphs could then be calculated at any temperature. This means that there will be no limitation to the temperatures and total concentrations (usually 10⁻⁶ to 10⁻⁸ mol/kg) as reported in earlier work. It is also easy to add a function generating ΔG⁰ and log K values at selected temperatures. One of the fundamentals of this work was also to review and collect publicly available thermodynamic program codes and databases of relevance for BWR conditions found in open sources. The focus has been on finding existing compilations and reviews, and some 40 codes and 15 databases were found. Codes and databases are often integrated, and such a package is often developed for
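
    The two quantities named above are linked by a relation worth making explicit: ΔG⁰ = -RT ln K, so log K = -ΔG⁰/(R·T·ln 10). A short sketch with an invented example reaction:

    import math

    R = 8.31446  # gas constant, J/(mol·K)

    def log10_k(delta_g0_j_per_mol, temp_k):
        """log10 of the equilibrium constant from ΔG⁰ = -RT ln K."""
        return -delta_g0_j_per_mol / (math.log(10) * R * temp_k)

    # Hypothetical reaction with ΔG⁰ = -20 kJ/mol at 25 °C and at 300 °C:
    print(round(log10_k(-20e3, 298.15), 2))  # 3.5
    print(round(log10_k(-20e3, 573.15), 2))  # 1.82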

  10. Practical application of computer graphics in nuclear power plant engineering

    International Nuclear Information System (INIS)

    Machiba, Hiroshi; Kawamura, Hirobumi; Sasaki, Norio

    1992-01-01

    A nuclear power plant is composed of a vast amount of equipment, piping, and so on, and six or seven years are required to complete the design and engineering from the initial planning stage to the time of commercial operation. Furthermore, operating plants must be continually maintained and improved over a long period. Computer graphics were first applied to the composite arrangement design of nuclear power plants in the form of 3-dimensional CAD. Subsequently, as the introduction of CAE has progressed, a huge assortment of information has been accumulated in databases, and measures have been sought that would permit the convenient utilization of this information. Using computer graphics technologies, improvement of the interface between the user and such databases has recently been accomplished. In response to the growth in environmental consciousness, photo-realistic simulations for the artistic design of interiors and overviews showing harmony with the surroundings have been achieved through the application of computer graphics. (author)

  11. An Embedded Database Application for the Aggregation of Farming Device Data

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem; Pedersen, Torben Bach

    2010-01-01

    In order to store massive amounts of data produced by farming devices and to keep data that spans long intervals of time for analysis, reporting and maintenance purposes, it is desirable to reduce the size of the data by maintaining it at different aggregate levels. The older data can be made coarse-grained while keeping the newest data fine-grained. Considering the availability of a limited amount of storage capacity on the farm machinery, an application written in C was developed to collect the data from a CAN-BUS, store it into the embedded database efficiently and perform gradual data aggregation effectively. Furthermore, the aggregation is achieved by using either two ratio-based aggregation methods or a time-granularity based aggregation method. A detailed description of the embedded database technology on a tractor computer is also presented in this paper.
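
    A hedged sketch of time-granularity-based gradual aggregation as described above: rows older than a cutoff are collapsed from fine-grained samples into hourly averages, shrinking storage while recent rows stay fine-grained. The readings(ts, sensor, value) schema and the use of SQLite stand in for the actual embedded database.

    import sqlite3
    from collections import defaultdict

    def coarsen(con, cutoff_ts, bucket_s=3600):
        """Collapse rows older than cutoff_ts into per-bucket averages."""
        rows = con.execute(
            "SELECT ts, sensor, value FROM readings WHERE ts < ?",
            (cutoff_ts,)).fetchall()
        sums = defaultdict(lambda: [0.0, 0])  # (bucket, sensor) -> [sum, n]
        for ts, sensor, value in rows:
            key = ((ts // bucket_s) * bucket_s, sensor)
            sums[key][0] += value
            sums[key][1] += 1
        # Replace the fine-grained rows with one averaged row per bucket.
        con.execute("DELETE FROM readings WHERE ts < ?", (cutoff_ts,))
        con.executemany(
            "INSERT INTO readings VALUES (?, ?, ?)",
            [(b, s, total / n) for (b, s), (total, n) in sums.items()])
        con.commit()

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE readings (ts INTEGER, sensor TEXT, value REAL)")
    con.executemany("INSERT INTO readings VALUES (?, ?, ?)",
                    [(i, "rpm", 1000.0 + i) for i in range(7200)])
    coarsen(con, cutoff_ts=3600)  # the first hour becomes one averaged row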

  12. International Ventilation Cooling Application Database

    DEFF Research Database (Denmark)

    Holzer, Peter; Psomas, Theofanis Ch.; OSullivan, Paul

    2016-01-01

    The currently running International Energy Agency, Energy and Conservation in Buildings, Annex 62 Ventilative Cooling (VC) project is coordinating research towards extended use of VC. Within this Annex 62 the joint research activity of the International VC Application Database has been carried out, systematically investigating the distribution of technologies and strategies within VC. The database is structured as both a ticking-list-like building-spreadsheet and a collection of building-datasheets. The content of both closely follows the Annex 62 State-Of-The-Art-Report. The database has been filled, based … and locations, using VC as a means of indoor comfort improvement. The building-spreadsheet highlights distributions of technologies and strategies, such as the following. (Numbers in % refer to the sample of the database's 91 buildings.) It may be concluded that Ventilative Cooling is applied in temporary…

  13. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  14. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure

  15. First Database Course--Keeping It All Organized

    Science.gov (United States)

    Baugh, Jeanne M.

    2015-01-01

    All Computer Information Systems programs require a database course for their majors. This paper describes an approach to such a course in which real world examples, both design projects and actual database application projects are incorporated throughout the semester. Students are expected to apply the traditional database concepts to actual…

  16. Student Advising and Retention Application in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2016-11-01

    Full Text Available This paper proposes a new user-friendly application that enhances and expands the advising services of Gradesfirst, currently used for advising and retention by the Athletic Department of UMES, with a view to adding new performance activities such as mentoring, tutoring, scheduling, and study hall hours to the existing tools. This application includes various measurements that can be used to monitor and improve the performance of students in the Athletic Department of UMES by monitoring students' weekly study hall hours and tutoring schedules. It also supervises tutors' login and logout activities in order to monitor their effectiveness, supervises tutor-tutee interaction, and stores and analyzes the overall academic progress of each student. A dedicated server for providing services will be developed at the local site. The work has been implemented in three steps. The first step involves the creation of an independent cloud computing environment that provides resources such as database creation, query-based statistical data, performance measurement activities, and automated support of performance measures such as advising, mentoring, monitoring and tutoring. The second step involves the creation of an application known as Student Advising and Retention (SAR) in a cloud computing environment. This application has been designed to be a comprehensive database management system which contains relevant data regarding student academic development and supports various strategic advising and monitoring of students. The third step involves the creation of systematic advising charts and frameworks which help advisors. The paper shows ways of creating the most appropriate advising technique based on the student's academic needs. The proposed application runs in a Windows-based system. As stated above, the proposed application is expected to enhance and expand the current advising service of the Gradesfirst tool. A brief

  17. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems: replacing the various dedicated data structures with a mature, standardized database system is a future development direction for accelerator control systems. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the application feasibility of database systems in accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)

  18. Intrusion Detection and Marking Transactions in a Cloud of Databases Environment

    OpenAIRE

    Syrine Chatti; Habib Ounelli

    2016-01-01

    Cloud computing is a paradigm for large-scale distributed computing that includes several existing technologies. A database management system is a collection of programs that enables you to store, modify and extract information from a database. Now, databases have moved to cloud computing, but this introduces at the same time a set of threats that target cloud database systems. The unification of transaction-based applications in these environments also presents a set of vulnerabilities and th...

  19. Concurrency control in distributed database systems

    CERN Document Server

    Cellary, W; Gelenbe, E

    1989-01-01

    Distributed Database Systems (DDBS) may be defined as integrated database systems composed of autonomous local databases, geographically distributed and interconnected by a computer network. The purpose of this monograph is to present DDBS concurrency control algorithms and their related performance issues. The most recent results have been taken into consideration. A detailed analysis and selection of these results has been made so as to include those which will promote applications and progress in the field. The application of the methods and algorithms presented is not limited to DDBSs but a
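
    As a small illustration of the subject matter, a sketch of one classic concurrency control technique covered by such monographs, two-phase locking: a transaction acquires locks only during its growing phase and releases them all at commit, which guarantees serializable schedules. Deadlock handling is omitted for brevity.

    import threading

    class TwoPhaseLockingTxn:
        def __init__(self, lock_table):
            self.lock_table = lock_table  # shared item -> threading.Lock
            self.held = []
            self.shrinking = False

        def lock(self, item):
            # Growing phase: locks may only be acquired before any release.
            if self.shrinking:
                raise RuntimeError("2PL violation: lock after first unlock")
            self.lock_table[item].acquire()
            self.held.append(item)

        def commit(self):
            # Shrinking phase: release everything, acquire nothing further.
            self.shrinking = True
            for item in reversed(self.held):
                self.lock_table[item].release()
            self.held.clear()

    locks = {"x": threading.Lock(), "y": threading.Lock()}
    txn = TwoPhaseLockingTxn(locks)
    txn.lock("x"); txn.lock("y")  # growing phase
    txn.commit()                  # shrinking phase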

  20. Computer programme for control and maintenance and object oriented database: application to the realisation of an particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    Diaz, A.

    1996-01-01

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and front-end computers based on VME standards, all within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the necessary information for displaying parameters within the front-end computers on the graphic screens. All other communications between processes use the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and bypasses the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  1. Advances in computational metabolomics and databases deepen the understanding of metabolisms.

    Science.gov (United States)

    Tsugawa, Hiroshi

    2018-01-29

    Mass spectrometry (MS)-based metabolomics is the popular platform for metabolome analyses. Computational techniques for the processing of MS raw data, for example, feature detection, peak alignment, and the exclusion of false-positive peaks, have been established. The next stage of untargeted metabolomics would be to decipher the mass fragmentation of small molecules for the global identification of human-, animal-, plant-, and microbiota metabolomes, resulting in a deeper understanding of metabolisms. This review is an update on the latest computational metabolomics including known/expected structure databases, chemical ontology classifications, and mass spectrometry cheminformatics for the interpretation of mass fragmentations and for the elucidation of unknown metabolites. The importance of metabolome 'databases' and 'repositories' is also discussed because novel biological discoveries are often attributable to the accumulation of data, to relational databases, and to their statistics. Lastly, a practical guide for metabolite annotations is presented as the summary of this review. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Development and Use of an EFL Reading Practice Application for an Android Tablet Computer

    Science.gov (United States)

    Ishikawa, Yasushige; Smith, Craig; Kondo, Mutsumi; Akano, Ichiro; Maher, Kate; Wada, Norihisa

    2014-01-01

    This paper reports on the use of an English-language reading practice application for an Android tablet computer with students who are not native speakers of English. The application materials for vocabulary learning in reading-passage contexts were created to include words from a database of low-frequency and technical noun-verb collocations…

  3. Advances in Parallel Computing and Databases for Digital Pathology in Cancer Research

    Science.gov (United States)

    2016-11-13

    databases. The advent of NewSQL and NoSQL (Not Only SQL) databases has led to the development of new technologies that are well suited for applications... NoSQL graph databases are tuned to support graph operations and NoSQL key-value databases excel at rapid ingest of unstructured data. Recent NewSQL

  4. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of teaching files in a radiology department, the authors set up a database management system for teaching files using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 × 390 to 545 × 414, 256 gray scales) and displayed on a 17-inch flat monitor (1024 × 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for teaching file purposes. Without high-cost appliances, we could complete the image database system for teaching files using a personal computer by a relatively inexpensive method.

  5. Database applications in high energy physics

    International Nuclear Information System (INIS)

    Jeffery, K.G.

    1982-01-01

    High Energy physicists were using computers to process and store their data early in the history of computing. They addressed problems of memory management, job control, job generation, data standards, file conventions, multiple simultaneous usage, tape file handling and data management earlier than, or at the same time as, the manufacturers of computing equipment. The HEP community have their own suites of programs for these functions, and are now turning their attention to the possibility of replacing some of the functional components of their 'homebrew' systems with more widely used software and/or hardware. High on the 'shopping list' for replacement is data management. ECFA Working Group 11 has been working on this problem. This paper reviews the characteristics of existing HEP systems and existing database systems and discusses the way forward. (orig.)

  6. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  7. The Use of a Relational Database in Qualitative Research on Educational Computing.

    Science.gov (United States)

    Winer, Laura R.; Carriere, Mario

    1990-01-01

    Discusses the use of a relational database as a data management and analysis tool for nonexperimental qualitative research, and describes the use of the Reflex Plus database in the Vitrine 2001 project in Quebec to study computer-based learning environments. Information systems are also discussed, and the use of a conceptual model is explained.…

  8. A Unit-Test Framework for Database Applications

    DEFF Research Database (Denmark)

    Christensen, Claus Abildgaard; Gundersborg, Steen; de Linde, Kristian

    The outcome of a test of an application that stores data in a database naturally depends on the state of the database. It is therefore important that test developers are able to set up and tear down database states in a simple and efficient manner. In existing unit-test frameworks, setting up … test can be minimized. In addition, the reuse between unit tests can speed up the execution of test suites. A performance test on a medium-size project shows a 40% speed-up and an estimated 25% reduction in the number of lines of test code.
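
    A minimal sketch of the general idea, assuming an in-memory SQLite database in place of the application's real backend: each unit test sets up the database state it needs and tears it down afterwards, keeping tests independent.

    import sqlite3
    import unittest

    class OrderQueryTest(unittest.TestCase):
        def setUp(self):
            # Set up a known database state for this test.
            self.con = sqlite3.connect(":memory:")
            self.con.execute("CREATE TABLE orders (id INTEGER, total REAL)")
            self.con.executemany("INSERT INTO orders VALUES (?, ?)",
                                 [(1, 9.5), (2, 20.0)])

        def tearDown(self):
            # Tear the state down so tests stay independent.
            self.con.close()

        def test_total_over_threshold(self):
            row = self.con.execute(
                "SELECT COUNT(*) FROM orders WHERE total > 10").fetchone()
            self.assertEqual(row[0], 1)

    if __name__ == "__main__":
        unittest.main()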

  9. Using AMDD method for Database Design in Mobile Cloud Computing Systems

    OpenAIRE

    Silviu Claudiu POPA; Mihai-Constantin AVORNICULUI; Vasile Paul BRESFELEAN

    2013-01-01

    The development of wireless telecommunication technologies gave birth to new kinds of e-commerce, the so-called Mobile e-Commerce or m-Commerce. Mobile Cloud Computing (MCC) represents a new IT research area that combines mobile computing and cloud computing techniques. Behind a cloud mobile commerce system there is a database containing all the information necessary for transactions. By means of the Agile Model Driven Development (AMDD) method, we are able to achieve many benefits that smoo...

  10. Large scale access tests and online interfaces to ATLAS conditions databases

    International Nuclear Information System (INIS)

    Amorim, A; Lopes, L; Pereira, P; Simoes, J; Soloviev, I; Burckhart, D; Schmitt, J V D; Caprini, M; Kolos, S

    2008-01-01

    The access of the ATLAS Trigger and Data Acquisition (TDAQ) system to the ATLAS Conditions Databases sets strong reliability and performance requirements on the database storage and access infrastructures. Several applications were developed to support the integration of Conditions database access with the online services in TDAQ, including the interface to the Information Services (IS) and to the TDAQ Configuration Databases. The information storage requirements were the motivation for the ONline A Synchronous Interface to COOL (ONASIC) from the Information Service (IS) to LCG/COOL databases. ONASIC avoids the possible backpressure from Online Database servers by managing a local cache. In parallel, OKS2COOL was developed to store Configuration Databases into an Offline Database with history record. The DBStressor application was developed to test and stress the access to the Conditions database using the LCG/COOL interface while operating in an integrated way as a TDAQ application. The performance scaling of simultaneous Conditions database read accesses was studied in the context of the ATLAS High Level Trigger large computing farms. A large set of tests were performed involving up to 1000 computing nodes that simultaneously accessed the LCG central database server infrastructure at CERN
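
    A hedged sketch of the local-cache idea attributed to ONASIC above, as one might reconstruct it: producers enqueue updates into a local buffer instead of writing synchronously, and a background thread drains the buffer toward the database, so slow servers do not exert backpressure on the online system. store_in_db() is a placeholder, not the real LCG/COOL API.

    import queue
    import threading

    buffer = queue.Queue(maxsize=10000)  # the local cache

    def store_in_db(folder, payload):
        pass  # placeholder for the actual (slow) database write

    def writer_loop():
        while True:
            folder, payload = buffer.get()
            store_in_db(folder, payload)
            buffer.task_done()

    def publish(folder, payload):
        # Returns as soon as the buffer has room, instead of waiting for
        # the database server; the writer thread drains asynchronously.
        buffer.put((folder, payload))

    threading.Thread(target=writer_loop, daemon=True).start()
    publish("/TDAQ/RunParams", {"run": 1234, "rate_hz": 75000.0})
    buffer.join()  # wait for the background writer to drain the buffer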

  11. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    Full Text Available In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images to graphs and, then, compute Minimum Spanning Trees (MST) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
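
    A minimal sketch of the MST step itself: Kruskal's algorithm with union-find over a weighted graph whose nodes would be the points of interest extracted from the image. The example graph is invented.

    def kruskal_mst(n, edges):
        """edges: (weight, u, v) tuples over nodes 0..n-1."""
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):
            ru, rv = find(u), find(v)
            if ru != rv:            # adding this edge creates no cycle
                parent[ru] = rv
                mst.append((u, v, w))
        return mst

    edges = [(4, 0, 1), (1, 1, 2), (3, 0, 2), (2, 2, 3)]
    print(kruskal_mst(4, edges))  # -> [(1, 2, 1), (2, 3, 2), (0, 2, 3)]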

  12. Ontology to relational database transformation for web application development and maintenance

    Science.gov (United States)

    Mahmudi, Kamal; Inggriani Liem, M. M.; Akbar, Saiful

    2018-03-01

    Ontology is used as knowledge representation while a database is used as a fact recorder in a KMS (Knowledge Management System). In most applications, data are managed in a database system and updated through the application, and they are then transformed to knowledge as needed. Once a domain conceptor defines the knowledge in the ontology, the application and database can be generated from the ontology. Most existing frameworks generate the application from its database. In this research, the ontology is used for generating the application. As the data are updated through the application, a mechanism is designed to trigger an update to the ontology so that the application can be rebuilt based on the newest ontology. With this approach, a knowledge engineer has full flexibility to renew the application based on the latest ontology without dependency on a software developer. In many cases, the concept needs to be updated when the data change. The framework was built and tested in a Spring Java environment. A case study was conducted to prove the concepts.
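
    A toy sketch of the generation direction described above: a tiny ontology of classes with datatype properties is mapped to relational DDL, one table per class. The ontology fragment and the type mapping are invented; real work would start from OWL/RDFS constructs.

    ONTOLOGY = {
        "Student": {"name": "string", "gpa": "decimal"},
        "Course":  {"title": "string", "credits": "integer"},
    }

    SQL_TYPES = {"string": "TEXT", "decimal": "REAL", "integer": "INTEGER"}

    def ontology_to_ddl(ontology):
        statements = []
        for cls, props in ontology.items():
            cols = [f"{cls.lower()}_id INTEGER PRIMARY KEY"]
            cols += [f"{p} {SQL_TYPES[t]}" for p, t in props.items()]
            statements.append(
                f"CREATE TABLE {cls.lower()} (\n  " +
                ",\n  ".join(cols) + "\n);")
        return "\n".join(statements)

    print(ontology_to_ddl(ONTOLOGY))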

  13. Computational 2D Materials Database

    DEFF Research Database (Denmark)

    Rasmussen, Filip Anselm; Thygesen, Kristian Sommer

    2015-01-01

    We present a comprehensive first-principles study of the electronic structure of 51 semiconducting monolayer transition-metal dichalcogenides and -oxides in the 2H and 1T hexagonal phases. The quasiparticle (QP) band structures with spin-orbit coupling are calculated in the G0W0 approximation, and comparison is made with different density functional theory descriptions. Pitfalls related to the convergence of GW calculations for two-dimensional (2D) materials are discussed together with possible solutions. The monolayer band edge positions relative to vacuum are used to estimate the band alignment … and used as input to a 2D hydrogenic model to estimate exciton binding energies. Throughout the paper we focus on trends and correlations in the electronic structure rather than detailed analysis of specific materials. All the computed data is available in an open database.

  14. Database Dictionary for Ethiopian National Ground-Water DAtabase (ENGDA) Data Fields

    Science.gov (United States)

    Kuniansky, Eve L.; Litke, David W.; Tucci, Patrick

    2007-01-01

    Introduction: This document describes the data fields that are used for both field forms and the Ethiopian National Ground-water Database (ENGDA) tables associated with information stored about production wells, springs, test holes, test wells, and water level or water-quality observation wells. Several different words are used in this database dictionary and in the ENGDA database to describe a narrow shaft constructed in the ground. The most general term is borehole, which is applicable to any type of hole. A well is a borehole specifically constructed to extract water from the ground; however, for this data dictionary and for the ENGDA database, the words well and borehole are used interchangeably. A production well is defined as any well used for water supply and includes hand-dug wells, small-diameter bored wells equipped with hand pumps, or large-diameter bored wells equipped with large-capacity motorized pumps. Test holes are borings made to collect information about the subsurface with continuous core or non-continuous core and/or where geophysical logs are collected. Test holes are not converted into wells. A test well is a well constructed for hydraulic testing of an aquifer in order to plan a larger ground-water production system. A water-level or water-quality observation well is a well that is used to collect information about an aquifer and not used for water supply. A spring is any naturally flowing, local, ground-water discharge site. The database dictionary is designed to help define all fields on both the field data collection forms (provided in attachment 2 of this report) and the ENGDA software screen entry forms (described in Litke, 2007). The data entered into each screen entry field are stored in relational database tables within the computer database. The organization of the database dictionary is designed based on field data collection and the field forms, because this is what the majority of people will use. After each field, however, the

  15. 6th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Luscombe, Nicholas; Fdez-Riverola, Florentino; Rodríguez, Juan; Practical Applications of Computational Biology & Bioinformatics

    2012-01-01

    The growth in the Bioinformatics and Computational Biology fields over the last few years has been remarkable. The analysis of the datasets produced by Next Generation Sequencing needs new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Systems Biology has also been emerging as an alternative to the reductionist view that dominated biological research in the last decades. This book presents the results of the 6th International Conference on Practical Applications of Computational Biology & Bioinformatics, held at the University of Salamanca, Spain, 28-30th March, 2012, which brought together interdisciplinary scientists who have a strong background in the biological and computational sciences.

  16. A database application for the Naval Command Physical Readiness Testing Program

    OpenAIRE

    Quinones, Frances M.

    1998-01-01

    Approved for public release; distribution is unlimited. IT-21 envisions a Navy with standardized, state-of-the-art computer systems. Based on this vision, Naval database management systems will also need to become standardized among Naval commands. Today most commercial off-the-shelf (COTS) database management systems provide a graphical user interface. Among the many Naval database systems currently in use, the Navy's Physical Readiness Program database has continued to exist at the command leve...

  17. Advanced approaches to intelligent information and database systems

    CERN Document Server

    Boonjing, Veera; Chittayasothorn, Suphamit

    2014-01-01

    This book consists of 35 chapters presenting different theoretical and practical aspects of Intelligent Information and Database Systems. Nowadays both Intelligent and Database Systems are applied in most areas of human activity, which necessitates further research in these areas. In this book various interesting issues related to intelligent information models and methods as well as their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications are presented and discussed both from the practical and theoretical points of view. The book is organized in four parts devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be interesting for both practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced ...

  18. Analysis on Cloud Computing Database in Cloud Environment – Concept and Adoption Paradigm

    Directory of Open Access Journals (Sweden)

    Elena-Geanina ULARU

    2012-08-01

    Full Text Available With the development of the Internet’s new technical functionalities, new concepts have started to take shape. These concepts have an important role especially in the development of corporate IT. One such concept is „the Cloud”. Various marketing campaigns have started to focus on the Cloud and have begun to promote it in different but confusing ways. These campaigns do little to explain what cloud computing is and why it is becoming increasingly necessary. The lack of understanding of this new technology generates a lack of knowledge in business cloud adoption regarding databases and applications. Only by focusing on business processes and objectives can an enterprise achieve the full benefits of the cloud and mitigate the potential risks. In this article we create our own complete definition of the cloud and we analyze the essential aspects of cloud adoption for a banking financial reporting application.

  19. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    Science.gov (United States)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

    The goal of this task was to create a design and prototype implementation of a database environment that is particularly suited for handling the image, vision and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities that are designed to execute efficiently on parallel computers. A key feature of the environment is an interface which allows a scientist to specify high-level directives about how query execution should occur.

  20. A computer database system to calculate staff radiation doses and maintain records

    International Nuclear Information System (INIS)

    Clewer, P.

    1985-01-01

    A database has been produced to record the personal dose records of all employees monitored for radiation exposure in the Wessex Health Region. Currently there are more than 2000 personnel in 115 departments but the capacity of the database allows for expansion. The computer is interfaced to a densitometer for film badge reading. The hardware used by the database, which is based on a popular microcomputer, is described, as are the various programs that make up the software. The advantages over the manual card index system that it replaces are discussed. (author)

  1. ESPSD, Nuclear Power Plant Siting Database

    International Nuclear Information System (INIS)

    Slezak, S.

    2001-01-01

    1 - Description of program or function: This database is a repository of comprehensive licensing and technical reviews of siting regulatory processes and acceptance criteria for advanced light water reactor (ALWR) nuclear power plants. The program is designed to be used by applicants for an early site permit or combined construction permit/operating license (10CFR52, Sub-parts A and C) as input for the development of the application. The database is a complete, menu-driven, self-contained package that can search and sort the supplied data by topic, keyword, or other input. The software is designed for operation on IBM-compatible computers with DOS. 2 - Method of solution: The database is an R:BASE Runtime program with all the necessary database files included

  2. A parallel model for SQL astronomical databases based on solid state storage. Application to the Gaia Archive PostgreSQL database

    Science.gov (United States)

    González-Núñez, J.; Gutiérrez-Sánchez, R.; Salgado, J.; Segovia, J. C.; Merín, B.; Aguado-Agelet, F.

    2017-07-01

    Query planning and optimisation algorithms in most popular relational databases were developed at a time when hard disk drives were the only storage technology available. The advent of devices with higher parallel random access capacity, such as solid state disks, opens the way for intra-machine parallel computing over large datasets. We describe a two-phase parallel model for the implementation of heavy analytical processes in single-instance PostgreSQL astronomical databases. This model is particularised to address two frequent astronomical problems: density maps and crossmatch computation with Quad Tree Cube (Q3C) indexes. They are implemented as part of the relational database infrastructure for the Gaia Archive and their performance is assessed. An improvement by a factor of 28.40 over sequential execution is observed in the reference implementation for a histogram computation. Speedup ratios of 3.7 and 4.0 are attained for the reference positional crossmatches considered. We observe large performance enhancements over sequential execution for both CPU- and disk-access-intensive computations, suggesting these methods might be useful with the growing data volumes in Astronomy.
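
    A minimal sketch of the two-phase pattern (not the authors' code; the sources table, its source_id and ra columns, and the connection string are all hypothetical): each worker aggregates a horizontal slice of the table in parallel, and the partial histograms are merged sequentially.

```python
# Two-phase parallel histogram over a PostgreSQL table (illustrative).
from multiprocessing import Pool
import psycopg2

DSN = "dbname=gaia"   # hypothetical connection string
N_WORKERS = 4

def partial_histogram(worker_id):
    # Phase 1: each worker aggregates its own slice of the table,
    # selected here by a modulo over an integer identifier.
    conn = psycopg2.connect(DSN)
    cur = conn.cursor()
    cur.execute(
        "SELECT floor(ra)::int AS ra_bin, count(*) "
        "FROM sources WHERE source_id %% %s = %s GROUP BY ra_bin",
        (N_WORKERS, worker_id),
    )
    rows = cur.fetchall()
    conn.close()
    return dict(rows)

if __name__ == "__main__":
    with Pool(N_WORKERS) as pool:
        partials = pool.map(partial_histogram, range(N_WORKERS))
    # Phase 2: a cheap sequential merge of the partial histograms.
    density_map = {}
    for part in partials:
        for ra_bin, n in part.items():
            density_map[ra_bin] = density_map.get(ra_bin, 0) + n
```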

  3. The use of computational thermodynamics to predict properties of multicomponent materials for nuclear applications

    International Nuclear Information System (INIS)

    Sundman, B.; Gueneau, C.

    2013-01-01

    Computational Thermodynamics is based on physically realistic models to describe metallic and oxide crystalline phases as well as the liquid and gas phases in a consistent manner. The models are used to assess experimental and theoretical data for many different materials, and several thermodynamic databases have been developed for steels, ceramics and semiconductor materials as well as materials for nuclear applications. Within CEA, long-term work is ongoing to develop a database for the properties of nuclear fuels and structural materials. An overview of the modelling technique is given, together with several examples of the application of the database to different problems, both for traditional phase diagram calculations and for simulating phase transformations. The following diagrams (Fig. 1, Fig. 2 and Fig. 3) show calculations in the U-Pu-O system. (authors)

  4. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    Science.gov (United States)

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using an SQL Server 2005 (Microsoft) database for data storage, with the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows and suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  5. MetReS, an Efficient Database for Genomic Applications.

    Science.gov (United States)

    Vilaplana, Jordi; Alves, Rui; Solsona, Francesc; Mateo, Jordi; Teixidó, Ivan; Pifarré, Marc

    2018-02-01

    MetReS (Metabolic Reconstruction Server) is a genomic database that is shared between two software applications that address important biological problems. Biblio-MetReS is a data-mining tool that enables the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the processes of interest and their function. The main goal of this work was to identify the areas where the performance of the MetReS database could be improved and to test whether this improvement would scale to larger datasets and more complex types of analysis. The study was started with a relational database, MySQL, which is the current database server used by the applications. We also tested the performance of an alternative data-handling framework, Apache Hadoop, which is currently used for large-scale data processing. We found that this data-handling framework is likely to greatly improve the efficiency of the MetReS applications as the dataset and the processing needs increase by several orders of magnitude, as is expected to happen in the near future.

  6. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  7. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state-of-the-art research on GPU computing and applications. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book: (1) engineering design and simulation; (2) biomedical sciences; and (3) interactive & digital media. The book also addresses fundamental issues in GPU computing with a focus on big data processing. Researchers and developers in GPU computing and applications will benefit from this book. Training professionals and educators can also benefit from this book to learn the possible applications of GPU technology in various areas.

  8. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-real-time data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of a federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and the user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  9. Applications of symbolic algebraic computation

    International Nuclear Information System (INIS)

    Brown, W.S.; Hearn, A.C.

    1979-01-01

    This paper is a survey of applications of systems for symbolic algebraic computation. In most successful applications, calculations that can be taken to a given order by hand are then extended one or two more orders by computer. Furthermore, with a few notable exceptions, these applications also involve numerical computation in some way. Therefore the authors emphasize the interface between symbolic and numerical computation, including: 1. Computations with both symbolic and numerical phases. 2. Data involving both the unpredictable size and shape that typify symbolic computation and the (usually inexact) numerical values that characterize numerical computation. 3. Applications of one field to the other. It is concluded that the fields of symbolic and numerical computation can advance most fruitfully in harmony rather than in competition. (Auth.)
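
    A small modern illustration of such a symbolic-numerical interface (using the Python library sympy, not the systems surveyed in the paper): a series expansion is computed symbolically, then evaluated numerically.

```python
# Symbolic phase followed by a numerical phase, via sympy.
import sympy as sp

x = sp.symbols("x")
# Symbolic phase: a series expansion that would be tedious to push
# to higher orders by hand.
expansion = sp.series(sp.sin(x) / x, x, 0, 8).removeO()

# Numerical phase: turn the truncated series into a fast numeric
# function and evaluate it at a sample point.
f = sp.lambdify(x, expansion)
print(expansion)
print(f(0.5))  # close to sin(0.5)/0.5
```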

  10. Advancements in web-database applications for rabies surveillance

    Directory of Open Access Journals (Sweden)

    Bélanger Denise

    2011-08-01

    Full Text Available Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB over previous rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from

  11. Construction of crystal structure prototype database: methods and applications.

    Science.gov (United States)

    Su, Chuanxun; Lv, Jian; Li, Quan; Wang, Hui; Zhang, Lijun; Wang, Yanchao; Ma, Yanming

    2017-04-26

    Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. With a similar method, a program, the Structure Prototype Analysis Package (SPAP), was developed to remove similar structures in CALYPSO prediction results and to extract predicted low-energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and the determination of prototype structures in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery.
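
    A toy sketch of the general idea (not the paper's algorithm or thresholds): represent each structure by its sorted interatomic distances and cluster the fingerprints hierarchically, so that near-duplicate structures fall into the same prototype.

```python
# Distance-fingerprint similarity with hierarchical clustering (toy).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def distance_fingerprint(coords, k):
    # Sorted interatomic distances, truncated to a fixed length.
    return np.sort(pdist(np.asarray(coords)))[:k]

structures = {
    "A": [(0, 0, 0), (1, 0, 0), (0, 1, 0)],
    "B": [(0, 0, 0), (1.01, 0, 0), (0, 0.99, 0)],  # near-duplicate of A
    "C": [(0, 0, 0), (2, 0, 0), (0, 2, 0)],
}
fps = np.array([distance_fingerprint(c, k=3) for c in structures.values()])

# Agglomerative clustering over fingerprint differences; structures in
# the same cluster share one prototype.
labels = fcluster(linkage(fps, method="average"), t=0.1, criterion="distance")
print(dict(zip(structures, labels)))  # A and B grouped, C separate
```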

  12. Construction of crystal structure prototype database: methods and applications

    International Nuclear Information System (INIS)

    Su, Chuanxun; Lv, Jian; Wang, Hui; Wang, Yanchao; Ma, Yanming; Li, Quan; Zhang, Lijun

    2017-01-01

    Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. With a similar method, a program, the Structure Prototype Analysis Package (SPAP), was developed to remove similar structures in CALYPSO prediction results and to extract predicted low-energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and the determination of prototype structures in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery. (paper)

  13. Simple Logic for Big Problems: An Inside Look at Relational Databases.

    Science.gov (United States)

    Seba, Douglas B.; Smith, Pat

    1982-01-01

    Discusses the database design concept termed "normalization" (a process replacing associations between data with associations in two-dimensional tabular form), which results in the formation of relational databases (they are to computers what dictionaries are to spoken languages). Applications of the database in serials control and complex systems…
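
    A minimal sketch of the idea (invented journal/publisher data): normalization factors the repeated publisher details into their own table, linked by a key, and a join reconstructs the flat view on demand.

```python
# Unnormalized: publisher details repeat in every journal row.
journals_flat = [
    {"title": "J. Data", "publisher": "Acme", "publisher_city": "Boston"},
    {"title": "J. Bits", "publisher": "Acme", "publisher_city": "Boston"},
]

# Normalized: the repeated association lives in its own table, and
# journal rows refer to it through a key.
publishers = {1: {"name": "Acme", "city": "Boston"}}
journals = [
    {"title": "J. Data", "publisher_id": 1},
    {"title": "J. Bits", "publisher_id": 1},
]

# A "join" rebuilds the original flat view when needed.
for j in journals:
    p = publishers[j["publisher_id"]]
    print(j["title"], p["name"], p["city"])
```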

  14. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  15. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The

  16. Private and Efficient Query Processing on Outsourced Genomic Databases.

    Science.gov (United States)

    Ghasemi, Reza; Al Aziz, Md Momin; Mohammed, Noman; Dehkordi, Massoud Hadian; Jiang, Xiaoqian

    2017-09-01

    Applications of genomic studies are spreading rapidly in many domains of science and technology such as healthcare, biomedical research, direct-to-consumer services, and legal and forensic applications. However, there are a number of obstacles that make it hard to access and process a big genomic database for these applications. First, sequencing a genome is a time-consuming and expensive process. Second, it requires large-scale computation and storage systems to process genomic sequences. Third, genomic databases are often owned by different organizations, and thus not available for public usage. The cloud computing paradigm can be leveraged to facilitate the creation and sharing of big genomic databases for these applications. Genomic data owners can outsource their databases to a centralized cloud server to ease access to their databases. However, data owners are reluctant to adopt this model, as it requires outsourcing the data to an untrusted cloud service provider that may cause data breaches. In this paper, we propose a privacy-preserving model for outsourcing genomic data to a cloud. The proposed model enables query processing while providing privacy protection of genomic databases. Privacy of the individuals is guaranteed by permuting and adding fake genomic records to the database. These techniques allow the cloud to evaluate count and top-k queries securely and efficiently. Experimental results demonstrate that a count and a top-k query over 40 Single Nucleotide Polymorphisms (SNPs) in a database of 20,000 records take around 100 and 150 s, respectively.
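
    A minimal sketch of the fake-record idea (toy data, not the paper's cryptographic protocol, which also permutes and protects the record contents): the cloud counts over real and fake records alike, and the data owner corrects the result using its private list of fakes.

```python
# Count queries over an outsourced database padded with fake records.
import random

real_records = [{"snp1": "AA"}, {"snp1": "AG"}, {"snp1": "AA"}]
fake_records = [{"snp1": "AA"}, {"snp1": "GG"}]  # kept secret by the owner

def outsource(real, fake):
    db = real + fake
    random.shuffle(db)  # permutation hides the record order
    return db

def cloud_count(db, field, value):
    # The (untrusted) cloud counts matches over real + fake records.
    return sum(1 for rec in db if rec[field] == value)

def owner_correct(count, fake, field, value):
    # The owner subtracts the matches contributed by fake records.
    return count - sum(1 for rec in fake if rec[field] == value)

db = outsource(real_records, fake_records)
raw = cloud_count(db, "snp1", "AA")
print(owner_correct(raw, fake_records, "snp1", "AA"))  # -> 2
```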

  17. Cloud Computing Databases: Latest Trends and Architectural Concepts

    OpenAIRE

    Tarandeep Singh; Parvinder S. Sandhu

    2011-01-01

    Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. There is a varied set of cloud services...

  18. IAEA nuclear databases for applications

    International Nuclear Information System (INIS)

    Schwerer, Otto

    2003-01-01

    The Nuclear Data Section (NDS) of the International Atomic Energy Agency (IAEA) provides nuclear data services to scientists on a worldwide scale with particular emphasis on developing countries. More than 100 data libraries are made available cost-free by Internet, CD-ROM and other media. These databases are used for practically all areas of nuclear applications as well as basic research. An overview is given of the most important nuclear reaction and nuclear structure databases, such as EXFOR, CINDA, ENDF, NSR, ENSDF, NUDAT, and of selected special purpose libraries such as FENDL, RIPL, RNAL, the IAEA Photonuclear Data Library, and the IAEA charged-particle cross section database for medical radioisotope production. The NDS also coordinates two international nuclear data centre networks and is involved in data development activities (to create new or improve existing data libraries when the available data are inadequate) and in technology transfer to developing countries, e.g. through the installation and support of the mirror web site of the IAEA Nuclear Data Services at IPEN (operational since March 2000) and by organizing nuclear-data related workshops. By encouraging their participation in IAEA Co-ordinated Research Projects and also by compiling their experimental results in databases such as EXFOR, the NDS helps to make developing countries' contributions to nuclear science visible and conveniently available. The web address of the IAEA Nuclear Data Services is http://www.nds.iaea.org and the NDS mirror service at IPEN (Brasil) can be accessed at http://www.nds.ipen.br/ (author)

  19. Internet Databases of the Properties, Enzymatic Reactions, and Metabolism of Small Molecules—Search Options and Applications in Food Science

    Directory of Open Access Journals (Sweden)

    Piotr Minkiewicz

    2016-12-01

    Full Text Available Internet databases of small molecules, their enzymatic reactions, and metabolism have emerged as useful tools in food science. Database searching is also introduced as part of chemistry or enzymology courses for food technology students. Such resources support the search for information about single compounds and facilitate secondary analyses of large datasets. Information can be retrieved from databases by searching for the compound name or structure; compounds may be annotated with the help of chemical codes or drawn using molecule-editing software. Data mining options may be enhanced by navigating through a network of links and cross-links between databases. The exemplary databases reviewed in this article belong to two classes: tools concerning small molecules (including general and specialized databases annotating food components) and tools annotating enzymes and metabolism. Some problems associated with database application are also discussed. Data summarized in computer databases may be used for calculation of daily intake of bioactive compounds, prediction of metabolism of food components, and their biological activity, as well as for prediction of interactions between food components and drugs.

  20. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  1. PROPERTY DATABASE FOR THE DEVELOPMENT OF SHAPE MEMORY ALLOY APPLICATIONS

    OpenAIRE

    Tang, W.; Cederström, J.; Sandström, R.

    1991-01-01

    Important points involving the selection of shape memory alloy (SMA) application projects are discussed. The development of a property database for SMAs is initiated. Both conventional data and characteristics which are unique to SMAs are stored. As an application example of the database SMA-SELECT, important properties for Ti-Ni alloys near the equi-atomic composition, such as the temperature window width for superelasticity (SE), stress rate, critical yield stress, and their interactions have ...

  2. Database management system for large container inspection system

    International Nuclear Information System (INIS)

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    The Large Container Inspection System (LCIS), based on radiation imaging technology, is a powerful tool for the Customs to check the contents inside a large container without opening it. The authors discuss a database application system, as part of the Signal and Image System (SIS), for the LCIS. The basic requirements analysis was done first. Then the selection of computer hardware, operating system, and database management system was made according to the prevailing technology and market conditions. Based on the above considerations, a database application system with central management and distributed operation features has been implemented

  3. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  4. Integrating the DLD dosimetry system into the Almaraz NPP Corporative Database

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    1996-01-01

    The article discusses the experience acquired during the integration of a new MGP Instruments DLD Dosimetry System into the Almaraz NPP corporative database and general communications network, following a client-server philosophy and taking into account the computer standards of the Plant. The most important results obtained are: - Integration of DLD dosimetry information into corporative databases, permitting the use of new applications. - Sharing of existing personnel information with the DLD dosimetry application, thereby avoiding the redundant work of introducing data and improving the quality of the information. - Facilitation of maintenance, both software and hardware, of the DLD system. - Maximum exploitation, from the computer point of view, of the initial investment. - Adaptation of the application to the applicable legislation. (Author)

  5. Applying artificial intelligence to astronomical databases - a surveyof applicable technology.

    Science.gov (United States)

    Rosenthal, D. A.

    This paper surveys several emerging technologies which are relevant to astronomical database issues such as interface technology, internal database representation, and intelligent data reduction aids. Among the technologies discussed are natural language understanding, frame and object representations, planning, pattern analysis, machine learning and the nascent study of simulated neural nets. These techniques will become increasingly important for astronomical research, and in particular, for applications with large databases.

  6. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. The discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  7. Proceedings of the 11th China symposium on computer application in modern science and technology

    International Nuclear Information System (INIS)

    2003-01-01

    The 11th China symposium on computer application in modern science and technology was held by the China Electronics Society and the Nuclear Electronics and Nuclear Detecting Technology branch Society of the China Nuclear Society on September 8th-12th, 2003, in Changdao, Shandong province. 77 articles are collected in the proceedings. The contents include calculation and calculation methods, software systems and software applications, data acquisition and control systems, databases and management information systems, general systems, network applications, grid calculation and its application systems, and so on

  8. A database of fragmentation cross section measurements applicable to cosmic ray propagation calculations

    International Nuclear Information System (INIS)

    Crawford, H.J.; Engelage, J.; Jones, F.C.

    1989-08-01

    A database of single-particle inclusive fragment production cross section measurements has been established and is accessible over common computer networks. These measurements have been obtained from both published literature and direct communication with experimenters and include cross sections for nuclear beams on H, He, and heavier targets, and for H and He beams on nuclear targets, for energies >30 MeV/nucleon. These cross sections are directly applicable to calculations involving cosmic ray nuclear interactions with matter. The database includes projectile, target, and fragment specifications, beam energy, cross section with uncertainty, literature reference, and comment code. It is continuously updated to assure accuracy and completeness. Also available are widely used semi-empirical formulations for calculating production cross sections and excitation functions. In this paper we discuss the database in detail and describe how it can be accessed. We compare the measurements with semi-empirical calculations and point out areas where improved calculations and further cross section measurements are required. 5 refs., 2 figs

  9. A database for CO2 Separation Performances of MOFs based on Computational Materials Screening.

    Science.gov (United States)

    Altintas, Cigdem; Avci, Gokay; Daglar, Hilal; Nemati Vesali Azar, Ayda; Velioglu, Sadiye; Erucar, Ilknur; Keskin, Seda

    2018-05-03

    Metal organic frameworks (MOFs) have been considered great candidates for CO2 capture. Considering the very large number of available MOFs, high-throughput computational screening plays a critical role in identifying the top-performing materials for target applications in a time-effective manner. In this work, we used molecular simulations to screen the most recent and complete MOF database to identify the most promising materials for CO2 separation from flue gas (CO2/N2) and landfill gas (CO2/CH4) under realistic operating conditions. We first validated our approach by comparing the results of our molecular simulations for the CO2 uptakes and CO2/N2 and CO2/CH4 selectivities of various types of MOFs with the available experimental data. We then computed binary CO2/N2 and CO2/CH4 mixture adsorption data for the entire MOF database and used these results to calculate several adsorbent selection metrics such as selectivity, working capacity, adsorbent performance score, regenerability, and separation potential. MOFs were ranked based on the combination of these metrics, and the top-performing MOF adsorbents that can achieve CO2/N2 and CO2/CH4 separations with high performance were identified. Molecular simulations for the adsorption of a ternary CO2/N2/CH4 mixture were performed for these top materials in order to provide a more realistic performance assessment of MOF adsorbents. Structure-performance analysis showed that MOFs with ΔQ > 30 kJ/mol, 3.8 Å ≤ PLD ≤ 5 Å, 5 Å ≤ LCD ≤ 7.5 Å, 0.5 ≤ ϕ ≤ 0.75, SA ≤ 1,000 m²/g, and ρ > 1 g/cm³ are the best candidates for selective separation of CO2 from flue gas and landfill gas. This information will be very useful for designing novel MOFs with the desired structural features that can lead to high CO2 separation potentials. Finally, an online, freely accessible database https://cosmoserc.ku.edu.tr was established, for the first time in the literature, which reports all computed adsorbent metrics of 3,816 MOFs for CO2/N2, CO2/CH4
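
    The adsorbent selection metrics named here have standard textbook definitions; a small sketch (invented uptake numbers, not values from the database) shows how they combine.

```python
# Common adsorbent selection metrics (standard definitions; conditions
# used in the paper may differ). Uptakes N in mol/kg; y values are
# gas-phase mole fractions; all numbers below are invented.
def selectivity(n_co2, n_other, y_co2, y_other):
    # Adsorption selectivity: (x_CO2 / x_other) / (y_CO2 / y_other)
    return (n_co2 / n_other) * (y_other / y_co2)

def working_capacity(n_ads, n_des):
    # Uptake difference between adsorption and desorption conditions.
    return n_ads - n_des

def regenerability(n_ads, n_des):
    # R (%) = deltaN / N_ads * 100
    return (n_ads - n_des) / n_ads * 100.0

n_co2_ads, n_co2_des, n_n2_ads = 4.2, 1.1, 0.3   # invented uptakes
s = selectivity(n_co2_ads, n_n2_ads, y_co2=0.15, y_other=0.85)
wc = working_capacity(n_co2_ads, n_co2_des)
aps = s * wc  # adsorbent performance score: selectivity x working capacity
print(f"S={s:.1f}, deltaN={wc:.2f} mol/kg, APS={aps:.1f}, "
      f"R={regenerability(n_co2_ads, n_co2_des):.0f}%")
```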

  10. JT-60 database system, 2

    International Nuclear Information System (INIS)

    Itoh, Yasuhiro; Kurihara, Kenichi; Kimura, Toyoaki.

    1987-07-01

    The JT-60 central control system, ''ZENKEI'', collects the control and instrumentation data relevant to discharge, together with device status data for plant monitoring. The engineering data amount to about 3 Mbytes per shot of discharge. The ''ZENKEI'' control system, which consists of seven minicomputers for on-line real-time control, has little capacity for handling such a large amount of data for physical and engineering analysis. In order to solve this problem, it was planned to establish the experimental database on the Front-end Processor (FEP) of a general-purpose large computer in the JAERI Computer Center. A database management system (DBMS), therefore, has been developed for creating the database during the shot interval. The engineering data are shipped up from ''ZENKEI'' to the FEP through a dedicated communication line after the shot. A hierarchical data model has been adopted in this database, which consists of data files organized in a tree structure with three keys: system, discharge type and shot number. The JT-60 DBMS provides data handling packages of subroutines for interfacing the database with users' application programs. Subroutine packages for supporting graphic processing and an access control function for database security are also provided in this DBMS. (author)
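
    As a picture of that tree structure, a minimal sketch (invented system names and shot values; nested dictionaries standing in for the FEP data files) looks like this.

```python
# Hypothetical illustration of a hierarchical data model keyed on
# system, discharge type and shot number, as in the JT-60 database.
database = {
    "poloidal_field": {            # key 1: system
        "limiter": {               # key 2: discharge type
            10214: {"coil_current_kA": 52.3},   # key 3: shot number
            10215: {"coil_current_kA": 54.1},
        },
    },
}

def lookup(db, system, discharge_type, shot):
    # Walk the tree one key at a time, as a hierarchical DBMS would.
    return db[system][discharge_type][shot]

print(lookup(database, "poloidal_field", "limiter", 10214))
```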

  11. Cloud-Based NoSQL Open Database of Pulmonary Nodules for Computer-Aided Lung Cancer Diagnosis and Reproducible Research.

    Science.gov (United States)

    Ferreira Junior, José Raniery; Oliveira, Marcelo Costa; de Azevedo-Marques, Paulo Mazzoncini

    2016-12-01

    Lung cancer is the leading cause of cancer-related deaths in the world, and its main manifestation is pulmonary nodules. Detection and classification of pulmonary nodules are challenging tasks that must be done by qualified specialists, but image interpretation errors make those tasks difficult. In order to aid radiologists in those hard tasks, it is important to integrate computer-based tools with the lesion detection, pathology diagnosis, and image interpretation processes. However, computer-aided diagnosis research faces the problem of not having enough shared medical reference data for the development, testing, and evaluation of computational methods for diagnosis. In order to minimize this problem, this paper presents a public nonrelational document-oriented cloud-based database of pulmonary nodules characterized by 3D texture attributes, identified by experienced radiologists and classified according to nine different subjective characteristics by the same specialists. Our goal with the development of this database is to improve computer-aided lung cancer diagnosis and pulmonary nodule detection and classification research through the deployment of this database in a cloud Database-as-a-Service framework. Pulmonary nodule data were provided by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), image descriptors were acquired by a volumetric texture analysis, and the database schema was developed using a document-oriented Not only Structured Query Language (NoSQL) approach. The proposed database now holds 379 exams, 838 nodules, and 8237 images (4029 of them CT scans and 4208 manually segmented nodules), and it is allocated in a MongoDB instance on a cloud infrastructure.
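
    As an illustration of the document-oriented approach (field names below are invented, not the published schema), each nodule can be stored as one self-contained MongoDB document holding its texture descriptors and subjective ratings together.

```python
# Requires a running MongoDB instance; pymongo is the official driver.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical host
nodules = client["lung_cad"]["nodules"]            # invented names

# One document per nodule: texture descriptors and the radiologists'
# subjective ratings travel together, with no joins needed.
nodules.insert_one({
    "exam_id": "LIDC-0001",
    "nodule_id": 1,
    "texture_3d": {"energy": 0.12, "entropy": 4.7, "contrast": 310.5},
    "ratings": {"malignancy": 4, "spiculation": 2},
})

# Query example: nodules rated as likely malignant.
for doc in nodules.find({"ratings.malignancy": {"$gte": 4}}):
    print(doc["exam_id"], doc["nodule_id"])
```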

  12. Computer application for database management and networking of service radio physics; Aplicacion informatica para la gestion de bases de datos y conexiones en red de un servicio de radiofisica

    Energy Technology Data Exchange (ETDEWEB)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-07-01

    Databases in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the centre's computer network. A computer that acts as the server provides the database to the treatment units so that quality control measurements and incidents can be recorded daily. Common problems such as shortcuts that stop working after data migration, the possible use of duplicates, and erroneous data or data loss caused by errors in network connections led us to manage connections and database access centrally, easing maintenance and use for all service personnel.

  13. Development of IAEA nuclear reaction databases and services

    Energy Technology Data Exchange (ETDEWEB)

    Zerkin, V.; Trkov, A. [International Atomic Energy Agency, Dept. of Nuclear Sciences and Applications, Vienna (Austria)

    2008-07-01

    From mid-2004 onwards, the major nuclear reaction databases (EXFOR, CINDA and ENDF) and services (Web and CD-ROM retrieval systems and specialized applications) have been functioning within a modern computing environment as multi-platform software, working under several operating systems with relational databases. Subsequent work at the IAEA has focused on three areas of development: revision and extension of the contents of the databases; extension and improvement of the functionality and integrity of the retrieval systems; and development of software for database maintenance and system deployment. (authors)

  14. Klaim-DB: A Modeling Language for Distributed Database Applications

    DEFF Research Database (Denmark)

    Wu, Xi; Li, Ximeng; Lluch Lafuente, Alberto

    2015-01-01

    We present the modelling language Klaim-DB for distributed database applications. Klaim-DB borrows the distributed nets of the coordination language Klaim but essentially re-incarnates the tuple spaces of Klaim as databases, and provides high-level language abstractions for the access and manipulation of structured data, with integrity and atomicity considerations. We present the formal semantics of Klaim-DB and illustrate the use of the language in a scenario where the sales from different branches of a chain of department stores are aggregated from their local databases. It can be seen that raising the abstraction level and encapsulating integrity checks (concerning the schema of tables, etc.) in the language primitives for database operations benefits the modelling task considerably.

  15. A PC/workstation cluster computing environment for reservoir engineering simulation applications

    International Nuclear Information System (INIS)

    Hermes, C.E.; Koo, J.

    1995-01-01

    Like the rest of the petroleum industry, Texaco has been transferring its applications and databases from mainframes to PC's and workstations. This transition has been very positive because it provides an environment for integrating applications, increases end-user productivity, and in general reduces overall computing costs. On the down side, the transition typically results in a dramatic increase in workstation purchases and raises concerns regarding the cost and effective management of computing resources in this new environment. The workstation transition also places the user in a Unix computing environment which, to say the least, can be quite frustrating to learn and to use. This paper describes the approach, philosophy, architecture, and current status of the new reservoir engineering/simulation computing environment developed at Texaco's E and P Technology Dept. (EPTD) in Houston. The environment is representative of those under development at several other large oil companies and is based on a cluster of IBM and Silicon Graphics Intl. (SGI) workstations connected by a fiber-optics communications network and engineering PC's connected to local area networks, or Ethernets. Because computing resources and software licenses are shared among a group of users, the new environment enables the company to get more out of its investments in workstation hardware and software

  16. COMAR - database for certified reference materials

    International Nuclear Information System (INIS)

    Klich, H.; Caliste, J.P.

    1988-01-01

    With more than 130 producers of reference materials (RM) throughout the world, it is often difficult to find the best reference material for a specific application. The computer database COMAR has been developed to aid chemists in finding the needed reference material. (orig.)

  17. Computer Applications in Educational Audiology.

    Science.gov (United States)

    Mendel, Lisa Lucks; And Others

    1995-01-01

    This article provides an overview of how computer technologies can be used by educational audiologists. Computer technologies are classified into three categories: (1) information systems applications; (2) screening and diagnostic applications; and (3) intervention applications. (Author/DB)

  18. Relational databases for conditions data and event selection in ATLAS

    International Nuclear Information System (INIS)

    Viegas, F; Hawkings, R; Dimitrov, G

    2008-01-01

    The ATLAS experiment at LHC will make extensive use of relational databases in both online and offline contexts, running to O(TBytes) per year. Two of the most challenging applications in terms of data volume and access patterns are conditions data, making use of the LHC conditions database, COOL, and the TAG database, that stores summary event quantities allowing a rapid selection of interesting events. Both of these databases are being replicated to regional computing centres using Oracle Streams technology, in collaboration with the LCG 3D project. Database optimisation, performance tests and first user experience with these applications will be described, together with plans for first LHC data-taking and future prospects

  19. Relational databases for conditions data and event selection in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Viegas, F; Hawkings, R; Dimitrov, G [CERN, CH-1211 Geneve 23 (Switzerland)

    2008-07-15

    The ATLAS experiment at LHC will make extensive use of relational databases in both online and offline contexts, running to O(TBytes) per year. Two of the most challenging applications in terms of data volume and access patterns are conditions data, making use of the LHC conditions database, COOL, and the TAG database, that stores summary event quantities allowing a rapid selection of interesting events. Both of these databases are being replicated to regional computing centres using Oracle Streams technology, in collaboration with the LCG 3D project. Database optimisation, performance tests and first user experience with these applications will be described, together with plans for first LHC data-taking and future prospects.

  20. Application of material databases for improved reliability of reactor pressure vessels

    International Nuclear Information System (INIS)

    Griesbach, T.J.; Server, W.L.; Beaudoin, B.F.; Burgos, B.N.

    1994-01-01

    A reactor vessel Life Cycle Management program must begin with an accurate characterization of the vessel material properties. Uncertainties in vessel material properties, or the use of bounding values, may result in unnecessary conservatisms in vessel integrity calculations. These conservatisms may be eliminated through a better understanding of the material properties in reactor vessels, both in the unirradiated and irradiated conditions. Reactor vessel material databases are available for quantifying the chemistry and Charpy shift behavior of individual heats of reactor vessel materials. Application of the databases to vessels with embrittlement concerns has proven to be an effective embrittlement management tool. This paper presents details of database development and applications which demonstrate the value of using material databases for improving material chemistry characterization and for maximizing the data from integrated material surveillance programs

  1. Exploring Natural Products from the Biodiversity of Pakistan for Computational Drug Discovery Studies: Collection, Optimization, Design and Development of A Chemical Database (ChemDP).

    Science.gov (United States)

    Mirza, Shaher Bano; Bokhari, Habib; Fatmi, Muhammad Qaiser

    2015-01-01

    Pakistan possesses a rich and vast source of natural products (NPs). Some of these secondary metabolites have been identified as potent therapeutic agents. However, the medicinal usage of most of these compounds has not yet been fully explored. Discoveries of new scaffolds of NPs as inhibitors of certain enzymes or receptors using advanced computational drug discovery approaches are also limited due to the unavailability of accurate 3D structures of NPs. An organized database incorporating all relevant information, therefore, can facilitate exploration of the medicinal importance of metabolites from Pakistani biodiversity. The Chemical Database of Pakistan (ChemDP; release 01) is a fully referenced, evolving, web-based, virtual database which has been designed and developed to introduce natural products (NPs) and their derivatives from the biodiversity of Pakistan to the global scientific community. The prime aim is to provide quality structures of compounds with relevant information for computer-aided drug discovery studies. For this purpose, over 1000 NPs have been identified from more than 400 published articles, for which 2D and 3D molecular structures have been generated with a special focus on their stereochemistry, where applicable. The PM7 semiempirical quantum chemistry method has been used to energy-optimize the 3D structures of NPs. The 2D and 3D structures can be downloaded as .sdf, .mol, .sybyl, .mol2, and .pdb files - formats readable by many chemoinformatics/bioinformatics software packages. Each entry in ChemDP contains over 100 data fields representing various molecular, biological, physico-chemical and pharmacological properties, which have been properly documented in the database for end users. These pieces of information have been either manually extracted from the literature or computationally calculated using various computational tools. Cross-referencing to a major data repository, i.e. ChemSpider, has been made available for overlapping

  2. Solutions for medical databases optimal exploitation.

    Science.gov (United States)

    Branescu, I; Purcarea, V L; Dobrescu, R

    2014-03-15

    The paper discusses methods for applying OLAP techniques to multidimensional databases that leverage the existing performance-enhancing technique known as practical pre-aggregation, by making this technique relevant to a much wider range of medical applications, as logistic support to data warehousing techniques. The transformations have low computational complexity and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies in current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases.
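
    The idea behind practical pre-aggregation can be sketched in a few lines (hypothetical ward/hospital hierarchy, invented numbers): aggregates are materialized once at a low level of the dimension hierarchy, and higher-level queries are answered from them rather than by rescanning the raw facts.

```python
# Practical pre-aggregation, illustrated with an assumed schema.
raw_facts = [
    ("ward_a", 3), ("ward_a", 5), ("ward_b", 2), ("ward_c", 7),
]
hierarchy = {"ward_a": "hosp_1", "ward_b": "hosp_1", "ward_c": "hosp_2"}

# Pre-aggregate once at the lowest level of the dimension hierarchy.
ward_totals = {}
for ward, value in raw_facts:
    ward_totals[ward] = ward_totals.get(ward, 0) + value

# Roll up from the pre-aggregates, not the raw facts.
hospital_totals = {}
for ward, total in ward_totals.items():
    hosp = hierarchy[ward]
    hospital_totals[hosp] = hospital_totals.get(hosp, 0) + total
print(hospital_totals)  # {'hosp_1': 10, 'hosp_2': 7}
```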

  3. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    Science.gov (United States)

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article.

  4. The new Cloud Dynamics and Radiation Database algorithms for AMSR2 and GMI: exploitation of the GPM observational database for operational applications

    Science.gov (United States)

    Cinzia Marra, Anna; Casella, Daniele; Martins Costa do Amaral, Lia; Sanò, Paolo; Dietrich, Stefano; Panegrossi, Giulia

    2017-04-01

    Two new precipitation retrieval algorithms, for the Advanced Microwave Scanning Radiometer 2 (AMSR2) and for the GPM Microwave Imager (GMI), are presented. The algorithms are based on the Cloud Dynamics and Radiation Database (CDRD) Bayesian approach and represent an evolution of the previous version applied to Special Sensor Microwave Imager/Sounder (SSMIS) observations and used operationally within the EUMETSAT Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF). The main innovation of these new products is the use of an extended, entirely empirical database derived from coincident radar and radiometer observations from the NASA/JAXA Global Precipitation Measurement Core Observatory (GPM-CO) (Dual-frequency Precipitation Radar (DPR) and GMI). The other new aspects are: 1) a new rain/no-rain screening approach; 2) the use of Empirical Orthogonal Functions (EOF) and Canonical Correlation Analysis (CCA) both in the screening approach and in the Bayesian algorithm; 3) the use of new meteorological and environmental ancillary variables to categorize the database and mitigate the problem of non-uniqueness of the retrieval solution; 4) the development and implementation of specific modules for minimizing computational time. The CDRD algorithms for AMSR2 and GMI are able to handle the extremely large observational database available from the GPM-CO and provide rainfall estimates with minimum latency, making them suitable for near-real-time hydrological and operational applications. As for CDRD for AMSR2, a verification study over Italy using ground-based radar data and over the MSG full-disk area using coincident GPM-CO/AMSR2 observations has been carried out. Results show remarkable AMSR2 capabilities for rainfall rate (RR) retrieval over ocean (for RR > 0.25 mm/h), good capabilities over vegetated land (for RR > 1 mm/h), while for coastal areas the results are less certain. Comparisons with NASA GPM products, and with
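
    To make the Bayesian step concrete, a toy sketch (all numbers invented, and far simpler than the CDRD implementation, which also applies EOF/CCA transformations and ancillary-variable categorization) estimates the rain rate as a likelihood-weighted average over database entries.

```python
# Toy Bayesian retrieval: posterior-mean rain rate from a database of
# brightness-temperature (Tb) profiles with attached rain rates.
import numpy as np

db_tb = np.array([[210.0, 250.0], [220.0, 255.0], [240.0, 270.0]])  # entries
db_rr = np.array([0.5, 2.0, 8.0])    # rain rate (mm/h) per entry, invented
obs = np.array([222.0, 256.0])       # observed Tb vector, invented
sigma = 3.0                          # assumed observation error (K)

# Gaussian likelihood of the observation given each database entry.
w = np.exp(-0.5 * np.sum(((db_tb - obs) / sigma) ** 2, axis=1))
print(np.sum(w * db_rr) / np.sum(w))  # likelihood-weighted rain rate
```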

  5. Geoscientific (GEO) database of the Andra Meuse / Haute-Marne research center

    International Nuclear Information System (INIS)

    Tabani, P.; Hemet, P.; Hermand, G.; Delay, J.; Auriere, C.

    2010-01-01

    Document available in extended abstract form only. The GEO database (geo-scientific database of the Meuse/Haute-Marne Center) is a tool developed by Andra with a view to grouping, in a secure computer form, all data related to the acquisition of in situ and laboratory measurements made on solid and fluid samples. This database has three main functions: - Acquisition and management of data and computer files related to geological, geomechanical, hydrogeological and geochemical measurements on solid and fluid samples and in situ measurements (logging, on-sample measurements, geological logs, etc.). - Consultation, available to staff on Andra's intranet network, for selective viewing of data linked to a borehole and/or a sample and for making computations and graphs on sets of laboratory measurements related to a sample. - Physical management of fluid and solid samples stored in a 'core library', in order to localize a sample, follow up its movement out of the 'core library' to an organization, and carry out regular inventories. The GEO database is a relational Oracle database. It is installed on a data server which stores information and manages the users' transactions. Users can consult, download and exploit data from any computer connected to the Andra network or the Internet. Management of access rights is made through a login/password. Four geo-scientific applications are linked to the GEO database; among them: - The Geosciences portal: a web intranet application accessible from the ANDRA network. It does not require a particular installation on the client side and is accessible through an Internet navigator. A SQL Server Express database manages the users and access rights to the application. This application is used for the acquisition of hydrogeological and geochemical data collected in the field and on fluid samples, as well as data related to scientific work carried out at surface level or in drifts

  6. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  7. ARIDA: An Arabic Inter-Language Database and Its Applications: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Ghazi Abuhakema

    2009-08-01

    Full Text Available This paper describes a pilot study in which we collected a small learner corpus of Arabic, developed a tagset for error annotation and performed simple Computer-aided Error Analysis (CEA) on the data. For this study, we adapted the French Interlanguage Database (FRIDA) tagset (Granger, 2003a) to the data. We chose FRIDA in order to keep our tagging in line with a known standard. The paper describes the need for learner corpora, the learner data we have collected, the tagset we have developed, its advantages and disadvantages, the preliminary CEA results, other potential applications of the error-annotated corpus of Arabic, and the error frequency distribution of both proficiency levels, as well as our ongoing work.

  8. High Performance Protein Sequence Database Scanning on the Cell Broadband Engine

    Directory of Open Access Journals (Sweden)

    Adrianto Wirawan

    2009-01-01

    Full Text Available The enormous growth of biological sequence databases has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing rapidly as well. The recent emergence of low cost parallel multicore accelerator technologies has made it possible to reduce execution times of many bioinformatics applications. In this paper, we demonstrate how the Cell Broadband Engine can be used as a computational platform to accelerate two approaches for protein sequence database scanning: exhaustive and heuristic. We present efficient parallelization techniques for two representative algorithms: the dynamic programming based Smith–Waterman algorithm and the popular BLASTP heuristic. Their implementation on a Playstation®3 leads to significant runtime savings compared to corresponding sequential implementations.
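
    To make the dynamic-programming half of this concrete, the following is a minimal serial Smith-Waterman sketch in Python; it is an illustrative toy, not the parallel Cell implementation the paper describes, and the scoring parameters are assumptions chosen for the example.

        # Minimal serial Smith-Waterman local alignment (illustrative only;
        # the paper parallelizes this dynamic programming on the Cell BE).
        def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
            rows, cols = len(a) + 1, len(b) + 1
            h = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = h[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                    h[i][j] = max(0, diag, h[i-1][j] + gap, h[i][j-1] + gap)
                    best = max(best, h[i][j])
            return best  # score of the best local alignment

        print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))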

  9. Analysis and Design of Web-Based Database Application for Culinary Community

    OpenAIRE

    Huda, Choirul; Awang, Osel Dharmawan; Raymond, Raymond; Raynaldi, Raynaldi

    2017-01-01

    This research is based on the rapid development of the culinary and information technology. The difficulties in communicating with the culinary expert and on recipe documentation make a proper support for media very important. Therefore, a web-based database application for the public is important to help the culinary community in communication, searching and recipe management. The aim of the research was to design a web-based database application that could be used as social media for the cu...

  10. NIRS database of the original research database

    International Nuclear Information System (INIS)

    Morita, Kyoko

    1991-01-01

    Recently, library staffs arranged and compiled the original research papers that have been written by researchers for 33 years since National Institute of Radiological Sciences (NIRS) established. This papers describes how the internal database of original research papers has been created. This is a small sample of hand-made database. This has been cumulating by staffs who have any knowledge about computer machine or computer programming. (author)

  11. In-database processing of a large collection of remote sensing data: applications and implementation

    Science.gov (United States)

    Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina

    2016-04-01

    Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and has inspired several information systems. Some of them, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to Earth sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability and provide a high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating the files into a relational database as foreign data sources and performing analytical processing inside the database engine. Thereby a higher-level query language can efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or a grid cell to complex aggregation over spatial or temporal extents over a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout, and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications. The use of SQL - a widely used higher-level declarative query language - simplifies interoperability
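
    As a rough illustration of this in-database style of analysis, the sketch below runs one declarative aggregation over a hypothetical foreign table from Python. The table and column names (sat_products, acquired, region, value), the connection string, and the use of psycopg2 and PostGIS are all assumptions for the example, not details from the paper.

        import psycopg2  # assumed PostgreSQL client library

        # One declarative query aggregates over many underlying data files;
        # 'sat_products' is a hypothetical foreign table backed by the archive.
        conn = psycopg2.connect("dbname=rs_archive user=science")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT date_trunc('month', acquired) AS month, avg(value)
                FROM sat_products
                WHERE region && ST_MakeEnvelope(82.0, 54.0, 84.0, 56.0, 4326)
                GROUP BY month ORDER BY month
                """
            )
            for month, mean_value in cur.fetchall():
                print(month, mean_value)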

  12. Computed radiography in NDT application

    International Nuclear Information System (INIS)

    Deprins, Eric

    2004-01-01

    Computed Radiography, or digital radiography by use of reusable Storage Phosphor screens, offers a convenient and reliable way to replace film. In addition to the reduced cost of consumables, the return on investment of CR systems is strongly determined by savings in exposure times, processing times and archival times. Intangible costs such as plant shutdown, environmental safety and longer usability of isotopes are also increasingly important when considering replacing film with Storage Phosphor systems. But more than in traditional radiography, the use of digital images is a trade-off between speed and the required quality. Better image quality is obtained by longer exposure times, slower phosphor screens and higher scan resolutions. Therefore, different kinds of storage phosphor screens are needed in order to cover every application. Most operations have the data associated with the tests to be performed centrally stored in a database. Using a digital radiography system gives not only the advantages of manipulating digital images, but also of the digital data associated with them. Smart methods to associate cassettes and storage screens with exposed images enhance the workflow of NDT processes and avoid human error. Automated measurement tools increase the throughput in different kinds of operations. This paper gives an overview of the ways certain operations have decided to replace film with Computed Radiography, and what the major benefits for them have been.

  13. Computer science a concise introduction

    CERN Document Server

    Sinclair, Ian

    2014-01-01

    Computer Science: A Concise Introduction covers the fundamentals of computer science. The book describes micro-, mini-, and mainframe computers and their uses; the ranges and types of computers and peripherals currently available; applications to numerical computation; and commercial data processing and industrial control processes. The functions of data preparation, data control, computer operations, applications programming, systems analysis and design, database administration, and network control are also encompassed. The book then discusses batch, on-line, and real-time systems; the basic

  14. Advanced in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

    The theme of CSA is focused on the various aspects of computer science and its applications, providing an opportunity for academic and industry professionals to discuss the latest issues and progress in the area. This book therefore includes various theories and practical applications in computer science.

  15. GAMCAT - a personal computer database on alpha particles and gamma rays from radioactive decay

    International Nuclear Information System (INIS)

    Tepel, J.W.; Mueller, H.W.

    1990-01-01

    The GAMCAT database is a compilation of data describing the alpha particles and gamma rays that occur in the radioactive decay of all known nuclides, adapted for IBM Personal Computers and compatible systems. These compiled data have been previously published, and are now available as a compact database. Entries can be retrieved by defining the properties of the parent nuclei as well as alpha-particle and gamma-ray energies or any combination of these parameters. The system provides fast access to the data and has been completely written in C to run on an AT-compatible computer, with a hard disk and 640K of memory under DOS 2.11 or higher. GAMCAT is available from the Fachinformationszentrum Karlsruhe. (orig.)
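
    The retrieval style GAMCAT describes (filtering decay lines by parent properties and energy windows) can be illustrated with a small Python sketch; the record layout, field names and example values below are invented for illustration and are not taken from the GAMCAT data files.

        # Illustrative lookup of decay lines by energy window, in the spirit
        # of GAMCAT-style retrieval; records and fields are hypothetical.
        LINES = [
            {"parent": "Co-60", "kind": "gamma", "energy_keV": 1173.2},
            {"parent": "Co-60", "kind": "gamma", "energy_keV": 1332.5},
            {"parent": "Am-241", "kind": "alpha", "energy_keV": 5485.6},
        ]

        def find_lines(kind, e_min, e_max):
            """Return all lines of a given kind within [e_min, e_max] keV."""
            return [r for r in LINES
                    if r["kind"] == kind and e_min <= r["energy_keV"] <= e_max]

        print(find_lines("gamma", 1300.0, 1350.0))  # the 1332.5 keV Co-60 line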

  16. Soft Computing Applications : Proceedings of the 5th International Workshop Soft Computing Applications

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária; Dombi, Joszef; Jain, Lakhmi

    2013-01-01

    This volume contains the Proceedings of the 5th International Workshop on Soft Computing Applications (SOFA 2012). The book covers a broad spectrum of soft computing techniques and theoretical and practical applications employing knowledge and intelligence to find solutions to industrial, economic and medical problems. The combination of such intelligent systems tools and a large number of applications introduces a need for a synergy of scientific and technological disciplines in order to show the great potential of soft computing in all domains. The conference papers included in these proceedings, published post conference, were grouped into the following areas of research: Soft Computing and Fusion Algorithms in Biometrics; Fuzzy Theory, Control and Applications; Modelling and Control Applications; Steps towa...

  17. Development, deployment and operations of ATLAS databases

    International Nuclear Information System (INIS)

    Vaniachine, A. V.; von der Schmitt, J. G.

    2008-01-01

    In preparation for ATLAS data taking, a coordinated shift from development towards operations has occurred in ATLAS database activities. In addition to development and commissioning activities in databases, ATLAS is active in the development and deployment (in collaboration with the WLCG 3D project) of the tools that allow the worldwide distribution and installation of databases and related datasets, as well as the actual operation of this system on ATLAS multi-grid infrastructure. We describe development and commissioning of major ATLAS database applications for online and offline. We present the first scalability test results and ramp-up schedule over the initial LHC years of operations towards the nominal year of ATLAS running, when the database storage volumes are expected to reach 6.1 TB for the Tag DB and 1.0 TB for the Conditions DB. ATLAS database applications require robust operational infrastructure for data replication between online and offline at Tier-0, and for the distribution of the offline data to Tier-1 and Tier-2 computing centers. We describe ATLAS experience with Oracle Streams and other technologies for coordinated replication of databases in the framework of the WLCG 3D services

  18. Computer Cataloging of Electronic Journals in Unstable Aggregator Databases: The Hong Kong Baptist University Library Experience.

    Science.gov (United States)

    Li, Yiu-On; Leung, Shirley W.

    2001-01-01

    Discussion of aggregator databases focuses on a project at the Hong Kong Baptist University library to integrate full-text electronic journal titles from three unstable aggregator databases into its online public access catalog (OPAC). Explains the development of the electronic journal computer program (EJCOP) to generate MARC records for…

  19. Computer application in scientific investigations

    International Nuclear Information System (INIS)

    Govorun, N.N.

    1981-01-01

    A short review of computer development, applications, and software at JINR over the last 15 years is presented. The main trends of studies on computer applications in experimental and theoretical investigations are enumerated: software for computers and computer systems, software for data processing systems, the design of automatic and automated systems for measuring track detector images, the development of techniques for carrying out experiments on-line, packages of applied computer codes, and specialized systems. On-line techniques have been successfully used in investigations of nuclear processes at relativistic energies. A new trend is the development of television methods of data output and its computer recording [ru]

  20. Enhancements to the Redmine Database Metrics Plug in

    Science.gov (United States)

    2017-08-01

    The Redmine project-management web application has been adopted within the US Army Research Laboratory's Computational and Information Sciences Directorate as a database-backed project-management tool. The Redmine plug-in enabled the use of the numerous, powerful features of the web application. Enhancements described here include selectable export of citations/references by type, writing style, and FY, and enhanced naming convention options for ...

  1. The establishment of the Blacknest seismological database on the Rutherford Laboratory system 360/195 computer

    International Nuclear Information System (INIS)

    Blamey, C.

    1977-01-01

    In order to assess the problems which might arise from monitoring a comprehensive test ban treaty by seismological methods, an experimental monitoring operation is being conducted. This work has involved the establishment of a database on the Rutherford Laboratory 360/195 system computer. The database can be accessed in the UK over the public telephone network and in the USA via ARPANET. (author)

  2. Database Description - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information: the Arabidopsis Phenome Database (BioResource Center; Hiroshi Masuya). Database classification: Plant databases - Arabidopsis thaliana. Organism: Arabidopsis thaliana (Taxonomy ID: 3702). Database description: the Arabidopsis thaliana phenome ... their effective application. We developed the new Arabidopsis Phenome Database integrating two novel databases ... useful materials for their experimental research. The other, the “Database of Curated Plant Phenome”, focusing

  3. LHCb Conditions Database Operation Assistance Systems

    CERN Multimedia

    Shapoval, Illya

    2012-01-01

    The Conditions Database of the LHCb experiment (CondDB) provides versioned, time dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger, reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues: - an extension to the automatic content validation done by the “Oracle Streams” replication technology, to trap cases when the replication was unsuccessful; - an automated distribution process for the S...

  4. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Full Text Available Planaria is a member of the phylum Platyhelminthes, the flatworms. Planarians possess the unique ability to regenerate from adult stem cells, or neoblasts, and are important as a model organism for regeneration and developmental studies. Although research is actively carried out globally through conventional methods to understand the process of regeneration from neoblasts and the developmental biology, neurobiology and immunology of Planaria, there are many thought-provoking questions related to stem cell plasticity and the uniqueness of the regenerative potential of planarians among other members of the phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair, and responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but their availability and analysis is a challenging task. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, the application of algorithms in deciphering changes of morphology by RNA interference (RNAi) approaches, and the interpretation of regeneration experiments form a new venture in Planaria research that is helping researchers across the globe understand the biology. We highlight the application of Hidden Markov models (HMMs) in the design of computational tools and their applications in Planaria for decoding its complex biology.
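
    As a reminder of what an HMM-based annotation tool actually computes, here is a minimal, self-contained Viterbi decoder in Python; the two-state model and its probabilities are invented toy values, not parameters from any Planaria tool.

        # Toy Viterbi decoding for a 2-state HMM (hypothetical parameters);
        # HMM-based sequence annotation builds on this recurrence.
        states = ("coding", "noncoding")
        start = {"coding": 0.5, "noncoding": 0.5}
        trans = {"coding": {"coding": 0.9, "noncoding": 0.1},
                 "noncoding": {"coding": 0.1, "noncoding": 0.9}}
        emit = {"coding": {"A": 0.2, "C": 0.3, "G": 0.3, "T": 0.2},
                "noncoding": {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}}

        def viterbi(seq):
            v = [{s: start[s] * emit[s][seq[0]] for s in states}]
            path = {s: [s] for s in states}
            for ch in seq[1:]:
                v.append({})
                new_path = {}
                for s in states:
                    prob, prev = max((v[-2][p] * trans[p][s] * emit[s][ch], p)
                                     for p in states)
                    v[-1][s] = prob
                    new_path[s] = path[prev] + [s]
                path = new_path
            best = max(states, key=lambda s: v[-1][s])
            return path[best]

        print(viterbi("GGCACGTT"))  # most likely hidden state sequence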

  5. Efficient Similarity Search Using the Earth Mover's Distance for Large Multimedia Databases

    DEFF Research Database (Denmark)

    Assent, Ira; Wichterich, Marc; Meisen, Tobias

    2008-01-01

    Multimedia similarity search in large databases requires efficient query processing. The Earth mover's distance, introduced in computer vision, is successfully used as a similarity model in a number of small-scale applications. Its computational complexity hindered its adoption in large multimedia...... databases. We enable directly indexing the Earth mover's distance in structures such as the R-tree and the VA-file by providing the accurate 'MinDist' function to any bounding rectangle in the index. We exploit the computational structure of the new MinDist to derive a new lower bound for the EMD Min...
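
    The filter-and-refine use of a lower bound can be sketched in a few lines of Python. The sketch below uses the simple mean-difference lower bound for the 1D Earth mover's distance on normalized histograms (a weaker bound than the paper's MinDist, which is not reproduced here), with SciPy assumed for the exact distance.

        import numpy as np
        from scipy.stats import wasserstein_distance  # exact 1D EMD

        def mean_lower_bound(p, q, bins):
            # |E_p[x] - E_q[x]| lower-bounds the 1D EMD of normalized histograms
            return abs(np.dot(p, bins) - np.dot(q, bins))

        def range_query(query, database, bins, eps):
            """Return indices of histograms within EMD eps of the query."""
            hits = []
            for i, cand in enumerate(database):
                if mean_lower_bound(query, cand, bins) > eps:
                    continue  # filtered: the true EMD cannot be <= eps
                if wasserstein_distance(bins, bins, query, cand) <= eps:
                    hits.append(i)  # refined with the exact distance
            return hits

        bins = np.arange(8, dtype=float)
        q = np.ones(8) / 8                       # uniform query histogram
        db = [np.eye(8)[k] for k in range(8)]    # one-hot candidates
        print(range_query(q, db, bins, eps=2.0))  # -> [3, 4]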

  6. Proposal for grid computing for nuclear applications

    International Nuclear Information System (INIS)

    Faridah Mohamad Idris; Wan Ahmad Tajuddin Wan Abdullah; Zainol Abidin Ibrahim; Zukhaimira Zolkapli

    2013-01-01

    Full-text: The use of computer clusters for the computational sciences, including computational physics, is vital, as it provides the computing power needed to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application could use resources within the grid to speed up the computing process. (author)

  7. Integer programming theory, applications, and computations

    CERN Document Server

    Taha, Hamdy A

    1975-01-01

    Integer Programming: Theory, Applications, and Computations provides information pertinent to the theory, applications, and computations of integer programming. This book presents the computational advantages of the various techniques of integer programming.Organized into eight chapters, this book begins with an overview of the general categorization of integer applications and explains the three fundamental techniques of integer programming. This text then explores the concept of implicit enumeration, which is general in a sense that it is applicable to any well-defined binary program. Other

  8. Evolution and applications of plant pathway resources and databases

    DEFF Research Database (Denmark)

    Sucaet, Yves; Deva, Taru

    2011-01-01

    Plants are important sources of food and plant products are essential for modern human life. Plants are increasingly gaining importance as drug and fuel resources, bioremediation tools and as tools for recombinant technology. Considering these applications, database infrastructure for plant model...... systems deserves much more attention. Study of plant biological pathways, the interconnection between these pathways and plant systems biology on the whole has in general lagged behind human systems biology. In this article we review plant pathway databases and the resources that are currently available...

  9. The Service Status and Development Strategy of the Mobile Application Service of Ancient Books Database

    Directory of Open Access Journals (Sweden)

    Yang Siluo

    2017-12-01

    Full Text Available [Purpose/significance] The mobile application of the ancient books database is a change of the ancient books database from the online version to the mobile one. At present, mobile applications of ancient books databases are in the initial stage of development, so it is necessary to investigate the current situation and provide suggestions for their development. [Method/process] This paper selected two kinds of ancient books databases, on the WeChat platform and as mobile phone clients, and analyzed their operation modes and main functions. [Result/conclusion] We conclude that ancient books database mobile applications have some defects: small-scale resources, limited content and data forms, and imperfect single-platform construction, and users pay inadequate attention to these issues. We put forward corresponding suggestions and point out that, in order to construct ancient books database mobile applications, it is necessary to improve platform construction, enrich the data forms and quantity, optimize the functions, and emphasize communication and interaction with the user.

  10. SWEETLEAD: an in silico database of approved drugs, regulated chemicals, and herbal isolates for computer-aided drug discovery.

    Directory of Open Access Journals (Sweden)

    Paul A Novick

    Full Text Available In the face of drastically rising drug discovery costs, strategies promising to reduce development timelines and expenditures are being pursued. Computer-aided virtual screening and repurposing approved drugs are two such strategies that have shown recent success. Herein, we report the creation of a highly curated in silico database of chemical structures representing approved drugs, chemical isolates from traditional medicinal herbs, and regulated chemicals, termed the SWEETLEAD database. The motivation for SWEETLEAD stems from the observation of conflicting information in publicly available chemical databases and the lack of a highly curated database of chemical structures for the globally approved drugs. A consensus-building scheme surveying information from several publicly accessible databases was employed to identify the correct structure for each chemical. The resulting structures are filtered for the active pharmaceutical ingredient, standardized, and differing formulations of the same drug are combined in the final database. The publicly available release of SWEETLEAD (https://simtk.org/home/sweetlead) provides an important tool to enable the successful completion of computer-aided repurposing and drug discovery campaigns.
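
    The consensus-building idea (surveying several sources and keeping the majority structure) can be sketched as follows in Python; the source names and SMILES strings are invented for illustration, and no cheminformatics library is used, so the strings stand in for already-canonicalized structures.

        from collections import Counter

        # Hypothetical structure reports for one drug from three sources,
        # as canonical SMILES strings (values invented for the example).
        reports = {
            "sourceA": "CC(=O)Oc1ccccc1C(=O)O",
            "sourceB": "CC(=O)Oc1ccccc1C(=O)O",
            "sourceC": "CC(=O)Oc1ccccc1C(O)=O",  # same molecule, different
        }                                         # string: why canonical
                                                  # forms matter before voting

        def consensus_structure(reports, quorum=2):
            """Keep a structure only if at least `quorum` sources agree."""
            counts = Counter(reports.values())
            structure, votes = counts.most_common(1)[0]
            return structure if votes >= quorum else None

        print(consensus_structure(reports))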

  11. KVANE - a Kvanefjeld drill core database

    International Nuclear Information System (INIS)

    Lund Clausen, F.

    1980-01-01

    A database, KVANE, containing all drill core information from the drilling programmes carried out in 1958, 1962, 1969 and 1977 at the uranium deposit in Kvanefjeld, Southwest Greenland, has been made. The application software 'Statistical Analysis System (SAS)' was used as the programming tool. It is shown how this software, usually used for other purposes, satisfies the demand for easy storage of large amounts of data. The paper describes how KVANE was made and organized and how data can be picked out of the database. A short introduction to the SAS system is also given. The database has been implemented at the Northern European University Computing Center (NEUCC) at the Technical University of Denmark. (author)

  12. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  13. Professional iPhone and iPad Database Application Programming

    CERN Document Server

    Alessi, Patrick

    2010-01-01

    A much-needed resource on database development and enterprise integration for the iPhone. An enormous demand exists for getting iPhone applications into the enterprise, and this book guides you through all the necessary steps for integrating an iPhone app within an existing enterprise. Experienced iPhone developers will learn how to take advantage of the built-in capabilities of the iPhone to confidently implement a data-driven application for the iPhone. Shows you how to integrate iPhone applications into enterprise-class systems; introduces development of data-driven applications on the iPho

  14. Application of new type of distributed multimedia databases to networked electronic museum

    Science.gov (United States)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have actively been developed based on the achievements of advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform a new function of cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and performs a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval. Retrieval can effectively be performed by cooperative processing among multiple domains. Communication language and protocols are also defined in the system; these are used in every action for communications in the system. A language interpreter in each machine translates the communication language into the internal language used in that machine. Using the language interpreter, internal modules such as the DBMS and user interface modules can freely be selected. A concept of a 'content-set' is also introduced. A content-set is defined as a package of contents; contents in a content-set are related to each other. The system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents, referring to data indicating the relations of the contents in the content-set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built. The results of this experiment indicate that the proposed system can effectively retrieve the objective contents under the control to a number of distributed

  15. Handheld Devices with Wide-Area Wireless Connectivity: Applications in Astronomy Educational Technology and Remote Computational Control

    Science.gov (United States)

    Budiardja, R. D.; Lingerfelt, E. J.; Guidry, M. W.

    2003-05-01

    Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) programs and controlling data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. Another allows one to control and monitor a computation done on a Beowulf cluster by changing the parameters of the computation remotely and retrieving the result when the computation is done. The presentation will include hands-on demonstrations with real devices. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  16. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    Science.gov (United States)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  17. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  18. Validation and application of a physics database for fast reactor fuel cycle analysis

    International Nuclear Information System (INIS)

    McKnight, R.D.; Stillman, J.A.; Toppel, B.J.; Khalil, H.S.

    1994-01-01

    An effort has been made to automate the execution of fast reactor fuel cycle analysis, using EBR-II as a demonstration vehicle, and to validate the analysis results for application to the IFR closed fuel cycle demonstration at EBR-II and its fuel cycle facility. This effort has included: (1) the application of the standard ANL depletion codes to perform core-follow analyses for an extensive series of EBR-II runs, (2) incorporation of the EBR-II data into a physics database, (3) development and verification of software to update, maintain and verify the database files, (4) development and validation of fuel cycle models and methodology, (5) development and verification of software which utilizes this physics database to automate the application of the ANL depletion codes, methods and models to perform the core-follow analysis, and (6) validation studies of the ANL depletion codes and of their application in support of anticipated near-term operations in EBR-II and the Fuel Cycle Facility. Results of the validation tests indicate the physics database and associated analysis codes and procedures are adequate to predict required quantities in support of early phases of FCF operations

  19. The influence of industrial applications on a control system toolbox

    International Nuclear Information System (INIS)

    Clout, P.

    1992-01-01

    Vsystem is an open, advanced software application toolbox for rapidly creating fast, efficient and cost-effective control and data-acquisition systems. Vsystem's modular architecture is designed for single computers, networked computers and workstations running under VAX/VMS or VAX/ELN. At the heart of Vsystem lies Vaccess, a user-extendible real-time database and library of access routines. The application database provides the link to the hardware of the application and can be organized as one database or as separate databases installed in different computers on the network. Vsystem has found application in charged-particle accelerator control, tokamak control, and industrial research, as well as in its more recent industrial applications. This paper describes the broad features of Vsystem and the influence that recent industrial applications have had on the software. (author)

  20. SPIRE Data-Base Management System

    Science.gov (United States)

    Fuechsel, C. F.

    1984-01-01

    The Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. The data bases are typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. The SPIRE DBMS is designed to support data requests from interactive users as well as applications programs.

  1. Management of virtualized infrastructure for physics databases

    International Nuclear Information System (INIS)

    Topurov, Anton; Gallerani, Luigi; Chatal, Francois; Piorkowski, Mariusz

    2012-01-01

    Demands for the storage of physics metadata are rapidly increasing, together with the requirements for its high availability. Most HEP laboratories are struggling to squeeze more from their computer centers, and thus focus on virtualizing available resources. CERN started investigating database virtualization in early 2006, first by testing database performance and stability on native Xen. Since then we have been closely evaluating the constantly evolving functionality of virtualisation solutions for the database and middle tier, together with the associated management applications, Oracle's Enterprise Manager and VM Manager. This session will detail our long experience in dealing with virtualized environments, focusing on the newest Oracle OVM 3.0 for x86 and on Oracle Enterprise Manager functionality for efficiently managing a virtualized database infrastructure.

  2. Applications of interval computations

    CERN Document Server

    Kreinovich, Vladik

    1996-01-01

    Primary Audience for the Book • Specialists in numerical computations who are interested in algorithms with automatic result verification. • Engineers, scientists, and practitioners who desire results with automatic verification and who would therefore benefit from the experience of successful applications. • Students in applied mathematics and computer science who want to learn these methods. Goal of the Book This book contains surveys of applications of interval computations, i.e., applications of numerical methods with automatic result verification, that were presented at an international workshop on the subject in El Paso, Texas, February 23-25, 1995. The purpose of this book is to disseminate detailed and surveyed information about existing and potential applications of this new growing field. Brief Description of the Papers At the most fundamental level, interval arithmetic operations work with sets: The result of a single arithmetic operation is the set of all possible results as the o...

  3. Collectively loading an application in a parallel computer

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
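
    A rough analogue of this leader-based collective load can be sketched with mpi4py (assumed available); the file name, and the choice of rank 0 as the job leader, are illustrative only.

        from mpi4py import MPI  # assumed MPI bindings for Python

        comm = MPI.COMM_WORLD   # stands in for the subset of compute nodes
        leader = 0              # one node is selected as the job leader

        if comm.Get_rank() == leader:
            # Only the leader touches storage, retrieving the application image.
            with open("app_image.bin", "rb") as f:  # hypothetical file
                image = f.read()
        else:
            image = None

        # The leader broadcasts the application to every node in the subset,
        # avoiding one storage-system read per node.
        image = comm.bcast(image, root=leader)
        print(comm.Get_rank(), "received", len(image), "bytes")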

  4. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  5. BUSINESS MODELLING AND DATABASE DESIGN IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Mihai-Constantin AVORNICULUI

    2015-04-01

    Full Text Available Electronic commerce has grown constantly from one year to the next over the last decade; few areas register such growth. It covers exchanges of computerized data, but also electronic messaging, linear data banks and electronic payment transfer. Cloud computing, a relatively new concept and term, is a model of access, via the Internet, to distributed systems of configurable computing resources that can be made available on request, quickly and with minimal management effort and intervention from the client and the provider. Behind an electronic commerce system in the cloud there is a database which contains the information necessary for the transactions in the system. Business modelling brings many benefits, and it makes the design of the database used by electronic commerce systems in the cloud considerably easier.
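
    As a toy illustration of the kind of database that sits behind such a system, the following Python sketch creates a minimal e-commerce schema with the standard-library sqlite3 module; the tables and columns are invented for the example, not taken from the paper.

        import sqlite3

        # Minimal, illustrative e-commerce schema (hypothetical design).
        ddl = """
        CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
        CREATE TABLE product  (id INTEGER PRIMARY KEY, name TEXT, price REAL);
        CREATE TABLE "order" (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customer(id),
            placed_at TEXT
        );
        CREATE TABLE order_line (
            order_id INTEGER REFERENCES "order"(id),
            product_id INTEGER REFERENCES product(id),
            quantity INTEGER CHECK (quantity > 0)
        );
        """

        conn = sqlite3.connect(":memory:")
        conn.executescript(ddl)
        conn.execute("INSERT INTO customer(name) VALUES (?)", ("Ada",))
        print(conn.execute("SELECT name FROM customer").fetchall())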

  6. 77 FR 66617 - HIT Policy and Standards Committees; Workgroup Application Database

    Science.gov (United States)

    2012-11-06

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy and Standards Committees; Workgroup Application... of New ONC HIT FACA Workgroup Application Database. The Office of the National Coordinator (ONC) has.... Name of Committees: HIT Standards Committee and HIT Policy Committee. General Function of the...

  7. Advances in Computer Science, Engineering & Applications : Proceedings of the Second International Conference on Computer Science, Engineering & Applications

    CERN Document Server

    Zizka, Jan; Nagamalai, Dhinaharan

    2012-01-01

    The International conference series on Computer Science, Engineering & Applications (ICCSEA) aims to bring together researchers and practitioners from academia and industry to focus on understanding computer science, engineering and applications and to establish new collaborations in these areas. The Second International Conference on Computer Science, Engineering & Applications (ICCSEA-2012), held in Delhi, India, during May 25-27, 2012 attracted many local and international delegates, presenting a balanced mixture of  intellect and research both from the East and from the West. Upon a strenuous peer-review process the best submissions were selected leading to an exciting, rich and a high quality technical conference program, which featured high-impact presentations in the latest developments of various areas of computer science, engineering and applications research.

  9. Computer graphics from basic to application

    International Nuclear Information System (INIS)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-01

    This book covers the concept of computer graphics, its background and history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress (introduction, applications and operating environment); 3D graphics, with a summary of the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  10. Computer graphics from basic to application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-15

    This book covers the concept of computer graphics, its background and history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress (introduction, applications and operating environment); 3D graphics, with a summary of the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  11. Towards Monitoring-as-a-service for Scientific Computing Cloud applications using the ElasticSearch ecosystem

    Science.gov (United States)

    Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated. This feature requires detailed monitoring and accounting of the resource usage. We set up a monitoring framework to inspect the site activities both in terms of IaaS and of applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure the flexibility to choose a different monitoring solution if needed. The heterogeneous accounting information is transferred from the database to the ElasticSearch engine via a custom Logstash plugin. Each use case is indexed separately in ElasticSearch, and we set up Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS-level information gathered through the API is sent to the MySQL database through a purpose-built RESTful web service. Moreover, we have developed a billing system for our private Cloud, which relies on the RabbitMQ message queue for asynchronous communication to the database and on the ELK stack for its graphical interface. The Italian Grid accounting framework is also migrating to a similar set-up. Concerning the application level, we used the ROOT plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BESIII
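
    The flow of accounting records into ElasticSearch can be sketched with a plain HTTP call from Python, as below; the host, index name and document fields are assumptions for illustration (the actual site uses a custom Logstash plugin rather than direct indexing).

        import json
        import urllib.request

        # Send one accounting record to an Elasticsearch index via its REST
        # API; host, index and fields are hypothetical.
        record = {"tenant": "alice-tier2", "vcpus": 8, "hours": 24.0}
        req = urllib.request.Request(
            "http://localhost:9200/iaas-accounting/_doc",
            data=json.dumps(record).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.read().decode())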

  12. Engineering applications of soft computing

    CERN Document Server

    Díaz-Cortés, Margarita-Arimatea; Rojas, Raúl

    2017-01-01

    This book bridges the gap between Soft Computing techniques and their applications to complex engineering problems. In each chapter we endeavor to explain the basic ideas behind the proposed applications in an accessible format for readers who may not possess a background in some of the fields. Therefore, engineers or practitioners who are not familiar with Soft Computing methods will appreciate that the techniques discussed go beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. At the same time, the book will show members of the Soft Computing community how engineering problems are now being solved and handled with the help of intelligent approaches. Highlighting new applications and implementations of Soft Computing approaches in various engineering contexts, the book is divided into 12 chapters. Further, it has been structured so that each chapter can be read independently of the others.

  13. Informatics derived materials databases for multifunctional properties

    International Nuclear Information System (INIS)

    Broderick, Scott; Rajan, Krishna

    2015-01-01

    In this review, we provide an overview of the development of quantitative structure–property relationships incorporating the impact of data uncertainty from small, limited-knowledge data sets, from which we rapidly develop new and larger databases. Unlike traditional database development, this informatics-based approach is concurrent with the identification and discovery of the key metrics controlling structure–property relationships; even more importantly, we are now in a position to build materials databases based on design 'intent' and not just design parameters. This permits, for example, the establishment of materials databases that can be used for targeted multifunctional properties, and not just one characteristic at a time as is presently done. This review provides a summary of the computational logic of building such virtual databases and gives some examples in the field of complex inorganic solids for scintillator applications. (review)

  14. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the database system Oracle Database 10g Express Edition, and a demonstration of administration tasks in this database system. The database was verified by means of an access application developed for it.

  15. MammoGrid: a mammography database

    CERN Multimedia

    2002-01-01

    What would be the advantages if physicians around the world could gain access to a unique mammography database? The answer may come from MammoGrid, a three-year project under the Fifth Framework Programme of the EC. Led by CERN, MammoGrid involves the UK (the Universities of Oxford, Cambridge and the West of England, Bristol, plus the company Mirada Solutions of Oxford), and Italy (the Universities of Pisa and Sassari and the Hospitals in Udine and Torino). The aim of the project is, in light of emerging GRID technology, to develop a Europe-wide database of mammograms. The database will be used to investigate a set of important healthcare applications as well as the potential of the GRID to enable healthcare professionals throughout the EU to work together effectively. The contributions of the partners include building the GRID-database infrastructure, developing image processing and Computer Aided Detection techniques, and making the clinical evaluation. The first project meeting took place at CERN in Sept...

  16. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Prorocol (WAP) applications in medical information processing

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Dørup, Jens

    2001-01-01

    script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. RESULTS: A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. CONCLUSIONS: We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further

  17. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  18. Computer systems and methods for the query and visualization of multidimensional databases

    Science.gov (United States)

    Stolte, Chris [Palo Alto, CA; Tang, Diane L [Palo Alto, CA; Hanrahan, Patrick [Portola Valley, CA

    2011-02-01

    In response to a user request, a computer generates a graphical user interface on a computer display. A schema information region of the graphical user interface includes multiple operand names, each operand name associated with one or more fields of a multi-dimensional database. A data visualization region of the graphical user interface includes multiple shelves. Upon detecting a user selection of the operand names and a user request to associate each user-selected operand name with a respective shelf in the data visualization region, the computer generates a visual table in the data visualization region in accordance with the associations between the operand names and the corresponding shelves. The visual table includes a plurality of panes, each pane having at least one axis defined based on data for the fields associated with a respective operand name.
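
    The pane-per-shelf-combination idea resembles an ordinary pivot. The sketch below uses pandas (assumed available) to build a small cross-tabulation in the same spirit; the field names and data are invented, and this is only an analogy to the patented visual-table mechanism, not a reproduction of it.

        import pandas as pd

        # Toy multidimensional data; associating fields with "shelves"
        # (rows/columns) yields a table of panes, one per row/column pair.
        df = pd.DataFrame({
            "region": ["east", "east", "west", "west"],
            "year":   [2010, 2011, 2010, 2011],
            "sales":  [100.0, 120.0, 90.0, 130.0],
        })

        # row shelf <- region, column shelf <- year, pane <- sum(sales)
        visual_table = df.pivot_table(index="region", columns="year",
                                      values="sales", aggfunc="sum")
        print(visual_table)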

  19. Cloud Computing and Its Applications in GIS

    Science.gov (United States)

    Kang, Cao

    2011-12-01

    Cloud computing is a novel computing paradigm that offers highly scalable and highly available distributed computing services. The objectives of this research are to: 1. analyze and understand cloud computing and its potential for GIS; 2. discover the feasibilities of migrating truly spatial GIS algorithms to distributed computing infrastructures; 3. explore a solution to host and serve large volumes of raster GIS data efficiently and speedily. These objectives thus form the basis for three professional articles. The first article is entitled "Cloud Computing and Its Applications in GIS". This paper introduces the concept, structure, and features of cloud computing. Features of cloud computing such as scalability, parallelization, and high availability make it a very capable computing paradigm. Unlike High Performance Computing (HPC), cloud computing uses inexpensive commodity computers. The uniform administration systems in cloud computing make it easier to use than GRID computing. Potential advantages of cloud-based GIS systems such as lower barrier to entry are consequently presented. Three cloud-based GIS system architectures are proposed: public cloud- based GIS systems, private cloud-based GIS systems and hybrid cloud-based GIS systems. Public cloud-based GIS systems provide the lowest entry barriers for users among these three architectures, but their advantages are offset by data security and privacy related issues. Private cloud-based GIS systems provide the best data protection, though they have the highest entry barriers. Hybrid cloud-based GIS systems provide a compromise between these extremes. The second article is entitled "A cloud computing algorithm for the calculation of Euclidian distance for raster GIS". Euclidean distance is a truly spatial GIS algorithm. Classical algorithms such as the pushbroom and growth ring techniques require computational propagation through the entire raster image, which makes it incompatible with the distributed nature
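
    For reference, the Euclidean distance operation discussed in the second article can be computed on a small raster with SciPy's standard serial routine, as in the sketch below; this is not the cloud-partitioned algorithm of the dissertation, and the toy raster is invented for the example.

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        # 1 = empty cell, 0 = source cell; the transform returns, for every
        # cell, the Euclidean distance to the nearest source cell.
        raster = np.ones((5, 5), dtype=np.uint8)
        raster[2, 2] = 0  # a single source in the middle

        distances = distance_transform_edt(raster)
        print(np.round(distances, 2))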

  20. Enhanced DIII-D Data Management Through a Relational Database

    Science.gov (United States)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Metadata about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. The database may be accessed through programming languages such as C, Java, and IDL, or through ODBC-compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
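
    A cross-shot query of the kind described might look like the following Python sketch; the table and column names (shots, ip_max, date) and the values are invented, and sqlite3 stands in for whatever SQL engine actually backs the site database.

        import sqlite3

        # Hypothetical summary table of per-shot physics quantities.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE shots (shot INTEGER, date TEXT, ip_max REAL)")
        conn.executemany("INSERT INTO shots VALUES (?, ?, ?)", [
            (101234, "2000-06-01", 1.2),
            (101235, "2000-06-01", 1.9),
            (101301, "2000-06-08", 2.1),
        ])

        # Data mining across shots: all shots above a current threshold.
        rows = conn.execute(
            "SELECT shot, ip_max FROM shots WHERE ip_max > ? ORDER BY shot",
            (1.5,),
        ).fetchall()
        print(rows)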

  1. Applications of Evolutionary Computation

    NARCIS (Netherlands)

    Mora, Antonio M.; Squillero, Giovanni; Di Chio, C; Agapitos, Alexandros; Cagnoni, Stefano; Cotta, Carlos; Fernández De Vega, F; Di Caro, G A; Drechsler, R.; Ekárt, A; Esparcia-Alcázar, Anna I.; Farooq, M; Langdon, W B; Merelo-Guervós, J.J.; Preuss, M; Richter, O.-M.H.; Silva, Sara; Simões, A; Tarantino, Ernesto; Tettamanzi, Andrea G B; Togelius, J; Urquhart, Neil; Uyar, A S; Yannakakis, G N; Smith, Stephen L; Caserta, Marco; Ramirez, Adriana; Voß, Stefan; Burelli, Paolo; Jan, Mathieu; Matthias, M; Esparcia-Alcazar, Anna I; De Falco, Ivanoe; Cioppa, Antonio Della; Diwold, Konrad; Ekart, Aniko; Vega, Francisco Fernandez De; Sim, Kevin; Simoes, Anabela; Merelo, J.J.; Haasdijk, Evert; Zhang, Mengjie; Eiben, A E; Glette, Kyrre; Rohlfshagen, Philipp; Schaefer, Robert

    2015-01-01

    The application of genetic and evolutionary computation to problems in medicine has increased rapidly over the past five years, but there are specific issues and challenges that distinguish it from other real-world applications. Obtaining reliable and coherent patient data, establishing the clinical

  2. Automated Oracle database testing

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Ensuring database stability and steady performance in the modern world of agile computing is a major challenge. Changes at any level of the computing infrastructure, from OS parameters and packages or kernel versions to database parameters, patches, and even schema changes, can all potentially harm production services. This presentation shows how automatic and regular testing of Oracle databases can be achieved in such an agile environment.

  3. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  4. Real-life applications with membrane computing

    CERN Document Server

    Zhang, Gexiang; Gheorghe, Marian

    2017-01-01

    This book thoroughly investigates the underlying theoretical basis of membrane computing models, and reveals their latest applications. To date there have been no illustrative case studies or complex real-life applications that capitalize on the full potential of the sophisticated membrane systems computational apparatus, a gap that this book remedies. By studying various complex applications – including engineering optimization, power systems fault diagnosis, mobile robot controller design, and complex biological systems involving data modeling and process interactions – the book also extends the capabilities of membrane systems models with features such as formal verification techniques, evolutionary approaches, and fuzzy reasoning methods. As such, the book offers a comprehensive and up-to-date guide for all researchers, PhDs and undergraduate students in the fields of computer science, engineering and the bio-sciences who are interested in the applications of natural computing models.

  5. Applications of computational intelligence in nuclear reactors

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Jehadeesan, R.

    2016-01-01

    Computational intelligence techniques have been successfully employed in a wide range of applications, including the domains of medicine, bioinformatics, electronics, communications and business. There has been progress in applying computational intelligence in the nuclear reactor domain during the last two decades. The stringent nuclear safety regulations pertaining to the reactor environment present challenges for the application of computational intelligence in various nuclear sub-systems. The applications of various methods of computational intelligence in the domain of nuclear reactors are discussed in this paper. (author)

  6. Practical applications of soft computing in engineering

    CERN Document Server

    2001-01-01

    Soft computing is presented here not only through theoretical developments but also through a large variety of realistic applications to consumer products and industrial systems. Applying soft computing provides the opportunity to integrate human-like vagueness and real-life uncertainty into an otherwise hard computer program. This book highlights some of the recent developments in practical applications of soft computing to engineering problems. All the chapters have been carefully designed and revised by international experts to achieve wide but in-depth coverage. Contents: Au

  7. Grid computing infrastructure, service, and applications

    CERN Document Server

    Jie, Wei; Chen, Jinjun

    2009-01-01

    Offering a comprehensive discussion of advances in grid computing, this book summarizes the concepts, methods, technologies, and applications. It covers topics such as philosophy, middleware, architecture, services, and applications. It also includes technical details to demonstrate how grid computing works in the real world

  8. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and biostatistical calculations. The popularization of DNA studies employed in the judicial system has led to the necessity of developing appropriate computer programs. Such programs must, above all, address two critical problems, i.e. the broadly understood data processing and data storage, and biostatistical calculations. Moreover, in the case of terrorist attacks and mass natural disasters, the ability to identify victims by searching for related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to the 1.1 and 1.2 versions, though there were only slight differences between those versions and the original one. DNAStat version 2.0 was launched in 2007, and the major improvement was the introduction of group calculation options with potential application to the personal identification of victims of mass disasters and terrorism. The latest 2.1 version has an option of language selection--Polish or English--which will enhance the usage and application of the program in other countries as well.

  9. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined.Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  10. Application of computational systems biology to explore environmental toxicity hazards

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Grandjean, Philippe

    2011-01-01

    Background: Computer-based modeling is part of a new approach to predictive toxicology. Objectives: We investigated the usefulness of an integrated computational systems biology approach in a case study involving the isomers and metabolites of the pesticide dichlorodiphenyltrichloroethane (DDT) to ascertain their possible links to relevant adverse effects. Methods: We extracted chemical-protein association networks for each DDT isomer and its metabolites using ChemProt, a disease chemical biology database that includes both binding and gene expression data, and we explored protein-protein interactions using a human interactome network. To identify associated dysfunctions and diseases, we integrated protein-disease annotations into the protein complexes using the Online Mendelian Inheritance in Man database and the Comparative Toxicogenomics Database. Results: We found 175 human proteins linked to p,p´-DDT...

  11. Constructing a Geology Ontology Using a Relational Database

    Science.gov (United States)

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In the geology community, the creation of a common geology ontology has become a useful means of solving problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multi-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods, such as the Geo-rule based method, the ontology life cycle method and the module design method, have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse engineering of abstracted semantic information from an existing database; the key is to construct rules for the transformation of database entities into the ontology. Relative to human-computer interaction methods, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritances and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database to an OWL-based geological ontology, based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In the geological ontology, inheritance and union operations between superclass and subclass were used to represent the nested relationship in a geochronology and the multiple inheritances
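
    To make the table-to-class conversion rule concrete, here is a minimal sketch with rdflib; the table names and inheritance pairs are hypothetical, and the paper's full rule set (nested and cluster relationships, integrity checks) is not reproduced.

```python
# Sketch of one conversion rule: each table becomes an OWL class and each
# "is-a" link between tables becomes a subclass axiom. Names are invented.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

GEO = Namespace("http://example.org/geology#")
g = Graph()
g.bind("geo", GEO)

# Entities and inheritance links extracted from a (hypothetical) schema.
tables = ["StratigraphicUnit", "Formation", "Member"]
inheritance = [("Formation", "StratigraphicUnit"), ("Member", "Formation")]

for t in tables:
    g.add((GEO[t], RDF.type, OWL.Class))
for child, parent in inheritance:
    g.add((GEO[child], RDFS.subClassOf, GEO[parent]))

print(g.serialize(format="turtle"))
```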

  12. Handbook of video databases design and applications

    CERN Document Server

    Furht, Borko

    2003-01-01

    INTRODUCTION: Introduction to Video Databases (Oge Marques and Borko Furht). VIDEO MODELING AND REPRESENTATION: Modeling Video Using Input/Output Markov Models with Application to Multi-Modal Event Detection (Ashutosh Garg, Milind R. Naphade, and Thomas S. Huang); Statistical Models of Video Structure and Semantics (Nuno Vasconcelos); Flavor: A Language for Media Representation (Alexandros Eleftheriadis and Danny Hong); Integrating Domain Knowledge and Visual Evidence to Support Highlight Detection in Sports Videos (Juergen Assfalg, Marco Bertini, Carlo Colombo, and Alberto Del Bimbo); A Generic Event Model and Sports Vid

  13. [Mobile phone-computer wireless interactive graphics transmission technology and its medical application].

    Science.gov (United States)

    Huang, Shuo; Liu, Jing

    2010-05-01

    The application of clinical digital medical imaging has raised many tough issues to tackle, such as data storage, management, and information sharing. Here we investigated a mobile phone based medical image management system capable of personal medical imaging information storage and management as well as comprehensive health information analysis. The technologies underlying the management system, spanning wireless transmission, the technical capabilities of the phone in mobile health care, and the management of a mobile medical database, are discussed. Taking the transmission of medical infrared images between phone and computer as an example, the working principle of the present system is demonstrated.

  14. Computer-aided diagnosis system for bone scintigrams from Japanese patients: importance of training database

    DEFF Research Database (Denmark)

    Horikoshi, Hiroyuki; Kikuchi, Akihiro; Onoguchi, Masahisa

    2012-01-01

    Computer-aided diagnosis (CAD) software for bone scintigrams has recently been introduced as a clinical quality assurance tool. The purpose of this study was to compare the diagnostic accuracy of two CAD systems, one based on a European and one on a Japanese training database, in a group of bone scans from Japanese patients. The two CAD software packages are trained to interpret bone scans using training databases consisting of bone scans with the desired interpretation, metastatic disease or not. One software was trained using 795 bone scans from European patients and the other with 904 bone scans from Japanese patients... The CAD software trained with the Japanese database showed higher performance than the corresponding CAD software trained with a European database for the analysis of bone scans from Japanese patients. These results could at least partly be caused by the physical differences between Japanese and European patients, resulting in less influence of attenuation...

  15. Database architecture optimized for the new bottleneck: Memory access

    NARCIS (Netherlands)

    P.A. Boncz (Peter); S. Manegold (Stefan); M.L. Kersten (Martin)

    1999-01-01

    In the past decade, advances in the speed of commodity CPUs have far out-paced advances in memory latency. Main-memory access is therefore increasingly a performance bottleneck for many computer applications, including database systems. In this article, we use a simple scan test to show the

  16. Optimizing Database Architecture for the New Bottleneck: Memory Access

    NARCIS (Netherlands)

    S. Manegold (Stefan); P.A. Boncz (Peter); M.L. Kersten (Martin)

    2000-01-01

    In the past decade, advances in the speed of commodity CPUs have far out-paced advances in memory latency. Main-memory access is therefore increasingly a performance bottleneck for many computer applications, including database systems. In this article, we use a simple scan test to show the

  17. TuBaFrost 5: multifunctional central database application for a European tumor bank.

    Science.gov (United States)

    Isabelle, M; Teodorovic, I; Morente, M M; Jaminé, D; Passioukov, A; Lejeune, S; Therasse, P; Dinjens, W N M; Oosterhuis, J W; Lam, K H; Oomen, M H A; Spatz, A; Ratcliffe, C; Knox, K; Mager, R; Kerr, D; Pezzella, F; van de Vijver, M; van Boven, H; Alonso, S; Kerjaschki, D; Pammer, J; Lopez-Guerrero, J A; Llombart Bosch, A; Carbone, A; Gloghini, A; van Veen, E-B; van Damme, B; Riegman, P H J

    2006-12-01

    Developing a tissue bank database has become more than just logically arranging data in tables combined with a search engine. Current demand for high quality samples and data, and the ever-changing legal and ethical regulations, mean that the application must reflect TuBaFrost rules and protocols for the collection, exchange and use of tissue. To ensure continuation and extension of the TuBaFrost European tissue bank, the custodianship of the samples, and hence the decision over whether to issue samples to requestors, remains with the local collecting centre. The database application described in this article has been developed to facilitate this open-structure virtual tissue bank model serving a large group. It covers many key tasks without requiring dedicated personnel, hence minimising operational costs. The Internet-accessible database application enables search, selection and request submission for requestors, whereas collectors can upload and edit their collections. Communication between the requestor and the collectors involved is initiated with automatically generated e-mails.

  18. Application of cloud database in the management of clinical data of patients with skin diseases.

    Science.gov (United States)

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of a cloud database in the daily practice of a dermatology department, a cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores of self-rating scales; the results were then input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. Disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  19. THE DEVELOPMENT OF A WEB BASED DATABASE APPLICATIONS OF PROCUREMENT, INVENTORY, AND SALES AT PT. INTERJAYA SURYA MEGAH

    Directory of Open Access Journals (Sweden)

    Choirul Huda

    2011-10-01

    The objective of this research is to develop a web based database application for procurement, inventory and sales at PT. Interjaya Surya Megah. The current system at PT. Interjaya Surya Megah runs manually, so the company has difficulty carrying out its activities. The methodology used in this research includes interviews, observation, literature review, conceptual database design, logical database design and physical database design. The result is the establishment of a web-based database application at PT. Interjaya Surya Megah. The conclusion is that the company can run its day-to-day activities more easily, because data processing becomes faster and more accurate, reports are generated faster and more accurately, and data storage is more secure. Keywords: database; application; procurement; inventory; sales

  20. Computer systems and methods for the query and visualization of multidimensional databases

    Science.gov (United States)

    Stolte, Chris; Tang, Diane L; Hanrahan, Patrick

    2015-03-03

    A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes multiple operand names, each operand corresponding to one or more fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first operands with the columns shelf and to associate one or more second operands with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first operands, and each pane has a y-axis defined based on data for the one or more second operands.
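
    The shelf metaphor has a rough everyday analogue in a pivot: placing one operand on the rows shelf and another on the columns shelf partitions the data into panes. The sketch below is only that analogy in pandas, with made-up sales data; it is not the patented system.

```python
# Emulating "rows shelf <- region, columns shelf <- quarter" as a pivot.
import pandas as pd

df = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "sales":   [100, 120, 90, 140],
})

# Each cell of the result plays the role of a pane showing sum(sales).
visual_table = df.pivot_table(index="region", columns="quarter",
                              values="sales", aggfunc="sum")
print(visual_table)
```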

  1. Application Program Interface for the Orion Aerodynamics Database

    Science.gov (United States)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities of built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The
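
    Because the API is self-contained ANSI C, one natural way to exercise it from analysis scripts is through a foreign-function interface. The sketch below shows the general shape via Python's ctypes; every symbol in it (libaerodb.so, aerodb_init, aerodb_compute, the argument lists, the data file) is a hypothetical placeholder, since the abstract does not list the actual function names.

```python
# Hypothetical ctypes binding to a self-contained C aero-database API.
import ctypes

lib = ctypes.CDLL("./libaerodb.so")           # placeholder shared library

lib.aerodb_init.argtypes = [ctypes.c_char_p]  # path to the aero table file
lib.aerodb_init.restype = ctypes.c_int

lib.aerodb_compute.argtypes = [ctypes.c_double, ctypes.c_double,
                               ctypes.POINTER(ctypes.c_double)]
lib.aerodb_compute.restype = ctypes.c_int

if lib.aerodb_init(b"cev_tables.dat") == 0:   # placeholder data file
    coeffs = (ctypes.c_double * 6)()          # 3 force + 3 moment terms
    lib.aerodb_compute(2.5, 10.0, coeffs)     # e.g. Mach, alpha (deg)
    print(list(coeffs))
```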

  2. Computer Applications in the Design Process.

    Science.gov (United States)

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  3. Geo-scientific database for research and development purposes

    International Nuclear Information System (INIS)

    Tabani, P.; Mangeot, A.; Crabol, V.; Delage, P.; Dewonck, S.; Auriere, C.

    2012-01-01

    The GEO database (geo-scientific database), the fruit of continuous computer development over the past ten years, can store several hundred million data items. It is a tool developed by Andra since 1992 in order to group, in a secured computer form, all data related to the acquisition of in situ and laboratory measurements made on solid and fluid samples, as well as observations related to the environment. This database has three main functions: - Acquisition and management of data and computer files related to geological, geomechanical, hydrogeological and geochemical measurements on solid and fluid samples and in situ measurements (logging, on-sample measurements, geological logs, etc.), as well as observations on fauna and flora. - Consultation, available to staff on Andra's intranet network, for selective viewing of data linked to a borehole, a sample or a watch point, and for making computations and graphs on sets of laboratory measurements related to a sample. - Physical management of fluid and solid samples stored in a 'core library', in order to localize a sample, follow up its movement out of the 'core library' to an organization, and carry out regular inventories. Three geo-scientific software applications are linked to the GEO database: - Geosciences portal: a web intranet application accessible from the Andra network. This application is used for the acquisition of hydrogeological and geochemical data collected in the field and on fluid samples, observations related to environmental monitoring, as well as data related to scientific work carried out at surface level or in drifts. - GESTECH: a software application used to integrate geomechanical and geological data collected on solid samples into the GEO database. - INTEGRAT: a software application that automatically integrates data files into the GEO database. For the sake of traceability and efficiency, references of the fluid and solid samples, of the containers (crates, cells, etc.) and storage zones of the 'core library

  4. Pattern database applications from design to manufacturing

    Science.gov (United States)

    Zhuang, Linda; Zhu, Annie; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    Pattern-based approaches are becoming more common and popular as the industry moves to advanced technology nodes. At the beginning of a new technology node, a library of process weak point patterns for physical and electrical verification starts to build up and is used to prevent known hotspots from recurring on new designs. The pattern set is then expanded to create test keys for process development, in order to verify manufacturing capability and pre-check new tape-out designs for potential yield detractors. As the database grows, the adoption of pattern-based approaches has expanded from design flows to technology development and on to mass-production purposes. This paper presents the complete downstream working flows of a design pattern database (PDB). This pattern-based data analysis flow covers different applications across different functional teams: generating enhancement kits to improve design manufacturability, populating new test design data based on previous learning, generating analysis data to improve mass-production efficiency, and manufacturing equipment in-line control to check machine-status consistency across different fab sites.

  5. [Computer program "PANCREAS"].

    Science.gov (United States)

    Jakubowicz, J; Jankowski, M; Szomański, B; Switka, S; Zagórowicz, E; Pertkiewicz, M; Szczygieł, B

    1998-01-01

    Contemporary computer technology allows precise and fast analysis of large databases. Widespread and common use depends on appropriate, user-friendly software, which is usually lacking for specialized medical applications. The aim of this work was to develop an integrated system designed to store, explore and analyze data of patients treated for pancreatic cancer. For that purpose the database administration system MS Visual FoxPro 3.0 was used and a special application, compliant with the ISO 9000 series, has been developed. The system works under MS Windows 95, with the possibility of easy adaptation to MS Windows 3.11 or MS Windows NT through the graphic user interface. The system stores personal data, laboratory results, visual and histological analyses, and information on treatment course and complications. It archives these data and enables the preparation of reports according to individual and statistical needs. Help and security settings allow even users not familiar with computer science to work with the system.

  6. Monet: a next-generation database kernel for query-intensive applications

    NARCIS (Netherlands)

    P.A. Boncz (Peter)

    2002-01-01

    Monet is a database kernel targeted at query-intensive, heavy analysis applications (the opposite of transaction processing), which include OLAP and data mining, but also go beyond the business domain in GIS processing, multi-media retrieval and XML. The clean sheet approach of Monet

  7. Design and implementation of the CEBAF element database

    International Nuclear Information System (INIS)

    Larrieu, T.; Joyce, M.; Slominski, C.

    2012-01-01

    With the inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting control computers to building controls screens. A requirement influencing the CED design is that it provide access to not only present, but also future and past configurations of the accelerator. To accomplish this, an introspective database schema was designed that allows new elements, types, and properties to be defined on-the-fly with no changes to table structure. Used in conjunction with Oracle Workspace Manager, it allows users to query data from any time in the database history with the same tools used to query the present configuration. Users can also check out workspaces to use as staging areas for upcoming machine configurations. All access to the CED is through a well-documented Application Programming Interface (API) that is translated automatically from original C++ source code into native libraries for scripting languages such as perl, php, and TCL, making access to the CED easy and ubiquitous. (authors)
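
    The "introspective schema" idea, where new elements, types, and properties are defined on the fly with no table changes, is essentially an entity-attribute-value layout. The toy sketch below shows that layout in SQLite; the tables and the sample element are invented, not the actual CED schema.

```python
# Toy entity-attribute-value layout: new types and properties are rows,
# so defining them needs INSERTs, never ALTER TABLE. Invented names.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE element_type (type_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE element      (elem_id INTEGER PRIMARY KEY,
                           type_id INTEGER REFERENCES element_type,
                           name TEXT UNIQUE);
CREATE TABLE property     (prop_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE elem_value   (elem_id INTEGER REFERENCES element,
                           prop_id INTEGER REFERENCES property,
                           value TEXT);
""")

con.execute("INSERT INTO element_type VALUES (1, 'Quadrupole')")
con.execute("INSERT INTO element VALUES (1, 1, 'MQX1S01')")   # made-up name
con.execute("INSERT INTO property VALUES (1, 'Length_m')")
con.execute("INSERT INTO elem_value VALUES (1, 1, '0.3')")

print(con.execute("""
    SELECT e.name, p.name, v.value
    FROM elem_value v JOIN element e USING (elem_id)
                      JOIN property p USING (prop_id)
""").fetchall())  # [('MQX1S01', 'Length_m', '0.3')]
```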

  8. Computer applications in water conservancy and hydropower engineering

    Energy Technology Data Exchange (ETDEWEB)

    Chen, J

    1984-09-20

    The use of computers in China's water conservancy and hydropower construction began in the 1960s for exploration surveys, planning, design, construction, operation, and scientific research. Despite the positive results, and the formation of a 1000-person computer computation contingent, computer development among different professions is not balanced. The weaknesses and disparities in computer applications include an overall low level of application relative to the rest of the world, which is partly due to inadequate hardware and programs. The report suggests five ways to improve applications and popularize microcomputers which emphasize leadership and planning.

  9. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  10. Experience with CANDID: Comparison algorithm for navigating digital image databases

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, P.; Cannon, M.

    1994-10-01

    This paper presents results from the authors' experience with CANDID (Comparison Algorithm for Navigating Digital Image Databases), which was designed to facilitate image retrieval by content using a query-by-example methodology. A global signature describing the texture, shape, or color content is first computed for every image stored in a database, and a normalized similarity measure between probability density functions of feature vectors is used to match signatures. This method can be used to retrieve images from a database that are similar to a user-provided example image. Results for three test applications are included.
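
    A stripped-down version of the signature idea can be stated in a few lines: compute one global, normalized signature per image and rank database images by a similarity score against the query's signature. The sketch below uses a gray-level histogram and histogram intersection as stand-ins; CANDID's actual signatures and similarity measure are richer than this.

```python
# Query-by-example over global image signatures. Random arrays stand in
# for real images; the similarity measure here is histogram intersection.
import numpy as np

def signature(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Global gray-level histogram, normalized so the bins sum to one."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

def similarity(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Histogram intersection: 1.0 means identical signatures."""
    return float(np.minimum(sig_a, sig_b).sum())

rng = np.random.default_rng(0)
query = rng.integers(0, 256, size=(64, 64))
database = [rng.integers(0, 256, size=(64, 64)) for _ in range(5)]

q = signature(query)
ranked = sorted(((similarity(q, signature(img)), i)
                 for i, img in enumerate(database)), reverse=True)
print(ranked[0])  # (best score, index of best-matching database image)
```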

  11. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine was performed in 1993 using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such software applications are not usually free and open-source, and this fact hinders their adjustment to the most diverse interests. The objective was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests, divided between two observers, using ImageLab and another software sold with Philips Brilliance computed tomography appliances, in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreement, we used simple and kappa statistics. The agreement observed between the software applications was generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions. The agreement for lesions of 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.

  12. The Evolution of Computing: Slowing down? Not Yet!

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Dr Sutherland will review the evolution of computing over the past decade, focusing particularly on the development of the database and middleware from client server to Internet computing. But what are the next steps from the perspective of a software company? Dr Sutherland will discuss the development of Grid as well as the future applications revolving around collaborative working, which are appearing as the next wave of computing applications.

  13. Introduction to morphogenetic computing

    CERN Document Server

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to databases, language, nanotechnology with defects, biological genetic structure, electrical circuits, and big data structure. In Turing machines, input and output states form a system – when the system is in one state, the input is transformed into output. This computation is always deterministic and without any possible contradiction or defects. In natural computation there are defects and contradictions that have to be solved to give a coherent and effective computation. The new computation generates the morphology of the system, which assumes different forms in time. The genetic process is the prototype of morphogenetic computing. At the Boolean logic truth value, we substitute a set of truth (active sets) values with...

  14. Computational geometry for reactor applications

    International Nuclear Information System (INIS)

    Brown, F.B.; Bischoff, F.G.

    1988-01-01

    Monte Carlo codes for simulating particle transport involve three basic computational sections: a geometry package for locating particles and computing distances to regional boundaries, a physics package for analyzing interactions between particles and problem materials, and an editing package for determining event statistics and overall results. This paper describes the computational geometry methods in RACER, a vectorized Monte Carlo code used for reactor physics analysis, so that comparisons may be made with techniques used in other codes. The principal applications for RACER are eigenvalue calculations and power distributions associated with reactor core physics analysis. Successive batches of neutrons are run until convergence and acceptable confidence intervals are obtained, with typical problems involving >10^6 histories. As such, the development of computational geometry methods has emphasized two basic needs: a flexible but compact geometric representation that permits accurate modeling of reactor core details and efficient geometric computation to permit very large numbers of histories to be run. The current geometric capabilities meet these needs effectively, supporting a variety of very large and demanding applications

  15. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on the evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories: (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open-type combined database systems, which have been in progress generally as planned. Those for category (2) include the techniques for color document inputting and information retrieval, meaning compilation, understanding highly overlapping data, and controlling data centered on drawings, which have been in progress generally as planned. Those for category (3) include the techniques for improving the resistance of the networks to obstruction, and the security of the data in the networks, which have been in progress generally as planned. Those for category (4) include the techniques for rule processing for the development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives will finally be achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  16. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    Science.gov (United States)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
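
    The access pattern described, a relation name plus up to five integer keys identifying each record, is compact enough to mimic in a few lines. The sketch below does so in Python purely as an interface illustration; DB90 itself is Fortran 90/95 with C file I/O, and the relation and key values here are invented.

```python
# Mimicking DB90's addressing scheme: (relation name, up to 5 integer
# keys) -> one record. Interface illustration only, not DB90 itself.
from typing import Dict, Tuple

Key = Tuple[str, int, int, int, int, int]
store: Dict[Key, bytes] = {}

def put(relation: str, data: bytes, k1=0, k2=0, k3=0, k4=0, k5=0) -> None:
    store[(relation, k1, k2, k3, k4, k5)] = data

def get(relation: str, k1=0, k2=0, k3=0, k4=0, k5=0) -> bytes:
    return store[(relation, k1, k2, k3, k4, k5)]

put("NODE_LOADS", b"load case 3, node 17", k1=3, k2=17)  # invented data
print(get("NODE_LOADS", k1=3, k2=17))
```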

  17. Exploiting relational database technology in a GIS

    Science.gov (United States)

    Batty, Peter

    1992-05-01

    All systems for managing data face common problems such as backup, recovery, auditing, security, data integrity, and concurrent update. Other challenges include the ability to share data easily between applications and to distribute data across several computers, while continuing to manage the problems already mentioned. Geographic information systems are no exception, and need to tackle all these issues. Standard relational database-management systems (RDBMSs) provide many features to help solve the issues mentioned so far. This paper describes how the IBM geoManager product approaches these issues by storing all its geographic data in a standard RDBMS in order to take advantage of such features. Areas in which standard RDBMS functions need to be extended are highlighted, and the way in which geoManager does this is explained. The performance implications of storing all data in the relational database are discussed. An important distinction is made between the storage and management of geographic data and the manipulation and analysis of geographic data, which needs to be considered when assessing the applicability of relational database technology to GIS.
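
    The paper's central claim, that geographic data can live in ordinary relational tables and inherit the RDBMS's backup, security, and concurrency machinery, can be illustrated with a plain bounding-box query. The sketch below uses SQLite with an invented feature table; a real deployment would add proper spatial indexing on top of this.

```python
# Geographic features in a plain relational table, filtered with ordinary
# SQL via a bounding-box intersection test. Layout and data are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE feature (
    id INTEGER PRIMARY KEY, name TEXT,
    min_x REAL, min_y REAL, max_x REAL, max_y REAL)""")
con.executemany("INSERT INTO feature VALUES (?, ?, ?, ?, ?, ?)", [
    (1, "substation A", 10.0, 20.0, 10.5, 20.5),
    (2, "pipeline 7",   40.0,  5.0, 80.0,  6.0),
])

# All features whose box intersects the query window (two boxes intersect
# unless one lies entirely to one side of the other).
wx0, wy0, wx1, wy1 = 9.0, 19.0, 12.0, 22.0
rows = con.execute(
    """SELECT name FROM feature
       WHERE NOT (max_x < ? OR min_x > ? OR max_y < ? OR min_y > ?)""",
    (wx0, wx1, wy0, wy1),
).fetchall()
print(rows)  # [('substation A',)]
```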

  18. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors; Desenvolvimento de uma base de dados computacional para aplicação em Análise Probabilística de Segurança de reatores nucleares de pesquisa

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner dos Santos

    2016-07-01

    The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN), in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to safety, reliability, availability and maintainability of these facilities. The database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they will be able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, which is in accordance with the protocols installed in the IPEN intranet. The open-source relational database management system MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database, named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)

  19. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    Science.gov (United States)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables users to search boring logs rapidly and visualize them using augmented reality (AR) technology. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.

  20. Teaching Psychology Students Computer Applications.

    Science.gov (United States)

    Atnip, Gilbert W.

    This paper describes an undergraduate-level course designed to teach the applications of computers that are most relevant in the social sciences, especially psychology. After an introduction to the basic concepts and terminology of computing, separate units were devoted to word processing, data analysis, data acquisition, artificial intelligence,…

  1. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.
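
    The multiple-scale-factor scheme is simple to state: rather than one global multiplier, different regions of the harmonic spectrum get different factors before comparison with experimental fundamentals. The sketch below illustrates only that idea; the factor values, the 2500 cm^-1 split, and the mode frequencies are invented for the example and are not the PAHdb values.

```python
# Region-dependent scaling of computed harmonic frequencies. All numbers
# below are invented illustration values, not PAHdb's published factors.
import numpy as np

harmonic_cm1 = np.array([890.0, 1615.0, 3070.0])  # computed mode frequencies

def scale(freqs: np.ndarray) -> np.ndarray:
    # e.g. one factor for C-H stretches (here, above 2500 cm^-1) and
    # another for everything else.
    factors = np.where(freqs > 2500.0, 0.958, 0.975)
    return factors * freqs

print(scale(harmonic_cm1))
```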

  2. DATABASES DEVELOPED IN INDIA FOR BIOLOGICAL SCIENCES

    Directory of Open Access Journals (Sweden)

    Gitanjali Yadav

    2017-09-01

    The complexity of biological systems requires the use of a variety of experimental methods of ever increasing sophistication to probe various cellular processes at molecular and atomic resolution. The availability of technologies for determining nucleic acid sequences of genes and atomic resolution structures of biomolecules prompted the development of major biological databases like GenBank and PDB almost four decades ago. India was one of the few countries to realize early the utility of such databases for progress in modern biology/biotechnology. The Department of Biotechnology (DBT), India, established the Biotechnology Information System (BTIS) network in the late eighties. Starting with the genome sequencing revolution at the turn of the century, the application of high-throughput sequencing technologies in biology and medicine for the analysis of genomes, transcriptomes, epigenomes and microbiomes has generated massive volumes of sequence data. The BTIS network has not only provided state-of-the-art computational infrastructure to research institutes and universities for utilizing various biological databases developed abroad in their research, but has also actively promoted research and development (R&D) projects in bioinformatics to develop a variety of biological databases in diverse areas. It is encouraging to note that a large number of biological databases and data-driven software tools developed in India have been published in leading peer-reviewed international journals like Nucleic Acids Research, Bioinformatics, Database, BMC, PLoS and NPG series publications. Some of these databases are not only unique, they are also highly accessed, as reflected in their numbers of citations. Apart from databases developed by individual research groups, BTIS has initiated consortium projects to develop major India-centric databases on Mycobacterium tuberculosis, Rice and Mango, which can potentially have practical applications in health and agriculture. Many of these biological

  3. Towards Monitoring-as-a-service for Scientific Computing Cloud applications using the ElasticSearch ecosystem

    CERN Document Server

    Bagnasco, S; Guarise, A; Lusso, S; Masera, M; Vallero, S

    2015-01-01

    The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated. This feature requires detailed monitoring and accounting of the resource usage. We set up a monitoring framework to inspect the site activities both in terms of IaaS and applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure flexibility to choose a different monit...

  4. The eNanoMapper database for nanomaterial safety information.

    Science.gov (United States)

    Jeliazkova, Nina; Chomenidis, Charalampos; Doganis, Philip; Fadeel, Bengt; Grafström, Roland; Hardy, Barry; Hastings, Janna; Hegi, Markus; Jeliazkov, Vedrin; Kochev, Nikolay; Kohonen, Pekka; Munteanu, Cristian R; Sarimveis, Haralambos; Smeets, Bart; Sopasakis, Pantelis; Tsiliki, Georgia; Vorgrimmler, David; Willighagen, Egon

    2015-01-01

    The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database, along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, and how the "representational state transfer" (REST) API enables building user friendly
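
    Since the paper's focus is the functionality exposed through the REST API, a short client sketch may help fix ideas. The base URL, path, query parameter, and response keys below are placeholders patterned on the description, not verified eNanoMapper endpoints.

```python
# Hedged sketch of a REST client for a nanomaterials data API. The URL,
# path, query parameter, and JSON keys are placeholders, not verified.
import requests

BASE = "https://example.org/enanomapper"  # placeholder deployment

resp = requests.get(f"{BASE}/substance",
                    params={"search": "TiO2"},
                    headers={"Accept": "application/json"},
                    timeout=30)
resp.raise_for_status()
for substance in resp.json().get("substance", []):
    print(substance.get("name"))
```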

  5. The Effect of Relational Database Technology on Administrative Computing at Carnegie Mellon University.

    Science.gov (United States)

    Golden, Cynthia; Eisenberger, Dorit

    1990-01-01

    Carnegie Mellon University's decision to standardize its administrative system development efforts on relational database technology and structured query language is discussed and its impact is examined in one of its larger, more widely used applications, the university information system. Advantages, new responsibilities, and challenges of the…

  6. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  7. Portable database driven control system for SPEAR

    International Nuclear Information System (INIS)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-04-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system. 10 refs., 1 fig

  8. Portable database driven control system for SPEAR

    Energy Technology Data Exchange (ETDEWEB)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-04-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system. 10 refs., 1 fig.

  9. On-line database of the spectral properties of polycyclic aromatic hydrocarbons

    International Nuclear Information System (INIS)

    Malloci, Giuliano; Joblin, Christine; Mulas, Giacomo

    2007-01-01

    We present an on-line database of computed molecular properties for a large sample of polycyclic aromatic hydrocarbons in four charge states: -1, 0, +1, and +2. At present our database includes 40 molecules ranging in size from naphthalene and azulene (C10H8) up to circumovalene (C66H20). We performed our calculations in the framework of the density functional theory (DFT) and the time-dependent DFT to obtain the most relevant molecular parameters needed for astrophysical applications. For each molecule in the sample, our database presents in a uniform way the energetic, rotational, vibrational, and electronic properties. It is freely accessible on the web at (http://astrochemistry.ca.astro.it/database/) and (http://www.cesr.fr/~joblin/database/)

  10. The ATLAS Distributed Data Management System & Databases

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...

  11. Computer Applications in Reading. Third Edition.

    Science.gov (United States)

    Blanchard, Jay S.; And Others

    Intended as a reference for researchers, teachers, and administrators, this book chronicles research, programs, and uses of computers in reading. Chapter 1 provides a broad view of computer applications in education, while Chapter 2 provides annotated references for computer based reading and language arts programs for children and adults in…

  12. Virtual materials design using databases of calculated materials properties

    International Nuclear Information System (INIS)

    Munter, T R; Landis, D D; Abild-Pedersen, F; Jones, G; Wang, S; Bligaard, T

    2009-01-01

    Materials design is most commonly carried out by experimental trial and error techniques. Current trends indicate that the increased complexity of newly developed materials, the exponential growth of the available computational power, and the constantly improving algorithms for solving the electronic structure problem, will continue to increase the relative importance of computational methods in the design of new materials. One possibility for utilizing electronic structure theory in the design of new materials is to create large databases of materials properties, and subsequently screen these for new potential candidates satisfying given design criteria. We utilize a database of more than 81 000 electronic structure calculations. This alloy database is combined with other published materials properties to form the foundation of a virtual materials design framework (VMDF). The VMDF offers a flexible collection of materials databases, filters, analysis tools and visualization methods, which are particularly useful in the design of new functional materials and surface structures. The applicability of the VMDF is illustrated by two examples. One is the determination of the Pareto-optimal set of binary alloy methanation catalysts with respect to catalytic activity and alloy stability; the other is the search for new alloy mercury absorbers.
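
    The Pareto-optimal screening mentioned above reduces to a simple rule: keep every candidate that no other candidate matches or beats on both design objectives at once. A minimal sketch, with alloy names and (activity, stability) scores invented purely for illustration:

        # Invented (catalytic activity, alloy stability) scores; higher is better.
        candidates = {
            "NiFe": (0.80, 0.30),
            "NiCo": (0.75, 0.55),
            "FeCo": (0.60, 0.50),   # beaten by NiCo on both objectives
            "CuZn": (0.20, 0.90),
        }

        def pareto_front(items):
            """Candidates that no other candidate matches-or-beats on both axes."""
            front = []
            for name, (a, s) in items.items():
                dominated = any(other != name and a2 >= a and s2 >= s
                                for other, (a2, s2) in items.items())
                if not dominated:
                    front.append(name)
            return front

        print(pareto_front(candidates))   # ['NiFe', 'NiCo', 'CuZn']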

  13. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  14. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering that are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  15. Exploring Human Cognition Using Large Image Databases.

    Science.gov (United States)

    Griffiths, Thomas L; Abbott, Joshua T; Hsu, Anne S

    2016-07-01

    Most cognitive psychology experiments evaluate models of human cognition using a relatively small, well-controlled set of stimuli. This approach stands in contrast to current work in neuroscience, perception, and computer vision, which have begun to focus on using large databases of natural images. We argue that natural images provide a powerful tool for characterizing the statistical environment in which people operate, for better evaluating psychological theories, and for bringing the insights of cognitive science closer to real applications. We discuss how some of the challenges of using natural images as stimuli in experiments can be addressed through increased sample sizes, using representations from computer vision, and developing new experimental methods. Finally, we illustrate these points by summarizing recent work using large image databases to explore questions about human cognition in four different domains: modeling subjective randomness, defining a quantitative measure of representativeness, identifying prior knowledge used in word learning, and determining the structure of natural categories.

  16. Development of comprehensive material performance database for nuclear applications

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime

    1993-01-01

    This paper introduces the present status of the comprehensive material performance database for nuclear applications, named the JAERI Material Performance Database (JMPD), and gives examples of its utilization. The JMPD has been developed since 1986 in JAERI with a view to utilizing various kinds of characteristics data of nuclear materials efficiently. A relational database management system, PLANNER, was employed, and supporting systems for data retrieval and output were expanded. In order to improve the user-friendliness of the retrieval system, menu-selection-type procedures have been developed so that knowledge of the system or the data structures is not required of end-users. As to utilization of the JMPD, two types of data analyses are mentioned as follows: (1) A series of statistical analyses was performed in order to estimate the design values of both the yield strength (Sy) and the tensile strength (Su) for aluminum alloys which are widely used as structural materials for research reactors. (2) Statistical analyses were accomplished using the cyclic crack growth rate data for nuclear pressure vessel steels, and comparisons were made of the variability and/or reproducibility of the data obtained by ΔK-increasing and ΔK-constant type tests. (author)

  17. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories of those related to (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for the category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open type combined database systems, which have been in progress generally as planned. Those for the category (2) include the techniques for color document inputting and information retrieval, meaning compiling, understanding highly overlapping data, and controlling data centered by drawings, which have been in progress generally as planned. Those for the category (3) include the techniques for improving resistance of the networks to obstruction, and security of the data in the networks, which have been in progress generally as planned. Those for the category (4) include the techniques for rule processing for development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives are finally achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  18. A database to enable discovery and design of piezoelectric materials

    Science.gov (United States)

    de Jong, Maarten; Chen, Wei; Geerlings, Henry; Asta, Mark; Persson, Kristin Aslaug

    2015-01-01

    Piezoelectric materials are used in numerous applications requiring a coupling between electrical fields and mechanical strain. Despite the technological importance of this class of materials, for only a small fraction of all inorganic compounds which display compatible crystallographic symmetry, has piezoelectricity been characterized experimentally or computationally. In this work we employ first-principles calculations based on density functional perturbation theory to compute the piezoelectric tensors for nearly a thousand compounds, thereby increasing the available data for this property by more than an order of magnitude. The results are compared to select experimental data to establish the accuracy of the calculated properties. The details of the calculations are also presented, along with a description of the format of the database developed to make these computational results publicly available. In addition, the ways in which the database can be accessed and applied in materials development efforts are described. PMID:26451252

  19. A database to enable discovery and design of piezoelectric materials.

    Science.gov (United States)

    de Jong, Maarten; Chen, Wei; Geerlings, Henry; Asta, Mark; Persson, Kristin Aslaug

    2015-01-01

    Piezoelectric materials are used in numerous applications requiring a coupling between electrical fields and mechanical strain. Despite the technological importance of this class of materials, for only a small fraction of all inorganic compounds which display compatible crystallographic symmetry, has piezoelectricity been characterized experimentally or computationally. In this work we employ first-principles calculations based on density functional perturbation theory to compute the piezoelectric tensors for nearly a thousand compounds, thereby increasing the available data for this property by more than an order of magnitude. The results are compared to select experimental data to establish the accuracy of the calculated properties. The details of the calculations are also presented, along with a description of the format of the database developed to make these computational results publicly available. In addition, the ways in which the database can be accessed and applied in materials development efforts are described.

  20. Distributed control and data processing system with a centralized database for a BWR power plant

    International Nuclear Information System (INIS)

    Fujii, K.; Neda, T.; Kawamura, A.; Monta, K.; Satoh, K.

    1980-01-01

    Recent digital techniques based on advances in electronics and computer technologies have realized a very wide scale of computer application to BWR power plant control and instrumentation. Multifarious computers, from micro to mega, are introduced separately, and to get better control and instrumentation system performance, a hierarchical computer complex system architecture has been developed. This paper addresses the hierarchical computer complex system architecture which enables more efficient introduction of computer systems to a nuclear power plant. Distributed control and processing systems, which are the components of the hierarchical computer complex, are described in some detail, and the database for the hierarchical computer complex is also discussed. The hierarchical computer complex system has been developed and is now in the detailed design stage for actual power plant application. (auth)

  1. BDVC (Bimodal Database of Violent Content): A database of violent audio and video

    Science.gov (United States)

    Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro

    2017-09-01

    Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization and retrieval applications of a single type of content like text, voice and images; bimodal databases, in contrast, allow two different types of content, like audio-video or image-text, to be associated semantically. The generation of a bimodal audio-video database implies the creation of a connection between the multimedia content through the semantic relation that associates the actions of both types of information. This paper describes in detail the characteristics and methodology used for the creation of the bimodal database of violent content; the semantic relationship is established by the proposed concepts that describe the audiovisual information. The use of bimodal databases in applications related to audiovisual content processing increases semantic performance if and only if these applications process both types of content. This bimodal database contains 580 annotated audiovisual segments, with a duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool in the generation of applications for the semantic web.

  2. Brain-Computer Interfaces : Beyond Medical Applications

    NARCIS (Netherlands)

    Erp, J.B.F. van; Lotte, F.; Tangermann, M.

    2012-01-01

    Brain-computer interaction has already moved from assistive care to applications such as gaming. Improvements in usability, hardware, signal processing, and system integration should yield applications in other nonmedical areas.

  3. Database of Information technology resources

    OpenAIRE

    Barzda, Erlandas

    2005-01-01

    The subject of this master's work is an internet information resource database. The work also handles the problems of old information systems which do not meet contemporary requirements. The aim is to create an internet information system, based on object-oriented technologies and tailored to computer users' needs. The internet information database system helps computer administrators to get all the needed information about computer network elements and easily register all changes int...

  4. An approach for access differentiation design in medical distributed applications built on databases.

    Science.gov (United States)

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application posits new essential problems for software. Particularly protection tools, which are sufficient separately, become deficient during the integration due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools, if the object is stored as a record in DB tables. The solution of the problem should be found only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DMBS are suggested. Remote users connected via global networks are considered too.

  5. 6th International Conference on Computer Science and its Applications

    CERN Document Server

    Stojmenovic, Ivan; Jeong, Hwa; Yi, Gangman

    2015-01-01

    The 6th FTRA International Conference on Computer Science and its Applications (CSA-14) will be held in Guam, USA, Dec. 17-19, 2014. CSA-14 presents a comprehensive conference focused on the various aspects of advances in engineering systems in computer science and applications, including ubiquitous computing, U-Health care systems, Big Data, UI/UX for human-centric computing, computing services, and bioinformatics and bio-inspired computing, and will show recent advances in various aspects of computing technology, ubiquitous computing services and their applications.

  6. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the actual centralised Oracle-based database services. The Database on Demand (DBoD) empowers the user to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g. presently the open community version of MySQL and a single-instance Oracle database server. This article describes a technology approach to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.

  7. Archives: Journal of Computer Science and Its Application

    African Journals Online (AJOL)

    Archives: Journal of Computer Science and Its Application.

  8. Advanced technologies for scalable ATLAS conditions database access on the grid

    CERN Document Server

    Basset, R; Dimitrov, G; Girone, M; Hawkings, R; Nevski, P; Valassi, A; Vaniachine, A; Viegas, F; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic work-flow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent jobs rates. This has been achieved through coordinated database stress tests performed in series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that the server overload conditions can be safely avoided in a production environment. Our analysi...

  9. ZAGRADA - A New Radiocarbon Database

    International Nuclear Information System (INIS)

    Portner, A.; Obelic, B.; Krajcar Bornic, I.

    2008-01-01

    In the Radiocarbon and Tritium Laboratory at the Rudjer Boskovic Institute three different techniques for 14C dating have been used: Gas Proportional Counting (GPC), Liquid Scintillation Counting (LSC) and preparation of milligram-sized samples for AMS dating (Accelerator Mass Spectrometry). The use of several measurement techniques has created a need for the development of a new relational database, ZAGRADA (Zagreb Radiocarbon Database), since the existing software package CARBO could not satisfy the requirements of processing samples with several techniques in parallel. Using SQL procedures and constraints defined by primary and foreign keys, ZAGRADA enforces high data integrity and provides better performance in data filtering and sorting. Additionally, the new database for 14C samples is a multi-user application that can be accessed from remote computers in the work group, thus providing better efficiency of laboratory activities. In order to facilitate data handling and processing in ZAGRADA, the graphical user interface is designed to be user-friendly and to perform various actions on the data, such as input, correction, searching, sorting and output to a printer. Invalid actions performed in the user interface are registered and a short textual description of the error appears on screen in a message box. Unauthorized access is prevented by login control, and each application window implements support for tracking the last changes made by the user. The implementation of the new database for 14C samples contributes significantly to the scientific research performed in the Radiocarbon and Tritium Laboratory and will provide better and easier communication with customers. (author)
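
    The integrity mechanism described, constraints defined by primary and foreign keys, can be shown in miniature. The two-table layout below is invented for illustration and uses SQLite as a stand-in engine; the actual ZAGRADA schema is not given in the record:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("PRAGMA foreign_keys = ON")   # SQLite enforces FKs only on request

        # Invented two-table layout; the real ZAGRADA schema is not given.
        conn.executescript("""
            CREATE TABLE technique (code TEXT PRIMARY KEY);   -- GPC, LSC, AMS
            CREATE TABLE sample (
                lab_code  TEXT PRIMARY KEY,
                technique TEXT NOT NULL REFERENCES technique(code),
                age_bp    REAL
            );
            INSERT INTO technique VALUES ('GPC'), ('LSC'), ('AMS');
            INSERT INTO sample VALUES ('Z-1001', 'LSC', 3250.0);
        """)

        try:
            # Rejected: 'XYZ' is not a registered measurement technique.
            conn.execute("INSERT INTO sample VALUES ('Z-1002', 'XYZ', 1200.0)")
        except sqlite3.IntegrityError as err:
            print("rejected:", err)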

  10. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics methods and describes the technologies with an emphasis on how they work and their key benefits.

  11. Semantic-Based Concurrency Control for Object-Oriented Database Systems Supporting Real-Time Applications

    National Research Council Canada - National Science Library

    Lee, Juhnyoung; Son, Sang H

    1994-01-01

    .... This paper investigates major issues in designing semantic-based concurrency control for object-oriented database systems supporting real-time applications, and it describes approaches to solving...

  12. Toward An Unstructured Mesh Database

    Science.gov (United States)

    Rezaei Mahdiraji, Alireza; Baumann, Peter Peter

    2014-05-01

    Unstructured meshes are used in several application domains such as earth sciences (e.g., seismology), medicine, oceanography, climate modeling and GIS as approximate representations of physical objects. Meshes subdivide a domain into smaller geometric elements (called cells) which are glued together by incidence relationships. The subdivision of a domain allows computational manipulation of complicated physical structures. For instance, seismologists model earthquakes using elastic wave propagation solvers on hexahedral meshes. Such a mesh contains several hundred million grid points and millions of hexahedral cells, and each vertex node stores a multitude of data fields. To run a simulation on such meshes, one needs to iterate over all the cells, iterate over the cells incident to a given cell, retrieve coordinates of cells, assign data values to cells, etc. Although meshes are used in many application domains, to the best of our knowledge there is no database vendor that supports unstructured mesh features. Currently, the main tools for querying and manipulating unstructured meshes are mesh libraries, e.g., CGAL and GRAL. Mesh libraries are dedicated libraries which include mesh algorithms and can be run on mesh representations. The libraries do not scale with dataset size, do not have a declarative query language, and need deep C++ knowledge for query implementations. Furthermore, due to high coupling between the implementations and the input file structure, the implementations are less reusable and costly to maintain. A dedicated mesh database offers the following advantages: 1) declarative querying, 2) ease of maintenance, 3) hiding the mesh storage structure from applications, and 4) transparent query optimization. To design a mesh database, the first challenge is to define a suitable generic data model for unstructured meshes. We proposed the ImG-Complexes data model as a generic topological mesh data model which extends the incidence graph model to multi
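
    A minimal sketch of the incidence idea: cells are glued together by shared vertices, and typical mesh queries walk those incidence relationships. This toy structure only conveys the flavor of such queries and is not the ImG-Complexes model:

        # Toy incidence structure: triangles glued by shared vertices.
        vertices = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}
        cells = {"t0": (0, 1, 2), "t1": (1, 3, 2)}   # triangles as vertex tuples

        def incident_cells(vertex_id):
            """Cells incident to a given vertex (upward incidence)."""
            return [c for c, vs in cells.items() if vertex_id in vs]

        def adjacent_cells(cell_id):
            """Cells sharing an edge (two vertices) with cell_id."""
            vs = set(cells[cell_id])
            return [c for c, ws in cells.items()
                    if c != cell_id and len(vs & set(ws)) >= 2]

        print(incident_cells(1))      # ['t0', 't1']
        print(adjacent_cells("t0"))   # ['t1']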

  13. Application of embedded database to digital power supply system in HIRFL

    International Nuclear Information System (INIS)

    Wu Guanghua; Yan Huaihai; Chen Youxin; Huang Yuzhen; Zhou Zhongzu; Gao Daqing

    2014-01-01

    Background: This paper introduces the application of an embedded MySQL database in the real-time monitoring system of the digital power supply system in the Heavy Ion Research Facility in Lanzhou (HIRFL). Purpose: The aim is to optimize the real-time monitoring system of the digital power supply system for better performance. Methods: The MySQL database is designed and implemented under the Linux operating system running on an ARM processor, together with the related functions for real-time data monitoring, such as collection, storage and query. All status parameters of the digital power supply system are collected by an FPGA and communicated to the ARM, whilst the user interface is realized with Qt toolkits on the ARM end. Results: The actual operation indicates that the digital power supply can realize the functions of real-time data monitoring, collection, storage and so on. Conclusion: Through practical application, we have found some aspects we can improve, and we will try to optimize them in the future. (authors)
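
    The collect-store-query loop such a monitoring system performs might look like the sketch below, where SQLite stands in for the embedded MySQL instance so the example is self-contained; the table layout and the read_status() placeholder are invented:

        import sqlite3, time, random

        # SQLite stands in for the embedded MySQL instance on the ARM board.
        db = sqlite3.connect("power_supply.db")
        db.execute("""CREATE TABLE IF NOT EXISTS status (
                          ts REAL, channel TEXT, current_a REAL, voltage_v REAL)""")

        def read_status():
            # Placeholder for the FPGA readout described in the record.
            return ("PS-01", 120.0 + random.random(), 48.0 + random.random())

        for _ in range(5):                     # shortened acquisition loop
            channel, current_a, voltage_v = read_status()
            db.execute("INSERT INTO status VALUES (?, ?, ?, ?)",
                       (time.time(), channel, current_a, voltage_v))
            db.commit()
            time.sleep(0.1)

        # Monitoring query: latest reading per channel (SQLite pairs the bare
        # columns with the row that holds MAX(ts)).
        print(db.execute("SELECT channel, MAX(ts), current_a "
                         "FROM status GROUP BY channel").fetchall())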

  14. Databases and information systems: Applications in biogeography

    International Nuclear Information System (INIS)

    Escalante E, Tania; Llorente B, Jorge; Espinoza O, David N; Soberon M, Jorge

    2000-01-01

    Some aspects of the new instrumentation and methodological elements that make up information systems in biodiversity (ISB) are described. The use of accurate geographically referenced data permits the use of a broad range of available sources: natural history collections and scientific literature require the use of databases and geographic information systems (GIS). The conceptualization of ISB and GIS, based on the use of extensive databases, has implied detailed modeling and the construction of authoritative archives: exhaustive catalogues of nomenclature and synonymies, complete bibliographic lists, lists of names proposed, and historical-geographic gazetteers with localities and their synonyms united under a global positioning system which produces a geospheric conception of the earth and its biota. Certain difficulties in the development of the system and the construction of the biological databases are explained: quality control of data, for example. The use of such systems is basic in order to respond to many questions at the frontier of current studies of biodiversity and conservation. In particular, some applications in biogeography are detailed: their importance for modeling distributions, for identifying and contrasting areas of endemism and biological richness for conservation, and their use as tools in what we identify as predictive and experimental faunistics. Lastly, the process as well as its relevance is emphasized at national and regional levels.

  15. Artificial immune system applications in computer security

    CERN Document Server

    Tan, Ying

    2016-01-01

    This book provides state-of-the-art information on the use, design, and development of the Artificial Immune System (AIS) and AIS-based solutions to computer security issues. Artificial Immune System: Applications in Computer Security focuses on the technologies and applications of AIS in malware detection proposed in recent years by the Computational Intelligence Laboratory of Peking University (CIL@PKU). It offers a theoretical perspective as well as practical solutions for readers interested in AIS, machine learning, pattern recognition and computer security. The book begins by introducing the basic concepts, typical algorithms, important features, and some applications of AIS. The second chapter introduces malware and its detection methods, especially for immune-based malware detection approaches. Successive chapters present a variety of advanced detection approaches for malware, including Virus Detection System, K-Nearest Neighbour (KNN), RBF networks, and Support Vector Machines (SVM), Danger theory, ...

  16. Practical clinical applications of the computer in nuclear medicine

    International Nuclear Information System (INIS)

    Price, R.R.; Erickson, J.J.; Patton, J.A.; Jones, J.P.; Lagan, J.E.; Rollo, F.D.

    1978-01-01

    The impact of the computer on the practice of nuclear medicine has been felt primarily in the area of rapid dynamic studies. At this time it is difficult to find a clinic which routinely performs computer processing of static images. The general purpose digital computer is a sophisticated and flexible instrument. The number of applications for which one can use the computer to augment data acquisition, analysis, or display is essentially unlimited. In this light, the purpose of this exhibit is not to describe all possible applications of the computer in nuclear medicine but rather to illustrate those applications which have generally been accepted as practical in the routine clinical environment. Specifically, we have chosen examples of computer augmented cardiac, and renal function studies as well as examples of relative organ blood flow studies. In addition, a short description of basic computer components and terminology along with a few examples of non-imaging applications are presented

  17. A portable database driven control system for SPEAR

    International Nuclear Information System (INIS)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-01-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system

  18. 8th Asian Conference on Intelligent Information and Database Systems

    CERN Document Server

    Madeyski, Lech; Nguyen, Ngoc

    2016-01-01

    The objective of this book is to contribute to the development of the intelligent information and database systems with the essentials of current knowledge, experience and know-how. The book contains a selection of 40 chapters based on original research presented as posters during the 8th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2016) held on 14–16 March 2016 in Da Nang, Vietnam. The papers to some extent reflect the achievements of scientific teams from 17 countries in five continents. The volume is divided into six parts: (a) Computational Intelligence in Data Mining and Machine Learning, (b) Ontologies, Social Networks and Recommendation Systems, (c) Web Services, Cloud Computing, Security and Intelligent Internet Systems, (d) Knowledge Management and Language Processing, (e) Image, Video, Motion Analysis and Recognition, and (f) Advanced Computing Applications and Technologies. The book is an excellent resource for researchers, those working in artificial intelligence, mu...

  19. On the Future of Thermochemical Databases, the Development of Solution Models and the Practical Use of Computational Thermodynamics in Volcanology, Geochemistry and Petrology: Can Innovations of Modern Data Science Democratize an Oligarchy?

    Science.gov (United States)

    Ghiorso, M. S.

    2014-12-01

    Computational thermodynamics (CT) has now become an essential tool of petrologic and geochemical research. CT is the basis for the construction of phase diagrams, the application of geothermometers and geobarometers, the equilibrium speciation of solutions, the construction of pseudosections, calculations of mass transfer between minerals, melts and fluids, and it provides a means of estimating materials properties for the evaluation of constitutive relations in fluid dynamical simulations. The practical application of CT to Earth science problems requires data: data on the thermochemical properties and the equation of state of relevant materials, and data on the relative stability and partitioning of chemical elements between phases as a function of temperature and pressure. These data must be evaluated and synthesized into a self-consistent collection of theoretical models and model parameters that is colloquially known as a thermodynamic database. Quantitative outcomes derived from CT rely on the existence, maintenance and integrity of thermodynamic databases. Unfortunately, the community is reliant on too few such databases, developed by a small number of research groups, and mostly under circumstances where refinement and updates to the database lag behind or are unresponsive to need. Given the increasing level of reliance on CT calculations, what is required is a paradigm shift in the way thermodynamic databases are developed, maintained and disseminated. They must become community resources, with flexible and accessible software interfaces that permit easy modification, while at the same time maintaining theoretical integrity and fidelity to the underlying experimental observations. Advances in computational and data science give us the tools and resources to address this problem, allowing CT results to be obtained at the speed of thought, and permitting geochemical and petrological intuition to play a key role in model development and calibration.

  20. Application of computational intelligence in emerging power systems

    African Journals Online (AJOL)

    ... in the electrical engineering applications. This paper highlights the application of computational intelligence methods in power system problems. Various types of CI methods, which are widely used in power systems, are also discussed in brief. Keywords: Power systems, computational intelligence, artificial intelligence.

  1. Migrating to the Cloud IT Application, Database, and Infrastructure Innovation and Consolidation

    CERN Document Server

    Laszewski, Tom

    2011-01-01

    Whether your company is planning a database migration, a desktop application migration, or an IT infrastructure consolidation project, this book gives you all the resources you'll need. It gives you recommendations on tools, strategy and best practices and serves as a guide as you plan, determine effort and budget, design, execute and roll your modern Oracle system out to production. Focusing on Oracle grid relational database technology and Oracle Fusion Middleware as the target cloud-based architecture, your company can gain organizational efficiency and agility, increase innovation and reduce

  2. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    Science.gov (United States)

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution named Phynx supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. This system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be setup and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases considering the individual clinical context. It can therefore make an important contribution to an efficient validation of outcome assessment in drug safety database studies.

  3. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    DURIP: High Performance Computing in Biomathematics Applications. The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of...

  4. Integrated Optoelectronic Networks for Application-Driven Multicore Computing

    Science.gov (United States)

    2017-05-08

    AFRL-AFOSR-VA-TR-2017-0102. Integrated Optoelectronic Networks for Application-Driven Multicore Computing. Sudeep Pasricha, Colorado State University. Grant number FA9550-13-1-0110. ...and supportive materials with innovative architectural designs that integrate these components according to system-wide application needs.

  5. MTR radiological database for SRS spent nuclear fuel facilities

    International Nuclear Information System (INIS)

    Blanchard, A.

    2000-01-01

    A database for radiological characterization of incoming Material Test Reactor (MTR) fuel has been developed for application to the Receiving Basin for Offsite Fuels (RBOF) and L-Basin spent fuel storage facilities at the Savannah River Site (SRS). This database provides a quick quantitative check to determine if SRS-bound spent fuel is radiologically bounded by the Reference Fuel Assembly used in the L-Basin and RBOF authorization bases. The developed database considers pertinent characteristics of domestic and foreign research reactor fuel including exposure, fuel enrichment, irradiation time, cooling time, and fuel-to-moderator ratio. The supplied tables replace the time-consuming studies associated with authorization of SRS-bound spent fuel with simple hand calculations. Additionally, the comprehensive database provides the means to overcome resource limitations, since a series of simple, yet conservative, hand calculations can now be performed in a timely manner and replace computational and technical staff requirements.

  6. Development of a North American paleoclimate pollen-based reconstruction database application

    Science.gov (United States)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts in synthesizing paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; therefore there is a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology using the open source R language and North American pollen databases (e.g. NAPD, NEOTOMA), where this application can easily be used to perform new reconstructions and quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results, and it allows users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are its two modules, each with a menu making the user feel at ease with the program, the ability to use different pollen sums, a choice of one of the 70 climate variables available, substitution of an appropriate modern climate dataset, a user-friendly regional target domain, temporal resolution criteria, linear interpolation, and many other features for a thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
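
    The record does not state the reconstruction method, but a common pollen-based approach is the modern analogue technique, sketched below under invented pollen percentages and temperatures (in Python here, although the application itself is written in R):

        import numpy as np

        # Invented modern calibration set: pollen percentages per site and the
        # site's observed mean July temperature.
        modern_pollen = np.array([[60.0, 30.0, 10.0],   # % spruce, pine, oak
                                  [20.0, 50.0, 30.0],
                                  [10.0, 20.0, 70.0]])
        modern_temp_c = np.array([12.0, 15.5, 19.0])
        fossil = np.array([25.0, 45.0, 30.0])           # fossil assemblage (%)

        # Squared-chord distance, a common dissimilarity for percentage data.
        d = ((np.sqrt(modern_pollen / 100) - np.sqrt(fossil / 100)) ** 2).sum(axis=1)

        k = 2                                   # number of analogues to keep
        nearest = np.argsort(d)[:k]
        weights = 1.0 / (d[nearest] + 1e-9)     # closer analogues weigh more
        estimate = np.average(modern_temp_c[nearest], weights=weights)
        print(f"reconstructed temperature: {estimate:.1f} C")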

  7. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Wren, D.J.; Popov, N.; Snell, V.G.

    2004-01-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  8. Report from the 6th Workshop on Extremely Large Databases

    Directory of Open Access Journals (Sweden)

    Daniel Liwei Wang

    2013-05-01

    Petascale data management and analysis remain one of the main unresolved challenges in today's computing. The 6th Extremely Large Databases workshop was convened alongside the XLDB conference to discuss the challenges in the health care, biology, and natural resources communities. The role of cloud computing, the dominance of file-based solutions in science applications, in-situ and predictive analysis, and commercial software use in academic environments were discussed in depth as well. This paper summarizes the discussions of this workshop.

  9. Grid Computing Application for Brain Magnetic Resonance Image Processing

    International Nuclear Information System (INIS)

    Valdivia, F; Crépeault, B; Duchesne, S

    2012-01-01

    This work emphasizes the use of grid computing and web technology for automatic post-processing of brain magnetic resonance images (MRI) in the context of neuropsychiatric (Alzheimer's disease) research. Post-acquisition image processing is achieved through the interconnection of several individual processes into pipelines. Each process has input and output data ports, options and execution parameters, and performs single tasks such as: a) extracting individual image attributes (e.g. dimensions, orientation, center of mass), b) performing image transformations (e.g. scaling, rotation, skewing, intensity standardization, linear and non-linear registration), c) performing image statistical analyses, and d) producing the necessary quality control images and/or files for user review. The pipelines are built to perform specific sequences of tasks on the alphanumeric data and MRIs contained in our database. The web application is coded in PHP and allows the creation of scripts to create, store and execute pipelines and their instances either on our local cluster or on high-performance computing platforms. To run an instance on an external cluster, the web application opens a communication tunnel through which it copies the necessary files, submits the execution commands and collects the results. We present results from system tests for the processing of a set of 821 brain MRIs from the Alzheimer's Disease Neuroimaging Initiative study via a nonlinear registration pipeline composed of 10 processes. Our results show successful execution on both local and external clusters, and a 4-fold increase in performance when using the external cluster. However, the latter's performance does not scale linearly as queue waiting times and execution overhead increase with the number of tasks to be executed.
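
    The pipeline idea, single-task processes whose output ports feed the next step's input ports, can be sketched as follows; the step functions are stand-ins, not the actual MRI tools:

        # Toy pipeline: each process performs a single task and its output
        # feeds the next step's input. Step functions are stand-ins.
        def scale(img):
            return [[2 * v for v in row] for row in img]

        def register(img):
            return img                          # placeholder transform

        def qc_report(img):
            print("QC: dims =", len(img), "x", len(img[0]))
            return img

        pipeline = [("scaling", scale), ("registration", register), ("qc", qc_report)]

        def run(steps, data):
            for name, step in steps:
                data = step(data)               # output port -> next input port
            return data

        result = run(pipeline, [[1, 2], [3, 4]])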

  10. The eNanoMapper database for nanomaterial safety information

    Directory of Open Access Journals (Sweden)

    Nina Jeliazkova

    2015-07-01

    Background: The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs. Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs.Results: The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API, and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms.Conclusion: We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the “representational state
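
    Retrieving experimental data through such an API with a generic HTTP client might look like the sketch below; the host, endpoint path, parameters and JSON layout are assumptions for illustration, not the documented eNanoMapper interface:

        import requests

        BASE = "https://example.org/enanomapper"   # placeholder host

        # Hypothetical endpoint, parameters and JSON layout, for illustration only.
        resp = requests.get(f"{BASE}/substance",
                            params={"type": "nanoparticle", "page": 0},
                            headers={"Accept": "application/json"},
                            timeout=30)
        resp.raise_for_status()
        for substance in resp.json().get("substance", []):
            print(substance.get("name"), substance.get("substanceType"))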

  11. Membrane computing: brief introduction, recent results and applications.

    Science.gov (United States)

    Păun, Gheorghe; Pérez-Jiménez, Mario J

    2006-07-01

    The internal organization and functioning of living cells, as well as their cooperation in tissues and higher order structures, can be a rich source of inspiration for computer science, not fully exploited at the present date. Membrane computing is an answer to this challenge, well developed at the theoretical (mathematical and computability theory) level, already having several applications (via usual computers), but without having yet a bio-lab implementation. After briefly discussing some general issues related to natural computing, this paper provides an informal introduction to membrane computing, focused on the main ideas, the main classes of results and of applications. Then, three recent achievements, of three different types, are briefly presented, with emphasis on the usefulness of membrane computing as a framework for devising models of interest for biological and medical research.
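
    To give a flavor of the formalism, the sketch below performs one evolution step of a toy P system: multiset rewriting rules fire in a maximally parallel way inside a single membrane, and products become available only at the next step. The rules are invented for illustration:

        from collections import Counter

        # Invented rules over a multiset of objects inside one membrane.
        rules = [(Counter("aa"), Counter("b")),   # a a -> b
                 (Counter("b"), Counter("c"))]    # b   -> c

        def step(contents):
            produced = Counter()
            applied = True
            while applied:                 # keep firing rules while any is enabled
                applied = False
                for lhs, rhs in rules:
                    if all(contents[o] >= n for o, n in lhs.items()):
                        contents -= lhs
                        produced += rhs
                        applied = True
            return contents + produced     # products join the membrane afterwards

        print(step(Counter("aaaab")))      # Counter({'b': 2, 'c': 1})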

  12. Serialization and persistent objects turning data structures into efficient databases

    CERN Document Server

    Soukup, Jiri

    2014-01-01

    Recently, the pressure for fast processing and efficient storage of large data with complex relations increased beyond the capability of traditional databases. Typical examples include iPhone applications, computer aided design - both electrical and mechanical, biochemistry applications, and incremental compilers. Serialization, which is sometimes used in such situations is notoriously tedious and error prone.In this book, Jiri Soukup and Petr Macháček show in detail how to write programs which store their internal data automatically and transparently to disk. Together with special data structure libraries which treat relations among objects as first-class entities, and with a UML class-diagram generator, the core application code is much simplified. The benchmark chapter shows a typical example where persistent data is faster by the order of magnitude than with a traditional database, in both traversing and accessing the data.The authors explore and exploit advanced features of object-oriented languages in a...
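
    A Python stand-in for the serialization theme (the book's own examples are in C++): built-in pickling of an object graph with a back-pointer, the kind of hand-rolled persistence the authors contrast with transparent, automatic storage:

        import pickle

        # Invented Part class; pickle serializes the whole object graph,
        # including the back-pointer cycle, and preserves shared identity.
        class Part:
            def __init__(self, name):
                self.name = name
                self.children = []

        root = Part("assembly")
        gear = Part("gear")
        gear.parent = root               # back-pointer: the graph has a cycle
        root.children.append(gear)

        blob = pickle.dumps(root)        # serialize graph, cycle included
        restored = pickle.loads(blob)
        print(restored.children[0].parent is restored)   # True: identity preserved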

  13. Building strategies for tsunami scenarios databases to be used in a tsunami early warning decision support system: an application to western Iberia

    Science.gov (United States)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.

    2012-04-01

    One of the most challenging goals that the geo-scientific community is facing after the catastrophic tsunami occurred on December 2004 in the Indian Ocean is to develop the so-called "next generation" Tsunami Early Warning Systems (TEWS). Indeed, the meaning of "next generation" does not refer to the aim of a TEWS, which obviously remains to detect whether a tsunami has been generated or not by a given source and, in the first case, to send proper warnings and/or alerts in a suitable time to all the countries and communities that can be affected by the tsunami. Instead, "next generation" identifies with the development of a Decision Support System (DSS) that, in general terms, relies on 1) an integrated set of seismic, geodetic and marine sensors whose objective is to detect and characterise the possible tsunamigenic sources and to monitor instrumentally the time and space evolution of the generated tsunami, 2) databases of pre-computed numerical tsunami scenarios to be suitably combined based on the information coming from the sensor environment and to be used to forecast the degree of exposition of different coastal places both in the near- and in the far-field, 3) a proper overall (software) system architecture. The EU-FP7 TRIDEC Project aims at developing such a DSS and has selected two test areas in the Euro-Mediterranean region, namely the western Iberian margin and the eastern Mediterranean (Turkish coasts). In this study, we discuss the strategies that are being adopted in TRIDEC to build the databases of pre-computed tsunami scenarios and we show some applications to the western Iberian margin. In particular, two different databases are being populated, called "Virtual Scenario Database" (VSDB) and "Matching Scenario Database" (MSDB). The VSDB contains detailed simulations of few selected earthquake-generated tsunamis. The cases provided by the members of the VSDB are computed "real events"; in other words, they represent the unknowns that the TRIDEC

  14. Go Figure: Computer Database Adds the Personal Touch.

    Science.gov (United States)

    Gaffney, Jean; Crawford, Pat

    1992-01-01

    A database for recordkeeping for a summer reading club was developed for a public library system using an IBM PC and Microsoft Works. Use of the database resulted in more efficient program management, giving librarians more time to spend with patrons and enabling timely awarding of incentives. (LAE)

  15. The Research of Computer Aided Farm Machinery Designing Method Based on Ergonomics

    Science.gov (United States)

    Gao, Xiyin; Li, Xinling; Song, Qiang; Zheng, Ying

    With the development of the agricultural economy, the variety of farm machinery products increases gradually, and ergonomics questions are becoming more and more prominent. The widespread application of computer aided machinery design makes farm machinery design intuitive, flexible and convenient. At present, because existing computer aided ergonomics software lacks a human body database suitable for farm machinery design in China, farm machinery designs show deviations in ergonomics analysis. This article proposes using the open database interface procedure in CATIA to establish a human body database aimed at farm machinery design; reading the human body data into the ergonomics module of CATIA produces a virtual body for practical application, and the human posture analysis and human activity analysis modules can then be used to analyze the ergonomics of farm machinery. Thus a computer aided farm machinery design method based on ergonomics can be realized.

  16. Imperceptible watermarking for security of fundus images in tele-ophthalmology applications and computer-aided diagnosis of retina diseases.

    Science.gov (United States)

    Singh, Anushikha; Dutta, Malay Kishore

    2017-12-01

    The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent error in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain, with an adaptive quantization parameter that maintains perceptual transparency for a variety of fundus images, whether healthy or disease-affected. In the proposed method, insertion of the watermark does not affect the automatic image-processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis associated with the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system was tested on a comprehensive database of fundus images and the results are convincing. Results indicate that the proposed watermarking method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable for authentication of fundus images for computer-aided diagnosis and tele-ophthalmology applications. Copyright © 2017 Elsevier B.V. All rights reserved.
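
    The embedding step described above can be sketched as quantization of the largest singular value of each image block. The following minimal NumPy example embeds and extracts a bit string this way; the fixed quantization step q stands in for the paper's adaptive parameter, so this illustrates the principle rather than the authors' implementation.

```python
import numpy as np

def embed_bits_svd(image: np.ndarray, bits, q: float = 24.0) -> np.ndarray:
    """Embed one bit per 8x8 block by quantizing the largest singular value."""
    img = image.astype(np.float64).copy()
    h, w = img.shape
    it = iter(bits)
    for r in range(0, h - 7, 8):
        for c in range(0, w - 7, 8):
            try:
                bit = next(it)
            except StopIteration:
                return img
            u, s, vt = np.linalg.svd(img[r:r+8, c:c+8])
            # Quantization index modulation: even multiples of q encode 0,
            # odd multiples encode 1.
            k = np.round(s[0] / q)
            if int(k) % 2 != bit:
                k += 1
            s[0] = k * q
            img[r:r+8, c:c+8] = u @ np.diag(s) @ vt
    return img

def extract_bits_svd(image: np.ndarray, n_bits: int, q: float = 24.0):
    """Recover embedded bits from the parity of the quantized singular value."""
    h, w = image.shape
    bits = []
    for r in range(0, h - 7, 8):
        for c in range(0, w - 7, 8):
            if len(bits) == n_bits:
                return bits
            s = np.linalg.svd(image[r:r+8, c:c+8], compute_uv=False)
            bits.append(int(np.round(s[0] / q)) % 2)
    return bits

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64)).astype(np.float64)  # stand-in image
payload = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_bits_svd(img, payload)
assert extract_bits_svd(marked, len(payload)) == payload
```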

  17. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  18. Advanced technologies for scalable ATLAS conditions database access on the grid

    International Nuclear Information System (INIS)

    Basset, R; Canali, L; Girone, M; Hawkings, R; Valassi, A; Viegas, F; Dimitrov, G; Nevski, P; Vaniachine, A; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations, an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflows, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of the required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing, characterized by peak loads that can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This was achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of the database stress tests is to detect the scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions DB data access is limited by disk I/O throughput. An unacceptable side effect of disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions DB data at all ten ATLAS Tier-1 sites using the technology of Oracle Streams. To avoid such bottlenecks we prototyped and tested a novel approach to database peak load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library first sends a pilot query to the database server.
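
    The pilot-query idea lends itself to a short sketch: probe the server with a trivial query, and only send the expensive payload when the probe returns quickly. Everything below (function names, the probe statement, thresholds) is hypothetical; it illustrates the load-avoidance pattern, not the actual ATLAS utility library.

```python
import random
import time

PROBE_SQL = "SELECT 1 FROM DUAL"   # near-zero-cost probe on Oracle-style servers
LATENCY_BUDGET_S = 0.5             # illustrative health threshold
MAX_RETRIES = 10

def query_with_pilot(run_query, real_sql: str):
    """Send a cheap pilot query first; back off while the server is loaded.

    `run_query` stands in for whatever database client the Grid job uses.
    """
    for attempt in range(MAX_RETRIES):
        t0 = time.monotonic()
        run_query(PROBE_SQL)                   # the pilot query
        if time.monotonic() - t0 <= LATENCY_BUDGET_S:
            return run_query(real_sql)         # server healthy: real payload
        # Server under load: exponential backoff with jitter so that many
        # concurrent Grid jobs do not retry in lock-step.
        time.sleep(min(60, 2 ** attempt) * random.uniform(0.5, 1.5))
    raise RuntimeError("database server stayed overloaded; giving up")
```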

  19. Development and application of basis database for materials life cycle assessment in china

    Science.gov (United States)

    Li, Xiaoqing; Gong, Xianzheng; Liu, Yu

    2017-03-01

    As a data-intensive method, materials life cycle assessment (MLCA) requires high-quality environmental burden data as an important premise, and the reliability of the data directly influences the reliability of the assessment results and their applicability. Building a Chinese MLCA database therefore provides the basic data and technical support needed to carry out and improve LCA practice. First, recent progress on databases related to materials life cycle assessment research and development is introduced. Secondly, in accordance with the requirements of the ISO 14040 series of standards, the database framework and the main datasets of materials life cycle assessment are studied. Thirdly, an MLCA data platform based on big data is developed. Finally, future research work is proposed and discussed.

  20. Computational methods for industrial radiation measurement applications

    International Nuclear Information System (INIS)

    Gardner, R.P.; Guo, P.; Ao, Q.

    1996-01-01

    Computational methods have been used with considerable success to complement radiation measurements in solving a wide range of industrial problems. The almost exponential growth of computer capability and applications in the last few years leads to a "black box" mentality for radiation measurement applications. If a black box is defined as any radiation measurement device that is capable of measuring the parameters of interest when a wide range of operating and sample conditions may occur, then the development of computational methods for industrial radiation measurement applications should now be focused on the black box approach and the deduction of properties of interest from the response with acceptable accuracy and reasonable efficiency. Nowadays, increasingly better understanding of radiation physical processes, more accurate and complete fundamental physical data, and more advanced modeling and software/hardware techniques have made it possible to make giant strides in that direction with new ideas implemented with computer software. The Center for Engineering Applications of Radioisotopes (CEAR) at North Carolina State University has been working on a variety of projects in the area of radiation analyzers and gauges for accomplishing this for quite some time, and they are discussed here with emphasis on current accomplishments

  1. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    Science.gov (United States)

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  2. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Bhateja, Vikrant; Udgata, Siba; Pattnaik, Prasant

    2017-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the International Conference on Frontiers of Intelligent Computing: Theory and Applications (FICTA 2016), held at the School of Computer Engineering, KIIT University, Bhubaneswar, India, during 16-17 September 2016. The book presents theories, methodologies, new ideas, experiences and applications in all areas of intelligent computing and its applications to various engineering disciplines like computer science, electronics, electrical and mechanical engineering.

  3. Application engineering for process computer systems

    International Nuclear Information System (INIS)

    Mueller, K.

    1975-01-01

    The variety of tasks for process computers in nuclear power stations necessitates the centralization of all production stages from the planning stage to the delivery of the finished process computer system (PRA) to the user. This so-called 'application engineering' comprises all of the activities connected with the application of the PRA: a) establishment of the PRA concept, b) project counselling, c) handling of offers, d) handling of orders, e) internal handling of orders, f) technical counselling, g) establishing of parameters, h) monitoring deadlines, i) training of customers, j) compiling an operation manual. (orig./AK) [de]

  4. EXPLORATIONS IN QUANTUM COMPUTING FOR FINANCIAL APPLICATIONS

    OpenAIRE

    Gare, Jesse

    2010-01-01

    Quantum computers have the potential to increase the solution speed for many computational problems. This paper is a first step into possible applications for quantum computing in the context of computational finance. The fundamental ideas of quantum computing are introduced, followed by an exposition of the algorithms of Deutsch and Grover. Improved mean and median estimation are shown as results of Grover's generalized framework. The algorithm for mean estimation is refined to an improved M...

  5. BIOPEP database and other programs for processing bioactive peptide sequences.

    Science.gov (United States)

    Minkiewicz, Piotr; Dziuba, Jerzy; Iwaniak, Anna; Dziuba, Marta; Darewicz, Małgorzata

    2008-01-01

    This review presents the potential for application of computational tools in peptide science based on a sample BIOPEP database and program as well as other programs and databases available via the World Wide Web. The BIOPEP application contains a database of biologically active peptide sequences and a program enabling construction of profiles of the potential biological activity of protein fragments, calculation of quantitative descriptors as measures of the value of proteins as potential precursors of bioactive peptides, and prediction of bonds susceptible to hydrolysis by endopeptidases in a protein chain. Other bioactive and allergenic peptide sequence databases are also presented. Programs enabling the construction of binary and multiple alignments between peptide sequences, the construction of sequence motifs attributed to a given type of bioactivity, searching for potential precursors of bioactive peptides, and the prediction of sites susceptible to proteolytic cleavage in protein chains are available via the Internet as are other approaches concerning secondary structure prediction and calculation of physicochemical features based on amino acid sequence. Programs for prediction of allergenic and toxic properties have also been developed. This review explores the possibilities of cooperation between various programs.
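
    To make the profile-construction idea concrete, the sketch below scans a protein sequence for known bioactive fragments and computes an occurrences-per-residue descriptor. The two-entry motif table is a toy stand-in for the BIOPEP database, and the descriptor is a simplified version of the quantitative measures the review discusses.

```python
# Toy motif table; illustrative entries only, not the BIOPEP database itself.
BIOACTIVE = {
    "VPP": "ACE inhibitor",
    "IPP": "ACE inhibitor",
}

def activity_profile(protein: str):
    """List (position, fragment, activity) for every bioactive fragment found."""
    hits = []
    for fragment, activity in BIOACTIVE.items():
        start = protein.find(fragment)
        while start != -1:
            hits.append((start + 1, fragment, activity))  # 1-based position
            start = protein.find(fragment, start + 1)
    return sorted(hits)

def frequency_descriptor(protein: str) -> float:
    """Occurrences of bioactive fragments per residue of the precursor."""
    return len(activity_profile(protein)) / len(protein)

seq = "MKVLIPPVPPQK"                      # toy precursor sequence
print(activity_profile(seq))              # [(5, 'IPP', ...), (8, 'VPP', ...)]
print(round(frequency_descriptor(seq), 3))
```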

  6. Palantiri: a distributed real-time database system for process control

    International Nuclear Information System (INIS)

    Tummers, B.J.; Heubers, W.P.J.

    1992-01-01

    The medium-energy accelerator MEA, located in Amsterdam, is controlled by a heterogeneous computer network. A large real-time database contains the parameters involved in the control of the accelerator and the experiments. This database system was implemented about ten years ago and has since been extended several times. In response to increased needs the database system has been redesigned. The new database environment, as described in this paper, consists of two new concepts: (1) a Palantir, a per-machine process that stores the locally declared data and forwards all non-local requests for data access to the appropriate machine; it acts as a storage device for data and a looking glass upon the world. (2) Golems: working units that define the data within the Palantir and that have knowledge of the hardware they control. Applications access the data of a Golem by name (names resemble Unix path names). The Palantir that runs on the same machine as the application handles the distribution of access requests. This paper focuses on the Palantir concept as a distributed data storage and event-handling device for process control. (author)
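
    The Palantir concept can be sketched as a per-machine store that serves local names and forwards everything else to the owning machine. In the toy version below a shared dictionary stands in for the network layer; class and path names are illustrative, not taken from the MEA implementation.

```python
class Palantir:
    """Per-machine store: local names are served locally, others forwarded."""

    def __init__(self, host: str, network: dict):
        self.host = host
        self.local = {}            # data declared by local "Golems"
        self.network = network     # host name -> Palantir (stand-in for RPC)
        network[host] = self

    def _owner(self, name: str) -> str:
        # Unix-path-like names; the first component identifies the owner.
        return name.strip("/").split("/")[0]

    def write(self, name: str, value):
        if self._owner(name) == self.host:
            self.local[name] = value
        else:
            self.network[self._owner(name)].write(name, value)  # forward

    def read(self, name: str):
        if self._owner(name) == self.host:
            return self.local[name]
        return self.network[self._owner(name)].read(name)       # forward

net = {}
a, b = Palantir("mea1", net), Palantir("mea2", net)
a.write("/mea2/magnets/dipole07/current", 312.5)  # transparently forwarded
print(b.read("/mea2/magnets/dipole07/current"))   # 312.5, stored on mea2
```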

  7. 2nd International Conference on Intelligent Computing and Applications

    CERN Document Server

    Dash, Subhransu; Das, Swagatam; Panigrahi, Bijaya

    2017-01-01

    The Second International Conference on Intelligent Computing and Applications was an annual research conference that aimed to bring together researchers around the world to exchange research results and address open issues in all aspects of intelligent computing and applications. The main objective of the second edition of the conference, for scientists, scholars, engineers and students from academia and industry, was to present ongoing research activities and hence to foster research relations between universities and industry. The theme of the conference unified the picture of contemporary intelligent computing techniques as an integral concept that highlights the trends in computational intelligence and bridges theoretical research concepts with applications. The conference covered vital issues ranging from intelligent computing, soft computing and communication to machine learning, industrial automation, process technology and robotics. This conference also provided a variety of opportunities for ...

  8. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  9. ISSUES IN MOBILE DISTRIBUTED REAL TIME DATABASES: PERFORMANCE AND REVIEW

    OpenAIRE

    VISHNU SWAROOP,; Gyanendra Kumar Gupta,; UDAI SHANKER

    2011-01-01

    The increase in handy, small electronic devices in computing has made computing more popular and useful in business. Tremendous advances in wireless networks and portable computing devices have led to the development of mobile computing. Support for real-time database systems depends upon timing constraints; the availability of distributed data and ubiquitous computing pull the mobile database concept, which emerges in a new form of technology as mobile distributed ...

  10. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. IOC is a linkage control system between subprojects for sharing and integrating the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage the documents and reports produced over the course of the project.

  11. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. IOC is a linkage control system between subprojects for sharing and integrating the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage the documents and reports produced over the course of the project.

  12. Outline of computer application in PNC

    International Nuclear Information System (INIS)

    Aoki, Minoru

    1990-01-01

    Computer application systems are an important resource for R and D (research and development) in PNC. Various types of computer systems are widely used for experiments, evaluation and analysis, plant operation and other jobs in PNC. Currently, PNC computer centers are established at the Oarai Engineering Center and Tokai Works. The former operates a large-scale digital computer and supercomputer systems; the latter operates only a large-scale digital computer system. These computer systems are joined in the PNC Information Network, which connects the Head Office and the branches at Oarai, Tokai, Ningyotoge and Fugen by super digital circuits. In the near future, the computer centers will be brought together in order to raise the operating efficiency of the computer systems. A new computer center, called the 'Information Center', is under construction at the Oarai Engineering Center. (author)

  13. Towards cloud-centric distributed database evaluation

    OpenAIRE

    Seybold, Daniel

    2016-01-01

    The area of cloud computing has also pushed the evolution of distributed databases, resulting in a variety of distributed database systems, which can be classified into relational, NoSQL and NewSQL database systems. In general, all representatives of these database system classes claim to provide elasticity and "unlimited" horizontal scalability. As these characteristics comply with the cloud, distributed databases seem to be a perfect match for Database-as-a-Service systems (DBaaS).

  14. Towards Cloud-centric Distributed Database Evaluation

    OpenAIRE

    Seybold, Daniel

    2016-01-01

    The area of cloud computing has also pushed the evolution of distributed databases, resulting in a variety of distributed database systems, which can be classified into relational, NoSQL and NewSQL database systems. In general, all representatives of these database system classes claim to provide elasticity and "unlimited" horizontal scalability. As these characteristics comply with the cloud, distributed databases seem to be a perfect match for Database-as-a-Service systems (DBaaS).

  15. Journal of Computer Science and Its Application: Site Map

    African Journals Online (AJOL)

  16. Journal of Computer Science and Its Application: Journal Sponsorship

    African Journals Online (AJOL)

  17. A Method to Ease the Deployment of Web Applications that Involve Database Systems A Method to Ease the Deployment of Web Applications that Involve Database Systems

    Directory of Open Access Journals (Sweden)

    Antonio Vega Corona

    2012-02-01

    Full Text Available The continuous growth of the Internet has driven people all around the globe to perform transactions online, search for information or navigate using a browser. As more people feel comfortable using a Web browser, more software companies are offering Web interfaces as an alternative way to provide access to their applications. The nature of the Web connection and the restrictions imposed by the available bandwidth make the successful integration of Web applications and database systems critical. Because database applications provide a graphical interface to edit the information in the database, and because each column in a database table maps to a control in the graphical interface, the deployment of these applications can be time-consuming: field validation and referential integrity rules must be observed. An object-oriented design is proposed to ease the development of applications that use database systems.
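
    The column-to-control mapping that makes these applications tedious to write by hand is exactly what can be automated. The sketch below derives form-field descriptions, including required-field validation, from a table schema; the schema and widget names are illustrative, and the paper's own object-oriented design is not reproduced.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        credit_limit REAL
    )
""")

# Illustrative mapping from SQL column types to GUI widget kinds.
WIDGET_FOR_TYPE = {"INTEGER": "spinbox", "REAL": "number", "TEXT": "textbox"}

def form_fields(table: str):
    """Derive one form-control description per column from the schema."""
    fields = []
    for _cid, name, ctype, notnull, _default, pk in conn.execute(
            f"PRAGMA table_info({table})"):
        fields.append({
            "name": name,
            "widget": WIDGET_FOR_TYPE.get(ctype.upper(), "textbox"),
            "required": bool(notnull) and not pk,  # NOT NULL becomes validation
            "editable": not pk,                    # keys are system-assigned here
        })
    return fields

for field in form_fields("customer"):
    print(field)
```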

  18. Hydrogen Leak Detection Sensor Database

    Science.gov (United States)

    Baker, Barton D.

    2010-01-01

    This slide presentation reviews the characteristics of the Hydrogen Sensor database. The database is the result of NASA's continuing interest in and improvement of its ability to detect and assess gas leaks in space applications. The database specifics and a snapshot of an entry in the database are reviewed. Attempts were made to determine the applicability of each of the 65 sensors for ground and/or vehicle use.

  19. GSIMF: a web service based software and database management system for the next generation grids

    International Nuclear Information System (INIS)

    Wang, N; Ananthan, B; Gieraltowski, G; May, E; Vaniachine, A

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids

  20. Improvement of level-1 PSA computer code package

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This is the fifth (final) year of phase I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, and (2) development of methodology for applying PSA techniques to the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP was converted to the PC-Windows environment. To improve the efficiency of performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling were developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) was developed, considering dependency among the source data. A computer program that handles dependency among data sources was also developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods were reviewed and a CCF database was established; impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, that handles the CCF database was also developed, and a CCF analysis reflecting plant-specific defensive strategies against CCF events was performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique was applied to review the feasibility study of on-line maintenance and to the prioritization of in-service testing (IST) of motor-operated valves (MOV). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies were adopted and applied to improving the reliability of the emergency diesel generators (EDG) of nuclear power plants. To support RCA and RCM analyses, two software programs were developed: EPIS and RAM Pro. (author). 129 refs., 20 tabs., 60 figs.

  1. Improvement of level-1 PSA computer code package

    International Nuclear Information System (INIS)

    Kim, Tae Woon; Park, C. K.; Kim, K. Y.; Han, S. H.; Jung, W. D.; Chang, S. C.; Yang, J. E.; Sung, T. Y.; Kang, D. I.; Park, J. H.; Lee, Y. H.; Kim, S. H.; Hwang, M. J.; Choi, S. Y.

    1997-07-01

    This is the fifth (final) year of phase I of the Government-sponsored Mid- and Long-term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into two main activities: (1) improvement of level-1 PSA methodology, and (2) development of methodology for applying PSA techniques to the operation and maintenance of nuclear power plants. The level-1 PSA code KIRAP was converted to the PC-Windows environment. To improve the efficiency of performing PSA, a fast cutset generation algorithm and an analytical technique for handling logical loops in fault tree modeling were developed. Using about 30 foreign generic data sources, a generic component reliability database (GDB) was developed, considering dependency among the source data. A computer program that handles dependency among data sources was also developed, based on a three-stage Bayesian updating technique. Common cause failure (CCF) analysis methods were reviewed and a CCF database was established; impact vectors can be estimated from this CCF database. A computer code, called MPRIDP, that handles the CCF database was also developed, and a CCF analysis reflecting plant-specific defensive strategies against CCF events was performed. A risk monitor computer program, called Risk Monster, is being developed for application to the operation and maintenance of nuclear power plants. The PSA application technique was applied to review the feasibility study of on-line maintenance and to the prioritization of in-service testing (IST) of motor-operated valves (MOV). Finally, root cause analysis (RCA) and reliability-centered maintenance (RCM) technologies were adopted and applied to improving the reliability of the emergency diesel generators (EDG) of nuclear power plants. To support RCA and RCM analyses, two software programs were developed: EPIS and RAM Pro. (author). 129 refs., 20 tabs., 60 figs.

  2. Recent trends in grid computing

    International Nuclear Information System (INIS)

    Miura, Kenichi

    2004-01-01

    Grid computing is a technology which allows uniform and transparent access to geographically dispersed computational resources, such as computers, databases, and experimental and observational equipment, via high-speed, high-bandwidth networking. The commonly used analogy is that of the electrical power grid, whereby household electricity is made available from outlets on the wall and little thought needs to be given to where the electricity is generated and how it is transmitted. Grid usage also includes distributed parallel computing, high-throughput computing, data-intensive computing (data grids) and collaborative computing. This paper reviews the historical background, software structure, current status and ongoing grid projects, including applications of grid technology to nuclear fusion research. (author)

  3. NoSQL database scaling

    OpenAIRE

    Žardin, Norbert

    2017-01-01

    NoSQL database scaling is a decision where system resources or financial expenses are traded for database performance or other benefits. By scaling a database, database performance and resource usage might increase or decrease, and such changes might have a negative impact on an application that uses the database. In this work it is analyzed how database scaling affects database resource usage and performance. As a result, calculations are acquired, using which database scaling types and differe...

  4. The establish and application of equipment reliability database in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Zheng Wei; Li He

    2006-03-01

    Taking Daya Bay Nuclear Power Plant as a case, the collection and handling of equipment reliability data, the calculation method for reliability parameters, and the establishment and application of reliability databases are discussed. The data sources involve the design information of the equipment, operation information, maintenance information, periodic test records, etc. The equipment reliability database is built on a foundation of operating experience. It provides a valid tool for thoroughly and objectively recording the operating history and present condition of the various equipment of the plant and for supervising the performance of the equipment, especially safety-related equipment; it provides very practical and valuable information for enhancing the safety and availability management of the equipment and ensuring the safe and economic operation of the plant; and it provides the essential data for research and applications in safety management, reliability analysis, probabilistic safety assessment, reliability-centered maintenance and economic management in nuclear power plants. (authors)

  5. Computers and clinical arrhythmias.

    Science.gov (United States)

    Knoebel, S B; Lovelace, D E

    1983-02-01

    Cardiac arrhythmias are ubiquitous in normal and abnormal hearts. These disorders may be life-threatening or benign, symptomatic or unrecognized. Arrhythmias may be the precursor of sudden death, a cause or effect of cardiac failure, a clinical reflection of acute or chronic disorders, or a manifestation of extracardiac conditions. Progress is being made toward unraveling the diagnostic and therapeutic problems involved in arrhythmogenesis. Many of the advances would not be possible, however, without the availability of computer technology. To preserve the proper balance and purposeful progression of computer usage, engineers and physicians have been exhorted not to work independently in this field. Both should learn some of the other's trade. The two disciplines need to come together to solve important problems with computers in cardiology. The intent of this article was to acquaint the practicing cardiologist with some of the extant and envisioned computer applications and some of the problems with both. We conclude that computer-based database management systems are necessary for sorting out the clinical factors of relevance for arrhythmogenesis, but computer database management systems are beset with problems that will require sophisticated solutions. The technology for detecting arrhythmias on routine electrocardiograms is quite good but human over-reading is still required, and the rationale for computer application in this setting is questionable. Systems for qualitative, continuous monitoring and review of extended time ECG recordings are adequate with proper noise rejection algorithms and editing capabilities. The systems are limited presently for clinical application to the recognition of ectopic rhythms and significant pauses. Attention should now be turned to the clinical goals for detection and quantification of arrhythmias. We should be asking the following questions: How quantitative do systems need to be? Are computers required for the detection of

  6. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    Science.gov (United States)

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model, based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through more profound analyses of the characteristics, pathogenesis and other core issues of gastric cancer. A cancer clinical database is important for promoting the development of precision medicine; therefore, it is necessary to pay close attention to the construction and management of such databases. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. To ensure the good quality of the database, its design and management follow a strict standard operating procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data. The construction and management of clinical databases must also be strengthened and innovated.

  7. Computer Based Test Untuk Seleksi Masuk Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Full Text Available Selection of new student candidates can be done with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation and testing. This study produced a CBT application in which the questions drawn from the question bank are randomized with the Fisher-Yates Shuffle method so that the same question is never presented twice. To secure the question data while it is connected to the network, a message-encoding technique is required so that each question passes through encryption and decryption before being displayed; for this, the RSA cryptography algorithm is used. The software was designed using the waterfall model, the database using entity-relationship diagrams, and the interface using Hypertext Markup Language (HTML), Cascading Style Sheets (CSS) and jQuery; the system was implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network
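
    For reference, the randomization step named above is the classic Fisher-Yates shuffle, sketched below; the question bank size, exam size and the RSA transport layer of the actual application are not reproduced here.

```python
import random

def fisher_yates_shuffle(items: list) -> list:
    """In-place Fisher-Yates shuffle: each element is swapped into its final
    position exactly once, so no drawn question can repeat."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)          # pick from the unshuffled prefix
        items[i], items[j] = items[j], items[i]
    return items

# Drawing an exam: shuffle the bank once, then slice off the first 40 items.
question_ids = list(range(1, 101))        # a 100-question bank (illustrative)
exam = fisher_yates_shuffle(question_ids)[:40]
print(exam)
```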

  8. Proceedings of the 2011 2nd International Congress on Computer Applications and Computational Science

    CERN Document Server

    Nguyen, Quang

    2012-01-01

    The latest inventions in computer technology influence most human daily activities. In the near future, there is a tendency that all aspects of human life will depend on computer applications. In manufacturing, robotics and automation have become vital for high-quality products. In education, the model of teaching and learning is focusing more on electronic media than traditional ones. Issues related to energy savings and the environment are becoming critical.   Computational Science should enhance the quality of human life, not only solve its problems. Computational Science should help humans make wise decisions by presenting choices and their possible consequences. Computational Science should help us make sense of observations, understand natural language, and plan and reason with extensive background knowledge. Intelligence with wisdom is perhaps an ultimate goal for human-oriented science.   This book is a compilation of some recent research findings in computer application and computational sci...

  9. Guide to cloud computing for business and technology managers from distributed computing to cloudware applications

    CERN Document Server

    Kale, Vivek

    2014-01-01

    Guide to Cloud Computing for Business and Technology Managers: From Distributed Computing to Cloudware Applications unravels the mystery of cloud computing and explains how it can transform the operating contexts of business enterprises. It provides a clear understanding of what cloud computing really means, what it can do, and when it is practical to use. Addressing the primary management and operation concerns of cloudware, including performance, measurement, monitoring, and security, this pragmatic book:Introduces the enterprise applications integration (EAI) solutions that were a first ste

  10. Computer applications in nuclear medicine

    International Nuclear Information System (INIS)

    Lancaster, J.L.; Lasher, J.C.; Blumhardt, R.

    1987-01-01

    Digital computers were introduced to nuclear medicine research as an imaging modality in the mid-1960s. Widespread use of imaging computers (scintigraphic computers) was not seen in nuclear medicine clinics until the mid-1970s. For the user, the ability to acquire scintigraphic images into the computer for quantitative purposes, with accurate selection of regions of interest (ROIs), promised almost endless computational capabilities. Investigators quickly developed many new methods for quantitating the distribution patterns of radiopharmaceuticals within the body, both spatially and temporally. The computer was used to acquire data on practically every organ that could be imaged by means of gamma cameras or rectilinear scanners. Methods of image processing borrowed from other disciplines were applied to scintigraphic computer images in an attempt to improve image quality. Image processing in nuclear medicine has evolved into a relatively extensive set of tasks that can be called on by the user to provide additional clinical information rather than to improve image quality. Digital computers are utilized in nuclear medicine departments for nonimaging applications also. Patient scheduling, archiving, radiopharmaceutical inventory, radioimmunoassay (RIA), and health physics are just a few of the areas in which the digital computer has proven helpful. The computer is useful in any area in which a large quantity of data needs to be accurately managed, especially over a long period of time

  11. Accessing and using chemical databases

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Pavlov, Todor; Niemelä, Jay Russell

    2013-01-01

    Computer-based representation of chemicals makes it possible to organize data in chemical databases: collections of chemical structures and associated properties. Databases are widely used wherever efficient processing of chemical information is needed, including search, storage, retrieval, and dissemination. Structure and functionality of chemical databases are considered. The typical kinds of information found in a chemical database are considered: identification, structural, and associated data. Functionality of chemical databases is presented, with examples of search and access types. More details are included about the OASIS database and platform and the Danish (Q)SAR Database online. Various types of chemical database resources are discussed, together with a list of examples.

  12. Database for environmental monitoring in nuclear facilities

    International Nuclear Information System (INIS)

    Raceanu, Mircea; Varlam, Carmen; Iliescu, Mariana; Enache, Adrian; Faurescu, Ionut

    2006-01-01

    To ensure that an assessment can be made of the impact of nuclear facilities on the local environment, a program of environmental monitoring must be established well before nuclear facility commissioning. Enormous amounts of data must be stored and correlated, covering location, meteorology, and sample characterization of everything from water to different kinds of food, as well as radioactivity and isotopic measurements (e.g., for C-14 determination, C-13 isotopic correction is a must). Data modelling is a well-known mechanism for describing data structures at a high level of abstraction. Such models are often used to automatically create database structures and to generate the code structures used to access the databases. This has the disadvantage of losing data constraints that might be specified in data models for data checking. An embodiment of the system of the present application includes a computer-readable memory for storing a definitional data table that defines variable symbols representing the corresponding measurable physical quantities. Developing a database system implies setting up well-established rules for how the data should be stored and accessed, commonly called relational database theory. This consists of guidelines on issues such as how to avoid duplicating data, using the technique called normalization, and how to identify the unique identifier for a database record. (authors)

  13. Interactive computer graphics applications for compressible aerodynamics

    Science.gov (United States)

    Benson, Thomas J.

    1994-01-01

    Three computer applications have been developed to solve inviscid compressible fluids problems using interactive computer graphics. The first application is a compressible flow calculator which solves for isentropic flow, normal shocks, and oblique shocks or centered expansions produced by two dimensional ramps. The second application couples the solutions generated by the first application to a more graphical presentation of the results to produce a desk top simulator of three compressible flow problems: 1) flow past a single compression ramp; 2) flow past two ramps in series; and 3) flow past two opposed ramps. The third application extends the results of the second to produce a design tool which solves for the flow through supersonic external or mixed compression inlets. The applications were originally developed to run on SGI or IBM workstations running GL graphics. They are currently being extended to solve additional types of flow problems and modified to operate on any X-based workstation.
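
    The relations behind such a calculator are compact enough to sketch. The following functions implement the standard isentropic-flow and normal-shock formulas for a calorically perfect gas; they illustrate the physics the applications solve, not the original workstation code.

```python
GAMMA = 1.4  # ratio of specific heats for air

def isentropic_ratios(mach: float, g: float = GAMMA):
    """Stagnation-to-static temperature and pressure ratios:
    T0/T = 1 + (g-1)/2 * M^2,  p0/p = (T0/T)^(g/(g-1))."""
    t_ratio = 1.0 + 0.5 * (g - 1.0) * mach**2
    p_ratio = t_ratio ** (g / (g - 1.0))
    return t_ratio, p_ratio

def normal_shock(m1: float, g: float = GAMMA):
    """Downstream Mach number and static pressure ratio across a normal shock."""
    if m1 <= 1.0:
        raise ValueError("normal shock requires supersonic upstream flow")
    m2 = (((g - 1.0) * m1**2 + 2.0) / (2.0 * g * m1**2 - (g - 1.0))) ** 0.5
    p2_p1 = 1.0 + 2.0 * g / (g + 1.0) * (m1**2 - 1.0)
    return m2, p2_p1

print(isentropic_ratios(2.0))   # T0/T = 1.8, p0/p ~ 7.824
print(normal_shock(2.0))        # M2 ~ 0.577, p2/p1 = 4.5
```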

  14. Functionally Graded Materials Database

    Science.gov (United States)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to the public via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database includes 1,703 research information entries with 2,429 researcher data, 509 institution data and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and 2005, respectively. The English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap and how to use it. User access results and users' interests are discussed on the basis of the access analysis.

  15. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    Soft computing techniques, which are based on the information processing of biological systems, are now massively used in the areas of pattern recognition, prediction and planning, as well as acting on the environment. Ideally speaking, soft computing is not a subject of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that conforms to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors to demonstrate the use of soft computing techniques in various applications of engineering.

  16. Computational information geometry for image and signal processing

    CERN Document Server

    Critchley, Frank; Dodson, Christopher

    2017-01-01

    This book focuses on the application and development of information geometric methods in the analysis, classification and retrieval of images and signals. It provides introductory chapters to help those new to information geometry and applies the theory to several applications. This area has developed rapidly over recent years, propelled by the major theoretical developments in information geometry, efficient data and image acquisition and the desire to process and interpret large databases of digital information. The book addresses both the transfer of methodology to practitioners involved in database analysis and in its efficient computational implementation.

  17. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi; Ikeo, Kazuho; Katayama, Yukie; Kawabata, Takeshi; Kinjo, Akira R.; Kinoshita, Kengo; Kwon, Yeondae; Migita, Ohsuke; Mizutani, Hisashi; Muraoka, Masafumi; Nagata, Koji; Omori, Satoshi; Sugawara, Hideaki; Yamada, Daichi; Yura, Kei

    2016-01-01

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that helps life science researchers retrieve experts' knowledge stored in the databases and build new hypotheses about their research targets. This web application, named VaProS, puts stress on the interconnection between the functional information of genome sequences and protein 3D structures, such as the structural effect of a gene mutation. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in a quest for the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  18. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi

    2016-12-24

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that helps life science researchers retrieve experts' knowledge stored in the databases and build new hypotheses about their research targets. This web application, named VaProS, puts stress on the interconnection between the functional information of genome sequences and protein 3D structures, such as the structural effect of a gene mutation. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in a quest for the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  19. Cloud computing for data-intensive applications

    CERN Document Server

    Li, Xiaolin

    2014-01-01

    This book presents a range of cloud computing platforms for data-intensive scientific applications. It covers systems that deliver infrastructure as a service, including: HPC as a service; virtual networks as a service; scalable and reliable storage; algorithms that manage vast cloud resources and applications runtime; and programming models that enable pragmatic programming and implementation toolkits for eScience applications. Many scientific applications in clouds are also introduced, such as bioinformatics, biology, weather forecasting and social networks. Most chapters include case studie

  20. Network-based Database Course

    DEFF Research Database (Denmark)

    Nielsen, J.N.; Knudsen, Morten; Nielsen, Jens Frederik Dalsgaard

    A course in database design and implementation has been designed, utilizing existing network facilities. The course is an elementary course for students of computer engineering. Its purpose is to give the students theoretical database knowledge as well as practical experience with design and implementation. A tutorial relational database and the students' self-designed databases are implemented on the UNIX system of Aalborg University, thus giving the teacher the possibility of live demonstrations in the lecture room, and the students the possibility of interactive learning in their working rooms...

  1. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  2. Active in-database processing to support ambient assisted living systems.

    Science.gov (United States)

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
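
    The trigger-based detection described above can be sketched in a few lines. The example below uses SQLite purely for illustration: an AFTER INSERT trigger turns raw occupancy samples into a derived bed-exit event without the data ever leaving the DBMS. The schema, detection rule and names are hypothetical, not taken from the paper's implementation.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE bed_sensor (ts TEXT, occupied INTEGER);   -- 1 = in bed
    CREATE TABLE events (ts TEXT, kind TEXT);

    -- Active rule: fire on every insert; if the bed just became empty,
    -- record a bed-exit event entirely inside the DBMS.
    CREATE TRIGGER detect_bed_exit AFTER INSERT ON bed_sensor
    WHEN NEW.occupied = 0 AND (
        SELECT occupied FROM bed_sensor
        WHERE ts < NEW.ts ORDER BY ts DESC LIMIT 1) = 1
    BEGIN
        INSERT INTO events VALUES (NEW.ts, 'bed-exit');
    END;
""")

# Simulated night-time sensor stream; raw rows never leave the database.
for ts, occ in [("02:10", 1), ("02:11", 1), ("02:12", 0)]:
    db.execute("INSERT INTO bed_sensor VALUES (?, ?)", (ts, occ))

print(db.execute("SELECT * FROM events").fetchall())  # [('02:12', 'bed-exit')]
```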

  3. 9th Asian Conference on Intelligent Information and Database Systems

    CERN Document Server

    Nguyen, Ngoc; Shirai, Kiyoaki

    2017-01-01

    This book presents recent research in intelligent information and database systems. The carefully selected contributions were initially accepted for presentation as posters at the 9th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2017), held in April 2017 in Kanazawa, Japan. While the contributions are of an advanced scientific level, several are accessible for non-expert readers. The book brings together 47 chapters divided into six main parts: • Part I. From Machine Learning to Data Mining, • Part II. Big Data and Collaborative Decision Support Systems, • Part III. Computer Vision Analysis, Detection, Tracking and Recognition, • Part IV. Data-Intensive Text Processing, • Part V. Innovations in Web and Internet Technologies, and • Part VI. New Methods and Applications in Information and Software Engineering. The book is an excellent resource for researchers and those working in algorithmics, artificial and computational intelligence, collaborative systems, decisio...

  4. Function and organization of CPC database system

    International Nuclear Information System (INIS)

    Yoshida, Tohru; Tomiyama, Mineyoshi.

    1986-02-01

    It is very time-consuming and expensive work to develop computer programs; therefore, it is desirable to make effective use of existing programs. For this purpose, researchers and technical staff need to obtain the relevant information easily. CPC (Computer Physics Communications) is a journal published to facilitate the exchange of physics programs and of relevant information about the use of computers in the physics community. There are about 1,300 CPC programs in the JAERI computing center, and the number of programs is increasing. A new database system (the CPC database) has been developed to manage the CPC programs and their information. Users obtain information about all the programs stored in the CPC database, and can find and copy a needed program by entering the program name, the catalogue number and the volume number. In this system, each operation is done by menu selection. Every CPC program is compressed before storage in the database; the required storage size is one third of the uncompressed format. Programs unused for a long time are moved to magnetic tape. The present report describes the CPC database system and the procedures for its use. (author)
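
    The compressed-storage scheme is easy to sketch: keep each program as a compressed blob keyed by catalogue number and inflate it on retrieval. zlib and the schema below are stand-ins, since the report does not specify the compressor, and the roughly 3:1 ratio quoted above will vary with the source text.

```python
import sqlite3
import zlib

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE programs (
        catalogue TEXT PRIMARY KEY,
        name TEXT,
        volume INTEGER,
        source BLOB          -- zlib-compressed program text
    )
""")

def store_program(catalogue: str, name: str, volume: int, source: str):
    """Compress the program text and file it under its catalogue number."""
    db.execute("INSERT OR REPLACE INTO programs VALUES (?, ?, ?, ?)",
               (catalogue, name, volume, zlib.compress(source.encode())))

def fetch_program(catalogue: str) -> str:
    """Inflate and return the program text for a catalogue number."""
    row = db.execute("SELECT source FROM programs WHERE catalogue = ?",
                     (catalogue,)).fetchone()
    if row is None:
        raise KeyError(catalogue)
    return zlib.decompress(row[0]).decode()

store_program("ABCD", "demo", 42, "      PROGRAM DEMO\n      END\n")
print(fetch_program("ABCD"))
```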

  5. Failure rates in Barsebaeck-1 reactor coolant pressure boundary piping. An application of a piping failure database

    International Nuclear Information System (INIS)

    Lydell, B.

    1999-05-01

    This report documents an application of a piping failure database to estimate the frequency of leak and rupture in reactor coolant pressure boundary piping. The study used Barsebaeck-1 as the reference plant. The study tried two different approaches to piping failure rate estimation: 1) PSA-style, simple estimation using Bayesian statistics, and 2) fitting of statistical distributions to failure data. A large, validated database on piping failures (like the SKI-PIPE database) supports both approaches. In addition to documenting leak and rupture frequencies, the SKI report describes the use of piping failure data to estimate the frequency of medium and large loss of coolant accidents (LOCAs). This application study was co-sponsored by Barsebaeck Kraft AB and SKI Research.
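
    As a worked illustration of the first, PSA-style approach: with a conjugate gamma prior on the failure rate and k failures observed over an exposure T, the posterior is again a gamma distribution. A sketch with invented counts, not Barsebaeck-1 data:

        from scipy.stats import gamma

        a0, b0 = 0.5, 1.0e2    # prior Gamma(shape a0, rate b0), illustrative
        k, T = 2, 3.0e4        # k failures observed over T pipe-section-years
        a, b = a0 + k, b0 + T  # conjugate update for Poisson-distributed counts
        post = gamma(a, scale=1.0 / b)
        print(f"posterior mean {post.mean():.2e}/yr, "
              f"95% interval ({post.ppf(0.025):.2e}, {post.ppf(0.975):.2e})")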

  6. Database security in the cloud

    OpenAIRE

    Sakhi, Imal

    2012-01-01

    The aim of the thesis is to get an overview of the database services available in the cloud computing environment, investigate the security risks associated with them, and propose possible countermeasures to minimize the risks. The thesis also analyzes two cloud database service providers, namely Amazon RDS and Xeround. The reason for choosing these two providers is that they are currently amongst the leading cloud database providers and both provide relational cloud databases, which makes ...

  7. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    Science.gov (United States)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review of electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of the pulse current occurring in electrochemical machining of aviation materials have been studied. Based on integrating the experimental results with comprehensive modeling of electrochemical machining process data, a subsystem for computer-aided design of electrochemical machining of gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  8. The EREC-STRESA database. Internet application

    International Nuclear Information System (INIS)

    Davydov, M.V.; Annunziato, A.

    2004-01-01

    A considerable amount of experimental data in the field of NPP safety and reliability has been produced and gathered at the Electrogorsk Research and Engineering Centre on NPPs Safety. In order to ensure proper preservation of and easy access to the data, the EREC Database was created. This paper gives a description of the EREC Database and the supporting web-based informatics platform STRESA. (author)

  9. A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.

    Science.gov (United States)

    Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo

    2015-01-01

    The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical space. Alternatively, cloud computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from PACS, converting the information contained in each image file to a NoSQL database, and using cloud computing to store the digital images.
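
    A rough sketch of the pipeline under stated assumptions — pydicom to read the DICOM file and MongoDB (via pymongo) as the NoSQL store; the poster does not name its toolchain, and real anonymization must strip far more tags than shown:

        import pydicom
        from pymongo import MongoClient

        def backup(path, collection):
            ds = pydicom.dcmread(path)
            doc = {
                # Keep non-identifying study metadata only; patient name,
                # ID, birth date, etc. are deliberately never copied.
                "Modality": str(ds.get("Modality", "")),
                "StudyDate": str(ds.get("StudyDate", "")),
                "PixelDataSize": len(ds.PixelData),
            }
            collection.insert_one(doc)

        backup("image.dcm", MongoClient()["backup"]["images"])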

  10. Profiling an application for power consumption during execution on a compute node

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2013-09-17

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.
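
    A toy numeric illustration of the profiling step the disclosure describes — combining the node's hardware power consumption profile with the application's mix of processing operations (all names and numbers invented):

        # Watts drawn per operation class, from the hardware profile.
        hardware_profile = {"flop": 0.8, "mem": 1.5, "net": 2.1}
        # Fraction of runtime the application spends in each class.
        operation_mix = {"flop": 0.6, "mem": 0.3, "net": 0.1}

        power = sum(hardware_profile[op] * share
                    for op, share in operation_mix.items())
        print(f"estimated draw: {power:.2f} W per node")  # 1.14 W here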

  11. Analysis of technologies databases use in physical education and sport

    Directory of Open Access Journals (Sweden)

    Usychenko V.V.

    2010-03-01

    Full Text Available The scientific, methodological and specialized literature is analyzed and systematized. Questions concerning the use of database technology in the system of athlete preparation are raised, and the need for technologies for rapid processing of large arrays of sports information is shown. Experience with computer-aided technologies for recording and analyzing the results of testing training-process parameters is collected, and the influence of these technologies on training and competition activity is considered. A database, «Athlete», is presented; it contains anthropometric and myometric indices of highly qualified bodybuilding athletes.

  12. CT applications of medical computer graphics

    International Nuclear Information System (INIS)

    Rhodes, M.L.

    1985-01-01

    Few applications of computer graphics show as much promise and early success as that for CT. Unlike electron microscopy, ultrasound, business, military, and animation applications, CT image data are inherently digital. CT pictures can be processed directly by programs well established in the fields of computer graphics and digital image processing. Methods for reformatting digital pictures, enhancing structure shape, reducing image noise, and rendering three-dimensional (3D) scenes of anatomic structures have all become routine at many CT centers. In this chapter, the authors provide a brief introduction to computer graphics terms and techniques commonly applied to CT pictures and, when appropriate, to those showing promise for magnetic resonance images. Topics discussed here are image-processing options that are applied to digital images already constructed. In the final portion of this chapter, techniques for "slicing" CT image data are presented, and geometric principles that describe the specification of oblique and curved images are outlined. Clinical examples are included.
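
    Oblique "slicing" reduces to resampling the voxel grid along a specified plane. A minimal sketch with NumPy/SciPy, assuming simple trilinear interpolation and illustrative geometry parameters:

        import numpy as np
        from scipy.ndimage import map_coordinates

        def oblique_slice(volume, origin, u, v, shape=(128, 128)):
            """Sample the volume on the plane origin + i*u + j*v."""
            i, j = np.mgrid[0:shape[0], 0:shape[1]]
            coords = (origin[:, None, None]
                      + i[None] * u[:, None, None]
                      + j[None] * v[:, None, None])  # 3 x H x W voxel coords
            return map_coordinates(volume, coords, order=1)

        vol = np.random.rand(64, 64, 64)          # stand-in CT volume
        sl = oblique_slice(vol, np.array([32.0, 0.0, 0.0]),
                           np.array([0.0, 0.5, 0.0]),   # in-plane axis 1
                           np.array([0.5, 0.0, 0.5]))   # in-plane axis 2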

  13. Security in Computer Applications

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development. The last part of the lecture covers some miscellaneous issues like the use of cryptography, rules for networking applications, and social engineering threats. This lecture was first given on Thursd...

  14. Cloud computing with e-science applications

    CERN Document Server

    Terzo, Olivier

    2015-01-01

    The amount of data in everyday life has been exploding. This data increase has been especially significant in scientific fields, where substantial amounts of data must be captured, communicated, aggregated, stored, and analyzed. Cloud Computing with e-Science Applications explains how cloud computing can improve data management in data-heavy fields such as bioinformatics, earth science, and computer science. The book begins with an overview of cloud models supplied by the National Institute of Standards and Technology (NIST), and then:Discusses the challenges imposed by big data on scientific

  15. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2014-01-01

    This volume contains the papers presented at the Second International Conference on Frontiers in Intelligent Computing: Theory and Applications (FICTA-2013), held during 14-16 November 2013 and organized by Bhubaneswar Engineering College (BEC), Bhubaneswar, Odisha, India. It contains 63 papers focusing on applications of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization and teaching-learning based optimization, to various engineering applications such as data mining, fuzzy systems, machine intelligence and ANN, web technologies, multimedia applications, and intelligent computing and networking.

  16. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    Science.gov (United States)

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
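
    As a flavor of queries forming an integral part of the analysis (a minimal sketch with an invented schema; the system described is far richer and runs on cluster and Grid resources):

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE timeseries(subject, region, condition, t, bold)")
        con.executemany("INSERT INTO timeseries VALUES (?, ?, ?, ?, ?)",
                        [("s01", "STG", "speech", t, 0.1 * t) for t in range(4)])

        # One region's time series for one condition, across subjects.
        rows = con.execute(
            "SELECT subject, t, bold FROM timeseries "
            "WHERE region = ? AND condition = ? ORDER BY subject, t",
            ("STG", "speech")).fetchall()
        print(rows)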

  17. SuperNatural: a searchable database of available natural compounds.

    Science.gov (United States)

    Dunkel, Mathias; Fullbeck, Melanie; Neumann, Stefanie; Preissner, Robert

    2006-01-01

    Although tremendous effort has been put into synthetic libraries, most drugs on the market are still natural compounds or derivatives thereof. There are encyclopaedias of natural compounds, but the availability of these compounds is often unclear and catalogues from numerous suppliers have to be checked. To overcome these problems we have compiled a database of approximately 50,000 natural compounds from different suppliers. To enable efficient identification of the desired compounds, we have implemented substructure searches with typical templates. Starting points for in silico screenings are about 2500 well-known and classified natural compounds from a compendium that we have added. Possible medical applications can be ascertained via automatic searches for similar drugs in a free conformational drug database containing WHO indications. Furthermore, we have computed about three million conformers, which are deployed to account for the flexibility of the compounds when the 3D superposition algorithm that we have developed is used. The SuperNatural Database is publicly available at http://bioinformatics.charite.de/supernatural. Viewing requires the free Chime plugin from MDL (Chime) or the Java 2 Runtime Environment (MView), which is also necessary for using the Marvin application for chemical drawing.
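
    The SuperNatural engine itself is not reproduced here; as an illustration of template-based substructure search, a minimal RDKit sketch over an invented two-compound list:

        from rdkit import Chem

        compounds = {"phenolic": "c1cc(O)ccc1", "octane": "CCCCCCCC"}
        template = Chem.MolFromSmarts("c1ccccc1O")  # phenol substructure

        hits = [name for name, smi in compounds.items()
                if Chem.MolFromSmiles(smi).HasSubstructMatch(template)]
        print(hits)  # ['phenolic']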

  18. The Development of a Web Based Database Applications of Procurement, Inventory, and Sales at PT. Interjaya Surya Megah

    OpenAIRE

    Huda, Choirul; Hudyanto, Chendra; Sisillia, Sisillia; Persada, Revin Kencana

    2011-01-01

    The objective of this research is to develop a web based database application for the procurement, inventory and sales at PT. Interjaya Surya Megah. The current system at PT. Interjaya Surya Megah is running manually, so the company has difficulty in carrying out its activities. The methodology, that is used in this research, includes interviews, observation, literature review, conceptual database design, logical database design and physical database design. The results are the establishment ...

  19. Secure Distributed Databases Using Cryptography

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2006-01-01

    Full Text Available Computational encryption is used intensively by database management systems to ensure the privacy and integrity of information that is physically stored in files. The information is also sent over networks and replicated on different distributed systems. It is proved that a satisfactory level of security is achieved if the rows and columns of tables are encrypted independently of the table or computer that holds the data. It is also very important that SQL (Structured Query Language) query requests and responses be encrypted over the network connection between the client and the database server. All these techniques and methods must be implemented by database administrators, designers and developers within a consistent security policy.
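
    A minimal sketch of field-level encryption before storage, so that a row is protected independently of the table or machine holding it; the library and schema are illustrative, and key management is left to the security policy the authors mention:

        import sqlite3
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()   # in practice: managed by the DBA
        f = Fernet(key)

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE accounts(id INTEGER PRIMARY KEY, holder BLOB)")
        con.execute("INSERT INTO accounts(holder) VALUES (?)",
                    (f.encrypt(b"sensitive value"),))

        row = con.execute("SELECT holder FROM accounts").fetchone()
        print(f.decrypt(row[0]))      # plaintext only after authorized decryption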

  20. Development of a Database for Study Data in Registration Applications for Veterinary Medicinal Products

    Directory of Open Access Journals (Sweden)

    Anke Finnah

    2017-02-01

    Full Text Available Objective: In the present study, the feasibility of a systematic record of clinical study data from marketing authorisation applications for veterinary medicinal products (VMPs) and the benefits of the selected approach were investigated. Background: Drug registration dossiers for veterinary medicinal products contain extensive data from drug studies, which are not easily accessible to assessors. Evidentiary value: Fast access to these data, including specific search tools, could facilitate meaningful use of the data and allow assessors to compare tests and studies from different dossiers. Methods: First, pivotal test parameters and their mutual relationships were identified. Second, a data model was developed and implemented in a relational database management system, including a data entry form and various reports for database searches. Compilation of study data in the database was demonstrated using all available clinical studies involving VMPs containing the anthelmintic drug Praziquantel. By means of descriptive data analysis, possibilities of data evaluation including graphical presentation were shown. The suitability of the database to support the performance of meta-analyses was tentatively validated. Results: The data model was designed to cover the specific requirements arising from study data. A total of 308 clinical studies related to 95 VMPs containing Praziquantel (single agent and combination drugs) was selected for prototype testing. The relevant data extracted from these studies were appropriately structured and shown to be basically suitable for descriptive data analyses as well as for meta-analyses. Conclusion: The database-supported collection of study data would provide users with easy access to the continuously increasing pool of scientific information held by competent authorities. It enables specific data analyses. The database design allows expanding the data model to all types of studies and classes of drugs registered in veterinary

  1. Implementation of DFT application on ternary optical computer

    Science.gov (United States)

    Junjie, Peng; Youyi, Fu; Xiaofeng, Zhang; Shuai, Kong; Xinyu, Wei

    2018-03-01

    Owing to its characteristics of a huge number of data bits and low energy consumption, optical computing may be used in applications such as the DFT, which require a large amount of computation and can be implemented in parallel. Accordingly, DFT implementation methods in full parallel as well as in partial parallel are presented. Based on the resources of a ternary optical computer (TOC), extensive experiments were carried out. Experimental results show that the proposed schemes are correct and feasible. They provide a foundation for further exploration of applications on the TOC that need a large amount of calculation and can be processed in parallel.
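
    For reference, the transform being implemented is X[k] = Σ_n x[n]·exp(−2πikn/N); each output bin is an independent inner product, which is the parallelism such schemes exploit. A conventional NumPy check of that definition (not the optical implementation):

        import numpy as np

        def dft(x):
            n = np.arange(len(x))
            # N x N DFT matrix; every row (output bin) is independent.
            W = np.exp(-2j * np.pi * np.outer(n, n) / len(x))
            return W @ x

        x = np.random.rand(8)
        assert np.allclose(dft(x), np.fft.fft(x))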

  2. Database for propagation models

    Science.gov (United States)

    Kantak, Anil V.

    1991-07-01

    A propagation researcher or a systems engineer who intends to use the results of a propagation experiment is generally faced with various database tasks such as the selection of the computer software, the hardware, and the writing of the programs to pass the data through the models of interest. This task is repeated every time a new experiment is conducted or the same experiment is carried out at a different location, generating different data. Thus the users of this data have to spend a considerable portion of their time learning how to implement the computer hardware and the software towards the desired end. This situation could be facilitated considerably if an easily accessible propagation database were created that had all the accepted (standardized) propagation phenomena models approved by the propagation research community. The handling of data would also become easier for the user. Such a database can only stimulate the growth of propagation research if it is available to all researchers, so that the results of an experiment conducted by one researcher can be examined independently by another, without different hardware and software being used. The database may be made flexible so that researchers need not be confined only to its contents. Another way in which the database may help researchers is that they will not have to document the software and hardware tools used in their research, since the propagation research community will already know the database. The following sections show a possible database construction, as well as properties of the database for propagation research.

  3. Color in Computer Vision Fundamentals and Applications

    CERN Document Server

    Gevers, Theo; van de Weijer, Joost; Geusebroek, Jan-Mark

    2012-01-01

    While the field of computer vision drives many of today’s digital technologies and communication networks, the topic of color has emerged only recently in most computer vision applications. One of the most extensive works to date on color in computer vision, this book provides a complete set of tools for working with color in the field of image understanding. Based on the authors’ intense collaboration for more than a decade and drawing on the latest thinking in the field of computer science, the book integrates topics from color science and computer vision, clearly linking theor

  4. LHCb Conditions database operation assistance systems

    International Nuclear Information System (INIS)

    Clemencic, M; Shapoval, I; Cattaneo, M; Degaudenzi, H; Santinelli, R

    2012-01-01

    The Conditions Database (CondDB) of the LHCb experiment provides versioned, time-dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger (HLT), reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues. The first system is a CondDB state tracking extension to the Oracle 3D Streams replication technology, to trap cases when the CondDB replication was corrupted. The second is an automated distribution system for the SQLite-based CondDB, also providing smart backup and checkout mechanisms for the CondDB managers and LHCb users respectively. And, finally, a system to verify and monitor the internal (CondDB self-consistency) and external (LHCb physics software vs. CondDB) compatibility. The former two systems are used in production in the LHCb experiment and have achieved the desired goal of higher flexibility and robustness for the management and operation of the CondDB. The latter one has been fully designed and is currently passing to the implementation stage.
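
    The actual CondDB interfaces (Oracle/SQLite front-ends) are not reproduced here; the sketch below only illustrates, with an invented schema, how versioned, time-dependent conditions can be served by an interval-of-validity lookup:

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE conditions"
                    "(tag TEXT, since INTEGER, until INTEGER, payload TEXT)")
        con.executemany("INSERT INTO conditions VALUES (?, ?, ?, ?)", [
            ("align-v1", 0, 100, "geometry A"),
            ("align-v1", 100, 999, "geometry B"),
        ])

        def lookup(tag, event_time):
            # Return the payload whose validity interval covers event_time.
            return con.execute(
                "SELECT payload FROM conditions "
                "WHERE tag = ? AND since <= ? AND ? < until",
                (tag, event_time, event_time)).fetchone()[0]

        print(lookup("align-v1", 150))  # geometry B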

  5. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters, devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis approaches, as well as different techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research in medical imaging applications based on algorithmic and computer-based approaches and to utilize them in real-world clinical applications. The book is divided into four parts. Part I: Clinical Applications of Medical Imaging; Part II: Classification and Clustering; Part III: Computer Aided Diagnosis (CAD) Tools and Case Studies; and Part IV: Bio-inspired Computer Aided Diagnosis Techniques.

  6. Industrial applications of computed tomography

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Carmignato, S.; Kruth, J. -P.

    2014-01-01

    The number of industrial applications of Computed Tomography(CT) is large and rapidly increasing. After a brief market overview, the paper gives a survey of state of the art and upcoming CT technologies, covering types of CT systems, scanning capabilities, and technological advances. The paper...

  7. Wearable computer technology for dismounted applications

    Science.gov (United States)

    Daniels, Reginald

    2010-04-01

    Small computing devices which rival the compact size of traditional personal digital assistants (PDA) have recently established a market niche. These computing devices are small enough to be considered unobtrusive for humans to wear. The computing devices are also powerful enough to run full multi-tasking general purpose operating systems. This paper will explore the wearable computer information system for dismounted applications recently fielded for ground-based US Air Force use. The environments that the information systems are used in will be reviewed, as well as a description of the net-centric, ground-based warrior. The paper will conclude with a discussion regarding the importance of intuitive, usable, and unobtrusive operator interfaces for dismounted operators.

  8. Failure rates in Barsebaeck-1 reactor coolant pressure boundary piping. An application of a piping failure database

    Energy Technology Data Exchange (ETDEWEB)

    Lydell, B. [RSA Technologies, Vista, CA (United States)

    1999-05-01

    This report documents an application of a piping failure database to estimate the frequency of leak and rupture in reactor coolant pressure boundary piping. The study used Barsebaeck-1 as the reference plant. The study tried two different approaches to piping failure rate estimation: 1) PSA-style, simple estimation using Bayesian statistics, and 2) fitting of statistical distributions to failure data. A large, validated database on piping failures (like the SKI-PIPE database) supports both approaches. In addition to documenting leak and rupture frequencies, the SKI report describes the use of piping failure data to estimate the frequency of medium and large loss of coolant accidents (LOCAs). This application study was co-sponsored by Barsebaeck Kraft AB and SKI Research. 41 refs, figs, tabs

  9. Industrial applications of computer tomography

    International Nuclear Information System (INIS)

    Sheng Kanglong; Qiang Yujun; Yang Fujia

    1992-01-01

    Industrial computed tomography (CT) and its application is a rapidly developing field of high technology. CT systems have been playing important roles in nondestructive testing (NDT) of products and equipment for a number of industries. Recently, the technique has advanced into the area of industrial process control, bringing even greater benefit to mankind. The basic principles and typical structure of an industrial CT system are described, and descriptions are given of some successful CT systems for either NDT application or process control purposes.

  10. Computer Science Research at Langley

    Science.gov (United States)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  11. From handwriting analysis to pen-computer applications

    NARCIS (Netherlands)

    Schomaker, L

    1998-01-01

    In this paper, pen computing, i.e. the use of computers and applications in which the pen is the main input device, will be described from four different viewpoints. Firstly a brief overview of the hardware developments in pen systems is given, leading to the conclusion that the technological

  12. Role of Database Management Systems in Selected Engineering Institutions of Andhra Pradesh: An Analytical Survey

    Directory of Open Access Journals (Sweden)

    Kutty Kumar

    2016-06-01

    Full Text Available This paper aims to analyze the function of database management systems from the perspective of librarians working in engineering institutions in Andhra Pradesh. Ninety-eight librarians from one hundred thirty engineering institutions participated in the study. The paper reveals that training by computer suppliers and software packages is the significant mode by which librarians acquire DBMS skills, and that three-fourths of the librarians are postgraduate degree holders. Most colleges use database applications for automation purposes and content value. Electrical problems and untrained staff seem to be the major constraints faced by respondents in managing library databases.

  13. WMC Database Evaluation. Case Study Report

    Energy Technology Data Exchange (ETDEWEB)

    Palounek, Andrea P. T [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-29

    The WMC Database is ultimately envisioned to hold a collection of experimental data, design information, and information from computational models. This project was a first attempt at using the Database to access experimental data and extract information from it. This evaluation shows that the Database concept is sound and robust, and that the Database, once fully populated, should remain eminently usable for future researchers.

  14. Journal of Computer Science and Its Application: About this journal

    African Journals Online (AJOL)


  15. RAPPORT: running scientific high-performance computing applications on the cloud.

    Science.gov (United States)

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  16. Evaluation report on research and development of a database system for mutual computer operation; Denshi keisanki sogo un'yo database system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    This paper describes an evaluation of the research and development of a database system for mutual computer operation, with respect to distributed database technology, multimedia technology, high-reliability technology, and mutually operable network system technology. A large number of forward-looking research results were derived, concerning such issues as the distribution and utilization patterns of distributed databases, the structuring of data for multimedia information, retrieval systems, flexible and high-level utilization of the network, and database protection. These achievements are widely disclosed to the public. The largest feature of this project is its aim of forming a network system that can be operated mutually in a multi-vendor environment. Therefore, the research and development were executed in the spirit of openness to the public and international cooperation. These efforts are represented by the organization of the rule establishment committee, the execution of mutual interconnection experiments (including demonstration evaluation), and the development of implementation rules based on the ISO's 'open systems interconnection (OSI)'. These results are compiled in the JIS as the basic reference model for open systems interconnection, and the targets shown in the basic plan have been achieved sufficiently. (NEDO)

  17. JICST Factual Database(2)

    Science.gov (United States)

    Araki, Keisuke

    A computer program that builds atom-bond connection tables from chemical nomenclature has been developed. Chemical substances are input with their nomenclature and a variety of trivial names or experimental code numbers. The chemical structures in the database are stored stereospecifically and can be searched and displayed according to stereochemistry. Source data come from the laws and regulations of Japan, the RTECS of the US, and so on. The database plays a central role within the integrated fact database service of JICST and makes interrelational retrieval possible.

  18. Manufacturing and application of micro computer for control

    International Nuclear Information System (INIS)

    Park, Seung Man; Heo, Gyeong; Yun, Jun Young

    1990-05-01

    This book deals with machine code and assembly programming for microcomputers. It is composed of 20 chapters: the microcomputer system, practice with a storage cell, manufacturing of a microcomputer (parts 1 and 2), manufacturing of the microcomputer AID-80A, writing machine language, interfaces such as the Z80-PIO and 8255A (PPI), counter and timer interfaces, exercises with basic commands, arithmetic operations, array operations, indicator control, music playing, detection of PIO input, control of PIO LEDs, PIO modes, CTC control by microcomputer, SIO control by microcomputer, and applications of the microcomputer.

  19. Embracing the quantum limit in silicon computing.

    Science.gov (United States)

    Morton, John J L; McCamey, Dane R; Eriksson, Mark A; Lyon, Stephen A

    2011-11-16

    Quantum computers hold the promise of massive performance enhancements across a range of applications, from cryptography and databases to revolutionary scientific simulation tools. Such computers would make use of the same quantum mechanical phenomena that pose limitations on the continued shrinking of conventional information processing devices. Many of the key requirements for quantum computing differ markedly from those of conventional computers. However, silicon, which plays a central part in conventional information processing, has many properties that make it a superb platform around which to build a quantum computer. © 2011 Macmillan Publishers Limited. All rights reserved

  20. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  1. Coupling computer-interpretable guidelines with a drug-database through a web-based system – The PRESGUID project

    Directory of Open Access Journals (Sweden)

    Fieschi Marius

    2004-03-01

    Full Text Available Abstract Background Clinical Practice Guidelines (CPGs) available today are not extensively used due to lack of proper integration into clinical settings, knowledge-related information resources, and lack of decision support at the point of care in a particular clinical context. Objective The PRESGUID project (PREScription and GUIDelines) aims to improve the assistance provided by guidelines. The project proposes an online service enabling physicians to consult computerized CPGs linked to drug databases for easier integration into the healthcare process. Methods Computable CPGs are structured as decision trees and coded in XML format. Recommendations related to drug classes are tagged with ATC codes. We use a mapping module to enhance computerized guidelines coupling with a drug database, which contains detailed information about each usable specific medication. In this way, therapeutic recommendations are backed up with current and up-to-date information from the database. Results Two authoritative CPGs, originally diffused as static textual documents, have been implemented to validate the computerization process and to illustrate the usefulness of the resulting automated CPGs and their coupling with a drug database. We discuss the advantages of this approach for practitioners and the implications for both guideline developers and drug database providers. Other CPGs will be implemented and evaluated in real conditions by clinicians working in different health institutions.
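
    A minimal sketch of the coupling step, with a hypothetical XML fragment and drug table rather than the actual PRESGUID schema: a recommendation tagged with an ATC code is resolved against the drug database when the guideline is displayed.

        import xml.etree.ElementTree as ET

        guideline = ET.fromstring(
            "<node><recommendation atc='C09AA'>Start an ACE inhibitor"
            "</recommendation></node>")

        # Stand-in for the external drug database keyed by ATC class.
        drug_db = {"C09AA": ["ramipril 2.5 mg", "enalapril 5 mg"]}

        for rec in guideline.iter("recommendation"):
            print(rec.text, "->", drug_db.get(rec.get("atc"), []))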

  2. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    Science.gov (United States)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes and the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pair-wise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute O(logn) objects in order to find the closest object, where n is the total number of objects in the database.
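
    The paper's randomized indexing schemes are not reproduced here; the sketch below only illustrates the cost model — expensive on-line query distances, free in-database pairwise distances — with triangle-inequality pruning around randomly chosen pivots:

        import random

        def nearest(query_dist, objects, pairwise, seed=0):
            # query_dist(x): expensive on-line distance d(query, x).
            # pairwise[p][x]: known in-database distance d(p, x).
            rng = random.Random(seed)
            candidates = set(objects)
            best, best_d = None, float("inf")
            while candidates:
                p = rng.choice(sorted(candidates))  # random pivot
                d_qp = query_dist(p)                # one expensive evaluation
                if d_qp < best_d:
                    best, best_d = p, d_qp
                # d(q, x) >= |d(q, p) - d(p, x)|, so prune hopeless objects.
                candidates = {x for x in candidates
                              if x != p and abs(d_qp - pairwise[p][x]) < best_d}
            return best, best_d

        pts = {"a": 0.0, "b": 5.0, "c": 9.0}
        pairwise = {x: {y: abs(pts[x] - pts[y]) for y in pts} for x in pts}
        print(nearest(lambda x: abs(pts[x] - 6.2), pts, pairwise))  # ('b', 1.2)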

  3. The design of distributed database system for HIRFL

    International Nuclear Information System (INIS)

    Wang Hong; Huang Xinmin

    2004-01-01

    This paper focuses on the distributed database system used in the HIRFL distributed control system. The database is built with SQL Server 2000, and its application system adopts the Client/Server model. Visual C++ is used to develop the applications, which access the database through ODBC. (authors)
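
    The paper's client code is Visual C++; purely as an illustration of the same ODBC access pattern, a short Python/pyodbc equivalent (the DSN, credentials, and table are hypothetical):

        import pyodbc

        con = pyodbc.connect("DSN=hirfl;UID=reader;PWD=secret")
        cur = con.cursor()
        cur.execute("SELECT name, value FROM magnet_settings WHERE ring = ?",
                    "CSRm")
        for name, value in cur.fetchall():
            print(name, value)
        con.close()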

  4. RODOS database adapter

    International Nuclear Information System (INIS)

    Xie Gang

    1995-11-01

    Integrated data management is an essential aspect of many automatic information systems such as RODOS, a real-time on-line decision support system for nuclear emergency management. In particular, the application software must provide access management to different commercial database systems. This report presents the tools necessary for adapting embedded SQL applications to both HP-ALLBASE/SQL and CA-Ingres/SQL databases. The design of the database adapter and the concept of the RODOS embedded SQL syntax are discussed by considering some of the most important features of SQL functions and the identification of significant differences between SQL implementations. Finally, the software developed and the administrator's and installation guides are described. (orig.) [de

  5. Application of ABWR construction database to nuclear power plant project

    International Nuclear Information System (INIS)

    Takashima, Atsushi; Katsube, Yasuhiko

    1999-01-01

    Tokyo Electric Power Company (TEPCO) successfully completed the construction of Kashiwazaki-Kariwa Nuclear Power Station Units No. 6 and No. 7 (K-6/7), the first advanced boiling water reactors (ABWRs) in the world. K-6 and K-7 started their commercial operations in November 1996 and July 1997, respectively. We consider the ABWR a standard BWR in the world as well as in Japan, because the ABWR is highly reputed. However, because the intervals between our nuclear power plant construction projects are becoming longer, our engineering capability for plant construction risks declining, and it is necessary for us to maintain it. In addition to this circumstance, we are planning wider application of separate purchase orders for further cost reduction, and there is an expectation of our contribution to ABWR plant constructions overseas. Facing these circumstances, we have developed a construction database based on our experience of ABWR construction. As the first step in developing the database, we analyzed our own activities in the previous ABWR construction. Through this analysis, we could define the activity units of which the project consists. As the second step, we clarified the data treated in each activity unit and the interfaces among them. By taking these steps, we could develop our database efficiently. (author)

  6. Active In-Database Processing to Support Ambient Assisted Living Systems

    Directory of Open Access Journals (Sweden)

    Wagner O. de Morais

    2014-08-01

    Full Text Available As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.

  7. Review of the 3rd biennial conference on real-time computer applications in nuclear and particle physics

    International Nuclear Information System (INIS)

    O'Brien, D.W.

    1984-01-01

    The topics presented at this conference, held in May 1983 at Berkeley, centered on computer technologies and architecture as they affect the data-acquisition and data-reduction process. Data-acquisition hardware activities were represented with discussions of CAMAC, FASTBUS, and specialized hardware for data sorting, pipelined operations, and parallel processing. Data-acquisition software and software systems and data-reduction and analysis software systems were presented and discussed, as were graphics, data-base management, networking, interactive techniques, and IC technologies, and their impact on scientific applications. This review summarizes the more important papers presented at the conference

  8. Guide documentaire pour en savoir plus sur l'informatique (A Documentary Guide to Further Knowledge of Computer Science).

    Science.gov (United States)

    Achard-Bayle, Guy

    1985-01-01

    Presents a listing of textbooks, computer program guides, articles, opinion papers, histories, reference works, databases, reviews, bibliographies, organizations, and other information sources concerning computer applications. (MSE)

  9. Providing Availability, Performance, and Scalability By Using Cloud Database

    OpenAIRE

    Prof. Dr. Alaa Hussein Al-Hamami; Rafal Adeeb Al-Khashab

    2014-01-01

    With the development of the internet, new technologies and concepts have attracted the attention of internet users, especially in the development of information technology; one such concept is the cloud. Cloud computing includes different components, of which the cloud database has become an important one. A cloud database is a distributed database that delivers computing as a service, or in the form of a virtual machine image, via the internet instead of as a product; its advantage is that the database can...

  10. Scientific applications of symbolic computation

    International Nuclear Information System (INIS)

    Hearn, A.C.

    1976-02-01

    The use of symbolic computation systems for problem solving in scientific research is reviewed. The nature of the field is described, and particular examples are considered from celestial mechanics, quantum electrodynamics and general relativity. Symbolic integration and some more recent applications of algebra systems are also discussed [fr

  11. Cloud computing and digital media fundamentals, techniques, and applications

    CERN Document Server

    Li, Kuan-Ching; Shih, Timothy K

    2014-01-01

    Cloud Computing and Digital Media: Fundamentals, Techniques, and Applications presents the fundamentals of cloud and media infrastructure, novel technologies that integrate digital media with cloud computing, and real-world applications that exemplify the potential of cloud computing for next-generation digital media. It brings together technologies for media/data communication, elastic media/data storage, security, authentication, cross-network media/data fusion, interdevice media interaction/reaction, data centers, PaaS, SaaS, and more.The book covers resource optimization for multimedia clo

  12. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2013-01-01

    The volume contains the papers presented at FICTA 2012: International Conference on Frontiers in Intelligent Computing: Theory and Applications, held on December 22-23, 2012 at Bhubaneswar Engineering College, Bhubaneswar, Odisha, India. It contains 86 papers contributed by authors from around the globe. These research papers mainly focus on applications of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization and teaching-learning based optimization, to various engineering applications such as data mining, image processing, cloud computing and networking.

  13. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  14. Design of the system of maintenance operations occupational safety and health database application of nuclear power station

    International Nuclear Information System (INIS)

    Wang Xuehong; Li Xiangyang; Ye Yongjun

    2011-01-01

    Based on the KKS coding of building equipment in nuclear power stations, this paper introduces a method of establishing an occupational safety and health database application system for maintenance operations. Through this application system, all kinds of dangerous factors in the maintenance operations of a nuclear power station can be systematically summarized, making it convenient for staff to learn the dangerous factors of maintenance operations and the corresponding prevention measures, and thereby realizing the management concept of 'precaution crucial, continuous improvement' advocated by OSHMS. (authors)

  15. Mobile Computing: The Emerging Technology, Sensing, Challenges and Applications

    International Nuclear Information System (INIS)

    Bezboruah, T.

    2010-12-01

    Mobile computing is a computing system in which a computer and all necessary accessories, such as files and software, are taken out into the field. It allows a computing device to be used even while the user is mobile and changing location; portability is thus one of its important aspects. Mobile phones are being used to gather scientific data from remote and isolated places that could not be retrieved by other means. Scientists are beginning to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we briefly discuss mobile computing technology, its sensing, challenges and applications. (author)

  16. Ontology and Cloud Computing in Various Applications: The ...

    African Journals Online (AJOL)


    2018-03-05

    ... to emphasize the importance of both ontology and cloud computing in various ... of knowledge management applications and retrieve information using ... above in terms of hard drive space, but any device ordinary computer ...

  17. Computer technology: its potential for industrial energy conservation. A technology applications manual

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-01-01

    Today, computer technology is within the reach of practically any industrial corporation regardless of product size. This manual highlights a few of the many applications of computers in the process industry and provides the technical reader with a basic understanding of computer technology, terminology, and the interactions among the various elements of a process computer system. The manual has been organized to separate process applications and economics from computer technology. Chapter 1 introduces the present status of process computer technology and describes the four major applications - monitoring, analysis, control, and optimization. The basic components of a process computer system also are defined. Energy-saving applications in the four major categories defined in Chapter 1 are discussed in Chapter 2. The economics of process computer systems is the topic of Chapter 3, where the historical trend of process computer system costs is presented. Evaluating a process for the possible implementation of a computer system requires a basic understanding of computer technology as well as familiarity with the potential applications; Chapter 4 provides enough technical information for an evaluation. Computer and associated peripheral costs and the logical sequence of steps in the development of a microprocessor-based process control system are covered in Chapter 5.

  18. Towards Process Support for Migrating Applications to Cloud Computing

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2012-01-01

    Cloud computing is an active area of research for industry and academia. There are a large number of organizations providing cloud computing infrastructure and services. In order to utilize these infrastructure resources and services, existing applications need to be migrated to clouds. However...... for supporting migration to cloud computing based on our experiences from migrating an Open Source System (OSS), Hackystat, to two different cloud computing platforms. We explained the process by performing a comparative analysis of our efforts to migrate Hackystat to Amazon Web Services and Google App Engine.... We also report the potential challenges, suitable solutions, and lessons learned to support the presented process framework. We expect that the reported experiences can serve as guidelines for those who intend to migrate software applications to cloud computing....

  19. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    International Nuclear Information System (INIS)

    Jacobs, Colin; Prokop, Mathias; Rikxoort, Eva M. van; Ginneken, Bram van; Murphy, Keelin; Schaefer-Prokop, Cornelia M.

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database contains 888 thoracic CT scans with a section thickness of 2.5 mm or lower. We report performance of two commercial and one academic CAD system. The influence of presence of contrast, section thickness, and reconstruction kernel on CAD performance was assessed. Four radiologists independently analyzed the false positive CAD marks of the best CAD system. The updated commercial CAD system showed the best performance with a sensitivity of 82 % at an average of 3.1 false positive detections per scan. Forty-five false positive CAD marks were scored as nodules by all four radiologists in our study. On the largest publicly available reference database for lung nodule detection in chest CT, the updated commercial CAD system locates the vast majority of pulmonary nodules at a low false positive rate. Potential for CAD is substantiated by the fact that it identifies pulmonary nodules that were not marked during the extensive four-fold LIDC annotation process. (orig.)

  20. Design of remote weather monitor system based on embedded web database

    International Nuclear Information System (INIS)

    Gao Jiugang; Zhuang Along

    2010-01-01

    The remote weather monitoring system is designed by employing embedded Web database technology with the S3C2410 microprocessor as its core. The monitoring system can simultaneously monitor multi-channel sensor signals and give a dynamic Web-page display of various types of meteorological information on a remote computer. An elaborated introduction is given to the construction and application of the Web database under embedded Linux. Test results show that the client can access the Web page via GPRS or the Internet, acquire data, and display the values of various types of meteorological information in an intuitive graphical way. (authors)

  1. Two-phase computer codes for zero-gravity applications

    International Nuclear Information System (INIS)

    Krotiuk, W.J.

    1986-10-01

    This paper discusses the problems existing in the development of computer codes which can analyze the thermal-hydraulic behavior of two-phase fluids, especially in low-gravity nuclear reactors. The important phenomena affecting fluid flow and heat transfer in reduced gravity are discussed. The applicability of using existing computer codes for space applications is assessed. Recommendations regarding the use of existing earth-based fluid flow and heat transfer correlations are made, and deficiencies in these correlations are identified.

  2. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  4. Discrete mathematics with applications

    CERN Document Server

    Koshy, Thomas

    2003-01-01

    This approachable text studies discrete objects and the relationships that bind them. It helps students understand and apply the power of discrete math to digital computer systems and other modern applications. It provides excellent preparation for courses in linear algebra, number theory, and modern/abstract algebra and for computer science courses in data structures, algorithms, programming languages, compilers, databases, and computation.* Covers all recommended topics in a self-contained, comprehensive, and understandable format for students and new professionals * Emphasizes problem-solving techniques, pattern recognition, conjecturing, induction, applications of varying nature, proof techniques, algorithm development and correctness, and numeric computations* Weaves numerous applications into the text* Helps students learn by doing with a wealth of examples and exercises: - 560 examples worked out in detail - More than 3,700 exercises - More than 150 computer assignments - More than 600 writing projects*...

  5. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    Science.gov (United States)

    2015-09-15

    arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming... a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  6. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, firstly introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. Also we propose an interesting application to the formalisation of hybrid systems. We obtain some class of hybrid systems whose trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01-00810) and by the Siberian Branch of RAS (a grant for young researchers, 2000).

  7. The CAPEC Database

    DEFF Research Database (Denmark)

    Nielsen, Thomas Lund; Abildskov, Jens; Harper, Peter Mathias

    2001-01-01

    The Computer-Aided Process Engineering Center (CAPEC) database of measured data was established with the aim to promote greater data exchange in the chemical engineering community. The target properties are pure component properties, mixture properties, and special drug solubility data. The database divides pure component properties into primary, secondary, and functional properties. Mixture properties are categorized in terms of the number of components in the mixture and the number of phases present. The compounds in the database have been classified on the basis of the functional groups in the compound. This classification makes the CAPEC database a very useful tool, for example, in the development of new property models, since properties of chemically similar compounds are easily obtained. A program with efficient search and retrieval functions of properties has been developed.

  8. First International Conference on Intelligent Computing and Applications

    CERN Document Server

    Kar, Rajib; Das, Swagatam; Panigrahi, Bijaya

    2015-01-01

    The idea of the 1st International Conference on Intelligent Computing and Applications (ICICA 2014) is to bring research engineers, scientists, industrialists, scholars and students together from around the globe to present their ongoing research activities and hence to encourage research interactions between universities and industries. The conference provides opportunities for the delegates to exchange new ideas, applications and experiences, to establish research relations and to find global partners for future collaboration. The proceedings cover the latest progress in cutting-edge research on various research areas of Image, Language Processing, Computer Vision and Pattern Recognition, Machine Learning, Data Mining and Computational Life Sciences, Management of Data including Big Data and Analytics, Distributed and Mobile Systems including Grid and Cloud infrastructure, Information Security and Privacy, VLSI, Electronic Circuits, Power Systems, Antenna, Computational fluid dynamics & Hea...

  9. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff to use their time more productively by having computers perform the mechanical acquisition, analysis, and storage of data. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs.

  10. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend existing literature to the latest development in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency and time domain integral equations, and statistics methods in bio-electromagnetics.

  11. APPLICATIONS OF CLOUD COMPUTING SERVICES IN EDUCATION – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Tomasz Cieplak

    2014-11-01

    Applications of Cloud Computing in enterprises are very wide-ranging. In contrast, educational applications of Cloud Computing in Poland are somewhat limited. On the other hand, young people use Cloud Computing services frequently: utilization of Facebook, Google and other services by young people in Poland is almost the same as in Western Europe or in the USA. Taking these considerations into account, a few years ago the authors started a process of popularization and usage of Cloud Computing educational services in their professional work. This article briefly summarizes the authors' experience with selected and most popular Cloud Computing services.

  12. A Computer Knowledge Database of accidents at work in the construction industry

    Science.gov (United States)

    Hoła, B.; Szóstak, M.

    2017-10-01

    At least 60,000 fatal accidents at work occur on building sites all over the world each year, which means that, on average, every 10 minutes an employee dies during the execution of work. In 2015, 5,776 accidents at work happened on Polish building sites, of which 69 resulted in the death of an employee. Accidents are an enormous social and economic burden for companies, communities and countries. The vast majority of accidents at work can be prevented by appropriate and effective preventive measures. Therefore, the Computer Knowledge Database (CKD) was formulated for this purpose; it enables data and information on accidents at work in the construction industry to be collected and processed in order to obtain the necessary knowledge. This knowledge will be the basis for forming conclusions of a preventive nature.

  13. Analysis of technologies databases use in physical education and sport

    OpenAIRE

    Usychenko V.V.; Byshevets N.G.

    2010-01-01

    Scientific-methodological and specialized literature is analyzed and systematized. Questions concerning the use of database technologies in the system of athletes' preparation are raised. The necessity of applying technologies for the rapid processing of large arrays of sports information is shown. Experience in the use of computer-aided technologies for recording and analyzing the results of testing training-process parameters is summarized. The question of the influence of technologies is ...

  14. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  15. NoSQL and SQL Databases for Mobile Applications. Case Study: MongoDB versus PostgreSQL

    Directory of Open Access Journals (Sweden)

    Marin FOTACHE

    2013-01-01

    Compared with "classical" web, multi-tier applications, mobile applications have common and specific requirements concerning data persistence and processing. In mobile apps, database features can be analyzed separately for the client (minimalistic, isolated, memory-only) and the server (data-rich, centralized, distributed, synchronized and disk-based) layers. Currently, a few lite relational database products dominate persistence on the client platforms of mobile applications. There are two main objectives of this paper. The first is to investigate storage options for the major mobile platforms. The second is to point out some major differences between SQL and NoSQL datastores in terms of deployment, data model, schema design, and data definition and manipulation. As the NoSQL movement lacks standardization, MongoDB was chosen from the NoSQL product family as the reference, due to its strengths and popularity among developers. PostgreSQL serves as the representative of SQL DBMSs due to its popularity and conformity with SQL standards.
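
    The contrast the paper draws can be made concrete with a small sketch: a schema-less document insert next to a declared relational schema. This assumes a local MongoDB server for the first half, and uses SQLite as a runnable stand-in for PostgreSQL on the SQL side; the collection and table contents are invented.

```python
import sqlite3
from pymongo import MongoClient

# --- NoSQL side: no schema declared; nested structure stored as-is ---
client = MongoClient("mongodb://localhost:27017")
notes = client["mobile_app"]["notes"]
notes.insert_one({"title": "demo", "tags": ["sync", "offline"],
                  "geo": {"lat": 47.1, "lon": 27.6}})
print(notes.find_one({"tags": "sync"}))   # array containment query

# --- SQL side: schema first, flat rows, joins express nesting ---
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, title TEXT)")
con.execute("CREATE TABLE tags (note_id INTEGER REFERENCES notes(id), tag TEXT)")
con.execute("INSERT INTO notes(title) VALUES ('demo')")
con.execute("INSERT INTO tags VALUES (1, 'sync'), (1, 'offline')")
print(con.execute("""SELECT n.title FROM notes n
                     JOIN tags t ON t.note_id = n.id
                     WHERE t.tag = 'sync'""").fetchall())
```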

  16. Using ontology databases for scalable query answering, inconsistency detection, and data integration

    Science.gov (United States)

    Dou, Dejing

    2011-01-01

    An ontology database is a basic relational database management system that models an ontology plus its instances. To reason over the transitive closure of instances in the subsumption hierarchy, for example, an ontology database can either unfold views at query time or propagate assertions using triggers at load time. In this paper, we use existing benchmarks to evaluate our method—using triggers—and we demonstrate that by forward computing inferences, we not only improve query time, but the improvement appears to cost only more space (not time). However, we go on to show that the true penalties were simply opaque to the benchmark, i.e., the benchmark inadequately captures load-time costs. We have applied our methods to two case studies in biomedicine, using ontologies and data from genetics and neuroscience to illustrate two important applications: first, ontology databases answer ontology-based queries effectively; second, using triggers, ontology databases detect instance-based inconsistencies—something not possible using views. Finally, we demonstrate how to extend our methods to perform data integration across multiple, distributed ontology databases. PMID:22163378
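
    The load-time trigger mechanism described above can be sketched with SQLite: each asserted instance is forward-propagated up a subsumption table, so the transitive closure is materialized when data is loaded and queries need no view unfolding. The table and class names are illustrative, not the paper's schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA recursive_triggers = ON")  # let inferred rows re-fire the trigger
con.execute("CREATE TABLE subclass_of (sub TEXT, super TEXT)")
con.execute("CREATE TABLE instance_of (instance TEXT, class TEXT, "
            "UNIQUE(instance, class))")        # uniqueness halts the recursion

# On every assertion, propagate the instance to all direct superclasses.
con.execute("""
CREATE TRIGGER infer AFTER INSERT ON instance_of
BEGIN
  INSERT OR IGNORE INTO instance_of(instance, class)
  SELECT NEW.instance, s.super FROM subclass_of s WHERE s.sub = NEW.class;
END""")

con.execute("INSERT INTO subclass_of VALUES ('Neuron', 'Cell')")
con.execute("INSERT INTO subclass_of VALUES ('Cell', 'Thing')")
con.execute("INSERT INTO instance_of VALUES ('n1', 'Neuron')")

# Query time needs no unfolding: the closure is already materialized.
print(con.execute("SELECT class FROM instance_of WHERE instance='n1'").fetchall())
# -> [('Neuron',), ('Cell',), ('Thing',)]
```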

  17. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

     This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as the finite element methods, computer-aided design and optimization as well as visualization techniques such as computed axial tomography open up completely new research fields that combine the fields of engineering and bio/medical. Nevertheless, there are still hurdles since both directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  18. Notes on computer applications in the Canadian mineral industry and its future

    Energy Technology Data Exchange (ETDEWEB)

    Das, B M

    1983-10-01

    The importance of computer applications to the mineral industry in Canada; the formation and role of the Computer Applications and Process Control Committee (CAPC) of the CIM, and the CAPC's computer applications study in 1982 with the highlights of the study are discussed. The coal industry was the least touched by this survey. The need for computer workshops dealing with the various aspects of coal mining is stressed.

  19. The TJ-II Relational Database Access Library: A User's Guide

    International Nuclear Information System (INIS)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-01-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides in a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data, via remote procedure call, from user programs running on computers connected to the TJ-II local area networks. This document provides a general description of the database and its organization, a detailed description of the functions included in the library, and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs
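
    The library itself is called from FORTRAN and C; as a language-neutral illustration of the same remote-procedure-call pattern, here is a minimal sketch using Python's standard XML-RPC modules. The function name, signal data, and port are invented, not the TJ-II interface.

```python
from xmlrpc.server import SimpleXMLRPCServer

SIGNALS = {"ne_line": [1.0, 1.2, 1.4], "ip": [95.0, 96.2, 94.8]}  # fake data

def get_signal(shot, name):
    """Return the stored samples of one physical signal for a discharge."""
    return SIGNALS.get(name, [])

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(get_signal)
# server.serve_forever()  # run on the database host, then from a client:
#
# import xmlrpc.client
# db = xmlrpc.client.ServerProxy("http://localhost:8000")
# samples = db.get_signal(12345, "ne_line")
```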

  20. Towards a Component Based Model for Database Systems

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2004-02-01

    Due to their effectiveness in the design and development of software applications and due to their recognized advantages in terms of reusability, Component-Based Software Engineering (CBSE) concepts have been arousing a great deal of interest in recent years. This paper presents and extends a component-based approach to object-oriented database systems (OODB) introduced by us in [1] and [2]. Components are proposed as a new abstraction level for database systems: logical partitions of the schema. In this context, the scope is introduced as an escalated property for transactions. Components are studied from the integrity, consistency, and concurrency control perspective. The main benefits of our proposed component model for OODB are the reusability of the database design, including the access statistics required for proper query optimization, and a smooth information exchange. The integration of crosscutting concerns into the component database model using aspect-oriented techniques is also discussed. One of the main goals is to define a method for the assessment of component composition capabilities. These capabilities are restricted by the component’s interface and measured in terms of adaptability, degree of composability and acceptability level. The above-mentioned metrics are extended from database components to generic software components. This paper extends and consolidates into one common view the ideas previously presented by us in [1, 2, 3]. [1] Octavian Paul Rotaru, Marian Dobre, Component Aspects in Object Oriented Databases, Proceedings of the International Conference on Software Engineering Research and Practice (SERP’04), Volume II, ISBN 1-932415-29-7, pages 719-725, Las Vegas, NV, USA, June 2004. [2] Octavian Paul Rotaru, Marian Dobre, Mircea Petrescu, Integrity and Consistency Aspects in Component-Oriented Databases, Proceedings of the International Symposium on Innovation in Information and Communication Technology (ISIICT

  1. Data-base tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    The authors use a commercial data-base software package to create several data-base products that enhance the ability of experimental physicists to analyze data from the TMX-U experiment. This software resides on a DEC-20 computer in M-Division's user service center (USC), where data can be analyzed separately from the main acquisition computers. When these data-base tools are combined with interactive data analysis programs, physicists can perform automated (batch-style) processing or interactive data analysis on the computers in the USC or on the supercomputers of the NMFECC, in addition to the normal processing done on the acquisition system. One data-base tool provides highly reduced data for searching and correlation analysis of several diagnostic signals for a single shot or many shots. A second data-base tool provides retrieval and storage of unreduced data for detailed analysis of one or more diagnostic signals. The authors report how these data-base tools form the core of an evolving off-line data-analysis environment on the USC computers

  2. Quantum Computation-Based Image Representation, Processing Operations and Their Applications

    Directory of Open Access Journals (Sweden)

    Fei Yan

    2014-10-01

    A flexible representation of quantum images (FRQI) was proposed to facilitate the extension of classical (non-quantum) image processing applications to the quantum computing domain. The representation encodes a quantum image in the form of a normalized state, which captures information about colors and their corresponding positions in the images. Since its conception, a handful of processing transformations have been formulated, among which are the geometric transformations on quantum images (GTQI) and the CTQI transformations that are focused on the color information of the images. In addition, extensions and applications of the FRQI representation have also been suggested, such as a multi-channel representation for quantum images (MCQI), quantum image data searching, watermarking strategies for quantum images, a framework to produce movies on quantum computers, and a blueprint for quantum video encryption and decryption. These proposals extend classical-like image and video processing applications to the quantum computing domain and offer a significant speed-up with low computational resources in comparison to performing the same tasks on traditional computing devices. Each of the algorithms and the mathematical foundations for their execution were simulated using classical computing resources, and their results were analyzed alongside other classical computing equivalents. The work presented in this review is intended to serve as the epitome of advances made in FRQI quantum image processing over the past five years and to stimulate further interest geared towards the realization of some secure and efficient image and video processing applications on quantum computers.
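
    The FRQI encoding itself is easy to state numerically: each pixel's gray level maps to an angle, and the image becomes a single normalized state over one color qubit plus position qubits. The NumPy sketch below builds that state vector for a 2 x 2 image; it computes amplitudes classically and is not a quantum simulation.

```python
import numpy as np

pixels = np.array([0.0, 0.25, 0.5, 1.0])   # 2x2 image, gray levels in [0, 1]
thetas = pixels * np.pi / 2                # map gray level to angle

n_pos = len(pixels)                        # 4 position basis states |i>
state = np.zeros(2 * n_pos)                # ordering: color qubit, then position
for i, t in enumerate(thetas):
    state[i] = np.cos(t) / np.sqrt(n_pos)          # amplitude of |0>|i>
    state[n_pos + i] = np.sin(t) / np.sqrt(n_pos)  # amplitude of |1>|i>

assert np.isclose(np.linalg.norm(state), 1.0)      # a valid normalized state
print(state.round(3))
```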

  3. Database design using entity-relationship diagrams

    CERN Document Server

    Bagui, Sikha

    2011-01-01

    Data, Databases, and the Software Engineering Process: Data; Building a Database; What is the Software Engineering Process?; Entity Relationship Diagrams and the Software Engineering Life Cycle; Phase 1: Get the Requirements for the Database; Phase 2: Specify the Database; Phase 3: Design the Database. Data and Data Models: Files, Records, and Data Items; Moving from 3 × 5 Cards to Computers; Database Models; The Hierarchical Model; The Network Model; The Relational Model; The Relational Model and Functional Dependencies; Fundamental Relational Database; Relational Database and Sets; Functional...

  4. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    Science.gov (United States)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The crux of the matter is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers in regard to data organization are shown here through its application on El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.

  5. A Tabu Search Algorithm for application placement in computer clustering

    NARCIS (Netherlands)

    van der Gaast, Jelmer; Rietveld, Cornelieus A.; Gabor, Adriana; Zhang, Yingqian

    2014-01-01

    This paper presents and analyzes a model for the problem of placing applications on computer clusters (APP). In this problem, organizations requesting a set of software applications have to be assigned to computer clusters such that the costs of opening clusters and installing the necessary
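
    The abstract is truncated above, so the authors' exact APP model cannot be reproduced; the sketch below is a generic tabu-search skeleton for an assignment-style placement problem with invented opening and installation costs, showing the three usual ingredients: a move neighborhood, a tabu list, and an aspiration criterion.

```python
import random

N_ORGS, N_CLUSTERS = 6, 3
OPEN_COST = [10, 12, 8]                                # cost to open a cluster
ASSIGN_COST = [[random.randint(1, 9) for _ in range(N_CLUSTERS)]
               for _ in range(N_ORGS)]                 # org-to-cluster cost

def cost(sol):
    return (sum(OPEN_COST[c] for c in set(sol)) +
            sum(ASSIGN_COST[o][c] for o, c in enumerate(sol)))

def tabu_search(iters=200, tenure=7):
    sol = [random.randrange(N_CLUSTERS) for _ in range(N_ORGS)]
    best, best_cost = sol[:], cost(sol)
    tabu = {}                                # move -> iteration it expires at
    for it in range(iters):
        # Neighborhood: move one organization to a different cluster.
        candidates = []
        for o in range(N_ORGS):
            for c in range(N_CLUSTERS):
                if c != sol[o]:
                    cand = sol[:]
                    cand[o] = c
                    candidates.append((cost(cand), (o, c), cand))
        candidates.sort()
        for c_cost, move, cand in candidates:
            # Take the best non-tabu move; allow a tabu move only if it
            # improves on the best solution found so far (aspiration).
            if tabu.get(move, 0) <= it or c_cost < best_cost:
                tabu[(move[0], sol[move[0]])] = it + tenure  # forbid reversal
                sol = cand
                if c_cost < best_cost:
                    best, best_cost = cand[:], c_cost
                break
    return best, best_cost

print(tabu_search())
```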

  6. Web application for monitoring mainframe computer, Linux operating systems and application servers

    OpenAIRE

    Dimnik, Tomaž

    2016-01-01

    This work presents the idea and the realization of a web application for monitoring the operation of a mainframe computer, servers with the Linux operating system, and application servers. The web application is intended for administrators of these systems, as an aid to better understand the current state, load, and operation of the individual components of the server systems.

  7. IP Telephony Applicability in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Francisco Palacios

    2018-02-01

    This paper presents research on the applicability of VoIP over Cloud Computing to guarantee service stability and elasticity for organizations. In this paper, Elastix is used as open source software that allows the management and control of a Private Branch Exchange (PBX); for deployment, the services of Amazon Web Services are used, due to its leadership and experience in cloud computing, providing security, scalability, backup services and feasibility for users.

  8. Application of cluster computing in materials science

    International Nuclear Information System (INIS)

    Kuzmin, A.

    2006-01-01

    The solution of many problems in materials science requires the use of high performance computing (HPC). Therefore, a cluster computer, the Latvian Super-cluster (LASC), was constructed at the Institute of Solid State Physics of the University of Latvia in 2002. The LASC is used for advanced research in the fields of quantum chemistry, solid state physics and nanomaterials. In this work we overview currently available computational technologies and exemplify their application by the interpretation of x-ray absorption spectra for nano-sized ZnO. (author)

  9. 3rd International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Biswal, Bhabendra; Udgata, Siba; Mandal, JK

    2015-01-01

    Volume 1 contains 95 papers presented at FICTA 2014: Third International Conference on Frontiers in Intelligent Computing: Theory and Applications. The conference was held on 14-15 November 2014 at Bhubaneswar, Odisha, India. This volume contains papers mainly focused on Data Warehousing and Mining, Machine Learning, Mobile and Ubiquitous Computing, AI, E-commerce & Distributed Computing and Soft Computing, Evolutionary Computing, Bio-inspired Computing and its Applications.

  10. Expert Oracle database architecture Oracle database programming 9i, 10g, and 11g : Techniques and solution

    CERN Document Server

    Kyte, Thomas

    2010-01-01

    Now in its second edition, this best-selling book by Tom Kyte of Ask Tom fame continues to bring you some of the best thinking on how to apply Oracle Database to produce scalable applications that perform well and deliver correct results. Tom has a simple philosophy: you can treat Oracle as a black box and just stick data into it or you can understand how it works and exploit it as a powerful computing environment. If you choose the latter, then you'll find that there are few information management problems that you cannot solve quickly and elegantly. This fully revised second edition covers t

  11. Applications of the Cambridge Structural Database in chemical education1

    Science.gov (United States)

    Battle, Gary M.; Ferrence, Gregory M.; Allen, Frank H.

    2010-01-01

    The Cambridge Structural Database (CSD) is a vast and ever growing compendium of accurate three-dimensional structures that has massive chemical diversity across organic and metal–organic compounds. For these reasons, the CSD is finding significant uses in chemical education, and these applications are reviewed. As part of the teaching initiative of the Cambridge Crystallographic Data Centre (CCDC), a teaching subset of more than 500 CSD structures has been created that illustrate key chemical concepts, and a number of teaching modules have been devised that make use of this subset in a teaching environment. All of this material is freely available from the CCDC website, and the subset can be freely viewed and interrogated using WebCSD, an internet application for searching and displaying CSD information content. In some cases, however, the complete CSD System is required for specific educational applications, and some examples of these more extensive teaching modules are also discussed. The educational value of visualizing real three-dimensional structures, and of handling real experimental results, is stressed throughout. PMID:20877495

  12. Aggregating job exit statuses of a plurality of compute nodes executing a parallel application

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Mundy, Michael B.

    2015-07-21

    Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
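
    A rough analogue of this leader-based aggregation can be written with mpi4py collectives, with rank 0 standing in for the job leader; the status fields are invented, and this sketches the data flow rather than the patented mechanism itself.

```python
# Run with, e.g.:  mpirun -n 4 python exit_status.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each compute node reports how its portion of the application ended.
my_status = {"rank": rank, "exit_code": 0 if rank != 2 else 1}

statuses = comm.gather(my_status, root=0)   # the leader collects all reports
if rank == 0:
    worst = max(s["exit_code"] for s in statuses)
    failed = [s["rank"] for s in statuses if s["exit_code"] != 0]
    print({"aggregate_exit": worst, "failed_ranks": failed})
```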

  13. Educational Technology Network: a computer conferencing system dedicated to applications of computers in radiology practice, research, and education.

    Science.gov (United States)

    D'Alessandro, M P; Ackerman, M J; Sparks, S M

    1993-11-01

    Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.

  14. A directory of computer software applications: energy. Report for 1974--1976

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1977-04-01

    The computer programs or the computer program documentation cited in this directory have been developed for a variety of applications in the field of energy. The cited computer software includes applications in solar energy, petroleum resources, batteries, electrohydrodynamic generators, magnetohydrodynamic generators, natural gas, nuclear fission, nuclear fusion, hydroelectric power production, and geothermal energy. The computer software cited has been used for simulation and modeling, calculations of future energy requirements, calculations of energy conservation measures, and computations of economic considerations of energy systems.

  15. ACToR - Aggregated Computational Toxicology Resource

    International Nuclear Information System (INIS)

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-01-01

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™

  16. Applications of Context-Aware Computing in Hospital Work - Examples and Design Principles

    DEFF Research Database (Denmark)

    Bardram, Jacob Eyvind

    2004-01-01

    Context-awareness is a key concept in ubiquitous computing, which sometimes seems to be a technology looking for a purpose. In this paper we report on the application of context-aware computing for medical work in hospitals, which has appeared to be a strong case for applying context-aware computing. We present the design of a context-aware pill container and a context-aware hospital bed, both of which react and adapt according to what is happening in their context. The applications have been evaluated in a number of workshops with clinicians and patients. Based on this empirical work of designing, developing, and evaluating context-aware clinical applications, the paper outlines some key design principles for a context-awareness framework, supporting the development and deployment of context-aware clinical computer applications.

  17. Research on cloud computing solutions

    Directory of Open Access Journals (Sweden)

    Liudvikas Kaklauskas

    2015-07-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the public cloud. A private cloud is suited for sensitive data, where the customer is dependent on a certain degree of security. According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service), and SaaS (software as a service). The main cloud computing solutions are web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications for better server utilization; without a clustering solution, a service may fail the moment its server crashes. DOI: 10.15181/csat.v2i2.914

  18. Computational intelligence in digital forensics forensic investigation and applications

    CERN Document Server

    Choo, Yun-Huoy; Abraham, Ajith; Srihari, Sargur

    2014-01-01

    Computational Intelligence techniques have been widely explored in various domains, including forensics. Forensic analysis encompasses the study of pattern analysis that answers questions of interest in security, medical, legal, genetic and other studies. However, forensic analysis is usually performed through experiments in the lab, which is expensive in both cost and time. Therefore, this book seeks to explore the progress and advancement of computational intelligence techniques in different focus areas of forensic studies. It aims to build a stronger connection between computer scientists and forensic field experts. This book, Computational Intelligence in Digital Forensics: Forensic Investigation and Applications, is the first volume in the Intelligent Systems Reference Library series. The book presents original research results and innovative applications of computational intelligence in digital forensics. This edited volume contains seventeen chapters and presents the latest state-of-the-art advancement ...

  19. 47 CFR 69.120 - Line information database.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Line information database. 69.120 Section 69...) ACCESS CHARGES Computation of Charges § 69.120 Line information database. (a) A charge that is expressed... from a local exchange carrier database to recover the costs of: (1) The transmission facilities between...

  20. Application of computer technique in SMCAMS

    International Nuclear Information System (INIS)

    Lu Deming

    2001-01-01

    A series of applications of computer techniques in SMCAMS physics design and magnetic field measurement is described, including digital calculation of electromagnetic fields, beam dynamics, calculation of beam injection and extraction, and mapping and shaping of the magnetic field.

  1. Cloud Computing Integrated Multi-Factor Authentication Framework Application in Logistics Information Systems

    Directory of Open Access Journals (Sweden)

    Zeynel Erdi Karabulut

    2017-12-01

    As new technology enables firms to perform many daily processes more easily, the need for authentication and authorization processes is becoming an integral part of many businesses. Mobile applications, which are very popular nowadays, also play an important role in our lives. Such demands are not limited to Logistics Information Systems (LIS) but extend to many other fields of information systems as well. In this study, multi-dimensional authentication consisting of online biometric face detection integrated as cloud computing Software as a Service (SaaS), Near Field Communication (NFC) card authentication, location confirmation, and temporal data confirmation is brought together to fulfill different business authentication scenarios. The Microsoft Face API (Application Program Interface), a SaaS offering, has been used in the face recognition module of the developed mobile application. The face recognition module of the mobile application has been tested with the Yale Face Database. Location, temporal data and NFC card information are collected and confirmed by the mobile application for authentication and authorization. The test images were run through our facial recognition module and confusion matrices were created. The accuracy of the system after the facial recognition test was found to be 100%. NFC card, location and temporal data authentication not only further increase the security level but also fulfill many business authentication scenarios successfully. To the best of our knowledge, there is no other authentication model that has a 4-factor confirmation including biometric face identification, NFC card authentication, location confirmation and temporal data confirmation.
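
    How the four factors might be composed into one authorization decision can be sketched as a simple conjunction; the thresholds, field names, and geofence logic below are invented, and the face-match score is assumed to come from a cloud face-recognition service such as the Face API named above.

```python
from datetime import datetime, time

def authorize(face_score, nfc_uid, lat, lon, now, *,
              enrolled_uids, site=(41.0, 28.9), radius_deg=0.05,
              shift=(time(8, 0), time(18, 0)), face_threshold=0.8):
    checks = {
        "face":     face_score >= face_threshold,        # biometric factor
        "nfc":      nfc_uid in enrolled_uids,            # card factor
        "location": abs(lat - site[0]) <= radius_deg     # geofence factor
                    and abs(lon - site[1]) <= radius_deg,
        "temporal": shift[0] <= now.time() <= shift[1],  # working-hours factor
    }
    return all(checks.values()), checks

ok, detail = authorize(0.93, "04:A2:FF:11", 41.01, 28.91,
                       datetime(2017, 6, 1, 9, 30),
                       enrolled_uids={"04:A2:FF:11"})
print(ok, detail)
```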

  2. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  3. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  4. The Application and Future of Big Database Studies in Cardiology: A Single-Center Experience.

    Science.gov (United States)

    Lee, Kuang-Tso; Hour, Ai-Ling; Shia, Ben-Chang; Chu, Pao-Hsien

    2017-11-01

    As medical research techniques and quality have improved, it has become apparent that cardiovascular problems could be better resolved by stricter experimental design. In fact, substantial time and resources must be expended to fulfill the requirements of high quality studies, and many worthy ideas and hypotheses could not be verified or proven due to ethical or economic limitations. In recent years, new and varied applications and uses of databases have received increasing attention. Important information regarding certain issues, such as rare cardiovascular diseases, women's heart health, post-marketing analysis of different medications, or a combination of clinical and regional cardiac features, can be obtained by the use of rigorous statistical methods. However, limitations exist in all databases. One of the keys to creating and correctly addressing this research is a reliable process for analyzing and interpreting these cardiologic databases.

  5. Conditions Database for the Belle II Experiment

    Science.gov (United States)

    Wood, L.; Elsethagen, T.; Schram, M.; Stephan, E.

    2017-10-01

    The Belle II experiment at KEK is preparing for first collisions in 2017. Processing the large amounts of data that will be produced will require conditions data to be readily available to systems worldwide in a fast and efficient manner that is straightforward for both the user and maintainer. The Belle II conditions database was designed with a straightforward goal: make it as easily maintainable as possible. To this end, HEP-specific software tools were avoided as much as possible and industry standard tools used instead. HTTP REST services were selected as the application interface, which provide a high-level interface to users through the use of standard libraries such as curl. The application interface itself is written in Java and runs in an embedded Payara-Micro Java EE application server. Scalability at the application interface is provided by use of Hazelcast, an open source In-Memory Data Grid (IMDG) providing distributed in-memory computing and supporting the creation and clustering of new application interface instances as demand increases. The IMDG provides fast and efficient access to conditions data via in-memory caching.
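
    Because the interface is plain HTTP REST, a client needs nothing HEP-specific. The sketch below shows the general shape of such a request with the Python requests library; the host, resource path, and parameter names are placeholders, not the actual Belle II endpoints.

```python
import requests

BASE = "https://conditions.example.org/rest"   # placeholder host

def get_payload(global_tag, payload_name, run):
    # Parameter names are assumptions for illustration only.
    r = requests.get(f"{BASE}/payloads",
                     params={"gtName": global_tag,
                             "name": payload_name,
                             "run": run})
    r.raise_for_status()
    return r.json()

# print(get_payload("release-01", "BeamSpot", 42))
```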

  6. [Adverse Effect Predictions Based on Computational Toxicology Techniques and Large-scale Databases].

    Science.gov (United States)

    Uesawa, Yoshihiro

    2018-01-01

    Understanding the features of chemical structures related to the adverse effects of drugs is useful for identifying potential adverse effects of new drugs. This can be based on the limited information available from post-marketing surveillance, assessment of the potential toxicities of metabolites and illegal drugs with unclear characteristics, screening of lead compounds at the drug discovery stage, and identification of leads for the discovery of new pharmacological mechanisms. The present paper describes techniques used in computational toxicology to investigate the content of large-scale spontaneous report databases of adverse effects, illustrated with examples. Furthermore, volcano plotting, a new visualization method for clarifying the relationships between drugs and adverse effects via comprehensive analyses, is introduced. These analyses may produce a great amount of data that can be applied to drug repositioning.

  7. National Geochronological Database

    Science.gov (United States)

    Revised by Sloan, Jan; Henry, Christopher D.; Hopkins, Melanie; Ludington, Steve; Original database by Zartman, Robert E.; Bush, Charles A.; Abston, Carl

    2003-01-01

    The National Geochronological Data Base (NGDB) was established by the United States Geological Survey (USGS) to collect and organize published isotopic (also known as radiometric) ages of rocks in the United States. The NGDB (originally known as the Radioactive Age Data Base, RADB) was started in 1974. A committee appointed by the Director of the USGS was given the mission to investigate the feasibility of compiling the published radiometric ages for the United States into a computerized data bank for ready access by the user community. A successful pilot program, which was conducted in 1975 and 1976 for the State of Wyoming, led to a decision to proceed with the compilation of the entire United States. For each dated rock sample reported in published literature, a record containing information on sample location, rock description, analytical data, age, interpretation, and literature citation was constructed and included in the NGDB. The NGDB was originally constructed and maintained on a mainframe computer, and later converted to a Helix Express relational database maintained on an Apple Macintosh desktop computer. The NGDB and a program to search the data files were published and distributed on Compact Disc-Read Only Memory (CD-ROM) in standard ISO 9660 format as USGS Digital Data Series DDS-14 (Zartman and others, 1995). As of May 1994, the NGDB consisted of more than 18,000 records containing over 30,000 individual ages, which is believed to represent approximately one-half the number of ages published for the United States through 1991. Because the organizational unit responsible for maintaining the database was abolished in 1996, and because we wanted to provide the data in more usable formats, we have reformatted the data, checked and edited the information in some records, and provided this online version of the NGDB. This report describes the changes made to the data and formats, and provides instructions for the use of the database in geographic

  8. A New Database for Speaker Recognition

    DEFF Research Database (Denmark)

    Feng, Ling; Hansen, Lars Kai

    2005-01-01

    In this paper we discuss properties of speech databases used for speaker recognition research and evaluation, and we characterize some popular standard databases. The paper presents a new database called ELSDSR dedicated to speaker recognition applications. The main characteristics of this database...

  9. New Concepts and Applications in Soft Computing

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária

    2013-01-01

    The book provides a sample of research on the innovative theory and applications of soft computing paradigms. The idea of Soft Computing was initiated in 1981 when Professor Zadeh published his first paper on soft data analysis, and it has constantly evolved ever since. Professor Zadeh defined Soft Computing as the fusion of the fields of fuzzy logic (FL), neural network theory (NN) and probabilistic reasoning (PR), with the latter subsuming belief networks, evolutionary computing including DNA computing, chaos theory and parts of learning theory, into one multidisciplinary system. As Zadeh said, the essence of soft computing is that, unlike traditional hard computing, soft computing is aimed at an accommodation with the pervasive imprecision of the real world. Thus, the guiding principle of soft computing is to exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. ...

  10. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Kothe, Douglas B [ORNL; Nam, Hai Ah [ORNL

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  12. Database structure and file layout of Nuclear Power Plant Database. Database for design information on Light Water Reactors in Japan

    International Nuclear Information System (INIS)

    Yamamoto, Nobuo; Izumi, Fumio.

    1995-12-01

    The Nuclear Power Plant Database (PPD) has been developed at the Japan Atomic Energy Research Institute (JAERI) to provide plant design information on domestic Light Water Reactors (LWRs) to be used for nuclear safety research and so forth. This database runs on the mainframe computer at the JAERI Tokai Establishment. The PPD contains information on plant design concepts and on the numbers, capacities, materials, structures and types of equipment and components, etc., based on the safety analysis reports of the domestic LWRs. This report describes the details of the PPD, focusing on the database structure and the layout of data files, so that users can utilize it efficiently. (author)

  13. Journal of Computer Science and Its Application: Submissions

    African Journals Online (AJOL)

    Author Guidelines. The Journal of Computer Science and Its Applications welcomes submission of complete and original research manuscripts, which are not under review in any other conference or journal. The topics covered by the journal include but are not limited to Artificial Intelligence, Bioinformatics, Computational ...

  14. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2011-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fifth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rules

  15. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2005-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fourth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rul

  16. Application of Google Maps API service for creating web map of information retrieved from CORINE land cover databases

    Directory of Open Access Journals (Sweden)

    Kilibarda Milan

    2010-01-01

    Full Text Available Today, the Google Maps API, an Ajax-based standard web service, allows users to publish interactive web maps, opening new possibilities relative to classical analogue maps. CORINE land cover databases are recognized as fundamental reference data sets for numerous kinds of spatial analysis. The theoretical and applicable aspects of the Google Maps API cartographic service are considered for the case of creating a web map of changes in urban areas in Belgrade and its surroundings from 2000 to 2006, obtained from CORINE databases.

  17. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
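
    The core of such a ranked retrieval scheme can be illustrated compactly. The sketch below scores models against a query with a plain TF-IDF weighting over their textual annotations; the model identifiers and annotation strings are invented for illustration, and the real BioModels Database search engine is considerably richer.

```python
import math
from collections import Counter

def tf_idf_rank(query, documents):
    """Rank documents (id -> annotation text) against a whitespace-tokenized query."""
    tokenized = {doc_id: text.lower().split() for doc_id, text in documents.items()}
    n_docs = len(tokenized)
    df = Counter()                       # document frequency of each term
    for tokens in tokenized.values():
        df.update(set(tokens))
    scores = {}
    for doc_id, tokens in tokenized.items():
        tf = Counter(tokens)
        score = 0.0
        for term in query.lower().split():
            if term in tf:
                idf = math.log(n_docs / df[term])
                score += (tf[term] / len(tokens)) * idf
        scores[doc_id] = score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

models = {"BIOMD-A": "glycolysis kinetic model yeast annotated",
          "BIOMD-B": "circadian clock model drosophila",
          "BIOMD-C": "glycolysis flux model coli"}
print(tf_idf_rank("glycolysis model", models))
```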

  18. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia in conjunction with Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, The General Association of Engineers in Romania - Arad Section, Institute of Computer Science, Iasi Branch of the Romanian Academy and IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  19. Institute for Computer Applications in Science and Engineering (ICASE)

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.

  20. Stochastic Collocation Applications in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Dragan Poljak

    2018-01-01

    Full Text Available The paper reviews the application of deterministic-stochastic models in some areas of computational electromagnetics. Namely, in certain problems there is an uncertainty in the input data set, as some properties of a system are partly or entirely unknown. Thus, a simple stochastic collocation (SC) method is used to determine relevant statistics about given responses. The SC approach also provides the assessment of related confidence intervals in the set of calculated numerical results. The expansion of the statistical output in terms of mean and variance over a polynomial basis, via the SC method, is shown to be a robust and efficient approach providing a satisfactory convergence rate. This review paper provides computational examples from the authors' previous work illustrating the successful application of the SC technique in the areas of ground penetrating radar (GPR), human exposure to electromagnetic fields, and buried lines and grounding systems.
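
    In its simplest quadrature form, the stochastic collocation estimates referred to above reduce to weighted sums of deterministic solver runs at collocation points. The notation below (points \xi_i, weights w_i, response Y) is assumed here for illustration, following common usage, rather than taken from the paper:

```latex
\mathbb{E}[Y] \approx \sum_{i=1}^{N} w_i \, Y(\xi_i), \qquad
\operatorname{Var}[Y] \approx \sum_{i=1}^{N} w_i \, Y(\xi_i)^2
  - \Big( \sum_{i=1}^{N} w_i \, Y(\xi_i) \Big)^2
```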

  1. Database architectures for Space Telescope Science Institute

    Science.gov (United States)

    Lubow, Stephen

    1993-08-01

    At STScI nearly all large applications require database support. A general purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears to be promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).
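
    The essence of this layered pattern — an application client talking to a generic interface while an intermediate server hides the vendor DBMS — can be sketched briefly. The class and method names below are hypothetical (the actual STDB/NET calls are not reproduced here), and SQLite merely stands in for the vendor DBMS server:

```python
import sqlite3

class GenericDBServer:
    """Stands in for the intermediate server; here it simply fronts SQLite."""
    def __init__(self, path):
        self.conn = sqlite3.connect(path)

    def execute(self, generic_query, params=()):
        # in the real architecture this layer would translate a vendor-neutral
        # request into the DBMS dialect and handle deadlock restart/monitoring
        return self.conn.execute(generic_query, params).fetchall()

class ApplicationClient:
    """The application sees only the generic interface, never the DBMS vendor."""
    def __init__(self, server):
        self.server = server

    def query(self, sql, params=()):
        return self.server.execute(sql, params)

server = GenericDBServer(":memory:")
server.execute("CREATE TABLE obs (id INTEGER, target TEXT)")
server.execute("INSERT INTO obs VALUES (1, 'NGC 4261')")
client = ApplicationClient(server)
print(client.query("SELECT target FROM obs WHERE id = ?", (1,)))
```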

  2. Integrasi Database DISDUKCAPIL dan Database KPU Kabupaten Maros Memanfaatkan Web Services

    Directory of Open Access Journals (Sweden)

    Frans N. Allokendek

    2013-01-01

    Abstract Many problems are encountered in the implementation of local elections, caused by both the committee and the participants. The problems that most frequently occur are the unavailability of an updated list of the potential population eligible to vote in the local election, a swelling number of voters due to double entries, and the limited time available to verify documents. Similar problems are encountered by the General Election Committee (KPU) of Maros Regency. A web service is a technology comprising a set of standards that allows two computer applications to communicate with each other and exchange data over the Internet. In this study, web services are used to let two different applications communicate: SIAK of the Demography and Civil Registration Office of Maros Regency and SIDP of the General Election Committee of Maros Regency. The study covers the system design and the implementation of a prototype that integrates the SIAK database of the Demography and Civil Registration Office with the KPU database in Maros Regency using web service technology. The result is a valid Fixed Voter List. Keywords—Web Services, Data Integration, Fixed Voter List
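
    The integration step itself amounts to matching records from the two systems and dropping doubles. Below is a minimal sketch under assumed record layouts; the field names (`nik`, `name`, `age`) and sample values are invented for illustration, and the real exchange happens through web services rather than in-memory lists:

```python
def build_fixed_voter_list(siak_records, kpu_records):
    """Merge two record sets on the national ID, keeping one entry per person."""
    by_nik = {}
    for rec in siak_records + kpu_records:
        nik = rec["nik"]            # national identity number
        if nik not in by_nik:
            by_nik[nik] = rec       # first occurrence wins; doubles are dropped
    # keep only people old enough to vote (17+ in Indonesia)
    return [r for r in by_nik.values() if r["age"] >= 17]

siak = [{"nik": "NIK-001", "name": "A. Rahman", "age": 34},
        {"nik": "NIK-002", "name": "S. Dewi", "age": 16}]
kpu = [{"nik": "NIK-001", "name": "A. Rahman", "age": 34}]  # double entry
print(build_fixed_voter_list(siak, kpu))
```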

  3. Enforcing Privacy in Cloud Databases

    OpenAIRE

    Moghadam, Somayeh Sobati; Darmont, Jérôme; Gavin, Gérald

    2017-01-01

    International audience; Outsourcing databases, i.e., resorting to Database-as-a-Service (DBaaS), is nowadays a popular choice due to the elasticity, availability, scalability and pay-as-you-go features of cloud computing. However, most data are sensitive to some extent, and data privacy remains one of the top concerns to DBaaS users, for obvious legal and competitive reasons. In this paper, we survey the mechanisms that aim at making databases secure in a cloud environment, and discuss current...

  4. Atlantic Canada's energy research and development website and database

    International Nuclear Information System (INIS)

    2005-01-01

    Petroleum Research Atlantic Canada maintains a website devoted to energy research and development in Atlantic Canada. The site can be viewed on the world wide web at www.energyresearch.ca. It includes a searchable database with information about researchers in Nova Scotia, their projects and published materials on issues related to hydrocarbons, alternative energy technologies, energy efficiency, climate change, environmental impacts and policy. The website also includes links to research funding agencies, external related databases and related energy organizations around the world. Nova Scotia-based users are invited to submit their academic, private or public research to the site. Before new information is uploaded into the database, a site administrator reviews and processes it. Users are asked to identify their areas of interest according to the following research categories: alternative or renewable energy technologies; climate change; coal; computer applications; economics; energy efficiency; environmental impacts; geology; geomatics; geophysics; health and safety; human factors; hydrocarbons; meteorology and oceanology (metocean) activities; petroleum operations in deep and shallow waters; policy; and power generation and supply. The database can be searched in five ways: by topic, researcher, publication, project or funding agency. refs., tabs., figs

  5. Reducing power consumption during execution of an application on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-06-05

    Methods, apparatus, and products are disclosed for reducing power consumption during execution of an application on a plurality of compute nodes that include: executing, by each compute node, an application, the application including power consumption directives corresponding to one or more portions of the application; identifying, by each compute node, the power consumption directives included within the application during execution of the portions of the application corresponding to those identified power consumption directives; and reducing power, by each compute node, to one or more components of that compute node according to the identified power consumption directives during execution of the portions of the application corresponding to those identified power consumption directives.
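
    The control flow the abstract describes can be sketched in a few lines: each node walks through the portions of an application, and where a portion carries a power consumption directive, it throttles the named component for the duration of that portion. The directive format and the throttle hook below are hypothetical:

```python
def run_with_power_directives(portions, set_component_power):
    """portions: (work_fn, directive) pairs; a directive may be None."""
    for work, directive in portions:
        if directive:  # e.g. {"component": "memory", "level": "low"}
            set_component_power(directive["component"], directive["level"])
        work()  # execute this portion of the application
        if directive:
            set_component_power(directive["component"], "normal")

def fake_throttle(component, level):
    # stands in for the node's real power-control interface
    print(f"power: {component} -> {level}")

run_with_power_directives(
    [(lambda: print("compute-bound phase"), {"component": "memory", "level": "low"}),
     (lambda: print("memory-bound phase"), None)],
    fake_throttle)
```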

  6. User's guide to the Geothermal Resource Areas Database

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D.; Leung, K.; Yen, W.

    1981-10-01

    The National Geothermal Information Resource project at the Lawrence Berkeley Laboratory is developing a Geothermal Resource Areas Database, called GRAD, designed to answer questions about the progress of geothermal energy development. This database will contain extensive information on geothermal energy resources for selected areas, covering development from initial exploratory surveys to plant construction and operation. The database is available for on-line interactive query by anyone with an account number on the computer, a computer terminal with an acoustic coupler, and a telephone. This report will help in making use of the database. Some information is provided on obtaining access to the computer system being used, along with instructions on obtaining standard reports and some aids to using the query language.

  7. Computational electromagnetics recent advances and engineering applications

    CERN Document Server

    2014-01-01

    Emerging Topics in Computational Electromagnetics presents recent advances in computational electromagnetics (CEM). The book is designed to fill the existing gap in current CEM literature, which covers only the conventional numerical techniques for solving traditional EM problems. It examines new algorithms, and applications of these algorithms for solving problems of current interest that are not readily amenable to efficient treatment by using the existing techniques. The authors discuss solution techniques for problems arising in nanotechnology, bioEM, metamaterials, as well as multiscale problems. They present techniques that utilize recent advances in computer technology, such as parallel architectures, and address the increasing need to solve large and complex problems in a time-efficient manner by using highly scalable algorithms.

  8. Survey of Machine Learning Methods for Database Security

    Science.gov (United States)

    Kamra, Ashish; Bertino, Elisa

    Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.
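
    The anomaly-detection side of this survey can be illustrated with a deliberately crude profile-based detector: queries are reduced to coarse features, a profile counts the feature combinations seen during training, and unseen combinations are flagged. This is a toy illustration of the general approach, not any surveyed system or product:

```python
from collections import Counter

class QueryProfile:
    """Counts coarse query 'shapes' seen during training; unseen shapes are flagged."""
    def __init__(self):
        self.seen = Counter()

    def feature(self, query):
        # crude feature: (leading SQL keyword, bucketed token count)
        tokens = query.lower().split()
        return (tokens[0], len(tokens) // 5)

    def train(self, queries):
        for q in queries:
            self.seen[self.feature(q)] += 1

    def is_anomalous(self, query, min_count=1):
        return self.seen[self.feature(query)] < min_count

profile = QueryProfile()
profile.train(["SELECT name FROM staff WHERE id = 1",
               "SELECT name FROM staff WHERE id = 2"])
print(profile.is_anomalous("SELECT name FROM staff WHERE id = 3"))  # False
print(profile.is_anomalous("SELECT * FROM salaries"))               # True
```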

  9. Computational geometry algorithms and applications

    CERN Document Server

    de Berg, Mark; Overmars, Mark; Schwarzkopf, Otfried

    1997-01-01

    Computational geometry emerged from the field of algorithms design and analysis in the late 1970s. It has grown into a recognized discipline with its own journals, conferences, and a large community of active researchers. The success of the field as a research discipline can on the one hand be explained from the beauty of the problems studied and the solutions obtained, and, on the other hand, by the many application domains--computer graphics, geographic information systems (GIS), robotics, and others--in which geometric algorithms play a fundamental role. For many geometric problems the early algorithmic solutions were either slow or difficult to understand and implement. In recent years a number of new algorithmic techniques have been developed that improved and simplified many of the previous approaches. In this textbook we have tried to make these modern algorithmic solutions accessible to a large audience. The book has been written as a textbook for a course in computational geometry, but it can ...

  10. Computational logic: its origins and applications.

    Science.gov (United States)

    Paulson, Lawrence C

    2018-02-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.
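
    The LCF idea mentioned above — theorems as values of an abstract type that only a small trusted kernel can construct, so untrusted user code cannot compromise correctness — can be caricatured in a few lines. This is a pedagogical sketch only (Python cannot truly hide the key, and Isabelle's kernel looks nothing like this):

```python
class Theorem:
    """A theorem is a value that only the inference rules below can create."""
    _key = object()  # private capability held by the kernel

    def __init__(self, prop, key):
        if key is not Theorem._key:
            raise ValueError("theorems can only be produced by inference rules")
        self.prop = prop

def axiom_refl(x):
    return Theorem(("eq", x, x), Theorem._key)   # |- x = x

def rule_sym(thm):
    tag, a, b = thm.prop                         # from |- a = b ...
    assert tag == "eq"
    return Theorem(("eq", b, a), Theorem._key)   # ... derive |- b = a

# user code may search for proofs any way it likes, but every Theorem it
# ends up holding was built by the kernel, so correctness is preserved
t = rule_sym(axiom_refl("a"))
print(t.prop)  # ('eq', 'a', 'a')
```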

  11. Conformal geometry computational algorithms and engineering applications

    CERN Document Server

    Jin, Miao; He, Ying; Wang, Yalin

    2018-01-01

    This book offers an essential overview of computational conformal geometry applied to fundamental problems in specific engineering fields. It introduces readers to conformal geometry theory and discusses implementation issues from an engineering perspective.  The respective chapters explore fundamental problems in specific fields of application, and detail how computational conformal geometric methods can be used to solve them in a theoretically elegant and computationally efficient way. The fields covered include computer graphics, computer vision, geometric modeling, medical imaging, and wireless sensor networks. Each chapter concludes with a summary of the material covered and suggestions for further reading, and numerous illustrations and computational algorithms complement the text.  The book draws on courses given by the authors at the University of Louisiana at Lafayette, the State University of New York at Stony Brook, and Tsinghua University, and will be of interest to senior undergraduates, gradua...

  12. Evaluation report on research and development of a database system for mutual computer operation; Denshi keisanki sogo un'yo database system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    This paper describes an evaluation of the research and development of a database system for mutual computer operation, with respect to distributed database technology, multi-media technology, high-reliability technology, and mutually operable network system technology. A large number of forward-looking research results were derived, on such issues as the distribution and utilization patterns of the distributed database, the structuring of data for multi-media information, retrieval systems, flexible and high-level utilization of the network, and database protection. These achievements have been widely disclosed to the public. The most significant feature of this project is its aim of forming a network system that can be operated mutually in a multi-vendor environment. Therefore, the research and development were executed in the spirit of openness to the public and international cooperation. These efforts are represented by the organization of a rule-establishment committee, the execution of mutual interconnection experiments (including demonstration evaluations), and the development of implementation rules based on the ISO's Open Systems Interconnection (OSI). The results were compiled into the JIS as the basic reference model for open systems interconnection, and the targets shown in the basic plan were fully achieved. (NEDO)

  13. Migration of the Almaraz NPP integrated operation management system to a new computer platform

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    1996-01-01

    In all power plants it becomes necessary, with the passage of time, to migrate the initial operation management systems to adapt them to current technologies. That is a good time to improve the inclusion of data in the corporative database and standardize the system interfaces and operation, whilst maintaining data system operability. This article describes Almaraz's experience in migrating its Integrated Operation Management System to an advanced computer platform based on open systems (UNIX), a communications network (ETHERNET) and a database (ORACLE). To this effect, clear objectives and strict standards were established to facilitate the work. The most noteworthy results obtained are: better quality of information and structure in the corporative database; a standardised user interface in all applications; joint migration of the applications for Maintenance, Components and Spare Parts, Warehouses and Purchases; integration of new applications into the system; and introduction of the navigator, which allows movement around the database using all available applications. (Author)

  14. Database for environmental monitoring at nuclear facilities

    International Nuclear Information System (INIS)

    Raceanu, M.; Varlam, C.; Enache, A.; Faurescu, I.

    2006-01-01

    To ensure that an assessment can be made of the impact of nuclear facilities on the local environment, a program of environmental monitoring must be established well in advance of nuclear facility operation. Enormous amounts of data must be stored and correlated, starting with location and meteorology, through sample characterization ranging from water to different kinds of food, to radioactivity and isotopic measurements (e.g. for C-14 determination, the C-13 isotopic correction is a must). Data modelling is a well-known mechanism for describing data structures at a high level of abstraction. Such models are often used to automatically create database structures and to generate code structures used to access the databases. This has the disadvantage of losing data constraints that might be specified in data models for data checking. An embodiment of the system of the present application includes a computer-readable memory for storing a definitional data table that defines variable symbols representing respective measurable physical phenomena. The definitional data table uniquely defines the variable symbols by relating them to respective data domains for the respective phenomena represented by the symbols. Well-established rules on how the data should be stored and accessed are given in relational database theory. The theory comprises guidelines such as the avoidance of duplicated data using a technique called normalization, and how to identify the unique identifier for a database record. (author)

  15. Parapsychology and the neurosciences: a computer-based content analysis of abstracts in the database "MEDLINE" from 1975 to 1995.

    Science.gov (United States)

    Fassbender, P

    1997-04-01

    A computer-based content analysis of 109 abstracts retrieved by the subject heading "parapsychology" from the database MEDLINE for the years 1975-1995 is presented. Data were analyzed using four categories of terms denoting (1) research methods, (2) neurosciences, (3) humanities/psychodynamics, and (4) parapsychology. Results indicated a growing interest in neuroscientific and neuropsychological explanations and theories.

  16. An online interactive geometric database including exact solutions of Einstein's field equations

    International Nuclear Information System (INIS)

    Ishak, Mustapha; Lake, Kayll

    2002-01-01

    We describe a new interactive database (GRDB) of geometric objects in the general area of differential geometry. Database objects include, but are not restricted to, exact solutions of Einstein's field equations. GRDB is designed for researchers (and teachers) in applied mathematics, physics and related fields. The flexible search environment allows the database to be useful over a wide spectrum of interests, for example, from practical considerations of neutron star models in astrophysics to abstract space-time classification schemes. The database is built using a modular and object-oriented design and uses several Java technologies (e.g. Applets, Servlets, JDBC). These are platform-independent and well adapted for applications developed for the World Wide Web. GRDB is accompanied by a virtual calculator (GRTensorJ), a graphical user interface to the computer algebra system GRTensorII, used to perform online coordinate, tetrad or basis calculations. The highly interactive nature of GRDB allows systematic internal self-checking and minimization of the required internal records. This new database is now available online at http://grdb.org

  17. Application of bioinformatics tools and databases in microbial dehalogenation research (a review).

    Science.gov (United States)

    Satpathy, R; Konkimalla, V B; Ratha, J

    2015-01-01

    Microbial dehalogenation is a biochemical process in which halogenated substances are enzymatically converted into their non-halogenated forms. Microorganisms have a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds, being pollutants, need to be remediated; therefore, current approaches explore the potential of microbes at the molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge in the field of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Therefore, with the discovery of a microorganism, one can predict genes and proteins and perform sequence analysis, structural modelling, metabolic pathway analysis, biodegradation studies and so on. This review highlights bioinformatics approaches and describes the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modelling methods comprising gene finding, protein modelling, Quantitative Structure-Biodegradability Relationship (QSBR) studies and the reconstruction of metabolic pathways employed in the dehalogenation research area.

  18. Profiling an application for power consumption during execution on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Peters, Amanda E.; Ratterman, Joseph D.; Smith, Brian E.

    2012-08-21

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.

  19. A computer-controlled conformal radiotherapy system. IV: Electronic chart

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; McShan, Daniel L.; Matrone, Gwynne M.; Weaver, Tamar A.; Lewis, James D.; Kessler, Marc L.

    1995-01-01

    Purpose: The design and implementation of a system for electronically tracking relevant plan, prescription, and treatment data for computer-controlled conformal radiation therapy is described. Methods and Materials: The electronic charting system is implemented on a computer cluster coupled by high-speed networks to computer-controlled therapy machines. A methodical approach to the specification and design of an integrated solution has been used in developing the system. The electronic chart system is designed to allow identification and access of patient-specific data including treatment-planning data, treatment prescription information, and charting of doses. An in-house developed database system is used to provide an integrated approach to the database requirements of the design. A hierarchy of databases is used for both centralization and distribution of the treatment data for specific treatment machines. Results: The basic electronic database system has been implemented and has been in use since July 1993. The system has been used to download and manage treatment data on all patients treated on our first fully computer-controlled treatment machine. To date, electronic dose charting functions have not been fully implemented clinically, requiring the continued use of paper charting for dose tracking. Conclusions: The routine clinical application of complex computer-controlled conformal treatment procedures requires the management of large quantities of information for describing and tracking treatments. An integrated and comprehensive approach to this problem has led to a full electronic chart for conformal radiation therapy treatments.

  20. Budget-based power consumption for application execution on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J; Inglett, Todd A; Ratterman, Joseph D

    2012-10-23

    Methods, apparatus, and products are disclosed for budget-based power consumption for application execution on a plurality of compute nodes that include: assigning an execution priority to each of one or more applications; executing, on the plurality of compute nodes, the applications according to the execution priorities assigned to the applications at an initial power level provided to the compute nodes until a predetermined power consumption threshold is reached; and applying, upon reaching the predetermined power consumption threshold, one or more power conservation actions to reduce power consumption of the plurality of compute nodes during execution of the applications.
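
    The scheme in the abstract can be paraphrased as a small scheduling loop: run applications in priority order at an initial power level, and once cumulative consumption crosses the threshold, apply a conservation action. The cost model and all numbers below are invented for illustration:

```python
def run_with_power_budget(apps, threshold, initial_power, conserved_power):
    """apps: (priority, name, work_units) tuples; lower priority number runs first."""
    power = initial_power
    consumed = 0.0
    for _priority, name, work in sorted(apps):
        if consumed >= threshold and power > conserved_power:
            # conservation action: drop the power level provided to the nodes
            power = conserved_power
            print("power budget threshold reached; applying conservation action")
        consumed += work * power
        print(f"ran {name} at power level {power} (total consumed {consumed:.1f})")

run_with_power_budget(
    [(0, "climate-sim", 10), (1, "post-process", 4), (2, "viz", 2)],
    threshold=12.0, initial_power=1.0, conserved_power=0.6)
```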

  1. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  2. Similarity joins in relational database systems

    CERN Document Server

    Augsten, Nikolaus

    2013-01-01

    State-of-the-art database systems manage and process a variety of complex objects, including strings and trees. For such objects equality comparisons are often not meaningful and must be replaced by similarity comparisons. This book describes the concepts and techniques to incorporate similarity into database systems. We start out by discussing the properties of strings and trees, and identify the edit distance as the de facto standard for comparing complex objects. Since the edit distance is computationally expensive, token-based distances have been introduced to speed up edit distance comput
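
    Both ingredients named above — the edit distance as the comparison measure and a cheaper token-based distance used to prune candidate pairs — fit in a short sketch. The zero-overlap q-gram filter used here is a crude stand-in for the count-bound filters real systems use, and it can miss close pairs of very short strings:

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def qgrams(s, q=2):
    return {s[i:i + q] for i in range(len(s) - q + 1)}

def similarity_join(strings, max_dist=2):
    """Pairs within max_dist edits, pre-filtered by q-gram overlap."""
    out = []
    for i in range(len(strings)):
        for j in range(i + 1, len(strings)):
            a, b = strings[i], strings[j]
            if not qgrams(a) & qgrams(b):
                continue  # cheap token-based filter skips hopeless pairs
            if edit_distance(a, b) <= max_dist:
                out.append((a, b))
    return out

print(similarity_join(["database", "databse", "dagabase", "graph"]))
```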

  3. Technical property and application of industrial computed tomography

    International Nuclear Information System (INIS)

    Sun Lingxia; Ye Yunchang

    2006-01-01

    The main technical properties of industrial computed tomography (ICT) and its application in non-destructive testing (NDT) are described, and some examples of ICT applications are given in such fields as defect detection, welding quality, density uniformity, structure analysis and assembly quality. (authors)

  4. Elements of quantum computing history, theories and engineering applications

    CERN Document Server

    Akama, Seiki

    2015-01-01

    A quantum computer is a computer based on a computational model that uses quantum mechanics, a subfield of physics that studies phenomena at the micro level. Interest in quantum computing grew during the 1990s, and some experimental quantum computers have recently been implemented. Quantum computers enable super-fast computation and can solve some important problems whose solutions were regarded as impossible or intractable on traditional computers. This book provides a quick introduction to quantum computing for readers who have no background in either the theory of computation or quantum mechanics. "Elements of Quantum Computing" presents the history, theories, and engineering applications of quantum computing. The book is suitable for computer scientists, physicists, and software engineers.

  5. Computer, Informatics, Cybernetics and Applications : Proceedings of the CICA 2011

    CERN Document Server

    Hua, Ertian; Lin, Yun; Liu, Xiaozhu

    2012-01-01

    Computer Informatics Cybernetics and Applications offers 91 papers chosen for publication from among 184 papers accepted for presentation to the International Conference on Computer, Informatics, Cybernetics and Applications 2011 (CICA 2011), held in Hangzhou, China, September 13-16, 2011. The CICA 2011 conference provided a forum for engineers and scientists in academia, industry, and government to address the most innovative research and development including technical challenges and social, legal, political, and economic issues, and to present and discuss their ideas, results, work in progress and experience on all aspects of Computer, Informatics, Cybernetics and Applications. Reflecting the broad scope of the conference, the contents are organized in these topical categories: Communication Technologies and Applications Intelligence and Biometrics Technologies Networks Systems and Web Technologies Data Modeling and Programming Languages Digital Image Processing Optimization and Scheduling Education and In...

  6. Tsunami early warning in the Mediterranean: role, structure and tricks of pre-computed tsunami simulation databases and matching/forecasting algorithms

    Science.gov (United States)

    Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano

    2014-05-01

    The general idea that pre-computed simulated scenario databases can play a key role in conceiving tsunami early warning systems is by now commonly accepted. But it was only in the last decade that it started to be applied to the Mediterranean region, gaining particular impetus from initiatives like the GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects, and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss some results we obtained regarding two major topics, namely the strategies applicable to building the tsunami scenario database, and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real time. As for the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve (1) the choice of the main tectonic tsunamigenic sources (or areas), (2) their tessellation with matrices of elementary faults whose dimensions heavily depend on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, (3) the computation of the scenarios themselves, and (4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and in all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. In particular, we analyse the
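
    A hedged sketch of how such a combination step might look: given the first magnitude and epicentre estimate, select the elementary faults of the MSDB that cover the source area and combine their pre-computed unit responses linearly into a forecast. Every name, the distance criterion and the magnitude scaling below are invented for illustration, not taken from the paper:

```python
def combine_scenarios(msdb, epicentre, magnitude, radius_km=50.0):
    """msdb: list of {'location': (x, y), 'unit_amplitude': {gauge: metres}}."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    weight = 10 ** (magnitude - 7.0)  # crude, purely illustrative scaling
    forecast = {}
    for s in msdb:
        if dist(s["location"], epicentre) <= radius_km:
            for gauge, amp in s["unit_amplitude"].items():
                forecast[gauge] = forecast.get(gauge, 0.0) + weight * amp
    return forecast

msdb = [{"location": (0, 0), "unit_amplitude": {"gauge_A": 0.12, "gauge_B": 0.05}},
        {"location": (30, 10), "unit_amplitude": {"gauge_A": 0.08}},
        {"location": (500, 0), "unit_amplitude": {"gauge_A": 0.50}}]
print(combine_scenarios(msdb, epicentre=(10, 5), magnitude=7.3))
```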

  7. Scalable Transactions for Web Applications in the Cloud

    NARCIS (Netherlands)

    Zhou, W.; Pierre, G.E.O.; Chi, C.-H.

    2009-01-01

    Cloud Computing platforms provide scalability and high availability properties for web applications but they sacrifice data consistency at the same time. However, many applications cannot afford any data inconsistency. We present a scalable transaction manager for NoSQL cloud database services to

  8. Cloud computing applications for biomedical science: A perspective.

    Science.gov (United States)

    Navale, Vivek; Bourne, Philip E

    2018-06-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

  9. Restricted access processor - An application of computer security technology

    Science.gov (United States)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  10. Design and implementation of typical target image database system

    International Nuclear Information System (INIS)

    Qin Kai; Zhao Yingjun

    2010-01-01

    It is necessary to provide essential background data and thematic data in a timely manner in image processing and its applications; in fact, an application is a procedure that integrates and analyzes different kinds of data. In this paper, the authors describe the design and structure of an image database system that classifies, stores, manages and analyzes databases of different types, such as image databases, vector databases, spatial databases and spatial target characteristics databases. (authors)

  11. Soft computing approach to 3D lung nodule segmentation in CT.

    Science.gov (United States)

    Badura, P; Pietka, E

    2014-10-01

    This paper presents a novel, multilevel approach to the segmentation of various types of pulmonary nodules in computed tomography studies. It is based on two branches of computational intelligence: fuzzy connectedness (FC) and evolutionary computation. First, the image and auxiliary data are prepared for the 3D FC analysis during the first stage of the algorithm - mask generation. Its main goal is to process some specific types of nodules connected to the pleura or vessels. It consists of basic image processing operations as well as dedicated routines for the specific cases of nodules. The evolutionary computation is performed on the image and seed points in order to shorten the FC analysis and improve its accuracy. After the FC application, the remaining vessels are removed during the postprocessing stage. The method has been validated using the first dataset of studies acquired and described by the Lung Image Database Consortium (LIDC) and by its latest release - the LIDC-IDRI (Image Database Resource Initiative) database. Copyright © 2014 Elsevier Ltd. All rights reserved.
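
    The fuzzy connectedness computation underlying the method admits a compact sketch: the connectivity of a pixel to a seed is the strength of the best path, where a path is only as strong as its weakest affinity link, and the map is computed Dijkstra-style. The toy intensity-difference affinity below is an assumption for illustration, not the paper's affinity function:

```python
import heapq

def fuzzy_connectedness(image, seed):
    """Map each pixel to its fuzzy connectedness to the seed (values in [0, 1])."""
    h, w = len(image), len(image[0])

    def affinity(p, q):
        # toy affinity: high when neighbouring intensities are similar
        return 1.0 - abs(image[p[0]][p[1]] - image[q[0]][q[1]])

    conn = {seed: 1.0}
    heap = [(-1.0, seed)]
    while heap:
        neg, (r, c) = heapq.heappop(heap)
        strength = -neg
        if strength < conn.get((r, c), 0.0):
            continue  # stale heap entry
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w:
                # a path is only as strong as its weakest link
                s = min(strength, affinity((r, c), (nr, nc)))
                if s > conn.get((nr, nc), 0.0):
                    conn[(nr, nc)] = s
                    heapq.heappush(heap, (-s, (nr, nc)))
    return conn

image = [[0.9, 0.9, 0.1],
         [0.8, 0.9, 0.1],
         [0.1, 0.1, 0.1]]
conn = fuzzy_connectedness(image, (0, 0))
# thresholding the connectivity map yields the segmented object
print(sorted(p for p, s in conn.items() if s > 0.5))
```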

  12. Computational Intelligence and Decision Making Trends and Applications

    CERN Document Server

    Madureira, Ana; Marques, Viriato

    2013-01-01

    This book provides a general overview and original analysis of new developments and applications in several areas of Computational Intelligence and Information Systems. Computational Intelligence has become an important tool for engineers to develop and analyze novel techniques to solve problems in basic sciences such as physics, chemistry, biology, engineering, environment and social sciences.   The material contained in this book addresses the foundations and applications of Artificial Intelligence and Decision Support Systems, Complex and Biological Inspired Systems, Simulation and Evolution of Real and Artificial Life Forms, Intelligent Models and Control Systems, Knowledge and Learning Technologies, Web Semantics and Ontologies, Intelligent Tutoring Systems, Intelligent Power Systems, Self-Organized and Distributed Systems, Intelligent Manufacturing Systems and Affective Computing. The contributions have all been written by international experts, who provide current views on the topics discussed and pr...

  13. Programming database tools for the casual user

    International Nuclear Information System (INIS)

    Katz, R.A.; Griffiths, C.

    1990-01-01

    The AGS Distributed Control System (AGSDCS) uses a relational database management system (INTERBASE) for the storage of all data associated with the control of the particle accelerator complex. This includes the static data which describes the component devices of the complex, as well as data for application program startup and data records that are used in analysis. Due to licensing constraints, it was necessary to develop tools that allow programs requiring database access to be unconcerned with whether or not they were running on a licensed node. An in-house database server program was written, using Apollo mailbox communication protocols, allowing application programs, via calls to this server, to access the INTERBASE database. Initially, the tools used by the server to actually access the database were written using the GDML C host language interface. Through an evolutionary learning process these tools have been converted to Dynamic SQL. Additionally, these tools have been extracted from the exclusive province of the database server and placed in their own library. This enables application programs to use these same tools on a licensed node without using the database server and without having to modify the application code. The syntax of the C calls remains the same.

  14. Database theory and SQL practice using Access

    International Nuclear Information System (INIS)

    Kim, Gyeong Min; Lee, Myeong Jin

    2001-01-01

    This book introduces database theory and SQL practice using Access. It comprises seven chapters, covering: an understanding of databases, with basic concepts and DBMSs; an understanding of relational databases, with examples; building database tables and inputting data using Access 2000; an introduction to Structured Query Language; managing and building complex queries using SQL; advanced SQL commands, with an understanding of joins and virtual tables; and the design of a database for an online bookstore in six steps, with the building of an application covering its functions, structure and components, an understanding of its principles and operation, and a check of the programming source for the application menu.
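
    The book's recurring themes — building tables, querying them, and forming joins into virtual tables — can be tried outside Access as well. Below is a minimal example using Python's built-in SQLite module, with a toy bookstore schema in the spirit of the book's closing chapters; all table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER,
                    FOREIGN KEY (author_id) REFERENCES authors(id));
INSERT INTO authors VALUES (1, 'A. Kim'), (2, 'B. Lee');
INSERT INTO books VALUES (10, 'Intro to SQL', 1), (11, 'Join Basics', 2);
""")
# a join produces a "virtual table" pairing books with their authors
for row in conn.execute("""
        SELECT books.title, authors.name
        FROM books JOIN authors ON books.author_id = authors.id
        ORDER BY books.title"""):
    print(row)
```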

  15. Application of Computer Technology to Educational Administration in the United States.

    Science.gov (United States)

    Bozeman, William C.; And Others

    1991-01-01

    Description of evolution of computer applications in U.S. educational administration is followed by an overview of the structure and governance of public education and Visscher's developmental framework. Typical administrative computer applications in education are discussed, including student records, personnel management, budgeting, library…

  16. The acceptability of computer applications to group practices.

    Science.gov (United States)

    Zimmerman, J; Gordon, R S; Tao, D K; Boxerman, S B

    1978-01-01

    Of the 72 identified group practices in a midwest urban environment, 39 were found to use computers. The practices had been influenced strongly by vendors in their selection of an automated system or service, and had usually spent less than a work-month analyzing their needs and reviewing alternate ways in which those needs could be met. Ninety-seven percent of the practices had some financial applications and 64% had administrative applications, but only 2.5% had medical applications. For half the practices at least 2 months elapsed from the time the automated applications were put into operation until they were considered to be integrated into the office routine. Advantages experienced by at least a third of the practices using computers were that the work was done faster, information was more readily available, and costs were reduced. The most common disadvantage was inflexibility. Most (89%) of the practices believed that automation was preferable to their previous manual system.

  17. Computational materials design

    International Nuclear Information System (INIS)

    Snyder, R.L.

    1999-01-01

    Full text: Trial-and-error experimentation is an extremely expensive route to the development of new materials. The coming age of reduced defense funding will dramatically alter the way in which advanced materials are developed. In the absence of large funding we must concentrate on reducing the time and expense that the R and D of a new material consumes. This may be accomplished through the development of computational materials science. Materials are selected today by comparing the technical requirements to the materials databases. When existing materials cannot meet the requirements, we explore new systems to develop a new material using experimental databases like the PDF. After proof of concept, the scaling of the new material to manufacture requires evaluating millions of parameter combinations to optimize the performance of the new device. Historically this process takes 10 to 20 years and requires hundreds of millions of dollars. The development of a focused set of computational tools to predict the final properties of new materials will permit the exploration of new materials systems with only a limited amount of materials characterization. However, to bound computational extrapolations, the experimental formulations and characterization will need to be tightly coupled to the computational tasks. The required experimental data must be obtained by dynamic, in-situ, very rapid characterization. Finally, to evaluate the optimization matrix required to manufacture the new material, very rapid in-situ analysis techniques will be essential to intelligently monitor and optimize the formation of a desired microstructure. Techniques and examples for the rapid real-time application of XRPD and optical microscopy will be shown. Recent developments in the cross-linking of the world's structural and diffraction databases will be presented as the basis for the future Total Pattern Analysis by XRPD. Copyright (1999) Australian X-ray Analytical Association Inc

  18. Development of a personalized training system using the Lung Image Database Consortium and Image Database resource Initiative Database.

    Science.gov (United States)

    Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong

    2014-12-01

    The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability to dynamically select suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database, which provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool to enable trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful for enhancing the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
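
    The CBCF idea named above can be sketched in two steps: first fill each trainee's missing difficulty ratings from content similarity between cases, then predict from the filled-in profiles of similar trainees. The data shapes, feature vectors and weighting below are illustrative assumptions, not the published algorithm:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def cbcf_predict(ratings, case_features, trainee, case):
    """ratings[t][c] -> observed difficulty; case_features[c] -> feature vector."""
    # content step: fill each trainee's missing ratings from similar cases
    dense = {}
    for t, rated in ratings.items():
        dense[t] = dict(rated)
        for c in case_features:
            if c not in rated:
                sims = [(cosine(case_features[c], case_features[k]), r)
                        for k, r in rated.items()]
                wsum = sum(s for s, _ in sims)
                dense[t][c] = sum(s * r for s, r in sims) / wsum if wsum else 0.0
    # collaborative step: weight other trainees by profile similarity
    cases = sorted(case_features)
    target = [dense[trainee][c] for c in cases]
    num = den = 0.0
    for t, prof in dense.items():
        if t == trainee:
            continue
        w = cosine(target, [prof[c] for c in cases])
        num += w * prof[case]
        den += abs(w)
    return num / den if den else dense[trainee][case]

ratings = {"t1": {"c1": 2.0, "c2": 4.0},
           "t2": {"c1": 2.5, "c2": 4.5, "c3": 5.0}}
case_features = {"c1": [1, 0, 0], "c2": [0, 1, 1], "c3": [0, 1, 0]}
print(round(cbcf_predict(ratings, case_features, "t1", "c3"), 2))
```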

  19. A SNP-centric database for the investigation of the human genome

    Directory of Open Access Journals (Sweden)

    Kohane Isaac S

    2004-03-01

    Full Text Available Abstract Background Single Nucleotide Polymorphisms (SNPs) are an increasingly important tool for genetic and biomedical research. Although current genomic databases contain information on several million SNPs and are growing at a very fast rate, the true value of a SNP in this context is a function of the quality of the annotations that characterize it. Retrieving and analyzing such data for a large number of SNPs often represents a major bottleneck in the design of large-scale association studies. Description SNPper is a web-based application designed to facilitate the retrieval and use of human SNPs for high-throughput research purposes. It provides a rich local database generated by combining SNP data with the Human Genome sequence and with several other data sources, and offers the user a variety of querying, visualization and data export tools. In this paper we describe the structure and organization of the SNPper database, we review the available data export and visualization options, and we describe how the architecture of SNPper and its specialized data structures support high-volume SNP analysis. Conclusions The rich annotation database and the powerful data manipulation and presentation facilities it offers make SNPper a very useful online resource for SNP research. Its success proves the great need for integrated and interoperable resources in the field of computational biology, and shows how such systems may play a critical role in supporting the large-scale computational analysis of our genome.

  20. Dynamic graph system for a semantic database

    Science.gov (United States)

    Mizell, David

    2015-01-27

    A method and system in a computer system for dynamically providing a graphical representation of a data store of entries via a matrix interface is disclosed. A dynamic graph system provides a matrix interface that exposes to an application program a graphical representation of data stored in a data store such as a semantic database storing triples. To the application program, the matrix interface represents the graph as a sparse adjacency matrix that is stored in compressed form. Each entry of the data store is considered to represent a link between nodes of the graph. Each entry has a first field and a second field identifying the nodes connected by the link and a third field with a value for the link that connects the identified nodes. The first, second, and third fields represent the rows, columns, and elements of the adjacency matrix.
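
    The representation the patent describes — data-store entries read as links of a graph and exposed as a compressed sparse adjacency matrix — can be sketched with a CSR (compressed sparse row) encoding. The encoding details below are illustrative only, not the patented implementation:

```python
def triples_to_csr(triples):
    """triples: iterable of (src, dst, value) with integer node ids."""
    n = 1 + max(max(s, d) for s, d, _ in triples)
    rows = [[] for _ in range(n)]
    for s, d, v in triples:
        rows[s].append((d, v))
    indptr, indices, data = [0], [], []
    for row in rows:
        for d, v in sorted(row):
            indices.append(d)   # column index of each stored element
            data.append(v)      # the link's value (third field of the entry)
        indptr.append(len(indices))
    return indptr, indices, data

def neighbors(csr, node):
    indptr, indices, data = csr
    lo, hi = indptr[node], indptr[node + 1]
    return list(zip(indices[lo:hi], data[lo:hi]))

# each data-store entry is one link: (node, node, link value)
csr = triples_to_csr([(0, 1, "knows"), (0, 2, "cites"), (2, 1, "knows")])
print(neighbors(csr, 0))  # [(1, 'knows'), (2, 'cites')]
```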