WorldWideScience

Sample records for computer database application

  1. Computer Application Of Object Oriented Database Management ...

    African Journals Online (AJOL)

    Object Oriented Systems (OOS) have been widely adopted in software engineering because of their superiority with respect to data extensibility. The present trend in the software engineering process (SEP) towards concurrent computing raises novel concerns for the facilities and technology available in database ...

  2. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner dos Santos

    2016-01-01

The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN), in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to safety, reliability, availability and maintainability of these facilities. This database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they will be able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, which is in accordance with the protocols installed in the IPEN intranet. The open-source relational database management system MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)
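
The abstract above describes an entity-relationship model for component operation, failure and maintenance records. A minimal sketch of such a schema, with hypothetical table and column names (the actual PSADB schema is not published in the abstract) and Python's built-in sqlite3 standing in for MySQL:

```python
import sqlite3

# Hypothetical miniature of a component reliability schema; the real
# PSADB tables and columns are not given in the abstract.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE component (
    id     INTEGER PRIMARY KEY,
    name   TEXT NOT NULL,
    system TEXT NOT NULL            -- e.g. 'primary cooling'
);
CREATE TABLE failure_event (
    id              INTEGER PRIMARY KEY,
    component_id    INTEGER NOT NULL REFERENCES component(id),
    event_date      TEXT NOT NULL,  -- ISO 8601 date
    hours_to_repair REAL
);
""")
conn.execute("INSERT INTO component VALUES (1, 'Pump P-101', 'primary cooling')")
conn.execute("INSERT INTO failure_event VALUES (1, 1, '2015-03-02', 6.5)")

# A PSA-style summary query: failure count and mean repair time per component.
row = conn.execute("""
    SELECT c.name, COUNT(f.id), AVG(f.hours_to_repair)
    FROM component c JOIN failure_event f ON f.component_id = c.id
    GROUP BY c.id
""").fetchone()
print(row)  # ('Pump P-101', 1, 6.5)
```

Queries of this shape are what turn raw operating records into the failure-rate and maintainability inputs a Probabilistic Safety Analysis needs.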

  3. Database computing in HEP

    International Nuclear Information System (INIS)

    Day, C.T.; Loken, S.; MacFarlane, J.F.; May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Baden, A.

    1992-01-01

The major SSC experiments are expected to produce up to 1 Petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes based on relational and object-oriented databases of CDF data samples.
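
One way database techniques reduce analysis I/O, as the abstract argues, is by letting the engine touch only the rows an analysis selects instead of scanning every event. A toy illustration (not CDF data) with sqlite3, where an index on an event attribute turns a full scan into a targeted lookup:

```python
import sqlite3

# Toy event table: 10,000 synthetic 'events' with a run number and an
# energy attribute (illustrative values, not physics data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event (run INTEGER, energy REAL)")
conn.executemany("INSERT INTO event VALUES (?, ?)",
                 [(r, float(r % 100)) for r in range(10_000)])
conn.execute("CREATE INDEX idx_energy ON event(energy)")

# The query plan confirms the engine searches via the index rather than
# reading every row -- the kind of I/O saving the abstract describes.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT run FROM event WHERE energy = 42.0"
).fetchone()
print(plan[-1])  # detail string mentions idx_energy, not a full scan
```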

  4. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Lewis, G.; Howman-Giles, R.

    1999-01-01

Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images are presented and reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage is rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on a personal computer database is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database.

  5. Computer application for database management and networking of service radio physics

    International Nuclear Information System (INIS)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-01-01

Databases in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the centre's computer network. A computer that acts as the server provides the database to the treatment units, where quality control measurements and incidents are recorded daily. To avoid common problems, such as shortcuts that stop working after data migration, possible use of duplicates, and erroneous data loss caused by failing network connections, we proceeded to manage the connections and database access centrally, easing maintenance and use for all service personnel.
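
The statistical process control the abstract mentions typically reduces to checking daily measurements against control limits. A minimal, generic sketch (not the service's Access implementation) using Shewhart-style mean ± 3σ limits:

```python
from statistics import mean, stdev

def control_limits(measurements):
    """Shewhart-style control limits: mean +/- 3 standard deviations,
    estimated from an in-control baseline sample."""
    m, s = mean(measurements), stdev(measurements)
    return m - 3 * s, m + 3 * s

# Illustrative daily QC readings (e.g. output in arbitrary units).
baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.0]
lo, hi = control_limits(baseline)

# A new reading of 103.5 falls outside the limits and would be flagged.
out_of_control = [x for x in baseline + [103.5] if not lo <= x <= hi]
print(out_of_control)  # [103.5]
```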

  6. Database Application Schema Forensics

    Directory of Open Access Journals (Sweden)

    Hector Quintus Beyers

    2014-12-01

Full Text Available The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database, or operators of the database can be altered to deliver incorrect results when used in queries. This paper discusses categories of possibilities that exist to alter the application schema, with some practical examples. Two forensic environments in which a forensic investigation can take place are introduced, and arguments are provided as to why these environments are important. Methods are presented for achieving these environments for the application schema layer of a DBMS. A process is proposed for extracting forensic evidence from the application schema layer of a DBMS. The application schema forensic evidence identification process can be applied to a wide range of forensic settings.
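
Detecting the metadata tampering the paper describes starts with comparing a suspect database's schema against a trusted reference. A minimal sketch of that idea (generic illustration, not the paper's process) using SQLite, whose own metadata table `sqlite_master` stores every table definition:

```python
import os
import sqlite3
import tempfile

def schema_of(path):
    """Read table definitions from a SQLite file's own metadata catalog."""
    with sqlite3.connect(path) as conn:
        return dict(conn.execute(
            "SELECT name, sql FROM sqlite_master WHERE type = 'table'"))

workdir = tempfile.mkdtemp()
ref = os.path.join(workdir, "reference.db")
suspect = os.path.join(workdir, "suspect.db")

# Build a reference copy and a copy whose column metadata was altered
# (REAL silently changed to TEXT -- the kind of corruption described).
for path, coltype in [(ref, "REAL"), (suspect, "TEXT")]:
    with sqlite3.connect(path) as conn:
        conn.execute(f"CREATE TABLE price (item TEXT, amount {coltype})")

tampered = [name for name, sql in schema_of(ref).items()
            if schema_of(suspect).get(name) != sql]
print(tampered)  # ['price']
```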

  7. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as to provide a system that simplifies the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database, and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  8. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

The current radiation effect assessment system requires skilled application of various codes and a high level of specialized knowledge in each field. Consequently, it is very difficult for radiation users who lack such specialized knowledge to assess or recognize radiation effects properly. To address this, we had already developed five Windows-based computer codes constituting the radiation effect assessment system for fields that use radiation, including nuclear power generation. A computer program was needed so that non-specialists could easily use these five codes. We therefore implemented an AI-based expert system that can infer the appropriate assessment procedure by itself, according to the characteristics of the given problem. The expert program can guide users, search data, and query the administrator directly. Conceptually, considering the circumstances a user applying the five codes may actually encounter, we addressed the following aspects. First, access to the necessary concepts and data must be improved. Second, acquiring the underlying theory and using the corresponding computer code must be easy. Third, a Q&A function is needed to resolve user questions not previously considered. Finally, the database must be updated continuously. To meet these requirements, we developed a client program to organize reference data, built the access methodology (queries) for the organized data, and implemented visual presentation of search results. An instruction method (an effective procedure and methodology for acquiring the underlying theory) referring to the five computer codes was implemented, and a data-structure access program (DBMS) was developed to update the data easily and continuously. For the Q&A function, a Q&A board was implemented within the client program so that users can search the contents of questions and answers. (authors)

  9. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples.

  10. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  11. Computational 2D Materials Database

    DEFF Research Database (Denmark)

    Rasmussen, Filip Anselm; Thygesen, Kristian Sommer

    2015-01-01

We present a comprehensive first-principles study of the electronic structure of 51 semiconducting monolayer transition-metal dichalcogenides and -oxides in the 2H and 1T hexagonal phases. The quasiparticle (QP) band structures with spin-orbit coupling are calculated in the G(0)W(0) approximation, and comparison is made with different density functional theory descriptions. Pitfalls related to the convergence of GW calculations for two-dimensional (2D) materials are discussed together with possible solutions. The monolayer band edge positions relative to vacuum are used to estimate the band alignment and used as input to a 2D hydrogenic model to estimate exciton binding energies. Throughout the paper we focus on trends and correlations in the electronic structure rather than detailed analysis of specific materials. All the computed data is available in an open database.

  12. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    Science.gov (United States)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

The goal of this task was to create a design and prototype implementation of a database environment that is particularly suited for handling the image, vision and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities that are designed to execute efficiently on parallel computers. A key feature of the environment is an interface which allows a scientist to specify high-level directives about how query execution should occur.

  13. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

This book constitutes the refereed proceedings of the 16th International Conference on Database and Expert Systems Applications, DEXA 2005, held in Copenhagen, Denmark, in August 2005. The 92 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 390 submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, XML schemata, query evaluation, semantic processing, information retrieval, temporal and spatial databases, querying XML, organisational aspects of databases, natural language processing, ontologies, Web data extraction, semantic Web, data stream management, data extraction, and distributed database systems.

  14. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    Science.gov (United States)

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested to assess these computed tools theoretically: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encloses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
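
The first case study above concerns indexes estimating pulse wave velocity. The simplest such estimate, which a virtual-subject database can benchmark against known ground truth, is path length divided by pulse transit time (a generic textbook formula, not the paper's specific algorithm):

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Foot-to-foot estimate: PWV = arterial path length / transit time.
    In a virtual-subject database both inputs are known exactly, so the
    index can be validated free of measurement error."""
    return path_length_m / transit_time_s

# Illustrative carotid-femoral numbers: 0.5 m path, 60 ms transit time.
pwv = pulse_wave_velocity(0.5, 0.06)
print(round(pwv, 1))  # 8.3 (m/s)
```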

15. Computer programme for control and maintenance and object oriented database: application to the realisation of a particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    Diaz, A.

    1996-01-01

The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and front-end computers using VME standards, the whole within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication has been utilised to transmit all the necessary information to display parameters within the front-end computers on the graphic screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and bypasses the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  16. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  17. Database characterisation of HEP applications

    International Nuclear Information System (INIS)

    Piorkowski, Mariusz; Grancher, Eric; Topurov, Anton

    2012-01-01

Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications; these performance characterisations were used to deliver improved predictability and scalability, as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  18. Distributed Database Access in the LHC Computing Grid with CORAL

    CERN Document Server

    Molnár, Z; Düllmann, D; Giacomo, G; Kalkhof, A; Valassi, A; CERN. Geneva. IT Department

    2009-01-01

The CORAL package is the LCG Persistency Framework foundation for accessing relational databases. From the start, CORAL has been designed to facilitate the deployment of the LHC experiment database applications in a distributed computing environment. In particular we cover: improvements to database service scalability by client connection management; platform-independent, multi-tier scalable database access by connection multiplexing and caching; and a secure authentication and authorisation scheme integrated with existing grid services. We summarize the deployment experience from several experiment productions using the distributed database infrastructure, which is now available in LCG. Finally, we present perspectives for future developments in this area.
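
The connection management mentioned above rests on a simple idea: many logical client requests share few physical database connections. A toy sketch of that caching pattern (a generic illustration, not CORAL's actual C++ implementation):

```python
import sqlite3

class ConnectionPool:
    """Toy connection cache: reuse one physical connection per data
    source instead of opening a new one for every logical request --
    the kind of multiplexing the abstract refers to."""
    def __init__(self):
        self._cache = {}
        self.opened = 0  # counts physical connections actually created

    def get(self, dsn):
        if dsn not in self._cache:
            self._cache[dsn] = sqlite3.connect(dsn)
            self.opened += 1
        return self._cache[dsn]

pool = ConnectionPool()
for _ in range(100):                      # 100 logical requests ...
    pool.get(":memory:").execute("SELECT 1")
print(pool.opened)  # ... served by 1 physical connection
```

Real pools add locking, per-connection health checks and idle timeouts; the point here is only the many-to-one mapping that makes a shared database service scale.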

  19. RA radiological characterization database application

    International Nuclear Information System (INIS)

Steljic, M.M.; Ljubenov, V.Lj. (e-mail address of corresponding author: milijanas@vin.bg.ac.yu)

    2005-01-01

Radiological characterization of the RA research reactor is one of the main activities in the first two years of the reactor decommissioning project. The raw characterization data from direct measurements or laboratory analyses (defined within the existing sampling and measurement programme) have to be interpreted, organized and summarized in order to prepare the final characterization survey report. This report should be made so that the radiological condition of the entire site is completely and accurately shown, with the radiological condition of the components clearly depicted. This paper presents an electronic database application, designed as a serviceable and efficient tool for characterization data storage, review and analysis, as well as for report generation. A relational database model was designed, and the application was made using Microsoft Access 2002 (SP1), a 32-bit RDBMS for desktop and client/server database applications that run under Windows XP. (author)

  20. International Ventilation Cooling Application Database

    DEFF Research Database (Denmark)

    Holzer, Peter; Psomas, Theofanis Ch.; OSullivan, Paul

    2016-01-01

The currently running International Energy Agency, Energy and Conservation in Buildings, Annex 62 Ventilative Cooling (VC) project is coordinating research towards extended use of VC. Within this Annex 62 the joint research activity of the International VC Application Database has been carried out, systematically investigating the distribution of technologies and strategies within VC. The database is structured as both a ticking-list-like building-spreadsheet and a collection of building-datasheets. The content of both closely follows the Annex 62 State-Of-The-Art Report. The database has been filled, based ... and locations, using VC as a mean of indoor comfort improvement. The building-spreadsheet highlights distributions of technologies and strategies, such as the following. (Numbers in % refer to the sample of the database's 91 buildings.) It may be concluded that Ventilative Cooling is applied in temporary ...

  1. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  2. IAEA nuclear databases for applications

    International Nuclear Information System (INIS)

    Schwerer, Otto

    2003-01-01

    The Nuclear Data Section (NDS) of the International Atomic Energy Agency (IAEA) provides nuclear data services to scientists on a worldwide scale with particular emphasis on developing countries. More than 100 data libraries are made available cost-free by Internet, CD-ROM and other media. These databases are used for practically all areas of nuclear applications as well as basic research. An overview is given of the most important nuclear reaction and nuclear structure databases, such as EXFOR, CINDA, ENDF, NSR, ENSDF, NUDAT, and of selected special purpose libraries such as FENDL, RIPL, RNAL, the IAEA Photonuclear Data Library, and the IAEA charged-particle cross section database for medical radioisotope production. The NDS also coordinates two international nuclear data centre networks and is involved in data development activities (to create new or improve existing data libraries when the available data are inadequate) and in technology transfer to developing countries, e.g. through the installation and support of the mirror web site of the IAEA Nuclear Data Services at IPEN (operational since March 2000) and by organizing nuclear-data related workshops. By encouraging their participation in IAEA Co-ordinated Research Projects and also by compiling their experimental results in databases such as EXFOR, the NDS helps to make developing countries' contributions to nuclear science visible and conveniently available. The web address of the IAEA Nuclear Data Services is http://www.nds.iaea.org and the NDS mirror service at IPEN (Brasil) can be accessed at http://www.nds.ipen.br/ (author)

  3. Computer applications in radiation protection

    International Nuclear Information System (INIS)

    Cole, P.R.; Moores, B.M.

    1995-01-01

Computer applications in general, and in diagnostic radiology in particular, are becoming more widespread. Their application to the field of radiation protection in medical imaging, including quality control initiatives, is similarly becoming more widespread. Advances in computer technology have enabled departments of diagnostic radiology to have access to powerful yet affordable personal computers. The application of databases, expert systems and computer-based learning is under way. The executive information systems for the management of dose and QA data that are under way at IRS are discussed. An important consideration in developing these pragmatic software tools has been the range of computer literacy within the end-user group. User interfaces have been specifically designed to reflect the requirements of the many end users who will have little or no computer knowledge. (Author)

  4. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

Full Text Available In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images to graphs and then compute Minimum Spanning Trees (MSTs) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
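
Once the points of interest form a weighted graph, the minimum spanning tree is the cheapest set of connections that reaches them all. A compact sketch of that core computation using Prim's algorithm (a standard MST method; the paper's own pipeline and image-to-graph step are not reproduced here):

```python
import heapq

def minimum_spanning_tree(n, edges):
    """Prim's algorithm on an undirected weighted graph.
    n: number of vertices (0..n-1); edges: list of (weight, u, v).
    Returns the total weight of the MST (assumes a connected graph)."""
    adj = [[] for _ in range(n)]
    for w, u, v in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    seen, total = {0}, 0
    heap = list(adj[0])           # frontier edges, cheapest first
    heapq.heapify(heap)
    while heap and len(seen) < n:
        w, v = heapq.heappop(heap)
        if v not in seen:         # take the cheapest edge to a new vertex
            seen.add(v)
            total += w
            for edge in adj[v]:
                heapq.heappush(heap, edge)
    return total

# Four points of interest; weights could be walking distances in metres.
print(minimum_spanning_tree(4, [(1, 0, 1), (4, 0, 2), (2, 1, 2), (3, 2, 3)]))  # 6
```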

  5. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  6. Design of multi-tiered database application based on CORBA component

    International Nuclear Information System (INIS)

    Sun Xiaoying; Dai Zhimin

    2003-01-01

As computer technology develops quickly, middleware technology has changed the traditional two-tier database system. The multi-tiered database system, consisting of client application programs, application servers and database servers, is now mainly applied, and building multi-tiered database systems using CORBA components has become the mainstream technique. In this paper, an example of the DUV-FEL database system is presented, and the realization of a multi-tiered database based on CORBA components is discussed. (authors)

  7. A database application for wilderness character monitoring

    Science.gov (United States)

    Ashley Adams; Peter Landres; Simon Kingston

    2012-01-01

    The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...

  8. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
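
The abstract's key point is that sites can share intermediate results of local calculations rather than raw patient records. A minimal sketch of that principle using ordinary least squares (a deliberately simpler model than the paper's site-stratified Cox model), where each site contributes only its XᵀX and Xᵀy summaries and the coordinator recovers the exact pooled fit:

```python
import numpy as np

def local_summaries(X, y):
    """Each site shares only X'X and X'y -- never the raw records."""
    return X.T @ X, X.T @ y

# Three hypothetical sites, each holding private (X, y) data generated
# from the same underlying model (illustrative synthetic data).
rng = np.random.default_rng(0)
beta_true = np.array([2.0, -1.0])
sites = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ beta_true + rng.normal(scale=0.01, size=50)
    sites.append(local_summaries(X, y))

# Coordinator aggregates only the intermediate summaries, then solves
# the pooled normal equations: identical to fitting on combined data.
XtX = sum(s[0] for s in sites)
Xty = sum(s[1] for s in sites)
beta_hat = np.linalg.solve(XtX, Xty)
print(np.round(beta_hat, 2))  # close to the true [2, -1]
```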

  9. Computer applications in the nuclear reprocessing industry

    International Nuclear Information System (INIS)

    McKenzie, H.G.; Swartfigure, G.T.

    1985-01-01

    The subject is discussed under the headings: introduction; benefits of computer application; factors affecting productivity; implementation of engineering design systems; the conceptual model; system design database; plant design system; pipe detailing system; overall assessment of benefits; conclusions. (U.K.)

  10. Professional iOS database application programming

    CERN Document Server

    Alessi, Patrick

    2013-01-01

    Updated and revised coverage that includes the latest versions of iOS and Xcode Whether you're a novice or experienced developer, you will want to dive into this updated resource on database application programming for the iPhone and iPad. Packed with more than 50 percent new and revised material - including completely rebuilt code, screenshots, and full coverage of new features pertaining to database programming and enterprise integration in iOS 6 - this must-have book intends to continue the precedent set by the previous edition by helping thousands of developers master database

  11. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state of the art research on GPU Computing and Application. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held in Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book including (1) Engineering design and simulation; (2) Biomedical Sciences; and (3) Interactive & Digital Media. The book also addresses the fundamental issues in GPU computing with a focus on big data processing. Researchers and developers in GPU Computing and Applications will benefit from this book. Training professionals and educators can also benefit from this book to learn the possible application of GPU technology in various areas.

  12. Simple re-instantiation of small databases using cloud computing.

    Science.gov (United States)

    Tan, Tin Wee; Xie, Chao; De Silva, Mark; Lim, Kuan Siong; Patro, C Pawan K; Lim, Shen Jean; Govindarajan, Kunde Ramamoorthy; Tong, Joo Chuan; Choo, Khar Heng; Ranganathan, Shoba; Khan, Asif M

    2013-01-01

    Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full virtualization standard cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear.
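
The deposit-and-restore cycle described above can be sketched in a few lines: pack a small database into a single archive file, then unpack and query it on a fresh host. A generic illustration using tar.gz and SQLite (the actual BioSlax system uses an ".lzm" image and full virtual machines, which this sketch does not reproduce):

```python
import os
import sqlite3
import tarfile
import tempfile

# 1. A depositor's small database (toy sequence table).
work = tempfile.mkdtemp()
db = os.path.join(work, "smalldb.sqlite")
with sqlite3.connect(db) as conn:
    conn.execute("CREATE TABLE seq (id TEXT, residues TEXT)")
    conn.execute("INSERT INTO seq VALUES ('P1', 'MKV')")

# 2. Deposit: compress the database into one archive file (tar.gz here
#    stands in for the '.lzm' deposition format in the abstract).
archive = os.path.join(work, "deposit.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(db, arcname="smalldb.sqlite")

# 3. Re-instantiation: unpack on a fresh location and query immediately.
restore = tempfile.mkdtemp()
with tarfile.open(archive) as tar:
    tar.extractall(restore)
row = sqlite3.connect(os.path.join(restore, "smalldb.sqlite")).execute(
    "SELECT residues FROM seq WHERE id = 'P1'").fetchone()
print(row[0])  # MKV
```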

  13. Artist Material BRDF Database for Computer Graphics Rendering

    Science.gov (United States)

    Ashbaugh, Justin C.

    The primary goal of this thesis was to create a physical library of artist material samples. This collection provides necessary data for the development of a gonio-imaging system for use in museums to more accurately document their collections. A sample set was produced consisting of 25 panels and containing nearly 600 unique samples. Selected materials are representative of those commonly used by artists both past and present. These take into account the variability in visual appearance resulting from the materials and application techniques used. Five attributes of variability were identified including medium, color, substrate, application technique and overcoat. Combinations of these attributes were selected based on those commonly observed in museum collections and suggested by surveying experts in the field. For each sample material, image data is collected and used to measure an average bi-directional reflectance distribution function (BRDF). The results are available as a public-domain image and optical database of artist materials at art-si.org. Additionally, the database includes specifications for each sample along with other information useful for computer graphics rendering such as the rectified sample images and normal maps.
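
    As a toy illustration of what a BRDF table encodes, the sketch below evaluates a simple analytic Lambertian-plus-Blinn-Phong model of the kind often fitted to measured sample data; the function name and parameter values are hypothetical and are not taken from the art-si.org database.

```python
import math

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def brdf_value(light, view, normal, kd=0.7, ks=0.2, shininess=32):
    """Evaluate a Lambertian + Blinn-Phong BRDF (per steradian) for light
    and view directions on the upper hemisphere of `normal`.
    Directions must not be exactly opposite (half vector undefined)."""
    l, v, n = normalize(light), normalize(view), normalize(normal)
    half = normalize(tuple(a + b for a, b in zip(l, v)))
    n_dot_h = max(0.0, sum(a * b for a, b in zip(n, half)))
    # Diffuse term is direction-independent; the normalized specular
    # lobe peaks in the mirror configuration.
    diffuse = kd / math.pi
    specular = ks * (shininess + 2) / (2 * math.pi) * n_dot_h ** shininess
    return diffuse + specular
```

    The model is reciprocal (swapping light and view directions leaves the value unchanged), a property that measured BRDF data is also expected to satisfy.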

  14. Applications of Evolutionary Computation

    NARCIS (Netherlands)

    Mora, Antonio M.; Squillero, Giovanni; Di Chio, C.; Agapitos, Alexandros; Cagnoni, Stefano; Cotta, Carlos; Fernández De Vega, Francisco; Di Caro, G. A.; Drechsler, R.; Ekárt, Anikó; Esparcia-Alcázar, Anna I.; Farooq, M.; Langdon, W. B.; Merelo-Guervós, J. J.; Preuss, M.; Richter, O.-M. H.; Silva, Sara; Simões, Anabela; Tarantino, Ernesto; Tettamanzi, Andrea G. B.; Togelius, J.; Urquhart, Neil; Uyar, A. S.; Yannakakis, G. N.; Smith, Stephen L.; Caserta, Marco; Ramirez, Adriana; Voß, Stefan; Burelli, Paolo; Jan, Mathieu; Matthias, M.; De Falco, Ivanoe; Cioppa, Antonio Della; Diwold, Konrad; Sim, Kevin; Haasdijk, Evert; Zhang, Mengjie; Eiben, A. E.; Glette, Kyrre; Rohlfshagen, Philipp; Schaefer, Robert

    2015-01-01

    The application of genetic and evolutionary computation to problems in medicine has increased rapidly over the past five years, but there are specific issues and challenges that distinguish it from other real-world applications. Obtaining reliable and coherent patient data, establishing the clinical...

  15. Some Aspects of Process Computers Configuration Control in Nuclear Power Plant Krsko - Process Computer Signal Configuration Database (PCSCDB)

    International Nuclear Information System (INIS)

    Mandic, D.; Kocnar, R.; Sucic, B.

    2002-01-01

    During the operation of NEK and other nuclear power plants it has been recognized that certain issues related to the usage of digital equipment and associated software in NPP technological process protection, control and monitoring are not adequately addressed in the existing programs and procedures. The term and the process of Process Computer Configuration Control join three 10CFR50 Appendix B quality requirements of process computer application in an NPP: Design Control, Document Control, and Identification and Control of Materials, Parts and Components. This paper describes the Process Computer Signal Configuration Database (PCSCDB), which was developed and implemented in order to resolve some aspects of process computer configuration control related to the signals or database points that exist in the life cycle of the different Process Computer Systems (PCS) in Nuclear Power Plant Krsko. PCSCDB is a controlled master database for the definition and description of the configurable database points associated with all Process Computer Systems in NEK. It holds attributes related to the configuration of addressable and configurable real-time database points, as well as attributes related to signal life cycle references and history data, such as: input/output signals; manually input database points; program constants; setpoints; calculated (by application program or SCADA calculation tools) database points; control flags (for example, enabling or disabling a certain program feature); signal acquisition design references to the DCM (Document Control Module, application software for document control within the Management Information System, MIS) and MECL (Master Equipment and Component List, MIS application software for identification and configuration control of plant equipment and components); usage of a particular database point in particular application software packages and in man-machine interface features (display mimics, printout reports, ...); and signals history (EEAR Engineering ...)

  16. Development and application of nuclear power operation database

    International Nuclear Information System (INIS)

    Shao Juying; Fang Zhaoxia

    1996-01-01

    The article describes the development of the Nuclear Power Operation Database, which includes the Domestic and Overseas Nuclear Event Scale Database, the Overseas Nuclear Power Operation Abnormal Event Database, the Overseas Nuclear Power Operation General Reliability Database and the Qinshan Nuclear Power Operation Abnormal Event Database. The development covered data collection and analysis, database construction and code design, and the selection of a database management system. The application of the database in supporting the safety analysis of NPPs in commercial operation is also introduced

  17. Database applications in high energy physics

    International Nuclear Information System (INIS)

    Jeffery, K.G.

    1982-01-01

    High Energy physicists were using computers to process and store their data early in the history of computing. They addressed problems of memory management, job control, job generation, data standards, file conventions, multiple simultaneous usage, tape file handling and data management earlier than, or at the same time as, the manufacturers of computing equipment. The HEP community have their own suites of programs for these functions, and are now turning their attention to the possibility of replacing some of the functional components of their 'homebrew' systems with more widely used software and/or hardware. High on the 'shopping list' for replacement is data management. ECFA Working Group 11 has been working on this problem. This paper reviews the characteristics of existing HEP systems and existing database systems and discusses the way forward. (orig.)

  18. The SQL Server Database for Non Computer Professional Teaching Reform

    Science.gov (United States)

    Liu, Xiangwei

    2012-01-01

    This article summarizes the teaching methods of the SQL Server database course for non-computer majors and analyzes the current situation of the course. In view of the teaching characteristics of the curriculum for non-computer majors, it puts forward some teaching reform methods and puts them into practice, improving the students' analysis ability, practice ability and…

  19. Applications of interval computations

    CERN Document Server

    Kreinovich, Vladik

    1996-01-01

    Primary Audience for the Book • Specialists in numerical computations who are interested in algorithms with automatic result verification. • Engineers, scientists, and practitioners who desire results with automatic verification and who would therefore benefit from the experience of successful applications. • Students in applied mathematics and computer science who want to learn these methods. Goal of the Book This book contains surveys of applications of interval computations, i.e., applications of numerical methods with automatic result verification, that were presented at an international workshop on the subject in El Paso, Texas, February 23-25, 1995. The purpose of this book is to disseminate detailed and surveyed information about existing and potential applications of this new growing field. Brief Description of the Papers At the most fundamental level, interval arithmetic operations work with sets: The result of a single arithmetic operation is the set of all possible results as the o...
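
    The set-based arithmetic described in this record can be sketched in a few lines. Real verified computation additionally requires outward (directed) rounding of the endpoints, which this illustrative Python class omits.

```python
class Interval:
    """Closed interval [lo, hi]; each arithmetic result encloses every
    possible outcome of the operation applied to points of the operands."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sums range over [lo1 + lo2, hi1 + hi2].
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Differences range over [lo1 - hi2, hi1 - lo2].
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # The extrema of a product lie among the four endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __contains__(self, x):
        return self.lo <= x <= self.hi
```

    For example, `Interval(1, 2) * Interval(-3, 4)` yields `[-6, 8]`, which contains the product of every point pair drawn from the two operands.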

  20. Computer Applications in Educational Audiology.

    Science.gov (United States)

    Mendel, Lisa Lucks; And Others

    1995-01-01

    This article provides an overview of how computer technologies can be used by educational audiologists. Computer technologies are classified into three categories: (1) information systems applications; (2) screening and diagnostic applications; and (3) intervention applications. (Author/DB)

  1. CERN database services for the LHC computing grid

    Energy Technology Data Exchange (ETDEWEB)

    Girone, M [CERN IT Department, CH-1211 Geneva 23 (Switzerland)], E-mail: maria.girone@cern.ch

    2008-07-15

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed.

  2. CERN database services for the LHC computing grid

    International Nuclear Information System (INIS)

    Girone, M

    2008-01-01

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed

  3. Handbook of video databases design and applications

    CERN Document Server

    Furht, Borko

    2003-01-01

    INTRODUCTION: Introduction to Video Databases (Oge Marques and Borko Furht). VIDEO MODELING AND REPRESENTATION: Modeling Video Using Input/Output Markov Models with Application to Multi-Modal Event Detection (Ashutosh Garg, Milind R. Naphade, and Thomas S. Huang); Statistical Models of Video Structure and Semantics (Nuno Vasconcelos); Flavor: A Language for Media Representation (Alexandros Eleftheriadis and Danny Hong); Integrating Domain Knowledge and Visual Evidence to Support Highlight Detection in Sports Videos (Juergen Assfalg, Marco Bertini, Carlo Colombo, and Alberto Del Bimbo); A Generic Event Model and Sports Vid...

  4. Security in Computer Applications

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development. The last part of the lecture covers some miscellaneous issues like the use of cryptography, rules for networking applications, and social engineering threats. This lecture was first given on Thursd...

  5. Computer application for database management and networking in a radiophysics service; Aplicacion informatica para la gestion de bases de datos y conexiones en red de un servicio de radiofisica

    Energy Technology Data Exchange (ETDEWEB)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-07-01

    Databases in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the centre's computer network. A computer acting as the server provides the database to the treatment units, where quality control measurements and incidents are recorded daily. Broken shortcuts after data migration, possible use of duplicates, and erroneous data loss caused by failing network connections are common problems, so we proceeded to manage the connections and database access centrally, easing maintenance and making the system usable by all service personnel.

  6. Computer system for International Reactor Pressure Vessel Materials Database support

    International Nuclear Information System (INIS)

    Arutyunjan, R.; Kabalevsky, S.; Kiselev, V.; Serov, A.

    1997-01-01

    This report presents a description of the computer tools developed at the IAEA to support the International Reactor Pressure Vessel Materials Database. Work focused on raw, qualified and processed materials data and on search, retrieval, analysis, presentation and export possibilities. The developed software has the following main functions: it provides software tools for querying and searching any type of data in the database; provides the capability to update the existing information in the database; provides the capability to present and print selected data; provides the possibility of exporting, on a yearly basis, the run-time IRPVMDB with raw, qualified and processed materials data to Database members; and provides the capability to export any selected set of raw, qualified or processed materials data

  7. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  8. Pattern database applications from design to manufacturing

    Science.gov (United States)

    Zhuang, Linda; Zhu, Annie; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    Pattern-based approaches are becoming more common and popular as the industry moves to advanced technology nodes. At the beginning of a new technology node, a library of process weak-point patterns for physical and electrical verification is built up and used to prevent known hotspots from recurring on new designs. The pattern set is then expanded to create test keys for process development, in order to verify manufacturing capability and to pre-check new tape-out designs for potential yield detractors. As the database grows, the adoption of pattern-based approaches expands from design flows to technology development and on to mass-production purposes. This paper presents the complete downstream working flows of a design pattern database (PDB). This pattern-based data analysis flow covers applications across different functional teams: generating enhancement kits to improve design manufacturability, populating new test design data based on previous learning, generating analysis data to improve mass-production efficiency, and manufacturing equipment in-line control to check machine status consistency across different fab sites.

  9. Databases and information systems: Applications in biogeography

    International Nuclear Information System (INIS)

    Escalante E, Tania; Llorente B, Jorge; Espinoza O, David N; Soberon M, Jorge

    2000-01-01

    Some aspects of the new instrumentation and methodological elements that make up information systems in biodiversity (ISB) are described. The use of accurate geographically referenced data draws on a broad range of available sources: natural history collections and the scientific literature require the use of databases and geographic information systems (GIS). The conceptualization of ISB and GIS, based on the use of extensive databases, has implied detailed modeling and the construction of authoritative archives: exhaustive catalogues of nomenclature and synonymies, complete bibliographic lists, lists of names proposed, and historical-geographic gazetteers with localities and their synonyms, united under a global positioning system which produces a geospheric conception of the earth and its biota. Certain difficulties in the development of the system and the construction of the biological databases, such as quality control of data, are explained. The use of such systems is basic in order to respond to many questions at the frontier of current studies of biodiversity and conservation. In particular, some applications in biogeography are detailed, along with their importance for modeling distributions, for identifying and contrasting areas of endemism and biological richness for conservation, and as tools in what we identify as predictive and experimental faunistics. Lastly, the relevance of the process at national and regional levels is emphasized

  10. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. The organization's departments can be located at different places and connected by an intranet environment. In such an environment, the maintenance of database records becomes an assignment of complexity which needs to be resolved. In this paper an intranet application is designed an...

  11. Nuclear plant operations, maintenance, and configuration management using three-dimensional computer graphics and databases

    International Nuclear Information System (INIS)

    Tutos, N.C.; Reinschmidt, K.F.

    1987-01-01

    Stone and Webster Engineering Corporation has developed the Plant Digital Model concept as a new approach to configuration management of nuclear power plants. The Plant Digital Model development is a step-by-step process, based on existing manual procedures and computer applications, and is fully controllable by the plant managers and engineers. The Plant Digital Model is based on IBM computer graphics and relational database management systems, and can therefore be easily integrated with existing plant databases and corporate management-information systems

  12. Cloud Computing Databases: Latest Trends and Architectural Concepts

    OpenAIRE

    Tarandeep Singh; Parvinder S. Sandhu

    2011-01-01

    Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. There is a varied set of cloud services...

  13. AIDA Asia. Artificial Insemination Database Application. User manual. 1

    International Nuclear Information System (INIS)

    Garcia Podesta, Mario

    2002-01-01

    Artificial Insemination Database Application (AIDA-Asia) is a computer application to store and analyze information from AI services (data on farms, females inseminated, semen, estrus characteristics, inseminators and pregnancy diagnosis). The need for such an application arose during a consultancy undertaken by the author for the International Atomic Energy Agency (IAEA, Vienna) under the framework of its Regional Co-operative Agreement for Asia and the Pacific (RCA), which is implementing a project on 'Improving Animal Productivity and Reproductive Efficiency' (RAS/5/035). The detailed specifications for the application were determined through a Task Force Meeting of National Consultants from five RCA Member States, organized by the IAEA and held in Sri Lanka in April 2001. The application has been developed in MS Access 2000 and Visual Basic for Applications (VBA) 6.0; however, it can run as a stand-alone application through its own executable files. It is based on screen forms, with command buttons, for data entry and editing of information. The structure of the data, the design of the application and the VBA code cannot be seen or modified by users; however, the designated administrator of AIDA-Asia in each country can customize it

  14. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an entity-relationship model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  15. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an entity-relationship model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)
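
    The modular structure described in these two records (components with linked operating, maintenance and failure-event data) can be sketched as a small relational schema. The sketch below uses SQLite in place of MySQL, and all table names, column names and sample values are hypothetical, not taken from PSADB.

```python
import sqlite3

schema = """
CREATE TABLE component (
    component_id INTEGER PRIMARY KEY,
    name         TEXT NOT NULL,
    facility     TEXT NOT NULL            -- research reactor identifier
);
CREATE TABLE failure_event (
    event_id     INTEGER PRIMARY KEY,
    component_id INTEGER NOT NULL REFERENCES component(component_id),
    event_date   TEXT NOT NULL,           -- ISO 8601 date
    failure_mode TEXT NOT NULL
);
CREATE TABLE maintenance_record (
    record_id    INTEGER PRIMARY KEY,
    component_id INTEGER NOT NULL REFERENCES component(component_id),
    performed_on TEXT NOT NULL,
    action       TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
# Hypothetical sample data.
conn.execute("INSERT INTO component VALUES (1, 'primary pump', 'research reactor A')")
conn.execute("INSERT INTO failure_event VALUES (1, 1, '2014-03-02', 'seal leak')")
conn.execute("INSERT INTO failure_event VALUES (2, 1, '2014-09-17', 'bearing wear')")

# Failure counts per component are the kind of aggregate that feeds
# component reliability estimates in a probabilistic safety assessment.
rows = conn.execute("""
    SELECT c.name, COUNT(f.event_id)
    FROM component c LEFT JOIN failure_event f USING (component_id)
    GROUP BY c.component_id
""").fetchall()
```

    Separating components from their event histories, as the entity-relationship model in the abstracts suggests, keeps each module independently updatable while joins recover the combined view.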

  16. Computed radiography in NDT application

    International Nuclear Information System (INIS)

    Deprins, Eric

    2004-01-01

    Computed Radiography, or digital radiography by use of reusable storage phosphor screens, offers a convenient and reliable way to replace film. In addition to the reduced cost of consumables, the return on investment of CR systems is strongly determined by savings in exposure, processing and archival times. Intangible costs such as plant shutdown, environmental safety and longer usability of isotopes are also increasingly important when considering replacing film with storage phosphor systems. But more than in traditional radiography, the use of digital images is a trade-off between speed and the required quality. Better image quality is obtained with longer exposure times, slower phosphor screens and higher scan resolutions; therefore, different kinds of storage phosphor screens are needed in order to cover every application. Most operations have the data associated with the tests to be performed centrally stored in a database. Using a digital radiography system gives not only the advantages of manipulating digital images, but also of the digital data associated with them. Smart methods of associating cassettes and storage screens with exposed images enhance the workflow of NDT processes and avoid human error. Automated measurement tools increase throughput in different kinds of operations. This paper gives an overview of the way certain operations have decided to replace film with Computed Radiography, and of the major benefits for them.

  17. The EREC-STRESA database. Internet application

    International Nuclear Information System (INIS)

    Davydov, M.V.; Annunziato, A.

    2004-01-01

    A considerable amount of experimental data in the field of NPP safety and reliability has been produced and gathered at the Electrogorsk Research and Engineering Centre on NPPs Safety. In order to ensure proper preservation of, and easy access to, the data, the EREC Database was created. This paper gives a description of the EREC Database and of the supporting web-based informatic platform STRESA. (author)

  18. Applications of symbolic algebraic computation

    International Nuclear Information System (INIS)

    Brown, W.S.; Hearn, A.C.

    1979-01-01

    This paper is a survey of applications of systems for symbolic algebraic computation. In most successful applications, calculations that can be taken to a given order by hand are then extended one or two more orders by computer. Furthermore, with a few notable exceptions, these applications also involve numerical computation in some way. Therefore the authors emphasize the interface between symbolic and numerical computation, including: 1. Computations with both symbolic and numerical phases. 2. Data involving both the unpredictable size and shape that typify symbolic computation and the (usually inexact) numerical values that characterize numerical computation. 3. Applications of one field to the other. It is concluded that the fields of symbolic and numerical computation can advance most fruitfully in harmony rather than in competition. (Auth.)

  19. ATLAS database application enhancements using Oracle 11g

    CERN Document Server

    Dimitrov, G; The ATLAS collaboration; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, condition data replication to remote sites. Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided in production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the hosted 260 database schemas (for the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN have...

  20. Computer-assisted indexing for the INIS database

    International Nuclear Information System (INIS)

    Nevyjel, A.

    2006-01-01

    INIS has identified computer-assisted indexing as areas where information technology could assist best in maintaining database quality and indexing consistency, while containing production costs. Subject analysis is a very important but also very expensive process in the production of the INIS database. Given the current necessity to process an increased number of records, including subject analysis, without additional staff, INIS as well as the member states need improvements in their processing efficiency. Computer assisted subject analysis is a promising way to achieve this. The quality of the INIS database is defined by its inputting rules. The Thesaurus is a terminological control device used in translating from the natural language of documents, indexers or users into a more constrained system language. It is a controlled and dynamic vocabulary of semantically and generically related terms. It is the essential tool for subject analysis as well as for advanced search engines. To support the identification of descriptors in the free text (title, abstract, free keywords) 'hidden terms' have been introduced as extension of the Thesaurus, which identify phrases or character strings of free text and point to the valid descriptor, which should be suggested. In the process of computer-assisted subject analysis the bibliographic records (including title and abstract) are analyzed by the software, resulting in a list of suggested descriptors. Within the working platform (graphical user interface) the suggested descriptors are sorted by importance (by their relevance for the content of the document) and the subject specialist clearly sees the highlighted context from which the terms were selected. The system allows the subject specialist to accept or reject descriptors from the suggested list and to assign additional descriptors when necessary. First experiences show that a performance enhancement of about 80-100% can be achieved in the subject analysis process. 
(author)
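The descriptor-suggestion step described above can be sketched in a few lines. The phrases, descriptors, and ranking rule below are invented for illustration; the actual INIS software and thesaurus are far richer:

```python
from collections import Counter

# Hypothetical thesaurus extension: free-text "hidden term" -> valid descriptor.
HIDDEN_TERMS = {
    "atomic pile": "REACTORS",
    "power reactor": "REACTORS",
    "neutron flux": "NEUTRON FLUX",
    "cross section": "CROSS SECTIONS",
}

def suggest_descriptors(text):
    """Scan title/abstract text and return descriptors sorted by a crude
    relevance measure (here just the number of matching phrases)."""
    text = text.lower()
    hits = Counter()
    for phrase, descriptor in HIDDEN_TERMS.items():
        n = text.count(phrase)
        if n:
            hits[descriptor] += n
    # Most relevant first; the subject specialist accepts or rejects each one.
    return [d for d, _ in hits.most_common()]

print(suggest_descriptors(
    "Measured neutron flux and cross section data for a power reactor."))
```

In the real system the suggestions are shown with their highlighted context so the indexer can judge each one; this sketch only reproduces the matching-and-ranking core.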

  1. Computer application in scientific investigations

    International Nuclear Information System (INIS)

    Govorun, N.N.

    1981-01-01

    A short review of computer development, application, and software at JINR over the last 15 years is presented. The main trends of studies on computer application in experimental and theoretical investigations are enumerated: software for computers and their systems, software for data-processing systems, the design of automatic and automated systems for measuring track-detector images, the development of techniques for carrying out experiments on-line with computers, packages of applied computer codes, and specialized systems. The on-line technique is successfully used in investigations of nuclear processes at relativistic energies. A new trend is the development of television methods of data output and its recording by computer

  2. Applications of Computer Algebra Conference

    CERN Document Server

    Martínez-Moro, Edgar

    2017-01-01

    The Applications of Computer Algebra (ACA) conference covers a wide range of topics from Coding Theory to Differential Algebra to Quantum Computing, focusing on the interactions of these and other areas with the discipline of Computer Algebra. This volume provides the latest developments in the field as well as its applications in various domains, including communications, modelling, and theoretical physics. The book will appeal to researchers and professors of computer algebra, applied mathematics, and computer science, as well as to engineers and computer scientists engaged in research and development.

  3. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  4. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  5. CQL: a database in smart card for health care applications.

    Science.gov (United States)

    Paradinas, P C; Dufresnes, E; Vandewalle, J J

    1995-01-01

    The CQL-Card is the first smart card in the world to use Database Management System (DBMS) concepts. The CQL-Card is particularly suited to a portable file in health applications where the information is required by many different partners, such as health insurance organizations, emergency services, and general practitioners. All the information required by these different partners can be shared under independent security mechanisms. Database engine functions are carried out by the card, which manages tables, views, and dictionaries. Medical information is stored in tables, and views are logical, dynamic subsets of tables. Owner-partners such as an MIS (Medical Information System) can grant privileges (select, insert, update, and delete on a table or view) to other partners. Furthermore, dictionaries are structures that contain the requested descriptions and allow adaptation to computer environments. Health information held in the CQL-Card is accessed using CQL (Card Query Language), a high-level database query language that is a subset of the standard SQL (Structured Query Language). With this language, the CQL-Card can be easily integrated into Medical Information Systems.
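Since CQL is a subset of standard SQL, the table/view pattern the card uses can be illustrated with any SQL engine. The sketch below uses SQLite and an invented schema; a real CQL-Card enforces per-partner access with grants rather than by convention:

```python
import sqlite3

# Illustrative sketch only: the schema and data are invented, not the
# card's actual layout. SQLite stands in for the card's CQL engine.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE patient (field TEXT, value TEXT, owner TEXT)")
con.executemany("INSERT INTO patient VALUES (?, ?, ?)", [
    ("blood_group", "O+", "gp"),
    ("policy_no", "12345", "insurer"),
])
# A view is a logical, dynamic subset of a table -- e.g. the fields an
# emergency service may read (a real card would enforce this via GRANTs,
# which SQLite does not support).
con.execute("CREATE VIEW emergency AS "
            "SELECT field, value FROM patient WHERE owner = 'gp'")
print(con.execute("SELECT * FROM emergency").fetchall())
```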

  6. Just-in-time Database-Driven Web Applications

    Science.gov (United States)

    2003-01-01

    "Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109

  7. An algorithm of discovering signatures from DNA databases on a computer cluster.

    Science.gov (United States)

    Lee, Hsiao Ping; Sheu, Tzu-Fang

    2014-10-05

    Signatures are short sequences that are unique and not similar to any other sequence in a database; they can be used as the basis for identifying different species. Although several signature discovery algorithms have been proposed in the past, they require the entire database to be loaded into memory, which restricts the amount of data they can process and makes them unable to handle large databases. Moreover, those algorithms use sequential models and have slower discovery speeds, so their efficiency can be improved. In this research, we introduce a divide-and-conquer strategy into signature discovery and propose a parallel signature discovery algorithm that runs on a computer cluster. The algorithm applies the divide-and-conquer strategy to overcome the existing algorithms' inability to process large databases and uses a parallel computing mechanism to effectively improve the efficiency of signature discovery. Even when run with only the memory of regular personal computers, the algorithm can still process large databases, such as the human whole-genome EST database, that the existing algorithms could not. The algorithm proposed in this research is not limited by the amount of usable memory and can rapidly find signatures in large databases, making it useful in applications such as Next-Generation Sequencing and other large-database analysis and processing. The implementation of the proposed algorithm is available at http://www.cs.pu.edu.tw/~fang/DDCSDPrograms/DDCSD.htm.
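The divide-and-conquer idea can be sketched as follows. This is a toy illustration of the strategy, not the paper's DDCSD algorithm: each partition of the database is counted independently (e.g. on separate cluster nodes), and the partial counts are merged to find k-mers that occur in exactly one sequence:

```python
from collections import Counter

# Toy setting: a "signature" here is a k-mer present in exactly one
# sequence of the whole database. K and the data are arbitrary.
K = 4

def kmers(seq):
    return {seq[i:i + K] for i in range(len(seq) - K + 1)}

def count_partition(seqs):
    """'Divide' step: per partition, count how many sequences contain
    each k-mer. Partitions could be processed on separate nodes."""
    c = Counter()
    for s in seqs:
        c.update(kmers(s))
    return c

def signatures(partitions):
    """'Conquer' step: merge partial counts; k-mers seen in exactly one
    sequence overall are signature candidates."""
    total = Counter()
    for part in partitions:
        total.update(count_partition(part))
    return {k for k, n in total.items() if n == 1}

parts = [["ACGTACGA", "TTTTACGT"], ["GGGGCCCC"]]
print(sorted(signatures(parts)))
```

Because each partition is summarized by a small count table, no step needs the entire database in memory at once, which is the property the paper's algorithm exploits.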

  8. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.

  9. Computer applications in radiology business management

    International Nuclear Information System (INIS)

    Pratt, J.; Parrish, D.; Butler, J.; Gregg, S.; Farley, G.

    1987-01-01

    This presentation focuses on two areas of prime importance to radiology business management: financial/accounting applications and computer networking. The business management portion is an overview of accounts receivable management, financial reporting, management reporting, budgeting and forecasting (including cost/benefit analysis and break-even analysis), and personal and/or financial tax planning. The networking portion focuses on telecommunications and considers satellite facilities, electronic claims submission, and national database networking. Both numeric and graphic summaries are demonstrated in the presentation

  10. The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system

    Science.gov (United States)

    Zerkin, V. V.; Pritychenko, B.

    2018-04-01

    The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for the data sets uploads, renormalization, covariance matrix, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides an essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and Russian Federation.

  11. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors; Desenvolvimento de uma base de dados computacional para aplicação em Análise Probabilística de Segurança de reatores nucleares de pesquisa

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner dos Santos

    2016-07-01

    The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN), in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to safety, reliability, availability and maintainability of these facilities. This database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they will be able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, which is in accordance with the protocols installed in IPEN intranet. The open-source relational database management system called MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)
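A minimal sketch of what such an entity-relationship layout could look like is given below. The table and column names are hypothetical (the actual PSADB schema is not described in detail here), and SQLite stands in for MySQL to keep the example self-contained:

```python
import sqlite3

# Hypothetical component-reliability schema: components and their
# failure/maintenance events in a one-to-many relationship.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE component (
    id INTEGER PRIMARY KEY,
    tag TEXT NOT NULL,          -- e.g. 'primary pump'
    system TEXT
);
CREATE TABLE event (
    id INTEGER PRIMARY KEY,
    component_id INTEGER NOT NULL REFERENCES component(id),
    kind TEXT CHECK (kind IN ('failure', 'maintenance')),
    hours REAL                  -- operating hours at the event
);
""")
con.execute("INSERT INTO component VALUES (1, 'primary pump', 'cooling')")
con.executemany("INSERT INTO event VALUES (?, ?, ?, ?)", [
    (1, 1, "failure", 1200.0),
    (2, 1, "maintenance", 1250.0),
    (3, 1, "failure", 4800.0),
])
# A typical reliability query: failure count per component,
# the kind of figure a PSA would turn into a failure rate.
print(con.execute("""
    SELECT c.tag, COUNT(*) FROM component c
    JOIN event e ON e.component_id = c.id
    WHERE e.kind = 'failure' GROUP BY c.tag
""").fetchall())
```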

  12. BUSINESS MODELLING AND DATABASE DESIGN IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Mihai-Constantin AVORNICULUI

    2015-04-01

    Electronic commerce has grown constantly from one year to another over the last decade; few areas register such growth. It covers exchanges of computerized data, but also electronic messaging, linear data banks, and electronic payment transfer. Cloud computing, a relatively new concept and term, is a model for accessing, via the internet, distributed systems of configurable computing resources on demand, which can be provisioned quickly with minimal management effort and intervention from the client and the provider. Behind an electronic commerce system in the cloud there is a database containing the information necessary for the transactions in the system. Business modelling brings many benefits, and it makes the design of the database used by electronic commerce systems in the cloud considerably easier.

  13. The ATLAS Wide-Range Database & Application Monitoring

    CERN Document Server

    Vasileva, Petya Tsvetanova; The ATLAS collaboration

    2018-01-01

    In HEP experiments at the LHC, database applications often become complex, reflecting the ever more demanding requirements of the researchers. The ATLAS experiment has several Oracle DB clusters with over 216 database schemas, each with its own set of database objects. To monitor them effectively, we designed a modern and portable application with exceptionally good characteristics, including: a concise view of the most important DB metrics; top SQL statements based on CPU, executions, block reads, etc.; volume growth plots per schema and DB object type; a database jobs section with signaling for problematic ones; and in-depth analysis in case of contention on data or processes. This contribution also describes the technical aspects of the implementation. The project can be separated into three independent layers. The first layer consists of highly optimized database objects hiding all complicated calculations. The second layer represents a server providing REST access to the underlying database backend. The th...

  14. Applying artificial intelligence to astronomical databases - a survey of applicable technology.

    Science.gov (United States)

    Rosenthal, D. A.

    This paper surveys several emerging technologies which are relevant to astronomical database issues such as interface technology, internal database representation, and intelligent data reduction aids. Among the technologies discussed are natural language understanding, frame and object representations, planning, pattern analysis, machine learning and the nascent study of simulated neural nets. These techniques will become increasingly important for astronomical research, and in particular, for applications with large databases.

  15. ATLAS database application enhancements using Oracle 11g

    International Nuclear Information System (INIS)

    Dimitrov, G; Canali, L; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at the LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post-data-taking analysis, file management over the grid, job submission and management, and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has addressed the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case, each schema is related to a dedicated client application with its own requirements). At the beginning of 2012, all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably, we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment in Q1 2012, and we outline plans for future work in this area.

  16. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  17. Computer applications in thermochemistry

    International Nuclear Information System (INIS)

    Vana Varamban, S.

    1996-01-01

    Knowledge of equilibrium is needed in many practical situations. Simple stoichiometric calculations can be performed with hand calculators, but multi-component, multi-phase gas-solid chemical equilibrium calculations are far beyond conventional devices and methods; iterative techniques must be resorted to. Such problems are most elegantly handled by modern computers. This report demonstrates the possible use of computers for chemical equilibrium calculations in the fields of thermochemistry and chemical metallurgy. Four modules are explained: fitting the experimental Cp data and generating the thermal functions, performing equilibrium calculations for the defined conditions, preparing the elaborate input to the equilibrium calculation, and analysing the calculated results graphically. The principles of thermochemical calculations are briefly described. An extensive input guide is given. Several illustrations are included to aid understanding and usage. (author)
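The first module (Cp fitting and generation of thermal functions) can be illustrated with a deliberately simplified sketch: a least-squares fit of Cp(T) to a linear form (real thermochemical codes use richer polynomials such as a + bT + c/T^2), followed by integration of the fit to obtain an enthalpy increment:

```python
# Simplified sketch: least-squares fit of heat-capacity data to
# Cp(T) = a + b*T, then use the fit to generate a thermal function.
def fit_cp(temps, cps):
    """Closed-form linear least squares for Cp = a + b*T."""
    n = len(temps)
    st, sc = sum(temps), sum(cps)
    stt = sum(t * t for t in temps)
    stc = sum(t * c for t, c in zip(temps, cps))
    b = (n * stc - st * sc) / (n * stt - st * st)
    a = (sc - b * st) / n
    return a, b

def enthalpy_increment(a, b, t1, t2):
    """H(T2) - H(T1) = integral of the fitted Cp over [T1, T2]."""
    return a * (t2 - t1) + b * (t2 * t2 - t1 * t1) / 2

# Hypothetical data points (J/mol/K) lying on Cp = 20 + 0.01*T
T = [300.0, 400.0, 500.0, 600.0]
Cp = [23.0, 24.0, 25.0, 26.0]
a, b = fit_cp(T, Cp)
print(round(a, 3), round(b, 5))
print(enthalpy_increment(a, b, 300.0, 600.0))
```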

  18. Computational fluid dynamic applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Lottes, S. A.; Zhou, C. Q.

    2000-04-03

    The rapid advancement of computational capability, including speed and memory size, has prompted the wide use of computational fluid dynamics (CFD) codes to simulate complex flow systems. CFD simulations are used to study the operating problems encountered in a system, to evaluate the impacts of operating/design parameters on the performance of a system, and to investigate novel design concepts. CFD codes are generally developed based on the conservation laws of mass, momentum, and energy that govern the characteristics of a flow. The governing equations are simplified and discretized for a selected computational grid system, and numerical methods are selected to calculate approximate flow properties. For turbulent, reacting, and multiphase flow systems, the complex processes relating to these aspects of the flow, i.e., turbulent diffusion, combustion kinetics, interfacial drag and heat and mass transfer, etc., are described by mathematical models, based on a combination of fundamental physics and empirical data, that are incorporated into the code. CFD simulation has been applied to a large variety of practical and industrial-scale flow systems.
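The discretization step mentioned above can be illustrated with the simplest possible case: an explicit finite-difference update for one-dimensional diffusion, du/dt = alpha * d2u/dx2. The grid, coefficient, and time step below are arbitrary toy values, chosen to respect this explicit scheme's stability limit dt <= dx^2 / (2*alpha):

```python
# Minimal illustration of discretizing a governing equation on a grid:
# one explicit Euler step of 1D diffusion with fixed boundary values.
def diffuse_step(u, alpha, dx, dt):
    """Return u advanced by one time step; u[0] and u[-1] are held fixed."""
    new = u[:]
    for i in range(1, len(u) - 1):
        new[i] = u[i] + alpha * dt / dx**2 * (u[i+1] - 2*u[i] + u[i-1])
    return new

u = [0.0, 0.0, 1.0, 0.0, 0.0]   # an initial "hot spot" on a 5-point grid
for _ in range(10):
    u = diffuse_step(u, alpha=1.0, dx=1.0, dt=0.25)
print([round(x, 3) for x in u])
```

Production CFD codes solve coupled nonlinear systems on multidimensional grids, but the pattern is the same: replace derivatives with differences over the grid and march the solution forward.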

  19. Teaching Psychology Students Computer Applications.

    Science.gov (United States)

    Atnip, Gilbert W.

    This paper describes an undergraduate-level course designed to teach the applications of computers that are most relevant in the social sciences, especially psychology. After an introduction to the basic concepts and terminology of computing, separate units were devoted to word processing, data analysis, data acquisition, artificial intelligence,…

  20. Registry of EPA Applications, Models, and Databases

    Data.gov (United States)

    U.S. Environmental Protection Agency — READ is EPA's authoritative source for information about Agency information resources, including applications/systems, datasets and models. READ is one component of...

  1. PROPERTY DATABASE FOR THE DEVELOPMENT OF SHAPE MEMORY ALLOY APPLICATIONS

    OpenAIRE

    Tang, W.; Cederström, J.; Sandström, R.

    1991-01-01

    Important points involving the selection of shape memory alloy (SMA) application projects are discussed. The development of a property database for SMA is initiated. Both conventional data as well as characteristics which are unique for SMA are stored. As an application example of the database SMA-SELECT, important properties for Ti-Ni alloys near equi-atomic composition, such as temperature window width for superelasticity (SE), stress rate, critical yield stress, and their interaction have ...

  2. Engineering applications of soft computing

    CERN Document Server

    Díaz-Cortés, Margarita-Arimatea; Rojas, Raúl

    2017-01-01

    This book bridges the gap between Soft Computing techniques and their applications to complex engineering problems. In each chapter we endeavor to explain the basic ideas behind the proposed applications in an accessible format for readers who may not possess a background in some of the fields. Therefore, engineers or practitioners who are not familiar with Soft Computing methods will appreciate that the techniques discussed go beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. At the same time, the book will show members of the Soft Computing community how engineering problems are now being solved and handled with the help of intelligent approaches. Highlighting new applications and implementations of Soft Computing approaches in various engineering contexts, the book is divided into 12 chapters. Further, it has been structured so that each chapter can be read independently of the others.

  3. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    Science.gov (United States)

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The scientists involved must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks; however, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls, so software developers do not require extensive knowledge of chemistry or the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java, and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: support for multi-component compounds (mixtures); import and export of SD-files; and optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions, and optional method-level security. Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework

  4. Thermodynamic database for proteins: features and applications.

    Science.gov (United States)

    Gromiha, M Michael; Sarai, Akinori

    2010-01-01

    We have developed a thermodynamic database for proteins and mutants, ProTherm, which is a collection of a large number of thermodynamic data on protein stability, along with sequence and structure information, experimental methods and conditions, and literature information. It is a valuable resource for understanding and predicting the stability of proteins, and it is accessible at http://www.gibk26.bse.kyutech.ac.jp/jouhou/Protherm/protherm.html . ProTherm has several features, including various search, display, and sorting options and visualization tools. We have analyzed the data in ProTherm to examine the relationship among the thermodynamics, structure, and function of proteins. We describe progress on the development of methods for understanding and predicting protein stability, such as (i) the relationship between the stability of protein mutants and amino acid properties, (ii) the average assignment method, (iii) empirical energy functions, (iv) torsion, distance, and contact potentials, and (v) machine learning techniques. A list of online resources for predicting protein stability is also provided.

  5. Computational Linguistics Applications

    CERN Document Server

    Piasecki, Maciej; Jassem, Krzysztof; Fuglewicz, Piotr

    2013-01-01

    The ever-growing popularity of Google over the recent decade has required a specific method of man-machine communication: the human query should be short, whereas the machine answer may take the form of a wide range of documents. This type of communication has triggered rapid development in the domain of Information Extraction, aimed at providing the asker with more precise information. The recent success of intelligent personal assistants supporting users in searching or even extracting information and answers from large collections of electronic documents signals the onset of a new era in man-machine communication: we shall soon explain to our small devices what we need to know and expect valuable answers to be delivered quickly and automatically. The progress of man-machine communication is accompanied by growth in the significance of applied Computational Linguistics: we need machines to understand much more of the language we speak naturally than is the case with up-to-date search systems. Moreover, w...

  6. Computer applications in nuclear medicine

    International Nuclear Information System (INIS)

    Lancaster, J.L.; Lasher, J.C.; Blumhardt, R.

    1987-01-01

    Digital computers were introduced to nuclear medicine research as an imaging modality in the mid-1960s. Widespread use of imaging computers (scintigraphic computers) was not seen in nuclear medicine clinics until the mid-1970s. For the user, the ability to acquire scintigraphic images into the computer for quantitative purposes, with accurate selection of regions of interest (ROIs), promised almost endless computational capabilities. Investigators quickly developed many new methods for quantitating the distribution patterns of radiopharmaceuticals within the body, both spatially and temporally. The computer was used to acquire data on practically every organ that could be imaged by means of gamma cameras or rectilinear scanners. Methods of image processing borrowed from other disciplines were applied to scintigraphic computer images in an attempt to improve image quality. Image processing in nuclear medicine has evolved into a relatively extensive set of tasks that can be called on by the user to provide additional clinical information rather than to improve image quality. Digital computers are utilized in nuclear medicine departments for nonimaging applications as well. Patient scheduling, archiving, radiopharmaceutical inventory, radioimmunoassay (RIA), and health physics are just a few of the areas in which the digital computer has proven helpful. The computer is useful in any area in which a large quantity of data needs to be accurately managed, especially over a long period of time

  7. FaceWarehouse: a 3D facial expression database for visual computing.

    Science.gov (United States)

    Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun

    2014-03-01

    We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of her different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, kiss, etc. For every RGBD raw data record, a set of facial feature points on the color image such as eye corners, mouth contour, and the nose tip are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, for every person in our database, there is a much richer matching collection of expressions, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
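The bilinear model can be sketched as a contraction of the rank-3 core tensor with an identity weight vector and an expression weight vector. The tensor sizes and values below are toy numbers, not FaceWarehouse data:

```python
# Sketch of a bilinear face model: a rank-3 core tensor C
# (vertex-coordinates x identities x expressions) contracted with
# identity weights w_id and expression weights w_exp yields a mesh.
def mesh_from_weights(core, w_id, w_exp):
    return [
        sum(core[v][i][e] * w_id[i] * w_exp[e]
            for i in range(len(w_id))
            for e in range(len(w_exp)))
        for v in range(len(core))
    ]

# Toy core tensor: 2 vertex coordinates, 2 identities, 2 expressions.
core = [
    [[0.0, 1.0], [2.0, 3.0]],
    [[1.0, 0.0], [0.0, 1.0]],
]
# Pure first identity: switch between its two "expressions".
neutral = mesh_from_weights(core, w_id=[1.0, 0.0], w_exp=[1.0, 0.0])
smile   = mesh_from_weights(core, w_id=[1.0, 0.0], w_exp=[0.0, 1.0])
print(neutral, smile)
```

Separating identity and expression into two weight vectors is what lets applications such as expression transfer keep one attribute fixed while varying the other.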

  8. Advances in Parallel Computing and Databases for Digital Pathology in Cancer Research

    Science.gov (United States)

    2016-11-13

    databases. The advent of NewSQL and NoSQL (Not Only SQL) databases has led to the development of new technologies that are well suited for applications... NoSQL graph databases are tuned to support graph operations and NoSQL key-value databases excel at rapid ingest of unstructured data. Recent NewSQL

  9. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation, and mechanism of translation. This benefits not only from the power of ribosome profiling itself but also from the extensive range of computational resources available for it. At present, however, a comprehensive review of these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers, and software tools for storing, visualizing, and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  11. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics and describes the technologies with an emphasis on how they work and their key benefits.

  12. A Unit-Test Framework for Database Applications

    DEFF Research Database (Denmark)

    Christensen, Claus Abildgaard; Gundersborg, Steen; de Linde, Kristian

    The outcome of a test of an application that stores data in a database naturally depends on the state of the database. It is therefore important that test developers are able to set up and tear down database states in a simple and efficient manner. In existing unit-test frameworks, setting up...... test can be minimized. In addition, the reuse between unit tests can speed up the execution of test suites. A performance test on a medium-size project shows a 40% speed up and an estimated 25% reduction in the number of lines of test code....
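The set-up/tear-down pattern this abstract describes can be illustrated with Python's `unittest` fixtures and an in-memory SQLite database (the table name and sample rows are hypothetical, and this is a generic sketch rather than the framework the paper presents):

```python
import sqlite3
import unittest

class CustomerQueryTest(unittest.TestCase):
    """Each test method runs against a freshly built database state."""

    def setUp(self):
        # Build the known database state the test depends on.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"
        )
        self.conn.executemany(
            "INSERT INTO customers (id, name) VALUES (?, ?)",
            [(1, "Ada"), (2, "Grace")],
        )

    def tearDown(self):
        # Discard the state so tests cannot interfere with each other.
        self.conn.close()

    def test_lookup_by_id(self):
        row = self.conn.execute(
            "SELECT name FROM customers WHERE id = ?", (2,)
        ).fetchone()
        self.assertEqual(row[0], "Grace")
```

Run with `python -m unittest`; because the state is rebuilt in `setUp`, the outcome of each test is independent of whatever earlier tests did to the database.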

  13. Thermodynamic database of multi-component Mg alloys and its application to solidification and heat treatment

    Directory of Open Access Journals (Sweden)

    Guanglong Xu

    2016-12-01

Full Text Available An overview of a thermodynamic database of multi-component Mg alloys is given in this work. The database includes thermodynamic descriptions for 145 binary systems and 48 ternary systems in the 23-component Mg–Ag–Al–Ca–Ce–Cu–Fe–Gd–K–La–Li–Mn–Na–Nd–Ni–Pr–Si–Sn–Sr–Th–Y–Zn–Zr system. First, the major computational and experimental tools used to establish the thermodynamic database of Mg alloys are briefly described. Subsequently, representative binary and ternary systems among those investigated are shown to demonstrate the major features of the database. Finally, the application of the thermodynamic database to solidification simulation and the selection of heat treatment schedules is described.

  14. Industrial applications of computer tomography

    International Nuclear Information System (INIS)

    Sheng Kanglong; Qiang Yujun; Yang Fujia

    1992-01-01

Industrial computer tomography (CT) and its applications form a rapidly developing field of high technology. CT systems have been playing important roles in the nondestructive testing (NDT) of products and equipment for a number of industries. Recently, the technique has advanced into the area of industrial process control, bringing even greater benefit. The basic principles and typical structure of an industrial CT system are outlined, and descriptions are given of some successful CT systems for either NDT application or process control purposes.

  15. Go Figure: Computer Database Adds the Personal Touch.

    Science.gov (United States)

    Gaffney, Jean; Crawford, Pat

    1992-01-01

    A database for recordkeeping for a summer reading club was developed for a public library system using an IBM PC and Microsoft Works. Use of the database resulted in more efficient program management, giving librarians more time to spend with patrons and enabling timely awarding of incentives. (LAE)

  16. Performance of popular open source databases for HEP related computing problems

    International Nuclear Information System (INIS)

    Kovalskyi, D; Sfiligoi, I; Wuerthwein, F; Yagil, A

    2014-01-01

    Databases are used in many software components of HEP computing, from monitoring and job scheduling to data storage and processing. It is not always clear at the beginning of a project if a problem can be handled by a single server, or if one needs to plan for a multi-server solution. Before a scalable solution is adopted, it helps to know how well it performs in a single server case to avoid situations when a multi-server solution is adopted mostly due to sub-optimal performance per node. This paper presents comparison benchmarks of popular open source database management systems. As a test application we use a user job monitoring system based on the Glidein workflow management system used in the CMS Collaboration.
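A single-server micro-benchmark of the kind the paper motivates can be sketched as follows; SQLite stands in here for the open source servers actually compared, and the table schema is hypothetical:

```python
import sqlite3
import time

def benchmark_inserts(n_rows: int) -> float:
    """Time n_rows inserts into one table on a single server; returns seconds."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, status TEXT)")
    start = time.perf_counter()
    with conn:  # one transaction, as a batched-commit baseline
        conn.executemany(
            "INSERT INTO jobs (id, status) VALUES (?, ?)",
            ((i, "queued") for i in range(n_rows)),
        )
    elapsed = time.perf_counter() - start
    # Sanity check: every row actually landed.
    assert conn.execute("SELECT COUNT(*) FROM jobs").fetchone()[0] == n_rows
    conn.close()
    return elapsed

if __name__ == "__main__":
    secs = benchmark_inserts(100_000)
    print(f"{100_000 / secs:,.0f} rows/s")
```

Knowing the single-node rate first, as the paper argues, helps decide whether a multi-server deployment is a genuine scaling need or a workaround for sub-optimal per-node performance.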

  17. Scientific applications of symbolic computation

    International Nuclear Information System (INIS)

    Hearn, A.C.

    1976-02-01

    The use of symbolic computation systems for problem solving in scientific research is reviewed. The nature of the field is described, and particular examples are considered from celestial mechanics, quantum electrodynamics and general relativity. Symbolic integration and some more recent applications of algebra systems are also discussed [fr

  18. Industrial applications of computed tomography

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Carmignato, S.; Kruth, J. -P.

    2014-01-01

    The number of industrial applications of Computed Tomography(CT) is large and rapidly increasing. After a brief market overview, the paper gives a survey of state of the art and upcoming CT technologies, covering types of CT systems, scanning capabilities, and technological advances. The paper...

  19. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  20. Klaim-DB: A Modeling Language for Distributed Database Applications

    DEFF Research Database (Denmark)

    Wu, Xi; Li, Ximeng; Lluch Lafuente, Alberto

    2015-01-01

We present the modelling language, Klaim-DB, for distributed database applications. Klaim-DB borrows the distributed nets of the coordination language Klaim but essentially re-incarnates the tuple spaces of Klaim as databases, and provides high-level language abstractions for the access... and manipulation of structured data, with integrity and atomicity considerations. We present the formal semantics of Klaim-DB and illustrate the use of the language in a scenario where the sales from different branches of a chain of department stores are aggregated from their local databases. It can be seen... that raising the abstraction level and encapsulating integrity checks (concerning the schema of tables, etc.) in the language primitives for database operations benefit the modelling task considerably.

  1. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. DNA algorithms of implementing biomolecular databases on a biological computer.

    Science.gov (United States)

    Chang, Weng-Long; Vasilakos, Athanasios V

    2015-01-01

    In this paper, DNA algorithms are proposed to perform eight operations of relational algebra (calculus), which include Cartesian product, union, set difference, selection, projection, intersection, join, and division, on biomolecular relational databases.
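The eight operations named are the standard relational algebra; for reference, three of them (selection, projection, and natural join) can be sketched over ordinary Python tuples-as-dicts. The DNA encoding itself is not reproduced here, and the example relations are hypothetical:

```python
def select(relation, predicate):
    """Selection: keep the tuples satisfying the predicate."""
    return [t for t in relation if predicate(t)]

def project(relation, attrs):
    """Projection: keep only the named attributes, dropping duplicates."""
    seen, out = set(), []
    for t in relation:
        key = tuple(t[a] for a in attrs)
        if key not in seen:
            seen.add(key)
            out.append(dict(zip(attrs, key)))
    return out

def natural_join(r, s):
    """Join: combine tuples that agree on all shared attributes."""
    shared = set(r[0]) & set(s[0]) if r and s else set()
    return [{**t, **u} for t in r for u in s
            if all(t[a] == u[a] for a in shared)]

employees = [{"id": 1, "dept": "lab"}, {"id": 2, "dept": "office"}]
depts = [{"dept": "lab", "floor": 3}, {"dept": "office", "floor": 1}]
print(natural_join(select(employees, lambda t: t["id"] == 1), depts))
# prints [{'id': 1, 'dept': 'lab', 'floor': 3}]
```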

  3. Computer Aided Design for Soil Classification Relational Database ...

    African Journals Online (AJOL)


    engineering, several developers were asked what rules they applied to identify ... classification is actually a part of all good science. As Michalski ... by a large number of soil scientists. .... and use. The calculus relational database processing is.

  4. Computational geometry for reactor applications

    International Nuclear Information System (INIS)

    Brown, F.B.; Bischoff, F.G.

    1988-01-01

Monte Carlo codes for simulating particle transport involve three basic computational sections: a geometry package for locating particles and computing distances to regional boundaries, a physics package for analyzing interactions between particles and problem materials, and an editing package for determining event statistics and overall results. This paper describes the computational geometry methods in RACER, a vectorized Monte Carlo code used for reactor physics analysis, so that comparisons may be made with techniques used in other codes. The principal applications for RACER are eigenvalue calculations and power distributions associated with reactor core physics analysis. Successive batches of neutrons are run until convergence and acceptable confidence intervals are obtained, with typical problems involving >10^6 histories. As such, the development of computational geometry methods has emphasized two basic needs: a flexible but compact geometric representation that permits accurate modeling of reactor core details and efficient geometric computation to permit very large numbers of histories to be run. The current geometric capabilities meet these needs effectively, supporting a variety of very large and demanding applications

  5. Needs assessment for next generation computer-aided mammography reference image databases and evaluation studies.

    Science.gov (United States)

    Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias

    2011-11-01

Breast cancer is globally a major threat for women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, not any CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as data source for evaluation of mammography CAD systems, and the application of statistical evaluation methods were found highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM
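The evaluation protocols the survey tallies (train/test, leave-one-out, N-fold cross-validation) can be made concrete without any mammography specifics; a minimal k-fold splitter over case indices looks like the following, with leave-one-out as the k = n special case:

```python
def k_fold_splits(n_items: int, k: int):
    """Yield (train_indices, test_indices) pairs for k-fold cross-validation."""
    indices = list(range(n_items))
    # Distribute any remainder over the first folds so sizes differ by at most 1.
    fold_sizes = [n_items // k + (1 if i < n_items % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

# 6 cases, 3 folds: every case lands in exactly one test fold.
for train, test in k_fold_splits(6, 3):
    print(test)
# prints [0, 1] then [2, 3] then [4, 5]
```

Reproducibility of the kind the survey finds lacking requires, at minimum, that the selected case identifiers and the fold assignment be reported alongside results like these.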

  6. Construction of crystal structure prototype database: methods and applications

    International Nuclear Information System (INIS)

    Su, Chuanxun; Lv, Jian; Wang, Hui; Wang, Yanchao; Ma, Yanming; Li, Quan; Zhang, Lijun

    2017-01-01

Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. Using a similar method, a structure prototype analysis package (SPAP) program was developed to remove similar structures in CALYPSO prediction results and extract predicted low energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide an important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and determination of the prototype structure in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery. (paper)

  7. Construction of crystal structure prototype database: methods and applications.

    Science.gov (United States)

    Su, Chuanxun; Lv, Jian; Li, Quan; Wang, Hui; Zhang, Lijun; Wang, Yanchao; Ma, Yanming

    2017-04-26

Crystal structure prototype data have become a useful source of information for materials discovery in the fields of crystallography, chemistry, physics, and materials science. This work reports the development of a robust and efficient method for assessing the similarity of structures on the basis of their interatomic distances. Using this method, we proposed a simple and unambiguous definition of crystal structure prototype based on hierarchical clustering theory, and constructed the crystal structure prototype database (CSPD) by filtering the known crystallographic structures in a database. Using a similar method, a structure prototype analysis package (SPAP) program was developed to remove similar structures in CALYPSO prediction results and extract predicted low energy structures for a separate theoretical structure database. A series of statistics describing the distribution of crystal structure prototypes in the CSPD was compiled to provide an important insight for structure prediction and high-throughput calculations. Illustrative examples of the application of the proposed database are given, including the generation of initial structures for structure prediction and determination of the prototype structure in databases. These examples demonstrate the CSPD to be a generally applicable and useful tool for materials discovery.

  8. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  9. Combinatorial methods with computer applications

    CERN Document Server

    Gross, Jonathan L

    2007-01-01

    Combinatorial Methods with Computer Applications provides in-depth coverage of recurrences, generating functions, partitions, and permutations, along with some of the most interesting graph and network topics, design constructions, and finite geometries. Requiring only a foundation in discrete mathematics, it can serve as the textbook in a combinatorial methods course or in a combined graph theory and combinatorics course.After an introduction to combinatorics, the book explores six systematic approaches within a comprehensive framework: sequences, solving recurrences, evaluating summation exp

  10. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    Science.gov (United States)

    2015-09-15

arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming... a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  11. [A systematic evaluation of application of the web-based cancer database].

    Science.gov (United States)

    Huang, Tingting; Liu, Jialin; Li, Yong; Zhang, Rui

    2013-10-01

In order to support the theory and practice of web-based cancer database development in China, we carried out a systematic evaluation to assess the state of web-based cancer databases at home and abroad. We performed computer-based retrieval of the Ovid-MEDLINE, SpringerLink, EBSCOhost, Wiley Online Library and CNKI databases for papers published between Jan. 1995 and Dec. 2011, and retrieved the references of these papers by hand. We selected qualified papers according to pre-established inclusion and exclusion criteria, and carried out information extraction and analysis. The online search yielded 1244 papers, and checking the reference lists added another 19 articles. Thirty-one articles met the inclusion and exclusion criteria, and we extracted and assessed the evidence from them. The analysis showed that the U.S.A. ranked first, accounting for 26%. Thirty-nine percent of the web-based cancer databases are comprehensive cancer databases. Among single-cancer databases, breast cancer and prostate cancer rank highest, each accounting for 10%. Thirty-two percent of the cancer databases are associated with cancer gene information. As for technical platforms, MySQL and PHP are the most widely applied, each used in nearly 23% of cases.

  12. Evolution and applications of plant pathway resources and databases

    DEFF Research Database (Denmark)

    Sucaet, Yves; Deva, Taru

    2011-01-01

    Plants are important sources of food and plant products are essential for modern human life. Plants are increasingly gaining importance as drug and fuel resources, bioremediation tools and as tools for recombinant technology. Considering these applications, database infrastructure for plant model...... systems deserves much more attention. Study of plant biological pathways, the interconnection between these pathways and plant systems biology on the whole has in general lagged behind human systems biology. In this article we review plant pathway databases and the resources that are currently available...

  13. MetReS, an Efficient Database for Genomic Applications.

    Science.gov (United States)

    Vilaplana, Jordi; Alves, Rui; Solsona, Francesc; Mateo, Jordi; Teixidó, Ivan; Pifarré, Marc

    2018-02-01

MetReS (Metabolic Reconstruction Server) is a genomic database that is shared between two software applications that address important biological problems. Biblio-MetReS is a data-mining tool that enables the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the processes of interest and their function. The main goal of this work was to identify the areas where the performance of the MetReS database could be improved and to test whether this improvement would scale to larger datasets and more complex types of analysis. The study was started with a relational database, MySQL, which is the current database server used by the applications. We also tested the performance of an alternative data-handling framework, Apache Hadoop. Hadoop is currently used for large-scale data processing. We found that this data-handling framework is likely to greatly improve the efficiency of the MetReS applications as the dataset and the processing needs increase by several orders of magnitude, as expected to happen in the near future.

  14. Computer Aided Design for Soil Classification Relational Database ...

    African Journals Online (AJOL)

    The paper focuses on the problems associated with classification, storage and retrieval of information on soil data, such as the incompatibility of soil data semantics; inadequate documentation, and lack of indexing; hence it is pretty difficult to efficiently access large database. Consequently, information on soil is very difficult ...

  15. Proposal for grid computing for nuclear applications

    International Nuclear Information System (INIS)

    Faridah Mohamad Idris; Wan Ahmad Tajuddin Wan Abdullah; Zainol Abidin Ibrahim; Zukhaimira Zolkapli

    2013-01-01

Full-text: The use of computer clusters for the computational sciences, including computational physics, is vital as it provides the computing power needed to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to speed up the computing process. (author)

  16. Analysis on Cloud Computing Database in Cloud Environment – Concept and Adoption Paradigm

    Directory of Open Access Journals (Sweden)

    Elena-Geanina ULARU

    2012-08-01

Full Text Available With the development of the Internet's new technical functionalities, new concepts have started to take shape. These concepts have an important role especially in the development of corporate IT. One such concept is "the Cloud". Various marketing campaigns have started to focus on the Cloud and to promote it in different but confusing ways. These campaigns do little to explain what cloud computing is and why it is becoming increasingly necessary. The lack of understanding of this new technology creates gaps in knowledge about business cloud adoption, with regard to both databases and applications. Only by focusing on business processes and objectives can an enterprise achieve the full benefits of the cloud and mitigate the potential risks. In this article we formulate our own complete definition of the cloud and analyze the essential aspects of cloud adoption for a banking financial reporting application.

  17. Advanced in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

The theme of CSA is focused on the various aspects of computer science and its applications, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of computer science and its applications. This book therefore includes various theories and practical applications in computer science and its applications.

  18. Planform: an application and database of graph-encoded planarian regenerative experiments.

    Science.gov (United States)

    Lobo, Daniel; Malone, Taylor J; Levin, Michael

    2013-04-15

    Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic models exist yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there is no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.

  19. Database organization for computer-aided characterization of laser diode

    International Nuclear Information System (INIS)

    Oyedokun, Z.O.

    1988-01-01

Computer-aided data logging involves a huge amount of data, which must be properly managed for optimized storage space and easy access, retrieval and utilization. An organization method is developed to enhance the advantages of computer-based data logging in the testing of semiconductor injection lasers; it optimizes storage space, permits authorized users easy access, and inhibits penetration. The method is based on a unique file-identification protocol, a tree structure, and command-file-oriented access procedures.

  20. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  1. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar eSreenivasaiah

    2010-12-01

Full Text Available The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of the computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches: having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We also discuss their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to play a key role in the future of systems-driven biomedical research.

  2. Cloud Computing and Its Applications in GIS

    Science.gov (United States)

    Kang, Cao

    2011-12-01

    Cloud computing is a novel computing paradigm that offers highly scalable and highly available distributed computing services. The objectives of this research are to: 1. analyze and understand cloud computing and its potential for GIS; 2. discover the feasibility of migrating truly spatial GIS algorithms to distributed computing infrastructures; 3. explore a solution to host and serve large volumes of raster GIS data efficiently and speedily. These objectives thus form the basis for three professional articles. The first article is entitled "Cloud Computing and Its Applications in GIS". This paper introduces the concept, structure, and features of cloud computing. Features of cloud computing such as scalability, parallelization, and high availability make it a very capable computing paradigm. Unlike High Performance Computing (HPC), cloud computing uses inexpensive commodity computers. The uniform administration systems in cloud computing make it easier to use than GRID computing. Potential advantages of cloud-based GIS systems, such as a lower barrier to entry, are consequently presented. Three cloud-based GIS system architectures are proposed: public cloud-based GIS systems, private cloud-based GIS systems and hybrid cloud-based GIS systems. Public cloud-based GIS systems provide the lowest entry barriers for users among these three architectures, but their advantages are offset by data security and privacy related issues. Private cloud-based GIS systems provide the best data protection, though they have the highest entry barriers. Hybrid cloud-based GIS systems provide a compromise between these extremes. The second article is entitled "A cloud computing algorithm for the calculation of Euclidian distance for raster GIS". Euclidean distance is a truly spatial GIS algorithm.
Classical algorithms such as the pushbroom and growth ring techniques require computational propagation through the entire raster image, which makes them incompatible with the distributed nature
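The pushbroom and growth-ring algorithms mentioned above propagate distances sequentially across the whole raster, which is what makes them awkward to distribute. A minimal per-cell formulation, by contrast, is trivially parallel: each output cell can be computed independently. The sketch below illustrates that per-cell form; it is an assumption-laden toy, not the paper's actual cloud algorithm.

```python
import math

def euclidean_distance_raster(grid):
    """Per-cell Euclidean distance transform: for each cell, the distance
    to the nearest source cell (value 1). Each cell is independent of the
    others, so the outer loops could be split across distributed workers,
    unlike pushbroom/growth-ring propagation which sweeps the raster."""
    sources = [(r, c) for r, row in enumerate(grid)
               for c, v in enumerate(row) if v == 1]
    result = []
    for r, row in enumerate(grid):
        out_row = []
        for c, _ in enumerate(row):
            # distance to the closest source cell
            out_row.append(min(math.hypot(r - sr, c - sc)
                               for sr, sc in sources))
        result.append(out_row)
    return result

raster = [
    [1, 0, 0],
    [0, 0, 0],
    [0, 0, 1],
]
dist = euclidean_distance_raster(raster)
```

The brute-force scan costs O(cells x sources) rather than the linear passes of the classical techniques; the trade-off it illustrates is exactly the one the article discusses (sequential efficiency versus distributability).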

  3. The Use of a Relational Database in Qualitative Research on Educational Computing.

    Science.gov (United States)

    Winer, Laura R.; Carriere, Mario

    1990-01-01

    Discusses the use of a relational database as a data management and analysis tool for nonexperimental qualitative research, and describes the use of the Reflex Plus database in the Vitrine 2001 project in Quebec to study computer-based learning environments. Information systems are also discussed, and the use of a conceptual model is explained.…

  4. The establishment of the Blacknest seismological database on the Rutherford Laboratory system 360/195 computer

    International Nuclear Information System (INIS)

    Blamey, C.

    1977-01-01

    In order to assess the problems which might arise from monitoring a comprehensive test ban treaty by seismological methods, an experimental monitoring operation is being conducted. This work has involved the establishment of a database on the Rutherford Laboratory 360/195 system computer. The database can be accessed in the UK over the public telephone network and in the USA via ARPANET. (author)

  5. Computer Cataloging of Electronic Journals in Unstable Aggregator Databases: The Hong Kong Baptist University Library Experience.

    Science.gov (United States)

    Li, Yiu-On; Leung, Shirley W.

    2001-01-01

    Discussion of aggregator databases focuses on a project at the Hong Kong Baptist University library to integrate full-text electronic journal titles from three unstable aggregator databases into its online public access catalog (OPAC). Explains the development of the electronic journal computer program (EJCOP) to generate MARC records for…

  6. Computer-aided visualization of database structural relationships

    International Nuclear Information System (INIS)

    Cahn, D.F.

    1980-04-01

    Interactive computer graphic displays can be extremely useful in augmenting understandability of data structures. In complexly interrelated domains such as bibliographic thesauri and energy information systems, node and link displays represent one such tool. This paper presents examples of data structure representations found useful in these domains and discusses some of their generalizable components. 2 figures

  7. Computational geometry algorithms and applications

    CERN Document Server

    de Berg, Mark; Overmars, Mark; Schwarzkopf, Otfried

    1997-01-01

    Computational geometry emerged from the field of algorithms design and analysis in the late 1970s. It has grown into a recognized discipline with its own journals, conferences, and a large community of active researchers. The success of the field as a research discipline can on the one hand be explained from the beauty of the problems studied and the solutions obtained, and, on the other hand, by the many application domains--computer graphics, geographic information systems (GIS), robotics, and others--in which geometric algorithms play a fundamental role. For many geometric problems the early algorithmic solutions were either slow or difficult to understand and implement. In recent years a number of new algorithmic techniques have been developed that improved and simplified many of the previous approaches. In this textbook we have tried to make these modern algorithmic solutions accessible to a large audience. The book has been written as a textbook for a course in computational geometry, but it can ...

  8. Advancements in web-database applications for rabies surveillance

    Directory of Open Access Journals (Sweden)

    Bélanger Denise

    2011-08-01

    Full Text Available Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB over previous rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from

  9. Distributed Pseudo-Random Number Generation and Its Application to Cloud Database

    OpenAIRE

    Chen, Jiageng; Miyaji, Atsuko; Su, Chunhua

    2014-01-01

    Cloud databases are a rapidly growing trend in the cloud computing market. They enable clients to run their computations on outsourced databases or to access distributed database services in the cloud. At the same time, security and privacy concerns are a major challenge for cloud databases to continue growing. To enhance the security and privacy of cloud database technology, pseudo-random number generation (PRNG) plays an important role in data encryption and privacy-pr...

  10. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2015-10-01

    Full Text Available Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. database design, installation, and operation. Therefore the authors have investigated MongoDB, a popular application, both from the perspective of industry retraining for database specialists and for teaching. This paper demonstrates some practical activities that can be done by students at the Eastern Institute of Technology New Zealand. In addition to testing and preparing new content for future students, this paper contributes to the very recent and emerging academic literature in this area. This paper concludes with general recommendations for IT educators, database engineers, and other IT professionals.

  11. Application Program Interface for the Orion Aerodynamics Database

    Science.gov (United States)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems.
The
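The API pattern the abstract describes (initialize from table files, select options, then evaluate coefficients by table lookup and interpolation) can be sketched in a few lines. The class and method names below are hypothetical illustrations of that pattern, not the actual CEV API, which is written in ANSI C.

```python
class AeroDatabase:
    """Toy sketch of an aero-database API: load tables once, select
    options, then evaluate a coefficient by 1-D linear interpolation.
    All names and the table format are illustrative assumptions."""

    def __init__(self, tables):
        # tables: {coefficient name: sorted list of (mach, value) pairs}
        self.tables = tables
        self.options = {}

    def set_option(self, name, value):
        self.options[name] = value

    def coefficient(self, name, mach):
        pts = self.tables[name]
        # clamp queries outside the table range to the end values
        if mach <= pts[0][0]:
            return pts[0][1]
        if mach >= pts[-1][0]:
            return pts[-1][1]
        # linear interpolation between the bracketing breakpoints
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= mach <= x1:
                t = (mach - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)

db = AeroDatabase({"CD": [(0.5, 0.9), (1.0, 1.2), (2.0, 1.0)]})
cd = db.coefficient("CD", 0.75)  # halfway between the first two breakpoints
```

A real flight database interpolates over several dimensions (Mach, angle of attack, control deflections), which is exactly why sharing one verified lookup implementation across tools, as the API does, removes a major source of per-tool bugs.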

  12. Computational intelligence in automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Prokhorov, Danil (ed.) [Toyota Motor Engineering and Manufacturing (TEMA), Ann Arbor, MI (United States). Toyota Technical Center

    2008-07-01

    What is computational intelligence (CI)? Traditionally, CI is understood as a collection of methods from the fields of neural networks (NN), fuzzy logic and evolutionary computation. This edited volume is the first of its kind suitable for automotive researchers, engineers and students. It provides a representative sample of contemporary CI activities in the area of automotive technology. The volume consists of 13 chapters, including but not limited to these topics: vehicle diagnostics and vehicle system safety, control of vehicular systems, quality control of automotive processes, driver state estimation, safety of pedestrians, and intelligent vehicles. All chapters contain overviews of the state of the art, and several chapters illustrate their methodologies on examples of real-world systems. About the editor: Danil Prokhorov began his technical career in St. Petersburg, Russia, after graduating with honors from Saint Petersburg State University of Aerospace Instrumentation in 1992 (MS in Robotics). He worked as a research engineer at the St. Petersburg Institute for Informatics and Automation, one of the institutes of the Russian Academy of Sciences. He came to the US in late 1993 for Ph.D. studies. He became involved in automotive research in 1995 when he was a summer intern at the Ford Scientific Research Lab in Dearborn, MI. Upon his graduation from the EE Department of Texas Tech University, Lubbock, in 1997, he joined Ford to pursue application-driven research on neural networks and other machine learning algorithms. While at Ford, he took part in several production-bound projects, including neural network based engine misfire detection. Since 2005 he has been with the Toyota Technical Center, Ann Arbor, MI, overseeing important mid- and long-term research projects in computational intelligence. (orig.)

  13. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    2013-01-01

    Finding a location for a new facility such that the facility attracts the maximal number of customers is a challenging problem. Existing studies either model customers as static sites and thus do not consider customer movement, or they focus on theoretical aspects and do not provide solutions that are shown empirically to be scalable. Given a road network, a set of existing facilities, and a collection of customer route traversals, an optimal segment query returns the optimal road network segment(s) for a new facility. We propose a practical framework for computing this query, where each route traversal is assigned a score that is distributed among the road segments covered by the route according to a score distribution model. The query returns the road segment(s) with the highest score. To achieve low latency, it is essential to prune the very large search space. We propose two algorithms...
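The scoring idea in the abstract, before any pruning, can be sketched directly: each route traversal carries a score that is split over the segments it covers, and the query returns the segment(s) with the highest total. The uniform split below is one possible score distribution model, chosen here only for illustration; the paper's framework supports other models and adds search-space pruning on top.

```python
from collections import defaultdict

def optimal_segments(routes, score=1.0):
    """Distribute each route traversal's score uniformly over the road
    segments it covers and return the highest-scoring segment(s).
    Brute-force baseline; the paper's contribution is doing this with
    aggressive pruning of the search space."""
    totals = defaultdict(float)
    for segments in routes:
        share = score / len(segments)  # uniform score distribution model
        for seg in segments:
            totals[seg] += share
    best_score = max(totals.values())
    best = sorted(s for s, v in totals.items() if v == best_score)
    return best, dict(totals)

# three customer traversals over segments a, b, c
routes = [["a", "b"], ["b", "c"], ["b"]]
best, totals = optimal_segments(routes)
```

Here segment "b" is covered by all three traversals and wins; a length- or popularity-weighted distribution model would change only the `share` computation.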

  15. Application of ABWR construction database to nuclear power plant project

    International Nuclear Information System (INIS)

    Takashima, Atsushi; Katsube, Yasuhiko

    1999-01-01

    Tokyo Electric Power Company (TEPCO) successfully completed the construction of Kashiwazaki-Kariwa Nuclear Power Station Units No. 6 and No. 7 (K-6/7), the first advanced boiling water reactors (ABWR) in the world. K-6 and K-7 started commercial operation in November 1996 and July 1997, respectively. We consider the ABWR a standard BWR in the world as well as in Japan because the ABWR is highly regarded. However, because the intervals between our nuclear power plant construction projects are growing longer, our engineering expertise in plant construction risks declining, and it is necessary for us to maintain it. In addition, we are planning wider application of separate purchase orders for further cost reduction, and there is an expectation that we will contribute to ABWR plant constructions overseas. Facing these circumstances, we have developed a construction database based on our ABWR construction experience. As the first step in developing the database, we analyzed our own activities in the previous ABWR construction. Through this analysis, we could define the activity units of which the project consists. As the second step, we clarified the data treated in each activity unit and the interfaces among them. By taking these steps, we could develop our database efficiently. (author)

  16. Development of comprehensive material performance database for nuclear applications

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime

    1993-01-01

    This paper introduces the present status of the comprehensive material performance database for nuclear applications, named the JAERI Material Performance Database (JMPD), and examples of its utilization. The JMPD has been developed since 1986 at JAERI with a view to utilizing various kinds of characteristics data of nuclear materials efficiently. A relational database management system, PLANNER, was employed, and supporting systems for data retrieval and output were expanded. In order to improve the user-friendliness of the retrieval system, menu-selection-type procedures have been developed so that knowledge of the system or the data structures is not required of end-users. As to utilization of the JMPD, two types of data analyses are mentioned: (1) a series of statistical analyses was performed in order to estimate the design values of both the yield strength (Sy) and the tensile strength (Su) for aluminum alloys, which are widely used as structural materials for research reactors; (2) statistical analyses were accomplished using the cyclic crack growth rate data for nuclear pressure vessel steels, and comparisons were made of the variability and/or reproducibility of the data obtained by ΔK-increasing and ΔK-constant type tests. (author)

  17. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    Md Saion Salikin; Muhammad Farid Abdul Khalid

    2002-01-01

    A Quality Assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme the performance of an x-ray machine used for diagnostic purposes is tested using an approved procedure, commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder issued under the Atomic Energy Licensing Act 1984. There are a few computer applications (software) available in the market which can be used for this purpose. A computer application (software) using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis and storage as well as report writing of the quality control tests. In this paper important features of the software for quality control tests are explained in brief. A simple database is being established for this purpose which is linked to the software. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)

  18. Computer Applications in the Design Process.

    Science.gov (United States)

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  19. A database for CO2 Separation Performances of MOFs based on Computational Materials Screening.

    Science.gov (United States)

    Altintas, Cigdem; Avci, Gokay; Daglar, Hilal; Nemati Vesali Azar, Ayda; Velioglu, Sadiye; Erucar, Ilknur; Keskin, Seda

    2018-05-03

    Metal organic frameworks (MOFs) have been considered great candidates for CO2 capture. Considering the very large number of available MOFs, high-throughput computational screening plays a critical role in identifying the top performing materials for target applications in a time-effective manner. In this work, we used molecular simulations to screen the most recent and complete MOF database to identify the most promising materials for CO2 separation from flue gas (CO2/N2) and landfill gas (CO2/CH4) under realistic operating conditions. We first validated our approach by comparing the results of our molecular simulations for the CO2 uptakes and CO2/N2 and CO2/CH4 selectivities of various types of MOFs with the available experimental data. We then computed binary CO2/N2 and CO2/CH4 mixture adsorption data for the entire MOF database and used these results to calculate several adsorbent selection metrics such as selectivity, working capacity, adsorbent performance score, regenerability, and separation potential. MOFs were ranked based on the combination of these metrics, and the top performing MOF adsorbents that can achieve CO2/N2 and CO2/CH4 separations with high performance were identified. Molecular simulations of the adsorption of a ternary CO2/N2/CH4 mixture were performed for these top materials in order to provide a more realistic performance assessment of MOF adsorbents. Structure-performance analysis showed that MOFs with ΔQ > 30 kJ/mol, 3.8 Å ≤ PLD ≤ 5 Å, 5 Å ≤ LCD ≤ 7.5 Å, 0.5 ≤ ϕ ≤ 0.75, SA ≤ 1,000 m2/g, and ρ > 1 g/cm3 are the best candidates for selective separation of CO2 from flue gas and landfill gas. This information will be very useful in designing novel MOFs with the desired structural features that can lead to high CO2 separation potentials. Finally, an online, freely accessible database, https://cosmoserc.ku.edu.tr, was established, for the first time in the literature, which reports all computed adsorbent metrics of 3,816 MOFs for CO2/N2, CO2/CH4
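The adsorbent selection metrics named in the abstract have simple definitions that are standard in the screening literature, computed from mixture uptakes at adsorption and desorption conditions. The sketch below uses those common definitions; the paper's exact operating conditions may differ, and the input numbers are invented purely for illustration.

```python
def adsorbent_metrics(n_ads_co2, n_ads_other, n_des_co2, y_co2, y_other):
    """Common adsorbent selection metrics from binary mixture uptakes.
    n_ads_* : uptakes (mol/kg) at adsorption conditions
    n_des_co2 : CO2 uptake (mol/kg) at desorption conditions
    y_* : bulk gas-phase mole fractions of the two components
    Definitions follow standard usage; not taken verbatim from the paper."""
    selectivity = (n_ads_co2 / n_ads_other) * (y_other / y_co2)
    working_capacity = n_ads_co2 - n_des_co2              # deliverable CO2, mol/kg
    aps = selectivity * working_capacity                  # adsorbent performance score
    regenerability = 100.0 * working_capacity / n_ads_co2  # percent recovered per cycle
    return selectivity, working_capacity, aps, regenerability

# hypothetical flue-gas case: CO2/N2 at y = 0.15/0.85
sel, wc, aps, r = adsorbent_metrics(2.0, 0.2, 0.5, 0.15, 0.85)
```

Ranking by any single metric is misleading (a highly selective MOF may be nearly impossible to regenerate), which is why the study combines several metrics before naming top candidates.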

  20. Using AMDD method for Database Design in Mobile Cloud Computing Systems

    OpenAIRE

    Silviu Claudiu POPA; Mihai-Constantin AVORNICULUI; Vasile Paul BRESFELEAN

    2013-01-01

    The development of wireless telecommunications technologies gave birth to new kinds of e-commerce, the so-called mobile e-commerce or m-commerce. Mobile Cloud Computing (MCC) represents a new IT research area that combines mobile computing and cloud computing techniques. Behind a cloud mobile commerce system there is a database containing all the information necessary for transactions. By means of the Agile Model Driven Development (AMDD) method, we are able to achieve many benefits that smoo...

  1. Application Of Database Program in selecting Sorghum (Sorghum bicolor L) Mutant Lines

    International Nuclear Information System (INIS)

    H, Soeranto

    2000-01-01

    Computer database software, namely MSTAT and Paradox, has been applied in the field of mutation breeding, especially in the process of selecting plant mutant lines of sorghum. In MSTAT, selecting mutant lines can be done by activating the SELECTION function and then entering mathematical formulas for the selection criterion. Another alternative is to apply the desired selection intensity to the analysis results of the subprogram SORT. Including the selected plant mutant lines in the BRSERIES program makes their progenies easier to trace in subsequent generations. In Paradox, an application program for selecting mutant lines can be made by combining the Table, Form and Report facilities. Selecting mutant lines with a defined selection criterion can easily be done through filtering data. As a relational database, Paradox ensures that the application program for selecting mutant lines and progeny tracking can be made easily, efficiently and interactively

  2. Computer graphics from basic to application

    International Nuclear Information System (INIS)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-01

    This book covers the concept of computer graphics, its background history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  3. Computer graphics from basic to application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-15

    This book covers the concept of computer graphics, its background history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  4. THE NASA AMES POLYCYCLIC AROMATIC HYDROCARBON INFRARED SPECTROSCOPIC DATABASE: THE COMPUTED SPECTRA

    International Nuclear Information System (INIS)

    Bauschlicher, C. W.; Ricca, A.; Boersma, C.; Mattioda, A. L.; Cami, J.; Peeters, E.; Allamandola, L. J.; Sanchez de Armas, F.; Puerta Saborido, G.; Hudgins, D. M.

    2010-01-01

    The astronomical emission features, formerly known as the unidentified infrared bands, are now commonly ascribed to polycyclic aromatic hydrocarbons (PAHs). The laboratory experiments and computational modeling done at the NASA Ames Research Center to create a collection of PAH IR spectra relevant to test and refine the PAH hypothesis have been assembled into a spectroscopic database. This database now contains over 800 PAH spectra spanning 2-2000 μm (5000-5 cm-1). These data are now available on the World Wide Web at www.astrochem.org/pahdb. This paper presents an overview of the computational spectra in the database and the tools developed to analyze and interpret astronomical spectra using the database. A description of the online and offline user tools available on the Web site is also presented.

  5. Brain perfusion: computed tomography applications

    International Nuclear Information System (INIS)

    Miles, K.A.

    2004-01-01

    Within recent years, the broad introduction of fast multi-detector computed tomography (CT) systems and the availability of commercial software for perfusion analysis have made cerebral perfusion imaging with CT a practical technique for the clinical environment. The technique is widely available at low cost, accurate and easy to perform. Perfusion CT is particularly applicable to those clinical circumstances where patients already undergo CT for other reasons, including stroke, head injury, subarachnoid haemorrhage and radiotherapy planning. Future technical developments in multi-slice CT systems may diminish the current limitations of limited spatial coverage and radiation burden. CT perfusion imaging on combined PET-CT systems offers new opportunities to improve the evaluation of patients with cerebral ischaemia or tumours by demonstrating the relationship between cerebral blood flow and metabolism. Yet CT is often not perceived as a technique for imaging cerebral perfusion. This article reviews the use of CT for imaging cerebral perfusion, highlighting its advantages and disadvantages and draws comparisons between perfusion CT and magnetic resonance imaging. (orig.)

  6. EXPLORATIONS IN QUANTUM COMPUTING FOR FINANCIAL APPLICATIONS

    OpenAIRE

    Gare, Jesse

    2010-01-01

    Quantum computers have the potential to increase the solution speed for many computational problems. This paper is a first step into possible applications for quantum computing in the context of computational finance. The fundamental ideas of quantum computing are introduced, followed by an exposition of the algorithms of Deutsch and Grover. Improved mean and median estimation are shown as results of Grover's generalized framework. The algorithm for mean estimation is refined to an improved M...

  7. A computer database system to calculate staff radiation doses and maintain records

    International Nuclear Information System (INIS)

    Clewer, P.

    1985-01-01

    A database has been produced to record the personal dose records of all employees monitored for radiation exposure in the Wessex Health Region. Currently there are more than 2000 personnel in 115 departments but the capacity of the database allows for expansion. The computer is interfaced to a densitometer for film badge reading. The hardware used by the database, which is based on a popular microcomputer, is described, as are the various programs that make up the software. The advantages over the manual card index system that it replaces are discussed. (author)

  8. Brain-Computer Interfaces : Beyond Medical Applications

    NARCIS (Netherlands)

    Erp, J.B.F. van; Lotte, F.; Tangermann, M.

    2012-01-01

    Brain-computer interaction has already moved from assistive care to applications such as gaming. Improvements in usability, hardware, signal processing, and system integration should yield applications in other nonmedical areas.

  9. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and biostatistical calculations. The popularization of DNA studies employed in the judicial system has led to the necessity of developing appropriate computer programs. Such programs must, above all, address two critical problems, i.e. the broadly understood data processing and data storage, and biostatistical calculations. Moreover, in case of terrorist attacks and mass natural disasters, the ability to identify victims by searching related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. The DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to 1.1 and 1.2 versions. There were, however, slight differences between those versions and the original one. The DNAStat version 2.0 was launched in 2007 and the major program improvement was an introduction of the group calculation options with the potential application to personal identification of mass disasters and terrorism victims. The last 2.1 version has the option of language selection--Polish or English, which will enhance the usage and application of the program also in other countries.
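One of the biostatistical calculations a program of this kind performs is a random match probability under Hardy-Weinberg equilibrium, multiplied across independent loci. The sketch below shows that textbook calculation only; it is not taken from DNAStat itself, and the locus names and allele frequencies are invented for illustration.

```python
def random_match_probability(genotype, freqs):
    """Hardy-Weinberg random match probability across independent loci:
    p^2 for a homozygote, 2pq for a heterozygote, multiplied over loci.
    Illustrative only; a forensic program additionally handles population
    substructure corrections and relatedness, which are omitted here."""
    p = 1.0
    for locus, (a1, a2) in genotype.items():
        f1, f2 = freqs[locus][a1], freqs[locus][a2]
        p *= f1 * f1 if a1 == a2 else 2 * f1 * f2
    return p

# hypothetical two-locus profile with made-up allele frequencies
genotype = {"locusA": ("17", "17"), "locusB": ("14", "15")}
freqs = {"locusA": {"17": 0.1}, "locusB": {"14": 0.2, "15": 0.3}}
p = random_match_probability(genotype, freqs)  # 0.01 * 0.12
```

The product over loci is what makes multi-locus profiles so discriminating: even moderately common alleles yield very small match probabilities after a dozen loci.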

  10. Practical application of computer graphics in nuclear power plant engineering

    International Nuclear Information System (INIS)

    Machiba, Hiroshi; Kawamura, Hirobumi; Sasaki, Norio

    1992-01-01

    A nuclear power plant is composed of a vast amount of equipment, piping, and so on, and six or seven years are required to complete the design and engineering from the initial planning stage to the time of commercial operation. Furthermore, operating plants must be continually maintained and improved over a long period. Computer graphics were first applied to the composite arrangement design of nuclear power plants in the form of 3-dimensional CAD. Subsequently, as the introduction of CAE has progressed, a huge assortment of information has been accumulated in databases, and measures have been sought that would permit the convenient utilization of this information. Using computer graphics technologies, the interface between the user and such databases has recently been improved. In response to the growth in environmental consciousness, photo-realistic simulations for artistic interior design and overviews showing harmony with the surroundings have been achieved through the application of computer graphics. (author)

  11. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program that performs obstetric calculations, written in the Clipper language and using data from ultrasonography, was developed for personal computers. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Database Management System was tested for its performance and proved very useful in patient management, with its convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women

  12. Application of modern reliability database techniques to military system data

    International Nuclear Information System (INIS)

    Bunea, Cornel; Mazzuchi, Thomas A.; Sarkani, Shahram; Chang, H.-C.

    2008-01-01

    This paper focuses on analysis techniques for modern reliability databases, with an application to military system data. The analysis of the military system database consists of the following steps: clean the data and perform operations on it in order to obtain good estimators; present simple plots of the data; and analyze the data with statistical and probabilistic methods. Each step is dealt with separately and the main results are presented. Competing risks theory is advocated as the mathematical support for the analysis. The general framework of competing risks theory is presented together with simple independent and dependent competing risks models available in the literature. These models are used to identify the reliability and maintenance indicators required by the operating personnel. Model selection is based on graphical interpretation of the plotted data.

  13. Applications of the Cambridge Structural Database in chemical education

    Science.gov (United States)

    Battle, Gary M.; Ferrence, Gregory M.; Allen, Frank H.

    2010-01-01

    The Cambridge Structural Database (CSD) is a vast and ever growing compendium of accurate three-dimensional structures that has massive chemical diversity across organic and metal–organic compounds. For these reasons, the CSD is finding significant uses in chemical education, and these applications are reviewed. As part of the teaching initiative of the Cambridge Crystallographic Data Centre (CCDC), a teaching subset of more than 500 CSD structures has been created that illustrate key chemical concepts, and a number of teaching modules have been devised that make use of this subset in a teaching environment. All of this material is freely available from the CCDC website, and the subset can be freely viewed and interrogated using WebCSD, an internet application for searching and displaying CSD information content. In some cases, however, the complete CSD System is required for specific educational applications, and some examples of these more extensive teaching modules are also discussed. The educational value of visualizing real three-dimensional structures, and of handling real experimental results, is stressed throughout. PMID:20877495

  14. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  15. Computer Applications in Reading. Third Edition.

    Science.gov (United States)

    Blanchard, Jay S.; And Others

    Intended as a reference for researchers, teachers, and administrators, this book chronicles research, programs, and uses of computers in reading. Chapter 1 provides a broad view of computer applications in education, while Chapter 2 provides annotated references for computer based reading and language arts programs for children and adults in…

  16. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  17. Grid computing infrastructure, service, and applications

    CERN Document Server

    Jie, Wei; Chen, Jinjun

    2009-01-01

    Offering a comprehensive discussion of advances in grid computing, this book summarizes the concepts, methods, technologies, and applications. It covers topics such as philosophy, middleware, architecture, services, and applications. It also includes technical details to demonstrate how grid computing works in the real world.

  18. Applications of computational intelligence in nuclear reactors

    International Nuclear Information System (INIS)

    Jayalal, M.L.; Jehadeesan, R.

    2016-01-01

    Computational intelligence techniques have been successfully employed in a wide range of applications, including the domains of medicine, bioinformatics, electronics, communications and business. There has been progress in applying computational intelligence in the nuclear reactor domain during the last two decades. The stringent nuclear safety regulations pertaining to the reactor environment present challenges in the application of computational intelligence in various nuclear sub-systems. The applications of various methods of computational intelligence in the domain of nuclear reactors are discussed in this paper. (author)

  19. Practical applications of soft computing in engineering

    CERN Document Server

    2001-01-01

    Soft computing has been presented not only with the theoretical developments but also with a large variety of realistic applications to consumer products and industrial systems. Application of soft computing has provided the opportunity to integrate human-like vagueness and real-life uncertainty into an otherwise hard computer program. This book highlights some of the recent developments in practical applications of soft computing in engineering problems. All the chapters have been sophisticatedly designed and revised by international experts to achieve wide but in-depth coverage. Contents: Au

  20. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing the differently dedicated data structures with a mature, standardized database system is the future development direction of the accelerator control system. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the application feasibility of database systems in accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)
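    A first feasibility test of the kind described is simply measuring write and read latency against a candidate database engine. A minimal sketch using SQLite (an assumption for illustration; the record does not name the engine or the access interface):

```python
import sqlite3
import time

def measure_latency(n=1000):
    """Time n inserts and n point reads against an in-memory SQLite table
    holding hypothetical process-variable records (name, value, timestamp)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE pv (name TEXT PRIMARY KEY, value REAL, ts REAL)")
    t0 = time.perf_counter()
    for i in range(n):
        con.execute("INSERT OR REPLACE INTO pv VALUES (?, ?, ?)",
                    (f"chan{i}", float(i), time.time()))
    con.commit()
    write_us = (time.perf_counter() - t0) / n * 1e6
    t0 = time.perf_counter()
    for i in range(n):
        con.execute("SELECT value FROM pv WHERE name = ?",
                    (f"chan{i}",)).fetchone()
    read_us = (time.perf_counter() - t0) / n * 1e6
    return write_us, read_us

w, r = measure_latency()
print(f"avg write {w:.1f} us, avg read {r:.1f} us")
```

    Averaging over many operations, as here, smooths out scheduling jitter; a production test would also measure worst-case latency, which matters more for real-time guarantees.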

  1. Integer programming theory, applications, and computations

    CERN Document Server

    Taha, Hamdy A

    1975-01-01

    Integer Programming: Theory, Applications, and Computations provides information pertinent to the theory, applications, and computations of integer programming. This book presents the computational advantages of the various techniques of integer programming.Organized into eight chapters, this book begins with an overview of the general categorization of integer applications and explains the three fundamental techniques of integer programming. This text then explores the concept of implicit enumeration, which is general in a sense that it is applicable to any well-defined binary program. Other

  2. An Embedded Database Application for the Aggregation of Farming Device Data

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem; Pedersen, Torben Bach

    2010-01-01

    In order to store massive amounts of data produced by the farming devices and to keep data that spans long intervals of time for analysis, reporting and maintenance purposes, it is desirable to reduce the size of the data by maintaining the data at different aggregate levels. The older data can be made coarse-grained while keeping the newest data fine-grained. Considering the availability of a limited amount of storage capacity on the farm machinery, an application written in C was developed to collect the data from a CAN-BUS, store it into the embedded database efficiently and perform gradual data aggregation effectively. Furthermore, the aggregation is achieved by using either two ratio-based aggregation methods or a time-granularity based aggregation method. A detailed description of the embedded database technology on a tractor computer is also presented in this paper.
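    The time-granularity based method can be sketched as follows: rows older than a threshold are replaced by per-interval averages, while recent rows stay raw. A minimal Python sketch (the original application was written in C; field names and interval widths here are hypothetical):

```python
from statistics import mean

def gradual_aggregate(rows, now, keep_raw_s=3600, bucket_s=600):
    """rows: list of (timestamp, value) samples. Keep rows newer than
    now - keep_raw_s untouched; average older rows into bucket_s-wide
    buckets so old data becomes coarse-grained."""
    recent = [r for r in rows if r[0] >= now - keep_raw_s]
    old = [r for r in rows if r[0] < now - keep_raw_s]
    buckets = {}
    for ts, v in old:
        buckets.setdefault(ts - ts % bucket_s, []).append(v)
    coarse = [(b, mean(vs)) for b, vs in sorted(buckets.items())]
    return coarse + recent

rows = [(t, float(t)) for t in range(0, 7200, 60)]  # one sample per minute
out = gradual_aggregate(rows, now=7200)
print(len(rows), "->", len(out))  # 120 raw rows shrink to 66
```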

  3. Advances in computational metabolomics and databases deepen the understanding of metabolisms.

    Science.gov (United States)

    Tsugawa, Hiroshi

    2018-01-29

    Mass spectrometry (MS)-based metabolomics is the popular platform for metabolome analyses. Computational techniques for the processing of MS raw data, for example, feature detection, peak alignment, and the exclusion of false-positive peaks, have been established. The next stage of untargeted metabolomics would be to decipher the mass fragmentation of small molecules for the global identification of human-, animal-, plant-, and microbiota metabolomes, resulting in a deeper understanding of metabolisms. This review is an update on the latest computational metabolomics including known/expected structure databases, chemical ontology classifications, and mass spectrometry cheminformatics for the interpretation of mass fragmentations and for the elucidation of unknown metabolites. The importance of metabolome 'databases' and 'repositories' is also discussed because novel biological discoveries are often attributable to the accumulation of data, to relational databases, and to their statistics. Lastly, a practical guide for metabolite annotations is presented as the summary of this review. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. The development and application of a Mycoplasma gallisepticum sequence database.

    Science.gov (United States)

    Armour, Natalie K; Laibinis, Victoria A; Collett, Stephen R; Ferguson-Noel, Naola

    2013-01-01

    Molecular analysis was conducted on 36 Mycoplasma gallisepticum DNA extracts from tracheal swab samples of commercial poultry in seven South African provinces between 2009 and 2012. Twelve unique M. gallisepticum genotypes were identified by polymerase chain reaction and sequence analysis of the 16S-23S rRNA intergenic spacer region (IGSR), M. gallisepticum cytadhesin 2 (mgc2), MGA_0319 and gapA genetic regions. The DNA sequences of these genotypes were distinct from those of M. gallisepticum isolates in a database composed of sequences from other countries, vaccine and reference strains. The most prevalent genotype (SA-WT#7) was detected in samples from commercial broilers, broiler breeders and layers in five provinces. South African M. gallisepticum sequences were more similar to those of the live vaccines commercially available in South Africa, but were distinct from that of F strain vaccine, which is not registered for use in South Africa. The IGSR, mgc2 or MGA_0319 sequences of three South African genotypes were identical to those of the ts-11 vaccine strain, necessitating a combination of mgc2 and IGSR targeted sequencing to differentiate South African wild-type genotypes from ts-11 vaccine. To identify and differentiate all 12 wild-types, mgc2, IGSR and MGA_0319 sequencing was required. Sequencing of gapA was least effective at strain differentiation. This research serves as a model for the development of an M. gallisepticum sequence database, and illustrates its application to characterize M. gallisepticum genotypes, select diagnostic tests and better understand the epidemiology of M. gallisepticum.

  5. Outline of computer application in PNC

    International Nuclear Information System (INIS)

    Aoki, Minoru

    1990-01-01

    Computer application systems are an important resource for R and D (research and development) in PNC. Various types of computer systems are widely used in R and D for experiments, evaluation and analysis, plant operation and other jobs in PNC. Currently, computer centers in PNC have been established at the Oarai Engineering Center and the Tokai Works. The former uses a large-scale digital computer and supercomputer systems; the latter uses only a large-scale digital computer system. These computer systems have joined the PNC Information Network, which connects the Head Office with the branches at Oarai, Tokai, Ningyotoge and Fugen by means of a super digital circuit. In the near future, the computer centers will be brought together in order to raise the efficiency of operation of the computer systems. A new computer center, called the 'Information Center', is under construction at the Oarai Engineering Center. (author)

  6. Computer programme for control and maintenance and object oriented database: application to the realisation of a particle accelerator, the VIVITRON

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, A

    1996-01-11

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and front-end computers based on VME standards, within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the information necessary to display parameters within the front-end computers on the graphic screens. All other communications between processes use the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and bypasses the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author) 38 refs.

  7. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    Science.gov (United States)

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain: viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  8. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined.Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  9. A database application for the Naval Command Physical Readiness Testing Program

    OpenAIRE

    Quinones, Frances M.

    1998-01-01

    Approved for public release; distribution is unlimited. IT21 envisions a Navy with standardized, state-of-the-art computer systems. Based on this vision, Naval database management systems will also need to become standardized among Naval commands. Today most commercial off-the-shelf (COTS) database management systems provide a graphical user interface. Among the many Naval database systems currently in use, the Navy's Physical Readiness Program database has continued to exist at the command leve...

  10. Grid computing techniques and applications

    CERN Document Server

    Wilkinson, Barry

    2009-01-01

    ''… the most outstanding aspect of this book is its excellent structure: it is as though we have been given a map to help us move around this technology from the base to the summit … I highly recommend this book …''Jose Lloret, Computing Reviews, March 2010

  11. Color in Computer Vision Fundamentals and Applications

    CERN Document Server

    Gevers, Theo; van de Weijer, Joost; Geusebroek, Jan-Mark

    2012-01-01

    While the field of computer vision drives many of today’s digital technologies and communication networks, the topic of color has emerged only recently in most computer vision applications. One of the most extensive works to date on color in computer vision, this book provides a complete set of tools for working with color in the field of image understanding. Based on the authors’ intense collaboration for more than a decade and drawing on the latest thinking in the field of computer science, the book integrates topics from color science and computer vision, clearly linking theor

  12. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    Science.gov (United States)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  13. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modality, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 ∼ 545 X 414, 256 gray scale) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but generally the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purpose of the teaching file system. Without high-cost appliances, we could complete the image database system of the teaching file using a personal computer with a relatively inexpensive method
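    The record structure described above maps directly onto a small relational table. A minimal sketch using SQLite in place of the original FoxPro database (an assumption for illustration; field and file names are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE teaching_file (
        hospital_no TEXT,
        name        TEXT,
        sex         TEXT,
        age         INTEGER,
        exam_date   TEXT,
        keyword     TEXT,
        modality    TEXT,
        diagnosis   TEXT,
        findings    TEXT,
        image_path  TEXT   -- bitmap stored on disk, path kept in the record
    )
""")
con.execute(
    "INSERT INTO teaching_file VALUES (?,?,?,?,?,?,?,?,?,?)",
    ("H1234", "anon", "F", 52, "1995-03-01", "pneumonia",
     "chest radiograph", "lobar pneumonia", "RLL consolidation",
     "img/0001.bmp"),
)
# Retrieval by keyword, the teaching-file use case described in the record.
row = con.execute(
    "SELECT diagnosis FROM teaching_file WHERE keyword = ?", ("pneumonia",)
).fetchone()
print(row[0])
```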

  14. Monomial ideals, computations and applications

    CERN Document Server

    Gimenez, Philippe; Sáenz-de-Cabezón, Eduardo

    2013-01-01

    This work covers three important aspects of monomials ideals in the three chapters "Stanley decompositions" by Jürgen Herzog, "Edge ideals" by Adam Van Tuyl and "Local cohomology" by Josep Álvarez Montaner. The chapters, written by top experts, include computer tutorials that emphasize the computational aspects of the respective areas. Monomial ideals and algebras are, in a sense, among the simplest structures in commutative algebra and the main objects of combinatorial commutative algebra. Also, they are of major importance for at least three reasons. Firstly, Gröbner basis theory allows us to treat certain problems on general polynomial ideals by means of monomial ideals. Secondly, the combinatorial structure of monomial ideals connects them to other combinatorial structures and allows us to solve problems on both sides of this correspondence using the techniques of each of the respective areas. And thirdly, the combinatorial nature of monomial ideals also makes them particularly well suited to the devel...

  15. Real-life applications with membrane computing

    CERN Document Server

    Zhang, Gexiang; Gheorghe, Marian

    2017-01-01

    This book thoroughly investigates the underlying theoretical basis of membrane computing models, and reveals their latest applications. In addition, to date there have been no illustrative case studies or complex real-life applications that capitalize on the full potential of the sophisticated membrane systems computational apparatus; gaps that this book remedies. By studying various complex applications – including engineering optimization, power systems fault diagnosis, mobile robot controller design, and complex biological systems involving data modeling and process interactions – the book also extends the capabilities of membrane systems models with features such as formal verification techniques, evolutionary approaches, and fuzzy reasoning methods. As such, the book offers a comprehensive and up-to-date guide for all researchers, PhDs and undergraduate students in the fields of computer science, engineering and the bio-sciences who are interested in the applications of natural computing models.

  16. Journal of Computer Science and Its Application

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application ... Cloud model construct for transaction-based cooperative systems ... The evaluation of tertiary institution service quality using HiEdQUAL and fuzzy ...

  17. Computer vision for biomedical image applications. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yanxi [Carnegie Mellon Univ., Pittsburgh, PA (United States). School of Computer Science, The Robotics Institute; Jiang, Tianzi [Chinese Academy of Sciences, Beijing (China). National Lab. of Pattern Recognition, Inst. of Automation; Zhang, Changshui (eds.) [Tsinghua Univ., Beijing, BJ (China). Dept. of Automation

    2005-07-01

    This book constitutes the refereed proceedings of the First International Workshop on Computer Vision for Biomedical Image Applications: Current Techniques and Future Trends, CVBIA 2005, held in Beijing, China, in October 2005 within the scope of ICCV 20. (orig.)

  18. Computer applications in project KARP

    International Nuclear Information System (INIS)

    Raju, R.P.; Siddiqui, H.R.

    1992-01-01

    For effective project implementation of the Kalpakkam Reprocessing Plant (KARP) at Kalpakkam, an elaborate Management Information System (MIS) was developed in-house for physical and financial progress monitoring and reporting. Computer-aided design software for the design of process piping layout was also developed and implemented for the generation of process cell piping drawings for construction purposes. Modelling and simulation studies were carried out to optimize process parameters, and fault tree analysis techniques were utilised for evaluating plant availability factors. (author). 2 tabs

  19. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  20. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  1. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real-time access to both temporally accurate and multimedia data...

  2. Computer-aided diagnosis system for bone scintigrams from Japanese patients: importance of training database

    DEFF Research Database (Denmark)

    Horikoshi, Hiroyuki; Kikuchi, Akihiro; Onoguchi, Masahisa

    2012-01-01

    Computer-aided diagnosis (CAD) software for bone scintigrams has recently been introduced as a clinical quality assurance tool. The purpose of this study was to compare the diagnostic accuracy of two CAD systems, one based on a European and one on a Japanese training database, in a group of bone scans from Japanese patients. The two CAD software are trained to interpret bone scans using training databases consisting of bone scans with the desired interpretation, metastatic disease or not. One software was trained using 795 bone scans from European patients and the other with 904 bone scans from Japanese patients. The software trained with the Japanese database showed higher performance than the corresponding CAD software trained with the European database for the analysis of bone scans from Japanese patients. These results could at least partly be caused by the physical differences between Japanese and European patients, resulting in less influence of attenuation.

  3. The Effect of Relational Database Technology on Administrative Computing at Carnegie Mellon University.

    Science.gov (United States)

    Golden, Cynthia; Eisenberger, Dorit

    1990-01-01

    Carnegie Mellon University's decision to standardize its administrative system development efforts on relational database technology and structured query language is discussed and its impact is examined in one of its larger, more widely used applications, the university information system. Advantages, new responsibilities, and challenges of the…

  4. Thermochemistry in BWR. An overview of applications of program codes and databases

    International Nuclear Information System (INIS)

    Hermansson, H-P.; Becker, R.

    2010-01-01

    The Swedish work on thermodynamics of metal-water systems relevant to BWR conditions has been ongoing since the 1970s, and at present a compilation and adaptation of codes and thermodynamic databases is in progress. In the previous work, basic thermodynamic data were compiled for parts of the system Fe-Cr-Ni-Co-Zn-S-H2O at 25-300 °C. Since some thermodynamic information necessary for temperature extrapolations of data up to 300 °C was not published in the earlier works, these data have now been partially recalculated. This applies especially to the parameters of the HKF model, which are used to extrapolate the thermodynamic data for ionic and neutral aqueous species from 25 °C to BWR temperatures. Using the completed data, e.g. the change in standard Gibbs energy (ΔG°) and the equilibrium constant (log K) can be calculated for further applications at BWR/LWR conditions. In addition, a computer program is currently being developed at Studsvik for the calculation of equilibrium conductivity in high temperature water. The program is intended for PWR applications, but can also be applied to the BWR environment. Data as described above will be added to the database of this program. It will be relatively easy to further develop the program, e.g. to calculate Pourbaix diagrams, and these graphs could then be calculated at any temperature. This means that there will be no limitation to the temperatures and total concentrations (usually 10⁻⁶ to 10⁻⁸ mol/kg) as reported in earlier work. It is also easy to add a function generating ΔG° and log K values at selected temperatures. One of the fundamentals for this work was also to overview and collect publicly available thermodynamic program codes and databases of relevance for BWR conditions found in open sources. The focus has been on finding already done compilations and reviews, and some 40 codes and 15 databases were found. Codes and databases are often integrated, and such a package is often developed for
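    The link between ΔG° and log K used in such applications is the standard relation ΔG° = −RT ln K, i.e. log₁₀ K = −ΔG°/(RT ln 10). A minimal sketch of that conversion:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def log10_K(delta_g0_j_per_mol, temp_k):
    """Equilibrium constant (as log10 K) from the standard Gibbs energy
    change of reaction, via dG0 = -R*T*ln(K)."""
    return -delta_g0_j_per_mol / (R * temp_k * math.log(10))

# At 25 C (298.15 K), dG0 of about -5708 J/mol corresponds to log K near 1.
print(round(log10_K(-5708.0, 298.15), 3))
```

    Temperature extrapolation of ΔG° itself (the role of the HKF model mentioned above) is a separate, considerably harder step; this sketch only covers the final conversion at a given temperature.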

  5. Application of new type of distributed multimedia databases to networked electronic museum

    Science.gov (United States)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have been actively developed, building on advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform a new function of cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as the managing unit of retrieval; retrieval can be performed effectively through cooperative processing among multiple domains. A communication language and protocols are also defined in the system, and these are used in every communication within the system. A language interpreter in each machine translates the communication language into the internal language used by that machine. Thanks to the language interpreter, internal modules such as the DBMS and user interface modules can be selected freely. A concept of a 'content-set' is also introduced: a content-set is defined as a package of mutually related contents, and the system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents by referring to data indicating the relations among the contents in the content-set. In order to verify the function of the proposed system, a networked electronic museum was built experimentally. 
The results of this experiment indicate that the proposed system can effectively retrieve the target contents under the control of a number of distributed

  6. Internet Databases of the Properties, Enzymatic Reactions, and Metabolism of Small Molecules—Search Options and Applications in Food Science

    Directory of Open Access Journals (Sweden)

    Piotr Minkiewicz

    2016-12-01

    Full Text Available Internet databases of small molecules, their enzymatic reactions, and metabolism have emerged as useful tools in food science. Database searching is also introduced as part of chemistry or enzymology courses for food technology students. Such resources support the search for information about single compounds and facilitate secondary analyses of large datasets. Information can be retrieved from databases by searching for the compound name or for the structure, annotated with the help of chemical codes or drawn using molecule-editing software. Data mining options may be enhanced by navigating through a network of links and cross-links between databases. Exemplary databases reviewed in this article belong to two classes: tools concerning small molecules (including general and specialized databases annotating food components) and tools annotating enzymes and metabolism. Some problems associated with database application are also discussed. Data summarized in computer databases may be used for calculation of the daily intake of bioactive compounds, prediction of the metabolism of food components and their biological activity, as well as prediction of interactions between food components and drugs.
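The daily-intake calculation mentioned at the end of the abstract is, at its core, a sum of content times consumed amount over foods. A toy sketch with made-up numbers; the compound, foods, and values are illustrative, not taken from any of the reviewed databases:

```python
# Hypothetical content values (mg per 100 g) and daily consumption (g) --
# in practice both would be retrieved from a food-composition database.
quercetin_mg_per_100g = {"onion": 20.0, "apple": 4.0, "tea (brewed)": 2.0}
daily_consumption_g = {"onion": 50.0, "apple": 150.0, "tea (brewed)": 400.0}

def daily_intake_mg(content, consumption):
    """Sum content (mg/100 g) x amount consumed (g) over all foods."""
    return sum(content[f] * consumption.get(f, 0.0) / 100.0 for f in content)

print(daily_intake_mg(quercetin_mg_per_100g, daily_consumption_g))
```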

  7. GAMCAT - a personal computer database on alpha particles and gamma rays from radioactive decay

    International Nuclear Information System (INIS)

    Tepel, J.W.; Mueller, H.W.

    1990-01-01

    The GAMCAT database is a compilation of data describing the alpha particles and gamma rays that occur in the radioactive decay of all known nuclides, adapted for IBM Personal Computers and compatible systems. These compiled data have been previously published, and are now available as a compact database. Entries can be retrieved by defining the properties of the parent nuclei as well as alpha-particle and gamma-ray energies or any combination of these parameters. The system provides fast access to the data and has been completely written in C to run on an AT-compatible computer, with a hard disk and 640K of memory under DOS 2.11 or higher. GAMCAT is available from the Fachinformationszentrum Karlsruhe. (orig.)
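GAMCAT-style retrieval, combining parent-nuclide properties with alpha-particle or gamma-ray energy windows, can be sketched as a simple filter. The miniature record set below is illustrative only; the real GAMCAT compilation has far more fields and entries:

```python
# Hypothetical miniature of a decay-line table; real GAMCAT data differ.
lines = [
    {"parent": "Co-60", "z": 27, "type": "gamma", "energy_kev": 1173.2},
    {"parent": "Co-60", "z": 27, "type": "gamma", "energy_kev": 1332.5},
    {"parent": "Am-241", "z": 95, "type": "alpha", "energy_kev": 5485.6},
]

def retrieve(lines, kind=None, e_min=None, e_max=None):
    """Combine radiation-type and energy-window criteria, GAMCAT-style."""
    hits = []
    for ln in lines:
        if kind is not None and ln["type"] != kind:
            continue
        if e_min is not None and ln["energy_kev"] < e_min:
            continue
        if e_max is not None and ln["energy_kev"] > e_max:
            continue
        hits.append(ln["parent"])
    return hits

# Which parent emits a gamma ray between 1300 and 1400 keV?
print(retrieve(lines, kind="gamma", e_min=1300.0, e_max=1400.0))
```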

  8. DESIGN AND CONSTRUCTION OF A FOREST SPATIAL DATABASE: AN APPLICATION

    Directory of Open Access Journals (Sweden)

    Turan Sönmez

    2006-11-01

    Full Text Available The General Directorate of Forests (GDF) has not yet created a spatial forest database with which to manage forests and catch up with the developed countries in forestry. The lack of a spatial forest database results in redundant collection of spatial data and in communication problems among the forestry organizations; it also leaves Turkish forestry behind the informatics era. To solve these problems, GDF should establish a spatial forest database supported by a Geographic Information System (GIS). Designing such a database, which provides accurate, timely and current data/info for decision makers and operators in forestry, and developing a sample interface program to apply and monitor classical forest management plans, are paramount in the contemporary forest management planning process. This research is composed of three major stages: (i) spatial prototype database design considering the requirements of the three hierarchical organizations of GDF (regional directorate of forests, forest enterprise, and territorial division); (ii) a user interface program developed to apply and monitor classical management plans based on the designed database; (iii) the implementation of the designed database and its user interface in the Artvin Central Planning Unit.

  9. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    Science.gov (United States)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
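The DB90 addressing scheme, records of a named relation identified by up to 5 integer key values, can be mimicked in a few lines. This is a sketch in Python rather than Fortran, and the relation name and field values are hypothetical:

```python
class Relation:
    """Toy analogue of DB90's record addressing: each record of a named
    relation is uniquely identified by a tuple of up to 5 integer keys."""
    MAX_KEYS = 5

    def __init__(self, name):
        self.name = name
        self._records = {}

    def store(self, keys, record):
        if not 1 <= len(keys) <= self.MAX_KEYS:
            raise ValueError("1 to 5 integer keys required")
        self._records[tuple(keys)] = record

    def fetch(self, keys):
        return self._records[tuple(keys)]

# Hypothetical structural-analysis relation keyed by (load case, node id):
stresses = Relation("nodal_stress")
stresses.store((1, 101), {"sigma_x": 12.5, "sigma_y": -3.1})
print(stresses.fetch((1, 101))["sigma_x"])
```

Because the key tuple, not insertion order, addresses each record, the caller can select records or retrieve data in any desired order, which is the property the report highlights.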

  10. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, first introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. We also propose an interesting application to the formalisation of hybrid systems, obtaining a class of hybrid systems whose trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01-00810) and by the Siberian Branch of RAS (a grant for young researchers, 2000).

  11. Artificial immune system applications in computer security

    CERN Document Server

    Tan, Ying

    2016-01-01

    This book provides state-of-the-art information on the use, design, and development of the Artificial Immune System (AIS) and AIS-based solutions to computer security issues. Artificial Immune System: Applications in Computer Security focuses on the technologies and applications of AIS in malware detection proposed in recent years by the Computational Intelligence Laboratory of Peking University (CIL@PKU). It offers a theoretical perspective as well as practical solutions for readers interested in AIS, machine learning, pattern recognition and computer security. The book begins by introducing the basic concepts, typical algorithms, important features, and some applications of AIS. The second chapter introduces malware and its detection methods, especially immune-based malware detection approaches. Successive chapters present a variety of advanced detection approaches for malware, including Virus Detection System, K-Nearest Neighbour (KNN), RBF networks, and Support Vector Machines (SVM), Danger theory, ...

  12. Collectively loading an application in a parallel computer

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.; Mundy, Michael B.

    2016-01-05

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
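The sequence the abstract claims (identify a subset of nodes, select a job leader, have only the leader retrieve the application from storage, then broadcast it to the subset) can be illustrated with a pure-Python simulation. The node ids, leader-election rule, and application store below are stand-ins, not the patented implementation:

```python
def collective_load(nodes, job_nodes, application_store, job):
    """Sketch of collective loading: elect a job leader within the subset,
    have it 'retrieve' the application once, then broadcast to the subset."""
    leader = min(job_nodes)              # deterministic leader selection (assumed rule)
    application = application_store[job]  # only the leader touches storage
    for node in job_nodes:               # broadcast to every subset member
        nodes[node]["loaded"] = application
    return leader

nodes = {n: {} for n in range(8)}                 # an 8-node toy machine
store = {"jobA": b"\x7fELF...app-image"}          # fake application image
leader = collective_load(nodes, {2, 5, 6}, store, "jobA")
print(leader, sorted(n for n in nodes if "loaded" in nodes[n]))
```

The point of the scheme is that storage is read once rather than once per node, so load on the file system stays constant as the subset grows.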

  13. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  14. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering that are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  15. Interactive computer graphics applications for compressible aerodynamics

    Science.gov (United States)

    Benson, Thomas J.

    1994-01-01

    Three computer applications have been developed to solve inviscid compressible fluids problems using interactive computer graphics. The first application is a compressible flow calculator which solves for isentropic flow, normal shocks, and oblique shocks or centered expansions produced by two dimensional ramps. The second application couples the solutions generated by the first application to a more graphical presentation of the results to produce a desk top simulator of three compressible flow problems: 1) flow past a single compression ramp; 2) flow past two ramps in series; and 3) flow past two opposed ramps. The third application extends the results of the second to produce a design tool which solves for the flow through supersonic external or mixed compression inlets. The applications were originally developed to run on SGI or IBM workstations running GL graphics. They are currently being extended to solve additional types of flow problems and modified to operate on any X-based workstation.

  16. Statistical methods and computer applications

    CERN Document Server

    Arora, PN

    2009-01-01

    Some of the exclusive features of the book are: every concept has been explained with the help of solved examples; working rules showing the various steps for the application of formulae have also been given; the diagrams and graphs have been neatly and correctly drawn, in such a way that students gain a complete understanding of the problem simply by looking at them. Efforts have been made to make the subject thoroughly exhaustive, and nothing important has been omitted. Answers to all the problems have been thoroughly checked. It is a user-friendly book containing many solved problems and

  17. Access database application in medical treatment management platform

    International Nuclear Information System (INIS)

    Wu Qingming

    2014-01-01

    For timely, accurate and flexible access to medical expense data, we applied the Microsoft Access 2003 database management software to build a management platform for medical expenses. With this platform, overall hospital medical expenses can be controlled through real-time monitoring. Using the Access database management platform for medical expenses not only changes the management model, but also promotes a sound management system for medical expenses. (authors)
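At query time, a monitoring platform like the one described reduces to aggregate SQL over an expense table. A sketch using Python's sqlite3 in place of the Access back end; the table and column names are assumptions, not from the paper:

```python
import sqlite3

# sqlite3 stands in for the Access back end in this sketch; the schema
# (dept, patient_id, amount) is illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE expense (dept TEXT, patient_id INTEGER, amount REAL)")
con.executemany("INSERT INTO expense VALUES (?, ?, ?)",
                [("surgery", 1, 1200.0), ("surgery", 2, 300.0),
                 ("radiology", 1, 450.0)])

# Real-time monitoring boils down to aggregate queries like this one:
for dept, total in con.execute(
        "SELECT dept, SUM(amount) FROM expense GROUP BY dept ORDER BY dept"):
    print(dept, total)
```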

  18. GSAC - Generic Seismic Application Computing

    Science.gov (United States)

    Herrmann, R. B.; Ammon, C. J.; Koper, K. D.

    2004-12-01

    With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format have established it as an important component of seismological research. Tools for working with SAC files fall into two categories: custom research-quality processing codes, and shared display/processing tools such as SAC2000, MatSeis, etc., which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis or even introductory tutorials could preclude code improvements and the development of expertise in its use. We believe that there is a place for new, especially open-source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features, and openness, so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac, part of the Computer Programs in Seismology 3.30 distribution, has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is completely programmed in C from scratch, is small, fast, and easy to maintain and extend. It is command-line based and is easily included within shell processing scripts. PySAC is a set of Python functions that allow easy access to SAC files and enable efficient

  19. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    Soft computing techniques, which are based on the information processing of biological systems, are now used extensively in pattern recognition, prediction and planning, as well as acting on the environment. Strictly speaking, soft computing is not a homogeneous body of concepts and techniques; rather, it is an amalgamation of distinct methods that conform to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors demonstrating the use of soft computing techniques in various engineering applications.

  20. Conformal geometry computational algorithms and engineering applications

    CERN Document Server

    Jin, Miao; He, Ying; Wang, Yalin

    2018-01-01

    This book offers an essential overview of computational conformal geometry applied to fundamental problems in specific engineering fields. It introduces readers to conformal geometry theory and discusses implementation issues from an engineering perspective.  The respective chapters explore fundamental problems in specific fields of application, and detail how computational conformal geometric methods can be used to solve them in a theoretically elegant and computationally efficient way. The fields covered include computer graphics, computer vision, geometric modeling, medical imaging, and wireless sensor networks. Each chapter concludes with a summary of the material covered and suggestions for further reading, and numerous illustrations and computational algorithms complement the text.  The book draws on courses given by the authors at the University of Louisiana at Lafayette, the State University of New York at Stony Brook, and Tsinghua University, and will be of interest to senior undergraduates, gradua...

  1. Relational databases for rare disease study: application to vascular anomalies.

    Science.gov (United States)

    Perkins, Jonathan A; Coltrera, Marc D

    2008-01-01

    To design a relational database integrating clinical and basic science data needed for multidisciplinary treatment and research in the field of vascular anomalies. Based on data points agreed on by the American Society of Pediatric Otolaryngology (ASPO) Vascular Anomalies Task Force. The database design enables sharing of data subsets in a Health Insurance Portability and Accountability Act (HIPAA)-compliant manner for multisite collaborative trials. Vascular anomalies pose diagnostic and therapeutic challenges. Our understanding of these lesions and treatment improvement is limited by nonstandard terminology, severity assessment, and measures of treatment efficacy. The rarity of these lesions places a premium on coordinated studies among multiple participant sites. The relational database design is conceptually centered on subjects having 1 or more lesions. Each anomaly can be tracked individually along with its treatment outcomes. This design allows for differentiation between treatment responses and untreated lesions' natural course. The relational database design eliminates data entry redundancy and results in extremely flexible search and data export functionality. Vascular anomaly programs in the United States. A relational database correlating clinical findings and photographic, radiologic, histologic, and treatment data for vascular anomalies was created for stand-alone and multiuser networked systems. Proof of concept for independent site data gathering and HIPAA-compliant sharing of data subsets was demonstrated. The collaborative effort by the ASPO Vascular Anomalies Task Force to create the database helped define a common vascular anomaly data set. The resulting relational database software is a powerful tool to further the study of vascular anomalies and the development of evidence-based treatment innovation.
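The subject-lesion-treatment design described (one subject, one or more lesions, each lesion tracked with its own treatments) can be sketched as a small relational schema. The table and column names below are illustrative assumptions, not the ASPO task-force data set:

```python
import sqlite3

# Illustrative schema only -- names are not from the ASPO data set.
ddl = """
CREATE TABLE subject (subject_id INTEGER PRIMARY KEY);
CREATE TABLE lesion (
    lesion_id INTEGER PRIMARY KEY,
    subject_id INTEGER NOT NULL REFERENCES subject(subject_id),
    diagnosis TEXT);
CREATE TABLE treatment (
    treatment_id INTEGER PRIMARY KEY,
    lesion_id INTEGER NOT NULL REFERENCES lesion(lesion_id),
    modality TEXT, response TEXT);
"""
con = sqlite3.connect(":memory:")
con.executescript(ddl)
con.execute("INSERT INTO subject VALUES (1)")
con.executemany("INSERT INTO lesion VALUES (?, 1, ?)",
                [(10, "infantile hemangioma"), (11, "venous malformation")])
con.execute("INSERT INTO treatment VALUES (100, 10, 'propranolol', 'partial')")

# Untreated lesions (natural course) fall out of a simple anti-join,
# which is how the design separates them from treatment responses:
untreated = [r[0] for r in con.execute(
    "SELECT diagnosis FROM lesion l WHERE NOT EXISTS "
    "(SELECT 1 FROM treatment t WHERE t.lesion_id = l.lesion_id)")]
print(untreated)
```

Because each treatment row references one lesion rather than one subject, per-lesion outcomes never have to be re-entered at the subject level, which is the redundancy-elimination property the abstract mentions.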

  2. Student Advising and Retention Application in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2016-11-01

    Full Text Available This paper proposes a new user-friendly application enhancing and expanding the current advising services of Gradesfirst, currently used for advising and retention by the Athletic Department of UMES, with a view to implementing new performance activities such as mentoring, tutoring, scheduling, and study hall hours within existing tools. This application includes various measurements that can be used to monitor and improve the performance of students in the Athletic Department of UMES by monitoring students' weekly study hall hours and tutoring schedules. It also supervises tutors' login and logout activities in order to monitor their effectiveness, supervises tutor-tutee interaction, and stores and analyzes the overall academic progress of each student. A dedicated server for providing these services will be developed at the local site. The work has been implemented in three steps. The first step involves the creation of an independent cloud computing environment that provides resources such as database creation, query-based statistical data, performance-measure activities, and automated support of performance measures such as advising, mentoring, monitoring and tutoring. The second step involves the creation of an application known as Student Advising and Retention (SAR) in this cloud computing environment. This application has been designed as a comprehensive database management system which contains relevant data regarding student academic development and supports various strategic advising and monitoring of students. The third step involves the creation of systematic advising charts and frameworks which help advisors. The paper shows ways of creating the most appropriate advising technique based on the student's academic needs. The proposed application runs on a Windows-based system. As stated above, the proposed application is expected to enhance and expand the current advising service of the Gradesfirst tool. A brief

  3. Computer systems and methods for the query and visualization of multidimensional databases

    Science.gov (United States)

    Stolte, Chris; Tang, Diane L; Hanrahan, Patrick

    2015-03-03

    A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes multiple operand names, each operand corresponding to one or more fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first operands with the columns shelf and to associate one or more second operands with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first operands, and each pane has a y-axis defined based on data for the one or more second operands.
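The pane layout the patent describes, one pane per combination of row-shelf and column-shelf operand values, can be sketched as a partition of the data records. The field names and data below are invented for illustration:

```python
from itertools import product

# Toy sketch: each (row-operand value, column-operand value) pair
# defines one pane of the visual table.
rows_shelf = ["region"]
columns_shelf = ["year"]
data = [
    {"region": "east", "year": 2003, "sales": 10},
    {"region": "west", "year": 2003, "sales": 7},
    {"region": "east", "year": 2004, "sales": 12},
]

def panes(data, rows, cols):
    """Partition records into a grid of panes keyed by shelf values."""
    row_vals = sorted({tuple(d[f] for f in rows) for d in data})
    col_vals = sorted({tuple(d[f] for f in cols) for d in data})
    return {(r, c): [d for d in data
                     if tuple(d[f] for f in rows) == r
                     and tuple(d[f] for f in cols) == c]
            for r, c in product(row_vals, col_vals)}

layout = panes(data, rows_shelf, columns_shelf)
print(len(layout))  # 2 regions x 2 years
```

Note that the grid is the full cross-product of shelf values, so a pane may be empty (here, west/2004), just as a visual table can show empty cells.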

  4. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    Science.gov (United States)

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
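Making complex data queries "an integral part of data analysis" can be as simple as pushing a condition contrast into SQL. A sketch using Python's sqlite3 and a fabricated two-condition time series; the schema and numbers are illustrative, not the systems the paper implemented:

```python
import sqlite3

# Sketch: store fMRI time-series samples relationally, then let the
# query itself do part of the analysis (schema is illustrative).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE bold (subject INTEGER, voxel INTEGER,
               condition TEXT, t INTEGER, signal REAL)""")
rows = [(1, 7, "speech", t, 100.0 + t) for t in range(4)] + \
       [(1, 7, "rest", t, 90.0) for t in range(4)]
con.executemany("INSERT INTO bold VALUES (?, ?, ?, ?, ?)", rows)

# Condition contrast for one voxel, computed inside the database:
(speech_mean, rest_mean), = [tuple(r) for r in con.execute(
    "SELECT AVG(CASE WHEN condition='speech' THEN signal END), "
    "       AVG(CASE WHEN condition='rest' THEN signal END) "
    "FROM bold WHERE subject=1 AND voxel=7")]
print(speech_mean - rest_mean)
```

Once the contrast is a query, sharing it across a collaboration or fanning it out over voxels on a cluster is a scheduling problem rather than a file-format problem, which is the paper's argument against binary or text files.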

  5. Computer systems and methods for the query and visualization of multidimensional databases

    Science.gov (United States)

    Stolte, Chris [Palo Alto, CA; Tang, Diane L [Palo Alto, CA; Hanrahan, Patrick [Portola Valley, CA

    2011-02-01

    In response to a user request, a computer generates a graphical user interface on a computer display. A schema information region of the graphical user interface includes multiple operand names, each operand name associated with one or more fields of a multi-dimensional database. A data visualization region of the graphical user interface includes multiple shelves. Upon detecting a user selection of the operand names and a user request to associate each user-selected operand name with a respective shelf in the data visualization region, the computer generates a visual table in the data visualization region in accordance with the associations between the operand names and the corresponding shelves. The visual table includes a plurality of panes, each pane having at least one axis defined based on data for the fields associated with a respective operand name.

  6. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  7. IP Telephony Applicability in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Francisco Palacios

    2018-02-01

    Full Text Available This paper carries out research on the applicability of VoIP over cloud computing to guarantee service stability and elasticity for organizations. In this paper, Elastix is used as open-source software that allows the management and control of a Private Branch Exchange (PBX); for development, the services of Amazon Web Services are used, owing to their leadership and experience in cloud computing, providing security, scalability, backup services and feasibility for users.

  8. Accomplish the Application Area in Cloud Computing

    OpenAIRE

    Bansal, Nidhi; Awasthi, Amit

    2012-01-01

    In surveying the application areas of cloud computing, we find that the breadth of areas it covers is its main asset. At a top level, it is an approach to IT in which many users, some even from different companies, get access to shared IT resources such as servers, routers and various file extensions, instead of each having their own dedicated servers. This offers many advantages, such as lower costs and higher efficiency. Unfortunately, there have been some high-profile incidents whe...

  9. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend existing literature to the latest development in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency and time domain integral equations, and statistics methods in bio-electromagnetics.

  10. New Concepts and Applications in Soft Computing

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária

    2013-01-01

    The book provides a sample of research on the innovative theory and applications of soft computing paradigms. The idea of Soft Computing was initiated in 1981, when Professor Zadeh published his first paper on soft data analysis, and it has constantly evolved ever since. Professor Zadeh defined Soft Computing as the fusion of the fields of fuzzy logic (FL), neural network theory (NN) and probabilistic reasoning (PR), with the latter subsuming belief networks, evolutionary computing including DNA computing, chaos theory and parts of learning theory, into one multidisciplinary system. As Zadeh said, the essence of soft computing is that, unlike traditional hard computing, soft computing is aimed at an accommodation with the pervasive imprecision of the real world. Thus, the guiding principle of soft computing is to exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. ...

  11. Cloud computing for data-intensive applications

    CERN Document Server

    Li, Xiaolin

    2014-01-01

    This book presents a range of cloud computing platforms for data-intensive scientific applications. It covers systems that deliver infrastructure as a service, including: HPC as a service; virtual networks as a service; scalable and reliable storage; algorithms that manage vast cloud resources and application runtimes; and programming models that enable pragmatic programming and implementation toolkits for eScience applications. Many scientific applications in clouds are also introduced, such as bioinformatics, biology, weather forecasting and social networks. Most chapters include case studies

  12. Application of cluster computing in materials science

    International Nuclear Information System (INIS)

    Kuzmin, A.

    2006-01-01

    Solution of many problems in materials science requires that high performance computing (HPC) be used. Therefore, a cluster computer, Latvian Super-cluster (LASC), was constructed at the Institute of Solid State Physics of the University of Latvia in 2002. The LASC is used for advanced research in the fields of quantum chemistry, solid state physics and nano materials. In this work we overview currently available computational technologies and exemplify their application by interpretation of x-ray absorption spectra for nano-sized ZnO. (author)

  13. Cloud computing with e-science applications

    CERN Document Server

    Terzo, Olivier

    2015-01-01

    The amount of data in everyday life has been exploding. This data increase has been especially significant in scientific fields, where substantial amounts of data must be captured, communicated, aggregated, stored, and analyzed. Cloud Computing with e-Science Applications explains how cloud computing can improve data management in data-heavy fields such as bioinformatics, earth science, and computer science. The book begins with an overview of cloud models supplied by the National Institute of Standards and Technology (NIST), and then:Discusses the challenges imposed by big data on scientific

  14. Application engineering for process computer systems

    International Nuclear Information System (INIS)

    Mueller, K.

    1975-01-01

    The variety of tasks for process computers in nuclear power stations necessitates the centralization of all production stages from the planning stage to the delivery of the finished process computer system (PRA) to the user. This so-called 'application engineering' comprises all of the activities connected with the application of the PRA: a) establishment of the PRA concept, b) project counselling, c) handling of offers, d) handling of orders, e) internal handling of orders, f) technical counselling, g) establishing of parameters, h) monitoring deadlines, i) training of customers, j) compiling an operation manual. (orig./AK)

  15. Database Application for a Youth Market Livestock Production Education Program

    Science.gov (United States)

    Horney, Marc R.

    2013-01-01

    This article offers an example of a database designed to support teaching animal production and husbandry skills in county youth livestock programs. The system was used to manage production goals, animal growth and carcass data, photos and other imagery, and participant records. These were used to produce a variety of customized reports to help…

  16. Development and Use of an EFL Reading Practice Application for an Android Tablet Computer

    Science.gov (United States)

    Ishikawa, Yasushige; Smith, Craig; Kondo, Mutsumi; Akano, Ichiro; Maher, Kate; Wada, Norihisa

    2014-01-01

    This paper reports on the use of an English-language reading practice application for an Android tablet computer with students who are not native speakers of English. The application materials for vocabulary learning in reading-passage contexts were created to include words from a database of low-frequency and technical noun-verb collocations…

  17. ARIDA: An Arabic Inter-Language Database and Its Applications: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Ghazi Abuhakema

    2009-08-01

    This paper describes a pilot study in which we collected a small learner corpus of Arabic, developed a tagset for error annotation, and performed simple Computer-aided Error Analysis (CEA) on the data. For this study, we adapted the French Interlanguage Database (FRIDA) tagset (Granger, 2003a) to the data. We chose FRIDA in order to keep our tagging in line with a known standard. The paper describes the need for learner corpora, the learner data we have collected, the tagset we have developed with its advantages and disadvantages, the preliminary CEA results, other potential applications of the error-annotated corpus of Arabic, and the error frequency distribution of both proficiency levels, as well as our ongoing work.

  18. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    Science.gov (United States)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review on electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of pulse current occurring in electrochemical machining of aviation materials have been studied. Based on integrating the experimental results and comprehensive electrochemical machining process data modeling, a subsystem for computer-aided design of electrochemical machining for gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  19. Computational methods for industrial radiation measurement applications

    International Nuclear Information System (INIS)

    Gardner, R.P.; Guo, P.; Ao, Q.

    1996-01-01

    Computational methods have been used with considerable success to complement radiation measurements in solving a wide range of industrial problems. The almost exponential growth of computer capability and applications in the last few years leads to a "black box" mentality for radiation measurement applications. If a black box is defined as any radiation measurement device that is capable of measuring the parameters of interest when a wide range of operating and sample conditions may occur, then the development of computational methods for industrial radiation measurement applications should now be focused on the black box approach and the deduction of properties of interest from the response with acceptable accuracy and reasonable efficiency. Nowadays, an increasingly better understanding of radiation physical processes, more accurate and complete fundamental physical data, and more advanced modeling and software/hardware techniques have made it possible to make giant strides in that direction, with new ideas implemented in computer software. The Center for Engineering Applications of Radioisotopes (CEAR) at North Carolina State University has been working for quite some time on a variety of projects in the area of radiation analyzers and gauges aimed at accomplishing this, and these projects are discussed here with emphasis on current accomplishments.

  20. Application of computer technique in SMCAMS

    International Nuclear Information System (INIS)

    Lu Deming

    2001-01-01

    A series of applications of computer technique in SMCAMS physics design and magnetic field measurement is described, including digital calculation of electric-magnetic field, beam dynamics, calculation of beam injection and extraction, and mapping and shaping of the magnetic field

  1. CT applications of medical computer graphics

    International Nuclear Information System (INIS)

    Rhodes, M.L.

    1985-01-01

    Few applications of computer graphics show as much promise and early success as that for CT. Unlike electron microscopy, ultrasound, business, military, and animation applications, CT image data are inherently digital. CT pictures can be processed directly by programs well established in the fields of computer graphics and digital image processing. Methods for reformatting digital pictures, enhancing structure shape, reducing image noise, and rendering three-dimensional (3D) scenes of anatomic structures have all become routine at many CT centers. In this chapter, the authors provide a brief introduction to computer graphics terms and techniques commonly applied to CT pictures and, when appropriate, to those showing promise for magnetic resonance images. Topics discussed here are image-processing options that are applied to digital images already constructed. In the final portion of this chapter, techniques for "slicing" CT image data are presented, and geometric principles that describe the specification of oblique and curved images are outlined. Clinical examples are included.
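    The geometric principle behind oblique reformatting can be sketched in a few lines: a slice plane is specified by an origin and two in-plane direction vectors, and the output image is filled by sampling the volume along that plane. The sketch below uses nearest-neighbor sampling on a toy nested-list volume; all names are invented for illustration, and a real CT workstation would use trilinear interpolation.

    ```python
    # Sketch: extract an oblique slice from a voxel volume by sampling along a
    # plane p = origin + r*u + c*v, rounding each sample to the nearest voxel.

    def oblique_slice(volume, origin, u, v, rows, cols):
        """Sample `volume` (nested lists indexed [z][y][x]) on the given plane."""
        nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
        out = []
        for r in range(rows):
            row = []
            for c in range(cols):
                x = round(origin[0] + r * u[0] + c * v[0])
                y = round(origin[1] + r * u[1] + c * v[1])
                z = round(origin[2] + r * u[2] + c * v[2])
                if 0 <= x < nx and 0 <= y < ny and 0 <= z < nz:
                    row.append(volume[z][y][x])
                else:
                    row.append(0)  # samples outside the volume are padded
            out.append(row)
        return out

    # Tiny 2x2x2 volume: each value encodes its (z, y, x) index for checking.
    vol = [[[z * 100 + y * 10 + x for x in range(2)] for y in range(2)]
           for z in range(2)]
    axial = oblique_slice(vol, origin=(0, 0, 1), u=(0, 1, 0), v=(1, 0, 0),
                          rows=2, cols=2)  # [[100, 101], [110, 111]]
    ```

    Choosing non-axis-aligned `u` and `v` yields the oblique and curved reformats mentioned in the abstract.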

  2. Application of cloud database in the management of clinical data of patients with skin diseases.

    Science.gov (United States)

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of using a cloud database in the daily practice of a dermatology department. A cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores of self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients were included and analyzed using the cloud database. The disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  3. Computational electromagnetics recent advances and engineering applications

    CERN Document Server

    2014-01-01

    Emerging Topics in Computational Electromagnetics presents advances in computational electromagnetics. This book is designed to fill the existing gap in the current CEM literature, which covers only the conventional numerical techniques for solving traditional EM problems. The book examines new algorithms, and applications of these algorithms for solving problems of current interest that are not readily amenable to efficient treatment by using existing techniques. The authors discuss solution techniques for problems arising in nanotechnology, bioEM, metamaterials, as well as multiscale problems. They present techniques that utilize recent advances in computer technology, such as parallel architectures, and address the increasing need to solve large and complex problems in a time-efficient manner by using highly scalable algorithms.

  4. Wearable computer technology for dismounted applications

    Science.gov (United States)

    Daniels, Reginald

    2010-04-01

    Small computing devices which rival the compact size of traditional personal digital assistants (PDA) have recently established a market niche. These computing devices are small enough to be considered unobtrusive for humans to wear. The computing devices are also powerful enough to run full multi-tasking general purpose operating systems. This paper will explore the wearable computer information system for dismounted applications recently fielded for ground-based US Air Force use. The environments that the information systems are used in will be reviewed, as well as a description of the net-centric, ground-based warrior. The paper will conclude with a discussion regarding the importance of intuitive, usable, and unobtrusive operator interfaces for dismounted operators.

  5. A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.

    Science.gov (United States)

    Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo

    2015-01-01

    The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical space. Alternatively, Cloud Computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from PACS, converting the information contained in each image file to a NoSQL database, and using cloud computing to store the digital images.
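    The anonymization step described above can be sketched as a small transformation: copy the image metadata into a plain document for the NoSQL store, drop direct identifiers, and replace the patient ID with a one-way pseudonym so studies of the same patient can still be linked. The field names and salt below are illustrative, not the actual DICOM attribute set or the authors' scheme.

    ```python
    # Sketch: build an anonymized metadata document for a NoSQL store.
    import hashlib
    import json

    # Hypothetical set of directly identifying fields to strip.
    IDENTIFYING_FIELDS = {"PatientName", "PatientBirthDate", "PatientAddress"}

    def anonymize(record, salt="site-secret"):
        doc = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
        # Replace the patient ID with a salted one-way hash (pseudonym).
        pid = record.get("PatientID", "")
        doc["PatientID"] = hashlib.sha256((salt + pid).encode()).hexdigest()[:16]
        return doc

    record = {"PatientID": "12345", "PatientName": "DOE^JOHN",
              "Modality": "CT", "StudyDate": "20150101"}
    doc = anonymize(record)
    print(json.dumps(doc, sort_keys=True))
    ```

    The resulting document carries only non-identifying study metadata, so it can be stored in a public cloud alongside the pixel data.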

  6. In-database processing of a large collection of remote sensing data: applications and implementation

    Science.gov (United States)

    Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina

    2016-04-01

    Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and has inspired several information systems, some of which, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to the Earth sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability, and provide a high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating files into a relational database as foreign data sources and performing analytical processing inside the database engine. Thereby a higher-level query language can efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or a grid cell to complex aggregation over spatial or temporal extents across a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout, and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications. The use of SQL - a widely used, higher-level declarative query language - simplifies interoperability.
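    The core idea, pushing aggregation into the database engine rather than looping over files in application code, can be illustrated with a toy example. The sketch uses SQLite (stdlib) instead of PostgreSQL, and a made-up one-row-per-pixel-observation layout; the paper's foreign-data-source integration is far more elaborate.

    ```python
    # Toy in-database aggregation: a per-grid-cell mean over many observations
    # becomes a single declarative SQL query executed inside the engine.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE obs (cell_id INTEGER, t TEXT, value REAL)")
    con.executemany("INSERT INTO obs VALUES (?, ?, ?)", [
        (1, "2015-07", 0.25), (1, "2015-08", 0.75),
        (2, "2015-07", 1.00), (2, "2015-08", 3.00),
    ])
    # The aggregation runs where the data lives; only the summary comes back.
    rows = con.execute(
        "SELECT cell_id, AVG(value) FROM obs GROUP BY cell_id ORDER BY cell_id"
    ).fetchall()
    print(rows)  # [(1, 0.5), (2, 2.0)]
    ```

    The same query shape scales from one pixel to a decade-long archive; only the engine's execution plan changes, not the application code.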

  7. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia, in conjunction with the Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, the General Association of Engineers in Romania - Arad Section, the Institute of Computer Science, Iasi Branch of the Romanian Academy, and the IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  8. A database of fragmentation cross section measurements applicable to cosmic ray propagation calculations

    International Nuclear Information System (INIS)

    Crawford, H.J.; Engelage, J.; Jones, F.C.

    1989-08-01

    A database of single particle inclusive fragment production cross section measurements has been established and is accessible over common computer networks. These measurements have been obtained from both the published literature and direct communication with experimenters, and include cross sections for nuclear beams on H, He, and heavier targets, and for H and He beams on nuclear targets, for energies >30 MeV/nucleon. These cross sections are directly applicable to calculations involving cosmic ray nuclear interactions with matter. The database includes projectile, target, and fragment specifications, beam energy, cross section with uncertainty, literature reference, and a comment code. It is continuously updated to assure accuracy and completeness. Also available are widely used semi-empirical formulations for calculating production cross sections and excitation functions. In this paper we discuss the database in detail and describe how it can be accessed. We compare the measurements with semi-empirical calculations and point out areas where improved calculations and further cross section measurements are required. 5 refs., 2 figs.

  9. A personal digital assistant application (MobilDent) for dental fieldwork data collection, information management and database handling.

    Science.gov (United States)

    Forsell, M; Häggström, M; Johansson, O; Sjögren, P

    2008-11-08

    To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using an SQL Server 2005 (Microsoft) database for data storage with the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axim. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows and suited to a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.

  10. Text-Mining Applications for Creation of Biofilm Literature Database

    Directory of Open Access Journals (Sweden)

    Kanika Gupta

    2017-10-01

    In the present research, a corpus of 34,306 published documents on biofilm was collected from the PubMed database, along with non-indexed resources such as books, conference proceedings and newspaper articles, and these were divided into five categories, i.e. classification, growth and development, physiology, drug effects and radiation effects. These five categories were each further divided into three parts, i.e. journal title, abstract title, and abstract text, to make indexing highly specific. Text processing was done using the software Rapid Miner_v5.3, which tokenizes the entire text into words and provides the frequency of each word within the document. The obtained words were normalized using the Remove Stop and Stem Word commands of Rapid Miner_v5.3, which remove stop words and word stems. The resulting words were stored in MS Excel 2007 and sorted in decreasing order of frequency using the Sort & Filter command of MS Excel 2007. The words were visualized through networks obtained with Cytoscape_v2.7.0. The words obtained are highly specific to biofilms, generating a controlled biofilm vocabulary, and this vocabulary could be used for indexing biofilm articles (similar to the MeSH database, which indexes articles for PubMed). The obtained keyword information was stored in a relational database hosted locally using the WAMP_v2.4 (Windows, Apache, MySQL, PHP) server. The available biofilm vocabulary will be significant for researchers studying the biofilm literature, making their searches easy and efficient.
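    The pipeline described above (tokenize, drop stop words, stem, rank by frequency) can be condensed into a few lines of plain Python. Rapid Miner's actual tokenizer and stemmer differ; the stop-word list and crude suffix-stripping stemmer below are purely illustrative.

    ```python
    # Sketch of the tokenize / stop-word / stem / frequency-rank pipeline.
    from collections import Counter
    import re

    STOP_WORDS = {"the", "of", "in", "and", "a", "is", "by", "to"}

    def stem(word):
        # Crude suffix stripping, stand-in for a real stemmer.
        for suffix in ("ing", "ed", "s"):
            if word.endswith(suffix) and len(word) > len(suffix) + 2:
                return word[: -len(suffix)]
        return word

    def keyword_frequencies(text):
        tokens = re.findall(r"[a-z]+", text.lower())
        stems = [stem(t) for t in tokens if t not in STOP_WORDS]
        return Counter(stems).most_common()

    abstract = ("Biofilms and biofilm formation: the growth of biofilms "
                "is affected by drug effects and radiation effects.")
    freqs = keyword_frequencies(abstract)
    print(freqs)
    ```

    Sorting the resulting stem counts in decreasing order yields exactly the kind of frequency-ranked keyword list the study stored in its relational vocabulary database.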

  11. DEVELOPING FLEXIBLE APPLICATIONS WITH XML AND DATABASE INTEGRATION

    Directory of Open Access Journals (Sweden)

    Hale AS

    2004-04-01

    In recent years, the most popular subject in the information systems area has been Enterprise Application Integration (EAI). It can be defined as the process of forming a standard connection between the different systems of an organization's information system environment. Corporate mergers and acquisitions are among the major reasons for the popularity of Enterprise Application Integration. The main purpose is to solve application integration problems while similar systems in such corporations continue working together for some time. With the help of XML technology, it is possible to find solutions to the problems of application integration either within a corporation or between corporations.

  12. Enterprise Android programming Android database applications for the enterprise

    CERN Document Server

    Mednieks, Zigurd; Dornin, Laird; Pan, Zane

    2013-01-01

    The definitive guide to building data-driven Android applications for enterprise systems Android devices represent a rapidly growing share of the mobile device market. With the release of Android 4, they are moving beyond consumer applications into corporate/enterprise use. Developers who want to start building data-driven Android applications that integrate with enterprise systems will learn how with this book. In the tradition of Wrox Professional guides, it thoroughly covers sharing and displaying data, transmitting data to enterprise applications, and much more. Shows Android developers w

  13. Analysis and Design of Web-Based Database Application for Culinary Community

    Directory of Open Access Journals (Sweden)

    Choirul Huda

    2017-03-01

    This research is motivated by the rapid development of the culinary field and of information technology. The difficulties in communicating with culinary experts and in documenting recipes make proper media support very important. Therefore, a web-based database application for the public is important to help the culinary community in communication, searching and recipe management. The aim of the research was to design a web-based database application that could be used as social media for the culinary community. This research used literature review, user interviews, and questionnaires. Moreover, the database system development life cycle was used as a guide for designing the database, especially the conceptual, logical, and physical database design. The web-based application design used the eight golden rules of user interface design. The result of this research is a web-based database application that can fulfill the needs of users in the culinary field related to communication and recipe management.

  14. Development of Integrated PSA Database and Application Technology

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Hoon; Park, Jin Hee; Kim, Seung Hwan; Choi, Sun Yeong; Jung, Woo Sik; Jeong, Kwang Sub; Ha Jae Joo; Yang, Joon Eon; Min Kyung Ran; Kim, Tae Woon

    2005-04-15

    The purpose of this project is to develop 1) a reliability database framework, 2) a methodology for reactor trip and abnormal event analysis, and 3) a prototype PSA information DB system. We already have part of the reactor trip and component reliability data; in this study, we extend the data collection up to 2002. We construct a pilot reliability database for common cause failure and piping failure data. A reactor trip or a component failure may have an impact on the safety of a nuclear power plant. We perform precursor analysis for such events that occurred in the KSNP and develop a procedure for the precursor analysis. A risk monitor provides a means to trace changes in risk following changes in the plant configuration. We develop a methodology incorporating the model of the secondary system related to reactor trip into the risk monitor model. We develop a prototype PSA information system for the UCN 3 and 4 PSA models, into which information for the PSA is entered, such as PSA reports, analysis reports, thermal-hydraulic analysis results, system notebooks, and so on. We develop a unique coherent BDD method to quantify fault trees and the fastest fault tree quantification engine, FTREX. We develop quantification software for a full PSA model and a one-top model.
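    To make the quantification step concrete, here is a toy fault-tree evaluator. It assumes independent, non-repeated basic events, which is exactly the case where naive gate-by-gate evaluation is exact; the coherent-BDD method and FTREX engine mentioned above exist to handle the general case with shared events, which this sketch does not. Gate structure and probabilities are invented.

    ```python
    # Toy fault-tree quantification under independence, no repeated events.
    # A node is either a basic-event probability (float) or a nested tuple
    # ("AND" | "OR", child, child, ...).

    def top_event_prob(node):
        if isinstance(node, float):      # basic event
            return node
        gate, *children = node
        probs = [top_event_prob(c) for c in children]
        if gate == "AND":                # all children must fail
            p = 1.0
            for q in probs:
                p *= q
            return p
        if gate == "OR":                 # at least one child fails
            p = 1.0
            for q in probs:
                p *= 1.0 - q
            return 1.0 - p
        raise ValueError(gate)

    # TOP = (pump A fails AND pump B fails) OR control power fails
    tree = ("OR", ("AND", 0.1, 0.1), 0.02)
    p_top = top_event_prob(tree)         # 1 - (1 - 0.01)(1 - 0.02) = 0.0298
    print(p_top)
    ```

    With repeated basic events the naive product rules over-count, which is why production tools convert the tree to a BDD before quantifying.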

  15. Semantic-Based Concurrency Control for Object-Oriented Database Systems Supporting Real-Time Applications

    National Research Council Canada - National Science Library

    Lee, Juhnyoung; Son, Sang H

    1994-01-01

    .... This paper investigates major issues in designing semantic-based concurrency control for object-oriented database systems supporting real-time applications, and it describes approaches to solving...

  16. Applicability of Computational Systems Biology in Toxicology

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Hadrup, Niels; Audouze, Karine Marie Laure

    2014-01-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. Such data can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method...

  17. Analysis and Design of Web-Based Database Application for Culinary Community

    OpenAIRE

    Huda, Choirul; Awang, Osel Dharmawan; Raymond, Raymond; Raynaldi, Raynaldi

    2017-01-01

    This research is based on the rapid development of the culinary and information technology. The difficulties in communicating with the culinary expert and on recipe documentation make a proper support for media very important. Therefore, a web-based database application for the public is important to help the culinary community in communication, searching and recipe management. The aim of the research was to design a web-based database application that could be used as social media for the cu...

  18. Computational logic: its origins and applications.

    Science.gov (United States)

    Paulson, Lawrence C

    2018-02-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.
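    The LCF idea mentioned above, that user code cannot compromise correctness, rests on a simple design: theorems are values that only a small trusted kernel can construct, so arbitrary code can combine inference rules but never forge a theorem. The miniature propositional kernel below illustrates the pattern; its rule names and token mechanism are invented for this sketch, and real LCF kernels (such as Isabelle's) are far richer.

    ```python
    # A toy LCF-style kernel: Theorem objects can only come from kernel rules.

    class Theorem:
        """Instances are constructed only by the trusted kernel functions."""
        def __init__(self, prop, _token=None):
            if _token is not _KERNEL:
                raise ValueError("theorems come only from kernel rules")
            self.prop = prop

    _KERNEL = object()  # private capability shared only with kernel rules

    def axiom_refl(t):
        """Axiom: |- t = t"""
        return Theorem(("eq", t, t), _token=_KERNEL)

    def rule_sym(thm):
        """Rule: from |- a = b infer |- b = a"""
        tag, a, b = thm.prop
        assert tag == "eq"
        return Theorem(("eq", b, a), _token=_KERNEL)

    th = rule_sym(axiom_refl("x"))
    print(th.prop)  # ('eq', 'x', 'x')
    ```

    User code may chain `axiom_refl` and `rule_sym` however it likes, including inside arbitrary search procedures, yet every `Theorem` that emerges is still guaranteed to have passed through the kernel.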

  19. Stochastic Collocation Applications in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Dragan Poljak

    2018-01-01

    The paper reviews the application of deterministic-stochastic models in some areas of computational electromagnetics. Namely, in certain problems there is uncertainty in the input data set, as some properties of a system are partly or entirely unknown. Thus, a simple stochastic collocation (SC) method is used to determine relevant statistics about the given responses. The SC approach also provides an assessment of the related confidence intervals in the set of calculated numerical results. The expansion of the statistical output in terms of mean and variance over a polynomial basis, via the SC method, is shown to be a robust and efficient approach providing a satisfactory convergence rate. This review paper provides computational examples from previous work by the authors illustrating the successful application of the SC technique in the areas of ground penetrating radar (GPR), human exposure to electromagnetic fields, and buried lines and grounding systems.
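    In its simplest one-dimensional form, stochastic collocation evaluates the deterministic model at a few quadrature points of the input distribution and forms weighted moments, no modification of the solver is needed. The sketch below assumes a single Gaussian input and uses the 3-point Gauss-Hermite rule (exact for polynomial responses up to degree 5); the paper's applications involve more general parameterizations.

    ```python
    # Bare-bones stochastic collocation for y = f(x) with x ~ N(0, 1).
    import math

    # 3-point Gauss-Hermite nodes/weights, rescaled for the standard normal.
    nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
    weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

    def collocation_moments(f):
        """Mean and variance of f(x) from deterministic runs at the nodes."""
        mean = sum(w * f(x) for w, x in zip(weights, nodes))
        second = sum(w * f(x) ** 2 for w, x in zip(weights, nodes))
        return mean, second - mean ** 2

    # For f(x) = x^2 the exact answers are E = 1 and Var = 2.
    mean, var = collocation_moments(lambda x: x * x)
    print(mean, var)
    ```

    Each node corresponds to one run of the underlying deterministic EM solver, which is what makes the approach non-intrusive.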

  20. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  1. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  2. Computing on Encrypted Data: Theory and Application

    Science.gov (United States)

    2016-01-01

    Computing on Encrypted Data: Theory and Application. Massachusetts Institute of Technology, January 2016, Final Technical Report. Bootstrapping permits short ciphertexts - e.g., encrypted using AES - to be decompressed to longer ciphertexts that permit homomorphic operations; this allows us to save memory by storing data encrypted in the compressed form - e.g., under AES. Here, we revisit bootstrapping, viewing it as an...

  3. Application of database management software to probabilistic risk assessment calculations

    International Nuclear Information System (INIS)

    Wyss, G.D.

    1993-01-01

    Probabilistic risk assessment (PRA) calculations require the management and processing of large amounts of information. For example, a commercial nuclear power plant PRA study makes use of plant blueprints and system schematics, formal plant safety analysis reports, incident reports, letters, memos, handwritten notes from plant visits, and even the analyst's "engineering judgment". This information must be documented and cross-referenced in order to properly execute and substantiate the models used in a PRA study. The data normally fall into two general categories. The first category is composed of raw data accumulated from equipment testing and operational experience. These data describe the equipment, its service or testing conditions, its failure mode, and its performance history. The second category is composed of statistical distributions. These distributions can represent probabilities, frequencies, or values of important parameters that are not time-related. Probability and frequency distributions are often obtained by fitting raw data to an appropriate statistical distribution. Database management software is used to store both types of data so that they can be readily queried, manipulated, and archived. This paper provides an overview of the information models used for storing PRA data and illustrates the implementation of these models using examples from current PRA software packages.
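    The two data categories described in this abstract map naturally onto relational tables. A minimal sketch using Python's built-in sqlite3 module (the schema, component names and numbers are illustrative assumptions, not taken from any actual PRA package):

```python
import sqlite3

# In-memory database standing in for a PRA data store (illustrative schema).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE raw_events (
    component TEXT, failure_mode TEXT,
    exposure_hours REAL, failures INTEGER)""")
con.execute("""CREATE TABLE distributions (
    component TEXT, failure_mode TEXT,
    dist_type TEXT, mean REAL)""")

# Category 1: raw operating-experience records (invented numbers).
con.executemany("INSERT INTO raw_events VALUES (?,?,?,?)", [
    ("pump-A", "fail-to-start", 8760.0, 2),
    ("pump-A", "fail-to-start", 4380.0, 1),
])

# Category 2: a fitted statistical distribution; here a simple point
# estimate (total failures / total exposure) serves as the mean of an
# assumed exponential failure model.
row = con.execute("""SELECT SUM(failures), SUM(exposure_hours)
                     FROM raw_events
                     WHERE component='pump-A'""").fetchone()
rate = row[0] / row[1]
con.execute("INSERT INTO distributions VALUES (?,?,?,?)",
            ("pump-A", "fail-to-start", "exponential", rate))

print(round(rate, 6))  # failures per hour
```

    In a real PRA tool the fitted distribution would carry full parameters and uncertainty bounds rather than a single point-estimate rate.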

  4. Discovering Knowledge from AIS Database for Application in VTS

    Science.gov (United States)

    Tsou, Ming-Cheng

    The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable to absorb and analyze it quickly and effectively. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relationship Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides marine traffic managers with a useful strategic planning resource.
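    As a toy illustration of the kind of data mining described in this record, the sketch below aggregates hypothetical AIS position reports into 0.1-degree grid cells and picks the densest cell as a traffic hotspot; the vessel data and the simple density-aggregation choice are invented for illustration:

```python
from collections import Counter

# Hypothetical AIS position reports (vessel id, latitude, longitude);
# real AIS messages carry many more fields (MMSI, SOG, COG, timestamps).
reports = [
    ("V1", 25.03, 121.51), ("V2", 25.04, 121.52),
    ("V3", 25.03, 121.52), ("V4", 24.50, 121.90),
]

def grid_cell(lat, lon):
    """Index a position into a 0.1-degree grid cell (integer keys)."""
    return (int(lat * 10), int(lon * 10))

# The "mining" step here is a simple density aggregation: count
# reports per cell and take the busiest cell as the hotspot.
density = Counter(grid_cell(lat, lon) for _, lat, lon in reports)
hotspot, count = density.most_common(1)[0]
print(hotspot, count)
```

    A production system would cluster on far richer features (speed, heading, vessel type) and over time windows, but the aggregation step has the same shape.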

  5. [Adverse Effect Predictions Based on Computational Toxicology Techniques and Large-scale Databases].

    Science.gov (United States)

    Uesawa, Yoshihiro

    2018-01-01

    Understanding the features of chemical structures related to the adverse effects of drugs is useful for identifying the potential adverse effects of new drugs. This can be based on the limited information available from post-marketing surveillance, assessment of the potential toxicities of metabolites and illegal drugs with unclear characteristics, screening of lead compounds at the drug discovery stage, and identification of leads for the discovery of new pharmacological mechanisms. The present paper describes techniques used in computational toxicology to investigate the content of large-scale spontaneous report databases of adverse effects, illustrated with examples. Furthermore, volcano plotting, a new visualization method for clarifying the relationships between drugs and adverse effects via comprehensive analyses, is introduced. These analyses may produce a great amount of data that can be applied to drug repositioning.

  6. A Computer Knowledge Database of accidents at work in the construction industry

    Science.gov (United States)

    Hoła, B.; Szóstak, M.

    2017-10-01

    At least 60,000 fatal accidents at work occur on building sites all over the world each year, which means that, on average, an employee dies during the execution of work every 10 minutes. In 2015, 5,776 accidents at work happened on Polish building sites, of which 69 resulted in the death of an employee. Accidents are an enormous social and economic burden for companies, communities and countries. The vast majority of accidents at work can be prevented by appropriate and effective preventive measures. Therefore, the Computer Knowledge Database (CKD) was formulated for this purpose; it enables data and information on accidents at work in the construction industry to be collected and processed in order to obtain necessary knowledge. The knowledge gained will be the basis for conclusions of a preventive nature.

  7. Applications of computational intelligence in biomedical technology

    CERN Document Server

    Majernik, Jaroslav; Pancerz, Krzysztof; Zaitseva, Elena

    2016-01-01

    This book presents the latest results and selected applications of Computational Intelligence in Biomedical Technologies. Most of the contributions deal with problems of Biomedical and Medical Informatics, ranging from theoretical considerations to practical applications. Various aspects of the development of methods and algorithms in Biomedical and Medical Informatics, as well as algorithms for medical image processing and modeling methods, are discussed. Individual contributions also cover medical decision-making support, estimation of the risks of treatments, reliability of medical systems, problems of practical clinical applications and many other topics. This book is intended for scientists interested in problems of Biomedical Technologies, for researchers and academic staff, for all those dealing with Biomedical and Medical Informatics, as well as for PhD students. Useful information is also offered to IT companies and developers of equipment and/or software for medicine, and to medical professionals.

  8. Bacterial computing: a form of natural computing and its applications

    Directory of Open Access Journals (Sweden)

    Rafael Lahoz-Beltra

    2014-03-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the full adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular learning appear along the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two distinct but related mechanisms of adaptation to the environment: the former in somatic time, the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes, as well as to the solution of real-world problems.

  9. Bacterial computing: a form of natural computing and its applications.

    Science.gov (United States)

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C

    2014-01-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the full adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular "learning" appear along the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two distinct but related mechanisms of adaptation to the environment: the former in somatic time, the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes, as well as to the solution of real-world problems.

  10. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Planaria is a member of the Phylum Platyhelminthes, the flatworms. Planarians possess the unique ability to regenerate from adult stem cells, or neoblasts, and are important as a model organism for regeneration and developmental studies. Although research is being actively carried out globally through conventional methods to understand the process of regeneration from neoblasts and the developmental biology, neurobiology and immunology of Planaria, there are many thought-provoking questions related to stem cell plasticity and the uniqueness of the regenerative potential of Planarians among other members of the Phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair and the responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but their availability and analysis remain a challenging task. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, the application of algorithms in deciphering morphological changes produced by RNA interference (RNAi), and the interpretation of regeneration experiments constitute a new venture in Planaria research that is helping researchers across the globe understand the biology. We highlight the application of Hidden Markov models (HMMs) in the design of computational tools and their use in decoding the complex biology of Planaria.
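    The Hidden Markov models highlighted in this abstract can be illustrated with the standard forward recursion, which computes the likelihood of an observed sequence under the model. The sketch below uses a toy two-state HMM with invented parameters (not taken from any actual Planaria tool):

```python
# Forward algorithm for a toy 2-state HMM over a DNA-like alphabet.
# All parameters are invented for illustration.
states = range(2)
start = [0.6, 0.4]                    # initial state probabilities
trans = [[0.7, 0.3], [0.4, 0.6]]      # transition matrix
emit = [{"A": 0.5, "T": 0.5},         # state 0 emission probabilities
        {"A": 0.1, "T": 0.9}]         # state 1 emission probabilities

def likelihood(seq):
    """P(seq | model) via the forward recursion."""
    alpha = [start[s] * emit[s][seq[0]] for s in states]
    for symbol in seq[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in states) * emit[s][symbol]
                 for s in states]
    return sum(alpha)

print(likelihood("ATT"))
```

    Profile-HMM tools used in sequence analysis add insertion/deletion states and log-space arithmetic, but they are built on this same recursion.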

  11. The new Cloud Dynamics and Radiation Database algorithms for AMSR2 and GMI: exploitation of the GPM observational database for operational applications

    Science.gov (United States)

    Cinzia Marra, Anna; Casella, Daniele; Martins Costa do Amaral, Lia; Sanò, Paolo; Dietrich, Stefano; Panegrossi, Giulia

    2017-04-01

    Two new precipitation retrieval algorithms for the Advanced Microwave Scanning Radiometer 2 (AMSR2) and for the GPM Microwave Imager (GMI) are presented. The algorithms are based on the Cloud Dynamics and Radiation Database (CDRD) Bayesian approach and represent an evolution of the previous version applied to Special Sensor Microwave Imager/Sounder (SSMIS) observations and used operationally within the EUMETSAT Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF). The main innovation of these new products is the use of an extended, entirely empirical database derived from coincident radar and radiometer observations from the NASA/JAXA Global Precipitation Measurement Core Observatory (GPM-CO) (Dual-frequency Precipitation Radar (DPR) and GMI). The other new aspects are: 1) a new rain/no-rain screening approach; 2) the use of Empirical Orthogonal Functions (EOF) and Canonical Correlation Analysis (CCA) both in the screening approach and in the Bayesian algorithm; 3) the use of new meteorological and environmental ancillary variables to categorize the database and mitigate the problem of the non-uniqueness of the retrieval solution; and 4) the development and implementation of specific modules to minimize computational time. The CDRD algorithms for AMSR2 and GMI are able to handle the extremely large observational database available from the GPM-CO and provide rainfall estimates with minimum latency, making them suitable for near-real-time hydrological and operational applications. For CDRD applied to AMSR2, a verification study has been carried out over Italy using ground-based radar data and over the MSG full-disk area using coincident GPM-CO/AMSR2 observations. Results show remarkable AMSR2 capabilities for rainfall rate (RR) retrieval over ocean (for RR > 0.25 mm/h) and good capabilities over vegetated land (for RR > 1 mm/h), while for coastal areas the results are less certain. Comparisons with NASA GPM products, and with
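    A Bayesian database retrieval of the CDRD kind can be sketched in miniature: weight each database entry by how well its simulated brightness temperatures match the observation under a Gaussian error model, then take the posterior-mean rain rate. All numbers below are invented for illustration and stand in for a database of millions of entries:

```python
import math

# Toy retrieval database: each entry pairs a vector of simulated
# brightness temperatures (K, two channels here) with a rain rate (mm/h).
database = [
    ([250.0, 220.0], 0.5),
    ([240.0, 200.0], 2.0),
    ([230.0, 185.0], 6.0),
]
sigma = 5.0  # assumed observation-error standard deviation (K)

def retrieve(obs):
    """Posterior-mean rain rate under a Gaussian observation-error model."""
    weights = [math.exp(-sum((o - t) ** 2 for o, t in zip(obs, tb))
                        / (2 * sigma ** 2))
               for tb, _ in database]
    total = sum(weights)
    return sum(w * rr for w, (_, rr) in zip(weights, database)) / total

print(round(retrieve([240.0, 200.0]), 3))
```

    The ancillary-variable categorization described in the abstract corresponds, in this sketch, to restricting the candidate entries before weighting; EOF/CCA compression would reduce the dimensionality of the brightness-temperature vectors.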

  12. The Service Status and Development Strategy of the Mobile Application Service of Ancient Books Database

    Directory of Open Access Journals (Sweden)

    Yang Siluo

    2017-12-01

    [Purpose/significance] The mobile application of the ancient books database marks a change from the online version to a mobile one. At present, mobile applications of ancient books databases are in the initial stage of development, so it is necessary to investigate the current situation and provide suggestions for their development. [Method/process] This paper selected two kinds of ancient books databases, a WeChat platform and a mobile phone client, and analyzed their operation modes and main functions. [Result/conclusion] We conclude that mobile applications of ancient books databases have some defects, such as small-scale resources, limited content and data forms, and imperfect single-platform functionality, and that users pay inadequate attention to these issues. We then put forward corresponding suggestions, pointing out that in order to build mobile applications for ancient books databases it is necessary to improve platform construction, enrich the data forms and quantity, optimize the functions, and emphasize communication and interaction with the user.

  13. 6th International Conference on Computer Science and its Applications

    CERN Document Server

    Stojmenovic, Ivan; Jeong, Hwa; Yi, Gangman

    2015-01-01

    The 6th FTRA International Conference on Computer Science and its Applications (CSA-14) was held in Guam, USA, December 17-19, 2014. CSA-14 was a comprehensive conference focused on the various aspects of advances in engineering systems in computer science and applications, including ubiquitous computing, u-healthcare systems, Big Data, UI/UX for human-centric computing, computing services, bioinformatics and bio-inspired computing, and it showcased recent advances in various aspects of computing technology, ubiquitous computing services and their applications.

  14. Archives: Journal of Computer Science and Its Application

    African Journals Online (AJOL)

  15. Journal of Computer Science and Its Application: Site Map

    African Journals Online (AJOL)

  16. Journal of Computer Science and Its Application: About this journal

    African Journals Online (AJOL)

  17. Journal of Computer Science and Its Application: Journal Sponsorship

    African Journals Online (AJOL)

  18. Construction, database integration, and application of an Oenothera EST library.

    Science.gov (United States)

    Mrácek, Jaroslav; Greiner, Stephan; Cho, Won Kyong; Rauwolf, Uwe; Braun, Martha; Umate, Pavan; Altstätter, Johannes; Stoppel, Rhea; Mlcochová, Lada; Silber, Martina V; Volz, Stefanie M; White, Sarah; Selmeier, Renate; Rudd, Stephen; Herrmann, Reinhold G; Meurer, Jörg

    2006-09-01

    Coevolution of cellular genetic compartments is a fundamental aspect of eukaryotic genome evolution that becomes apparent in serious developmental disturbances after interspecific organelle exchanges. The genus Oenothera represents a unique resource, at present the only one available, to study the role of the compartmentalized plant genome in the diversification of populations and speciation processes. An integrated approach involving cDNA cloning, EST sequencing, and bioinformatic data mining was chosen using Oenothera elata with the genetic constitution of nuclear genome AA and plastome type I. The Gene Ontology system grouped 1621 unique gene products into 17 different functional categories. Application of arrays generated from a selected fraction of ESTs revealed significantly differing expression profiles among closely related Oenothera species possessing the potential to generate fertile and incompatible plastid/nuclear hybrids (hybrid bleaching). Furthermore, the EST library provides a valuable source of PCR-based polymorphic molecular markers that are instrumental for genotyping and molecular mapping approaches.

  19. 77 FR 66617 - HIT Policy and Standards Committees; Workgroup Application Database

    Science.gov (United States)

    2012-11-06

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy and Standards Committees; Workgroup Application... of New ONC HIT FACA Workgroup Application Database. The Office of the National Coordinator (ONC) has.... Name of Committees: HIT Standards Committee and HIT Policy Committee. General Function of the...

  20. Ontology to relational database transformation for web application development and maintenance

    Science.gov (United States)

    Mahmudi, Kamal; Inggriani Liem, M. M.; Akbar, Saiful

    2018-03-01

    Ontology is used for knowledge representation, while a database is used to record facts, in a KMS (Knowledge Management System). In most applications, data are managed in a database system and updated through the application, and then transformed into knowledge as needed. Once a domain conceptor defines the knowledge in the ontology, the application and database can be generated from the ontology. Most existing frameworks generate the application from its database. In this research, the ontology is used for generating the application. As the data are updated through the application, a mechanism is designed to trigger an update to the ontology so that the application can be rebuilt based on the newest ontology. With this approach, a knowledge engineer has full flexibility to renew the application based on the latest ontology without depending on a software developer. In many cases, the concept needs to be updated when the data change. The framework was built and tested in a Spring Java environment. A case study was conducted to prove the concept.
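    The ontology-to-relational transformation described here can be sketched minimally: each ontology class becomes a table and each datatype property a column. This is a toy mapping with invented names; real frameworks map OWL/RDFS constructs (object properties, subclassing, cardinality) far more carefully:

```python
import sqlite3

# Toy "ontology": class name -> {datatype property: SQL type}.
ontology = {
    "Employee": {"name": "TEXT", "salary": "REAL"},
    "Department": {"title": "TEXT"},
}

def to_ddl(onto):
    """Generate one CREATE TABLE statement per ontology class."""
    ddl = []
    for cls, props in onto.items():
        cols = ", ".join(f"{p} {t}" for p, t in props.items())
        ddl.append(f"CREATE TABLE {cls} (id INTEGER PRIMARY KEY, {cols})")
    return ddl

# Apply the generated schema to an in-memory database and list tables.
con = sqlite3.connect(":memory:")
for stmt in to_ddl(ontology):
    con.execute(stmt)

tables = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(sorted(tables))
```

    Regenerating the schema whenever the ontology changes is the sketch-level analogue of the rebuild trigger described in the abstract.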

  1. Medical applications: a database and characterization of apps in Apple iOS and Android platforms.

    Science.gov (United States)

    Seabrook, Heather J; Stromer, Julie N; Shevkenek, Cole; Bharwani, Aleem; de Grood, Jill; Ghali, William A

    2014-08-27

    Medical applications (apps) for smartphones and tablet computers are growing in number and are commonly used in healthcare. In this context, there is a need for a diverse community of app users, medical researchers, and app developers to better understand the app landscape. In mid-2012, we undertook an environmental scan and classification of the medical app landscape on the two dominant platforms by searching the medical category of the Apple iTunes and Google Play app download sites. We identified target audiences, functions, costs and content themes using app descriptions and captured these data in a database. We only included apps released or updated between October 1, 2011 and May 31, 2012, with a primary "medical" app store categorization, in English, that contained health or medical content. Our sample of Android apps was limited to the most popular apps in the medical category. Our final sample of Apple iOS (n = 4561) and Android (n = 293) apps illustrates a diverse medical app landscape. The proportion of Apple iOS apps for the public (35%) and for physicians (36%) is similar. Few Apple iOS apps specifically target nurses (3%). Within the Android apps, those targeting the public dominated in our sample (51%). The distribution of app functions is similar on both platforms, with reference being the most common function. Most app functions and content themes vary considerably by target audience. Social media apps are more common for patients and the public, while conference apps target physicians. We characterized existing medical apps and illustrated their diversity in terms of target audience, main functions, cost and healthcare topic. The resulting app database is a resource for app users, app developers and health informatics researchers.

  2. Decomposability queueing and computer system applications

    CERN Document Server

    Courtois, P J

    1977-01-01

    Decomposability: Queueing and Computer System Applications presents a set of powerful methods for systems analysis. This 10-chapter text covers the theory of nearly completely decomposable systems upon which specific analytic methods are based.The first chapters deal with some of the basic elements of a theory of nearly completely decomposable stochastic matrices, including the Simon-Ando theorems and the perturbation theory. The succeeding chapters are devoted to the analysis of stochastic queuing networks that appear as a type of key model. These chapters also discuss congestion problems in

  3. Application of protons to computer tomography

    International Nuclear Information System (INIS)

    Hanson, K.M.; Bradbury, J.N.; Cannon, T.M.; Hutson, R.L.; Laubacher, D.B.; Macek, R.; Paciotti, M.A.; Taylor, C.A.

    1977-01-01

    It was demonstrated that the application of protons to computed tomography can result in a significant dose advantage relative to x rays. Thus, at the same dose as is delivered by contemporary commercial x-ray scanners, a proton scanner could produce reconstructions with a factor of 2 or more improvement in density resolution. Whether such an improvement can result in significantly better diagnoses of human disease is an open question which can only be answered by the implementation of a proton scanner in a clinical situation

  4. Application of material databases for improved reliability of reactor pressure vessels

    International Nuclear Information System (INIS)

    Griesbach, T.J.; Server, W.L.; Beaudoin, B.F.; Burgos, B.N.

    1994-01-01

    A reactor vessel Life Cycle Management program must begin with an accurate characterization of the vessel material properties. Uncertainties in vessel material properties, or the use of bounding values, may result in unnecessary conservatisms in vessel integrity calculations. These conservatisms may be eliminated through a better understanding of the material properties in reactor vessels, in both the unirradiated and irradiated conditions. Reactor vessel material databases are available for quantifying the chemistry and Charpy shift behavior of individual heats of reactor vessel materials. Application of the databases to vessels with embrittlement concerns has proven to be an effective embrittlement management tool. This paper presents details of database development and applications that demonstrate the value of using material databases for improving material chemistry characterization and for maximizing the data from integrated material surveillance programs.

  5. MicroComputed Tomography: Methodology and Applications

    International Nuclear Information System (INIS)

    Stock, Stuart R.

    2009-01-01

    Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies of microcomputed tomography, or microCT, have increased exponentially. MicroComputed Tomography provides a complete introduction to the technology, describing how to use it effectively and understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop a working understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  6. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    Science.gov (United States)

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of biological information and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through more profound analyses of its characteristics, pathogenesis and other core issues. A cancer clinical database is important for promoting the development of precision medicine; therefore, close attention must be paid to its construction and management. The clinical database of the Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. To ensure good data quality, the design and management of the database follow a strict standard operating procedure (SOP) model. Data sharing is an important way to advance medical research in the era of medical big data, so the construction and management of clinical databases must be continually strengthened and improved.

  7. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories: those related to (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open-type combined database systems, which have been in progress generally as planned. Those for category (2) include the techniques for color document inputting and information retrieval, meaning compiling, understanding highly overlapping data, and managing data centered on drawings, which have been in progress generally as planned. Those for category (3) include the techniques for improving the resistance of the networks to failures and the security of the data in the networks, which have been in progress generally as planned. Those for category (4) include the techniques for rule processing for the development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives will finally be achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  9. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Prorocol (WAP) applications in medical information processing

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Dørup, Jens

    2001-01-01

    script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. RESULTS: A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. CONCLUSIONS: We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further

  10. Soft Computing Applications : Proceedings of the 5th International Workshop Soft Computing Applications

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária; Dombi, Joszef; Jain, Lakhmi

    2013-01-01

    This volume contains the Proceedings of the 5th International Workshop on Soft Computing Applications (SOFA 2012). The book covers a broad spectrum of soft computing techniques and theoretical and practical applications employing knowledge and intelligence to find solutions to industrial, economic and medical problems. The combination of such intelligent systems tools and a large number of applications introduces a need for a synergy of scientific and technological disciplines in order to show the great potential of Soft Computing in all domains. The conference papers included in these proceedings, published post conference, were grouped into the following areas of research: Soft Computing and Fusion Algorithms in Biometrics; Fuzzy Theory, Control and Applications; Modelling and Control Applications; Steps towa...

  11. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  12. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere, yet one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both theory and practice with step-by-step instructions and examples. It helps readers set up a cloud computing environment for teaching and learning database systems, and covers adequate conceptual content for students and IT professionals to gain the knowledge and hands-on skills needed to build cloud-based database systems.

  13. COMPARISON OF POPULAR BIOINFORMATICS DATABASES

    OpenAIRE

    Abdulganiyu Abdu Yusuf; Zahraddeen Sufyanu; Kabir Yusuf Mamman; Abubakar Umar Suleiman

    2016-01-01

    Bioinformatics is the application of computational tools to capture and interpret biological data. It has wide applications in drug development, crop improvement, agricultural biotechnology, and forensic DNA analysis. There are various databases available to researchers in bioinformatics. These databases are customized for specific needs and range in size, scope, and purpose. The main drawbacks of bioinformatics databases include redundant information, constant change, data spread over m...

  14. Improving the Computational Performance of Ontology-Based Classification Using Graph Databases

    Directory of Open Access Journals (Sweden)

    Thomas J. Lampoltshammer

    2015-07-01

    Full Text Available The increasing availability of very high-resolution remote sensing imagery (i.e., from satellites, airborne laser scanning, or aerial photography) represents both a blessing and a curse for researchers. The manual classification of these images, or of other similar geo-sensor data, is time-consuming and leads to subjective and non-deterministic results. For this reason, (semi-)automated classification approaches are in high demand in the affected research areas. Ontologies provide a proper means of automated classification for various kinds of sensor data, including remotely sensed data. However, the processing of data entities (so-called individuals) is one of the most cost-intensive computational operations within ontology reasoning. Therefore, an approach based on graph databases is proposed to overcome the high time consumption of the classification task. The introduced approach shifts the classification task from the classical Protégé environment and its common reasoners to the proposed graph-based approaches. For validation, the authors tested the approach on a simulation scenario based on a real-world example. The results demonstrate a promising improvement in classification speed: up to 80,000 times faster than the Protégé-based approach.
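
    The shift from reasoner-based classification to graph lookups can be illustrated with a toy sketch. The classes, properties, and values below are invented for illustration; the study itself compared Protégé reasoners against an actual graph database, not code like this.

    ```python
    # Individuals as nodes; property assertions as labelled edges
    # (a minimal stand-in for a graph database).
    edges = {
        ("parcel_1", "hasLandCover"): "vegetation",
        ("parcel_1", "hasHeight"): "low",
        ("parcel_2", "hasLandCover"): "sealed_surface",
    }

    # Each "class" is expressed as a set of required property/value pairs,
    # mirroring simple ontology restrictions.
    class_defs = {
        "Grassland": {("hasLandCover", "vegetation"), ("hasHeight", "low")},
        "Road":      {("hasLandCover", "sealed_surface")},
    }

    def classify(individual):
        """Return every class whose restrictions the individual satisfies."""
        return [
            name for name, required in class_defs.items()
            if all(edges.get((individual, prop)) == value
                   for prop, value in required)
        ]

    print(classify("parcel_1"))  # ['Grassland']
    print(classify("parcel_2"))  # ['Road']
    ```

    Checking edge membership like this is a constant-time lookup per restriction, which is the intuition behind the reported speed-up over general-purpose ontology reasoning.
    
    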

  15. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Bhateja, Vikrant; Udgata, Siba; Pattnaik, Prasant

    2017-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at International Conference on Frontiers of Intelligent Computing: Theory and applications (FICTA 2016) held at School of Computer Engineering, KIIT University, Bhubaneswar, India during 16 – 17 September 2016. The book presents theories, methodologies, new ideas, experiences and applications in all areas of intelligent computing and its applications to various engineering disciplines like computer science, electronics, electrical and mechanical engineering.

  16. SWEETLEAD: an in silico database of approved drugs, regulated chemicals, and herbal isolates for computer-aided drug discovery.

    Directory of Open Access Journals (Sweden)

    Paul A Novick

    Full Text Available In the face of drastically rising drug discovery costs, strategies promising to reduce development timelines and expenditures are being pursued. Computer-aided virtual screening and repurposing of approved drugs are two such strategies that have shown recent success. Herein, we report the creation of a highly curated in silico database of chemical structures representing approved drugs, chemical isolates from traditional medicinal herbs, and regulated chemicals, termed the SWEETLEAD database. The motivation for SWEETLEAD stems from the observation of conflicting information in publicly available chemical databases and the lack of a highly curated database of chemical structures for the globally approved drugs. A consensus-building scheme surveying information from several publicly accessible databases was employed to identify the correct structure for each chemical. The resulting structures were filtered for the active pharmaceutical ingredient and standardized, and differing formulations of the same drug were combined in the final database. The publicly available release of SWEETLEAD (https://simtk.org/home/sweetlead) provides an important tool to enable the successful completion of computer-aided repurposing and drug discovery campaigns.
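
    The consensus-building idea described above can be sketched as a majority vote over structure reports from several sources. The source names and the conflicting entry below are invented; SWEETLEAD's actual pipeline standardizes full chemical structures rather than comparing strings.

    ```python
    from collections import Counter

    def consensus_structure(records):
        """records: mapping source -> canonical structure string (e.g. an InChI).
        Returns (structure, support) for the most frequently reported form."""
        counts = Counter(records.values())
        structure, support = counts.most_common(1)[0]
        return structure, support

    # Hypothetical survey of three public databases for one drug (ethanol);
    # db_C holds a conflicting record.
    reports = {
        "db_A": "InChI=1S/C2H6O/c1-2-3/h3H,2H2,1H3",
        "db_B": "InChI=1S/C2H6O/c1-2-3/h3H,2H2,1H3",
        "db_C": "InChI=1S/C2H6O/c1-3-2/h1H3,2H3",
    }
    structure, support = consensus_structure(reports)
    print(support)  # 2 of 3 sources agree on the first structure
    ```

    In practice a curation pipeline would also weight sources by reliability and flag low-support chemicals for manual review rather than trusting a bare majority.
    
    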

  17. Converting analog interpretive data to digital formats for use in database and GIS applications

    Science.gov (United States)

    Flocks, James G.

    2004-01-01

    There is a growing need by researchers and managers for comprehensive and unified nationwide datasets of scientific data. These datasets must be in a digital format that is easily accessible using database and GIS applications, providing the user with access to a wide variety of current and historical information. Although most data currently being collected by scientists are already in a digital format, there is still a large repository of information in the literature and paper archive. Converting this information into a format accessible by computer applications is typically very difficult and can result in loss of data. However, since scientific data are commonly collected in a repetitious, concise manner (i.e., forms, tables, graphs, etc.), these data can be recovered digitally by using a conversion process that relates the position of an attribute in two-dimensional space to the information that the attribute signifies. For example, if a table contains a certain piece of information in a specific row and column, then the space that the row and column occupies becomes an index of that information. An index key is used to identify the relation between the physical location of the attribute and the information the attribute contains. The conversion process can be achieved rapidly, easily and inexpensively using widely available digitizing and spreadsheet software, and simple programming code. In the geological sciences, sedimentary character is commonly interpreted from geophysical profiles and descriptions of sediment cores. In the field and laboratory, these interpretations were typically transcribed to paper. The information from these paper archives is still relevant and increasingly important to scientists, engineers and managers to understand geologic processes affecting our environment. Direct scanning of this information produces a raster facsimile of the data, which allows it to be linked to the electronic world. But true integration of the content with ...
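
    The index-key idea described above can be sketched in a few lines: a digitized point's position in two-dimensional space is mapped back to the table cell it came from. The boundary coordinates and column names below are invented for illustration; the actual conversion would take them from the scanned form's layout.

    ```python
    # Hypothetical layout of a scanned data table: x-coordinates of the
    # column boundaries, y-coordinates of the row boundaries, and the
    # attribute each column holds.
    col_edges = [0, 100, 200, 300]
    row_edges = [0, 50, 100, 150]
    col_names = ["depth_m", "grain_size", "color"]

    def index_key(x, y):
        """Map a digitized (x, y) point to its (row, column-name) cell."""
        col = next(i for i in range(len(col_edges) - 1)
                   if col_edges[i] <= x < col_edges[i + 1])
        row = next(i for i in range(len(row_edges) - 1)
                   if row_edges[i] <= y < row_edges[i + 1])
        return row, col_names[col]

    print(index_key(150, 75))  # (1, 'grain_size')
    ```

    Once every digitized point carries a (row, attribute) key like this, the recovered values can be loaded straight into a database table or GIS attribute layer.
    
    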

  18. Parapsychology and the neurosciences: a computer-based content analysis of abstracts in the database "MEDLINE" from 1975 to 1995.

    Science.gov (United States)

    Fassbender, P

    1997-04-01

    A computer-based content analysis of 109 abstracts retrieved by the subject heading "parapsychology" from the database MEDLINE for the years 1975-1995 is presented. Data were analyzed using four categories of terms denoting (1) research methods, (2) neurosciences, (3) humanities/psychodynamics, and (4) parapsychology. Results indicated a growing interest in neuroscientific and neuropsychological explanations and theories.

  19. Computer Applications and Virtual Environments (CAVE)

    Science.gov (United States)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  20. Applications and issues in automotive computational aeroacoustics

    International Nuclear Information System (INIS)

    Karbon, K.J.; Kumarasamy, S.; Singh, R.

    2002-01-01

    Automotive aeroacoustics concerns the noise generated by the airflow around a moving vehicle. Previously regarded as a minor contributor, wind noise is now recognized as one of the dominant vehicle sound sources, since significant progress has been made in suppressing engine and tire noise. Currently, almost all aeroacoustic development work is performed experimentally on a full-scale vehicle in the wind tunnel. Any reduction in hardware models is recognized as a major enabler for bringing the vehicle to market quickly. In addition, prediction of noise sources and characteristics at the early stages of vehicle design will help reduce costly fixes at later stages. However, predictive methods such as Computational Fluid Dynamics (CFD) and Computational Aeroacoustics (CAA) are still under development and are not yet considered mainstream design tools. This paper presents some initial applications and findings of CFD and CAA analysis for vehicle aeroacoustics. Transient Reynolds-Averaged Navier-Stokes (RANS) and Lighthill-Curle methods are used to model low-frequency buffeting and high-frequency wind rush noise. Benefits and limitations of the approaches are described. (author)

  1. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    Science.gov (United States)

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from the literature. An essential feature of these databases is continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool that allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to improve and simplify the import of user-provided annotation data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain-specific databases for metabolic engineering.
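
    The synonym-resolution step described above can be sketched as a reverse lookup from any known name or alternate identifier to the database's internal identifier. All identifiers below are invented for illustration; CycTools itself resolves against real BioCyc frame IDs through the database's API.

    ```python
    # Hypothetical internal records: each internal ID carries a set of
    # synonyms and alternate identifiers.
    internal = {
        "GDQC-1234": {"names": {"zmm28", "GRMZM2G011357"}},
        "GDQC-5678": {"names": {"adh1"}},
    }

    # Build the reverse lookup once, lower-cased for robustness.
    lookup = {name.lower(): frame_id
              for frame_id, rec in internal.items()
              for name in rec["names"]}

    def resolve(identifier):
        """Return the internal ID for a synonym, or None if unknown."""
        return lookup.get(identifier.lower())

    print(resolve("ZMM28"))    # GDQC-1234
    print(resolve("unknown"))  # None
    ```

    An import pipeline would call a resolver like this on every incoming identifier and route unresolvable names to a curator instead of silently dropping them.
    
    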

  2. Professional iPhone and iPad Database Application Programming

    CERN Document Server

    Alessi, Patrick

    2010-01-01

    A much-needed resource on database development and enterprise integration for the iPhone. An enormous demand exists for getting iPhone applications into the enterprise, and this book guides you through all the necessary steps for integrating an iPhone app within an existing enterprise. Experienced iPhone developers will learn how to take advantage of the built-in capabilities of the iPhone to confidently implement a data-driven application for the iPhone: shows you how to integrate iPhone applications into enterprise-class systems; introduces development of data-driven applications on the iPho...

  3. Validation and application of a physics database for fast reactor fuel cycle analysis

    International Nuclear Information System (INIS)

    McKnight, R.D.; Stillman, J.A.; Toppel, B.J.; Khalil, H.S.

    1994-01-01

    An effort has been made to automate the execution of fast reactor fuel cycle analysis, using EBR-II as a demonstration vehicle, and to validate the analysis results for application to the IFR closed fuel cycle demonstration at EBR-II and its fuel cycle facility. This effort has included: (1) the application of the standard ANL depletion codes to perform core-follow analyses for an extensive series of EBR-II runs, (2) incorporation of the EBR-II data into a physics database, (3) development and verification of software to update, maintain and verify the database files, (4) development and validation of fuel cycle models and methodology, (5) development and verification of software which utilizes this physics database to automate the application of the ANL depletion codes, methods and models to perform the core-follow analysis, and (6) validation studies of the ANL depletion codes and of their application in support of anticipated near-term operations in EBR-II and the Fuel Cycle Facility. Results of the validation tests indicate the physics database and associated analysis codes and procedures are adequate to predict required quantities in support of early phases of FCF operations

  4. Monet: a next-generation database kernel for query-intensive applications

    NARCIS (Netherlands)

    P.A. Boncz (Peter)

    2002-01-01

    Monet is a database kernel targeted at query-intensive, heavy analysis applications (the opposite of transaction processing), which include OLAP and data mining, but also go beyond the business domain in GIS processing, multi-media retrieval and XML. The clean sheet approach of Monet

  5. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  6. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  7. Proceedings of the 11th China symposium on computer application in modern science and technology

    International Nuclear Information System (INIS)

    2003-01-01

    The 11th China symposium on computer application in modern science and technology was held by the China Electronics Society and the Nuclear Electronics and Nuclear Detecting Technology branch society of the China Nuclear Society on September 8th-12th, 2003, in Changdao, Shandong province. 77 articles are collected in the proceedings. The contents include computation and computational methods, software systems and software applications, data acquisition and control systems, databases and management information systems, general systems, network applications, and grid computing and its application systems.

  8. Development of a Database for Study Data in Registration Applications for Veterinary Medicinal Products

    Directory of Open Access Journals (Sweden)

    Anke Finnah

    2017-02-01

    Full Text Available Objective: In the present study, the feasibility of a systematic record of clinical study data from marketing authorisation applications for veterinary medicinal products (VMPs) and the benefits of the selected approach were investigated. Background: Drug registration dossiers for veterinary medicinal products contain extensive data from drug studies, which are not easily accessible to assessors. Evidentiary value: Fast access to these data, including specific search tools, could facilitate a meaningful use of the data and allow assessors to compare tests and studies from different dossiers. Methods: First, pivotal test parameters and their mutual relationships were identified. Second, a data model was developed and implemented in a relational database management system, including a data entry form and various reports for database searches. Compilation of study data in the database was demonstrated using all available clinical studies involving VMPs containing the anthelmintic drug Praziquantel. Possibilities of data evaluation, including graphical presentation, were shown by means of descriptive data analysis. The suitability of the database to support meta-analyses was tentatively validated. Results: The data model was designed to cover the specific requirements arising from study data. A total of 308 clinical studies related to 95 VMPs containing Praziquantel (single-agent and combination drugs) was selected for prototype testing. The relevant data extracted from these studies were appropriately structured and shown to be basically suitable for descriptive data analyses as well as for meta-analyses. Conclusion: The database-supported collection of study data would provide users with easy access to the continuously increasing pool of scientific information held by competent authorities and enables specific data analyses. The database design allows expanding the data model to all types of studies and classes of drugs registered in veterinary ...
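
    A relational layout in the spirit of the study database above can be sketched with an in-memory SQLite database. The table and column names, and the product name, are assumptions for illustration; the paper's actual data model is far richer than these four tables.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE product   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE substance (id INTEGER PRIMARY KEY, name TEXT);
    -- many-to-many: combination products contain several substances
    CREATE TABLE product_substance (
        product_id   INTEGER REFERENCES product(id),
        substance_id INTEGER REFERENCES substance(id));
    CREATE TABLE study (
        id         INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES product(id),
        study_type TEXT);
    """)
    con.execute("INSERT INTO substance VALUES (1, 'Praziquantel')")
    con.execute("INSERT INTO product VALUES (1, 'WormStop')")  # invented name
    con.execute("INSERT INTO product_substance VALUES (1, 1)")
    con.execute("INSERT INTO study VALUES (1, 1, 'clinical efficacy')")

    # Database search: all studies involving a given active substance.
    rows = con.execute("""
        SELECT s.id, s.study_type FROM study s
        JOIN product_substance ps ON ps.product_id = s.product_id
        JOIN substance sub ON sub.id = ps.substance_id
        WHERE sub.name = 'Praziquantel'""").fetchall()
    print(rows)  # [(1, 'clinical efficacy')]
    ```

    The join through the product-substance link table is what lets assessors gather studies across all 95 products containing one substance, the kind of cross-dossier search the abstract describes.
    
    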

  9. Institute for Computer Applications in Science and Engineering (ICASE)

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period April 1, 1983 through September 30, 1983 is summarized.

  10. Ontology and Cloud Computing in Various Applications: The ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 … to emphasize the importance of both ontology and cloud computing in various … of knowledge management applications and retrieve information using … above in terms of hard drive space, but any device ordinary computer …

  11. Manufacturing and application of micro computer for control

    International Nuclear Information System (INIS)

    Park, Seung Man; Heo, Gyeong; Yun, Jun Young

    1990-05-01

    This book deals with machine code and assembly programming for microcomputers. It is composed of 20 chapters, covering: microcomputer systems, practice with a storage cell, manufacturing of microcomputers (parts 1 and 2), manufacturing of the microcomputer AID-80A, writing machine language, interfaces such as the Z80-PIO and 8255A (PPI), counter and timer interfaces, exercises with basic commands, arithmetic operations, array operations, indicator control, music playing, detection of PIO input, control of PIO LEDs, PIO modes, CTC control by microcomputer, SIO control by microcomputer, and microcomputer applications.

  12. Implementation of Secondary Index on Cloud Computing NoSQL Database in Big Data Environment

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2015-01-01

    Full Text Available This paper introduces the combination of the NoSQL database HBase and the enterprise search platform Solr to provide a secondary-index function with fast queries. To verify the effectiveness and efficiency of the proposed approach, an assessment using the cost-performance ratio was carried out for several competitive benchmark databases and the proposed one. As a result, the proposed approach outperforms the other databases and fulfills the secondary-index function with fast queries in a NoSQL database. Moreover, according to the cross-sectional analysis, the proposed combination of HBase and Solr is capable of excellent query/response performance in a big data environment.
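
    The secondary-index pattern behind the HBase-plus-Solr combination can be sketched with plain dictionaries: the primary store (standing in for HBase) is keyed only by row key, so an external index (standing in for Solr) maps field values back to row keys for fast non-key queries. No real HBase or Solr client is used; the rows are invented.

    ```python
    primary = {}   # row_key -> record (the "HBase" side)
    index = {}     # (field, value) -> set of row keys (the "Solr" side)

    def put(row_key, record):
        """Write to the primary store and update the secondary index."""
        primary[row_key] = record
        for field, value in record.items():
            index.setdefault((field, value), set()).add(row_key)

    def query(field, value):
        """Secondary-index lookup, then fetch full rows from the primary store."""
        return [primary[k] for k in sorted(index.get((field, value), set()))]

    put("row1", {"city": "Taipei", "status": "open"})
    put("row2", {"city": "Kaohsiung", "status": "open"})
    print(query("status", "open"))   # both rows
    print(query("city", "Taipei"))   # just row1's record
    ```

    The design point is the same as in the paper: without the index, a query on a non-key field forces a full scan of the primary store; with it, the query touches only the matching row keys.
    
    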

  13. Migrating to the Cloud IT Application, Database, and Infrastructure Innovation and Consolidation

    CERN Document Server

    Laszewski, Tom

    2011-01-01

    Whether your company is planning on database migration, desktop application migration, or has IT infrastructure consolidation projects, this book gives you all the resources you'll need. It gives you recommendations on tools, strategy and best practices and serves as a guide as you plan, determine effort and budget, design, execute and roll your modern Oracle system out to production. Focusing on Oracle grid relational database technology and Oracle Fusion Middleware as the target cloud-based architecture, your company can gain organizational efficiency, agility, increase innovation and reduce

  14. Computer applications in water conservancy and hydropower engineering

    Energy Technology Data Exchange (ETDEWEB)

    Chen, J

    1984-09-20

    The use of computers in China's water conservancy and hydropower construction began in the 1960s, for exploration surveys, planning, design, construction, operation, and scientific research. Despite positive results and the formation of a 1000-person computing contingent, computer development across the different professions is unbalanced. The weaknesses and disparities in computer applications include an overall low level of application relative to the rest of the world, partly due to inadequate hardware and programs. The report suggests five ways to improve applications and popularize microcomputers, with an emphasis on leadership and planning.

  15. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  16. Elements of quantum computing history, theories and engineering applications

    CERN Document Server

    Akama, Seiki

    2015-01-01

    A quantum computer is a computer based on a computational model that uses quantum mechanics, the subfield of physics that studies phenomena at the micro level. Interest in quantum computing has been growing since the 1990s, and some quantum computers have recently been implemented at the experimental level. Quantum computers enable super-fast computation and can solve some important problems whose solutions were regarded as impossible or intractable on traditional computers. This book provides a quick introduction to quantum computing for readers who have no background in either the theory of computation or quantum mechanics. "Elements of Quantum Computing" presents the history, theories, and engineering applications of quantum computing. The book is suitable for computer scientists, physicists, and software engineers.

  17. The Development of a Web Based Database Applications of Procurement, Inventory, and Sales at PT. Interjaya Surya Megah

    OpenAIRE

    Huda, Choirul; Hudyanto, Chendra; Sisillia, Sisillia; Persada, Revin Kencana

    2011-01-01

    The objective of this research is to develop a web-based database application for procurement, inventory, and sales at PT. Interjaya Surya Megah. The current system at PT. Interjaya Surya Megah runs manually, so the company has difficulty in carrying out its activities. The methodology used in this research includes interviews, observation, literature review, conceptual database design, logical database design, and physical database design. The results are the establishment ...

  18. Teachers of Advertising Media Courses Describe Techniques, Show Computer Applications.

    Science.gov (United States)

    Lancaster, Kent M.; Martin, Thomas C.

    1989-01-01

    Reports on a survey of university advertising media teachers regarding textbooks and instructional aids used, teaching techniques, computer applications, student placement, instructor background, and faculty publishing. (SR)

  19. A Method to Ease the Deployment of Web Applications that Involve Database Systems A Method to Ease the Deployment of Web Applications that Involve Database Systems

    Directory of Open Access Journals (Sweden)

    Antonio Vega Corona

    2012-02-01

    Full Text Available The continuous growth of the Internet has driven people all around the globe to perform transactions on-line, search for information, or navigate using a browser. As more people feel comfortable using a Web browser, more software companies are trying to offer Web interfaces as an alternative way to provide access to their applications. The nature of the Web connection and the restrictions imposed by the available bandwidth make the successful integration of Web applications and database systems critical. Because popular database applications provide a user interface to edit and maintain the information in the database, and because each column in a database table maps to a graphic user interface control, the deployment of these applications can be time consuming; appropriate field validation and referential integrity rules must be observed. An object-oriented design is proposed to ease the development of applications that use database systems.
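
    The column-to-control mapping the abstract describes can be sketched by deriving form-control descriptions from a table schema, so the edit form is generated rather than hand-built. The schema and control types below are illustrative assumptions, not the paper's actual design.

    ```python
    # Hypothetical table schema, as it might be read from database metadata.
    schema = [
        {"name": "customer_id", "type": "INTEGER", "nullable": False},
        {"name": "name",        "type": "TEXT",    "nullable": False},
        {"name": "birth_date",  "type": "DATE",    "nullable": True},
    ]

    # Each column type maps to one kind of user-interface control.
    CONTROL_FOR_TYPE = {"INTEGER": "spinbox", "TEXT": "textbox", "DATE": "datepicker"}

    def build_form(columns):
        """Map each column to a (label, control, required) description."""
        return [(c["name"],
                 CONTROL_FOR_TYPE.get(c["type"], "textbox"),
                 not c["nullable"])
                for c in columns]

    for label, control, required in build_form(schema):
        print(label, control, "required" if required else "optional")
    ```

    Driving field validation from the same metadata (NOT NULL becomes "required", foreign keys become drop-down lists) is what removes most of the hand-written deployment work the abstract mentions.
    
    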

  20. Some malpractices in application of computed radiography

    International Nuclear Information System (INIS)

    Liu Ruihong; Jia Shaotian; Wang Yusheng; Li Baohua; Chen Lin; Wang Zhenguang; Liu Jianxin; Gong Jingyue; Liu Daoyong; Xie Xuesong

    2007-01-01

    Objective: To improve CR image quality and to promote the establishment of digital image standards by analyzing the common problems and malpractices in the application of computed radiography. Methods: The phenomena and causes of 107 CR junk films from nine three-'A' hospitals were analyzed, discussed, recorded, and statistically summarized by 20 radiologists, radiographers, and engineers. Results: Among the 107 junk films, there were 36 cases (33.64%) of incorrect operations, 29 cases (27.10%) of artifacts in reading and transferring the IP data, 15 cases (14.02%) of artifacts in the IP system, 13 cases (12.15%) of inappropriate radiographic parameters, 9 cases (8.41%) of printer failures, and 5 cases (4.67%) of inappropriate post-processing techniques. Analysis of the causes of the 107 junk films showed that 60.74% were due to lack of responsibility and incorrect operations, 35.51% were due to new problems in CR techniques, and the others were due to inappropriate post-processing techniques. Conclusion: Responsibility, operation regulations, digital image quality standards, study of new techniques, and appropriate use of post-processing techniques are the key points for improving CR image quality and the level of diagnosis. (authors)

  1. Computational materials design for energy applications

    Science.gov (United States)

    Ozolins, Vidvuds

    2013-03-01

    General adoption of sustainable energy technologies depends on the discovery and development of new high-performance materials. For instance, waste heat recovery and electricity generation via the solar thermal route require bulk thermoelectrics with a high figure of merit (ZT) and thermal stability at high temperatures. Energy recovery applications (e.g., regenerative braking) call for the development of rapidly chargeable systems for electrical energy storage, such as electrochemical supercapacitors. Similarly, use of hydrogen as vehicular fuel depends on the ability to store hydrogen at high volumetric and gravimetric densities, as well as on the ability to extract it at ambient temperatures at sufficiently rapid rates. We will discuss how first-principles computational methods based on quantum mechanics and statistical physics can drive the understanding, improvement and prediction of new energy materials. We will cover prediction and experimental verification of new earth-abundant thermoelectrics, transition metal oxides for electrochemical supercapacitors, and kinetics of mass transport in complex metal hydrides. Research has been supported by the US Department of Energy under grant Nos. DE-SC0001342, DE-SC0001054, DE-FG02-07ER46433, and DE-FC36-08GO18136.

  2. Security issues of cloud computing environment in possible military applications

    OpenAIRE

    Samčović, Andreja B.

    2013-01-01

    The evolution of cloud computing over the past few years is potentially one of major advances in the history of computing and telecommunications. Although there are many benefits of adopting cloud computing, there are also some significant barriers to adoption, security issues being the most important of them. This paper introduces the concept of cloud computing; looks at relevant technologies in cloud computing; takes into account cloud deployment models and some military applications. Addit...

  3. Parallel computing: numerics, applications, and trends

    National Research Council Canada - National Science Library

    Trobec, Roman; Vajteršic, Marián; Zinterhof, Peter

    2009-01-01

    ... and/or distributed systems. The contributions to this book are focused on topics most concerned in the trends of today's parallel computing. These range from parallel algorithmics, programming, tools, network computing to future parallel computing. Particular attention is paid to parallel numerics: linear algebra, differential equations, numerica...

  4. Application of computational intelligence to biology

    CERN Document Server

    Sekhar, Akula

    2016-01-01

    This book is a contribution of translational and allied research to the proceedings of the International Conference on Computational Intelligence and Soft Computing. It explains how various computational intelligence techniques can be applied to investigate various biological problems. It is a good read for Research Scholars, Engineers, Medical Doctors and Bioinformatics researchers.

  5. Symbolic initiative and its application to computers

    Energy Technology Data Exchange (ETDEWEB)

    Hellerman, L

    1982-01-01

    The author reviews the role of symbolic initiative in mathematics and then defines a sense in which computers compute mathematical functions. This allows a clarification of the semantics of computer and communication data. Turing's view of machine intelligence is examined in terms of its use of symbolic initiative. 12 references.

  6. GSTARS computer models and their applications, Part II: Applications

    Science.gov (United States)

    Simoes, F.J.M.; Yang, C.T.

    2008-01-01

    In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here in more detail. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  7. Development and application of basis database for materials life cycle assessment in china

    Science.gov (United States)

    Li, Xiaoqing; Gong, Xianzheng; Liu, Yu

    2017-03-01

    Since materials life cycle assessment (MLCA) is a data-intensive method, high quality environmental burden data are an important premise for carrying it out, and the reliability of the data directly influences the reliability of the assessment results and their application performance. Therefore, building a Chinese MLCA database provides the basic data and technical support needed for carrying out and improving LCA practice. Firstly, recent progress on databases related to materials life cycle assessment research and development is introduced. Secondly, in accordance with the requirements of the ISO 14040 series of standards, the database framework and main datasets for materials life cycle assessment are studied. Thirdly, an MLCA data platform based on big data is developed. Finally, future research work is proposed and discussed.

  8. The Application and Future of Big Database Studies in Cardiology: A Single-Center Experience.

    Science.gov (United States)

    Lee, Kuang-Tso; Hour, Ai-Ling; Shia, Ben-Chang; Chu, Pao-Hsien

    2017-11-01

    As medical research techniques and quality have improved, it has become apparent that cardiovascular problems could be better resolved by stricter experimental design. In fact, substantial time and resources must be expended to fulfill the requirements of high quality studies, and many worthy ideas and hypotheses could not be verified or proven due to ethical or economic limitations. In recent years, new and varied applications of databases have received increasing attention. Important information on certain issues, such as rare cardiovascular diseases, women's heart health, post-marketing analysis of different medications, or a combination of clinical and regional cardiac features, can be obtained by the use of rigorous statistical methods. However, limitations exist in all databases. One of the keys to creating and correctly addressing this research is a reliable process for analyzing and interpreting these cardiologic databases.

  9. The use of computational thermodynamics to predict properties of multicomponent materials for nuclear applications

    International Nuclear Information System (INIS)

    Sundman, B.; Gueneau, C.

    2013-01-01

    Computational Thermodynamics is based on physically realistic models that describe metallic and oxide crystalline phases, as well as the liquid and gas phases, in a consistent manner. The models are used to assess experimental and theoretical data for many different materials, and several thermodynamic databases have been developed for steels, ceramics and semiconductor materials as well as materials for nuclear applications. Within CEA, long-term work is ongoing to develop a database for the properties of nuclear fuels and structural materials. An overview of the modelling technique is given, together with several examples of applying the database to different problems, both traditional phase diagram calculations and the simulation of phase transformations. The following diagrams (Fig. 1, Fig. 2 and Fig. 3) show calculations in the U-Pu-O system. (authors)

  10. Grid Computing Application for Brain Magnetic Resonance Image Processing

    International Nuclear Information System (INIS)

    Valdivia, F; Crépeault, B; Duchesne, S

    2012-01-01

    This work emphasizes the use of grid computing and web technology for automatic post-processing of brain magnetic resonance images (MRI) in the context of neuropsychiatric (Alzheimer's disease) research. Post-acquisition image processing is achieved through the interconnection of several individual processes into pipelines. Each process has input and output data ports, options and execution parameters, and performs single tasks such as: a) extracting individual image attributes (e.g. dimensions, orientation, center of mass), b) performing image transformations (e.g. scaling, rotation, skewing, intensity standardization, linear and non-linear registration), c) performing image statistical analyses, and d) producing the necessary quality control images and/or files for user review. The pipelines are built to perform specific sequences of tasks on the alphanumeric data and MRIs contained in our database. The web application is coded in PHP and allows the creation of scripts to create, store and execute pipelines and their instances either on our local cluster or on high-performance computing platforms. To run an instance on an external cluster, the web application opens a communication tunnel through which it copies the necessary files, submits the execution commands and collects the results. We present result on system tests for the processing of a set of 821 brain MRIs from the Alzheimer's Disease Neuroimaging Initiative study via a nonlinear registration pipeline composed of 10 processes. Our results show successful execution on both local and external clusters, and a 4-fold increase in performance if using the external cluster. However, the latter's performance does not scale linearly as queue waiting times and execution overhead increase with the number of tasks to be executed.
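The pipeline idea described above (processes with input and output data ports, options and execution parameters, chained into sequences) can be sketched roughly as follows; the stage names and toy operations are hypothetical stand-ins, not the actual MRI processes:

```python
from typing import Any, Callable

class Process:
    """A single pipeline stage: a named task with options, mapping input to output."""
    def __init__(self, name: str, task: Callable[..., Any], **options):
        self.name, self.task, self.options = name, task, options

    def run(self, data):
        return self.task(data, **self.options)

def run_pipeline(stages, data):
    """Execute stages in order, feeding each stage's output into the next one's input port."""
    for stage in stages:
        data = stage.run(data)
    return data

# Hypothetical image-processing stages standing in for the real MRI operations
pipeline = [
    Process("scale", lambda img, factor: [x * factor for x in img], factor=2),
    Process("offset", lambda img, shift: [x + shift for x in img], shift=1),
]
result = run_pipeline(pipeline, [1, 2, 3])  # -> [3, 5, 7]
```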

  11. Practical clinical applications of the computer in nuclear medicine

    International Nuclear Information System (INIS)

    Price, R.R.; Erickson, J.J.; Patton, J.A.; Jones, J.P.; Lagan, J.E.; Rollo, F.D.

    1978-01-01

    The impact of the computer on the practice of nuclear medicine has been felt primarily in the area of rapid dynamic studies. At this time it is difficult to find a clinic which routinely performs computer processing of static images. The general purpose digital computer is a sophisticated and flexible instrument. The number of applications for which one can use the computer to augment data acquisition, analysis, or display is essentially unlimited. In this light, the purpose of this exhibit is not to describe all possible applications of the computer in nuclear medicine but rather to illustrate those applications which have generally been accepted as practical in the routine clinical environment. Specifically, we have chosen examples of computer-augmented cardiac and renal function studies as well as examples of relative organ blood flow studies. In addition, a short description of basic computer components and terminology, along with a few examples of non-imaging applications, is presented.

  12. CGPD: Cancer Genetics and Proteomics Database - A Dataset for Computational Analysis and Online Cancer Diagnostic Centre

    Directory of Open Access Journals (Sweden)

    Muhammad Rizwan Riaz

    2014-06-01

    Full Text Available Cancer Genetics and Proteomics Database (CGPD is a repository for genetics and proteomics data of Homo sapiens genes involved in cancer. Genes are categorized in the database by cancer type; to date, 72 genes across 13 cancer types are included. Primers, promoters and peptides of these genes are also made available. The primers provided for each gene, together with their features and reaction conditions, are useful in PCR amplification, especially in cloning experiments. CGPD also contains an Online Cancer Diagnostic Center (OCDC, as well as transcription and translation tools to support ongoing research work. The database is publicly available at http://www.cgpd.comyr.com.

  13. Data Linkage Graph: computation, querying and knowledge discovery of life science database networks

    Directory of Open Access Journals (Sweden)

    Lange Matthias

    2007-12-01

    Full Text Available To support the interpretation of measured molecular facts, such as gene expression experiments or EST sequencing, the functional or systems biology context has to be considered, and in doing so the relationship to existing biological knowledge has to be discovered. In general, biological knowledge is represented worldwide in a network of databases. In this paper we present a method for knowledge extraction in life science databases which spares scientists from screen-scraping and web-clicking approaches.

  14. A Representation-Theoretic Approach to Reversible Computation with Applications

    DEFF Research Database (Denmark)

    Maniotis, Andreas Milton

    Reversible computing is a sub-discipline of computer science that helps to understand the foundations of the interplay between physics, algebra, and logic in the context of computation. Its subjects of study are computational devices and abstract models of computation that satisfy the constraint of information conservation. Such machine models, which are known as reversible models of computation, have been examined both from a theoretical perspective and from an engineering perspective. While a bundle of many isolated successful findings and applications concerning reversible computing exists, there is still no uniform and consistent theory that is general in the sense of giving a model-independent account of the field.

  15. Coupling computer-interpretable guidelines with a drug-database through a web-based system – The PRESGUID project

    Directory of Open Access Journals (Sweden)

    Fieschi Marius

    2004-03-01

    Full Text Available Abstract Background Clinical Practice Guidelines (CPGs available today are not extensively used due to lack of proper integration into clinical settings, knowledge-related information resources, and lack of decision support at the point of care in a particular clinical context. Objective The PRESGUID project (PREScription and GUIDelines aims to improve the assistance provided by guidelines. The project proposes an online service enabling physicians to consult computerized CPGs linked to drug databases for easier integration into the healthcare process. Methods Computable CPGs are structured as decision trees and coded in XML format. Recommendations related to drug classes are tagged with ATC codes. We use a mapping module to enhance computerized guidelines coupling with a drug database, which contains detailed information about each usable specific medication. In this way, therapeutic recommendations are backed up with current and up-to-date information from the database. Results Two authoritative CPGs, originally diffused as static textual documents, have been implemented to validate the computerization process and to illustrate the usefulness of the resulting automated CPGs and their coupling with a drug database. We discuss the advantages of this approach for practitioners and the implications for both guideline developers and drug database providers. Other CPGs will be implemented and evaluated in real conditions by clinicians working in different health institutions.
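The coupling described above (recommendations tagged with ATC codes, resolved against a drug database) might be sketched as follows; the XML structure, ATC classes and drug lists below are invented for illustration and do not reflect the actual PRESGUID schema or any authoritative drug data:

```python
import xml.etree.ElementTree as ET

# A toy guideline fragment structured as in the abstract: recommendations tagged
# with ATC codes (hypothetical schema, not the real PRESGUID format).
GUIDELINE_XML = """
<guideline name="hypertension">
  <recommendation atc="C09AA">Consider an ACE inhibitor</recommendation>
  <recommendation atc="C07AB">Consider a beta blocker</recommendation>
</guideline>
"""

# A toy drug database keyed by ATC class (illustrative entries only)
DRUG_DB = {"C09AA": ["enalapril", "ramipril"], "C07AB": ["metoprolol", "atenolol"]}

def link_recommendations(xml_text, drug_db):
    """Map each ATC-tagged recommendation to the concrete drugs found in the database."""
    root = ET.fromstring(xml_text)
    return {rec.get("atc"): drug_db.get(rec.get("atc"), [])
            for rec in root.iter("recommendation")}

links = link_recommendations(GUIDELINE_XML, DRUG_DB)
```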

  16. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    The book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented, including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  17. The establish and application of equipment reliability database in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Zheng Wei; Li He

    2006-03-01

    Taking Daya Bay Nuclear Power Plant as a case study, the collection and handling of equipment reliability data, the calculation method for reliability parameters, and the establishment and application of a reliability database are discussed. The data sources include equipment design information, operation records, maintenance records and periodic test records. Built on a base of operating experience, the equipment reliability database provides a valid tool for thoroughly and objectively recording the operating history and present condition of the plant's equipment; it supports surveillance of equipment performance, especially for safety-related equipment, supplies practical information for enhancing the safety and availability management of the equipment and ensuring safe and economic operation of the plant, and provides the essential data for research and applications in safety management, reliability analysis, probabilistic safety assessment, reliability-centered maintenance and economic management of nuclear power plants. (authors)
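Reliability parameters of the kind calculated from such operation and maintenance records are commonly estimated with simple point formulas, e.g. a constant failure rate λ = failures / operating time and steady-state availability A = MTBF / (MTBF + MTTR). A sketch with hypothetical pump data (not taken from the plant's records):

```python
def failure_rate(n_failures: int, operating_hours: float) -> float:
    """Point estimate of a constant failure rate: lambda = failures / operating time."""
    return n_failures / operating_hours

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical pump: 2 failures over 10,000 operating hours, mean repair time 25 h
lam = failure_rate(2, 10_000.0)   # 2e-4 failures per hour
a = availability(1 / lam, 25.0)   # MTBF = 5000 h
```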

  18. Application of computational systems biology to explore environmental toxicity hazards

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Grandjean, Philippe

    2011-01-01

    Background: Computer-based modeling is part of a new approach to predictive toxicology. Objectives: We investigated the usefulness of an integrated computational systems biology approach in a case study involving the isomers and metabolites of the pesticide dichlorodiphenyltrichloroethane (DDT) to ascertain their possible links to relevant adverse effects. Methods: We extracted chemical-protein association networks for each DDT isomer and its metabolites using ChemProt, a disease chemical biology database that includes both binding and gene expression data, and we explored protein-protein interactions using a human interactome network. To identify associated dysfunctions and diseases, we integrated protein-disease annotations into the protein complexes using the Online Mendelian Inheritance in Man database and the Comparative Toxicogenomics Database. Results: We found 175 human proteins linked to p,p´-DDT...

  19. A parallel model for SQL astronomical databases based on solid state storage. Application to the Gaia Archive PostgreSQL database

    Science.gov (United States)

    González-Núñez, J.; Gutiérrez-Sánchez, R.; Salgado, J.; Segovia, J. C.; Merín, B.; Aguado-Agelet, F.

    2017-07-01

    Query planning and optimisation algorithms in most popular relational databases were developed at the times hard disk drives were the only storage technology available. The advent of higher parallel random access capacity devices, such as solid state disks, opens up the way for intra-machine parallel computing over large datasets. We describe a two phase parallel model for the implementation of heavy analytical processes in single instance PostgreSQL astronomical databases. This model is particularised to fulfil two frequent astronomical problems, density maps and crossmatch computation with Quad Tree Cube (Q3C) indexes. They are implemented as part of the relational databases infrastructure for the Gaia Archive and performance is assessed. Improvement of a factor 28.40 in comparison to sequential execution is observed in the reference implementation for a histogram computation. Speedup ratios of 3.7 and 4.0 are attained for the reference positional crossmatches considered. We observe large performance enhancements over sequential execution for both CPU and disk access intensive computations, suggesting these methods might be useful with the growing data volumes in Astronomy.
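The two-phase pattern described above (parallel computation of partial results, followed by a merge) can be illustrated for a density map; this is a loose in-memory analogy in Python, not the paper's single-instance PostgreSQL implementation, and the binning scheme is a simplification of real Q3C-style sky indexing:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def partial_density_map(positions, bin_size):
    """Phase 1: each worker bins its own chunk of (ra, dec) positions independently."""
    return Counter((int(ra // bin_size), int(dec // bin_size)) for ra, dec in positions)

def density_map(positions, bin_size=1.0, workers=4):
    """Phase 2: merge the per-worker partial histograms into one density map."""
    chunks = [positions[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_density_map, chunks, [bin_size] * workers)
    total = Counter()
    for p in partials:
        total.update(p)
    return total

points = [(0.2, 0.3), (0.7, 0.1), (1.5, 0.4)]
dmap = density_map(points)  # Counter({(0, 0): 2, (1, 0): 1})
```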

  20. Application of computational intelligence in emerging power systems

    African Journals Online (AJOL)

    ... in the electrical engineering applications. This paper highlights the application of computational intelligence methods in power system problems. Various types of CI methods, which are widely used in power system, are also discussed in the brief. Keywords: Power systems, computational intelligence, artificial intelligence.

  1. A Tabu Search Algorithm for application placement in computer clustering

    NARCIS (Netherlands)

    van der Gaast, Jelmer; Rietveld, Cornelieus A.; Gabor, Adriana; Zhang, Yingqian

    2014-01-01

    This paper presents and analyzes a model for the problem of placing applications on computer clusters (APP). In this problem, organizations requesting a set of software applications have to be assigned to computer clusters such that the costs of opening clusters and installing the necessary

  2. APPLICATIONS OF CLOUD COMPUTING SERVICES IN EDUCATION – CASE STUDY

    Directory of Open Access Journals (Sweden)

    Tomasz Cieplak

    2014-11-01

    Full Text Available Applications of Cloud Computing in enterprises are very wide-ranging. In opposition, educational applications of Cloud Computing in Poland are someway limited. On the other hand, young people use services of Cloud Computing frequently. Utilization of Facebook, Google or other services in Poland by young people is almost the same as in Western Europe or in the USA. Taking into account those considerations, few years ago authors have started process of popularization and usage of Cloud Computing educational services in their professional work. This article briefly summarizes authors’ experience with selected and most popular Cloud Computing services.

  3. High-Throughput Computational Screening of the Metal Organic Framework Database for CH4/H2 Separations.

    Science.gov (United States)

    Altintas, Cigdem; Erucar, Ilknur; Keskin, Seda

    2018-01-31

    Metal organic frameworks (MOFs) have been considered as one of the most exciting porous materials discovered in the last decade. Large surface areas, high pore volumes, and tailorable pore sizes make MOFs highly promising in a variety of applications, mainly in gas separations. The number of MOFs has been increasing very rapidly, and experimental identification of materials exhibiting high gas separation potential is simply impractical. High-throughput computational screening studies in which thousands of MOFs are evaluated to identify the best candidates for target gas separation are crucial in directing experimental efforts to the most useful materials. In this work, we used molecular simulations to screen the most complete and recent collection of MOFs from the Cambridge Structural Database to unlock their CH4/H2 separation performances. This is the first study in the literature which examines the potential of all existing MOFs for adsorption-based CH4/H2 separation. The MOFs (4,350 in total) were ranked based on several adsorbent evaluation metrics including selectivity, working capacity, adsorbent performance score, sorbent selection parameter, and regenerability. A large number of MOFs were identified to have extraordinarily large CH4/H2 selectivities compared to traditional adsorbents such as zeolites and activated carbons. We examined the relations between structural properties of MOFs such as pore sizes, porosities, and surface areas and their selectivities. Correlations between the heat of adsorption, adsorbility, metal type of MOFs, and selectivities were also studied. On the basis of these relations, a simple mathematical model that can predict the CH4/H2 selectivity of MOFs was suggested, which will be very useful in guiding the design and development of new MOFs with extraordinarily high CH4/H2 separation performances.
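The adsorbent evaluation metrics listed above are commonly defined as simple ratios and products in the screening literature; a sketch using the usual textbook definitions (the example mole fractions and uptakes are invented, and the abstract does not specify the exact formulas used):

```python
def selectivity(x_ch4, x_h2, y_ch4, y_h2):
    """Adsorption selectivity: adsorbed-phase mole-fraction ratio over bulk-phase ratio."""
    return (x_ch4 / x_h2) / (y_ch4 / y_h2)

def working_capacity(n_adsorption, n_desorption):
    """Uptake difference between adsorption and desorption conditions (mol/kg)."""
    return n_adsorption - n_desorption

def adsorbent_performance_score(sel, wc):
    """APS = selectivity x working capacity, one common ranking metric."""
    return sel * wc

# Hypothetical MOF: adsorbed mole fractions 0.9/0.1 from an equimolar bulk mixture,
# CH4 uptakes of 4.0 and 1.0 mol/kg at adsorption and desorption pressures
sel = selectivity(0.9, 0.1, 0.5, 0.5)
aps = adsorbent_performance_score(sel, working_capacity(4.0, 1.0))
```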

  4. TuBaFrost 5: multifunctional central database application for a European tumor bank.

    Science.gov (United States)

    Isabelle, M; Teodorovic, I; Morente, M M; Jaminé, D; Passioukov, A; Lejeune, S; Therasse, P; Dinjens, W N M; Oosterhuis, J W; Lam, K H; Oomen, M H A; Spatz, A; Ratcliffe, C; Knox, K; Mager, R; Kerr, D; Pezzella, F; van de Vijver, M; van Boven, H; Alonso, S; Kerjaschki, D; Pammer, J; Lopez-Guerrero, J A; Llombart Bosch, A; Carbone, A; Gloghini, A; van Veen, E-B; van Damme, B; Riegman, P H J

    2006-12-01

    Developing a tissue bank database has become more than just logically arranging data in tables combined with a search engine. Current demand for high quality samples and data, and the ever-changing legal and ethical regulations mean that the application must reflect TuBaFrost rules and protocols for the collection, exchange and use of tissue. To ensure continuation and extension of the TuBaFrost European tissue bank, the custodianship of the samples, and hence the decision over whether to issue samples to requestors, remains with the local collecting centre. The database application described in this article has been developed to facilitate this open structure virtual tissue bank model serving a large group. It encompasses many key tasks, without the requirement for personnel, hence minimising operational costs. The Internet-accessible database application enables search, selection and request submission for requestors, whereas collectors can upload and edit their collection. Communication between requestor and involved collectors is started with automatically generated e-mails.

  5. THE DEVELOPMENT OF A WEB BASED DATABASE APPLICATIONS OF PROCUREMENT, INVENTORY, AND SALES AT PT. INTERJAYA SURYA MEGAH

    Directory of Open Access Journals (Sweden)

    Choirul Huda

    2011-10-01

    Full Text Available The objective of this research is to develop a web-based database application for procurement, inventory and sales at PT. Interjaya Surya Megah. The current system at PT. Interjaya Surya Megah runs manually, so the company has difficulty carrying out its activities. The methodology used in this research includes interviews, observation, a literature review, conceptual database design, logical database design and physical database design. The result is the establishment of a web-based database application at PT. Interjaya Surya Megah. The conclusion is that the company can run its day-to-day activities more easily, because data processing and report generation become faster and more accurate, and data storage becomes more secure. Keywords: database; application; procurement; inventory; sales

  6. Application of computed radiography to ERCP

    International Nuclear Information System (INIS)

    Lee, Shigeki; Mochizuki, Fukuji; Fujita, Naotaka; Itoh, Shoichiro; Ikeda, Takashi; Toyohara, Tokiaki; Matsumoto, Kyoichi

    1984-01-01

    Computed radiography was applied to ERCP using the Fuji Computed Radiography System. The pancreatogram obtained by this method was compared with that of a conventional screen-film radiograph. Much finer changes in the pancreatogram can be delineated by the new method; the diagnostic ability of ERCP is thus enhanced by the introduction of FCR. (author)

  7. Exploring Natural Products from the Biodiversity of Pakistan for Computational Drug Discovery Studies: Collection, Optimization, Design and Development of A Chemical Database (ChemDP).

    Science.gov (United States)

    Mirza, Shaher Bano; Bokhari, Habib; Fatmi, Muhammad Qaiser

    2015-01-01

    Pakistan possesses a rich and vast source of natural products (NPs). Some of these secondary metabolites have been identified as potent therapeutic agents; however, the medicinal usage of most of these compounds has not yet been fully explored. Discovery of new NP scaffolds as inhibitors of certain enzymes or receptors using advanced computational drug discovery approaches is also limited by the unavailability of accurate 3D structures of NPs. An organized database incorporating all relevant information can therefore facilitate exploration of the medicinal importance of metabolites from Pakistani biodiversity. The Chemical Database of Pakistan (ChemDP; release 01) is a fully-referenced, evolving, web-based, virtual database which has been designed and developed to introduce natural products (NPs) and their derivatives from the biodiversity of Pakistan to the global scientific community. The prime aim is to provide quality structures of compounds, with relevant information, for computer-aided drug discovery studies. For this purpose, over 1000 NPs have been identified from more than 400 published articles, for which 2D and 3D molecular structures have been generated with a special focus on their stereochemistry, where applicable. The PM7 semiempirical quantum chemistry method has been used to energy-optimize the 3D structures of the NPs. The 2D and 3D structures can be downloaded as .sdf, .mol, .sybyl, .mol2, and .pdb files - formats readable by many chemoinformatics/bioinformatics software packages. Each entry in ChemDP contains over 100 data fields representing various molecular, biological, physico-chemical and pharmacological properties, which have been properly documented in the database for end users. These pieces of information have been either manually extracted from the literature or computationally calculated using various computational tools. Cross referencing to a major data repository, i.e. ChemSpider, has been made available for overlapping

  8. Requirements for a system to analyze HEP events using database computing

    International Nuclear Information System (INIS)

    May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Day, C.T.; Loken, S.; MacFarlane, J.F.; Baden, A.

    1992-01-01

    We describe the requirements for the design and prototyping of an object-oriented database to analyze data in high energy physics. Our goal is to satisfy the data processing and analysis needs of a generic high energy physics experiment to be proposed for the Superconducting SuperCollider (SSC), which requires the collection and analysis of between 10 and 100 million sets of vectors (events), each approximately one megabyte in length. We sketch how this analysis would proceed using an object-oriented database which supports the basic data types used in HEP.

  9. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2014-01-01

    This volume contains the papers presented at the Second International Conference on Frontiers in Intelligent Computing: Theory and Applications (FICTA-2013), held during 14-16 November 2013 and organized by Bhubaneswar Engineering College (BEC), Bhubaneswar, Odisha, India. It contains 63 papers focusing on the application of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization and teaching-learning based optimization, to various engineering applications such as data mining, fuzzy systems, machine intelligence and ANN, web technologies and multimedia applications, and intelligent computing and networking.

  10. An approach for access differentiation design in medical distributed applications built on databases.

    Science.gov (United States)

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application poses essential new problems for software. In particular, protection tools that are sufficient separately become deficient during integration, due to specific additional links and relationships not considered before. For example, it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools if the object is stored as a record in database tables. The solution should be sought within the more general application framework, but appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications that use databases. The corresponding formal model, as well as tools for mapping it to a DBMS, are suggested. Remote users connected via global networks are considered as well.

  11. Development of a North American paleoclimate pollen-based reconstruction database application

    Science.gov (United States)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts to synthesize paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; there is therefore a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, using the open-source R language and North American pollen databases (e.g. NAPD, NEOTOMA), in which this application can easily be used to perform new reconstructions and quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect reconstruction results, allowing users to spend more time analyzing and interpreting results instead of on data management and processing. Unique features of this R program include its two modules, each with a menu making the user feel at ease with the program; the ability to use different pollen sums; a choice among the 70 available climate variables; substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
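Linear interpolation, one of the features listed above, is used to place irregularly sampled reconstructions onto a common time grid; a minimal sketch (the ages and temperatures below are invented, and this is plain Python rather than the R application itself):

```python
def interpolate_series(times, values, target_times):
    """Linearly interpolate a time series onto a target grid (times sorted ascending).

    Target points outside the sampled range are silently skipped in this sketch.
    """
    out = []
    for t in target_times:
        # find the bracketing pair of samples and interpolate between them
        for (t0, v0), (t1, v1) in zip(zip(times, values), zip(times[1:], values[1:])):
            if t0 <= t <= t1:
                w = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
                out.append(v0 + w * (v1 - v0))
                break
    return out

# Hypothetical pollen-based July temperature estimates at irregular ages (cal yr BP)
ages = [0, 500, 1500]
temps = [15.0, 14.0, 16.0]
regular = interpolate_series(ages, temps, [0, 250, 1000])  # -> [15.0, 14.5, 15.0]
```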

  12. Development and application of computational aerothermodynamics flowfield computer codes

    Science.gov (United States)

    Venkatapathy, Ethiraj

    1993-01-01

    Computations are presented for one-dimensional, strong shock waves that are typical of those that form in front of a reentering spacecraft. The fluid mechanics and thermochemistry are modeled using two different approaches. The first employs traditional continuum techniques in solving the Navier-Stokes equations. The second approach employs a particle simulation technique (the direct simulation Monte Carlo method, DSMC). The thermochemical models employed in these two techniques are quite different. The present investigation evaluates thermochemical models for nitrogen under hypersonic flow conditions. Four separate cases are considered, governed respectively by vibrational relaxation, weak dissociation, strong dissociation, and weak ionization. In near-continuum, hypersonic flow, the nonequilibrium thermochemical models employed in continuum and particle simulations produce nearly identical solutions. Further, the two approaches are evaluated successfully against available experimental data for weakly and strongly dissociating flows.

  13. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff to use their time more productively by performing the mechanical acquisition, analysis, and storage of data. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs.

  14. PRIDE and "Database on Demand" as valuable tools for computational proteomics.

    Science.gov (United States)

    Vizcaíno, Juan Antonio; Reisinger, Florian; Côté, Richard; Martens, Lennart

    2011-01-01

    The Proteomics Identifications Database (PRIDE, http://www.ebi.ac.uk/pride ) provides users with the ability to explore and compare mass spectrometry-based proteomics experiments that reveal details of the protein expression found in a broad range of taxonomic groups, tissues, and disease states. A PRIDE experiment typically includes identifications of proteins, peptides, and protein modifications. Additionally, many of the submitted experiments also include the mass spectra that provide the evidence for these identifications. Finally, one of the strongest advantages of PRIDE in comparison with other proteomics repositories is the amount of metadata it contains, a key point to put the above-mentioned data in biological and/or technical context. Several informatics tools have been developed in support of the PRIDE database. The most recent one is called "Database on Demand" (DoD), which allows custom sequence databases to be built in order to optimize the results from search engines. We describe the use of DoD in this chapter. Additionally, in order to show the potential of PRIDE as a source for data mining, we also explore complex queries using federated BioMart queries to integrate PRIDE data with other resources, such as Ensembl, Reactome, or UniProt.
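    The "Database on Demand" idea of building a custom, targeted sequence database can be illustrated with a minimal sketch. This is not DoD's actual code; the FASTA records, accessions, and keyword below are invented for illustration:

```python
# Minimal sketch of the "custom sequence database" idea: filter a FASTA
# file down to entries whose header matches a keyword (e.g. a species
# name), so a search engine runs against a smaller, targeted database.
def filter_fasta(lines, keyword):
    """Yield FASTA lines belonging to records whose header contains keyword."""
    keep = False
    for line in lines:
        if line.startswith(">"):
            keep = keyword.lower() in line.lower()
        if keep:
            yield line

fasta = [
    ">sp|P00001|EXA_HUMAN Example protein OS=Homo sapiens",
    "MKWVTFISLLFLFSSAYS",
    ">sp|P00002|EXA_BOVIN Example protein OS=Bos taurus",
    "MKWVTFISLLLLFSSAYS",
]
subset = list(filter_fasta(fasta, "Homo sapiens"))
print(subset[0])
```

    The real DoD service works at the level of whole search databases and proteomics metadata, but the underlying motivation is the same: a smaller, better-matched sequence space improves search-engine results.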

  15. Integrated Optoelectronic Networks for Application-Driven Multicore Computing

    Science.gov (United States)

    2017-05-08

    AFRL-AFOSR-VA-TR-2017-0102: Integrated Optoelectronic Networks for Application-Driven Multicore Computing, Sudeep Pasricha, Colorado State University (grant FA9550-13-1-0110). "... and supportive materials with innovative architectural designs that integrate these components according to system-wide application needs."

  16. Deep Learning and Applications in Computational Biology

    KAUST Repository

    Zeng, Jianyang

    2016-01-01

    ...-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three ...

  17. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.

    2003-09-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)

  18. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.

    2004-01-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC

  19. Emerging Trends in Technology Education Computer Applications.

    Science.gov (United States)

    Hazari, Sunil I.

    1993-01-01

    Graphical User Interface (GUI)--and its variant, pen computing--is rapidly replacing older types of operating environments. Despite its heavier demand for processing power, GUI has many advantages. (SK)

  20. Near-term quantum computing for applications

    Data.gov (United States)

    National Aeronautics and Space Administration — From habitat automation to navigation and scheduling of tasks to networking, the challenges of modern space exploration are as much computational as they are...

  1. Computer applications in veterinary medicine | Hassan | Nigerian ...

    African Journals Online (AJOL)

    ... become essential tools in almost every field of research and applied technology. ... Computers in veterinary medicine have been used for veterinary education; ... agro-veterinary project design, monitoring and implementation; preparation of ...

  2. Cloud Computing Application for Romanian SMEs

    Directory of Open Access Journals (Sweden)

    Pistol, Luminiţa

    2017-09-01

    Full Text Available The article studies the current economic state of Romanian SMEs and the utility of cloud computing technologies in the process of sustainable open innovation. The study is based on a supply chain adapted for SMEs, on a model of innovation within a networked business environment, and on a decision tree for SMEs starting a new project. Building on these elements, a new framework of cloud computing economics can be developed.

  3. Statistical and thermal physics with computer applications

    CERN Document Server

    Gould, Harvey

    2010-01-01

    This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the

  4. Small Computer Applications for Base Supply.

    Science.gov (United States)

    1984-03-01

    ... research on small computer utilization at base-level organizations. This research effort studies whether small computers and commercial software can assist ...

  5. Investigation of Cloud Computing: Applications and Challenges

    OpenAIRE

    Amid Khatibi Bardsiri; Anis Vosoogh; Fatemeh Ahoojoosh

    2014-01-01

    Cloud computing is a model for saving data or knowledge on distant servers through the Internet. It can save the required memory space and reduce the cost of extending memory capacity on users' own machines; therefore, cloud computing has several benefits for individuals as well as organizations. It provides protection for personal and organizational data. Further, with the help of a cloud service, a business owner, organization manager or service provider will be able to make privacy an...

  6. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. A number of commercial computing devices currently available are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices can be susceptible to single-event effects. In this paper, we present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  7. 2nd International Conference on Intelligent Computing and Applications

    CERN Document Server

    Dash, Subhransu; Das, Swagatam; Panigrahi, Bijaya

    2017-01-01

    The Second International Conference on Intelligent Computing and Applications was an annual research conference that aimed to bring together researchers from around the world to exchange research results and address open issues in all aspects of intelligent computing and applications. The main objective of the second edition of the conference, for scientists, scholars, engineers and students from academia and industry, was to present ongoing research activities and hence to foster research relations between universities and industry. The theme of the conference unified the picture of contemporary intelligent computing techniques as an integral concept that highlights the trends in computational intelligence and bridges theoretical research concepts with applications. The conference covered vital issues ranging from intelligent computing, soft computing, and communication to machine learning, industrial automation, process technology and robotics. This conference also provided a variety of opportunities for ...

  8. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  9. Study on the application of mobile internet cloud computing platform

    Science.gov (United States)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology promotes the application of the cloud computing platform, which is in essence the substitution and exchange of a kind of resource service model that meets users' needs for different resources after changes and adjustments in multiple aspects. Cloud computing has advantages in many respects: it not merely reduces the difficulty of operating the system but also makes it easy for users to search, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in the operation process. The popularization of computer technology has driven people to create digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform, allowing users to access the necessary information resources at any time. Cloud computing, moreover, can distribute computations across a large number of distributed computers and hence implement the connected service of multiple computers. Digital libraries, as a typical representative of cloud computing applications, can therefore be used to analyze the key technologies of cloud computing.

  10. Mobile Computing: The Emerging Technology, Sensing, Challenges and Applications

    International Nuclear Information System (INIS)

    Bezboruah, T.

    2010-12-01

    Mobile computing is a computing system in which a computer and all necessary accessories, like files and software, are taken out into the field. It is a form of computing that allows the use of a computing device even while the user is mobile and therefore changing location; portability is one of its important aspects. Mobile phones are being used to gather scientific data from remote and isolated places that could not be retrieved by other means. Scientists are beginning to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we discuss in brief the mobile computing technology, its sensing, challenges and applications. (author)

  11. Membrane computing: brief introduction, recent results and applications.

    Science.gov (United States)

    Păun, Gheorghe; Pérez-Jiménez, Mario J

    2006-07-01

    The internal organization and functioning of living cells, as well as their cooperation in tissues and higher order structures, can be a rich source of inspiration for computer science, not fully exploited at the present date. Membrane computing is an answer to this challenge, well developed at the theoretical (mathematical and computability theory) level, already having several applications (via usual computers), but without having yet a bio-lab implementation. After briefly discussing some general issues related to natural computing, this paper provides an informal introduction to membrane computing, focused on the main ideas, the main classes of results and of applications. Then, three recent achievements, of three different types, are briefly presented, with emphasis on the usefulness of membrane computing as a framework for devising models of interest for biological and medical research.

  12. Applicability of thermodynamic database of radioactive elements developed for the Japanese performance assessment of HLW repository

    International Nuclear Information System (INIS)

    Yui, Mikazu; Shibata, Masahiro; Rai, Dhanpat; Ochs, Michael

    2003-01-01

    In 1999 Japan Nuclear Cycle Development Institute (JNC) published a second progress report (also known as H12 report) on high-level radioactive waste (HLW) disposal in Japan (JNC 1999). This report helped to develop confidence in the selected HLW disposal system and to establish the implementation body in 2000 for the disposal of HLW. JNC developed an in-house thermodynamic database for radioactive elements for performance analysis of the engineered barrier system (EBS) and the geosphere for H12 report. This paper briefly presents the status of the JNC's thermodynamic database and its applicability to perform realistic analyses of the solubilities of radioactive elements, evolution of solubility-limiting solid phases, predictions of the redox state of Pu in the neutral pH range under reducing conditions, and to estimate solubilities of radioactive elements in cementitious conditions. (author)

  13. Computational nanotechnology modeling and applications with MATLAB

    National Research Council Canada - National Science Library

    Musa, Sarhan M

    2012-01-01

    .... Offering thought-provoking perspective on the developments that are poised to revolutionize the field, the author explores both existing and future nanotechnology applications, which hold great...

  14. Intelligent computational systems for space applications

    Science.gov (United States)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as adaptive computational systems integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. These systems will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within their database and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions, such as Space Station Freedom, do not have the computational power to meet the challenges of the advanced automation and robotics systems envisioned for the year 2000 era. Research issues that must be addressed to achieve greater than gigaflop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  15. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF whose computational requirements go beyond those of a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  16. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO. This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF whose computational requirements go beyond those of a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.
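    The volunteer-computing pattern this work relies on can be sketched in a few lines: an embarrassingly parallel job is split into independent work units that clients fetch, compute locally, and return. The chunking scheme and the toy scoring function below are illustrative assumptions, not BINDSURF's actual pipeline:

```python
# Hedged sketch of the work-unit pattern behind volunteer computing.
# A large job (here, "scoring" ligands with a toy stand-in function)
# is split into independent chunks for volunteer clients.
def make_work_units(items, chunk_size):
    """Split a job into independent chunks for volunteer clients."""
    return [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]

def client_compute(unit):
    """What one volunteer runs locally; a toy stand-in for GPU docking."""
    return [(ligand, len(ligand) % 7) for ligand in unit]  # fake score

ligands = [f"ligand_{n}" for n in range(10)]
units = make_work_units(ligands, 4)          # 3 units of sizes 4, 4, 2
results = [r for unit in units for r in client_compute(unit)]
print(len(units), len(results))
```

    The pattern works precisely because the units share no state, which is why the abstract stresses that volunteer computing suits workloads where response time is not critical: unit completion times on heterogeneous volunteer hardware vary widely.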

  17. Cyclone: java-based querying and computing with Pathway/Genome databases.

    Science.gov (United States)

    Le Fèvre, François; Smidtas, Serge; Schächter, Vincent

    2007-05-15

    Cyclone aims at facilitating the use of BioCyc, a collection of Pathway/Genome Databases (PGDBs). Cyclone provides a fully extensible Java Object API to analyze and visualize these data. Cyclone can read and write PGDBs, and can write its own data in the CycloneML format. This format is automatically generated from the BioCyc ontology by Cyclone itself, ensuring continued compatibility. Cyclone objects can also be stored in a relational database CycloneDB. Queries can be written in SQL, and in an intuitive and concise object-oriented query language, Hibernate Query Language (HQL). In addition, Cyclone interfaces easily with Java software including the Eclipse IDE for HQL edition, the Jung API for graph algorithms or Cytoscape for graph visualization. Cyclone is freely available under an open source license at: http://sourceforge.net/projects/nemo-cyclone. For download and installation instructions, tutorials, use cases and examples, see http://nemo-cyclone.sourceforge.net.
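    The pattern CycloneDB illustrates, pathway objects persisted in a relational store and retrieved with queries, can be sketched as below. Cyclone itself is Java queried with SQL or HQL, so this Python/SQLite stand-in and its one-table schema are purely illustrative:

```python
# Hedged sketch of storing pathway records relationally and querying
# them, in the spirit of CycloneDB (schema and data are invented).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reaction (id TEXT, pathway TEXT, ec TEXT)")
conn.executemany(
    "INSERT INTO reaction VALUES (?, ?, ?)",
    [("RXN-1", "glycolysis", "2.7.1.1"),
     ("RXN-2", "glycolysis", "5.3.1.9"),
     ("RXN-3", "TCA cycle", "4.2.1.3")],
)
# Parameterized query: all reactions belonging to one pathway.
rows = conn.execute(
    "SELECT id, ec FROM reaction WHERE pathway = ?", ("glycolysis",)
).fetchall()
print(rows)
```

    HQL adds an object-oriented layer over such SQL, letting queries be phrased against mapped Java objects rather than raw tables; the relational storage underneath is the same idea.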

  18. A computer network system for mutual usage four databases of nuclear materials (Data-Free-Way)

    International Nuclear Information System (INIS)

    Fujita, M.; Kurihara, Y.; Shindou, M.; Yokoyama, N.; Tachi, Y.; Kano, S.; Iwata, S.

    1996-01-01

    A distributed database system named 'Data-Free-Way' for advanced nuclear materials has been developed by the National Research Institute for Metals (NRIM), the Japan Atomic Energy Research Institute (JAERI) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) under a cooperation agreement between these three organizations. In the paper, features and functions of the system, including input data, are described, together with the method of sharing the database among the three organizations and examples of easily accessible searches of material properties. Results of an analysis of tensile and creep property data on type 316 stainless steel, collected by the different organizations and stored in the present system, are also introduced as an example of attractive utilization of the system. Moreover, with the near-term evolution of the system in mind, trials of WWW servers at several 'Data-Free-Way' sites to supply information on nuclear materials to the Internet are introduced. (author)

  19. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    Science.gov (United States)

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  20. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of systems based on computational intelligence is to process and interpret data of various natures, so CI is strictly connected with the increase of available data as well as the capabilities for processing them, mutually supportive factors. Developed theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, request systems in banking, diagnostic systems, expert systems and many other practical implementations. This book consists of 16 contributed chapters by subject experts who are specialized in the various topics addressed in this book. The special chapters have been brought ...

  1. Computational Phase Imaging for Biomedical Applications

    Science.gov (United States)

    Nguyen, Tan Huu

    laser comes at the expense of speckles, which degrade image quality. Therefore, solutions based purely on physical modeling and computation to remove these artifacts, using white-light illumination, are highly desirable. Here, using physical optics, we develop a theoretical model that accurately explains the effects of partial coherence on image information and phase information. The model is further combined with numerical processing to suppress the artifacts and recover the correct phase information. The third topic is devoted to applying QPI to clinical applications. Traditionally, stained tissues are used in prostate cancer diagnosis, because tissue samples used in diagnosis are nearly transparent under bright-field inspection if unstained. Contrast-enhanced microscopy techniques, e.g., phase contrast microscopy (PC) and differential interference contrast microscopy (DIC), can render the untagged samples visible with high throughput. However, since these methods are intensity-based, the contrast of acquired images varies significantly from one imaging facility to another, preventing them from being used in diagnosis. Inheriting the merits of PC, SLIM produces phase maps, which measure the refractive index of label-free samples. The maps measured by SLIM are not affected by variation in imaging conditions, e.g., illumination, magnification, etc., allowing consistent imaging results when using SLIM across different clinical institutions. Here, we combine SLIM images with machine learning for automatic diagnosis of prostate cancer. We focus on two diagnosis problems: automatic Gleason grading and cancer vs. non-cancer diagnosis. Finally, we introduce a new imaging modality, named Gradient Light Interference Microscopy (GLIM), which is able to image through optically thick samples using low spatial coherence illumination. The key benefit of GLIM comes from the large numerical aperture of the condenser, which is 0.55 NA

  2. Wind power systems. Applications of computational intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lingfeng [Toledo Univ., OH (United States). Dept. of Electrical Engineering and Computer Science; Singh, Chanan [Texas A and M Univ., College Station, TX (United States). Electrical and Computer Engineering Dept.; Kusiak, Andrew (eds.) [Iowa Univ., Iowa City, IA (United States). Mechanical and Industrial Engineering Dept.

    2010-07-01

    Renewable energy sources such as wind power have attracted much attention because they are environmentally friendly, do not produce carbon dioxide or other emissions, and can enhance a nation's energy security. For example, increasingly significant amounts of wind power are being integrated into conventional power grids. Therefore, it is necessary to address various important and challenging issues related to wind power systems, which are significantly different from traditional generation systems. This book is a resource for engineers, practitioners, and decision-makers interested in studying or using the power of computational-intelligence-based algorithms to handle various important problems in wind power systems at the levels of power generation, transmission, and distribution. Researchers have been developing biologically inspired algorithms in a wide variety of complex large-scale engineering domains. Distinguished from traditional analytical methods, the new methods usually accomplish the task through computationally efficient mechanisms. Computational intelligence methods such as evolutionary computation, neural networks, and fuzzy systems have attracted much attention in electric power systems. Meanwhile, modern electric power systems are becoming more and more complex in order to meet the growing electricity market. In particular, grid complexity is continuously increased by the integration of intermittent wind power as well as the current restructuring efforts in the electricity industry. Quite often, the traditional analytical methods become less efficient or even unable to handle this increased complexity. As a result, it is natural to apply computational intelligence as a powerful tool to deal with various important and pressing problems in current wind power systems. This book presents the state-of-the-art development in the field of computational intelligence applied to wind power systems by reviewing the most up

  3. Managing Associated Risks in Cloud Computer Applications ...

    African Journals Online (AJOL)

    The Java programming language and Google App Engine were the tools used to develop and deploy the application. The work demonstrates the benefits of deploying applications using a cloud service over on-premise deployment, especially where real-time data is needed, such as reporting incidents during elections. Keywords: ...

  4. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2013-01-01

    The volume contains the papers presented at FICTA 2012: International Conference on Frontiers in Intelligent Computing: Theory and Applications, held on December 22-23, 2012 at Bhubaneswar Engineering College, Bhubaneswar, Odisha, India. It contains 86 papers contributed by authors from around the globe. These research papers mainly focus on applications of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization, and teaching-learning-based optimization, to various engineering applications such as data mining, image processing, cloud computing, and networking.

  5. The power of an ontology-driven developmental toxicity database for data mining and computational modeling

    Science.gov (United States)

    Modeling of developmental toxicology presents a significant challenge to computational toxicology due to endpoint complexity and lack of data coverage. These challenges largely account for the relatively few modeling successes using the structure–activity relationship (SAR) parad...

  6. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource; indeed, it guides all business decisions in most computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This is especially true in cloud computing environments, where data owners cannot control fundamental aspects of their data, such as its physical storage and the control of its access. Blockchain has recently emerged as a fascinati...
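    The tamper-evidence property described in this record can be illustrated with a minimal hash-chain sketch (plain Python, not the paper's actual database design): each block commits to its predecessor's digest, so altering any stored record invalidates every later digest.

```python
import hashlib

def chain(records):
    """Link records into a hash chain; each block stores (data, digest)."""
    blocks, prev = [], "0" * 64
    for data in records:
        digest = hashlib.sha256((prev + data).encode()).hexdigest()
        blocks.append((data, digest))
        prev = digest
    return blocks

def verify(blocks):
    """Recompute every digest; any tampered record breaks the chain."""
    prev = "0" * 64
    for data, digest in blocks:
        if hashlib.sha256((prev + data).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True

ledger = chain(["order=42", "payment=ok", "shipped=yes"])
print(verify(ledger))                       # True
ledger[1] = ("payment=VOID", ledger[1][1])  # tamper with stored data
print(verify(ledger))                       # False
```

    A blockchain-based database adds distributed consensus on top of this structure; the hash chain alone makes tampering detectable, not preventable.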

  7. Proceedings of the 2011 2nd International Congress on Computer Applications and Computational Science

    CERN Document Server

    Nguyen, Quang

    2012-01-01

    The latest inventions in computer technology influence most human daily activities. In the near future, there is a tendency that all aspects of human life will depend on computer applications. In manufacturing, robotics and automation have become vital for high-quality products. In education, the model of teaching and learning is focusing more on electronic media than traditional ones. Issues related to energy savings and the environment are becoming critical. Computational Science should enhance the quality of human life, not only solve their problems. Computational Science should help humans make wise decisions by presenting choices and their possible consequences. Computational Science should help us make sense of observations, understand natural language, and plan and reason with extensive background knowledge. Intelligence with wisdom is perhaps an ultimate goal for human-oriented science. This book is a compilation of some recent research findings in computer application and computational sci...

  8. Computational structural biology: methods and applications

    National Research Council Canada - National Science Library

    Schwede, Torsten; Peitsch, Manuel Claude

    2008-01-01

    ... sequencing reinforced the observation that structural information is needed to understand the detailed function and mechanism of biological molecules such as enzyme reactions and molecular recognition events. Furthermore, structures are obviously key to the design of molecules with new or improved functions. In this context, computational structural biology...

  9. Geometric Series and Computers--An Application.

    Science.gov (United States)

    McNerney, Charles R.

    1983-01-01

    This article considers the sum of a finite geometric series as applied to numeric data storage in the memory of an electronic digital computer. The presentation is viewed as relevant to programing in several languages and removes some of the mystique associated with syntax constraints that any language imposes. (MP)
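    The storage connection in this record can be made concrete: with ratio 2, the finite geometric series sums the place values of a binary word, so an n-bit register holds unsigned integers up to 2^n - 1. A small illustrative sketch, not taken from the article:

```python
def geometric_sum(r, n):
    """Sum of the finite geometric series r**0 + r**1 + ... + r**(n - 1),
    via the closed form (r**n - 1) / (r - 1)."""
    return (r**n - 1) // (r - 1)

# With ratio 2 the series adds up all bit place values, so the largest
# unsigned integer an 8-bit byte can store is 2**8 - 1 = 255.
print(geometric_sum(2, 8))    # 255
print(geometric_sum(10, 4))   # 1111 (= 1 + 10 + 100 + 1000)
```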

  10. Computer Application Systems at the University.

    Science.gov (United States)

    Bazewicz, Mieczyslaw

    1979-01-01

    The results of the WASC Project at the Technical University of Wroclaw have confirmed the possibility of constructing informatic systems based on the recognized size and specifics of user's needs (needs of the university) and provided some solutions to the problem of collaboration of computer systems at remote universities. (Author/CMV)

  11. Engineering applications of computational fluid dynamics

    CERN Document Server

    Awang, Mokhtar

    2015-01-01

    This volume presents the results of Computational Fluid Dynamics (CFD) analysis that can be used for conceptual studies of product design, detail product development, process troubleshooting. It demonstrates the benefit of CFD modeling as a cost saving, timely, safe and easy to scale-up methodology.

  12. Computer Applications in Balancing Chemical Equations.

    Science.gov (United States)

    Kumar, David D.

    2001-01-01

    Discusses computer-based approaches to balancing chemical equations. Surveys 13 methods, 6 based on matrix, 2 interactive programs, 1 stand-alone system, 1 developed in algorithm in Basic, 1 based on design engineering, 1 written in HyperCard, and 1 prepared for the World Wide Web. (Contains 17 references.) (Author/YDS)
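    The matrix-based approaches surveyed in this record share one idea: write each element's atom counts per species as a row (negating product columns) and find an integer null-space vector. A minimal sketch of that idea, assuming the null space is one-dimensional (the common case for simple reactions); it is not taken from any of the 13 surveyed methods:

```python
from fractions import Fraction
from math import lcm

def balance(matrix):
    """Return one integer null-space vector of an element-by-species matrix.
    Product species enter with negated composition columns.
    Assumes the null space is one-dimensional."""
    rows = [[Fraction(x) for x in row] for row in matrix]
    ncols = len(rows[0])
    pivots, r = {}, 0
    for c in range(ncols):  # reduce to reduced row echelon form
        piv = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]
        for i in range(len(rows)):
            if i != r and rows[i][c]:
                f = rows[i][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        pivots[c] = r
        r += 1
    free = next(c for c in range(ncols) if c not in pivots)
    sol = [Fraction(0)] * ncols
    sol[free] = Fraction(1)
    for c, row in pivots.items():
        sol[c] = -rows[row][free]
    scale = lcm(*(x.denominator for x in sol))  # clear denominators
    return [int(x * scale) for x in sol]

# H2 + O2 -> H2O: rows are H and O atom counts; the product column is negated.
print(balance([[2, 0, -2],
               [0, 2, -1]]))   # [2, 1, 2]  => 2 H2 + O2 -> 2 H2O
```

    The same call balances longer reactions, e.g. CH4 + 2 O2 -> CO2 + 2 H2O from its carbon, hydrogen, and oxygen rows.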

  13. Cloud computing applications for biomedical science: A perspective.

    Science.gov (United States)

    Navale, Vivek; Bourne, Philip E

    2018-06-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

  14. 6th International Conference on Practical Applications of Computational Biology & Bioinformatics

    CERN Document Server

    Luscombe, Nicholas; Fdez-Riverola, Florentino; Rodríguez, Juan; Practical Applications of Computational Biology & Bioinformatics

    2012-01-01

    The growth in the Bioinformatics and Computational Biology fields over the last few years has been remarkable. The analysis of the datasets of Next Generation Sequencing needs new algorithms and approaches from fields such as Databases, Statistics, Data Mining, Machine Learning, Optimization, Computer Science and Artificial Intelligence. Systems Biology has also been emerging as an alternative to the reductionist view that dominated biological research in recent decades. This book presents the results of the 6th International Conference on Practical Applications of Computational Biology & Bioinformatics, held at the University of Salamanca, Spain, 28-30 March 2012, which brought together interdisciplinary scientists who have a strong background in the biological and computational sciences.

  15. Guide to cloud computing for business and technology managers from distributed computing to cloudware applications

    CERN Document Server

    Kale, Vivek

    2014-01-01

    Guide to Cloud Computing for Business and Technology Managers: From Distributed Computing to Cloudware Applications unravels the mystery of cloud computing and explains how it can transform the operating contexts of business enterprises. It provides a clear understanding of what cloud computing really means, what it can do, and when it is practical to use. Addressing the primary management and operation concerns of cloudware, including performance, measurement, monitoring, and security, this pragmatic book: Introduces the enterprise applications integration (EAI) solutions that were a first ste

  16. Digital Rights Enforcement for Pervasive Computing Applications

    OpenAIRE

    Dahlem, Dominik

    2005-01-01

    Increasingly, application software is expanding from the desktop into mobile application environments, such as handset devices and embedded systems which are more limited in resources and volatile in their network connectivity. An integrated architecture that can protect intellectual property for both types of environments should offer the promise of reduced software maintenance costs. Software licensing is an existing mechanism by which specific license agreements are enforced be...

  17. Scalable Computational Chemistry: New Developments and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri [Iowa State Univ., Ames, IA (United States)

    2002-01-01

    The computational part of the thesis is the investigation of titanium(II) chloride as a potential catalyst for the bis-silylation reaction of ethylene with hexachlorodisilane at different levels of theory. Bis-silylation is an important reaction for producing bis(silyl) compounds and new C-Si bonds, which can serve as monomers for silicon-containing polymers and silicon carbides. Ab initio calculations on the steps involved in a proposed mechanism are presented. This choice of reactants allows the reaction to be studied at reliable levels of theory without compromising accuracy. The calculations indicate that this is a highly exothermic barrierless reaction. The TiCl2 catalyst removes a 50 kcal/mol activation energy barrier required for the reaction without the catalyst. The first step is interaction of TiCl2 with ethylene to form an intermediate that is 60 kcal/mol below the energy of the reactants. This is the driving force for the entire reaction. Dynamic correlation plays a significant role because RHF calculations indicate that the net barrier for the catalyzed reaction is 50 kcal/mol. They conclude that divalent Ti has the potential to become an important industrial catalyst for silylation reactions. In the programming part of the thesis, parallelization of different quantum chemistry methods is presented. The parallelization of code is becoming an important aspect of quantum chemistry code development. Two trends contribute to it: the overall desire to study large chemical systems and the desire to employ highly correlated methods, which are usually computationally expensive and memory intensive. In the presented distributed-data algorithms, computation is parallelized and the largest arrays are evenly distributed among CPUs. First, the parallelization of the Hartree-Fock self-consistent field (SCF) method is considered. The SCF method is the most common starting point for more accurate calculations. The Fock build (a sub-step of SCF) from AO integrals is

  18. Evaluation of Computational Docking to Identify Pregnane X Receptor Agonists in the ToxCast Database

    OpenAIRE

    Kortagere, Sandhya; Krasowski, Matthew D.; Reschly, Erica J.; Venkatesh, Madhukumar; Mani, Sridhar; Ekins, Sean

    2010-01-01

    Background: The pregnane X receptor (PXR) is a key transcriptional regulator of many genes [e.g., cytochrome P450s (CYP2C9, CYP3A4, CYP2B6), MDR1] involved in xenobiotic metabolism and excretion. Objectives: As part of an evaluation of different approaches to predict compound affinity for nuclear hormone receptors, we used the molecular docking program GOLD and a hybrid scoring scheme based on similarity-weighted GoldScores to predict potential PXR agonists in the ToxCast database of pesticides...

  19. The Emdros Text Database Engine as a Platform for Persuasive Computing

    DEFF Research Database (Denmark)

    Sandborg-Petersen, Ulrik

    2013-01-01

    This paper describes the nature and scope of Emdros, a text database engine for annotated text. Three case studies of persuasive learning systems using Emdros as an important architectural component are described, and their status as to participation in the three legs of BJ Fogg's Functional Triad ... of Persuasive Design is assessed. Various properties of Emdros are discussed, both with respect to competing systems, and with respect to the three case studies. It is argued that these properties together enable Emdros to form part of the foundation for a large class of systems whose primary function involves

  20. Applications of Protein Thermodynamic Database for Understanding Protein Mutant Stability and Designing Stable Mutants.

    Science.gov (United States)

    Gromiha, M Michael; Anoosha, P; Huang, Liang-Tsung

    2016-01-01

    Protein stability is the free energy difference between the unfolded and folded states of a protein, which lies in the range of 5-25 kcal/mol. Experimentally, protein stability is measured with circular dichroism, differential scanning calorimetry, and fluorescence spectroscopy using thermal and denaturant denaturation methods. These experimental data have been accumulated in the form of a database, ProTherm, a thermodynamic database for proteins and mutants. It also contains sequence and structure information of a protein, experimental methods and conditions, and literature information. Different features such as search, display, and sorting options and visualization tools have been incorporated in the database. ProTherm is a valuable resource for understanding/predicting the stability of proteins, and it can be accessed at http://www.abren.net/protherm/ . ProTherm has been effectively used to examine the relationship among thermodynamics, structure, and function of proteins. We describe the recent progress on the development of methods for understanding/predicting protein stability, such as (1) general trends of mutational effects on stability, (2) the relationship between the stability of protein mutants and amino acid properties, (3) applications of protein three-dimensional structures for predicting their stability upon point mutations, (4) prediction of protein stability upon single mutations from amino acid sequence, and (5) prediction methods for addressing double mutants. A list of online resources for prediction has also been provided.

  1. Relational database hybrid model, of high performance and storage capacity for nuclear engineering applications

    International Nuclear Information System (INIS)

    Gomes Neto, Jose

    2008-01-01

    The objective of this work is to present the relational database, named FALCAO, that was created and implemented to support the storage of the monitored variables in the IEA-R1 research reactor, located at the Instituto de Pesquisas Energeticas e Nucleares, IPEN/CNEN-SP. The data logical model and its direct influence on the integrity of the provided information are carefully considered. The concepts and steps of normalization and denormalization, including the entities and relations involved in the logical model, are presented. The effects of the model rules on the acquisition, loading, and availability of the final information are also presented from a performance standpoint, since the acquisition process loads and provides large amounts of information in small intervals of time. The SACD application, through its functionalities, presents the information stored in the FALCAO database in a practical and optimized form. The implementation of the FALCAO database occurred successfully, and its existence leads to a considerably favorable situation. It is now essential to the routine of the researchers involved, not only due to the substantial improvement of the process but also to the reliability associated with it. (author)

  2. MIRNA-DISTILLER: a stand-alone application to compile microRNA data from databases

    Directory of Open Access Journals (Sweden)

    Jessica K. Rieger

    2011-07-01

    Full Text Available MicroRNAs (miRNA) are small non-coding RNA molecules of ~22 nucleotides which regulate large numbers of genes by binding to seed sequences at the 3’-UTR of target gene transcripts. The target mRNA is then usually degraded or its translation is inhibited, thus resulting in posttranscriptional downregulation of gene expression at the mRNA and/or protein level. Due to the bioinformatic difficulties in predicting functional miRNA binding sites, several publicly available databases have been developed that predict miRNA binding sites based on different algorithms. The parallel use of different databases is currently indispensable, but highly inconvenient and time consuming, especially when working with numerous genes of interest. We have therefore developed a new stand-alone program, termed MIRNA-DISTILLER, which allows users to compile miRNA data for given target genes from public databases. Currently implemented are TargetScan, microCosm, and miRDB, which may be queried independently, pairwise, or together to calculate the respective intersections. Data are stored locally for application of further analysis tools including freely definable biological parameter filters, customized output lists for both miRNAs and target genes, and various graphical facilities. The software, a data example file and a tutorial are freely available at http://www.ikp-stuttgart.de/content/language1/html/10415.asp

  3. MIRNA-DISTILLER: A Stand-Alone Application to Compile microRNA Data from Databases.

    Science.gov (United States)

    Rieger, Jessica K; Bodan, Denis A; Zanger, Ulrich M

    2011-01-01

    MicroRNAs (miRNA) are small non-coding RNA molecules of ∼22 nucleotides which regulate large numbers of genes by binding to seed sequences at the 3'-untranslated region of target gene transcripts. The target mRNA is then usually degraded or its translation is inhibited, thus resulting in posttranscriptional downregulation of gene expression at the mRNA and/or protein level. Due to the bioinformatic difficulties in predicting functional miRNA binding sites, several publicly available databases have been developed that predict miRNA binding sites based on different algorithms. The parallel use of different databases is currently indispensable, but highly inconvenient and time consuming, especially when working with numerous genes of interest. We have therefore developed a new stand-alone program, termed MIRNA-DISTILLER, which allows users to compile miRNA data for given target genes from public databases. Currently implemented are TargetScan, microCosm, and miRDB, which may be queried independently, pairwise, or together to calculate the respective intersections. Data are stored locally for application of further analysis tools including freely definable biological parameter filters, customized output lists for both miRNAs and target genes, and various graphical facilities. The software, a data example file and a tutorial are freely available at http://www.ikp-stuttgart.de/content/language1/html/10415.asp.
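    The pairwise and three-way intersections that MIRNA-DISTILLER computes correspond to simple set operations. The sketch below uses hypothetical prediction sets for one target gene, not real output from TargetScan, microCosm, or miRDB:

```python
from itertools import combinations

# Hypothetical predicted miRNAs per database for a single target gene.
predictions = {
    "TargetScan": {"miR-21", "miR-155", "miR-34a"},
    "microCosm":  {"miR-21", "miR-34a", "miR-200b"},
    "miRDB":      {"miR-21", "miR-155"},
}

# Pairwise intersections, as when querying two databases together.
for a, b in combinations(predictions, 2):
    print(a, "&", b, "->", sorted(predictions[a] & predictions[b]))

# Consensus across all three databases.
consensus = set.intersection(*predictions.values())
print("all three ->", sorted(consensus))   # ['miR-21']
```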

  4. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  5. Emission computed tomography: methodology and applications

    International Nuclear Information System (INIS)

    Reivich, M.; Alavi, A.; Greenberg, J.; Fowler, J.; Christman, D.; Rosenquist, A.; Rintelmann, W.; Hand, P.; MacGregor, R.; Wolf, A.

    1980-01-01

    A technique for the determination of local cerebral glucose metabolism using positron emission computed tomography is described as an example of the development and use of this methodology for the study of these parameters in man. The method for the determination of local cerebral glucose metabolism utilizes 18F-2-fluoro-2-deoxyglucose ([18F]-FDG). In this method [18F]-FDG is used as a tracer for the exchange of glucose between plasma and brain and its phosphorylation by hexokinase in the tissue. The labelled product of metabolism, [18F]-FDG phosphate, is essentially trapped in the tissue over the time course of the measurement. The studies demonstrate the potential usefulness of emission computed tomography for the measurement of various biochemical and physiological parameters in man. (Auth.)

  6. Application of Computer Vision in Agriculture

    OpenAIRE

    Archana B. Patankar; Priya A. Tayade

    2015-01-01

    Grading and sorting of fruits and leaves is one of the most important processes in fruit production, yet it is typically performed manually in most countries. Computer vision techniques have been applied to evaluating food quality as well as fruit grading. In this project, different techniques are used, namely image preprocessing and image segmentation with the k-means clustering algorithm, to find the infection present in an image and to calculate the percentage of infection; from that percentage the...

  7. The computer coordination method and research of inland river traffic based on ship database

    Science.gov (United States)

    Liu, Shanshan; Li, Gen

    2018-04-01

    A computer-coordinated management method for inland river ship traffic is proposed in this paper. The position, speed, and other navigation information of inland ships are obtained via VTS; static and dynamic ship databases are built; and a program for computer-coordinated management of inland river traffic is written in VB. The meeting states of ships are automatically simulated and calculated, providing long-range collision avoidance information, so that long-range collision avoidance of ships can be realized. The results show that ships avoid or reduce meetings, and this method can effectively support macro-level collision avoidance of ships.
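    One common way to quantify the "meeting state" of two ships, as simulated in this record, is the time to and distance at the closest point of approach (TCPA/DCPA). The sketch below is an illustrative constant-velocity calculation with hypothetical data, not the paper's VB program:

```python
import math

def cpa(p1, v1, p2, v2):
    """Time to and distance at the closest point of approach for two ships
    moving with constant velocity (positions in km, speeds in km/h)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]       # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]     # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    # If the relative velocity is zero the separation never changes;
    # otherwise clamp to t >= 0 so past approaches are ignored.
    t = 0.0 if dv2 == 0 else max(0.0, -(dx * dvx + dy * dvy) / dv2)
    return t, math.hypot(dx + dvx * t, dy + dvy * t)

# Two ships on reciprocal courses in the same channel (hypothetical data).
t, d = cpa((0.0, 0.0), (10.0, 0.0), (5.0, 0.2), (-10.0, 0.0))
print(f"TCPA = {t:.2f} h, DCPA = {d:.2f} km")   # TCPA = 0.25 h, DCPA = 0.20 km
```

    A coordination program would evaluate this pair-wise over all tracked ships and raise advice whenever DCPA falls below a chosen safety threshold.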

  8. ECOS: a configurable, multi-terabyte database supporting engineering and technical computing at Sizewell B

    International Nuclear Information System (INIS)

    Binns, F.; Fish, A.

    1992-01-01

    One of the three main classes of computing support systems is concerned with the technical and engineering aspects of the Sizewell B power station. These aspects centre primarily on engineering means to optimise plant use and maximise power output by increasing availability and efficiency. At Sizewell B the Engineering Computer System (ECOS) will provide the necessary support facilities, and it is described here. ECOS is being used by the station commissioning team and for monitoring the state of some plant already in service. (Author)

  9. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used for linked applications in managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS
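    The paper's proposed schema is for a CODASYL (network-model) GDMS; as a present-day illustration of how the redundant index, abstract, and requester data might normalize, here is a hypothetical relational sketch using Python's sqlite3 (all table and column names are invented, not taken from the paper):

```python
import sqlite3

# Hypothetical normalization of the library's eleven redundant files:
# each program and abstract is held once; requests reference programs.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE program  (id INTEGER PRIMARY KEY, name TEXT UNIQUE, abstract TEXT);
CREATE TABLE requester(id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE request  (program_id   INTEGER REFERENCES program(id),
                       requester_id INTEGER REFERENCES requester(id),
                       requested_on TEXT);
""")
db.execute("INSERT INTO program(name, abstract) VALUES ('REACT-1', 'core calc')")
db.execute("INSERT INTO requester(name) VALUES ('Lab A')")
db.execute("INSERT INTO request VALUES (1, 1, '1978-05-01')")

# One join replaces the linked sequential-file lookups.
n, = db.execute("""SELECT COUNT(*) FROM request r
                   JOIN program p ON p.id = r.program_id""").fetchone()
print(n)   # 1
```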

  10. Restricted access processor - An application of computer security technology

    Science.gov (United States)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  11. Two-phase computer codes for zero-gravity applications

    International Nuclear Information System (INIS)

    Krotiuk, W.J.

    1986-10-01

    This paper discusses the problems existing in the development of computer codes which can analyze the thermal-hydraulic behavior of two-phase fluids, especially in low-gravity nuclear reactors. The important phenomena affecting fluid flow and heat transfer in reduced gravity are discussed. The applicability of using existing computer codes for space applications is assessed. Recommendations regarding the use of existing earth-based fluid flow and heat transfer correlations are made, and deficiencies in these correlations are identified

  12. From handwriting analysis to pen-computer applications

    NARCIS (Netherlands)

    Schomaker, L

    1998-01-01

    In this paper, pen computing, i.e. the use of computers and applications in which the pen is the main input device, will be described from four different viewpoints. Firstly a brief overview of the hardware developments in pen systems is given, leading to the conclusion that the technological

  13. Journal of Computer Science and Its Application: Submissions

    African Journals Online (AJOL)

    Author Guidelines. The Journal of Computer Science and Its Applications welcomes submission of complete and original research manuscripts, which are not under review in any other conference or journal. The topics covered by the journal include but are not limited to Artificial Intelligence, Bioinformatics, Computational ...

  14. Embedded computer systems for control applications in EBR-II

    International Nuclear Information System (INIS)

    Carlson, R.B.; Start, S.E.

    1993-01-01

    The purpose of this paper is to describe the embedded computer systems approach taken at Experimental Breeder Reactor II (EBR-II) for non-safety-related systems. The hardware and software structures for typical embedded systems are presented. The embedded systems development process is described. Three examples are given which illustrate typical embedded computer applications in EBR-II

  15. Computational Fluid Dynamics Methods and Their Applications in Medical Science

    Directory of Open Access Journals (Sweden)

    Kowalewski Wojciech

    2016-12-01

    Full Text Available As defined by the National Institutes of Health: “Biomedical engineering integrates physical, chemical, mathematical, and computational sciences and engineering principles to study biology, medicine, behavior, and health”. Many issues in this area are closely related to fluid dynamics. This paper provides an overview of the basic concepts concerning Computational Fluid Dynamics and its applications in medicine.

  16. Engineering of systems for application of scientific computing in industry

    OpenAIRE

    Loeve, W.; Loeve, W.

    1992-01-01

    Mathematics software is of growing importance for computer simulation in industrial computer-aided engineering. To be applicable in industry, the mathematics software and supporting software must be structured in such a way that functions and performance can be maintained easily. In the present paper a method is described for developing mathematics software in such a way that this requirement can be met.

  17. The NASA Ames Polycyclic Aromatic Hydrocarbon Infrared Spectroscopic Database : The Computed Spectra

    NARCIS (Netherlands)

    Bauschlicher, C. W.; Boersma, C.; Ricca, A.; Mattioda, A. L.; Cami, J.; Peeters, E.; de Armas, F. Sanchez; Saborido, G. Puerta; Hudgins, D. M.; Allamandola, L. J.

    The astronomical emission features, formerly known as the unidentified infrared bands, are now commonly ascribed to polycyclic aromatic hydrocarbons (PAHs). The laboratory experiments and computational modeling done at the NASA Ames Research Center to create a collection of PAH IR spectra relevant

  18. Mobile computing acceptance grows as applications evolve.

    Science.gov (United States)

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years.

  19. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    International Nuclear Information System (INIS)

    Jacobs, Colin; Prokop, Mathias; Rikxoort, Eva M. van; Ginneken, Bram van; Murphy, Keelin; Schaefer-Prokop, Cornelia M.

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database contains 888 thoracic CT scans with a section thickness of 2.5 mm or lower. We report the performance of two commercial and one academic CAD system. The influence of the presence of contrast, section thickness, and reconstruction kernel on CAD performance was assessed. Four radiologists independently analyzed the false positive CAD marks of the best CAD system. The updated commercial CAD system showed the best performance with a sensitivity of 82 % at an average of 3.1 false positive detections per scan. Forty-five false positive CAD marks were scored as nodules by all four radiologists in our study. On the largest publicly available reference database for lung nodule detection in chest CT, the updated commercial CAD system locates the vast majority of pulmonary nodules at a low false positive rate. Potential for CAD is substantiated by the fact that it identifies pulmonary nodules that were not marked during the extensive four-fold LIDC annotation process. (orig.)

  20. Intelligent decision support systems for sustainable computing paradigms and applications

    CERN Document Server

    Abraham, Ajith; Siarry, Patrick; Sheng, Michael

    2017-01-01

    This unique book discusses the latest research, innovative ideas, challenges and computational intelligence (CI) solutions in sustainable computing. It presents novel, in-depth fundamental research on achieving a sustainable lifestyle for society, either from a methodological or from an application perspective. Sustainable computing has expanded to become a significant research area covering the fields of computer science and engineering, electrical engineering and other engineering disciplines, and there has been an increase in the amount of literature on aspects of sustainable computing, such as energy efficiency and natural resource conservation, that emphasizes the role of ICT (information and communications technology) in achieving system design and operation objectives. The energy impact/design of more efficient IT infrastructures is a key challenge in realizing new computing paradigms. The book explores the uses of computational intelligence (CI) techniques for intelligent decision support that can be explo...

  1. Computational applications of DNA physical scales

    DEFF Research Database (Denmark)

    Baldi, Pierre; Chauvin, Yves; Brunak, Søren

    1998-01-01

    The authors study from a computational standpoint several different physical scales associated with structural features of DNA sequences, including dinucleotide scales such as base stacking energy and propeller twist, and trinucleotide scales such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example we construct a strand-invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models...

  2. Computational applications of DNA structural scales

    DEFF Research Database (Denmark)

    Baldi, P.; Chauvin, Y.; Brunak, Søren

    1998-01-01

    Studies several different physical scales associated with the structural features of DNA sequences from a computational standpoint, including dinucleotide scales, such as base stacking energy and propeller twist, and trinucleotide scales, such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example, we construct a strand-invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models...

  3. An industrial application of computer assisted tomography

    International Nuclear Information System (INIS)

    Tonner, P.D.; Tosello, G.

    1984-10-01

    Computer assisted tomography (CAT) scanning is a nondestructive testing technique used to obtain quantitatively accurate mappings of the distribution of linear attenuation coefficients inside an object. To demonstrate the potential of the technique for accurately locating defects in three dimensions, a sectioned 5 cm gate valve, with a shrink cavity made visible by the sectioning, was tomographically imaged using a Co-60 source. The tomographic images revealed a larger cavity below the sectioned surface. The position of this cavity was located with an in-plane and axial precision of approximately ±1 mm. The volume of the cavity was estimated to be approximately 40 mm³.

  4. Statistical theory applications and associated computer codes

    International Nuclear Information System (INIS)

    Prince, A.

    1980-01-01

    The general format is along the same lines as that used in the O.M. Session, i.e. an introduction to the nature of the physical problems and methods of solution based on the statistical model of the nucleus. Both binary and higher multiple reactions are considered. The computer codes used in this session are a combination of optical model and statistical theory. As with the O.M. sessions, the preparation of input and analysis of output are thoroughly examined. Again, comparison with experimental data serves to demonstrate the validity of the results and possible areas for improvement. (author)

  5. An overview of interactive computer graphics and its application to computer-aided engineering and design

    International Nuclear Information System (INIS)

    Van Dam, A.

    1983-01-01

    The purpose of this brief bird's-eye view of interactive graphics is to list the key ideas and to show how one of the most important application areas, Computer Aided Engineering/Design, takes advantage of them. (orig.)

  6. Application of bioinformatics tools and databases in microbial dehalogenation research (a review).

    Science.gov (United States)

    Satpathy, R; Konkimalla, V B; Ratha, J

    2015-01-01

    Microbial dehalogenation is a biochemical process in which halogenated substances are enzymatically catalyzed into their non-halogenated forms. Microorganisms have a wide range of organohalogen degradation abilities, both explicit and non-specific in nature. Most of these halogenated organic compounds, being pollutants, need to be remediated; therefore, current approaches explore the potential of microbes at the molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge in the field of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Therefore, with the discovery of a microorganism, one can predict a gene/protein, perform sequence analysis, structural modelling, metabolic pathway analysis, biodegradation studies and so on. This review highlights various bioinformatics approaches, describing the application of databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modelling methods, comprising gene finding, protein modelling, Quantitative Structure-Biodegradability Relationship (QSBR) studies and reconstruction of metabolic pathways employed in the dehalogenation research area.

  7. Application of embedded database to digital power supply system in HIRFL

    International Nuclear Information System (INIS)

    Wu Guanghua; Yan Huaihai; Chen Youxin; Huang Yuzhen; Zhou Zhongzu; Gao Daqing

    2014-01-01

    Background: This paper introduces the application of an embedded MySQL database in the real-time monitoring system of the digital power supply system at the Heavy Ion Research Facility in Lanzhou (HIRFL). Purpose: The aim is to optimize the real-time monitoring system of the digital power supply system for better performance. Methods: The MySQL database is designed and implemented under the Linux operating system running on an ARM processor, together with the related functions for real-time data monitoring, such as collection, storage and query. All status parameters of the digital power supply system are collected and communicated to the ARM by an FPGA, whilst the user interface is realized with Qt toolkits on the ARM end. Results: Actual operation indicates that the digital power supply can realize the functions of real-time data monitoring, collection, storage and so on. Conclusion: Through practical application, we have found some aspects that can be improved, and we will try to optimize them in the future. (authors)
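
    The store-and-query loop such a monitoring system needs can be sketched as follows. This is an illustration only, not the authors' code: the table name `ps_status` and its columns are hypothetical, and Python's sqlite3 stands in here for the embedded MySQL database so the sketch is self-contained.

```python
import sqlite3

# Minimal sketch of a power-supply status log (hypothetical schema,
# sqlite3 standing in for the embedded MySQL database described above).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ps_status (
        channel   TEXT    NOT NULL,  -- power-supply channel id (hypothetical)
        ts        INTEGER NOT NULL,  -- sample timestamp (epoch seconds)
        current_a REAL    NOT NULL   -- measured output current in amperes
    )""")

# Samples as the FPGA-to-ARM collection path might deliver them.
samples = [("PS1", 100, 12.01), ("PS1", 101, 12.03), ("PS2", 100, 7.55)]
conn.executemany("INSERT INTO ps_status VALUES (?, ?, ?)", samples)

# Latest reading per channel, as a monitoring UI might query it.
# (SQLite's MAX() picks the matching row's other columns.)
rows = conn.execute("""
    SELECT channel, MAX(ts), current_a
    FROM ps_status GROUP BY channel ORDER BY channel""").fetchall()
print(rows)  # -> [('PS1', 101, 12.03), ('PS2', 100, 7.55)]
```

    A real deployment would additionally index on `(channel, ts)` and prune or roll up old samples so the embedded database stays small.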

  8. [Stroke mortality in Poland--role of observational studies based on computer databases].

    Science.gov (United States)

    Mazurek, Maciej

    2005-01-01

    Stroke is a leading cause of death worldwide and remains one of the major public health problems. Most European countries have experienced declines in stroke mortality, in contrast to central and eastern European countries including Poland. The World Health Organization Data Bank is an invaluable source of information, especially for mortality trends. Stroke mortality in Poland and some problems with the accuracy of ICD coding for the identification of patients with acute stroke are discussed. Computerized databases are increasingly being used to identify patients with acute stroke for epidemiological, quality-of-care, and cost studies. More accurate methods of collecting and analysing the data should be implemented to gain more information from these databases.

  9. Evolution of computational models in BioModels Database and the Physiome Model Repository.

    Science.gov (United States)

    Scharm, Martin; Gebhardt, Tom; Touré, Vasundra; Bagnacani, Andrea; Salehzadeh-Yazdi, Ali; Wolkenhauer, Olaf; Waltemath, Dagmar

    2018-04-12

    A useful model is one that is being (re)used. The development of a successful model does not finish with its publication. During reuse, models are being modified, i.e. expanded, corrected, and refined. Even small changes in the encoding of a model can, however, significantly affect its interpretation. Our motivation for the present study is to identify changes in models and make them transparent and traceable. We analysed 13734 models from BioModels Database and the Physiome Model Repository. For each model, we studied the frequencies and types of updates between its first and latest release. To demonstrate the impact of changes, we explored the history of a Repressilator model in BioModels Database. We observed continuous updates in the majority of models. Surprisingly, even the early models are still being modified. We furthermore detected that many updates target annotations, which improves the information one can gain from models. To support the analysis of changes in model repositories we developed MoSt, an online tool for visualisations of changes in models. The scripts used to generate the data and figures for this study are available from GitHub https://github.com/binfalse/BiVeS-StatsGenerator and as a Docker image at https://hub.docker.com/r/binfalse/bives-statsgenerator/ . The website https://most.bio.informatik.uni-rostock.de/ provides interactive access to model versions and their evolutionary statistics. The reuse of models is still impeded by a lack of trust and documentation. A detailed and transparent documentation of all aspects of the model, including its provenance, will improve this situation. Knowledge about a model's provenance can avoid the repetition of mistakes that others already faced. More insights are gained into how the system evolves from initial findings to a profound understanding. We argue that it is the responsibility of the maintainers of model repositories to offer transparent model provenance to their users.

  10. Advances in Computer Science, Engineering & Applications : Proceedings of the Second International Conference on Computer Science, Engineering & Applications

    CERN Document Server

    Zizka, Jan; Nagamalai, Dhinaharan

    2012-01-01

    The International conference series on Computer Science, Engineering & Applications (ICCSEA) aims to bring together researchers and practitioners from academia and industry to focus on understanding computer science, engineering and applications and to establish new collaborations in these areas. The Second International Conference on Computer Science, Engineering & Applications (ICCSEA-2012), held in Delhi, India, during May 25-27, 2012 attracted many local and international delegates, presenting a balanced mixture of  intellect and research both from the East and from the West. Upon a strenuous peer-review process the best submissions were selected leading to an exciting, rich and a high quality technical conference program, which featured high-impact presentations in the latest developments of various areas of computer science, engineering and applications research.  

  12. Cloud computing and digital media fundamentals, techniques, and applications

    CERN Document Server

    Li, Kuan-Ching; Shih, Timothy K

    2014-01-01

    Cloud Computing and Digital Media: Fundamentals, Techniques, and Applications presents the fundamentals of cloud and media infrastructure, novel technologies that integrate digital media with cloud computing, and real-world applications that exemplify the potential of cloud computing for next-generation digital media. It brings together technologies for media/data communication, elastic media/data storage, security, authentication, cross-network media/data fusion, interdevice media interaction/reaction, data centers, PaaS, SaaS, and more.The book covers resource optimization for multimedia clo

  13. Implementation of DFT application on ternary optical computer

    Science.gov (United States)

    Junjie, Peng; Youyi, Fu; Xiaofeng, Zhang; Shuai, Kong; Xinyu, Wei

    2018-03-01

    Owing to its characteristics of a huge number of data bits and low energy consumption, optical computing may be used in applications such as the DFT, which need a large amount of computation and can be implemented in parallel. Accordingly, DFT implementation methods in full parallel as well as in partial parallel are presented. Based on the resources of the ternary optical computer (TOC), extensive experiments were carried out. Experimental results show that the proposed schemes are correct and feasible. They provide a foundation for further exploration of applications on the TOC that need a large amount of calculation and can be processed in parallel.

  14. SISSY: An example of a multi-threaded, networked, object-oriented database application

    International Nuclear Information System (INIS)

    Scipioni, B.; Liu, D.; Song, T.

    1993-05-01

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.

  15. Medical imaging technology reviews and computational applications

    CERN Document Server

    Dewi, Dyah

    2015-01-01

    This book presents the latest research findings and reviews in the field of medical imaging technology, covering ultrasound diagnostics approaches for detecting osteoarthritis, breast carcinoma and cardiovascular conditions, image guided biopsy and segmentation techniques for detecting lung cancer, image fusion, and simulating fluid flows for cardiovascular applications. It offers a useful guide for students, lecturers and professional researchers in the fields of biomedical engineering and image processing.

  16. The computation of fixed points and applications

    CERN Document Server

    Todd, Michael J

    1976-01-01

    Fixed-point algorithms have diverse applications in economics, optimization, game theory and the numerical solution of boundary-value problems. Since Scarf's pioneering work [56,57] on obtaining approximate fixed points of continuous mappings, a great deal of research has been done in extending the applicability and improving the efficiency of fixed-point methods. Much of this work is available only in research papers, although Scarf's book [58] gives a remarkably clear exposition of the power of fixed-point methods. However, the algorithms described by Scarf have been superseded by the more sophisticated restart and homotopy techniques of Merrill [~8,~9] and Eaves and Saigal [1~,16]. To understand the more efficient algorithms one must become familiar with the notions of triangulation and simplicial approximation, whereas Scarf stresses the concept of primitive set. These notes are intended to introduce to a wider audience the most recent fixed-point methods and their applications. Our approach is therefore ...
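
    The blurb above concerns simplicial and homotopy algorithms; as a much simpler, generic illustration of what computing a fixed point means (plain successive approximation, not one of the book's methods), consider:

```python
import math

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    """Generic successive-approximation sketch (not an algorithm from the
    book above): iterate x <- f(x) until the update falls below tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("did not converge")

# cos(x) = x has a unique solution near 0.739 (the Dottie number).
print(round(fixed_point(math.cos, 1.0), 6))  # -> 0.739085
```

    Plain iteration like this only converges when the mapping is contracting near the fixed point; the restart and homotopy methods discussed in the book exist precisely because many mappings of interest are not.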

  17. Computational intelligence in digital forensics forensic investigation and applications

    CERN Document Server

    Choo, Yun-Huoy; Abraham, Ajith; Srihari, Sargur

    2014-01-01

    Computational intelligence techniques have been widely explored in various domains, including forensics. Forensic analysis encompasses the study of pattern analysis that answers questions of interest in security, medical, legal and genetic studies. However, forensic analysis is usually performed through experiments in the lab, which are expensive in both cost and time. Therefore, this book seeks to explore the progress and advancement of computational intelligence techniques in different focus areas of forensic studies. It aims to build a stronger connection between computer scientists and forensic field experts.   This book, Computational Intelligence in Digital Forensics: Forensic Investigation and Applications, is the first volume in the Intelligent Systems Reference Library series. The book presents original research results and innovative applications of computational intelligence in digital forensics. This edited volume contains seventeen chapters and presents the latest state-of-the-art advancement ...

  18. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

    This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as finite element methods, computer-aided design and optimization, as well as visualization techniques such as computed axial tomography, open up completely new research fields that combine engineering with the bio/medical sciences. Nevertheless, there are still hurdles, since the two directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  19. Towards Process Support for Migrating Applications to Cloud Computing

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2012-01-01

    Cloud computing is an active area of research for industry and academia. There are a large number of organizations providing cloud computing infrastructure and services. In order to utilize these infrastructure resources and services, existing applications need to be migrated to clouds. However...... for supporting migration to cloud computing based on our experiences from migrating an Open Source System (OSS), Hackystat, to two different cloud computing platforms. We explain the process by performing a comparative analysis of our efforts to migrate Hackystat to Amazon Web Services and Google App Engine....... We also report the potential challenges, suitable solutions, and lessons learned to support the presented process framework. We expect that the reported experiences can serve as guidelines for those who intend to migrate software applications to cloud computing....

  20. Design and applications of Computed Industrial Tomographic Imaging System (CITIS)

    International Nuclear Information System (INIS)

    Ramakrishna, G.S.; Umesh Kumar; Datta, S.S.; Rao, S.M.

    1996-01-01

    Computed tomographic imaging is an advanced technique for nondestructive testing (NDT) and examination. For the first time in India, a computer-aided tomography system for testing industrial components has been indigenously developed at BARC and was successfully demonstrated. In addition to Computed Tomography (CT), the system can also perform Digital Radiography (DR), serving as a powerful tool for NDT applications. It has wide applications in the nuclear, space and allied fields. The authors have developed a computed industrial tomographic imaging system with a Cesium-137 gamma radiation source for nondestructive examination of engineering and industrial specimens. This presentation highlights the design and development of a prototype system and its software for image reconstruction, simulation and display. The paper also describes results obtained with several test specimens, current developments, and the possibility of using neutrons as well as high-energy X-rays in computed tomography. (author)

  1. Computed Tomography Technology: Development and Applications for Defence

    International Nuclear Information System (INIS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-01-01

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT and E). Tomography for industrial applications warrants the design and development of customized solutions catering to specific visualization requirements. The present paper highlights tomography technology solutions implemented at the Defence Laboratory, Jodhpur (DLJ). Details of the technological developments carried out and their utilization for various defence applications are covered.

  2. Technical property and application of industrial computed tomography

    International Nuclear Information System (INIS)

    Sun Lingxia; Ye Yunchang

    2006-01-01

    The main technical properties of industrial computed tomography (ICT) and its application in non-destructive testing (NDT) are described, and some examples of ICT applications in fields such as defect detection, welding quality, density uniformity, structure analysis and making-up quality are given. (authors)

  3. Applications of Deontic Logic in Computer Science: A Concise Overview

    NARCIS (Netherlands)

    Meyer, J.-J.Ch.; Meyer, John-Jules Ch.; Wieringa, Roelf J.

    1993-01-01

    Deontic logic is the logic that deals with actual as well as ideal behavior of systems. In this paper, we survey a number of applications of deontic logic in computer science that have arisen in the eighties, and give a systematic framework in which these applications can be classified. Many...

  4. Computer Applications in Production and Engineering

    DEFF Research Database (Denmark)

    Sørensen, Torben

    1997-01-01

    This paper addresses how neutral product model interfaces can be identified, specified, and implemented to provide intelligent and flexible means for information management in the manufacturing of discrete mechanical products. The use of advanced computer-based systems, such as CAD, CAE, CNC, and robotics......, offers a potential for significant cost savings and quality improvements in the manufacturing of discrete mechanical products. However, these systems are introduced into production as 'islands of automation' or 'islands of information', and to benefit from the said potential, the systems must be integrated...... domains; the CA(X) systems are placed in two different domains for design and planning, respectively. A third domain within the CIME architecture comprises the automated equipment on the shop floor.

  5. Application of computer voice input/output

    International Nuclear Information System (INIS)

    Ford, W.; Shirk, D.G.

    1981-01-01

    The advent of microprocessors and other large-scale integration (LSI) circuits is making voice input and output for computers and instruments practical; specialized LSI chips for speech processing are appearing on the market. Voice can be used to input data or to issue instrument commands; this allows the operator to engage in other tasks, move about, and use standard data entry systems. Voice synthesizers can generate audible, easily understood instructions. Using voice characteristics, a control system can verify speaker identity for security purposes. Two simple voice-controlled systems have been designed at Los Alamos for nuclear safeguards applications. Each can easily be expanded as time allows. The first system is for instrument control; it accepts voice commands and issues audible operator prompts. The second system is for access control. The speaker's voice is used to verify his identity and to actuate external devices.

  6. Geochemical databases. Part 1. Pmatch: a program to manage thermochemical data. Part 2. The experimental validation of geochemical computer models

    International Nuclear Information System (INIS)

    Pearson, F.J. Jr.; Avis, J.D.; Nilsson, K.; Skytte Jensen, B.

    1993-01-01

    This work is carried out under a cost-sharing contract with the European Atomic Energy Community in the framework of its programme on Management and Storage of Radioactive Wastes. Part 1: PMATCH, A Program to Manage Thermochemical Data, describes the development and use of a computer program by means of which new thermodynamic data from the literature may be referenced to a common frame and thereby become internally consistent with an existing database. The report presents the relevant thermodynamic expressions, and their use in the program is discussed. When there are not sufficient thermodynamic data available to describe a species' behaviour under all conceivable conditions, the problems arising are thoroughly discussed and the available data are handled by approximating expressions. Part 2: The Experimental Validation of Geochemical Computer Models presents the results of experimental investigations of the equilibria established in aqueous suspensions of mixtures of carbonate minerals (calcium, magnesium, manganese and europium carbonates), compared with theoretical calculations made by means of the geochemical JENSEN program. The study revealed that the geochemical computer program worked well and that its database was of sufficient validity. However, it was observed that experimental difficulties can hardly be avoided when, as here, a gaseous component takes part in the equilibria. Whereas the magnesium and calcium carbonates did not demonstrate mutual solid solubility, abnormal effects were produced when manganese and calcium carbonates were mixed, resulting in a diminished solubility of both manganese and calcium. With tracer amounts of europium added to a suspension of calcite in sodium carbonate solutions, long-term experiments revealed a transition after 1-2 months, whereby the tracer became more strongly adsorbed onto calcite. The transition is interpreted as the nucleation and formation of a surface phase incorporating the 'species' NaEu(CO3)2.

  7. Exploring the Ligand-Protein Networks in Traditional Chinese Medicine: Current Databases, Methods, and Applications

    Directory of Open Access Journals (Sweden)

    Mingzhu Zhao

    2013-01-01

    Traditional Chinese medicine (TCM), which has thousands of years of clinical application in China and other Asian countries, is the pioneer of the “multicomponent-multitarget” approach and of network pharmacology. Although its efficacy is not in doubt, it is difficult to elucidate a convincing underlying mechanism of TCM due to its complex composition and unclear pharmacology. The use of ligand-protein networks has been gaining significant value in the history of drug discovery, while its application in TCM is still at an early stage. This paper first surveys TCM databases for virtual screening that have been greatly expanded in size and data diversity in recent years. On that basis, different screening methods and strategies for identifying active ingredients and targets of TCM are outlined based on the amount of network information available, on the sides of both ligand bioactivity and protein structure. Furthermore, applications of successful in silico target identification attempts are discussed in detail, along with experiments in exploring the ligand-protein networks of TCM. Finally, it is concluded that the prospective application of ligand-protein networks can be used not only to predict the protein targets of a small molecule, but also to explore the mode of action of TCM.

  8. Application of the British Food Standards Agency nutrient profiling system in a French food composition database.

    Science.gov (United States)

    Julia, Chantal; Kesse-Guyot, Emmanuelle; Touvier, Mathilde; Méjean, Caroline; Fezeu, Léopold; Hercberg, Serge

    2014-11-28

    Nutrient profiling systems are powerful tools for public health initiatives, as they aim at categorising foods according to their nutritional quality. The British Food Standards Agency (FSA) nutrient profiling system (FSA score) has been validated in a British food database, but the application of the model in other contexts has not yet been evaluated. The objective of the present study was to assess the application of the British FSA score in a French food composition database. Foods from the French NutriNet-Santé study food composition table were categorised according to their FSA score using the Office of Communication (OfCom) cut-off value ('healthier' ≤ 4 for foods and ≤ 1 for beverages; 'less healthy' >4 for foods and >1 for beverages) and distribution cut-offs (quintiles for foods, quartiles for beverages). Foods were also categorised according to the food groups used for the French Programme National Nutrition Santé (PNNS) recommendations. Foods were weighted according to their relative consumption in a sample drawn from the NutriNet-Santé study (n 4225), representative of the French population. Classification of foods according to the OfCom cut-offs was consistent with food groups described in the PNNS: 97·8 % of fruit and vegetables, 90·4 % of cereals and potatoes and only 3·8 % of sugary snacks were considered as 'healthier'. Moreover, variability in the FSA score allowed for a discrimination between subcategories in the same food group, confirming the possibility of using the FSA score as a multiple category system, for example as a basis for front-of-pack nutrition labelling. Application of the FSA score in the French context would adequately complement current public health recommendations.
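
    The OfCom binary cut-off described above ('healthier' ≤ 4 for foods, ≤ 1 for beverages) can be sketched as a one-line classifier. The function name is hypothetical; this is an illustration of the stated rule, not the study's code:

```python
def classify_fsa(score: float, is_beverage: bool = False) -> str:
    """Classify an item by its FSA nutrient profile score using the OfCom
    cut-offs quoted above: 'healthier' if score <= 4 for foods (<= 1 for
    beverages), otherwise 'less healthy'."""
    threshold = 1 if is_beverage else 4
    return "healthier" if score <= threshold else "less healthy"

print(classify_fsa(3))          # a food scoring 3 -> "healthier"
print(classify_fsa(2, True))    # a beverage scoring 2 -> "less healthy"
```

    The study also uses distribution cut-offs (quintiles for foods, quartiles for beverages), which would require the score distribution of the whole database rather than fixed thresholds.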

  9. 'Isotopo' a database application for facile analysis and management of mass isotopomer data.

    Science.gov (United States)

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eylert, Eva; Eisenreich, Wolfgang; Dandekar, Thomas

    2014-01-01

    The composition of stable-isotope labelled isotopologues/isotopomers in metabolic products can be measured by mass spectrometry and supports the analysis of pathways and fluxes. As a prerequisite, the original mass spectra have to be processed, managed and stored to rapidly calculate, analyse and compare isotopomer enrichments to study, for instance, bacterial metabolism in infection. For such applications, we provide here the database application 'Isotopo'. This software package includes (i) a database to store and process isotopomer data, (ii) a parser to upload and translate different data formats for such data and (iii) an improved application to process and convert signal intensities from mass spectra of (13)C-labelled metabolites such as tert-butyldimethylsilyl derivatives of amino acids. Relative mass intensities and isotopomer distributions are calculated applying a partial least squares method with iterative refinement for high-precision data. The data output includes formats such as graphs for overall enrichments in amino acids. The package is user-friendly, allowing easy and robust data management of multiple experiments. The 'Isotopo' software is available at the following web link (section Download): http://spp1316.uni-wuerzburg.de/bioinformatics/isotopo/. The package contains three additional files: a software executable setup (installer), one data set file (discussed in this article) and one Excel file (which can be used to convert data from Excel to '.iso' format). The 'Isotopo' software is compatible only with the Microsoft Windows operating system. © The Author(s) 2014. Published by Oxford University Press.
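
    As a hedged illustration only (not Isotopo's partial-least-squares algorithm, whose iterative refinement is more involved), the basic step of turning raw mass-peak intensities of an isotopologue series (M+0, M+1, ...) into relative fractions might look like:

```python
def relative_intensities(intensities):
    """Normalize raw mass-peak intensities of an isotopologue series to
    relative fractions that sum to 1 (simplified sketch; real tools also
    correct for natural isotope abundance and derivatization)."""
    total = sum(intensities)
    return [i / total for i in intensities]

# Hypothetical M+0..M+2 intensities for one amino-acid fragment:
print(relative_intensities([800.0, 150.0, 50.0]))  # -> [0.8, 0.15, 0.05]
```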

  10. Analysis on Cloud Computing Database in Cloud Environment – Concept and Adoption Paradigm

    OpenAIRE

    Elena-Geanina ULARU; Florina PUICAN; Manole VELICANU

    2012-01-01

    With the development of the Internet’s new technical functionalities, new concepts have started to take shape. These concepts have an important role especially in the development of corporate IT. Such a concept is „the Cloud”. Various marketing campaigns have started to focus on the Cloud and began to promote it in different but confusing ways. These campaigns do little to explain what cloud computing is and why it is becoming increasingly necessary. The lack of understanding in this new tech...

  11. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  12. Computer, Informatics, Cybernetics and Applications : Proceedings of the CICA 2011

    CERN Document Server

    Hua, Ertian; Lin, Yun; Liu, Xiaozhu

    2012-01-01

    Computer Informatics Cybernetics and Applications offers 91 papers chosen for publication from among 184 papers accepted for presentation to the International Conference on Computer, Informatics, Cybernetics and Applications 2011 (CICA 2011), held in Hangzhou, China, September 13-16, 2011. The CICA 2011 conference provided a forum for engineers and scientists in academia, industry, and government to address the most innovative research and development including technical challenges and social, legal, political, and economic issues, and to present and discuss their ideas, results, work in progress and experience on all aspects of Computer, Informatics, Cybernetics and Applications. Reflecting the broad scope of the conference, the contents are organized in these topical categories: Communication Technologies and Applications Intelligence and Biometrics Technologies Networks Systems and Web Technologies Data Modeling and Programming Languages Digital Image Processing Optimization and Scheduling Education and In...

  13. Introduction to lattice theory with computer science applications

    CERN Document Server

    Garg, Vijay K

    2015-01-01

    A computational perspective on partial order and lattice theory, focusing on algorithms and their applications This book provides a uniform treatment of the theory and applications of lattice theory. The applications covered include tracking dependency in distributed systems, combinatorics, detecting global predicates in distributed systems, set families, and integer partitions. The book presents algorithmic proofs of theorems whenever possible. These proofs are written in the calculational style advocated by Dijkstra, with arguments explicitly spelled out step by step. The author's intent

  14. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

    Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One of the solutions is to use cloud computing. However, there is an optimization problem of mobile device and cloud resources allocation. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution with constrained execution time.
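    The allocation idea in this record, minimizing energy while keeping execution time within a deadline, can be sketched with a simple greedy heuristic. This is not the paper's algorithm, and the task parameters are invented:

```python
# Hedged sketch of device/cloud task placement: start every task on its
# cheaper-energy placement, then switch the tasks that save the most time
# per unit of extra energy until the deadline is met.

def allocate(tasks, deadline):
    """tasks: list of dicts mapping "device"/"cloud" to (time, energy)."""
    choice = ["device" if t["device"][1] <= t["cloud"][1] else "cloud"
              for t in tasks]
    def total(idx):  # idx 0 = time, idx 1 = energy
        return sum(t[c][idx] for t, c in zip(tasks, choice))
    while total(0) > deadline:
        best, best_ratio = None, 0.0
        for i, (t, c) in enumerate(zip(tasks, choice)):
            other = "cloud" if c == "device" else "device"
            dt = t[c][0] - t[other][0]   # time saved by switching task i
            de = t[other][1] - t[c][1]   # extra energy paid for the switch
            if dt > 0 and dt / max(de, 1e-9) > best_ratio:
                best, best_ratio = (i, other), dt / max(de, 1e-9)
        if best is None:
            raise RuntimeError("deadline infeasible")
        choice[best[0]] = best[1]
    return choice, total(0), total(1)

# Invented example: two tasks, each with (time, energy) per placement.
tasks = [
    {"device": (5.0, 1.0), "cloud": (2.0, 3.0)},
    {"device": (4.0, 0.5), "cloud": (1.0, 2.0)},
]
placement, time_total, energy_total = allocate(tasks, deadline=6.0)
```

    The loop terminates because each switch moves a task to its faster placement, so no task switches twice.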

  15. Computational fluid dynamics principles and applications

    CERN Document Server

    Blazek, J

    2005-01-01

    Computational Fluid Dynamics (CFD) is an important design tool in engineering and also a substantial research tool in various physical sciences as well as in biology. The objective of this book is to provide university students with a solid foundation for understanding the numerical methods employed in today's CFD and to familiarise them with modern CFD codes by hands-on experience. It is also intended for engineers and scientists starting to work in the field of CFD or for those who apply CFD codes. Due to the detailed index, the text can serve as a reference handbook too. Each chapter includes an extensive bibliography, which provides an excellent basis for further studies. The accompanying companion website contains the sources of 1-D and 2-D Euler and Navier-Stokes flow solvers (structured and unstructured) as well as of grid generators. Provided are also tools for Von Neumann stability analysis of 1-D model equations. Finally, the companion website includes the source code of a dedicated visualisation so...

  16. Parallel evolutionary computation in bioinformatics applications.

    Science.gov (United States)

    Pinho, Jorge; Sobral, João Luis; Rocha, Miguel

    2013-05-01

    A large number of optimization problems within the field of Bioinformatics require methods able to handle its inherent complexity (e.g. NP-hard problems) and also demand increased computational efforts. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of its efficient execution on a wide range of parallel architectures. The proposed approach focuses on the easiness of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of parallelism related modules allows the user to easily configure its environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
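    The metaheuristic family this record builds on can be illustrated with a minimal evolutionary algorithm. This sketch is not ParJECoLi (a Java library); it evolves bit strings on the standard OneMax toy benchmark:

```python
# Minimal evolutionary algorithm: tournament selection, one-point
# crossover, and a single-bit mutation, maximizing the number of ones.
import random

def one_max(bits):
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=60, seed=42):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if one_max(a) >= one_max(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(n_bits)] ^= 1   # flip one random bit
            nxt.append(child)
        pop = nxt
    return max(pop, key=one_max)

best = evolve()
```

    Parallel variants (as in the record) typically distribute the fitness evaluations or evolve island subpopulations concurrently; the evaluation loop above is the natural place to parallelize.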

  17. Polymorphous Computing Architecture (PCA) Application Benchmark 1: Three-Dimensional Radar Data Processing

    National Research Council Canada - National Science Library

    Lebak, J

    2001-01-01

    The DARPA Polymorphous Computing Architecture (PCA) program is building advanced computer architectures that can reorganize their computation and communication structures to achieve better overall application performance...

  18. CERN Computing Colloquium | Scientific Databases at Scale and SciDB | 27 May

    CERN Multimedia

    2013-01-01

    by Dr. Michael Stonebraker (MIT - Massachusetts Institute of Technology - Cambridge MA, USA) Monday 27 May 2013 from 2 p.m. to 4 p.m. at CERN ( 222-R-001 - Filtration Plant ) Abstract: As a general rule, scientists have shunned relational data management systems (RDBMS), choosing instead to “roll their own” on top of file system technology.  We first discuss why file systems are a poor choice for science data storage, especially as data volumes become large and scalability becomes important. Then, we continue with the reasons why RDBMSs work poorly on most science applications.  These include a data model “impedance mismatch” and missing features. We discuss array DBMSs, and why they are a much better choice for science applications, and use SciDB as an exemplar of this new class of DBMSs. Most science applications require a mix of data management and complex analytics.  In most cases, the analytics entail a sequence of linear a...

  19. Deep Learning and Applications in Computational Biology

    KAUST Repository

    Zeng, Jianyang

    2016-01-26

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this work, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve the model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, tests on internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp.

  20. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G [Albuquerque, NM

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  1. Application of CAPEC Lipid Property Databases in the Synthesis and Design of Biorefinery Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Cunico, Larissa; Gani, Rafiqul

    Petroleum is currently the primary raw material for the production of fuels and chemicals. Consequently, our society is highly dependent on fossil non-renewable resources. However, renewable raw materials are recently receiving increasing interest for the production of chemicals and fuels, so a n...... of biorefinery networks. The objective of this work is to show the application of databases of physical and thermodynamic properties of lipid components to the synthesis and design of biorefinery networks.......]. The wide variety and complex nature of components in biorefineries poses a challenge with respect to the synthesis and design of these types of processes. Whereas physical and thermodynamic property data or models for petroleum-based processes are widely available, most data and models for biobased...

  2. MySQL/PHP web database applications for IPAC proposal submission

    Science.gov (United States)

    Crane, Megan K.; Storrie-Lombardi, Lisa J.; Silbermann, Nancy A.; Rebull, Luisa M.

    2008-07-01

    The Infrared Processing and Analysis Center (IPAC) is NASA's multi-mission center of expertise for long-wavelength astrophysics. Proposals for various IPAC missions and programs are ingested via MySQL/PHP web database applications. Proposers use web forms to enter coversheet information and upload PDF files related to the proposal. Upon proposal submission, a unique directory is created on the webserver into which all of the uploaded files are placed. The coversheet information is converted into a PDF file using a PHP extension called FPDF. The files are concatenated into one PDF file using the command-line tool pdftk and then forwarded to the review committee. This work was performed at the California Institute of Technology under contract to the National Aeronautics and Space Administration.
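    The ingestion flow in this record (unique directory per submission, coversheet generation, file gathering) can be sketched with the standard library alone. The real system uses PHP, FPDF and pdftk; the function and file names below are invented:

```python
# Hedged sketch of a proposal-submission ingest step: one unique directory
# per submission holding the coversheet and all uploaded files.
import os
import shutil
import tempfile
import uuid

def ingest_proposal(upload_paths, coversheet_text, base_dir):
    """Create a unique directory, write the coversheet, copy uploads."""
    prop_dir = os.path.join(base_dir, uuid.uuid4().hex)
    os.makedirs(prop_dir)
    with open(os.path.join(prop_dir, "coversheet.txt"), "w") as fh:
        fh.write(coversheet_text)
    for path in upload_paths:
        shutil.copy(path, prop_dir)
    return prop_dir

# Invented demo submission with one uploaded file.
base = tempfile.mkdtemp()
src = os.path.join(base, "spectra.pdf")
with open(src, "w") as fh:
    fh.write("dummy upload")
prop_dir = ingest_proposal([src], "PI: J. Doe\nTitle: Example", base)
```

    The production pipeline would additionally render the coversheet to PDF and concatenate it with the uploads (the pdftk step) before forwarding to reviewers.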

  3. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.
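    The provisioning decision this record describes, choosing a mix of HPC and cloud resources under deadline and budget constraints, can be sketched by brute-force enumeration. This is not the paper's framework; all rates and prices are invented:

```python
# Hedged sketch: enumerate node mixes, keep the cheapest mix that finishes
# the work before the deadline and within budget.
from itertools import product

def provision(work_units, deadline, budget,
              hpc_rate=10.0, hpc_cost=5.0,      # work/hour, $/hour per node
              cloud_rate=4.0, cloud_cost=1.0,
              max_nodes=8):
    best = None
    for h, c in product(range(max_nodes + 1), repeat=2):
        rate = h * hpc_rate + c * cloud_rate
        if rate == 0:
            continue
        hours = work_units / rate
        cost = hours * (h * hpc_cost + c * cloud_cost)
        if hours <= deadline and cost <= budget:
            if best is None or cost < best[2]:
                best = (h, c, cost, hours)
    return best  # (hpc_nodes, cloud_nodes, cost, hours) or None

best = provision(80, deadline=2.0, budget=50.0)
```

    With these invented prices the cheaper cloud nodes are preferred, topped up with just enough HPC capacity to meet the deadline, which mirrors the paper's point that clouds can complement traditional HPC.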

  4. First International Conference on Intelligent Computing and Applications

    CERN Document Server

    Kar, Rajib; Das, Swagatam; Panigrahi, Bijaya

    2015-01-01

    The idea of the 1st International Conference on Intelligent Computing and Applications (ICICA 2014) is to bring the Research Engineers, Scientists, Industrialists, Scholars and Students together from in and around the globe to present the on-going research activities and hence to encourage research interactions between universities and industries. The conference provides opportunities for the delegates to exchange new ideas, applications and experiences, to establish research relations and to find global partners for future collaboration. The proceedings covers latest progresses in the cutting-edge research on various research areas of Image, Language Processing, Computer Vision and Pattern Recognition, Machine Learning, Data Mining and Computational Life Sciences, Management of Data including Big Data and Analytics, Distributed and Mobile Systems including Grid and Cloud infrastructure, Information Security and Privacy, VLSI, Electronic Circuits, Power Systems, Antenna, Computational fluid dynamics & Hea...

  5. The BioFragment Database (BFDb): An open-data platform for computational chemistry analysis of noncovalent interactions

    Science.gov (United States)

    Burns, Lori A.; Faver, John C.; Zheng, Zheng; Marshall, Michael S.; Smith, Daniel G. A.; Vanommeslaeghe, Kenno; MacKerell, Alexander D.; Merz, Kenneth M.; Sherrill, C. David

    2017-10-01

    Accurate potential energy models are necessary for reliable atomistic simulations of chemical phenomena. In the realm of biomolecular modeling, large systems like proteins comprise very many noncovalent interactions (NCIs) that can contribute to the protein's stability and structure. This work presents two high-quality chemical databases of common fragment interactions in biomolecular systems as extracted from high-resolution Protein DataBank crystal structures: 3380 sidechain-sidechain interactions and 100 backbone-backbone interactions that inaugurate the BioFragment Database (BFDb). Absolute interaction energies are generated with a computationally tractable explicitly correlated coupled cluster with perturbative triples [CCSD(T)-F12] "silver standard" (0.05 kcal/mol average error) for NCI that demands only a fraction of the cost of the conventional "gold standard," CCSD(T) at the complete basis set limit. By sampling extensively from biological environments, BFDb spans the natural diversity of protein NCI motifs and orientations. In addition to supplying a thorough assessment for lower scaling force-field (2), semi-empirical (3), density functional (244), and wavefunction (45) methods (comprising >1M interaction energies), BFDb provides interactive tools for running and manipulating the resulting large datasets and offers a valuable resource for potential energy model development and validation.

  6. [Mobile phone-computer wireless interactive graphics transmission technology and its medical application].

    Science.gov (United States)

    Huang, Shuo; Liu, Jing

    2010-05-01

    Application of clinical digital medical imaging has raised many tough issues to tackle, such as data storage, management, and information sharing. Here we investigated a mobile phone based medical image management system which is capable of achieving personal medical imaging information storage, management and comprehensive health information analysis. The technologies related to the management system spanning the wireless transmission technology, the technical capabilities of phone in mobile health care and management of mobile medical database were discussed. Taking medical infrared images transmission between phone and computer as an example, the working principle of the present system was demonstrated.

  7. Proceedings of national symposium on computer applications in power plants

    International Nuclear Information System (INIS)

    1992-01-01

    The National Symposium on Computer Applications in Power Plants was organized to help promote exchange of views among scientists and engineers engaged in design, engineering, operation and maintenance of computer based systems in nuclear power plants, conventional power plants, heavy water plants, nuclear fuel cycle facilities and allied industries. About one hundred papers were presented at the Symposium. Those falling within the subject scope of INIS have been processed separately. (author)

  8. Computer language Mathsy and applications to solid state physics

    International Nuclear Information System (INIS)

    Peterson, G.; Budgor, A.B.

    1980-01-01

    The high-level interactive mathematics and graphics computer language, Mathsy, is discussed and demonstrated with sample applications. Mathsy is an interpretive, interactive, mathematical, array processing, and graphics system. Among its diverse uses in the laser fusion project at the Lawrence Livermore Laboratory, it has enabled the conceptualization of a new algorithm to compute electron or phonon density-of-states spectra that requires no root solving.

  9. Computational biomechanics for medicine from algorithms to models and applications

    CERN Document Server

    Joldes, Grand; Nielsen, Poul; Doyle, Barry; Miller, Karol

    2017-01-01

    This volume comprises the latest developments in both fundamental science and patient-specific applications, discussing topics such as: cellular mechanics; injury biomechanics; biomechanics of heart and vascular system; medical image analysis; and both patient-specific fluid dynamics and solid mechanics simulations. With contributions from researchers world-wide, the Computational Biomechanics for Medicine series of titles provides an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements.

  10. Application of the newly developed Japanese adenosine normal database for adenosine stress myocardial scintigraphy.

    Science.gov (United States)

    Harata, Shingo; Isobe, Satoshi; Morishima, Itsuro; Suzuki, Susumu; Tsuboi, Hideyuki; Sone, Takahito; Ishii, Hideki; Murohara, Toyoaki

    2015-10-01

    The currently available Japanese normal database (NDB) in stress myocardial perfusion scintigraphy recommended by the Japanese Society of Nuclear Medicine (JSNM-NDB) is created based on the data from exercise tests. The newly developed adenosine normal database (ADS-NDB) remains to be validated for patients undergoing adenosine stress test. We tested whether the diagnostic accuracy of adenosine stress test is improved by the use of ADS-NDB (Kanazawa University). Of 233 consecutive patients undergoing (99m)Tc-MIBI adenosine stress test, 112 patients were tested. The stress/rest myocardial (99m)Tc-MIBI single-photon emission computed tomography (SPECT) images were analyzed by AutoQUANT 7.2 with both ADS-NDB and JSNM-NDB. The summed stress score (SSS) and summed difference score (SDS) were calculated. The agreements of the post-stress defect severity between ADS-NDB and JSNM-NDB were assessed using a weighted kappa statistic. In all patients, mean SSSs of all, right coronary artery (RCA), left anterior descending (LAD), and left circumflex (LCx) territories were significantly lower with ADS-NDB than those with JSNM-NDB. Mean SDSs in all, RCA, and LAD territories were significantly lower with ADS-NDB than those with JSNM-NDB. In 28 patients with significant coronary stenosis, the mean SSS in the RCA territory was significantly lower with ADS-NDB than that with JSNM-NDB. In 84 patients without ischemia, both mean SSSs and SDSs in all, RCA, LAD, and LCx territories were significantly lower with ADS-NDB than those with JSNM-NDB. Weighted kappa values of all patients, patients with significant stenosis, and patients without ischemia were 0.89, 0.83, and 0.92, respectively. Differences were observed between results from ADS-NDB and JSNM-NDB. The diagnostic accuracy of adenosine stress myocardial perfusion scintigraphy may be improved by reducing false-positive results.
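    The agreement measure reported in this record, weighted kappa, follows a generic textbook formula. A minimal sketch with linear weights (an assumption; the study's weighting scheme and software are not specified here):

```python
# Linearly weighted Cohen's kappa for two raters scoring the same items
# on an ordinal scale 0..n_categories-1. Generic formula, illustrative only.

def weighted_kappa(a, b, n_categories):
    n = len(a)
    w = lambda i, j: abs(i - j) / (n_categories - 1)  # linear disagreement weight
    # observed weighted disagreement
    obs = sum(w(x, y) for x, y in zip(a, b)) / n
    # expected weighted disagreement under independent marginals
    pa = [sum(1 for x in a if x == k) / n for k in range(n_categories)]
    pb = [sum(1 for y in b if y == k) / n for k in range(n_categories)]
    exp = sum(pa[i] * pb[j] * w(i, j)
              for i in range(n_categories) for j in range(n_categories))
    return 1.0 - obs / exp

kappa_perfect = weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4)  # = 1.0
kappa_partial = weighted_kappa([0, 0, 1, 1], [0, 1, 1, 1], 2)
```

    Values near 0.9, as reported for ADS-NDB versus JSNM-NDB, indicate near-perfect agreement on this scale.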

  11. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    Science.gov (United States)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The root of the question is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers in regard to data organization are here shown through its application on El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist to conduct volcanic risk assessment and management.
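    The kind of relational structure this record describes, linking vents, eruptive events and monitoring data, can be sketched with an in-memory SQLite database. The table and column names below are invented for illustration and are not the actual VERDI schema:

```python
# Hypothetical minimal schema in the spirit of a volcanic-risk database:
# vents, eruptions tied to vents, and time-stamped monitoring records.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vent (
    vent_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    lon     REAL,
    lat     REAL
);
CREATE TABLE eruption (
    eruption_id INTEGER PRIMARY KEY,
    vent_id     INTEGER REFERENCES vent(vent_id),
    start_year  INTEGER,
    vei         INTEGER          -- volcanic explosivity index
);
CREATE TABLE monitoring_record (
    record_id   INTEGER PRIMARY KEY,
    vent_id     INTEGER REFERENCES vent(vent_id),
    recorded_at TEXT,
    parameter   TEXT,            -- e.g. seismicity, deformation
    value       REAL
);
""")
conn.execute("INSERT INTO vent VALUES (1, 'example_vent', -18.0, 27.7)")
conn.execute("INSERT INTO eruption VALUES (1, 1, 2011, 2)")
rows = conn.execute(
    "SELECT v.name, e.start_year FROM vent v JOIN eruption e USING (vent_id)"
).fetchall()
```

    Keeping spatial, temporal and socio-economic attributes in linked tables is what allows the probabilistic hazard and impact analyses the record mentions to query one consistent source.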

  12. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Kothe, Douglas B [ORNL; Nam, Hai Ah [ORNL

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  13. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    International Nuclear Information System (INIS)

    Joubert, Wayne; Kothe, Douglas B.; Nam, Hai Ah

    2009-01-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  14. Comparison of the thermodynamic databases for radioactive elements in application to the calculation of the solubilities in the porewater

    International Nuclear Information System (INIS)

    Doi, Reisuke; Shibata, Masahiro

    2006-07-01

    To calculate the solubility of radioactive elements, an important parameter for the performance assessment of a geological disposal system, the thermodynamic database must be reliable and based on the latest information. In this study, the solubilities of representative radioactive elements in the porewater compositions of compacted bentonite set up in the second progress report (H12) were calculated and compared using the thermodynamic databases of JNC, OECD/NEA and Nagra/PSI. The causes of the differences among the results obtained from the different databases were investigated and discussed. (author)

  15. Geometric computations with interval and new robust methods applications in computer graphics, GIS and computational geometry

    CERN Document Server

    Ratschek, H

    2003-01-01

    This undergraduate and postgraduate text will familiarise readers with interval arithmetic and related tools to gain reliable and validated results and logically correct decisions for a variety of geometric computations plus the means for alleviating the effects of the errors. It also considers computations on geometric point-sets, which are neither robust nor reliable in processing with standard methods. The authors provide two effective tools for obtaining correct results: (a) interval arithmetic, and (b) ESSA the new powerful algorithm which improves many geometric computations and makes th...
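    Tool (a) from this record, interval arithmetic for reliable geometric decisions, can be sketched briefly. Directed (outward) rounding is omitted here for brevity, so this is a toy, not a validated implementation:

```python
# Minimal interval arithmetic and a guarded geometric predicate: the sign
# of a 2x2 determinant (orientation test). When the result interval
# straddles zero, the sign is "unsure" and an exact method such as ESSA
# (exact sign of a sum) would be needed instead.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))

def orientation(ax, ay, bx, by):
    """Sign of det([[ax, ay], [bx, by]]) with interval coordinates."""
    det = ax * by + Interval(-1, -1) * ay * bx
    if det.lo > 0:
        return "left"
    if det.hi < 0:
        return "right"
    return "unsure"

exact = orientation(Interval(1, 1), Interval(0, 0),
                    Interval(0, 0), Interval(1, 1))
fuzzy = orientation(Interval(0.9, 1.1), Interval(0.9, 1.1),
                    Interval(0.9, 1.1), Interval(0.9, 1.1))
```

    The "unsure" outcome is the point of the approach: the interval computation never silently returns a wrong sign, it flags exactly the cases that need an exact fallback.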

  16. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  17. Fluid dynamics applications of the Illiac IV computer

    Science.gov (United States)

    Maccormack, R. W.; Stevens, K. G., Jr.

    1976-01-01

    The Illiac IV is a parallel-structure computer with computing power an order of magnitude greater than that of conventional computers. It can be used to simulate fluid dynamics experiments more economically, to simulate flows that cannot be studied by experiment, and to combine computer and experimental simulations. The architecture of the Illiac IV is described, and its parallel operation is demonstrated on the example of solving the one-dimensional wave equation. For fluid dynamics problems, a special FORTRAN-like vector programming language, called CFD language, was devised. Two applications are described in detail: (1) the determination of the flowfield around the space shuttle, and (2) the computation of transonic turbulent separated flow past a thick biconvex airfoil.
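
    The one-dimensional wave equation example lends itself to a short sketch. The following explicit leapfrog update (our illustration in Python, not the paper's CFD-language code) shows why the problem maps onto a parallel machine: every interior grid point is updated independently within a time step.

```python
# One leapfrog step of u_tt = c^2 u_xx on a 1D grid with fixed endpoints.
# Each interior point depends only on values from previous time levels,
# so all points can be updated simultaneously -- the data parallelism a
# machine like the Illiac IV exploits.

def wave_step(u_prev, u_curr, r2):
    """Advance one time step; r2 = (c * dt / dx) ** 2 (stable for r2 <= 1)."""
    n = len(u_curr)
    u_next = [0.0] * n
    for i in range(1, n - 1):   # each i is independent -> parallelizable
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next
```

    On the Illiac IV, the loop over i became a single vector operation over the processing elements; on a serial machine it is just a loop.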

  18. Reversible logic synthesis methodologies with application to quantum computing

    CERN Document Server

    Taha, Saleem Mohammed Ridha

    2016-01-01

    This book opens the door to the interesting and ambitious world of reversible and quantum computing research. It presents the state of the art required to travel around that world safely. Top universities, companies and government institutions worldwide are in a race to develop new methodologies, algorithms and circuits for reversible logic, quantum logic, reversible and quantum computing, and nanotechnologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single volume, together with some new proposals. Sequential reversible logic circuitry is also discussed for the first time in book form. Reversible logic plays an important role in quantum computing: any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing. A new implementation of wavelet and multiwavelet transforms using quantum computing is performed for this purpose. Rese...

  19. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  20. Introduction to computational mass transfer with applications to chemical engineering

    CERN Document Server

    Yu, Kuo-Tsung

    2017-01-01

    This book offers an easy-to-understand introduction to the computational mass transfer (CMT) method. Building on the contents of the first edition, this new edition adds the following material. It describes the successful application of the method to the simulation of the mass transfer process in a fluidized bed, as well as recent investigations of and computing methods for predicting the multi-component mass transfer process. It also examines general issues concerning computational methods for simulating the mass transfer of the rising bubble process. The new edition has been reorganized by moving the preparatory materials for Computational Fluid Dynamics (CFD) and Computational Heat Transfer into appendices, adding new chapters, and including three new appendices on, respectively, a generalized representation of the two-equation model for CMT, derivation of the equilibrium distribution function in the lattice-Boltzmann method, and derivation of the Navier-S...

  1. Application of computer data processing of well logging in Azerbaijan

    International Nuclear Information System (INIS)

    Vorob'ev, Yu.A.; Shilov, G.Ya.; Samedova, A.S.

    1989-01-01

    The transition from manual quantitative interpretation of well-logging study (WLS) materials to computer processing in the production association (PA) Azneftegeologiya is described. WLS materials were processed manually in the PA until 1986. Later, interpretation was conducted with computers in order to determine the clayiness, porosity, oil and gas saturation, and fluid content of strata. Examples of the presentation of results of computer interpretation of WLS data (including gamma-logging and neutron-gamma-logging) for determining the porosity and oil saturation of sandy mudrocks are given

  2. Application of chaos and fractals to computer vision

    CERN Document Server

    Farmer, Michael E

    2014-01-01

    This book provides a thorough investigation of the application of chaos theory and fractal analysis to computer vision. The field of chaos theory has been studied in dynamical physical systems, and has been very successful in providing computational models for very complex problems ranging from weather systems to neural pathway signal propagation. Computer vision researchers have derived motivation for their algorithms from biology and physics for many years as witnessed by the optical flow algorithm, the oscillator model underlying graphical cuts and of course neural networks. These algorithm

  3. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in Graphics Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of the Compute Unified Device Architecture (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general-purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. The performance improvement of the GPU implementation over a serial CPU implementation is then discussed.
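
    A serial reference version of one of the two schemes makes the parallelization opportunity concrete. This sketch (ours, not the authors' CUDA code) performs one explicit time step of 2D heat diffusion; in a CUDA kernel, each (i, j) update would be assigned to its own thread, since no point depends on another within the step.

```python
# Serial reference for a 2D explicit heat-diffusion stencil.
# T_t = alpha * (T_xx + T_yy) on a unit grid; boundary values held fixed.
# Every (i, j) update reads only the previous time level, so all updates
# are independent -- the property a GPU implementation exploits.

def heat_step(T, alpha):
    """One explicit step; alpha <= 0.25 for stability on a unit grid."""
    ny, nx = len(T), len(T[0])
    T_new = [row[:] for row in T]            # copy; boundaries stay fixed
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            T_new[i][j] = T[i][j] + alpha * (T[i + 1][j] + T[i - 1][j]
                                             + T[i][j + 1] + T[i][j - 1]
                                             - 4.0 * T[i][j])
    return T_new
```

    Mapping this to CUDA amounts to replacing the two loops with a 2D thread grid and double-buffering the two time levels in device memory.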

  4. Artificial intelligence program in a computer application supporting reactor operations

    International Nuclear Information System (INIS)

    Stratton, R.C.; Town, G.G.

    1985-01-01

    Improving nuclear reactor power plant operability is an ever-present concern for the nuclear industry. The definition of plant operability involves a complex interaction of the ideas of reliability, safety, and efficiency. This paper presents observations concerning the issues involved and the benefits derived from the implementation of a computer application which combines traditional computer applications with artificial intelligence (AI) methodologies. A system, the Component Configuration Control System (CCCS), is being installed to support nuclear reactor operations at the Experimental Breeder Reactor II

  5. [Cardiac computed tomography: new applications of an evolving technique].

    Science.gov (United States)

    Martín, María; Corros, Cecilia; Calvo, Juan; Mesa, Alicia; García-Campos, Ana; Rodríguez, María Luisa; Barreiro, Manuel; Rozado, José; Colunga, Santiago; de la Hera, Jesús M; Morís, César; Luyando, Luis H

    2015-01-01

    In recent years we have witnessed increasing development of imaging techniques applied in cardiology. Among them, cardiac computed tomography is an emerging and evolving technique. With the current possibility of very-low-radiation studies, its applications have expanded and now go beyond coronary angiography. In the present article we review the technical developments of cardiac computed tomography and its new applications. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  6. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  7. Application of Database Approaches to the Study of Earth's Aeolian Environments: Community Needs and Goals

    Science.gov (United States)

    Scuderi, Louis A.; Weissmann, Gary S.; Hartley, Adrian J.; Yang, Xiaoping; Lancaster, Nicholas

    2017-08-01

    Aeolian science is faced with significant challenges that impact its ability to benefit from recent advances in information technology. The discipline deals with high-end systems in the form of ground- and satellite-based sensors, computer modeling and simulation, and wind tunnel experiments. Aeolian scientists also collect field data manually with observational methods that may differ significantly between studies, with little agreement on even basic morphometric parameters and terminology. Data produced from these studies, while forming the core of research papers and reports, are rarely available to the community at large. Recent advances are also superimposed on an underlying semantic structure that dates to the 1800s or earlier and is confusing, with ambiguously defined, and at times even contradictory, meanings. The aeolian "world-view" does not always fit within neat increments, nor is it defined by crisp objects. Instead, change is continuous and features are fuzzy. Development of an ontological framework to guide spatiotemporal research is the fundamental starting point for organizing data in aeolian science. This requires a "rethinking" of how we define, collect, process, store and share data, along with the development of a community-wide collaborative approach designed to bring the discipline into a data-rich future. There is also a pressing need to develop efficient methods to integrate, analyze and manage spatial and temporal data, and to promote data produced by aeolian scientists so it is available for preparing diagnostic studies, as input into a range of environmental models, and for advising national and international bodies that drive research agendas. This requires the establishment of working groups within the discipline to deal with content, format, processing pipelines, knowledge discovery tools and database access issues unique to aeolian science. Achieving this goal requires the development of comprehensive and highly organized databases, tools

  8. Are ‘Agent’ Exclusion Clauses a Legitimate Application of the EU Database Directive?

    Directory of Open Access Journals (Sweden)

    Jimi Groom

    2004-03-01

    This article explores the implications of the implementation of the European Database Directive in the area of autonomous agents and the use of exclusion tools on the part of database owners to stop agents accessing their works.

  9. Are Agent Exclusion Clauses a Legitimate Application of the EU Database Directive?

    OpenAIRE

    Jimi Groom

    2004-01-01

    This article explores the implications of the implementation of the European Database Directive in the area of autonomous agents and the use of exclusion tools on the part of database owners to stop agents accessing their works.

  10. A PC/workstation cluster computing environment for reservoir engineering simulation applications

    International Nuclear Information System (INIS)

    Hermes, C.E.; Koo, J.

    1995-01-01

    Like the rest of the petroleum industry, Texaco has been transferring its applications and databases from mainframes to PCs and workstations. This transition has been very positive because it provides an environment for integrating applications, increases end-user productivity, and in general reduces overall computing costs. On the down side, the transition typically results in a dramatic increase in workstation purchases and raises concerns regarding the cost and effective management of computing resources in this new environment. The workstation transition also places the user in a Unix computing environment which, to say the least, can be quite frustrating to learn and to use. This paper describes the approach, philosophy, architecture, and current status of the new reservoir engineering/simulation computing environment developed at Texaco's E and P Technology Dept. (EPTD) in Houston. The environment is representative of those under development at several other large oil companies and is based on a cluster of IBM and Silicon Graphics Intl. (SGI) workstations connected by a fiber-optic communications network and engineering PCs connected to local area networks, or Ethernets. Because computing resources and software licenses are shared among a group of users, the new environment enables the company to get more out of its investments in workstation hardware and software

  11. Design of multi-tiered database application based on CORBA component in SDUV-FEL system

    International Nuclear Information System (INIS)

    Sun Xiaoying; Shen Liren; Dai Zhimin

    2004-01-01

    The drawbacks of the usual two-tiered database architecture are analyzed, and the Shanghai Deep Ultraviolet Free-Electron Laser database system under development is discussed. A project for realizing a multi-tiered database architecture based on a common object request broker architecture (CORBA) component and a middleware model constructed in C++ is presented. A magnet database is given to exhibit the design of the CORBA component. (authors)
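
    The multi-tier idea can be illustrated with a hypothetical sketch (all names below are ours, not the SDUV-FEL system's): clients call a narrow middle-tier service, which is the role the CORBA component plays in the paper, and only that tier touches the data store.

```python
# Three-tier sketch: client -> middleware service -> data store.
# The client never touches the store directly; business rules live in
# the middle tier, which is what the CORBA component encapsulates.

class MagnetStore:                      # data tier (stand-in for the real DB)
    def __init__(self):
        self._rows = {}

    def put(self, name, current):
        self._rows[name] = current

    def get(self, name):
        return self._rows[name]

class MagnetService:                    # middle tier (the "component")
    def __init__(self, store):
        self._store = store

    def set_current(self, name, amps):
        if amps < 0:                    # rule enforced here, not in clients
            raise ValueError("current must be non-negative")
        self._store.put(name, amps)

    def read_current(self, name):
        return self._store.get(name)

# client tier
service = MagnetService(MagnetStore())
service.set_current("Q1", 12.5)
```

    In the real system the middle tier is a CORBA object invoked over the network, so many clients share one validated path to the database.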

  12. The acceptability of computer applications to group practices.

    Science.gov (United States)

    Zimmerman, J; Gordon, R S; Tao, D K; Boxerman, S B

    1978-01-01

    Of the 72 identified group practices in a midwest urban environment, 39 were found to use computers. The practices had been influenced strongly by vendors in their selection of an automated system or service, and had usually spent less than a work-month analyzing their needs and reviewing alternate ways in which those needs could be met. Ninety-seven percent of the practices had some financial applications and 64% had administrative applications, but only 2.5% had medical applications. For half the practices at least 2 months elapsed from the time the automated applications were put into operation until they were considered to be integrated into the office routine. Advantages experienced by at least a third of the practices using computers were that the work was done faster, information was more readily available, and costs were reduced. The most common disadvantage was inflexibility. Most (89%) of the practices believed that automation was preferable to their previous manual system.

  13. Discover knowledge in databases: Mining of data and applications; Descubrir conocimiento en bases de datos: Mineria de datos y aplicaciones

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Martinez, Andres F [Instituto de Investigaciones Electricas, Temixco, Morelos (Mexico); Morales Manzanares, Eduardo [Instituto Tecnologico de Estudios Superiores de Monterrey (ITESM), Campus Cuernavaca, Morelos (Mexico)

    2000-07-01

    In recent years there has been enormous growth in the capacity to generate and store information, due to the increasing automation of processes in general and to advances in information storage capacity. Unfortunately, information analysis techniques have not shown equivalent development, so there is a need for a new generation of computing techniques and tools that can assist decision makers in the automatic and intelligent analysis of large volumes of information. Finding useful knowledge among great amounts of data is the main objective of the area of knowledge discovery in databases. The objectives of this article are to disseminate the process of knowledge discovery in databases in general and the concept of data mining in particular; to establish the relation that exists between the process of knowledge discovery in databases and data mining; and to set out the characteristics and complexities of searching for useful patterns in data. The main data mining methods are also described, together with the application areas where these algorithms have had the greatest success.
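
    A toy instance of searching for useful patterns in data is counting frequent item pairs across transactions, the counting step underlying association-rule miners such as Apriori (illustrative only):

```python
# Count co-occurring item pairs across transactions and keep those that
# meet a minimum support threshold -- the core counting step behind
# association-rule mining.

from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}
```

    Real miners prune the search space level by level instead of enumerating all pairs, but the support-counting idea is the same.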

  14. International Conference on Soft Computing Techniques and Engineering Application

    CERN Document Server

    Li, Xiaolong

    2014-01-01

    The main objective of ICSCTEA 2013 is to provide a platform for researchers, engineers and academicians from all over the world to present their research results and development activities in soft computing techniques and engineering application. This conference provides opportunities for them to exchange new ideas and application experiences face to face, to establish business or research relations and to find global partners for future collaboration.

  15. Computer-aided diagnosis workstation and database system for chest diagnosis based on multi-helical CT images

    Science.gov (United States)

    Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou

    2006-03-01

    Multi-helical CT scanners have advanced remarkably in the speed at which chest CT images are acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health in two or more regions, using a Virtual Private Network router, a biometric fingerprint authentication system and a biometric face authentication system for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed, diagnostic accuracy and the safety of medical information.

  16. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    Images used in medicine were standardized in 1993 by the DICOM (Digital Imaging and Communications in Medicine) standard. Many examinations use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such software applications are not usually free and open-source, and this fact hinders their adjustment to the most diverse interests. Our aim was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple and kappa agreement statistics. The agreements observed between the software applications were generally classified as substantial or almost perfect in most comparisons. ImageLab agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions; agreement for lesions of 70% in the ADA was lower, but this was also observed when the anatomical reference standard was used.
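
    The simple and kappa agreement statistics used in the study follow standard formulas; here is a minimal sketch (our implementation of the textbook formulas, not ImageLab code):

```python
# Simple agreement is the fraction of identical ratings; Cohen's kappa
# corrects it for agreement expected by chance:
#   kappa = (p_o - p_e) / (1 - p_e)
# (undefined when p_e == 1, i.e. both raters always use one category).

from collections import Counter

def agreement(ratings_a, ratings_b):
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    pa, pb = Counter(ratings_a), Counter(ratings_b)
    expected = sum((pa[c] / n) * (pb[c] / n) for c in pa.keys() | pb.keys())
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa
```

    Values of kappa above roughly 0.6 are conventionally read as "substantial" and above 0.8 as "almost perfect", which is the terminology the abstract uses.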

  17. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap
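
    A hedged sketch of what one XML implant record might look like (the schema below is invented for illustration and is not the paper's actual format), built with the Python standard library:

```python
# Build a minimal XML implant record: an <implant> element with a name
# and a <geometry> block carrying dimensions with explicit units.
# All element and attribute names here are hypothetical.

import xml.etree.ElementTree as ET

def implant_record(implant_id, name, length_mm, diameter_mm):
    implant = ET.Element("implant", id=implant_id)
    ET.SubElement(implant, "name").text = name
    geom = ET.SubElement(implant, "geometry")
    ET.SubElement(geom, "length", unit="mm").text = str(length_mm)
    ET.SubElement(geom, "diameter", unit="mm").text = str(diameter_mm)
    return ET.tostring(implant, encoding="unicode")
```

    Standardizing on such a machine-readable format is what lets planning and navigation systems ingest manufacturer updates without manual re-entry.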

  18. Identifying Social Impacts in Product Supply Chains:Overview and Application of the Social Hotspot Database

    Directory of Open Access Journals (Sweden)

    Gregory Norris

    2012-08-01

    One emerging tool to measure social-related impacts in supply chains is Social Life Cycle Assessment (S-LCA), a derivative of the well-established environmental LCA technique. LCA has recently started to gain popularity among large corporations and initiatives such as The Sustainability Consortium and the Sustainable Apparel Coalition; both have made the technique a cornerstone of their applied-research programs. The Social Hotspots Database (SHDB) is an overarching, global database that eases the data collection burden in S-LCA studies. Proposed "hotspots" are production activities or unit processes (also defined as country-specific sectors) in the supply chain that may be at risk for social issues to be present. The SHDB enables efficient application of S-LCA by allowing users to prioritize production activities for which site-specific data collection is most desirable. Data for three criteria are used to inform prioritization: (1) labor intensity in worker hours per unit process; (2) risk for, or opportunity to affect, relevant social themes or sub-categories related to Human Rights, Labor Rights and Decent Work, Governance, and Access to Community Services; and (3) the gravity of a social issue. The Worker Hours Model was developed using a global input/output economic model and wage rate data. Nearly 200 reputable sources of statistical data have been used to develop 20 Social Theme Tables by country and sector. This paper presents an overview of the SHDB development and features, as well as results from a pilot study conducted on strawberry yogurt. This study, one of seven Social Scoping Assessments mandated by The Sustainability Consortium, identifies the potential social hotspots existing in the supply chain of strawberry yogurt. With this knowledge, companies that manufacture or sell yogurt can refine their data collection efforts in order to put their social responsibility performance in perspective and effectively set up programs and
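
    The worker-hours criterion can be sketched with a deliberately simplified calculation. The real Worker Hours Model derives labor payments from a global input/output economic model; in this illustration the labor share and wage rate are simply assumed inputs:

```python
# Simplified worker-hours estimate for a unit process:
# labor payments = output value * labor share of value added,
# worker hours   = labor payments / hourly wage.
# Both labor_share and hourly_wage_usd are assumed, illustrative inputs.

def worker_hours(process_output_usd, labor_share, hourly_wage_usd):
    """Estimated worker hours embodied in a unit process's output."""
    labor_payments = process_output_usd * labor_share
    return labor_payments / hourly_wage_usd
```

    Processes with high worker hours and high social risk are then flagged as hotspots where site-specific data collection pays off most.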

  19. The Swedish Family-Cancer Database: Update, Application to Colorectal Cancer and Clinical Relevance

    Directory of Open Access Journals (Sweden)

    Hemminki Kari

    2005-01-01

    The Swedish Family-Cancer Database has been used for almost 10 years in the study of familial risks at all common sites. In the present paper we describe some main features of version VI of this Database, assembled in 2004. This update included all Swedes born in 1932 and later (offspring) with their biological parents, a total of 10.5 million individuals. Cancer cases were retrieved from the Swedish Cancer Registry from 1958-2002, including over 1.2 million first and multiple primary cancers and in situ tumours. Compared to previous versions, only 6.0% of deceased offspring with a cancer diagnosis lack any parental information. We show one application of the Database in the study of familial risks in colorectal adenocarcinoma, with defined age-group and anatomic-site-specific analyses. Familial standardized incidence ratios (SIRs) were determined for offspring when a parent or sibling was diagnosed with colon or rectal cancer. As a novel finding, risks for siblings were shown to be higher than those for offspring of affected parents. The excess risk was limited to colon cancer, and particularly to right-sided colon cancer. The SIRs for colon cancer in age-matched populations were 2.58 when parents were probands and 3.81 when siblings were probands; for right-sided colon cancer the SIRs were 3.66 and 7.53, respectively. Thus the familial excess (SIR - 1.00) was more than twofold higher for right-sided colon cancer. Colon and rectal cancers appeared to be distinguished by high-penetrant and recessive conditions that only affect the colon, whereas low-penetrant familial effects are shared by the two sites. Epidemiological studies can be used to generate clinical estimates for familial risk, conditioned on the number of affected family members and their ages of onset. Useful risk estimates have been developed for familial breast and prostate cancers. Reliable risk estimates for other cancers should also be seriously considered for
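
    A standardized incidence ratio is simply observed cases divided by the cases expected from a reference incidence rate; a minimal sketch with made-up numbers:

```python
# SIR = observed cases / expected cases, where expected cases come from
# applying a reference population's incidence rate to the person-years
# at risk in the study group. SIR > 1 indicates excess familial risk.

def sir(observed, person_years, reference_rate):
    """Standardized incidence ratio for a group followed for person_years."""
    expected = person_years * reference_rate
    return observed / expected
```

    In the Database studies the expected counts are computed within strata (age, sex, period) and summed, but the ratio itself has exactly this form.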

  20. The European power plant infrastructure-Presentation of the Chalmers energy infrastructure database with applications

    International Nuclear Information System (INIS)

    Kjaerstad, Jan; Johnsson, Filip

    2007-01-01

    This paper presents a newly established database of the European power plant infrastructure (power plants, fuel infrastructure, fuel resources and CO2 storage options) for the EU25 member states (MS) and applies the database in a general discussion of the European power plant and natural gas infrastructure, as well as in a simple simulation analysis of British and German power generation up to the year 2050 with respect to phase-out of existing generation capacity, fuel mix and fuel dependency. The results are discussed with respect to the age structure of the current production plants, CO2 emissions, natural gas dependency, and CO2 capture and storage (CCS) under stringent CO2 emission constraints. The analysis of the information from the power plant database, which includes planned projects, shows large variations in power plant infrastructure between the MS and a clear shift to natural gas-fuelled power plants during the last decade. The data indicates that this shift may continue in the short term up to 2010, since the majority of planned plants are natural gas fired. The gas plants are, however, geographically concentrated in southern and northwest Europe. The data also shows large activity in the upstream gas sector to accommodate the ongoing shift to gas, with pipelines, liquefaction plants and regasification terminals being built and gas fields being prepared for production. At the same time, utilities are integrating upwards in the fuel chain in order to secure supply, while oil and gas companies are moving down the fuel chain to secure access to markets. However, it is not yet possible to state whether the ongoing shift to natural gas will continue in the medium term, i.e. after 2010, since this will depend on a number of factors. There have also recently been announcements of the construction of a number of new coal plants. The results of the simulations for the German and British power sector show that combination of a relatively low

  1. Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

    Science.gov (United States)

    Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M

    2014-04-01

    Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining the high-quality patient care promoted by professional societies in radiology and by accreditation organizations such as the American College of Radiology (ACR) and the Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open-source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of the information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
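    The reporting logic the article describes can be sketched with SQLite standing in for the MySQL backend; the table, column names and sample rows here are illustrative, not the authors' actual RIS-derived schema:

    ```python
    import sqlite3

    # Hypothetical procedure-log schema; the article's actual RIS fields differ.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE procedure_log (
        accession    TEXT PRIMARY KEY,
        modality     TEXT,      -- 'US' or 'CT'
        operator     TEXT,
        complication INTEGER,   -- 1 if a complication was recorded
        diagnostic   INTEGER    -- 1 if the specimen yielded a diagnosis
    )""")
    rows = [
        ("A001", "US", "reader1", 0, 1),
        ("A002", "CT", "reader1", 1, 1),
        ("A003", "CT", "reader2", 0, 0),
        ("A004", "US", "reader2", 0, 1),
    ]
    conn.executemany("INSERT INTO procedure_log VALUES (?,?,?,?,?)", rows)

    def outcome_report(by="operator"):
        """Complication rate and diagnostic yield (in %) grouped by site or operator."""
        cur = conn.execute(f"""
            SELECT {by},
                   100.0 * AVG(complication) AS complication_rate_pct,
                   100.0 * AVG(diagnostic)   AS diagnostic_yield_pct,
                   COUNT(*)                  AS n
            FROM procedure_log GROUP BY {by}""")
        return {r[0]: {"complication_rate": r[1], "diagnostic_yield": r[2], "n": r[3]}
                for r in cur}

    report = outcome_report("operator")
    ```

    Grouping by `operator` or by a site column is what enables the per-operator real-time reports the abstract mentions.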

  2. Cloud-Based NoSQL Open Database of Pulmonary Nodules for Computer-Aided Lung Cancer Diagnosis and Reproducible Research.

    Science.gov (United States)

    Ferreira Junior, José Raniery; Oliveira, Marcelo Costa; de Azevedo-Marques, Paulo Mazzoncini

    2016-12-01

    Lung cancer is the leading cause of cancer-related deaths in the world, and its main manifestation is the pulmonary nodule. Detection and classification of pulmonary nodules are challenging tasks that must be done by qualified specialists, but image interpretation errors make those tasks difficult. To aid radiologists in these hard tasks, it is important to integrate computer-based tools with the lesion detection, pathology diagnosis, and image interpretation processes. However, computer-aided diagnosis research faces the problem of not having enough shared medical reference data for the development, testing, and evaluation of computational methods for diagnosis. To minimize this problem, this paper presents a public nonrelational, document-oriented, cloud-based database of pulmonary nodules characterized by 3D texture attributes, identified by experienced radiologists and classified by the same specialists according to nine subjective characteristics. Our goal with the development of this database is to improve computer-aided lung cancer diagnosis and pulmonary nodule detection and classification research through the deployment of this database in a cloud Database as a Service framework. Pulmonary nodule data were provided by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), image descriptors were acquired by volumetric texture analysis, and the database schema was developed using a document-oriented Not only Structured Query Language (NoSQL) approach. The proposed database now contains 379 exams, 838 nodules, and 8237 images, of which 4029 are CT scans and 4208 are manually segmented nodules, and it is hosted in a MongoDB instance on a cloud infrastructure.
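    A document-oriented nodule record of the kind described above can be sketched as follows; the field names and the dot-notation filter are illustrative stand-ins for an actual MongoDB schema and query, not the paper's schema:

    ```python
    # Illustrative document for one nodule; field names are assumptions. A document
    # store such as MongoDB holds records like this directly, without a fixed
    # relational schema, which is why new attributes can be added per document.
    nodule_doc = {
        "exam_id": "LIDC-0001",
        "nodule_id": 1,
        "texture_3d": {"energy": 0.12, "entropy": 4.8, "contrast": 310.5},
        "radiologist_ratings": {"malignancy": 4, "spiculation": 2},
        "segmentation": {"slices": 12, "voxels": 4096},
    }

    def match(doc, query):
        """Minimal MongoDB-style dot-notation filter over nested documents."""
        for path, expected in query.items():
            value = doc
            for key in path.split("."):
                value = value[key]
            if value != expected:
                return False
        return True

    collection = [nodule_doc]
    hits = [d for d in collection if match(d, {"radiologist_ratings.malignancy": 4})]
    ```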

  3. Web application for monitoring mainframe computer, Linux operating systems and application servers

    OpenAIRE

    Dimnik, Tomaž

    2016-01-01

    This work presents the idea and realization of a web application for monitoring the operation of a mainframe computer, servers running the Linux operating system, and application servers. The web application is intended for the administrators of these systems, as an aid to better understanding the current state, load and operation of the individual components of the server systems.

  4. The Poor Man's Guide to Computer Networks and their Applications

    DEFF Research Database (Denmark)

    Sharp, Robin

    2003-01-01

    These notes for DTU course 02220, Concurrent Programming, give an introduction to computer networks, with focus on the modern Internet. Basic Internet protocols such as IP, TCP and UDP are presented, and two Internet application protocols, SMTP and HTTP, are described in some detail. Techniques...

  5. Graphing and Percentage Applications Using the Personal Computer.

    Science.gov (United States)

    Innes, Jay

    1985-01-01

    The paper describes how "IBM Graphing Assistant" and "Apple Softgraph" can foster a multifaceted approach to application of mathematical concepts and how a survey can be undertaken using the computer as word processor, data bank, and source of visual displays. Mathematical skills reinforced include estimating, rounding, graphing, and solving…

  6. Evaluating the Effectiveness of Computer Applications in Developing English Learning

    Science.gov (United States)

    Whitaker, James Todd

    2016-01-01

    I examined the effectiveness of self-directed learning and English learning with computer applications on college students in Bangkok, Thailand, in a control-group experimental-group pretest-posttest design. The hypothesis was tested using a t test: two-sample assuming unequal variances to establish the significance of mean scores between the two…

  7. Use of Computer-Generated Holograms in Security Hologram Applications

    Directory of Open Access Journals (Sweden)

    Bulanovs A.

    2016-10-01

    The article discusses the use of computer-generated holograms (CGHs) as one of the security features in relief-phase protective holograms. An improved method of calculating CGHs is presented, based on a ray-tracing approach for the case of interference of parallel rays.

  8. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  9. Recent developments and applications in mathematics and computer science

    International Nuclear Information System (INIS)

    Churchhouse, R.F.; Tahir Shah, K.; Zanella, P.

    1991-01-01

    The book contains 8 invited lectures and 4 short seminars presented at the College on Recent Developments and Applications in Mathematics and Computer Science held in Trieste from 7 May to 1 June 1990. A separate abstract was prepared for each paper. Refs, figs and tabs

  10. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.
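    The multiple-scale-factor idea can be illustrated with a short sketch; the mode classes and numerical factors below are placeholders, not the values used in PAHdb:

    ```python
    # Mode-dependent scale factors align computed harmonic frequencies with
    # experimental fundamentals. These factors are invented for illustration;
    # PAHdb derives its own per-basis-set, per-mode values.
    SCALE = {"CH_stretch": 0.96, "CC_stretch": 0.97, "default": 0.98}

    def scale_spectrum(modes):
        """modes: list of (harmonic_frequency_cm1, mode_type) tuples.

        Returns the same list with each frequency multiplied by the scale
        factor for its mode type (or the default factor)."""
        return [(f * SCALE.get(kind, SCALE["default"]), kind) for f, kind in modes]

    scaled = scale_spectrum([(3150.0, "CH_stretch"),
                             (1650.0, "CC_stretch"),
                             (900.0, "oop_bend")])
    ```

    The point of multiple factors is visible here: the C-H stretch region is corrected more strongly than the out-of-plane bends, which a single global factor cannot do.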

  11. Design of the system of maintenance operations occupational safety and health database application of nuclear power station

    International Nuclear Information System (INIS)

    Wang Xuehong; Li Xiangyang; Ye Yongjun

    2011-01-01

    Based on the KKS codes of building equipment in a nuclear power station, this paper introduces a method for establishing an application system built on a maintenance-operation occupational safety and health database. Through this application system, all kinds of hazardous factors in the maintenance operations of a nuclear power station can be systematically summarized, making it convenient for staff to learn the hazardous factors of maintenance operations and the corresponding prevention measures, thereby realizing the management concept of 'precaution crucial, continuous improvement' advocated by OSHMS. (authors)
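    A minimal sketch of such a KKS-keyed hazard lookup follows; the equipment codes, hazards and measures are invented for illustration and are not from the paper's database:

    ```python
    # Hypothetical mapping from a KKS equipment code to recorded hazardous
    # factors and prevention measures; all entries are illustrative only.
    HAZARD_DB = {
        "1LAC10AP001": {  # a pump, as an illustrative KKS code
            "hazards": ["rotating machinery", "high-temperature surfaces"],
            "measures": ["lockout/tagout before maintenance", "wear thermal gloves"],
        },
        "1KBA20AA101": {  # an isolation valve, illustrative
            "hazards": ["stored pressure"],
            "measures": ["depressurize and verify before opening"],
        },
    }

    def lookup(kks_code):
        """Return the hazards and prevention measures recorded for a KKS code."""
        entry = HAZARD_DB.get(kks_code)
        if entry is None:
            return {"hazards": [],
                    "measures": ["no record: perform a job hazard analysis first"]}
        return entry

    info = lookup("1LAC10AP001")
    ```

    Keying the database on the KKS code is what lets maintenance staff retrieve the hazard record for exactly the piece of equipment named in a work order.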

  12. Application of Google Maps API service for creating web map of information retrieved from CORINE land cover databases

    Directory of Open Access Journals (Sweden)

    Kilibarda Milan

    2010-01-01

    Today, the Google Maps API, an Ajax-based standard web service, enables users to publish interactive web maps, thus opening new possibilities relative to classical analogue maps. CORINE land cover databases are recognized as fundamental reference data sets for numerous spatial analyses. The theoretical and applied aspects of the Google Maps API cartographic service are considered for the case of creating a web map of changes in urban areas in Belgrade and its surroundings from 2000 to 2006, obtained from the CORINE databases.
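    One common pattern for such a map (not necessarily the authors') is to serve the CORINE change polygons as GeoJSON, which the Google Maps JavaScript API can load directly via `map.data.loadGeoJson()`. A sketch of the server-side packaging step, with invented coordinates and attribute names:

    ```python
    import json

    # Package CORINE land-cover change polygons as a GeoJSON FeatureCollection.
    # The ring coordinates and property names below are invented for illustration;
    # CORINE class codes 211 (arable land) and 112 (discontinuous urban fabric)
    # are real codes used here as an example of agricultural-to-urban change.
    def corine_change_to_geojson(changes):
        features = [{
            "type": "Feature",
            "geometry": {"type": "Polygon", "coordinates": [c["ring"]]},
            "properties": {"code_2000": c["code_2000"], "code_2006": c["code_2006"]},
        } for c in changes]
        return {"type": "FeatureCollection", "features": features}

    changes = [{"ring": [[20.45, 44.82], [20.47, 44.82], [20.47, 44.80], [20.45, 44.82]],
                "code_2000": "211", "code_2006": "112"}]
    geojson = corine_change_to_geojson(changes)
    geojson_text = json.dumps(geojson)  # ready to serve to the web map
    ```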

  13. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader in reaching a global understanding of the field and, in conducting studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside of the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques beyond simple theoretical tools since they have been adapted to solve significant problems that commonly arise on such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an...

  14. Computational Intelligence and Decision Making Trends and Applications

    CERN Document Server

    Madureira, Ana; Marques, Viriato

    2013-01-01

    This book provides a general overview and original analysis of new developments and applications in several areas of Computational Intelligence and Information Systems. Computational Intelligence has become an important tool for engineers to develop and analyze novel techniques to solve problems in basic sciences such as physics, chemistry, biology, engineering, environment and social sciences.   The material contained in this book addresses the foundations and applications of Artificial Intelligence and Decision Support Systems, Complex and Biological Inspired Systems, Simulation and Evolution of Real and Artificial Life Forms, Intelligent Models and Control Systems, Knowledge and Learning Technologies, Web Semantics and Ontologies, Intelligent Tutoring Systems, Intelligent Power Systems, Self-Organized and Distributed Systems, Intelligent Manufacturing Systems and Affective Computing. The contributions have all been written by international experts, who provide current views on the topics discussed and pr...

  15. STAR - A computer language for hybrid AI applications

    Science.gov (United States)

    Borchardt, G. C.

    1986-01-01

    Constructing Artificial Intelligence application systems which rely on both symbolic and non-symbolic processing places heavy demands on the communication of data between dissimilar languages. This paper describes STAR (Simple Tool for Automated Reasoning), a computer language for the development of AI application systems which supports the transfer of data structures between a symbolic level and a non-symbolic level defined in languages such as FORTRAN, C and PASCAL. The organization of STAR is presented, followed by the description of an application involving STAR in the interpretation of airborne imaging spectrometer data.

  16. Tsunami early warning in the Mediterranean: role, structure and tricks of pre-computed tsunami simulation databases and matching/forecasting algorithms

    Science.gov (United States)

    Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano

    2014-05-01

    The general idea that pre-computed simulated scenario databases can play a key role in the design of tsunami early warning systems is commonly accepted by now. But it was only in the last decade that it started to be applied to the Mediterranean region, taking special impulse from initiatives like the GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects, and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss some results we obtained on two major topics, namely the strategies applicable to building the tsunami scenario database, and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real time. As for the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve 1) the choice of the main tectonic tsunamigenic sources (or areas), 2) their tessellation with matrices of elementary faults whose dimensions heavily depend on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, 3) the computation of the scenarios themselves, and 4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and at all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. In particular, we analyse the
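    The combination step can be caricatured as follows, assuming linear superposition of unit scenarios; the fault grid, the nearest-neighbour weighting and the moment scaling are all illustrative assumptions, not the algorithm's actual rules:

    ```python
    import math

    # Toy matching step: given a magnitude/location estimate, pick the nearest
    # pre-computed elementary-fault scenarios and combine their forecast
    # amplitudes linearly. All values below are invented for illustration.
    ELEMENTARY = {  # fault id -> (lon, lat, unit amplitude at a forecast point, m)
        "F01": (28.0, 36.0, 0.15),
        "F02": (28.5, 36.0, 0.22),
        "F03": (29.0, 36.5, 0.09),
    }

    def forecast_amplitude(lon, lat, magnitude, k=2, m_ref=7.0):
        """Average the k nearest unit scenarios, scaled by seismic-moment ratio."""
        nearest = sorted(ELEMENTARY.values(),
                         key=lambda s: math.hypot(s[0] - lon, s[1] - lat))[:k]
        moment_scale = 10 ** (1.5 * (magnitude - m_ref))  # Mw -> moment ratio
        return moment_scale * sum(s[2] for s in nearest) / k

    amp = forecast_amplitude(28.2, 36.1, 7.0)
    ```

    Because each elementary scenario is simulated once, offline, the real-time cost reduces to a lookup and a weighted sum, which is what makes the forecast fast enough for early warning.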

  17. Computer applications for the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Worth, G.A.; Patterson, J.R.

    1976-01-01

    Computer applications for the FFTF reactor include plant surveillance functions and fuel handling and examination control functions. Plant surveillance systems provide the reactor operator with a selection of over forty continuously updated, formatted displays of correlated data. All data are checked for limits and validity and the operator is advised of any anomaly. Data are also recorded on magnetic tape for historical purposes. The system also provides calculated variables, such as reactor thermal power and anomalous reactivity. Supplementing the basic plant surveillance computer system is a minicomputer system that monitors the reactor cover gas to detect and characterize absorber or fuel pin failures. In addition to plant surveillance functions, computers are used in the FFTF for controlling selected refueling equipment and for post-irradiation fuel pin examination. Four fuel handling or examination systems operate under computer control with manual monitoring and override capability
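    The limit and validity checking described above can be sketched in a few lines; the channel names, limits and readings are invented, not FFTF values:

    ```python
    # Toy limit/validity check of surveillance data points. LIMITS maps an
    # illustrative channel name to its (low, high) operating band.
    LIMITS = {"reactor_thermal_power_MW": (0.0, 400.0),
              "coolant_outlet_temp_C": (200.0, 600.0)}

    def check_point(channel, value):
        """Classify a data point as OK, out of limits, or invalid."""
        if channel not in LIMITS or not isinstance(value, (int, float)):
            return "INVALID"
        lo, hi = LIMITS[channel]
        return "OK" if lo <= value <= hi else "LIMIT"

    # Collect the anomalies the operator would be advised of.
    readings = [("reactor_thermal_power_MW", 410.0),
                ("coolant_outlet_temp_C", 450.0)]
    alarms = [(ch, v) for ch, v in readings if check_point(ch, v) != "OK"]
    ```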

  18. Computer graphics application in the engineering design integration system

    Science.gov (United States)

    Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.

    1975-01-01

    The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by directly coupled low-cost storage tube terminals with limited interactive capabilities, and a minicomputer-based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 BAUD), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer-aided design.

  19. Research on application of computer technologies in jewelry process

    Directory of Open Access Journals (Sweden)

    Junbo Xia

    2017-06-01

    Jewelry production is a process that works precious raw materials with low losses in processing. The traditional manual mode is unable to meet the real needs of enterprises, while the involvement of computer technology can solve this practical problem. At present, what restricts the application of computers in jewelry production is mainly the failure to find a production model that can serve the whole industry chain with the computer as the core of production. This paper designs a "synchronous and diversified" production model with computer-aided design technology and rapid prototyping technology at its core, tests it with actual production cases, and achieves certain results, which are forward-looking and advanced.

  20. First Database Course--Keeping It All Organized

    Science.gov (United States)

    Baugh, Jeanne M.

    2015-01-01

    All Computer Information Systems programs require a database course for their majors. This paper describes an approach to such a course in which real world examples, both design projects and actual database application projects are incorporated throughout the semester. Students are expected to apply the traditional database concepts to actual…