WorldWideScience

Sample records for integrates numerous database

  1. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  2. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  3. Numerical databases in marine biology

    Digital Repository Service at National Institute of Oceanography (India)

    Sarupria, J.S.; Bhargava, R.M.S.

  4. A Database Integrity Pattern Language

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2004-08-01

Full Text Available Patterns and pattern languages are ways to capture experience and make it reusable for others; they describe best practices and good designs. Patterns are solutions to recurrent problems. This paper addresses database integrity problems from a pattern perspective. Even though the number of vendors of database management systems is quite high, the number of available solutions to integrity problems is limited: they all learned from past experience, applying the same solutions over and over again. The solutions applied in database management systems (DBMS) to avoid integrity threats can be formalized as a pattern language. Constraints, transactions, locks, etc., are recurrent solutions to integrity threats and therefore should be treated accordingly, as patterns.

  5. Integrated spent nuclear fuel database system

    International Nuclear Information System (INIS)

    Henline, S.P.; Klingler, K.G.; Schierman, B.H.

    1994-01-01

    The Distributed Information Systems software Unit at the Idaho National Engineering Laboratory has designed and developed an Integrated Spent Nuclear Fuel Database System (ISNFDS), which maintains a computerized inventory of all US Department of Energy (DOE) spent nuclear fuel (SNF). Commercial SNF is not included in the ISNFDS unless it is owned or stored by DOE. The ISNFDS is an integrated, single data source containing accurate, traceable, and consistent data and provides extensive data for each fuel, extensive facility data for every facility, and numerous data reports and queries

  6. Cuba: Multidimensional numerical integration library

    Science.gov (United States)

    Hahn, Thomas

    2016-08-01

The Cuba library offers four independent routines for multidimensional numerical integration: Vegas, Suave, Divonne, and Cuhre. The four algorithms work by very different methods, can integrate vector integrands, and have very similar Fortran, C/C++, and Mathematica interfaces. Their invocation is very similar, making it easy to cross-check results by substituting one method for another. As a further safeguard, the output is supplemented by a chi-square probability which quantifies the reliability of the error estimate.
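The cross-checking practice the abstract recommends, substituting one method for another and comparing the estimates, can be sketched generically. This is not the Cuba API; `monte_carlo` and `midpoint` are hypothetical stand-ins for two independent integration routines:

```python
import random

def f(x, y):
    # Example integrand on the unit square; its exact integral is 1.25
    return x * y + 1.0

def monte_carlo(f, n, seed=0):
    """Plain Monte Carlo estimate with a standard-error estimate."""
    rng = random.Random(seed)
    total = total_sq = 0.0
    for _ in range(n):
        v = f(rng.random(), rng.random())
        total += v
        total_sq += v * v
    mean = total / n
    var = total_sq / n - mean * mean
    return mean, (var / n) ** 0.5

def midpoint(f, m):
    """Product midpoint rule with m x m cells: an independent method."""
    h = 1.0 / m
    return sum(f((i + 0.5) * h, (j + 0.5) * h)
               for i in range(m) for j in range(m)) * h * h

mc, err = monte_carlo(f, 100_000)
mp = midpoint(f, 200)
# The two independent estimates should agree within a few standard errors;
# a large discrepancy signals an unreliable error estimate.
print(mc, err, mp)
```

A disagreement far beyond `err` is the kind of warning the chi-square probability in Cuba's output is designed to surface.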

  7. 數據資料庫 Numeric Databases

    Directory of Open Access Journals (Sweden)

    Mei-ling Wang Chen

    1989-03-01

Full Text Available In 1979, the International Communication Bureau of the R.O.C. connected to several U.S. information service centers through the international telecommunication network. Since then, Dialog, ORBIT, and BRS have been introduced into the country. However, users are mainly interested in bibliographic databases and seldom know about non-bibliographic or numeric databases. This article describes numeric databases: their definition and characteristics, comparison with bibliographic databases, producers, service systems and users, data elements, a brief introduction by subject, problems and future prospects, the library's role, and the present state of use in the R.O.C.

  8. Methods for enhancing numerical integration

    International Nuclear Information System (INIS)

    Doncker, Elise de

    2003-01-01

    We give a survey of common strategies for numerical integration (adaptive, Monte-Carlo, Quasi-Monte Carlo), and attempt to delineate their realm of applicability. The inherent accuracy and error bounds for basic integration methods are given via such measures as the degree of precision of cubature rules, the index of a family of lattice rules, and the discrepancy of uniformly distributed point sets. Strategies incorporating these basic methods often use paradigms to reduce the error by, e.g., increasing the number of points in the domain or decreasing the mesh size, locally or uniformly. For these processes the order of convergence of the strategy is determined by the asymptotic behavior of the error, and may be too slow in practice for the type of problem at hand. For certain problem classes we may be able to improve the effectiveness of the method or strategy by such techniques as transformations, absorbing a difficult part of the integrand into a weight function, suitable partitioning of the domain, transformations and extrapolation or convergence acceleration. Situations warranting the use of these techniques (possibly in an 'automated' way) are described and illustrated by sample applications
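One of the convergence-acceleration techniques surveyed above, extrapolation, can be illustrated on the composite trapezoid rule: combining two step sizes cancels the leading error term. This is a generic sketch, not code from the survey:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n subintervals; error is O(h^2)."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

def richardson(f, a, b, n):
    """One Richardson extrapolation step: cancels the O(h^2) error term,
    yielding an O(h^4) estimate (equivalent to Simpson's rule)."""
    t1 = trapezoid(f, a, b, n)
    t2 = trapezoid(f, a, b, 2 * n)
    return (4.0 * t2 - t1) / 3.0

exact = math.e - 1.0  # integral of exp(x) over [0, 1]
print(abs(trapezoid(math.exp, 0, 1, 16) - exact))   # ~ h^2 error
print(abs(richardson(math.exp, 0, 1, 16) - exact))  # several orders smaller
```

Repeating the same cancellation over successive halvings gives the Romberg scheme, one standard example of the "convergence acceleration" the abstract refers to.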

  9. On Simplification of Database Integrity Constraints

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2006-01-01

    Without proper simplification techniques, database integrity checking can be prohibitively time consuming. Several methods have been developed for producing simplified incremental checks for each update but none until now of sufficient quality and generality for providing a true practical impact,...

  10. Integrating Variances into an Analytical Database

    Science.gov (United States)

    Sanchez, Carlos

    2010-01-01

For this project, I enrolled in numerous SATERN courses that taught the basics of database programming, including Basic Access 2007 Forms, Introduction to Database Systems, and Overview of Database Design. My main job was to create an analytical database that can handle many stored forms and make them easy to interpret and organize. Additionally, I helped improve an existing database and populate it with information. These databases were designed to be used with data from Safety Variances and DCR forms. The research consisted of analyzing the database and comparing the data to find out which entries were repeated the most. If an entry was repeated several times in the database, that would mean the rule or requirement targeted by that variance had already been bypassed many times, so the requirement may not really be needed and should instead be changed to allow the variance's conditions permanently. The project was not restricted to the design and development of the database system; it also involved exporting the data from the database to a different format (e.g., Excel or Word) so it could be analyzed more simply. Thanks to the change in format, the data was organized in a spreadsheet that made it possible to sort the data by categories or types and helped speed up searches. Once my work with the database was done, the records of variances could be arranged so that they were displayed in numerical order, or one could search for a specific document targeted by the variances and restrict the search to only include variances that modified a specific requirement. A great contributor to my learning was SATERN, NASA's resource for education. Thanks to the SATERN online courses I took over the summer, I was able to learn many new things about computers and databases and also go more in depth into topics I already knew about.

  11. Numerical approach to one-loop integrals

    International Nuclear Information System (INIS)

    Fujimoto, Junpei; Shimizu, Yoshimitsu; Kato, Kiyoshi; Oyanagi, Yoshio.

    1992-01-01

    Two numerical methods are proposed for the calculation of one-loop scalar integrals. In the first method, the singularity is cancelled by the symmetrization of the integrand and the integration is done by a Monte-Carlo method. In the second one, after the transform of the integrand into a standard form, the integral is reduced into a regular numerical integral. These methods provide us practical tools to evaluate one-loop Feynman diagrams with desired numerical accuracy. They are extended to the integral with numerator and the treatment of the one-loop virtual correction to the cross section is also presented. (author)

  12. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of many types of data related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
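SAM-D's own processing is not public, but the basic step of turning a two-line element's mean motion into orbital characteristics follows directly from Kepler's third law. A minimal sketch, with the function name and sample value purely illustrative:

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

def orbit_from_mean_motion(n_rev_per_day):
    """Derive the semi-major axis (km) and orbital period (minutes) from
    the mean-motion field of a two-line element set (revolutions/day)."""
    n = n_rev_per_day * 2.0 * math.pi / 86400.0    # mean motion in rad/s
    a = (MU_EARTH / n ** 2) ** (1.0 / 3.0)         # Kepler's third law
    period_min = 2.0 * math.pi / (n * 60.0)
    return a, period_min

# An ISS-like mean motion of 15.5 rev/day gives a low-Earth orbit:
# a of roughly 6.8e3 km and a period of roughly 93 minutes.
a, period = orbit_from_mean_motion(15.5)
print(a, period)
```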

  13. Loopedia, a database for loop integrals

    Science.gov (United States)

    Bogner, C.; Borowka, S.; Hahn, T.; Heinrich, G.; Jones, S. P.; Kerner, M.; von Manteuffel, A.; Michel, M.; Panzer, E.; Papara, V.

    2018-04-01

Loopedia is a new database at loopedia.org for information on Feynman integrals, intended to provide both bibliographic information and results made available by the community. Its bibliometry is complementary to that of INSPIRE or arXiv in the sense that it admits searching for integrals by graph-theoretical objects, e.g. their topology.

  14. SINBAD: Shielding integral benchmark archive and database

    International Nuclear Information System (INIS)

    Hunter, H.T.; Ingersoll, D.T.; Roussin, R.W.

    1996-01-01

    SINBAD is a new electronic database developed to store a variety of radiation shielding benchmark data so that users can easily retrieve and incorporate the data into their calculations. SINBAD is an excellent data source for users who require the quality assurance necessary in developing cross-section libraries or radiation transport codes. The future needs of the scientific community are best served by the electronic database format of SINBAD and its user-friendly interface, combined with its data accuracy and integrity

  15. Optimal database locks for efficient integrity checking

    DEFF Research Database (Denmark)

    Martinenghi, Davide

    2004-01-01

In concurrent database systems, correctness of update transactions refers to the equivalent effects of the execution schedule and some serial schedule over the same set of transactions. Integrity constraints add further semantic requirements to the correctness of the database states reached upon the execution of update transactions. Several methods for efficient integrity checking and enforcing exist. We show in this paper how to apply one such method to automatically extend update transactions with locks and simplified consistency tests on the locked entities. All schedules produced in this way...

  16. Nuclear integrated database and design advancement system

    International Nuclear Information System (INIS)

    Ha, Jae Joo; Jeong, Kwang Sub; Kim, Seung Hwan; Choi, Sun Young.

    1997-01-01

The objective of NuIDEAS is to computerize design processes through an integrated database, eliminating the current work style of delivering hardcopy documents and drawings. The major research contents of NuIDEAS are the advancement of design processes by computerization, the establishment of a design database, and 3-dimensional visualization of design data. KSNP (Korea Standard Nuclear Power Plant) is the target of the legacy database and 3-dimensional model, so that they can be utilized in the next plant design. In the first year, the blueprint of NuIDEAS was proposed, and its prototype was developed by applying rapidly evolving computer technology. The major results of the first-year research were to establish the architecture of the integrated database ensuring data consistency, and to build a design database of the reactor coolant system and heavy components. Various softwares were also developed to search, share, and utilize the data through networks; detailed 3-dimensional CAD models of nuclear fuel and heavy components were constructed, and walk-through simulation using the models was developed. This report contains the major additions and modifications to the object-oriented database and associated programs, implemented using methods and JavaScript. (author). 36 refs., 1 tab., 32 figs

  17. An Integrative Theory of Numerical Development

    Science.gov (United States)

    Siegler, Robert; Lortie-Forgues, Hugues

    2014-01-01

    Understanding of numerical development is growing rapidly, but the volume and diversity of findings can make it difficult to perceive any coherence in the process. The integrative theory of numerical development posits that a coherent theme is present, however--progressive broadening of the set of numbers whose magnitudes can be accurately…

  18. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project to the Small Business Innovative Research (SBIR) program is explained.

  19. Numerical time integration for air pollution models

    NARCIS (Netherlands)

    J.G. Verwer (Jan); W. Hundsdorfer (Willem); J.G. Blom (Joke)

    1998-01-01

Due to the large number of chemical species and the three space dimensions, off-the-shelf stiff ODE integrators are not feasible for the numerical time integration of stiff systems of advection-diffusion-reaction equations ∂c/∂t + ∇·(uc) = ∇·(K∇c) + R(c).
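The stiffness that motivates this work can be seen on a one-dimensional linear test problem: an explicit integrator diverges once the step size exceeds its stability limit, while an implicit one stays stable at any step size. A generic illustration, not the integrators developed in the paper:

```python
def explicit_euler(lam, y0, h, steps):
    """Explicit Euler for y' = -lam * y; unstable when h * lam > 2."""
    y = y0
    for _ in range(steps):
        y = y + h * (-lam * y)
    return y

def implicit_euler(lam, y0, h, steps):
    """Implicit (backward) Euler; unconditionally stable for this problem."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 + h * lam)   # closed-form solve of the implicit step
    return y

lam, h, steps = 1000.0, 0.01, 100   # h * lam = 10, far past the limit of 2
print(abs(explicit_euler(lam, 1.0, h, steps)))  # oscillates and diverges
print(implicit_euler(lam, 1.0, h, steps))       # decays toward 0, like y(t)
```

Fast chemistry forces `lam` (and hence the cost of explicit schemes) up, which is why stiff chemistry combined with three-dimensional transport demands specially tailored integrators.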

  20. A numerical method for resonance integral calculations

    International Nuclear Information System (INIS)

    Tanbay, Tayfun; Ozgener, Bilge

    2013-01-01

    A numerical method has been proposed for resonance integral calculations and a cubic fit based on least squares approximation to compute the optimum Bell factor is given. The numerical method is based on the discretization of the neutron slowing down equation. The scattering integral is approximated by taking into account the location of the upper limit in energy domain. The accuracy of the method has been tested by performing computations of resonance integrals for uranium dioxide isolated rods and comparing the results with empirical values. (orig.)

  1. High-integrity databases for helicopter operations

    Science.gov (United States)

    Pschierer, Christian; Schiefele, Jens; Lüthy, Juerg

    2009-05-01

Helicopter Emergency Medical Service (HEMS) missions impose a high workload on pilots due to short preparation time, operations in low-level flight, and landings in unknown areas. The research project PILAS, a cooperation between Eurocopter, Diehl Avionics, DLR, EADS, Euro Telematik, ESG, Jeppesen, and the Universities of Darmstadt and Munich, funded by the German government, approached this problem by researching a pilot assistance system which supports the pilots during all phases of flight. The databases required for the specified helicopter missions include different types of topological and cultural data for graphical display on the SVS system, AMDB data for operations at airports and helipads, and navigation data for IFR segments. The most critical databases for the PILAS system, however, are highly accurate terrain and obstacle data. While RTCA DO-276 specifies high accuracies and integrities only for the areas around airports, HEMS helicopters typically operate outside of these controlled areas and thus require highly reliable terrain and obstacle data for their designated response areas. This data has been generated by a LIDAR scan of the specified test region, and obstacles have been extracted into a vector format. This paper includes a short overview of the complete PILAS system and then focuses on the generation of the required high-quality databases.

  2. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

Video databases became an active field of research during the last decade. The main objective of such systems is to let users conveniently search, access, and play back distributed stored video data in the same way as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the content of video data is very hard to extract automatically and needs to be annotated manually. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad hoc adaptation modules. More precisely, SIRSALE allows users to (a) browse video documents by structure (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We present how dedicated active services enable optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news videos and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  3. Distortion-Free Watermarking Approach for Relational Database Integrity Checking

    Directory of Open Access Journals (Sweden)

    Lancine Camara

    2014-01-01

Full Text Available Nowadays, the internet is becoming a common way of accessing databases. Such data are exposed to various types of attacks that aim to confuse ownership proofing or defeat content protection. In this paper, we propose a new approach based on fragile zero watermarking for the authentication of numeric relational data. Contrary to some previous database watermarking techniques, which introduce distortions into the original database and may not preserve data usability constraints, our approach simply generates the watermark from the original database. First, the adopted method partitions the database relation into independent square matrix groups. Then, group-based watermarks are securely generated and registered with a trusted third party. Integrity verification is performed by computing the determinant and the diagonal minors for each group. As a result, tampering can be localized down to the attribute-group level. Theoretical and experimental results demonstrate that the proposed technique is resilient against tuple insertion, tuple deletion, and attribute-value modification attacks. Furthermore, comparison with a recent related effort shows that our scheme performs better in detecting multifaceted attacks.
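The group-determinant idea in this abstract can be sketched in a few lines. This is a simplified illustration for 2 x 2 groups; the paper's key-dependent grouping, diagonal minors, and trusted-third-party protocol are omitted, and all names and data are hypothetical:

```python
from fractions import Fraction

def partition(values, k=2):
    """Group a flat list of numeric attribute values into k x k matrices;
    trailing values that do not fill a complete group are ignored here."""
    g, groups = k * k, []
    for i in range(0, len(values) - g + 1, g):
        block = values[i:i + g]
        groups.append([block[r * k:(r + 1) * k] for r in range(k)])
    return groups

def det2(m):
    # Determinant of a 2 x 2 group; exact arithmetic avoids float drift
    return Fraction(m[0][0]) * m[1][1] - Fraction(m[0][1]) * m[1][0]

def watermark(values):
    """Zero watermark: a fingerprint computed from the data and registered
    with a trusted third party, leaving the database itself unmodified."""
    return [det2(g) for g in partition(values)]

data = [3, 1, 4, 1, 5, 9, 2, 6]      # numeric attribute values, in key order
wm = watermark(data)                  # registered at certification time
tampered = data[:]
tampered[5] = 8                       # an attribute-value modification attack
# Verification recomputes the fingerprint; a mismatch localizes the
# tampering to the affected group.
print([a == b for a, b in zip(wm, watermark(tampered))])  # [True, False]
```

Because the watermark is derived from the data rather than embedded in it, the scheme is distortion-free: the stored relation is never altered.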

  4. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  5. Automatic numerical integration methods for Feynman integrals through 3-loop

    International Nuclear Information System (INIS)

    De Doncker, E; Olagbemi, O; Yuasa, F; Ishikawa, T; Kato, K

    2015-01-01

We give numerical integration results for Feynman loop diagrams through 3-loop such as those covered by Laporta [1]. The methods are based on automatic adaptive integration, using iterated integration and extrapolation with programs from the QUADPACK package, or multivariate techniques from the ParInt package. The DQAGS algorithm from QUADPACK accommodates boundary singularities of fairly general types. ParInt is a package for multivariate integration layered over MPI (Message Passing Interface), which runs on clusters and incorporates advanced parallel/distributed techniques such as load balancing among processes that may be distributed over a network of nodes. Results are included for 3-loop self-energy diagrams without IR (infra-red) or UV (ultra-violet) singularities. A procedure based on iterated integration and extrapolation yields a novel method of numerical regularization for integrals with UV terms, and is applied to a set of 2-loop self-energy diagrams with UV singularities. (paper)

  6. DENdb: database of integrated human enhancers

    KAUST Repository

    Ashoor, Haitham

    2015-09-05

    Enhancers are cis-acting DNA regulatory regions that play a key role in distal control of transcriptional activities. Identification of enhancers, coupled with a comprehensive functional analysis of their properties, could improve our understanding of complex gene transcription mechanisms and gene regulation processes in general. We developed DENdb, a centralized on-line repository of predicted enhancers derived from multiple human cell-lines. DENdb integrates enhancers predicted by five different methods generating an enriched catalogue of putative enhancers for each of the analysed cell-lines. DENdb provides information about the overlap of enhancers with DNase I hypersensitive regions, ChIP-seq regions of a number of transcription factors and transcription factor binding motifs, means to explore enhancer interactions with DNA using several chromatin interaction assays and enhancer neighbouring genes. DENdb is designed as a relational database that facilitates fast and efficient searching, browsing and visualization of information.

  7. DENdb: database of integrated human enhancers

    KAUST Repository

    Ashoor, Haitham; Kleftogiannis, Dimitrios A.; Radovanovic, Aleksandar; Bajic, Vladimir B.

    2015-01-01

    Enhancers are cis-acting DNA regulatory regions that play a key role in distal control of transcriptional activities. Identification of enhancers, coupled with a comprehensive functional analysis of their properties, could improve our understanding of complex gene transcription mechanisms and gene regulation processes in general. We developed DENdb, a centralized on-line repository of predicted enhancers derived from multiple human cell-lines. DENdb integrates enhancers predicted by five different methods generating an enriched catalogue of putative enhancers for each of the analysed cell-lines. DENdb provides information about the overlap of enhancers with DNase I hypersensitive regions, ChIP-seq regions of a number of transcription factors and transcription factor binding motifs, means to explore enhancer interactions with DNA using several chromatin interaction assays and enhancer neighbouring genes. DENdb is designed as a relational database that facilitates fast and efficient searching, browsing and visualization of information.

  8. Database specification for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB)

    Energy Technology Data Exchange (ETDEWEB)

    Faby, E.Z.; Fluker, J.; Hancock, B.R.; Grubb, J.W.; Russell, D.L. [Univ. of Tennessee, Knoxville, TN (United States); Loftis, J.P.; Shipe, P.C.; Truett, L.F. [Oak Ridge National Lab., TN (United States)

    1994-03-01

    This Database Specification for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB) describes the database organization and storage allocation, provides the detailed data model of the logical and physical designs, and provides information for the construction of parts of the database such as tables, data elements, and associated dictionaries and diagrams.

  9. An integrated numerical protection system (SPIN)

    International Nuclear Information System (INIS)

    Savornin, J.L.; Bouchet, J.M.; Furet, J.L.; Jover, P.; Sala, A.

    1978-01-01

    Developments in technology have now made it possible to perform more sophisticated protection functions which follow more closely the physical phenomena to be monitored. For this reason the Commissariat a l'energie atomique, Merlin-Gerin, Cerci and Framatome have embarked on the joint development of an Integrated Numerical Protection System (SPIN) which will fulfil this objective and will improve the safety and availability of power stations. The system described involves the use of programmed numerical techniques and a structure based on multiprocessors. The architecture has a redundancy of four. Throughout the development of the project the validity of the studies was confirmed by experiments. A first numerical model of a protection function was tested in the laboratory and is now in operation in a power station. A set of models was then introduced for checking the main components of the equipment finally chosen prior to building and testing a prototype. (author)

  10. An Integrated Molecular Database on Indian Insects.

    Science.gov (United States)

    Pratheepa, Maria; Venkatesan, Thiruvengadam; Gracy, Gandhi; Jalali, Sushil Kumar; Rangheswaran, Rajagopal; Antony, Jomin Cruz; Rai, Anil

    2018-01-01

MOlecular Database on Indian Insects (MODII) is an online database linking several databases such as Insect Pest Info, the Insect Barcode Information System (IBIn), insect whole-genome sequences, other genomic resources of the National Bureau of Agricultural Insect Resources (NBAIR), whole-genome sequencing of honey bee viruses, an insecticide resistance gene database, and genomic tools. The database was developed with a holistic approach to collecting phenomic and genomic information on agriculturally important insects. This insect resource database is available online for free at http://cib.res.in/.

  11. Numerical methods for engine-airframe integration

    International Nuclear Information System (INIS)

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

    Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of a second-generation low-order panel methods to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic, supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment

  12. Ontology based heterogeneous materials database integration and semantic query

    Science.gov (United States)

    Zhao, Shuai; Qian, Quan

    2017-10-01

Materials digital data, high-throughput experiments, and high-throughput computations are regarded as the three key pillars of materials genome initiatives. With the fast growth of materials data, the integration and sharing of data have become urgent and have gradually developed into a hot topic of materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates a semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be performed using SPARQL. In the experiments, two well-known first-principles computation databases, OQMD and the Materials Project, are used as the integration targets, which shows the availability and effectiveness of our method.

  13. Emission & Generation Resource Integrated Database (eGRID)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Emissions & Generation Resource Integrated Database (eGRID) is an integrated source of data on environmental characteristics of electric power generation....

  14. Applying recursive numerical integration techniques for solving high dimensional integrals

    International Nuclear Information System (INIS)

    Ammon, Andreas; Genz, Alan; Hartung, Tobias; Jansen, Karl; Volmer, Julia; Leoevey, Hernan

    2016-11-01

    The error scaling for Markov-Chain Monte Carlo (MCMC) techniques with N samples behaves like 1/√(N). This scaling often makes it very time intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.
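The iterated low-dimensional quadrature idea can be sketched in a few lines of Python: a fixed 1-D Gauss-Legendre rule is applied recursively, one dimension at a time, to a d-dimensional integrand. This is only the naive nested-product form of the idea; the RNI method described in the abstract additionally exploits the factorized structure of the Boltzmann weight, and the integrand, dimension, and node count below are illustrative assumptions, not the authors' setup.

```python
import math

# 5-point Gauss-Legendre nodes and weights on [-1, 1] (standard tabulated values).
NODES = [-0.9061798459386640, -0.5384693101056831, 0.0,
         0.5384693101056831, 0.9061798459386640]
WEIGHTS = [0.2369268850561891, 0.4786286704993665, 0.5688888888888889,
           0.4786286704993665, 0.2369268850561891]

def recursive_gauss(f, dim, point=()):
    """Integrate f over the unit cube [0,1]^dim by applying the 1-D Gauss
    rule recursively, one dimension at a time."""
    if dim == 0:
        return f(point)
    total = 0.0
    for x, w in zip(NODES, WEIGHTS):
        t = 0.5 * (x + 1.0)   # map node from [-1, 1] to [0, 1]
        # the factor 0.5 is the Jacobian of the interval map
        total += 0.5 * w * recursive_gauss(f, dim - 1, point + (t,))
    return total

# Illustrative integrand: a separable Gaussian "Boltzmann-like" weight in 4 dims,
# whose exact integral factorizes into 1-D error-function integrals.
approx = recursive_gauss(lambda p: math.exp(-sum(t * t for t in p)), 4)
exact = (0.5 * math.sqrt(math.pi) * math.erf(1.0)) ** 4
```

For this smooth separable integrand the 5-point rule per dimension already reproduces the exact value to many digits, which is the exponential-in-m convergence the abstract refers to.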

  15. Applying recursive numerical integration techniques for solving high dimensional integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [IVU Traffic Technologies AG, Berlin (Germany); Genz, Alan [Washington State Univ., Pullman, WA (United States). Dept. of Mathematics; Hartung, Tobias [King' s College, London (United Kingdom). Dept. of Mathematics; Jansen, Karl; Volmer, Julia [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leoevey, Hernan [Humboldt Univ. Berlin (Germany). Inst. fuer Mathematik

    2016-11-15

    The error scaling for Markov-Chain Monte Carlo (MCMC) techniques with N samples behaves like 1/√(N). This scaling often makes it very time intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.

  16. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of each object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools for exploiting the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and a user interface for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of the information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government-off-the-Shelf information sharing platform in use throughout the DoD and DHS information sharing and situational awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data are shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  17. Integrated optical circuits for numerical computation

    Science.gov (United States)

    Verber, C. M.; Kenan, R. P.

    1983-01-01

    The development of integrated optical circuits (IOC) for numerical-computation applications is reviewed, with a focus on the use of systolic architectures. The basic architecture criteria for optical processors are shown to be the same as those proposed by Kung (1982) for VLSI design, and the advantages of IOCs over bulk techniques are indicated. The operation and fabrication of electrooptic grating structures are outlined, and the application of IOCs of this type to an existing 32-bit, 32-Mbit/sec digital correlator, a proposed matrix multiplier, and a proposed pipeline processor for polynomial evaluation is discussed. The problems arising from the inherent nonlinearity of electrooptic gratings are considered. Diagrams and drawings of the application concepts are provided.

  18. [A web-based integrated clinical database for laryngeal cancer].

    Science.gov (United States)

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer that provides an information platform for clinical and fundamental research on laryngeal cancer and meets the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards, Apache+PHP+MySQL technology, laryngeal cancer specialist characteristics and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system uses clinical data standards and exchanges information with the existing electronic medical records system to avoid information silos. Furthermore, the database forms integrate laryngeal cancer specialist characteristics and tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma offers comprehensive specialist information, strong expandability and high technical feasibility, and conforms to the clinical characteristics of the laryngeal cancer specialty. By using clinical data standards and handling clinical data in structured form, the database can better meet the needs of scientific research and facilitate information exchange, and the data collected about tumor patients are highly informative. In addition, users can access and manipulate the database conveniently over the Internet.

  19. Integrating pattern mining in relational databases

    NARCIS (Netherlands)

    Calders, T.; Goethals, B.; Prado, A.; Fürnkranz, J.; Scheffer, T.; Spiliopoulou, M.

    2006-01-01

    Almost a decade ago, Imielinski and Mannila introduced the notion of Inductive Databases to manage KDD applications just as DBMSs successfully manage business applications. The goal is to follow one of the key DBMS paradigms: building optimizing compilers for ad hoc queries. During the past decade,

  20. Numeric databases on the kinetics of transient species in solution

    International Nuclear Information System (INIS)

    Helman, W.P.; Hug, G.L.; Carmichael, Ian; Ross, A.B.

    1988-01-01

    A description is given of data compilations on the kinetics of transient species in solution. In particular, information is available for the reactions of radicals in aqueous solution and for excited states such as singlet molecular oxygen and those of metal complexes in solution. Methods for compilation and use of the information in computer-readable form are also described. Emphasis is placed on making the database available for online searching. (author)

  1. Integration of Biodiversity Databases in Taiwan and Linkage to Global Databases

    Directory of Open Access Journals (Sweden)

    Kwang-Tsao Shao

    2007-03-01

    Full Text Available By 2001, the biodiversity databases in Taiwan were dispersed across various institutions and colleges and held limited amounts of data. The Natural Resources and Ecology GIS Database sponsored by the Council of Agriculture, which is part of the National Geographic Information System planned by the Ministry of Interior, was the best-established biodiversity database in Taiwan. This database, however, mainly collected distribution data on terrestrial animals and plants within the Taiwan area. In 2001, GBIF was formed, and Taiwan joined as an Associate Participant, starting the establishment and integration of animal and plant species databases; TaiBIF was therefore able to co-operate with GBIF. The Catalog of Life, specimen, and alien species information were integrated using the Darwin Core standard. These metadata standards allowed the biodiversity information of Taiwan to connect with global databases.

  2. Numerical integration of asymptotic solutions of ordinary differential equations

    Science.gov (United States)

    Thurston, Gaylen A.

    1989-01-01

    Classical asymptotic analysis of ordinary differential equations derives approximate solutions that are numerically stable. However, the analysis also leads to tedious expansions in powers of the relevant parameter for a particular problem. The expansions are replaced with integrals that can be evaluated by numerical integration. The resulting numerical solutions retain the linear independence that is the main advantage of asymptotic solutions. Examples, including the Falkner-Skan equation from laminar boundary layer theory, illustrate the method of asymptotic analysis with numerical integration.

  3. Database Translator (DATALATOR) for Integrated Exploitation

    Science.gov (United States)

    2010-10-31

    via the Internet to Fortune 1000 clients including Mercedes-Benz, Procter & Gamble, and HP. I look forward to hearing of your successful proposal and working with you to build a successful business. Sincerely, ...testing the DATALATOR experimental prototype (TRL 4) designed to demonstrate its core functions based on Next Generation Software technology. The ...sources, but is not directly dependent on the platform, such as database technology or data formats. In other words, there is a clear air gap between

  4. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  5. Integration of Oracle and Hadoop: Hybrid Databases Affordable at Scale

    Science.gov (United States)

    Canali, L.; Baranowski, Z.; Kothuri, P.

    2017-10-01

    This work reports on activities aimed at integrating Oracle and Hadoop technologies for the use cases of CERN database services, and in particular on the development of solutions for offloading data and queries from Oracle databases into Hadoop-based systems. The goal of this investigation is to increase the scalability and optimize the cost/performance footprint of some of our largest Oracle databases. These concepts have been applied, among others, to build offline copies of the CERN accelerator controls and logging databases. The tested solution allows reports to be run on the controls data offloaded into Hadoop without affecting the critical production database, providing both performance benefits and cost reduction for the underlying infrastructure. Other use cases discussed include building hybrid database solutions with Oracle and Hadoop, offering the combined advantages of a mature relational database system and a scalable analytics engine.

  6. Integration of functions in logic database systems

    NARCIS (Netherlands)

    Lambrichts, E.; Nees, P.; Paredaens, J.; Peelman, P.; Tanca, L.

    1990-01-01

    We extend Datalog, a logic programming language for rule-based systems, by integrating types, negation and functions. This extension of Datalog is called MilAnt. Furthermore, MilAnt consistency is defined as a stronger form of consistency for functions. It is known that consistency for

  7. An Integrated Enterprise Accelerator Database for the SLC Control System

    International Nuclear Information System (INIS)

    2002-01-01

    Since its inception in the early 1980s, the SLC Control System has been driven by a highly structured, memory-resident real-time database. While efficient, its rigid structure and file-based sources make it difficult to maintain and to extract relevant information. The goal of transforming the sources for this database into relational form is to enable it to be part of a Control System Enterprise Database: an integrated central repository for SLC accelerator device and Control System data with links to other associated databases. We have taken the concepts developed for the NLC Enterprise Database and used them to create and load a relational model of the online SLC Control System database. This database contains data and structure to allow querying and reporting on beamline devices, their associations and parameters. In the future this will be extended to allow generation of EPICS and SLC database files, setup of applications, and links to other databases such as accelerator maintenance, archive data, financial and personnel records, cabling information, and documentation. The database is implemented using Oracle 8i. In the short term it will be updated daily in batch from the online SLC database. In the longer term, it will serve as the primary source for Control System static data, an R and D platform for the NLC, and contribute to SLC Control System operations

  8. A Generalized Technique in Numerical Integration

    Science.gov (United States)

    Safouhi, Hassan

    2018-02-01

    Integration by parts is one of the most popular techniques in the analysis of integrals and is one of the simplest methods to generate asymptotic expansions of integral representations. The product of the technique is usually a divergent series formed from evaluating boundary terms; however, sometimes the remaining integral is also evaluated. Due to the successive differentiation and anti-differentiation required to form the series or the remaining integral, the technique is difficult to apply to all but the simplest problems. In this contribution, we explore a generalized and formalized integration by parts to create equivalent representations of some challenging integrals. As demonstrative archetypes, we examine Bessel integrals, Fresnel integrals and Airy functions.
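The divergent-series behavior the abstract mentions can be made concrete with a simpler archetype than the Bessel, Fresnel, and Airy cases treated in the paper: repeated integration by parts on the exponential integral E1(x) = ∫ₓ^∞ e⁻ᵗ/t dt yields the classical asymptotic series e⁻ˣ/x · (1 − 1!/x + 2!/x² − …). The sketch below (an illustration under these assumptions, not the paper's method) compares partial sums against a brute-force quadrature reference:

```python
import math

def e1_quadrature(x, upper=60.0, n=20000):
    """Brute-force reference for E1(x) = integral_x^inf exp(-t)/t dt via
    composite Simpson on [x, x+upper]; the neglected tail is below
    exp(-(x+upper)) and is negligible here."""
    a, b = x, x + upper
    h = (b - a) / n
    f = lambda t: math.exp(-t) / t
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

def e1_asymptotic(x, terms):
    """Partial sum of the divergent series produced by repeated integration
    by parts: E1(x) ~ exp(-x)/x * (1 - 1!/x + 2!/x^2 - ...)."""
    s, term = 0.0, 1.0
    for k in range(terms):
        s += term
        term *= -(k + 1) / x   # next boundary term from one more integration by parts
    return math.exp(-x) / x * s
```

At x = 10 the series gives about three to four correct digits when truncated near its smallest term, while adding many more terms makes the result worse, which is exactly the divergence of the boundary-term series described above.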

  9. High speed numerical integration algorithm using FPGA | Razak ...

    African Journals Online (AJOL)

    Conventionally, numerical integration algorithms are executed in software and are time consuming to accomplish. Field Programmable Gate Arrays (FPGAs) can be used as a much faster, very efficient and reliable alternative for implementing numerical integration algorithms. This paper proposes a hardware implementation of four ...

  10. Construction of an integrated database to support genomic sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, W.; Overbeek, R.

    1994-11-01

    The central goal of this project is to develop an integrated database to support comparative analysis of genomes including DNA sequence data, protein sequence data, gene expression data and metabolism data. In developing the logic-based system GenoBase, a broader integration of available data was achieved due to assistance from collaborators. Current goals are to easily include new forms of data as they become available and to easily navigate through the ensemble of objects described within the database. This report comments on progress made in these areas.

  11. Numerical method of singular problems on singular integrals

    International Nuclear Information System (INIS)

    Zhao Huaiguo; Mou Zongze

    1992-02-01

    As the first part of numerical research on singular problems, a numerical method is proposed for singular integrals. It is shown that the procedure is quite powerful for physics calculations involving singularities, such as the plasma dispersion function. Useful quadrature formulas for some classes of singular integrals are derived. In general, integrals with more complex singularities can also be dealt with easily by this method
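The abstract does not reproduce the quadrature formulas themselves; as an illustration of one standard technique for singular integrands of this kind (a sketch under assumed choices of function and interval, not the authors' method), the code below evaluates a Cauchy principal-value integral by subtracting out the pole:

```python
import math

def pv_integral(f, df, c, a=-1.0, b=1.0, n=2000):
    """Cauchy principal value of integral_a^b f(x)/(x - c) dx for a < c < b,
    computed by subtracting out the singularity:
        PV int f(x)/(x-c) dx = int (f(x) - f(c))/(x - c) dx
                               + f(c) * ln((b - c)/(c - a)).
    The subtracted integrand is smooth; df (the derivative of f) supplies
    its limiting value at x = c. Composite Simpson handles the smooth part."""
    fc = f(c)
    def g(x):
        if abs(x - c) < 1e-12:
            return df(c)               # limit of (f(x) - f(c))/(x - c)
        return (f(x) - fc) / (x - c)
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3.0 + fc * math.log((b - c) / (c - a))
```

For f(x) = x² and c = 0.3 the subtracted integrand reduces to the polynomial x + c, so the result matches the closed form 2c + c² ln((1−c)/(1+c)) essentially to machine precision.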

  12. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna; Tramontano, Anna; Marcatili, Paolo

    2011-01-01

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  13. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna

    2011-11-10

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  14. Integrating spatial and numerical structure in mathematical patterning

    Science.gov (United States)

    Ni’mah, K.; Purwanto; Irawan, E. B.; Hidayanto, E.

    2018-03-01

    This paper reports a study monitoring the integration of spatial and numerical structure in the mathematical patterning skills of 30 grade-7 junior high school students. The purpose of this research is to clarify the processes by which learners construct new knowledge in mathematical patterning. Findings indicate that: (1) some students were unable to organize either the spatial or the numerical structure, (2) some were able to organize the spatial structure, but their numerical structure was still incorrect, (3) some were able to organize the numerical structure, but their spatial structure was still incorrect, and (4) some were able to organize both the spatial and the numerical structure.

  15. INE: a rice genome database with an integrated map view.

    Science.gov (United States)

    Sakata, K; Antonio, B A; Mukai, Y; Nagasaki, H; Sakai, Y; Makino, K; Sasaki, T

    2000-01-01

    The Rice Genome Research Program (RGP) launched large-scale rice genome sequencing in 1998, aimed at decoding all genetic information in rice. A new genome database called INE (INtegrated rice genome Explorer) has been developed in order to integrate all the genomic information accumulated so far and to correlate these data with the genome sequence. A web interface based on a Java applet provides rapid viewing capability in the database. The first operational version of the database has been completed, which includes a genetic map and a physical map using YAC (Yeast Artificial Chromosome) clones and PAC (P1-derived Artificial Chromosome) contigs. These maps are displayed graphically so that the positional relationships among the mapped markers on each chromosome can be easily resolved. INE incorporates the sequences and annotations of the PAC contigs. A site on low-quality information ensures that all submitted sequence data comply with the standard for accuracy. As a repository of the rice genome sequence, INE will also serve as a common database for all sequence data obtained by collaborating members of the International Rice Genome Sequencing Project (IRGSP). The database can be accessed at http://www.dna.affrc.go.jp:82/giot/INE.html or its mirror site at http://www.staff.or.jp/giot/INE.html

  16. Toward an interactive article: integrating journals and biological databases

    Directory of Open Access Journals (Sweden)

    Marygold Steven J

    2011-05-01

    Full Text Available Abstract Background Journal articles and databases are two major modes of communication in the biological sciences, and thus integrating these critical resources is of urgent importance to increase the pace of discovery. Projects focused on bridging the gap between journals and databases have been on the rise over the last five years and have resulted in the development of automated tools that can recognize entities within a document and link those entities to a relevant database. Unfortunately, automated tools cannot resolve ambiguities that arise from one term being used to signify entities that are quite distinct from one another. Instead, resolving these ambiguities requires some manual oversight. Finding the right balance between the speed and portability of automation and the accuracy and flexibility of manual effort is a crucial goal to making text markup a successful venture. Results We have established a journal article mark-up pipeline that links GENETICS journal articles and the model organism database (MOD WormBase. This pipeline uses a lexicon built with entities from the database as a first step. The entity markup pipeline results in links from over nine classes of objects including genes, proteins, alleles, phenotypes and anatomical terms. New entities and ambiguities are discovered and resolved by a database curator through a manual quality control (QC step, along with help from authors via a web form that is provided to them by the journal. New entities discovered through this pipeline are immediately sent to an appropriate curator at the database. Ambiguous entities that do not automatically resolve to one link are resolved by hand ensuring an accurate link. This pipeline has been extended to other databases, namely Saccharomyces Genome Database (SGD and FlyBase, and has been implemented in marking up a paper with links to multiple databases. 
Conclusions Our semi-automated pipeline hyperlinks articles published in GENETICS to

  17. Parallel Algorithm for Adaptive Numerical Integration

    International Nuclear Information System (INIS)

    Sujatmiko, M.; Basarudin, T.

    1997-01-01

    This paper presents an automatic integration algorithm using the adaptive trapezoidal method. The interval is divided adaptively, so that the subinterval widths differ and fit the behavior of the function. For a function f, an integral on the interval [a,b] can be obtained, with maximum tolerance ε, using the estimation (f, a, b, ε). The estimated solution is valid if the error remains in a reasonable range, fulfilling certain criteria. If the error is large, however, the problem is solved by dividing it into two similar and independent sub-problems on the separate intervals [a, (a+b)/2] and [(a+b)/2, b], i.e. the estimations (f, a, (a+b)/2, ε/2) and (f, (a+b)/2, b, ε/2). The problems are solved by two different kinds of processors: a root processor and worker processors. The root processor's function is to divide the main problem into sub-problems and distribute them to worker processors. The division mechanism may continue until all sub-problems are resolved. The solution of each sub-problem is then submitted to the root processor so that the solution of the main problem can be obtained. The algorithm is implemented on a C-programming-based distributed computer networking system under the Parallel Virtual Machine platform
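The estimation-and-split scheme described above can be sketched in a few lines. This is a minimal sequential version: the recursion plays the role of the root processor's problem division, while the paper distributes the sub-problems to worker processors under PVM; the error test used here is the standard one-panel versus two-panel comparison, an assumption since the abstract does not specify the criterion.

```python
import math

def adaptive_trapezoid(f, a, b, eps):
    """Adaptive trapezoidal rule: estimate the integral on [a, b]; if the
    error test fails, split at the midpoint and solve the two independent
    sub-problems with tolerance eps/2 each."""
    m = 0.5 * (a + b)
    one_panel = 0.5 * (b - a) * (f(a) + f(b))
    two_panel = 0.25 * (b - a) * (f(a) + 2.0 * f(m) + f(b))
    # Standard estimate: the two-panel error is about (one_panel - two_panel) / 3.
    if abs(one_panel - two_panel) <= 3.0 * eps:
        return two_panel
    return (adaptive_trapezoid(f, a, m, eps / 2.0) +
            adaptive_trapezoid(f, m, b, eps / 2.0))

# Example: integrate sin(x) over [0, pi]; the exact value is 2.
approx = adaptive_trapezoid(math.sin, 0.0, math.pi, 1e-6)
```

Because each split halves the tolerance, the local error bounds of the leaves sum to at most the original ε, which is what makes the two sub-problems independent and therefore easy to farm out to workers.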

  18. Case studies in the numerical solution of oscillatory integrals

    International Nuclear Information System (INIS)

    Adam, G.

    1992-06-01

    A numerical solution of 53,249 test integrals belonging to nine parametric classes was attempted by two computer codes: EAQWOM (Adam and Nobile, IMA Journ. Numer. Anal. (1991) 11, 271-296) and D01ANF (Mark 13, 1988) from the NAG library software. For the considered test integrals, EAQWOM was found to be superior to D01ANF as concerns robustness, reliability, and friendly user information in case of failure. (author). 9 refs, 3 tabs

  19. Comparison of direct numerical simulation databases of turbulent channel flow at $Re_{\\tau}$ = 180

    NARCIS (Netherlands)

    Vreman, A.W.; Kuerten, Johannes G.M.

    2014-01-01

    Direct numerical simulation (DNS) databases are compared to assess the accuracy and reproducibility of standard and non-standard turbulence statistics of incompressible plane channel flow at $Re_{\\tau}$ = 180. Two fundamentally different DNS codes are shown to produce maximum relative deviations

  20. Comparison of direct numerical simulation databases of turbulent channel flow at Re_τ = 180

    NARCIS (Netherlands)

    Vreman, A.W.; Kuerten, J.G.M.

    2014-01-01

    Direct numerical simulation (DNS) databases are compared to assess the accuracy and reproducibility of standard and non-standard turbulence statistics of incompressible plane channel flow at Re_τ = 180. Two fundamentally different DNS codes are shown to produce maximum relative deviations below 0.2%

  1. Database of episode-integrated solar energetic proton fluences

    Science.gov (United States)

    Robinson, Zachary D.; Adams, James H.; Xapsos, Michael A.; Stauffer, Craig A.

    2018-04-01

    A new database of proton episode-integrated fluences is described. This database contains data from two different instruments on multiple satellites: the Interplanetary Monitoring Platform-8 (IMP8) and the Geostationary Operational Environmental Satellites (GOES) series. A method to normalize one set of data to the other is presented, creating a seamless database spanning 1973 to 2016. A discussion of some of the characteristics that episodes exhibit is presented, including episode duration and number of peaks. As an example of what can be understood about episodes, the July 4, 2012 episode is examined in detail. The coronal mass ejections and solar flares that caused many of the fluctuations of the proton flux seen at Earth are associated with peaks in the proton flux during this episode. The reasoning for each choice is laid out to provide a reference for how CME and solar flare associations are made.

  2. Database of episode-integrated solar energetic proton fluences

    Directory of Open Access Journals (Sweden)

    Robinson Zachary D.

    2018-01-01

    Full Text Available A new database of proton episode-integrated fluences is described. This database contains data from two different instruments on multiple satellites: the Interplanetary Monitoring Platform-8 (IMP8) and the Geostationary Operational Environmental Satellites (GOES) series. A method to normalize one set of data to the other is presented, creating a seamless database spanning 1973 to 2016. A discussion of some of the characteristics that episodes exhibit is presented, including episode duration and number of peaks. As an example of what can be understood about episodes, the July 4, 2012 episode is examined in detail. The coronal mass ejections and solar flares that caused many of the fluctuations of the proton flux seen at Earth are associated with peaks in the proton flux during this episode. The reasoning for each choice is laid out to provide a reference for how CME and solar flare associations are made.

  3. Heterogeneous Biomedical Database Integration Using a Hybrid Strategy: A p53 Cancer Research Database

    Directory of Open Access Journals (Sweden)

    Vadim Y. Bichutskiy

    2006-01-01

    Full Text Available Complex problems in life science research give rise to multidisciplinary collaboration, and hence, to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and a small drug-like molecule with the ability to restore native function to cancerous p53 mutants is a long-held medical goal of cancer treatment. The Cancer Research DataBase (CRDB was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity, and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable, http://www.igb.uci.edu/research/research.html.

  4. A Support Database System for Integrated System Health Management (ISHM)

    Science.gov (United States)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between

  5. Deep Time Data Infrastructure: Integrating Our Current Geologic and Biologic Databases

    Science.gov (United States)

    Kolankowski, S. M.; Fox, P. A.; Ma, X.; Prabhu, A.

    2016-12-01

    As our knowledge of Earth's geologic and mineralogical history grows, we require more efficient methods of sharing immense amounts of data. Databases across numerous disciplines have been utilized to offer extensive information on very specific epochs of Earth's history up to its current state, e.g., the fossil record, rock composition, proteins, etc. These databases could be a powerful force in identifying previously unseen correlations, such as relationships between minerals and proteins. Creating a unifying site that provides a portal to these databases will aid our ability as a collaborative scientific community to utilize our findings more effectively. The Deep-Time Data Infrastructure (DTDI) is currently being defined as part of a larger effort to accomplish this goal. DTDI will not be a new database, but an integration of existing resources. Current geologic and related databases were identified, documentation of their schema was established, and will be presented as a stage-by-stage progression. Through conceptual modeling focused on variables from their combined records, we will determine the best way to integrate these databases using common factors. The Deep-Time Data Infrastructure will allow geoscientists to bridge gaps in data and further our understanding of our Earth's history.

  6. The Center for Integrated Molecular Brain Imaging (Cimbi) database

    DEFF Research Database (Denmark)

    Knudsen, Gitte M.; Jensen, Peter S.; Erritzoe, David

    2016-01-01

    We here describe a multimodality neuroimaging database containing data from healthy volunteers and patients, acquired within the Lundbeck Foundation Center for Integrated Molecular Brain Imaging (Cimbi) in Copenhagen, Denmark. The data are of particular relevance for neurobiological research questions rela...... currently contains blood and in some instances saliva samples from about 500 healthy volunteers and 300 patients with e.g., major depression, dementia, substance abuse, obesity, and impulsive aggression. Data continue to be added to the Cimbi database and biobank.

  7. An information integration system for structured documents, Web, and databases

    OpenAIRE

    Morishima, Atsuyuki

    1998-01-01

    Rapid advances in computer network technology have changed the style of computer utilization. Distributed computing resources over world-wide computer networks are available from our local computers. They include powerful computers and a variety of information sources. This change is raising more advanced requirements; integration of distributed information sources is one such requirement. In addition to conventional databases, structured documents have been widely used, and have increasing...

  8. Numerical integration subprogrammes in Fortran II-D

    Energy Technology Data Exchange (ETDEWEB)

    Fry, C. R.

    1966-12-15

    This note briefly describes some integration subprogrammes written in FORTRAN II-D for the IBM 1620-II at CARDE. Those presented are two Newton-Cotes formulas, Chebyshev polynomial summation, Filon's, Nordsieck's, and optimum Runge-Kutta and predictor-corrector methods. A few miscellaneous numerical integration procedures are also mentioned, covering statistical functions, oscillating integrands and functions occurring in electrical engineering.
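    The original FORTRAN II-D listings are not reproduced in this record. As a rough modern illustration of the closed Newton-Cotes family the note covers, a composite Simpson's rule can be sketched in a few lines of Python (the test integrand and interval are arbitrary examples, not from the note):

```python
import math

def simpson(f, a, b, n=100):
    """Composite Simpson's rule, a closed Newton-Cotes formula.

    The interval [a, b] is split into n equal panels; n must be even.
    """
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))  # odd nodes, weight 4
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))  # even interior nodes, weight 2
    return s * h / 3

# integral of sin(x) over [0, pi] is exactly 2
approx = simpson(math.sin, 0.0, math.pi, n=100)
```

    With 100 panels the fourth-order error term makes the result accurate to roughly eight decimal places for this smooth integrand.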

  9. Integrity Checking and Maintenance with Active Rules in XML Databases

    DEFF Research Database (Denmark)

    Christiansen, Henning; Rekouts, Maria

    2007-01-01

    While specification languages for integrity constraints for XML data have been considered in the literature, actual technologies and methodologies for checking and maintaining integrity are still in their infancy. Triggers, or active rules, which are widely used in previous technologies for the p...... updates, the method indicates trigger conditions and correctness criteria to be met by the trigger code supplied by a developer or possibly automatic methods. We show examples developed in the Sedna XML database system which provides a running implementation of XML triggers....

  10. DPTEdb, an integrative database of transposable elements in dioecious plants.

    Science.gov (United States)

    Li, Shu-Fen; Zhang, Guo-Jun; Zhang, Xue-Jin; Yuan, Jin-Hong; Deng, Chuan-Liang; Gu, Lian-Feng; Gao, Wu-Jun

    2016-01-01

    Dioecious plants usually harbor 'young' sex chromosomes, providing an opportunity to study the early stages of sex chromosome evolution. Transposable elements (TEs) are mobile DNA elements frequently found in plants and are suggested to play important roles in plant sex chromosome evolution. The genomes of several dioecious plants have been sequenced, offering an opportunity to annotate and mine the TE data. However, comprehensive and unified annotation of TEs in these dioecious plants is still lacking. In this study, we constructed a dioecious plant transposable element database (DPTEdb). DPTEdb is a specific, comprehensive and unified relational database and web interface. We used a combination of de novo, structure-based and homology-based approaches to identify TEs from the genome assemblies of previously published data, as well as our own. The database currently integrates eight dioecious plant species and a total of 31 340 TEs along with classification information. DPTEdb provides user-friendly web interfaces to browse, search and download the TE sequences in the database. Users can also use tools, including BLAST, GetORF, HMMER, Cut sequence and JBrowse, to analyze TE data. Given the role of TEs in plant sex chromosome evolution, the database will contribute to the investigation of TEs in structural, functional and evolutionary dynamics of the genome of dioecious plants. In addition, the database will supplement the research of sex diversification and sex chromosome evolution of dioecious plants. Database URL: http://genedenovoweb.ticp.net:81/DPTEdb/index.php. © The Author(s) 2016. Published by Oxford University Press.

  11. Database modeling to integrate macrobenthos data in Spatial Data Infrastructure

    Directory of Open Access Journals (Sweden)

    José Alberto Quintanilha

    2012-08-01

    Full Text Available Coastal zones are complex areas that include marine and terrestrial environments. Besides their huge environmental wealth, they also attract humans because they provide food, recreation, business, and transportation, among others. Some of the difficulties in managing these areas are related to their complexity, the diversity of interests, and the absence of standards for collecting and sharing data with the scientific community, public agencies, and others. Organizing, standardizing, and sharing this information through a Web Atlas is essential to support planning and decision making. The construction of a spatial database integrating environmental data, to be used in a Spatial Data Infrastructure (SDI), is illustrated by a bioindicator of sediment quality. The models show the phases required to build the Macrobenthos spatial database, using the Santos Metropolitan Region as a reference. It is concluded that, when working with environmental data, structuring the knowledge in a conceptual model is essential for its subsequent integration into the SDI. During the modeling process it was noticed that methodological issues related to the collection process may obstruct or prejudice the integration of data from different studies of the same area. The database model developed in this study can be used as a reference for further research with similar goals.

  12. A Numerical Study of Quantization-Based Integrators

    Directory of Open Access Journals (Sweden)

    Barros Fernando

    2014-01-01

    Full Text Available Adaptive step size solvers are nowadays considered fundamental to achieve efficient ODE integration. While, traditionally, ODE solvers have been designed based on discrete time machines, new approaches based on discrete event systems have been proposed. Quantization provides an efficient integration technique based on signal threshold crossing, leading to independent and modular solvers communicating through discrete events. These solvers can benefit from the large body of knowledge on discrete event simulation techniques, like parallelization, to obtain efficient numerical integration. In this paper we introduce new solvers based on quantization and adaptive sampling techniques. Preliminary numerical results comparing these solvers are presented.
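    The solvers themselves are not included in the abstract. The following is a minimal sketch of the core idea behind quantization-based integration: a simplified first-order QSS scheme for a scalar ODE, which advances time to the next quantum crossing instead of taking fixed steps. The quantum `dq` and the test equation are illustrative choices only (production QSS solvers also add hysteresis, which is omitted here):

```python
def qss1(f, x0, t_end, dq=0.01):
    """Simplified first-order Quantized State System (QSS1) solver for x' = f(x).

    Between events the quantized state q is held constant, so the state
    evolves linearly; the next event is the instant |x - q| reaches dq.
    """
    t, x = 0.0, x0
    q = x                      # quantized state, piecewise constant
    ts, xs = [t], [x]
    while t < t_end:
        dx = f(q)
        if dx == 0.0:          # equilibrium: no further threshold crossings
            break
        dt = dq / abs(dx)      # time until the state drifts by one quantum
        t = min(t + dt, t_end)
        x += dx * (t - ts[-1])  # linear advance (possibly a partial step at t_end)
        q = x                   # event: re-quantize and re-evaluate f
        ts.append(t)
        xs.append(x)
    return ts, xs

# exponential decay x' = -x, x(0) = 1, integrated to t = 5
ts, xs = qss1(lambda x: -x, 1.0, 5.0, dq=0.001)
```

    For this stable linear system the global error is bounded by the quantum, so the final value tracks e^(-5) to within roughly dq; shrinking dq trades more events for more accuracy, the QSS analogue of shrinking a time step.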

  13. Integrated database for rapid mass movements in Norway

    Directory of Open Access Journals (Sweden)

    C. Jaedicke

    2009-03-01

    Full Text Available Rapid gravitational slope mass movements include all kinds of short term relocation of geological material, snow or ice. Traditionally, information about such events is collected separately in different databases covering selected geographical regions and types of movement. In Norway the terrain is susceptible to all types of rapid gravitational slope mass movements ranging from single rocks hitting roads and houses to large snow avalanches and rock slides where entire mountainsides collapse into fjords creating flood waves and endangering large areas. In addition, quick clay slides occur in desalinated marine sediments in South Eastern and Mid Norway. For the authorities and inhabitants of endangered areas, the type of threat is of minor importance and mitigation measures have to consider several types of rapid mass movements simultaneously.

    An integrated national database for all types of rapid mass movements built around individual events has been established. Only three data entries are mandatory: time, location and type of movement. The remaining optional parameters enable recording of detailed information about the terrain, materials involved and damages caused. Pictures, movies and other documentation can be uploaded into the database. A web-based graphical user interface has been developed allowing new events to be entered, as well as editing and querying for all events. An integration of the database into a GIS system is currently under development.

    Datasets from various national sources like the road authorities and the Geological Survey of Norway were imported into the database. Today, the database contains 33 000 rapid mass movement events from the last five hundred years, covering the entire country. A first analysis of the data shows that the most frequently recorded types of rapid mass movement are rock slides and snow avalanches, followed by debris slides in third place. Most events are recorded in the steep fjord

  14. Canonical algorithms for numerical integration of charged particle motion equations

    Science.gov (United States)

    Efimov, I. N.; Morozov, E. A.; Morozova, A. R.

    2017-02-01

    A technique for numerically integrating the equation of charged particle motion in a magnetic field is considered. It is based on the canonical transformations of the phase space in Hamiltonian mechanics. The canonical transformations make the integration process stable against counting error accumulation. The integration algorithms contain a minimum possible amount of arithmetics and can be used to design accelerators and devices of electron and ion optics.
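    The paper's canonical-transformation algorithms are not given in the abstract. As a stand-in illustration of the same design goal, stability against accumulation of counting error when integrating charged-particle motion in a magnetic field, here is the standard Boris pusher. This is not the authors' method, but it shares the key structure-preserving property: the velocity update is an exact rotation, so the particle speed (and hence kinetic energy) shows no secular drift. Units and field values below are arbitrary:

```python
import math

def boris_push(x, v, qm, B, dt, steps):
    """Boris scheme for x' = v, v' = (q/m) v x B (electric field E = 0).

    Each step rotates v about B through the exact half-angle construction,
    which preserves |v| and prevents energy drift over long integrations.
    """
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    for _ in range(steps):
        t = tuple(qm * bi * dt / 2 for bi in B)        # half-step rotation vector
        t2 = sum(ti * ti for ti in t)
        s = tuple(2 * ti / (1 + t2) for ti in t)
        vp = tuple(vi + ci for vi, ci in zip(v, cross(v, t)))   # v' = v + v x t
        v = tuple(vi + ci for vi, ci in zip(v, cross(vp, s)))   # v+ = v + v' x s
        x = tuple(xi + vi * dt for xi, vi in zip(x, v))
    return x, v

# 1000 steps of gyration in a uniform field B = z-hat
x, v = boris_push((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 1.0, (0.0, 0.0, 1.0), 0.1, 1000)
speed = math.sqrt(sum(vi * vi for vi in v))  # stays at 1 up to round-off
```

    A non-geometric scheme such as explicit Euler would, by contrast, spiral outward with monotonically growing speed on the same problem.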

  15. An integrated web medicinal materials DNA database: MMDBD (Medicinal Materials DNA Barcode Database)

    Directory of Open Access Journals (Sweden)

    But Paul

    2010-06-01

    Full Text Available Abstract Background Thousands of plants and animals possess pharmacological properties and there is an increased interest in using these materials for therapy and health maintenance. Efficacy of the application is critically dependent on the use of genuine materials. From time to time, life-threatening poisoning occurs because a toxic adulterant or substitute is administered. DNA barcoding provides a definitive means of authentication and for conducting molecular systematics studies. Owing to the reduced cost of DNA authentication, the volume of the DNA barcodes produced for medicinal materials is on the rise and necessitates the development of an integrated DNA database. Description We have developed an integrated DNA barcode multimedia information platform, the Medicinal Materials DNA Barcode Database (MMDBD), for data retrieval and similarity search. MMDBD contains over 1000 species of medicinal materials listed in the Chinese Pharmacopoeia and American Herbal Pharmacopoeia. MMDBD also contains useful information on the medicinal materials, including resources, adulterant information, medicinal parts, photographs, primers used for obtaining the barcodes and key references. MMDBD can be accessed at http://www.cuhk.edu.hk/icm/mmdbd.htm. Conclusions This work provides a centralized medicinal materials DNA barcode database and bioinformatics tools for data storage, analysis and exchange for promoting the identification of medicinal materials. MMDBD has the largest collection of DNA barcodes of medicinal materials and is a useful resource for researchers in conservation, systematic study, forensics and the herbal industry.

  16. KAIKObase: An integrated silkworm genome database and data mining tool

    Directory of Open Access Journals (Sweden)

    Nagaraju Javaregowda

    2009-10-01

    Full Text Available Abstract Background The silkworm, Bombyx mori, is one of the most economically important insects in many developing countries owing to its large-scale cultivation for silk production. With the development of genomic and biotechnological tools, B. mori has also become an important bioreactor for production of various recombinant proteins of biomedical interest. In 2004, two genome sequencing projects for B. mori were reported independently by Chinese and Japanese teams; however, the datasets were insufficient for building long genomic scaffolds which are essential for unambiguous annotation of the genome. Now, both the datasets have been merged and assembled through a joint collaboration between the two groups. Description Integration of the two data sets of silkworm whole-genome-shotgun sequencing by the Japanese and Chinese groups, together with newly obtained fosmid- and BAC-end sequences, produced the best continuity (~3.7 Mb in N50 scaffold size) among the sequenced insect genomes and provided a high degree of nucleotide coverage (88%) of all 28 chromosomes. In addition, a physical map of BAC contigs constructed by fingerprinting BAC clones and a SNP linkage map constructed using BAC-end sequences were available. In parallel, proteomic data from two-dimensional polyacrylamide gel electrophoresis in various tissues and developmental stages were compiled into a silkworm proteome database. Finally, a Bombyx trap database was constructed for documenting insertion positions and expression data of transposon insertion lines. Conclusion For efficient usage of genome information for functional studies, genomic sequences, physical and genetic map information and EST data were compiled into KAIKObase, an integrated silkworm genome database which consists of 4 map viewers, a gene viewer, and sequence, keyword and position search systems to display results and data at the level of nucleotide sequence, gene, scaffold and chromosome. Integration of the

  17. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    International Nuclear Information System (INIS)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-01-01

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. 
The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  18. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-06-17

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. 
The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  19. Integration of curated databases to identify genotype-phenotype associations

    Directory of Open Access Journals (Sweden)

    Li Jianrong

    2006-10-01

    Full Text Available Abstract Background The ability to rapidly characterize an unknown microorganism is critical in both responding to infectious disease and biodefense. To do this, we need some way of anticipating an organism's phenotype based on the molecules encoded by its genome. However, the link between molecular composition (i.e. genotype) and phenotype for microbes is not obvious. While there have been several studies that address this challenge, none have yet proposed a large-scale method integrating curated biological information. Here we utilize a systematic approach to discover genotype-phenotype associations that combines phenotypic information from a biomedical informatics database, GIDEON, with the molecular information contained in National Center for Biotechnology Information's Clusters of Orthologous Groups database (NCBI COGs). Results Integrating the information in the two databases, we are able to correlate the presence or absence of a given protein in a microbe with its phenotype as measured by certain morphological characteristics or survival in a particular growth medium. With a 0.8 correlation score threshold, 66% of the associations found were confirmed by the literature, and at a 0.9 correlation threshold, 86% were positively verified. Conclusion Our results suggest possible phenotypic manifestations for proteins biochemically associated with sugar metabolism and electron transport. Moreover, we believe our approach can be extended to linking pathogenic phenotypes with functionally related proteins.

  20. Numerical calculations in elementary quantum mechanics using Feynman path integrals

    International Nuclear Information System (INIS)

    Scher, G.; Smith, M.; Baranger, M.

    1980-01-01

    We show that it is possible to do numerical calculations in elementary quantum mechanics using Feynman path integrals. Our method involves discretizing both time and space, and summing paths through matrix multiplication. We give numerical results for various one-dimensional potentials. The calculations of energy levels and wavefunctions take approximately 100 times longer than with standard methods, but there are other problems for which such an approach should be more efficient.
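    The abstract's recipe (discretize time and space, sum over paths by matrix multiplication) can be sketched in imaginary time, where repeated multiplication by the one-time-slice transfer matrix projects an arbitrary state onto the ground state. The grid extent, time step, iteration count, and the harmonic test potential below are illustrative choices, not taken from the paper (units hbar = m = 1):

```python
import math

def ground_state_energy(V, L=5.0, n=51, eps=0.2, iters=200):
    """Imaginary-time path-integral estimate of the ground-state energy.

    Space is discretized on n points in [-L, L]; one Euclidean time slice
    becomes a transfer matrix T (free-particle Gaussian kernel, Trotter-split
    with the potential).  Repeated matrix-vector multiplication projects onto
    the ground state, whose energy is E0 = -ln(lambda_max) / eps.
    """
    xs = [-L + 2 * L * i / (n - 1) for i in range(n)]
    dx = xs[1] - xs[0]
    norm = math.sqrt(1.0 / (2 * math.pi * eps))
    T = [[norm * math.exp(-(xi - xj) ** 2 / (2 * eps)
                          - eps * (V(xi) + V(xj)) / 2) * dx
          for xj in xs] for xi in xs]
    psi = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        psi = [sum(T[i][j] * psi[j] for j in range(n)) for i in range(n)]
        lam = max(abs(p) for p in psi)  # dominant eigenvalue estimate
        psi = [p / lam for p in psi]    # renormalize each sweep
    return -math.log(lam) / eps

# harmonic oscillator V(x) = x^2/2: exact ground-state energy is 0.5
e0 = ground_state_energy(lambda x: 0.5 * x * x)
```

    The residual error here comes from the finite time step (Trotter splitting) and the spatial grid; both shrink as eps and dx are reduced, at the cost of more and larger matrix multiplications.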

  1. Integrated olfactory receptor and microarray gene expression databases

    Directory of Open Access Journals (Sweden)

    Crasto Chiquito J

    2007-06-01

    Full Text Available Abstract Background Gene expression patterns of olfactory receptors (ORs) are an important component of the signal encoding mechanism in the olfactory system, since they determine the interactions between odorant ligands and sensory neurons. We have developed the Olfactory Receptor Microarray Database (ORMD) to house OR gene expression data. ORMD is integrated with the Olfactory Receptor Database (ORDB), which is a key repository of OR gene information. Both databases aim to aid experimental research related to olfaction. Description ORMD is a Web-accessible database that provides a secure data repository for OR microarray experiments. It contains both publicly available and private data; accessing the latter requires authenticated login. The ORMD is designed to allow users not only to deposit gene expression data but also to manage their projects/experiments. For example, contributors can choose whether to make their datasets public. For each experiment, users can download the raw data files and view and export the gene expression data. For each OR gene being probed in a microarray experiment, a hyperlink to that gene in ORDB provides access to genomic and proteomic information related to the corresponding olfactory receptor. Individual ORs archived in ORDB are also linked to ORMD, allowing users access to the related microarray gene expression data. Conclusion ORMD serves as a data repository and project management system. It facilitates the study of microarray experiments of gene expression in the olfactory system. In conjunction with ORDB, ORMD integrates gene expression data with the genomic and functional data of ORs, and is thus a useful resource for both olfactory researchers and the public.

  2. Quality controls in integrative approaches to detect errors and inconsistencies in biological databases

    Directory of Open Access Journals (Sweden)

    Ghisalberti Giorgio

    2010-12-01

    Full Text Available Numerous biomolecular data are available, but they are scattered in many databases and only some of them are curated by experts. Most available data are computationally derived and include errors and inconsistencies. Effective use of available data in order to derive new knowledge hence requires data integration and quality improvement. Many approaches for data integration have been proposed. Data warehousing seems to be the most adequate when comprehensive analysis of integrated data is required. This makes it the most suitable also to implement comprehensive quality controls on integrated data. We previously developed GFINDer (http://www.bioinformatics.polimi.it/GFINDer/), a web system that supports scientists in effectively using available information. It allows comprehensive statistical analysis and mining of functional and phenotypic annotations of gene lists, such as those identified by high-throughput biomolecular experiments. The GFINDer backend is composed of a multi-organism genomic and proteomic data warehouse (GPDW). Within the GPDW, several controlled terminologies and ontologies, which describe gene and gene product related biomolecular processes, functions and phenotypes, are imported and integrated, together with their associations with genes and proteins of several organisms. In order to ease keeping the GPDW updated and to ensure the best possible quality of the data integrated in subsequent updates of the data warehouse, we developed several automatic procedures. Within them, we implemented numerous data quality control techniques to test the integrated data for a variety of possible errors and inconsistencies. Among other features, the implemented controls check data structure and completeness, ontological data consistency, ID format and evolution, unexpected data quantification values, and consistency of data from single and multiple sources. We use the implemented controls to analyze the quality of data available from several

  3. Numerical computation of molecular integrals via optimized (vectorized) FORTRAN code

    International Nuclear Information System (INIS)

    Scott, T.C.; Grant, I.P.; Saunders, V.R.

    1997-01-01

    The calculation of molecular properties based on quantum mechanics is an area of fundamental research whose horizons have always been determined by the power of state-of-the-art computers. A computational bottleneck is the numerical calculation of the required molecular integrals to sufficient precision. Herein, we present a method for the rapid numerical evaluation of molecular integrals using optimized FORTRAN code generated by Maple. The method is based on the exploitation of common intermediates and the optimization can be adjusted to both serial and vectorized computations. (orig.)

  4. Numerical solution of boundary-integral equations for molecular electrostatics.

    Science.gov (United States)

    Bardhan, Jaydeep P

    2009-03-07

    Numerous molecular processes, such as ion permeation through channel proteins, are governed by relatively small changes in energetics. As a result, theoretical investigations of these processes require accurate numerical methods. In the present paper, we evaluate the accuracy of two approaches to simulating boundary-integral equations for continuum models of the electrostatics of solvation. The analysis emphasizes boundary-element method simulations of the integral-equation formulation known as the apparent-surface-charge (ASC) method or polarizable-continuum model (PCM). In many numerical implementations of the ASC/PCM model, one forces the integral equation to be satisfied exactly at a set of discrete points on the boundary. We demonstrate in this paper that this approach to discretization, known as point collocation, is significantly less accurate than an alternative approach known as qualocation. Furthermore, the qualocation method offers this improvement in accuracy without increasing simulation time. Numerical examples demonstrate that the electrostatic part of the solvation free energy, when calculated using the collocation and qualocation methods, can differ significantly; for a polypeptide, the answers can differ by as much as 10 kcal/mol (approximately 4% of the total electrostatic contribution to solvation). The applicability of the qualocation discretization to other integral-equation formulations is also discussed, and two equivalences between integral-equation methods are derived.

  5. Numerical evaluation of tensor Feynman integrals in Euclidean kinematics

    Energy Technology Data Exchange (ETDEWEB)

    Gluza, J.; Kajda [Silesia Univ., Katowice (Poland). Inst. of Physics; Riemann, T.; Yundin, V. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2010-10-15

    For the investigation of higher order Feynman integrals, potentially with tensor structure, it is highly desirable to have numerical methods and automated tools for dedicated, but sufficiently 'simple' numerical approaches. We elaborate two algorithms for this purpose which may be applied in the Euclidean kinematical region and in d=4-2ε dimensions. One method uses Mellin-Barnes representations for the Feynman parameter representation of multi-loop Feynman integrals with arbitrary tensor rank. Our Mathematica package AMBRE has been extended for that purpose, and together with the packages MB (M. Czakon) or MBresolve (A. V. Smirnov and V. A. Smirnov) one may perform automatically a numerical evaluation of planar tensor Feynman integrals. Alternatively, one may apply sector decomposition to planar and non-planar multi-loop ε-expanded Feynman integrals with arbitrary tensor rank. We automated the preparation of Feynman integrals for an immediate application of the package sectordecomposition (C. Bogner and S. Weinzierl), so that one has to give only a proper definition of propagators and numerators. The efficiency of the two implementations, based on Mellin-Barnes representations and sector decompositions, is compared. The computational packages are publicly available. (orig.)

  6. Numerical Integration of the Transport Equation For Infinite Homogeneous Media

    Energy Technology Data Exchange (ETDEWEB)

    Haakansson, Rune

    1962-01-15

    The transport equation for neutrons in infinite homogeneous media is solved by direct numerical integration. Account is taken of anisotropy and inelastic scattering. The integration has been performed by means of the trapezoidal rule, with energy intervals of constant length on the lethargy scale. The machine used is a Ferranti Mercury computer. Results are given for water, heavy water, an aluminium-water mixture and an iron-aluminium-water mixture.

  7. Modelling of multidimensional quantum systems by the numerical functional integration

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.; Zhidkov, E.P.

    1990-01-01

    The employment of numerical functional integration for the description of multidimensional systems in quantum and statistical physics is considered. For multiple functional integrals with respect to Gaussian measures in full separable metric spaces, new approximation formulas exact on a class of polynomial functionals of a given total degree are constructed. The use of the formulas is demonstrated on the example of computing the Green function and the ground-state energy in the multidimensional Calogero model. 15 refs.; 2 tabs

  8. GDR (Genome Database for Rosaceae): integrated web-database for Rosaceae genomics and genetics data.

    Science.gov (United States)

    Jung, Sook; Staton, Margaret; Lee, Taein; Blenda, Anna; Svancara, Randall; Abbott, Albert; Main, Dorrie

    2008-01-01

    The Genome Database for Rosaceae (GDR) is a central repository of curated and integrated genetics and genomics data of Rosaceae, an economically important family which includes apple, cherry, peach, pear, raspberry, rose and strawberry. GDR contains annotated databases of all publicly available Rosaceae ESTs, the genetically anchored peach physical map, Rosaceae genetic maps and comprehensively annotated markers and traits. The ESTs are assembled to produce unigene sets of each genus and the entire Rosaceae. Other annotations include putative function, microsatellites, open reading frames, single nucleotide polymorphisms, gene ontology terms and anchored map position where applicable. Most of the published Rosaceae genetic maps can be viewed and compared through CMap, the comparative map viewer. The peach physical map can be viewed using WebFPC/WebChrom, and also through our integrated GDR map viewer, which serves as a portal to the combined genetic, transcriptome and physical mapping information. ESTs, BACs, markers and traits can be queried by various categories and the search result sites are linked to the mapping visualization tools. GDR also provides online analysis tools such as a batch BLAST/FASTA server for the GDR datasets, a sequence assembly server and microsatellite and primer detection tools. GDR is available at http://www.rosaceae.org.

  9. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allow for the discovery of highly significant pathways from a set of disease vs. healthy samples in the NGS and GWAS. Knowledge of activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathway, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis with the context of the existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of
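Enrichment analysis of the kind IPAD performs is commonly based on an over-representation statistic; a minimal sketch using the hypergeometric tail is given below. This is the standard textbook test, not IPAD's own quantitative measure, and the gene counts are invented for illustration.

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeom(N, K, n): the chance of seeing at
    least k pathway genes when n genes are drawn from a universe of N
    genes, K of which belong to the pathway."""
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom

# Example: a 200-gene hit list from a 20,000-gene universe contains
# 8 members of a 100-gene pathway (expected count is only 1).
p = hypergeom_enrichment_p(20000, 100, 200, 8)
```

A small p indicates the pathway is over-represented in the hit list; resources such as IPAD extend this idea to disease, drug and organ associations.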

  10. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  11. Monograph - The Numerical Integration of Ordinary Differential Equations.

    Science.gov (United States)

    Hull, T. E.

    The materials presented in this monograph are intended to be included in a course on ordinary differential equations at the upper division level in a college mathematics program. These materials provide an introduction to the numerical integration of ordinary differential equations, and they can be used to supplement a regular text on this…

  12. Preserving Simplecticity in the Numerical Integration of Linear Beam Optics

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Christopher K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-07-01

    Presented are mathematical tools and methods for the development of numerical integration techniques that preserve the symplectic condition inherent to mechanics. The intended audience is beam physicists with backgrounds in numerical modeling and simulation, with particular attention to beam optics applications. The paper focuses on Lie methods that are inherently symplectic regardless of the integration accuracy order. Section 2 provides the mathematical tools used in the sequel and necessary for the reader to extend the covered techniques. Section 3 places those tools in the context of charged-particle beam optics; in particular, linear beam optics is presented in terms of a Lie algebraic matrix representation. Section 4 presents numerical stepping techniques with particular emphasis on a third-order leapfrog method. Section 5 discusses the modeling of field imperfections with particular attention to the fringe fields of quadrupole focusing magnets. The direct computation of a third-order transfer matrix for a fringe field is shown.
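The defining property of symplectic leapfrog stepping is that the energy error stays bounded instead of drifting. The sketch below uses the standard second-order Stoermer-Verlet scheme on a linear focusing system x'' = -k x (the report's third-order variant is not reproduced here).

```python
def stormer_verlet(x, p, k, h, steps):
    """Second-order Stoermer-Verlet (leapfrog) steps for x'' = -k*x,
    the equation of motion in a linear focusing channel."""
    traj = [(x, p)]
    for _ in range(steps):
        p -= 0.5 * h * k * x   # half kick
        x += h * p             # drift
        p -= 0.5 * h * k * x   # half kick
        traj.append((x, p))
    return traj

def energy(x, p, k):
    return 0.5 * p * p + 0.5 * k * x * x

k, h = 1.0, 0.1
traj = stormer_verlet(1.0, 0.0, k, h, 10000)
# The initial energy is 0.5; for a symplectic method the deviation
# oscillates at O(h^2) but never grows secularly.
drift = max(abs(energy(x, p, k) - 0.5) for x, p in traj)
```

A non-symplectic method such as explicit Euler would show the energy growing without bound over the same 10,000 steps.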

  13. Symbolic-Numeric Integration of the Dynamical Cosserat Equations

    KAUST Repository

    Lyakhov, Dmitry A.

    2017-08-29

    We devise a symbolic-numeric approach to the integration of the dynamical part of the Cosserat equations, a system of nonlinear partial differential equations describing the mechanical behavior of slender structures, like fibers and rods. This is based on our previous results on the construction of a closed form general solution to the kinematic part of the Cosserat system. Our approach combines methods of numerical exponential integration and symbolic integration of the intermediate system of nonlinear ordinary differential equations describing the dynamics of one of the arbitrary vector-functions in the general solution of the kinematic part in terms of the module of the twist vector-function. We present an experimental comparison with the well-established generalized α-method illustrating the computational efficiency of our approach for problems in structural mechanics.

  14. Symbolic-Numeric Integration of the Dynamical Cosserat Equations

    KAUST Repository

    Lyakhov, Dmitry A.; Gerdt, Vladimir P.; Weber, Andreas G.; Michels, Dominik L.

    2017-01-01

    We devise a symbolic-numeric approach to the integration of the dynamical part of the Cosserat equations, a system of nonlinear partial differential equations describing the mechanical behavior of slender structures, like fibers and rods. This is based on our previous results on the construction of a closed form general solution to the kinematic part of the Cosserat system. Our approach combines methods of numerical exponential integration and symbolic integration of the intermediate system of nonlinear ordinary differential equations describing the dynamics of one of the arbitrary vector-functions in the general solution of the kinematic part in terms of the module of the twist vector-function. We present an experimental comparison with the well-established generalized α-method illustrating the computational efficiency of our approach for problems in structural mechanics.

  15. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    Science.gov (United States)

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of Resource Description Framework (RDF) and made it available through the SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. Besides, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
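Querying such a SPARQL endpoint needs nothing beyond the standard library. The sketch below prepares a request against the MBGD endpoint mentioned in the abstract; the `orth:` class and property names in the query are illustrative placeholders, not the actual OrthO vocabulary.

```python
import urllib.parse
import urllib.request

def build_sparql_request(endpoint, query):
    """Prepare an HTTP POST request for a SPARQL endpoint, asking for
    results in the standard JSON serialization."""
    data = urllib.parse.urlencode({"query": query}).encode()
    return urllib.request.Request(
        endpoint, data=data,
        headers={"Accept": "application/sparql-results+json"})

# Hypothetical query: the prefix and property names below are
# illustrative, not the real OrthO terms.
QUERY = """
PREFIX orth: <http://example.org/ortho#>
SELECT ?group ?gene WHERE {
  ?group a orth:OrthologGroup ;
         orth:member ?gene .
} LIMIT 10
"""
req = build_sparql_request("http://mbgd.genome.ad.jp/sparql", QUERY)
# Sending the request (urllib.request.urlopen(req)) returns JSON
# bindings that can be joined with other RDF resources.
```

Because SPARQL results are plain JSON, the ortholog hub can be joined client-side with taxonomy or Gene Ontology data fetched the same way.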

  16. Integration of a clinical trial database with a PACS

    International Nuclear Information System (INIS)

    Van Herk, M

    2014-01-01

    Many clinical trials use Electronic Case Report Forms (ECRF), e.g., from OpenClinica. Trial data is augmented if DICOM scans, dose cubes, etc. from the Picture Archiving and Communication System (PACS) are included for data mining. Unfortunately, there is as yet no structured way to collect DICOM objects in trial databases. In this paper, we obtain a tight integration of ECRF and PACS using open source software. Methods: DICOM identifiers for selected images/series/studies are stored in associated ECRF events (e.g., baseline) as follows: 1) JavaScript added to OpenClinica communicates with a gateway server inside the hospital's firewall; 2) On this gateway, an open source DICOM server runs scripts to query and select the data, returning anonymized identifiers; 3) The scripts then collect, anonymize, zip and transmit the selected data to a central trial server; 4) Here data is stored in a DICOM archive which allows authorized ECRF users to view and download the anonymous images associated with each event. Results: All integration scripts are open source. The PACS administrator configures the anonymization script and decides to use the gateway in passive (receiving) mode or in an active mode going out to the PACS to gather data. Our ECRF centric approach supports automatic data mining by iterating over the cases in the ECRF database, providing the identifiers to load images and the clinical data to correlate with image analysis results. Conclusions: Using open source software and web technology, a tight integration has been achieved between PACS and ECRF.

  17. Dynamically Integrating OSM Data into a Borderland Database

    Directory of Open Access Journals (Sweden)

    Xiaoguang Zhou

    2015-09-01

    Full Text Available Spatial data are fundamental for borderland analyses of geography, natural resources, demography, politics, economy, and culture. As the spatial data used in borderland research usually cover the borderland regions of several neighboring countries, it is difficult for any one research institution or government to collect them. Volunteered Geographic Information (VGI) is a highly successful method for acquiring timely and detailed global spatial data at a very low cost. Therefore, VGI is a reasonable source of borderland spatial data. OpenStreetMap (OSM) is known as the most successful VGI resource. However, OSM's data model is far different from the traditional geographic information model. Thus, the OSM data must be converted into the scientist's customized data model. Because the real world changes rapidly, the converted data must be updated incrementally. Therefore, this paper presents a method used to dynamically integrate OSM data into the borderland database. In this method, a basic transformation rule base is formed by comparing the OSM Map Feature description document and the destination model definitions. Using the basic rules, the main features can be automatically converted to the destination model. A human-computer interaction model transformation and a rule/automatic-remember mechanism are developed to interactively transfer the unusual features that cannot be transferred by the basic rules to the target model and to remember the reusable rules automatically. To keep the borderland database current, the global OsmChange daily diff file is used to extract the change-only information for the research region. To extract the changed objects in the region under study, the relationship between the changed object and the research region is analyzed considering the evolution of the involved objects. In addition, five rules are determined to select the objects and integrate the changed objects with multi-versions over time. The objects
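The rule-base-plus-remember mechanism described above can be sketched as a lookup table of OSM key/value pairs with a queue for features that need interactive handling. The destination class names here are illustrative, not the paper's actual borderland model.

```python
# Basic transformation rules: OSM tag pairs mapped to classes of a
# hypothetical destination borderland model (names are illustrative).
BASIC_RULES = {
    ("highway", "primary"): "RoadPrimary",
    ("highway", "secondary"): "RoadSecondary",
    ("waterway", "river"): "River",
    ("boundary", "administrative"): "AdminBoundary",
}

UNMATCHED = []  # features queued for human-computer interactive mapping

def convert_feature(tags):
    """Convert one OSM feature to the destination model. Rules confirmed
    interactively can be appended to BASIC_RULES, which is the
    rule/automatic-remember mechanism in miniature."""
    for key, value in tags.items():
        cls = BASIC_RULES.get((key, value))
        if cls is not None:
            return cls
    UNMATCHED.append(tags)  # unusual feature: defer to the operator
    return None

road = convert_feature({"highway": "primary", "name": "N1"})
```

Applying the OsmChange daily diff then reduces to running `convert_feature` only over created or modified objects that intersect the research region.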

  18. Microwave Breast Imaging System Prototype with Integrated Numerical Characterization

    Directory of Open Access Journals (Sweden)

    Mark Haynes

    2012-01-01

    Full Text Available The increasing number of experimental microwave breast imaging systems and the need to properly model them have motivated our development of an integrated numerical characterization technique. We use Ansoft HFSS and a formalism we developed previously to numerically characterize an S-parameter-based breast imaging system and link it to an inverse scattering algorithm. We show successful reconstructions of simple test objects using synthetic and experimental data. We demonstrate the sensitivity of image reconstructions to knowledge of the background dielectric properties and show the limits of the current model.

  19. TOMATOMICS: A Web Database for Integrated Omics Information in Tomato

    KAUST Repository

    Kudo, Toru; Kobayashi, Masaaki; Terashima, Shin; Katayama, Minami; Ozaki, Soichi; Kanno, Maasa; Saito, Misa; Yokoyama, Koji; Ohyanagi, Hajime; Aoki, Koh; Kubo, Yasutaka; Yano, Kentaro

    2016-01-01

    Solanum lycopersicum (tomato) is an important agronomic crop and a major model fruit-producing plant. To facilitate basic and applied research, comprehensive experimental resources and omics information on tomato are available following their development. Mutant lines and cDNA clones from a dwarf cultivar, Micro-Tom, are two of these genetic resources. Large-scale sequencing data for ESTs and full-length cDNAs from Micro-Tom continue to be gathered. In conjunction with information on the reference genome sequence of another cultivar, Heinz 1706, the Micro-Tom experimental resources have facilitated comprehensive functional analyses. To enhance the efficiency of acquiring omics information for tomato biology, we have integrated the information on the Micro-Tom experimental resources and the Heinz 1706 genome sequence. We have also inferred gene structure by comparison of sequences between the genome of Heinz 1706 and the transcriptome, which are comprised of Micro-Tom full-length cDNAs and Heinz 1706 RNA-seq data stored in the KaFTom and Sequence Read Archive databases. In order to provide large-scale omics information with streamlined connectivity we have developed and maintain a web database TOMATOMICS (http://bioinf.mind.meiji.ac.jp/tomatomics/). In TOMATOMICS, access to the information on the cDNA clone resources, full-length mRNA sequences, gene structures, expression profiles and functional annotations of genes is available through search functions and the genome browser, which has an intuitive graphical interface.

  20. TOMATOMICS: A Web Database for Integrated Omics Information in Tomato

    KAUST Repository

    Kudo, Toru

    2016-11-29

    Solanum lycopersicum (tomato) is an important agronomic crop and a major model fruit-producing plant. To facilitate basic and applied research, comprehensive experimental resources and omics information on tomato are available following their development. Mutant lines and cDNA clones from a dwarf cultivar, Micro-Tom, are two of these genetic resources. Large-scale sequencing data for ESTs and full-length cDNAs from Micro-Tom continue to be gathered. In conjunction with information on the reference genome sequence of another cultivar, Heinz 1706, the Micro-Tom experimental resources have facilitated comprehensive functional analyses. To enhance the efficiency of acquiring omics information for tomato biology, we have integrated the information on the Micro-Tom experimental resources and the Heinz 1706 genome sequence. We have also inferred gene structure by comparison of sequences between the genome of Heinz 1706 and the transcriptome, which are comprised of Micro-Tom full-length cDNAs and Heinz 1706 RNA-seq data stored in the KaFTom and Sequence Read Archive databases. In order to provide large-scale omics information with streamlined connectivity we have developed and maintain a web database TOMATOMICS (http://bioinf.mind.meiji.ac.jp/tomatomics/). In TOMATOMICS, access to the information on the cDNA clone resources, full-length mRNA sequences, gene structures, expression profiles and functional annotations of genes is available through search functions and the genome browser, which has an intuitive graphical interface.

  1. Numerical treatments for solving nonlinear mixed integral equation

    Directory of Open Access Journals (Sweden)

    M.A. Abdou

    2016-12-01

    Full Text Available We consider a mixed type of nonlinear integral equation (MNLIE) of the second kind in the space C[0,T]×L2(Ω), T<1. The Volterra integral terms (VITs) are considered in time with continuous kernels, while the Fredholm integral term (FIT) is considered in position with a singular general kernel. Using the quadratic method and the separation of variables method, we obtain a nonlinear system of Fredholm integral equations (NLSFIEs) with singular kernel. A Toeplitz matrix method, in each case, is then used to obtain a nonlinear algebraic system. Numerical results are calculated when the kernels take a logarithmic form or a Carleman function. Moreover, the error estimates, in each case, are then computed.
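The quadrature step that turns an integral equation into an algebraic system can be illustrated on a much simpler case than the paper's: a linear Fredholm equation of the second kind solved by the trapezoidal Nystrom method. The kernel and right-hand side below are chosen so the exact solution is phi(x) = x; this is a sketch of the general reduction, not the paper's nonlinear singular-kernel scheme.

```python
def nystrom_fredholm(kernel, f, a, b, n):
    """Solve phi(x) = f(x) + int_a^b K(x,t) phi(t) dt on n nodes:
    quadrature turns it into the linear system (I - K*W) phi = f."""
    h = (b - a) / (n - 1)
    xs = [a + i * h for i in range(n)]
    w = [h] * n
    w[0] = w[-1] = h / 2          # trapezoidal weights
    A = [[(1.0 if i == j else 0.0) - kernel(xs[i], xs[j]) * w[j]
          for j in range(n)] for i in range(n)]
    rhs = [f(x) for x in xs]
    # Plain Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            rhs[r] -= m * rhs[col]
    phi = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * phi[c] for c in range(r + 1, n))
        phi[r] = (rhs[r] - s) / A[r][r]
    return xs, phi

# K(x,t) = x*t with f(x) = 2x/3 has the exact solution phi(x) = x.
xs, phi = nystrom_fredholm(lambda x, t: x * t,
                           lambda x: 2 * x / 3, 0.0, 1.0, 101)
err = max(abs(p - x) for x, p in zip(xs, phi))
```

For the nonlinear case in the abstract, the linear solve is replaced by iteration on a nonlinear algebraic system, with the Toeplitz structure exploited for the singular kernel.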

  2. Implementation of a revised numerical integration technique into QAD

    International Nuclear Information System (INIS)

    De Gangi, N.L.

    1983-01-01

    A technique for numerical integration through a uniform volume source is developed. It is applied to gamma radiation transport shielding problems. The method is based on performing a numerical angular and ray point kernel integration and is incorporated into the QAD-CG computer code (i.e. QAD-UE). Several test problems are analyzed with this technique. Convergence properties of the method are analyzed. Gamma dose rates from a large tank and post LOCA dose rates inside a containment building are evaluated. Results are consistent with data from other methods. The new technique provides several advantages. User setup requirements for large volume source problems are reduced from standard point kernel requirements. Calculational efficiencies are improved. An order of magnitude improvement is seen with a test problem

  3. Application of symplectic integrator to numerical fluid analysis

    International Nuclear Information System (INIS)

    Tanaka, Nobuatsu

    2000-01-01

    This paper focuses on application of the symplectic integrator to numerical fluid analysis. For this purpose, we introduce Hamiltonian particle dynamics to simulate fluid behavior. The method is based on both the Hamiltonian formulation of a system and the particle methods, and is therefore called Hamiltonian Particle Dynamics (HPD). In this paper, an example of HPD applications, namely the behavior of incompressible inviscid fluid, is solved. In order to improve the spatial accuracy of HPD, CIVA, a highly accurate interpolation method, is combined with it, but the combined method suffers from the problem that the invariants of the system are not conserved in long-time computations. To solve this problem, symplectic time integrators are introduced and their effectiveness is confirmed by numerical analyses. (author)

  4. Practical integrated simulation systems for coupled numerical simulations in parallel

    Energy Technology Data Exchange (ETDEWEB)

    Osamu, Hazama; Zhihong, Guo [Japan Atomic Energy Research Inst., Centre for Promotion of Computational Science and Engineering, Tokyo (Japan)

    2003-07-01

    In order for numerical simulations to reflect 'real-world' phenomena and occurrences, the incorporation of multidisciplinary and multi-physics simulations considering various physical models and factors is becoming essential. However, many obstacles still inhibit such numerical simulations. For example, it is still difficult in many instances to develop satisfactory software packages which allow for such coupled simulations, and such simulations require more computational resources. A precise multi-physics simulation today requires parallel processing, which again makes it a complicated process. Under the international cooperative efforts between CCSE/JAERI and Fraunhofer SCAI, a German institute, a library called MpCCI, or Mesh-based Parallel Code Coupling Interface, has been implemented together with a library called STAMPI to couple two existing codes into an 'integrated numerical simulation system' intended for meta-computing environments. (authors)

  5. Data integration for plant genomics--exemplars from the integration of Arabidopsis thaliana databases.

    Science.gov (United States)

    Lysenko, Artem; Hindle, Matthew Morritt; Taubert, Jan; Saqi, Mansoor; Rawlings, Christopher John

    2009-11-01

    The development of a systems based approach to problems in plant sciences requires integration of existing information resources. However, the available information is currently often incomplete and dispersed across many sources and the syntactic and semantic heterogeneity of the data is a challenge for integration. In this article, we discuss strategies for data integration and we use a graph based integration method (Ondex) to illustrate some of these challenges with reference to two example problems concerning integration of (i) metabolic pathway and (ii) protein interaction data for Arabidopsis thaliana. We quantify the degree of overlap for three commonly used pathway and protein interaction information sources. For pathways, we find that the AraCyc database contains the widest coverage of enzyme reactions and for protein interactions we find that the IntAct database provides the largest unique contribution to the integrated dataset. For both examples, however, we observe a relatively small amount of data common to all three sources. Analysis and visual exploration of the integrated networks was used to identify a number of practical issues relating to the interpretation of these datasets. We demonstrate the utility of these approaches to the analysis of groups of coexpressed genes from an individual microarray experiment, in the context of pathway information and for the combination of coexpression data with an integrated protein interaction network.
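The overlap quantification described above (common core vs. unique contribution per source) is, at its heart, set algebra over identifiers. The sketch below uses toy reaction IDs, not the real AraCyc/IntAct contents.

```python
def source_overlap(sources):
    """For named datasets of identifiers, return the elements common to
    all sources and, per source, the elements unique to it, mirroring
    the coverage comparison in the integration study."""
    names = list(sources)
    common = set.intersection(*(set(v) for v in sources.values()))
    unique = {
        n: set(sources[n]) - set.union(*(set(sources[m])
                                         for m in names if m != n))
        for n in names
    }
    return common, unique

# Toy example with three hypothetical pathway sources.
sources = {
    "AraCyc": {"R1", "R2", "R3", "R4"},
    "KEGG": {"R2", "R3", "R5"},
    "PlantCyc": {"R3", "R4", "R6"},
}
common, unique = source_overlap(sources)
```

A small `common` set relative to the unions, as here, is exactly the pattern the authors report: integration adds coverage precisely because the sources disagree.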

  6. Loop integration results using numerical extrapolation for a non-scalar integral

    International Nuclear Information System (INIS)

    Doncker, E. de; Shimizu, Y.; Fujimoto, J.; Yuasa, F.; Kaugars, K.; Cucos, L.; Van Voorst, J.

    2004-01-01

    Loop integration results have been obtained using numerical integration and extrapolation. An extrapolation to the limit is performed with respect to a parameter in the integrand which tends to zero. Results are given for a non-scalar four-point diagram. Extensions to accommodate loop integration by existing integration packages are also discussed. These include using previously generated partitions of the domain and roundoff error guards.
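Extrapolation to the limit of a vanishing integrand parameter can be sketched with Neville's polynomial scheme evaluated at zero. The quadratic test function below is illustrative; the actual loop-integral sequences are far less benign.

```python
def extrapolate_to_zero(eps_values, f_values):
    """Neville polynomial extrapolation of f(eps) to eps -> 0, the kind
    of limit extrapolation used when an integrand parameter tends to
    zero in numerical loop integration."""
    n = len(eps_values)
    t = list(f_values)
    for level in range(1, n):
        for i in range(n - level):
            e0, e1 = eps_values[i], eps_values[i + level]
            # Lagrange recursion evaluated at eps = 0.
            t[i] = (e0 * t[i + 1] - e1 * t[i]) / (e0 - e1)
    return t[0]

# f(eps) = 3 + 2*eps + 5*eps^2 has limit 3 at eps = 0; four samples of
# a quadratic make the polynomial extrapolation exact.
eps = [0.8, 0.4, 0.2, 0.1]
vals = [3 + 2 * e + 5 * e * e for e in eps]
limit = extrapolate_to_zero(eps, vals)
```

In practice the sample sequence is generated by re-evaluating the regulated integral at decreasing parameter values, and the extrapolation table also provides an error estimate from the difference of successive levels.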

  7. Reactor core materials research and integrated material database establishment

    International Nuclear Information System (INIS)

    Ryu, Woo Seog; Jang, J. S.; Kim, D. W.

    2002-03-01

    Mainly two research areas were covered in this project. One is to establish the integrated database of nuclear materials, and the other is to study the behavior of reactor core materials, which are usually under the most severe condition in the operating plants. During the stage I of the project (for three years since 1999) in- and out of reactor properties of stainless steel, the major structural material for the core structures of PWR (Pressurized Water Reactor), were evaluated and specification of nuclear grade material was established. And the damaged core components from domestic power plants, e.g. orifice of CVCS, support pin of CRGT, etc. were investigated and the causes were revealed. To acquire more resistant materials to the nuclear environments, development of the alternative alloys was also conducted. For the integrated DB establishment, a task force team was set up including director of nuclear materials technology team, and projector leaders and relevant members from each project. The DB is now opened in public through the Internet

  8. Free and constrained symplectic integrators for numerical general relativity

    International Nuclear Information System (INIS)

    Richter, Ronny; Lubich, Christian

    2008-01-01

    We consider symplectic time integrators in numerical general relativity and discuss both free and constrained evolution schemes. For free evolution of ADM-like equations we propose the use of the Stoermer-Verlet method, a standard symplectic integrator which here is explicit in the computationally expensive curvature terms. For the constrained evolution we give a formulation of the evolution equations that enforces the momentum constraints in a holonomically constrained Hamiltonian system and turns the Hamilton constraint function from a weak to a strong invariant of the system. This formulation permits the use of the constraint-preserving symplectic RATTLE integrator, a constrained version of the Stoermer-Verlet method. The behavior of the methods is illustrated on two effectively (1+1)-dimensional versions of Einstein's equations, which allow us to investigate a perturbed Minkowski problem and the Schwarzschild spacetime. We compare symplectic and non-symplectic integrators for free evolution, showing very different numerical behavior for nearly-conserved quantities in the perturbed Minkowski problem. Further we compare free and constrained evolution, demonstrating in our examples that enforcing the momentum constraints can turn an unstable free evolution into a stable constrained evolution. This is demonstrated in the stabilization of a perturbed Minkowski problem with Dirac gauge, and in the suppression of the propagation of boundary instabilities into the interior of the domain in Schwarzschild spacetime

  9. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  10. Numerical integration of massive two-loop Mellin-Barnes integrals in Minkowskian regions

    International Nuclear Information System (INIS)

    Dubovyk, Ievgen

    2016-07-01

    Mellin-Barnes (MB) techniques applied to integrals emerging in particle physics perturbative calculations are summarized. New versions of the AMBRE packages, which construct planar and nonplanar MB representations, are briefly discussed. The numerical package MBnumerics.m is presented for the first time; it can calculate multidimensional MB integrals with high precision in Minkowskian regions. Examples are given for massive vertex integrals which include threshold effects and several scale parameters.

  11. Numerical integration of massive two-loop Mellin-Barnes integrals in Minkowskian regions

    Energy Technology Data Exchange (ETDEWEB)

    Dubovyk, Ievgen [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Gluza, Janusz [Uniwersytet Slaski, Katowice (Poland). Inst. Fizyki; Riemann, Tord [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Uniwersytet Slaski, Katowice (Poland). Inst. Fizyki; Usovitsch, Johann [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2016-07-15

    Mellin-Barnes (MB) techniques applied to integrals emerging in particle physics perturbative calculations are summarized. New versions of the AMBRE packages, which construct planar and nonplanar MB representations, are briefly discussed. The numerical package MBnumerics.m is presented for the first time; it can calculate multidimensional MB integrals with high precision in Minkowskian regions. Examples are given for massive vertex integrals which include threshold effects and several scale parameters.

  12. Numerical integration of some new unified plasticity-creep formulations

    International Nuclear Information System (INIS)

    Krieg, R.D.

    1977-01-01

    The unified formulations seem to lead to very non-linear systems of equations which are very well behaved in some regions and very stiff in others, where the word 'stiff' is used in the mathematical sense. Simple conventional methods of integrating incremental constitutive equations are observed to be totally inadequate. A method of numerically integrating the equations is presented. Automatic step size determination based on accuracy and stability is a necessary expense. In the region where accuracy is the limiting condition, the equations can be integrated directly; a forward Euler predictor with a trapezoidal corrector is used in the paper. In the region where stability is the limiting condition, direct integration methods become inefficient and an implicit integrator suited to stiff equations must be used; a backward Euler method is used in the paper. It is implemented with a Picard iteration method in which a Newton method is used to predict the inelastic strain rate and speed convergence in a Newton-Raphson manner. This allows an analytic expression for the Jacobian to be used, where a full Newton-Raphson method would require a numerical approximation to the Jacobian. The starting procedure for the iteration is an adaptation of time-independent plasticity ideas. Because of the inherent capability of the unified plasticity-creep formulations, it is felt that these theories will become accepted in the metallurgical community. Structural analysts will then be required to incorporate these formulations and must be prepared to face the difficult implementation inherent in these models. This paper is an attempt to shed some light on the difficulties and expenses involved
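The explicit-versus-implicit distinction in the stiff regime can be seen on the scalar model problem y' = λy, which is far simpler than any constitutive law but shows the same failure mode: outside its stability region the forward Euler update blows up, while backward Euler decays for any step size.

```python
def explicit_euler(lam, y0, h, steps):
    """Forward Euler for y' = lam*y: each step multiplies by (1 + h*lam),
    unstable when |1 + h*lam| > 1."""
    y = y0
    for _ in range(steps):
        y += h * lam * y
    return y

def backward_euler(lam, y0, h, steps):
    """Backward Euler: y_{n+1} = y_n + h*lam*y_{n+1}. The update is
    linear here, so the implicit equation is solved in closed form;
    a constitutive model would need the Newton iteration described
    in the abstract."""
    y = y0
    for _ in range(steps):
        y = y / (1 - h * lam)
    return y

# Stiff test: lam = -50, h = 0.1. Explicit Euler multiplies by -4 each
# step and diverges; backward Euler multiplies by 1/6 and decays.
ye = explicit_euler(-50.0, 1.0, 0.1, 20)
yi = backward_euler(-50.0, 1.0, 0.1, 20)
```

Automatic step-size control amounts to detecting which of the two regimes the current state is in and switching integrator (or shrinking h) accordingly.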

  13. Functional integration of automated system databases by means of artificial intelligence

    Science.gov (United States)

    Dubovoi, Volodymyr M.; Nikitenko, Olena D.; Kalimoldayev, Maksat; Kotyra, Andrzej; Gromaszek, Konrad; Iskakova, Aigul

    2017-08-01

    The paper presents approaches for the functional integration of automated system databases by means of artificial intelligence. The peculiarities of using such databases in systems that employ a fuzzy implementation of functions are analyzed, and requirements for the normalization of these databases are defined. The question of data equivalence under uncertainty, and of collisions arising when databases are functionally integrated, is considered, and a model to reveal their possible occurrence is devised. The paper also presents a method for evaluating the normalization of the integrated database.

  14. Numerical counting ratemeter with variable time constant and integrated circuits

    International Nuclear Information System (INIS)

    Kaiser, J.; Fuan, J.

    1967-01-01

    We present here the prototype of a numerical counting ratemeter, a special version of a variable time-constant frequency meter (1). The originality of this work lies in the fact that the change in the time constant is carried out automatically. Since the criterion for this change is the accuracy of the announced result, the integration time is varied as a function of the frequency. For the prototype described in this report, the time constant varies from 1 sec to 1 msec for frequencies in the range 10 Hz to 10 MHz. The prototype is built entirely of Motorola MECL-type integrated circuits and is thus contained in two relatively small boxes. (authors) [fr
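
    The automatic time-constant selection described here amounts to choosing an integration (gate) time long enough for the required counting accuracy at the measured frequency. A minimal sketch of such a rule follows; the count target and parameter names are illustrative assumptions, only the 1 sec to 1 msec clamping range is taken from the abstract.

```python
def gate_time(freq_hz, target_counts=10_000, t_min=1e-3, t_max=1.0):
    """Pick an integration (gate) time for a counting ratemeter so the
    announced result keeps a roughly constant relative accuracy: the
    gate must collect about `target_counts` events, clamped to the
    1 msec - 1 sec range quoted for the prototype."""
    if freq_hz <= 0:
        return t_max                  # nothing to count: longest gate
    t = target_counts / freq_hz       # time needed for the count target
    return min(max(t, t_min), t_max)  # clamp to the hardware range

# Low rates get the longest time constant, high rates the shortest.
t_low = gate_time(10)            # 10 Hz  -> clamped to 1 sec
t_high = gate_time(10_000_000)   # 10 MHz -> clamped to 1 msec
```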

  15. Development of an integrated database management system to evaluate integrity of flawed components of nuclear power plant

    International Nuclear Information System (INIS)

    Mun, H. L.; Choi, S. N.; Jang, K. S.; Hong, S. Y.; Choi, J. B.; Kim, Y. J.

    2001-01-01

    The objective of this paper is to develop an NPP-IDBMS (Integrated DataBase Management System for Nuclear Power Plants) for evaluating the integrity of components of nuclear power plants using a relational data model. This paper describes the relational data model, structure and development strategy for the proposed NPP-IDBMS. The NPP-IDBMS consists of database, database management system and interface parts. The database part consists of plant, shape, operating condition, material properties and stress databases, which are required for the integrity evaluation of each component in nuclear power plants. For the development of the stress database, an extensive finite element analysis was performed for various components considering operational transients. The developed NPP-IDBMS will provide an efficient and accurate way to evaluate the integrity of flawed components

  16. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  17. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    Science.gov (United States)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all of its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any change to data made in a single application is made available to all applications at the time of database commit, thus keeping the applications' data better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile platform, based on the smart city concept. The built-in database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and the completeness of attributes that can be shared by the various applications to be built. The method used in this study is to choose an appropriate logical database structure (pattern of data) and to build the relational database model. The resulting design is tested with prototype apps, and system performance is analyzed with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. The Android-based app is built on a dynamic client-server model where data are extracted from an external MySQL database, so if data change in the database, the data in the Android application also change. The app assists users in searching for information related to Yogyakarta (as a smart city), especially on culture, government, hotels, and transportation.
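
    The shared-schema behaviour described above, where a committed change in one application becomes visible to the others with no extra integration layer, can be illustrated with a toy relational store. The table name, columns and the SQLite backend are assumptions for the sketch, not details of the Yogyakarta system (which uses MySQL).

```python
import sqlite3

# One shared schema serves two client applications: the admin app
# writes, the user app reads, and a commit is all the "integration"
# needed between them.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE place (
    id INTEGER PRIMARY KEY,
    category TEXT NOT NULL,      -- culture, hotel, transport, ...
    name TEXT NOT NULL)""")

def admin_app_add(category, name):
    """The admin/manager application inserts records."""
    db.execute("INSERT INTO place (category, name) VALUES (?, ?)",
               (category, name))
    db.commit()                  # change becomes visible to all clients

def user_app_search(category):
    """The end-user application only reads the shared schema."""
    rows = db.execute("SELECT name FROM place WHERE category = ?",
                      (category,)).fetchall()
    return [name for (name,) in rows]

admin_app_add("hotel", "Hotel Merapi")
admin_app_add("culture", "Kraton Yogyakarta")
found = user_app_search("hotel")   # sees the admin's committed insert
```

(Both "applications" share one in-memory connection here purely for brevity; in the real system they are separate clients of one server.)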

  18. MitBASE : a comprehensive and integrated mitochondrial DNA database. The present status

    NARCIS (Netherlands)

    Attimonelli, M.; Altamura, N.; Benne, R.; Brennicke, A.; Cooper, J. M.; D'Elia, D.; Montalvo, A.; Pinto, B.; de Robertis, M.; Golik, P.; Knoop, V.; Lanave, C.; Lazowska, J.; Licciulli, F.; Malladi, B. S.; Memeo, F.; Monnerot, M.; Pasimeni, R.; Pilbout, S.; Schapira, A. H.; Sloof, P.; Saccone, C.

    2000-01-01

    MitBASE is an integrated and comprehensive database of mitochondrial DNA data which collects, under a single interface, databases for Plant, Vertebrate, Invertebrate, Human, Protist and Fungal mtDNA and a Pilot database on nuclear genes involved in mitochondrial biogenesis in Saccharomyces

  19. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.

  20. Brassica ASTRA: an integrated database for Brassica genomic research.

    Science.gov (United States)

    Love, Christopher G; Robinson, Andrew J; Lim, Geraldine A C; Hopkins, Clare J; Batley, Jacqueline; Barker, Gary; Spangenberg, German C; Edwards, David

    2005-01-01

    Brassica ASTRA is a public database for genomic information on Brassica species. The database incorporates expressed sequences with Swiss-Prot and GenBank comparative sequence annotation as well as secondary Gene Ontology (GO) annotation derived from the comparison with Arabidopsis TAIR GO annotations. Simple sequence repeat molecular markers are identified within resident sequences and mapped onto the closely related Arabidopsis genome sequence. Bacterial artificial chromosome (BAC) end sequences derived from the Multinational Brassica Genome Project are also mapped onto the Arabidopsis genome sequence enabling users to identify candidate Brassica BACs corresponding to syntenic regions of Arabidopsis. This information is maintained in a MySQL database with a web interface providing the primary means of interrogation. The database is accessible at http://hornbill.cspp.latrobe.edu.au.

  1. Integrated numerical modeling of a laser gun injector

    International Nuclear Information System (INIS)

    Liu, H.; Benson, S.; Bisognano, J.; Liger, P.; Neil, G.; Neuffer, D.; Sinclair, C.; Yunn, B.

    1993-06-01

    CEBAF is planning to incorporate a laser gun injector into the linac front end as a high-charge cw source for a high-power free electron laser and nuclear physics. This injector consists of a DC laser gun, a buncher, a cryounit and a chicane. The performance of the injector is predicted based on integrated numerical modeling using POISSON, SUPERFISH and PARMELA. The point-by-point method incorporated into PARMELA by McDonald is chosen for space charge treatment. The concept of ''conditioning for final bunching'' is employed to vary several crucial parameters of the system for achieving highest peak current while maintaining low emittance and low energy spread. Extensive parameter variation studies show that the design will perform beyond the specifications for FEL operations aimed at industrial applications and fundamental scientific research. The calculation also shows that the injector will perform as an extremely bright cw electron source

  2. Using XML technology for the ontology-based semantic integration of life science databases.

    Science.gov (United States)

    Philippi, Stephan; Köhler, Jacob

    2004-06-01

    Several hundred internet-accessible life science databases with constantly growing contents and varying areas of specialization are publicly available via the internet. Database integration, consequently, is a fundamental prerequisite for being able to answer complex biological questions. Due to the presence of syntactic, schematic, and semantic heterogeneities, large-scale database integration at present takes considerable effort. As there is growing acceptance of the extensible markup language (XML) as a means for data exchange in the life sciences, this article focuses on the impact of XML technology on database integration in this area. In detail, a general architecture for ontology-driven data integration based on XML technology is introduced, which overcomes some of the traditional problems in this area. As a proof of concept, a prototypical implementation of this architecture based on a native XML database and an expert system shell is described for the realization of a real-world integration scenario.
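
    The ontology-driven mapping step at the heart of such an architecture can be illustrated in miniature: two sources export the same fact under different XML vocabularies, and a small ontology table normalises both before the records are merged. All tags, terms and values below are invented for the example; the actual prototype also involves a native XML database and an expert system shell.

```python
import xml.etree.ElementTree as ET

# Two source databases export the same biological fact under
# different element names (a schematic heterogeneity).
SOURCE_A = "<entry><gene_symbol>TP53</gene_symbol></entry>"
SOURCE_B = "<record><geneName>TP53</geneName></record>"

# Ontology mapping: source-specific element name -> shared concept.
ONTOLOGY = {"gene_symbol": "gene", "geneName": "gene"}

def to_canonical(xml_text):
    """Rewrite one source record into the shared vocabulary."""
    out = {}
    for elem in ET.fromstring(xml_text).iter():
        concept = ONTOLOGY.get(elem.tag)
        if concept is not None:
            out[concept] = elem.text
    return out

# After mapping, both records agree on the shared concept name and
# can be merged or queried uniformly.
merged = [to_canonical(SOURCE_A), to_canonical(SOURCE_B)]
```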

  3. Numerical integration of some new unified plasticity-creep formulations

    International Nuclear Information System (INIS)

    Krieg, R.D.

    1977-01-01

    The usual constitutive description of metals at high temperature treats creep as a phenomenon which must be added to time-independent phenomena. A new approach is now being advocated by some people, principally metallurgists. They all treat the inelastic strain as a unified quantity, incapable of being separated into time-dependent and time-independent parts. This paper examines the behavior of the differential formulations reported in the literature together with one proposed by the author. These formulations are capable of representing primary and secondary creep, cyclic hardening to a stable cyclic stress-strain loop, conventional plasticity behavior, and a Bauschinger effect which may be creep-induced and discernible either at fast or slow loading rates. The new unified formulations seem to lead to very non-linear systems of equations which are very well behaved in some regions and very stiff in other regions, where the word 'stiff' is used in the mathematical sense. Simple conventional methods of integrating incremental constitutive equations are observed to be totally inadequate. A method of numerically integrating the equations is presented. (Auth.)

  4. Advances in Integrated Vehicle Thermal Management and Numerical Simulation

    Directory of Open Access Journals (Sweden)

    Yan Wang

    2017-10-01

    Full Text Available With the increasing demands for vehicle dynamic performance, economy, safety and comfort, and with ever stricter laws concerning energy conservation and emissions, vehicle power systems are becoming much more complex. To pursue high efficiency and light weight in automobile design, the power system and its vehicle integrated thermal management (VITM) system have attracted widespread attention as major components of modern vehicle technology. For the internal combustion engine vehicle (ICEV), integrated thermal management (ITM) mainly comprises internal combustion engine (ICE) cooling, turbocharger cooling, exhaust gas recirculation (EGR) cooling, lubrication cooling and air conditioning (AC) or heat pump (HP) operation. For electric vehicles (EVs), ITM mainly includes battery cooling/preheating, electric machine (EM) cooling and AC or HP operation. With rational, effective and comprehensive control over the dynamic devices and thermal components mentioned, a modern VITM system can realize collaborative optimization of multiple thermodynamic processes from the standpoint of system integration. Furthermore, computer-aided calculation and numerical simulation have become significant design methods, especially for complex VITM systems: 1D programming can couple multiple thermal components, 3D simulation supports structured and modularized design, and co-simulation can reproduce various thermo-hydraulic behaviors under transient vehicle operating conditions. This article reviews relevant research work and current advances in the ever-broadening field of modern vehicle thermal management (VTM). Based on systematic summaries of the design methods and applications of ITM, future tasks and proposals are presented. This article aims to promote innovation in ITM, to strengthen precise control and performance-prediction capability, and thereby to enhance the level of research and development (R&D).

  5. KRILLBASE: a circumpolar database of Antarctic krill and salp numerical densities, 1926-2016

    Science.gov (United States)

    Atkinson, Angus; Hill, Simeon L.; Pakhomov, Evgeny A.; Siegel, Volker; Anadon, Ricardo; Chiba, Sanae; Daly, Kendra L.; Downie, Rod; Fielding, Sophie; Fretwell, Peter; Gerrish, Laura; Hosie, Graham W.; Jessopp, Mark J.; Kawaguchi, So; Krafft, Bjørn A.; Loeb, Valerie; Nishikawa, Jun; Peat, Helen J.; Reiss, Christian S.; Ross, Robin M.; Quetin, Langdon B.; Schmidt, Katrin; Steinberg, Deborah K.; Subramaniam, Roshni C.; Tarling, Geraint A.; Ward, Peter

    2017-03-01

    Antarctic krill (Euphausia superba) and salps are major macroplankton contributors to Southern Ocean food webs, and krill are also fished commercially. Managing this fishery sustainably, against a backdrop of rapid regional climate change, requires information on distribution and time trends. Many data on the abundance of both taxa have been obtained from net sampling surveys since 1926, but much of this is stored in national archives, sometimes only in notebooks. In order to make these important data accessible we have collated the available abundance data (numerical density, no. m-2) of postlarval E. superba and of individual salps (multiple species, whether occurring singly or in chains). These were combined into a central database, KRILLBASE, together with environmental information, standardisation and metadata. The aim is to provide a temporal-spatial data resource to support a variety of research such as biogeochemistry, autecology, higher-predator foraging and food web modelling, in addition to fisheries management and conservation. Previous versions of KRILLBASE have led to a series of papers since 2004 which illustrate some of the potential uses of this database. With increasing numbers of requests for these data, we here provide an updated version of KRILLBASE that contains data from 15 194 net hauls, including 12 758 with krill abundance data and 9726 with salp abundance data. These data were collected by 10 nations and span 56 seasons in two epochs (1926-1939 and 1976-2016). Here, we illustrate the seasonal, inter-annual, regional and depth coverage of sampling, and provide both circumpolar- and regional-scale distribution maps. Krill abundance data have been standardised to accommodate variation in sampling methods, and we present these as well as the raw data. Information is provided on how to screen, interpret and use KRILLBASE to reduce artefacts in interpretation, with contact points for the main data providers.
The DOI for the published data set is doi:10

  6. Integr8: enhanced inter-operability of European molecular biology databases.

    Science.gov (United States)

    Kersey, P J; Morris, L; Hermjakob, H; Apweiler, R

    2003-01-01

    The increasing production of molecular biology data in the post-genomic era, and the proliferation of databases that store it, require the development of an integrative layer in database services to facilitate the synthesis of related information. The solution of this problem is made more difficult by the absence of universal identifiers for biological entities, and the breadth and variety of available data. Integr8 was modelled using UML (Unified Modelling Language). Integr8 is being implemented as an n-tier system using a modern object-oriented programming language (Java). An object-relational mapping tool, OJB, is being used to specify the interface between the upper layers and an underlying relational database. The European Bioinformatics Institute is launching the Integr8 project. Integr8 will be an automatically populated database in which we will maintain stable identifiers for biological entities, describe their relationships with each other (in accordance with the central dogma of biology), and store equivalences between identified entities in the source databases. Only core data will be stored in Integr8, with web links to the source databases providing further information. Integr8 will provide the integrative layer of the next generation of bioinformatics services from the EBI. Web-based interfaces will be developed to offer gene-centric views of the integrated data, presenting (where known) the links between genome, proteome and phenotype.
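
    The stable-identifier-plus-equivalences idea can be pictured with a toy lookup: the integrative layer assigns its own stable identifier to a biological entity and records which accession each source database uses for it. All identifiers below are invented examples, not real Integr8 records.

```python
# Equivalence store: stable identifier -> accessions in source databases.
# Every identifier here is a made-up example.
equivalences = {
    "INT8:0001": {"uniprot": "P99999", "embl": "X00001"},
    "INT8:0002": {"uniprot": "P88888"},
}

def resolve(source_db, accession):
    """Find the stable identifier for a source-database accession,
    so that records about the same entity can be linked across
    databases that use different accession schemes."""
    for stable_id, accs in equivalences.items():
        if accs.get(source_db) == accession:
            return stable_id
    return None

stable = resolve("uniprot", "P99999")   # both source accessions map
same = resolve("embl", "X00001")        # to the one stable identifier
```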

  7. DEVELOPING FLEXIBLE APPLICATIONS WITH XML AND DATABASE INTEGRATION

    Directory of Open Access Journals (Sweden)

    Hale AS

    2004-04-01

    Full Text Available In recent years the most popular subject in the information systems area has been Enterprise Application Integration (EAI). It can be defined as the process of forming a standard connection between the different systems of an organization's information system environment. Corporate mergers, acquisitions and partnerships are the major reasons for the popularity of Enterprise Application Integration. The main purpose is to solve application integration problems while similar systems in such corporations continue working together for some time. With the help of XML technology, it is possible to find solutions to the problems of application integration either within a corporation or between corporations.

  8. Efficient Integrity Checking for Databases with Recursive Views

    DEFF Research Database (Denmark)

    Martinenghi, Davide; Christiansen, Henning

    2005-01-01

    Efficient and incremental maintenance of integrity constraints involving recursive views is a difficult issue that has received some attention in the past years, but for which no widely accepted solution exists yet. In this paper a technique is proposed for compiling such integrity constraints in...... approaches have not achieved comparable optimization with the same level of generality....

  9. How to integrate divergent integrals: a pure numerical approach to complex loop calculations

    International Nuclear Information System (INIS)

    Caravaglios, F.

    2000-01-01

    Loop calculations involve the evaluation of divergent integrals. Usually [G. 't Hooft, M. Veltman, Nucl. Phys. B 44 (1972) 189] one computes them in a number of dimensions different from four, where the integral is convergent, and then one performs the analytic continuation and considers the Laurent expansion in powers of ε=n-4. In this paper we discuss a method to extract directly all coefficients of this expansion by means of concrete and well-defined integrals in a five-dimensional space. We bypass the formal and symbolic procedure of analytic continuation; instead we can numerically compute the integrals to extract directly both the coefficient of the pole 1/ε and the finite part

  10. Coordinate Systems Integration for Craniofacial Database from Multimodal Devices

    Directory of Open Access Journals (Sweden)

    Deni Suwardhi

    2005-05-01

    Full Text Available This study presents a data registration method for craniofacial spatial data of different modalities. The data consist of three-dimensional (3D) vector and raster data models, stored in an object-relational database. The data capture devices are a laser scanner, CT (Computed Tomography) and CR (Close Range Photogrammetry). The objective of the registration is to transform the data from various coordinate systems into a single 3D Cartesian coordinate system. The standard error of the registration obtained from the multimodal imaging devices using a 3D affine transformation is in the range of 1-2 mm. This study is a step forward for storing the craniofacial spatial data in one reference system in a database.
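
    A 3D affine registration of matched control points, as used in the study, can be sketched as follows. For brevity the sketch determines the 12 affine parameters exactly from four non-coplanar point pairs rather than by least squares over many points, and all coordinates are invented for the example.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_affine_3d(src, dst):
    """Determine the 3D affine transform (9 linear terms + 3 shifts)
    mapping four non-coplanar control points `src` onto `dst`; four
    pairs fix the 12 parameters uniquely (one 4x4 solve per output
    coordinate, all sharing the same design matrix)."""
    design = [[x, y, z, 1.0] for (x, y, z) in src]
    return [solve(design, [p[k] for p in dst]) for k in range(3)]

def apply_affine(T, p):
    x, y, z = p
    return tuple(T[k][0] * x + T[k][1] * y + T[k][2] * z + T[k][3]
                 for k in range(3))

# Control points in the laser-scanner frame and the same anatomical
# landmarks in the CT frame (coordinates invented for the example).
scanner = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
ct = [(10.0, 5.0, 2.0), (12.0, 5.0, 2.0), (10.0, 7.0, 2.0), (10.0, 5.0, 4.0)]
T = fit_affine_3d(scanner, ct)
registered = apply_affine(T, (0.5, 0.5, 0.5))
```

With more than four point pairs one would instead minimise the residuals in a least-squares sense, and the RMS residual plays the role of the 1-2 mm standard error quoted above.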

  11. M4FT-16LL080302052-Update to Thermodynamic Database Development and Sorption Database Integration

    Energy Technology Data Exchange (ETDEWEB)

    Zavarin, Mavrik [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Glenn T. Seaborg Inst.. Physical and Life Sciences; Wolery, T. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Akima Infrastructure Services, LLC; Atkins-Duffin, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Global Security

    2016-08-16

    This progress report (Level 4 Milestone Number M4FT-16LL080302052) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Argillite Disposal R&D Work Package Number FT-16LL08030205. The focus of this research is the thermodynamic modeling of Engineered Barrier System (EBS) materials and properties and development of thermodynamic databases and models to evaluate the stability of EBS materials and their interactions with fluids at various physico-chemical conditions relevant to subsurface repository environments. The development and implementation of equilibrium thermodynamic models are intended to describe chemical and physical processes such as solubility, sorption, and diffusion.

  12. Integrated Strategic Tracking and Recruiting Database (iSTAR) Data Inventory

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Integrated Strategic Tracking and Recruiting Database (iSTAR) Data Inventory contains measured and modeled partnership and contact data. It is comprised of basic...

  13. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  14. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost benefit analysis of plant wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost benefit analysis is then a planning tool which helps the utility to develop a focused long term implementation strategy that will yield significant near term benefits. This paper presents a flexible cost benefit analysis methodology which is both simple to use and yields accurate, verifiable results. Included in this paper is a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied. Their uses of the cost-benefit analysis are also described

  15. Development of Integrated PSA Database and Application Technology

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Hoon; Park, Jin Hee; Kim, Seung Hwan; Choi, Sun Yeong; Jung, Woo Sik; Jeong, Kwang Sub; Ha Jae Joo; Yang, Joon Eon; Min Kyung Ran; Kim, Tae Woon

    2005-04-15

    The purpose of this project is to develop 1) a reliability database framework, 2) a methodology for reactor trip and abnormal event analysis, and 3) a prototype PSA information DB system. We already have part of the reactor trip and component reliability data; in this study, we extend the collection of data up to 2002. We construct a pilot reliability database for common cause failure and piping failure data. A reactor trip or a component failure may have an impact on the safety of a nuclear power plant. We perform a precursor analysis for such events that occurred in the KSNP and develop a procedure for the precursor analysis. A risk monitor provides a means to trace the changes in risk following changes in the plant configuration. We develop a methodology incorporating the model of the secondary system related to the reactor trip into the risk monitor model. We develop a prototype PSA information system for the UCN 3 and 4 PSA models, where information for the PSA is input into the system, such as PSA reports, analysis reports, thermal-hydraulic analysis results, system notebooks, and so on. We develop a unique coherent BDD method to quantify a fault tree and the fastest fault tree quantification engine, FTREX. We develop quantification software for a full PSA model and a one-top model.
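
    Fault-tree quantification of the kind FTREX performs can be illustrated on a toy tree. This sketch uses inclusion-exclusion over minimal cut sets of independent basic events (event names and probabilities are invented), whereas FTREX itself builds a coherent BDD, which scales far better on real PSA models.

```python
from itertools import combinations

# Invented basic-event probabilities and minimal cut sets for a toy
# top event "loss of cooling".
prob = {"pump_fails": 1e-3, "valve_sticks": 5e-4, "power_loss": 2e-4}
cut_sets = [{"pump_fails", "valve_sticks"},   # both must fail together
            {"power_loss"}]                   # single-event cut set

def top_event_probability(cut_sets, prob):
    """Exact top-event probability by inclusion-exclusion over the
    minimal cut sets, assuming independent basic events.  Exponential
    in the number of cut sets, hence the BDD approach in practice."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, r):
            union = set().union(*combo)       # events in this intersection
            term = 1.0
            for e in union:
                term *= prob[e]
            total += term if r % 2 == 1 else -term
    return total

p_top = top_event_probability(cut_sets, prob)
```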

  16. Numerical solution of integral equations, describing mass spectrum of vector mesons

    International Nuclear Information System (INIS)

    Zhidkov, E.P.; Nikonov, E.G.; Sidorov, A.V.; Skachkov, N.B.; Khoromskij, B.N.

    1988-01-01

    The description of the numerical algorithm for solving quasipotential integral equation in impulse space is presented. The results of numerical computations of the vector meson mass spectrum and the leptonic decay width are given in comparison with the experimental data
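
    The general shape of such a numerical algorithm (discretise the integral equation on a quadrature grid, then solve the resulting matrix eigenproblem) can be sketched with a Nystrom method plus power iteration. The kernel below is a toy separable example with a known answer, not the quasipotential kernel of the paper.

```python
def nystrom_eigen(kernel, a, b, n=100, iters=50):
    """Nystrom discretisation of the homogeneous Fredholm equation
    lambda * phi(x) = integral_a^b K(x, y) phi(y) dy on a midpoint
    quadrature grid, followed by power iteration for the largest
    eigenvalue.  A spectrum problem reduces, after discretisation,
    to exactly this kind of matrix eigenproblem."""
    h = (b - a) / n
    xs = [a + (i + 0.5) * h for i in range(n)]      # midpoint nodes
    K = [[kernel(x, y) * h for y in xs] for x in xs]  # weighted kernel matrix
    phi = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        new = [sum(K[i][j] * phi[j] for j in range(n)) for i in range(n)]
        lam = max(abs(v) for v in new)              # Rayleigh-like estimate
        phi = [v / lam for v in new]                # renormalise iterate
    return lam

# Separable test kernel K(x, y) = x*y on [0, 1]: the exact largest
# eigenvalue is 1/3, with eigenfunction phi(x) = x.
lam = nystrom_eigen(lambda x, y: x * y, 0.0, 1.0)
```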

  17. Autism genetic database (AGD): a comprehensive database including autism susceptibility gene-CNVs integrated with known noncoding RNAs and fragile sites

    Directory of Open Access Journals (Sweden)

    Talebizadeh Zohreh

    2009-09-01

    Full Text Available Abstract Background Autism is a highly heritable complex neurodevelopmental disorder; therefore identifying its genetic basis has been challenging. To date, numerous susceptibility genes and chromosomal abnormalities have been reported in association with autism, but most discoveries either fail to be replicated or account for a small effect. Thus, in most cases the underlying causative genetic mechanisms are not fully understood. In the present work, the Autism Genetic Database (AGD) was developed as a literature-driven, web-based, and easy to access database designed with the aim of creating a comprehensive repository for all the currently reported genes and genomic copy number variations (CNVs) associated with autism, in order to further facilitate the assessment of these autism susceptibility genetic factors. Description AGD is a relational database that organizes data resulting from exhaustive literature searches for reported susceptibility genes and CNVs associated with autism. Furthermore, genomic information about human fragile sites and noncoding RNAs was also downloaded and parsed from miRBase, snoRNA-LBME-db, piRNABank, and the MIT/ICBP siRNA database. A web client genome browser enables viewing of the features, while a web client query tool provides access to more specific information for the features. When applicable, links to external databases including GenBank, PubMed, miRBase, snoRNA-LBME-db, piRNABank, and the MIT siRNA database are provided. Conclusion AGD comprises a comprehensive list of susceptibility genes and copy number variations reported to date in association with autism, as well as all known human noncoding RNA genes and fragile sites. Such a unique and inclusive autism genetic database will facilitate the evaluation of autism susceptibility factors in relation to known human noncoding RNAs and fragile sites, which impact human diseases. As a result, this new autism database offers a valuable tool for the research

  18. Carbon Dioxide Dispersion in the Combustion Integrated Rack Simulated Numerically

    Science.gov (United States)

    Wu, Ming-Shin; Ruff, Gary A.

    2004-01-01

    When discharged into an International Space Station (ISS) payload rack, a carbon dioxide (CO2) portable fire extinguisher (PFE) must extinguish a fire by decreasing the oxygen in the rack by 50 percent within 60 sec. The length of time needed for this oxygen reduction throughout the rack and the length of time that the CO2 concentration remains high enough to prevent the fire from reigniting are important when determining the effectiveness of the response and postfire procedures. Furthermore, in the absence of gravity, the local flow velocity can make the difference between a fire that spreads rapidly and one that self-extinguishes after ignition. A numerical simulation of the discharge of CO2 from the PFE into the Combustion Integrated Rack (CIR) in microgravity was performed to obtain the local velocity and CO2 concentration. The complicated flow field around the PFE nozzle exits was modeled by sources of equivalent mass and momentum flux at a location downstream of the nozzle. The time for the concentration of CO2 to reach a level that would extinguish a fire anywhere in the rack was determined using the Fire Dynamics Simulator (FDS), a computational fluid dynamics code developed by the National Institute of Standards and Technology specifically to evaluate the development of a fire and smoke transport. The simulation shows that CO2, as well as any smoke and combustion gases produced by a fire, would be discharged into the ISS cabin through the resource utility panel at the bottom of the rack. These simulations will be validated by comparing the results with velocity and CO2 concentration measurements obtained during the fire suppression system verification tests conducted on the CIR in March 2003. Once these numerical simulations are validated, portions of the ISS labs and living areas will be modeled to determine the local flow conditions before, during, and after a fire event. 
These simulations can yield specific information about how long it takes for smoke and

  19. Building an integrated neurodegenerative disease database at an academic health center.

    Science.gov (United States)

    Xie, Sharon X; Baek, Young; Grossman, Murray; Arnold, Steven E; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M-Y; Trojanowski, John Q

    2011-07-01

It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration. These comparative studies rely on powerful database tools to quickly generate data sets that match the diverse and complementary criteria set by investigators. In this article, we present a novel integrated neurodegenerative disease (INDD) database, which was developed at the University of Pennsylvania (Penn) with the help of a consortium of Penn investigators. Because the work of these investigators is based on Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration, we were able to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used the Microsoft SQL server as a platform, with built-in "backwards" functionality to provide Access as a frontend client to interface with the database. We used PHP Hypertext Preprocessor to create the "frontend" web interface and then used a master lookup table to integrate individual neurodegenerative disease databases. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Using the INDD database, we compared the results of a biomarker study with those obtained using an alternative approach of querying individual databases separately. We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies on several neurodegenerative diseases. Copyright © 2011 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
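The master-lookup-table integration described above can be sketched with SQLite from the Python standard library. The table names, columns, and values below are hypothetical illustrations, not the actual Penn INDD schema:

```python
import sqlite3

# Toy sketch of a master lookup table integrating per-disease databases.
# All table and column names are hypothetical, not the Penn INDD schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE master_lookup (global_id INTEGER PRIMARY KEY, source_db TEXT, local_id INTEGER);
CREATE TABLE ad_db (local_id INTEGER, biomarker REAL);  -- Alzheimer's disease records
CREATE TABLE pd_db (local_id INTEGER, biomarker REAL);  -- Parkinson's disease records
INSERT INTO master_lookup VALUES (1, 'ad_db', 10), (2, 'pd_db', 20);
INSERT INTO ad_db VALUES (10, 1.5);
INSERT INTO pd_db VALUES (20, 2.5);
""")
# A single query console spanning both disease databases via the lookup table
rows = cur.execute("""
SELECT m.global_id, m.source_db, COALESCE(a.biomarker, p.biomarker) AS biomarker
FROM master_lookup m
LEFT JOIN ad_db a ON m.source_db = 'ad_db' AND m.local_id = a.local_id
LEFT JOIN pd_db p ON m.source_db = 'pd_db' AND m.local_id = p.local_id
ORDER BY m.global_id
""").fetchall()
```

Routing every query through the lookup table is what lets one console span the individual disease databases instead of querying each separately.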

  20. SolveDB: Integrating Optimization Problem Solvers Into SQL Databases

    DEFF Research Database (Denmark)

    Siksnys, Laurynas; Pedersen, Torben Bach

    2016-01-01

    for optimization problems, (2) an extensible infrastructure for integrating different solvers, and (3) query optimization techniques to achieve the best execution performance and/or result quality. Extensive experiments with the PostgreSQL-based implementation show that SolveDB is a versatile tool offering much...

  1. Representing clinical communication knowledge through database management system integration.

    Science.gov (United States)

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels pose a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough methodology literature review to identify strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study the clinical communication activities among Intensive Care Unit (ICU) clinical teams, with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument and introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository.

  2. Numerical evaluation of integrals containing a spherical Bessel function by product integration

    International Nuclear Information System (INIS)

    Lehman, D.R.; Parke, W.C.; Maximon, L.C.

    1981-01-01

A method is developed for numerical evaluation of integrals with k-integration range from 0 to infinity that contain a spherical Bessel function j/sub l/(kr) explicitly. The required quadrature weights are easily calculated and the rate of convergence is rapid: only a relatively small number of quadrature points is needed for an accurate evaluation, even when r is large. The quadrature rule is obtained by the method of product integration. With the abscissas chosen to be those of Clenshaw--Curtis and the Chebyshev polynomials as the interpolating polynomials, quadrature weights are obtained that depend on the spherical Bessel function. An inhomogeneous recurrence relation is derived from which the weights can be calculated without accumulation of roundoff error. The procedure is summarized as an easily implementable algorithm. Questions of convergence are discussed and the rate of convergence demonstrated for several test integrals. Alternative procedures are given for generating the integration weights and an error analysis of the method is presented
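The kind of integral the method targets can be illustrated with a brute-force baseline. The stdlib-only sketch below is not the product-integration algorithm itself; it merely shows the cost a Bessel-aware quadrature avoids, using a test integrand with a known closed form (all parameters illustrative):

```python
import math

# Brute-force trapezoidal evaluation of I(r) = integral_0^inf e^(-k) j0(k*r) dk,
# where j0(x) = sin(x)/x is the spherical Bessel function of order zero.
# The closed form is arctan(r)/r. A very fine step is needed because j0(k*r)
# oscillates rapidly for large r -- exactly the cost that product-integration
# weights built around the Bessel function avoid.
def j0(x):
    return 1.0 if x == 0.0 else math.sin(x) / x

def integral_exp_j0(r, h=5e-4, kmax=30.0):
    n = int(round(kmax / h))
    total = 0.5 * (j0(0.0) + math.exp(-kmax) * j0(kmax * r))
    for i in range(1, n):
        k = i * h
        total += math.exp(-k) * j0(k * r)
    return total * h

r = 10.0
exact = math.atan(r) / r      # closed-form reference value
approx = integral_exp_j0(r)
```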

  3. Different nonideality relationships, different databases and their effects on modeling precipitation from concentrated solutions using numerical speciation codes

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.F.; Ebinger, M.H.

    1996-08-01

    Four simple precipitation problems are solved to examine the use of numerical equilibrium codes. The study emphasizes concentrated solutions, assumes both ideal and nonideal solutions, and employs different databases and different activity-coefficient relationships. The study uses the EQ3/6 numerical speciation codes. The results show satisfactory material balances and agreement between solubility products calculated from free-energy relationships and those calculated from concentrations and activity coefficients. Precipitates show slightly higher solubilities when the solutions are regarded as nonideal than when considered ideal, agreeing with theory. When a substance may precipitate from a solution dilute in the precipitating substance, a code may or may not predict precipitation, depending on the database or activity-coefficient relationship used. In a problem involving a two-component precipitation, there are only small differences in the precipitate mass and composition between the ideal and nonideal solution calculations. Analysis of this result indicates that this may be a frequent occurrence. An analytical approach is derived for judging whether this phenomenon will occur in any real or postulated precipitation situation. The discussion looks at applications of this approach. In the solutes remaining after the precipitations, there seems to be little consistency in the calculated concentrations and activity coefficients. They do not appear to depend in any coherent manner on the database or activity-coefficient relationship used. These results reinforce warnings in the literature about perfunctory or mechanical use of numerical speciation codes.

  4. Self-adaptive numerical integrator for analytic functions

    International Nuclear Information System (INIS)

    Garribba, S.; Quartapelle, L.; Reina, G.

    1978-01-01

    A new adaptive algorithm for the integration of analytical functions is presented. The algorithm processes the integration interval by generating local subintervals whose length is controlled through a feedback loop. The control is obtained by means of a relation derived on an analytical basis and valid for an arbitrary integration rule: two different estimates of an integral are used to compute the interval length necessary to obtain an integral estimate with accuracy within the assigned error bounds. The implied method for local generation of subintervals and an effective assumption of error partition among subintervals give rise to an adaptive algorithm provided with a highly accurate and very efficient integration procedure. The particular algorithm obtained by choosing the 6-point Gauss-Legendre integration rule is considered and extensive comparisons are made with other outstanding integration algorithms
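The feedback idea, two estimates of the same subinterval controlling the local interval length, can be sketched with a stdlib-only adaptive quadrature. The sketch uses Simpson's rule rather than the paper's 6-point Gauss-Legendre rule:

```python
import math

def _simpson(f, a, b):
    # one Simpson step over [a, b]
    return (b - a) / 6.0 * (f(a) + 4.0 * f((a + b) / 2.0) + f(b))

def adaptive_integrate(f, a, b, tol=1e-9):
    # Two estimates of the same subinterval (one Simpson step vs. two halves)
    # decide whether the local interval is short enough, in the spirit of the
    # feedback loop described above; the error budget is partitioned among
    # subintervals by halving tol on each split.
    whole = _simpson(f, a, b)
    mid = (a + b) / 2.0
    left, right = _simpson(f, a, mid), _simpson(f, mid, b)
    if abs(left + right - whole) < 15.0 * tol:
        return left + right + (left + right - whole) / 15.0
    return (adaptive_integrate(f, a, mid, tol / 2.0) +
            adaptive_integrate(f, mid, b, tol / 2.0))

approx = adaptive_integrate(math.exp, 0.0, 1.0)   # exact value is e - 1
```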

  5. Improving Microbial Genome Annotations in an Integrated Database Context

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.; Chu, Ken; Anderson, Iain; Mavromatis, Konstantinos; Kyrpides, Nikos C.; Ivanova, Natalia N.

    2013-01-01

    Effective comparative analysis of microbial genomes requires a consistent and complete view of biological data. Consistency regards the biological coherence of annotations, while completeness regards the extent and coverage of functional characterization for genomes. We have developed tools that allow scientists to assess and improve the consistency and completeness of microbial genome annotations in the context of the Integrated Microbial Genomes (IMG) family of systems. All publicly available microbial genomes are characterized in IMG using different functional annotation and pathway resources, thus providing a comprehensive framework for identifying and resolving annotation discrepancies. A rule based system for predicting phenotypes in IMG provides a powerful mechanism for validating functional annotations, whereby the phenotypic traits of an organism are inferred based on the presence of certain metabolic reactions and pathways and compared to experimentally observed phenotypes. The IMG family of systems are available at http://img.jgi.doe.gov/. PMID:23424620

  6. Improving microbial genome annotations in an integrated database context.

    Directory of Open Access Journals (Sweden)

    I-Min A Chen

Full Text Available Effective comparative analysis of microbial genomes requires a consistent and complete view of biological data. Consistency regards the biological coherence of annotations, while completeness regards the extent and coverage of functional characterization for genomes. We have developed tools that allow scientists to assess and improve the consistency and completeness of microbial genome annotations in the context of the Integrated Microbial Genomes (IMG) family of systems. All publicly available microbial genomes are characterized in IMG using different functional annotation and pathway resources, thus providing a comprehensive framework for identifying and resolving annotation discrepancies. A rule based system for predicting phenotypes in IMG provides a powerful mechanism for validating functional annotations, whereby the phenotypic traits of an organism are inferred based on the presence of certain metabolic reactions and pathways and compared to experimentally observed phenotypes. The IMG family of systems are available at http://img.jgi.doe.gov/.

  7. Construction, database integration, and application of an Oenothera EST library.

    Science.gov (United States)

    Mrácek, Jaroslav; Greiner, Stephan; Cho, Won Kyong; Rauwolf, Uwe; Braun, Martha; Umate, Pavan; Altstätter, Johannes; Stoppel, Rhea; Mlcochová, Lada; Silber, Martina V; Volz, Stefanie M; White, Sarah; Selmeier, Renate; Rudd, Stephen; Herrmann, Reinhold G; Meurer, Jörg

    2006-09-01

    Coevolution of cellular genetic compartments is a fundamental aspect in eukaryotic genome evolution that becomes apparent in serious developmental disturbances after interspecific organelle exchanges. The genus Oenothera represents a unique, at present the only available, resource to study the role of the compartmentalized plant genome in diversification of populations and speciation processes. An integrated approach involving cDNA cloning, EST sequencing, and bioinformatic data mining was chosen using Oenothera elata with the genetic constitution nuclear genome AA with plastome type I. The Gene Ontology system grouped 1621 unique gene products into 17 different functional categories. Application of arrays generated from a selected fraction of ESTs revealed significantly differing expression profiles among closely related Oenothera species possessing the potential to generate fertile and incompatible plastid/nuclear hybrids (hybrid bleaching). Furthermore, the EST library provides a valuable source of PCR-based polymorphic molecular markers that are instrumental for genotyping and molecular mapping approaches.

  8. Numerical modeling in photonic crystals integrated technology: the COPERNICUS Project

    DEFF Research Database (Denmark)

    Malaguti, Stefania; Armaroli, Andrea; Bellanca, Gaetano

    2011-01-01

Photonic crystals will play a fundamental role in the future of optical communications. The relevance of the numerical modeling for the success of this technology is assessed by using some examples concerning the experience of the COPERNICUS Project.

  9. CyanOmics: an integrated database of omics for the model cyanobacterium Synechococcus sp. PCC 7002.

    Science.gov (United States)

    Yang, Yaohua; Feng, Jie; Li, Tao; Ge, Feng; Zhao, Jindong

    2015-01-01

    Cyanobacteria are an important group of organisms that carry out oxygenic photosynthesis and play vital roles in both the carbon and nitrogen cycles of the Earth. The annotated genome of Synechococcus sp. PCC 7002, as an ideal model cyanobacterium, is available. A series of transcriptomic and proteomic studies of Synechococcus sp. PCC 7002 cells grown under different conditions have been reported. However, no database of such integrated omics studies has been constructed. Here we present CyanOmics, a database based on the results of Synechococcus sp. PCC 7002 omics studies. CyanOmics comprises one genomic dataset, 29 transcriptomic datasets and one proteomic dataset and should prove useful for systematic and comprehensive analysis of all those data. Powerful browsing and searching tools are integrated to help users directly access information of interest with enhanced visualization of the analytical results. Furthermore, Blast is included for sequence-based similarity searching and Cluster 3.0, as well as the R hclust function is provided for cluster analyses, to increase CyanOmics's usefulness. To the best of our knowledge, it is the first integrated omics analysis database for cyanobacteria. This database should further understanding of the transcriptional patterns, and proteomic profiling of Synechococcus sp. PCC 7002 and other cyanobacteria. Additionally, the entire database framework is applicable to any sequenced prokaryotic genome and could be applied to other integrated omics analysis projects. Database URL: http://lag.ihb.ac.cn/cyanomics. © The Author(s) 2015. Published by Oxford University Press.

  10. Integrating numerical computation into the undergraduate education physics curriculum using spreadsheet excel

    Science.gov (United States)

    Fauzi, Ahmad

    2017-11-01

Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, supports learning through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate education physics students in Indonesia. Incorporating numerical computation into the undergraduate education physics curriculum presents many challenges. The main challenges are the dense curriculum, which makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate education physics curriculum. The participants of this research were 54 students in the fourth semester of the physics education department. As a result, we concluded that numerical computation could be integrated into the undergraduate education physics curriculum using spreadsheet Excel combined with another course. The results of this research complement studies on how to integrate numerical computation into learning physics using spreadsheet Excel.
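The kind of computation such a course builds in Excel columns can be sketched in a few lines of Python; the decay problem, step size, and update formula are illustrative, not taken from the study:

```python
import math

# Spreadsheet-style Euler integration of dy/dt = -k*y: each loop iteration
# corresponds to one spreadsheet row, and the update y = y + dt*(-k*y) is
# the formula a student would drag down a column. Problem and step size
# are illustrative.
def euler_decay(k=1.0, y0=1.0, dt=0.001, steps=1000):
    y = y0
    for _ in range(steps):
        y = y + dt * (-k * y)
    return y

y_numeric = euler_decay()            # Euler estimate of y(1)
y_exact = math.exp(-1.0)             # analytic solution for comparison
```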

  11. Development of integrated parameter database for risk assessment at the Rokkasho Reprocessing Plant

    International Nuclear Information System (INIS)

    Tamauchi, Yoshikazu

    2011-01-01

A study to develop a parameter database for Probabilistic Safety Assessment (PSA) for the application of risk information to plant operation and maintenance activity is important because the transparency, consistency, and traceability of parameters are needed to explain the adequacy of the evaluation to third parties. For the application of risk information to plant operation and maintenance activity, equipment reliability data, human error rates, and the five factors of the 'five-factor formula' for estimating the amount of radioactive material discharged (source term) are key inputs. As a part of the infrastructure development for risk information application, we developed an integrated parameter database, 'R-POD' (Rokkasho reprocessing Plant Omnibus parameter Database), on a trial basis for the PSA of the Rokkasho Reprocessing Plant. This database consists primarily of the following 3 parts: 1) an equipment reliability database, 2) a five-factor formula database, and 3) a human reliability database. The underpinning for explaining the validity of the risk assessment can be improved by developing this database. Furthermore, this database is an important tool for the application of risk information, because it provides updated data by incorporating the accumulated operational experience of the Rokkasho reprocessing plant. (author)

  12. CTDB: An Integrated Chickpea Transcriptome Database for Functional and Applied Genomics

    OpenAIRE

    Verma, Mohit; Kumar, Vinay; Patel, Ravi K.; Garg, Rohini; Jain, Mukesh

    2015-01-01

    Chickpea is an important grain legume used as a rich source of protein in human diet. The narrow genetic diversity and limited availability of genomic resources are the major constraints in implementing breeding strategies and biotechnological interventions for genetic enhancement of chickpea. We developed an integrated Chickpea Transcriptome Database (CTDB), which provides the comprehensive web interface for visualization and easy retrieval of transcriptome data in chickpea. The database fea...

  13. GDR (Genome Database for Rosaceae: integrated web resources for Rosaceae genomics and genetics research

    Directory of Open Access Journals (Sweden)

    Ficklin Stephen

    2004-09-01

Full Text Available Abstract Background Peach is being developed as a model organism for Rosaceae, an economically important family that includes fruits and ornamental plants such as apple, pear, strawberry, cherry, almond and rose. The genomics and genetics data of peach can play a significant role in the gene discovery and the genetic understanding of related species. The effective utilization of these peach resources, however, requires the development of an integrated and centralized database with associated analysis tools. Description The Genome Database for Rosaceae (GDR) is a curated and integrated web-based relational database. GDR contains comprehensive data of the genetically anchored peach physical map, an annotated peach EST database, Rosaceae maps and markers and all publicly available Rosaceae sequences. Annotations of ESTs include contig assembly, putative function, simple sequence repeats, and anchored position to the peach physical map where applicable. Our integrated map viewer provides graphical interface to the genetic, transcriptome and physical mapping information. ESTs, BACs and markers can be queried by various categories and the search result sites are linked to the integrated map viewer or to the WebFPC physical map sites. In addition to browsing and querying the database, users can compare their sequences with the annotated GDR sequences via a dedicated sequence similarity server running either the BLAST or FASTA algorithm. To demonstrate the utility of the integrated and fully annotated database and analysis tools, we describe a case study where we anchored Rosaceae sequences to the peach physical and genetic map by sequence similarity. Conclusions The GDR has been initiated to meet the major deficiency in Rosaceae genomics and genetics research, namely a centralized web database and bioinformatics tools for data storage, analysis and exchange. GDR can be accessed at http://www.genome.clemson.edu/gdr/.

  14. GDR (Genome Database for Rosaceae): integrated web resources for Rosaceae genomics and genetics research.

    Science.gov (United States)

    Jung, Sook; Jesudurai, Christopher; Staton, Margaret; Du, Zhidian; Ficklin, Stephen; Cho, Ilhyung; Abbott, Albert; Tomkins, Jeffrey; Main, Dorrie

    2004-09-09

    Peach is being developed as a model organism for Rosaceae, an economically important family that includes fruits and ornamental plants such as apple, pear, strawberry, cherry, almond and rose. The genomics and genetics data of peach can play a significant role in the gene discovery and the genetic understanding of related species. The effective utilization of these peach resources, however, requires the development of an integrated and centralized database with associated analysis tools. The Genome Database for Rosaceae (GDR) is a curated and integrated web-based relational database. GDR contains comprehensive data of the genetically anchored peach physical map, an annotated peach EST database, Rosaceae maps and markers and all publicly available Rosaceae sequences. Annotations of ESTs include contig assembly, putative function, simple sequence repeats, and anchored position to the peach physical map where applicable. Our integrated map viewer provides graphical interface to the genetic, transcriptome and physical mapping information. ESTs, BACs and markers can be queried by various categories and the search result sites are linked to the integrated map viewer or to the WebFPC physical map sites. In addition to browsing and querying the database, users can compare their sequences with the annotated GDR sequences via a dedicated sequence similarity server running either the BLAST or FASTA algorithm. To demonstrate the utility of the integrated and fully annotated database and analysis tools, we describe a case study where we anchored Rosaceae sequences to the peach physical and genetic map by sequence similarity. The GDR has been initiated to meet the major deficiency in Rosaceae genomics and genetics research, namely a centralized web database and bioinformatics tools for data storage, analysis and exchange. GDR can be accessed at http://www.genome.clemson.edu/gdr/.

  15. Integration of TGS and CTEN assays using the CTENFIT analysis and databasing program

    International Nuclear Information System (INIS)

    Estep, R.

    2000-01-01

The CTENFIT program, written for Windows 9x/NT in C++, performs databasing and analysis of combined thermal/epithermal neutron (CTEN) passive and active neutron assay data and integrates that with isotopics results and gamma-ray data from methods such as tomographic gamma scanning (TGS). The binary database is reflected in a companion Excel database that allows extensive customization via Visual Basic for Applications macros. Automated analysis options make the analysis of the data transparent to the assay system operator. Various record browsers and information displays simplify record-keeping tasks.

  16. Optimal stability polynomials for numerical integration of initial value problems

    KAUST Repository

    Ketcheson, David I.; Ahmadia, Aron

    2013-01-01

    We consider the problem of finding optimally stable polynomial approximations to the exponential for application to one-step integration of initial value ordinary and partial differential equations. The objective is to find the largest stable step

  17. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.; Sasajima, K.; Matsugaki, N.; Suzuki, M.; Kosuge, T.; Wakatsuki, S.

    2004-01-01

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. Main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored (except raw X-ray data that are stored in a central data server) in a MySQL relational database. The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystal and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using the secure SSL connection using secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments

  18. Numerical method for solving linear Fredholm fuzzy integral equations of the second kind

    Energy Technology Data Exchange (ETDEWEB)

    Abbasbandy, S. [Department of Mathematics, Imam Khomeini International University, P.O. Box 288, Ghazvin 34194 (Iran, Islamic Republic of)]. E-mail: saeid@abbasbandy.com; Babolian, E. [Faculty of Mathematical Sciences and Computer Engineering, Teacher Training University, Tehran 15618 (Iran, Islamic Republic of); Alavi, M. [Department of Mathematics, Arak Branch, Islamic Azad University, Arak 38135 (Iran, Islamic Republic of)

    2007-01-15

In this paper we use the parametric form of fuzzy numbers and convert a linear fuzzy Fredholm integral equation into two linear systems of integral equations of the second kind in the crisp case. One can then apply a numerical method, such as the Nystrom method, to find an approximate solution of each system and hence obtain an approximation for the fuzzy solution of the linear fuzzy Fredholm integral equation of the second kind. The proposed method is illustrated by solving some numerical examples.
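A minimal sketch of the Nystrom step applied to one crisp Fredholm equation of the second kind, assuming trapezoidal quadrature and fixed-point (Neumann) iteration rather than a direct linear solve; the kernel, right-hand side, and lambda are illustrative test data, not taken from the paper:

```python
# Nystrom discretization of a crisp Fredholm equation of the second kind,
#   u(x) = f(x) + lam * integral_0^1 K(x, t) u(t) dt,
# on trapezoidal nodes, solved by fixed-point iteration (valid when the
# iteration is contractive, i.e. |lam| * ||K|| < 1).
def nystrom_fixed_point(f, K, lam, n=100, iters=60):
    h = 1.0 / n
    xs = [i * h for i in range(n + 1)]
    w = [h] * (n + 1)
    w[0] = w[-1] = h / 2.0                  # trapezoidal quadrature weights
    u = [f(x) for x in xs]
    for _ in range(iters):
        u = [f(x) + lam * sum(w[j] * K(x, xs[j]) * u[j] for j in range(n + 1))
             for x in xs]
    return xs, u

# Illustrative data: K(x,t) = x*t, f(x) = (2/3)*x, lam = 1 has the exact
# solution u(x) = x, since integral_0^1 t*t dt = 1/3.
xs, u = nystrom_fixed_point(lambda x: 2.0 * x / 3.0, lambda x, t: x * t, 1.0)
```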

  19. A perspective for biomedical data integration: Design of databases for flow cytometry

    Directory of Open Access Journals (Sweden)

    Lakoumentas John

    2008-02-01

Full Text Available Abstract Background The integration of biomedical information is essential for tackling medical problems. We describe a data model in the domain of flow cytometry (FC), allowing for massive management, analysis and integration with other laboratory and clinical information. The paper is concerned with the proper translation of the Flow Cytometry Standard (FCS) into a relational database schema, in a way that supports end users either in doing research on FC or in studying specific cases of patients who have undergone FC analysis. Results The proposed database schema provides integration of data originating from diverse acquisition settings, organized in a way that allows syntactically simple queries that provide results significantly faster than the conventional implementations of the FCS standard. The proposed schema can potentially achieve up to 8 orders of magnitude reduction in query complexity and up to 2 orders of magnitude reduction in response time for data originating from flow cytometers that record 256 colours. This is mainly achieved by managing to maintain an almost constant number of data-mining procedures regardless of the size and complexity of the stored information. Conclusion It is evident that using single-file data storage standards for the design of databases without any structural transformations significantly limits the flexibility of databases. Analysis of the requirements of a specific domain for integration and massive data processing can provide the necessary schema modifications that will unlock the additional functionality of a relational database.
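The schema idea, one row per (event, channel) measurement rather than one opaque file per acquisition, can be sketched with the sqlite3 module from the Python standard library; the table and channel names are hypothetical simplifications, not the schema proposed in the paper:

```python
import sqlite3

# Toy relational translation of flow cytometry data: measurements are stored
# long-format, one row per (acquisition, event, channel), so analyses become
# plain SQL instead of custom FCS file parsing. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE acquisition (acq_id INTEGER PRIMARY KEY, instrument TEXT);
CREATE TABLE event_value (acq_id INTEGER, event_no INTEGER, channel TEXT, value REAL);
INSERT INTO acquisition VALUES (1, 'cytometer-A');
INSERT INTO event_value VALUES
  (1, 1, 'FSC', 200.0), (1, 1, 'SSC', 90.0),
  (1, 2, 'FSC', 400.0), (1, 2, 'SSC', 110.0);
""")
# A syntactically simple query that single-file storage cannot answer
# without parsing: mean forward scatter per acquisition.
row = cur.execute("""
SELECT acq_id, AVG(value) FROM event_value
WHERE channel = 'FSC' GROUP BY acq_id
""").fetchone()
```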

  20. Numerical evaluation of path-integral solutions to Fokker-Planck equations. II. Restricted stochastic processes

    International Nuclear Information System (INIS)

    Wehner, M.F.

    1983-01-01

A path-integral solution is derived for processes described by nonlinear Fokker-Planck equations together with externally imposed boundary conditions. This path-integral solution is written in the form of a path sum for small time steps and contains, in addition to the conventional volume integral, a surface integral which incorporates the boundary conditions. A previously developed numerical method, based on a histogram representation of the probability distribution, is extended to a trapezoidal representation. This improved numerical approach is combined with the present path-integral formalism for restricted processes and is shown to give accurate results. 35 refs., 5 figs
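A crude stdlib-only sketch of the histogram representation under a short-time Gaussian propagator. It handles the bounded domain by renormalizing each source column, a stand-in for the paper's surface-integral term, and all coefficients are illustrative, not from the paper:

```python
import math

# Histogram-representation sketch of a short-time path-sum solution of a
# constant-coefficient Fokker-Planck equation with drift a and diffusion D
# on a bounded grid. Renormalizing each source column conserves probability
# on the finite domain (a crude proxy for the boundary surface term).
def propagate(a=1.0, D=0.25, dt=0.05, steps=10, xmin=-3.0, xmax=3.0, nx=121):
    dx = (xmax - xmin) / (nx - 1)
    xs = [xmin + i * dx for i in range(nx)]
    var = 2.0 * D * dt                        # short-time Gaussian variance
    T = []                                    # T[j][i]: prob of xs[j] -> xs[i]
    for j in range(nx):
        col = [math.exp(-(xs[i] - xs[j] - a * dt) ** 2 / (2.0 * var))
               for i in range(nx)]
        s = sum(col)
        T.append([c / s for c in col])        # normalize each source column
    p = [0.0] * nx
    p[nx // 2] = 1.0                          # initial delta at x = 0
    for _ in range(steps):
        p = [sum(T[j][i] * p[j] for j in range(nx)) for i in range(nx)]
    mean = sum(x * pi for x, pi in zip(xs, p))
    return p, mean

p, mean = propagate()   # after t = 0.5 the mean should sit near a*t = 0.5
```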

  1. Numerical Time Integration Methods for a Point Absorber Wave Energy Converter

    DEFF Research Database (Denmark)

    Zurkinden, Andrew Stephen; Kramer, Morten

    2012-01-01

    on a discretization of the convolution integral. The calculation of the convolution integral is performed at each time step regardless of the chosen numerical scheme. In the second model the convolution integral is replaced by a system of linear ordinary differential equations. The formulation of the state...

  2. LmSmdB: an integrated database for metabolic and gene regulatory network in Leishmania major and Schistosoma mansoni

    Directory of Open Access Journals (Sweden)

    Priyanka Patel

    2016-03-01

Full Text Available A database that integrates all the information required for biological processing in one platform is essential. We have attempted to create one such integrated database that can be a one-stop shop for the essential features required to fetch valuable results. LmSmdB (L. major and S. mansoni database) is an integrated database that accounts for the biological networks and regulatory pathways computationally determined by integrating the knowledge of the genome sequences of the mentioned organisms. It is the first database of its kind that, together with the network design, shows the simulation pattern of the product. This database intends to create a comprehensive canopy for the regulation of lipid metabolism reactions in the parasite by integrating the transcription factors, regulatory genes and the protein products controlled by the transcription factors, and hence operating the metabolism at the genetic level. Keywords: L.major, S.mansoni, Regulatory networks, Transcription factors, Database

  3. Numerical Integration of Stiff System of Ordinary Differential ...

    African Journals Online (AJOL)

The goal of this work is to develop, analyse and implement a K-step implicit rational Runge-Kutta scheme for the integration of stiff systems of ordinary differential equations. Its development adopted Taylor and binomial series expansion techniques to generate its parameters. The analysis of its basic properties adopted ...
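The motivation for implicitness can be shown on the standard stiff test equation. This is not the K-step implicit rational Runge-Kutta scheme itself, only the simplest contrast between an explicit and an implicit one-step method:

```python
# On the stiff test equation y' = lam*y with lam = -50 and step h = 0.1,
# explicit Euler violates its stability bound |1 + h*lam| <= 1 and blows up,
# while implicit Euler damps the solution for any step size.
def explicit_euler(lam, h, steps, y0=1.0):
    y = y0
    for _ in range(steps):
        y = y + h * lam * y          # amplification factor (1 + h*lam)
    return y

def implicit_euler(lam, h, steps, y0=1.0):
    y = y0
    for _ in range(steps):
        y = y / (1.0 - h * lam)      # solves y_next = y + h*lam*y_next
    return y

ye = explicit_euler(-50.0, 0.1, 20)  # |1 + h*lam| = 4: diverges
yi = implicit_euler(-50.0, 0.1, 20)  # decays, like the exact e^(lam*t)
```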

  4. Numerical estimation of structural integrity of salt cavern wells.

    NARCIS (Netherlands)

    Orlic, B.; Thienen-Visser, K. van; Schreppers, G.J.

    2016-01-01

    Finite element analyses were performed to estimate axial deformation of cavern wells due to gas storage operations in solution-mined salt caverns. Caverns shrink over time due to salt creep and the cavern roof subsides potentially threatening well integrity. Cavern deformation, deformation of salt

  5. KaBOB: ontology-based semantic integration of biomedical databases.

    Science.gov (United States)

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrate it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats.
KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for
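
The declaratively represented forward-chaining rules mentioned above can be illustrated with a minimal fixed-point loop over (subject, predicate, object) triples. The rule, predicate names, and identifiers below are invented for illustration and are not KaBOB's actual rule syntax or schema:

```python
def forward_chain(facts, rules):
    """Apply forward-chaining rules to a set of (subject, predicate, object)
    triples until no new facts can be derived (a fixed point)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            new = set(rule(facts)) - facts
            if new:
                facts |= new
                changed = True
    return facts

def shared_xref_rule(facts):
    """Toy rule: two database records sharing an external cross-reference
    denote the same biomedical concept (predicate names are made up)."""
    by_xref = {}
    for s, p, o in facts:
        if p == "hasXref":
            by_xref.setdefault(o, set()).add(s)
    for records in by_xref.values():
        for a in records:
            for b in records:
                if a != b:
                    yield (a, "sameConceptAs", b)

facts = forward_chain(
    {("uniprot:P04637", "hasXref", "hgnc:11998"),
     ("ncbigene:7157", "hasXref", "hgnc:11998")},
    [shared_xref_rule],
)
```

Real systems express such rules declaratively (e.g. as SPARQL or production rules) rather than as Python functions, but the fixed-point behavior is the same.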

  6. CTDB: An Integrated Chickpea Transcriptome Database for Functional and Applied Genomics.

    Directory of Open Access Journals (Sweden)

    Mohit Verma

    Full Text Available Chickpea is an important grain legume used as a rich source of protein in the human diet. The narrow genetic diversity and limited availability of genomic resources are the major constraints in implementing breeding strategies and biotechnological interventions for genetic enhancement of chickpea. We developed an integrated Chickpea Transcriptome Database (CTDB), which provides a comprehensive web interface for visualization and easy retrieval of transcriptome data in chickpea. The database features many tools for similarity search, functional annotation (putative function, PFAM domain and gene ontology search) and comparative gene expression analysis. The current release of CTDB (v2.0) hosts transcriptome datasets with high quality functional annotation from cultivated (desi and kabuli) and wild chickpea. A catalog of transcription factor families and their expression profiles in chickpea is available in the database. The gene expression data have been integrated to study the expression profiles of chickpea transcripts in major tissues/organs and various stages of flower development. Utilities such as similarity search, ortholog identification and comparative gene expression have also been implemented in the database to facilitate comparative genomic studies among different legumes and Arabidopsis. Furthermore, the CTDB represents a resource for the discovery of functional molecular markers (microsatellites and single nucleotide polymorphisms) between different chickpea types. We anticipate that the integrated information content of this database will accelerate functional and applied genomic research for the improvement of chickpea. The CTDB web service is freely available at http://nipgr.res.in/ctdb.html.

  7. Numerical integration of the Teukolsky equation in the time domain

    International Nuclear Information System (INIS)

    Pazos-Avalos, Enrique; Lousto, Carlos O.

    2005-01-01

    We present a fourth-order convergent (2+1)-dimensional numerical formalism to solve the Teukolsky equation in the time domain. Our approach is first to rewrite the Teukolsky equation as a system of first-order differential equations. In this way we get a system that has the form of an advection equation. This is then used in combination with a series expansion of the solution in powers of time. To obtain a fourth-order scheme we kept terms up to the fourth derivative in time and used the advection-like system of differential equations to substitute the temporal derivatives by spatial derivatives. This scheme is applied to evolve gravitational perturbations in the Schwarzschild and Kerr backgrounds. Our numerical method proved to be stable and fourth-order convergent in the r* and θ directions. The correct power-law tail, ∼1/t^(2l+3), for general initial data, and ∼1/t^(2l+4), for time-symmetric data, was found in our runs. We noted that it is crucial to resolve accurately the angular dependence of the mode at late times in order to obtain these values of the exponents in the power-law decay. In other cases, when the decay was too fast and round-off error was reached before a tail developed, the quasinormal mode frequencies provided a test to determine the validity of our code.

  8. Numerical Simulation of Antennas with Improved Integral Equation Method

    International Nuclear Information System (INIS)

    Ma Ji; Fang Guang-You; Lu Wei

    2015-01-01

    Simulating antennas around a conducting object is a challenging task in computational electromagnetics, which is concerned with the behaviour of electromagnetic fields. To analyze this model efficiently, an improved integral equation-fast Fourier transform (IE-FFT) algorithm is presented in this paper. The proposed scheme employs two Cartesian grids of different size and location to enclose the antenna and the other object, respectively. On the one hand, the IE-FFT technique is used to store the matrix in a sparse form and accelerate the matrix-vector multiplication for each sub-domain independently. On the other hand, the mutual interaction between sub-domains is taken as an additional exciting voltage in each matrix equation. By updating the integral equations several times, the whole electromagnetic system reaches a stable state. Finally, the validity of the presented method is verified through the analysis of typical antennas in the presence of a conducting object. (paper)
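
The sub-domain iteration described above, where each sub-domain is re-solved with the other's contribution as an additional excitation until the system stabilizes, is essentially a block Gauss-Seidel sweep. A dense linear-algebra sketch (small matrices stand in for the IE-FFT-accelerated operators; all values are illustrative):

```python
import numpy as np

def coupled_solve(A1, b1, A2, b2, C12, C21, sweeps=50):
    """Block Gauss-Seidel iteration for two coupled sub-domain systems:
        A1 @ x1 = b1 + C12 @ x2   (coupling enters as extra excitation)
        A2 @ x2 = b2 + C21 @ x1
    Dense solves stand in for the accelerated sub-domain solves."""
    x1 = np.linalg.solve(A1, b1)
    x2 = np.linalg.solve(A2, b2)
    for _ in range(sweeps):
        x1 = np.linalg.solve(A1, b1 + C12 @ x2)
        x2 = np.linalg.solve(A2, b2 + C21 @ x1)
    return x1, x2

# Small test problem with weak coupling, so the iteration converges.
A1 = np.array([[4.0, 1.0], [1.0, 3.0]])
A2 = np.array([[5.0, 0.5], [0.5, 4.0]])
C12 = 0.1 * np.ones((2, 2))
C21 = 0.2 * np.ones((2, 2))
b1 = np.array([1.0, 2.0])
b2 = np.array([0.5, 1.0])
x1, x2 = coupled_solve(A1, b1, A2, b2, C12, C21)
```

The iteration converges when the coupling operators are weak relative to the sub-domain operators, which matches the physical setting of two well-separated scatterers.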

  9. Analysis of Numerical Simulation Database for Pressure Fluctuations Induced by High-Speed Turbulent Boundary Layers

    Science.gov (United States)

    Duan, Lian; Choudhari, Meelan M.

    2014-01-01

    Direct numerical simulations (DNS) of a turbulent boundary layer with nominal freestream Mach number 6 and Reynolds number Re_τ ≈ 460 are conducted at two wall temperatures (Tw/Tr = 0.25, 0.76) to investigate the generated pressure fluctuations and their dependence on wall temperature. Simulations indicate that the influence of wall temperature on pressure fluctuations is largely limited to the near-wall region, with the characteristics of wall-pressure fluctuations showing a strong temperature dependence. The freestream radiation intensity compares well between wall-temperature cases when normalized by the local wall shear; the propagation speed of the freestream pressure signal and the orientation of the radiation wave front show little dependence on the wall temperature.

  10. An object-oriented language-database integration model: The composition filters approach

    NARCIS (Netherlands)

    Aksit, Mehmet; Bergmans, Lodewijk; Vural, Sinan; Vural, S.

    1991-01-01

    This paper introduces a new model, based on so-called object-composition filters, that uniformly integrates database-like features into an object-oriented language. The focus is on providing persistent dynamic data structures, data sharing, transactions, multiple views and associative access,

  11. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii

    Directory of Open Access Journals (Sweden)

    Kempa Stefan

    2009-05-01

    Full Text Available Abstract Background The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. Results In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. Conclusion ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.

  12. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.

    Science.gov (United States)

    May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk

    2009-05-04

    The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.

  13. Document control system as an integral part of RA documentation database application

    International Nuclear Information System (INIS)

    Steljic, M.M.; Ljubenov, V.Lj. E-mail address of corresponding author: milijanas@vin.bg.ac.yu

    2005-01-01

    The decision about the final shutdown of the RA research reactor in the Vinca Institute was made in 2002, and the preparations for its decommissioning have therefore begun. All activities are supervised by the International Atomic Energy Agency (IAEA), which also provides technical and expert support. This paper describes the document control system as an integral part of the existing RA documentation database. (author)

  14. An Object-Oriented Language-Database Integration Model: The Composition-Filters Approach

    NARCIS (Netherlands)

    Aksit, Mehmet; Bergmans, Lodewijk; Vural, S.; Vural, Sinan; Lehrmann Madsen, O.

    1992-01-01

    This paper introduces a new model, based on so-called object-composition filters, that uniformly integrates database-like features into an object-oriented language. The focus is on providing persistent dynamic data structures, data sharing, transactions, multiple views and associative access,

  15. Critical assessment of human metabolic pathway databases: a stepping stone for future integration

    Directory of Open Access Journals (Sweden)

    Stobbe Miranda D

    2011-10-01

    Full Text Available Abstract Background Multiple pathway databases are available that describe the human metabolic network and have proven their usefulness in many applications, ranging from the analysis and interpretation of high-throughput data to their use as a reference repository. However, so far the various human metabolic networks described by these databases have not been systematically compared and contrasted, nor has the extent to which they differ been quantified. For a researcher using these databases for particular analyses of human metabolism, it is crucial to know the extent of the differences in content and their underlying causes. Moreover, the outcomes of such a comparison are important for ongoing integration efforts. Results We compared the genes, EC numbers and reactions of five frequently used human metabolic pathway databases. The overlap is surprisingly low, especially at the reaction level, where the databases agree on only 3% of the 6968 reactions they contain in total. Even for the well-established tricarboxylic acid cycle the databases agree on only 5 out of the 30 reactions in total. We identified the main causes for the lack of overlap. Importantly, the databases are partly complementary. Other explanations include the number of steps a conversion is described in and the number of possible alternative substrates listed. Missing metabolite identifiers and ambiguous names for metabolites also affect the comparison. Conclusions Our results show that each of the five networks compared provides us with a valuable piece of the puzzle of the complete reconstruction of the human metabolic network. To enable integration of the networks, next to a need for standardizing the metabolite names and identifiers, the conceptual differences between the databases should be resolved. Considerable manual intervention is required to reach the ultimate goal of a unified and biologically accurate model for studying the systems biology of human metabolism.
Our comparison
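
At its core, such a comparison is set algebra over shared identifiers (genes, EC numbers, reactions). A toy sketch with invented EC-number sets, not the paper's actual data:

```python
# Reactions identified by EC number in three hypothetical pathway databases.
db_reactions = {
    "DB1": {"1.1.1.1", "2.3.3.1", "4.2.1.3"},
    "DB2": {"2.3.3.1", "4.2.1.3", "6.2.1.5"},
    "DB3": {"2.3.3.1", "1.2.4.2", "6.2.1.5"},
}

union = set().union(*db_reactions.values())           # everything any database lists
consensus = set.intersection(*db_reactions.values())  # agreed on by all databases
consensus_fraction = len(consensus) / len(union)
```

In practice the hard part is precisely what the abstract points out: mapping ambiguous metabolite names and missing identifiers onto a common namespace before the set operations become meaningful.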

  16. Numerical integration for ab initio many-electron self energy calculations within the GW approximation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Fang, E-mail: fliu@lsec.cc.ac.cn [School of Statistics and Mathematics, Central University of Finance and Economics, Beijing 100081 (China); Lin, Lin, E-mail: linlin@math.berkeley.edu [Department of Mathematics, University of California, Berkeley, CA 94720 (United States); Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Vigil-Fowler, Derek, E-mail: vigil@berkeley.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Lischner, Johannes, E-mail: jlischner597@gmail.com [Department of Physics, University of California, Berkeley, CA 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Kemper, Alexander F., E-mail: afkemper@lbl.gov [Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Sharifzadeh, Sahar, E-mail: ssharifz@bu.edu [Department of Electrical and Computer Engineering and Division of Materials Science and Engineering, Boston University, Boston, MA 02215 (United States); Jornada, Felipe H. da, E-mail: jornada@berkeley.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Deslippe, Jack, E-mail: jdeslippe@lbl.gov [NERSC, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Yang, Chao, E-mail: cyang@lbl.gov [Computational Research Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); and others

    2015-04-01

    We present a numerical integration scheme for evaluating the convolution of a Green's function with a screened Coulomb potential on the real axis in the GW approximation of the self energy. Our scheme takes the zero broadening limit in Green's function first, replaces the numerator of the integrand with a piecewise polynomial approximation, and performs principal value integration on subintervals analytically. We give the error bound of our numerical integration scheme and show by numerical examples that it is more reliable and accurate than the standard quadrature rules such as the composite trapezoidal rule. We also discuss the benefit of using different self energy expressions to perform the numerical convolution at different frequencies.
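
The core idea, replacing the numerator by a piecewise polynomial and performing the principal-value integration analytically on each subinterval, can be sketched in one dimension. This is a piecewise-linear toy version, not the GW implementation:

```python
import numpy as np

def pv_integral(f, x0, a, b, n=1000):
    """Principal value of the integral of f(x)/(x - x0) over [a, b], a < x0 < b.

    f is replaced by a piecewise-linear interpolant on grids that place the
    pole x0 exactly on a node; each subinterval is then integrated
    analytically, and the divergent log terms of the two intervals adjacent
    to the pole cancel in the principal-value limit."""
    total = 0.0
    grids = (np.linspace(a, x0, n + 1), np.linspace(x0, b, n + 1))
    for xs in grids:
        fx = f(xs)
        for xl, xr, fl, fr in zip(xs[:-1], xs[1:], fx[:-1], fx[1:]):
            s = (fr - fl) / (xr - xl)     # slope of the local linear piece
            c0 = fl + s * (x0 - xl)       # the linear piece evaluated at x0
            total += s * (xr - xl)        # regular part, always finite
            if xl != x0 and xr != x0:     # log part; pole intervals handled below
                total += c0 * (np.log(abs(xr - x0)) - np.log(abs(xl - x0)))
    # Combined finite remainder of the two intervals touching the pole.
    h_left = grids[0][1] - grids[0][0]
    h_right = grids[1][1] - grids[1][0]
    total += f(x0) * (np.log(h_right) - np.log(h_left))
    return total
```

For f(x) = x the principal value over [-1, 1] is exactly 2; for f(x) = exp(x) it equals 2·Shi(1) ≈ 2.1145, which the piecewise-linear scheme reproduces to a few decimal places.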

  17. Data Integration for Spatio-Temporal Patterns of Gene Expression of Zebrafish development: the GEMS database

    Directory of Open Access Journals (Sweden)

    Belmamoune Mounia

    2008-06-01

    Full Text Available The Gene Expression Management System (GEMS) is a database system for patterns of gene expression. These patterns result from systematic whole-mount fluorescent in situ hybridization studies on zebrafish embryos. GEMS is an integrative platform that addresses one of the important challenges of developmental biology: how to integrate genetic data that underpin morphological changes during embryogenesis. Our motivation for building this system was the need to organize and compare multiple patterns of gene expression at the tissue level. Integration with other developmental and biomolecular databases will further support our understanding of development. The GEMS operates in concert with a database containing a digital atlas of the zebrafish embryo; this digital atlas of zebrafish development was conceived prior to the expansion of the GEMS. The atlas contains 3D volume models of canonical stages of zebrafish development in which each volume element is annotated with an anatomical term. These terms are extracted from a formal anatomical ontology, i.e. the Developmental Anatomy Ontology of Zebrafish (DAOZ). In the GEMS, anatomical terms from this ontology, together with terms from the Gene Ontology (GO), are used to annotate patterns of gene expression, thereby providing mechanisms for integration and retrieval. The annotations are the glue for integration of patterns of gene expression in GEMS as well as in other biomolecular databases. On the one hand, zebrafish anatomy terminology allows gene expression data within GEMS to be integrated with phenotypical data in the 3D atlas of zebrafish development. On the other hand, GO terms extend GEMS expression pattern integration to a wide range of bioinformatics resources.

  18. Tight-coupling of groundwater flow and transport modelling engines with spatial databases and GIS technology: a new approach integrating Feflow and ArcGIS

    Directory of Open Access Journals (Sweden)

    Ezio Crestaz

    2012-09-01

    Full Text Available Implementation of groundwater flow and transport numerical models is generally a challenging, time-consuming and financially demanding task, entrusted to specialized modelers and consulting firms. At a later stage, within clearly stated limits of applicability, these models are often expected to be made available to less knowledgeable personnel to support the design and running of predictive simulations within environments more familiar than specialized simulation systems. GIS systems coupled with spatial databases appear to be ideal candidates to address this problem, due to their much wider diffusion and the availability of expertise. This paper discusses the issue from a tight-coupling architecture perspective, aimed at the integration of spatial databases, GIS and numerical simulation engines, addressing both observed and computed data management, retrieval and spatio-temporal analysis issues. Observed data can be migrated to the central database repository and then used to set up transient simulation conditions in the background, at run time, while limiting additional complexity and integrity-failure risks such as data duplication during data transfer through proprietary file formats. Similarly, simulation scenarios can be set up in a familiar GIS system and stored to the spatial database for later reference. As the numerical engine is tightly coupled with the GIS, simulations can be run within the environment and the results themselves saved to the database. Further tasks, such as spatio-temporal analysis (e.g. for post-calibration auditing purposes), cartography production and geovisualization, can then be addressed using traditional GIS tools. Benefits of such an approach include more effective data management practices, integration and availability of modeling facilities in a familiar environment, and streamlined spatial analysis processes and geovisualization for the non-modelers community. Major drawbacks include limited 3D and time-dependent support in

  19. Advanced Numerical Integration Techniques for High-Fidelity SDE Spacecraft Simulation

    Data.gov (United States)

    National Aeronautics and Space Administration — Classic numerical integration techniques, such as the ones at the heart of several NASA GSFC analysis tools, are known to work well for deterministic differential...

  20. Some applications of perturbation theory to numerical integration methods for the Schroedinger equation

    International Nuclear Information System (INIS)

    Killingbeck, J.

    1979-01-01

    By using the methods of perturbation theory it is possible to construct simple formulae for the numerical integration of the Schroedinger equation, and also to calculate expectation values solely by means of simple eigenvalue calculations. (Auth.)

  1. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using a purpose-built modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool for electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behaviour of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for the dynamic operational characteristics. This allows the integration of an existing drive system with the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  2. Numerical Procedure to Forecast the Tsunami Parameters from a Database of Pre-Simulated Seismic Unit Sources

    Science.gov (United States)

    Jiménez, César; Carbonel, Carlos; Rojas, Joel

    2018-04-01

    We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the front of the first wave and the maximum wave height, in real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green's functions) was obtained from numerical simulation of seismic unit sources (dimensions: 50 × 50 km²) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of synthetic waveforms corresponding to the several seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitude greater than 1 m: in the case of the Arica tide station an error (computed from the maximum height of the observed and simulated waveforms) of 3.5% was obtained, for the Callao station the error was 12%, and the largest error was at Chimbote with 53.5%; however, due to the low amplitude of the Chimbote wave (<1 m), the overestimated error in this case is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.
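
The forecasting step itself is linear: the synthetic waveform at a tide station is the weighted sum of precomputed unit-source waveforms. A minimal sketch with synthetic Gaussian "Green's functions" (the shapes, weights, and detection threshold are invented for illustration):

```python
import numpy as np

def forecast(green_functions, weights, t, threshold=0.05):
    """Superpose precomputed unit-source waveforms for one tide station and
    extract the forecast parameters: arrival time of the first wave front
    (first crossing of a small threshold) and maximum wave height."""
    wave = np.tensordot(weights, green_functions, axes=1)  # linear superposition
    above = np.nonzero(np.abs(wave) > threshold)[0]
    arrival = t[above[0]] if above.size else None
    return wave, arrival, wave.max()

t = np.linspace(0.0, 3600.0, 361)                 # one hour, 10 s sampling
g1 = 0.4 * np.exp(-(((t - 1200.0) / 120.0)) ** 2)  # synthetic unit-source waveforms
g2 = 0.3 * np.exp(-(((t - 1500.0) / 150.0)) ** 2)
wave, arrival, peak = forecast(np.vstack([g1, g2]), np.array([2.0, 1.0]), t)
```

Because the superposition is a single weighted sum over a precomputed database, the forecast is essentially instantaneous once the source geometry (and hence the weights) is estimated, which is what makes the approach suitable for early warning.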

  3. Numerical Integration of the Vlasov Equation of Two Colliding Beams

    CERN Document Server

    Zorzano-Mier, M P

    2000-01-01

    In a circular collider the motion of particles of one beam is strongly perturbed at the interaction points by the electro-magnetic field associated with the counter-rotating beam. For any two arbitrary initial particle distributions the time evolution of the two beams can be known by solving the coupled system of two Vlasov equations. This collective description is mandatory when the two beams have similar strengths, as in the case of LEP or LHC. The coherent modes excited by this beam-beam interaction can be a strong limitation for the operation of LHC. In this work, the coupled Vlasov equations of two colliding flat beams are solved numerically using a finite difference scheme. The results suggest that, for the collision of beams with equal tunes, the tune shift between the $\\sigma$- and $\\pi$- coherent dipole mode depends on the unperturbed tune $q$ because of the deformation that the so-called dynamic beta effect induces on the beam distribution. Only when the unperturbed tune $q\\rightarrow 0.25$ this tun...

  4. Numerical method for solving integral equations of neutron transport. II

    International Nuclear Information System (INIS)

    Loyalka, S.K.; Tsai, R.W.

    1975-01-01

    In a recent paper it was pointed out that the weakly singular integral equations of neutron transport can be quite conveniently solved by a method based on subtraction of singularity. This previous paper was devoted entirely to the consideration of simple one-dimensional isotropic-scattering and one-group problems. The present paper constitutes interesting extensions of the previous work in that in addition to a typical two-group anisotropic-scattering albedo problem in the slab geometry, the method is also applied to an isotropic-scattering problem in the x-y geometry. These results are compared with discrete S/sub N/ (ANISN or TWOTRAN-II) results, and for the problems considered here, the proposed method is found to be quite effective. Thus, the method appears to hold considerable potential for future applications. (auth)
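
The subtraction-of-singularity idea can be illustrated on a single weakly singular integral: subtract f(x) so the remaining integrand is bounded, and integrate the singular moment analytically. The logarithmic kernel and trapezoidal quadrature below are illustrative choices, not the transport kernels of the paper:

```python
import numpy as np

def singular_integral(f, x, n=4000):
    """Evaluate the integral of ln|x - y| * f(y) over y in [0, 1] by
    singularity subtraction:
        int K(x,y) f(y) dy = int K(x,y) [f(y) - f(x)] dy + f(x) int K(x,y) dy,
    where the regularized first integral is bounded (it vanishes like
    (y-x)*ln|y-x| at the pole) and the moment int ln|x-y| dy is analytic."""
    y = np.linspace(0.0, 1.0, n + 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = (f(y) - f(x)) * np.log(np.abs(x - y))
    integrand = np.where(np.isfinite(integrand), integrand, 0.0)  # limit at y = x
    h = 1.0 / n
    regular = h * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))
    xlnx = lambda t: t * np.log(t) if t > 0.0 else 0.0  # t*ln(t), limit 0 at t = 0
    moment = xlnx(x) + xlnx(1.0 - x) - 1.0              # analytic singular moment
    return regular + f(x) * moment
```

For f = 1 the regularized part vanishes and the result equals the analytic moment exactly; for f(y) = y at x = 0 the exact value is -1/4, recovered here to quadrature accuracy.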

  5. An Integrated Numerical Model of the Spray Forming Process

    DEFF Research Database (Denmark)

    Pryds, Nini; Hattel, Jesper; Pedersen, Trine Bjerre

    2002-01-01

    of the deposition model is accomplished using a 2D cylindrical heat flow model. This model is now coupled with an atomization model via a log-normal droplet size distribution. The coupling between the atomization and the deposition is accomplished by ensuring that the total droplet size distribution of the spray......In this paper, an integrated approach for modelling the entire spray forming process is presented. The basis for the analysis is a recently developed model which extents previous studies and includes the interaction between an array of droplets and the enveloping gas. The formulation...... is in fact the summation of 'local' droplet size distributions along the r-axis. A key parameter, which determines the yield and the shape of the deposit material, is the sticking efficiency. The sticking phenomenon is therefore incorporated into the deposition model. (C) 2002 Acta Materialia Inc. Published...

  6. High-precision numerical integration of equations in dynamics

    Science.gov (United States)

    Alesova, I. M.; Babadzanjanz, L. K.; Pototskaya, I. Yu.; Pupysheva, Yu. Yu.; Saakyan, A. T.

    2018-05-01

    An important requirement in solving differential equations in Dynamics, such as the equations of motion of celestial bodies and, in particular, of cosmic robotic systems, is high accuracy over large time intervals. One of the most effective tools for obtaining such solutions is the Taylor series method. In this connection, we note that it is very advantageous to reduce the given equations of Dynamics to systems with polynomial (in the unknowns) right-hand sides. This allows us to obtain effective algorithms for finding the Taylor coefficients, a priori error estimates at each step of integration, and an optimal choice of the order of the approximation used. In the paper, these questions are discussed and appropriate algorithms are considered.
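
For a polynomial right-hand side, the Taylor coefficients of the solution follow from a simple recursion. A sketch for the illustrative scalar ODE y' = y², where the Cauchy product gives (n+1)·a[n+1] = Σ a[k]·a[n-k] (the ODE, order, and step size are chosen for the example, not taken from the paper):

```python
def taylor_step(y, h, order=15):
    """One Taylor-series step for the polynomial ODE y' = y**2.
    The coefficients a[n] of the local Taylor expansion satisfy the
    Cauchy-product recursion (n+1)*a[n+1] = sum_k a[k]*a[n-k]."""
    a = [y]
    for n in range(order):
        a.append(sum(a[k] * a[n - k] for k in range(n + 1)) / (n + 1))
    # Evaluate the truncated series at t + h (Horner form).
    s = 0.0
    for c in reversed(a):
        s = s * h + c
    return s

# y' = y**2 with y(0) = 0.5 has the exact solution y(t) = 1 / (2 - t).
y = 0.5
for _ in range(10):
    y = taylor_step(y, 0.1)
# After t = 1 the exact value is 1 / (2 - 1) = 1.
```

The same pattern extends to systems: each polynomial right-hand side yields a convolution-style recursion for the coefficients, and the series remainder gives the a priori step-error estimate mentioned in the abstract.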

  7. Optimal stability polynomials for numerical integration of initial value problems

    KAUST Repository

    Ketcheson, David I.

    2013-01-08

    We consider the problem of finding optimally stable polynomial approximations to the exponential for application to one-step integration of initial value ordinary and partial differential equations. The objective is to find the largest stable step size and corresponding method for a given problem when the spectrum of the initial value problem is known. The problem is expressed in terms of a general least deviation feasibility problem. Its solution is obtained by a new fast, accurate, and robust algorithm based on convex optimization techniques. Global convergence of the algorithm is proven in the case that the order of approximation is one and in the case that the spectrum encloses a starlike region. Examples demonstrate the effectiveness of the proposed algorithm even when these conditions are not satisfied.
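
The feasibility question behind this, "is step size h stable for a given spectrum?", reduces to checking |R(hλ)| ≤ 1 at the spectrum's points; the largest stable h for a fixed method can then be bracketed by bisection. The paper's convex-optimization approach optimizes the polynomial itself; the sketch below only finds the step size for a fixed stability polynomial:

```python
import numpy as np

def max_stable_step(coeffs, spectrum, hi=100.0, iters=80):
    """Largest step size h with |R(h*lam)| <= 1 for every lam in the spectrum,
    assuming the stable set of step sizes is an interval [0, h*] (true for
    the examples below). R is given by its series coefficients [c0, c1, ...]."""
    spec = np.asarray(spectrum, dtype=complex)
    def stable(h):
        val = np.zeros_like(spec)
        for c in reversed(coeffs):        # Horner evaluation of R(h*lam)
            val = val * (h * spec) + c
        return bool(np.all(np.abs(val) <= 1.0 + 1e-13))
    lo = 0.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if stable(mid):
            lo = mid
        else:
            hi = mid
    return lo

euler = [1.0, 1.0]                        # R(z) = 1 + z
rk4 = [1.0, 1.0, 1/2, 1/6, 1/24]          # classical RK4 stability polynomial
h_euler = max_stable_step(euler, [-1.0])  # forward Euler on lam = -1: limit 2
h_rk4 = max_stable_step(rk4, [-1.0])      # RK4 on the negative real axis
```

In practice the spectrum would be sampled along the boundary of the relevant region (e.g. the negative real axis for parabolic problems), and the forward Euler and RK4 limits (2 and about 2.785 on the real axis) serve as sanity checks.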

  8. Studying Turbulence Using Numerical Simulation Databases. No. 7; Proceedings of the Summer Program

    Science.gov (United States)

    1998-01-01

    The Seventh Summer Program of the Center for Turbulence Research took place in the four-week period July 5 to July 31, 1998. This was the largest CTR Summer Program to date, involving thirty-six participants from the U.S. and nine other countries. Thirty-one Stanford and NASA-Ames staff members facilitated and contributed to most of the Summer projects. A new feature, and perhaps a preview of future programs, was that many of the projects were executed on non-NASA computers. These included supercomputers located in Europe as well as those operated by the Departments of Defense and Energy in the United States. In addition, several simulation programs developed by the visiting participants at their home institutions were used. Another new feature was the prevalence of lap-top personal computers, which were used by several participants to carry out some of the work that in the past was performed on desk-top workstations. We expect these trends to continue as computing power is enhanced and as more researchers (many of whom are CTR alumni) use numerical simulations to study turbulent flows. CTR's main role continues to be in providing a forum for the study of turbulence for engineering analysis and in facilitating intellectual exchange among the leading researchers in the field. Once again the combustion group was the largest. Turbulent combustion has enjoyed remarkable progress in using simulations to address increasingly complex and practically more relevant questions. The combustion group's studies included such challenging topics as fuel evaporation, soot chemistry, and thermonuclear reactions. The latter study was one of three projects related to the Department of Energy's ASCI Program (www.llnl.gov/asci); the other two (rocket propulsion and fire safety) were carried out in the turbulence modeling group. The flow control and acoustics group demonstrated a successful application of the so-called evolution algorithms which actually led to a previously unknown

  9. Integration of first-principles methods and crystallographic database searches for new ferroelectrics: Strategies and explorations

    International Nuclear Information System (INIS)

    Bennett, Joseph W.; Rabe, Karin M.

    2012-01-01

    In this concept paper, the development of strategies for the integration of first-principles methods with crystallographic database mining for the discovery and design of novel ferroelectric materials is discussed, drawing on the results and experience derived from exploratory investigations on three different systems: (1) the double perovskite Sr(Sb1/2Mn1/2)O3 as a candidate semiconducting ferroelectric; (2) polar derivatives of schafarzikite MSb2O4; and (3) ferroelectric semiconductors with formula M2P2(S,Se)6. A variety of avenues for further research and investigation are suggested, including automated structure-type classification, low-symmetry improper ferroelectrics, and high-throughput first-principles searches for additional representatives of structural families with desirable functional properties. - Graphical abstract: Integration of first-principles methods with crystallographic database mining, for the discovery and design of novel ferroelectric materials, could potentially lead to new classes of multifunctional materials. Highlights: ► Integration of first-principles methods and database mining. ► Minor structural families with desirable functional properties. ► Survey of polar entries in the Inorganic Crystal Structural Database.

  10. A global database of seismically and non-seismically triggered landslides for 2D/3D numerical modeling

    Science.gov (United States)

    Domej, Gisela; Bourdeau, Céline; Lenti, Luca; Pluta, Kacper

    2017-04-01

    Landsliding is a common phenomenon worldwide. Every year, landslides ranging in size from very small to enormous all too often cause loss of life and disastrous damage to infrastructure, property and the environment. One main reason for more frequent catastrophes is the growth of the Earth's population, which entails extending urbanization into areas at risk. Landslides are triggered by a variety and combination of causes, among which water and seismic activity appear to have the most serious consequences. In this regard, seismic shaking is of particular interest since topographic elevation as well as the landslide mass itself can trap waves and hence amplify incoming surface waves - a phenomenon known as "site effects". Research on landsliding due to seismic and non-seismic activity is extensive, and a broad spectrum of methods for modeling slope deformation is available. These methods range from pseudo-static and rigid-block models to numerical models. The majority are limited to 2D modeling, since more sophisticated approaches in 3D are still under development or calibration. However, the effects of lateral confinement and of the mechanical properties of the adjacent bedrock might be of great importance, because they may enhance the focusing of trapped waves in the landslide mass. A database was created to study 3D landslide geometries. It currently contains 277 distinct seismically and non-seismically triggered landslides spread around the globe, whose rupture bodies were measured in all available detail. A specific methodology was therefore developed to maintain predefined standards, to keep bias as low as possible, and to set up a query tool to explore the database. Besides geometry, additional information such as location, date, triggering factors, material, sliding mechanisms, event chronology, consequences, and related literature is stored for every case. The aim of the database is to enable

  11. Distributed Database Semantic Integration of Wireless Sensor Network to Access the Environmental Monitoring System

    Directory of Open Access Journals (Sweden)

    Ubaidillah Umar

    2018-06-01

    Full Text Available A wireless sensor network (WSN) works continuously to gather information from sensors that generate large volumes of data to be handled and processed by applications. Current efforts in sensor networks focus more on networking and on developing services for a variety of applications, and less on processing and integrating data from heterogeneous sensors. There is an increased need for information to become shareable across different sensors, database platforms, and applications, which is not easily implemented in traditional database systems. To solve the issue of these large amounts of data from different servers and database platforms (including sensor data), a semantic sensor web service platform is needed to enable a machine to extract meaningful information from the sensors' raw data. This additionally helps to minimize and simplify data processing and to deduce new information from existing data. This paper implements a semantic web data platform (SWDP) to manage the distribution of sensor data based on a semantic database system. The SWDP uses sensors for temperature, humidity, carbon monoxide, carbon dioxide, luminosity, and noise. The system uses the Sesame semantic web database for data processing and a WSN to distribute, minimize, and simplify information processing. The sensor nodes are distributed in different places to collect sensor data. The SWDP generates context information in the form of a resource description framework. The experimental results demonstrate that the SWDP is more efficient than a traditional database system in terms of memory usage and processing time.
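
    As a rough illustration of the resource-description-framework context information mentioned above, the sketch below stores sensor readings as subject-predicate-object triples in plain Python and answers a simple pattern query. The node names and predicate vocabulary are invented for illustration and are not the SWDP schema; a real deployment would use an RDF store such as Sesame.

```python
# Minimal in-memory triple store for sensor context information.
triples = set()

def add_reading(node, kind, value, unit):
    """Record one observation as four RDF-style triples (names are assumed)."""
    triples.add((node, "rdf:type", "ssn:Observation"))
    triples.add((node, "ssn:observedProperty", kind))
    triples.add((node, "ssn:hasValue", value))
    triples.add((node, "ssn:unit", unit))

add_reading("obs1", "temperature", 23.4, "Cel")
add_reading("obs2", "humidity", 61.0, "%")

def match(s=None, p=None, o=None):
    """Triple-pattern query: None acts as a wildcard."""
    return [(S, P, O) for (S, P, O) in triples
            if (s is None or S == s) and (p is None or P == p)
            and (o is None or O == o)]

# Which observations measure temperature, and what are their values?
nodes = [S for (S, _, _) in match(p="ssn:observedProperty", o="temperature")]
vals = [O for node in nodes for (_, _, O) in match(s=node, p="ssn:hasValue")]
print(vals)   # [23.4]
```

The pattern-matching query stands in for SPARQL; the point is that heterogeneous readings share one uniform triple representation.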

  12. Using ontology databases for scalable query answering, inconsistency detection, and data integration

    Science.gov (United States)

    Dou, Dejing

    2011-01-01

    An ontology database is a basic relational database management system that models an ontology plus its instances. To reason over the transitive closure of instances in the subsumption hierarchy, for example, an ontology database can either unfold views at query time or propagate assertions using triggers at load time. In this paper, we use existing benchmarks to evaluate our method—using triggers—and we demonstrate that by forward computing inferences, we not only improve query time, but the improvement appears to cost only more space (not time). However, we go on to show that the true penalties were simply opaque to the benchmark, i.e., the benchmark inadequately captures load-time costs. We have applied our methods to two case studies in biomedicine, using ontologies and data from genetics and neuroscience to illustrate two important applications: first, ontology databases answer ontology-based queries effectively; second, using triggers, ontology databases detect instance-based inconsistencies—something not possible using views. Finally, we demonstrate how to extend our methods to perform data integration across multiple, distributed ontology databases. PMID:22163378
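
    The trigger-based, forward-computed inference described above can be sketched in miniature with SQLite: class subsumptions are materialized into a transitive closure, and a load-time trigger propagates each instance assertion up the hierarchy so that queries need no view unfolding. The table and class names are illustrative, not the authors' schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE isa(sub TEXT, sup TEXT);                 -- asserted subsumptions
CREATE TABLE closure(sub TEXT, sup TEXT);             -- transitive closure
CREATE TABLE instance(id TEXT, cls TEXT, UNIQUE(id, cls));

-- Load-time trigger: when an instance is asserted, forward-compute its
-- membership in every superclass, so queries read plain rows.
CREATE TRIGGER propagate AFTER INSERT ON instance
BEGIN
  INSERT OR IGNORE INTO instance(id, cls)
  SELECT NEW.id, sup FROM closure WHERE sub = NEW.cls;
END;
""")
con.executemany("INSERT INTO isa VALUES (?, ?)",
                [("Neuron", "Cell"), ("Cell", "Thing")])
# Materialize the subsumption closure with a recursive CTE.
con.execute("""
INSERT INTO closure
WITH RECURSIVE tc(sub, sup) AS (
  SELECT sub, sup FROM isa
  UNION
  SELECT tc.sub, isa.sup FROM tc JOIN isa ON tc.sup = isa.sub
) SELECT sub, sup FROM tc""")

con.execute("INSERT INTO instance VALUES ('n1', 'Neuron')")
rows = sorted(c for (c,) in con.execute(
    "SELECT cls FROM instance WHERE id = 'n1'"))
print(rows)   # ['Cell', 'Neuron', 'Thing'] -- inferred rows exist at load time
```

A query for all instances of `Cell` is now a plain `SELECT`, which is the query-time benefit the paper measures; the cost is the extra storage for the propagated rows.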

  13. VaProS: a database-integration approach for protein/genome information retrieval

    KAUST Repository

    Gojobori, Takashi; Ikeo, Kazuho; Katayama, Yukie; Kawabata, Takeshi; Kinjo, Akira R.; Kinoshita, Kengo; Kwon, Yeondae; Migita, Ohsuke; Mizutani, Hisashi; Muraoka, Masafumi; Nagata, Koji; Omori, Satoshi; Sugawara, Hideaki; Yamada, Daichi; Yura, Kei

    2016-01-01

    Life science research now heavily relies on all sorts of databases for genome sequences, transcription, protein three-dimensional (3D) structures, protein–protein interactions, phenotypes and so forth. The knowledge accumulated by all the omics research is so vast that a computer-aided search of data is now a prerequisite for starting a new study. In addition, a combinatory search throughout these databases has a chance to extract new ideas and new hypotheses that can be examined by wet-lab experiments. By virtually integrating the related databases on the Internet, we have built a new web application that facilitates life science researchers for retrieving experts’ knowledge stored in the databases and for building a new hypothesis of the research target. This web application, named VaProS, puts stress on the interconnection between the functional information of genome sequences and protein 3D structures, such as structural effect of the gene mutation. In this manuscript, we present the notion of VaProS, the databases and tools that can be accessed without any knowledge of database locations and data formats, and the power of search exemplified in quest of the molecular mechanisms of lysosomal storage disease. VaProS can be freely accessed at http://p4d-info.nig.ac.jp/vapros/.

  15. MAGIC Database and Interfaces: An Integrated Package for Gene Discovery and Expression

    Directory of Open Access Journals (Sweden)

    Lee H. Pratt

    2006-03-01

    Full Text Available The rapidly increasing rate at which biological data is being produced requires a corresponding growth in relational databases and associated tools that can help laboratories contend with that data. With this need in mind, we describe here a Modular Approach to a Genomic, Integrated and Comprehensive (MAGIC) Database. This Oracle 9i database derives from an initial focus in our laboratory on gene discovery via production and analysis of expressed sequence tags (ESTs), and subsequently on gene expression as assessed by both EST clustering and microarrays. The MAGIC Gene Discovery portion of the database focuses on information derived from DNA sequences and on its biological relevance. In addition to MAGIC SEQ-LIMS, which is designed to support activities in the laboratory, it contains several additional subschemas. The latter include MAGIC Admin for database administration, MAGIC Sequence for sequence processing as well as sequence and clone attributes, MAGIC Cluster for the results of EST clustering, MAGIC Polymorphism in support of microsatellite and single-nucleotide-polymorphism discovery, and MAGIC Annotation for electronic annotation by BLAST and BLAT. The MAGIC Microarray portion is a MIAME-compliant database with two components at present. These are MAGIC Array-LIMS, which makes possible remote entry of all information into the database, and MAGIC Array Analysis, which provides data mining and visualization. Because all aspects of interaction with the MAGIC Database are via a web browser, it is ideally suited not only for individual research laboratories but also for core facilities that serve clients at any distance.

  16. Numerical Treatment of Fixed Point Applied to the Nonlinear Fredholm Integral Equation

    Directory of Open Access Journals (Sweden)

    Berenguer MI

    2009-01-01

    Full Text Available The authors present a method of numerical approximation of the fixed point of an operator, specifically the integral operator associated with a nonlinear Fredholm integral equation, that makes strong use of the properties of a classical Schauder basis in the underlying Banach space.
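
    The fixed-point idea can be illustrated with a plain Picard iteration on a discretized nonlinear Fredholm equation x(t) = f(t) + ∫ K(t,s) x(s)^2 ds. The kernel, nonlinearity and free term below are invented for illustration (chosen so the integral operator is contractive), and trapezoidal quadrature stands in for the Schauder-basis projection used in the paper.

```python
import math

n = 101                          # quadrature nodes on [0, 1]
h = 1.0 / (n - 1)
t = [i * h for i in range(n)]

def K(ti, s):                    # kernel (assumed, chosen to be contractive)
    return 0.25 * ti * s

def g(x):                        # nonlinearity
    return x * x

f = [math.sin(ti) for ti in t]   # free term (assumed)

x = f[:]                         # initial guess: the free term itself
err = 1.0
for _ in range(60):              # Picard (fixed-point) iteration
    def integral(ti):
        vals = [K(ti, t[j]) * g(x[j]) for j in range(n)]
        # composite trapezoidal rule on [0, 1]
        return h * (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1])
    x_new = [f[i] + integral(t[i]) for i in range(n)]
    err = max(abs(a - b) for a, b in zip(x_new, x))
    x = x_new
    if err < 1e-12:
        break
print(err < 1e-12)   # the iterates converge to the discrete fixed point
```

Because the operator is a contraction here, successive substitutions converge geometrically; the paper's contribution is approximating the operator itself through a Schauder basis rather than simple quadrature.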

  17. Integrated database for identifying candidate genes for Aspergillus flavus resistance in maize.

    Science.gov (United States)

    Kelley, Rowena Y; Gresham, Cathy; Harper, Jonathan; Bridges, Susan M; Warburton, Marilyn L; Hawkins, Leigh K; Pechanova, Olga; Peethambaran, Bela; Pechan, Tibor; Luthe, Dawn S; Mylroie, J E; Ankala, Arunkanth; Ozkan, Seval; Henry, W B; Williams, W P

    2010-10-07

    Aspergillus flavus Link:Fr, an opportunistic fungus that produces aflatoxin, is pathogenic to maize and other oilseed crops. Aflatoxin is a potent carcinogen, and its presence markedly reduces the value of grain. Understanding and enhancing host resistance to A. flavus infection and/or subsequent aflatoxin accumulation is generally considered an efficient means of reducing grain losses to aflatoxin. Different proteomic, genomic and genetic studies of maize (Zea mays L.) have generated large data sets with the goal of identifying genes responsible for conferring resistance to A. flavus, or aflatoxin. In order to maximize the usage of different data sets in new studies, including association mapping, we have constructed a relational database with web interface integrating the results of gene expression, proteomic (both gel-based and shotgun), Quantitative Trait Loci (QTL) genetic mapping studies, and sequence data from the literature to facilitate selection of candidate genes for continued investigation. The Corn Fungal Resistance Associated Sequences Database (CFRAS-DB) (http://agbase.msstate.edu/) was created with the main goal of identifying genes important to aflatoxin resistance. CFRAS-DB is implemented using MySQL as the relational database management system running on a Linux server, using an Apache web server, and Perl CGI scripts as the web interface. The database and the associated web-based interface allow researchers to examine many lines of evidence (e.g. microarray, proteomics, QTL studies, SNP data) to assess the potential role of a gene or group of genes in the response of different maize lines to A. flavus infection and subsequent production of aflatoxin by the fungus. CFRAS-DB provides the first opportunity to integrate data pertaining to the problem of A. flavus and aflatoxin resistance in maize in one resource and to support queries across different datasets. 
The web-based interface gives researchers different query options for mining the database.

  18. Brassica database (BRAD) version 2.0: integrating and mining Brassicaceae species genomic resources.

    Science.gov (United States)

    Wang, Xiaobo; Wu, Jian; Liang, Jianli; Cheng, Feng; Wang, Xiaowu

    2015-01-01

    The Brassica database (BRAD) was built initially to help users apply Brassica rapa and Arabidopsis thaliana genomic data efficiently in their research. However, many Brassicaceae genomes have been sequenced and released since its construction. These genomes are rich resources for comparative genomics, gene annotation and functional evolutionary studies of Brassica crops. Therefore, we have updated BRAD to version 2.0 (V2.0). In BRAD V2.0, 11 more Brassicaceae genomes have been integrated into the database, namely those of Arabidopsis lyrata, Aethionema arabicum, Brassica oleracea, Brassica napus, Camelina sativa, Capsella rubella, Leavenworthia alabamica, Sisymbrium irio and the three extremophiles Schrenkiella parvula, Thellungiella halophila and Thellungiella salsuginea. BRAD V2.0 provides plots of syntenic genomic fragments between pairs of Brassicaceae species, from the level of chromosomes to genomic blocks. The Generic Synteny Browser (GBrowse_syn), a module of the Genome Browser (GBrowse), is used to show syntenic relationships between multiple genomes. Search functions for retrieving syntenic and non-syntenic orthologs, as well as their annotation and sequences, are also provided. Furthermore, genome and annotation information have been imported into GBrowse so that all functional elements can be visualized in one frame. We plan to continually update BRAD by integrating more Brassicaceae genomes into the database. Database URL: http://brassicadb.org/brad/. © The Author(s) 2015. Published by Oxford University Press.

  19. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.

  20. dbPAF: an integrative database of protein phosphorylation in animals and fungi.

    Science.gov (United States)

    Ullah, Shahid; Lin, Shaofeng; Xu, Yang; Deng, Wankun; Ma, Lili; Zhang, Ying; Liu, Zexian; Xue, Yu

    2016-03-24

    Protein phosphorylation is one of the most important post-translational modifications (PTMs) and regulates a broad spectrum of biological processes. Recent progress in phosphoproteomic identification has generated a flood of phosphorylation sites, making their integration an urgent need. In this work, we developed dbPAF, a curated database containing known phosphorylation sites in H. sapiens, M. musculus, R. norvegicus, D. melanogaster, C. elegans, S. pombe and S. cerevisiae. From the scientific literature and public databases, we collected and integrated a total of 54,148 phosphoproteins with 483,001 phosphorylation sites. Multiple options are provided for accessing the data, while original references and other annotations are also present for each phosphoprotein. Based on the new data set, we computationally detected significantly over-represented sequence motifs around phosphorylation sites, predicted potential kinases responsible for the modification of the collected phospho-sites, and evolutionarily analyzed phosphorylation conservation states across different species. Besides being largely consistent with previous reports, our results also propose new features of phospho-regulation. Taken together, our database can be useful for further analyses of protein phosphorylation in human and other model organisms. The dbPAF database was implemented in PHP + MySQL and is freely available at http://dbpaf.biocuckoo.org.
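
    A minimal version of the motif analysis mentioned above can be sketched by cutting a fixed window around each phospho-site and counting residues at each flanking position. The toy sequences and site positions below are invented, and a real analysis would score enrichment against a proteome background model.

```python
from collections import Counter

# Toy phosphoprotein records: (sequence, 0-based phospho-site positions).
# Sequences and sites are invented, not taken from dbPAF.
records = [
    ("MKRSPSSAGGRRKSPT", [3, 13]),
    ("AARSPQKSPLKRRSPV", [3, 7, 13]),
]

WIN = 7  # residues kept on each side of the modified residue

def window(seq, pos, win=WIN):
    """Return the +/-win window around pos, padded with '-' at the ends."""
    left = seq[max(0, pos - win):pos].rjust(win, "-")
    right = seq[pos + 1:pos + 1 + win].ljust(win, "-")
    return left + seq[pos] + right

windows = [window(seq, p) for seq, sites in records for p in sites]
assert all(w[WIN] in "STY" for w in windows)   # center is the phospho-residue

# Count residues at the +1 position; an excess of proline suggests the
# classical proline-directed S/T-P motif.
plus1 = Counter(w[WIN + 1] for w in windows)
print(plus1.most_common(1))   # [('P', 5)]
```

Aligning all sites into fixed-width windows is the standard preprocessing step before motif scoring or kinase prediction.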

  1. Integration of the ATLAS tag database with data management and analysis components

    International Nuclear Information System (INIS)

    Cranshaw, J; Malon, D; Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C

    2008-01-01

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted

  2. Integration of the ATLAS tag database with data management and analysis components

    Energy Technology Data Exchange (ETDEWEB)

    Cranshaw, J; Malon, D [Argonne National Laboratory, Argonne, IL 60439 (United States); Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C [Department of Physics and Astronomy, University of Glasgow, Glasgow, G12 8QQ, Scotland (United Kingdom)], E-mail: c.nicholson@physics.gla.ac.uk

    2008-07-15

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted.

  3. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  4. Integrating the DLD dosimetry system into the Almaraz NPP Corporative Database

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    1996-01-01

    The article discusses the experience acquired during the integration of a new MGP Instruments DLD dosimetry system into the Almaraz NPP corporative database and general communications network, following a client-server philosophy and taking into account the computer standards of the plant. The most important results obtained are: integration of DLD dosimetry information into the corporative database, permitting the use of new applications; sharing of existing personnel information with the DLD dosimetry application, thereby avoiding the redundant work of introducing data and improving the quality of the information; easier maintenance, both software and hardware, of the DLD system; maximum exploitation, from the computing point of view, of the initial investment; and adaptation of the application to the applicable legislation. (Author)

  5. Computing the demagnetizing tensor for finite difference micromagnetic simulations via numerical integration

    International Nuclear Information System (INIS)

    Chernyshenko, Dmitri; Fangohr, Hans

    2015-01-01

    In the finite difference method which is commonly used in computational micromagnetics, the demagnetizing field is usually computed as a convolution of the magnetization vector field with the demagnetizing tensor that describes the magnetostatic field of a cuboidal cell with constant magnetization. An analytical expression for the demagnetizing tensor is available, however at distances far from the cuboidal cell, the numerical evaluation of the analytical expression can be very inaccurate. Due to this large-distance inaccuracy numerical packages such as OOMMF compute the demagnetizing tensor using the explicit formula at distances close to the originating cell, but at distances far from the originating cell a formula based on an asymptotic expansion has to be used. In this work, we describe a method to calculate the demagnetizing field by numerical evaluation of the multidimensional integral in the demagnetizing tensor terms using a sparse grid integration scheme. This method improves the accuracy of computation at intermediate distances from the origin. We compute and report the accuracy of (i) the numerical evaluation of the exact tensor expression which is best for short distances, (ii) the asymptotic expansion best suited for large distances, and (iii) the new method based on numerical integration, which is superior to methods (i) and (ii) for intermediate distances. For all three methods, we show the measurements of accuracy and execution time as a function of distance, for calculations using single precision (4-byte) and double precision (8-byte) floating point arithmetic. We make recommendations for the choice of scheme order and integrating coefficients for the numerical integration method (iii). - Highlights: • We study the accuracy of demagnetization in finite difference micromagnetics. • We introduce a new sparse integration method to compute the tensor more accurately. • Newell, sparse integration and asymptotic method are compared for all ranges
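
    The trade-off described above (exact tensor expression in the near field, asymptotic expansion far away, quadrature in between) can be illustrated with a toy version: midpoint-rule integration of a Newtonian kernel over a unit cell, compared with the point-source asymptote. The kernel and grid size are illustrative only; the actual demagnetizing tensor involves second derivatives of this potential.

```python
import math

def cell_potential(r0, n=8):
    """Midpoint-rule quadrature of 1/|r - r0| over the unit cube centered
    at the origin, a stand-in for one demagnetizing-tensor integral."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                x = -0.5 + (i + 0.5) * h
                y = -0.5 + (j + 0.5) * h
                z = -0.5 + (k + 0.5) * h
                total += 1.0 / math.dist((x, y, z), r0)
    return total * h ** 3

# Compare against the large-distance (point-source) asymptote 1/R:
# the discrepancy shrinks rapidly as the target cell moves away.
errs = {R: abs(cell_potential((R, 0.0, 0.0)) - 1.0 / R)
        for R in (2.0, 5.0, 20.0)}
for R, e in sorted(errs.items()):
    print(R, e)
```

At short range the cell's finite extent matters and quadrature (or the exact formula) is needed; far away the asymptote suffices, which is exactly the regime split the paper quantifies.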

  6. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    Science.gov (United States)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  7. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii

    OpenAIRE

    May, P.; Christian, J.O.; Kempa, S.; Walther, D.

    2009-01-01

    Abstract Background: The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. Results: In the fra...

  8. CyanOmics: an integrated database of omics for the model cyanobacterium Synechococcus sp. PCC 7002

    OpenAIRE

    Yang, Yaohua; Feng, Jie; Li, Tao; Ge, Feng; Zhao, Jindong

    2015-01-01

    Cyanobacteria are an important group of organisms that carry out oxygenic photosynthesis and play vital roles in both the carbon and nitrogen cycles of the Earth. The annotated genome of Synechococcus sp. PCC 7002, as an ideal model cyanobacterium, is available. A series of transcriptomic and proteomic studies of Synechococcus sp. PCC 7002 cells grown under different conditions have been reported. However, no database of such integrated omics studies has been constructed. Here we present Cyan...

  9. Automated granularity to integrate digital information: the "Antarctic Treaty Searchable Database" case study

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2006-06-01

    Full Text Available Access to information is necessary, but not sufficient in our digital era. The challenge is to objectively integrate digital resources based on user-defined objectives for the purpose of discovering information relationships that facilitate interpretations and decision making. The Antarctic Treaty Searchable Database (http://aspire.nvi.net), which is in its sixth edition, provides an example of digital integration based on the automated generation of information granules that can be dynamically combined to reveal objective relationships within and between digital information resources. This case study further demonstrates that automated granularity and dynamic integration can be accomplished simply by utilizing the inherent structure of the digital information resources. Such information integration is relevant to library and archival programs that require long-term preservation of authentic digital resources.

  10. Integrating Environmental and Human Health Databases in the Great Lakes Basin: Themes, Challenges and Future Directions

    Directory of Open Access Journals (Sweden)

    Kate L. Bassil

    2015-03-01

    Full Text Available Many government, academic and research institutions collect environmental data that are relevant to understanding the relationship between environmental exposures and human health. Integrating these data with health outcome data presents new challenges that are important to consider to improve our effective use of environmental health information. Our objective was to identify the common themes related to the integration of environmental and health data, and suggest ways to address the challenges and make progress toward more effective use of data already collected, to further our understanding of environmental health associations in the Great Lakes region. Environmental and human health databases were identified and reviewed using literature searches and a series of one-on-one and group expert consultations. Databases identified were predominantly environmental stressors databases, with fewer found for health outcomes and human exposure. Nine themes or factors that impact integration were identified: data availability, accessibility, harmonization, stakeholder collaboration, policy and strategic alignment, resource adequacy, environmental health indicators, and data exchange networks. The use and cost effectiveness of data currently collected could be improved by strategic changes to data collection and access systems to provide better opportunities to identify and study environmental exposures that may impact human health.

  11. Analysis of thermal-plastic response of shells of revolution by numerical integration

    International Nuclear Information System (INIS)

    Leonard, J.W.

    1975-01-01

    An economic technique for the numerical analysis of the elasto-plastic behaviour of shells of revolution would be of considerable value in the nuclear reactor industry. A numerical method based on the numerical integration of the governing shell equations has been shown, for elastic cases, to be more efficient than the finite element method when applied to shells of revolution. In the numerical integration method, the governing differential equations of motion are converted into a set of initial-value problems. Each initial-value problem is integrated numerically between meridional boundary points and recombined so as to satisfy boundary conditions. For large-deflection elasto-plastic behaviour, the equations are nonlinear and, hence, are recombined in an iterative manner using the Newton-Raphson procedure. Suppression techniques are incorporated in order to eliminate extraneous solutions within the numerical integration procedure. The Reissner-Meissner shell theory for shells of revolution is adopted to account for large deflection and higher-order rotation effects. The computer modelling of the equations is quite general in that specific shell segment geometries, e.g. cylindrical, spherical, toroidal, conical segments, and any combinations thereof can be handled easily. (Auth.)
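    The scheme described — converting the boundary-value problem into initial-value problems integrated between boundary points, with the unknown starting values adjusted by Newton-Raphson — can be sketched on a scalar model problem. The shooting method below uses the toy equation y'' = −y as an illustrative stand-in for the shell equations:

```python
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for the first-order system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def integrate(f, y0, t0, t1, n=200):
    h = (t1 - t0) / n
    y = list(y0)
    for i in range(n):
        y = rk4_step(f, t0 + i*h, y, h)
    return y

# Toy boundary-value problem standing in for the shell equations:
# y'' = -y with y(0) = 0, y(1) = 1, written as a first-order system.
f = lambda t, y: [y[1], -y[0]]

def residual(s):
    """Integrate the initial-value problem with guessed slope s; boundary mismatch at t = 1."""
    return integrate(f, [0.0, s], 0.0, 1.0)[0] - 1.0

# Newton-Raphson on the unknown initial condition (derivative by finite difference).
s = 1.0
for _ in range(20):
    r = residual(s)
    if abs(r) < 1e-12:
        break
    s -= r / ((residual(s + 1e-7) - r) / 1e-7)

print(s)   # the exact initial slope is 1/sin(1) ≈ 1.18840
```

    For the nonlinear elasto-plastic case described in the abstract, the residual becomes a vector of boundary mismatches and the scalar Newton update becomes a Jacobian solve, but the structure of the iteration is the same.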

  12. Numerical

    Directory of Open Access Journals (Sweden)

    M. Boumaza

    2015-07-01

    Full Text Available Transient convection heat transfer is of fundamental interest in many industrial and environmental situations, as well as in electronic devices and security of energy systems. Transient fluid flow problems are among the more difficult to analyze and yet are very often encountered in modern day technology. The main objective of this research project is to carry out a theoretical and numerical analysis of transient convective heat transfer in vertical flows, when the thermal field is due to different kinds of variation, in time and space of some boundary conditions, such as wall temperature or wall heat flux. This is achieved by the development of a mathematical model and its resolution by suitable numerical methods, as well as performing various sensitivity analyses. These objectives are achieved through a theoretical investigation of the effects of wall and fluid axial conduction, physical properties and heat capacity of the pipe wall on the transient downward mixed convection in a circular duct experiencing a sudden change in the applied heat flux on the outside surface of a central zone.

  13. Multi-symplectic integrators: numerical schemes for Hamiltonian PDEs that conserve symplecticity

    Science.gov (United States)

    Bridges, Thomas J.; Reich, Sebastian

    2001-06-01

    The symplectic numerical integration of finite-dimensional Hamiltonian systems is a well established subject and has led to a deeper understanding of existing methods as well as to the development of new very efficient and accurate schemes, e.g., for rigid body, constrained, and molecular dynamics. The numerical integration of infinite-dimensional Hamiltonian systems or Hamiltonian PDEs is much less explored. In this Letter, we suggest a new theoretical framework for generalizing symplectic numerical integrators for ODEs to Hamiltonian PDEs in R2: time plus one space dimension. The central idea is that symplecticity for Hamiltonian PDEs is directional: the symplectic structure of the PDE is decomposed into distinct components representing space and time independently. In this setting PDE integrators can be constructed by concatenating uni-directional ODE symplectic integrators. This suggests a natural definition of multi-symplectic integrator as a discretization that conserves a discrete version of the conservation of symplecticity for Hamiltonian PDEs. We show that this approach leads to a general framework for geometric numerical schemes for Hamiltonian PDEs, which have remarkable energy and momentum conservation properties. Generalizations, including development of higher-order methods, application to the Euler equations in fluid mechanics, application to perturbed systems, and extension to more than one space dimension are also discussed.
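    For readers new to the finite-dimensional case being generalized, the Störmer-Verlet (leapfrog) scheme illustrates the conservation behaviour characteristic of symplectic integrators; the harmonic oscillator below is an arbitrary test case, not an example from the Letter:

```python
import math

def verlet(q, p, h, n, dVdq):
    """Störmer-Verlet (leapfrog): a symplectic integrator for H(q, p) = p²/2 + V(q)."""
    for _ in range(n):
        p -= 0.5 * h * dVdq(q)   # half kick
        q += h * p               # drift
        p -= 0.5 * h * dVdq(q)   # half kick
    return q, p

# Harmonic oscillator V(q) = q²/2: the energy error stays bounded for all time
# instead of drifting, the hallmark of symplectic integration.
h, E0 = 0.05, 0.5
q, p = 1.0, 0.0
max_err = 0.0
for _ in range(20000):                       # 1000 time units, one step at a time
    q, p = verlet(q, p, h, 1, lambda x: x)
    max_err = max(max_err, abs(0.5*p*p + 0.5*q*q - E0))

print(max_err)   # O(h²) and bounded, with no secular growth
```

    Multi-symplectic integrators apply schemes of exactly this one-directional kind separately in space and time, which is what yields the discrete conservation law discussed above.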

  14. Integrated data acquisition, storage, retrieval and processing using the COMPASS DataBase (CDB)

    Energy Technology Data Exchange (ETDEWEB)

    Urban, J., E-mail: urban@ipp.cas.cz [Institute of Plasma Physics AS CR, v.v.i., Za Slovankou 3, 182 00 Praha 8 (Czech Republic); Pipek, J.; Hron, M. [Institute of Plasma Physics AS CR, v.v.i., Za Slovankou 3, 182 00 Praha 8 (Czech Republic); Janky, F.; Papřok, R.; Peterka, M. [Institute of Plasma Physics AS CR, v.v.i., Za Slovankou 3, 182 00 Praha 8 (Czech Republic); Department of Surface and Plasma Science, Faculty of Mathematics and Physics, Charles University in Prague, V Holešovičkách 2, 180 00 Praha 8 (Czech Republic); Duarte, A.S. [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade Técnica de Lisboa, 1049-001 Lisboa (Portugal)

    2014-05-15

    Highlights: • CDB is used as a new data storage solution for the COMPASS tokamak. • The software is light weight, open, fast and easily extensible and scalable. • CDB seamlessly integrates with any data acquisition system. • Rich metadata are stored for physics signals. • Data can be processed automatically, based on dependence rules. - Abstract: We present a complex data handling system for the COMPASS tokamak, operated by IPP ASCR Prague, Czech Republic [1]. The system, called CDB (COMPASS DataBase), integrates different data sources as an assortment of data acquisition hardware and software from different vendors is used. Based on widely available open source technologies wherever possible, CDB is vendor and platform independent and it can be easily scaled and distributed. The data is directly stored and retrieved using a standard NAS (Network Attached Storage), hence independent of the particular technology; the description of the data (the metadata) is recorded in a relational database. Database structure is general and enables the inclusion of multi-dimensional data signals in multiple revisions (no data is overwritten). This design is inherently distributed as the work is off-loaded to the clients. Both NAS and database can be implemented and optimized for fast local access as well as secure remote access. CDB is implemented in Python language; bindings for Java, C/C++, IDL and Matlab are provided. Independent data acquisitions systems as well as nodes managed by FireSignal [2] are all integrated using CDB. An automated data post-processing server is a part of CDB. Based on dependency rules, the server executes, in parallel if possible, prescribed post-processing tasks.
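    The division of labour described above — bulk signal data as files on the NAS, descriptive metadata in a relational database — can be sketched generically. SQLite and all table and column names below are illustrative assumptions, not the actual CDB schema:

```python
import sqlite3

# Illustrative schema: bulk data lives as files on the NAS; the relational
# database stores only descriptive metadata and the file reference.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE generic_signals (
    signal_id   INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    units       TEXT
);
CREATE TABLE data_signals (
    signal_id   INTEGER REFERENCES generic_signals(signal_id),
    record_no   INTEGER,            -- shot/record number
    revision    INTEGER,            -- no data is overwritten: revisions accumulate
    file_path   TEXT NOT NULL,      -- location of the raw data on the NAS
    sample_rate REAL,
    PRIMARY KEY (signal_id, record_no, revision)
);
""")
con.execute("INSERT INTO generic_signals VALUES (1, 'plasma_current', 'A')")
con.execute("INSERT INTO data_signals VALUES (1, 4073, 1, '/nas/4073/ip_r1.h5', 2e6)")
con.execute("INSERT INTO data_signals VALUES (1, 4073, 2, '/nas/4073/ip_r2.h5', 2e6)")

# A client asks the metadata store for the latest revision, and only then
# opens the referenced file on the NAS -- the work is off-loaded to the client.
row = con.execute("""
    SELECT file_path FROM data_signals
    WHERE signal_id = 1 AND record_no = 4073
    ORDER BY revision DESC LIMIT 1
""").fetchone()
print(row[0])
```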

  15. Recovery Act: An Integrated Experimental and Numerical Study: Developing a Reaction Transport Model that Couples Chemical Reactions of Mineral Dissolution/Precipitation with Spatial and Temporal Flow Variations.

    Energy Technology Data Exchange (ETDEWEB)

    Saar, Martin O. [ETH Zurich (Switzerland); Univ. of Minnesota, Minneapolis, MN (United States); Seyfried, Jr., William E. [Univ. of Minnesota, Minneapolis, MN (United States); Longmire, Ellen K. [Univ. of Minnesota, Minneapolis, MN (United States)

    2016-06-24

    A total of 12 publications and 23 abstracts were produced as a result of this study. In particular, the compilation of a thermodynamic database utilizing consistent, current thermodynamic data is a major step toward accurately modeling multi-phase fluid interactions with solids. Existing databases designed for aqueous fluids did not mesh well with existing solid phase databases. Addition of a second liquid phase (CO2) magnifies the inconsistencies between aqueous and solid thermodynamic databases. Overall, the combination of high temperature and pressure lab studies (task 1), using a purpose built apparatus, and solid characterization (task 2), using XRCT and more developed technologies, allowed observation of dissolution and precipitation processes under CO2 reservoir conditions. These observations were combined with results from PIV experiments on multi-phase fluids (task 3) in typical flow path geometries. The results of tasks 1, 2, and 3 were compiled and integrated into numerical models utilizing Lattice-Boltzmann simulations (task 4) to realistically model the physical processes and were ultimately folded into the TOUGH2 code for reservoir scale modeling (task 5). Compilation of the thermodynamic database assisted comparisons to PIV experiments (task 3) and greatly improved the Lattice Boltzmann (task 4) and TOUGH2 simulations (task 5). PIV (task 3) and the experimental apparatus (task 1) have identified problem areas in the TOUGHREACT code. Additional lab experiments and coding work have been integrated into an improved numerical modeling code.

  16. IntPath--an integrated pathway gene relationship database for model organisms and important pathogens.

    Science.gov (United States)

    Zhou, Hufeng; Jin, Jingjing; Zhang, Haojun; Yi, Bo; Wozniak, Michal; Wong, Limsoon

    2012-01-01

    Pathway data are important for understanding the relationship between genes, proteins and many other molecules in living organisms. Pathway gene relationships are crucial information for guidance, prediction, reference and assessment in biochemistry, computational biology, and medicine. Many well-established databases--e.g., KEGG, WikiPathways, and BioCyc--are dedicated to collecting pathway data for public access. However, the effectiveness of these databases is hindered by issues such as incompatible data formats, inconsistent molecular representations, inconsistent molecular relationship representations, inconsistent referrals to pathway names, and incomplete coverage across different databases. In this paper, we overcome these issues through extraction, normalization and integration of pathway data from several major public databases (KEGG, WikiPathways, BioCyc, etc). We build a database that not only hosts our integrated pathway gene relationship data for public access but also maintains the necessary updates in the long run. This public repository is named IntPath (Integrated Pathway gene relationship database for model organisms and important pathogens). Four organisms--S. cerevisiae, M. tuberculosis H37Rv, H. sapiens and M. musculus--are included in this version (V2.0) of IntPath. IntPath uses the "full unification" approach to ensure no deletion and no introduced noise in this process. Therefore, IntPath contains much richer pathway-gene and pathway-gene pair relationships and a much larger number of non-redundant genes and gene pairs than any of the single-source databases. The gene relationships of each gene (measured by average node degree) per pathway are significantly richer. The gene relationships in each pathway (measured by average number of gene pairs per pathway) are also considerably richer in the integrated pathways. Moderate manual curation is involved to remove errors and noise from source data (e.g., the gene ID errors in WikiPathways and

  17. Direct Calculation of Permeability by High-Accurate Finite Difference and Numerical Integration Methods

    KAUST Repository

    Wang, Yi

    2016-07-21

    Velocity of fluid flow in underground porous media is 6–12 orders of magnitude lower than that in pipelines. If numerical errors are not carefully controlled in this kind of simulation, high distortion of the final results may occur [1-4]. To meet the high accuracy demands of fluid flow simulations in porous media, traditional finite difference methods and numerical integration methods are discussed and corresponding high-accurate methods are developed. When applied to the direct calculation of full-tensor permeability for underground flow, the high-accurate finite difference method is confirmed to have a numerical error as low as 10^-5%, while the high-accurate numerical integration method has essentially zero numerical error. Thus, the approach combining the high-accurate finite difference and numerical integration methods is a reliable way to efficiently determine the characteristics of general full-tensor permeability, such as the maximum and minimum permeability components, principal direction and anisotropic ratio. Copyright © Global-Science Press 2016.
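    The accuracy gap between low- and high-order schemes that motivates such work can be reproduced with a toy quadrature comparison; the integrand ∫₀¹ eˣ dx below is an arbitrary illustration, not the paper's permeability integrals:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule: error O(h^2)."""
    h = (b - a) / n
    return h * (0.5*f(a) + sum(f(a + i*h) for i in range(1, n)) + 0.5*f(b))

def simpson(f, a, b, n):
    """Composite Simpson rule (n even): error O(h^4)."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i*h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i*h) for i in range(2, n, 2))
    return s * h / 3

exact = math.e - 1.0                    # ∫₀¹ eˣ dx
err_trap = abs(trapezoid(math.exp, 0.0, 1.0, 100) - exact)
err_simp = abs(simpson(math.exp, 0.0, 1.0, 100) - exact)
print(err_trap, err_simp)   # the higher-order rule is several orders of magnitude more accurate
```

    At the same grid resolution, moving from a second- to a fourth-order rule buys roughly five extra digits here, which is the kind of error control the abstract argues is needed when velocities themselves span many orders of magnitude.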

  18. Analysis of thermal-plastic response of shells of revolution by numerical integration

    International Nuclear Information System (INIS)

    Leonard, J.W.

    1975-01-01

    A numerical method based on the numerical integration of the governing shell equations has been shown, for elastic cases, to be more efficient than the finite element method when applied to shells of revolution. In the numerical integration method, the governing differential equations of motion are converted into a set of initial-value problems. Each initial-value problem is integrated numerically between meridional boundary points and recombined so as to satisfy boundary conditions. For large-deflection elasto-plastic behavior, the equations are nonlinear and, hence, are recombined in an iterative manner using the Newton-Raphson procedure. Suppression techniques are incorporated in order to eliminate extraneous solutions within the numerical integration procedure. The Reissner-Meissner shell theory for shells of revolution is adopted to account for large deflection and higher-order rotation effects. The computer modelling of the equations is quite general in that specific shell segment geometries, e.g. cylindrical, spherical, toroidal, conical segments, and any combinations thereof can be handled easily. The elasto-plastic constitutive relations adopted are in accordance with currently recommended constitutive equations for inelastic design analysis of FFTF components. The von Mises yield criterion and its associated flow rule are used and the kinematic hardening law is followed. Examples are considered in which stainless steels common to LMFBR application are used.

  19. Experimental research and numerical simulation on flow resistance of integrated valve

    International Nuclear Information System (INIS)

    Cai Wei; Bo Hanliang; Qin Benke

    2008-01-01

    The flow resistance of the integrated valve is one of the key parameters for the design of the control rod hydraulic drive system (CRHDS). Experimental research on the improved new integrated valve was performed, and key data such as the pressure difference, volume flow, resistance coefficient and flow coefficient of each flow channel were obtained. With the computational fluid dynamics software CFX, numerical simulation was carried out to analyze the effect of Re on the flow resistance. On the basis of the experimental and numerical results, empirical formulas for the resistance coefficient were fitted, which provide experimental and theoretical foundations for the CRHDS's optimized design and theoretical analysis. (authors)
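    Fitting an empirical resistance-coefficient correlation of the common power-law form ζ = a·Re^b reduces to linear least squares in log-log space; the data points below are synthetic stand-ins, not the paper's measurements:

```python
import math

# Synthetic (Re, resistance coefficient) pairs following ζ = 50·Re^-0.25,
# standing in for measured pressure-difference/volume-flow data.
data = [(re, 50.0 * re ** -0.25) for re in (1e3, 5e3, 1e4, 5e4, 1e5)]

# Linear least squares on log ζ = log a + b·log Re.
xs = [math.log(re) for re, z in data]
ys = [math.log(z) for re, z in data]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = math.exp(ybar - b * xbar)
print(a, b)   # recovers a ≈ 50, b ≈ -0.25
```

    With real data the residuals of this log-log fit also indicate whether a single power law is adequate over the whole Re range or whether separate laminar and turbulent correlations are needed.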

  20. pySecDec: A toolbox for the numerical evaluation of multi-scale integrals

    Science.gov (United States)

    Borowka, S.; Heinrich, G.; Jahn, S.; Jones, S. P.; Kerner, M.; Schlenk, J.; Zirke, T.

    2018-01-01

    We present pySecDec, a new version of the program SecDec, which performs the factorization of dimensionally regulated poles in parametric integrals, and the subsequent numerical evaluation of the finite coefficients. The algebraic part of the program is now written in the form of python modules, which allow a very flexible usage. The optimization of the C++ code, generated using FORM, is improved, leading to a faster numerical convergence. The new version also creates a library of the integrand functions, such that it can be linked to user-specific codes for the evaluation of matrix elements in a way similar to analytic integral libraries.
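    The subtraction step that underlies pole factorization can be illustrated on a one-dimensional toy integral: ∫₀¹ x^(ε-1) f(x) dx = f(0)/ε + ∫₀¹ x^(ε-1) (f(x) − f(0)) dx, whose finite coefficient at ε = 0 is then integrable numerically. This is a generic sketch of the idea, not pySecDec's actual algorithm:

```python
import math

def finite_part(f, n=100000):
    """Finite coefficient of ∫₀¹ x^(ε-1) f(x) dx after subtracting the f(0)/ε pole,
    evaluated at ε = 0 by midpoint quadrature of ∫₀¹ (f(x) - f(0))/x dx."""
    h = 1.0 / n
    return h * sum((f((i + 0.5) * h) - f(0.0)) / ((i + 0.5) * h) for i in range(n))

# Toy integrand f(x) = 1/(1+x): the pole coefficient is f(0) = 1 and the
# finite part is ∫₀¹ -1/(1+x) dx = -ln 2.
val = finite_part(lambda x: 1.0 / (1.0 + x))
print(val)   # ≈ -0.693147
```

    Sector decomposition serves to bring multi-dimensional, multi-scale integrals into a form where every singular factor is of this simple monomial type, so the same subtraction can be applied factor by factor.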

  1. Integrating protein structures and precomputed genealogies in the Magnum database: Examples with cellular retinoid binding proteins

    Directory of Open Access Journals (Sweden)

    Bradley Michael E

    2006-02-01

    Full Text Available Abstract Background When accurate models for the divergent evolution of protein sequences are integrated with complementary biological information, such as folded protein structures, analyses of the combined data often lead to new hypotheses about molecular physiology. This represents an excellent example of how bioinformatics can be used to guide experimental research. However, progress in this direction has been slowed by the lack of a publicly available resource suitable for general use. Results The precomputed Magnum database offers a solution to this problem for ca. 1,800 full-length protein families with at least one crystal structure. The Magnum deliverables include (1) multiple sequence alignments, (2) mapping of alignment sites to crystal structure sites, (3) phylogenetic trees, (4) inferred ancestral sequences at internal tree nodes, and (5) amino acid replacements along tree branches. Comprehensive evaluations revealed that the automated procedures used to construct Magnum produced accurate models of how proteins divergently evolve, or genealogies, and correctly integrated these with the structural data. To demonstrate Magnum's capabilities, we asked for amino acid replacements requiring three nucleotide substitutions, located at internal protein structure sites, and occurring on short phylogenetic tree branches. In the cellular retinoid binding protein family a site that potentially modulates ligand binding affinity was discovered. Recruitment of cellular retinol binding protein to function as a lens crystallin in the diurnal gecko afforded another opportunity to showcase the predictive value of a browsable database containing branch replacement patterns integrated with protein structures. Conclusion We integrated two areas of protein science, evolution and structure, on a large scale and created a precomputed database, known as Magnum, which is the first freely available resource of its kind. Magnum provides evolutionary and structural

  2. PharmDB-K: Integrated Bio-Pharmacological Network Database for Traditional Korean Medicine.

    Directory of Open Access Journals (Sweden)

    Ji-Hyun Lee

    Full Text Available Despite the growing attention given to Traditional Medicine (TM worldwide, there is no well-known, publicly available, integrated bio-pharmacological Traditional Korean Medicine (TKM database for researchers in drug discovery. In this study, we have constructed PharmDB-K, which offers comprehensive information relating to TKM-associated drugs (compound, disease indication, and protein relationships. To explore the underlying molecular interaction of TKM, we integrated fourteen different databases, six Pharmacopoeias, and literature, and established a massive bio-pharmacological network for TKM and experimentally validated some cases predicted from the PharmDB-K analyses. Currently, PharmDB-K contains information about 262 TKMs, 7,815 drugs, 3,721 diseases, 32,373 proteins, and 1,887 side effects. One of the unique sets of information in PharmDB-K includes 400 indicator compounds used for standardization of herbal medicine. Furthermore, we are operating PharmDB-K via phExplorer (a network visualization software and BioMart (a data federation framework for convenient search and analysis of the TKM network. Database URL: http://pharmdb-k.org, http://biomart.i-pharm.org.

  3. MiCroKit 3.0: an integrated database of midbody, centrosome and kinetochore.

    Science.gov (United States)

    Ren, Jian; Liu, Zexian; Gao, Xinjiao; Jin, Changjiang; Ye, Mingliang; Zou, Hanfa; Wen, Longping; Zhang, Zhaolei; Xue, Yu; Yao, Xuebiao

    2010-01-01

    During cell division/mitosis, a specific subset of proteins is spatially and temporally assembled into protein super complexes in three distinct regions, i.e. centrosome/spindle pole, kinetochore/centromere and midbody/cleavage furrow/phragmoplast/bud neck, and modulates the cell division process faithfully. Although many experimental efforts have been carried out to investigate the characteristics of these proteins, no integrated database was available. Here, we present the MiCroKit database (http://microkit.biocuckoo.org) of proteins that localize in the midbody, centrosome and/or kinetochore. We collected into the MiCroKit database experimentally verified microkit proteins from the scientific literature that have unambiguous supportive evidence for subcellular localization under the fluorescent microscope. The current version of MiCroKit 3.0 provides detailed information for 1489 microkit proteins from seven model organisms, including Saccharomyces cerevisiae, Schizosaccharomyces pombe, Caenorhabditis elegans, Drosophila melanogaster, Xenopus laevis, Mus musculus and Homo sapiens. Moreover, orthologous information was provided for these microkit proteins, and could be a useful resource for further experimental identification. The online service of the MiCroKit database was implemented in PHP + MySQL + JavaScript, while the local packages were developed in JAVA 1.5 (J2SE 5.0).

  4. Human Ageing Genomic Resources: Integrated databases and tools for the biology and genetics of ageing

    Science.gov (United States)

    Tacutu, Robi; Craig, Thomas; Budovsky, Arie; Wuttke, Daniel; Lehmann, Gilad; Taranukha, Dmitri; Costa, Joana; Fraifeld, Vadim E.; de Magalhães, João Pedro

    2013-01-01

    The Human Ageing Genomic Resources (HAGR, http://genomics.senescence.info) is a freely available online collection of research databases and tools for the biology and genetics of ageing. HAGR features now several databases with high-quality manually curated data: (i) GenAge, a database of genes associated with ageing in humans and model organisms; (ii) AnAge, an extensive collection of longevity records and complementary traits for >4000 vertebrate species; and (iii) GenDR, a newly incorporated database, containing both gene mutations that interfere with dietary restriction-mediated lifespan extension and consistent gene expression changes induced by dietary restriction. Since its creation about 10 years ago, major efforts have been undertaken to maintain the quality of data in HAGR, while further continuing to develop, improve and extend it. This article briefly describes the content of HAGR and details the major updates since its previous publications, in terms of both structure and content. The completely redesigned interface, more intuitive and more integrative of HAGR resources, is also presented. Altogether, we hope that through its improvements, the current version of HAGR will continue to provide users with the most comprehensive and accessible resources available today in the field of biogerontology. PMID:23193293

  5. Note on the numerical calculation of the Fermi-Dirac integrals

    International Nuclear Information System (INIS)

    Graef, H.; Pabst, M.

    1977-11-01

    Expansions of the Fermi-Dirac integrals Fsub(α)(x) are developed, suitable for numerical computation. Only integrals of integer or half-integer order are treated and expansion coefficients are tabulated for Fsub(1)(x),...,Fsub(9)(x) and Fsub(-1/2)(x),...,Fsub(7/2)(x). Maximal relative errors vary with the function and interval considered, but are less than 3 x 10^-6. (orig.) [de]
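    Where tabulated expansions are unavailable, Fsub(α)(x) = ∫₀^∞ t^α/(1+exp(t−x)) dt can also be evaluated by direct quadrature; the substitution t = u² below removes the square-root behaviour of the half-integer orders at t = 0. This is a generic sketch, not the paper's expansion method:

```python
import math

def fermi_dirac_half(x, umax=12.0, n=4000):
    """Fsub(1/2)(x) = ∫₀^∞ √t/(1+exp(t-x)) dt via t = u² and composite Simpson."""
    g = lambda u: 2.0 * u * u / (1.0 + math.exp(u * u - x))  # smooth transformed integrand
    h = umax / n
    s = g(0.0) + g(umax)
    s += 4 * sum(g(i * h) for i in range(1, n, 2))
    s += 2 * sum(g(i * h) for i in range(2, n, 2))
    return s * h / 3

# Check against the closed form Fsub(1/2)(0) = Γ(3/2)·(1 - 1/√2)·ζ(3/2) ≈ 0.678094
print(fermi_dirac_half(0.0))
```

    Series expansions such as those tabulated in the paper remain preferable in inner loops, since they avoid thousands of exponential evaluations per function call.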

  6. FY1995 transduction method and CAD database systems for integrated design; 1995 nendo transduction ho to CAD database togo sekkei shien system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The Transduction method, developed by the research coordinator and Prof. Muroga, is one of the most popular methods for designing large-scale integrated circuits, and is thus used by major design tool companies in the USA and Japan. The major objectives of the research are to improve its capability and to exploit its reusable property by combining it with CAD databases. The major results of the project are as follows. (1) Improvement of the Transduction method: efficiency, capability and the maximum circuit size are improved. The error compensation method is also improved. (2) Applications to new logic elements: the Transduction method is modified to cope with wired logic and FPGAs. (3) CAD databases: one of the major advantages of the Transduction method is the 'reusability' of already designed circuits, which makes it well suited to combination with CAD databases. We designed CAD databases suitable for cooperative design using the Transduction method. (4) Program development: programs for Windows95 were developed for distribution. (NEDO)

  8. A purely Lagrangian method for the numerical integration of Fokker-Planck equations

    International Nuclear Information System (INIS)

    Combis, P.; Fronteau, J.

    1986-01-01

    A new numerical approach to Fokker-Planck equations is presented, in which the integration grid moves according to the solution of a differential system. The method is purely Lagrangian, the mean effect of the diffusion being inserted into the differential system itself
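    The Lagrangian viewpoint — following moving points rather than a fixed grid — can be loosely illustrated by propagating sample points through the drift-diffusion dynamics underlying a Fokker-Planck equation. The paper's scheme is deterministic and differs in detail; the stochastic analogue below is only meant to show grid-free, point-following evolution for the Ornstein-Uhlenbeck case ∂f/∂t = ∂(xf)/∂x + ∂²f/∂x²:

```python
import math
import random

random.seed(1)

# Euler-Maruyama propagation of sample points for dX = -X dt + √2 dW, whose
# probability density obeys the Ornstein-Uhlenbeck Fokker-Planck equation above.
npts, dt, steps = 1000, 0.02, 1000            # integrate to t = 20
sig = math.sqrt(2.0 * dt)
pts = [0.0] * npts                            # all sample points start at the origin
for _ in range(steps):
    pts = [x - x * dt + sig * random.gauss(0.0, 1.0) for x in pts]

var = sum(x * x for x in pts) / npts          # stationary variance is exactly 1
print(var)   # close to 1, up to sampling noise
```

    In the purely Lagrangian method of the abstract, the diffusive spreading is instead built into the deterministic differential system that moves the grid, avoiding the sampling noise visible here.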

  9. Numerical Simulation and Experimental Validation of an Integrated Sleeve-Wedge Anchorage for CFRP Rods

    DEFF Research Database (Denmark)

    Schmidt, Jacob Wittrup; Smith, Scott T.; Täljsten, Björn

    2011-01-01

    Recently, an integrated sleeve-wedge anchorage has been successfully developed specifically for CFRP rods. This paper in turn presents a numerical simulation of the newly developed anchorage using ABAQUS. The three-dimensional finite element (FE) model, which considers material non-linearity, uses

  10. Neutron metrology file NMF-90. An integrated database for performing neutron spectrum adjustment calculations

    International Nuclear Information System (INIS)

    Kocherov, N.P.

    1996-01-01

    The Neutron Metrology File NMF-90 is an integrated database for performing neutron spectrum adjustment (unfolding) calculations. It contains 4 different adjustment codes, the dosimetry reaction cross-section library IRDF-90/NMF-G with covariance files, 6 input data sets for reactor benchmark neutron fields and a number of utility codes for processing and plotting the input and output data. The package consists of 9 PC HD diskettes and manuals for the codes. It is distributed by the Nuclear Data Section of the IAEA on request free of charge. About 10 MB of disk space is needed to install and run a typical reactor neutron dosimetry unfolding problem. (author). 8 refs

  11. Planning the future of JPL's management and administrative support systems around an integrated database

    Science.gov (United States)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential for providing the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, centered on the development of an integrated management and administrative database, are discussed.

  12. Pancreatic Expression database: a generic model for the organization, integration and mining of complex cancer datasets

    Directory of Open Access Journals (Sweden)

    Lemoine Nicholas R

    2007-11-01

    Full Text Available Abstract Background Pancreatic cancer is the 5th leading cause of cancer death in both males and females. In recent years, a wealth of gene and protein expression studies have been published, broadening our understanding of pancreatic cancer biology. Due to the explosive growth in publicly available data from multiple different sources, it is becoming increasingly difficult for individual researchers to integrate these into their current research programmes. The Pancreatic Expression database, a generic web-based system, aims to close this gap by providing the research community with an open access tool, not only to mine currently available pancreatic cancer data sets but also to include their own data in the database. Description Currently, the database holds 32 datasets comprising 7636 gene expression measurements extracted from 20 different published gene or protein expression studies from various pancreatic cancer types, pancreatic precursor lesions (PanINs and chronic pancreatitis. The pancreatic data are stored in a data management system based on the BioMart technology alongside the human genome gene and protein annotations, sequence, homologue, SNP and antibody data. Interrogation of the database can be achieved through both a web-based query interface and through web services using combined criteria from pancreatic (disease stages, regulation, differential expression, expression, platform technology, publication and/or public data (antibodies, genomic region, gene-related accessions, ontology, expression patterns, multi-species comparisons, protein data, SNPs. Thus, our database enables connections between otherwise disparate data sources and allows relatively simple navigation between all data types and annotations. Conclusion The database structure and content provides a powerful and high-speed data-mining tool for cancer research. It can be used for target discovery, i.e. of biomarkers from body fluids, identification and analysis

  13. Exponential Convergence for Numerical Solution of Integral Equations Using Radial Basis Functions

    Directory of Open Access Journals (Sweden)

    Zakieh Avazzadeh

    2014-01-01

    Full Text Available We solve several different types of Urysohn integral equations by using radial basis functions. These types include the linear and nonlinear Fredholm, Volterra, and mixed Volterra-Fredholm integral equations. Our main aim is to investigate the rate of convergence when solving these equations using radial basis functions, whose norm-based (radial) structure makes them well suited to approximation in higher dimensions. Of course, the use of this method often leads to ill-conditioned systems. Thus we propose an algorithm to improve the results. Numerical results show that this method achieves exponential convergence for solving integral equations, as was already confirmed for partial and ordinary differential equations.
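    A minimal collocation sketch of the approach for a linear Fredholm equation of the second kind, u(x) = f(x) + ∫₀¹ K(x,t)u(t) dt, using Gaussian radial basis functions and trapezoidal quadrature. The kernel, right-hand side and shape parameter below are a manufactured test case with exact solution u(x) = x, not an example from the paper:

```python
import numpy as np

# Manufactured problem: K(x,t) = x*t, f(x) = 2x/3, exact solution u(x) = x.
K = lambda x, t: x * t
f = lambda x: 2.0 * x / 3.0

eps = 3.0                                   # assumed RBF shape parameter
phi = lambda r: np.exp(-(eps * r) ** 2)     # Gaussian radial basis function

centers = np.linspace(0.0, 1.0, 11)         # RBF centers = collocation points
tq = np.linspace(0.0, 1.0, 201)             # trapezoidal quadrature nodes
w = np.full(tq.size, tq[1] - tq[0])
w[0] *= 0.5
w[-1] *= 0.5

# Collocation: sum_j c_j [phi(|x_i - c_j|) - ∫ K(x_i,t) phi(|t - c_j|) dt] = f(x_i)
Phi = phi(np.abs(centers[:, None] - centers[None, :]))
Kq = K(centers[:, None], tq[None, :])                 # kernel at (x_i, t_q)
Pq = phi(np.abs(tq[:, None] - centers[None, :]))      # basis values at (t_q, c_j)
A = Phi - (Kq * w[None, :]) @ Pq
c = np.linalg.solve(A, f(centers))

u = lambda x: phi(np.abs(x - centers)) @ c            # reconstructed solution
print(u(0.5))   # exact value is 0.5
```

    The ill-conditioning mentioned in the abstract appears here as the flat-limit sensitivity of the matrix A to the shape parameter eps; very small eps improves accuracy in theory but degrades the linear solve in practice.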

  14. An integrated algorithm for hypersonic fluid-thermal-structural numerical simulation

    Science.gov (United States)

    Li, Jia-Wei; Wang, Jiang-Feng

    2018-05-01

    In this paper, an integrated fluid-thermal-structural method based on the finite volume method is presented. A unified system of integral equations is developed as the control equations for the physical processes of aero-heating and structural heat transfer. The whole physical field is discretized using an upwind finite volume method. To demonstrate its capability, a numerical simulation of Mach 6.47 flow over a stainless steel cylinder shows good agreement with measured values, and the method dynamically simulates the coupled physical processes. Thus, the integrated algorithm proves to be efficient and reliable.
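    The idea of one unified control-volume discretization spanning fluid and structure can be illustrated in one dimension: a single finite-volume loop marches heat conduction through two materials, and the converged interface behaviour matches the series thermal-resistance result. This is a generic sketch of finite-volume conjugate coupling, not the paper's hypersonic solver:

```python
# 1-D conjugate heat conduction handled by a single finite-volume loop:
# material 1 (k = 1) occupies [0, 0.5], material 2 (k = 2) occupies [0.5, 1],
# with fixed boundary temperatures T(0) = 0 and T(1) = 1.
N = 20
dx = 1.0 / N
k = [1.0 if (i + 0.5) * dx < 0.5 else 2.0 for i in range(N)]  # per-cell conductivity
T = [0.0] * N
dt = 0.2 * dx * dx / max(k)          # explicit stability limit (unit heat capacity)

def face_k(ka, kb):
    """Harmonic-mean face conductivity keeps the flux continuous across materials."""
    return 2.0 * ka * kb / (ka + kb)

for _ in range(20000):               # march to steady state
    Tn = T[:]
    for i in range(N):
        # Conductive fluxes on the left/right faces; boundary faces sit half a cell away.
        qL = (k[0] * (T[0] - 0.0) / (0.5 * dx) if i == 0
              else face_k(k[i-1], k[i]) * (T[i] - T[i-1]) / dx)
        qR = (k[-1] * (1.0 - T[-1]) / (0.5 * dx) if i == N - 1
              else face_k(k[i], k[i+1]) * (T[i+1] - T[i]) / dx)
        Tn[i] = T[i] + dt * (qR - qL) / dx
    T = Tn

# Series thermal resistance gives the exact flux q = 1 / (0.5/1 + 0.5/2) = 4/3,
# so T(x) = q·x/k1 in material 1 and T(x) = 1 - q·(1-x)/k2 in material 2.
q = 1.0 / (0.5 / 1.0 + 0.5 / 2.0)
print(T[4], q * (4 + 0.5) * dx)      # cell centre x = 0.225 lies in material 1
```

    Because both materials live in the same loop, no interface boundary condition has to be exchanged between separate solvers; the harmonic-mean face conductivity does the coupling implicitly, which is the essence of the unified-equation approach.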

  15. Integrating query of relational and textual data in clinical databases: a case study.

    Science.gov (United States)

    Fisk, John M; Mutalik, Pradeep; Levin, Forrest W; Erdos, Joseph; Taylor, Caroline; Nadkarni, Prakash

    2003-01-01

    The authors designed and implemented a clinical data mart composed of an integrated information retrieval (IR) and relational database management system (RDBMS). Using commodity software, which supports interactive, attribute-centric text and relational searches, the mart houses 2.8 million documents that span a five-year period and supports basic IR features such as Boolean searches, stemming, and proximity and fuzzy searching. Results are relevance-ranked using either "total documents per patient" or "report type weighting." Non-curated medical text has a significant degree of malformation with respect to spelling and punctuation, which creates difficulties for text indexing and searching. Presently, the IR facilities of RDBMS packages lack the features necessary to handle such malformed text adequately. A robust IR+RDBMS system can be developed, but it requires integrating RDBMSs with third-party IR software. RDBMS vendors need to make their IR offerings more accessible to non-programmers.

  16. Integrated Storage and Management of Vector and Raster Data Based on Oracle Database

    Directory of Open Access Journals (Sweden)

    WU Zheng

    2017-05-01

    Full Text Available At present, the storage and management of multi-source heterogeneous spatial data suffer from several problems, such as difficult data transfer, a lack of unified storage, and low efficiency. By combining a relational database with spatial data engine technology, this paper proposes an approach for the integrated storage and management of vector and raster data based on Oracle. The approach first establishes an integrated storage model for vector and raster data and optimizes the retrieval mechanism, then designs a framework for seamless data transfer, and finally realizes the unified storage and efficient management of multi-source heterogeneous data. Experimental comparison with ArcSDE, a leading spatial data engine, shows that the proposed approach achieves higher data transfer performance and better query and retrieval efficiency.

  17. An integral equation-based numerical solver for Taylor states in toroidal geometries

    Science.gov (United States)

    O'Neil, Michael; Cerfon, Antoine J.

    2018-04-01

    We present an algorithm for the numerical calculation of Taylor states in toroidal and toroidal-shell geometries using an analytical framework developed for the solution to the time-harmonic Maxwell equations. Taylor states are a special case of what are known as Beltrami fields, or linear force-free fields. The scheme of this work relies on the generalized Debye source representation of Maxwell fields and an integral representation of Beltrami fields which immediately yields a well-conditioned second-kind integral equation. This integral equation has a unique solution whenever the Beltrami parameter λ is not a member of a discrete, countable set of resonances which physically correspond to spontaneous symmetry breaking. Several numerical examples relevant to magnetohydrodynamic equilibria calculations are provided. Lastly, our approach easily generalizes to arbitrary geometries, both bounded and unbounded, and of varying genus.

  18. Structural Health Monitoring of Tall Buildings with Numerical Integrator and Convex-Concave Hull Classification

    Directory of Open Access Journals (Sweden)

    Suresh Thenozhi

    2012-01-01

    Full Text Available An important objective of health monitoring systems for tall buildings is to diagnose the state of the building and to evaluate its possible damage. In this paper, we use our prototype to evaluate our data-mining approach for fault monitoring. Offset cancellation and high-pass filtering techniques are combined effectively to solve common problems in the numerical integration of acceleration signals in real-time applications. The integration accuracy is improved compared with other numerical integrators. We then introduce a novel method for support vector machine (SVM) classification, called the convex-concave hull. We use the Jarvis march method to determine the concave (non-convex) hull for the inseparable points. Finally, the vertices of the convex-concave hull are used for SVM training.
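The signal-processing pipeline described (offset cancellation, high-pass filtering, trapezoidal integration, drift removal) can be sketched as follows. The first-order RC filter and the 0.5 Hz cutoff are illustrative assumptions, not the authors' exact design.

```python
import numpy as np

def highpass(sig, fs, fc):
    """First-order RC high-pass filter: suppresses DC offset and slow drift."""
    rc = 1.0 / (2.0 * np.pi * fc)
    alpha = rc / (rc + 1.0 / fs)
    out = np.zeros_like(sig)
    for i in range(1, len(sig)):
        out[i] = alpha * (out[i - 1] + sig[i] - sig[i - 1])
    return out

def acc_to_vel(acc, fs, fc=0.5):
    """Offset cancellation + high-pass, trapezoidal integration, drift removal."""
    acc = highpass(acc - acc.mean(), fs, fc)
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) / (2.0 * fs))))
    return highpass(vel, fs, fc)   # remove the drift introduced by integration

fs = 1000.0
t = np.arange(0.0, 5.0, 1.0 / fs)
acc = np.cos(2.0 * np.pi * 10.0 * t) + 0.3   # 10 Hz vibration + sensor offset
vel = acc_to_vel(acc, fs)
peak = np.max(vel[2000:])                     # expect ~1/(2*pi*10) = 0.0159
print(peak)
```

Without the offset cancellation and high-pass stages, the constant sensor offset would integrate into a linearly growing velocity error, which is exactly the real-time problem the paper addresses.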

  19. CPLA 1.0: an integrated database of protein lysine acetylation.

    Science.gov (United States)

    Liu, Zexian; Cao, Jun; Gao, Xinjiao; Zhou, Yanhong; Wen, Longping; Yang, Xiangjiao; Yao, Xuebiao; Ren, Jian; Xue, Yu

    2011-01-01

    As a reversible post-translational modification (PTM) discovered decades ago, protein lysine acetylation was known for its regulation of transcription through the modification of histones. Recent studies discovered that lysine acetylation targets broad substrates and especially plays an essential role in cellular metabolic regulation. Although acetylation is comparable with other major PTMs such as phosphorylation, an integrated resource still remains to be developed. In this work, we present the compendium of protein lysine acetylation (CPLA) database for lysine acetylated substrates with their sites. From the scientific literature, we manually collected 7151 experimentally identified acetylation sites in 3311 targets. We statistically studied the regulatory roles of lysine acetylation by analyzing the Gene Ontology (GO) and InterPro annotations. Combined with protein-protein interaction information, we systematically discovered a potential human lysine acetylation network (HLAN) among histone acetyltransferases (HATs), substrates and histone deacetylases (HDACs). In particular, there are 1862 triplet relationships of HAT-substrate-HDAC retrieved from the HLAN, at least 13 of which were previously experimentally verified. The online services of the CPLA database were implemented in PHP + MySQL + JavaScript, while the local packages were developed in JAVA 1.5 (J2SE 5.0). The CPLA database is freely available to all users at: http://cpla.biocuckoo.org.

  20. EVpedia: an integrated database of high-throughput data for systemic analyses of extracellular vesicles

    Directory of Open Access Journals (Sweden)

    Dae-Kyum Kim

    2013-03-01

    Full Text Available Secretion of extracellular vesicles is a general cellular activity that spans the range from simple unicellular organisms (e.g. archaea, Gram-positive and Gram-negative bacteria) to complex multicellular ones, suggesting that this extracellular vesicle-mediated communication is evolutionarily conserved. Extracellular vesicles are spherical bilayered proteolipids with a mean diameter of 20–1,000 nm, which are known to contain various bioactive molecules including proteins, lipids, and nucleic acids. Here, we present EVpedia, which is an integrated database of high-throughput datasets from prokaryotic and eukaryotic extracellular vesicles. EVpedia provides high-throughput datasets of vesicular components (proteins, mRNAs, miRNAs, and lipids) present on prokaryotic, non-mammalian eukaryotic, and mammalian extracellular vesicles. In addition, EVpedia also provides an array of tools, such as the search and browse of vesicular components, Gene Ontology enrichment analysis, network analysis of vesicular proteins and mRNAs, and a comparison of vesicular datasets by ortholog identification. Moreover, publications on extracellular vesicle studies are listed in the database. This free web-based database of EVpedia (http://evpedia.info) might serve as a fundamental repository to stimulate the advancement of extracellular vesicle studies and to elucidate the novel functions of these complex extracellular organelles.

  1. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    Science.gov (United States)

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  2. A numerical integration-based yield estimation method for integrated circuits

    International Nuclear Information System (INIS)

    Liang Tao; Jia Xinzhang

    2011-01-01

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method integrates the joint probability density function directly over the acceptability region. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using the Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameter estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function, are used to compare our method with Monte Carlo based methods, including Latin hypercube sampling and importance sampling, under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency in all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)
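The stratified-sampling flavour of the approach can be illustrated with a plain Latin hypercube yield estimate on a toy performance function. This is ordinary LHS on an invented circuit response, not the paper's OA-MLHS/Box-Cox procedure; the performance function and specification threshold are assumptions for illustration only.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube(n, d, rng):
    """One random point per stratum in each dimension, columns shuffled."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

# made-up "circuit performance" as a function of two normal process disturbances
def performance(z):
    return 1.0 + 0.5 * z[:, 0] - 0.3 * z[:, 1] + 0.2 * z[:, 0] * z[:, 1]

rng = np.random.default_rng(1)
u = latin_hypercube(4000, 2, rng)
z = np.vectorize(NormalDist().inv_cdf)(u)    # stratified standard normals
yield_est = float(np.mean(performance(z) > 0.5))  # fraction in the spec region
print(yield_est)
```

Compared with crude Monte Carlo at the same sample size, the stratification reduces the variance of the yield estimate, which is the motivation for the OA-MLHS scheme in the paper.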

  3. A numerical integration-based yield estimation method for integrated circuits

    Energy Technology Data Exchange (ETDEWEB)

    Liang Tao; Jia Xinzhang, E-mail: tliang@yahoo.cn [Key Laboratory of Ministry of Education for Wide Bandgap Semiconductor Materials and Devices, School of Microelectronics, Xidian University, Xi'an 710071 (China)

    2011-04-15

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method integrates the joint probability density function directly over the acceptability region. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using the Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameter estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function, are used to compare our method with Monte Carlo based methods, including Latin hypercube sampling and importance sampling, under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency in all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)

  4. Conservation properties of numerical integration methods for systems of ordinary differential equations

    Science.gov (United States)

    Rosenbaum, J. S.

    1976-01-01

    If a system of ordinary differential equations represents a property conserving system that can be expressed linearly (e.g., conservation of mass), it is then desirable that the numerical integration method used conserve the same quantity. It is shown that both linear multistep methods and Runge-Kutta methods are 'conservative' and that Newton-type methods used to solve the implicit equations preserve the inherent conservation of the numerical method. It is further shown that a method used by several authors is not conservative.
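The linear-invariant property can be checked directly: for x' = Ax where the columns of A sum to zero (e.g. mass moving between compartments), the total Σxᵢ is conserved, and a Runge-Kutta step preserves it to round-off because every stage lies in the zero-sum subspace. A minimal sketch of the property, not the paper's analysis:

```python
import numpy as np

# rate matrix with zero column sums: d/dt sum(x) = 1^T A x = 0 (mass conserved)
A = np.array([[-1.0,  0.5],
              [ 1.0, -0.5]])

def rk4_step(x, h):
    """One classical RK4 step for x' = A x."""
    k1 = A @ x
    k2 = A @ (x + 0.5 * h * k1)
    k3 = A @ (x + 0.5 * h * k2)
    k4 = A @ (x + h * k3)
    return x + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

x = np.array([2.0, 1.0])
total0 = x.sum()
for _ in range(1000):
    x = rk4_step(x, 0.01)
drift = abs(x.sum() - total0)
print(drift)   # round-off only: each stage k has sum(k) = 0
```

The same argument applies to any linear multistep method, since the update is a linear combination of states and stage derivatives that all respect the invariant.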

  5. Experimental and Numerical Investigations in Shallow Cut Grinding by Workpiece Integrated Infrared Thermopile Array.

    Science.gov (United States)

    Reimers, Marcel; Lang, Walter; Dumstorff, Gerrit

    2017-09-30

    The purpose of our study is to investigate the heat distribution and the temperatures occurring during grinding. We therefore carried out both experimental and numerical investigations. In the first part, we present the integration of an infrared thermopile array in a steel workpiece. Experiments are done by acquiring data from the thermopile array during grinding of a groove in a workpiece made of steel. In the second part, we present numerical investigations of the grinding process to further understand its thermal characteristics. Finally, we conclude our work. Increasing the feed speed leads to two things: higher heat flux densities in the workpiece and higher temperature gradients in the material.

  6. Experimental and Numerical Investigations in Shallow Cut Grinding by Workpiece Integrated Infrared Thermopile Array

    Directory of Open Access Journals (Sweden)

    Marcel Reimers

    2017-09-01

    Full Text Available The purpose of our study is to investigate the heat distribution and the temperatures occurring during grinding. We therefore carried out both experimental and numerical investigations. In the first part, we present the integration of an infrared thermopile array in a steel workpiece. Experiments are done by acquiring data from the thermopile array during grinding of a groove in a workpiece made of steel. In the second part, we present numerical investigations of the grinding process to further understand its thermal characteristics. Finally, we conclude our work. Increasing the feed speed leads to two things: higher heat flux densities in the workpiece and higher temperature gradients in the material.

  7. A semantic data dictionary method for database schema integration in CIESIN

    Science.gov (United States)

    Hinds, N.; Huang, Y.; Ravishankar, C.

    1993-08-01

    CIESIN (Consortium for International Earth Science Information Network) is funded by NASA to investigate the technology necessary to integrate and facilitate the interdisciplinary use of Global Change information. A clear goal of this mission is to provide a link between the various global change data sets, in particular between the physical sciences and the human (social) sciences. The typical scientist using the CIESIN system will want to know how phenomena in an outside field affect his/her work. For example, a medical researcher might ask: how does air quality affect emphysema? This and many similar questions will require sophisticated semantic data integration. The researcher who raised the question may be familiar with medical data sets containing emphysema occurrences, but may know little, if anything, about the existence or location of air-quality data. It is easy to envision a system which would allow that investigator to locate and perform a ``join'' on two data sets, one containing emphysema cases and the other containing air-quality levels. No such system exists today. One major obstacle to providing such a system is heterogeneity, which falls into two broad categories. ``Database system'' heterogeneity involves differences in data models and packages. ``Data semantic'' heterogeneity involves differences in terminology between disciplines and varying levels of data refinement, from raw to summary. Our work investigates a global data dictionary mechanism to facilitate a merged data service. Specifically, we propose using a semantic tree during schema definition to aid in locating and integrating heterogeneous databases.
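The data dictionary idea can be sketched as a mapping from source-specific field names onto shared semantic concepts, so that two heterogeneous record sets become joinable. This is a deliberately tiny toy, not CIESIN's actual mechanism; all field names and values below are invented.

```python
# toy global data dictionary: maps source-qualified field names onto shared
# semantic concepts so heterogeneous records can be joined
DICTIONARY = {
    "medical.county_fips": "region_id",
    "air.fips_code": "region_id",
}

medical = [{"county_fips": "17031", "emphysema_cases": 412}]
air_quality = [{"fips_code": "17031", "pm25_ugm3": 14.2}]

def canonicalize(record, source):
    """Rename fields to their shared concept where the dictionary has an entry."""
    return {DICTIONARY.get(f"{source}.{k}", k): v for k, v in record.items()}

# a semantic join over the shared "region_id" concept
joined = [
    {**m, **a}
    for m in (canonicalize(r, "medical") for r in medical)
    for a in (canonicalize(r, "air") for r in air_quality)
    if m["region_id"] == a["region_id"]
]
print(joined)
```

In the real system the dictionary would also have to reconcile units, refinement levels (raw vs. summary), and data-model differences, which is where the semantic-tree approach comes in.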

  8. On randomized algorithms for numerical solution of applied Fredholm integral equations of the second kind

    Science.gov (United States)

    Voytishek, Anton V.; Shipilov, Nikolay M.

    2017-11-01

    In this paper, a systematization of numerical (computer-implemented) randomized functional algorithms for approximating the solution of a Fredholm integral equation of the second kind is carried out. Three types of such algorithms are distinguished: projection, mesh, and projection-mesh methods. The possibilities of using these algorithms to solve practically important problems are investigated in detail. A disadvantage of the mesh algorithms is identified: they require computing values of the kernel of the integral equation at fixed points, yet in practice these kernels have integrable singularities, making such point evaluation impossible. Thus, for applied problems involving Fredholm integral equations of the second kind, it is expedient to use not the mesh algorithms but the projection and projection-mesh randomized algorithms.

  9. Numerical Study of Two-Dimensional Volterra Integral Equations by RDTM and Comparison with DTM

    Directory of Open Access Journals (Sweden)

    Reza Abazari

    2013-01-01

    Full Text Available The two-dimensional Volterra integral equations are solved using a more recent semianalytic method, the reduced differential transform method (the so-called RDTM), and compared with the differential transform method (DTM). The concepts of DTM and RDTM are briefly explained, and their application to the two-dimensional Volterra integral equations is studied. The results obtained by DTM and RDTM are compared with the exact solution. As an important result, it is shown that the RDTM results are more accurate than those obtained by DTM applied to the same Volterra integral equations. The numerical results reveal that the RDTM is very effective, convenient, and quite accurate for this kind of nonlinear integral equation. It is expected that the RDTM will find wide applicability in the engineering sciences.

  10. A numerical integration approach suitable for simulating PWR dynamics using a microcomputer system

    International Nuclear Information System (INIS)

    Zhiwei, L.; Kerlin, T.W.

    1983-01-01

    It is attractive to use microcomputer systems to simulate nuclear power plant dynamics for the purposes of teaching and/or control system design. An analysis and feasibility comparison of existing numerical integration methods has been made. The criteria for choosing the integration step with various numerical integration methods, including the matrix exponential method, are derived. In order to speed up the simulation, an approach is presented using the Newton recursion calculus, which avoids convergence limitations in choosing the integration step size; accuracy considerations then dominate the limit on the step size. The advantages of this method have been demonstrated through a case study using a CBM model 8032 microcomputer to simulate a reduced-order linear PWR model under various perturbations. It has been proven theoretically and practically that the Runge-Kutta method and the Adams-Moulton method are not feasible. The matrix exponential method has good accuracy and fairly good speed. The Newton recursion method saves 3/4 to 4/5 of the computing time compared to the matrix exponential method, with reasonable accuracy. This method can be extended to deal with nonlinear nuclear power plant models and higher order models as well.
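The matrix exponential stepping idea can be sketched for a generic linear state-space model: the discrete transition is exact for any step size when the input is held constant over the step, so the step is limited by accuracy rather than stability. The matrices below are toy values, not the PWR model from the paper.

```python
import numpy as np
from scipy.linalg import expm

# toy linear plant x' = A x + B u (illustrative values, not the PWR model)
A = np.array([[-2.0,  1.0],
              [ 0.0, -0.5]])
B = np.array([[1.0],
              [0.5]])
h = 0.1   # step chosen for accuracy of the input history, not for stability

# exact discrete transition for piecewise-constant input:
# x_{k+1} = e^{Ah} x_k + A^{-1}(e^{Ah} - I) B u_k
Phi = expm(A * h)
Gamma = np.linalg.solve(A, Phi - np.eye(2)) @ B

x, u = np.zeros((2, 1)), 1.0
for _ in range(200):               # 20 s of simulated time, unit step input
    x = Phi @ x + Gamma * u
print(x.ravel())                   # approaches steady state -A^{-1} B u = [1, 1]
```

Since Phi and Gamma are computed once, each step is just a small matrix-vector multiply, which is what made this class of method attractive on a microcomputer.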

  11. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    Science.gov (United States)

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses.

  12. Numerical evaluation of two-center integrals over Slater type orbitals

    Energy Technology Data Exchange (ETDEWEB)

    Kurt, S. A., E-mail: slaykurt@gmail.com [Department of Physics, Natural Sciences Institute, Ondokuz Mayıs University, 55139, Samsun (Turkey); Yükçü, N., E-mail: nyukcu@gmail.com [Department of Energy Systems Engineering, Faculty of Technology, Adıyaman University, 02040, Adıyaman (Turkey)

    2016-03-25

    Slater-type orbitals (STOs), one of the types of exponential-type orbitals (ETOs), are usually used as basis functions in multicenter molecular integrals to better understand the physical and chemical properties of matter. In this work, we develop algorithms for two-center overlap integrals and two-center two-electron hybrid and Coulomb integrals, which are calculated with the help of the translation method for STOs and some auxiliary functions introduced by V. Magnasco’s group. We use the Mathematica programming language to produce algorithms for these calculations. Numerical results for some quantum numbers are presented in tables. Finally, we compare our numerical results with other known literature results, and further details of the evaluation method are discussed.

  13. Quadrature theory the theory of numerical integration on a compact interval

    CERN Document Server

    Brass, Helmut

    2011-01-01

    Every book on numerical analysis covers methods for the approximate calculation of definite integrals. The authors of this book provide a complementary treatment of the topic by presenting a coherent theory of quadrature methods that encompasses many deep and elegant results as well as a large number of interesting (solved and open) problems. The inclusion of the word "theory" in the title highlights the authors' emphasis on analytical questions, such as the existence and structure of quadrature methods and selection criteria based on strict error bounds for quadrature rules. Systematic analyses of this kind rely on certain properties of the integrand, called "co-observations," which form the central organizing principle for the authors' theory and distinguish their book from other texts on numerical integration. A wide variety of co-observations are examined, as a detailed understanding of these is useful for solving problems in practical contexts. While quadrature theory is often viewed as a branch of numerical analysis...

  14. Numerical evaluation of two-center integrals over Slater type orbitals

    International Nuclear Information System (INIS)

    Kurt, S. A.; Yükçü, N.

    2016-01-01

    Slater-type orbitals (STOs), one of the types of exponential-type orbitals (ETOs), are usually used as basis functions in multicenter molecular integrals to better understand the physical and chemical properties of matter. In this work, we develop algorithms for two-center overlap integrals and two-center two-electron hybrid and Coulomb integrals, which are calculated with the help of the translation method for STOs and some auxiliary functions introduced by V. Magnasco’s group. We use the Mathematica programming language to produce algorithms for these calculations. Numerical results for some quantum numbers are presented in tables. Finally, we compare our numerical results with other known literature results, and further details of the evaluation method are discussed.

  15. Numerical simulation and experimental research of the integrated high-power LED radiator

    Science.gov (United States)

    Xiang, J. H.; Zhang, C. L.; Gan, Z. J.; Zhou, C.; Chen, C. G.; Chen, S.

    2017-01-01

    Thermal management has become an urgent problem with the increasing power and integration density of LED (light emitting diode) chips. In order to eliminate the contact resistance of the radiator, this paper presents an integrated high-power LED radiator based on phase-change heat transfer, which realizes a seamless connection between the vapor chamber and the cooling fins. The radiator was optimized by combining numerical simulation and experimental research. The effects of the chamber diameter and the fin parameters on heat dissipation performance were analyzed, and the numerical simulation results were compared with values measured in experiments. The results showed that the fin thickness, the fin number, the fin height and the chamber diameter were the factors affecting radiator performance, in order of decreasing importance.

  16. Numerical simulation of liquid film flow on revolution surfaces with momentum integral method

    International Nuclear Information System (INIS)

    Bottoni Maurizio

    2005-01-01

    The momentum integral method is applied in the frame of safety analysis of pressurized water reactors under hypothetical loss of coolant accident (LOCA) conditions to numerically simulate film condensation, rewetting and vaporization on the inner surface of the reactor containment. From the conservation equations of mass and momentum of a liquid film arising from condensation of steam upon the inner surface of the containment during a LOCA, an integro-differential equation is derived for an arbitrary axisymmetric surface of revolution. This equation describes the velocity distribution of the liquid film along a meridian of the surface of revolution. From the integro-differential equation, a first-order ordinary differential equation for the film velocity is derived and integrated numerically. From the velocity distribution the film thickness distribution is obtained. The solution of the enthalpy equation for the liquid film yields the temperature distribution on the inner surface of the containment. (authors)

  17. Numerical Algorithms for Acoustic Integrals - The Devil is in the Details

    Science.gov (United States)

    Brentner, Kenneth S.

    1996-01-01

    The accurate prediction of the aeroacoustic field generated by aerospace vehicles or nonaerospace machinery is necessary for designers to control and reduce source noise. Powerful computational aeroacoustic methods, based on various acoustic analogies (primarily the Lighthill acoustic analogy) and Kirchhoff methods, have been developed for prediction of noise from complicated sources, such as rotating blades. Both methods ultimately predict the noise through a numerical evaluation of an integral formulation. In this paper, we consider three generic acoustic formulations and several numerical algorithms that have been used to compute the solutions to these formulations. Algorithms for retarded-time formulations are the most efficient and robust, but they are difficult to implement for supersonic-source motion. Collapsing-sphere and emission-surface formulations are good alternatives when supersonic-source motion is present, but the numerical implementations of these formulations are more computationally demanding. New algorithms - which utilize solution adaptation to provide a specified error level - are needed.

  18. A difference quotient-numerical integration method for solving radiative transfer problems

    International Nuclear Information System (INIS)

    Ding Peizhu

    1992-01-01

    A difference quotient-numerical integration method is adopted to solve radiative transfer problems in an anisotropically scattering slab medium. With this method, the radiative transfer problem reduces to a system of linear algebraic equations whose coefficient matrix is a band matrix, so the method is simple to implement on a computer, its formulae are easy to derive, and it is easy for experimentalists to master. An example is evaluated, and it is shown that the method is precise.
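The band structure is what makes such discretizations cheap to solve: a banded solver runs in O(n) rather than O(n³). As a generic illustration (a standard tridiagonal model problem, not the paper's transfer equations), SciPy's banded solver can be used directly:

```python
import numpy as np
from scipy.linalg import solve_banded

# generic tridiagonal model problem: -u'' = f on (0,1), u(0) = u(1) = 0,
# with f = pi^2 sin(pi x) so the exact solution is u = sin(pi x)
n = 100
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

ab = np.zeros((3, n))            # banded storage: superdiag, diag, subdiag
ab[0, 1:] = -1.0 / h**2
ab[1, :] = 2.0 / h**2
ab[2, :-1] = -1.0 / h**2

u = solve_banded((1, 1), ab, np.pi**2 * np.sin(np.pi * x))
err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err)                        # second-order accurate, O(h^2)
```

A difference-quotient discretization of the transfer equation yields the same kind of narrow-band coefficient matrix, just with different entries.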

  19. Characterisation of large catastrophic landslides using an integrated field, remote sensing and numerical modelling approach

    OpenAIRE

    Wolter, Andrea Elaine

    2014-01-01

    I apply a forensic, multidisciplinary approach that integrates engineering geology field investigations, engineering geomorphology mapping, long-range terrestrial photogrammetry, and a numerical modelling toolbox to two large rock slope failures to study their causes, initiation, kinematics, and dynamics. I demonstrate the significance of endogenic and exogenic processes, both separately and in concert, in contributing to landscape evolution and conditioning slopes for failure, and use geomor...

  20. Numerical integration of electromagnetic cascade equations, discussion of results for air, copper, iron, and lead

    International Nuclear Information System (INIS)

    Adler, A.; Fuchs, B.; Thielheim, K.O.

    1977-01-01

    The longitudinal development of electromagnetic cascades in air, copper, iron, and lead is studied on the basis of results recently derived by numerical integration of the cascade equations, applying rather accurate expressions for the cross-sections involved in the interactions of high-energy electrons, positrons, and photons in electromagnetic cascades. Special attention is given to the scaling properties of transition curves. It is demonstrated that good scaling may be achieved by means of the depth of maximum cascade development. (author)

  1. Integrating stations from the North America Gravity Database into a local GPS-based land gravity survey

    Science.gov (United States)

    Shoberg, Thomas G.; Stoddard, Paul R.

    2013-01-01

    The ability to augment local gravity surveys with additional gravity stations from easily accessible national databases can greatly increase the areal coverage and spatial resolution of a survey. It is, however, necessary to integrate such data seamlessly with the local survey. One challenge to overcome in integrating data from national databases is that these data are typically of unknown quality. This study presents a procedure for the evaluation and seamless integration of gravity data of unknown quality from a national database with data from a local Global Positioning System (GPS)-based survey. The starting components include the latitude, longitude, elevation and observed gravity at each station location. Interpolated surfaces of the complete Bouguer anomaly are used as a means of quality control and comparison. The result is an integrated dataset of varying quality with many stations having GPS accuracy and other reliable stations of unknown origin, yielding a wider coverage and greater spatial resolution than either survey alone.
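Starting from the four stored components (latitude, longitude, elevation, observed gravity), a simple Bouguer anomaly can be computed as the basis for the comparison surfaces. The sketch below uses the textbook GRS80 normal gravity, free-air, and infinite-slab corrections; the report's exact reduction procedure and constants may differ.

```python
import numpy as np

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, elev_m, density=2670.0):
    """Simple Bouguer anomaly (mGal) from station latitude, elevation and gravity.

    GRS80 normal gravity (Somigliana closed form), standard free-air gradient
    of 0.3086 mGal/m and an infinite-slab correction; terrain is ignored.
    """
    s2 = np.sin(np.radians(lat_deg)) ** 2
    g_norm = 978032.67715 * (1.0 + 0.00193185138639 * s2) / np.sqrt(
        1.0 - 0.00669437999013 * s2)
    free_air = 0.3086 * elev_m                                # mGal
    slab = 2.0 * np.pi * 6.674e-11 * density * elev_m * 1e5   # 2*pi*G*rho*h, mGal
    return g_obs_mgal - g_norm + free_air - slab

# elevation sensitivity: +100 m shifts the anomaly by 30.86 - 11.20 = 19.66 mGal,
# which is why elevation errors in stations of unknown quality matter
diff = simple_bouguer_anomaly(980000.0, 40.0, 100.0) \
     - simple_bouguer_anomaly(980000.0, 40.0, 0.0)
print(diff)
```

Interpolating this anomaly over the combined station set, as the study does, then provides a smooth surface against which stations of unknown quality can be screened for outliers.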

  2. Whistleblowing: An integrative literature review of data-based studies involving nurses.

    Science.gov (United States)

    Jackson, Debra; Hickman, Louise D; Hutchinson, Marie; Andrew, Sharon; Smith, James; Potgieter, Ingrid; Cleary, Michelle; Peters, Kath

    2014-01-01

Aim: To summarise and critique the research literature about whistleblowing and nurses. Whistleblowing is identified as a crucial issue in the maintenance of healthcare standards, and nurses are frequently involved in whistleblowing events. Despite the importance of this issue, to our knowledge an evaluation of this body of data-based literature has not been undertaken. An integrative literature review approach was used to summarise and critique the research literature. A comprehensive search of five databases, including Medline, CINAHL, PubMed, and Health Science: Nursing/Academic Edition, together with Google, was conducted using terms including 'whistleblow*' and 'nurs*'. In addition, relevant journals were examined, as well as the reference lists of retrieved papers. Papers published during the years 2007-2013 were selected for inclusion. Fifteen papers were identified, capturing data from nurses in seven countries. The findings in this review demonstrate a growing body of research calling for the nursing profession at large to engage and respond appropriately to issues involving suboptimal patient care or organisational wrongdoing. Nursing plays a key role in maintaining practice standards and in reporting care that is unacceptable, although the repercussions for nurses who raise concerns can be severe. Overall, whistleblowing and how it influences the individual, their family, work colleagues, nursing practice, and policy requires further national and international research attention.

  3. Bio-optical data integration based on a 4 D database system approach

    Science.gov (United States)

    Imai, N. N.; Shimabukuro, M. H.; Carmo, A. F. C.; Alcantara, E. H.; Rodrigues, T. W. P.; Watanabe, F. S. Y.

    2015-04-01

Bio-optical characterization of water bodies requires spatio-temporal data about inherent optical properties and apparent optical properties, which allow the comprehension of the underwater light field and support the development of models for monitoring water quality. Measurements are taken to represent the optical properties along a column of water, so the spectral data must be related to depth. However, the spatial positions of the measurements may differ because the collecting instruments vary, and the records need not refer to the same wavelengths. An additional difficulty is that distinct instruments store data in different formats. A data integration approach is needed to make these large, multi-source data sets suitable for analysis, so that semi-empirical models can be evaluated, even automatically, preceded by preliminary quality-control tasks. This work presents a solution for the stated scenario based on a spatial (geographic) database approach, adopting an object-relational Database Management System (DBMS) able to represent all data collected in the field together with data obtained by laboratory analysis and remote sensing images taken at the time of field data collection. This integration approach leads to a 4D representation, since the coordinate system includes the 3D spatial coordinates (planimetric and depth) and the time at which each datum was taken. The PostgreSQL DBMS, extended by the PostGIS module to manage spatial/geospatial data, was adopted, and a prototype was developed that provides the main tools an analyst needs to prepare the data sets for analysis.
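One of the integration steps implied above, relating records that need not refer to the same wavelengths, amounts to resampling each instrument's spectrum onto a shared wavelength grid. A minimal sketch, with an illustrative function and grid that are not part of the described prototype:

```python
def resample_spectrum(wavelengths, values, target_grid):
    """Linearly interpolate a measured spectrum onto a common wavelength grid.
    Wavelengths are assumed sorted ascending; targets outside the measured
    range are returned as None rather than extrapolated."""
    out = []
    for t in target_grid:
        if t < wavelengths[0] or t > wavelengths[-1]:
            out.append(None)
            continue
        for i in range(len(wavelengths) - 1):
            if wavelengths[i] <= t <= wavelengths[i + 1]:
                w0, w1 = wavelengths[i], wavelengths[i + 1]
                v0, v1 = values[i], values[i + 1]
                frac = 0.0 if w1 == w0 else (t - w0) / (w1 - w0)
                out.append(v0 + frac * (v1 - v0))
                break
    return out
```

In a database-backed workflow this resampling would typically run as a view or stored procedure over the 4D records rather than in client code.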

  4. Development of SRS.php, a Simple Object Access Protocol-based library for data acquisition from integrated biological databases.

    Science.gov (United States)

    Barbosa-Silva, A; Pafilis, E; Ortega, J M; Schneider, R

    2007-12-11

Data integration has become an important task for biological database providers. The current model for data exchange among different sources simplifies the way users access distinct information. The evolution of data representation from HTML to XML enabled programs, instead of humans, to interact with biological databases. We present here SRS.php, a PHP library that can interact with the Sequence Retrieval System (SRS) data integration platform. The library has been written using SOAP definitions and permits programmatic communication with SRS through web services, invoking the methods described in WSDL by exchanging XML messages. The functions currently available in the library have been built to access specific data stored in any of 90 different databases (such as UNIPROT, KEGG and GO) using the same query syntax. Including the described functions in PHP scripts enables them to act as web-service clients to the SRS server. The functions permit one to query the whole content of any SRS database, to list specific records in these databases, to get specific fields from the records, and to link any record between any pair of linked databases. The case study presented exemplifies the use of the library to retrieve information from the registries of a Plant Defense Mechanisms database. This database is currently being developed, and SRS.php is proposed as the means of data acquisition for the warehousing tasks related to its setup and maintenance.

  5. Integration of published information into a resistance-associated mutation database for Mycobacterium tuberculosis.

    Science.gov (United States)

    Salamon, Hugh; Yamaguchi, Ken D; Cirillo, Daniela M; Miotto, Paolo; Schito, Marco; Posey, James; Starks, Angela M; Niemann, Stefan; Alland, David; Hanna, Debra; Aviles, Enrique; Perkins, Mark D; Dolinger, David L

    2015-04-01

    Tuberculosis remains a major global public health challenge. Although incidence is decreasing, the proportion of drug-resistant cases is increasing. Technical and operational complexities prevent Mycobacterium tuberculosis drug susceptibility phenotyping in the vast majority of new and retreatment cases. The advent of molecular technologies provides an opportunity to obtain results rapidly as compared to phenotypic culture. However, correlations between genetic mutations and resistance to multiple drugs have not been systematically evaluated. Molecular testing of M. tuberculosis sampled from a typical patient continues to provide a partial picture of drug resistance. A database of phenotypic and genotypic testing results, especially where prospectively collected, could document statistically significant associations and may reveal new, predictive molecular patterns. We examine the feasibility of integrating existing molecular and phenotypic drug susceptibility data to identify associations observed across multiple studies and demonstrate potential for well-integrated M. tuberculosis mutation data to reveal actionable findings. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
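The statistically significant associations such a database could document are typically summarized, for a single mutation and drug, by a 2×2 contingency table of genotype versus phenotypic resistance. A hedged sketch of the basic measure; the counts and the Haldane-Anscombe correction are illustrative choices, not the paper's method:

```python
import math

def odds_ratio_ci(mut_res, mut_sus, wt_res, wt_sus, z=1.96):
    """Odds ratio and approximate 95% Wald confidence interval for the
    association between carrying a mutation and phenotypic resistance.
    A 0.5 Haldane-Anscombe correction is applied if any cell is zero."""
    cells = [mut_res, mut_sus, wt_res, wt_sus]
    if 0 in cells:
        cells = [c + 0.5 for c in cells]
    a, b, c, d = cells
    or_ = (a * d) / (b * c)                       # cross-product ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

A prospectively collected database would let such estimates be pooled across studies with appropriate heterogeneity checks.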

  6. System/subsystem specifications for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB)

    Energy Technology Data Exchange (ETDEWEB)

    Rollow, J.P.; Shipe, P.C.; Truett, L.F. [Oak Ridge National Lab., TN (United States); Faby, E.Z.; Fluker, J.; Grubb, J.; Hancock, B.R. [Univ. of Tennessee, Knoxville, TN (United States); Ferguson, R.A. [Science Applications International Corp., Oak Ridge, TN (United States)

    1995-11-20

    A system is being developed by the Military Traffic Management Command (MTMC) to provide data integration and worldwide management and tracking of surface cargo movements. The Integrated Cargo Database (ICDB) will be a data repository for the WPS terminal-level system, will be a primary source of queries and cargo traffic reports, will receive data from and provide data to other MTMC and non-MTMC systems, will provide capabilities for processing Advance Transportation Control and Movement Documents (ATCMDs), and will process and distribute manifests. This System/Subsystem Specifications for the Worldwide Port System Regional ICDB documents the system/subsystem functions, provides details of the system/subsystem analysis in order to provide a communication link between developers and operational personnel, and identifies interfaces with other systems and subsystems. It must be noted that this report is being produced near the end of the initial development phase of ICDB, while formal software testing is being done. Following the initial implementation of the ICDB system, maintenance contractors will be in charge of making changes and enhancing software modules. Formal testing and user reviews may indicate the need for additional software units or changes to existing ones. This report describes the software units that are components of this ICDB system as of August 1995.

  7. Formulations by surface integral equations for numerical simulation of non-destructive testing by eddy currents

    International Nuclear Information System (INIS)

    Vigneron, Audrey

    2015-01-01

    The thesis addresses the numerical simulation of non-destructive testing (NDT) using eddy currents, and more precisely the computation of induced electromagnetic fields by a transmitter sensor in a healthy part. This calculation is the first step of the modeling of a complete control process in the CIVA software platform developed at CEA LIST. Currently, models integrated in CIVA are restricted to canonical (modal computation) or axially-symmetric geometries. The need for more diverse and complex configurations requires the introduction of new numerical modeling tools. In practice the sensor may be composed of elements with different shapes and physical properties. The inspected parts are conductive and may contain dielectric or magnetic elements. Due to the cohabitation of different materials in one configuration, different regimes (static, quasi-static or dynamic) may coexist. Under the assumption of linear, isotropic and piecewise homogeneous material properties, the surface integral equation (SIE) approach allows to reduce a volume-based problem to an equivalent surface-based problem. However, the usual SIE formulations for the Maxwell's problem generally suffer from numerical noise in asymptotic situations, and especially at low frequencies. The objective of this study is to determine a version that is stable for a range of physical parameters typical of eddy-current NDT applications. In this context, a block-iterative scheme based on a physical decomposition is proposed for the computation of primary fields. This scheme is accurate and well-conditioned. An asymptotic study of the integral Maxwell's problem at low frequencies is also performed, allowing to establish the eddy-current integral problem as an asymptotic case of the corresponding Maxwell problem. (author) [fr

  8. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    Science.gov (United States)

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and SDBMSs can save the time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach, which is one of the main obstacles in GISs to using the map products of photogrammetric workstations. By means of these integrated systems, it is also possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated, and different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) is presented.

  9. An Integrated Photogrammetric and Spatial Database Management System for Producing Fully Structured Data Using Aerial and Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Farshid Farnood Ahmadi

    2009-03-01

Full Text Available 3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and SDBMSs can save the time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach, which is one of the main obstacles in GISs to using the map products of photogrammetric workstations. By means of these integrated systems, it is also possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated, and different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) is presented.

  10. PGSB/MIPS PlantsDB Database Framework for the Integration and Analysis of Plant Genome Data.

    Science.gov (United States)

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai; Gundlach, Heidrun; Mayer, Klaus F X

    2017-01-01

    Plant Genome and Systems Biology (PGSB), formerly Munich Institute for Protein Sequences (MIPS) PlantsDB, is a database framework for the integration and analysis of plant genome data, developed and maintained for more than a decade now. Major components of that framework are genome databases and analysis resources focusing on individual (reference) genomes providing flexible and intuitive access to data. Another main focus is the integration of genomes from both model and crop plants to form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny). Data exchange and integrated search functionality with/over many plant genome databases is provided within the transPLANT project.

  11. Implementation of numerical integration schemes for the simulation of magnetic SMA constitutive response

    International Nuclear Information System (INIS)

    Kiefer, B; Bartel, T; Menzel, A

    2012-01-01

    Several constitutive models for magnetic shape memory alloys (MSMAs) have been proposed in the literature. The implementation of numerical integration schemes, which allow the prediction of constitutive response for general loading cases and ultimately the incorporation of MSMA response into numerical solution algorithms for fully coupled magneto-mechanical boundary value problems, however, has received only very limited attention. In this work, we establish two algorithmic implementations of the internal variable model for MSMAs proposed in (Kiefer and Lagoudas 2005 Phil. Mag. Spec. Issue: Recent Adv. Theor. Mech. 85 4289–329, Kiefer and Lagoudas 2009 J. Intell. Mater. Syst. 20 143–70), where we restrict our attention to pure martensitic variant reorientation to limit complexity. The first updating scheme is based on the numerical integration of the reorientation strain evolution equation and represents a classical predictor–corrector-type general return mapping algorithm. In the second approach, the inequality-constrained optimization problem associated with internal variable evolution is converted into an unconstrained problem via Fischer–Burmeister complementarity functions and then iteratively solved in standard Newton–Raphson format. Simulations are verified by comparison to closed-form solutions for experimentally relevant loading cases. (paper)
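The first updating scheme above follows the classical elastic-predictor/inelastic-corrector (return-mapping) pattern. That pattern is easiest to see in a deliberately simpler setting: the sketch below applies the same predictor-corrector logic to 1D perfect plasticity rather than to the MSMA reorientation model itself, and the material constants are illustrative:

```python
import math

def return_map_1d(eps, eps_p, E=200e3, sig_y=250.0):
    """One elastic-predictor / plastic-corrector (return-mapping) update for
    1D perfect plasticity. Stress in MPa, E in MPa. Returns (stress, eps_p)."""
    sig_trial = E * (eps - eps_p)          # elastic predictor: internal variable frozen
    f = abs(sig_trial) - sig_y             # trial yield function
    if f <= 0.0:
        return sig_trial, eps_p            # admissible: predictor accepted
    dgamma = f / E                         # corrector: enforce consistency f = 0
    sign = math.copysign(1.0, sig_trial)
    eps_p += dgamma * sign                 # internal-variable (plastic strain) update
    return sig_trial - E * dgamma * sign, eps_p
```

In the MSMA setting the internal variable is the reorientation strain and the "yield" surface is the reorientation function, but the predictor-corrector structure is the same.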

  12. Analysis and databasing software for integrated tomographic gamma scanner (TGS) and passive-active neutron (PAN) assay systems

    International Nuclear Information System (INIS)

    Estep, R.J.; Melton, S.G.; Buenafe, C.

    2000-01-01

The CTEN-FIT program, written in C++ for Windows 9x/NT, performs databasing and analysis of combined thermal/epithermal neutron (CTEN) passive and active neutron assay data and integrates them with isotopics results and gamma-ray data from methods such as tomographic gamma scanning (TGS). The binary database is reflected in a companion Excel database that allows extensive customization via Visual Basic for Applications macros. Automated analysis options make the analysis of the data transparent to the assay system operator. Various record browsers and information displays simplify record-keeping tasks.

  13. Numerical Modeling of an Integrated Vehicle Fluids System Loop for Pressurizing a Cryogenic Tank

    Science.gov (United States)

    LeClair, A. C.; Hedayat, A.; Majumdar, A. K.

    2017-01-01

This paper presents a numerical model of the pressurization loop of the Integrated Vehicle Fluids (IVF) system using the Generalized Fluid System Simulation Program (GFSSP). The IVF propulsion system, being developed by United Launch Alliance to reduce system weight and enhance reliability, uses boiloff propellants to drive thrusters for the reaction control system as well as to run internal combustion engines that develop power and drive compressors to pressurize propellant tanks. NASA Marshall Space Flight Center (MSFC) conducted tests to verify the functioning of the IVF system using a flight-like tank. GFSSP, a finite-volume-based flow network analysis software developed at MSFC, has been used to support the test program. This paper presents the simulation of three different test series, a comparison of numerical predictions with test data, and a novel method of presenting the data in dimensionless form. It also presents a methodology for implementing a compressor map in a system-level code.

  14. Evaluation of wave power by integrating numerical models and measures at the Port of Civitavecchia

    International Nuclear Information System (INIS)

    Paladini de Mendoza, Francesco; Bonamano, Simone; Carli, Filippo Maria; Marcelli, Marco; Danelli, Andrea; Peviani, Maximo Aurelio; Burgio, Calogero

    2015-01-01

An assessment of the available wave power at regional and local scales was carried out. Two hot spots of higher wave power were identified and characterized along the coastline of the northern Latium Region, near the 'Torre Valdaliga' power plant and in proximity to Civitavecchia's breakwater, where the presence of a harbour and an electric power plant allows wave energy exploitation. The evaluation process was implemented through measurements and through numerical model assessment and validation. The integration of wave gauge measurements with numerical simulations made it possible to estimate the wave power over an extended nearshore area. A downscaling process allowed proceeding from the regional to the local scale, providing increased resolution thanks to a highly detailed bathymetry.
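For orientation, the wave power level at such hot spots is commonly estimated from the significant wave height Hs and the energy period Te with the standard deep-water expression P = ρg²Hs²Te/(64π). The sketch below uses this textbook formula with illustrative values, not the paper's model chain:

```python
import math

RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

def wave_power_deep_water(hs, te):
    """Wave energy flux per metre of wave crest (W/m) in deep water:
    P = rho * g^2 * Hs^2 * Te / (64 * pi)."""
    return RHO_SEAWATER * G**2 * hs**2 * te / (64 * math.pi)
```

For Hs = 2 m and Te = 8 s this gives roughly 15.7 kW per metre of crest, the order of magnitude at which exploitation becomes interesting.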

  15. On the numerical evaluation of algebro-geometric solutions to integrable equations

    International Nuclear Information System (INIS)

    Kalla, C; Klein, C

    2012-01-01

    Physically meaningful periodic solutions to certain integrable partial differential equations are given in terms of multi-dimensional theta functions associated with real Riemann surfaces. Typical analytical problems in the numerical evaluation of these solutions are studied. In the case of hyperelliptic surfaces efficient algorithms exist even for almost degenerate surfaces. This allows the numerical study of solitonic limits. For general real Riemann surfaces, the choice of a homology basis adapted to the anti-holomorphic involution is important for a convenient formulation of the solutions and smoothness conditions. Since existing algorithms for algebraic curves produce a homology basis not related to automorphisms of the curve, we study symplectic transformations to an adapted basis and give explicit formulae for M-curves. As examples we discuss solutions of the Davey–Stewartson and the multi-component nonlinear Schrödinger equations
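The multi-dimensional theta functions mentioned above reduce, in genus one, to the rapidly convergent series θ(z|τ) = Σₙ exp(iπτn² + 2πinz), whose direct evaluation illustrates why truncation is harmless away from degenerate surfaces. A minimal sketch, with an illustrative truncation bound N:

```python
import cmath
import math

def theta(z, tau, N=30):
    """Truncated series for the genus-one theta function
    theta(z|tau) = sum_{n in Z} exp(i*pi*tau*n^2 + 2*pi*i*n*z), Im(tau) > 0.
    Terms decay like |q|^(n^2) with q = exp(i*pi*tau), so modest N suffices
    unless the surface degenerates (Im(tau) -> 0)."""
    return sum(cmath.exp(1j * math.pi * tau * n * n + 2j * math.pi * n * z)
               for n in range(-N, N + 1))
```

At z = 0, τ = i the sum reproduces the classical value π^(1/4)/Γ(3/4) to machine precision, a convenient correctness check.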

  16. Integrated numerical platforms for environmental dose assessments of large tritium inventory facilities

    International Nuclear Information System (INIS)

    Castro, P.; Ardao, J.; Velarde, M.; Sedano, L.; Xiberta, J.

    2013-01-01

In the prospect of a new scenario of large-inventory tritium facilities (KATRIN at TLK, CANDUs, ITER, EAST, and others to come), the dosimetric limits prescribed by ICRP-60 for tritium committed doses are under discussion; in parallel, the highly conservative assessments need to be surmounted by refining dosimetric assessments in many respects. Precise Lagrangian computations of the dosimetric cloud evolution after standardized (normal/incidental/SBO) tritium cloud emissions are today numerically open to a perfect match with real-time meteorological data and pattern data at diverse scales, for prompt/early and chronic tritium dose assessments. This paper describes the trends towards integrated numerical platforms, under development, for environmental dose assessments of large-inventory tritium facilities.

  17. High-accuracy numerical integration of charged particle motion – with application to ponderomotive force

    International Nuclear Information System (INIS)

    Furukawa, Masaru; Ohkawa, Yushiro; Matsuyama, Akinobu

    2016-01-01

A high-accuracy numerical integration algorithm for charged particle motion is developed. The algorithm is based on Hamiltonian mechanics and operator decomposition. It is made time-reversal symmetric, and its order of accuracy can be increased to any order by using a recurrence formula. One of its advantages is that it is an explicit method. An effective way to decompose the time evolution operator is examined: the Poisson tensor is decomposed and non-canonical variables are adopted. The algorithm is extended to the case of time-dependent fields by introducing the extended phase space. Numerical tests showing the performance of the algorithm are presented: one is pure cyclotron motion over a long time period, and the other is a charged particle motion in a rapidly oscillating field. (author)
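The flavour of a time-reversal-symmetric, explicit splitting scheme can be shown for planar cyclotron motion in a uniform magnetic field: a half drift, an exact rotation of the velocity by the cyclotron frequency, and a second half drift (a Strang-type composition; this is a generic sketch, not the authors' algorithm):

```python
import math

def step_cyclotron(x, y, vx, vy, omega, dt):
    """One time-reversal-symmetric step: half drift, exact velocity rotation
    by omega*dt (omega = qB/m, B along z), half drift. The speed |v| is
    preserved exactly by the rotation, so energy does not drift."""
    x += 0.5 * dt * vx
    y += 0.5 * dt * vy
    c, s = math.cos(omega * dt), math.sin(omega * dt)
    vx, vy = c * vx + s * vy, -s * vx + c * vy
    x += 0.5 * dt * vx
    y += 0.5 * dt * vy
    return x, y, vx, vy
```

Stepping forward with dt and then backward with -dt recovers the initial state to round-off, which is the defining property of a time-reversal-symmetric integrator.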

  18. Numerical evaluation of Feynman loop integrals by reduction to tree graphs

    International Nuclear Information System (INIS)

    Kleinschmidt, T.

    2007-12-01

    We present a method for the numerical evaluation of loop integrals, based on the Feynman Tree Theorem. This states that loop graphs can be expressed as a sum of tree graphs with additional external on-shell particles. The original loop integral is replaced by a phase space integration over the additional particles. In cross section calculations and for event generation, this phase space can be sampled simultaneously with the phase space of the original external particles. Since very sophisticated matrix element generators for tree graph amplitudes exist and phase space integrations are generically well understood, this method is suited for a future implementation in a fully automated Monte Carlo event generator. A scheme for renormalization and regularization is presented. We show the construction of subtraction graphs which cancel ultraviolet divergences and present a method to cancel internal on-shell singularities. Real emission graphs can be naturally included in the phase space integral of the additional on-shell particles to cancel infrared divergences. As a proof of concept, we apply this method to NLO Bhabha scattering in QED. Cross sections are calculated and are in agreement with results from conventional methods. We also construct a Monte Carlo event generator and present results. (orig.)

  19. Numerical evaluation of Feynman loop integrals by reduction to tree graphs

    Energy Technology Data Exchange (ETDEWEB)

    Kleinschmidt, T.

    2007-12-15

    We present a method for the numerical evaluation of loop integrals, based on the Feynman Tree Theorem. This states that loop graphs can be expressed as a sum of tree graphs with additional external on-shell particles. The original loop integral is replaced by a phase space integration over the additional particles. In cross section calculations and for event generation, this phase space can be sampled simultaneously with the phase space of the original external particles. Since very sophisticated matrix element generators for tree graph amplitudes exist and phase space integrations are generically well understood, this method is suited for a future implementation in a fully automated Monte Carlo event generator. A scheme for renormalization and regularization is presented. We show the construction of subtraction graphs which cancel ultraviolet divergences and present a method to cancel internal on-shell singularities. Real emission graphs can be naturally included in the phase space integral of the additional on-shell particles to cancel infrared divergences. As a proof of concept, we apply this method to NLO Bhabha scattering in QED. Cross sections are calculated and are in agreement with results from conventional methods. We also construct a Monte Carlo event generator and present results. (orig.)

  20. Numerical Integration Techniques for Curved-Element Discretizations of Molecule–Solvent Interfaces

    Science.gov (United States)

    Bardhan, Jaydeep P.; Altman, Michael D.; Willis, David J.; Lippow, Shaun M.; Tidor, Bruce; White, Jacob K.

    2012-01-01

Surface formulations of biophysical modeling problems offer attractive theoretical and computational properties. Numerical simulations based on these formulations usually begin with discretization of the surface under consideration; often, the surface is curved, possessing complicated structure and possibly singularities. Numerical simulations commonly are based on approximate, rather than exact, discretizations of these surfaces. To assess the strength of the dependence of simulation accuracy on the fidelity of surface representation, we have developed methods to model several important surface formulations using exact surface discretizations. Following and refining Zauhar's work (J. Comp.-Aid. Mol. Des. 9:149-159, 1995), we define two classes of curved elements that can exactly discretize the van der Waals, solvent-accessible, and solvent-excluded (molecular) surfaces. We then present numerical integration techniques that can accurately evaluate nonsingular and singular integrals over these curved surfaces. After validating the exactness of the surface discretizations and demonstrating the correctness of the presented integration methods, we present a set of calculations that compare the accuracy of approximate, planar-triangle-based discretizations and exact, curved-element-based simulations of surface-generalized-Born (sGB), surface-continuum van der Waals (scvdW), and boundary-element method (BEM) electrostatics problems. Results demonstrate that continuum electrostatic calculations with BEM using curved elements, piecewise-constant basis functions, and centroid collocation are nearly ten times more accurate than planar-triangle BEM for basis sets of comparable size. The sGB and scvdW calculations give exceptional accuracy even for the coarsest obtainable discretized surfaces. The extra accuracy is attributed to the exact representation of the solute-solvent interface; in contrast, commonly used planar-triangle discretizations can only offer improved accuracy through mesh refinement.
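The accuracy gap between curved and planar elements can be illustrated on a toy patch: integrating the area element of one octant of the unit sphere on its curved parameterization converges to the exact value π/2, while the single planar triangle through the octant's corners is off by almost half. This is only a cartoon of the effect, not the paper's element construction:

```python
import math

def sphere_octant_area(n):
    """Integrate dA = sin(theta) dtheta dphi over one octant of the unit
    sphere with an n-by-n midpoint rule on the curved parameterization.
    The integrand is phi-independent; the 2D loop is kept for clarity.
    Exact value: pi/2."""
    h = (math.pi / 2) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * h
        for j in range(n):
            total += math.sin(theta) * h * h
    return total

def flat_triangle_area():
    """Area of the planar triangle through the octant corners (1,0,0),
    (0,1,0), (0,0,1), standing in for a single planar element."""
    return math.sqrt(3) / 2
```

Refining n drives the curved-parameterization quadrature to the exact area, while the planar stand-in carries a fixed geometric error no basis refinement can remove.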

  1. Fundamental aspects of the integration of seismic monitoring with numerical modelling.

    CSIR Research Space (South Africa)

    Mendecki, AJ

    2001-06-01

Full Text Available … of the physical state of the rock-mass. It must be equipped with the capability of converting the parameters of a real seismic event into a corresponding model-compatible input, in the form of an additional loading on the rock-mass. It must allow for an unambiguous identification and quantification of 'seismic events' among the model-generated data. Structure of an integrated numerical model: the functionality interrelations between the different components of a software package designed to implement…

  2. Numerical Integration of a Class of Singularly Perturbed Delay Differential Equations with Small Shift

    Directory of Open Access Journals (Sweden)

    Gemechis File

    2012-01-01

Full Text Available We present a numerical integration method to solve a class of singularly perturbed delay differential equations with small shift. First, we replace the second-order singularly perturbed delay differential equation by an asymptotically equivalent first-order delay differential equation. Then, Simpson's rule and linear interpolation are employed to obtain a three-term recurrence relation, which is solved easily by the discrete invariant imbedding algorithm. The method is demonstrated by implementing it on several linear and nonlinear model examples, taking various values of the delay and perturbation parameters.
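The Simpson's-rule building block used above is the standard composite 1/3 rule; a minimal sketch:

```python
def simpson(f, a, b, n):
    """Composite Simpson's 1/3 rule over [a, b] with n (even) subintervals.
    Interior nodes alternate weights 4 and 2; accuracy is O(h^4)."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3
```

In the paper this rule is applied to the integral form of the first-order delay equation rather than to a plain definite integral, but the quadrature weights are the same.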

  3. Numerical Modeling of Pressurization of Cryogenic Propellant Tank for Integrated Vehicle Fluid System

    Science.gov (United States)

    Majumdar, Alok K.; LeClair, Andre C.; Hedayat, Ali

    2016-01-01

    This paper presents a numerical model of pressurization of a cryogenic propellant tank for the Integrated Vehicle Fluid (IVF) system using the Generalized Fluid System Simulation Program (GFSSP). The IVF propulsion system, being developed by United Launch Alliance, uses boiloff propellants to drive thrusters for the reaction control system as well as to run internal combustion engines to develop power and drive compressors to pressurize propellant tanks. NASA Marshall Space Flight Center (MSFC) has been running tests to verify the functioning of the IVF system using a flight tank. GFSSP, a finite volume based flow network analysis software developed at MSFC, has been used to develop an integrated model of the tank and the pressurization system. This paper presents an iterative algorithm for converging the interface boundary conditions between different component models of a large system model. The model results have been compared with test data.
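An iterative algorithm for converging interface boundary conditions between component models can be sketched generically as an under-relaxed fixed-point iteration on the interface variable. The toy component models and the relaxation factor below are illustrative assumptions, not the GFSSP scheme:

```python
def couple_interface(model_a, model_b, p0, tol=1e-10, max_iter=200, relax=0.5):
    """Generic interface coupling: model_a maps the interface pressure to a
    flow, model_b maps that flow back to an updated interface pressure;
    iterate with under-relaxation until successive pressures agree."""
    p = p0
    for _ in range(max_iter):
        q = model_a(p)
        p_new = model_b(q)
        if abs(p_new - p) < tol:
            return p_new
        p = p + relax * (p_new - p)   # under-relaxation for robustness
    raise RuntimeError("interface iteration did not converge")

# Illustrative linear component models with fixed point p = 1.5
model_a = lambda p: 2.0 - 0.5 * p     # pressure -> flow
model_b = lambda q: 1.0 + 0.4 * q     # flow -> updated pressure
```

Under-relaxation trades iteration count for stability, which matters when the component models are themselves nonlinear solvers.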

  4. Calculations of the electromechanical transfer processes using implicit methods of numerical integration

    Energy Technology Data Exchange (ETDEWEB)

    Pogosyan, T A

    1983-01-01

The article is dedicated to the solution of the systems of differential equations that describe transfer processes in an electric power system (EES) by implicit methods of numerical integration. The distinguishing feature of the implicit methods (the backward Euler method and the trapezoidal rule) is their absolute stability and, consequently, the relatively small accumulation of error at each step of integration. They are therefore very convenient for solving problems of electric power engineering, in which the transfer processes are described by a stiff system of differential equations; the stiffness is associated with the range of the time constants considered. The advantage of the implicit methods over explicit ones is shown in a specific example (a calculation of the dynamic stability of the simplest electric power system), along with the field of use of the implicit methods and the expedience of their use in power engineering problems.
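The absolute stability that distinguishes the implicit methods is easy to demonstrate on the stiff test equation y' = -λy: with λh well above the explicit stability limit, the explicit Euler update blows up while the backward Euler update decays regardless of step size. A minimal sketch with illustrative test values:

```python
def explicit_euler(lam, y0, dt, steps):
    """Explicit Euler for y' = -lam*y: stable only for lam*dt < 2."""
    y = y0
    for _ in range(steps):
        y = y + dt * (-lam * y)       # y_{n+1} = y_n + h f(y_n)
    return y

def implicit_euler(lam, y0, dt, steps):
    """Backward Euler for y' = -lam*y: unconditionally stable.
    Solving y_{n+1} = y_n + h f(y_{n+1}) gives y_{n+1} = y_n / (1 + lam*h)."""
    y = y0
    for _ in range(steps):
        y = y / (1 + lam * dt)
    return y
```

With lam = 1000 and dt = 0.01 (so λh = 10), the explicit iterate grows by a factor of 9 per step while the implicit one shrinks by a factor of 11, mirroring the error-accumulation argument above.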

  5. Numerical Solution of Nonlinear Volterra Integral Equations System Using Simpson’s 3/8 Rule

    Directory of Open Access Journals (Sweden)

    Adem Kılıçman

    2012-01-01

    Full Text Available The Simpson’s 3/8 rule is used to solve the nonlinear Volterra integral equations system. Using this rule the system is converted to a nonlinear block system, and by solving this nonlinear system we find an approximate solution of the nonlinear Volterra integral equations system. One of the advantages of the proposed method is its simplicity in application. Further, we investigate the convergence of the proposed method and show that it is of order O(h⁴). Numerical examples are given to show the ability of the proposed method to solve linear as well as nonlinear systems. Our results show that the proposed method is simple and effective.
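The discretize-and-solve idea behind such quadrature methods can be sketched as follows. This illustrative example is not the paper's scheme: it uses the simpler trapezoidal rule instead of Simpson's 3/8 rule, but the mechanics are the same, namely that the quadrature turns a Volterra equation of the second kind into a triangular system solved one unknown at a time.

```python
import math

# Quadrature discretisation of a Volterra integral equation of the second
# kind, y(t) = f(t) + ∫_0^t K(t,s) y(s) ds, here with the trapezoidal rule.
# At each grid point t_m the unknown y_m appears on both sides, giving a
# scalar equation solved in closed form; the y_j for j < m are already known.

def volterra_trapezoid(f, K, T, n):
    h = T / n
    t = [i * h for i in range(n + 1)]
    y = [f(t[0])]                          # y(0) = f(0): the integral is empty
    for m in range(1, n + 1):
        # trapezoid weights: h/2 at the endpoints s = 0 and s = t_m, h inside
        s = 0.5 * K(t[m], t[0]) * y[0]
        s += sum(K(t[m], t[j]) * y[j] for j in range(1, m))
        y_m = (f(t[m]) + h * s) / (1.0 - 0.5 * h * K(t[m], t[m]))
        y.append(y_m)
    return t, y

# y(t) = 1 + ∫_0^t y(s) ds has the exact solution y = e^t
t, y = volterra_trapezoid(lambda t: 1.0, lambda t, s: 1.0, T=1.0, n=100)
print(y[-1], math.e)   # ≈ 2.71830 vs 2.71828 (trapezoid is O(h²))
```

Replacing the trapezoid weights with Simpson 3/8 block weights raises the order to the O(h⁴) convergence reported in the abstract, at the cost of solving small nonlinear blocks of unknowns when the equation is nonlinear.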

  6. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases.

    Science.gov (United States)

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-07-01

    Global cloud frameworks for bioinformatics research databases become huge and heterogeneous; solutions face various diametric challenges comprising cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN published 192 mammalian, plant and protein life sciences databases having 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely under the control of programming languages popularly used by bioinformaticians such as Perl and Ruby. Researchers successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents like ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.

  7. OECD/NEA data bank scientific and integral experiments databases in support of knowledge preservation and transfer

    International Nuclear Information System (INIS)

    Sartori, E.; Kodeli, I.; Mompean, F.J.; Briggs, J.B.; Gado, J.; Hasegawa, A.; D'hondt, P.; Wiesenack, W.; Zaetta, A.

    2004-01-01

    The OECD/Nuclear Energy Data Bank was established by its member countries as an institution to allow effective sharing of knowledge and its basic underlying information and data in key areas of nuclear science and technology. The activities as regards preserving and transferring knowledge consist of the following: 1) Acquisition of basic nuclear data, computer codes and experimental system data needed over a wide range of nuclear and radiation applications; 2) Independent verification and validation of these data using quality assurance methods, adding value through international benchmark exercises, workshops and meetings and by issuing relevant reports with conclusions and recommendations, as well as by organising training courses to ensure their qualified and competent use; 3) Dissemination of the different products to authorised establishments in member countries and collecting and integrating user feedback. Of particular importance has been the establishment of basic and integral experiment databases and the methodology developed with the aim of knowledge preservation and transfer. Databases established thus far include: 1) IRPhE - International Reactor Physics Experimental Benchmarks Evaluations, 2) SINBAD - a radiation shielding experiments database (nuclear reactors, fusion neutronics and accelerators), 3) IFPE - International Fuel Performance Benchmark Experiments Database, 4) TDB - The Thermochemical Database Project, 5) ICSBE - International Nuclear Criticality Safety Benchmark Evaluations, 6) CCVM - CSNI Code Validation Matrix of Thermal-hydraulic Codes for LWR LOCA and Transients. This paper will concentrate on knowledge preservation and transfer concepts and methods related to some of the integral experiments and the TDB. (author)

  8. MIPS Arabidopsis thaliana Database (MAtDB): an integrated biological knowledge resource for plant genomics

    Science.gov (United States)

    Schoof, Heiko; Ernst, Rebecca; Nazarov, Vladimir; Pfeifer, Lukas; Mewes, Hans-Werner; Mayer, Klaus F. X.

    2004-01-01

    Arabidopsis thaliana is the most widely studied model plant. Functional genomics is intensively underway in many laboratories worldwide. Beyond the basic annotation of the primary sequence data, the annotated genetic elements of Arabidopsis must be linked to diverse biological data and higher order information such as metabolic or regulatory pathways. The MIPS Arabidopsis thaliana database MAtDB aims to provide a comprehensive resource for Arabidopsis as a genome model that serves as a primary reference for research in plants and is suitable for transfer of knowledge to other plants, especially crops. The genome sequence as a common backbone serves as a scaffold for the integration of data, while, in a complementary effort, these data are enhanced through the application of state-of-the-art bioinformatics tools. This information is visualized on a genome-wide and a gene-by-gene basis with access both for web users and applications. This report updates the information given in a previous report and provides an outlook on further developments. The MAtDB web interface can be accessed at http://mips.gsf.de/proj/thal/db. PMID:14681437

  9. ViralORFeome: an integrated database to generate a versatile collection of viral ORFs.

    Science.gov (United States)

    Pellet, J; Tafforeau, L; Lucas-Hourani, M; Navratil, V; Meyniel, L; Achaz, G; Guironnet-Paquet, A; Aublin-Gex, A; Caignard, G; Cassonnet, P; Chaboud, A; Chantier, T; Deloire, A; Demeret, C; Le Breton, M; Neveu, G; Jacotot, L; Vaglio, P; Delmotte, S; Gautier, C; Combet, C; Deleage, G; Favre, M; Tangy, F; Jacob, Y; Andre, P; Lotteau, V; Rabourdin-Combe, C; Vidalain, P O

    2010-01-01

    Large collections of protein-encoding open reading frames (ORFs) established in a versatile recombination-based cloning system have been instrumental in studying protein functions in high-throughput assays. Such 'ORFeome' resources have been developed for several organisms, but in virology, plasmid collections covering a significant fraction of the virosphere are still needed. In this perspective, we present ViralORFeome 1.0 (http://www.viralorfeome.com), an open-access database and management system that provides an integrated set of bioinformatic tools to clone viral ORFs in the Gateway® system. ViralORFeome provides a convenient interface to navigate through virus genome sequences, to design ORF-specific cloning primers, to validate the sequence of generated constructs and to browse established collections of virus ORFs. Most importantly, ViralORFeome has been designed to manage all possible variants or mutants of a given ORF so that the cloning procedure can be applied to any emerging virus strain. A subset of plasmid constructs generated with the ViralORFeome platform has been tested with success for heterologous protein expression in different expression systems at proteome scale. ViralORFeome should provide our community with a framework to establish a large collection of virus ORF clones, an instrumental resource to determine functions, activities and binding partners of viral proteins.

  10. Numerical Feynman integrals with physically inspired interpolation: Faster convergence and significant reduction of computational cost

    Directory of Open Access Journals (Sweden)

    Nikesh S. Dattani

    2012-03-01

    Full Text Available One of the most successful methods for calculating reduced density operator dynamics in open quantum systems, one that can give numerically exact results, uses Feynman integrals. However, when simulating the dynamics for a given amount of time, the number of time steps that can realistically be used with this method is always limited; therefore one often obtains an approximation of the reduced density operator at a sparse grid of points in time. Instead of relying only on ad hoc interpolation methods (such as splines) to estimate the system density operator in between these points, I propose a method that uses physical information to assist with this interpolation. This method is tested on a physically significant system, on which its use allows important qualitative features of the density operator dynamics to be captured with as few as two time steps in the Feynman integral. This method allows for an enormous reduction in the amount of memory and CPU time required for approximating density operator dynamics within a desired accuracy. Since this method does not change the way the Feynman integral itself is calculated, the value of the density operator approximation at the points in time used to discretize the Feynman integral will be the same whether or not this method is used, but its approximation in between these points in time is considerably improved by this method. A list of ways in which this proposed method can be further improved is presented in the last section of the article.

  11. Numerical Modelling of Mechanical Integrity of the Copper-Cast Iron Canister. A Literature Review

    International Nuclear Information System (INIS)

    Lanru Jing

    2004-04-01

    This review article presents a summary of the research work on numerical modelling of the mechanical integrity of the composite copper-cast iron canisters for the final disposal of Swedish nuclear waste, conducted by SKB and SKI since 1992. The objective of the review is to evaluate the outstanding issues existing today concerning the basic design concepts and premises, and the fundamental issues on processes, properties and parameters considered for the functions and requirements of canisters under the conditions of a deep geological repository. The focus is placed on the adequacy of the numerical modelling approaches adopted with regard to the overall mechanical integrity of the canisters, especially the initial state of the canisters regarding defects and the consequences of their evolution under the external and internal loading mechanisms adopted in the design premises. The emphasis is on the stress-strain behaviour and failure/strength, with creep and plasticity involved. Corrosion, although one of the major concerns in the field of canister safety, was not included.

  12. Numerical simulation of a lattice polymer model at its integrable point

    International Nuclear Information System (INIS)

    Bedini, A; Owczarek, A L; Prellberg, T

    2013-01-01

    We revisit an integrable lattice model of polymer collapse using numerical simulations. This model was first studied by Blöte and Nienhuis (1989 J. Phys. A: Math. Gen. 22 1415) and it describes polymers with some attraction, thus providing a model for the polymer collapse transition. At a particular set of Boltzmann weights the model is integrable, and the exponents ν = 12/23 ≈ 0.522 and γ = 53/46 ≈ 1.152 have been computed via identification of the scaling dimensions x_t = 1/12 and x_h = −5/48. We directly investigate the polymer scaling exponents via Monte Carlo simulations using the pruned-enriched Rosenbluth method (PERM). By simulating this polymer model for walks up to length 4096 we find ν = 0.576(6) and γ = 1.045(5), which are clearly different from the predicted values. Our estimate for the exponent ν is compatible with the known θ-point value of 4/7 and in agreement with a very recent numerical evaluation by Foster and Pinettes (2012 J. Phys. A: Math. Theor. 45 505003). (paper)
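For readers unfamiliar with the sampling scheme, the following is a minimal sketch of plain Rosenbluth sampling of self-avoiding walks on the square lattice, the precursor of the pruned-enriched method used in the paper. It is illustrative only: it omits pruning/enrichment and the attraction weights of the Blöte-Nienhuis model.

```python
import random

# Minimal Rosenbluth sampler for self-avoiding walks on the square lattice.
# Each walk is grown step by step among the unoccupied neighbours; the
# Rosenbluth weight (product of the number of free neighbours at each step)
# corrects for the bias of this growth rule in the weighted average.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def rosenbluth_walk(n_steps, rng):
    """Grow one n-step walk; return (squared end-to-end distance, weight)."""
    pos = (0, 0)
    visited = {pos}
    weight = 1.0
    for _ in range(n_steps):
        free = [(pos[0] + dx, pos[1] + dy) for dx, dy in MOVES
                if (pos[0] + dx, pos[1] + dy) not in visited]
        if not free:                 # walk is trapped: contributes zero weight
            return 0.0, 0.0
        weight *= len(free)          # Rosenbluth weight update
        pos = rng.choice(free)
        visited.add(pos)
    return float(pos[0] ** 2 + pos[1] ** 2), weight

def mean_r2(n_steps, n_samples, seed=1):
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        r2, w = rosenbluth_walk(n_steps, rng)
        num += w * r2
        den += w
    return num / den

# For 2D self-avoiding walks <R^2> ~ N^(2*nu) with nu = 3/4, so the weighted
# estimate grows clearly faster than the simple random-walk value <R^2> = N.
print(mean_r2(20, 2000))
```

Plain Rosenbluth sampling degrades rapidly for long chains because the weights develop a huge variance; the pruning and enrichment steps of PERM are what make walks of length 4096, as in the paper, practical.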

  13. On the hydrodynamics of archer fish jumping out of the water: Integrating experiments with numerical simulations

    Science.gov (United States)

    Sotiropoulos, Fotis; Angelidis, Dionysios; Mendelson, Leah; Techet, Alexandra

    2017-11-01

    Evolution has enabled fish to develop a range of thrust producing mechanisms to allow skillful movement and give them the ability to catch prey or avoid danger. Several experimental and numerical studies have been performed to investigate how complex maneuvers are executed and develop bioinspired strategies for aquatic robot design. We will discuss recent numerical advances toward the development of a computational framework for performing turbulent, two-phase flow, fluid-structure-interaction (FSI) simulations to investigate the dynamics of aquatic jumpers. We will also discuss the integration of such numerics with high-speed imaging and particle image velocimetry data to reconstruct anatomic fish models and prescribe realistic kinematics of fish motion. The capabilities of our method will be illustrated by applying it to simulate the motion of a small scale archer fish jumping out of the water to capture prey. We will discuss the rich vortex dynamics emerging during the hovering, rapid upward and gliding phases. The simulations will elucidate the thrust production mechanisms by the movement of the pectoral and anal fins and we will show that the fins significantly contribute to the rapid acceleration.

  14. Integrated Numerical Experiments (INEX) and the Free-Electron Laser Physical Process Code (FELPPC)

    International Nuclear Information System (INIS)

    Thode, L.E.; Chan, K.C.D.; Schmitt, M.J.; McKee, J.; Ostic, J.; Elliott, C.J.; McVey, B.D.

    1990-01-01

    The strong coupling of subsystem elements, such as the accelerator, wiggler, and optics, greatly complicates the understanding and design of a free electron laser (FEL), even at the conceptual level. To address the strong coupling character of the FEL, the concept of an Integrated Numerical Experiment (INEX) was proposed. Unique features of the INEX approach are consistency and numerical equivalence of experimental diagnostics. The equivalent numerical diagnostics mitigate the major problem of misinterpretation that often occurs when theoretical and experimental data are compared. The INEX approach has been applied to a large number of accelerator and FEL experiments. Overall, the agreement between INEX and the experiments is very good. Despite the success of INEX, the approach is difficult to apply to trade-off and initial design studies because of the significant manpower and computational requirements. On the other hand, INEX provides a base from which realistic accelerator, wiggler, and optics models can be developed. The Free Electron Laser Physical Process Code (FELPPC) includes models developed from INEX, provides coupling between the subsystem models, and incorporates application models relevant to a specific trade-off or design study. In other words, FELPPC solves the complete physical process model using realistic physics and technology constraints. Because FELPPC provides a detailed design, a good estimate of the FEL mass, cost, and size can be made from a piece-part count of the FEL. FELPPC requires significant accelerator and FEL expertise to operate. The code can calculate complex FEL configurations including multiple accelerator and wiggler combinations.

  15. Numerical investigation of premixed combustion in a porous burner with integrated heat exchanger

    Energy Technology Data Exchange (ETDEWEB)

    Farzaneh, Meisam; Shafiey, Mohammad; Shams, Mehrzad [K.N. Toosi University of Technology, Department of Mechanical Engineering, Tehran (Iran, Islamic Republic of); Ebrahimi, Reza [K.N. Toosi University of Technology, Department of Aerospace Engineering, Tehran (Iran, Islamic Republic of)

    2012-07-15

    In this paper, we perform a numerical analysis of a two-dimensional axisymmetric problem arising in premixed combustion in a porous burner with an integrated heat exchanger. The physical domain consists of two zones, the porous zone and the heat exchanger zone. The two-dimensional Navier-Stokes equations, the gas and solid energy equations, and the chemical species transport equations are solved, and heat release is described by a multistep kinetics mechanism. The solid matrix is modeled as a gray medium, and the finite volume method is used to solve the radiative transfer equation to calculate the local radiation source/sink in the solid-phase energy equation. Special attention is given to modeling heat transfer between the hot gas and the heat exchanger tube; thus, the corresponding terms are added to the energy equations of the flow and the solid matrix. Gas and solid temperature profiles and species mole fractions on the burner centerline, predicted 2D temperature fields, species concentrations and streamlines are presented. Calculated results for temperature profiles are compared to experimental data. It is shown that there is good agreement between the numerical solutions and the experimental data, and it is concluded that the developed numerical program is an excellent tool to investigate combustion in porous burners. (orig.)

  16. Application of Numerical Integration and Data Fusion in Unit Vector Method

    Science.gov (United States)

    Zhang, J.

    2012-01-01

    The Unit Vector Method (UVM) is a family of orbit determination methods designed by Purple Mountain Observatory (PMO) that have been applied extensively. The UVM obtains the conditional equations for different kinds of data by projecting the basic equation onto different unit vectors, and it lends itself to weighting the different kinds of data, so that high-precision data can play a major role in orbit determination and the accuracy of orbit determination is improved appreciably. The improved UVM (PUVM2) extended the UVM from initial orbit determination to orbit improvement, unifying the two dynamically; precision and efficiency were improved further. In this thesis, further research has been carried out based on the UVM. Firstly, with the improvement of observational methods and techniques, the types and precision of the observational data have improved substantially, which in turn demands higher precision in orbit determination. Analytical perturbation theory cannot meet this requirement, so numerical integration of the perturbations has been introduced into the UVM: the accuracy of the dynamical model is matched to the accuracy of the real data, and the condition equations of the UVM are modified accordingly, improving the accuracy of orbit determination further. Secondly, a data fusion method has been introduced into the UVM. The convergence mechanism and the defects of the weighting strategy of the original UVM have been clarified, and the problems have been resolved in this method: the calculation of the approximate state transition matrix is simplified, and the weighting strategy has been improved for data of different dimensions and different precision. Results of orbit determination from simulated and real data show that the work of this thesis is effective: (1) after numerical integration is introduced into the UVM, the accuracy of orbit determination is improved obviously, and it suits the high-accuracy data of
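The switch from analytical perturbation theory to numerical integration that the abstract describes can be illustrated generically. The sketch below is a hypothetical stand-in, not PMO's UVM/PUVM2 code: it propagates the planar two-body problem with the classical RK4 scheme in normalized units, and a circular orbit closing on itself after one period provides a quick accuracy check.

```python
import math

# RK4 propagation of the planar two-body problem in normalised units
# (GM = 1).  A circular orbit of radius 1 has period 2*pi; the distance
# between the start and end states after one revolution measures the
# integration error.

def deriv(state):
    x, y, vx, vy = state
    r3 = (x * x + y * y) ** 1.5
    return [vx, vy, -x / r3, -y / r3]    # central gravitational acceleration

def rk4_step(state, h):
    k1 = deriv(state)
    k2 = deriv([s + 0.5 * h * k for s, k in zip(state, k1)])
    k3 = deriv([s + 0.5 * h * k for s, k in zip(state, k2)])
    k4 = deriv([s + h * k for s, k in zip(state, k3)])
    return [s + h / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

def propagate(state, t_end, n_steps):
    h = t_end / n_steps
    for _ in range(n_steps):
        state = rk4_step(state, h)
    return state

start = [1.0, 0.0, 0.0, 1.0]                 # circular orbit, GM = 1
end = propagate(start, 2.0 * math.pi, 6283)  # one full revolution
err = math.hypot(end[0] - start[0], end[1] - start[1])
print(err)   # closure error: tiny (well below 1e-8 at this step size)
```

In an orbit determination setting the same integrator would carry the full perturbed force model, so that the dynamical model's accuracy matches the accuracy of the observations, as the thesis argues.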

  17. Analysis of Wave Reflection from Structures with Berms Through an Extensive Database and 2DV Numerical Modelling

    DEFF Research Database (Denmark)

    Zanuttigh, Barbara; van der Meer, Jentsje W.; Andersen, Thomas Lykke

    2009-01-01

    This paper analyses wave reflection from permeable structures with a berm, including reshaping cases. Data are obtained from recent wave flume experiments and from 2DV numerical simulations performed with the COBRAS-UC code. The objectives of this research were to identify the proper representation...

  18. NOAA's Integrated Tsunami Database: Data for improved forecasts, warnings, research, and risk assessments

    Science.gov (United States)

    Stroker, Kelly; Dunbar, Paula; Mungov, George; Sweeney, Aaron; McCullough, Heather; Carignan, Kelly

    2015-04-01

    The National Oceanic and Atmospheric Administration (NOAA) has primary responsibility in the United States for tsunami forecast, warning, and research, and supports community resiliency. NOAA's National Geophysical Data Center (NGDC) and co-located World Data Service for Geophysics provide a unique collection of data enabling communities to ensure preparedness and resilience to tsunami hazards. Immediately following a damaging or fatal tsunami event there is a need for authoritative data and information. The NGDC Global Historical Tsunami Database (http://www.ngdc.noaa.gov/hazard/) includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. The long-term data from these events, including photographs of damage, provide clues to what might happen in the future. NGDC catalogs the information on global historical tsunamis and uses these data to produce qualitative tsunami hazard assessments at regional levels. In addition to the socioeconomic effects of a tsunami, NGDC also obtains water level data from the coasts and the deep ocean at stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services, the NOAA Tsunami Warning Centers, and the National Data Buoy Center (NDBC), and produces research-quality data to isolate seismic waves (in the case of the deep-ocean sites) and the tsunami signal. These water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC is also building high-resolution digital elevation models (DEMs) to support real-time forecasts, implemented at 75 US coastal communities. After a damaging or fatal event NGDC begins to collect and integrate data and information from many organizations into the hazards databases. Sources of data include our NOAA partners, the U.S. Geological Survey, the UNESCO Intergovernmental Oceanographic Commission (IOC) and the International Tsunami Information Center

  19. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    OpenAIRE

    Errol A. Blake

    2007-01-01

    Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that Traditional Database Security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper is a compilation of different journals, articles and classroom discussions ...

  20. Constraining the thermal conditions of impact environments through integrated low-temperature thermochronometry and numerical modeling

    Science.gov (United States)

    Kelly, N. M.; Marchi, S.; Mojzsis, S. J.; Flowers, R. M.; Metcalf, J. R.; Bottke, W. F., Jr.

    2017-12-01

    Impacts have a significant physical and chemical influence on the surface conditions of a planet. The cratering record is used to understand a wide array of impact processes, such as the evolution of the impact flux through time. However, the relationship between impactor size and a resulting impact crater remains controversial (e.g., Bottke et al., 2016). Likewise, small variations in the impact velocity are known to significantly affect the thermal-mechanical disturbances in the aftermath of a collision. Development of more robust numerical models for impact cratering has implications for how we evaluate the disruptive capabilities of impact events, including the extent and duration of thermal anomalies, the volume of ejected material, and the resulting landscape of impacted environments. To address uncertainties in crater scaling relationships, we present an approach and methodology that integrates numerical modeling of the thermal evolution of terrestrial impact craters with low-temperature (U-Th)/He thermochronometry. The approach uses time-temperature (t-T) paths of crust within an impact crater, generated from numerical simulations of an impact. These t-T paths are then used in forward models to predict the resetting behavior of (U-Th)/He ages in the mineral chronometers apatite and zircon. Differences between the predicted and measured (U-Th)/He ages from a modeled terrestrial impact crater can then be used to evaluate parameters in the original numerical simulations, and refine the crater scaling relationships. We expect our methodology to additionally inform our interpretation of impact products, such as lunar impact breccias and meteorites, providing robust constraints on their thermal histories. In addition, the method is ideal for sample return mission planning: robust "prediction" of the ages we expect from a given impact environment enhances our ability to target sampling sites on the Moon, Mars or other solar system bodies where impacts have strongly

  1. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily is inconsistent. It is also important that disconnected locations can operate in a meaningful way in so-called disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has...

  2. Some considerations on displacement assumed finite elements with the reduced numerical integration technique

    International Nuclear Information System (INIS)

    Takeda, H.; Isha, H.

    1981-01-01

    The paper is concerned with displacement-assumed finite elements applying the reduced numerical integration technique to structural problems. The first part is a general consideration of the technique. Its purpose is to examine a variational interpretation of the finite element displacement formulation with the reduced integration technique in structural problems. The formulation is critically studied from the standpoint of the natural stiffness approach. It is shown that these types of elements are equivalent to a certain type of mixed element with assumed displacements and stresses. The rank deficiency of the stiffness matrix of these elements is interpreted as a problem in the transformation from the natural system to a Cartesian system. It is shown that a variational basis of the equivalent mixed formulation is closely related to the Hellinger-Reissner functional. For simple elements, e.g. the bilinear quadrilateral plane-stress and plate-bending elements, there are corresponding mixed elements derived from the functional. For relatively complex types of these elements, it is shown that they are equivalent to localized mixed elements derived from the Hellinger-Reissner functional. In the second part, typical finite elements with the reduced integration technique are studied to demonstrate this equivalence. A bilinear displacement- and rotation-assumed shear beam element, a bilinear displacement-assumed quadrilateral plane-stress element and a bilinear deflection- and rotation-assumed quadrilateral plate-bending element are examined to present equivalent mixed elements. Not only is the theoretical consideration presented, but numerical studies are also shown to demonstrate the effectiveness of these elements in practical analysis. (orig.)
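The rank deficiency discussed above can be reproduced numerically. This sketch (my own illustration, not from the paper) assembles the plane-stress stiffness matrix of a bilinear quadrilateral on the parent square with full 2×2 Gauss quadrature versus one-point reduced integration: full integration leaves only the three rigid-body modes in the null space (rank 5 of 8), while one-point integration adds two spurious zero-energy "hourglass" modes (rank 3).

```python
import numpy as np

# Plane-stress stiffness of a bilinear (Q4) element on the square [-1,1]^2
# (physical coords = parent coords, so the Jacobian is the identity),
# assembled with full 2x2 Gauss quadrature and with one-point reduced
# integration, to expose the extra zero-energy modes of the reduced matrix.

E, nu = 1.0, 0.3
D = E / (1 - nu ** 2) * np.array([[1, nu, 0],
                                  [nu, 1, 0],
                                  [0, 0, (1 - nu) / 2]])

XI = [(-1, -1), (1, -1), (1, 1), (-1, 1)]   # corner nodes in parent coords

def B_matrix(xi, eta):
    """Strain-displacement matrix (3 x 8) at a quadrature point."""
    B = np.zeros((3, 8))
    for i, (xi_i, eta_i) in enumerate(XI):
        dN_dx = 0.25 * xi_i * (1 + eta_i * eta)   # dN_i/dx
        dN_dy = 0.25 * eta_i * (1 + xi_i * xi)    # dN_i/dy
        B[0, 2 * i] = dN_dx
        B[1, 2 * i + 1] = dN_dy
        B[2, 2 * i] = dN_dy
        B[2, 2 * i + 1] = dN_dx
    return B

def stiffness(gauss_points):
    K = np.zeros((8, 8))
    for xi, eta, w in gauss_points:
        B = B_matrix(xi, eta)
        K += w * B.T @ D @ B
    return K

g = 1 / np.sqrt(3)
full = [(s * g, t * g, 1.0) for s in (-1, 1) for t in (-1, 1)]
reduced = [(0.0, 0.0, 4.0)]

print(np.linalg.matrix_rank(stiffness(full)),      # 5: 3 rigid-body modes only
      np.linalg.matrix_rank(stiffness(reduced)))   # 3: 2 extra hourglass modes
```

Since the one-point matrix is 4·B₀ᵀDB₀ with B₀ of rank 3, its rank can never exceed 3, which is exactly the transformation-based rank argument the abstract refers to.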

  3. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database

    KAUST Repository

    Komatsu, Setsuko

    2017-05-10

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max ‘Enrei’). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. Biological significance: The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. It accounts for approximately 80% of all

  4. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database.

    Science.gov (United States)

    Komatsu, Setsuko; Wang, Xin; Yin, Xiaojian; Nanjo, Yohei; Ohyanagi, Hajime; Sakata, Katsumi

    2017-06-23

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max 'Enrei'). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. It accounts for approximately 80% of all predicted proteins from

  5. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database

    KAUST Repository

    Komatsu, Setsuko; Wang, Xin; Yin, Xiaojian; Nanjo, Yohei; Ohyanagi, Hajime; Sakata, Katsumi

    2017-01-01

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max ‘Enrei’). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. Biological significance: The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. It accounts for approximately 80% of all

  6. Numerical simulations of inertial confinement fusion hohlraum with LARED-integration code

    International Nuclear Information System (INIS)

    Li Jinghong; Li Shuanggui; Zhai Chuanlei

    2011-01-01

    In the target design of the Inertial Confinement Fusion (ICF) program, it is common practice to apply radiation hydrodynamics codes to study the key physical processes of the radiation-drive approach to ICF, such as hohlraum physics, radiation drive symmetry, and capsule implosion physics. Recently, many efforts have been made to develop our 2D integrated simulation capability for laser fusion with a variety of optional physical models and numerical methods. In order to effectively integrate the existing codes and to facilitate the development of new codes, we are developing an object-oriented structured-mesh parallel code-supporting infrastructure called JASMIN. Based on the two-dimensional three-temperature hohlraum physics code LARED-H and the two-dimensional multi-group radiative transfer code LARED-R, we have developed a new-generation two-dimensional laser fusion code under the JASMIN infrastructure, which enables us to simulate the whole process of laser fusion from the laser beams' entrance into the hohlraum to the end of implosion. In this paper, we give a brief description of our new-generation two-dimensional laser fusion code, named LARED-Integration, focusing especially on its physical models, and present some simulation results for hohlraums. (author)

  7. Development of an Integrated Natural Barrier Database System for Site Evaluation of a Deep Geologic Repository in Korea - 13527

    International Nuclear Information System (INIS)

    Jung, Haeryong; Lee, Eunyong; Jeong, YiYeong; Lee, Jeong-Hwan

    2013-01-01

    Korea Radioactive-waste Management Corporation (KRMC), established in 2009, has started a new project to collect information on the long-term stability of deep geological environments on the Korean Peninsula. The information has been built up in an integrated natural barrier database system available on the web (www.deepgeodisposal.kr). The database system also includes socially and economically important information, such as land use, mining areas, natural conservation areas, population density, and industrial complexes, because some of this information is used as exclusionary criteria during the site selection process for a deep geological repository for safe and secure containment and isolation of spent nuclear fuel and other long-lived radioactive waste in Korea. Although the official site selection process has not yet started in Korea, it is believed that the integrated natural barrier database system and the socio-economic database will be effectively utilized to narrow down the number of sites where future investigation is most promising and to enhance public acceptance by providing readily available scientific information on deep geological environments in Korea. (authors)

  8. An Autonomic Framework for Integrating Security and Quality of Service Support in Databases

    Science.gov (United States)

    Alomari, Firas

    2013-01-01

    The back-end databases of multi-tiered applications are a major data security concern for enterprises. The abundance of these systems and the emergence of new and different threats require multiple and overlapping security mechanisms. Therefore, providing multiple and diverse database intrusion detection and prevention systems (IDPS) is a critical…

  9. European Vegetation Archive (EVA): an integrated database of European vegetation plots

    DEFF Research Database (Denmark)

    Chytrý, M; Hennekens, S M; Jiménez-Alfaro, B

    2015-01-01

    The European Vegetation Archive (EVA) is a centralized database of European vegetation plots developed by the IAVS Working Group European Vegetation Survey. It has been in development since 2012 and was first made available for use in research projects in 2014. It stores copies of national and regional vegetation-plot databases on a single software platform. Data storage in EVA does not affect the on-going independent development of the contributing databases, which remain the property of the data contributors. EVA uses a prototype of the database management software TURBOVEG 3, developed for joint management of vegetation-plot databases, and provides a data source for large-scale analyses of European vegetation diversity both for fundamental research and nature conservation applications. Updated information on EVA is available online at http://euroveg.org/eva-database.

  10. Low-temperature baseboard heaters with integrated air supply - An analytical and numerical investigation

    Energy Technology Data Exchange (ETDEWEB)

    Ploskic, Adnan; Holmberg, Sture [Fluid and Climate Technology, School of Architecture and Built Environment, KTH, Marinens vaeg 30, SE-13640 Handen, Stockholm (Sweden)

    2011-01-15

    The functioning of a hydronic baseboard heating system with integrated air supply was analyzed. The aim was to investigate the thermal performance of the system when cold outdoor (ventilation) airflow was forced through the baseboard heater. The performance of the system was evaluated for different ventilation rates at typical outdoor temperatures during the Swedish winter season. Three different analytical models and Computational Fluid Dynamics (CFD) were used to predict the temperature rise of the airflow inside the baseboard heater. Good agreement between numerical (CFD) and analytical calculations was obtained. Calculations showed that it was fully possible to pre-heat the incoming airflow to the indoor temperature and to cover transmission losses using a 45 C supply water flow. The analytical calculations also showed that the airflow per supply opening in the baseboard heater needed to be limited to 7.0 l/s due to pressure losses inside the channel. At this ventilation rate, the integrated system with one air supply gave about 2.1 times more heat output than a conventional baseboard heating system. CFD simulations also showed that the integrated system was capable of countering downdraught created by 2.0 m high glazed areas and a cold outdoor environment. Draught discomfort in the case with the conventional system was slightly above the recommended upper limit, but heat distribution across the whole analyzed office space was uniform for both heating systems. It was concluded that low-temperature baseboard heating systems with integrated air supply can meet international comfort requirements and lead to energy savings in cold climates. (author)
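
The pre-heating result above rests on a steady-state heat balance along the supply channel. As a rough sketch (not the authors' analytical models; the heat-transfer coefficient, heated area, and mass flow below are hypothetical illustrative figures), the classic constant-wall-temperature exponential (NTU-style) approach gives the outlet air temperature:

```python
import math

def outlet_temp(t_in, t_wall, h, area, m_dot, cp=1005.0):
    """Outlet air temperature for flow through a channel held at a
    constant wall temperature (exponential / NTU-style approach):
    T_out = T_wall - (T_wall - T_in) * exp(-h*A / (m_dot*cp))."""
    ntu = h * area / (m_dot * cp)
    return t_wall - (t_wall - t_in) * math.exp(-ntu)

# Hypothetical figures: -10 C outdoor air, 45 C supply-water (wall)
# temperature, 7.0 l/s of air per opening (about 0.0088 kg/s).
t_out = outlet_temp(t_in=-10.0, t_wall=45.0, h=15.0, area=0.6, m_dot=0.0088)
```

With these assumed figures the air leaves the heater above typical indoor temperature, consistent with the pre-heating conclusion reported above.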

  11. Numerical path integral solution to strong Coulomb correlation in one dimensional Hooke's atom

    Science.gov (United States)

    Ruokosenmäki, Ilkka; Gholizade, Hossein; Kylänpää, Ilkka; Rantala, Tapio T.

    2017-01-01

    We present a new approach based on real time domain Feynman path integrals (RTPI) for electronic structure calculations and quantum dynamics, which includes correlations between particles exactly, but within the numerical accuracy. We demonstrate that incoherent propagation, keeping the wave function real, is a novel method for finding and simulating the ground state, similar to the Diffusion Monte Carlo (DMC) method, but introducing new useful tools lacking in DMC. We use the 1D Hooke's atom, a two-electron system with very strong correlation, as our test case, which we solve with incoherent RTPI (iRTPI) and compare against DMC. This system provides an excellent test case due to exact solutions for some confinements and because in 1D the Coulomb singularity is stronger than in two- or three-dimensional space. The use of a Monte Carlo grid is shown to be efficient, and we determine useful numerical parameters for it. Furthermore, we discuss another novel approach achieved by combining the strengths of iRTPI and DMC. We also show the usefulness of perturbation theory for analytical approximations in the case of strong confinements.
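
The idea shared by iRTPI and DMC, that damped propagation makes excited-state components decay away and leaves the ground state, can be illustrated with a deterministic toy: explicit imaginary-time propagation of the 1D harmonic oscillator on a grid. This is a generic ground-state projection sketch, not the authors' RTPI algorithm; grid size, box length, and step are arbitrary choices.

```python
import math

def ground_state_energy(n=201, box=10.0, dtau=0.002, steps=1500):
    """Imaginary-time projection for V(x) = x^2/2 (atomic units).
    Explicit Euler step of dpsi/dtau = 0.5 * psi'' - V * psi with
    renormalization; excited states decay as exp(-(E_k - E0) * tau),
    leaving the ground state (exact energy E0 = 0.5)."""
    dx = box / (n - 1)
    xs = [-box / 2 + i * dx for i in range(n)]
    psi = [math.exp(-x * x) for x in xs]  # rough even starting guess
    for _ in range(steps):
        new = [0.0] * n  # psi pinned to zero at the box edges
        for i in range(1, n - 1):
            lap = (psi[i - 1] - 2.0 * psi[i] + psi[i + 1]) / (dx * dx)
            new[i] = psi[i] + dtau * (0.5 * lap - 0.5 * xs[i] ** 2 * psi[i])
        norm = math.sqrt(sum(p * p for p in new) * dx)
        psi = [p / norm for p in new]
    # Rayleigh quotient <psi|H|psi> on the converged wave function
    e0 = 0.0
    for i in range(1, n - 1):
        lap = (psi[i - 1] - 2.0 * psi[i] + psi[i + 1]) / (dx * dx)
        e0 += psi[i] * (-0.5 * lap + 0.5 * xs[i] ** 2 * psi[i]) * dx
    return e0

e0 = ground_state_energy()
```

After total imaginary time tau = 3 the energy estimate settles near the exact 0.5 hartree, illustrating why ground-state projection methods converge regardless of the starting guess.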

  12. The development and validation of a numerical integration method for non-linear viscoelastic modeling

    Science.gov (United States)

    Ramo, Nicole L.; Puttlitz, Christian M.

    2018-01-01

    Compelling evidence that many biological soft tissues display both strain- and time-dependent behavior has led to the development of fully non-linear viscoelastic modeling techniques to represent the tissue’s mechanical response under dynamic conditions. Since the current stress state of a viscoelastic material is dependent on all previous loading events, numerical analyses are complicated by the requirement of computing and storing the stress at each step throughout the load history. This requirement quickly becomes computationally expensive, and in some cases intractable, for finite element models. Therefore, we have developed a strain-dependent numerical integration approach for capturing non-linear viscoelasticity that enables calculation of the current stress from a strain-dependent history state variable stored from the preceding time step only, which improves both fitting efficiency and computational tractability. This methodology was validated based on its ability to recover non-linear viscoelastic coefficients from simulated stress-relaxation (six strain levels) and dynamic cyclic (three frequencies) experimental stress-strain data. The model successfully fit each data set with average errors in recovered coefficients of 0.3% for stress-relaxation fits and 0.1% for cyclic. The results support the use of the presented methodology to develop linear or non-linear viscoelastic models from stress-relaxation or cyclic experimental data of biological soft tissues. PMID:29293558
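
The history-state-variable idea described above can be sketched for the simplest linear case, a one-term Prony series, where a standard recursive update stores only one internal variable per term instead of the full load history. This is a generic linear illustration, not the authors' strain-dependent non-linear formulation; the moduli and relaxation time are arbitrary.

```python
import math

def stress_history(strains, dt, e_inf=1.0, e1=0.5, tau=0.1):
    """Stress for a one-term Prony series via the recursive internal
    variable h[n+1] = exp(-dt/tau)*h[n] + e1*exp(-dt/(2*tau))*de,
    so only the previous state h is stored, never the full history."""
    h, prev = 0.0, strains[0]
    out = []
    for eps in strains:
        de = eps - prev
        h = math.exp(-dt / tau) * h + e1 * math.exp(-dt / (2.0 * tau)) * de
        out.append(e_inf * eps + h)  # equilibrium part + viscous part
        prev = eps
    return out

# Step-strain relaxation test: stress jumps above the equilibrium
# value e_inf * strain, then relaxes back toward it.
s = stress_history([0.0] + [1.0] * 500, dt=0.01)
```

The same recurrence structure is what makes such models tractable inside finite element time stepping: each step needs only the stored state from the preceding step.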

  13. A numerical study on the structural integrity of self-anchored cable-stayed suspension bridges

    Directory of Open Access Journals (Sweden)

    Paolo Lonetti

    2016-10-01

    Full Text Available A generalized numerical model for predicting the structural integrity of self-anchored cable-stayed suspension bridges, considering both geometric and material nonlinearities, is proposed. The bridge is modeled by means of a 3D finite element approach based on a refined displacement-type finite element approximation, in which geometric nonlinearities are assumed in all components of the structure. Moreover, nonlinearities produced by inelastic material behavior and second-order displacement effects are considered for girder and pylon elements, combining gradual yielding theory with the CRC tangent modulus concept. In addition, for the elements of the suspension system, i.e. stays, hangers and the main cable, a finite plasticity theory is adopted to fully evaluate both geometric and material nonlinearities. In this framework, the influence of geometric and material nonlinearities on the collapse behavior of the bridge is investigated by means of a comparative study, which identifies the effects that the various sources of nonlinearity in the bridge components produce on the ultimate bridge behavior. Results are presented with the purpose of numerically evaluating the influence of the material and geometric characteristics of self-anchored cable-stayed suspension bridges, also in comparison with conventional bridges based on cable-stayed or suspension schemes.

  14. A Reaction Database for Small Molecule Pharmaceutical Processes Integrated with Process Information

    Directory of Open Access Journals (Sweden)

    Emmanouil Papadakis

    2017-10-01

    Full Text Available This article describes the development of a reaction database with the objective of collecting data for multiphase reactions involved in small-molecule pharmaceutical processes, with a search engine to retrieve the data needed in investigations of reaction-separation schemes, such as the role of organic solvents in improving reaction performance. The focus of this reaction database is to provide a data-rich environment with process information available to assist during the early-stage synthesis of pharmaceutical products. The database is structured in terms of classification of reaction types; compounds participating in the reaction; use of organic solvents and their function; information for single-step and multistep reactions; target products; reaction conditions; and reaction data. Information for reactor scale-up, together with information for the separation, other relevant information for each reaction, and references, is also available in the database. Additionally, the information retrieved from the database can be evaluated in terms of sustainability using well-known “green” metrics published in the scientific literature. The application of the database is illustrated through the synthesis of ibuprofen, for which data on different reaction pathways have been retrieved from the database and compared using “green” chemistry metrics.
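
The record structure described (reaction classification, participating compounds, solvent function, conditions, references) might be sketched as a simple schema. The field names below are illustrative assumptions, not the database's actual schema; the example entry is the well-known Friedel-Crafts acylation step of an ibuprofen route mentioned in the abstract, with a placeholder temperature.

```python
from dataclasses import dataclass, field

@dataclass
class ReactionEntry:
    reaction_type: str                              # classification
    reactants: list
    products: list
    solvents: dict = field(default_factory=dict)    # solvent -> function
    conditions: dict = field(default_factory=dict)  # e.g. temperature
    reference: str = ""

# Illustrative entry: the acylation step of an ibuprofen synthesis route.
entry = ReactionEntry(
    reaction_type="Friedel-Crafts acylation",
    reactants=["isobutylbenzene", "acetic anhydride"],
    products=["4-isobutylacetophenone"],
    solvents={"HF": "catalyst/solvent"},
    conditions={"temperature_C": 60},  # placeholder value
    reference="ibuprofen route (illustrative)",
)
```

A schema like this makes the "green" metrics comparison straightforward, since solvent function and conditions are explicit fields rather than free text.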

  15. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    Directory of Open Access Journals (Sweden)

    Errol A. Blake

    2007-12-01

    Full Text Available Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that Traditional Database Security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper, a compilation of different journals, articles and classroom discussions, will focus on unifying the process of securing data or information whether it is in use, in storage or being transmitted. Promoting a change in database curriculum development trends may also play a role in helping secure databases. This paper takes the approach that a conscientious effort to unify the database security process, which includes the Database Management System (DBMS) selection process, following regulatory compliance, analyzing and learning from the mistakes of others, implementing networking security technologies, and securing the database, may prevent database breaches.

  16. Multilingual access to full text databases; Acces multilingue aux bases de donnees en texte integral

    Energy Technology Data Exchange (ETDEWEB)

    Fluhr, C; Radwan, K [Institut National des Sciences et Techniques Nucleaires (INSTN), Centre d` Etudes de Saclay, 91 - Gif-sur-Yvette (France)

    1990-05-01

    Many full-text databases are available in only one language; moreover, they may contain documents in different languages. Even if the user is able to understand the language of the documents in the database, it may be easier for him to express his need in his own language. For databases containing documents in different languages, it is simpler to formulate the query in one language only and to retrieve documents in different languages. This paper presents the developments and first experiments in multilingual search, applied to the French-English pair, for text data in the nuclear field, based on the SPIRIT system. After reviewing the general problems of searching full-text databases with queries formulated in natural language, we present the methods used to reformulate queries and show how they can be expanded for multilingual search. The first results on data in the nuclear field are presented (AFCEN norms and INIS abstracts). 4 refs.

  17. An Integrated Database of Unit Training Performance: Description an Lessons Learned

    National Research Council Canada - National Science Library

    Leibrecht, Bruce

    1997-01-01

    The Army Research Institute (ARI) has developed a prototype relational database for processing and archiving unit performance data from home station, training area, simulation based, and Combat Training Center training exercises...

  18. An accurate real-time model of maglev planar motor based on compound Simpson numerical integration

    Directory of Open Access Journals (Sweden)

    Baoquan Kou

    2017-05-01

    Full Text Available To realize the high-speed and precise control of the maglev planar motor, a more accurate real-time electromagnetic model, which considers the influence of the coil corners, is proposed in this paper. Three coordinate systems, for the stator, the mover and the corner coil, are established. The coil is divided into two segments, the straight coil segment and the corner coil segment, in order to obtain a complete electromagnetic model. When only the first harmonic of the flux density distribution of the Halbach magnet array is taken into account, the integration can be carried out over the two segments according to the Lorentz force law. The force and torque formulas for the straight coil segment can be derived directly from the Newton-Leibniz formula; however, this is not applicable to the corner coil segment. Therefore, the compound Simpson numerical integration method is proposed in this paper to solve the corner segment. Validated by simulation and experiment, the proposed model has high accuracy and can easily be applied in practice.
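
For reference, the compound (composite) Simpson rule named in the title is the standard even-subinterval quadrature sketched below. This generic version integrates an arbitrary test function, not the paper's actual corner-coil Lorentz-force integrand.

```python
def composite_simpson(f, a, b, n):
    """Composite (compound) Simpson rule over n subintervals (n even):
    h/3 * [f(a) + 4*sum(odd nodes) + 2*sum(even interior nodes) + f(b)]."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3.0

# Simpson's rule is exact for cubics: the integral of x^3 on [0, 1] is 1/4.
approx = composite_simpson(lambda x: x ** 3, 0.0, 1.0, 10)
```

Because the rule is fourth-order accurate, few subintervals suffice for smooth integrands, which is what makes it attractive for a real-time model.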

  19. An accurate real-time model of maglev planar motor based on compound Simpson numerical integration

    Science.gov (United States)

    Kou, Baoquan; Xing, Feng; Zhang, Lu; Zhou, Yiheng; Liu, Jiaqi

    2017-05-01

    To realize the high-speed and precise control of the maglev planar motor, a more accurate real-time electromagnetic model, which considers the influence of the coil corners, is proposed in this paper. Three coordinate systems, for the stator, the mover and the corner coil, are established. The coil is divided into two segments, the straight coil segment and the corner coil segment, in order to obtain a complete electromagnetic model. When only the first harmonic of the flux density distribution of the Halbach magnet array is taken into account, the integration can be carried out over the two segments according to the Lorentz force law. The force and torque formulas for the straight coil segment can be derived directly from the Newton-Leibniz formula; however, this is not applicable to the corner coil segment. Therefore, the compound Simpson numerical integration method is proposed in this paper to solve the corner segment. Validated by simulation and experiment, the proposed model has high accuracy and can easily be applied in practice.

  20. Numerical functional integration method for studying the properties of the physical vacuum

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.

    1998-01-01

    A new approach to investigating the physical vacuum in quantum theories, including its nonperturbative topological structure, is discussed. This approach is based on the representation of the matrix element of the evolution operator in Euclidean metrics in the form of a functional integral with a certain measure in the corresponding space, and on the use of approximation formulas which we have constructed for this kind of integral. No preliminary discretization of space and time is required, and no simplifying assumptions such as the semiclassical approximation, collective excitations, or the introduction of ''short-time'' propagators are necessary in this approach. The method allows the use of more preferable deterministic algorithms instead of the traditional stochastic techniques. It has been proven that our approach has important advantages over other known methods, including higher computational efficiency. Examples of the application of the method to the numerical study of some potential nuclear models and to the computation of the topological susceptibility and the θ-vacua energy are presented. (author)

  1. An integrated numerical and physical modeling system for an enhanced in situ bioremediation process

    International Nuclear Information System (INIS)

    Huang, Y.F.; Huang, G.H.; Wang, G.Q.; Lin, Q.G.; Chakma, A.

    2006-01-01

    Groundwater contamination due to releases of petroleum products is a major environmental concern in many urban districts and industrial zones. Over the past years, a few studies were undertaken to address in situ bioremediation processes coupled with contaminant transport in two- or three-dimensional domains. However, they concentrated on natural attenuation processes for petroleum contaminants or on enhanced in situ bioremediation processes in laboratory columns. In this study, an integrated numerical and physical modeling system is developed for simulating an enhanced in situ biodegradation (EISB) process coupled with three-dimensional multiphase, multicomponent flow and transport in a multi-dimensional pilot-scale physical model. The designed pilot-scale physical model is effective in tackling natural attenuation and EISB processes for site remediation. The simulation results demonstrate that the developed system is effective in modeling the EISB process and can thus be used for investigating the effects of various uncertainties. - An integrated modeling system was developed to enhance in situ bioremediation processes

  2. Use of Graph Database for the Integration of Heterogeneous Biological Data.

    Science.gov (United States)

    Yoon, Byoung-Ha; Kim, Seon-Kyu; Kim, Seon-Young

    2017-03-01

    Understanding complex relationships among heterogeneous biological data is one of the fundamental goals in biology. In most cases, diverse biological data are stored in relational databases, such as MySQL and Oracle, which store data in multiple tables and then infer relationships by multiple-join statements. Recently, a new type of database, called the graph-based database, was developed to natively represent various kinds of complex relationships, and it is widely used among computer science communities and IT industries. Here, we demonstrate the feasibility of using a graph-based database for complex biological relationships by comparing the performance between MySQL and Neo4j, one of the most widely used graph databases. We collected various biological data (protein-protein interaction, drug-target, gene-disease, etc.) from several existing sources, removed duplicate and redundant data, and finally constructed a graph database containing 114,550 nodes and 82,674,321 relationships. When we tested the query execution performance of MySQL versus Neo4j, we found that Neo4j outperformed MySQL in all cases. While Neo4j exhibited a very fast response for various queries, MySQL exhibited latent or unfinished responses for complex queries with multiple-join statements. These results show that using graph-based databases, such as Neo4j, is an efficient way to store complex biological relationships. Moreover, querying a graph database in diverse ways has the potential to reveal novel relationships among heterogeneous biological data.
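
The multiple-join pattern that penalizes relational back-ends can be seen in a minimal sketch (toy schema and data; the table names are assumptions, and the Cypher pattern in the comment is illustrative): retrieving drugs indirectly linked to a disease through their gene targets takes three joins in SQL, but a single path pattern in a graph query language.

```python
import sqlite3

# Toy relational schema mirroring the kind of heterogeneous biological
# data described above (table and column names are illustrative).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE gene(id INTEGER PRIMARY KEY, symbol TEXT);
CREATE TABLE disease(id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE drug(id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE gene_disease(gene_id INT, disease_id INT);
CREATE TABLE drug_target(drug_id INT, gene_id INT);
INSERT INTO gene VALUES(1, 'EGFR');
INSERT INTO disease VALUES(1, 'lung cancer');
INSERT INTO drug VALUES(1, 'gefitinib');
INSERT INTO gene_disease VALUES(1, 1);
INSERT INTO drug_target VALUES(1, 1);
""")

# Drug-to-disease paths need three JOINs here, versus a single path
# pattern in a graph query language, e.g. (illustrative Cypher):
#   MATCH (d:Drug)-[:TARGETS]->(:Gene)-[:ASSOCIATED_WITH]->(x:Disease)
rows = db.execute("""
    SELECT drug.name, disease.name
    FROM drug
    JOIN drug_target  ON drug.id = drug_target.drug_id
    JOIN gene_disease ON drug_target.gene_id = gene_disease.gene_id
    JOIN disease      ON gene_disease.disease_id = disease.id
""").fetchall()
```

Each additional hop in a relationship query adds another JOIN in SQL, whereas a graph database traverses stored adjacency directly, which is the performance gap the comparison above reports.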

  3. Numerical integration methods and layout improvements in the context of dynamic RNA visualization.

    Science.gov (United States)

    Shabash, Boris; Wiese, Kay C

    2017-05-30

    RNA visualization software tools have traditionally presented a static visualization of RNA molecules, with limited ability for users to interact with the resulting image once it is complete. Only a few tools have allowed for dynamic structures. One such tool is jViz.RNA. Currently, jViz.RNA employs a unique method for creating the RNA molecule layout: it maps the RNA nucleotides into vertexes in a graph, which we call the detailed graph, and then utilizes a system of forces inspired by Newtonian mechanics to calculate a layout for the RNA molecule. The work presented here focuses on improvements to jViz.RNA that allow the drawing of RNA secondary structures according to common drawing conventions, as well as dramatic run-time performance improvements. This is done first by presenting an alternative method for mapping the RNA molecule into a graph, which we call the compressed graph, and then by employing advanced numerical integration methods for the compressed graph representation. Comparing the compressed graph and detailed graph implementations, we find that the compressed graph produces results more consistent with RNA drawing conventions. However, we also find that employing the compressed graph method requires a more sophisticated initial layout to produce visualizations that would require minimal user interference. Comparing the two numerical integration methods demonstrates the higher stability of the Backward Euler method and its resulting ability to handle much larger time steps, a high-priority feature for any software that entails user interaction. The work in this manuscript presents the preferred use of compressed graphs over detailed ones, as well as the advantages of employing the Backward Euler method over the Forward Euler method. These improvements produce more stable as well as more visually aesthetic representations of RNA secondary structures. The results presented demonstrate that both the compressed graph representation, as well as the Backward
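
The stability advantage of Backward over Forward Euler reported above is easy to reproduce on the classic stiff test equation y' = -ky (a generic illustration, not jViz.RNA's actual force system): with a time step beyond the explicit stability limit 2/k, the explicit update diverges while the implicit one decays.

```python
def forward_euler(y0, k, dt, steps):
    """Explicit update y <- y + dt * f(y) for y' = -k*y."""
    y = y0
    for _ in range(steps):
        y = y + dt * (-k * y)
    return y

def backward_euler(y0, k, dt, steps):
    """Implicit update y_new = y + dt * (-k * y_new), solved in closed form."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 + dt * k)
    return y

# Stiff test: dt = 0.1 exceeds the explicit stability limit 2/k = 0.04,
# so Forward Euler blows up while Backward Euler decays toward the
# true solution (which tends to zero).
k, dt, steps = 50.0, 0.1, 100
fwd = forward_euler(1.0, k, dt, steps)
bwd = backward_euler(1.0, k, dt, steps)
```

This is why an implicit integrator can afford the large time steps an interactive layout engine needs: stability no longer ties the step size to the stiffest spring in the system.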

  4. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterization of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework, as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast querying. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize responses to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., ultimately providing a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP, based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative

  5. Application of variational principles and adjoint integrating factors for constructing numerical GFD models

    Science.gov (United States)

    Penenko, Vladimir; Tsvetova, Elena; Penenko, Alexey

    2015-04-01

    The proposed method is illustrated on the example of hydrothermodynamics and atmospheric chemistry models [1,2]. Extending the existing methods for constructing numerical schemes that possess the property of total approximation for the operators of multiscale process models, we have developed a new variational technique which uses the concept of adjoint integrating factors. The technique is as follows. First, the basic functional of the variational principle (the integral identity that unites the model equations with the initial and boundary conditions) is transformed using Lagrange's identity and the second Green's formula. As a result, the action of the operators of the main problem in the space of state functions is transferred to the adjoint operators defined in the space of sufficiently smooth adjoint functions. By the choice of adjoint functions, the order of the derivatives becomes one lower than in the original equations. We obtain a set of new balance relationships that take into account the sources and boundary conditions. Next, we introduce a decomposition of the model domain into a set of finite volumes. For multi-dimensional non-stationary problems, this technique is applied within the variational principle and the schemes of decomposition and splitting over the set of physical processes, for each coordinate direction successively at each time step. For each direction within a finite volume, analytical solutions of the one-dimensional homogeneous adjoint equations are constructed. In this case, the solutions of the adjoint equations serve as integrating factors. The results are hybrid discrete-analytical schemes. They have the properties of stability, approximation and unconditional monotonicity for convection-diffusion operators. These schemes are discrete in time and analytic in the spatial variables. They are exact in the case of piecewise-constant coefficients within a finite volume and along the coordinate lines of the grid area in each

  6. DMPD: Signal integration between IFNgamma and TLR signalling pathways in macrophages. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available Signal integration between IFNgamma and TLR signalling pathways in macrophages. PubmedID 16920490

  7. YPED: an integrated bioinformatics suite and database for mass spectrometry-based proteomics research.

    Science.gov (United States)

    Colangelo, Christopher M; Shifman, Mark; Cheung, Kei-Hoi; Stone, Kathryn L; Carriero, Nicholas J; Gulcicek, Erol E; Lam, TuKiet T; Wu, Terence; Bjornson, Robert D; Bruce, Can; Nairn, Angus C; Rinehart, Jesse; Miller, Perry L; Williams, Kenneth R

    2015-02-01

    We report a significantly enhanced bioinformatics suite and database for proteomics research called the Yale Protein Expression Database (YPED), which is used by investigators at more than 300 institutions worldwide. YPED meets the data management, archival, and analysis needs of high-throughput mass spectrometry-based proteomics research, ranging from a single laboratory or a group of laboratories within and beyond an institution to the entire proteomics community. The current version is a significant improvement over the first in that it contains new modules for liquid chromatography-tandem mass spectrometry (LC-MS/MS) database search results, label and label-free quantitative proteomic analysis, and several scoring outputs for phosphopeptide site localization. In addition, we have added both peptide and protein comparative analysis tools to enable pairwise analysis of distinct peptides/proteins in each sample and of overlapping peptides/proteins between all samples in multiple datasets. We have also implemented a targeted proteomics module for automated multiple reaction monitoring (MRM)/selective reaction monitoring (SRM) assay development. We have linked YPED's database search results and both label-based and label-free fold-change analysis to the Skyline Panorama repository for online spectra visualization. In addition, we have built enhanced functionality to curate peptide identifications into an MS/MS peptide spectral library for all of our protein database search identification results. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  8. Integrated application of the database for airborne geophysical survey achievement information

    International Nuclear Information System (INIS)

    Ji Zengxian; Zhang Junwei

    2006-01-01

    The paper briefly introduces the database of information on airborne geophysical survey achievements. The database was developed on the Microsoft Windows platform with Visual C++ 6.0 and MapGIS. It is an information management system for airborne geophysical survey achievements, with full functionality for graphic display, graphic cutting and output, data query, printing of documents and reports, database maintenance, etc. All information on airborne geophysical survey achievements in the nuclear industry from 1972 to 2003 is embedded in the system. Based on the regional geological map and the Meso-Cenozoic basin map, detailed statistical information on each airborne survey area and on each airborne radioactive anomalous point and high-field point can be presented visually in combination with geological or basin research results. The successful development of this system provides a sound base and platform for the management of archives and data of airborne geophysical survey achievements in the nuclear industry. (authors)

  9. Data integration and knowledge discovery in biomedical databases. Reliable information from unreliable sources

    Directory of Open Access Journals (Sweden)

    A Mitnitski

    2003-01-01

    Full Text Available To better understand information about human health from databases, we analyzed three datasets collected for different purposes in Canada: a biomedical database of older adults, a large population survey across all adult ages, and vital statistics. Redundancy in the variables was established, and this led us to derive a generalized (macroscopic) state variable, a fitness/frailty index that reflects both individual and group health status. Evaluation of the relationship between fitness/frailty and the mortality rate revealed that the latter could be expressed in terms of variables generally available from any cross-sectional database. In practical terms, this means that the risk of mortality might readily be assessed from standard biomedical appraisals collected for other purposes.

  10. Experimental and numerical analyses on thermal performance of different typologies of PCMs integrated in the roof space

    DEFF Research Database (Denmark)

    Elarga, Hagar; Fantucci, Stefano; Serra, Valentina

    2017-01-01

    portions, one, the bare roof, representing the reference case without PCMs, the other two integrating two PCM's typologies with different melting/solidification temperatures range. A numerical model was furthermore developed implementing the equivalent capacitance numerical method to describe the substance...... peak load between 13% and 59% depending on the PCM typology, highlighting that to reach the expected performance the proper PCM type should be carefully selected....

  11. Follicle Online: an integrated database of follicle assembly, development and ovulation.

    Science.gov (United States)

    Hua, Juan; Xu, Bo; Yang, Yifan; Ban, Rongjun; Iqbal, Furhan; Cooke, Howard J; Zhang, Yuanwei; Shi, Qinghua

    2015-01-01

    Folliculogenesis is an important part of ovarian function as it provides the oocytes for female reproductive life. Characterizing the genes/proteins involved in folliculogenesis is fundamental for understanding the mechanisms associated with this biological function and for treating the diseases associated with folliculogenesis. A large number of genes/proteins associated with folliculogenesis have been identified from different species. However, no dedicated public resource is currently available for folliculogenesis-related genes/proteins that are validated by experiments. Here, we report a database, 'Follicle Online', that provides an experimentally validated gene/protein map of folliculogenesis in a number of species. Follicle Online is a web-based database system for storing and retrieving folliculogenesis-related experimental data. It provides detailed information for 580 genes/proteins (from 23 model organisms, including Homo sapiens, Mus musculus, Rattus norvegicus, Mesocricetus auratus, Bos taurus, Drosophila and Xenopus laevis) that have been reported to be involved in folliculogenesis, POF (premature ovarian failure) and PCOS (polycystic ovary syndrome). The literature was manually curated from more than 43,000 published articles (up to 1 March 2014). The Follicle Online database is implemented in PHP + MySQL + JavaScript, and this user-friendly web application provides access to the stored data. In summary, we have developed a centralized database that provides users with comprehensive information about the genes/proteins involved in folliculogenesis. This database can be accessed freely and all the stored data can be viewed without any registration. Database URL: http://mcg.ustc.edu.cn/sdap1/follicle/index.php © The Author(s) 2015. Published by Oxford University Press.

  12. Science-Based Approach for Advancing Marine and Hydrokinetic Energy: Integrating Numerical Simulations with Experiments

    Science.gov (United States)

    Sotiropoulos, F.; Kang, S.; Chamorro, L. P.; Hill, C.

    2011-12-01

    The field of MHK energy is still in its infancy, lagging approximately a decade or more behind the technology and development progress made in wind energy engineering. Marine environments are characterized by complex topography and three-dimensional (3D) turbulent flows, which can greatly affect the performance and structural integrity of MHK devices and impact the Levelized Cost of Energy (LCoE). Since the deployment of multi-turbine arrays is envisioned for field applications, turbine-to-turbine interactions and turbine-bathymetry interactions need to be understood and properly modeled so that MHK arrays can be optimized on a site-specific basis. Furthermore, turbulence induced by MHK turbines alters and interacts with the nearby ecosystem and could potentially impact aquatic habitats. Increased turbulence in the wake of MHK devices can also change the shear stress imposed on the bed, ultimately affecting the sediment transport and suspension processes in the wake of these structures. Such effects, however, remain largely unexplored today. In this work a science-based approach integrating state-of-the-art experimentation with high-resolution computational fluid dynamics is proposed as a powerful strategy for optimizing the performance of MHK devices and assessing environmental impacts. A novel numerical framework is developed for carrying out Large-Eddy Simulation (LES) in arbitrarily complex domains with embedded MHK devices. The model is able to resolve the geometrical complexity of real-life MHK devices using the Curvilinear Immersed Boundary (CURVIB) method along with a wall model for handling the flow near solid surfaces. Calculations are carried out for an axial-flow hydrokinetic turbine mounted on the bed of a rectangular open channel on a grid with nearly 200 million grid nodes. The approach flow corresponds to fully developed turbulent open channel flow and is obtained from a separate LES calculation. The specific case corresponds to that studied

  13. Social Gerontology--Integrative and Territorial Aspects: A Citation Analysis of Subject Scatter and Database Coverage

    Science.gov (United States)

    Lasda Bergman, Elaine M.

    2011-01-01

    To determine the mix of resources used in social gerontology research, a citation analysis was conducted. A representative sample of citations was selected from three prominent gerontology journals and information was added to determine subject scatter and database coverage for the cited materials. Results indicate that a significant portion of…

  14. Influenza research database: an integrated bioinformatics resource for influenza virus research

    Science.gov (United States)

    The Influenza Research Database (IRD) is a U.S. National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Bioinformatics Resource Center dedicated to providing bioinformatics support for influenza virus research. IRD facilitates the research and development of vaccines, diagnostics, an...

  15. Analysis of enhancement in available power transfer capacity by STATCOM integrated SMES by numerical simulation studies

    Directory of Open Access Journals (Sweden)

    Saraswathi Ananthavel

    2016-06-01

    Full Text Available Power system research is mainly focused on enhancing the available power capacity of existing transmission lines. Still, no prominent solution has emerged, owing to several factors that affect transmission lines, including line length, cable aging, and losses in generation, transmission and distribution. This paper exploits the integration of a static synchronous compensator (STATCOM) with superconducting magnetic energy storage (SMES), connected to an existing power transmission line to enhance the available power transfer capacity (ATC). The STATCOM is a power-electronic voltage source converter (VSC) connected to the transmission system for shunt reactive power and harmonics compensation; SMES is a well-known clean energy storage technology. The proposed system can control real and reactive power flow independently between the transmission line and the STATCOM-SMES unit. The complete proposed power system is implemented in numerical simulation software (Matlab/Simulink) and its performance is validated on the basis of the investigation results obtained.
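    The capacity gain from shunt compensation can be illustrated with the textbook lossless-line model (not the paper's Matlab/Simulink system): holding the line midpoint at rated voltage splits the line into two halves of reactance X/2, doubling the maximum transferable power. The per-unit values below are illustrative.

```python
import numpy as np

X = 0.5   # series line reactance, pu (illustrative value)
V = 1.0   # sending/receiving voltage magnitudes, pu
delta = np.linspace(0.0, np.pi, 1001)   # power angle

# Lossless line: P = V1 * V2 * sin(delta) / X.
p_plain = V * V * np.sin(delta) / X           # uncompensated line
# Midpoint held at V pu by an ideal shunt compensator: each half
# line (reactance X/2) carries P = V^2 * sin(delta/2) / (X/2).
p_mid = 2.0 * V * V * np.sin(delta / 2) / X
```

    The uncompensated maximum is V²/X ≈ 2.0 pu at δ = 90°, while ideal midpoint support raises it to 2V²/X ≈ 4.0 pu at δ = 180°, which is the mechanism behind the ATC improvement the abstract reports.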

  16. Screening of groundwater remedial alternatives for brownfield sites: a comprehensive method integrated MCDA with numerical simulation.

    Science.gov (United States)

    Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu

    2018-06-01

    Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to multiple criteria involving technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved, and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is proposed in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytic hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that this systematic method provides a reliable way to quantify the priority of remedial alternatives.
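    A minimal sketch of the PROMETHEE ranking step described above, assuming the criteria weights have already been produced by an AHP pairwise comparison. The scores, weights, and the simple "usual" preference function are illustrative stand-ins, not the paper's actual criteria framework.

```python
import numpy as np

def promethee_ranking(scores, weights):
    """Net outranking flows (PROMETHEE II) for an alternatives-by-
    criteria score matrix, using the 'usual' preference function
    P(d) = 1 if d > 0 else 0. Higher net flow = preferred."""
    n = scores.shape[0]
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]
            # pi(a,b) - pi(b,a): weighted share of criteria won/lost
            phi[a] += np.sum(weights * (d > 0)) - np.sum(weights * (d < 0))
    return phi / (n - 1)

# Hypothetical scores of three remedial alternatives against cost,
# remedial efficiency and implementation-time criteria.
scores = np.array([[0.6, 0.9, 0.5],
                   [0.8, 0.7, 0.6],
                   [0.5, 0.6, 0.9]])
weights = np.array([0.3, 0.5, 0.2])   # e.g. from an AHP pairwise matrix
phi = promethee_ranking(scores, weights)   # rank alternatives by phi
```

    Net flows always sum to zero; the alternative with the highest flow is the recommended one, which in the paper's workflow would then be checked against the solute transport simulation.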

  17. Boundary integral equation methods and numerical solutions: thin plates on an elastic foundation

    CERN Document Server

    Constanda, Christian; Hamill, William

    2016-01-01

    This book presents and explains a general, efficient, and elegant method for solving the Dirichlet, Neumann, and Robin boundary value problems for the extensional deformation of a thin plate on an elastic foundation. The solutions of these problems are obtained both analytically—by means of direct and indirect boundary integral equation methods (BIEMs)—and numerically, through the application of a boundary element technique. The text discusses the methodology for constructing a BIEM, deriving all the attending mathematical properties with full rigor. The model investigated in the book can serve as a template for the study of any linear elliptic two-dimensional problem with constant coefficients. The representation of the solution in terms of single-layer and double-layer potentials is pivotal in the development of a BIEM, which, in turn, forms the basis for the second part of the book, where approximate solutions are computed with a high degree of accuracy. The book is intended for graduate students and r...

  18. An Integrated Numerical Model for the Design of Coastal Protection Structures

    Directory of Open Access Journals (Sweden)

    Theophanis V. Karambas

    2017-10-01

    Full Text Available In the present work, an integrated coastal engineering numerical model is presented. The model simulates the linear wave propagation, wave-induced circulation, and sediment transport and bed morphology evolution. It consists of three main modules: WAVE_L, WICIR, and SEDTR. The nearshore wave transformation module WAVE_L (WAVE_Linear is based on the hyperbolic-type mild slope equation and is valid for a compound linear wave field near coastal structures where the waves are subjected to the combined effects of shoaling, refraction, diffraction, reflection (total and partial, and breaking. Radiation stress components (calculated from WAVE_L drive the depth averaged circulation module WICIR (Wave Induced CIRculation for the description of the nearshore wave-induced currents. Sediment transport and bed morphology evolution in the nearshore, surf, and swash zone are simulated by the SEDTR (SEDiment TRansport module. The model is tested against experimental data to study the effect of representative coastal protection structures and is applied to a real case study of a coastal engineering project in North Greece, producing accurate and consistent results for a versatile range of layouts.

  19. A database and tool, IM Browser, for exploring and integrating emerging gene and protein interaction data for Drosophila

    Directory of Open Access Journals (Sweden)

    Parrish Jodi R

    2006-04-01

    Full Text Available Abstract Background Biological processes are mediated by networks of interacting genes and proteins. Efforts to map and understand these networks are resulting in the proliferation of interaction data derived from both experimental and computational techniques for a number of organisms. The volume of this data combined with the variety of specific forms it can take has created a need for comprehensive databases that include all of the available data sets, and for exploration tools to facilitate data integration and analysis. One powerful paradigm for the navigation and analysis of interaction data is an interaction graph or map that represents proteins or genes as nodes linked by interactions. Several programs have been developed for graphical representation and analysis of interaction data, yet there remains a need for alternative programs that can provide casual users with rapid easy access to many existing and emerging data sets. Description Here we describe a comprehensive database of Drosophila gene and protein interactions collected from a variety of sources, including low and high throughput screens, genetic interactions, and computational predictions. We also present a program for exploring multiple interaction data sets and for combining data from different sources. The program, referred to as the Interaction Map (IM Browser, is a web-based application for searching and visualizing interaction data stored in a relational database system. Use of the application requires no downloads and minimal user configuration or training, thereby enabling rapid initial access to interaction data. IM Browser was designed to readily accommodate and integrate new types of interaction data as it becomes available. Moreover, all information associated with interaction measurements or predictions and the genes or proteins involved are accessible to the user. 
This allows combined searches and analyses based on either common or technique-specific attributes.

  20. Development and implementation of a custom integrated database with dashboards to assist with hematopathology specimen triage and traffic

    Directory of Open Access Journals (Sweden)

    Elizabeth M Azzato

    2014-01-01

    Full Text Available Background: At some institutions, including ours, bone marrow aspirate specimen triage is complex, with hematopathology triage decisions that need to be communicated to downstream ancillary testing laboratories and many specimen aliquot transfers that are handled outside of the laboratory information system (LIS. We developed a custom integrated database with dashboards to facilitate and streamline this workflow. Methods: We developed user-specific dashboards that allow entry of specimen information by technologists in the hematology laboratory, have custom scripting to present relevant information for the hematopathology service and ancillary laboratories and allow communication of triage decisions from the hematopathology service to other laboratories. These dashboards are web-accessible on the local intranet and accessible from behind the hospital firewall on a computer or tablet. Secure user access and group rights ensure that relevant users can edit or access appropriate records. Results: After database and dashboard design, two-stage beta-testing and user education was performed, with the first focusing on technologist specimen entry and the second on downstream users. Commonly encountered issues and user functionality requests were resolved with database and dashboard redesign. Final implementation occurred within 6 months of initial design; users report improved triage efficiency and reduced need for interlaboratory communications. Conclusions: We successfully developed and implemented a custom database with dashboards that facilitates and streamlines our hematopathology bone marrow aspirate triage. This provides an example of a possible solution to specimen communications and traffic that are outside the purview of a standard LIS.

  1. Semi-empirical γ-ray peak efficiency determination including self-absorption correction based on numerical integration

    International Nuclear Information System (INIS)

    Noguchi, M.; Takeda, K.; Higuchi, H.

    1981-01-01

    A method of γ-ray efficiency determination for extended (plane or bulk) samples based on numerical integration of point source efficiency is studied. The proposed method is widely applicable to samples of various shapes and materials. The geometrical factor in the peak efficiency can easily be corrected for by simply changing the integration region, and γ-ray self-absorption is also corrected by the absorption coefficients for the sample matrix. (author)
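    The proposed correction can be sketched as a double numerical integration of an assumed point-source efficiency curve over the sample volume, with each volume element attenuated according to the matrix absorption coefficient. The geometry, the inverse-square efficiency model, and the vertical-path absorption below are simplifying assumptions for illustration only.

```python
import numpy as np

def extended_source_efficiency(eps_point, mu, thickness, radius,
                               det_dist=5.0, n=400):
    """Peak efficiency of a cylindrical sample, averaged by numerical
    integration of a point-source efficiency over the sample volume.

    eps_point : assumed point-efficiency model, a function of the
                source-to-detector distance (cm)
    mu        : linear attenuation coefficient of the matrix (1/cm);
                self-absorption is taken along a simplified vertical
                escape path of length z
    """
    z = np.linspace(0.0, thickness, n)[:, None]   # depth in sample
    r = np.linspace(0.0, radius, n)[None, :]      # radial offset
    dist = np.sqrt((det_dist + z) ** 2 + r ** 2)
    elem = r * np.ones_like(dist)      # cylindrical element ~ r dr dz
    num = np.sum(eps_point(dist) * np.exp(-mu * z) * elem)
    return num / np.sum(elem)          # emission-weighted average

# Toy inverse-square point-efficiency model; all values illustrative.
eps = extended_source_efficiency(lambda d: 1.0 / d**2,
                                 mu=0.15, thickness=2.0, radius=3.0)
```

    Changing the shape of the sample only changes the integration region, and changing the matrix only changes mu, which is exactly the flexibility the abstract claims for the method.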

  2. Computation of Green function of the Schroedinger-like partial differential equations by the numerical functional integration

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.; Shahbagian, R.R.; Zhidkov, E.P.

    1991-01-01

    A new method for the numerical solution of the boundary problem for Schroedinger-like partial differential equations in R n is elaborated. The method is based on the representation of the multidimensional Green function in the form of a multiple functional integral and on the use of approximation formulas constructed for such integrals. The convergence of the approximations to the exact value is proved, and the remainder of the formulas is estimated. The method reduces the initial differential problem to quadratures. 16 refs.; 7 tabs
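    The functional-integral representation can be illustrated in its simplest form: for the heat equation (the Euclidean Schroedinger-like case with zero potential), the path integral collapses to an expectation over Brownian endpoints, which a Monte Carlo average evaluates directly. The paper develops deterministic approximation formulas instead; this toy example only conveys the representation itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def heat_solution_mc(f, x, t, n_paths=200_000):
    """Feynman-Kac toy example: for u_t = (1/2) u_xx, u(0, .) = f,
    the solution is u(t, x) = E[f(x + W_t)] -- a path integral that,
    with zero potential, depends only on the Brownian endpoint
    W_t ~ N(0, t)."""
    w = rng.normal(0.0, np.sqrt(t), n_paths)
    return f(x + w).mean()

# With f(y) = y^2 the exact solution is x^2 + t.
u = heat_solution_mc(lambda y: y**2, x=1.0, t=0.5)
```

    With a nonzero potential the expectation acquires a weight over the whole path, and deterministic quadrature formulas of the kind the paper constructs replace the Monte Carlo average.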

  3. Integrating information systems : linking global business goals to local database applications

    NARCIS (Netherlands)

    Dignum, F.P.M.; Houben, G.J.P.M.

    1999-01-01

    This paper describes a new approach to design modern information systems that offer an integrated access to the data and knowledge that is available in local applications. By integrating the local data management activities into one transparent information distribution process, modern organizations

  4. A numerical study of the integral equations for the laser fields in free-electron lasers

    International Nuclear Information System (INIS)

    Yoo, J. G.; Park, S. H.; Jeong, Y. U.; Lee, B. C.; Rhee, Y. J.; Cho, S. O.

    2004-01-01

    The dynamics of the radiation fields in free-electron lasers is investigated on the basis of the integro-differential equations in the one-dimensional formulation. For simple cases we solved the integro-differential equations analytically and numerically to test our numerical procedures developed on the basis of the Filon method. The numerical results showed good agreement with the analytical solutions. To confirm the legitimacy of the numerical package, we carried out numerical studies on the inhomogeneous broadening effects, where no analytic solutions are available, due to the energy spread and the emittance of the electron beam.
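    The Filon idea underlying the numerical procedures can be sketched as follows: replace the smooth amplitude by a piecewise-linear interpolant and integrate its product with the oscillatory kernel exactly on each panel, so accuracy does not degrade as the frequency grows. This is a Filon-trapezoidal variant for illustration, not the authors' implementation.

```python
import numpy as np

def filon_trapezoidal(f, a, b, omega, n=64):
    """Approximate I = integral_a^b f(x) * exp(i*omega*x) dx by
    replacing f with its piecewise-linear interpolant and integrating
    the product with the oscillatory factor exactly on each panel."""
    x = np.linspace(a, b, n + 1)
    fx = f(x)
    iw = 1j * omega

    def antideriv(x0, c0, c1):
        # exact antiderivative of (c0 + c1*x) * exp(i*omega*x)
        return np.exp(iw * x0) * ((c0 + c1 * x0) / iw + c1 / omega**2)

    total = 0.0 + 0.0j
    for k in range(n):
        c1 = (fx[k + 1] - fx[k]) / (x[k + 1] - x[k])  # panel slope
        c0 = fx[k] - c1 * x[k]                        # panel intercept
        total += antideriv(x[k + 1], c0, c1) - antideriv(x[k], c0, c1)
    return total

omega = 40.0   # strongly oscillatory: plain quadrature would need n >> omega
approx = filon_trapezoidal(lambda x: x, 0.0, np.pi, omega, n=16)
```

    Because the amplitude here is itself linear, the 16-panel result is already exact to roundoff even though the integrand oscillates about 20 times over the interval.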

  5. A reference methylome database and analysis pipeline to facilitate integrative and comparative epigenomics.

    Directory of Open Access Journals (Sweden)

    Qiang Song

    Full Text Available DNA methylation is implicated in a surprising diversity of regulatory and evolutionary processes and diseases in eukaryotes. The introduction of whole-genome bisulfite sequencing has enabled the study of DNA methylation at single-base resolution, revealing many new aspects of DNA methylation and highlighting the usefulness of methylome data in understanding a variety of genomic phenomena. As the number of publicly available whole-genome bisulfite sequencing studies reaches into the hundreds, reliable and convenient tools for comparing and analyzing methylomes become increasingly important. We present MethPipe, a pipeline for both low- and high-level methylome analysis, and MethBase, an accompanying database of annotated methylomes from the public domain. Together these resources enable researchers to extract interesting features from methylomes and compare them with those identified in public methylomes in our database.

  6. EchoBASE: an integrated post-genomic database for Escherichia coli.

    Science.gov (United States)

    Misra, Raju V; Horler, Richard S P; Reindl, Wolfgang; Goryanin, Igor I; Thomas, Gavin H

    2005-01-01

    EchoBASE (http://www.ecoli-york.org) is a relational database designed to contain and manipulate information from post-genomic experiments using the model bacterium Escherichia coli K-12. Its aim is to collate information from a wide range of sources to provide clues to the functions of the approximately 1500 gene products that have no confirmed cellular function. The database is built on an enhanced annotation of the updated genome sequence of strain MG1655 and the association of experimental data with the E.coli genes and their products. Experiments that can be held within EchoBASE include proteomics studies, microarray data, protein-protein interaction data, structural data and bioinformatics studies. EchoBASE also contains annotated information on 'orphan' enzyme activities from this microbe to aid characterization of the proteins that catalyse these elusive biochemical reactions.

  7. The Eukaryotic Pathogen Databases: a functional genomic resource integrating data from human and veterinary parasites.

    Science.gov (United States)

    Harb, Omar S; Roos, David S

    2015-01-01

    Over the past 20 years, advances in high-throughput biological techniques and the availability of computational resources, including fast Internet access, have resulted in an explosion of large genome-scale data sets ("big data"). While such data are readily available for download, personal use and analysis from a variety of repositories, such analysis often requires access to seldom-available computational skills. As a result, a number of databases have emerged to provide scientists with online tools enabling the interrogation of data without the need for sophisticated computational skills beyond basic knowledge of Internet browser utility. This chapter focuses on the Eukaryotic Pathogen Databases (EuPathDB: http://eupathdb.org) Bioinformatic Resource Center (BRC) and illustrates some of the available tools and methods.

  8. An integrative clinical database and diagnostics platform for biomarker identification and analysis in ion mobility spectra of human exhaled air

    DEFF Research Database (Denmark)

    Schneider, Till; Hauschild, Anne-Christin; Baumbach, Jörg Ingo

    2013-01-01

    data integration and semi-automated data analysis, in particular with regard to the rapid data accumulation, emerging from the high-throughput nature of the MCC/IMS technology. Here, we present a comprehensive database application and analysis platform, which combines metabolic maps with heterogeneous...... biomedical data in a well-structured manner. The design of the database is based on a hybrid of the entity-attribute-value (EAV) model and the EAV-CR, which incorporates the concepts of classes and relationships. Additionally it offers an intuitive user interface that provides easy and quick access...... to have a clear understanding of the detailed composition of human breath. Therefore, in addition to the clinical studies, there is a need for a flexible and comprehensive centralized data repository, which is capable of gathering all kinds of related information. Moreover, there is a demand for automated...
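    The entity-attribute-value (EAV) layout mentioned above can be sketched in a few lines of SQL: all measurements live in one narrow table, so new attribute types require no schema change. Table and attribute names here are invented for illustration and are not the platform's actual schema.

```python
import sqlite3

# One narrow EAV table holds arbitrary attributes per entity, so a
# new measurement type is just a new attribute string, not a new column.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE entity (id INTEGER PRIMARY KEY, kind TEXT);
    CREATE TABLE eav (
        entity_id INTEGER REFERENCES entity(id),
        attribute TEXT NOT NULL,
        value     TEXT,
        PRIMARY KEY (entity_id, attribute)
    );
""")
con.execute("INSERT INTO entity VALUES (1, 'breath_sample')")
con.executemany("INSERT INTO eav VALUES (1, ?, ?)",
                [("patient_age", "54"),
                 ("diagnosis", "COPD"),
                 ("peak_intensity", "0.82")])
rows = con.execute("SELECT attribute, value FROM eav "
                   "WHERE entity_id = 1 ORDER BY attribute").fetchall()
```

    The EAV-CR refinement the abstract mentions adds class and relationship tables on top of this core so that entities can be typed and linked, at the cost of queries that must pivot rows back into columns.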

  9. TransAtlasDB: an integrated database connecting expression data, metadata and variants

    Science.gov (United States)

    Adetunji, Modupeore O; Lamont, Susan J; Schmidt, Carl J

    2018-01-01

    Abstract High-throughput transcriptome sequencing (RNAseq) is the universally applied method for target-free transcript identification and gene expression quantification, generating huge amounts of data. The difficulty of accessing such data and interpreting results can be a major impediment to postulating suitable hypotheses, so an innovative storage solution that addresses limitations such as hard disk storage requirements, efficiency and reproducibility is paramount. By offering a uniform data storage and retrieval mechanism, various data can be compared and easily investigated. We present a sophisticated system, TransAtlasDB, which incorporates a hybrid architecture of relational and NoSQL databases for fast and efficient storage, processing and querying of large datasets from transcript expression analysis with corresponding metadata, as well as gene-associated variants (such as SNPs) and their predicted gene effects. TransAtlasDB provides a data model for accurate storage of the large amount of data derived from RNAseq analysis, together with methods of interacting with the database, either through command-line data management workflows, written in Perl, with useful functionality that simplifies the storage and manipulation of the massive amounts of data generated from RNAseq analysis, or through the web interface. The database application is currently modeled to handle analysis data from agricultural species and will be expanded to include more species groups. Overall, TransAtlasDB aims to serve as an accessible repository for the large, complex results files derived from RNAseq gene expression profiling and variant analysis. Database URL: https://modupeore.github.io/TransAtlasDB/ PMID:29688361

  10. International integral experiments databases in support of nuclear data and code validation

    International Nuclear Information System (INIS)

    Briggs, J. Blair; Gado, Janos; Hunter, Hamilton; Kodeli, Ivan; Salvatores, Massimo; Sartori, Enrico

    2002-01-01

    The OECD/NEA Nuclear Science Committee (NSC) has identified the need to establish international databases containing all the important experiments that are available for sharing among the specialists. The NSC has set up or sponsored specific activities to achieve this. The aim is to preserve them in an agreed standard format in computer accessible form, to use them for international activities involving validation of current and new calculational schemes including computer codes and nuclear data libraries, for assessing uncertainties, confidence bounds and safety margins, and to record measurement methods and techniques. The databases so far established or in preparation related to nuclear data validation cover the following areas: SINBAD - A Radiation Shielding Experiments database encompassing reactor shielding, fusion blanket neutronics, and accelerator shielding. ICSBEP - International Criticality Safety Benchmark Experiments Project Handbook, with more than 2500 critical configurations with different combination of materials and spectral indices. IRPhEP - International Reactor Physics Experimental Benchmarks Evaluation Project. The different projects are described in the following including results achieved, work in progress and planned. (author)

  11. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis

    Directory of Open Access Journals (Sweden)

    Raquel L. Costa

    2017-07-01

    Full Text Available There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes, and these may additionally be integrated with other biological databases, such as protein-protein interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties either for subsequent inspection of results or for meta-analysis through the incorporation of new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Considerable effort is equally required to run in silico experiments to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clusterization and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were

  12. Sull'Integrazione delle Strutture Numeriche nella Scuola dell'Obbligo (Integrating Numerical Structures in Mandatory School).

    Science.gov (United States)

    Bonotto, C.

    1995-01-01

    Attempted to verify knowledge regarding decimal and rational numbers in children ages 10-14. Discusses how pupils can receive and assimilate extensions of the number system from natural numbers to decimals and fractions and later can integrate this extension into a single and coherent numerical structure. (Author/MKR)

  13. New methods for the numerical integration of ordinary differential equations and their application to the equations of motion of spacecraft

    Science.gov (United States)

    Banyukevich, A.; Ziolkovski, K.

    1975-01-01

    A number of hybrid methods for solving Cauchy problems are described, on the basis of an evaluation of the advantages of single- and multiple-point numerical integration methods. The selection criterion is the principle of minimizing computer time. The methods discussed include the Nordsieck method, the Bulirsch-Stoer extrapolation method, and the method of recursive Taylor-Steffensen power series.
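
    The Bulirsch-Stoer extrapolation method mentioned above can be sketched briefly: integrate the interval with the modified midpoint rule at several substep counts, then Richardson-extrapolate the results to zero step size. The following is a minimal, generic Python sketch of that idea, not code from the paper.

```python
import math

def modified_midpoint(f, t0, y0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 with n modified-midpoint substeps."""
    h = (t1 - t0) / n
    z0, z1 = y0, y0 + h * f(t0, y0)
    for i in range(1, n):
        z0, z1 = z1, z0 + 2 * h * f(t0 + i * h, z1)
    return 0.5 * (z0 + z1 + h * f(t1, z1))

def bulirsch_stoer_step(f, t0, y0, t1, orders=(2, 4, 6, 8)):
    """Richardson-extrapolate modified-midpoint results to h -> 0.
    The midpoint rule's error expands in even powers of h, so we
    extrapolate a polynomial in h^2 (Neville's scheme)."""
    T = [modified_midpoint(f, t0, y0, t1, n) for n in orders]
    h2 = [((t1 - t0) / n) ** 2 for n in orders]
    for k in range(1, len(T)):
        for i in range(len(T) - 1, k - 1, -1):
            T[i] = T[i] + (T[i] - T[i - 1]) * h2[i] / (h2[i - k] - h2[i])
    return T[-1]

# y' = y, y(0) = 1  ->  y(1) = e
approx = bulirsch_stoer_step(lambda t, y: y, 0.0, 1.0, 1.0)
print(abs(approx - math.e))  # error is tiny (well below 1e-4)
```

    With only four modified-midpoint passes over a single macro-step, the extrapolated value is already far more accurate than any of the individual passes, which is why the method is attractive when minimizing computer time.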

  14. BOKASUN: a fast and precise numerical program to calculate the Master Integrals of the two-loop sunrise diagrams

    OpenAIRE

    Caffo, Michele; Czyz, Henryk; Gunia, Michal; Remiddi, Ettore

    2008-01-01

    We present the program BOKASUN for fast and precise evaluation of the Master Integrals of the two-loop self-mass sunrise diagram for arbitrary values of the internal masses and the external four-momentum. We use a combination of two methods: a Bernoulli accelerated series expansion and a Runge-Kutta numerical solution of a system of linear differential equations.
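
    The second ingredient named above, a Runge-Kutta solution of a system of linear differential equations, can be illustrated with a classical fourth-order Runge-Kutta integrator for y'(x) = A(x) y. This is a generic sketch, not the BOKASUN code.

```python
import math

def rk4_linear(A, x0, y0, x1, n):
    """Integrate the linear system y'(x) = A(x) @ y with classical RK4 in n steps.
    y is a list of floats; A(x) returns a matrix as a list of rows."""
    def matvec(M, v):
        return [sum(m * w for m, w in zip(row, v)) for row in M]
    h = (x1 - x0) / n
    x, y = x0, list(y0)
    for _ in range(n):
        k1 = matvec(A(x), y)
        k2 = matvec(A(x + h / 2), [yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = matvec(A(x + h / 2), [yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = matvec(A(x + h), [yi + h * ki for yi, ki in zip(y, k3)])
        y = [yi + h / 6 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        x += h
    return y

# Harmonic oscillator y'' = -y written as a first-order linear system:
# y[0]' = y[1], y[1]' = -y[0], with y(0) = (0, 1), so y[0](x) = sin(x).
y = rk4_linear(lambda x: [[0.0, 1.0], [-1.0, 0.0]], 0.0, [0.0, 1.0],
               math.pi / 2, 100)
# y[0] approximates sin(pi/2) = 1
```
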

  15. Implementing Families of Implicit Chebyshev Methods with Exact Coefficients for the Numerical Integration of First- and Second-Order Differential Equations

    National Research Council Canada - National Science Library

    Mitchell, Jason

    2002-01-01

    A method is presented for the generation of exact numerical coefficients found in two families of implicit Chebyshev methods for the numerical integration of first- and second-order ordinary differential equations...

  16. Development of an Information Database for the Integrated Airline Management System (IAMS

    Directory of Open Access Journals (Sweden)

    Bogdane Ruta

    2017-08-01

    Full Text Available Under present conditions, the activity of any enterprise can be represented as a combination of operational processes. Each of them corresponds to relevant airline management systems. By combining two or more management systems, it is possible to obtain an integrated management system. For the effective functioning of an integrated management system, an appropriate information system should be developed. This article proposes a model of such an information system.

  17. An Integrative Database System of Agro-Ecology for the Black Soil Region of China

    Directory of Open Access Journals (Sweden)

    Cuiping Ge

    2007-12-01

    Full Text Available The comprehensive database system of the Northeast agro-ecology of black soil (CSDB_BL) is user-friendly software designed to store and manage large amounts of data on agriculture. The data was collected in an efficient and systematic way by long-term experiments and observations of black land and statistics information. It is based on the ORACLE database management system and the interface is written in the PB language. The database has the following main facilities: (1) runs on Windows platforms; (2) facilitates data entry from *.dbf to ORACLE or creates ORACLE tables directly; (3) has a metadata facility that describes the methods used in the laboratory or in the observations; (4) data can be transferred to an expert system for simulation analysis and estimates made by Visual C++ and Visual Basic; (5) can be connected with GIS, so it is easy to analyze changes in land use; and (6) allows metadata and data entity to be shared on the internet. The following datasets are included in CSDB_BL: long-term experiments and observations of water, soil, climate, biology, special research projects, and a natural resource survey of Hailun County in the 1980s; images from remote sensing, graphs of vectors and grids, and statistics from the Northeast of China. CSDB_BL can be used in the research and evaluation of agricultural sustainability nationally, regionally, or locally. Also, it can be used as a tool to assist the government in planning for agricultural development. Expert systems connected with CSDB_BL can give farmers directions for farm planting management.

  18. A Reaction Database for Small Molecule Pharmaceutical Processes Integrated with Process Information

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Anantpinijwatna, Amata; Woodley, John

    2017-01-01

    This article describes the development of a reaction database with the objective to collect data for multiphase reactions involved in small molecule pharmaceutical processes with a search engine to retrieve necessary data in investigations of reaction-separation schemes, such as the role of organic......; compounds participating in the reaction; use of organic solvents and their function; information for single step and multistep reactions; target products; reaction conditions and reaction data. Information for reactor scale-up together with information for the separation and other relevant information...

  19. Integrated Data Acquisition, Storage, Retrieval and Processing Using the COMPASS DataBase (CDB)

    Czech Academy of Sciences Publication Activity Database

    Urban, Jakub; Pipek, Jan; Hron, Martin; Janky, Filip; Papřok, Richard; Peterka, Matěj; Duarte, A.S.

    2014-01-01

    Roč. 89, č. 5 (2014), s. 712-716 ISSN 0920-3796. [Ninth IAEA TM on Control, Data Acquisition, and Remote Participation for Fusion Research. Hefei, 06.05.2013-10.05.2013] R&D Projects: GA ČR GP13-38121P; GA ČR GAP205/11/2470; GA MŠk(CZ) LM2011021 Institutional support: RVO:61389021 Keywords : tokamak * CODAC * database Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 1.152, year: 2014 http://dx.doi.org/10.1016/j.fusengdes.2014.03.032

  20. Integration of Narrative Processing, Data Fusion, and Database Updating Techniques in an Automated System.

    Science.gov (United States)

    1981-10-29

    are implemented, respectively, in the files "W-Update", "W-combine" and "RW-Copy", listed in the appendix. The appendix begins with a typescript of an... the typescript) and the copying process (steps 45 and 46) are shown as human actions in the typescript, but can be performed easily by a "master... for Natural Language, M. Marcus, MIT Press, 1980. APPENDIX: DATABASE UPDATING EXPERIMENT. CONTENTS: Typescript of an experiment in Rosie

  1. Databases in welding engineering - definition and starting phase of the integrated welding engineering information system

    International Nuclear Information System (INIS)

    Barthelmess, H.; Queren, W.; Stracke, M.

    1989-01-01

    The structure and function of the Information Association for Welding Engineering, newly established by the Deutscher Verband fuer Schweisstechnik, are presented. Examined are: special literature for welding techniques - value and prospects; databases accessible to the public for information on welding techniques; the concept for the Information Association for Welding Engineering; the four phases of establishing databases for facts and expert systems of the Information Association for Welding Engineering; and the pilot project 'MVT-Data base' (hot crack data base for data of modified Varestraint-Transvarestraint tests). (orig./MM) [de

  2. Data integration for European marine biodiversity research: creating a database on benthos and plankton to study large-scale patterns and long-term changes

    NARCIS (Netherlands)

    Vandepitte, L.; Vanhoorne, B.; Kraberg, A.; Anisimova, N.; Antoniadou, C.; Araújo, R.; Bartsch, I.; Beker, B.; Benedetti-Cecchi, L.; Bertocci, I.; Cochrane, S.J.; Cooper, K.; Craeymeersch, J.A.; Christou, E.; Crisp, D.J.; Dahle, S.; de Boissier, M.; De Kluijver, M.; Denisenko, S.; De Vito, D.; Duineveld, G.; Escaravage, V.L.; Fleischer, D.; Fraschetti, S.; Giangrande, A.; Heip, C.H.R.; Hummel, H.; Janas, U.; Karez, R.; Kedra, M.; Kingston, P.; Kuhlenkamp, R.; Libes, M.; Martens, P.; Mees, J.; Mieszkowska, N.; Mudrak, S.; Munda, I.; Orfanidis, S.; Orlando-Bonaca, M.; Palerud, R.; Rachor, E.; Reichert, K.; Rumohr, H.; Schiedek, D.; Schubert, P.; Sistermans, W.C.H.; Sousa Pinto, I.S.; Southward, A.J.; Terlizzi, A.; Tsiaga, E.; Van Beusekom, J.E.E.; Vanden Berghe, E.; Warzocha, J.; Wasmund, N.; Weslawski, J.M.; Widdicombe, C.; Wlodarska-Kowalczuk, M.; Zettler, M.L.

    2010-01-01

    The general aim of setting up a central database on benthos and plankton was to integrate long-, medium- and short-term datasets on marine biodiversity. Such a database makes it possible to analyse species assemblages and their changes on spatial and temporal scales across Europe. Data collation

  3. An integrated data-analysis and database system for AMS {sup 14}C

    Energy Technology Data Exchange (ETDEWEB)

    Kjeldsen, Henrik, E-mail: kjeldsen@phys.au.d [AMS 14C Dating Centre, Department of Physics and Astronomy, Aarhus University, Aarhus (Denmark); Olsen, Jesper [Department of Earth Sciences, Aarhus University, Aarhus (Denmark); Heinemeier, Jan [AMS 14C Dating Centre, Department of Physics and Astronomy, Aarhus University, Aarhus (Denmark)

    2010-04-15

    AMSdata is the name of a combined database and data-analysis system for AMS {sup 14}C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS {sup 14}C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  4. An integrated data-analysis and database system for AMS 14C

    International Nuclear Information System (INIS)

    Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan

    2010-01-01

    AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  5. MIPS Arabidopsis thaliana Database (MAtDB): an integrated biological knowledge resource based on the first complete plant genome

    Science.gov (United States)

    Schoof, Heiko; Zaccaria, Paolo; Gundlach, Heidrun; Lemcke, Kai; Rudd, Stephen; Kolesov, Grigory; Arnold, Roland; Mewes, H. W.; Mayer, Klaus F. X.

    2002-01-01

    Arabidopsis thaliana is the first plant for which the complete genome has been sequenced and published. Annotation of complex eukaryotic genomes requires more than the assignment of genetic elements to the sequence. Besides completing the list of genes, we need to discover their cellular roles, their regulation and their interactions in order to understand the workings of the whole plant. The MIPS Arabidopsis thaliana Database (MAtDB; http://mips.gsf.de/proj/thal/db) started out as a repository for genome sequence data in the European Scientists Sequencing Arabidopsis (ESSA) project and the Arabidopsis Genome Initiative. Our aim is to transform MAtDB into an integrated biological knowledge resource by integrating diverse data, tools, query and visualization capabilities and by creating a comprehensive resource for Arabidopsis as a reference model for other species, including crop plants. PMID:11752263

  6. Disciplining Change, Displacing Frictions. Two Structural Dimensions of Digital Circulation Across Land Registry Database Integration

    NARCIS (Netherlands)

    Pelizza, Annalisa

    2016-01-01

    Data acquire meaning through circulation. Yet most approaches to high-quality data aim to flatten this stratification of meanings. In government, data quality is achieved through integrated systems of authentic registers that reduce multiple trajectories to a single, official one. These systems can

  7. MEGADOCK-Web: an integrated database of high-throughput structure-based protein-protein interaction predictions.

    Science.gov (United States)

    Hayashi, Takanori; Matsuzaki, Yuri; Yanagisawa, Keisuke; Ohue, Masahito; Akiyama, Yutaka

    2018-05-08

    Protein-protein interactions (PPIs) play several roles in living cells, and computational PPI prediction is a major focus of many researchers. The three-dimensional (3D) structure and binding surface are important for the design of PPI inhibitors. Therefore, rigid-body protein-protein docking calculations for two protein structures are expected to allow elucidation of PPIs different from known complexes in terms of 3D structures because known PPI information is not explicitly required. We have developed rapid PPI prediction software based on protein-protein docking, called MEGADOCK. In order to fully utilize the benefits of computational PPI predictions, it is necessary to construct a comprehensive database to gather prediction results and their predicted 3D complex structures and to make them easily accessible. Although several databases exist that provide predicted PPIs, the previous databases do not contain a sufficient number of entries for the purpose of discovering novel PPIs. In this study, we constructed an integrated database of MEGADOCK PPI predictions, named MEGADOCK-Web. MEGADOCK-Web provides more than 10 times as many PPI predictions as previous databases and enables users to conduct PPI predictions that cannot be found in conventional PPI prediction databases. In MEGADOCK-Web, there are 7528 protein chains and 28,331,628 predicted PPIs from all possible combinations of those proteins. Each protein structure is annotated with PDB ID, chain ID, UniProt AC, related KEGG pathway IDs, and known PPI pairs. Additionally, MEGADOCK-Web provides four powerful functions: 1) searching precalculated PPI predictions, 2) providing annotations for each predicted protein pair with an experimentally known PPI, 3) visualizing candidates that may interact with the query protein on biochemical pathways, and 4) visualizing predicted complex structures through a 3D molecular viewer. MEGADOCK-Web provides a huge amount of comprehensive PPI predictions based on

  8. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    OpenAIRE

    Alexandra Maria Ioana FLOREA

    2013-01-01

    In order to maintain a competitive edge in a very active banking market the implementation of a web-based solution to standardize, optimize and manage the flow of sales / pre-sales and generating new leads is requested by a company. This article presents the realization of a development framework for software interoperability in the banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for ...

  9. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource, indeed it guides all business decisions in most of the computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This issue is especially true in cloud computing environments, where data owners cannot control fundamental data aspects, like the physical storage of data and the control of its accesses. Blockchain has recently emerged as a fascinati...

  10. Design Integration of Man-Machine Interface (MMI) Display Drawings and MMI Database

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Jun; Seo, Kwang Rak; Song, Jeong Woog; Kim, Dae Ho; Han, Jung A [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    The conventional Main Control Room (MCR) was designed using hardwired controllers and analog indications mounted on control boards for control and acquisition of plant information. This contrasts with the advanced MCR design, where Flat Panel Displays (FPDs) with soft controls and mimic displays are used. The advanced design requires MMI display drawings that replace the conventional control board layout drawings and component lists. The data is linked to the related objects of the MMI displays. Compilation of the data into the DB is generally done manually, which tends to introduce errors and discrepancies. Also, updating and managing the DB is difficult due to the huge number of entries, and each update must closely track the changes in the associated drawing. Therefore, automating the DB update whenever a related drawing is updated would be quite beneficial. An attempt is made to develop a new method to integrate the MMI display drawing design and the DB management. This would significantly reduce the number of errors and improve design quality. The design integration of the MMI display drawing and MMI DB is explained concisely in this paper. The existing method involved separately inputting design data for the MMI display drawings and the DB. This caused potential data discrepancies and errors, as well as an update time lag between related drawings and the DB, and led to the development of an integrated design process that automates the design data input activity.

  11. CyanoEXpress: A web database for exploration and visualisation of the integrated transcriptome of cyanobacterium Synechocystis sp. PCC6803.

    Science.gov (United States)

    Hernandez-Prieto, Miguel A; Futschik, Matthias E

    2012-01-01

    Synechocystis sp. PCC6803 is one of the best studied cyanobacteria and an important model organism for our understanding of photosynthesis. The early availability of its complete genome sequence initiated numerous transcriptome studies, which have generated a wealth of expression data. Analysis of the accumulated data can be a powerful tool to study transcription in a comprehensive manner and to reveal underlying regulatory mechanisms, as well as to annotate genes whose functions are yet unknown. However, use of divergent microarray platforms, as well as distributed data storage make meta-analyses of Synechocystis expression data highly challenging, especially for researchers with limited bioinformatic expertise and resources. To facilitate utilisation of the accumulated expression data for a wider research community, we have developed CyanoEXpress, a web database for interactive exploration and visualisation of transcriptional response patterns in Synechocystis. CyanoEXpress currently comprises expression data for 3073 genes and 178 environmental and genetic perturbations obtained in 31 independent studies. At present, CyanoEXpress constitutes the most comprehensive collection of expression data available for Synechocystis and can be freely accessed. The database is available for free at http://cyanoexpress.sysbiolab.eu.

  12. An Integrative Clinical Database and Diagnostics Platform for Biomarker Identification and Analysis in Ion Mobility Spectra of Human Exhaled Air

    Directory of Open Access Journals (Sweden)

    Schneider Till

    2013-06-01

    Full Text Available Over the last decade the evaluation of odors and vapors in human breath has gained more and more attention, particularly in the diagnostics of pulmonary diseases. Ion mobility spectrometry coupled with multi-capillary columns (MCC/IMS) is a well-known technology for detecting volatile organic compounds (VOCs) in air. It is a comparatively inexpensive, non-invasive, high-throughput method, which is able to handle the moisture that comes with human exhaled air, and allows for characterizing of VOCs in very low concentrations. To identify discriminating compounds as biomarkers, it is necessary to have a clear understanding of the detailed composition of human breath. Therefore, in addition to the clinical studies, there is a need for a flexible and comprehensive centralized data repository, which is capable of gathering all kinds of related information. Moreover, there is a demand for automated data integration and semi-automated data analysis, in particular with regard to the rapid data accumulation, emerging from the high-throughput nature of the MCC/IMS technology. Here, we present a comprehensive database application and analysis platform, which combines metabolic maps with heterogeneous biomedical data in a well-structured manner. The design of the database is based on a hybrid of the entity-attribute-value (EAV) model and the EAV-CR, which incorporates the concepts of classes and relationships. Additionally it offers an intuitive user interface that provides easy and quick access to the platform's functionality: automated data integration and integrity validation, versioning and roll-back strategy, data retrieval as well as semi-automatic data mining and machine learning capabilities. The platform will support MCC/IMS-based biomarker identification and validation. The software, schemata, data sets and further information are publicly available at http://imsdb.mpi-inf.mpg.de.
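
    The entity-attribute-value design described above can be sketched with SQLite; the table and column names below are illustrative, not those of the actual platform. The key property is that each (entity, attribute) pair is one row, so new attributes need no schema change.

```python
import sqlite3

# Minimal EAV schema: one row per (entity, attribute) pair
# instead of one column per attribute.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE entity (id INTEGER PRIMARY KEY, class TEXT NOT NULL);
CREATE TABLE eav (
    entity_id INTEGER NOT NULL REFERENCES entity(id),
    attribute TEXT NOT NULL,
    value TEXT,
    PRIMARY KEY (entity_id, attribute)
);
""")
con.execute("INSERT INTO entity (id, class) VALUES (1, 'measurement')")
rows = [(1, "compound", "benzaldehyde"), (1, "retention_time_s", "12.4")]
con.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

# Adding a new attribute is just another row, no ALTER TABLE needed:
con.execute("INSERT INTO eav VALUES (1, 'drift_time_ms', '8.1')")

attrs = dict(con.execute(
    "SELECT attribute, value FROM eav WHERE entity_id = 1"))
print(attrs["drift_time_ms"])  # -> 8.1
```

    The EAV-CR variant mentioned in the abstract layers classes and relationships on top of this core; the `class` column above hints at that direction.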

  13. IMPACT web portal: oncology database integrating molecular profiles with actionable therapeutics.

    Science.gov (United States)

    Hintzsche, Jennifer D; Yoo, Minjae; Kim, Jihye; Amato, Carol M; Robinson, William A; Tan, Aik Choon

    2018-04-20

    With the advancement of next generation sequencing technology, researchers are now able to identify important variants and structural changes in DNA and RNA in cancer patient samples. With this information, we can now correlate specific variants and/or structural changes with actionable therapeutics known to inhibit these variants. We introduce the creation of the IMPACT Web Portal, a new online resource that connects molecular profiles of tumors to approved drugs, investigational therapeutics and pharmacogenetics associated drugs. IMPACT Web Portal contains a total of 776 drugs connected to 1326 target genes and 435 target variants, fusion, and copy number alterations. The online IMPACT Web Portal allows users to search for various genetic alterations and connects them to three levels of actionable therapeutics. The results are categorized into 3 levels: Level 1 contains approved drugs separated into two groups; Level 1A contains approved drugs with variant specific information while Level 1B contains approved drugs with gene level information. Level 2 contains drugs currently in oncology clinical trials. Level 3 provides pharmacogenetic associations between approved drugs and genes. IMPACT Web Portal allows for sequencing data to be linked to actionable therapeutics for translational and drug repurposing research. The IMPACT Web Portal online resource allows users to query genes and variants to approved and investigational drugs. We envision that this resource will be a valuable database for personalized medicine and drug repurposing. IMPACT Web Portal is freely available for non-commercial use at http://tanlab.ucdenver.edu/IMPACT .

  14. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2013-05-01

    Full Text Available In order to maintain a competitive edge in a very active banking market the implementation of a web-based solution to standardize, optimize and manage the flow of sales / pre-sales and generating new leads is requested by a company. This article presents the realization of a development framework for software interoperability in the banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for security and confidentiality of stored data and also on presenting the identified techniques and procedures to implement these requirements.

  15. The Planteome database: an integrated resource for reference ontologies, plant genomics and phenomics

    Science.gov (United States)

    Cooper, Laurel; Meier, Austin; Laporte, Marie-Angélique; Elser, Justin L; Mungall, Chris; Sinn, Brandon T; Cavaliere, Dario; Carbon, Seth; Dunn, Nathan A; Smith, Barry; Qu, Botong; Preece, Justin; Zhang, Eugene; Todorovic, Sinisa; Gkoutos, Georgios; Doonan, John H; Stevenson, Dennis W; Arnaud, Elizabeth

    2018-01-01

    Abstract The Planteome project (http://www.planteome.org) provides a suite of reference and species-specific ontologies for plants and annotations to genes and phenotypes. Ontologies serve as common standards for semantic integration of a large and growing corpus of plant genomics, phenomics and genetics data. The reference ontologies include the Plant Ontology, Plant Trait Ontology and the Plant Experimental Conditions Ontology developed by the Planteome project, along with the Gene Ontology, Chemical Entities of Biological Interest, Phenotype and Attribute Ontology, and others. The project also provides access to species-specific Crop Ontologies developed by various plant breeding and research communities from around the world. We provide integrated data on plant traits, phenotypes, and gene function and expression from 95 plant taxa, annotated with reference ontology terms. The Planteome project is developing a plant gene annotation platform, Planteome Noctua, to facilitate community engagement. All the Planteome ontologies are publicly available and are maintained at the Planteome GitHub site (https://github.com/Planteome) for sharing, tracking revisions and new requests. The annotated data are freely accessible from the ontology browser (http://browser.planteome.org/amigo) and our data repository. PMID:29186578

  16. A Fortran program for the numerical integration of the one-dimensional Schroedinger equation using exponential and Bessel fitting methods

    International Nuclear Information System (INIS)

    Cash, J.R.; Raptis, A.D.; Simos, T.E.

    1990-01-01

    An efficient algorithm is described for the accurate numerical integration of the one-dimensional Schroedinger equation. This algorithm uses a high-order, variable-step, Runge-Kutta-like method in the region where the potential term dominates, and an exponential- or Bessel-fitted method in the asymptotic region. This approach can be used to compute scattering phase shifts in an efficient and reliable manner. A Fortran program which implements this algorithm is provided and some test results are given. (orig.)
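
    For context, a standard workhorse for equations of the form y'' = f(x) y, such as the radial Schroedinger equation, is the Numerov recurrence. The algorithm in the abstract uses more elaborate variable-step and exponentially fitted formulas, but the following generic Python sketch illustrates the kind of propagation involved.

```python
import math

def numerov(f, x0, h, y0, y1, n):
    """Propagate y'' = f(x) * y on a uniform grid with the Numerov
    recurrence (local error O(h^6)), starting from y(x0)=y0, y(x0+h)=y1.
    Returns the list [y0, y1, ..., y(x0 + n*h)]."""
    ys = [y0, y1]
    for i in range(1, n):
        x = x0 + i * h                      # current grid point
        fm, f0, fp = f(x - h), f(x), f(x + h)
        y_next = (2 * ys[-1] * (1 + 5 * h * h * f0 / 12)
                  - ys[-2] * (1 - h * h * fm / 12)) / (1 - h * h * fp / 12)
        ys.append(y_next)
    return ys

# Check on y'' = -y (f = -1), whose solution with y(0)=0 is sin(x):
h = 0.01
ys = numerov(lambda x: -1.0, 0.0, h, 0.0, math.sin(h), 314)
# ys[314] approximates sin(3.14)
```
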

  17. Integration of numerical modeling and observations for the Gulf of Naples monitoring network

    Science.gov (United States)

    Iermano, I.; Uttieri, M.; Zambianchi, E.; Buonocore, B.; Cianelli, D.; Falco, P.; Zambardino, G.

    2012-04-01

    Lethal effects of mineral oils on fragile marine and coastal ecosystems are now well known. Risks and damages caused by a maritime accident can be reduced with the help of better forecasts and efficient monitoring systems. The MED project TOSCA (Tracking Oil Spills and Coastal Awareness Network), which gathers 13 partners from 4 Mediterranean countries, has been designed to help create a better response system to maritime accidents. Through the construction of an observational network, based on state of the art technology (HF radars and drifters), TOSCA provides real-time observations and forecasts of the Mediterranean coastal marine environmental conditions. The system is installed and assessed in five test sites on the coastal areas of oil spill outlets (Eastern Mediterranean) and on high traffic areas (Western Mediterranean). The Gulf of Naples, a small semi-closed basin opening to the Tyrrhenian Sea, is one of the five test sites. It is of particular interest both from the environmental point of view, due to peculiar ecosystem properties in the area, and because it sustains important touristic and commercial activities. At present, the Gulf of Naples monitoring network comprises five automatic weather stations distributed along the coasts of the Gulf, one weather radar, two tide gauges, one waverider buoy, and moored physical, chemical and bio-optical instrumentation. In addition, a CODAR-SeaSonde HF coastal radar system composed of three antennas is located in Portici, Massa Lubrense and Castellammare. The system provides hourly data of surface currents over the entire Gulf with a 1 km spatial resolution. A numerical modeling implementation based on the Regional Ocean Modeling System (ROMS) is currently being integrated into the Gulf of Naples monitoring network. ROMS is a 3-D, free-surface, hydrostatic, primitive equation, finite difference ocean model. In our configuration, the model has a high horizontal resolution (250 m) and 30 sigma levels in the vertical.

  18. Using reefcheck monitoring database to develop the coral reef index of biological integrity

    DEFF Research Database (Denmark)

    Nguyen, Hai Yen T.; Pedersen, Ole; Ikejima, Kou

    2009-01-01

    The coral reef indices of biological integrity were constituted based on the Reef Check monitoring data. Seventy-six minimally disturbed sites and 72 maximally disturbed sites in shallow water and 39 minimally disturbed sites and 37 maximally disturbed sites in deep water were classified based...... on the high-end and low-end percentages and ratios of hard coral, dead coral and fleshy algae. A total of 52 candidate metrics was identified and compiled. Eight and four metrics were finally selected to constitute the shallow and deep water coral reef indices respectively. The rating curve was applied.......05) and coral damaged by other factors -0.283 (p... The coral reef indices showed sensitive responses to stressors and can be used as a coral reef biological monitoring tool....

  19. Numerical solution of the potential problem by integral equations without Green's functions

    International Nuclear Information System (INIS)

    De Mey, G.

    1977-01-01

    An integral equation technique is presented to solve Laplace's equation in a two-dimensional area S. The Green's function has been replaced by a particular solution of Laplace's equation in order to establish the integral equation. It is shown that accurate results can be obtained provided the pivotal elimination method is used to solve the resulting set of linear algebraic equations.

  20. Block-pulse functions approach to numerical solution of Abel’s integral equation

    Directory of Open Access Journals (Sweden)

    Monireh Nosrati Sahlan

    2015-12-01

    Full Text Available This study presents a computational method for solving Abel’s integral equation of the second kind. The introduced method is based on the use of Block-pulse functions (BPFs) via a collocation method. Abel’s integral equations, as singular Volterra integral equations, are computationally demanding, but because of the properties of BPFs, as reported in the examples, this method is more efficient and more accurate than some other methods for solving this class of integral equations. A further benefit of the method is its low computational cost. The applied method transforms the singular integral equation into a triangular linear algebraic system that can be solved easily. An error analysis is worked out and applications are demonstrated through illustrative examples.
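
The block-pulse idea described above can be sketched as follows. This is a minimal illustration, assuming the second-kind form u(t) = f(t) + ∫₀ᵗ u(s)/√(t−s) ds and midpoint collocation, not the paper's exact scheme: because the singular kernel can be integrated exactly over each block, the equation reduces to a lower-triangular linear system.

```python
import numpy as np

def solve_abel_second_kind(f, n=200, T=1.0):
    """Block-pulse (piecewise-constant) collocation for
    u(t) = f(t) + int_0^t u(s) / sqrt(t - s) ds.
    The kernel is integrated exactly over each block, so the scheme
    reduces to a lower-triangular linear system."""
    h = T / n
    t = (np.arange(n) + 0.5) * h          # collocation at block midpoints
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i):                # full blocks [jh, (j+1)h]
            A[i, j] = 2.0 * (np.sqrt(t[i] - j * h) - np.sqrt(t[i] - (j + 1) * h))
        A[i, i] = 2.0 * np.sqrt(h / 2.0)  # partial block [ih, t_i]
    u = np.linalg.solve(np.eye(n) - A, f(t))
    return t, u

# Manufactured check: u(t) = 1 implies f(t) = 1 - 2*sqrt(t).
t, u = solve_abel_second_kind(lambda t: 1.0 - 2.0 * np.sqrt(t))
```

Since the kernel integrals are exact per block, a constant solution is reproduced to machine precision; the triangular structure means the system could equally be solved by forward substitution.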

  1. A delta-rule model of numerical and non-numerical order processing.

    Science.gov (United States)

    Verguts, Tom; Van Opstal, Filip

    2014-06-01

    Numerical and non-numerical order processing share empirical characteristics (the distance effect and semantic congruity), but there are also important differences (in the size effect and the end effect). At the same time, models and theories of numerical and non-numerical order processing have developed largely separately. Here we combine insights from two earlier models to integrate them in a common framework. We argue that the same learning principle underlies numerical and non-numerical orders, but that environmental features determine the empirical differences. Implications for current theories of order processing are pointed out. PsycINFO Database Record (c) 2014 APA, all rights reserved.
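
The delta-rule learning principle invoked above can be sketched in a toy form. This is purely illustrative, not the authors' model: the one-hot item codes, the rank targets, and the learning rate are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items = 5
w = np.zeros(n_items)                      # one learned "position code" per item
targets = np.linspace(0.0, 1.0, n_items)   # ordinal positions as teaching signal
lr = 0.1

for _ in range(500):
    i = rng.integers(n_items)              # present a random item
    x = np.zeros(n_items)
    x[i] = 1.0                             # one-hot input
    y = w @ x                              # network output for that item
    w += lr * (targets[i] - y) * x         # delta rule: error-driven update

# A distance effect falls out of the learned codes: far-apart items rest
# on larger output differences than neighbouring items.
diff_far = abs(w[4] - w[0])
diff_near = abs(w[1] - w[0])
```

After training, the learned weights are monotonic in rank, so comparison difficulty can be read off as the difference between position codes.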

  2. Establishment of computerized numerical databases on thermophysical and other properties of molten as well as solid materials and data evaluation and validation for generating recommended reliable reference data

    Science.gov (United States)

    Ho, C. Y.

    1993-01-01

    The Center for Information and Numerical Data Analysis and Synthesis (CINDAS) measures and maintains databases on thermophysical, thermoradiative, mechanical, optical, electronic, ablation, and physical properties of materials. Emphasis is on aerospace structural materials, especially composites, and on infrared detector/sensor materials. Within CINDAS, the Department of Defense sponsors several centers at Purdue: the High Temperature Material Information Analysis Center (HTMIAC), the Ceramics Information Analysis Center (CIAC) and the Metals Information Analysis Center (MIAC). The responsibilities of CINDAS are extremely broad, encompassing basic and applied research; measurement of the properties of thin wires and thin foils as well as bulk materials; acquisition and search of the worldwide literature; critical evaluation of data; generation of estimated values to fill data voids; investigation of constitutive, structural, processing, environmental, and rapid heating and loading effects; and dissemination of data. Liquids, gases, molten materials and solids are all considered. The responsibility of maintaining widely used databases includes data evaluation, analysis, correlation, and synthesis. Material property data recorded in the literature are often conflicting, divergent, and subject to large uncertainties. It is admittedly difficult to measure material properties accurately; both systematic and random errors enter. Some errors result from lack of characterization of the material itself (impurity effects). In some cases the assumed boundary conditions corresponding to a theoretical model are not obtained in the experiments. Stray heat flows and losses must be accounted for. Some experimental methods are inappropriate, and in other cases appropriate methods are carried out with poor technique. Conflicts in data may be resolved by curve fitting of the data to theoretical or empirical models or by correlation in terms of various affecting parameters. Reasons (e.g.
phase

  3. ANISEED 2017: extending the integrated ascidian database to the exploration and evolutionary comparison of genome-scale datasets.

    Science.gov (United States)

    Brozovic, Matija; Dantec, Christelle; Dardaillon, Justine; Dauga, Delphine; Faure, Emmanuel; Gineste, Mathieu; Louis, Alexandra; Naville, Magali; Nitta, Kazuhiro R; Piette, Jacques; Reeves, Wendy; Scornavacca, Céline; Simion, Paul; Vincentelli, Renaud; Bellec, Maelle; Aicha, Sameh Ben; Fagotto, Marie; Guéroult-Bellone, Marion; Haeussler, Maximilian; Jacox, Edwin; Lowe, Elijah K; Mendez, Mickael; Roberge, Alexis; Stolfi, Alberto; Yokomori, Rui; Brown, C Titus; Cambillau, Christian; Christiaen, Lionel; Delsuc, Frédéric; Douzery, Emmanuel; Dumollard, Rémi; Kusakabe, Takehiro; Nakai, Kenta; Nishida, Hiroki; Satou, Yutaka; Swalla, Billie; Veeman, Michael; Volff, Jean-Nicolas; Lemaire, Patrick

    2018-01-04

    ANISEED (www.aniseed.cnrs.fr) is the main model organism database for tunicates, the sister-group of vertebrates. This release gives access to annotated genomes, gene expression patterns, and anatomical descriptions for nine ascidian species. It provides increased integration with external molecular and taxonomy databases, better support for epigenomics datasets, in particular RNA-seq, ChIP-seq and SELEX-seq, and features novel interactive interfaces for existing and novel datatypes. In particular, the cross-species navigation and comparison is enhanced through a novel taxonomy section describing each represented species and through the implementation of interactive phylogenetic gene trees for 60% of tunicate genes. The gene expression section displays the results of RNA-seq experiments for the three major model species of solitary ascidians. Gene expression is controlled by the binding of transcription factors to cis-regulatory sequences. A high-resolution description of the DNA-binding specificity for 131 Ciona robusta (formerly C. intestinalis type A) transcription factors by SELEX-seq is provided and used to map candidate binding sites across the Ciona robusta and Phallusia mammillata genomes. Finally, use of a WashU Epigenome browser enhances genome navigation, while a Genomicus server was set up to explore microsynteny relationships within tunicates and with vertebrates, Amphioxus, echinoderms and hemichordates. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Development and Exploration of a Regional Stormwater BMP Performance Database to Parameterize an Integrated Decision Support Tool (i-DST)

    Science.gov (United States)

    Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.

    2017-12-01

    Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.

  5. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    Science.gov (United States)

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on a Client/Server model, was used to implement medical case and biospecimen management. The system supports input, browsing, querying and summarizing of cases and related biospecimen information, and can automatically synthesize case records from the database. It supports not only long-term follow-up of individuals but also management of grouped cases organized according to the aim of a research study. The system can improve the efficiency and quality of clinical research in which biospecimens are used in a coordinated way. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.

  6. Database Description - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Arabidopsis Phenome Database Database Description General information of database Database n... BioResource Center Hiroshi Masuya Database classification Plant databases - Arabidopsis thaliana Organism Taxonomy Name: Arabidopsis thaliana Taxonomy ID: 3702 Database description The Arabidopsis thaliana phenome i...heir effective application. We developed the new Arabidopsis Phenome Database integrating two novel database...seful materials for their experimental research. The other, the “Database of Curated Plant Phenome” focusing

  7. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides the quick accessibility to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.
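
The single-linkage agglomeration underlying the DAVID Gene Concept can be sketched as a union-find pass over identifier cross-references: any two identifiers that share a link end up in the same cluster. The identifiers and links below are hypothetical placeholders, not DAVID data.

```python
def agglomerate(pairs):
    """Single-linkage grouping of identifiers via union-find:
    any two identifiers connected by a chain of links share a cluster."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    for a, b in pairs:
        parent[find(a)] = find(b)           # union the two clusters

    clusters = {}
    for x in parent:
        clusters.setdefault(find(x), set()).add(x)
    return sorted(map(sorted, clusters.values()))

# Hypothetical cross-references between gene/protein identifiers:
links = [("ENSG01", "P12345"), ("P12345", "NM_0001"), ("ENSG02", "Q99999")]
groups = agglomerate(links)
```

Here ENSG01, P12345 and NM_0001 collapse into one cluster because they are chained by shared links, mirroring how heterogeneous accession systems are agglomerated into a single gene cluster.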

  8. Waterborne disease outbreak detection: an integrated approach using health administrative databases.

    Science.gov (United States)

    Coly, S; Vincent, N; Vaissiere, E; Charras-Garrido, M; Gallay, A; Ducrot, C; Mouly, D

    2017-08-01

    Hundreds of waterborne disease outbreaks (WBDO) of acute gastroenteritis (AGI) due to contaminated tap water are reported in developed countries each year, and such outbreaks are probably under-detected. The aim of our study was to develop an integrated approach to detect and study clusters of AGI in geographical areas with homogeneous exposure to drinking water. Data on the number of AGI cases are available at the municipality level, while exposure to tap water depends on drinking water networks (DWN); these two geographical units do not systematically overlap. This study therefore developed an algorithm to match the most relevant grouping of municipalities with a specific DWN, so that tap water exposure can be taken into account when investigating future disease outbreaks. A space-time detection method was then applied to the groupings of municipalities. Seven hundred and fourteen new geographical areas (groupings of municipalities) were obtained, compared with the 1,310 municipalities and the 1,706 DWN. Eleven potential WBDO were identified in these groupings of municipalities. For ten of them, additional environmental investigations identified at least one event that could have caused microbiological contamination of the DWN in the days preceding the occurrence of a reported WBDO.

  9. Multisource Data-Based Integrated Agricultural Drought Monitoring in the Huai River Basin, China

    Science.gov (United States)

    Sun, Peng; Zhang, Qiang; Wen, Qingzhi; Singh, Vijay P.; Shi, Peijun

    2017-10-01

    Drought monitoring is critical for early warning of drought hazard. This study developed an integrated remote sensing drought monitoring index (IRSDI) based on meteorological data for 2003-2013 from 40 meteorological stations, soil moisture data from 16 observatory stations, and Moderate Resolution Imaging Spectroradiometer data, using a linear trend detection method and the standardized precipitation evapotranspiration index. The objective was to investigate drought conditions across the Huai River basin in both space and time. Results indicate that (1) the proposed IRSDI monitors and describes drought conditions across the Huai River basin reasonably well in both space and time; (2) droughts and severe droughts occur most frequently during April-May and July-September, and the northeastern and eastern parts of the Huai River basin are dominated by frequent and intensified drought events; these regions are dominated by dry croplands and grasslands, are densely populated, and are hence more sensitive to drought hazards; (3) intensified droughts are detected in almost all months except January, August, October, and December, with significant intensification discerned mainly in the eastern and western Huai River basin. The duration and extent of intensified drought events will challenge water resources management in view of agricultural and other activities in these regions in a changing climate.

  10. Numerical Integration Methods for the Takagi-Taupin Equations for Crystals of Rectangular Cross Section

    International Nuclear Information System (INIS)

    Kolosov, S.I.; Punegov, V.I.

    2005-01-01

    Two independent methods for calculation of the rocking curves for laterally bounded crystals are developed. Numerical simulation of diffraction for crystals of different sizes is performed. The results obtained using the dynamical theory of diffraction are compared to those obtained in the kinematic approximation

  11. Mixing-to-eruption timescales: an integrated model combining numerical simulations and high-temperature experiments with natural melts

    Science.gov (United States)

    Montagna, Chiara; Perugini, Diego; De Campos, Christina; Longo, Antonella; Dingwell, Donald Bruce; Papale, Paolo

    2015-04-01

    The arrival of magma from depth into shallow reservoirs and the associated mixing processes have been documented as possible triggers of explosive eruptions. Quantifying the time from the beginning of mixing to eruption is of fundamental importance in volcanology in order to place constraints on the possible onset of a new eruption. Here we integrate numerical simulations and high-temperature experiments performed with natural melts with the aim of identifying mixing-to-eruption timescales. We performed two-dimensional numerical simulations of the arrival of gas-rich magmas into shallow reservoirs, solving the fluid dynamics of the two interacting magmas and evaluating the space-time evolution of the physical properties of the mixture. Convection and mingling develop quickly in the chamber and the feeding conduit/dyke. Over time scales of hours, the magmas in the reservoir appear to have mingled throughout, and convective patterns become harder to identify. High-temperature magma mixing experiments were performed using a centrifuge, with basaltic and phonolitic melts from Campi Flegrei (Italy) as initial end-members. Concentration Variance Decay (CVD), an inevitable consequence of magma mixing, is exponential with time. The rate of CVD is a powerful new geochronometer for the time from mixing to eruption/quenching. The mingling-to-eruption times of three explosive volcanic eruptions from Campi Flegrei (Italy) yield durations on the order of tens of minutes. These results are in excellent agreement with the numerical simulations, which suggest a maximum mixing time of a few hours to obtain a hybrid mixture. We show that the integration of numerical simulations and high-temperature experiments can provide unprecedented results about mixing processes in volcanic systems. The combined application of numerical simulations and the CVD geochronometer to the eruptive products of active volcanoes could be decisive for the preparation of hazard mitigation measures during volcanic unrest.
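
The CVD geochronometer described above can be sketched as a linear fit to log-variance. This is a minimal illustration: the exponential form σ²(t) = σ²₀ e^(−Rt) is taken from the abstract, but the data below are synthetic and the function name is ours.

```python
import numpy as np

def mixing_time_from_cvd(t_obs, variance, var_quench):
    """Fit sigma^2(t) = sigma0^2 * exp(-R t) to measured concentration
    variance, then invert for the time at which the variance observed in
    the quenched products (var_quench) was reached."""
    slope, log_var0 = np.polyfit(t_obs, np.log(variance), 1)
    R = -slope                               # decay rate is minus the slope
    return (log_var0 - np.log(var_quench)) / R

# Synthetic experiment: known decay rate R = 0.12 per minute.
t = np.linspace(0.0, 30.0, 16)
var = 1.0 * np.exp(-0.12 * t)
t_mix = mixing_time_from_cvd(t, var, var_quench=np.exp(-0.12 * 25.0))
```

Fitting in log space turns the exponential decay into a straight line, so the mixing-to-quenching time follows from a simple least-squares slope and intercept.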

  12. Numerical calculation of a class of highly oscillatory integrals with the Mathieu function

    International Nuclear Information System (INIS)

    Long Yongxing

    1992-01-01

    The author describes a method for computing highly oscillatory integrals with the Mathieu function. In practice the results are highly satisfactory and the method is time-saving

  13. Experimental and numerical study of electrical crosstalk in photonic integrated circuits

    NARCIS (Netherlands)

    Yao, W.; Gilardi, G.; Calabretta, N.; Smit, M.K.; Wale, M.J.

    2015-01-01

    This paper presents measurement results on electrical crosstalk between interconnect lines and electro-optical phaseshifters in photonic integrated circuits. The results indicate that overall crosstalk originates from radiative and substrate coupling between lines and from shared ground connections.

  14. Integral abutment bridges under thermal loading : numerical simulations and parametric study.

    Science.gov (United States)

    2016-06-01

    Integral abutment bridges (IABs) have become of interest due to their decreased construction and maintenance costs in : comparison to conventional jointed bridges. Most prior IAB research was related to substructure behavior, and, as a result, most :...

  15. Local linearization methods for the numerical integration of ordinary differential equations: An overview

    International Nuclear Information System (INIS)

    Jimenez, J.C.

    2009-06-01

    Local Linearization (LL) methods form a class of one-step explicit integrators for ODEs derived from the following primary and common strategy: the vector field of the differential equation is locally (piecewise) approximated through a first-order Taylor expansion at each time step, thus obtaining successive linear equations that are explicitly integrated. The LL approach may further include additional strategies to improve this basic affine approximation. Theoretical and practical results have shown that LL integrators have a number of convenient properties, including arbitrary order of convergence, A-stability, linearization preserving, regularity under quite general conditions, preservation of the dynamics of the exact solution around hyperbolic equilibrium points and periodic orbits, integration of stiff and high-dimensional equations, and low computational cost. In this paper, a review of the LL methods and their properties is presented. (author)
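
For a scalar ODE the basic LL step can be sketched as follows (a minimal illustration of the freeze-the-Jacobian, integrate-the-affine-equation-exactly strategy; the φ-function form is one standard way to write it, not taken from this review):

```python
import math

def ll_step(f, dfdy, y, h):
    """One Local Linearization step for a scalar ODE y' = f(y):
    freeze the Jacobian a = f'(y) and integrate the resulting affine
    ODE exactly over the step, via phi(z) = (exp(z) - 1) / z
    (series fallback near z = 0 to avoid cancellation)."""
    a = dfdy(y)
    z = h * a
    phi = (math.expm1(z) / z) if abs(z) > 1e-8 else 1.0 + z / 2.0
    return y + h * phi * f(y)

def integrate(f, dfdy, y0, t_end, n):
    y, h = y0, t_end / n
    for _ in range(n):
        y = ll_step(f, dfdy, y, h)
    return y

# For a linear ODE y' = -2y the LL step is exact regardless of h:
y = integrate(lambda y: -2.0 * y, lambda y: -2.0, 1.0, 1.0, 5)
```

The linearization-preserving property mentioned above is visible here: on a linear equation every step reproduces the exact flow, independently of the step size.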

  16. Effects of planar element formulation and numerical integration order on checkerboard material layouts

    CSIR Research Space (South Africa)

    Long, CS

    2009-01-01

    Full Text Available The effects of selected planar finite element formulations, and their associated integration schemes, on the stiffness of a checkerboard material layout are investigated. Standard 4-node bilinear elements, 8- and 9-node quadratic elements, as well...

  17. ICM: an Integrated Compartment Method for numerically solving partial differential equations

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, G.T.

    1981-05-01

    An integrated compartment method (ICM) is proposed to construct a set of algebraic equations from a system of partial differential equations. The ICM combines the utility of the integral formulation of the finite element approach, the simplicity of interpolation of the finite difference approximation, and the flexibility of compartment analyses. The integral formulation eases the treatment of boundary conditions, in particular Neumann-type boundary conditions. The simplicity of interpolation provides great economy in computation. The flexibility of discretization with irregular compartments of various shapes and sizes offers advantages in resolving complex boundaries enclosing compound regions of interest. The basic procedure of ICM is first to discretize the region of interest into compartments, then to apply three integral theorems of vectors to transform the volume integrals into surface integrals, and finally to use interpolation to relate the interfacial values to compartment values and thereby close the system. The Navier-Stokes equations are used as an example of how to derive the corresponding ICM algorithm for a given set of partial differential equations. Because of the structure of the algorithm, the basic computer program remains the same for one-, two-, or three-dimensional problems.
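
The compartment/flux-balance idea can be sketched in one dimension for the diffusion equation. This is an illustrative finite-volume analogue of the procedure (volume integral turned into interface fluxes, interpolation for interfacial values), not the ICM code itself.

```python
import numpy as np

def icm_diffusion(u0, D, dx, dt, steps):
    """Compartment-style 1-D diffusion: each cell's content changes only
    through fluxes across its two interfaces; interfacial gradients are
    interpolated from neighbouring compartment values. No-flux walls."""
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(steps):
        flux = np.zeros(len(u) + 1)                  # flux at each interface
        flux[1:-1] = -D * (u[1:] - u[:-1]) / dx      # interpolated gradient
        u -= dt / dx * (flux[1:] - flux[:-1])        # surface-integral balance
    return u

u0 = np.zeros(50)
u0[25] = 1.0                                         # unit spike of material
u = icm_diffusion(u0, D=1.0, dx=1.0, dt=0.4, steps=200)
```

Because the update is written purely in terms of interface fluxes, and the wall fluxes are zero, the total content is conserved exactly, which is the practical payoff of the integral (flux-balance) formulation.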

  18. Integration of artificial intelligence and numerical optimization techniques for the design of complex aerospace systems

    International Nuclear Information System (INIS)

    Tong, S.S.; Powell, D.; Goel, S.

    1992-02-01

    A new software system called Engineous combines artificial intelligence and numerical methods for the design and optimization of complex aerospace systems. Engineous combines the advanced computational techniques of genetic algorithms, expert systems, and object-oriented programming with the conventional methods of numerical optimization and simulated annealing to create a design optimization environment that can be applied to computational models in various disciplines. Engineous has produced designs with higher predicted performance gains than current manual design processes - on average a 10-to-1 reduction of turnaround time - and has yielded new insights into product design. It has been applied to the aerodynamic preliminary design of an aircraft engine turbine, concurrent aerodynamic and mechanical preliminary design of an aircraft engine turbine blade and disk, a space superconducting generator, a satellite power converter, and a nuclear-powered satellite reactor and shield. 23 refs

  19. Database Description - RMOS | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name RMOS Alternative nam...arch Unit Shoshi Kikuchi E-mail : Database classification Plant databases - Rice Microarray Data and other Gene Expression Database...s Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database description The Ric...19&lang=en Whole data download - Referenced database Rice Expression Database (RED) Rice full-length cDNA Database... (KOME) Rice Genome Integrated Map Database (INE) Rice Mutant Panel Database (Tos17) Rice Genome Annotation Database

  20. Integrated Tsunami Database: simulation and identification of seismic tsunami sources, 3D visualization and post-disaster assessment on the shore

    Science.gov (United States)

    Krivorot'ko, Olga; Kabanikhin, Sergey; Marinin, Igor; Karas, Adel; Khidasheli, David

    2013-04-01

    One of the most important problems in tsunami investigation is the reconstruction of the seismic tsunami source. The non-profit organization WAPMERR (http://wapmerr.org) has provided a historical database of presumed tsunami sources around the world, obtained with the help of information about seaquakes. WAPMERR also has a database of observations of tsunami waves in coastal areas. The main idea of this presentation is to determine the tsunami source parameters using seismic data and observations of the tsunami waves on the shore, and to expand and refine the database of presumed tsunami sources for operative and accurate prediction of hazards and assessment of risks and consequences. We also present 3D visualization of real-time tsunami wave propagation and loss assessment, characterizing the nature of the building stock in cities at risk, and monitoring by satellite images using the modern GIS technology ITRIS (Integrated Tsunami Research and Information System) developed by WAPMERR and Informap Ltd. The special scientific plug-in components are embedded in a specially developed GIS-type graphic shell for easy data retrieval, visualization and processing. The most suitable physical models for the simulation of tsunamis are based on shallow water equations. We consider the initial-boundary value problem in Ω := {(x,y) ∈ R² : x ∈ (0,Lx), y ∈ (0,Ly), Lx, Ly > 0} for the well-known linear shallow water equations in the Cartesian coordinate system, written in terms of the liquid flow components in dimensional form. Here η(x,y,t) defines the free water surface vertical displacement, i.e. the amplitude of the tsunami wave, and q(x,y) is the initial amplitude of the tsunami wave. The lateral boundary is assumed to be a non-reflecting boundary of the domain, that is, it allows the free passage of propagating waves. Assume that the free surface oscillation data at points (xm, ym) are given as measured output data from tsunami records: fm(t) := η(xm, ym,t), (xm
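
A minimal 1-D analogue of the linear shallow water model described above can be sketched as follows. This is illustrative only: a forward-backward staggered scheme with rigid walls rather than the 2-D non-reflecting boundaries used in the actual simulations, and all parameter values are assumptions.

```python
import numpy as np

def linear_swe_1d(eta0, H=4000.0, g=9.81, dx=1000.0, dt=4.0, steps=250):
    """Forward-backward staggered scheme for the 1-D linearized shallow
    water equations  d(eta)/dt = -H du/dx,  du/dt = -g d(eta)/dx,
    with rigid walls (u = 0) at both ends.
    Stable for sqrt(g*H) * dt / dx <= 1."""
    eta = np.asarray(eta0, dtype=float).copy()
    u = np.zeros(len(eta) + 1)                        # u on staggered interfaces
    for _ in range(steps):
        u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])  # momentum update first
        eta -= H * dt / dx * (u[1:] - u[:-1])          # then continuity
    return eta

x = np.arange(100) * 1000.0
eta0 = np.exp(-((x - 50000.0) / 5000.0) ** 2)          # Gaussian hump as source
eta = linear_swe_1d(eta0)
```

With the flux-form continuity update and zero wall velocities, the water volume is conserved exactly, and the Courant number sqrt(gH)·dt/dx ≈ 0.79 keeps the scheme stable.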

  1. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    Science.gov (United States)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many people. It quantifies the model inputs with a ranking based on the highest value of the data as Level of Evidence (LOE), together with a Quality of Evidence (QOE) score that assesses the evidence base for each medical condition. The IMM evidence base has already provided invaluable information for designers and for other uses.

  2. Applications of integral equation methods for the numerical solution of magnetostatic and eddy current problems

    International Nuclear Information System (INIS)

    Trowbridge, C.W.

    1976-06-01

    Various integral equation methods are described. For magnetostatic problems three formulations are considered in detail: (a) the direct solution method for the magnetisation distribution in permeable materials, (b) a method based on a scalar potential and (c) the use of an integral equation derived from Green's Theorem, i.e. the so-called Boundary Integral Method (BIM). In the case of (a) results are given for two- and three-dimensional non-linear problems with comparisons against measurement. For methods (b) and (c), which both lead to a more economical use of the computer than (a), some preliminary results are given for simple cases. For eddy current problems various methods are discussed and some results are given from a computer program based on a vector potential formulation. (author)

  3. Applications of integral equation methods for the numerical solution of magnetostatic and eddy current problems

    Energy Technology Data Exchange (ETDEWEB)

    Trowbridge, C W

    1976-06-01

    Various integral equation methods are described. For magnetostatic problems three formulations are considered in detail, (a) the direct solution method for the magnetisation distribution in permeable materials, (b) a method based on a scalar potential, and (c) the use of an integral equation derived from Green's Theorem, i.e. the so-called Boundary Integral Method (BIM). In the case of (a) results are given for two-and three-dimensional non-linear problems with comparisons against measurement. For methods (b) and (c), which both lead to a more economical use of the computer than (a), some preliminary results are given for simple cases. For eddy current problems various methods are discussed and some results are given from a computer program based on a vector potential formulation.

  4. A variable timestep generalized Runge-Kutta method for the numerical integration of the space-time diffusion equations

    International Nuclear Information System (INIS)

    Aviles, B.N.; Sutton, T.M.; Kelly, D.J. III.

    1991-09-01

    A generalized Runge-Kutta method has been employed in the numerical integration of the stiff space-time diffusion equations. The method is fourth-order accurate, using an embedded third-order solution to arrive at an estimate of the truncation error for automatic timestep control. The efficiency of the Runge-Kutta method is enhanced by a block-factorization technique that exploits the sparse structure of the matrix system resulting from the space and energy discretized form of the time-dependent neutron diffusion equations. Preliminary numerical evaluation using a one-dimensional finite difference code shows the sparse matrix implementation of the generalized Runge-Kutta method to be highly accurate and efficient when compared to an optimized iterative theta method. 12 refs., 5 figs., 4 tabs
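
The embedded-pair, automatic-timestep idea can be sketched with the classical Bogacki-Shampine 3(2) pair. This is an illustrative scalar version only; the paper's method is a fourth-order generalized Runge-Kutta with an embedded third-order solution and a sparse block-factorization, neither of which is reproduced here.

```python
def rk23_adaptive(f, t, y, t_end, tol=1e-8):
    """Embedded Bogacki-Shampine 3(2) pair: the difference between the
    third- and second-order solutions estimates the local truncation
    error, which drives automatic timestep control."""
    h = (t_end - t) / 100.0
    while t < t_end:
        h = min(h, t_end - t)
        k1 = f(t, y)
        k2 = f(t + h / 2.0, y + h * k1 / 2.0)
        k3 = f(t + 3.0 * h / 4.0, y + 3.0 * h * k2 / 4.0)
        y3 = y + h * (2.0 * k1 + 3.0 * k2 + 4.0 * k3) / 9.0     # 3rd order
        k4 = f(t + h, y3)
        y2 = y + h * (7.0 * k1 / 24 + k2 / 4 + k3 / 3 + k4 / 8)  # 2nd order
        err = abs(y3 - y2)                                       # error estimate
        if err <= tol:
            t, y = t + h, y3                                     # accept the step
        h *= min(5.0, max(0.1, 0.9 * (tol / max(err, 1e-16)) ** (1.0 / 3.0)))
    return y

# y' = y with y(0) = 1 integrated to t = 1 should give e.
y1 = rk23_adaptive(lambda t, y: y, 0.0, 1.0, 1.0)
```

Rejected steps simply shrink h and retry, so the tolerance, not a fixed grid, governs the cost, which is the same rationale as the embedded truncation-error control in the abstract.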

  5. BOKASUN: A fast and precise numerical program to calculate the Master Integrals of the two-loop sunrise diagrams

    Science.gov (United States)

    Caffo, Michele; Czyż, Henryk; Gunia, Michał; Remiddi, Ettore

    2009-03-01

    We present the program BOKASUN for fast and precise evaluation of the Master Integrals of the two-loop self-mass sunrise diagram for arbitrary values of the internal masses and the external four-momentum. We use a combination of two methods: a Bernoulli accelerated series expansion and a Runge-Kutta numerical solution of a system of linear differential equations.
    Program summary
    Program title: BOKASUN
    Catalogue identifier: AECG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 9404
    No. of bytes in distributed program, including test data, etc.: 104 123
    Distribution format: tar.gz
    Programming language: FORTRAN77
    Computer: Any computer with a Fortran compiler accepting the FORTRAN77 standard. Tested on various PCs with LINUX
    Operating system: LINUX
    RAM: 120 kbytes
    Classification: 4.4
    Nature of problem: Any integral arising in the evaluation of the two-loop sunrise Feynman diagram can be expressed in terms of a given set of Master Integrals, which should be calculated numerically. The program provides a fast and precise evaluation method of the Master Integrals for arbitrary (but not vanishing) masses and arbitrary value of the external momentum.
    Solution method: The integrals depend on three internal masses and the external momentum squared p. The method is a combination of an accelerated expansion in 1/p in its (pretty large!) region of fast convergence and of a Runge-Kutta numerical solution of a system of linear differential equations.
    Running time: To obtain 4 Master Integrals on a PC with a 2 GHz processor it takes 3 μs for series expansion with pre-calculated coefficients, 80 μs for series expansion without pre-calculated coefficients, from a few seconds up to a few minutes for the Runge-Kutta method (depending

  6. Physics in Design : Real-time Numerical Simulation Integrated into the CAD Environment

    NARCIS (Netherlands)

    Zwier, Marijn P.; Wits, Wessel W.

    2017-01-01

    As today's markets are more susceptible to rapid changes and involve global players, a short time to market is required to keep a competitive edge. Concurrently, products are integrating an increasing number of functions and technologies, thus becoming progressively complex. Therefore, efficient and

  7. Updates on drug-target network; facilitating polypharmacology and data integration by growth of DrugBank database.

    Science.gov (United States)

    Barneh, Farnaz; Jafari, Mohieddin; Mirzaie, Mehdi

    2016-11-01

    Network pharmacology elucidates the relationship between drugs and targets. As the number of identified targets for each drug increases, the corresponding drug-target network (DTN) evolves from a mere reflection of pharmaceutical industry trends to a portrait of polypharmacology. The aim of this study was to evaluate the potential of the DrugBank database in advancing systems pharmacology. We constructed and analyzed the DTN from drug-target associations in the DrugBank 4.0 database. Our results showed that in the bipartite DTN, an increased ratio of identified targets to drugs augmented the density and connectivity of drugs and targets and decreased the modular structure. To clarify the details of the network structure, the DTN was projected into two networks, namely a drug similarity network (DSN) and a target similarity network (TSN). In the DSN, various classes of Food and Drug Administration-approved drugs with distinct therapeutic categories were linked together based on shared targets. The projected TSN also showed complexity because of the promiscuity of the drugs. By including investigational drugs that are currently being tested in clinical trials, the networks manifested more connectivity and pictured the upcoming pharmacological space in the future years. Diverse biological processes and protein-protein interactions were manipulated by new drugs, which can extend possible target combinations. We conclude that network-based organization of DrugBank 4.0 data not only reveals the potential for repurposing existing drugs, but also allows generating novel predictions about drug off-targets, drug-drug interactions and their side effects. Our results also encourage further effort toward high-throughput identification of targets to build networks that can be integrated into disease networks. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
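The drug- and target-similarity projections described in this record can be sketched in a few lines of plain Python; the drug-target pairs below are hypothetical placeholders, not DrugBank entries.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical toy drug -> target associations (placeholders, not DrugBank data).
drug_targets = {
    "drugA": {"T1", "T2"},
    "drugB": {"T2", "T3"},
    "drugC": {"T4"},
}

def project(node_sets):
    """Project a bipartite network onto one side: two nodes are linked
    when their neighbour sets share at least one member."""
    edges = defaultdict(set)
    for a, b in combinations(sorted(node_sets), 2):
        if node_sets[a] & node_sets[b]:
            edges[a].add(b)
            edges[b].add(a)
    return dict(edges)

dsn = project(drug_targets)            # drug similarity network (shared targets)

targets_drugs = defaultdict(set)       # invert the bipartite map for the TSN
for drug, targets in drug_targets.items():
    for t in targets:
        targets_drugs[t].add(drug)
tsn = project(targets_drugs)           # target similarity network (shared drugs)
```

Here drugA and drugB become linked through their shared target T2, while drugC stays isolated; the same projection applied to the inverted map links targets that share a promiscuous drug.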

  8. Trajectory errors of different numerical integration schemes diagnosed with the MPTRAC advection module driven by ECMWF operational analyses

    Science.gov (United States)

    Rößler, Thomas; Stein, Olaf; Heng, Yi; Baumeister, Paul; Hoffmann, Lars

    2018-02-01

    The accuracy of trajectory calculations performed by Lagrangian particle dispersion models (LPDMs) depends on various factors. The optimization of numerical integration schemes used to solve the trajectory equation helps to maximize the computational efficiency of large-scale LPDM simulations. We analyzed global truncation errors of six explicit integration schemes of the Runge-Kutta family, which we implemented in the Massive-Parallel Trajectory Calculations (MPTRAC) advection module. The simulations were driven by wind fields from operational analysis and forecasts of the European Centre for Medium-Range Weather Forecasts (ECMWF) at T1279L137 spatial resolution and 3 h temporal sampling. We defined separate test cases for 15 distinct regions of the atmosphere, covering the polar regions, the midlatitudes, and the tropics in the free troposphere, in the upper troposphere and lower stratosphere (UT/LS) region, and in the middle stratosphere. In total, more than 5000 different transport simulations were performed, covering the months of January, April, July, and October for the years 2014 and 2015. We quantified the accuracy of the trajectories by calculating transport deviations with respect to reference simulations using a fourth-order Runge-Kutta integration scheme with a sufficiently fine time step. Transport deviations were assessed with respect to error limits based on turbulent diffusion. Independent of the numerical scheme, the global truncation errors vary significantly between the different regions. Horizontal transport deviations in the stratosphere are typically an order of magnitude smaller compared with the free troposphere. We found that the truncation errors of the six numerical schemes fall into three distinct groups, which mostly depend on the numerical order of the scheme. Schemes of the same order differ little in accuracy, but some methods need less computational time, which gives them an advantage in efficiency. 
The selection of the integration
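The finding that truncation errors group by the numerical order of the scheme can be reproduced on a toy solid-body-rotation wind field. The sketch below compares a first-order Euler step with fourth-order Runge-Kutta; it is an illustration only, not the six schemes or the ECMWF wind fields used in the study.

```python
import math

def wind(x, y):
    """Toy horizontal wind: solid-body rotation with period 2*pi."""
    return -y, x

def euler_step(x, y, h):
    u, v = wind(x, y)
    return x + h*u, y + h*v

def rk4_step(x, y, h):
    u1, v1 = wind(x, y)
    u2, v2 = wind(x + h/2*u1, y + h/2*v1)
    u3, v3 = wind(x + h/2*u2, y + h/2*v2)
    u4, v4 = wind(x + h*u3, y + h*v3)
    return (x + h/6*(u1 + 2*u2 + 2*u3 + u4),
            y + h/6*(v1 + 2*v2 + 2*v3 + v4))

def transport_deviation(step, n):
    """Distance from the analytic end point after one full revolution
    computed in n time steps (the analytic trajectory returns to (1, 0))."""
    h = 2*math.pi/n
    x, y = 1.0, 0.0
    for _ in range(n):
        x, y = step(x, y, h)
    return math.hypot(x - 1.0, y)
```

At the same step size the fourth-order scheme lands orders of magnitude closer to the analytic trajectory, mirroring the grouping of global truncation errors by scheme order reported in the record.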

  9. MannDB – A microbial database of automated protein sequence analyses and evidence integration for protein characterization

    Directory of Open Access Journals (Sweden)

    Kuczmarski Thomas A

    2006-10-01

    Full Text Available Abstract Background MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. Description MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. 
Conclusion MannDB comprises a large number of genomes and comprehensive protein

  10. Optimization Algorithm for Kalman Filter Exploiting the Numerical Characteristics of SINS/GPS Integrated Navigation Systems.

    Science.gov (United States)

    Hu, Shaoxing; Xu, Shike; Wang, Duhu; Zhang, Aiwu

    2015-11-11

    Aiming to address the problem of the high computational cost of the traditional Kalman filter in SINS/GPS, a practical optimization algorithm with offline-derivation and parallel processing methods based on the numerical characteristics of the system is presented in this paper. The algorithm exploits the sparseness and/or symmetry of matrices to simplify the computational procedure. Thus, plenty of invalid operations can be avoided by offline derivation using a block matrix technique. For enhanced efficiency, a new parallel computational mechanism is established by subdividing and restructuring calculation processes after analyzing the extracted "useful" data. As a result, the algorithm saves about 90% of the CPU processing time and 66% of the memory usage needed in a classical Kalman filter. Meanwhile, the method, as a numerical approach, requires no precision-loss transformation/approximation of system modules, and the accuracy suffers little in comparison with the filter before computational optimization. Furthermore, since no complicated matrix theories are needed, the algorithm can be easily transplanted into other modified filters as a secondary optimization method to achieve further efficiency.
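The sparsity idea can be sketched for the covariance prediction step P' = F P F^T + Q: when the transition matrix F is block-diagonal, only the small blocks ever need to be multiplied. The block structure below is illustrative, not the actual SINS/GPS system matrices.

```python
def matmul(A, B):
    return [[sum(a*b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def dense_predict(F, P, Q):
    """Textbook covariance prediction P' = F P F^T + Q (all entries touched)."""
    FPFt = matmul(matmul(F, P), transpose(F))
    n = len(Q)
    return [[FPFt[i][j] + Q[i][j] for j in range(n)] for i in range(n)]

def block_predict(blocks, P, Q):
    """Same result exploiting a block-diagonal F: (F P F^T)_ij = F_i P_ij F_j^T,
    so the zero off-diagonal blocks of F are never touched."""
    sizes = [len(b) for b in blocks]
    offs = [sum(sizes[:k]) for k in range(len(sizes))]
    n = sum(sizes)
    out = [[Q[i][j] for j in range(n)] for i in range(n)]
    for bi, Fi in enumerate(blocks):
        for bj, Fj in enumerate(blocks):
            oi, oj = offs[bi], offs[bj]
            Pij = [row[oj:oj + sizes[bj]] for row in P[oi:oi + sizes[bi]]]
            T = matmul(matmul(Fi, Pij), transpose(Fj))
            for a in range(sizes[bi]):
                for b in range(sizes[bj]):
                    out[oi + a][oj + b] += T[a][b]
    return out

# A 3-state toy: one 2x2 block and one 1x1 block on the diagonal of F.
F = [[1.0, 0.1, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.9]]
blocks = [[[1.0, 0.1], [0.0, 1.0]], [[0.9]]]
P = [[2.0, 0.3, 0.1], [0.3, 1.0, 0.2], [0.1, 0.2, 4.0]]
Q = [[0.01 if i == j else 0.0 for j in range(3)] for i in range(3)]
P_dense = dense_predict(F, P, Q)
P_block = block_predict(blocks, P, Q)
```

Both routines return the same covariance, but the block version performs only the multiplications that can contribute, which is the kind of offline-derived saving the record describes.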

  11. Experimental and numerical study of heat transfer across insulation wall of a refrigerated integral panel van

    International Nuclear Information System (INIS)

    Glouannec, Patrick; Michel, Benoit; Delamarre, Guillaume; Grohens, Yves

    2014-01-01

    This paper presents an experimental and numerical design study of an insulation wall for refrigerated vans. The thermophysical properties of the insulating multilayer panel, the external environment impact (solar irradiation, temperature, etc.) and durability are taken into account. Different tools are used to characterize the thermal performance of the insulation walls, and the thermal properties of the insulation materials are measured. In addition, an experiment at the wall scale is carried out and a 2D FEM model of heat and mass transfer within the wall is formulated. Three configurations are studied with this design approach. Multilayer insulation walls containing reflective multi-foil insulation, aerogel and phase change materials (PCM) are tested. Promising results are obtained with these materials, especially the reduction of peak heat transfer and energy consumption during the daytime period. Furthermore, the major influence of solar irradiation is highlighted as it can increase the peak heat transfer crossing the insulation wall by up to 43%. Nevertheless, we showed that the use of reflective multi-foil insulation and aerogel layers allowed this impact to be reduced by 27%. - Highlights: • A design study of an insulation wall for a refrigerated van is carried out. • Experimental and numerical studies of multilayer insulation walls are performed. • The major influence of solar irradiation is highlighted. • New insulation materials (reflective multi-foil, aerogel and PCM) are tested
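A one-dimensional explicit finite-difference sketch of transient conduction through a wall layer gives the flavour of such simulations (the paper itself uses a 2D FEM model with coupled heat and mass transfer; the material numbers here are illustrative only):

```python
def step_wall(T, alpha, dx, dt, t_out, t_in):
    """One explicit step of dT/dt = alpha * d2T/dx2 with fixed surface
    temperatures (outer face t_out, inner face t_in)."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    return [t_out] + [T[i] + r*(T[i-1] - 2*T[i] + T[i+1])
                      for i in range(1, len(T) - 1)] + [t_in]

# Illustrative numbers: 60 mm foam wall, hot exterior, refrigerated interior.
alpha, dx, dt = 1e-6, 0.005, 10.0        # m2/s, m, s  (r = 0.4, stable)
T = [0.0] * 13                           # initial uniform 0 degC profile
for _ in range(5000):                    # march toward steady state
    T = step_wall(T, alpha, dx, dt, 35.0, 0.0)
```

After enough steps the profile relaxes to the expected linear steady state, with the mid-wall node at the mean of the two surface temperatures.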

  12. On the formulation, parameter identification and numerical integration of the EMMI model :plasticity and isotropic damage.

    Energy Technology Data Exchange (ETDEWEB)

    Bammann, Douglas J.; Johnson, G. C. (University of California, Berkeley, CA); Marin, Esteban B.; Regueiro, Richard A. (University of Colorado, Boulder, CO)

    2006-01-01

    In this report we present the formulation of the physically-based Evolving Microstructural Model of Inelasticity (EMMI). The specific version of the model treated here describes the plasticity and isotropic damage of metals as being currently applied to model the ductile failure process in structural components of the W80 program. The formulation of the EMMI constitutive equations is framed in the context of the large deformation kinematics of solids and the thermodynamics of internal state variables. This formulation is focused first on developing the plasticity equations in both the relaxed (unloaded) and current configurations. The equations in the current configuration, expressed in non-dimensional form, are used to devise the identification procedure for the plasticity parameters. The model is then extended to include a porosity-based isotropic damage state variable to describe the progressive deterioration of the strength and mechanical properties of metals induced by deformation. The numerical treatment of these coupled plasticity-damage constitutive equations is explained in detail. A number of examples are solved to validate the numerical implementation of the model.
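The flavour of explicitly integrating an internal-state-variable evolution equation can be shown with a generic scalar hardening-minus-recovery law. This is a stand-in illustration, not the actual EMMI constitutive equations.

```python
def evolve_isv(kappa, h, r, epsdot, dt, steps):
    """Explicit-Euler update of a scalar hardening variable obeying
    dkappa/deps = h - r*kappa (hardening minus dynamic recovery),
    driven at a constant plastic strain rate epsdot.
    The variable saturates at kappa = h/r."""
    for _ in range(steps):
        kappa += dt * epsdot * (h - r * kappa)
    return kappa

# Hardening modulus h and recovery coefficient r are illustrative values.
kappa_final = evolve_isv(0.0, h=1000.0, r=10.0, epsdot=1e-3, dt=1.0, steps=2000)
```

The explicit update drives the state variable to its saturation value h/r = 100, the qualitative behaviour such evolution equations are built to capture; production implementations typically use implicit return-mapping schemes for stability at large time steps.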

  13. A development and integration of the concentration database for relative method, k0 method and absolute method in instrumental neutron activation analysis using Microsoft Access

    International Nuclear Information System (INIS)

    Hoh Siew Sin

    2012-01-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the concentration of an element in a sample at the National University of Malaysia, especially by students of the Nuclear Science Programme. The lack of a database service forces users to take longer to calculate the concentration of an element in a sample, because they are dependent on software developed by foreign researchers, which is costly. To overcome this problem, a study has been carried out to build INAA database software. The objective of this study is to build database software that helps INAA users apply the Relative Method and the Absolute Method for calculating element concentrations in samples, using Microsoft Excel 2010 and Microsoft Access 2010. The study also integrates k0 data, k0-Concent and k0-Westcott to execute and complete the system. After the integration, a study was conducted to test the effectiveness of the database software by comparing the concentrations between the experiments and the database. The Triple Bare Monitors Zr-Au and Cr-Mo-Au were used in Abs-INAA as monitors to determine the thermal-to-epithermal neutron flux ratio (f). Calculations involved in determining the concentration are the net peak area (Np), the measurement time (tm), the irradiation time (tirr), the k-factor (k), the thermal-to-epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and the detection efficiency (εp). For the Com-INAA database, the reference material IAEA-375 Soil was used to calculate the concentration of elements in the sample. CRMs and SRMs are also used in this database. After the INAA database integration, a verification process was carried out to examine the effectiveness of Abs-INAA by comparing sample concentrations between the database and the experiment. The concentration values produced by the INAA database software showed high accuracy and precision. ICC
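The core of the relative method is a ratio of specific peak areas against a co-irradiated standard of known concentration. The sketch below deliberately omits decay-time and efficiency corrections for brevity, so the numbers are purely illustrative.

```python
def concentration_relative(np_sample, m_sample, np_std, m_std, c_std):
    """Relative-method INAA under identical irradiation, decay and
    counting conditions:
        C_sam = C_std * (Np_sam / m_sam) / (Np_std / m_std)
    where Np is the net peak area and m the mass.
    Decay and efficiency corrections are omitted in this sketch."""
    return c_std * (np_sample / m_sample) / (np_std / m_std)

# A sample giving twice the specific peak area of a 50 mg/kg standard:
c = concentration_relative(np_sample=2.0e4, m_sample=0.5,
                           np_std=1.0e4, m_std=0.5, c_std=50.0)
```

Because every instrumental factor cancels in the ratio, the relative method needs only the standard's certified concentration, which is exactly the kind of calculation the database software automates.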

  14. An Integrated Approach for the Numerical Modelling of the Spray Forming Process

    DEFF Research Database (Denmark)

    Hattel, Jesper; Thorborg, Jesper; Pryds, Nini

    2003-01-01

    In this paper, an integrated approach for modelling the entire spray forming process is presented. The basis for the analysis is a recently developed model which extends previous studies and includes the interaction between an array of droplets and the enveloping gas. The formulation of the depos... is in fact the summation of "local" droplet size distributions along the r-axis. Furthermore, the deposition model proposed in the paper involves both the sticking efficiency of the droplets to the substrate as well as a geometrical model involving the effects of shadowing for the production of billet...

  15. MINDMAP: establishing an integrated database infrastructure for research in ageing, mental well-being, and the urban environment.

    Science.gov (United States)

    Beenackers, Mariëlle A; Doiron, Dany; Fortier, Isabel; Noordzij, J Mark; Reinhard, Erica; Courtin, Emilie; Bobak, Martin; Chaix, Basile; Costa, Giuseppe; Dapp, Ulrike; Diez Roux, Ana V; Huisman, Martijn; Grundy, Emily M; Krokstad, Steinar; Martikainen, Pekka; Raina, Parminder; Avendano, Mauricio; van Lenthe, Frank J

    2018-01-19

    Urbanization and ageing have important implications for public mental health and well-being. Cities pose major challenges for older citizens, but also offer opportunities to develop, test, and implement policies, services, infrastructure, and interventions that promote mental well-being. The MINDMAP project aims to identify the opportunities and challenges posed by urban environmental characteristics for the promotion and management of mental well-being and cognitive function of older individuals. MINDMAP aims to achieve its research objectives by bringing together longitudinal studies from 11 countries covering over 35 cities linked to databases of area-level environmental exposures and social and urban policy indicators. The infrastructure supporting integration of this data will allow multiple MINDMAP investigators to safely and remotely co-analyse individual-level and area-level data. Individual-level data is derived from baseline and follow-up measurements of ten participating cohort studies and provides information on mental well-being outcomes, sociodemographic variables, health behaviour characteristics, social factors, measures of frailty, physical function indicators, and chronic conditions, as well as blood derived clinical biochemistry-based biomarkers and genetic biomarkers. Area-level information on physical environment characteristics (e.g. green spaces, transportation), socioeconomic and sociodemographic characteristics (e.g. neighbourhood income, residential segregation, residential density), and social environment characteristics (e.g. social cohesion, criminality) and national and urban social policies is derived from publicly available sources such as geoportals and administrative databases. The linkage, harmonization, and analysis of data from different sources are being carried out using piloted tools to optimize the validity of the research results and transparency of the methodology. MINDMAP is a novel research collaboration that is

  16. TriMEDB: A database to integrate transcribed markers and facilitate genetic studies of the tribe Triticeae

    Directory of Open Access Journals (Sweden)

    Yoshida Takuhiro

    2008-06-01

    Full Text Available Abstract Background The recent rapid accumulation of sequence resources of various crop species ensures an improvement in the genetics approach, including quantitative trait loci (QTL analysis as well as the holistic population analysis and association mapping of natural variations. Because the tribe Triticeae includes important cereals such as wheat and barley, integration of information on the genetic markers in these crops should effectively accelerate map-based genetic studies on Triticeae species and lead to the discovery of key loci involved in plant productivity, which can contribute to sustainable food production. Therefore, informatics applications and a semantic knowledgebase of genome-wide markers are required for the integration of information on and further development of genetic markers in wheat and barley in order to advance conventional marker-assisted genetic analyses and population genomics of Triticeae species. Description The Triticeae mapped expressed sequence tag (EST database (TriMEDB provides information, along with various annotations, regarding mapped cDNA markers that are related to barley and their homologues in wheat. The current version of TriMEDB provides map-location data for barley and wheat ESTs that were retrieved from 3 published barley linkage maps (the barley single nucleotide polymorphism database of the Scottish Crop Research Institute, the barley transcript map of Leibniz Institute of Plant Genetics and Crop Plant Research, and HarvEST barley ver. 1.63 and 1 diploid wheat map. These data were imported to CMap to allow the visualization of the map positions of the ESTs and interrelationships of these ESTs with public gene models and representative cDNA sequences. The retrieved cDNA sequences corresponding to each EST marker were assigned to the rice genome to predict an exon-intron structure. 
Furthermore, to generate a unique set of EST markers in Triticeae plants among the public domain, 3472 markers were

  17. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    Science.gov (United States)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While there are currently many data-logging options for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share their data. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) adoption of a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols; (2) information standardization: our system supports several popular communication protocols and data formats; and (3) extensible data support: our

  18. Numerical study and design optimization of electromagnetic energy harvesters integrated with flexible magnetic materials

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Sang Won [Hanyang University, Seoul (Korea, Republic of)

    2017-05-15

    This study presents a new design of an electromagnetic energy harvester integrated with a soft magnetic material. The harvester design optimizes the magnetic material characteristics and the size of a rectangular permanent magnet. The design employs a complete magnetic circuit made of (1) a thin-film soft magnetic material that facilitates a flexible but highly (magnetically) permeable beam and (2) an optimally-sized magnet that maximizes the harvester performance. The design is demonstrated to reduce magnetic flux leakage, and thus considerably enhances both the magnetic flux density (B) and its rate of change over time (dB/dt), both of which influence harvester performance. The improvement in harvester performance strongly depends on critical design parameters, especially the magnet size and the characteristics of the magnetic materials, including permeability, stiffness, and thickness. The analyses conclude that recently-introduced nanomaterials (having ultrahigh magnetic permeability) can potentially transform harvester performance. However, the performance may be degraded without design optimization. Once optimized, the integrated nanomaterials facilitate a significant improvement compared with a conventional design without integrated magnetic materials.
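The role of dB/dt can be illustrated with Faraday's law applied to a sampled flux signal; the values below are arbitrary and merely demonstrate the finite-difference estimate of the induced voltage.

```python
import math

def induced_emf(flux, dt):
    """Faraday's law for a single-turn coil, emf = -dPhi/dt, estimated
    with central finite differences; returns one value per interior sample."""
    return [-(flux[i+1] - flux[i-1]) / (2*dt)
            for i in range(1, len(flux) - 1)]

# Sinusoidal flux Phi(t) = Phi0 * sin(2*pi*f*t) through a single-turn coil
# (one full 50 Hz period sampled every 10 microseconds).
f, phi0, dt = 50.0, 1e-3, 1e-5
flux = [phi0 * math.sin(2*math.pi*f*i*dt) for i in range(2001)]
emf = induced_emf(flux, dt)
```

The peak induced voltage equals 2*pi*f*Phi0, which is why enhancing either the flux amplitude or its rate of change directly raises harvester output.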

  19. Numerical study and design optimization of electromagnetic energy harvesters integrated with flexible magnetic materials

    International Nuclear Information System (INIS)

    Yoon, Sang Won

    2017-01-01

    This study presents a new design of an electromagnetic energy harvester integrated with a soft magnetic material. The harvester design optimizes the magnetic material characteristics and the size of a rectangular permanent magnet. The design employs a complete magnetic circuit made of (1) a thin-film soft magnetic material that facilitates a flexible but highly (magnetically) permeable beam and (2) an optimally-sized magnet that maximizes the harvester performance. The design is demonstrated to reduce magnetic flux leakage, and thus considerably enhances both the magnetic flux density (B) and its rate of change over time (dB/dt), both of which influence harvester performance. The improvement in harvester performance strongly depends on critical design parameters, especially the magnet size and the characteristics of the magnetic materials, including permeability, stiffness, and thickness. The analyses conclude that recently-introduced nanomaterials (having ultrahigh magnetic permeability) can potentially transform harvester performance. However, the performance may be degraded without design optimization. Once optimized, the integrated nanomaterials facilitate a significant improvement compared with a conventional design without integrated magnetic materials.

  20. Numerical Weather Prediction and Relative Economic Value framework to improve Integrated Urban Drainage- Wastewater management

    DEFF Research Database (Denmark)

    Courdent, Vianney Augustin Thomas

    domains during which the IUDWS can be coupled with the electrical smart grid to optimise its energy consumption. The REV framework was used to determine which decision threshold of the EPS (i.e. number of ensemble members predicting an event) provides the highest benefit for a given situation...... in cities where space is scarce and large-scale construction work a nuisance. This thesis focuses on flow domain predictions of IUDWS from numerical weather prediction (NWP) to select relevant control objectives for the IUDWS and develops a framework based on the relative economic value (REV) approach...... to evaluate when acting on the forecast is beneficial or not. Rainfall forecasts are extremely valuable for estimating near future storm-water-related impacts on the IUDWS. Therefore, weather radar extrapolation “nowcasts” provide valuable predictions for RTC. However, radar nowcasts are limited...

  1. Integration of finite element analysis and numerical optimization techniques for RAM transport package design

    International Nuclear Information System (INIS)

    Harding, D.C.; Eldred, M.S.; Witkowski, W.R.

    1995-01-01

    Type B radioactive material transport packages must meet strict Nuclear Regulatory Commission (NRC) regulations specified in 10 CFR 71. Type B containers include impact limiters, radiation or thermal shielding layers, and one or more containment vessels. In the past, each component was typically designed separately based on its driving constraint and the expertise of the designer. The components were subsequently assembled and the design modified iteratively until all of the design criteria were met. This approach neglects the fact that components may serve secondary purposes as well as primary ones. For example, an impact limiter's primary purpose is to act as an energy absorber and protect the contents of the package, but can also act as a heat dissipater or insulator. Designing the component to maximize its performance with respect to both objectives can be accomplished using numerical optimization techniques
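The idea of sizing one component against two objectives at once, rather than designing for the driving constraint alone, can be sketched with a weighted-sum score. The 1/t and t scalings for peak deceleration and mass below are illustrative assumptions, not 10 CFR 71 physics.

```python
def best_thickness(candidates, w_decel=1.0, w_mass=1.0):
    """Toy weighted-sum screening of impact-limiter thickness t: assume
    peak deceleration scales ~1/t (more crush distance) while the mass
    penalty grows ~t. The optimum of w1/t + w2*t lies at t = sqrt(w1/w2)."""
    return min(candidates, key=lambda t: w_decel / t + w_mass * t)

# Screen candidate thicknesses from 0.1 to 2.0 (arbitrary units).
t_opt = best_thickness([i / 10 for i in range(1, 21)])
```

With equal weights the score is minimized at t = 1.0; shifting the weights moves the optimum, which is the essence of trading a component's primary purpose against its secondary ones instead of iterating component-by-component.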

  2. Emerging opportunities in enterprise integration with open architecture computer numerical controls

    Science.gov (United States)

    Hudson, Christopher A.

    1997-01-01

    The shift to open-architecture machine tool computer numerical controls is providing new opportunities for metal-working oriented manufacturers to streamline the entire 'art to part' process. Production cycle times, accuracy, consistency, predictability and process reliability are just some of the factors that can be improved, leading to better manufactured product at lower costs. Open-architecture controllers are allowing manufacturers to apply general-purpose software and hardware tools where previous approaches relied on proprietary and unique hardware and software. This includes DNC, SCADA, CAD, and CAM, where the increasing use of general-purpose components is leading to lower-cost systems that are also more reliable and robust than the past proprietary approaches. In addition, a number of new opportunities exist which in the past were likely impractical due to cost or performance constraints.

  3. Efficient O(N) integration for all-electron electronic structure calculation using numeric basis functions

    International Nuclear Information System (INIS)

    Havu, V.; Blum, V.; Havu, P.; Scheffler, M.

    2009-01-01

    We consider the problem of developing O(N) scaling grid-based operations needed in many central operations when performing electronic structure calculations with numeric atom-centered orbitals as basis functions. We outline the overall formulation of localized algorithms, and specifically the creation of localized grid batches. The choice of the grid partitioning scheme plays an important role in the performance and memory consumption of the grid-based operations. Three different top-down partitioning methods are investigated, and compared with formally more rigorous yet much more expensive bottom-up algorithms. We show that a conceptually simple top-down grid partitioning scheme achieves essentially the same efficiency as the more rigorous bottom-up approaches.
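A conceptually simple top-down partitioning of the kind the record favours can be sketched as recursive bisection along the longest bounding-box axis until every batch is small enough. This is an illustration of the idea, not the authors' algorithm.

```python
def partition(points, max_batch):
    """Top-down grid partitioning: recursively split the point set at the
    median of its longest bounding-box axis until each batch holds at
    most max_batch points. Returns a list of batches."""
    if len(points) <= max_batch:
        return [points]
    axis = max(range(3),
               key=lambda a: max(p[a] for p in points) - min(p[a] for p in points))
    pts = sorted(points, key=lambda p: p[axis])
    mid = len(pts) // 2
    return partition(pts[:mid], max_batch) + partition(pts[mid:], max_batch)

# A 5 x 5 x 4 block of integration-grid points split into small batches.
grid = [(i % 5, (i // 5) % 5, i // 25) for i in range(100)]
batches = partition(grid, max_batch=16)
```

Because each split halves the set, every grid point lands in exactly one spatially compact batch, which is what makes localized O(N) grid operations possible.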

  4. Numerical Development

    Science.gov (United States)

    Siegler, Robert S.; Braithwaite, David W.

    2016-01-01

    In this review, we attempt to integrate two crucial aspects of numerical development: learning the magnitudes of individual numbers and learning arithmetic. Numerical magnitude development involves gaining increasingly precise knowledge of increasing ranges and types of numbers: from non-symbolic to small symbolic numbers, from smaller to larger…

  5. Numerical simulation of experimental data from planar SIS mixers with integrated tuning elements

    International Nuclear Information System (INIS)

    Mears, C.A.; Hu, Qing; Richards, P.L.

    1988-08-01

    We have used the full Tucker theory including the quantum susceptance to fit data from planar lithographed mm-wave mixers with bow tie antennas and integrated RF coupling elements. Essentially perfect fits to pumped IV curves have been obtained. The deduced imbedding admittances agree well with those independently calculated from the geometry of the antenna and matching structures. We find that the quantum susceptance is essential to the fit and thus to predictions of the mixer performance. For junctions with moderately sharp gap structures, the quantum susceptance is especially important in the production of steps with low and/or negative dynamic conductance. 15 refs., 4 figs

  6. Parallel Implementation of Numerical Solution of Few-Body Problem Using Feynman’s Continual Integrals

    Directory of Open Access Journals (Sweden)

    Naumenko Mikhail

    2018-01-01

    Full Text Available A modern parallel computing algorithm has been applied to the solution of the few-body problem. The approach is based on Feynman's continual integrals method implemented in the C++ programming language using NVIDIA CUDA technology. A wide range of 3-body and 4-body bound systems has been considered, including nuclei described as consisting of protons and neutrons (e.g., 3,4He) and nuclei described as consisting of clusters and nucleons (e.g., 6He). The correctness of the results was checked by comparison with the exactly solvable 4-body oscillatory system and experimental data.
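The continual-integral idea can be demonstrated in its simplest deterministic form: composing the imaginary-time short-time kernel on a 1D grid and reading the ground-state energy off the dominant eigenvalue. The harmonic oscillator below (hbar = m = 1, exact E0 = 0.5) is a toy one-body stand-in for the few-body systems of the record, and runs on the CPU rather than CUDA.

```python
import math

def ground_state_energy(v, tau=0.05, x_max=6.0, n=121, iters=300):
    """Imaginary-time path integral on a grid: the symmetric short-time
    kernel approximating exp(-tau*H) is applied repeatedly (power
    iteration); its dominant eigenvalue exp(-tau*E0) yields the
    ground-state energy. Units with hbar = m = 1."""
    dx = 2*x_max/(n - 1)
    xs = [-x_max + i*dx for i in range(n)]
    c = 1.0/math.sqrt(2*math.pi*tau)
    # Symmetric Trotter kernel: free propagation times half-step potentials.
    K = [[c*math.exp(-(xi - xj)**2/(2*tau) - tau*(v(xi) + v(xj))/2)
          for xj in xs] for xi in xs]
    psi = [math.exp(-x*x) for x in xs]     # any positive trial state works
    lam = 1.0
    for _ in range(iters):
        psi = [dx*sum(K[i][j]*psi[j] for j in range(n)) for i in range(n)]
        lam = max(psi)                     # dominant eigenvalue (positive kernel)
        psi = [p/lam for p in psi]
    return -math.log(lam)/tau

e0 = ground_state_energy(lambda x: 0.5*x*x)   # harmonic oscillator potential
```

The repeated kernel application is exactly the discretized sum over paths; on a GPU, as in the record, each matrix-vector product parallelizes over grid points.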

  7. Parallel Implementation of Numerical Solution of Few-Body Problem Using Feynman's Continual Integrals

    Science.gov (United States)

    Naumenko, Mikhail; Samarin, Viacheslav

    2018-02-01

    A modern parallel computing algorithm has been applied to the solution of the few-body problem. The approach is based on Feynman's continual integrals method implemented in C++ using NVIDIA CUDA technology. A wide range of 3-body and 4-body bound systems has been considered, including nuclei described as consisting of protons and neutrons (e.g., 3,4He) and nuclei described as consisting of clusters and nucleons (e.g., 6He). The correctness of the results was checked by comparison with the exactly solvable 4-body oscillatory system and with experimental data.

  8. An integrated numerical model for the prediction of Gaussian and billet shapes

    DEFF Research Database (Denmark)

    Hattel, Jesper; Pryds, Nini; Pedersen, Trine Bjerre

    2004-01-01

    Separate models for the atomisation and the deposition stages were recently integrated by the authors to form a unified model describing the entire spray-forming process. In the present paper, the focus is on describing the shape of the deposited material during the spray-forming process, obtained...... by this model. After a short review of the models and their coupling, the important factors which influence the resulting shape, i.e. Gaussian or billet, are addressed. The key parameters, which are utilized to predict the geometry and dimension of the deposited material, are the sticking efficiency...

  9. Direct numerical solution of the Ornstein-Zernike integral equation and spatial distribution of water around hydrophobic molecules

    Science.gov (United States)

    Ikeguchi, Mitsunori; Doi, Junta

    1995-09-01

    The Ornstein-Zernike integral equation (OZ equation) has been used to evaluate the distribution function of solvents around solutes, but its numerical solution is difficult for molecules with a complicated shape. This paper proposes a numerical method to solve the OZ equation directly by introducing a 3D lattice. The method employs none of the approximations that the reference interaction site model (RISM) equation requires, and enables one to obtain the spatial distribution of spherical solvents around solutes of arbitrary shape. Numerical accuracy is sufficient when the grid spacing is less than 0.5 Å for solvent water. The spatial water distribution around a propane molecule is demonstrated as an example of a nonspherical hydrophobic molecule using iso-value surfaces; the water model proposed by Pratt and Chandler is used. The distribution agrees with molecular dynamics simulation and is enhanced near molecular concavities. The spatial distribution of water around 5α-cholest-2-ene (C27H46) is visualized using computer graphics techniques, and a similar trend is observed.
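    The direct-lattice idea can be sketched in a simplified linear setting: with a model direct correlation function c(r) held fixed, the OZ relation h = c + ρ c⊛h becomes a linear fixed-point problem that Picard iteration with FFT convolutions solves on a periodic 3D grid. This is an illustrative assumption for this listing (no closure relation, a spherical Gaussian c), not the paper's full algorithm:

```python
import numpy as np

def solve_oz_3d(c, rho, dx, max_iter=500, tol=1e-12):
    """Picard iteration for h = c + rho * (c convolved with h)
    on a periodic 3D lattice; dx**3 is the volume element."""
    c_k = np.fft.fftn(c)
    h = c.copy()
    for _ in range(max_iter):
        conv = np.fft.ifftn(c_k * np.fft.fftn(h)).real * dx**3
        h_new = c + rho * conv
        if np.max(np.abs(h_new - h)) < tol:
            break
        h = h_new
    return h_new
```

    Because the problem is linear here, the iterate can be checked against the closed-form Fourier-space solution ĥ = ĉ/(1 − ρĉ); convergence requires |ρĉ(k)| < 1 for all k.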

  10. Proteomic biomarkers for ovarian cancer risk in women with polycystic ovary syndrome: a systematic review and biomarker database integration.

    Science.gov (United States)

    Galazis, Nicolas; Olaleye, Olalekan; Haoula, Zeina; Layfield, Robert; Atiomo, William

    2012-12-01

    To review and identify possible biomarkers for ovarian cancer (OC) in women with polycystic ovary syndrome (PCOS). Systematic literature searches of MEDLINE, EMBASE, and Cochrane using the search terms "proteomics," "proteomic," and "ovarian cancer" or "ovarian carcinoma." Proteomic biomarkers for OC were then integrated with an updated previously published database of all proteomic biomarkers identified to date in patients with PCOS. Academic department of obstetrics and gynecology in the United Kingdom. A total of 180 women identified in the six studies. Tissue samples from women with OC vs. tissue samples from women without OC. Proteomic biomarkers, proteomic technique used, and methodologic quality score. A panel of six biomarkers was overexpressed both in women with OC and in women with PCOS. These biomarkers include calreticulin, fibrinogen-γ, superoxide dismutase, vimentin, malate dehydrogenase, and lamin B2. These biomarkers could help improve our understanding of the links between PCOS and OC and could potentially be used to identify subgroups of women with PCOS at increased risk of OC. More studies are required to further evaluate the role these biomarkers play in women with PCOS and OC. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  11. Numerical Simulation of Fluidized Bed Gasifier for Integrated Gasification Combined Cycle

    Directory of Open Access Journals (Sweden)

    CHEN Ju-hui

    2017-06-01

    The overall thermal efficiency of the integrated gasification combined cycle (IGCC) has not been sufficiently improved. In order to achieve higher power generation efficiency, an advanced IGCC technology based on the concept of exergy recovery has been developed. Viewed from the overall system structure, this technology integrates current power generation technologies with advanced ones: exhaust heat from a gas turbine or a solid oxide fuel cell is recovered by the endothermic reaction of steam in the gasifier. It is estimated that such exergy recycling has the advantage of making it easy to separate and capture CO2, which makes it very attractive, and that it can increase the overall efficiency by 10% or more. The characteristics of the fluidized bed gasifier, one of the core pieces of equipment of the IGCC system, and its effect on the whole system were studied.

  12. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. IOC is a linkage control system between sub-projects to share and integrate research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage documents and reports after project accomplishment.

  13. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. IOC is a linkage control system between sub-projects to share and integrate research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage documents and reports after project accomplishment

  14. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    Science.gov (United States)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    Within the framework of the transition to renewable energy sources ("Energiewende"), the German government defined the target of producing 60 % of the final energy consumption from renewable energy sources by the year 2050. However, renewable energies are subject to natural fluctuations. Energy storage can help to buffer the resulting time shifts between production and demand. Subsurface geological structures provide large potential capacities for energy stored in the form of heat or gas on daily to seasonal time scales. In order to explore this potential sustainably, the possible induced effects of energy storage operations have to be quantified for both specified normal operation and events of failure. The ANGUS+ project therefore integrates experimental laboratory studies with numerical approaches to assess subsurface energy storage scenarios and monitoring methods. Subsurface storage options for gas, i.e. hydrogen, synthetic methane and compressed air in salt caverns or porous structures, as well as subsurface heat storage are investigated with respect to site prerequisites, storage dimensions, induced effects, monitoring methods and integration into spatial planning schemes. The conceptual interdisciplinary approach of the ANGUS+ project towards the integration of subsurface energy storage into a sustainable subsurface planning scheme is presented here, and this approach is then demonstrated using the examples of two selected energy storage options: Firstly, the option of seasonal heat storage in a shallow aquifer is presented. Coupled thermal and hydraulic processes induced by periodic heat injection and extraction were simulated in the open-source numerical modelling package OpenGeoSys. Situations of specified normal operation as well as cases of failure in operational storage with leaking heat transfer fluid are considered. Bench-scale experiments provided parameterisations of temperature dependent changes in shallow groundwater hydrogeochemistry. As a

  15. Integrated numerical design of an innovative Lower Hybrid launcher for Alcator C-Mod

    International Nuclear Information System (INIS)

    Meneghini, O.; Shiraiwa, S.; Beck, W.; Irby, J.; Koert, P.; Parker, R. R.; Viera, R.; Wukitch, S.; Wilson, J.

    2009-01-01

    The new Alcator C-Mod LHCD system (LH2) is based on the concept of a four way splitter [1] which evenly splits the RF power among the four waveguides that compose one of the 16 columns of the LH grill. In this work several simulation tools have been used to study the LH2 coupling performance and the launched spectra when facing a plasma, numerically verifying the effectiveness of the four way splitter concept and further improving its design. The TOPLHA code has been used for modeling reflections at the antenna/plasma interface. TOPLHA results have then been coupled to the commercial code CST Microwave Studio to efficiently optimize the four way splitter geometry for several plasma scenarios. Subsequently, the COMSOL Multiphysics code has been used to self-consistently take into account the electromagnetic-thermal-structural interactions. This comprehensive and predictive analysis has proven very valuable for understanding the behavior of the system when facing the plasma and has profoundly influenced several design choices of the LH2. According to the simulations, the final design ensures even poloidal power splitting for a wide range of plasma parameters, which ultimately results in an improvement of the wave coupling and an increased maximum operating power.

  16. An Integrated Numerical and Experimental Analysis for Enhancing the Performance of the Hidden Ceiling Fan

    Directory of Open Access Journals (Sweden)

    Sheam-Chyun Lin

    2014-02-01

    Since the inlet and outlet of a hidden ceiling fan are located in almost the same plane, an improper housing may cause an inhale-return phenomenon which significantly affects power consumption and performance. In this study, a comprehensive investigation by numerical and experimental techniques was used to predict and identify the flow pattern, airflow rate, efficiency, and noise for ceiling fans with different design parameters. The results showed that the inhale-return phenomenon occurs for an inappropriate housing. Several key parameters, such as the fan guard, housing ring, inlet-to-outlet area ratio, and blockage height, are evaluated to find a criterion for avoiding inhale-return flow. The study finds that the fan guard changes the airflow to a wider distribution with a lower velocity. A minimum blockage distance and a maximum ring-plate height are set at 80 mm and 30 mm, respectively, and it is suggested that the inlet area must be bigger than the outlet area. Moreover, all the parameters show the same trend under various rotational speeds. In conclusion, this systematic investigation not only gives the fan engineer the ability to design against the inhale-return phenomenon, but also the capability to predict aerodynamic and acoustic performance.

  17. Numerical studies of the integration of a trapped vortex combustor into traditional combustion chambers

    Energy Technology Data Exchange (ETDEWEB)

    Patrignani, L.; Losurdo, M.; Bruno, C. [Sapienza Univ. de Roma, Rome (Italy)

    2010-09-15

    Exhaust emissions from furnace burners can be reduced by premixing reactants with combustion products. This paper discussed the use of a trapped vortex combustor (TVC) as a very promising technology for gas turbines. The TVC can reduce emissions and ensure that the temperature is uniform in the exhaust products, which is a key aspect for certain types of heat treatments, such as in steel rolling mills. The TVC for gas turbines is configured to mix air, fuel and hot products at turbulent scales fine enough to render the combustion mode flameless, or close to flameless. The vortex ensures a high recirculation factor between hot combustion products and reactants, and ultimately flame stability. In this study, the TVC configuration for an existing gas turbine was numerically investigated by means of RANS and LES. According to preliminary results of the fast-flameless combustion (FFC) strategy, the proposed TVC is a suitable candidate to reduce nitrogen oxide (NOx) emissions while keeping the pressure drop below 1 per cent. Both RANS and LES show that too much fuel burns along the main duct. Better fuel splitting or a different position for the injectors may enhance combustion inside the recirculation zone. Behaviour of the main vortices showed that a more accurate design of the internal shape of the combustor is needed to prevent excessive velocity fluctuation or vortex instabilities and therefore emissions. 13 refs., 9 figs.

  18. Analysis of Enhancement in Available Power Transfer Capacity by STATCOM Integrated SMES by Numerical Simulation Studies

    DEFF Research Database (Denmark)

    Saraswathi, Ananthavel; Sanjeevikumar, Padmanaban; Shanmugham, Sutha

    2016-01-01

    Power system research is mainly focused on enhancing the available power capacity of existing transmission lines. But still, no prominent solutions have been made due to several factors that affect the transmission lines, which include the length, aging of the cables and losses...... on generation, transmission and distribution etc. This paper exploited the integration of a static synchronous compensator (STATCOM) and superconducting magnetic energy storage (SMES), which is then connected to an existing power transmission line to enhance the available power transfer capacity (ATC). STATCOM is...... a power electronic voltage source converter (VSC) which is connected to the transmission system for shunt reactive power and harmonics compensation. SMES is a renowned clean energy storage technology. The proposed power system can control the real as well as the reactive power flow......

  19. Numerical study on coolant flow distribution at the core inlet for an integral pressurized water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Lin; Peng, Min Jun; Xia, Genglei; Lv, Xing; Li, Ren [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, Harbin Engineering University, Harbin (China)

    2017-02-15

    When an integral pressurized water reactor is operated under low power conditions, once-through steam generator group operation strategy is applied. However, group operation strategy will cause nonuniform coolant flow distribution at the core inlet and lower plenum. To help coolant flow mix more uniformly, a flow mixing chamber (FMC) has been designed. In this paper, computational fluid dynamics methods have been used to investigate the coolant distribution by the effect of FMC. Velocity and temperature characteristics under different low power conditions and optimized FMC configuration have been analyzed. The results illustrate that the FMC can help improve the nonuniform coolant temperature distribution at the core inlet effectively; at the same time, the FMC will induce more resistance in the downcomer and lower plenum.

  20. Results of numerically solving an integral equation for a two-fermion system

    International Nuclear Information System (INIS)

    Skachkov, N.B.; Solov'eva, T.M.

    2003-01-01

    A two-particle system is described by integral equations whose kernels are dependent on the total energy of the system. Such equations can be reduced to an eigenvalue problem featuring an eigenvalue-dependent operator. This nonlinear eigenvalue problem is solved by means of an iterative procedure developed by the present authors. The energy spectra of a two-fermion system formed by particles of identical masses are obtained for two cases, that where the total spin of the system is equal to zero and that where the total spin of the system is equal to unity. The splitting of the ground-state levels of positronium and dimuonium, the frequency of the transition from the ground state of orthopositronium to its first excited state, and the probabilities of parapositronium and paradimuonium decays are computed. The results obtained in this way are found to be in good agreement with experimental data
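    The iterative procedure for an eigenvalue-dependent operator can be sketched generically: solve H(E)v = Ev by repeatedly diagonalizing the operator at the current energy and feeding the resulting eigenvalue back in until it is self-consistent. The small matrix model below is a hypothetical toy assumed for illustration, not the two-fermion kernel of the paper:

```python
import numpy as np

def solve_energy_dependent(H, E0=0.0, tol=1e-12, max_iter=100):
    """Fixed-point iteration for the nonlinear eigenproblem H(E) v = E v.
    H is a callable returning a real symmetric matrix for a given energy E."""
    E = E0
    for _ in range(max_iter):
        w, V = np.linalg.eigh(H(E))
        E_new, v = w[0], V[:, 0]  # track the lowest level
        if abs(E_new - E) < tol:
            return E_new, v
        E = E_new
    return E, v
```

    The iteration converges when the energy dependence of the kernel is a contraction near the level being tracked; at the fixed point, (E, v) satisfies the nonlinear eigenvalue equation exactly.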

  1. An integrated numerical model for the prediction of Gaussian and billet shapes

    International Nuclear Information System (INIS)

    Hattel, J.H.; Pryds, N.H.; Pedersen, T.B.

    2004-01-01

    Separate models for the atomisation and the deposition stages were recently integrated by the authors to form a unified model describing the entire spray-forming process. In the present paper, the focus is on describing the shape of the deposited material during the spray-forming process, obtained by this model. After a short review of the models and their coupling, the important factors which influence the resulting shape, i.e. Gaussian or billet, are addressed. The key parameters, which are utilized to predict the geometry and dimension of the deposited material, are the sticking efficiency and the shading effect for Gaussian and billet shape, respectively. From the obtained results, the effect of these parameters on the final shape is illustrated

  2. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  3. Evaluation of diffuse hepatic diseases by integrated image, SPECT and numerical taxonomic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Shin

    1987-02-01

    In 135 patients with various hepatic diseases, cardiopulmonary circulation and hepatic accumulation of the activity were recorded for 100 sec after bolus injection of 111-222 MBq (3-6 mCi) of 99mTc-phytate, and then integrated as a single image. Anterior, right lateral and posterior planar images, and hepatosplenic SPECT images were obtained thereafter. The lung to liver count ratio (P/L) was estimated from the integrated image. Liver volume (HV), spleen volume (SV) and the liver to spleen count ratio (MHC/MSC) were calculated using the data obtained by SPECT. P/L was useful as an index of effective hepatic blood flow. MHC/MSC was closely correlated with the grade of portal hypertension. HV or SV alone showed low clinical value in discriminating liver diseases. Principal component analysis was applied to the 4 above-mentioned radionuclide data and the following 11 laboratory data: total serum protein, serum albumin, glutamate oxaloacetate transaminase (GOT), glutamate pyruvate transaminase (GPT), lactic dehydrogenase (LDH), alkaline phosphatase (AL-P), zinc sulfate turbidity test (ZTT), thymol turbidity test (TTT), γ-glutamyl transpeptidase (γ-GTP), cholinesterase (Ch-E), and total bilirubin (T-Bil). These fifteen data were condensed to 5 principal components, and cluster analysis was then carried out among the 135 patients. The subjects were classified into 7 small groups. In groups (G) I to GIII the frequency of liver cirrhosis was high, while in GIV to GVII the frequency of normal cases increased gradually. From the above results, cluster analysis seemed to reflect the pathophysiological state and the grade of the disease. This method might be useful for estimating the grade of damage in diffuse hepatic disease and could be a good objective evaluation method in follow-up studies. (J.P.N.).

  4. Directory of Factual and Numeric Databases of Relevance to Aerospace and Defence R and D (Repertoire de Bases de donnees Factuelles ou Numeriques d’interet pour la R and D).

    Science.gov (United States)

    1992-07-01

    [Excerpt of organization entries from the directory:] Universidad Autónoma de Madrid, Faculty of Science, Institute of Materials Sciences (Instituto de Ciencia de Materiales, Facultad de Ciencias, Universidad Autónoma de Madrid). Construcciones Aeronauticas SA (CASA), point of contact: J. Pascual, Laboratory, Aeropuerto de San Pablo, 41007 Sevilla. Directory of Factual and Numeric Databases of Relevance to Aerospace and Defence R & D (Répertoire de Bases de données Factuelles ou Numériques d'intérêt pour la R & D).

  5. Runge–Kutta type methods with special properties for the numerical integration of ordinary differential equations

    International Nuclear Information System (INIS)

    Kalogiratou, Z.; Monovasilis, Th.; Psihoyios, G.; Simos, T.E.

    2014-01-01

    In this work we review single step methods of the Runge–Kutta type with special properties. Among them are methods specially tuned to integrate problems that exhibit a pronounced oscillatory character and such problems arise often in celestial mechanics and quantum mechanics. Symplectic methods, exponentially and trigonometrically fitted methods, minimum phase-lag and phase-fitted methods are presented. These are Runge–Kutta, Runge–Kutta–Nyström and Partitioned Runge–Kutta methods. The theory of constructing such methods is given as well as several specific methods. In order to present the performance of the methods we have tested 58 methods from all categories. We consider the two dimensional harmonic oscillator, the two body problem, the pendulum problem and the orbital problem studied by Stiefel and Bettis. Also we have tested the methods on the computation of the eigenvalues of the one dimensional time independent Schrödinger equation with the harmonic oscillator, the doubly anharmonic oscillator and the exponential potentials
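    Among the special-purpose single step methods reviewed above, symplectic ones are the easiest to demonstrate: the Störmer-Verlet scheme, a second-order symplectic partitioned Runge-Kutta method, keeps the energy error of a harmonic oscillator bounded over many periods instead of drifting. The sketch below is this listing's illustration, not one of the 58 methods tested in the paper:

```python
def verlet(q, p, omega, dt, steps):
    """Stormer-Verlet (2nd-order symplectic partitioned RK) for q'' = -omega**2 * q."""
    for _ in range(steps):
        p -= 0.5 * dt * omega**2 * q  # half kick
        q += dt * p                    # drift
        p -= 0.5 * dt * omega**2 * q  # half kick
    return q, p

def energy(q, p, omega):
    """Hamiltonian of the harmonic oscillator (unit mass)."""
    return 0.5 * p**2 + 0.5 * omega**2 * q**2
```

    After ten thousand steps (roughly eighty periods at dt = 0.05), the energy stays within O(dt²) of its initial value 0.5, which is the hallmark of a symplectic integrator; a non-symplectic explicit Euler step would drift monotonically.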

  6. Development and Validation of a Momentum Integral Numerical Analysis Code for Liquid Metal Fast Reactor

    International Nuclear Information System (INIS)

    Chen, Xiangyi; Suh, Kune Y.

    2016-01-01

    In this work, a benchmark problem is used to assess the precision of the upgraded in-house code MINA. The results from different best-estimate codes employing various grid spacer pressure drop correlations are compared to suggest the best one. With In's method modified, the code presents good agreement with the experimental data, as shown in Figure 7. The failure of the prediction in previous work was caused by the use of Rehme's method, which is categorized into four groups according to different fitting strategies. Through comparison of the drag coefficients calculated by the four groups of Rehme's method, the equivalent drag coefficient calculated by In's method, and the experimental data shown in Figure 8, we conclude that Rehme's method considerably underestimates the drag coefficients in the grid spacers used in HELIOS, while In's method gives a reasonable prediction. Starting from the core inlet, the accumulated pressure losses are presented in Figure 9 along the accumulated length of the forced convection flow path; the good agreement of the MINA prediction with the experimental results shows that MINA has very good capability in integrated momentum analysis, which makes it robust for future design scoping method development for the LFR.

  7. Development and Validation of a Momentum Integral Numerical Analysis Code for Liquid Metal Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xiangyi; Suh, Kune Y. [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this work, a benchmark problem is used to assess the precision of the upgraded in-house code MINA. The results from different best-estimate codes employing various grid spacer pressure drop correlations are compared to suggest the best one. With In's method modified, the code presents good agreement with the experimental data, as shown in Figure 7. The failure of the prediction in previous work was caused by the use of Rehme's method, which is categorized into four groups according to different fitting strategies. Through comparison of the drag coefficients calculated by the four groups of Rehme's method, the equivalent drag coefficient calculated by In's method, and the experimental data shown in Figure 8, we conclude that Rehme's method considerably underestimates the drag coefficients in the grid spacers used in HELIOS, while In's method gives a reasonable prediction. Starting from the core inlet, the accumulated pressure losses are presented in Figure 9 along the accumulated length of the forced convection flow path; the good agreement of the MINA prediction with the experimental results shows that MINA has very good capability in integrated momentum analysis, which makes it robust for future design scoping method development for the LFR.

  8. Numerical simulation of the integrated solar/North Benghazi combined power plant

    International Nuclear Information System (INIS)

    Aldali, Y.; Morad, K.

    2016-01-01

    Highlights: • The thermodynamic and economic evaluation of the power plant has been studied. • Saving and boosting modes are considered with the same solar field area. • Two modes of operation have been used and simulated under Libyan climate conditions. • The benefit/cost ratios are 1.74 and 1.30 for the fuel saving and power boosting modes. • The fuel saving mode is more economical than the power boosting mode. - Abstract: The aim of this paper is to study the thermodynamic performance of a proposed integrated solar/North Benghazi combined power plant under Libyan climatic conditions. A parabolic trough collector field with direct steam generation was considered as the solar system. Two modes of operation with the same solar field area are considered: a fuel saving mode, in which the generated solar steam is used to preheat the combustion air in the gas turbine unit, and a power boosting mode, in which the generated solar steam is added to the steam turbine for boosting the electrical power generated from the steam turbine unit. Moreover, the economic impact of solar energy is assessed in the form of a benefit/cost ratio to justify the substitution potential of such clean energy. This study shows that, for the fuel saving mode, the annual savings in natural gas consumption and CO2 emission are approximately 3001.56 and 7972.25 tons, respectively, in comparison with the conventional North Benghazi combined cycle power plant. For the power boosting mode, the annual solar share of electrical energy is approximately 93.33 GW h. The economic analysis of the solar supported plant indicates benefit/cost ratios of 1.74 and 1.30 for the fuel saving and power boosting modes; the fuel saving mode is therefore more economical than the power boosting mode for the same solar field area, and it moreover reduces greenhouse CO2 emissions, helping to avoid a collapse of the world climate.
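    The benefit/cost ratio used above is the ratio of discounted benefits to discounted costs. A generic sketch with hypothetical cash flows and a hypothetical discount rate (the paper's actual inputs are not reproduced here):

```python
def benefit_cost_ratio(benefits, costs, rate):
    """Present-value benefit/cost ratio; cash flows are indexed by year (0-based)."""
    pv = lambda flows: sum(f / (1 + rate) ** t for t, f in enumerate(flows))
    return pv(benefits) / pv(costs)

# hypothetical cash flows (currency units per year), NOT the paper's data:
# an upfront cost of 100 followed by small O&M costs, against recurring benefits
ratio = benefit_cost_ratio([0, 40, 40, 40], [100, 5, 5, 5], rate=0.05)
```

    A ratio above 1 indicates that discounted benefits exceed discounted costs, which is the sense in which the paper's values of 1.74 and 1.30 justify both solar modes.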

  9. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    Science.gov (United States)

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    With its laboratory information management system (LIMS), nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database, while full access to nmrshiftdb2's World Wide Web database is granted. This freely available system allows, on the one hand, the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for the staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics functionality for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  10. Monitoring concept for structural integration of PZT-fiber arrays in metal sheets: a numerical and experimental study

    Science.gov (United States)

    Drossel, Welf-Guntram; Schubert, Andreas; Putz, Matthias; Koriath, Hans-Joachim; Wittstock, Volker; Hensel, Sebastian; Pierer, Alexander; Müller, Benedikt; Schmidt, Marek

    2018-01-01

    The technique of joining by forming allows the structural integration of piezoceramic fibers into locally microstructured metal sheets without any elastic interlayers. High-volume production of the joining partners results in statistical deviations from the nominal dimensions. A numerical simulation of the geometric process sensitivity shows that these deviations have a highly significant influence on the resulting fiber stresses after the joining-by-forming operation and demonstrates the necessity of a monitoring concept. On this basis, the electromechanical behavior of piezoceramic array transducers is investigated experimentally before, during and after the joining process. The piezoceramic array transducer consists of an arrangement of five electrically interconnected piezoceramic fibers. The findings show that the impedance spectrum depends on the fiber stresses and can be used for in-process monitoring during the joining process. Based on the impedance values, the preload state of the interconnected piezoceramic fibers can be specifically controlled and a fiber overload avoided.

  11. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    Science.gov (United States)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal) dominated by snow and semi-arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high resolution input data and detailed catchment characterization. Point source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone and saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and to outline parameters that are not accounted for in standard methods but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., in varying slopes, land cover, precipitation intensity, and soil properties, as well as point source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters allows quantifying their impact on recharge and indirectly on vulnerability. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each one of them, based on a quantitative approach.

  12. Integral and differential methods for the numerical solution of 2-D field problems in high energy physics magnets and electrical machines

    International Nuclear Information System (INIS)

    Hannalla, A.Y.; Simkin, J.; Trowbridge, C.W.

    1979-10-01

    Numerical calculations of electromagnetic fields have been performed by solving integral or differential equations. Integral methods are ideally suited to open-boundary problems; on the other hand, the geometric complexity of electrical machines makes differential methods more attractive. In this paper both integral and differential equation methods are reviewed, and the limitations of each are highlighted, in an attempt to show how to select the best method for a particular problem. (author)

  13. Tangent modulus in numerical integration of constitutive relations and its influence on convergence of N-R method

    Directory of Open Access Journals (Sweden)

    Poruba Z.

    2009-06-01

    Full Text Available For the numerical solution of elasto-plastic problems using the Newton-Raphson method on the global equilibrium equation, it is necessary to determine the tangent modulus at each integration point. To achieve the quadratic convergence of the Newton-Raphson method it is convenient to use the so-called algorithmic tangent modulus, which is consistent with the integration scheme used. For simpler models, for example the Chaboche combined hardening model, it can be determined analytically. For more complex macroscopic models it is in many cases necessary to use an approximation approach. This possibility is presented in this contribution for the radial return method applied to the Chaboche model. An example solved in the software Ansys corresponds to a line-contact problem under the assumption of Coulomb friction. The study shows that the number of iterations of the N-R method is higher with the continuum tangent modulus and many times higher with the modified N-R (initial stiffness) method.
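    The convergence gap described above can be illustrated with a minimal one-dimensional sketch: a radial-return stress update with exponential hardening, driven to a prescribed stress by a global Newton loop using either a finite-difference approximation of the algorithmic tangent or the initial (elastic) stiffness. All material constants are illustrative assumptions, not values from the paper.

```python
import numpy as np

# 1D elasto-plasticity with exponential hardening; all constants are
# illustrative assumptions, not values from the paper.
E, s0, sinf, d = 200.0e3, 200.0, 300.0, 50.0      # MPa, MPa, MPa, -

def sy(a):
    """Yield stress as a function of accumulated plastic strain a."""
    return s0 + (sinf - s0) * (1.0 - np.exp(-d * a))

def stress(eps, ep_n=0.0, a_n=0.0):
    """Radial-return (backward Euler) stress update for total strain eps."""
    s_tr = E * (eps - ep_n)                        # elastic trial stress
    if abs(s_tr) <= sy(a_n):
        return s_tr
    dg = 0.0                                       # local Newton for plastic multiplier
    for _ in range(50):
        r = abs(s_tr) - E * dg - sy(a_n + dg)
        if abs(r) < 1e-12:
            break
        dg -= r / (-E - (sinf - s0) * d * np.exp(-d * (a_n + dg)))
    return (abs(s_tr) - E * dg) * np.sign(s_tr)

def newton_iterations(F, tangent, tol=1e-6, itmax=50000):
    """Global Newton on sigma(eps) = F; returns the number of iterations."""
    eps, it = F / E, 0
    while abs(stress(eps) - F) > tol and it < itmax:
        eps += (F - stress(eps)) / tangent(eps)
        it += 1
    return it

# algorithmic tangent, approximated by finite differences of the discrete update
algorithmic = lambda eps, h=1e-7: (stress(eps + h) - stress(eps)) / h
initial_stiffness = lambda eps: E                  # modified N-R: elastic modulus only

it_algo = newton_iterations(290.0, algorithmic)
it_init = newton_iterations(290.0, initial_stiffness)
print(it_algo, it_init)   # consistent tangent: a handful of iterations; initial stiffness: thousands
```

    The finite-difference tangent of the discrete update plays the role of the "approximation approach" mentioned in the abstract; it restores fast Newton convergence, while the initial-stiffness variant converges only linearly.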

  14. SynechoNET: integrated protein-protein interaction database of a model cyanobacterium Synechocystis sp. PCC 6803

    OpenAIRE

    Kim, Woo-Yeon; Kang, Sungsoo; Kim, Byoung-Chul; Oh, Jeehyun; Cho, Seongwoong; Bhak, Jong; Choi, Jong-Soon

    2008-01-01

    Background Cyanobacteria are model organisms for studying photosynthesis, carbon and nitrogen assimilation, evolution of plant plastids, and adaptability to environmental stresses. Despite many studies on cyanobacteria, there is no web-based database of their regulatory and signaling protein-protein interaction networks to date. Description We report a database and website SynechoNET that provides predicted protein-protein interactions. SynechoNET shows cyanobacterial domain-domain interactio...

  15. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to Numerical Analysis for the students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of Universities of Andhra Pradesh and also the syllabus prescribed in most of the Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations Interpolation of Functions Numerical Differentiation and Integration and Numerical Solution of Ordinary Differential Equations The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This would help in the better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...

  16. Written records of historical tsunamis in the northeastern South China Sea – challenges associated with developing a new integrated database

    Directory of Open Access Journals (Sweden)

    A. Y. A. Lau

    2010-09-01

    Full Text Available Comprehensive analysis of 15 previously published regional databases incorporating more than 100 sources leads to a newly revised historical tsunami database for the northeastern (NE) region of the South China Sea (SCS), including Taiwan. The validity of each reported historical tsunami event listed in our database is assessed by comparing and contrasting the information and descriptions provided in the other databases. All earlier databases suffer from errors associated with inaccuracies in translation between different languages, calendars and location names. The new database contains 205 records of "events" reported to have occurred between AD 1076 and 2009. We identify and investigate 58 recorded tsunami events in the region. The validity of each event is based on the consistency and accuracy of the reports along with the relative number of individual records for that event. Of the 58 events, 23 are regarded as "valid" (confirmed) events, three are "probable" events and six are "possible". Eighteen events are considered "doubtful" and eight events "invalid". The most destructive tsunami of the 23 valid events occurred in 1867 and affected Keelung, northern Taiwan, killing at least 100 people. Inaccuracies in the historical record aside, this new database highlights the occurrence and geographical extent of several large tsunamis in the NE SCS region and allows an elementary statistical analysis of annual recurrence intervals. Based on historical records from 1951–2009, the probability of a tsunami (from any source) affecting the region in any given year is relatively high (33.4%). However, the likelihood of a tsunami that has a wave height >1 m, and/or causes fatalities and damage to infrastructure, occurring in the region in any given year is low (1–2%). This work indicates the need for further research using coastal stratigraphy and inundation modeling to help validate some of the historical accounts of tsunamis as well as adequately evaluate
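    Annual probabilities of this kind follow from a homogeneous Poisson assumption on the event record. The sketch below uses a hypothetical count of 24 events over the 59-year window, chosen only because it reproduces the ~33.4% figure quoted in the abstract; it is not a number taken from the database itself.

```python
import math

def annual_probability(n_events: int, n_years: float, k: int = 0) -> float:
    """P(more than k events in a given year), assuming a homogeneous Poisson
    process with rate estimated as n_events / n_years."""
    lam = n_events / n_years
    # P(N <= k) = sum_{i=0..k} exp(-lam) * lam^i / i!
    p_at_most_k = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))
    return 1.0 - p_at_most_k

# Hypothetical count: 24 events over the 59-year window 1951-2009
p_any = annual_probability(24, 59)
print(round(p_any, 3))  # → 0.334
```

    The same function with a much lower rate (e.g. one or two damaging events per half-century) lands in the 1–2% range quoted for tsunamis causing fatalities or damage.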

  17. Bending of Euler-Bernoulli nanobeams based on the strain-driven and stress-driven nonlocal integral models: a numerical approach

    Science.gov (United States)

    Oskouie, M. Faraji; Ansari, R.; Rouhi, H.

    2018-04-01

    Eringen's nonlocal elasticity theory is extensively employed for the analysis of nanostructures because it is able to capture nanoscale effects. Previous studies have revealed that using the differential form of the strain-driven version of this theory leads to paradoxical results in some cases, such as bending analysis of cantilevers, and recourse must be made to the integral version. In this article, a novel numerical approach is developed for the bending analysis of Euler-Bernoulli nanobeams in the context of strain- and stress-driven integral nonlocal models. This numerical approach is proposed for the direct solution to bypass the difficulties related to converting the integral governing equation into a differential equation. First, the governing equation is derived based on both strain-driven and stress-driven nonlocal models by means of the minimum total potential energy. Also, in each case, the governing equation is obtained in both strong and weak forms. To numerically solve the derived equations, matrix differential and integral operators are constructed based upon the finite difference technique and the trapezoidal integration rule. It is shown that the proposed numerical approach can be efficiently applied to the strain-driven nonlocal model with the aim of resolving the mentioned paradoxes. Also, it is able to solve the problem based on the strain-driven model without the inconsistencies of this model that are reported in the literature.
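    A minimal sketch of the kind of matrix operators the authors describe: a cumulative trapezoidal integration matrix and a second-order finite-difference differentiation matrix, here verified on sin(πx). The grid and test function are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

def trapezoid_operator(x):
    """Matrix T with (T @ f)[i] ≈ integral of f from x[0] to x[i] (trapezoidal rule)."""
    n = len(x)
    T = np.zeros((n, n))
    for i in range(1, n):
        h = x[i] - x[i - 1]
        T[i] = T[i - 1]                 # carry the integral accumulated so far
        T[i, i - 1] += h / 2.0
        T[i, i] += h / 2.0
    return T

def derivative_operator(x):
    """Second-order finite-difference first-derivative matrix on a uniform grid."""
    n, h = len(x), x[1] - x[0]
    D = np.zeros((n, n))
    for i in range(1, n - 1):           # central differences in the interior
        D[i, i - 1], D[i, i + 1] = -1 / (2 * h), 1 / (2 * h)
    D[0, :3] = np.array([-3.0, 4.0, -1.0]) / (2 * h)    # one-sided at the ends
    D[-1, -3:] = np.array([1.0, -4.0, 3.0]) / (2 * h)
    return D

x = np.linspace(0.0, 1.0, 201)
f = np.sin(np.pi * x)
T, D = trapezoid_operator(x), derivative_operator(x)
integral, deriv = T @ f, D @ f
print(round(integral[-1], 4))           # ∫_0^1 sin(pi x) dx = 2/pi → 0.6366
```

    Once the governing integral equation is discretized with such operators, the nonlocal problem reduces to a linear algebraic system, which is the essence of the direct solution strategy described in the abstract.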

  18. An integrated database on ticks and tick-borne zoonoses in the tropics and subtropics with special reference to developing and emerging countries.

    Science.gov (United States)

    Vesco, Umberto; Knap, Nataša; Labruna, Marcelo B; Avšič-Županc, Tatjana; Estrada-Peña, Agustín; Guglielmone, Alberto A; Bechara, Gervasio H; Gueye, Arona; Lakos, Andras; Grindatto, Anna; Conte, Valeria; De Meneghi, Daniele

    2011-05-01

    Tick-borne zoonoses (TBZ) are emerging diseases worldwide. A large amount of information (e.g. case reports, results of epidemiological surveillance, etc.) is dispersed through various reference sources (ISI and non-ISI journals, conference proceedings, technical reports, etc.). An integrated database, derived from the ICTTD-3 project (http://www.icttd.nl), was developed in order to gather TBZ records in the (sub-)tropics, collected both by the authors and by collaborators worldwide. A dedicated website (http://www.tickbornezoonoses.org) was created to promote collaboration and circulate information. Data collected are made freely available to researchers for analysis by spatial methods, integrating mapped ecological factors for predicting TBZ risk. The authors present the assembly process of the TBZ database: the compilation of an updated list of TBZ relevant for the (sub-)tropics, the database design and its structure, the method of bibliographic search, and the assessment of spatial precision of geo-referenced records. At the time of writing, 725 records extracted from 337 publications related to 59 countries in the (sub-)tropics have been entered in the database. TBZ distribution maps were also produced. Imported cases have also been accounted for. The most important datasets with geo-referenced records were those on Spotted Fever Group rickettsiosis in Latin America and Crimean-Congo Haemorrhagic Fever in Africa. The authors stress the need for international collaboration in data collection to update and improve the database. Supervision of data entered remains always necessary. Means to foster collaboration are discussed. The paper is also intended to describe the challenges encountered to assemble spatial data from various sources and to help develop similar data collections.

  19. The Numerical Nuclear Reactor for High-Fidelity Integrated Simulation of Neutronic, Thermal-Hydraulic, and Thermo-Mechanical Phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K. S.; Ju, H. G.; Jeon, T. H. and others

    2005-03-15

    A comprehensive high fidelity reactor core modeling capability has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. High fidelity was accomplished by integrating highly refined solution modules for the coupled neutronic, thermal-hydraulic, and thermo-mechanical phenomena. Each solution module employs methods and models that are formulated faithfully to the first principles governing the physics, real geometry, and constituents. Specifically, the critical analysis elements that are incorporated in the coupled code capability are whole-core neutron transport solution, ultra-fine-mesh computational fluid dynamics/heat transfer solution, and finite-element-based thermo-mechanics solution, all obtained with explicit (fuel pin cell level) heterogeneous representations of the components of the core. The vast computational problem resulting from such highly refined modeling is solved on massively parallel computers, and serves as the 'numerical nuclear reactor'. Relaxation of modeling parameters was also pursued to make problems run on clusters of workstations and PCs for smaller-scale applications as well.

  20. The Numerical Nuclear Reactor for High-Fidelity Integrated Simulation of Neutronic, Thermal-Hydraulic, and Thermo-Mechanical Phenomena

    International Nuclear Information System (INIS)

    Kim, K. S.; Ju, H. G.; Jeon, T. H. and others

    2005-03-01

    A comprehensive high fidelity reactor core modeling capability has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. High fidelity was accomplished by integrating highly refined solution modules for the coupled neutronic, thermal-hydraulic, and thermo-mechanical phenomena. Each solution module employs methods and models that are formulated faithfully to the first principles governing the physics, real geometry, and constituents. Specifically, the critical analysis elements that are incorporated in the coupled code capability are whole-core neutron transport solution, ultra-fine-mesh computational fluid dynamics/heat transfer solution, and finite-element-based thermo-mechanics solution, all obtained with explicit (fuel pin cell level) heterogeneous representations of the components of the core. The vast computational problem resulting from such highly refined modeling is solved on massively parallel computers, and serves as the 'numerical nuclear reactor'. Relaxation of modeling parameters was also pursued to make problems run on clusters of workstations and PCs for smaller-scale applications as well.

  1. An integrated DEA PCA numerical taxonomy approach for energy efficiency assessment and consumption optimization in energy intensive manufacturing sectors

    International Nuclear Information System (INIS)

    Azadeh, A.; Amalnick, M.S.; Ghaderi, S.F.; Asadzadeh, S.M.

    2007-01-01

    This paper introduces an integrated approach based on data envelopment analysis (DEA), principal component analysis (PCA) and numerical taxonomy (NT) for total energy efficiency assessment and optimization in energy-intensive manufacturing sectors. The proposed approach considers structural indicators in addition to conventional consumption and manufacturing-sector output indicators. The validity of the DEA model is verified and validated by PCA and NT through a Spearman correlation experiment. Moreover, the proposed approach uses the measure-specific super-efficiency DEA model for sensitivity analysis to determine the critical energy carriers. Four energy-intensive manufacturing sectors are discussed in this paper: iron and steel, pulp and paper, petroleum refining and cement manufacturing. To show its superiority and applicability, the proposed approach has been applied to refinery sub-sectors of some OECD (Organization for Economic Cooperation and Development) countries. This study has several unique features: (1) a total approach which considers structural indicators in addition to conventional energy efficiency indicators; (2) a verification and validation mechanism for DEA by PCA and NT; and (3) utilization of DEA for total energy efficiency assessment and consumption optimization of energy-intensive manufacturing sectors.
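    The Spearman-correlation validation step can be sketched as follows; the efficiency scores below are invented placeholders (the paper's actual DEA and PCA results are not reproduced here), and the simple rank formula assumes no ties.

```python
def spearman(x, y):
    """Spearman rank correlation (no ties), of the kind used to cross-check
    DEA efficiency scores against PCA/taxonomy rankings."""
    n = len(x)
    rank = lambda v: {s: i + 1 for i, s in enumerate(sorted(v))}
    rx, ry = rank(x), rank(y)
    d2 = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical efficiency scores for five refinery sub-sectors
dea = [0.92, 0.75, 0.88, 0.61, 0.79]
pca = [0.85, 0.70, 0.90, 0.55, 0.74]
print(round(spearman(dea, pca), 2))  # → 0.9
```

    A rank correlation close to 1 indicates that the two methods order the units consistently, which is the sense in which the DEA model is "verified and validated" by PCA and NT.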

  2. A conservative, thermodynamically consistent numerical approach for low Mach number combustion. Part I: Single-level integration

    Science.gov (United States)

    Nonaka, Andrew; Day, Marcus S.; Bell, John B.

    2018-01-01

    We present a numerical approach for low Mach number combustion that conserves both mass and energy while remaining on the equation of state to a desired tolerance. We present both unconfined and confined cases, where in the latter the ambient pressure changes over time. Our overall scheme is a projection method for the velocity coupled to a multi-implicit spectral deferred corrections (SDC) approach to integrate the mass and energy equations. The iterative nature of SDC methods allows us to incorporate a series of pressure discrepancy corrections naturally that lead to additional mass and energy influx/outflux in each finite volume cell in order to satisfy the equation of state. The method is second order, and satisfies the equation of state to a desired tolerance with increasing iterations. Motivated by experimental results, we test our algorithm on hydrogen flames with detailed kinetics. We examine the morphology of thermodiffusively unstable cylindrical premixed flames in high-pressure environments for confined and unconfined cases. We also demonstrate that our algorithm maintains the equation of state for premixed methane flames and non-premixed dimethyl ether jet flames.

  3. A fundamental numerical analysis for noninvasive thermometry integrated in a heating applicator based on the reentrant cavity

    International Nuclear Information System (INIS)

    Ohwada, Hiroshi; Ishihara, Yasutoshi

    2010-01-01

    To improve the efficacy of hyperthermia treatment, a novel method for the noninvasive measurement of body temperature change is proposed. The proposed thermometry is based on the changes in the electromagnetic field distribution inside the heating applicator that accompany temperature changes, via the temperature dependence of the dielectric constant. An image of the temperature change distribution inside the body is then reconstructed by applying a computed tomography (CT) algorithm. The proposed method can thus serve as a noninvasive means of monitoring the temperature change distribution inside the body without the large-scale equipment required by magnetic resonance imaging (MRI). Furthermore, this temperature monitoring method can easily be combined with a heating applicator based on a cavity resonator, and the resulting integrated treatment system could treat cancer effectively while noninvasively monitoring the heating effect. In this paper, the phase change distributions of the electromagnetic field with temperature changes are simulated by numerical analysis using the finite-difference time-domain (FDTD) method. Moreover, to estimate the phase change distributions inside a target body, the phase change distributions with temperature changes are reconstructed by filtered back-projection. In addition, the reconstruction accuracy of the temperature change distribution converted from the phase change is evaluated. (author)

  4. Scaling up health knowledge at European level requires sharing integrated data: an approach for collection of database specification

    Directory of Open Access Journals (Sweden)

    Menditto E

    2016-06-01

    Full Text Available Enrica Menditto,1 Angela Bolufer De Gea,2 Caitriona Cahir,3,4 Alessandra Marengoni,5 Salvatore Riegler,1 Giuseppe Fico,6 Elisio Costa,7 Alessandro Monaco,8 Sergio Pecorelli,5 Luca Pani,8 Alexandra Prados-Torres9 1School of Pharmacy, CIRFF/Center of Pharmacoeconomics, University of Naples Federico II, Naples, Italy; 2Directorate-General for Health and Food Safety, European Commission, Brussels, Belgium; 3Division of Population Health Sciences, Royal College of Surgeons in Ireland, 4Department of Pharmacology and Therapeutics, St James’s Hospital, Dublin, Ireland; 5Department of Clinical and Experimental Science, University of Brescia, Brescia; 6Life Supporting Technologies, Photonics Technology and Bioengineering Department, School of Telecomunications Engineering, Polytechnic University of Madrid, Madrid, Spain; 7Faculty of Pharmacy, University of Porto, Porto, Portugal; 8Italian Medicines Agency – AIFA, Rome, Italy; 9EpiChron Research Group on Chronic Diseases, Aragón Health Sciences Institute (IACS, IIS Aragón REDISSEC ISCIII, Miguel Servet University Hospital, University of Zaragoza, Zaragoza, Spain Abstract: Computerized health care databases have been widely described as an excellent opportunity for research. The availability of “big data” has brought about a wave of innovation in projects when conducting health services research. Most of the available secondary data sources are restricted to the geographical scope of a given country and present heterogeneous structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on “adherence to prescription and medical plans” identified the use of observational and large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners

  5. Source-rock maturation characteristics of symmetric and asymmetric grabens inferred from integrated analogue and numerical modeling: The southern Viking Graben (North Sea)

    NARCIS (Netherlands)

    Corver, M.P.; Doust, H.; van Wees, J.D.A.M.; Cloetingh, S.A.P.L.

    2011-01-01

    We present the results of an integrated analogue and numerical modeling study with a focus on structural, stratigraphic and thermal differences between symmetric and asymmetric grabens. These models enable fault interpretation and subsidence analyses in studies of active rifting and graben

  6. Simple numerical evaluation of modified Bessel functions K_ν(x) of fractional order and the integral ∫_x^∞ K_ν(η) dη

    International Nuclear Information System (INIS)

    Kostroun, V.O.

    1980-01-01

    Theoretical expressions for the angular and spectral distributions of synchrotron radiation involve modified Bessel functions of fractional order and the integral ∫_x^∞ K_ν(η) dη. A simple series expression for these quantities, which can be evaluated numerically with hand-held programmable calculators, is presented. (orig.)
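    In the same spirit of simple numerical evaluation (though not the paper's series expansion), K_ν(x) can be computed directly from the standard integral representation K_ν(x) = ∫_0^∞ exp(-x cosh t) cosh(νt) dt with a plain trapezoidal rule; the truncation point and step count below are ad hoc choices adequate for x near 1. The result is checked against the closed form K_{1/2}(x) = sqrt(π/(2x)) exp(-x).

```python
import math

def bessel_k(nu: float, x: float, t_max: float = 20.0, n: int = 4000) -> float:
    """K_nu(x) = ∫_0^∞ exp(-x cosh t) cosh(nu t) dt, by the trapezoidal rule.
    The integrand decays like exp(-x e^t / 2), so t_max = 20 suffices for x ≳ 0.1."""
    h = t_max / n
    ends = math.exp(-x) + math.exp(-x * math.cosh(t_max)) * math.cosh(nu * t_max)
    total = 0.5 * ends
    for i in range(1, n):
        t = i * h
        total += math.exp(-x * math.cosh(t)) * math.cosh(nu * t)
    return total * h

# check against the closed form K_{1/2}(x) = sqrt(pi/(2x)) * exp(-x)
exact = math.sqrt(math.pi / 2.0) * math.exp(-1.0)
print(round(bessel_k(0.5, 1.0), 4), round(exact, 4))  # both → 0.4611
```

    The synchrotron integral ∫_x^∞ K_ν(η) dη can then be evaluated by applying the same quadrature once more over η.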

  7. A Lie-admissible method of integration of Fokker-Planck equations with non-linear coefficients (exact and numerical solutions)

    International Nuclear Information System (INIS)

    Fronteau, J.; Combis, P.

    1984-08-01

    A Lagrangian method is introduced for the integration of non-linear Fokker-Planck equations. Examples of exact solutions obtained in this way are given, and also the explicit scheme used for the computation of numerical solutions. The method is, in addition, shown to be of a Lie-admissible type

  8. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on the Internet. 

  9. The impact of watershed management on coastal morphology: A case study using an integrated approach and numerical modeling

    Science.gov (United States)

    Samaras, Achilleas G.; Koutitas, Christopher G.

    2014-04-01

    Coastal morphology evolves as the combined result of both natural- and human-induced factors that cover a wide range of spatial and temporal scales of effect. Areas in the vicinity of natural stream mouths are of special interest, as the direct connection with the upstream watershed extends the search for drivers of morphological evolution from the coastal area to the inland as well. Although the impact of changes in watersheds on the coastal sediment budget is well established, references that concurrently study the two fields and quantify their connection are scarce. In the present work, the impact of land-use changes in a watershed on coastal erosion is studied for a selected site in northern Greece. Applications are based on an integrated approach to quantify the impact of watershed management on coastal morphology through numerical modeling. The watershed model SWAT and a shoreline evolution model developed by the authors (PELNCON-M) are used, the latter evaluating the performance of the three longshore sediment transport rate formulae included in the model formulation. Results document the impact of crop abandonment on coastal erosion (a decrease in agricultural land from 23.3% to 5.1% is accompanied by a retreat of ~35 m in the vicinity of the stream mouth) and show the effect of sediment transport formula selection on the evolution of coastal morphology. The analysis indicates the relative importance of the parameters involved in the dynamics of watershed-coast systems and, through the detailed description of a case study, is deemed to provide useful insights for researchers and policy-makers involved in their study.
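    One of the longshore transport relations commonly used in shoreline-evolution models of this kind is the CERC formula. The sketch below uses its textbook coefficients (K ≈ 0.39, breaker index 0.78, quartz sand, 40% porosity), which are generic defaults rather than the values calibrated in PELNCON-M.

```python
import math

def cerc_transport(Hb: float, alpha_b_deg: float,
                   K: float = 0.39, gamma_b: float = 0.78,
                   rho: float = 1025.0, rho_s: float = 2650.0, p: float = 0.4) -> float:
    """CERC longshore sediment transport rate [m^3/s].
    Hb: breaking wave height [m]; alpha_b_deg: wave angle at breaking [deg]."""
    g = 9.81
    coeff = K * rho * math.sqrt(g / gamma_b) / (16.0 * (rho_s - rho) * (1.0 - p))
    return coeff * Hb ** 2.5 * math.sin(2.0 * math.radians(alpha_b_deg))

q = cerc_transport(Hb=1.0, alpha_b_deg=10.0)
print(round(q, 4))  # → 0.0311 m^3/s; transport peaks at a 45° breaking angle
```

    Because the rate scales with Hb^(5/2) and sin(2α), comparing formulae of this family against energetics- or process-based alternatives (as the paper does) can change the predicted erosion pattern appreciably.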

  10. Event driven software package for the database of Integrated Coastal and Marine Area Management (ICMAM) (Developed in 'C')

    Digital Repository Service at National Institute of Oceanography (India)

    Sadhuram, Y.; Murty, T.V.R.; Chandramouli, P.; Murthy, K.S.R.

    National Institute of Oceanography (NIO, RC, Visakhapatnam, India) had taken up the Integrated Coastal and Marine Area Management (ICMAM) project funded by Department of Ocean Development (DOD), New Delhi, India. The main objective of this project...

  11. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

    Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary-, partial- and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics on '/homepage/sac/cam/na2000/index.html'.

  12. A framework for organizing cancer-related variations from existing databases, publications and NGS data using a High-performance Integrated Virtual Environment (HIVE).

    Science.gov (United States)

    Wu, Tsung-Jung; Shamsaddini, Amirhossein; Pan, Yang; Smith, Krista; Crichton, Daniel J; Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    Years of sequence feature curation by UniProtKB/Swiss-Prot, PIR-PSD, NCBI-CDD, RefSeq and other database biocurators has led to a rich repository of information on functional sites of genes and proteins. This information along with variation-related annotation can be used to scan human short sequence reads from next-generation sequencing (NGS) pipelines for presence of non-synonymous single-nucleotide variations (nsSNVs) that affect functional sites. This and similar workflows are becoming more important because thousands of NGS data sets are being made available through projects such as The Cancer Genome Atlas (TCGA), and researchers want to evaluate their biomarkers in genomic data. BioMuta, an integrated sequence feature database, provides a framework for automated and manual curation and integration of cancer-related sequence features so that they can be used in NGS analysis pipelines. Sequence feature information in BioMuta is collected from the Catalogue of Somatic Mutations in Cancer (COSMIC), ClinVar, UniProtKB and through biocuration of information available from publications. Additionally, nsSNVs identified through automated analysis of NGS data from TCGA are also included in the database. Because of the petabytes of data and information present in NGS primary repositories, a platform HIVE (High-performance Integrated Virtual Environment) for storing, analyzing, computing and curating NGS data and associated metadata has been developed. Using HIVE, 31 979 nsSNVs were identified in TCGA-derived NGS data from breast cancer patients. All variations identified through this process are stored in a Curated Short Read archive, and the nsSNVs from the tumor samples are included in BioMuta. Currently, BioMuta has 26 cancer types with 13 896 small-scale and 308 986 large-scale study-derived variations. Integration of variation data allows identifications of novel or common nsSNVs that can be prioritized in validation studies. Database URL: BioMuta: http

  13. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 4 : web-based bridge information database--visualization analytics and distributed sensing.

    Science.gov (United States)

    2012-03-01

    This report introduces the design and implementation of a Web-based bridge information visual analytics system. This project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result combines a GIS ...

  14. TcruziDB, an Integrated Database, and the WWW Information Server for the Trypanosoma cruzi Genome Project

    Directory of Open Access Journals (Sweden)

    Degrave Wim

    1997-01-01

    Full Text Available Data analysis, presentation and distribution are of utmost importance to a genome project. The public-domain software ACeDB has been chosen as the common basis for parasite genome databases, and a first release of TcruziDB, the Trypanosoma cruzi genome database, is available by ftp from ftp://iris.dbbm.fiocruz.br/pub/genomedb/TcruziDB, as are versions of the software for different operating systems (ftp://iris.dbbm.fiocruz.br/pub/unixsoft/). Moreover, data originating from the project are available from the WWW server at http://www.dbbm.fiocruz.br. It contains biological and parasitological data on CL Brener, its karyotype, all available T. cruzi sequences from GenBank, data on the EST-sequencing project and on available libraries, a T. cruzi codon table and a listing of activities and participating groups in the genome project, as well as meeting reports. T. cruzi discussion lists (tcruzi-l@iris.dbbm.fiocruz.br and tcgenics@iris.dbbm.fiocruz.br) are being maintained for communication and to promote collaboration in the genome project.

  15. Experimental and numerical study of heat transfer phenomena, inside a flat-plate integrated collector storage solar water heater (ICSSWH), with indirect heat withdrawal

    International Nuclear Information System (INIS)

    Gertzos, K.P.; Pnevmatikakis, S.E.; Caouris, Y.G.

    2008-01-01

    The thermal behavior of a particular flat-plate integrated collector storage solar water heater (ICSSWH) is examined experimentally and numerically. Its particular feature is the indirect heating of the service hot water through a heat exchanger incorporated into the front and back major surfaces of the ICSSWH. Both natural and forced convection mechanisms are examined. A prototype tank was fabricated, and experimental temperature profiles were extracted during various energy withdrawals. A 3D computational fluid dynamics (CFD) model was developed and validated against the experimental results. The numerical predictions are found to be highly accurate, thus supporting the use of the 3D CFD model for the optimization of this and similar devices.

  16. The human interactome knowledge base (hint-kb): An integrative human protein interaction database enriched with predicted protein–protein interaction scores using a novel hybrid technique

    KAUST Repository

    Theofilatos, Konstantinos A.

    2013-07-12

    Proteins are the functional components of many cellular processes and the identification of their physical protein–protein interactions (PPIs) is an area of mature academic research. Various databases have been developed containing information about experimentally and computationally detected human PPIs as well as their corresponding annotation data. However, these databases contain many false positive interactions, are partial and only a few of them incorporate data from various sources. To overcome these limitations, we have developed HINT-KB (http://biotools.ceid.upatras.gr/hint-kb/), a knowledge base that integrates data from various sources, provides a user-friendly interface for their retrieval, calculates a set of features of interest and computes a confidence score for every candidate protein interaction. This confidence score is essential for filtering the false positive interactions which are present in existing databases, predicting new protein interactions and measuring the frequency of each true protein interaction. For this reason, a novel hybrid machine learning methodology, called EvoKalMaModel (Evolutionary Kalman Mathematical Modelling), was used to achieve an accurate and interpretable scoring methodology. The experimental results indicated that the proposed scoring scheme outperforms existing computational methods for the prediction of PPIs.
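    As a toy illustration of evidence combination for PPI confidence, the sketch below uses a simple noisy-OR over independent sources; the actual EvoKalMaModel scorer is a hybrid evolutionary/Kalman method and is not reproduced here, and the scores are invented.

```python
def combined_confidence(scores):
    """Noisy-OR combination: probability that an interaction is real, given
    independent per-source evidence scores in [0, 1]."""
    p_none = 1.0
    for s in scores:
        p_none *= 1.0 - s       # probability that no source is correct
    return 1.0 - p_none

# e.g. evidence from two source databases and one prediction method
print(round(combined_confidence([0.6, 0.3, 0.5]), 2))  # → 0.86
```

    A combined score of this kind rises with each concordant source, which is why integrating several databases helps filter the false positives present in any single one.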

  17. Integrated numerical modeling of a landslide early warning system in a context of adaptation to future climatic pressures

    Science.gov (United States)

    Khabarov, Nikolay; Huggel, Christian; Obersteiner, Michael; Ramírez, Juan Manuel

    2010-05-01

    Mountain regions are typically characterized by rugged terrain which is susceptible to different types of landslides during high-intensity precipitation. Landslides account for billions of dollars of damage and many casualties, and are expected to increase in frequency in the future due to a projected increase of precipitation intensity. Early warning systems (EWS) are thought to be a primary tool for related disaster risk reduction and climate change adaptation to extreme climatic events and hydro-meteorological hazards, including landslides. An EWS for hazards such as landslides consists of different components, including environmental monitoring instruments (e.g. rainfall or flow sensors), physical or empirical process models to support decision-making (warnings, evacuation), data and voice communication, organization and logistics-related procedures, and population response. Considering this broad range, EWS are highly complex systems, and it is therefore difficult to understand the effect of the different components and changing conditions on the overall performance, ultimately expressed as human lives saved or structural damage reduced. In this contribution we present a further development of our approach to assess a landslide EWS in an integral way, both at the system and component level. We utilize a numerical model using 6-hour rainfall data as basic input. A threshold function based on a rainfall-intensity/duration relation was applied as a decision criterion for evacuation. Damage to infrastructure and human lives was defined as a linear function of landslide magnitude, with the magnitude modelled using a power function of landslide frequency. Correct evacuation was assessed with a 'true' reference rainfall dataset versus a dataset of artificially reduced quality imitating the observation system component. Performance of the EWS using these rainfall datasets was expressed in monetary terms (i.e. damage related to false and correct evacuation). We
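The decision logic sketched in the abstract, an intensity/duration threshold triggering evacuation with monetary costs attached to false and missed alarms, might look like the following. This is a hypothetical illustration: the threshold coefficients are the classic Caine-type global values, not the study's, and the cost figures are invented.

```python
# Hedged sketch of a landslide EWS decision rule: evacuate when the mean
# rainfall intensity over any accumulation window exceeds a power-law
# intensity-duration threshold I = a * D**b (a, b are illustrative
# Caine-type global values in mm/h and hours, not the study's).

def evacuation_warning(rainfall_mm, hours_per_step=6, a=14.82, b=-0.39):
    """True if any accumulation window crosses the intensity-duration threshold."""
    n = len(rainfall_mm)
    for start in range(n):
        total = 0.0
        for end in range(start, n):
            total += rainfall_mm[end]
            duration_h = (end - start + 1) * hours_per_step
            intensity = total / duration_h            # mm/h over the window
            if intensity >= a * duration_h ** b:
                return True
    return False

def event_cost(warned, landslide, c_false=1.0, c_damage=20.0):
    """Monetary outcome of one event (cost units are invented)."""
    if warned and not landslide:
        return c_false        # unnecessary evacuation
    if not warned and landslide:
        return c_damage       # missed warning: unmitigated damage
    return 0.0

print(evacuation_warning([2.0] * 8))           # light steady rain: False
print(evacuation_warning([40.0, 55.0, 60.0]))  # intense burst: True
```

Degrading the rainfall input (e.g. adding noise to `rainfall_mm`) and re-running the decision rule is then one way to express observation quality in terms of false/missed-alarm costs.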

  18. Study on safety of a nuclear ship having an integral marine water reactor. Intelligent information database program concerned with thermal-hydraulic characteristics

    International Nuclear Information System (INIS)

    Inasaka, Fujio; Nariai, Hideki; Kobayashi, Michiyuki; Murata, Hiroyuki; Aya, Izuo

    2001-01-01

    As a highly economical marine reactor with sufficient safety functions, an integrated-type marine water reactor has been considered most promising. At the National Maritime Research Institute, a series of experimental studies on the thermal-hydraulic characteristics of an integrated/passive-safety type marine water reactor, such as the flow boiling of a helical-coil type steam generator, natural circulation of primary water under a ship rolling motion, and flashing-condensation oscillation phenomena in pool water, has been conducted. The current study aims to support the safety analysis and evaluation of a future marine water reactor by developing an intelligent information database program concerned with the thermal-hydraulic characteristics of an integral/passive-safety reactor on the basis of the above-mentioned experimental knowledge. Since the program was created as a Windows application using Visual Basic, it is available to the public and can be easily installed in the operating system. The main functions of the program are as follows: (1) steady-state flow boiling analysis and determination of the stability limit for any helical-coil type once-through steam generator design, (2) analysis and comparison with the flow boiling data, (3) reference and graphic display of the experimental data, and (4) indication of knowledge information such as the analysis methods and results of the study. The program will be useful for the design of not only the future integrated-type marine water reactor but also small-sized water reactors. (author)

  19. Numerical Evaluation of the "Dual-Kernel Counter-flow" Matric Convolution Integral that Arises in Discrete/Continuous (D/C) Control Theory

    Science.gov (United States)

    Nixon, Douglas D.

    2009-01-01

    Discrete/Continuous (D/C) control theory is a new generalized theory of discrete-time control that expands the concept of conventional (exact) discrete-time control to create a framework for design and implementation of discrete-time control systems that include a continuous-time command function generator, so that actuator commands need not be constant between control decisions but can be more generally defined and implemented as functions that vary with time across the sample period. Because the plant/control system construct contains two linear subsystems arranged in tandem, a novel dual-kernel counter-flow convolution integral appears in the formulation. As part of the D/C system design and implementation process, numerical evaluation of that integral over the sample period is required. Three fundamentally different evaluation methods and associated algorithms are derived for the constant-coefficient case. Numerical results are matched against three available examples that have closed-form solutions.
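A counter-flow convolution of this general shape, where one kernel is evaluated forward in the integration variable and the other backward across the sample period, can be evaluated numerically as sketched below. The scalar exponential kernels are invented stand-ins for the matrix kernels of the D/C formulation.

```python
import math

# Hedged sketch: composite trapezoidal evaluation of a counter-flow
# convolution  I(T) = integral_0^T h1(T - tau) * h2(tau) dtau
# over one sample period T. Scalar kernels only, for illustration.

def counterflow_convolution(h1, h2, T, n=1000):
    """Trapezoidal rule: one kernel runs backward (T - tau), one forward (tau)."""
    dt = T / n
    total = 0.5 * (h1(T) * h2(0.0) + h1(0.0) * h2(T))
    for k in range(1, n):
        tau = k * dt
        total += h1(T - tau) * h2(tau)
    return total * dt

# Check against a closed form: with h1(t) = e^-t and h2(t) = e^-2t,
# I(T) = e^-T * (1 - e^-T).
T = 0.5
numeric = counterflow_convolution(lambda t: math.exp(-t),
                                  lambda t: math.exp(-2.0 * t), T)
exact = math.exp(-T) * (1.0 - math.exp(-T))
print(abs(numeric - exact) < 1e-6)  # -> True
```

Matching a numerical scheme against kernels with closed-form convolutions, as here, mirrors the validation strategy the abstract describes.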

  20. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  1. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    KALIMER Database is an advanced database for integrated management of Liquid Metal Reactor design technology development using Web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds the research results of Phase II of the Liquid Metal Reactor Design Technology Development mid- and long-term nuclear R and D program. IOC is a linkage control system among sub-projects to share and integrate the research results for KALIMER. The 3D CAD Database is a schematic design overview for KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, the KALIMER Reserved Documents function was developed to manage collected data and documents after project completion. This report describes the hardware and software features and the database design methodology for KALIMER

  2. Introduction to numerical analysis

    CERN Document Server

    Hildebrand, F B

    1987-01-01

    Well-known, respected introduction, updated to integrate concepts and procedures associated with computers. Computation, approximation, interpolation, numerical differentiation and integration, smoothing of data, other topics in lucid presentation. Includes 150 additional problems in this edition. Bibliography.

  3. The economic impact of GERD and PUD: examination of direct and indirect costs using a large integrated employer claims database.

    Science.gov (United States)

    Joish, Vijay N; Donaldson, Gary; Stockdale, William; Oderda, Gary M; Crawley, Joseph; Sasane, Rahul; Joshua-Gotlib, Sandra; Brixner, Diana I

    2005-04-01

    The objective of this study was to examine the relationship of work loss associated with gastroesophageal reflux disease (GERD) and peptic ulcer disease (PUD) in a large population of employed individuals in the United States (US) and to quantify the economic impact of these diseases to the employer. A proprietary database that contained workplace absence, disability and workers' compensation data in addition to prescription drug and medical claims was used to answer the objectives. Employees with a medical claim with an ICD-9 code for GERD or PUD were identified from 1 January 1997 to 31 December 2000. A cohort of controls was identified for the same time period using the method of frequency matching on age, gender, industry type, occupational status, and employment status. Work absence rates and health care costs were compared between the groups after adjusting for demographic and employment differences using analysis of covariance models. Significantly higher rates of adjusted all-cause and sickness-related absenteeism were observed in the disease groups versus the controls. In particular, controls had an average of 1.2 to 1.6 fewer all-cause absence days and 0.4 to 0.6 fewer sickness-related absence days compared to the disease groups. The incremental economic impact projected to a hypothetical employed population was estimated to be $3441 for GERD, $1374 for PUD, and $4803 for GERD + PUD per employee per year compared to employees without these diseases. Direct medical cost and work absence in employees with GERD, PUD and GERD + PUD represent a significant burden to employees and employers.

  4. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  5. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases such...

  6. An XCT image database system

    International Nuclear Information System (INIS)

    Komori, Masaru; Minato, Kotaro; Koide, Harutoshi; Hirakawa, Akina; Nakano, Yoshihisa; Itoh, Harumi; Torizuka, Kanji; Yamasaki, Tetsuo; Kuwahara, Michiyoshi.

    1984-01-01

    In this paper, an expansion of an X-ray CT (XCT) examination history database into an XCT image database is discussed. The XCT examination history database has been constructed and used for daily examination and investigation in our hospital. This database consists of alphanumeric information (locations, diagnoses and so on) for more than 15,000 cases; for some of them, we add tree-structured image data, which has the flexibility to accommodate various types of image data. This database system is written in the MUMPS database manipulation language. (author)

  7. Numerical Asymptotic Solutions Of Differential Equations

    Science.gov (United States)

    Thurston, Gaylen A.

    1992-01-01

    Numerical algorithms are derived and compared with classical analytical methods. In the method, asymptotic expansions are replaced with integrals that are evaluated numerically. The resulting numerical solutions retain linear independence, the main advantage of asymptotic solutions.

  8. Numerical evaluation of general n-dimensional integrals by the repeated use of Newton-Cotes formulas

    International Nuclear Information System (INIS)

    Nihira, Takeshi; Iwata, Tadao.

    1992-07-01

    The composite Simpson's rule is extended to n-dimensional integrals with variable limits. This extension is illustrated by means of the recursion relation of n-fold series. The structure of calculation by the Newton-Cotes formulas for n-dimensional integrals is clarified with this method. A quadrature formula corresponding to the Newton-Cotes formulas can be readily constructed. The results computed for some examples are given, and the error estimates for two- or three-dimensional integrals are described using the error term. (author)
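The recursive idea, reducing an n-dimensional integral with variable limits to repeated one-dimensional quadrature, can be sketched as follows. This is a minimal illustration using composite Simpson's rule, not the authors' code; the integrand and limit callables are invented.

```python
# Hedged sketch of repeated one-dimensional Newton-Cotes quadrature:
# each outer Simpson node fixes the coordinates that the inner,
# variable limits may depend on.

def simpson(f, a, b, n=10):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3.0

def nd_simpson(f, limits, xs=(), n=10):
    """Integrate f(x1,...,xn); limits[i] maps the outer coords to (lo, hi)."""
    if not limits:
        return f(*xs)
    lo, hi = limits[0](*xs)
    return simpson(lambda x: nd_simpson(f, limits[1:], xs + (x,), n),
                   lo, hi, n)

# Example with a variable inner limit: the area of the triangle
# 0 <= y <= x <= 1 equals 1/2 (Simpson is exact for polynomials here).
area = nd_simpson(lambda x, y: 1.0,
                  [lambda: (0.0, 1.0), lambda x: (0.0, x)])
print(round(area, 6))  # -> 0.5
```

The cost grows as n_points**dim, which is why the abstract's error-term estimates matter most for the two- and three-dimensional cases.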

  9. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  10. Overlap of proteomics biomarkers between women with pre-eclampsia and PCOS: a systematic review and biomarker database integration.

    Science.gov (United States)

    Khan, Gulafshana Hafeez; Galazis, Nicolas; Docheva, Nikolina; Layfield, Robert; Atiomo, William

    2015-01-01

    Do any proteomic biomarkers previously identified for pre-eclampsia (PE) overlap with those identified in women with polycystic ovary syndrome (PCOS)? Five previously identified proteomic biomarkers were found to be common in women with PE and PCOS when compared with controls. Various studies have indicated an association between PCOS and PE; however, the pathophysiological mechanisms supporting this association are not known. A systematic review and update of our PCOS proteomic biomarker database was performed, along with a parallel review of PE biomarkers. The study included papers from 1980 to December 2013. In all the studies analysed, there were a total of 1423 patients and controls. The number of proteomic biomarkers that were catalogued for PE was 192. Five proteomic biomarkers were shown to be differentially expressed in women with PE and PCOS when compared with controls: transferrin; fibrinogen α, β and γ chain variants; kininogen-1; annexin 2; and peroxiredoxin 2. In PE, the biomarkers were identified in serum, plasma and placenta, and in PCOS, the biomarkers were identified in serum, follicular fluid, and ovarian and omental biopsies. The proteomic techniques employed have limited ability to identify proteins of low abundance, some of which may have diagnostic potential. The sample sizes and number of biomarkers identified in these studies do not exclude the risk of false positives, a limitation of all biomarker studies. The biomarkers common to PE and PCOS were identified from proteomic analyses of different tissues. This amalgamation of the proteomic studies in PE and in PCOS identified, for the first time, a panel of five biomarkers for PE which are common to women with PCOS: transferrin; fibrinogen α, β and γ chain variants; kininogen-1; annexin 2; and peroxiredoxin 2. If validated, these biomarkers could provide a useful framework for the knowledge infrastructure in this area. To accomplish this goal, a

  11. Utilization of a Clinical Trial Management System for the Whole Clinical Trial Process as an Integrated Database: System Development.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho; Kim, Tae Won

    2018-04-24

    Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; it is therefore necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. 
Since December 2014, the CTMS has been successfully implemented and used by 881 internal and

  12. TRENDS: The aeronautical post-test database management system

    Science.gov (United States)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed, and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed, and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  13. Visualization of numerically simulated aerodynamic flow fields

    International Nuclear Information System (INIS)

    Hian, Q.L.; Damodaran, M.

    1991-01-01

    The focus of this paper is to describe the development and the application of an interactive integrated software system to visualize numerically simulated aerodynamic flow fields so as to enable the practitioner of computational fluid dynamics to diagnose the numerical simulation and to elucidate essential flow physics from the simulation. The input to the software is the numerical database produced by a supercomputer and typically consists of flow variables and computational grid geometry. This flow visualization system (FVS), written in the C language, is targeted at the Personal IRIS workstations. In order to demonstrate the various visualization modules, the paper also describes the application of this software to visualize two- and three-dimensional flow fields past aerodynamic configurations which have been numerically simulated on the NEC-SXIA Supercomputer. 6 refs

  14. Performance of overlapped shield tunneling through an integrated physical model tests, numerical simulations and real-time field monitoring

    Directory of Open Access Journals (Sweden)

    Junlong Yang

    2017-03-01

    In this work, deformations and internal forces of an existing tunnel subjected to a closely overlapped shield tunneling are monitored and analyzed using a series of physical model experiments and numerical simulations. Effects of different excavation sequences and speeds are explicitly considered in the analysis. The results of the physical model experiments show that the bottom-up tunneling procedure is better than the top-down tunneling procedure. The incurred deformations and internal forces of the existing tunnel increase with the excavation speed, and the range of influence areas also increases accordingly. For construction process control, real-time monitoring of the power tunnel is used. The monitoring processes feature full automation, adjustable frequency, real-time monitoring and dynamic feedback, which are used to guide the construction to achieve micro-disturbance control. In accordance with the situation of crossing construction, a numerical study on the performance of the power tunnel is carried out. Construction control measures are given for the undercrossing construction, which helps to accomplish the desired result and meet protection requirements of the existing tunnel structure. Finally, monitoring data and numerical results are compared, and the displacement and joint fracture change models in the power tunnel subject to the overlapped shield tunnel construction are analyzed.
    Keywords: Overlapped tunnel, Automatic monitoring, Micro-disturbance control

  15. Data, models, and views: towards integration of diverse numerical model components and data sets for scientific and public dissemination

    Science.gov (United States)

    Hofmeister, Richard; Lemmen, Carsten; Nasermoaddeli, Hassan; Klingbeil, Knut; Wirtz, Kai

    2015-04-01

    Data and models for describing coastal systems span a diversity of disciplines, communities, ecosystems, regions and techniques. Previous attempts at unifying data exchange, coupling interfaces, or metadata information have not been successful. We introduce the new Modular System for Shelves and Coasts (MOSSCO, http://www.mossco.de), a novel coupling framework that enables the integration of a diverse array of models and data from different disciplines relating to coastal research. In the MOSSCO concept, the integrating framework imposes very few restrictions on contributed data or models; in fact, there is no distinction made between data and models. The few requirements are: (1) principal coupleability, i.e. access to I/O and timing information in submodels, which has recently been referred to as the Basic Model Interface (BMI); (2) open source/open data access and licensing; and (3) communication of metadata, such as spatiotemporal information, naming conventions, and physical units. These requirements suffice to integrate different models and data sets into the MOSSCO infrastructure and subsequently build a modular integrated modeling tool that can span a diversity of processes and domains. We demonstrate how diverse coastal system constituents were integrated into this modular framework and how we deal with the diverging development of constituent data sets and models at external institutions. Finally, we show results from simulations with the fully coupled system using OGC WebServices in the WiMo geoportal (http://kofserver3.hzg.de/wimo), from where stakeholders can view the simulation results for further dissemination.
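The coupleability requirement can be illustrated with a toy BMI-style interface. The method names follow the Basic Model Interface convention (initialize, update, get_value, finalize); the two components themselves are invented for illustration, and no distinction is made between a "model" and a "data" component, as in the MOSSCO concept.

```python
# Hedged sketch: a framework can drive any component that exposes timing
# and I/O through a BMI-like interface, whether it wraps a model or a
# data set. Both components below are toys.

class DecayModel:
    """Toy 'model' component: one state variable halving each step."""
    def initialize(self):
        self.t, self.dt, self.state = 0.0, 1.0, 100.0
    def get_current_time(self):
        return self.t
    def get_value(self, name):
        assert name == "concentration"
        return self.state
    def update(self):
        self.state *= 0.5
        self.t += self.dt
    def finalize(self):
        pass

class Recorder:
    """Toy 'data' component: samples any BMI-like source it is coupled to."""
    def __init__(self, source, variable):
        self.source, self.variable, self.log = source, variable, []
    def sample(self):
        self.log.append((self.source.get_current_time(),
                         self.source.get_value(self.variable)))

model = DecayModel()
model.initialize()
recorder = Recorder(model, "concentration")
for _ in range(3):
    recorder.sample()
    model.update()
model.finalize()
print(recorder.log)  # [(0.0, 100.0), (1.0, 50.0), (2.0, 25.0)]
```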

  16. Thermodynamic Database for the Terrestrial and Planetary Mantle Studies: Where we stand, and some future directions involving experimental studies, numerical protocol for EoS and atomistic calculations (Invited)

    Science.gov (United States)

    Ganguly, J.; Tirone, M.; Sorcar, N.

    2013-12-01

    Reliable thermodynamic databases for rock forming minerals are essential for petrological and geodynamic studies. While the available databases (1-3) represent laudable efforts, none seems to be completely satisfactory. We show inter-comparison of phase diagrams computed from different databases and also their comparisons with experimental phase diagrams in complex systems. The results show good agreement and also significant disagreements in some P-T-X regimes; resolution of these disagreements via new experimental and thermodynamic data is needed to sort out the problems and make further progress. Two of the main challenges in the development of databases (4) seem to be (a) appropriate formulation of an EoS for solids that is suitable for studies of Earth and planetary interiors and (b) relatively simple formulations of thermodynamic mixing properties of mantle minerals that perform well within the compositional space of interest. While work on EoS formulation continues, we present a semi-empirical numerical approach that creates a consistent set of material properties (α, K, Cp, Cv) up to very high P-T conditions by satisfying certain physical constraints. Adequate experimental data are not available to constrain the mixing properties of several minerals that would be valid over the compositional range of interest in the natural environments. We have, thus, pursued an alternative approach on the basis of physical and crystal-chemical data. It is found that combination of elastic mixing energy, incorporating the effect of multi-atom interactions (5, 6), and crystal-field (CF) energy of mixing provide enthalpy of mixing in binary solid solutions that are in good agreement with experimental and calorimetric data. The CF-splitting vs. composition in a solid solution involving transition metal ion may be approximated by a semi-empirical relation using mean metal-oxygen bond-distance when such data are not available from spectroscopic studies. We also discuss the

  17. Solar Radiation and the UV Index: An Application of Numerical Integration, Trigonometric Functions, Online Education and the Modelling Process

    Science.gov (United States)

    Downs, Nathan; Parisi, Alfio V.; Galligan, Linda; Turner, Joanna; Amar, Abdurazaq; King, Rachel; Ultra, Filipina; Butler, Harry

    2016-01-01

    A short series of practical classroom mathematics activities employing the use of a large and publicly accessible scientific data set are presented for use by students in years 9 and 10. The activities introduce and build understanding of integral calculus and trigonometric functions through the presentation of practical problem solving that…

  18. The numerical assessment of motion strategies for integrated linear motor during starting of a free-piston engine generator

    Science.gov (United States)

    Razali Hanipah, M.; Razul Razali, Akhtar

    2017-10-01

    Free-piston engine generators (FPEG) provide a novel method for electrical power generation in hybrid electric vehicle applications, with scarcely reported prototype development and testing. This paper investigates the motion control strategy for motoring the FPEG during starting. Two motion profiles are examined, namely trapezoidal velocity and S-curve velocity. Both motion profiles were investigated numerically, and the results show that the S-curve motion can only achieve 80% of the stroke when operated at the proposed motoring speed of 10 Hz.
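The gap between the two profiles can be illustrated with a quick numerical comparison. This is not the authors' model: the profile shapes, peak velocity and timing splits below are invented, but they reproduce the qualitative finding that the smoother S-curve covers less of the stroke for the same peak velocity.

```python
import math

# Hedged sketch: stroke achieved by a trapezoidal versus an S-curve
# velocity profile over one stroke period, at equal peak velocity.

def stroke(velocity, T, n=10000):
    """Displacement = numerical time-integral of the velocity profile."""
    dt = T / n
    return sum(velocity(i * dt) * dt for i in range(n))

T = 1.0 / (2 * 10.0)        # one stroke of a 10 Hz reciprocating cycle [s]
v_peak = 1.0                # peak piston velocity [m/s] (illustrative)

def trapezoidal(t):         # accelerate / cruise / decelerate in equal thirds
    if t < T / 3:
        return v_peak * 3 * t / T
    if t < 2 * T / 3:
        return v_peak
    return v_peak * 3 * (T - t) / T

def s_curve(t):             # smooth sin^2 profile: zero acceleration at ends
    return v_peak * math.sin(math.pi * t / T) ** 2

ratio = stroke(s_curve, T) / stroke(trapezoidal, T)
print(round(ratio, 3))      # S-curve reaches ~75% of the trapezoidal stroke
```

With these invented shapes the S-curve covers three quarters of the trapezoidal stroke, in the same direction as the 80% figure reported in the abstract.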

  19. Application of Computer Aided Design (CADD) in data display and integration of numerical and field results - Stripa phase 3

    International Nuclear Information System (INIS)

    Press, D.E.; Halliday, S.M.; Gale, J.E.

    1990-12-01

    Existing CAD/CADD systems have been reviewed, and the micro-computer compatible solids modelling CADD software SilverScreen was selected for use in constructing a CADD model of the Stripa site. Maps of the Stripa mine drifts, shafts, raises and stopes were digitized and used to create three-dimensional images of the north-eastern part of the mine and the SCV site. In addition, the use of CADD sub-programs to display variations in fracture geometry and hydraulic heads has been demonstrated. The database developed in this study is available as raw digitized files, processed data files, SilverScreen script files, or in DXF or IGES formats, all of which are described in this report. (au)

  20. A theoretical study for the real-time assessment of external gamma exposure using equivalent-volume numerical integration

    International Nuclear Information System (INIS)

    Han, Moon Hee

    1995-02-01

    An approximate method for estimating the external gamma dose due to an arbitrary distribution of radioactive material has been developed. For the assessment of the external gamma dose, the space over which radioactive material is distributed has been assumed to be composed of hexagonal cells. The evaluation of three-dimensional integration over the space is an extremely time-consuming task. Hence, a different approach has been used for this study, i.e., an equivalent-volume spherical approach in which a regular hexahedron is modeled as an equivalent-volume sphere to simplify the integration. For the justification of the current approach, two case studies have been performed: a comparison with a point source approximation and a comparison of the external dose rate with Monte Carlo integration. These comparisons show that the current approach gives reasonable results in a physical sense. Computing times of the developed method and the Monte Carlo integration method on a VAX system have been compared as a function of the number of hexagonal cells. This comparison shows that the CPU times for both methods are comparable for small numbers of cells, but for large numbers of cells, Monte Carlo integration needs much more computing time. The proposed method is shown to have an accuracy equivalent to the Monte Carlo method with the advantage of a much shorter calculation time. The method developed here was then used to evaluate early off-site consequences of a nuclear accident. An accident consequence assessment model has been integrated using a Gaussian puff model, which is used to obtain the distribution of radioactive material in the air and on the ground. For this work, real meteorological data measured at the Kori site over 10 years (1976 - 1985) have been statistically analyzed to obtain site-specific conditions. The short-term external gamma exposures have been assessed for several site-specific meteorological conditions. The results show that the extent and the pattern of short-term external
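The equivalent-volume idea can be sketched as follows. This is a hedged illustration, not the author's code: the kernel keeps only the bare geometric 1/(4*pi*r^2) term (no attenuation or build-up factors), and the cell data are invented.

```python
import math

# Hedged sketch of the equivalent-volume approach: each volume cell
# holding activity is replaced by a sphere of equal volume, so the flux
# integral over the receptor's own cell has a simple closed form, while
# distant cells are treated as point sources.

def equivalent_radius(cell_volume):
    """Radius of the sphere with the same volume as the cell."""
    return (3.0 * cell_volume / (4.0 * math.pi)) ** (1.0 / 3.0)

def cell_flux(source_density, cell_volume, distance):
    """Unattenuated geometric flux at the receptor from one cell."""
    R = equivalent_radius(cell_volume)
    if distance < R:
        # Receptor at the centre of its own cell: the 1/(4 pi r^2) kernel
        # integrated over the sphere gives  q * R  exactly.
        return source_density * R
    # Distant cell: point-source approximation.
    return source_density * cell_volume / (4.0 * math.pi * distance ** 2)

cells = [  # (source density, volume [m^3], receptor distance [m]), invented
    (1.0, 8.0, 0.0),     # the receptor's own cell
    (0.5, 8.0, 50.0),
    (0.2, 8.0, 120.0),
]
total = sum(cell_flux(q, V, d) for q, V, d in cells)
print(round(total, 4))
```

The closed form for the self-cell is what removes the singular part of the three-dimensional integral; everything else reduces to a cheap sum over cells, which is the source of the speed-up over Monte Carlo integration.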